Article

Evaluation Tools to Appraise Social Media and Mobile Applications

Diane Skiba
University of Colorado College of Nursing, Anschutz Medical Center Campus, Aurora, CO 80045, USA
Informatics 2017, 4(3), 32; https://doi.org/10.3390/informatics4030032
Submission received: 16 March 2017 / Revised: 12 September 2017 / Accepted: 13 September 2017 / Published: 15 September 2017
(This article belongs to the Special Issue Social Media and Mobile Technologies for Healthcare Education)

Abstract

In a connected care environment, more citizens are engaging in their health care through mobile apps and social media tools. Given this growing health care engagement, it is important for health care professionals to have the knowledge and skills to evaluate and recommend appropriate digital tools. The purpose of this article is to identify and review criteria or instruments that can be used to evaluate mobile apps and social media. The analysis reviews the current literature as well as resources developed by professional health care organizations. This review will facilitate health care professionals’ assessment of mobile apps and social media tools that may be pertinent to their patient population. The review will also highlight strategies that a health care system can use to provide guidance in recommending mobile apps and social media tools for their patients, families, and caregivers.

1. Introduction

In the Connected Health Age, patients, families, caregivers, health care providers, health care administrators, and informatics specialists will be using digital tools to facilitate the delivery of health care. To be an informed health care professional, it is important to understand how to find and evaluate digital tools. This ability enables professionals to provide better guidance to patients/consumers and their families, which is critical as patients/consumers become partners in a collaborative model of care. The purpose of this paper is to provide evaluation tools that consumers or health care providers can use to appraise social media sites and mobile apps.

2. Background

A snapshot of current internet, social media, and mobile device use across the globe demonstrates their growing adoption. According to the January 2017 Global Digital Snapshot provided by We Are Social Singapore [1], the Internet has reached 50% penetration and active social media users have reached 37% penetration. In terms of mobile, there are close to 5 billion unique mobile users, or 66% penetration. Facebook has 1.871 billion users, 87% of whom access the site via their mobile devices, and over 55% use Facebook on a daily basis. According to Statista [2], Facebook leads with 1.871 billion users, followed by WhatsApp and Facebook Messenger at 1 billion each and QQ instant messaging at 877 million.
In our connected age, society is “Liking, following, linking, tagging, stumbling…and social media is changing the nature of health-related interactions” [3]. “According to PricewaterhouseCoopers’ consumer survey of 1060 U.S. adults, about one-third of consumers are using the social space as a natural habitat for health discussions. Social media typically consists of four characteristics that have changed the nature of interactions among people and organizations: user-generated content, community, rapid distribution, and open, two-way dialogue.” [3] (p. 5). The following are highlights of their key findings: 42% read health-related consumer reviews, 32% read friends’ or family’s health experiences, 29% read about patient experiences with their diseases, and 24% looked at health-related videos or images posted by patients. As indicated by the data, patients are trying to educate themselves through the use of social media. Almost 59% of consumers use recommendations from social media to seek a second opinion. Around 40–42% of consumers use social media to cope with a chronic condition, manage stress or weight, and choose a specific hospital or physician.
In 2014, PricewaterhouseCoopers [4] also examined mobile health (mHealth) from a global perspective. Although many people across the globe have cellular technology, uptake of mHealth tools remains limited. More telling is what people expect within the next three years. Participants were asked whether mHealth would change their health care experience. Here are some of the results:
  • 59% how I seek information for health issues
  • 51% how providers provide health information
  • 49% how I manage my health care
  • 48% how I manage my chronic conditions
  • 48% how I communicate with my provider
  • 48% how I manage my medications
  • 47% how I measure and share vital health information
  • 46% how providers will monitor my condition and compliance
In 2016, Accenture [5] conducted a global study to investigate patient engagement across the following countries: Australia, Brazil, England, Norway, Saudi Arabia, Singapore, and the United States. With 7840 participants, the report highlighted digital engagement in health care. The majority of the report focused on patients’ access to and use of EHRs, but there were some interesting results in terms of social media and wearables. On a global level, 52% used websites to access health information, 38% used mobile/tablets for health matters, 27% used social media, 16% used online support communities, and 19% had wearable technologies. The use of social media was higher in Saudi Arabia (41%) and Brazil (40%). Mobile usage was highest in Singapore (44%) and Saudi Arabia (40%). Wearable technologies were highest in Singapore (23%), the United States (21%), and England (20%). Online support communities were highest in Brazil (22%). On a global level, fitness and diet/nutrition apps were used by 55% and 53% of people, respectively. The use of symptom navigators or health/condition trackers was 24%. Fitness apps were most common in Australia (66%); diet/nutrition apps were most common in Brazil (68%). Symptom navigator apps were most common in the United States (36%), and health or condition tracker apps were most common in Saudi Arabia (28%).
Thus, the use of social media and mobile apps is transforming health care. Given this growing trend, it is of great importance that both health care professionals and consumers have the necessary knowledge and skills to guide the appropriate use of these tools. There is ample literature related to the use of social media and mobile apps in health care. There is also a growing body of evidence examining the impact of social media tools and mobile apps on patient satisfaction, engagement, and health outcomes [6,7,8,9]. However, there are relatively few articles that examine tools to help providers and consumers “search, select, appraise and apply online health information and health care-related digital applications” [10] (p. e27).

3. Methods

This is a descriptive review of literature published in the last five years, retrieved from the Medline database. The intent of this article was not to conduct a systematic review. The following keywords were used to search the published literature: social media, mHealth applications, mHealth, assessment, evaluation, ratings, standards, and quality. Additional resources were identified from websites, in consultation with a health sciences librarian and through a Google search; these websites came from government agencies and health care university libraries. There were relatively few articles that focused on tools to evaluate mHealth apps or social media. To address this gap, online health information evaluation tools are described first, as criteria that might be adapted for use with social media and mHealth apps; the articles found on evaluation tools specific to social media and mHealth apps are then reviewed.

4. Evaluation Tools

4.1. Online Health Information Evaluation Tools

When health information first became available on the web, libraries and professional organizations developed numerous guides to help consumers and health care professionals assess online health information. These resources tend to be instructions or checklists for finding and evaluating online health information, and they are used by students, professionals, and consumers alike.

4.2. Social Media Evaluation Tools

Much of the information found on these websites can be adapted and used with social media and mobile apps. Over the last few years, more websites and tools have emerged that specifically examine social media tools such as social networks, Twitter, Instagram, and blogs. For example, the Albert Einstein College of Medicine Library has developed specific criteria to assess social media web sites (http://libguides.einstein.yu.edu/c.php?g=123516&p=808220). In addition to the five website evaluation criteria (sponsorship, audience, currency, verifiability, and disclaimer), social networking sites should be evaluated by:
  • Ease of use: Images, icons, and other visual elements can be used to facilitate learning. Navigation and next steps should be intuitive.
  • Privacy policy: The privacy policy should be available within two clicks of the main page. Policies should be readable at an eighth-grade reading level (see the readability sketch after this list).
  • Checks on quality of content: Links to an outside organization should be on the home page. Specific information about numbers or credentials of moderators should be readily available.
  • Transparency of ads: There should be a clear distinction between advertising and editorial content.
  • Security of member data: The privacy policy should be backed up by technology, such as encryption, a secure socket layer, and an external audit of security practices.
  • Member control of information sharing: Users should be able to restrict access or sharing of information to community members.
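One way to operationalize the eighth-grade readability criterion above is to compute a standard readability score for the privacy policy text. The following is a minimal sketch, assuming the Flesch–Kincaid grade level formula and a rough syllable-counting heuristic; it is an illustration only and is not part of the Einstein library guide.

    import re

    def count_syllables(word: str) -> int:
        # Rough heuristic: count groups of consecutive vowels (at least one per word).
        groups = re.findall(r"[aeiouy]+", word.lower())
        return max(1, len(groups))

    def flesch_kincaid_grade(text: str) -> float:
        # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / max(1, len(sentences)))
                + 11.8 * (syllables / max(1, len(words))) - 15.59)

    policy = ("We collect the information you share with us. "
              "We do not sell your personal data to advertisers.")
    print(f"Estimated grade level: {flesch_kincaid_grade(policy):.1f}")

A policy scoring well above 8.0 on such a measure would warrant simplification before the site is recommended to patients.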
The Johns Hopkins University Library guide also provides similar criteria for assessing social networks (http://guides.library.jhu.edu/c.php?g=202581&p=1335031). For example, they included criteria such as reliability of information, age of the account, location of the source, content corroboration from other sources, who is in the network, who are followers, and contextual updates.
Paterson and colleagues [11] conducted a review of the literature and derived 151 quality indicators, which were sorted into three themes: credibility, content, and design. In a subsequent study [12], they used a modified Delphi method to survey emergency and critical care medicine bloggers and podcasters regarding the relative importance of each quality indicator. The sample of bloggers and podcasters was identified using the Social Media Index, and two iterations were needed to reach consensus. A total of 44 quality indicators reached 70% agreement for blogging, whereas 80 quality indicators reached 70% agreement for podcasting. A post hoc reassessment using a 90% or higher agreement threshold shortened the list to the most important quality indicators: 14 criteria for blogs and 26 criteria for podcasts.
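The consensus filtering described above can be illustrated with a small sketch. The indicator names, votes, and function names below are hypothetical; the actual modified Delphi process in [12] used multiple iterations and structured expert ratings rather than simple yes/no votes.

    from typing import Dict, List

    def percent_agreement(votes: List[bool]) -> float:
        # Share of panelists endorsing an indicator as important.
        return sum(votes) / len(votes)

    def retained_indicators(ratings: Dict[str, List[bool]], threshold: float) -> List[str]:
        # Keep indicators whose endorsement meets or exceeds the threshold.
        return [name for name, votes in ratings.items()
                if percent_agreement(votes) >= threshold]

    # Hypothetical votes from five panelists for three candidate indicators.
    ratings = {
        "identifies author credentials": [True, True, True, True, False],
        "cites primary literature": [True, True, True, True, True],
        "uses professional visual design": [True, False, True, False, False],
    }

    print(retained_indicators(ratings, 0.70))  # broader consensus list
    print(retained_indicators(ratings, 0.90))  # post hoc, most important only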
Kostick and colleagues [13] (p. 518) “developed a conceptual framework for analyzing content, scope, and character of social media sites.” The parameters used in the framework consisted of the following variables: (1) interactivity, (2) user friendliness (which also included a readability index), (3) type of medium (text, visuals, mixed), (4) purpose, (5) audience, (6) accuracy and consistency, and (7) tone. A principal components analysis was done on all variables encompassed in the seven parameters. The analysis yielded four factors: social media type (purpose and audience), user friendliness and appeal, interactivity, and accuracy and consistency. Although this tool certainly addressed key factors, Theron, Redmond, and Borycki [14] (p. 322) noted that “the criteria to appraise the actual content of information properly are still lacking.”
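For readers unfamiliar with how seven appraisal parameters can collapse into four factors, the sketch below runs a principal components analysis on a small simulated matrix of site ratings, assuming NumPy and scikit-learn are available. The data are invented for illustration and do not reproduce the analysis in [13].

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Simulated ratings: 40 social media sites scored 1-5 on the seven parameters.
    parameters = ["interactivity", "user_friendliness", "medium_type",
                  "purpose", "audience", "accuracy_consistency", "tone"]
    ratings = rng.integers(1, 6, size=(40, len(parameters))).astype(float)

    # Standardize each parameter, then extract four components.
    standardized = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
    pca = PCA(n_components=4)
    pca.fit(standardized)

    print("Variance explained:", np.round(pca.explained_variance_ratio_, 2))
    print("Loadings (4 components x 7 parameters):")
    print(np.round(pca.components_, 2))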
Another interesting tool is the Social Media Competency Inventory (SMCI), which was developed to measure social media competencies for certified health education specialists [15]. The researchers conducted a multi-staged instrument development project. In the first phase, an extensive literature review and a conceptualization process with experts were conducted. An expert panel then participated in think-aloud sessions, and a pilot test of the initial version of the SMCI was carried out. The final phase included a field test to establish the psychometric properties of validity and reliability. The tool consisted of six constructs (social media self-efficacy, social media experience, effort expectancy, performance expectancy, facilitating conditions, and social influence) and initially had a total of 148 items. Sixteen items were eliminated during the pilot test and 52 items were eliminated during the field test. The authors concluded that sufficient validity and reliability evidence was established to support the SMCI. Further research would be needed to refine the tool and to establish its predictive validity.
There are also two organizations that provide a wealth of materials about social media in health care, particularly as it relates to consumers, patients, and families. The first is the Centers for Disease Control and Prevention, which maintains an extensive site of tools, best practices, and guides. This resource (http://www.cdc.gov/socialmedia/tools/guidelines/index.html) is valuable for patients as well as providers. The second resource is the Mayo Clinic, a pioneer in the use of social media. Many years ago, they established the Center for Social Media (http://network.socialmedia.mayoclinic.org/). They also maintain the Social Media List for Hospitals (http://network.socialmedia.mayoclinic.org/hcsml-grid/).

4.3. Mobile Health App Evaluation

The sheer volume of potential health apps, particularly in the Apple and Android stores, necessitates the development of criteria specific to mobile apps [16]. There are various clearinghouses for finding health-related apps, and several tools have been developed to evaluate them.
Boudreaux and colleagues [17] recommended several strategies that health care professionals and health care institutions can use to find appropriate mobile apps. Below is an abbreviated list of their recommended strategies:
  • Ask your health care institution if they have a list of approved mobile apps and networks for patients to join.
  • Search app clearinghouse web sites. The following list is provided in their article: National Health Service (NHS) Health Tools, including smartphone apps (http://www.nhs.uk/tools/pages/toolslibrary.aspx?); Happtique (mHealth app store); iMedicalApps (online medical publication); HealthTap’s AppRX (targeted to consumers); Veterans Administration App Store (https://mobile.va.gov/appstore).
  • Review the scientific literature: search the scientific literature for papers reviewing apps in a content domain or strong clinical trials. One potential open access journal that may be helpful is the Journal of Medical Internet Research (JMIR).
  • Go to professional organizations or foundations to see if they have any m-apps.
  • Search app stores: this may be very challenging, given that there are many apps and that they are broadly classified.
  • Talk with a health care professional via social media to discover any reputable apps.
One of the first tools for examining mobile apps was developed by the mHealth group of the Healthcare Information and Management Systems Society (HIMSS) [18]. The monograph developed by this group provides an excellent background for understanding usability. Its purpose is to help health care professionals and staff members select appropriate mobile apps for themselves as well as for their patient populations, and the document provides guidance for the selection and evaluation process. The evaluation of usability is built upon established usability principles and best practices. The usability variables include simplicity, naturalness, consistency, forgiveness and feedback, effective use of language, efficient interactions, effective information presentation, preservation of context, and minimization of cognitive overload. The guide also advises that, before evaluating a tool, users should consider how their practice goals map to the measurable objectives of efficiency, effectiveness, ease of learning, and user satisfaction. It also provides useful information on how to implement this selection process within an organization, along with several good practice use cases based on three different clinical scenarios. To learn more about the tool and view slides related to this project, see http://www.himss.org/selecting-mobile-app-evaluating-usability-medical-applications-0.
In 2013, the Health IT Usability Evaluation Model (Health-ITUEM) was developed to evaluate mobile apps [19]. The model was designed using a variety of frameworks, including the Technology Acceptance Model, the ISO 9241-11 standard, and principles from usability experts such as Nielsen [20], Shneiderman [21], and Norman [22]. “The concepts include: Error prevention, Completeness, Memorability, Information needs, Flexibility/Customizability, Learnability, Performance speed, Competency and Other outcomes” [19] (p. 1080). To assess the Health-ITUEM, two series of focus groups were used. The first series examined the use of mobile technologies for health in a group of adolescents. The second series also consisted of adolescents, who were given a smartphone with a health-related mobile app and participated in “a 30-day ecological momentary assessment” [19] (p. 1082). Using a mixed methods approach to the data, the authors concluded that “Health-ITUEM offers a new framework for understanding the usability issues related to mHealth technology and demonstrated the flexibility, robustness, and limitations of this model” [19] (p. 1086). They also noted that more work was needed to generalize the model to other populations, and that concepts such as error prevention, learnability, and memorability were used less frequently by raters.
Chan, Torous, Hinton, and Yellowlees [23] developed a framework for both patients and consumers to evaluate mobile apps and wearable devices. Their framework includes three dimensions of evaluative criteria: usefulness, usability, and integration and infrastructure. In the last dimension, security, privacy, safety, data integration, and workflow integration were included. This dimension contains key items (security, privacy, and safety) that are of utmost importance to patients. Under usability, there were also items related to cultural, disability, socioeconomic, and generational accessibility. Again, there was more of an emphasis on the patient context in addition to the typical satisfaction and usability factors.
After an extensive review of the literature, a research team [24] created the Mobile App Rating Scale (MARS). This scale consisted of “five broad categories of criteria that were identified including four objective quality scales: engagement, functionality, aesthetics, and information quality; and one subjective quality scale” [24] (p. e27). The purpose was to create a multidimensional scale for researchers to classify and assess the quality of mobile apps, with further research projected to adapt the tool for consumers and health care professionals. The tool was tested using 60 mental health mobile apps identified through an extensive search. In a pilot stage, two reviewers rated 10 apps and the tool was refined; the testing phase utilized the remaining 50 apps. Concurrent validity was assessed by comparing MARS scores with App Store star ratings; only 15 of the 50 apps had star ratings, and the scores were moderately correlated. The interrater reliability intraclass coefficient was 0.79 and the internal consistency reliability was 0.90. Overall, the MARS tool demonstrated validity and reliability. Further development of the tool and its generalized use across various health-related mobile apps are projected as future work.
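To make the scoring and reliability figures above more concrete, the sketch below aggregates hypothetical item ratings into MARS-style subscale means and computes Cronbach’s alpha for internal consistency. The subscale names follow the published categories, but the items, ratings, and scoring details are illustrative assumptions rather than the official MARS scoring instructions.

    import numpy as np

    def cronbach_alpha(item_scores: np.ndarray) -> float:
        # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)
        k = item_scores.shape[1]
        item_vars = item_scores.var(axis=0, ddof=1)
        total_var = item_scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical 1-5 item ratings for one app, grouped into MARS-like subscales.
    ratings = {
        "engagement": [4, 3, 4, 5, 4],
        "functionality": [5, 4, 4, 4],
        "aesthetics": [3, 4, 4],
        "information": [4, 4, 3, 4, 3, 4, 4],
    }
    subscale_means = {name: float(np.mean(items)) for name, items in ratings.items()}
    overall_quality = float(np.mean(list(subscale_means.values())))
    print(subscale_means, "overall:", round(overall_quality, 2))

    # Internal consistency across a hypothetical set of apps rated on the same items.
    item_matrix = np.array([
        [4, 3, 4, 5, 4],
        [3, 3, 3, 4, 3],
        [5, 5, 4, 5, 5],
        [4, 4, 4, 4, 3],
    ], dtype=float)  # rows = apps, columns = items
    print("Cronbach's alpha:", round(cronbach_alpha(item_matrix), 2))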
Fiore [16] conducted a scoping review of approaches to evaluating mobile apps in health care. In his review, he mentioned two other checklists for evaluating mobile apps. The first is an adaptation of the CRAAP test (currency, relevance, authority, accuracy, purpose) designed by California State University, Chico (http://www.csuchico.edu/lins/handouts/eval_websites.pdf), which was used by students to evaluate mobile apps [25]. The faculty adapted the tool and added an “other” category that had students assess ease of use and whether the app was interesting and fun. The second is a checklist [26] built around the mnemonic NP-MED-APP, developed for nurse practitioners to use in assessing mobile apps. The mnemonic represents the following categories: novel, potential benefit versus risk, medically sound, ease of use, developer, audience, price, and platform. Neither checklist has been validated, but both are simple to use.
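Neither checklist prescribes a scoring algorithm, but a mnemonic checklist of this kind is straightforward to encode for use in a review workflow. The sketch below represents the NP-MED-APP categories as a simple data structure and reports which criteria remain unsatisfied; the category names come from the article above [26], while the class and method names are illustrative assumptions.

    from dataclasses import dataclass, field
    from typing import Dict, List

    NP_MED_APP = ["novel", "potential benefit versus risk", "medically sound",
                  "ease of use", "developer", "audience", "price", "platform"]

    @dataclass
    class AppAppraisal:
        app_name: str
        # Each category is marked True once the reviewer judges it satisfactory.
        checks: Dict[str, bool] = field(
            default_factory=lambda: {category: False for category in NP_MED_APP})

        def mark(self, category: str, satisfied: bool = True) -> None:
            if category not in self.checks:
                raise KeyError(f"Unknown category: {category}")
            self.checks[category] = satisfied

        def outstanding(self) -> List[str]:
            # Categories the reviewer has not yet marked satisfactory.
            return [c for c, ok in self.checks.items() if not ok]

    appraisal = AppAppraisal("Hypothetical glucose tracker")
    appraisal.mark("ease of use")
    appraisal.mark("medically sound")
    print(appraisal.outstanding())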

5. Future Directions

In summary, several tools are now available for evaluating health-related social media and mobile apps. As the validity and reliability of these tools become established, the need to adapt online health information criteria could be eliminated. There are many similarities across the various tools. Usability measures are typically listed as standard criteria, although the depth and breadth of the measures differ across tools: they can be as simple as a single question on ease of use or as extensive as the multiple elements in the mHealth usability document. It was interesting to note that some appraisal tools did not specifically address some of the major concerns expressed by patients and providers, namely privacy, security, and confidentiality [27]. The quality of the content was also mentioned in various tools, with quality indicators including terms like credibility, reliability, completeness, and accuracy. Only a few mentioned data integration or control of data sharing, which will become increasingly important as patients and consumers generate their own health data to share with their clinicians. Aesthetics and functionality were also mentioned by some tools.
With the availability of some tools, it is important that educators begin to incorporate knowledge and skills for evaluating social media tools and mobile apps into the curriculum of health care professionals. This is receiving more attention in academia in light of discussions around the meaning and measurement of the digital health literacy of our health care professionals [10]. There is a growing trend of faculty moving from computer literacy to digital health literacy that incorporates knowledge and skills related to the use of digital tools in our connected health care environment.
It is equally important to develop instructional materials for professional development for use by the current workforce. Some health care organizations have created committees that review and evaluate social media and mHealth apps for their patient populations. Several professional organizations and conferences are also providing these opportunities to evaluate digital tools for patient use. This is a wonderful opportunity for academia and service to work together to ensure all health care professionals have the necessary knowledge and skills. Healthcare is in need of informed health care professionals who can guide patients as they engage with various digital tools to improve their health.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. We Are Social Singapore. Global Digital Snapshot. 2017. Available online: https://www.slideshare.net/wearesocialsg/digital-in-2017-global-overview (accessed on 10 March 2017).
  2. Statista. 2016. Available online: https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/ (accessed on 10 March 2017).
  3. PricewaterhouseCoopers. Social Media “Likes” Healthcare: From Marketing to Social Business. 2012. Available online: http://www.pwc.com/us/en/health-industries/health-research-institute/publications/health-care-social-media.html (accessed on 10 March 2017).
  4. PricewaterhouseCoopers. Emerging mHealth: Paths for Growth. 2014. Available online: https://www.pwc.com/gx/en/healthcare/mhealth/assets/pwc-emerging-mhealth-full.pdf (accessed on 10 March 2017).
  5. Accenture Consulting. Accenture 2016 Consumer Survey on Patient Engagement. 2016. Available online: https://www.accenture.com/t20160629T045304__w__/us-en/_acnmedia/PDF-15/Accenture-Patients-Want-A-Heavy-Dose-of-Digital-Research-Global-Report.pdf (accessed on 10 March 2017).
  6. Hui, C.; Walton, R.; McKinstry, B.; Jackson, T.; Parker, R.; Pinnock, H. The use of mobile applications to support self-management for people with asthma: A systematic review of controlled studies to identify features associated with clinical effectiveness and adherence. J. Am. Med. Inform. Assoc. 2017, 24, 619–632. [Google Scholar] [CrossRef] [PubMed]
  7. Kooij, L.; Groen, W.G.; van Harten, W.H. The Effectiveness of Information Technology-Supported Shared Care for Patients With Chronic Disease: A Systematic Review. J. Med. Internet Res. 2017, 19, e221. [Google Scholar] [CrossRef] [PubMed]
  8. Smailhodzic, E.; Hooijsma, W.; Boonstra, A.; Langley, D.J. Social media use in healthcare: A systematic review of effects on patients and on their relationship with healthcare professionals. BMC Health Serv. Res. 2016, 16, 442. [Google Scholar] [CrossRef] [PubMed]
  9. Ali, E.E.; Chew, L.; Yap, K.Y. Evolution and current status of mhealth research: A systematic review. BMJ Innov. 2016, 2, 33–40. [Google Scholar] [CrossRef]
  10. Van der Vaart, R.; Drossaert, C. Development of the Digital Health Literacy Instrument: Measuring a Broad Spectrum of Health 1.0 and Health 2.0 Skills. J. Med. Internet Res. 2017, 19, e27. [Google Scholar] [CrossRef]
  11. Paterson, Q.S.; Thoma, B.; Lin, M.; Trueger, N.S.; Ankel, F.; Sherbino, J.; Chan, T. Quality indicators for medical education blog posts and podcasts: A qualitative analysis and focus group. In Proceedings of the Association of American Medical Colleges Medical Education Meeting, Chicago, IL, USA, 6–7 November 2014. [Google Scholar]
  12. Thoma, B.; Chan, T.M.; Paterson, Q.S.; Milne, W.K.; Sanders, J.L.; Lin, M. Emergency Medicine and Critical Care Blogs and Podcasts: Establishing an International Consensus on Quality. Ann. Emerg. Med. 2015, 66, 396–402. [Google Scholar] [CrossRef]
  13. Kostick, K.M.; Blumenthal-Barby, J.S.; Wilhelms, L.A.; Delgado, E.D.; Bruce, C.R. Content Analysis of Social Media Related to Left Ventricular Assist Devices. Circ. Cardiovasc. Qual. Outcomes 2015, 8, 517–523. [Google Scholar] [CrossRef] [PubMed]
  14. Theron, M.; Redmond, A.; Borycki, E. Baccalaureate nursing students’ abilities in critically identifying and evaluating the quality of online health information. In Studies in Health Technology and Informatics: Building Capacity for Health Informatics in the Future; Lau, F., Bartle-Clar, J., Bliss, G., Borycki, E., Courtney, K., Kuo, A., Eds.; IOS Press: Amsterdam, The Netherlands, 2017; Volume 234, Available online: http://ebooks.iospress.nl/publication/46186 (accessed on 10 March 2017).
  15. Alber, J.M.; Bernhardt, J.M.; Stellefson, M.; Weiler, R.M.; Anderson-Lewis, C.; Miller, D.M.; MacInnes, J. Designing and Testing an Inventory for Measuring Social Media Competency of Certified Health Education Specialists. J. Med. Internet Res. 2015, 17, e221. [Google Scholar] [CrossRef] [PubMed]
  16. Fiore, P. How to evaluate mobile applications: A scoping review. In Studies in Health Technology and Informatics: Building Capacity for Health Informatics in the Future; Lau, F., Bartle-Clar, J., Bliss, G., Borycki, E., Courtney, K., Kuo, A., Eds.; IOS Press: Clifton, VA, USA, 2017; Volume 234, Available online: http://ebooks.iospress.nl/publication/46149 (accessed on 10 March 2017).
  17. Boudreaux, E.; Waring, M.; Hayes, R.; Sadasivam, R.; Mullen, S.; Pagoto, S. Evaluating and selecting mobile health apps: Strategies for healthcare providers and healthcare organizations. Transl. Behav. Med. 2014, 4, 363–371. [Google Scholar]
  18. HIMSS. Selecting a Mobile App: Evaluating the Usability of Medical Applications. mHIMSS App Usability Work Group. 2012. Available online: http://s3.amazonaws.com/rdcms-himss/files/production/public/HIMSSguidetoappusabilityv1mHIMSS.pdf (accessed on 19 March 2017).
  19. Brown, W.; Yen, P.Y.; Rojas, M.; Schnall, R. Assessment of the Health IT Usability Evaluation Model (Health-ITUEM) for evaluating mobile health (mHealth) technology. J. Biomed. Inform. 2013, 46, 1080–1087. [Google Scholar] [PubMed]
  20. Nielsen, J.; Mack, R.L. Usability Inspection Methods; Wiley: New York, NY, USA, 1994. [Google Scholar]
  21. Shneiderman, B.; Plaisant, C. Designing the User Interface: Strategies for Effective Human–Computer Interaction, 5th ed.; Addison-Wesley: Boston, MA, USA, 2010. [Google Scholar]
  22. Norman, D.A. The Design of Everyday Things; Basic Books: New York, NY, USA, 2002. [Google Scholar]
  23. Chan, S.; Torous, J.; Hinton, L.; Yellowlees, P. Toward a framework for evaluating mobile health apps. Telemed. e-Health 2015, 21, 1037–1041. [Google Scholar] [CrossRef]
  24. Stoyanov, S.R.; Hides, L.; Kavanagh, D.J.; Zelenko, O.; Tjondronegoro, D.; Mani, M. Mobile App Rating Scale: A New Tool for Assessing the Quality of Health Mobile Apps. JMIR mHealth uHealth 2015, 3, e27. [Google Scholar] [CrossRef] [PubMed]
  25. McNiel, P.; McArthur, E.C. Evaluating health mobile apps: Information literacy in undergraduate and graduate nursing courses. J. Nurs. Educ. 2016, 55, 480. [Google Scholar] [CrossRef] [PubMed]
  26. Golden, A.; Krauskopf, P. Systematic Evaluation of mobile apps. J. Nurs. Pract. 2016, 6, e27–e28. [Google Scholar] [CrossRef]
  27. Moorhead, S.; Hazlett, D.; Harrison, L.; Carroll, J.; Irwin, A.; Hoving, C. A New Dimension of Health Care: Systematic Review of the Uses, Benefits, and Limitations of Social Media for Health Communication. J. Med. Internet Res. 2013, 15, e85. [Google Scholar] [CrossRef] [PubMed]
