Abstract
This study explores the relationship between journalistic background, content creation experience, and self-reported information literacy among global content creators. Based on an online survey of 500 content creators conducted in eight languages around the world, the study examines whether journalistic training or experience in content creation influences perceived information literacy, while controlling for education and the economic development of the creator’s country. Results indicate that both having a journalistic background and content creation experience significantly predict perceived information literacy, with creators’ education as a significant covariate. Economic development (Global South vs. Global North) is not a significant factor. Grounded in Flavell’s metacognitive theory, the findings suggest that content creators gain confidence in evaluating information through journalistic training and content creation experience, even if their actual ability remains untested. Implications for the gap between perceived information literacy and actual information-checking practice, and for information literacy training based on metacognition, are discussed.
1. Introduction
The ability to effectively access, evaluate, and use information has become one of the most essential skills in the era of the Fourth Industrial Revolution. As information flows continuously across digital platforms, news outlets, and social media, navigating this landscape requires advanced information literacy. The digital revolution has significantly transformed the challenges faced by journalists. One major challenge is the rapid rise in misinformation and disinformation, particularly content generated or manipulated by artificial intelligence, which has further complicated the information ecosystem (Adami, 2024). As a result, fact-checking has become increasingly important for both journalists and the public. Paradoxically, however, an overreliance on fact-checking websites may undermine the credibility of traditional media if audiences feel they cannot trust the information presented without external verification (Ognyanova et al., 2020).
At the same time, content creators who produce and share information on digital platforms are emerging as powerful influencers in shaping public opinion (Muth & Peter, 2023). Unlike journalists, they often operate outside formal media institutions, yet they reach wide audiences and play a central role in the information landscape. One in five Americans regularly obtain their news from social media influencers. This trend is even more pronounced among younger adults, with 37% of those under 30 relying on influencers for news. Notably, most of these influencers (77%) do not have any professional ties to or background in traditional news organizations (Stocking et al., 2024). Their influence makes it essential for them to demonstrate strong information literacy skills, especially given their hands-on experience with digital content production.
Given these developments, it is crucial to understand how both journalistic training and content creation experience relate to information literacy. Few studies, however, have compared global samples of content creators with and without journalistic backgrounds in terms of their perceived information literacy. Such insights can help build strategies to ensure that today’s content creators are equipped with the skills needed to navigate and contribute responsibly to the digital information environment. Education and economic development were included in the study to control for differences in education levels and for structural disparities in information access and instructional resources across regions.
2. Literature Review
The increasing scholarly interest in information literacy in the context of the digital age is not coincidental, as evidenced by the diverse range of experts engaging with it across information science, media education, literacy, culture, human–computer interaction, and the social studies of technology (Livingstone, 2004). At the heart of these concerns is the importance of literacy skills for the effective use of information in society and the need for lifelong learning (Thoman & Jolls, 2008). Utilizing, sharing, modifying, and producing information is becoming consequential, particularly for knowledge workers who depend on computers and the internet (Hobbs, 2008). In 2023, the Chinese government formalized information literacy as a basic skill deemed important and necessary in the age of widespread AI use by introducing new standards with four parts: information metacognition, information knowledge, information application and creation, and information ethics (Jin, 2025).
2.1. Information Literacy
Information literacy is commonly defined as the ability to recognize when information is needed and to effectively locate, evaluate, and use that information to solve a problem (American Library Association, 1989). Hobbs (2006) argues that information literacy emphasizes the retrieval and selection of available information and the identification of message quality, authenticity, and credibility. Those who are information literate will have the abilities necessary to efficiently collect, use, manage, synthesize, and produce data and information in an ethical manner (Bent & Stubbings, 2011). Information literacy encompasses a range of essential functions, such as cultivating reading habits, enhancing academic performance, supporting problem-solving and decision-making, and fostering critical thinking and lifelong learning (Machin-Mastromatteo, 2021).
Media literacy, information literacy, and digital literacy are closely related but distinct concepts that reflect different dimensions of engaging with media and information. Media literacy emphasizes the ability to access, understand, critically evaluate, and create media content, often highlighting its social, political, and commercial dimensions. Information literacy, as discussed above, focuses on recognizing when information is needed and developing the skills to locate, evaluate, and use that information effectively. Digital literacy, in turn, extends these concepts into digital environments, encompassing the use of digital tools to search, assess, produce, and communicate information. Although these types of literacies share common elements, such as critical thinking and communication, they differ in focus, application, and context (Koltay, 2011). Alongside them, algorithmic literacy has also become increasingly important today. Developing proficiency in this area involves various behaviors, such as identifying disinformation, verifying information, revealing or analyzing search results, challenging inaccurate outcomes, and demanding greater transparency and accountability from digital platforms and services. It also upholds key values traditionally promoted by media and information literacy (MIL), including information integrity, data quality, freedom of expression, and the promotion of media pluralism and diversity (Frau-Meigs, 2024).
2.2. Information Literacy and Critical Thinking
Most information literacy conceptualizations have an element of critical thinking (Koltay, 2011). When evaluating the facts offered in a message, information literacy allows one to think critically and receive information appropriately. According to Hobbs (2006), finding information in certain areas, fields, and situations requires critical thinking and metacognitive skills, all of which are emphasized in information literacy education. Lynch (1998) argued that critical and analytical reading must cover the entire spectrum of visual (image and video) and multimedia communication genres, including the evaluation of purpose, bias, accuracy, and quality. Ultimately, information knowledge is important, as it can have a strong positive effect on information application and creation (Jin, 2025).
2.3. Difficulties Faced by Journalists in the Digital Age
In the ever-changing digital information landscape, the role of the journalist has become more complex, multifaceted, and demanding (Naveed & Saadia, 2023). The emergence and proliferation of digital media platforms have fundamentally transformed the way news is produced, consumed, and disseminated. Therefore, journalists must navigate a landscape where the ability to critically evaluate, utilize, and disseminate digital information is paramount (Boss et al., 2022; MacMillan, 2014). Journalists’ credibility is being assessed based on their capacity to identify, assess, validate, and truthfully report reliable sources (Franklin & Carlson, 2013).
2.4. Information Literacy, Journalism Education, and Training
The technological, financial, and social underpinnings of journalism have been significantly impacted by digitalization, which has also altered journalistic practices and methods in numerous ways (Kirchhoff, 2021). Given the prevalence of political microtargeting, fake news, strategic disinformation, and a decline in public confidence in traditional media (Lipka & Shearer, 2023), many educators believe that fact-checking abilities, media and information literacy, and understanding of media ethics and accountability are becoming increasingly crucial for journalism students (Nordenstreng, 2009). There is a noticeable increase in awareness of the social function of media and a renewed desire to improve journalistic standards (Kirchhoff, 2021). Different approaches have been taken in the past by professionals and educators to improve information literacy for journalists.
In 2011, the Association of College and Research Libraries (ACRL) linked its Information Literacy Competency Standards to undergraduate journalism education, showing how important information literacy is for future journalists (ACRL Board of Directors, 2012). Since journalism is mainly about finding, checking, and sharing accurate information, information literacy skills are included in journalism training materials (Spilsbury, 2012, 2014) and are part of the core skills listed by leading journalism education organizations (EJTA, 2020). According to the Accrediting Council on Education in Journalism and Mass Communications (n.d.), departments and schools of journalism are required to demonstrate that graduates can evaluate information using methods appropriate to their communications professions and critically assess their own work and that of others for accuracy and fairness.
2.5. Information Literacy and Digital Content Creators
Social media content creation has become common and popular among journalists as well. In addition to producing content for mainstream media, many journalists now actively create and share content on their personal social media platforms. With the increasing number of social media platforms and the amount of information being produced, information literacy is regarded as a crucial skill for journalists. Naveed and Saadia (2023) found that journalists view themselves as information literate not just at basic levels, but also at advanced levels. Jones-Jang et al. (2019) found that only information literacy, not other forms of literacy, significantly improves the ability to recognize fake news stories. Interestingly, the information literacy of social media influencers, whether or not they have a journalistic background, is rarely examined, despite their being major content creators. For many of them, creating and sharing content has become such a regular part of daily life that it no longer clearly connects to traditional journalism values (Holton et al., 2013).
Education and frequency of social media use, but not age, have been found to be positively associated with social media literacy (Heiss et al., 2023). Frequency of social media use can be considered part of content creation experience when that use involves creating content. In our study, Flavell’s (1979) metacognitive theory is used as the lens to explore whether experienced content creators feel more confident in detecting disinformation and deepfake content than less experienced creators. According to Schraw and Dennison (1994), “Metacognition refers to the ability to reflect upon, understand, and control one’s learning”. Flavell’s model of cognitive monitoring includes four key parts: metacognitive knowledge, metacognitive experiences, goals, and strategies (Flavell, 1979). This study focuses on metacognitive knowledge and metacognitive experiences.
Metacognitive knowledge is the understanding of how different factors interact to influence thinking and decision-making (Flavell, 1979). Experienced content creators work with digital media often. They learn about content patterns, editing techniques, and platform trends. This knowledge makes them more confident in spotting false or manipulated content, even if their actual ability is not always accurate. Less experienced creators may not have the same exposure, which can lead to lower confidence in detecting disinformation. Metacognitive experiences involve monitoring one’s thinking in real-time. This includes feelings of certainty or doubt about understanding a topic. Experienced content creators may rely on quick judgments based on their familiarity with digital content. This can make them trust their instincts more. Less experienced creators may take a more careful approach and feel unsure about their ability to detect false information. Therefore, cultivating self-regulated questioning is crucial for promoting critical literacy and preventing shallow engagement with digital content. Instructional implications emphasize the importance of providing explicit metacognitive support and ensuring fair access to digital literacy resources (Adams et al., 2025).
Although metacognition supports journalists in regulating bias, managing institutional constraints, adapting to new technologies, and understanding the political and economic forces influencing their work (Johnson et al., 2025), it is still necessary to explore how these processes function for content creators. Overconfidence bias is particularly important here, since unrecognized incompetence not only results in poor performance but also hinders individuals from recognizing their own limitations (Kruger & Dunning, 1999). Grounded in metacognitive theory, this study explores whether level of content creation experience is linked to self-reported information literacy. In this context, self-reported information literacy reflects the confidence that content creators derive from their experience.
2.6. Information Literacy, Education, and Development
Information literacy, critical thinking, and creativity are widely categorized as essential competencies for college students in the 21st century, and previous studies suggest that education is a predictor of information literacy (Williams & Evans, 2008; Conner, 2012). In this study, we included education as a control variable, as higher levels of education are typically associated with greater critical thinking skills and more experience in processing complex information. Such experience can enhance content creators’ confidence in managing information, as reflected in their self-perceived information literacy.
Economic development (Global South vs. Global North) was also included as a control variable to account for differences related to information infrastructure and media systems across regions, as our study is a global study of creators from countries with different levels of economic development. This inclusion ensured that the effects of journalistic background and content creation experience on self-perceived information literacy were not confounded by regional disparities.
As discussed in our literature review on information literacy, we found no evidence on how these content creators perceive their own information literacy in the context of content creation. To address this gap, we pose the following research question:
RQ1: To what extent do journalistic background and content creation experience affect perceived information literacy, while controlling for the education level of creators and economic development (Global South vs. Global North)?
3. Method
3.1. Sample
A global online survey was conducted in eight languages among 500 social media influencers, ranging from nano-influencers (1000–10,000 followers) to mega-influencers (over one million followers). Country selection was guided by language region and level of economic development (Global South and Global North according to UNESCO’s classification) to ensure representation of major cultures worldwide. Since English is the most widely spoken second language, English-speaking countries were allocated a larger portion of the sample.
The questionnaire was first developed in English, then translated into Arabic, Chinese, French, German, Portuguese, Russian, and Spanish by professional translators. Each translation was reviewed by native-speaking communication scholars to ensure accuracy and cultural relevance. Quota sampling was applied for each language region as follows: 100 from English-speaking Global North countries (e.g., United States, United Kingdom, Canada, Singapore, Australia, Ireland); 50 from English-speaking Global South countries (e.g., India, South Africa, Pacific Islands, Southeast Asia); 50 from French-speaking countries (France, French-speaking African countries, Pacific Islands); 50 from Spanish-speaking countries (Spain, Latin America); 50 from Portuguese-speaking countries (Portugal, Brazil); 50 from German-speaking countries (Germany, Switzerland, Austria); 50 from Russian-speaking countries (Russia, Eastern Europe); 50 from China and its special administrative territories; and 50 from Arabic-speaking countries in the Middle East and North Africa (MENA).
Data collection took place from September to October 2024, after the Institutional Review Board of the authors’ university reviewed and approved the study. Data collection was conducted through the research company Qualtrics, which distributed the survey to its global panel. Eligible participants were required to have at least 1000 followers and to regularly produce public content on social media. Standard Qualtrics compensation was provided, and a gender quota ensured equal participation of men and women. The survey was fully anonymous, with no identifying information collected.
Of the total sample, 52.4% of respondents were from Global South countries and 47.6% from Global North countries; they represented 44 countries across six continents.
3.2. Measurement
This study examined differences in information literacy between content creators with and without a journalistic background. Journalistic background was measured using a dichotomous item: “Do you have a journalistic background (with professional experience or journalism training)?” with response options yes and no. The study also assessed the association between content creation experience and information literacy. Experience was measured with the question: “How long have you created public content on social media?” with four ordinal response options: less than one year, 1 to 3 years, more than 3 years but less than 10 years, and 10 years or more.
Information literacy was measured using six items adapted from the SCONUL Seven Pillars of Information Literacy: Core Model (Bent & Stubbings, 2011). Because this is a self-reported measure, it captures confidence, that is, perceived competence rather than actual ability. Participants rated each statement on a 5-point Likert-type scale (1 = strongly disagree, 5 = strongly agree). The items were as follows: (1) I am good at finding the information I need to create the content, (2) I am good at determining the quality of the information I collected, (3) I am good at identifying deepfake or disinformation, (4) I am good at knowing the vested interest in the sources of information, (5) I am good at understanding the consequences of disseminating incorrect information to the audience, and (6) I am good at using diverse sources for my content.
The six items were summed to create an information literacy scale with a possible range of 6 to 30 (Cronbach’s α = 0.879, M = 24.20, SD = 4.64), indicating high internal consistency and supporting the scale’s reliability for this study.
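For readers interested in replicating this type of scale construction, the sketch below shows one common way to compute a summed scale score and Cronbach’s alpha in Python. The column names (il1 through il6) and the simulated data are hypothetical placeholders, since the survey data are not public; this is an illustration of the standard procedure, not the authors’ analysis script.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 500 respondents rating six 5-point Likert items (il1..il6).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(500, 6)),
                  columns=[f"il{i}" for i in range(1, 7)])

items = df[[f"il{i}" for i in range(1, 7)]]
df["info_literacy"] = items.sum(axis=1)  # summed scale, possible range 6-30

print(f"alpha = {cronbach_alpha(items):.3f}")
print(f"M = {df['info_literacy'].mean():.2f}, SD = {df['info_literacy'].std(ddof=1):.2f}")
```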
Education was measured using a single item: “What is your highest educational attainment?” Participants selected one of five categorical options: (1) High school or below, (2) Some college/associate degree, (3) Bachelor’s degree, (4) Master’s degree or graduate certificate, and (5) PhD/JD or equivalent.
4. Results
The content creators in our study reported a generally high level of information literacy (M = 24.2, SD = 4.64). Table 1 summarizes the descriptive statistics of the study’s main variables. It shows that our creator sample has a diverse range of experiences and educational backgrounds. Most do not have a journalistic background; only 21.1% do.
Table 1.
Summary statistics of key variables.
An analysis of covariance (ANCOVA) was conducted to examine journalistic background and content creation experience as predictors of perceived information literacy, controlling for education and economic development. Having a journalistic background emerged as a significant predictor, F(1, 479) = 4.88, p = 0.028. Content creation experience also significantly predicted perceived information literacy, F(3, 479) = 3.64, p = 0.013 (see Table 2).
Table 2.
Analysis of covariance tests of between-subjects effects on perceived information literacy.
Results showed that education significantly predicted perceived information literacy, F(1, 479) = 14.07, p < 0.001, with higher education levels associated with greater perceived information literacy. By contrast, the region of development (Global South vs. Global North) was not a significant predictor, F(1, 479) = 3.37, p = 0.067.
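To make the modeling approach concrete, the following sketch shows how an ANCOVA of this form could be specified with Python’s statsmodels, using effect coding and Type III sums of squares, one common convention for between-subjects effects tables. The column names and the file path are hypothetical placeholders, as the dataset is not publicly available; the sketch illustrates the general technique rather than the authors’ actual analysis code.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: journalistic_bg (yes/no), experience (4 ordinal bands),
# region (Global South/North), education (5-level attainment treated as a
# numeric covariate here), info_literacy (summed 6-30 scale).
df = pd.read_csv("creators.csv")  # placeholder path

model = smf.ols(
    "info_literacy ~ C(journalistic_bg, Sum) + C(experience, Sum) "
    "+ education + C(region, Sum)",
    data=df,
).fit()

# Type III tests mirror a between-subjects effects table for an ANCOVA.
anova_table = sm.stats.anova_lm(model, typ=3)
print(anova_table)
```

The same model can be re-estimated with each of the six literacy items as the outcome to reproduce the dimension-level analyses reported below.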
Separate ANCOVAs were then conducted to examine predictors of each dimension of information literacy, controlling for education and economic development. For the dimension “finding the information needed to create the content”, the results showed that education was a significant covariate, F(1, 479) = 12.26, p < 0.001, indicating that participants with higher levels of education reported greater confidence in their self-perceived ability to find information. Content creation experience also significantly predicted scores on this dimension, F(3, 479) = 2.96, p = 0.032, suggesting that more experienced content creators felt more competent in locating the information necessary for content production.
For the dimension “determining the quality of the collected information,” the results showed that education was still a significant covariate, F(1, 479) = 8.90, p = 0.003, indicating that participants with higher levels of education reported greater confidence in their self-perceived ability to evaluate the quality of information. Additionally, having a journalistic background significantly predicted this dimension, F(1, 479) = 5.69, p = 0.017, suggesting that participants with journalism experience felt more competent in assessing the quality of information.
For the dimension “identifying deepfake or disinformation,” the results showed that education was a significant covariate, F(1, 479) = 13.17, p < 0.001, indicating that participants with higher levels of education reported greater confidence in their self-perceived ability to detect deepfakes or disinformation. Content creation experience also significantly predicted this dimension, F(3, 479) = 3.69, p = 0.012, suggesting that participants with more content creation experience felt more competent in identifying deepfakes or disinformation.
For the dimension “understanding the consequences of disseminating incorrect information to the audience” while controlling for education and development, the results showed that education was a significant covariate, F(1, 479) = 6.38, p = 0.012, indicating that participants with higher education levels reported greater confidence in understanding the consequences of sharing incorrect information. Additionally, having a journalistic background significantly predicted this dimension, F(1, 479) = 7.75, p = 0.006, and content creation experience also had a significant effect, F(3, 479) = 2.71, p = 0.045. These findings suggest that both professional training and practical experience contribute to creators’ perceived competence in evaluating the potential impact of inaccurate information.
For the dimension “using diverse sources for content,” the results showed that economic development (Global South vs. Global North) was a significant covariate, F(1, 479) = 4.20, p = 0.041, indicating that participants from less developed regions reported slightly higher confidence in using diverse sources. Content creation experience also significantly predicted this dimension, F(3, 479) = 2.86, p = 0.036, suggesting that more experienced content creators felt more competent in sourcing diverse information.
5. Discussion and Conclusions
Metacognition has already been found to mediate the positive relationship between media literacy and fact-checking behavior (Lee & Ramazan, 2021). In the current era of disinformation and deepfakes, fact-checking has become an essential practice for content creators. Our findings indicate that having a journalistic background predicted higher self-perceived information literacy even after controlling for education and development, suggesting that a journalistic background may cultivate additional literacy skills beyond what is gained through formal education.
Our results also reveal that content creation experience contributed to self-perceived information literacy, indicating that active engagement in producing content such as blogs, videos, or other digital media may foster self-perceived information literacy. Greater content creation experience is associated with stronger confidence in one’s information literacy competencies. This finding aligns with metacognitive theory, particularly the concepts of metacognitive knowledge and metacognitive experiences, which explain how practical engagement can strengthen self-perceptions of competence.
As predicted, education is a predictor of self-perceived information literacy, corroborating previous research (Williams & Evans, 2008; Conner, 2012). However, economic development (Global South vs. Global North) does not predict self-perceived information literacy except for the dimension of using diverse sources. This indicates that content creators in both economically developed and less economically developed regions reported similar levels of information literacy when education, journalistic background, and content creation experience were taken into consideration. The growing accessibility of digital platforms and resources across regions can also level the playing field for information literacy among content creators. Nonetheless, creators in the economically disadvantaged Global South were more likely than their Global North counterparts to report being good at finding diverse sources for their content.
The significance of this study lies in its implication that experienced content creators or creators with journalistic backgrounds have high levels of self-perceived information literacy and often believe they can effectively detect misinformation or AI-generated content, such as deepfakes. However, this confidence may increase their vulnerability to errors in judgment. It might lead them to rely more on intuition rather than careful verification, overlook subtle cues of manipulation, and dismiss fact-checking practices they consider unnecessary. As a result, they may unintentionally spread or accept false information, assuming their expertise protects them from deception. In fact, another study of global content creators found that influencers with larger follower sizes do not do more rigorous information checking than those with smaller follower sizes, even though they have higher social capital, and that they preserve their credibility by diversifying the types of their sources (Ali et al., 2025). From a marketing perspective, when ambassadors or influencers misjudge or share malicious or false information, it can harm consumers. The risk extends beyond personal credibility, potentially damaging the trustworthiness of the brands they represent.
However, this study also leads us to suggest tapping into the metacognition of experienced content creators. Experienced creators’ metacognition for misinformation detection can be a useful heuristic cue, and the cues they use in social media posts can be tested for factual accuracy. Those cues that are proven to be effective at predicting accuracy and misinformation can be included in information literacy education.
One potential direction for social media platforms is to integrate an interface that supports transparency and informed decision-making. For instance, creators could be provided with AI-detection dashboards that allow them to assess whether their content or resources may involve AI-generated media, giving them greater control over the credibility of their work. This would help both experienced and inexperienced creators: experienced creators may trust their instincts and make prompt judgments about whether content is authentic, but may be wrong due to their biases, and a tool would allow them to do the checking easily; inexperienced creators may lack the confidence to make these judgments and need tools to help them check the accuracy of their information. Commonly available tools such as these would make it easier for them to check their sources. In addition, incorporating a warning indicator or an AI-generated content score for posts would enable influencers and content creators to evaluate potential risks before publishing or promoting sponsored content. Such tools would foster trust between creators, audiences, and advertisers by making the content production process more transparent. It would also be beneficial to identify the cues experienced content creators use to spot misinformation, incorporate them into screening tools, and test how accurately these cues predict misinformation. Alongside these technological solutions, it would be beneficial to include more practical misinformation detection training programs for content creators and to strengthen digital platform policies that reward careful fact-checking, such as adding an “information verified” badge to posts.
Despite its contributions to understanding the extent to which content creators’ prior knowledge and journalistic experience affect their perceived information literacy, this study has some limitations. First, journalistic background was measured using a single dichotomous question, which may have oversimplified respondents’ prior training and experience. We acknowledge that this overlooked the complexity of training and professional experience, which can vary widely in depth and formality. Future research can measure journalistic training in greater detail to show how specific forms of training affect perceived information literacy.
Second, information literacy was measured entirely through self-reported items, including the self-perceived ability to identify deepfakes and disinformation. This approach carries the risk that participants may have overestimated their capabilities. For example, they may believe they can detect fabricated content when, in practice, they cannot. Although it would have been desirable to administer a comprehensive information literacy test to the participating content creators, the data in this study are part of a larger study that could not focus solely on information literacy. Nonetheless, self-reported information literacy is a confidence measure that affects how content creators manage the content they receive and create daily.
Future research should further investigate this area, especially given the rapid transformations in content creation following the integration of artificial intelligence tools. Experimental methods could be used to assess actual levels of information literacy rather than relying solely on self-reported measures. Self-reported information literacy should be compared with actual information literacy knowledge and practice so that content creator education and support can pinpoint areas that need more emphasis. A longitudinal design may also help determine whether metacognitive confidence increases or decreases over time. Additionally, future studies should explore the gap between self-perceived and actual information literacy to provide a clear understanding of content creators’ actual competencies and what skills need to be developed to navigate the challenges of today’s digital information environment.
Author Contributions
Conceptualization, O.B.; Introduction and theoretical framing, A.B.; Literature review, O.B.; Methodology, O.B. and L.H.; Formal analysis, O.B., A.B. and L.H.; Investigation, O.B.; Findings interpretation, O.B., A.B. and L.H.; Discussion, A.B.; Conclusion, O.B. and L.H.; Writing—original draft preparation, O.B.; Writing—review and editing, A.B. and L.H.; Visualization, A.B.; Supervision and project administration, L.H.; Formatting, A.B. and L.H. All authors have read and agreed to the published version of the manuscript.
Funding
This study was commissioned by UNESCO as a contribution to the report ‘Behind the Screens: insights from digital content creators; understanding their intentions, practices and challenges’. © UNESCO 2024. This work is available under the Creative Commons Attribution-ShareAlike 3.0 IGO license (CC-BY-SA 3.0 IGO). The authors alone are responsible for the views expressed in this publication and they do not necessarily represent the views, decisions or policies of UNESCO.
Institutional Review Board Statement
The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Bowling Green State University. Approval Code: 2221820-3. Approval Date: 5 August 2024.
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
The data presented in this study are available on request from the corresponding author. The data are not publicly available due to restriction of the funder.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Accrediting Council on Education in Journalism and Mass Communications. (n.d.). Accrediting standards. Available online: https://www.acejmc.org/policies-process/accrediting-standards (accessed on 2 October 2025).
- ACRL Board of Directors. (2012). Information literacy competency standards for journalism students and professionals: Approved by the ACRL Board of Directors, October 2011. College & Research Libraries News, 73(5), 274–285.
- Adami, M. (2024, March 15). How AI-generated disinformation might impact this year’s elections and how journalists should report on it. Reuters Institute for the Study of Journalism. Available online: https://reutersinstitute.politics.ox.ac.uk/news/how-ai-generated-disinformation-might-impact-years-elections-and-how-journalists-should-report (accessed on 2 October 2025).
- Adams, B., Wilson, N. S., & Mertens, G. E. (2025). dmQAR: Mapping metacognition in digital spaces onto question–answer relationship. Education Sciences, 15(6), 751.
- Ali, H., Kasirye, F., & Ha, L. (2025). Does follower size matter? Diversity of sources and credibility assessment among social media influencers. Information, 16(11), 958.
- American Library Association. (1989). Presidential committee on information literacy: Final report. Available online: https://www.ala.org/acrl/publications/whitepapers/presidential (accessed on 2 October 2025).
- Bent, M., & Stubbings, R. (2011). The SCONUL seven pillars of information literacy: Core model. Available online: https://eprints.ncl.ac.uk/192827 (accessed on 2 October 2025).
- Boss, K. E., De Voe, K. M., Gilbert, S. R., Hernandez, C., Heuer, M. B., Hines, A., Knapp, J. A., Tokarz, R. E., Tucker, C. E., & Bisbee, K. V. (2022). Uncovering the research behaviors of reporters: A conceptual framework for information literacy in journalism. Journalism & Mass Communication Educator, 77(4), 393–413.
- Conner, T. R. (2012). The relationship between self-directed learning and information literacy among adult learners in higher education [Ph.D. dissertation, University of Tennessee].
- EJTA. (2020). Framework for competencies: Tartu declaration. Available online: https://ejta.eu/index.php/about-us/tartu-declaration/ (accessed on 2 October 2025).
- Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive developmental inquiry. American Psychologist, 34(10), 906–911.
- Franklin, B., & Carlson, M. (Eds.). (2013). Journalists, sources, and credibility: New perspectives. Routledge.
- Frau-Meigs, D. (2024). Algorithm literacy as a subset of media and information literacy: Competences and design considerations. Digital, 4(2), 512–528.
- Heiss, R., Nanz, A., & Matthes, J. (2023). Social media information literacy: Conceptualization and associations with information overload, news avoidance and conspiracy mentality. Computers in Human Behavior, 148, 107908.
- Hobbs, R. (2006). Multiple visions of multimedia literacy. Routledge Handbooks Online.
- Hobbs, R. (2008). Literacy challenges in the Arab states region: Building partnerships and promoting innovative approaches. In Approaches to instruction and teacher education in media literacy: Research paper. Doha.
- Holton, A. E., Coddington, M., & De Zúñiga, H. G. (2013). Whose news? Whose values? Journalism Practice, 7(6), 720–737.
- Jin, Z. (2025). A case study on Chinese latest information literacy standards in the AI era: Impact of inquiry-based projects on information application and creation. Information Development, 1–17.
- Johnson, P. R., Gran, E., & Cohn, S. (2025). Reflecting, regulating, adapting: Metacognition’s role in journalism practices. Journalism Studies, 26(11), 1376–1397.
- Jones-Jang, S. M., Mortensen, T., & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, 65(2), 371–388.
- Kirchhoff, S. (2021). Journalism education’s response to the challenges of digital transformation: A dispositive analysis of journalism training and education programs. Journalism Studies, 23(1), 108–130.
- Koltay, T. (2011). The media and the literacies: Media literacy, information literacy, digital literacy. Media, Culture & Society, 33(2), 211–221.
- Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
- Lee, D. K. L., & Ramazan, O. (2021). Fact-checking of health information: The effect of media literacy, metacognition and health information exposure. Journal of Health Communication, 26(7), 491–500.
- Lipka, M., & Shearer, E. (2023, November 28). Audiences are declining for traditional news media in the U.S.—With some exceptions. Pew Research Center. Available online: https://www.pewresearch.org/short-reads/2023/11/28/audiences-are-declining-for-traditional-news-media-in-the-us-with-some-exceptions/ (accessed on 2 October 2025).
- Livingstone, S. (2004). Media literacy and the challenge of new information and communication technologies. The Communication Review, 7(1), 3–14.
- Lynch, C. (1998). Information literacy and information technology literacy: New components in the curriculum for a digital culture. Coalition for Networked Information. Available online: https://www.cni.org/publications/cliffs-pubs/information-literacy-and-information-technology-literacy (accessed on 2 October 2025).
- Machin-Mastromatteo, J. D. (2021). Information and digital literacy initiatives. Information Development, 37(3), 329–333.
- MacMillan, M. E. (2014). Fostering the integration of information literacy and journalism practice: A long-term study of journalism students. Journal of Information Literacy, 8(2), 3–22.
- Muth, L., & Peter, C. (2023). Social media influencers’ role in shaping political opinions and actions of young audiences. Media and Communication, 11(3), 164–174.
- Naveed, M. A., & Saadia, H. (2023). Information literacy at journalists’ workplace in Pakistan. Journal of Librarianship and Information Science, 56(2), 453–467.
- Nordenstreng, K. (2009). Conclusions: Soul-searching at the crossroads of European journalism education. In European journalism education (pp. 511–517). Intellect Books.
- Ognyanova, K., Lazer, D., Robertson, R. E., & Wilson, C. (2020). Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust in government when your side is in power. Harvard Kennedy School Misinformation Review.
- Schraw, G., & Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460–475.
- Spilsbury, M. (2012). Journalists at work: Their view of training, recruitment and conditions. National Council for the Training of Journalists. Available online: https://www.nctj.com/wp-content/uploads/2021/08/jaw_2012.pdf (accessed on 2 October 2025).
- Spilsbury, M. (2014). Emerging skills for journalists. National Council for the Training of Journalists. Available online: https://www.nctj.com/wp-content/uploads/2021/08/NCTJ-Emerging-Skills-FINAL.pdf (accessed on 2 October 2025).
- Stocking, G., Wang, L., Lipka, M., Matsa, K. E., Widjaya, R., Tomasik, E., & Liedke, J. (2024, November 18). America’s news influencers. Pew Research Center. Available online: https://www.pewresearch.org/journalism/2024/11/18/americas-news-influencers/ (accessed on 2 October 2025).
- Thoman, E., & Jolls, T. (2008). Literacy for the 21st century: An overview and orientation guide to media literacy education. Theory CML MediaLit Kit. Center for Media Literacy.
- Williams, M. H., & Evans, J. J. (2008). Factors in information literacy education. Journal of Political Science Education, 4(1), 116–130.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).