Education Sciences
  • Review
  • Open Access

11 January 2025

Artificial Intelligence: An Untapped Opportunity for Equity and Access in STEM Education

1 Department of Special Education, Rehabilitation, and Counseling, Auburn University, Auburn, AL 36849, USA
2 Teacher Education, University of Central Florida, Orlando, FL 32816, USA
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Application of AI Technologies in STEM Education

Abstract

Artificial intelligence (AI) holds tremendous potential for promoting equity and access to science, technology, engineering, and mathematics (STEM) education, particularly for students with disabilities. This conceptual review explores how AI can address the barriers faced by this underrepresented group by enhancing accessibility and supporting STEM practices like critical thinking, inquiry, and problem solving, as evidenced by tools like adaptive learning platforms and intelligent tutors. Results show that AI can positively influence student engagement, achievement, and motivation in STEM subjects. By aligning AI tools with Universal Design for Learning (UDL) principles, this paper highlights how AI can personalize learning, improve accessibility, and close achievement gaps in STEM content areas. Furthermore, the natural intersection of STEM principles and standards with the AI4K12 guidelines justifies the logical need for AI–STEM integration. Ethical concerns, such as algorithmic bias (e.g., unequal representation in training datasets leading to unfair assessments) and data privacy risks (e.g., potential breaches of sensitive student data), require critical attention to ensure AI systems promote equity rather than exacerbate disparities. The findings suggest that while AI presents a promising avenue for creating inclusive STEM environments, further research conducted with intentionality is needed to refine AI tools and ensure they meet the diverse needs of students with disabilities to access STEM.

1. Introduction

Despite rapid technological advancements in science, technology, engineering, and mathematics (STEM), these fields remain inaccessible to over seven million students with disabilities in the U.S., perpetuating economic and occupational inequities (National Center for Education Statistics, 2022; National Center for Science and Engineering Statistics, 2023; U.S. Department of Education, 2019). While a growing body of research explores artificial intelligence (AI) in educational settings (e.g., see Table 1 and Table 2), a notable gap remains in its application to STEM accessibility specifically (National Center for Education Statistics, 2022; U.S. Department of Education, 2019). The current article addresses these gaps by reviewing how AI applications can enhance STEM accessibility and support Universal Design for Learning (UDL) for students with disabilities in K-12 settings. While the existing literature extensively explores the use of AI in education (Kavitha & Joshith, 2024; Zhai et al., 2021), AI in STEM education (Chng et al., 2023; Hwang & Tu, 2021; Jia et al., 2024; Mohamed et al., 2022; Ouyang et al., 2023; Xu & Ouyang, 2022), and AI for supporting students with disabilities (Barua et al., 2022; Bhatti et al., 2024; Hopcan et al., 2022; Pierrès et al., 2024; Rice & Dunn, 2023), insufficient attention has been paid to the specific intersection of AI in STEM education for students with disabilities.

1.1. The Unfortunate Reality of STEM Education for Individuals with Disabilities

In 2020, only 3% of the U.S. STEM workforce consisted of individuals with disabilities, highlighting a systemic issue that begins in K-12 education (Hall & Rathbun, 2020; National Center for Science and Engineering Statistics, 2023). As the demand for STEM knowledge continues to escalate, driven by rapid technological advancements and evolving global challenges, proficiency in these fields has become increasingly indispensable for everyday functions and will remain so in the foreseeable future (Executive Office of the President, 2020; Marino et al., 2021). Science, technology, engineering, and mathematics skills are crucial for specialized careers, navigating modern society, contributing to informed decision making, and fostering innovation. These skills, such as critical thinking and problem solving, are foundational for informed citizenship and future employability for all students, regardless of disability status (Maass et al., 2022; National Academies of Sciences et al., 2021; National Research Council, 2012).
Despite nearly seven million students with disabilities receiving special education services (Office of Special Education and Rehabilitative Services, 2024), they consistently underperform in STEM content areas compared to their peers without disabilities, exhibiting significant gaps in science and mathematics achievement across grade levels (National Center for Education Statistics, 2022; U.S. Department of Education, 2019). Students with disabilities score on average 36% lower in STEM assessments than their peers by 12th grade, a disparity exacerbated by inaccessible instructional methods (National Center for Education Statistics, 2022; U.S. Department of Education, 2019). Given the importance of STEM skills in daily life and global competitiveness, ensuring high-quality STEM education for students with and without disabilities is imperative (Executive Office of the President, 2020). The underrepresentation of individuals with disabilities in STEM fields constitutes both an economic and occupational injustice (Kolne & Lindsay, 2020; National Center for Science and Engineering Statistics, 2023; Paul et al., 2020).
Furthermore, a diverse and robust workforce, inclusive of traditionally marginalized communities such as individuals with disabilities, is essential for the vitality of STEM professions (Executive Office of the President, 2020; National Academies of Sciences et al., 2021). Increasing representation from individuals with disabilities enriches STEM fields with diverse perspectives and enhances national competitiveness (Basham et al., 2020; Marino et al., 2021; National Academies of Sciences et al., 2021). Despite national recognition of the importance of STEM skills and a diverse STEM workforce, students with disabilities continue to fall behind their peers (National Center for Education Statistics, 2022; U.S. Department of Education, 2019), indicating potential deficiencies in delivering effective STEM instruction to these students, leaving lasting effects (Hall & Rathbun, 2020).

Barriers to STEM Content

Students with disabilities face ongoing and multifaceted barriers when attempting to access STEM education. Barriers include inaccessible teaching methods, limited teacher training, and systemic biases in STEM instruction (Doabler et al., 2021; Klimaitis & Mullen, 2021b; Marino et al., 2021; Therrien et al., 2017). For example, students with disabilities often struggle with language, knowledge acquisition, retention, and fundamental academic skills (e.g., reading and writing) inherent in STEM learning (Marino et al., 2021; Therrien et al., 2017). Traditional methods of STEM instruction, primarily lecture-based, tend to emphasize skills that may not cater to the needs of all students, particularly those with disabilities (Friedensen et al., 2021; Therrien et al., 2017).
Consequently, these methods may hinder the accessibility of STEM education for traditionally underrepresented groups, including students with disabilities (Basham et al., 2020). Moreover, a lack of familiarity among teachers with inclusive practices and technologies could further impede the success of students with disabilities (Klimaitis & Mullen, 2021a, 2021b). Furthermore, barriers exist at the organizational level (e.g., access to facilities and resources, time limitations, and departmental isolation; Friedensen et al., 2021; Klimaitis & Mullen, 2021a, 2021b; Marino et al., 2021). Artificial intelligence offers possible solutions for addressing the STEM barriers faced by students with disabilities (Marino et al., 2023). Tools such as Microsoft’s Immersive Reader, which converts text to audio, and Google’s Lookout (Google LLC, 2019), which supports visually impaired students in navigating digital content, illustrate how AI can help overcome these challenges.

1.2. Artificial Intelligence: What It Is and Its Not-So-New History

Artificial intelligence is a branch of computer science that aims to create systems capable of performing tasks that typically require human intelligence, such as learning, reasoning, problem solving, perception, and language understanding. While AI has recently become a topic of concern in education, its history is long. The history of AI has seen significant milestones and evolving definitions influenced by philosophical inquiries (Copeland & Proudfoot, 2007), technological advancements (Kaul et al., 2020), and interdisciplinary collaborations (Hassabis et al., 2017).
The modern era of AI began in the mid-20th century, with pivotal contributions from figures like Alan Turing, who proposed the concept of a machine that could mimic human intelligence (Copeland & Proudfoot, 2007). In 1956, John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon organized the Dartmouth Conference, often cited as the official birth of AI as a field of study. The attendees ambitiously predicted that “every aspect of learning or any other feature of intelligence can, in principle, be so precisely described that a machine can be made to simulate it” (Crevier, 1993, p. 50).
The following decades saw substantial progress and setbacks in AI. Early successes in the 1950s and 1960s included programs that could play chess and solve algebra problems. Advances in machine learning, particularly neural networks and deep learning, drove the resurgence of AI in the late 20th and early 21st centuries. These technologies have enabled significant breakthroughs in image and speech recognition, autonomous vehicles, and natural language processing (Hamet & Tremblay, 2017). Artificial intelligence encompasses various applications, from medical diagnostics to autonomous vehicles (Groumpos, 2023).

1.3. Alignment of AI to STEM Practices and Standards

Jia and colleagues found 88% (67 out of 76) of the reviewed articles reported positive educational outcomes and findings from applying AI techniques in science education (Jia et al., 2024). The implementation of AI naturally aligns with STEM practices and supports STEM learning. Artificial intelligence supports and improves inquiry and skill application, facilitates interdisciplinary knowledge and creativity, and promotes higher-order thinking skills such as computational thinking, problem solving, and critical thinking. Additionally, educators can utilize AI to identify student learning patterns and behaviors unique to the STEM setting, such as higher-order thinking and problem solving (Xu & Ouyang, 2022).
Moreover, STEM content and academic standards, such as the National Council of Teachers of Mathematics (NCTM) process standards and the Next Generation Science Standards (NGSS), naturally align with one another (Holman & Kohnke, 2024). For example, the NGSS science and engineering practice of using mathematics and computational thinking directly overlaps with mathematics standards. Furthermore, mathematics and science standards and practices align with the AI educational guidelines (e.g., AI4K12 guidelines and big ideas). For example, the NGSS science and engineering practice of constructing explanations and designing solutions aligns with the NCTM process standard of problem solving. Both then align with AI4K12’s big idea of societal impact. The authors of the present article further illustrate the alignment between the AI4K12 guidelines, the NGSS scientific and engineering practices, and the NCTM process standards in Figure 1. While the transdisciplinary intersection needs further research, the intersection of AI4K12 and NGSS is evident in the recent literature. Touretzky and colleagues illustrate how the NGSS crosscutting concepts (i.e., patterns, systems, and system models) and the science and engineering practices (i.e., asking questions and defining problems, developing and using models, planning and carrying out investigations, analyzing and interpreting data, constructing explanations and designing solutions) align with the skills required for STEM and AI education (Touretzky et al., 2023).
Figure 1. Transdisciplinary Relationship between NGSS, NCTM, and AI4K12.
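To make this crosswalk concrete, the sketch below encodes two of the alignments just described as a simple lookup structure. It is an illustrative, partial mapping only; the entries marked as assumed are hypothetical pairings added for the example and are not drawn from Figure 1.

```python
# Illustrative, partial crosswalk between NGSS science and engineering practices,
# NCTM process standards, and AI4K12 big ideas, based on the alignments described
# above. This is a sketch for illustration, not an authoritative mapping.
alignment = {
    "Constructing explanations and designing solutions": {
        "nctm_process_standard": "Problem solving",   # alignment stated in the text
        "ai4k12_big_idea": "Societal impact",         # alignment stated in the text
    },
    "Using mathematics and computational thinking": {
        "nctm_process_standard": "Representation",    # assumed pairing for the example
        "ai4k12_big_idea": "Perception",              # assumed pairing for the example
    },
}

for ngss_practice, links in alignment.items():
    print(f"NGSS practice: {ngss_practice}")
    print(f"  NCTM process standard: {links['nctm_process_standard']}")
    print(f"  AI4K12 big idea: {links['ai4k12_big_idea']}")
```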

Ongoing Emphasis on Access and Equity

The emphasis on improving equity and access in STEM areas is ongoing (Executive Office of the President, 2020). The NGSS highlights the importance of creating access for traditionally underserved groups in science (NGSS Lead States, 2013). Additionally, equity is the first of the six NCTM mathematics principles (Ferrini-Mundy, 2000; National Council of Teachers of Mathematics, 2000). Furthermore, societal impact, AI4K12’s fifth big idea in AI, is closely related to equity and access in terms of recognizing bias and reaching various underrepresented populations (e.g., geographic, socioeconomic, race, ethnicity, disability, resource availability for schools and communities, formal and informal learning). Using AI can create equity of access (AI4K12, 2020, 2024). Denying students experiences with AI technologies in a STEM setting limits their ability to thrive in a technology-driven future (Yang, 2022). Additionally, AI and STEM education emphasize the importance of creative inquiry and problem-solving skills, both of which will be highly important in the future workforce (Yang, 2022).

2. Methodology

Early in the process, we identified a significant gap in research addressing the intersection of AI, STEM education, and students with disabilities. To address this, we shifted focus toward identifying emerging themes in AI’s application in STEM education, emphasizing accessibility, personalized learning for students with disabilities, and universal design. Given the limited research directly exploring this intersection, we adopted a conceptual review approach (Hulland, 2020), uncovering patterns and insights across related fields. This approach provides a comprehensive understanding of current research trends and gaps, informing future investigations (Hulland, 2020).
In late June 2024, we conducted a targeted search to identify relevant literature across EBSCOhost, PsycINFO, ERIC (EBSCO), and Google Scholar. To ensure comprehensive coverage, we categorized our search into five key areas: (1) AI and education, (2) AI and STEM education, including individual disciplines of science, mathematics, technology, and engineering, (3) AI and students with disabilities or special education, (4) AI and Universal Design for Learning (UDL), and (5) combinations of these categories (e.g., AI, STEM, and students with disabilities or AI, mathematics, disabilities, and UDL).
The search process began with descriptors from the ERIC Thesaurus, such as “Artificial Intelligence” and “Special Education”, to ensure consistency and alignment with standardized terms and enhance discoverability. We paired these descriptors with synonyms (e.g., A.I. or AI or machine learning or deep learning) to identify relevant literature. We included studies that addressed AI in K-12 or post-secondary educational contexts. We reviewed abstracts for relevance and excluded articles unavailable through our universities’ library systems, not written in English, or focused on non-educational applications of AI. Our conceptual analysis focused on article type, general findings, types and purposes of AI, educational settings, populations studied, and AI’s impacts. See Table 1 for a summary of the review articles examined.
Table 1. Highlighted Reviews.
Citation | Topic | Article Type | Number of Articles Reviewed | Types of Articles Reviewed
Ahmad et al. (2020) | AI and Education | Bibliometric analysis and systematic review | N = 3246 | Articles from top venues from 2014 to 2020
Dai and Ke (2022) | AI and Education | Systematic mapping review and thematic synthesis | N = 59 | Peer-reviewed journal articles, book chapters, and conference proceedings. Years not specified
Kavitha and Joshith (2024) | AI and Education | Bibliometric analysis | N = 324 | Articles from 2003 to 2023 in Scopus
Misra et al. (2023) | AI and Education | Thematic review | N = 280 | Articles from 1976 to 2023
Paek and Kim (2021) | AI and Education | Bibliometric analysis | N = 5035 | Articles from 2001 to 2021
Yousuf and Wahid (2021) | AI and Education | Review study | Not specified | Not specified
Zhai et al. (2021) | AI and Education | Systematic review | N = 142 | Research articles, review papers, interview papers, and book reviews from 2010 to 2020
Chng et al. (2023) | AI and STEM | Literature review | N = 82 | Empirical articles. Years not specified
Ouyang et al. (2023) | AI and STEM | Systematic review | N = 17 | Empirical research articles from January 2011 to April 2023
Xu and Ouyang (2022) | AI and STEM | Systematic review | N = 63 | Empirical articles from 2011 to 2021
Hwang and Tu (2021) | AI and Mathematics | Bibliometric analysis and systematic review | N = 43 | Publications from 1996 to 2020
Mohamed et al. (2022) | AI and Mathematics | Systematic review | N = 20 | Articles from 2017 to 2021 in indexed journals
Jia et al. (2024) | AI and Science | Systematic review | N = 76 | Studies from 2013 to 2023
Barua et al. (2022) | AI and Students with disabilities | Systematic review | N = 26 | Peer-reviewed articles from 2011 to 2021
Bhatti et al. (2024) | AI and Students with disabilities | Systematic review | N = 16 | Journal articles and conference proceedings from 2015 to 2022
Hopcan et al. (2022) | AI and Students with disabilities | Systematic review | N = 29 | Studies between 2008 and 2020
Kharbat et al. (2021) | AI and Students with disabilities | Systematic review | N = 105 | Peer-reviewed articles published between January 2000 and 2020
Pierrès et al. (2024) | AI and Students with disabilities | Systematic review | N = 72 | Articles from 2018 to 2022
Rice and Dunn (2023) | AI and Students with disabilities | Systematic review | N = 18 | Publications from 2009 to 2022
Zdravkova et al. (2022) | AI and Students with disabilities | Narrative literature review | Not stated | Articles from 2012 to 2022
Bray et al. (2024) | UDL and Technology | Systematic review | N = 15 | Peer-reviewed publications. Years not specified
Note. AI: artificial intelligence; STEM: science, technology, engineering, and mathematics education; UDL: Universal Design for Learning.
To address gaps in the available literature reviews, we included primary studies, book chapters, and practitioner articles when existing reviews lacked comprehensive coverage of key areas, such as AI and UDL or AI, STEM, and students with disabilities. This approach allowed us to integrate diverse sources to capture emerging themes and provide a multifaceted understanding of the current state of research (see Table 2).
Table 2. Resources Other than Reviews.
Citation | Topic | Article Type | Population Focus
Gawande et al. (2020) | AI and Education | Research article | Higher education
Ivanović et al. (2022) | AI and Education | Book chapter | All levels
How and Hung (2019) | AI and STEM/STEAM | Application article | K-12 STEAM settings
Ogunkunle and Qu (2020) | AI and STEM | Research article | Students in STEM subjects
Vasconcelos and Santos (2023) | AI and STEM | Research article | Students in simulated STEM learning experience
Yelamarthi et al. (2024) | AI and engineering | Position article | Students in engineering education
McMahon and Walker (2019) | AI and UDL | Practitioner article | Not specified
Saborío-Taylor and Rojas-Ramírez (2024) | AI and UDL | Practitioner article | Not specified
Almufareh et al. (2024) | AI and Students with disabilities | Conceptual paper | Individuals with disabilities
Ivanović et al. (2019) | AI and Students with disabilities | Position article | Students with disabilities in virtual environments
Lamb et al. (2023) | AI and Students with disabilities | Opinion paper | Students with disabilities
Marino et al. (2023) | AI and Students with disabilities | Position article | Students receiving special education in K-12 settings
Rai et al. (2023) | AI and Students with disabilities | Research article | Students with learning disabilities
Center for Innovation, Design, and Digital Learning (2024) | AI and all learners | Report | All learners, early childhood–higher education
Hyatt and Owenz (2024) | AI, UDL, and Students with disabilities | Research article | Graduate students with disabilities
Shukla et al. (2016) | AI, STEM, and Students with disabilities | Research article | Individuals with profound and multiple learning disabilities
Hughes et al. (2022) | AI, STEM, UDL, and Students with disabilities | Research article | Elementary students with autism spectrum disorder
Note. AI: artificial intelligence; STEM: science, technology, engineering, and mathematics education; UDL: Universal Design for Learning.

4. Precautions for AI Implementation and Research

Implementing AI technologies involves various challenges requiring careful planning and execution to ensure safety, effectiveness, and ethical compliance. In sensitive areas like healthcare and education, where AI systems collect vast amounts of personal data, stakeholders must address concerns about privacy rights and potential data breaches. They should implement robust encryption, access controls, and data anonymization techniques to prevent unauthorized access. Additionally, organizations must establish transparent data practices, enforce stringent data protection measures, and clarify data ownership and control policies (He et al., 2019; Vasquez et al., 2024).
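As a minimal sketch of one such safeguard, the example below pseudonymizes student identifiers with a salted one-way hash and keeps only the fields needed for analysis. All field names are hypothetical, and a real deployment would also need encryption, access controls, and key management, which this sketch does not show.

```python
import hashlib
import secrets

# Hypothetical student records; field names are illustrative only.
records = [
    {"student_id": "S-1041", "name": "A. Learner", "iep_status": True, "quiz_score": 87},
    {"student_id": "S-2093", "name": "B. Learner", "iep_status": False, "quiz_score": 74},
]

SALT = secrets.token_hex(16)  # per-dataset salt, stored separately from the data

def pseudonymize(student_id: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def minimize(record: dict) -> dict:
    """Strip direct identifiers and keep only the fields needed for analysis."""
    return {
        "pseudo_id": pseudonymize(record["student_id"]),
        "iep_status": record["iep_status"],
        "quiz_score": record["quiz_score"],
    }

print([minimize(r) for r in records])
```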
Additionally, transparency in AI algorithms is crucial for building trust among users and stakeholders. Explainable AI techniques can provide clear insights into decision making, thus fostering confidence in the technology’s outcomes (Wiencierz & Lünich, 2022). Explainable AI (XAI) techniques aim to make the operations of AI systems more transparent and interpretable. This means that users, developers, or stakeholders can understand how and why an AI system reaches a particular decision or outcome (Ali et al., 2023).
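As one hedged illustration of an explainable AI technique, the sketch below fits a simple classifier on synthetic data and uses permutation importance to show which inputs most affect its predictions. The feature names and data are invented for the example and do not describe any system discussed in this article.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic, hypothetical data: three features predicting a pass/fail outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much does accuracy drop when one feature is shuffled?
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
feature_names = ["time_on_task", "hint_requests", "prior_score"]  # hypothetical labels
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name}: mean importance {score:.3f}")
```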
Regulatory compliance is another significant aspect when AI must adhere to stringent standards. Navigating the regulatory landscape is essential to ensure that AI applications meet legal and ethical requirements (Nilsen et al., 2023). Furthermore, interoperability and standardization are critical for seamlessly integrating AI systems into existing infrastructure. Standardized protocols and formats facilitate efficient data exchange and operation across different platforms, ensuring AI applications function cohesively within the broader technological ecosystem (Svedberg et al., 2022).
Artificial intelligence in education offers potential benefits but raises significant ethical concerns regarding fairness, confidentiality, and algorithmic prejudice (Barnes & Hutson, 2024; Vasquez et al., 2024; Weber, 2020). Ensuring equitable access to AI technologies is crucial to prevent exacerbating existing educational and social inequalities (Baker & Hawn, 2022; Vasquez et al., 2024; Weber, 2020). For example, students in well-funded schools may have greater access to cutting-edge AI tools. Algorithmic bias stemming from biased training data can perpetuate inequalities and lead to unfair admissions, grading, and assessment decisions, highlighting the importance of designing AI systems that are transparent, explainable, and developed with a critical awareness of potential biases (Slimi & Carballido, 2023; Vasquez et al., 2024).
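To illustrate the kind of check this concern implies, the sketch below compares a hypothetical model's error rates and positive prediction rates across two student groups; a large gap would flag potential algorithmic bias. The data, group labels, and bias pattern are synthetic and purely illustrative, and real audits would draw on the richer fairness notions discussed below.

```python
import numpy as np

# Synthetic, hypothetical audit data: true outcomes, model predictions, and a
# binary group indicator (e.g., students with vs. without an IEP). Illustrative only.
rng = np.random.default_rng(1)
group = rng.integers(0, 2, size=1000)
y_true = rng.integers(0, 2, size=1000)

# Simulate a biased model: predictions are flipped more often for group 1.
flip = rng.random(1000) < np.where(group == 1, 0.30, 0.10)
y_pred = np.where(flip, 1 - y_true, y_true)

for g in (0, 1):
    mask = group == g
    error_rate = np.mean(y_pred[mask] != y_true[mask])
    positive_rate = np.mean(y_pred[mask])
    print(f"group {g}: error rate {error_rate:.2%}, positive prediction rate {positive_rate:.2%}")

# A simple demographic-parity-style gap in positive prediction rates.
gap = abs(np.mean(y_pred[group == 0]) - np.mean(y_pred[group == 1]))
print(f"positive-rate gap between groups: {gap:.2%}")
```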
Researchers emphasize the need for comprehensive ethical frameworks (Caccavale et al., 2022) and strategies to mitigate bias, such as diverse datasets and adherence to ethical guidelines (Barnes & Hutson, 2024). Addressing these challenges requires interdisciplinary collaboration (Barnes & Hutson, 2024), policy regulation (Slimi & Carballido, 2023), and education about AI ethics (Weber, 2020). Efforts to ensure algorithmic fairness in education should consider various notions of fairness (Kizilcec & Lee, 2023) and focus on moving from unknown to known bias and from fairness to equity (Baker & Hawn, 2022). Continuous vigilance and proactive strategies are crucial for responsible AI integration in education (Barnes & Hutson, 2024; Massala, 2023).
Integrating AI in education also presents opportunities to enhance instructional methodologies and personalize learning experiences. However, viewing AI as a tool to augment rather than replace teachers’ roles is essential, emphasizing the importance of soft skills such as emotional intelligence and creativity. Ongoing professional development and support for teachers are crucial to effectively integrate AI tools into their teaching practices and address ethical considerations. Educators play a vital role in teaching students about the nature of AI, its potential biases, and its impact on society, fostering students’ critical perspective and ethical awareness. Ultimately, navigating the ethical implications of AI integration in education requires a conscientious approach that prioritizes equitable access, ethical awareness, and the cultivation of technical and soft skills to prepare students for a technologically advanced future while upholding shared values of equity, inclusivity, and human dignity (Vasquez et al., 2024).

Closed vs. Open Bot: Information Source and Transparency

The debate between closed and open bots primarily revolves around their transparency and the source of information they utilize. Maslej and colleagues delineate several key differences between open and closed AI bots, focusing on accessibility, performance, transparency, and security in the 2024 Artificial Intelligence Index Report (Maslej et al., 2024).
Developers of fully closed models (e.g., Google’s Gemini) retain exclusive access to them. In contrast, some models, including OpenAI’s GPT-4 and Anthropic’s Claude 2, provide limited access via an API without releasing their model weights for independent scrutiny or modification. Open models (e.g., Meta’s Llama 2, Stability AI’s Stable Diffusion) release their model weights publicly, allowing broader use and independent modifications (Maslej et al., 2024). Although it is a closed bot, the Education and Learning in an Inclusive Environment (EL) chatbot promotes transparency by allowing users to see where its output information comes from (Zaugg, 2024).
In terms of performance, closed models consistently surpass open models across various benchmarks. Closed models exhibit a median performance advantage of 24.2% on ten selected AI benchmarks, with performance differences ranging from 4.0% on tasks like GSM8K to 317.7% on tasks like AgentBench. While open models offer accessibility and modifiability, they generally lag behind closed models in performance, as evidenced by significant performance gaps across various benchmarks (Maslej et al., 2024).
Transparency represents another critical distinction. Closed models typically score lower on transparency metrics, with an average transparency score of 30.9%, indicating limited access to information about their development and operations. Conversely, open models achieve significantly higher transparency scores, averaging 51.3%. The openness of these models allows for greater auditability and a broader range of perspectives in their use and development (Maslej et al., 2024).
Security considerations further differentiate these models. Proponents of closed models argue that controlled access reduces the likelihood of their use for malicious purposes, such as creating disinformation or bioweapons. This controlled access helps mitigate security risks associated with powerful AI capabilities. However, while open models foster innovation and transparency, they also pose significant security risks. Their accessibility can facilitate the creation and dissemination of harmful content, necessitating robust security measures to prevent misuse (Maslej et al., 2024).
These differences between open and closed AI bots hold important implications for AI policy and practice. The superior performance of closed models suggests a trade-off between accessibility and performance. Policymakers and practitioners must balance the benefits of transparency and innovation associated with open models against the enhanced performance and potentially reduced security risks of closed models. This balance is crucial in shaping the future landscape of AI development and deployment (Maslej et al., 2024).

5. Conclusions

Despite federal policies to improve access (e.g., Rehabilitation Act of 1973, Americans with Disabilities Act of 1990, Individuals with Disabilities Education Improvement Act, 2004) and national initiatives to increase underrepresented populations in STEM fields (Executive Office of the President, 2020), significant achievement gaps in STEM content areas remain between students with and without disabilities (National Center for Education Statistics, 2022; U.S. Department of Education, 2019). While these policies and initiatives have laid the groundwork for improving access, they have not fully addressed the persistent disparities in STEM achievement for students with disabilities. However, recent advancements in AI offer promising solutions to enhance accessibility and support for students with disabilities to reduce educational barriers (Educating All Learners Alliance & New America, 2024; Marino et al., 2023).
This conceptual review highlights the underexplored intersection of AI, STEM education, and students with disabilities. We uncovered patterns and insights across related disciplines, offering a deeper understanding of current research trends and gaps to inform research, practice, and technological developments. Our findings emphasize AI’s potential to enhance equity and accessibility in STEM education. Additionally, we advocate for the intentional integration of UDL principles, STEM standards (e.g., AI4K12, NGSS, NCTM), and AI to promote greater access for students with disabilities. Ultimately, this article contributes evidence-based insights to inform future research and guide practical applications.
Current research highlights AI’s transformative potential in education, particularly for special education and STEM fields. Artificial intelligence can personalize learning, adapt content, and promote inclusion for students with disabilities (Nixon et al., 2024; Santos et al., 2024). However, challenges persist, including ethical concerns related to privacy, bias, and transparency (Holmes et al., 2022; Ifenthaler et al., 2024). Policymakers must prioritize AI research funding, update curricula, provide professional development for educators (Dieker et al., 2024; Mosher et al., 2024), and develop assessment metrics (Huong, 2024; Saxena, 2024). Experts recommend interdisciplinary collaboration, innovative methodologies, and long-term impact evaluations to maximize AI’s potential while maintaining ethical standards (Hwang et al., 2020; Tanveer et al., 2020). They stress developing clear ethical guidelines, enhancing teacher training, and creating frameworks that balance technological advancements with human-centered approaches (Holmes et al., 2022; Ifenthaler et al., 2024) to ensure equity and accessibility for all students.
Integrating AI with UDL principles enhances inclusivity and accessibility in education. By personalizing learning experiences and addressing diverse student needs, this approach would particularly benefit students with disabilities in STEM fields and improve academic outcomes through AI-based assignments with UDL options (Basham & Marino, 2013; Hyatt & Owenz, 2024; Izzo, 2012; Mrayhi et al., 2023; Roshanaei et al., 2023; Saborío-Taylor & Rojas-Ramírez, 2024; Song et al., 2024; Southworth et al., 2023).

6. Implications and a Call for Research

Artificial intelligence has great potential for enhancing the learning experiences and outcomes for students with disabilities, particularly in STEM courses. However, realizing this potential requires careful consideration of the implications for educational practice, future research, and technology development. The following subsections will explore these implications in detail, offering a roadmap for educators, researchers, and developers.

6.1. Implications for Practice

Incorporating AI in STEM education to serve students with disabilities requires clear guidelines for identifying AI tools that align with UDL principles (Center for Innovation, Design, and Digital Learning, 2024). Teachers must grasp best practices and strategies for implementing AI in the classroom to support individualized learning and ensure that AI tools meet diverse student needs (Center for Innovation, Design, and Digital Learning, 2024; Mohamed et al., 2022).
To provide a concrete example of how AI (e.g., AI4K12) can align with STEM standards and UDL principles, we highlight a lesson developed by the Education and Learning in an Inclusive Environment (EL) chatbot (Zaugg, 2024). The initial prompt was the following: “I need a lesson plan that incorporates the use of AI. Address NGSS MS-LS2-1 and MS-LS2-4 and one of the AI4K12’s big ideas. Provide recommendations for UDL and supporting students with disabilities.” Refer to Table 5 for the output of the Exploring Ecosystems with AI lesson plan. In this lesson, students utilize AI tools to analyze and interpret data on ecosystems, simulate the effects of changes in resource availability, and construct evidence-based arguments on how these changes impact populations, enhancing their understanding of ecological dynamics and the application of technology in environmental science.
Table 5. EL Chatbot Output: Exploring Ecosystems with AI Lesson Plan.
To facilitate effective AI integration, it is essential to provide teachers with comprehensive resources, guidelines, ongoing coaching, and professional development tailored to varying levels of AI incorporation (Center for Innovation, Design, and Digital Learning, 2024; Dieker et al., 2024; Marino et al., 2023; Mosher et al., 2024). This support should accommodate teachers’ and students’ comfort and proficiency levels, as well as differing institutional policies and resources. Establishing clear ethical guidelines for AI usage is crucial (Center for Innovation, Design, and Digital Learning, 2024; Dieker et al., 2024; Marino et al., 2023; Mosher et al., 2024); teacher preparation programs and instructional coaches should equip educators to facilitate informed discussions about AI with students, parents, and colleagues. Additionally, teachers must understand the potential risks associated with AI and develop strategies to mitigate them (Center for Innovation, Design, and Digital Learning, 2024; Dieker et al., 2024; Marino et al., 2023; Mosher et al., 2024) while staying informed about emerging trends and technological advancements. Educators should explicitly teach students how to use AI tools, highlighting their limitations and promoting critical thinking and responsible usage.
Teacher preparation programs and professional development coaches must prioritize equipping educators with the knowledge and skills to select, adapt, and implement AI-driven solutions that enhance accessibility and foster inclusive learning environments (Center for Innovation, Design, and Digital Learning, 2024; Dieker et al., 2024; Mosher et al., 2024). This approach empowers educators to leverage AI’s potential while addressing the unique needs of students with disabilities in STEM fields. Furthermore, educators require dedicated time to explore and effectively utilize AI tools, allowing them to intentionally design instruction, modify activities, and support all students, including those with disabilities, in their use of AI. Effective assessment of AI tools is vital for measuring their impact on learning outcomes (Ouyang et al., 2023). Yet, educators need instruction and support in utilizing assessment data to inform their instruction while efficiently using time to reduce teacher burnout. Ultimately, educators play an irreplaceable role in facilitating learning, even in a technology- and AI-driven culture (Center for Innovation, Design, and Digital Learning, 2024; Kohnke, 2023; Ouyang et al., 2023).

6.2. Implications for Technology Developers

Understanding the diverse needs of public school systems is critical for technology developers aiming to create effective and inclusive educational tools. Developers must account for significant differences in Internet connectivity between urban and rural districts (for example, the Federal Communications Commission (FCC) reported in 2021 that 23% of rural Americans lack broadband coverage compared to only 1.5% of urban Americans; Federal Communications Commission, 2021), as well as varying patterns of day-to-day technology use (Center for Innovation, Design, and Digital Learning, 2024). Options for offline functioning should be available. Developers should also consider cost limitations, particularly in underfunded districts (Hopcan et al., 2022; Mohamed et al., 2022). Collaboration between technology developers, educational practitioners, and researchers is crucial to ensure that AI tools are responsive to educational needs while adhering to ethical standards (Center for Innovation, Design, and Digital Learning, 2024; Nixon et al., 2024; Pierrès et al., 2024).
Moreover, developers should prioritize the creation of AI-embedded curricula and educational tools that integrate Universal Design for Learning (UDL) principles, providing accommodation and support for students with diverse abilities and knowledge levels. Research shows that students value interactive elements and support within various technology programs (Kohnke, 2023; Pierrès et al., 2024). Furthermore, integrating AI into extended reality environments could enhance these tools by offering virtual intelligent tutoring systems for content support, inquiry guidance, real-time feedback, and coaching (Kohnke, 2023). These features would enable educators and students to tailor virtual support, making extended reality and other virtual curriculum tools more inclusive and adaptive (Kohnke, 2023; Mohamed et al., 2022).

6.3. Future Research Directions and Their Implications

While research on AI in education has grown steadily, most of it centers on general STEM education rather than special education. Therefore, more researchers should study the use of AI in special education and in serving individuals with disabilities (Bhatti et al., 2024; Marino et al., 2023; Zhai et al., 2021). Furthermore, future research should focus on AI as a means for students with disabilities to access STEM content areas, exploring how AI-driven tools can bridge gaps in learning and engagement (Hopcan et al., 2022; Nixon et al., 2024). One initial avenue is researching AI’s role as a scaffold that enhances students’ learning processes (How & Hung, 2019; Rice & Dunn, 2023) and supports teachers (Marino et al., 2023), rather than focusing on disability identification (Rice & Dunn, 2023). An additional opportunity is to expand investigation of AI supports for students’ social, emotional, behavioral, and academic development, which is essential for STEM careers (Hopcan et al., 2022; Hughes et al., 2012). As attention to AI for supporting students with disabilities in STEM education matures, systematic and thematic literature reviews and meta-analyses should analyze this developing intersection. Once promising evidence is established, these tools should be studied longitudinally to determine their sustained impact on student learning.
Researchers should also prioritize multidisciplinary collaborations in which experts in educational technology, STEM education, and special education work together on the same project (Center for Innovation, Design, and Digital Learning, 2024; Jia et al., 2024). Collaborating with and drawing on the perspectives of educators, parents, and students with disabilities can foster inclusive, innovative, and pertinent applications (Marino et al., 2023; Pierrès et al., 2024; Rice & Dunn, 2023). Furthermore, researchers must treat dissemination to practitioners, through social media, blogs, podcasts, and practitioner articles, with the same importance as dissemination to peer-reviewed research journals (Center for Innovation, Design, and Digital Learning, 2024).
Researchers must address AI’s ethical concerns, particularly regarding data privacy, algorithmic bias, and equitable access, ensuring AI solutions benefit all students, including those from diverse backgrounds and with varying disabilities (Barnes & Hutson, 2024; Vasquez et al., 2024). Incorporating representative samples in AI research is essential to reflect the unique experiences of students with disabilities, enabling the development of inclusive and effective AI tools (Center for Innovation, Design, and Digital Learning, 2024; Jia et al., 2024; Marino et al., 2023). This requires researchers and policymakers to prioritize diversity in datasets, establish clear ethical guidelines, and collaborate across disciplines to ensure that AI integration in education upholds fairness, inclusivity, and equity principles. These efforts will help mitigate the risk of exacerbating existing inequalities and foster the creation of AI technologies that are accessible, transparent, and equitable for all learners.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ahmad, K., Qadir, J., Al-Fuqaha, A., Iqbal, W., Elhassan, A., Benhaddou, D., & Ayyash, M. (2020). Artificial intelligence in education: A comprehensive review. Preprint, 1–42. [Google Scholar] [CrossRef]
  2. AI4K12. (2020). Big ideas poster. Available online: https://ai4k12.org/resources/big-ideas-poster/ (accessed on 24 July 2024).
  3. AI4K12. (2024). Artificial intelligence (AI) for K-12 initiative (AI4K12). Available online: https://ai4k12.org/ (accessed on 24 July 2024).
  4. Ali, S., Abuhmed, T., El-Sappagh, S., Muhammad, K., Alonso-Moral, J. M., Confalonieri, R., Guidotti, R., Del Ser, J., Diaz-Rodriguez, N., & Herrera, F. (2023). Explainable artificial intelligence (XAI): What we know and what is left to attain trustworthy artificial intelligence. Information Fusion, 99, 101805. [Google Scholar] [CrossRef]
  5. Almufareh, M. F., Kausar, S., Humayun, M., & Tehsin, S. (2024). A conceptual model for inclusive technology: Advancing disability inclusion through artificial intelligence. Journal of Disability Research, 3(1), 1–11. [Google Scholar] [CrossRef]
  6. Al Omoush, M., & Mehigan, T. (2023). Personalised presentation of mathematics for visually impaired or dyslexic students: Challenges and benefits. Ubiquity Proceedings, 3(1), 409–415. [Google Scholar] [CrossRef]
  7. Amaral, M. (2020). Wheelchair access and inclusion barriers on campus: Exploring Universal Design models in higher education. In Accessibility and Diversity in Education: Breakthroughs in Research and Practice (pp. 509–534). IGI Global. [Google Scholar] [CrossRef]
  8. Ampadu, Y. (2023). Handling big data in education: A review of educational data mining techniques for specific educational problems. AI, Computer Science and Robotics Technology, 2023(2), 1–16. [Google Scholar] [CrossRef]
  9. Baker, R. S., & Hawn, A. (2022). Algorithmic bias in education. International Journal of Artificial Intelligence in Education, 32(4), 1052–1092. [Google Scholar] [CrossRef]
  10. Barnes, E., & Hutson, J. (2024). Navigating the ethical terrain of AI in higher education: Strategies for mitigating bias and promoting fairness. Forum for Education Studies, 2(2), 1229. [Google Scholar] [CrossRef]
  11. Barua, P. D., Vicnesh, J., Gururajan, R., Oh, S. L., Palmer, E., Azizan, M. M., Kadri, N. A., & Acharya, U. R. (2022). Artificial intelligence enabled personalised assistive tools to enhance education of children with neurodevelopmental disorders—A review. International Journal of Environmental Research and Public Health, 19(3), 1192. [Google Scholar] [CrossRef]
  12. Basham, J. D., & Marino, M. T. (2013). Understanding STEM education and supporting students through Universal Design for Learning. Teaching Exceptional Children, 45(4), 8–15. [Google Scholar] [CrossRef]
  13. Basham, J. D., Marino, M. T., Hunt, C. L., & Han, K. (2020). Considering STEM for learners with disabilities and other diverse needs. In C. C. Johnson, M. J. Mohr-Schroeder, T. J. Moore, & L. D. English (Eds.), Handbook of Research on STEM Education (1st ed., pp. 128–137). Routledge. [Google Scholar]
  14. Bastien, F., Koop, R., Small, T. A., Giasson, T., & Jansen, H. (2020). The role of online technologies and digital skills in the political participation of citizens with disabilities. Journal of Information Technology & Politics, 17(3), 218–231. [Google Scholar] [CrossRef]
  15. Beyene, W. M., Mekonnen, A. T., & Giannoumis, G. A. (2023). Inclusion, access, and accessibility of educational resources in higher education institutions: Exploring the Ethiopian context. International Journal of Inclusive Education, 27(1), 18–34. [Google Scholar] [CrossRef]
  16. Bhatti, I., Mohi-U-din, S. F., Hayat, Y., & Tariq, M. (2024). Artificial intelligence applications for students with learning disabilities: A systematic review. European Journal of Science, Innovation and Technology, 4(2), 2. [Google Scholar]
  17. Bray, A., Devitt, A., Banks, J., Sanchez Fuentes, S., Sandoval, M., Riviou, K., Byrne, D., Flood, M., Reale, J., & Terrenzio, S. (2024). What next for Universal Design for Learning? A systematic literature review of technology in UDL implementations at second level. British Journal of Educational Technology, 55(1), 113–138. [Google Scholar] [CrossRef]
  18. Caccavale, F., Caccavale, C. L., Gernaey, K., & Krühne, U. (2022). To be fAIr: Ethical and fair application of artificial intelligence in virtual laboratories. In SEFI 50th Annual Conference of The European Society for Engineering Education: Towards a new future in engineering education, new scenarios that European alliances of tech universities open up (pp. 1022–1030). Universitat Politècnica de Catalunya. [Google Scholar] [CrossRef]
  19. CAST. (2018). UDL: The UDL guidelines version 2.2. Available online: http://udlguidelines.cast.org/ (accessed on 24 July 2024).
  20. Center for Innovation, Design, and Digital Learning. (2024). Inclusive intelligence: The impact of AI on education for all learners (pp. 1–99). Available online: https://ciddl.org/wp-content/uploads/2024/04/InclusiveIntelligence_a11y_navadded.pdf (accessed on 18 October 2024).
  21. Chng, E., Tan, A. L., & Tan, S. C. (2023). Examining the use of emerging technologies in schools: A review of artificial intelligence and immersive technologies in STEM education. Journal for STEM Education Research, 6(3), 385–407. [Google Scholar] [CrossRef]
  22. Copeland, B. J., & Proudfoot, D. (2007). Artificial intelligence: History, foundations, and philosophical issues. In Philosophy of psychology and cognitive science (pp. 429–482). North-Holland. [Google Scholar] [CrossRef]
  23. Crevier, D. (1993). AI: The tumultuous history of the search for artificial intelligence. Basic Books. Available online: https://research.ebsco.com/linkprocessor/plink?id=a8db9061-709f-3e7a-b47a-a7e07b3033d6 (accessed on 24 July 2024).
  24. Dai, C.-P., & Ke, F. (2022). Educational applications of artificial intelligence in simulation-based learning: A systematic mapping review. Computers and Education: Artificial Intelligence, 3, 100087. [Google Scholar] [CrossRef]
  25. Demirbilek, M., & Talan, T. (2022). Integrating assistive robotics in STEM education to empower people with disabilities. In Designing, constructing, and programming robots for learning (pp. 179–200). IGI Global. [Google Scholar] [CrossRef]
  26. Dieker, L., Hines, R., Wilkins, I., Hughes, C., Hawkins Scott, K., Smith, S., Ingraham, K., Ali, K., Zaugg, T., & Shah, S. (2024). Using an artificial intelligence (AI) agent to support teacher instruction and student learning. Journal of Special Education Preparation, 4(2), 78–88. [Google Scholar] [CrossRef]
  27. Doabler, C. T., Therrien, W. J., Longhi, M. A., Roberts, G., Hess, K. E., Maddox, S. A., Uy, J., Lovette, G. E., Fall, A.-M., Kimmel, G. L., Benson, S., VanUitert, V. J., Emily Wilson, S., Powell, S. R., Sampson, V., & Toprac, P. (2021). Efficacy of a second-grade science program: Increasing science outcomes for all students. Remedial and Special Education, 42(3), 140–154. [Google Scholar] [CrossRef]
  28. Dobransky, K., & Hargittai, E. (2006). The disability divide in internet access and use. Information, Communication & Society, 9(3), 313–334. [Google Scholar] [CrossRef]
  29. Educating All Learners Alliance & New America. (2024). Prioritizing students with disabilities in AI policy (pp. 1–15). Policy Brief. Available online: https://drive.google.com/file/d/1iaY6s466mlvzo-9SmcuKbohxVF1Pc274/view?usp=sharing&usp=embed_facebook (accessed on 18 October 2024).
  30. Executive Office of the President. (2020). Progress report on the federal implementation of the STEM education strategic plan. Available online: https://trumpwhitehouse.archives.gov/wp-content/uploads/2017/12/Progress-Report-Federal-Implementation-STEM-Education-Strategic-Plan-Dec-2020.pdf (accessed on 27 July 2024).
  31. Federal Communications Commission. (2021). Fourteenth broadband deployment report (FCC 21-18; Broadband Progress Reports, pp. 1–209). Federal Communications Commission. Available online: https://www.fcc.gov/reports-research/reports/broadband-progress-reports/fourteenth-broadband-deployment-report (accessed on 23 December 2024).
  32. Ferrini-Mundy, J. (2000). Principles and standards for school mathematics: A guide for mathematicians. Notices of the American Mathematical Society, 47(8), 868–876. [Google Scholar]
  33. Fichten, C. S., Asuncion, J. V., Barile, M., Fossey, M., & de Simone, C. (2000). Access to educational and instructional computer technologies for post-secondary students with disabilities: Lessons from three empirical studies. Journal of Educational Media, 25(3), 179–201. [Google Scholar] [CrossRef]
  34. Friedensen, R., Lauterbach, A., Kimball, E., & Mwangi, C. G. (2021). Students with high-incidence disabilities in STEM: Barriers encountered in postsecondary learning environments. Journal of Postsecondary Education and Disability, 34(1), 77–90. [Google Scholar]
  35. Gawande, V., Badi, H., & Al Makharoumi, K. (2020). An empirical study on emerging trends in artificial intelligence and its impact on higher education. International Journal of Computer Applications, 175, 43–47. [Google Scholar] [CrossRef]
  36. Google LLC. (2019). Lookout—Accessibility app (Version 12/5/2024) [Computer software]. Available online: https://play.google.com/store/apps/details?id=com.google.android.apps.accessibility.reveal&hl=en_US (accessed on 22 December 2024).
  37. Grodzinsky, F. S. (1997). Computer access for students with disabilities: An adaptive technology laboratory. SIGCSE Bulletin, 29(1), 292–295. [Google Scholar] [CrossRef]
  38. Groumpos, P. P. (2023). A critical historic overview of artificial intelligence: Issues, challenges, opportunities, and threats. Artificial Intelligence and Applications, 1(4), 4. [Google Scholar] [CrossRef]
  39. Gupta, S., Gupta, S. B., & Gupta, M. (2024). Importance of artificial intelligence in achieving SDGs in India. International Journal of Built Environment and Sustainability, 11(2), 1–26. [Google Scholar] [CrossRef]
  40. Hall, M., & Rathbun, A. (2020). Health and STEM career expectations and science literacy achievement of U.S. 15-year-old students: Stats in brief (NCES 2020-034; Stats in Brief, pp. 1–23). National Center for Education Statistics. Available online: https://nces.ed.gov/pubs2020/2020034.pdf (accessed on 26 December 2024).
  41. Hamet, P., & Tremblay, J. (2017). Artificial intelligence in medicine. Metabolism: Clinical & Experimental, 69, S36–S40. [Google Scholar] [CrossRef]
  42. Hassabis, D., Kumaran, D., Summerfield, C., & Botvinick, M. (2017). Neuroscience-inspired artificial intelligence. Neuron, 95(2), 245–258. [Google Scholar] [CrossRef]
  43. He, J., Baxter, S. L., Xu, J., Xu, J., Zhou, X., & Zhang, K. (2019). The practical implementation of artificial intelligence technologies in medicine. Nature Medicine, 25(1), 30. [Google Scholar] [CrossRef]
  44. Holman, K., & Kohnke, S. (2024). Melding mindsets: Isolated content, same destination, hidden opportunity. Constellations: Online STEM Teacher Education Journal, 1(1), 3. [Google Scholar]
  45. Holmes, W., Porayska-Pomsta, K., Holstein, K., Sutherland, E., Baker, T., Shum, S. B., Santos, O. C., Rodrigo, M. T., Cukurova, M., Bittencourt, I. I., & Koedinger, K. R. (2022). Ethics of AI in education: Towards a community-wide framework. International Journal of Artificial Intelligence in Education, 32(3), 504–526. [Google Scholar] [CrossRef]
  46. Hopcan, S., Polat, E., Ozturk, M. E., & Ozturk, L. (2022). Artificial intelligence in special education: A systematic review. Interactive Learning Environments, 31(10), 7335–7353. [Google Scholar] [CrossRef]
  47. How, M.-L., & Hung, W. L. D. (2019). Educing AI-thinking in science, technology, engineering, arts, and mathematics (STEAM) education. Education Sciences, 9(3), 184. [Google Scholar] [CrossRef]
  48. Hughes, C. E., Dieker, L. A., Glavey, E. M., Hines, R. A., Wilkins, I., Ingraham, K., Bukaty, C. A., Ali, K., Shah, S., Murphy, J., & Taylor, M. S. (2022). RAISE: Robotics & AI to improve STEM and social skills for elementary school students. Frontiers in Virtual Reality, 3. [Google Scholar] [CrossRef]
  49. Hughes, S., Pennington, J. L., & Makris, S. (2012). Translating autoethnography across the AERA standards: Toward understanding autoethnographic scholarship as empirical research. Educational Researcher, 41(6), 209–219. [Google Scholar] [CrossRef]
  50. Hulland, J. (2020). Conceptual review papers: Revisiting existing research to develop and refine theory. AMS Review, 10(1–2), 27–35. [Google Scholar] [CrossRef]
  51. Huong, X. V. (2024). The implications of artificial intelligence for educational systems: Challenges, opportunities, and transformative potential. The American Journal of Social Science and Education Innovations, 6(3), 101–111. [Google Scholar] [CrossRef]
  52. Hwang, G.-J., & Tu, Y.-F. (2021). Roles and research trends of artificial intelligence in mathematics education: A bibliometric mapping analysis and systematic review. Mathematics, 9(6), 584. [Google Scholar] [CrossRef]
  53. Hwang, G.-J., Xie, H., Wah, B. W., & Gašević, D. (2020). Vision, challenges, roles and research issues of artificial intelligence in education. Computers and Education: Artificial Intelligence, 1, 100001. [Google Scholar] [CrossRef]
  54. Hyatt, S. E., & Owenz, M. B. (2024). Using Universal Design for Learning and artificial intelligence to support students with disabilities. College Teaching, 1–8. [Google Scholar] [CrossRef]
  55. Ifenthaler, D., Majumdar, R., Gorissen, P., Judge, M., Mishra, S., Raffaghelli, J., & Shimada, A. (2024). Artificial intelligence in education: Implications for policymakers, researchers, and practitioners. Technology, Knowledge and Learning: Learning Mathematics, Science and the Arts in the Context of Digital Technologies, 29, 1693–1710. [Google Scholar] [CrossRef]
  56. Ivanović, M., Ganzha, M., Paprzycki, M., Bădică, C., Klašnja-Milićević, A., & Bădică, A. (2019). On measuring learning success of students with disabilities in virtual environments (Vol. 2508). Available online: https://research.ebsco.com/linkprocessor/plink?id=efc81286-f649-3149-b2ee-21ca4e4b3860 (accessed on 16 October 2024).
  57. Ivanović, M., Klašnja-Milićević, A., Paprzycki, M., Ganzha, M., Bădică, C., Bădică, A., & Jain, L. C. (2022). Current trends in AI-based educational processes—An overview. In Handbook on intelligent techniques in the educational process: Vol 1 recent advances and case studies (Vol. 29, pp. 1–15). Springer Nature. [Google Scholar]
  58. Izzo, M. V. (2012). Universal Design for Learning: Enhancing achievement of students with disabilities. Procedia Computer Science, 14, 343–350. [Google Scholar] [CrossRef]
  59. Jia, F., Sun, D., & Looi, C. (2024). Artificial intelligence in science education (2013–2023): Research trends in ten years. Journal of Science Education and Technology, 33(1), 94–117. [Google Scholar] [CrossRef]
  60. Kanter, A. (2019). The right to inclusive education for students with disabilities under international human rights law. In G. de Beco, S. Quinlivan, & J. E. Lord (Eds.), The right to inclusive education for students with disabilities under international human rights law (pp. 15–57). Cambridge University Press. [Google Scholar] [CrossRef]
  61. Kaul, V., Enslin, S., & Gross, S. A. (2020). History of artificial intelligence in medicine. Gastrointestinal Endoscopy, 92(4), 807–812. [Google Scholar] [CrossRef]
  62. Kavitha, K., & Joshith, V. P. (2024). The transformative trajectory of artificial intelligence in education: The two decades of bibliometric retrospect. Journal of Educational Technology Systems, 52(3), 376–405. [Google Scholar] [CrossRef]
  63. Kharbat, F. F., Alshawabkeh, A., & Woolsey, M. L. (2021). Identifying gaps in using artificial intelligence to support students with intellectual disabilities from education and health perspectives. Aslib Journal of Information Management, 73(1), 101–128. [Google Scholar] [CrossRef]
  64. Kizilcec, R. F., & Lee, H. (2023). Algorithmic fairness in education (1st ed., pp. 174–202). Routledge. [Google Scholar] [CrossRef]
  65. Klimaitis, C. C., & Mullen, C. A. (2021a). Access and barriers to science, technology, engineering, and mathematics (STEM) education for K–12 students with disabilities and females (pp. 813–836). Springer International Publishing. [Google Scholar] [CrossRef]
  66. Klimaitis, C. C., & Mullen, C. A. (2021b). Including K-12 students with disabilities in STEM education and planning for inclusion. Educational Planning, 28(2), 27–43. [Google Scholar]
  67. Koch, K. (2017). Stay in the box! Embedded assistive technology improves access for students with disabilities. Education Sciences, 7(4), 82. [Google Scholar] [CrossRef]
  68. Kohnke, S. (2023). The effect of extended reality on the science achievement gap between students with and without disabilities. University of Central Florida. Available online: https://stars.library.ucf.edu/etd2020/1810 (accessed on 20 December 2024).
  69. Kolne, K., & Lindsay, S. (2020). A systematic review of programs and interventions for increasing the interest and participation of children and youth with disabilities in STEM education or careers. Journal of Occupational Science, 27(4), 525–546. [Google Scholar] [CrossRef]
  70. Lamb, R., Choi, I., & Owens, T. (2023). Artificial intelligence and sensor technologies: The future of special education for students with intellectual and developmental disabilities. Journal of Intellectual & Developmental Disability, 11, 1–3. [Google Scholar] [CrossRef]
  71. Leddy, M. H. (2010). Technology to advance high school and undergraduate students with disabilities in science, technology, engineering, and mathematics. Journal of Special Education Technology, 25(3), 3–8. [Google Scholar] [CrossRef]
  72. Lunsford, S. K., & Bargerhuff, M. E. (2006). A project to make the laboratory more accessible to students with disabilities. Journal of Chemical Education, 83(3), 407. [Google Scholar] [CrossRef]
  73. Maass, K., Sorge, S., Romero-Ariza, M., Hesse, A., & Straser, O. (2022). Promoting active citizenship in mathematics and science teaching. International Journal of Science and Mathematics Education, 20(4), 727–746. [Google Scholar] [CrossRef] [PubMed]
  74. Mack, K., Sidik, N., Desai, A., McDonnell, E., Mehta, K., Zhang, C., & Mankoff, J. (2023, October 22–25). Maintaining the accessibility ecosystem: A multi-stakeholder analysis of accessibility in higher education. The 25th International ACM SIGACCESS Conference on Computers and Accessibility, New York, NY, USA. [Google Scholar] [CrossRef]
  75. Marino, M. T., Parsons, C., Brewer, J., & Vasquez, E. (2021). Students with disabilities in science/STEM. In L. M. Stroud, A. Roy, & K. S. Doyle (Eds.), Science laboratory safety manual (4th ed., pp. 351–353). National Safety Consultants, LLC. Available online: https://sites.google.com/site/nationalsafetyconsultantsllc (accessed on 20 December 2024).
  76. Marino, M. T., Vasquez, E., Dieker, L., Basham, J., & Blackorby, J. (2023). The future of artificial intelligence in special education technology. Journal of Special Education Technology, 38(3), 404–416. [Google Scholar] [CrossRef]
  77. Martiniello, N., Asuncion, J., Fichten, C., Jorgensen, M., Havel, A., Harvison, M., Legault, A., Lussier, A., & Vo, C. (2020). Artificial intelligence for students in postsecondary education: A world of opportunity. AI Matters, 6(3), 17–29. [Google Scholar] [CrossRef]
  78. Maslej, N., Fattorini, L., Perrault, R., Parli, V., Reuel, A., Brynjolfsson, E., Etchemendy, J., Ligett, K., Lyons, T., Manyika, J., Niebles, J. C., Shoham, Y., Wald, R., & Clark, J. (2024). Artificial intelligence index report 2024. arXiv. Available online: https://research.ebsco.com/linkprocessor/plink?id=e597febc-e43b-3cbd-b053-294b80bcff87 (accessed on 4 October 2024).
  79. Massala, K. (2023). Navigating bias and ensuring fairness: Equity unveiled in the ai-powered educational landscape. Apprendre et Enseigner Aujourd’hui, 13(1), 37–41. [Google Scholar] [CrossRef]
  80. McMahon, D. D., & Walker, Z. (2019). Leveraging emerging technology to design an inclusive future with Universal Design for Learning. CEPS Journal, 9(3), 75–93. [Google Scholar] [CrossRef]
  81. McNicholl, A., Casey, H., Desmond, D., & Gallagher, P. (2021). The impact of assistive technology use for students with disabilities in higher education: A systematic review. Disability and Rehabilitation: Assistive Technology, 16(2), 130–143. [Google Scholar] [CrossRef]
  82. McNicholl, A., Desmond, D., & Gallagher, P. (2023). Assistive technologies, educational engagement and psychosocial outcomes among students with disabilities in higher education. Disability and Rehabilitation: Assistive Technology, 18(1), 50–58. [Google Scholar] [CrossRef]
  83. Misra, V. P., Mishra, P. K., & Sharma, A. (2023, December 4–6). Artificial intelligence in education—Emerging trends, thematic analysis & application in lifelong learning. 2023 IEEE Asia-Pacific Conference on Computer Science and Data Engineering (CSDE), Nadi, Fiji. [Google Scholar] [CrossRef]
  84. MMD. (2021). Knewton personalizes learning with the power of AI. Digital innovation and transformation. Available online: https://d3.harvard.edu/platform-digit/submission/knewton-personalizes-learning-with-the-power-of-ai/ (accessed on 22 December 2024).
  85. Mohamed, M. Z. b., Hidayat, R., Suhaizi, N. N. b., Sabri, N. b. M., Mahmud, M. K. H. b., & Baharuddin, S. N. b. (2022). Artificial intelligence in mathematics education: A systematic literature review. International Electronic Journal of Mathematics Education, 17(3), 1–11. Available online: http://spot.lib.auburn.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=EJ1357707&site=eds-live&scope=site (accessed on 13 November 2024).
  86. Mosher, M., Dieker, L., & Hines, R. (2024). The past, present, and future use of artificial intelligence in teacher education. Journal of Special Education Preparation, 4(2), 6–17. [Google Scholar] [CrossRef]
  87. Mrayhi, S., Koutheair Khribi, M., & Jemni, M. (2023, July 10–13). Ensuring inclusivity in MOOCs: The importance of UDL and digital accessibility. 2023 IEEE International Conference on Advanced Learning Technologies (ICALT) (pp. 44–46), Orem, UT, USA. [Google Scholar] [CrossRef]
  88. Nash, R., Conner, B., Fellows, K., Clemmensen, B., Gullickson, R., & Goldrup, S. (2022). Barriers in medical education: A scoping review of common themes for medical students with disabilities. Discover Education, 1(1), 4. [Google Scholar] [CrossRef]
  89. National Academies of Sciences, Engineering, and Medicine, Division of Behavioral and Social Sciences and Education, Board on Science Education, Committee on the Call to Action for Science Education, Honey, M., Schweingruber, H., Brenner, K., & Gonring, P. (2021). Call to action for science education: Building opportunity for the future. National Academies Press. [Google Scholar] [CrossRef]
  90. National Center for Education Statistics. (2022). NAEP mathematics: National student group scores and score gaps. U.S. Department of Education.
  91. National Center for Science and Engineering Statistics. (2023). Diversity and STEM: Women, minorities, and persons with disabilities (Special Report NSF 23-315). National Science Foundation. Available online: https://ncses.nsf.gov/wmpd (accessed on 16 July 2024).
  92. National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. NCTM. [Google Scholar]
  93. National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. National Academies Press. [Google Scholar] [CrossRef]
  94. NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press. [Google Scholar] [CrossRef]
  95. Nilsen, P., Svedberg, P., Neher, M., Nair, M., Larsson, I., Petersson, L., & Nygren, J. (2023). A framework to guide implementation of AI in health care: Protocol for a cocreation research project. JMIR Research Protocols, 12, e50216. [Google Scholar] [CrossRef] [PubMed]
  96. Nixon, N., Lin, Y., & Snow, L. (2024). Catalyzing equity in STEM teams: Harnessing generative AI for inclusion and diversity. Policy Insights from the Behavioral and Brain Sciences, 11(1), 85–92. [Google Scholar] [CrossRef] [PubMed]
  97. Office of Special Education and Rehabilitative Services. (2024). 45th annual report to Congress on the implementation of the IDEA. U.S. Department of Education. Available online: https://sites.ed.gov/idea/2023-individuals-with-disabilities-education-act-annual-report-to-congress/ (accessed on 15 July 2024).
  98. Ogunkunle, O., & Qu, Y. (2020, December 16–18). A data mining based optimization of selecting learning material in an intelligent tutoring system for advancing STEM education. 2020 International Conference on Computational Science and Computational Intelligence (CSCI) (pp. 904–909), Las Vegas, NV, USA. [Google Scholar] [CrossRef]
  99. Ouyang, F., Dinh, T. A., & Xu, W. (2023). A systematic review of AI-driven educational assessment in STEM education. Journal for STEM Education Research, 6(3), 408–426. [Google Scholar] [CrossRef]
  100. Paek, S., & Kim, N. (2021). Analysis of worldwide research trends on the impact of artificial intelligence in education. Sustainability, 13(14), 7941. [Google Scholar] [CrossRef]
  101. Paul, S., Rafal, M., & Houtenville, A. (2020). Annual disabilities statistics compendium: 2020 (pp. 1–173). University of New Hampshire, Institute on Disability. Available online: https://disabilitycompendium.org/sites/default/files/user-uploads/Events/2021_release_year/Final%20Accessibility%20Compendium%202020%20PDF_2.1.2020reduced.pdf (accessed on 29 July 2024).
  102. Pierrès, O., Christen, M., Schmitt-Koopmann, F. M., & Darvishy, A. (2024). Could the use of AI in higher education hinder students with disabilities? A scoping review. IEEE Access, 12, 27810–27828. [Google Scholar] [CrossRef]
  103. Rai, H. L., Saluja, D. N., & Pimplapure, D. A. (2023). AI and learning disabilities: Ethical and social considerations in educational technology. Educational Administration Theory and Practices, 29, 726–733. [Google Scholar] [CrossRef]
  104. Reed, M. J., Lewis, T., & Lund-Lucas, E. (2006). Access to post-secondary education and services for students with learning disabilities: Student, alumni and parent perspectives from two Ontario universities. Higher Education Perspectives, 2(2). [Google Scholar]
  105. Rice, M. F., & Dunn, S. (2023). The use of artificial intelligence with students with identified disabilities: A systematic review with critique. Computers in the Schools, 40(4), 370–390. [Google Scholar] [CrossRef]
  106. Rodrigo, C., & Tabuenca, B. (2020). Learning ecologies in online students with disabilities. Comunicar, 28(62), 53. [Google Scholar] [CrossRef]
  107. Roshanaei, M., Olivares, H., & Lopez, R. R. (2023). Harnessing AI to foster equity in education: Opportunities, challenges, and emerging strategies. Journal of Intelligent Learning Systems and Applications, 15(4), 123–143. [Google Scholar] [CrossRef]
  108. Rostan, J., & Stark, B. (2023). Meet LUDIA, your AI-powered UDL partner. The International Educator (TIE Online). Available online: https://www.tieonline.com/article/3575/meet-ludia-your-ai-powered-udl-partner (accessed on 29 July 2024).
  109. Saborío-Taylor, S., & Rojas-Ramírez, F. (2024). Universal Design for Learning and artificial intelligence in the digital era: Fostering inclusion and autonomous learning. International Journal of Professional Development, Learners and Learning, 6(2), ep2408. [Google Scholar] [CrossRef] [PubMed]
  110. Sandoval-Gomez, A., Cosier, M., & Cardinal, D. N. (2020). Inclusion and the right to access to regular classes for students with disabilities. International Electronic Journal of Elementary Education, 12(3), 233–234. [Google Scholar] [CrossRef]
  111. Santos, S. M. A. V., Silva, C. G. d., Carvalho, I. E. d., Castilho, L. P. d., Meroto, M. B. d. N., Tavares, P. R., Pires, R. d. R., & Moniz, S. S. d. O. R. (2024). The art of personalization of education: Artificial intelligence on the stages of special education. Contribuciones a Las Ciencias Sociales, 17(2), e4971. [Google Scholar] [CrossRef]
  112. Sarasola Sánchez-Serrano, J. L., Jaén-Martínez, A., Montenegro-Rueda, M., & Fernández-Cerero, J. (2020). Impact of the information and communication technologies on students with disabilities. A systematic review 2009–2019. Sustainability, 12(20), 8603. [Google Scholar] [CrossRef]
  113. Saxena, A. K. (2024). AI in governance and policy making. International Journal of Science and Research (IJSR), 13(5), 1218–1223. [Google Scholar] [CrossRef]
  114. Scanlan, M. (2022). Reassessing the disability divide: Unequal access as the world is pushed online. Universal Access in the Information Society, 21(3), 725–735. [Google Scholar] [CrossRef]
  115. Shukla, J., Cristiano, J., Anguera, L., Vergés-Llahí, J., & Puig, D. (2016). A comparison of robot interaction with tactile gaming console stimulation in clinical applications. In L. P. Reis, A. P. Moreira, P. U. Lima, L. Montano, & V. Muñoz-Martinez (Eds.), Robot 2015: Second iberian robotics conference (pp. 435–445). Springer International Publishing. [Google Scholar]
  116. Slimi, Z., & Carballido, B. V. (2023). Navigating the ethical challenges of artificial intelligence in higher education: An analysis of seven global AI ethics policies. TEM Journal, 12(2), 590–602. [Google Scholar] [CrossRef]
  117. Song, Y., Weisberg, L. R., Zhang, S., Tian, X., Boyer, K. E., & Israel, M. (2024). A framework for inclusive AI learning design for diverse learners. Computers and Education: Artificial Intelligence, 6, 100212. [Google Scholar] [CrossRef]
  118. Southworth, J., Migliaccio, K., Glover, J., Glover, J., Reed, D., McCarty, C., Brendemuhl, J., & Thomas, A. (2023). Developing a model for AI across the curriculum: Transforming the higher education landscape via innovation in AI literacy. Computers and Education: Artificial Intelligence, 4, 100127. [Google Scholar] [CrossRef]
  119. Stark, B., & Rostan, J. (2023). Ludia, your AI-powered UDL partner. Available online: https://docs.google.com/presentation/d/e/2PACX-1vTVN6FUeeP4C9Sz49m4Ydqwbh_yH-CrtuayLb5VCaSJF_fy-roEe2uVZAGuiDccwy46qefJGRCBnNav/pub?start=false&loop=true&delayms=15000&usp=embed_facebook (accessed on 29 July 2024).
  120. Svedberg, P., Reed, J., Nilsen, P., Barlow, J., Macrae, C., & Nygren, J. (2022). Toward successful implementation of artificial intelligence in health care practice: Protocol for a research program. JMIR Research Protocols, 11(3), e34920. [Google Scholar] [CrossRef] [PubMed]
  121. Tanveer, M., Bhaumik, A., & Hassan, S. (2020). Academic policy regarding sustainability and artificial intelligence (AI). Sustainability, 12(22), 9435. [Google Scholar] [CrossRef]
  122. Therrien, W. J., Benson, S. K., Hughes, C. A., & Morris, J. R. (2017). Explicit instruction and Next Generation Science Standards aligned classrooms: A fit or a split? Learning Disabilities Research & Practice, 32(3), 149–154. [Google Scholar] [CrossRef]
  123. Thurston, L. P., Shuman, C., Middendorf, B. J., & Johnson, C. (2017). Postsecondary stem education for students with disabilities: Lessons learned from a decade of NSF funding. Journal of Postsecondary Education and Disability, 30(1), 49–60. [Google Scholar]
  124. Touretzky, D., Gardner-McCune, C., & Seehorn, D. (2023). Machine learning and the Five Big Ideas in AI. International Journal of Artificial Intelligence in Education, 33(2), 233–266. [Google Scholar] [CrossRef]
  125. U.S. Department of Education. (2019). National assessment of educational progress (NAEP), 2019 science assessment. Institute of Education Sciences, National Center for Education Statistics. Available online: https://www.nationsreportcard.gov/ (accessed on 29 July 2024).
  126. Vasconcelos, M. A. R., & Santos, R. P. d. (2023). Enhancing STEM learning with ChatGPT and Bing Chat as objects-to-think-with: A case study. arXiv. ahead-of-print. [Google Scholar] [CrossRef]
  127. Vasquez, E., III, Basham, J. D., Jimenez, B., & Marino, M. T. (2024). Ethical considerations for educators leveraging artificial intelligence. In Inclusive intelligence: The impact of AI on education for all learners (pp. 92–99). Center for Innovation, Design, and Digital Learning. Available online: https://ciddl.org/wp-content/uploads/2024/04/InclusiveIntelligence_a11y_navadded.pdf (accessed on 29 July 2024).
  128. Weber, A. (2020, March 2–4). Ethics concerns in artificial intelligence use in education. 14th International Technology, Education and Development Conference (pp. 4539–4544), Valencia, Spain. [Google Scholar] [CrossRef]
  129. West, M., & Kregel, J. (1993). Beyond section 504: Satisfaction and empowerment of students with disabilities in higher education. Exceptional Children, 59(5), 456–467. [Google Scholar] [CrossRef]
  130. Wiencierz, C., & Lünich, M. (2022). Trust in open data applications through transparency. New Media and Society, 24(8), 1751–1770. [Google Scholar] [CrossRef]
  131. Xu, W., & Ouyang, F. (2022). The application of AI technologies in STEM education: A systematic review from 2011 to 2021. International Journal of STEM Education, 9, 59. [Google Scholar] [CrossRef]
  132. Yang, W. (2022). Artificial intelligence education for young children: Why, what, and how in curriculum design and implementation. Computers and Education: Artificial Intelligence, 3, 100061. [Google Scholar] [CrossRef]
  133. Yelamarthi, K., Dandu, R., Rao, M., Yanambaka, V. P., & Mahajan, S. (2024). Exploring the potential of generative AI in shaping engineering education: Opportunities and challenges. Journal of Engineering Education Transformations, 37(2), 439–445. [Google Scholar] [CrossRef]
  134. Yousuf, M., & Wahid, A. (2021, November 3–5). The role of artificial intelligence in education: Current trends and future prospects. 2021 International Conference on Information Science and Communications Technologies (ICISCT) (pp. 1–7), Tashkent, Uzbekistan. [Google Scholar] [CrossRef]
  135. Zaugg, T. (2024). EL-education and learning in an inclusive environment [large language model] (Version 7/1/2024) [CustomGPT.ai]. Available online: https://app.customgpt.ai/projects/14678/ask-me-anything?embed=1&shareable_slug=233792c3a0deb7bcfac31baaab211673 (accessed on 4 October 2024).
  136. Zdravkova, K., Krasniqi, V., Dalipi, F., & Ferati, M. (2022). Cutting-edge communication and learning assistive technologies for disabled children: An artificial intelligence perspective. Frontiers in Artificial Intelligence, 5, 970430. [Google Scholar] [CrossRef] [PubMed]
  137. Zhai, X., Chu, X., Chai, C. S., Jong, M. S. Y., Istenic, A., Spector, M., Liu, J.-B., Yuan, J., & Li, Y. (2021). A review of artificial intelligence (AI) in education from 2010 to 2020. Complexity, 2021, 8812542. [Google Scholar] [CrossRef]
  138. Zorec, K., Desmond, D., Boland, T., McNicholl, A., O’Connor, A., Stafford, G., & Gallagher, P. (2024). A whole-campus approach to technology and inclusion of students with disabilities in higher education in Ireland. Disability & Society, 39(5), 1147–1172. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
