Review

Generative AI as a Sociotechnical Challenge: Inclusive Teaching Strategies at a Hispanic-Serving Institution

1 Department of Biology, Natural Science Division, College of Arts and Sciences, University of La Verne, La Verne, CA 91750, USA
2 Department of Chemistry, Natural Science Division, College of Arts and Sciences, University of La Verne, La Verne, CA 91750, USA
3 Department of Mathematics, Natural Science Division, College of Arts and Sciences, University of La Verne, La Verne, CA 91750, USA
4 Department of Physics, Natural Science Division, College of Arts and Sciences, University of La Verne, La Verne, CA 91750, USA
* Author to whom correspondence should be addressed.
Knowledge 2025, 5(3), 18; https://doi.org/10.3390/knowledge5030018
Submission received: 11 July 2025 / Revised: 31 August 2025 / Accepted: 4 September 2025 / Published: 10 September 2025
(This article belongs to the Special Issue Knowledge Management in Learning and Education)

Abstract

Generative artificial intelligence (GenAI) is reshaping science, technology, engineering, and mathematics (STEM) education by offering new strategies to address persistent challenges in equity, access, and instructional capacity—particularly within Hispanic-Serving Institutions (HSIs). This review documents a faculty-led, interdisciplinary initiative at the University of La Verne (ULV), an HSI in Southern California, to explore GenAI’s integration across biology, chemistry, mathematics, and physics. Adopting an exploratory qualitative design, this study synthesizes faculty-authored vignettes with peer-reviewed literature to examine how GenAI is being piloted as a scaffold for inclusive pedagogy. Across disciplines, faculty reported benefits such as simplifying complex content, enhancing multilingual comprehension, and expanding access to early-stage research and technical writing. At the same time, limitations—including factual inaccuracies, algorithmic bias, and student over-reliance—underscore the importance of embedding critical AI literacy and ethical reflection into instruction. The findings highlight equity-driven strategies that position GenAI as a complement to, not a substitute for, disciplinary expertise and culturally responsive pedagogy. By documenting diverse, practice-based applications, this review provides a flexible framework for integrating GenAI ethically and inclusively into undergraduate STEM instruction. The insights extend beyond HSIs, offering actionable pathways for other minority-serving and resource-constrained institutions.

1. Introduction

1.1. Sociotechnical Context of GenAI in STEM Education

Generative artificial intelligence (GenAI) aligns with the transformative goals of Education 4.0, which advocates for creativity, problem-solving, adaptability, and lifelong learning as foundational skills for navigating the Fourth Industrial Revolution [1,2,3]. These technologies—ranging from large language models such as ChatGPT (OpenAI GPT-4o) to multimodal tools such as DALL·E—hold the potential to reshape how knowledge is accessed, processed, and applied across disciplines [4]. However, realizing this potential depends on an institution’s capacity to integrate GenAI in ways that are inclusive, pedagogically sound, and contextually relevant [2]. At institutions such as the University of La Verne (ULV)—a federally designated Hispanic-Serving Institution (HSI) enrolling approximately 7000 students, the majority of whom are first-generation, multilingual, or from historically underrepresented backgrounds—GenAI adoption is shaped by persistent structural inequities [5]. These include limited campus-wide digital infrastructure, underfunded academic and technology support services, and unequal access to reliable internet and personal computing devices among low-income and commuter student populations [6,7,8,9]. These challenges are not unique to the ULV but are emblematic of the broader resource disparities that affect Minority-Serving Institutions (MSIs) across the United States—especially as they attempt to scale technological innovation without the fiscal flexibility of better-resourced institutions [6,10,11].
As such, the integration of GenAI must not be viewed as a neutral enhancement to teaching and learning, but rather as a sociotechnical intervention that both reflects and reshapes existing institutional conditions [12]. Its adoption at HSIs demands intentional alignment with justice-centered goals, including equity in digital access, transparency in use, and student empowerment in design and application [13]. This study presents a faculty-led, interdisciplinary exploration of early GenAI implementation across STEM disciplines. While ChatGPT features prominently in our pilot efforts, the objective is to examine how GenAI technologies can be adapted to promote equity, enhance student learning, and build discipline-specific competencies in biology, chemistry, mathematics, and physics. We document a series of reflective, practice-informed interventions that surface both the pedagogical opportunities and institutional constraints associated with GenAI integration. These cases highlight iterative experimentation, cross-disciplinary collaboration, and critical engagement—providing practical insights for institutions navigating the ethical, structural, and instructional dimensions of AI adoption.
The demographic landscape of U.S. higher education continues to shift, with Hispanic students comprising one of the fastest-growing student populations [14]. HSIs now enroll nearly 20% of all U.S. college students and two-thirds of all Hispanic undergraduates [15,16], positioning them as critical spaces for shaping the ethical and equitable implementation of emerging technologies, including GenAI. As key access points for historically excluded communities, HSIs bear a unique responsibility—and opportunity—to lead in developing inclusive and context-aware models of GenAI use. However, these institutions continue to face systemic barriers to digital transformation, including limited availability of Spanish-language AI tools, heightened privacy risks for undocumented students, and underresourced technology infrastructure [2,17,18,19]. These challenges underscore the need for institution-specific approaches to GenAI adoption—ones that account for both demographic strengths and material constraints [19].
Although personalized GenAI tools have shown promise in supporting multilingual comprehension, automating formative feedback, and fostering technical writing and data analysis skills [20,21], their educational value must be assessed critically and contextually through an equity lens. Without intentional attention to cultural relevance, linguistic diversity, and disparities in digital access, such tools risk replicating the very structural exclusions they are often presumed to alleviate [22]. Emerging research has highlighted how algorithmic bias, English-language dominance, and opaque model design can marginalize underrepresented students, particularly in STEM settings [23,24]. Our project adopts a strengths-based approach—one that centers the lived experiences, multilingual resources, and adaptive capacities of students at HSIs—to explore how GenAI can be adapted, not merely adopted, to foster inclusive excellence and educational innovation. This framing deliberately resists deficit-based narratives that characterize students as underprepared and instead highlights student agency, faculty experimentation, and institution-specific strategies responsive to both structural constraints and local opportunities [25,26]. GenAI becomes not just a technological solution, but a site for pedagogical co-creation, where learners and educators engage in iterative dialogue with emerging tools.
This manuscript documents the efforts of faculty across four STEM departments—biology, chemistry, mathematics, and physics—each of whom engaged GenAI to address a specific pedagogical challenge. Their work reveals how generative AI tools can be mobilized to support critical learning outcomes, while also raising ethical, disciplinary, and access-related considerations that must inform implementation. Our work is organized around five interrelated thematic areas:
  • Equity and access—How can GenAI help level the playing field for multilingual, first-generation, and low-income learners?
  • Culturally relevant pedagogy—In what ways can AI applications reflect students’ lived experiences and local realities?
  • Interdisciplinary and community connections—How can GenAI facilitate research that is grounded in local communities and cross-disciplinary collaboration?
  • Societal and ethical considerations—What ethical frameworks are needed to guide responsible GenAI use in higher education?
  • AI literacy for underrepresented communities—How can students be equipped with the digital competencies necessary to thrive in AI-integrated STEM fields?
Rather than offering a singular or prescriptive model, this manuscript provides a multi-voiced roadmap for navigating GenAI integration in diverse instructional contexts. Through reflective practice, interdisciplinary collaboration, and community-informed pedagogy, we aim to contribute to the development of more equitable, adaptable, and ethically grounded higher education ecosystems—particularly within institutions serving students historically excluded from both STEM fields and technological innovation.
This review contributes significantly to scholarship at the intersection of generative AI and equitable pedagogy within Hispanic-Serving Institutions. Theoretically, it advances understanding of how GenAI functions as a knowledge management technology—a dynamic tool for capturing, codifying, and sharing educational expertise within institutions [27,28]. Societally, it illuminates ways HSIs can center equity by deploying GenAI to support historically underserved students, rendering justice, inclusion, and culturally responsive teaching core to AI deployment [29]. Practically, this work assembles actionable institutional strategies grounded in faculty practice and peer-reviewed literature, including innovations in prompt literacy, inclusive content creation, and scaffolded autonomy—strategies with clear applicability for institutions pursuing inclusive, scalable AI integration [30].
Building on these thematic areas, the manuscript is organized into a series of discipline-specific cases (Sections 2–9). Each case highlights a sociotechnical challenge in teaching and learning and explores how faculty piloted GenAI to address it. Section 2 examines field biology as a living laboratory that democratizes scientific communication; Section 3 considers GenAI as a scaffold for inclusive learning in organic chemistry; Section 4 explores its role as a catalyst for ethical and inclusive engagement in cell biology and immunology; Section 5 analyzes how GenAI can broaden access to instrumental chemistry; Section 6 highlights AI-enhanced strategies in general biology; Section 7 situates GenAI in mathematics education as a vehicle for equity and innovation; Section 8 addresses how GenAI supports bridging scientific literacy gaps in biology education; and Section 9 presents its use as a coding mentor in physics. Together, these cases demonstrate how GenAI can be mobilized across disciplines to address equity, access, and literacy, while raising critical questions about ethics, authorship, and institutional strategy.

1.2. State of the Field: Inclusive AI Pedagogies

The rapid emergence of GenAI has generated an expanding body of scholarship across education, knowledge management, and organizational learning. Early studies emphasize GenAI’s capacity to support learning through scaffolding, personalized feedback, and efficiency gains in academic tasks [30,31]. Within the field of knowledge management, GenAI has been framed as a tool for capturing and codifying institutional expertise, enabling organizations to transform individual practice into collective, shareable assets [32,33,34]. At the same time, concerns about algorithmic bias, English-language dominance, and unequal digital access highlight the sociotechnical nature of AI adoption—not merely as a tool, but as an intervention shaped through inequitable institutional contexts [12,35], particularly in marginalized and multilingual learning environments where structural inequities are reinforced through algorithmic design [36]. This duality—potential for democratization alongside risks of exclusion—has become a central tension in current scholarship [37].
For HSIs and other MSIs, this tension is heightened by structural disparities in infrastructure, funding, and student resources [38,39], as well as limited institutional “servingness” that affects support structures and educational sustainability [40]. Recent scholarship highlights that although HSIs have substantial potential to lead equity-focused innovations, these efforts are frequently constrained by structural disparities in digital readiness, infrastructure, and policy support—particularly evident in issues such as inadequate access to devices and unreliable internet [41]. Yet, the literature also points to the critical role of faculty-driven experimentation and interdisciplinary collaboration in advancing culturally responsive and inclusive approaches to technology integration [42]. Taken together, these studies reveal both opportunities and limitations in deploying GenAI to advance equitable outcomes, underscoring the need for research that is practice-informed and contextually grounded [43].

1.3. Approach and Methodological Framing

This manuscript adopts an exploratory qualitative review design. Rather than presenting new empirical data, it synthesizes peer-reviewed literature and practice-based faculty narratives from four STEM disciplines—biology, chemistry, mathematics, and physics—within a Hispanic-Serving Institution. Faculty-authored vignettes provide reflective accounts of pedagogical strategies, challenges, and institutional contexts. These are analyzed alongside existing scholarship to surface common themes and actionable insights. In doing so, this review positions faculty practice as both a site of inquiry and a form of organizational knowledge production. Ethical guidelines were followed throughout, with no human subject research conducted; all insights are drawn from faculty experiences and published literature. This approach aligns with prior knowledge management scholarship that emphasizes iterative exploration and institutional learning as precursors to systematic organizational change [44,45].

2. Field Biology as a Living Laboratory: Democratizing Scientific Communication with GenAI

2.1. Context and Course Structure: Field Biology at an HSI

At the ULV, field biology courses are designed as Course-Based Undergraduate Research Experiences (CUREs) that blend traditional laboratory methods, outdoor fieldwork, and introductory computational analysis. These courses introduce students to the full arc of scientific investigation—hypothesis development, data collection, analysis, and communication—within the constraints of a single term [46,47]. Many students enrolled in these CUREs are English learners and first-generation college students, who often encounter challenges when engaging with scientific writing, research protocols, and peer-reviewed literature. To address these barriers, GenAI tools such as ChatGPT (OpenAI GPT-4o) have been introduced as cognitive scaffolds that support students in navigating scientific terminology, revising technical writing, and interpreting research methods. Rather than diminishing academic rigor, these tools have helped students organize their thoughts more clearly, refine scientific arguments, and engage more confidently in both oral and written components of the course. Faculty use of GenAI also extends to assignment design and formative feedback, allowing for more iterative, inclusive modes of student engagement in research practice.

2.2. Literacy Scaffolds: Enhancing Comprehension and Writing Skills

A central goal of the ULV’s field biology courses is to develop students’ fluency with primary scientific literature—an essential yet often intimidating genre for early-stage undergraduates, particularly multilingual learners. One of the first major assignments guides students through the structural elements of peer-reviewed articles, including how to identify research questions, methodologies, and conclusions. This approach emphasizes not just content comprehension but literacy as a scientific practice, helping students move from passive reading to analytical engagement with disciplinary discourse. In one example, a student analyzing a study on kangaroo rat adaptations to arid climates initially submitted a summary that lacked precision and failed to clearly identify the study’s conclusions. After using ChatGPT as a peer-editing tool, the student was able to revise the paragraph with improved clarity, scientific tone, and coherence. The AI-generated feedback prompted the student to refine their structure, use more discipline-appropriate vocabulary, and connect conclusions to supporting evidence. This revision process exemplifies how GenAI can serve as a real-time literacy scaffold, offering personalized guidance that helps students move beyond surface-level reading to deeper analytical reasoning.
Scientific writing is foundational to the ULV’s CUREs, where students must articulate methods, results, and interpretations from original investigations. These field-based projects often involve local environmental challenges and collaborations with community partners, making scientific communication not only a classroom requirement but also a civic act. The “living laboratory” framework that structures these courses connects theory with real-world application, culminating in student-led presentations to both scientific and public audiences. To support this writing process, assignments are modeled on journal standards such as the Instructions for Authors for BIOS—the quarterly journal of the Biology Honor Society. Students draft each manuscript section using detailed rubrics, often returning to ChatGPT for iterative revision. In one instance, a student investigating flavonoid variation in Camellia rosea petals used AI-generated suggestions to clarify structure, improve lexical precision, and streamline transitions between sections. Importantly, the student maintained full ownership of the scientific interpretation while using GenAI to refine the form and tone of their writing. These iterative cycles mirror a writing-to-learn approach, where revision is not merely correction but a key step in disciplinary identity formation and academic confidence-building.
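To make this revision workflow concrete, the sketch below shows one way rubric-aligned feedback could be requested programmatically. It is a minimal illustration rather than the course's actual tooling: it assumes the OpenAI Python SDK (openai >= 1.0) with an API key available in the environment, and the rubric criteria and the revision_feedback function are hypothetical placeholders modeled on the journal-style rubrics described above.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical rubric criteria modeled on the course's journal-style writing rubrics.
RUBRIC = [
    "States the research question and hypothesis explicitly",
    "Uses discipline-appropriate vocabulary and a scientific tone",
    "Connects each conclusion to specific supporting evidence",
]

def revision_feedback(draft_paragraph: str) -> str:
    """Request feedback only (no rewriting) so the student retains authorship."""
    prompt = (
        "You are a peer editor for an undergraduate field biology manuscript. "
        "Do NOT rewrite the text. List specific, actionable suggestions keyed to these criteria:\n- "
        + "\n- ".join(RUBRIC)
        + "\n\nStudent draft:\n"
        + draft_paragraph
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

Instructing the model to suggest rather than rewrite mirrors the course's emphasis on students retaining ownership of the scientific interpretation.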

2.3. Scientific Reasoning and Experimental Design: GenAI as a Foil

In field biology courses, where experimental design is both a conceptual and logistical challenge, GenAI tools are increasingly used not to provide answers but to provoke critique. One early-semester assignment asks students to prompt ChatGPT to generate an experimental procedure based on their own hypothesis and then evaluate the feasibility, coherence, and scientific validity of the AI’s response. This exercise frames GenAI as a foil—a contrasting voice against which students must articulate their reasoning and defend their methodological choices. For instance, one student group investigating the relationship between leaf litter density, canopy openness, and grass density noticed that ChatGPT produced a procedural suggestion and conclusion that contradicted their initial expectations. Instead of dismissing the output, the students analyzed it in light of their field observations and ecological context. As one student observed, “ChatGPT can’t ‘see’ what’s going on at the study site like we can.” This insight opened a broader class discussion about the limits of decontextualized data generation, the role of situated knowledge in ecology, and the importance of aligning methods with environmental variability.
These assignments prompt students to engage in scientific metacognition: articulating the why behind each design decision and examining the logical structure of their research plans. By interrogating GenAI’s blind spots—its inability to account for site-specific conditions, sampling constraints, or ecosystem dynamics—students strengthen their experimental literacy and deepen their understanding of the scientific process. Rather than treating GenAI as a shortcut, students learn to use it as a prompt for argumentation, critical reflection, and methodological refinement—core competencies in both fieldwork and research more broadly.

2.4. Broader Impacts: Equity, Publishing, and Participation in the Scientific Community

Integrating GenAI tools into field biology courses addresses persistent barriers in STEM education—particularly those related to language access, scientific writing, and experimental design. Early implementation suggests that GenAI can serve as an adaptive scaffold, improving students’ command of scientific terminology and supporting the development of academic writing skills, especially for second-language learners [48]. These tools also support diverse cognitive and linguistic needs, increasing participation and persistence among students historically underrepresented in science [49,50]. When embedded into culturally relevant, hands-on research assignments, GenAI tools contribute to deeper student engagement and reinforce STEM identity formation. As Barajas-Salazar et al. [51] note, experiential learning environments—particularly those tied to local environmental and community issues—enhance student self-efficacy and sense of belonging in scientific fields. These outcomes are especially critical in contexts where students are preparing to tackle complex, real-world problems such as environmental sustainability [52].
In practice, AI-assisted writing tools function as responsive feedback systems. Students use them to refine technical vocabulary, clarify ambiguous phrasing, and strengthen the logical flow of their arguments. In this way, GenAI mirrors the role of a mentor or tutor: guiding revision while preserving authorship and intellectual control [53,54]. Rather than outsourcing the writing process, students learn to treat GenAI as a cognitive partner—helpful in translation, organization, and clarity, but dependent on the student’s critical input. Importantly, these supports have already contributed to student-led research projects that culminated in peer-reviewed publications [55,56,57]. Students involved in these projects used GenAI tools not only to revise their writing, but also to decode journal formatting guidelines, clarify expectations, and improve abstract structure. These experiences demonstrate that GenAI can help demystify scientific publishing for emerging scholars, expanding access to authentic research participation and preparing students to contribute meaningfully to professional scientific discourse [58].

2.5. Future Applications: Scaling Equity Through AI and Authentic Research

As GenAI tools become more widely accessible, future integration in field biology should prioritize collaborative, multilingual science communication and authentic research preparation that empowers diverse learners. At HSIs such as the ULV, such integration holds particular promise for expanding access to high-impact practices traditionally limited by time, resource, and mentoring constraints. Faculty are exploring how GenAI can assist students in translating raw field observations into formalized scientific writing—particularly for learners unfamiliar with the conventions of peer-reviewed publication. Early experimentation has shown that GenAI can help students navigate complex publishing tasks, such as structuring abstracts, decoding journal submission guidelines, and refining figures and captions with greater clarity. These supports are especially relevant for students who may be engaging in manuscript preparation or conference poster design for the first time.
Emerging efforts also examine how GenAI might be embedded into mobile field data collection apps or cloud-based platforms used for ecological monitoring. In these applications, GenAI could provide just-in-time feedback on data descriptions, procedural language, and early-stage interpretation—offering immediate scaffolding to students in the field. Such tools may help reduce cognitive load during fast-paced data collection while reinforcing precision and reflection in field journaling and metadata documentation. In parallel, faculty are investigating how students interpret the role of GenAI in authorship and attribution. Some students describe AI as a helpful language editor, while some faculty and staff raise concerns about over-reliance or a perceived loss of voice in their scientific writing [54,58,59]. These divergent perspectives underscore the importance of developing best practices that promote transparency, equitable authorship, and scientific integrity in undergraduate research [60].
GenAI presents an opportunity to democratize access to authentic undergraduate research by scaffolding critical components of the research process—particularly for those who have been historically excluded from scientific publication or laboratory-intensive training. Realizing this potential, however, will require thoughtful design, institutional support, and clear guidelines that uphold ethical and disciplinary norms while promoting inclusion and innovation in field-based science education.

3. Generative AI as a Scaffold for Inclusive Learning in Organic Chemistry

Organic chemistry is frequently described as a “gatekeeper” course due to its conceptual density, abstract reasoning, and specialized terminology—all of which contribute to historically high attrition rates in STEM degree pathways [61,62]. These challenges are particularly acute at institutions such as the ULV, where a majority of students are first-generation college-goers, multilingual learners, or historically underrepresented in science. In this context, the organic chemistry laboratory becomes not only a pedagogical hurdle but also a site of opportunity to reimagine inclusion and retention through instructional innovation [61]. The faculty at the ULV have begun piloting GenAI tools—particularly large language models such as ChatGPT—as instructional scaffolds to reduce cognitive load, enhance comprehension of procedural language, and support student success in laboratory-based learning. These interventions aim to address multiple access points: clarifying dense instructional protocols, demystifying mechanistic reasoning, and offering personalized feedback on writing and data analysis. When used critically and transparently, GenAI can supplement faculty guidance while empowering students to develop the language, logic, and literacy of organic chemistry. As such, the integration of GenAI in this context is framed not merely as a convenience or technological enhancement, but as a means of pedagogical equity—intended to mitigate structural and linguistic barriers that disproportionately impact students from underserved backgrounds [21,63]. Ongoing faculty experimentation is guided by the principle that inclusive teaching requires adapting tools to student needs—not the other way around.

3.1. Demystifying Laboratory Language and Procedures

Many core organic chemistry labs—such as caffeine extraction from black tea—involve complex multi-step protocols that assume fluency in technical terminology. This presents a linguistic barrier for students encountering scientific English for the first time. In one example, students were asked to follow this step:
  • “Once at room temperature, transfer the aqueous extract solution into your separatory funnel. Make sure the stopcock is in the closed position. Wash the solution with 10 mL of methylene chloride three times…”
Students found this language daunting. When the instructions were rephrased using ChatGPT, they became more accessible:
  • “Transfer the cooled aqueous extract into a separatory funnel. Ensure the stopcock is closed. Add 10 mL of methylene chloride, shake, vent, and drain the bottom layer into a flask. Repeat this process three times.”
Simplifying the syntax improved comprehension and execution, particularly for multilingual students. This use of GenAI as a real-time translator of scientific language reflects its potential to act as a scaffold—not a crutch—for procedural understanding [48,64].

3.2. Cultivating Critical Visual Literacy with AI

While GenAI supports linguistic comprehension, its visual output remains problematic. When asked to generate a molecular structure of caffeine, ChatGPT produced an inaccurate skeletal diagram (Figure 1a), omitting essential bond angles and atomic connectivity. In contrast, the correct version (Figure 1b) illustrates precise molecular geometry.
These limitations underscore the need for human oversight in AI-assisted instruction. Faculty now integrate critical evaluation tasks into assignments, asking students to verify AI outputs against peer-reviewed models. This helps build visual literacy and skepticism—essential skills in organic chemistry and broader scientific reasoning.

3.3. Toward a Culturally Responsive and Skeptical Pedagogy

At minority-serving institutions, where structural inequities intersect with pedagogical challenges, AI must be leveraged intentionally. Rewriting dense procedures or helping students navigate unfamiliar concepts must be accompanied by practices that encourage evaluation, revision, and scientific inquiry. The ULV’s chemistry faculty emphasize that GenAI is most effective when used to cultivate—not shortcut—students’ engagement with complexity.
To this end, faculty model verification habits, frame AI as a collaborator rather than an authority, and use AI-generated feedback as a springboard for metacognition. Students are encouraged to refine AI outputs, reflect on errors, and justify revisions—skills that extend beyond the chemistry lab.

3.4. Next Steps: AI-Enhanced Chemistry Education for Equity

Building on the challenges and opportunities described above, the ULV faculty are exploring next-generation applications of GenAI that could further strengthen equity in chemistry education. One promising direction is the development of accurate visualization tools, as future AI models may support molecular rendering and enable dynamic interaction with structures and reaction mechanisms. Such tools could be particularly valuable for multilingual or returning STEM students who benefit from visual scaffolds.
Another pathway involves layered, adaptive explanations, where AI could provide tiered accounts of complex phenomena—such as reaction kinetics or resonance—allowing students to build knowledge progressively at their own pace. Similarly, simulation-based practice using virtual instruments and AI-assisted troubleshooting could reinforce skill-building for students preparing for lab practicals, especially those with limited access due to commuting or caregiving responsibilities.
Finally, metacognitive AI assignments that ask students to critique AI-generated code, quizzes, or analysis may serve as platforms for building both scientific and digital literacy. Future work will examine how AI feedback influences student reasoning in data interpretation and procedural writing, particularly in complex assignments such as spectroscopy or multi-step synthesis. Together, these innovations underscore a larger goal: making chemistry education more inclusive, responsive, and equitable for the next generation of scientists [65].

4. GenAI as a Catalyst for Inclusive, Ethical Learning in Cell Biology and Immunology

In cell biology and immunology at the ULV, GenAI is being explored not simply as a content tool, but as a lever for equity, student agency, and ethical scientific engagement. These courses serve a diverse student population, many of whom face structural barriers to success in STEM. GenAI was piloted to foster inclusive pedagogical practices, amplify underrepresented voices, and cultivate responsible digital literacy.

4.1. Centering Student Voice Through AI-Supported Feedback Loops

Traditional course evaluations often arrive too late to meaningfully impact students’ learning experiences. To address this, instructors implemented mid-semester anonymous feedback via Google Forms and used ChatGPT to synthesize open-ended responses. This process revealed key patterns:
  • Start: practice worksheets, guided simulations, study guides;
  • Stop: fast lecture pacing, overuse of group work;
  • Continue: visual aids, hands-on labs, supportive classroom climate.
Summarizing responses into visualizations and sharing them in class created transparency and invited dialogue. This intervention empowered students—especially those hesitant to speak out—to influence course design in real time. By integrating AI into the feedback process, faculty scaled reflective pedagogy in an equity-minded way.
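A minimal sketch of this synthesis step is shown below. It assumes the anonymous Google Forms responses have been exported to a CSV file and that the OpenAI Python SDK is available; the file name, column name, and summarize_feedback function are illustrative stand-ins rather than the instructors' actual pipeline.

import csv
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize_feedback(csv_path: str, column: str = "Start / Stop / Continue feedback") -> str:
    """Cluster open-ended mid-semester comments into Start, Stop, and Continue themes."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        responses = [row[column].strip() for row in csv.DictReader(f) if row.get(column)]
    prompt = (
        "Cluster the following anonymous mid-semester comments into three lists labeled "
        "Start, Stop, and Continue, and report how many comments support each theme. "
        "Do not quote any comment verbatim.\n\n" + "\n".join(responses)
    )
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content

# Example with a hypothetical export: print(summarize_feedback("midsemester_feedback.csv"))

Asking the model not to quote comments verbatim is one simple guardrail for preserving anonymity before themes are shared back with the class.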

4.2. Fostering Peer Wisdom and Intergenerational Guidance

To cultivate a community of practice and strengthen culturally responsive learning, students were asked to write letters of advice to future cohorts. ChatGPT helped identify recurring themes, which were then shared with the class. Students emphasized the following:
  • Time management and consistency;
  • Active engagement with labs and lectures;
  • Use of visual aids and peer discussion;
  • Office hours and help-seeking;
  • Maintaining a growth mindset.
This activity validated diverse ways of learning and created a legacy of peer guidance. It also reframed academic success as a communal, not individual, pursuit—particularly important in serving first-generation and underrepresented learners.

4.3. Cultivating AI Literacy and Ethical Judgment

Recognizing AI’s growing role in science and education, students were assigned a structured “AI Error Correction” task. They received a ChatGPT-generated biology explanation embedded with subtle errors and were asked to identify and revise inaccuracies using peer-reviewed sources.
  • Example: students analyzed an AI-generated summary of T-cell activation and flagged incorrect signaling pathways or misattributed molecular roles.
This assignment positioned students as evaluators rather than consumers of AI. It developed digital discernment, reinforced scientific rigor, and modeled transparency about tool use. Students learned to balance the utility of GenAI with critical scrutiny—skills essential for ethical research and data-informed decision-making.

4.4. From Classroom to Institutional Impact: Building Inclusive AI Practices

The multi-pronged integration of GenAI in Cell Biology and Immunology responded to broader institutional challenges around equity, student engagement, and responsible technology use.
Key outcomes include the following:
  • Equity and voice: mid-semester feedback elevated marginalized perspectives;
  • Culturally responsive pedagogy: peer reflections fostered shared learning values;
  • Ethical use of AI: students gained tools for transparent and critical AI engagement;
  • AI literacy: structured critique tasks built digital fluency and scientific judgment.
Together, these efforts positioned GenAI as an ally in inclusive STEM instruction—one that supports both identity development and disciplinary expertise.

4.5. Future Pathways: Co-Creating Ethical AI Use in STEM

Building on these practices, faculty are exploring how students can co-author ethical frameworks for AI use in STEM. Rather than relying on static policies, this participatory model would allow students to shape norms for citation, authorship, and transparency.
Emerging directions include the following:
  • AI in experimental design: students using GenAI to ideate hypotheses and vet feasibility against published literature;
  • Distinguishing hallucinated vs. evidence-based outputs: promoting epistemic awareness and research ethics;
  • AI across departments: collaboratively developing AI literacy modules on tool evaluation, data verification, and responsible communication.
In parallel, faculty are exploring student integration into AI-supported research, including co-authoring, data annotation, and science communication roles. These pathways would expand access to undergraduate research while preparing students for ethical leadership in a rapidly evolving digital landscape [66].

5. Broadening Access to Instrumental Chemistry Through Generative AI

Laboratory instrumentation plays a central role in undergraduate chemistry education, enabling students to apply theoretical concepts to real-world analytical problems [67]. Across general, organic, analytical, instrumental, and physical chemistry courses at the ULV, students engage with equipment such as spectrophotometers, chromatographs, and titration systems—often for the first time. These hands-on experiences are foundational to STEM skill development and are frequently cited as critical for retention and professional identity formation in the chemical sciences [68]. However, access to instrumentation remains uneven. First-generation and commuter students, in particular, often face barriers that limit their ability to fully benefit from lab-intensive learning environments—ranging from transportation and scheduling conflicts to caregiving and work obligations [69,70]. While all students pay the same tuition, not all have equal opportunities to participate in extended lab sessions or seek informal mentorship during off-hours. These disparities highlight a structural access gap common at commuter-serving institutions such as the ULV.
GenAI offers a potential bridge. By providing on-demand support outside of formal lab hours, GenAI tools can help students interpret technical manuals, review procedural steps, troubleshoot errors, and build fluency with scientific instrumentation. In this sense, GenAI acts as a scalable supplement to hands-on instruction—reinforcing core concepts and offering just-in-time clarification when students are studying independently or working asynchronously. Early explorations suggest that such tools can help demystify the hidden curriculum of instrumentation, particularly for students without prior exposure to lab-intensive courses [71,72]. As faculty at the ULV begin integrating GenAI into instrumental chemistry instruction, the aim is not to replace experiential learning, but to scaffold it—extending the reach of mentorship and enhancing equity in access to critical STEM competencies. This section examines how AI is being used to supplement instruction in instrumentation-heavy courses and explores its role in reducing systemic barriers to participation and skill acquisition.

5.1. GenAI as a Scalable Instructional Supplement

Virtual laboratories have long supported student engagement and conceptual understanding in STEM fields [73]. Building on these foundations, generative AI tools now offer promising new pathways to expand access to instrumentation literacy.
At the ULV, chemistry faculty are piloting tools such as ChatGPT to help students:
  • Interpret technical manuals that are outdated, missing, or overly complex;
  • Reinforce core principles across instruments that vary by manufacturer;
  • Troubleshoot common errors using general operational logic;
  • Clarify terminology and schematics through conversational learning.
These applications are particularly useful when students encounter instrument documentation that assumes prior fluency in procedural language or equipment-specific nuances. In such cases, AI can break down dense content into simpler walkthroughs or generate scaffolding materials—provided students are also taught to assess these outputs for accuracy.
For example, when a student used ChatGPT (free version, December 2024) to generate a labeled diagram of a high-performance liquid chromatography (HPLC) system, the result looked polished but was structurally incorrect (Figure 2). The AI misrepresented key components, underscoring the need for faculty moderation and student skepticism when using GenAI in instrument-heavy disciplines.
These limitations do not negate the tool’s value; rather, they emphasize that GenAI should serve as a supplement, not a substitute, for instructor expertise and hands-on training. When embedded within reflective and evidence-based pedagogy, AI tools can offer just-in-time support, simulate troubleshooting pathways, and strengthen students’ conceptual understanding of complex instrumentation workflows [48,71].
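As one illustration of the troubleshooting use described above, the sketch below builds a deliberately constrained prompt that asks for general operational logic and explicit uncertainty flags. The troubleshooting_prompt function, the instrument, and the symptom are assumptions for illustration only; any AI response would still be checked against the manufacturer's manual and the instructor.

def troubleshooting_prompt(instrument: str, symptom: str) -> str:
    """Build a constrained troubleshooting prompt for a GenAI assistant (illustrative only)."""
    return (
        f"I am an undergraduate chemistry student working with a {instrument} and observing: {symptom}. "
        "Using general operational logic only, list (1) the most common causes, "
        "(2) safe checks I can perform without opening the instrument, and "
        "(3) which steps require the manufacturer's manual or instructor sign-off. "
        "Flag anything you are uncertain about instead of guessing."
    )

print(troubleshooting_prompt(
    "high-performance liquid chromatography (HPLC) system",
    "a drifting baseline and broadened peaks",
))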

5.2. Future Opportunities in Instrumentation Education

Looking ahead, chemistry faculty at the ULV are piloting a range of innovations to deepen GenAI’s role in supporting inclusive, skill-based instrumentation training—particularly for students who face structural barriers to extended lab access or prior exposure to research environments.
  • AI-supported digital twins: Virtual replicas of laboratory instruments, known as digital twins, can provide students with interactive simulations of common techniques before engaging with real hardware. By combining these simulations with AI-generated walkthroughs or augmented reality (AR) overlays, students could asynchronously explore instrument components, troubleshoot virtual malfunctions, and complete pre-lab modules that enhance readiness for in-person assessments. This approach builds familiarity while reducing anxiety around expensive or delicate instrumentation [73].
  • Troubleshooting as inquiry: Rather than treating technical malfunctions as interruptions, students could use GenAI to propose general troubleshooting steps, test these solutions during lab, and reflect on observed discrepancies. This approach frames troubleshooting as a mode of inquiry, helping students develop diagnostic reasoning, scientific skepticism, and iterative thinking—key competencies for careers in analytical chemistry, environmental monitoring, and biomedical research [74].
  • Adaptive practice via AI quiz banks: AI-generated, scenario-based quiz banks can provide customized reinforcement of procedural knowledge and lab safety protocols. These tools allow students to identify and address knowledge gaps outside of formal lab hours, enabling more efficient use of time during hands-on sessions. When used equitably, adaptive assessments can reduce cognitive overload for first-generation and commuter students, creating a more level learning environment [75].
  • Research on AI impact: Empirical studies are needed to evaluate how AI-integrated lab training impacts students’ confidence, technical fluency, and persistence in STEM. Metrics such as skill transfer, reduced error rates, and increased comfort with instrumentation could help validate these interventions. Importantly, research should disaggregate outcomes by demographic groups to ensure that GenAI advances equity rather than reinforcing disparities [76].
By continuing to explore how GenAI can help close access gaps in chemistry education, the ULV faculty are working toward scalable, student-centered strategies that equip diverse learners to thrive in instrument-intensive fields. These efforts align with broader goals for inclusive STEM education and offer actionable pathways for improving workforce readiness across research, industry, and healthcare—particularly as higher education confronts rising expectations to adopt AI in ways that enhance student success and institutional equity [77].

6. AI-Enhanced Teaching in General Biology

General biology serves as a foundational gateway into STEM for students at HSIs, such as the ULV. This course spans diverse and conceptually demanding content areas—including molecular biology, ecology, and evolution—that often appear fragmented or abstract to first-year students, especially those without prior exposure to college-level science. For multilingual learners, first-generation college students, and others navigating unequal access to textbooks, tutoring, or study groups, this course can present significant barriers to persistence in STEM fields [76,78]. In response, the ULV faculty are piloting GenAI tools to personalize instruction, scaffold complex ideas, and expand equitable access to formative feedback. These interventions aim to promote self-directed learning, reduce reliance on expensive external platforms, and create alternate pathways for conceptual engagement. ChatGPT and similar tools have been used to generate tailored review materials, simulate concept quizzes, and offer multilingual explanations—particularly beneficial for students developing academic English fluency [20,21].
At the same time, the instructional team remains attentive to the ethical, cognitive, and pedagogical implications of GenAI integration. Students are introduced to critical AI literacy skills early in the course, including how to evaluate generated content, identify factual inaccuracies, and reflect on their own use practices. This dual emphasis—on access and critical engagement—positions GenAI not as a substitute for faculty interaction, but as a scalable support structure that complements culturally responsive instruction and evidence-based learning design [54,71].

6.1. Scaffolding Learning with Generative AI

To enhance accessibility, the ULV faculty piloted the use of ChatGPT to generate instructional materials aligned with textbook chapters, including the following:
  • Practice quizzes;
  • Guided activities;
  • Conceptual review questions.
These AI-generated resources reduced faculty preparation time while offering students varied ways to engage with content. Many students began using ChatGPT independently to:
  • Create study guides;
  • Generate self-assessment questions;
  • Replace expensive flashcard apps or textbook companion tools.
These strategies allowed students to pursue alternative, cost-free learning pathways, fostering greater autonomy and mitigating financial barriers—key priorities at HSIs. However, faculty also noted that AI-supported learning required deliberate oversight. While ChatGPT accelerated content creation and offered accessible language scaffolds, it struggled in areas such as scientific image generation and rubric-based grading—reminding instructors of the need to use GenAI tools judiciously and supplement them with traditional mentorship.

6.2. Improving Metacognition Through AI-Supported Feedback

To address gaps in student self-awareness around learning progress, faculty introduced ungraded “blitz quizzes” at the start of each lecture. These five-minute assessments helped students identify conceptual weaknesses and provided real-time feedback without the pressure of grades.
  • ChatGPT was used to generate up to 60 customized quiz questions per week.
  • Prompts were refined by instructors to ensure clarity and alignment with textbook language.
  • The iterative use of ChatGPT for testing assignments also allowed faculty to preempt confusion by simulating common student misunderstandings.
This integration enhanced student engagement and helped instructors tailor lessons based on performance trends. Yet, alongside these benefits, faculty encountered new ethical challenges: several students began submitting AI-generated work without disclosure, raising concerns about academic integrity and underscoring the need for formal AI literacy instruction.
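A minimal sketch of the weekly quiz-drafting step is given below. The draft_blitz_quiz function, the chapter-summary input, and the question count are illustrative assumptions, it presumes the OpenAI Python SDK, and every generated item is reviewed and edited by the instructor before it reaches students.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def draft_blitz_quiz(chapter_summary: str, n_questions: int = 10) -> str:
    """Draft multiple-choice items for a five-minute, ungraded blitz quiz (instructor-reviewed)."""
    prompt = (
        f"Write {n_questions} multiple-choice questions for a five-minute, ungraded quiz on the "
        "chapter summarized below. Match the textbook's terminology, give one correct answer and "
        "three plausible distractors per question, and mark the correct answer.\n\n"
        + chapter_summary
    )
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content  # reviewed by the instructor before class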

6.3. Recognizing the Boundaries of GenAI in Biology Education

Despite its promise, ChatGPT (GPT-4o and GPT-o1) revealed several critical limitations when applied to introductory biology:
  • Inaccurate visuals: Repeated attempts to generate scientifically accurate diagrams—e.g., insect morphology or microbial diversity graphs—yielded oversimplified or misleading images. This remains a barrier in visually intensive disciplines such as biology.
  • Grading inconsistency: When evaluated as a grading assistant for student reflections, ChatGPT matched instructor assessments only about 30% of the time. It frequently overvalued weaker responses and penalized stronger ones, reinforcing the importance of human evaluation for conceptual depth (a sketch of how such an agreement check can be computed appears after this list).
  • Confident misinformation: One of the more concerning patterns was the AI’s tendency to present incorrect information with unwarranted confidence. Students unfamiliar with the material often lacked the background to question these inaccuracies. Though performance improved with highly specific prompts, novice users may not know how to craft such queries—making AI literacy a central concern.
Taken together, these limitations suggest that while GenAI can support learning, it cannot replace expert instruction, scientific reasoning, or the developmental benefits of hands-on and inquiry-based education.
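The sketch below shows how an agreement check of this kind might be computed. The score lists are placeholder values chosen only to illustrate the method, not the course's actual data; scikit-learn's cohen_kappa_score adds a chance-corrected comparison alongside simple exact agreement.

from sklearn.metrics import cohen_kappa_score

# Placeholder rubric scores (1-5) for ten reflections; illustrative only.
instructor_scores = [4, 3, 5, 2, 4, 3, 5, 4, 2, 3]
ai_suggested_scores = [5, 4, 5, 4, 3, 3, 5, 5, 3, 4]

exact_agreement = sum(a == b for a, b in zip(instructor_scores, ai_suggested_scores)) / len(instructor_scores)
kappa = cohen_kappa_score(instructor_scores, ai_suggested_scores)  # chance-corrected agreement

print(f"Exact agreement: {exact_agreement:.0%}")  # 30% with these placeholder values
print(f"Cohen's kappa: {kappa:.2f}")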

6.4. Future Directions: Equity-Driven AI Integration in STEM

Looking ahead, the ULV faculty are exploring new avenues for integrating GenAI into general biology while safeguarding core values of equity, rigor, and student agency. These next steps are guided by ongoing reflection on both the affordances and risks of AI-enhanced pedagogy, particularly within HSI contexts. Emerging priorities include the following:
  • Enhanced visual accuracy: Faculty are collaborating with instructional designers and AI developers to improve the scientific validity of AI-generated visuals—especially in content areas such as microbiology, physiology, and systems ecology. Inaccurate visuals can mislead novice learners and contribute to conceptual misunderstandings, making this a critical area for development [79].
  • Adaptive feedback systems: GenAI is being piloted to develop formative assessments that adjust in difficulty based on student performance. These AI-powered quizzes aim to personalize feedback and support differentiated instruction—a pedagogical strategy shown to reduce achievement gaps in large-enrollment STEM courses [76,78].
  • Student-led AI literacy modules: The ULV is experimenting with a peer-led approach to digital ethics. Student ambassadors are being trained to co-facilitate workshops on ethical AI use, citation practices, bias detection, and academic integrity. This participatory model empowers students as co-creators of learning culture while reinforcing community norms around transparency and responsible technology use [54].
  • Open-access review materials: Faculty are leveraging GenAI to co-create Creative Commons–licensed biology study guides, glossaries, and flashcard decks. These resources are intended to support students who lack access to commercial platforms or tutoring services, while also enabling knowledge-sharing across institutional boundaries—consistent with open education principles [80].
To assess the effectiveness and ethical dimensions of these practices, ongoing research is needed. Future inquiry should explore how GenAI integration affects student outcomes, such as concept mastery, persistence in STEM, and the development of metacognitive skills. Special attention must be given to disaggregated impacts across demographic groups, including first-generation college students, multilingual learners, and students with limited digital access [81].

7. Advancing Equity and Innovation in Mathematics Education with Generative AI

At the ULV, mathematics instruction serves as a critical foundation for developing quantitative reasoning, data fluency, and analytical thinking across STEM and non-STEM disciplines. For many students—particularly first-generation, multilingual, and underrepresented learners—gateway math courses can serve as both an academic hurdle and an opportunity for growth. Faculty teaching courses such as The Art of Guestimation have begun leveraging GenAI to foster engagement through estimation problems and context-rich scenarios that emphasize creativity, civic relevance, and interdisciplinary thinking. Rather than treating GenAI as a computational shortcut, instructors use it to support exploratory thinking, provide multilingual scaffolds, and generate culturally responsive prompts that invite students to connect mathematical reasoning to their lived environments. This aligns with a broader equity-centered approach to STEM instruction, in which student identity, background knowledge, and local context inform curricular design [82,83].
By incorporating GenAI tools into problem design, adaptive support, and collaborative critique, the ULV faculty are exploring how mathematics education can be both rigorous and inclusive. In particular, the ability to rapidly generate estimative scenarios—from calculating neighborhood water usage to analyzing local energy trends—has helped make mathematical abstraction more concrete, actionable, and personally relevant. At the same time, the integration of GenAI has raised questions about accuracy, overreliance, and ethical use—requiring that students and faculty alike approach these tools with curiosity, caution, and a critical eye. The following subsections examine how GenAI is being piloted in mathematics instruction at the ULV, highlighting both the opportunities and tensions that arise in efforts to scale culturally responsive, innovation-driven teaching.

7.1. Expanding Conceptual Relevance Through Contextualization

For many students—particularly multilingual, first-generation, and underrepresented learners—mathematics can appear abstract or disconnected from lived experience. GenAI offers a means to bridge this gap by generating locally grounded, socially relevant prompts. For example, AI tools were used to create estimation problems rooted in familiar settings:
  • Calculating water savings from drought-tolerant landscaping;
  • Modeling household energy use in Southern California;
  • Analyzing grocery budgets or solar panel adoption at the neighborhood scale.
These personalized prompts enabled students to approach mathematics not simply as a discipline, but as a tool for understanding and improving their communities. Yet, the instructional benefits of GenAI are accompanied by pedagogical risks. Oversimplification of logic, shallow explanations, or culturally narrow scenarios can reinforce surface learning or exclusion. Faculty at the ULV emphasized the importance of pairing AI use with critical AI literacy, encouraging students to interrogate how problems are framed and whose realities they reflect.

7.2. Practical Integration: Supporting Instruction and Multilingual Access

In The Art of Guestimation, GenAI has been used for the following tasks:
  • Scaffold complex estimation problems;
  • Translate instructions or terminology for English learners;
  • Rapidly generate instructional materials tailored to local contexts.
Students reported greater confidence when engaging with problems that reflected familiar decisions—such as planning grocery trips, comparing energy sources, or estimating traffic-related delays. GenAI’s multilingual capabilities also supported greater access to instruction, reducing linguistic barriers for non-native English speakers. However, effective implementation required guardrails. Faculty observed that some students leaned on GenAI to produce full answers without engaging in reasoning, risking passive learning. This led to the development of assignments that explicitly require students to critique, annotate, or revise AI outputs, reinforcing active learning and metacognition.

7.3. The Art of Guestimation as a Model for Inclusive AI Use

The course The Art of Guestimation exemplifies how GenAI can be thoughtfully embedded within inclusive, interdisciplinary pedagogy. Rather than simply solving problems, students are asked to perform the following tasks:
  • Articulate assumptions;
  • Justify units and scaling factors;
  • Assess the credibility and logic of AI-generated estimates.
For instance, when estimating population growth or modeling electric vehicle energy consumption, students evaluate the realism of AI suggestions and offer improvements. These exercises emphasize data literacy, civic engagement, and scientific reasoning—skills increasingly essential in an AI-mediated world.
The course also introduces students to core AI ethics topics, including the following:
  • Algorithmic bias;
  • Data transparency;
  • The social implications of automation.
In doing so, it not only cultivates quantitative fluency, but also supports students as critical, responsible users of emerging technologies.

7.4. Addressing Equity and Infrastructure Challenges

To ensure AI innovations benefit all students, the ULV mathematics faculty have prioritized institutional strategies that address systemic barriers:
  • Training and support: Workshops for students and faculty on responsible GenAI use help build confidence and fluency.
  • AI-critical assignments: Tasks that ask students to critique, revise, or improve AI-generated problems foster engagement and discourage overreliance.
  • Inclusive content development: Ongoing collaboration with developers ensures AI-generated content reflects diverse cultural contexts and lived experiences.
  • Digital equity initiatives: Access to devices and reliable internet is essential for inclusive participation. Expanding infrastructure through lending programs and campus Wi-Fi initiatives supports equity in GenAI-enhanced learning.
These strategies align with the ULV’s mission to reduce opportunity gaps and cultivate inclusive academic excellence.

7.5. Illustrative Problems: Real-World Estimation Tasks

The following GenAI-supported examples demonstrate how context-rich prompts can foster deeper engagement:
  • Water conservation: Estimate annual water savings from replacing lawns with drought-resistant landscaping. Scale to 1000 households.
  • Urban transit: Compare energy use between electric and gasoline vehicles during peak traffic across Los Angeles.
  • Grocery budgeting: Estimate food costs for a family of four and model savings via menu changes or store substitutions.
  • Solar energy: Calculate energy output and cost savings for a neighborhood installing rooftop solar panels.
  • Population growth: Model population growth in La Verne over 10 years using different projected growth rates.
Framing estimation in local, real-life terms promotes curiosity and reinforces the utility of mathematics in students’ personal and civic lives. A worked sketch of the population-growth prompt follows below.
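As a worked example of the population-growth prompt, the sketch below applies the compound-growth relation P(t) = P0(1 + r)^t across several growth rates. The baseline population and rates are illustrative assumptions, not official demographic figures for La Verne, and students would replace them with sourced values.

```python
# Compound-growth projection used to sanity-check AI-generated population
# estimates. Baseline and growth rates are illustrative assumptions only.

baseline_population = 32_000          # assumed starting population
years = 10
growth_rates = [0.002, 0.005, 0.010]  # 0.2%, 0.5%, and 1.0% per year

for r in growth_rates:
    projected = baseline_population * (1 + r) ** years
    print(f"Growth rate {r:.1%}: projected population after "
          f"{years} years = {projected:,.0f}")
```

Running the same calculation by hand, in a spreadsheet, and with GenAI assistance gives students several answers to reconcile, turning the estimate itself into an exercise in verification.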

7.6. Future Directions for AI in Math Education

Looking forward, the ULV mathematics faculty are developing new pathways for integrating GenAI tools in ways that deepen student learning, support inclusive pedagogy, and build critical digital fluency. These initiatives aim not only to extend the utility of GenAI across instructional design, but also to investigate its cognitive, cultural, and ethical implications.
  • Empirical studies on engagement and retention: Faculty-led research will examine how GenAI-generated estimation problems compare to traditional assignments in promoting conceptual understanding, retention, and motivation—particularly among multilingual and first-generation students. Metrics will include persistence rates, confidence in mathematical reasoning, and attitudes toward problem-solving [84].
  • Student–AI co-creation models: Ongoing pilots are exploring how students can engage in co-creating, critiquing, and refining AI-generated prompts. This process invites students to become co-designers of their learning environment, fostering both metacognitive awareness and AI literacy. By reflecting on AI-generated logic and identifying inaccuracies or bias, students sharpen both quantitative and ethical reasoning [85].
  • Multimodal AI tools for modeling: Faculty are testing AI platforms that support multimodal inputs—such as text, numerical data, and graphical visualization—to scaffold interdisciplinary mathematical modeling. These tools hold particular promise for students who excel in spatial or verbal reasoning but may struggle with abstract numeracy alone [86].
  • Embedded cross-disciplinary modules: To ensure ethical grounding, the ULV mathematics faculty are collaborating with colleagues in philosophy, education, and computer science to co-develop short AI literacy modules. These will address topics such as algorithmic bias, transparency in tool use, and responsible data modeling—equipping students with transferable skills across STEM and non-STEM fields.
Together, these initiatives reflect the ULV’s broader commitment to designing GenAI-enhanced instruction that centers student experience, cultural relevance, and equity. Rather than positioning GenAI as a replacement for faculty-led instruction, our approach treats it as a pedagogical partner that supports critical thinking, deepens engagement, and expands access to high-impact learning opportunities. This direction is consistent with recent findings that highlight how higher education institutions are increasingly exploring collaborative, instructional uses of GenAI to improve student learning outcomes [87].

8. Bridging Scientific Literacy Gaps Through Generative AI in Biology Education

The COVID-19 pandemic exposed longstanding vulnerabilities in public scientific literacy, especially regarding the ability to differentiate between peer-reviewed evidence and anecdotal claims. While regulatory guidance from the FDA and CDC was grounded in decades of accumulated empirical data, public discourse was often dominated by misinformation, social media speculation, and a limited understanding of scientific process [88,89,90]. This disconnect not only fueled vaccine hesitancy and health misinformation but also reflected a deeper systemic issue: widespread underdevelopment of scientific reasoning skills among both STEM and non-STEM learners [91]. This crisis has renewed calls for institutions of higher education to play a more active role in cultivating critical science literacy—defined not simply as knowledge of facts, but as the capacity to evaluate evidence, interpret data, and understand the social and ethical dimensions of scientific inquiry [92,93]. Within this context, GenAI tools, when integrated with intentional scaffolding, provide promising avenues for supporting this goal. These tools can help students critically engage with scientific texts, formulate and communicate evidence-based arguments, and interrogate the credibility of information sources.
At the ULV, biology faculty have begun incorporating GenAI tools across courses for both majors and non-majors to address these literacy gaps. These implementations aim to achieve the following:
  • Support students in summarizing and interpreting complex peer-reviewed research;
  • Provide real-time feedback on scientific writing and data interpretation;
  • Encourage critical engagement with AI-generated explanations, particularly in assignments that require verification and revision;
  • Foster transparency and ethical reasoning in the use of AI in academic work.
This approach aligns with the ULV’s broader commitment to inclusive and justice-oriented STEM education, especially for students from historically marginalized backgrounds. In classrooms where students bring diverse linguistic resources and varying levels of familiarity with scientific discourse, GenAI can function as a mediator—not a substitute—for authentic scientific engagement. By enabling access to high-quality explanations, assisting with interpretation of complex texts, and modeling transparent revision practices, GenAI has the potential to democratize participation in scientific inquiry. However, faculty at the ULV are also careful to position GenAI as a tool that must be interrogated—not blindly trusted. Assignments that focus on evaluating AI-generated summaries, for instance, help students recognize both the affordances and limitations of automated systems in scientific communication. In this way, GenAI serves not just as a learning aid, but as a catalyst for deeper reflection on how knowledge is created, validated, and shared—core components of scientific literacy in the 21st century.

8.1. Supporting Non-Majors in Navigating Scientific Information

For non-STEM majors, introductory biology courses often present unfamiliar challenges: interpreting scientific literature, evaluating data, and understanding experimental design. To address these barriers, the ULV faculty introduced GenAI as a tool to scaffold engagement with primary literature. Students used ChatGPT to perform the following tasks:
  • Summarize peer-reviewed articles;
  • Generate annotated bibliographies;
  • Evaluate the credibility of sources.
Rather than simply translating complex material into simpler terms, these activities focused on helping students unpack the structure, logic, and implications of scientific claims. This process enabled them to critically differentiate between anecdotal information and empirical evidence—skills that are essential in resisting misinformation and building informed civic perspectives [21,94].

8.2. Enhancing Science Communication Among Biology Majors

Within courses for biology and microbiology majors, GenAI was integrated into communication-intensive assignments such as poster presentations and manuscript writing. Early writing assignments were completed independently to reinforce core competencies. GenAI was then introduced as a revision tool, helping students
  • Clarify their key findings;
  • Distill takeaways into accessible language;
  • Organize poster elements or manuscript sections more effectively.
Crucially, students maintained full authorship of their work, using AI as an editorial assistant rather than a ghostwriter. This supported the development of a scientific voice while offering a platform for ethical discussions on transparency, attribution, and intellectual ownership [71]. These practices also raised important research questions for future exploration: How does AI use shape students’ confidence and ability to communicate science? What supports are needed to ensure equity in these emerging modalities?

8.3. Teaching AI Literacy Through Critical Evaluation

Rather than avoiding the known limitations of GenAI, the ULV faculty incorporated them directly into instruction. Students were tasked with evaluating AI-generated summaries of primary literature, identifying
  • Misinterpretations of experimental design;
  • Logical inconsistencies;
  • Unsupported claims.
These assignments helped students develop critical AI literacy, positioning GenAI not as an authoritative voice but as a collaborator requiring verification. Through this lens, AI errors became case studies in scientific reasoning, reinforcing core concepts in hypothesis testing, source verification, and scientific trust-building. This reflective practice—treating AI as both a tool and an object of critique—also promoted metacognitive awareness of how knowledge is constructed and communicated.

8.4. Broadening Access to Research Communication

By embedding GenAI into structured, low-stakes assignments, biology instructors created inclusive opportunities for students—particularly those from underrepresented backgrounds—to gain confidence in scientific communication. In courses where students revised experimental designs, created visual abstracts, or generated manuscript drafts with AI support, faculty noted marked improvements in clarity, coherence, and ownership of content.
For students unfamiliar with scholarly publishing, GenAI provided a bridge to understanding formatting expectations, peer-review norms, and disciplinary discourse. These interventions align with the HSI goals of expanding access to high-impact educational practices, including undergraduate research, public presentation, and authorship [20]. The result was a more democratized entry into the scientific community—an essential goal for equity-driven biology education.

8.5. Future Directions: Institutionalizing Scientific Literacy in the AI Era

The experiences at the ULV point to a broader imperative: to embed GenAI into STEM curricula in ways that not only enhance learning outcomes, but also cultivate scientific literacy, civic reasoning, and ethical AI engagement. Moving forward, efforts must shift from ad hoc integration toward institutionally scaffolded strategies that prepare students to navigate—and shape—a scientific landscape increasingly mediated by intelligent systems.
  • Cross-disciplinary AI literacy modules: Faculty across disciplines are exploring shared curricula that introduce students to core concepts in AI use—such as source evaluation, algorithmic bias, transparency, and attribution ethics. These modules can be integrated into first-year seminars, science writing courses, or lab-based learning environments and aligned with information literacy frameworks commonly developed in partnership with library science [95,96].
  • Empirical studies on science communication outcomes: The ULV aims to support faculty-led research that evaluates how GenAI affects students’ ability to articulate scientific claims, construct arguments, and adapt content for public or interdisciplinary audiences. These studies should also disaggregate by demographic indicators to track whether AI-supported instruction helps close—or unintentionally widens—existing participation gaps [35].
  • Partnerships with library and information science: To strengthen students’ ability to distinguish credible information from misinformation, collaborations with librarians can help scaffold AI-supported inquiry skills, particularly in navigating scientific databases and interpreting source authority. These collaborations are especially relevant given the surge in AI-generated citations, abstracts, and preprints with variable credibility [97,98].
  • Digital equity initiatives: Finally, the success of any institutional GenAI strategy depends on equitable access. Students from low-income, undocumented, or commuter backgrounds often face disproportionate barriers to AI tools and training. Investments in device lending programs, campus-wide AI literacy workshops, and inclusive technology policies are essential to ensure full participation in GenAI-enhanced research and learning [81,99].
Ultimately, GenAI is not a substitute for rigorous science education, nor is it a fix-all for systemic inequities. However, when approached as a sociotechnical scaffold—one that supports student agency, democratizes access to disciplinary discourse, and fosters reflective practice—it becomes a powerful tool for inclusive scientific engagement [100]. In bridging the gap between research and public understanding, GenAI offers not just a teaching innovation, but a cultural shift in how we prepare students to understand, critique, and contribute to scientific knowledge.

9. Generative AI as a Coding Mentor: Expanding Access to Computational Research in Physics

GenAI is increasingly transforming undergraduate STEM education by expanding access to research experiences that have traditionally been limited to upper-division students with advanced coursework. At the ULV, physics faculty are piloting GenAI tools as real-time coding mentors within computational research settings. This approach is particularly focused on supporting early-stage students—including first- and second-year learners—who are often excluded from research due to their limited programming experience or lack of exposure to faculty-led projects. This work directly addresses a persistent equity gap in physics education. Nationally, over 70% of students who leave the physics major do so within their first two years, frequently citing a lack of early research opportunities, difficulty accessing mentorship, and limited confidence in computational skills [101,102]. These barriers disproportionately affect first-generation students, women, and students from underrepresented racial and ethnic groups, who may be less likely to enter physics with prior programming experience or advanced mathematical preparation [103].
To disrupt these exclusionary patterns, the ULV faculty integrated GenAI—specifically ChatGPT—into a National Science Foundation (NSF)-funded planetary science project that engages students in analyzing four decades of NASA solar wind data. Students developed deep learning models to forecast geomagnetic storms—complex natural events that affect the ionosphere, satellite communications, and power grids. Through this project, students gained hands-on exposure to data science, machine learning, and scientific computing, all while receiving just-in-time, AI-enabled support. Importantly, the initiative welcomed students from diverse academic pathways—not only physics majors but also those in chemistry, mathematics, and computer science. This interdisciplinary model reflects the increasing role of data-intensive inquiry across STEM fields and aligns with national calls to diversify participation in research through scalable, inclusive mentorship models [104]. GenAI served as an on-demand assistant, helping students interpret errors, troubleshoot code, and learn key programming constructs without the intimidation often associated with self-teaching.
This pilot not only democratized access to high-impact learning experiences but also offered new insights into the evolving role of GenAI in undergraduate research mentorship—particularly in disciplines where faculty capacity and student preparation remain uneven.

9.1. Scaffolding Early Research in Space Weather Forecasting

The core research project focused on developing machine learning models to forecast geomagnetic storms based on over four decades of NASA solar wind data. These storms—caused by solar activity—can disrupt communications, satellites, and power systems. To contribute meaningfully, students needed to rapidly acquire skills in Python 3.13.5 (Anaconda distribution) programming, data wrangling, and deep learning.
The ULV’s instructional design offered three scaffolding supports:
  • Faculty mentorship: students attended 15 h per week of small-group sessions that provided guided instruction and real-time troubleshooting.
  • Independent learning resources: supporting resources such as Python tutorials and data science documentation supported self-paced learning.
  • AI-driven support: ChatGPT acted as an always-available tutor—interpreting error messages, generating code snippets, and offering conceptual explanations.
Together, these supports allowed early-career students to engage with advanced research while developing foundational computational fluency.
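To give a concrete sense of the data-wrangling step, the sketch below shows one minimal way an hourly solar wind series could be reshaped into fixed-length input windows for a forecasting model. The data and column names here are synthetic placeholders for illustration only; the project's actual NASA data pipeline is not reproduced.

```python
import numpy as np
import pandas as pd

# Illustrative preprocessing of an hourly solar wind time series into
# fixed-length input windows for a forecasting model. The data below are
# synthetic; the column names ("speed", "density", "dst") are placeholders,
# not the project's actual data schema.

rng = np.random.default_rng(0)
hours = pd.date_range("2020-01-01", periods=2000, freq="h")
df = pd.DataFrame(
    {
        "speed": 400 + 50 * rng.standard_normal(len(hours)),   # km/s
        "density": 5 + rng.standard_normal(len(hours)),        # particles/cm^3
        "dst": -20 + 10 * rng.standard_normal(len(hours)),     # storm index (nT)
    },
    index=hours,
)

features = df[["speed", "density"]].to_numpy()
target = df["dst"].to_numpy()

window = 24                                    # previous 24 h as model input
X = np.stack([features[i:i + window] for i in range(len(features) - window)])
y = target[window:]                            # value to forecast one step ahead

print(X.shape, y.shape)                        # (1976, 24, 2) and (1976,)
```

Preparing data in this form involves exactly the kind of slicing, indexing, and missing-data handling for which an on-demand AI tutor can shorten the path from confusion to working code.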

9.2. Real-World Use Cases: GenAI in Student Research Workflows

Students integrated ChatGPT into various aspects of their computational workflow:
  • Code generation: when students encountered unfamiliar syntax or libraries (e.g., pandas, NumPy, TensorFlow), they used AI to generate and explain code snippets tailored to their project needs.
  • Algorithm design: ChatGPT provided step-by-step explanations of machine learning architectures such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) models, breaking down logic and implementation.
  • Code translation: students with backgrounds in MATLAB (version R2023a), C++, or Java used GenAI to translate familiar logic into Python, accelerating language acquisition across platforms.
  • Optimization and debugging: ChatGPT was used to troubleshoot bugs, improve runtime, and clarify logic errors—transforming technical roadblocks into learning opportunities.
These applications demonstrate how AI can support agency and problem-solving, especially for students navigating complex coding challenges for the first time.
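As a minimal illustration of what AI-assisted scaffolding can produce, the sketch below shows a small LSTM regression model in the style students built on and refined. The synthetic data, layer sizes, and training settings are illustrative defaults, not the configuration used in the project, and students were expected to verify, tune, and justify each choice.

```python
import numpy as np
import tensorflow as tf

# Minimal LSTM regressor for windowed time-series input. The synthetic data
# and hyperparameters are illustrative only.

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24, 2)).astype("float32")  # 500 windows, 24 h, 2 features
y = rng.normal(size=(500,)).astype("float32")        # stand-in for a storm index

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(24, 2)),
    tf.keras.layers.LSTM(32),                         # summarize each 24 h window
    tf.keras.layers.Dense(1),                         # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

print(model.evaluate(X, y, verbose=0))                # mean squared error
```

In practice, the value of such a starting point lies less in the code itself than in the follow-up questions it invites: why an LSTM rather than a simpler baseline, how many units and epochs are justified, and how performance should be validated on held-out storm events.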

9.3. Building AI Literacy Through Verification and Ethical Use

To ensure responsible use, faculty at the ULV emphasized the critical evaluation of AI outputs. Rather than positioning GenAI as infallible, instructors encouraged students to treat it as a learning partner—one that occasionally makes mistakes.
  • Verification: students were trained to cross-reference AI-generated code with trusted documentation and test functionality across multiple datasets.
  • Transparency: code generated with AI required student annotation and conceptual justification to ensure full intellectual ownership.
  • Ethical engagement: classroom discussions focused on AI authorship, attribution, and responsible integration into collaborative work.
These practices reinforced the principle that GenAI should scaffold—not substitute—scientific reasoning. This aligns with the broader ethical framework presented throughout the manuscript, emphasizing transparency, academic integrity, and metacognition.

9.4. Expanding Equity Through Early Access to Research

The initiative also addressed structural barriers to research participation. By embedding AI in summer internships funded by the NSF, the ULV faculty
  • Expanded participation: students from historically excluded backgrounds—including those without prior programming experience—were able to contribute meaningfully to a cutting-edge research project.
  • Supported financial equity: paid research opportunities helped reduce financial barriers, particularly for commuter and working students.
  • Built research identity: students prepared conference presentations and co-authored publications, accelerating their integration into the scientific community.
This model demonstrates how GenAI, when combined with mentorship and funding, can advance inclusive excellence in undergraduate STEM education.

9.5. Future Directions: Designing Scalable AI-Supported Research Training

Building on this success, the ULV’s physics faculty are exploring scalable innovations to expand AI-supported research training and make early computational experience more equitable, particularly for students from historically excluded groups in STEM. These future directions aim to institutionalize access, deepen learning, and bridge the gap between AI literacy and authentic research participation:
  • Open-access pre-research modules: Faculty are co-developing modular, GenAI-augmented tutorials that introduce foundational coding skills (e.g., Python, pandas, matplotlib) and core machine learning concepts (e.g., classification, regression) to lower-division students. These resources serve as an on-ramp for students without prior programming exposure, addressing early attrition risks often linked to technical intimidation [105]. An illustrative starter exercise of this kind is sketched after this list.
  • Peer-led AI coaching models: Advanced students are being trained to support their peers not only in debugging code and interpreting model outputs, but also in critically assessing AI-generated content. This mentorship model cultivates a collaborative learning culture while distributing the cognitive load of research onboarding [106].
  • Longitudinal research on impact: Faculty plan to study the long-term effects of AI-supported research participation on outcomes such as STEM identity, persistence, and entry into graduate or research-intensive career paths. These studies will disaggregate results by gender, race/ethnicity, and socioeconomic status to ensure that GenAI interventions are advancing—rather than bypassing—equity goals [106].
  • Ethics-integrated curriculum: As students gain proficiency in machine learning, they will also engage with structured discussions and case studies on algorithmic bias, data privacy, intellectual property, and the social implications of automation. Embedding these themes into technical training ensures that students are not only coders, but also critical thinkers equipped to lead ethically in AI-intensive fields [107].
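As one purely illustrative example of what a starter exercise in such a pre-research module might look like (the module contents themselves are still being developed), the sketch below fits a straight line to synthetic data with NumPy and visualizes the result with matplotlib.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative pre-research exercise: fit a least-squares line to noisy
# synthetic data and plot it. The data are generated here, not drawn from
# any course or research dataset.

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=x.size)  # known slope plus noise

slope, intercept = np.polyfit(x, y, deg=1)               # least-squares fit

plt.scatter(x, y, label="synthetic observations")
plt.plot(x, slope * x + intercept, color="black",
         label=f"fit: y = {slope:.2f}x + {intercept:.2f}")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.savefig("regression_demo.png")
```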
Together, these strategies offer a blueprint for using GenAI not simply as a productivity tool, but as a transformative pedagogical scaffold—one that expands access to research, promotes epistemic justice, and prepares undergraduates to lead in an increasingly AI-mediated scientific ecosystem [108].

10. Institutional Strategies for Scalable and Inclusive AI Pedagogies

The integration of GenAI into STEM education at HSIs such as the ULV represents a pivotal moment for rethinking equity, inclusion, and innovation in higher education. While GenAI holds considerable potential to scaffold learning, personalize instruction, and reduce resource disparities [20,21], its adoption must be guided by more than technological enthusiasm. The interdisciplinary implementations described in this manuscript illustrate how AI tools can help address longstanding structural challenges—such as high attrition in gateway STEM courses, language barriers in scientific writing, and unequal access to early research experiences. Yet, they also reveal the complexity of deploying GenAI in educational contexts marked by linguistic diversity, constrained infrastructure, and shifting DEI mandates [54,109]. To ensure that GenAI advances equity rather than reinforces existing gaps, its use must be intentionally designed, ethically grounded, and contextually responsive—rooted in culturally relevant pedagogy and informed by ongoing critical reflection [51].

10.1. Equity and Access in the AI Era

GenAI holds significant potential to lower access barriers for first-generation, multilingual, and underrepresented students by simplifying technical language, providing multilingual explanations, and enabling personalized learning pathways [110]. Faculty at the ULV demonstrated how these tools can scaffold complex STEM concepts, from chemistry protocols to computational coding. However, the benefits of GenAI are not evenly distributed [108]. Without equitable access to devices, stable internet, and on-campus support, students—especially commuters and those from underserved backgrounds—face new layers of exclusion in AI-supported classrooms [94]. Addressing these disparities requires institutional investment in digital infrastructure, device lending programs, and targeted learning support [111]. The urgency of these efforts is compounded by two converging forces: accelerating automation in the labor market and the dismantling of Diversity, Equity, and Inclusion (DEI) programs in higher education. As DEI frameworks come under political scrutiny, the responsibility for fostering equitable outcomes increasingly shifts to individual faculty and departments. Within this context, GenAI becomes not merely a technical tool, but a strategic response to structural inequality [112].

10.2. Culturally Responsive and Context-Aware Pedagogy

GenAI’s effectiveness is amplified when aligned with students’ lived experiences, regional challenges, and community values. At the ULV, courses such as The Art of Guestimation and field biology used GenAI to contextualize quantitative reasoning through place-based problems—modeling water use, estimating solar energy, or assessing urban biodiversity. These applications exemplify culturally relevant pedagogy, which has been shown to increase student motivation, engagement, and persistence [113]. Yet, there are limitations. Most GenAI tools are trained on large language datasets that reflect dominant cultural narratives and Western epistemologies. This raises concerns about bias, erasure, and the reinforcement of exclusionary norms [24]. Faculty must therefore serve as cultural mediators—curating prompts, refining outputs, and advocating for AI systems trained on more inclusive datasets [111]. Meaningful progress will require partnerships between educators and developers to align tool design with the diverse cultural and linguistic realities of HSI students [114].

10.3. Connecting Coursework with Research and Community Inquiry

Faculty across disciplines leveraged GenAI to deepen student engagement with real-world scientific problems. Whether through ecological fieldwork, laboratory research, or computational physics, AI tools provided scaffolding for students to design experiments, interpret data, and visualize patterns. These activities created bridges between coursework and community-based research. For example, students used GenAI to plan ecological sampling, develop space weather forecasting models, and troubleshoot laboratory techniques. These implementations illustrate how AI can lower entry barriers to undergraduate research, especially for students without prior exposure to scientific workflows. Still, these tools must be understood as complements—not substitutes—for tactile experimentation and hands-on learning. As others have emphasized [71,73], digital simulations are only effective when paired with physical experience and instructor feedback. Moreover, to sustain these interdisciplinary innovations, institutions must enable cross-departmental collaboration and flexible course design policies that support faculty-led experimentation.

10.4. Ethical Engagement and Critical AI Literacy

While AI can accelerate instructional design and support student learning, it also introduces new ethical dilemmas. Across courses, students occasionally relied on GenAI without verifying content accuracy or citing its contributions—raising concerns about attribution, misinformation, and overreliance. Faculty addressed these issues by integrating AI literacy directly into assignments. Students were asked to detect errors in AI-generated content, revise flawed code, or critique automated research summaries. These reflective tasks helped reposition GenAI from authority to collaborator and reinforced the importance of verification, integrity, and scientific reasoning. This echoes recent calls for faculty to act as ‘AI-aware professors’ who explicitly model responsible engagement for students [115]. To summarize these dual dynamics, Table 1 provides a concise overview of the primary opportunities and risks of GenAI integration in STEM classrooms, as identified across our cases. Such pedagogical strategies align with national recommendations to prepare graduates who are both digitally fluent and ethically grounded [94,99,116]. To institutionalize this work, colleges and universities must develop clear frameworks for AI use that include data governance, academic honesty, and algorithmic transparency [117].

10.5. AI Literacy as a Multidimensional Competency

At the ULV, students engaged with GenAI not only as users, but as evaluators—testing, refining, and critically interpreting AI-generated content. These activities nurtured what this study terms multidimensional AI literacy: the intersection of technical fluency, ethical reflection, and contextual judgment. This framing moves beyond “tool training” to recognize AI literacy as central to 21st-century scientific citizenship. However, faculty-driven innovation alone is insufficient. Institutions must invest in coordinated support—digital skill-building programs, cross-disciplinary AI workshops, and co-curricular learning spaces—to ensure that all students, regardless of background, can build AI literacy in ways that align with their academic and professional goals [116].

10.6. Cross-Disciplinary Patterns and Pedagogical Convergence

Despite disciplinary differences, faculty practice across biology, chemistry, physics, and mathematics revealed several recurring insights:
  • Comprehension scaffolding: GenAI helped students parse complex ideas through simplified language, segmentation, and interactive prompting [71].
  • Prompt-based iteration: instructors used GenAI to generate draft assessments, assignment templates, and quiz questions—streamlining content development and feedback loops.
  • Output limitations: across fields, faculty reported inaccuracies in AI-generated visuals, code, or explanations—highlighting the need for expert review and correction [73].
  • Ownership and reflection: when paired with reflective tasks, GenAI fostered deeper student thinking without displacing intellectual authorship.
These patterns suggest that GenAI, when embedded into thoughtfully scaffolded learning environments, can serve as a cognitive partner—one that enhances instructional design rather than replacing it.

10.7. Broader Institutional Applications

While this review focuses on an HSI context, the insights have broader relevance across higher education. This subsection situates our findings in the wider literature and considers their applicability for non-HSIs. Our findings align with emerging studies documenting both the promise and pitfalls of GenAI integration in higher education. For example, Soliman et al. [118] modeled university students’ continuous intention to adopt GenAI and emphasized that equitable uptake depends less on the novelty of the tools than on the institutional ecosystems in which they are embedded. This parallels our observation that faculty-driven experimentation must be scaffolded by infrastructural and policy supports if it is to scale equitably. Similarly, Pimentel and Palomino [27] describe how GenAI can be deliberately integrated into KM processes to automate codification and collaboration—echoing how the ULV’s faculty practices operationalize AI-supported knowledge sharing.
Although our focus is on an HSI, these dynamics extend beyond MSIs. At predominantly white institutions (PWIs), where student demographics are less diverse, the absence of explicit equity frameworks may actually exacerbate risks of algorithmic bias and digital exclusion. Studies of AI in general higher education contexts suggest that without intentional attention to cultural and linguistic inclusion, GenAI tends to reinforce existing privilege structures [12,36]. Thus, while HSIs face resource constraints, they also possess mission-driven commitments to equity that provide a critical orientation for GenAI adoption. PWIs that lack such commitments must work deliberately to avoid defaulting to one-size-fits-all applications of GenAI that fail to account for linguistic, socioeconomic, or first-generation status diversity.
In this way, the strategies documented here—scaffolding technical writing, expanding research access, and fostering multilingual comprehension—offer transferable models not just for HSIs but for any institution grappling with the sociotechnical tensions of GenAI adoption. For non-HSIs, the implication is clear: equitable AI integration requires intentional design choices and critical faculty engagement, even where the institutional demographics may not make equity concerns as visible. Taken together, these insights emphasize that GenAI adoption is never neutral; it is shaped by institutional commitments, capacities, and demographics. The strategies we document here provide actionable entry points for institutions of all types to approach GenAI integration intentionally, equitably, and sustainably.

10.8. Sustaining DEI Commitments Amid Policy Retrenchment

The use of GenAI at HSIs such as the ULV unfolds against a complex political backdrop. In several states, institutional DEI frameworks have been weakened or dismantled, placing the onus for inclusive teaching on individual faculty, departments, and grassroots initiatives [119,120]. In this climate, GenAI presents both risks and opportunities [108]. If implemented without critical oversight, it may replicate bias, privilege dominant narratives, and deepen digital divides [112]. However, when leveraged intentionally, it can amplify student voice, personalize learning, and make STEM education more inclusive. At the ULV, faculty-led innovations—spanning multilingual learning, community-engaged research, and interdisciplinary curriculum design—demonstrate that it is possible to uphold equity goals even amid institutional constraint. These practices offer a compelling model for how HSIs can turn AI integration into a vehicle for systemic transformation [112]. To support institutions seeking to adapt and apply these findings, Table 2 presents a synthesis of institutional challenges, GenAI-enabled instructional strategies, and their observed or intended equity-centered outcomes, offering a practical framework for action across diverse disciplinary and institutional settings.

10.9. Limitations and Future Directions

This review adopts an exploratory qualitative design, synthesizing faculty-authored vignettes and peer-reviewed literature rather than generating new empirical data. As such, it documents emerging practices rather than offering outcome-based metrics. This scope is both a strength and a limitation: while it enables timely reflection on GenAI’s integration at a Hispanic-Serving Institution, it does not provide quantitative measures of learning outcomes or longitudinal analyses of impact. These limitations reflect the early stage of GenAI adoption in higher education, where tools are rapidly evolving and systematic data collection has only begun. Future studies might also draw on innovative analytic approaches, such as the fuzzy-set methodology outlined by Soliman et al. [121], which models not only whether students adopt GenAI tools but also the institutional and contextual conditions under which adoption occurs.
Future directions should build on this exploratory groundwork through empirical study. Potential pathways include collecting quantitative evidence on how GenAI shapes student persistence, equity gaps, and disciplinary engagement; conducting longitudinal research to examine the sustained impact of AI-enhanced pedagogy; and extending analysis across institutions to compare experiences at HSIs, other Minority-Serving Institutions, and predominantly white institutions. Recent case study analyses, such as that conducted by Belkina [122], provide examples of how GenAI integration can be systematically documented across diverse higher education contexts, offering useful models for designing empirical studies that move beyond exploratory vignettes. At the same time, faculty-led exploration must continue to play a central role, ensuring that innovation is not prematurely constrained by institutional guardrails. Administrators and policymakers can benefit from engaging directly with classroom experimentation to better understand both the opportunities and challenges that GenAI presents. At the policy level, García-López [123] highlights the ethical and regulatory challenges that accompany GenAI adoption in education, underscoring the importance of future research that balances innovation with safeguards for equity, transparency, and academic integrity. Together, these directions underscore the need for iterative, practice-informed research that balances empirical rigor with openness to faculty creativity and student agency.

11. Conclusions

The integration of GenAI into STEM education at HSIs represents more than a technological innovation—it constitutes a strategic response to persistent inequities in access, scientific literacy, and digital opportunity. Drawing on interdisciplinary faculty-led practices at the ULV, this study has illustrated how GenAI tools can simplify complex concepts, support culturally grounded instruction, and extend early-stage research opportunities to students historically underrepresented in science. Across the disciplines of biology, chemistry, physics, and mathematics, faculty demonstrated that GenAI—when embedded within thoughtfully designed, equity-focused learning environments—can enhance comprehension, personalize learning, and expand access to high-impact educational practices. Whether used to scaffold microbiology writing assignments or debug machine learning algorithms in physics, GenAI served as a pedagogical amplifier: not replacing mentorship or experiential learning, but complementing them in powerful ways.
However, the potential of GenAI will not be realized through tool adoption alone. Institutional action is essential to ensure that these benefits are equitably distributed. Investments in digital infrastructure, inclusive AI literacy programs, and faculty development are necessary to prevent algorithmic bias, avoid over-reliance on automation, and close persistent access gaps. These needs are particularly urgent in light of broader societal forces—including labor market volatility, retrenchments in diversity and inclusion efforts, and deepening digital divides among first-generation students. In this context, HSIs are uniquely positioned to lead. Their missions, student populations, and faculty innovation networks provide fertile ground for modeling ethical, culturally responsive GenAI integration. The faculty initiatives described here highlight a replicable framework rooted in interdisciplinary collaboration, contextualized pedagogy, and critical engagement—one that aligns with the values of justice, inclusion, and academic rigor.
Looking ahead, future research must move beyond anecdotal implementation and toward the development of empirically grounded models that evaluate the long-term effects of GenAI on retention, STEM identity, and workforce preparedness. Of equal importance is the need to investigate how students negotiate the boundaries between AI-generated support and their own intellectual authorship—particularly in research, design, and scientific communication. Rather than responding to the rise of GenAI with restriction or avoidance, higher education should embrace it as a catalyst for pedagogical reinvention. This manuscript advocates for a proactive, justice-centered approach that prepares students not only to navigate a GenAI-infused world—but to shape it. Institutions must move beyond reactive policies and cultivate AI-literate learning environments that reflect the realities students will face in their academic and professional lives. When used intentionally, GenAI becomes not merely a technological add-on, but a tool that complements mentorship, reinforces scientific reasoning, and expands access to high-impact learning. By foregrounding faculty voice, institutional context, and scalable strategies, this work contributes to a growing movement redefining what inclusive, forward-looking STEM education can and should look like.

Author Contributions

Conceptualization: V.D.C.-G.; methodology: V.D.C.-G., H.U., M.Z., C.B., E.T., Y.D., D.C., and T.L.; software: H.U., M.Z., C.B., E.T., Y.D., D.C., and T.L.; validation: V.D.C.-G.; formal analysis: V.D.C.-G., H.U., M.Z., C.B., E.T., Y.D., D.C., and T.L.; investigation: H.U., M.Z., C.B., E.T., Y.D., D.C., and T.L.; resources: V.D.C.-G.; data curation: V.D.C.-G.; writing—original draft preparation: V.D.C.-G.; writing—review and editing: V.D.C.-G., H.U., M.Z., C.B., E.T., Y.D., D.C., and T.L.; visualization: V.D.C.-G.; supervision: V.D.C.-G.; project administration: V.D.C.-G.; funding acquisition: V.D.C.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study, because this study did not include any interventions, personal data collection, or procedures that would require formal Ethics Committee or Institutional Review Board (IRB) approval.

Informed Consent Statement

Participant consent was waived, because this study did not include any interventions, personal data collection, or procedures that would require formal Ethics Committee or Institutional Review Board (IRB) approval.

Data Availability Statement

The data supporting the findings of this study are available from the authors upon reasonable request.

Acknowledgments

V.D.C.-G. sincerely thanks Ngoc H. Bui, from Effectiveness, Planning, and Faculty Affairs in the College of Arts and Sciences at the University of La Verne, for her invaluable conversations about the innovative ways CAS faculty are integrating generative AI into their work, which were instrumental in shaping the direction of this project. Her efforts in recruiting the initial faculty team for meaningful discussions and shared exploration are deeply appreciated.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AR: Augmented reality
CURE: Course-Based Undergraduate Research Experiences
DEI: Diversity, Equity, and Inclusion
GenAI: Generative Artificial Intelligence
HPLC: High-performance liquid chromatography
HSI: Hispanic-Serving Institution
LSTM: Long Short-Term Memory
MSI: Minority-Serving Institution
NSF: National Science Foundation
RNN: Recurrent Neural Network
STEM: Science, Technology, Engineering, and Mathematics
ULV: University of La Verne

References

  1. Milberg, T. The future of learning: How AI is revolutionizing education 4.0. In Proceedings of the World Economic Forum, Davos, Switzerland, 15–19 January 2024. [Google Scholar]
  2. UNESCO. Education in the Age of Artificial Intelligence; United Nations Educational: Paris, France, 2023. [Google Scholar]
  3. National Science Foundation. NSF Investing Nearly $8M in EducateAI Awards to Develop Next-Generation AI Curriculum; National Science Foundation: Alexandria, VA, USA, 2024. [Google Scholar]
  4. Bratsis, I. The AI Product Manager’s Handbook: Develop a Product That Takes Advantage of Machine Learning to SOLVE AI Problems; Packt Publishing Ltd.: Birmingham, UK, 2023. [Google Scholar]
  5. Emrey-Arras, M. Higher Education: Hispanic-Serving Institutions Reported Extensive Facility and Digital Infrastructure Needs. GAO-24-106162. In Report to Congressional Committees; US Government Accountability Office: Washington, DC, USA, 2024. [Google Scholar]
  6. Mangan, K. What does it take to be a ‘minority-serving institution’? The Chronicle of Higher Education, 20 March 2023. [Google Scholar]
  7. Flores, A.R. Article on Hispanic-Serving Institutions fell short [Letters to the Editor]. The Chronicle of Higher Education, 13 December 2021. [Google Scholar]
  8. de La Torre, A.F. Bridging the AI Divide: A Call to Action; Inside Higher Ed: Washington, DC, USA, 2024. [Google Scholar]
  9. Monaghan, P. What it means, politically, to serve Hispanics. The Chronicle of Higher Education, 21 April 2019. [Google Scholar]
  10. Carmona-Galindo, V.D.; Velado-Cano, M.A.; Groat-Carmona, A.M. The Ecology of Climate Change: Using Virtual Reality to Share, Experience, and Cultivate Local and Global Perspectives. Educ. Sci. 2025, 15, 290. [Google Scholar] [CrossRef]
  11. National Science Board; National Science Foundation. Science and Engineering Indicators 2022: The State of U.S. Science and Engineering (NSB-2022-1); National Science Foundation: Alexandria, VA, USA, 2022. [Google Scholar]
  12. Williamson, B.; Eynon, R. Historical threads, missing links, and future directions in AI in education. Learn. Media Technol. 2020, 45, 223–235. [Google Scholar] [CrossRef]
  13. Humburg, M.; Han, A.; Zheng, J.; Rosé, C.P.; Chao, J.; Melo, N.A.; Chávez, V.C.; Higgs, J.; Kaimana, M.; Isero, M. Humanizing AI for Education: Conversations with the JLS 2026 Special Issue Contributors. In Proceedings of the 19th International Conference of the Learning Sciences-ICLS, Helsinki, Finland, 10–13 June 2025; pp. 2269–2277. [Google Scholar]
  14. Hurtado, S.; Ruiz Alvarado, A. Realizing the Potential of Hispanic-Serving Institutions: Multiple Dimensions of Institutional Diversity for Advancing Hispanic Higher Education; Hispanic Association of Colleges and Universities: San Antonio, TX, USA, 2012. [Google Scholar]
  15. Bauman, D. Share of Hispanic college students has nearly doubled since 2005. The Chronicle of Higher Education, 11 May 2023. [Google Scholar]
  16. Brown, S.M.; Mangan, K. Everyone wants to be a Hispanic-Serving Institution. The Chronicle of Higher Education, 1 December 2021. [Google Scholar]
  17. Taylor, Z.; Ortega, G.; Hernández, S.H. Hispanic-serving artificial intelligence: Do hispanic-serving institutions use chatbots and can they speak Spanish? Teach. Coll. Rec. 2024, 126, 30–53. [Google Scholar] [CrossRef]
  18. Taylor, Z.; Burnett, C.A. Hispanic-serving institutions and web accessibility: Digital equity for Hispanic students with disabilities in the 21st century. J. Hisp. High. Educ. 2021, 20, 402–421. [Google Scholar] [CrossRef]
  19. Valdivieso, T.; González, O. Generative AI Tools in Salvadoran Higher Education: Balancing Equity, Ethics, and Knowledge Management in the Global South. Educ. Sci. 2025, 15, 214. [Google Scholar] [CrossRef]
  20. Sinha, N.; Madhavarao, R.; Freeman, R.; Oujo, I.; Boyd, J. AI Literacy for Hispanic-Serving Institution (HSI) Students. In Proceedings of the AAAI Symposium Series, Vancouver, BC, Canada, 25–27 March 2024; pp. 522–527. [Google Scholar]
  21. Robert, J. The Future of AI in Higher Education: 2024 Educause AI Landscape Study. 2024. Available online: https://www.educause.edu/ecar/research-publications/2024/2024-educause-ai-landscape-study/the-future-of-ai-in-higher-education (accessed on 12 January 2025).
  22. Ahmed, R.; Kumar, S. Algorithmic bias in educational systems: Examining the impact of AI-driven decision making in modern education. World J. Adv. Res. Rev. 2025, 25, 37–45. [Google Scholar] [CrossRef]
  23. Kizilcec, R.F. To Advance AI Use in Education, Focus on Understanding Educators. Int. J. Artif. Intell. Educ. 2024, 34, 12–19. [Google Scholar] [CrossRef]
  24. Downs, L. So, What Is Culturally Responsive Digital Learning? Available online: https://wcet.wiche.edu/frontiers/2023/09/07/so-what-is-culturally-responsive-digital-learning/ (accessed on 12 January 2025).
  25. Yosso, T.J. Whose culture has capital? A critical race theory discussion of community cultural wealth. Race Ethn. Educ. 2005, 8, 69–91. [Google Scholar] [CrossRef]
  26. Paris, D.; Alim, H.S. Culturally Sustaining Pedagogies: Teaching and Learning for Justice in a Changing World; Teachers College Press: New York, NY, USA, 2017. [Google Scholar]
  27. Pimentel, M.; Palomino, J.C.V. Generative AI Solutions for Enhancing Knowledge Management: A Literature Review and Roadmap. In Proceedings of the European Conference on Knowledge Management, Lahti, Finland, 5–6 September 2024; pp. 1092–1095. [Google Scholar]
  28. Kudryavtsev, D.; Khan, U.A.; Kauttonen, J. Transforming Knowledge Management Using Generative AI: From Theory to Practice. In Proceedings of the International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management, Porto, Portugal, 16–18 November 2024. [Google Scholar]
  29. Bura, C.; Myakala, P.K. Advancing Transformative Education: Generative AI as a Catalyst for Equity and Innovation. 2024. Available online: https://www.researchgate.net/publication/386112299_Advancing_Transformative_Education_Generative_AI_as_a_Catalyst_for_Equity_and_Innovation (accessed on 12 January 2025).
  30. Qian, Y. Pedagogical Applications of Generative AI in Higher Education: A Systematic Review of the Field. TechTrends 2025. [Google Scholar] [CrossRef]
  31. Zawacki-Richter, O.; Marín, V.I.; Bond, M.; Gouverneur, F. Systematic review of research on artificial intelligence applications in higher education—Where are the educators? Int. J. Educ. Technol. High. Educ. 2019, 16, 39. [Google Scholar] [CrossRef]
  32. Nonaka, I.; Takeuchi, H. The knowledge-creating company: How Japanese companies create the dynamics of innovation. Long Range Plan. 1996, 29, 592. [Google Scholar] [CrossRef]
  33. Alavi, M.; Leidner, D.E. Review: Knowledge Management and Knowledge Management Systems: Conceptual Foundations and Research Issues. MIS Q. 2001, 25, 107–136. [Google Scholar] [CrossRef]
  34. Rezaei, M. Artificial intelligence in knowledge management: Identifying and addressing the key implementation challenges. Technol. Forecast. Soc. Change 2025, 217, 124183. [Google Scholar] [CrossRef]
  35. Kizilcec, R.F.; Lee, H. Algorithmic fairness in education. In The Ethics of Artificial Intelligence in Education; Routledge: Abingdon, UK, 2022; pp. 174–202. [Google Scholar]
  36. Madaio, M.; Blodgett, S.L.; Mayfield, E.; Dixon-Román, E. Beyond “fairness”: Structural (in) justice lenses on ai for education. In The Ethics of Artificial Intelligence in Education; Routledge: Abingdon, UK, 2022; pp. 203–239. [Google Scholar]
  37. O’Dea, X. Generative AI: Is it a paradigm shift for higher education? Stud. High. Educ. 2024, 49, 811–816. [Google Scholar] [CrossRef]
  38. Garcia, G.A. Becoming Hispanic-Serving Institutions: Opportunities for Colleges and Universities; Johns Hopkins University Press: Baltimore, MD, USA, 2019. [Google Scholar]
  39. Gasman, M.; Samayoa, A.C.; Boland, W.C.; Esmieu, P. Educational Challenges at Minority Serving Institutions; Routledge: New York, NY, USA, 2018. [Google Scholar]
  40. Covarrubias, R.; Laiduc, G.; Quinteros, K.; Arreaga, J. Lessons on servingness from mentoring program leaders at a Hispanic serving institution. J. Leadersh. Equity Res. 2023, 9. Available online: https://journals.sfu.ca/cvj/index.php/cvj/article/view/259 (accessed on 12 January 2025).
  41. Bell, T.; Aubele, J.W.; Perruso, C. Digital divide issues affecting undergraduates at a Hispanic-serving institution during the pandemic: A mixed-methods approach. Educ. Sci. 2022, 12, 115. [Google Scholar] [CrossRef]
  42. Peristeris, K. Culturally sustaining pedagogies: Teaching and learning for justice in a changing world. J. Teach. Learn. 2017, 11. [Google Scholar] [CrossRef]
  43. Walter, Y. Embracing the future of Artificial Intelligence in the classroom: The relevance of AI literacy, prompt engineering, and critical thinking in modern education. Int. J. Educ. Technol. High. Educ. 2024, 21, 15. [Google Scholar] [CrossRef]
  44. Dalkir, K. Knowledge Management in Theory and Practice, 4th ed.; MIT Press: Cambridge, MA, USA, 2023. [Google Scholar]
  45. Davenport, T.H.; Prusak, L. Working Knowledge: How Organizations Manage What They Know; Harvard Business Review Press: Brighton, MA, USA, 2000. [Google Scholar]
  46. Nyutu, E.N.; Carmona-Galindo, V.; Polanco, M. Assessment of Community-Engaged Research Experiences in Introductory General Biology Laboratories. Am. Biol. Teach. 2024, 86, 426–431. [Google Scholar] [CrossRef]
  47. Nyutu, E.N.; Carmona-Galindo, V.; Polanco, M.C. A Framework for Scaling-Up Community-Engaged Research Experiences in Introductory General Biology Laboratories. Bioscene J. Coll. Biol. Teach. 2023, 49, 42–51. [Google Scholar]
  48. Chen, C.; Gong, Y. The Role of AI-Assisted Learning in Academic Writing: A Mixed-Methods Study on Chinese as a Second Language Students. Educ. Sci. 2025, 15, 141. [Google Scholar] [CrossRef]
  49. Kavitha, K.; Joshith, V. Pedagogical Incorporation of Artificial Intelligence in K-12 Science Education: A Decadal Bibliometric Mapping and Systematic Literature Review (2013–2023). J. Pedagog. Res. 2024, 8, 437–465. [Google Scholar] [CrossRef]
  50. Smith-Mutegi, D.; Mamo, Y.; Kim, J.; Crompton, H.; McConnell, M. Perceptions of STEM education and artificial intelligence: A Twitter (X) sentiment analysis. Int. J. STEM Educ. 2025, 12, 9. [Google Scholar] [CrossRef]
  51. Barajas-Salazar, B.E.; Almeida, M.; Aguirre-Muñoz, Z.; Viveros, M. Culturally relevant informal STEM learning for underserved students: Effects of repeated exposure to the engineering design process. In Proceedings of the Frontiers in Education, Nashville, TN, USA, 2–5 November 2025; p. 1534452.
  52. The Importance of STEM Education for K–12 Students. Available online: https://www.nms.org/Resources/Newsroom/Blog/2023/November/The-Importance-of-STEM-Education-for-K-12-Students.aspx (accessed on 12 January 2025).
  53. Using AI Tools for Science Writing. Available online: https://scitechedit.com/using-ai-tools-for-science-writing/ (accessed on 30 June 2025).
  54. Perez-Oquendo, M.; Hileman, E.O. How—And Why—STEM Trainees Must Hone Their Writing; Inside Higher Ed: Washington, DC, USA, 2024. [Google Scholar]
  55. Shekanino, A.; Agustin, A.; Aladefa, A.; Amezquita, J.; Gonzalez, D.; Heldenbrand, E.; Hernandez, A.; May, M.; Nuno, A.; Ojeda, J.; et al. Differential Stomatal Responses to Surface Permeability by Sympatric Urban Tree Species Advance Novel Mitigation Strategy for Urban Heat Islands. Sustainability 2023, 15, 11942. [Google Scholar] [CrossRef]
  56. Winn-Swanson, K.; Kostich, L.; Castañeda-Childress, M.; Solis, I.; Remillard, J.; Agustin, A.; Gonzalez, D.; Carmona-Galindo, V.D. Dynamics of Primary Succession in Airborne Microbial Communities on Urban Masonry. Acta Microbiol. Hell. 2025, 70, 4. [Google Scholar] [CrossRef]
  57. Puno, T.; Heldenbrand, E.; Ortiz, A.; Reola, J.; Aladefa, A.; Zaragoza, C.; Shekanino, A.; Carmona-Galindo, V.D. Urban heat and cool island effects on aerosol microbiome assemblages. Bios 2025, 96, 9–15. [Google Scholar] [CrossRef]
  58. Coffey, L. Most Students Outpacing Faculty in AI Use; Inside Higher Ed: Washington, DC, USA, 2023. [Google Scholar]
  59. Stokel-Walker, C. ChatGPT listed as author on research papers: Many scientists disapprove. Nature 2023, 613, 620–621. [Google Scholar] [CrossRef]
  60. Floridi, L.; Chiriatti, M. GPT-3: Its Nature, Scope, Limits, and Consequences. Minds Mach. 2020, 30, 681–694. [Google Scholar] [CrossRef]
  61. Seery, M.K. Establishing the laboratory as the place to learn how to do chemistry. J. Chem. Educ. 2020, 97, 1511–1514. [Google Scholar] [CrossRef]
  62. Grove, N.P.; Bretz, S.L. A continuum of learning: From rote memorization to meaningful learning in organic chemistry. Chem. Educ. Res. Pract. 2012, 13, 201–208. [Google Scholar] [CrossRef]
  63. Adams, S. The Impact of Socioeconomic Status on the Development of STEM Identity, Choice, and Persistence; Murray State University: Murray, KY, USA, 2022. [Google Scholar]
  64. Liu, M.; Zhang, J.; Nyagoga, L.M.; Liu, L. Student-AI question cocreation for enhancing reading comprehension. IEEE Trans. Learn. Technol. 2023, 17, 815–826. [Google Scholar] [CrossRef]
  65. Tlais, S. AI-Driven Interest-Based Learning for Enhanced Engagement in General Chemistry. 2024. Available online: https://www.researchgate.net/publication/386438533_AI-Driven_Interest-Based_Learning_for_Enhanced_Engagement_in_General_Chemistry (accessed on 12 January 2025).
  66. Bugri, V.A.; Egala, S.B. AI-Powered Innovation: Navigating the Entrepreneurial Landscape in STEM Education. AI Soc. 2025, 23–50. [Google Scholar]
  67. Huffmyer, A.S.; O’Neill, T.; Lemus, J.D. Evidence for professional conceptualization in science as an important component of science identity. CBE Life Sci. Educ. 2022, 21, ar76. [Google Scholar] [CrossRef]
  68. Basken, P. Undergraduate Science Gains Are Tied to Hands-on Lab Experience. Available online: https://www.chronicle.com/article/undergraduate-science-gains-are-tied-to-hands-on-lab-experience/ (accessed on 15 October 2023).
  69. Jacoby, S. What About the Other 85 Percent? Inside Higher Ed. 2020. Available online: https://www.insidehighered.com/views/2020/07/23/colleges-should-be-planning-more-intentionally-students-who-commute-campuses-fall (accessed on 12 January 2025).
  70. Salai, S. Online learning flourishes as residential colleges face rising costs. The Washington Times, 11 September 2024. [Google Scholar]
  71. Zhu, Y. The impact of AI-assisted teaching on students’ learning and psychology. J. Educ. Humanit. Soc. Sci. 2024, 38, 111–116. [Google Scholar] [CrossRef]
  72. Guo, Y.; Lee, D. Leveraging ChatGPT for enhancing critical thinking skills. J. Chem. Educ. 2023, 100, 4876–4883. [Google Scholar] [CrossRef]
  73. Potkonjak, V.; Gardner, M.; Callaghan, V.; Mattila, P.; Guetl, C.; Petrović, V.M.; Jovanović, K. Virtual laboratories for education in science, technology, and engineering: A review. Comput. Educ. 2016, 95, 309–327. [Google Scholar] [CrossRef]
  74. National Academies of Sciences, Engineering, and Medicine. Role of Laboratory and Field Instruction in Biology Education; National Academies of Sciences, Engineering, and Medicine: Washington, DC, USA, 2023. [Google Scholar]
  75. Paniagua, A.; Istance, D. Teachers as Designers of Learning Environments; OECD Publications Centre: Paris, France, 2018. [Google Scholar]
  76. Theobald, E.J.; Hill, M.J.; Tran, E.; Agrawal, S.; Arroyo, E.N.; Behling, S.; Chambwe, N.; Cintrón, D.L.; Cooper, J.D.; Dunster, G. Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proc. Natl. Acad. Sci. USA 2020, 117, 6476–6483. [Google Scholar] [CrossRef] [PubMed]
  77. McMurtrie, B. AI to the Rescue. The Chronicle of Higher Education, 2024. Available online: https://www.chronicle.com/special-projects/the-different-voices-of-student-success/ai-to-the-rescue (accessed on 30 June 2025).
  78. Haak, D.C.; HilleRisLambers, J.; Pitre, E.; Freeman, S. Increased structure and active learning reduce the achievement gap in introductory biology. Science 2011, 332, 1213–1216. [Google Scholar] [CrossRef]
  79. Cooper, M.M.; Corley, L.M.; Underwood, S.M. An investigation of college chemistry students’ understanding of structure–property relationships. J. Res. Sci. Teach. 2013, 50, 699–721. [Google Scholar] [CrossRef]
  80. Colvard, N.; Watson, C.; Park, H. The Impact of Open Educational Resources on Various Student Success Metrics. Int. J. Teach. Learn. High. Educ. 2018, 30, 262–276. [Google Scholar]
  81. Roscoe, R.D.; Salehi, S.; Nixon, N.; Worsley, M.; Piech, C.; Luckin, R. Inclusion and equity as a paradigm shift for artificial intelligence in education. In Artificial Intelligence in STEM Education; CRC Press: Boca Raton, FL, USA, 2022; pp. 359–374. [Google Scholar]
  82. Louie, N.L. The culture of exclusion in mathematics education and its persistence in equity-oriented teaching. J. Res. Math. Educ. 2017, 48, 488–519. [Google Scholar] [CrossRef]
  83. Aguirre, J.; Mayfield-Ingram, K.; Martin, D.B. Impact of Identity in K-12 Mathematics: Rethinking Equity-Based Practices; ERIC: Oxford, MS, USA, 2024. [Google Scholar]
  84. Martin, D.B. Mathematics Success and Failure Among African-American Youth: The Roles of Sociohistorical Context, Community Forces, School Influence, and Individual Agency; Routledge: Abingdon, UK, 2000. [Google Scholar]
  85. Koehler, T.; Sammon, J. Using AI to Support Math Instruction. Edutopia 2023. Available online: https://www.edutopia.org/article/using-ai-math-instruction/ (accessed on 12 January 2025).
  86. Rose, D.H.; Meyer, A.; Hitchcock, C. The Universally Designed Classroom: Accessible Curriculum and Digital Technologies; ERIC: Oxford, MS, USA, 2005. [Google Scholar]
  87. Vyse, G. How Generative AI Is Changing the Classroom [Research Brief]. The Chronicle of Higher Education. 2024. Available online: https://connect.chronicle.com/rs/931-EKA-218/images/GenAI-ResearchBrief.pdf (accessed on 12 January 2025).
  88. Vraga, E.K.; Bode, L. Addressing COVID-19 misinformation on social media preemptively and responsively. Emerg. Infect. Dis. 2021, 27, 396. [Google Scholar] [CrossRef] [PubMed]
  89. National Academies of Sciences, Engineering, and Medicine. Building trust in public health emergency preparedness and response (PHEPR) science. In Proceedings of a Workshop–in Brief, Washington, DC, USA, 14–15 November 2022. [Google Scholar]
  90. World Health Organization. Managing the COVID-19 infodemic: Promoting healthy behaviours and mitigating the harm from misinformation and disinformation. In Joint statement by WHO, UN, UNICEF, UNDP, UNESCO, UNAIDS, ITU, UN Global Pulse, and IFRCI; WHO: Geneva, Switzerland, 2020. [Google Scholar]
  91. Pennycook, G.; McPhetres, J.; Zhang, Y.; Lu, J.G.; Rand, D.G. Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychol. Sci. 2020, 31, 770–780. [Google Scholar] [CrossRef] [PubMed]
  92. Feinstein, N. Salvaging science literacy. Sci. Educ. 2011, 95, 168–185. [Google Scholar]
  93. Allchin, D. From science studies to scientific literacy: A view from the classroom. Sci. Educ. 2014, 23, 1911–1932. [Google Scholar] [CrossRef]
  94. Samuel, Y.; Brennan-Tonetta, M.; Samuel, J.; Kashyap, R.; Kumar, V.; Krishna Kaashyap, S.; Chidipothu, N.; Anand, I.; Jain, P. Cultivation of human centered artificial intelligence: Culturally adaptive thinking in education (CATE) for AI. Front. Artif. Intell. 2023, 6, 1198180. [Google Scholar] [CrossRef]
  95. Head, A.J.; Fister, B.; MacMillan, M. Information literacy in the age of algorithms. Available online: https://files.eric.ed.gov/fulltext/ED605109.pdf (accessed on 12 January 2025).
  96. Association of College and Research Libraries. Framework for Information Literacy for Higher Education. 2016. Available online: https://www.ala.org/acrl/standards/ilframework (accessed on 12 January 2025).
  97. Chen, C.; Shu, K. Combating Misinformation in the Age of LLMs: Opportunities and Challenges. AI Mag. 2024, 45, 354–368. [Google Scholar] [CrossRef]
  98. Balaji, S.; Jayachandran, S.; Prabagaran, S.R. Evidence for the natural occurrence of Wolbachia in Aedes aegypti mosquitoes. FEMS Microbiol. Lett. 2019, 366, fnz055. [Google Scholar] [CrossRef]
  99. Plotts, C.; Gonzalez, L. Creating a Culture Around AI: Thoughts and Decision-Making. Educ. Rev. 2024. Available online: https://er.educause.edu/articles/2024/4/creating-a-culture-around-ai-thoughts-and-decision-making (accessed on 12 January 2025).
  100. Shirky, C. Is AI Enhancing Education or Replacing It? The Chronicle of Higher Education, 29 April 2025. [Google Scholar]
  101. Porter, A.M.; Chu, R.Y.; Ivie, R. Attrition and Persistence in Undergraduate Physics Programs. 2024. Available online: https://www.aip.org/statistics/attrition-and-persistence-in-undergraduate-physics-programs (accessed on 12 January 2025).
  102. Schweingruber, H.A.; Nielsen, N.R.; Singer, S.R. Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering; National Academies Press: Washington, DC, USA, 2012. [Google Scholar]
  103. Hazari, Z.; Tai, R.H.; Sadler, P.M. Gender differences in introductory university physics performance: The influence of high school physics preparation and affective factors. Sci. Educ. 2007, 91, 847–876. [Google Scholar] [CrossRef]
  104. Bangera, G.; Brownell, S.E. Course-based undergraduate research experiences can make scientific research more inclusive. CBE Life Sci. Educ. 2014, 13, 602–606. [Google Scholar] [CrossRef]
  105. Becker, B.A. What does saying that ‘programming is hard’ really say, and about whom? Commun. ACM 2021, 64, 27–29. [Google Scholar] [CrossRef]
  106. Estrada, M.; Hernandez, P.R.; Schultz, P.W. A longitudinal study of how quality mentorship and research experience integrate underrepresented minorities into STEM careers. CBE Life Sci. Educ. 2018, 17, ar9. [Google Scholar] [CrossRef] [PubMed]
  107. Birhane, A. Algorithmic injustice: A relational ethics approach. Patterns 2021, 2, 100205. [Google Scholar] [CrossRef] [PubMed]
  108. Swaak, T. AI Will Shake Up Higher Ed. Are Colleges Ready? The Chronicle of Higher Education, 26 February 2024. [Google Scholar]
  109. Michel-Villarreal, R.; Vilalta-Perdomo, E.; Salinas-Navarro, D.E.; Thierry-Aguilera, R.; Gerardou, F.S. Challenges and Opportunities of Generative AI for Higher Education as Explained by ChatGPT. Educ. Sci. 2023, 13, 856. [Google Scholar] [CrossRef]
  110. Ro, H.K.; Aguilar-Smith, S.; Anderson, S.Y.; Rodriguez, T.; Ramon, E.J.; Javier, D. Attending to STEM education in servingness at Hispanic-serving institutions: A systematic review of more than a decade of scholarship. Int. J. STEM Educ. 2024, 11, 33. [Google Scholar] [CrossRef]
  111. Chen, Y.; Granco, G.; Hou, Y.; Macias, H.; Gomez, F.A. AI for Social Good Education at Hispanic Serving Institutions. In Proceedings of the AAAI Symposium Series, Vancouver, BC, Canada, 25–27 March 2024; p. 473. [Google Scholar]
  112. Lythreatis, S.; Singh, S.K.; El-Kassar, A.-N. The digital divide: A review and future research agenda. Technol. Forecast. Soc. Change 2022, 175, 121359. [Google Scholar] [CrossRef]
  113. Cook, E.; Hollebeke, N. Culturally-Responsive Teaching and Humanizing the Student Experience in the Age of AI. Every Learner Everywhere Blog 2023. Available online: https://www.everylearnereverywhere.org/blog/culturally-responsive-teaching-and-humanizing-the-student-experience-in-the-age-of-ai/ (accessed on 12 January 2025).
  114. Lozano, G.; Franco, M.; Subbian, V. Transforming STEM education in Hispanic serving institutions in the United States: A consensus report. SSRN Electron. J. 2018. [Google Scholar] [CrossRef]
  115. Watkins, M. Your Students Need an AI-Aware Professor. The Chronicle of Higher Education, 2025. [Google Scholar]
  116. Schei, O.M.; Møgelvang, A.; Ludvigsen, K. Perceptions and use of AI chatbots among students in higher education: A scoping review of empirical studies. Educ. Sci. 2024, 14, 922. [Google Scholar] [CrossRef]
  117. Lowe, M. The More Things Change: The Ethical Impacts of AI in Higher Education. Res. Issues Contemp. Educ. 2024, 9. [Google Scholar]
  118. Soliman, M.; Ali, R.A.; Khalid, J.; Mahmud, I.; Ali, W.B. Modelling continuous intention to use generative artificial intelligence as an educational tool among university students: Findings from PLS-SEM and ANN. J. Comput. Educ. 2024. [Google Scholar] [CrossRef]
  119. Gretzinger, E.; Hicks, M.; Dutton, C.; Smith, J.; Cutler, S.; Baiocchi, A. Tracking Higher Ed’s Dismantling of DEI. The Chronicle of Higher Education, 2025. [Google Scholar]
  120. Dutton, C.; Smith, J. Clinging to Control. The Chronicle of Higher Education, 12 February 2025. [Google Scholar]
  121. Soliman, M.; Ali, R.A.; Mahmud, I.; Noipom, T. Unlocking AI-Powered Tools Adoption among University Students: A Fuzzy-Set Approach. J. Inf. Commun. Technol. 2025, 24, 1–28. [Google Scholar] [CrossRef]
  122. Belkina, M.; Daniel, S.; Nikolic, S.; Haque, R.; Lyden, S.; Neal, P.; Grundy, S.; Hassan, G.M. Implementing generative AI (GenAI) in higher education: A systematic review of case studies. Comput. Educ. Artif. Intell. 2025, 8, 100407. [Google Scholar] [CrossRef]
  123. García-López, I.M.; Trujillo-Liñán, L. Ethical and regulatory challenges of Generative AI in education: A systematic review. Front. Educ. 2025, 10, 1565938. [Google Scholar] [CrossRef]
Figure 1. (a) Incorrect chemical structure of caffeine generated by ChatGPT (OpenAI), displaying inaccuracies in bond placement and atomic connectivity. (b) Correct skeletal structure of caffeine, illustrating proper bond alignment and molecular geometry.
Figure 2. ChatGPT’s inaccurate portrayal of a high-performance liquid chromatography (HPLC) instrument. The diagram fails to correctly label and represent essential components, underscoring current limitations in AI-generated educational content.
Table 1. Opportunities and risks of GenAI integration in STEM classrooms, as identified across faculty-led case studies at the ULV.
Opportunities | Risks/Limitations
Simplifies complex content and scaffolds technical language for multilingual and first-generation students | Risk of factual inaccuracies or “hallucinations” in AI-generated outputs
Enhances access to research, writing, and problem-solving for underrepresented students | Potential over-reliance by students, leading to reduced independent reasoning
Supports culturally responsive pedagogy by generating locally relevant and contextualized examples | Algorithmic bias and lack of representation in training data can reinforce inequities
Expands instructional capacity (e.g., multilingual translation, rapid feedback, adaptive prompts) | Institutional guardrails or premature policy restrictions may constrain innovation
Table 2. Actionable takeaways for ethical and equitable GenAI integration in STEM at the ULV.
Challenge | GenAI-Enabled Strategy | Observed or Intended Outcome
Limited access to supplemental resources (e.g., textbooks, office hours, language support) | AI-generated quizzes, flashcards, summaries, and multilingual explanations (see the illustrative sketch after this table) | Reduced cost barriers; improved comprehension and engagement, especially for multilingual and first-generation students
High attrition in gateway courses due to conceptual overload | Rewriting lab protocols and lecture materials with AI for clarity and accessibility | Increased student confidence and retention in challenging STEM courses
Student disengagement in abstract STEM content | Contextualized, culturally relevant AI-generated problems (e.g., water conservation, local biodiversity) | Enhanced relevance, student belonging, and application of knowledge to real-world issues
Delayed or exclusive access to undergraduate research experiences | AI-supported entry into coding, experimental design, and literature review | Expanded early research participation; inclusive access to computational research
Misconceptions and overreliance on AI as a source of truth | Assignments critiquing AI-generated errors or summaries | Strengthened critical thinking, scientific reasoning, and AI literacy
Lack of guidance around ethical use of GenAI tools | Peer-led modules, transparent attribution discussions, and structured reflection tasks | Cultivation of responsible use practices, academic integrity, and ethical awareness
Faculty time constraints in adapting instruction | AI-assisted generation of quizzes, rubrics, and visual explanations | Increased instructional agility; time saved for mentorship and individualized support
Fragmented cross-departmental collaboration | Shared values and practices across biology, chemistry, physics, and math through GenAI integration | Strengthened interdisciplinary pedagogical innovation and student-centered reform
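To make the first strategy in Table 2 concrete, the short sketch below illustrates one way an instructor might script multilingual quiz generation with a large language model. It is a minimal, hypothetical example rather than the workflow used at the ULV: the helper name generate_quiz, the prompt wording, and the gpt-4o model choice are assumptions, and the script presumes the OpenAI Python client is installed with an API key available in the environment.
```python
# Illustrative sketch only; not the authors' documented workflow.
# Assumes the OpenAI Python client (pip install openai) and an
# OPENAI_API_KEY environment variable. Model name and prompt wording
# are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def generate_quiz(topic: str, language: str = "Spanish", n_questions: int = 5) -> str:
    """Draft short-answer quiz questions on a STEM topic, each with a translation."""
    prompt = (
        f"Write {n_questions} short-answer quiz questions for an introductory "
        f"undergraduate course on the topic: {topic}. After each question, give "
        f"a faithful {language} translation and a one-sentence answer key."
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; substitute whichever model is available
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Instructors review every generated item for accuracy before classroom use.
    print(generate_quiz("stoichiometry and limiting reagents"))
```
Consistent with the review’s emphasis on critical AI literacy, output from a script like this would still require instructor vetting for factual accuracy and translation quality before it reaches students.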
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
