Article

Student Involvement in Digital Tool Selection: A Pedagogical Approach to Critical Thinking-Oriented Learning

Department of Science Teaching, Hemdat College of Education, P.O. Box 412, Netivot 80200, Israel
Educ. Sci. 2026, 16(4), 512; https://doi.org/10.3390/educsci16040512
Submission received: 5 February 2026 / Revised: 19 March 2026 / Accepted: 22 March 2026 / Published: 25 March 2026

Abstract

Digital technologies are widely recognized for their potential to support active learning and foster higher-order cognitive skills, including critical thinking. However, limited research has examined the extent to which students are directly involved in selecting digital tools that shape their learning. This study investigates teachers’ ability to engage students in the selection and pedagogical use of digital technologies, with attention to practices supporting active, personalized learning and critical thinking. Data were collected from 156 educators across diverse disciplines in five teacher-training colleges in Israel using an online questionnaire assessing levels of digital tool use, from non-use to active student involvement. Item Response Theory (IRT) was applied to model teachers’ proficiency and examine differences across tools and background characteristics. Results indicate substantial variability in teachers’ ability to involve students, with particularly low involvement in tools related to problem-solving, differentiation, and personalized learning. Gender and institutional role were significant predictors, with female educators and those holding additional roles demonstrating higher proficiency. These findings highlight the importance of teachers’ techno-pedagogical competence in enabling student participation in digital decision-making and suggest that involving students in tool selection can support the development of critical thinking and learner agency in digitally mediated learning environments.

1. Introduction

Numerous studies have examined the integration of digital technologies in higher education and their impact on teaching and learning processes. This body of research highlights the role of active learning in conjunction with digital technologies in enhancing learning outcomes and supporting higher-order cognitive objectives, including critical thinking, analysis, evaluation, and reflection (Riswanti Rini et al., 2022; Sailer et al., 2024; Schindler et al., 2017). However, despite these advances, there remains a significant gap in research addressing students’ direct involvement in selecting and utilizing digital technologies tailored to their learning needs. Even educators who employ digital tools to promote interaction and active learning rarely involve students in decisions regarding the selection of these tools (Aflalo et al., 2024).
Recent developments in Artificial Intelligence (AI)-driven educational technologies further accentuate the importance of understanding how teachers guide students in selecting appropriate digital tools. AI-based systems can generate personalized learning pathways and support learner autonomy, yet their pedagogical value depends not only on technological functionality but also on teachers’ techno-pedagogical competencies and their ability to scaffold students’ critical thinking when engaging with digital tools and AI-generated recommendations (Bond et al., 2024; Tan et al., 2025).
Exposing students to a variety of digital tools and involving them in decisions about their selection and use requires educators to possess advanced techno-pedagogical competencies. These competencies reflect the integration of technological skills with pedagogical knowledge, enabling educators to design learning environments that support active, student-centered learning and foster critical thinking through reflective judgment and informed decision-making (Asad et al., 2021).
The significance of the present study lies in its examination of educators’ ability to involve students in the selection and use of digital technologies, as well as the factors influencing this practice, an area that remains largely underexplored. By focusing on teachers’ facilitation of student participation in digital tool selection, this study contributes to ongoing discussions on how innovative pedagogical practices can intentionally cultivate critical thinking and learner agency within digitally mediated learning environments. Importantly, this study does not directly measure students’ critical thinking or cognitive outcomes. Instead, it examines teachers’ reported practices in involving students in digital tool selection as a pedagogical condition that may support critical thinking. The focus is on instructional design and teacher facilitation rather than on empirically demonstrating cognitive development, addressing structural and pedagogical conditions associated in the literature with higher-order engagement.
The remainder of this article is organized as follows. Section 2 presents the theoretical framework, followed by a description of the methodology and analytical procedures. The subsequent sections report the findings and discuss their implications for participatory digital practices, critical thinking, and future research.

2. Theoretical Background

2.1. The Selection and Integration of Digital Tools by Teachers

Digital technologies encompass a wide range of hardware and software tools that facilitate communication and the access, transmission, and storage of information within digital learning environments. When integrated meaningfully, these technologies can promote active learning, support higher-order cognitive processes, and enhance learning outcomes in higher education (Morris & Rohs, 2021). However, their pedagogical value depends not on availability alone but on how tools are selected, implemented, and embedded within instructional practices.
Despite their widespread availability, the integration of digital technologies in higher education remains inconsistent. Many educators continue to use digital tools primarily for administrative or organizational purposes rather than for pedagogical innovation or cognitively demanding learning activities (Amhag et al., 2019; Pinto & Leite, 2020). Learning management systems, for example, are often valued for efficiency rather than for fostering deep learning or critical inquiry (Martin et al., 2020). Similarly, Mercader and Gairín (2020) and M. Henderson et al. (2017) found that access to technology alone has not led to substantial transformation of teaching and learning practices.
Educators’ adoption of digital tools is shaped by an interplay of technological, pedagogical, and contextual factors. Technological competence, infrastructure, and access to support facilitate integration (Timotheou et al., 2023; Uzorka et al., 2023), while pedagogical orientations, teaching experience, and beliefs about learning outcomes influence how tools are used (Nagy & Dringó-Horváth, 2024; Hyland & Kranzow, 2011). These beliefs are particularly consequential when digital technologies are expected to support higher-order learning, including analysis, evaluation, and reflective judgment.
Barriers to adoption are equally multifaceted. Professional and institutional constraints often limit educators’ capacity to move beyond surface-level use (Mercader & Gairín, 2020). Moreover, attitudes toward technology frequently exert a stronger influence on pedagogical integration than technical skills alone (Madsen et al., 2018). Beliefs about the role of technology in teaching and assessment can either enable or hinder instructional practices that require rethinking traditional teacher-centered control (Romero et al., 2019). Together, these findings highlight that fostering cognitively meaningful uses of digital tools requires both pedagogical readiness and structural support.

2.2. Active Student Engagement, Critical Thinking, and Digital Tool Selection

Digital technologies offer expanded opportunities for choice and flexibility, supporting both active learning and self-directed learning (SDL). These approaches emphasize learner autonomy, engagement, and responsibility for learning processes (Domínguez & García, 2017; Li et al., 2023). Active learning involves students’ deliberate cognitive engagement with tasks and strategies (Morris, 2019), while SDL highlights learners’ capacity to plan, monitor, and evaluate their learning goals and outcomes (Gureckis & Markant, 2012). Recent evidence from online graduate education suggests that when learners are offered structured ‘voice and choice’ over how they engage with course activities, they report sustained engagement, but such autonomy functions pedagogically only when the course design explicitly supports students in navigating options rather than merely providing choice (Henrikson & Baliram, 2023).
Within this framework, involving students in the selection of digital tools represents a shift toward shared responsibility for learning design. Research suggests that student participation in choosing tools aligned with learning goals and preferences supports personalized learning and strengthens SDL by enabling learners to tailor resources and strategies to their needs (Bullock, 2013; Nkomo et al., 2021). Digital tools thus become not only learning supports but also objects of reflection and decision-making within personalized learning environments (Damsa, 2019; García-Martínez et al., 2020; Schmid & Petko, 2019). In participatory dashboard design, student involvement has been operationalized as co-selecting which data and features a tool should display, with selections largely driven by students’ perceived usefulness and desire for actionable guidance, highlighting student judgment as a design determinant rather than a post hoc reaction (Herodotou et al., 2025).
Beyond engagement and autonomy, student involvement in digital tool selection constitutes a cognitively demanding process with significant potential to foster critical thinking. Selecting tools requires learners to articulate learning goals, evaluate affordances and limitations, and justify choices based on pedagogical relevance rather than convenience. These processes align with conceptualizations of critical thinking as purposeful, self-regulatory reasoning involving analysis, evaluation, and decision-making (Facione, 1990; Halpern, 2013). When students compare tools, assess suitability for specific tasks, and reflect on effectiveness, they engage in higher-order cognitive processes that extend beyond functional use toward epistemic reasoning (Morris & Rohs, 2021; Li et al., 2023).
Empirical studies indicate that participatory tool selection can enhance critical evaluation skills, as learners consider functionality, usability, and learning value in relation to goals (Riswanti Rini et al., 2022; Sailer et al., 2024). The process also supports collaboration, as joint discussions and shared decision-making require learners to articulate criteria, negotiate perspectives, and engage in collective problem-solving (Godsk & Møller, 2024). Furthermore, enabling students to advocate for tools that address diverse learning and accessibility needs supports inclusive pedagogical practices (Nkomo et al., 2021).
AI-enhanced learning environments further expand these opportunities by offering adaptive recommendations, automated feedback, and data-driven insights that can inform students’ decisions about digital tools (Katiyar et al., 2024). However, the development of critical thinking in such contexts is not automatic. It depends on pedagogical scaffolding that supports learners in articulating evaluation criteria, questioning algorithmic recommendations, and reflecting on the learning implications of their choices (Bond et al., 2024; Sailer et al., 2024). Without such scaffolding, students may rely on habitual or surface-level preferences, limiting the cognitive value of choice.
Although empirical research directly examining students’ active role in digital tool selection remains limited, existing evidence suggests that such involvement can positively influence learning. By framing tool selection as a reflective and evaluative activity, educators can leverage digital technologies not only to personalize learning but also to cultivate critical thinking, deeper engagement, and more meaningful learning outcomes. In the present study, digital tool selection is conceptualized as a reasoned evaluative practice that involves considering alternatives in relation to pedagogical goals, criteria, and contextual constraints. This differs from digital tool use, which refers to the procedural implementation of a chosen technology. Framing selection as an evaluative process highlights its alignment with reflective and critical thinking, beyond preference-based or convenience-driven choice.

2.3. Research Questions

The purpose of this study was to examine teachers’ ability to involve their students in selecting digital tools for various purposes as well as factors related to the demographic and personal characteristics of the study participants. The specific research questions were:
A. What is the overall distribution of student involvement in the selection and use of digital technologies, and how does it vary across different types of tools?
B. Which digital technology activities most distinctly differentiate teachers regarding their individual ability to involve their students in the selection and use of digital technologies?
C. What is the relationship between teachers’ ability to involve their students in selecting and using digital technologies and teachers’ background variables (gender, age, field of expertise, education, employment status, academic rank, tenure, and additional academic role)?

3. Methodology

3.1. The Research Population

The study sample included 156 educators representing various academic disciplines from five teacher-training colleges in Israel. The study was conducted in institutions offering undergraduate and graduate programs for preservice teachers. Students are adult learners enrolled in structured curricula aligned with national teacher education standards. The colleges encompassed a range of demographic and cultural contexts: two were sectarian and served only Jewish students, two were secular and included both Jewish and Arab students, and one exclusively served Arab students. Faculty members typically retain pedagogical autonomy regarding instructional design and digital tool integration within these institutional frameworks. The majority of participants had significant teaching experience, with over six years in the profession, and were responsible for teaching at least five courses annually. Table 1 presents a detailed breakdown of the participants’ demographic and professional characteristics, including their age, gender, education, academic rank, years of teaching experience, and whether they hold additional academic roles, such as department chair, dean, or program director.

3.2. The Research Tool

An 18-item questionnaire, specifically developed for this study, was administered online to allow educators to self-evaluate their familiarity with and use of digital technologies in various professional contexts. The design of the questionnaire was informed by the Self-reflection on Effective Learning by Fostering the use of Innovative Educational Technologies (SELFIE) framework, created by the European Commission’s Joint Research Center (Kampylis et al., 2015).
While the original SELFIE tool comprises 34 items aimed at helping educators reflect on their digital competence, the current study adapted 18 items to focus on teaching practices relevant to its specific aims. These adaptations, validated in a prior study (Aflalo et al., 2024), demonstrated high reliability, with Cronbach’s alpha values exceeding 0.8. The questionnaire addressed topics such as creating and modifying digital resources, leveraging digital technologies for feedback and reflection, and fostering collaboration among learners. Each item included five response options, ranging from non-use to active student engagement in selecting and employing digital technologies, with particular emphasis on the latter. In this study, student involvement refers to pedagogical practices in which learners are invited to participate in decisions regarding the selection of digital tools for learning purposes. It does not refer to merely informing students about tool choices or allowing unguided selection without instructional framing. The measure captures whether such participatory practice is reported, without differentiating between levels of shared authority (e.g., consultation versus fully shared decision-making). In addition to assessing digital practices, the questionnaire collected demographic and professional background data (see Appendix A); the questionnaire is identical to that reported in Aflalo et al. (2024). It was distributed via email to approximately 500 educators, with 156 completing the survey, yielding a response rate of around 30%.
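The reliability figure cited above follows the standard Cronbach’s alpha formula. A minimal sketch of its computation is given below; the data layout (one score list per item, one entry per respondent) is an assumption for illustration, not the study’s data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    one entry per respondent). Uses the sample-variance (n-1) convention."""
    k = len(items)
    n_resp = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    # Total score per respondent across all items.
    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))
```

For instance, two perfectly parallel items yield an alpha of 1.0, while weakly related items pull the coefficient toward zero; values above 0.8, as reported here, indicate high internal consistency.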
Ethical approval: Human ethics approval was given by Hemdat College of Education, Israel. Informed consent was obtained from the participants included in the study. Participation was voluntary.

3.3. Data Analysis

The dataset used in the present study was collected as part of a broader research project previously published (Aflalo et al., 2024); however, the current manuscript constitutes a secondary analysis with distinct research questions, conceptual focus, and statistical modeling procedures.
This study employed the Item Response Theory (IRT) framework to develop a proficiency score representing teachers’ engagement of students in selecting and utilizing digital technologies across multiple instructional scenarios. Of the 18 questionnaire items, 17 were included in the IRT analysis. The first item was excluded because it did not contain a response category reflecting student involvement in digital tool selection and therefore did not align with the operational definition of the construct.
The primary objective was to assess the extent to which teachers involve their students in the selection of digital resources. The original response scale ranged from one (no use) to five (use and involve learners in selecting the technological tool). Because the construct of interest was teachers’ facilitation of student involvement in digital tool-related pedagogical practices, including but not limited to tool selection, the 17 analyzed items were recoded into a binary format: one (use and involve learners) and zero (all other responses). This decision was conceptually driven. Only the highest response category explicitly reflects learners’ participation in selecting the digital tool, whereas the remaining categories represent varying degrees of teacher use without student involvement. As these intermediate categories do not constitute ordered levels of learner participation, modeling them as progressive thresholds would not align with the theoretical structure of the construct. While the highest response category explicitly reflects learner involvement, it may encompass a range of participatory practices (e.g., selection, adaptation, evaluation, or feedback-related uses of digital tools), rather than strict tool co-selection in all cases.
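The recoding described above can be expressed compactly. The sketch below assumes responses are stored as integers 1 to 5; the example values are illustrative, not study data:

```python
# Only the highest response category ("use and involve learners in
# selecting the technological tool") reflects student involvement.
INVOLVE_LEARNERS = 5

def recode_binary(responses):
    """Collapse the original 1-5 scale into 1 (involve learners) vs. 0
    (all other responses), as in the dichotomous IRT analysis."""
    return [1 if r == INVOLVE_LEARNERS else 0 for r in responses]

recode_binary([1, 3, 5, 2, 5])  # -> [0, 0, 1, 0, 1]
```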
Accordingly, a dichotomous IRT model was used to estimate the probability that teachers actively engage students in tool selection. IRT assumes the existence of an underlying latent trait, which in this study represents teachers’ proficiency in fostering student involvement in digital tool selection. Unlike Classical Test Theory (CTT), which assigns equal weight to items, IRT estimates item-specific parameters, including difficulty and discrimination. By accounting for differences in item characteristics, the IRT framework provides a differentiated assessment of how strongly each digital practice distinguishes between teachers with varying levels of proficiency in engaging students in tool selection. The IRT analysis employed a two-parameter logistic (2PL) model with a logit link function, estimated using maximum likelihood in Mplus (Version 8.3).
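For readers less familiar with the 2PL parameterization, the item characteristic function can be written directly. This is a minimal sketch of the model form, not the Mplus estimation procedure itself:

```python
import math

def icc_2pl(theta, a, b):
    """Two-parameter logistic model: probability that a teacher with
    latent proficiency theta endorses the 'involve learners' category
    of an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

By construction, when theta equals the item’s difficulty b, the endorsement probability is exactly 0.5; for example, with the Item 12 estimates reported in Table 2 (a = 2.16, b = 1.97), `icc_2pl(1.97, 2.16, 1.97)` returns 0.5.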

4. Results

Table 2 provides a descriptive overview of student involvement across items, alongside the IRT-based difficulty and discrimination estimates that form the basis for the subsequent analysis.
The highest proportion of reported student involvement (36%) was observed for Item 16, which pertains to tools that enhance communication between learners. Teachers also involved their students relatively frequently in selecting tools that promote collaboration among learners (27%). Conversely, the lowest level of student involvement was reported for tools designed to respond to differences between learners (Item 12, 6%). Similarly, tools related to addressing special needs or supporting problem-solving processes exhibited very low levels of student involvement (see Table 2, Items 11 and 18).
As noted in the analysis section, the study employed an IRT analysis to estimate teachers’ ability to engage students in digital tool selection, referred to as the teacher proficiency score. The analysis also examined the relationship between teachers’ proficiency and the frequency of using various digital tools. Table 2 provides the mean discrimination and difficulty estimates for each item, offering insights into how teachers differ in their engagement of students in selecting and using digital technologies. The data highlight variability in the extent to which digital tools differentiate between teachers based on their proficiency. Broadly, the digital tools can be categorized into three main groups: tools with both high difficulty and high discrimination, tools with lower difficulty but still high discrimination, and tools with moderate difficulty and discrimination, which do not differentiate as effectively between teachers.
The first group includes tools such as those related to responding to differences between learners (Table 2, Item 12 discrimination a = 2.16; difficulty b = 1.97), promoting learning for students with special needs (Item 11), providing feedback to learners (Item 10), and tools for problem-solving processes (Item 18). These tools were found to have both high difficulty and high discrimination, suggesting that only teachers with advanced techno-pedagogical skills involved their students in selecting these tools.
The second group comprises tools with lower difficulty but still high discrimination, such as those associated with active and creative learning (Item 13) and facilitating communication between learners (Item 16), tools for providing feedback to teachers (Item 5), and tools for modifying and adapting digital resources (Item 3). These tools also differentiate effectively between teachers with varying levels of proficiency, indicating their value in assessing and promoting teacher engagement.
In contrast, the third group includes tools such as those for planning and managing learning (Item 7), creating digital resources (Item 2), and management and information evaluation by learners (Item 15). These tools exhibited moderate levels of difficulty and discrimination, suggesting a more uniform level of engagement among teachers, with less variability in their ability to involve students in selecting these tools.
Figure 1 presents examples of item characteristic curves (ICCs) estimated under a dichotomous IRT model. The x-axis represents the latent proficiency parameter (θ), reflecting teachers’ propensity to engage students in digital tool selection. The y-axis represents the probability of endorsing category 1, defined as reporting “use and involve learners in selecting the technological tool” following the binary recoding (1 = involve learners; 0 = all other response categories). Each ICC is characterized by two parameters: item difficulty (b), indicating the proficiency level at which the probability of endorsing category 1 reaches 0.5, and item discrimination (a), reflecting how sharply the probability changes across levels of θ. Steeper curves indicate higher discrimination, meaning the item more strongly differentiates between teachers with lower and higher levels of reported involvement.
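The role of the discrimination parameter described above can be checked numerically: over the same proficiency interval around the difficulty point, a high-a item changes probability far more sharply than a low-a item. The parameter values below are illustrative, not study estimates:

```python
import math

def icc(theta, a, b):
    """2PL item characteristic curve."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Probability gained over the interval theta = -0.5 .. 0.5 around b = 0:
gain_high_a = icc(0.5, 2.0, 0.0) - icc(-0.5, 2.0, 0.0)  # steep curve
gain_low_a = icc(0.5, 0.5, 0.0) - icc(-0.5, 0.5, 0.0)   # shallow curve
```

Here `gain_high_a` (about 0.46) far exceeds `gain_low_a` (about 0.12), which is precisely what the steeper ICC slopes in Figure 1 depict.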
Overall, the probability of engaging students in the selection of digital tools increases as proficiency (θ) increases, consistent with the monotonic structure of the IRT model. However, distinct patterns emerge across items. For example, Item 13 (active and creative learning tools) displays a relatively steep slope within a narrow proficiency range (approximately 0 to 1.5), indicating high discrimination in that range. In contrast, Item 17 (expression and content creation by learners) shows an increase in probability beginning at lower proficiency levels (around θ = −1), but with a more gradual slope, suggesting a different discrimination pattern. These differences illustrate how specific digital practices vary in their sensitivity to teachers’ overall propensity to involve students in tool selection.
Figure 2 presents the distribution of estimated latent proficiency scores (θ) across participants. Lower θ values reflect a lower likelihood of reporting student involvement in digital tool selection across items. Approximately one-third of the teachers exhibited θ values around −1 or lower, indicating relatively low propensity to endorse student-involvement practices. The remaining distribution approximates a normal curve, reflecting variability in teachers’ reported engagement levels. It is important to note that lower proficiency scores do not imply absolute non-use of digital tools, but rather a lower probability of reporting explicit student involvement in tool selection.
The comparison of outcome proficiency, referring to teachers’ ability to involve their students in the selection and use of digital technologies across various background characteristics, is presented in Table 3. The table provides the means, standard errors, F-test results, and effect sizes (partial eta squared) for each background characteristic. The analysis revealed that female teachers demonstrated significantly higher proficiency levels compared to their male counterparts (F = 8.32, p = 0.004). Similarly, teachers who held an additional role within the institution (e.g., department chair or program director) exhibited higher proficiency levels than those without such roles (F = 7.12, p = 0.009).
Notably, these were the only categorical variables that showed significant differences in proficiency. Comparisons across all other background characteristics did not yield statistically significant differences (see Table 3).

5. Discussion

This study examined teachers’ ability to involve students in the selection and use of digital tools for learning purposes, with particular attention to factors shaping this practice and its implications for active learning and critical thinking. The findings offer a nuanced understanding of the challenges associated with fostering meaningful student engagement in digital tool selection and illuminate how pedagogical, technological, and contextual considerations intersect in this process.
It is important to clarify that while this discussion interprets student involvement in digital tool selection through the lens of critical thinking theory, the present findings do not constitute empirical evidence of enhanced critical thinking among students. Rather, the results illuminate the extent to which teachers create pedagogical conditions that may support critical thinking, such as opportunities for evaluation, justification, and informed decision-making. The study therefore speaks to enabling structures and instructional practices rather than to measured cognitive gains. Accordingly, student involvement in tool selection can be framed as an instructional condition that invites ethical and evaluative reasoning, without presuming that critical thinking gains have been directly measured (Atenas et al., 2023).

5.1. Involving Students in Digital Tool Selection

The findings revealed substantial variability in the extent to which teachers involve students in selecting digital tools. Tools designed to support communication and collaboration were associated with the highest levels of student involvement, aligning with prior research emphasizing the central role of interaction and teamwork in active learning environments (Morris & Rohs, 2021). Their accessibility, familiarity, and immediate pedagogical relevance may contribute to their widespread adoption, enabling relatively seamless integration into collaborative teaching practices.
In contrast, student involvement was notably limited for tools designed to address special needs and learner diversity. This pattern is consistent with prior research identifying professional and contextual barriers that constrain the adoption of more advanced digital practices (Mercader & Gairín, 2020). Such tools often require sophisticated techno-pedagogical competencies and sustained institutional support (Asad et al., 2021), which may not be uniformly available across educational settings.
Low levels of student participation were also evident in the selection of tools for complex pedagogical purposes, including problem-solving, differentiation, and personalized learning. This finding is particularly noteworthy in the context of generative AI (GenAI), where digital tools offer potential for adaptive scaffolding, personalized feedback, and data-informed instructional adjustments (Tan et al., 2025). Despite this potential, the present results indicate that teachers are less likely to engage students in selecting precisely those tools associated with more complex and individualized instructional purposes. Evidence from participatory tool design suggests that when students are included in selecting complex tool features (e.g., analytics indicators), they tend to prioritize interpretability and actionability, conditions that align with structured critical engagement rather than passive consumption of digital outputs (Herodotou et al., 2025).
From a critical thinking perspective, this pattern may have important pedagogical implications. Problem-solving, differentiation, and personalized learning inherently involve evaluative judgment, strategic decision-making, and reflective reasoning. When teachers retain primary control over tool selection in these domains, opportunities for students to participate in structured evaluative processes regarding digital tools may be reduced. In such cases, digital tools may function primarily as instructional supports rather than as occasions for cultivating evaluative and reflective engagement.
The observed pattern may reflect structural and professional considerations shaping teachers’ decision-making. In high-stakes domains such as differentiation and special needs support, teachers may perceive increased responsibility and therefore retain centralized control over tool selection. Research has consistently identified professional risk, accountability concerns, and contextual barriers as factors influencing the adoption of complex digital practices (Mercader & Gairín, 2020; Romero et al., 2019). Institutional constraints may further limit participatory decision-making; mandated platforms, policy regulations, and digital infrastructure requirements are known to shape technology implementation in higher education (Timotheou et al., 2023; Howard et al., 2021).
In addition, co-selection itself represents a pedagogically demanding practice. Supporting students in evaluating digital tools requires explicit criteria, scaffolding of evaluative reasoning, and sufficient techno-pedagogical competence (Asad et al., 2021; Sailer et al., 2024). Under conditions of workload pressure or limited professional development, teachers may prioritize instructional stability and clarity over participatory experimentation. These explanations are offered as theoretically grounded interpretations consistent with the empirical pattern, rather than as tested causal mechanisms.
These structural and professional considerations are further illuminated by the IRT findings. Tools associated with higher cognitive demands exhibited higher difficulty and discrimination parameters, indicating that student involvement in selecting such tools was more characteristic of teachers with higher levels of the latent involvement proficiency. This pattern suggests that participatory digital practices are not solely a function of tool availability, but are related to pedagogical orientation and instructional design choices. Teachers who engage students in selecting more complex digital tools may frame technology choice itself as a structured learning task, positioning evaluation and justification as part of the instructional process rather than as purely technical decisions.
Tools intended to promote active and creative learning were similarly underutilized, despite evidence that such approaches can enhance autonomy and engagement (Schmid & Petko, 2019). This underuse may signal the need for targeted professional development that supports teachers in integrating these tools in ways that explicitly scaffold student agency and evaluative participation.
The growing prevalence of AI-enhanced learning environments further highlights the importance of pedagogical judgment. Although AI systems can recommend resources, generate adaptive pathways, and provide real-time assistance, their educational value may be shaped by teachers’ capacity to scaffold students’ critical and responsible engagement with these tools (Bond et al., 2024; Mulaudzi & Hamilton, 2025). Without structured guidance, students may gravitate toward familiar or surface-level functionalities, potentially limiting deeper evaluative engagement. In this context, techno-pedagogical proficiency appears central to shaping whether AI-supported environments enable meaningful student participation in digital tool selection. The stronger performance observed among educators holding additional institutional roles may reflect broader pedagogical experience and engagement with institutional digital strategy, although this interpretation warrants further investigation.

5.2. Variability in Digital Tool Characteristics

The categorization of digital tools into three groups based on difficulty and discrimination highlights the complex dynamics underlying teacher engagement. Tools characterized by both high difficulty and high discrimination, such as those supporting problem-solving and differentiated instruction, strongly differentiated between teachers with varying levels of proficiency. These tools may serve as diagnostic indicators of advanced techno-pedagogical competence, but their complexity can also discourage adoption among less experienced educators, underscoring the importance of targeted professional development (Sailer et al., 2024; Timotheou et al., 2023).
The difficulty function further points to technical, contextual, and knowledge-related barriers that hinder adoption, echoing Mercader and Gairín’s (2020) findings. Consistent with previous research (Aflalo et al., 2024), familiarity and experience with digital tools emerged as key factors influencing their pedagogical use. Tools perceived as particularly challenging may therefore expose inequities in access to training and support, highlighting areas where professional development resources should be strategically allocated.
Conversely, tools with lower difficulty but high discrimination, such as those supporting communication and feedback, offer more accessible entry points for involving students in digital decision-making. These tools can function as pedagogical scaffolds, enabling teachers to progressively build confidence and competence in engaging students with more complex technologies (Wood et al., 1976). Tools with moderate difficulty and discrimination, including those used for planning and managing learning, represent a baseline level of engagement. While practically valuable, they contribute less to differentiating teacher proficiency and offer limited insight into higher-order pedagogical practices.
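The three tool groups described above can be illustrated with the two-parameter logistic (2PL) model commonly used in IRT, in which an item's difficulty (b) locates the proficiency level at which endorsement becomes likely and its discrimination (a) governs how sharply the item separates teachers around that level. The sketch below is illustrative only: the parameter values are hypothetical and chosen to mirror the three groups, not the estimates reported in this study.

```python
import math

def icc_2pl(theta, a, b):
    """2PL item characteristic curve: probability that a teacher at latent
    proficiency theta reports the highest category ('use and involve learners')
    for an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical parameters mirroring the three tool groups discussed above.
tools = {
    "problem_solving": {"a": 2.5, "b": 1.2},   # high difficulty, high discrimination
    "communication":   {"a": 2.2, "b": -0.5},  # lower difficulty, high discrimination
    "planning":        {"a": 1.0, "b": 0.3},   # moderate difficulty and discrimination
}

for name, p in tools.items():
    low = icc_2pl(-1.0, **p)   # teacher one SD below average proficiency
    high = icc_2pl(1.0, **p)   # teacher one SD above average proficiency
    print(f"{name}: P(theta=-1)={low:.2f}, P(theta=+1)={high:.2f}")
```

Under these illustrative values, the problem-solving item is rarely endorsed even by above-average teachers (high difficulty), while the communication item yields a large probability gap between low- and high-proficiency teachers (an accessible but strongly discriminating item), and the planning item separates teachers only weakly.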
Overall, this categorization emphasizes the need to align professional development initiatives with the cognitive and pedagogical demands of specific digital tools, addressing barriers to adoption while promoting equitable access to training and support.

5.3. Factors Influencing Teacher Proficiency

The analysis of background variables provides further insight into factors shaping teachers’ ability to involve students in digital tool selection. Gender differences emerged as significant, with female teachers demonstrating higher proficiency levels than their male counterparts. This finding is consistent with prior research suggesting that women are often more proactive in adopting innovative and student-centered pedagogical approaches (Aldahdouh et al., 2020; Venkatesh et al., 2003; Wekerle et al., 2022). Studies have also shown that women are more likely to employ pedagogical strategies that emphasize learner agency and participation, which may explain their greater tendency to involve students in digital decision-making (Korlat et al., 2021; Zhou & Xu, 2007).
Teachers holding additional academic roles, such as department chairs or program directors, also exhibited higher proficiency. These roles are likely to provide increased access to professional development opportunities and exposure to institutional decision-making processes, enhancing teachers’ capacity to integrate digital tools in pedagogically meaningful ways.
In contrast, variables such as age, tenure, academic rank, teaching load, and disciplinary field did not yield significant differences in proficiency. While some prior studies have identified disciplinary differences in digital technology integration (Aldahdouh et al., 2020; Záhorec et al., 2019), the present findings align with research emphasizing the primacy of institutional support and professional learning opportunities over demographic characteristics (Howard et al., 2021; Tondeur et al., 2021). Together, these results suggest that teachers’ capacity to involve students in digital tool selection and to leverage this process to foster critical thinking is shaped less by who teachers are and more by the pedagogical and institutional conditions under which they work.
Taken together, the findings provide a structured response to the study’s research questions. Digital technologies vary considerably in the extent to which teachers report student involvement (RQ A), and IRT parameters indicate that cognitively demanding and differentiation-oriented tools most strongly distinguish between levels of involvement proficiency (RQ B). Regarding background variables (RQ C), only gender and holding an additional academic role were significantly associated with higher proficiency. These results clarify both the variability of practices and the factors shaping teachers’ engagement in participatory digital tool selection.

6. Implications for Active Student Involvement and Critical Thinking

The findings of this study suggest the potential pedagogical value of involving students in the selection of digital tools, particularly in relation to learner autonomy, self-directed learning (SDL), and critical thinking. Active participation in digital decision-making enables students to tailor learning experiences to their individual needs and preferences, thereby strengthening their capacity to plan, monitor, and evaluate their learning processes (Gureckis & Markant, 2012). When students are required to compare tools, articulate criteria, and justify their choices, digital tool selection can function as a cognitively demanding activity aligned with processes commonly associated with critical thinking: analysis, evaluation, and informed decision-making (Riswanti Rini et al., 2022; Sailer et al., 2024).
At the same time, the cognitive benefits of participatory tool selection are contingent on sustained pedagogical support. Educators play a central role in guiding students through the expanding array of digital tools and in scaffolding the evaluative processes necessary for meaningful choice. Without such guidance, students tend to rely on a limited and familiar set of technologies, restricting both the breadth of their digital engagement and the depth of their critical thinking (Margaryan, 2008; C. Henderson et al., 2011). These findings underscore educators’ dual responsibility: actively engaging students in digital tool selection while continuously developing their own techno-pedagogical proficiency (Aldahdouh et al., 2020; Sancar et al., 2021). From a design perspective, involving students meaningfully in tool selection may require staged evaluation activities (needs assessment, usability testing, and usefulness appraisal) rather than a single ‘choose a tool’ moment (de Vreugd et al., 2024).
In line with this discussion, fostering critical thinking through digital tool selection requires making pedagogical reasoning explicit. Teachers must model evaluative criteria and clarify the learning purposes underlying digital choices, thereby enabling students to internalize critical thinking strategies that extend beyond specific tools (Bergdahl et al., 2018).
The rapid expansion of AI in higher education further reinforces these implications. Professional development initiatives should integrate AI-specific competencies that emphasize critical thinking, including understanding algorithmic limitations, critically examining AI-generated recommendations, and guiding students in reflective, AI-supported inquiry. Evidence indicates that many educators feel underprepared to integrate AI meaningfully into teaching, highlighting the need for structured institutional support and targeted training programs (Tan et al., 2025; José & José, 2024).
In conclusion, by identifying gaps in teachers’ proficiency and demonstrating variability in the cognitive demands of digital tools, this study provides a foundation for designing interventions that promote effective and equitable digital integration. It is important to clarify that, in this study, student involvement refers to teachers’ self-reported practice of involving learners in digital tool-related activities, including selection, adaptation, evaluation, and use. While some items directly reflect tool selection, others capture broader forms of participatory engagement in digitally mediated pedagogical processes.
This operationalization captures participatory engagement in tool selection, but does not necessarily imply student-led decision-making, deep co-design processes, or fully shared authority in instructional planning. The observed gender differences suggest the importance of supporting broader adoption, particularly among male educators, of student-centered practices that emphasize shared decision-making, reflection, and critical engagement. Achieving this requires a pedagogical shift toward learning environments that value choice and collaboration, alongside professional development that integrates pedagogical and technological competencies.

7. Research Limitations and Future Research

This study has several limitations that should be considered when interpreting the findings. First, the reliance on educators’ self-reported data may introduce response bias, as reported practices do not necessarily reflect actual classroom behavior. Future studies could strengthen the validity of the findings by incorporating additional methodologies, such as classroom observations, learning analytics, or analyses of student learning outcomes. In particular, future research should directly examine whether and how student involvement in digital tool selection translates into measurable gains in critical thinking, thereby empirically testing the theoretical linkage proposed in the present study.
It should also be noted that the analytical focus on the highest response category (“use and involve learners”) reflects the study’s conceptual emphasis on explicit student participation in tool selection. While this approach aligns with the theoretical definition of the construct, it does not capture intermediate or emerging forms of involvement that may represent partial participatory practices. Future research may therefore benefit from examining graded or developmental forms of involvement alongside the more explicit category analyzed here.
Second, variability in course characteristics, such as differences between research seminars that involve individualized guidance and courses based on more traditional instructional formats, may have influenced teachers’ reported practices. Future research should therefore examine educators’ techno-pedagogical proficiency in relation to course structure, pedagogical goals, and disciplinary contexts. Broader contextual factors, including institutional culture, access to technological resources, and student demographics, also warrant closer examination, as they may shape opportunities for student involvement in digital tool selection.
Third, the study was conducted prior to the widespread integration of AI-based tools in higher education. As AI-driven personalization and automated feedback become increasingly embedded in digital learning environments, the dynamics of tool selection may evolve. Future research should explore how emerging AI-supported practices influence teachers’ pedagogical roles and students’ participation in selecting and using digital technologies, particularly in relation to evaluative judgment and decision-making.
In addition, digital tools that exhibited high discrimination values in the present study provide a promising direction for further investigation. Examining how specific technological features, instructional goals, and classroom conditions interact to shape teacher engagement and student participation could yield deeper insights into the pedagogical mechanisms underlying effective digital integration.
Addressing these limitations will contribute to a more comprehensive understanding of the interplay between pedagogical, technological, and contextual factors, and will support the development of targeted interventions aimed at strengthening teachers’ competencies and enhancing meaningful student engagement with digital tools.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Hemdat College of Education, Israel (protocol code 110, approved 13 January 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The author declares no conflicts of interest.

Appendix A

  • Questionnaire on teacher digital practices
  • Part one: background questions
A. Gender:
  • Male
  • Female
B. Age:
  • 1. 20–30
  • 2. 31–40
  • 3. 41–50
  • 4. 51–60
  • 5. 61–70
  • 6. 70 and above
C. In my house there is:
  • At least one computer and a connection to the Internet
  • There is a computer but no Internet connection
  • There is no computer and no connection to the Internet
  • I only have a smartphone
  • Other ______________
D. The name of the college where I teach: _________
E. Education:
  • 1. Bachelor’s degree
  • 2. Master’s degree
  • 3. PhD
  • 4. Other ______
F. Academic rank:
  • 1. Lecturer without rank
  • 2. Lecturer
  • 3. Senior lecturer
  • 4. Professor
G. The number of courses you are teaching this year:
  • 1. 1–2
  • 2. 3–4
  • 3. 5–6
  • 4. 7 and more
  • 5. Other
H. Teaching area:
  • 1. English
  • 2. Art, music
  • 3. Early childhood education
  • 4. Special education, Psychology
  • 5. Science
  • 6. Mathematics/Physics
  • 7. Computer science, learning technologies
  • 8. Jewish Studies
  • 9. Literature, language
  • 10. Physical education
  • 11. General education
  • 12. Other ____________
I. Additional role if any:
  • 1. Board member
  • 2. Department head
  • 3. Dean
  • 4. Rector
  • 5. Unit head
  • 6. Program leader
  • 7. None
  • 8. Other __________
  • Part two: teaching and learning skills with digital technologies
  • Below are various skills related to teaching and learning with digital technologies. For each question, please choose the option that best describes your practice.
  • Question 1: Searching and Sorting Digital Information for Teaching and Learning Needs (e.g., search engines and digital libraries)
  • I don’t engage in digital information retrieval.
  • I possess knowledge of various online search methods.
  • I have attempted to search the Internet for digital information.
  • I actively use various online tools to search for digital resources.
  • I systematically analyze and select digital resources based on specific criteria.
  • Question 2: Creating Digital Resources to Improve and Support Teaching and Learning Goals (e.g., digital text, presentations, video, audio)
  • I do not produce digital resources.
  • I acknowledge the possibility of creating resources digitally.
  • I have experimented with digital tools to create learning resources.
  • I actively produce diverse digital resources based on their unique features.
  • I apply design principles and systematic processes with the learners to create high-quality digital resources.
  • Question 3: Modification and Adaptation of Existing Digital Resources to Improve Teaching and Learning (e.g., editing web images, editing and deleting text)
  • I do not engage in changing digital resources.
  • I recognize the potential for adapting resources to meet teaching and learning needs.
  • I have attempted to modify digital resources to align with my teaching goals.
  • I actively redesign digital resources to suit specific educational requirements.
  • I involve learners in the process of modifying and adapting digital resources.
  • Question 4: Using Digital Technologies to Store and Share Information for Teaching and Learning Improvement (for example, storing and sharing information in the cloud, collaborative boards, interactive presentations)
  • I do not use digital technologies for storing or sharing information.
  • I recognize that digital technologies can support teaching and learning through information storage and sharing.
  • I have experimented with using digital technologies to enhance my teaching.
  • I actively use digital technologies to improve teaching and learning experiences.
  • I involve learners in adapting these digital technologies to improve teaching and learning.
  • Question 5: Using Digital Technologies for Feedback and Reflection on Teaching (for example, chat, online forms such as Google Forms or Microsoft Forms)
  • I do not utilize digital technologies for feedback and reflection on my teaching.
  • I acknowledge that digital technologies can be used for providing and receiving feedback on teaching.
  • I have experimented with using digital technologies for feedback on my teaching.
  • I actively use digital technologies to give and receive feedback in real-time and/or asynchronously.
  • I engage learners in providing feedback through digital technologies, promoting self-evaluation and peer feedback.
  • Question 6: Using Digital Technologies to Encourage Collaboration Between Learners (for example, collaborative files, collaborative presentations)
  • I do not engage in collaborative activities using digital technologies.
  • I acknowledge the potential of digital technologies to enhance collaboration among learners.
  • I have experimented with using digital technologies to support collaborative activities.
  • I actively use digital technologies to facilitate collaborative learning experiences.
  • I empower learners to select digital technologies that best support collaborative learning.
  • Question 7: Using Digital Technologies to Promote Learning Planning and Management (for example, planning and setting goals using a digital diary, providing choices, directing to self-learning spaces)
  • I do not utilize digital technologies for promoting learning planning.
  • I recognize the potential of digital technologies to facilitate learning planning.
  • I have attempted to use digital technologies to support learners in planning and managing their learning.
  • I actively use digital technologies to assist learners in planning and managing their learning.
  • I encourage learners to choose digital technologies for managing and planning their learning.
  • Question 8: Using Digital Technologies to Support Formative and Summative Assessment (for example, online quizzes, assessment and feedback platforms)
  • I do not employ digital technologies for formative or summative assessment.
  • I am aware that digital technologies can effectively support both formative and summative assessment.
  • I have experimented with using digital technologies for formative or summative assessment.
  • I actively use digital technologies for effective formative and summative assessment.
  • I involve learners in constructing and planning evaluation methods and selecting digital technologies for assessment.
  • Question 9: Using Digital Technologies to Analyze Learning Processes and Collect Learning Outcomes (e.g., online surveys, spreadsheets)
  • I do not use digital technologies to analyze learning processes and collect learning outcomes.
  • I am aware of digital technologies that can reflect learning processes and outcomes.
  • I have attempted to use digital technologies to analyze individual or group learning activities.
  • I actively use digital technologies to collect learning outcomes and analyze learning activities.
  • I involve learners in the analysis of their learning data to plan further learning.
  • Question 10: Providing Feedback to Learners Using Digital Technologies (for example, online forms such as Google Forms, video-based feedback, online surveys on platforms such as Moodle, Google Sites, Teams)
  • I do not provide feedback to learners using digital technologies.
  • I recognize that digital technologies can be effective tools for providing feedback.
  • I have experimented with using digital technologies for providing feedback to learners.
  • I actively use digital technologies, including automatic feedback, for providing feedback to learners.
  • I involve learners in the use and choice of digital technologies for feedback.
  • Question 11: Ensuring Access to Digital Resources for Special Needs (for example, ensuring accessibility to files that can be scanned by software such as Word, access to infrastructure, adapted technologies such as screen readers)
  • I do not incorporate digital technologies in this context.
  • I am aware of digital technologies supporting learners with special needs.
  • I have attempted to use adaptable digital technologies for learners with special needs.
  • I actively use digital technologies tailored for learners with special needs.
  • I involve learners in selecting and utilizing digital technologies adapted to special needs.
  • Question 12: Using Digital Technologies to Respond to Differences Between Learners (for example, online tools for defining personal learning goals such as a personal online form, digital tasks of different levels of difficulty)
  • I do not employ digital technologies to personalize learning paths.
  • I recognize the potential of digital technologies for personalized learning.
  • I have experimented with digital technologies allowing personalized learning.
  • I actively use various digital technologies to address individual learning needs.
  • I collaborate with learners in creating personalized learning plans using digital technologies.
  • Question 13: Using Digital Technology to Promote Creative Learning (for example, digital games and escape rooms, virtual worlds, digital portfolio on platforms such as Pinterest)
  • I do not use digital technologies to foster creative learning.
  • I acknowledge the role of digital technologies in promoting creative learning.
  • I have tried using digital technologies to stimulate creative learning.
  • I actively use various digital technologies to encourage creative learning.
  • I involve learners in selecting digital technologies that promote creative learning.
  • Question 14: Using Resources and Digital Tools for Distance and Hybrid Learning (for example, online meetings using Zoom or Teams, virtual laboratories)
  • I do not use digital technologies in this context.
  • I recognize that digital technologies can facilitate various learning modalities.
  • I have experimented with digital technologies for distance and hybrid learning.
  • I actively use diverse digital technologies facilitating distance and integrated learning.
  • I involve learners in choosing suitable tools for distance and blended learning.
  • Question 15: Using Digital Technologies for Searching, Managing Data, and Evaluating Information (e.g., searching for digital information, evaluating information, comparing sources, reading graphs).
  • I do not use digital technologies in this context.
  • I am aware of digital technologies improving information literacy.
  • I have attempted to use digital technologies for activities involving information and data.
  • I actively implement activities requiring learners to search, evaluate, and manage information.
  • I involve learners in selecting technological tools for information searching, managing, and critical evaluation.
  • Question 16: Digital Technologies for Communication Between Learners (e.g., email, WhatsApp, forum)
  • I do not use digital technologies in this context.
  • I recognize digital technologies enhancing communication between learners.
  • I have tried using digital technologies to encourage learner communication.
  • I actively implement activities requiring learners to communicate digitally based on learning needs.
  • I involve learners in selecting digital technologies for communication.
  • Question 17: Digital Technologies as a Means of Expression and Content Creation by the Learners (such as text, presentations, audio, videos, visualizations, digital portfolio)
  • I do not use digital technologies in this context.
  • I am aware of digital technologies enabling learners to produce content and express ideas.
  • I have attempted to use digital technologies to encourage learners in content creation.
  • I actively use digital technologies allowing learners to produce content and express ideas.
  • I involve learners in the use and selection of digital technologies for encouraging digital expression.
  • Question 18: Digital Technologies to Promote Learning to Solve Problems by the learners (such as text, presentations, audio, videos, visualizations, digital portfolio)
  • I do not use digital technologies in this context.
  • I am aware of digital technologies encouraging learners to understand and solve problems.
  • I have attempted to use digital technologies for activities promoting problem solving.
  • I actively use digital technologies enabling learners to apply problem-solving processes.
  • I involve learners in adapting digital technologies for the benefit of problem-solving processes.

References

  1. Aflalo, E., Vaknin, M., & Harband, Y. (2024). Using digital technologies to support active and self-directed learning. Computing in Higher Education, 38, 105–129. [Google Scholar] [CrossRef]
  2. Aldahdouh, T. Z., Nokelainen, P., & Korhonen, V. (2020). Technology and social media usage in higher education: The influence of individual innovativeness. Sage Open, 10(1), 2158244019899441. [Google Scholar] [CrossRef]
  3. Amhag, L., Hellström, L., & Stigmar, M. (2019). Teacher educators’ use of digital tools and needs for digital competence in higher education. Journal of Digital Learning in Teacher Education, 35(4), 203–220. [Google Scholar] [CrossRef]
  4. Asad, M. M., Aftab, K., Sherwani, F., Churi, P., Moreno-Guerrero, A., & Pourshahian, B. (2021). Techno-pedagogical skills for 21st century digital classrooms: An extensive literature review. Education Research International, 2021, 8160084. [Google Scholar] [CrossRef]
  5. Atenas, J., Havemann, L., & Timmermann, C. (2023). Reframing data ethics in research methods education: A pathway to critical data literacy. International Journal of Educational Technology in Higher Education, 20(1), 11. [Google Scholar] [CrossRef]
  6. Bergdahl, N., Fors, U., Hernwall, P., & Knutsson, O. (2018). The use of learning technologies and student engagement in learning activities. Nordic Journal of Digital Literacy, 13(2), 113–130. [Google Scholar] [CrossRef]
  7. Bond, M., Khosravi, H., De Laat, M., Bergdahl, N., Negrea, V., Oxley, E., & Siemens, G. (2024). A meta systematic review of artificial intelligence in higher education: A call for increased ethics, collaboration, and rigour. International Journal of Educational Technology in Higher Education, 21(1), 4. [Google Scholar] [CrossRef]
  8. Bullock, S. M. (2013). Using digital technologies to support self-directed learning for preservice teacher education. Curriculum Journal, 24(1), 103–120. [Google Scholar] [CrossRef]
  9. Damsa, C. I. (2019). Learning with digital technologies in higher education. Journal of Educational Sciences & Psychology, 9(1), 5–9. [Google Scholar]
  10. de Vreugd, L., van Leeuwen, A., Jansen, R., & van der Schaaf, M. (2024). Learning analytics dashboard design and evaluation to support student self-regulation of study behavior. Journal of Learning Analytics, 11(3), 249–262. [Google Scholar] [CrossRef]
  11. Domínguez, C. Y., & García, C. M. (2017). University students’ self-regulated learning using digital technologies. International Journal of Educational Technology in Higher Education, 14, 38. [Google Scholar] [CrossRef]
  12. Facione, P. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction (The Delphi report). American Philosophical Association. [Google Scholar]
  13. García-Martínez, J., Rosa-Napal, F., Romero-Tabeayo, I., López-Calvo, S., & Fuentes-Abeledo, E. (2020). Digital tools and personal learning environments: An analysis in higher education. Sustainability, 12(19), 8180. [Google Scholar] [CrossRef]
  14. Godsk, M., & Møller, K. L. (2024). Engaging students in higher education with educational technology. Education and Information Technologies, 30, 2941–2976. [Google Scholar] [CrossRef]
  15. Gureckis, T., & Markant, D. (2012). A cognitive and computational perspective on self-directed learning. Perspectives in Psychological Science, 7, 464–481. [Google Scholar] [CrossRef]
  16. Halpern, D. F. (2013). Thought and knowledge: An introduction to critical thinking. Psychology Press. [Google Scholar]
  17. Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. [Google Scholar] [CrossRef]
  18. Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education, 42(8), 1567–1579. [Google Scholar] [CrossRef]
  19. Henrikson, R., & Baliram, N. (2023). Examining voice and choice in online learning. International Journal of Educational Technology in Higher Education, 20(1), 31. [Google Scholar] [CrossRef]
  20. Herodotou, C., Shrestha, S., Comfort, C., Fernandez, M., Andrews, H., Mulholland, P., Bayer, V., Maguire, C., & Lee, J. (2025). A participatory approach to designing a student-facing dashboard for online and distance education. Journal of Learning Analytics, 12(2), 158–174. [Google Scholar] [CrossRef]
  21. Howard, S. K., Tondeur, J., Ma, J., & Yang, J. (2021). What to teach? Strategies for developing digital competency in preservice teacher training. Computers & Education, 165, 104149. [Google Scholar] [CrossRef]
  22. Hyland, N., & Kranzow, J. (2011). Faculty and student views of using digital tools to enhance self-directed learning and critical thinking. International Journal of Self-Directed Learning, 8(2), 11–27. [Google Scholar]
  23. José, J., & José, B. J. (2024). Educators’ academic insights on artificial intelligence: Challenges and opportunities. Electronic Journal of E-Learning, 22(2), 59–77. [Google Scholar] [CrossRef]
  24. Kampylis, P., Punie, Y., & Devine, J. (2015). Promoting effective digital-age learning-A European framework for digitally-competent educational organisations (No. JRC98209). Joint Research Centre (Seville Site). [Google Scholar] [CrossRef]
  25. Katiyar, N., Awasthi, M. V. K., Pratap, R., Mishra, M. K., Shukla, M. N., Singh, R., & Tiwari, M. (2024). AI-driven personalized learning systems: Enhancing educational effectiveness. Educational Administration: Theory and Practice, 30(5), 11514–11524. [Google Scholar] [CrossRef]
  26. Korlat, S., Kollmayer, M., Holzer, J., Lüftenegger, M., Pelikan, E. R., Schober, B., & Spiel, C. (2021). Gender differences in digital learning during COVID-19: Competence beliefs, intrinsic value, learning engagement, and perceived teacher support. Frontiers in Psychology, 12, 637776. [Google Scholar] [CrossRef]
  27. Li, H., Zhu, S., Wu, D., Yang, H. H., & Guo, Q. (2023). Impact of information literacy, self-directed learning skills, and academic emotions on high school students’ online learning engagement: A structural equation modeling analysis. Education and Information Technologies, 28(10), 13485–13504. [Google Scholar] [CrossRef]
  28. Madsen, S. S., Thorvaldsen, S., & Archard, S. (2018). Teacher educators’ perceptions of working with digital technologies. Nordic Journal of Digital Literacy, 13(3), 177–196. [Google Scholar] [CrossRef]
  29. Margaryan, A. (2008). Work-based learning: A blend of pedagogy and technology. VDM Publishing. [Google Scholar]
  30. Martin, F., Polly, D., Coles, S., & Wang, C. (2020). Examining higher education faculty use of current digital technologies: Importance, competence, and motivation. International Journal of Teaching and Learning in Higher Education, 32(1), 73–86. [Google Scholar]
  31. Mercader, C., & Gairín, J. (2020). University teachers’ perception of barriers to the use of digital technologies: The importance of the academic discipline. International Journal of Educational Technology in Higher Education, 17(1), 4. [Google Scholar] [CrossRef]
  32. Morris, T. H. (2019). Self-directed learning: A fundamental competence in a rapidly changing world. International Review of Education, 65(4), 633–653. [Google Scholar] [CrossRef]
  33. Morris, T. H., & Rohs, M. (2021). Digitization bolstering self-directed learning for information literate adults–A systematic review. Computers and Education Open, 2, 100048. [Google Scholar] [CrossRef]
  34. Mulaudzi, L. V., & Hamilton, J. (2025). Lecturer’s perspective on the role of AI in personalized learning: Benefits, challenges, and ethical considerations in higher education. Journal of Academic Ethics, 23, 1571–1591. [Google Scholar] [CrossRef]
  35. Nagy, J. T., & Dringó-Horváth, I. (2024). Factors influencing university teachers’ technological integration. Education Sciences, 14(1), 55. [Google Scholar] [CrossRef]
  36. Nkomo, L. M., Daniel, B. K., & Butson, R. J. (2021). Synthesis of student engagement with digital technologies: A systematic review of the literature. International Journal of Educational Technology in Higher Education, 18, 34. [Google Scholar]
  37. Pinto, M., & Leite, C. (2020). Digital technologies in support of students learning in Higher Education: Literature review. Digital Education Review, 37, 343–360. [Google Scholar] [CrossRef]
  38. Riswanti Rini, R., Mujiyati, M., Ismu, S., & Hasan, H. (2022). The effect of self-directed learning on students’ digital literacy levels in online learning. International Journal of Instruction, 15(3), 229–341. [Google Scholar] [CrossRef]
  39. Romero, R., Plaza, I. R., & Orfali, C. H. (2019). Barriers in teacher perception about the use of technology for evaluation in Higher Education. Digital Education Review, 35, 170–185. [Google Scholar] [CrossRef]
  40. Sailer, M., Maier, R., Berger, S., Kastorff, T., & Stegmann, K. (2024). Learning activities in technology-enhanced learning: A systematic review of meta-analyses and second-order meta-analysis in higher education. Learning and Individual Differences, 112, 102446. [Google Scholar] [CrossRef]
  41. Sancar, R., Atal, D., & Deryakulu, D. (2021). A new framework for teachers’ professional development. Teaching and Teacher Education, 101, 103305. [Google Scholar] [CrossRef]
  42. Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education, 14, 25. [Google Scholar] [CrossRef]
  43. Schmid, R., & Petko, D. (2019). Does the use of educational technology in personalized learning environments correlate with self-reported digital skills and beliefs of secondary-school students? Computers & Education, 136, 75–86. [Google Scholar] [CrossRef]
  44. Tan, X., Cheng, G., & Ling, M. H. (2025). Artificial intelligence in teaching and teacher professional development: A systematic review. Computers and Education: Artificial Intelligence, 8, 100355. [Google Scholar] [CrossRef]
  45. Timotheou, S., Miliou, O., Dimitriadis, Y., Sobrino, S. V., Giannoutsou, N., Cachia, R., Monés, A. M., & Ioannou, A. (2023). Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: A literature review. Education and Information Technologies, 28(6), 6695–6726. [Google Scholar] [CrossRef] [PubMed]
  46. Tondeur, J., Howard, S. K., & Yang, J. (2021). One-size does not fit all: Towards an adaptive model to develop preservice teachers’ digital competencies. Computers in Human Behavior, 116, 106659. [Google Scholar] [CrossRef]
  47. Uzorka, A., Namara, S., & Olaniyan, A. O. (2023). Modern technology adoption and professional development of lecturers. Education and Information Technologies, 28(11), 14693–14719. [Google Scholar] [CrossRef] [PubMed]
  48. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. [Google Scholar] [CrossRef]
  49. Wekerle, C., Daumiller, M., & Kollar, I. (2022). Using digital technology to promote higher education learning: The importance of different learning activities and their relations to learning outcomes. Journal of Research on Technology in Education, 54(1), 1–17. [Google Scholar] [CrossRef]
  50. Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89–100. [Google Scholar] [CrossRef]
  51. Záhorec, J., Nagyová, A., & Hašková, A. (2019). Teachers’ attitudes to incorporation digital means in teaching process in relation to the subjects they teach. International Journal of Engineering Pedagogy, 9(4), 101–120. [Google Scholar] [CrossRef]
  52. Zhou, G., & Xu, J. (2007). Adoption of educational technology: How does gender matter? International Journal of Teaching and Learning in Higher Education, 19(2), 140–153. [Google Scholar]
Figure 1. Teachers’ ability to involve students in choosing digital technologies based on item attributes.
Figure 2. Distribution of individual teachers’ proficiency score in involving students in digital technology selection across subjects.
Table 1. Demographic and professional profile of study participants (N = 156).
Characteristic and category: n (%)
Gender
  Female: 96 (61.7)
  Male: 60 (38.3)
Age (years)
  31–40: 19 (12.2)
  41–50: 44 (28.2)
  51–60: 56 (35.9)
  61–70: 31 (20.0)
  71+: 6 (3.9)
College (orientation and geographical region)
  (a) Jewish religious college, southern region: 48 (30.8)
  (b) Jewish religious college, central region: 41 (26.3)
  (c) Secular college (Jews and Arabs), northern region: 34 (21.8)
  (d) Arab college, northern region: 17 (10.9)
  (e) Secular college (Jews and Arabs), southern region: 16 (10.3)
Education
  MA: 42 (26.9)
  PhD: 114 (73.0)
Teaching discipline
  Education: 61 (43.0)
  Humanities: 43 (30.2)
  Sciences: 38 (26.8)
Seniority in teacher training (years)
  1–3: 11 (7.1)
  4–6: 19 (12.2)
  7–10: 31 (19.9)
  11+: 95 (60.9)
Scope of teaching (courses per year)
  1–2: 26 (16.7)
  3–4: 24 (15.4)
  5–6: 46 (29.5)
  7+: 60 (38.5)
Academic rank
  Without an academic rank: 60 (39.0)
  Lecturer: 54 (35.1)
  Senior lecturer and Professor: 40 (26.1)
Additional academic role
  No: 79 (53.7)
  Yes: 68 (46.3)
Table 2. Difficulty and discrimination estimates for teachers’ engagement in involving students in digital tool selection.
Purpose of Using Digital Technologies (Item No. in the Questionnaire) | Student-Involvement Rate * | Discrimination (SE) | Difficulty (SE)
Communication between learners (16) | 0.36 | 1.70 (0.36) | 0.53 (0.15)
Store and share information with teachers (4) | 0.32 | 2.63 (0.56) | 0.62 (0.13)
Feedback on the teaching (5) | 0.29 | 2.13 (0.46) | 0.77 (0.15)
Expression and content creation by the learners (17) | 0.29 | 1.77 (0.39) | 0.81 (0.17)
Collaboration between learners (6) | 0.27 | 1.69 (0.38) | 0.89 (0.18)
Modification and adaptation of digital resources (3) | 0.23 | 2.73 (0.64) | 0.93 (0.14)
Creating digital resources (2) | 0.18 | 1.22 (0.34) | 1.57 (0.35)
Learning planning and management (7) | 0.17 | 1.53 (0.40) | 1.42 (0.27)
Distance learning and hybrid learning (14) | 0.16 | 2.24 (0.58) | 1.28 (0.20)
Active and creative learning (13) | 0.14 | 3.86 (1.24) | 1.22 (0.15)
Management and information evaluation by learners (15) | 0.13 | 1.66 (0.47) | 1.61 (0.30)
Summative and formative assessment (8) | 0.12 | 2.73 (0.80) | 1.38 (0.20)
Feedback for learners (10) | 0.11 | 2.08 (0.60) | 1.60 (0.27)
Problem-solving processes (18) | 0.08 | 2.26 (0.72) | 1.71 (0.29)
Analysis and collection of learning outcomes (9) | 0.08 | 2.38 (0.78) | 1.73 (0.29)
Access to digital resources for special needs (11) | 0.07 | 2.11 (0.73) | 1.92 (0.37)
Respond to differences between learners (12) | 0.06 | 2.16 (0.78) | 1.97 (0.39)
* Student-involvement rate (category 5). For the IRT analysis, responses were recoded as 1 = “use and involve learners” and 0 = all other categories.
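To make the Table 2 parameters concrete: the table reports a discrimination and a difficulty estimate per dichotomized item, which matches the two-parameter logistic (2PL) IRT parameterization. Assuming that model and a logistic (rather than normal-ogive) metric, the probability that a teacher at proficiency θ reaches the "use and involve learners" category for an item with discrimination a and difficulty b is P(θ) = 1 / (1 + e^(−a(θ − b))). A minimal sketch, using the easiest and hardest items from the table for illustration:

```python
import math

def p_involve(theta, a, b):
    """2PL IRT item response function: probability of the keyed response
    ('use and involve learners') for proficiency theta, given item
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Table 2 estimates:
# "Communication between learners" (item 16): a = 1.70, b = 0.53
# "Respond to differences between learners" (item 12): a = 2.16, b = 1.97
for theta in (-1.0, 0.0, 1.0, 2.0):
    communication = p_involve(theta, 1.70, 0.53)
    differentiation = p_involve(theta, 2.16, 1.97)
    print(f"theta = {theta:+.1f}: communication = {communication:.2f}, "
          f"differentiation = {differentiation:.2f}")
```

At any proficiency level the differentiation item yields a lower probability of student involvement, which is the pattern behind the low involvement rates reported for tools related to differentiation and personalized learning.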
Table 3. Comparative analysis of student involvement in digital technology selection across teachers’ background characteristics.
Values are mean, SE, and n per category; the F test, p value, and ηp2 are reported per characteristic.
Gender: F(1, 151) = 8.32, p = 0.004, ηp2 = 0.052
  Male: M = −0.26 a, SE = 0.11, n = 59
  Female: M = 0.15 b *, SE = 0.09, n = 94
Age (years): F(4, 150) = 1.99, p = 0.099, ηp2 = 0.050
  31–40: M = 0.01, SE = 0.21, n = 18
  41–50: M = 0.01, SE = 0.13, n = 44
  51–60: M = 0.10, SE = 0.12, n = 56
  61–70: M = −0.03, SE = 0.16, n = 31
  71+: M = −0.95, SE = 0.36, n = 6
Seniority in teacher training (years): F(3, 151) = 0.66, p = 0.580, ηp2 = 0.013
  1–3: M = 0.02, SE = 0.27, n = 11
  4–6: M = 0.03, SE = 0.21, n = 18
  7–10: M = −0.20, SE = 0.16, n = 31
  11+: M = 0.06, SE = 0.09, n = 95
Education: F(1, 152) = 3.26, p = 0.073, ηp2 = 0.021
  MA: M = 0.21, SE = 0.14, n = 41
  PhD: M = −0.08, SE = 0.08, n = 113
Scope of teaching (courses per year): F(3, 151) = 1.32, p = 0.269, ηp2 = 0.026
  1–2: M = 0.03, SE = 0.17, n = 26
  3–4: M = −0.17, SE = 0.18, n = 24
  5–6: M = −0.14, SE = 0.13, n = 46
  7+: M = 0.16, SE = 0.11, n = 59
Academic rank: F(2, 150) = 2.61, p = 0.077, ηp2 = 0.034
  Without an academic rank: M = 0.06, SE = 0.12, n = 54
  Lecturer: M = 0.14, SE = 0.11, n = 59
  Senior lecturer and Professor: M = −0.25, SE = 0.14, n = 40
Teaching discipline: F(2, 139) = 1.54, p = 0.218, ηp2 = 0.022
  Education: M = −0.10, SE = 0.11, n = 61
  Humanities: M = −0.15, SE = 0.13, n = 43
  Sciences: M = 0.16, SE = 0.14, n = 38
Additional academic role: F(1, 145) = 7.12, p = 0.009, ηp2 = 0.047
  No: M = −0.18 a, SE = 0.10, n = 79
  Yes: M = 0.20 b, SE = 0.11, n = 68
* a indicates the lowest mean and b the highest mean.
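The effect sizes in Table 3 can be cross-checked from the reported F statistics and degrees of freedom, since for a one-way comparison partial eta squared equals F·df1 / (F·df1 + df2). A quick verification using the two significant effects (values from Table 3):

```python
def partial_eta_squared(f_stat, df_effect, df_error):
    """Recover partial eta squared from an ANOVA F statistic:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

# Gender: F(1, 151) = 8.32; Additional academic role: F(1, 145) = 7.12
print(round(partial_eta_squared(8.32, 1, 151), 3))  # 0.052
print(round(partial_eta_squared(7.12, 1, 145), 3))  # 0.047
```

Both values reproduce the ηp2 column, confirming the internal consistency of the reported statistics.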

Share and Cite

MDPI and ACS Style

Aflalo, E. Student Involvement in Digital Tool Selection: A Pedagogical Approach to Critical Thinking-Oriented Learning. Educ. Sci. 2026, 16, 512. https://doi.org/10.3390/educsci16040512
