From Teachers’ Perspective: Can an Online Digital Competence Certification System Be Successfully Implemented in Schools?
Abstract
1. Introduction
- RQ1: What factors impact teachers’ perspectives on the successful implementation of an online DC certification system in primary and secondary schools?
- RQ2: What are the relationships between those factors?
2. Literature Review
- Indicate a lack of DC in children;
- Describe systems for assessment and certification of DC;
- Describe the acquisition of DC through the curriculum.
- A modest number of studies deal with DC acquisition and certification in primary and secondary schools, and those that do only indicate its importance without providing an implementation solution.
- None of the studies attempted to assess the implementation of systems for DC acquisition.
- There seems to be a need to begin DC education and assessment at the earliest age and to integrate them into the formal curriculum.
3. Research Aims
- RO1. To propose a DC certification system success model;
- RO2. To develop and validate a survey instrument that can be used to theorize and empirically test the model;
- RO3. To examine the relationships among the variables and their relative impact on DC certification system success.
4. Research Context
5. Research Model and Methodology
6. Measurement Instrument Development
7. Sample and Procedure
8. Results
8.1. Measurement Model Assessment
8.2. Structural Model Testing
8.3. Importance–Performance Map Analysis (IPMA)
9. Discussion
10. Limitations
11. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Appendix B
System Quality
- SQ1: The CRISS platform is easy to use.
- SQ2: The CRISS platform is available whenever I need to use it.
- SQ3: The CRISS platform runs fast.
- SQ4: The CRISS platform has all the features necessary to accomplish my tasks (e.g., create and share content, work with others, import data from other tools).

Information Quality
- IQ1: The information I find on the CRISS platform is useful for performing my activities.
- IQ3: I can easily find the information I need on the CRISS platform.
- IQ4: The information I find on the CRISS platform is complete.
- IQ5: The information I find on the CRISS platform is accurate.
- IQ6: The presentation of the assessment results is easy to understand.
- IQ7: The information about the students’ progress is clear.

Service Quality
- SVQ1: The helpdesk is available to help me use the CRISS platform.
- SVQ2: The helpdesk responds promptly when I have a problem with the CRISS platform.
- SVQ3: The helpdesk provides a useful response when I have a problem with the CRISS platform.
- SVQ4: Other forms of online help (e.g., chat, social networks) are available for using the CRISS platform.

System Use
- SU5: I use the CRISS platform to collaborate with my colleagues (e.g., creation of CAS, assessments).
- SU6: I use the CRISS platform to communicate with my students and give them feedback.
- SU7: I use the CRISS platform to tag my content (e.g., CAS).
- SU8: I use the CRISS platform to track the progress and achievements of my students.
- SU9: I use the CRISS platform to provide additional CAS or activities to students based on their assessment results.
- SU10: I use the CRISS platform to integrate content or activities from external tools (e.g., YouTube, Facebook, Flickr, Google Drive).

User Satisfaction
- US1: I feel comfortable using the CRISS platform.
- US2: I find the CRISS platform useful for the additional assessment of my students.
- US3: I think it is worthwhile to use the CRISS platform.
- US4: I feel confident using the CRISS platform.
- US5: I am satisfied with the possibilities the CRISS platform offers.

Net Impacts
- NI3: The CRISS platform helps me to improve the engagement of my students.
- NI4: The CRISS platform enables me to provide clear evaluation criteria to my students.
- NI5: I am able to provide better feedback to my students through the CRISS platform.
- NI6: I am able to provide timely feedback to my students.
- NI7: The CRISS platform extends my capacity for assessment.
- NI8: The CRISS platform saves me time by supporting my teaching activities (planning, guiding students, assigning tasks, monitoring students’ activities, etc.).
- NI9: The CRISS platform allows me to track the progress of my students much better than I could without it.
- NI10: I am able to detect underperforming students more quickly than I would without the CRISS platform.
- NI11: The CRISS platform helps me to make more suitable decisions to enable students’ progress.
- NI12: The CRISS platform enables me to propose tasks that allow students to be creative (ingenious, original) in solving them.
- NI13: The CRISS platform enables me to track my students’ reasoning when solving tasks.
Sample demographics (N = 298):

| Demographic Detail | Category | Frequency | Percent |
|---|---|---|---|
| Gender | Male | 92 | 30.9% |
| | Female | 206 | 69.1% |
| Workplace | Primary school | 65 | 21.8% |
| | Secondary school | 233 | 78.2% |
| Age | Under 25 | 0 | 0% |
| | 25–29 | 20 | 6.7% |
| | 30–39 | 86 | 28.9% |
| | 40–49 | 117 | 39.3% |
| | 50–59 | 71 | 23.8% |
| | Over 60 | 4 | 1.3% |
| Country | Croatia | 58 | 19.5% |
| | Greece | 56 | 18.8% |
| | Italy | 57 | 19.1% |
| | Romania | 25 | 8.4% |
| | Spain | 92 | 30.9% |
| | Sweden | 10 | 3.3% |
| Teaching experience | Less than one year | 5 | 1.7% |
| | 1–2 years | 9 | 3.0% |
| | 3–5 years | 39 | 13.1% |
| | 6–10 years | 53 | 17.8% |
| | 11–15 years | 71 | 23.8% |
| | 16–20 years | 50 | 16.8% |
| | Over 20 years | 71 | 23.8% |
| Education | High School Diploma | 5 | 1.7% |
| | Associate’s Degree | 2 | 0.6% |
| | Bachelor’s Degree | 104 | 34.9% |
| | Master’s Degree | 173 | 58.1% |
| | Doctorate Degree | 14 | 4.7% |
| Computer skill | Fundamental Skills | 1 | 0.3% |
| | Basic Computing and Applications | 46 | 15.4% |
| | Intermediate Computing and Applications | 134 | 45.0% |
| | Advanced Computing and Applications | 66 | 22.2% |
| | Proficient Computing, Applications, and Programming | 51 | 17.1% |
Measurement model results (CR = composite reliability; CA = Cronbach’s alpha):

| Latent Variable | Indicator | Item Mean | SD | Loading | AVE | CR | CA |
|---|---|---|---|---|---|---|---|
| System Quality | SQ1 | 2.81 | 1.10 | 0.83 | 0.63 | 0.87 | 0.81 |
| | SQ2 | 2.94 | 1.07 | 0.78 | | | |
| | SQ3 | 2.72 | 1.02 | 0.79 | | | |
| | SQ4 | 3.31 | 0.98 | 0.77 | | | |
| Information Quality | IQ1 | 3.46 | 0.96 | 0.78 | 0.66 | 0.92 | 0.89 |
| | IQ3 | 3.13 | 1.00 | 0.85 | | | |
| | IQ4 | 3.32 | 0.93 | 0.85 | | | |
| | IQ5 | 3.51 | 0.92 | 0.83 | | | |
| | IQ6 | 3.26 | 1.06 | 0.76 | | | |
| | IQ7 | 3.34 | 1.09 | 0.79 | | | |
| Service Quality | SVQ1 | 3.66 | 0.94 | 0.88 | 0.71 | 0.90 | 0.85 |
| | SVQ2 | 3.51 | 0.97 | 0.90 | | | |
| | SVQ3 | 3.42 | 1.03 | 0.91 | | | |
| | SVQ4 | 3.52 | 0.83 | 0.65 | | | |
| System Use | SU5 | 2.66 | 1.08 | 0.76 | 0.63 | 0.91 | 0.88 |
| | SU6 | 3.02 | 1.20 | 0.81 | | | |
| | SU7 | 2.60 | 1.09 | 0.80 | | | |
| | SU8 | 3.39 | 1.10 | 0.76 | | | |
| | SU9 | 2.63 | 1.16 | 0.82 | | | |
| | SU10 | 3.07 | 1.19 | 0.81 | | | |
| User Satisfaction | US1 | 2.88 | 1.12 | 0.89 | 0.72 | 0.93 | 0.90 |
| | US2 | 3.30 | 1.13 | 0.88 | | | |
| | US3 | 3.25 | 1.13 | 0.88 | | | |
| | US4 | 3.24 | 1.05 | 0.72 | | | |
| | US5 | 3.19 | 1.06 | 0.85 | | | |
| Net Impacts | NI3 | 3.38 | 1.06 | 0.84 | 0.63 | 0.95 | 0.94 |
| | NI4 | 3.34 | 1.04 | 0.79 | | | |
| | NI5 | 3.28 | 1.01 | 0.84 | | | |
| | NI6 | 3.50 | 0.97 | 0.74 | | | |
| | NI7 | 3.44 | 1.02 | 0.82 | | | |
| | NI8 | 2.78 | 1.15 | 0.82 | | | |
| | NI9 | 3.12 | 1.01 | 0.83 | | | |
| | NI10 | 2.90 | 1.04 | 0.77 | | | |
| | NI11 | 3.11 | 0.98 | 0.85 | | | |
| | NI12 | 3.57 | 0.96 | 0.74 | | | |
| | NI13 | 3.08 | 0.99 | 0.70 | | | |
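The AVE and CR columns can be reproduced directly from the standardized loadings. The study itself used SmartPLS 3; the minimal Python sketch below is ours, added only to make the two formulas concrete:

```python
def ave(loadings):
    """Average variance extracted: mean of the squared standardized loadings."""
    return sum(l**2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each error variance is 1 - loading^2."""
    s = sum(loadings)
    errors = sum(1 - l**2 for l in loadings)
    return s**2 / (s**2 + errors)

# System Quality loadings taken from the table above
sq = [0.83, 0.78, 0.79, 0.77]
print(round(ave(sq), 2))                    # 0.63, matching the AVE column
print(round(composite_reliability(sq), 2))  # 0.87, matching the CR column
```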
Discriminant validity: heterotrait–monotrait (HTMT) ratios:

| | Information Quality | Net Impacts | Service Quality | System Quality | System Use | User Satisfaction |
|---|---|---|---|---|---|---|
| Information Quality | | | | | | |
| Net Impacts | 0.83 | | | | | |
| Service Quality | 0.53 | 0.53 | | | | |
| System Quality | 0.89 | 0.78 | 0.51 | | | |
| System Use | 0.71 | 0.74 | 0.31 | 0.71 | | |
| User Satisfaction | 0.89 | 0.90 | 0.53 | 0.87 | 0.77 | |
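Each HTMT ratio is computed from item-level correlations: the mean correlation between items of two different constructs, divided by the geometric mean of the average within-construct correlations. A sketch under stated assumptions (the matrix `corr` and the index lists are hypothetical inputs, not data from the paper):

```python
import numpy as np

def htmt(corr, idx_a, idx_b):
    """HTMT ratio for two constructs, given an item correlation matrix
    and the row/column indices of each construct's items."""
    # mean correlation between items of the two different constructs
    hetero = corr[np.ix_(idx_a, idx_b)].mean()
    # mean within-construct correlation (off-diagonal entries only)
    def mono(idx):
        sub = corr[np.ix_(idx, idx)]
        n = len(idx)
        return (sub.sum() - n) / (n * (n - 1))
    return hetero / np.sqrt(mono(idx_a) * mono(idx_b))

# Hypothetical 4-item correlation matrix: items 0-1 form construct A, 2-3 form B.
corr = np.array([
    [1.00, 0.60, 0.45, 0.40],
    [0.60, 1.00, 0.42, 0.38],
    [0.45, 0.42, 1.00, 0.55],
    [0.40, 0.38, 0.55, 1.00],
])
print(round(htmt(corr, [0, 1], [2, 3]), 2))  # 0.4125 / sqrt(0.60 * 0.55) ≈ 0.72
```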
Coefficients of determination (R²) and predictive relevance (Q²):

| Construct | R² | Interpretation | Q² |
|---|---|---|---|
| Net Impacts | 0.82 | Substantial | 0.42 |
| System Use | 0.55 | Moderate | 0.26 |
| User Satisfaction | 0.86 | Substantial | 0.50 |

All Q² values are greater than zero, indicating the path model’s predictive relevance for each of these constructs.
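Q² comes from the blindfolding procedure, which repeatedly omits data points and predicts them back. A minimal sketch of the final ratio only; the omission-distance machinery is left out, and the variable names are ours:

```python
def q_squared(sse, sso):
    """Stone-Geisser Q² from blindfolding: 1 - (sum of squared prediction
    errors / sum of squared observations). Q² > 0 means the model has
    predictive relevance for the construct's indicators."""
    return 1 - sse / sso
```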
Measurement model fit indices:

| Fit Index | System Quality | Information Quality | Service Quality | System Use | User Satisfaction | Net Impacts |
|---|---|---|---|---|---|---|
| χ²(df) | 1.54 (1) | 6.48 (4) | 0.06 (1) | 7.35 (4) | 3.76 (4) | 47.51 (31) |
| RMSEA | 0.042 | 0.046 | 0.000 | 0.053 | 0.000 | 0.042 |
| GFI | 0.997 | 0.993 | 1.000 | 0.992 | 0.995 | 0.972 |
| AGFI | 0.974 | 0.962 | 0.999 | 0.958 | 0.981 | 0.940 |
| CFI | 0.999 | 0.998 | 1.000 | 0.997 | 1.000 | 0.993 |
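As a reading aid, the sketch below checks one column of the fit table against cutoffs commonly used in the SEM literature. The thresholds are our assumption for illustration (conventions vary), not values the authors state:

```python
# Commonly used cutoffs: RMSEA <= 0.08 acceptable (<= 0.05 often read as good),
# GFI and CFI >= 0.95, AGFI >= 0.90. These are assumptions, not the paper's.
CUTOFFS = {
    "RMSEA": lambda v: v <= 0.08,
    "GFI":   lambda v: v >= 0.95,
    "AGFI":  lambda v: v >= 0.90,
    "CFI":   lambda v: v >= 0.95,
}

net_impacts = {"RMSEA": 0.042, "GFI": 0.972, "AGFI": 0.940, "CFI": 0.993}
for index, value in net_impacts.items():
    print(index, "ok" if CUTOFFS[index](value) else "below cutoff")
```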
Structural model results:

| Hypothesis | Path Coefficient | t Value | f² | Conclusion |
|---|---|---|---|---|
| H1: System Quality → System Use | 0.34 ** | 4.19 | 0.08 | Supported |
| H2: System Quality → User Satisfaction | 0.26 ** | 4.85 | 0.10 | Supported |
| H3: Information Quality → System Use | 0.39 ** | 4.78 | 0.10 | Supported |
| H4: Information Quality → User Satisfaction | 0.40 ** | 6.81 | 0.21 | Supported |
| H5: Service Quality → System Use | −0.05 | 0.98 | 0.00 | Rejected |
| H6: Service Quality → User Satisfaction | 0.10 * | 2.57 | 0.03 | Supported |
| H7: System Use → Net Impacts | 0.19 ** | 4.23 | 0.06 | Supported |
| H8: System Use → User Satisfaction | 0.26 ** | 5.93 | 0.15 | Supported |
| H9: User Satisfaction → Net Impacts | 0.70 ** | 18.39 | 0.90 | Supported |

Note: the significance markers follow the usual convention (** p < 0.01, * p < 0.05), consistent with the reported t values.
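The f² column reports effect sizes, i.e., how much R² of the endogenous construct drops when a predictor is removed. A short sketch of the standard formula; the R²-excluded value below is our assumption for illustration, not a number reported by the authors:

```python
def f_squared(r2_included, r2_excluded):
    """Cohen's f² effect size of one predictor in a PLS-SEM path model:
    (R² with the predictor - R² without it) / (1 - R² with it)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

# Illustrative only: with R²(Net Impacts) = 0.82 from the table above and an
# assumed R² of 0.66 when User Satisfaction is excluded, f² lands near the
# reported 0.90 for H9.
print(round(f_squared(0.82, 0.66), 2))  # ≈ 0.89
```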
IPMA results: importance (total effects) and performance of the predecessor constructs:

| Predecessor Construct | Importance (Total Effects) | Performance (0–100) |
|---|---|---|
| Information Quality | 0.43 | 58 |
| Service Quality | 0.07 | 63 |
| System Quality | 0.30 | 49 |
| System Use | 0.34 | 47 |
| User Satisfaction | 0.61 | 54 |
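Performance in an IPMA is the construct score rescaled to 0–100. A sketch assuming the 1–5 Likert scale implied by the item means above; using unweighted item means is an approximation of the weighted latent variable scores that SmartPLS computes:

```python
def performance(lv_scores, scale_min=1, scale_max=5):
    """IPMA performance: scores linearly rescaled to 0-100, then averaged.
    Assumes items were measured on a 1-5 Likert scale."""
    rescaled = [(x - scale_min) / (scale_max - scale_min) * 100 for x in lv_scores]
    return sum(rescaled) / len(rescaled)

# User Satisfaction item means from the measurement model table
us_item_means = [2.88, 3.30, 3.25, 3.24, 3.19]
print(round(performance(us_item_means)))  # 54, matching the Performance column
```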
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).