Article

The Relationships of Family, Perceived Digital Competence and Attitude, and Learning Agility in Sustainable Student Engagement in Higher Education

1 Department of Education, Graduate School of Education, Chung-Ang University, Seoul 06974, Korea
2 Department of Education, College of Education, Chung-Ang University, Seoul 06974, Korea
* Author to whom correspondence should be addressed.
Sustainability 2018, 10(12), 4635; https://doi.org/10.3390/su10124635
Submission received: 4 November 2018 / Revised: 2 December 2018 / Accepted: 3 December 2018 / Published: 6 December 2018
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

College students are often assumed to be digitally fluent “digital natives”, owing to their exposure to digital technologies from an early age, and this digital competence is assumed to prepare them for learning in college. However, it has been observed that current college students may or may not apply digital technologies effectively during their college education. The purpose of this study is to examine the impact of college students’ prior digital experiences, particularly their families’ influence, on their in-college digital competence and attitude and, by extension, on student engagement. A total of 381 university students were surveyed. Data were obtained from a self-administered online survey and analyzed using partial least squares, which was also used to evaluate the research model. According to the findings, students’ positive prior digital experience significantly influences their perceived digital competence and their attitude toward digital technologies. In addition, the effects of college students’ perceived digital competence and attitudes on engagement are mediated by their learning agility, which is the ability to continuously learn and the willingness to apply acquired knowledge. This article may thus act as a springboard for further empirical research, as well as for examining the nature of students’ prior positive experiences and learning agility in digital competencies.

1. Introduction

In recent decades, college students have increasingly needed to be competent in a digitally integrated life. The use of digital technologies in both personal and professional spheres has raised interest in students’ digital competence at university [1,2]. Moreover, technological advancement has brought opportunities for integrating technology into academic work, as well as future life. Students’ digital competence at the college level has therefore gained attention, and is regarded as an important skill for student engagement [2,3]. Student engagement is defined as a multidimensional phenomenon encompassing students’ cognitive, behavioral, and affective components in coursework, including students’ multifaceted in-class and out-of-class experiences, which are considered good predictors of learning and career development [4,5,6].
As digitally literate natives [7,8,9], today’s college students have grown up with digital technologies central to everyday functioning in school, at home, and in the community [10,11]. With these technologies, students interact with others, absorb information from multiple sources, engage in content creation, and share information and views [12]. In college, students take courses using educational systems (e.g., e-learning platforms, word processing software, and mobile learning management systems) supported by their academic institutions [13]. Contrary to educators’ expectations and assumptions, college students access and use a variety of technologies; however, during their time in college, they often do not use digital technologies effectively in their academic work, or do not use them at all [14,15]. As digital natives, students are assumed to have adequate digital literacy, rather than having the level of digital literacy they actually need assessed, remediated, and developed [16]. For example, recent studies have shown that undergraduate students need more training in digital technologies to integrate them into their educational experiences [17,18]. In fact, many institutions of higher education have not fully embraced digital literacy as a foundational literacy for their students [2,16].
The pervasiveness and advancement of digital technologies in higher education and students’ adoption of these technologies can influence student engagement in academic work [19], and student engagement tends to be considered a good predictor of learning and personal development [4]. Significantly, learning with technologies in college is associated with a student’s academic engagement for academic success [19,20] regarding active participation and interaction in class [21], active exploratory learning [22], application of knowledge and academic performance [23], and attitudes and self-efficacy [24].
The current generation of students, assumed by older generations to be highly digitally literate, now joins institutions of higher education, where they are expected to access campus resources and services using their smartphones and laptops. However, college students’ digital experiences indicate a weak relationship between their engagement and their adoption of technology on campus [25]. Indeed, ineffective integration of digital technologies can even cause students to disengage from learning activities. In addition, Guzmán-Simón et al. [2] found that Spanish college students do not incorporate digital technologies and relevant literacies into their academic literacy, in that students showed a wide gap between their digital competence in informal contexts and in formal learning. In the Korean context, surprisingly, college students did not significantly integrate digital technologies into academic work [26]. Moreover, current higher education emphasizes collaborative and active learning in courses, as well as student-faculty interaction [13,27]. When digital technology is adopted in learning activities on the basis of powerful pedagogical approaches, students can experience and enhance their learning in college through that technology [3]. Thus, while the integration of digital technologies in academic life is expanding, students’ disconnected exposure to digital experiences in their academic work reduces the scope of their usage.
Specifically, we investigate the antecedents of students’ digital competence and attitudes in college while they engage in academic work. As discussed above, college students have difficulty integrating digital technologies into their academic work, as well as acquiring digital competence for academic success. Additionally, college students’ prior digital experiences may influence their digital competence and attitudes during college. Thus far, only a few studies have addressed whether the developmental effect of students’ digital experiences before joining college affects their digital competence and attitudes while in college, as well as their academic outcomes and experiences (i.e., student engagement). This study therefore aims to analyze the impact of college students’ prior digital experiences, particularly family influence and personal effort, on their in-college digital competence, attitudes toward digital technologies, and, by extension, student engagement.
The rest of this paper is structured as follows: Section 2 presents the background and hypothesis derivation regarding the research model; Section 3 introduces the methods, which include data collection, sample characteristics, measures, and statistical analysis; Section 4 presents the results of the empirical analysis; Section 5 discusses the main results and concludes the paper; and finally, Section 6 discusses the limitations and recommendations for future research based on the findings.

2. Background and Hypothesis Derivation

2.1. Prior Digital Experience with Family

In this research model, students’ prior personal digital experiences before joining college are examined as antecedents for predicting the effectiveness of their technology integration in college academic work. It has been found that students’ positive perceptions of technology tend to rise based on prior positive experiences with family, personal effort, and teachers [28]. More prior experience of computer use can also result in stronger confidence [29]. Additionally, Beckers and Schmidt [30] found that an enjoyable initial exposure to computers was associated with greater subsequent computer experience, which in turn was associated with lower levels of anxiety and greater affinity for computers. Moreover, families’ encouragement in using digital technologies can be an important factor influencing students’ attitudes toward technology. Furthermore, support from parents who are highly knowledgeable and skilled in using computers can help build student competence [31]. In particular, a study has shown that female college students studying computing grew up with computers in their households and frequently used them with their family members (e.g., parents and siblings), which led them to embrace computing before joining college [28]. Through such experiences, students can acquire intermediate skills in technology integration for their learning activities, as well as a positive attitude toward digital technology as a learning tool, which can lead them to actively learn how to integrate technology into their learning activities. Students with a positive perception of digital technology may thus have positive experiences in successfully adopting technology in learning activities, which is connected to academic achievement. This may, in turn, support future adoption free of anxiety or negative perceptions of digital technology. Therefore, the following hypotheses will be tested:
Hypothesis 1 (H1).
Prior digital experience with family positively influences the attitude toward using digital technologies.
Hypothesis 2 (H2).
Prior digital experience with family positively influences perceived digital competence in college.
Hypothesis 3 (H3).
Prior digital experience with family positively influences personal efforts toward using digital technologies.

2.2. Perceived Digital Competence for Academic Work

As a component of the current digital status of students in college, digital competence in academic work can be defined as technology-related knowledge, skills, and attitudes in using digital technologies to meet educational aims and expectations in college. In addition, college students’ adoption of digital technologies in academic work can be a good predictor of college learning and success [4,19]. The European Parliament and Council [32] defined digital competence in a technologized world as “the confident and critical use of information society technology for work, leisure, and communication. It is underpinned by basic skills in information and communication technology: the use of computers to retrieve, assess, store, produce, present, and exchange information; and to communicate and participate in collaborative networks via the internet”. College students need to be competent in effectively accessing various resources and information to identify and organize course-related digital information, choose appropriate technologies to accomplish academic tasks, effectively collaborate or communicate with others in academic communities, and solve problems using online resources [33,34,35,36,37,38]. Since they engage in cognitive, technical, and social-emotional processes during the use of digital technologies and digital information [36,39], college students’ digital competence is widely emphasized as essential for work and life today. Hence, we anticipate that when college students’ level of perceived digital competence increases, their level of student engagement and learning agility will also increase. Therefore, the following hypotheses will be tested:
Hypothesis 4 (H4).
Perceived digital competence positively influences student engagement.
Hypothesis 5 (H5).
Perceived digital competence positively influences learning agility.

2.3. Attitudes toward Using Digital Technologies

As another component of the current digital status of students in college, attitudes toward using digital technologies have a significant relationship with student engagement. Whether students suffer from digital technology anxiety or perceive technology negatively, a positive attitude toward using digital technology as a productive medium can influence their degree of cognitive spontaneity in technology interactions [40]. Fishbein and Ajzen [41] defined attitude as “a learned predisposition to respond in a consistently favorable or unfavorable manner toward an attitude object” (p. 6). Attitude has also been defined as an acquired internal state that influences the choice of personal action as a learning outcome [42,43]. Additionally, attitude toward technology influences cognition and behavior [44]. People who have positive perceptions of computer technologies tend to regard computers as fun and satisfying tools, or even toys, for online and offline activities [45]. This trait further affects the types of online activities individuals engage in and their academic performance [45]. According to the Technology Acceptance Model, the actual use of a technology system in personal contexts (e.g., learning or marketing) is influenced directly or indirectly by the user’s attitude, behavioral intentions, and perceptions of its usefulness and ease of use. Attitudes toward using technology can thus influence the user’s choice of actions and responses to challenges, and are therefore a critical driver of technology adoption [46,47,48]. Therefore, college students’ positive attitudes toward using digital technologies for academic work will help them achieve academic success through student engagement. In light of this, the following hypotheses will be tested:
Hypothesis 6 (H6).
Attitude toward using digital technologies positively influences student engagement.
Hypothesis 7 (H7).
Attitude toward using digital technologies positively influences learning agility.

2.4. Previous Experiences with Personal Efforts to Learn Digital Technologies

In this study, personal effort to learn digital technologies is defined as maintaining an effort toward learning digital technologies, despite potential distractions and setbacks, before joining college. This is a psychological factor explaining the direction and regulation of one’s energy toward goals or objectives. It is also related to a belief in effort-outcome covariation, which is the belief that success in learning is caused by personal effort, rather than personal ability, luck, or task difficulty [49]. Since motivation and engagement do not always occur together, regulated effort by the learner plays a role in transforming motivation into engagement in the learning process [50]. In this context, previous experience with personal effort to learn digital technologies refers to effort regulation, that is, controlling one’s expenditure of effort [51] in learning digital technologies. In particular, this kind of regulation occurs more easily when students perceive learning tasks as easy to execute, interesting, and enjoyable [50]. Personal effort thus forms a continuum in the development of digital competence between previous experiences gathered in the family context and college digital competence and engagement in academic work. Therefore, the following hypotheses will be tested:
Hypothesis 8 (H8).
Prior digital experience with personal effort positively influences attitude toward using digital technologies.
Hypothesis 9 (H9).
Prior digital experience with personal effort positively influences perceived digital competence in college.

2.5. Learning Agility

As a mediator between the current digital status of students and their engagement, learning agility is an important factor in integrating digital technologies into engagement in academic life. We live in an era of transition from relying on predetermined skills and existing knowledge to learning rapidly in new situations with different experiences and technological tools. The term “learning agility” was coined by Lombardo and Eichinger [52]. As a learning strategy based on experience, it has been defined as the willingness and ability to learn from experience, and subsequently apply acquired knowledge in new situations for successful performance [52]. Learning agility is the ability to continuously learn and the willingness to apply acquired knowledge, and it has three essential components: potential, motivation, and adaptability [53]. Highly agile learners can draw the right lessons from experience and apply those lessons to novel situations [54]. Amato and Molokhia [53] assert that agile learners can quickly and accurately analyze problems, synthesize information, and comprehend complexity. They are curious about new opportunities and challenges, and are flexible in their approaches to problem solving. Thus, agile learners tend to be eager to learn, to test their assumptions, and to identify lessons learned in order to improve their ability to cope with challenges. Moreover, learning agility will be needed to cope with challenges in college or future workplaces. According to De Meuse et al. [54], learning agility is a relatively stable construct that is unrelated to gender, age, and ethnicity. Recently, college students have been expected to be flexible and fast learners amidst a high level of knowledge uncertainty and changes in global circumstances, as prerequisites to engaging in society, because learning agility is known to be a stronger predictor of high performance than intellectual or personal traits [55]. Thus, learning agility, which reflects students’ previous experiences in using and integrating technology into their learning or lives, is expected to mediate the effects of digital attitudes and competence on student engagement. Learning agility enables students to adapt and create new value in ambiguous situations. Therefore, this study hypothesizes:
Hypothesis 10 (H10).
Learning agility positively influences student engagement.

2.6. Student Engagement

Students’ effective use of digital technology improves engagement in learning and encourages a positive attitude toward school [19,20,56]. As a significant predictor of academic success, student engagement is considered a crucial factor in optimizing the student experience, enhancing learning, and linking with high-quality learning outcomes [57]. In this study, student engagement as an outcome is defined as commitment, participation, or effortful involvement in learning [58], including students’ psychological efforts and investments in learning, understanding, or mastering the skills, crafts, or knowledge that schoolwork is intended to promote [59]. Furthermore, student engagement as a multifaceted construct consists of behavioral, emotional, and cognitive engagement [60]. Similar terms for student engagement include academic engagement, school engagement, and learner engagement [61]. They all refer to engaging in learning and academic tasks, such as putting in effort, persisting, concentrating, paying attention, asking questions, and contributing to class discussions [62]; reacting affectively in the classroom, including showing interest, boredom, happiness, sadness, and anxiety [62,63]; and engaging cognitively in learning, with a desire to go beyond the basic course requirements and a preference for challenge [59,63]. In this study, student engagement focuses on multidimensional facets of engaging in courses, such as practicing skills in the classroom, emotional involvement with the class material, participation in class and interaction with instructors and other students, and levels of performance in the class [6]. Thus, students’ learning processes and activities become meaningful to them through their positive energy [64].

2.7. Research Model

The research question that we are trying to answer pertains to students’ prior digital experiences and current digital status in college, both of which influence their student engagement. Following the literature review, we developed a research model that explains the role of students’ prior digital experiences in their current digital competence and engagement in academic courses. The following research question can thus be posited: How do college students’ prior digital experiences influence their digital literacy and attitudes in college, and lead to student engagement?
The current study contributes to the body of research on college students’ digital literacy and its relation to students’ prior experiences, and their positive perceptions of using current information and communication technologies. Additionally, a major concern is the role of learning agility as a mediator between digital competencies and student engagement in academic work. Figure 1 outlines the hypothesized research model:

3. Methods

3.1. Data Collection and Sample Characteristics

The population of interest in this study was college students at four-year institutions. A self-administered online questionnaire was used to test our research model, and the data were collected in the Republic of Korea over one week in November 2017, at the end of the third quarter of the semester. The sample was drawn from a large private university in a metropolitan area of South Korea. An email explaining the survey was sent to the school’s staff members, asking them to invite their currently registered undergraduate students to voluntarily participate in the study. Participants were recruited through a mass e-mail to all 17,665 students at the university. The students were asked to click on a link in the e-mail, which gave them access to the online questionnaire. Respondents were allowed to withdraw their participation at any time while filling out the survey. The data were collected as part of a larger study on the quality of undergraduate education. The survey took around 20 min on average. Participation was voluntary, and participants received a cash coupon (worth approximately USD 10) upon completing the online survey.
A total of 381 valid surveys were completed; respondents included 127 males (33.3%) and 254 females (66.7%) (sampling error of 4.97% at a confidence level of 95%). The participants’ ages ranged from 18 to 27 years, with a mean of 21.4 and a standard deviation of 2.1. In addition, we inquired about the level of ICT application in courses on a scale of 1 (not at all) to 5 (a lot). Students responded as follows: overall experience with information and communication technology in courses (M = 3.97, SD = 1.05), Internet search (M = 4.46, SD = 0.87), Microsoft Excel (M = 2.79, SD = 1.27), computer programming (M = 2.47, SD = 1.49), graphic software (M = 1.68, SD = 1.09), Microsoft PowerPoint (M = 3.94, SD = 1.07), simulation (M = 1.61, SD = 1.03), and 3D design software (M = 2.47, SD = 1.49).
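For reference, the reported sampling error of 4.97% at the 95% confidence level is reproducible from the sample size (n = 381) and the invited population (N = 17,665). A minimal Python sketch, assuming the conservative proportion p = 0.5 and a finite population correction (our reading of how the figure was likely computed; the paper does not state the formula):

```python
import math

# Margin of error for a proportion at 95% confidence (z = 1.96), using the
# conservative p = 0.5 and a finite population correction for N = 17,665.
def margin_of_error(n, population, z=1.96, p=0.5):
    se = math.sqrt(p * (1 - p) / n)                       # standard error of a proportion
    fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
    return z * se * fpc

print(round(margin_of_error(381, 17665) * 100, 2))  # ~4.97 (%)
```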

3.2. Measure

For the constructs of our study, we generated multi-item scales on the basis of established studies. We measured student engagement using the scale of Handelsman et al. [6]. The scale for learning agility was adopted from Gravett and Caldwell [65]. The scales for attitudes toward digital technologies were adopted from the OECD [66]. Prior digital experience with family and personal effort to learn ICT were developed based on Frieze and Quesenberry [28]. The scale of perceived digital competence for college students was adapted from Hong and Kim [33], validated in 2015 through a research project. The scales used in the final survey are presented in Table 1. To measure the variables in the larger study, an online survey was developed. It consisted of five subsections: background, prior digital experience, perceptions of digital technologies for courses, student engagement, and learning agility. All variables were operationalized on a five-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree). To ensure content validity and construct reliability, all items were adapted from established studies.

3.3. Statistical Analysis

This study used the Partial Least Squares–Structural Equation Modeling (PLS-SEM) technique for data analysis. PLS-SEM is a type of structural equation modeling used to statistically analyze and measure latent variables with multiple observed variables [67,68]; it relies on regression-based methods instead of the maximum likelihood estimation used in covariance-based structural equation modeling [69]. PLS is effective for theory development because it imposes fewer assumptions on the data (such as multivariate normality) and accommodates smaller sample sizes and a wider range of measurement scales than covariance-based structural equation modeling [69]. For this study, SmartPLS 3.0 was used to assess the measurement and structural models [70]. SPSS 23.0 was applied to examine descriptive statistics of the data. We used the two-step approach commonly used in SEM [71,72] to evaluate model fit: assessment of the measurement model (outer model), followed by assessment of the structural model (inner model). To assess the model structure, we adopted Chin’s [67] recommended criteria.
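The structural (inner) model evaluated in the second step can be written down explicitly as a binary path matrix, which is how PLS path modeling represents the hypothesized directional relationships. A minimal Python sketch of the model in Figure 1 (construct abbreviations follow the tables below; this is an illustration of the model structure, not the authors’ SmartPLS project file):

```python
import pandas as pd

# Hypothesized inner (structural) model as a binary path matrix:
# rows = dependent constructs, columns = predictors (1 = hypothesized path).
# DE = prior digital experience with family, PE = personal effort to learn ICT,
# AT = attitude toward digital technologies, PD = perceived digital competence,
# LA = learning agility, SE = student engagement.
constructs = ["DE", "PE", "AT", "PD", "LA", "SE"]
paths = pd.DataFrame(0, index=constructs, columns=constructs)
paths.loc["PE", "DE"] = 1                # H3
paths.loc["AT", ["DE", "PE"]] = 1        # H1, H8
paths.loc["PD", ["DE", "PE"]] = 1        # H2, H9
paths.loc["LA", ["AT", "PD"]] = 1        # H7, H5
paths.loc["SE", ["AT", "PD", "LA"]] = 1  # H6, H4, H10
print(paths)
```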
To evaluate model fit, the construct validity of the measurement model was tested by assessing reliability and discriminant validity. The measurement model was evaluated in terms of the factor loadings of each item and internal consistency reliability, including Cronbach’s alpha and composite reliability, as well as convergent and discriminant validity. Convergent validity, which assesses whether each item measures what it is theoretically supposed to measure, is established when T-values are greater than 1.96; in this study, it was assessed using average variance extracted (AVE) and standardized factor loadings. The presence of multicollinearity among constructs was tested with the variance inflation factor. Discriminant validity was examined by checking that the square root of each construct’s AVE was larger than its correlations with the other constructs, suggesting that each measurement item is better explained by its intended construct than by other constructs [67]; in other words, each construct is more strongly associated with its own measures than with other constructs. The AVE and cross-loadings can also be used to assess validity [73]. The square root of AVE should be at least 0.70 (i.e., AVE > 0.50) and greater than the construct’s correlations with the other constructs.
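The composite reliability and AVE criteria described above are simple functions of the standardized indicator loadings, so the reported values can be checked directly. A minimal Python sketch using the standard formulas (illustrative only; the output is checked against the Perceived Digital Competence loadings in Table 2, not recomputed from raw data):

```python
import numpy as np

def composite_reliability(loadings):
    """Composite reliability: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return num / (num + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE: mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Perceived Digital Competence loadings from Table 2.
pd_loadings = [0.79, 0.83, 0.75, 0.78, 0.82]
print(round(composite_reliability(pd_loadings), 2))       # ~0.90, matching Table 2
print(round(average_variance_extracted(pd_loadings), 2))  # ~0.63, matching Table 2
```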
Path significances were determined by running the model through a bootstrap resampling routine to estimate the precision of the PLS estimates and assess their significance levels [70,71]. The number of resamples used for this study was 5000 (no sign changes option). For the research model, we report the significance of each hypothesized path and the variance explained (R2) for each endogenous construct. A T-value greater than 1.96 indicates significance at the 0.05 level, and a T-value greater than 2.58 indicates significance at the 0.01 level.
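The logic of the bootstrap test is straightforward: re-estimate the coefficient on many resampled datasets, take the standard deviation of those estimates as the standard error, and divide the original estimate by it. A toy Python sketch on simulated data for a single regression slope (SmartPLS bootstraps the full PLS path model; this only illustrates the resampling step):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_t(x, y, n_boot=5000):
    """Bootstrap T-statistic for a simple regression slope y ~ x.

    Illustrates the resampling logic only; SmartPLS bootstraps the full
    PLS path model rather than a single bivariate slope.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slope = np.polyfit(x, y, 1)[0]
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)   # resample cases with replacement
        boot[b] = np.polyfit(x[idx], y[idx], 1)[0]
    return slope / boot.std(ddof=1)   # estimate divided by its bootstrap standard error

# |T| > 1.96 -> p < 0.05; |T| > 2.58 -> p < 0.01 (two-tailed).
x = rng.normal(size=381)
y = 0.5 * x + rng.normal(size=381)
print(round(bootstrap_t(x, y), 2))
```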

4. Results

4.1. Overview of the Measurement Model

To examine the measurement model, internal consistency reliability was tested; see Table 2. The Cronbach’s alpha values of all the factors ranged from 0.80 to 0.86, exceeding the minimum critical value of 0.7 suggested by both Chin [67] and Lohmöller [74]. As shown in Table 2, the composite reliability (CR) of the latent variables was between 0.87 and 0.90. These results indicate good internal consistency and a satisfactory level of reliability.
Regarding convergent validity, the variance inflation factor was always below the 5.0 threshold [75], ranging from 1.00 to 1.24. All multi-item constructs met the guideline for average variance extracted of greater than 0.50 [69,71], meaning that 50% or more of the variance of their indicators was accounted for; all latent constructs satisfied this condition. Discriminant validity was examined in Table 4: the square roots of AVE, ranging from 0.78 to 1.00, were greater than the inter-construct correlations. Thus, discriminant validity was confirmed.
Table 2 presents the loadings, and Table 3 the cross-loadings, of the items used in this study. In Table 4, the diagonal entries represent the square root of the AVE for each construct, and all other values are the corresponding correlation coefficients among the constructs; all diagonal values exceed the inter-construct correlations. In Table 3, all items loaded more highly on their own constructs than on other constructs, with values higher than 0.5. All resulting factor loadings were higher than the cutoff score of 0.5, with cross-loadings consistently lower than each item’s loading on its own construct, which satisfies the criteria for convergent and discriminant validity [76].
The discriminant and convergent validities of the multi-item constructs of the models were thus acceptable. All results support the reliability and validity of the measurement model.
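As a sanity check on discriminant validity, the Fornell-Larcker criterion can be verified directly from the values reported in Table 4: each diagonal entry (the square root of AVE) must exceed every correlation involving that construct. A short Python sketch (an illustration added here, not part of the original analysis):

```python
import numpy as np

# Lower triangle of Table 4: square roots of AVE on the diagonal,
# inter-construct correlations below it (order: PD, AT, LA, DE, SE, PE).
names = ["PD", "AT", "LA", "DE", "SE", "PE"]
m = np.array([
    [0.80, 0.00, 0.00, 0.00, 0.00, 0.00],
    [0.60, 0.79, 0.00, 0.00, 0.00, 0.00],
    [0.38, 0.34, 0.79, 0.00, 0.00, 0.00],
    [0.24, 0.22, 0.19, 0.86, 0.00, 0.00],
    [0.26, 0.23, 0.59, 0.13, 0.78, 0.00],
    [0.60, 0.46, 0.28, 0.25, 0.10, 1.00],
])
m = m + np.tril(m, -1).T  # mirror the lower triangle to a full symmetric matrix

# Fornell-Larcker criterion: each sqrt(AVE) must exceed every correlation
# involving that construct.
for i, name in enumerate(names):
    off_diag = np.delete(m[i], i)
    print(name, "ok" if m[i, i] > off_diag.max() else "violated")
```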

4.2. Structural Model and Hypothesis Testing

Following validation of the measurement model, this study evaluated the structural model to confirm the relationships among constructs and to test the hypotheses. To test the structural model, we conducted bootstrapping with 5000 samples in SmartPLS 3.0, and tested H1 to H10 by examining the path coefficients between variables. The results are reported in Figure 2, and the hypothesis-testing results are presented in Table 5. The ten hypotheses were tested using the PLS approach.

4.2.1. Positive Impact of Prior Digital Experience with Family

Prior digital experience with family was found to positively influence attitude toward digital technologies (β = 0.117, p < 0.01), perceived digital competence (β = 0.095, p < 0.05), and prior digital experience with personal effort (β = 0.246, p < 0.001), supporting H1, H2, and H3, respectively (see Table 5). Prior digital experience with personal effort was positively and significantly associated with attitudes toward digital technologies (β = 0.426, p < 0.001) and perceived digital competence (β = 0.574, p < 0.001), supporting H8 and H9, respectively.
The two aspects of prior digital experience, i.e., family and personal effort, were found to be positively related to college students’ current digital competence and attitudes toward digital technologies. Positive experiences and personal effort built before joining college therefore influence students’ digital experiences in college. Further, students’ prior digital experiences in the model positively and indirectly influence student engagement, which is connected to learning outcomes in higher education.

4.2.2. Perceived Digital Competence and Attitude toward Academic Work

Attitudes toward digital technologies (β = 0.177, p < 0.01) and perceived digital competence (β = 0.278, p < 0.001) significantly influenced learning agility, supporting H7 and H5, respectively (see Table 5). Learning agility was found to positively influence student engagement (β = 0.572, p < 0.001), supporting H10. However, the direct relationships of perceived digital competence and attitudes toward digital technologies with student engagement were statistically insignificant. Thus, H4 and H6 were not supported.
College students’ perceived digital competence and attitudes toward using digital technologies significantly and positively influenced student engagement through learning agility. Learning agility plays a significant role in mediating the use of technology in academic work, because technology adoption in academic work requires active and flexible adaptation of the technology’s functions to academic courses and contexts, together with a positive attitude. In this way, students can positively perceive digital technologies as learning tools, adopt them at a significant level, and integrate them into their academic work. Digital literacy and a positive attitude toward integrating technology into academic work can provide strong motivation to sustain and enhance students’ capabilities for engaging in academic courses. In addition, neither perceived digital competence nor attitude showed a significant direct effect on student engagement; that is, learning agility fully mediates the relationships between these digital components (competence and attitude) and student engagement. These results suggest that college students’ digital competence tends to influence how they learn and gain experience through proactive intention and learning agility, rather than through commitment to mastering a specific field of knowledge and skills alone.
PLS-SEM provides no global goodness-of-fit indices for examining the adequacy of models [72]. The structural model was therefore evaluated by examining the structural paths, T-statistics, and variance explained (the R2 value) [77]. The final dependent construct, student engagement, had an adjusted R2 value of 0.341. The other endogenous constructs had the following adjusted R2 values: learning agility (0.163), attitude toward digital technologies (0.215), perceived digital competence (0.362), and prior digital experience with personal effort (0.058).
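For reference, adjusted R2 follows the standard correction for the number of predictors, 1 - (1 - R2)(n - 1)/(n - k - 1). A small Python sketch (the unadjusted R2 for student engagement is not reported in the paper; the value used below is back-calculated purely for illustration):

```python
def adjusted_r2(r2, n, k):
    """Adjusted R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1), with n cases and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Student engagement has k = 3 predictors (AT, PD, LA) and n = 381; an adjusted
# R2 of 0.341 corresponds to an unadjusted R2 of roughly 0.346.
print(round(adjusted_r2(0.346, 381, 3), 3))  # ~0.341
```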

5. Discussion and Conclusions

This study aimed to better understand the influence of students’ past and current experiences with digital technologies on student engagement in learning, and to evaluate a proposed research model. To do this, we explored the antecedents of student engagement by considering students’ prior digital experience with family, as well as their perceived digital competence and attitude toward academic work. A total of 381 students participated in testing the hypotheses in the research model. Based on the results of the analysis, several noteworthy findings emerged, as discussed below.
Students’ prior experience with digital technology, particularly with family, positively predicts the level of digital competence and attitude toward using digital technologies in college. Moreover, when students’ personal effort to learn the necessary digital technologies was added as a mediator, the relationship between prior experience and current digital experience (i.e., competence and attitude) in academic work became stronger in the model. We therefore draw attention to the role of students’ effort in the link between prior and current digital experience. Students’ effort attribution is considered an influence on future performance and a key to persisting longer at tasks and raising the level of performance [78]. Thus, students’ personal effort to learn digital technologies in academic life can connect their past and current digital experience. Today’s college students are considered digitally literate natives who have grown up with digital technologies in their everyday lives. This techno-social phenomenon of the digital era seems to create inflated expectations among educators and a presumption about students’ level of digital competence, without prescriptive intervention in schools. Our findings offer a starting point for supporting college students as digital natives when they integrate digital technologies ineffectively into their academic work, as well as a reason for educators to reconsider current support for students’ digital experiences on campus.
The findings suggest that students need to make an effort to learn and apply acquired knowledge as agile learners in a digitally enriched environment to achieve a meaningful adoption of digital technologies in academic life. The findings are consistent with previous research on the positive relation between digital literacy and student learning outcomes [4,34,38,39]. In particular, the results on digital competence and student engagement imply that digital competence has a positive effect on student engagement, which is linked to important outcomes such as grades, persistence, and college completion [79]. It may not be enough to enhance students’ digital competence or attitude in isolation from academic experiences in order to strengthen the relationship between students’ digital experiences and their engagement. Learning agility has three essential components: potential, motivation, and adaptability. These components closely relate to the sub-components of student engagement in the aspect of technology adoption in college.
In conclusion, college students who live in a digitally enriched environment at home and school are expected to engage in learning and adopt digital technologies effectively. What we know is that students have various prior digital experiences before joining college and develop digital competencies and attitudes in college. What we do not know well enough is how to effectively connect students’ prior digital experiences to engagement in academic work. To appropriately support students, it is necessary to consider a customized approach based on students’ digital experiences, including family influence, personal traits, attitudes, and efforts, which can nurture different levels of adaptability and potential with digital technologies. Digital natives need support in finding appropriate ways to adopt digital technologies for academic success, rather than being left to rely solely on personal effort or on learning only the functions of digital technologies. Therefore, digitally enriched education for college students should ensure that students can find, process, generate, and communicate information by adopting technology, applied in the context of problem solving in academic work. Embodied experiences of integrating digital technologies with academic work will affect students’ future careers and the quality of their lives.

6. Limitations and Recommendations for Future Research

This study has several limitations that need to be discussed. One limitation is that this study used 381 students from various backgrounds, including different majors, genders, and grade levels. These backgrounds could imply differences in past and current digital experiences. Since this study aimed to examine a model for all college students, it did not probe into differences related to these diverse backgrounds. However, future research could examine such differences by making group-specific comparisons in partial least squares. Second, students’ digital competence was measured as perceived competence, which cannot represent the true level of students’ digital competence, including skills and knowledge. Third, our sample was drawn from a single university, which may introduce bias; future research could increase the sample size and include students from different universities to improve generalizability. Fourth, this study focused on one antecedent, prior digital experience with family, to examine its influence on digital competence, attitudes, and previous experiences with personal efforts to learn digital technologies among college students. It is therefore necessary to test the model with additional antecedents in future research to improve model fit and the R-squared values of the outcome variables.

Author Contributions

H.J.K., A.J.H., and H.-D.S. conceived the research idea and designed the research framework. H.J.K. analyzed the data and wrote the draft. All three authors reviewed the final manuscript.

Funding

This research was supported by the Ministry of Education of the Republic of Korea and the National Research Foundation of Korea (NRF-2017S1A3A2066878).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Crook, C. Addressing research at the intersection of academic literacies and new technology. Int. J. Educ. Res. 2005, 43, 509–518. [Google Scholar] [CrossRef]
  2. Guzmán-Simón, F.; García-Jiménez, E.; López-Cobo, I. Undergraduate students’ perspectives on digital competence and academic literacy in a Spanish university. Comput. Hum. Behav. 2017, 74, 196–204. [Google Scholar] [CrossRef]
  3. Kuh, G.D.; Vesper, N. Do computers enhance or detract from student learning? Res. High. Educ. 2001, 42, 87–102. [Google Scholar] [CrossRef]
  4. Carini, R.M.; Kuh, G.D.; Klein, S.P. Student engagement and student learning: Testing the linkages. Res. High. Educ. 2006, 47, 1–32. [Google Scholar] [CrossRef]
  5. Coates, H. Student Engagement in Campus-Based and Online Education: University Connections; Routledge: New York, NY, USA, 2006. [Google Scholar]
  6. Handelsman, M.M.; Briggs, W.L.; Sullivan, N.; Towler, A. A measure of college student course engagement. J. Educ. Res. 2005, 98, 184–192. [Google Scholar] [CrossRef]
  7. Prensky, M. Digital natives, digital immigrants Part 1. Horizon 2001, 9, 1–6. [Google Scholar] [CrossRef]
  8. Jones, C.; Ramanau, R.; Cross, S.; Healing, G. Net generation or Digital Natives: Is there a distinct new generation entering university? Comput. Educ. 2010, 54, 722–732. [Google Scholar] [CrossRef]
  9. Bullen, M.; Morgan, T.; Qayyum, A. Digital learners in higher education: Generation is not the issue. Can. J. Learn. Technol. 2011, 37. [Google Scholar] [CrossRef]
  10. Beetham, H.; Sharpe, R. Rethinking Pedagogy for a Digital Age Designing for 21st Century Learning; Routledge: New York, NY, USA, 2013. [Google Scholar]
  11. Waycott, J.; Bennett, S.; Kennedy, G.; Dalgarno, B.; Gray, K. Digital divides? Student and staff perceptions of information and communication technologies. Comput. Educ. 2010, 54, 1202–1211. [Google Scholar] [CrossRef]
  12. Martin, C.A. From high maintenance to high productivity: What managers need to know about Generation Y. Ind. Commer. Train. 2005, 37, 39–44. [Google Scholar] [CrossRef]
  13. Blasco-Arcas, L.; Buil, I.; Hernández-Ortega, B.; Sese, F.J. Using clickers in class: The role of interactivity, active collaborative learning and engagement in learning performance. Comput. Educ. 2013, 62, 102–110. [Google Scholar] [CrossRef]
  14. Kennedy, G.E.; Judd, T.S.; Churchward, A.; Gray, K.; Krause, K.L. First year students’ experiences with technology: Are they really digital natives? Aust. J. Educ. Technol. 2008, 24, 108–122. [Google Scholar] [CrossRef]
  15. Nasah, A.; DaCosta, B.; Kinsell, C.; Seok, S. The digital literacy debate: An investigation of digital propensity and information and communication technology. Educ. Technol. Res. Dev. 2010, 58, 531–555. [Google Scholar] [CrossRef]
  16. Murray, M.C.; Pérez, J. Unraveling the digital literacy paradox: How higher education fails at the fourth literacy. Issues Inf. Sci. Inf. Technol. 2014, 11, 85–100. [Google Scholar] [CrossRef]
  17. Alvermann, D. Why bother theorizing adolescents’ online literacies for classroom practice and research. J. Adolesc. Adult Lit. 2008, 52, 8–19. [Google Scholar] [CrossRef]
  18. Strømsø, H.I.; Bråten, I. Students’ sourcing while reading and writing from multiple web documents. Nordic J. Digit. Lit. 2014, 9, 92–111. [Google Scholar]
  19. National Survey of Student Engagement. A Fresh Look at Student Engagement: Annual Results 2013. Bloomington, IN, USA, 2013. Available online: http://nsse.indiana.edu/NSSE_2013_Results/pdf/NSSE_2013_Annual_Results.pdf (accessed on 2 December 2018).
  20. Goode, J. The digital identity divide: How technology knowledge impacts college students. New Media Soc. 2010, 12, 497–513. [Google Scholar] [CrossRef]
  21. Fitch, J.L. Student feedback in the college classroom: A technology solution. Educ. Technol. Res. Dev. 2004, 52, 71–77. [Google Scholar] [CrossRef]
  22. Barak, M.; Lipson, A.; Lerman, S. Wireless laptops as means: For promoting active learning in large lecture halls. J. Res. Technol. Educ. 2006, 38, 245–263. [Google Scholar] [CrossRef]
  23. Siegle, D.; Foster, T. Laptop computers and multimedia and presentation software: Their effects on student achievement in anatomy and physiology. J. Res. Technol. Educ. 2001, 34, 29–37. [Google Scholar] [CrossRef]
  24. Yang, S.H. Exploring college students’ attitudes and self-efficacy of mobile learning. Turk. Online J. Educ. Technol. 2012, 11, 148–154. [Google Scholar]
  25. Kuh, G.D.; Hu, S. The relationships between computer and information technology use, student learning, and other college experiences. J. Coll. Stud. Dev. 2001, 42, 217–232. [Google Scholar]
  26. Kim, H.J. Exploring college students’ perceptions and educational experiences of digital literacy. Korean J. Learn.-Cent. Curric. Instr. 2016, 16, 937–958. [Google Scholar]
  27. Kuh, G.D. Assessing what really matters to student learning: Inside the National Survey of Student Engagement. Change 2001, 33, 10–17. [Google Scholar] [CrossRef]
  28. Frieze, C.; Quesenberry, J. Kicking Butt in Computer Science: Women in Computing at Carnegie Mellon University; Dog Ear Publishing: Indianapolis, IN, USA, 2015. [Google Scholar]
  29. Işman, A.; Çelikli, G.E. How does student ability and self-efficacy affect the usage of computer technology? Turk. Online J. Educ. Technol. 2009, 8, 33–38. [Google Scholar]
  30. Beckers, J.J.; Schmidt, H.G. Computer experience and computer anxiety. Comput. Hum. Behav. 2003, 19, 785–797. [Google Scholar] [CrossRef]
  31. Shashaani, L.; Khalili, A. Gender and computers: Similarities and differences in Iranian college students’ attitudes toward computers. Comput. Educ. 2001, 37, 363–375. [Google Scholar] [CrossRef]
  32. European Council. Recommendation of the European Parliament and the Council on Key Competencies for Lifelong Learning. Off. J. Eur. Union 2006, 10–18. Available online: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2006:394:0010:0018:en:PDF (accessed on 2 December 2018).
  33. Hong, A.J.; Kim, H.J. College Students’ Digital Readiness for Academic Engagement (DRAE) Scale: Scale development and validation. Asia-Pac. Educ. Res. 2018, 27, 303–312. [Google Scholar] [CrossRef]
  34. Knutsson, O.; Blåsjö, M.; Hållsten, S.; Karlström, P. Identifying different registers of digital literacy in virtual learning environments. Internet High. Educ. 2012, 15, 237–246. [Google Scholar] [CrossRef]
  35. Mohammadyari, S.; Singh, H. Understanding the effect of e-learning on individual performance: The role of digital literacy. Comput. Educ. 2015, 82, 11–25. [Google Scholar] [CrossRef]
  36. Ng, W. Can we teach digital natives digital literacy? Comput. Educ. 2012, 59, 1065–1078. [Google Scholar] [CrossRef]
  37. Radovanović, D.; Hogan, B.; Lalić, D. Overcoming digital divides in higher education: Digital literacy beyond Facebook. New Media Soc. 2015, 17, 1733–1749. [Google Scholar] [CrossRef]
  38. Soldi, R.; Cavallini, S.; Friedl, J.; Volpe, M.; Zuccaro, C.P. A New Skills Agenda for Europe; European Commission: Brussels, Belgium, 2016. [Google Scholar]
  39. Greene, J.A.; Yu, S.B.; Copeland, D.Z. Measuring critical components of digital literacy and their relationships with learning. Comput. Educ. 2014, 76, 55–69. [Google Scholar] [CrossRef]
  40. Webster, J.; Martocchio, J.J. Turning work into play: Implications for Microcomputer software training. J. Manag. 1993, 19, 127–146. [Google Scholar] [CrossRef]
  41. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention and Behaviour: An Introduction to Theory and Research; Addison-Wesley: Boston, MA, USA, 1975; Available online: http://people.umass.edu/aizen/f&a1975.html (accessed on 2 December 2018).
  42. Driscoll, M.P. Psychology of Learning for Instruction, 3rd ed.; Pearson Education: Boston, MA, USA, 2005. [Google Scholar]
  43. Gagné, R.M.; Driscoll, M.P. Essentials of Learning for Instruction, 2nd ed.; Prentice Hall Inc.: Englewood Cliffs, NJ, USA, 1988. [Google Scholar]
  44. Melone, N.P. A theoretical assessment of the user-satisfaction construct in information systems research. Manag. Sci. 1990, 36, 76–91. [Google Scholar] [CrossRef]
  45. Jia, R. Computer playfulness, Internet dependency and their relationships with online activity types and student academic performance. J. Behav. Addict. 2012, 1, 74–77. [Google Scholar] [CrossRef]
  46. Davis, C.; Tang, C.S.; Chan, S.-F.F.; Noel, B. The development and validation of the international AIDS Questionnaire-Chinese version (IAQ-C). Educ. Psychol. Meas. 1999, 59, 481–491. [Google Scholar] [CrossRef]
  47. Park, S.Y. An analysis of the technology acceptance model in understanding university students’ behavioral intention to use e-learning. Educ. Technol. Soc. 2009, 12, 150–162. [Google Scholar]
  48. Sam, H.K.; Othman, A.E.A.; Nordin, Z.S. Computer self-efficacy, computer anxiety, and attitudes toward the Internet: A study among undergraduates in Unimas. Educ. Technol. Soc. 2005, 8, 205–219. [Google Scholar]
  49. Willoughby, T.; Wood, E. Children’s Learning in a Digital World. Children’s Learning in a Digital World. 2008. Available online: http://ovidsp.ovid.com/ovidweb.cgi?T=JS&PAGE=reference&D=psyc5&NEWS=N&AN=2007-09150-000 (accessed on 2 December 2018).
  50. Kim, C.M.; Park, S.W.; Cozart, J.; Lee, H. From motivation to engagement: The role of effort regulation of virtual high school students in mathematics courses. Educ. Technol. Soc. 2015, 18, 261–272. [Google Scholar]
  51. Halisch, F.; Heckhausen, H. Search for feedback information and effort regulation during task performance. J. Personal. Soc. Psychol. 1977, 35, 724–733. [Google Scholar] [CrossRef]
  52. Lombardo, M.M.; Eichinger, R.W. High potentials as high learners. Hum. Resour. Manag. 2000, 39, 321–329. [Google Scholar] [CrossRef]
  53. Amato, M.A.; Molokhia, D. How to Cultivate Learning Agility; Harvard Business Publishing: Boston, MA, USA, 2016. [Google Scholar]
  54. De Meuse, K.P.; Dai, G.; Hallenbeck, G.S. Learning agility: A construct whose time has come. Consult. Psychol. J. Pract. Res. 2010, 62, 119–130. [Google Scholar] [CrossRef]
  55. Connolly, J. Assessing the Construct Validity of a Measure of Learning Agility. Doctoral Dissertation; Florida International University: Miami, FL, USA, 2001; Unpublished Work. [Google Scholar]
  56. Howard, S.K.; Ma, J.; Yang, J. Student rules: Exploring patterns of students’ computer-efficacy and engagement with digital technologies in learning. Comput. Educ. 2016, 101, 29–42. [Google Scholar] [CrossRef]
  57. Krause, K.; Coates, H. Students’ engagement in first-year university. Assess. Eval. High. Educ. 2008, 33, 493–505. [Google Scholar] [CrossRef]
  58. Henrie, C.R.; Halverson, L.R.; Graham, C.R. Measuring student engagement in technology-mediated learning: A review. Comput. Educ. 2015, 90, 36–53. [Google Scholar] [CrossRef]
  59. Newmann, F.; Wehlage, G.; Lamborn, S. The significance and sources of student engagement. In Student Engagement and Achievement in American Secondary Schools; Newmann, F.M., Ed.; Teachers College Press: New York, NY, USA, 1992; pp. 11–39. [Google Scholar]
  60. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 2004, 74, 59–109. [Google Scholar] [CrossRef]
  61. Christenson, S.L.; Reschly, A.L. Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In Handbook of Research on Student Engagement; Christenson, S.L., Reschly, A.L., Wylie, C., Eds.; Springer: New York, NY, USA, 2012; pp. 3–19. [Google Scholar]
  62. Skinner, E.A.; Belmont, M.J. Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. J. Educ. Psychol. 1993, 85, 571–581. [Google Scholar] [CrossRef]
  63. Connell, J.P.; Wellborn, J.G. Competence, autonomy, and relatedness: A motivational analysis of self-system processes. In Self-Processes and Development; Gunnar, M.R., Sroufe, L.A., Eds.; Lawrence Erlbaum Associates, Inc.: Hillsdale, NJ, USA, 1991; pp. 43–77. [Google Scholar]
  64. Schreiner, L.; Louis, M.C. The engaged learning index: Implications for faculty development. J. Excell. Coll. Teach. 2011, 22, 5–28. [Google Scholar]
  65. Gravett, L.S.; Caldwell, S.A. Learning Agility: The Impact on Recruitment and Retention; Palgrave Macmillan: New York, NY, USA, 2016. [Google Scholar]
  66. OECD. PISA 2006 Technical Report; OECD: Paris, France, 2009. [Google Scholar]
  67. Chin, W. The partial least squares approach to structural equation modeling. Mod. Methods Bus. Res. 1998, 295, 295–336. [Google Scholar]
  68. Wold, H. Partial least squares. In Encyclopedia of Statistical Sciences; Kotz, S., Johnson, N.L., Eds.; Wiley: New York, NY, USA, 1985; Volume 6, pp. 581–591. [Google Scholar]
  69. Hair, J.F.; Hult, G.T.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publications, Inc.: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  70. Ringle, C.M.; Wende, S.; Will, A. SmartPLS 3.0. 2005. Available online: http://www.smartpls.de (accessed on 2 December 2018).
  71. Chin, W.W. How to write up and report PLS analyses. In Handbook of Partial Least Squares: Concepts, Methods and Applications in Marketing and Related Fields; Vinzi, V.E., Chin, W.W., Henseler, J., Wang, H., Eds.; Springer: Berlin, Germany, 2010; pp. 655–690. [Google Scholar]
  72. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The use of partial least squares path modeling in international marketing. Adv. Int. Mark. 2009, 20, 277–320. [Google Scholar] [CrossRef]
  73. Kerlinger, F.N.; Lee, H.B. Foundations of Behavioral Research, 4th ed.; Harcourt College Publishers: New York, NY, USA, 2000. [Google Scholar]
  74. Lohmöller, J.-B. Latent Variable Path Modeling with Partial Least Squares; Physica-Verlag: Heidelberg, Germany, 1989. [Google Scholar]
  75. Wixom, B.H.; Todd, P.A. A theoretical integration of user satisfaction and technology acceptance. Inf. Syst. Res. 2005, 16, 85–102. [Google Scholar] [CrossRef]
  76. Hair, J.F.; Black, B.; Babin, B.; Anderson, R.E.; Tatham, R.L. Multivariate Data Analysis, 7th ed.; Prentice-Hall: Upper Saddle River, NJ, USA, 2010. [Google Scholar]
  77. Tenenhaus, M.; Esposito Vinzi, V.; Chatelin, Y.-M.; Lauro, C. PLS path modeling. Comput. Stat. Data Anal. 2005, 48, 159–205. [Google Scholar] [CrossRef]
  78. Schunk, D.H. Effects of effort attributional feedback on children’s perceived self-efficacy and achievement. J. Educ. Psychol. 1982, 74, 548–556. [Google Scholar] [CrossRef]
  79. Kuh, G.D.; Cruce, T.M.; Shoup, R.; Kinzie, J.; Gonyea, R.M. Unmasking the effects of student engagement on first-year college grades and persistence. J. High. Educ. 2008, 79, 540–563. [Google Scholar] [CrossRef]
Figure 1. Research model.
Figure 2. Results of structural model analysis. * p < 0.05; ** p < 0.01; *** p < 0.001.
Table 1. Survey items.

Attitude toward digital technologies [66]
AT 1. It is very important to me to work with a computer.
AT 2. I use a computer because I am very interested.
AT 3. I can accomplish work faster with computers.
AT 4. I think that computer skills will be important for my future job.

Learning Agility [65]
LA 1. New experiences are learning opportunities for me.
LA 2. I easily retain new information.
LA 3. I’m optimistic that I can learn new information.
LA 4. I enjoy researching new information.
LA 5. I look for ways to use new knowledge.

Prior Digital Experience with Family [28]
DE 1. Parents recommended that I acquire computer-related skills.
DE 2. Parents thought learning technologies was important for my future.
DE 3. Our family shared computers at home.

Student Engagement [6]
SE 1. Finding ways to make the course material relevant to my life.
SE 2. Applying course material to my life.
SE 3. Finding ways to make the course interesting to me.
SE 4. Thinking about the course between class meetings.
SE 5. Really desiring to learn the material.

Personal effort to learn ICT [28]
PE 1. I learned myself digital technologies in need.

Perceived digital competence [33]
PD 1. I can use the fundamental functions of a presentation program for class presentations.
PD 2. I can use the fundamental functions of word processing programs to create and edit documents for class assignments.
PD 3. I can generate keywords to search information for academic work.
PD 4. I can share my files with classmates using online software.
PD 5. I can collaborate with classmates using online software.
Table 2. Average variance extracted, composite reliability, and factor loadings of the constructs.

Perceived Digital Competence (PD): AVE = 0.63, alpha = 0.86, CR = 0.90
PD1: M = 4.40, SD = 0.78, loading = 0.79, t = 25.25
PD2: M = 4.38, SD = 0.77, loading = 0.83, t = 37.67
PD3: M = 3.85, SD = 0.88, loading = 0.75, t = 26.76
PD4: M = 4.02, SD = 0.86, loading = 0.78, t = 27.48
PD5: M = 3.94, SD = 0.92, loading = 0.82, t = 37.62

Attitude toward Digital Technologies (AT): AVE = 0.62, alpha = 0.80, CR = 0.87
AT1: M = 4.23, SD = 0.84, loading = 0.78, t = 28.86
AT2: M = 3.94, SD = 0.93, loading = 0.82, t = 41.04
AT3: M = 3.89, SD = 1.00, loading = 0.79, t = 31.34
AT4: M = 4.13, SD = 0.90, loading = 0.77, t = 22.01

Learning Agility (LA): AVE = 0.62, alpha = 0.85, CR = 0.89
LA1: M = 3.14, SD = 1.02, loading = 0.77, t = 30.09
LA2: M = 3.36, SD = 0.94, loading = 0.82, t = 44.21
LA3: M = 3.26, SD = 1.00, loading = 0.84, t = 45.57
LA4: M = 3.08, SD = 1.00, loading = 0.73, t = 24.67
LA5: M = 3.35, SD = 0.98, loading = 0.78, t = 33.27

Prior Digital Experience with Family (DE): AVE = 0.74, alpha = 0.82, CR = 0.89
DE1: M = 3.07, SD = 1.23, loading = 0.88, t = 35.48
DE2: M = 3.22, SD = 1.18, loading = 0.87, t = 35.40
DE3: M = 2.85, SD = 1.25, loading = 0.82, t = 26.99

Student Engagement (SE): AVE = 0.61, alpha = 0.84, CR = 0.89
SE1: M = 3.30, SD = 1.08, loading = 0.80, t = 34.38
SE2: M = 3.08, SD = 1.05, loading = 0.78, t = 31.46
SE3: M = 3.24, SD = 1.00, loading = 0.79, t = 32.71
SE4: M = 3.14, SD = 1.02, loading = 0.77, t = 27.61
SE5: M = 3.47, SD = 0.99, loading = 0.77, t = 28.53

Personal Effort to Learn ICT (PE): AVE = 1.00, alpha = 1.00, CR = 1.00
PE1: M = 3.97, SD = 0.94, loading = 1.00 (single-item construct; no t-value)
Table 3. Cross-loadings of the items in the measurement model (final model).

Item   PD     AT     LA     DE     SE     PE
PD1    0.786  0.497  0.231  0.189  0.139  0.517
PD2    0.834  0.531  0.254  0.157  0.163  0.572
PD3    0.754  0.407  0.357  0.231  0.252  0.449
PD4    0.783  0.445  0.358  0.181  0.260  0.417
PD5    0.819  0.499  0.326  0.179  0.196  0.415
AT1    0.464  0.779  0.233  0.091  0.178  0.348
AT2    0.488  0.819  0.315  0.171  0.176  0.370
AT3    0.472  0.790  0.255  0.257  0.198  0.367
AT4    0.461  0.765  0.276  0.170  0.170  0.347
LA1    0.320  0.265  0.769  0.115  0.421  0.195
LA2    0.415  0.309  0.824  0.196  0.448  0.306
LA3    0.311  0.251  0.836  0.183  0.490  0.246
LA4    0.258  0.323  0.731  0.103  0.375  0.213
LA5    0.210  0.219  0.782  0.147  0.568  0.158
DE1    0.173  0.210  0.158  0.878  0.097  0.165
DE2    0.202  0.178  0.146  0.874  0.089  0.203
DE3    0.226  0.184  0.183  0.824  0.149  0.255
SE1    0.154  0.152  0.440  0.084  0.801  0.066
SE2    0.132  0.125  0.418  0.100  0.778  0.034
SE3    0.202  0.179  0.472  0.183  0.792  0.097
SE4    0.239  0.211  0.429  0.114  0.766  0.059
SE5    0.255  0.216  0.518  0.043  0.765  0.111
PE1    0.598  0.455  0.284  0.246  0.097  1.000
Note: The loading of each item on its own construct (shown in bold in the original table) appears in that construct’s column.
Table 4. Correlations of all constructs and square root of AVE for discriminant validity.

Latent Dimensions                            PD    AT    LA    DE    SE    PE
Perceived Digital Competence (PD)            0.80
Attitude toward Digital Technologies (AT)    0.60  0.79
Learning Agility (LA)                        0.38  0.34  0.79
Prior Digital Experience with Family (DE)    0.24  0.22  0.19  0.86
Student Engagement (SE)                      0.26  0.23  0.59  0.13  0.78
Personal Effort to Learn ICT (PE)            0.60  0.46  0.28  0.25  0.10  1.00 *
Note: The diagonal entries are the square roots of AVE. * Single-item construct.
Table 5. Hypotheses, path coefficients, and results.

Hypothesis  Path                                      Path Coefficient  T-Statistics  Result
H1          Family influence > Attitude               0.117 **          2.67          Supported
H2          Family influence > Digital competence     0.095 *           2.51          Supported
H3          Family influence > Personal effort        0.246 ***         4.89          Supported
H4          Digital competence > Student engagement   0.025             0.47          Not supported
H5          Digital competence > Learning agility     0.278 ***         4.47          Supported
H6          Attitude > Student engagement             0.017             0.33          Not supported
H7          Attitude > Learning agility               0.177 **          3.11          Supported
H8          Personal effort > Attitude                0.426 ***         8.02          Supported
H9          Personal effort > Digital competence      0.574 ***         14.08         Supported
H10         Learning agility > Student engagement     0.572 ***         14.21         Supported
* p < 0.05, ** p < 0.01, *** p < 0.001.
