Article

Supporting Student Learning Needs in Tertiary Education: Institutional Support Structures Based on the Institutional Support Questionnaire

Teaching & Learning Centre, Singapore University of Social Sciences, Singapore 599494, Singapore
* Author to whom correspondence should be addressed.
Behav. Sci. 2022, 12(8), 277; https://doi.org/10.3390/bs12080277
Submission received: 21 July 2022 / Revised: 8 August 2022 / Accepted: 9 August 2022 / Published: 11 August 2022

Abstract

This article presents the Institutional Support Questionnaire (ISQ), which was developed and validated to complement the Learning Needs Questionnaire (LNQ). While the LNQ, validated and published earlier, assesses students’ perceived learning needs, the ISQ assesses students’ psychological perspectives of their institution, particularly how they perceive their institution to support their learning. The two questionnaires work in tandem to support resource optimisation efforts in establishing targeted academic support structures within teaching-focused tertiary institutions. This study found that the 42-item ISQ had adequate psychometric properties and that institutional support could be represented by four factors (i.e., academic competency support, teaching practices, tutors’ characteristics, and use of technology in instruction) that reflected in large part the factors characterised by the LNQ (i.e., perceived academic competency, time management, preferred tutors’ characteristics, and use of technology). Practical applications of the ISQ and LNQ (i.e., how both could be applied in a tertiary education setting to identify students’ perceived learning needs and whether an institution is providing adequate support to meet these needs) and limitations on their use are discussed.

1. Introduction

Supporting student learning is a mission naturally entrusted to all educational institutions. This is not an easy feat, considering that multiple moderators can impact a student’s academic success. In this regard, an assessment of learning needs is critical in that information derived from the assessment would inform how institutional support structures could be developed and optimised in order to motivate students toward academic success. This paper discusses the development and validation of a questionnaire, the Institutional Support Questionnaire (ISQ), that intends to, noting the finite nature of resources, support resource optimisation efforts in establishing targeted academic support structures within teaching-focused tertiary education institutions.
Notwithstanding other factors such as qualification level (e.g., undergraduate, postgraduate) and the nature (e.g., part-time, full-time) and delivery mode (e.g., fully online, blended, face-to-face) of the programme, the amount of time a student spends at an institution is finite. By this token, educators should, within all practical means, optimise students’ learning as much as possible within this time [1]. The impetus to achieve this drives institutions toward not only more fitting instructional practices but also more optimal types of academic support for their students. Study skill interventions are a typical type of academic support provided to students, with the view that developing these skills would have a positive impact on students’ academic performance; studies and meta-analyses have generally found such interventions effective [2,3]. Nonetheless, nuances in the efficacy of these interventions have been detected depending on whether: (1) the interventions focused on a single learning need, (2) the interventions adopted a multipronged approach in which several learning needs were addressed, or (3) the interventions were metacognitive in nature. The effects of these interventions also differed depending on whether they aimed to enhance performance closely related to (i.e., near transfer, which showed better results) or distantly related to (i.e., far transfer) the interventions themselves. The current body of knowledge thus suggests that, to be effective, study skill interventions should also incorporate metacognitive awareness and take careful account of the learning context.
Another study by King [4] identified two approaches to providing student support that aimed to enhance student learning: (1) an institutional-level approach emphasising learning experience and (2) an approach focused on student characteristics. This meant that institutions tried to support students by providing high-quality learning experiences, such as providing spaces and avenues for faculty–student interactions, and providing service learning and volunteering opportunities. On the other hand, learning support that focused on student characteristics attempted to develop and enhance educational practices that pertained to student attributes. King postulated that student support was comprehensive only when learning experiences and student characteristics were aligned and linked, student characteristics were acknowledged, and learning experiences were designed with these characteristics in mind.
Evidently, the body of research on academic support provided by institutions (e.g., study skill interventions, enhancement of learning experiences) suggests strong interest in, as well as recognition of the importance of, academic support structures for students’ academic success. While an institution that constantly reflects on the adequacy of its support for student learning demonstrates commitment to providing a conducive learning environment, it must also be realistic in allocating resources for such support structures. After all, many education institutions are publicly funded and must demonstrate responsibility and prudence in how resources are utilised [5]. Hence, beyond the consideration of effective support structures, it is critical to consider key stakeholders’ views in this endeavour, one of which is students’ psychological perspectives (i.e., hearing from students themselves about the types of support they perceive they are receiving and need). Students’ perspectives would inform institutional decision making about which types of academic support structures to implement or prioritise. The development of the ISQ serves this need, with a view to helping institutions optimise their resources by revealing the targeted academic support structures that their students need.

1.1. Dynamic Student Profiles, Dynamic Expectations and Learning Needs

The profile of students within institutions is ever changing. Contrary to the typical impression of a university campus filled with fresh secondary school graduates (i.e., conventional university students), there has been increasing diversity on campus, with more adults returning to school to obtain or further their tertiary education qualifications [6,7,8]. Considering international advocacy for continuing and further education [9] and the influx of adult learners, institutions should not assume that adult learners’ learning needs, and hence the expectations that the institution should have of adult learners, are similar to those of conventional university students [10]. Adult learners are busy individuals with different and multiple priorities that shape how they expect learning to take place and how they learn. While some adult learners may be able to immerse themselves in learning experiences, expecting no less than conventional university students do, others may face constraints and may not be able to spend as much time on their learning [11]. Studies have also shown that adult learners tend to learn through formal teaching and learning settings in the classroom, while conventional university students benefit from informal learning experiences gained from campus activities outside these formal settings [12]. The seemingly established profile of a conventional university student also appears to be changing; for example, conventional students are spending more time working, in part to fulfil credits (e.g., work-study qualifications) [13]. Given this, it would be premature to claim that conventional university students have ample time for studying compared with adult learners.
With such complexities and the inevitable strain on resources brought about by an ever-shifting social–economic–political climate (e.g., due to the COVID-19 pandemic), it is all the more critical that feedback be sought from students on a regular basis on whether institutions have been keeping up with supporting their needs. It is with this framing that this study presents the ISQ, which was developed and validated as a tool to support resource optimisation efforts in establishing targeted academic support structures within teaching-focused tertiary education institutions.

1.2. Contextualising the Need for the Institutional Support Questionnaire (ISQ)

The ISQ is intended for teaching-focused tertiary education institutions, such as the case institution in this study, to have a validated tool to solicit feedback from students about their perceptions of the institutional support they are receiving for their learning. Based in Singapore, the case institution was, at the time of writing, the only publicly funded tertiary education institution that championed lifelong learning [14]. The Singapore Ministry of Education had earlier designated the case institution to cater more to adult learners than conventional university students in efforts to diversify the local tertiary education landscape. From science and technology, to humanities and behavioural sciences, to business, to human development, to law, the case institution offered a variety of undergraduate, postgraduate, and continuing education programmes that catered to working adults wishing to pursue tertiary-level education. Accordingly, about 75% of the case institution’s student population were adult learners enrolled in programmes that were usually part-time in nature, with classes arranged in the evenings or on weekends to accommodate adult learners’ schedules and other commitments. To afford greater flexibility, the case institution also offered courses in face-to-face, blended, and fully online modes.
Based on Messick’s first facet of validity evidence [15], content appropriateness, it is critical to consider the context in which a test or questionnaire is to be deployed. Given the context described, the absence of a comparable questionnaire in the Singapore tertiary education context, and the case institution’s unique student population (i.e., 75% adult learners), the ISQ had to be developed for its intended purpose.

2. Methodology

The ISQ was developed based on the measurement of latent variables and classical measurement theory as described by DeVellis [16]. Essentially, each subscale is measured unidimensionally by a number of items, upon which a score is computed and claims are made. Three stages, as recommended by Messick [15], were undertaken for this study. Stage 1 involved developing items, with a focus on content appropriateness. Stage 2 involved administering the ISQ to collect data, and Stage 3 involved analyses to establish the factorial structure of the ISQ.
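The scoring logic under classical measurement theory, where each unidimensional subscale yields a single score computed from its items, can be sketched as follows (a minimal illustration; the item identifiers and responses are invented, not taken from the ISQ):

```python
# Illustrative sketch: under classical measurement theory, each subscale
# is measured unidimensionally by several items, and a subscale score is
# computed from the item responses (here, the mean).
def subscale_score(responses, item_ids):
    """Mean of the listed items for one respondent.
    responses: dict mapping item id -> coded response (1-7)."""
    values = [responses[i] for i in item_ids]
    return sum(values) / len(values)

# hypothetical respondent and hypothetical item ids
respondent = {"ACS1": 6, "ACS2": 5, "ACS3": 7}
print(subscale_score(respondent, ["ACS1", "ACS2", "ACS3"]))  # 6.0
```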
  • Stage 1
A separate questionnaire, the Learning Needs Questionnaire (LNQ) by Ho and Lim [17], had been developed and validated earlier for the case institution to assess students’ perceptions of themselves. Items in the ISQ were developed with reference to students’ perceptions of the institution and, in part, to the results of the validation of the LNQ. This would allow the ISQ and LNQ to be complementary, such that information gathered from both questionnaires would be useful in supporting institutional efforts toward supporting students. Based on the literature review, there appeared to be a dearth of existing complementary instruments developed via a positivist approach with psychometric properties adequate for the purpose of establishing student learning support structures at teaching-focused tertiary education institutions. In this regard, the ISQ items were primarily developed via a deductive research approach by reviewing existing nonvalidated measures of institutional support and the various dimensions intended within those measures. Alongside this, inductive approaches were used: items were developed by drawing on the experience of academic and professional staff within a unit in the Office of the Provost responsible for planning and implementing academic support structures at the institutional level.
The initial item development stage saw the generation of 56 items representing how students perceived institutional support structures related to key domains such as academic competency support, learning resources, tutors’ teaching practices, and tutors’ characteristics. These items were developed in part with the view that tutors, who in the case institution were the course instructors, were the faces of the institution, as they interacted the most with students. Five of the fifty-six items were removed, as they were deemed inappropriate or confusing for respondents by an expert panel comprising a member of the institution’s senior management, a senior lecturer, a lecturer, and five professional staff under the Office of the Provost. Table 1 presents the reasons why these items were removed.

2.1. Rating Scale and Scoring

For the purposes of respondents’ familiarity and minimising potential confounds due to confusion, the ISQ employed a rating scale similar to that of the LNQ. Item responses were coded one to seven, with seven and one corresponding to strongly agree and strongly disagree, respectively.
  • Stage 2
Stage 2 focused on data collection over a two-month period. Apart from email invitations seeking student respondents sent to all official student email addresses, digital signage boards across all lift lobbies in the case institution advertised the call for respondents. Interested respondents could then access the ISQ via a hyperlink. To further encourage participation, the email invitation and digital advertisements specified that respondents who completed the questionnaire would be eligible for a randomly drawn prize. Ethical review and approval were waived for this study on the basis that the research was conducted in established or commonly accepted educational settings and involved normal educational practices. Informed consent was assumed when students voluntarily accepted the invitation and responded by completing the ISQ.

2.2. Participants and Procedure

A total of 1178 participant responses were used to validate the ISQ (see Table 2). Responses with at least one missing value and duplicates were excluded.
  • Stage 3
Exploratory and confirmatory factor analyses (EFA and CFA) were used in Stage 3. As described by DeVellis [16], EFA was used to establish the factorial structure of the ISQ and to reduce items based on statistical results rather than theory. CFA was then used to determine the extent to which the a priori factorial structure established through the EFA represented the actual data [18]. For the purpose of cross-validation, the sample was partitioned, with one half used for the EFA and the other for the CFA. The two halves were comparable in terms of gender, χ2 (1) = 0.01, p > 0.05; degree programme (part-/full-time undergraduate, postgraduate), χ2 (2) = 0.11, p > 0.05; years of study with the university, χ2 (5) = 1.43, p > 0.05; the school the respondent was from, χ2 (5) = 0.07, p > 0.05; age group, χ2 (5) = 0.26, p > 0.05; cumulative grade point average band, χ2 (7) = 3.03, p > 0.05; and highest educational qualification attained before matriculation, χ2 (4) = 3.49, p > 0.05.
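The comparability checks on the two partitioned halves are chi-square tests of homogeneity on the demographic counts. A hedged sketch of one such check (the gender counts below are invented for illustration, not the study’s data):

```python
# Hypothetical illustration of the sample-comparability check: a
# chi-square test of homogeneity on gender counts across the two
# partitioned halves. Counts are invented, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

# rows: EFA half, CFA half; columns: gender categories
counts = np.array([[250, 339],
                   [248, 341]])
stat, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {stat:.2f}, p = {p:.3f}")  # p > 0.05 -> halves comparable
```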

2.3. Exploratory Factor Analysis

Tests of normality (Mardia, Shapiro–Wilk, Kolmogorov–Smirnov, Cramér–von Mises, and Anderson–Darling) were undertaken to assess the suitability of the dataset for EFA. The dataset was considered suitable based on recommendations by Fuller and Hemmerle [19], given that the distributions of all items were only modestly nonnormal; the skewness and kurtosis of each item were also within the thresholds recommended for structural equation modelling [20]. Further to the tests of normality, the Kaiser–Meyer–Olkin measure of sampling adequacy was 0.95, and Bartlett’s test of sphericity was significant (χ2 (1081) = 33,452.69, p < 0.001). These metrics also supported the suitability of the dataset for EFA.
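Bartlett’s test of sphericity checks whether the item correlation matrix differs from an identity matrix, a prerequisite for factor analysis. A minimal sketch on synthetic data (not the study’s), using the standard chi-square approximation:

```python
# Minimal sketch of Bartlett's test of sphericity on synthetic data.
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Returns (statistic, df, p-value) for Bartlett's test of sphericity.
    data: (n_respondents, n_items) array."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    # chi-square approximation: -(n - 1 - (2p + 5)/6) * ln|R|
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, df, chi2.sf(stat, df)

rng = np.random.default_rng(0)
base = rng.normal(size=(300, 1))
items = base + 0.5 * rng.normal(size=(300, 6))  # six correlated "items"
stat, df, p_value = bartlett_sphericity(items)
print(f"chi2({df:.0f}) = {stat:.2f}, p = {p_value:.4f}")
```

A significant result (p < 0.05), as reported for the ISQ data, indicates enough shared variance among items for EFA to proceed.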
Cattell’s scree test, Kaiser’s eigenvalue criterion, Horn’s parallel analysis, and the amount of variance explained were considered in establishing the factorial structure. Four factors were identified, along with a reduction of eight items that had loadings of less than 0.32 or that cross-loaded [18,21]. The factors and their corresponding variances explained were interpreted as tutors’ characteristics (TC, 79.26%), academic competency support (ACS, 9.82%), use of technology in instruction (TII, 7.97%), and teaching practices (TP, 2.95%). The interfactor correlations, means, and standard deviations, and the final factor loadings (after item reduction), are presented in Table 3 and Table 4, respectively.
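Of the retention criteria listed above, Horn’s parallel analysis is the most algorithmic: factors are retained only where the observed eigenvalues exceed the mean eigenvalues of random data of the same dimensions. A hedged sketch on synthetic data (not the study’s data or code):

```python
# Sketch of Horn's parallel analysis: retain factors whose observed
# eigenvalues exceed the mean eigenvalues of random normal data of the
# same shape. Synthetic data only.
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    random_mean = np.zeros(p)
    for _ in range(n_iter):
        noise = rng.normal(size=(n, p))
        random_mean += np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
    random_mean /= n_iter
    return int(np.sum(observed > random_mean))

# synthetic data: eight "items" all loading on one latent factor
rng = np.random.default_rng(1)
factor = rng.normal(size=(400, 1))
data = factor + 0.8 * rng.normal(size=(400, 8))
print("factors to retain:", parallel_analysis(data))  # retains the 1 real factor
```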

2.4. Reliability

Reliability indices suggested sufficient internal consistency based on the thresholds recommended by Everitt [22], Kline [23], and Tucker and Lewis [24]. Cronbach’s α was 0.97, Tucker and Lewis’s reliability coefficient was 0.78, and item total correlations ranged from 0.43 to 0.81. In view of these, all items that were used for EFA were retained without further deletion.
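Cronbach’s α and corrected item-total correlations, the internal-consistency indices reported above, can be computed directly from the response matrix. An illustrative sketch on synthetic data (not the ISQ responses):

```python
# Illustrative computation (synthetic data) of Cronbach's alpha and
# corrected item-total correlations.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of coded responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items):
    """Correlation of each item with the sum of the remaining items."""
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

rng = np.random.default_rng(2)
trait = rng.normal(size=(500, 1))
items = trait + 0.6 * rng.normal(size=(500, 5))  # five items, one trait
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```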

2.5. Confirmatory Factor Analysis

Based on the factors specified through the EFA, a four-factor measurement model of the ISQ was proposed. This model was then subjected to a hierarchical CFA to assess model fit and confirm the factorial structure of the instrument. As with the partitioned sample for the EFA, the CFA sample was slightly nonnormal; hence, maximum likelihood estimation with Satorra–Bentler (MLSB) scaled chi-square statistics, which adjusts the chi-square statistic and standard errors for nonnormality, was used to obtain more precise goodness-of-fit statistics [25,26,27].
Initial iterative runs of the CFA identified a few items that contributed to model misfit. Table 5 presents these items and possible reasons why they might have contributed to misfit.
With the removal of these five items, the model fit of the first-order four-factor model of the ISQ specified by the EFA was assessed to be acceptable based on recommended thresholds, indicators, and goodness-of-fit statistics per Brown [28], Hair et al. [18], and Hu and Bentler [29] (i.e., CFI ≥ 0.90; RMSEA < 0.08; SRMR < 0.08). Theoretically plausible alternative models (i.e., a one-factor model and a second-order four-factor model) were also considered to ascertain the feasibility of the first-order four-factor model. Table 6 presents the results of the CFA. Based on the chi-square difference test, the Schwarz Bayesian criterion, and the Akaike information criterion, the first-order four-factor model specified by the EFA, with the five items removed, was the most appropriate for the ISQ.
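The chi-square difference test used to compare nested models can be sketched as below. The fit statistics are invented for illustration, and note that the paper used Satorra–Bentler scaled statistics, which require an additional scaling correction not shown here:

```python
# Hedged sketch of a chi-square difference test between nested CFA
# models. Statistics below are invented; the Satorra-Bentler scaling
# correction used in the paper is omitted for simplicity.
from scipy.stats import chi2

def chi_square_difference(chi2_restricted, df_restricted, chi2_full, df_full):
    """Compare a more restricted model (e.g. one-factor) against a less
    restricted one (e.g. first-order four-factor); a significant result
    favours the less restricted model."""
    d_chi2 = chi2_restricted - chi2_full
    d_df = df_restricted - df_full
    return d_chi2, d_df, chi2.sf(d_chi2, d_df)

# hypothetical fit statistics for a one-factor vs four-factor model
d_chi2, d_df, p = chi_square_difference(2500.0, 665, 1400.0, 659)
print(f"delta chi2({d_df}) = {d_chi2:.1f}, p = {p:.3g}")
```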
All standardised loading estimates of the first-order four-factor model were significant (p < 0.001) and greater than 0.5; these suggested a factorial structure with practical significance [18]. With the exception of items ACS5 (loading 0.6), TP9 (loading 0.5), TP5 (loading 0.6), TC20 (loading 0.6), and TII6 (loading 0.4), all ISQ items had standardised loadings greater than 0.7; these further indicated a well-defined factorial structure.
The construct reliability (CR) of each factor was well above 0.7; this indicated that there was internal consistency and that the items consistently measured the same constructs. The average variance extracted (AVE) for each factor was also above 0.5, suggesting acceptable error variance and that no items should be further eliminated.
To investigate whether each factor was distinct from the others, the square root of the AVE was compared with the interfactor correlations as recommended by Fornell and Larcker [30] and Hair et al. [18]. It was established that all the factors were distinct from others, given that the square roots of the AVE estimates were greater than the interfactor correlations (see Table 7 and Table 8).
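The construct reliability, average variance extracted, and Fornell–Larcker computations described above follow directly from the standardised loadings. A minimal sketch (the loadings and interfactor correlation are hypothetical, not the ISQ’s):

```python
# Minimal sketch of construct reliability (CR), average variance
# extracted (AVE), and the Fornell-Larcker discriminant-validity check.
# Loadings and interfactor correlation are hypothetical.
import math

def construct_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    errors = [1 - l**2 for l in loadings]
    return sum(loadings)**2 / (sum(loadings)**2 + sum(errors))

def average_variance_extracted(loadings):
    """AVE = mean of squared standardised loadings."""
    return sum(l**2 for l in loadings) / len(loadings)

loadings = [0.78, 0.81, 0.75, 0.72]   # one factor's standardised loadings
interfactor_r = 0.62                  # correlation with another factor

cr = construct_reliability(loadings)
ave = average_variance_extracted(loadings)
distinct = math.sqrt(ave) > interfactor_r  # Fornell-Larcker criterion
print(f"CR = {cr:.2f}, AVE = {ave:.2f}, distinct: {distinct}")
```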

3. Discussion

The EFA and CFA suggested a first-order four-factor model to represent the ISQ, with a reduction of 14 items across Stages 1 to 3 of this study. The four factors (i.e., tutors’ characteristics, academic competency support, use of technology in instruction, and teaching practices) indicate how students perceived institutional support vis-à-vis their learning. Apart from teaching practices, the factors found for the ISQ in this study complemented three of the four factors identified by the LNQ (i.e., student preference of tutors’ characteristics, use of technology, and perceived academic competency). Since the ISQ and LNQ had been intended to be complementary in helping institutions establish the learning support required by their students, the following sections discuss how the factors identified in the two questionnaires complement each other (see Table 9 for the factors).

3.1. Tutors’ Characteristics

While the LNQ measures students’ preferences regarding tutor characteristics, the ISQ measures what students actually experience (i.e., the characteristics their tutors display). Administered together (e.g., the LNQ followed by the ISQ), gaps between scores on the two questionnaires in this area could indicate a disconnect, in that what students preferred of tutors was inconsistent with the characteristics displayed by tutors. Students’ perception of the quality of instruction has been associated with tutor-related factors [31]. For example, students’ enjoyment of a class seemed to be mediated by their perception of an instructor’s enthusiasm [32]. More importantly, such perceptions seem to have an impact on student learning: a three-year study by Alshariff and Qi [33] showed that students’ motivation to learn was related to their perception of tutor-related factors. Despite these studies, however, there have been limited studies on students’ perception of an ideal tutor [34]. Hence, it would be judicious for institutions to survey their students to understand their preferences for tutor-related characteristics. When these responses are compared with responses regarding students’ actual experience of their tutors, institutions would be better placed to assess whether students’ expectations have been met. In the event of a disconnect revealed by responses to the ISQ, institutions would have relevant information to convey to tutors; this could be used in part to develop more meaningful tutor–student interactions that support learning.

3.2. Academic Competency Support

Academic competency support (ACS) within the ISQ provides information on students’ perception of the support they receive from their institution in developing their academic skills (e.g., verbal presentation skills, critical thinking, etc.). Taken together with the perceived academic competency measure within the LNQ, which measures students’ perception of their own academic competency, information regarding the sufficiency of ACS could be obtained. As an example, if students reported a low perceived academic competency in their academic writing skills and concurrently reported a low perception of institutional support in the area of academic writing, this would be an indication that academic writing support was not sufficient within the institution and that resources should be directed to develop support in this area.
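The paired reading of LNQ and ISQ scores described above amounts to a simple gap analysis. An illustrative sketch (the skill areas, mean scores, and threshold are hypothetical):

```python
# Illustrative gap analysis pairing the LNQ's perceived academic
# competency scores with the ISQ's academic competency support scores:
# low scores on both flag areas where support should be prioritised.
# All values and the threshold are hypothetical.
def flag_support_gaps(lnq_means, isq_means, threshold=4.0):
    """Both dicts map a skill area to a mean rating on the 1-7 scale;
    returns areas where students feel weak AND under-supported."""
    return [area for area in lnq_means
            if lnq_means[area] < threshold and isq_means.get(area, 7) < threshold]

lnq = {"academic writing": 3.2, "critical thinking": 5.1}
isq = {"academic writing": 3.5, "critical thinking": 5.6}
print(flag_support_gaps(lnq, isq))  # ['academic writing']
```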
The assessment of ACS provided by an institution has important implications for student retention. Whether in Tinto’s student integration model, which suggests academic and social experiences in college as determinants for academic persistence; Bean and Metzner’s student attrition model, which explains academic persistence of nontraditional students; or Rovai’s model of student persistence, which integrates both Tinto’s and Bean and Metzner’s models, the roles of academic experience and support provided by institutions still feature prominently as factors related to student retention [35]. Indeed, institutions have a moral obligation to provide the necessary support to afford their students the highest chance for success. This is especially important when public funds are used to finance operations [36]. In view of this, institutions should regularly seek feedback from their students about their experience of ACS, and the ACS factor of ISQ holds promise for this purpose.

3.3. Use of Technology in Instruction

Given the advent of information technology in education, it is critical to learn how students view their institutions in terms of the use of technology in instruction, which is what the use of technology in instruction factor of the ISQ was intended to measure. This measure could also be compared with the LNQ’s student use of technology measure; a difference in scores could indicate that a student was IT savvy but receiving instruction that did not optimise the use of information technology for learning, whereas a high score achieved in both scales could indicate that the institution had prepared both students and tutors to function satisfactorily in a technology-assisted learning space.
Increased dependence on and use of technology in education have been keenly felt during the COVID-19 pandemic. With schools and tertiary education institutions locked down to prevent the spread of the virus, technology was heavily utilised to ensure that teaching and learning could continue [37]. Students and instructors were thus thrown into a new reality overnight, which included the need to learn and teach effectively using relevant technologies. While the world is still grappling with containing the virus, students and instructors are now more familiar with the online teaching and learning space than they were during the beginning phases of the pandemic. This presents an opportune time to assess whether the quick pivot to online education has accelerated not only students’ information technology skills but also instructors’ expertise in using technology to engage students, and the ISQ’s use of technology in instruction factor is timely for this purpose.

3.4. Teaching Practices

Based on the findings of this study, there was no direct association between the teaching practices factor and those of the LNQ. Nonetheless, teaching practices are the face of an education institution. Whether students receive quality teaching can be impacted by what goes on in the classrooms. In view of this, how students view institutional support in terms of teaching practices would provide institutions, including tutors, with information on potential areas for improvement. As an example, if the item “My tutor uses learning outcomes to guide students on what to learn” consistently returned a low rating of agreement, it would pay to investigate how tutors could level up in constructive alignment [38] so that teaching practices were consistent with intended learning outcomes.

4. Limitations and Directions for Future Research

The ISQ was developed and validated to complement the LNQ for the case institution. As the validation was based on classical test theory and hence sample dependent, it would be most appropriate to examine the generalisability of the ISQ in tertiary education institutions that place a high value on teaching and have diverse student populations, as with the case institution. To further minimise sample dependency and provide a basis for exploring the additivity of scores, a Rasch analysis could be undertaken subsequently [26]. Nonetheless, the ISQ (along with the LNQ) presents a reference for tertiary education institutions to develop a comprehensive questionnaire that evaluates both students’ learning needs and their perceptions of how the institution may be meeting their needs.
It is worth noting that the ISQ items were conceptualised before COVID-19 affected the experience of education. Therefore, there is a possibility that some items may not be sensitive to changes due to the pandemic. This calls for a review of the items before subsequent administration, as is normal practice when deploying questionnaires.

5. Conclusions and Practical Implications

The development and validation of an instrument can be time and resource intensive. However, the processes are necessary so that the responses collected present valid and reliable measures. Based on the review of literature and the absence of a similar questionnaire in the Singapore context, the ISQ was developed and validated in the context of the educational experiences of students at the case institution.
The advent of lifelong learning, with its aim of living well and coping with today’s ever-changing world, means that people are expected to return to school from time to time throughout life to upskill, reskill, or simply learn for pleasure. Hence, education institutions should expect greater diversity within their student populations, not only in terms of student characteristics (i.e., a mix of conventional fresh graduates and adult learners) but also in terms of motivation for learning. Such diversity is undeniably also fuelled by the development of continuing education units and the potential of stackable credentials in institutions [39]. This calls for institutions to regularly survey their students to learn how they perceive institutional support vis-à-vis their learning needs. In view of this, the ISQ, in parallel with the LNQ, holds promise as an instrument that can be readily deployed on a practical and large-scale basis for this purpose; the ISQ assesses students’ perceptions of institutional support for their learning, while the LNQ assesses their perceived learning needs. With the information gathered from these measures, institutions would be better placed to fulfil their responsibility and moral duty of providing the students under their charge the opportunity for academic success. In practical terms, this means that there must be targeted support structures to help both lower- and higher-progress students realise and maximise their potential.

Author Contributions

Conceptualisation, Y.Y.H.; methodology, Y.Y.H.; formal analysis, L.L.; writing—original draft preparation, Y.Y.H. and L.L.; writing—review and editing, Y.Y.H. and L.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study on the basis that this research was conducted in established or commonly accepted educational settings and involved normal educational practices.

Informed Consent Statement

Informed consent was assumed when students voluntarily accepted the invitation and responded by completing the ISQ.

Data Availability Statement

Data are available on request owing to privacy and ethical restrictions.

Acknowledgments

The authors thank the academic and professional staff of the Teaching & Learning Centre in the Singapore University of Social Sciences for their feedback, support, and administration of the ISQ.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ho, Y.Y.; Yeo, E.-Y.; Wijaya, D.S.B.M. Turning coffee time into teaching moments through bite-sized learning for adult learners. J. Contin. Educ. 2022.
2. Dent, A.; Koenka, A. The relation between self-regulated learning and academic achievement across childhood and adolescence: A meta-analysis. Educ. Psychol. Rev. 2016, 28, 425–474.
3. Hattie, J.; Biggs, J.; Purdie, N. Effects of learning skills interventions on student learning: A meta-analysis. Rev. Educ. Res. 1996, 66, 99–136.
4. King, P.M. Enriching the student learning experience: Linking student development and organizational perspectives. About Campus 2014, 19, 7–13.
5. Teng, A. Parliament: Younger Universities Get Double the Government Funding of Older Counterparts; The Straits Times: Singapore, 2019.
6. Osam, E.K.; Bergman, M.; Cumberland, D.M. An integrative literature review on the barriers impacting adult learners’ return to college. Adult Learn. 2017, 28, 54–60.
7. Ross-Gordon, J.M. Research on adult learners: Supporting the needs of a student population that is no longer nontraditional. Peer Rev. 2011, 13, 26–29.
8. Ho, Y.Y.; Lim, W.Y. Educating Adult Learners: Bridging Learners’ Characteristics and the Learning Sciences. In International Diversity and Inclusion: Innovative Higher Education in Asia; Gleason, N.W., Sanger, C.S., Eds.; Palgrave: Singapore, 2020; pp. 97–115. ISBN 978-981-15-1628-3.
9. OECD. Getting Skills Right: Future-Ready Adult Learning Systems; OECD Publishing: Paris, France, 2019.
10. Roumell, E.A. Priming adult learners for learning transfer: Beyond content and delivery. Adult Learn. 2018, 30, 15–22.
11. Bourke, A.; Vanderveken, J.; Ecker, E.; Bell, H.; Richie, K. Teaching is a learning experience: Exploring faculty engagement with low-income adult learners in a college-community partnership program. Can. J. Educ. 2020, 43, 313–340. Available online: https://journals.sfu.ca/cje/index.php/cje-rce/article/view/3897 (accessed on 10 July 2022).
12. Panacci, A.G. Adult students in higher education: Classroom experiences and needs. Coll. Q. 2015, 18, n3. Available online: https://files.eric.ed.gov/fulltext/EJ1087330.pdf (accessed on 10 July 2022).
13. Chu, M.L.; Conlon, E.G.; Creed, P.A. Work–study boundary congruence: Its relationship with student well-being and engagement. Int. J. Educ. Vocat. Guid. 2021, 21, 81–99.
14. Singapore Ministry of Education. Singapore University of Social Sciences Bill Second Reading; Singapore Ministry of Education: Singapore, 2017. Available online: https://www.moe.gov.sg/news/speeches/20170508-singapore-university-of-social-sciences-bill-second-reading-speech-by-mr-ong-ye-kung-minister-for-education-higher-education-and-skills (accessed on 10 August 2022).
15. Messick, S. Foundations of validity: Meaning and consequences in psychological assessment. ETS Research Report No. RR-93-51, Series 2. ETS Res. Rep. 1993, 1993, i–18.
16. DeVellis, R.F. Scale Development: Theory and Applications, 4th ed.; SAGE Publications: Los Angeles, CA, USA, 2017; ISBN 9781506341569.
17. Ho, Y.Y.; Lim, L. Targeting student learning needs: The development and preliminary validation of the Learning Needs Questionnaire for a diverse university student population. High. Educ. Res. Dev. 2021, 40, 1452–1465.
18. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 8th ed.; Pearson: Upper Saddle River, NJ, USA, 2019; ISBN 978-1-4737-5654-0.
19. Fuller, E.L., Jr.; Hemmerle, W.J. Robustness of the maximum-likelihood estimation procedure in factor analysis. Psychometrika 1966, 31, 255–266.
20. West, S.G.; Finch, J.F.; Curran, P.J. Structural Equation Models with Nonnormal Variables: Problems and Remedies. In Structural Equation Modelling: Concepts, Issues and Applications; Hoyle, R.H., Ed.; Sage: Thousand Oaks, CA, USA, 1995; pp. 56–75. ISBN 978-0803953185.
21. Costello, A.B.; Osborne, J.W. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract. Assess. Res. Eval. 2005, 10, 7.
22. Everitt, B. The Cambridge Dictionary of Statistics, 2nd ed.; Cambridge University Press: Cambridge, UK, 2002; ISBN 0-521-81099.
23. Kline, P. The Handbook of Psychological Testing; Routledge: London, UK, 2000; ISBN 9780415211581.
24. Tucker, L.R.; Lewis, C. A reliability coefficient for maximum likelihood factor analysis. Psychometrika 1973, 38, 1–10.
25. Lim, L. Development and initial validation of the Computer-Delivered Test Acceptance Questionnaire for secondary and high school students. J. Psychoeduc. Assess. 2020, 38, 182–194.
26. Lim, L.; Chapman, E. Development and preliminary validation of the Moral Reasoning Questionnaire for secondary school students. SAGE Open 2022, 12, 21582440221085271.
27. Lim, L.; Lim, S.H.; Lim, R.W.Y. Measuring learner satisfaction of an adaptive learning system. Behav. Sci. 2022, 12, 264.
28. Brown, T.A. Confirmatory Factor Analysis for Applied Research, 2nd ed.; Guildford Press: New York, NY, USA, 2015; ISBN 978-1-4625-1536-3.
29. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. A Multidiscip. J. 1999, 6, 1–55.
30. Fornell, C.; Larcker, D.F. Evaluating structural equations models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
31. Kneipp, L.B.; Kelly, K.; Biscoe, J.D.; Richard, B. The impact of tutor’s personality characteristics on quality of instruction. Coll. Stud. J. 2010, 44, 901–905.
32. Frenzel, A.C.; Becker-Kurz, B.; Pekrun, R.; Goetz, T.; Lüdtke, O. Emotion transmission in the classroom revisited: A reciprocal effects model of teacher and student enjoyment. J. Educ. Psychol. 2018, 110, 628–639.
33. Alsharif, N.Z.; Qi, Y. A three-year study of the impact of tutor attitude, enthusiasm, and teaching style on student learning in a medicinal chemistry course. Am. J. Pharm. Educ. 2014, 78, 132.
34. Fike, D.S.; Fike, R.; Zhang, S. Teacher qualities valued by students: A pilot validation of the teacher qualities (T-Q) instrument. Acad. Educ. Leadersh. J. 2015, 19, 115–125.
35. Rovai, A.P. In search of higher persistence rates in distance education online programs. Internet High. Educ. 2003, 6, 1–16.
36. Johnson, M.S.; Thompson, S. The necessity for good governance and effective leadership at public state-funded historically Black colleges and universities (HBCUs) in the midst of COVID-19. Int. J. Multidiscip. Perspect. High. Educ. 2020, 5, 129–133. Available online: https://www.ojed.org/index.php/jimphe/article/view/2653 (accessed on 10 August 2022).
37. Filho, W.L.; Wall, T.; Rayman-Bacchus, L.; Mifsud, M.; Pritchard, D.J.; Lovren, V.O.; Farinha, C.; Petrovic, D.S.; Balogun, A. Impacts of COVID-19 and social isolation on academic staff and students at universities: A cross-sectional study. BMC Public Health 2021, 21, 1213.
38. Biggs, J.; Tang, C. Teaching for Quality Learning at University, 4th ed.; McGraw-Hill and Open University Press: New York, NY, USA, 2011; ISBN 978-0-33-524275-7.
39. Bailey, T.; Belfield, C. Stackable Credentials: Awards for the Future? Columbia Teachers College Community College Research Center, Working Paper No. 92; CCRC: New York, NY, USA, 2017. Available online: https://ccrc.tc.columbia.edu/media/k2/attachments/stackable-credentials-awards-for-future.pdf (accessed on 10 August 2022).
Table 1. Items removed from original item pool.

| Item Stem | Reason(s) for Removal |
|---|---|
| My tutor is able to incorporate students’ experiences to deepen learning. | It may be unclear how learning might be deepened by incorporating students’ experiences. |
| My tutor is able to engage students in the online learning environment. | Item precludes other learning environments that are also prevalent within the institution (e.g., face to face, blended). |
| My tutor provides feedback on assessments and assignments that help students to improve. | It is not clear whether it is the feedback or the assessment that is meant to help students to improve. |
| My tutor is empathetic to student learning needs. | “Empathetic” might not be understood by students enrolled in non-English disciplines. Item may also be perceived as similar to another item, “my tutor is helpful to students in addressing their learning needs”. |
| My tutor cares about student learning. | Item was unanimously considered vague. |
Table 2. Profile of respondents.

| Characteristics | EFA Sample (n) | EFA Sample (%) | CFA Sample (n) | CFA Sample (%) |
|---|---|---|---|---|
| Gender | | | | |
| Male | 253 | 43.0 | 255 | 43.3 |
| Female | 336 | 57.1 | 334 | 56.7 |
| Degree programme | | | | |
| Full-time undergraduate | 150 | 25.5 | 147 | 25.0 |
| Part-time undergraduate | 416 | 70.6 | 417 | 70.8 |
| Postgraduate | 23 | 3.9 | 25 | 4.2 |
| Years of study with the university | | | | |
| Less than 1 | 157 | 26.7 | 149 | 25.3 |
| 1 to 2 | 171 | 29.0 | 184 | 31.2 |
| 3 to 4 | 218 | 37.0 | 220 | 37.4 |
| 5 to 6 | 37 | 6.3 | 30 | 5.1 |
| 7 to 8 | 5 | 0.9 | 5 | 0.9 |
| More than 8 | 1 | 0.2 | 1 | 0.2 |
| School | | | | |
| A | 9 | 1.5 | 10 | 1.7 |
| B | 133 | 22.6 | 132 | 22.4 |
| C | 244 | 41.4 | 244 | 41.4 |
| D | 102 | 17.3 | 101 | 17.2 |
| E | 8 | 1.4 | 8 | 1.4 |
| F | 93 | 15.8 | 94 | 16.0 |
| Age group | | | | |
| Below 25 | 244 | 41.4 | 243 | 41.3 |
| 25 to 30 | 191 | 32.4 | 196 | 33.3 |
| 31 to 40 | 86 | 14.6 | 84 | 14.3 |
| 41 to 50 | 46 | 7.8 | 44 | 7.5 |
| 51 to 60 | 16 | 2.7 | 17 | 2.9 |
| Above 61 | 6 | 1.0 | 5 | 0.9 |
| Cumulative grade point average band | | | | |
| 4.51 to 5.00 | 14 | 2.4 | 10 | 1.7 |
| 4.01 to 4.50 | 50 | 8.5 | 55 | 9.3 |
| 3.51 to 4.00 | 116 | 19.7 | 110 | 18.7 |
| 3.01 to 3.50 | 137 | 23.3 | 146 | 24.8 |
| 2.51 to 3.00 | 83 | 14.1 | 76 | 12.9 |
| 2.00 to 2.50 | 50 | 8.5 | 53 | 9.0 |
| 0 to 1.99 | 8 | 1.4 | 13 | 2.2 |
| Not sure | 131 | 22.2 | 126 | 21.4 |
| Highest educational qualification before matriculation | | | | |
| Postgraduate | 17 | 2.9 | 11 | 1.9 |
| Degree | 48 | 8.2 | 55 | 9.3 |
| Diploma | 426 | 72.3 | 441 | 74.9 |
| GCE A Level | 74 | 12.6 | 63 | 10.7 |
| GCE O Level | 24 | 4.1 | 19 | 3.2 |
Table 3. Interfactor correlations, means, standard deviations, and Cronbach’s α.

| ISQ | F1 | F2 | F3 | M | SD | α |
|---|---|---|---|---|---|---|
| TC (F1) | | | | 5.54 | 0.92 | 0.81 |
| ACS (F2) | 0.42 | | | 4.87 | 1.00 | 0.85 |
| TII (F3) | 0.62 | 0.43 | | 5.23 | 1.04 | 0.83 |
| TP (F4) | 0.62 | 0.54 | 0.56 | 5.11 | 0.94 | 0.81 |
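The Cronbach’s α values above summarise the internal consistency of each factor. As an illustrative sketch only (not the study’s analysis code, and using made-up data rather than the ISQ responses), α can be computed from raw item scores as follows:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of total scores).
# Illustrative implementation; real analyses would typically use a stats package.
def cronbach_alpha(items):
    """items: one inner list per item, each holding that item's score per respondent."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

For perfectly parallel items the function returns 1.0; values such as the 0.81–0.85 range reported in Table 3 indicate adequate internal consistency by common rules of thumb.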
Table 4. Final factor loadings.

| Items | F1 | F2 | F3 | F4 |
|---|---|---|---|---|
| TC1. Is approachable | 0.92 | 0.04 | −0.03 | −0.03 |
| TC2. Is patient with students | 0.91 | 0.03 | 0.00 | −0.04 |
| TC3. Is helpful to students in addressing their learning needs | 0.91 | 0.04 | 0.01 | −0.03 |
| TC4. Respects students | 0.90 | 0.08 | −0.04 | −0.12 |
| TC5. Is dedicated to teaching | 0.90 | 0.02 | −0.04 | 0.05 |
| TC6. Is enthusiastic in teaching | 0.87 | −0.04 | 0.03 | 0.08 |
| TC7. Prepares their lessons well | 0.87 | −0.01 | 0.04 | 0.01 |
| TC8. Explains clearly in lessons | 0.87 | −0.03 | 0.03 | 0.06 |
| TC9. Motivates students to learn | 0.84 | 0.06 | 0.04 | 0.04 |
| TC10. Has strong subject knowledge | 0.83 | −0.16 | 0.04 | 0.09 |
| TC11. Provides high-quality feedback | 0.82 | 0.14 | 0.01 | −0.03 |
| TC12. Provides prompt feedback | 0.80 | 0.14 | 0.04 | −0.04 |
| TC13. Provides sufficient learning materials | 0.80 | 0.02 | 0.10 | 0.02 |
| TC14. Delivers interesting lessons | 0.73 | 0.03 | 0.08 | 0.09 |
| TC15. Believes in students’ learning ability | 0.67 | 0.04 | 0.13 | 0.09 |
| TC16. Stimulates student thinking | 0.66 | −0.03 | 0.10 | 0.23 |
| TC17. Is humorous in teaching | 0.66 | −0.02 | 0.08 | 0.12 |
| TC18. Relates work or life experiences to concepts taught | 0.54 | −0.11 | 0.14 | 0.26 |
| TC19. Sets high learning expectations | 0.43 | 0.04 | 0.21 | 0.15 |
| TC20. The learning materials (presentation slides, readings, activity sheets, etc.) prepared by my tutor help my learning | 0.41 | 0.05 | 0.10 | 0.23 |
| ACS1. My verbal presentation skills are developed by the resources, courses, and workshops provided by the university | 0.00 | 0.93 | −0.01 | −0.12 |
| ACS2. My critical thinking skills have been honed through the resources, courses, and workshops provided by the university | 0.05 | 0.83 | −0.09 | 0.04 |
| ACS3. My academic writing needs are developed and supported by the resources, courses, and workshops provided by the university | 0.11 | 0.83 | 0.02 | −0.20 |
| ACS4. I am able to develop my verbal presentation skills through tutors’ feedback and advice | −0.03 | 0.75 | 0.04 | 0.04 |
| ACS5. My academic writing needs are developed and supported by tutors’ feedback and advice | 0.14 | 0.62 | −0.01 | 0.05 |
| ACS6. My critical thinking skills have been honed through feedback from tutors | 0.08 | 0.60 | −0.13 | 0.31 |
| ACS7. The university provided opportunities to hone my information technology skills | −0.05 | 0.57 | 0.23 | 0.09 |
| ACS8. The university provided opportunities to hone my research | 0.04 | 0.56 | 0.00 | 0.22 |
| ACS9. My tutors helped me to be better at researching information for my academic work | −0.01 | 0.54 | 0.08 | 0.29 |
| ACS10. My tutors taught me to be better at using information technology | −0.15 | 0.52 | 0.33 | 0.17 |
| ACS11. The university provided opportunities to develop my self-directed learning skills | 0.00 | 0.44 | 0.11 | 0.27 |
| TII1. Uses technology to make learning more flexible | 0.10 | −0.04 | 0.86 | 0.09 |
| TII2. Uses technology to positively enhance my learning experience | 0.20 | 0.03 | 0.85 | −0.03 |
| TII3. Uses technology to enhance learning | 0.16 | 0.01 | 0.81 | −0.02 |
| TII4. Is comfortable using technology to help my learning | 0.20 | −0.03 | 0.79 | 0.03 |
| TII5. Makes technology an integral part of my learning experience | 0.18 | 0.03 | 0.78 | 0.02 |
| TII6. The prerecorded lectures are useful to my learning | 0.04 | 0.21 | 0.40 | −0.04 |
| TII7. The classroom replay videos are useful to my learning | 0.06 | 0.19 | 0.35 | −0.03 |
| TP1. My tutor is able to challenge students to broaden their perspectives | 0.23 | 0.00 | −0.09 | 0.76 |
| TP2. My tutor is able to provide strategies to students to help them understand their learning | 0.19 | 0.08 | −0.07 | 0.75 |
| TP3. My tutor is able to show students how theories are applied | 0.28 | −0.04 | −0.09 | 0.74 |
| TP4. My tutor encourages students to share their learning with peers | 0.07 | 0.10 | 0.03 | 0.60 |
| TP5. My tutor uses learning outcomes to guide students on what to learn | 0.07 | 0.07 | 0.06 | 0.59 |
| TP6. My tutor is able to use assessment to identify gaps in students’ learning | −0.04 | 0.23 | 0.10 | 0.58 |
| TP7. My tutor uses assessment rubrics to guide students’ learning | −0.03 | 0.18 | 0.13 | 0.50 |
| TP8. My tutor summarises main issues covered in each session at the end of each class | 0.23 | 0.11 | 0.10 | 0.39 |
| TP9. My tutor uses past year exam questions to guide students’ learning | 0.08 | 0.08 | 0.09 | 0.35 |
Table 5. Items that contributed to model misfit.

| Item | Possible Reasons for Misfit |
|---|---|
| TC4. Respects students | There could have been multiple interpretations of “respect” and how this might be an indicator of tutors’ characteristics. |
| ACS4. I am able to develop my verbal presentation skills through tutors’ feedback and advice | Presentations are normally marked as part of assessment, and hence, students might view the process as more evaluative than developmental. Further, the final assessment task in general is either a written exam or essay assignment. |
| ACS10. My tutors taught me to be better at using information technology | Learning support in terms of information technology is provided by a unit within the university rather than tutors. |
| TII7. The classroom replay videos are useful to my learning | The replay videos could have helped some but not other groups, particularly if the videos did not capture discussions in discussion-heavy seminars. |
| TP7. My tutor uses assessment rubrics to guide students’ learning | It is not common practice that tutors inform students that rubrics are used for marking. |
Table 6. CFA goodness-of-fit indicators.

| Model | χ2 | χ2diff | df | χ2/df | CFI | RMSEA | SRMR | AIC | SBC |
|---|---|---|---|---|---|---|---|---|---|
| One-factor | 4379.69 * | | 819 | 5.35 | 0.70 | 0.09 | 0.09 | 4547.69 | 4915.48 |
| Second-order four-factor | 1925.48 * | 2454.21 * | 815 | 2.36 | 0.91 | 0.05 | 0.05 | 2101.48 | 2486.78 |
| First-order four-factor | 1902.81 * | 2476.88 *; 22.67 *,# | 813 | 2.34 | 0.91 | 0.05 | 0.05 | 2082.81 | 2476.86 |

Note: χ2 = Satorra–Bentler scaled chi-square statistic; χ2diff was computed relative to the first-order one-factor model; df = degrees of freedom; CFI = comparative fit index; RMSEA = root mean square error of approximation; SRMR = standardised root mean square residual; AIC = Akaike information criterion; SBC = Schwarz Bayesian criterion. # refers to χ2diff relative to the second-order four-factor model. * p < 0.001.
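A minimal sketch of how such fit statistics might be screened against conventional cutoffs (the thresholds below — χ2/df < 3, CFI ≥ 0.90, RMSEA ≤ 0.08, SRMR ≤ 0.08 — are one common set of rules of thumb, not the only defensible choices, and the function is illustrative rather than part of the study’s analysis):

```python
# Screen a fitted model's statistics against conventional cutoffs.
# Thresholds are assumptions for illustration; see Hu & Bentler (1999) [29]
# for stricter alternatives (e.g., CFI >= 0.95, RMSEA <= 0.06).
def acceptable_fit(chi2, df, cfi, rmsea, srmr):
    return (chi2 / df < 3.0
            and cfi >= 0.90
            and rmsea <= 0.08
            and srmr <= 0.08)
```

By this screen, the first-order four-factor model in Table 6 (χ2/df = 2.34, CFI = 0.91, RMSEA = SRMR = 0.05) passes, while the one-factor model (χ2/df = 5.35, CFI = 0.70) does not.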
Table 7. Standardised loadings, AVE, and CR coefficients of the first-order four-factor model.

| Construct and Items | Standardised Loading | AVE | CR |
|---|---|---|---|
| TC (F1) | | 0.64 | 0.89 |
| TC1 | 0.81 | | |
| TC2 | 0.78 | | |
| TC3 | 0.84 | | |
| TC5 | 0.85 | | |
| TC6 | 0.86 | | |
| TC7 | 0.83 | | |
| TC8 | 0.84 | | |
| TC9 | 0.87 | | |
| TC10 | 0.76 | | |
| TC11 | 0.83 | | |
| TC12 | 0.81 | | |
| TC13 | 0.82 | | |
| TC14 | 0.85 | | |
| TC15 | 0.83 | | |
| TC16 | 0.85 | | |
| TC17 | 0.73 | | |
| TC18 | 0.72 | | |
| TC19 | 0.67 | | |
| TC20 | 0.65 | | |
| ACS (F2) | | 0.53 | 0.93 |
| ACS1 | 0.75 | | |
| ACS2 | 0.79 | | |
| ACS3 | 0.76 | | |
| ACS5 | 0.61 | | |
| ACS6 | 0.73 | | |
| ACS7 | 0.74 | | |
| ACS8 | 0.77 | | |
| ACS9 | 0.73 | | |
| ACS11 | 0.68 | | |
| TII (F3) | | 0.73 | 0.96 |
| TII1 | 0.94 | | |
| TII2 | 0.92 | | |
| TII3 | 0.89 | | |
| TII4 | 0.93 | | |
| TII5 | 0.92 | | |
| TII6 | 0.39 | | |
| TP (F4) | | 0.51 | 0.96 |
| TP1 | 0.81 | | |
| TP2 | 0.83 | | |
| TP3 | 0.80 | | |
| TP4 | 0.69 | | |
| TP5 | 0.65 | | |
| TP6 | 0.67 | | |
| TP8 | 0.69 | | |
| TP9 | 0.54 | | |
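AVE and CR are simple functions of the standardised loadings. The sketch below (illustrative, not the study’s analysis code) follows the usual Fornell and Larcker [30] formulas: AVE is the mean of the squared loadings, and CR is the squared sum of loadings over that quantity plus the summed error variances. Computed this way, the TII loadings in Table 7 reproduce that factor’s AVE of 0.73.

```python
# Average variance extracted: mean of squared standardised loadings.
def ave(loadings):
    return sum(l * l for l in loadings) / len(loadings)

# Composite reliability: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
# where each item's error variance is 1 - loading^2.
def composite_reliability(loadings):
    s = sum(loadings)
    e = sum(1 - l * l for l in loadings)
    return s * s / (s * s + e)
```

As a benchmark, AVE ≥ 0.50 is commonly read as adequate convergent validity, which the four factors in Table 7 meet or approach.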
Table 8. Distinctiveness of factors.

| Construct | TC | ACS | TII | TP |
|---|---|---|---|---|
| TC | 0.89 * | | | |
| ACS | 0.58 | 0.85 * | | |
| TII | 0.72 | 0.54 | 0.91 * | |
| TP | 0.79 | 0.70 | 0.64 | 0.84 * |

* refers to the square root of AVE.
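The Fornell–Larcker criterion behind Table 8 requires the square root of each construct’s AVE (the diagonal) to exceed that construct’s correlations with all other constructs (the off-diagonal entries). A hedged sketch of the check, using the diagonal values as reported rather than recomputing them:

```python
# Fornell-Larcker discriminant-validity check.
# sqrt_aves: construct name -> square root of AVE (diagonal of the matrix);
# corr: (construct_a, construct_b) -> interconstruct correlation (off-diagonal).
def discriminant_ok(sqrt_aves, corr):
    return all(r < sqrt_aves[a] and r < sqrt_aves[b]
               for (a, b), r in corr.items())
```

Applied to the values reported in Table 8 (e.g., the largest correlation, TP–TC at 0.79, is below both 0.84 and 0.89), the check passes, supporting the distinctiveness of the four factors.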
Table 9. Factors from the ISQ and LNQ.

| ISQ Factors | LNQ Factors |
|---|---|
| Tutors’ Characteristics * | Preferred Tutors’ Characteristics * |
| Academic Competency Support # | Perceived Academic Competency # |
| Use of Technology in Instruction @ | Use of Technology @ |
| Teaching Practices | Time Management |

*,#,@ Factors that could be used complementarily with each other.
Lim, L.; Ho, Y.Y. Supporting Student Learning Needs in Tertiary Education: Institutional Support Structures Based on the Institutional Support Questionnaire. Behav. Sci. 2022, 12, 277. https://doi.org/10.3390/bs12080277