Article

Students’ Perceptions of ILS as a Learning-Style-Identification Tool in E-Learning Environments

by
Zoran Marosan
1,
Ninoslava Savic
1,
Aleksandra Klasnja-Milicevic
2,*,
Mirjana Ivanovic
2 and
Boban Vesin
3
1
Novi Sad School of Business, 21000 Novi Sad, Serbia
2
Faculty of Sciences, University of Novi Sad, 21000 Novi Sad, Serbia
3
USN School of Business, University of South-Eastern Norway, Borre 3184, Norway
*
Author to whom correspondence should be addressed.
Sustainability 2022, 14(8), 4426; https://doi.org/10.3390/su14084426
Submission received: 27 January 2022 / Revised: 4 April 2022 / Accepted: 6 April 2022 / Published: 8 April 2022
(This article belongs to the Special Issue E-learning Personalization Systems and Sustainable Education)

Abstract:
This paper presents an evaluation of the Index of Learning Styles (ILS), the assessment instrument of the Felder–Silverman learning-style model. A few studies have previously evaluated this tool but, as far as we know, none of them considered the learners’ opinion. Since many studies suggest continuing the evaluation of the ILS, an experimental study was conducted using Protus, an adaptive learning system. To analyse the concurrent validity of the ILS, students’ learning preferences were acquired via two different tools: the ILS and a subjective questionnaire. The results suggest that the ILS is valid for defining learning style at the beginning of the learning process, resolving the cold-start problem. We found some differences between the results of the ILS and the subjective assessment. By enhancing the Protus user interface with new functionality that allows a free choice of learning style during the learning process, we overcome the observed limitations of the ILS. This solution could be implemented in different personalised e-learning environments, regardless of the applied assessment tool, leading to a more reliable student model.

1. Introduction

Different learners have different learning styles (LS), which can be defined as cognitive characteristics and ways of perceiving, interacting with, and responding to the learning environment [1,2]. In traditional classrooms, it is difficult to adapt the teaching process to students’ different learning styles. E-learning systems could resolve this issue by first identifying a learner’s learning style and then delivering content in the way that suits that learner best, providing adaptability and personalisation [3,4].
Most educational researchers agree on the relevance of learning styles in the learning process, notably in online learning contexts. Numerous studies have examined student activities in e-learning environments and their relationship to learning styles [5,6]. In [7], the authors comprehensively surveyed the approaches and algorithms used to predict learning styles, together with their classification and an analysis of their advantages and disadvantages.
Researchers continue to debate whether adapting an e-learning environment to a learner’s learning style causally improves the quality of the learning process and its results. Many researchers believe that a technique for adapting to a particular learner’s learning style is crucial for an adaptive learning system [8,9]. The unifying objective of their research is to adapt courses accurately to learners’ learning preferences.
Some academics downplay the relevance of learning-style adaptation in e-learning systems [10,11]. They mainly argue that adaptation to the learning style brings no substantial improvement in the learning process. Additionally, psychological research shows that studying in a learning environment that is inconsistent with a learner’s preferences may stimulate the learner to gain new abilities [11].
On the other hand, positive views on the impact of an online learning environment’s adaptability to a certain learning style throughout the online training process predominate [12,13]. This approach is supported by experiments with observable findings in [14].
Although learning-style research dates back more than 30 years, the creation of learning-style-adaptive educational systems only began in the first decade of this millennium [15]. According to Cristea and Stash, it is essential to integrate knowledge about a learner’s learning style in an online learning environment, which allows learners to choose the learning method that suits them best and might improve their outcomes [16]. According to Popescu et al., adaptive learning systems can improve satisfaction, efficiency, and effectiveness by adjusting the instruction of the learning process to the learner’s preferred learning style [15].
The role of learning styles in an adaptive learning environment is undeniably worth further investigation. Regardless of the differing viewpoints in related studies, we endorse a research path in which learning-style modelling is a built-in feature of the online learning environment. This would give learners the option of defining their learning style and selecting from a variety of educational strategies.
Among the numerous learning-style models and theories [17,18], we chose the Felder–Silverman learning-style model (FSLSM). The Index of Learning Styles (ILS), the FSLSM psychometric instrument, has not been completely validated yet. This paper provides a subjective assessment, which considers the learners’ opinion, for the ILS evaluation. An online learning system named Protus was developed as an experimental platform. The implementation of learning-style identification (LSI) in Protus and the effects of this process on personalisation are considered and proposed in this paper. Finally, we present the results of the study, in which we compared and analysed students’ learning preferences acquired from two different sources: a subjective questionnaire and the ILS.

2. Related Work

A learning-style model categorises learners based on how they collect and process information, as well as how they acquire knowledge. Researchers have developed distinct learning-style theories, as well as the instruments to assess them. They mainly differ in how they differentiate the most important characteristics of the learning processes. Learning-style models are classified into different LS families based on their cognition of the learning-style paradigm [19].
Attwell argued it would be best to use different learning styles in different contexts, subjects, and knowledge domains, responding to different learning aims and goals, concluding that there is no single best learning style [20]. We selected Felder and Silverman’s learning- and teaching-style model, which focuses on features and learning differences important in engineering education. The four dimensions of learning style are defined as [18,21]: Information processing (Active or Reflective learners); Information reception (Visual or Verbal learners); Information perception (Sensing or Intuitive learners); and Information understanding (Sequential or Global learners). FSLSM is widely used in adaptive educational systems that focus on learning styles, and several academics consider it the ideal model for such systems [22,23]. FSLSM differs from existing models in that it incorporates important learning-style models such as Kolb’s [24] and Pask’s [25]. Several other learning-style models treat learning styles as fixed categories, while FSLSM treats them as tendencies [26]. Özpolat and Akar find this model more suitable for applications covering basic science topics [27]. Although FSLSM has its opponents, we found more positive opinions about it in the literature [28,29].
Some studies have identified navigational behaviour as a crucial learner feature for accurately assessing learning styles in adaptive e-learning systems [30,31,32]. This must be considered when designing an adaptive learning system that aims to achieve an appropriate personalisation function. Most existing e-learning environments, such as CS383 [33], WELSA [34], TSAL [35], E-learning 2.0 [36], iLearn [37], and EDUCA [38], include adaptability characteristics that depend on the student’s navigational behaviour, taking learning styles into account implicitly rather than through a mandatory online questionnaire.
Graf, Liu, and Kinshuk, on the other hand, constrained their study to learners’ activities in an online context only [31]. They investigated the correlation between learners’ diverse learning styles and their preferences and actions throughout the e-learning process by conducting research on learners’ navigational behaviour in an online course in an adaptive learning environment. According to the findings, information regarding variations in learners’ navigational behaviour may be utilised to design a new pattern in learner modelling that automatically recognises learning styles based on learners’ behaviour in an online course.
These research findings were applied in building DeLeS, an online learning environment in which the authors offer an enhanced method for automatic learning-style recognition in an adaptive learning system [39]. DeLeS’s extended structure comprises three different types of data sources: behaviour patterns, cognitive abilities, and navigation patterns. The study’s findings revealed that combining data from multiple sources on students’ learning behaviour (navigation patterns and cognitive traits) might increase the accuracy of learning-style identification. Other studies have searched for new methods of automatic learning-style recognition, some relying on big data analysis [40] and others using EEG signals to detect the learning style [41].
A large group of researchers have devoted their studies to assessing whether the ILS is an appropriate tool that produces valid and reliable results; it has also been applied, for example, in Semantic Web Rule Language-based learning-style modelling for MOOCs [42]. The ILS is a web-based instrument that assesses learners’ preferences on four dimensions: Active/Reflective, Visual/Verbal, Sensing/Intuitive, and Sequential/Global [43]. Some of these studies provided evidence of construct validity for the instrument [44], while others expressed concerns regarding the robustness of the model [45]. Although most of these studies have provided evidence that the ILS instrument is reliable, valid, and suitable for use in education, many authors hold that research on the evaluation of the ILS should continue.
Cook conducted a study of the ILS instrument’s validity on a sample of internal medicine residents [46]. Tests were performed using two different LS instruments: the ILS (administered twice) and the Learning Style Type Indicator (LSTI, administered once), which was used for comparison of the results. He found acceptable reliability and validity of the ILS for assessing two LS dimensions: Active/Reflective and Sensing/Intuitive. However, the data from this study provided only weak support for the validity of the Visual/Verbal and Sequential/Global LS dimensions. Cook indicated that style classifications based on ILS results have variable reproducibility, despite the acceptable reliability of ILS scores. He suggested further research on the ILS instrument’s validity and its comparison with other available tools.
Platsidou and Metallidou [47] compared the psychometric properties of two different learning-style tools: the ILS and the Learning Style Inventory (LSI) [43]. The internal consistency reliability, construct validity, and discriminant validity of these measures were investigated in their study. Based on the results, the authors found psychometric weaknesses and limitations in both instruments. They also criticised the practice of grouping learners according to their learning style as a tool for the adaptation and personalisation of the learning environment [47].
Hosford and Siders evaluated the ILS instrument by assessing the temporal stability, internal consistency, and factor structure of students’ responses to the ILS [48]. Over a two-year study, the findings showed moderate to high reliability, with an acceptable rate of internal consistency. In conclusion, the study found the ILS suitable as a tool for evaluating learning-style preferences.
In their research at North Carolina University, Felkel and Gosky also evaluated the ILS instrument by assessing its reliability, discriminant validity, and construct validity [49]. They tested validity by checking whether the four learning-style dimensions measured by the ILS are non-overlapping concepts. The study concluded that the ILS instrument is a trustworthy and valid tool for assessing learning styles.
Zywno studied the psychometric properties of the ILS instrument in a hypermedia-assisted learning environment [50]. The methodology included test–retest reliability, Cronbach’s alpha and factor analysis, interscale correlation, construct validity, internal consistency, and total item correlation. The study’s findings revealed that the ILS is a useful instrument for assessing engineering students’ learning styles. The author argues that work on ILS evaluation must be continued.
To summarise, the current body of evidence is inconclusive, and further study in this area is needed. Our research question investigates ILS results regarding students’ learning styles by having students first study specially designed lessons in Protus and subsequently fill out a subjective questionnaire. Studying the differences between the preferences obtained from the ILS and the subjective questionnaire is important for improving the design of an adaptive, personalised, and flexible online learning system that allows students to change the presentation of the lessons through its interface.
According to the research question and objectives, the null hypothesis, H0, is defined: There are no differences between the ILS and the subjective questionnaire. The alternative hypotheses are determined, per each dimension:
Hypothesis 1a (H1a).
There is a significant difference between the Active/Reflective preferences obtained from the ILS and subjective questionnaires for the Information processing dimension.
Hypothesis 1b (H1b).
There is a significant difference between the Sensing/Intuitive preferences obtained from the ILS and subjective questionnaires for the Information perception dimension.
Hypothesis 1c (H1c).
There is a significant difference between the Visual/Verbal preferences obtained from the ILS and subjective questionnaires for the Information reception dimension.
Hypothesis 1d (H1d).
There is a significant difference between the Sequential/Global preferences obtained from the ILS and subjective questionnaires for the Information understanding dimension.
To our knowledge, this type of study, which considers the learners’ opinion, has not previously been used for ILS assessment. The results prompted us to enhance the functionality of the user interface, with the expectation of better adaptability of Protus.

3. Adaptive Learning System Protus

In this section, we present an intelligent and adaptive web-based educational system named Protus. It was developed to help learners study different courses, such as Essentials of Programming Languages, E-Business, and Use of Information Technology [32] (Figure 1).
The functionalities of the Protus system are adjusted and highly oriented to learners’ and teachers’ needs. Several essential goals and requirements for modern personalised e-learning systems are also realised in Protus [30]: separation of two distinct interfaces, for students and teachers; high-quality system modularisation; a strong separation of distinct system components (domain module, application module, learner model, and adaptation module) inside a sharable and dynamic learner model; continuous management of learning preferences, progress, and personal learner data; facilitating communication and collaboration among students as well as between students and instructors; knowledge assessment and increasing the learners’ competency level; features for creating new learning content, as well as content transfer from external sources; semantically rich descriptions of the components’ functions to support successful interoperability between system components; and ensuring that the system’s components are properly coordinated and communicate.
In the following sections, we explain how adaptability and personalisation are achieved in this system and present the FSLSM that was chosen to be implemented in Protus.
Adaptability and personalisation in the Protus system are obtained by the implementation of recommendation techniques (e.g., collaborative filtering, clustering, and association rule mining). The Protus system suggests online learning activities or optimal learning sequences based on learners’ interests, knowledge, learning style, and the browsing history of similar learners with comparable characteristics.
The learner model, as one of the constituent components of the core of the Protus system, stores the information needed to predict student behaviour and thus achieve adaptation. That information is about [32]: the learner, with cognitive, affective, and social characteristics; the hardware and software characteristics of the learner’s environment; the learner’s knowledge and feedback on the content; and the way the learner interacts with online content, including noted metrics such as the learner’s number of keystrokes, dwell time, and patterns of access.
All those data are categorised through three layers: the learner’s performance, objective information, and learning path.
To make Protus intelligent and adaptive, an automatic recommendation system was built. It contains three modules: a learner system-interaction module, an offline module, and a recommendation engine (Figure 2).
The learner system-interaction module records all of a student’s activities such as visited pages, sequence patterns, test outcomes, and grades obtained, and saves them into the server logs. Those data are combined with information previously collected through the learner’s registration process and the learning-style survey. This module also keeps track of the added, modified, and deleted tags.
The offline module is activated periodically, and its goal is to filter the learning content based on the course’s current state, learners’ tags, and learners’ affiliations.
The recommendation engine generates a list of recommendations for each created cluster, based on tags posted by learners or educators and the frequent-sequence evaluation supplied by Protus.
The recommendation module of Protus relies on a specific learner’s style, determined by the results of an initial FSLSM-based questionnaire that learners fill in as their first activity after completing registration in the system.

4. Materials and Methods

Felder and Soloman’s ILS is used as the data-gathering instrument for evaluating learning styles [43]. The ILS is a 44-question multiple-choice instrument that assesses personal learning-style preferences along four dimensions: Information Perception, Information Understanding, Information Reception, and Information Processing. The data from this questionnaire are used to generate suitable clusters, i.e., groups of students with similar learning patterns. These results directly influence the look and content of the learner’s interface, determining how each lesson is presented based on the learner’s preferred style.
The Information-Processing dimension distinguishes learners who are activity-oriented, named Activists, from those who are example-oriented, called Reflectors [21]. Active learners are more likely to recall and grasp information when they do something active with it, such as reviewing, practising, or explaining it to others. Reflectors prefer to gather and analyse information before proceeding with any action. Faced with an active learner, the Protus 2.1 system first presents the activity, then a theoretical explanation, followed by a clarification and an example. The order is different for a reflective learner: an example is shown first, followed by an explanation and theory, and at the end the learner is asked to perform an activity.
The Information-Perception dimension defines sensing learners, named Sensors, who are known for their patience with details, their ability to memorise knowledge, and their aptitude for laboratory work. Intuitors are more skilled than Sensors at acquiring new ideas, concepts, and complex mathematical formulations and abstractions. Sensing learners work more slowly and are less inventive than intuitive learners. Sensors are more practical and cautious, while Intuitors dislike repetition and value innovation.
Sensing learners, for example, are expected to be interested in supplementary resources; thus, they can click on the “additional material” button on the screen interface. On the other hand, the interface offers intuitors formulas, abstract material, and concepts. Specific syntax rules or block diagrams are used to provide adequate explanations. The Information-Reception dimension defines verbal and visual learners. Visual learners recall better what they look at in visual forms such as diagrams, drawings, demonstrations, timelines, and flowcharts. Words in the form of written and spoken explanations have a greater impact on verbal learners.
The Information-Understanding dimension defines global and sequential learners. Sequential learners tend to follow the learning material in a straight line, with each stage logically following the one before it. Learners in Protus 2.1 go through lessons in a predetermined order, according to the criteria of the Sequential learning style.
Global learners tend to learn by performing big jumps, passing over learning objects and moving on to more sophisticated information. They are given a broad overview of the course, along with brief descriptions of each unit and the option of gaining access to the unit they choose by clicking the unit hyperlinks rather than completing the course in the sequence.

4.1. Participants

The research was conducted over three months within two courses (E-Business and Information Technology Implementation) in the second and third year of a study programme in Economics at the Novi Sad School of Business. In total, 71 learners took part voluntarily. The proportion of valid questionnaires was 95.77%; 3 questionnaires were incomplete and their data were discarded, leaving 68 cases for analysis. Regarding gender, 47 participants were female and 21 were male. They were 24 years old on average (MAD = 2.55). Figure 3 shows that most of the students were 22 or 23 years old.
The participants were enrolled in four different economics subspecialties: Finance (20), Trade (14), Entrepreneurship (28), and Tourism (6).

4.2. Procedure

We conducted a survey using a subjective questionnaire in order to investigate the learners’ preferences during the learning process, and compared its results with the results obtained using the ILS. The procedure was subdivided into two phases: Experimental Phase 1 and Experimental Phase 2.

4.2.1. Experimental Phase 1

At the beginning of learning with the Protus system, learners filled out the ILS questionnaire (based on FSLSM) to predict their initial learning styles.
Each of the four FSLSM dimensions was covered with 11 questions. The learner could respond to each question by choosing one of the two offered answers. Each answer had an impact on the final result, leading it towards one of two categories within the corresponding learning styles’ dimension.
The system assigns one point to each learner’s answer in the relevant field, which is then entered into the related table (Table 1). In the next step, the system sums the numbers in each column. The numbers of replies identified as A and B determine the final index; in effect, the index is half the difference between the B and A counts, rounded away from zero. Therefore, if all 11 responses were of type A, the index would be −6. In the case of ten responses of type A and one of type B, the index would be −5. The index would be −4 in the case of nine type-A answers and two type-B answers, and so on.
If the resulting index has a value between −2 and 2, the learner is considered “fairly well-balanced” and has only a mild preference for one of the dimension’s categories. If the value of the index is −4, −3, 3, or 4, the learner has a moderate propensity for one of the categories (the “moderate preference” type) and will find it easier to study in an environment that prioritises that category. If the index is −6, −5, 5, or 6, the student is strongly inclined towards one of the categories (the “strong preference” type) and may have problems in a learning environment that supports the opposite category.
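As an illustration, the scoring and classification procedure described above can be sketched in Python; `ils_index` and `classify` are hypothetical helper names, not part of Protus’s actual implementation:

```python
def ils_index(answers):
    """Compute the compressed ILS index (-6..6) for one dimension.

    `answers` is a list of 11 responses, each 'A' or 'B'. The index is
    half the difference between the B and A counts, rounded away from
    zero, so 11 type-A answers give -6, ten give -5, nine give -4, etc.
    """
    a = answers.count('A')
    b = answers.count('B')
    diff = b - a                      # ranges over odd values -11..11
    magnitude = (abs(diff) + 1) // 2  # ceil(|diff| / 2)
    return magnitude if diff > 0 else -magnitude


def classify(index):
    """Map an index to the preference strength described above."""
    if abs(index) <= 2:
        return 'fairly well-balanced'
    if abs(index) <= 4:
        return 'moderate preference'
    return 'strong preference'
```

For example, ten type-A answers and one type-B answer yield an index of −5, which `classify` maps to a strong preference.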
Protus then determines the numerical value of the learning style and sets up the corresponding learner model. Setting up the learner model is critical in determining the initial options that lead to the system’s personalisation.

4.2.2. Experimental Phase 2

To assess the extent to which the results of the ILS questionnaire match the learner’s requirements, we integrated into Protus, specifically for this study, newly designed lessons. For each of the four previously mentioned dimensions, we created two lessons on the same topic, each designed to illustrate one of the opposing learning styles. Thus, if the ILS questionnaire classified a learner in the visual category, he/she was asked to learn from lessons designed in two opposite ways: visual, but also verbal. After learning from both versions of the same lesson for each of the four dimensions, learners were asked to fill out a subjective questionnaire consisting of 11 questions that inquired, from different points of view, which of the two presented lessons was more appropriate for the learner based on their own studying experience (Figure 4).

5. Data Analysis

During the practical use of Protus, we noticed a nonuniform segmentation of learning categories; learning styles are not equally distributed among students. For example, within the Information-Perception dimension, nearly 80% of students had a sensing learning style, while only a small proportion had an intuitive learning style. We assigned each learner to one of the two categories within each dimension based on the previously calculated index. Each category included all learners with the same preference: fairly balanced, moderate, or strong. Figure 5 presents a comparison of the established learning-style preferences within all four dimensions.
The learners’ answers were processed in the same way as the answers from the ILS questionnaire. As a result, each learner was classified into one of three LS types with respect to the two opposite lesson types within each dimension: fairly well-balanced, moderate preference, or strong preference.
In the next step, we assigned numeric values for each of those learning-style types within each of the four dimensions. Consequently, we gave each student, for each of the four dimensions, a numeric value in the range from −2 to 2, depicting in that way his/her inclination toward one of the two opposite learning styles. Table 2 shows the distribution of those values for each possible instance.
We repeated this for the results obtained from the ILS questionnaire. Therefore, every learner, for each of the four dimensions, was classified into one of the five possible types, first based on the findings of the ILS questionnaire and second based on the results of the questionnaire filled in after learning from the specially designed lessons. To estimate the extent to which those two values vary for each learner, we calculated the absolute value of their difference (Table 3).
The minimum value of the absolute difference is zero, when both questionnaires classify the learner into the same learning-style group. The maximum value is 4, when the two questionnaires categorise the learner into completely opposite learning-style types.
Finally, after calculating the average value of all the absolute differences, we obtained the mean absolute difference for each learning-style dimension.
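The mean absolute difference amounts to a simple per-dimension average; a minimal Python sketch (a hypothetical helper, assuming the −2..2 type values described earlier):

```python
def mean_absolute_difference(ils_types, subjective_types):
    """Mean absolute difference between the two classifications of the
    same learners for one dimension, each type value lying in -2..2."""
    diffs = [abs(i, ) if False else abs(i - s)
             for i, s in zip(ils_types, subjective_types)]
    return sum(diffs) / len(diffs)
```

For example, a learner classified as −2 by the ILS and 2 by the subjective questionnaire contributes the maximum difference of 4 to the average.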
To investigate the reliability and validity of our survey, along with the other numeric data presented in the tables in this paper, we used two software products. Simple calculations were performed in Microsoft Excel 2016, while for the more sophisticated ones we used the Statistical Package for the Social Sciences (SPSS), version 26. SPSS was used to calculate Cronbach’s alpha, which characterises the internal consistency and reliability of the subjective questionnaire. We also used SPSS to perform an Exploratory Factor Analysis (EFA). One of the resulting EFA tables, the Rotated Component Matrix, provided the data needed to calculate Composite Reliability (CR) and Average Variance Extracted (AVE), the latter of which was used to confirm the convergent validity of our survey. To establish discriminant validity, we used the SPSS Bivariate Correlation module based on Pearson’s correlation coefficient. McDonald’s omega, an additional indicator of the reliability of our survey, was calculated using the Hayes Omega macro in SPSS.
Cronbach’s alpha was used to assess the internal consistency and reliability of the subjective questionnaire in all four dimensions. The results in Table 4 indicate a high Cronbach’s alpha, slightly lower for the Sensing/Intuitive dimension. The same conclusions were drawn after calculating McDonald’s omega, whose values were almost identical to those of Cronbach’s alpha.
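For reference, Cronbach’s alpha for a k-item scale can be computed directly from the item scores. The sketch below uses only plain Python and a hypothetical `cronbach_alpha` helper (the study itself used SPSS):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    `item_scores` holds one list per item, each containing the scores
    that every respondent gave to that item (population variances used).
    """
    k = len(item_scores)

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

Two perfectly correlated items yield an alpha of 1, while weaker inter-item correlation pulls the value down.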
In addition, we calculated values of Composite Reliability (CR) and Average Variance Extracted (AVE). CR was calculated according to the following formula.
CR = \frac{\left( \sum_{i=1}^{n} \lambda_i \right)^2}{\left( \sum_{i=1}^{n} \lambda_i \right)^2 + \sum_{i=1}^{n} \varepsilon_i},
where i refers to the number of items ranging from 1 to n; n represents the total number of items; λi is the standardised factor loading for item i; and εi is the error variance of item i (εi = 1 − λi2).
The value of the AVE was obtained by using the following formula:
AVE = \frac{\sum_{i=1}^{n} \lambda_i^2}{n},
where i refers to the number of items ranging from 1 to n; n represents the total number of items; and λi is the standardised factor loading for item i.
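The two formulas above can be checked with a short Python sketch (hypothetical helper names; the paper’s own loadings came from the SPSS Rotated Component Matrix):

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    with error variance e_i = 1 - loading_i**2 for standardised loadings."""
    total = sum(loadings)
    error_sum = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error_sum)


def average_variance_extracted(loadings):
    """AVE: mean of the squared standardised factor loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)
```

With three loadings of 0.8, for example, AVE is 0.64, comfortably above the conventional 0.5 threshold.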

6. Results

As shown in Table 5, the AVE values are considerably above the acceptable threshold of 0.5 for all dimensions except Sensing/Intuitive, indicating good convergent validity overall. The same holds for composite reliability: for all four dimensions of our subjective survey, the CR values are above the minimum acceptable value of 0.7.
The discriminant validity of our survey was investigated as well. For each style dimension, the square root of the average variance extracted was higher than that dimension’s correlations with the other dimensions (Table 6), suggesting acceptable discriminant validity.
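This criterion (often called the Fornell–Larcker criterion) can be checked mechanically. The sketch below reuses the AVE values and correlations reported in Table 6; the function name is ours:

```python
import numpy as np

# AVE per dimension and inter-dimension correlations (values from Table 6);
# order: Act/Ref, Sen/Int, Vis/Ver, Seq/Glob
ave = np.array([0.846, 0.579, 0.751, 0.807])
corr = np.array([
    [1.000,  0.100,  0.275,  0.004],
    [0.100,  1.000,  0.160, -0.131],
    [0.275,  0.160,  1.000, -0.113],
    [0.004, -0.131, -0.113,  1.000],
])

def fornell_larcker_ok(ave, corr) -> bool:
    """True if every dimension's sqrt(AVE) exceeds its correlations with the others."""
    root_ave = np.sqrt(ave)
    n = len(ave)
    off_diag = corr[~np.eye(n, dtype=bool)].reshape(n, n - 1)
    return bool(np.all(root_ave[:, None] > np.abs(off_diag)))

print(fornell_larcker_ok(ave, corr))  # True: discriminant validity supported
```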
The comparison of the results obtained from the ILS and the subjective questionnaire showed that the lowest mean absolute difference, 1.28, occurred in the Sensing/Intuitive dimension, while the highest, 1.79, occurred in the Sequential/Global and Active/Reflective dimensions; the value for the Visual/Verbal dimension was 1.42. Although divergences appear in all four dimensions, they remain well below the maximum possible value of 4.
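With the numeric coding of Table 2 (indices from −2 to 2), the mean absolute difference for a dimension reduces to a one-line computation. The per-student indices below are hypothetical, not rows of Table 3:

```python
import numpy as np

# Hypothetical style indices (-2 ... 2) for one dimension, per Table 2 coding
ils_index  = np.array([1, -1, 2, -1, 2, 1])
subj_index = np.array([0, -2, 2,  0, -2, 2])

# Mean of the per-student absolute differences between the two instruments
mean_abs_diff = float(np.abs(ils_index - subj_index).mean())
print(round(mean_abs_diff, 2))  # 1.33
```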
Results obtained from both questionnaires were also interpreted by counting how many times every student answered each of the 11 questions inclined to one specific style in each of the four dimensions. The mean values of all scores for each dimension are presented in Table 7.
The values of the ILS mean scores are very close to the subjective questionnaire ones. Their absolute differences vary from 0.34 for the Sensing/Intuitive dimension to 1.79 for the Sequential/Global dimension.
Results obtained from the subjective questionnaire and the ILS regarding the distribution of different learning styles among students are shown in Figure 6.
The distribution is almost the same in the Information-Perception and Information-Processing dimensions and marginally different in the Information-Reception dimension. Once again, the Information-Understanding dimension shows the most divergent figures: according to the ILS, the sequential learning style prevails (65%), while the subjective questionnaire suggests that the global learning style is the most common among the students (58%).
Comparing the results gathered by the ILS and the subjective questionnaire, we also calculated the percentage of students who changed their learning style. In line with the view that “fairly balanced” categories show only a weak leaning towards either of the two opposite learning styles, we counted only cases in which a “strong” or “moderate” preference for one learning style shifted to a “strong” or “moderate” preference for the opposite one. The percentage of style changes is lowest in the Information-Reception dimension (9.09%), somewhat higher in the Information-Perception dimension, and highest in the Information-Processing (19.18%) and Information-Understanding (19.40%) dimensions.
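The counting rule just described can be expressed as a small predicate: a student “switches” only when both instruments report at least a moderate preference (|index| ≥ 1 in the Table 2 coding) and the signs disagree. The indices here are hypothetical:

```python
# Hypothetical per-student indices (-2 ... 2) for one dimension
ils  = [ 2, 1, -2, 0,  1, 2, -1, 1]
subj = [-1, 1,  2, 1, -2, 2,  1, 0]

def switched(a: int, b: int) -> bool:
    """Moderate/strong preference on one side by the ILS, opposite side subjectively."""
    return abs(a) >= 1 and abs(b) >= 1 and a * b < 0

rate = sum(switched(a, b) for a, b in zip(ils, subj)) / len(ils)
print(f"{rate:.2%}")  # 50.00%
```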
In order to test our hypotheses, a paired-samples t-test was conducted in SPSS for each learning-style dimension to determine whether there was a significant difference between the results obtained by the ILS questionnaire and those gathered by the subjective questionnaire. For the purpose of this test, preferences previously presented as strong and weak were merged into a single category.
The data presented in Table 8, designed to test hypothesis H1a, suggest that there was a significant difference between results obtained by the ILS questionnaire (M = 2.515; SD = 0.702) and those obtained by the subjective questionnaire (M = 1.765; SD = 0.601); [t(67) = 6.720, p < 0.001]. We can therefore reject the null hypothesis that within the Information-Processing dimension there is no difference between the ILS’s and subjective questionnaire’s results.
The testing of hypothesis H1b was conducted using the data shown in Table 9, which led us to conclude that there was no statistically significant difference between the results obtained by the ILS questionnaire (M = 2.206; SD = 0.771) and those obtained by the subjective questionnaire (M = 2.059; SD = 0.722); [t(67) = 1.559, p = 0.124]. Since p > 0.05, we failed to reject the null hypothesis that within the Information-Perception dimension there is no difference between the results of the ILS and the subjective questionnaires.
From the data presented in Table 10, related to the testing of hypothesis H1c, we concluded that there was a statistically significant difference between Visual/Verbal preferences obtained by the ILS questionnaire (M = 2.338; SD = 0.745) and those obtained by the subjective questionnaire (M = 1.853; SD = 0.579); [t(67) = 4.500, p < 0.001]. For that reason, we can reject the null hypothesis that within the Information-Reception dimension there is no difference between the ILS’s and subjective questionnaire’s results.
The testing of hypothesis H1d was based on data presented in Table 11. The presented indicators suggest that there was a statistically significant difference between results obtained by the ILS questionnaire (M = 2.544; SD = 0.609) and those obtained by the subjective questionnaire (M = 1.573; SD = 0.676); [t(67) = 8.462, p < 0.001]. On that account, we can reject the null hypothesis that within the Information-Understanding dimension there is no difference between the ILS’s and subjective questionnaire’s results.
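For completeness, the paired-samples t statistic used in Tables 8–11 can be computed by hand from matched observations. A self-contained sketch on hypothetical paired scores (SPSS, or a library routine such as scipy.stats.ttest_rel, would give the same t and df):

```python
import math

def paired_t_test(x, y):
    """Paired-samples t statistic and degrees of freedom for two matched samples."""
    d = [a - b for a, b in zip(x, y)]                    # per-subject differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance of differences
    t = mean_d / math.sqrt(var_d / n)                    # mean diff / its standard error
    return t, n - 1

# Hypothetical paired preference categories for six students
ils  = [3, 2, 3, 1, 2, 3]
subj = [2, 2, 1, 1, 2, 2]
t, df = paired_t_test(ils, subj)
print(round(t, 3), df)  # 2.0 5
```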
The results of our study suggest that the ILS is not a fully reliable tool for making precise and final conclusions about a learner’s learning style, but its results are still a good starting point for defining the initial learner model.
According to certain studies, a learner’s learning style might alter depending on the learning activity [51,52]. In addition, learning styles may change with the learning content and learning duration. Our study, conducted to determine the extent to which the results gathered by the ILS are accurate, also confirmed the need to give learners the possibility of changing the presentation method of the lessons during the learning process. Accordingly, we enhanced the user interface of Protus with one new functionality: the experience bar. Using the experience bar, learners may freely choose among presentation approaches and styles throughout the remainder of the course (Figure 7).

7. Discussion and Conclusions

Information about personal learning preferences, referred to in the scientific community as individual LS, is essential for achieving adaptability and personalisation, two important features of modern e-learning environments. Among the numerous models for LS representation, we presented Felder and Silverman’s learning-style model, its principles, and its practical use for LS representation in an e-learning environment. The main reasons for this choice were the wide use of the FSLSM, its flexibility and suitability, and, most importantly, the existence of the ILS as an accepted assessment tool associated with this learning model.
Considering all the previously investigated controversies of the learning-style paradigm presented in the introductory theoretical review, and the necessity of continuing with the evaluation of the ILS, we conducted a study among students of the Novi Sad School of Business, measuring the concurrent validity of the ILS instrument. An adaptive learning system was developed as an experimental platform for studying the use of LS identification in the process of personalisation. During the study, we introduced a subjective questionnaire as a control tool for the assessment of the ILS. Namely, we acquired students’ learning preferences using two instruments: the ILS at the beginning, and a subjective questionnaire later. This subjective assessment, which considers the learners’ opinions, was the basis for our conclusions about the ILS’s validity.
The results of the study suggest satisfactory validity of the ILS as a tool for defining the initial learner model at the beginning of the learning cycle. A new finding is the difference between the results of the ILS and those of the subjective assessment.
This part of the results, which answers our research question, suggests that the ILS is not a tool for making comprehensive and final conclusions about a learner’s LS. The test for the Information-Perception dimension failed to reject the null hypothesis, while for the other three dimensions the differences were statistically significant enough to reject the null hypothesis that the ILS’s and the subjective questionnaire’s results do not differ. Consequently, it is unproductive to keep a learner’s learning style fixed throughout the course, particularly if the learner is dissatisfied with his or her existing learning style.
Our research results show that a balanced approach can improve the process of designing an adaptive and flexible online learning system. That goal can be reached by first implementing an additional subjective questionnaire that enhances the results of the ILS questionnaire. Independently of the LS assigned at the initial ‘cold start’, the extension of Protus with the experience bar allows students to change the initial interface design by choosing the presentation method that they find most suitable. This additional information can be included in the calculation of learning styles; incorporating more data in the calculation leads to a more reliable result and therefore improves student modelling. Moreover, this solution may find wider application in different types of personalised e-learning environments, regardless of the applied assessment tool.
A limitation of our study is the uniformity of the participants regarding their study programme: only students enrolled in Economics participated in the experiment. Another limitation is that the lessons the students studied before filling out the subjective questionnaire covered only subjects from the Information Technology area. Finally, most of the participants were about the same age (22–24).
Further research of the ILS instrument’s validity and its comparison with other available tools would be valuable. In addition, future research should investigate the evaluation and prediction of student goals, knowledge gaps, motivation, values, trust, and other variables critical to the learning process by analysing the success achieved through the possibility to change learning style using the experience bar, as an adaptive tool for learning-style selection.

Author Contributions

Conceptualisation, Z.M., B.V., N.S., M.I. and A.K.-M.; data curation, Z.M. and N.S.; formal analysis, Z.M., N.S. and A.K.-M.; investigation, A.K.-M., N.S. and Z.M.; methodology, N.S., Z.M., A.K.-M. and M.I.; resources, Z.M., N.S., A.K.-M. and B.V.; software, B.V.; supervision, M.I.; validation, Z.M., A.K.-M. and N.S.; visualisation, Z.M., N.S., A.K.-M. and B.V.; writing—original draft, N.S., A.K.-M., Z.M. and B.V.; writing—review and editing, Z.M., M.I., N.S. and A.K.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Ethics Committee of the Novi Sad School of Business (2 September 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

This research has been approved by the Ethics Committee of the Novi Sad School of Business.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

1. Keefe, J. Learning style: An overview. In Student Learning Styles: Diagnosing and Prescribing Programs; Keefe, J., Ed.; National Association of Secondary School Principals: Reston, VA, USA, 1979; pp. 1–17.
2. Katsaris, I.; Vidakis, N. Adaptive e-learning systems through learning styles: A review of the literature. Adv. Mob. Learn. Educ. Res. 2021, 1, 124–145.
3. Dantas, L.A.; Cunha, A. An integrative debate on learning styles and the learning process. Soc. Sci. Humanit. Open 2020, 2, 100017.
4. Weng, F.; Ho, H.-J.; Yang, R.-J.; Weng, C.-H. The Influence of Learning Style on Learning Attitude with Multimedia Teaching Materials. Eurasia J. Math. Sci. Technol. Educ. 2018, 15, em1659.
5. Bernard, J.; Chang, T.W.; Popescu, E.; Graf, S. Learning style Identifier: Improving the precision of learning style identification through computational intelligence algorithms. Expert Syst. Appl. 2017, 75, 94–108.
6. Costa, R.D.; Souza, G.; Valentim, R.A.M.; Castro, T.B. The theory of learning styles applied to distance learning. Cogn. Syst. Res. 2020, 64, 134–145.
7. Raleiras, M.; Nabizadeh, A.H.; Costa, F.A. Automatic learning styles prediction: A survey of the State-of-the-Art (2006–2021). J. Comput. Educ. 2022, 1–93.
8. Alzain, A.M.; Clark, S.; Jwaid, A.; Ireson, G. Adaptive education based on learning styles: Are learning style instruments precise enough? Int. J. Emerg. Technol. Learn. 2018, 13, 41–52.
9. Khamparia, A.; Pandey, B. Association of learning styles with different e-learning problems: A systematic review and classification. Educ. Inf. Technol. 2020, 25, 1303–1331.
10. Freedman, R.D.; Stumpf, S.A. Learning Style Theory: Less than Meets the Eye. Acad. Manag. Rev. 1980, 5, 445–447.
11. Kozhevnikov, M.; Evans, C.; Kosslyn, S.M. Cognitive style as environmentally sensitive individual differences in cognition: A modern synthesis and applications in education, business, and management. Psychol. Sci. Public Interest 2014, 15, 3–33.
12. Azzi, I.; Jeghal, A.; Radouane, A.; Yahyaouy, A.; Tairi, H. A robust classification to predict learning styles in adaptive E-learning systems. Educ. Inf. Technol. 2020, 25, 437–448.
13. Joseph, M.R. Learning Styles of Learners and Its Importance in Instruction. J. Appl. Sci. Res. 2019, 7, 67–76.
14. Ocepek, U.; Bosnić, Z.; Šerbec, I.N.; Rugelj, J. Exploring the relation between learning style models and preferred multimedia types. Comput. Educ. 2013, 69, 343–355.
15. Popescu, E. Adaptation provisioning with respect to learning styles in a Web-based educational system: An experimental study. Comput. Assist. Learn. 2010, 26, 243–257.
16. Cristea, A.; Stash, N. AWELS: Adaptive Web-Based Education and Learning Styles. In Proceedings of the Sixth IEEE International Conference on Advanced Learning Technologies (ICALT’06), Kerkrade, The Netherlands, 5–7 July 2006; pp. 1135–1136.
17. Coffield, F.; Moseley, D.; Hall, E.; Ecclestone, K. Learning Styles and Pedagogy in Post-16 Learning: A Systematic and Critical Review; The Learning and Skills Research Centre: London, UK, 2004.
18. Felder, R.; Silverman, L. Learning and teaching styles in Engineering education. Eng. Educ. 1988, 78, 674–681.
19. Coffield, F.; Moseley, D.; Hall, E.; Ecclestone, K. Should We Be Using Learning Styles? The Learning and Skills Research Centre: London, UK, 2004.
20. Attwell, G. Personal Learning Environments—The future of eLearning? Elearn. Pap. 2007, 2, 1–8. Available online: http://www.elearningeuropa.info/out/?doc_id=9758&rsr_id=11561 (accessed on 10 October 2021).
21. Felder, R.M. Matters of Style. ASEE Prism 1996, 6, 18–23.
22. Huang, E.Y.; Lin, S.W.; Huang, T.K. What type of learning style leads to online participation in the mixed-mode e-learning environment? A study of software usage instruction. Comput. Educ. 2012, 58, 338–349.
23. Kuljis, J.; Liu, F. A Comparison of Learning Style Theories on the Suitability for elearning. In Proceedings of the Conference on Web Technologies, Applications, and Services, Calgary, AB, Canada, 4–6 July 2005; pp. 191–197.
24. Kolb, D.A. Experiential Learning: Experience as The Source of Learning and Development; Prentice Hall: Hoboken, NJ, USA, 1984.
25. Pask, G. Styles and Strategies of Learning. Br. J. Educ. Psychol. 1976, 46, 128–148.
26. Graf, S.; Lin, T. The relationship between learning styles and cognitive traits—Getting additional information for improving student modelling. Comput. Hum. Behav. 2008, 24, 122–137.
27. Özpolat, E.; Akar, G.B. Automatic detection of learning styles for an e-learning system. Comput. Educ. 2009, 53, 355–367.
28. Al-azawei, A.; Parslow, P.; Lundqvist, K. Investigating the effect of learning styles in a blended e-learning system: An extension of the technology acceptance model (TAM). Australas. J. Educ. Technol. 2017, 33, 1–23.
29. Özyurt, Ö.; Özyurt, H. Learning style based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Comput. Hum. Behav. 2015, 52, 349–358.
30. Klašnja-Milićević, A.; Vesin, B.; Ivanović, M.; Budimac, Z. Integration of recommendations into Java tutoring system. In Proceedings of the 4th International Conference on Information Technology ICIT 2009, Amman, Jordan, 3–5 June 2009.
31. Graf, S.; Liu, T.C.; Kinshuk. Analysis of learners’ navigational behaviour and their learning styles in an online course. J. Comput. Assist. Learn. 2010, 26, 116–131.
32. Klašnja-Milićević, A.; Vesin, B.; Ivanović, M.; Budimac, Z. E-Learning personalization based on hybrid recommendation strategy and learning style identification. Comput. Educ. 2011, 56, 885–899.
33. Carver, C.; Howard, R.; Lane, W. Addressing different learning styles through course hypermedia. IEEE Trans. Educ. 1999, 42, 33–38.
34. Popescu, E. An artificial intelligence course used to investigate students’ learning style. In Proceedings of the Advances in Web Based Learning-ICWL Conference, Jinhua, China, 20–22 August 2008; pp. 122–131.
35. Tseng, J.; Chu, H.; Hwang, G.; Tsai, C. Development of an adaptive learning system with two sources of personalization information. Comput. Educ. 2008, 51, 776–786.
36. Holenko Dlab, M.; Hoić-Božić, N. An Approach to Adaptivity and Collaboration Support in a Web-Based Learning Environment. Int. J. Emerg. Technol. Learn. 2009, 4, 28–30.
37. Peter, S.E.; Bacon, E.; Dastbaz, M. Adaptable, personalized e-learning incorporating learning styles. Campus-Wide Inf. Syst. 2010, 27, 91–100.
38. Cabada, R.Z.; Barron, E.M.L.; Reyes Garcia, C.A. EDUCA: A web 2.0 authoring tool for developing adaptive and intelligent tutoring systems using a Kohonen network. Expert Syst. Appl. 2011, 38, 9522–9529.
39. Kumar, V.; Graf, S. Causal competencies and learning styles: A framework for adaptive instruction. J. E-Learn. Knowl. Soc. 2011, 7, 13–32.
40. Hmedna, B.; El Mezouary, A.; Baz, O. A predictive model for the identification of learning styles in MOOC environments. Cluster Comput. 2020, 23, 1303–1328.
41. Zhang, B.; Chai, C.; Yin, Z.; Shi, Y. Design and Implementation of an EEG-Based Learning-Style Recognition Mechanism. Brain Sci. 2021, 11, 613.
42. Agarwal, A.; Mishra, D.S.; Kolekar, S.V. Knowledge-based recommendation system using semantic web rules based on Learning styles for MOOCs. Cogent Eng. 2022, 9, 2022568.
43. Felder, R.M.; Soloman, B. Index of Learning Styles Questionnaire. NC State University. Available online: http://www.engr.ncsu.edu/learningstyles/ilsweb.html (accessed on 19 April 2020).
44. Litzinger, T.A.; Lee, S.H.; Wise, J.C.; Felder, R. A study of the reliability and validity of the Felder-Soloman Index of Learning Styles. In Proceedings of the ASEE Annual Conference, American Society for Engineering Education, Portland, OR, USA, 12–15 June 2005; pp. 10.95.1–10.95.16.
45. Van Zwanenberg, N.; Wilkinson, L.J.; Anderson, A. Felder and Silverman’s Index of Learning Styles and Honey and Mumford’s Learning Styles Questionnaire: How do they compare and do they predict academic performance? Educ. Psychol. 2000, 20, 365–380.
46. Cook, D.A. Reliability and validity of scores from the index of learning styles. Acad. Med. 2005, 80, S97–S101.
47. Platsidou, M.; Metallidou, P. Validity and reliability issues of two learning style inventories in a Greek Sample: Kolb’s Learning Style Inventory and Felder & Soloman’s Index of Learning. Int. J. Teach. Learn. High. Educ. 2009, 20, 324–335.
48. Hosford, C.C.; Siders, W. Felder-Soloman’s Index of Learning Styles: Internal consistency, temporal stability, and factor structure. Teach. Learn. Med. 2010, 22, 298–303.
49. Felkel, B.; Gosky, R. A Study of Reliability and Validity of the Felder-Soloman Index of Learning Styles for Business Students. In Proceedings of the Annual International Conference on Technology in Collegiate Mathematics, Orlando, FL, USA, 22–25 March 2012.
50. Zywno, M. A contribution to validation of score meaning for Felder-Soloman’s index of learning styles. In Proceedings of the 2003 American Society for Engineering Education, Nashville, TN, USA, 22–25 June 2003.
51. Crockett, K.; Latham, A.; Whitton, N. On predicting learning styles in conversational intelligent tutoring systems using fuzzy decision trees. Int. J. Hum. Comput. Stud. 2017, 97, 98–115.
52. Yonghou, L.; Rui, X.; Ke, W. A Study on the Influence of the Matching between Learning Style and Teaching Style on the Academic Record of English Majors. Br. J. Educ. Soc. Behav. Sci. 2016, 15, 1–8.
Figure 1. The user interface of Protus.
Figure 2. The recommendation system in Protus.
Figure 3. The distribution of students among different age categories.
Figure 4. Subjective questionnaire.
Figure 5. Learning styles’ distribution.
Figure 6. Learning styles distribution based on ILS and subjective questionnaire.
Figure 7. Experience bar.
Table 1. An example of a complete table for determining learning-style types.
Active/ReflexiveSensitive/IntuitiveVisual/VerbalGlobal/Sequential
QuestionABQuestionABQuestionABQuestionAB
11 21 3 141
5 16 171 81
9 110 111 1121
13 114 1151 161
171 18 1191 201
211 22 123 124 1
25 1261 271 281
291 30 131 1321
331 341 35 1361
371 38 1391 401
411 42 1431 441
Overall (sum of marks within one column)
Active/ReflexiveSensitive/IntuitiveVisual/VerbalGlobal/Sequential
AB AB AB AB
Sum74Sum38Sum65Sum101
Index of the particular style
−23−1−5
Learning style type
Fairly well balancedModerate preference for IntuitiveFairly well balancedStrong preference for Global
Table 2. Assignment of numeric values to learning styles.

Learning Style Dimension | Learning Style | Well-Balanced | Moderate Preference | Strong Preference
Information processing | Active | 0 | 1 | 2
Information processing | Reflective | 0 | −1 | −2
Information perception | Sensing | 0 | 1 | 2
Information perception | Intuitive | 0 | −1 | −2
Information reception | Visual | 0 | 1 | 2
Information reception | Verbal | 0 | −1 | −2
Information understanding | Sequential | 0 | 1 | 2
Information understanding | Global | 0 | −1 | −2
Table 3. Calculation of the mean absolute difference.

Student | Active or Reflective (ILS, Subj. Quest., Abs. Differ.) | Sensing or Intuitive (ILS, Subj. Quest., Abs. Differ.) | Visual or Verbal (ILS, Subj. Quest., Abs. Differ.) | Sequential or Global (ILS, Subj. Quest., Abs. Differ.)
1 | 1, 1, 0 | 2, 0, 2 | 0, −1, 1 | 0, 1, 1
2 | −1, −2, 1 | 1, −2, 3 | −2, 2, 4 | 0, 0, 0
3 | 0, 2, 2 | 0, −2, 2 | 0, 2, 2 | 1, −2, 3
4 | −1, 0, 1 | 0, 0, 0 | −1, 1, 2 | 1, −2, 3
5 | 2, −2, 4 | 0, −2, 2 | 1, −2, 3 | 0, −2, 2
6 | 1, 2, 1 | 0, −2, 2 | 1, −2, 3 | 0, 1, 1
… | … | … | … | …
62 | 0, 2, 2 | 0, −2, 2 | 0, 2, 2 | 1, −2, 3
63 | 0, 2, 2 | 0, −2, 2 | 0, −2, 2 | 0, −1, 1
64 | 0, 1, 1 | 0, −2, 2 | −1, −1, 0 | 0, −2, 2
65 | 1, −2, 3 | −1, 2, 3 | −1, 2, 3 | 1, −2, 3
66 | 0, −2, 2 | 2, 1, 1 | 0, 2, 2 | 1, 1, 0
67 | 0, −2, 2 | −1, 2, 3 | 0, 2, 2 | 1, 2, 1
68 | −2, −2, 0 | 0, −2, 2 | 1, 2, 1 | 1, 2, 1
Mean absolute difference | 1.79 | 1.28 | 1.42 | 1.79
Table 4. Internal Consistency Reliability of the subjective questionnaire.

Style Dimension | Cronbach’s Alpha (N = 68) | McDonald’s Omega (N = 68)
Active/Reflective | 0.971 | 0.974
Sensing/Intuitive | 0.932 | 0.931
Visual/Verbal | 0.968 | 0.970
Sequential/Global | 0.978 | 0.978
Table 5. AVE and CR of the subjective questionnaire.

Style Dimension | AVE | CR
Active/Reflective | 0.846 | 0.980
Sensing/Intuitive | 0.579 | 0.841
Visual/Verbal | 0.751 | 0.970
Sequential/Global | 0.807 | 0.978
Table 6. Discriminant validity indicators and correlations between style dimensions.

Style Dimension ² | AVE | Act/Ref | Sen/Int | Vis/Ver | Seq/Glob
Act/Ref | 0.846 | 0.920 ¹ | | |
Sen/Int | 0.579 | 0.100 | 0.761 ¹ | |
Vis/Ver | 0.751 | 0.275 | 0.160 | 0.867 ¹ |
Seq/Glob | 0.807 | 0.004 | −0.131 | −0.113 | 0.898 ¹
¹ The numbers in bold are square roots of the average variance extracted. ² Note: Act/Ref = Active/Reflective; Sen/Int = Sensing/Intuitive; Vis/Ver = Visual/Verbal; Seq/Glob = Sequential/Global.
Table 7. A comparison of the ILS and subjective questionnaire mean scores.

Learning Style Dimension | Learning Style | ILS Mean Score | Subj. Quest. Mean Score | Absolute Difference
Information processing | Active | 6.15 | 6.95 | 0.80
Information perception | Sensing | 7.69 | 8.03 | 0.34
Information reception | Visual | 6.19 | 7.48 | 1.29
Information understanding | Sequential | 6.37 | 4.58 | 1.79
Table 8. T-test results for the Information-Processing dimension (Active/Reflective).

Questionnaire | Mean | N | Std. Deviation | t | df | Sig. (2-Tailed)
ILS | 2.515 | 68 | 0.702 | 6.720 | 67 | <0.001
Subjective | 1.765 | 68 | 0.601 | | |
Table 9. T-test results for the Information-Perception dimension (Sensing/Intuitive).

Questionnaire | Mean | N | Std. Deviation | t | df | Sig. (2-Tailed)
ILS | 2.206 | 68 | 0.771 | 1.559 | 67 | 0.124
Subjective | 2.059 | 68 | 0.722 | | |
Table 10. T-test results for the Information-Reception dimension (Visual/Verbal).

Questionnaire | Mean | N | Std. Deviation | t | df | Sig. (2-Tailed)
ILS | 2.338 | 68 | 0.745 | 4.500 | 67 | <0.001
Subjective | 1.853 | 68 | 0.579 | | |
Table 11. T-test results for the Information-Understanding dimension (Sequential/Global).

Questionnaire | Mean | N | Std. Deviation | t | df | Sig. (2-Tailed)
ILS | 2.544 | 68 | 0.609 | 8.462 | 67 | <0.001
Subjective | 1.573 | 68 | 0.676 | | |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
