Article

Research on Interface Design of Interactive Response System App with Different Learning Styles

1 Department of Industrial Design, National Yunlin University of Science and Technology, Yunlin 640301, Taiwan
2 Wistron Medical Technology Corporation, New Taipei City 221211, Taiwan
* Author to whom correspondence should be addressed.
Designs 2023, 7(2), 51; https://doi.org/10.3390/designs7020051
Submission received: 5 March 2023 / Revised: 22 March 2023 / Accepted: 24 March 2023 / Published: 1 April 2023

Abstract

This paper examines the interaction between students (humans) and an interactive response system (IRS) app (machine) in the teaching context and explores interface usability and interactive experience design through an experimental method. The experiment mainly explored the differences in the use of an IRS app by learners with different learning styles. A total of 72 subjects were recruited; the four learning styles (diverger, assimilator, converger, and accommodator) and the two kinds of information architecture (deep/shallow) were examined. With operating time performance and use experience as dependent variables, the relationships between the variables were explored. The results of this study are as follows: for the learning style factor, subjects of the reflective observation type responded faster to vibration; for the information architecture factor, the deep information architecture required switching more pages, so page switching took longer and operating performance was poorer. Based on the results of the two-stage experiment, design suggestions are proposed. It is expected that these results can contribute to the fields of interactive experience design and teaching technology.

1. Introduction

Scholars have indicated that teachers’ teaching methods affect students’ learning motivation, while students’ learning attitudes affect their learning outcomes [1]. According to a survey conducted by CommonWealth Magazine in 2008, more than 90% of elementary and middle school teachers felt that students did not pay enough attention in class, and a quarter of teachers felt that they could not concentrate on teaching because of their students [2]. In addition, McKeachie pointed out that students can remember 70% of what the teacher says in the first 10 min of class, but only 20% of the content in the 10 min before class ends [3,4].
With the advancement of technology, the number of users of smart mobile devices is increasing, and the number of 4G mobile broadband subscribers in April 2020 reached 2.906 million [5]. In addition, the Taiwan Ministry of Education proposed the “High School Mobile Learning Counseling Program for School 2014” to change teachers’ teaching modes and promote diversification of students’ learning styles [6]. The high penetration rate of the mobile network environment in Taiwan is very suitable for the development of an interactive response system (IRS), an interactive education model.
IRSs should meet the needs of teachers and students in the teaching field and have good usability, reducing the cost of learning to operate the system so as to promote learning motivation, improve the learning experience, and enhance learning effectiveness [7]. The authors previously conducted usability experiments in which users with different learning styles operated two existing IRS applications (apps); performance differed between styles, and information architecture was an important factor affecting the usability of the apps [8]. Building on that work, this study analyzed and redrew the information architecture. This study therefore proposes the following research objectives.
  • To analyze and redraw the information architecture of the IRS apps.
  • To improve the interface design of the IRS apps.
  • To investigate the differences between different learning styles when using IRS apps.
  • To propose design suggestions for IRS apps.

2. Literature Review

In this study, the interaction between students and teachers in a classroom environment was investigated from three perspectives: the students, the teachers, and the IRS app. On the teacher’s side, we explored technology-assisted teaching and teaching technology and organized the flow of teaching events. On the student’s side, we examined learning styles and learning activities. For the IRS app, we explored interactive experience design, interface design, information architecture, usability, and changes in the usage experience.
This study used learning styles and information architecture as variables to investigate the differences in the objective time performance and subjective feelings of learners with different learning styles across different information architectures, to organize the research results, and to propose design suggestions. Due to the limitations of this study, only the student side of the IRS app was designed for this experiment. The research framework is shown in Figure 1.

2.1. Interactive Response System

An IRS is known by many names, including classroom response system (CRS), audience response system (ARS), student response system (SRS), and clickers. In this study, we use the term interactive response system (IRS) [9].
An IRS is a teaching application that allows teachers and students to interact with each other in the classroom through electronic devices (e.g., tablet computers, smart mobile devices). The system helps teachers and students handle activities that consume more time under traditional lecture methods, such as spot-checking, quizzes, answering questions, and reflecting on the difficulty of the course. Scholars have also proposed that micro-competitions can be conducted at the beginning of a course to check students’ background knowledge or recall of previous material [10].
With the development of technology, highly interactive products are constantly being introduced, and teaching technology is constantly advancing. In recent years, many innovative teaching methods, such as M-Learning and U-Learning, have emerged as specific teaching methods in the education field. These new teaching methods emphasize teacher-student interaction and two-way communication, gradually replacing the traditional lecture method, which involves more one-way teaching. Innovative interactive teaching has become the most important change in education in the 21st century [11]. For example, more than 1000 universities in the United States (e.g., Harvard University and Brown University) have implemented IRSs, and more primary and secondary schools have introduced this technology to enhance teacher-student interactions and improve learning outcomes [12].
Through the IRS system, teachers can interact with students in real-time during class and quickly quantify the learning effect. This can improve students’ learning satisfaction and motivation [13]. In addition, teachers who are familiar with the IRS system can use it more smoothly and establish a good interaction between teachers and students in the classroom [14].
In recent years, most studies on IRSs have investigated their effectiveness in supplementing teaching and learning, mostly focusing on their application in the teaching field, with the presence/absence of an IRS as the independent variable and learning effectiveness (test scores), learning motivation, and learning attitude as the dependent variables. Such studies have found that using an IRS to supplement teaching and learning can effectively enhance learning motivation [15,16,17,18,19,20,21]. Scholars have further explored the attitudes of students with different learning styles towards the use of an IRS [22] and have pointed out that the response speed of an IRS affects users’ perceptions, with a slower response causing hesitation [23]. In addition to improving teacher-student interaction in the classroom [22,24], IRSs can be applied in lecture venues such as seminars and community centers [18,23]. This study expands on the authors’ (2019) study of the usability of two existing IRS apps, using an existing IRS app as the basis for the interface design of the experimental sample; the design of the app follows the recommendations of that study [8].

2.2. Learning Style

The theories of learning styles have been discussed by scholars since the 1970s, based on the concept that each learner has different cognitions, habits, and attitudes toward learning. Kolb (1984) proposed the experiential learning theory (ELT), which is the most widely used in academia, and believes that learning is a process of transforming experience into knowledge [25]. Learners are divided into the four categories of diverger, assimilator, converger, and accommodator, as shown in Figure 2 and described in Table 1.
Scholars have pointed out that students with different learning styles can improve their learning performance by using digital learning platforms [26] and that students with different learning styles have positive perceptions of IRSs [19]. Studies have shown that active experimentation (AE) learners in Kolb’s model are used to trying different solutions when learning, so they are able to complete tasks without prompting, while reflective observation (RO) learners spend more time learning [27]. Others divide learners into practical and theoretical types, with the former using experimentation as a decision-making method and the latter thinking longer to solve problems [28]. Conversely, when an app did not fit the mental model of active experimentation learners, they kept trying the wrong way to operate it, and their performance was therefore poorer than that of reflective observation learners. However, there are still few studies on learning styles. Because this research focuses mainly on the information architecture design of the IRS, it continues the authors’ earlier work to explore performance and feelings when operating the IRS [8].

2.3. Information Architecture

The concept of information architecture was first proposed by Richard Saul Wurman, president of the American Institute of Architects, in 1976. He wanted to use architectural theory to organize complex information spaces into simple and clear structures, constructing clear and unambiguous data structures or maps so that users can find the information they need in the information space [24]. Just as in the real world, where different types of buildings have their own fixed styles, such that on walking into a bank we recognize that we are in a bank instead of a hospital, the same is true of a good digital information space: when users visit a bank website with a well-designed information architecture, they will recognize that they are visiting a bank website rather than a hospital website [29].
An important concept of organizational structures in the information space is hierarchy, and many good information architectures are designed in a hierarchical manner. The concept of hierarchy is that the categories on the same level are mutually exclusive, and the upper and lower levels are parent-child relationships. When information architects design information architecture, they should pay attention to two important points: (1) the categories of the hierarchy should be mutually exclusive; and (2) it is important to consider the balance of breadth and depth. The breadth of the information architecture refers to the number of categories at the same level in the hierarchy, while the depth refers to the number of levels in the hierarchy. If the hierarchy is too narrow and deep, users will have to click multiple times to find the correct information, as in Figure 3. If the hierarchy is too broad, users will face too many choices in the main menu and less content after selection, which may lead to a poor user experience, as in Figure 4 [29].
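The breadth-depth trade-off can be made concrete with a small sketch (the item counts and branching factors below are illustrative, not taken from the apps studied here): for the same number of destination screens, a narrow-and-deep hierarchy costs more selections per lookup than a broad-and-shallow one, at the price of more options per menu.

```python
def clicks_to_leaf(total_items, branching):
    """Minimum number of menu selections needed to reach one of
    `total_items` leaf screens when every level offers `branching` choices."""
    depth, reachable = 0, 1
    while reachable < total_items:
        reachable *= branching
        depth += 1
    return depth

# 64 hypothetical destination screens organized two ways:
print(clicks_to_leaf(64, 2))  # narrow and deep: 6 selections, 2 options each
print(clicks_to_leaf(64, 8))  # broad and shallow: 2 selections, 8 options each
```

Balancing the two means keeping each menu scannable while keeping frequent destinations within one or two selections, which is the trade-off Figures 3 and 4 illustrate.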

2.4. Teaching Activities

Gagne defined instruction in 1985 as “the arrangement of external events to support the learner’s internal learning process.” He believed that there are different stages of development in the learning process, and in order to achieve the best learning effect, the instructor needs to design different teaching events at different stages to address the learners’ internal learning process [30]. According to Gagne’s teaching theory, he divided learners’ intrinsic learning process into the following nine stages: (1) attention alertness, (2) expectancy, (3) retrieval to working memory, (4) selective perception, (5) semantic encoding, (6) responding, (7) reinforcement, (8) cueing retrieval, and (9) evoked recall. In addition, the nine intrinsic learning processes are presented as external teaching events that the instructor needs to design, as shown in Table 2 [30,31]. In this study, the nine teaching events proposed in Gagne’s teaching theory were used as the basis for designing the content of the experimental tasks, so that the functions and purposes of the overall interactive response system app operating in the classroom could be examined more completely.

2.5. Usability

The concept of usability comes from user-centered design (UCD) in the field of human-computer interaction, which is a design concept that emphasizes user-centered thinking. In order to obtain quantitative information about the user’s interaction with a product, a scale can be used to measure the user’s response. The purpose is to allow users to complete the task with minimal effort and to avoid errors or frustration during operation in order to achieve a satisfying interactive experience [32].
A common interface assessment scale is the system usability scale (SUS), which was developed by Brooke in 1996 to test the usability of electronic office systems. Although the scale consists of only 10 questions, it is increasingly being used to obtain subjective assessments of all kinds of systems; it is a quick, but not crude, test of system effectiveness, efficiency, and satisfaction [33]. Each item is a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Individual item scores are not meaningful on their own; each item’s contribution is first computed, and the contributions are then summed and scaled to obtain the total score of the scale.
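The standard SUS scoring procedure (odd-numbered, positively worded items contribute the response minus 1; even-numbered, negatively worded items contribute 5 minus the response; the sum of the contributions is multiplied by 2.5 to yield a 0–100 score) can be sketched as follows:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    5-point Likert responses (1 = strongly disagree, 5 = strongly agree)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered (positively worded) items contribute r - 1;
        # even-numbered (negatively worded) items contribute 5 - r.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative item therefore scores 100, while all-neutral responses score 50.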

3. Method

This study was divided into two stages of experiments. The first stage was the pilot experiment, which extended the results of the author’s study and analyzed the existing two IRS app information architectures as the core of the study and mapped out two information architectures—deep/shallow [8]. In the second stage, based on the results of the literature and the pilot experiment, we used learning style and information architecture as independent variables to explore the differences in the usability and post-use feelings of users with different learning styles regarding the depth of information architecture, as shown in Figure 5.

3.1. Pilot Experiment—Analyze Information Architecture

3.1.1. Pilot Experiment Method

The main purpose of the pilot experiment was to analyze the information architecture of two existing IRS apps and to redraw it. Using the content analysis method, we inventoried the information content of the two existing apps, reclassified the collected content according to functional categories, and drew content maps.
  1. Content analysis method
The content analysis method was divided into two stages: (1) collecting content and (2) analyzing content. It is a bottom-up process that takes a comprehensive inventory of the basic content elements that make up the information space. Content collection is a data sampling process for finding all the information content of the existing architecture.
  2. Content mapping
After analyzing the attributes and definitions of the data content through content analysis, the complex information architecture environment can be expressed by drawing a content map. An information space map is presented from a high-level conceptual perspective. The main purpose of the content map is to help the project team think about the overall information architecture [29].

3.1.2. Samples of the Pilot Experiment

In this study, we selected the two most popular IRS apps used by colleges and universities in Taiwan, both of which had been downloaded more than 100,000 times from Google Play [34,35]. We selected these two apps as the experimental samples for information architecture analysis and mapping and named them Experimental Sample A (Figure 6) and Experimental Sample B (Figure 7). For the experiments, we created two similar lessons in each app (a user interface design lesson and a design methods lesson), each with its own subject, class information, and Q&A. Although the class contents are different, the students’ behaviors are the same.

3.2. Interactive Response System App Interactive Behavior Experiment

3.2.1. Method

In order to obtain suggestions from people with actual classroom experience of using IRSs, a total of 72 subjects (39 males and 33 females) were recruited; their average age was 21.44 years (SD = 2.21). The experiment used a 4 learning styles (diverger/assimilator/converger/accommodator) × 2 information architectures (deep/shallow) between-group design to investigate whether users with different learning styles differed when using IRSs with different information architectures, with task performance and the SUS as dependent variables. Descriptive statistics were used to understand basic differences between samples, and a two-way ANOVA was used to test for statistical differences and to examine whether there was an interaction effect.
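The study’s statistics were run in SPSS (Section 4.4); purely as an illustration of how a balanced two-factor design such as 4 × 2 is analyzed, a minimal two-way ANOVA can be sketched in plain Python. The data and factor labels below are hypothetical, and in practice a statistics package (e.g., statsmodels’ `anova_lm` over an `ols` fit) would be used instead.

```python
def two_way_anova(cells):
    """F statistics for factors A and B and their interaction in a balanced
    design. `cells` maps (a_level, b_level) -> equal-length observation lists."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    a, b = len(a_levels), len(b_levels)
    n = len(next(iter(cells.values())))          # observations per cell
    mean = lambda xs: sum(xs) / len(xs)
    grand = mean([x for v in cells.values() for x in v])
    mean_a = {i: mean([x for (ai, _j), v in cells.items() if ai == i for x in v])
              for i in a_levels}
    mean_b = {j: mean([x for (_i, bj), v in cells.items() if bj == j for x in v])
              for j in b_levels}
    cell_mean = {k: mean(v) for k, v in cells.items()}

    ss_a = b * n * sum((mean_a[i] - grand) ** 2 for i in a_levels)
    ss_b = a * n * sum((mean_b[j] - grand) ** 2 for j in b_levels)
    ss_cells = n * sum((m - grand) ** 2 for m in cell_mean.values())
    ss_ab = ss_cells - ss_a - ss_b                # interaction sum of squares
    ss_within = sum((x - cell_mean[k]) ** 2 for k, v in cells.items() for x in v)

    ms_within = ss_within / (a * b * (n - 1))
    return (ss_a / (a - 1) / ms_within,
            ss_b / (b - 1) / ms_within,
            ss_ab / ((a - 1) * (b - 1)) / ms_within)

# Hypothetical 2 x 2 data: factor A has a strong effect, factor B none.
f_a, f_b, f_ab = two_way_anova({
    (0, 0): [1, 2], (0, 1): [1, 2],
    (1, 0): [3, 4], (1, 1): [3, 4],
})
print(f_a, f_b, f_ab)  # 16.0 0.0 0.0
```

The resulting F values are compared against the F distribution with the corresponding degrees of freedom to obtain p values, as with the F statistics reported in Section 4.2.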

3.2.2. Research Hypothesis

This study investigates the differences in the performance and perceptions of users with different learning styles when operating the IRS app with different information frameworks by sequentially operating tasks related to classroom interactions. Research hypotheses are shown in Table 3.

3.2.3. Experiment Task

The experimental tasks of this research were based on the nine teaching events organized by Shen [30], with the teacher-student interactions mapped to the functions of the IRS app; the functions were connected in series to form the experimental task design. Because this research only discusses tasks within the curriculum, we do not discuss the fourth step (teacher prompts or group activities) or the fifth step (group exercises and the teacher’s individual instruction) of the teacher-student interaction, as shown in Table 4 and Table 5.

3.2.4. Experiment Samples

In this study, we used the Adobe XD interface drawing software to design the screens and implemented the experimental samples, with a back-end program, in HTML and CSS. The experimental software had all the functions of the IRS app, including a function to record the time node of each task. The main difference between the two samples is the number of items in the bottom function bar (3 vs. 6): the deep architecture has three functions at the bottom of the page and a tab bar for switching at the top, while the shallow architecture has six functions at the bottom of the page, as shown in the bottom function bar in Table 6. The bottom functions of the deep information architecture are (1) roll call and Q&A, (2) course information, and (3) course communication; the bottom functions of the shallow information architecture are (1) roll call, (2) Q&A, (3) grades, (4) course information, (5) private message, and (6) public discussion.

3.2.5. Experiment Process

In this study, learning style questionnaires were first distributed to determine and categorize the subjects’ learning styles. After the categorization, the subjects with different learning styles were randomly assigned to six experimental sample groups and invited to participate in the experiment. After the experiment was completed, the subjects were given NT$100 (or its equivalent) as an experimental reward. The experimental process is shown in Figure 8.

4. Results

4.1. Pilot Experiment—Sorting Result of Information Architecture

Firstly, we conducted a comprehensive inventory of the functions of the two existing IRS apps, Samples A and B. Since this study focuses on course-related functions, only the functions common to both apps were collected. After expanding the content of the interactive functional elements of each course, the common functional elements of the two IRSs are as follows: (1) roll call, (2) attendance performance, (3) roll call record, (4) number of classes attended, (5) attendance, (6) classroom Q&A, (7) number of articles published, (8) question and answer score, (9) Q&A record, (10) course information, (11) announcement, (12) course dates, (13) feedback to teachers, (14) sending messages to teachers, (15) sent messages, (16) open discussions, (17) topics under discussion, and (18) new discussions, for a total of 18 IRS app functional elements.
The 18 IRS app functional elements are categorized according to the usage context of each function and divided into three major categories: (1) in-class interaction; (2) information display; and (3) course communication, as shown in Figure 9.
According to the categorized functional categories and the depth of information architecture, two types of information architecture were mapped out: (1) deep-information architecture and (2) shallow-information architecture.

4.2. Interactive Response System App Interactive Behavioral Experiment Results

This study uses task analysis to explore the differences in the operating time performance of users with different learning styles operating the IRS app, with information architecture as the other research variable. Task operating performance and the system usability scale (SUS) score were analyzed with a two-way ANOVA on learning style and information architecture.

4.2.1. Descriptive Statistics of Subjects

The mean age of the subjects was 21.44 years old (SD = 2.21), and they had four learning styles: diverger, assimilator, converger, and accommodator, with 18 students each.

4.2.2. Analysis Results

The two-way ANOVA shows no significant difference in the time performance of different learning styles operating the IRS app. This differs from the authors’ earlier research [8]; we speculate that the task interface in that study was more complicated, whereas this research reorganized the information architecture and improved the interface design, removing many unintuitive parts, so no significant difference between learning styles emerged. For the information architecture factor, among the 10 task nodes in this experiment, three showed significant differences in operating time performance: the shallow information architecture outperformed the deep one in Tasks 2-1 and 3-3, while in Task 4-2 the opposite held and the deep architecture outperformed the shallow one. We elaborate on these three tasks, as shown in Table 7.
Task 2-1 is “Question and Answer”. An ANOVA test showed a significant effect (F(1,72) = 8.645, p = 0.005 < 0.05). The operating time of the shallow information architecture (M = 11.45, SD = 3.41) was less than that of the deep information architecture (M = 16.18, SD = 8.27). The post-experiment interviews revealed that the deep information architecture requires a tab switch between the roll call and Q&A functions during in-class interaction; users found the tabbed design hard to find and less intuitive, and because the Q&A function is on the right side of the tab bar, it requires an additional switching step and therefore yields poorer operating performance.
Task 3-3 is “View Announcements”. An ANOVA test showed a significant effect (F(3,72) = 7.726, p = 0.007 < 0.05). The operating time of the shallow information architecture (M = 10.55, SD = 4.59) was less than that of the deep information architecture (M = 16.57, SD = 11.93). The interviews showed that the “View Announcements” function behaves like the “Question and Answer” function of Task 2-1: both are on the right side of the tab bar, so users found them less intuitive and less friendly, and the deep information architecture performed poorly, as shown in Figure 10.
Task 4-2 is “View Question and Answer Scores”. An ANOVA test showed a significant effect (F(1,72) = 5.291, p = 0.025 < 0.05). The operating time of the deep information architecture (M = 9.30, SD = 3.43) was less than that of the shallow information architecture (M = 12.47, SD = 7.20). The interviews revealed that because the deep information architecture has only three function columns, users can quickly click one of the three functions and thus achieve better operating performance. Therefore, in interface design, listing fewer functions yields better operating time performance.
In summary, three tasks showed significant differences: Task 2-1 “question and answer”, Task 3-3 “view announcements”, and Task 4-2 “view question and answer scores”. The functions of Tasks 2-1 and 3-3 are on the right side of the tab bar; users found it unintuitive to switch tabs after clicking the function bar below, so the shallow information architecture outperformed the deep one. In Task 4-2, the deep information architecture outperformed the shallow one: with only three function columns, and the relevant function centered at the bottom of the interface, participants could click the function key quickly and thus achieved better operating performance.

4.3. Discussion

The experimental results of this study show no significant difference in the objective operating time performance or subjective feelings of users of different learning styles operating the IRS app. These results differ from those of the authors (2019), presumably because this research improved the information architecture of the earlier experiment, reducing interface complexity and the difficulty of finding data, so that learners of different learning styles no longer differed significantly in operating performance or subjective feelings. We conclude that improving the ease of use of the interface enables users with different learning styles to operate the information system smoothly. For most tasks in this experiment there was also no significant main effect of information architecture; we infer that the improvements to the information architecture reduced its main effects, leaving significant differences only in Tasks 2-1, 3-3, and 4-2.
This research redraws the information architecture of the IRS app through pilot experiments and explores the differences in the usability of learning styles and information architecture with experimental methods. After statistical analysis and interview results, the following design suggestions are put forward:
  • Interactive response system app information architecture: The pilot experiment in this study analyzed the information architecture of the existing IRS app, and verified the usability of the redrawn information architecture through experiments. The experimental results show that each task has good operational performance, so it is recommended that the interface of the IRS app be designed based on this information architecture.
  • Reduce tabbed design: The analysis results of the pilot experiment show that the IRS has six main functional items. All functions can be listed in the bottom function bar during interface design, reducing the tabbed-page design used in the deep-information architecture experimental sample.
  • Present private messages to the teacher as a chat conversation: The private-message function in this research’s experimental samples used a list design. Some subjects found this design unintuitive and had difficulty distinguishing the teacher’s replies, so we suggest presenting these messages as a chat conversation.
  • Course information can be arranged on the same page: This research placed the course content and announcements within the course information function. Users found this arrangement intuitive, and it showed good operating performance, so we propose this design suggestion.

4.4. Summary

In this research, through the analysis of the information architecture of the pilot experiment, the information architecture of two existing IRS apps was analyzed, and the content map was redrawn according to the functional categories. The pilot experiment drew a total of two information architectures, namely deep-information architecture and shallow-information architecture, and used them as variables in the experiment.
This research experiment explored the differences in the operating time performance and feelings of learners with different learning styles (diverger/assimilator/converger/accommodator) using the IRS app, with information architecture (deep/shallow) as the other independent variable. A total of 72 subjects were recruited, 18 for each of the four learning styles. A two-way ANOVA was used to understand how the two variables, learning style and information architecture, affect the usability of the interface. The research hypotheses were tested through statistical analysis in SPSS 24.0; the results are summarized as follows:
  • Research hypothesis H1a does not hold: users with different learning styles operate the IRS app, and there is no significant difference in timing performance.
  • The experimental results of this research show that there was no significant difference in the operating time performance of users of different learning styles using the IRS app, so research hypothesis H1a does not hold.
  • Research hypothesis H1b does not hold: users with different learning styles operated the IRS app, and there was no significant difference in their experiences.
  • The experimental results of this research show that there was no significant difference in the subjective feelings of users with different learning styles when using the IRS app, so research hypothesis H1b does not hold.
  • Research hypothesis H2a was supported: the depth/shallowness of the IRS app information architecture produced a significant difference in operating time performance.
  • The experimental results of this research show that there were significant time differences in Tasks 2-1, 3-3, and 4-2. Therefore, the depth of the information architecture of the IRS app will produce differences in operating time performance. Research hypothesis H2a was supported.
  • Research hypothesis H2b does not hold: the depth/shallowness of the app information structure of the IRS had no significant difference in the experience of operation.
The experimental results of this research show that there was no significant difference in the ease of use score of the SUS system. Therefore, the different information architectures of the IRS app had no significant difference in subjective use experience. Therefore, research hypothesis H2b does not hold.
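For readers unfamiliar with how the SUS usability score referenced above is computed, the following sketch implements the standard Brooke (1996) scoring rule. It is an illustration only, not taken from this study's materials; the function and variable names are our own.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    Likert responses on a 1-5 scale, following Brooke (1996):
    odd-numbered (positively worded) items contribute (response - 1),
    even-numbered (negatively worded) items contribute (5 - response),
    and the sum is multiplied by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        if item % 2 == 1:       # positively worded item
            total += r - 1
        else:                   # negatively worded item
            total += 5 - r
    return total * 2.5

# A neutral respondent (all 3s) lands exactly at the midpoint:
print(sus_score([3] * 10))  # 50.0
```

A nonsignificant difference in mean SUS scores between the two architectures, as reported here, means both interfaces were perceived as comparably usable despite the measured time differences.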

5. Conclusions

In the pilot experiment of this research, the information architectures of two existing IRS apps were analyzed, and their content maps were redrawn according to functional categories, yielding two information architectures (deep and shallow) that were used as experimental variables. The main experiment explored differences in operating time performance and subjective experience among learners with different learning styles (diverger/assimilator/converger/accommodator) using the IRS app, with information architecture (deep/shallow) as the other independent variable. A total of 72 subjects were recruited, and two-way ANOVA was used to examine how learning style and information architecture affect the usability of the interface.
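To illustrate the two-way ANOVA used in this design, the sketch below computes F statistics for a balanced two-factor layout from first principles. The study itself ran its analysis in SPSS 24.0; the function and the data here are our own synthetic construction, shown only to make the factor/interaction decomposition concrete.

```python
import numpy as np

def two_way_anova(data):
    """Balanced two-way ANOVA. `data[i][j]` holds the replicate
    observations for level i of factor A and level j of factor B
    (all cells the same size). Returns F statistics for factor A,
    factor B, and the A x B interaction."""
    data = np.asarray(data, dtype=float)     # shape (a, b, n)
    a, b, n = data.shape
    grand = data.mean()
    mean_a = data.mean(axis=(1, 2))          # factor A level means
    mean_b = data.mean(axis=(0, 2))          # factor B level means
    mean_ab = data.mean(axis=2)              # cell means

    ss_a = b * n * np.sum((mean_a - grand) ** 2)
    ss_b = a * n * np.sum((mean_b - grand) ** 2)
    ss_ab = n * np.sum(
        (mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2)
    ss_err = np.sum((data - mean_ab[:, :, None]) ** 2)

    ms_err = ss_err / (a * b * (n - 1))
    f_a = (ss_a / (a - 1)) / ms_err
    f_b = (ss_b / (b - 1)) / ms_err
    f_ab = (ss_ab / ((a - 1) * (b - 1))) / ms_err
    return f_a, f_b, f_ab

# Synthetic 2x2 design: factor B (e.g., architecture) shifts the
# response, factor A (e.g., a style grouping) does not.
f_a, f_b, f_ab = two_way_anova([[[1, 2, 3], [7, 8, 9]],
                                [[1, 2, 3], [7, 8, 9]]])
```

In this contrived example only the factor-B F statistic is large, mirroring the pattern found in the study: an information-architecture main effect without a learning-style effect or interaction.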
The experimental results of this study indicate that when the information architecture is well organized, interface complexity is reduced, and data are easy to find, learners of all learning styles can achieve good operating performance. In applications with few functions, a shallower information architecture hierarchy makes the interface more user-friendly. Many studies have confirmed that using an IRS can increase learning motivation and effectiveness. This research therefore aims to provide user-friendly IRS design guidelines that help more teachers and students use the IRS easily, increasing its acceptance and popularity.
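The depth/breadth trade-off behind this recommendation can be made concrete with a back-of-the-envelope calculation (ours, not part of the study): the number of menu levels, and hence page switches, needed to reach any one of a set of leaf functions depends on how many choices each screen offers.

```python
def min_menu_depth(n_items, branching):
    """Minimum number of menu levels (taps) needed to reach any of
    n_items leaf functions when every screen offers `branching`
    choices. Uses integer arithmetic to avoid floating-point
    logarithm edge cases."""
    depth, reachable = 0, 1
    while reachable < n_items:
        reachable *= branching
        depth += 1
    return depth
```

For example, with 12 functions, a broad layout offering four choices per screen reaches any function in `min_menu_depth(12, 4) == 2` taps, while a narrow layout with two choices per screen needs `min_menu_depth(12, 2) == 4`, which is consistent with the finding that the deep architecture required more page switches and thus longer operating times.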
This research used experiments that manipulated learning style and information architecture to explore differences in the operation and experience of the student-side IRS app. Modern education methods and technology are constantly changing, and this study faced several limitations that future research can address. First, only students were examined; future work can also study teachers. Second, due to research limitations, functions such as public discussion and message boards were not explored. Third, constrained by the experimental environment, usability experiments were conducted only in the laboratory; future research can examine IRS app interaction in actual courses, with learning effectiveness and learning motivation as dependent variables. Fourth, this research focused on in-class interaction and designed only the functions used for it; future research can explore other course needs such as homework, grouping, and examinations. Fifth, smartphones, which have a high ownership rate among modern students, served as the research platform; related research can use tablet computers or wearable devices to explore differences in usability. Finally, due to research restrictions, only undergraduate and master's students from the National Yunlin University of Science and Technology were recruited; future research can include users of different ages and examine the effects of the IRS app in different teaching settings, such as seminars, lectures, and workshops.

Author Contributions

Conceptualization, S.-C.C. and H.-Y.L.; methodology, S.-C.C. and H.-Y.L.; software, H.-Y.L.; validation, S.-C.C. and H.-Y.L.; formal analysis, S.-C.C. and H.-Y.L.; investigation, H.-Y.L.; resources, S.-C.C. and H.-Y.L.; data curation, H.-Y.L.; writing—original draft preparation, S.-C.C. and H.-Y.L.; writing—review and editing, S.-C.C. and H.-Y.L.; visualization, S.-C.C. and H.-Y.L.; supervision, S.-C.C. and H.-Y.L.; project administration, S.-C.C. and H.-Y.L.; funding acquisition, S.-C.C. and H.-Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support of this research by the National Science and Technology Council under grant NSTC 111-2410-H-224-021 is gratefully acknowledged.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, L.S.; Kung, H.Y.; Chen, C.Y.; Lin, K.Y. The influence of teaching strategies on learning effectiveness in the project-, problem- and inquiry-based capstone course. Curric. Instr. Q. 2019, 22, 55–76. [Google Scholar]
  2. Ho, C.Y.; Bin, J.S.; Chen, Y.H. Flipped Education; CommonWealth Magazine: Taipei, Taiwan, 2013. [Google Scholar]
  3. McKeachie, W. Teaching Tips: A Guidebook for the Beginning College Teacher; D.C. Heath: Boston, MA, USA, 1986. [Google Scholar]
  4. Pillo, H. What Students Think about and Do in College Lecture Classes; Learning Research Center, University of Tennessee: Knoxville, TN, USA, 1984. [Google Scholar]
  5. Taiwan National Communications Commission. 2020 Mobile Communication Business Customs Analysis [Data File]. 2020. Available online: https://www.ncc.gov.tw/chinese/news_detail.aspx?site_content_sn=1966&sn_f=41445 (accessed on 15 August 2020).
  6. Taiwan Ministry of Education. Goals of High School Vocational Action Learning Counselling Program. Available online: http://mlearning.ntust.edu.tw/ (accessed on 15 August 2020).
  7. Hsu, S.K. Discrepancies between educational technology and educational practices: A reflection. Tsing Hua J. Educ. Res. 2018, 35, 71–103. [Google Scholar]
  8. Chen, S.-C.; Lin, H.-Y. Study of Interface Usability of Interactive Response System App by Different Learning Styles. In Proceedings of the TAICHI 2019 Conference, Taipei, Taiwan, 30 September–1 October 2019. [Google Scholar]
  9. Natanael, Y.; Rosmansyah, Y. Definitions, features, and technologies on classroom response systems: A systematic literature review. In Proceedings of the 2020 International Conference on Information Technology Systems and Innovation (ICITSI), Bandung, Indonesia, 19–22 October 2020. [Google Scholar]
  10. Moffarts, G.; Combéfis, S. Challengr, a classroom response system for competency based assessment and real-time feedback with micro-contests. In Proceedings of the 2020 IEEE Frontiers in Education Conference, Uppsala, Sweden, 21–24 October 2020. [Google Scholar] [CrossRef]
  11. Huang, C.S. On the strategy of applying the IRS interactive response system to university course teaching. Taiwan Educ. Rev. Mon. 2017, 6, 81–87. [Google Scholar]
  12. Habook. Teaching Solutions and Strategies Are the Key. Technology Is Only the Catalyst. Available online: http://www.habook.com.tw/habook_epaper/2008/20080527_eTeaching_digest/20080527_eTeaching.htm (accessed on 27 May 2020).
  13. Shen, C.-C.; Wang, Y.-M.; Wu, J.; Meng, F.-D.; Shou, T.-W. Teaching effect of interactive response system in new media courses. In Proceedings of the 2022 10th International Conference on Orange Technology (ICOT), Shanghai, China, 15–16 September 2022. [Google Scholar]
  14. Yeh, Y.J.Y.; Chen, M.H. Examining the primacy and recency effect on learning effectiveness with the application of interactive response systems (IRS). Technol. Knowl. Learn. 2022, 27, 957–970. [Google Scholar] [CrossRef]
  15. Chen, P.S. The application and combination of fore exercise and reading guidance, peer assessment, and interactive response system-taking the class instruction of school administration for example. Sch. Adm. 2008, 58, 150–180. [Google Scholar]
  16. Hung, C.H. Improvement Effects of an Interactive Response System Combined with Peer Instruction on Learning Attitudes and Problem-Solving Abilities in Senior High School Physics. Master’s Thesis, Tamkang University, Taipei, Taiwan, 2015. [Google Scholar] [CrossRef]
  17. Lawrence, C.; Kasey, M.; Pike, B. A quasi-experimental assessment of interactive student response systems on student confidence, effort, and course performance. J. Account. Educ. 2013, 31, 17–30. [Google Scholar]
  18. Liu, C.J.; Chu, C.C.; Shih, C.M. The interactive response system in a symposium of respiratory therapist. J. Respir. Ther. 2008, 7, 84. [Google Scholar]
  19. Luh, W.T. Chemistry courses and interactive response system. Chemistry 2014, 72, 291–299. [Google Scholar]
  20. Sun, C.-Y.; Chen, Y.-Z. Effects of integrating dynamic concept maps with interactive response system on elementary school students’ motivation and learning outcome: The case of anti-phishing education. Comput. Educ. 2016, 102, 117–127. [Google Scholar] [CrossRef]
  21. Tsay, W.R. A study on irs implementing in a management mathematics class in a public university. J. Educ. Sci. 2015, 13, 75–96. [Google Scholar]
  22. Lin, K.Y. A study on integration of classroom response system into a class from the viewpoints of learning style and technology acceptance model. Middle Educ. 2014, 66, 138–156. [Google Scholar]
  23. Hu, L.F.; Lin, C.H.; Lin, I.C. The effectiveness of applying an interactive response system for geriatric health promotion in communities. Taipei City Med. J. 2009, 6, 445–454. [Google Scholar]
  24. Lu, C.H.; Shieh, J.C.; Huang, W.Z.; Huang, Y.C. A user experience study of pop science web information. Architecture of National Taiwan Normal University. Univ. Libr. J. 2016, 20, 63–87. [Google Scholar]
  25. Kolb, D.A. Experiential Learning; Prentice-Hall: Englewood Cliffs, NJ, USA, 1984. [Google Scholar]
  26. Lian, J.W.; Hsu, H.L.; Hsu, H.M. The effect of serious game platform and learning style fit on learning effectiveness. J. e-Bus. 2018, 20, 217–248. [Google Scholar]
  27. Konak, A.; Clark, K.T.; Nasereddin, M. Using Kolb’s experiential learning cycle to improve student learning in virtual computer laboratories. Comput. Educ. 2014, 72, 11–22. [Google Scholar] [CrossRef]
  28. Honey, P.; Mumford, A. The Manual of Learning Styles; Peter Honey Publications: Maidenhead, UK, 1992. [Google Scholar]
  29. Rosenfeld, L.; Morville, P.; Arango, J. Information Architecture, 4th ed.; O’Reilly Media: Newton, MA, USA, 2015. [Google Scholar]
  30. Shen, C.W. Technology and Learning: Theory and Practice, 3rd ed.; Psychology: Taipei, Taiwan, 2008. [Google Scholar]
  31. Gagne, R.M.; Briggs, L.J.; Wager, W.W. Principles of Instructional Design, 4th ed.; Harcourt Brace Jovanovich: New York, NY, USA, 1992. [Google Scholar]
  32. Nielsen, J. Usability Engineering; Academic Press: London, UK, 1993. [Google Scholar]
  33. Brooke, J. SUS—A quick and dirty usability scale. Usability Eval. Ind. 1996, 189, 4–7. [Google Scholar]
  34. Google Play. Tronclass. Available online: https://play.google.com/store/apps/details?id=com.wisdomgarden.trpc (accessed on 22 November 2019).
  35. Google Play. Zuvio. Available online: https://play.google.com/store/apps/details?id=com.zuvio.student (accessed on 22 November 2019).
Figure 1. Research framework.
Figure 2. Kolb learning style model (redrawn for this study).
Figure 3. An overly deep information architecture makes information hard to find.
Figure 4. An overly broad information architecture presents too many options, making selection difficult.
Figure 5. Experimental framework of this study.
Figure 6. Experimental sample A.
Figure 7. Experimental sample B.
Figure 8. Experiment process.
Figure 9. Subcategories included in the three categories.
Figure 10. Task 3-3 page switching details.
Table 1. Introduction to Kolb learning styles [25].

| Learning Style | Introduction |
| --- | --- |
| Diverger | Prefers reflective observation and concrete experience, and likes to absorb knowledge through observation. |
| Assimilator | Prefers reflective observation and abstract concepts, and is used to handling large amounts of information. |
| Converger | Prefers active experimentation and abstract concepts, and prefers to solve problems through practice. |
| Accommodator | Prefers active experimentation and concrete experience, trusts intuition, and suits action-oriented learning. |
Table 2. The internal learning process and its corresponding external teaching events.

| No. | Student Intrinsic Learning Process | Teacher External Teaching Event | Interaction between Teachers and Students |
| --- | --- | --- | --- |
| 1 | Attentional awareness | Attracting attention | The teacher controls the students' attention |
| 2 | Expectations | Inform students of learning objectives | Tell students what they can do after learning |
| 3 | Retrieval of working memory | Prompt recall of prior knowledge or skills | Ask students to reflect on what they have learned in the past |
| 4 | Selective perception | Presenting learning materials | Teacher prompts or group activities |
| 5 | Semantic coding | Provide learning support | Small group exercises and individual teacher instruction |
| 6 | Behavioral response | Induce performance | Group competitions or teacher-assigned questions |
| 7 | Augmented feedback | Enhancing feedback | Informative feedback from peers or teachers |
| 8 | Cue retrieval | Evaluate behavioral performance | Assessment of student performance |
| 9 | Evoked memories | Enhancing learning retention and transfer | Teacher review and interactive discussions among small groups on integrated learning content |
Table 3. Research hypotheses.

| Number | Hypothesis |
| --- | --- |
| H1a | There are significant differences in operating time performance between users with different learning styles when operating the IRS app. |
| H1b | There are significant differences in use experience between users with different learning styles when using the IRS app. |
| H2a | The deep/shallow information architecture of the IRS app results in a significant difference in operating time performance. |
| H2b | The deep/shallow information architecture of the IRS app results in a significant difference in use experience. |
Table 4. App functions corresponding to the teaching activities of the interactive response system.

| Interaction between Teachers and Students | Corresponding App Function | Tasks |
| --- | --- | --- |
| 1. The teacher controls the students' attention | Roll call | 1-1 |
| 2. Tell students what they can do after learning | Class information, announcements | 3-3 |
| 3. Ask students to think back to what they have learned in the past | Quiz record, results, ranking | 1-2, 3-2 |
| 4. Teacher prompts or group activities | Public message, discussion | No corresponding task |
| 5. Small group exercises and individual instruction by the teacher | Open discussion | No corresponding task |
| 6. Group competition or teacher-assigned questions | Real-time questions and answers | 2-1 |
| 7. Informative feedback from peers or teachers | Private message to teachers | 4-1 |
| 8. Assessment of student performance | Results | 2-2, 4-2, 4-3 |
| 9. Teacher review and integrated learning content group interaction and discussion | Questions and answers | 3-1 |
Table 5. Task content.

| No. | Task Content | Task Steps | Teaching Activity |
| --- | --- | --- | --- |
| 1-1 | Roll call | Sign in using the roll call | 1. The teacher controls the students' attention |
| 1-2 | View ranking | View the ranking of this account in this course | 3. Ask students to recall what they have learned in the past |
| 2-1 | Question and answer | (1) Answer the first question; (2) choose option A; (3) state whether the answer is correct | 6. Group competition or teacher-assigned questions |
| 2-2 | View roll call | View the most recent roll call of this account | 8. Assessment of student performance |
| 3-1 | Q&A | (1) Answer the second question; (2) choose option B; (3) state whether the answer is correct | 9. Teacher review and integrated learning content group interaction and discussion |
| 3-2 | View attendance performance | Find the attendance of this account | 3. Ask students to recall what they have learned in the past |
| 3-3 | View announcements | Find the "Classes will be closed for one week on 12/3" announcement for this course | 2. Tell students what they can do after learning |
| 4-1 | Feedback to teachers | (1) Send a message to the teacher; (2) choose "I understand" to give feedback to the teacher | 7. Informative feedback from peers or teachers |
| 4-2 | View question and answer scores | Find the answer accuracy rate of this account | 8. Assessment of student performance |
| 4-3 | View the number of students taking the course | Find the number of students taking this course | 8. Assessment of student performance |
Table 6. Experimental sample.

| Framework: Deep | Framework: Shallow |
| --- | --- |
| Designs 07 00051 i001 (deep: roll call and Q&A, page of roll call) | Designs 07 00051 i002 (shallow: page of roll call) |
Table 7. Two-way ANOVA analysis summary table.

| Task No. | Task Content | Operating Time Performance Analysis |
| --- | --- | --- |
| 1-1 | Roll call | Nonsignificant |
| 1-2 | View ranking | Nonsignificant |
| 2-1 | Question and answer | Information architecture: shallow less than deep |
| 2-2 | View roll call | Nonsignificant |
| 3-1 | Q&A | Nonsignificant |
| 3-2 | View attendance performance | Nonsignificant |
| 3-3 | View announcements | Information architecture: shallow less than deep |
| 4-1 | Feedback to teachers | Nonsignificant |
| 4-2 | View question and answer scores | Information architecture: deep less than shallow |
| 4-3 | View the number of students taking the course | Nonsignificant |
| SUS | System usability scale | Nonsignificant |

Citation: Chen, S.-C.; Lin, H.-Y. Research on Interface Design of Interactive Response System App with Different Learning Styles. Designs 2023, 7, 51. https://doi.org/10.3390/designs7020051
