Article

Pure Question-Based Learning

by Olle Bälter *, Richard Glassey, Andreas Jemstedt and Daniel Bosk
School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 100 44 Stockholm, Sweden
* Author to whom correspondence should be addressed.
Current address: Institutionen för Pedagogik Och Didaktik, Stockholm University, 106 91 Stockholm, Sweden.
Educ. Sci. 2024, 14(8), 882; https://doi.org/10.3390/educsci14080882
Submission received: 28 June 2024 / Revised: 6 August 2024 / Accepted: 8 August 2024 / Published: 13 August 2024

Abstract

We have evaluated a new pedagogical approach, pure question-based learning, or rather, a modern, digitized version of a much older approach: the Socratic method of learning. The pedagogical approach was evaluated and improved using a design-based research methodology. An online course was developed with pure question-based learning to explain its predecessor, question-based learning. The course was successively taken by students, researchers, and practitioners, and discussed in four group seminars. Feedback from each iteration was integrated into the next version, and the course is still in use (see the link below). Results from the design-based research process were positive (n = 78 participants over four iterations), with the main negative results stemming from the unfamiliarity of the format and feelings of exam-like stress during the first encounter. While pure question-based learning is new, it builds upon well-tested pedagogical methods. The method has several potential advantages: learning can be broken down into smaller modules, there is less passive learning for the students, less learning material needs to be created, and AI could be used for this creation.

1. Introduction

Passive learning refers to the hope that by having students sit through lectures, seminars, videos, etc., they will assimilate and integrate the new knowledge into their existing knowledge. In contrast, active learning strives to engage students by creating many opportunities to practice what they are learning. Question-based learning (QBL) is one form of active learning where students have the opportunity to answer questions that are interspersed throughout course material and receive immediate feedback.
QBL, in the form developed within the Open Learning Initiative, has produced impressive results when it comes to making learning more efficient and effective. Multiple studies have found benefits, and have the data to support their claims [1,2,3,4].
However, developing high-quality learning material is very resource-demanding, and we have previously explored several ways to reduce the cost of development, such as engaging students in the creation of learning material [5]. One successful path we have used is to hire Teaching Assistants (TAs) to develop learning material under the supervision of an experienced subject teacher, thereby reducing the salary cost dramatically [6]. Unfortunately, this also requires that we train new students in the methodology each spring, and we are under pressure to do this with minimal resources.
To solve this recurring situation, we decided to develop an online course on QBL, designed with the QBL methodology, in order to practice what we preach. In our preparation to create this course, we repeatedly ran into research on the benefits of testing first, that is, asking questions first, without any background information, which has been shown to have positive effects on learning [7,8,9,10].
In light of the overwhelming research on the benefits of testing first, and the complete lack of evidence that reading (or watching) learning material before answering questions would be better, we decided to test the hypothesis that we could move all learning material into the feedback of the questions and, thereby, create a course that consists only of questions with immediate interactive formative feedback, see https://qbl.sys.kth.se/sections/questionbased_learning_2209/overview (accessed on 27 June 2024). Hence, we created a course on QBL based upon this new approach, which we coined pure question-based learning (pQBL).
Pure question-based learning: learning material that consists solely of questions with constructive formative feedback.
Figure 1 provides a high-level view of the relationship between QBL and pQBL. In its idealized form, QBL structures learning material as content interspersed with questions with immediate feedback. However, a typical bias that can emerge in learning material is a focus on content (reading, watching, etc.). As a consequence, questions and their feedback are less prevalent, and, in the worst case, absent altogether. In contrast, pQBL addresses this bias by putting the question first, ensuring that there is high-quality feedback, and, if necessary, additional content to supplement the feedback.
This pQBL approach has three major advantages even over the established QBL: it enables the learning material to be broken down into smaller pieces [11], it reduces the amount of learning material for teachers to produce (as there is no longer a need for the teacher to write text to be read before asking the questions), and, consequently, it reduces the amount of learning material for students to consume. Looking ahead, we also believe that the use of generative AI will be a further emerging advantage in terms of reducing production costs.
Getting everything right on the first attempt when developing something new is, of course, unlikely, so we decided to develop the course using a design-based research methodology [12,13] with four iterations in different groups: a research setting, a teacher setting, a student setting, and, finally, a research setting again.

1.1. Question-Based Learning

The Open Learning Initiative (OLI) was founded in 2002 at Carnegie Mellon University to improve learning and teaching through science. Over the years, the initiative has produced a long list of data-driven research in this field that has greatly improved learning [1].
The abbreviation OLI is used with three different meanings: the project, the methodology, and the platform the project has developed. In this paper, we will, for clarity, refer to the OLI methodology as question-based learning (QBL).
Built upon well-established learning science principles, OLI has been demonstrably effective in closing gaps and improving performance for, for example, underrepresented learners in STEM [2,4]. Multiple, rigorous studies have demonstrated the potential of QBL to improve student outcomes in terms of fulfilling the intended learning outcomes while advancing learning research [14,15].
At the core of the QBL courses are questions (activities) grouped into skills, with different constructive feedback for each answer alternative (see Figure 2). This type of learning has been shown to be six times as efficient as reading and watching videos [3].
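To make this structure concrete, the following minimal sketch models one such activity as a small data structure with per-alternative feedback. The class and field names, and the example question, are our own illustrative assumptions; they do not reflect the OLI platform's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Alternative:
    text: str       # answer option shown to the learner
    correct: bool   # whether this option is correct
    feedback: str   # constructive feedback shown immediately after selection

@dataclass
class Activity:
    skill: str      # the skill this question gives practice on
    question: str
    alternatives: list[Alternative]

# Illustrative example: in pQBL, the explanatory material lives in the feedback.
example = Activity(
    skill="qbl-basics",
    question="In question-based learning, when does the learner receive feedback?",
    alternatives=[
        Alternative(
            "Immediately after each answer", True,
            "Correct. Immediate formative feedback is central to QBL: it lets learners "
            "address misconceptions while the question is still fresh in mind."),
        Alternative(
            "Only at the end of the course", False,
            "Not in QBL. Delaying feedback to a final exam removes the opportunity to "
            "practice each skill repeatedly with guidance."),
        Alternative(
            "Only when the answer is wrong", False,
            "Feedback is written for every alternative, including correct ones, so that "
            "even a lucky guess is turned into an explanation."),
    ],
)
```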
The methodology has been applied widely and can be seen as an effective way to learn by doing at scale. It has been proven to work for K-12 education, where cognitive tutors in math produce twice as much learning in algebra compared to traditional algebra courses [16]; for university education in statistics, where the learning time could be reduced by 50% in a randomized study while, at the same time, increasing the learning gain [1]; and for professional development in programming, with a learning time reduction of 25% [17]. The difference between the last two studies is partly because the statistics course had been iterated with data collected from students over several years, hence the 50% reduction, whereas the programming course was freshly developed, that is, without the continuous data-based improvement. The 25% reduction could thus be seen as a measure of the effectiveness of the QBL methodology itself, without using the data in the iterations.
Part of the explanation for why the QBL methodology is so effective is that the many interactive questions in the learning material encourage active learning (see, e.g., [18]). It is well known that students learn more when they are active rather than passive, although students often perceive the opposite [19]; thanks to the possibility in OLI to measure learning continuously, researchers have managed to break this misconception.
However, these learning gains and time savings do not come for free. Each skill should have, on average, seven to eight opportunities (questions) to be tested [20]. Developing this type of high-quality learning material is, therefore, resource-demanding. We have addressed this challenge in several ways, by involving students in the teaching [21], the development of the learning material as a part of their course work [5] in line with other work that encourages students to actively develop questions as part of their learning [22], and as hired TAs [6]. In this paper, we take a new angle by reducing the amount of learning material needed for the course.

1.2. Testing and Testing First

As early as 1950, it was shown that letting learners try to solve a problem before they were taught how to solve it led to significant improvements in both performance and retention compared to teaching the students first and letting them try afterwards [7]. This has since been reproduced by, for instance, Refs. [8,9,10].
There are also further studies showing that it is good for students’ learning to try first, even if it results in failure. Refs. [9,23,24] form a series on “productive failure”. The setup involves two groups of students, one under a “being told” condition, the other under a “finding out” condition; the total time spent was the same for both conditions. In the “being told” condition, the students were first told the method to solve the problems and then practiced it. In the “finding out” condition, they first spent half the time trying to solve the problems by themselves; the other half was the same as the “being told” condition, except with less time for practicing. The problem, in this case, was on the topic of analyzing distributions in data.
Kapur distinguished between different outcomes. Both groups did equally well on items measuring “procedural fluency” (i.e., in computing the variance). On items measuring data analysis, conceptual insight and transfer, the “finding out” group did better than the “being told” group.
([25], p. 219)
There are also studies showing that being tested before actively studying can benefit learning [10,26]. That is, even though students perform poorly on the pretest, the test aids their continued learning of the information more than if they had been exposed to the material before taking the pretest. Hence, given how closely pQBL resembles pretesting, it should lead to more effective learning than a learning environment that requires the student to read or watch the to-be-learned material before answering questions.
According to [25], the “finding out” phase allows the students to introduce sufficient variation to themselves, so that, when they are finally being told, they have discerned the necessary aspects.
We should note that there is an alternative way of phrasing the explanation. When being told the method first, the focus is on memorizing (surface learning). When trying first, being told later, the focus is on the necessary aspects (deep learning). The reference (Ref. [25], Ch. 5) shows that, indeed, the difference between surface and deep learning is the use of different patterns of variation and invariance.
From the perspective of cognitive psychology, research on the benefits of self-testing or retrieval practice supports the idea that QBL is more beneficial to learning than more passive forms of learning, such as reading [27]. In addition, the same evidence also suggests that pure QBL could be even more beneficial than “classical” QBL. First and foremost, retrieval practice (i.e., recalling information from memory as a learning technique) has consistently been shown to lead to better long-term retention of information compared to more passive forms of learning (e.g., [27]; for meta-analyses, see [28,29]). Thus, a learning environment that provides more opportunities to test oneself (e.g., pQBL) should benefit learning more than a learning environment that provides fewer such opportunities (e.g., “classical” QBL).

1.3. Feedback

A central part of the QBL methodology is direct formative feedback. The main goal of feedback is to improve future performance on the same or similar tasks. In QBL, learners receive feedback on their performance. However, it is up to the individual course manager to decide what type of feedback the learner receives. Simply asking questions with no feedback already has a positive effect on learning [30], as does simple (right/wrong) feedback [31]. However, we believe that one of the largest benefits of pQBL is that it always provides the learner with instant elaborate feedback, rather than other forms of less effective feedback.
Feedback can come in many forms, and some forms are better than others [32]. As a general rule, feedback has a positive effect on grades [33] and learning, especially if initial performance is low [34]. In other words, learners gain more from feedback that is provided when they are unaware of the correct answer to a question. Furthermore, although simply verifying whether responses are correct or not can benefit learning (e.g., [31,35]), feedback tends to have a more positive effect the more elaborate it is [32]. For instance, if a learner responds to a true–false question and receives the feedback that this is incorrect, the student will most likely learn that the other alternative is better. However, if we instead inform them about why they were incorrect/correct, they would most likely learn more, and with that understanding be more likely to answer correctly later. In pQBL, all the learning material is moved into the feedback of the questions; as such, it always provides elaborated feedback to the learner. Feedback is also something that students want, preferably early and frequently [33].
While constructive feedback is a long-known part of the scaffolding in a learner-centered pedagogy [36], using it this way in pQBL is (to our knowledge) new. We therefore decided to examine and develop the methodology through design-based research.

1.4. Design-Based Research

Design-based research (DBR) has emerged as a practical research methodology with the aim of bridging research and practice in formal education [13]. DBR promotes the design and redesign of educational interventions based on research findings, making it a flexible and adaptable methodology for teaching contexts [37]. DBR was grounded by [12] and should involve several iterations, although there are examples of just one iteration [38]. In our case, the pQBL approach starts with an assessment of the local context, which, in the setting of our study, has been conducted in four different contexts, further explained in the next section. The interventions can be of different kinds, such as a learning activity, a new type of assessment, the introduction of an administrative activity, or a technological intervention [13]. In the present study, a new type of learning activity was introduced in the form of pQBL.
Many design practices frequently evolve through the creation and testing of artifacts (e.g., technological, social, or information artifacts), iterative refinement, and continuous evaluation [13]. DBR is one such iterative and adaptive approach involving multiple cycles of design and testing in the context studied, in which the design and implementation of the intervention is refined during the DBR process [13,39]. Another important characteristic of DBR is that it involves a collaborative partnership between researchers and practitioners. In our study, we are both researchers and practitioners, but we have also included other researchers and practitioners as well as a group of students in a professional development course.
DBR has many similarities with the more common action research, and the two methods can complement each other [40]; the main difference is that DBR, besides the practical applications, also attempts to develop theories and/or practices.
While the use of DBR in an educational context has so far been limited, it has been growing [37] and has been employed in higher education innovation to develop and explore new teaching and learning methods or scenarios in various contexts (e.g., [41,42,43]).
This leads us to our research question:
How do higher education students and teachers, and researchers in technology-enhanced learning perceive pQBL when testing it for the first time?

2. Methods

2.1. Data Collection

The data collection was adapted to the ongoing pandemic and was, therefore, mostly done online, orally, or in writing, but the last round was held in person. All participants were informed of the purpose of the data collection and offered the option to decline participation without consequences, in accordance with the Helsinki Declaration. As practical experience is important [44], all seminars started by providing hands-on experience for the participants, by simply offering them a link to the pQBL course (https://qbl.sys.kth.se/sections/questionbased_learning_2209/overview (accessed on 27 June 2024)) without any further explanation of the methodology. After experiencing pQBL, the first two groups were asked the same questions:
  • What was your first impression?
  • What kind of challenges do you see?
  • What kind of advantages do you see?
In addition, each page in the course ended with an open-feedback question: “What can be improved on this page?”. Answers to these questions were intended to identify slips and mistakes on that page.

2.1.1. Round 1: University of Macedonia

The first author was invited to give an online seminar/workshop at the Department of Applied Informatics at the University of Macedonia in Thessaloniki, Greece in January 2022. The session was recorded through Zoom, and participants were informed of this, including the intended use for research.
The seminar participants were given a short introduction to the form of QBL used in the Open Learning Initiative, and one slide on the difference between QBL and pQBL. After that, the participants were given instructions on how to access the course and ten minutes to go through as much of the pQBL material as they had time for. They were then asked to reflect on the three questions above, one at a time.

2.1.2. Round 2: University of Gävle

The first author was invited to give an online seminar/workshop at the Department of Mathematics at the University of Gävle in Gävle, Sweden in March 2022. The session was intended to be recorded through Zoom, and participants were informed of this, including the intended use for research. However, the recording on the Gävle side failed at the very start of the seminar, so we switched to extended note-taking for this seminar.
These participants were given the same short introduction and, thereafter, access to the online course in the same way as at the University of Macedonia, answering the same questions, although this seminar session was held in Swedish.

2.1.3. Round 3: East African Course Participants

As part of the concluding workshop in a hybrid course (OneLearns, [45]), the participants were asked to spend up to 20 min going through the pQBL-designed QBL course before attending the concluding workshop, and, during the workshop, to answer the three questions below in groups, using a Google document. These participants had been introduced to QBL through the hybrid course, which covered, among other things, efficient online learning with QBL. The workshop took place in Kigali, Rwanda in April 2022. About half of the course participants participated remotely from Ethiopia, Kenya, and Rwanda.
The East African course participants were adult students; all had at least a bachelor’s degree. The recruitment targeted government officials and actors in civil society organizations (such as NGOs) in Rwanda, Ethiopia, and Kenya who held a position with a mandate to bring about change to those communities.
These participants answered the following three questions, which are a slight reformulation of the questions in the first two rounds, both to provide more context (due to the asynchronous setting) and to separate the eight-month part-time OneLearns course they were about to finish from the 25 min QBL course they took as a prerequisite to the concluding OneLearns workshop:
  • What was your impression of this course that had no reading material besides the questions?
  • What do you think needs to be improved (or what are the disadvantages with this type of methodology)?
  • What were the best parts of the experience?

2.1.4. Round 4: Workshop at the NLASI ’22

In the final round, a workshop on pQBL was organized by the first two authors as part of the Nordic Learning Analytics (Summer) Institute in 2022. The workshop was held on campus in person with ten conference delegates who had voluntarily chosen it.
The structure of the workshop was as follows. First, the 10 participants were given a brief introduction and motivation for pQBL. Next, the participants had the opportunity to work together in small groups of two to three with the pQBL course. This was meant to be followed by a second activity focused on the challenges of creating pQBL questions, as well as a discussion linking learning analytics and pQBL; however, all of the time was spent on the first activity, and an active discussion of impressions of the approach took the main focus for the rest of the allotted time.
The participants were a mix of researchers and practitioners in the field of learning analytics with diverse and considerable expertise shared among them. As such, there was less structure to the discussion that followed the first activity and the guiding questions were unnecessary to prompt discussion.

2.2. Data Analysis

We used thematic analysis for all the gathered data, but, as the data were collected in three different ways, the analysis was applied to three different types of data. For the first recorded session, we analyzed the audio recording for answers to the three questions and transcribed those verbatim. These citations were then subjected to thematic analysis. For the second and fourth sessions, we relied on notes taken during the seminars. From the third session, we used the hand-ins written by the students as direct answers to the three questions.

3. Results

Citations from participants are in italics.

3.1. Round 1: University of Macedonia

About 25 participants (some came and left, as it was on Zoom) followed the seminar. It was a mixture of faculty and Ph.D. students.

3.1.1. First Impression

While the participants in general were positive about the methodology, the first impression might be overwhelming for some: I was also stressed and confused in the beginning. Very much stressed because I did not know what I had to do. Furthermore, very anxious. I do not know. Furthermore, because I am also a student! I do not feel like a professor. It got better with time.
However, some of the participants were really enthusiastic: Excitement! I followed a different approach. Just answer and see what happens. Nobody will be killed. There is no harm in that. You can try again. If you follow this approach, it’s fun.

3.1.2. Challenges

Moving the learning material into the feedback reduced the time for course construction, but it did not simplify question construction: It is [would be] very hard to prepare the questions. Good questions.
This included the ordering of the questions and giving the right feedback. A few of the questions in the course were constructed with negations in the question, something that is advised against [46], but, according to one of the participants, there are also advantages to such questions: The negation was confusing, but I think it was a way to make us focus more. It made me read more carefully the question.
A longer discussion arose around the possibilities that this type of online material could reduce the student’s willingness to buy and read textbooks. However, that road has been open for a long time: The students no longer read the book. They only read my slides.

3.1.3. Advantages

As mentioned above, the advantages of this methodology were apparent, at least for those who said something about it, even though some preferred traditional methods: It is more active. You have to change your style of studying. I still prefer to read, the classical style of learning.
Teachers saw the value of the methodology both for their students and for their own teaching:
If I first see the problem and then [after thinking about the problem] the answer, I will be more engaged than someone tell me okay here is the problem and here is the solution to that problem. So, this question thing, for me, it could be engaging, more than a lecture. However, it might not be for everyone. … Thinking of how to create the feedback helps me structure my lecture, or what I should focus on in my lecture.
The first advantage is that it is easier than displaying questions without giving them feedback. Furthermore, it is more fun if you include those challenging questions. The clear advantages are these. … Giving them hints [feedback] may help critical thinking. In a way solving it himself, instead of copying the knowledge. … It reduces copied answers that are the answers just because the book says so.

3.1.4. Improvements to the Course

We realized that the pure question-based methodology is so different from traditional courses that we needed to have an onboarding process to reduce the feeling that you are challenged with a test, and the subsequent increased levels of stress. Therefore, we added an initial section with some text and three very simple questions (on how to answer questions in this interface).
These pioneers also used the open-ended feedback questions on each page to point out mistakes and errors that we had made in the questions, answer alternatives, or feedback, which we corrected for the next round; mostly, though, these comments praised the course. However, one of the participants had specific knowledge of one of the papers we referred to in our feedback and pointed out that we had misunderstood that paper. This made us go back and re-read the paper, and correct the feedback accordingly.
From learning analytics data showing the percentage of correct answers on the first attempt and, eventually, the percentage of correct answers on each question, we identified a problem with the interface: it did not make the difference between select one (radio buttons) and check all that apply sufficiently clear. As the interface was beyond our control, we tried to reduce these mistakes by stating clearly in the text to check all that apply in those questions.
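As an illustration of the kind of analysis involved, the sketch below computes per-question first-attempt correctness from a hypothetical answer log. The log format and field names are assumptions made for illustration only and do not correspond to the platform's actual data export.

```python
from collections import defaultdict

# Hypothetical answer log: (student_id, question_id, attempt_number, was_correct).
log = [
    ("s1", "q1", 1, False), ("s1", "q1", 2, True),
    ("s2", "q1", 1, True),
    ("s1", "q2", 1, True),
    ("s2", "q2", 1, False), ("s2", "q2", 2, True),
]

# Collect only the first attempt per student and question.
first_attempts = defaultdict(list)
for student, question, attempt, correct in log:
    if attempt == 1:
        first_attempts[question].append(correct)

# Questions with unusually low first-attempt correctness (such as those confusing
# "select one" with "check all that apply") stand out in this summary.
for question, results in sorted(first_attempts.items()):
    pct = 100 * sum(results) / len(results)
    print(f"{question}: {pct:.0f}% correct on first attempt ({len(results)} students)")
```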

3.2. Round 2: University of Gävle

There were 17 participants at the workshop in Gävle. All participants were physically present, but the presenter was remote on Zoom. Due to the failed Zoom recording, the data from Gävle were based on notes taken during the workshop by the first author. These notes were then shared with the workshop organizer, who verified their accuracy. The citations below are, therefore, not verbatim, and are also translated from Swedish.

3.2.1. First Impression

Similar to the comments from the University of Macedonia, the general comments were positive, but the first comment opened up another argument in support of the methodology: I have ADD [Attention Deficit Disorder], so for me this was the way I learn naturally, by trying out things. In other contexts, when teaching how to handle students or athletes with ADD, it is often mentioned that “what is good for the persons on the autism spectrum, is good for all” [47], so this “learn by doing” approach to learning could have benefits we did not think of.

3.2.2. Challenges

The added onboarding questions might have helped to reduce the stress levels, as there were fewer mentions of stress, but this could also be explained by the lack of students among these participants. However, again, it is clear that pQBL is perceived as very different by the participants: For me, the challenge was to leave my comfort zone and allow myself to be wrong, as I always want to give the correct answer.
One limitation of the methodology is, of course, that the questions do not cover all the things that would be mentioned in a textbook: The questions are a selection of the possible knowledge. Someone else has made a choice for me. Of course, these choices of selection for a course or a textbook are always made by someone else: the teacher or the author.
In an attempt to include examples from various topics, we inadvertently excluded one person: In the history-related questions, I did not have the patience to read all the feedback, as I am not interested in history.
These teachers could also identify the challenge of creating high-quality questions: The construction of the questions puts high demands on the person constructing them.

3.2.3. Advantages

These teachers could also see the advantages from a pedagogical perspective:
The interactivity was really motivational!
It makes you really understand what you have not understood.

3.2.4. Improvements to the Course

As we hoped, this round led to fewer changes to the course than the first round. Slips and mistakes were gathered through the open-feedback questions on each page, but the main revision was to the history-themed questions mentioned above, making the feedback shorter and more to the point of the question-based methodology.

3.3. Round 3: East African Course Participants

While the intention of the written data collection was that the course participants should discuss their answers first and respond as a group, the occasionally poor internet connection made this difficult for some of the remote participants, who, instead of having a discussion, wrote down their individual views. In total, there were 26 participants, from whom we received eleven different written answers. Some examples of statements are listed below, under the same headings as above, although the questions had a slightly more elaborate formulation in this written round. In this context, all these participants should be considered professional development students, as it was in this capacity that they partook in the OneLearns course.

3.3.1. First Impression

As students, they did observe the differences between pQBL and traditional methodologies: … this method can work for people who have some basic knowledge on the course, but for the new course material, we think students might be confused.
Despite the onboarding questions, some students still perceived the pQBL design as a test and did not want to answer the questions wrong: To get the answers one had to look for relevant reading material.
This might be an initialization problem, as some students got past this initial confusion: I found it good. When I look at the first time it was confusing. After you become familiar, it becomes interesting.
The (pQBL) methodology led to different student behavior: This method required more concentration from the students.
One of the professional teachers participating in the OneLearns course perceived the pQBL design as valuable, despite being taught essentially the same things in the QBL-designed OneLearns course: The workshop preparation course is very helpful. It has opened my mind to think widely putting myself in student’s position in order to figure out how to prepare QBL and make it effective.

3.3.2. Challenges

The challenges identified by the students were similar to the ones previously identified. Initial shock over the format: it may lead to frustration especially if it’s the first time for a student. Different requirements on the students: Such learning requires commitment of learners and demanded strong [internet] connection access and [i]t’s difficult to adapt oneself who is new to the topic. [I]t requires some pre-existing ideas about the topic then you challenge yourself about what is right. Finally, a concern that the learning depends on the selection of questions: One of the limitation, the student reads only on the specific area to answer that specific question and may not comprehensively refer concepts and theories around the topic of interest.

3.3.3. Advantages

The students could also identify the following advantages of the methodology:
  • It prompt you to develop deep thinking and focus.
  • We have been impressed by how we can answer questions by guessing.
  • The feedback was very constructive and helpful.
  • It promotes focused reading towards the answer to the specific question.
  • It allows learner to test him/herself by answering questions [until he/she] gives correct answer. This is the best way of effective academic learning.
  • Not boring as there is no many texts to read.
  • It is smart and time saving.
The same group that wrote that they searched (probably googled) for “relevant reading material” before answering the question also stated this as an advantage: Testing self in areas that one did not have prior information on was interesting.

3.3.4. Improvements to the Course

We diversified a few questions to other fields (besides history). The data from the open-feedback questions on each page were a mixture of praise, Well explained, Very Good, and requests for more questions, To increase innovative questions that help to Trigger the students’ minds.

3.4. Round 4: Workshop at the NLASI ’22

The workshop at the conference attracted 10 participants, all active researchers in learning analytics or similar fields. To add interactivity to the workshop, we encouraged the participants to work in pairs or threes with the course. Whilst parallel feedback instruments were intended to be used to capture data (online evaluation forms in the course and padlet boards for each activity), they were not used as much as expected. Instead, the open discussion became the main source of interesting data. As there was no recording of the session, comments from the evaluation forms, the notes of the organizers, and their recollections of the event are reported in the following sections. This means that we will not use direct quotes as we did earlier.

3.4.1. First Impression

Presenting a new idea to a group of experts in the same field is both daunting and rewarding. Participants were comfortable with the overall concept as described in the workshop introduction and were enthusiastic to get started. After some rearrangement of seating and sharing of computers, participants dived in by taking the QBL course in the pQBL format. In contrast to the earlier need for a better introduction to the concept, one participant felt that there should be “less onboarding text”.
Within moments, a technical glitch had been uncovered where the correct answer did not match the expected answer. Whilst minor and somewhat beyond our control, this did dent the first impression, as the organizers had to quickly verify that this was not a systemic issue. Participants were vocal in their group discussions, and the organizers floated between groups and joined in where appropriate. Perhaps the most notable takeaway was that the groups never managed to move on to their second and third activities; instead, the session evolved into a full-room discussion about the challenges and opportunities of the approach for online course material.

3.4.2. Challenges

There was little resistance to the idea of promoting feedback as the most important aspect of developing a pQBL course. However, challenges were readily identified by participants. The chief concern was that, whilst having immediate feedback at the moment of activity, “what was the relevant feedback that a learner should see?” The group discussed the possibility for adaptive feedback that is more tightly connected to previous interactions with the system. Personalization of the feedback was understood to be a difficult requirement, as the effort required to generate one version of feedback was already considered the most challenging part.
This led to a branch of discussion on the challenges of generating course material in this format, as much effort had already been put into the instructional design and convincing teachers to start over is a difficult request. As mentioned previously, the use of trained TAs for the course material development was the current state of the art; however, with advances in large language models in the AI space, there is hope that machine support could speed up the development effort and reduce the development cost. Furthermore, there were concerns about finding a sweet spot in the creation of questions that were not too simple and those that were too thought-provoking (which might lead the participants astray from understanding the concept).
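As a purely hypothetical illustration of such machine support, the sketch below builds a prompt asking a language model to draft one pQBL activity with feedback for every alternative. The prompt wording is our own assumption, no specific model or API is implied, and any generated material would still require review by a subject expert before use.

```python
# Hypothetical prompt construction for drafting pQBL material with a language model.
PROMPT_TEMPLATE = """You are helping to write a pure question-based learning activity.
Topic: {topic}
Skill to practice: {skill}

Write one multiple-choice question with four alternatives. For EVERY alternative,
correct and incorrect alike, write two to three sentences of constructive feedback
that teaches the underlying concept, so the feedback can stand in for a textbook passage."""

def build_prompt(topic: str, skill: str) -> str:
    """Fill in the template; the result would be sent to whichever model is available."""
    return PROMPT_TEMPLATE.format(topic=topic, skill=skill)

if __name__ == "__main__":
    print(build_prompt(
        topic="Formative feedback in online courses",
        skill="Distinguish verification feedback from elaborated feedback",
    ))
```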
Another concern raised was how to deal with content that students might never see because they never click on a given answer alternative. To date, it is difficult to tell whether this matters much. There has been some minor anecdotal evidence that some students enjoyed exploring the alternatives just to see the different feedback messages. If students are engaged and at least read the feedback they encounter, then this could be an improvement over the lack of reading from recommended texts. However, the simple conclusion was that the interface could be better organized to make it easy to view the unseen feedback. Ultimately, if the feedback for a single question goes in very different directions, then this is perhaps more a problem of focus on the part of the course constructors.

3.4.3. Advantages

The participants were supportive of the pQBL approach, noting that, whilst on a surface level it might not appear so revolutionary or different from existing online courses (Khan Academy was discussed as one example), what made it significantly different was the shift in focus to activity and feedback as the primary concerns. As much as flipping a classroom changed perspectives on where the focus should be, the pQBL approach was viewed as a further complement to that idea of activating the students first and then discussing the feedback during the teaching session.

3.4.4. Improvements to the Course

The onboarding instructions we added after the first iteration were replaced with three pQBL questions. Throughout the session, the participants also made suggestions for improvements. In terms of analytics, a better dashboard was desired in order to make sense of student answering behavior. Furthermore, finding some way to know whether students were engaging with the feedback was seen as desirable, especially as user interface cues, like a green correct box, might nudge a student to skip the explanation altogether; see Figure 3.
The pair-based mode of interaction used in the fourth iteration when taking the pQBL course was found to be interesting and engaging, much like how peer instruction works to get students to try answering questions and then actively discuss and debate the answers. The organizers had only used groups to foster discussion rather than quiet study, so this use case for students was quite a good suggestion, if the appropriate circumstances can be found to pair students. However, there was also the concern that such a mode might lead to unwanted competition within the groups.
The (large) number of questions might make participants stressed to finish the course rather than reflect on the questions and the feedback. On the other hand, this engagement could also be an indicator that the method works well. Prior work with students has shown anecdotal evidence that students slip into an exam mode when faced with pQBL for the first time, because the appearance of the course material and their prior experience combine to bring up exam-like feelings. This has been countered by spending more time on the onboarding material that students face first.
A good point was raised about note-taking, or, more generally, how students should re-find this material. One should hope that the combination of less content up front and a better underlying structuring of skills and their mapping to activities would make it easier for students (1) to get back to something they wish to review and (2) to retake the question immediately. However, it can also be frustrating if you have to “unlock” an explanation every time you want to revisit it for reference. The platform supports this partly: if you create an account (instead of taking the course anonymously), the results are saved and learners can see their latest answer, including feedback, to all questions.

4. Discussion

This design-based research study set out to answer how students, teachers, and researchers in technology-enhanced learning perceived pQBL when testing it for the first time. Over the four iterations, the following common themes emerged: initial feelings of being overwhelmed quickly gave way to engaged and positive impressions. The focus on active learning was highlighted as positive and motivating, although the need to create high-quality questions with feedback was also noted as a possible challenge for developing further courses. Furthermore, the need for appropriate onboarding to reduce the feeling of being tested in an exam was a recurring theme, even with the modifications made to the beginning of the course after the earliest iteration.
This initial feeling of being overwhelmed was predicted as part of the learnability of the methodology, which, due to its radically different approach, would have scored low on the familiarity scale; some respondents initially wrongly classified the interface as a test due to its similarities with tests; see, e.g., [48]. However, as we wanted to test the pQBL methodology in its fullest, we deliberately started this study without any explanations at all in the course, but felt forced to add them after the first iteration. As the participants in the fourth iteration complained about the onboarding text, we replaced it with three pQBL questions, and we would recommend including these in any course offered to learners who have no prior experience of either QBL or pQBL. This change was very satisfying to make, as we can now use the pQBL methodology to explain itself.
Our study has limitations: it is a qualitative study, and the results can, therefore, not be generalized to other settings or subjects. However, we do think that the positive results with these varied audiences are sufficient to encourage further examination of the methodology, to determine which types of subjects and students it is suitable for.
Another limitation is the means of data collection. Polite participants showing respect for the invited presenter (or teacher, in one case) could have held back criticism. On the other hand, three of the four studies were done with academics who, in general, are not hesitant to express their opinions when something is wrong, or even just to enjoy an intelligent discussion.
Another issue with the data collection was the loss of the intended audio recording from Gävle. This likely resulted in information loss, as note-taking will never be as accurate as a successful audio recording. We tried to counter this by, immediately after the seminar, sending the notes to the organizer to get confirmation that the notes were accurate. Still, details in the participants’ answers were most likely lost, which could have added more color to this part of the study, but we still believe that the main issues were correctly covered. Furthermore, this is one of the benefits of DBR; if we missed something in this iteration, it would probably be caught in the following iterations.
Regarding the possibility that engaging with online material would reduce students’ willingness to read textbooks, we must admit that this could be the case; on the other hand, the pedagogical research on the importance of textbooks does not give much support to the idea that this would be a loss [49]. Furthermore, as mentioned by the participants, this is already happening: students follow lectures and read the presentation slides in order not to buy and read books.
While students might prefer the classical style of learning, based on reading, there is research showing that (a) QBL is six times as efficient as reading [3] and (b) students are poor evaluators of the effectiveness of study methods [19]. Some also mentioned the (negative) feeling of being tested. Compare, e.g., how Marton ([25], Ch. 6) describes the typical Japanese classroom:
In a typical Japanese lesson, on the other hand, the teacher introduced a rather complex problem and invited the students to have a go at it—individually or together in small groups. After a while the teacher asked the students to show their solutions, their attempts were compared and teacher and students might conclude that one way of solving the problem is more powerful than the others, and in such a case the students could practice that particular way of solving this kind of problem on some other, but related, problems.
([25], p. 181)
So, this experience of feeling tested and the preference for a classical style of learning are culturally dependent, and pQBL would probably feel rather natural in the Japanese environment.
One could argue that, by limiting the course content to only things that are possible to ask questions about, we would exclude some knowledge that is difficult to create questions for (or, at least, questions to which it is possible to give constructive feedback). While we find it hard to think of things to learn that cannot be asked questions about, many courses will also include skills that are partly tacit and hard to verbalize, like performing surgery, giving a presentation, or riding a bike. Adding more questions, as long as it is possible, would reduce the learning gaps, but the remaining gaps will, as always, be covered with other traditional methods, e.g., labs, essays, presentations.
Some students mentioned guessing as part of their strategy. Wild guessing, without trying to understand the problem, the question, the answer alternatives, or the feedback, is, of course, not learning. However, if the students are not interested in learning, adding text or video learning material to the existing questions would not improve anything. In a small-scale exploratory study such as this, we cannot claim that the methodology would increase motivation in otherwise unmotivated students, but the participants’ many descriptions of the methodology as “motivating” are encouraging for undertaking such studies. In order to investigate this further, we will study pQBL in courses where learning material already exists that could either be included in the feedback or offered as extra material after the corresponding questions.
Another strategy used by students was, to our surprise, the opposite of guessing: googling the information to be able to answer the questions correctly on the first attempt. This strategy is, of course, not time-saving, but learning how to search for unknown information is a valuable skill to train. Furthermore, since some of the students who practiced this behavior also mentioned that it was “interesting” to study in this way, it might not be all bad. Another question is whether this search-first behavior is only an initial way to cope with the strange situation of being asked questions to which they have no prior knowledge on which to base their answers (although, in this case, it was the second time they were exposed to similar course content, albeit in a different form, in the OneLearns course). Further studies are necessary to examine whether this behavior changes over a course as students become more familiar with the concept.
From this exploratory study, the pQBL design seems promising, but more studies are necessary to prove its value. One such area is to compare the traditional QBL with the new pQBL, and, indeed, a first study about adult learners has just been published [50], confirming our conclusions here, but more are needed. Another is to explore the possibilities of using generative AI to generate the questions, and, more importantly, the feedback that so many teachers are struggling with.

5. Conclusions

We have performed a design-based research study with four iterations on an online course that examines the possibilities of a new pedagogical method: pQBL, where all learning material has been moved into the feedback of the (many) formative questions. The method can be seen as a digitized version of a very old method: Socratic teaching. Overall, the approximately 78 participants were positive about the methodology, some even found it motivational. One challenge is that the new methodology is unfamiliar, which was stressful for some participants, at least initially. Whilst only a starting point, we believe that this method both reduces time spent on developing learning material that is ineffective and refocuses the effort towards the areas of practice and feedback that are much more effective.

Author Contributions

Conceptualization, all authors; methodology, O.B. and R.G.; analysis, O.B. and R.G.; data collection, O.B. and R.G.; writing—original draft preparation, O.B. and R.G.; writing—review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This work has partly been supported by a grant from STINT—The Swedish Foundation for International Cooperation in Research and Higher Education.

Institutional Review Board Statement

Approval for the study was not required in accordance with national legislation, as stated in paragraph 2 of the Swedish Ethical Review Authority [51]. Ethical approval is only necessary if a study involves the collection of sensitive personal data, physical procedures, methods that impact participants physically or physiologically, risks of harm, or the involvement of biological material. In this study, none of these conditions applied. All participants were informed about the intention to use their opinions for the study and were made aware that participation was voluntary, with the option to withdraw at any time without any negative consequences regarding their participation in the seminars.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the registrar’s office at KTH Royal Institute of Technology due to GDPR.

Acknowledgments

We would like to thank all participants in the seminars who improved the course and the methodology with their insightful comments, and also the anonymous reviewers who improved the clarity of this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADD: Attention Deficit Disorder
DBR: Design-based research
GDPR: General Data Protection Regulation
NGO: Non-Governmental Organisation
OLI: Open Learning Initiative
pQBL: Pure question-based learning
QBL: Question-based learning

References

  1. Lovett, M.; Meyer, O.; Thille, C. The Open Learning Initiative: Measuring the Effectiveness of the OLI Statistics Course in Accelerating Student Learning. J. Interact. Media Educ. 2008, 1, 13. [Google Scholar] [CrossRef]
  2. Kaufman, J.; Ryan, R.; Thille, C.; Bier, N. Open Learning Initiative Courses in Community Colleges: Evidence on Use and Effectiveness; Carnegie Mellon University: Pittsburgh, PA, USA, 2013; Available online: https://www.researchgate.net/publication/344726610_Open_Learning_Initiative_Courses_in_Community_Colleges_Evidence_on_Use_and_Effectiveness?channel=doi&linkId=5f8c1f2e458515b7cf8825ed&showFulltext=true (accessed on 9 August 2024).
  3. Koedinger, K.R.; McLaughlin, E.A.; Jia, J.Z.; Bier, N.L. Is the doer effect a causal relationship? How can we tell and why it’s important. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, UK, 25–29 April 2016; pp. 388–397. [Google Scholar]
  4. Yarnall, L.; Means, B.; Wetzel, T. Lessons Learned from Early Implementations of Adaptive Courseware; SRI Education: Menlo Park, CA, USA, 2016. [Google Scholar]
  5. Glassey, R.; Bälter, O. Put the students to work: Generating Questions with Constructive Feedback. In Proceedings of the IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020; pp. 1–8. [Google Scholar]
  6. Glassey, R.; Bälter, O. Sustainable Approaches for Accelerated Learning. Sustainability 2021, 13, 11994. [Google Scholar] [CrossRef]
  7. Székely, L. Productive processes in learning and thinking. Acta Psychol. 1950, 7, 388–407. [Google Scholar] [CrossRef]
  8. Bransford, J.D.; Schwartz, D.L. Chapter 3: Rethinking transfer: A simple proposal with multiple implications. Rev. Res. Educ. 1999, 24, 61–100. [Google Scholar] [CrossRef]
  9. Kapur, M. Productive failure in learning the concept of variance. Instr. Sci. 2012, 40, 651–672. [Google Scholar] [CrossRef]
  10. Little, J.L.; Bjork, E.L. Multiple-Choice Pretesting Potentiates Learning of Related Information. Mem. Cogn. 2016, 44, 1085–1101. [Google Scholar] [CrossRef]
  11. Duggal, S. Factors impacting acceptance of e-learning in India: Learners’ perspective. Asian Assoc. Open Univ. J. 2022, 17, 101–119. [Google Scholar] [CrossRef]
  12. Brown, A.L. Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. J. Learn. Sci. 1992, 2, 141–178. [Google Scholar] [CrossRef]
  13. Anderson, T.; Shattuck, J. Research-based design: A decade of progress in Educational Research. Educ. Res. 2012, 41, 16–25. [Google Scholar] [CrossRef]
  14. Bier, N.; Moore, S.; Van Velsen, M. Instrumenting courseware and leveraging data with the Open Learning Initiative (OLI). In Proceedings of the Companion Proceedings 9th International Learning Analytics & Knowledge Conference, Tempe, AZ, USA, 4–8 March 2019. [Google Scholar]
  15. Clearinghouse, W.W. What Works Clearinghouse Standards Handbook, Version 4.1. US Department of Education, Institute of Education Sciences. National Center for Education Evaluation and Regional Assistance. 2020. Available online: https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Procedures-Handbook-v4-1-508.pdf (accessed on 9 August 2024).
  16. Pane, J.F.; Griffin, B.A.; McCaffrey, D.F.; Karam, R. Effectiveness of cognitive tutor algebra I at scale. Educ. Eval. Policy Anal. 2014, 36, 127–144. [Google Scholar] [CrossRef]
  17. Bälter, O.; Glassey, R.; Wiggberg, M. Reduced learning time with maintained learning outcomes. In Proceedings of the 52nd ACM Technical Symposium on Computer Science Education, Virtual, 13–20 March 2021; pp. 660–665. [Google Scholar]
  18. Deng, R.; Yang, Y.; Shen, S. Impact of question presence and interactivity in instructional videos on student learning. Educ. Inf. Technol. 2024, 1–29. [Google Scholar] [CrossRef]
  19. Deslauriers, L.; McCarty, L.S.; Miller, K.; Callaghan, K.; Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. USA 2019, 116, 19251–19257. [Google Scholar] [CrossRef] [PubMed]
  20. Bälter, O.; Zimmaro, D.; Thille, C. Estimating the minimum number of opportunities needed for all students to achieve predicted mastery. SpringerOpen Smart Learn. Environ. 2018, 5, 1–19. [Google Scholar] [CrossRef]
  21. Glassey, R.; Bälter, O.; Haller, P.; Wiggberg, M. Addressing the double challenge of learning and teaching enterprise technologies through peer teaching. In Proceedings of the IEEE/ACM 42nd International Conference on Software Engineering: Software Engineering Education and Training (ICSE-SEET), Seoul, Republic of Korea, 27 June–19 July 2020; pp. 130–138. [Google Scholar]
  22. Alsufyani, A.A. “Scie-losophy” a teaching and learning framework for the reconciliation of the P4C and the scientific method. MethodsX 2023, 11, 102417. [Google Scholar] [CrossRef] [PubMed]
  23. Kapur, M. Productive failure. Cogn. Instr. 2008, 26, 379–424. [Google Scholar] [CrossRef]
  24. Kapur, M. Productive failure in mathematical problem solving. Instr. Sci. 2010, 38, 523–550. [Google Scholar] [CrossRef]
  25. Marton, F. Necessary Conditions of Learning; Routledge: London, UK, 2015. [Google Scholar]
  26. Little, J.; Bjork, E. Pretesting with Multiple-choice Questions Facilitates Learning. Proc. Annu. Meet. Cogn. Sci. Soc. 2011, 33, 294–299. [Google Scholar]
  27. Roediger, H.L.; Karpicke, J.D. Test-enhanced learning: Taking memory tests improves long-term retention. Psychol. Sci. 2006, 17, 249–255. [Google Scholar] [CrossRef] [PubMed]
  28. Adesope, O.O.; Trevisan, D.A.; Sundararajan, N. Rethinking the use of tests: A meta-analysis of practice testing. Rev. Educ. Res. 2017, 87, 659–701. [Google Scholar] [CrossRef]
  29. Schwieren, J.; Barenberg, J.; Dutke, S. The testing effect in the psychology classroom: A meta-analytic perspective. Psychol. Learn. Teach. 2017, 16, 179–196. [Google Scholar] [CrossRef]
  30. Van der Meij, H.; Böckmann, L. Effects of embedded questions in recorded lectures. J. Comput. High. Educ. 2021, 33, 235–254. [Google Scholar] [CrossRef]
  31. Bälter, O.; Enström, E.; Klingenberg, B. The effect of short formative diagnostic web quizzes with minimal feedback. Comput. Educ. 2013, 60, 234–242. [Google Scholar] [CrossRef]
  32. Shute, V.J. Focus on Formative Feedback. Rev. Educ. Res. 2008, 78, 153–189. [Google Scholar] [CrossRef]
  33. Martinez, C.; Serra, R.; Sundaramoorthy, P.; Booij, T.; Vertegaal, C.; Bounik, Z.; Van Hastenberg, K.; Bentum, M. Content-Focused Formative Feedback Combining Achievement, Qualitative and Learning Analytics Data. Educ. Sci. 2023, 13, 1014. [Google Scholar] [CrossRef]
  34. Pashler, H.; Cepeda, N.J.; Wixted, J.T.; Rohrer, D. When Does Feedback Facilitate Learning of Words? J. Exp. Psychol. Learn. Mem. Cogn. 2005, 31, 3–8. [Google Scholar] [CrossRef] [PubMed]
  35. Marsh, E.J.; Lozito, J.P.; Umanath, S.; Bjork, E.L.; Bjork, R.A. Using verification feedback to correct errors made on a multiple-choice test. Memory 2012, 20, 645–653. [Google Scholar] [CrossRef] [PubMed]
  36. Ahmed, M.M.; Rahman, A.; Hossain, M.K.; Tambi, F.B. Ensuring learner-centred pedagogy in an open and distance learning environment by applying scaffolding and positive reinforcement. Asian Assoc. Open Univ. J. 2022, 17, 289–304. [Google Scholar] [CrossRef]
  37. Cividatti, L.N.; Moralles, V.A.; Bego, A.M. Incidence of design-based research methodology in science education articles: A bibliometric analysis. Revista Brasileira de Pesquisa em Educação em Ciências 2021, e25369. [Google Scholar] [CrossRef]
  38. Schleiss, J.; Laupichler, M.C.; Raupach, T.; Stober, S. AI course design planning framework: Developing domain-specific AI education courses. Educ. Sci. 2023, 13, 954. [Google Scholar] [CrossRef]
  39. McKenney, S.; Reeves, T.C. Educational design research. In Handbook of Research on Educational Communications and Technology; Springer: New York, NY, USA, 2014; pp. 131–140. [Google Scholar]
  40. Majgaard, G.; Misfeldt, M.; Nielsen, J. How design-based research and action research contribute to the development of a new design for learning. Des. Learn. 2011, 4, 8–27. [Google Scholar] [CrossRef]
  41. Kim, P.; Suh, E.; Song, D. Development of a design-based learning curriculum through design-based research for a technology-enabled science classroom. Educ. Technol. Res. Dev. 2015, 63, 575–602. [Google Scholar] [CrossRef]
  42. Reinmann, G. Outline of a holistic design-based research model for higher education. EDeR EDucational Des. Res. 2020, 4, 1–16. [Google Scholar]
  43. Wang, Y.H. Design-based research on integrating learning technology tools into higher education classes to achieve active learning. Comput. Educ. 2020, 156, 103935. [Google Scholar] [CrossRef]
  44. Silva-Díaz, F.; Marfil-Carmona, R.; Narváez, R.; Silva Fuentes, A.; Carrillo-Rosúa, J. Introducing Virtual Reality and Emerging Technologies in a Teacher Training STEM Course. Educ. Sci. 2023, 13, 1044. [Google Scholar] [CrossRef]
  45. Bälter, K.; Abraham, F.J.; Mutimukwe, C.; Mugisha, R.; Osowski, C.P.; Bälter, O. A web-based program about sustainable development goals focusing on digital learning, digital health literacy, and nutrition for professional development in Ethiopia and Rwanda: Development of a pedagogical method. JMIR Form. Res. 2022, 6, e36585. [Google Scholar] [CrossRef] [PubMed]
  46. Haladyna, T.M.; Downing, S.M. A taxonomy of multiple-choice item-writing rules. Appl. Meas. Educ. 1989, 2, 37–50. [Google Scholar] [CrossRef]
  47. Martin Nyling, M.P. Självhjälpsguiden: För dig med ADHD på Jobbet [The Self-Help Guide: For You with ADHD at Work]. 2019. Available online: https://attention.se/wp-content/uploads/2021/03/attention_ADHD-pa-jobbet-_sjalvhjalpsguide.pdf (accessed on 19 August 2022).
  48. Butler, K.A. Usability engineering turns 10. Interactions 1996, 3, 58–75. [Google Scholar] [CrossRef]
  49. Gurung, R.A. Pedagogical aids and student performance. Teach. Psychol. 2003, 30, 92–95. [Google Scholar] [CrossRef]
  50. Jemstedt, A.; Bälter, O.; Gavel, A.; Glassey, R.; Bosk, D. Less to produce and less to consume: The advantage of pure question-based learning. Interact. Learn. Environ. 2024, 1–22. [Google Scholar] [CrossRef]
  51. Swedish Ethical Review Authority (SERA). About the Authority. 2019. Available online: https://etikprovningsmyndigheten.se/en/about-the-authority/ (accessed on 23 October 2023).
Figure 1. Relationship between QBL and pQBL. In QBL, content is followed by questions and feedback, and this pattern repeats throughout the learning material. Content-biasing occurs when there is too much focus on content, to the detriment of the questions, with feedback (if any) merely referring back to the text the student has just read without understanding it. In contrast, pQBL makes questions and feedback the primary focus of the learning material, and content is re-framed as part of the feedback.
Figure 2. Screenshot from a QBL course on introductory programming in Java. The crucial detail to notice is that the learning activity is embedded in the content, and answering the question gives immediate constructive feedback specific to the alternative that was chosen.
Figure 3. An example question from the course illustrating the green-colored feedback.