Article

A Preliminary Usability Study of a Novel Educational Training System to Teach ScratchJr. in School

by María Jesús Manzanares 1, Diana Pérez Marín 2,* and Celeste Pizarro 3

1 Department of Financial Economics and Accounting, Faculty of Economics and Business, Rey Juan Carlos University, Móstoles Campus, 28933 Madrid, Spain
2 Department of Computer Science and Statistics, Higher Technical School of Computer Engineering, Rey Juan Carlos University, Móstoles Campus, 28933 Madrid, Spain
3 Department of Applied Mathematics, Materials Science and Engineering and Electronic Technology, Higher School of Experimental Sciences and Technology, Rey Juan Carlos University, Móstoles Campus, 28933 Madrid, Spain
* Author to whom correspondence should be addressed.
Computers 2026, 15(1), 17; https://doi.org/10.3390/computers15010017
Submission received: 31 October 2025 / Revised: 19 December 2025 / Accepted: 22 December 2025 / Published: 1 January 2026
(This article belongs to the Special Issue Future Trends in Computer Programming Education)

Abstract

Teaching programming to children at an early age has been proven to be beneficial. Some research has focused on how to teach programming to children with special needs. According to Human–Computer Interaction principles, all users should be involved in the design of their systems (including learning systems). However, evaluating systems with young children is a complex task, which becomes even harder when the children have special needs. This paper presents a preliminary usability study of a novel educational training system to teach ScratchJr. to young children. Forty-eight neurotypical students aged 6 to 7 years, together with 2 students with special needs in a pilot study, were asked to use the system to find out whether they could complete an input/output activity and how their performance related to their preferences.

1. Introduction

Human–Computer Interaction [1] studies how to design systems so that they are usable and accessible for everyone, in accordance with the principles of Universal Design [2]. Among interactive systems, special attention is given to those aimed at teaching, and, in particular, at teaching programming [3].
According to Manches and Plowman [4], the teaching of programming at an early age should be a priority. This is because, in early childhood education, children develop the foundations of computational thinking [5] and logical thinking, which enable them to solve problems, design projects, and communicate ideas [6,7].
The resources commonly used for teaching programming in early childhood education include robots such as Cubetto [8,9], Bee-Bot [10], Tale-Bot [11], and KUBO [12].
At the software level, there is a simplified version of Scratch [13] called ScratchJr. [14] which includes fewer blocks that are mainly focused on movement, simple dialogue, and repetition—fundamental concepts typically introduced between the ages of 3 and 7.
Neurotypical children—that is, those who follow a developmental trajectory consistent with the timeframes studied in psychology—are generally able to learn ScratchJr. directly within the application, with guidance from their teachers [15].
However, in the case of children with a disability or learning disorder, the teaching of programming requires additional support and curricular adaptations, as is performed in other subjects, following the principles of Universal Design for Learning (UDL) [16].
In the literature, there are examples of ScratchJr. versions such as ScratchJr. Tactile, a non-digital, tangible version designed to address diversity [17]. There are also examples of tools developed to teach programming to students with autism [18,19], intellectual disabilities [20] or visual impairments [21].
On the other hand, it is worth highlighting the existence of interactive systems that rely on the use of a Pedagogical Conversational Agent (PCA)—that is, an internal or external representation within the system in the form of an animal, robot, or human [22]—which serves as a guide and support for teaching, taking on the role of teacher, student, or companion.
There are examples of agents that have been shown to improve both the performance and satisfaction of neurotypical students [23]. Improvements have even been reported in students’ emotional capacity when learning programming with a PCA incorporating mindfulness [24].
In our previous work, teachers were consulted about the possibility of creating an interactive system to train children—both neurotypical and neurodivergent—in the use of ScratchJr., according to their preferences and needs [25].
The difficulty that some non-verbal neurodivergent children, or those with limited attention spans, face in completing a questionnaire independently was taken into account. This is especially relevant considering that, at these ages, executive functions are generally not yet fully developed, nor are reading and writing skills.
Therefore, a User-Centered Design process [1] was initiated to begin the development of a Pedagogical Conversational Agent (PCA), which the children decided to name Boby, in the role of student, to train in the use of ScratchJr., taking into account classroom diversity.
The first functional prototype of Boby was brought to the classroom during the 2024–2025 school year at a school in Madrid, Spain. It was tested by 48 neurotypical children, aged 6 to 7. A pilot study was conducted with 2 students with special needs to test whether they could also use Boby. The results of this experiment are presented in this article.
Additionally, this article also aims to address research questions such as whether children who are accustomed to using technology find interaction with Boby more satisfying, taking into account their level of digital competence.
Another issue raised is the relationship between block-based games and interaction with Boby, considering that ScratchJr. is a block-based language and that not all children may enjoy playing with blocks.
The article is structured into six sections: Section 2 presents the related state of the art, Section 3 presents the materials and methods of the experiment, Section 4 presents the results, Section 5 presents the discussion of the results, and Section 6 ends the paper with the main conclusions and future lines of work.

2. State of the Art

2.1. Overview

Research on teaching programming in early childhood education is still in its early stages [26,27,28]. This section reviews the published work from both methodological perspectives and practical implementations. It is also important to note that, since the children are between 3 and 6 years old, they are generally not expected to create programs as in primary education. Instead, the focus is usually on coding simpler tasks [29], such as performing sequences of movements or planning routes. In this planning—moving from one point to another, for example, on a grid—basic concepts such as conditionals (going left or right) or loops (turning an object a certain number of times) can begin to be introduced.
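For illustration only (the command names and grid model below are our own, not taken from any curriculum or tool cited here), a route plan of the kind described above can be expressed as a sequence of movement commands, with repetition standing in for a simple loop:

```python
# Illustrative sketch: a route plan as a sequence of commands on a grid.
# Command names and coordinates are assumptions made for this example.
MOVES = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def run_route(start, commands):
    """Execute a sequence of movement commands from a start cell."""
    x, y = start
    for cmd in commands:
        dx, dy = MOVES[cmd]
        x, y = x + dx, y + dy
    return (x, y)

# Repetition ("turn/move a certain number of times") as a loop:
# move right three times, then up once.
route = ["right"] * 3 + ["up"]
print(run_route((0, 0), route))  # → (3, 1)
```

The point of the sketch is that a child's route on a grid mat is, structurally, a program: an ordered sequence of instructions whose effect can be checked immediately.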
Furthermore, when teaching coding to preschool-aged children, it is essential to use an appropriate pedagogical approach, as at this age children have only developed very basic reasoning, lacking the ability to abstract or make complex associations. However, they can already perform two tasks simultaneously, making it an ideal time to introduce them to computer science [30].
Bers [31] proposed, at a methodological level, the TangibleK program, which is a detailed curriculum for teachers who wish to teach coding in early childhood education. The theoretical foundations of this approach can be found in the framework of Positive Technological Development, aimed at developing personal and coding skills based on constructionism [32], and Positive Youth Development [33]. The program involves a minimum of 20 h of in-person work divided into six sessions dedicated to engineering design processes, robotics, flows, loops and parameters, sensors and loops, and sensors and branches.
Each session follows a similar format: (1) initial preparation; (2) activities; (3) individual or collaborative work on a small project; (4) technology circle; and (5) assessment. The MECUE methodology [34] is based on the TangibleK program adapted for an unplugged teaching approach without using robots, and on books such as Hello Ruby [35] in which a 5-year-old girl must solve various puzzles using coding skills—for example, dressing herself using the correct sequence of clothing. Other tasks may involve asking children to perform movements as if they were a robot executing the orders provided by the teacher in sequence, as if following a program.
At the practical tools level, ref. [36] distinguished two main approaches: software environments and robots. In the case of software environments, it is emphasized that interaction for children aged 3 to 6 should be primarily tactile, interacting with a digital projector in class or with tablets. One of the most widely used software environments is the ScratchJr. programming language [37]. This language is simplified compared to Scratch but follows the same philosophy, facilitating the transition to using Scratch when the child turns 6. The instructions are also pieces of a puzzle that form the program to move an object, which can also be a cat. However, unlike Scratch, ScratchJr. has fewer instruction blocks and is used on a tablet, as shown in Figure 1.
Figure 2 shows an example of a program to say “Hello” in ScratchJr. on a tablet screen. It can be seen how the instruction with the text “Hello” is selected, and when executed, the cat says “Hello.”
In the case of robots, approaches without screens are distinguished, such as Cubetto [8,9], in which children also complete a puzzle—in this case, pieces to be placed on a wooden controller to move the robot on a grid mat forward, turn left, or turn right. It is also possible to combine this with a mixed approach, where one group of children uses Cubetto while other children perform the same actions themselves, following the robot’s sequence [36].
Another group of robots is used with buttons, such as KIBO, Bee-Bot, Code & Go, and Code-a-pillar [38], which differ in their design, how children can explore concepts and interact with the robot, and the range of activities that can be performed with the robot. Figure 3 shows an example of the most representative button-based robots for developing programming skills in early childhood.
It can be observed that, in all cases, the design is friendly, rounded, and features bright colours to attract children’s attention. Additionally, three of the four robots are shaped like animals, which are generally very appealing at these ages. Regarding materials, plastic is typically used to reduce costs. They can also be distinguished based on whether they use insertable code blocks, as in KIBO, or plastic tiles, as in Code & Go. In any case, the benefits of young children being able to immediately see the effects of their interactions with the robot have been validated in multiple studies [31,39,40,41,42,43].

2.2. Teaching Programming to Students with Special Needs

The teaching of programming, as indicated in the various studies found in the literature, is beneficial for children from an early age, even though research on its teaching in Early Childhood Education is still in its early stages [26,27,28].
This section presents studies by several authors highlighting the benefits of the use of ICT and robotics for all pupils, and particularly for pupils with special educational needs.
1. One relevant study identified is the work by Lorena Lanzas Llorente entitled “Emotional comprehension work in children with ASD (autism spectrum disorder) through Educational Robotics: a proposed intervention approach”, which focuses on the benefits provided by robotics, in this case through the use of a robot called AISOY1 Kik-E, for children with ASD. The study centres on developing the understanding of basic emotions through the recognition of facial expressions. The aim of this work is to provide children with an innovative resource that promotes their emotional development and, consequently, supports their educational development.
In this study, the author takes into account the principles of meaningful learning by using the AISOY1 Kik-E robot as a motivating element and a tool for enhancing pupils’ skills, allowing learners to play an active role in the teaching–learning process [44,45,46].
Therefore, this study introduces the use of AISOY1 Kik-E as a support or facilitator of learning rather than as the object of learning itself [44,47].
The conclusions drawn by the author from this study were as follows:
(1) At a general level, the author observed the benefits of Educational Robotics for pupils with special educational needs; in this specific case, for pupils with ASD, as it facilitates learning and enhances interaction, among other aspects.
(2) Robots such as AISOY1 Kik-E help children with ASD (autism spectrum disorder) to develop various socio-emotional skills by using visual resources adapted to their needs, which enable repetition, interaction, and emotional comprehension.
2. Other relevant studies identified include the work by [48], in which the authors, in their study on technological tools for inclusive education, highlight the possibilities offered by ICT in addressing classroom diversity. They argue that these tools can facilitate access to education and provide high-quality learning opportunities for pupils with different special educational needs present in mainstream classrooms.
In this study, the authors use various tools already available on the market, such as Araboard, Araword, Pictsonidos, and Pictogram, all integrated within the ARASAAC software version 2024–2028 (project 38-09), for the teaching of different disciplines, such as writing and associating images with their meanings, among others. These tools are particularly aimed at pupils with special educational needs.
The conclusions drawn from this study indicate that ICT represents an approach with significant potential, especially for individuals with motor disabilities, ASD (autism spectrum disorder), and ADHD (Attention-Deficit/Hyperactivity Disorder) within educational contexts.
Studies such as that by [49] reinforce the idea that the use of ICT with all pupils, and particularly with pupils with special educational needs, provides a controllable environment and learning situation. This, in turn, increases learners’ motivation and reinforcement, as it promotes attention and reduces frustration in response to errors, enables autonomous work, fosters the development of self-regulation skills, and adapts to the individual characteristics of pupils with special educational needs, thereby facilitating their inclusion.
3. Studies such as that by [50], in their work entitled “Software tools for inclusive education in the early childhood education stage”, also focus on the various existing computer-based tools applicable in early educational settings, paying particular attention to how these tools may influence development and learning, especially among pupils with special educational needs. The study conducted by these authors was qualitative in nature, allowing for a more in-depth exploration of existing software tools used in inclusive education, the identification of best practices, and an examination of how their use influenced children with special educational needs.
The results obtained were as follows:
  • The authors observed a growing trend in the use of these tools in inclusive education at early educational stages and were also able to verify, through studies conducted by [51], that not all software tools are equally effective for pupils with special educational needs.
  • The existing software tools used with pupils with special educational needs and analysed in the authors’ study include:
    1.1 Tinkercad, used to teach programming to secondary school students.
    1.2 Various educational mobile applications employed in different learning processes for pupils in Early Childhood and Primary Education.
    1.3 Several interactive programmes aimed at the development of logical and motor skills, used with preschool and primary school pupils.
  • Furthermore, through this qualitative study, the authors were able to confirm that 67% of educational institutions have incorporated at least one software tool designed for inclusive education into their early childhood education programmes.
  • Finally, at least 72% of pupils with special educational needs showed a notable improvement in learning during their educational process, underscoring the relevance of these tools within the field of inclusive education.
4. Another study identified is “The influence of motivation and cooperation among primary school pupils through educational robotics: a case study” by [52]. This study seeks to explore the relationship between Educational Robotics and the factors influencing pupils’ motivation during the implementation of a robotics-based project, with the aim of determining whether Educational Robotics constitutes part of an educational change in the classroom from both a methodological and a procedural perspective.
The study is focused on an early educational level (primary education) and is addressed to all pupils, both neurotypical and those with special educational needs; however, it does not place particular emphasis on pupils with special educational needs.
This research shows how studies by authors such as [53] identify robotics as one of the most important resources within educational technology, as it can provide a constructive learning environment in the classroom.
The conclusions reached by the author of [52] in this study were as follows:
  • Educational robotics promotes increased motivation and interest among pupils in the classroom, leading to more meaningful outcomes in the teaching–learning process.
  • A cooperative working methodology generates both academic and social benefits for pupils.
  • Additionally, drawing on the work of [54] in his book, the author argues that in order to achieve an effective technological learning environment, it is necessary to take into account all the emotions that may influence pupils. She maintains that for learning to occur, it is essential to generate an emotional response in learners; when this emotion is positive, learning becomes more meaningful, as motivation plays a central role.
5. A brief summary of other studies identified in the literature is presented below:
  • Floor Robots (Bee-Bot)
    The Bee-Bot robot is part of the new generation of educational robotics designed to introduce young children to programming, specifically pupils in Early Childhood Education and the first cycle of Primary Education. Several studies have employed Bee-Bot, a programmable bee-shaped robot specifically designed for children aged 4–7 years.
    According to Angeli and Valanides [55]:
    The authors examine how working with Bee-Bot influences sequential thinking skills (sequencing, route planning, and debugging) in pupils in the early years of primary education.
    They conclude that, with appropriate scaffolding, children improve in problem decomposition, step anticipation, and logical reasoning.
    According to Di Lieto et al. [56]:
    An Educational Robotics Lab programme using Bee-Bot and Pro-Bot was implemented with pupils with special educational needs (neurodevelopmental conditions and learning difficulties) integrated within mainstream classrooms.
    The benefits observed included:
    Improvements in executive functions (planning, working memory, and inhibitory control).
    Increased participation and better behavioural regulation during classroom tasks.
    Inclusive activities in which both pupils with special educational needs and their peers without special educational needs participated, thereby strengthening social interactions.
    Review of educational robotics and neurodevelopmental disorders [57]:
    A critical review of studies involving educational robots with children with ASD, ADHD, intellectual disability, among others.
    The review includes experiences with Bee-Bot-type robots at Early Childhood and Primary Education levels.
    The main conclusions indicate that:
    Robots increase motivation and participation.
    They can support both academic skills (pre-programming and logical–spatial concepts) and social skills (pair work, turn-taking, and communication).
  • Kit-Based Robots (KIBO) for Children Aged 4–7 Years
    KIBO is a robot explicitly designed for children aged 4–7 years and is programmed using physical wooden blocks, making it a screen-free system.
    According to Elkin, Sullivan, and Bers [58]: “Programming with the KIBO Robotics Kit in Preschool Classrooms”
    The study involved work in early childhood classrooms using KIBO.
    Results:
    Children aged 3–5 years were able to create simple programmes, sequence instructions, and debug them.
    KIBO was naturally integrated with activities such as art, music, and storytelling, which supported the development of language skills and creativity.
    KIBO as a tool for Computational Thinking and STEAM
    Documentation and studies from DevTech/KinderLab show that KIBO enables children aged 4–7 years to develop computational thinking, problem-solving skills, and STEAM exploration without the need for screens.
    Although many studies involving KIBO do not focus specifically on pupils with special educational needs, they do report:
    High accessibility due to its tangible and manipulable nature.
    The possibility of adaptation for pupils with attentional, motor, or communication difficulties, owing to the physical, collaborative, and multimodal nature of the activities (sound, movement, and construction).
  • Social Robots and Programming with Children with ASD/Special Educational Needs
    This line of research combines social robotics with the development of social competences, while also incorporating elements of simple programming or “co-programming”.
    According to Gkiolnta et al. [59]: “Robot programming for a child with ASD”
    The study presents a case of robot programming involving a child with ASD.
    The child participates in guided programming activities, with observed improvements in sustained attention, interaction, and enjoyment of the task.
Although many of these studies focus more on social skills than on “pure” programming, in practice robot programming activities (such as deciding sequences and testing them) are used as a vehicle for developing attention, cognitive flexibility, and cooperation.

3. Materials and Methods

3.1. Context

During the 2024/2025 school year, a school in the Community of Madrid, Spain, was asked for permission to attend a validation session of a new interactive system being developed as a future training agent in ScratchJr. This was performed to assess whether, at that stage of development, students were able to interact with the system and to show their level of satisfaction.
Figure 4 shows an example of a system screen, where it can be seen that the ScratchJr. blocks are displayed on the left, the question appears at the top, and students must drag the blocks to the center to check whether they have completed the exercise correctly or if they need to try again.
As can be seen in Figure 4, the initial question is “I have to say Hello! How can I do it?”. The idea is that the child helps Boby by dragging the ScratchJr. blocks so that “Hello” is said in the correct position. Initially, no support is provided to see if neurotypical or neurodivergent students can complete the task independently. The option to receive help is available, including interacting with the agent by speaking (microphone) if a block cannot be dragged, and receiving all information in audio for students with visual impairments. For students with autism, the idea of sequence is emphasized using the numbers 1 and 2, and guided information about the task is provided in advance. For students with ADHD, they are allowed to take a pause and receive recommendations, such as going for a walk or using a Bobath ball, to better focus on the system’s instructions.
The novel approach of the tool should also be highlighted: it does not aim to teach the student and does not assume the role of a teacher. On the contrary, it takes on the role of a student who needs help, in order to increase motivation and satisfaction for learners who are not only learning for themselves but also helping the agent Boby, following the Learning by Teaching methodology used by agents such as Betty [60].
Figure 5 shows the technical diagram of how the agent works.
As can be observed, the student accesses the system directly either through voice input or via mouse interaction, without requiring any form of authentication. On the initial screen, or main interface, the system asks the student “How are you?”, and depending on the response provided—either verbally or through mouse interaction—the system behaves as follows:
(a) If the student selects options one or two, a screen presenting recommendations on the actions to be performed is displayed. These recommendations are provided both in written form and as audio output. Once the student has completed the recommended actions, they return to the initial or main screen of the system, either by voice command or via mouse interaction.
(b) If the student selects option three, a screen displaying the number of activities proposed by the system is shown. When the student indicates—either verbally or through mouse interaction—the activity they wish to perform, the corresponding activity screen is displayed. Instructions for completing the activity are provided both in written form and through audio.
(i) If the child is able to complete the activity independently, they can automatically return—by voice command or mouse interaction—to the main screen containing the set of questions in order to select another activity.
(ii) If assistance is required, the system redirects the student to the recommendations screen. Once the student has completed the recommended actions, they return to the initial or main screen of the system, either verbally or through mouse interaction.
From this main screen, the student may choose to continue working with the system or to exit it, indicating their decision either by voice or via mouse interaction.
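The navigation logic described above can be sketched as a small state machine. This is a minimal illustration of the described screen flow only; the screen names, choice labels, and structure are our own assumptions, not taken from the actual Boby implementation:

```python
# Sketch of Boby's screen navigation as described in the text.
# All identifiers here are illustrative assumptions.

def next_screen(current, choice):
    """Return the next screen given the current screen and the student's
    choice (made by voice or mouse; the input modality is not modeled)."""
    if current == "main":
        if choice in ("option_1", "option_2"):
            return "recommendations"   # recommendations, written + audio
        if choice == "option_3":
            return "activity_list"     # list of proposed activities
        if choice == "exit":
            return "exit"
    elif current == "recommendations":
        return "main"                  # after completing recommended actions
    elif current == "activity_list":
        return "activity"              # the selected activity screen
    elif current == "activity":
        if choice == "completed":
            return "main"              # choose another activity
        if choice == "help":
            return "recommendations"   # redirected for assistance
    return current

# Example: a student opens an activity, needs help, returns, and exits.
path = ["main"]
for c in ["option_3", "select", "help", "done", "exit"]:
    path.append(next_screen(path[-1], c))
```

Running the example traces the path main → activity_list → activity → recommendations → main → exit, matching the flow in Figure 5.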
A sample dialog of how Boby and the student interact is shown in Figure 6.

3.2. Sample

Fifty children aged 6–7 years were recruited, of whom 17 (34%) were boys and the remaining 33 (66%) were girls. Two of the children were neurodivergent and had special educational needs. They were distributed across two classes of 25 students each; in each class, 24 were neurotypical and 1 was neurodivergent with educational support needs.
None of the students had previously learned programming or were familiar with ScratchJr. Forty-three of the 50 students (86%) were able to complete a simple questionnaire and had basic reading skills.

3.3. Research Questions

The main objective of this study is to validate whether students are able to use the Boby system to train in the use of ScratchJr., and to analyse which factors influence user satisfaction with the tool.
Research Question 1 (RQ1) is: which factors influence students’ satisfaction with Boby? The associated hypotheses are:
H1. 
Students who are accustomed to using tablets for learning will find using the Boby system more satisfying.
H2. 
Students who enjoy block-based games will find using the Boby system more satisfying.
Research Question 2 (RQ2) is: to what extent does the student’s reading ability influence their ability to use Boby?
Research Question 3 (RQ3) is: which factors influence students’ ability to complete the activity shown in Figure 4, that is, teaching Boby to say “Hello”?

3.4. Instrument

To address the research questions, a questionnaire with nine multiple-choice questions was designed and validated by a Special Education teacher, as described in our previous work [25].
The questionnaire is included in Appendix A (in Spanish).

3.5. Procedure

During a session in the computer classroom, students were arranged so that everyone had access to a tablet connected to the Internet to interact with the Boby system (see Figure 4). The classroom tutor and a support teacher for students with special educational needs were present. Additionally, the first author of the article was allowed access to the classroom as an observing researcher.
Initially, all students were asked to individually complete a paper questionnaire validated by the Special Education teacher, as described in our previous work [25]. All neurotypical students were able to complete the questionnaire, as were the two students with special educational needs, with partial support of the teacher.
Subsequently, all students were allowed to interact with the Boby system freely for 30 min. It was observed that all neurotypical students were able to interact with the system without difficulty, and the two neurodivergent students with special educational needs were also able to interact with the system with the support of their teacher.
Finally, all students and their teachers were thanked for their participation in the experiment. The tablets were collected (an email with information about the completed exercise was sent to the researchers), and all data on the classroom tablets were deleted.

4. Results

In this section, the quantitative results of the study are presented. The main outcome variable, Y26 (“Did you like the app?”), is used as an indicator of students’ satisfaction with Boby. Three research questions are then addressed: RQ1 examines which background factors (such as usual tablet use and preference for block-based games) are associated with students’ satisfaction; RQ2 explores to what extent students’ reading ability (Q7) is related to satisfaction; and RQ3 focuses on students’ ability to complete the “Hello” activity using ScratchJr. Given the small sample size, the presence of missing data, and the strong imbalance in satisfaction responses, the analyses are based exclusively on descriptive statistics (frequencies and percentages), supported by tables and bar charts in the following subsections.

4.1. Descriptive Overview of the Participants and Variables

Fifty children aged 6–7 years participated in the study. As described in Section 3.2, 17 (34%) were boys and 33 (66%) were girls, and two of the children had identified special educational needs and were considered neurodivergent. None of the students had previous experience with ScratchJr., and 43 out of 50 (86%) were able to complete the paper questionnaire with at least basic reading skills. The distribution of the main sociodemographic and background variables is summarised in Table 1.
The main outcome variable used in the analyses was item Y26 (“Did you like the app?”), with response options “Yes”/”No”. The explanatory variables considered were: (a) Q10, usual use of the tablet (e.g., mainly for games or mainly for learning); (b) Q14, whether the child likes block-based or piece-based games (“Yes”/”No”); and (c) Q7, reading ability on an ordinal scale from 0 to 3, where higher values indicate greater reading proficiency. In addition, information was collected on whether the questionnaire was completed, which allows the amount of valid data available for the analyses to be estimated.
Given the age of the participants and the classroom context, some items have missing data. In addition, the distribution of the main outcome is strongly skewed: most children reported that they liked the app, with only a small minority selecting “No”. This imbalance, together with the small number of children in some categories (for instance, those who do not like block games, or those at the lowest reading level), makes complex modelling approaches difficult to interpret. For this reason, the results are reported using simple, robust descriptive statistics (frequencies and percentages), complemented by bar charts. Percentages are based on valid responses for each variable. Small deviations from the nominal sample size (N = 50) reflect missing data for some items.
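The valid-response convention described above can be sketched in code. The following minimal Python example is illustrative only: the data frame is hypothetical toy data, not the study dataset, and the column names simply mirror the questionnaire items (Y26, Q10); it shows how frequencies and percentages based on valid cases are obtained when some answers are missing.

```python
import pandas as pd

# Illustrative data only: a few hypothetical responses, NOT the study data.
# None represents a missing answer on the paper questionnaire.
df = pd.DataFrame({
    "Y26": ["Yes", "Yes", "No", "Yes", None, "Yes"],               # Did you like the app?
    "Q10": ["games", "learning", "games", None, "games", "mixed"], # usual tablet use
})

# Frequencies of the outcome, counting valid responses only.
counts = df["Y26"].value_counts(dropna=True)

# Percentages computed over valid cases: missing answers are
# excluded from the denominator, as in the tables reported below.
percentages = df["Y26"].value_counts(normalize=True, dropna=True) * 100

# Cross-tabulation of satisfaction by tablet use; cases with a missing
# value in either variable are dropped, so totals may fall below the
# nominal sample size.
table = pd.crosstab(df["Q10"], df["Y26"])

print(counts, percentages, table, sep="\n\n")
```

In this toy example, 4 of the 5 valid Y26 responses are "Yes" (80%), and the cross-tabulation retains only the 4 cases with valid values in both variables, mirroring why subgroup totals in the tables below do not always sum to N = 50.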

4.2. RQ1—Factors Associated with Satisfaction with Boby

Research Question 1 (RQ1) asks which factors are associated with students’ satisfaction with Boby, operationalised as answering “Yes” to Y26 (“Did you like the app?”). Among the children with valid data for Y26 and the background variables, 34 out of 40 students indicated that they liked the app, which corresponds to 85% of the usable sample. Descriptively, no subgroup shows a pattern of general dissatisfaction with the system.
In this subsection, satisfaction is described by usual tablet use (Q10) and by preference for block-based games (Q14). The corresponding frequency distributions are summarized in Table 2 and Table 3, and visualized in Figure 7 and Figure 8, respectively.

4.2.1. Usual Use of the Tablet (Q10)

Table 2 summarizes the distribution of “Yes”/”No” responses to Y26 across the categories of Q10 (usual use of the tablet). For each category, the table reports the number of children who said that they liked the app and the number of those who did not.
As shown in Table 2, satisfaction is high in all three groups. Among students who reported using the tablet mainly for games, 23 out of 27 indicated that they liked the app. Among those who reported using it mainly for learning, 7 out of 9 liked the app. Finally, all 4 students in the “other/mixed” category reported that they liked the app. Overall, 34 out of 40 students with valid data for Q10 and Y26 reported that they liked the app.
Although the proportion of “Yes” responses is slightly higher in the “other/mixed” category, this group is very small and the difference should therefore be interpreted with caution. There is no clear pattern indicating that using the tablet mainly for games or mainly for learning is associated with a markedly different level of satisfaction. Figure 7 presents these relationships graphically (together with other background variables), highlighting that the bar corresponding to “Yes” is consistently dominant in all Q10 categories.
Figure 7. Proportion of “Yes, I liked the app” (Y26) across subgroups of tablet use and previous experience with technology (panels for Q10, Q9, Q8 and Q14).

4.2.2. Preference for Block-Based Games (Q14)

To explore Hypothesis H2 (“Students who enjoy block-based games will find using the Boby system more satisfying”), Table 3 summarises Y26 by Q14. Among children who said that they like block or piece-based games, 27 reported that they liked the app and 5 reported that they did not. In the small group of children who stated that they do not like block games, all 4 reported that they liked the app. For students with missing values or “not applicable” responses on Q14 (labelled as NA), 3 liked the app and 1 did not.
Thus, satisfaction with Boby is high in all Q14 categories, including among those who reported not liking block games. The small size of some subgroups—particularly the “No” and NA categories—means that the observed differences should be interpreted with caution. Figure 8 displays the same information as a bar chart, showing that the percentage of children who liked the app is consistently high and that disliking block-based games does not appear to prevent students from enjoying their interaction with Boby.
Figure 8. Percentage of children who liked the app (Y26 = “Yes”) according to preference for block-based games (Q14).
Taken together, the descriptive analyses for RQ1 suggest that no clear subgroup of children is dissatisfied with the system based on their usual tablet use or their liking of block-based games. Satisfaction scores remain high and relatively homogeneous across the different categories considered.

4.3. RQ2—Reading Ability and Satisfaction with Boby

Research Question 2 (RQ2) examines to what extent students’ reading ability (Q7) is related to whether they report liking the app (Y26). For this analysis, data were available for 40 children with valid scores in both variables.
Table 4 shows, for each level of Q7, the total number of children, how many said that they liked the app, and how many did not. The pattern is very stable:
  • At the lowest reading level (Q7 = 0, “reads one word”), the only child in this category reported that they liked the app.
  • Among those who “read some words” (Q7 = 1), 6 out of 7 children liked the app and 1 did not.
  • Among those who “read all words” (Q7 = 2), 27 out of 32 children liked the app and 5 did not.
  • No students were classified as “advanced readers” (Q7 = 3) in the current dataset.
Table 4. Satisfaction with the app (Y26) by reading ability level (Q7): frequencies.
Reading Ability (Q7) | Total n in Subgroup | n “Yes, I Liked the App” | n “No”
0 = reads one word | 1 | 1 | 0
1 = reads some words | 7 | 6 | 1
2 = reads all words | 32 | 27 | 5
3 = advanced reader (if applicable) | 0 | 0 | 0
Overall, 34 out of 40 children with reading data (85%) reported that they liked the app. Figure 9 visualises these percentages, showing that the bars corresponding to each reading level are all high and very similar in height. There is no clear trend suggesting that higher reading ability is associated with greater satisfaction.
From a descriptive perspective, these results indicate that high reading ability is not required for children to enjoy using Boby. Children with more limited reading skills were still able to report high levels of satisfaction with the app.

4.4. RQ3—Completion of the “Hello” Activity

Research Question 3 (RQ3) focuses on which factors influence students’ ability to complete the activity shown in Figure 4, namely teaching Boby to say “Hello” using the ScratchJr. blocks. For this analysis, log data on completion were available for 42 students, distinguishing between completion without help, completion with help, and non-completion.
Table 5 summarises the distribution of completion outcomes by group. Among neurotypical students with available log data (n = 41), 37 completed the activity (2 without help and 35 with help), while 4 did not complete it. For the neurodivergent student with available log data (n = 1), the activity was not completed. Thus, in total, 37 out of 42 students with valid completion data (88.1%) succeeded in completing the “Hello” activity, although the majority required some degree of assistance.
These results indicate that most students were able to use the system to solve the simple programming task, but also that support from teachers or assistants played an important role in enabling completion, particularly for some students. Given the small number of neurodivergent students with log data, the results for this group should be interpreted with extreme caution and are better understood as illustrative cases rather than as generalisable evidence.

4.5. Synthesis

Across the three research questions, the descriptive results show a coherent pattern. First, satisfaction with Boby was very high in almost all subgroups considered (RQ1), and there was no clear descriptive evidence that usual tablet use or preference for block-based games systematically influenced whether children liked the app (see Table 2 and Table 3 and Figure 7 and Figure 8). Second, reading ability did not appear to be a limiting factor for enjoyment of the system (RQ2): children with different levels of reading proficiency reported very similar percentages of “I liked the app” (Table 4 and Figure 9). Third, most students with valid log data successfully completed the “Hello” activity (RQ3), although the majority required some assistance and the single neurodivergent student with available log data did not complete it (Table 5).
Overall, within the limits of this small and imbalanced sample, these descriptive findings suggest that Boby was perceived as usable and enjoyable by the large majority of participating children, regardless of their previous experience with technology, their preference for block-based games, or their reading ability.

5. Discussion

Regarding the first research question (RQ1) about the factors that influence students’ satisfaction with Boby, the following hypotheses were explored:
H1. 
Students who are used to learning with a tablet will find the use of the Boby system more satisfactory.
H2. 
Students who enjoy block-based games will find the use of the Boby system more satisfactory.
As shown in the results (Section 4.2), neither of the two hypotheses was confirmed. This means that students do not need to be accustomed to using a tablet in order to use Boby, nor do they need to enjoy block-based games.
These results are relevant because they broaden the range of students who can interact with the agent to carry out their programming training using ScratchJr.
In particular, this includes autistic students who might not show interest in playing with blocks [61], but who may still be motivated to use Boby for training in ScratchJr., which is a block-based language.
It also takes into account students who usually do not have access to tablets due to economic constraints in their families or other types of limitations. A specific example is the new regulation of the Community of Madrid in Spain [62], under which the use of screens is legally prohibited for children aged 0 to 3 and limited to one shared hour per week for students in the second cycle of Early Childhood Education (ages 3–6) and the first cycle of Primary Education (ages 6–8), one and a half hours per week for students in the second cycle of Primary Education (ages 8–10), and two hours per week for those in the third cycle of Primary Education (ages 10–12).
There are exceptions to these laws for children with special educational needs, provided that a psychopedagogical report justifies that technology could support their learning [62].
Regarding the second research question, which examined the extent to which a student’s reading ability influences their capacity to use Boby, the descriptive analysis presented in Section 4.3 also found no evidence that higher reading ability is associated with a greater likelihood of satisfaction when using Boby. This finding is relevant because it indicates that students do not need to know how to read in order to use the app, which was designed to be visually accessible through descriptive icons and also includes auditory aids for students who are visually impaired.
These results are consistent with other studies that have gone a step further in exploring the inverse relationship—namely, that the use of an interactive application may even improve reading ability. In particular, studies have shown that young learners can engage successfully with interactive reading applications and that such experiences can positively influence their later non-digital reading skills [63,64].
Regarding the third research question, which explores the factors that influence students’ ability to complete the activity shown in Figure 4—that is, teaching Boby to say “Hello”—the results show that both neurotypical and neurodivergent students are able to complete the activity regardless of their ability to use tablets, read, or visually identify the content of images.
In total, 38 out of 48 students were able to complete the activity (79.2%), including the two neurodivergent students, although both indicated that they had needed help. In any case, the assistance provided was minimal, and at no point did the teacher or the researcher complete the exercise for the student; rather, support was given only to allow the student to finish it independently.
When examining the cases in which students did not need help to complete the activity, no clear pattern was observed linking reading ability or visual image recognition skills to independent completion. Based on direct observation, independent completion appears instead to be more closely related to the students’ own attitude, specifically their preference for completing activities on their own and their lower tendency to ask for help.
Finally, it should be noted that no student highlighted the agent’s “student” behaviour, possibly due to the short duration of the interaction (only a single session). This time limitation should be taken into account when interpreting the results.
It should also be considered that validating the usability of the agent on a single simple task (an input/output activity in which Boby is taught to say hello) provides only limited evidence about learning outcomes across the full range of programming concepts.

6. Conclusions

The use of a pedagogical conversational agent in the role of a student to train learners in ScratchJr. appears to be effective, regardless of the students’ ability to use tablets or play with blocks, when the students are between 6 and 7 years of age.
Overall, 79.2% of the students were able to complete the activity requested by the agent. However, as stated in the Discussion, more research should be devoted to exploring the results with other programming concepts such as sequences or loops.
In the pilot study with the two neurodivergent students, it is noteworthy that both were also able to complete the activity. A larger sample is needed, together with further research on how students with special needs could use agents such as Boby to learn ScratchJr.
Further exploration is planned to investigate the agent’s potential for teaching additional ScratchJr. instructions over longer periods and multiple sessions with more students with special needs, and to subsequently assess to what extent the agent enables both neurotypical and neurodivergent students to program directly in the ScratchJr. environment.

Author Contributions

Conceptualization, M.J.M.; methodology, D.P.M.; software, M.J.M.; validation, D.P.M.; formal analysis, C.P.; investigation, M.J.M.; resources, M.J.M.; data curation, C.P.; writing—original draft preparation, M.J.M.; writing—review and editing, D.P.M. and C.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by MICIU/AEI/10.13039/501100011033 and FEDER/UE, grant number PID2022-137849OB-I00.

Data Availability Statement

The data utilized and generated by this study are publicly available in the following university repository: https://edatos.consorciomadrono.es/dataset.xhtml?persistentId=doi:10.21950/XP2SPQ (accessed on 16 December 2025).

Acknowledgments

We would like to thank all students and teachers involved in the experiment.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A


References

  1. Abascal, J.; Aedo, I.; Cañas, J.; Gea, M.; Gil, A.; Lorés, J.; Martínez, A.; Ortega, M.; Valero, P.; Vélez, M. La Interacción Persona-Ordenador; Lleida, Spain. 2001. Available online: https://aipo.es/wp-content/uploads/2022/02/LibroAIPO.pdf (accessed on 16 December 2025).
  2. Muñoz-Arteaga, J.; Collazos, C.; Granollers, T.; Luna-García, H. Perspectivas en la Interacción Humano-Tecnología; Red HCI-Collab; Lleida, Spain, 2023. Available online: https://hci-collab.uxartetic.com/ (accessed on 16 December 2025).
  3. Computer Science Teachers Association (CSTA). Computer Science K–8: Building a Strong Foundation. 2012. Available online: https://csteachers.org/k12standards/ (accessed on 16 December 2025).
  4. Manches, A.; Plowman, L. Computing education in children’s early years: A call for debate. Br. J. Educ. Technol. 2017, 48, 191–201. [Google Scholar] [CrossRef]
  5. Wing, J. Computational Thinking. Commun. ACM 2006, 49, 33–35. [Google Scholar] [CrossRef]
  6. Sánchez-Vera, M. La robótica, la programación y el pensamiento computacional en la educación infantil. Rev. Infanc. Educ. Y Aprendiz. 2020, 7, 209–234. [Google Scholar]
  7. Caguana-Anzoátegui, L.; Rodrigues-Pereira, M.; Solís-Jarrín, M. Cubetto for preschoolers: Computer programming code to code. In Proceedings of the 2017 International Symposium on Computers in Education (SIIE), Lisbon, Portugal, 9–11 November 2017; pp. 1–5. [Google Scholar] [CrossRef]
  8. Alsina, Á.; Acosta, Y. Conectando la educación matemática infantil y el pensamiento computacional: Aprendizaje de patrones de repetición con el robot educativo programable Cubetto. Innovaciones Educ. 2022, 24, 1–20. [Google Scholar] [CrossRef]
  9. Ramos Ruiz, J.; Paredes Barragán, P. Desarrollo del pensamiento Computacional mediante Cubetto en Educación Infantil. IE Comun. Rev. Iberoam. De Informática Educ. 2023, 38, 35–44. [Google Scholar]
  10. Pérez Vázquez, E.; Lorenzo Lledó, G.; Lledó Carreres, A. Aplicación del Robot Bee-Bot en las Aulas de Educación Infantil y Educación: Una Revisión Sistemática Desde el Año 2016. 2021. Available online: https://rua.ua.es/server/api/core/bitstreams/2ed6f7f2-6c22-4f71-8401-1056292a5440/content (accessed on 16 December 2025).
  11. Buendía Cueva, G.I.; Tasayco Díaz, A.P.; Menacho Rivera, A.S. Gamificación y tecnología en la educación infantil: Una revisión sistemática. Rev. InveCom 2025, 5, 1–8. [Google Scholar]
  12. Ruiz Moltó, M.; Arteaga Martínez, B. El pensamiento geométrico-espacial y computacional en educación infantil: Un estudio de caso con KUBO. Contextos Educ. Rev. De Educ. 2022, 30, 41–60. [Google Scholar] [CrossRef]
  13. Resnick, M.; Maloney, J.; Monroy-Hernández, A.; Rusk, N.; Eastmond, E.; Brennan, K.; Millner, A.; Rosenbaum, E.; Silver, J.; Silverman, B.; et al. Scratch: Programming for all. Commun. ACM 2009, 52, 60–67. [Google Scholar] [CrossRef]
  14. Bers, M. El desarrollo de Scratch J.R: El aprendizaje de programación en primera infancia como nueva alfabetización. Virtualidad Educ. Y Cienc. 2023, 26, 43–62. [Google Scholar] [CrossRef]
  15. Narváez Minda, J.; Torres Navas, E.J.; Analuisa Maguashca, J.; Guerrón Varela, E.R. Software Scratch J.R como recurso pedagógico para la consolidación de nociones espaciales en Educación Infantil. MIKARIMIN Rev. Multidiscip. 2021, 7, 71–84. [Google Scholar]
  16. Meyer, A.; Rose, D.H.; Gordon, D. Universal Design for Learning: Principles, Framework, and Practice, 3rd ed.; CAST: Lynnfield, MA, USA, 2025; ISBN 9781943085392. [Google Scholar]
  17. Smith, J.; Brown, L. Scratch Jr. Tactile: An inclusive approach to programming for young children. J. Educ. Technol. 2013, 10, 45–60. [Google Scholar]
  18. Elshahawy, M.; Aboelnaga, K.; Sharaf, N. CodaRoutine: A serious game for introducing sequential programming concepts to children with autism. In Proceedings of the IEEE Global Engineering Education Conference (EDUCON), Porto, Portugal, 27–30 April 2020. [Google Scholar]
  19. Zubair, M.S.; Brown, D.; Hughes-Roberts, T.; Bates, M. Designing accessible visual programming tools for children with autism spectrum condition. Univers. Access Inf. Soc. 2021, 22, 277–296. [Google Scholar] [CrossRef]
  20. Taylor, M. Computer programming with Pre-K through first-grade students with intellectual disabilities. J. Spec. Educ. 2018, 52, 78–88. [Google Scholar] [CrossRef]
  21. Utreras, E.; Pontelli, E. Introductory programming and young learners with visual disabilities: A review. Univers. Access Inf. Soc. 2023, 22, 169–184. [Google Scholar] [CrossRef]
  22. Johnson, J.L. Animated Pedagogical Agents: Face-to-Face Interaction in Interactive Learning Environments. J. Artif. Intell. Educ. 2000, 11, 47–78. [Google Scholar]
  23. Ocaña, J.; Morales-Urrutia, E.; Pérez-Marín, D.; Pizarro, C. Can a Learning Companion Be Used to Continue Teaching Programming to Children Even During the COVID-19 Pandemic? IEEE Access 2020, 8, 157840–157861. [Google Scholar] [CrossRef]
  24. Morales-Urrutia, E.; Ocaña, J.; Pérez-Marín, D.; Pizarro, C. Can Mindfulness Help Primary Education Students to Learn How to Program with an Emotional Learning Companion? IEEE Access 2021, 9, 6642–6660. [Google Scholar] [CrossRef]
  25. Manzanares, M.; Pérez-Marín, D.; Pizarro-Romero, C. Towards the use of pedagogic Conversational Agents to teach block-based programming to all using ScratchJr. In Actas del XXVII Simposio Internacional de Informática Educativa (SIIE); SIIE: Viseu, Portugal, 2025; pp. 107–112. [Google Scholar]
  26. García-Peñalvo, F. La enseñanza de la informática, la programación, el pensamiento computacional en los estudios preuniversitarios. El Enfoque TACCLE3 2017, 18, 7–17. [Google Scholar] [CrossRef][Green Version]
  27. Ozturk, H.; Calingasan, L. Robotics in Early Childhood Education: A Case Study for the Best Practices; Ozcinar, G., Wong, H., Ozturk, H., Eds.; IGI Global Scientific Publishing: Hershey, PA, USA, 2018; pp. 182–200. [Google Scholar] [CrossRef]
  28. Ching, Y.; Hsu, Y.; Baldwin, S. Developing computational thinking with educational technologies for young learners. TechTrends 2018, 62, 563–573. [Google Scholar] [CrossRef]
  29. Morgado, L.; Cruz, M.; Kahn, K. Preschool cookbook of computer programming topics. Australas. J. Educ. Technol. 2010, 26, 309–326. [Google Scholar] [CrossRef]
  30. Etecé. Niño de 3 Años. Biblioteca Humanidades. 2023. Available online: https://humanidades.com/nino-de-3-anos/ (accessed on 16 December 2025).
  31. Bers, M. Beyond computer literacy: Supporting youth’s positive development through technology. New Dir. Youth Dev. 2010, 2010, 13–23. [Google Scholar] [CrossRef]
  32. Papert, S. Mindstorms: Children, Computers, and Powerful Ideas; Basic Books: New York, NY, USA, 1980. [Google Scholar]
  33. Lerner, R.; Almerigi, J.; Theokas, C.; Lerner, J. Positive youth development: A view of the issues. J. Early Adolesc. 2005, 25, 10–16. [Google Scholar] [CrossRef]
  34. Pérez-Marín, D. Teaching Programming in Early Childhood Education with stories. In Proceedings of the International Conference of Education Research and Innovation (ICERI), Seville, Spain, 11–13 November 2019; pp. 9206–9212. [Google Scholar]
  35. Liukas, L. Hello Ruby: Adventures in Coding; COBEE: Singapore, 2015; Available online: https://www.helloruby.com (accessed on 16 December 2025).
  36. Otterborn, A.; Schonborn, K.; Hultén, M. Investigating Preschool Educators’ implementation of Computer Programming in Their Teaching Practice. Early Child. Educ. J. 2019, 48, 253–262. [Google Scholar] [CrossRef]
  37. Flannery, L.; Silverman, B.; Kazakoff, E.; Bers, M.; Bontá, P.; Resnick, M. Designing ScratchJr.: Support for early childhood learning through computer programming. In Proceedings of the 12th International Conference on Interaction Design and Children, New York, NY, USA, 24–27 June 2013; pp. 1–10. [Google Scholar]
  38. Yu, T.; Roque, N. Designing pedagogic conversational agents: A framework and review. Int. J. Artif. Intell. Educ. 2022, 32, 1–35. [Google Scholar]
  39. Beraza, I.; Pina, A.; Demo, B. Soft & hard ideas to improve interaction with robots for kids & teachers. In Proceedings of the SIMPAR 2010 International Conference on Simulation, Modeling and Programming for Autonomous Robots, Darmstadt, Germany, 15–18 November 2010; pp. 549–557. [Google Scholar]
  40. Highfield, K. Robotic toys as a catalyst for mathematical problem solving. Aust. Prim. Math. Classr. 2010, 15, 22–27. [Google Scholar]
  41. Stoeckelmayr, K.; Tesar, M.; Hofmann, A. Pre-primary children programming robots: A first attempt. In Proceedings of the 2nd International Conference on Robotics in Education, Thessaloniki, Greece, 23–25 April 2011; pp. 185–192. [Google Scholar]
  42. Kazakoff, E.; Bers, M. Programming in a robotics context in the pre-primary classroom: The impact on sequencing skills. J. Educ. Multimed. Hypermedia 2012, 21, 371–391. [Google Scholar]
  43. Bers, M.; Flannery, L.; Kazakoff, E.; Sullivan, A. Computational Thinking and tinkering: Exploration of an early childhood robotics curriculum. Comput. Educ. 2014, 72, 145–157. [Google Scholar] [CrossRef]
  44. Bravo Sánchez, F.; Forero Guzmán, A. La Robótica como un recurso para facilitar el aprendizaje y desarrollo de competencias generales. Teoría De La Educación. Educ. Y Cult. En La Soc. De La Inf. 2012, 13, 120–136. [Google Scholar] [CrossRef]
  45. Ruiz Velasco, E.; Beauchemin, M.; Freyre, A.; Martínez, P.; García, V.; Rosas, L.; Minami, Y.; Velásquez, M. Robótica Pedagógica: Desarrollo de Entornos de Aprendizaje con Tecnología. Virtual Educa. 2006. Available online: https://www.academia.edu/3249497/Rob%C3%B3tica_pedag%C3%B3gica_desarrollo_de_entornos_de_aprendiz (accessed on 16 December 2025).
  46. Acuña, A. Robótica y Aprendizaje por Diseño. Educación año xlviii-xlix, 2004, 139–140. Available online: https://es.scribd.com/document/333359840/Robotica-y-Aprendizaje-Por-Diseno (accessed on 16 December 2025).
  47. González, A.; Hernández, A.R. AlToy 1: Un robot neo-educativo con emociones. Rev. Iberoam. De Informática Educ. 2013, 51, 51–62. [Google Scholar]
  48. Romero Martinez, S.J.; Calzada, I.G.; García Sandoval, A.; Lozano Domínguez, A. Herramientas Tecnológicas para la educación Inclusiva. Rev. Tecnol. Cienc. Y Educ. 2018, 2018, 83–112. [Google Scholar] [CrossRef]
  49. Pérez de la Maza, L. Programa de estructuración ambiental por ordenador para personas del espectro autista: PEAPO. In Las Nuevas Tecnologías en la Respuesta Educativa a la Diversidad; Soto Pérez, F.J., Rodríguez Vázquez, J., Eds.; Selegrafía, S.L.: Murcia, Spain, 2000; pp. 255–258. [Google Scholar]
  50. Guaña-Moya, J.; Rodrigo Altamiraño-Pazmiño, M. Herramientas de Software para la educación inclusiva en la etapa de educación inicial. Rev. Enectiva 2024, 5, 1–14. [Google Scholar]
  51. Rozengardt, A. Lo no Formal en la Atención y Educación de la Primera Infancia; UNESCO: Paris, France, 2020. [Google Scholar]
  52. Sánchez Sánchez, T. La influencia de la motivación y la cooperación del alumnado de primaria con la Robótica educativa: Un estudio de caso. Panorama 2019, 13, 117–140. [Google Scholar] [CrossRef]
  53. Merino-Armero, J.M.; Villena-Taranilla, R.; Somoza, J.A.G.-C.; Cózar-Gutiérrez, R. Análisis del efecto de la robótica en la motivación de estudiantes de tercero de Educación Primaria durante la resolución de tareas de interpretación de planos. Rev. De Estud. Y Exp. En Educ. 2017, 3, 163–173. [Google Scholar] [CrossRef]
  54. Jiménez, L. El Poder y la Ciencia de la Motivación. Cómo Cambiar tu Vida y Vivir Mejor Gracias a la Ciencia de la Motivación. 2017. Available online: https://www.scribd.com/document/544570897/El-poder-y-la-ciencia-de-la-motivacion-PDFDrive (accessed on 16 December 2025).
  55. Angeli, C.; Valanides, N. Developing young children’s computational thinking with educational robotics: An intervention study. J. Educ. Comput. Res. 2020, 58, 461–484. [Google Scholar]
  56. Di Lieto, M.; Inguaggiato, E.; Castro, E.; Cecchi, F.; Cioni, G.; Dell’Omo, M.; Laschi, C.; Dario, P. Educational robotics and students with neurodevelopmental disorders: A systematic review. Heliyon 2020, 6, e05160. [Google Scholar] [CrossRef]
  57. Pivetti, M.; Di Battista, S.; Agatolio, F.; Moro, M. Educational robotics and students with neurodevelopmental disorders: A systematic review. Heliyon 2020, 6, e05145. [Google Scholar] [CrossRef]
  58. Elkin, M.; Sullivan, A.; Bers, M. Programming with the KIBO robotics kit in preschool classrooms. Comput. Sch. 2016, 33, 169–186. [Google Scholar] [CrossRef]
  59. Gkonlnta, E.; Papadopoulos, A.; Giannakoulas, I.; Bamidis, P. Robot programming as an intervention for a child with autism spectrum disorder: A case study. Int. J. Dev. Disabil. 2023, 69, 424–431. [Google Scholar] [CrossRef]
  60. Leelawong, K.; Biswas, G. Designing learning by teaching agents: The Betty’s Brain system. Int. J. Artif. Intell. Educ. 2008, 18, 181–208. [Google Scholar] [CrossRef]
  61. Elbeltagi, R.; Al-Beltagi, M.; Saeed, N.K.; Alhawamdeh, R. Play therapy in children with autism: Its role, implications, and limitations. World J. Clin. Pediatr. 2023, 12, 1–22. [Google Scholar] [CrossRef]
  62. Comunidad de Madrid. 2025. Available online: https://www.comunidad.madrid/ (accessed on 16 December 2025).
  63. Wang, X.C.; Christ, T.; Chiu, M.M.; Strekalova-Hughes, E. Exploring the relationship between kindergarteners’ buddy reading and individual comprehension of interactive app books. AERA Open 2019, 5, 1–17. [Google Scholar] [CrossRef]
  64. Raja, P.; Setiyadi, A.B.; Riyantika, F. The Correlation between perceptions on the use of online digital interactive media and reading comprehension ability. Int. J. Engl. Lang. Lit. Stud. 2021, 10, 292–319. [Google Scholar] [CrossRef]
Figure 1. Example of the ScratchJr. environment (source: scratchjr.org).
Figure 2. Example of a program in ScratchJr. (source: own elaboration).
Figure 3. Example of robots: (a) KIBO, top left; (b) Bee-Bot, top right; (c) Code & Go, bottom left; (d) Code-a-pillar, bottom right (own elaboration).
Figure 4. Example of the initial screen of the agent Boby (source: own elaboration).
Figure 5. Technical diagram of Boby.
Figure 6. Sample dialog between Boby and a student.
Figure 9. Percentage of children who liked the app (Y26 = “Yes”) by reading ability level (Q7).
Table 1. Sample characteristics and data availability.
Variable | Category | %
Sex | Boys | 35.4
 | Girls | 56.3
Age | 6 years | 56.3
 | 7 years | 37.5
Class | Class A | 46.7
 | Class B | 53.3
Neurodivergent status | Neurotypical | 97.8
 | Neurodivergent | 2.2
Reading ability (Q7) | 0 = reads one word | 2.2
 | 1 = reads some words | 15.6
 | 2 = reads all words | 82.2
 | 3 = advanced reader (if used) | 0
Tablet use (Q10) | Mainly for games | 66.7
 | Mainly for learning | 24.4
 | Other/mixed | 8.9
Likes block games (Q14) | Yes | 82.2
 | No | 8.9
 | Other/mixed | 8.9
Questionnaire completed | Yes (valid responses) | 93.75
 | No/missing | 6.25
Table 2. Satisfaction with the app (Y26) by usual use of the tablet (Q10): frequencies.
Tablet Use (Q10) | n “Yes, I Liked the App” | n “No”
Mainly for games | 23 | 4
Mainly for learning | 7 | 2
Other/mixed | 4 | 0
Total (valid cases) | 34 | 6
Table 3. Satisfaction with the app (Y26) by preference for block-based games (Q14).
Likes Block Games (Q14) | n “Yes, I Liked the App” | n “No”
Yes | 27 | 5
No | 4 | 0
NA | 3 | 1
Table 5. Completion of the “Hello” activity by group and need for help (valid log data).
Group | n Total | n Completed Without Help | n Completed with Help | n Not Completed
Neurotypical | 41 | 2 | 35 | 4
Neurodivergent | 1 | 0 | 0 | 1
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
