Article

A Participatory Design Approach to Designing Educational Interventions for Science Students Using Socially Assistive Robots

by Mahmoud Mohamed Hussien Ahmed 1, Mohammad Nehal Hasnine 2 and Bipin Indurkhya 3,*

1 Instructional Technology Department, South Valley University, Qena 83523, Egypt
2 Research Center for Computing and Multimedia Studies, Hosei University, Tokyo 184-8588, Japan
3 Cognitive Science Department, Jagiellonian University, 30-060 Kraków, Poland
* Author to whom correspondence should be addressed.
Electronics 2025, 14(13), 2513; https://doi.org/10.3390/electronics14132513
Submission received: 30 April 2025 / Revised: 8 June 2025 / Accepted: 18 June 2025 / Published: 20 June 2025
(This article belongs to the Section Artificial Intelligence)

Abstract

We present an approach to deploying social robots in a science laboratory to monitor student behavior with respect to safety regulations and to prevent accidents. Our vision is that the social robot should act as a friendly companion for students and encourage them to follow safe laboratory practices. Towards this goal, we developed a Laboratory Safety Assistant (LSA) framework using a Misty II Plus robot and designed three dashboards within it as interventions. The LSA framework was evaluated in a participatory design (PD) study with twenty university students (eleven from Japan and nine from Egypt). For this study, we designed a questionnaire of 42 questions on students' prior knowledge about socially assistive robots and their expectations about how such robots can create a secure environment in the scientific laboratory. A chi-square test revealed no significant differences between the two groups in their perceptions of using Misty II to achieve safety inside science laboratories. Regarding the capabilities of social robots and the sharing of feelings, students believed that a social robot like Misty II could make the lab safe and decrease risk inside the science laboratory even without the three dashboards of the LSA framework. However, the Wilcoxon signed-rank test revealed a significant improvement (Median = 106.5, Z = −2.39, p < 0.05, r = 0.53) in students' expectations of using social robots to achieve safety in scientific laboratories between before and after they interacted with the social robot and learned about the feasibility of the three dashboards we designed.
Furthermore, a t-test on participants' experiences of sharing feelings with a social robot supported the intervention suggested by the LSA framework, namely designing a system that integrates feeling-sharing into a social robot to enhance safety within the scientific laboratory (t(19) = 3.39, p = 0.003).

1. Introduction

Laboratories play a crucial role in science education; they are where students learn to design and conduct experiments, make empirical observations, and test hypotheses. They also learn to use the various tools and equipment needed for experiments. However, laboratories also contain equipment and materials that can be dangerous if used improperly: for example, strong acids and neurotoxins, equipment such as natural-gas Bunsen burners and high-pressure vessels known as autoclaves, and certain types of explosive materials. Moreover, error detection in dynamic images still generalizes poorly to unseen categories. Domain generalization is essential for practical deployment, yet tasks such as crop disease detection remain difficult, and the techniques for creating interpretable prompts need to be improved [1]. The materials used in the construction of existing sensors limit their ability to detect physical events, such as humidity, in a continuous-monitoring setting, which leads to insufficient sensitivity. In addition, many continuous monitoring devices use variations in signal frequency to evaluate human conditions such as breathing [2].
Failure to follow proper procedures can result, for example, in chemical burns, cuts from broken glass, inhalation of toxic fumes, absorption of chemicals through the skin, or ingestion of toxic chemicals. Therefore, students in a scientific laboratory must handle equipment and materials with extreme caution; otherwise, laboratory accidents can occur.
Laboratory accidents occur regularly in universities, although many safety measures have been proposed and implemented. The Web of Science database contains 219 publications on university laboratory safety. These studies describe laboratory accidents in 44 countries and 254 research institutes [3]. A study by Chen et al. (2020) on explosion accidents in higher education found that laboratory accidents occur not only at universities but also in industries, resulting in numerous injuries, deaths, and financial losses [4].
Laboratory accidents may occur for a variety of reasons. One common source is human error, which involves breaking lab rules, failing to follow lab safety procedures, carelessness, using broken glassware, not wearing protective clothing throughout the experiment, and rushing around the lab excitedly. In the absence of lab teachers, data-driven intervention technologies could help reduce injury and increase safety.
Modern intervention systems use data-driven decision-making to advise teachers and students. Interventions are used to help struggling students achieve their goals. According to Hammerschmidt-Snidarich et al. (2019), interventions use a variety of methods to meet the needs of different types of learners, such as visual learners, auditory learners, and kinesthetic learners. Interventions can take the form of resources such as graphic organizers, study guides, seats near the teacher, and visual schedules [5]. Intervention strategies have been shown to be beneficial in helping at-risk or underachieving children before they fail [6]. In addition, learning analytics-based intervention tools such as student-facing dashboards, teacher dashboards, automated learning analytics-based feedback, customized feedback, and visualizations are used to help students in need of help [7]. Teachers and policymakers use intervention systems to identify areas of need. These tools can guide teachers in taking action to address these needs.
Currently, most intervention systems developed to solve educational problems use data from learners' activity in digital learning environments. Digital learning environments include Learning Management Systems (LMS) such as Moodle and Canvas, digital textbooks, Student Information Systems (SIS), and eye trackers. Data generated by digital learning environments can help teachers answer questions such as those posed by the Digital Learning Institute (DLI) [8]: How often are learners logging into the course? How long are the learner sessions? Are learners progressing well through the course or confronting challenges? Are learners struggling with specific parts of the assessment? Are learners participating in social learning activities? Are learners utilizing all available resources? What percentage of learners are completing the course? Learning analytics researchers use these log data to build predictive models for identifying at-risk students [9], detecting dropouts [10], predicting final scores [11], and measuring retention rates [12]. However, data from digital learning environments and the existing predictive models in the literature are not sufficient to identify human (i.e., student) errors such as breaking lab rules or failing to follow lab safety procedures. Existing educational interventions [13], often in the form of learning analytics dashboards, have limitations in guiding students away from behaviors that could lead to accidents or injuries. Furthermore, most existing data-driven intervention technologies offer limited help to students in reducing injuries and to teachers in increasing safety in the laboratory. Having a social robot present in the lab while students are conducting experiments could provide a solution to this problem.
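As a simple illustration of how such log data can feed an early-warning model, the sketch below flags potentially at-risk learners from aggregated LMS activity. All feature names and thresholds here are hypothetical, chosen only for illustration; a real system would learn them from historical data (e.g., with logistic regression).

```python
from dataclasses import dataclass

@dataclass
class LearnerLog:
    """Aggregated LMS activity for one learner (hypothetical features)."""
    logins_per_week: float
    avg_session_minutes: float
    assessments_completed: int
    assessments_total: int
    forum_posts: int

def at_risk(log: LearnerLog) -> bool:
    """Flag a learner as at-risk using illustrative threshold rules.

    The thresholds below are arbitrary placeholders, not values from
    this study or any cited predictive model.
    """
    completion = (log.assessments_completed / log.assessments_total
                  if log.assessments_total else 0.0)
    signals = [
        log.logins_per_week < 2,        # rarely logs in
        log.avg_session_minutes < 10,   # very short sessions
        completion < 0.5,               # behind on assessments
        log.forum_posts == 0,           # no social learning activity
    ]
    return sum(signals) >= 2            # two or more warning signals

print(at_risk(LearnerLog(1.0, 5.0, 2, 10, 0)))   # → True
print(at_risk(LearnerLog(5.0, 40.0, 9, 10, 3)))  # → False
```

Even such a crude rule set shows the limitation noted above: none of these features can capture a student mishandling glassware or skipping safety goggles, which is the gap the robot-assisted approach targets.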
In addition, in the absence of teachers in the lab, a social robot-assisted intervention technology can provide insight into students’ activities and behaviors in the lab to the teacher.
A socially assistive robot, or social robot, is an artificial intelligence platform equipped with multiple sensors, cameras, microphones, and other technologies, such as computer vision, that enable it to interact with humans and other robots. These robots can communicate with humans and other social robots by adhering to the social behaviors and rules associated with their role in the group [14]. Social robots come in various shapes and sizes, and many have human-like faces. They can engage in social interactions with humans through verbal exchange, cooperative problem solving, and independent decision-making [14]. Social robots are therefore gaining popularity for solving real-world problems, as they can capture unattended user interactions in natural environments such as homes [15]. Some popular socially assistive robots are Nao, Pepper, Misty, Furhat, and Jennie. These robots can carry out social interactions and communicate through facial expressions. They have shown great promise in healthcare [16], smart house design [17], telepresence [18], defense [19], and entertainment [20] due to their autonomy, controllability, maneuverability, and stability. However, their roles in education have not yet been fully explored, particularly in laboratory accident prevention, companionship, and improving student well-being in the science laboratory.
We have introduced the Laboratory Safety Assistant (LSA) framework [21] to address these limitations and bridge research gaps. The framework aims to identify students' dangerous activities and risky behaviors that could increase the risk of laboratory accidents and human injury. It is supported by a social robot and aims to detect behaviors that violate laboratory guidelines, such as not wearing safety goggles, not wearing gloves, and making dangerous physical movements during experiments, which can be identified in advance using robotic features to reduce the risk of laboratory accidents. The LSA framework uses the data captured by the social robot to build a robot-assisted intervention system for teachers to identify risky behaviors and guide student safety during laboratory experiments. It aims to monitor, assist, and provide companionship and friendly supervision to a student in the absence of a lab supervisor. It consists of three dashboards (which we refer to as robot-assisted interventions) designed using a co-design method to meet a set of user requirements. We used the co-design method to design the three interventions because it leads to a better understanding of user needs, increases engagement, promotes human–robot and human–intervention interaction, and is user-centered. In this paper, we evaluate them using a participatory design experiment. Participatory design in social robotics research is a practice of User-Centered Design (UCD) in which multiple stakeholders contribute to the design process as active co-designers [22]. The evaluation experiment sheds light on university students' experiences of social robot capabilities, their expectations about using social robots to achieve safety in scientific laboratories, and whether socially assistive robots could enhance safety in science laboratories. We identified three research questions:
  • What is the appropriate intervention for designing a social robot to improve safety in a science laboratory?
  • What are the students’ expectations for improving safety inside a science laboratory with the support of social robots?
  • How does sharing students’ feelings affect the design of a social robot to improve safety inside a science laboratory?
The goal of this study is to overcome the limitations of prior frameworks. For example, the HARISM framework, which facilitates outdoor safety monitoring, cannot adequately represent the intricacy of real-world interactions. In addition, issues such as the optimal placement of sensors, communication protocols appropriate to local conditions, and an easily modifiable architecture must be considered. This study also aims to improve emergency response, particularly in settings such as science laboratories [23]. Hence, this paper presents teacher and student insights on using the LSA framework to achieve safety inside the laboratory, and shows how integrating the suggested interventions with the framework can provide a secure laboratory for teaching students the practical skills of conducting scientific experiments. The structure of this paper is as follows. In Section 2, we provide a comprehensive review of the literature on various dimensions of risky behaviors in science laboratories, the aspects of participatory design in education and research on social robotics, and the application of social robotics to solve various education problems. Section 3 describes the LSA framework and its components. Section 4 presents the details of an evaluation experiment, the results of which are presented in Section 5. The implications of these results are discussed in Section 6, and the conclusions are presented in Section 7.

2. Literature Review

2.1. Social Robotics in Education

Robots, particularly humanoid and social robots, have many uses in education. For example, they are applied to STEM (Science, Technology, Engineering, and Math) education [24], language learning and teaching [25], and vocabulary development [26], and they serve as teaching assistants [27] and as tutors or peer learners [28]. They have also been involved in the development of complex educational applications such as knowledge tracing and predictive decision-making [29], robot-assisted learning (RAL) agents and r-learning services [30], and learning experiences built on social interaction with learners. Robots are among the newest forms of technology used to facilitate learning and teaching. The benefits of a social robot, together with virtual pedagogical agents (running on laptops, tablets, or phones) and intelligent tutoring systems, include facilitating student learning and improving educational performance, personalizing the curriculum for students with diverse needs [28], and adding social interaction to the learning context in some specific cases [27].
Social robots are specifically designed to interact and communicate with humans, either autonomously or semi-autonomously. They allow more natural human interaction than other forms of technology due to their appearance, which is often humanoid or animal-shaped [25]. Due to their physical appearance and intelligent features, they are relatively easy to use when exploring ways of solving educational problems. In education, they have been shown to be effective in increasing cognitive and affective outcomes and have achieved outcomes comparable to those of human tutoring on restricted tasks [28]. Furthermore, they have been shown to reduce students' anxiety levels and foster a more positive attitude toward learning [26]. In recent years, there has been a special focus on affective aspects such as student motivation and students' responses to robot social behavior. Studies comparing students' motivation levels in robot-assisted versus traditional classrooms found higher motivation in the robot-assisted classrooms [25,31]. Robots that exhibit social behaviors are explored primarily in education, although robots without social elements can also be used as educational tools [32]. In general, robots, social or otherwise, are a natural choice when learning materials require direct physical manipulation of the world. Despite these numerous benefits, there are several challenges in using robot-assistive technologies to support education.

2.2. The Effectiveness and Challenges of Robot-Assisted Interventions

In recent years, social robots have grown beyond traditional uses and are being deployed for novel applications, such as robot-assisted intervention. For example, Shukla et al. (2019) developed a robot-assisted mental health intervention tool [33]. The researchers evaluated the tool's efficacy by assessing its impact on users and their caregivers and found a significant reduction in caregiver burden. However, they also noted the need to provide special training to caregivers to take full advantage of the socially assistive robot in cognitive rehabilitation.
Another study in 2020 applied the collaborative design approach to develop a socially assistive robot-based education intervention aimed at children with autism [34]. The researchers found that in helping students with autism, the sensory rewards provided by the robot elicited more positive reactions than verbal praises from humans.
Another example is provided by a 2022 study, which focused on elementary school children with specific learning disorders and developed a robot-assisted intervention tool to improve their academic performance [35]. They found that humans and Nao robots have comparable effectiveness in teaching these children. They also found that the Nao-assisted intervention was as effective as the gold standard intervention for this group of children.

2.3. Risky Behaviors in Science Laboratories

Laboratory staff, such as instructors or lab assistants, are essential in avoiding accidents. However, each function they perform within the science laboratory has significant limitations. To keep users in a safe environment, educators can create a written safety policy. They can also prohibit students from using the laboratory until they have passed a test gauging their familiarity with its procedures. Because students need some experience to connect what they learn with what they encounter in the lab, there are several barriers to providing them with a secure environment. Each time a novel substance is used in the lab, the procedure must be reviewed, which adds time and effort. Even when laboratory staff manage to keep an eye on every student at once, this constant scrutiny makes the science experiments uncomfortable for the students. Before students begin their work, lab staff can quickly verify that they are safe inside the lab; however, staff cannot always remain with students to monitor dangerous behavior while experiments are being conducted. Social robots like Misty II can ease all these problems: given enough robots that can perform the tasks correctly, every student can be supervised all the time. Table 1 below shows how Misty II's capabilities can help with safety issues in science laboratories.
Because the dynamics of quantitative and qualitative changes in science laboratories depend on time and space—some substances can be harmful to humans at very low concentrations, and individual reactions vary depending on several factors, including the production of toxins—the traditional supervisor inside laboratories cannot fully help students avoid risks [40]. Research has shown that nearly half of all laboratory accidents are caused by human factors. Therefore, violations of laboratory procedures, operating procedures, and negligent operation are the main causes of laboratory accidents [41]. Laboratory workers are at risk of upper extremity musculoskeletal disorders because these tests often involve ergonomic risk factors, such as prolonged or repetitive manual activities, non-neutral upper extremity posture, and forceful exertion [42]. Causal factors that contribute to laboratory accidents often include unsafe practices and conditions. Recently, comprehensive and rigorously standardized engineering and administrative controls, such as regular safety inspections, have been implemented to address hazardous conditions. Despite these efforts, laboratory-related accidents are still widely reported [43].
For this reason, we identified the main factors that cause accidents in science laboratories. Based on our empirical data, we found the following dangerous behaviors: students do not use appropriate personal protective equipment; they do not always wear safety glasses during experiments; they do not wear aprons; they stand or sit for long periods of time; and they leave open flames exposed. However, it is beyond the current capabilities of social robots to monitor all these diverse behaviors. Therefore, this study focuses on dangerous behaviors related to people, the environment, and experimental equipment: for example, whether students wear aprons in the laboratory, or whether they conduct experiments for long periods without any interaction.

2.4. Participatory Design and Co-Design Approaches in Social Robotics

Designing robot assistive tools/software is often a multidisciplinary activity because of the broad knowledge required from different scientific domains, including a deep understanding of human behavior and intelligence. According to the pioneers of social robotics, Cynthia Breazeal, Kerstin Dautenhahn, and Takayuki Kanda, “the design of technologies and methodologies of social robots is informed by robotics, artificial intelligence, psychology, neuroscience, human factors, design, anthropology, and more” [44]. Therefore, it is important to work with a diverse team, including end users, to specify the appropriate user requirements. Although various approaches are being tried to address these problems, easy-to-use tools or artifacts that facilitate multidisciplinary collaboration to design robot assistive technologies are still scarce [22].
In social robotics research, Participatory Design (PD) is a practice of User-Centered Design (UCD) in which multiple stakeholders contribute to the design process as active co-designers [45]. In general, the participatory design approach starts with collecting initial information through interviews or surveys, which is shared among participants as the basis for the design. Then, new information is gathered to design tools or interventions for a particular set of users or scenarios. Participatory design offers several advantages, including examining users’ needs, challenging assumptions, encouraging reciprocal learning, and creating innovative ideas [46,47]. By treating stakeholders as designers rather than as end users, researchers increase their ability to frame and solve problems and create value [47].
There are many instances where a participatory design approach has been successfully applied in the robot-assistive education and human–robot interaction domains. A 2016 study used the participatory design of social robots to build service robots that can perform various services inside buildings [48]. Their design goal was to guide a blind person when entering a building. Their experimental results suggested that the participatory design process they adopted can benefit the robot design for blind people. A 2017 study used the participatory design of social robots to understand the learning behaviors of older adults with depression [49]. This study was based on self-identified questions and concerns of the participants and developed robot concepts according to the interpretations of the participants of the capabilities and potential applications of robotic technologies. A 2020 study applied a participatory design process to develop a robotic tutor to assist with sign language learning for children with autism [34]. This study provided a set of design guidelines and ethical considerations for the participatory design process of social robots. Their pilot study revealed that children with autism and their companions had positive experiences with the robot. Using participatory design, a 2019 study focused on community-based robot design for dementia caregivers [50], and found (i) the context of dementia robot design to give a more prominent role to informal family caregivers in the co-design process, (ii) new design guidelines that contextualize robots within the family caregiving paradigm, and (iii) connections between certain robot attributes and their relationship to the stage of dementia a caregiver is experiencing. These findings could lessen emotional labor or provide redirection during emotionally difficult times, as well as facilitate positive shared moments.
We found several studies in which robotic-assistive technologies are explored in education. However, we did not find any recognized studies in which robots, social or other kinds, are used in the context of science laboratories, specifically to identify human errors, reduce laboratory accidents, provide support to students in the absence of the teacher, offer social companionship while an experiment runs for a long period of time, or monitor students’ behaviors. Therefore, our work bridges the literature gaps and brings new dimensions to address these limitations.

3. Methodology

Research indicates that nearly half of laboratory accidents are caused by human factors, such as the violation of laboratory protocols, operating procedures, and careless operations [41]. Hence, it is essential to have a framework that can identify key factors for laboratory accidents, guide students, teachers, and technical staff on their roles to avoid/prevent them, and define the roles of technologies in the laboratory to bring health to the laboratory. From that point of view, we designed the Laboratory Safety Assistant (LSA) framework [21]. In designing this framework, we adopted a four-step user-centered co-design approach. Using the user-centered co-design approach, we identified the risky/dangerous behaviors that could cause harm, set the user requirements, and defined the characteristics of the framework by putting in the appropriate components.

3.1. A User-Centered Co-Design Approach

In designing new technological tools, the term co-design is often used as an umbrella for participatory design, co-creation, and open design. According to Steen (2013), co-design involves organizing open innovation processes in which users share and combine ideas and knowledge, or engaging users as participants in the design process [51]. Co-design is one of the best ways to design solutions because design itself is a social process; it spans a variety of approaches, from research-oriented to design-oriented (e.g., using creative tools), all emphasizing user participation, ranging from user-centered approaches led by researchers and designers to fully participatory design. The co-design approach we adopted is both for end-users and with end-users. Figure 1 shows the processes of the user-centered co-design method used to design the LSA framework.
The co-design process has four steps: contextual inquiry, consultation with teachers and students, the design of the interventions using a participatory design method, and intervention prototyping.
Contextual inquiries: The contextual inquiry method was introduced by Hugh Beyer and Karen Holtzblatt to resolve the drawbacks of other qualitative research methods, such as surveys and interviews [52]. The objectives of our contextual inquiries were to define the educational contexts and identify preliminary design challenges. To better understand the context, the authors physically visited science laboratories. Our visits were limited to Biology and Chemistry laboratories and involved in-depth observation and interviews of a small group of users to gain an understanding of their work practices and behaviors. These contextual inquiries revealed hidden insights into students' work that may not be available through other research methods, including how students perform their assigned tasks, how student-to-student and student-to-teacher interactions happen during experiments, and how students interact with the instruments used for conducting experiments. Figure 2 shows the laboratory environment of a university in Egypt: Figure 2a shows a teacher conducting an experiment, and Figure 2b shows some instruments used by the students to conduct experiments.
Consultation: In co-design, consultation methodology is a process in which relevant stakeholder views are sought, but the final decisions are made by others [53]. In our consultation study, we visited a biology lab and interviewed students and teachers, who were put into a co-design group to support our design process. During the consultation, we informally asked students and teachers about their opinions and perspectives on risky behaviors and the role of a robot-assisted intervention. These topics included the following:
  • Are you familiar with, or have you heard of, laboratory accidents?
  • What is your typical day in the laboratory like?
  • How do you define risky or dangerous behaviors in the lab?
  • What should, or should not, a student do while an experiment is in progress?
  • What measures are taken by the teacher to familiarize students with the hazardous apparatus?
  • What do you think about having a social robot in the lab?
  • What are your positive and negative impressions of social robots?
  • What are the general issues and challenges that you face in the laboratory?
The feedback we received from the consultation step helped us to understand the user requirements required to overcome the challenges. The empirical data obtained in the contextual inquiry and consultation steps yielded two key outcomes: identifying the risky/dangerous behaviors in the science laboratories, and a set of user requirements. Our collected data indicated that students
(a) Do not use appropriate personal protective equipment;
(b) Do not wear goggles during the experiment;
(c) Do not always wear safety glasses;
(d) Do not wear an apron;
(e) Stand for long periods;
(f) Sit for long periods;
(g) Leave open flames unattended.
All of these are classified as risky behaviors. Our study focuses on risky behaviors that involve humans, the environment, and experimental equipment: for example, whether a student wears an apron in the lab, or conducts experiments for a long time without any interaction. To specify a set of user requirements, we considered the needs of academic supervisors (i.e., teachers). The resulting user requirements for the framework are shown in Table 2.
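To illustrate how observations of these risky behaviors could be turned into intervention alerts for a dashboard, here is a minimal sketch. The behavior labels, threshold, and rule structure are illustrative assumptions for this paper's context, not the framework's actual implementation.

```python
# Illustrative sketch: mapping observed student states to safety alerts.
# Behavior keys and the posture threshold are hypothetical placeholders.

RISKY_BEHAVIORS = {
    "no_ppe": "Not using appropriate personal protective equipment",
    "no_goggles": "Not wearing goggles during the experiment",
    "no_apron": "Not wearing an apron",
    "open_flame_unattended": "Leaving an open flame unattended",
}

PROLONGED_POSTURE_MINUTES = 45  # arbitrary ergonomic threshold

def safety_alerts(observations: dict) -> list[str]:
    """Return human-readable alerts for a student's observed state."""
    alerts = [msg for key, msg in RISKY_BEHAVIORS.items()
              if observations.get(key)]
    if observations.get("minutes_in_same_posture", 0) >= PROLONGED_POSTURE_MINUTES:
        alerts.append("Standing or sitting in the same posture for too long")
    return alerts

obs = {"no_goggles": True, "minutes_in_same_posture": 50}
for alert in safety_alerts(obs):
    print(alert)
```

In a deployed system, such alerts would be surfaced on the teacher-facing dashboards and voiced by the robot as friendly reminders rather than reprimands.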
Participatory design: To meet user requirements, this step aimed to design three interventions using the Design Based Research (DBR) approach, which applies iterative designs to develop knowledge that improves educational practices [54]. The process of design-based research involves providing solutions (known as interventions) to an educational problem. Using this approach, we designed three robot-assistive interventions (also called dashboards) for the LSA framework, which are a key component of our proposed framework. The subsection ‘The LSA framework and its components’ provides further details on them.
Prototyping of the interventions: In this step, we adopted a blue-sky research approach, with the goal of having the interventions running in real time. Blue-sky research is distinct from applied research, which aims to develop or advance technologies to solve a specific problem or create a new product; it describes research driven by a desire to advance scientific understanding, without necessarily considering specific real-world applications [55]. To better understand scientific needs, we proceeded with blue-sky research, hypothesizing that the interventions we design will assist teachers in identifying human errors made by students in the laboratory. We used the iterative nature of this approach to collect data from students and teachers, generate new interventions, and conceptualize the design.

3.2. The LSA Framework and Its Components

As stated earlier, a user-centered co-design approach was adopted to design the LSA (Laboratory Safety Assistant) framework. The LSA framework is made up of a socially assistive robot called Misty II Plus, a Behavior Detection and Analysis (BDA) server, and the interventions. Figure 3 shows an overview of the framework.
Misty II Plus as the social robot: A social robot developed by Misty Robotics, Misty II Plus can express a range of emotions, from joy to sadness, using its eyes and vocalizations. Misty can also communicate curiosity or excitement by moving its head, neck, and arms. It allows users to interact verbally using on-board speech recognition. Moreover, Misty II Plus can move autonomously and dynamically in its environment using the 3D maps it creates. It has advanced built-in sensors that include an infrared projector, a 4K video sensor, six capacitive touch panels, eight time-of-flight sensors, six bump sensors, and two internal inertial measurement units (IMUs). Our Misty II Plus also has an Arduino Backpack, which allows us to build a customized environment for integrating new hardware and sensors, thereby enabling Misty II Plus to interact with the physical world more directly. Using these advanced sensors, the robot can interact with the environment and collect human–robot interaction data, which can be used to assist a person in the lab.
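As an illustration of how such a robot is driven in practice, the short sketch below builds commands for Misty's HTTP REST API. It is a minimal sketch, not our implementation: the robot IP address is a placeholder, and while the `led` and `tts/speak` endpoint paths follow Misty's public REST API documentation, the payload values here are illustrative.

```python
import json
import urllib.request

MISTY_IP = "192.168.1.50"  # placeholder: the robot's address on the lab network


def build_command(endpoint, payload):
    """Return the URL and JSON body for a Misty REST call."""
    url = f"http://{MISTY_IP}/api/{endpoint}"
    return url, json.dumps(payload).encode("utf-8")


def led_alert(red, green, blue):
    """Chest-LED color change, e.g. red when a risky behavior is detected."""
    return build_command("led", {"red": red, "green": green, "blue": blue})


def speak(text):
    """On-board text-to-speech, e.g. a spoken safety reminder."""
    return build_command("tts/speak", {"text": text})


def send(url, body):
    """POST the command to the robot (requires a reachable Misty)."""
    req = urllib.request.Request(
        url, data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    urllib.request.urlopen(req, timeout=5)


url, body = led_alert(255, 0, 0)  # url is http://192.168.1.50/api/led
```

In a deployment, `send(url, body)` would dispatch the command; here the builder functions are kept separate so the payloads can be inspected without a robot on the network.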
We still do not fully understand how students feel about the collection and use of their data. Students often worry about how much of their personal information is monitored and used by educational institutions. Educational institutions still need to establish explicit policies and permission processes that notify students of the information being collected, how it will be used, and who will have access to it. A social robot can provide students with all the information they need to lessen their anxiety about their data [56]. Misty II has a flexible, open robotics platform that comes with a wide range of programming tools. Its sophisticated sensors and HTTP API endpoints make it well suited for investigating concepts in computer science and engineering. Misty can also support psychological activities that promote wellness, with coaches connecting to the robot to deliver instructions. Investigating the possibility of implementing robotic tutoring in the real world is crucial [57].
Behavior Detection and Analysis (BDA) server: This is where Misty's sensor data and interaction data are analyzed. The BDA server processes the events acquired by the Misty II Plus sensors, as well as the interactions between the teacher and the intervention and between the student and the intervention. The BDA server comprises several MySQL-based Relational Database Service (RDS) instances.
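A minimal sketch of how such an event store might look. The paper's BDA server uses MySQL-backed RDS instances; SQLite stands in here so the sketch is self-contained, and the table and column names are our own illustrative choices, not the actual schema.

```python
import sqlite3


def init_store(conn):
    """Create the (hypothetical) sensor-event table."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS sensor_events (
            id         INTEGER PRIMARY KEY AUTOINCREMENT,
            student_id TEXT NOT NULL,
            sensor     TEXT NOT NULL,   -- e.g. 'rgb_camera', 'tof', 'imu'
            event_type TEXT NOT NULL,   -- e.g. 'no_goggles', 'open_flame'
            ts         TEXT NOT NULL    -- ISO-8601 timestamp
        )""")


def log_event(conn, student_id, sensor, event_type, ts):
    """Record one event forwarded by the robot."""
    conn.execute(
        "INSERT INTO sensor_events (student_id, sensor, event_type, ts) "
        "VALUES (?, ?, ?, ?)", (student_id, sensor, event_type, ts))


def risky_count(conn, student_id):
    """How many risky-behavior events a student has accumulated."""
    row = conn.execute(
        "SELECT COUNT(*) FROM sensor_events WHERE student_id = ?",
        (student_id,)).fetchone()
    return row[0]


conn = sqlite3.connect(":memory:")
init_store(conn)
log_event(conn, "s01", "rgb_camera", "no_goggles", "2025-04-30T10:15:00")
log_event(conn, "s01", "heat", "open_flame", "2025-04-30T10:20:00")
```

The dashboards described below would query aggregates of this table rather than raw rows.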
Interventions: Using Design-Based Research (DBR) methods, we designed three robot-assisted interventions (dashboards) to assist students and teachers in science laboratories (Figure 4). Each intervention is an analytics dashboard on a tablet PC for both the student and the teacher, which aims to monitor a student's behavior and assist him/her in conducting laboratory experiments safely without the teacher or the lab technician being present. It also sends analytical feedback to the teacher's intervention screen so that the teacher can monitor individual students. Meanwhile, the Misty II robot acts as a social companion.
The first intervention/dashboard detects five basic safety measures: (i) whether a student is wearing goggles during the experiment, (ii) whether a student is wearing gloves during the experiment, (iii) whether the student is wearing appropriate footwear in the laboratory, (iv) whether a student is wearing an apron in the laboratory, and (v) whether the student completes the check sheet and sets the day's goal. The BDA server uses data from the 4K video sensor, twin infrared global-shutter cameras (30 fps), and an infrared projector to visualize these on the dashboard.
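The five checks above amount to a completeness test over the booleans the BDA server derives from the vision data; the structure below is a hypothetical sketch, not our implementation.

```python
# The five safety measures of the first dashboard, in display order.
REQUIRED_MEASURES = ("goggles", "gloves", "footwear", "apron", "check_sheet")


def missing_measures(detected):
    """Return the safety measures not satisfied, in dashboard order.

    detected: dict mapping measure name -> bool (absent keys count as
    unsatisfied), as might be produced from the camera pipeline.
    """
    return [m for m in REQUIRED_MEASURES if not detected.get(m, False)]


status = {"goggles": True, "gloves": False, "footwear": True,
          "apron": True, "check_sheet": False}
print(missing_measures(status))  # ['gloves', 'check_sheet']
```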
The second intervention/dashboard has advanced functions to support students in a science lab. For example, this dashboard can (i) report on the number of risky behaviors a student performs in a day, (ii) report on the number of risky behaviors a student performs in a week, (iii) report on the number of risky behaviors a student performs in a month, (iv) measure and report the temperature near the experimental area, and (v) report on how long the student is conducting experiments without interaction with a human or a robot. This dashboard is designed to have a graph for frequently made errors (that is, which risky behaviors are frequently made by the student), and a radar chart to provide information on the type of interaction that happens in the lab. This dashboard can also inform the teacher if a student is talking on the phone for more than the optimal time. Moreover, it checks whether there is a flame in the experimental area, calculates the proximity of the student to the table, and provides feedback on it. To generate these results, data from multiple sources such as heat sensors, a temperature screening assistant, a 4K video sensor, three far-field microphone arrays, and two internal inertial measurement units (IMUs) are captured and analyzed in the Behavior Detection and Analysis (BDA) server.
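The daily, weekly, and monthly reports in items (i)-(iii) amount to bucketing risky-behavior timestamps by period. A hedged sketch with invented sample data (a toy stand-in for the BDA server's aggregation):

```python
from collections import Counter
from datetime import datetime


def counts_by_period(timestamps, period):
    """Count events per period.

    period: 'day' -> YYYY-MM-DD keys, 'week' -> ISO year-week keys,
    anything else -> YYYY-MM (monthly) keys.
    """
    keys = []
    for ts in timestamps:
        dt = datetime.fromisoformat(ts)
        if period == "day":
            keys.append(dt.strftime("%Y-%m-%d"))
        elif period == "week":
            iso = dt.isocalendar()
            keys.append(f"{iso[0]}-W{iso[1]:02d}")
        else:
            keys.append(dt.strftime("%Y-%m"))
    return Counter(keys)


# Invented event timestamps for one student.
events = ["2025-04-28T09:00:00", "2025-04-28T14:30:00", "2025-04-30T10:15:00"]
daily = counts_by_period(events, "day")
# daily maps "2025-04-28" -> 2 and "2025-04-30" -> 1
```

The frequency graph and radar chart on the dashboard would be rendered from counters like these.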
The third intervention/dashboard implements four main functions: information, scan, help, and alert. The information function visualizes information about the experiment and the risks associated with it. The scan function checks for and anticipates dangerous situations related to the lab, the apparatus, and the trash. In this regard, since proper waste disposal is essential, Misty II Plus could help students understand how to dispose of the various dangerous substances they might use. The help function allows the students to ask for help from the supervisor or the assistive robot. For example, the supervisor can connect to the intervention to send messages that help students stay safe during emergency situations. Regarding the alert function, students may receive a sound, a flash, or both whenever the robot detects risky behavior. Using this function, the robot can send a flash signal to alert students to potentially dangerous situations. The audio alert can be based on speech or simply alarm sounds.
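The alert function's choice among sound, flash, or both can be sketched as a small dispatch table. The risk levels and their mapping here are illustrative assumptions, not the exact policy of the dashboard:

```python
def alert_channels(risk_level):
    """Map an assumed risk level to the alert channels the robot uses."""
    mapping = {
        "low": {"flash"},             # silent visual nudge
        "medium": {"sound"},          # speech or alarm audio
        "high": {"sound", "flash"},   # both, for imminent danger
    }
    if risk_level not in mapping:
        raise ValueError(f"unknown risk level: {risk_level}")
    return mapping[risk_level]


print(alert_channels("high"))  # {'sound', 'flash'} (set order may vary)
```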

4. Evaluation Experiment

4.1. Objective

This experiment aimed to evaluate the use of social robots for science students. Using a co-design approach, we asked students about their prior experiences and expectations about social robots and their expectations about using socially assistive robots to improve lab safety.

4.2. Participants

Twenty students (18–27 years) participated in the experiment. Nine participants were enrolled in a Japanese university (eight Japanese and one Chinese national), and eleven in an Egyptian university (all native Arabic speakers). Fourteen participants were undergraduate students who used a science lab regularly and had prior experience with social robots and the use of robots in education. Six participants were master’s degree students at a Japanese university. Eleven participants were majoring in Instructional and Information Technology: five participants were registered in Information and Computer Science, five participants in Electrical and Electronic Engineering, and one participant in the Department of Bioscience. Seventeen participants had adequate English language proficiency; they received information in English and answered the questionnaire in English. Three participants, whose English proficiency was not sufficient, interacted through a teacher who was fluent in English and in the native language of the participants. All participants had some experience working in a science laboratory, and each described at least one possible accident in a science laboratory. Participation was voluntary and participants received no compensation.
Although there were only twenty participants, there is precedent for working with a small number of participants in the medical domain during initial investigations [58,59].

4.3. Procedure

The experiment was conducted in two stages. The first stage was an initial experiment to understand the risky behaviors that need to be avoided to maintain a safe environment in a scientific laboratory: we asked teachers to specify risky behaviors within scientific laboratories and how to help students control these behaviors. In the second stage, we asked students about their prior experiences and expectations about social robots and their expectations about the use of socially assistive robots to enhance lab safety, using the System Usability Scale (SUS). Quantitative methods, including surveys, were used to collect demographic data on the students who participated in the experiment. Before the interventions were presented, participants were asked to specify their previous experiences with social robots, as well as their expectations about the capabilities of social robots for achieving safety within scientific laboratories. This information was gathered using a Before-Interaction Questionnaire (BIQ). We then presented the interventions to the participants, explained the LSA framework, and allowed them to interact with Misty II Plus. After this step, participants answered an After-Interaction Questionnaire (AIQ), which used the SUS to receive feedback. Participants answered 12 questions about their understanding of the suggested interventions to improve safety in science laboratories, rating each item on a Likert scale from 1 (strongly disagree) to 5 (strongly agree). Figure 5 shows the procedure of the experiment.

4.4. Method and Materials

Apparatus: The Misty II robot was created by Misty Robotics. It detects and recognizes faces captured by the camera in its visor using a module built on the Snapdragon Neural Processing Engine. Misty II is a 36 cm-tall robot designed to be a social companion and to enhance human–robot interaction. The microphone specifications of Misty II include an omnidirectional polar pattern, a signal-to-noise ratio (SNR) of 64 dB, a sensitivity of −26 dBFS, and a power supply rejection (PSR) of −70 dBFS. These devices offer significant benefits for the overall acceptability of the system. Robotics for active and assisted living should be easily customizable to meet the needs of the user and adapt to changing environments [60]. The robot can function in inadequate lighting conditions and at three distinct degrees of head tilt. When Misty's RGB camera was active, an ML algorithm evaluated the captured image for human detection. If a person was successfully identified, the system recorded their location; if not, it paused for a while before restarting the localization process. The parameters of the SONY IMX214 RGB camera were as follows: power supply (analog: 2.7 V; digital: 1.0 V and 1.8 V); operating temperature (−20 °C to +70 °C); module size (8.5 × 8.5 × 6.15 mm, L × W × H); total pixel count (4224 (H) × 3200 (V)); field of view (FOV) (horizontal 63.2°, vertical 49.0°, diagonal 75.0°); image sensor size (1/3 inch); effective focal length (EFL) (3.86 mm). Previous tests carried out by the authors confirmed that the best algorithm for human detection with the Misty RGB camera in an indoor scenario is YOLO-v3 (99.9%) [61].
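Downstream of YOLO-v3, the behavior-analysis pipeline only needs the confident 'person' detections from each frame. A hedged post-processing sketch: the detection tuples below stand in for a real detector's output (e.g. from OpenCV's DNN module), and the threshold is an illustrative choice.

```python
PERSON_CLASS_ID = 0  # 'person' in the COCO label set used by YOLO-v3


def person_boxes(detections, conf_threshold=0.5):
    """Keep only confident person bounding boxes.

    detections: iterable of (class_id, confidence, (x, y, w, h)) tuples,
    as a YOLO-v3 detector would produce per frame.
    """
    return [box for cls, conf, box in detections
            if cls == PERSON_CLASS_ID and conf >= conf_threshold]


raw = [(0, 0.92, (40, 30, 120, 260)),   # confident person, kept
       (0, 0.31, (300, 40, 90, 200)),   # low-confidence person, dropped
       (56, 0.88, (10, 10, 50, 50))]    # non-person class, dropped
print(person_boxes(raw))  # [(40, 30, 120, 260)]
```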
Transcript: To guide the participants, we prepared a transcript that contained the following information: (1) the purpose of the experiment, (2) social robots, (3) Misty as a social robot, (4) the LSA framework, (5) the interventions, and (6) the informed consent. For Japanese participants, the explanation was given in English, but a Japanese translation (prepared by a native Japanese speaker) was also provided. A similar strategy was used with Egyptian participants. The same transcript was used with the Japanese and Egyptian participants.
Before Interaction Questions (BIQ) consisted of forty-eight questions. Ten of these collected demographic information about the participant; the remainder asked about their previous experiences with and expectations about social robots, and their expectations about the use of socially assistive robots to improve lab safety. Google Forms was used for the questionnaire. https://docs.google.com/document/d/1KXKCCfWMQBZ7MdAxADg4sUNWR4IVECQ6fh-xxgXZIaU/edit?usp=sharing (accessed on 17 June 2025).
Videos: We chose ten videos to introduce Misty II Plus to the participants. The videos showed: (1) Misty Robot. (2) What does Misty do? (3) Misty's many capabilities. (4) Misty's unique personality and expressions. (5) Misty II feature breakdown: computer vision. (6) Temperature screening assistant for Misty II at Misty Robotics. (7) Misty II feature breakdown: voice interaction and speakers. (8) Misty II feature breakdown: mapping and navigation. (9) Misty II, social distance, and sanitizing. (10) Misty II—Conversation Skill (V2). We chose these videos considering several factors, such as the fact that many participants were experiencing Misty II for the first time, keeping complex and irrelevant features out, and using shorter videos. We used videos available on YouTube. The videos used for this experiment can be accessed from here. The total length of the ten videos was 16 min 34 s (1st clip: 29 s; 2nd clip: 1 m 33 s; 3rd clip: 1 m 47 s; 4th clip: 23 s; 5th clip: 2 m 59 s; 6th clip: 1 m 29 s; 7th clip: 3 m 10 s; 8th clip: 3 m 2 s; 9th clip: 1 m 6 s; 10th clip: 36 s). Figure 6 shows the setup used in Japan to show the videos to the participants.
Interaction with Misty: In Japan, participants were asked to interact with a Misty II Plus robot; participants in Egypt were unable to interact with the physical robot. During the interaction, participants were asked to touch the robot, feel its humanoid body, talk to it, and ask questions. There was no time limit on interacting with the robot. Figure 7 shows the participants interacting with the robot.
Interventions: We prepared an iPad for the experiment and showed the interventions we designed. Participants were allowed to zoom in and out so that they could examine the features closely. There was no time limit for the participants to experience the dashboards on the iPad. We also printed the interventions on paper. Figure 8 shows participants viewing the interventions.

5. Results

We present our findings in two major areas: (1) participants' expectations for social robot capabilities in achieving safety within scientific settings; in the first stage, students provided input for implementing the suggested interventions in the social robot using the LSA framework. (2) The usability of the system, tested using the SUS after presenting the interventions to the users. This helps us understand how the proposed interventions might be implemented in social robots to improve safety in the scientific laboratory while conducting scientific research.

5.1. Expectations on the Use of Social Robots to Achieve Safety in Scientific Laboratories

To define the participants' expectations for implementing the suggested solution using social robots to provide a safe laboratory for scientific inquiry, students filled out a questionnaire (see Supplementary Materials) with 42 questions regarding their prior knowledge about socially assistive robots and their expectations for how socially assistive robots can create a secure environment in the scientific laboratory. The questionnaire was structured as follows: eleven items gathered qualitative data explaining the reasons for students' choices, and thirty-one items gathered quantitative data. Answering these questions provided a better understanding of the students' prior experience in using science laboratories, the types of accidents they know may occur inside these laboratories, why these accidents happen, and what they know about assistive social robots that can help them stay safe in science laboratories. The maximum possible score on the questionnaire was 155 points, and the median score was 93 points. The questionnaire was administered online after the interventions ended. The score of the experimental group was compared with the median score of the questionnaire. The Wilcoxon signed-rank test was used to further evaluate students' expectations about social robot characteristics that lead to safety in scientific facilities. At the same time, students proposed ways to incorporate the suggested interventions into the social robot using the LSA framework.
According to the Kruskal–Wallis test, there are no significant differences between groups in students' perceptions of the social robots' capacity to improve safety and of sharing emotional interactions between students and social robots in science labs. For perceptions regarding the capabilities of social robots and the improvement of safety in the science laboratory: χ2(1) = 1.007, p = 0.316, with mean ranks of 9.42 for group 1 and 12.13 for group 2. A Wilcoxon signed-rank test was performed to evaluate the participants' expectations about how integrating the suggested interventions into the LSA framework can provide a socially assistive robot that creates a safe environment in the scientific laboratory (Median = 106.5, Z = −2.39, p < 0.05, r = 0.53). Furthermore, the sum of the positive difference scores (169.00) was higher than the sum of the negative difference scores (41.00), demonstrating a positive shift in the participants' expectations about using the capabilities of social robots to provide a safe laboratory. We report medians because they are more appropriate for non-parametric tests: (Mean = 84.40, Median = 87, SD = 19.92) for the pre-test and (Mean = 103.45, Median = 106.5, SD = 12.77) for the post-test. These results indicate a large effect size for the suggested intervention in enhancing safety inside the scientific laboratory by using a social robot.
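For reproducibility, the effect size follows the usual normal-approximation rule r = |Z|/sqrt(N), which with Z = −2.39 and N = 20 gives r ≈ 0.53, as reported. The self-contained sketch below re-derives Z and r for paired pre/post scores. In practice `scipy.stats.wilcoxon` would be used; this version spells out the arithmetic, and the sample scores are invented.

```python
import math


def wilcoxon_z_and_r(pre, post):
    """Normal-approximation Wilcoxon signed-rank test for paired data.

    Returns (z, r) where z is the standardized test statistic and
    r = |z| / sqrt(n) is the effect size over the n nonzero differences.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero diffs
    n = len(diffs)
    # Rank absolute differences, averaging ranks over ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(rk for d, rk in zip(diffs, ranks) if d > 0)
    mean_w = n * (n + 1) / 4
    sd_w = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean_w) / sd_w
    return z, abs(z) / math.sqrt(n)


# Invented Likert-style pre/post scores where post tends to exceed pre.
pre = [3, 4, 2, 5, 3, 4, 2, 3, 4, 3]
post = [4, 5, 3, 5, 4, 5, 4, 4, 5, 4]
z, r = wilcoxon_z_and_r(pre, post)
```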
The Kruskal–Wallis test results indicate that there are no significant differences between groups in students' perceptions about sharing feelings with social robots and improving safety in the science laboratory: χ2(1) = 0.434, p = 0.510, with mean ranks of 9.79 for group 1 and 11.56 for group 2. A Wilcoxon signed-rank test was performed to evaluate the participants' experiences of sharing feelings with a social robot that supports a secure environment in the scientific laboratory (Median = 22.5, Z = −2.36, p < 0.05, r = 0.53). We report medians because they are more appropriate for non-parametric tests: (Mean = 21.65, Median = 22.50, SD = 4.27) for the pre-test and (Mean = 24.75, Median = 26, SD = 3.02) for the post-test. Furthermore, the sum of the positive difference ranks (168.00) was larger than the sum of the negative difference ranks (42.00), demonstrating a positive shift in the participants' expectations of using the abilities of social robots to share feelings with students to make a safe laboratory. These results indicate a large effect size for the suggested intervention with respect to participants' experiences of sharing feelings with a social robot and how it can support students in staying safe in the scientific laboratory. Figure 9 and Figure 10 show an overview of the results, demonstrating that students' expectations for robot capabilities and emotional sharing rose significantly.

5.2. Use of Social Assistive Robot for Enhancing Safety in Science Laboratory (SUS)

A parametric one-sample t-test was performed because the Kolmogorov–Smirnov test confirmed that the data met the assumption of normality (p = 0.56). The findings indicate a significant deviation of the sample mean from the hypothesized population mean. The one-sample t-test assessed whether the mean expectations of the participants (M = 36.40, SD = 3.17) about sharing feelings with a social robot under the LSA-based intervention for lab safety differed from the hypothesized mean of 20. The results revealed a statistically significant difference, t(19) = 3.39, p = 0.003, with a mean difference of 2.4 (95% CI [0.92, 3.88]). This suggests that participants perceive socially assistive robots as enhancing safety in science laboratories.
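The one-sample t statistic follows t = (M − μ0) / (SD / sqrt(n)) with df = n − 1, which is why the paper reports t(19) for twenty participants. A small sketch of that arithmetic with invented Likert-sum scores; in practice `scipy.stats.ttest_1samp` would also report the p-value.

```python
import math
import statistics


def one_sample_t(scores, mu0):
    """Return (t statistic, degrees of freedom) against hypothesized mean mu0."""
    n = len(scores)
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample SD (n - 1 denominator)
    t = (mean - mu0) / (sd / math.sqrt(n))
    return t, n - 1


# Invented questionnaire sums for ten hypothetical respondents.
scores = [34, 38, 36, 40, 35, 37, 33, 39, 36, 38]
t, df = one_sample_t(scores, mu0=20)  # t is large and positive here
```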

6. Discussion

Although there have been some attempts to improve safety in scientific laboratories, these efforts have had numerous drawbacks, and the current study offers valuable solutions. Detecting hydrogen with sensors, for example, has several limitations when applied in real-world situations [62]. According to the findings, university laboratory safety is a highly interdisciplinary area of study, yet it remains a minority research area compared with other safety domains. Although some new research areas have emerged in the last 10 years, researchers still need to contribute innovative concepts, new subjects, and novel approaches or theories to this area. Such areas of study include emergency response, human error, safety communication, risk perception, resilience, and safety science in general [3]. This is exactly what we have done in the current study using a robot. Easy-to-use software and hardware systems that assist in the rapid development and implementation of smart sensing are important in science laboratories. According to [63], the combined use of dynamic real-time traceability and information visualization for laboratory safety management is still required to provide security within science laboratories, and that is exactly what we proposed in the current study using a mobile social robot that can detect errors in a dynamic context.
In this study, we explored the participants’ ideas for designing an intervention that can be integrated into a socially assistive robot (like Misty II) according to the LSA framework. A co-design approach based on participant responses was used. The LSA framework provides an opportunity for the lab supervisor or instructor to use the robot-assisted learning tool to help students stay safe during potentially hazardous experiments. The robot may converse using vocal messages and alert sounds, with or without visual indications. This can help maintain a safe environment for students. In addition, the teacher or supervisor can provide synchronous instructions to the students via the robot.
We have contributed to the research on socially assistive robots by providing an intervention to improve safety inside the science laboratory. To interact effectively in real-world scenarios, robots must be able to connect and engage users autonomously with minimal supervision from human professionals. The robot should be able to provide work instructions, provide formative feedback that responds to what the user is doing in the present moment, and furnish detailed information on proper and incorrect behaviors. Effective formative feedback is essential for providing personalized help, supporting skill improvement, and increasing user engagement [64].
Robots can be used to address safety-related risks and opportunities. They can support a variety of collaborative operation modes: safety-rated monitored stop, speed and separation monitoring, power and force limiting, and hand guiding [65]. Assistive robots can also help regulate user behaviors. For example, robot-assisted task-specific ankle training shows promising effects by improving functional muscle strength, which has a positive influence on the quality of life of children [66]. As a result, we answered the research question: What are the students' expectations for improving safety inside a science laboratory with the support of social robots?
Regarding the research question ‘How does sharing the feelings of students affect the design of a social robot to improve safety within a science laboratory?’ we found that sharing feelings with a social robot can increase the effectiveness of the robot in achieving safety within the laboratory. Derakhshan et al. (2025) indicate that AI-powered robots improved student empathy and communication skills during teaching English for medical sciences [67]. This can be explained by cooperative resolutions resulting from enhanced empathy during student–robot conversations. Socially assistive robots with a human-like face and relatable facial expressions and gestures showed higher engagement levels; those capable of showing a smiling and talking face showed higher levels of engagement [68]. The study by Louie (2017) proposes a learning-based interaction approach for socially assistive robots, enabling them to autonomously facilitate multi-user social recreational activities for novice users [69]. Additionally, it personalizes robotic assistance to enhance individual user compliance.
The socially assistive robot with the suggested intervention improved safety within the science laboratory. Consistent with El-Shamouty et al. (2020), the current study found that an assistive robot can teach students how to safely handle harmful substances or apparatus [70]. Using robots as an assistive tool in a scientific laboratory creates a safe environment while also increasing students' satisfaction [71]. Breazeal et al. (2004) show that a social robot can teach a task through cooperative communication and coordinate shared goals to complete the assignment with the instructor or supervisor [72]. Learning scientists' perspectives are critical in robot-assisted educational environments. We are confident that the implementation and application of this framework complements new learning theories.

7. Conclusions

The current study investigated the use of socially assistive robots (Misty II Plus) to improve safety in science laboratories. Our research highlights three significant results. Firstly, the LSA architecture helps the lab manager or instructor deploy social robots to keep students safe during potentially dangerous experiments. Secondly, by giving precise feedback, socially assistive robots can be used to regulate user actions. Thirdly, students’ emotions can be shared by socially assistive robots, which can improve safety in a research lab. Human-looking socially assistive robots demonstrated greater levels of engagement.
Finally, we should discuss the limitations of the present study. We believe that increasing the number of participants could yield more information that would help social robots better understand the risky behaviors that students may engage in. In addition, this could give more appropriate feedback based on the different levels of risky behaviors that students may exhibit. Additionally, recording videos and gathering data for more than a few weeks may highlight the benefits that users can experience while utilizing the social robot and framework in various types of educational labs. Future research could also be conducted to detect risky behaviors in different learning environments, such as playgrounds. In addition, a study on the use of a social robot to support students while conducting a specific experiment could provide us with more accurate data on the usability of the system. More studies in various contexts may lead to a list of criteria for conducting a specific experiment in a specific context safely by using social robots. In addition, there is a need to explore the effectiveness of using the suggested framework for using a social robot while teaching students in primary school and kindergarten.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/electronics14132513/s1.

Author Contributions

Conceptualization, M.M.H.A. and M.N.H.; methodology, M.M.H.A., M.N.H. and B.I.; software, M.M.H.A. and M.N.H.; validation, M.M.H.A., M.N.H. and B.I.; formal analysis, M.M.H.A.; investigation, M.M.H.A. and M.N.H.; resources, M.M.H.A., M.N.H. and B.I.; data curation, M.M.H.A. and M.N.H.; writing—original draft preparation, M.M.H.A. and M.N.H.; writing—review and editing, B.I.; visualization, M.M.H.A.; supervision, B.I.; project administration, M.N.H.; funding acquisition, M.N.H. All authors have read and agreed to the published version of the manuscript.

Funding

No external funding was received for this research.

Institutional Review Board Statement

The study was approved by the Information Media Education and Research Center Research Ethics Committee (code: Hosei University RCCMS 2023-001).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the authors.

Acknowledgments

We thank the Research Center for Computing and Multimedia Studies for providing the authors a robot (Misty II) for conducting the experiment.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
LSA: Laboratory Safety Assistant
SUS: System Usability Scale
AIQ: After Interaction Questions
BIQ: Before Interaction Questions
PD: Participatory Design
LMS: Learning Management System
SIS: Student Information System
DLI: Digital Learning Institute
UCD: User-Centered Design
STEM: Science, Technology, Engineering, and Math
RAL: Robot-assisted Learning
DBR: Design-based Research
BDA: Behavior Detection and Analysis

References

  1. Chen, H.; Li, H.; Zhao, J.; Ruan, C.; Huang, L. Enhancing crop disease recognition via prompt learning-based progressive Mixup and Contrastive Language-Image Pre-training dynamic calibration. Eng. Appl. Artif. Intell. 2025, 152, 110805. [Google Scholar] [CrossRef]
  2. Yang, H.; Guo, Q.; Chen, G.; Zhao, Y.; Shi, M.; Zhou, N.; Huang, C.; Mao, H. An intelligent humidity sensing system for human behavior recognition. Microsyst. Nanoeng. 2025, 11, 17. [Google Scholar] [CrossRef]
  3. Yang, Y.; Reniers, G.; Chen, G.; Goerlandt, F. A bibliometric review of laboratory safety in universities. Saf. Sci. 2019, 120, 14–24. [Google Scholar] [CrossRef]
  4. Chen, M.; Wu, Y.; Wang, K.; Guo, H.; Ke, W. An explosion accident analysis of the laboratory in university. Process Saf. Prog. 2020, 39, e12150. [Google Scholar] [CrossRef]
  5. Hammerschmidt-Snidarich, S.M.; McComas, J.J.; Simonson, G.R. Individualized goal setting during repeated reading: Improving growth with struggling readers using data based instructional decisions. Prev. Sch. Fail. 2019, 63, 334–344. [Google Scholar] [CrossRef]
  6. Utamachant, P.; Anutariya, C.; Pongnumkul, S. i-Ntervene: Applying an evidence-based learning analytics intervention to support computer programming instruction. Smart Learn. Environ. 2023, 10, 37. [Google Scholar] [CrossRef]
  7. Tepgeç, M.; Ifenthaler, D. Learning analytics based interventions: A systematic review of experimental studies. In Proceedings of the 19th edition the International Conference on Cognition and Exploratory Learning in Digital Age (CELDA 2022), Lisbon, Portugal, 8–10 November 2022. [Google Scholar]
  8. Digital Learning Institute. Learning Analytics: The Ultimate Guide. 2025. Available online: https://www.digitallearninginstitute.com/blog/learning-analytics-the-ultimate-guide (accessed on 18 April 2025).
  9. Akçapnar, G.; Hasnine, M.N.; Majumdar, R.; Flanagan, B.; Ogata, H. Developing an early-warning system for spotting at-risk students by using eBook interaction logs. Smart Learn. Environ. 2019, 6, 4. [Google Scholar] [CrossRef]
  10. Mubarak, A.A.; Cao, H.; Hezam, I.M. Deep analytic model for student dropout prediction in massive open online courses. Comput. Electr. Eng. 2021, 93, 107271. [Google Scholar] [CrossRef]
  11. Hasnine, M.N.; Akcapinar, G.; Flanagan, B.; Majumdar, R.; Mouri, K.; Ogata, H. Towards final scores prediction over clickstream using machine learning methods. In Proceedings of the International Conference on Computers in Education, Manila, Philippines, 26–30 November 2018. [Google Scholar]
  12. Shafiq, D.A.; Marjani, M.; Habeeb, R.A.A.; Asirvatham, D. Student retention using educational data mining and predictive analytics: A systematic literature review. IEEE Access 2022, 10, 72480–72503. [Google Scholar] [CrossRef]
  13. Paulsen, L.; Lindsay, E. Learning analytics dashboards are increasingly becoming about learning and not just analytics—A systematic review. Educ. Inf. Technol. 2024, 29, 14279–14308. [Google Scholar] [CrossRef]
  14. Sunny, M.S.H.; Rahman, M.M.; Haque, M.E.; Banik, N.; Ahmed, H.U.; Rahman, M.H. Assistive robotic technologies: An overview of recent advances in medical applications. In Medical and Healthcare Robotics; Elsevier: Amsterdam, The Netherlands, 2023; pp. 1–23. [Google Scholar]
  15. Michaelis, J.E.; Cagiltay, B.; Ibtasar, R.; Mutlu, B. “Off Script”: Design opportunities emerging from long-term social robot interactions in-the-wild. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, Stockholm, Sweden, 13–16 March 2023. [Google Scholar]
  16. Ragno, L.; Borboni, A.; Vannetti, F.; Amici, C.; Cusano, N. Application of social robots in healthcare: Review on characteristics, requirements, technical solutions. Sensors 2023, 23, 6820. [Google Scholar] [CrossRef]
  17. Mois, G.; Trinh, M.U.; Ali, A.F.A.; Vergara, L.G.L.; Rogers, W.A. Design considerations for a socially assistive robot in the home to support older adults. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2023, 67, 542–548. [Google Scholar] [CrossRef]
  18. Almeida, L.; Menezes, P.; Dias, J. Telepresence social robotics towards co-presence: A review. Appl. Sci. 2022, 12, 5557. [Google Scholar] [CrossRef]
  19. Oruma, S.O.; Sánchez-Gordón, M.; Colomo-Palacios, R.; Gkioulos, V.; Hansen, J.K. A systematic review on social robots in public spaces: Threat landscape and attack surface. Computers 2022, 11, 181. [Google Scholar] [CrossRef]
  20. Kang, M.H.; Kim, S. Research trends in entertainment robots: A comprehensive review of the literature from 1998 to 2024. Digit. Bus. 2024, 5, 100102. [Google Scholar] [CrossRef]
  21. Hasnine, M.N.; Indurkhya, B.; Ahmed, M.M.H. Socially assistive robot as laboratory safety assistant for science students. In Proceedings of the 2024 IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO), Hong Kong SAR, China, 20–22 May 2024. [Google Scholar]
  22. Axelsson, M.; Oliveira, R.; Racca, M.; Kyrki, V. Social robot co-design canvases: A participatory design framework. ACM Trans. Hum. Robot Interact. 2021, 11, 3. [Google Scholar] [CrossRef]
  23. Chen, Y.; Li, J.; Blasch, E.; Qu, Q. Future Outdoor Safety Monitoring: Integrating Human Activity Recognition with the Internet of Physical–Virtual Things. Appl. Sci. 2025, 15, 3434. [Google Scholar] [CrossRef]
  24. Ahmad, M.I.; Khordi-moodi, M.; Lohan, K.S. Social Robot for STEM Education. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; ACM: New York, NY, USA, 2020. [Google Scholar]
  25. van den Berghe, R.; Verhagen, J.; Oudgenoeg-Paz, O.; van der Ven, S.; Leseman, P. Social robots for language learning: A review. Rev. Educ. Res. 2018, 89, 259–295. [Google Scholar] [CrossRef]
  26. Alemi, M.; Meghdari, A.; Ghazisaedy, M. The impact of social robotics on L2 learners’ anxiety and attitude in English vocabulary acquisition. Int. J. Soc. Robot. 2015, 7, 523–535. [Google Scholar] [CrossRef]
  27. Mubin, O.; Alhashmi, M.; Baroud, R.; Alnajjar, F.S. Humanoid robots as teaching assistants in an Arab school. In Proceedings of the 31st Australian Conference on Human-Computer-Interaction, Fremantle, WA, Australia, 2–5 December 2019; ACM: New York, NY, USA, 2019. [Google Scholar]
  28. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954. [Google Scholar] [CrossRef]
  29. Schodde, T.; Bergmann, K.; Kopp, S. Adaptive robot language tutoring based on Bayesian knowledge tracing and predictive decision-making. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017. [Google Scholar]
  30. Han, J. Robot-aided learning and r-learning services. In Human-Robot Interaction; IntechOpen: London, UK, 2010; p. 288. [Google Scholar]
  31. Alemi, M.; Meghdari, A.; Ghazisaedy, M. Employing humanoid robots for teaching English language in Iranian junior high-schools. Int. J. Hum. Robot. 2014, 11, 1450022. [Google Scholar] [CrossRef]
  32. Girotto, V.; Lozano, C.; Muldner, K.; Burleson, W.; Walker, E. Lessons learned from in-school use of rTAG. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; ACM: New York, NY, USA, 2016. [Google Scholar]
  33. Shukla, J.; Cristiano, J.; Oliver, J.; Puig, D. Robot assisted interventions for individuals with intellectual disabilities: Impact on users and caregivers. Int. J. Soc. Robot. 2019, 11, 631–649. [Google Scholar] [CrossRef]
  34. Kostrubiec, V.; Kruck, J. Collaborative research project: Developing and testing a robot-assisted intervention for children with autism. Front. Robot. AI 2020, 7, 37. [Google Scholar] [CrossRef] [PubMed]
  35. Papadopoulou, M.T.; Karageorgiou, E.; Kechayas, P.; Geronikola, N.; Lytridis, C.; Bazinas, C.; Kourampa, E.; Avramidou, E.; Kaburlasos, V.G.; Evangeliou, A.E. Efficacy of a robot-assisted intervention in improving learning performance of elementary school children with specific Learning Disorders. Children 2022, 9, 1155. [Google Scholar] [CrossRef]
  36. Khan, F.; Hashemi, S.J.; Paltrinieri, N.; Amyotte, P.; Cozzani, V.; Reniers, G. Dynamic risk management: A contemporary approach to process safety management. Curr. Opin. Chem. Eng. 2016, 1, 9–17. [Google Scholar] [CrossRef]
  37. Olewski, T.; Snakard, M. Challenges in applying process safety management at university laboratories. J. Loss Prev. Process. Ind. 2017, 49, 209–214. [Google Scholar] [CrossRef]
  38. Swuste, P.; Theunissen, J.; Schmitz, P.; Reniers, G.; Blokland, P. Process safety indicators, a review of literature. J. Loss Prev. Process. Ind. 2016, 40, 162–173. [Google Scholar] [CrossRef]
  39. Artdej, R. Investigating Undergraduate Students’ Scientific Understanding of Laboratory Safety. Procedia Soc. Behav. Sci. 2012, 46, 5058–5062. [Google Scholar] [CrossRef]
  40. Kozajda, A.; Miśkiewicz, E. The role of The National Register of Biological Agents in health protection of employees exposed to biological agents used intentionally at work in Poland. Int. J. Occup. Med. Environ. Health 2025, 38, 91–97. [Google Scholar] [CrossRef]
  41. Li, Z.; Wang, X.; Gong, S.; Sun, N.; Tong, R. Risk assessment of unsafe behavior in university laboratories using the HFACS-UL and a fuzzy Bayesian network. J. Saf. Res. 2022, 82, 13–27. [Google Scholar] [CrossRef]
  42. Azyabi, A.; Khamaj, A.; Ali, A.M.; Abushaega, M.M.; Ghandourah, E.; Alam, M.M.; Ahmad, M.T. Predicting ergonomic risk among laboratory technicians using a Cheetah Optimizer-Integrated Deep Convolutional Neural Network. Comput. Biol. Med. 2024, 183, 109314. [Google Scholar] [CrossRef] [PubMed]
  43. Lin, C.; Zheng, K.; Man, S.S. Risk perception scale for laboratory safety: Development and validation. Int. J. Ind. Ergon. 2025, 105, 103689. [Google Scholar] [CrossRef]
  44. Breazeal, C.; Dautenhahn, K.; Kanda, T. Social Robotics. In Springer Handbook of Robotics; Springer International Publishing: Berlin/Heidelberg, Germany, 2016; pp. 1935–1972. [Google Scholar]
  45. Bjögvinsson, E.; Ehn, P.; Hillgren, P.-A. Design things and design thinking: Contemporary participatory design challenges. Des. Issues 2012, 28, 101–116. [Google Scholar] [CrossRef]
  46. Müller, M. Participatory design: The third space in human-computer interaction. In The Human-Computer Interaction Handbook; CRC Press: Boca Raton, FL, USA, 2003; pp. 1051–1068. [Google Scholar]
  47. Fischer, G. Symmetry of ignorance, social creativity, and meta-design. In Proceedings of the 3rd Conference on Creativity & Cognition, Loughborough, UK, 11–13 October 1999; pp. 116–123. [Google Scholar]
  48. Azenkot, S.; Feng, C.; Cakmak, M. Enabling building service robots to guide blind people a participatory design approach. In Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; IEEE: Piscataway, NJ, USA, 2016. [Google Scholar]
  49. Lee, H.R.; Šabanović, S.; Chang, W.-L.; Nagata, S.; Piatt, J.; Bennett, C.; Hakken, D. Steps toward participatory design of social robots. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; Association for Computing Machinery: New York, NY, USA, 2017. [Google Scholar]
  50. Moharana, S.; Panduro, A.E.; Lee, H.R.; Riek, L.D. Robots for joy, robots for sorrow: Community based robot design for dementia caregivers. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI); IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  51. Steen, M. Co-design as a process of joint inquiry and imagination. Des. Issues 2013, 29, 16–28. [Google Scholar] [CrossRef]
  52. Beyer, H.; Holtzblatt, K. Contextual design. Interactions 1999, 6, 32–42. [Google Scholar] [CrossRef]
  53. JFA Purple Orange. Guide to Co-Design with People Living with Disability. Australia, May 2021. Available online: https://purpleorange.org.au/ (accessed on 18 April 2025).
  54. Armstrong, M.; Dopp, C.; Welsh, J. Design-based research. In The Students’ Guide to Learning Design and Research; 2020; pp. 1–6. Available online: https://edtechbooks.s3.us-west-2.amazonaws.com/pdfs/10/266.pdf (accessed on 17 June 2025).
  55. The European Geosciences Union (Ed.) Fundamental Sciences/Blue-Skies Research. 2025. Available online: https://www.egu.eu/policy/science/fundamental-sciences-blue-skies-research/ (accessed on 18 April 2025).
  56. Karimov, A.; Saarela, M.; Aliyev, S.; Baker, R. Ethical Considerations and Student Perceptions of Engagement Data in Learning Analytics. In Proceedings of the Annual Hawaii International Conference on System Sciences, University of Hawai‘i at Mānoa, HI, USA, 7–10 January 2025. [Google Scholar] [CrossRef]
  57. Spitale, M.; Axelsson, M.; Gunes, H. Robotic mental well-being coaches for the workplace. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, Stockholm, Sweden, 13–16 March 2023; Association for Computing Machinery: New York, NY, USA, 2023. [Google Scholar]
  58. Narasimhan, S.; Turkcan, M.K.; Ballo, M.; Choksi, S.; Filicori, F.; Kostic, Z. Monocular 3D Tooltip Tracking in Robotic Surgery—Building a Multi-Stage Pipeline. Electronics 2025, 14, 2075. [Google Scholar] [CrossRef]
  59. Vondikakis, I.; Politi, E.; Goulis, D.; Dimitrakopoulos, G.; Georgoulis, M.; Saltaouras, G.; Kontogianni, M.; Brisimi, T.; Logothetis, M.; Kakoulidis, H.; et al. Integrated Framework for Managing Childhood Obesity Based on Biobanks, AI Tools and Methods, and Serious Games. Electronics 2025, 14, 2053. [Google Scholar] [CrossRef]
  60. Ciuffreda, I.; Amabili, G.; Casaccia, S.; Benadduci, M.; Margaritini, A.; Maranesi, E.; Marconi, F.; De Masi, A.; Alberts, J.; de Koning, J.; et al. Design and Development of a Technological Platform Based on a Sensorized Social Robot for Supporting Older Adults and Caregivers: GUARDIAN Ecosystem. Int. J. Soc. Robot. 2023, 17, 803–822. [Google Scholar] [CrossRef]
  61. Ciuffreda, I.; Battista, G.; Casaccia, S.; Revel, G.M. People detection measurement setup based on a DOA approach implemented on a sensorised social robot. Meas. Sens. 2023, 25, 100649. [Google Scholar] [CrossRef]
  62. Buttner, W.J.; Post, M.B.; Burgess, R.; Rivkin, C. An overview of hydrogen safety sensors and requirements. Int. J. Hydrogen Energy 2011, 36, 2462–2470. [Google Scholar] [CrossRef]
  63. Xiao, X. Smart sensing for laboratory safety management. J. Artif. Intell. Robot. 2024, 1, 11–16. [Google Scholar] [CrossRef]
  64. Mohan, M.; Nunez, C.M.; Kuchenbecker, K.J. Closing the loop in minimally supervised human-robot interaction: Formative and summative feedback. Sci. Rep. 2024, 14, 10564. [Google Scholar] [CrossRef] [PubMed]
  65. Ikumapayi, O.M.; Afolalu, S.A.; Ogedengbe, T.S.; Kazeem, R.A.; Akinlabi, E.T. Human-robot co-working improvement via revolutionary automation and robotic technologies—An overview. Procedia Comput. Sci. 2023, 217, 1345–1353. [Google Scholar] [CrossRef]
  66. Alotaibi, M.; Arnold, B.L.; Munk, N.; Dierks, T.; Altenburger, P.; Alqabbani, S.; Almuwais, A. The pilot study of the effect of six-week robot-assisted ankle training on mobility and strength of lower extremity and life habits for children with cerebral palsy. Heliyon 2024, 10, e34318. [Google Scholar] [CrossRef]
  67. Derakhshan, A.; Teo, T.; Khazaie, S. Investigating the usefulness of artificial intelligence-driven robots in developing empathy for English for medical purposes communication: The role-play of Asian and African students. Comput. Human Behav. 2025, 162, 108416. [Google Scholar] [CrossRef]
  68. Moro, C. Learning Socially Assistive Robot Behaviors for Personalized Human-Robot Interaction; University of Toronto: Toronto, ON, Canada, 2018. [Google Scholar]
  69. Louie, W.-Y.G. A Learning Based Robot Interaction System for Multi-User Activities; University of Toronto: Toronto, ON, Canada, 2017. [Google Scholar]
  70. El-Shamouty, M.; Wu, X.; Yang, S.; Albus, M.; Huber, M.F. Towards safe human-robot collaboration using deep reinforcement learning. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; IEEE: Piscataway, NJ, USA, 2020. [Google Scholar]
  71. Velentza, A.-M.; Fachantidis, N.; Lefkos, I. Human-robot interaction methodology: Robot teaching activity. MethodsX 2022, 9, 101866. [Google Scholar] [CrossRef]
  72. Breazeal, C.; Hoffman, G.; Lockerd, A. Teaching and working with robots as a collaboration. In Proceedings of the Third International Joint Conference on Autonomous Agents and Multiagent Systems, New York, NY, USA, 19–23 July 2004; pp. 1030–1037. [Google Scholar]
Figure 1. The co-design process. Reprinted with permission from ref. [21]. Copyright 2024 IEEE.
Figure 2. Contextual inquiries: (a) Teacher is conducting an experiment. (b) A biology laboratory.
Figure 3. The LSA framework. Adapted with permission from ref. [21]. Copyright 2024 IEEE.
Figure 4. Dashboards of the LSA framework. Reprinted with permission from ref. [21]. Copyright 2024 IEEE.
Figure 5. Steps of the experiment.
Figure 6. Participants watching the video during the experiment.
Figure 7. Participants interacting with Misty.
Figure 8. Interventions experienced by the participants (a) on printed paper, (b) on iPad Mini. Reprinted with permission from ref. [21]. Copyright 2024 IEEE.
Figure 9. Enhancement of students’ perceptions toward the use of social robots to improve safety in scientific laboratories.
Figure 10. Enhancement of students’ feelings toward the use of social robots to improve safety in scientific laboratories.
Table 1. Laboratory safety issues and Misty II’s capabilities [36,37,38,39].
Hazards in the science laboratory:
  • Explosion
  • Biological exposure
  • Toxic exposure
  • Asphyxiation
  • Electrocution
  • Fire
  • Vehicle accident
Secure Procedure | Staff Role for Lab Security | Misty II Capability
Design lab policy | Yes, but it will take more time and work to update the instructions. | Yes, it can.
Training | Yes, once for all instructions is possible. | It can be used once for all instructions and once for every experiment.
State of safety, unwanted behavior | There is nothing to do. | Motivation to keep going.
Giving alarms, failures, number per time period | When it is done. | Prior to acting, to stop risky behavior or come up with a solution.
Exposure to dangerous materials/activities | When it is done. | Before it is done, to stop risky behavior or find a solution.
Incidents, number | Sometimes. | May constantly offer data and specifics for analysis.
Leakage, number, amount | Before users begin working. | It can keep track of the quantity before and throughout the work.
Fires, explosions, number, costs | When it is done. | Changes in the lab climate can readily identify potential fires before and when they occur.
Process design, failures | Yes, can be given. | Yes, can be given.
Maintenance, quality control, failures | Yes, can be given. | Increases maintenance time by providing sufficient knowledge to address potential threats.
Tests, failures | Safety system; frequency of activation can be discovered with more work. | Easily discovered.
Dynamic risk management (update estimated risks; test the change; show residual risks) | More time and effort are needed. | Can easily be provided.
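The "dynamic risk management" row in Table 1 (update estimated risks, test the change, show residual risks) can be illustrated with a minimal sketch. This is a hypothetical example, not the published framework: the hazard names, weights, and function names are assumptions chosen for illustration only.

```python
# Hypothetical sketch of dynamic risk management as described in Table 1:
# keep a running risk estimate per hazard and report the residual risk
# after a mitigation is applied. Weights and names are illustrative.

HAZARD_WEIGHTS = {"fire": 0.9, "toxic_exposure": 0.8, "electrocution": 0.7}

def estimated_risk(observations):
    """Combine per-hazard likelihood observations (0..1) into weighted risks."""
    return {h: HAZARD_WEIGHTS[h] * p for h, p in observations.items()}

def residual_risk(risks, mitigations):
    """Apply mitigation effectiveness (0..1) per hazard; return what remains."""
    return {h: r * (1.0 - mitigations.get(h, 0.0)) for h, r in risks.items()}

risks = estimated_risk({"fire": 0.5, "toxic_exposure": 0.2})
print(residual_risk(risks, {"fire": 0.8}))
```

A robot-side loop could re-run `estimated_risk` whenever sensor observations change and announce only hazards whose residual risk exceeds a threshold, which is the "show residual risks" step of the row above.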
Table 2. A set of user requirements [21].
Area: Analysis and improvement of at-risk behavior
  • The intervention system should detect students’ risky behaviors that could cause human errors during experiments.
  • The intervention system should monitor the student’s behavior so that the teacher can interpret how the student is performing during the experiment.
  • The social robot should provide direct feedback to the student if the student’s behavior is risky, or if the student is violating the laboratory safety guidelines.
  • The intervention system should identify safe behaviors that will guide the student toward safe practice early in the semester. This will potentially lead the student to practice safe behaviors whenever they are exposed to hazardous environments.
  • The social robot should check whether students meet the prerequisites for an experiment (for example, wearing an apron, wearing gloves, and putting on goggles).
Area: Physical activity and proximity
  • The social robot should track the patterns of physical and social activity during the experiment and continue learning. It should also continue to calculate the safe proximity for the student.
  • The social robot should provide responses through vocal interaction or by intervention when the student’s physical activity and/or proximity could cause an accident.
Area: Monitoring
  • The social robot should track the patterns of physical and social activity during the experiment and continue learning. It should also continue to calculate the safe proximity for the student.
  • The social robot should provide responses through vocal interaction or by intervention when the student’s physical activity and/or proximity could cause an accident.
Area: Environment check
  • The social robot should check whether the lighting in the lab is adequate.
  • The social robot should check whether dangerous materials such as gas cylinders are placed in an appropriate place.
  • The social robot should check whether the laboratory has an emergency exit.
Area: Assisting
  • The social robot should encourage behaviors that prevent tiredness (for example, taking a coffee break or drinking water).
  • The intervention system should show the student their recent behavior and behavior patterns.
  • The intervention system should show a student’s behavior patterns in a timeline with daily, weekly, and monthly overviews.
  • The intervention system should help a student by showing the common mistakes that he or she is making, to prevent human errors in the near future.
  • The intervention system should provide students with the most important safety instructions before starting the experiment.
  • The social robot should ask the student some questions and, from the feedback, decide whether the student is ready to conduct the experiment.
Area: Social companionship
  • The social robot should support the student by making small talk, giving positive compliments, and offering encouragement to boost the student’s confidence level.
  • The social robot should interact with the student when the student has been experimenting for a long time without interacting with peers or taking breaks.
  • The social robot should motivate and remind students to do their daily tasks.
  • The social robot should remind the student of birthdays and upcoming meetings.
Area: Student–intervention interaction
  • The intervention system should collect all interactions that occur between the student and the intervention and send these data to the database for further analysis.
Area: Teacher–intervention interaction
  • The intervention system should collect all interactions that occur between the teacher and the intervention and send these data to the database.
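Two of the user requirements above — checking the prerequisites for an experiment (apron, gloves, goggles) and collecting every interaction for later analysis — can be sketched in a few lines. This is a minimal illustration under stated assumptions: the function names, the gear list, and the in-memory log standing in for the framework's database are all hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of two Table 2 requirements: a prerequisite check
# before an experiment starts, and logging of student-robot interactions
# for later analysis. Names and data shapes are illustrative assumptions.

REQUIRED_GEAR = ("apron", "gloves", "goggles")

def ready_to_start(detected_gear):
    """The student may start only when all required safety gear is detected."""
    missing = [g for g in REQUIRED_GEAR if g not in detected_gear]
    return (len(missing) == 0, missing)

interaction_log = []  # stand-in for the database mentioned in Table 2

def log_interaction(actor, event):
    """Record an interaction so behavior patterns can be reviewed later."""
    interaction_log.append({"actor": actor, "event": event})

ready, missing = ready_to_start({"apron", "gloves"})
if not ready:
    log_interaction("robot", f"reminded student to wear: {', '.join(missing)}")
```

In a deployment, `detected_gear` would come from the robot's camera or sensors, and `log_interaction` would write to the framework's database instead of a Python list; the structure of the check and the log entry is the point of the sketch.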


Ahmed, M.M.H.; Hasnine, M.N.; Indurkhya, B. A Participatory Design Approach to Designing Educational Interventions for Science Students Using Socially Assistive Robots. Electronics 2025, 14, 2513. https://doi.org/10.3390/electronics14132513