Article

Remote Laboratory Based on FPGA Devices Using the E-Learning Approach

by Victor H. García Ortega 1,2,*, Josefina Bárcenas López 1 and Enrique Ruiz-Velasco Sánchez 3

1 Departamento de Tecnologías de la Información y Procesos Educativos, Instituto de Ciencias Aplicadas y Tecnología, Universidad Nacional Autónoma de Mexico (UNAM), Circuito Exterior sn, Cd. Universitaria, Ciudad de Mexico 04510, Mexico
2 Departamento de Ingeniería en Sistemas Computacionales, Escuela Superior de Cómputo, Instituto Politécnico Nacional (IPN), Av. Juan de Dios Batiz sn, Ciudad de Mexico 07738, Mexico
3 Departamento de Diversidad Sociocultural en la Educación, Instituto de Investigaciones Sobre la Universidad y la Educación, Universidad Nacional Autónoma de Mexico (UNAM), Circuito Cultural Universitario sn, Cd. Universitaria, Ciudad de Mexico 04510, Mexico
* Author to whom correspondence should be addressed.
Appl. Syst. Innov. 2026, 9(2), 37; https://doi.org/10.3390/asi9020037
Submission received: 29 November 2025 / Revised: 12 January 2026 / Accepted: 23 January 2026 / Published: 31 January 2026

Abstract

Laboratories across educational levels have traditionally required in-person attendance, limiting practical activities to specific times and physical spaces. This paper presents a technological architecture based on a system-on-chip (SoC) and a connectivist model, grounded in Connectivism Learning Theory, for implementing a remote laboratory in digital logic design using FPGA devices. The architecture leverages an Internet-of-Things (IoT) environment to provide applications and servers that enable remote access, programming, manipulation, and visualization of FPGA-based development boards located in the institution’s laboratory, from anywhere and at any time. The connectivist model allows learners to interact with multiple nodes for attending synchronous classes, performing laboratory exercises, managing the remote laboratory, and accessing educational resources asynchronously. This approach aims to enhance learning, knowledge transfer, and skills development. A four-year evaluation was conducted, including one experimental group using an e-learning approach and three in-person control groups from a Digital Logic Design course. The experimental group achieved an average performance score of 9.777, surpassing the control groups, suggesting improved academic outcomes with the proposed system. Additionally, a Technology Acceptance Model-based survey showed very high acceptance among learners. This paper presents a novel connectivist model, which we call the Massive Open Online Laboratory.

1. Introduction

The COVID-19 pandemic was caused by the coronavirus known as SARS-CoV-2, with the first human cases reported in Wuhan, China, in December 2019 [1]. In March 2020, the WHO declared COVID-19 a pandemic, urging countries to adopt urgent and aggressive measures [2]. This situation drastically altered daily activities such as work, education, and family life, requiring rapid adaptation to unprecedented changes within a short period of time.
To prevent the spread and contagion of COVID-19, educational institutions worldwide began closing, affecting over 1.5 billion learners globally [3].
In this context, digital technologies such as Learning Management Systems (LMSs) [4], native cloud storage applications [5,6], platforms for videoconferencing and virtual meetings [7,8], social media [9,10], and mobile instant messaging applications [11,12] were employed. All these technologies served as means of transmitting information and of connecting and communicating people, primarily learners and teachers, thereby achieving Technology-Mediated Learning [13].
In Mexico, academic activities were suspended on 23 March 2020 [14]. As a result, the entire in-person teaching–learning model had to be transformed into a distance learning model, to which both learners and teachers had to adapt in a very short time. The government implemented programs such as “Aprende en Casa” at the basic and upper-secondary education levels to enable a rapid transition to distance education [15]. In addition, accelerated training had to be provided to teachers during the pandemic, as many of them lacked the necessary preparation, as well as the financial and technological resources and institutional support from the organizations where they were employed [16]. Platforms such as Zoom, Google Classroom, Canvas, Moodle, Edmodo, and Kahoot are reported as widely used tools for virtual classes, collaborative work, and remote assessments [17].
In contrast, in several European and Asian countries, the transition was more structured, as teachers in many educational systems had greater prior training and experience, with extensive use of Learning Management Systems (LMSs) and videoconferencing tools supported by well-established institutional infrastructures [18].
One of the major challenges in engineering university programs was the development of remote practical sessions. The practical activities conducted in each undergraduate program are highly varied due to the diversity of courses offered. Each course utilizes different elements depending on the field to which it belongs.
Courses related to the area of Digital Logic Design (DLD) use Field-Programmable Gate Arrays (FPGAs) for conducting practical sessions. An FPGA allows the synthesis of digital logic designs using a Hardware Description Language (HDL). Programming an FPGA requires the use of a development board that contains the FPGA and all the necessary peripherals for conducting lab exercises.
Since it was not possible to use these development boards in person, some alternatives emerged, such as using simulators in class and creating videos by teachers to demonstrate the operation of the lab exercises to learners. However, none of these options provides learners with an adequate solution for conducting practical activities, as they do not allow physical interaction with the development boards. Another option is to create a remote laboratory.
A remote laboratory is a technological environment that enables remote access, programming, manipulation, interaction, and visualization of physical equipment available in educational laboratories at any level. This technological environment is implemented through an embedded system based on SoC technology.
A remote laboratory offers the entire infrastructure necessary for learners to test their lab exercises from a distance. This infrastructure includes servers for transferring files to the physical equipment, remote access, applications to interact with and manipulate the physical equipment, as well as the use of cameras to stream video that allows visualization of the lab exercise results, and of course, a network connection for Internet access. This infrastructure enables remote laboratories to create the technological environment known as the IoT [19,20,21].
With a remote laboratory, learners can interact with the development boards located in the in-person laboratory from anywhere (home, office, university), at any time, and from any device (laptop, desktop computer, mobile device). This allows them to gain meaningful teaching and learning experiences by conducting lab exercises for the DLD area, making the remote laboratory an accessible, available, and usable technological environment. In this way, they acquire what is known as ubiquitous learning [22,23], or u-learning. They only need a computing system (personal computer, laptop, tablet, mobile, etc.) with computer-aided design and electronic design automation (CAD-EDA) tools and an Internet connection.
The remote laboratory presented consists of twelve development systems that are currently operational at the School of Computing Science of the National Polytechnic Institute, Mexico. Learners in the DLD and Computer Architecture courses are conducting various lab exercises using the remote laboratory with the e-learning approach.
This approach is defined as the transfer of knowledge and skills through intelligently designed educational materials delivered via electronic media such as the Internet, Web 4.0, intranets, and extranets [24]. It enhances the accessibility and availability of educational materials for learners without limits of age or time. The Learning Management System (LMS) is a platform used in the e-learning approach and therefore in the presented system [25].
Both the IoT technological environment and the e-learning approach applied in education contribute to the creation of a Smart Learning Environment (SLE) by providing an innovative, supportive, and safe educational setting that fosters educational transformation [26].
Despite the increasing adoption of remote laboratories in engineering education, there is limited empirical evidence on how integrated technological and pedagogical environments aligned with Smart Learning Environment (SLE) principles influence learners’ acceptance and learning outcomes in Digital Logic Design courses. In particular, prior studies rarely examine acceptance mechanisms using appropriately specified formative constructs within an extended TAM framework, nor do they simultaneously analyze learning performance in authentic FPGA-based remote experimentation contexts.
Therefore, we established the following research questions:
  • Research Question 1: What technological and pedagogical environment, including its elements and characteristics, is required for the implementation of a remote laboratory and its incorporation into an SLE?
  • Research Question 2: How can the level of acceptance of the proposed technological and pedagogical environment be assessed, from the students’ perspective, in a specific higher-education course?
  • Research Question 3: Does the proposed technological environment allow students to enhance their academic performance by acquiring the practical competencies required in a specific higher-education course?
These research questions position the study as an empirical investigation that integrates pedagogical design, technological implementation, and theory-driven evaluation to address student acceptance and learning outcomes in remote-laboratory-based Smart Learning Environments.
The paper is organized as follows: Section 2 describes the literature review on in-person and remote laboratories. Section 3 introduces the research methodology. Section 4 describes the analysis and design stage of the research methodology. Section 5 describes the implementation stage of the research methodology. Section 6 describes the results of the remote laboratory. Section 7 provides a discussion, and Section 8 concludes the paper.

2. Literature Review

The development of laboratories has been carried out in in-person, virtual, or remote modalities. In the in-person modality, laboratories have been oriented towards various areas such as Control [27] and Robotics [28].
In the remote modality, several works have also been conducted by different researchers in various fields. Jonathan Alvarez and Sergio Gonzalez in [29] propose a remote laboratory to learn programming and physical computing through Python language and Raspberry Pi 4.
Chevalier and other researchers in [30,31] propose a remote laboratory used in control engineering studies. Additionally, Krishnashree Achuthan and other researchers in [32] propose a remote laboratory that operates a Universal Testing Machine (UTM); they use the Transactional Distance Theory (TDT) to compare physical laboratory platforms and remotely triggerable platforms. Also, Zhiyun Zhang and other researchers in [33] present a Software-Defined Radio remote lab, which permits learners to experiment with real wireless communication, designing radiofrequency systems with minimal code adjustments.
Particularly in the area of DLD, various remote laboratories based on FPGAs have been reported. Navas-González and other researchers in [34] present examples of practice sessions to teach and learn digital electronics using an FPGA-based remote laboratory. The remote laboratory allows for design verification, but there is no video transmission to view the results. A server is implemented on a SoC, which communicates serially with a Nexys3 board. The board sends the status of the peripherals to the SoC to update their state on the user’s screen. To carry out this communication, a module called RLAB Plant must be implemented in the learners’ practices. Additionally, Naoki Fujieda and Atsuki Okuchi in [35] propose a remote FPGA laboratory platform using a PIC18F4450 microcontroller-based board that communicates with FPGA devices remotely via commands. This is intended to give learners the sensation of interacting with the hardware in a remote laboratory environment. Moreover, Michal Melosik and other researchers in [36] propose a remote laboratory focused on FPGA devices for the remote design and testing of electronic circuits. The laboratory hardware consists of an application server on a personal computer and the Open Broadcaster Software (OBS) environment for streaming. Two cameras are used, one to visualize the FPGA board and another to monitor instruments like an oscilloscope. Two tasks are performed: using a temperature and humidity sensor and designing a sine wave generator. Also, Han Wan and other researchers in [37] propose a remote laboratory with an FPGA for a Computer Organization course using a web system. The laboratory hardware uses one FPGA for practicing and another FPGA as a monitor. Both FPGAs belong to the Spartan 6 family from Xilinx. A personal computer handles the remote connection to a cloud server. There is a web interface where the practice results are displayed.
The researchers report that 50.9% of learners completed Practice 8 in 2017 with the remote laboratory, compared to 21.3% of learners who completed the same practice in 2016 without it. On the other hand, Yuxiang Zhang and other researchers in [38] propose a remote FPGA platform for experiments from a Computer Systems curriculum. The laboratory hardware uses an Artix-7 FPGA for practicing and a Zynq-7 SoC from Xilinx as a controller to program the FPGA’s binary file through serial slave configuration. The researchers test the platform with the implementation of a 32-bit MIPS architecture processor capable of running a modified version of the Linux kernel.
Thang Manh Hoang and other researchers in [39] propose a remote laboratory for teaching FPGA and HDL. The laboratory hardware uses an Altera Cyclone II 2C20 FPGA for practical exercises, which is controlled through a PIC16F877A microcontroller. Additionally, Toyoda and other researchers in [40] propose an FPGA-based remote laboratory implemented in a hybrid cloud for semi-automatic FPGA-run experiments. The laboratory consists of development systems from Altera. The user only needs to prepare their HDL code and FPGA configuration parameters, which are then sent to the laboratory. Finally, Aramburu and other researchers in [41] propose a cross-national remote laboratory based on FPGAs, implemented by the universities UPNA in Spain and UNIFESP in Brazil. The laboratory consists of 14 Altera development systems from both universities. They use a Raspberry Pi 3 B+ SoC to install the server and a Logitech webcam for video streaming.
Muhammad Alhammami in [42] outlines the development of a Hardware Development Kit (HDK) for a remote training platform based on FPGA. The HDK is equipped with a Raspberry Pi, a screen, a camera, LEDs, and the EP3C16Q240 FPGA chip from Altera. The Raspberry Pi runs a service that acts as a loading or programming server for the FPGA. The authors also present the design of a mechanical enclosure. To interact with the HDK, the VNC Viewer is used. Additionally, Rithea Sum and other researchers in [43] propose a remote embedded system design approach utilizing a Zynq 7000 FPGA, providing a web-based platform for learners to remotely test and debug their designs, with a focus on efficient signal acquisition and effective debugging and analysis. The methods used in the study include the implementation of a lab module with adaptive run-length-encoding data compression, achieving an average compression ratio of 2.90 across three benchmark signals. Also, Carlos Cruz and other researchers in [44] propose a remote laboratory using a Nexys 4 FPGA board. The designed remote setup allowed learners to reserve laboratory time by creating customized environments for a wide range of experiments across different subjects. They used a remote computer, camera, sensors, and a motor for the development of experiments. One lab exercise was reported. The study used a questionnaire with 17 questions to gather learners’ perceptions of the remote lab. The results were derived from the Electronic Design course, with over 70% of learners preferring to combine the real lab with the remote lab. On the other hand, Guerrero-Osuna and other researchers in [45] presented a remote laboratory based on a Basys 3 FPGA board, a Raspberry Pi 5, Google Firebase services, and a web platform called UAZ Labs. They used a waterfall methodology integrating IoT and Cloud Computing technologies to facilitate close interaction between hardware and software.
Three lab exercises were reported, focusing on controlling DC, servo, and stepper motors. The testing phase involved 50 robotics and mechatronics engineering learners who participated in hands-on sessions for one month, followed by a structured survey evaluating their experience, interaction, and the educational relevance of the platform. Finally, Ballina and other researchers in [46] describe a remote laboratory with a cluster of AMD Zynq UltraScale+ multiprocessor System-on-Chip (MPSoC) devices based on FPGA technology. The cluster is used for teaching the concepts of data transfer and parallel processing across CPU and FPGA. The laboratory is used in a parallel and heterogeneous computing course or in the development of accelerators for an ad hoc computing course. The authors present five exercises for heterogeneous computing.
There are common elements reported in the literature; however, most studies focus primarily on the technological architecture, consisting of hardware design and software infrastructure, with limited attention to pedagogical integration. This can be observed in the fact that only some studies report a sequence of laboratory exercises, none of the reviewed papers report learning outcomes or explicitly state the learning theory employed, and only a few describe the use of measurement instruments applied to learners.
Specifically, this manuscript highlights three main gaps in the existing literature:
  • The lack of studies that conceptualize remote laboratories as integrated Smart Learning Environments (SLEs), combining technological, pedagogical, and organizational dimensions;
  • The limited use of established theoretical models, such as the Technology Acceptance Model (TAM), to systematically evaluate student acceptance and behavioral intention in FPGA-based remote laboratories;
  • The predominance of descriptive or technical validations, with scarce empirical evidence derived from authentic instructional contexts involving real learners.
Based on these gaps, we identified the main elements and characteristics required to create an SLE using a remote laboratory in a DLD course, based on an IoT technological environment and the e-learning approach, as follows:
  • Technological architecture: Includes the hardware components used by a personal computer or an embedded system based on SoC technology, a development board based on a programmable logic device (PLD)—either a Complex Programmable Logic Device (CPLD) or an FPGA—and the application software.
  • Learning Management System: Provides learning activities and didactic resources for a specific course using different learning strategies.
  • Lab exercise sequence: A sequence of laboratory exercises designed for a specific course using a CPLD or FPGA.
  • Learning theory: The learning theory applied to support interaction among learners when using the remote laboratory.
  • Measuring instrument: An instrument used to evaluate learners’ acceptance of the remote laboratory.
  • Learning outcomes: Metrics used to evaluate academic performance of learners in a specific course using the remote laboratory.
The proposed research is not limited to the implementation of a new technological platform. Instead, it introduces a structured methodology for the design and deployment of a remote laboratory framed as a Smart Learning Environment and empirically evaluates student acceptance using an extended TAM model within a real Digital Logic Design course.
In Section 7, a detailed comparison between this paper and the reviewed literature is presented.

3. Research Methodology

In this paper, a quantitative research approach with an exploratory and explanatory scope was used. The research methodology is shown in Figure 1.
Figure 1 shows the research methodology with three different stages: Analysis and Design, Implementation, and Evaluation and Results. The methodological design of this paper was explicitly structured to address the research questions formulated in Section 1. Each methodological stage was selected to generate empirical evidence aligned with a specific research objective:
  • The analysis and design, and implementation of a Smart Learning Environment (RQ1).
  • The assessment of students’ acceptance of the proposed technological and pedagogical environment using an extended TAM framework (RQ2).
  • Learning outcomes and assessments associated with the use of the remote laboratory (RQ3).
The following sections describe how each methodological stage contributes to answering these research questions and to generating new empirical and methodological insights.

4. Analysis and Design Stage of the Research Methodology

4.1. General Architecture

In this stage, the general architecture of the remote laboratory for the DLD area is presented, using reconfigurable technology based on FPGA, a connectivist model based on Connectivism Learning Theory, and an embedded system based on IoT for implementation. Figure 2 shows the general architecture.
The general architecture consists of two important components, which are:
  • On-site implementation of lab exercises.
  • Use of remote laboratories based on IoT.
These components are described below:

4.2. On-Site Implementation of Lab Exercises

In this first component, the learner performs the analysis, design, implementation, simulation, constraints assignment, and BitStream file generation of the lab exercises to be developed using a CAD-EDA tool (ISE or Vivado by Xilinx) [47]. In this development environment, a project is created by selecting the FPGA of the Nexys 4 development board [48]. The sequence of actions for the on-site implementation of lab exercises is shown in Figure 3 and described below:
  • Design procedure of the digital logic lab exercise: The Digital Design methodology is used to obtain the entity, the Algorithmic State Machine (ASM), the functional blocks, the datapath, and the control unit for the lab exercise.
  • Project creation with CAD-EDA tool: In the development environment, a project is created by selecting a target device, which can be either a Complex Programmable Logic Device (CPLD) or an FPGA.
  • Lab exercise programming with HDL: A program is written using an HDL, such as VHDL [49] or Verilog [50]. All functional blocks, the datapath, and the control unit of the design are programmed.
  • Lab exercise simulation with HDL test bench: The HDL program is simulated to verify its correct operation according to the exercise specifications using a set of test vectors with a Test-Bench file. For this, the ISE simulator integrated into the CAD-EDA tool is used.
  • Pin assignment in constraints file: Pin assignment in the constraints file (.xdc) is performed using a virtual connector created within the technological architecture, called LAB REMOTE. This connector contains 24 virtual switches for assigning input pins. The output pin assignment is carried out through the peripherals on the development board.
  • BitStream file generation: Finally, synthesis, mapping, placement, routing, and BitStream file (.bit) generation are performed.
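The simulation step in the sequence above can be sketched in software terms. The following Python model is illustrative only (the actual flow uses an HDL Test-Bench simulated in the CAD-EDA tool): it applies exhaustive test vectors to a hypothetical 2-bit comparator, mirroring what a test bench does for an HDL design.

```python
# Software sketch of the test-bench idea: apply exhaustive test vectors
# to a model of a 2-bit magnitude comparator and check every expected
# output. (Illustrative only; the real flow simulates HDL, not Python.)

def comparator_2bit(a: int, b: int) -> str:
    """Model of a 2-bit magnitude comparator: 'gt', 'eq', or 'lt'."""
    if a > b:
        return "gt"
    if a == b:
        return "eq"
    return "lt"

def run_test_vectors() -> bool:
    """Exhaustively verify the model, as an HDL test bench would."""
    for a in range(4):          # all 2-bit values of input A
        for b in range(4):      # all 2-bit values of input B
            expected = "gt" if a > b else ("eq" if a == b else "lt")
            if comparator_2bit(a, b) != expected:
                return False
    return True
```

For a 2-bit design, all 16 input combinations can be enumerated, which is exactly the exhaustive-vector strategy a Test-Bench file encodes.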

4.3. Use of Remote Laboratories Based on IoT

In this second component, the learner accesses the remote laboratory to interact, program, manipulate, visualize, and verify the results of the lab exercise on the FPGA using the Nexys 4 development board. The remote laboratory is implemented on an embedded system with SoC technology. The sequence of actions for using the remote laboratory is shown in Figure 4 and is described below:
  • BitStream and constraints file transfer: To achieve this, an SSH server is used for file transfer from a terminal on any operating system (Windows, Linux, MacOS) executing the scp command. In addition, an SSH client for desktop or mobile devices can be used.
  • Access to video server: A video web server allows visualization of the results of each lab exercise performed on the Nexys 4 development board.
  • SSH connection to remote laboratory: Remote access to the remote lab is achieved through the SSH server using the username and password provided by the instructor.
  • FPGA programming with BitStream file (.bit): The BitStream file is programmed in the FPGA device using an application developed for the SoC.
  • Lab exercise verification with constraints file: The lab exercise is verified through an application developed for the SoC.
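The file-transfer step above can be illustrated with a small sketch that assembles the scp command line. The host name, username, and file paths below are hypothetical placeholders, not the laboratory’s real addresses.

```python
# Sketch of the BitStream/constraints file transfer: build the argv
# list for an scp invocation. All names here are hypothetical
# placeholders for illustration.

def build_scp_command(files, user, host, dest_dir):
    """Return the argv list for an scp transfer of the given files."""
    return ["scp", *files, f"{user}@{host}:{dest_dir}"]

cmd = build_scp_command(["lab4.bit", "lab4.xdc"], "student01",
                        "remotelab.example.edu", "/home/student01/")
```

The resulting list could then be executed with `subprocess.run(cmd)` from any operating system that provides an scp client, matching the terminal-based transfer the architecture describes.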

5. Implementation Stage of the Research Methodology

In this stage, the following modules are used:
  • Technological architecture of the remote laboratory.
  • Sequence of lab exercises for a DLD course.
  • Connectivist model based on Connectivism Learning Theory.

5.1. Technological Architecture of the Remote Laboratory

To implement the general architecture shown in Figure 2, a technological architecture for the development of the remote laboratory is proposed. Figure 5 shows the main components of the technological architecture.
Three important components shown in Figure 5 are explained below:
  • V-Model methodology.
  • Embedded system based on IoT.
  • SoC–FPGA interface.

5.1.1. V-Model Methodology

For the analysis, design, and implementation of the remote laboratory using an embedded system, an adaptation of the V-model methodology proposed by Coop [51] was used for the development of SoC-based embedded systems. The stages of this methodology are shown in Figure 6.
Figure 6 shows the stages of the V-model methodology, from the specification of requirements to the operational testing of the system. This methodology allows returning to previous stages to make the necessary adjustments and ensure that the established requirements are met. In addition, the methodology supports the use of Unified Modeling Language (UML) in each stage [52], enabling a detailed description and a standardized design.

5.1.2. Embedded System Based on IoT

An embedded application was developed using SoC technology and a Linux-based operating system, specifically, Raspberry Pi OS Lite (64-bit), which has an image size of 423 MB [53]. The SoC consisted of a Raspberry Pi 3 Model B+, which has the following features [54]:
  • A 64-bit quad-core Cortex A53 processor, on a Broadcom BCM2837B0 SoC.
  • Memory of 1 GB LPDDR2 SDRAM and Gigabit Ethernet over USB 2.0.
  • Wireless transmission: 2.4 GHz and 5 GHz IEEE 802.11b/g/n/ac wireless LAN, Bluetooth 4.2, BLE.
  • Peripherals: SPI, I2C, UART, I2S, USB, GPIOs.
Although a Raspberry Pi 3 Model B+ SoC was used, it could be replaced by a Raspberry Pi 4 Model B or a Raspberry Pi 5. The SoC uses a Camera Module V2 for Raspberry Pi [55], based on Sony’s IMX219 8-megapixel sensor [56]. This camera communicates with the SoC via the MIPI CSI-2 protocol [57] for live video streaming.
Furthermore, two servers were configured on the embedded system, as described below:
  • SSH server: This server uses the Secure Shell protocol for encrypted communication and is managed by the systemd initialization system.
  • Video web server: This server is implemented using motion and is accessed through a web application, so a browser is used for access. Motion is a configurable program that monitors video signals from many types of cameras and performs actions when movement is detected [58].
Additionally, two Linux shell applications were programmed and implemented on the embedded system to interact with the Nexys 4 FPGA development board, as described below:
  • FPGA programming application: An application called progFPGA is used to program the FPGA device and relies on the drivers for the Advanced RISC Machine (ARM) architecture provided by Digilent Inc. (Pullman, WA, USA).
    These drivers are managed through the Joint Test Action Group (JTAG) standard.
  • Virtual switches application: An application called inputFPGA is used to verify the lab exercise and relies on the GPIOs of the SoC. This application allows the manipulation of up to 24 virtual switches which are labeled with the input port names defined in the user constraints file.
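The virtual-switches idea can be modeled in a few lines. This is a minimal Python sketch with hypothetical switch names; the real inputFPGA application drives the SoC GPIOs rather than an integer state word.

```python
# Illustrative model of the inputFPGA concept: up to 24 named virtual
# switches packed into one integer, one bit per switch. Names and the
# bit mapping are hypothetical; the real application drives SoC GPIOs.

class VirtualSwitches:
    def __init__(self, names):
        assert len(names) <= 24, "the interface supports up to 24 switches"
        self.index = {name: i for i, name in enumerate(names)}
        self.state = 0  # one bit per switch

    def set(self, name, value):
        """Set or clear the bit of the switch labeled `name`."""
        bit = 1 << self.index[name]
        self.state = (self.state | bit) if value else (self.state & ~bit)

sw = VirtualSwitches(["start", "reset", "mode"])
sw.set("start", 1)   # bit 0
sw.set("mode", 1)    # bit 2
# state now has bits 0 and 2 set, i.e., 0b101
```

Labeling switches with the input port names from the constraints file, as the application does, amounts to maintaining exactly this name-to-bit mapping.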

5.1.3. SoC–FPGA Interface

This manipulation interface is an electronic circuit that connects the SoC’s 40-pin connector to the Nexys 4 development board through the PMOD connectors. The block diagram of this interface is shown in Figure 7.
The functions provided by the SoC–FPGA Interface shown in Figure 7 are as follows:
  • Create virtual switches: A connector referred to as “LAB REMOTE” is created, which supports the operation of up to 24 virtual switches. These virtual switches are shown in Table 1.
  • Configure a Real-Time Clock/Calendar (RTCC): The SoC lacks an integrated Real-Time Clock Calendar (RTCC) for system date and time configuration; instead, it relies on a Network Time Protocol (NTP) server to synchronize the system clock. When an RTCC is required, it must be implemented as an external peripheral to the SoC. The SoC–FPGA interface provides a connector compliant with the IIC bus specification [59], enabling the integration of external devices. Consequently, a custom electronic board was designed to incorporate an external RTCC, model DS1338 from Analog Devices [60], into the SoC.
  • Configure a Crypto-Authentication device: These devices feature a cryptographic coprocessor that performs hardware-based encryption algorithms for IoT applications. A custom electronic board was designed for the connector, compliant with the IIC bus specification of the SoC–FPGA interface, to integrate an external Crypto-Authentication device, model ATECC608 from Microchip Inc. (Chandler, AZ, USA) [61].
  • Provide a USB-to-serial-UART interface: The SoC–FPGA interface provides a connector compliant with the UART interface. It enables the interconnection of external peripherals such as a USB-to-serial-UART interface (FT232R) [62]. The FT232 device allows console access to the SoC operating system and access to programming, configuration, updating, and maintenance tasks for the embedded system applications.
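As background for the RTCC integration above: the DS1338 stores its time registers in BCD (binary-coded decimal), so a driver on the SoC performs conversions like the following sketch. Register addresses and the actual I2C transactions are omitted here.

```python
# Minimal sketch of the BCD conversions a DS1338 RTCC driver performs
# when reading or writing time registers over the IIC bus. The I2C
# access itself and the register map are intentionally omitted.

def bcd_to_int(bcd: int) -> int:
    """Convert one BCD byte (e.g., 0x59 for '59 seconds') to an int."""
    return (bcd >> 4) * 10 + (bcd & 0x0F)

def int_to_bcd(value: int) -> int:
    """Convert an integer in 0..99 to one BCD byte."""
    return ((value // 10) << 4) | (value % 10)
```

For example, reading 0x59 from the seconds register yields 59, and writing 23 minutes stores the byte 0x23.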

5.2. Sequence of Lab Exercises for a DLD Course

The remote laboratory has been used in the DLD courses since 2021, with nine lab exercises covering the course content. The lab exercises are shown in Table 2.
Lab exercises 1 through 8 are explained in general terms, while lab exercise 9 is explained in detail, as it encompasses the full thematic content of the course.
Lab exercise 1 involves integrating different combinational logic circuit elements such as a multiplexer, comparator, and code converter into an application that allows the use of various HDL structures, including concurrent, sequential, and conditional ones. In this lab exercise, learners review and/or acquire the foundational knowledge needed for the course.
In lab exercise 2, learners design, implement, and test an SR Flip-Flop (FF), JK-FF, T-FF, and D-FF based on the equations obtained from their extended truth tables. The implementation is carried out using the D-FF and HDL on an FPGA. Through this lab exercise, learners apply and reinforce the theoretical knowledge acquired on latches and FFs.
In lab exercise 3, learners design, implement, and test a circuit that performs the functions of a register, including load, hold, shift-left, and shift-right operations. The implementation is carried out at different levels of abstraction using HDL on an FPGA. Through this lab exercise, learners apply and reinforce the theoretical knowledge acquired on registers, including their architectures and applications.
In lab exercise 4, learners design, implement, and test a 4-bit sequence detector without overlap using HDL on an FPGA. The sequence to detect is “1101”, and the methodology for designing sequential circuits is applied. Through this lab exercise, learners apply and reinforce the theoretical knowledge acquired on Deterministic Finite Automata (DFAs) with outputs, known as Mealy Machines.
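The non-overlapping behavior of the “1101” detector can be checked with a small software model before writing the HDL. The state encoding below (an integer counting the matched prefix length) is an illustrative sketch, not the authors’ design.

```python
def detect_1101(bits):
    """Mealy-style non-overlapping detector for the sequence 1101.

    The integer state records how many symbols of "1101" have been
    matched so far; after a full match the state resets, so overlapping
    occurrences are not reported.
    """
    state = 0
    outputs = []
    for b in bits:
        z = 0
        if state == 0:                     # nothing matched yet
            state = 1 if b == 1 else 0
        elif state == 1:                   # matched "1"
            state = 2 if b == 1 else 0
        elif state == 2:                   # matched "11"
            state = 3 if b == 0 else 2     # an extra 1 keeps the "11" suffix
        else:                              # matched "110"
            if b == 1:
                z = 1                      # full "1101" detected
            state = 0                      # no overlap: restart
        outputs.append(z)
    return outputs
```

For the input 1101101, only the first occurrence is reported, since the detector restarts after a match instead of reusing the trailing “1”.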
In lab exercise 5, learners design, implement, and test a multiplexed message using HDL on an FPGA. The message has four symbols and is displayed on a module of four multiplexed common anode displays. Also, the methodology for designing sequential circuits is applied. The implementation is carried out in two ways: a static message without movement and a moving message in marquee mode. Through this lab exercise, learners apply and reinforce the theoretical knowledge acquired on DFAs with outputs, known as Moore Machines.
In lab exercise 6, learners design, implement, and test counters using HDL on an FPGA. They analyze an up-counter with enable for obtaining a generic equation using a Moore Machine and applying the methodology for designing sequential circuits. Additionally, they implement counters using high-level operators + and −. Through this lab exercise, learners apply and reinforce the knowledge acquired on counters.
In lab exercise 7, learners design, implement, and test BCD, ring, and Johnson counters using HDL on an FPGA. They analyze these counters through their Mealy and Moore Machines and implement them using packages and components. Through this lab exercise, learners apply and reinforce the knowledge acquired on special counters.
In lab exercise 8, learners design, implement, and test an application that determines whether a person enters or exits a room and counts the number of people in the room using two optical sensors and BCD counters. The application is implemented using a Mealy Machine analyzed with high-level HDL. Packages and components are used for the implementation. Through this lab exercise, learners apply and reinforce the knowledge acquired on Mealy Machines and special counters.
Lab exercise 9 is described in detail below.

Algorithmic State Machine Lab Exercise

ASMs are used to model dedicated cores through a flowchart that represents the state transitions and outputs generated by Mealy and Moore machines. The resulting digital logic design enables the execution of a specific algorithm in hardware.
In this lab exercise, a digital logic system is designed, implemented, and tested using HDL on an FPGA to determine the number of bits set to logic ‘1’ in a 22-bit register through an ASM.
The design begins with the analysis of the algorithm to be implemented, which is shown below. From Algorithm 1, the ASM is obtained, which contains the actions required to execute the lab exercise’s algorithm. The ASM is shown in Figure 8.
Algorithm 1 Pseudocode algorithm of the ASM lab exercise
countBitsInOne(A)
     // A = 22-bit number
     B = 0
     while A ≠ 0 do
          if a0 = 1 then
               B = B + 1
          end if
          A = A shift_right 1
     end while
     return B
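The pseudocode above can be validated with a short software model before the ASM is implemented in hardware. The sketch below mirrors Algorithm 1 step by step; the function name and the explicit 22-bit masking are ours, chosen to match Register A.

```python
def count_bits_in_one(a):
    """Software model of Algorithm 1: count the bits set to '1' in a 22-bit value."""
    a &= (1 << 22) - 1      # keep only 22 bits, matching Register A
    b = 0
    while a != 0:           # while A != 0 do
        if a & 1:           # if a0 = 1 then
            b += 1          #     B = B + 1
        a >>= 1             # A = A shift_right 1
    return b
```

For example, the model returns 12 for the value 0x3AAAAA, the test vector later used to simulate the design.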
From the ASM in Figure 8, the microarchitecture shown in Figure 9 is obtained. This microarchitecture is known as the datapath.
In this microarchitecture, the datapath of the design is shown and the following functional blocks are presented:
  • Register A: This register contains the 22-bit number to be analyzed.
  • BCD Counters: They count the number of bits set to logic ‘1’ contained in Register A. The algorithmic diagram of the BCD counters is shown in Figure 10.
  • Ring Counter: Performs the multiplexing of the digits from the BCD counters on the four multiplexed common-anode displays.
  • ROM: Performs the decoding of the 4 bits from the BCD counters to the 7 segments of the display. For a ROM with an organization of m × n bits, a data bus called DATA with n bits and an address bus called BCD_N with log2(m) bits are created using the following equation:
    DATA_i = ROM(BCD_N_j)_i
    where:
    i = 0, 1, …, n − 1
    BCD_N_j = 0, 1, …, m − 1
  • Control Unit: The automaton of the control unit is obtained by replacing the actions of the ASM in Figure 8 with the control signals of each element of the datapath shown in Figure 9. This automaton is shown in Figure 11.
  • The automaton of the control unit is defined by the 9-tuple:
    M = (Q, Σ, Δ, δ, α, β, λ1, λ2, q0)
  • where:
    Q = {q0, q1, q2}
    Σ = {0, 1}
    Δ = {0, 1}
    α = {INI, Z, QA0}
    β = {SHE, LA, IB, LB, SEL}
    q0 = initial state
  • For each pair of states q, p ∈ Q, each input symbol a ∈ Σ, and each input signal s ∈ α, the transition function δ is defined as follows:
    δ_s(q, a) = p
  • For the automaton of the control unit in Figure 11, the transition function δ is defined as follows:
    δ_INI(q0, 0) = q0        δ_INI(q0, 1) = q1
    δ_Z(q1, 0) = q1          δ_Z(q1, 1) = q2
    δ_INI(q2, 0) = q0        δ_INI(q2, 1) = q2
  • The output function λ1 represents the Mealy-type outputs of the ASM. Additionally, for each state q ∈ Q, each input symbol a ∈ Σ, each input signal s ∈ α, each output signal b ∈ β, and each output value d ∈ Δ, the function λ1 is defined as follows:
    λ1_{s,b}(q, a) = d
  • For the automaton of the control unit in Figure 11, the output function λ1 is defined as follows:
    λ1_{INI,LA}(q0, 0) = 1
    λ1_{QA0,IB}(q1, 1) = 1
  • The output function λ2 represents the Moore-type outputs of the ASM. Additionally, for each state q ∈ Q, each output signal b ∈ β, and each output value d ∈ Δ, the function λ2 is defined as follows:
    λ2_b(q) = d
  • For the automaton of the control unit in Figure 11, the output function λ2 is defined as follows:
    λ2_{LB}(q0) = 1
    λ2_{SHE}(q1) = 1
    λ2_{SEL}(q2) = 1
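As a cross-check of the δ, λ1, and λ2 definitions above, the control-unit automaton can be simulated in software. The function below encodes the transitions and outputs exactly as listed; the Python encoding itself is our illustrative sketch, not the authors’ VHDL.

```python
def control_unit_step(state, INI, Z, QA0):
    """One clock step of the control-unit automaton (states q0, q1, q2).

    Returns (next_state, asserted_outputs): Moore-type outputs follow
    lambda2, Mealy-type outputs follow lambda1, and the next state
    follows the transition function delta defined in the text.
    """
    # Moore-type outputs (lambda2): one output asserted per state
    moore = {"q0": "LB", "q1": "SHE", "q2": "SEL"}
    outputs = {moore[state]}
    # Mealy-type outputs (lambda1)
    if state == "q0" and INI == 0:
        outputs.add("LA")
    if state == "q1" and QA0 == 1:
        outputs.add("IB")
    # Transition function (delta)
    if state == "q0":
        next_state = "q1" if INI == 1 else "q0"
    elif state == "q1":
        next_state = "q2" if Z == 1 else "q1"
    else:  # q2
        next_state = "q2" if INI == 1 else "q0"
    return next_state, outputs
```

For instance, in q1 with Z = 0 and QA0 = 1, the automaton stays in q1 and asserts SHE and IB, matching the δ and λ1 definitions.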
With this lab exercise, learners apply and reinforce the knowledge acquired in the theoretical topics of special and application counters, memories, and ASMs. Additionally, the learner applies the digital systems design methodology using ASM symbols for the implementation of the lab exercise.
Each element shown in Figure 9 is implemented using VHDL. Specifically, the learners apply the design using packages and components in VHDL, utilizing component parameterization. Furthermore, the control unit in Figure 11 is implemented using functional description by specifying states through the TYPE directive in VHDL. Functions, procedures, and file reading in VHDL are also implemented.

5.3. Connectivist Model Based on Connectivism Learning Theory

To conduct theoretical topics, learning activities, and the development and assessment of lab exercises for the DLD course, the Connectivism Learning Theory was applied. This theory is one of the most relevant frameworks in smart e-learning environments [25].
The Connectivism Learning Theory is the thesis that knowledge consists of sets of connections between entities, such that a change in one entity can result in a change in the other, and that learning is the growth, development, modification, or strengthening of those connections [63]. Connectivism is the integration of principles explored by chaos, network, complexity, and self-organization theories [64].
Siemens defines the principles of connectivism in [64]. Based on these principles, Siemens and Downes [63] proposed in 2008 a non-formal connectivist learning environment called a Massive Open Online Course (MOOC), also referred to as a connectivist MOOC (cMOOC). In this type of environment, there is no predefined content or fixed thematic sequence. Learners make decisions regarding the selection of thematic content and learning objectives, which provides them with the autonomy to establish a personalized learning path based on informal learning [65]. This approach enables learners to acquire connective knowledge, defined as distributed knowledge that emerges from the connections formed among the various entities involved in the MOOC and their interactions [66].
In contrast, extended MOOCs (xMOOCs) are characterized by a predominantly instructivist approach, influenced by behaviorist and cognitivist learning theories, in which the teaching–learning process focuses on the structured and sequential transmission of content [67,68].
In this paper, a novel connectivist model was developed for the DLD course using remote laboratories, called a Massive Open Online Laboratory (MOOL). The MOOL is described through different models, two of them derived from the Unified Modeling Language (UML). UML is a diagram-based modeling language for the specification, visualization, construction, and documentation of any complex system [69]. It has been used to model software engineering systems as well as other types of systems [70,71,72,73,74]. In this paper, UML is used to model the connectivist environment. The models used are as follows:
  • Entity model.
  • Use case diagram.
  • Activity diagram.

5.3.1. Entity Model

This model shows the functionality, connections, and interactions of each entity. The model is shown in Figure 12.
Figure 12 shows the different entities of the connectivist model; each of them is explained below.
  • Learning Management System (LMS) for DLD course: An LMS is designed to enable educators to manage, measure, and enhance learning experiences. Learning activities and didactic resources in LMSs include lecture notes, instructional design documents, class videos, assignments, and exercises. The homepage and course content of the DLD course in the LMS are shown in Figure 13. The explanation of the ASM lab exercise in the LMS is shown in Figure 14.
  • Cloud storage: In this entity, a native application is used for storing the content of the DLD course.
  • Video conferences: This entity utilizes a platform that allows scheduling and initiating video conference sessions. In these sessions, participants can join to collaborate through voice, video, and screen-sharing functions. This platform is used to deliver synchronous theoretical class sessions, which are recorded for later storage in the cloud, so that the videos can be accessed asynchronously by learners. During these sessions, learners’ questions regarding the covered topics are clarified and addressed.
  • Virtual meetings: This entity utilizes a video communication service to conduct video conferences and virtual meetings. This service is used to hold synchronous class sessions for the review of lab exercises via the remote laboratory. During these sessions, learners demonstrate the functionality of each lab exercise to the instructor by implementing their VHDL program, running simulations via test-bench file, and verifying the design on the FPGA through the remote laboratory. Learners’ questions related to the lab exercises are also clarified and addressed in these sessions.
  • Platform for video storage: This entity uses a social media platform that allows hosting and sharing videos showcasing the remote laboratory’s operation. These videos can be accessed online asynchronously by the learners.
  • Instant messaging application for mobile devices: This entity uses a platform to communicate with all learners in the course for the activities shown in Figure 9. In addition, this node enables academic interaction among learners.
  • Remote Laboratory: This entity uses the remote laboratory, where learners test lab exercises by programming the FPGA, interacting with it and manipulating it.

5.3.2. Use Case Diagram

Within the structural view is the use case diagram, which models the functionality of a system as perceived by the actors interacting with it [52]. The use case diagram of the connectivist model is shown in Figure 15.
Figure 15 shows the different use cases of the connectivist model; each of them is explained below.
  • Deliver synchronous classes: In this use case, the professor and learner actors interact during synchronous classes. The participating entities are an instant messaging application for mobile devices, video conferences, the LMS, and cloud storage.
  • Generate didactic resources: In this use case, the professor actor generates didactic resources by managing and editing course content within the LMS. The participating entities are the LMS, platform for video storage, and cloud storage.
  • Review didactic resources: In this use case, the learner actor reviews didactic resources from the LMS. The participating entities are the LMS, platform for video storage, an instant messaging application for mobile devices, and cloud storage.
  • Conduct learning activities: In this use case, the learner actor conducts learning activities from the LMS. The participating entities are the LMS, platform for video storage, an instant messaging application for mobile devices, remote laboratory, and cloud storage.
  • Lab exercise assessment: In this use case, the professor, learner, and embedded system actors interact during the lab exercise assessment. The participating entities are an instant messaging application for mobile devices, virtual meetings, the LMS, and remote laboratory.

5.3.3. Activity Diagram

The dynamic behavior of the connectivist model can be analyzed through the activity diagram view [52]. These diagrams show the control flow of the activities performed by the learner, professor, and embedded system actors from the use case diagram, as well as by each entity within the connectivist model. Figure 16 shows the activity diagram of the “Lab Exercise Assessment” use case.
Figure 16 shows the interactions between the professor and learner actors and the instant messaging application for mobile devices, virtual meetings, the LMS, and remote laboratory entities.

6. Results

The tests performed on the proposed system, as well as the evaluation of the results obtained, are divided into the following sections:
  • Testing and results of server’s performance on the SoC.
  • Testing and results of the ASM lab exercise conducted in the remote laboratory.
  • Testing and results of the Technology Acceptance Model (TAM).
  • Learning outcomes and assessment.
These tests were carried out using the remote laboratory, which consisted of twelve development systems used in the DLD courses of the Computer Systems Engineering program at the Higher School of Computing of the National Polytechnic Institute located in Mexico City.
Figure 17 shows the physical implementation of a system used in the remote laboratory, which consists of a System on Chip, SoC–FPGA interface, the FPGA development board, and a power supply.

6.1. Testing and Results of Server’s Performance on the SoC

Figure 5 shows the video server and the SSH server implemented on the SoC used in the remote laboratory. Their testing and evaluations are described below.

6.1.1. Testing and Results of Video Server’s Performance on the SoC

The video server used a Raspberry Pi Camera v2 [55] based on Sony’s IMX219 sensor [56]. This server allows a maximum of 10 users to connect. The server was tested with a program that displayed the message “Lab-rEmoto-ESCOm-IPN!” in marquee mode using a four-digit multiplexed common anode display module. The camera resolution was 1024 × 768 pixels at 20 frames per second. The Linux operating system top command was used to obtain the CPU and memory usage percentages, which are shown in Table 3.
Table 3 shows that the maximum CPU usage was 85.1%, corresponding to a single core of the processor. The remaining three cores were available for running other system processes. The memory usage of the SoC was only 5.2% with 10 connected users.

6.1.2. Testing and Results of SSH Server’s Performance on the SoC

To test the SSH server, the JMeter application [78] was used. With this application, load tests were performed with a range of 5 to 50 users in time intervals between 1 s and 60 s. The results obtained are shown in Table 4.
Table 4 shows that the system can handle up to 15 users with a response time of 1 s. Additionally, a maximum of 19 users can connect when response times of 3 s or more are allowed. This supports the connection of up to six groups of three learners each, along with one instructor.

6.2. Testing and Results of the ASM Lab Exercise Conducted in the Remote Laboratory

Each lab exercise shown in Table 2 was analyzed, designed, and implemented using the Vivado v2022.2.2 tool from Xilinx [47] and the Nexys 4 development board [48] with the XC7A100TCSG324 FPGA [79]. The testing and evaluation of the ASM lab exercise results are reported below.
Each element of the microarchitecture shown in Figure 9 was implemented using VHDL. Figure 18 shows the implementation of the control unit.
After the implementation of the lab exercise using VHDL, a simulation was performed using a test-bench file with the test vector DA = 0x3AAAAA, a hexadecimal value containing 12 bits set to one. This vector allowed us to verify the correct operation of the lab exercise. Figure 19 shows the simulation result.
Figure 19 shows the right-shift operation in the QAS register’s output and the increment in the BCD counters from 00 to 12. The result was displayed on the common anode displays.
Next, the input and output terminal assignment was performed using the constraints file and the virtual switches from the remote lab connector. At that moment, the synthesis, place and route, and generation of the BitStream file for the ASM lab exercise were carried out. After this process, the FPGA resources used were obtained, as shown in Table 5.
Timing information was also obtained, as shown in Table 6. The maximum frequency of the design was obtained using Equation (22):
F_max = 1 / (T − WNS)
where T is the clock period and WNS is the worst negative slack reported by the timing analysis.
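Assuming Equation (22) is the usual timing-closure relation F_max = 1/(T − WNS), with the clock-period constraint T and the worst negative slack WNS taken from the tool’s timing report, it can be applied directly. The helper and values below are illustrative (not taken from Table 6), chosen to land near the 452 MHz mentioned in the Discussion.

```python
def fmax_mhz(t_ns, wns_ns):
    """Maximum frequency per Equation (22): Fmax = 1 / (T - WNS), in MHz."""
    return 1000.0 / (t_ns - wns_ns)   # inputs in ns, so 1000/ns gives MHz

# Illustrative numbers: a 10 ns period constraint with 7.788 ns of slack
# leaves 2.212 ns of used path delay, i.e. roughly 452 MHz.
print(round(fmax_mhz(10.0, 7.788), 1))
```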
Finally, the testing and evaluation of the ASM lab exercise were carried out using the remote laboratory and the connectivist model, as shown in Figure 20.

6.3. Testing and Results of the TAM

The TAM is a theoretical framework originally developed by Fred Davis [80] to explain and predict user acceptance and adoption of information technologies. Over the past decades, the TAM has been extensively applied in educational contexts, including e-learning environments, remote laboratories, and virtual laboratories in engineering education [81].
The core TAM constructs considered in this paper were as follows:
  • Perceived Usefulness—PU: It measures practical learning gains, innovation in modality, temporal comparison with a physical laboratory, pedagogical value vs. simulators, and spatio-temporal accessibility.
  • Perceived Ease of Use—PEU: It measures the ease of use of different technical components.
  • Intention to Use—IU: It measures a latent intention.
Additionally, to capture contextual and technological factors specific to engineering education and remote experimentation, the original TAM was extended with three additional constructs, which are described as follows:
  • Technology Application—TA: Refers to the extent to which a learner utilizes foundational knowledge from areas related to the technology being used. This allows the learner to both apply previously acquired knowledge and learn new topics.
  • Support Digital Resources—SDRs: Refers to the digital resources available to the learner for managing and using the technology. These resources include online user manuals, videos, supporting software, and Learning Management Systems.
  • Acquired Digital Skills—ADSs: Refers to the knowledge acquired through learning the digital technologies necessary for using the new technology. These digital technologies include CAD-EDA tools for managing and programming the physical device of the remote laboratory and applications for remote interaction.
Derived from the different constructs of the extended TAM framework, a data collection instrument was designed, consisting of a survey with 20 items. These items are shown in Table 7.
In this paper, Perceived Usefulness, Perceived Ease of Use, Technology Application, Support Digital Resources, and Acquired Digital Skills were conceptualized as emergent constructs rather than latent traits. Each indicator represented a distinct and non-interchangeable facet of the construct domain, contributing uniquely to the overall meaning of the construct. Consequently, the removal of any indicator would alter the conceptual interpretation of the construct, which is a defining characteristic of formative measurement models [82,83].
In contrast, Intention to Use (IU) was modeled as a reflective construct, as its indicators were considered interchangeable manifestations of a single underlying behavioral intention, consistent with the original TAM literature [82].
Figure 21 illustrates the measurement model (outer model) and the structural model (inner model) specified for the extended TAM framework used in this paper.
The items shown in Table 7 were delivered in the form of a self-administered survey conducted on 28 learners from group 2CM16 of the DLD course taught during the 2021–2 semester, using a Likert scale. The Likert scale consisted of the following values: totally disagree (td), disagree (d), neutral (n), agree (a), and totally agree (ta). Regarding item SP1, the surveyed learners were between 19 and 24 years old, with 46.4% being 20 years old. Regarding item SP2, 75% of the surveyed learners were male. The results obtained from the remaining items are shown in Table 8.
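Before any statistical analysis, the Likert labels must be coded numerically. A conventional 1-to-5 coding, which is our assumption since the paper does not state the coding actually used, can be sketched as follows:

```python
# Conventional 1-5 coding of the five-point Likert scale (an assumption;
# the paper does not state the numeric coding actually used).
LIKERT = {"td": 1, "d": 2, "n": 3, "a": 4, "ta": 5}

def code_item(responses):
    """Map the Likert labels of one survey item to numeric scores."""
    return [LIKERT[r] for r in responses]
```

For example, code_item(["ta", "a", "n"]) yields [5, 4, 3].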
This research adopted an exploratory approach and involved a relatively small sample size (n = 28). The research model included formative constructs and multiple mediated relationships. Therefore, Partial Least Squares Structural Equation Modeling (PLS-SEM) was applied, as it is the recommended analytical technique under these conditions [85,86].

6.3.1. Measurement Model Assessment

The measurement model was assessed in accordance with established guidelines for PLS-SEM, taking into account the mixed specification of formative and reflective constructs adopted in this paper [85,86]. Specifically, PU, PEU, TA, SDR, and ADSs were modeled as formative constructs (Mode B), whereas Intention to Use (IU) was specified as a reflective construct (Mode A).
  • Assessment of Formative Measurement Models: For formative constructs, traditional internal consistency and convergent validity measures such as Cronbach’s alpha, composite reliability, and average variance extracted (AVE) are not applicable and were therefore not considered. Instead, the evaluation focused on collinearity among indicators using variance inflation factors (VIFs) and the significance and relevance of outer weights and loadings. The results obtained are shown in Table 9.
  • Assessment of the Reflective Measurement Model: The reflective construct Intention to Use (IU) was evaluated using standard reliability and validity criteria. Both indicators exhibited high outer loadings (IU1 = 0.895; IU2 = 0.935), exceeding the recommended threshold of 0.70, thereby indicating strong indicator reliability. Additionally, IU demonstrated adequate internal consistency reliability, with Cronbach’s alpha [87,88] (α = 0.809) and composite reliability (ρ_a = 0.839, ρ_c = 0.912) all exceeding the recommended thresholds. Convergent validity was also confirmed, as the average variance extracted (AVE) reached 0.838, indicating that the construct explained a substantial proportion of the variance in its indicators.
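The reflective reliability figures can be reproduced from the two reported outer loadings using the standard formulas for AVE (mean of squared loadings) and composite reliability:

```python
# Outer loadings of IU1 and IU2 as reported in the text
loadings = [0.895, 0.935]

# Average variance extracted: mean of the squared loadings
ave = sum(l ** 2 for l in loadings) / len(loadings)

# Composite reliability rho_c: (sum of loadings)^2 divided by
# ((sum of loadings)^2 + sum of indicator error variances)
s = sum(loadings)
rho_c = s ** 2 / (s ** 2 + sum(1 - l ** 2 for l in loadings))

print(round(ave, 3), round(rho_c, 3))  # 0.838 0.912, matching the reported values
```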

6.3.2. Structural Model Assessment

The structural model was evaluated following established PLS-SEM procedures, focusing on the assessment of path coefficients, the coefficient of determination (R²), and total indirect effects to examine the explanatory power and predictive relevance of the proposed extended TAM framework [85]. The results obtained are shown in Table 10.

6.4. Learning Outcomes and Assessments

To evaluate the remote laboratory, a quasi-experimental field design was employed [89] with the experimental group 2CM16. In this course, the connectivist learning model proposed in Figure 12 was used, and the course followed an e-learning approach. The learners were assessed using the rubric shown in Table 11. This rubric specified the evaluated learning indicators, performance criteria, and grading scale, which was based on a numerical scale from 0 to 10.
The assessment results of the experimental group 2CM16 are presented alongside the assessment results of three control groups from the same course in the years 2017, 2018, and 2019, when the courses were taught in person. These results allowed the reported average of the experimental group to be interpreted relative to the grading scale and in comparison with the control groups. Table 12 presents the grades for each specific indicator for all groups, using the assessment rubric shown in Table 11.
As the study was exploratory in nature, the analysis focused on descriptive statistics rather than inferential statistics. Each group consisted of 33 enrolled learners, whose grades were used for the descriptive statistical analysis.
The sample mean (x̄) was calculated as a measure of central tendency. In addition, the sample variance (s²), standard deviation (s), and coefficient of variation (CV = s/x̄) were calculated as measures of dispersion. This was carried out using a program developed with the Pandas library [90] and the Jupyter Notebook 6.4.12 development environment [91]. The results are shown in Table 13.
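A minimal sketch of that Pandas computation is shown below; the grades are made up for illustration, since the per-learner data are not published, but the statistics are the same ones reported in Table 13.

```python
import pandas as pd

# Hypothetical grades for one group (illustrative only; the real
# per-learner grades from the DLD courses are not published)
grades = pd.Series([9.5, 10.0, 9.8, 9.6, 10.0, 9.7])

mean = grades.mean()        # sample mean, x-bar
var = grades.var(ddof=1)    # sample variance, s^2
std = grades.std(ddof=1)    # sample standard deviation, s
cv = std / mean             # coefficient of variation, CV = s / x-bar

print(round(mean, 3), round(var, 4), round(std, 4), round(cv, 4))
```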
Table 13 shows that the sample mean (x̄) of the grades of the experimental group 2CM16 from 2021 was higher than the means of the control groups from 2017, 2018, and 2019.

7. Discussion

7.1. Discussion of Server’s Performance on the SoC

As for the servers implemented on the embedded system, the video server supported up to 10 connected users, depending on the configured camera resolution, the number of frames transmitted per second, and the available network bandwidth. For 10 users, the video server on the embedded system showed a CPU usage of 85.1% on a single core and 5.2% RAM usage, indicating that the processing and memory resources provided by the embedded system were sufficient for the remote laboratory’s operation. Furthermore, an embedded system with fewer processing and memory resources could also be used, resulting in lower power consumption and reduced costs.
The SSH server could connect up to nineteen users simultaneously, allowing up to six groups of three learners each, along with one instructor per remote laboratory. This was due to the maximum number of connections allowed by the network manager.
These findings directly address the research problem by demonstrating that embedded systems based on SoC technology offer sufficient performance for the implementation of the remote laboratory’s technological architecture.

7.2. Discussion of the ASM Lab Exercise

The ASM lab exercise illustrates the different stages of the DLD methodology, as shown in Figure 3, as well as the stages for using the remote laboratory, shown in Figure 4. The utilization of LUT and FF resources in this lab exercise was less than 1%, demonstrating the high availability of FPGA resources. Furthermore, the maximum frequency achieved was 452 MHz for this lab exercise; however, the other lab exercises used different FPGA resources, with maximum frequencies of at least 400 MHz. Additionally, further lab exercises can be proposed beyond those presented here.

7.3. Discussion of the Extended TAM Framework

  • Collinearity Assessment: Collinearity among formative indicators was assessed using variance inflation factors (VIFs). As shown in Table 9, all indicators exhibited VIF values below the conservative threshold of 3.3, indicating that multicollinearity does not pose a threat to the estimation of the formative constructs. These results confirm that each indicator contributes unique information to its respective construct and that redundancy among indicators is limited.
  • Significance and Relevance of Outer Weights: The relevance of formative indicators was evaluated through their outer weights. Several indicators demonstrated substantial contributions to their corresponding constructs. For example, within the Perceived Ease of Use construct, PEU1 (0.738) and PEU2 (0.662) exhibited strong positive contributions, while PEU3 (0.016), PEU4 (0.147), and PEU5 (0.053) showed lower relative importance. Similarly, for Perceived Usefulness, PU4 (0.765) and PU1 (0.582) emerged as the most influential indicators, whereas PU2 and PU3 contributed marginally.
    Within the Support Digital Resources construct, SDR1 (−0.988) and SDR2 (−0.712) displayed negative outer weights, while SDR3 (0.822) showed a positive contribution. In formative measurement models, negative outer weights do not indicate measurement problems; rather, they reflect compensatory or suppressor effects among indicators capturing different dimensions of the construct domain. These results suggest that different types of digital support resources may exert contrasting influences on students’ perceived ease of use.
    For Technology Application, TA2 (0.808) contributed more strongly than TA1 (0.344), indicating that operating system-related knowledge played a more prominent role than computer network knowledge in the context of the remote laboratory. Acquired Digital Skills was represented by a single indicator (ADS1), which fully defines the construct in this model.
    Although outer loadings were also examined as a supplementary criterion to assess indicator relevance, they were not used as the primary basis for indicator retention, in accordance with formative measurement guidelines. The observed loadings further support the distinct contribution of indicators without implying internal consistency requirements.
    Overall, the formative measurement assessment confirms that the selected indicators adequately define their respective constructs and that the measurement model is free from critical multicollinearity issues.
  • Reflective construct IU: Given the two-item nature of the construct and the high loadings observed, the reflective measurement of IU demonstrates satisfactory reliability and convergent validity. These results confirm that IU is appropriately specified as a reflective construct, with its indicators representing interchangeable manifestations of students’ behavioral intention to continue using the remote laboratory.
  • Path Coefficients: Table 10 presents the estimated path coefficients of the structural model shown in Figure 21. The results indicate that Technology Application (TA) exerts a strong positive effect on Perceived Ease of Use (PEU) (β = 0.811), highlighting the importance of learners’ prior technological knowledge in shaping perceptions of ease of interaction with the FPGA-based remote laboratory. In contrast, Acquired Digital Skills (ADSs) show a modest positive influence on PEU (β = 0.152), while Support Digital Resources (SDRs) exhibit a negligible effect (β = 0.011).
    Perceived Ease of Use strongly influences Perceived Usefulness (PU) (β = 0.951), confirming the central TAM assumption that ease of use is a key antecedent of perceived usefulness in technology acceptance processes. Furthermore, PU demonstrates a substantial positive effect on Intention to Use (IU) (β = 0.585), indicating that learners’ intention to continue using the remote laboratory is primarily driven by its perceived usefulness.
    Overall, the pattern of path coefficients supports the theoretical structure of the extended TAM model and highlights the dominant role of PEU and PU in explaining learners’ acceptance of the remote laboratory.
  • Coefficient of determination R²: The explanatory power of the structural model was assessed using the coefficient of determination (R²). The results revealed high levels of explained variance for the endogenous constructs. Specifically, PEU achieved an R² value of 0.886 (adjusted R² = 0.872), indicating that ADSs, SDRs, and TA jointly explain a substantial proportion of the variance in perceived ease of use.
    Perceived Usefulness exhibited an R² = 0.904 (adjusted R² = 0.900), demonstrating that PEU accounts for a very large proportion of its variance. Intention to Use achieved an R² = 0.342 (adjusted R² = 0.316), which can be considered moderate in behavioral research contexts and is acceptable given the exploratory nature of the study and the relatively small sample size.
    These results indicate that the proposed model exhibits strong explanatory power for the key cognitive constructs of the TAM and a meaningful level of prediction for learners’ behavioral intention.
  • Mediation and Indirect Effects: To further examine the mechanisms through which the exogenous constructs influence Intention to Use, the total indirect effects were analyzed. The results indicated that Technology Application exerted a substantial indirect effect on Intention to Use (β = 0.451), primarily mediated through Perceived Ease of Use and Perceived Usefulness. Similarly, Technology Application showed a strong indirect effect on Perceived Usefulness (β = 0.772), reinforcing its central role in the acceptance process.
    Perceived Ease of Use exhibited a strong indirect effect on Intention to Use (β = 0.556), confirming its dual role as both a direct predictor and a mediator within the model. Acquired Digital Skills demonstrated smaller but positive indirect effects on Perceived Usefulness (β = 0.145) and Intention to Use (β = 0.084), suggesting a secondary contribution to acceptance through cognitive evaluations.
    In contrast, Support Digital Resources displayed minimal indirect effects on both Perceived Usefulness (β = 0.010) and Intention to Use (β = 0.006), indicating that within the studied context, digital support resources played a limited role compared to learners’ technological application capabilities and Perceived Ease of Use.
    Overall, the mediation analysis revealed that acceptance of the FPGA-based remote laboratory was predominantly driven by indirect mechanisms, with Perceived Ease of Use and Perceived Usefulness acting as key mediators between contextual technological factors and students’ behavioral intention.
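As a cross-check, each total indirect effect in this structural model is the product of the direct path coefficients along its single mediating chain. The following sketch is illustrative only and uses the rounded direct coefficients reported in the path-coefficient analysis, so the last digit may differ slightly from the SmartPLS output, which works with unrounded estimates:

```python
# Direct path coefficients of the structural model, as reported in the article.
paths = {
    ("TA", "PEU"): 0.811,
    ("ADS", "PEU"): 0.152,
    ("SDR", "PEU"): 0.011,
    ("PEU", "PU"): 0.951,
    ("PU", "IU"): 0.585,
}

def indirect(*chain):
    """Multiply the direct coefficients along a chain of constructs."""
    product = 1.0
    for a, b in zip(chain, chain[1:]):
        product *= paths[(a, b)]
    return product

# Total indirect effects (rounded to three decimals):
print(round(indirect("TA", "PEU", "PU"), 3))         # TA  -> PU : 0.771
print(round(indirect("TA", "PEU", "PU", "IU"), 3))   # TA  -> IU : 0.451
print(round(indirect("PEU", "PU", "IU"), 3))         # PEU -> IU : 0.556
print(round(indirect("ADS", "PEU", "PU"), 3))        # ADS -> PU : 0.145
print(round(indirect("ADS", "PEU", "PU", "IU"), 3))  # ADS -> IU : 0.085
print(round(indirect("SDR", "PEU", "PU"), 3))        # SDR -> PU : 0.010
print(round(indirect("SDR", "PEU", "PU", "IU"), 3))  # SDR -> IU : 0.006
```

The products closely reproduce the reported indirect effects, confirming that acceptance is transmitted almost entirely through the PEU → PU chain.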
Additionally, learners provided further comments in the survey, some of which are given below:
  • Learner 1: “The remote laboratory was a great support, bringing the lab exercise to our homes in a simple way, without the need to purchase extra materials for each of the lab exercises”.
  • Learner 2: “Completely an innovative tool and one of the most helpful while we were taking online semesters”.
  • Learner 3: “I think the use of a remote laboratory is excellent, as it extends the time for learners to test their lab exercises, and is not limited to available spaces or restricted usage time”.

7.4. Discussion of Learning Outcomes and Assessments

In Table 12, the grades for each specific indicator of the evaluation rubric can be observed. All grades are higher for the group in which the remote laboratory was used. Specifically, indicator D shows the highest score, with a value of 9.88, indicating that all learners successfully completed the laboratory exercises using the remote laboratory.
In Table 13, it can be observed that the sample mean (x̄) of the grades for group 2CM16 in 2021 is the highest, with a value of 9.777, in comparison with the control groups in previous years. This suggests that academic performance improves with the proposed system. Furthermore, the coefficient of variation, with a value of 6.49%, indicates the lowest relative variability, suggesting high and consistent performance among the learners in group 2CM16 when the proposed technological architecture and connectivist model were applied through the remote laboratory.
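The coefficient of variation used in Table 13 is the sample standard deviation expressed as a percentage of the sample mean (CV = s/x̄ × 100). A minimal sketch of the computation, using illustrative grades rather than the actual group data:

```python
import statistics

# Illustrative grades only; the actual group statistics are given in Table 13.
grades = [10.0, 9.0, 10.0, 9.0]

mean = statistics.mean(grades)   # sample mean (x-bar)
s = statistics.stdev(grades)     # sample standard deviation
cv = s / mean * 100              # coefficient of variation, in percent

print(f"mean = {mean:.3f}, CV = {cv:.2f}%")  # prints: mean = 9.500, CV = 6.08%
```

A lower CV indicates grades clustered more tightly around the mean, which is why the 6.49% value for group 2CM16 is read as consistent performance.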

7.5. Discussion of Laboratory Modalities

Although the COVID-19 pandemic motivated the development of the proposed technological architecture and connectivist model with an e-learning approach to address the challenge of conducting distance lab exercises, its applicability is not limited to that period. In fact, the remote laboratory offers multiple advantages over the laboratory used in in-person classes, which are shown in Table 14.

7.6. Comparison with Papers in the Literature Review

Finally, the main differences between this paper and the papers reported in the literature review are shown in Table 15.
The evolution of emerging technologies such as the Internet of Things, mobile and ubiquitous computing, artificial intelligence, and embedded systems has enabled Industry 4.0 [92] and the development of disruptive technological environments such as remote laboratories, which allow remote experimentation, fostering the consolidation of knowledge and the reinforcement of theoretical concepts introduced in the classroom. The proposed architecture makes it possible to advance teaching and learning processes in various courses across different educational levels toward Education 4.0 [93] through the use of highly efficient, low-cost technologies.

8. Conclusions

The proposed research methodology presents an alternative for creating an SLE based on the IoT technological environment and the e-learning approach, using Connectivism Learning Theory.
IoT applications implemented through various technologies have enabled the development of remote laboratories in different fields. The technological architecture and connectivist model of the presented remote laboratory provide a solution for the remote management of DLD physical equipment. The technological architecture is implemented using embedded systems based on SoC technology and applying the V-Model Methodology, which allows the development of highly efficient remote laboratories at very low cost. The Systems on Chip (SoCs) used in the implementation of the remote laboratory cost only USD 40.00 [94]. Additionally, with this technological architecture, learners do not need to purchase the FPGA development boards themselves, which cost USD 349.00 [95].
The learning theory based on connectivism has enabled the creation of connectivist learning environments such as MOOCs. However, implementing these environments on a new technological platform, such as remote laboratories, enables the creation of a new connectivist environment, which we refer to as MOOL. This MOOL is designed using UML to describe the system entities, functionality, and control flow of activities. Furthermore, the MOOL allows learners to interact with FPGA-based development boards located in the institution’s laboratory from anywhere (home, office, university, etc.), at any time, and from any device (PC, laptop, mobile device, etc.). Through the remote laboratory approach, the MOOL offers several advantages over the in-person laboratory modality, enabling learners to successfully complete 100% of the lab exercises and reinforce the theoretical knowledge acquired during the semester. In the MOOL, a sequence of nine lab exercises covering the thematic content of the Digital Logic Design course was proposed. With the implementation of applications for FPGA programming (progFPGA) and input manipulation through virtual switches (inputFPGA), learners can program, manipulate, and interact with each lab exercise to verify its functionality.
The MOOL contains the theoretical elements necessary for learning the course content, similar to a traditional MOOC. In addition, it includes the components for conducting lab exercises with real physical equipment. In the presented MOOL, learners first carried out the sequence of actions for the on-site implementation of lab exercises and subsequently the sequence of actions required to use the remote laboratory.
All these advantages of the MOOL provided unrestricted access at any time, enabling learners in the experimental group to interact freely within the connectivist model and fostering active, autonomous, and self-organized learning, which in turn improved academic performance and learning outcomes.
The measurement model evaluation in the extended TAM framework supported the adequacy of the proposed mixed formative–reflective specification. The formative constructs exhibited acceptable levels of indicator collinearity and meaningful outer-weight contributions, while the reflective construct demonstrated strong indicator reliability.
Additionally, the structural model assessment demonstrated that the extended TAM framework provides a coherent and explanatory representation of learners’ acceptance of the FPGA-based remote laboratory. The strong path coefficients, high R² values for PEU and PU, and meaningful mediation effects collectively support the proposed theoretical relationships and underscore the relevance of technological application and cognitive perceptions in shaping intention to use in e-learning-based engineering education contexts.
The experimental group was active during the COVID-19 pandemic, during which the proposed technological architecture and the connectivist model, using an e-learning approach, were applied. This group achieved a mean evaluation score of 9.777, which was higher than that of the control groups, where a traditional in-person teaching approach was used. These results suggest that academic performance improves when a remote laboratory, connectivist learning theory, and an e-learning approach are applied.

Author Contributions

Conceptualization, V.H.G.O. and J.B.L.; methodology, V.H.G.O. and J.B.L.; software, V.H.G.O.; validation, V.H.G.O.; formal analysis, V.H.G.O. and J.B.L.; investigation, V.H.G.O. and E.R.-V.S.; resources, V.H.G.O. and J.B.L.; data curation, V.H.G.O.; writing—original draft preparation, V.H.G.O.; writing—review and editing, V.H.G.O. and J.B.L.; visualization, V.H.G.O.; supervision, J.B.L.; project administration, J.B.L. and E.R.-V.S.; funding acquisition, E.R.-V.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Universidad Nacional Autónoma de México, Programa de Apoyo a Proyectos para Innovar y Mejorar la Educación project, grant number 400524.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

The authors express their gratitude to the Consejo Nacional de Humanidades, Ciencias y Tecnologías (CONAHCYT) for the national postgraduate scholarship. Furthermore, the authors gratefully acknowledge the valuable support and guidance provided by Nicolás C. Kemper Valverde (in memoriam) from the Universidad Nacional Autónoma de México (UNAM).

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADSs  Acquired Digital Skills
ARM  Advanced RISC Machine
ASM  Algorithmic State Machine
BCD  Binary-Coded Decimal
CAD-EDA  Computer-Aided Design and Electronic Design Automation
CPLD  Complex Programmable Logic Device
CPU  Central Processing Unit
DFA  Deterministic Finite Automaton
DLD  Digital Logic Design
FF  Flip-Flop
FPGA  Field-Programmable Gate Array
GPIO  General-Purpose Input/Output
HDL  Hardware Description Language
IoT  Internet of Things
IU  Intention to Use
JTAG  Joint Test Action Group
LMS  Learning Management System
MOOC  Massive Open Online Course
MOOL  Massive Open Online Laboratory
MPSoC  Multiprocessor System on Chip
OBS  Open Broadcaster Software
PEU  Perceived Ease of Use
PMOD  Peripheral Module
PU  Perceived Usefulness
ROM  Read-Only Memory
RTCC  Real-Time Clock/Calendar
SDRs  Support Digital Resources
SLE  Smart Learning Environment
SoC  System on Chip
SSH  Secure Shell
TA  Technology Application
TAM  Technology Acceptance Model
TDT  Transactional Distance Theory
UML  Unified Modeling Language
UNAM  National Autonomous University of Mexico
UTM  Universal Testing Machine
VHDL  VHSIC Hardware Description Language
WHO  World Health Organization

References

  1. Burki, T. The origin of SARS-CoV-2. Lancet Infect. Dis. 2020, 20, 1018–1019. [Google Scholar] [CrossRef]
  2. WHO. WHO Director-General’s Opening Remarks at the Media Briefing on COVID-19—11 March 2020; WHO: Geneva, Switzerland, 2020. [Google Scholar]
  3. UNESCO. Education: From School Closure to Recovery. Available online: https://www.unesco.org/en/covid-19/education-response (accessed on 15 May 2023).
  4. Bradley, V.M. Learning Management System (LMS) use with online instruction. Int. J. Technol. Educ. 2021, 4, 68–92. [Google Scholar] [CrossRef]
  5. Gannon, D.; Barga, R.; Sundaresan, N. Cloud-Native Applications. IEEE Cloud Comput. 2017, 4, 16–21. [Google Scholar] [CrossRef]
  6. Kratzke, N.; Quint, P.C. Understanding cloud-native applications after 10 years of cloud computing-a systematic mapping study. J. Syst. Softw. 2017, 126, 1–16. [Google Scholar] [CrossRef]
  7. Singh, R.; Awasthi, S. Updated comparative analysis on video conferencing platforms-zoom, Google meet, Microsoft Teams, WebEx Teams and GoToMeetings. EasyChair Preprint 2020, 4026, 9. [Google Scholar]
  8. Caiko, J.; Patlins, A.; Nurlan, A.; Protsenko, V. Video-conference communication platform based on webrtc online meetings. In Proceedings of the 2020 IEEE 61st International Scientific Conference on Power and Electrical Engineering of Riga Technical University (RTUCON), Riga, Latvia, 5–7 November 2020; pp. 1–6. [Google Scholar] [CrossRef]
  9. González-Padilla, D.A.; Tortolero-Blanco, L. Social media influence in the COVID-19 Pandemic. Int. Braz. J. Urol. 2020, 46, 120–124. [Google Scholar] [CrossRef]
  10. Wong, A.; Ho, S.; Olusanya, O.; Antonini, M.V.; Lyness, D. The use of social media and online communications in times of pandemic COVID-19. J. Intensive Care Soc. 2021, 22, 255–260. [Google Scholar] [CrossRef] [PubMed]
  11. Alsukaini, A.K.M.; Sumra, K.; Khan, R.; Awan, T.M. New trends in digital marketing emergence during pandemic times. Int. J. Innov. Sci. 2022, 15, 167–185. [Google Scholar] [CrossRef]
  12. Woods, K.; Gomez, M.; Arnold, M.G. Using social media as a tool for learning in higher education. In Research Anthology on Applying Social Networking Strategies to Classrooms and Libraries; IGI Global: Hershey, PA, USA, 2023; pp. 35–49. [Google Scholar] [CrossRef]
  13. Bower, M. Technology-mediated learning theory. Br. J. Educ. Technol. 2019, 50, 1035–1048. [Google Scholar] [CrossRef]
  14. UNESCO. Global Monitoring of School Closures Caused by COVID-19. Available online: https://covid19.uis.unesco.org/global-monitoring-school-closures-covid19/country-dashboard/ (accessed on 19 May 2023).
  15. Centro de Estudios Sociales. Opinión Pública; Centro de Estudios Sociales: Mexico City, Mexico, 2020. [Google Scholar]
  16. Gastélum-Escalante, J.; León Santiesteban, M. Enseñanza remota o educación virtual. Disyuntiva de las instituciones mexicanas de educación superior. Apertura 2022, 14, 24–39. [Google Scholar] [CrossRef]
  17. Chans, G.M.; Orona-Navar, A.; Orona-Navar, C.; Sánchez-Rodríguez, E.P. Higher Education in Mexico: The Effects and Consequences of the COVID-19 Pandemic. Sustainability 2023, 15, 9476. [Google Scholar] [CrossRef]
  18. Akkan, I.N.; Kucuktepe, S.E. Distance Education Practices During the COVID-19 Lockdown: Comparison of Belgium, Japan, Spain, and Türkiye. Int. Rev. Res. Open Distrib. Learn. 2024, 25, 154–175. [Google Scholar] [CrossRef]
  19. Kramp, T.; Van Kranenburg, R.; Lange, S. Introduction to the Internet of Things. In Enabling Things to Talk: Designing IoT Solutions with the IoT Architectural Reference Model; Springer: Berlin/Heidelberg, Germany, 2013; pp. 1–10. [Google Scholar] [CrossRef]
  20. Milenkovic, M. Internet of Things: Concepts and System Design; Springer: Cham, Switzerland, 2020. [Google Scholar] [CrossRef]
  21. Kumar, S.; Tiwari, P.; Zymbler, M. Internet of Things is a revolutionary approach for future technology enhancement: A review. J. Big Data 2019, 6, 111. [Google Scholar] [CrossRef]
  22. Cárdenas-Robledo, L.A.; Peña-Ayala, A. Ubiquitous learning: A systematic review. Telemat. Inform. 2018, 35, 1097–1132. [Google Scholar] [CrossRef]
  23. Novoa Castillo, P.F.; Cancino Verde, R.F.; Uribe Hernández, Y.C.; Garro Aburto, L.L.; Mendez Ilizarbe, G.S. Ubiquitous learning in the teaching-learning process. Rev. Multi-Ensayos 2020, 2–8. Available online: https://revistas.unan.edu.ni/index.php/Multiensayo/en/article/view/1398 (accessed on 10 May 2025).
  24. Choudhury, S.; Pattnaik, S. Emerging themes in e-learning: A review from the stakeholders’ perspective. Comput. Educ. 2020, 144, 103657. [Google Scholar] [CrossRef]
  25. Goyal, M.; Krishnamurthi, R.; Yadav, D. E-Learning Methodologies: Fundamentals, Technologies and Applications; IET: London, UK, 2021; Volume 40, Available online: https://digital-library.theiet.org/doi/book/10.1049/pbpc040e (accessed on 5 August 2025).
  26. Daniela, L. (Ed.) The Internet of Things for Education: A New Actor on the Stage; Springer Nature Switzerland AG: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  27. Cruz-Miguel, E.E.; Rodríguez-Reséndiz, J.; García-Martínez, J.R.; Camarillo-Gómez, K.A.; Pérez-Soto, G.I. Field-programmable gate array-based laboratory oriented to control theory courses. Comput. Appl. Eng. Educ. 2019, 27, 1253–1266. [Google Scholar] [CrossRef]
  28. Garduño-Aparicio, M.; Rodríguez-Reséndiz, J.; Macias-Bobadilla, G.; Thenozhi, S. A multidisciplinary industrial robot approach for teaching mechatronics-related courses. IEEE Trans. Educ. 2018, 61, 55–62. [Google Scholar] [CrossRef]
  29. Ariza, J.Á.; Gil, S.G. RaspyLab: A low-cost remote laboratory to learn programming and physical computing through Python and Raspberry Pi. IEEE Rev. Iberoam. de Tecnol. del Aprendiz. 2022, 17, 140–149. [Google Scholar] [CrossRef]
  30. Chevalier, A.; Copot, C.; Ionescu, C.; De Keyser, R. A three-year feedback study of a remote laboratory used in control engineering studies. IEEE Trans. Educ. 2017, 60, 127–133. [Google Scholar] [CrossRef]
  31. Barrios, A.; Panche, S.; Duque, M.; Grisales, V.H.; Prieto, F.; Villa, J.L.; Chevrel, P.; Canu, M. A multi-user remote academic laboratory system. Comput. Educ. 2013, 62, 111–122. [Google Scholar] [CrossRef]
  32. Achuthan, K.; Raghavan, D.; Shankar, B.; Francis, S.P.; Kolil, V.K. Impact of remote experimentation, interactivity and platform effectiveness on laboratory learning outcomes. Int. J. Educ. Technol. High. Educ. 2021, 18, 38. [Google Scholar] [CrossRef]
  33. Zhang, Z.; Inoñan, M.; Orduña, P.; Hussein, R. RHLab: Towards Implementing a Partial Reconfigurable SDR Remote Lab. In Proceedings of the Smart Technologies for a Sustainable Future; Auer, M.E., Langmann, R., May, D., Roos, K., Eds.; Springer: Cham, Switzerland, 2024; pp. 180–192. [Google Scholar]
  34. Navas-González, R.; Oballe-Peinado, Ó.; Castellanos-Ramos, J.; Rosas-Cervantes, D.; Sánchez-Durán, J.A. Practice Projects for an FPGA-Based Remote Laboratory to Teach and Learn Digital Electronics. Information 2023, 14, 558. [Google Scholar] [CrossRef]
  35. Fujieda, N.; Okuchi, A. A Novel Remote FPGA Lab Platform Using MCU-based Controller Board. In Proceedings of the 2023 IEEE International Conference on Teaching, Assessment and Learning for Engineering (TALE), Auckland, New Zealand, 28 November–1 December 2023; pp. 1–6. [Google Scholar] [CrossRef]
  36. Melosik, M.; Naumowicz, M.; Kropidłowski, M.; Marszalek, W. Remote Prototyping of FPGA-Based Devices in the IoT Concept during the COVID-19 Pandemic. Electronics 2022, 11, 1497. [Google Scholar] [CrossRef]
  37. Wan, H.; Liu, K.; Lin, J.; Gao, X. A web-based remote FPGA laboratory for computer organization course. In Proceedings of the 2019 on Great Lakes Symposium on VLSI, Tysons Corner, VA, USA, 9–11 May 2019; pp. 243–248. [Google Scholar] [CrossRef]
  38. Zhang, Y.; Chen, Y.; Ma, X.; Tang, Y.; Niu, Y.; Li, S.; Liu, W. Remote FPGA lab platform for computer system curriculum. In Proceedings of the ACM Turing 50th Celebration Conference-China, Shanghai, China, 12–14 May 2017; pp. 1–6. [Google Scholar] [CrossRef]
  39. Hoang, T.M.; Quang, H.N.; Hung, T.Q.; de Souza-Daw, T.; Ngoc, L.H.; Dzung, N.T.; Bien, P.H. Implementation of Low Cost FPGA Remote Laboratory. ASEAN Eng. J. Part A 2015, 5, 56–76. [Google Scholar] [CrossRef]
  40. Toyoda, Y.; Koike, N.; Li, Y. An FPGA-based remote laboratory: Implementing semi-automatic experiments in the hybrid cloud. In Proceedings of the 2016 13th International Conference on Remote Engineering and Virtual Instrumentation (REV), Madrid, Spain, 24–26 February 2016; pp. 24–29. [Google Scholar] [CrossRef]
  41. Mayoz, C.A.; da Silva Beraldo, A.L.; Villar-Martinez, A.; Rodriguez-Gil, L.; de Souza Seron, W.F.M.; Orduña, P. FPGA remote laboratory: Experience of a shared laboratory between UPNA and UNIFESP. In Proceedings of the 2020 XIV Technologies Applied to Electronics Teaching Conference (TAEE), Porto, Portugal, 8–10 July 2020; pp. 1–8. [Google Scholar] [CrossRef]
  42. Alhammami, M. FPGA hardware kit for remote training platforms. Discov. Educ. 2024, 3, 102. [Google Scholar] [CrossRef]
  43. Sum, R.; Suwansantisuk, W.; Kumhom, P. Remote field-programmable gate array laboratory for signal acquisition and design verification. Int. J. Electr. Comput. Eng. 2024, 14, 2344–2360. [Google Scholar] [CrossRef]
  44. Cruz, C.; Gil, R.; de la Llana, A.; Bravo, I.; Gardel, A.; Lázaro, J.L. Remote Laboratory Based on a Reconfigurable Hardware Platform. In Proceedings of the 2024 XVI Congreso de Tecnología, Aprendizaje y Enseñanza de la Electrónica (TAEE), Malaga, Spain, 26–28 June 2024; pp. 1–6. [Google Scholar] [CrossRef]
  45. Guerrero-Osuna, H.A.; García-Vázquez, F.; Ibarra-Delgado, S.; Mata-Romero, M.E.; Nava-Pintor, J.A.; Ornelas-Vargas, G.; Castañeda-Miranda, R.; Rodríguez-Abdalá, V.I.; Solís-Sánchez, L.O. Developing a Cloud and IoT-Integrated Remote Laboratory to Enhance Education 4.0: An Approach for FPGA-Based Motor Control. Appl. Sci. 2024, 14, 115. [Google Scholar] [CrossRef]
  46. Ballina, M.G.; Molina, R.S.; Crespoa, M.L.; Carrato, S. HyperFPGA: Enhancing Education with Remote Laboratory Access for Heterogeneous Computing on MPSoC-FPGA Technologies. In Proceedings of the 2025 IEEE Global Engineering Education Conference (EDUCON), London, UK, 22–25 April 2025; pp. 1–5. [Google Scholar] [CrossRef]
  47. AMD. Vivado Design Suite Tutorial. 2023. Available online: https://docs.amd.com/r/en-US/ug893-vivado-ide/Using-the-Vivado-IDE (accessed on 5 August 2023).
  48. Digilent. Nexys 4 DDR FPGA Board Reference Manual. Pullman, WA99163. 2016. Available online: https://digilent.com/reference/_media/reference/programmable-logic/nexys-4-ddr/nexys4ddr_rm.pdf (accessed on 20 May 2023).
  49. IEEE Std 1076-2019; IEEE Standard for VHDL Language Reference Manual. IEEE: Piscataway, NJ, USA, 2019; pp. 1–673. [CrossRef]
  50. IEEE Std 1364-2005 (Revision of IEEE Std 1364-2001); IEEE Standard for Verilog Hardware Description Language. IEEE: Piscataway, NJ, USA, 2006; pp. 1–590. [CrossRef]
  51. Coop, I.S. Una metodología para el desarrollo de hardware y software embebidos en sistemas críticos de seguridad. Iiisci.org 2006, 3, 70–75. [Google Scholar]
  52. Booch, G.; Rumbaugh, J.; Jacobson, I. The Unified Modeling Language User Guide: Covers UML 2.0, 2nd ed.; Pearson Education: Noida, India, 2015; Available online: https://books.google.com.mx/books?id=m5E8ygEACAAJ (accessed on 13 December 2024).
  53. Raspberry Pi Ltd. Operating System Images. 2025. Available online: https://www.raspberrypi.com/software/operating-systems/ (accessed on 15 March 2025).
  54. Raspberry Pi Ltd. Raspberry Pi 3 Model B+. 2016. Available online: https://datasheets.raspberrypi.com/rpi3/raspberry-pi-3-b-plus-product-brief.pdf (accessed on 10 September 2025).
  55. Raspberry Pi. Camera, About the Camera Modules. 2025. Available online: https://www.raspberrypi.com/documentation/accessories/camera.html (accessed on 23 September 2025).
  56. SONY. IMX219. 2025. Available online: https://www.opensourceinstruments.com/Electronics/Data/IMX219PQ.pdf (accessed on 28 September 2025).
  57. MIPI Alliance. MIPI CSI-2®. 2025. Available online: https://www.mipi.org/specifications/csi-2 (accessed on 12 September 2025).
  58. Motion Project. Motion Project Website. 2025. Available online: https://motion-project.github.io/ (accessed on 5 September 2025).
  59. NXP Semiconductors. UM10204: I2C-Bus Specification and User Manual; rev. 7; NXP Semiconductors: Eindhoven, The Netherlands, 2021; Available online: https://www.nxp.com/docs/en/user-guide/UM10204.pdf (accessed on 4 October 2025).
  60. Analog Devices, Inc. DS1338: I2C RTC with 56-Byte NV RAM; rev. 4/15; Analog Devices, Inc.: Wilmington, MA, USA, 2015; Available online: https://www.analog.com/media/en/technical-documentation/data-sheets/DS1338-DS1338Z.pdf (accessed on 11 October 2025).
  61. Microchip Technology Inc. ATECC608A: CryptoAuthentication™ Device Summary Data Sheet; Document No. DS40001977B; Microchip Technology Inc.: Chandler, AZ, USA, 2018; Available online: https://www.microchip.com/en-us/product/atecc608a (accessed on 22 October 2025).
  62. Future Technology Devices International Ltd. FT232R USB UART IC Datasheet; rev. 2.16; Future Technology Devices International Ltd.: Glasgow, UK, 2020; Available online: https://ftdichip.com/wp-content/uploads/2020/08/DS_FT232R.pdf (accessed on 15 October 2025).
  63. Downes, S. Connectivism. Asian J. Distance Educ. 2022, 17, 58–87. [Google Scholar] [CrossRef]
  64. Siemens, G. Connectivism: A Learning Theory for the Digital Age. 2005. Available online: https://www.itdl.org/Journal/Jan_05/article01.htm (accessed on 11 August 2025).
  65. Downes, S. Online Learning and MOOCs: Visions and Pathways. 2018. Available online: https://www.downes.ca/post/69604 (accessed on 23 August 2024).
  66. Downes, S. An Introduction to Connective Knowledge; Chapter: Media, Data & Knowledge; Innsbruck University Press: Innsbruck, Austria, 2008. [Google Scholar] [CrossRef]
  67. Yuan, L.; Powell, S.J. MOOCs and Open Education: Implications for Higher Education; Technical Report; CETIS, University of Bolton: Manchester, UK, 2013. [Google Scholar]
  68. Daniel, J. Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. J. Interact. Media Educ. 2012, 2012, 18. [Google Scholar] [CrossRef]
  69. Minguillón, A. Introduction to Unified Modeling Language. 2001. Available online: https://openaccess.uoc.edu/bitstream/10609/9121/1/Intro_UML.pdf (accessed on 17 August 2025).
  70. Mohammed, A.R.; Kassem, S.S. E-Learning System Model For University Education Using UML. In Proceedings of the 2020 Sixth International Conference on e-Learning (econf), Sakheer, Bahrain, 6–7 December 2020; pp. 35–39. [Google Scholar] [CrossRef]
  71. Yaser Nasr, S.; Kassem, S. Modeling the Production Planning and Control System using UML. In Proceedings of the 2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES), Giza, Egypt, 24–26 October 2020; pp. 21–26. [Google Scholar] [CrossRef]
  72. Mostafa, A.I.; Shalaby, M.A.W.; Kassem, S.S. Application of unified modelling language on firm’s supply chain. In Proceedings of the 2023 5th Novel Intelligent and Leading Emerging Sciences Conference (NILES), Giza, Egypt, 21–23 October 2023; pp. 107–110. [Google Scholar] [CrossRef]
  73. Mansour, K.M.; Kassem, S. Modeling of Agile MicroFactory System using Unified Modeling Language. In Proceedings of the 2022 4th Novel Intelligent and Leading Emerging Sciences Conference (NILES), Giza, Egypt, 22–24 October 2022; pp. 200–205. [Google Scholar] [CrossRef]
  74. Liebel, G.; Badreddin, O.; Heldal, R. Model Driven Software Engineering in Education: A Multi-Case Study on Perception of Tools and UML. In Proceedings of the 2017 IEEE 30th Conference on Software Engineering Education and Training (CSEET), Savannah, GA, USA, 7–9 November 2017; pp. 124–133. [Google Scholar] [CrossRef]
  75. Hyerle, D. Visual Tools for Transforming Information into Knowledge; Corwin Press: Thousand Oaks, CA, USA, 2008; Available online: https://books.google.com.mx/books?id=9np1AwAAQBAJ&lpg=PP1&ots=q8uMuagEvE&dq=Visual%20Tools%20for%20Transforming%20Information%20into%20Knowledge (accessed on 27 October 2025).
  76. Canva. Canva Website. 2025. Available online: https://www.canva.com/ (accessed on 3 October 2025).
  77. Lumi Education. Lumi Education Website. 2025. Available online: https://lumi.education/es/ (accessed on 25 October 2025).
  78. Apache. Apache JMeter. Available online: https://jmeter.apache.org/ (accessed on 7 September 2024).
  79. Xilinx. 7 Series FPGAs Data Sheet: Overview. 2020. Available online: https://docs.amd.com/v/u/en-US/ds180_7Series_Overview (accessed on 18 October 2025).
  80. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
81. Granić, A.; Marangunić, N. Technology acceptance model in educational context: A systematic literature review. Br. J. Educ. Technol. 2019, 50, 2572–2593.
82. Jarvis, C.B.; MacKenzie, S.B.; Podsakoff, P.M. A Critical Review of Construct Indicators and Measurement Model Misspecification in Marketing and Consumer Research. J. Consum. Res. 2003, 30, 199–218. Available online: https://academic.oup.com/jcr/article-pdf/30/2/199/17927858/30-2-199.pdf (accessed on 12 December 2025).
83. Petter, S.; Straub, D.; Rai, A. Specifying Formative Constructs in Information Systems Research. MIS Q. 2007, 31, 623–656.
84. SmartPLS GmbH. SmartPLS 4; Structural Equation Modeling (SEM) Software; SmartPLS GmbH: Monheim am Rhein, Germany, 2026. Available online: https://smartpls.com/ (accessed on 1 January 2026).
85. Hair, J.; Alamer, A. Partial Least Squares Structural Equation Modeling (PLS-SEM) in second language and education research: Guidelines using an applied example. Res. Methods Appl. Linguist. 2022, 1, 100027.
86. Sarstedt, M.; Hair, J.F.; Pick, M.; Liengaard, B.D.; Radomir, L.; Ringle, C.M. Progress in partial least squares structural equation modeling use in marketing research in the last decade. Psychol. Mark. 2022, 39, 1035–1064.
87. Oviedo, H.C.; Campo-Arias, A. Aproximación al uso del coeficiente alfa de Cronbach [An approach to the use of Cronbach's alpha coefficient]. Rev. Colomb. Psiquiatr. 2005, 34, 572–580.
88. Streiner, D.L. Starting at the beginning: An introduction to coefficient alpha and internal consistency. J. Personal. Assess. 2003, 80, 99–103.
89. Hernández-Sampieri, R.; Mendoza, C. Metodología de la Investigación: Las Rutas Cuantitativa, Cualitativa y Mixta [Research Methodology: Quantitative, Qualitative and Mixed Routes]; McGraw-Hill Education: Mexico City, Mexico, 2018. Available online: https://books.google.com.mx/books?id=5A2QDwAAQBAJ (accessed on 22 February 2025).
90. Pandas Development Team. Pandas User Guide; NumFOCUS, Inc.: Austin, TX, USA, 2025. Available online: https://pandas.pydata.org/docs/user_guide/index.html (accessed on 15 October 2025).
91. Project Jupyter Community. Project Jupyter Documentation. 2025. Available online: https://docs.jupyter.org/en/latest/ (accessed on 21 September 2025).
92. Schwab, K. La Cuarta Revolución Industrial [The Fourth Industrial Revolution]; Debate: Barcelona, Spain, 2016. Available online: https://books.google.com.mx/books?id=BRonDQAAQBAJ (accessed on 1 July 2025).
93. Mukul, E.; Büyüközkan, G. Digital transformation in education: A systematic review of education 4.0. Technol. Forecast. Soc. Change 2023, 194, 122664.
94. DigiKey. Raspberry Pi 3 Model B+. Available online: https://www.digikey.com.mx/es/products/detail/raspberry-pi/SC0073/8571724 (accessed on 20 June 2024).
95. Digilent. Nexys A7: FPGA Trainer Board. Available online: https://digilent.com/shop/nexys-a7-fpga-trainer-board-recommended-for-ece-curriculum (accessed on 17 June 2024).
Figure 1. Research methodology.
Figure 2. General architecture.
Figure 3. Sequence of actions for the on-site implementation of lab exercises.
Figure 4. Sequence of actions for using the remote laboratory.
Figure 5. Technological architecture components of the remote laboratory.
Figure 6. The V-model methodology stages.
Figure 7. Block diagram of the SoC–FPGA Interface.
Figure 8. Algorithm State Machine of the lab exercise.
Figure 9. Microarchitecture of the ASM lab exercise. The figure illustrates the datapath composed of the control unit, a 22-bit register, BCD counters, ring counter, ROM, and combinational logic as well as the input and output signals used for control, data processing, and system verification.
Figure 10. Algorithm design for BCD counters.
Figure 11. Automaton of the control unit.
Figure 12. Entity model for the DLD course.
Figure 13. Homepage and course content of the DLD course in the LMS.
Figure 14. Explanation of the ASM lab exercise using the Visual Design learning strategy [75]. Within this strategy, the Canva environment [76] and the Lumi software [77] are employed to create interactive H5P components.
Figure 15. Use case diagram of the connectivist model.
Figure 16. Activity diagram of the "Lab Exercise Assessment" use case.
Figure 17. Physical implementation of the remote laboratory: (a) Complete system consisting of an FPGA development board, camera, SoC–FPGA interface, SoC, FT232 module, power supply, crypto-authentication device, and RTCC. (b) A shelf containing six complete remote laboratory systems.
Figure 18. VHDL implementation of the control unit of the ASM lab exercise microarchitecture.
Figure 19. Testbench simulation of the ASM lab exercise: (a) After a reset signal, the INI input signal is set to logic 1 and the BCD counters increment from 00 to 04. (b) The BCD counters continue incrementing from 05 to 12.
Figure 20. Functional verification of the ASM lab exercise with the remote laboratory: (a) Input values for the ASM lab exercise using the virtual switches application. (b) Video server result for the ASM lab exercise.
Figure 21. Extended TAM framework designed in SmartPLS 4 [84].
Table 1. Virtual switches of the REMOTE LAB connector.

| Virtual Sw | FPGA Pin | Virtual Sw | FPGA Pin | Virtual Sw | FPGA Pin |
| --- | --- | --- | --- | --- | --- |
| SWV0 | H14 | SWV8 | G17 | SWV16 | B18 |
| SWV1 | H16 | SWV9 | G18 | SWV17 | A18 |
| SWV2 | G16 | SWV10 | E18 | SWV18 | B16 |
| SWV3 | G13 | SWV11 | F18 | SWV19 | B17 |
| SWV4 | F16 | SWV12 | D18 | SWV20 | A15 |
| SWV5 | F13 | SWV13 | E17 | SWV21 | A16 |
| SWV6 | D14 | SWV14 | C17 | SWV22 | A13 |
| SWV7 | E16 | SWV15 | D17 | SWV23 | A14 |
Table 2. Lab exercises of the DLD course.

| No | Lab Exercise Title | Course Topic |
| --- | --- | --- |
| 1 | Combinational logic | Introduction |
| 2 | Flip-Flops | Latches and Flip-Flop circuits |
| 3 | Registers | Registers: architectures and applications |
| 4 | Sequence detector | Register applications and Mealy Machine (MyM) |
| 5 | Multiplexed message | Moore Machine (MoM) |
| 6 | Counters | Counters: architectures and applications |
| 7 | Special counters | BCD, ring, and Johnson counters |
| 8 | Sensors | MyM, special counter applications |
| 9 | Algorithmic State Machine (ASM) | Register, counter, and memory applications, and ASM |
Table 3. CPU performance and memory usage for the video server.

| Users | % CPU | % MEM | Mem Total (MB) | Mem Free (MB) | Mem Used (MB) | Mem Cache (MB) |
| --- | --- | --- | --- | --- | --- | --- |
| 0 | 36.6 | 5.2 | 871.7 | 548.0 | 68.7 | 254.4 |
| 1 | 84.8 | 5.2 | 871.7 | 548.2 | 69.0 | 254.4 |
| 2–10 | 85.1 | 5.2 | 871.7 | 548.2 | 69.1 | 254.4 |
Table 4. SSH server performance.

| User Connections | Connections Made Within 1 s | Connections Made Within 3–60 s |
| --- | --- | --- |
| 5 | 5 | 5 |
| 10 | 10 | 10 |
| 15 | 15 | 15 |
| 20–30 | 15 | 19 |
Table 5. FPGA resources of the ASM lab exercise.

| Resource | Utilization | Available | Utilization (%) |
| --- | --- | --- | --- |
| LUT | 65 | 63,400 | 0.10 |
| FF | 72 | 126,800 | 0.06 |
| IO | 58 | 210 | 27.62 |
| BUFG | 2 | 32 | 6.25 |
Table 6. FPGA timing information of the ASM lab exercise.

| Period (T) | Worst Negative Slack (WNS) | Maximum Frequency (Fmax) | Worst Hold Slack (WHS) |
| --- | --- | --- | --- |
| 10 ns | 7.788 ns | 452 MHz | 0.262 ns |
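The maximum frequency in Table 6 follows directly from the target clock period and the worst negative slack: Fmax = 1 / (T − WNS). A minimal sketch checking this with the table's values:

```python
# Fmax is derived from the target clock period T and the worst negative
# slack (WNS) reported by the implementation tool: Fmax = 1 / (T - WNS).
# Values taken from Table 6.
T_NS = 10.0     # target clock period in ns
WNS_NS = 7.788  # worst negative slack in ns

fmax_mhz = 1000.0 / (T_NS - WNS_NS)  # 1/ns -> GHz, *1000 -> MHz
print(round(fmax_mhz))  # 452, matching the reported Fmax of 452 MHz
```

The positive WHS of 0.262 ns in the same table indicates that hold requirements are also met at this frequency.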
Table 7. Items for each construct of the extended TAM.

| Variable | Tag | Item |
| --- | --- | --- |
| Learner Profile | SP1 | Age. |
| | SP2 | Gender (male/female). |
| Perceived Usefulness | PU1 | The remote lab allows learners to gain practical knowledge of the topics covered in the course. |
| | PU2 | The remote lab allows learners to perform lab exercises remotely in an innovative way. |
| | PU3 | The time allocated for remote lab use is greater than the time allocated for physical lab use. |
| | PU4 | Interaction with the remote lab enhances the understanding of each course topic more than using only simulators. |
| | PU5 | Remote lab access is available from anywhere and at any time. |
| Perceived Ease of Use | PEU1 | The application for programming the BitStream file onto the FPGA is easy to use. |
| | PEU2 | The application for interacting with the remote lab connector allows for easy input value assignment to test the lab exercises. |
| | PEU3 | The video server allows the results of the lab exercises to be displayed with low latency (in seconds). |
| | PEU4 | The live video stream allows the results of the lab exercises to be viewed clearly and sharply. |
| | PEU5 | Assigning terminals on the REMOTE LAB connector using the constraints file is easy to do. |
| Technology Application | TA1 | The use of the remote lab allows learners to acquire and apply knowledge from the Computer Networks course. |
| | TA2 | The use of the remote lab allows learners to acquire and apply knowledge from the Operating Systems course. |
| Support Digital Resources | SDR1 | The operation video of the remote lab using PuTTY and WinSCP is useful for learning how to use it on the Windows operating system. |
| | SDR2 | The user manual of the remote lab is useful for learning how to use it. |
| | SDR3 | The support material provided is useful for operating the remote lab. |
| Acquired Digital Skills | ADS1 | The Vivado CAD-EDA tool allows learners to develop and simulate HDL programs of each course lab exercise. |
| Intention to Use | IU1 | Would you like to continue using the remote lab in future courses? |
| | IU2 | Would you like to continue using the remote lab for development and learning of personal and academic projects? |
Table 8. Survey results.

| Variable | Tag | (n) | (a) | (ta) |
| --- | --- | --- | --- | --- |
| Perceived Usefulness | PU1 | | 3.6% | 96.4% |
| | PU2 | | 3.6% | 96.4% |
| | PU3 | 3.6% | 3.6% | 92.9% |
| | PU4 | | 3.6% | 96.4% |
| | PU5 | | 7.1% | 92.9% |
| Perceived Ease of Use | PEU1 | | 7.1% | 92.9% |
| | PEU2 | 3.6% | 3.6% | 92.9% |
| | PEU3 | 3.6% | 7.1% | 89.3% |
| | PEU4 | | 10.7% | 89.3% |
| | PEU5 | | 3.6% | 96.4% |
| Technology Application | TA1 | | 10.7% | 89.3% |
| | TA2 | 3.6% | 7.1% | 89.3% |
| Support Digital Resources | SDR1 | | 7.1% | 92.9% |
| | SDR2 | | 3.6% | 96.4% |
| | SDR3 | | 3.6% | 96.4% |
| Acquired Digital Skills | ADS1 | | 14.3% | 85.7% |
| Intention to Use | IU1 | | 7.1% | 92.9% |
| | IU2 | | 14.3% | 85.7% |
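The percentage steps in Table 8 (3.6%, 7.1%, 10.7%, 14.3%, …) are consistent with a cohort of 28 respondents (1/28 ≈ 3.6%); note that this cohort size is an inference from the values, not something the table states. A minimal sketch of how such percentages arise from raw response counts:

```python
# Sketch: reproducing Table 8 percentages from hypothetical raw counts,
# ASSUMING 28 survey respondents (inferred from 1/28 = 3.6%, 27/28 = 96.4%).
TOTAL = 28

def pct(count, total=TOTAL):
    """Percentage of respondents, rounded to one decimal as in Table 8."""
    return round(100.0 * count / total, 1)

# e.g., an item with 1 "agree" and 27 "totally agree" responses:
print(pct(1), pct(27))  # 3.6 96.4
# and one with 2 "agree" and 26 "totally agree":
print(pct(2), pct(26))  # 7.1 92.9
```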
Table 9. Collinearity, outer weights, and outer loadings.

| Item | VIF | Relation | Outer Weights | Outer Loadings |
| --- | --- | --- | --- | --- |
| ADS1 | 1 | ADS1 → ADS | 1 | 1.000 |
| IU1 | 1.857 | IU1 ← IU | 0.483 | 0.895 |
| IU2 | 1.857 | IU2 ← IU | 0.607 | 0.935 |
| PEU1 | 1.414 | PEU1 → PEU | 0.738 | 0.747 |
| PEU2 | 1.014 | PEU2 → PEU | 0.662 | 0.591 |
| PEU3 | 2.269 | PEU3 → PEU | 0.016 | 0.454 |
| PEU4 | 1.841 | PEU4 → PEU | 0.147 | 0.354 |
| PEU5 | 1.009 | PEU5 → PEU | 0.053 | −0.031 |
| PU1 | 1.008 | PU1 → PU | 0.582 | 0.533 |
| PU2 | 1.008 | PU2 → PU | −0.028 | −0.100 |
| PU3 | 1.081 | PU3 → PU | 0.017 | 0.058 |
| PU4 | 1.008 | PU4 → PU | 0.765 | 0.722 |
| PU5 | 1.083 | PU5 → PU | 0.400 | 0.335 |
| SDR1 | 1.931 | SDR1 → SDR | −0.988 | −0.380 |
| SDR2 | 1.003 | SDR2 → SDR | −0.712 | −0.690 |
| SDR3 | 1.929 | SDR3 → SDR | 0.822 | 0.163 |
| TA1 | 1.205 | TA1 → TA | 0.344 | 0.677 |
| TA2 | 1.205 | TA2 → TA | 0.808 | 0.950 |
Table 10. Path coefficients, coefficient of determination, and indirect effects.

| Relation | Path Coefficient | Construct | R² | Adjusted R² | Relation | Indirect Effect |
| --- | --- | --- | --- | --- | --- | --- |
| ADS → PEU | 0.152 | IU | 0.342 | 0.316 | ADS → IU | 0.084 |
| PU → IU | 0.585 | PEU | 0.886 | 0.872 | ADS → PU | 0.145 |
| PEU → PU | 0.951 | PU | 0.904 | 0.900 | PEU → IU | 0.556 |
| SDR → PEU | 0.011 | | | | SDR → IU | 0.006 |
| TA → PEU | 0.811 | | | | SDR → PU | 0.010 |
| TA → IU | 0.451 | | | | | |
| TA → PU | 0.772 | | | | | |
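In PLS-SEM, each indirect effect is the product of the direct path coefficients along the mediated route, so the indirect-effect column of Table 10 can be checked against the path coefficients. A sketch using the rounded values from the table (small discrepancies such as ADS → IU, 0.084 vs. 0.085 here, are expected because SmartPLS multiplies unrounded coefficients):

```python
# Indirect effects in Table 10 = products of the direct path coefficients
# along each mediated route (standard PLS-SEM effect decomposition).
paths = {
    ("ADS", "PEU"): 0.152, ("SDR", "PEU"): 0.011, ("TA", "PEU"): 0.811,
    ("PEU", "PU"): 0.951, ("PU", "IU"): 0.585,
    ("TA", "PU"): 0.772, ("TA", "IU"): 0.451,
}

def indirect(*chain):
    """Multiply the coefficients along a chain of constructs."""
    product = 1.0
    for a, b in zip(chain, chain[1:]):
        product *= paths[(a, b)]
    return round(product, 3)

print(f'{indirect("ADS", "PEU", "PU"):.3f}')  # 0.145 (Table 10: ADS -> PU)
print(f'{indirect("PEU", "PU", "IU"):.3f}')   # 0.556 (Table 10: PEU -> IU)
print(f'{indirect("SDR", "PEU", "PU"):.3f}')  # 0.010 (Table 10: SDR -> PU)
```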
Table 11. Assessment rubric for the DLD Course. A ✓ marks the criteria required for the corresponding grade band.

| Specific Indicator | Criteria | 8.0–10 | 6.0–7.9 | 0–5.9 |
| --- | --- | --- | --- | --- |
| A. Ability to represent a problem using a model based on Mealy Machine, Moore Machine and Algorithmic State Machine. | a. Problem analysis, determination of inputs and outputs | ✓ | ✓ | ✓ |
| | b. Model selection and design | ✓ | ✓ | |
| | c. Model states' reduction | ✓ | | |
| B. Ability to apply the digital logic design methodology and obtain a digital circuit. | d. Correct code assignment | ✓ | ✓ | ✓ |
| | e. State table generation | ✓ | ✓ | ✓ |
| | f. Correct assignment of values to the control signals of the chosen FF | ✓ | ✓ | |
| | g. Correct use of minimization method | ✓ | | |
| | h. Correct equations | ✓ | | |
| C. Ability to implement the digital circuit using a model based on Mealy Machine, Moore Machine, HDL, and a CAD-EDA tool. | i. Programming using an HDL | ✓ | ✓ | ✓ |
| | j. Using a simulator with a test-bench file | ✓ | ✓ | |
| | k. Pin assignment, constraints, and BitStream file generation | ✓ | | |
| D. Ability to test lab exercises using the remote laboratory. | l. File transfer to the server | ✓ | ✓ | ✓ |
| | m. Remote access to the server | ✓ | ✓ | ✓ |
| | n. FPGA programming | ✓ | ✓ | |
| | o. Manipulation of lab exercise inputs | ✓ | | |
| | p. Verification of lab exercise results | ✓ | | |
| E. Ability to write technical reports using the transmedia narrative strategy. | q. Writing the report using digital media | ✓ | ✓ | ✓ |
| | r. Explanation of lab exercise testing | ✓ | ✓ | |
| | s. Explanation of results and conclusions | ✓ | ✓ | |
| | t. Inclusion of appropriate references according to the digital media | ✓ | | |
Table 12. Analysis of learner performance in DLD courses.

| Specific Indicator | 2017 (2CM1) | 2018 (2CM10) | 2019 (2CM7) | 2021 (2CM16) |
| --- | --- | --- | --- | --- |
| A | 7.847 | 7.057 | 8.303 | 9.714 |
| B | 7.881 | 7.006 | 8.442 | 9.782 |
| C | 7.981 | 7.134 | 8.576 | 9.790 |
| D | 7.914 | 7.127 | 8.370 | 9.880 |
| E | 8.136 | 7.053 | 8.348 | 9.721 |
| Total average (x̄) | 7.951 | 7.075 | 8.408 | 9.777 |
Table 13. Descriptive statistic results in DLD courses.

| Year | Group | Mean (x̄) | Standard Deviation (s) | Variance (s²) | CV (%) |
| --- | --- | --- | --- | --- | --- |
| 2017 | 2CM1 | 7.951 | 1.1905 | 1.417 | 14.97 |
| 2018 | 2CM10 | 7.075 | 1.967 | 3.871 | 27.8 |
| 2019 | 2CM7 | 8.408 | 1.978 | 3.916 | 23.53 |
| 2021 | 2CM16 | 9.777 | 0.634 | 0.403 | 6.49 |
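The coefficient of variation in Table 13 is CV = s / x̄ × 100. A quick recomputation from the reported means and standard deviations (the 2021 group comes out 6.48 rather than 6.49 here, since the table's CV appears to have been computed from an unrounded s):

```python
# CV (%) = s / mean * 100, using the means and standard deviations
# reported in Table 13 for the four DLD course groups.
groups = {
    "2CM1 (2017)":  (7.951, 1.1905),
    "2CM10 (2018)": (7.075, 1.967),
    "2CM7 (2019)":  (8.408, 1.978),
    "2CM16 (2021)": (9.777, 0.634),
}

for name, (mean, s) in groups.items():
    cv = round(100.0 * s / mean, 2)
    print(f"{name}: CV = {cv}%")
```

The low CV of the 2021 experimental group confirms that its high mean grade was also the most homogeneous across learners.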
Table 14. Differences between laboratory modalities.

| Aspect to Consider | In-Person Modality | Remote Modality |
| --- | --- | --- |
| Usage time (availability) | Very limited, only 1.5 h per week | Continuous, 24/7 access |
| Accessibility | Only learners belonging to the academic unit | Any learner with an assigned account |
| Learner location | The learner must attend the academic unit | Anywhere in the world |
| Laboratory infrastructure | Personal computer, CAD-EDA tool, and development board | Embedded system, internet connection, and development board |
| Learner infrastructure | None | PC, CAD-EDA tool, and internet connection |
| Lab physical space | Lab classroom dimensions determined by the educational institution | 30 cm × 20 cm × 20 cm for each remote lab |
| Educational staff | Staff required to provide access and materials and to perform reviews | No staff required |
| Learners who interact during the review | Limited to team-member learners | All learners in the class |
Table 15. Comparison with the literature review.

| Paper | Technological Architecture | Learning Theory | Software Application | Measuring Instrument | Lab Exercise Sequence | Learning Outcomes |
| --- | --- | --- | --- | --- | --- | --- |
| Proposed paper | Server on Raspberry Pi 3 SoC, video from the Camera Module V2, Nexys 4 with Xilinx FPGA | Connectivism | Visual shell applications for FPGA programming and interaction with virtual switches | Likert-scale survey with extended TAM framework | Nine lab exercises | Evaluation rubric, descriptive statistics |
| [34] | Server on Raspberry Pi 4 SoC, without video, Nexys 3 with Xilinx FPGA | None | Web application | Likert-scale survey without model | Four projects | None |
| [36] | Server on PC, video from webcam, Spartan-3E with Xilinx FPGA | None | Web application | Without survey and model | Two projects without interaction | None |
| [41] | Server on Raspberry Pi 3 SoC, video from webcam, DE2-115 with Altera FPGA | None | Web application | Likert-scale survey without model | Without lab exercises | None |
| [37] | Cloud server, host computer, without video, Spartan-6 FPGA | None | Web application | Without survey and model | Nine projects | None |
| [38] | Server on Zynq-7 SoC, without video, Artix-7 with Xilinx FPGA | None | Web application | Without survey and model | MIPS32 processor | None |
| [39] | Host computer, without video, Cyclone II with Altera FPGA | None | Visual interface | Without survey and model | Without lab exercises | None |
| [35,40,43,44] | Host computer, with video, FPGA board | None | Web application | Without survey and model | Without lab exercises | None |
| [45] | Raspberry Pi 5, with video, Basys 3 FPGA | None | Web application | With survey but without model | Three lab exercises | None |
| [46] | MPSoC–FPGA, without video | None | JupyterHub interface | Without survey and model | Five lab exercises | None |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

García Ortega, V.H.; Bárcenas López, J.; Sánchez, E.R.-V. Remote Laboratory Based on FPGA Devices Using the E-Learning Approach. Appl. Syst. Innov. 2026, 9, 37. https://doi.org/10.3390/asi9020037

