Proceeding Paper

Remote Control Device for the Detection and Correction of Errors in the FDM 3D Printing Process in Real Time †

1 Mechatronics Engineering Program, Faculty of Engineering, Universidad Autónoma del Caribe, Barranquilla 080020, Colombia
2 GIIM Group, Mechatronic Engineering, Universidad Autónoma del Caribe, Barranquilla 080020, Colombia
* Author to whom correspondence should be addressed.
Presented at the III International Congress on Technology and Innovation in Engineering and Computing, Lima, Peru, 20–24 November 2023.
Eng. Proc. 2025, 83(1), 12; https://doi.org/10.3390/engproc2025083012
Published: 14 January 2025

Abstract

This project focuses on developing a remote control device for the real-time detection and correction of errors in fused deposition modeling (FDM) 3D printing. It utilizes a Raspberry Pi computer and a webcam to capture images while a neural network trained with a dataset generated by the research team identifies errors such as warping, stringing, and spaghetti. Information is efficiently transmitted via MQTT, with instant notifications through Telegram and a user interface. The methodology spans from training the neural network to integrated control strategies with the remote device. Evaluation highlights high precision using confusion matrices and IoU, promising substantial improvements in industrial and critical 3D printing environments.

1. Introduction

Three-dimensional printing is the process of creating objects by depositing successive layers of material on top of each other. The technology has existed for about four decades, having been invented in the early 1980s. While it began as a slow and costly technique, significant technological advances have made current systems more affordable and faster than ever. It offers several advantages, the most significant being the ability to produce highly complex designs that would otherwise be impossible [1]. Besides professional use for prototyping and low-volume manufacturing, 3D printers are becoming widespread among end users, starting with the so-called Maker Movement. The most common type of consumer-grade 3D printer uses fused deposition modeling (FDM, also known as FFF, fused filament fabrication). This work focuses on FDM machines because of their widespread use and their numerous open issues, such as precision and failure rates.

These printers fail to complete prints at a rate that depends on the manufacturer and model. Failures may occur due to misalignment of the print bed or print head, motor slippage, material deformation, lack of adhesion, or other causes [2]. The 3D printing process must therefore be monitored throughout to prevent the loss of printing material and operator time and to ensure product quality. While some 3D printers come with basic built-in monitors, there are currently no advanced remote control and real-time monitoring systems for 3D printers; the monitoring of failures and job completion must be carried out on-site, either by operators or by video cameras.

One solution for overseeing the entire 3D printing process in real time is remote video surveillance. It works well for small-scale production but becomes a challenge for larger and more sensitive operations: the amount of video data generated by more than six 3D printers can be overwhelming to transmit and analyze, and sending all these data for external cloud processing would require significant bandwidth and low-latency communications [3].

2. Materials, Theory, and Methods

2.1. Materials

For this project, the team used an "Ender 3" 3D printer (Figure 1a), well known for its efficiency. The system was integrated with a Raspberry Pi 4 Model B with 4 GB of RAM serving as the remote control hub (Figure 1b), and a manually focused camera (Figure 1c) was employed to capture detailed images during the printing process. The parts for the dataset were printed in PLA filament, an environmentally friendly and easy-to-handle material. This choice ensured the acquisition of high-quality data for the precise training of the neural network, which played a crucial role in the overall success of the project.

2.2. Theory

2.2.1. 3D Printing Errors

  • Warping: this is one of the most common problems in 3D printing. It causes the corners of prints to curl upward, degrading their appearance, and can even cause the print to pop off the heated bed and fail completely [4] (Figure 2a).
  • Spaghetti: this issue looks exactly like it sounds: a tangled mess of "spaghetti" on and around the print. It consists of extruded filament misplaced by the print head (extruder) because, at some point during the print, the object below moved or collapsed [5] (Figure 2b).
  • Stringing: fine threads of melted filament settle in places other than where they belong on the printed object. It usually results from incorrect settings that let filament keep oozing from the nozzle while the extruder travels to another location during FDM 3D printing [6] (Figure 2c).

2.2.2. 3D Printer Communication

Serial communication is a method of sending data one bit at a time as binary pulses, where a zero is represented by 0 V (logic LOW) and a one by 5 V (logic HIGH). In full-duplex mode, the transmitter and receiver can send and receive data simultaneously (Figure 3); in other words, it is two-way communication at the same time, of which a telephone call is a familiar everyday example [7].
G-code is simply a programming language for CNC (computer numerically controlled) machines such as 3D printers and CNC mills. It contains the set of commands the firmware uses to control the printer's operation and the print head's motion (Table 1). G-code for 3D printers is created by a special application called a slicer, which takes a 3D model and slices it into thin 2D layers. It then specifies the coordinates or path the print head must follow to build up these layers, and it also sets specific printer functions such as turning on the heaters, fans, cameras, etc. [8].
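To make this command flow concrete, here is a minimal Python sketch (using PySerial, the library employed in Section 2.3.2) that sends a G-code command from Table 1 to a Marlin-based printer and waits for the firmware's "ok" acknowledgment. The port name and baud rate are assumptions; an Ender 3 typically enumerates as /dev/ttyUSB0 at 115200 baud, and the firmware may emit "echo" lines before the final "ok".

```python
import serial
import time

def send_gcode(ser: serial.Serial, command: str) -> str:
    """Send one G-code command and return the printer's reply line."""
    ser.write((command.strip() + "\n").encode("ascii"))
    return ser.readline().decode("ascii", errors="replace").strip()

if __name__ == "__main__":
    # Assumed port and baud rate for an Ender 3-class printer
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=5) as ser:
        time.sleep(2)            # Marlin resets on connect; let it boot
        ser.reset_input_buffer() # discard the startup banner
        print(send_gcode(ser, "G28"))       # auto-home all axes (Table 1)
        print(send_gcode(ser, "M104 S200")) # set hotend temperature to 200 °C
```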

2.2.3. IoT Technologies—MQTT

MQTT is an OASIS standard messaging protocol for the Internet of Things (IoT). It is designed as an extremely lightweight publish/subscribe messaging transport that is ideal for connecting remote devices with a small code footprint and minimal network bandwidth (Figure 4). MQTT today is used in a wide variety of industries, such as automotive, manufacturing, telecommunications, oil and gas, etc. [9].
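As a minimal illustration of this publish/subscribe pattern, the following sketch uses the paho-mqtt Python client (1.x callback API); the broker address and topic names are assumptions chosen for illustration.

```python
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"           # assumed local broker address
COMMAND_TOPIC = "printer/command" # assumed topic names
STATUS_TOPIC = "printer/status"

def on_message(client, userdata, msg):
    # A command arriving from the user interface, e.g. "pause" or "home"
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()            # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(COMMAND_TOPIC)
client.publish(STATUS_TOPIC, "printer online")  # lightweight status report
client.loop_forever()
```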

2.2.4. YoloV5 Pretrained Model

The identification of objects in an image is considered a common assignment for the human brain, though it is not so trivial for a machine. The identification and localization of objects in photos is a computer vision task called "object detection", and several algorithms have emerged in the past few years to tackle the problem. One of the most popular algorithms to date for real-time object detection is YOLO (You Only Look Once) [10].
YOLOv5 is the world’s most loved vision AI, representing Ultralytics open-source research into future vision AI methods, incorporating the lessons learned and the best practices evolved over thousands of hours of research and development [11] (Figure 5).
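For reference, this is roughly how a custom-trained YOLOv5 model is loaded and queried through torch.hub, following the Ultralytics repository [11]; the weights file name and image path are assumptions.

```python
import torch

# Load custom-trained weights via the Ultralytics YOLOv5 hub entry point.
# "best.pt" is an assumed name for the weights produced by training.
model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")
model.conf = 0.5  # confidence threshold for reported detections

results = model("frame.jpg")           # accepts file paths or numpy images
detections = results.pandas().xyxy[0]  # boxes, confidences, class names
print(detections[["name", "confidence", "xmin", "ymin", "xmax", "ymax"]])
```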

2.3. Methods

2.3.1. Data Collection

For the model training, various images of commonly known calibration pieces for 3D printers were used. These pieces contain the necessary features to validate that a printer does not produce specific failures. In the case of the present project, they were used to generate the failures that need to be detected. Taking advantage of the wide range of errors that can be generated by poor parameterization, the model was trained with additional errors such as layer shifting, detachment, and the detection of human interaction on the print bed (Figure 6).
After obtaining a dataset of approximately 2500 images, labeling was performed using the LabelImg v1.8.6 software, a tool that allows for extracting the coordinates of the object to be detected within each image (Figure 7).
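When exported in YOLO mode, LabelImg writes one plain-text file per image, with one line per object: a class index followed by the box center and size, normalized to the image dimensions. A minimal reading sketch follows; the class list is an assumption based on the errors described above.

```python
# Assumed class order; the actual dataset's class file may differ.
CLASSES = ["warping", "spaghetti", "stringing", "layer_shifting", "detachment", "hand"]

def read_labels(path: str):
    """Parse one YOLO-format annotation file into (name, xc, yc, w, h) tuples."""
    boxes = []
    with open(path) as f:
        for line in f:
            class_id, xc, yc, w, h = line.split()
            boxes.append((CLASSES[int(class_id)],
                          float(xc), float(yc), float(w), float(h)))
    return boxes

print(read_labels("image_0001.txt"))  # assumed file name
```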

2.3.2. Control System

For the control system, the Python library PySerial was employed, which, when provided with the port and baud rate, automatically establishes the connection with the printer. Subsequently, using various programming techniques, the printer control logic was developed, enabling the management of temperatures, movement, and monitoring. A user interface was designed with VueJS, responsible for sending actions via MQTT from any computer to the device (Figure 8).
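A sketch of how such UI actions could be bridged to the printer: commands arriving over MQTT are mapped to the G-code commands of Table 1 and forwarded over the serial link. The broker address, topic name, and command vocabulary are assumptions, and no "ok" handshaking is shown.

```python
import serial
import paho.mqtt.client as mqtt

# Assumed mapping from UI actions to the G-code of Table 1
GCODE_MAP = {
    "pause": "M0",
    "home": "G28",
    "emergency_stop": "M112",
}

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=5)  # assumed port

def on_message(client, userdata, msg):
    action = msg.payload.decode().strip()
    gcode = GCODE_MAP.get(action)
    if gcode:
        ser.write((gcode + "\n").encode("ascii"))  # forward to the printer

client = mqtt.Client()
client.on_message = on_message
client.connect("192.168.1.10", 1883)   # assumed broker address
client.subscribe("printer/command")    # assumed topic
client.loop_forever()
```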
In this same control system, the trained error detection model was implemented. Using the OpenCV and PyTorch libraries, real-time detections were performed during printing. Because continuous video transmission of the printing process would demand high bandwidth and low latency, that option was replaced by the computer vision system, and the capability to capture a single photo of the print on demand was added instead. Additionally, the option to send the model's detection images through the Telegram API was included. All notifications can also be viewed in the user interface as console messages.
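The photo-capture and notification path could look roughly like the following sketch: one frame is grabbed with OpenCV, the detector is run on it, and, if an error is found, an annotated photo is pushed through the Telegram Bot API's sendPhoto method. The bot token, chat ID, weights file, and camera index are placeholders.

```python
import cv2
import requests
import torch

BOT_TOKEN = "<telegram-bot-token>"  # placeholder
CHAT_ID = "<chat-id>"               # placeholder

model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")

cap = cv2.VideoCapture(0)           # the USB camera of Figure 1c
ok, frame = cap.read()
cap.release()

if ok:
    # OpenCV delivers BGR; the model expects RGB
    results = model(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if len(results.xyxy[0]):        # at least one detection above threshold
        annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
        cv2.imwrite("detection.jpg", annotated)
        with open("detection.jpg", "rb") as photo:
            requests.post(
                f"https://api.telegram.org/bot{BOT_TOKEN}/sendPhoto",
                data={"chat_id": CHAT_ID, "caption": "Print error detected"},
                files={"photo": photo},
            )
```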

3. Results

Through the conducted detection tests, an understanding of the model's behavior and of the environmental conditions needed for its optimal performance was gained. The matrix in Figure 9 shows that the model performs well in terms of error detection accuracy, particularly for "spaghetti", where it demonstrated 89% accuracy and only a 1% chance of misidentifying the error as another type, such as detachment. This was the only instance where the model showed slight confusion with another error type, and it can be explained by the fact that whenever detachment occurs, spaghetti is also present, since the piece keeps being printed in mid-air. Because the spaghetti images were labeled under circumstances where detachment was always present, this overlap does not represent a true misclassification with respect to the warnings such detections will generate.
For this analysis, various errors were intentionally generated with the device in operation so that the detected images could be compared with the actual errors. Although the model is equipped to detect additional error types, the analysis focused solely on those within the project's scope, since they had the largest number of training images.
To evaluate the results of the developed neural network’s detections, various methods were employed for metric analysis. In addition to the confusion matrix used in the previous analysis, the intersection over union analysis was also employed (Figure 10). Since the trained model utilizes bounding boxes to display detections, these boxes can be compared to the actual image to obtain a percentage of accuracy (Figure 11).
The IoU is the ratio of the area shared by the two bounding boxes (their intersection) to the area covered by their union:

IoU = Area(A ∩ B)/Area(A ∪ B)

If the IoU is less than 0.5, the detection is considered a false positive.
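A direct implementation of this metric for two axis-aligned boxes given as (x_min, y_min, x_max, y_max):

```python
def iou(box_a, box_b) -> float:
    """Intersection over union of two axis-aligned bounding boxes."""
    # Intersection rectangle (empty if the boxes do not overlap)
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # 400/2800 ≈ 0.143, a false positive
```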

4. Discussion

The device has demonstrated its capability to fulfill all previously established functions. With the successful design of the control strategy, it enabled the real-time pausing and halting of the printer, as well as axis control, monitoring, and temperature adjustment. Through the integration of the control strategy with the computer vision system, error correction was facilitated, preventing material and time losses. Validation tests showcased the developed device’s high proficiency in issuing alerts, enabling users to reliably halt failed prints. This accomplishment is particularly valuable in production and manufacturing environments, where efficiency and quality are paramount.
Future research should explore addressing detachment and layer-shifting errors more specifically, as they currently pose challenges for real-time correction. Investigating additional strategies, such as extruder and fan speed control, could enhance the device's versatility and correction capabilities.

5. Conclusions

This project has culminated in a highly effective remote-control device for the real-time detection and correction of errors in FDM 3D printers. By implementing a Raspberry Pi computer and training a neural network with the pretrained YOLOv5 model using a dataset of self-authored images, it successfully detected errors such as warping, spaghetti, and stringing, as well as additional ones such as detachment and layer shifting. Furthermore, the detection of human interaction through hand detection adds an extra level of supervision and control during the printing process.
In summary, this project has achieved a significant milestone in improving the FDM 3D printing process while opening the door to future research and enhancements promising greater efficiency and precision in additive manufacturing. The ongoing expansion and refinement of the remote-control device could have a significant impact on the 3D printing industry.

Author Contributions

Conceptualization, H.R. and K.P.; methodology, H.R. and C.D.; software, H.R. and K.P.; validation, C.D., H.R. and J.C.; formal analysis, C.D., H.R. and J.C.; investigation, H.R. and K.P.; resources, H.R. and K.P.; data curation, H.R.; writing—original draft preparation, H.R. and C.D.; writing—review and editing, H.R.; visualization, H.R.; supervision, C.D. and J.C.; project administration, H.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets presented in this article are not readily available because they form part of a minimum viable product currently under development. Requests to access the datasets should be directed to henry.requena@uac.edu.co.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Dassault Systemes. Available online: https://www.3ds.com/es/make/guide/process/3d-printing (accessed on 30 August 2023).
  2. Baumann, F.; Roller, D. Vision based error detection for 3D printing processes. MATEC Web Conf. 2016, 59, 7.
  3. Lanner. Available online: https://www.lanner-america.com/es/blog-es/solucion-de-monitorizacion-remota-para-impresoras-3d/ (accessed on 30 August 2023).
  4. 3DSOURCED. 25 October 2023. Available online: https://www.3dsourced.com/rigid-ink/3d-prints-warping-curling-how-to-prevent/ (accessed on 11 November 2023).
  5. Prusa. Prusa Research. 2021. Available online: https://help.prusa3d.com/article/spaghetti-monster_1999 (accessed on 30 August 2023).
  6. 3D Natives. 5 December 2022. Available online: https://www.3dnatives.com/en/stringing-3d-printing-051220226/#! (accessed on 10 August 2023).
  7. Sudrajat. Fully Instrumented. Available online: https://www.fullyinstrumented.com/serial-communication/ (accessed on 10 August 2023).
  8. Dwamena, M. 3DPrinterly. 27 August 2022. Available online: https://3dprinterly.com/ultimate-marlin-g-code-guide-how-to-use-them-for-3d-printing/ (accessed on 10 August 2023).
  9. MQTT. mqtt.org. 2022. Available online: https://mqtt.org/ (accessed on 11 November 2023).
  10. Arie, L.G. TowardsDataScience.com. 14 March 2022. Available online: https://towardsdatascience.com/the-practical-guide-for-object-detection-with-yolov5-algorithm-74c04aac4843 (accessed on 11 November 2023).
  11. Ultralytics. GitHub. 22 November 2022. Available online: https://github.com/ultralytics/yolov5 (accessed on 11 November 2023).
Figure 1. (a) Ender 3 3D printer; (b) Raspberry Pi 4 Model B with 4 GB RAM; (c) manually focused USB camera.
Figure 2. Images of 3D printing errors: (a) warping; (b) spaghetti; (c) stringing.
Figure 3. Diagram of the 3D printer's full-duplex serial communication.
Figure 4. MQTT publish/subscribe architecture.
Figure 5. Performance of YOLOv5 pretrained models.
Figure 6. Images capturing errors generated for training an object detection model.
Figure 7. Image labeling with LabelImg.
Figure 8. User interface for 3D printer control.
Figure 9. Trained model confusion matrix.
Figure 10. Intersection over union.
Figure 11. Intersection over union analysis: (a) warping; (b) stringing; (c) spaghetti.
Table 1. G-code commands for 3D printers.

Command        Function
M0             Pause
M104           Hotend temperature
M140 and M190  Bed temperature (set; set and wait)
M112           Emergency stop
G0             Movement
G28            Auto-home