Article

Augmented Reality Glasses Applied to Livestock Farming: Potentials and Perspectives

Department of Agricultural Sciences, University of Sassari, Viale Italia 39/A, 07100 Sassari, Italy
*
Author to whom correspondence should be addressed.
AgriEngineering 2024, 6(2), 1859-1869; https://doi.org/10.3390/agriengineering6020108
Submission received: 10 May 2024 / Revised: 7 June 2024 / Accepted: 17 June 2024 / Published: 20 June 2024

Abstract

In the last decade, Smart Glasses (SG) and augmented reality (AR) technology have gained considerable interest in all production sectors. In the agricultural field, an SG can be considered a valuable device to support farmers and agricultural operators. SGs can be distinguished by technical specification, type of display, interaction system, and specific features. These aspects can affect their integration into farms, influencing users’ experience and the consequent level of performance. The aim of the study was to compare four SGs for AR with different technical characteristics to evaluate their potential integration in agricultural systems. This study analyzed the capability of QR code reading in terms of distance and time of visualization, the audio–video quality of image streaming during conference calls and, finally, the battery life. The results showed different levels of performance in QR code reading for the selected devices, while the audio–video quality in conference calls demonstrated similar results for all the devices. Moreover, the battery life of the SGs ranged from 2 to 7 h per charge cycle, and it was influenced by the type of usage. The findings also underlined the potential use and integration of SGs to support operators during farm management. Specifically, SGs might enable farmers to obtain fast and precise augmented information using markers placed at different points on the farm. In conclusion, the study highlights how the different technical characteristics of SG represent an important factor in the selection of the most appropriate device for a farm.

1. Introduction

Augmented reality (AR) consists of the enhancement of real objects and physical items with digital content that can help people work more productively or live more comfortably [1,2,3]. Several devices, such as smartphones, tablets, and laptops, can be used to visualize AR information, but Smart Glasses (SGs) are devices specifically suited for professional AR applications [4]. Recently, a wide range of SGs has been developed, largely thanks to the extensive use of smartphones, which has reduced the cost of microelectronic components [5,6]. Essentially, SGs are miniaturized head-mounted computers connected to a small display placed in front of the user's eyes [7]. Moreover, SGs can vary in terms of the operating system (Android, Windows, etc.), input methods (hand gestures, external joypad or trackpad, voice recognition, etc.), user interface, technical features (camera resolution, weight, processor, etc.), and cost. Nevertheless, one of their main characteristics is the visualization system, which can be an optical see-through display (a semi-reflective or semi-transparent surface onto which the virtual content is projected) or a video see-through display (a screen where the augmented information is shown, blended with the real-world image captured by a video camera) [8,9]. The overlay of AR content on the real environment can be achieved in two main ways: using markers (QR codes, barcodes, etc.) or specific features of the item to track the AR content (vision-based), or using inertial, electromagnetic, or ultrasonic sensors (GPS, gyroscope, accelerometer, etc.) to position the digital elements (sensor-based) [10,11]. The largest number of publications on AR concern the research and development of AR technologies for the healthcare, education, and industrial sectors [12,13].
The growing number of scientific papers published in the last five years (+400%) related to AR in agriculture highlights the increasing interest in the use of this new technology in the agricultural field. One of the first such studies developed a system for displaying NDVI and geo-referenced yield data on vineyards [14]. Other applications have involved the implementation of AR systems for tractor driving assistance [15], the control of two autonomous agricultural machines using an AR device [16], and the use of mobile-based AR applications for Abalone growth monitoring [17], to promote sustainable development in hydroponic cultivation [18], and for plot diagnosis activities in vineyards [19]. The benefits of adopting SGs for AR in the livestock sector were shown in recent studies testing systems to support operators in farm management [20], for the visualization of the production data of dairy sheep in the milking parlor [21], for monitoring the ventilation system in the barn [22], and for the visualization of specific animal data on sheep farms [23]. Although some scientific studies and applications of AR and SGs in agriculture have been identified, the adoption of this technology by farmers is still limited. This is mostly because SGs are an emerging technology and few specific applications have been developed for the agricultural domain. Also, the operating performance of SGs and their current integration into farms are still largely unknown. Thus, it is important to evaluate how these devices might be combined with on-farm activities. In fact, recent studies have confirmed that these technologies represent a promising solution in the digitalization of modern farms [19,20,21]. Specifically, SGs require further investigation to support the wider use of these systems.
Therefore, the evaluation of SG performances (detection of objects to be augmented, remote assistance, quality of the images displayed) highlights how different devices might be integrated into an agricultural environment and can help manufacturers develop specific SGs for the agricultural domain. In fact, different technical characteristics and functions may correspond to different ways of use and situations. For example, some specifications, such as the battery capacity, must be considered when buying a device for certain tasks on the farm. Also, the overall performance (speed of functions, bugs, and crashes) could strongly influence the operator's work efficiency. Thus, the aims of this study were to compare four types of SGs for AR that are representative of the market for this technology in terms of specifications and performances, and to evaluate their potential uses and technical advantages for integration on farms.

2. Materials and Methods

In the present study, the four different SGs adopted for AR (Figure 1) are as follows: the HoloLens 2 (HL), produced by Microsoft (Redmond, WA, USA); the M400 (M400) by Vuzix (New York, NY, USA); the Moverio BT-300 (BT300) by Seiko Epson (Suwa, Japan); and the GlassUp F4 (F4), produced by the GlassUp company (Modena, Italy). The SGs were selected to compare devices with different features, such as the visualization system, processor, and operating system. The HL and the BT300 have a binocular transparent optical see-through display, the M400 has a monocular video see-through display, and the F4 has a monocular transparent optical see-through display. The SGs tested also differ in their input systems, such as hand recognition (HL), touchpad (BT300, F4), and key buttons (M400), as well as in frame design.
The four selected SGs share similar functionalities but have distinctive technical characteristics (Table 1). All the SGs are intended for professional use, such as in the industrial sector, and were tested for agricultural-related applications. The performances of the devices were compared with specific tests, which were carried out at the laboratory of the Department of Agricultural Sciences, University of Sassari (Italy), concerning the main agricultural activities in which the studied devices could be integrated. These experiments represent an advancement and an expansion of the methodology used in Sara et al. (2021) [24] and involved the common functions of the SGs, such as AR marker scanning (QR codes) and audio–video transmission. Moreover, the battery duration was monitored during the tests. These characteristics were chosen since they are the main factors that could limit the use of SG in livestock environments.

2.1. Scanning Test of Augmented Reality Markers

The basic scan functionality, common to all tested devices, was used to compare the performances of the SGs in detecting augmented markers and subsequently showing digital information overlaid on the real world. For this, a QR code embedded with the farm information was used. Two different QR code trials were carried out to assess the scanning time and the scanning distance. The scanning time trial consisted of measuring the time interval needed to display the AR contents integrated into the marker on the SGs. The time frame included the tasks ranging from the scan function activation to data visualization on the SG display. Specifically, the scan time was measured by an operator who started timing upon the activation of the SG's scan function and stopped once the wearer visualized the AR information. The scans were carried out at a distance of 40 cm from the QR codes, which were placed on a vertical plane (wall), with the SG wearer in front of the marker. The scanning distance trials involved measuring the maximum distance at which the SG was able to detect the QR codes. In these trials, three QR code sizes were adopted: 3.5 × 3.5 cm (3.5), 4 × 4 cm (4), and 7.5 × 7.5 cm (7.5). The size range of the markers was based on previous methodologies that underline a strong linear relationship between the marker size and the scanning distance [20,24,25].
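The timing procedure above reduces to collecting per-replication intervals (scan activation to AR visualization) and summarizing them per device and marker size. A minimal sketch in Python, with purely hypothetical values for illustration:

```python
from statistics import mean, stdev

def scan_time_stats(samples_s):
    """Descriptive statistics for a series of scan-time measurements (s),
    each taken from scan-function activation to AR-content visualization."""
    return {
        "mean": mean(samples_s),
        "sd": stdev(samples_s),   # sample standard deviation
        "min": min(samples_s),
        "max": max(samples_s),
    }

# Hypothetical scan times (s) for one device and one marker size
times = [1.6, 1.8, 1.7, 1.9, 1.5]
stats = scan_time_stats(times)
```

The same summary, computed per device, is what a table of scanning-time results would report.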

2.2. Remote Transmission Quality Assessment

The quality of the audio–visual contents transmitted from the SGs to a remote device (PC) was evaluated with two tests. These trials were carried out using VoIP (Voice over Internet Protocol) call apps, available on all the devices considered. For the trials, the SGs and the laptop were connected to the University's Wi-Fi network (20 Mbps upload/download) with a shared video resolution of 640 × 480 pixels (VGA).
The tests were carried out by two operators (SG wearer, PC operator) located in two different buildings of the Department of Agricultural Sciences. The first test measured the lag time between the SG and the laptop in order to evaluate the delay in the transmission of video and audio contents, simulating a remote assistance procedure on farms (e.g., maintenance of a milking machine, tractors, etc.). The lag time was measured by synchronizing the clocks of the two operators (SG wearer and PC operator) and recording the time frame between the emission and reception of predetermined contents [26]. Specifically, the emission and reception times of a specific position and of a set vocal signal were recorded for video and audio lag time detection, respectively. The second test assessed the observable level of detail of the images transmitted through the SGs. This trial highlighted whether, in a remote video call, the farmer would be able to show the remote operator a specific detail (e.g., particular components of a machine undergoing maintenance, plants affected by phytopathology, animal injuries, etc.). The visual acuity tests were performed using a standard Snellen chart, composed of eleven lines of letter blocks of decreasing size. The printed chart was scanned with an SG at a 50 cm distance, and the receiving operator read the transmitted chart image on the laptop's 16-inch screen. The results of the test were recorded along with the decreasing size of the characters [26]. The lag times were measured with 20 replications, whereas the Snellen chart tests were performed with 3 replications for each operator.
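The lag-time computation described above can be sketched as follows, assuming synchronized clocks; the timestamp values are hypothetical, purely for illustration:

```python
def lag_times(emit_ts, recv_ts):
    """Per-event transmission delays (s) from synchronized clocks:
    emit_ts[i] is when the SG wearer produced the cue (set position or
    vocal signal), recv_ts[i] when the PC operator observed it."""
    return [r - e for e, r in zip(emit_ts, recv_ts)]

# Hypothetical audio cues, expressed as seconds on the common clock
emitted  = [10.00, 20.00, 30.00]
received = [10.45, 20.40, 30.50]
delays = lag_times(emitted, received)
avg = sum(delays) / len(delays)
```

Averaging the 20 replications per device in this way yields the mean audio and video delays reported in the results.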

2.3. Smart Glasses Battery Life Test

The battery life was monitored throughout the trials to evaluate the autonomy of the SGs over one charge cycle. These trials were designed to understand whether the operating time of the SGs would enable on-farm activities to be accomplished on a single charge cycle. Two working situations were evaluated: SGs exclusively running the scan function, and SGs running mixed functions (video calling, scanning, etc.). In each trial, the SGs were used continuously, avoiding standby. The battery level was checked every 30 min on the SG display, where the percentage of residual charge was shown on all four devices tested.
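Under the assumption of an approximately linear discharge, the periodic 30-min readings can be extrapolated to a full-charge autonomy estimate. A minimal sketch with hypothetical readings (not the study's measured values):

```python
def estimated_autonomy_h(levels_pct, interval_min=30):
    """Linear estimate of full-charge autonomy (hours) from periodic
    battery-level readings taken every `interval_min` minutes."""
    drop = levels_pct[0] - levels_pct[-1]           # % consumed over the trial
    elapsed_h = (len(levels_pct) - 1) * interval_min / 60
    rate = drop / elapsed_h                         # discharge rate, % per hour
    return 100 / rate                               # hours for a full 100% charge

# Hypothetical readings: 100% -> 70% over 1.5 h of continuous mixed use
readings = [100, 90, 80, 70]
autonomy = estimated_autonomy_h(readings)   # -> 5.0 h
```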

2.4. Statistical Analysis

Descriptive statistics were calculated for the QR code performance tests and for the lag time tests. Since the data did not meet parametric assumptions, the Kruskal–Wallis rank-sum test and post hoc multiple comparisons (p < 0.01) were used to compare treatments for the scanning time and audio–video transmission quality data. The analyses were performed using RStudio (ver. 2023.12.1 build 402) [27].
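For illustration only (the actual analysis was performed in R), the Kruskal–Wallis H statistic underlying this test can be sketched in pure Python. This version assigns average ranks to ties but omits the tie-correction factor that standard implementations apply before computing the p-value:

```python
def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic; `groups` is a list of lists of
    measurements, one inner list per treatment (e.g., per device)."""
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    # Average rank for each distinct value (handles ties)
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2   # mean of ranks i+1 .. j
        i = j
    # H = 12/(n(n+1)) * sum(R_i^2 / n_i) - 3(n+1)
    return 12 / (n * (n + 1)) * sum(
        sum(ranks[v] for v in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
```

The H statistic is then compared against a chi-squared distribution with (number of groups − 1) degrees of freedom.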

3. Results and Discussion

3.1. QR Code Scanning Performance

The QR code scanning trials represent a primary step in understanding the capability of the SGs to provide farmers with AR information in the field. Several production units (crops, animals, orchards) could be provided with markers, and in several farm contexts (open field, milking parlor, greenhouse). Furthermore, the use of QR codes in agriculture is usually coupled with the traceability of food products [28,29], showing potential applications in providing accurate information for the logistics of products through supply chains [30], for the storage of animal husbandry records [31], for the management of vegetable production [32], and for milking machine inspections through AR instructions [25]. The four SGs were compared, and each device showed a different performance in the detection of QR codes, with different speeds and distances. Table 2 shows the results of the QR code scanning times. Overall, the best performance was recorded for the HL, which required an average of 1.71 s to show the augmented farm information on the SG display. Moreover, the QR size influenced the scanning time: for the 4 cm marker, the scanning times were the lowest, with an average of 1.24 s. For the BT300, the average scanning time was 5.67 s, with a difference found for the 7.5 cm code (4.71 s). The M400 showed a scanning time of about 1.98 s, and a significant difference was found for the 7.5 cm QR code size (1.86 s). Finally, the F4 showed an average scanning time of 9.33 s, with a statistical difference only between the 3.5 cm size and the other QR code sizes. These results are in accordance with the technical characteristics of the tested SGs. In fact, as shown in Table 1, the HL has superior technical characteristics (processor, camera resolution, OS version) compared to all the other selected devices.
The time required to display the augmented information on an SG in an agricultural context represents one of the factors that might influence a farmer's performance and productivity. The ability of the SGs to display specific AR information in a short time, 4.67 s on average, allows these devices to be implemented in farm routines, increasing work efficiency and precision (e.g., selecting animals during milking sessions). In this way, the SGs can support the farmer in identifying food stocks to be selected for the preparation of the animal diet and also in identifying animals with specific information (milk yield, health status, etc.) [20,21]. Figure 2 shows the results of the QR code scanning distances. This test highlights the distance at which augmented contents might be detectable through specific markers on the farm while using the different AR devices. The highest scanning distances were measured for the M400. In fact, with this device, it was possible to gather the augmented information at about 2 m using a 4 cm marker. Likewise, a 7.5 cm QR code could be scanned at about 3 m away. Moreover, a strong linear trend between scanning distance and QR code size was observed for all the SGs and QR code types. Overall, the results of these tests allow us to explore whether this function and the QR code markers are suitable for different agricultural applications. In fact, high scanning times or inappropriate scanning distances would limit the diffusion of this technology in the agricultural sector. For these reasons, marker sizes should be specifically tailored to different agricultural contexts (e.g., markers to be placed on plants, greenhouse pallets, animals, food products, raw materials, machinery, machine components, etc.). The marker size influences the scanning distance and scanning time, which are noteworthy aspects considering the dimensions of agricultural environments and farm productivity.
Specifically, the maximum scanning distance was tested to understand whether the marker size is adequate for reading in farms' specific working places (e.g., between the rows of benches in a greenhouse, in the milking parlor, in the compartments of agricultural machinery, etc.). Moreover, the standard lighting conditions adopted during the QR code scanning tests should not have influenced the results obtained in this study, since marker-based methods are considered easily identifiable under changing lighting conditions [10].
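The strong linear relationship between marker size and maximum scanning distance noted above can be exploited to size a marker for a target reading distance. A least-squares sketch with hypothetical size–distance pairs loosely based on the M400 figures reported in the text (the 3.5 cm value and the 10 cm extrapolation are ours, for illustration only):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# (marker size cm, max scanning distance m): 4 cm -> ~2 m and
# 7.5 cm -> ~3 m match the M400 results; 3.5 cm is hypothetical
sizes     = [3.5, 4.0, 7.5]
distances = [1.8, 2.0, 3.1]
slope, intercept = fit_line(sizes, distances)
predicted_10cm = slope * 10 + intercept   # extrapolated distance for a 10 cm marker
```

On these numbers the fit predicts that a 10 cm marker would be readable at roughly 4 m, which illustrates how marker sizes could be tailored to, e.g., greenhouse rows or milking parlor stalls.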

3.2. Audio and Video Quality Performance

The test on the lag times recorded during video calls between the SG and a laptop highlighted that, with the exception of the F4, all the devices had delays of less than one second for both audio and video transmission. Specifically, for audio transmission, the HL required less time (0.26 s) on average than the BT300, M400, and F4, with 0.43 s, 0.44 s, and 1.24 s, respectively. Considering the video transmission, no significant difference was found between the HL and BT300, with 0.51 s and 0.55 s of delay, respectively, whereas the M400 (0.91 s) and F4 (2.21 s) showed significantly higher delays. The short lag times obtained for most of the SGs tested should allow the farmer to share their point of view with technicians (veterinarian, agronomist, mechanic, etc.) during maintenance procedures on farm machinery. Moreover, farmers could be guided through these activities in real time in an interactive way.
Overall, the SGs tested allowed users to clearly distinguish characters greater than or equal to 9 mm in size in the Snellen test (Figure 3). The HL and BT300 showed a better performance, allowing users to clearly distinguish characters greater than or equal to 7 mm. Moreover, the HL enabled users to distinguish 2 mm characters in 63% of cases, unlike the BT300 (43%), the M400 (39%), and the F4 (31%). These outcomes confirm the capability and feasibility of these devices in remote assistance operations. High-quality audio and video data would enable the implementation of SGs for AR in agricultural tasks, increasing the potential for their adoption by farmers. Moreover, the aspects characterizing agricultural environments, such as remote locations, limited or difficult access routes, and the reduced number of specialized maintenance workers per farm, are the main driving factors for using SGs as a support tool for specialized remote assistance. Furthermore, the field of view (the area where AR contents are shown to users) and the visualization system (optical or video) might influence the use of SGs in the farming work environment. As reported by [8], a field of view of at least 20 degrees, as disclosed for the HL and BT300, is necessary in an industrial working context to avoid discomfort for users and limitations on the amount of AR information visible [33]. Concerning the visualization system, optical see-through displays allow wearers to see the real world directly, in both monocular (F4) and binocular (HL, BT300) SGs. On the other hand, SGs equipped with a video see-through display (M400) have disadvantages due to the latency of the reproduced real-world view, especially for binocular devices. These drawbacks, linked to the nature of this system, might raise concerns for the safety of users, as they create a blind spot in the operator's real field of view [8].
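The Snellen results can also be read in angular terms: a character of height h viewed at distance d subtends a visual angle of 2·atan(h/2d). A sketch computing this angle for the smallest character clearly resolved through the HL/BT300 streams (7 mm at the 50 cm scanning distance); the arcminute framing is our illustration, not part of the study:

```python
import math

def angular_size_arcmin(height_m, distance_m):
    """Visual angle (arcminutes) subtended by an object of the given
    height at the given viewing distance: 2*atan(h / 2d)."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m))) * 60

# 7 mm character at 50 cm: smallest size clearly resolved via HL/BT300
arc = angular_size_arcmin(0.007, 0.50)
```

The result is roughly 48 arcmin; since a standard 20/20 Snellen optotype subtends about 5 arcmin, the transmitted VGA stream resolves detail roughly an order of magnitude coarser than direct normal vision, which is still sufficient for the maintenance-inspection scenarios described above.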

3.3. Smart Glasses Battery Life Test

Figure 4 presents the results of the SGs' battery life. All the devices had shorter battery lives in the mixed-use situation than in the scanning-only activities. Specifically, the BT300 and M400, compared to the HL and the F4, showed greater differences in battery autonomy between the usage types (21 and 20 min, respectively). The F4 showed the best performance in terms of battery life, with an autonomy of about 6.91 h. This was probably due to a set of characteristics that differentiate the F4, such as its simpler user interface, operation mode, etc., which could influence energy consumption [33,34]. Considering the results obtained, the battery life of all the tested SGs would allow the accomplishment of on-farm tasks, but the work timing should be carefully considered. Overall, the devices tested might support the farmer during specific tasks, e.g., grouping and selecting animals, feed preparation, and remote assistance in the field [20,21]. However, the length of time required per task is extremely variable, as these types of activities are influenced by multiple factors (e.g., farm size, number of animals, workforce involved, etc.).

3.4. Challenges and Future Perspectives

Overall, SGs are promising and innovative devices, especially when implemented in production and professional contexts. However, several challenges need to be addressed to support their wider use in the farming domain. In fact, the availability of specific software for agricultural contexts (e.g., to help farmers in the management of feedstocks, animals, crops, operating machines, and tractors) is still limited. Moreover, another challenge will be to enhance the interoperability of SGs with other smart tools available on the farm, also using IoT technologies, as reported by [35]. The interconnection between SGs and smart sensors could greatly enhance the AR experience, providing insight into the real-time condition of animals, such as their temperature, anomalous behaviors, or position. Further challenges include the cost of Smart Glasses and the need for adequate training and support for farmers. Nowadays, numerous ready-to-use SG models are available on the market; nevertheless, it is important to know their specific performance levels and capabilities in order to properly integrate them into agricultural production contexts. In addition, for each device tested, a comparison of the strengths and weaknesses for agricultural use, in relation to their performances and technical features, is shown in Table 3. Another important aspect to consider is the robustness of SGs. Although a multitude of devices are marketed for professional use, their level of robustness should be carefully evaluated prior to adoption in agricultural contexts.
Finally, to encourage the spread of these technologies in the agricultural domain, it is necessary to extend and improve internet access in rural areas, since limited connectivity is one of the main concerns and could further widen the digital divide [36]. In fact, the availability of a high-speed internet connection is essential to optimizing the performance of these technologies.

4. Conclusions

In the present study, four different SGs for AR were compared and tested on their main available functionalities (e.g., scanning codes, video calls, etc.) for use in an agricultural context. Overall, the results showed that SGs might enable farmers to obtain fast and precise augmented information through the use of markers placed at different points on the farm. However, the SGs adopted in this study presented a wide range of performances, especially in the detection of QR codes with different printed sizes and amounts of encoded information. The quality of the audio–visual transmission allowed for the discrimination of small details in remote assistance applications. Moreover, the battery autonomy of the SGs ranged from 2 to 7 h per charge cycle, with the type of use influencing the battery life. This study highlights the potential use and integration of SGs to support farm management. In future studies, the authors will focus on the development of specific systems for the agricultural domain that aim to integrate and manage farm data and support agricultural operators by providing timely information through augmented reality devices.

Author Contributions

Conceptualization, G.S., G.T. and M.C.; methodology, G.S., G.T. and D.P.; formal analysis, G.S. and D.P.; investigation, G.S. and D.P.; data curation, G.S., G.T. and D.P.; writing—original draft preparation, G.S. and D.P.; writing—review and editing, G.S., G.T., D.P. and M.C.; visualization, G.T.; supervision, G.T. and M.C.; project administration, G.T. and M.C.; funding acquisition, M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This study was carried out within the Agritech National Research Center and received funding from the European Union Next-GenerationEU (PIANO NAZIONALE DI RIPRESA E RESILIENZA (PNRR)—MISSIONE 4 COMPONENTE 2, INVESTIMENTO 1.4—D.D. 1032 17/06/2022, CN00000022). This manuscript reflects only the authors’ views and opinions, neither the European Union nor the European Commission can be considered responsible for them.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original data presented in the study are openly available in Mendeley at [https://doi.org/10.17632/fzgv7rwx9g.1].

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent advances in augmented reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47. [Google Scholar] [CrossRef]
  2. Azuma, R. A survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  3. Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
  4. Klinker, K.; Wiesche, M.; Krcmar, H. Digital Transformation in Health Care: Augmented Reality for Hands-Free Service Innovation. Inf. Syst. Front. 2020, 22, 1419–1431. [Google Scholar] [CrossRef]
  5. Lee, L.-H.; Hui, P. Interaction Methods for Smart Glasses: A Survey. IEEE Access 2018, 6, 28712–28732. [Google Scholar] [CrossRef]
  6. Höllerer, T.; Feiner, S. Mobile Augmented Reality. In Telegeoinformatics: Location-Based Computing and Services; Taylor & Francis Books Ltd.: Abingdon, UK, 2004; p. 21. [Google Scholar]
  7. Kim, S.; Nussbaum, M.A.; Gabbard, J.L. Influences of augmented reality head-worn display type and user interface design on performance and usability in simulated warehouse order picking. Appl. Ergon. 2019, 74, 186–193. [Google Scholar] [CrossRef] [PubMed]
  8. Syberfeldt, A.; Danielsson, O.; Gustavsson, P. Augmented Reality Smart Glasses in the Smart Factory: Product Evaluation Guidelines and Review of Available Products. IEEE Access 2017, 5, 9118–9130. [Google Scholar] [CrossRef]
  9. Billinghurst, M.; Clark, A.; Lee, G. A Survey of Augmented Reality. Found. Trends Hum.-Comput. Interact. 2015, 8, 73–272. [Google Scholar] [CrossRef]
  10. Chatzopoulos, D.; Bermejo, C.; Huang, Z.; Hui, P. Mobile Augmented Reality Survey: From Where We Are to Where We Go. IEEE Access 2017, 5, 6917–6950. [Google Scholar] [CrossRef]
  11. Oufqir, Z.; Abderrahmani, A.E.; Satori, K. From marker to markerless in augmented reality. In Embedded Systems and Artificial Intelligence; Springer: Singapore, 2020; pp. 599–612. [Google Scholar]
  12. Muñoz-Saavedra, L.; Miró-Amarante, L.; Domínguez-Morales, M. Augmented and virtual reality evolution and future tendency. Appl. Sci. 2020, 10, 322. [Google Scholar] [CrossRef]
  13. Szajna, A.; Stryjski, R.; Woźniak, W.; Chamier-Gliszczyński, N.; Kostrzewski, M. Assessment of augmented reality in manual wiring production process with use of mobile AR glasses. Sensors 2020, 20, 4755. [Google Scholar] [CrossRef] [PubMed]
  14. King, G.R.; Piekarski, W.; Thomas, B. ARVino—Outdoor augmented reality visualisation of viticulture GIS data. In Proceedings of the Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR’05), Vienna, Austria, 5–8 October 2005. [Google Scholar]
  15. Santana-Fernández, J.; Gómez-Gil, J.; DelPozo-SanCirilo, L. Design and Implementation of a GPS Guidance System for Agricultural Tractors Using Augmented Reality Technology. Sensors 2010, 10, 10435–10447. [Google Scholar] [CrossRef] [PubMed]
  16. Huuskonen, T.; Oksanen, J. Augmented reality for supervising multirobot system in agricultural field operation. IFAC-PapersOnLine 2019, 52, 367–372. [Google Scholar] [CrossRef]
  17. Thomas, N.; Ickjai, L. Using mobile-based augmented reality and object detection for real-time Abalone growth monitoring. Comput. Electron. Agric. 2023, 207, 107744. [Google Scholar]
  18. Garzon, J.; Baldiris, S.; Acevedo, J.; Pavon, J. Augmented Reality-based application to foster sustainable agriculture in the context of aquaponics. In Proceedings of the 2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT), Tartu, Estonia, 6–9 July 2020; pp. 316–318. [Google Scholar]
  19. Larbaigt, J.; Lemercier, C. An Evaluation of the Acceptability of Smart Eyewear for Plot Diagnosis Activity in Agriculture. Ergon. Des. Q. Hum. Factors Appl. 2021, 31, 32–39. [Google Scholar] [CrossRef]
  20. Caria, M.; Sara, G.; Todde, G.; Polese, M.; Pazzona, A. Exploring smart glasses for augmented reality: A valuable and integrative tool in precision livestock farming. Animals 2019, 9, 903. [Google Scholar] [CrossRef] [PubMed]
  21. Caria, M.; Todde, G.; Sara, G.; Piras, M.; Pazzona, A. Performance and usability of smartglasses for augmented reality in precision livestock farming operations. Appl. Sci. 2020, 10, 2318. [Google Scholar] [CrossRef]
  22. Luyu, D.; Yang, L.; Ligen, Y.; Weihong, M.; Qifeng, L.; Ronghua, G.; Qinyang, Y. Real-time monitoring of fan operation in livestock houses based on the image processing. Expert Syst. Appl. 2023, 213, 118683. [Google Scholar]
  23. Pinna, D.; Sara, G.; Todde, G.; Atzori, A.S.; Artizzu, V.; Spano, L.D.; Caria, M. Advancements in combining electronic animal identification and augmented reality technologies in digital livestock farming. Sci. Rep. 2023, 13, 18282. [Google Scholar] [CrossRef] [PubMed]
  24. Sara, G.; Todde, G.; Polese, M.; Caria, M. Evaluation of smart glasses for augmented reality: Technical advantages on their integration in agricultural systems. In Proceedings of the European Conference on Agricultural Engineering AgEng2021, Évora, Portugal, 4–8 July 2021; Barbosa, J.C., Silva, L.L., Lourenço, P., Sousa, A., Silva, J.R., Cruz, V.F., Baptista, F., Eds.; Universidade de Évora: Évora, Portugal, 2021; pp. 580–587. [Google Scholar]
  25. Sara, G.; Todde, G.; Caria, M. Assessment of video see-through smart glasses for augmented reality to support technicians during milking machine maintenance. Sci. Rep. 2022, 12, 15729. [Google Scholar] [CrossRef] [PubMed]
  26. Muensterer, O.J.; Lacher, M.; Zoeller, C.; Bronstein, M.; Kübler, J. Google Glass in pediatric surgery: An exploratory study. Int. J. Surg. 2014, 12, 281–289. [Google Scholar] [CrossRef] [PubMed]
  27. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2021; Available online: https://www.R-project.org/ (accessed on 19 February 2024).
  28. Tarjan, L.; Šenk, I.; Tegeltija, S.; Stankovski, S.; Ostojic, G. A readability analysis for QR code application in a traceability system. Comput. Electron. Agric. 2014, 109, 1–11. [Google Scholar] [CrossRef]
  29. Bai, H.; Zhou, G.; Hu, Y.; Sun, A.; Xu, X.; Liu, X.; Lu, C. Traceability technologies for farm animals and their products in China. Food Control 2017, 79, 35–43. [Google Scholar] [CrossRef]
  30. Gebresenbet, G.; Bosona, T.; Olsson, S.-O.; Garcia, D. Smart system for the optimization of logistics performance of the pruning biomass value chain. Appl. Sci. 2018, 8, 1162. [Google Scholar] [CrossRef]
  31. Green, T.; Smith, T.; Hodges, R.; Fry, W.M. A simple and inexpensive way to document simple husbandry in animal care facilities using QR code scanning. Lab. Anim. 2017, 51, 656–659. [Google Scholar] [CrossRef]
  32. Yang, F.; Wang, K.; Han, Y.; Qiao, Z. A Cloud-Based Digital Farm Management System for Vegetable Production Process Management and Quality Traceability. Sustainability 2018, 10, 4007. [Google Scholar] [CrossRef]
  33. Kim, M.; Choi, S.H.; Park, K.-B.; Lee, J.Y. User Interactions for augmented reality smart glasses: A comparative evaluation of visual contexts and interaction gestures. Appl. Sci. 2019, 9, 3171. [Google Scholar] [CrossRef]
  34. Tarkoma, S.; Siekkinen, M.; Lagerspetz, E.; Xiao, Y. Smartphone Energy Consumption: Modeling and Optimization; Cambridge University Press: Cambridge, UK, 2014. [Google Scholar]
  35. Phupattanasilp, P.; Tong, S. Augmented reality in the integrative internet of things (AR-IoT): Application for precision farming. Sustainability 2019, 11, 2658. [Google Scholar] [CrossRef]
  36. Trendov, N.M.; Varas, S.; Zeng, M. Digital Technologies in Agriculture and Rural Areas: Briefing Paper; Food and Agriculture Organization of the United Nations: Rome, Italy, 2019. [Google Scholar]
Figure 1. The four models of smart glasses used in this study: (a) Microsoft HoloLens 2; (b) Epson Moverio BT-300; (c) Vuzix M400; (d) GlassUp F4.
Figure 2. The maximum scanning distance of the QR code types with increasing printed size for the four SG tested.
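A common rule of thumb for camera-based scanning, often quoted as a roughly 10:1 distance-to-width ratio, can be used to sanity-check the measured distances in Figure 2. The sketch below is purely illustrative and is not derived from the study's data; the ratio and the helper function are assumptions, and real limits depend on camera resolution, focus, lighting, and the QR version (module count).

```python
# Rough estimate of the maximum QR scanning distance from printed size.
# The 10:1 distance-to-width ratio is a common heuristic, NOT a measured
# value from this study.

def max_scan_distance_cm(code_width_cm: float, ratio: float = 10.0) -> float:
    """Return an estimated maximum decode distance in centimetres."""
    if code_width_cm <= 0:
        raise ValueError("code width must be positive")
    return code_width_cm * ratio

# The three printed sizes used in the study (3.5, 4 and 7.5 cm):
for size in (3.5, 4.0, 7.5):
    print(f"{size} cm code -> ~{max_scan_distance_cm(size):.0f} cm")
```

Comparing such an estimate against the device-specific curves in Figure 2 makes it easy to see which SGs under- or over-perform the generic heuristic.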
Figure 3. Visual acuity test results show the percentage (%) of correct letters read on the laptop screen during a video call from the SG.
Figure 4. Results of the operating hours of the tested devices (HL, BT300, M400, F4) for different types of operational use, i.e., repeated marker scanning (Scanning) and use of different applications (Mixed Use).
Table 1. Technical features of the smart glasses used in this study (as reported by the producers).
Model | HoloLens2 | BT300 | M400 | F4
Brand | Microsoft | Epson | Vuzix | GlassUp
Operating system | Windows Holographic | Android 5.1 | Android 8.1 | –
Processor | Qualcomm Snapdragon 850 | Intel Atom (Quad-core) | Qualcomm XR1 (Octa-core) | Cortex A9
Sensors | Head tracking, eye tracking, depth sensor, gyroscope–accelerometer–magnetometer (3 axes) | Gyroscope–accelerometer–magnetometer (3 axes), lux sensor | Gyroscope–accelerometer–magnetometer (3 axes) | Accelerometer–gyroscope–compass (9 axes), lux sensor
Connectivity | GPS, Wi-Fi, Bluetooth, USB-C | GPS, Wi-Fi, Bluetooth, micro-USB | GPS, Wi-Fi, Bluetooth, USB-C | Wi-Fi, Bluetooth
Display | Holographic lenses (1440 × 936) | Si-OLED (1280 × 720) | Occluded OLED (640 × 360) | LCD full color (640 × 480)
Controller input | Hand recognition, eye recognition, voice command | External touchpad | Integrated touchpad, 3 buttons, voice command | External joypad, one button on the glasses
Camera | 8 Megapixel | 5 Megapixel | 12.8 Megapixel | 5 Megapixel
Field of view | 43° | 23° | 16.8° | 22°
Battery life | 2–3 h | 4 h | 2–12 h | 6–8 h
Weight | 566 g | 69 g | 190 g | 251 g
Table 2. Average scanning times in seconds and standard deviations (SD) of different sizes of QR codes for the four devices (HL, BT300, M400, F4).
QR Code Size (cm) | 3.5 | 4 | 7.5
HL | 2.06 aA | 1.24 bA | 1.84 aA
SD | 0.61 | 0.67 | 0.83
BT300 | 6.01 aB | 6.30 aB | 4.71 bB
SD | 1.01 | 1.25 | 1.42
M400 | 2.04 abA | 2.04 aC | 1.86 bA
SD | 0.34 | 0.31 | 0.31
F4 | 12.07 aC | 8.54 bC | 7.37 bC
SD | 5.21 | 3.04 | 2.09
a–b: Mean values in the same row with different lowercase superscripts are statistically different (p < 0.01). A–C: Mean values in the same column with different uppercase superscripts are statistically different (p < 0.01).
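The means and standard deviations in Table 2 can be computed from raw trial timings with standard descriptive statistics. A minimal sketch follows; the timing values below are hypothetical placeholders, since the study's raw per-trial data are not published here.

```python
from statistics import mean, stdev

# Hypothetical repeated scan timings in seconds, keyed by (device, QR size in cm).
# These are illustrative values, NOT the study's raw data.
timings = {
    ("HL", 3.5): [1.6, 2.1, 2.5, 2.0],
    ("HL", 7.5): [1.2, 1.9, 2.4],
}

for (device, size_cm), values in sorted(timings.items()):
    # mean() gives the average scan time; stdev() the sample standard deviation
    print(f"{device} @ {size_cm} cm: "
          f"mean={mean(values):.2f} s, sd={stdev(values):.2f} s")
```

With the real trial data in place of the placeholder lists, the same two calls reproduce the "average scanning time" and "SD" rows of Table 2.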
Table 3. Strengths (S) and weaknesses (W) comparison of the four SG tested for their integration in the farming domain and considering their functions and structural parameters.
Parameter/Function | Farming Application/Implication | HL | BT300 | M400 | F4
Battery life | A long battery life, short recharge times, or interchangeable batteries (M400) allow continuous field use with less downtime. | W | W | S | S
Marker detection (time/distance) | Obtaining overlaid information on animals, plants, crops, and agricultural machinery within a short time or at long distances could increase the farmer's efficiency. | S | W | S | W
Audio–video transmission | Transmitting images and audio with good quality (clarity, timing, detail) could improve and expand remote technical assistance to farmers in the field. | S | S | W | W
Field of view | A limited field of view (<20° [8]) could negatively influence the farmer's use of SG while working. | S | S | W | W
Visualization system | The system could affect the farmer's safety by limiting the farmer's field of view, especially in the case of binocular or video see-through systems. | S | S | W | S
Weight/comfort | Normal glasses weigh about 20 g; an excessive or unbalanced SG weight could negatively affect its use by farmers. | S | S | W | W
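Table 3's qualitative comparison can be summarized by tallying strengths per device. The sketch below encodes the S/W matrix exactly as reported in the table (device columns in the order HL, BT300, M400, F4); the tallying helper itself is an illustrative addition, not part of the study's analysis.

```python
from collections import Counter

DEVICES = ("HL", "BT300", "M400", "F4")

# Rows of Table 3: parameter -> S/W rating per device, in DEVICES order.
RATINGS = {
    "Battery life":             "WWSS",
    "Marker detection":         "SWSW",
    "Audio-video transmission": "SSWW",
    "Field of view":            "SSWW",
    "Visualization system":     "SSWS",
    "Weight/comfort":           "SSWW",
}

# Count how many parameters each device is rated a strength (S) on.
strengths = Counter()
for row in RATINGS.values():
    for device, mark in zip(DEVICES, row):
        if mark == "S":
            strengths[device] += 1

for device in DEVICES:
    print(f"{device}: {strengths[device]} strengths out of {len(RATINGS)}")
```

Such a tally is only a coarse aggregate: as the paper stresses, the right device depends on which parameters matter most for a given farm, not on the raw count of strengths.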

Share and Cite

MDPI and ACS Style

Sara, G.; Pinna, D.; Todde, G.; Caria, M. Augmented Reality Glasses Applied to Livestock Farming: Potentials and Perspectives. AgriEngineering 2024, 6, 1859-1869. https://doi.org/10.3390/agriengineering6020108
