Search Results (34)

Search Parameters:
Journal = ASI
Section = Information Systems

16 pages, 3059 KiB  
Article
OFF-The-Hook: A Tool to Detect Zero-Font and Traditional Phishing Attacks in Real Time
by Nazar Abbas Saqib, Zahrah Ali AlMuraihel, Reema Zaki AlMustafa, Farah Amer AlRuwaili, Jana Mohammed AlQahtani, Amal Aodah Alahmadi, Deemah Alqahtani, Saad Abdulrahman Alharthi, Sghaier Chabani and Duaa Ali AL Kubaisy
Appl. Syst. Innov. 2025, 8(4), 93; https://doi.org/10.3390/asi8040093 - 30 Jun 2025
Viewed by 589
Abstract
Phishing attacks continue to pose serious challenges to cybersecurity, with attackers constantly refining their methods to bypass detection systems. One particularly evasive technique is Zero-Font phishing, which inserts invisible or zero-sized characters into email content to deceive both users and traditional email filters. Because these characters are not visible to human readers but are still processed by email systems, they obscure malicious intent in ways that bypass basic content inspection. This study introduces a proactive phishing detection tool capable of identifying both traditional and Zero-Font phishing attempts. The proposed tool leverages a multi-layered security framework that combines structural inspection with machine learning-based classification. At its core, the system incorporates an advanced machine learning model trained on a well-established dataset comprising both phishing and legitimate emails. The model alone achieves an accuracy of up to 98.8%, contributing significantly to the overall effectiveness of the tool. This hybrid approach enhances the system’s robustness and detection accuracy across diverse phishing scenarios. The findings underscore the importance of multi-faceted detection mechanisms and contribute to the development of more resilient defenses in the ever-evolving landscape of cybersecurity threats.
(This article belongs to the Special Issue The Intrusion Detection and Intrusion Prevention Systems)
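The Zero-Font evasion described above relies on text that is present in an email's HTML but never rendered for the reader. As a rough illustration of what a structural inspection layer might look for, the sketch below scans styled elements for zero-sized or hidden spans; it is a minimal assumption-based example, not the OFF-The-Hook implementation.

```python
# Minimal sketch: flag zero-font / hidden text in an email's HTML body.
# Illustrative only -- not the OFF-The-Hook tool described in the article.
import re
from bs4 import BeautifulSoup  # pip install beautifulsoup4

ZERO_FONT_RE = re.compile(r"font-size\s*:\s*0(?:\.0*)?(?:px|pt|em|%)?", re.I)
HIDDEN_RE = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden", re.I)

def find_zero_font_segments(html: str) -> list[str]:
    """Return text fragments that a mail client would not display."""
    soup = BeautifulSoup(html, "html.parser")
    hits = []
    for tag in soup.find_all(style=True):
        style = tag["style"]
        if ZERO_FONT_RE.search(style) or HIDDEN_RE.search(style):
            hits.append(tag.get_text(strip=True))
    return [h for h in hits if h]

def visible_text(html: str) -> str:
    """Text as a human reader would see it (hidden segments stripped)."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(style=True):
        if ZERO_FONT_RE.search(tag["style"]) or HIDDEN_RE.search(tag["style"]):
            tag.decompose()
    return soup.get_text(" ", strip=True)

email_html = '<p>Your invoice is <span style="font-size:0px">not a scam</span> attached.</p>'
print(find_zero_font_segments(email_html))  # ['not a scam']
print(visible_text(email_html))             # 'Your invoice is attached.'
```

Comparing the hidden fragments with the visible text gives a structural signal that can then be combined with a trained classifier, which is the hybrid idea the abstract describes.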

19 pages, 6995 KiB  
Article
Investigating Stress During a Virtual Reality Game Through Fractal and Multifractal Analysis of Heart Rate Variability
by Penio Lebamovski and Evgeniya Gospodinova
Appl. Syst. Innov. 2025, 8(1), 16; https://doi.org/10.3390/asi8010016 - 21 Jan 2025
Cited by 1 | Viewed by 2824
Abstract
This article presents the process of creating a virtual reality (VR) game designed to assess the impact of stress on heart rate variability (HRV). The game features dynamic and challenging scenarios to induce stress responses, incorporating advanced 3D modelling and 3D animation techniques. A study involving 20 volunteers was conducted, with electrocardiographic (ECG) data collected before and during game play. HRV analysis focused on fractal and multifractal characteristics, utilizing detrended fluctuation analysis (DFA) and multifractal detrended fluctuation analysis (MFDFA) methods. DFA results revealed decreased values of α1, α2, and αall, indicating alterations in short-term and long-term correlations under stress. MFDFA further analyzed changes in fluctuation function Fq(s), generalized Hurst exponent Hq, multifractal scaling exponent τ(q), and multifractal spectrum f(α), showing significant differences in these parameters under stress. These findings validate the game’s effectiveness in simulating stress and its impact on HRV. The present study not only demonstrates the relationship between stress and the fractal characteristics of HRV but also offers a new foundation for future applications in psychology, physiology, and the development of VR technologies for stress management.
(This article belongs to the Section Information Systems)
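Detrended fluctuation analysis, one of the methods applied to the RR-interval series here, can be summarised in a few steps: integrate the mean-centred series, split it into windows, remove a local polynomial trend in each window, and fit the slope of log F(s) versus log s. The sketch below is a generic textbook implementation, not the authors' code; the window ranges used for α1 and α2 are illustrative.

```python
# Generic detrended fluctuation analysis (DFA) sketch for an RR-interval series.
# Standard textbook formulation; window ranges for alpha1/alpha2 are illustrative.
import numpy as np

def dfa_fluctuation(rr: np.ndarray, scales: np.ndarray, order: int = 1) -> np.ndarray:
    y = np.cumsum(rr - rr.mean())          # integrated, mean-centred profile
    F = []
    for s in scales:
        n_win = len(y) // s
        segs = y[: n_win * s].reshape(n_win, s)
        x = np.arange(s)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(x, seg, order)              # local polynomial trend
            rms.append(np.mean((seg - np.polyval(coeffs, x)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.asarray(F)

def dfa_alpha(rr: np.ndarray, scales: np.ndarray) -> float:
    F = dfa_fluctuation(rr, scales)
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope                                            # scaling exponent alpha

rr = np.random.normal(0.8, 0.05, 2000)                      # synthetic RR intervals (s)
alpha1 = dfa_alpha(rr, np.arange(4, 17))                    # short-term scales
alpha2 = dfa_alpha(rr, np.arange(16, 65))                   # long-term scales
print(f"alpha1={alpha1:.2f}, alpha2={alpha2:.2f}")          # ~0.5 for uncorrelated noise
```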

15 pages, 4088 KiB  
Article
Options for Performing DNN-Based Causal Speech Denoising Using the U-Net Architecture
by Hwai-Tsu Hu and Tung-Tsun Lee
Appl. Syst. Innov. 2024, 7(6), 120; https://doi.org/10.3390/asi7060120 - 29 Nov 2024
Viewed by 1484
Abstract
Speech enhancement technology seeks to improve the quality and intelligibility of speech signals degraded by noise, particularly in telephone communications. Recent advancements have focused on leveraging deep neural networks (DNN), especially U-Net architectures, for effective denoising. In this study, we evaluate the performance of a 6-level skip-connected U-Net constructed using either conventional convolution activation blocks (CCAB) or innovative global local former blocks (GLFB) across different processing domains: temporal waveform, short-time Fourier transform (STFT), and short-time discrete cosine transform (STDCT). Our results indicate that the U-Nets achieve higher signal-to-noise ratio (SNR) and perceptual evaluation of speech quality (PESQ) scores when applied in the STFT and STDCT domains, with comparable short-time objective intelligibility (STOI) scores across all domains. Notably, the GLFB-based U-Net outperforms its CCAB counterpart in metrics such as CSIG, CBAK, COVL, and PESQ, while maintaining fewer learnable parameters. Furthermore, we propose domain-specific composite loss functions, considering the acoustic and perceptual characteristics of the spectral domain, to enhance the perceptual quality of denoised speech. Our findings provide valuable insights that can guide the optimization of DNN designs for causal speech denoising.
(This article belongs to the Section Information Systems)
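Spectral-domain denoising of this kind always sits inside the same plumbing: analyse the noisy waveform into frames, modify the spectral magnitudes, and resynthesise with the noisy phase. The sketch below shows that surrounding pipeline with SciPy's STFT; a simple spectral-subtraction mask stands in for the trained U-Net, which is an assumption for illustration only.

```python
# Sketch of STFT-domain denoising plumbing: analysis, masking, synthesis.
# A simple spectral-subtraction mask stands in for the trained U-Net here.
import numpy as np
from scipy.signal import stft, istft

def denoise_stft(noisy: np.ndarray, fs: int, noise_floor: np.ndarray) -> np.ndarray:
    f, t, Z = stft(noisy, fs=fs, nperseg=512, noverlap=384)
    mag, phase = np.abs(Z), np.angle(Z)
    # Placeholder for the DNN: subtract a noise-floor estimate per frequency bin.
    clean_mag = np.maximum(mag - noise_floor[:, None], 0.0)
    _, enhanced = istft(clean_mag * np.exp(1j * phase), fs=fs,
                        nperseg=512, noverlap=384)
    return enhanced

fs = 16000
clean = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)           # 1 s test tone
noisy = clean + 0.1 * np.random.randn(fs)
f, _, N = stft(noisy - clean, fs=fs, nperseg=512, noverlap=384)
noise_floor = np.abs(N).mean(axis=1)                            # oracle noise estimate
out = denoise_stft(noisy, fs, noise_floor)
print(out.shape)
```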

22 pages, 7697 KiB  
Article
Using IoT for Cistern and Water Tank Level Monitoring
by Miguel A. Wister, Ernesto Leon, Alejandro Alejandro-Carrillo, Pablo Pancardo and Jose A. Hernandez-Nolasco
Appl. Syst. Innov. 2024, 7(6), 112; https://doi.org/10.3390/asi7060112 - 11 Nov 2024
Viewed by 2938
Abstract
This paper proposes an experimental design to publish online the measurements obtained from four sensors: one sensor inside a cistern measures the level of drinking water, another sensor in a water tank monitors its level, a third sensor measures water flow or pressure from the pipes, and a fourth sensor assesses water quality. Several tank filling and emptying tests were performed. Experimental results demonstrated that when the system detected that the tank was empty, it turned on the water pump to fill the tank to 100% of its storage capacity; while this was happening, the water levels in the cistern and tank, the water flow from the pipes, and the water quality could be visualized on a dashboard. In short, this proposal monitors water levels and flows through the Internet of Things. Data collected by the sensors are posted online and stored in a database.
(This article belongs to the Section Information Systems)
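Readings like these typically reach a dashboard through a lightweight IoT messaging protocol such as MQTT. The sketch below shows how cistern level, tank level, flow, and quality readings might be published with the paho-mqtt client, together with a pump rule like the one described; the broker address, topic names, and payload fields are assumptions, not details from the paper.

```python
# Sketch: publish cistern/tank readings over MQTT so a dashboard can display them.
# Broker host, topic names, and payload fields are illustrative assumptions.
import json
import time
import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER = "broker.example.org"    # hypothetical broker
TOPIC = "home/water/levels"

def read_sensors() -> dict:
    """Stand-in for the ultrasonic level, flow, and quality sensors."""
    return {
        "cistern_level_pct": 72.5,
        "tank_level_pct": 38.0,
        "flow_l_per_min": 9.4,
        "tds_ppm": 180,
        "ts": time.time(),
    }

client = mqtt.Client()  # on paho-mqtt >= 2.0, pass mqtt.CallbackAPIVersion.VERSION2
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()

reading = read_sensors()
client.publish(TOPIC, json.dumps(reading), qos=1)

# Simple control rule like the one in the experiment: start the pump when the
# tank is empty, stop it once it reaches full capacity.
if reading["tank_level_pct"] <= 0.0:
    client.publish("home/water/pump", "ON", qos=1)
elif reading["tank_level_pct"] >= 100.0:
    client.publish("home/water/pump", "OFF", qos=1)

client.loop_stop()
client.disconnect()
```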

17 pages, 362 KiB  
Article
Low-Complexity SAOR and Conjugate Gradient Accelerated SAOR Based Signal Detectors for Massive MIMO Systems
by Imran A. Khoso, Mazhar Ali, Muhammad Nauman Irshad, Sushank Chaudhary, Pisit Vanichchanunt and Lunchakorn Wuttisittikulkij
Appl. Syst. Innov. 2024, 7(6), 102; https://doi.org/10.3390/asi7060102 - 24 Oct 2024
Viewed by 1357
Abstract
A major challenge for massive multiple-input multiple-output (MIMO) technology is designing an efficient signal detector. The conventional linear minimum mean square error (MMSE) detector is capable of achieving good performance in large antenna systems but requires computing the matrix inverse, which has very high complexity. To address this problem, several iterative signal detection methods have recently been introduced. However, existing iterative detectors perform poorly, especially as the system dimensions increase. This paper proposes two detection schemes aimed at reducing computational complexity in massive MIMO systems. The first method leverages the symmetric accelerated over-relaxation (SAOR) technique, which enhances convergence speed by judiciously selecting the relaxation and acceleration parameters. The SAOR technique offers a significant advantage over conventional accelerated over-relaxation methods due to its symmetric iteration. This symmetry enables the use of the conjugate gradient (CG) acceleration approach. Based on this foundation, we propose a novel accelerated SAOR method named CGA-SAOR, where CG acceleration is applied to further enhance the convergence rate. This combined approach significantly enhances performance compared to the SAOR method. In addition, a detailed analysis of the complexity and numerical results is provided to demonstrate the effectiveness of the proposed algorithms. The results illustrate that our algorithms achieve near-MMSE detection performance while reducing computations by an order of magnitude and significantly outperform recently introduced iterative detectors.
(This article belongs to the Section Information Systems)
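The underlying idea is to solve the MMSE system (H^H H + σ²I)x = H^H y iteratively, with forward and backward accelerated over-relaxation sweeps, instead of forming an explicit inverse. The sketch below implements a textbook symmetric AOR iteration in NumPy for illustration; the paper's parameter selection and the CG acceleration (CGA-SAOR) are not reproduced, and the toy system sizes are assumptions.

```python
# Sketch of a symmetric AOR (SAOR) iteration for MMSE-style detection without an
# explicit matrix inverse. Textbook formulation; the paper's parameter choices and
# CG acceleration (CGA-SAOR) are not reproduced here.
import numpy as np

def saor_solve(A, b, omega=1.2, r=1.0, iters=20):
    """Iteratively solve A x = b for Hermitian positive definite A (e.g. H^H H + s2 I)."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)                       # splitting A = D - L - U
    U = -np.triu(A, 1)
    x = np.zeros(b.shape, dtype=A.dtype)
    for _ in range(iters):
        # Forward sweep
        rhs = ((1 - omega) * D + (omega - r) * L + omega * U) @ x + omega * b
        x = np.linalg.solve(D - r * L, rhs)
        # Backward sweep (roles of L and U exchanged)
        rhs = ((1 - omega) * D + (omega - r) * U + omega * L) @ x + omega * b
        x = np.linalg.solve(D - r * U, rhs)
    return x

# Toy massive-MIMO-style system: 64 receive antennas, 16 users, QPSK-like symbols.
rng = np.random.default_rng(0)
H = (rng.standard_normal((64, 16)) + 1j * rng.standard_normal((64, 16))) / np.sqrt(2)
s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], 16) / np.sqrt(2)
y = H @ s + 0.05 * (rng.standard_normal(64) + 1j * rng.standard_normal(64))
A = H.conj().T @ H + 0.005 * np.eye(16)
x_hat = saor_solve(A, H.conj().T @ y)
print(np.round(x_hat - s, 2))                 # residuals should be close to zero
```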

25 pages, 2362 KiB  
Article
The E(G)TL Model: A Novel Approach for Efficient Data Handling and Extraction in Multivariate Systems
by Aleksejs Vesjolijs
Appl. Syst. Innov. 2024, 7(5), 92; https://doi.org/10.3390/asi7050092 - 26 Sep 2024
Cited by 4 | Viewed by 2747
Abstract
This paper introduces the EGTL (extract, generate, transfer, load) model, a theoretical framework designed to enhance the traditional ETL processes by integrating a novel ‘generate’ step utilizing generative artificial intelligence (GenAI). This enhancement optimizes data extraction and processing, presenting a high-level solution architecture that includes innovative data storage concepts: the Fusion and Alliance stores. The Fusion store acts as a virtual space for immediate data cleaning and profiling post-extraction, facilitated by GenAI, while the Alliance store serves as a collaborative data warehouse for both business users and AI processes. EGTL was developed to facilitate advanced data handling and integration within digital ecosystems. This study defines the EGTL solution design, setting the groundwork for future practical implementations and exploring the integration of best practices from data engineering, including DataOps principles and data mesh architecture. This research underscores how EGTL can improve the data engineering pipeline, illustrating the interactions between its components. The EGTL model was tested in the prototype web-based Hyperloop Decision-Making Ecosystem with tasks ranging from data extraction to code generation. Experiments demonstrated an overall success rate of 93% across five difficulty levels. Additionally, the study highlights key risks associated with EGTL implementation and offers comprehensive mitigation strategies.
(This article belongs to the Section Information Systems)
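The 'generate' step that EGTL adds between extraction and loading can be pictured as a hook in an otherwise ordinary ETL pipeline, where a GenAI call profiles and cleans the freshly extracted batch before it moves on to the shared store. The sketch below is purely illustrative: the function names, the cleaning rule standing in for the GenAI call, and the Fusion/Alliance store classes are hypothetical interpretations, not code from the paper.

```python
# Illustrative EGTL-style pipeline skeleton: extract -> generate -> transfer/load.
# All function and store names here are hypothetical, not from the paper.
from dataclasses import dataclass, field

@dataclass
class FusionStore:
    """Virtual staging space where GenAI-assisted cleaning/profiling would happen."""
    batches: list = field(default_factory=list)

@dataclass
class AllianceStore:
    """Shared warehouse consumed by business users and AI processes."""
    tables: dict = field(default_factory=dict)

def extract(source_rows):
    return list(source_rows)

def generate(batch, fusion: FusionStore):
    # Stand-in for a GenAI call that profiles the batch and fixes obvious issues.
    cleaned = [{**row, "name": row["name"].strip().title()} for row in batch]
    fusion.batches.append({"profile": {"rows": len(cleaned)}, "data": cleaned})
    return cleaned

def transfer_and_load(batch, alliance: AllianceStore, table: str):
    alliance.tables.setdefault(table, []).extend(batch)

fusion, alliance = FusionStore(), AllianceStore()
raw = [{"name": "  alice  "}, {"name": "BOB"}]
transfer_and_load(generate(extract(raw), fusion), alliance, "customers")
print(alliance.tables["customers"])  # [{'name': 'Alice'}, {'name': 'Bob'}]
```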

16 pages, 4769 KiB  
Article
Digital Forensics Readiness in Big Data Networks: A Novel Framework and Incident Response Script for Linux–Hadoop Environments
by Cephas Mpungu, Carlisle George and Glenford Mapp
Appl. Syst. Innov. 2024, 7(5), 90; https://doi.org/10.3390/asi7050090 - 25 Sep 2024
Viewed by 2298
Abstract
The surge in big data and analytics has catalysed the proliferation of cybercrime, largely driven by organisations’ intensified focus on gathering and processing personal data for profit while often overlooking security considerations. Hadoop and its derivatives are prominent platforms for managing big data; however, investigating security incidents within Hadoop environments poses intricate challenges due to scale, distribution, data diversity, replication, component complexity, and dynamicity. This paper proposes a big data digital forensics readiness framework and an incident response script for Linux–Hadoop environments, streamlining preliminary investigations. The framework offers a novel approach to digital forensics in the domains of big data and Hadoop environments. A prototype of the incident response script for Linux–Hadoop environments was developed and evaluated through comprehensive functionality and usability testing. The results demonstrated robust performance and efficacy.
(This article belongs to the Section Information Systems)
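An incident response script for a Linux–Hadoop cluster typically gathers volatile system state and Hadoop-specific artefacts into a timestamped evidence folder, hashing each artefact so its integrity can be demonstrated later. The sketch below illustrates that general pattern with a few common commands (`jps`, `hdfs dfsadmin -report`, open sockets, recent auth log entries); it is a generic example, not the script evaluated in the paper, and paths such as the evidence directory are assumptions.

```python
# Generic sketch of a Linux-Hadoop triage script: collect volatile state and
# Hadoop artefacts into a timestamped evidence directory. Not the paper's script.
import hashlib
import subprocess
from datetime import datetime, timezone
from pathlib import Path

COMMANDS = {
    "running_jvms.txt": ["jps", "-l"],                      # which Hadoop daemons are up
    "hdfs_report.txt": ["hdfs", "dfsadmin", "-report"],     # DataNode / cluster state
    "network.txt": ["ss", "-tunap"],                        # open connections
    "auth_log_tail.txt": ["tail", "-n", "500", "/var/log/auth.log"],
}

def collect(evidence_root: str = "/forensics") -> Path:
    out_dir = Path(evidence_root) / datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_dir.mkdir(parents=True, exist_ok=True)
    manifest = []
    for fname, cmd in COMMANDS.items():
        try:
            result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
            data = result.stdout + result.stderr
        except (FileNotFoundError, subprocess.TimeoutExpired) as exc:
            data = f"collection failed: {exc}\n"
        (out_dir / fname).write_text(data)
        # Hash each artefact so its integrity can be verified later.
        manifest.append(f"{hashlib.sha256(data.encode()).hexdigest()}  {fname}")
    (out_dir / "MANIFEST.sha256").write_text("\n".join(manifest) + "\n")
    return out_dir

if __name__ == "__main__":
    print(f"Evidence written to {collect()}")
```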

23 pages, 4841 KiB  
Article
Neural Network System for Predicting Anomalous Data in Applied Sensor Systems
by Serhii Vladov, Victoria Vysotska, Valerii Sokurenko, Oleksandr Muzychuk, Mariia Nazarkevych and Vasyl Lytvyn
Appl. Syst. Innov. 2024, 7(5), 88; https://doi.org/10.3390/asi7050088 - 23 Sep 2024
Cited by 9 | Viewed by 1641
Abstract
This article advances the research on the intelligent monitoring and control of helicopter turboshaft engines in onboard conditions. The proposed neural network system for anomaly prediction functions as a module within the helicopter turboshaft engine monitoring and control expert system. A SARIMAX-based preprocessor model was developed to determine autocorrelation and partial autocorrelation in training data, accounting for dynamic changes and external factors, achieving a prediction accuracy of up to 97.9%. A modified LSTM-based predictor model with Dropout and Dense layers predicted sensor data, with a tested error margin of 0.218% for predicting the TV3-117 aircraft engine gas temperature values before the compressor turbine during one minute of helicopter flight. A reconstructor model restored missing time series values and replaced outliers with synthetic values, achieving up to 98.73% accuracy. An anomaly detector model using the concept of dissonance successfully identified two anomalies: a sensor malfunction and a sharp temperature drop within two minutes of sensor activity, with type I and II errors below 1.12% and 1.01%, respectively, and a detection time under 1.611 s. The system’s AUC-ROC value of 0.818 confirms its strong ability to differentiate between normal and anomalous data, ensuring reliable and accurate anomaly detection. The main limitations are the dependency on the quality of data from onboard sensors, which can be affected by malfunctions or noise; the variation of the LSTM network’s accuracy (up to 97.9%) with helicopter operating conditions; and the model’s high computational demand, which may limit real-time use in resource-constrained environments.
(This article belongs to the Section Information Systems)
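A predictor of the kind described, an LSTM with Dropout and Dense layers forecasting the next sensor value from a sliding window, can be set up in a few lines of Keras. The sketch below uses illustrative layer sizes and a synthetic temperature series; it is not the authors' tuned model, and the SARIMAX preprocessor, reconstructor, and dissonance-based detector are omitted.

```python
# Generic sketch: LSTM + Dropout + Dense predictor for a univariate sensor series.
# Layer sizes and window length are illustrative, not the paper's tuned model.
import numpy as np
import tensorflow as tf

WINDOW = 30  # past samples used to predict the next value

def make_windows(series: np.ndarray, window: int = WINDOW):
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y       # shape (samples, window, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),    # next gas-temperature sample
])
model.compile(optimizer="adam", loss="mse")

# Synthetic stand-in for one minute of engine temperature telemetry.
t = np.linspace(0, 60, 6000)
series = 900 + 5 * np.sin(0.5 * t) + np.random.normal(0, 0.5, t.size)
X, y = make_windows(series)
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

pred = model.predict(X[-1:], verbose=0)[0, 0]
residual = abs(pred - y[-1])     # large residuals would be flagged as anomalous
print(f"prediction residual: {residual:.3f}")
```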

25 pages, 16100 KiB  
Article
E-Marketplace State of the Art and Trends: VR-ZOCO—An Architectural Proposal for the Future
by José Jesús Castro-Schez, Rubén Grande, Vanesa Herrera, Santiago Schez-Sobrino, David Vallejo and Javier Albusac
Appl. Syst. Innov. 2024, 7(5), 76; https://doi.org/10.3390/asi7050076 - 29 Aug 2024
Cited by 3 | Viewed by 2093
Abstract
E-commerce has become uniquely relevant to small- and medium-sized enterprises (SMEs) as an essential catalyst for their growth and sustainability. SMEs see e-commerce portals as a strategic way to engage in digital business activities without having to implement costly proprietary e-commerce solutions. In addition, partnering with these portals frees them from complex tasks such as positioning, portal maintenance, and adapting the portal to new technologies and trends. This multifaceted advantage positions e-commerce portals as invaluable partners, streamlining operations and allowing SMEs to focus more on their core business competencies. However, e-commerce portals or e-marketplaces are not without their challenges. Today, they face increasing pressure to reduce their environmental impact and to empower local commercial businesses, as well as local businesses in the entertainment and culture industry. To address these challenges, there is a pressing need to propose new types of e-marketplaces that support the concept of the 15-minute city and in which virtual and augmented reality play a key role. These marketplaces would not only boost environmental sustainability but also strengthen the connection between local businesses and the community, creating a stronger and more collaborative network that benefits both businesses and consumers.
(This article belongs to the Section Information Systems)

29 pages, 5934 KiB  
Article
The Method of Restoring Lost Information from Sensors Based on Auto-Associative Neural Networks
by Serhii Vladov, Ruslan Yakovliev, Victoria Vysotska, Mariia Nazarkevych and Vasyl Lytvyn
Appl. Syst. Innov. 2024, 7(3), 53; https://doi.org/10.3390/asi7030053 - 20 Jun 2024
Cited by 29 | Viewed by 2024
Abstract
The research aims to develop a neural network-based method for restoring information lost when the sensors of a complex nonlinear technical object (helicopter turboshaft engines are used as the example) fail during operation. The basis of the research is an auto-associative neural network (autoencoder), which makes it possible to restore information lost due to sensor failure with an accuracy of more than 99%. A modified training method for the auto-associative neural network (autoencoder) is proposed. It adds regularization terms to the loss function to produce a more stable and general model that performs well on the training data sample and can produce good results on new data. It also reduces the risk of overfitting, in which the model adapts too closely to the training data sample and loses its ability to generalize to new data. This is especially important for small amounts of data or complex models. Based on the computational experiment results (using the example of the TV3-117 turboshaft engine), it has been determined that information restoration based on an auto-associative neural network provides a data restoration error of no more than 0.45% in the case of single failures and no more than 0.6% in the case of double failures of the engine parameter registration sensors.
(This article belongs to the Section Information Systems)
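The restoration idea rests on an autoencoder trained to reproduce the full sensor vector from itself; when one sensor fails, its channel is replaced by a placeholder value and the network's reconstruction supplies a plausible estimate for it. The Keras sketch below illustrates this with an L2-regularized bottleneck; the layer sizes, regularization coefficient, and synthetic data are assumptions, not the paper's settings.

```python
# Sketch: auto-associative network (autoencoder) that reconstructs a sensor vector,
# used to fill in a failed channel. Sizes and L2 coefficient are illustrative.
import numpy as np
import tensorflow as tf

N_SENSORS = 8
reg = tf.keras.regularizers.l2(1e-4)            # regularization against overfitting

autoencoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_SENSORS,)),
    tf.keras.layers.Dense(16, activation="tanh", kernel_regularizer=reg),
    tf.keras.layers.Dense(4, activation="tanh", kernel_regularizer=reg),   # bottleneck
    tf.keras.layers.Dense(16, activation="tanh", kernel_regularizer=reg),
    tf.keras.layers.Dense(N_SENSORS),            # reconstructed sensor vector
])
autoencoder.compile(optimizer="adam", loss="mse")

# Synthetic correlated "engine parameters" standing in for real telemetry.
rng = np.random.default_rng(1)
latent = rng.standard_normal((4000, 3))
data = latent @ rng.standard_normal((3, N_SENSORS)) + 0.01 * rng.standard_normal((4000, N_SENSORS))
autoencoder.fit(data, data, epochs=5, batch_size=64, verbose=0)

# Simulate a single sensor failure: zero out channel 2 and reconstruct it.
sample = data[:1].copy()
truth = sample[0, 2]
sample[0, 2] = 0.0
restored = autoencoder.predict(sample, verbose=0)[0, 2]
print(f"true={truth:.3f}  restored={restored:.3f}")
```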

19 pages, 2283 KiB  
Article
The Role of ChatGPT in Elevating Customer Experience and Efficiency in Automotive After-Sales Business Processes
by Piotr Sliż
Appl. Syst. Innov. 2024, 7(2), 29; https://doi.org/10.3390/asi7020029 - 28 Mar 2024
Cited by 4 | Viewed by 4054
Abstract
Purpose: The advancements in deep learning and AI technologies have led to the development of language models such as OpenAI’s ChatGPT, released in 2022. The primary objective of this paper is to thoroughly examine the capabilities of ChatGPT within the realm of business-process management (BPM). This exploration entails analyzing its practical application, particularly through process-mining techniques, within the context of automotive after-sales processes. Originality: this article highlights the issue of possible ChatGPT application in selected stages of after-sales processes in the automotive sector. Methods: to achieve the main aim of this paper, methods such as a literature review, participant observation, unstructured interviews, the CRISP-DM methodology, and process mining were used. Findings: This study emphasizes the promising impact of implementing the OpenAI ChatGPT tool to enhance processes in the automotive after-sales sector. Conducted in 2023, shortly after the tool’s introduction, the research highlights its potential to contribute to heightened customer satisfaction within the after-sales domain. The investigation focuses on process-execution time. A key premise is that waiting time represents an additional cost for customers seeking these services. Employing process-mining methodologies, the study identifies stages characterized by unnecessary delays. Collaborative efforts with domain experts are employed to establish benchmark durations for the researched processes’ stages. The study proposes the integration of ChatGPT to improve and expedite stages including service reception, reception check-out, repair and maintenance, and claim repair. This holistic approach aligns with the current imperatives of business-process improvement and optimization, aiming to enhance operational efficiency and customer-centric service delivery in the automotive after-sales sector.
(This article belongs to the Section Information Systems)
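Identifying stages with unnecessary delays, as the study does with process mining, amounts to computing per-stage durations from an event log and comparing them to expert benchmark times. A compact pandas sketch of that step is shown below; the column names, event log, and benchmark values are illustrative, not taken from the dealership data used in the article.

```python
# Sketch: derive stage durations from an after-sales event log and flag stages
# exceeding an expert benchmark. Column names and numbers are illustrative.
import pandas as pd

log = pd.DataFrame({
    "case_id":   ["A", "A", "A", "A", "B", "B", "B", "B"],
    "stage":     ["service reception", "repair and maintenance",
                  "reception check-out", "claim repair"] * 2,
    "timestamp": pd.to_datetime([
        "2023-05-02 08:00", "2023-05-02 08:40", "2023-05-02 13:10", "2023-05-02 13:30",
        "2023-05-03 09:00", "2023-05-03 10:05", "2023-05-03 16:20", "2023-05-03 16:45",
    ]),
})

log = log.sort_values(["case_id", "timestamp"])
# Duration of each stage = time until the next event in the same case.
log["duration_min"] = (
    log.groupby("case_id")["timestamp"].diff(-1).abs().dt.total_seconds() / 60
)

benchmark_min = {"service reception": 30, "repair and maintenance": 240,
                 "reception check-out": 20}
summary = log.groupby("stage")["duration_min"].mean().dropna()
for stage, minutes in summary.items():
    limit = benchmark_min.get(stage)
    flag = " <- exceeds benchmark" if limit and minutes > limit else ""
    print(f"{stage:25s} {minutes:6.1f} min{flag}")
```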

31 pages, 8358 KiB  
Article
Advancements in Healthcare: Development of a Comprehensive Medical Information System with Automated Classification for Ocular and Skin Pathologies—Structure, Functionalities, and Innovative Development Methods
by Ana-Maria Ștefan, Nicu-Răzvan Rusu, Elena Ovreiu and Mihai Ciuc
Appl. Syst. Innov. 2024, 7(2), 28; https://doi.org/10.3390/asi7020028 - 27 Mar 2024
Cited by 2 | Viewed by 3479
Abstract
This article introduces a groundbreaking medical information system developed in Salesforce, featuring an automated classification module for ocular and skin pathologies using Google Teachable Machine. Integrating cutting-edge technology with Salesforce’s robust capabilities, the system provides a comprehensive solution for medical practitioners. The article explores the system’s structure, emphasizing innovative functionalities that enhance diagnostic precision and streamline medical workflows. Methods used in development are discussed, offering insights into the integration of Google Teachable Machine into the Salesforce framework. This collaborative approach is a significant stride in intelligent pathology classification, advancing the field of medical information systems and fostering efficient healthcare practices.
(This article belongs to the Section Information Systems)
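Google Teachable Machine can export a trained image classifier in Keras format, which backend code can then load and call directly. The sketch below shows that generic load-and-predict step; the model file name, class labels, and preprocessing constants are placeholders, and the Salesforce integration described in the article is out of scope here.

```python
# Sketch: run an image classifier exported from Google Teachable Machine (Keras format).
# File name, class labels, and preprocessing are placeholders; Salesforce integration
# is out of scope for this example.
import numpy as np
import tensorflow as tf
from PIL import Image

MODEL_PATH = "keras_model.h5"        # hypothetical exported model file
LABELS = ["healthy", "conjunctivitis", "dermatitis"]  # hypothetical classes

model = tf.keras.models.load_model(MODEL_PATH, compile=False)

def classify(image_path: str) -> tuple[str, float]:
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 127.5 - 1.0   # scale pixels to [-1, 1]
    probs = model.predict(x[None, ...], verbose=0)[0]
    top = int(np.argmax(probs))
    return LABELS[top], float(probs[top])

label, confidence = classify("patient_photo.jpg")
print(f"{label} ({confidence:.2%})")
```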

17 pages, 5433 KiB  
Article
IntelliTrace: Intelligent Contact Tracing Method Based on Transmission Characteristics of Infectious Disease
by Soorim Yang, Kyoung-Hwan Kim, Hye-Ryeong Jeong, Seokjun Lee and Jaeho Kim
Appl. Syst. Innov. 2023, 6(6), 112; https://doi.org/10.3390/asi6060112 - 23 Nov 2023
Cited by 1 | Viewed by 2477
Abstract
The COVID-19 pandemic has underscored the necessity for rapid contact tracing as a means to effectively suppress the spread of infectious diseases. Existing contact tracing methods leverage location-based or distance-based detection to identify contact with a confirmed patient. These methods have encountered challenges in practical applications, stemming from their tendency to classify even casual contacts, which carry a low risk of infection, as close contacts. This issue arises because the transmission characteristics of the virus have not been fully considered. This study addresses the above problem by proposing IntelliTrace, an intelligent method that introduces methodological innovations prioritizing shared environmental context over physical proximity. This approach more accurately assesses potential transmission events by considering the transmission characteristics of the virus, with a special focus on COVID-19. In this study, we present space-based indoor Wi-Fi contact tracing using machine learning for indoor environments and trajectory-based outdoor GPS contact tracing for outdoor environments. For an indoor environment, a contact is detected based on whether users are in the same space as the confirmed case. For an outdoor environment, contact is detected by judging whether people are moving together, i.e., whether their trajectories show the same movements. The datasets obtained from 28 participants who installed the smartphone application during a one-month experiment in a campus space were utilized to train and validate the performance of the proposed exposure-detection method. As a result of the experiment, IntelliTrace exhibited an F1 score of 86.84% in indoor environments and 94.94% in outdoor environments.
(This article belongs to the Section Information Systems)
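Space-based indoor contact detection boils down to deciding whether two devices were in the same room during the same time window, which can be estimated from Wi-Fi RSSI fingerprints. The sketch below uses a nearest-neighbour room classifier (scikit-learn) and then checks for room/time overlap with a confirmed case; the data layout, fingerprints, and one-minute windows are illustrative assumptions, not the IntelliTrace pipeline.

```python
# Sketch: classify a device's room from Wi-Fi RSSI fingerprints, then flag contacts
# who shared a room and time window with a confirmed case. Layout is illustrative.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Calibration fingerprints: RSSI (dBm) from 3 access points, labelled by room.
X_train = np.array([[-40, -70, -80], [-42, -68, -82],   # room "lab"
                    [-75, -45, -60], [-73, -47, -58],   # room "cafeteria"
                    [-80, -65, -40], [-78, -66, -42]])  # room "library"
y_train = ["lab", "lab", "cafeteria", "cafeteria", "library", "library"]
room_clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

def to_rooms(scans):
    """scans: list of (timestamp_minute, rssi_vector) -> set of (minute, room)."""
    return {(t, room_clf.predict([rssi])[0]) for t, rssi in scans}

confirmed = to_rooms([(10, [-41, -69, -81]), (11, [-41, -70, -80])])
user = to_rooms([(11, [-43, -67, -83]), (40, [-79, -64, -41])])

shared = confirmed & user                  # same room in the same minute
print("close contact" if shared else "no relevant contact", shared)
```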

14 pages, 4458 KiB  
Article
Modular Open-Core System for Collection and Near Real-Time Processing of High-Resolution Data from Wearable Sensors
by Dorota S. Temple, Meghan Hegarty-Craver, Pooja Gaur, Matthew D. Boyce, Jonathan R. Holt, Edward A. Preble, Randall P. Eckhoff, Hope Davis-Wilson, Howard J. Walls, David E. Dausch and Matthew A. Blackston
Appl. Syst. Innov. 2023, 6(5), 79; https://doi.org/10.3390/asi6050079 - 4 Sep 2023
Cited by 2 | Viewed by 4026
Abstract
Wearable devices, such as smartwatches integrating heart rate and activity sensors, have the potential to transform health monitoring by enabling continuous, near real-time data collection and analytics. In this paper, we present a novel modular architecture for collecting and end-to-end processing of high-resolution signals from wearable sensors. The system obtains minimally processed data directly from the smartwatch and further processes and analyzes the data stream without transmitting it to the device vendor’s cloud. The standalone operation is made possible by a software stack that provides data cleaning, extraction of physiological metrics, and standardization of the metrics to enable person-to-person and rest-to-activity comparisons. To illustrate the operation of the system, we present examples of datasets from volunteers wearing Garmin Fenix smartwatches for several weeks in free-living conditions. As collected, the datasets contain a time series of each interbeat interval as well as the respiration rate, blood oxygen saturation, and step count every minute. From the high-resolution datasets, we extract heart rate variability metrics, which are a source of information about the heart’s response to external stressors. These biomarkers can be used for the early detection of a range of diseases and the assessment of the physical and mental performance of the individual. The data collection and analytics system has the potential to broaden the use of smartwatches in continuous, near real-time monitoring of health and well-being.
(This article belongs to the Section Information Systems)
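Time-domain HRV metrics such as SDNN and RMSSD are computed directly from the interbeat-interval series a system like this collects, which makes them a natural first analytics step. The short sketch below uses the standard definitions of these metrics; it is illustrative and not part of the project's software stack.

```python
# Standard time-domain HRV metrics from an interbeat-interval (IBI) series in ms.
# Illustrative helper, not the system's software stack described in the article.
import numpy as np

def hrv_metrics(ibi_ms: np.ndarray) -> dict:
    diffs = np.diff(ibi_ms)
    return {
        "mean_hr_bpm": 60000.0 / ibi_ms.mean(),           # average heart rate
        "sdnn_ms": ibi_ms.std(ddof=1),                     # overall variability
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),          # short-term variability
        "pnn50_pct": 100.0 * np.mean(np.abs(diffs) > 50),  # large beat-to-beat jumps
    }

# Synthetic 5-minute resting recording: ~800 ms IBIs with mild variability.
rng = np.random.default_rng(2)
ibi = rng.normal(800, 40, 375)
for name, value in hrv_metrics(ibi).items():
    print(f"{name:12s} {value:7.1f}")
```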

11 pages, 3689 KiB  
Article
Data Lake Architecture for Smart Fish Farming Data-Driven Strategy
by Sarah Benjelloun, Mohamed El Mehdi El Aissi, Younes Lakhrissi and Safae El Haj Ben Ali
Appl. Syst. Innov. 2023, 6(1), 8; https://doi.org/10.3390/asi6010008 - 7 Jan 2023
Cited by 7 | Viewed by 3162
Abstract
Thanks to continuously evolving data management solutions, data-driven strategies are considered the main success factor in many domains. These strategies treat data as the backbone, allowing advanced data analytics. However, in the agricultural field, and especially in fish farming, data-driven strategies have yet to be widely adopted. This research paper aims to demystify the fish farming domain by shedding light on the big data generated in fish farms. The purpose is to propose a dedicated data lake functional architecture and extend it to a technical architecture to initiate a fish farming data-driven strategy. The research adopts an exploratory approach to survey existing big data technologies and to propose an architecture applicable to the fish farming data-driven strategy. The paper reviews how big data technologies offer multiple advantages for decision making and enable prediction use cases. It also highlights different big data technologies and their use. Finally, the paper presents the proposed architecture to initiate a data-driven strategy in the fish farming domain.
(This article belongs to the Section Information Systems)
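A data lake for sensor-heavy farm data is usually organised in zones (for example raw and curated) with time-partitioned files so that analytics and prediction jobs read only what they need. The snippet below sketches that raw-to-curated promotion step with pandas and partitioned Parquet; the zone names, schema, and directory layout are illustrative assumptions, not the architecture proposed in the paper.

```python
# Sketch: land raw fish-farm sensor readings and promote them to a curated,
# date-partitioned Parquet zone. Zone names and schema are illustrative only.
# Requires pyarrow for Parquet support.
from pathlib import Path
import pandas as pd

RAW_ZONE = Path("datalake/raw/water_quality")
CURATED_ZONE = Path("datalake/curated/water_quality")

raw = pd.DataFrame({
    "pond_id": ["P1", "P1", "P2"],
    "ts": pd.to_datetime(["2023-01-05 06:00", "2023-01-05 06:10", "2023-01-05 06:00"]),
    "temp_c": [21.4, 21.6, None],        # a missing reading to clean up
    "dissolved_o2_mgl": [6.8, 6.7, 7.1],
})

RAW_ZONE.mkdir(parents=True, exist_ok=True)
raw.to_parquet(RAW_ZONE / "batch_0001.parquet", index=False)   # raw zone: as received

curated = raw.dropna(subset=["temp_c"]).assign(date=lambda d: d["ts"].dt.date.astype(str))
for date, part in curated.groupby("date"):                     # partition by day
    out_dir = CURATED_ZONE / f"date={date}"
    out_dir.mkdir(parents=True, exist_ok=True)
    part.drop(columns="date").to_parquet(out_dir / "part-0000.parquet", index=False)

print(pd.read_parquet(CURATED_ZONE).shape)                     # curated rows available
```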
