Search Results (24)

Search Parameters:
Keywords = execution tracing quality

35 pages, 633 KB  
Article
Bi-Objective Optimization for Scalable Resource Scheduling in Dense IoT Deployments via 5G Network Slicing Using NSGA-II
by Francesco Nucci and Gabriele Papadia
Telecom 2026, 7(2), 24; https://doi.org/10.3390/telecom7020024 - 2 Mar 2026
Viewed by 227
Abstract
The proliferation of Internet of Things (IoT) devices demands efficient resource management in fifth-generation (5G) networks, particularly through network slicing mechanisms supporting massive machine-type communications (mMTCs). This paper addresses IoT connectivity in 5G network slicing by formulating a bi-objective optimization problem that balances operational costs with quality-of-service (QoS) requirements across heterogeneous 5G network slices. The proposed approach employs a tailored Non-dominated Sorting Genetic Algorithm II (NSGA-II) incorporating domain-specific constraints, including device priorities, slicing isolation requirements, radio resource limitations, and battery capacity. Through extensive simulations on scenarios with up to 5000 devices, our method generates diverse Pareto-optimal solutions achieving hypervolume improvements of 8–13% over multi-objective DRL, 15–28% over single-objective DRL baselines, and 22–41% over heuristic approaches while maintaining computational scalability suitable for real-time network management (sub-2 min execution). Validation with real-world traffic traces from operational deployments confirms algorithm robustness under realistic burstiness and temporal patterns, with 7% performance degradation vs. synthetic traffic, within expected simulation–reality gaps. This work provides a practical framework for IoT resource scheduling in current 5G and future Beyond-5G (B5G) telecommunications infrastructures. Full article
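NSGA-II's selection pressure rests on the Pareto-dominance test between the two objectives the abstract names (operational cost vs. QoS). A minimal, self-contained sketch of that test and of extracting the first non-dominated front — the candidate allocations and objective values below are invented for illustration, not taken from the paper:

```python
# Pareto dominance for two minimized objectives: (operational cost, QoS violation).
# All candidate values are illustrative, not the paper's data.

def dominates(a, b):
    """True if solution a Pareto-dominates b (no worse on all objectives, better on one)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset: NSGA-II's first front."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# (cost, qos_violation) for five hypothetical slice allocations
candidates = [(10.0, 0.9), (12.0, 0.4), (15.0, 0.1), (14.0, 0.6), (20.0, 0.5)]
front = pareto_front(candidates)   # the cost/QoS trade-off curve
```

Here (14.0, 0.6) and (20.0, 0.5) are both dominated by (12.0, 0.4), so the front keeps only the three allocations on the trade-off curve; full NSGA-II then adds fast non-dominated sorting over all fronts and crowding-distance selection.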

31 pages, 2800 KB  
Article
Intelligent Fusion: A Resilient Anomaly Detection Framework for IoMT Health Devices
by Flavio Pastore, Raja Waseem Anwar, Nafaa Hadi Jabeur and Saqib Ali
Information 2026, 17(2), 117; https://doi.org/10.3390/info17020117 - 26 Jan 2026
Viewed by 481
Abstract
Modern healthcare systems increasingly depend on wearable Internet of Medical Things (IoMT) devices for the continuous monitoring of patients’ physiological parameters. It remains challenging to differentiate between genuine physiological anomalies, sensor faults, and malicious cyber interference. In this work, we propose a hybrid fusion framework designed to attribute the most plausible source of an anomaly, thereby supporting more reliable clinical decisions. The proposed framework is developed and evaluated using two complementary datasets: CICIoMT2024 for modelling security threats and a large-scale intensive care cohort from MIMIC-IV for analysing key vital signs and bedside interventions. The core of the system combines a supervised XGBoost classifier for attack detection with an unsupervised LSTM autoencoder for identifying physiological and technical deviations. To improve clinical realism and avoid artefacts introduced by quantised or placeholder measurements, the physiological module incorporates quality-aware preprocessing and missingness indicators. The fusion decision policy is calibrated under prudent, safety-oriented constraints to limit false escalation. Rather than relying on fixed fusion weights, we train a lightweight fusion classifier that combines complementary evidence from the security and clinical modules, and we select class-specific probability thresholds on a dedicated calibration split. The security module achieves high cross-validated performance, while the clinical model captures abnormal physiological patterns at scale, including deviations consistent with both acute deterioration and data-quality faults. Explainability is provided through SHAP analysis for the security module and reconstruction-error attribution for physiological anomalies. 
The integrated fusion framework achieves a final accuracy of 99.76% under prudent calibration and a Matthews Correlation Coefficient (MCC) of 0.995, with an average end-to-end inference latency of 84.69 ms (p95 upper bound of 107.30 ms), supporting near real-time execution in edge-oriented settings. While performance is strong, clinical severity labels are operationalised through rule-based proxies, and cross-domain fusion relies on harmonised alignment assumptions. These aspects should be further evaluated using realistic fault traces and prospective IoMT data. Despite these limitations, the proposed framework offers a practical and explainable approach for IoMT-based patient monitoring. Full article
(This article belongs to the Special Issue Intrusion Detection Systems in IoT Networks)

30 pages, 4772 KB  
Article
Beyond Histotrust: A Blockchain-Based Alert in Case of Tampering with an Embedded Neural Network in a Multi-Agent Context
by Antonio Pereira, Dylan Paulin and Christine Hennebert
Appl. Syst. Innov. 2026, 9(1), 19; https://doi.org/10.3390/asi9010019 - 8 Jan 2026
Cited by 1 | Viewed by 547
Abstract
An intrusion into the operational network (OT) of a production site can cause serious damage by affecting productivity, reliability, and quality. The presence of embedded neural networks (NNs), such as classifiers, in physical devices opens the door to new attack vectors. Due to the stochastic behavior of the classifier and the difficulty of reproducing results, the Artificial Intelligence (AI) Act requires the NN’s behavior to be explainable. For this purpose, the platform HistoTrust enables tracing NN behavior, thanks to secure hardware components issuing attestations registered in a blockchain ledger. This solution helps to build trust between independent actors whose devices perform tasks in cooperation. This paper proposes going further by integrating a mechanism for detecting tampering of embedded NN, and using smart contracts executed on the blockchain to propagate the alert to the peer devices in a distributed manner. The use case of a bit-flip attack, targeting the weights of the NN model, is considered. This attack can be carried out by repeatedly injecting very small messages that can be missed by the Intrusion Detection System (IDS). Experiments are being conducted on the HistoTrust platform to demonstrate the feasibility of our distributed approach and to qualify the time required to detect intrusion and propagate the alert, in relation to the time it takes for the attack to impact decisions made by the AI. As a result, the blockchain may be a relevant technology to complement traditional IDS in order to face distributed attacks. Full article
(This article belongs to the Section Control and Systems Engineering)

31 pages, 3484 KB  
Article
CEDAR: An Ontology-Based Framework Using Event Abstractions to Contextualise Financial Data Processes
by Aya Tafech and Fethi Rabhi
Electronics 2026, 15(1), 145; https://doi.org/10.3390/electronics15010145 - 29 Dec 2025
Viewed by 418
Abstract
Financial institutions face data quality (DQ) challenges in regulatory reporting due to complex architectures where data flows through multiple systems. Data consumers struggle to assess quality because traditional DQ tools operate on data snapshots without capturing temporal event sequences and business contexts that determine whether anomalies represent genuine issues or valid behavior. Existing approaches address either semantic representation (ontologies for static knowledge) or temporal pattern detection (event processing without semantics), but not their integration. This paper presents CEDAR (Contextual Events and Domain-driven Associative Representation), integrating financial ontologies with event-driven processing for context-aware DQ assessment. Novel contributions include (1) ontology-driven rule derivation that automatically translates OWL business constraints into executable detection logic; (2) temporal ontological reasoning extending static quality assessment with event stream processing; (3) explainable assessment tracing anomalies through causal chains to violated constraints; and (4) standards-based design using W3C technologies with FIBO extensions. Following the Design Science Research Methodology, we document the first, early-stage iteration focused on design novelty and technical feasibility. We present conceptual models, a working prototype, controlled validation with synthetic equity derivative data, and comparative analysis against existing approaches. The prototype successfully detects context-dependent quality issues and enables ontological root cause exploration. Contributions: A novel integration of ontologies and event processing for financial DQ management with validated technical feasibility, demonstrating how semantic web technologies address operational challenges in event-driven architectures. Full article
(This article belongs to the Special Issue Visual Analysis of Software Engineering Data)

31 pages, 3464 KB  
Article
An Intelligent Method for C++ Test Case Synthesis Based on a Q-Learning Agent
by Serhii Semenov, Oleksii Kolomiitsev, Mykhailo Hulevych, Patryk Mazurek and Olena Chernyk
Appl. Sci. 2025, 15(15), 8596; https://doi.org/10.3390/app15158596 - 2 Aug 2025
Cited by 6 | Viewed by 1460
Abstract
Ensuring software quality during development requires effective regression testing. However, test suites in open-source libraries often grow large, redundant, and difficult to maintain. Most traditional test suite optimization methods treat test cases as atomic units, without analyzing the utility of individual instructions. This paper presents an intelligent method for test case synthesis using a Q-learning agent. The agent learns to construct compact test cases by interacting with an execution environment and receives rewards based on branch coverage improvements and simultaneous reductions in test case length. The training process includes a pretraining phase that transfers knowledge from the original test suite, followed by adaptive learning episodes on individual test cases. As a result, the method requires no formal documentation or API specifications and uses only execution traces of the original test cases. An explicit synthesis algorithm constructs new test cases by selecting API calls from a learned policy encoded in a Q-table. Experiments were conducted on two open-source C++ libraries of differing API complexity and original test suite size. The results show that the proposed method can reach up to 67% test suite reduction while preserving branch coverage, confirming its effectiveness for regression test suite minimization in resource-constrained or specification-limited environments. Full article
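The abstract's core mechanism is a tabular Q-learning agent that picks the next API call while composing a test case, rewarded for new branch coverage and penalized for length. A toy sketch of that loop — the API names, states, and reward values are invented for illustration; the paper's agent derives them from execution traces of the original test suite:

```python
# Tabular Q-learning over "which API call to append next".
# States and rewards below are invented, not the paper's environment.

API = ["init", "push", "pop"]
ALPHA, GAMMA = 0.5, 0.9

def q_update(q, state, action, reward, next_state):
    """One Q-learning step: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(q.get((next_state, a), 0.0) for a in API)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + ALPHA * (reward + GAMMA * best_next - old)

q = {}
for _ in range(100):
    # appending "push" after "init" covers a new branch (+1.0);
    # appending "pop" only adds length, so it pays the length penalty (-0.1)
    q_update(q, "init", "push", 1.0, "push")
    q_update(q, "init", "pop", -0.1, "pop")

# Greedy synthesis: after "init", the learned policy appends "push"
best_action = max(API, key=lambda a: q.get(("init", a), 0.0))
```

The paper's synthesis algorithm generalizes this greedy extraction: it walks the learned Q-table to emit compact call sequences that preserve the branch coverage of the original, bloated tests.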

13 pages, 7492 KB  
Article
Design and Fabrication of Orthokeratology Lens with Multi-Linear and Spherical Aberration Corrected for Myopia Control
by Zhengwang Li, Ruijin Hong and Dawei Zhang
Photonics 2025, 12(1), 87; https://doi.org/10.3390/photonics12010087 - 19 Jan 2025
Viewed by 4453
Abstract
Myopia, an increasingly grave public health concern, necessitates the implementation of various techniques for its management. These techniques predominantly comprise spectacle correction, orthokeratology (ortho-k), and soft bifocal and multifocal lenses. In the present study, a pioneering polish-free ortho-k lens was devised, featuring two reverse lines and three alignment lines, which, respectively, expedite the shaping process and enhance centration. The structural blueprint of the ortho-k lens, along with the simulation of fluorescence staining, was executed employing the FocalPoints software V7.0 (Advance Medical, Milan, Italy). Subsequently, lens aberration elimination was accomplished through ray tracing utilizing ZEMAX software V13.0 (Focus Software, Wixom, MI, USA). The fabrication of the lens was carried out via high-precision lathe turning using the UPC 100 Vision instrument (SCHNEIDER, Ratingen, Germany). The power profile of the ortho-k lens was measured using the CONTEST 2 apparatus (ROTLEX, Omer, Israel). The surface quality was observed under a 200× microscope (ZEISS, Oberkochen, Germany). The fitting of the lens was assessed using both slit-lamp microscopy (MediWorks, Shanghai, China) and a corneal topographer (Medmont E300, Melbourne, VIC, Australia). Full article

19 pages, 484 KB  
Article
Preventing Dysgraphia: Early Observation Protocols and a Technological Framework for Monitoring and Enhancing Graphomotor Skills
by Silvia Ceccacci, Arianna Taddei, Noemi Del Bianco, Catia Giaconi, Dolors Forteza Forteza and Francisca Moreno-Tallón
Information 2024, 15(12), 781; https://doi.org/10.3390/info15120781 - 5 Dec 2024
Cited by 4 | Viewed by 5137
Abstract
Writing is first-order instrumental learning that develops throughout the life cycle, a complex process evolving from early childhood education. The identification of risk predictors of dysgraphia at age 5 has the potential to significantly reduce the impact of graphomotor difficulties in early primary school, which can affect handwriting performance to the point of illegibility. Building on established scientific literature, this study focuses on screening processes, with particular attention to writing requirements. This paper proposes a novel prevention and intervention system based on new technologies for teachers, educators, and therapists. Specifically, it presents a pilot study testing an innovative tactile device to analyze graphomotor performance and motor coordination in real time. The research explores whether this haptic device can be used as an effective pedagogical aid for preventing graphomotor issues in children aged 5 to 6 years. The results showed a high level of engagement and usability among young participants. Furthermore, the quality of graphomotor traces executed by children after virtual and physical training, respectively, was comparable, supporting the use of the tool as a complementary training resource for the observation and enhancement of graphomotor processes. Full article

29 pages, 1792 KB  
Article
AbstractTrace: The Use of Execution Traces to Cluster, Classify, Prioritize, and Optimize a Bloated Test Suite
by Ziad A. Al-Sharif and Clinton L. Jeffery
Appl. Sci. 2024, 14(23), 11168; https://doi.org/10.3390/app142311168 - 29 Nov 2024
Cited by 2 | Viewed by 1498
Abstract
Due to the incremental and iterative nature of the software testing process, a test suite may become bloated with redundant, overlapping, and similar test cases. This paper aims to optimize a bloated test suite by employing an execution trace that encodes runtime events into a sequence of characters forming a string. A dataset of strings, each of which represents the code coverage and execution behavior of a test case, is analyzed to identify similarities between test cases. This facilitates the de-bloating process by providing a formal mechanism to identify, remove, and reduce extra test cases without compromising software quality. This form of analysis allows for the clustering and classification of test cases based on their code coverage and similarity score. This paper explores three levels of execution traces and evaluates different techniques to measure their similarities. Test cases with the same code coverage should generate the exact string representation of runtime events. Various string similarity metrics are assessed to find the similarity score, which is used to classify, detect, and rank test cases accordingly. Additionally, this paper demonstrates the validity of the approach with two case studies. The first shows how to classify the execution behavior of various test cases, which can provide insight into each test case’s internal behavior. The second shows how to identify similar test cases based on their code coverage. Full article
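The abstract's central encoding — runtime events as characters, so each execution trace becomes a string scored by string-similarity metrics — can be sketched in a few lines. The event codes, traces, and the 0.9 redundancy threshold below are invented for illustration; the paper evaluates three trace granularities and several similarity metrics:

```python
from difflib import SequenceMatcher

# Each runtime event type maps to one character, so a whole execution
# becomes a string; off-the-shelf string similarity then scores how
# redundant two test cases are. Codes and traces are illustrative.
EVENT_CODE = {"call": "C", "return": "R", "branch_true": "T", "branch_false": "F"}

def encode(events):
    """Encode a sequence of runtime events as a compact string."""
    return "".join(EVENT_CODE[e] for e in events)

def similarity(trace_a, trace_b):
    """Similarity score in [0, 1]; 1.0 means identical event sequences."""
    return SequenceMatcher(None, trace_a, trace_b).ratio()

t1 = encode(["call", "branch_true", "return"])           # "CTR"
t2 = encode(["call", "branch_true", "return"])           # same coverage
t3 = encode(["call", "branch_false", "call", "return"])  # different path

redundant = similarity(t1, t2) >= 0.9   # t2 is a de-bloating candidate
```

Test cases with identical coverage produce identical strings (score 1.0), which is exactly the property the paper exploits to cluster, rank, and remove redundant tests without losing coverage.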
(This article belongs to the Special Issue Artificial Intelligence in Software Engineering)

16 pages, 5620 KB  
Article
Online Optical Axis Parallelism Measurement Method for Continuous Zoom Camera Based on High-Precision Spot Center Positioning Algorithm
by Chanchan Kang, Yao Fang, Huawei Wang, Feng Zhou, Zeyue Ren and Feixiang Han
Photonics 2024, 11(11), 1017; https://doi.org/10.3390/photonics11111017 - 29 Oct 2024
Viewed by 1445
Abstract
Ensuring precise alignment of the optical axis is critical for achieving high-quality imaging in continuous zoom cameras. However, existing methods for measuring optical axis parallelism often lack accuracy and fail to assess parallelism across the entire focal range. This study introduces an online measurement method designed to address these limitations by incorporating two enhancements. First, image processing methodologies enable sub-pixel-level extraction of the spot center, achieved through improved morphological processing and the incorporation of an edge tracing algorithm. Second, measurement software developed using Qt Creator can output real-time data on optical axis parallelism across the full focal range post-measurement. This software features a multi-threaded architecture that facilitates the concurrent execution of image acquisition, data processing, and serial communication. Experimental results derived from simulations and real data indicate that the maximum average error in extracting the center of the spot is 0.13 pixels. The proposed system provides critical data for optical axis calibration during camera adjustment and inspection. Full article
(This article belongs to the Special Issue Advancements in Optical Measurement Techniques and Applications)

17 pages, 329 KB  
Article
Traffic Classification in Software-Defined Networking Using Genetic Programming Tools
by Spiridoula V. Margariti, Ioannis G. Tsoulos, Evangelia Kiousi and Eleftherios Stergiou
Future Internet 2024, 16(9), 338; https://doi.org/10.3390/fi16090338 - 19 Sep 2024
Cited by 5 | Viewed by 3491
Abstract
The classification of Software-Defined Networking (SDN) traffic is an essential tool for network management, network monitoring, traffic engineering, dynamic resource allocation planning, and applying Quality of Service (QoS) policies. The programmable nature of SDN, the holistic view of the network through SDN controllers, and the capability for dynamically adjustable and reconfigurable controllers are fertile ground for the development of new techniques for traffic classification. Although a number of research works have studied traffic classification methods in SDN environments, they have several shortcomings and gaps that need to be further investigated. In this study, we investigated traffic classification methods in SDN using publicly available SDN traffic trace datasets. We applied a series of classifiers, such as MLP (BFGS), FC2 (RBF), FC2 (MLP), Decision Tree, SVM, and GenClass, and evaluated their performance in terms of accuracy, detection rate, and precision. Of the methods used, GenClass appears to be more accurate in separating the categories of the problem than the rest, and this is reflected in both precision and recall. The key element of the GenClass method is that it can generate classification rules programmatically and detect the hidden associations that exist between the problem features and the desired classes. However, Genetic Programming-based techniques require significantly higher execution time compared to other machine learning techniques. This is most evident in the feature construction method, where at each generation of the genetic algorithm a set of learning models must be trained to evaluate the generated artificial features. Full article

23 pages, 1257 KB  
Article
BIoTS-Path: Certification Transmission of Supply Chains Based on Blockchain–Internet of Things Architectures by Validating the Information Path
by Carlos Andrés Gonzalez-Amarillo, Anabel Fraga Vazquez, Gustavo Adolfo Ramirez-Gonzalez, Miguel Angel Mendoza-Moreno and Juan Carlos Corrales Muñoz
Mathematics 2023, 11(19), 4108; https://doi.org/10.3390/math11194108 - 28 Sep 2023
Cited by 2 | Viewed by 2086
Abstract
A food traceability system (FTS) can record information about processes along a production chain to determine their safety and quality. Under the Internet of Things (IoT) concept, the communication technologies that support FTSs act as platforms for mass access to information with limited security. However, the integrity of the collected data is not immune to security attacks. This paper proposes a point-to-point information transmission path with no edges or access boundaries (no intermediaries) to transmit data with integrity. This route is possible thanks to the architectural articulation of a hardware device (sensor BIoTS) at the perception layer, with the Blockchain architecture at the application layer. This pairing makes an ecosystem with the ability to trace and certify in parallel the products, the supply chain processes, and the data recorded in it possible. The design of the security testing ecosystem is based on the theoretical and technical principles of cybersecurity. It is executed through mathematical models that define the probability of attacks’ success against the transmitted data’s integrity. The security tests performed allow for establishing that this BIoTS information transmission route is unlikely to suffer from transmission vulnerabilities and that it is not prone to security attacks against integrity. This work paves the way toward fully integrating Blockchain technology in dedicated IoT architectures. Full article

23 pages, 3326 KB  
Article
An Intelligent Task Scheduling Model for Hybrid Internet of Things and Cloud Environment for Big Data Applications
by Souvik Pal, N. Z. Jhanjhi, Azmi Shawkat Abdulbaqi, D. Akila, Faisal S. Alsubaei and Abdulaleem Ali Almazroi
Sustainability 2023, 15(6), 5104; https://doi.org/10.3390/su15065104 - 14 Mar 2023
Cited by 47 | Viewed by 4431
Abstract
One of the most significant issues in Internet of Things (IoT) cloud computing is scheduling tasks. Recent developments in IoT-based technologies have led to a meteoric rise in the demand for cloud storage. In order to load the IoT services onto cloud resources efficiently even while satisfying the requirements of the applications, sophisticated planning methodologies are required. This is important because several processes must be well prepared on different virtual machines to maximize resource usage and minimize waiting times. Different IoT application tasks can be difficult to schedule in a cloud-based computing architecture due to the heterogeneous features of IoT. With the rise in IoT sensors and the need to access information quickly and reliably, fog cloud computing is proposed for the integration of fog and cloud networks to meet these demands. One of the most important necessities in a fog cloud setting is efficient task scheduling, as this can help to lessen the time it takes for data to be processed and improve QoS (quality of service). The overall processing time of IoT programs should be kept as short as possible by effectively planning and managing their workloads, taking into account limitations such as task scheduling. Finding the ideal approach is challenging, especially for big data systems, because task scheduling is a complex issue. This research provides a Deep Learning Algorithm for Big data Task Scheduling System (DLA-BDTSS) for the Internet of Things (IoT) and cloud computing applications. When it comes to reducing energy costs and end-to-end delay, an optimized scheduling model based on deep learning is used to analyze and process various tasks. The method employs a multi-objective strategy to shorten the makespan and maximize resource consumption. A regional exploration search technique improves the optimization algorithm’s capacity to exploit data and avoid becoming stuck in local optimization. 
DLA-BDTSS was compared to other well-known task allocation methods on real trace data and with the CloudSim toolkit. The investigation showed that DLA-BDTSS outperformed the other algorithms and converged faster, making it beneficial for big data task scheduling scenarios: it obtained an 8.43% improvement in the outcomes, with an execution time of 34 s and a fitness value evaluation of 76.8%. Full article

20 pages, 2276 KB  
Article
Modeling the Data Provenance of Relational Databases Supporting Full-Featured SQL and Procedural Languages
by Deyou Tang, Rong Zhao, Yuebang Lin, Tangqing Zhang and Pingjian Zhang
Appl. Sci. 2023, 13(1), 64; https://doi.org/10.3390/app13010064 - 21 Dec 2022
Cited by 1 | Viewed by 5564
Abstract
Data provenance is information about where data come from (provenance data) and how they transform (provenance transformation). Data provenance is widely used to evaluate data quality, trace errors, audit data, and understand references among data. Current studies on data provenance in relational database management systems (RDBMS) still have limitations in supporting full-featured SQL or procedural languages. With these challenges in mind, we present a formal definition of provenance data and provenance transformation for relational data. Then, we propose a solution to support data provenance in relational databases, including provenance graphs and provenance routes. Our method not only solves the complicated problem of modeling provenance in DBMS but also is capable of extending procedural languages in SQL. We also present ProvPg, a PostgreSQL-based prototype database system supporting data provenance in multiple granularities. ProvPg implements extraction, calculation, query, and visualization of provenance. We perform TPC-H tests for ProvPg and PostgreSQL, respectively. Experimental results show that ProvPg addresses the vision of supporting data provenance with little extra computation overhead for the execution engine, which indicates that our model is applicable to lineage tracing applications. Full article
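The provenance graphs and provenance routes the abstract describes amount to recording, for every derived row, which rows it was computed from, and then walking that graph back to base-table rows. A toy sketch of that traversal — the table/row identifiers are invented for illustration; ProvPg stores and queries such structures inside PostgreSQL:

```python
# Each derived row records its source rows; lineage tracing walks the
# graph transitively back to base rows. Identifiers are illustrative.

provenance = {
    "report.r1": ["orders.o1", "orders.o2"],  # r1 aggregated from two orders
    "orders.o2": ["staging.s5"],              # o2 itself derived from staging
}

def lineage(row, graph):
    """Return the set of base rows that transitively contributed to `row`."""
    parents = graph.get(row)
    if not parents:              # no recorded derivation: a base row
        return {row}
    result = set()
    for parent in parents:
        result |= lineage(parent, graph)
    return result
```

For example, `lineage("report.r1", provenance)` resolves through the intermediate `orders.o2` to the base rows `orders.o1` and `staging.s5`, which is the kind of query that supports error tracing and auditing over full-featured SQL and procedural code.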

16 pages, 5624 KB  
Article
Trends in Summer-Time Tropospheric Ozone during COVID-19 Lockdown in Indian Cities Might Forecast a Higher Future Risk
by Sujit Das, Abhijit Sarkar, Usha Mina, Senjuti Nandy, Md Najmus Saadat, Ganesh Kumar Agrawal and Randeep Rakwal
Atmosphere 2022, 13(7), 1115; https://doi.org/10.3390/atmos13071115 - 14 Jul 2022
Cited by 6 | Viewed by 3246
Abstract
High concentrations of tropospheric ozone (O3) are a serious concern in India. The generation and atmospheric dynamics of this trace gas depend on the availability of its precursors and meteorological variables. As in other parts of the world, the COVID-19-imposed lockdown and restrictions on major anthropogenic activities had a positive impact on ambient air quality, with a reduced load of primary pollutants/precursors. In spite of this, several reports pointed towards higher O3 in major Indian cities during the lockdown. The present study, designed with 30 pan-Indian mega-, class I-, and class II-cities, revealed critical and contrasting aspects of the geographical location, source, precursor, and meteorological variable dependency of spatial and temporal O3 formation. This unexpected O3 increase in the major cities might forecast probable future risks for National Air Quality policies, especially O3 pollution management, in the Indian sub-continent. The results also pointed towards the severity of north Indian air quality, followed by the western and eastern parts. We believe these results will pave the way for researchers and policy-makers in predicting/framing regional and/or national O3 management strategies in the future. Full article
(This article belongs to the Special Issue Feature Papers in Atmosphere Science)

10 pages, 916 KB  
Review
The Effectiveness of Semi-Automated and Fully Automatic Segmentation for Inferior Alveolar Canal Localization on CBCT Scans: A Systematic Review
by Julien Issa, Raphael Olszewski and Marta Dyszkiewicz-Konwińska
Int. J. Environ. Res. Public Health 2022, 19(1), 560; https://doi.org/10.3390/ijerph19010560 - 4 Jan 2022
Cited by 31 | Viewed by 4739
Abstract
This systematic review aims to identify the available semi-automatic and fully automatic algorithms for inferior alveolar canal localization and to present their diagnostic accuracy. Articles related to inferior alveolar nerve/canal localization using methods based on artificial intelligence (semi-automated and fully automated) were collected electronically from five databases (PubMed, Medline, Web of Science, Cochrane, and Scopus). Two independent reviewers screened the titles and abstracts of the collected records, stored in EndnoteX7, against the inclusion criteria. The included articles were then critically appraised using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. Of the 990 initially collected articles, seven studies were included after deduplication and screening against the exclusion criteria. In total, 1288 human cone-beam computed tomography (CBCT) scans were investigated for inferior alveolar canal localization using different algorithms, and the results were compared to manual tracing executed by experts in the field. The reported values for the diagnostic accuracy of the algorithms were extracted. A wide range of testing measures was implemented in the analyzed studies, while some of the expected indexes were still missing from the results. Future studies should follow the new artificial intelligence guidelines to ensure proper methodology, reporting, results, and validation. Full article
