Search Results (38)

Search Parameters:
Keywords = extract, transform, and load (ETL)

27 pages, 1889 KiB  
Article
Advancing Smart City Sustainability Through Artificial Intelligence, Digital Twin and Blockchain Solutions
by Ivica Lukić, Mirko Köhler, Zdravko Krpić and Miljenko Švarcmajer
Technologies 2025, 13(7), 300; https://doi.org/10.3390/technologies13070300 - 11 Jul 2025
Cited by 1 | Viewed by 650
Abstract
This paper presents an integrated Smart City platform that combines digital twin technology, advanced machine learning, and a private blockchain network to enhance data-driven decision making and operational efficiency in both public enterprises and small and medium-sized enterprises (SMEs). The proposed cloud-based business intelligence model automates Extract, Transform, Load (ETL) processes, enables real-time analytics, and secures data integrity and transparency through blockchain-enabled audit trails. By implementing the proposed solution, Smart City and public service providers can significantly improve operational efficiency, including a 15% reduction in costs and a 12% decrease in fuel consumption for waste management, as well as increased citizen engagement and transparency in Smart City governance. The digital twin component facilitated scenario simulations and proactive resource management, while the participatory governance module empowered citizens through transparent, immutable records of proposals and voting. This study also discusses technical, organizational, and regulatory challenges, such as data integration, scalability, and privacy compliance. The results indicate that the proposed approach offers a scalable and sustainable model for Smart City transformation, fostering citizen trust, regulatory compliance, and measurable environmental and social benefits. Full article
(This article belongs to the Section Information and Communication Technologies)
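As a rough illustration of how an ETL process can emit a tamper-evident audit trail of the kind the platform's blockchain layer provides, the following Python sketch hash-chains batch summaries. It is a simplified stand-in, not the paper's private blockchain implementation; all field names and values are hypothetical.

```python
# Minimal sketch (not the paper's implementation): hash-chaining ETL batch
# summaries to form a tamper-evident audit trail, standing in for the
# blockchain-backed audit log described above.
import hashlib
import json
import time


def audit_entry(prev_hash: str, batch_summary: dict) -> dict:
    """Create one audit-trail entry linked to the previous entry by hash."""
    payload = {
        "timestamp": time.time(),
        "summary": batch_summary,   # e.g. row counts per ETL stage (hypothetical)
        "prev_hash": prev_hash,
    }
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "hash": digest}


# Example: two consecutive ETL runs appended to the trail.
trail = [audit_entry("0" * 64, {"source": "waste_mgmt", "rows_loaded": 10_432})]
trail.append(audit_entry(trail[-1]["hash"], {"source": "waste_mgmt", "rows_loaded": 10_517}))

# Any later modification of an earlier entry breaks the hash chain.
print(trail[-1]["hash"])
```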

14 pages, 2429 KiB  
Article
End-to-End Architecture for Real-Time IoT Analytics and Predictive Maintenance Using Stream Processing and ML Pipelines
by Ouiam Khattach, Omar Moussaoui and Mohammed Hassine
Sensors 2025, 25(9), 2945; https://doi.org/10.3390/s25092945 - 7 May 2025
Cited by 3 | Viewed by 2384
Abstract
The rapid proliferation of Internet of Things (IoT) devices across industries has created a need for robust, scalable, and real-time data processing architectures capable of supporting intelligent analytics and predictive maintenance. This paper presents a novel comprehensive architecture that enables end-to-end processing of IoT data streams, from acquisition to actionable insights. The system integrates Kafka-based message brokering for the high-throughput ingestion of real-time sensor data, with Apache Spark facilitating batch and stream extraction, transformation, and loading (ETL) processes. A modular machine-learning pipeline handles automated data preprocessing, training, and evaluation across various models. The architecture incorporates continuous monitoring and optimization components to track system performance and model accuracy, feeding insights to users via a dedicated Application Programming Interface (API). The design ensures scalability, flexibility, and real-time responsiveness, making it well suited for industrial IoT applications requiring continuous monitoring and intelligent decision-making. Full article
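The ingestion-plus-ETL pattern described above (Kafka topics consumed by Spark) can be sketched as follows. This is an illustrative example, not the paper's code: the topic name, message schema, and output paths are assumptions, and it requires the spark-sql-kafka connector package on the Spark classpath.

```python
# Illustrative Spark Structured Streaming ETL reading IoT readings from Kafka.
# Topic name, schema, and paths are assumptions, not the paper's code.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("iot-stream-etl").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("ts", TimestampType()),
    StructField("temperature", DoubleType()),
    StructField("vibration", DoubleType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "sensor-readings")
       .load())

readings = (raw.select(from_json(col("value").cast("string"), schema).alias("r"))
            .select("r.*")
            .filter(col("temperature").isNotNull()))   # basic cleaning step

query = (readings.writeStream.format("parquet")
         .option("path", "/tmp/iot/readings")
         .option("checkpointLocation", "/tmp/iot/_checkpoints")
         .outputMode("append")
         .start())
```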

45 pages, 4361 KiB  
Article
Engineering Sustainable Data Architectures for Modern Financial Institutions
by Sergiu-Alexandru Ionescu, Vlad Diaconita and Andreea-Oana Radu
Electronics 2025, 14(8), 1650; https://doi.org/10.3390/electronics14081650 - 19 Apr 2025
Cited by 3 | Viewed by 2648
Abstract
Modern financial institutions now manage increasingly advanced data-related activities and place a growing emphasis on environmental and energy impacts. In financial modeling, relational databases, big data systems, and the cloud are integrated, taking into consideration resource optimization and sustainable computing. We suggest a four-layer architecture to address financial data processing issues. The layers of our design are for data sources, data integration, processing, and storage. Data ingestion processes market feeds, transaction records, and customer data. Real-time data are captured by Kafka and transformed by Extract-Transform-Load (ETL) pipelines. The processing layer is composed of Apache Spark for real-time data analysis, Hadoop for batch processing, and a Machine Learning (ML) infrastructure that supports predictive modeling. In order to optimize access patterns, the storage layer includes various data layer components. The test results indicate that the processing of market data in real time, compliance reporting, risk evaluations, and customer analyses can be conducted in fulfillment of environmental sustainability goals. The metrics from the test deployment support the implementation strategies and technical specifications of the architectural components. We also looked at integration models and data flow improvements, with applications in finance. This study aims to enhance enterprise data architecture in the financial context and includes guidance on modernizing data infrastructure. Full article
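The batch side of the data-integration layer described above can be illustrated with a small pandas ETL: extract transaction records, transform them into per-account daily aggregates, and load the result into columnar storage for the processing layer. File names and columns are assumptions for illustration; writing Parquet requires pyarrow or fastparquet.

```python
# Hypothetical batch ETL for the data-integration layer: extract transaction
# records, transform to per-account daily aggregates, load to Parquet.
# File and column names are assumptions, not the paper's pipeline.
import pandas as pd

# Extract
tx = pd.read_csv("transactions.csv", parse_dates=["booked_at"])

# Transform: daily net flow and trade count per account
daily = (tx.assign(day=tx["booked_at"].dt.date)
           .groupby(["account_id", "day"], as_index=False)
           .agg(net_flow=("amount", "sum"), trades=("amount", "size")))

# Load: columnar storage consumed downstream (Spark/Hadoop); requires pyarrow
daily.to_parquet("warehouse/daily_account_flows.parquet", index=False)
```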

24 pages, 8329 KiB  
Article
Leveraging Deep Learning and Internet of Things for Dynamic Construction Site Risk Management
by Li-Wei Lung, Yu-Ren Wang and Yung-Sung Chen
Buildings 2025, 15(8), 1325; https://doi.org/10.3390/buildings15081325 - 17 Apr 2025
Cited by 2 | Viewed by 1161
Abstract
The construction industry faces persistent occupational health and safety challenges, with numerous risks arising from construction sites’ complex and dynamic nature. Accidents frequently result from inadequate safety distances and poorly managed worker–machine interactions, highlighting the need for advanced safety management solutions. This study develops and validates an innovative hazard warning system that leverages deep learning-based image recognition (YOLOv7) and Internet of Things (IoT) modules to enhance construction site safety. The system achieves a mean average precision (mAP) of 0.922 and an F1 score of 0.88 at a 0.595 confidence threshold, detecting hazards in under 1 s. Integrating IoT-enabled smart wearable devices provides real-time monitoring, delivering instant hazard alerts and personalized safety warnings, even in areas with limited network connectivity. The system employs the DIKW knowledge management framework to extract, transform, and load (ETL) high-quality labeled data and optimize worker and machinery recognition. Robust feature extraction is performed using convolutional neural networks (CNNs) and a fully connected approach for neural network training. Key innovations, such as perspective projection coordinate transformation (PPCT) and the security assessment block module (SABM), further enhance hazard detection and warning generation accuracy and reliability. Validated through extensive on-site experiments, the system demonstrates significant advancements in real-time hazard detection, improving site safety, reducing accident rates, and increasing productivity. The integration of IoT enhances scalability and adaptability, laying the groundwork for future advancements in construction automation and safety management. Full article
(This article belongs to the Special Issue Data Analytics Applications for Architecture and Construction)
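A perspective-projection coordinate transform of the kind the PPCT step refers to can be sketched with OpenCV: a homography maps detected bounding-box foot points from image pixels to ground-plane coordinates, from which worker–machine distances can be estimated. The calibration correspondences and positions below are made-up values, not the paper's calibration.

```python
# Illustrative perspective-projection coordinate transform (PPCT-style):
# map a bounding-box foot point from image pixels to ground-plane metres.
# All calibration values below are made up for illustration.
import cv2
import numpy as np

# Four image points (px) and their known ground-plane positions (m)
img_pts = np.float32([[120, 640], [1180, 655], [1010, 240], [300, 230]])
ground_pts = np.float32([[0, 0], [20, 0], [20, 30], [0, 30]])

H = cv2.getPerspectiveTransform(img_pts, ground_pts)

# Foot point of a detected worker's bounding box (cx, y_bottom) in pixels
foot_px = np.float32([[[640, 500]]])
foot_m = cv2.perspectiveTransform(foot_px, H)[0, 0]

# Distance to a machine whose ground-plane position is assumed known
machine_m = np.array([12.0, 18.0])
print("worker-machine distance [m]:", float(np.linalg.norm(foot_m - machine_m)))
```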

25 pages, 5922 KiB  
Article
Cloud-Driven Data Analytics for Growing Plants Indoor
by Nezha Kharraz and István Szabó
AgriEngineering 2025, 7(4), 101; https://doi.org/10.3390/agriengineering7040101 - 2 Apr 2025
Viewed by 601
Abstract
The integration of cloud computing, IoT (Internet of Things), and artificial intelligence (AI) is transforming precision agriculture by enabling real-time monitoring, data analytics, and dynamic control of environmental factors. This study develops a cloud-driven data analytics pipeline for indoor agriculture, using lettuce as a test crop due to its suitability for controlled environments. Built with Apache NiFi (Niagara Files), the pipeline facilitates real-time ingestion, processing, and storage of IoT sensor data measuring light, moisture, and nutrient levels. Machine learning models, including SVM (Support Vector Machine), Gradient Boosting, and DNN (Deep Neural Networks), analyzed 12 weeks of sensor data to predict growth trends and optimize thresholds. Random Forest analysis identified light intensity as the most influential factor (importance: 0.7), while multivariate regression highlighted phosphorus (0.54) and temperature (0.23) as key contributors to plant growth. Nitrogen exhibited a strong positive correlation (0.85) with growth, whereas excessive moisture (–0.78) and slightly elevated temperatures (–0.24) negatively impacted plant development. To enhance resource efficiency, this study introduces the Integrated Agricultural Efficiency Metric (IAEM), a novel framework that synthesizes key factors, including resource usage, alert accuracy, data latency, and cloud availability, leading to a 32% improvement in resource efficiency. Unlike traditional productivity metrics, IAEM incorporates real-time data processing and cloud infrastructure to address the specific demands of modern indoor farming. The combined approach of scalable ETL (Extract, Transform, Load) pipelines with predictive analytics reduced light use by 25%, water by 30%, and nutrients by 40% while simultaneously improving crop productivity and sustainability. These findings underscore the transformative potential of integrating IoT, AI, and cloud-based analytics in precision agriculture, paving the way for more resource-efficient and sustainable farming practices. Full article
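The feature-importance analysis mentioned above (Random Forest over light, moisture, temperature, and nutrient features) follows a standard pattern, sketched below on synthetic data. The study's 12-week sensor dataset and NiFi pipeline are not reproduced; the toy response is only shaped to echo the reported dominance of light.

```python
# Sketch of a Random Forest feature-importance step on synthetic sensor data
# (the study's dataset is not reproduced; values here are random).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "light": rng.uniform(100, 800, n),
    "moisture": rng.uniform(20, 90, n),
    "temperature": rng.uniform(16, 30, n),
    "nitrogen": rng.uniform(50, 200, n),
    "phosphorus": rng.uniform(10, 60, n),
})
# Toy growth response dominated by light, echoing the reported ranking
y = 0.7 * X["light"] / 800 + 0.2 * X["nitrogen"] / 200 + rng.normal(0, 0.05, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(X.columns, model.feature_importances_), key=lambda p: -p[1]):
    print(f"{name:12s} {imp:.2f}")
```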

46 pages, 1920 KiB  
Article
Pattern Shared Vision Refinement for Enhancing Collaboration and Decision-Making in Government Software Projects
by Mohammad Daud Haiderzai, Pavle Dakić, Igor Stupavský, Marijana Aleksić and Vladimir Todorović
Electronics 2025, 14(2), 334; https://doi.org/10.3390/electronics14020334 - 16 Jan 2025
Viewed by 2037
Abstract
This study proposes a new approach and explores how pattern recognition enhances collaboration between users and Agile teams in software development, focusing on shared resources and decision-making efficiency. Using domain-specific modeling languages (DSMLs) within a security-by-design framework, the research identifies patterns that support team selection, effort estimation, and Agile risk management for Afghanistan’s ministries. These patterns align software development with governmental needs by clarifying stakeholder roles and fostering cooperation. The study builds on the p-mart-Repository-Programs (P-MARt) repository, integrating data mining, algorithms, and ETL (Extract, Transform, Load) processes to develop innovative methodologies. These approaches enable dynamic knowledge management, refine documentation, and improve project outcomes. Central to this effort is our new Pattern Shared Vision Refinement (PSVR) approach, which emphasizes robust collaboration, data security, and adaptability. By addressing challenges unique to governmental operations, PSVR strengthens Agile practices and ensures high-quality software delivery. By analyzing historical trends and introducing new strategies, the study underscores the critical role of pattern recognition in aligning development processes with organizational goals. It demonstrates how systematic pattern identification can optimize interaction and secure stakeholder consensus, ultimately enhancing software engineering outcomes in Afghanistan’s governmental context. Full article

24 pages, 2974 KiB  
Article
Digitalization and Dynamic Criticality Analysis for Railway Asset Management
by Mauricio Rodríguez Hernández, Antonio Sánchez-Herguedas, Vicente González-Prida, Sebastián Soto Contreras and Adolfo Crespo Márquez
Appl. Sci. 2024, 14(22), 10642; https://doi.org/10.3390/app142210642 - 18 Nov 2024
Viewed by 1778
Abstract
The primary aim of this paper is to support the optimization of asset management in railway infrastructure through digitalization and criticality analysis. It addresses the current challenges in railway infrastructure management, where data-driven decision making and automation are key for effective resource allocation. The paper presents a methodology that emphasizes the development of a robust data model for criticality analysis, along with the advantages of integrating advanced digital tools. A master table is designed to rank assets and automatically calculate criticality through a novel asset attribute characterization (AAC) process. Digitalization facilitates dynamic, on-demand criticality assessments, which are essential in managing complex networks. The study also underscores the importance of combining digital technology adoption with organizational change management. The data process and structure proposed can be viewed as an ontological framework adaptable to various contexts, enabling more informed and efficient asset ranking decisions. This methodology is derived from its application to a metropolitan railway network, where thousands of assets were evaluated, providing a practical approach for conducting criticality assessments in a digitized environment. Full article
(This article belongs to the Special Issue Big-Data-Driven Advances in Smart Maintenance and Industry 4.0)
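A master-table style criticality ranking of the kind described above can be illustrated with a weighted score over characterized asset attributes. The attributes, scales, and weights below are hypothetical, not the paper's asset attribute characterization (AAC) scheme.

```python
# Hypothetical master-table sketch: rank assets by a weighted criticality score
# computed from characterized attributes (attributes and weights are assumptions).
import pandas as pd

assets = pd.DataFrame({
    "asset_id": ["SW-014", "SIG-203", "TRK-077"],
    "safety_impact": [5, 4, 3],        # 1 (low) .. 5 (high)
    "service_impact": [4, 5, 2],
    "failure_frequency": [2, 3, 4],
    "repair_cost": [3, 2, 5],
})

weights = {"safety_impact": 0.4, "service_impact": 0.3,
           "failure_frequency": 0.2, "repair_cost": 0.1}

assets["criticality"] = sum(assets[c] * w for c, w in weights.items())
print(assets.sort_values("criticality", ascending=False))
```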

20 pages, 5721 KiB  
Article
Low-Cost Data, High-Quality Models: A Semi-Automated Approach to LOD3 Creation
by Harshit, Pallavi Chaurasia, Sisi Zlatanova and Kamal Jain
ISPRS Int. J. Geo-Inf. 2024, 13(4), 119; https://doi.org/10.3390/ijgi13040119 - 3 Apr 2024
Cited by 4 | Viewed by 3285
Abstract
In the dynamic realm of digital twin modeling, where advancements are swiftly unfolding, users now possess the unprecedented ability to capture and generate geospatial data in real time. This article delves into a critical exploration of this landscape by presenting a meticulously devised workflow tailored for the creation of Level of Detail 3 (LOD3) models. Our research methodology capitalizes on the integration of Apple LiDAR technology alongside photogrammetric point clouds acquired from Unmanned Aerial Vehicles (UAVs). The proposed process unfolds with the transformation of point cloud data into Industry Foundation Classes (IFC) models, which are subsequently refined into LOD3 Geographic Information System (GIS) models leveraging the Feature Manipulation Engine (FME) workbench 2022.1.2. This orchestrated synergy among Apple LiDAR, UAV-derived photogrammetric point clouds, and the transformative capabilities of the FME culminates in the development of precise LOD3 GIS models. Our proposed workflow revolutionizes this landscape by integrating multi-source point clouds, imbuing them with accurate semantics derived from IFC models, and culminating in the creation of valid CityGML LOD3 buildings through sophisticated 3D geometric operations. The implications of this technical innovation are profound. Firstly, it elevates the capacity to produce intricate infrastructure models, unlocking new vistas for modeling digital twins. Secondly, it extends the horizons of GIS applications by seamlessly integrating enriched Building Information Modeling (BIM) components, thereby enhancing decision-making processes and facilitating more comprehensive spatial analyses. Full article

14 pages, 2937 KiB  
Article
Digital Twin Data Management: Framework and Performance Metrics of Cloud-Based ETL System
by Austeja Dapkute, Vytautas Siozinys, Martynas Jonaitis, Mantas Kaminickas and Milvydas Siozinys
Machines 2024, 12(2), 130; https://doi.org/10.3390/machines12020130 - 12 Feb 2024
Cited by 5 | Viewed by 2184
Abstract
This study delves into the EA-SAS platform, a digital twin environment developed by our team, with a particular focus on the EA-SAS Cloud Scheduler, our bespoke program designed to optimize ETL (extract, transform, and load) scheduling and thereby enhance automation within industrial systems. We elucidate the architectural intricacies of the EA-SAS Cloud Scheduler, demonstrating its adeptness in efficiently managing computationally heavy tasks, a capability underpinned by our empirical benchmarks. The architecture of the scheduler incorporates Docker to create isolated task environments and leverages RabbitMQ for effective task distribution. Our analysis reveals the EA-SAS Cloud Scheduler’s prowess in maintaining minimal overhead times, even in scenarios characterized by high operational loads, underscoring its potential to markedly bolster operational efficiency in industrial settings. While acknowledging the limitations inherent in our current assessment, particularly in simulating real-world industrial complexities, the study also charts potential future research pathways. These include a thorough exploration of the EA-SAS Cloud Scheduler’s adaptability across diverse industrial scenarios and an examination of the integration challenges associated with its reliance on specific technological frameworks. Full article
(This article belongs to the Special Issue Advances in Digital Twins for Manufacturing)
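Task distribution over RabbitMQ, which the scheduler architecture relies on, can be sketched with the pika client: a scheduler publishes a durable task message that worker processes consume. The queue name and payload are assumptions, and the Docker-isolated execution of each task in the EA-SAS design is omitted here.

```python
# Illustrative ETL task dispatch over RabbitMQ using pika (not the EA-SAS code).
# Queue name and payload fields are assumptions for illustration.
import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="etl_tasks", durable=True)

task = {"job_id": "daily-load-42", "image": "acme/etl-job:1.0", "dataset": "plant-telemetry"}
channel.basic_publish(
    exchange="",
    routing_key="etl_tasks",
    body=json.dumps(task),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)
connection.close()
```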

26 pages, 4290 KiB  
Article
A Model for Enhancing Unstructured Big Data Warehouse Execution Time
by Marwa Salah Farhan, Amira Youssef and Laila Abdelhamid
Big Data Cogn. Comput. 2024, 8(2), 17; https://doi.org/10.3390/bdcc8020017 - 6 Feb 2024
Cited by 1 | Viewed by 3036
Abstract
Traditional data warehouses (DWs) have played a key role in business intelligence and decision support systems. However, the rapid growth of the data generated by the current applications requires new data warehousing systems. In big data, it is important to adapt the existing warehouse systems to overcome new issues and limitations. The main drawbacks of traditional Extract–Transform–Load (ETL) are that a huge amount of data cannot be processed over ETL and that the execution time is very high when the data are unstructured. This paper focuses on a new model consisting of four layers: Extract–Clean–Load–Transform (ECLT), designed for processing unstructured big data, with specific emphasis on text. The model aims to reduce execution time through experimental procedures. ECLT is applied and tested using Spark, which is a framework employed in Python. Finally, this paper compares the execution time of ECLT with different models by applying two datasets. Experimental results showed that for a data size of 1 TB, the execution time of ECLT is 41.8 s. When the data size increases to 1 million articles, the execution time is 119.6 s. These findings demonstrate that ECLT outperforms ETL, ELT, DELT, ELTL, and ELTA in terms of execution time. Full article
(This article belongs to the Special Issue Big Data and Information Science Technology)
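The Extract–Clean–Load–Transform ordering can be sketched in PySpark, the stack the paper tests against: cleaning happens before loading to storage, and transformation runs afterwards on the loaded data. Paths, the cleaning rule, and the token-count transform are illustrative assumptions, not the paper's ECLT implementation.

```python
# Sketch of the Extract-Clean-Load-Transform (ECLT) ordering in PySpark.
# Paths and the cleaning rule are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode, lower, regexp_replace, split

spark = SparkSession.builder.appName("eclt-demo").getOrCreate()

# Extract: raw unstructured articles, one JSON document per line
raw = spark.read.json("raw/articles.json")

# Clean: drop empty bodies and strip non-alphanumeric characters
clean = (raw.filter(col("body").isNotNull())
            .withColumn("body", regexp_replace(lower(col("body")), r"[^a-z0-9\s]", " ")))

# Load: persist the cleaned data before any heavy transformation
clean.write.mode("overwrite").parquet("warehouse/articles_clean")

# Transform: run on the loaded data (here, a simple token count)
tokens = (spark.read.parquet("warehouse/articles_clean")
               .select(explode(split(col("body"), r"\s+")).alias("token"))
               .groupBy("token").count())
tokens.write.mode("overwrite").parquet("warehouse/token_counts")
```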

44 pages, 38595 KiB  
Article
Enhancing Urban Resilience: Smart City Data Analyses, Forecasts, and Digital Twin Techniques at the Neighborhood Level
by Andreas F. Gkontzis, Sotiris Kotsiantis, Georgios Feretzakis and Vassilios S. Verykios
Future Internet 2024, 16(2), 47; https://doi.org/10.3390/fi16020047 - 30 Jan 2024
Cited by 40 | Viewed by 9178
Abstract
Smart cities, leveraging advanced data analytics, predictive models, and digital twin techniques, offer a transformative model for sustainable urban development. Predictive analytics is critical to proactive planning, enabling cities to adapt to evolving challenges. Concurrently, digital twin techniques provide a virtual replica of the urban environment, fostering real-time monitoring, simulation, and analysis of urban systems. This study underscores the significance of real-time monitoring, simulation, and analysis of urban systems to support test scenarios that identify bottlenecks and enhance smart city efficiency. This paper delves into the crucial roles of citizen report analytics, prediction, and digital twin technologies at the neighborhood level. The study integrates extract, transform, load (ETL) processes, artificial intelligence (AI) techniques, and a digital twin methodology to process and interpret urban data streams derived from citizen interactions with the city’s coordinate-based problem mapping platform. Using an interactive GeoDataFrame within the digital twin methodology, dynamic entities facilitate simulations based on various scenarios, allowing users to visualize, analyze, and predict the response of the urban system at the neighborhood level. This approach reveals antecedent and predictive patterns, trends, and correlations at the physical level of each city area, leading to improvements in urban functionality, resilience, and resident quality of life. Full article
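The neighborhood-level aggregation of citizen reports that feeds the digital-twin simulations can be sketched with geopandas: point reports are spatially joined to neighborhood polygons and counted per category. File names, column names, and the presence of a "neighborhood" attribute in the polygon layer are assumptions.

```python
# Illustrative GeoDataFrame aggregation of citizen reports per neighborhood,
# the kind of ETL output a digital-twin simulation could consume.
# File and column names are assumptions.
import geopandas as gpd
import pandas as pd

reports = pd.read_csv("citizen_reports.csv")   # expects lon, lat, category columns
reports_gdf = gpd.GeoDataFrame(
    reports,
    geometry=gpd.points_from_xy(reports["lon"], reports["lat"]),
    crs="EPSG:4326",
)

neighborhoods = gpd.read_file("neighborhoods.geojson")  # one polygon per neighborhood

joined = gpd.sjoin(reports_gdf, neighborhoods, how="inner", predicate="within")
per_neighborhood = joined.groupby(["neighborhood", "category"]).size().rename("reports")
print(per_neighborhood.sort_values(ascending=False).head())
```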

34 pages, 5820 KiB  
Review
3D Cadastral Database Systems—A Systematic Literature Review
by Javad Shahidinejad, Mohsen Kalantari and Abbas Rajabifard
ISPRS Int. J. Geo-Inf. 2024, 13(1), 30; https://doi.org/10.3390/ijgi13010030 - 17 Jan 2024
Cited by 10 | Viewed by 7166
Abstract
Cadastral databases have been used for over 20 years, but most contain 2D data. The increasing presence of high-rise buildings with modern architecture complicates the process of determining property rights, restrictions, and responsibilities. It is, therefore, necessary to develop an efficient system for storing and managing multidimensional cadastral data. While there have been attempts to develop 3D cadastral database schemas, a comprehensive solution that meets all the requirements for effective data storage, manipulation, and retrieval has not yet been presented. This study aims to analyse the literature on 3D cadastral databases to identify approaches and technologies for storing and managing these data. Based on a systematic literature review integrated with a snowballing methodology, 108 documents were identified. During the analysis of the related documents, different parameters were extracted, including the conceptual data model, query type, and evaluation metrics, as well as the database management system (DBMS) used and technologies for visualisation, data preparation, data transformation, and the ETL (extract, transform, and load) process. The study emphasised the importance of adhering to database design principles and identified challenges associated with conceptual design, DBMS selection, logical design, and physical design. The study results provide insights for selecting the appropriate standards, technologies, and DBMSs for designing a 3D cadastral database system. Full article

22 pages, 9457 KiB  
Article
Deep Learning Applications in Vessel Dead Reckoning to Deal with Missing Automatic Identification System Data
by Atefe Sedaghat, Homayoon Arbabkhah, Masood Jafari Kang and Maryam Hamidi
J. Mar. Sci. Eng. 2024, 12(1), 152; https://doi.org/10.3390/jmse12010152 - 12 Jan 2024
Cited by 12 | Viewed by 2527
Abstract
This research introduces an online system for monitoring maritime traffic, aimed at tracking vessels in water routes and predicting their subsequent locations in real time. The proposed framework utilizes an Extract, Transform, and Load (ETL) pipeline to dynamically process AIS data by cleaning, compressing, and enhancing it with additional attributes such as online traffic volume, origin/destination, vessel trips, trip direction, and vessel routing. This processed data, enriched with valuable details, serves as an alternative to raw AIS data stored in a centralized database. For user interactions, a user interface is designed to query the database and provide real-time information on a map-based interface. To deal with false or missing AIS records, two methods, dead reckoning and machine learning techniques, are employed to anticipate the trajectory of the vessel in the next time steps. To evaluate each method, several metrics are used, including R squared, mean absolute error, mean offset, and mean offset from the centerline. The functionality of the proposed system is showcased through a case study conducted in the Gulf Intracoastal Waterway (GIWW). Three years of AIS data are collected and processed as a simulated API to transmit AIS records every five minutes. According to our results, the Seq2Seq model exhibits strong performance (0.99 R squared and an average offset of ~1400 ft). However, the second scenario, dead reckoning, proves comparable to the Seq2Seq model as it involves recalculating vessel headings by comparing each data point with the previous one. Full article
(This article belongs to the Special Issue Application of Artificial Intelligence in Maritime Transportation)
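A dead-reckoning step of the kind used as the baseline above can be sketched as projecting the next position from the last AIS speed over ground and course over ground across a time step. This uses a flat-earth approximation and is not necessarily the paper's exact formulation; the example coordinates are arbitrary.

```python
# Simplified dead-reckoning step: project the next position from the last AIS
# speed over ground (knots) and course over ground (degrees) after dt seconds.
# Flat-earth approximation; not necessarily the paper's exact formulation.
import math

def dead_reckon(lat, lon, sog_knots, cog_deg, dt_s):
    """Return (lat, lon) predicted dt_s seconds ahead."""
    dist_m = sog_knots * 0.514444 * dt_s          # knots -> m/s
    cog = math.radians(cog_deg)
    d_north = dist_m * math.cos(cog)
    d_east = dist_m * math.sin(cog)
    dlat = d_north / 111_320.0                     # metres per degree of latitude
    dlon = d_east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Example: vessel at 8 kn heading 090°, predicted 5 minutes (300 s) ahead
print(dead_reckon(29.30, -94.80, sog_knots=8.0, cog_deg=90.0, dt_s=300))
```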

16 pages, 5211 KiB  
Article
Developing Integrated Performance Dashboards Visualisations Using Power BI as a Platform
by Célia Talma Gonçalves, Maria José Angélico Gonçalves and Maria Inês Campante
Information 2023, 14(11), 614; https://doi.org/10.3390/info14110614 - 15 Nov 2023
Cited by 15 | Viewed by 22925
Abstract
The rapid advance of business technologies in recent years has made knowledge an essential and strategic asset that determines the success or failure of an organisation. Access to the right information in real time and with high selectivity can be a competitive advantage in the business environment. Business intelligence systems help corporate executives, business managers, and other operational workers make better and more informed business decisions. This study aimed to assess the impact of using business intelligence tools on the decision-making process in organisations, specifically in sales marketing. The methodology applied to realise the study’s objective was the Vercellis methodology. A set of data available on the sales marketing website SuperDataScience was used to implement a set of pressing KPIs for the business decision-making process in the area. Using these data, a complete business intelligence system solution was implemented. A data warehouse was created using the ETL (extract–transform–load) process, and the data were then explored using a set of dynamic dashboards with a view of the business metrics. The results showed that business intelligence systems allow the integration and transformation of data from various sources stored in data warehouses, where it is possible to implement KPIs and carry out quick, concise, easy-to-interpret graphical analyses. This paper contributes to a better understanding of the importance of data-integrated dashboard visualisation for the decision-making process. Full article
(This article belongs to the Special Issue Artificial Intelligence (AI) for Economics and Business Management)
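The ETL-into-warehouse and KPI step that sits behind such dashboards can be sketched outside Power BI: load a cleaned sales extract into a warehouse table (SQLite stands in for the production warehouse here) and compute a monthly revenue KPI in SQL. Table and column names are assumptions, not the study's schema.

```python
# Stand-in for the ETL + KPI step behind the dashboards: load a cleaned sales
# extract into a warehouse table (SQLite used instead of a production warehouse)
# and compute a monthly-revenue KPI. Names are assumptions.
import sqlite3
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["order_date"])        # extract
sales["revenue"] = sales["quantity"] * sales["unit_price"]          # transform

with sqlite3.connect("warehouse.db") as con:                        # load
    sales.to_sql("fact_sales", con, if_exists="replace", index=False)
    kpi = pd.read_sql_query(
        """
        SELECT strftime('%Y-%m', order_date) AS month,
               SUM(revenue)                  AS monthly_revenue
        FROM fact_sales
        GROUP BY month
        ORDER BY month
        """,
        con,
    )
print(kpi.tail())
```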

27 pages, 8048 KiB  
Article
Pervasive Real-Time Analytical Framework—A Case Study on Car Parking Monitoring
by Francisca Barros, Beatriz Rodrigues, José Vieira and Filipe Portela
Information 2023, 14(11), 584; https://doi.org/10.3390/info14110584 - 25 Oct 2023
Viewed by 2382
Abstract
Due to the amount of data emerging, it is necessary to use an online analytical processing (OLAP) framework capable of responding to the needs of industries. Processes such as drill-down, roll-up, three-dimensional analysis, and data filtering are fundamental for the perception of information. This article demonstrates the OLAP framework developed as a valuable and effective solution in decision making. To develop an OLAP framework, it was necessary to create the extract, transform, and load (ETL) process, build a data warehouse, and develop the OLAP layer via cube.js. Finally, it was essential to design a solution that adds more value to the organizations and presents several characteristics to support the entire data analysis process. A backend API (application programming interface) to route the data via MySQL was required, as well as a frontend and a data visualization layer. The OLAP framework was developed for the ioCity project. However, its great advantage is its versatility, which allows any industry to use it in its system. One ETL process, one data warehouse, one OLAP model, six indicators, and one OLAP framework were developed (with one frontend and one API backend). In conclusion, this article demonstrates the importance of a modular, adaptable, and scalable tool in the data analysis process and in supporting decision making. Full article
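The roll-up and drill-down operations the framework exposes can be illustrated with a pandas pivot over a toy parking fact table; the cube.js and MySQL backend of the actual framework is not reproduced, and all values are invented.

```python
# Toy illustration of roll-up / drill-down over a small parking fact table
# (the framework's cube.js / MySQL backend is not reproduced; values are invented).
import pandas as pd

fact = pd.DataFrame({
    "city": ["Braga", "Braga", "Porto", "Porto"],
    "park": ["P1", "P2", "P3", "P4"],
    "month": ["2023-09", "2023-09", "2023-09", "2023-10"],
    "occupied_hours": [3100, 2800, 4100, 3900],
})

# Drill-down: occupancy per individual park and month
drill_down = fact.pivot_table(index=["city", "park"], columns="month",
                              values="occupied_hours", aggfunc="sum")

# Roll-up: aggregate away the park level to get city totals per month
roll_up = fact.groupby(["city", "month"])["occupied_hours"].sum().unstack()

print(drill_down, roll_up, sep="\n\n")
```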
