Review

Revolutionizing Data Exchange Through Intelligent Automation: Insights and Trends

by Yeison Nolberto Cardona-Álvarez *, Andrés Marino Álvarez-Meza and German Castellanos-Dominguez
Signal Processing and Recognition Group, Universidad Nacional de Colombia, Manizales 170003, Colombia
* Author to whom correspondence should be addressed.
Computers 2025, 14(5), 194; https://doi.org/10.3390/computers14050194
Submission received: 21 April 2025 / Revised: 13 May 2025 / Accepted: 15 May 2025 / Published: 17 May 2025
(This article belongs to the Special Issue Cloud Computing and Big Data Mining)

Abstract:
This review paper presents a comprehensive analysis of the evolving landscape of data exchange, with a particular focus on the transformative role of emerging technologies such as blockchain, field-programmable gate arrays (FPGAs), and artificial intelligence (AI). We explore how the integration of these technologies into data management systems enhances operational efficiency, precision, and security through intelligent automation and advanced machine learning techniques. The paper also critically examines the key challenges facing data exchange today, including issues of interoperability, the demand for real-time processing, and the stringent requirements of regulatory compliance. Furthermore, it underscores the urgent need for robust ethical frameworks to guide the responsible use of AI and to protect data privacy. In addressing these challenges, the paper calls for innovative research aimed at overcoming current limitations in scalability and security. It advocates for interdisciplinary approaches that harmonize technological innovation with legal and ethical considerations. Ultimately, this review highlights the pivotal role of collaboration among researchers, industry stakeholders, and policymakers in fostering a digitally inclusive future—one that strengthens data exchange practices while upholding global standards of fairness, transparency, and accountability.

1. Introduction

A comprehensive understanding of modern data management is essential in today’s complex and interconnected digital landscape. This encompasses the entire data lifecycle, including generation, acquisition, storage, processing, and eventual utilization [1]. Grasping these components underscores the critical importance of efficient data exchange in ensuring the accuracy, reliability, and accessibility of information across systems and platforms [2]. Furthermore, such foundational knowledge facilitates an exploration of the historical evolution and underlying principles that have shaped current data exchange architectures [3]. By analyzing these dimensions, we will be better equipped to contextualize contemporary advancements and critically assess ongoing developments in the field [4].
Despite the crucial role of data exchange, particularly in the early stages of data management, there is a notable deficiency in comprehensive analysis within the academic literature. This gap is especially pronounced in areas such as big data, real-time processing, and cloud computing, where the complexities of data exchange are rapidly evolving and most significant [5]. The scarcity of detailed scholarly discussions on the intricacies of data exchange and its impact on data quality highlights a pressing need for thorough investigation, particularly in sectors like healthcare, finance, and scientific research, where accuracy, efficiency, and data security are paramount. Therefore, the primary motivation for this review is to address this evident gap in comprehensive academic analysis [6]. Given the rapid evolution and increasing complexity of data exchange, there is an urgent need to explore these areas in depth. By focusing on advanced technologies such as blockchain, field-programmable gate arrays (FPGAs), and AI, we aim to provide a detailed evaluation of current practices, identify key challenges, and propose solutions to improve the efficiency and security of data management systems [7].
Rather than limiting the scope to a single emerging technology, this study deliberately adopts a comparative lens to explore how multiple innovations—including blockchain, FPGA-based hardware, zero-knowledge cryptography, and AI—jointly address the evolving challenges of data exchange. To provide a focused and actionable review, this paper centers on three key enabling technologies in modern data exchange systems: blockchain, field-programmable gate arrays (FPGAs), and artificial intelligence (AI). Each technology is analyzed in terms of its contribution to enhancing interoperability, scalability, and security in data management systems. Blockchain ensures data integrity and traceability in decentralized systems, FPGA-based hardware enables low-latency, high-performance data processing, and AI supports automation and predictive analytics in dynamic data environments.
This paper therefore aims to address the existing gap in the scholarly literature by presenting a comprehensive analysis of the processes and technologies that underpin modern data exchange. Particular emphasis is placed on the critical stages of the data lifecycle—generation, storage, and processing—as foundational components that shape the effectiveness of data exchange systems. In doing so, we provide both the necessary theoretical grounding and historical context that have influenced the evolution of contemporary data exchange technologies, thereby facilitating a more nuanced understanding of their current state. We evaluate a range of cutting-edge tools and protocols with the potential to revolutionize data exchange, examining their practical effectiveness and identifying persistent challenges such as data security, transfer latency, interoperability, and the real-time handling of large-scale datasets. Ultimately, the core objective is to investigate how these innovations can overcome existing limitations and contribute to enhancing the efficiency, reliability, and security of modern data administration practices. In summary, this review presents a holistic exploration of the evolving landscape of data exchange by examining a broad spectrum of interrelated themes. These include advanced data processing techniques—such as FPGA-based architectures—secure and decentralized frameworks through blockchain and IoT integration, and the application of artificial intelligence for intelligent automation. It also addresses key systemic challenges, including interoperability, scalability, and regulatory compliance, while highlighting emerging innovations like edge computing, real-time analytics, and privacy-preserving mechanisms such as zero-knowledge proofs. By linking these technological domains with practical implementations in sectors such as healthcare, finance, and industry, the review aims to provide an integrated view of how these innovations are reshaping data exchange in practice.
The remainder of this paper is organized as follows: Section 2 presents the main theoretical background, and Section 3 describes the research methodology. Next, Section 4 presents the literature findings. Finally, Section 5 and Section 6 present the discussion and the concluding remarks.

2. Background

2.1. Main Definitions

For the sake of clarity and consistency throughout this work, we define several key terms that underpin our discussion:
Data Exchange: The process of transferring data across various systems, platforms, or organizations. This involves not only the physical transmission of data but also the transformation and integration of formats, protocols, and system architectures [8].
Intelligent Automation: The application of advanced technologies—such as Artificial intelligence (AI), Machine learning (ML), and Robotic process automation (RPA)—to automate complex tasks traditionally performed by humans. The core aim is to enhance operational efficiency and productivity [9].
Blockchain: A decentralized digital ledger system that records transactions across multiple nodes. It ensures transparency and data integrity, especially in contexts where mutual trust between parties is limited [10].
FPGA: A field-programmable gate array is an integrated circuit that can be configured post-manufacturing. It is often employed in specialized hardware applications to optimize data processing capabilities [11].
Data Integrity: Refers to the accuracy, consistency, and reliability of data throughout its lifecycle. It is vital to maintain unaltered and dependable data within databases or other data structures [12].
Data Privacy: Involves the appropriate management of data in accordance with relevant data protection laws and ethical guidelines. This is particularly important when handling personal or sensitive information [13].
Interoperability: The ability of heterogeneous computer systems and software applications to seamlessly exchange and utilize information. It is a critical requirement for efficient and effective data sharing [14].

2.2. Data Exchange Evolution

The history and evolution of data exchange have been characterized by notable technological advancements that have fundamentally transformed the way data is exchanged and utilized across many platforms and industries. At first, the transfer of data was done manually, using tangible media like paper documents and magnetic tapes, a process that was both time-consuming and error-prone. With the emergence of computers and the Internet, Electronic data interchange (EDI) became widespread, enabling a more efficient and streamlined exchange of information [15]. The advent of databases and networked computers in the latter part of the 20th century greatly eased the flow of data in real time or near-real time, improving the capacity to conduct commerce and research on a worldwide scale [16]. Cloud computing then emerged at the turn of the millennium, offering scalable and adaptable options for storing and exchanging data, and greatly reducing the obstacles to data-intensive processes. In recent times, significant developments in blockchain and machine learning have tackled security and automation concerns, introducing new ways of ensuring the accuracy of data and improving the sophistication of data exchange procedures. Each of these technical advancements has played a role in creating a society that is more interconnected and reliant on data, paving the way for the next wave of breakthroughs in managing and organizing data [17].
Of note, effective data exchange enables businesses to leverage large datasets, thereby enhancing decision-making, gaining customer insights, and driving innovation. In healthcare, efficient data sharing is vital for improving patient outcomes, integrating Electronic health records (EHR) across diverse systems, and supporting telemedicine. In the financial sector, secure and rapid data exchange is essential for real-time transaction processing, risk management, and regulatory compliance [18]. Additionally, in scientific research, data exchange promotes collaboration and accelerates discovery by allowing researchers to share findings and access extensive databases globally. The growing Internet of Things (IoT) network heavily relies on seamless data transmission between devices to automate and optimize processes in smart homes, industrial environments, and urban development. Furthermore, data exchange is crucial in the field of artificial intelligence and machine learning, where large and diverse datasets are necessary for developing accurate models. Efficient and secure data transmission not only fosters innovation and competition but also supports governance and sustainability initiatives by providing essential insights into environmental, social, and governance issues [19].
In particular, several core technologies enable the transfer of data in today’s interconnected digital economy. Networking protocols such as the Transmission Control Protocol/Internet Protocol (TCP/IP) form the foundation for data communication via the internet, guaranteeing the secure and efficient transmission of data packets between devices and locations. Database management systems (DBMS) are essential for the storage, retrieval, and administration of data across networks, with Structured Query Language (SQL) and Not Only SQL (NoSQL) databases handling structured and unstructured data, respectively [20]. Application programming interfaces (APIs) are essential in facilitating communication and integration across different software systems, which they accomplish by establishing the specific methods and data structures that programs can utilize to connect with one another. Also, cloud computing platforms, such as AWS, Google Cloud, and Microsoft Azure, provide adaptable and versatile environments for storing and sharing data, supporting a wide range of services from data hosting to comprehensive enterprise solutions [21].
Blockchain technology also plays a crucial role in maintaining the integrity and security of data in decentralized networks, offering a transparent and tamper-proof system for conducting data transactions. In addition, data serialization formats such as JavaScript Object Notation (JSON), Extensible Markup Language (XML), and Protocol Buffers (Protobuf) streamline the transfer of data between various systems by establishing a standardized method for encoding and decoding data. Edge computing is becoming increasingly important as it processes data close to its source, reducing delays and bandwidth usage in large-scale IoT operations and improving the capacity to share real-time data [22].
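To make the role of serialization formats concrete, the minimal sketch below passes the same record through JSON and XML. It is illustrative only; the field names are invented for the example.

```python
import json
import xml.etree.ElementTree as ET

# A sample record exchanged between two systems (field names are invented).
record = {"device_id": "sensor-42", "temperature": 21.5, "unit": "celsius"}

# JSON: compact, schema-less encoding widely used by web APIs.
json_payload = json.dumps(record)
decoded = json.loads(json_payload)
assert decoded == record  # lossless round trip

# XML: the same record as a tagged document, common in legacy EDI pipelines.
root = ET.Element("record")
for key, value in record.items():
    child = ET.SubElement(root, key)
    child.text = str(value)
xml_payload = ET.tostring(root, encoding="unicode")
print(json_payload)
print(xml_payload)
```

Both encodings carry the same information; the choice between them is typically driven by the consuming system's tooling, schema-validation needs, and payload-size constraints.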

2.3. Main Issues

Despite notable advancements in data exchange technologies, modern data management continues to face a range of complex challenges that hinder seamless and secure data sharing. One of the foremost concerns is data security and privacy. The increasing frequency and sophistication of cyber-attacks threaten both the integrity and confidentiality of data. To mitigate these risks, organizations must continually update their cybersecurity strategies to defend against breaches, ransomware, and other malicious threats—especially in sectors handling sensitive data, such as finance and healthcare [23].
Interoperability remains another persistent hurdle. Discrepancies in data formats, standards, and system architectures across organizations often obstruct smooth integration and exchange. Achieving effective communication between disparate systems typically demands substantial effort in data transformation, normalization, and semantic alignment [24]. Moreover, data quality is also a critical issue. Inaccurate, incomplete, or inconsistent data can significantly undermine decision-making processes and lead to operational inefficiencies [25]. Ensuring high-quality data requires rigorous validation, cleaning, and governance mechanisms—activities that can be resource-intensive but are indispensable for reliable analytics and system performance.
Compounding these challenges is the need to maintain regulatory compliance [26]. Stringent frameworks such as the General Data Protection Regulation (GDPR) in the European Union and the California Consumer Privacy Act (CCPA) in the United States impose comprehensive rules on data privacy, storage, and transmission. Navigating these regulatory landscapes demands meticulous attention to compliance practices to avoid severe legal and financial consequences. Also, as data volumes grow exponentially, scalability becomes another critical concern [27]. In particular, efficiently managing vast datasets without degrading system performance requires the adoption of scalable architectures and advanced processing technologies.
Finally, the rise of applications requiring real-time data sharing—such as real-time analytics and IoT ecosystems—has intensified the demand for faster, more responsive technologies. These applications rely on instantaneous data access and processing capabilities to deliver timely insights and actions, underscoring the urgent need for infrastructure that supports low-latency, high-throughput data exchange [28].

2.4. Automation and Data Sharing

The integration of AI and automation into data exchange systems has profoundly transformed the way data is managed, processed, and secured. By automating complex operations such as Extract, Transform and Load (ETL), AI significantly reduces human error and accelerates data workflows. Machine learning algorithms further enhance application performance through dynamic optimization, continuously adjusting configurations based on real-time performance metrics to improve resource utilization and operational efficiency [29]. AI-driven automation also plays a pivotal role in maintaining data quality, enabling the real-time detection and correction of anomalies to uphold data integrity. Predictive analytics powered by AI facilitate proactive decision-making, allowing organizations to anticipate trends and respond more effectively to changing market conditions [30]. Additionally, advanced AI models enhance security by embedding intelligent anomaly detection into data transmission protocols, ensuring early identification and mitigation of potential threats.
Beyond operational enhancements, AI is revolutionizing regulatory compliance by automatically tracking legislative updates and adjusting data handling procedures accordingly. This proactive compliance management reduces the risk of regulatory breaches and associated penalties [31]. Collectively, the adoption of AI and automation not only refines existing data exchange practices but also lays the groundwork for more intelligent, responsive, and secure data ecosystems capable of addressing the complexities of modern digital infrastructures.
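As a simple illustration of the anomaly-detection idea described above, the sketch below flags stream values that deviate sharply from a rolling baseline. It is a minimal statistical stand-in (a z-score test over a sliding window), not a description of any specific production system; all names and thresholds are illustrative.

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=20, threshold=3.0):
    """Flag values more than `threshold` standard deviations away from
    the rolling mean of the last `window` observations (toy sketch)."""
    history = deque(maxlen=window)

    def check(value):
        is_anomaly = False
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                is_anomaly = True
        history.append(value)  # the baseline adapts over time
        return is_anomaly

    return check

detector = make_detector()
stream = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 55.0, 10.0]
flags = [detector(v) for v in stream]
print(flags)  # the spike at 55.0 should be flagged
```

Production AI-driven monitors replace the z-score with learned models, but the control flow is the same: observe, compare against an adaptive baseline, and flag or correct outliers before they propagate downstream.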
Alongside legal compliance, ethical considerations are gaining prominence, particularly in ensuring that data management practices uphold principles of fairness, transparency, and non-discrimination. Heightened awareness of bias in AI algorithms—especially in automated decision-making—has spurred calls for ethical AI development, emphasizing the need for accountable, explainable, and unbiased systems [32]. This is especially critical in sensitive domains like healthcare and finance, where ethical lapses can have profound societal consequences. Lastly, the dynamic nature of technological innovation demands that organizations not only comply with existing regulations but also remain agile in adapting to evolving legal standards and societal expectations. Staying informed and responsive to this shifting landscape is vital for fostering public trust, achieving sustainable compliance, and ensuring responsible data exchange practices [33].

3. Research Methodology

The research methodology employed in this study is grounded in a systematic and well-defined approach to ensure reliability, reproducibility, and analytical depth. Our methodology adheres to the principles of the Systematic Literature Review (SLR) as defined by [34]. This review was performed in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. The complete study selection process is illustrated in the PRISMA flow diagram (see Figure 1).
To ensure methodological clarity, we adopted a structured SLR framework. Initially, a total of 348 records were retrieved from the selected databases. After applying the inclusion and exclusion criteria, 102 papers published between 2020 and 2024 were retained for detailed analysis, covering application domains such as healthcare, finance, logistics, and cloud computing. The selection process prioritized peer-reviewed journal articles and conference proceedings published in English. A summary of the four-phase review protocol is presented in Table 1, and Table 2 shows the thematic distribution of the reviewed papers, highlighting key areas such as blockchain, data privacy, and healthcare applications.
Table 3 further classifies the selected studies according to the main challenges addressed, such as interoperability, scalability, security, and regulatory compliance. This issue-based classification enables a clearer view of research concentration and existing gaps. To enhance clarity and comparative depth, we include a dedicated table (Table 4) summarizing the main technological paradigms discussed in this review, providing an overview of their respective strengths, limitations, and regulatory profiles.
Table 2 and Table 3 refer to subsets of the 102 studies included in the systematic review. While the article contains 168 references, only 102 were selected for formal analysis; the rest support methodological or contextual discussions (e.g., [1,3,34]).
Among the 102 reviewed studies, 92 were classified thematically (Table 2), and 74 were mapped to specific problem areas (Table 3). The remaining studies lacked a clear alignment with the predefined categories and were excluded from the tables to maintain consistency. Note that categories are not mutually exclusive, and a single study may appear in multiple groups. These distinctions are now clarified to avoid confusion.
Hence, the process is structured in four key stages:
Planning the Review: A detailed protocol was established to define the scope and objectives of the study. This included the formulation of research questions, selection criteria for inclusion and exclusion, and the identification of databases and keywords for the search. The protocol aimed to address critical aspects of data exchange systems, including methodologies, tools, challenges, and emerging trends.
Conducting the Review: A comprehensive search was conducted across multiple academic databases, including IEEE Xplore, ACM Digital Library, and SpringerLink, targeting publications from 2020 to 2024. Keywords such as “data exchange”, “blockchain in data processing”, “FPGA for data handling”, and “AI in secure data sharing” were used to retrieve relevant articles. Additionally, snowballing techniques were applied to identify key studies cited in the primary sources, ensuring broader coverage.
Data Extraction and Synthesis: The selected studies were critically evaluated to extract relevant data on methodologies, tools, frameworks, and challenges. Each study was cataloged using reference management software, allowing for consistent organization and traceability. The extracted data was systematically synthesized to identify common trends, gaps, and innovative solutions in the field of data exchange.
Quality Assessment and Validation: Each study was assessed against predefined quality metrics, including methodological rigor, relevance to the research objectives, and contribution to the field. To enhance validation, findings were cross-referenced with related works and corroborated through domain expertise.
The reviewed articles were thematically classified into categories including: Blockchain and Distributed Ledger (16), Data Privacy and Security (14), Healthcare and Medical Data (10), Ethical and Legal Frameworks (9), among others. This classification supports the identification of overrepresented and underexplored areas in the current research landscape. This dual classification—by topic and by problem—helps identify both saturated areas and those warranting further exploration.
The subsequent sections elaborate on the insights derived from this structured analysis and discuss their implications for advancing the state-of-the-art in data exchange systems.

4. Technological Approaches to Data Exchange

In this chapter, we explore how contemporary technologies such as Blockchain, FPGAs, and AI are shaping the mechanisms of data exchange. These paradigms are discussed in detail regarding their architecture, operational logic, and use cases in secure and scalable data transmission environments.

4.1. Technology and Infrastructure

We emphasize the innovations that drive data management systems by investigating sophisticated solutions such as FPGA-based data processing and the integration of blockchain with IoT. The discussions also encompass the difficulties in upholding real-time data settings and the pivotal importance of interoperability and standardized infrastructures in guaranteeing smooth data transmission. The purpose of this exploration is to provide insight into the fundamental technologies that improve the ability to handle data and increase operational efficiency.
FPGA-based Data Processing: Key research in this area has highlighted the substantial milestones achieved in the storage and handling of large-scale data through recent improvements in FPGA-based data processing [35]. Nevertheless, while these technologies offer the potential for enhanced efficiency and accuracy, they also present significant challenges that must not be disregarded, including implementation complexity and the need for consistent standards that integrate smoothly into current information technology ecosystems. For instance, the authors of [36] present a novel paradigm for creating FPGA-based accelerators, a framework that is crucial for intelligent storage systems involving data filtering and transformation in key-value stores. The paper by [37] explores the use of FPGA-based SmartNICs to improve the performance of read-intensive workloads in ordered key-value stores. The importance of ensuring real-time data freshness guarantees in key-value stores is emphasized by the study conducted by [38], which highlights a growing focus on enhancing data integrity in these systems. The paper by [39] introduces a novel approach to FPGA offloading that demonstrates significant performance gains over conventional CPU-only configurations. In [40], the authors investigate caching strategies on FPGAs in the blockchain field to improve the efficiency of key-value stores, demonstrating significant advancements in both performance and efficiency. In addition, the paper [41] presents a sophisticated data structure specifically developed for LSM-tree-based key-value stores, with the objective of improving the efficiency of read operations and reducing read amplification. A representative application of FPGA-based acceleration is found in real-time packet classification systems, where FPGAs are used to process incoming data packets with deterministic low latency.
These systems enable rapid filtering and routing decisions, essential in high-throughput networks. Similarly, in ETL (Extract, Transform, Load) pipelines, FPGAs have been applied to offload and parallelize tasks such as format conversion, compression, and deduplication, thereby significantly improving throughput and reducing processing delays in large-scale data integration workflows.
These studies collectively highlight a growing inclination towards advanced hardware-level data processing, which is a crucial element in the initial phases of the data lifecycle.
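While FPGA pipelines themselves are written in hardware description languages, the logic of one commonly offloaded ETL step can be conveyed in software. The sketch below models hash-based deduplication; it is a simplified software analogue of the kind of stage an FPGA pipeline might accelerate, not an FPGA implementation.

```python
import hashlib

def dedup_records(records):
    """Drop exact-duplicate records by content hash, preserving order.
    In an FPGA pipeline this hash-and-lookup step would run per record
    at line rate; here it is modeled sequentially for clarity."""
    seen = set()
    unique = []
    for rec in records:
        digest = hashlib.sha256(rec.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(rec)
    return unique

batch = ["row-a", "row-b", "row-a", "row-c", "row-b"]
print(dedup_records(batch))  # → ['row-a', 'row-b', 'row-c']
```

The appeal of offloading such a stage is that hashing and set membership are fixed, data-parallel operations, exactly the profile that maps well onto reconfigurable hardware.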
Blockchain and IoT Integration: The symbiotic relationship between blockchain technology and the Interplanetary File System (IPFS) in data exchange is gaining recognition in current research across disciplines. The paper by [42] introduces a novel architecture that integrates blockchain and IPFS in an electronic medical record system; this design effectively tackles the intricate issues related to medical data sharing and enhances safe and efficient data management in sensitive domains such as healthcare. The paper by [43] presents a framework that utilizes blockchain technology to enable monetization of IoT data, demonstrating its effectiveness in facilitating decentralized sharing of IoT data and implementing smart contracts. The authors of [44] propose a solution that combines Permissioned Blockchain and IPFS to provide a safe decentralized storage system that overcomes the limitations of centralized solutions. The paper by [45] investigates the utilization of Ethereum’s smart contracts in conjunction with IPFS for storing personal health information, emphasizing the effectiveness of this approach in managing personal data within the healthcare sector. In the field of IoT logistics, the study conducted by [46] presents a model that combines blockchain with IoT technology, specifically using IPFS, to securely store and monitor logistics data in real time, aiming to overcome major issues faced by IoT logistics systems. In addition, the article [47] presents a web application that combines blockchain technology and IPFS to facilitate data exchange in smart crop production.
These studies highlight the growing significance of blockchain and IPFS in several industries, demonstrating their promise in enabling secure, efficient, and decentralized data transmission.
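The blockchain-plus-IPFS pattern underlying these systems can be reduced to a minimal sketch: bulky payloads live in a content-addressed store, while only their hashes are appended to an immutable ledger, so any later tampering is detectable. The two dictionaries below stand in for IPFS and the blockchain; all names are illustrative.

```python
import hashlib
import json

off_chain_store = {}   # stands in for IPFS: content-addressed blobs
ledger = []            # stands in for a blockchain: append-only references

def publish(record: dict) -> str:
    """Store the payload off-chain and record its content hash on-chain."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    cid = hashlib.sha256(payload).hexdigest()   # simplified content ID
    off_chain_store[cid] = payload
    ledger.append({"cid": cid})
    return cid

def verify(cid: str) -> bool:
    """Re-hash the stored payload and compare with the on-chain reference."""
    payload = off_chain_store[cid]
    return hashlib.sha256(payload).hexdigest() == cid

cid = publish({"patient": "anon-001", "result": "negative"})
print(verify(cid))  # True while the payload is untampered
```

Real deployments add access control, encryption, and consensus, but the core guarantee, that the ledger entry cryptographically commits to the off-chain content, is exactly this hash comparison.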
Advanced AI Frameworks: Recent approaches such as BlockASP [48] integrate model checking techniques within blockchain-based systems to verify smart contracts and ensure traceable, compliant data flows, enabling formal reasoning about policy enforcement in decentralized data environments. BlockASP combines aspect-oriented programming (AOP) with model checking to verify the correctness of smart contract behaviors: it supports static analysis of contract logic and runtime verification of execution flows, ensuring that data exchange rules comply with formal security and integrity policies. This is especially useful in scenarios involving programmable compliance, such as automated regulatory enforcement in decentralized applications [49]. Another notable direction is AI-powered AOP, in which large language models (LLMs) and statistical learning are employed to monitor and adapt runtime data behaviors [50,51]. Such systems can dynamically enforce policies, detect anomalies, and adjust workflows in cloud-based data exchange environments; by embedding AI into AOP modules, they gain contextual awareness and self-regulation capabilities, supporting proactive adaptation to shifting regulatory contexts and workload profiles.
These AI-enhanced frameworks demonstrate the emerging convergence between formal verification methods and intelligent automation in secure data exchange. Their integration enables auditable, adaptable, and regulation-aware infrastructures that are particularly well suited to real-time and cloud-based environments. Coupling formal verification (via model checking) with blockchain mechanisms, as in BlockASP, supports proactive error detection, auditable logic validation, and improved trust in decentralized data workflows. This aligns with the broader shift toward intelligent, self-adaptive data governance: runtime policy enforcement without human intervention, enhancing both observability and compliance in dynamic data pipelines, especially under changing workloads or regulatory conditions.
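The aspect-oriented idea of weaving a compliance check around data-exchange operations can be sketched with a Python decorator. The blocked-field policy below is hypothetical and exists only to illustrate the pattern of runtime policy enforcement without touching the wrapped function's logic.

```python
import functools

BLOCKED_FIELDS = {"ssn", "credit_card"}  # hypothetical compliance policy

def enforce_policy(func):
    """Aspect-style wrapper: reject payloads containing blocked fields
    before the wrapped exchange function ever runs (illustrative sketch)."""
    @functools.wraps(func)
    def wrapper(payload: dict):
        violations = BLOCKED_FIELDS & payload.keys()
        if violations:
            raise PermissionError(f"policy violation: {sorted(violations)}")
        return func(payload)
    return wrapper

@enforce_policy
def send_record(payload: dict) -> str:
    # Stand-in for the actual data-exchange call.
    return f"sent {len(payload)} fields"

print(send_record({"name": "Ada", "dept": "R&D"}))  # passes the policy
```

The key design point is separation of concerns: the policy lives in the aspect, so it can be updated, or in AI-assisted variants, learned and adapted, without modifying any of the functions it guards.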
Interoperability and Standardization: The latest research highlights the persistent difficulties and advancements in the compatibility and standardization of data exchange. The article by [52] examines the main obstacles and possibilities for enhancing compatibility between various healthcare participants, with a specific emphasis on technological, organizational, and environmental aspects. The article by [53] focuses on the urgent problem of interoperability in healthcare informatics, specifically examining how Structured Data Capture (SDC) can help handle the complications, particularly in the context of oncology data. Ref. [54] examines the adoption of Fast Healthcare Interoperability Resources (FHIR) by Health Level Seven (HL7) and the difficulties it presents in achieving semantic interoperability, as well as addressing concerns related to security and privacy. The article by [55] evaluates the current status of APIs in the healthcare industry, highlighting the difficulties in accessing and exchanging data using APIs, as well as the possibilities for enhancing their capabilities to allow for data writing. The examination conducted by [56] sheds light on the obstacles and deficiencies in attaining inter-cloud interoperability, emphasizing the necessity for strong standards and procedures to provide seamless compatibility. The article by [57] emphasizes the need to address issues related to accessing and sharing data in this field; the authors propose the LOTUS effort as a solution to promote the standardization and open dissemination of referenced structure-organism pairs. Recent AI-driven interoperability frameworks have adopted semantic metadata mapping and ontology-based schema matching to address heterogeneity in data exchange. These techniques enable systems to automatically align and interpret structurally distinct data models.
For example, semantic AI engines can identify conceptual overlaps across datasets using probabilistic reasoning and entity embeddings, facilitating integration in dynamic and distributed environments.
These discussions demonstrate the urgent and ongoing need for better compatibility and standardization across sectors, especially healthcare, where the incorporation of advanced technological solutions and strict standards is crucial for improving the efficiency and security of data exchange systems. Additionally, semantic mapping and automated schema alignment have emerged as essential tools for overcoming structural heterogeneity across data platforms, improving integration efficiency and reducing manual intervention.
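To make the idea of automated schema alignment concrete, the following minimal Python sketch matches field names across two hypothetical schemas using character-trigram profiles and cosine similarity. It is a toy stand-in for the embedding-based and ontology-driven matchers discussed above; all field names and the threshold value are illustrative, not drawn from any cited system.

```python
from collections import Counter
from math import sqrt

def trigrams(name):
    """Character-trigram profile of a normalized field name."""
    s = name.lower().replace("_", "")
    return Counter(s[i:i + 3] for i in range(len(s) - 2))

def cosine(a, b):
    """Cosine similarity between two trigram profiles."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def align_schemas(src, dst, threshold=0.3):
    """Greedily map each source field to its most similar destination field."""
    mapping = {}
    for s in src:
        best = max(dst, key=lambda d: cosine(trigrams(s), trigrams(d)))
        if cosine(trigrams(s), trigrams(best)) >= threshold:
            mapping[s] = best
    return mapping

# Example: aligning snake_case fields against CamelCase equivalents.
mapping = align_schemas(["patient_name", "birth_date"],
                        ["PatientName", "DateOfBirth"])
```

In practice, semantic matchers replace the trigram profiles with learned entity embeddings, but the alignment loop has the same shape: score candidate pairs, then keep matches above a confidence threshold.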
Enhanced Data Management: Table 5 presents a wide range of data management tools, classified by their applications across industries. This classification emphasizes the roles these tools play in enhancing operational efficiency, ensuring security, and meeting regulatory requirements. In Real-Time Data Management, Apache Kafka stands out for handling extensive data streams in the banking and telecommunications industries, enabling instant data analysis and decision-making. Redis and similar tools accelerate low-latency data processing, which is important in industries such as e-commerce and gaming, where optimizing transaction processing and user engagement is essential. Data integration platforms such as Apache NiFi and MuleSoft simplify intricate data workflows, which is crucial in the government and retail sectors to guarantee data coherence and availability. In Data Security, tools such as Cisco SecureX and IBM Guardium offer strong measures to safeguard sensitive information from threats in the corporate and healthcare industries. In Predictive Analysis and Machine Learning, platforms such as Google Cloud AI Platform and SAS Viya 3.5 apply sophisticated machine learning algorithms to convert data into actionable insights, providing notable benefits in retail and financial services by improving predictive capabilities and operational intelligence.
This thorough study enhances comprehension of how each tool aligns with its operational ecosystem, assisting organizations in choosing the most suitable solutions to tackle their unique data management difficulties.
Real-Time Data Environments: The works in [74,75] offer valuable insights into managing data exchange in dynamic, real-time systems, investigating solutions such as MuleSoft Anypoint and Apache Kafka. Kafka is well known for its ability to scale and process data in real time, excelling at continuous data streams with fault recovery and minimal delay; it is used in applications ranging from IoT networks to federated learning data streams. In addition, the article by [76] emphasizes Kafka's efficiency in processing large volumes of data with minimal delay and underscores the need to configure Kafka with Apache ZooKeeper to improve its performance. The authors of [77] examine Kafka's fault tolerance and message recovery features, proposing novel solutions such as the Checkpoint Interval Message Logging Algorithm to improve message reliability in the event of server failures. Nevertheless, the study by [78] investigates the difficulties Kafka faces in balancing consistency, performance, and cost in stream processing, shedding light on the intricate nature of these trade-offs. In addition, ref. [79] discusses data security in cloud-based deployments and suggests novel approaches, such as Secure Door on Cloud, to reduce the risk of privacy breaches.
These studies emphasize the important functions and unique constraints of tools such as Kafka in terms of scalability, administration complexity, and data security in real-time data exchange situations.
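The checkpoint-based recovery idea behind approaches such as the Checkpoint Interval Message Logging Algorithm [77] can be sketched in a few lines of Python: a consumer durably records its offset every few messages, so a crash replays only the messages since the last checkpoint. This is a schematic model under simplifying assumptions, not Kafka's actual implementation; the class and field names are illustrative.

```python
class CheckpointedConsumer:
    """Toy model of checkpoint-interval message logging: the consumer
    durably records its offset every `interval` messages, so a crash
    replays at most `interval - 1` messages (at-least-once delivery)."""

    def __init__(self, interval=3):
        self.interval = interval
        self.checkpoint = 0        # last durably stored offset
        self.processed = []

    def consume(self, log, crash_at=None):
        offset = self.checkpoint   # resume from the last checkpoint
        while offset < len(log):
            if crash_at is not None and offset == crash_at:
                return "crashed"   # simulated failure; checkpoint survives
            self.processed.append(log[offset])
            offset += 1
            if offset % self.interval == 0:
                self.checkpoint = offset   # persist progress
        self.checkpoint = offset
        return "done"
```

With `interval=2` and a crash at offset 3, messages 0 to 2 are processed, the checkpoint remains at 2, and the restarted consumer re-delivers message 2 before continuing, which illustrates the at-least-once trade-off between checkpoint frequency and replay cost.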

4.2. Security, Privacy, and Compliance

We address the crucial elements of security, privacy, and compliance in data exchange systems, considering growing digital threats and strict regulatory demands. The subject encompasses a range of methods, including traceable and privacy-preserving exchange mechanisms, as well as in-depth investigations of security and efficiency. We also analyze regulatory compliance and scalability, evaluating how various solutions adhere to international norms and enhance secure data sharing practices.
Traceable and Privacy-Preserving Exchange: An influential study by [80] presents ZKDET, a traceable data exchange paradigm that combines non-fungible tokens and zero-knowledge proofs (ZKPs) to guarantee both privacy and fairness, which is crucial in the initial phases of data management. The technique enables transparent monitoring of data transformations on blockchain platforms while ensuring equity and confidentiality in public storage systems through verifiable ZKPs, representing notable progress in data security and privacy. In addition, the article by [81] investigates the effects of ZKPs on improving privacy in digital identity systems, indicating their wide-ranging potential. The paper titled 'zk-BeSC: Confidential Blockchain Enabled Supply Chain Based on Polynomial Zero-Knowledge Proofs' by [82] presents a supply chain model, zk-BeSC, that uses polynomial zero-knowledge proofs to maintain transaction privacy and guarantee data integrity. The authors of [83] explore the incorporation of blockchain technology, Soulbound tokens, and ZKPs into a protocol for anonymity-preserving identity verification, demonstrating the effectiveness of these components for safe identity validation. A notable advancement is described in [84], which introduces a system for secure and private sharing of IoT data using Self-Sovereign Identities, Verifiable Credentials, and ZKPs, allowing selective data disclosure and precise access control.
These works highlight the increasing significance of traceable, secure, and private data exchange methods built on advanced technologies such as blockchain and ZKPs, which enhance the security and privacy of data across many applications.
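To ground the ZKP discussion, the following Python sketch implements the textbook interactive Schnorr protocol for proving knowledge of a discrete logarithm without revealing it. The group parameters are deliberately tiny toy values chosen for readability; the schemes surveyed above rely on far larger groups and on non-interactive constructions such as Fiat-Shamir.

```python
import random

# Toy Schnorr proof of knowledge of x such that y = g^x mod p.
# g = 2 has order q = 11 in the multiplicative group mod p = 23;
# real systems use cryptographically large parameters.
p, q, g = 23, 11, 2
x = 7                       # prover's secret
y = pow(g, x, p)            # public value known to the verifier

def prove():
    r = random.randrange(q)         # prover's fresh nonce
    t = pow(g, r, p)                # commitment sent to the verifier
    c = random.randrange(q)         # verifier's random challenge
    s = (r + c * x) % q             # response; reveals nothing about x alone
    return t, c, s

def verify(t, c, s):
    # Accept iff g^s == t * y^c (mod p), which holds exactly when
    # s = r + c*x for the nonce r committed in t.
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

The soundness intuition: a prover who could answer two different challenges for the same commitment could be used to extract x, so answering correctly demonstrates knowledge of the secret without transmitting it.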
Privacy in Big Data Exchange: The paper by [85] examines privacy in the exchange of vast amounts of data, discussing the problems and future research directions essential for ensuring privacy in such exchanges. This study emphasizes the growing acknowledgment of, and necessity for, strong privacy safeguards in data management. Expanding on this discussion, the paper by [86] introduces a privacy method for decentralized cloud settings based on smart contracts, making a substantial contribution to the subject. Simultaneously, [87] introduces FAPS, a massive data sharing strategy that ensures fairness and privacy using smart contracts and the oblivious transfer protocol. The study by [88] investigates a cryptographic framework for analyzing large-scale data, emphasizing the crucial requirement for sophisticated encryption techniques to protect user data effectively. The paper by [89] investigates hybrid cloud-based privacy-preserving clustering, aiming to balance computation and communication costs while maintaining data efficiency. In addition, the authors of [80] revisit ZKDET, a data exchange method that combines non-fungible tokens and zero-knowledge proofs to ensure traceability and privacy.
These studies emphasize the crucial significance of preserving privacy in the flow of large amounts of data and the various technological advancements created to address these intricate difficulties.
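A building block shared by several of the fairness-oriented schemes above is the cryptographic commitment: a party publishes a binding digest of its data before the exchange and later reveals the data plus a nonce so the counterparty can verify it. A minimal sketch using SHA-256 from the Python standard library follows; it illustrates the binding/hiding idea only and omits the smart contract and oblivious transfer machinery the cited protocols build on top.

```python
import hashlib
import secrets

def commit(data):
    """Bind to `data` now; the digest reveals nothing until opened."""
    nonce = secrets.token_bytes(16)                 # hiding randomness
    digest = hashlib.sha256(nonce + data).digest()
    return digest, nonce

def open_commitment(digest, data, nonce):
    """Check that (data, nonce) matches the previously published digest."""
    return hashlib.sha256(nonce + data).digest() == digest
```

In a fair exchange, for example, a seller could publish the commitment on-chain before payment; once payment clears, the buyer verifies the delivered bytes against it, so neither side can swap the data after the fact.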
Regulatory Compliance and Scalability: Automation solutions have demonstrated concrete capabilities in addressing regulatory requirements such as those defined in the GDPR. For example, data lineage tracking systems can automatically log data access and transformation events, satisfying auditability and accountability provisions [90]. Smart contract mechanisms enable programmable consent enforcement, ensuring that personal data is processed only for authorized purposes [91]. Additionally, AI-based policy engines can dynamically adapt data handling rules in response to changes in regulatory conditions, supporting continuous compliance in distributed environments [92]. In addition, aligning automation technologies with formal governance frameworks is crucial to ensuring consistent regulatory adherence. Frameworks such as the OECD AI Principles and the NIST AI Risk Management Framework (RMF) provide actionable guidelines for implementing fairness, transparency, accountability, and privacy in AI-driven systems [93,94]. Integrating these principles into data exchange architectures supports explainability and auditability, which are key to compliance with regulations like the GDPR.
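The consent enforcement and lineage logging patterns described above can be sketched in a few lines of Python. The consent registry, purpose names, and decorated function below are hypothetical placeholders for what a smart contract or policy engine would provide in a real deployment.

```python
import functools
from datetime import datetime, timezone

# Hypothetical consent registry: subject -> purposes they agreed to.
CONSENT = {"alice": {"analytics"}, "bob": {"analytics", "marketing"}}
LINEAGE_LOG = []   # append-only audit trail of access attempts

def requires_consent(purpose):
    """Allow an operation only if the data subject consented to `purpose`,
    recording every attempt for auditability (GDPR-style accountability)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(subject, *args, **kwargs):
            allowed = purpose in CONSENT.get(subject, set())
            LINEAGE_LOG.append({
                "ts": datetime.now(timezone.utc).isoformat(),
                "subject": subject,
                "purpose": purpose,
                "operation": fn.__name__,
                "allowed": allowed,
            })
            if not allowed:
                raise PermissionError(f"no consent: {subject}/{purpose}")
            return fn(subject, *args, **kwargs)
        return wrapper
    return decorator

@requires_consent("marketing")
def send_offer(subject):
    return f"offer sent to {subject}"
```

Because denied attempts are logged alongside permitted ones, the same trail serves both purposes named in the text: enforcing authorized processing and satisfying auditability provisions.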
Within the domain of data exchange technology, regulatory compliance and scalability emerge as crucial factors, especially for businesses handling sensitive information. IBM Integration Bus and Microsoft Azure Data Factory exemplify tools that comply with stringent standards such as the GDPR and the Health Insurance Portability and Accountability Act (HIPAA), highlighting their crucial roles in facilitating secure and compliant data transmission. The significance of these characteristics is outlined in Table 6, which evaluates the capabilities of different tools in terms of compliance, scalability, and integration features; this analysis offers valuable insights to practitioners, aiding them in selecting the most suitable tools for their operational requirements. The focus on regulatory adherence is reinforced by recent academic contributions. The article by [95] presents a data sharing system that adheres to the GDPR and operates on a global scale, emphasizing the worldwide significance of data privacy issues. Similarly, the article by [96] introduces a framework that ensures trust in the sharing of health big data while complying with regulations, specifically addressing the distinct issues of managing healthcare data. The article by [97] examines the development of these systems, with particular focus on implementing robust data security and compliance protocols. In addition, the article by [98] examines the importance of privacy-by-design approaches to protect sensitive data and comply with data protection requirements. Finally, the article by [99] examines the use of blockchain technology in the healthcare industry, analyzing its ability to improve data security and comply with privacy standards such as the GDPR and HIPAA.
These studies emphasize the increasing need for solutions that not only enable reliable data sharing but also comply with strict regulatory criteria to guarantee data security and privacy. The insights derived from the table allow organizations to efficiently negotiate these challenges, enabling them to make educated decisions that link the deployment of technology with regulatory compliance and operational scalability.
Comparative Studies on Data Exchange Security and Efficiency: The article by [107] examines authentication and secure access procedures for scientific data, comparing technologies such as SciTokens, Verifiable Credentials, and Smart Contracts and emphasizing their crucial role in enabling trustworthy data exchange. The paper by [108] provides a thorough analysis of time series forecasting using Automated Machine Learning on various datasets, highlighting the significance of flexibility and accuracy in predictive analytics tools. In the domain of geoinformatics, the study by [109] presents a comparative examination of cloud services for processing geoinformation data, demonstrating how cloud-based solutions can improve the accessibility and processing capabilities of such data. In addition, the article by [110] investigates methods of extracting data from Android smartphones that use Qualcomm technology, offering valuable information on mobile data processing and extraction. The study by [111] investigates data security and protection in marketing, presenting a thorough analysis of programming methodologies using Golang and PHP and highlighting the importance of strong data protection methods. This investigation, conducted through theoretical analyses and in-depth case studies, demonstrates the importance of selecting appropriate tools to address specific operational issues. These insights help industry experts make informed decisions that are crucial for streamlining data exchange operations, guaranteeing security, and improving overall system performance.
Challenges and Solutions in Data Exchange Security: Recent research focuses on addressing security concerns in data communication and highlights significant advancements and solutions. The proposal of [112] seeks to connect big data with fast data technologies in order to improve cybersecurity analytics, specifically targeting intricate threats in the digital realm. The paper by [113] examines cybersecurity attacks targeting IoT devices and suggests secure authentication and communication methods. The article by [114] provides a comprehensive analysis of the vulnerabilities associated with the IoT. The author categorizes these threats according to the different layers of the IoT architecture and proposes effective solutions. The article by [115] investigates the use of blockchain technology for authentication in the Industrial Internet of Things (IIoT). The study specifically addresses the privacy and security issues that arise in industrial settings. The study conducted by [116] on Distributed Denial of Service (DDoS) assaults targeting IoT and machine learning solutions is a notable advancement in the field of cybersecurity. The paper by [117] presents a self-defense system for cloud firms that utilizes multilayer machine learning. On the other hand, [118] focuses on the application of data science for protecting digital footprints. The paper by [119] introduces TRADE, a blockchain-powered platform designed for the purpose of sharing anonymous cyber threat information. The paper by [120] investigates the utilization of a combination of AES and Twofish algorithms to improve the security of data in cloud systems. Ultimately, ref. [121] examines strategies for guaranteeing the privacy of data in peer-to-peer communication within blockchain technology. These studies emphasize the intricate nature of ensuring secure data exchange, showcasing cutting-edge technologies such as machine learning, blockchain, and hybrid cryptographic methods.

4.3. AI Impact, Applications, and Ethical Considerations

We examine the practical applications and ethical considerations of data sharing across industries, with a specific focus on healthcare systems and the use of artificial intelligence. We highlight the significance of ethical frameworks and sustainability in data management, exploring how these concepts are incorporated into both established and emerging technologies. This section seeks to offer a thorough understanding of the impact of data exchange on societal norms and individual privacy, as well as its role in improving system efficiency and decision-making capabilities.
Data Exchange in Healthcare Systems: Recent literature on healthcare systems highlights the crucial need for standards and technical integration in data exchange. The importance of standards such as XML, HL7, and Digital Imaging and Communications in Medicine (DICOM) for integrating disparate hospital information systems is emphasized; however, their relevance in present architectures is not fully recognized, as pointed out by [122]. The MEdge-Chain approach, explained by [123], uses edge computing and blockchain technologies to provide efficient and secure transmission of medical data, improving latency and reducing computational expense. The paper by [124] presents a robust framework based on Directed Acyclic Graphs (DAGs) for exchanging electronic health information, focusing on enhancing patient data security and ensuring immutability. The article by [125] explores how Health Information Exchange (HIE) models can improve healthcare quality and management efficiency by facilitating better data integration and interchange. The system presented in [126] integrates blockchain with the InterPlanetary File System to facilitate data sharing in the healthcare sector, incorporating advanced access control measures. Finally, [127] proposes a new suite of protocols for exchanging confidential health information, incorporating proxies and anonymizers to ensure secure transmission, with specific emphasis on meeting requirements for anonymization, secrecy, and authentication. These studies emphasize the significance of implementing standards and using cutting-edge technologies such as blockchain and edge computing to guarantee secure and efficient data exchange in healthcare systems.
Impact of Artificial Intelligence: AI and machine learning are transforming data exchange processes, improving efficiency, and enabling predictive analysis and real-time decision-making. The article by [128] explores the use of AI for currency exchange rate prediction, highlighting how AI's decision-making is enhanced by analyzing large, comprehensive data sets. The study by [129] showcases the application of machine learning in human resources management to support data-driven decision-making. The study by [130] investigates AI-driven big data analytics in B2B marketing and its impact on rational decision-making and firm performance. The article by [131] explores artificial intelligence for managing large financial datasets, focusing on improving financial analysis and decision support. The study by [132] emphasizes the contribution of AI to the precision of financial management systems in the information sector. The article by [133] demonstrates the rise of hybrid intelligent systems in animal research, which integrate AI algorithms with traditional models for sophisticated data analysis. The paper by [79] introduces an AI learning system designed for fair categorization in non-stationary data streams. The article by [134] examines the influence of AI on business intelligence and data analytics, with a specific focus on decision support systems for e-business. The paper by [135] introduces a method that combines human input with artificial intelligence to improve decision-making through prescriptive analytics. The work in [136] highlights the increasing significance of AI in healthcare, particularly for data processing and decision-making efficiency.
The article by [137] investigates the utilization of data science and AI in revolutionizing company operations and strategic decision-making. These studies collectively show that AI and machine learning have a substantial effect on improving the efficiency of data transmission, allowing for predictive analytics, and aiding informed decision-making in many industries.
Sustainability and Ethics: The convergence of sustainability and ethics in data exchange is increasingly seen as crucial. The analysis conducted by [138] focuses on the sustainability and social acceptability of data networks on exchange platforms. The article by [139] emphasizes the ethical issues associated with managing big data and suggests a paradigm for stakeholder ethics connected to sustainability. The article by [140] examines the legal and ethical frameworks surrounding the exchange of health information on a global scale, delving specifically into informed consent and ethical approval. The article by [141] proposes the implementation of dynamic consent and responsible governance in medical research. The article by [142] examines the ethical ramifications of exchanging information in the healthcare sector, focusing on issues such as confidentiality and informed consent. The paper by [143] presents a data trust paradigm aimed at promoting sustainable economic growth in the food chain. The paper by [144] presents a data governance framework for e-prescriptions that utilizes blockchain technology, with a focus on privacy and consent management. The article by [145] promotes reliable data governance in dementia research. The article by [146] highlights the environmental consequences of data-driven technology and emphasizes the ethical aspects of sustainability. These studies emphasize the importance of handling data exchange ethically and sustainably, notably by prioritizing informed consent, responsible governance, and environmental factors.

4.4. Emerging Challenges and Technological Responses

Adaptive Systems for Real-Time Data Exchange: Contemporary literature emphasizes the primary obstacles and technology advancements that are transforming the field of data communication. The issue of real-time API performance limitations, which are frequently caused by fixed configuration settings, is tackled through dynamic resource management, as explained by [147]. Additionally, adaptive scheduling in fog and IoT environments, as described by [148], and event-triggered communication networks, as proposed by [149], also address this problem. These studies highlight the necessity for systems that are more adaptable and quick to respond. In addition, ref. [150] has made significant contributions to the development and execution of real-time resource access protocols. Furthermore, ref. [151] has explored the use of reinforcement learning to improve the flexibility of Representational State Transfer API (REST-API) testing, demonstrating further advancements in this area. Ref. [152] also investigates the incorporation of machine learning in self-managing cloud storage to facilitate the shift towards more autonomous data exchange frameworks. Furthermore, distributed messaging systems encounter inefficiency and scalability challenges that are crucial for preserving the integrity of real-time communication. The challenges of scalability and system responsiveness are addressed by [153] through the use of adaptive topology in distributed publish/subscribe messaging, by [154] through reliability enhancements using distributed event streaming platforms, and by [155] through the implementation of Byzantine-resilient protocols for event-triggered systems. The paper by [156] discusses the architecture of real-time IoT data streaming testbeds, while the paper by [157] explores scalable cross-chain messaging. Both of these contributions contribute to the ongoing development of distributed data exchange.
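The event-triggered communication idea referenced above (e.g., [149]) can be illustrated with a minimal Python sketch: a node transmits a reading only when it deviates from the last transmitted value by more than a threshold, trading a bounded reporting error for reduced network traffic. The class name and threshold below are illustrative, not taken from the cited work.

```python
class EventTriggeredPublisher:
    """Publish a new reading only when it deviates from the last
    published value by more than `threshold`, suppressing redundant
    transmissions in bandwidth-constrained networks."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.last_sent = None
        self.sent = []             # values actually transmitted

    def update(self, value):
        """Return True if the value was transmitted, False if suppressed."""
        if self.last_sent is None or abs(value - self.last_sent) > self.threshold:
            self.last_sent = value
            self.sent.append(value)
            return True
        return False
```

For a slowly drifting sensor, most samples fall within the threshold and are suppressed, which is the source of the traffic savings; the threshold directly bounds the error a subscriber can accumulate between transmissions.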
Scalability and Reliability in Distributed Messaging: Furthermore, the incorporation of AI-powered multimodal ETL systems for managing unstructured and temporal data exposes notable inefficiencies, especially in sectors such as healthcare and energy management, where real-time data handling is vital. One instance is the combination of IoT with machine learning to monitor patient health in real time, as investigated by [158]; this integration has demonstrated the ability to optimize data processing. Furthermore, the study conducted by [159] has shown that real-time energy consumption data management and prediction systems yield enhanced operational efficiencies. The paper by [160] highlights progress in healthcare toward early disease diagnosis using machine learning, showcasing the improved predictive analytics these developments enable. In addition, the application of deep learning to spatiotemporal maritime data provides innovative methods for managing intricate datasets, which is especially beneficial where data streams are sparse, as discussed in [161]. Furthermore, ref. [162] emphasizes the significance of AI in extracting valuable information from intricate medical images, namely sophisticated bioimaging markers identified from echocardiograms by deep learning. These studies emphasize the increasing importance of AI and machine learning in making data processing within distributed messaging frameworks more efficient and dependable.
AI-Driven Innovations for Data Processing: Finally, guaranteeing efficient and secure transmission of very large files at scale, despite varying network conditions, remains of utmost importance. Recent advancements involve the use of network coding to improve the efficiency of file transfers in low-bandwidth settings, exemplified by the work of [163], which demonstrates a noteworthy enhancement in both reliability and throughput. In addition, the study by [164] emphasizes how the design of intelligent transportation systems in big data environments can address the challenges of large-scale file transfer through advanced data handling techniques. The research of [165] addresses the reliable and effective transmission of large files across unreliable, one-way connections while preserving data integrity even in difficult network environments. In addition, the paper by [166] explores the enhancement of distributed file systems through Remote Direct Memory Access (RDMA) technology, which has demonstrated notable improvements in both the speed and efficiency of data transfers. Furthermore, ref. [167] presents a comprehensive survey of network coding-based Peer-to-Peer (P2P) file sharing, highlighting effective strategies that improve file distribution and accessibility in large-scale networks. These references highlight the significant technological advances being made to tackle the challenges of transmitting large files, with a focus on improving efficiency and security.
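The simplest instance of the network coding idea behind [163,167] can be shown in a few lines of Python: alongside two source packets, the sender emits their bitwise XOR, so a receiver that loses either packet can reconstruct it from the surviving one plus the coded packet. Practical schemes use random linear coding over larger fields, but the recovery principle is the same; the packet contents here are arbitrary placeholders.

```python
def xor_bytes(a, b):
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Sender emits A, B, and the coded packet A xor B.
A, B = b"chunk-A!", b"chunk-B!"
coded = xor_bytes(A, B)

# Receiver lost B in transit but rebuilds it: B = A xor (A xor B).
recovered_B = xor_bytes(A, coded)
```

One redundant coded packet thus protects against the loss of any single source packet, improving reliability without retransmission round-trips on low-bandwidth or one-way links.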

5. Discussion

The landscape of data generation and acquisition continues to evolve rapidly due to the proliferation of IoT devices, the expansion of social media, and the increasing adoption of smart technologies. This transformation has led to a substantial increase in the volume, velocity, and variety of data, creating both opportunities and challenges in data management [36,37,38,39]. While state-of-the-art technologies such as edge computing and 5G networks enable real-time data collection and processing, ensuring data authenticity and reliability remains a complex issue [40,41,42,43]. Additionally, the lack of standardization in data formats and quality assurance methods from the point of generation exacerbates inconsistencies, which in turn impacts subsequent stages of data processing and exchange [52,53,54,55,56,57]. Addressing these gaps requires the development of universal guidelines and robust frameworks to ensure consistent data quality across its lifecycle.
The transition from traditional storage systems to cloud-based solutions has significantly improved scalability and accessibility. However, it has also raised concerns about data sovereignty, privacy, and governance [107,108]. Ensuring the accuracy and reliability of stored data is essential, as it directly influences trust in data exchange processes [95,96]. Advanced encryption techniques and redundant storage systems are critical to mitigating risks associated with data breaches and corruption [97,98]. Furthermore, aligning storage technologies with evolving regulatory standards requires the integration of comprehensive security protocols and scalable technical solutions [74,75,76,99].
In turn, the convergence of real-time processing capabilities and AI technologies has revolutionized data processing, enabling faster decision-making and more accurate insights. Machine learning algorithms and predictive analytics have become central to modern data workflows, particularly for managing complex datasets [36,37,128,129,130]. Nevertheless, integrating these technologies into existing data infrastructures presents challenges, including the need for specialized expertise and the potential for bias in AI-driven systems [79,131,132,133]. Efforts to address these issues should focus on developing explainable AI models and standardized frameworks that facilitate interoperability while maintaining ethical considerations [130,131,132,133]. These improvements would not only enhance data transfer efficiency but also ensure compliance with global data protection standards [52,53,54,55,95,96].
Next, technological innovations such as network coding, intelligent transportation systems, and RDMA-based distributed file systems have demonstrated significant improvements in scalability and reliability for large-scale data transmission [163,164,165,166]. Despite these advancements, challenges persist in integrating these solutions with existing infrastructures and balancing performance with stringent security requirements. Future research should prioritize the development of adaptive and interoperable platforms capable of addressing these complexities, ensuring robust and secure data exchange in an increasingly interconnected digital ecosystem.
In a nutshell, these discussions emphasize the need for a holistic approach to data exchange that combines technical innovation with regulatory, ethical, and operational considerations. By addressing these multifaceted challenges, the field can continue to advance towards a more efficient, secure, and sustainable data management paradigm.
Despite the comprehensive scope and methodological rigor of this review, several limitations must be acknowledged. First, no formal risk of bias assessment was performed, which may limit the critical appraisal of the included studies. Second, the certainty of evidence was not evaluated using formal frameworks (e.g., GRADE), restricting the strength of interpretive conclusions. Third, the synthesis approach was purely narrative, without meta-analytic aggregation, thus limiting statistical generalizability. Finally, while PRISMA guidelines were followed, the study was neither registered in a systematic review registry nor accompanied by a pre-published protocol. These limitations should be addressed in future studies to enhance transparency, reproducibility, and methodological robustness.
Rather than focusing on a single technology, this study has deliberately adopted a comparative perspective to examine how diverse emerging paradigms—including blockchain, FPGA-based systems, and AI—jointly address the multifaceted challenges of modern data exchange. This integrative approach reveals how these technologies complement one another in terms of performance, scalability, and regulatory compliance, offering a holistic framework for building secure and efficient data infrastructures.

6. Conclusions

This review has presented a comprehensive examination of the evolving landscape of data exchange, emphasizing the transformative potential of emerging technologies such as blockchain, FPGAs, and artificial intelligence. Through the analysis of contemporary tools, protocols, and architectures, it has become evident that intelligent automation significantly enhances the efficiency, scalability, and security of data exchange systems. Key insights reveal that the integration of AI-driven automation into data workflows reduces human error, facilitates real-time analytics, and supports predictive decision-making. Similarly, the use of blockchain ensures immutability and trust in decentralized data environments, while FPGA-based infrastructures offer high-performance, low-latency processing capabilities essential for large-scale and time-sensitive applications. These advancements collectively address many of the historical limitations in interoperability, data integrity, and compliance, particularly within critical sectors such as healthcare, finance, and industrial systems.
In summary, each technological paradigm analyzed in this study contributes uniquely to the ongoing evolution of data exchange systems. Blockchain and zero-knowledge mechanisms offer robust security and privacy guarantees, while FPGA-based solutions excel in low-latency scenarios; IoT architectures bring scalability, though often at the cost of compliance and security. The comparative analysis presented in Table 4 underscores the importance of context-aware technology selection and reinforces the need for hybrid frameworks that combine complementary strengths across these domains. Analyzing these technologies side by side yields not only a technical evaluation but also strategic insight into how hybrid implementations can overcome the limitations of isolated systems: while each technology addresses a distinct bottleneck, such as latency, traceability, or automation, their integration creates a more resilient and adaptable ecosystem for modern data exchange.
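The notion of context-aware technology selection can be made concrete with a small, purely illustrative sketch. The rules, thresholds, and labels below are our own simplifications for exposition, not criteria drawn from the reviewed studies; a real selection process would weigh many more dimensions (cost, compliance regime, existing infrastructure).

```python
# Illustrative sketch only: a rule-based selector mapping coarse workload
# requirements onto the paradigms compared in this review. Thresholds and
# paradigm labels are hypothetical simplifications.

def suggest_paradigm(latency_ms: float, needs_privacy: bool,
                     decentralized: bool) -> str:
    """Return a candidate technology mix for a data-exchange workload."""
    stack = []
    if latency_ms < 1.0:
        stack.append("FPGA acceleration")      # low-latency processing path
    if decentralized:
        stack.append("blockchain ledger")      # immutability and shared trust
    if needs_privacy:
        stack.append("zero-knowledge proofs")  # selective disclosure
    if not stack:
        stack.append("conventional cloud pipeline")
    return " + ".join(stack)

print(suggest_paradigm(0.5, True, True))
# → FPGA acceleration + blockchain ledger + zero-knowledge proofs
```

Even this toy version illustrates the hybrid-framework argument: no single branch covers all three requirements, so the recommendation is a composition of complementary technologies.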
Future research must concentrate on several pressing challenges and opportunities. First, there is a growing need to develop robust, ethically aligned AI frameworks that ensure fairness, transparency, and non-discrimination in automated decision-making. Additionally, further work is needed to advance real-time data governance models capable of dynamically adapting to evolving regulatory landscapes across jurisdictions. Enhancing the scalability of secure data sharing protocols through zero-knowledge proofs, federated learning, and edge computing is another key frontier. Interdisciplinary collaboration among technologists, legal experts, and policy-makers will be essential to designing systems that harmonize technological innovation with societal values. Lastly, as data ecosystems continue to expand in complexity and scope, future research must aim to create more adaptive, intelligent, and resilient infrastructures for secure and equitable data exchange.
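Among the frontiers listed above, federated learning is perhaps the easiest to convey in code: clients exchange only model parameters, never raw data. The following is a minimal federated-averaging (FedAvg) sketch under strong simplifying assumptions; the one-parameter least-squares model, the toy client data, and the fixed learning rate are stand-ins for exposition, not a production protocol.

```python
# Minimal FedAvg sketch: each client performs a local gradient step on its
# own data, and the server averages the resulting parameters. Raw data
# never leaves the client. Model and data are illustrative stand-ins.

from statistics import fmean

def local_update(weights: float, client_data, lr: float = 0.1) -> float:
    """One gradient-descent step on a 1-D least-squares toy model y = w*x."""
    grad = fmean(2 * (weights * x - y) * x for x, y in client_data)
    return weights - lr * grad

def fed_avg(global_w: float, clients, rounds: int = 50) -> float:
    """Average the clients' locally updated parameters each round."""
    for _ in range(rounds):
        global_w = fmean(local_update(global_w, data) for data in clients)
    return global_w

# Two clients whose local data both follow y = 3x; the server only ever
# sees parameter values, yet the global model recovers the true slope.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)]]
print(round(fed_avg(0.0, clients), 2))  # → 3.0
```

Real deployments add secure aggregation, differential privacy, or zero-knowledge attestations on top of this skeleton, which is precisely where the scalability and governance challenges discussed above arise.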

Author Contributions

Conceptualization, Y.N.C.-Á. and A.M.Á.-M.; methodology, Y.N.C.-Á., A.M.Á.-M. and G.C.-D.; project administration, A.M.Á.-M. and G.C.-D.; supervision, A.M.Á.-M. and G.C.-D.; resources, Y.N.C.-Á., A.M.Á.-M. and G.C.-D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the project "Sistema de visión artificial para el monitoreo y seguimiento de efectos analgésicos y anestésicos administrados vía neuroaxial epidural en población obstétrica durante labores de parto para el fortalecimiento de servicios de salud materna del Hospital Universitario de Caldas - SES HUC" (Hermes 57661), funded by Universidad Nacional de Colombia. G. Castellanos-Dominguez and A. Alvarez-Meza also thank the program "Alianza científica con enfoque comunitario para mitigar brechas de atención y manejo de trastornos mentales relacionados con impulsividad en Colombia (ACEMATE)-91908", Project "Sistema multimodal apoyado en juegos serios orientado a la evaluación e intervención neurocognitiva personalizada en trastornos de impulsividad asociados a TDAH como soporte a la intervención presencial y remota en entornos clínicos, educativos y comunitarios-790-2023", funded by Minciencias.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Tenopir, C.; Rice, N.; Allard, S.L.; Baird, L.; Borycz, J.; Christian, L.; Grant, B.; Olendorf, R.; Sandusky, R.J. Data sharing, management, use, and reuse: Practices and perceptions of scientists worldwide. PLoS ONE 2020, 15, e0229003. [Google Scholar] [CrossRef] [PubMed]
  2. Stach, C. Data Is the New Oil-Sort of: A View on Why This Comparison Is Misleading and Its Implications for Modern Data Administration. Future Internet 2023, 15, 71. [Google Scholar] [CrossRef]
  3. Fischer, R.P.; Schnicke, F.; Beggel, B.; Antonino, P. Historical Data Storage Architecture Blueprints for the Asset Administration Shell. In Proceedings of the 2022 IEEE 27th International Conference on Emerging Technologies and Factory Automation (ETFA), Stuttgart, Germany, 6–9 September 2022; pp. 1–8. [Google Scholar] [CrossRef]
  4. Derakhshannia, M.; Gervet, C.; Hajj-Hassan, H.; Laurent, A.; Martin, A. Data Lake Governance: Towards a Systemic and Natural Ecosystem Analogy. Future Internet 2020, 12, 126. [Google Scholar] [CrossRef]
  5. Chamoli, S. Big Data with Cloud Computing: Discussions and Challenges. Math. Stat. Eng. Appl. 2021, 5, 32–40. [Google Scholar] [CrossRef]
  6. Ge, L.; YanLi, P. Research on Network Data Monitoring and Legal Evidence Integration Based on Cloud Computing. Mob. Inf. Syst. 2022, 2022, 1544981. [Google Scholar] [CrossRef]
  7. Gao, J. Computer Data Processing Mode in the era of Big Data: From Feature Analysis to Comprehensive Mining. In Proceedings of the 2021 6th International Conference on Inventive Computation Technologies (ICICT), Coimbatore, India, 20–22 January 2021; pp. 614–617. [Google Scholar] [CrossRef]
  8. Contreras, J.P.; Majumdar, S.; El-Haraki, A. Methods for Transferring Data from a Compute to a Storage Cloud. In Proceedings of the 2022 9th International Conference on Future Internet of Things and Cloud (FiCloud), Rome, Italy, 22–24 August 2022; pp. 67–74. [Google Scholar] [CrossRef]
  9. Siderska, J.; Aunimo, L.; Süße, T.; von Stamm, J.; Kedziora, D.; Aini, S.N.B.M. Towards Intelligent Automation (IA): Literature Review on the Evolution of Robotic Process Automation (RPA), its Challenges, and Future Trends. Eng. Manag. Prod. Serv. 2023, 15, 90–103. [Google Scholar] [CrossRef]
  10. Abdu, N.A.A.; Wang, Z. Blockchain Framework for Collaborative Clinical Trials Auditing. Wirel. Pers. Commun. 2023, 132, 39–65. [Google Scholar] [CrossRef]
  11. Moore, C.H.; Lin, W. FPGA Correlator for Applications in Embedded Smart Devices. Biosensors 2022, 12, 236. [Google Scholar] [CrossRef]
  12. Medileh, S.; Laouid, A.; Hammoudeh, M.; Kara, M.; Bejaoui, T.; Eleyan, A.; Al-Khalidi, M. A Multi-Key with Partially Homomorphic Encryption Scheme for Low-End Devices Ensuring Data Integrity. Information 2023, 14, 263. [Google Scholar] [CrossRef]
  13. OneTrust. Data Privacy Program Guide: How to Build a Privacy Program that Inspires Trust and Achieves Compliance; OneTrust White Paper: Atlanta, GA, USA, 2019. [Google Scholar]
  14. Liu, X. Design and implementation of heterogeneous data exchange platform based on web technology. In Proceedings of the International Conference on Intelligent Systems, Communications, and Computer Networks (ISCCN 2022), Chengdu, China, 17–19 June 2022; Volume 12332, p. 123320M. [Google Scholar] [CrossRef]
  15. Hasan, H.; Ali, M. A Modern Review of EDI: Representation, Protocols and Security Considerations. In Proceedings of the 2019 2nd IEEE Middle East and North Africa COMMunications Conference (MENACOMM), Manama, Bahrain, 19–21 November 2019; pp. 1–5. [Google Scholar] [CrossRef]
  16. Blakely, B.E.; Pawar, P.; Jololian, L.; Prabhaker, S. The Convergence of EDI, Blockchain, and Big Data in Health Care. In Proceedings of the SoutheastCon 2021, Atlanta, GA, USA, 10–13 March 2021; pp. 1–5. [Google Scholar] [CrossRef]
  17. Oleiwi, R. The Impact of Electronic Data Interchange on Accounting Systems. Int. J. Prof. Bus. Rev. 2023, 8, e01163. [Google Scholar] [CrossRef]
  18. Kuhn, M.; Franke, J. Data continuity and traceability in complex manufacturing systems: A graph-based modeling approach. Int. J. Comput. Integr. Manuf. 2021, 34, 549–566. [Google Scholar] [CrossRef]
  19. Andrianov, A.M. Analysing Technologies for the Comprehensive Digitalisation of Hightech Industrial Production in an «Industry 4.0» Paradigm. Sci. Work Free Econ. Soc. Russ. 2021, 228, 298–317. [Google Scholar] [CrossRef]
  20. Fadhel, S.A.; Jameel, E.A. A comparison between NOSQL and RDBMS: Storage and retrieval. MINAR Int. J. Appl. Sci. Technol. 2022, 4, 172–184. [Google Scholar] [CrossRef]
  21. Dhasmana, G.; J, P.G.; S, G.M.; R, P.K.H. SQL and NOSQL Databases in the Application of Business Analytics. In Proceedings of the 2023 International Conference on Computer Science and Emerging Technologies (CSET), Bangalore, India, 10–12 October 2023; pp. 1–5. [Google Scholar] [CrossRef]
  22. Kanungo, S.; Morena, R.D. Concurrency versus consistency in NoSQL databases. J. Auton. Intell. 2023. [Google Scholar] [CrossRef]
  23. Karunarathne, S.M.; Saxena, N.; Khan, M. Security and Privacy in IoT Smart Healthcare. IEEE Internet Comput. 2021, 25, 37–48. [Google Scholar] [CrossRef]
  24. Alzahrani, A.G.; Alhomoud, A.; Wills, G. A Framework of the Critical Factors for Healthcare Providers to Share Data Securely Using Blockchain. IEEE Access 2022, 10, 41064–41077. [Google Scholar] [CrossRef]
  25. Spengler, H.; Gatz, I.; Kohlmayer, F.; Kuhn, K.A.; Prasser, F. Improving data quality in medical research: A monitoring architecture for clinical and translational data warehouses. In Proceedings of the 2020 IEEE 33rd International Symposium on Computer-Based Medical Systems (CBMS), Rochester, MN, USA, 28–30 July 2020; IEEE: New York, NY, USA, 2020; pp. 415–420. [Google Scholar]
  26. Hawig, D.; Zhou, C.; Fuhrhop, S.; Fialho, A.S.; Ramachandran, N. Designing a distributed ledger technology system for interoperable and general data protection regulation–compliant health data exchange: A use case in blood glucose data. J. Med. Internet Res. 2019, 21, e13665. [Google Scholar] [CrossRef]
  27. Rahmatulloh, A.; Nugraha, F.; Gunawan, R.; Darmawan, I. Event-Driven Architecture to Improve Performance and Scalability in Microservices-Based Systems. In Proceedings of the 2022 International Conference Advancement in Data Science, E-learning and Information Systems (ICADEIS), Bandung, Indonesia, 23–24 November 2022; pp. 1–6. [Google Scholar] [CrossRef]
  28. Rafik, H.; Maizate, A.; Ettaoufik, A. Data Security Mechanisms, Approaches, and Challenges for e-Health Smart Systems. Int. J. Online Biomed. Eng. 2023, 19, 42–66. [Google Scholar] [CrossRef]
  29. Williamson, S.; Vijayakumar, K. Artificial intelligence techniques for industrial automation and smart systems. Concurr. Eng. 2021, 29, 291–292. [Google Scholar] [CrossRef]
  30. Quispe, J.F.P.; Diaz, D.Z.; Choque-Flores, L.; León, A.L.C.; Carbajal, L.V.R.; Serquen, E.E.P.; García-Huamantumba, A.; García-Huamantumba, E.; García-Huamantumba, C.F.; Paredes, C.E.G. Quantitative Evaluation of the Impact of Artificial Intelligence on the Automation of Processes. Data Metadata 2023. [Google Scholar] [CrossRef]
  31. Kulakov, Y.; Korenko, D. Methods of applying artificial intelligence in software-defined networks. Probl. Informatiz. Manag. 2023. [Google Scholar] [CrossRef]
  32. Abiteboul, S.; Stoyanovich, J. Transparency, fairness, data protection, neutrality: Data management challenges in the face of new regulation. J. Data Inf. Qual. 2019, 11, 1–9. [Google Scholar] [CrossRef]
  33. Voss, W.G. The CCPA and the GDPR Are Not the Same: Why You Should Understand Both. AARN Law Technol. 2021. Available online: https://www.competitionpolicyinternational.com/wp-content/uploads/2021/01/1-The-CCPA-and-the-GDPR-Are-Not-the-Same-Why-You-Should-Understand-Both-By-W.-Gregory-Voss.pdf (accessed on 1 January 2025).
  34. Kitchenham, B.; Charters, S.M. Guidelines for Performing Systematic Literature Reviews in Software Engineering; ver. 2.3 EBSE Technical Report; EBSE: Rio de Janeiro, Brazil, 2007. [Google Scholar]
  35. Bobda, C.; Mbongue, J.; Chow, P.; Ewais, M. The Future of FPGA Acceleration in Datacenters and the Cloud. ACM Trans. Reconfigurable Technol. Syst. 2022, 15, 1–42. [Google Scholar] [CrossRef]
  36. Weber, L.; Sommer, L.; Solis-Vasquez, L.; Vinçon, T.; Knödler, C.; Bernhardt, A.; Petrov, I.; Koch, A. A Framework for the Automatic Generation of FPGA-based Near-Data Processing Accelerators in Smart Storage Systems. In Proceedings of the 2021 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), Portland, OR, USA, 17–21 June 2021; pp. 136–143. [Google Scholar] [CrossRef]
  37. Liu, J.; Dragojević, A.; Flemming, S.; Katsarakis, A.; Korolija, D.; Zablotchi, I.; Ng, H.C.; Kalia, A.; Castro, M. Honeycomb: Ordered key-value store acceleration on an FPGA-based SmartNIC. IEEE Trans. Comput. 2023, 73, 857–871. [Google Scholar] [CrossRef]
  38. Hu, Y.; Yao, X.; Zhang, R.; Zhang, Y. Freshness Authentication for Outsourced Multi-Version Key-Value Stores. IEEE Trans. Dependable Secur. Comput. 2023, 20, 2071–2084. [Google Scholar] [CrossRef]
  39. Zhang, T.; Wang, J.; Cheng, X.; Xu, H.; Yu, N.; Huang, G.; Zhang, T.; He, D.; Li, F.; Cao, W.; et al. FPGA-Accelerated Compactions for LSM-based Key-Value Store. In Proceedings of the 18th USENIX Conference on File and Storage Technologies, FAST 2020, Santa Clara, CA, USA, 24–27 February 2020; Noh, S.H., Welch, B., Eds.; USENIX Association: Berkeley, CA, USA, 2020; pp. 225–237. [Google Scholar]
  40. Siddiqui, M.F.; Ali, F.; Javed, M.A.; Khan, M.B.; Saudagar, A.K.J.; Alkhathami, M.; Abul Hasanat, M.H. An FPGA-Based Performance Analysis of Hardware Caching Techniques for Blockchain Key-Value Database. Appl. Sci. 2023, 13, 4092. [Google Scholar] [CrossRef]
  41. Cai, M.; Jiang, X.; Shen, J.; Ye, B. SplitDB: Closing the Performance Gap for LSM-Tree-Based Key-Value Stores. IEEE Trans. Comput. 2024, 73, 206–220. [Google Scholar] [CrossRef]
  42. Li, L.; Yue, Z.; Wu, G. Electronic Medical Record Sharing System Based on Hyperledger Fabric and InterPlanetary File System. In Proceedings of the 2021 ACM Symposium on Document Engineering (DocEng ’21), Sanya, China, 2–4 February 2021. [Google Scholar] [CrossRef]
  43. Ali, M.; Vecchio, M.; Antonelli, F. A Blockchain-Based Framework for IoT Data Monetization Services. Comput. J. 2020, 64, 195–210. [Google Scholar] [CrossRef]
  44. Kumar, A.; Kumar, V.P. An Approach to Secure Decentralized Storage System Using Blockchain and Interplanetary File System. In Proceedings of the 2023 International Conference on Big Data & Smart Systems (ICBDS), New Raipur, India, 6–8 October 2023. [Google Scholar] [CrossRef]
  45. Sentausa, D.; Hareva, D.H. Decentralize Application for Storing Personal Health Record using Ethereum Blockchain and Interplanetary File System. In Proceedings of the 2022 International Conference on Telecommunications, Information, and Internet of Things Applications (ICTIIA), Tangerang, Indonesia, 23 September 2022. [Google Scholar] [CrossRef]
  46. Ugochukwu, N.A.; Goyal, S.B.; Rajawat, A.; Verma, C.; Illés, Z. Enhancing Logistics With the Internet of Things: A Secured and Efficient Distribution and Storage Model Utilizing Blockchain Innovations and Interplanetary File System. IEEE Access 2023, 12, 4139–4152. [Google Scholar] [CrossRef]
  47. Popchev, I.; Radeva, I.; Doukovska, L.; Dimitrova, M. A Web Application for Data Exchange Blockchain Platform. In Proceedings of the 2023 International Conference on Big Data, Knowledge and Control Systems Engineering (BdKCSE), Sofia, Bulgaria, 2–3 November 2023; pp. 1–7. [Google Scholar] [CrossRef]
  48. AlSobeh, A.M.; Magableh, A.A. BlockASP: A framework for AOP-based model checking blockchain system. IEEE Access 2023, 11, 115062–115075. [Google Scholar] [CrossRef]
  49. Azzopardi, S.; Ellul, J.; Falzon, R.; Pace, G.J. AspectSol: A solidity aspect-oriented programming tool with applications in runtime verification. In Proceedings of the International Conference on Runtime Verification; Springer: Berlin/Heidelberg, Germany, 2022; pp. 243–252. [Google Scholar]
  50. AlSobeh, A.; Shatnawi, A.; Al-Ahmad, B.; Aljmal, A.; Khamaiseh, S. AI-Powered AOP: Enhancing Runtime Monitoring with Large Language Models and Statistical Learning. Int. J. Adv. Comput. Sci. Appl. 2024, 15. [Google Scholar] [CrossRef]
  51. Rajesh, S.C.; Borada, D. AI-Powered Solutions for Proactive Monitoring and Alerting in Cloud-Based Architectures. Int. J. Res. Mod. Eng. Emerg. Technol. (IJRMEET) 2024, 12, 208–215. [Google Scholar]
  52. Walker, D.; Tarver, W.; Jonnalagadda, P.; Ranbom, L.; Ford, E.W.; Rahurkar, S. Perspectives on Challenges and Opportunities for Interoperability. JMIR Med Inform. 2023, 11, e43848. [Google Scholar] [CrossRef]
  53. Goel, A.K.; Campbell, W.S.; Moldwin, R. Structured Data Capture for Oncology. JCO Clin. Cancer Inform. 2021, 5, 194–201. [Google Scholar] [CrossRef]
  54. Jaffe, C.; Vreeman, D.; Kaminker, D.; Nguyen, V. Implementing HL7 FHIR. J. Healthc. Manag. Stand. 2023, 2, 1–8. [Google Scholar] [CrossRef]
  55. Dullabh, P.; Hovey, L.; Heaney-Huls, K.; Rajendran, N.; Wright, A.; Sittig, D.F. Application Programming Interfaces in Health Care. Appl. Clin. Inform. 2020, 11, 59–69. [Google Scholar] [CrossRef]
  56. Akhtar, S.; Rauf, A.; Abbas, H.; Amjad, M.F. Inter Cloud Interoperability Use Cases and Gaps in Corresponding Standards. In Proceedings of the 2020 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Calgary, AB, Canada, 17–22 August 2020; pp. 585–592. [Google Scholar] [CrossRef]
  57. Rutz, A.; Sorokina, M.; Galgonek, J.; Mietchen, D.; Willighagen, E.; Gaudry, A.; Graham, J.; Stephan, R.; Page, R.; Vondrášek, J.; et al. The LOTUS initiative for open knowledge management in natural products research. eLife 2022, 11, e70780. [Google Scholar] [CrossRef]
  58. Narkhede, N.; Shapira, G.; Palino, T. Kafka: The Definitive Guide; O’Reilly Media: Sebastopol, CA, USA, 2017. [Google Scholar]
  59. Carlson, J.L. Redis: Lightweight Key/Value Store That Exceeds Expectations; Apress: New York, NY, USA, 2013. [Google Scholar]
  60. Deyhim, P. The Tale of Two Messaging Platforms: Apache Kafka and Amazon Kinesis. Amaz. Web Serv. Blog. 2016. Available online: https://aws.amazon.com/blogs/startups/the-tale-of-two-messaging-platforms-apache-kafka-and-amazon-kinesis/ (accessed on 1 January 2025).
  61. Morais, J.; George, J.; Rojas, R.L. Low Latency Real-Time Cache Updates with Amazon ElastiCache for Redis and Confluent Cloud Kafka. AWS Blog. 2021. Available online: https://aws.amazon.com/blogs/apn/low-latency-real-time-cache-updates-with-amazon-elasticache-for-redis-and-confluent-cloud-kafka/ (accessed on 1 January 2025).
  62. Bowen, J. Data Integration with Talend Open Studio; Packt Publishing: Birmingham, UK, 2012. [Google Scholar]
  63. Dossot, D.; D’Emic, J. Mule in Action; Manning Publications: Shelter Island, NY, USA, 2014. [Google Scholar]
  64. Bende, B.; Payne, M.; Dyer, J. Dataflow Management with Apache NiFi; Packt Publishing: Birmingham, UK, 2017. [Google Scholar]
  65. Nwokike, C. Dell Boomi AtomSphere: A Cloud-Based Integration Platform. J. Cloud Comput. 2013. [Google Scholar]
  66. Feinman, T. Data Loss Prevention: Protecting Sensitive Data in the Enterprise; McGraw-Hill Education: Columbus, OH, USA, 2010. [Google Scholar]
  67. Chen, W.-J.; Barkai, B.; DiPietro, J.M.; Langman, V.; Perlov, D.; Riah, R.; Rozenblit, Y.; Santos, A. Implementing IBM InfoSphere Guardium; IBM Redbooks: New York, NY, USA, 2015. [Google Scholar]
  68. Santos, O. Cisco CyberOps Associate CBROPS 200–201 Official Cert Guide; Cisco Press: San Jose, CA, USA, 2020. [Google Scholar]
  69. Palo Alto Networks. Prisma Access SASE Security: Secure Your Remote Workforce. Palo Alto Netw. White Pap. 2020. [Google Scholar]
  70. Zadeh, R.B.; Ramsundar, B. Machine Learning with TensorFlow on Google Cloud Platform; O’Reilly Media: Sebastopol, CA, USA, 2017. [Google Scholar]
  71. Barnes, J. Azure Machine Learning: Microsoft Azure Essentials; Microsoft Press: Redmond, WA, USA, 2015. [Google Scholar]
  72. SAS Institute. White Paper; SAS Institute: Cary, NC, USA, 2016. [Google Scholar]
  73. Bakshi, T.; Gaikwad, A. Getting Started with IBM Watson: How to Build and Deploy AI Models; Packt Publishing: Birmingham, UK, 2020. [Google Scholar]
  74. Calderon, G.; Campo, G.D.; Saavedra, E.; Santamaria, A. Management and Monitoring IoT Networks through an Elastic Stack-based Platform. In Proceedings of the 2021 8th International Conference on Future Internet of Things and Cloud (FiCloud), Rome, Italy, 23–25 August 2021; pp. 184–191. [Google Scholar] [CrossRef]
  75. Bano, S. PhD Forum Abstract: Efficient Computing and Communication Paradigms for Federated Learning Data Streams. In Proceedings of the 2021 IEEE International Conference on Smart Computing (SMARTCOMP), Irvine, CA, USA, 23–27 August 2021; pp. 410–411. [Google Scholar] [CrossRef]
  76. Vyas, S.; Tyagi, R.; Jain, C.; Sahu, S. Performance Evaluation of Apache Kafka—A Modern Platform for Real Time Data Streaming. In Proceedings of the 2022 2nd International Conference on Innovative Practices in Technology and Management (ICIPTM), Gautam Buddha Nagar, India, 23–25 February 2022; pp. 465–470. [Google Scholar] [CrossRef]
  77. Aung, T.; Min, H.Y.; Maw, A. CIMLA: Checkpoint Interval Message Logging Algorithm in Kafka Pipeline Architecture. In Proceedings of the 2020 International Conference on Advanced Information Technologies (ICAIT), Yangon, Myanmar, 4–5 November 2020; pp. 30–35. [Google Scholar] [CrossRef]
  78. Wang, G.; Chen, L.; Dikshit, A.; Gustafson, J.; Chen, B.S.; Sax, M.; Roesler, J.; Blee-Goldman, S.; Cadonna, B.; Mehta, A.; et al. Consistency and Completeness: Rethinking Distributed Stream Processing in Apache Kafka. In Proceedings of the 2021 International Conference on Management of Data, Virtual, 20–25 June 2021. [Google Scholar] [CrossRef]
  79. Zhang, H.; Fang, L.; Jiang, K.; Zhang, W.; Li, M.; Zhou, L. Secure Door on Cloud: A Secure Data Transmission Scheme to Protect Kafka’s Data. In Proceedings of the 2020 IEEE 26th International Conference on Parallel and Distributed Systems (ICPADS), Hong Kong, China, 2–4 December 2020; pp. 406–413. [Google Scholar] [CrossRef]
  80. Song, R.; Gao, S.; Song, Y.; Xiao, B. A Traceable and Privacy-Preserving Data Exchange Scheme based on Non-Fungible Token and Zero-Knowledge. In Proceedings of the IEEE 42nd International Conference on Distributed Computing Systems (ICDCS), Bologna, Italy, 10–13 July 2022. [Google Scholar] [CrossRef]
  81. Shalannanda, W. Using Zero-Knowledge Proof in Privacy-Preserving Networks. In Proceedings of the 2023 17th International Conference on Telecommunication Systems, Services, and Applications (TSSA), Lombok, Indonesia, 12–13 October 2023. [Google Scholar] [CrossRef]
  82. Nasri, J.Z.; Rais, H. zk-BeSC: Confidential Blockchain Enabled Supply Chain Based on Polynomial Zero-Knowledge Proofs. In Proceedings of the IEEE 19th International Wireless Communications and Mobile Computing Conference (IWCMC), Marrakesh, Morocco, 19–23 June 2023. [Google Scholar] [CrossRef]
  83. Cabot-Nadal, M.A.; Playford, B.; Payeras-Capellà, M.; Gerske, S.; Mut-Puigserver, M.; Pericàs-Gornals, R. Private Identity-Related Attribute Verification Protocol Using SoulBound Tokens and Zero-Knowledge Proofs. In Proceedings of the IEEE International Symposium on Networks, Computers and Communications (ISNCC), Montreal, QC, Canada, 16–18 October 2023. [Google Scholar] [CrossRef]
  84. Fotiou, N.; Pittaras, I.; Chadoulos, S.; Siris, V.; Polyzos, G.; Ipiotis, N.; Keranidis, S. Authentication, Authorization, and Selective Disclosure for IoT data sharing using Verifiable Credentials and Zero-Knowledge Proofs. arXiv 2022, arXiv:2209.00586. [Google Scholar]
  85. Cuzzocrea, A.; Damiani, E. Privacy-Preserving Big Data Exchange: Models, Issues, Future Research Directions. In Proceedings of the 2021 IEEE International Conference on Big Data (Big Data), Orlando, FL, USA, 15–18 December 2021; pp. 5081–5084. [Google Scholar] [CrossRef]
  86. Madan, S. SABPP: Privacy-Preserving Data Exchange in The Big Data Market Using The Smart Contract Approach. Indian J. Sci. Technol. 2023, 16, 4388–4400. [Google Scholar] [CrossRef]
  87. Li, T.; Ren, W.; Xiang, Y.; Zheng, X.; Zhu, T.; Choo, K.-K.R.; Srivastava, G. FAPS: A fair, autonomous and privacy-preserving scheme for big data exchange based on oblivious transfer, Ether cheque and smart contracts. Inf. Sci. 2021, 544, 469–484. [Google Scholar] [CrossRef]
  88. Jena, L.; Mohanty, R.; Mohanty, M.N. Privacy-Preserving Cryptographic Model for Big Data Analytics. In Privacy and Security Issues in Big Data; Springer: Singapore, 2021. [Google Scholar] [CrossRef]
  89. Kulkarni, A.; Manjunath, T.N. Hybrid Cloud-Based Privacy Preserving Clustering as Service for Enterprise Big Data. Int. J. Recent Technol. Eng. 2023, 11, 146–156. [Google Scholar] [CrossRef]
  90. Azam, N.; Michala, A.L.; Ansari, S.; Truong, N.B. Modelling Technique for GDPR-Compliance: Toward a Comprehensive Solution. In Proceedings of the GLOBECOM 2023—2023 IEEE Global Communications Conference, Kuala Lumpur, Malaysia, 4–8 December 2023; pp. 3300–3305. [Google Scholar] [CrossRef]
  91. Merlec, M.; Lee, Y.; Hong, S.; In, H. A Smart Contract-Based Dynamic Consent Management System for Personal Data Usage under GDPR. Sensors 2021, 21, 7994. [Google Scholar] [CrossRef]
  92. Kaal, W. AI Governance via Web3 Reputation System. Stan. J. Blockchain L. Pol’y 2025, 8, 1. [Google Scholar] [CrossRef]
  93. Radanliev, P. AI Ethics: Integrating Transparency, Fairness, and Privacy in AI Development. Appl. Artif. Intell. 2025, 39, 2463722. [Google Scholar] [CrossRef]
  94. Dotan, R.; Blili-Hamelin, B.; Madhavan, R. Evolving AI Risk Management: A Maturity Model Based on the NIST AI Risk Management Framework. arXiv 2024, arXiv:2401.15229. [Google Scholar]
  95. Nam, Y.; Shin, E.; Lee, S.; Jung, S.; Bae, Y.; Kim, J. Global-scale GDPR Compliant Data Sharing System. In Proceedings of the 2020 International Conference on Electronics, Information, and Communication (ICEIC), Barcelona, Spain, 19–22 January 2020. [Google Scholar] [CrossRef]
  96. Kim, D.-Y.; Elluri, L.; Joshi, K. Trusted Compliance Enforcement Framework for Sharing Health Big Data. In Proceedings of the IEEE International Conference on Big Data (Big Data), Orlando, FL, USA, 15–18 December 2021. [Google Scholar] [CrossRef]
  97. Anonymous. Advancing Through Data: A Critical Review of the Evolution of Medical Information Management Systems. Int. J. Biomed. Med. Eng. Health Sci. 2023, 11, 73–78. [Google Scholar] [CrossRef]
  98. Babu, M.S.; Raj, K.J.; Devi, D. Data Security and Sensitive Data Protection using Privacy by Design Technique. In Proceedings of the 2nd EAI International Conference on Big Data Innovation for Sustainable Cognitive Computing (BDCC 2019), Coimbatore, India, 12–13 December 2019. [Google Scholar] [CrossRef]
  99. Arbabi, M.S.; Lal, C.; Veeraragavan, N.; Marijan, D.; Nygård, J.; Vitenberg, R. A Survey on Blockchain for Healthcare: Challenges, Benefits, and Future Directions. IEEE Commun. Surv. Tutor. 2022, 25, 386–424. [Google Scholar] [CrossRef]
  100. Seth, M. Mulesoft—Salesforce Integration Using Batch Processing. In Proceedings of the 2018 5th International Conference on Computational Science/Intelligence and Applied Informatics (CSII), Yonago, Japan, 10–12 July 2018; pp. 7–14. [Google Scholar] [CrossRef]
  101. Madhavi, T.; Rithvik, M.; Sindhura, N. A CONJOIN Approach to IoT and Software Engineering Using TIBCO B.W. Int. J. Eng. Technol. Manag. Res. 2020, 4, 36–42. [Google Scholar] [CrossRef]
  102. Gregg, T.A.; Pandey, K.M.; Errickson, R.K. The Integrated Cluster Bus for the IBM S/390 Parallel Sysplex. IBM J. Res. Dev. 1999, 43, 795–806. [Google Scholar] [CrossRef]
  103. Rawat, S.; Narain, A. Introduction to Azure Data Factory. In Understanding Azure Data Factory; Apress: Berkeley, CA, USA, 2018. [Google Scholar] [CrossRef]
  104. Pogiatzis, A.; Samakovitis, G. An Event-Driven Serverless ETL Pipeline on AWS. Appl. Sci. 2020, 11, 191. [Google Scholar] [CrossRef]
  105. Ebert, N.; Weber, K.; Koruna, S. Integration Platform as a Service. Bus. Inf. Syst. Eng. 2017, 59, 375–379. [Google Scholar] [CrossRef]
  106. Wang, H.; Cen, Y. Implementation of Information Integration Platform in Chinese Tobacco Industry Enterprise Based on SOA. Adv. Mater. Res. 2013, 765–767, 1360–1364. [Google Scholar] [CrossRef]
  107. Faruk, M.J.H.; Saha, B.; Basney, J. A Comparative Analysis Between SciTokens, Verifiable Credentials, and Smart Contracts: Novel Approaches for Authentication and Secure Access to Scientific Data. In Proceedings of the 2023 ACM Symposium, Portland, OR, USA, 23–27 July 2023. [Google Scholar] [CrossRef]
  108. Westergaard, G.; Erden, U.; Mateo, O.A.; Lampo, S.M.; Akinci, T.C.; Topsakal, O. Time Series Forecasting Utilizing Automated Machine Learning (AutoML): A Comparative Analysis Study on Diverse Datasets. Information 2024, 15, 39. [Google Scholar] [CrossRef]
  109. Nedosnovanyi, O.; Cherniak, O.; Golinko, V. Comparative Analysis of Cloud Services for Geoinformation Data Processing. Inf. Technol. Comput. Eng. 2023, 57, 50–57. [Google Scholar] [CrossRef]
  110. Parhad, O.; Naik, V. Comparative analysis of Data Extraction for Qualcomm based android devices. In Proceedings of the International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 6–8 July 2023. [Google Scholar] [CrossRef]
  111. Robu, E. Enhancing data security and protection in marketing: A comparative analysis of Golang and PHP approaches. Ecosoen 2024, 19–30. [Google Scholar] [CrossRef]
  112. Rathod, P.; Hamalainen, T. Leveraging the Benefits of Big Data with Fast Data for Effective and Efficient Cybersecurity Analytics Systems: A Robust Optimisation Approach. In Proceedings of the 15th International Conference on Cyber Warfare and Security (ICCWS 2020), Norfolk, VA, USA, 12–13 March 2020; Payne, B.K., Wu, H., Eds.; Academic Conferences International: South Oxfordshire, UK, 2020; pp. 290–298. [Google Scholar]
  113. Islam, M.; Aktheruzzaman, K. An Analysis of Cybersecurity Attacks against Internet of Things and Security Solutions. J. Comput. Commun. 2020, 8, 11–25. [Google Scholar] [CrossRef]
  114. Altulaihan, E.; Almaiah, M.; Aljughaiman, A. Cybersecurity Threats, Countermeasures and Mitigation Techniques on the IoT: Future Research Directions. Electronics 2022, 11, 3330. [Google Scholar] [CrossRef]
  115. Sukumaran, R.P.; Benedict, S. Survey on Blockchain Enabled Authentication for Industrial Internet of Things. Comput. Commun. 2021, 182, 1–14. [Google Scholar] [CrossRef]
  116. Verma, P.; Sharma, R.; Mistry, M. A Paradigm Shift in IoT Cyber-Security: A Systematic Review. Int. J. Res. Appl. Sci. Eng. Technol. 2022, 10, 1024–1034. [Google Scholar] [CrossRef]
  117. Mishra, S.; Sharma, S.; Alowaidi, M.A. Multilayer Self-defense System to Protect Enterprise Cloud. Comput. Mater. Contin. 2020, 66, 71–85. [Google Scholar] [CrossRef]
  118. Mathew, A. The Power of Cybersecurity Data Science in Protecting Digital Footprints. Cogniz. J. Multidiscip. Stud. 2023, 3, 1–4. [Google Scholar] [CrossRef]
  119. Allouche, Y.; Tapas, N.; Longo, F.; Shabtai, A.; Wolfsthal, Y. TRADE: TRusted Anonymous Data Exchange: Threat Sharing Using Blockchain Technology. arXiv 2021, arXiv:2103.13158. [Google Scholar]
  120. Santoso, K.; Muin, M.; Mahmudi, M. Implementation of AES cryptography and twofish hybrid algorithms for cloud. J. Phys. Conf. Ser. 2020, 1517, 012099. [Google Scholar] [CrossRef]
  121. Stodt, J.; Reich, C. Data Confidentiality in P2P Communication and Smart Contracts of Blockchain in Industry 4.0. arXiv 2020, arXiv:2020.101001. [Google Scholar] [CrossRef]
  122. Shahmoradi, L.; Ebrahimi, M.; Shahmoradi, S.; Farzanehnejad, A.R.; Moammaie, H.; Koolaee, M.H. Usage of Standards to Integration of Hospital Information Systems. Front. Health Inform. 2020, 9, 28. [Google Scholar] [CrossRef]
  123. Abdellatif, A.A.; Samara, L.; Mohamed, A.M.; Erbad, A.; Chiasserini, C.F.; Guizani, M.; O’Connor, M.D.; Laughton, J. MEdge-Chain: Leveraging Edge Computing and Blockchain for Efficient Medical Data Exchange. IEEE Internet Things J. 2021, 8, 15762–15775. [Google Scholar] [CrossRef]
  124. Shmatko, O.; Kliuchka, Y. A novel architecture of a secure medical system using DAG. InterConf 2022. [Google Scholar] [CrossRef]
  125. Salomi, M.; Claro, P.B. Adopting Healthcare Information Exchange among Organizations, Regions, and Hospital Systems toward Quality, Sustainability, and Effectiveness. Technol. Investig. 2020, 11, 58–97. [Google Scholar] [CrossRef]
  126. Tariq, F.; Khan, Z.; Sultana, T.; Rehman, M.; Shahzad, Q.; Javaid, N. Leveraging Fine-Grained Access Control in Blockchain-Based Healthcare System. In Proceedings of the 34th International Conference on Advanced Information Networking and Applications (AINA 2020), Caserta, Italy, 15–17 April 2020; pp. 106–115. [Google Scholar] [CrossRef]
  127. Ecarot, T.; Fraikin, B.; Ouellet, F.; Lavoie, L.; McGilchrist, M.; Éthier, J.F. Sensitive Data Exchange Protocol Suite for Healthcare. In Proceedings of the 2020 IEEE Symposium on Computers and Communications (ISCC), Rennes, France, 7–10 July 2020; pp. 1–7. [Google Scholar] [CrossRef]
  128. Alsaleem, M.; Hasoon, S.O. Comparison of DT & GBDT Algorithms for Predictive Modeling of Currency Exchange Rates. EUREKA Phys. Eng. 2020, 1, 56–61. [Google Scholar] [CrossRef]
  129. Fallucchi, F.; Coladangelo, M.; Giuliano, R.; Luca, E.D. Predicting Employee Attrition Using Machine Learning Techniques. Computers 2020, 9, 86. [Google Scholar] [CrossRef]
  130. Bag, S.; Gupta, S.; Kumar, A.; Sivarajah, U. An integrated artificial intelligence framework for knowledge creation and B2B marketing rational decision making for improving firm performance. Ind. Mark. Manag. 2021, 92, 178–189. [Google Scholar] [CrossRef]
  131. Yang, N. Financial Big Data Management and Control and Artificial Intelligence Analysis Method Based on Data Mining Technology. Wirel. Commun. Mob. Comput. 2022, 2022, 7596094. [Google Scholar] [CrossRef]
  132. Xiong, X.; Wei, W.; Zhang, C. Dynamic User Allocation Method and Artificial Intelligence in the Information Industry Financial Management System Application. In Proceedings of the 2022 6th International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 25–27 May 2022; pp. 982–985. [Google Scholar] [CrossRef]
  133. Tedeschi, L. ASAS-NANP Symposium: Mathematical Modeling in Animal Nutrition: The progression of data analytics and artificial intelligence in support of sustainable development in animal science. J. Anim. Sci. 2022, 100, skac111. [Google Scholar] [CrossRef]
  134. Badyal, S.; Kumar, R. Insightful Business Analytics Using Artificial Intelligence—A Decision Support System for E-Businesses. In Proceedings of the 2021 3rd International Conference on Advances in Computing, Communication Control and Networking (ICAC3N), Greater Noida, India, 17–18 December 2021; pp. 109–115. [Google Scholar] [CrossRef]
  135. Lepenioti, K.; Bousdekis, A.; Apostolou, D.; Mentzas, G. Human-Augmented Prescriptive Analytics With Interactive Multi-Objective Reinforcement Learning. IEEE Access 2021, 9, 100677–100693. [Google Scholar] [CrossRef]
  136. Mijwil, M.M. Has the Future Started? The Current Growth of Artificial Intelligence, Machine Learning, and Deep Learning. Iraqi J. Comput. Sci. Math. 2022, 3, 13. [Google Scholar] [CrossRef]
  137. Lakhan, N. Applications of Data Science and AI in Business. Int. J. Res. Appl. Sci. Eng. Technol. 2022, 10, 4115–4118. [Google Scholar] [CrossRef]
  138. Hayashi, T.; Ohsawa, Y. Variable-Based Network Analysis of Datasets on Data Exchange Platforms. arXiv 2020, arXiv:2003.05109. [Google Scholar]
  139. Nair, S. A review on ethical concerns in big data management. Int. J. Big Data Manag. 2020, 1, 8. [Google Scholar] [CrossRef]
  140. Bernasconi, L.; Sen, S.; Angerame, L.; Balyegisawa, A.P.; Hui, D.H.Y.; Hotter, M.; Hsu, C.Y.; Ito, T.; Jorger, F.; Krassnitzer, W.; et al. Legal and ethical framework for global health information and biospecimen exchange—An international perspective. BMC Med. Ethics 2020, 21, 8. [Google Scholar] [CrossRef]
  141. Morrison, M. Research using free text data in medical records could benefit from dynamic consent and other tools for responsible governance. J. Med. Ethics 2020, 46, 380–381. [Google Scholar] [CrossRef]
  142. Benson, T.; Grieve, G. Privacy and Consent; Springer: Cham, Switzerland, 2020; pp. 363–378. [Google Scholar] [CrossRef]
  143. Brewer, S.; Pearson, S.; Maull, R.; Godsiff, P.; Frey, J.G.; Zisman, A.; Parr, G.; McMillan, A.; Cameron, S.; Blackmore, H.; et al. A trust framework for digital food systems. Nat. Food 2021, 2, 543–545. [Google Scholar] [CrossRef]
  144. Garcia, R.; Ramachandran, G.S.; Jurdak, R.; Ueyama, J. A Blockchain-based Data Governance with Privacy and Provenance: A case study for e-Prescription. In Proceedings of the 2022 IEEE International Conference on Blockchain and Cryptocurrency (ICBC), Shanghai, China, 2–5 May 2022; pp. 1–5. [Google Scholar] [CrossRef]
  145. Milne, R.; Brayne, C. We need to think about data governance for dementia research in a digital era. Alzheimer’s Res. Ther. 2020, 12, 17. [Google Scholar] [CrossRef]
  146. Lucivero, F.; Samuel, G.; Blair, G.; Darby, S.J. Data-Driven Unsustainability? An Interdisciplinary Perspective on Governing the Environmental Impacts of a Data-Driven Society; SSRN: Rochester, NY, USA, 2020. [Google Scholar] [CrossRef]
  147. Wang, Z.; Chen, C.; Guo, C. Dynamic Resources Management Under Limited Communication Based on Multi-level Agent System. In Proceedings of the 2023 IEEE 6th International Conference on Electronic Information and Communication Technology (ICEICT), Qingdao, China, 21–24 July 2023; IEEE: New York, NY, USA, 2023; pp. 846–849. [Google Scholar]
  148. Joshi, N.; Srivastava, S. Online Task Allocation and Scheduling in Fog IoT using Virtual Bidding. In Proceedings of the 2022 IEEE 10th Region 10 Humanitarian Technology Conference (R10-HTC), Hyderabad, India, 16–18 September 2022; IEEE: New York, NY, USA, 2022; pp. 81–86. [Google Scholar]
  149. Hu, G.; Zhu, Y.; Zhao, D.; Zhao, M.; Hao, J. Event-triggered communication network with limited-bandwidth constraint for multi-agent reinforcement learning. IEEE Trans. Neural Netw. Learn. Syst. 2021, 34, 3966–3978. [Google Scholar] [CrossRef]
  150. Dos Santos, L.M.; Gracioli, G.; Kloda, T.; Caccamo, M. On the design and implementation of real-time resource access protocols. In Proceedings of the 2020 X Brazilian Symposium on Computing Systems Engineering (SBESC), Florianopolis, Brazil, 24–27 November 2020; IEEE: New York, NY, USA, 2020; pp. 1–8. [Google Scholar]
  151. Kim, M.; Sinha, S.; Orso, A. Adaptive REST API Testing with Reinforcement Learning. In Proceedings of the 2023 38th IEEE/ACM International Conference on Automated Software Engineering (ASE), Echternach, Luxembourg, 11–15 November 2023; IEEE: New York, NY, USA, 2023; pp. 446–458. [Google Scholar]
  152. Noel, R.R.; Mehra, R.; Lama, P. Towards self-managing cloud storage with reinforcement learning. In Proceedings of the 2019 IEEE International Conference on Cloud Engineering (IC2E), Prague, Czech Republic, 24–27 June 2019; IEEE: New York, NY, USA, 2019; pp. 34–44. [Google Scholar]
  153. Banno, R.; Shudo, K. Adaptive topology for scalability and immediacy in distributed publish/subscribe messaging. In Proceedings of the 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC), Madrid, Spain, 13–17 July 2020; IEEE: New York, NY, USA, 2020; pp. 575–583. [Google Scholar]
  154. Dincă, A.M.; Axinte, S.D.; Bacivarov, I.C.; Petrică, G. Reliability enhancements for high-availability systems using distributed event streaming platforms. In Proceedings of the 2023 IEEE 29th International Symposium for Design and Technology in Electronic Packaging (SIITME), Craiova, Romania, 18–20 October 2023; IEEE: New York, NY, USA, 2023; pp. 41–46. [Google Scholar]
  155. Kozhaya, D.; Decouchant, J.; Rahli, V.; Esteves-Verissimo, P. Pistis: An event-triggered real-time byzantine-resilient protocol suite. IEEE Trans. Parallel Distrib. Syst. 2021, 32, 2277–2290. [Google Scholar] [CrossRef]
  156. Štufi, M.; Bačić, B. Designing a real-time iot data streaming testbed for horizontally scalable analytical platforms: Czech post case study. arXiv 2021, arXiv:2112.03997. [Google Scholar]
  157. Otavio Chervinski, J.; Kreutz, D.; Yu, J. Towards Scalable Cross-Chain Messaging. arXiv 2023, arXiv:2310.10016. [Google Scholar]
  158. Vimal, S.; Vadivel, M.; Baskar, V.V.; Sivakumar, V.; Srinivasan, C. Integrating IoT and Machine Learning for Real-Time Patient Health Monitoring with Sensor Networks. In Proceedings of the 2023 4th International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 20–22 September 2023; IEEE: New York, NY, USA, 2023; pp. 574–578. [Google Scholar]
  159. Im, J.; Lee, J.; Lee, S.; Kwon, H.Y. Data pipeline for real-time energy consumption data management and prediction. Front. Big Data 2024, 7, 1308236. [Google Scholar] [CrossRef] [PubMed]
  160. Reddy, K.P.; Satish, M.; Prakash, A.; Babu, S.M.; Kumar, P.P.; Devi, B.S. Machine Learning Revolution in Early Disease Detection for Healthcare: Advancements, Challenges, and Future Prospects. In Proceedings of the 2023 IEEE 5th International Conference on Cybernetics, Cognition and Machine Learning Applications (ICCCMLA), Hamburg, Germany, 7–8 October 2023; IEEE: New York, NY, USA, 2023; pp. 638–643. [Google Scholar]
  161. Sedaghat, A.; Arbabkhah, H.; Jafari Kang, M.; Hamidi, M. Deep Learning Applications in Vessel Dead Reckoning to Deal with Missing Automatic Identification System Data. J. Mar. Sci. Eng. 2024, 12, 152. [Google Scholar] [CrossRef]
  162. Chang, A.; Wu, X.; Liu, K. Deep learning from latent spatiotemporal information of the heart: Identifying advanced bioimaging markers from echocardiograms. Biophys. Rev. 2024, 5, 011304. [Google Scholar] [CrossRef]
  163. Yin, K.; Huang, H.; Liang, W.; Xiao, H.; Wang, L. Network Coding for Efficient File Transfer in Narrowband Environments. Inf. Technol. Control 2023, 52. [Google Scholar] [CrossRef]
  164. Hao, Q.; Qin, L. The design of intelligent transportation video processing system in big data environment. IEEE Access 2020, 8, 13769–13780. [Google Scholar] [CrossRef]
  165. Van Besien, W.L.; Ferris, B.; Dudish, J. Reliable, Efficient Large-File Delivery over Lossy, Unidirectional Links. In Proceedings of the 2021 IEEE Aerospace Conference (50100), Big Sky, MT, USA, 6–13 March 2021; IEEE: New York, NY, USA, 2021; pp. 1–10. [Google Scholar]
  166. He, Q.; Gao, P.; Zhang, F.; Bian, G.; Zhang, W.; Li, Z. Design and optimization of a distributed file system based on RDMA. Appl. Sci. 2023, 13, 8670. [Google Scholar] [CrossRef]
  167. AbuDaqa, A.A.; Mahmoud, A.; Abu-Amara, M.; Sheltami, T. Survey of network coding based P2P file sharing in large scale networks. Appl. Sci. 2020, 10, 2206. [Google Scholar] [CrossRef]
Figure 1. PRISMA 2020 flow diagram showing the study selection process.
Table 1. Summary of the Systematic Review Process.

| Phase | Description |
|---|---|
| 1. Planning the Review | Defined the review scope and objectives, formulated research questions, and established inclusion/exclusion criteria. Targeted studies published between 2020 and 2024 in English. Followed the guidelines proposed by Keele [34]. |
| 2. Article Retrieval | Searched IEEE Xplore, ACM Digital Library, and SpringerLink using terms such as "data exchange", "blockchain in data processing", "AI in secure data sharing", and "FPGA for data handling". Snowballing was applied to capture additional relevant studies. |
| 3. Screening and Eligibility | Removed duplicates and screened 313 articles by title, abstract, and full text. Applied the PRISMA 2020 framework (Figure 1). Retained 102 eligible studies for full review. |
| 4. Data Extraction and Analysis | Reviewed each article to extract methodological details, challenges addressed, and domain relevance. Of these, 92 studies were classified thematically (Table 2) and 74 by problem area (Table 3). Quality assessment ensured analytical rigor. |
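The screening step in Table 1 can be sketched as a simple filter over candidate records; the toy records and field names below are illustrative assumptions, not data from the review itself:

```python
# Illustrative sketch: applying the inclusion criteria from Table 1
# (studies published 2020-2024, written in English) to a toy candidate
# list. Record fields are assumptions chosen for illustration.
candidates = [
    {"title": "Blockchain for data exchange", "year": 2021, "language": "English"},
    {"title": "FPGA acceleration survey",     "year": 2019, "language": "English"},
    {"title": "AI en intercambio de datos",   "year": 2022, "language": "Spanish"},
]

def meets_inclusion_criteria(study):
    """Inclusion criteria from the planning phase of the review."""
    return 2020 <= study["year"] <= 2024 and study["language"] == "English"

included = [s["title"] for s in candidates if meets_inclusion_criteria(s)]
print(included)  # only the 2021 English study passes
```

In the actual review this filtering was performed manually under the PRISMA 2020 framework; the snippet only mirrors the stated criteria.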
Table 2. Distribution of Selected Papers by Topic.

| Topic | Number of References |
|---|---|
| Blockchain and Distributed Ledger | 16 |
| Data Privacy and Security | 14 |
| Healthcare and Medical Data | 10 |
| AI and Machine Learning Applications | 9 |
| Ethical and Legal Frameworks | 9 |
| IoT (Internet of Things) | 7 |
| Big Data and Analytics | 7 |
| Zero-Knowledge Proofs | 6 |
| Cybersecurity | 5 |
| FPGA-Based Acceleration | 5 |
| Digital Food Systems | 4 |
Table 3. Classification of Selected Papers by Addressed Problems.

| Problem Addressed | Number of Papers |
|---|---|
| Interoperability and Standardization | 14 |
| Scalability and Performance | 11 |
| Data Security and Privacy | 23 |
| Real-Time Processing | 9 |
| Regulatory Compliance | 10 |
| Ethical and Governance Aspects | 7 |
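As a consistency check, the per-category counts in Tables 2 and 3 can be tallied against the totals stated in Table 1 (92 studies classified thematically, 74 by problem area); the snippet below simply transcribes the table data:

```python
# Tallying the counts reported in Tables 2 and 3 against the totals in
# Table 1 (92 thematic classifications, 74 problem-area classifications).
topics = {
    "Blockchain and Distributed Ledger": 16,
    "Data Privacy and Security": 14,
    "Healthcare and Medical Data": 10,
    "AI and Machine Learning Applications": 9,
    "Ethical and Legal Frameworks": 9,
    "IoT (Internet of Things)": 7,
    "Big Data and Analytics": 7,
    "Zero-Knowledge Proofs": 6,
    "Cybersecurity": 5,
    "FPGA-Based Acceleration": 5,
    "Digital Food Systems": 4,
}
problems = {
    "Interoperability and Standardization": 14,
    "Scalability and Performance": 11,
    "Data Security and Privacy": 23,
    "Real-Time Processing": 9,
    "Regulatory Compliance": 10,
    "Ethical and Governance Aspects": 7,
}
assert sum(topics.values()) == 92   # matches Table 1, phase 4
assert sum(problems.values()) == 74 # matches Table 1, phase 4
```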
Table 4. Comparison of Emerging Technologies for Data Exchange.

| Technology | Security | Scalability | Latency | Regulatory Compliance |
|---|---|---|---|---|
| Blockchain | High | Medium | High | Strong (e.g., GDPR-ready) |
| FPGA-Based Systems | Medium | High | Low | Low |
| Zero-Knowledge Proofs | Very High | Low | High | Very Strong |
| IoT Architectures | Low | Very High | Medium | Weak |
| AI-Driven Solutions | Variable | High | Medium | Medium |
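The qualitative ratings in Table 4 lend themselves to a simple shortlist helper; the ranking scale and selection function below are illustrative assumptions, not part of the reviewed literature:

```python
# Illustrative helper: encode Table 4's qualitative ratings on an
# ordinal scale and shortlist technologies by minimum security and
# maximum tolerable latency. The scale and helper are assumptions.
LEVELS = {"Low": 0, "Medium": 1, "High": 2, "Very High": 3}

TECHNOLOGIES = {
    "Blockchain":            {"security": "High",      "latency": "High"},
    "FPGA-Based Systems":    {"security": "Medium",    "latency": "Low"},
    "Zero-Knowledge Proofs": {"security": "Very High", "latency": "High"},
    "IoT Architectures":     {"security": "Low",       "latency": "Medium"},
}
# AI-Driven Solutions is omitted: its "Variable" security rating has no
# fixed position on the ordinal scale.

def shortlist(min_security, max_latency):
    return [name for name, r in TECHNOLOGIES.items()
            if LEVELS[r["security"]] >= LEVELS[min_security]
            and LEVELS[r["latency"]] <= LEVELS[max_latency]]

print(shortlist("High", "Medium"))  # []  (high security implies high latency here)
print(shortlist("Medium", "Low"))   # ['FPGA-Based Systems']
```

The empty first result reflects the trade-off visible in Table 4: the high-security options (blockchain, zero-knowledge proofs) both carry high latency.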
Table 5. Tools for Real-Time Data Management and Data Integration.

| Category | Tool (Reference) | Key Features | Applications in Industries |
|---|---|---|---|
| Real-Time Data Management | Apache Kafka [58] | Real-time streaming • High-throughput • Fault-tolerant | Telecommunications, Finance |
| | Redis [59] | In-memory • Low latency • Message broker | E-commerce, Gaming |
| | Amazon Kinesis [60] | Scalable • Real-time • Analytics-ready | Log analysis, Media monitoring |
| | Confluent Platform [61] | Kafka-enhanced • Secure • Manageable | Recommendation systems, IoT |
| Data Integration | Talend [62] | ETL • Cloud/on-premise • Transformations | Healthcare, Finance |
| | MuleSoft [63] | API-based • Multi-environment • Connectivity | Retail, Financial services |
| | Apache NiFi [64] | Visual flows • Routing • Real-time | Government, Cybersecurity |
| | Dell Boomi [65] | Low-code • Visual workflows • iPaaS | Education, Healthcare |
| Data Security | Symantec Data Loss Prevention [66] | Data protection • Monitoring • Prevention | Corporate IT, Legal |
| | IBM Guardium [67] | Threat protection • Monitoring • Compliance | Finance, Healthcare |
| | Cisco SecureX [68] | Unified visibility • Threat response | Corporate, Critical infrastructure |
| | Palo Alto Networks Prisma Access [69] | Cloud-delivered • Remote access • Secure | Healthcare, Government |
| Predictive Analysis and Machine Learning | Google Cloud AI Platform [70] | End-to-end ML • Scalable • Deployment | Retail, Technology |
| | Microsoft Azure Machine Learning [71] | Cloud-based • Workflow management • Scalable | Healthcare, Finance |
| | SAS Viya [72] | Advanced ML • Unified platform • Analytics | Retail, Financial services |
| | IBM Watson [73] | AI+NLP • Analytics • Automation | Healthcare, Education |
Table 6. Comparative Analysis of Database Integration: Scalability and Compliance Features.

| Database Technology (Reference) | Scalability | Native Integration | Protocol Support | Regulatory Compliance |
|---|---|---|---|---|
| MuleSoft Anypoint [100] | Vertical/Horizontal | Salesforce, SAP | HTTP, REST, SOAP | GDPR, HIPAA |
| TIBCO [101] | Vertical | Salesforce | HTTP, REST, JMS | GDPR |
| IBM Integration Bus [102] | Vertical | IBM Cloud | HTTP, REST, SOAP, MQTT | GDPR, HIPAA |
| Microsoft Azure Data Factory [103] | Horizontal | Azure services | HTTP, REST | GDPR, Azure Policy |
| AWS Data Pipeline [104] | Horizontal | AWS services | AWS SDK | GDPR, HIPAA |
| Apache Kafka [58] | Horizontal | Hadoop, Spark | Kafka Protocol | - |
| Dell Boomi [105] | Vertical | Salesforce, SAP | HTTP, REST | GDPR, HIPAA |
| SAP Data Services [106] | Vertical | SAP | HTTP, REST, SOAP | GDPR, SAP Policy |
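A compliance-driven shortlist over Table 6 can be expressed in a few lines; the platform data below is transcribed from the table, while the helper function itself is an illustrative assumption:

```python
# Illustrative sketch: shortlist integration platforms from Table 6 by
# scalability model and regulatory coverage. Data transcribed from the
# table; the query helper is an assumption for illustration.
PLATFORMS = {
    "MuleSoft Anypoint":            {"scalability": "Vertical/Horizontal", "compliance": {"GDPR", "HIPAA"}},
    "TIBCO":                        {"scalability": "Vertical",   "compliance": {"GDPR"}},
    "IBM Integration Bus":          {"scalability": "Vertical",   "compliance": {"GDPR", "HIPAA"}},
    "Microsoft Azure Data Factory": {"scalability": "Horizontal", "compliance": {"GDPR", "Azure Policy"}},
    "AWS Data Pipeline":            {"scalability": "Horizontal", "compliance": {"GDPR", "HIPAA"}},
    "Apache Kafka":                 {"scalability": "Horizontal", "compliance": set()},
    "Dell Boomi":                   {"scalability": "Vertical",   "compliance": {"GDPR", "HIPAA"}},
    "SAP Data Services":            {"scalability": "Vertical",   "compliance": {"GDPR", "SAP Policy"}},
}

def hipaa_ready_horizontal():
    """Platforms in Table 6 offering horizontal scaling and HIPAA compliance."""
    return sorted(name for name, p in PLATFORMS.items()
                  if "Horizontal" in p["scalability"] and "HIPAA" in p["compliance"])

print(hipaa_ready_horizontal())  # ['AWS Data Pipeline', 'MuleSoft Anypoint']
```

Such a filter makes the table's trade-off explicit: only two of the eight platforms pair horizontal scalability with HIPAA coverage.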