Article

Towards Empowering Stakeholders Through Decentralized Trust and Secure Livestock Data Sharing

by
Abdul Ghafoor
1,*,
Iraklis Symeonidis
1,
Anna Rydberg
2,
Cecilia Lindahl
2 and
Abdul Qadus Abbasi
3
1
Industrial Systems, RISE Research Institutes of Sweden AB, Isafjordsgatan 28 A, 16440 Kista, Sweden
2
Jordbruk och Trädgård, RISE Research Institutes of Sweden AB, Ultunaallén 4, 75007 Uppsala, Sweden
3
Institute of Information Technology, Quaid-e-Azam University, Islamabad 44000, Pakistan
*
Author to whom correspondence should be addressed.
Cryptography 2025, 9(3), 52; https://doi.org/10.3390/cryptography9030052
Submission received: 21 May 2025 / Revised: 7 July 2025 / Accepted: 15 July 2025 / Published: 23 July 2025
(This article belongs to the Special Issue Emerging Trends in Blockchain and Its Applications)

Abstract

Cybersecurity represents a critical challenge for data-sharing platforms involving multiple stakeholders, particularly within complex and decentralized systems such as livestock supply chain networks. These systems demand novel approaches, robust security protocols, and advanced data management strategies to address key challenges such as data consistency, transparency, ownership, controlled access or exposure, and privacy-preserving analytics for value-added services. In this paper, we introduce the Framework for Livestock Empowerment and Decentralized Secure Data eXchange (FLEX), a comprehensive solution grounded in five core design principles: (i) enhanced security and privacy, (ii) a human-centric approach, (iii) decentralized and trusted infrastructure, (iv) system resilience, and (v) seamless collaboration across the supply chain. FLEX integrates interdisciplinary innovations, leveraging decentralized infrastructure-based protocols to ensure trust, traceability, and integrity. It employs secure data-sharing protocols and cryptographic techniques to enable controlled information exchange with authorized entities. Additionally, the use of data anonymization techniques ensures privacy. FLEX is designed and implemented using a microservices architecture and edge computing to support modularity and scalable deployment. These components collectively serve as a foundational pillar for the development of a digital product passport. The FLEX architecture adopts a layered design and incorporates robust security controls to mitigate threats identified using the STRIDE threat modeling framework. The evaluation results demonstrate the framework's effectiveness in countering well-known cyberattacks while fulfilling its intended objectives. The performance evaluation of the implementation further validates its feasibility and stability, particularly as the volume of evidence associated with animal identities increases.
All the infrastructure components, along with detailed deployment instructions, are publicly available as open-source libraries on GitHub, promoting transparency and community-driven development for wider public benefit.

1. Introduction

In the midst of a rapid transition to a data-driven economy, where innovations such as smart farming and Agri 4.0 are redefining the agricultural landscape, this research work introduces the Framework for Livestock Empowerment and Decentralized Secure Data eXchange (FLEX). FLEX is designed to transform data sharing in the livestock sector by enabling transparent, secure, and privacy-preserving exchange. Despite the broader digital transformation in agriculture, the livestock industry remains one of the least digitized segments. This study aims to bridge this gap by enhancing the capabilities of existing data-sharing platforms. Our approach is informed by a comprehensive analysis of current solutions, insights gathered from stakeholder interviews, and the identification of critical challenges through targeted research. FLEX addresses these gaps with a robust, security-focused framework tailored to the unique demands of the livestock supply chain.
Security and trust management are fundamental functions in any complex, heterogeneous, and interconnected system, particularly when multiple organizations are involved. The livestock supply chain exemplifies such a system, where sensitive information is frequently exchanged with partners who may not be fully trusted. To ensure the security, traceability, verifiability and privacy of data within this supply chain, it is essential to implement internal and external security controls while assessing the risks associated with third-party interactions [1]. These measures are essential in establishing a reliable and resilient system that can withstand various challenges.
In the livestock supply chain, sensitive data are exchanged between authorized stakeholders. However, the lack of pre-established trust and secure communication channels between these parties introduces significant security challenges [2]. Addressing these challenges requires more than just robust authentication and authorization mechanisms; it also requires the incorporation of verifiability, traceability, and data ownership features. These elements are critical to building the confidence of data consumers and ensuring the integrity of the shared data. By aligning these requirements with the principles of Zero Trust Architecture (ZTA) [3], originally developed for data centers and cloud environments, key elements of this approach can be effectively adapted within the livestock supply chain. Furthermore, its core principles can be extended to support the unique demands of distributed and decentralized systems in this domain. With the rise of increasingly complex cybersecurity threats, ZTA has gained prominence for its proactive and dynamic security approach, which this research leverages. Implementing ZTA in the livestock supply chain stands to benefit from its micro-segmentation capabilities and stringent authentication, authorization, and verification processes for each data request and sharing event. However, data sovereignty and asset classification need to be managed at the application level. In this context, Web3 technologies, particularly distributed ledger technology (DLT), can support these advanced features. Web3 solutions are gaining industry adoption where privacy and data sovereignty are critical. These solutions utilize concepts such as digital wallets to manage digital assets and cryptographic credentials. These wallets facilitate the exchange of information, while DLT provides a trusted repository, an immutable source of truth. 
The combination of immutability, source authentication, system resilience, and availability contributes to the establishment of a reliable, decentralized infrastructure, making DLTs suitable for livestock supply chain management and data sharing applications. DLT enables trust among participants, ensures data integrity, and facilitates efficient and automated processes through smart contracts.
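The trust role played by the DLT here can be sketched minimally: only a digest of each record is anchored on the ledger, while the data itself stays with its owner, so any consumer can later check a received record against its immutable anchor. The following is a hedged illustration only, with an in-memory dictionary standing in for the ledger; FLEX's actual ledger interface and smart contracts are not shown here.

```python
import hashlib
import json

# Illustrative stand-in for a DLT: record_id -> digest, append-only by convention.
ledger = {}

def canonical_digest(record: dict) -> str:
    """Hash a record in a canonical JSON form so every party derives the same digest."""
    payload = json.dumps(record, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(payload).hexdigest()

def anchor(record_id: str, record: dict) -> None:
    """Write the digest (not the data) to the ledger; the data stays at the edge."""
    if record_id in ledger:
        raise ValueError("anchors are immutable")
    ledger[record_id] = canonical_digest(record)

def verify(record_id: str, record: dict) -> bool:
    """A data consumer checks a received record against its on-ledger anchor."""
    return ledger.get(record_id) == canonical_digest(record)

record = {"animal_id": "SE-0421", "event": "vaccination", "date": "2025-03-01"}
anchor("evt-1", record)
assert verify("evt-1", record)                          # untampered record verifies
assert not verify("evt-1", {**record, "date": "2025-03-02"})  # tampering is detected
```

The design choice illustrated is the common pattern of keeping bulky or sensitive payloads off-chain and anchoring only their digests, which addresses the storage and privacy-leakage concerns raised later in Section 2.2.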
In terms of data sharing and consent management, selective disclosure plays a crucial role. This practice allows data to be shared only with authorized individuals or groups for a particular purpose, a concept aligned with data protection and privacy regulations such as the General Data Protection Regulation (GDPR). When integrated with Verifiable Credentials, selective disclosure not only controls data sharing but also enhances data verification, which is essential for building trust among data consumers. Accurate data collection and analysis are fundamental to effective livestock management, providing vital information for decision-making regarding feed usage and the nutrient profile of an animal's diet. Regular monitoring enables the tracking of growth rates and the early detection of health issues, while systematic data integration lays the foundation for predictive analytics that support informed decision-making across specific breeds and operational domains. Advanced analytics play a crucial role in livestock record keeping by facilitating the monitoring of individual animal development, evaluating feed efficiency, and optimizing breeding strategies. The emergence of artificial intelligence (AI) has transformed data collection practices, i.e., from individual stakeholders to a more comprehensive cross-stakeholder approach. However, this transition introduces significant trust-related challenges, particularly concerning data ownership, integrity, and secure sharing across organizational boundaries. Maintaining well-documented records significantly enhances farm productivity and profitability, encouraging farmers to utilize resources from reputable agricultural institutions for best practices in weight estimation and record-keeping systems.
In this study, FLEX investigates the infrastructure and trust platform necessary to enable and execute analytic functions within and across stakeholder boundaries, thereby facilitating a comprehensive trusted data collection system for the livestock supply chain domain. Furthermore, this research addresses the security issues and challenges inherent in data sharing among various stakeholders in the livestock supply chain. After reviewing existing approaches, particularly those involving distributed infrastructure, we identified a clear gap: most of the existing approaches focus on basic security controls and traceability features, while neglecting the following four critical aspects of data-sharing applications.
  • Consistency: Data objects should remain consistent throughout the supply chain and be verifiable, so that at any stage the data consumer can verify their correctness and ensure data-object integrity.
  • Data ownership: This control empowers data owners with the ability to manage their data, which is a fundamental principle in privacy and data protection and aligns with privacy by design. The solution should provide a robust and transparent mechanism for individuals to control and consent to the use of their data. It also supports compliance with data protection regulations that emphasize the importance of informed and verifiable consent.
  • Controlled exposure of data: This emphasizes sharing only the necessary data and implementing security controls aligned with the principles of data minimization and a risk-based approach to data sharing. By incorporating these security controls and principles, livestock owners can strike a balance between facilitating necessary data sharing for business purposes and safeguarding the privacy and security of data owners.
  • Data Analytics: Facilitating comprehensive and accurate data collection and analysis within and across stakeholders to enhance livestock management through predictive analytics, while addressing security and trust-related challenges in AI-driven cross-stakeholder data aggregation.
Building on the motivations outlined above, the proposed FLEX offers a secure data sharing architecture that not only complies with privacy regulations such as the GDPR but also fosters trust among stakeholders within the data sharing ecosystem. We contend that this research is distinctive and transformative, particularly as the agricultural sector advances towards a data-driven economy, with smart farming and Agri 4.0 serving as foundational pillars of this transition. In this context, data generated from a wide array of sensors, devices, and digital services act as primary data producers. Consequently, secure and interoperable data sharing platforms like the Framework for Livestock Empowerment and Decentralized Secure Data eXchange (FLEX) become essential components of the digital infrastructure, enabling seamless integration, supporting informed decision making, building a data-driven economy, and accelerating the sector's digital transformation. Our approach is built on a hybrid model, wherein operational data from farms, transporters, and slaughterhouses are managed locally at the edge, while data intended for regulatory agencies, stakeholders, and external data processing services are handled through global data spaces. At the edge level, data owners such as farmers or farm owners, transport operators or agencies, and slaughterhouse managers are empowered to apply granular security controls based on the data classification. In addition, data owners implement cryptographic mechanisms to ensure data integrity, authenticity, and confidentiality. Within the global data space, dynamic access control policies are enforced in accordance with the data owner's consent. This enables the data to be selectively shared with authorized recipients (e.g., government agencies, supply chain stakeholders, etc.) or made openly available for analytic purposes.
When data owners choose to remain anonymous (e.g., for open data sharing), the shared data are anonymized while retaining their verifiability and authenticity for data consumers. This ensures privacy without compromising data utility. Our approach is guided by the following core principles:
  • Enhanced security and privacy: The emphasis on data security, integrity, and privacy is crucial, especially when sensitive information is shared with partners through an open data-sharing platform.
  • Human-centric approach: The human-centric approach empowers individuals to manage and control their own data when publishing it through the data-sharing platform.
  • Decentralized trusted infrastructure: The platform provides a trusted infrastructure for resilient verification of shared objects, supporting trusted environments for all stakeholders.
  • System resilience: Because the system is built on a decentralized architecture, it inherently provides resilience.
  • Seamless collaboration across the supply chain: The platform's ability to bridge organizational boundaries can foster collaboration on a broader scale. This is particularly important in the livestock industry and supply chain, where collaboration between different entities is often necessary for effective utilization of the shared data.
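The "anonymous yet verifiable" sharing mentioned above can be illustrated with keyed pseudonyms: the owner's identity is replaced by an HMAC-derived pseudonym before publication, while a digest over the published record keeps it verifiable by consumers. This is a hypothetical sketch only; FLEX's actual anonymization techniques are not specified here, and the key name and field names are illustrative.

```python
import hashlib
import hmac
import json

# Illustrative secret held only by the data owner (e.g., the farm).
OWNER_KEY = b"farm-secret-key"

def pseudonymize(record: dict, identity_field: str) -> dict:
    """Replace the identity with a keyed pseudonym.

    The pseudonym is stable for the same owner under the same key, so only
    the key holder can later re-derive it and prove ownership.
    """
    anon = dict(record)
    pseudonym = hmac.new(OWNER_KEY, record[identity_field].encode(),
                         hashlib.sha256).hexdigest()[:16]
    anon[identity_field] = pseudonym
    return anon

def digest(record: dict) -> str:
    """Digest of the published record; this can be signed or anchored."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

record = {"owner": "Farm AB", "animal_id": "SE-0421", "weight_kg": 412}
published = pseudonymize(record, "owner")
published_digest = digest(published)

assert published["owner"] != "Farm AB"                       # identity is hidden
assert digest({**published, "weight_kg": 999}) != published_digest  # tampering detectable
assert pseudonymize(record, "owner")["owner"] == published["owner"]  # owner can re-derive
```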
The structure of the paper is as follows: Section 2 covers the background and existing solutions, along with their analysis. Section 3 describes the use case, which is followed by the proposed FLEX architecture, including its components and various flows, in Section 4. Section 5 discusses the analytics and value-added services provided by FLEX, while the evaluation and discussion of the results are presented in Section 6. Section 7 concludes the paper with final remarks and a discussion of the architecture's limitations.

2. Background and Existing Approaches

As the concept of Agriculture 4.0 evolves, agriculture and farming data are considered the basic input for the next generation of the agriculture and farming industry [4]. Accordingly, various properties of processes, data handling, and location have been identified where more investment is required to increase the profitability of the agricultural sector. All steps in the value chain are important to the value creation process, including data generation, data acquisition, data storage, and data analysis, while farm processes, farm management, and the underlying data chain are important contributors to the system [5,6]. The European Union strongly believes that the digitalization of the agricultural and livestock sector has great potential to revolutionize the industry, promoting efficiency, sustainability, and competitiveness [7]. To this end, it created the European Data Act, which sets new rules and regulations for a fair and innovative data economy.
Data handling and sharing also introduce various technological and societal challenges, which must be addressed at the community level [8]. The agricultural sector and the farming industry now recognize that the benefits of data sharing outweigh those of keeping data in silos. If farmers open up their data, various issues related to food safety, traceability, and transparency can be addressed more easily, enabling the digital transformation of this sector.

2.1. Security Challenges in Agriculture Data Sharing

In data-sharing applications, societal trust and technological challenges are the main barriers to data sharing. On the societal side, farmers fear that data exposure may reveal their commercial secrets to competitors, and within the livestock supply chain they often do not trust other supply chain stakeholders. Farmers are sometimes reluctant to share data due to a lack of trust in those who gather, analyze, and share the data, and uncertainty about how the data will eventually be used and shared [9]. Only six percent (6%) of farmers trust such service providers, while 32 percent (32%) have no trust in them at all. On the technological side, data models, security, privacy, and transparency of data access are the core challenges. In addition, issues concerning data quality, interoperability, traceability, intellectual property ownership, and data privacy need to be handled when designing solutions for data sharing [10]. In this regard, public and private sector partnerships must be established to build trust, which will help meet the defined objectives of a data-driven economy [11].
To overcome the trust, security, and privacy challenges, the extended security principle CIAPri (Confidentiality, Integrity, Availability, and Privacy) is the basic guideline for integrating security controls into data-sharing applications. These controls become more challenging as system complexity increases and systems of systems are integrated with each other for larger benefits. The agriculture and livestock supply chain is one such case, where various actors are involved in transactions. The security posture of the livestock industry is not yet mature, since individual farms are normally non-technical and have very little knowledge of Information Technology (IT) and security. Most of them are not well aware of the deployed topology, installed IT equipment, sensors, etc. [12].
Various sensors, digital devices, and equipment are used in smart farming; therefore, advanced security issues like privacy, social engineering attacks, ransomware attacks, DDoS attacks, and attack vectors related to cyber-physical systems must be addressed at each phase of the system life cycle [13]. For example, a smart solution for tracking animal movements, e.g., when they go out for grazing, uses IoT technology and GPS sensors for tracking and geofencing [14]. Such tracking and navigation solutions are equipped with GPS sensors, a communication module, and weather sensors. The navigation devices collect data about the position of the animal, while the communication devices send the current location back to computing and storage servers. All these data can be used for business, customer satisfaction, and animal welfare analytics. If there is any issue with the correctness or reliability of the data, the data will not hold much value.
Data generated for scientific purposes or value-added services must follow the FAIR principles, i.e., be findable, accessible, interoperable, and reusable [15], since farmers are slow to adopt new standards and digital transformation compared with other industries. Specifically, the use of precision farming technologies and the development and adoption of open data standards are particularly low in extensive livestock farming. The authors of [15] performed data analytics on available datasets in Australia and tested their proposed FAIRness and other quality metrics. This approach enabled them to perform better data analytics; however, privacy, security, and data classification were missing. Analytics on real-time data collected from various IoT devices also introduces further issues beyond security and privacy concerns, such as interoperability. To handle communication-level security, especially in a microservice-based solution, HTTPS is the most suitable security control, while for access control, proper authentication and authorization controls are the available measures [16].
Data shared through data-sharing platforms are used for analytics, which helps management make better decisions for food security [17]. The correctness and accuracy of analytics and decisions depend on the trusted originality and authenticity of the data throughout the value chain. These requirements become more challenging when data sharing is performed across the boundaries of the agri-food value chain. Existing solutions focus only on the correctness of the data and on sharing for analytics; however, the main hindrance to data sharing is security issues, such as the protection of users' data, trust between the actors, and the data owner's confidence in sharing data with other actors across borders under Agri 4.0 [18].

2.2. Blockchain-Enabled Traceability Solutions

Blockchain and distributed ledger technologies are disruptive technologies with inherent security properties such as trust, data immutability, and source authenticity. These properties make them suitable candidates for agriculture supply chain monitoring. However, their adoption in the agriculture domain raises several challenges, including storage capacity, privacy leakage, high cost, throughput, and latency. Blockchain technology has a positive impact on the profitability of users, which leads to an increase in extrinsic food quality attributes [19]. It fosters better information management along food chains, since the information is exposed, accessible, and available for various analytical services. Therefore, to make solutions more robust, integrating blockchain technology with advanced Information and Communication Technologies (ICT) and the Internet of Things (IoT) can produce better results for achieving trust, traceability, and information security in the agriculture and food industry [20]. To integrate blockchain with the agriculture sector, various enablers (functions) must be carefully investigated, and the respective data models should be defined to meet the comprehensive list of requirements for traceability, security, and future applications [21,22].
In the early stages of digitizing the cattle farming industry, the development of traceability solutions for managing the Calf Birth Registration System and the Cattle Movement Monitoring System was the most popular trend [23]. These systems recorded fundamental information about an animal's identity and movements through electronic tags, but they were developed using conventional software development technologies. With the advent of blockchain, many already deployed solutions have incorporated this disruptive technology to enhance traceability. The real-time tracking of goods and items remains a continual challenge, demanding reliable, consistent, and secure tracking information. In the private sector, CyberSecurity [24], a company operating in the information technology (IT) security sector, developed the Milk Verification Project, a prototype that counters food fraud in the dairy supply chain using blockchain technology. This prototype automates the acquisition and registration of information in supply chain processes. Currently, the integration of blockchain technology with the IoT offers a more resilient solution, contributing to the preservation of consistent data; additionally, the source of information is authenticated, instilling trust in the shared data [25]. Blockchain and IoT are the most popular combination for deploying traceability solutions, where data are collected from various sources and recorded on the blockchain. Similarly, in the livestock and agriculture sector, data collected from IoT devices, farms, food processing units, and customers are stored and shared through a blockchain network [26]. In most solutions, the functional requirement is traceability; however, since the data are open and accessible to anyone, security, privacy, regulatory compliance, trusted relationships between stakeholders, data ownership, scalability, etc., are equally important [27].
If the above-mentioned issues are addressed, blockchain has significant implications for data sharing, offering enhanced security, transparency, and decentralization. However, it is essential to consider factors such as scalability, energy consumption, and regulatory compliance when implementing blockchain solutions. As the technology continues to evolve, it is likely to play an increasingly important role in shaping the future of secure and transparent data sharing. For data sharing, lightweight encryption, data-level access control, and authentication are the basic security steps. In addition, anonymization should be applied at the communication level to reduce privacy risks [28].

2.3. Selective Disclosure and Verifiable Credentials

Selective disclosure (SD) and verifiable credentials (VC) are two pivotal concepts that introduce enhanced security, data ownership, and privacy within data-sharing applications. Selective disclosure empowers the data owner by allowing them to control which attributes of their data objects can be accessed by recipients. Verifiable credentials ensure the authenticity, integrity, and legitimacy of the data being shared.
Currently, various approaches are used for selective disclosure, such as hash-based and signature-based methods. Further, the data model also plays a pivotal role. These data models are as follows:
  • AnonCreds with Camenisch-Lysyanskaya (CL) signature [29]: This approach follows the AnonCreds data models and heavily relies on the CL signature scheme. The CL signature [30,31] utilizes the RSA algorithm [32], which takes time to generate a signature, and the key size is significantly large, leading to inefficiencies. To increase efficiency, AnonCreds employs the BBS+ signature, which is based on pairing-based elliptic-curve cryptography. This method is efficient as it uses shorter keys and signatures without compromising security [33].
  • ISO/IEC 18013-5:2021 [34]: This is an ISO standard used for Personal Identification, Mobile Driving License (mDL) and their applications. The mDL relies on hash-based techniques with its own data model. This approach is simple, efficient, and easy to implement.
  • SD-JWT [35]: This data model is defined by the IETF and also relies on hash-based techniques but follows the JSON Web Token (JWT) data format. An extended version of SD-JWT is also available in the verifiable credentials data model [36].
  • Selective Disclosure—Verifiable Credentials (SD-VC) [37]: This is a proposed standard by the World Wide Web Consortium (W3C) that is extensively used in the realm of digital identities. It operates on the Verifiable Credentials Data Model, providing a framework for expressing credentials in a way that is cryptographically secure, privacy-preserving, and machine-verifiable. The associated selective disclosure techniques, which can be either hash-based or digital-signature-based, allow specific data elements within a credential to be shared without disclosing the entire credential. By integrating these components, the SD-VC standard enhances privacy and security in digital identity management, enabling individuals to control the disclosure of their personal information efficiently.
Among others, X.509 and JSON-LD-based data models are noticeable ones [38,39].
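As a rough illustration of the hash-based approach used by mDL and SD-JWT, the sketch below replaces each claim with a digest of a salted (salt, name, value) triple; the holder reveals only the chosen triples, and the verifier checks them against the digest list. This is a hedged, simplified model with illustrative claim names: issuer signing of the digest list is omitted, and it is not a conformant SD-JWT implementation.

```python
import hashlib
import json
import secrets

def issue(claims: dict):
    """Issuer: build per-claim disclosures and a digest-only credential."""
    disclosures = {
        name: (secrets.token_hex(16), name, value)  # (salt, claim name, claim value)
        for name, value in claims.items()
    }
    # In SD-JWT the issuer signs this digest list; signing is omitted here.
    credential = sorted(
        hashlib.sha256(json.dumps(d).encode()).hexdigest()
        for d in disclosures.values()
    )
    return credential, disclosures

def verify(credential, disclosure) -> bool:
    """Verifier: check a revealed (salt, name, value) triple against the digest list."""
    return hashlib.sha256(json.dumps(disclosure).encode()).hexdigest() in credential

cred, disc = issue({"breed": "Hereford", "birth_farm": "SE-123", "weight_kg": 412})
assert verify(cred, disc["breed"])                             # revealed claim checks out
assert not verify(cred, (disc["breed"][0], "breed", "Angus"))  # altered value fails
# "birth_farm" and "weight_kg" stay hidden unless their disclosures are shared.
```

The salt prevents a verifier from brute-forcing undisclosed claim values from their digests, which is what makes the digest list safe to publish in full.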
Based on the above analysis, existing research highlights the significant potential of blockchain technology for enabling secure data sharing across various domains. However, a critical gap remains in the dedicated research focused on data-sharing platforms specifically designed for the livestock industry. To address this gap, there is a pressing need to investigate the design, implementation, and evaluation of blockchain-based data-sharing frameworks that cater to the unique requirements of the livestock sector, particularly in areas such as privacy preservation, data verifiability, ownership rights, and fine-grained access control within open and decentralized environments.

3. Use Case Description

In the livestock supply chain, various stakeholders engage in bidirectional information exchange (upstream to downstream and vice versa), as shown in Figure 1. Initially, the farm owner assigns an identity to each animal and records this in local storage. Concurrently, a farm employee registers the animal's identity on the blockchain. The employee also documents information related to the animal's feed and movements and submits health monitoring requests to a licensed veterinarian. All these records are stored in persistent storage. Upon receiving a health monitoring request, the veterinarian visits the farm to examine the specified animals. Following each examination, the veterinarian provides a report to the farm, which is deemed valid only if it bears the veterinarian's signature and stamp. The farm owner maintains records of medical examinations, feed, and animal movements in the local storage. When an animal is ready for transport to the slaughterhouse, the farm employee assesses its health and transfers its identity to the transporter. Additionally, the farm owner must provide evidence demonstrating the welfare and proper treatment of the animal during its time on the farm. The animal owner also submits the healthcare report to the transporter. To ensure customer satisfaction, data on animal movement, feed, observations, and health must be accessible. Given these functional data disclosure requirements, a subset of data may need to be shared with various stakeholders, including farmers, transporters, slaughterhouses, customers, and third-party applications. However, trust issues may arise between these actors, raising concerns about the reliability of the disclosed data. Therefore, data ownership must be enforced, ensuring that only necessary information is shared with stakeholders who require it for analytics, decision-making, and process optimization.
Figure 1 shows the data pipeline for this real-world use case. The beef supply chain from farm to fork involves many entities, making data sharing necessary. The scope of data sharing varies from one actor or entity to another; for instance, more information about animals can be added for legislative requirements. For example, when a farmer hands over an animal to the transporter, the supply chain data must provide an updated health record of the animal to confirm that the animal is healthy and fit for transportation and slaughtering.
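The hand-over flow above can be viewed as a hash-linked evidence chain: each custody event embeds the digest of the previous one, so tampering anywhere in the farm-to-transporter-to-slaughterhouse history is detectable by any later consumer. A minimal sketch follows, with illustrative event fields that are not taken from FLEX's actual data model.

```python
import hashlib
import json

def add_event(chain: list, event: dict) -> None:
    """Append a custody event linked to the digest of the previous one."""
    prev = chain[-1]["digest"] if chain else "genesis"
    entry = {"event": event, "prev": prev}
    entry["digest"] = hashlib.sha256(
        json.dumps({"event": event, "prev": prev}, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)

def chain_valid(chain: list) -> bool:
    """Recompute every link; any edit to an earlier event breaks the chain."""
    prev = "genesis"
    for entry in chain:
        expected = hashlib.sha256(
            json.dumps({"event": entry["event"], "prev": prev},
                       sort_keys=True).encode()).hexdigest()
        if entry["digest"] != expected or entry["prev"] != prev:
            return False
        prev = entry["digest"]
    return True

chain: list = []
add_event(chain, {"actor": "farm", "action": "register", "animal": "SE-0421"})
add_event(chain, {"actor": "vet", "action": "health_report", "animal": "SE-0421"})
add_event(chain, {"actor": "farm", "action": "transfer_to_transporter"})

assert chain_valid(chain)
chain[1]["event"]["action"] = "forged_report"   # tamper with the vet's report
assert not chain_valid(chain)
```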

Research Methodology

The methodology employed for the design, development, and evaluation of FLEX, a secure, privacy-compliant, and hybrid data-sharing architecture, adopts a Design Science Research (DSR) approach. This approach integrates system design, prototyping, and empirical evaluation to ensure the development of a robust and context-appropriate solution. The methodology is structured into four main phases. (i) Requirements Analysis: This phase involves identifying and analyzing the expectations of stakeholders across the livestock supply chain, along with examining current limitations in data-sharing practices, such as security vulnerabilities and trust deficits. The requirements analysis ensures that FLEX adheres to domain-specific constraints and incorporates human-centric values, particularly with regard to data ownership and ethical data use. (ii) Architectural Design: A hybrid architecture is designed, consisting of edge-level processing and a global data space. At the edge level, operational data are managed locally by data owners (e.g., farmers, transporters, slaughterhouses) within their own computing environments. Data intended for broader dissemination, such as for analytics or sharing with external stakeholders, are anonymized, verified, and shared through a decentralized infrastructure. Dynamic access control policies, guided by user consent, govern the sharing process to uphold privacy and accountability. (iii) Implementation: A functional prototype of FLEX is developed, incorporating edge nodes for local data collection and control, a middleware layer for data classification, encryption, and policy enforcement, and integration with the InterPlanetary File System (IPFS) and Distributed Ledger Technology (DLT) for secure data dissemination and analytics. Special emphasis is placed on implementing anonymization techniques that maintain data verifiability, as well as cryptographic mechanisms that ensure confidentiality, integrity, and non-repudiation.
(iv) Security Validation: The proof-of-concept system is evaluated using both quantitative and qualitative methods. Quantitatively, performance metrics such as latency and throughput are assessed under varying transaction loads and evidence volumes. Qualitatively, the STRIDE threat modeling framework is applied to identify and mitigate potential security threats, guiding the selection and deployment of appropriate security controls. Throughout the methodology, strong alignment is maintained with the core principles of the FLEX architecture: enhanced security and privacy, human-centric data ownership and control, decentralized and resilient system design, and support for cross-organizational collaboration within the agricultural supply chain.

4. FLEX: Framework for Livestock Empowerment and Decentralized Secure Data eXchange

FLEX is built upon three foundational layers, as shown in Figure 2, each serving a critical function in supporting the research objectives described in this paper. The underlying architecture leverages distributed ledger technology, enabling the development of a web-based digital wallet designed to facilitate data sovereignty and consent management. Furthermore, the business layer employs a microservices architecture to ensure modularity and scalability, enhancing the platform’s adaptability and performance.

4.1. Business Layer

The business layer comprises core business components essential to supporting domain-specific use cases within the supply chain data pipeline. These components vary based on the requirements of specific use cases and their roles. At a higher level of abstraction, the primary business entities include farms, transporters, slaughterhouses, and retailers. The functionality of each business entity is further decomposed into functional microservices, which represent discrete business processes. These microservices expose functionalities to user interaction modules, such as dashboards or front-end web applications.
Microservices also function as data collection points, serving as sources of information. In some instances, additional data sources include various sensors deployed across farms, slaughterhouses, transport vehicles, or even attached directly to animals. Examples of such sensors include temperature and weight sensors, which monitor critical metrics in real time.
Within our systems, all data generated locally are securely managed within the operational environment of the respective business entity. These data remain confidential either in a local data center or on cloud-based infrastructure, depending on the deployment model. For example, on farms, owners oversee resource management and administrative activities, while employees record operational data such as animal movement and feeding patterns. Additionally, critical information from external stakeholders, including veterinarians and observers, is integrated into the system.
All data flow through a secure and trusted persistence layer, which ensures data integrity and security. This approach safeguards sensitive information while enabling seamless data collection and management across various operational contexts.

4.2. Trusted Persistence Layer

The Trusted Persistence Layer (TPL) is a critical middleware component that connects business entities with the trusted infrastructure layer. Developed in alignment with Web3 principles, the TPL facilitates core transaction management and oversees the storage of various objects within the trusted infrastructure. Each TPL component is explicitly mapped to its corresponding business entity and deployed in the same environment, ensuring seamless integration and efficient operation. We adopted a middleware approach to maximize portability, allowing business entities to interface with a wide range of high-level trusted infrastructures. This approach ensures the continuity of business operations regardless of variations in the underlying trusted infrastructure. The current implementation includes three primary middleware components: the Trusted Application Programming Interface (TAPI), the Verifiable Data Registry (VDR), and the IPFS middleware.
  • TAPI—Trusted Application Programming Interface: TAPI functions as a business-layer wallet, managing user credentials and maintaining a direct connection with the trusted layer for executing transactions on the decentralized ledger. It oversees the user accounts and credentials necessary for interaction with the distributed ledger. User Account Management: any user can create an account, but only organizational owners with specific privileges can assign access rights for business-specific actions. Typical user roles include Employee, Observer, and Veterinarian. Smart Contract Interface: TAPI also interfaces with a smart contract responsible for managing traceable information. Pre-transaction rules, implemented within the smart contract, are executed before the identity of objects (in this use case, the identity of animals) is transacted.
  • VDR—Verifiable Data Registry: Upon user registration, the VDR generates asymmetric credentials using the RSA algorithm and registers the user's public key along with their blockchain address. Once a user is assigned a specific role by the owner, these public credentials are recognized as trusted credentials, allowing the user to perform various transactions and actions on the distributed ledger.
  • IPFS—Middleware for Off-Chain Data Management: The InterPlanetary File System (IPFS) middleware consists of a set of libraries acting as intermediaries between users and the IPFS open network for shared information storage. In the proposed architecture, the IPFS middleware supports off-chain data management, ensuring the efficient and secure handling of data not stored directly on the blockchain. The Trusted Persistence Layer, through its middleware components TAPI and VDR, provides a robust framework for managing user credentials and transactions within a trusted infrastructure. The integration of IPFS for off-chain data management further enhances the system's capability, ensuring that business entities can operate effectively across various trusted infrastructures.
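To make the content-addressing property concrete, the sketch below models the IPFS middleware with a minimal in-memory stand-in (all names hypothetical). A real deployment would call an IPFS client over the network, but the principle is the same: the reference is derived from the hash of the stored bytes, so tampering is detectable on retrieval.

```python
import hashlib

class IpfsMiddlewareStub:
    """In-memory stand-in for the IPFS middleware: content-addressed storage."""

    def __init__(self):
        self._store = {}

    def pin(self, data: bytes) -> str:
        # IPFS derives the object address from its content hash, so any
        # change to the stored bytes changes the reference itself.
        ref = hashlib.sha256(data).hexdigest()
        self._store[ref] = data
        return ref

    def fetch(self, ref: str) -> bytes:
        data = self._store[ref]
        # Integrity check: recompute the hash before returning the object.
        if hashlib.sha256(data).hexdigest() != ref:
            raise ValueError("object does not match its content address")
        return data

# Usage: pin an (already encrypted) evidence object and retrieve it by reference.
ipfs = IpfsMiddlewareStub()
ref = ipfs.pin(b"encrypted-evidence-object")
assert ipfs.fetch(ref) == b"encrypted-evidence-object"
```

This is why, in FLEX, storing only the IPFS reference on the ledger is sufficient for integrity: the reference commits to the exact bytes of the off-chain object.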

4.3. Trust Infrastructure Layer

The infrastructure consists of two primary oracles that serve as trust anchors, facilitating secure communication among various actors:
  • Distributed Ledger Network: The Hyperledger node operates within the standard Hyperledger network, where the Verifiable Data (VD) and Traceability smart contracts are deployed. The addresses of these contracts, along with the necessary information, are shared with nodes interested in joining the network. As this is a permissioned network, any entity that wishes to participate must first obtain approval. The network is exclusively used for managing the Verifiable Data Registry (VDR) and traceability-related transactional data.
  • IPFS Network: The IPFS node employs standard IPFS components to store data while ensuring its integrity. This component is interchangeable with other systems, as the data formatting standards used inherently ensure data integrity.

4.4. System Protocols and Flows

The following steps and flows show how FLEX is implemented and configured.
  • Initial bootstrap: The system begins with an initial bootstrap phase, during which the system owner establishes the decentralized network and configures the InterPlanetary File System (IPFS). Subsequently, the owner deploys the Verifiable Data Registry (VDR) and traceability smart contracts onto the distributed ledger nodes. Detailed information regarding these smart contracts is available in the project's repository on GitHub, main branch (https://github.com/aghafoor77/hltestnetwork/tree/main (accessed on 9 December 2022)).
The system owner is responsible for assigning administrative roles to the operators of each organization within the network. These administrators are authorized to perform the following designated operations:
  • Assigning roles to employees: Since identity and role form the basis of the system, the owner assigns roles to employees for performing designated functions. These roles vary from organization to organization, but for the completeness of our proof of concept, the admin assigns the roles shown in the architecture diagram within their organization.
  • Creating the identity of a newly born animal or any other object that needs traceability: The organization's admin creates the identity of each animal born at the farm in the DLT (through TAPI), or receives an existing identity when an animal is transferred from another farm. This identity comprises various attributes that logically define a physical animal.
  • Transaction registration: The organization's admin registers a transaction in the distributed ledger network when an animal is moved from one farm to another or to another location. The admin also uploads associated objects to the IPFS, which provide detailed information about the animal's treatment, feeding, and movement. Each transaction contains references to all the objects uploaded to the IPFS for later retrieval.
  • Data Exchange Protocol: The data exchange protocol is the main flow of the proposed solution and is divided into two main flows, used for exchanging data and security parameters: (i) the Data Sharing Flow and (ii) the Data Retrieval Flow. In the Data Sharing Flow, the originator of the data selects the type of information to share and who can access it. We defined various access levels, for example, the previous owner, the recipient, or any party in the identity's traceability chain.
Data Sharing Flow:
The data-sharing process within a Distributed Ledger Technology (DLT) system entails the addition of a transaction to facilitate the transfer of an identity object between locations while ensuring both security and traceability throughout the operation.
The pseudocode of the data sharing algorithm is presented in Algorithm 1. As shown in Figure 3, at the Edge, various forms of evidence are maintained, including animal health records, movement logs, feeding records, and objects shared by partner organizations. Some of these records are represented as Verifiable Credentials, such as health certificates issued by veterinarians, along with evidence provided by the previous owner of the identity.
Algorithm 1 Secure Data Sharing Flow
 1: User/Sensor calls registerData()
 2: Edge receives data
 3: Edge calls requestIdentityToShare()
 4: Owner executes selectIdentity()
 5: Edge calls requestEvidences()
 6: Owner executes selectEvidences()
 7: Owner applies applyAccessLevel()
 8: Owner shares access info
 9: Edge executes protectAndEncapsulateData()
10: Edge calls uploadToMiddleware()
11: Middleware calls pinDataToIPFS()
12: IPFS returns reference to Middleware
13: Middleware calls protectEncryptionKey()
14: Middleware executes encapsulate()
15: Middleware sends transaction to DLT
16: DLT sends notify() to Middleware
17: Middleware calls notifyEdge()
As depicted in Figure 3, the farm administrator accesses the required data from the Edge. These data include the animal’s identity and associated evidence. The administrator then provides consent and selects specific attributes of the object to be shared with partner organizations.
If an object is already in Verifiable Credential format, the system automatically generates a selective disclosure verifiable presentation (SD-VP). For objects not yet in this format, the system first generates them as Verifiable Credentials. This project employs a hash-based selective disclosure function, as referenced in [34]. Examples of SD-VP implementations can be accessed on GitHub (https://github.com/aghafoor77/selectivedisclouser (accessed on 9 December 2022)).
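As an illustration of how a hash-based selective disclosure function can work (a simplified sketch, not the exact construction of [34]; all names are hypothetical): the issuer commits to each attribute with a salted hash, and a presentation reveals only the chosen attributes together with their salts, so the verifier can recompute the commitments for exactly the disclosed subset.

```python
import hashlib
import secrets

def issue_sd_credential(attributes: dict):
    """Issuer side: commit to each attribute with a salted hash. The credential
    (signed by the issuer in the real system) carries only the commitments;
    the salts stay with the holder for later disclosure."""
    salts = {k: secrets.token_hex(16) for k in attributes}
    commitments = {
        k: hashlib.sha256(f"{salts[k]}|{v}".encode()).hexdigest()
        for k, v in attributes.items()
    }
    return commitments, salts

def present(attributes: dict, salts: dict, disclose: list) -> dict:
    """Holder side: reveal only the selected attributes plus their salts."""
    return {k: (attributes[k], salts[k]) for k in disclose}

def verify(commitments: dict, disclosed: dict) -> bool:
    """Verifier side: recompute each salted hash against the commitments."""
    return all(
        hashlib.sha256(f"{salt}|{value}".encode()).hexdigest() == commitments[k]
        for k, (value, salt) in disclosed.items()
    )

animal = {"animal_id": "SE-0042", "breed": "Yorkshire", "vet_notes": "healthy"}
commitments, salts = issue_sd_credential(animal)
presentation = present(animal, salts, disclose=["animal_id", "breed"])  # vet_notes withheld
assert verify(commitments, presentation)
```

The undisclosed attribute (`vet_notes`) never leaves the holder, yet the verifier checks the disclosed subset against the same issuer-signed commitments.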
During the access control phase, the administrator selects the authorized recipients who are permitted to access the information. This can include the recipient alone, the recipient and the previous owner of the identity, the recipient and all previous owners of the identity, or all participants within the identity's traceability chain, including relevant government institutions. After applying access control, the system automatically protects each object using a randomly generated symmetric key K_s (shown in (1)) and the AES cryptographic algorithm [40], as shown in (2). The edge then uploads the encrypted object to the IPFS through the middleware, which returns a hash of the object to the edge (3).
K_s = genrand(seed)        (1)
E_o = AES(K_s, obj)        (2)
H_o = upload(IPFS, E_o)        (3)
where
K_s: randomly generated symmetric key
E_o: encrypted object
H_o: hash of the encrypted object
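Steps (1)-(3) can be sketched as follows. This is an illustrative Python sketch using the `cryptography` package (assumed to be available); AES-GCM is assumed as the AES mode, and a SHA-256 digest stands in for the IPFS content reference.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect_object(obj: bytes):
    # (1) K_s = genrand(seed): a fresh random 256-bit symmetric key
    k_s = AESGCM.generate_key(bit_length=256)
    # (2) E_o = AES(K_s, obj): AES-GCM assumed here, which also
    # authenticates the ciphertext
    nonce = os.urandom(12)
    e_o = nonce + AESGCM(k_s).encrypt(nonce, obj, None)
    # (3) H_o = upload(IPFS, E_o): IPFS returns a content hash as the object
    # reference; a SHA-256 digest stands in for it in this sketch
    h_o = hashlib.sha256(e_o).hexdigest()
    return k_s, e_o, h_o

obj = b'{"animal_id": "A-001", "event": "health-record"}'
k_s, e_o, h_o = protect_object(obj)
# The edge keeps K_s for the key-wrapping step (4) and registers H_o on the DLT.
assert AESGCM(k_s).decrypt(e_o[:12], e_o[12:], None) == obj
```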
In the next phase, based on the selected recipients, the owner fetches the public key of each recipient from the VDR and encrypts K_s with the recipient's public key PK_r, as shown in (4) [32]. It then concatenates the hashes of each object H_o with E_Ks, as shown in (5), and sends the result to the middleware (TAPI), as shown in (6), which registers the transaction in the traceability ledger.
E_Ks = RSA([PK_r], K_s)        (4)
P_m = concat(E_Ks, [H_o])        (5)
P_m → middleware        (6)
where
PK_r: public key of the recipient
E_Ks: encrypted symmetric key
P_m: protected message
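A minimal sketch of steps (4)-(6), assuming RSA-OAEP as the wrapping padding and illustrative key material; in FLEX the recipient's public key would be fetched from the VDR rather than generated locally.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Stand-in for a recipient key pair whose public half is registered in the VDR.
recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pk_r = recipient.public_key()

k_s = b"\x01" * 32                                    # symmetric key from Eq. (1)
h_o = ["hash-of-evidence-1", "hash-of-evidence-2"]    # IPFS references from Eq. (3)

# (4) E_Ks = RSA([PK_r], K_s): wrap the symmetric key for the recipient
e_ks = pk_r.encrypt(k_s, OAEP)

# (5)-(6) P_m = concat(E_Ks, [H_o]), sent to the TAPI middleware for registration
p_m = {"wrapped_key": e_ks.hex(), "object_hashes": h_o}

# Sanity check: only the recipient's private key can recover K_s.
assert recipient.decrypt(bytes.fromhex(p_m["wrapped_key"]), OAEP) == k_s
```

For multiple recipients, step (4) is repeated with each recipient's public key, so the same K_s is wrapped once per authorized party.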
Data Retrieval Flow: Upon the registration of a new animal identity along with the corresponding evidentiary data on the Distributed Ledger Technology (DLT), the recipient receives a notification and follows the flow demonstrated in Figure 4. The transaction data are then retrieved by the recipient, allowing the extraction of all relevant hashes, as described in Equation (5). These hashes serve as references to the objects stored in the IPFS. Utilizing these references, the recipient downloads the protected object. To decrypt this object, the recipient recovers the symmetric key from the encrypted key, as specified in Equation (7). Using the symmetric key K_s, the recipient is able to decrypt the encrypted object E_o. Furthermore, to verify the authenticity of the data, the recipient can validate the proof-of-origin of the object by downloading the owner's public key from the Verifiable Data Registry (VDR). This ensures that the integrity and authenticity of the object are maintained throughout the process.
K_s = RSA(PR_r, E_Ks)        (7)
where
PR_r: private key of the recipient
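The retrieval side, Equation (7) followed by the AES decryption of E_o, can be sketched end to end. This is an illustrative Python sketch (OAEP padding and AES-GCM assumed, as in the sharing-side sketch), with the sharing side condensed so the example is self-contained.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# --- Sharing side (condensed): protect an object and wrap K_s for the recipient.
recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
k_s = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
obj = b'{"animal_id": "A-001", "health": "certified"}'
e_o = nonce + AESGCM(k_s).encrypt(nonce, obj, None)   # stored on IPFS
e_ks = recipient.public_key().encrypt(k_s, OAEP)      # carried in the DLT transaction

# --- Retrieval side: Eq. (7) K_s = RSA(PR_r, E_Ks), then decrypt E_o.
k_s_recovered = recipient.decrypt(e_ks, OAEP)
plaintext = AESGCM(k_s_recovered).decrypt(e_o[:12], e_o[12:], None)
assert plaintext == obj
```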
Figure 4. Flow of information for consuming data shared by the stakeholder of the livestock data pipeline.
The traceability of the animal is maintained using a smart contract framework implemented within a Distributed Ledger Technology (DLT) system. This DLT-based system is accessible via the TAPI middleware described above. Users can access and view the traceability information of the animal, which includes minimal but essential data (as permitted by the data owner) on various aspects of the animal's lifecycle. This includes details regarding the farm where the animal was raised, its health treatments, feeding practices, movements, and other relevant records. The traceable data provide a comprehensive overview of the animal's welfare and management throughout its life. In addition, the combination of various standards and technologies used in FLEX can be considered a foundational building block of the digital product passport, as it ensures traceability and verifiability.

5. Analytics—Value Added Services

FLEX provides the necessary infrastructure and trust platform to enable analytic functions to operate both within and across stakeholder boundaries. This facilitates comprehensive data collection and the execution of analytic functions. It is important to note that FLEX serves as an initial proof of concept for a working system designed to support analytic operations, rather than an exhaustive solution for analytics. Accurate data collection and analysis are fundamental to effective livestock management. Parameters such as animal growth rate provide essential information that supports decisions regarding feed usage and the nutrient profile of an animal's diet. This process involves synthesizing baseline knowledge of normal growth patterns and feed consumption. Regular monitoring and data collection enable farmers to track growth rates, identify health issues, and make informed management decisions. The accurate tracking of an animal's growth is critical for avoiding data inaccuracies and reducing workload.
Integrating data systematically allows for the development of predictive analytics for more informed decisions about specific breeds or operational areas. Analytics are crucial for livestock record-keeping, enabling farmers to track individual animal performance, assess feed efficiency, and optimize breeding programs. Consistent and accurate data collection, whether through traditional scales or alternative methods like body measurements, is pivotal for herd management.
Data collection traditionally has predominantly relied on individual stakeholders. However, the emergence of AI introduces the potential for a more comprehensive trend analysis through the aggregation of data across multiple stakeholders. Advancing baseline knowledge is more effectively achieved by integrating information from multiple stakeholders rather than relying on a single source. Nevertheless, this approach comes with its own set of challenges, particularly in terms of trust. Stakeholders may be reluctant to share their data due to trust-related concerns of cross-stakeholder data sharing in a system.
Maintaining well-documented records significantly contributes to overall farm productivity and profitability. Farmers are encouraged to utilize resources from reputable agricultural institutions to learn best practices for estimating animal weight and implementing effective record-keeping systems.

5.1. Key Performance Indicators and Calculations—Optimal Growth Parameters

A fundamental question arises: "How do we provide key insights for optimal growth in livestock (supply chain monitoring, payment processing, digital identity)?"
Below is a set of parameters for optimal growth in livestock, calculated either from information shared by the farm owner or internally on the edge:
  • Total Weight Gain (TWG):
The weight gain of an animal over a specified period. This can be distinguished as Current Total Weight Gain (TWG-C) and Expected Total Weight Gain (TWG-E), indicating the actual and optimal weight gain, respectively. An optimal TWG helps determine the appropriate time for slaughter.
  • Number of Days Elapsed (NDL):
The number of days it takes for the animal to gain the indicated weight.
  • Average Daily Gain (ADG):
The average amount of weight gained per day, calculated as TWG-C divided by NDL.
ADG = Current total weight gain / Number of days between start weight and current weight
  • Days-to-Market (DtM):
The number of days required to reach market weight is calculated by dividing the total gain required to reach market weight by the ADG. This can be expressed as Expected Days-to-Market (DtM-E) and Current Days-to-Market (DtM-C).
  • Feed Consumption Ratio (FCR):
The amount of feed needed for an animal to gain 1 lb. (0.45 kg) of weight. It is calculated as the total feed consumed divided by TWG, and is distinguished as Current FCR (FCR-C) and Expected FCR (FCR-E).
FCR = Total feed consumed / Current total weight gain
  • Total Estimated Feed (TEF):
The total feed required to reach market weight is calculated as the total gain required multiplied by the FCR.
TEF = Total gain required (to reach market weight) × FCR
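The KPI definitions above reduce to a few arithmetic helpers. The sketch below (hypothetical function names, not part of the FLEX codebase) encodes them exactly as defined:

```python
def adg(twg_c: float, ndl: float) -> float:
    """Average Daily Gain: current total weight gain divided by days elapsed."""
    return twg_c / ndl

def days_to_market(remaining_gain: float, daily_gain: float) -> float:
    """Days-to-Market: gain still required to reach market weight, at the ADG."""
    return remaining_gain / daily_gain

def fcr(total_feed: float, weight_gain: float) -> float:
    """Feed Consumption Ratio: feed consumed per unit of weight gained."""
    return total_feed / weight_gain

def tef(remaining_gain: float, feed_ratio: float) -> float:
    """Total Estimated Feed: gain required to reach market weight times FCR."""
    return remaining_gain * feed_ratio

# Toy values: 20 lbs gained in 14 days, 140 lbs still needed for market weight.
print(round(adg(20, 14), 2))                     # 1.43 lbs/day
print(round(days_to_market(140, adg(20, 14))))   # 98 days
```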

5.2. Growth Rate Tracking and Feed Management for Livestock—Toy Examples

5.2.1. Example of Growth Tracking

Record the animal’s weight, feeding details, and relevant notes systematically. For instance (Table 1):
Using these records shown in Table 1, calculations can be made as follows:
  • Current Total Weight Gain (TWG-C): 120 lbs. − 100 lbs. = 20 lbs. (9.07 kg) gained.
  • Number of Days Elapsed (NDL): 15 − 1 = 14 days.
  • Average Daily Gain (ADG): 20 lbs. (9.07 kg)/14 days = 1.43 lbs. (0.65 kg) per day.
If the market weight target is 260 lbs. (117.93 kg), the pig needs to gain an additional 140 lbs. (63.50 kg). If the pig continues to gain 1.43 lbs. (0.65 kg) per day, it will take approximately 98 days to reach the market weight. This calculation gives us the Current Days-to-Market (DtM-C).
If today is 15 May, the pig is expected to reach market weight by 21 August. If the planned slaughter date is 21 September, adjustments are needed to slow the growth rate.

5.2.2. Example of Adjusting Feed for Growth Rate Control

Records show the pig consumed 50 lbs. (22.68 kg) of feed in 12 days, leading to a daily consumption rate of 4.17 lbs. (1.89 kg). Calculating the Feed Consumption Ratio (FCR) provides:
FCR-C = 4.17 lbs. (1.89 kg) feed / 1.43 lbs. (0.65 kg) gain = 2.9 lbs. (1.31 kg) feed per lb. (0.45 kg) of gain
To estimate the total feed required to reach the market weight:
TEF = 140 lbs. (63.50 kg) gain × 2.9 lbs. (1.31 kg) feed = 406 lbs. (184.16 kg) of feed
To align with the 21 September slaughter date, reduce the pig’s daily feed intake. For example, reducing the total feed consumption to 3.17 lbs. (1.44 kg) per day over 128 days can be achieved by incorporating high-fiber, low-energy feedstuffs to slow the growth rate.
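The arithmetic of both toy examples can be checked in a few lines (values rounded as in the text; the 128-day figure is taken from the text as-is):

```python
# Growth-tracking example (weights in lbs, rounded as in the text)
adg = round((120 - 100) / (15 - 1), 2)   # TWG-C / NDL = 1.43 lbs/day
remaining = 260 - 120                    # gain still required: 140 lbs
dtm = round(remaining / adg)             # ~98 days at the current rate

# Feed-adjustment example
fcr_c = round(4.17 / adg, 1)             # 2.9 lbs of feed per lb of gain
tef = remaining * fcr_c                  # ~406 lbs of feed to market weight
daily_feed = round(tef / 128, 2)         # 3.17 lbs/day for the later date

print(adg, dtm, fcr_c, round(tef), daily_feed)   # 1.43 98 2.9 406 3.17
```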

6. System Evaluation and Discussion

In this section, we validate our claims using a widely adopted threat assessment methodology and evaluate transactional performance to assess the framework's applicability to supply chains broadly and to the livestock data pipeline in particular.

6.1. Extended STRIDE Threat Modeling Approach

We systematically applied the extended STRIDE threat modeling framework to evaluate the implemented security controls against identified risks and vulnerabilities. This approach enabled a comprehensive assessment of the proposed solution’s security posture.
Authenticity: Unauthorized entities attempting to impersonate legitimate components of the supply chain pose significant security risks. To address this, our solution implements robust authentication mechanisms. From the user's portal to the edge level, initial authentication is enforced through username and password credentials, while subsequent authentication and authorization to various microservices are handled via Single Sign-On (SSO). The authentication token, digitally signed using the RSA algorithm, provides robust protection against brute-force attacks and encapsulates role-based authorization information, comparable to the OAuth protocol. For communications between the Edge and the data-sharing platform, a public-private key-based authentication scheme is implemented. All users are registered in the Verifiable Data Registry (VDR) with their public cryptographic credentials. Together, these security measures provide comprehensive protection against spoofing and unauthorized access.
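As an illustration of the RSA-signed token described above (a simplified, JWT-like sketch rather than the deployed SSO implementation; the key handling and claim names are hypothetical):

```python
import base64
import json

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Stand-in for the SSO service's signing key pair.
sso_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def issue_token(subject: str, role: str) -> bytes:
    """Sign a role-carrying token (PKCS#1 v1.5 with SHA-256, as in RS256)."""
    payload = json.dumps({"sub": subject, "role": role}).encode()
    sig = sso_key.sign(payload, padding.PKCS1v15(), hashes.SHA256())
    return base64.b64encode(payload) + b"." + base64.b64encode(sig)

def verify_token(token: bytes, public_key) -> dict:
    payload_b64, sig_b64 = token.split(b".")
    payload = base64.b64decode(payload_b64)
    # Raises InvalidSignature if the token was forged or tampered with.
    public_key.verify(base64.b64decode(sig_b64), payload,
                      padding.PKCS1v15(), hashes.SHA256())
    return json.loads(payload)

token = issue_token("farm-admin@example", "Admin")
claims = verify_token(token, sso_key.public_key())
assert claims["role"] == "Admin"
```

Because the verifier only needs the SSO public key, any microservice can check a token's signature and embedded role without contacting the issuer.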
Integrity: From the user's portal to the edge, communication is secured using HTTPS. The integration of the IPFS and Distributed Ledger Technology (DLT) inherently ensures data integrity. By default, IPFS generates a unique hash for each stored file, providing a robust mechanism to detect and prevent tampering. Furthermore, the TAPI digitally signs transactions before submitting them to the DLT, thereby ensuring integrity at both the data and transactional levels in our system architecture. In addition, each object, when converted to a Selective-Disclosure-based Verifiable Credential (SD-VC), is digitally signed, ensuring data integrity at both the object and attribute levels.
Non-repudiability: Transactions received at the edge level are systematically logged in files that maintain a comprehensive record of all requests. These logs serve dual purposes: anomaly detection and ensuring non-repudiation in cases of disputes. Furthermore, the data shared via the IPFS are structured as Verifiable Credentials (VCs), which are digitally signed by their respective owners. Consequently, the solution ensures non-repudiation at two critical levels: from the user’s portal to the edge level and from the edge level to the data-sharing platform.
Confidentiality: In the proposed solution, communication between the user’s portal and the edge is securely established using HTTPS. Data objects stored on IPFS are protected through symmetric key cryptography, i.e., AES is used for encryption. The symmetric encryption key is shared via a Distributed Ledger Technology (DLT) system as part of a transaction, where it is encrypted using the recipient’s public key. Consequently, confidentiality as a security control is maintained throughout the entire data pipeline.
Access Control: In this solution, access control is implemented through two complementary mechanisms: (i) Role-Based Access Control (RBAC) and (ii) selective-disclosure-based access control. The RBAC model is implemented at the edge, where users operate within their designated domain according to predefined roles. Within each domain, the Admin holds the authority to finalize and submit transactions to the data-sharing platform. The selective disclosure mechanism enables fine-grained access control, allowing data owners to specify which attributes or subsets of data they wish to share with authorized parties.
Data Ownership: We implemented SD-VCs, which empower data owners to control their own data and enable them to share only those attributes that are meant to be shared. This not only ensures the privacy of the data but also protects confidential information from being disclosed.
Availability and Resilience: The data-sharing platform employs IPFS and DLT, ensuring the continuous availability of data shared among various stakeholders. This infrastructure is inherently resilient to the failure of any individual node, maintaining the integrity and accessibility of the data across the network.

6.2. Performance Evaluation

The performance of the proposed proof of concept was evaluated in two primary phases on an Ubuntu-based virtual machine, equipped with a 12th Gen Intel® Core™ i7-1260P CPU, 8 GB of RAM, and a 64 GB hard disk.
In this study, we investigated the application of SD-VC to secure animal identity objects along with their associated evidence. Each credential cryptographically safeguards the evidence and enables selective disclosure of specific attributes. As illustrated in Figure 5, an individual animal may be linked with multiple types of evidence, including health records, movement data, feed information, and drug administration logs. The evaluation results reveal that during the initial system loading phase, the processing time for handling evidence and generating SD-VCs is significantly higher when only a single piece of evidence is attached.
However, as the amount of evidence increases, the system stabilizes, and the processing time grows proportionally with the amount of evidence. Some spikes appear in the graph; these are due to the configuration striking a balance between resource utilization and performance, ensuring efficient processing and secure storage of verifiable credentials.
The proposed solution focuses on efficient data sharing among various stakeholders. The results for uploading the generated SD-VCs to the IPFS are presented in Figure 6. These results highlight the performance of the system during credential generation and the subsequent upload to the IPFS, and also compare it with the VC creation process. Notably, the process of creating and uploading SD-VCs is optimized when an identity contains 15 pieces of evidence. This indicates that the system's efficiency is influenced by the number of associated pieces of evidence, suggesting a balance between the volume of identity data and performance optimization. The findings demonstrate that the system can handle the creation and storage of credentials effectively, ensuring scalability and reliability for secure data sharing.
The performance analysis of animal data sharing in relation to the number of attached pieces of evidence indicates a linear system response, as depicted in Figure 7. In this graph, the X-axis represents the number of pieces of evidence attached to an animal when it is being transferred to a slaughterhouse. These pieces of evidence are randomly selected and may include animal data, health reports, drug records, feed data, and/or movement records that accompany the animal for traceability or regulatory compliance. The Y-axis indicates time in milliseconds (ms), reflecting the processing time involved in transferring the animal data along with its evidence. As mentioned above, the graph shows a generally increasing trend in processing time as the number of pieces of evidence increases. At lower evidence counts (3-7), the time remains relatively low (11,000-16,000 ms), while a notable rise occurs between 12 and 13 pieces of evidence, suggesting a possible threshold where processing overhead increases, possibly due to system architecture limitations or increased validation complexity. From 13 pieces of evidence onward, there is a steady increase in processing time, likely due to data transmission and storage demands. The system performs efficiently with up to approximately 10 pieces of evidence; beyond that point, particularly after 12, performance degrades more noticeably. A comparative analysis of the proposed approach with prior published methodologies reveals that the earlier solutions exhibit superior performance [41]. Specifically, uploading five evidentiary items in the prior implementation required approximately 2000 milliseconds, whereas the current approach incurs a latency of around 11,000 milliseconds. This significant increase in processing time is primarily attributed to the adoption of standard digital signature schemes and advanced data models in the present work.
Furthermore, the current approach incorporates a selective disclosure mechanism, a feature that was not implemented in the earlier solutions, which inherently introduces additional computational overhead.

6.3. Demonstration

The system is implemented using a microservices architecture, with MySQL used to store data at the edge level. We also deployed a Hyperledger decentralized network together with decentralized IPFS nodes. All servers are accessible through web interfaces to authenticated and authorized users. The status of the registered animals is shown in Figure 8. This view is only available to farm owners, and the data are fetched exclusively from the edge. A complete demo of the system is available on GitHub (https://github.com/aghafoor77/flex/blob/main/FLEXSamll.zip (accessed on 18 February 2025)), and the source code can be found at https://github.com/aghafoor77/flex/tree/main (accessed on 19 May 2025).

7. Architectural Limitations

The system is designed based on a microservices architecture, which is the core architectural decision aimed at achieving scalability and expandability. However, it has certain limitations:
  • Deployment Complexity: As the system is based on a hybrid architecture, it requires local infrastructure-level computing and storage components. Therefore, farmers with limited resources may not be able to afford it. To overcome this issue, existing decentralized infrastructure may be utilized where available.
  • Technical Complexity: Operating and maintaining a hybrid system may require technical expertise that many farmers lack. It also increases operational costs, which farmers must bear. A government-level initiative could help mitigate such challenges.
  • IT Awareness: The system requires a certain level of IT literacy, which poses a challenge since many end users—such as farmers and supply chain workers—may lack the necessary technical skills. This leads to a dependency on continuous training and support, potentially increasing operational costs and slowing down adoption.
  • Integration Challenges: If the system needs to interoperate with legacy systems or third-party tools (e.g., already developed systems used in this sector), this integration effort might be non-trivial.
  • Connectivity Dependency: The system’s reliance on stable internet connectivity during critical processes, such as animal transportation to the slaughterhouse, can be a bottleneck in areas with poor network infrastructure. This may result in delays in data retrieval or transaction recording, affecting traceability and real-time tracking.

8. Conclusions

Selective disclosure plays a critical role in data sharing and consent management by ensuring that information is shared exclusively with authorized parties while maintaining compliance with data protection regulations. When combined with Verifiable Credentials, it enhances both data verification and stakeholder trust. In the livestock industry, accurate data collection and analysis are essential for optimizing feed efficiency, monitoring animal health, and enabling predictive analytics. However, the rise of AI-driven data aggregation introduces significant trust-related challenges that must be addressed to ensure secure and ethical data use. This paper presented FLEX, a secure data-sharing framework specifically designed for the livestock supply chain. FLEX addresses key security gaps in existing solutions by focusing on four critical dimensions: (i) data consistency, (ii) ownership, (iii) controlled exposure, and (iv) decentralization. FLEX ensures data consistency across the supply chain, empowers data owners with fine-grained control mechanisms, enforces data minimization principles, and supports cross-stakeholder predictive analytics, all while preserving trust and security. FLEX employs a hybrid architecture model, where operational data from farms, transporters, and slaughterhouses is managed locally at the edge, while global data spaces enable broader analytics. Cryptographic controls are employed to guarantee data integrity and verifiability, and access control mechanisms are dynamically enforced based on the data owner’s consent. FLEX is grounded in core principles, including the following:
  • enhanced security and privacy,
  • a human-centric approach,
  • decentralized and trusted infrastructure,
  • system resilience, and
  • seamless collaboration across the supply chain.
By addressing trust and security challenges, FLEX delivers a transformative approach to secure data sharing. It incorporates robust security controls aligned with the STRIDE threat model, ensuring protection against identified threats. Performance evaluations confirm the effectiveness and feasibility of the implemented framework. FLEX has been successfully demonstrated through its implementation and integration with Hyperledger blockchain tools and IPFS for decentralized storage. In the future, FLEX can be extended with additional features to ensure compliance with Digital Product Passport (DPP) regulations, as its current architecture components are already aligned with emerging DPP requirements.

Author Contributions

Conceptualization, A.G., A.R. and I.S.; background and methodology, A.G., I.S. and A.Q.A.; software, A.G. and A.Q.A.; validation, I.S. and A.G.; investigation, A.R. and C.L.; resources, A.R.; writing—original draft preparation, A.G., I.S. and A.Q.A.; writing—review and editing, A.R. and C.L.; project administration, A.R.; funding acquisition, A.R. All authors have read and agreed to the published version of the manuscript.

Funding

We acknowledge that this project (grant number: 2019-02277) was funded by Formas—a Swedish Research Council for Sustainable Development, within the National Research Programme for Food. https://formas.se/en/start-page/about-formas/what-we-do/national-research-programmes.html (accessed on 24 June 2024).

Data Availability Statement

All data, source code, Docker files, and smart contract files are available in the GitHub repository https://github.com/aghafoor77/flex (accessed on 19 May 2025). There are no restrictions on the data in the repository.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
STRIDE: Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privileges
FLEX: Framework for Livestock Empowerment and Decentralized Secure Data eXchange
ZTA: Zero Trust Architecture
DLT: Distributed Ledger Technology
GDPR: General Data Protection Regulation
AI: Artificial Intelligence

Figure 1. Data pipeline for the defined supply chain use case.
Figure 2. Layered architecture of FLEX: Framework for Livestock Empowerment and Decentralized Secure Data eXchange.
Figure 3. Flow of information for sharing data between various stakeholders of the livestock data pipeline.
Figure 5. Performance analysis results for generating SD-VCs for animal identity and evidence.
Figure 6. Performance analysis and comparison of SD-VC creation and sharing through IPFS.
Figure 7. Performance evaluation of the complete cycle of animal data sharing with supporting evidence.
Figure 8. FLEX user interface showing the current status of registered animals.
Table 1. Example of growth tracking records.

Date        | Weight              | Notes
1 May 2020  | 100 lbs. (45.36 kg) | Pig is eating well; feeder was empty; added a bag of feed (50 lbs. or 22.68 kg)
15 May 2020 | 120 lbs. (54.43 kg) | Pig is eating well; feeder was empty on 12 May; added a new bag of feed (50 lbs. or 22.68 kg)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

