Abstract
Currently, big data is considered one of the most significant areas of research and development. Advances in technology, together with the involvement of intelligent and automated devices in every field of development, lead to the large-scale generation, analysis, and recording of information in the network. Although a number of schemes have been proposed for accurate decision-making when analyzing these records, the existing methods suffer from massive delays and difficulty in managing the stored information. Furthermore, excessive delays in information processing pose a critical challenge to accurate decision-making in the context of big data. The aim of this paper is to propose an effective approach for accurate decision-making and analysis of the vast volumes of data generated by intelligent devices in the healthcare sector. The processed and managed records can be stored and accessed in a systematic and efficient manner. The proposed mechanism combines boosting ensemble learning with a blockchain for fast and continuous recording and surveillance of information. The recorded information is analyzed using several existing methods focusing on various measurement outcomes. The results of the proposed technique are compared with existing techniques through a series of experiments that demonstrate its efficiency and accuracy.
1. Introduction
Currently, advancements in technologies such as automated systems, intelligent devices, and smart technologies lead to the generation, processing, and analysis of vast volumes of records. The analysis of large-scale datasets has emerged as a contemporary area of research for scientists and engineers [1]. The enormous volume of records generated by modern information systems such as IoT devices, cloud computing, and online storage mechanisms requires schemes and mechanisms to extract useful information while maintaining and structuring all of it [2]. This phenomenon has given rise to data scientists, whose task is to access the generated records from multiple or heterogeneous sources and make them usable in order to obtain the desired output. Data scientists, engineers, and researchers can use Machine Learning (ML) and Artificial Intelligence (AI) approaches, as well as other structured models, to identify and detect information patterns and representations for quick access and analysis of records in a more reliable and efficient manner [3,4]. Several data science methods and techniques exist to process the generated records into a structured format that can then be cleaned and accessed with data science tools. Once the information is in a proper and structured format, data scientists may apply other mechanisms to further process it [5,6]. Figure 1 illustrates the typical architecture, where intelligent devices generate vast amounts of data from various hospitals through access points, ultimately storing the entire dataset on a private cloud or other storage on the network.
Figure 1.
Typical architecture of generation and storage of vast volume of data by smart/intelligent devices.
1.1. Motivation and Objective
A number of schemes have been proposed for accessing and analyzing heterogeneous records from several sources in various kinds of applications [7]. In healthcare applications, where the information is highly sensitive and cannot be accessed by arbitrary sources, the complete information must be processed and analyzed properly. Furthermore, access by authorities to the vast medical records of patients presents a critical challenge in managing the heterogeneous and diverse types of patient data [8,9], and such access leads to excessive delays and management issues in the network. Therefore, blockchain-based mechanisms are required when accessing or processing medical records to ensure stronger security and surveillance. Moreover, in order to provide accurate decisions and analysis of records, doctors need to verify the results generated by automated devices [10,11,12]. The automated analysis of several types of information, such as medical imaging, text records, and data samples, raises critical issues in terms of security as well as accessing and managing such a huge amount of data [13,14,15]. Hence, an efficient learning mechanism is required to provide accurate decisions. The aim of this paper is to propose accurate and efficient decision-making by accessing and processing healthcare records while managing multiple types of healthcare information from several hospitals in the network [16]. Figure 2 presents a typical hospital management system specifying data processing, data analysis, security, and the application of medical records along with their specific applications and schemes.
Figure 2.
Hospital management system.
1.2. Contribution
The aim of this manuscript is to propose efficient and accurate decision-making over the samples generated by heterogeneous intelligent devices. The proposed mechanism integrates data sampling, boosting ensemble learning, and a blockchain mechanism for generating and recording large healthcare datasets. The mechanism is further validated and verified against several existing schemes in order to evaluate its efficiency in terms of accuracy and delay. The major contributions of this paper are as follows:
- A learning method is integrated with a secure mechanism in order to detect the malicious behavior of communicating devices and improve the accuracy of the proposed mechanism.
- A boosting ensemble learning mechanism validates the data samples recorded and generated by intelligent devices in order to provide accurate decision-making.
- A blockchain mechanism identifies malicious activities and provides continuous surveillance of the heterogeneous information recorded by intelligent devices as it is processed and communicated in the network.
The remainder of this paper is organized as follows. Section 2 reviews the literature on traditional data sampling and analysis schemes. Section 3 details the proposed mechanism integrating data sampling, ensemble learning, and a blockchain mechanism. Section 4 evaluates the performance of the proposed approach in comparison with existing techniques. Finally, Section 5 concludes the paper and outlines future directions.
2. Related Work
This section presents existing schemes and methods proposed for managing and recording huge volumes of hospital data. Lu et al. [17] proposed the concept of a big knowledge system encompassing data cleaning, massive concepts, consistency, capabilities, and cumulativeness. The authors surveyed research efforts on big knowledge systems, illustrated the definition through engineering projects, and discussed open research perspectives. Salloum et al. [18] proposed a distributed data-parallel mechanism that analyzes random sample partitioning data blocks; they developed a prototype on frameworks such as the Hadoop distributed file system for estimating and computing over the entire dataset. Shang et al. [19] proposed an identity-based dynamic data auditing mechanism supporting dynamic operations. They achieved efficient operation using Merkle hash tree data structures for block authentication and information updating with integrity assurances, and they analyzed the security and measured the performance of the proposed mechanism.
Yu et al. [20] proposed a cluster-based data analysis framework using recursive PCA that aggregates redundant information. The authors defined an abnormal squared prediction error by adapting to and updating changes in IoT systems, and their cluster-based framework reduces the computational burden on sensor nodes. Practical database simulations were used to verify the proposed scheme and validate its accuracy. Sanyal et al. [21] proposed a data aggregation scheme for improving the quality of information in device-to-device communication. The authors achieved this by identifying uncensored information and eliminating uncertainties while preserving the global attributes of the information. The performance was evaluated on real-world sensor data by injecting outliers, noise, and missing values, and the authors further measured information delivery while accommodating a huge number of IoT devices. Lin et al. [22] reviewed recent research efforts in a comprehensive overview of medical big data, focusing on data processing and data visualization issues, and combined big data technologies and analysis details targeting chronic diseases and health monitoring. Emara et al. [23] proposed data distribution strategies for various scenarios across multiple data centers. Using a random partition data model to analyze the information, they avoided data transfers between centers and demonstrated the performance of two strategies by simulating results on global and local data centers. Ding et al. [24] provided a systematic summary of data cleaning and error detection methods for identifying single-point, multi-point, and qualitative errors.
Further, the authors discussed the error data repairing methods by mentioning the strengths and limitations of IoT data applications. Table 1 illustrates the related work done in this area.
Table 1.
Related work discussion.
Research Gap
Although a number of schemes addressing data sampling, security, and information processing have been proposed by researchers using various machine learning techniques, the existing techniques introduce extra overhead and delay while ensuring accuracy and security. The security mechanisms increase the complexity, cost, and communication burden, which may further degrade the overall performance of the network. Researchers have also proposed a number of secure and efficient techniques for analyzing the behavior of IoT devices with reduced processing delay during network communication [25]. The aim of this paper is to propose an efficient and secure communication scheme that integrates ensemble learning and a blockchain mechanism for analyzing and processing large amounts of data, ensuring reduced delays and accurate decision-making among intelligent devices.
3. Proposed Approach
A typical architecture for supporting healthcare records consists of several steps, including data processing, sampling, data analysis, and data security mechanisms, as illustrated in Figure 3. Initially, the healthcare records generated and collected by the IoT sensors are passed to an information-processing engine that applies the PCA technique. Once the information is processed and sampled using PCA, the data analysis is carried out using the ensemble learning scheme. After all the information has been analyzed and recorded at the base stations, the complete information is recorded and maintained through a blockchain mechanism. A data warehouse is further used to map and reduce the relational database, with the entire network accessed through an application. The application may provide search capabilities, document recording, and query processing from the endpoint. Algorithm 1 presents the data sampling procedure used to analyze and generate the information.
| Algorithm 1: Data sampling |
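The listing for Algorithm 1 appears only as an image in the original. As one plausible reading of the PCA-based sampling step described above, the following sketch projects the records onto their top-k principal components and then draws a random sample; the function name, parameters, and sampling strategy are our own illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def pca_sample(records, k=2, n_samples=50, seed=0):
    """Reduce record dimensionality with PCA, then draw a random sample.

    records: (N, d) array of sensor readings (illustrative interface).
    Returns the sampled projections and the chosen row indices.
    """
    X = records - records.mean(axis=0)            # center the data
    # principal axes via SVD of the centered matrix
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Z = X @ Vt[:k].T                              # project onto top-k components
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(Z), size=min(n_samples, len(Z)), replace=False)
    return Z[idx], idx

# emulate 400 records of 10 features each, as in the paper's scale
records = np.random.default_rng(1).normal(size=(400, 10))
sample, idx = pca_sample(records, k=3, n_samples=100)
print(sample.shape)  # (100, 3)
```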
Figure 3.
Typical architecture for supporting healthcare record. Purple, Green, Orange and Teal color blocks in Blockchain Network refer to different blocks of the Blockchain.
3.1. Ensemble Learning
Ensemble learning is a machine learning technique that aggregates two or more learners, such as models or networks, in order to produce more accurate predictions of the records [26]. Since intelligent devices generate information in huge amounts, the generated records should be processed before being analyzed and used to make decisions in the network.
3.1.1. Data Processing
Records must be properly structured and managed, which can be achieved using several learning techniques and mechanisms. Issues such as bias, variance, and null values must be addressed when pre-processing the data for sampling and further processing. During the sampling of records, we use sequential methods to train the base learners, as this minimizes the errors made by the previous models in the preceding steps [27,28]. The aim of using ensemble learning here is to reduce the number of errors in the generated records so that they can be processed with reduced delay and effort. Figure 4 presents the healthcare industry management record, which represents the types of records that can be stored and accessed while recording the information on the network.
Figure 4.
A typical healthcare information management record.
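As an illustration of the pre-processing step described above, the following sketch imputes null values and drops constant features; the specific choices (mean imputation, a variance threshold) are our own assumptions for demonstration, not the paper's stated procedure:

```python
import numpy as np

def preprocess(X):
    """Illustrative pre-processing pass: impute null values with column
    means and drop near-zero-variance columns (names are our own)."""
    X = X.astype(float)
    col_mean = np.nanmean(X, axis=0)
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_mean[nan_cols]   # fill missing values
    keep = X.var(axis=0) > 1e-8                  # discard constant features
    return X[:, keep]

raw = np.array([[1.0, 5.0, 2.0],
                [np.nan, 5.0, 4.0],
                [3.0, 5.0, np.nan]])
clean = preprocess(raw)
print(clean.shape)  # (3, 2): the constant middle column is dropped
```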
3.1.2. Boosting Algorithm
Boosting is a sequential ensemble method that starts by training a learner on the initial dataset d1, which is typically in an unstructured format. The algorithm then generates a second dataset, d2, by re-sampling the instances and prioritizing those that the first learner misclassified. A third dataset, d3, is compiled from d1 and d2 by prioritizing the instances misclassified on d2 and the samples on which the learners trained on d1 and d2 disagreed. The process repeats n times, and the weighted outputs of all the learners are combined to obtain the final prediction. Figure 5 presents the boosting ensemble learning method, including the training data and weighted information with correct and incorrect prediction records.
Figure 5.
Boosting ensemble learning method. Colored blocks in predictions and Models are different blocks of the Blockchain.
Algorithm 2 presents the sequential boosting ensemble learning.
| Algorithm 2: Sequential boosting ensemble learning |
| Input: N data samples; n learners (models) |
| Output: final prediction of the samples and instances |
| Step 1: Train the first learner on the initial dataset d1 and identify the misclassified samples. |
| Step 2: Generate a new dataset d2 from d1 by prioritizing the data instances misclassified by the first learner model. |
| Step 3: Generate the next dataset by prioritizing the instances misclassified by the second learner; repeat n times. |
| Step 4: Combine and weight all the learners to obtain the final prediction. |
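The sequential re-weighting in Algorithm 2 can be sketched as an AdaBoost-style procedure. This is an illustrative reconstruction under our own assumptions (threshold-stump weak learners, exponential re-weighting), not the authors' exact implementation:

```python
import numpy as np

def boost(X, y, n_learners=10):
    """AdaBoost-style sketch of sequential boosting: each round
    re-weights the samples the previous learner misclassified.
    X: (N, d) features; y: labels in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_learners):
        # weak learner: best single-feature threshold stump under weights w
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = np.where(s * (X[:, j] - t) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s, pred)
        err, j, t, s, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # learner weight
        w *= np.exp(-alpha * y * pred)          # boost misclassified samples
        w /= w.sum()
        stumps.append((j, t, s))
        alphas.append(alpha)

    def predict(Xq):
        agg = np.zeros(len(Xq))
        for (j, t, s), a in zip(stumps, alphas):
            agg += a * np.where(s * (Xq[:, j] - t) >= 0, 1, -1)
        return np.sign(agg)                     # weighted vote of all learners
    return predict

X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
predict = boost(X, y, n_learners=5)
print(predict(X))  # separates the toy data
```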
3.1.3. Blockchain Network
The large volumes of information generated by multiple intelligent devices are analyzed and structured using the ensemble learning method, where the data are sampled into various instances. Boosting ensemble learning is used to produce better predictions of the generated records using multiple learners [29,30]. Once all the data samples and their respective instances carry corrected and weighted information, they can be easily accessed and stored by the network. The next step is to store all the records in an efficient and secure manner. In addition to providing security, blockchain technology ensures transparency and more efficient data access while the records are generated and processed. The blockchain network accepts the weighted information produced by the boosting method, and each sample is monitored and accessed more efficiently than with existing methods. The architectural representation of boosting learners integrated with the blockchain network is presented in Figure 6 and consists of n samples. The weighted samples of the n models that are identified as correct are stored in the blockchain network for further surveillance and analysis. Each block consists of a data sample, a hash function, and the number of learners as a record.
Figure 6.
Boosting learning integrated with blockchain network. Purple, Green, Orange and Teal color blocks in Blockchain Network refer to different blocks of the Blockchain.
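The block layout described above (data sample, hash, learner count) can be sketched as a toy chain; the field names and hashing details are illustrative assumptions, not the authors' schema:

```python
import hashlib
import json
import time

class Block:
    """Toy block holding one weighted data sample: the sample payload,
    the number of learners that processed it, and a hash linking to
    the previous block (field names are our own)."""
    def __init__(self, sample, n_learners, prev_hash):
        self.sample = sample
        self.n_learners = n_learners
        self.prev_hash = prev_hash
        self.timestamp = time.time()
        self.hash = self.compute_hash()

    def compute_hash(self):
        payload = json.dumps(
            {"sample": self.sample, "learners": self.n_learners,
             "prev": self.prev_hash, "ts": self.timestamp},
            sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

chain = [Block({"patient": "p1", "weight": 0.7}, n_learners=5, prev_hash="0")]
chain.append(Block({"patient": "p2", "weight": 0.3}, 5, chain[-1].hash))

# verify integrity: every block must reference the previous block's hash
ok = all(b.prev_hash == a.hash for a, b in zip(chain, chain[1:]))
print(ok)  # True
```

Because each block's hash covers the previous block's hash, altering any stored sample invalidates every later link, which is what enables the continuous surveillance described above.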
3.1.4. Block Addition
When a new block is added to the network, the existing devices verify the incoming data sample from the boosting learner by examining the number of learner models used to process it. Once the information is verified by the existing devices, the incoming sample is added as a new block to the existing blockchain network.
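The addition step can be sketched as follows, assuming a simple threshold on the number of learner models as the verification rule; this criterion, and all names here, are our own illustrative assumptions, as the paper does not specify the exact check:

```python
import hashlib
import json

def verify_and_add(chain, sample, n_learners, min_learners=3):
    """Sketch of block addition: append the sample only if it was
    processed by at least `min_learners` learner models (assumed rule).
    `chain` is a list of dicts with "sample", "learners", "prev", "hash"."""
    if n_learners < min_learners:   # verification by existing devices fails
        return False
    prev = chain[-1]["hash"] if chain else "0"
    body = json.dumps({"sample": sample, "learners": n_learners, "prev": prev},
                      sort_keys=True)
    chain.append({"sample": sample, "learners": n_learners, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return True

chain = []
accepted = verify_and_add(chain, {"patient": "p1"}, n_learners=5)  # True
rejected = verify_and_add(chain, {"patient": "p2"}, n_learners=1)  # False
print(accepted, rejected, len(chain))  # True False 1
```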
3.1.5. Block Updation
During the blockchain update process, if any record is deleted or altered by authorized entities such as doctors or patients, or by intermediate entities such as nurses or pathology lab recorders, the information is re-processed by the boosting learner in order to verify the validity of the records. Once the information is accurately predicted, the sample remains in the blockchain network.
The final predicted and stored data samples in the network can be easily accessed and maintained by the system with proper structuring and managing of records. The simulation setup of the proposed mechanism is detailed and validated in the performance analysis section.
4. Performance Analysis
The proposed mechanism is evaluated on two different datasets. The first is a real-world dataset collected from IoT sensors, consisting of patient records including specific diseases and complete health histories. The second is a synthetic dataset constructed as a 500 × 500 test data matrix to emulate the records of 100 IoT devices generating 400 samples over a given duration. The test dataset is the sum of a rank-k matrix and a sparse matrix. In order to distribute the records uniformly, offsets and gains are applied with distributions of [0.5, 1, 1.5] and [−0.5, 0.5], respectively.
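A hedged reconstruction of this synthetic set-up is sketched below; the rank k, sparsity level, and random seed are our own choices for illustration, as the text does not specify them:

```python
import numpy as np

rng = np.random.default_rng(42)

# 500 x 500 test matrix built as a rank-k matrix plus a sparse component,
# emulating 400 samples from 100 IoT devices as described above.
k = 5                                           # assumed fixed rank
low_rank = rng.normal(size=(500, k)) @ rng.normal(size=(k, 500))
sparse = np.where(rng.random((500, 500)) < 0.01,
                  rng.normal(size=(500, 500)), 0.0)
offset = rng.choice([0.5, 1.0, 1.5])            # offset levels from the paper
gain = rng.uniform(-0.5, 0.5)                   # gain range from the paper
data = gain * (low_rank + sparse) + offset
print(data.shape)  # (500, 500)
```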
4.1. Methodology
The performance evaluation focuses on managing critical records by categorizing and managing the distributed data. The practical implementation is conducted in Python, where the dataset is first structured and processed before the ensemble learning algorithm is applied. The device records are then stored in a blockchain database for further surveillance; JavaScript is used to implement the blockchain mechanism in the network. Graphical representations are used to demonstrate the validity and out-performance of the proposed mechanism in terms of accuracy, false positives, false negatives, and other authentication scenarios and policies.
4.2. Results and Discussion
The recorded information is analyzed against the aforementioned security metrics: true negative, true positive, false negative, false positive, and accuracy. Figure 7 presents the accuracy of the existing and proposed schemes while analyzing the recorded information in the network.
Figure 7.
Accuracy.
The recorded information is efficiently managed and recorded by the proposed scenario because of the involvement of the blockchain network. Further, Figure 8, Figure 9, Figure 10 and Figure 11 represent the true positive, true negative, false positive, and false negative rates against traditional and proposed mechanisms.
Figure 8.
True positive.
Figure 9.
True negative.
Figure 10.
False positive.
Figure 11.
False negative.
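The rates reported in Figures 7–11 can be derived from a confusion matrix. A minimal sketch follows, where the label convention (1 = positive/malicious, 0 = negative/benign) is our assumption:

```python
def confusion_rates(y_true, y_pred):
    """Compute accuracy and the four confusion-matrix rates for
    binary labels (1 = positive/malicious, 0 = negative/benign)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "tpr": tp / (tp + fn) if tp + fn else 0.0,  # true positive rate
        "tnr": tn / (tn + fp) if tn + fp else 0.0,  # true negative rate
        "fpr": fp / (fp + tn) if fp + tn else 0.0,  # false positive rate
        "fnr": fn / (fn + tp) if fn + tp else 0.0,  # false negative rate
    }

m = confusion_rates([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(m["accuracy"])  # 0.6
```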
Results Analysis
The proposed framework is analyzed and validated against accuracy, true positive, and true negative scenarios. The existing approach considered here is the cluster-based framework using PCA proposed in [20], whose authors simulated results for accuracy. The presented graphs show that the proposed mechanism outperforms it because of the integration of the boosting and blockchain methods. The boosting learning algorithm improves the decision-making accuracy, while the blockchain further strengthens the authentication scenarios by identifying the behavior of each communicating device in the network. Furthermore, the privacy of patient information is maintained in the proposed scenario through the blockchain network: every time an entity requests access to patient information, the patient must grant permission, and any alteration or access to records by a third party can be easily monitored through the blockchain network.
4.3. Summary
In summary, the proposed mechanism manages and records information more efficiently, in terms of accuracy and the other measured parameters, due to the involvement of blockchain and learning schemes. Existing mechanisms ensure security and efficiency using various learning schemes that incur large computation and communication overheads. In the proposed scheme, the information can be easily maintained and recorded through the blockchain network, which not only ensures transparency but also maintains efficiency and the continuous analysis of records in the network.
5. Conclusions
Numerous data science methods and techniques exist to process generated records into a structured format. Once the information is properly structured, data scientists may apply further mechanisms to process it and access the desired records within limited space and time while analyzing them efficiently. However, existing techniques suffer from delayed responses and inaccurate information communication within a large network. The proposed mechanism integrates ensemble learning and a blockchain mechanism in order to reduce the delay when processing structured records after identifying errors. The blockchain mechanism is used to monitor the corrected prediction records in order to improve decision-making accuracy. The proposed mechanism outperformed existing schemes on several metrics, including accuracy, true positive, true negative, false positive, and false negative rates, and it handles large volumes of generated information more efficiently than traditional schemes by integrating ensemble learning and blockchain mechanisms. The accuracy of the proposed mechanism could be further enhanced by including a neural self-supervised learning approach that learns the patterns of heterogeneous records from several IoT devices, which we leave as future work.
Author Contributions
Conceptualization, G.R. and R.I.; Methodology, G.R.; Software, G.R.; Validation, G.R. and R.I.; Formal Analysis, G.R.; Investigation, G.R.; Resources, G.R.; Data Curation, G.R.; Writing – Original Draft Preparation, G.R.; Writing – Review & Editing, G.R. and R.I.; Visualization, G.R.; Supervision, R.I.; Project Administration, G.R. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not Applicable.
Informed Consent Statement
Not Applicable.
Data Availability Statement
The raw data supporting the conclusions of this article will be made available by the authors on request.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Pereira, P.; Cunha, J.; Fernandes, J.P. On understanding data scientists. In Proceedings of the 2020 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), Dunedin, New Zealand, 11–14 August 2020; pp. 1–5. [Google Scholar]
- Zhang, Z.; Kouzani, A.Z. Implementation of DNNs on IoT devices. Neural Comput. Appl. 2020, 32, 1327–1356. [Google Scholar] [CrossRef]
- Badar, M.S.; Shamsi, S.; Haque, M.M.U.; Aldalbahi, A.S. Applications of AI and ML in IoT. In Integration of WSNs into Internet of Things; CRC Press: Boca Raton, FL, USA, 2021; pp. 273–290. [Google Scholar]
- Alajlan, N.N.; Ibrahim, D.M. TinyML: Enabling of inference deep learning models on ultra-low-power IoT edge devices for AI applications. Micromachines 2022, 13, 851. [Google Scholar] [CrossRef] [PubMed]
- Qiu, J.; Wu, Q.; Ding, G.; Xu, Y.; Feng, S. A survey of machine learning for big data processing. EURASIP J. Adv. Signal Process. 2016, 2016, 67. [Google Scholar] [CrossRef]
- Sodhro, A.H.; Malokani, A.S.; Sodhro, G.H.; Muzammal, M.; Zongwei, L. An adaptive QoS computation for medical data processing in intelligent healthcare applications. Neural Comput. Appl. 2020, 32, 723–734. [Google Scholar] [CrossRef]
- Ma, F.; Ye, M.; Luo, J.; Xiao, C.; Sun, J. Advances in mining heterogeneous healthcare data. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, Singapore, 14–18 August 2021; pp. 4050–4051. [Google Scholar]
- Wang, M.; Li, S.; Zheng, T.; Li, N.; Shi, Q.; Zhuo, X.; Ding, R.; Huang, Y. Big data health care platform with multisource heterogeneous data integration and massive high-dimensional data governance for large hospitals: Design, development, and application. JMIR Med. Inform. 2022, 10, e36481. [Google Scholar] [CrossRef] [PubMed]
- Thakur, A.; Molaei, S.; Nganjimi, P.C.; Soltan, A.; Schwab, P.; Branson, K.; Clifton, D.A. Knowledge abstraction and filtering based federated learning over heterogeneous data views in healthcare. NPJ Digit. Med. 2024, 7, 283. [Google Scholar] [CrossRef] [PubMed]
- Kasban, H.; El-Bendary, M.A.M.; Salama, D.H. A comparative study of medical imaging techniques. Int. J. Inf. Sci. Intell. Syst. 2015, 4, 37–58. [Google Scholar]
- Suetens, P. Fundamentals of Medical Imaging; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
- Suzuki, K. Overview of deep learning in medical imaging. Radiol. Phys. Technol. 2017, 10, 257–273. [Google Scholar] [CrossRef]
- Hammi, M.T.; Bellot, P.; Serhrouchni, A. BCTrust: A decentralized authentication blockchain-based mechanism. In Proceedings of the 2018 IEEE Wireless Communications and Networking Conference (WCNC), Barcelona, Spain, 15–18 April 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6. [Google Scholar]
- Ma, H.; Huang, E.X.; Lam, K.Y. Blockchain-based mechanism for fine-grained authorization in data crowdsourcing. Future Gener. Comput. Syst. 2020, 106, 121–134. [Google Scholar] [CrossRef]
- Pelekoudas-Oikonomou, F.; Zachos, G.; Papaioannou, M.; de Ree, M.; Ribeiro, J.C.; Mantas, G.; Rodriguez, J. Blockchain-based security mechanisms for IoMT Edge networks in IoMT-based healthcare monitoring systems. Sensors 2022, 22, 2449. [Google Scholar] [CrossRef]
- Parsaeian, M.; Mahdavi, M.; Saadati, M.; Mehdipour, P.; Sheidaei, A.; Khatibzadeh, S.; Farzadfar, F.; Shahraz, S. Introducing an efficient sampling method for national surveys with limited sample sizes: Application to a national study to determine quality and cost of healthcare. BMC Public Health 2021, 21, 1414. [Google Scholar] [CrossRef]
- Lu, R.; Jin, X.; Zhang, S.; Qiu, M.; Wu, X. A study on big knowledge and its engineering issues. IEEE Trans. Knowl. Data Eng. 2018, 31, 1630–1644. [Google Scholar] [CrossRef]
- Salloum, S.; Huang, J.Z.; He, Y.; Chen, X. An asymptotic ensemble learning framework for big data analysis. IEEE Access 2018, 7, 3675–3693. [Google Scholar] [CrossRef]
- Shang, T.; Zhang, F.; Chen, X.; Liu, J.; Lu, X. Identity-based dynamic data auditing for big data storage. IEEE Trans. Big Data 2019, 7, 913–921. [Google Scholar] [CrossRef]
- Yu, T.; Wang, X.; Shami, A. Recursive principal component analysis-based data outlier detection and sensor data aggregation in IoT systems. IEEE Internet Things J. 2017, 4, 2207–2216. [Google Scholar] [CrossRef]
- Sanyal, S.; Zhang, P. Improving quality of data: IoT data aggregation using device to device communications. IEEE Access 2018, 6, 67830–67840. [Google Scholar] [CrossRef]
- Lin, R.; Ye, Z.; Wang, H.; Wu, B. Chronic diseases and health monitoring big data: A survey. IEEE Rev. Biomed. Eng. 2018, 11, 275–288. [Google Scholar] [CrossRef]
- Emara, T.Z.; Huang, J.Z. Distributed data strategies to support large-scale data analysis across geo-distributed data centers. IEEE Access 2020, 8, 178526–178538. [Google Scholar] [CrossRef]
- Ding, X.; Wang, H.; Li, G.; Li, H.; Li, Y.; Liu, Y. IoT data cleaning techniques: A survey. Intell. Converg. Netw. 2022, 3, 325–339. [Google Scholar] [CrossRef]
- Tsogbaatar, E.; Bhuyan, M.H.; Taenaka, Y.; Fall, D.; Gonchigsumlaa, K.; Elmroth, E.; Kadobayashi, Y. DeL-IoT: A deep ensemble learning approach to uncover anomalies in IoT. Internet Things 2021, 14, 100391. [Google Scholar] [CrossRef]
- Jatoth, C.; Jain, R.; Fiore, U.; Chatharasupalli, S. Improved classification of blockchain transactions using feature engineering and ensemble learning. Future Internet 2021, 14, 16. [Google Scholar] [CrossRef]
- Al-Utaibi, K.A.; El-Alfy, E.S.M. Intrusion detection taxonomy and data preprocessing mechanisms. J. Intell. Fuzzy Syst. 2018, 34, 1369–1383. [Google Scholar] [CrossRef]
- Wang, S.; Celebi, M.E.; Zhang, Y.-D.; Yu, X.; Lu, S.; Yao, X.; Zhou, Q.; Miguel, M.-G.; Tian, Y.; Gorriz, J.M.; et al. Advances in data preprocessing for biomedical data fusion: An overview of the methods, challenges, and prospects. Inf. Fusion 2021, 76, 376–421. [Google Scholar] [CrossRef]
- Lashkari, B.; Musilek, P. A comprehensive review of blockchain consensus mechanisms. IEEE Access 2021, 9, 43620–43652. [Google Scholar] [CrossRef]
- Yawalkar, P.M.; Paithankar, D.N.; Pabale, A.R.; Kolhe, R.V.; William, P. Integrated identity and auditing management using blockchain mechanism. Meas. Sensors 2023, 27, 100732. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
