Review

The Constrained-Disorder Principle Assists in Overcoming Significant Challenges in Digital Health: Moving from “Nice to Have” to Mandatory Systems

Hadassah Medical Center, Department of Medicine, Faculty of Medicine, Hebrew University, POB 1200, Jerusalem IL91120, Israel
*
Author to whom correspondence should be addressed.
Clin. Pract. 2023, 13(4), 994-1014; https://doi.org/10.3390/clinpract13040089
Submission received: 19 July 2023 / Revised: 16 August 2023 / Accepted: 18 August 2023 / Published: 20 August 2023

Abstract

The success of artificial intelligence depends on whether it can penetrate the boundaries of evidence-based medicine, the lack of policies, and the resistance of medical professionals to its use. The failure of digital health to meet expectations requires rethinking some of the challenges faced. We discuss some of the most significant challenges faced by patients, physicians, payers, pharmaceutical companies, and health systems in the digital world. The goal of healthcare systems is to improve outcomes. Assisting in diagnosis, collecting data, and simplifying processes is a “nice to have” capability, but it is not essential. Many of these systems have yet to be shown to improve outcomes. Current outcome-based expectations and economic constraints make “nice to have,” “assists,” and “eases processes” insufficient. Complex biological systems are defined by their inherent disorder, bounded by dynamic boundaries, as described by the constrained-disorder principle (CDP). The CDP provides a platform for correcting systems’ malfunctions by regulating their degree of variability. A CDP-based second-generation artificial intelligence system provides solutions to some of the challenges digital health faces. With these systems, therapeutic interventions are expected to improve outcomes. In addition to improving clinically meaningful endpoints, CDP-based second-generation algorithms ensure patient and physician engagement and reduce health systems’ costs.

1. Introduction

Artificial intelligence (AI) is still expected to influence healthcare delivery and the practice of medicine [1]. Despite the hype and attention around it and the rapid growth of digital technologies, the involvement of patients, clinicians, the insurance industry, and medical regulators remains too low. The Valley of Death (VoD) is a challenge entrepreneurs, business owners, technology experts, innovators, and inventors must consider [2]. It reflects a series of challenges facing many companies in the digital world, as the hype of the last decade seems to be over [2]. The “no evidence, no implementation–no implementation, no evidence” paradox is often cited in the digital health field. The lack of evidence on how these systems may impact health outcomes, health system efficiency, and the cost-effectiveness of service delivery is a significant challenge [3].
Digital health systems differ from other digital systems because patients do not approach healthcare voluntarily but out of necessity. The challenge of building mandatory rather than “nice to have” systems poses a significant barrier to their implementation. Even though people in the field are beginning to believe that digital health has not met its expectations, that does not mean it has no place; rather, it means rethinking some of the challenges. Whether AI succeeds in medicine and healthcare depends on its ability to penetrate the boundaries of evidence-based medicine, the lack of policies, and the reluctance of medical professionals to use it [4,5,6,7].
This paper discusses several challenges facing the digital field and describes second-generation AI systems that can address some of them. Our paper emphasizes the need for mandatory platforms that improve outcomes rather than nice-to-have tools.

2. Uncertainty in the Healthcare Sector: Digital Health Has Failed to Meet Expectations

Uncertainty underlies the healthcare sector [8,9]. Patients sense there is no easy way to tap into the vast knowledge of healthcare services [10,11]. Healthcare is expensive, feels impersonal and corporate, and confronts people with multiple options and choices they are unprepared for [12]. The uncertainty is particularly pronounced in digital health, where many solutions have yet to live up to their initial promise. Despite significant investments and efforts to implement digital technologies in healthcare, many solutions have yet to deliver the expected results [13]. Technological challenges, organizational barriers, and cultural resistance to change contribute to the failure of digital health to reach its expected potential [14]. A lack of education about the capabilities of digital medicine, and the added administrative burden that came with the early digitization of healthcare processes, contributed to physician burnout [15,16,17]. There is also fear that AI may eventually replace physicians [18]. The lack of a legal framework defining liability in adopting or rejecting algorithm recommendations leaves doctors vulnerable to legal consequences when using AI [19].
The healthcare community is repeatedly excited by the hope of providing better care through effective technology adoption [7,20,21,22,23,24]. There is no denying that digital health has not delivered on this promise and has not transformed the health system. More healthcare information technology (IT) companies have gone bankrupt in the past five years than in the two decades before, and 98% of digital health startups have failed to survive [25,26]. Corporations are shutting down digital health labs, curtailing investments in digital health, and consolidating digital health conferences, and governments are re-evaluating the funding regimes for such initiatives [27]. Digital health has yet to witness a large-scale adoption that could match the hope created about it [28,29].
Clinically robust, more inclusive, and better-personalized services are needed, using a marketplace model that incentivizes lower costs and better matching services with patients’ needs [30].

3. Digital Health Trends over the Last Decade: First-Generation Systems

Many digital systems aim to function as stand-alone digital interventions. To achieve this goal, therapeutic interventions must be integrated with digital systems, and drugs must complement digital-first interventions. Digital solutions are often grouped, based on the potential risk to patients, into solutions that improve system efficiency with a measurable patient-outcome benefit; mobile systems that inform or monitor and encourage change and self-management; and clinical decision support (CDS) and prediction models that guide treatment, actively monitor, diagnose, and support treatment [7,31,32].
First-generation digital health tools were expected to resolve long-standing healthcare access and treatment inequalities in low- and middle-income countries [33]. However, many of these expectations were not met [34,35]. Expert systems can help diagnose medical conditions based on a patient’s symptoms and other information. Based on the patient’s symptoms and other factors, such a system uses a knowledge base of rules and information about different medical conditions to suggest a diagnosis. First-generation tools also include systems that assist with interpreting medical images, such as X-rays and MRIs, and systems that analyze large amounts of data, such as electronic health records.
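As a purely illustrative sketch, the rule-based matching that underlies such expert systems can be expressed in a few lines of Python; the conditions, symptoms, and scoring rule below are hypothetical and carry no clinical meaning:

```python
# Illustrative toy knowledge base; condition names, symptoms, and the
# scoring rule are hypothetical, not clinical guidance.
KNOWLEDGE_BASE = {
    "influenza": {"fever", "cough", "muscle aches"},
    "migraine": {"headache", "nausea", "light sensitivity"},
    "common cold": {"cough", "sneezing", "sore throat"},
}

def rank_conditions(reported_symptoms):
    """Score each condition by the fraction of its rule symptoms reported."""
    reported = set(reported_symptoms)
    scores = {
        condition: len(reported & symptoms) / len(symptoms)
        for condition, symptoms in KNOWLEDGE_BASE.items()
    }
    # Highest-scoring condition first.
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(rank_conditions({"fever", "cough"})[0][0])  # best match: influenza
```

A real expert system would add rule weights, certainty factors, and a far larger curated knowledge base, but the principle of matching reported findings against encoded rules is the same.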
Monitoring is also part of first-generation AI usage in medicine. Early detection of atrial fibrillation was one of the first uses of AI in medicine, and smartphone-based electrocardiogram (ECG) monitoring and detection of atrial fibrillation are regulatory approved [36]. While wearable and portable ECG technologies have helped detect atrial fibrillation, they have limitations, such as a high rate of false positives due to movement artifacts and difficulty in adoption among the elderly patients at higher risk [37]. With continuous glucose monitoring, people with diabetes can monitor their blood sugar levels in real time; the FDA approved the Guardian glucose monitoring system, and Sugar IQ is an app that helps users better prevent low blood sugar episodes based on repeated measurements [38]. Other monitoring systems include seizure detection with an alarm sent to close relatives and physicians [39]. AI-driven diagnostics enable cardiac ultrasound imaging without specialized training, and systems assist in radiology and pathology image interpretation [40,41,42]. Digital systems are proposed for improving precision medicine, such as matching genetic mutations found in tumor samples with patterns found in the genetic data and medical records of patients [43,44], allowing treatment to be personalized.
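As a simplified illustration of how wearable rhythm monitoring can flag possible atrial fibrillation, the sketch below scores the variability of RR intervals (times between heartbeats); the 0.15 coefficient-of-variation cutoff and the interval values are illustrative assumptions, not validated clinical parameters:

```python
# Minimal rhythm-irregularity screen from RR intervals (seconds).
# The 0.15 coefficient-of-variation threshold is an illustrative
# assumption, not a validated clinical cutoff.
from statistics import mean, stdev

def irregular_rhythm(rr_intervals, cv_threshold=0.15):
    """Flag a beat series whose RR-interval variability exceeds the threshold."""
    cv = stdev(rr_intervals) / mean(rr_intervals)  # coefficient of variation
    return cv > cv_threshold

regular = [0.80, 0.82, 0.79, 0.81, 0.80]    # steady, sinus-like rhythm
erratic = [0.62, 1.05, 0.48, 0.91, 0.70]    # highly irregular rhythm

print(irregular_rhythm(regular))  # False
print(irregular_rhythm(erratic))  # True
```

Deployed detectors use far richer features and trained models than a single variability statistic, which is also why motion artifacts can inflate their false-positive rates.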
Wearable biosensors can detect multiple cardiovascular parameters and multiple analytes, such as lactate, glucose, and electrolytes [45,46]. The application of health-tracking reward programs by insurance companies encourages the use of wearable health technology [47]. Decision support systems using signal and image processing improve disease diagnosis in pathology and radiology [48]; telemedicine screening in ophthalmology assists in screening for glaucoma, diabetic retinopathies, and retinopathy of prematurity [49]. Telemedicine is also applied to autism and myocardial infarction care [50,51].
Risk stratification and prediction are also possible with these systems [52,53], such as predicting the decline of glomerular filtration rate in patients with polycystic kidney disease [54], risk stratification for the progression of IgA nephropathy [55], and predicting outcomes in lower gastrointestinal bleeding [56], inflammatory bowel disease [57], esophageal cancer [58], and metastasis in colorectal cancer [59]. They can also be used for interpreting pulmonary function tests [60], processing images from endoscopy and ultrasound to detect abnormal structures in colonoscopy [61], and automating tedious, labor-intensive tasks, such as typing on keyboards and into electronic records.
Telemedicine is used for follow-ups to reduce the cost of patients’ accommodation in hospitals and to provide access to doctors in rural areas [62]. It enables the implementation of health information systems that consider countries’ context, needs, vulnerabilities, and priorities [33]. Automated tools can fill gaps in trained healthcare workforce availability, simplify clinical workflows, and assist in overcoming the lack of professionals [63]. In low-resource rural settings, blockchain can track supply chains in medication deliveries from pharmaceutical companies to hospitals and patients [64]. Thus, digital tools reduce inequalities in under-resourced neighborhoods, which is essential for promoting innovation and knowledge generation in low- and middle-income countries [33,65].
Using digital systems, treatment plans can be designed, and evidence-based treatment options provided by analyzing structured and unstructured data in medical records can be explored [66]. AI systems combine data from medical records with clinical expertise and research papers to suggest a treatment plan [4,67,68,69]. Using AI, a triage tool can identify high-risk patients and indicate their need for critical care early [70,71]. AI systems are used to facilitate drug development [72].
Virtual reality (VR)-based devices provide a virtual digital picture, while augmented reality (AR) results from integrating information or graphical elements into the user’s environment in real time [73]. Mixed reality (MR), which combines AR and VR, improves the effectiveness of medical education and physician performance [74]. AR and VR are used to rehabilitate post-stroke and posttraumatic stress disorder [75].
The COVID-19 pandemic provided an opportunity to improve the collection and management of data to inform decision-making, screening, disease surveillance, and monitoring [76,77]. Digital health-assisted healthcare systems provide means for prevention and primary care [76,78]. Digital screening for COVID-19 decreased the number of visits to emergency departments [79]. Telemedicine for distance consultation and health gadgets, like pulse oximetry, were used in the COVID-19 pandemic [80,81].
Over the last decade, artificial intelligence has been used to analyze quantifiable data and perform highly repetitive tasks in healthcare [4]. Medical records can be collected, stored, and tracked digitally [82]. AI can improve in-person and online consultations using a patient’s medical history and common medical knowledge. A patient reports their symptoms through the app, which checks them against a database of diseases using speech recognition and offers suggestions [4,83,84]. As a virtual nurse, AI can assist patients with monitoring their health or managing diseases between doctor visits, providing health assistance and medication management [85]. Patients with chronic diseases can benefit from customized follow-up care, and treatment adherence is expected to increase [4,86,87,88].
Several smaller countries, such as South Korea and Estonia, have implemented digital health solutions, but their impact on the global population is small [89]. With the implementation of effective telemedicine and digital health record projects, the National Health Service (NHS) in England and Kaiser Permanente attempt to implement digital systems on a mass scale [90].
First-generation AI has had some success in the medical field, but its abilities are limited, and it cannot adapt to new situations or learn from its mistakes. Modern AI systems, including second-generation and beyond, are more flexible and can learn and adapt over time, which could make them more useful in medicine. To provide high-quality patient care, digital systems must provide a holistic approach that uses robust data and personalized parameters relevant to their healthcare needs [91].
Besides the direct application of artificial intelligence to medical data, generative AI, such as generative pre-trained transformers (GPT), is being introduced to enhance the precision, productivity, and clinical outcomes of patients [92]. Technologies like BERT (Bidirectional Encoder Representations from Transformers) and GPT comprise advanced natural language processing models with remarkable capabilities in interpreting and generating human-like text. Integrating BERT and GPT into digital health strategies may assist in information retrieval, patient–physician communication, and clinical decision-making. By leveraging their contextual understanding, these AI models can help physicians and patients sift through vast medical literature, decode complex terminology, and make informed choices about healthcare options [93,94,95,96]. In ophthalmology and radiology, various ways have been investigated to use ChatGPT-4 in research, medical education, and support for clinical decision-making [97]. There are, however, some limitations and risks associated with the current use of these systems [98]. GPT-like models are limited by the lack of domain expertise, the need to rely on the quality of the input data, the inability to detect errors, and the lack of understanding of ethical issues. Despite their ability to generate coherent and grammatically correct text, GPT-like models may lack the nuance and context that a human expert in the field would provide [98].

4. Challenges in Healthcare Systems That Need to Be Accounted for by Digital Systems

Healthcare markets are undergoing significant changes, and uncertainty, ambiguity, and complexity keep increasing [99]. Healthcare systems worldwide are becoming unsustainable, and technology is being asked to improve them [99,100,101], while the cost of care, including labor costs, keeps rising [102]. The staff shortage in healthcare systems is estimated to reach 18 million by 2030 [103]. Reduced numbers of physicians are expected in many countries [104]. GPs have less than 15 min per patient, and 40% of emergency department visits are marked unnecessary [105]. Patient expectations are increasing, while satisfaction with healthcare has been at its lowest since 1997 [106]. Treatment must be tailored to the patient, and providers often fail to establish trust in the system.
The aging of the population means an increasing burden on healthcare services [107]. Diagnosis of several rare diseases takes 4–9 years [108]. Chronic conditions significantly burden healthcare systems, with 50% of GP appointments being for long-term conditions [109,110]. In total, 27% of health spending is on preventable illnesses [111]. There is low trust in pharma companies and their motives [112].
Pharma companies, payers, and health providers hesitate to use digital systems [113]. Though they “talk about it,” they do not know how to implement these systems in a way that increases profits.
Health systems are conservative; however, the digital health world must respond fast to these challenges [114]. Changing processes requires flexibility and adaptability and is often associated with costs [115]. Technology advances exponentially, but human adaptation is linear, making it difficult to adopt new systems. The narrative must shift from e-health/telemedicine to fitness devices, machine learning, artificial intelligence, blockchain, and automation [62,116,117]. Most healthcare providers fail to provide whole-person care. A patient-outcome-centered approach is more likely to succeed [118,119]. Automation is expected to reduce errors in healthcare and increase safety, while providing equity in care [120].
The following are a few challenges in developing a patient-tailored dynamic algorithm for diagnosis and treatment.

4.1. Digital Health’s Data-Related Challenges with Machine Learning

Digital health relies extensively on mobile health, telemedicine, and various smart devices to collect human health data [78]. Even so, “big data” has failed to translate into profitable products. Using machine learning (ML) in digital health poses several data-related challenges [121,122,123,124,125,126]. To use ML effectively, data must be high quality and complete. Nevertheless, digital health data can be difficult to collect and are often incomplete or noisy. When digital health data come from multiple sources, such as electronic health records, wearables, and mobile apps, heterogeneity and complexity can result. An ML model requires annotation and labeling of data, which can be time-consuming and labor-intensive, particularly in digital health, where data may be complex and diverse. In terms of resources and infrastructure, storing and managing large amounts of digital health data can be challenging.
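A minimal sketch of one such preprocessing step, mean imputation of missing readings pooled from heterogeneous sources, is shown below; the field names and values are hypothetical:

```python
# Sketch of a common preprocessing step for incomplete digital health data:
# mean imputation of missing readings pooled from heterogeneous sources.
# The records and field names are hypothetical.
from statistics import mean

records = [
    {"patient": "A", "glucose": 110, "source": "wearable"},
    {"patient": "B", "glucose": None, "source": "ehr"},       # missing value
    {"patient": "C", "glucose": 150, "source": "mobile_app"},
    {"patient": "D", "glucose": None, "source": "wearable"},  # missing value
]

def impute_mean(rows, field):
    """Replace missing values with the mean of the observed ones."""
    observed = [r[field] for r in rows if r[field] is not None]
    fill = mean(observed)
    return [dict(r, **{field: r[field] if r[field] is not None else fill})
            for r in rows]

complete = impute_mean(records, "glucose")
print([r["glucose"] for r in complete])  # [110, 130, 150, 130]
```

In practice, more careful strategies (per-source calibration, model-based imputation) are needed, since naive imputation can itself distort downstream analyses.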
Potential algorithmic biases can occur if the data quality is unsatisfactory [127]. For AI to excel at tasks, it needs access to high-quality data [4,69,128]. Over the last decade, numerous systems have been associated with biases that evolve from the data type used [129]. Many current ML systems are based on existing data, not looking for relevant data [130,131]. It is common for algorithms to be trained using data from tertiary settings instead of primary care, which does not represent mild diseases well. It can have clinically significant effects, such as inflated case fatality estimates [132]. Overfitting training datasets and unforeseen errors from incidental variations or artifacts in input data can lead to bias in the algorithm’s output [133]. The availability of real-world health data instead of the momentary snapshots seen in hospitals and clinics is mandatory for reforming disease management [134].
ML learns from historical data, and those underrepresented in these data sets may receive inaccurate diagnoses, so real-world data validation is crucial [132]. Population bias results from a focus on common presentations and a strong predisposition toward the sub-populations represented in training data [127,135,136]. Most data focus on subpopulations of limited ethnic diversity, such as white males and Western populations with high income and literacy, and do not apply to many other subjects [127]. Algorithms often discriminate against women, minorities, and other cultures and ideologies. As algorithms learn from the data they are fed, AI programmers must know about the issue of bias in algorithms to actively fight against it by tailoring them [137]. The information needs to account for differences in conditions in healthcare systems and how people are treated [100,138]. Building equitable sociodemographic representation in data repositories, across gender, expertise, and clinical specialties, is crucial to ameliorating health inequities [135]. There is a need to minimize dependence on trusted third parties or data movement [139]. When designing algorithms and introducing them into clinical practice, the principles of equity should be incorporated to ensure that the output does not cause harm to patients [33].
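The mechanics of such population bias can be shown with a deliberately simple sketch: a single decision threshold fitted on one subpopulation misclassifies healthy members of another whose baseline differs. All numbers are synthetic and illustrative:

```python
# Toy illustration of population bias: a threshold fitted on one
# subpopulation misclassifies another with a different healthy baseline.
# All biomarker values are synthetic.
from statistics import mean

# (biomarker value, has_condition) pairs for the training subpopulation.
train_group = [(4.0, False), (4.2, False), (6.1, True), (6.4, True)]
# An underrepresented group with a naturally higher healthy baseline.
other_group = [(5.5, False), (5.7, False), (7.6, True), (7.9, True)]

def fit_threshold(data):
    """Midpoint between mean healthy and mean diseased values."""
    healthy = mean(v for v, sick in data if not sick)
    diseased = mean(v for v, sick in data if sick)
    return (healthy + diseased) / 2

def accuracy(data, threshold):
    """Fraction of cases where (value > threshold) matches the true label."""
    return mean((v > threshold) == sick for v, sick in data)

t = fit_threshold(train_group)
print(accuracy(train_group, t))  # perfect on the training population
print(accuracy(other_group, t))  # drops: healthy members are flagged as sick
```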
The democratization of data and care is mandatory [140]. Increasing amounts of data are being generated, which can be hard to account for, and interoperability requires data sharing and collaboration, which can be costly and require adaptability [141]. Data standardization, data access, overcoming biases due to limited datasets, efficient algorithm deployment, and the need for data collaboration while keeping costs low are all barriers to scalable digital health [33]. More data does not necessarily improve data quality [142,143]. EMRs (electronic medical records) contain more than may be required, and only some data and interactions are necessary [144]. Eighty percent of data remains silent because, in many cases, it is hard to take the data the last mile toward the patient [145]. To provide meaningful information, the data must be translated into precision health [91]. Vulnerabilities, such as adversarial attacks, and a lack of tools to regulate information quality and cybersecurity can negatively affect results.
Commonly used systems focus on single-point means and account less for the dynamicity of biological systems [146]. Using big data ignores the n = 1 concept, attempting to apply conclusions drawn from large populations to the individual, which may result in a bias that affects the treatment of patients [146,147,148].
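A toy numerical example of the n = 1 problem: a population average can suggest a “modest benefit” that no individual patient actually experienced. The drug-response values below are synthetic:

```python
# Synthetic drug-response data illustrating the n = 1 problem:
# the population mean describes no individual patient.
from statistics import mean

# Change in blood pressure after treatment for two patient subgroups.
responders = [-12, -11, -13, -12]   # clear benefit
non_responders = [3, 2, 4, 3]       # slight harm

population = responders + non_responders
avg = mean(population)
print(avg)  # -4.5: an apparent "modest average benefit"

# Yet no single patient experienced a change anywhere near -4.5:
print(any(abs(x - avg) < 2 for x in population))  # False
```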

4.2. Patients and Physicians Face Challenges in Using Digital Systems

Most digital systems fail to engage patients or physicians. In addition, physicians are reluctant to use ML platforms because patients ignore them [69,149]. Despite expectations, healthcare players have yet to adopt digital strategies. Providers must convince patients of the benefits of using a new system, so explainable AI is crucial. For most patients and providers, the system must be relevant to the moment rather than the future. Innovations are accepted at different levels by end users [150,151]. The challenge of attitude comes from changing mindsets: for a system to be used for a long time, the user must want to use it.
These systems may not be well understood or trusted, especially if they are perceived as complex or unfamiliar. Patients and physicians may only accept AI if they fully understand how it works and are sure about its reliability and accuracy. With physicians already having varying technology literacy levels, frustration may be added as physicians learn how to incorporate and utilize AI platforms while struggling with existing technologies such as EMRs [152,153]. Physician burnout can be exacerbated by the effort required to understand how AI algorithms work. There are also concerns about the potential for errors or biases in AI and the possibility that these systems replace human interaction and support. This is particularly relevant for patients who may be concerned about the accuracy and reliability of AI-based diagnoses or treatments [154,155], and about how these systems may undermine healthcare professionals’ autonomy and judgment [156].
Augmented intelligence implies the AI’s assistive role by enhancing human intelligence rather than replacing it [1,157]. In addition, it refers to combining the unique capabilities of human experts with AI to provide better care [4]. Medical professionals make decisions using data obtained with technologies they understand [158]. An explainable AI is crucial to gaining trust in AI-based algorithms [158,159]. New technologies tend to be accepted by physicians if they add to their knowledge of diagnosis and treatment, increase income, and save time by allowing them to do more in their practice. Knowledge, money, and time are the underlying benefits of investing in digital health [69,160,161].
Usability testing examines whether specified users can achieve their intended use effectively and efficiently [162,163]. The VoD often occurs during the clinical translation stage of digital tools due to issues with AI performance, generalizability, black boxes, and explainability [33].
A patient should be involved in the highest decision-making level when designing algorithms for medical purposes to ensure their needs and recommendations are implemented [4,164]. Overall, 75% of consumers want a more personalized experience [165,166]. Balancing technology with real people is needed.
GPT and BERT models lack domain-specific medical knowledge and may generate plausible-sounding yet medically inaccurate information. Integrating medical expertise into these models’ training and fine-tuning processes is essential for reliable medical applications. Furthermore, medical language is highly technical and filled with domain-specific jargon. GPT and BERT may struggle to handle the linguistic complexity and diverse terminology in medical literature and patient records [167].
Patients and medical professionals need time and resources to trust AI with medical diagnoses, support for medical decision-making, or therapy design [4]. Trust is associated with a system’s performance and ability to improve outcomes. Scaling innovations to increase adoption by clinicians and patients mandates large investments and is a significant challenge [168,169].

4.3. Challenges Related to Ethics and Law

Liability needs to be resolved if AI systems cause errors or adverse outcomes. Who is responsible for the consequences if an AI system provides an incorrect diagnosis or treatment recommendation? Several parties are involved, including the developers of the AI system, the healthcare providers who use the system, and the affected patients [168,169]. If an algorithm misses a diagnosis that a physician accepted, the consensus is that the professional is liable if the tool was misused [170]. In other cases, the liability falls back on the creators and the companies behind them [4]. With AI and ML, human doctors are subjected to confirmation bias. The system often tends to reproduce doctors’ errors, thus strengthening their mistakes [171,172].
Due to legal concerns, data with personal identifiers may not always be distinguishable from fully anonymous data. If the data provider does not trust the potential recipient, they may be reluctant to share their data. Official guidelines on data sharing may be absent or unclear, making it difficult to determine how to balance data accessibility with the need to protect privacy and intellectual, financial, and time investments.
GPT- or BERT-generated information might inadvertently lead to incorrect diagnoses, inappropriate treatment recommendations, or miscommunication between healthcare providers and patients [167,173].
Fairness in digital applications requires ethical considerations. A fair selection process considers differences in race, gender, demographic disparities, disability, and other factors [33]. There may also be ethical concerns, as public health agencies may disagree with data requestors about the risks and benefits of sharing data. Additionally, data producers may feel they need more credit or benefit in transit, where they share their data, while data users may benefit from academic credit and career advancement [174].
It is challenging to own and access data generated by AI systems. It may need to be clarified who owns and has access to the data. There may also be concerns about the privacy and security of this data, as it often contains sensitive personal information [175]. The COVID-19 pandemic revealed the need for data sharing and for evaluation and ethical aspects to be developed in the emerging field of digital healthcare, such as consent and transparency regarding what data are collected, and which third parties can access patient data [78,176].
In many cases, it has been proven that individual profiles can still be traced back even if data is anonymized by institutions [177]. Ethical challenges, such as user consent, are significant in health digitalization [178]. Data governance is another challenge, and governments must set up policies and standards for data governance [179].
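The classic mechanism behind such re-identification is a linkage attack: joining quasi-identifiers (postal code, birth year, sex) in an “anonymized” dataset against a public registry recovers identities. The sketch below uses fabricated records for illustration:

```python
# Sketch of a linkage attack: "anonymized" health records re-identified by
# joining quasi-identifiers with a public list. All records are fabricated.

anonymized_health = [
    {"zip": "02139", "birth_year": 1980, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "02141", "birth_year": 1975, "sex": "M", "diagnosis": "asthma"},
]

public_registry = [
    {"name": "Jane Doe", "zip": "02139", "birth_year": 1980, "sex": "F"},
    {"name": "John Roe", "zip": "02144", "birth_year": 1990, "sex": "M"},
]

def reidentify(health_rows, public_rows):
    """Join the two datasets on the quasi-identifier triple."""
    matches = []
    for h in health_rows:
        for p in public_rows:
            if (h["zip"], h["birth_year"], h["sex"]) == \
               (p["zip"], p["birth_year"], p["sex"]):
                matches.append((p["name"], h["diagnosis"]))
    return matches

print(reidentify(anonymized_health, public_registry))
# The "anonymous" diabetes record is linked back to a named individual.
```

Defenses such as k-anonymity or differential privacy aim to break exactly this kind of join, at some cost to data utility.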

4.4. Challenges Related to Healthcare Providers and Pharmaceutical Companies

Digital solution evaluation and adoption require collective efforts from multiple parties, such as health authorities, healthcare providers, manufacturers, small and medium-sized enterprises (SMEs), and multinational corporations (MNCs) [7]. Larger corporations have more resources to develop evidence but are equally limited by time and hesitant to use digital platforms [4,180]. In internal budget allocation, it can be challenging to justify investments in expensive and time-consuming clinical studies for early-stage solutions. The evidence published today may reflect a product that has been updated and refined multiple times, since investigations typically take three years. Sales and manufacturing investments are more tangible for companies, with a more predictable return on investment than clinical studies [7,181]. Researchers may also be unwilling to conduct studies evaluating digital solutions, which require different settings and capabilities and whose benefits lie at the operational and cost level, affecting patient outcomes only indirectly, unlike a drug that improves patient outcomes directly [182].

4.5. Cost-Increase Challenges in Healthcare

The cost-effectiveness and sustainability of digital solutions are significant challenges for implementing these systems [78,183,184]. The cost of AI systems poses a significant barrier to implementation in most countries due to the burden it places on healthcare systems [185]. In parallel, there is a need for value-based healthcare in low-income settings [186].
Implementing AI systems in healthcare requires a significant initial investment, including the costs of acquiring or developing the technology, training staff, and integrating the systems with existing information technology infrastructure, which increases costs in the short term [187]. AI systems require ongoing maintenance and support, which incurs additional costs. Software updates, hardware repairs, staff training, and data storage and processing may all be necessary to ensure the smooth operation of AI systems [188,189]. Complying with relevant laws and regulations, such as HIPAA, involves additional costs, such as legal and registration fees and potential fines.
Digital platforms are asked to assist in treating more patients with the same personnel and require readiness of local infrastructures such as clinical services, equipment, treatment modalities, IT systems, telecommunication network, and cost of AI platforms [190,191]. Resource limitations, workforce, and infrastructure are presently significant barriers to the translation of benefits of digital technologies to improve public health measures, particularly in low- to middle-income countries [192,193,194]. Most payers and healthcare systems do not justify adding costs for digital systems, which are “nice to have”.
Unlike drugs, where evidence leads to reimbursement, digital systems must demonstrate their value and potential in the real world, not just in clinical trials. To be reimbursed, digital systems must be affordable and cost-saving. Most digital apps address problems that seem “not critical”: they do not improve survival, making it difficult to ask health systems for reimbursement.

4.6. Regulations, Validations, and Standards Challenges

The regulatory process for medical devices includes establishment registration, listing, and premarket notification or approval. The process is complicated and lengthy, and the FDA acknowledges that traditional forms of medical-device regulation are not well suited to the faster pace of design and modification of software-based medical devices [195].
Numerous regulations and standards are relevant to digital health solutions [7,24,196]. Regulations must provide life-cycle requirements for developing medical software, such as that found within medical devices, and for communicating electronic health record (EHR) information [197]. These instruct on the principles and requirements for privacy protection, using pseudonymization services to protect personal health information [7]. Some regulations define standards and criteria for ensuring the interoperability of components used in applications that monitor personal health and wellness [7]. Guidelines provide frameworks for evaluating the benefits and risks of digital solutions, along with effectiveness and economic standards [198,199,200].
The challenges of interoperability, data privacy, legal frameworks, systemic acceptability, and project financing are obstacles to large-scale digital platforms [201,202,203,204,205].
While strength of evidence and study duration are mandatory for properly assessing the efficacy of digital systems, only a limited number of products have been tested in RCTs [206]. Examples include patients with advanced tumors randomly assigned to routine outpatient chemotherapy with patient-reported outcomes versus usual care with monitoring at the clinicians’ discretion [207]; text messaging to reduce early discontinuation of aromatase inhibitor therapy in breast cancer [208]; cell phone-based software for patients with type 2 diabetes [209]; a clinical decision support system to aid computerized order entry of chemotherapy [210]; and a deep-learning framework (Med3R) that uses a human-like learning and reasoning process [7,211].
The pre-post design is most commonly used for evaluating digital health solutions. It involves a pre-phase, which provides control data; a “washout” period of up to several months with no interventions, allowing familiarization and limiting implementation-related bias; and a post-phase to collect data on solution effectiveness [212]. The evidence required by initial adopters (e.g., surveys, interviews, and case studies) differs from that needed by the majority (prospective RCTs) [213,214,215].
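As a minimal numeric illustration of the pre-post design described above, the effect can be estimated as the difference in mean outcome between the post-phase and the pre-phase, with the washout window excluded. The weekly scores and phase lengths below are invented for illustration only:

```python
from statistics import mean

def pre_post_effect(measurements, pre_end, washout_end):
    """Post-minus-pre difference in mean outcome; the washout window
    (measurements[pre_end:washout_end]) is excluded from both phases."""
    pre_phase = measurements[:pre_end]          # control data
    post_phase = measurements[washout_end:]     # solution-effectiveness data
    return mean(post_phase) - mean(pre_phase)

# Hypothetical weekly outcome scores: weeks 0-7 pre, 8-11 washout, 12-19 post.
scores = [60, 62, 61, 59, 60, 61, 60, 62,   # pre-phase
          63, 65, 66, 68,                   # washout / familiarization
          70, 72, 71, 73, 72, 74, 73, 75]   # post-phase
print(pre_post_effect(scores, pre_end=8, washout_end=12))  # → 11.875
```

In practice, the comparison would use the study’s actual statistical model rather than a raw difference in means; the sketch only shows how the three phases partition the data.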
RCTs are required before digital systems are implemented, which is challenging and may delay implementation by years, by which time the technology has already advanced. The value of digital systems may be better reflected in real-world studies. Collaboration between companies could benefit many systems but is difficult to implement and encourage.

5. Moving from “Nice to Have” to “Mandatory” Digital Systems

Several challenges, including those listed above, present significant obstacles to implementing digital systems. While many believe that time and fine-tuning will allow these systems to break into everyday use, the reality is more complicated. Most digital systems are still considered “nice to have” and are not required for care.
The purpose of healthcare systems is to improve outcomes. Assisting diagnosis, collecting data, and simplifying processes are “nice to have” capabilities but not mandatory ones. The majority of these systems have never been shown to improve outcomes. “Nice to have”, “assist”, and “ease processes” do not suffice in the current outcome-based environment. Systems need to adapt quickly to changing circumstances [216], yet most current systems are not sufficiently dynamic in response to internal and external perturbations [217]. Systems need to be both fast and accurate, make all actions computable, and support the different data types, algorithms, and statistics required to account for the dynamicity of biological systems [218]. A further challenge for end users, such as patients and providers, is determining a new solution’s credibility and compliance with standards [7,33].
Unless a digital platform improves outcomes, it is unlikely to break through the glass ceiling of everyday use in healthcare. Given the scarcity of resources, a digital system becomes mandatory only if it improves outcomes, ensures high engagement by patients and providers, and is reimbursed.

6. Constrained-Disorder Principle-Based Digital Systems Get Closer to Their Biological Basis

Disorder is inherent to the function of complex systems, and variability characterizes the proper operation of biological systems [219,220,221,222]. At the genome level, a combination of deterministic and stochastic effects regulates DNA transcription and translation [223]. At the cellular level, dynamic instability is the hallmark of normal microtubule function [224,225,226,227,228,229,230]. Heart rate variability (HRV) and blood pressure variability (BPV) are examples from the autonomic nervous system, reflecting the normal regulation of cardiac and vascular function [231]. Loss of HRV is associated with poor prognosis and increased mortality [232,233]. Abnormal BPV affects the morphology and composition of coronary plaques and the related mechanisms of inflammation and hemodynamics; regulating BPV can prevent the occurrence and development of coronary heart disease [234].
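To make “variability” concrete, the sketch below computes two standard HRV indices, SDNN and RMSSD, from RR-interval series. The interval values are invented for illustration, and real HRV analysis involves additional preprocessing (artifact removal, stationarity checks) not shown here:

```python
from math import sqrt
from statistics import stdev

def sdnn(rr_ms):
    """SDNN: standard deviation of RR intervals (overall variability)."""
    return stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR differences
    (beat-to-beat variability)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR series (ms): a healthy, variable rhythm vs. a rigid one.
variable_rhythm = [812, 790, 845, 801, 830, 795, 860, 805]
rigid_rhythm = [800, 801, 799, 800, 800, 801, 799, 800]

# Loss of variability shows up as lower index values.
print(sdnn(variable_rhythm) > sdnn(rigid_rhythm))    # → True
print(rmssd(variable_rhythm) > rmssd(rigid_rhythm))  # → True
```

The point of the comparison is that the “rigid” series, despite a nearly identical mean heart rate, yields much smaller variability indices — the quantitative signature of the loss of HRV associated with poor prognosis.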
The constrained-disorder principle (CDP) defines complex systems by the degree of their inherent disorder, bounded by dynamic boundaries [235]. It differentiates living organisms, characterized by a high degree of disorder, from non-living systems, which show a minimal degree. A system malfunctions when its degree of disorder is reduced or moves out of bounds [235,236,237,238].
CDP-based second-generation AI systems are designed to generate therapeutic regimens close to human biology, improving the response to chronic therapies and thus clinical outcomes [146]. Using personalized algorithms, these systems incorporate controlled variability into therapeutic interventions. Chronic diseases are a significant burden on healthcare systems, and the loss of response to chronic drugs is a significant challenge in treating patients with common chronic diseases [239]. Patients with chronic diseases also fail to adhere to chronic regimens because of a loss of response to therapies, and there is marked inter- and intra-subject variability in response to chronic therapies [239,240,241,242]. CDP-based second-generation AI systems provide a platform for overcoming drug resistance and improving adherence by implementing variability-based therapeutic regimens for patients with chronic diseases [76,108,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260]. The system enables personalized therapies based on individual variability signatures [146,239,257,261].
The CDP-based second-generation AI is the platform on which the digital pill was developed [261]. A digital pill is any drug regulated by a second-generation AI system and is available at three levels. First, an open-loop system implements variability in drug administration times and dosages to overcome tolerance. At the second level, a closed-loop design personalizes the variability signatures, dynamically adapting the variability to each subject’s response. At the third level, the algorithm dynamically incorporates disease-relevant variability signatures [245,258,261]. Examples include the use of HRV in cardiac patients, quantified variability in cytokine secretion in patients with inflammatory disorders, and respiratory and gait variability parameters in patients with pulmonary and neurological disorders [146,239,257,261]. With this CDP-based digital system, patients with severe heart failure had fewer emergency room visits, fewer hospitalizations, improved clinical performance, and improved laboratory tests.
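The three levels can be sketched schematically as follows. This is a hypothetical illustration, not the published algorithm: the class name, dose bounds, the response-update rule, and the signature scaling are all invented assumptions chosen only to show the structure of open-loop variability, closed-loop adaptation, and signature-driven dosing:

```python
import random

class DigitalPillSketch:
    """Schematic sketch of the three digital-pill levels; all numeric
    choices (bounds, update factors) are illustrative assumptions."""

    def __init__(self, base_dose_mg, bound=0.2, seed=None):
        self.base_dose_mg = base_dose_mg
        self.bound = bound            # constrained disorder: max fractional deviation
        self.rng = random.Random(seed)

    def open_loop_dose(self):
        """Level 1: bounded random variability in the administered dose."""
        return self.base_dose_mg * (1 + self.rng.uniform(-self.bound, self.bound))

    def closed_loop_update(self, observed_response, target_response):
        """Level 2: adapt the variability bound to the subject's response --
        widen it when the response wanes (partial tolerance), narrow it otherwise."""
        if observed_response < target_response:
            self.bound = min(0.5, self.bound * 1.25)
        else:
            self.bound = max(0.05, self.bound * 0.9)

    def signature_dose(self, normalized_signature):
        """Level 3: modulate the bound by a disease-relevant variability
        signature (e.g., normalized HRV), clipped to [0, 1]."""
        s = max(0.0, min(1.0, normalized_signature))
        b = self.bound * s
        return self.base_dose_mg * (1 + self.rng.uniform(-b, b))

pill = DigitalPillSketch(base_dose_mg=10, bound=0.2, seed=7)
print(round(pill.open_loop_dose(), 2))   # some dose between 8.0 and 12.0 mg
pill.closed_loop_update(observed_response=0.6, target_response=0.8)
print(pill.bound)                        # bound widened after a waning response
```

The key design point the sketch illustrates is that the disorder is constrained: doses vary randomly, but only within bounds, and the bounds themselves are what the closed-loop and signature levels adjust.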
By viewing the digital pill as part of the therapy rather than as a reminder, patients and physicians are more likely to engage with the platform. It gives end users confidence that their outcomes will improve if they take their medications according to the app-based regimens. Introducing digital systems imposes high costs on healthcare organizations, payers, and end users. The digital pill follows an “Uber/Airbnb” model, in which a digital system improves the efficiency and effectiveness of existing drugs and devices [245,258,261,262].

7. Digital Health Challenges Can Be Overcome by Using CDP-Based Digital Systems

Because CDP-based digital systems improve patient outcomes, they overcome several of the challenges associated with digital health.
Since patients are the “kings of healthcare,” digital health must transform from a stand-alone product into a service that supports clinical outcomes [76,108,217,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,263,264,265,266,267]. Unlike first-generation systems, second-generation AI systems are outcome-based, with clinically meaningful endpoints. Healthcare players can rely on them for solutions and to add value to their systems. Patients benefit because these systems improve their response and reduce side effects, and their focus on the endpoints essential to the patient overcomes the challenge of technology–patient interaction. For healthcare providers, they ensure improved adherence. Payers reduce costs by reducing admissions and the need for expensive therapies that are not necessarily better. Using these methods, pharma companies can increase revenues without developing expensive new products or confronting regulatory barriers [261].
These systems improved clinical and laboratory measures in patients with chronic heart failure, reducing emergency room admissions and hospitalizations [268]. Similar results were shown for patients with multiple sclerosis and those suffering from chronic pain [238]. These examples support the concept that the introduction of outcome-based digital systems can overcome many of the above-discussed challenges.
Second-generation systems can also overcome the challenge of big data, much of which is of limited relevance when designing therapies for individual subjects. By generating insightful datasets, these systems create individualized parameters associated with drug effectiveness, adherence, and side effects [261,262].
Second-generation systems must dynamically retune themselves to biological, environmental, and social factors [33]. They are dynamic and continuously change their output based on internal and external perturbations. By improving outcomes, these systems turn “nice to have” digital algorithms into mandatory ones [262].
Table 1 summarizes some of the challenges and the suggested methods for overcoming them using the described system.

8. Summary and Conclusions

Digital health can either be declared dead or be resurrected [269,270]. To move digital platforms into everyday use, ensure high engagement by patients and physicians, and secure reimbursement, it is crucial to differentiate between “nice to have” systems and mandatory systems. Any new technology needs to be pragmatic, solve problems, reduce the cost of care delivery, and be sustainable in the long term [5,78]. Provided it benefits players in the health system, progressing one step at a time is reasonable and does not require perfection. CDP-based second-generation AI systems show promise and have the potential to overcome some of the challenges digital systems face. If medical associations provide clear guidelines for implementing AI and policymakers create policies encouraging its adoption, AI can become part of standard care [4]. Everything depends on the ability to show improved outcomes when using digital systems, moving them from “nice to have” to mandatory.

Author Contributions

N.H. and Y.I. wrote the manuscript. Y.I. conceptualized. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data are available in the public domain.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AI: artificial intelligence; CDP: constrained disorder principle; VoD: valley of death; IT: information technology; CDS: clinical decision support; MRI: magnetic resonance imaging; ECG: electrocardiogram; FDA: Food and Drug Administration; BERT: bidirectional encoder representations from transformers; GPT: generative pre-trained transformer; VR: virtual reality; AR: augmented reality; MR: mixed reality; NHS: National Health Service; GP: general practitioner; ML: machine learning; EMR: electronic medical record; SMEs: small and medium-sized enterprises; MNCs: multinational corporations; HIPAA: Health Insurance Portability and Accountability Act; EHR: electronic health record; RCT: randomized controlled trial; HRV: heart rate variability; BPV: blood pressure variability.

References

  1. Bajwa, J.; Munir, U.; Nori, A.; Williams, B. Artificial intelligence in healthcare: Transforming the practice of medicine. Future Healthc. J. 2021, 8, e188–e194. [Google Scholar] [CrossRef] [PubMed]
  2. Gbadegeshin, S.A.; Natsheh, A.A.; Ghafel, K.; Mohammed, O.; Koskela, A.; Rimpiläinen, A.; Tikkanen, J.; Kuoppala, A. Overcoming the Valley of Death: A New Model for High Technology Startups. Sustain. Futures 2022, 4, 100077. [Google Scholar] [CrossRef]
  3. Greaves, F.; Joshi, I.; Campbell, M.; Roberts, S.; Patel, N.; Powell, J. What is an appropriate level of evidence for a digital health intervention? Lancet 2018, 392, 2665–2667. [Google Scholar] [CrossRef] [PubMed]
  4. Meskó, B.; Görög, M. A short guide for medical professionals in the era of artificial intelligence. NPJ Digit. Med. 2020, 3, 126. [Google Scholar] [CrossRef] [PubMed]
  5. Cripps, M.; Scarbrough, H. Making Digital Health “Solutions” Sustainable in Healthcare Systems: A Practitioner Perspective. Front. Digit. Health 2022, 4, 727421. [Google Scholar] [CrossRef] [PubMed]
  6. Hughes, J.; Lennon, M.; Rogerson, R.J.; Crooks, G. Scaling Digital Health Innovation: Developing a New ‘Service Readiness Level’ Framework of Evidence. Int. J. Environ. Res. Public Health 2021, 18, 2575. [Google Scholar] [CrossRef] [PubMed]
  7. Guo, C.; Ashrafian, H.; Ghafur, S.; Fontana, G.; Gardner, C.; Prime, M. Challenges for the evaluation of digital health solutions-A call for innovative evidence generation approaches. NPJ Digit. Med. 2020, 3, 110. [Google Scholar] [CrossRef]
  8. Khan, S.; Vandermorris, A.; Shepherd, J.; Begun, J.W.; Lanham, H.J.; Uhl-Bien, M.; Berta, W. Embracing uncertainty, managing complexity: Applying complexity thinking principles to transformation efforts in healthcare systems. BMC Health Serv. Res. 2018, 18, 192. [Google Scholar] [CrossRef] [PubMed]
  9. Saxena, S.; Amritesh. Evolving uncertainty in healthcare service interactions during COVID-19: Artificial Intelligence—A threat or support to value cocreation? In Cyber-Physical Systems; Academic Press: Cambridge, MA, USA, 2022; pp. 93–116. [Google Scholar] [CrossRef]
  10. Shahmoradi, L.; Safadari, R.; Jimma, W. Knowledge Management Implementation and the Tools Utilized in Healthcare for Evidence-Based Decision Making: A Systematic Review. Ethiop. J. Health Sci. 2017, 27, 541–558. [Google Scholar] [CrossRef] [PubMed]
  11. Evans, D.K.; Welander Tärneberg, A. Health-care quality and information failure: Evidence from Nigeria. Health Econ. 2018, 27, e90–e93. [Google Scholar] [CrossRef]
  12. Branning, G.; Vater, M. Healthcare Spending: Plenty of Blame to Go Around. Am. Health Drug Benefits 2016, 9, 445–447. [Google Scholar]
  13. Iyanna, S.; Kaur, P.; Ractham, P.; Talwar, S.; Najmul Islam, A.K.M. Digital transformation of healthcare sector. What is impeding adoption and continued usage of technology-driven innovations by end-users? J. Bus. Res. 2022, 153, 150–161. [Google Scholar] [CrossRef]
  14. Jacob, C.; Sanchez-Vazquez, A.; Ivory, C. Social, Organizational, and Technological Factors Impacting Clinicians’ Adoption of Mobile Health Tools: Systematic Literature Review. JMIR Mhealth Uhealth 2020, 8, e15935. [Google Scholar] [CrossRef] [PubMed]
  15. Haag, M.; Igel, C.; Fischer, M.; German Medical Education Society (GMA); Committee “Digitization-Technology-Assisted Learning and Teaching”; Joint Working Group “Technology-Enhanced Teaching and Learning in Medicine (TeLL)” of the German Association for Medical Informatics, Biometry and Epidemiology (GMDS); German Informatics Society (GI). Digital Teaching and Digital Medicine: A national initiative is needed. GMS J. Med. Educ. 2018, 35, Doc43. [Google Scholar] [PubMed]
  16. Chaiyachati, K.H.; Shea, J.A.; Asch, D.A.; Liu, M.; Bellini, L.M.; Dine, C.J.; Sternberg, A.L.; Gitelman, Y.; Yeager, A.M.; Asch, J.M. Assessment of inpatient time allocation among first-year internal medicine residents using time-motion observations. JAMA Intern. Med. 2019, 179, 760–767. [Google Scholar] [CrossRef]
  17. Poghosyan, L. Clinican burnout: New times, old issue. Nurs. Econ. 2018, 36, 109–111. [Google Scholar]
  18. Shah, N.R. Health care in 2030: Will artificial intelligence replace physicians? Ann. Intern. Med. 2019, 170, 407–408. [Google Scholar] [CrossRef]
  19. Price, W.N.; Gerke, S.; Cohen, I.G. Potential liability for physicians using artificial intelligence. JAMA 2019, 322, 1765–1766. [Google Scholar] [CrossRef] [PubMed]
  20. Yao, R.; Zhang, W.; Evans, R.; Cao, G.; Rui, T.; Shen, L. Inequities in Health Care Services Caused by the Adoption of Digital Health Technologies: Scoping Review. J. Med. Internet Res. 2022, 24, e34144. [Google Scholar] [CrossRef] [PubMed]
  21. Golinelli, D.; Boetto, E.; Carullo, G.; Nuzzolese, A.G.; Landini, M.P.; Fantini, M.P. Adoption of Digital Technologies in Health Care During the COVID-19 Pandemic: Systematic Review of Early Scientific Literature. J. Med. Internet Res. 2020, 22, e22280. [Google Scholar] [CrossRef]
  22. Ngwatu, B.K.; Nsengiyumva, N.P.; Oxlade, O.; Mappin-Kasirer, B.; Nguyen, N.L.; Jaramillo, E.; Falzon, D.; Schwartzman, K. The impact of digital health technologies on tuberculosis treatment: A systematic review. Eur. Respir. J. 2018, 51, 1701596. [Google Scholar] [CrossRef]
  23. Mitchell, M.; Kan, L. Digital Technology and the Future of Health Systems. Health Syst. Reform. 2019, 5, 113–120. [Google Scholar] [CrossRef]
  24. Hourani, D.; Darling, S.; Cameron, E.; Dromey, J.; Crossley, L.; Kanagalingam, S.; Muscara, F.; Gwee, A.; Gell, G.; Hiscock, H.; et al. What Makes for a Successful Digital Health Integrated Program of Work? Lessons Learnt and Recommendations From the Melbourne Children’s Campus. Front. Digit. Health 2021, 3, 661708. [Google Scholar] [CrossRef] [PubMed]
  25. Why Real-World Results Are So Challenging for Digital Health. NEJM Catalyst. 2017. Available online: https://catalyst.nejm.org/doi/full/10.1056/CAT.17.0453 (accessed on 1 June 2023).
  26. 98% of Digital Health Startups Fail—Here’s Why. Becker Health. 2016. Available online: https://www.beckershospitalreview.com/healthcare-information-technology/98-of-digital-health-startups-fail-here-s-why.html (accessed on 1 June 2023).
  27. Gupta, R. Digital health—Hope, Hype, and Halt. 2018. Preprint. Available online: https://www.hhmglobal.com/knowledge-bank/articles/digital-health-from-hope-hype-and-halt-to-hope-heal-and-health (accessed on 1 June 2023).
  28. Venkataramanan, R.; Pradhan, A.; Kumar, A.; Purushotham, A.; Alajlani, M.; Arvanitis, T.N. Digital Inequalities in Cancer Care Delivery in India: An Overview of the Current Landscape and Recommendations for Large-Scale Adoption. Front. Digit. Health 2022, 4, 916342. [Google Scholar] [CrossRef]
  29. Bhyat, R.; Hagens, S.; Bryski, K.; Kohlmaier, J.F. Digital Health Value Realization Through Active Change Efforts. Front. Public Health 2021, 9, 741424. [Google Scholar] [CrossRef] [PubMed]
  30. Mathur, S.; Sutton, J. Personalized medicine could transform healthcare. Biomed. Rep. 2017, 7, 3–5. [Google Scholar] [CrossRef] [PubMed]
  31. NICE. 2019. Available online: https://www.nice.org.uk/about/what-we-do/our-programmes/evidence-standards-framework-for-digital-health-technologies (accessed on 1 June 2023).
  32. Cummins, N.; Schuller, B.W. Five Crucial Challenges in Digital Health. Front. Digit. Health 2020, 2, 536203. [Google Scholar] [CrossRef]
  33. Challenges in digital medicine applications in under-resourced settings. Nat. Commun. 2022, 13, 3020. [CrossRef]
  34. Ibrahim, M.S.; Mohamed Yusoff, H.; Abu Bakar, Y.I.; Thwe Aung, M.M.; Abas, M.I.; Ramli, R.A. Digital health for quality healthcare: A systematic mapping of review studies. Digit. Health 2022, 8, 20552076221085810. [Google Scholar] [CrossRef]
  35. Kim, H.S.; Kwon, I.H.; Cha, W.C. Future and Development Direction of Digital Healthcare. Healthc. Inform. Res. 2021, 27, 95–101. [Google Scholar] [CrossRef]
  36. Benjamin, E.J.; Blaha, M.J.; Chiuve, S.E.; Cushman, M.; Das, S.R.; Deo, R.; De Ferranti, S.D.; Floyd, J.; Fornage, M.; Gillespie, C. Heart disease and stroke statistics—2017 update: A report from the American Heart Association. Circulation 2017, 135, e146–e603. [Google Scholar]
  37. Raja, J.M.; Elsakr, C.; Roman, S.; Cave, B.; Pour-Ghaz, I.; Nanda, A.; Maturana, M.; Khouzam, R.N. Apple watch, wearables, and heart rhythm: Where do we stand? Ann. Transl. Med. 2019, 7, 417. [Google Scholar] [CrossRef] [PubMed]
  38. Lawton, J.; Blackburn, M.; Allen, J.; Campbell, F.; Elleri, D.; Leelarathna, L.; Rankin, D.; Tauschmann, M.; Thabit, H.; Hovorka, R. Patients’ and caregivers’ experiences of using continuous glucose monitoring to support diabetes self-management: Qualitative study. BMC Endocr. Disord. 2018, 18, 12. [Google Scholar] [CrossRef] [PubMed]
  39. Lai, M.; Regalia, G.; Onorati, F.; Picard, R. Embrace and E4: Devices for seizure detection and advancing research. Epilepsy Res. 2019, 153, 79–82. [Google Scholar]
  40. Dey, D.; Slomka, P.J.; Leeson, P.; Comaniciu, D.; Shrestha, S.; Sengupta, P.P.; Marwick, T.H. Artificial Intelligence in Cardiovascular Imaging: JACC State-of-the-Art Review. J. Am. Coll. Cardiol. 2019, 73, 1317–1335. [Google Scholar] [CrossRef]
  41. Hosny, A.; Parmar, C.; Quackenbush, J.; Schwartz, L.H.; Aerts, H. Artificial intelligence in radiology. Nat. Rev. Cancer 2018, 18, 500–510. [Google Scholar] [CrossRef]
  42. Chang, H.Y.; Jung, C.K.; Woo, J.I.; Lee, S.; Cho, J.; Kim, S.W.; Kwak, T.Y. Artificial Intelligence in Pathology. J. Pathol. Transl. Med. 2019, 53, 1–12. [Google Scholar] [CrossRef]
  43. Johnson, K.B.; Wei, W.Q.; Weeraratne, D.; Frisse, M.E.; Misulis, K.; Rhee, K.; Zhao, J.; Snowdon, J.L. Precision Medicine, AI, and the Future of Personalized Health Care. Clin. Transl. Sci. 2021, 14, 86–93. [Google Scholar] [CrossRef] [PubMed]
  44. Jones, S.; Anagnostou, V.; Lytle, K.; Parpart-Li, S.; Nesselbush, M.; Riley, D.R.; Shukla, M.; Chesnick, B.; Kadan, M.; Papp, E.; et al. Personalized genomic analyses for cancer mutation discovery and interpretation. Sci. Transl. Med. 2015, 7, 283ra253. [Google Scholar] [CrossRef] [PubMed]
  45. Kim, J.; Campbell, A.S.; de Ávila, B.E.; Wang, J. Wearable biosensors for healthcare monitoring. Nat. Biotechnol. 2019, 37, 389–406. [Google Scholar] [CrossRef]
  46. Sharma, A.; Badea, M.; Tiwari, S.; Marty, J.L. Wearable Biosensors: An Alternative and Practical Approach in Healthcare and Disease Monitoring. Molecules 2021, 26, 748. [Google Scholar] [CrossRef]
  47. Dinh-Le, C.; Chuang, R.; Chokshi, S.; Mann, D. Wearable Health Technology and Electronic Health Record Integration: Scoping Review and Future Directions. JMIR Mhealth Uhealth 2019, 7, e12861. [Google Scholar] [CrossRef] [PubMed]
  48. Neri, E.; Miele, V.; Coppola, F.; Grassi, R. Use of CT and artificial intelligence in suspected or COVID-19 positive patients: Statement of the Italian Society of Medical and Interventional Radiology. La Radiol. Medica 2020, 125, 505–508. [Google Scholar] [CrossRef]
  49. Hark, L.A.; Katz, L.J.; Myers, J.S.; Waisbourd, M.; Johnson, D.; Pizzi, L.T.; Leiby, B.E.; Fudemberg, S.J.; Mantravadi, A.V.; Henderer, J.D. Philadelphia telemedicine glaucoma detection and follow-up study: Methods and screening results. Am. J. Ophthalmol. 2017, 181, 114–124. [Google Scholar] [CrossRef]
  50. Miller, A.C.; Ward, M.M.; Ullrich, F.; Merchant, K.A.; Swanson, M.B.; Mohr, N.M. Emergency department telemedicine consults are associated with faster time-to-electrocardiogram and time-to-fibrinolysis for myocardial infarction patients. Telemed. E-Health 2020, 26, 1440–1448. [Google Scholar] [CrossRef] [PubMed]
  51. Knutsen, J.; Wolfe, A.; Burke, B.L.; Hepburn, S.; Lindgren, S.; Coury, D. A systematic review of telemedicine in autism spectrum disorders. Rev. J. Autism Dev. Disord. 2016, 3, 330–344. [Google Scholar] [CrossRef]
  52. Bouadjenek, M.R.; Verspoor, K.; Zobel, J. Automated detection of records in biological sequence databases that are inconsistent with the literature. J. Biomed. Inform. 2017, 71, 229–240. [Google Scholar] [CrossRef]
  53. Mortazavi, B.; Downing, N.; Bucholz, E.; Dharmarajan, K.; Manhapra, A. Analysis of machine learning techniques for heart failure readmissions. Circ. Cardiovasc. Qual. Outcomes 2016, 9, 629–640. [Google Scholar] [CrossRef]
  54. Niel, O.; Boussard, C.; Bastard, P. Artificial Intelligence Can Predict GFR Decline During the Course of ADPKD. Am. J. Kidney Dis. Off. J. Natl. Kidney Found. 2018, 71, 911–912. [Google Scholar] [CrossRef] [PubMed]
  55. Geddes, C.C.; Fox, J.G.; Allison, M.E.; Boulton-Jones, J.M.; Simpson, K. An artificial neural network can select patients at high risk of developing progressive IgA nephropathy more accurately than experienced nephrologists. Nephrol. Dial. Transplant. 1998, 13, 67–71. [Google Scholar] [CrossRef] [PubMed]
  56. Das, A.; Ben-Menachem, T.; Cooper, G.S.; Chak, A.; Sivak Jr, M.V.; Gonet, J.A.; Wong, R.C. Prediction of outcome in acute lower-gastrointestinal haemorrhage based on an artificial neural network: Internal and external validation of a predictive model. Lancet 2003, 362, 1261–1266. [Google Scholar] [CrossRef]
  57. Peng, J.C.; Ran, Z.H.; Shen, J. Seasonal variation in onset and relapse of IBD and a model to predict the frequency of onset, relapse, and severity of IBD based on artificial neural network. Int. J. Color. Dis. 2015, 30, 1267–1273. [Google Scholar] [CrossRef] [PubMed]
  58. Sato, F.; Shimada, Y.; Selaru, F.M.; Shibata, D.; Maeda, M.; Watanabe, G.; Mori, Y.; Stass, S.A.; Imamura, M.; Meltzer, S.J. Prediction of survival in patients with esophageal carcinoma using artificial neural networks. Cancer 2005, 103, 1596–1605. [Google Scholar] [CrossRef] [PubMed]
  59. Ichimasa, K.; Kudo, S.-e.; Mori, Y.; Misawa, M.; Matsudaira, S.; Kouyama, Y.; Baba, T.; Hidaka, E.; Wakamura, K.; Hayashi, T. Artificial intelligence may help in predicting the need for additional surgery after endoscopic resection of T1 colorectal cancer. Endoscopy 2018, 50, 230–240. [Google Scholar]
  60. Topalovic, M.; Das, N.; Burgel, P.-R.; Daenen, M.; Derom, E.; Haenebalcke, C.; Janssen, R.; Kerstjens, H.A.; Liistro, G.; Louis, R. Artificial intelligence outperforms pulmonologists in the interpretation of pulmonary function tests. Eur. Respir. J. 2019, 53, 1801660. [Google Scholar] [CrossRef] [PubMed]
  61. Fernández-Esparrach, G.; Bernal, J.; López-Cerón, M.; Córdova, H.; Sánchez-Montes, C.; De Miguel, C.R.; Sánchez, F.J. Exploring the clinical potential of an automatic colonic polyp detection method based on the creation of energy maps. Endoscopy 2016, 48, 837–842. [Google Scholar] [CrossRef]
  62. Haleem, A.; Javaid, M.; Singh, R.P.; Suman, R. Telemedicine for healthcare: Capabilities, features, barriers, and applications. Sens. Int. 2021, 2, 100117. [Google Scholar] [CrossRef]
  63. Salvador, M.; Valente, V. Digitalization in Telehealth: An integrative review. Int. J. Adv. Eng. Res. Sci. 2021, 8, 171–185. [Google Scholar] [CrossRef]
  64. Haleem, A.; Javaid, M.; Singh, R.P.; Suman, R.; Rab, S. Blockchain technology applications in healthcare: An overview. Int. J. Intell. Netw. 2021, 2, 130–139. [Google Scholar] [CrossRef]
  65. Arsenijevic, J.; Tummers, L.; Bosma, N. Adherence to Electronic Health Tools Among Vulnerable Groups: Systematic Literature Review and Meta-Analysis. J. Med. Internet Res. 2020, 22, e11613. [Google Scholar] [CrossRef]
  66. Zauderer, M.G.; Gucalp, A.; Epstein, A.S.; Seidman, A.D.; Caroline, A.; Granovsky, S.; Fu, J.; Keesing, J.; Lewis, S.; Co, H.; et al. Piloting IBM Watson Oncology within Memorial Sloan Kettering’s regional network. J. Clin. Oncol. 2014, 32, e17653. [Google Scholar] [CrossRef]
  67. Combi, C.; Pozzi, G. Clinical Information Systems and Artificial Intelligence: Recent Research Trends. Yearb. Med. Inform. 2019, 28, 83–94. [Google Scholar] [CrossRef]
  68. Ahuja, A.S. The impact of artificial intelligence in medicine on the future role of the physician. PeerJ 2019, 7, e7702. [Google Scholar] [CrossRef] [PubMed]
  69. Davenport, T.; Kalakota, R. The potential for artificial intelligence in healthcare. Future Health J. 2019, 6, 94–98. [Google Scholar] [CrossRef] [PubMed]
  70. Kang, D.-Y.; Cho, K.-J.; Kwon, O.; Kwon, J.-m.; Jeon, K.-H.; Park, H.; Lee, Y.; Park, J.; Oh, B.-H. Artificial intelligence algorithm to predict the need for critical care in prehospital emergency medical services. Scand. J. Trauma Resusc. Emerg. Med. 2020, 28, 17. [Google Scholar] [CrossRef]
  71. Winn, A.N.; Somai, M.; Fergestrom, N.; Crotty, B.H. Association of use of online symptom checkers with patients’ plans for seeking care. JAMA Netw. Open 2019, 2, e1918561. [Google Scholar] [CrossRef] [PubMed]
  72. Liu, Z.; Roberts, R.A.; Lal-Nag, M.; Chen, X.; Huang, R.; Tong, W. AI-based language models powering drug discovery and development. Drug Discov. Today 2021, 26, 2593–2607. [Google Scholar] [CrossRef]
  73. Hu, H.-Z.; Feng, X.-B.; Shao, Z.-W.; Xie, M.; Xu, S.; Wu, X.-H.; Ye, Z.-W. Application and prospect of mixed reality technology in medical field. Curr. Med. Sci. 2019, 39, 1–6. [Google Scholar] [CrossRef] [PubMed]
  74. Moro, C.; Štromberga, Z.; Raikos, A.; Stirling, A. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anat. Sci. Educ. 2017, 10, 549–559. [Google Scholar] [CrossRef]
  75. Howard, M.C. A meta-analysis and systematic literature review of virtual reality rehabilitation programs. Comput. Hum. Behav. 2017, 70, 317–327. [Google Scholar] [CrossRef]
  76. Ishay, Y.; Potruch, A.; Schwartz, A.; Berg, M.; Jamil, K.; Agus, S.; Ilan, Y. A digital health platform for assisting the diagnosis and monitoring of COVID-19 progression: An adjuvant approach for augmenting the antiviral response and mitigating the immune-mediated target organ damage. Biomed. Pharmacother. 2021, 143, 112228. [Google Scholar] [CrossRef] [PubMed]
  77. Massoudi, B.L.; Sobolevskaia, D. Keep Moving Forward: Health Informatics and Information Management beyond the COVID-19 Pandemic. Yearb. Med. Inform. 2021, 30, 75–83. [Google Scholar] [CrossRef] [PubMed]
  78. Manteghinejad, A.; Javanmard, S.H. Challenges and opportunities of digital health in a post-COVID19 world. J. Res. Med. Sci. 2021, 26, 11. [Google Scholar] [CrossRef] [PubMed]
  79. Chou, E.; Hsieh, Y.-L.; Wolfshohl, J.; Green, F.; Bhakta, T. Onsite telemedicine strategy for coronavirus (COVID-19) screening to limit exposure in ED. Emerg. Med. J. 2020, 37, 335–337. [Google Scholar] [CrossRef] [PubMed]
  80. Seshadri, D.R.; Davies, E.V.; Harlow, E.R.; Hsu, J.J.; Knighton, S.C.; Walker, T.A.; Voos, J.E.; Drummond, C.K. Wearable sensors for COVID-19: A call to action to harness our digital infrastructure for remote patient monitoring and virtual assessments. Front. Digit. Health 2020, 2, 8. [Google Scholar] [CrossRef]
  81. Grossman, Z.; Chodick, G.; Reingold, S.M.; Chapnick, G.; Ashkenazi, S. The future of telemedicine visits after COVID-19: Perceptions of primary care pediatricians. Isr. J. Health Policy Res. 2020, 9, 53. [Google Scholar] [CrossRef]
  82. Evans, R.S. Electronic Health Records: Then, Now, and in the Future. Yearb. Med. Inform. 2016, 25 (Suppl. 1), S48–S61. [Google Scholar] [CrossRef]
  83. Bohr, A.; Memarzadeh, K. The rise of artificial intelligence in healthcare applications. In Artificial Intelligence in Healthcare; Academic Press: Cambridge, MA, USA, 2020; pp. 25–60. [Google Scholar] [CrossRef]
  84. Shara, N.; Bjarnadottir, M.V.; Falah, N.; Chou, J.; Alqutri, H.S.; Asch, F.M.; Anderson, K.M.; Bennett, S.S.; Kuhn, A.; Montalvo, B.; et al. Voice activated remote monitoring technology for heart failure patients: Study design, feasibility and observations from a pilot randomized control trial. PLoS ONE 2022, 17, e0267794. [Google Scholar] [CrossRef]
  85. Jadczyk, T.; Wojakowski, W.; Tendera, M.; Henry, T.D.; Egnaczyk, G.; Shreenivas, S. Artificial Intelligence Can Improve Patient Management at the Time of a Pandemic: The Role of Voice Technology. J. Med. Internet Res. 2021, 23, e22959. [Google Scholar] [CrossRef]
  86. Tahri Sqalli, M.; Al-Thani, D. On How Chronic Conditions Affect the Patient-AI Interaction: A Literature Review. Healthcare 2020, 8, 313. [Google Scholar] [CrossRef]
  87. Viswanathan, M.; Golin, C.E.; Jones, C.D.; Ashok, M.; Blalock, S.J.; Wines, R.C.; Coker-Schwimmer, E.J.; Rosen, D.L.; Sista, P.; Lohr, K.N. Interventions to improve adherence to self-administered medications for chronic diseases in the United States: A systematic review. Ann. Intern. Med. 2012, 157, 785–795. [Google Scholar] [CrossRef]
  88. Babel, A.; Taneja, R.; Mondello Malvestiti, F.; Monaco, A.; Donde, S. Artificial Intelligence Solutions to Increase Medication Adherence in Patients With Non-communicable Diseases. Front. Digit. Health 2021, 3, 669869. [Google Scholar] [CrossRef] [PubMed]
  89. Lee, K.; Seo, L.; Yoon, D.; Yang, K.; Yi, J.E.; Kim, Y.; Lee, J.H. Digital Health Profile of South Korea: A Cross Sectional Study. Int. J. Environ. Res. Public Health 2022, 19, 6329. [Google Scholar] [CrossRef] [PubMed]
  90. Bhaskar, S.; Bradley, S.; Chattu, V.K.; Adisesh, A.; Nurtazina, A.; Kyrykbayeva, S.; Sakhamuri, S.; Moguilner, S.; Pandya, S.; Schroeder, S.; et al. Telemedicine as the New Outpatient Clinic Gone Digital: Position Paper From the Pandemic Health System REsilience PROGRAM (REPROGRAM) International Consortium (Part 2). Front. Public Health 2020, 8, 410. [Google Scholar] [CrossRef]
  91. Adamo, J.E.; Bienvenu Ii, R.V.; Dolz, F.; Liebman, M.; Nilsen, W.; Steele, S.J. Translation of Digital Health Technologies to Advance Precision Medicine: Informing Regulatory Science. Digit. Biomark. 2020, 4, 1–12. [Google Scholar] [CrossRef]
  92. Khanna, R.K.; Ducloyer, J.B.; Hage, A.; Rezkallah, A.; Durbant, E.; Bigoteau, M.; Mouchel, R.; Guillon-Rolf, R.; Le, L.; Tahiri, R.; et al. Evaluating the potential of ChatGPT-4 in ophthalmology: The good, the bad and the ugly. J. Fr. Ophtalmol. 2023. [Google Scholar] [CrossRef]
  93. Nori, H.; King, N.; McKinney, S.M.; Carignan, D.; Horvitz, E. Capabilities of GPT-4 on Medical Challenge Problems. arXiv 2023, arXiv:2303.13375. [Google Scholar]
  94. Biswas, S. ChatGPT and the Future of Medical Writing. Radiology 2023, 307, e223312. [Google Scholar] [CrossRef] [PubMed]
  95. Khare, Y.; Bagal, V.; Mathew, M.; Devi, A.; Priyakumar, U.D.; Jawahar, C. MMBERT: Multimodal BERT Pretraining for Improved Medical VQA. In Proceedings of the 2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), Nice, France, 13–16 April 2021; pp. 1033–1036. [Google Scholar]
  96. Zhang, N.; Jankowski, M. Hierarchical BERT for medical document understanding. arXiv 2022, arXiv:2204.09600. [Google Scholar]
  97. Chassagnon, G.; Billet, N.; Rutten, C.; Toussaint, T.; Cassius de Linval, Q.; Collin, M.; Lemouchi, L.; Homps, M.; Hedjoudje, M.; Ventre, J.; et al. Learning from the machine: AI assistance is not an effective learning tool for resident education in chest X-ray interpretation. Eur. Radiol. 2023. [Google Scholar] [CrossRef]
  98. Lecler, A.; Duron, L.; Soyer, P. Revolutionizing radiology with GPT-based models: Current applications, future possibilities and limitations of ChatGPT. Diagn. Interv. Imaging 2023, 104, 269–274. [Google Scholar] [CrossRef] [PubMed]
  99. Schiavone, F.; Ferretti, M. The FutureS of healthcare. Futures 2021, 134, 102849. [Google Scholar] [CrossRef]
  100. Meskó, B.; Drobni, Z.; Bényei, É.; Gergely, B.; Győrffy, Z. Digital health is a cultural transformation of traditional healthcare. mHealth 2017, 3, 38. [Google Scholar] [CrossRef]
  101. Van Velthoven, M.H.; Cordon, C. Sustainable Adoption of Digital Health Innovations: Perspectives From a Stakeholder Workshop. J. Med. Internet Res. 2019, 21, e11922. [Google Scholar] [CrossRef] [PubMed]
  102. Brinster, C.J.; Escousse, G.T.; Rivera, P.A.; Sternbergh, W.C., 3rd; Money, S.R. Drastic increase in hospital labor costs led to a sustained financial loss for an academic vascular surgery division during the coronavirus disease 2019 pandemic. J. Vasc. Surg. 2022, 76, 1710–1718. [Google Scholar] [CrossRef] [PubMed]
  103. Boniol, M.; Kunjumen, T.; Nair, T.S.; Siyam, A.; Campbell, J.; Diallo, K. The global health workforce stock and distribution in 2020 and 2030: A threat to equity and ‘universal’ health coverage? BMJ Glob. Health 2022, 7, e009316. [Google Scholar] [CrossRef] [PubMed]
  104. Scheffler, R.M.; Liu, J.X.; Kinfu, Y.; Dal Poz, M.R. Forecasting the global shortage of physicians: An economic- and needs-based approach. Bull. World Health Organ. 2008, 86, 516–523b. [Google Scholar] [CrossRef]
  105. Gaughan, J.; Liu, D.; Gutacker, N.; Bloor, K.; Doran, T.; Benger, J.R. Does the presence of general practitioners in emergency departments affect quality and safety in English NHS hospitals? A retrospective observational study. BMJ Open 2022, 12, e055976. [Google Scholar] [CrossRef]
  106. Waters, A. Government is to blame for lowest NHS satisfaction rating in 25 years, says BMA. BMJ 2022, 376, o836. [Google Scholar] [CrossRef]
  107. Cristea, M.; Noja, G.G.; Stefea, P.; Sala, A.L. The Impact of Population Aging and Public Health Support on EU Labor Markets. Int. J. Environ. Res. Public Health 2020, 17, 1439. [Google Scholar] [CrossRef] [PubMed]
  108. Hurvitz, N.; Azmanov, H.; Kesler, A.; Ilan, Y. Establishing a second-generation artificial intelligence-based system for improving diagnosis, treatment, and monitoring of patients with rare diseases. Eur. J. Hum. Genet. 2021, 29, 1485–1490. [Google Scholar] [CrossRef]
  109. Hajat, C.; Stein, E. The global burden of multiple chronic conditions: A narrative review. Prev. Med. Rep. 2018, 12, 284–293. [Google Scholar] [CrossRef]
  110. McFarland, S.; Coufopolous, A.; Lycett, D. The effect of telehealth versus usual care for home-care patients with long-term conditions: A systematic review, meta-analysis and qualitative synthesis. J. Telemed. Telecare 2021, 27, 69–87. [Google Scholar] [CrossRef] [PubMed]
  111. Galea, S.; Maani, N. The cost of preventable disease in the USA. Lancet Public Health 2020, 5, e513–e514. [Google Scholar] [CrossRef]
  112. Pahus, L.; Suehs, C.M.; Halimi, L.; Bourdin, A.; Chanez, P.; Jaffuel, D.; Marciano, J.; Gamez, A.S.; Vachier, I.; Molinari, N. Patient distrust in pharmaceutical companies: An explanation for women under-representation in respiratory clinical trials? BMC Med. Ethics 2020, 21, 72. [Google Scholar] [CrossRef] [PubMed]
  113. Silfee, V.; Williams, K.; Leber, B.; Kogan, J.; Nikolajski, C.; Szigethy, E.; Serio, C. Health Care Provider Perspectives on the Use of a Digital Behavioral Health App to Support Patients: Qualitative Study. JMIR Form. Res. 2021, 5, e28538. [Google Scholar] [CrossRef] [PubMed]
  114. Bodenheimer, T. The political divide in health care: A liberal perspective. Health Aff. 2005, 24, 1426–1435. [Google Scholar] [CrossRef]
  115. Łukasik, M.; Porębska, A. Responsiveness and Adaptability of Healthcare Facilities in Emergency Scenarios: COVID-19 Experience. Int. J. Environ. Res. Public Health 2022, 19, 675. [Google Scholar] [CrossRef]
  116. Tagde, P.; Tagde, S.; Bhattacharya, T.; Tagde, P.; Chopra, H.; Akter, R.; Kaushik, D.; Rahman, M.H. Blockchain and artificial intelligence technology in e-Health. Environ. Sci. Pollut. Res. Int. 2021, 28, 52810–52831. [Google Scholar] [CrossRef]
  117. Pap, I.A.; Oniga, S. A Review of Converging Technologies in eHealth Pertaining to Artificial Intelligence. Int. J. Environ. Res. Public Health 2022, 19, 11413. [Google Scholar] [CrossRef]
  118. Taylor, K.I.; Staunton, H.; Lipsmeier, F.; Nobbs, D.; Lindemann, M. Outcome measures based on digital health technology sensor data: Data- and patient-centric approaches. NPJ Digit. Med. 2020, 3, 97. [Google Scholar] [CrossRef]
  119. Edgman-Levitan, S.; Schoenbaum, S.C. Patient-centered care: Achieving higher quality by designing care through the patient’s eyes. Isr. J. Health Policy Res. 2021, 10, 21. [Google Scholar] [CrossRef] [PubMed]
  120. Yusuff, K.B.; Mustafa, M.; Al-Qahtani, N.H. Prevalence, types and severity of medication errors associated with the use of automated medication use systems in ambulatory and institutionalized care settings: A systematic review protocol. PLoS ONE 2021, 16, e0260992. [Google Scholar] [CrossRef]
  121. Gianfrancesco, M.A.; Tamang, S.; Yazdany, J.; Schmajuk, G. Potential Biases in Machine Learning Algorithms Using Electronic Health Record Data. JAMA Intern. Med. 2018, 178, 1544–1547. [Google Scholar] [CrossRef] [PubMed]
  122. Roth, J.A.; Battegay, M.; Juchler, F.; Vogt, J.E.; Widmer, A.F. Introduction to Machine Learning in Digital Healthcare Epidemiology. Infect. Control Hosp. Epidemiol. 2018, 39, 1457–1462. [Google Scholar] [CrossRef]
  123. Triantafyllidis, A.K.; Tsanas, A. Applications of Machine Learning in Real-Life Digital Health Interventions: Review of the Literature. J. Med. Internet Res. 2019, 21, e12286. [Google Scholar] [CrossRef]
  124. Ellahham, S.; Ellahham, N.; Simsekler, M.C.E. Application of Artificial Intelligence in the Health Care Safety Context: Opportunities and Challenges. Am. J. Med. Qual. 2020, 35, 341–348. [Google Scholar] [CrossRef]
  125. Deo, R.C. Machine Learning in Medicine. Circulation 2015, 132, 1920–1930. [Google Scholar] [CrossRef]
  126. Alanazi, A. Using machine learning for healthcare challenges and opportunities. Inform. Med. Unlocked 2022, 30, 100924. [Google Scholar] [CrossRef]
  127. Norori, N.; Hu, Q.; Aellen, F.M.; Faraci, F.D.; Tzovara, A. Addressing bias in big data and AI for health care: A call for open science. Patterns 2021, 2, 100347. [Google Scholar] [CrossRef]
  128. Egger, J.; Gsaxner, C.; Pepe, A.; Pomykala, K.L.; Jonske, F.; Kurz, M.; Li, J.; Kleesiek, J. Medical deep learning-A systematic meta-review. Comput. Methods Programs Biomed. 2022, 221, 106874. [Google Scholar] [CrossRef]
  129. Verheij, R.A.; Curcin, V.; Delaney, B.C.; McGilchrist, M.M. Possible Sources of Bias in Primary Care Electronic Health Record Data Use and Reuse. J. Med. Internet Res. 2018, 20, e185. [Google Scholar] [CrossRef]
  130. Senbekov, M.; Saliev, T.; Bukeyeva, Z.; Almabayeva, A.; Zhanaliyeva, M.; Aitenova, N.; Toishibekov, Y.; Fakhradiyev, I. The Recent Progress and Applications of Digital Technologies in Healthcare: A Review. Int. J. Telemed. Appl. 2020, 2020, 8830200. [Google Scholar] [CrossRef]
  131. Magalhães, T.; Dinis-Oliveira, R.J.; Taveira-Gomes, T. Digital Health and Big Data Analytics: Implications of Real-World Evidence for Clinicians and Policymakers. Int. J. Environ. Res. Public Health 2022, 19, 8364. [Google Scholar] [CrossRef] [PubMed]
  132. Blanco, J.R.; Verdugo-Sivianes, E.M.; Amiama, A.; Munoz-Galvan, S. The circadian rhythm of viruses and its implications on susceptibility to infection. Expert. Rev. Anti-Infect. Ther. 2022, 20, 1109–1117. [Google Scholar] [CrossRef]
  133. de Hond, A.A.H.; Leeuwenberg, A.M.; Hooft, L.; Kant, I.M.J.; Nijman, S.W.J.; van Os, H.J.A.; Aardoom, J.J.; Debray, T.P.A.; Schuit, E.; van Smeden, M.; et al. Guidelines and quality criteria for artificial intelligence-based prediction models in healthcare: A scoping review. NPJ Digit. Med. 2022, 5, 2. [Google Scholar] [CrossRef]
  134. Osei, E.; Agyei, K.; Tlou, B.; Mashamba-Thompson, T.P. Availability and Use of Mobile Health Technology for Disease Diagnosis and Treatment Support by Health Workers in the Ashanti Region of Ghana: A Cross-Sectional Survey. Diagnostics 2021, 11, 1233. [Google Scholar] [CrossRef] [PubMed]
  135. Celi, L.A.; Cellini, J.; Charpignon, M.-L.; Dee, E.C.; Dernoncourt, F.; Eber, R.; Mitchell, W.G.; Moukheiber, L.; Schirmer, J.; Situ, J.; et al. Sources of bias in artificial intelligence that perpetuate healthcare disparities—A global review. PLoS Digit. Health 2022, 1, e0000022. [Google Scholar] [CrossRef]
  136. Curioso, W.H. Building Capacity and Training for Digital Health: Challenges and Opportunities in Latin America. J. Med. Internet Res. 2019, 21, e16513. [Google Scholar] [CrossRef] [PubMed]
  137. Chen, I.Y.; Szolovits, P.; Ghassemi, M. Can AI help reduce disparities in general medical and mental health care? AMA J. Ethics 2019, 21, 167–179. [Google Scholar]
  138. Cohen, A.B.; Dorsey, E.R.; Mathews, S.C.; Bates, D.W.; Safavi, K. A digital health industry cohort across the health continuum. NPJ Digit. Med. 2020, 3, 68. [Google Scholar] [CrossRef]
  139. Adjekum, A.; Blasimme, A.; Vayena, E. Elements of Trust in Digital Health Systems: Scoping Review. J. Med. Internet Res. 2018, 20, e11254. [Google Scholar] [CrossRef]
  140. Wang, Y.; Blobel, B.; Yang, B. Reinforcing Health Data Sharing through Data Democratization. J. Pers. Med. 2022, 12, 1380. [Google Scholar] [CrossRef]
  141. Szarfman, A.; Levine, J.G.; Tonning, J.M.; Weichold, F.; Bloom, J.C.; Soreth, J.M.; Geanacopoulos, M.; Callahan, L.; Spotnitz, M.; Ryan, Q.; et al. Recommendations for achieving interoperable and shareable medical data in the USA. Commun. Med. 2022, 2, 86. [Google Scholar] [CrossRef] [PubMed]
  142. Brewer, L.C.; Fortuna, K.L.; Jones, C.; Walker, R.; Hayes, S.N.; Patten, C.A.; Cooper, L.A. Back to the Future: Achieving Health Equity Through Health Informatics and Digital Health. JMIR Mhealth Uhealth 2020, 8, e14512. [Google Scholar] [CrossRef]
  143. Cordeiro, J.V. Digital Technologies and Data Science as Health Enablers: An Outline of Appealing Promises and Compelling Ethical, Legal, and Social Challenges. Front. Med. 2021, 8, 647897. [Google Scholar] [CrossRef] [PubMed]
  144. Gianfrancesco, M.A.; Goldstein, N.D. A narrative review on the validity of electronic health record-based research in epidemiology. BMC Med. Res. Methodol. 2021, 21, 234. [Google Scholar] [CrossRef] [PubMed]
  145. Mahajan, S.; Lu, Y.; Spatz, E.S.; Nasir, K.; Krumholz, H.M. Trends and Predictors of Use of Digital Health Technology in the United States. Am. J. Med. 2021, 134, 129–134. [Google Scholar] [CrossRef]
  146. Ilan, Y. Second-Generation Digital Health Platforms: Placing the Patient at the Center and Focusing on Clinical Outcomes. Front. Digit. Health 2020, 2, 569178. [Google Scholar] [CrossRef]
  147. Cozzoli, N.; Salvatore, F.P.; Faccilongo, N.; Milone, M. How can big data analytics be used for healthcare organization management? Literary framework and future research from a systematic review. BMC Health Serv. Res. 2022, 22, 809. [Google Scholar] [CrossRef]
  148. Dolley, S. Big Data’s Role in Precision Public Health. Front. Public Health 2018, 6, 68. [Google Scholar] [CrossRef]
  149. Javaid, M.; Haleem, A.; Pratap Singh, R.; Suman, R.; Rab, S. Significance of machine learning in healthcare: Features, pillars and applications. Int. J. Intell. Netw. 2022, 3, 58–73. [Google Scholar] [CrossRef]
  150. Peek, S.T.M.; Wouters, E.J.; Luijkx, K.G.; Vrijhoef, H.J. What it takes to successfully implement technology for aging in place: Focus groups with stakeholders. J. Med. Internet Res. 2016, 18, e5253. [Google Scholar] [CrossRef] [PubMed]
  151. Wu, Y.-H.; Damnée, S.; Kerhervé, H.; Ware, C.; Rigaud, A.-S. Bridging the digital divide in older adults: A study from an initiative to inform older adults about new technologies. Clin. Interv. Aging 2015, 10, 193. [Google Scholar] [CrossRef] [PubMed]
  152. Singh, R.P.; Hom, G.L.; Abramoff, M.D.; Campbell, J.P.; Chiang, M.F. Current Challenges and Barriers to Real-World Artificial Intelligence Adoption for the Healthcare System, Provider, and the Patient. Transl. Vis. Sci. Technol. 2020, 9, 45. [Google Scholar] [CrossRef]
  153. Aung, Y.Y.; Wong, D.C.; Ting, D.S. The promise of artificial intelligence: A review of the opportunities and challenges of artificial intelligence in healthcare. Br. Med. Bull. 2021, 139, 4–15. [Google Scholar] [CrossRef] [PubMed]
  154. Di Nucci, E. Should we be afraid of medical AI? J. Med. Ethics 2019, 45, 556–558. [Google Scholar] [CrossRef] [PubMed]
  155. Shaheen, M.Y. AI in Healthcare: Medical and socio-economic benefits and challenges. ScienceOpen Preprints 2021. [Google Scholar] [CrossRef]
  156. Dalton-Brown, S. The Ethics of Medical AI and the Physician-Patient Relationship. Camb. Q. Healthc. Ethics 2020, 29, 115–121. [Google Scholar] [CrossRef]
  157. Bhandari, M.; Reddiboina, M. Augmented intelligence: A synergy between man and the machine. Indian J. Urol. 2019, 35, 89–91. [Google Scholar] [CrossRef]
  158. Batko, K.; Ślęzak, A. The use of Big Data Analytics in healthcare. J. Big Data 2022, 9, 3. [Google Scholar] [CrossRef] [PubMed]
  159. Linardatos, P.; Papastefanopoulos, V.; Kotsiantis, S. Explainable AI: A Review of Machine Learning Interpretability Methods. Entropy 2020, 23, 18. [Google Scholar] [CrossRef] [PubMed]
  160. Mosadeghrad, A.M. Factors influencing healthcare service quality. Int. J. Health Policy Manag. 2014, 3, 77–89. [Google Scholar] [CrossRef]
  161. Thimbleby, H. Technology and the future of healthcare. J. Public Health Res. 2013, 2, e28. [Google Scholar] [CrossRef]
  162. Hardy, A.; Wojdecka, A.; West, J.; Matthews, E.; Golby, C.; Ward, T.; Lopez, N.D.; Freeman, D.; Waller, H.; Kuipers, E. How inclusive, user-centered design research can improve psychological therapies for psychosis: Development of SlowMo. JMIR Ment. Health 2018, 5, e11222. [Google Scholar] [CrossRef]
  163. Maramba, I.; Chatterjee, A.; Newman, C. Methods of usability testing in the development of eHealth applications: A scoping review. Int. J. Med. Inform. 2019, 126, 95–104. [Google Scholar] [CrossRef] [PubMed]
  164. Panch, T.; Mattie, H.; Celi, L.A. The “inconvenient truth” about AI in healthcare. NPJ Digit. Med. 2019, 2, 77. [Google Scholar] [CrossRef]
  165. Chandra, S.; Verma, S.; Lim, W.M.; Kumar, S.; Donthu, N. Personalization in personalized marketing: Trends and ways forward. Psychol. Mark. 2022, 39, 1529–1562. [Google Scholar] [CrossRef]
  166. O’Connor, S.; Hanlon, P.; O’Donnell, C.A.; Garcia, S.; Glanville, J.; Mair, F.S. Understanding factors affecting patient and public engagement and recruitment to digital health interventions: A systematic review of qualitative studies. BMC Med. Inform. Decis. Mak. 2016, 16, 120. [Google Scholar] [CrossRef]
  167. Johnson, D.; Goodman, R.; Patrinely, J.; Stone, C.; Zimmerman, E.; Donald, R.; Chang, S.; Berkowitz, S.; Finn, A.; Jahangir, E. Assessing the accuracy and reliability of AI-generated medical responses: An evaluation of the Chat-GPT model. Res. Sq. 2023, preprint. [Google Scholar] [CrossRef]
  168. Greenhalgh, T.; Wherton, J.; Papoutsi, C.; Lynch, J.; Hughes, G.; A’Court, C.; Hinder, S.; Fahy, N.; Procter, R.; Shaw, S. Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies. J. Med. Internet Res. 2017, 19, e367. [Google Scholar] [CrossRef]
  169. Malcarney, M.B.; Horton, K.; Seiler, N.; Hastings, D. Advancing the Public’s Health by Scaling Innovations in Clinical Quality. Public Health Rep. 2017, 132, 512–517. [Google Scholar] [CrossRef]
  170. Tobia, K.; Nielsen, A.; Stremitzer, A. When Does Physician Use of AI Increase Liability? J. Nucl. Med. Off. Publ. Soc. Nucl. Med. 2021, 62, 17–21. [Google Scholar] [CrossRef]
  171. Modgil, S.; Singh, R.K.; Gupta, S.; Dennehy, D. A Confirmation Bias View on Social Media Induced Polarisation During COVID-19. Inf. Syst. Front. J. Res. Innov. 2021, 1–25. [Google Scholar] [CrossRef] [PubMed]
  172. Thornhill, C.; Meeus, Q.; Peperkamp, J.; Berendt, B. A Digital Nudge to Counter Confirmation Bias. Front. Big Data 2019, 2, 11. [Google Scholar] [CrossRef] [PubMed]
  173. Waisberg, E.; Ong, J.; Masalkhi, M.; Kamran, S.A.; Zaman, N.; Sarker, P.; Lee, A.G.; Tavakkoli, A. GPT-4: A new era of artificial intelligence in medicine. Ir. J. Med. Sci. 2023, 1–4. [Google Scholar] [CrossRef] [PubMed]
  174. van Panhuis, W.G.; Paul, P.; Emerson, C.; Grefenstette, J.; Wilder, R.; Herbst, A.J.; Heymann, D.; Burke, D.S. A systematic review of barriers to data sharing in public health. BMC Public Health 2014, 14, 1144. [Google Scholar] [CrossRef]
  175. Ahmad, O.F.; Stoyanov, D.; Lovat, L.B. Barriers and pitfalls for artificial intelligence in gastroenterology: Ethical and regulatory issues. Tech. Innov. Gastrointest. Endosc. 2020, 22, 80–84. [Google Scholar] [CrossRef]
  176. Madore, C.; Yin, Z.; Leibowitz, J.; Butovsky, O. Microglia, Lifestyle Stress, and Neurodegeneration. Immunity 2020, 52, 222–240. [Google Scholar] [CrossRef]
  177. Chevrier, R.; Foufi, V.; Gaudet-Blavignac, C.; Robert, A.; Lovis, C. Use and Understanding of Anonymization and De-Identification in the Biomedical Literature: Scoping Review. J. Med. Internet Res. 2019, 21, e13484. [Google Scholar] [CrossRef]
  178. Vayena, E.; Mastroianni, A.; Kahn, J. Caught in the web: Informed consent for online health research. Sci. Transl. Med. 2013, 5, 173fs176. [Google Scholar] [CrossRef]
  179. World Health Organization. Global Diffusion of eHealth: Making Universal Health Coverage Achievable: Report of the Third Global Survey on eHealth; World Health Organization: Geneva, Switzerland, 2017. [Google Scholar]
  180. Nguyen, G.; Dlugolinsky, S.; Bobák, M.; Tran, V.; López García, Á.; Heredia, I.; Malík, P.; Hluchý, L. Machine Learning and Deep Learning frameworks and libraries for large-scale data mining: A survey. Artif. Intell. Rev. 2019, 52, 77–124. [Google Scholar] [CrossRef]
  181. Schäferhoff, M.; Zimmerman, A.; Diab, M.M.; Mao, W.; Chowdhary, V.; Gill, D.; Karanja, R.; Madikizela, M.; Ogbuoji, O.; Yamey, G. Investing in late-stage clinical trials and manufacturing of product candidates for five major infectious diseases: A modelling study of the benefits and costs of investment in three middle-income countries. Lancet Glob. Health 2022, 10, e1045–e1052. [Google Scholar] [CrossRef]
  182. Macdonald, J.C.; Isom, D.C.; Evans, D.D.; Page, K.J. Digital Innovation in Medicinal Product Regulatory Submission, Review, and Approvals to Create a Dynamic Regulatory Ecosystem-Are We Ready for a Revolution? Front. Med. 2021, 8, 660808. [Google Scholar] [CrossRef]
  183. McPhail, S.M. Multimorbidity in chronic disease: Impact on health care resources and costs. Risk Manag. Healthc. Policy 2016, 9, 143–156. [Google Scholar] [CrossRef] [PubMed]
  184. Swartz, A.; LeFevre, A.E.; Perera, S.; Kinney, M.V.; George, A.S. Multiple pathways to scaling up and sustainability: An exploration of digital health solutions in South Africa. Glob. Health 2021, 17, 77. [Google Scholar] [CrossRef]
  185. Kwee, A.; Teo, Z.L.; Ting, D.S.W. Digital health in medicine: Important considerations in evaluating health economic analysis. Lancet Reg. Health. West. Pac. 2022, 23, 100476. [Google Scholar] [CrossRef]
  186. Teisberg, E.; Wallace, S.; O’Hara, S. Defining and Implementing Value-Based Health Care: A Strategic Framework. Acad. Med. 2020, 95, 682–685. [Google Scholar] [CrossRef] [PubMed]
  187. Adir, O.; Poley, M.; Chen, G.; Froim, S.; Krinsky, N.; Shklover, J.; Shainsky-Roitman, J.; Lammers, T.; Schroeder, A. Integrating Artificial Intelligence and Nanotechnology for Precision Cancer Medicine. Adv. Mater. 2020, 32, 1901989. [Google Scholar] [CrossRef]
  188. Gomez Rossi, J.; Feldberg, B.; Krois, J.; Schwendicke, F. Evaluation of the Clinical, Technical, and Financial Aspects of Cost-Effectiveness Analysis of Artificial Intelligence in Medicine: Scoping Review and Framework of Analysis. JMIR Med. Inf. 2022, 10, e33703. [Google Scholar] [CrossRef] [PubMed]
  189. Sloane, E.B.; Silva, R.J. Chapter 83—Artificial intelligence in medical devices and clinical decision support systems. In Clinical Engineering Handbook, 2nd ed.; Iadanza, E., Ed.; Academic Press: Cambridge, MA, USA, 2020; pp. 556–568. [Google Scholar]
  190. Mbunge, E.; Batani, J.; Gaobotse, G.; Muchemwa, B. Virtual healthcare services and digital health technologies deployed during coronavirus disease 2019 (COVID-19) pandemic in South Africa: A systematic review. Glob. Health J. 2022, 6, 102–113. [Google Scholar] [CrossRef]
  191. Dorsey, E.R. The new platforms of health care. NPJ Digit. Med. 2021, 4, 112. [Google Scholar] [CrossRef]
  192. Iyamu, I.; Gómez-Ramírez, O.; Xu, A.X.; Chang, H.J.; Watt, S.; McKee, G.; Gilbert, M. Challenges in the development of digital public health interventions and mapped solutions: Findings from a scoping review. Digit. Health 2022, 8, 20552076221102255. [Google Scholar] [CrossRef] [PubMed]
  193. Thomford, N.E.; Bope, C.D.; Agamah, F.E.; Dzobo, K.; Owusu Ateko, R.; Chimusa, E.; Mazandu, G.K.; Ntumba, S.B.; Dandara, C.; Wonkam, A. Implementing Artificial Intelligence and Digital Health in Resource-Limited Settings? Top 10 Lessons We Learned in Congenital Heart Defects and Cardiology. Omics A J. Integr. Biol. 2020, 24, 264–277. [Google Scholar] [CrossRef]
  194. Kumar, Y.; Koul, A.; Singla, R.; Ijaz, M.F. Artificial intelligence in disease diagnosis: A systematic literature review, synthesizing framework and future research agenda. J. Ambient Intell. Humaniz. Comput. 2022, 14, 8459–8486. [Google Scholar] [CrossRef]
  195. He, J.; Baxter, S.L.; Xu, J.; Xu, J.; Zhou, X.; Zhang, K. The practical implementation of artificial intelligence technologies in medicine. Nat. Med. 2019, 25, 30–36. [Google Scholar] [CrossRef]
  196. Shuren, J.; Patel, B.; Gottlieb, S. FDA Regulation of Mobile Medical Apps. JAMA 2018, 320, 337–338. [Google Scholar] [CrossRef]
  197. Aguirre, R.R.; Suarez, O.; Fuentes, M.; Sanchez-Gonzalez, M.A. Electronic Health Record Implementation: A Review of Resources and Tools. Cureus 2019, 11, e5649. [Google Scholar] [CrossRef] [PubMed]
  198. Murray, E.; Hekler, E.B.; Andersson, G.; Collins, L.M.; Doherty, A.; Hollis, C.; Rivera, D.E.; West, R.; Wyatt, J.C. Evaluating Digital Health Interventions: Key Questions and Approaches. Am. J. Prev. Med. 2016, 51, 843–851. [Google Scholar] [CrossRef] [PubMed]
  199. Yoon, J.; Lee, M.; Ahn, J.S.; Oh, D.; Shin, S.Y.; Chang, Y.J.; Cho, J. Development and Validation of Digital Health Technology Literacy Assessment Questionnaire. J. Med. Syst. 2022, 46, 13. [Google Scholar] [CrossRef]
  200. Hayden, J.; van der Windt, D.; Cartwright, J.; Côté, P.; Bombardier, C. Assessing Bias in Studies of Prognostic Factors. Ann. Intern. Med. 2013, 158, 280–286. [Google Scholar] [CrossRef]
  201. Richter, P.; Harst, L. Tackling the scaling-up problem of digital health applications. J. Public Health 2022, 30, 1–3. [Google Scholar] [CrossRef]
  202. Shrivastava, U.; Song, J.; Han, B.T.; Dietzman, D. Do data security measures, privacy regulations, and communication standards impact the interoperability of patient health information? A cross-country investigation. Int. J. Med. Inform. 2021, 148, 104401. [Google Scholar] [CrossRef] [PubMed]
  203. Lam, L.; Fadrique, L.; Bin Noon, G.; Shah, A.; Morita, P.P. Evaluating Challenges and Adoption Factors for Active Assisted Living Smart Environments. Front. Digit. Health 2022, 4, 891634. [Google Scholar] [CrossRef]
  204. Nazeha, N.; Pavagadhi, D.; Kyaw, B.M.; Car, J.; Jimenez, G.; Tudor Car, L. A Digitally Competent Health Workforce: Scoping Review of Educational Frameworks. J. Med. Internet Res. 2020, 22, e22706. [Google Scholar] [CrossRef]
  205. Kruk, M.E.; Gage, A.D.; Arsenault, C.; Jordan, K.; Leslie, H.H.; Roder-DeWan, S.; Adeyi, O.; Barker, P.; Daelmans, B.; Doubova, S.V.; et al. High-quality health systems in the Sustainable Development Goals era: Time for a revolution. Lancet Glob. Health 2018, 6, e1196–e1252. [Google Scholar] [CrossRef] [PubMed]
  206. Pawloski, P.A.; Brooks, G.A.; Nielsen, M.E.; Olson-Bullis, B.A. A systematic review of clinical decision support systems for clinical oncology practice. J. Natl. Compr. Cancer Netw. 2019, 17, 331–338. [Google Scholar] [CrossRef]
  207. Billingy, N.E.; Tromp, V.; Veldhuijzen, E.; Belderbos, J.; Aaronson, N.K.; Feldman, E.; Hoek, R.; Bogaard, H.J.; Onwuteaka-Philipsen, B.D.; van de Poll-Franse, L.; et al. SYMptom monitoring with Patient-Reported Outcomes using a web application among patients with Lung cancer in the Netherlands (SYMPRO-Lung): Study protocol for a stepped-wedge randomised controlled trial. BMJ Open 2021, 11, e052494. [Google Scholar] [CrossRef] [PubMed]
  208. Hershman, D.L.; Unger, J.M.; Hillyer, G.C.; Moseley, A.; Arnold, K.B.; Dakhil, S.R.; Esparaz, B.T.; Kuan, M.C.; Graham, M.L., 2nd; Lackowski, D.M.; et al. Randomized Trial of Text Messaging to Reduce Early Discontinuation of Adjuvant Aromatase Inhibitor Therapy in Women With Early-Stage Breast Cancer: SWOG S1105. J. Clin. Oncol. 2020, 38, 2122–2129. [Google Scholar] [CrossRef]
  209. Yang, Y.; Lee, E.Y.; Kim, H.S.; Lee, S.H.; Yoon, K.H.; Cho, J.H. Effect of a Mobile Phone-Based Glucose-Monitoring and Feedback System for Type 2 Diabetes Management in Multiple Primary Care Clinic Settings: Cluster Randomized Controlled Trial. JMIR Mhealth Uhealth 2020, 8, e16266. [Google Scholar] [CrossRef]
  210. Rahimi, R.; Kazemi, A.; Moghaddasi, H.; Arjmandi Rafsanjani, K.; Bahoush, G. Specifications of Computerized Provider Order Entry and Clinical Decision Support Systems for Cancer Patients Undergoing Chemotherapy: A Systematic Review. Chemotherapy 2018, 63, 162–171. [Google Scholar] [CrossRef]
  211. Zhu, C.Y.; Wang, Y.K.; Chen, H.P.; Gao, K.L.; Shu, C.; Wang, J.C.; Yan, L.F.; Yang, Y.G.; Xie, F.Y.; Liu, J. A Deep Learning Based Framework for Diagnosing Multiple Skin Diseases in a Clinical Environment. Front. Med. 2021, 8, 626369. [Google Scholar] [CrossRef] [PubMed]
  212. Bowen, D.J.; Kreuter, M.; Spring, B.; Cofta-Woerpel, L.; Linnan, L.; Weiner, D.; Bakken, S.; Kaplan, C.P.; Squiers, L.; Fabrizio, C. How we design feasibility studies. Am. J. Prev. Med. 2009, 36, 452–457. [Google Scholar] [CrossRef] [PubMed]
  213. Mentz, R.J.; Hernandez, A.F.; Berdan, L.G.; Rorick, T.; O’Brien, E.C.; Ibarra, J.C.; Curtis, L.H.; Peterson, E.D. Good clinical practice guidance and pragmatic clinical trials: Balancing the best of both worlds. Circulation 2016, 133, 872–880. [Google Scholar] [CrossRef] [PubMed]
  214. Cunanan, K.M.; Iasonos, A.; Shen, R.; Begg, C.B.; Gönen, M. An efficient basket trial design. Stat. Med. 2017, 36, 1568–1579. [Google Scholar] [CrossRef] [PubMed]
  215. Ford, I.; Norrie, J. Pragmatic trials. N. Engl. J. Med. 2016, 375, 454–463. [Google Scholar] [CrossRef] [PubMed]
  216. Hospodková, P.; Berežná, J.; Barták, M.; Rogalewicz, V.; Severová, L.; Svoboda, R. Change Management and Digital Innovations in Hospitals of Five European Countries. Healthcare 2021, 9, 1508. [Google Scholar] [CrossRef]
  217. Marwaha, J.S.; Landman, A.B.; Brat, G.A.; Dunn, T.; Gordon, W.J. Deploying digital health tools within large, complex health systems: Key considerations for adoption and implementation. NPJ Digit. Med. 2022, 5, 13. [Google Scholar] [CrossRef] [PubMed]
  218. Loftus, T.J.; Tighe, P.J.; Ozrazgat-Baslanti, T.; Davis, J.P.; Ruppert, M.M.; Ren, Y.; Shickel, B.; Kamaleswaran, R.; Hogan, W.R.; Moorman, J.R.; et al. Ideal algorithms in healthcare: Explainable, dynamic, precise, autonomous, fair, and reproducible. PLoS Digit. Health 2022, 1, e0000006. [Google Scholar] [CrossRef]
  219. Ilan, Y. Overcoming randomness does not rule out the importance of inherent randomness for functionality. J. Biosci. 2019, 44, 132. [Google Scholar] [CrossRef]
  220. Ilan, Y. Generating randomness: Making the most out of disordering a false order into a real one. J. Transl. Med. 2019, 17, 49. [Google Scholar] [CrossRef]
  221. Ilan, Y. Advanced Tailored Randomness: A Novel Approach for Improving the Efficacy of Biological Systems. J. Comput. Biol. 2020, 27, 20–29. [Google Scholar] [CrossRef] [PubMed]
  222. Ilan, Y. Order Through Disorder: The Characteristic Variability of Systems. Front. Cell Dev. Biol. 2020, 8, 186. [Google Scholar] [CrossRef] [PubMed]
  223. Finn, E.H.; Misteli, T. Molecular basis and biological function of variability in spatial genome organization. Science 2019, 365, eaaw9498. [Google Scholar] [CrossRef] [PubMed]
  224. Ilan, Y. Randomness in microtubule dynamics: An error that requires correction or an inherent plasticity required for normal cellular function? Cell Biol. Int. 2019, 43, 739–748. [Google Scholar] [CrossRef]
  225. Ilan, Y. Microtubules: From understanding their dynamics to using them as potential therapeutic targets. J. Cell. Physiol. 2019, 234, 7923–7937. [Google Scholar] [CrossRef] [PubMed]
  226. Ilan-Ber, T.; Ilan, Y. The role of microtubules in the immune system and as potential targets for gut-based immunotherapy. Mol. Immunol. 2019, 111, 73–82. [Google Scholar] [CrossRef]
  227. Forkosh, E.; Kenig, A.; Ilan, Y. Introducing variability in targeting the microtubules: Review of current mechanisms and future directions in colchicine therapy. Pharmacol. Res. Perspect. 2020, 8, e00616. [Google Scholar] [CrossRef]
  228. Ilan, Y. Microtubules as a potential platform for energy transfer in biological systems: A target for implementing individualized, dynamic variability patterns to improve organ function. Mol. Cell. Biochem. 2022, 478, 375–392. [Google Scholar] [CrossRef]
  229. Mitchison, T.; Kirschner, M. Dynamic instability of microtubule growth. Nature 1984, 312, 237–242. [Google Scholar] [CrossRef]
  230. Kirschner, M.W.; Mitchison, T. Microtubule dynamics. Nature 1986, 324, 621. [Google Scholar] [CrossRef]
  231. Turana, Y.; Shen, R.; Nathaniel, M.; Chia, Y.C.; Li, Y.; Kario, K. Neurodegenerative diseases and blood pressure variability: A comprehensive review from HOPE Asia. J. Clin. Hypertens. 2022, 24, 1204–1217. [Google Scholar] [CrossRef] [PubMed]
  232. Chiera, M.; Cerritelli, F.; Casini, A.; Barsotti, N.; Boschiero, D.; Cavigioli, F.; Corti, C.G.; Manzotti, A. Heart Rate Variability in the Perinatal Period: A Critical and Conceptual Review. Front. Neurosci. 2020, 14, 561186. [Google Scholar] [CrossRef] [PubMed]
  233. Forte, G.; Favieri, F.; Casagrande, M. Heart Rate Variability and Cognitive Function: A Systematic Review. Front. Neurosci. 2019, 13, 710. [Google Scholar] [CrossRef]
  234. Liu, Y.; Luo, X.; Jia, H.; Yu, B. The Effect of Blood Pressure Variability on Coronary Atherosclerosis Plaques. Front. Cardiovasc. Med. 2022, 9, 803810. [Google Scholar] [CrossRef] [PubMed]
  235. Ilan, Y. The constrained disorder principle defines living organisms and provides a method for correcting disturbed biological systems. Comput. Struct. Biotechnol. J. 2022, 20, 6087–6096. [Google Scholar] [CrossRef] [PubMed]
  236. Ilan, Y. Making use of noise in biological systems. Prog. Biophys. Mol. Biol. 2023, 178, 83–90. [Google Scholar] [CrossRef]
  237. Ilan, Y. Constrained disorder principle-based variability is fundamental for biological processes: Beyond biological relativity and physiological regulatory networks. Prog. Biophys. Mol. Biol. 2023, 180–181, 37–48. [Google Scholar] [CrossRef]
  238. Sigawi, T.; Lehmann, H.; Hurvitz, N.; Ilan, Y. Constrained Disorder Principle-Based Second-Generation Algorithms Implement Quantified Variability Signatures to Improve the Function of Complex Systems. J. Bioinform. Syst. Biol. 2023, 6, 82–89. [Google Scholar] [CrossRef]
  239. Ilan, Y. Overcoming Compensatory Mechanisms toward Chronic Drug Administration to Ensure Long-Term, Sustainable Beneficial Effects. Mol. Ther. Methods Clin. Dev. 2020, 18, 335–344. [Google Scholar] [CrossRef]
  240. Shabat, Y.; Lichtenstein, Y.; Ilan, Y. Short-Term Cohousing of Sick with Healthy or Treated Mice Alleviates the Inflammatory Response and Liver Damage. Inflammation 2021, 44, 518–525. [Google Scholar] [CrossRef]
  241. El-Haj, M.; Kanovitch, D.; Ilan, Y. Personalized inherent randomness of the immune system is manifested by an individualized response to immune triggers and immunomodulatory therapies: A novel platform for designing personalized immunotherapies. Immunol. Res. 2019, 67, 337–347. [Google Scholar] [CrossRef] [PubMed]
  242. Ilan, Y. beta-Glycosphingolipids as Mediators of Both Inflammation and Immune Tolerance: A Manifestation of Randomness in Biological Systems. Front. Immunol. 2019, 10, 1143. [Google Scholar] [CrossRef]
  243. Kessler, A.; Weksler-Zangen, S.; Ilan, Y. Role of the Immune System and the Circadian Rhythm in the Pathogenesis of Chronic Pancreatitis: Establishing a Personalized Signature for Improving the Effect of Immunotherapies for Chronic Pancreatitis. Pancreas 2020, 49, 1024–1032. [Google Scholar] [CrossRef]
  244. Ishay, Y.; Kolben, Y.; Kessler, A.; Ilan, Y. Role of circadian rhythm and autonomic nervous system in liver function: A hypothetical basis for improving the management of hepatic encephalopathy. Am. J. Physiol. Gastrointest. Liver Physiol. 2021, 321, G400–G412. [Google Scholar] [CrossRef]
  245. Kolben, Y.; Weksler-Zangen, S.; Ilan, Y. Adropin as a potential mediator of the metabolic system-autonomic nervous system-chronobiology axis: Implementing a personalized signature-based platform for chronotherapy. Obes. Rev. 2021, 22, e13108. [Google Scholar] [CrossRef]
  246. Kenig, A.; Kolben, Y.; Asleh, R.; Amir, O.; Ilan, Y. Improving Diuretic Response in Heart Failure by Implementing a Patient-Tailored Variability and Chronotherapy-Guided Algorithm. Front. Cardiovasc. Med. 2021, 8, 695547. [Google Scholar] [CrossRef] [PubMed]
  247. Azmanov, H.; Ross, E.L.; Ilan, Y. Establishment of an Individualized Chronotherapy, Autonomic Nervous System, and Variability-Based Dynamic Platform for Overcoming the Loss of Response to Analgesics. Pain Physician 2021, 24, 243–252. [Google Scholar]
  248. Potruch, A.; Khoury, S.T.; Ilan, Y. The role of chronobiology in drug-resistance epilepsy: The potential use of a variability and chronotherapy-based individualized platform for improving the response to anti-seizure drugs. Seizure 2020, 80, 201–211. [Google Scholar] [CrossRef]
  249. Isahy, Y.; Ilan, Y. Improving the long-term response to antidepressants by establishing an individualized platform based on variability and chronotherapy. Int. J. Clin. Pharmacol. Ther. 2021, 59, 768–774. [Google Scholar] [CrossRef] [PubMed]
  250. Khoury, T.; Ilan, Y. Introducing Patterns of Variability for Overcoming Compensatory Adaptation of the Immune System to Immunomodulatory Agents: A Novel Method for Improving Clinical Response to Anti-TNF Therapies. Front. Immunol. 2019, 10, 2726. [Google Scholar] [CrossRef]
  251. Khoury, T.; Ilan, Y. Platform introducing individually tailored variability in nerve stimulations and dietary regimen to prevent weight regain following weight loss in patients with obesity. Obes. Res. Clin. Pract. 2021, 15, 114–123. [Google Scholar] [CrossRef] [PubMed]
  252. Kenig, A.; Ilan, Y. A Personalized Signature and Chronotherapy-Based Platform for Improving the Efficacy of Sepsis Treatment. Front. Physiol. 2019, 10, 1542. [Google Scholar] [CrossRef] [PubMed]
  253. Ilan, Y. Why targeting the microbiome is not so successful: Can randomness overcome the adaptation that occurs following gut manipulation? Clin. Exp. Gastroenterol. 2019, 12, 209–217. [Google Scholar] [CrossRef] [PubMed]
  254. Gelman, R.; Bayatra, A.; Kessler, A.; Schwartz, A.; Ilan, Y. Targeting SARS-CoV-2 receptors as a means for reducing infectivity and improving antiviral and immune response: An algorithm-based method for overcoming resistance to antiviral agents. Emerg. Microbes Infect. 2020, 9, 1397–1406. [Google Scholar] [CrossRef]
  255. Ilan, Y.; Spigelman, Z. Establishing patient-tailored variability-based paradigms for anti-cancer therapy: Using the inherent trajectories which underlie cancer for overcoming drug resistance. Cancer Treat. Res. Commun. 2020, 25, 100240. [Google Scholar] [CrossRef]
  256. Ilan, Y. Digital Medical Cannabis as Market Differentiator: Second-Generation Artificial Intelligence Systems to Improve Response. Front. Med. 2021, 8, 788777. [Google Scholar] [CrossRef]
  257. Gelman, R.; Berg, M.; Ilan, Y. A Subject-Tailored Variability-Based Platform for Overcoming the Plateau Effect in Sports Training: A Narrative Review. Int. J. Environ. Res. Public Health 2022, 19, 1722. [Google Scholar] [CrossRef]
  258. Azmanov, H.; Bayatra, A.; Ilan, Y. Digital Analgesic Comprising a Second-Generation Digital Health System: Increasing Effectiveness by Optimizing the Dosing and Minimizing Side Effects. J. Pain Res. 2022, 15, 1051–1060. [Google Scholar] [CrossRef]
  259. Hurvitz, N.; Elkhateeb, N.; Sigawi, T.; Rinsky-Halivni, L.; Ilan, Y. Improving the effectiveness of anti-aging modalities by using the constrained disorder principle-based management algorithms. Front. Aging 2022, 3, 1044038. [Google Scholar] [CrossRef]
  260. Kolben, Y.; Azmanov, H.; Gelman, R.; Dror, D.; Ilan, Y. Using chronobiology-based second-generation artificial intelligence digital system for overcoming antimicrobial drug resistance in chronic infections. Ann. Med. 2023, 55, 311–318. [Google Scholar] [CrossRef]
  261. Ilan, Y. Improving Global Healthcare and Reducing Costs Using Second-Generation Artificial Intelligence-Based Digital Pills: A Market Disruptor. Int. J. Environ. Res. Public Health 2021, 18, 811. [Google Scholar] [CrossRef] [PubMed]
  262. Ilan, Y. Next-Generation Personalized Medicine: Implementation of Variability Patterns for Overcoming Drug Resistance in Chronic Diseases. J. Pers. Med. 2022, 12, 1303. [Google Scholar] [CrossRef]
  263. Patel, D.; Konstantinidou, H. Prescribing in personality disorder: Patients’ perspectives on their encounters with GPs and psychiatrists. Fam. Med. Community Health 2020, 8, e000458. [Google Scholar] [CrossRef] [PubMed]
  264. Fernandes, L.; FitzPatrick, M.E.; Roycroft, M. The role of the future physician: Building on shifting sands. Clin. Med. 2020, 20, 285–289. [Google Scholar] [CrossRef] [PubMed]
  265. Baltaxe, E.; Czypionka, T.; Kraus, M.; Reiss, M.; Askildsen, J.E.; Grenkovic, R.; Lindén, T.S.; Pitter, J.G.; Rutten-van Molken, M.; Solans, O.; et al. Digital Health Transformation of Integrated Care in Europe: Overarching Analysis of 17 Integrated Care Programs. J. Med. Internet Res. 2019, 21, e14956. [Google Scholar] [CrossRef] [PubMed]
  266. Warraich, H.J.; Califf, R.M.; Krumholz, H.M. The digital transformation of medicine can revitalize the patient-clinician relationship. NPJ Digit. Med. 2018, 1, 49. [Google Scholar] [CrossRef]
  267. Sutton, R.T.; Pincock, D.; Baumgart, D.C.; Sadowski, D.C.; Fedorak, R.N.; Kroeker, K.I. An overview of clinical decision support systems: Benefits, risks, and strategies for success. NPJ Digit. Med. 2020, 3, 17. [Google Scholar] [CrossRef]
  268. Gelman, R.; Hurvitz, N.; Nesserat, R.; Kolben, Y.; Nachman, D.; Jamil, K.; Agus, S.; Asleh, R.; Amir, O.; Berg, M.; et al. A second-generation artificial intelligence-based therapeutic regimen improves diuretic resistance in heart failure: Results of a feasibility open-labeled clinical trial. Biomed. Pharmacother. 2023, 161, 114334. [Google Scholar] [CrossRef] [PubMed]
  269. Park, J.I.; Lee, H.Y.; Kim, H.; Lee, J.; Shinn, J.; Kim, H.S. Lack of Acceptance of Digital Healthcare in the Medical Market: Addressing Old Problems Raised by Various Clinical Professionals and Developing Possible Solutions. J. Korean Med. Sci. 2021, 36, e253. [Google Scholar] [CrossRef]
  270. Mathews, S.C.; McShea, M.J.; Hanley, C.L.; Ravitz, A.; Labrique, A.B.; Cohen, A.B. Digital health: A path to validation. NPJ Digit. Med. 2019, 2, 38. [Google Scholar] [CrossRef] [PubMed]
Table 1. Some challenges faced by digital health systems and suggested methods for overcoming them using CDP-based systems.
Domain | Digital Health System Challenge | Constrained-Disorder Principle-Based Second-Generation Artificial Intelligence Solution
Data | "Big data" has failed to translate into improved patient outcomes | Generating insightful, personalized datasets for subject-tailored therapeutic regimens [146,236,239,261,262]
Users | Lack of engagement by patients and physicians | Outcome-based systems ensure long-term engagement, as patients view the system as part of the therapy [262]
Users | A need for explainable systems | The improved outcome is quantifiable in most cases, easing adaptation to digital systems [146,261,262]
System functions | Biases | The system reduces biases, as it operates independently of the physician; algorithms target clinically meaningful outcomes [146]
Payers | Increased costs | By improving outcomes, the system reduces hospitalizations and the need for more expensive therapies, thus saving costs [261]
Pharma companies | Inability to translate digital systems into profits | Improving adherence increases sales while providing pharma with a market disruptor [261]
Validation | Difficulty in validating advantages | Outcome-based endpoints are quantifiable and, in most cases, easily validated [238,268]
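The core idea behind the CDP-based regimens cited in the table is that therapeutic inputs should vary randomly, but only within dynamically constrained boundaries. The following is a minimal conceptual sketch of that idea, not the authors' actual algorithm: the function name `cdp_dose_schedule` and its parameters are hypothetical, and a real second-generation system would adapt the bounds from closed-loop outcome feedback rather than keep them fixed.

```python
import random


def cdp_dose_schedule(baseline_dose, days, variability=0.2, seed=None):
    """Illustrative sketch (hypothetical API): generate a daily dosing
    schedule with bounded random variability around a baseline dose.

    Every dose stays within +/- `variability` of the baseline; these
    fixed bounds stand in for the dynamic boundaries of the constrained
    disorder principle (CDP).
    """
    rng = random.Random(seed)  # seeded for reproducibility of the sketch
    low = baseline_dose * (1 - variability)
    high = baseline_dose * (1 + variability)
    # Each day's dose is drawn uniformly from the constrained interval.
    return [round(rng.uniform(low, high), 2) for _ in range(days)]


# Example: a week of furosemide-like dosing around 40 mg, +/- 25%.
schedule = cdp_dose_schedule(40.0, days=7, variability=0.25, seed=1)
```

In an actual outcome-driven system, the `variability` bound would itself be a signal tuned per patient against a clinically meaningful endpoint, which is the distinction the table draws between first- and second-generation algorithms.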

Share and Cite

MDPI and ACS Style

Hurvitz, N.; Ilan, Y. The Constrained-Disorder Principle Assists in Overcoming Significant Challenges in Digital Health: Moving from “Nice to Have” to Mandatory Systems. Clin. Pract. 2023, 13, 994-1014. https://doi.org/10.3390/clinpract13040089
