Article

Systems Precision Medicine: Putting the Pieces Back Together

Department of Computer, Control and Management Engineering “A. Ruberti”, Sapienza University of Rome, 00185 Rome, Italy
Systems 2023, 11(7), 367; https://doi.org/10.3390/systems11070367
Submission received: 11 June 2023 / Revised: 14 July 2023 / Accepted: 16 July 2023 / Published: 18 July 2023
(This article belongs to the Special Issue Theoretical Issues on Systems Science)

Abstract

Systems precision medicine is an interdisciplinary approach that recognises the complexity of diseases and emphasises the integration of clinical knowledge, multi-omics data, analytical models, and the expertise of physicians and data analysts to personalise the care pathway in complex diseases, such as cancer or diabetes. The aim is to gain a comprehensive understanding of diseases by analysing individual components and identifying relevant aspects for therapy and diagnosis. Key components, their interactions and emerging patterns can be studied using statistical, mathematical and computational tools. The combination of data analysis and clinical evaluation is crucial to effective decision-making, emphasising the need for an integrative approach rather than relying on data alone. Therefore, the crucial point discussed in this paper is that the “computational” part and the “artistic” part (i.e., the physician’s intuition) cannot be separated, and therefore, systems precision medicine can be configured as a collective work of art, involving not only different medical professionals but also, and above all, professional data analysts. The work is “artistic” because data and mathematics alone, without medical knowledge of the context, are not enough. But the work is also “collective” in the sense that it must be the place of cultural integration between the professional intuition of the physician, which cannot be translated into mathematical formulas, and the ability to extract information from multi-omics data of the data analysts, who instead use formal and computational mathematical methods. However, to drive the medical revolution and reassemble a patient’s parts, data analysts need to be involved in the hospital context, and precision medicine physicians should embrace data analytical perspectives. This will require ongoing dialogue, new languages of communication, and education that promotes continuous learning and collaboration between professions, fostering a new level of interdisciplinary collaboration for personalised care.

1. Introduction

Systems precision medicine is a multidisciplinary approach that involves quantitative biotechnologies, information systems for data management, and methods for their analysis and interpretation. Ongoing and recent efforts and projects are playing a crucial role in advancing its implementation. Two notable examples are the 1000 Genomes Project and the European Health Data Space. The 1000 Genomes Project is an international collaborative effort to create a comprehensive catalogue of human genetic variation. The European Health Data Space is a European Union (EU) initiative that aims to facilitate the sharing of health data between EU Member States. Both initiatives are essential to the implementation of systems precision medicine.
Moreover, data quality, assessment, standards and good practices are vital components of effective data management for systems precision medicine. They contribute to the reliability, integrity and usability of data, enabling organisations to make informed decisions, conduct meaningful analysis and derive valuable insights. Organisations can optimise the value and influence of their data assets by adhering to established standards and best practices, thereby ensuring data quality is maintained.
However, the success of this approach depends not only on large databases, advanced technological tools and the formation of interdisciplinary groups but also on the integration of heterogeneous data, ranging from patient records and multi-omics data to the identification of pharmacological targets [1]. This twofold integration is an incredibly difficult task, which in fact requires a long series of human and technological ingredients largely not seen in clinical practice. Instead, we see the intrusive presence of high-tech hucksters selling miraculous algorithms with fancy names. Meanwhile, in just a few years, the algorithm as such has become the new philosopher’s stone, precisely because, like the philosopher’s stone, it is thought to be able to do the following [2]:
- Provide data analysis methods capable of curing any disease (elixir of life);
- Acquire omniscience, or absolute knowledge of the future, through artificial intelligence and machine learning (the philosopher’s ingredient);
- Transform data (base metals) into information (gold).
Unfortunately, the temptation to delegate therapeutic decisions to some distant, cold and calculating deity is an ancient human temptation that transcends time and space, as we turn to an “elsewhere” when we no longer feel able to solve our problems here at home. Artificial intelligence and all its esoteric offshoots now occupy the Olympus of fashion. Just think of the “neural networks”, which have been given this suggestive name simply because they are inspired by the functioning of biological neurons, but they actually have nothing in common with biology [3,4]. To understand the inappropriate and dangerous association, just think of a network of tanks connected by tubes, which could legitimately be called a “urological network”, inspired by the functioning of the bladder and urethra. Or a doll that cries because it wants a dummy could be called an “artificial baby” because it is inspired by human behaviour. And what about the dishwasher? The vacuum cleaner? Words are important because they define contexts of meaning in collective perception. If I call “Sleepwell” a pillow-shaped sweet that does not make me sleep well, I cannot say that I was inspired by the pillow; I am making an improper use of words and, in this case, a very dangerous one for people’s health. Of course, neural networks are data analysis tools that are potentially very useful, even in medical applications, but it is important not to associate their functioning with that of a biological brain because this operation, as I said, is not only inappropriate but also very harmful in terms of the false expectations it can create in the patient.
Systems precision medicine as such puts its finger on the wound by clearly and unambiguously showing that treatment is an “earthly” matter and that it is not necessary to rely on in silico divinities but rather on flesh-and-blood people with the right interdisciplinary approach. We are all aware of the problem of integrating medical knowledge not only with data and analytical models but also, and above all, with physicians’ intuition, because their deep, unstructured knowledge, which is not organised in schemes that can be captured by universal algorithms, is the most valuable resource for the true personalisation of the care pathway. The road is still long and arduous, and the reason is simple: if, on the one hand, “tearing apart” a patient is an operation that we can perform with great efficiency and accuracy, the same cannot be said when it comes to reassembling the overall picture to highlight the most relevant aspects for therapeutic purposes, diagnostics and treatment [5]. Therefore, let’s leave the algorithms in their place in our toolbox and start discussing the path indicated by the systems view.
In precision medicine, the concept of a “system” refers to a set of “objects”, such as molecules, cells or tissues, that are interconnected at different levels of aggregation, either physically or through functional and regulatory relationships [1]. The first step in a systems approach is, therefore, to identify the key components and their interactions. The next step is to examine the consequences of these interactions and observe emerging patterns or the system’s behaviour. The ultimate goal is to understand the determinants that contribute to the birth and development of a disease using statistical, mathematical and computational tools that support each phase of the systems approach. The problem, however, is that while the physical and engineering sciences have a long history of using these approaches, medicine has unique characteristics that make this work much more complex to put into practice.
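To make these three steps concrete, here is a minimal sketch in Python (using the networkx library) that runs the same workflow on a toy interaction network; the molecule names and interactions are invented placeholders, not real biological data.

    # Toy illustration of the systems approach: (1) identify components and
    # interactions, (2) examine their consequences, (3) observe emergent patterns.
    import networkx as nx

    # Step 1: key components (nodes) and their interactions (edges); invented data.
    interactions = [
        ("TP53", "MDM2"), ("TP53", "CDKN1A"), ("MDM2", "MDM4"),
        ("CDKN1A", "CDK2"), ("CDK2", "CCNE1"),
        ("EGFR", "GRB2"), ("GRB2", "SOS1"), ("SOS1", "KRAS"), ("KRAS", "BRAF"),
    ]
    G = nx.Graph(interactions)

    # Step 2: consequences of the interactions, e.g. which components are most
    # connected and therefore most likely to propagate a perturbation.
    hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:3]
    print("Most connected components:", hubs)

    # Step 3: emergent, system-level patterns, here simply the separate modules
    # (connected components) of the toy network.
    for i, module in enumerate(nx.connected_components(G), start=1):
        print(f"Module {i}: {sorted(module)}")

Nothing in this sketch is specific to medicine; the point is only that each step of the systems approach has a direct computational counterpart.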
Identifying and quantitatively measuring the parts that make up the system is an unavoidable first step: we can only start with data, i.e., “measurements” of some kind, and the biotechnological tools available today allow us to “see” more and more deeply, i.e., more and more details. Put simply, the parts of the system are becoming smaller and smaller and more numerous. While, on the one hand, we can unpack a patient in an increasingly fragmented way, on the other hand, the problem of putting the pieces back together is becoming more and more difficult. The incredible development of molecular biotechnology brings with it the price of a patient that is fragmented into ever thinner powders that become ever larger clouds. And there is still no agreement, even minimally, among researchers on how to integrate data and skills into a unified and coherent vision that can be applied in clinical reality.
In this article, I try to summarise, from a very personal perspective, the salient points of the historical and cultural development of the systemic approach in precision medicine and the challenges/opportunities it presents. In particular, I support the idea that, while the definition of the parts is now an uncontrollable and irreversible process that has been underway for decades, their integration into a unified framework cannot be based solely and exclusively on data and their analysis—as all approaches labelled as “data-driven” or “digital driven” suggest—but must build its foundations on the harmonious interaction of data, data analysts and clinicians. In other words, we also need a cultural integration that goes through the personal experience of each professional involved. This is why the path of diagnosis and treatment in systems precision medicine is strikingly similar to a collective work of art. The problem is not one of human “supervision”, but of incorporating multidisciplinary technologies and skills into the same process of diagnosis and treatment.
I begin by identifying the turning points of systemic thinking and its first successes, which are making a huge impact in all areas of medicine, but particularly in what is known as “precision medicine”, i.e., medicine based on molecular data that can identify the specific therapeutic targets of a given patient. Finally, the limitations of this single-target approach will lead us into the unexplored territory of defining new strategies that are not yet on the horizon, but which will certainly have truly revolutionary characteristics because, like all true revolutions, they will change the lives not only of patients but also of doctors and data analysts, who will increasingly work together in ways and forms that we could define as “artistic” rather than strictly “scientific”.

2. Descartes’ Dream

The idea of a world in which science represents the ultimate knowledge of reality, where truth coincides with certainty and where rationality is the only guide in the midst of life’s chaos, was born in a most daring way on 10 November 1619. This occurred in a small room in Ulm, a town in southern Germany on the banks of the Danube, which many years later would prophetically witness the birth of Albert Einstein. For weeks, young René would remain locked up in this poorly heated room, trying to solve the terrible problems that had plagued mankind since the dawn of time and that our “humble” friend, now only 23 years old, was pursuing relentlessly and with titanic determination and perseverance. It was very cold that evening and he fell asleep in the faint heat of the stove.
The first dream was of a ghostly whirlwind that tossed him here and there, making him pirouette three or four times on his left foot as he felt himself falling towards nowhere. Then, the wind stopped, and he woke up. But immediately he fell asleep again, and the second dream took him suddenly. He heard thunder and saw sparks flying around his room. The second dream ended and immediately began the third, the most disturbing and foreboding (at least according to him). Here, everything was extremely quiet, calm, the atmosphere serene, meditative. There was a table with a book of poems: he chose a page at random and read a verse: “What path will I take in life?” Then, another apparition, a stranger, recited another verse: “Yes and no”. Then, he told the stranger that he had a more beautiful book than the one he had just read, but as soon as he finished, everything disappeared: the table, the book and the stranger. He woke up. René was shocked and began to pray. When he recovered, he felt that he had had a supernatural experience. In fact, he had good reason to because his interpretation of the dreams and visions he had that fateful night showed him nothing less than the way to unify all human knowledge by means of the “method of reason”; the Spirit of Truth had wanted to open to him the treasures of all sciences through this dream. And it took René eighteen years to write a book containing this ominous result: the “Discourse on the Method” of 1637, published in French by a publisher in Leiden, Holland. René’s great idea was the certainty of the existence of a precise instrument capable of unifying all the sciences, from metaphysics to physics and then all the other sciences (including biology, of course): the instrument was mathematics. In fact, according to him, the very essence of reality is mathematical, echoing and reviving the Pythagorean and Platonic visions. It is precisely this identification of mathematics as the essence of the world that allows it to be applied to reality, all reality, from the physical to the metaphysical, without any limitation. In short, René was one of those who, in the end, did not care much for experiments because theory, i.e., abstraction, is a guarantee of reality, as Plato believed.
René is fascinated by the procedures of mathematics, where, starting from very simple objects and properties, it is possible, by applying a few rules, to arrive at extremely complex knowledge, a bit like Euclid’s Elements, where, starting from concepts such as point and line, he built structures and discovered wonderful and complex properties. Why not extend this approach to the whole of reality? On the other hand, if everything is mathematics, it is not clear why we cannot use its wonderful methods to analyse it. Descartes’ dream was indeed the total and absolute mathematisation of the world of objects and thought. For example, René argued with a priori arguments that boiled water brought back to room temperature freezes at a lower temperature than unboiled water. Too bad he did not conduct an experiment!
But let us see what these rules are that allow us to “learn to distinguish the true from the false, to see clearly in my actions, and to proceed with confidence in this life”. There are only four of them, but they are very important because they have shaped (and will continue to shape) profound scientific and technical thinking. We must not forget that these rules are used, in a more or less recognisable form, in every scientific and technological enterprise. They are as follows:
  • First rule: never accept anything as true without clearly knowing it to be so; that is, do not judge anything that does not present itself to reason so clearly and distinctly as to leave no doubt.
  • Second rule (top-down): divide any problem into as many parts as possible to make it easier to solve.
  • Third rule (bottom-up): organise your thoughts in an orderly way, starting with the simplest and easiest things to know, and working your way up to the most complex knowledge.
  • Fourth rule: make such perfect enumerations of all cases and such complete reviews that you are sure nothing has been left out.
The second and third rules are the ones that we can consider to be very current in data analysis. The second rule corresponds to what is known as “top-down” reasoning, i.e., the need to break down any object or problem into its parts in order to reveal its internal structure. This procedure is reminiscent of the reductionist approach, which, as we shall see, is found in the development of the medical sciences and the paradigms of health and disease that have followed one another over time, starting with the organs as “carriers” of some “dysfunction”, then the individual cells, right up to the current paradigm that goes in search of the “faulty” protein, DNA or RNA in order to define a cure. Data analysis, too, in keeping with René’s universal ambitions, represents a phase of classification and characterisation of the available pieces of reality, namely the data, which in this view are the only elements that should define the picture of the disease: a division into many parts, down to the simplest (from the organism to the molecule, in our case).
The third rule is definitely the most fascinating of all because even today, no one knows exactly how to achieve it in biology and medicine, even though it has been successfully applied in many cases. We are talking about the “bottom-up” approach, that is, the phase of reconstructing reality (or the problem, but for René, the laws of physics and thought were the same), which allows us to obtain an “overview” of the phenomenon that interests us. In other words, the third rule should enable us, after identifying the trees, to see the forest as a whole again. But the third rule could wait because medical science has progressed rapidly on the basis of the second rule only, which is the one that children like to use when they take apart a toy they have just been given.

3. Patients Torn to Pieces

Historians of medicine have identified several stages in the development of ideas about disease. The first phase is magical and religious, as in the Egyptian civilisation (around 2700 BC), where illness was seen as a magical phenomenon, of supernatural origin, a real divine punishment [6]. The phase that we can consider “pre-modern” is the one that began in Greece in the fifth century BC with Hippocrates of Kos, whose oath is still taken today by the new doctors of medicine and surgery. In the purely rationalist style that characterised the Greek worldview, disease was considered to have a natural origin, without the intervention of the gods. In fact, Hippocrates believed that illness was caused by an imbalance in the four basic “humours” (blood, phlegm, yellow bile, and black bile). It is interesting to note how the disease was “delocalised”, i.e., seen as a problem of the whole body, to which a “sick person” corresponded as a whole [7]. Even the method of Hippocrates is worthy of Dr. House, at least in the initial phase when all the symptoms are carefully evaluated to find the cause of the disease, and therefore, the therapy. In fact, many centuries before Hippocrates, traditional Chinese medicine considered disease to be an imbalance of the forces of nature, of yin and yang, and therefore, also characterised by the disturbance of a balance that affects the whole organism and not just a part of it. The common characteristic of illness, therefore, regardless of the different cultures, is that it is a property of the person and not of a single part, as we have seen. The patient as a whole was the object of study for the physician, who, therefore, had to treat the person with a particular disease and not the disease in isolation [8].
The irony of fate is that today, 2500 years later, we are back where we started: modern medicine is desperately searching for a way to put the pieces back together, to have a characterisation of the disease that is not simply a collection of the results of many tests or analyses at various levels, from the functionality of an organ to its molecular profile. In fact, the aim is to “reconstruct” the patient in order to see the disease as Hippocrates did, i.e., at one with the body, the social environment and lifestyle. The effect of Cartesian scientific reductionism in medical practice can be linked to the conception of disease as something separate from the sick person and, as such, analysed with increasingly sophisticated analytical tools. In the seventeenth century, for example, the English physician Thomas Sydenham, one of the fathers of English medicine and a profound expert on the manifestations and course of smallpox and scarlet fever, wrote the following: “it is necessary that all diseases be reduced to definite and certain species with the same care which we see exhibited by botanists in their phytologies” [9], as reported in [10]. In other words, the idea affirmed by modern medicine is analogous to the work of the botanist, who collects the plants he finds during his explorations and classifies them in a precise and systematic way, in the style defined by Linnaeus.
The history of pulmonary tuberculosis [10] is a perfect example of the transformation that Descartes and many others brought to medicine over time [11]. Tuberculosis has existed for at least 15–20,000 years and was described by Hippocrates as “an ulcer of the lungs, chest and throat, accompanied by cough, fever and destruction of the body by pus”. Until the nineteenth century, tuberculosis was incorporated into the medical classification of “consumption”, which was based on a mixture of hereditary characteristics, occupation, social status and personality traits. It was not until the precise assessment of the tuberculous lesions in the lung tissue that the disease was clearly defined, and its symptoms brought into a single category. It was William Stark, in his observations published by James C. Smith 18 years after his death [12], who proposed the evolution of the pulmonary nodules into ulcers and cavities, thus justifying the different manifestations of the disease in a single location of the lesion. These lesions were further studied in microscopic detail until, in 1882, Robert Koch identified the microbiological cause, the bacillus Mycobacterium tuberculosis [10]. In this exemplary case, Descartes’ second rule is clearly seen in action, from the whole organ to its lesion, down to the microscopic level where the cause of the disease is evident.
We must not forget that this process of reducing the disease from the whole patient to its parts, organs and tissues, and then to microbiology, allowed for the development of streptomycin and other antibiotic therapies. However, this great success of the Cartesian method hides serious limitations. During Koch’s time and in the years that followed, many other scientists and clinicians realised that apart from special cases, no therapeutic advance could ignore the totality of the patient’s condition [13]. Despite the doubts, the impact of such a clear and simple story could not fail to fascinate the most refined scientific minds, leading them to try to repeat this approach for other diseases, not only for their diagnosis but also for patient care.

4. “Go, Catch, and Kill”

The process that led to the discovery of the cause of tuberculosis and its cure was in fact simple, elegant and very seductive. Identifying smaller and smaller parts of our body down to the individual molecules that make it up, in perfect accordance with the Cartesian method, suggests the possibility of digging all the way down to find a hidden cause, in perfect analogy with the search for a diamond in a mine. The reductionist/Cartesian approach has been very effective in all areas of science and technology, and still supports all those visions of disease that point inwards, from the macroscopic to the microscopic, in the belief that by curing the microscopic, the macroscopic will automatically be cured. Unfortunately, this is not always the case; in fact, the opposite is more often the case, in the sense that the cause of a disease cannot always be considered at the “lowest” level, i.e., the DNA, because many other factors, at different levels, influence the effect that a mutation of a gene can have on the organism as a whole. In addition, external factors, such as the environment and lifestyle, have a huge impact on many diseases, such as hypertension, cancer and diabetes. This is why doctors often recommend lifestyle changes and the elimination of environmental stressors in addition to medication. In any case, the basic idea introduced into medicine by the spectacular success of tuberculosis treatment is still with us and has immense implications. Medicine’s job is to identify the causes of disease at the microscopic level and find another target to hit. In the words of the Pulitzer-Prize-winning Indian oncologist Siddhartha Mukherjee, medicine today is based on this simple sequence of events: You have an illness and you go to the doctor. Then, you take the pill he prescribes, and that pill kills something [14] (not just bacteria or viruses, but individual proteins or whole cells).
The “go, catch, and kill” paradigm underpins all of modern medicine and pharmacology and has deep roots in the success stories of tuberculosis and antibiotics. Drugs produced by pharmaceutical companies all have the same characteristics: once introduced into the body, these molecules can find their target, lock it up, make it unable to function and then throw away the key. Over the last hundred years, we have tried in every way to replicate this model for all diseases, from diabetes to cancer, from cardiovascular disease to autoimmune disease, and with great success, so much so that we have given it a truly suggestive name: the magic bullet paradigm. But how did we get there? We need to take a step back and talk about the extraordinary intuition of a great German doctor, avid cigar smoker, Nobel Prize winner in 1908 for his studies in immunology and passionate reader of Sherlock Holmes: Paul Ehrlich.

5. Paul’s Magic Bullet

He had a very annoying vice. He scribbled drawings and formulae on every conceivable surface: laboratory walls, tablecloths, napkins, and even the soles of his shoes and the shirts of his colleagues. And finally, his most important contribution to medicine came in the form of a crazy and breathtaking idea. In 1900, he published an article entitled “On Immunity with Special Reference to Cellular Life”, in which he explained his theory of the immune response, which he called the “side chain theory”. In modern terms, Ehrlich’s idea is that potentially dangerous foreign molecules (antigens) bind to receptors scattered on the cell membrane. Receptors are very localised areas on the membrane or within the cell itself that interact with molecules of different types (for example, hormones, small molecules or chemical messengers). The receptors stimulate the cell to produce more receptors of the same type and to release the excess (antibodies) into the extracellular fluid.
Ehrlich called these antibodies “magic bullets” that targeted single antigens. This model of immune response was completely out of the box and provoked furious and disbelieving reactions from the scientific establishment of the time. The point that was difficult to digest was actually surprising: the foreign body, the antigen, finds a receptor on the cell surface that binds it perfectly and uniquely, without ever having encountered it before! But what was the point of having so many receptors, most of which would be useless? And was there enough space on the cell membrane? In fact, these observations were very sensible, though Ehrlich’s proposal was not entirely correct. However, his idea of an antibody that binds solely and exclusively to a specific antigen was absolutely correct and was the basis for subsequent models of the immune response. But for our story, another unexpected development of this idea would lead to the birth of pharmacology and the “go, catch, and kill” model that pervades modern and future medicine. Ehrlich himself had called antibodies “magic bullets” in the search for target antigens. He was probably inspired by Emil Fischer, who had proposed a “lock-and-key” model for the workings of enzymes.
Paul’s intuition was to reapply the “magic bullet” model he had imagined for the way antibodies worked, and thus, to develop the concept of a chemical that binds to another and blocks its function, thus selectively killing microbes (as in the case of tuberculosis) or even cancer cells (as in chemotherapy). The importance of this intuition cannot be overestimated [15].
Ehrlich’s powerful metaphor became a central element in the reorganisation of twentieth-century medicine and its depiction as the “golden age”. The image of the “magic bullet” depicted institutional medicine as being able to control and defeat diseases. The new biomedical paradigm, able to identify a specific cause and cure and strongly linked to laboratory science and new biotechnological apparatuses, became the central pivot for the rise in the status of medicine.
The “heroic” nature of his scientific endeavour is well expressed by the historian of science Fritz Stern [16]: “And perhaps more than anyone else, Paul Ehrlich has shown how much an individual can achieve on his own: thanks to his laboratory discoveries, clinical doctors have been able to save an incalculable number of human lives”.

6. Four Senators and Descartes’ Third Rule

How far do you have to go to dismember a human being? Indeed, the so-called “omics technologies” have reached an impressive level of detail and have managed to measure an incredible number of elements of an organism down to the molecular level. Of course, it is very difficult to know a priori what is “necessary” for a complete characterisation of a disease in a given patient. The reason for this is that each pathology is very different from the others, and what may be enough to characterise one patient may not be enough for another. In other words, we know with certainty that we do not know, and that is why biotechnologies have entered into a whirlwind of technological innovations that allow us to measure new molecules or entire organisms (for example, the microbes that live in our intestines) every day in the hope that “the more data we have, the better”. This approach, although understandable and even sensible in principle, must never lead us to forget that data is not information, and therefore, having many, many rivers, floods, tsunamis of data does not guarantee that useful information is somewhere in this immense sea. However, the quantity and quality of data that can now be obtained from biotechnology have grown so rapidly that they demand an epochal turning point in medicine, defining what is now commonly called the era of “big data medicine”. However, the appropriate use of big data is not only a problem for medicine; in fact, completely different sectors are struggling with very similar problems, from which it is possible to learn to avoid falling into the same traps.
All major film and TV production companies, such as Netflix or Amazon, need to know which projects to make and which to drop. The future of the company depends on these decisions and, of course, so does the position of the executive in charge, who cannot afford to miss a beat. But these companies have an ace up their sleeve: when we watch a show, they watch us, and they can obtain a huge amount of data about the behaviour of millions of viewers. There is a lot of heterogeneous data that can be obtained. For example, for each user, you can record when they press play; when they pause; and when they continue watching, perhaps the next day. But you can also know which parts are skipped or re-watched: basically, all the actions we carry out on the screen, from the terms we use in searches to our favourite programmes; our ratings (thumbs up, thumbs down); and our preferences in terms of producers, directors, actors and so on. So, for each user, there is a lot of data that characterises their behaviour and should, therefore, allow us to assess the “fate” of a series project, i.e., whether it will be successful, at least in probabilistic terms. The interesting thing—as Sebastian Wernicke reports in a very successful TED talk [17]—is that this is how it actually worked. And indeed, in April 2013, after collecting an impressive amount of data, Amazon’s content executive and his data analyst staff conducted their analysis and all the necessary calculations until the data produced an answer: “Amazon should produce a series about four Republican senators in the US”. They made this series, called “Alpha House”. The result? A mediocre IMDB score [18] of 7.5 out of 10. Keep in mind that a real success should get a score above 8.5 (for example, Game of Thrones attained a score of 9.2 and Breaking Bad attained a score of 9.5) to be in the top 5% of all shows ever made. Around the same time, another executive at a competitor, Netflix, was tackling the same problem using “big data” extracted from the user mine. Again, the stakes were high: the goal was to find a “hit show” rather than a mediocre product like many others. Same problem, same type of data, but in this case, the analysts hit the mark: they produced “House of Cards”, a hugely popular series that scored an 8.7.
How is it possible that two groups of expert data analysts, two companies with great and long experience and decisions based on the same “ingredients” produced such different results? In fact, it seems that an approach based on a lot of qualified data must always work, i.e., the collection of immense amounts of high-quality data must somehow “guarantee” a good decision, or at least the possibility of defining a good TV programme. But if there is no guarantee, even for a television programme, what should we think about much more “delicate” applications, such as those relating to health? It is really worrying, and it is too big a problem to underestimate. But back to the point. What was the difference between the two approaches taken by Amazon and Netflix? Both have followed the “Cartesian method” to the letter, i.e., they have “torn apart” the problem of trying to understand viewer behaviour on the basis of many small individual actions: without this data, these many small aspects, they could never have a realistic idea of what is happening in the mind of a user watching a series. Data is, therefore, extremely important to define the parts or “pieces” of a problem and to characterise them quantitatively so that the various statistical algorithms for data analysis can be used. However, the “reverse” process, the practical application of Descartes’ third rule, in which the solution to the original problem is reconstructed from the many small pieces (the synthesis of data), was quite different. The decision that led to the production of Alpha House was based on data alone. Instead, as Wernicke recounted in his TED talk, the decision to create “House of Cards” was made by the executive and his staff on the basis of considerations independent of the data. In other words, the data were used in the first phase of analysis, i.e., in determining the detailed “constituent” elements that were considered relevant to define the overall problem, but the final decision was not based solely on the available data, but on a collective choice of the working group, which shared its assessments and came to an agreement. If human brains can do anything well collectively, it is to put the pieces together, to make sense of incomplete and partial information.

7. Networks

A very promising way to put the pieces together in systems precision medicine is to use networks. As a matter of fact, phenotypic outcomes result from the intricate interplay of molecular components orchestrated by biological networks. Homeostasis represents the delicate balance and dynamic equilibrium of network states that maintain proper physiological function. However, perturbations in these networks can lead to pathological disease states. Understanding the dynamics of biological networks and their role in shaping phenotypic outcomes can provide a valuable framework for putting the pieces back together and designing effective interventions [19]. Moreover, the use of networks in systems precision medicine may also have the role of a “lingua franca” that can be used to break down the barriers between precision physicians and multi-omics data analysts [20].
A “network” is a “cognitive schema”, an abstract framework of concepts that helps us make sense of the complex world of life. It is not a tangible entity, but rather a representation of relationships and connections between concepts. The characterisation of a network involves cores (concentrated areas) connected by edges (representing pathways or connections) in a less dense environment. However, when applied to protein–protein interactions, for example, this concept recognises that interactions are not simply lines connecting points, but complex phenomena influenced by spatial and energetic factors.
By replacing proteins with nodes and their interactions with links, we create a metaphorical projection of the protein–protein interaction network onto the network diagram. This allows complex systems to be represented and understood using a simplified abstraction of the data. In essence, the network serves as a functional abstraction that allows us to understand complex systems by simplifying their representation, thereby facilitating dialogue between physicians and data analysts.
A network is a visual representation of data that focuses on the relationships (links) between elements (nodes), with more emphasis on the configuration of the links than on the specific characteristics of the nodes. The aim is to identify network patterns that can be applied metaphorically to biological concepts, creating a common language that bridges the gap between computational and clinical researchers. This will allow them to discuss and interpret the same reality from different perspectives.
Biological systems, such as cells and organisms, exhibit intricate organisation and function through the interplay of different molecular components. These components interact in a highly regulated manner, forming complex networks that control fundamental processes, such as metabolism and gene regulation. Understanding the different modalities of biological network organisation and considering their interactions on a larger scale has been shown to be of great value in understanding the functioning of living systems.
Metabolism, for example, refers to the totality of chemical reactions that occur within a cell or organism to sustain life. These reactions involve the interconversion of molecules, the production of energy and the synthesis of cellular components. Metabolic pathways form a network in which different molecules serve as substrates, products or intermediates, with enzymes acting as catalysts to facilitate the reactions. By analysing the organisation of metabolic networks, researchers can uncover key pathways, identify regulatory nodes and gain insights into how cellular metabolism is coordinated.
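As a purely illustrative sketch of this kind of analysis (invented, heavily simplified reactions; not a real metabolic reconstruction), a metabolic network can be encoded as a directed graph in which edges point from substrate to product, after which standard graph queries expose candidate pathways and central metabolites:

    import networkx as nx

    M = nx.DiGraph()
    # Edges run from substrate to product; the catalysing enzyme is an edge attribute.
    M.add_edge("glucose", "G6P", enzyme="hexokinase")
    M.add_edge("G6P", "F6P", enzyme="phosphoglucose isomerase")
    M.add_edge("F6P", "F1,6BP", enzyme="PFK-1")
    M.add_edge("F1,6BP", "pyruvate", enzyme="lower glycolysis (lumped)")
    M.add_edge("G6P", "6PG", enzyme="G6PD")
    M.add_edge("6PG", "R5P", enzyme="6PGD")

    # A "key pathway": the route connecting an input metabolite to an end product.
    route = nx.shortest_path(M, "glucose", "pyruvate")
    print("Glycolytic route:", " -> ".join(route))

    # Candidate regulatory nodes: metabolites that many routes must pass through.
    centrality = nx.betweenness_centrality(M)
    print("Most central metabolite:", max(centrality, key=centrality.get))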
Another relevant example is gene regulation, where the control of gene expression enables cells to respond to internal and external cues. Gene regulatory networks consist of genes, transcription factors and other regulatory molecules that interact to modulate gene activity. These networks determine when and to what extent genes are activated or repressed, thereby influencing cellular processes and development. By studying gene regulatory networks, scientists can elucidate the mechanisms underlying cellular differentiation, disease progression and responses to environmental changes.
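The logic of such circuits can be caricatured with a Boolean model, a common abstraction in which each gene is simply ON or OFF and is updated from the states of its regulators. The sketch below uses entirely invented genes and rules and shows how a small negative feedback loop produces oscillating activation; it illustrates the idea of regulatory logic, not any real network.

    def step(state):
        # Each rule says when a gene is ON (True) at the next time step.
        return {
            "signal": state["signal"],                      # external input, held fixed
            "TF_A":   state["signal"],                      # activated by the signal
            "geneB":  state["TF_A"] and not state["TF_C"],  # activated by A, repressed by C
            "TF_C":   state["geneB"],                       # activated by B: negative feedback
        }

    state = {"signal": True, "TF_A": False, "geneB": False, "TF_C": False}
    for t in range(8):
        print(t, state)
        state = step(state)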
Considering the large-scale interactions of these biological networks provides a more holistic view of how different modalities of cellular organisation work together to achieve overall system function. By integrating data from multiple networks, scientists can identify novel regulatory mechanisms, predict cellular responses and uncover emergent properties that cannot be understood by studying individual networks in isolation. A comprehensive review that discusses the relevant applications of network science to the field of medicine can be found in [21].
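A minimal sketch of what such integration can look like in practice: two invented network “layers” (say, a protein interaction layer and a co-expression layer) are combined, and only the combination reveals which components are hubs in both, a property invisible to either layer alone. All names and edges are made up for illustration.

    import networkx as nx

    ppi = nx.Graph([("A", "B"), ("A", "C"), ("A", "D"), ("B", "E")])
    coexpr = nx.Graph([("A", "C"), ("A", "E"), ("C", "D"), ("F", "G")])

    def hubs(g, k=2):
        # Return the k most connected nodes of a layer.
        return {n for n, d in sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:k]}

    # An emergent observation available only when the layers are integrated:
    print("Hubs in both layers:", hubs(ppi) & hubs(coexpr))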
Finally, networks can be used as a vocabulary generator, where different disciplines use the same “words” to refer to the same underlying concepts (cell behaviour, health, disease, etc.). Using this approach, researchers can continue to study their respective disciplines while engaging in interdisciplinary dialogue to advance systems precision medicine in a clinical setting.

8. Conclusions

Systems precision medicine, as an interdisciplinary approach, plays a key role in tackling complex diseases. This approach requires the integration of clinical knowledge, data and analytical models, as well as the in-depth knowledge of physicians and data analysts to personalise the care pathway. The challenge, however, is to reconstruct the whole picture of the disease after analysing the individual parts to identify the relevant aspects for therapy and diagnosis. Systems precision medicine focuses on identifying key system components and their interactions and studying the consequences of these interactions and the patterns that emerge. The use of statistical, mathematical and computational tools supports this approach but is not sufficient. The implementation of systems precision medicine presents unique challenges because it requires agreement between researchers on how to combine data and expertise into a coherent vision for clinical application. Cultural integration, including the personal experience of each professional involved, is essential.
Descartes’ dream of unifying all the sciences through mathematics, after the initial spectacular successes heralded by the metaphor of Ehrlich’s “magic bullet” for identifying therapeutic targets, is showing all its limitations. Systems precision medicine, on the other hand, seeks to consider the patient as a whole, which overcomes the reductionist approach that separates the disease from the patient, by integrating data, mathematical and statistical models, and active participation (not just supervision) of the professionals involved, from the clinician to the data analyst, in the diagnosis and treatment process. It is important to combine data analysis with an overview and collective evaluation in order to make effective decisions, because—as shown by the choice of TV series to be produced—the use of big data alone does not guarantee success but requires an integrative and collective evaluation.
Moreover, in the realm of biological research, it is crucial to differentiate between statistical significance and biological meaning when interpreting experimental outcomes. While statistical significance provides a measure of the reliability of an observed effect, it does not necessarily imply that the observed effect is biologically meaningful or relevant. This distinction emphasises the importance of sanity checking or critically evaluating research findings beyond purely statistical analyses. In fact, statistical significance alone does not provide insight into the magnitude or relevance of the observed effect. It is essential to consider the biological context and the practical significance of the findings. Biological meaning refers to the relevance, importance and interpretability of the observed effect within the context of the specific biological system being studied.
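A small numerical illustration of the distinction, on simulated data only (using numpy and scipy): with a large enough sample, even a biologically negligible 0.5% shift in a measured level produces a p-value far below any conventional threshold, while the standardised effect size remains tiny.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 100_000
    control = rng.normal(loc=10.00, scale=2.0, size=n)  # e.g. an expression level
    treated = rng.normal(loc=10.05, scale=2.0, size=n)  # a tiny 0.5% shift

    t, p = stats.ttest_ind(treated, control)
    cohens_d = (treated.mean() - control.mean()) / np.sqrt(
        (treated.var(ddof=1) + control.var(ddof=1)) / 2
    )
    print(f"p-value = {p:.2e}")           # typically far below 0.05
    print(f"Cohen's d = {cohens_d:.3f}")  # roughly 0.02-0.03: biologically negligible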
In addition, sanity checking involves scrutinising the experimental design, methodology and potential confounding factors that may influence the results. It is essential to consider factors such as the sample size, study population characteristics, experimental controls and the validity of the chosen statistical tests. Conducting independent replication studies or meta-analyses can further strengthen the reliability and generalisability of the findings. By incorporating sanity checking, researchers and scientists can ensure that their interpretations of statistical results align with the underlying biology and have meaningful implications. It helps to prevent unwarranted conclusions, misinterpretations and overgeneralisations based solely on statistical significance.
The recognition that complex phenotypic outcomes in biology are driven by chains of physically or functionally related activities of biological components highlights the importance of establishing causal relationships and associations. Statistical associations, while valuable, must be scrutinised in light of these connections in order to distinguish genuine underlying mechanisms from chance findings. By focusing on the interactions and relationships between biological components, researchers can gain a deeper understanding of complex biological phenomena.
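The following toy simulation (pure noise, no real biology implied) shows why this scrutiny matters: screening a thousand features with no true effect still yields dozens of nominally “significant” associations, which an elementary multiple-testing correction removes.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_features, n_samples = 1000, 30
    pvals = []
    for _ in range(n_features):
        healthy = rng.normal(size=n_samples)  # pure noise, no group difference
        disease = rng.normal(size=n_samples)  # pure noise, no group difference
        pvals.append(stats.ttest_ind(healthy, disease).pvalue)
    pvals = np.array(pvals)

    print("Nominally significant (p < 0.05):", (pvals < 0.05).sum())          # ~50 by chance
    print("After Bonferroni correction:", (pvals < 0.05 / n_features).sum())  # usually 0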
The key issue for an effective implementation of systems precision medicine in a clinical setting is to encourage direct collaboration between precision physicians and multi-omics data analysts, which is an issue of great relevance to patients. However, this collaboration is blocked by the fashionable approach called “data-driven” or “digital-driven”, where it is believed that the physician’s role should only be that of an independent evaluator of the output of an algorithm. The idea that one can automatically extract useful information for patients using somewhat “intelligent” algorithms is deeply misguided and dangerous: data alone should never be trusted; it should always be put in the right context and in the more general perspective of the patient, which is an operation that can only be undertaken by an expert physician in the field. In other words, the message here is that there are no one-way roads from data to patients, but that a collective enterprise of integrating data and skills is needed. This means that the computational and mathematical approaches must coexist harmoniously with the more artistic approach of intuition and the physician’s non-formalised knowledge of the patient’s general context.
What is the rational basis or premise for this intuition? Physician intuition, often referred to as clinical intuition or clinical judgement, is the ability of experienced physicians to make accurate and rapid decisions based on their intuition or gut feeling, even in the absence of explicit conscious reasoning. It is an important aspect of medical practice, particularly in situations where time is limited or information is incomplete. The rational basis or premise of medical intuition lies in the accumulation of knowledge and expertise over years of clinical practice. Through extensive training, doctors develop a deep understanding of medical concepts, diseases and patterns of patient presentation. This knowledge is stored in their long-term memory and is readily accessible during clinical encounters. Doctors develop an intuitive sense of recognising patterns, picking up subtle clues and synthesising complex information. They learn to identify similarities between a patient’s symptoms, signs and medical history and those seen in previous cases. This recognition of familiar patterns enables them to make quick connections and generate diagnostic hypotheses. Intuition can be seen as a form of pattern recognition that occurs rapidly and automatically, drawing on a vast repertoire of experiences. In addition, physician intuition is supported by the integration of both conscious and unconscious cognitive processes. While explicit conscious reasoning and analytical thinking are essential to medical decision-making, intuitive judgements involve unconscious cognitive processes that operate below the level of conscious awareness. These processes, including heuristics, mental shortcuts and pattern recognition, allow doctors to process information quickly and make intuitive judgments.
Biologists and physicians have a wealth of expertise in known biological pathways and systems. This expertise enables them to streamline the process of prioritising associations that are more likely to be mechanistically feasible based on the known biological context. In doing so, they can effectively triage potentially spurious statistical quirks and focus on associations that hold greater promise for further investigation and understanding.
The path to diagnosis and treatment of a complex disease can be compared with a collective work of art, requiring not only competence but also the presence of the human element. It is necessary to combine the creativity of nature, which generates the disease, with that of human experts, who must create a unique path based on data. This requires cooperation between physicians and data analysts, with both contributing to the construction of the patient’s story. It is crucial to involve data analysts directly in the hospital context and to enable precision medicine physicians to understand and benefit from the new perspectives offered by data analytics. The physician must be involved in the loop: there cannot be a direct, linear data–diagnosis–therapy pathway, but rather a circular activity of encounter/clash between data, analysis techniques and contextual medical knowledge. There is also an urgent need for continuous dialogue and new languages of communication, such as the language of networks [20], as well as education that promotes continuous learning and collaboration between the two professions. This revolution in medicine requires a radical change in our conception of ourselves as physicians and data analysts, challenging us to welcome the other into our cognitive process. This challenge goes beyond algorithms, as “intelligent” as they may be, and requires a personal commitment to allow the “other” to enter and reach new levels of true interdisciplinary collaboration for personalised care.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Acknowledgments

I would like to thank Marco Filetti and Manuela Petti for their critical reading of the manuscript.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Apweiler, R.; Beissbarth, T.; Berthold, M.R.; Blüthgen, N.; Burmeister, Y.; Dammann, O.; Deutsch, A.; Feuerhake, F.; Franke, A.; Hasenauer, J.; et al. Whither systems medicine? Exp. Mol. Med. 2018, 50, e453.
  2. Wikipedia Page. Available online: https://en.wikipedia.org/wiki/Philosopher%27s_stone (accessed on 11 June 2023).
  3. Di Bella, D.J.; Habibi, E. Genetics of cortical development. In Encyclopedia of Child and Adolescent Health; Elsevier Science: Amsterdam, The Netherlands, 2023; pp. 25–39.
  4. Beniaguev, D.; Segev, I.; London, M. Single cortical neurons as deep artificial neural networks. Neuron 2021, 109, 2727–2739.e3.
  5. Subramanian, I.; Verma, S.; Kumar, S.; Jere, A.; Anamika, K. Multi-omics Data Integration, Interpretation, and Its Application. Bioinform. Biol. Insights 2020, 14, 1177932219899051.
  6. Metwaly, A.M.; Ghoneim, M.M.; Eissa, I.H.; Elsehemy, I.A.; Mostafa, A.E.; Hegazy, M.M.; Afifi, W.M.; Dou, D. Traditional ancient Egyptian medicine: A review. Saudi J. Biol. Sci. 2021, 28, 5823–5832.
  7. Grammaticos, P.C.; Diamantis, A. Useful known and unknown views of the father of modern medicine, Hippocrates and his teacher Democritus. Hell. J. Nucl. Med. 2008, 11, 2–4.
  8. Kleisiaris, C.F.; Sfakianakis, C.; Papathanasiou, I.V. Health care practices in ancient Greece: The Hippocratic ideal. J. Med. Ethics Hist. Med. 2014, 7, 6.
  9. Sydenham, T. The Works of Thomas Sydenham (Latham, R.G., Trans.); The Sydenham Society: London, UK, 1848; p. 13.
  10. Greene, J.A.; Loscalzo, J. Putting the Patient Back Together—Social Medicine, Network Medicine, and the Limits of Reductionism. N. Engl. J. Med. 2017, 377, 2493–2499.
  11. Kamada, T. System biomedicine: A new paradigm in biomedical engineering. Front. Med. Biol. Eng. 1992, 4, 1–2.
  12. Doyle, L. William Stark (1740–1770): His Life, Manuscript and Death. J. Med. Biogr. 2000, 8, 146–148.
  13. Debru, C. From nineteenth century ideas on reduction in physiology to non-reductive explanations in twentieth-century biochemistry. In Promises and Limits of Reductionism in the Biomedical Sciences; Van Regenmortel, M.H.V., Hull, D.L., Eds.; John Wiley: Philadelphia, PA, USA, 2002; pp. 35–46.
  14. TED Talks Website. Available online: https://www.ted.com/talks/siddhartha_mukherjee_soon_we_ll_cure_diseases_with_a_cell_not_a_pill? (accessed on 17 July 2023).
  15. Androutsos, G. Paul Ehrlich (1854–1915): Founder of chemotherapy and pioneer of haematology, immunology and oncology. J. BUON 2004, 9, 485–491.
  16. Stern, F. Paul Ehrlich: The founder of chemotherapy. Angew. Chem. Int. Ed. 2004, 43, 4254–4261.
  17. TED Talks Website. Available online: https://www.ted.com/talks/sebastian_wernicke_how_to_use_data_to_make_a_hit_tv_show? (accessed on 17 July 2023).
  18. IMDB Website. Available online: https://www.imdb.com/title/tt3012160/ (accessed on 17 July 2023).
  19. Bogdan, P.; Caetano-Anollés, G.; Jolles, A.; Kim, H.; Morris, J.; Murphy, C.A.; Royer, C.; Snell, E.H.; Steinbrenner, A.; Strausfeld, N. Biological Networks across Scales—The Theoretical and Empirical Foundations for Time-Varying Complex Networks that Connect Structure and Function across Levels of Biological Organization. Integr. Comp. Biol. 2021, 61, 1991–2010.
  20. Farina, L. Network as a language for precision medicine. Ann. Ist. Super. Sanita 2021, 57, 330–342.
  21. Silverman, E.K.; Schmidt, H.H.H.W.; Anastasiadou, E.; Altucci, L.; Angelini, M.; Badimon, L.; Balligand, J.L.; Benincasa, G.; Capasso, G.; Conte, F.; et al. Molecular networks in Network Medicine: Development and applications. Wiley Interdiscip. Rev. Syst. Biol. Med. 2020, 12, e1489.
