Review

An Evolution of Our Understanding of Decomplexification Estimation for Early Detection, Monitoring and Modeling of Human Physiology

by Milena Čukić Radenković 1,*, Camillo Porcaro 2,3,4 and Victoria Lopez 5
1 School of Life Sciences and Facility Management, University of Applied Sciences (ZHAW), 8820 Wädenswil, Switzerland
2 Biomedical Engineering Research to Advance and Innovate Translational Neuroscience (BRAIN Unit), Department of Neuroscience and Padova Neuroscience Center, University of Padova, 35128 Padua, Italy
3 Institute of Cognitive Sciences and Technologies (ISTC)-National Research Council (CNR), 00185 Rome, Italy
4 Centre for Human Brain Health and School of Psychology, University of Birmingham, Birmingham B15 2TT, UK
5 Quantitative Methods Department, Cunef University, 28040 Madrid, Spain
* Author to whom correspondence should be addressed.
Fractal Fract. 2026, 10(3), 169; https://doi.org/10.3390/fractalfract10030169
Submission received: 28 December 2025 / Revised: 13 February 2026 / Accepted: 27 February 2026 / Published: 4 March 2026

Abstract

Human physiology is among the most complex systems in nature, characterized by intricate structural and functional networks and rich temporal dynamics. Electrophysiological signals produced by different tissues and organs reflect physiological activity and are inherently non-stationary, non-linear, and noisy. This work focuses on fractal analysis, a framework that captures the self-similar and scale-free properties of electrophysiological signals, which can be regarded as outputs of complex physiological structures generating complex processes. Central to this approach is the principle of ‘decomplexification’, whereby aging and disease are associated with a loss of physiological complexity. We discuss key algorithms, particularly Higuchi’s fractal dimension, which is often combined with other nonlinear measures and machine-learning models for the real-time analysis of electrophysiological signals. Evidence shows that fractal metrics enable the early detection and monitoring of neurological and psychiatric disorders, outperforming traditional spectral measures. In movement disorders and mood disorders, fractal and nonlinear features show high diagnostic accuracy. Beyond diagnostics, we discuss therapeutic applications, including the prediction of responsiveness to non-invasive brain stimulation. Here, we trace the evolution of this approach: from the use of a single fractal or nonlinear measure, to the application of several measures, to their use as features for machine learning, and finally to the realization that a whole cluster of biomarkers is needed to reflect the state of the autonomic profile, which can in turn feed machine-actionable, ontology-based application profiles.
In addition, we discuss the fractal and fractional description of transport processes, which offers a markedly more accurate description of physiological reality as a prerequisite for further modeling: for example, this is needed for digital twins to support the clinical translation of fractal analysis for personalized medicine. In essence, if one is trying to mathematically describe or quantify structures or processes in human physiology, fractal and fractional methods are the most adequate approach for accurately modeling that reality.

1. Introduction

Physicians recognize disease through distinct patterns in a patient’s physiology or behavior that diverge from what is considered typical for healthy functioning. In order to reliably identify such deviations, it is first necessary to understand the defining characteristics of healthy human physiology. Only then can we detect, and ideally predict at an early stage, the subtle changes that indicate the beginning of pathological processes [1,2].
This is particularly important in neurology and psychiatry, where many disorders develop gradually over years. The sooner emerging changes in physiological dynamics are identified, the greater the chance of timely intervention: whether to slow down disease progression, prevent further decline, or preserve quality of life. Early prediction also raises the possibility that, in the future, therapeutic advances may allow for interventions at the most effective time.
A key step towards achieving this goal is to improve our ability to quantify healthy physiological variability. Traditional linear measures only provide a partial picture. By contrast, fractal and non-linear analytical methods offer a richer, more realistic representation of physiological signals by capturing the inherent complexity that characterizes healthy systems.
The conceptual foundations of fractal analysis can be traced back to Benoît Mandelbrot, who coined the term ‘fractal’ (from the Latin ‘fractus’, meaning ‘broken’) in 1975. Mandelbrot observed that natural forms cannot be described by Euclidean geometry: ‘Clouds are not spheres, mountains are not cones, coastlines are not circles, and bark is not smooth, nor does lightning travel in a straight line’ [3]. Such natural structures display self-similarity across scales. The same is true of physiological signals. For example, when an electrocardiogram (ECG) trace is observed at different magnifications, its rugged fluctuations remain recognizably similar. One segment of the signal resembles another; this property, called self-similarity, is a defining feature of fractal geometry.
This behavior can be described mathematically. Some fractals are exact, which is characteristic of mathematically generated objects, whereas statistical fractals better describe natural forms. The mathematical foundations for these concepts were established in the 19th and early 20th centuries by Sierpiński, Koch, Cantor, Weierstrass, Julia and Peano, among others. However, their functions were long considered ‘pathological’, as they could not be analyzed using traditional methods [3].
Fractal dimension is a quantitative measure of how a structure occupies space. Physiological structures, such as the folds and gyri of the human brain, maximize space usage in non-Euclidean ways. Similarly, the fractal dimension of a physiological signal estimates its temporal complexity, or how ‘wrinkled’ or irregular it is over time. Of the available algorithms, Higuchi’s fractal dimension (HFD; [4]) is particularly well-suited to electrophysiological signals, as it captures temporal complexity directly [5,6,7,8,9,10].
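For readers who want to experiment, a minimal NumPy sketch of Higuchi’s algorithm is given below; the function name `higuchi_fd` and the default `kmax = 8` are illustrative choices, not the exact implementation used in the studies cited here.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate Higuchi's fractal dimension of a 1-D time series.

    For each delay k, a mean normalized curve length L(k) is computed
    over the k possible down-sampled sub-series; the fractal dimension
    is the slope of log L(k) versus log(1/k).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean_lengths = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)            # sub-series x[m], x[m+k], ...
            # curve length of the sub-series, with Higuchi's normalization
            norm = (n - 1) / ((len(idx) - 1) * k)
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * norm / k)
        mean_lengths.append(np.mean(lengths))
    ks = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(mean_lengths), 1)
    return slope
```

On a smooth signal such as a sine wave the estimate approaches 1, while on white noise it approaches 2, matching the interpretation of HFD as a measure of temporal ‘wrinkledness’.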
While classical Euclidean geometry describes smooth, regular shapes, biological structures and processes display irregular, non-integer, fractional dimensions [5,11,12]. Fractal objects consist of nested sub-units that resemble the larger whole—a hallmark of self-similarity or scale invariance. This principle is exemplified by many natural systems, including vascular trees, pulmonary structures, Purkinje cell networks, coral reefs, coastlines and cloud formations [3,11,12].
In human physiology, fractal organization enables the efficient distribution of resources across complex, spatially extended systems such as blood flow, nutrient delivery, gas exchange and the electrical conductance of neuronal networks within the central nervous system (CNS). Importantly, fractal concepts apply not only to anatomical form but also to the processes occurring within these structures. Physiological processes generate irregular fluctuations across multiple time scales, and these temporal patterns also exhibit statistical self-similarity [5,10,11,12,13,14].
Recently, it has become normalized for very young researchers who are just entering the field to publish review papers using the PRISMA approach (designed for meta-analytic studies), whereas the original intent of the review format was for authors to give an overview after doing their own research, offering an experience-based opinion on, and conclusions about, the accomplishments of a research field. Concurrently, many authors are using AI agents to search and summarize the literature, despite these tools’ known inability to verify sources or provenance (in effect, an automatization of plagiarism), while many publishers still lack a systematic way to catch citations of nonexistent publications. With this work, we wanted to do something different (and old-fashioned): after years of trying to understand the intricate characteristics of physiological complexity, we focus on how our understanding of physiological complexity developed, leading to innovative interpretations and possible practical uses. We thought it would be interesting to illustrate how, over the years, a researcher’s perspective might change, as change is the only constant. The research questions we asked evolved from ‘how can we use fractal analysis to detect depression’ (i.e., to differentiate patients from healthy controls), to ‘can a cluster of different nonlinear biomarkers capture the state of the aberrated autonomic system in depression’, and finally to ‘how can we connect that with the clinical presentation that clinicians see in a patient’.
This work summarizes (via human intelligence only) the application of fractal and nonlinear analysis in clinical neuroscience, from our own and our immediate network’s research. In the beginning, we focused on one or two nonlinear measures as potential biomarkers (also analyzing how compatible they are); then, we realized that every single nonlinear measure provides different information from a signal under study. Then, we realized that clusters of biomarkers actually represent the autonomic profiles, and those can be either used for further machine learning or for knowledge discovery via ontology-based application profiles, which is the latest direction of our work. We also highlight recent advances in the fractal and fractional calculus-based modeling of physiological transport processes (through highly inhomogeneous structures of human tissue) that support the development of digital twins, as we understand that complex structures require improved mathematical modeling.
This work is organized into sections that cover necessary theoretical reviews from the older literature (seldom cited recently/a form of ‘academic amnesia’ we want to correct), a section on the interpretation of regularity changes, and then sections dedicated to different research projects answering specific tasks (from the detection of depression, to differentiating between the phases of disease, to cluster biomarkers, to NIBS therapeutic effect interpretation, to mild cognitive impairment early detection, to movement disorders’ early detection), followed by a methodological comparison of two nonlinear measures, where we also discuss the technical aspects of preprocessing, the contrasting algorithms used, and the limitations of this approach. Finally, we discuss one recent application of the fractal and fractional approach that is important for the adequate development of digital twins, and close with conclusions and further directions.

2. What Is Decomplexification?

The concept of decomplexification, a pathological reduction in physiological complexity described by Goldberger, Pincus, Hausdorff and colleagues, offers a unifying framework for interpreting neurological disorders [5]. Within this framework, HFD and related nonlinear metrics quantify multiscale irregularity in neural activity, enabling the early, disease-specific detection of altered brain dynamics.
Early theoretical studies of the healthy heartbeat revealed that physiological systems combine organization (order) and variability (disorder), rather than being simply orderly. As Goldberger and colleagues [5] observed, this healthy state exhibits hierarchical, long-range organization, which is enabled by scale-invariance or self-similarity. This ‘organized variability’ is the characteristic signature of a natural, healthy process. As Goldberger observed, a fractal system exhibits a type of ‘roughness’ or irregularity that remains statistically similar across a wide range of scales. This underlying scale-invariant mechanism allows for a wide range of interindividual variability—a key challenge for engineers modeling these processes—and aligns with contemporary models of biological self-regulation, such as the Active Inference model [13,14,15]. As this fractal nature appears to be a fundamental mechanism of physiological structures and functions, it follows that standard mechanistic or reductionist (e.g., Fourier-based) approaches to signal analysis are inadequate [7]. To better model physiological systems, fractal and nonlinear approaches should be adopted instead [16,17].
Healthy complexity is often defined as an organism’s natural ability to adapt to constant external and internal changes. As we age, this physiological complexity decreases slightly, making the system less adaptable. Disease, however, is different. Rather than a gradual decline, disease often causes an abrupt breakdown of this fractal organization, meaning the physiological system becomes severely disrupted. This breakdown creates opportunities for new approaches to early detection and monitoring. This leads to what Goldberger calls the ‘central paradox’: many illnesses, despite being labeled ‘disorders’, exhibit strikingly predictable (periodic) behavior [5]. For instance, patients diagnosed with Parkinson’s disease often exhibit an ‘exactly predictable’ tremor, an involuntary movement. This ‘stereotypy of disease’, where the pathological pattern is rigid and regular, is a key concept for clinicians and is the very opposite of healthy variability [18,19].
Our goal is to match this clinical stereotype with a mathematical framework that can quantify it using physiological signals. Using our knowledge of healthy physiology, we can predict disease as a measurable breakdown of long-range order, resulting in a loss of self-similar structure. This pathological loss of complexity is termed ‘decomplexification’ [5]. In other words, the measurable loss of physiological complexity, which leads to more predictable, rigid and periodic behavior, is the mathematical counterpart to clinical stereotypy.
This loss of complexity is evident in the emergence of highly periodic dynamics (and a corresponding loss of information) in numerous clinical states. The first recorded instance occurred in 1816, when Dr. John Cheyne described the ‘highly predictable oscillatory nature’ of congestive heart failure, evident in abnormal cyclic breathing patterns (Figure 1), now known as Cheyne–Stokes breathing. This rigid regularity contrasts sharply with the normally irregular and complex coupling of healthy heart and breathing rhythms. Following Cheyne and Stokes, Dr. Riemann cataloged numerous ‘dynamical diseases’ with strong periodicities that contrast sharply with the deterministic chaos of healthy dynamics. This principle applies broadly: decomplexification is evident in the predictable patterns observed in blood cell counts in leukemia, epileptic seizures, certain psychiatric behaviors (e.g., OCD), sudden infant death syndrome (SIDS) and heart failure, to name a few [19,20]. In conclusion, these highly structured, periodic patterns represent a significant reduction in complexity (i.e., healthy variability) in pathological conditions. Many authors have demonstrated this phenomenon in the field of physiological complexity, allowing clinicians to recognize the ‘stereotypy of disease’ [1,5,12,13,19,20]. We will discuss some recent applications of this concept in psychiatry and neurology.

3. What Does Regularity in Physiology Mean?

Steven M. Pincus is an unjustly little-known author who has made a significant contribution to our understanding of how physiological processes should be modeled. His initial work addressed randomness and degrees of irregularity in physiological datasets, clarifying that terms such as ‘random’ or ‘chance’ are often misunderstood and that non-mathematicians often assume probability theory adequately addresses them. He explained that, in contrast to the moment statistics and variability measures typically used, which neglect the sequential structure of the data, a regularity statistic such as approximate entropy (ApEn) is needed. ApEn can detect irregularities that may indicate emerging clinical symptoms [19], and it forms part of a general theoretical development as the natural information-theoretical parameter for a process [19,20]. The loss of information mentioned above in the Decomplexification Section is addressed by entropy-based measures in a more statistical physics-based context; this perspective is also applied in [21].
This regularity statistic is directly associated with the concept of decomplexification. It links compromised physiology in multiple systems to more regular, patterned behavior (e.g., the abnormally regular heart rate tracings of infants at risk of sudden infant death syndrome), whereas normal physiology is linked to greater irregularity and complexity (e.g., the rich variability of a healthy sinus rhythm), as discussed by Pincus and Goldberger [18,19,20]. The formulation of ApEn was driven by the need to quantify ensemble regularity versus randomness in physiological signals. ApEn measures the logarithmic likelihood of patterns remaining close over subsequent observations: greater regularity produces smaller ApEn values [18,19,20]. Compared with the fractal dimension, greater regularity (higher predictability) usually leads to, or can be interpreted as, lower complexity: two sides of the same coin.
The key difference between ApEn and standard frequency- or time-based measures is the order of the data. Standard statistics are primarily concerned with how ‘smeared’ the data are around the mean, so the sequence of samples is irrelevant. In entropy-based measures such as ApEn, however, the order of the data is the central factor. Pincus demonstrated this by showing that shuffling the order of samples in a physiological dataset results in white noise (completely random structure). This illustrates a loss of intrinsic relationships and highlights that a loss of complexity also entails a loss of information from the system. The primary focus of this parameter is discerning these changes in order, from apparently random to very regular [18,19,20]. Consequently, pathologically low complexity (decomplexification) is still a feature of an even partially organized system, while randomness is characteristic of the total absence of an intrinsic structure, or in physiology, existing controls via feedback loops.
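Pincus’s definition, and his shuffling argument, can be sketched as follows; `m = 2` and `r = 0.2·SD` are the conventional defaults (not necessarily the parameters used in the cited studies), and self-matches are included, as in the original formulation.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series, after Pincus.

    Self-matches are included, as in the original formulation; r
    defaults to the conventional 20% of the series' standard deviation.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of length-mm templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        frac_close = np.mean(dist <= r, axis=1)   # fraction of matching templates
        return np.mean(np.log(frac_close))

    return phi(m) - phi(m + 1)
```

Shuffling the samples of a regular signal (e.g., a sine wave) destroys its sequential structure and sharply raises ApEn, even though the mean and standard deviation are unchanged, which is exactly the point about order made above.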
However, interpreting entropy can be difficult, since physiological signals comprise both stochastic and deterministic components. While low complexity in pathological cases is often clear, high complexity can be ambiguous. In our own research [22], we found that patients with depression had higher levels of complexity in their EEGs. Surprisingly, their EEG complexity increased further during recovery. This led us to question whether this change was due to a return to healthy, adaptable complexity, or whether it was instead due to ‘pathological randomness’ resulting from a decoupling of underlying central control mechanisms [23]. This demonstrates the utility of ApEn as a sophisticated tool for probing system dynamics and for flagging that something is wrong. Its correspondence to mechanistic inferences regarding subsystem autonomy, feedback and coupling has been demonstrated in various contexts [24]. For instance, in studies of cortisol secretion in depressed men, ApEn findings indicated a loss of regulatory control [25]. ApEn has also been applied to daily self-rating mood data to identify mood regulation dynamics that could inform phenotypic classification [26]. ApEn typically increases with greater system coupling and external influences, thus providing an explicit measure of autonomy. Conversely, greater regularity (lower ApEn) corresponds to greater component and subsystem isolation, as observed in sudden infant death syndrome (SIDS), and is associated with compromised physiology [18,19,20].

4. Applications of Complexity Measures in Psychiatry: Depression Detection and Monitoring of Therapeutic Outcomes

The application of complexity measures in psychiatry has shown particular promise in the detection of major depressive disorder (MDD). This line of inquiry is driven by neurobiological findings such as the demonstration that, in patients with MDD, the uncinate fasciculus, a deep white matter tract connecting the prefrontal cortices with the limbic system, is impaired [27]. This structural impairment can be as high as 35%, leaving only 65% intact, and provides a potential explanation for why patients with mood disorders struggle to control their emotions. This damage may force the brain to develop compensatory mechanisms, often recruiting atypical areas to perform the same task (‘a different program’). Building on this, in 2014/15, we speculated that, as in movement disorders, this deep-structure compensation might produce detectable changes in cortical excitability, which could be measured using EEG. While other studies have confirmed this fronto-limbic disconnection and disrupted global network topology using fMRI and diffusion tractography [28], we aimed to establish whether these disturbances could be detected using EEG, which is much more accessible and cost-effective. Table 1 gives an overview of several similar studies addressing the same classification task.
Several publications have originated from this research. We developed a methodology that uses fractal and entropy measures (specifically HFD and SampEn), extracted from resting-state (closed-eyes) EEG, as features for machine learning models. Our initial results showed good separation between patients with major depressive disorder (MDD) and age-matched healthy controls (HCs) [23]. A subsequent publication explored this further by comparing seven popular machine learning classifiers on the same dataset. This study demonstrated that, when characterized by these nonlinear features, any of the tested classifiers could accurately distinguish between MDD patients and HCs [22]. Principal component analysis (PCA) confirmed that the data were clearly separable based on these features; what is more, when the two measures were combined, even the first three principal components sufficed for accurate detection, regardless of the model used. The consistent performance of all the classifiers led to the key conclusion that, for this specific task, the primary diagnostic power may lie in the nonlinear characterization itself, rather than in the sophistication of the machine learning model used. However, this approach is not without limitations. The field continues to suffer from relatively small datasets (in our study, N was 46; in Ahmadlou [29] it was 24; in [30] it was 45; in Bairy [31] it was 60; in [32] it was 30; and in Bachmann [33] it was 26; even in newer attempts at the same tasks, the sample size seldom exceeds 100), which carries a significant risk of overfitting. This lack of generalizability, coupled with the ‘curse of dimensionality’ and ‘blind spots’ [34], prevents clinically useful automation and highlights the need for continued validation [22,34].
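The point that classifier choice matters less than the nonlinear characterization can be illustrated with a toy sketch on synthetic features standing in for per-channel HFD/SampEn values. The group means, the N = 46 scale and the three-component PCA below mirror the setup discussed above, but the data here are simulated, not the original EEG features:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for per-channel nonlinear features (HFD-like values):
# two groups of 23 subjects each, mimicking the N = 46 scale, with a
# modest group shift across 10 "channels".
rng = np.random.default_rng(0)
n_per_group, n_features = 23, 10
controls = rng.normal(1.6, 0.1, size=(n_per_group, n_features))
patients = rng.normal(1.8, 0.1, size=(n_per_group, n_features))
X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "svm": SVC(),
    "forest": RandomForestClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
}
# Keep only the first three principal components before classification,
# echoing the observation that three components sufficed.
scores = {
    name: cross_val_score(make_pipeline(PCA(n_components=3), model), X, y, cv=5).mean()
    for name, model in models.items()
}
```

When the features separate the groups this cleanly, every model scores near-perfectly, which is precisely why such results must be read with the overfitting caveats discussed above.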
In addition, we noted that such models also learn so-called ‘nuisance variables’: characteristics of the training dataset that we did not specify as important, but which the model learns anyway and, in essence, tries to identify in new, unseen data.
In another experiment, we adapted this approach to distinguish between phases of major depressive disorder (MDD), by recording resting-state electroencephalograms (EEGs) from patients over a longer period and comparing their recordings during exacerbation (episode) versus remission. The non-linear measures again provided a clear distinction between these two states, but a surprising finding emerged: the complexity of the patients’ EEG signals was higher during remission than during an acute episode [22]. We offered a neurophysiological explanation for this, drawing parallels with the changes in excitability seen in movement disorders and the uncinate fasciculus damage known to occur in MDD [27]. These results suggest that inexpensive, non-invasive EEG analysis using nonlinear methods could potentially be employed for the clinical monitoring of patient progress and the efficacy of therapeutic interventions.
However, this finding also raises a critical question. While the lower complexity during the acute episode aligns with the decomplexification hypothesis, the further increase in complexity during remission is ambiguous. It remains unclear whether this elevated complexity indicates a genuine return to healthy, adaptive variability, or whether it conceals an escalation in pathological randomness—a system whose intrinsic stabilizing mechanisms and feedback loops are impaired, resulting in a loss of overall integrity.
While earlier studies of severe depression [35] concluded that a brain treated with antidepressants remains distinctly different from a healthy brain (in a biochemical sense, as evidenced by easy relapse after the same challenge), we can also take into account that once compensation occurs (to circumvent missing or insufficient connectivity), remission does not correspond to a return to ‘healthy’ functioning but instead restores affective control by other means. Building on this, a subsequent review [36,37] compared the results of various groups that used nonlinear measures (often involving machine learning) with those that used standard spectral or time-based measures. The findings strongly suggested that the non-linear approach is more accurate and has a greater chance of being translated into clinical practice, since many of these measures can be calculated in real time to inform clinical decisions more effectively. Based on the effect sizes, this approach appears superior to, and more advisable than, the current standard for signal analysis. However, despite this early attention in computational psychiatry, we also recognized significant limitations in generalizability and practical real-life application that must be addressed [38].
As Klonowski noted [2], the main reason why these nonlinear approaches are not more widely accepted and used might lie in the simple fact that the majority of electrophysiological equipment (devices that record EEG, ECG, and EMG) ships with software implementing only frequency- or time-domain (FFT-related) measures. We discuss this and other technical aspects of signal analysis and ML-related issues in Section 9 and Section 10, below.

5. Electrocardiogram-Extracted Cluster Biomarkers for MDD

While the previous section focused on EEG-based feature extraction, we must recognize that the ECG signal is also an important source for detection, given the close relationship between the brain and the heart as two coupled dynamic systems (cardiopulmonary regulation). This neuroanatomical link is grounded in the Central Autonomic Network (CAN), which physically connects major neural hubs involved in the pathophysiology of MDD (e.g., the dorsolateral prefrontal cortex (DLPFC), the insula and the hippocampi) with the autonomic nervous system (ANS). Furthermore, the vagus nerve is widely accepted as critical for ANS allostasis, and the ANS is known to be dysregulated in depression. In practice, the ECG signal is simpler and easier to measure, and many medical-grade wearable devices and a well-established telehealth monitoring infrastructure are already available.
Steven Pincus (1994) [19] was the first to systematically explore the validity of ECG-extracted nonlinear measures. He demonstrated that newborns at risk of sudden infant death syndrome (SIDS) could be identified using approximate entropy (ApEn) analysis of their ECGs. Although the ECG records compared, one from a healthy infant and one from an infant who had narrowly escaped SIDS, had almost the same mean, their ApEn values differed seven-fold, which became the basis for an automatic warning control in pediatric intensive care departments at that time [20]. This approach was later applied to depression and anxiety by Kemp and colleagues [39,40,41], who also used measures such as detrended fluctuation analysis (DFA) and Poincaré plots to successfully distinguish patients from healthy controls. They noted that distinctions were more pronounced with greater depression severity [39,40,41]. The same was found to be significant in detecting anxiety, which often co-occurs with MDD. This line of analysis has even been extended to detect suicidal ideation [42] and has been corroborated by other physiological measures, such as electrodermal hypoactivity [43].
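Detrended fluctuation analysis, mentioned above, can be sketched in a few lines; the scale grid and the first-order (linear) detrending below are common default choices, not necessarily those of the cited studies.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """DFA scaling exponent alpha of a 1-D series (first-order detrending).

    The series is integrated, divided into non-overlapping windows at
    each scale, a linear trend is removed per window, and the slope of
    log F(n) versus log n gives alpha (about 0.5 for white noise,
    about 1.5 for a random walk).
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated signal
    fluctuations = []
    for n in scales:
        segments = len(profile) // n
        sq_res = []
        t = np.arange(n)
        for s in range(segments):
            seg = profile[s * n:(s + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq_res.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_res)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha
```

In HRV work the input would be an interbeat-interval series rather than the raw ECG; alpha near 1 is typically read as healthy long-range correlation, with deviations in either direction as departures from it.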
Following years of research, it has become evident that researchers should never rely on just one non-linear measure. Systematic comparisons [44] show that each measure extracts unique information about the signal. The ‘art’ lies in finding an optimal combination of measures to support clinical decision-making. In a recent pivotal study, Weber and colleagues [45] re-analyzed the UK Biobank dataset (comprising 15,768 participants) to determine whether clusters of ECG-derived HRV biomarkers could accurately reflect known autonomic nervous system (ANS) abnormalities in depression and correlate with clinical symptoms. Using K-means clustering, they identified distinct psychophysiological profiles. They pinpointed a high-risk cluster (LRS, or low relative sympathetic cluster) that was associated with maladaptive stress-coping strategies. Interestingly, they identified two clusters with low HRV but different risk profiles: one cluster with high relative sympathetic activity (HRS) and a lower breathing rate was associated with greater resilience, whereas a cluster with dominant relative parasympathetic activity and a higher breathing rate was associated with a higher prevalence of depression and suicide attempts. These biologically derived clusters aligned closely with the psychometric scales, suggesting that susceptibility to depression is associated with more rigid and maladaptive coping strategies [45]. Although it has long been known that chronic stress and depression are associated with low heart rate variability, this particular result is best interpreted within the Active Inference framework [16,17,18]. In the LRS cluster, rigid priors based on depressogenic inferences lead to a hypervigilant state in which a constantly increased metabolic need is anticipated (i.e., an increased breathing rate, BR), which increases the prediction error.
To minimize the prediction error, parasympathetic tonus is elevated in an ongoing attempt to counterbalance an overactive sympathetic nervous system, guiding the body toward sickness behavior and reflecting maladaptive physiological coping [45].
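The clustering logic of such studies can be illustrated schematically. The feature names and group parameters below are invented for illustration only and do not reproduce the UK Biobank analysis of Weber and colleagues [45]:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns are invented HRV-style biomarkers: [SDNN, RMSSD, LF/HF ratio,
# breathing rate]; the three group means are chosen only to mimic
# qualitatively distinct autonomic profiles.
rng = np.random.default_rng(42)
balanced = rng.normal([50, 42, 1.0, 14], [5, 5, 0.2, 1.0], size=(100, 4))
low_hrv_sympathetic = rng.normal([25, 15, 2.5, 13], [4, 3, 0.3, 1.0], size=(100, 4))
low_hrv_parasym_high_br = rng.normal([28, 35, 0.4, 17], [4, 4, 0.1, 1.0], size=(100, 4))

# Standardize so no single biomarker dominates the Euclidean distance,
# then partition into three profiles with K-means.
X = StandardScaler().fit_transform(
    np.vstack([balanced, low_hrv_sympathetic, low_hrv_parasym_high_br])
)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
```

The recovered cluster labels can then be cross-tabulated against clinical variables (psychometric scores, diagnoses) to test whether biologically derived profiles carry clinical meaning, which is the step that made the cited study compelling.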
This approach strongly supports the recent position paper of the European College of Neuropsychopharmacology (ECNP) on its Precision Psychiatry Roadmap [46]. The paper advocates moving away from the current ineffective nosology and focusing instead on biology-based markers to improve clinical decision-making. Building on this, our current work combines fractal and nonlinear electrophysiological clusters with additional peripheral metallomic biomarkers [47]. This could enhance the effectiveness of large-scale, out-of-clinic screening, particularly for underserved subpopulations such as perimenopausal and postmenopausal women, who are at a higher risk of cardiovascular disease and mortality when experiencing depression (in print). We have known for quite a long time that reliance on a single measure extracted from an electrophysiological signal is misleading, as Burns and Ryan rightly stated [48], demonstrating that each (mathematically) slightly different measure extracts different information. Our current research already connects the above-mentioned measures, enhanced with fractal and several entropy-based ones, with items on the CIDI psychometric scale to form ontology-based application profiles resulting from applied semantic modeling. As a derivative of knowledge modeling and the FAIR guiding principles, these ontologies are machine-actionable, making the data AI-ready and allowing our research to move from a couple of statistical learning models of choice to being findable, accessible, interoperable and reusable (in press).

6. Complexity-Based Explanation for Why NIBS Are Effective in Depression Treatment

Information obtained by applying non-linear measures to neurodegenerative disorders has revealed a new application in neurology. For example, in movement disorders, neurologists have long recognized that elevated levels of cortical excitability (which can be detected as elevated complexity) in patients with Parkinson’s disease (PD) are often a result of the brain attempting to compensate for significant structural (deep brain structures/substantia nigra) damage. The brain’s remarkable plasticity enables it to utilize existing functional structures to perform new tasks when a specific structure is damaged, as is evident following a stroke or injury. We believe that this finding of elevated excitability could also be used for diagnostic purposes.
We hypothesized that the same principle might apply to major depressive disorder (MDD). A key feature of MDD is an inability to regulate emotions, which corresponds with a known physical deficiency: the uncinate fasciculus—the deep white matter tract that links the prefrontal cortex with the limbic system—is compromised [27]. When its functionality is reduced (e.g., to 65% compared to healthy controls), it cannot transmit regulatory messages effectively. We questioned whether, as in PD, this deep-structure compromise in MDD would lead to detectable changes in cortical excitability. First, we confirmed that patients with MDD do exhibit altered levels of cortical excitability (complexity), which we used for detection [21,22,28].
This raises the question of how non-invasive brain stimulation (NIBS), such as rTMS and tDCS, affects this situation. In order to gain an understanding of the physical process of stimulation, we replicated the mathematical approach of Mutanen and colleagues [39], who used recurrence plots to demonstrate that, following a TMS stimulus, the shift in brain state persists beyond the duration of the stimulation itself. Using our own tDCS data, we confirmed this, finding that the brain remained in a ‘highly improbable state’ for more than half an hour after stimulation ended—a timeframe that meets the neuroscientific benchmark for plasticity [40]. We hypothesized that the therapeutic efficacy of NIBS lies in its ability to modulate pathological excitability. Our work suggests that rTMS and tDCS help patients with depression by lowering their already elevated excitability for a period of time [41]. This also corresponds with the fact that responsive patients often require another series of treatments after around six months (maintenance), as the effect is not permanent, but provides significant relief.
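Recurrence analysis of this kind can be sketched compactly. The following is a generic, minimal implementation of a thresholded recurrence matrix on a delay-embedded signal; the embedding parameters, the threshold rule and the toy ‘state shift’ signal are our assumptions for illustration, not the exact pipeline of Mutanen and colleagues [39].

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, eps=None):
    """Binary recurrence matrix of a delay-embedded scalar series.

    dim, tau : embedding dimension and delay (illustrative choices)
    eps      : recurrence threshold; defaults to 10% of the max pairwise distance
    """
    n = len(x) - (dim - 1) * tau
    # time-delay embedding: row i = (x[i], x[i+tau], ..., x[i+(dim-1)*tau])
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    if eps is None:
        eps = 0.1 * d.max()
    return (d <= eps).astype(int)

# Toy example: a state shift (mean jump mid-series) appears as block structure
t = np.arange(400)
x = np.sin(0.2 * t) + np.where(t < 200, 0.0, 2.0)
R = recurrence_matrix(x)
# recurrence rate within the first half vs. across the two halves
within = R[:198, :198].mean()
across = R[:198, 200:].mean()
print(within > across)  # the pre- and post-shift states rarely recur into each other
```

In a plot of R, a persistent post-stimulation state change shows up as a dense block on the diagonal with sparse off-diagonal recurrences, which is the signature discussed above.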
Our colleagues from our immediate network confirmed the idea that changes in complexity can describe this phenomenon. Lebiecka and colleagues [42] used HFD to analyze 64-channel EEG in patients with MDD and bipolar depressive disorder (BDD) undergoing repetitive transcranial magnetic stimulation (rTMS). Their results showed that HFD can be a useful marker for evaluating the effectiveness of rTMS and for differentiating between responders and non-responders. Differences in HFD were demonstrated after the application of rTMS, but the trend detected differed between MDD and BDD patients. The researchers measured the difference induced by stimulation after the 1st, 10th and 20th sessions to differentiate between responders and non-responders. In MDD responders, FD decreased after the longer stimulation (the whole therapeutic cycle of 20 sessions). In MDD non-responders, FD increased after the 1st and the 10th sessions, but overall, after the long stimulation period, it did not change. When MDD responders and non-responders were compared, FD was higher in responders in every band, both before and after the stimulation. In BDD, the trends were the opposite. This means that even without testing after the first session, the clinician can estimate the difference between responders and non-responders [42]. This suggests that changes in HFD can unambiguously indicate whether the effect of stimulation is positive or nonexistent; identifying responders early allows for effective and timely adjustments towards the best therapeutic outcomes. Perhaps the next step would be to apply a similar methodology to [49].

7. Applications in Neurology

The concept of decomplexification—a pathological loss of physiological complexity originally articulated by Goldberger, Pincus, Hausdorff and their colleagues—provides a unifying framework for understanding neurological disorders [5,19,50,51,52]. Within this framework, HFD and related nonlinear metrics serve as powerful tools for quantifying multiscale irregularity in neural activity. They reveal early and disease-specific alterations in brain dynamics.
Fractal-based biomarkers have been successfully applied to several neurological conditions. In epilepsy, HFD captures aperiodic shifts related to cortical excitability and treatment effects [41,53,54]. In stroke patients, HFD decreases acutely and increases during recovery, tracking neuroplasticity [5,42]. Furthermore, HFD distinguishes between different levels of consciousness and depth of anesthesia [50,55,56] and reveals network-specific alterations in migraines [57,58,59,60]. Our earlier work demonstrated globally elevated cortical excitability in PD [61,62], which is likely to reflect cortical compensation for subcortical damage, an effect that can be quantified through nonlinear EEG analysis. This utility also applies to multiple sclerosis (MS), where fractal dimension has been used to quantify the neurodynamic state of the sensorimotor cortex. Fatigue in MS [62] was found to distort the dynamics in the primary somatosensory cortex (S1), and a personalized neuromodulation (tDCS) treatment that successfully alleviated fatigue was also found to revert these fractal dynamics to a normal state [5,55,63]. Furthermore, HFD discriminates between levels of consciousness, including the vegetative state, and captures anesthetic modulation of the balance between excitation and inhibition [50,56,64]. Network-specific reductions in HFD in migraine-related cortical systems [63,64] further support the clinical utility of this approach [50,51,53]. Furthermore, after Smits and colleagues [65], we demonstrated that mild cognitive impairment (MCI) [66], which is considered a prodromal phase of AD, can be detected early in seemingly healthy aging individuals via HFD extracted from their EEG.
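For readers unfamiliar with the measure, Higuchi's algorithm is short enough to sketch in full. A minimal NumPy version follows; the parameter kmax = 8 is a typical but arbitrary choice, and the sanity-check signals are illustrative.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    L = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)      # subsampled curve x[m], x[m+k], ...
            n = len(idx) - 1
            if n < 1:
                continue
            # normalized curve length for this offset
            length = np.abs(np.diff(x[idx])).sum() * (N - 1) / (n * k * k)
            Lk.append(length)
        L.append(np.mean(Lk))
    k_vals = np.arange(1, kmax + 1)
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(L), 1)
    return slope

# Sanity checks: a straight line has FD close to 1, white noise close to 2
rng = np.random.default_rng(0)
print(round(higuchi_fd(np.linspace(0, 1, 1000)), 2))   # ≈ 1.0
print(round(higuchi_fd(rng.standard_normal(1000)), 2)) # ≈ 2.0
```

For a 1-D signal the estimate lies between 1 (smooth, deterministic) and 2 (maximally irregular), which is why the decomplexification hypothesis predicts values drifting toward the lower end in pathology.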
By reanalyzing a dataset from a study aimed at elucidating what kind of physical and/or cognitive activity keeps seniors in the best shape (exergames’ utility), we showed that around 20% of those considered ‘healthy’ were already showing signs of developing MCI (a significant loss of complexity in HFD extracted from parieto-occipital EEG), which was later validated by a dedicated neurologist, as the study was performed outside the clinical setting [63,66]. The effect sizes were higher than in the previous literature. Based on those features, even PCA showed that the subgroups were clearly separable, demonstrating that with such powerful EEG characterization of the signal, further classification was not even necessary [66]. Those results could be used to screen the population older than 65, when prevention is still possible.
From previous neuroimaging studies, one expects to observe lesions and, before that, functional and/or structural changes in the occipital and parietal regions in Alzheimer’s disease (AD) [64]; interestingly, despite many mentions in the neuroimaging literature, our results on MCI were most prominent at parietal positions (P7, P8, P3, P4 in Figure 2). As we cannot confirm the cause of those aberrations based on complexity changes in a composite signal detected from the cortex/surface of the brain, we can only hypothesize that in MCI, the first changes most likely occur in the parietal regions and perhaps, with progression, become so prominent that they can be measured in occipital regions, too. In any case, we know from other projects where we used HFD for detection that we must not claim any direct detection (as the sources of the EEG, a composite signal, may change every millisecond, and we cannot confidently reconstruct them at this point), but only consider this kind of result as a first detection of a change in trend, and interpret it as a red flag to be communicated to neurologists for further investigation of the case. We base our conviction on the fact that many prior studies reported that more than 10% of autopsies in that age range show that seemingly healthy older people had already started developing AD without passing the threshold of clinical presentation needed to confirm it.
Nevertheless, despite the limitations of this methodology, we are convinced that, as in many other successful early-detection strategies, quantifying even the slightest changes in this particular problem (the prodromal phase of one of the most devastating dementias, AD) could serve the purpose of out-of-clinic (or primary-care) low-cost detection, so that timely protective strategies can be put in place.
Beyond specific pathologies, fractal analysis has proven to be a powerful tool for characterizing the complex organization of the brain’s intrinsic, large-scale networks, known as resting state networks (RSNs). It has been used to characterize hemodynamic activity from functional magnetic resonance imaging (fMRI), demonstrating its ability to map the fractal properties of blood oxygen level-dependent (BOLD) signals within these networks [63]. Importantly, this is not limited to hemodynamics: neuronal dynamics from EEG can also be used to functionally differentiate these same RSNs, providing a robust electrophysiological basis for network complexity [67]. This mechanistic insight is also crucial for understanding brain stimulation. Recent work shows that the brain’s ‘state’ of complexity before a stimulus affects the response [42,67,68,69]. Specifically, pre-stimulus fractal dimension, along with oscillatory power (e.g., in the gamma band), can predict the amplitude of TMS-evoked potentials. This supports the model of the brain operating near a state of ‘criticality’ and highlights the potential of FD as a biomarker of brain reactivity and the immediate effects of neuromodulation [69,70,71,72].

8. Early Detection of Parkinson’s Disease Using sEMG Nonlinear Markers

In addition to EEG, the nonlinear analysis of surface electromyography (sEMG) offers another underutilized method for the early detection of PD [71,72]. EMG signals reflect the motor unit synchronization and spinal network dynamics [73,74,75] and can be treated as a time series, making them highly suitable for fractal- and entropy-based analysis. This is particularly relevant, given that PD has a long prodromal phase, which can last four to six years before classical motor symptoms become visible. During this period, 60–80% of dopaminergic neurons in the substantia nigra may already have degenerated, highlighting the importance of early detection for neuroprotective intervention. Furthermore, given misdiagnosis rates of up to 30% (PD is often confused with essential tremor, MS, or Huntington’s disease), there is an urgent need for robust physiological markers.
Pioneering work by Meigal and colleagues [76,77,78,79] demonstrated that sEMG nonlinear parameters are more effective than traditional spectral features at distinguishing patients with PD from healthy controls. By treating sEMG as a nonlinear, deterministic–chaotic process, they found that PD was characterized by significantly increased determinism (≈32% vs. 11–17% in controls) and markedly lower sample entropy (SampEn) (≈0.93 vs. 1.02–1.17), as well as a reduced correlation dimension (CD) (≈4.9 vs. 6.8–6.9). In stark contrast, conventional metrics such as RMS amplitude and median frequency showed no discriminative power. These results strongly suggest heightened motor unit synchronization and reduced neural complexity—classic signatures of decomplexification. A follow-up study achieved 85% diagnostic accuracy using these nonlinear parameters alone [75], underscoring their translational potential. Notably, the reductions in complexity were most pronounced in biceps recordings, which aligns with the tremor-dominant PD phenotype. This suggests that akinetic-rigid variants may require alternative protocols.
During the same period, we experimented with the effect of TMS in PD, but could not confirm a statistically significant and lasting effect. One byproduct of that research in our Neurophysiology lab was that we consistently observed elevated excitability of PD patients’ cortex (M1), which corroborates our later realization of how specific compensatory processes affect their EEG, and supports further applications in yet another field.

9. Comparison of HFD and SampEn Importance for Movement Disorders Tracking

Although sEMG analysis in medicine is traditionally dominated by spectral or amplitude measures, it has repeatedly been shown that nonlinear analyses provide additional and often more sensitive information. Klonowski [2] showed the mathematical reason why Fourier-based algorithms, designed to successfully treat mechanistic processes, remove essential information from raw electrophysiological signals: in the second step of the procedure, in which the original signal is decomposed into spectral components, the original frequency component is lost and replaced with two neighboring components, which is the core reason why spectral approaches are inadequate for complex time series analysis [2]. The reductionist approach simply does not work for complex living systems. These nonlinear methods can reveal underlying motor strategies [76,77], hidden rhythms [78] and fatigue [79]. Critically, they can also detect pathological changes [72,73]. For instance, fractal analysis has been used to relate muscle force to sEMG complexity [80,81,82,83,84,85,86,87] and, as previously mentioned, to distinguish patients with PD from healthy controls [72].
However, choosing a nonlinear measure is not trivial [48]. To investigate this, our pilot study [88] compared HFD and SampEn when assessing sEMG recorded from the first dorsal interosseous (FDI) muscle of healthy volunteers undergoing TMS. We analyzed sEMG during mild (10–20% MVC), medium (20–40% MVC) and strong (40–70% MVC) contractions, both before (PRE) and after (POST) a single-pulse transcranial magnetic stimulation (spTMS) pulse. HFD estimates the complexity in the time domain, whereas SampEn quantifies the probability that patterns will remain similar as the sequence progresses.
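SampEn is equally compact to state in code. Below is a minimal, O(N²) illustrative implementation; the conventional parameter choices m = 2 and r = 0.2 × SD are common defaults in the literature, not values taken from our study [88].

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points keep matching at m + 1 points
    (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()              # common default tolerance
    N = len(x)

    def match_count(mm):
        # all length-mm templates, using the same N - m starting points
        # for both lengths so the counts are comparable
        templ = np.array([x[i:i + mm] for i in range(N - m)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        # Chebyshev distance <= r, excluding self-comparisons
        return (d <= r).sum() - len(templ)

    B = match_count(m)
    A = match_count(m + 1)
    return -np.log(A / B)

# A regular signal is far more predictable than white noise
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 1000)
print(sample_entropy(np.sin(t)) < sample_entropy(rng.standard_normal(1000)))
```

Because a pair matching at length m + 1 necessarily matches at length m, the ratio A/B is at most 1 and SampEn is non-negative; lower values indicate more regular, more predictable dynamics.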
Figure 3 shows the raw signal recorded at three different contraction levels from the FDI muscle, studied in this particular research project with TMS application.
The results confirmed that sEMG complexity computed by HFD decreases with increasing contraction intensity, and spTMS decreased it further at all levels. However, SampEn showed different behavior: it decreased from mild to medium contraction, but increased from medium to strong contraction. Nevertheless, it was consistently decreased by spTMS. This discrepancy, whereby HFD and SampEn changed differently in response to increased muscle contraction, prompted a calibration analysis using mathematically generated test signals (sine waves and Weierstrass functions). This test revealed that the two parameters have different frequency sensitivities: SampEn was more accurate at lower frequencies (0–40 Hz), whereas HFD was more accurate at higher frequencies (60–120 Hz).
This finding was crucial. The reduction in sEMG complexity with increased force may seem counterintuitive, given that a higher force involves more motor unit recruitment and higher discharge rates, which one might assume would result in a more complex signal. However, this depends on the muscle’s recruitment strategy. In the FDI muscle, for example, almost all motor units are recruited at ~30% MVC; any further force generated is due to frequency modulation (i.e., increased discharge rates) [82,83]. Therefore, the differing results from HFD and SampEn likely reflect their differential sensitivity to the changing frequency components of the sEMG as the contraction intensity rises.
As an illustration, EMG-signal-extracted HFD and SampEn were compared, and the calibration curves demonstrate (Figure 4) the different frequency content sensitivities of HFD and SampEn [88].
We concluded that changes in sEMG complexity associated with muscle contraction cannot be accurately depicted by a single complexity measure. Our results strongly suggest that sEMG analysis should incorporate both SampEn and HFD, as these measures provide complementary information about the signal’s different frequency components and, by extension, the underlying corticospinal activity.

10. Discussion of Fractal Analysis Application

The evolution of our understanding of how nonlinear analysis should be applied, in order to extract important information from a system under study before pathological changes translate into clinical manifestation, did not end there. Several other research groups tackling similar research questions started applying multiple nonlinear measures together [48], in the hope that their different mathematical origins would help to extract more information from the signal; however, after combining those measures as features for various statistical learning algorithms for automated detection, to yield more accurate classification, it became obvious that our knowledge had hit the plateau of the sigmoidal curve. We then analyzed the limitations of this research and focused on the part of the protocol using various machine learning models [89]. We already knew that, in order to give optimal results, fractal and nonlinear measures should be applied to the broadband signal instead of sub-bands extracted from the raw signal [36,38]. Minimal preprocessing was also advised, due to the loss of information caused by some forms of filtering that long ago became standard [36,38]. After Klonowski demonstrated two decades ago why Fourier-based analysis of electrophysiological signals is less than optimal [2], to say the least, Kalauzi [90,91] showed that it is also redundant, as HFD calculated from the amplitudes of Fourier spectral components is a weighted function of them. Esteller [92] and other researchers compared the different methods used to calculate fractal dimension, showing that Katz’s algorithm is the most consistent in isolating epileptic states (likely due to its exponential transformation of FD and insensitivity to noise), while HFD was more accurate in estimating FD (but is more sensitive to noise).
The box-counting algorithm is better reserved for space-filling estimation tasks and performs less optimally for electrophysiological signals; Petrosian’s method is less suitable for analog signal analysis, given its poor reproducibility of the dynamic range of synthetic FD. As with comparisons of entropy-based measures, their use depends on context, and each of them is informative. We showed that the characterization of an electrophysiological signal with fractal and nonlinear measures is crucial for any classifier to yield high accuracy [21].
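Katz's algorithm, referenced in the comparison above [92], is even simpler than Higuchi's. A minimal sketch follows; the sanity-check signals are illustrative, and note that (unlike HFD) the estimate is not bounded above by 2, one reason the two algorithms behave so differently.

```python
import numpy as np

def katz_fd(x):
    """Katz fractal dimension: FD = log10(n) / (log10(n) + log10(d / L)),
    where L is the total curve length, d the maximum distance from the
    first sample, and n the number of steps."""
    x = np.asarray(x, dtype=float)
    L = np.abs(np.diff(x)).sum()      # total waveform length
    d = np.abs(x - x[0]).max()        # 'diameter' relative to the first point
    n = len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

rng = np.random.default_rng(0)
print(round(katz_fd(np.linspace(0, 1, 1000)), 2))  # a straight line gives 1.0
print(katz_fd(rng.standard_normal(1000)) > 1.0)    # irregular noise exceeds 1
```

The logarithmic (effectively exponential) transformation compresses large FD values, which is consistent with the reported robustness of Katz's estimate for isolating epileptic states at the cost of absolute accuracy.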
In our extensive analysis of the application of ML models to electrophysiology-based nonlinear features, we described several other reasons why this kind of research is still not good enough to be translated into clinical practice. One of the problems was that the majority of researchers from the biomedical and neuroscientific fields (who are applying ML) did not understand the intricacies of statistical learning. For example, they used the most popular models, such as SVM (the use of embedded regularization frameworks is recommended, at least the least absolute shrinkage and selection operator, LASSO [89,93]), when several others were much better suited to the nature of the problem. LOOCV and k-fold cross-validation were also popular validation procedures (for model evaluation), while the generalization capability of the model typically remained untested on independent samples [94]. Researchers often neglected the so-called ‘curse of dimensionality’ (which refers to issues that arise when the number of data points is small relative to the intrinsic dimension of the data, usually when treating the problem in high-dimensional spaces), especially in actual neuroimaging studies, where the number of features is often higher than the number of subjects, triggering predictable overfitting. They rarely employed the Vapnik–Chervonenkis dimension [93], which should be in standard use for model evaluation or reduction. External validation is often missing; the samples are too small and are usually collected at one site, when in reality, multi-site collection would be the solution. When developing a model, one does not wish the classifier to be trained on general sample characteristics; for example, nonlinear measures may differ because some change with age [89] or are characteristic of a certain gender [95,96].
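The overfitting risk described above can be demonstrated on pure noise. Below is a minimal, self-contained sketch (NumPy only; the nearest-class-mean classifier, the synthetic data and all parameters are hypothetical) showing how selecting features on the full dataset before LOOCV inflates accuracy even when no signal exists, whereas selecting features inside each fold does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 40, 1000, 10                 # samples, noise features, features kept
X = rng.standard_normal((n, p))        # pure noise: there is NO real signal
y = rng.integers(0, 2, n)              # random labels

def top_k(X, y, k):
    """Indices of the k features most correlated with the labels."""
    yc = y - y.mean()
    c = np.abs((X - X.mean(0)).T @ yc) / (X.std(0) * yc.std() * len(y))
    return np.argsort(c)[-k:]

def nearest_mean(Xtr, ytr, xte):
    """Classify a test point by the closer class mean."""
    m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    return int(np.linalg.norm(xte - m1) < np.linalg.norm(xte - m0))

def loocv(select_inside):
    hits = 0
    cheat = top_k(X, y, k)             # selection on ALL data: leakage
    for i in range(n):
        tr = np.arange(n) != i
        feats = top_k(X[tr], y[tr], k) if select_inside else cheat
        hits += nearest_mean(X[tr][:, feats], y[tr], X[i, feats]) == y[i]
    return hits / n

print("leaky LOOCV accuracy :", loocv(select_inside=False))
print("proper LOOCV accuracy:", loocv(select_inside=True))
```

On random labels, the leaky protocol reports accuracy far above chance while the proper protocol hovers near 0.5, which is exactly the kind of unwarranted optimism discussed in this section.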
On top of the subtleties of statistical learning theory, a similar layer of differences between the performance of algorithms, and the stark difference between any spectral measure and a fractal one, is well documented [90,91]. Some authors refer to these as ‘nuisance variables’, because the algorithm learns to recognize that particular dataset with all of its characteristics. Berisha [34] described yet another common problem contributing to overfitting and unwarranted optimism in ML applications (especially in medicine): ‘blind spots’. He demonstrated a striking inefficiency in ML algorithms used to delineate MCI patients from healthy controls; in one run, the developed model detected patients as controls, and in another, vice versa; it had obviously hit a blind spot. This refers to the fact that we can never collect all possible data: we always collect representative samples. In reality, between those collected samples lie large holes called ‘blind spots’ that any model sooner or later approaches and fails at (the best-case scenario is that a model developed on a certain training set never approaches larger unknown data). The only solution to this particular problem is to constantly monitor and optimize already developed algorithms, even after deployment, which is expensive and annuls the essential goal of automation. Electrical engineers routinely approximate this as digitization error, but such techniques are rarely considered in the life sciences, neuroscience or medicine. How can the majority of these problems be overcome? The usual recommendation is to ‘collect more data’, but in our field this is very expensive (as it means much more data than usual) and minimizes the chances of success for the project as a whole. Another question that then arises is whether automation of the task is sustainable at all.
We then realize that, however much the promised efficacy and low cost of classifiers may seem to resolve the problem for us, reality denies them as a viable future standard. In our opinion, portable devices (mostly for ECG, many with a medical-grade quality of signal) might be the answer, but then one has to take all precautionary measures when working with a platform or another provider’s cloud, where GDPR compliance and the security of health data may be at issue, which is yet another hard problem to solve.
The next step, inevitable for successful early detection (and, consequently, for monitoring therapeutic outcomes), came with the advent of the cluster biomarkers discussed in Section 5 above; we had been looking at only one source, the human brain as represented by the EEG it generates, while missing the connection between the heart and the brain, the so-called heart–brain axis. In addition, the more we learned about the intricate dynamics of each system (brain, heart or muscles), the more we realized that they are ultimately all connected and governed by internal control mechanisms known from anatomical connections (the CAN mentioned above, which physically connects brain regions such as the DLPFC, hippocampi and insula with the autonomic nervous system, ANS), and that the hierarchy of those systems must play a role in their overall complexity, which had previously not been taken into account. The extended CAN is also known as the Allostatic Network, and the role it plays, with numerous feedback loops acting as control mechanisms, complicated our grasp of its complexity even further. That everything is synchronized became obvious when Friston and his network, including Andy Clark [17], started advocating the Active Inference approach [16,18] in their quest to understand how our perception and decision-making actually work (with all that complexity hierarchically organized). Contrary to the previous consensus, they hypothesized that the brain is actually a black box, relying on information from numerous sensors on the periphery of the body, where the back-propagation of error is crucial for avoiding surprises, which are unwanted for survival. Numerous neuroscientific studies have in effect confirmed the accuracy of this model, from basic biochemical and biological psychiatry research to our new way of interpreting cluster biomarkers as ECG-extracted autonomic profiles that correlate with the severity of depression [49].
Even the HRS and LRS cluster biomarkers (mentioned in Weber et al., 2025) are indicators of healthy vs. unhealthy or ‘maladaptive’ ways in which we cope with stress [49]. Based on those findings, we identified a gap in the research that we are currently trying to fill: ontology-based application profiles that enable clinicians to quantify, via electrophysiological measurements, the standardized clinical severity of depression (in press). Our current FAIR Mind project is trying to do exactly that. On the one hand, we have a list of electrophysiology-extracted measures (e.g., a cluster of 14 mixed measures) that illustrate the state of the patient’s autonomic health. For that, we are working on extending existing semantic artifacts, both to offer clinicians practical use of electrophysiological measurements and to make the newly generated data machine-actionable (based on the FAIR Guiding Principles), which in reality means AI-ready (in press). With this innovative application of already known fractal and nonlinear measures (as the basis for semantic modeling), connected via knowledge modeling with clinical presentation, we believe that future reuse and interoperability of the data will be much easier and a more acceptable candidate for standardization in clinical practice.
We want to summarize here this tortuous path of eventually recognizing that, for complex systems, one cannot rely on simple biomarkers for anything, as so many factors are involved, both internal and external. Our knowledge progressed from a single fractal biomarker, to several markers applied in parallel to a single source of signal, to cluster biomarkers representing the state of the ANS, which is important for the regulation of downstream systems. This new development, in which we connected this electrophysiology research (already tested for viability) with standardized clinical protocols used to establish the diagnosis and severity of the disorder, awaits replication, as we are waiting for approved access to a larger public dataset. This proves once more that the reality of doing scientific research can be disappointing in some instances, but we will eventually bring the fractal and the fractional to practical use.
Following the empirically developed opinion above, we go further with yet another promising use of the fractal description of human physiological structure (in this case, the skin, a multilayered system with many embedded structures serving various purposes), which motivated a fractional calculus-based modeling framework for digital twins; in our view, this holds promise for other applications as well. The work on fractal and nonlinear biomarkers described above serves one particular purpose: making early diagnosis possible and allowing the monitoring of the slightest changes that indicate the progression of a patient’s state, thus enabling timely intervention; however, it focuses on the analysis of time series or analog signals, which is limited in a geometrical sense. From another point of view (analogous to the separation of fractal analysis into 2D and 3D), the next level would be modeling the complex processes that unfold through such complex fractal structures as physiological structures. If one, for example, aims at developing a new drug (a task that nowadays relies heavily on AI to discover previously unknown links), then besides discovering a promising biochemical candidate, an even more important part of that research would be how the target substance is transported through a highly inhomogeneous structure (such as the digestive system, the tortuous network of blood vessels, the alveolar system, or the network of neurons, where the transport of ions or charged particles occurs). In the next section, we give an overview of that research.

11. Fractional Modeling of Human Physiology for Digital Twins

The significant variability in human physiology, particularly in pharmacokinetics and pharmacodynamics, highlights the limitations of traditional modeling approaches. A recent project (2022–2025) on developing a digital twin for transdermal fentanyl transport brought this issue into focus. Although standard modeling software (COMSOL v 6.2) is excellent for many physical processes, it usually models the transport of substances across highly heterogeneous membranes, such as human skin, as a straightforward diffusion process (Fick’s second law) through a homogeneous gel. Our simulation studies confirmed that this is an obvious oversimplification of a real physiological tissue structure.
To better model reality, two crucial concepts were introduced: firstly, the fractal, tortuous path that a molecule takes through the inhomogeneous skin layers (where it can be trapped and released later, indicating the memory properties of the skin), and secondly, a novel mathematical framework based on fractional calculus. This combined approach is necessary to realistically describe the anomalous diffusion process. This novel methodological approach was detailed in two publications: one introducing a time-delayed flux concept [97], and the other focusing on inertial memory effects in molecular transport across nanoporous membranes [98].
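To make the contrast concrete, the classical Fickian law can be compared with a generic time-fractional generalization. The Caputo form below is a standard textbook example of a memory-bearing diffusion operator, not necessarily the exact formulation used in [97,98]:

```latex
% Classical Fick's second law (memoryless):
\partial_t c(x,t) = D\,\partial_x^{2} c(x,t)

% A common time-fractional generalization (Caputo derivative, 0 < \alpha < 1):
{}^{C}\!D_t^{\alpha} c(x,t) = D_\alpha\,\partial_x^{2} c(x,t),
\qquad
{}^{C}\!D_t^{\alpha} c(x,t)
  = \frac{1}{\Gamma(1-\alpha)} \int_0^{t}
    \frac{\partial_\tau c(x,\tau)}{(t-\tau)^{\alpha}}\,\mathrm{d}\tau
```

The power-law kernel weights the entire history of concentration changes, yielding subdiffusion with \(\langle x^{2}(t)\rangle \propto t^{\alpha}\); the classical Fickian limit is recovered as \(\alpha \to 1\).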
When observing the transport of a molecular drug from a reservoir (the patch with the drug) across the skin barrier, a key finding emerged. All generalized models that account for molecular inertia—and thus the non-locality of flux and the finite propagation speed—predict oscillatory changes and resonances in the flux profiles at high frequencies (Figure 5).
Crucially, at low frequencies, the spectra of cumulative flux predicted by all generalized models converge with the classical model, indicating that they all predict the same total amount of stationary drug delivered. This work [97] provides further clear evidence that in order to accurately model the complex, non-homogeneous nature of human physiology, a fractal and fractional approach is not only beneficial, but necessary.
By visual comparison (Figure 5), it can be concluded that the Fickian prediction is too simple for complex human physiology, as it oversimplifies the memory properties of this highly nonhomogeneous structure.
When we compared the output of the preexisting digital twin (using simple diffusion to model transport, Figure 6), with all known theoretical pharmacokinetic and pharmacodynamic knowledge included in the developed simulator, the predicted curve turned out to be a monotonic function, contrary to the real data (courtesy of a colleague who collected healthy participants’ plasma samples to illustrate absorption). The measurements followed a completely different mathematical description, a quasi-oscillatory function with damped amplitudes, versus the digital twin’s monotonic function, which was almost identical for all patients [97].
Shortly before completing this work, we published another study that further developed this approach by examining boundary conditions [99].
If we want to develop digital twins that accurately represent transport processes, drug uptake and excretion from the system, then more accurate representations of these extremely complex physiological processes, ones that take into account the fractal description of structure and the highly inhomogeneous character of human physiology, are urgently needed.
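To make the contrast concrete, the classical Fickian baseline to which the generalized models converge at low frequencies can be sketched numerically. The following is a minimal finite-difference illustration with arbitrary units and parameters of our own choosing (not those of the cited studies [97,98,99]): a membrane held at constant donor concentration on one side and a perfect sink on the other. It yields the smooth, monotonic cumulative-flux curve characteristic of simple diffusion, precisely the behavior from which the measured quasi-oscillatory plasma profiles depart.

```python
import numpy as np

# Illustrative parameters (arbitrary units, chosen for this sketch only)
L = 1.0                      # membrane thickness
D = 1.0                      # diffusion coefficient
nx = 101                     # spatial grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D         # below the explicit-scheme stability limit 0.5
t_end = 2.0

c = np.zeros(nx)
c[0] = 1.0                   # donor reservoir held at constant concentration
t, q = 0.0, 0.0
times, cumulative = [0.0], [0.0]

while t < t_end:
    # FTCS update of the interior points (classical Fickian diffusion)
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0], c[-1] = 1.0, 0.0   # boundary conditions: reservoir and perfect sink
    # outgoing flux at the receiver side, accumulated over time
    q += D * (c[-2] - c[-1]) / dx * dt
    t += dt
    times.append(t)
    cumulative.append(q)

# The cumulative flux is smooth and monotonic, approaching the steady-state
# slope D/L after a lag time of roughly L**2 / (6 * D): no oscillations,
# no resonances, the same curve shape for any choice of parameters.
```

After the lag time the curve is essentially a straight line, which is why a purely Fickian digital twin produces near-identical monotonic outputs across patients.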

12. Conclusions and Future Direction

Scientists from other fields, including medical professionals, often describe the concepts of physiological complexity as 'novel', even though the mathematical concepts of fractals and non-linearity in natural shapes, forms and processes are by no means new. One aim of this work is to address this misconception, which is probably a consequence of life science students receiving little to no training in mathematics or advanced statistical learning. We also wanted to offer an overview of the evolution of our understanding of how, in different settings, we can describe and appropriately interpret the results of this analytic approach. Technological advances in computing power and cloud computing, together with the ubiquity of portable and wearable devices with medical-grade signal quality, may contribute to the wider acceptance of technically feasible approaches to early detection and monitoring outside of clinics, in real-life situations. In this work, we have presented how these measures should be applied and interpreted to provide the most accurate descriptions of reality and to foster better approaches to early detection and forecasting in medicine, supported by new theoretical and technological advances. This approach to signal analysis, forecasting, early detection and monitoring has already produced sufficient evidence to qualify for translational studies, which would increase the accuracy and effectiveness of diagnostics, particularly in neurology and psychiatry.
Building on this foundation, the future direction of this field must shift from proof of concept to robust clinical application. This involves bringing biomarkers from the laboratory into wearable technology and clinical practice, which requires these metrics to be validated on a large scale against current standards; examples include sEMG complexity for preclinical PD screening and the superiority of HFD over spectral methods for tracking neurodegenerative processes. They must also be integrated into telehealth platforms, which is enabled by semantic modeling that makes these markers machine-actionable. However, the utility of these metrics extends beyond mere diagnostics. The next frontier is biomarker-guided therapeutics, where complexity measures such as the pre-stimulus fractal dimension can predict how a patient will respond to neuromodulatory interventions. Therapeutic success itself is then redefined as the 'normalization' of pathological fractal dynamics, as demonstrated in the treatment of MS fatigue or the rTMS treatment of depression. To achieve this, our methodologies must also evolve. The future lies in fusing multiple nonlinear metrics into high-dimensional 'complexity fingerprints', recognizing that measures such as HFD and SampEn, in combination with other markers (to form cluster biomarkers), provide complementary information. It also lies in building next-generation digital twins on more realistic foundations, such as the fractal geometry and fractional calculus framework, to accurately model the anomalous diffusion and complex dynamics that are inherent to human physiology.
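As an illustration of why HFD and SampEn carry complementary information, the two measures can be computed on simple synthetic signals. The sketch below is a minimal, unoptimized implementation of Higuchi's algorithm [4] and of sample entropy, written for this review (it is not the code used in the studies discussed): a regular sine wave scores low on both measures, white noise scores high on both, and real electrophysiological signals fall in between, where the two measures need not move together.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi's fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(1, kmax + 1)
    lk = []
    for k in ks:
        lengths = []
        for m in range(k):                     # one sub-series per offset m
            idx = np.arange(m, n, k)
            n_diff = len(idx) - 1
            if n_diff < 1:
                continue
            # normalized curve length for start offset m and step k
            lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / (n_diff * k * k)
            lengths.append(lm)
        lk.append(np.mean(lengths))
    # log L(k) versus log(1/k): the slope estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lk), 1)
    return slope

def sample_entropy(x, m=2, r=None):
    """SampEn: -log of the conditional probability that sequences matching
    for m points (within tolerance r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)                    # the conventional default
    n = len(x)
    def matches(mm):
        # count template pairs within Chebyshev distance r (self-matches excluded)
        t = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(t) - 1):
            c += np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) <= r)
        return c
    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 1000)
sine = np.sin(2 * np.pi * 5 * t)       # regular, predictable signal
noise = rng.standard_normal(1000)      # maximally irregular signal
# Expected ordering: the sine scores near 1.0 (HFD) and near 0 (SampEn),
# while white noise scores near 2.0 (HFD) and well above 1 (SampEn).
```

HFD quantifies the scaling of curve length, while SampEn quantifies the predictability of short patterns; because they probe different aspects of the dynamics, a cluster combining both (with other markers) is more informative than either alone.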

Author Contributions

Conceptualization, M.Č.R.; methodology, M.Č.R. and C.P.; software, M.Č.R. and V.L.; validation, M.Č.R., C.P. and V.L.; writing—original draft preparation, M.Č.R.; writing—review and editing, M.Č.R. and C.P.; visualization, M.Č.R.; corrections and rewriting after revision, M.Č.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

In this review, no new data were created; all results mentioned are cited from original work available online, either as peer-reviewed publications or preprints, or from past research projects.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AD	Alzheimer's Disease
ANS	Autonomic Nervous System
ApEn	Approximate Entropy
BDD	Bipolar Depressive Disorder
CAN	Central Autonomic Network
CNS	Central Nervous System
DFA	Detrended Fluctuation Analysis
DLPC	Dorsolateral Prefrontal Cortex
ECG	Electrocardiogram
EEG	Electroencephalography
EMG	Electromyography
FFT	Fast Fourier Transform
fMRI	Functional Magnetic Resonance Imaging
HFD	Higuchi's Fractal Dimension
HRV	Heart Rate Variability
MCI	Mild Cognitive Impairment
MDD	Major Depressive Disorder
MS	Multiple Sclerosis
NIBS	Non-Invasive Brain Stimulation
OCD	Obsessive–Compulsive Disorder
PCA	Principal Component Analysis
PD	Parkinson's Disease
PSD	Power Spectral Density
RSNs	Resting State Networks
rTMS	Repetitive Transcranial Magnetic Stimulation
SampEn	Sample Entropy
sEMG	Surface Electromyography
tDCS	Transcranial Direct Current Stimulation
TMS	Transcranial Magnetic Stimulation

References

  1. Goldberger, A.L.; Rigney, D.R.; West, B.J. Chaos and Fractals in Human Physiology. Sci. Am. 1990, 262, 42–49. [Google Scholar] [CrossRef]
  2. Klonowski, W. From Conformons to Human Brains: An Informal Overview of Nonlinear Dynamics and Its Applications in Biomedicine. Nonlinear Biomed. Phys. 2007, 1, 5. [Google Scholar] [CrossRef]
  3. Mandelbrot, B.B. The Fractal Geometry of Nature; Freeman: San Francisco, CA, USA, 1982. [Google Scholar]
  4. Higuchi, T. Approach to an Irregular Time Series on the Basis of the Fractal Theory. Phys. D Nonlinear Phenom. 1988, 31, 277–283. [Google Scholar] [CrossRef]
  5. Goldberger, A.L. Fractal Variability versus Pathologic Periodicity: Complexity Loss and Stereotypy in Disease. Perspect. Biol. Med. 1997, 40, 543–561. [Google Scholar] [CrossRef]
  6. Lipsitz, L.A.; Goldberger, A.L. Loss of “complexity” and Aging. Potential Applications of Fractals and Chaos Theory to Senescence. JAMA 1992, 267, 1806–1809. [Google Scholar] [CrossRef] [PubMed]
  7. Klonowski, W. Signal and Image Analysis Using Chaos Theory and Fractal Geometry. Mach. Graph. Vis. 2000, 9, 403–431. [Google Scholar]
  8. Goldberger, A.L. Heartbeats, hormones, and health: Is variability the spice of life? Am. J. Respir. Crit. Care Med. 2001, 163, 1289–1290. [Google Scholar] [CrossRef]
  9. Eke, A.; Herman, P.; Kocsis, L.; Kozak, L.R. Fractal Characterization of Complexity in Temporal Physiological Signals. Physiol. Meas. 2002, 23, R1–R38. [Google Scholar] [CrossRef] [PubMed]
  10. Stam, C.J. Nonlinear Dynamical Analysis of EEG and MEG: Review of an Emerging Field. Clin. Neurophysiol. 2005, 116, 2266–2301. [Google Scholar] [CrossRef]
  11. Di Ieva, A.; Esteban, F.J.; Grizzi, F.; Klonowski, W.; Martín-Landrove, M. Fractals in the neurosciences, part II: Clinical applications and future perspectives. Neuroscientist 2015, 21, 30–43. [Google Scholar] [CrossRef] [PubMed]
  12. Di Ieva, A.; Grizzi, F.; Jelinek, H.; Pellionisz, A.J.; Losa, G.A. Fractals in the neurosciences, part I: General principles and basic neurosciences. Neuroscientist 2014, 20, 403–417. [Google Scholar] [CrossRef]
  13. Friston, K. The Free-Energy Principle: A Unified Brain Theory? Nat. Rev. Neurosci. 2010, 11, 127–138. [Google Scholar] [CrossRef]
  14. Clark, A. Embodied Prediction. In Open MIND: 7(T); Metzinger, T., Windt, J.M., Eds.; MIND Group: Frankfurt am Main, Germany, 2015. [Google Scholar] [CrossRef]
  15. Friston, K.; FitzGerald, T.; Rigoli, F.; Schwartenbeck, P.; Pezzulo, G. Active Inference: A Process Theory. Neural Comput. 2017, 29, 1–49. [Google Scholar] [CrossRef]
  16. Porcaro, C.; Moaveninejad, S.; D’Onofrio, V.; DiIeva, A. Fractal Time Series: Background, Estimation Methods, and Performances. Adv. Neurobiol. 2024, 36, 95–137. [Google Scholar] [CrossRef]
  17. Frischer, R.; Singh, M.P.; Porcaro, C.; Namazi, H. Fractal Theory: A Comprehensive Review of Applications Across Engineering Disciplines. Fractals 2025, 33, 2530008. [Google Scholar] [CrossRef]
  18. Goldberger, A.L.; Amaral, L.A.N.; Hausdorff, J.M.; Ivanov, P.C.; Peng, C.K.; Stanley, H.E. Fractal Dynamics in Physiology: Alterations with Disease and Aging. Proc. Natl. Acad. Sci. USA 2002, 99, 2466–2472. [Google Scholar] [CrossRef]
  19. Pincus, S.M.; Goldberger, A.L. Physiological Time-Series Analysis: What Does Regularity Quantify? Am. J. Physiol. 1994, 266, H1643–H1656. [Google Scholar] [CrossRef]
  20. Pincus, S.; Singer, B.H. Randomness and Degrees of Irregularity. Proc. Natl. Acad. Sci. USA 1996, 93, 2083–2088. [Google Scholar] [CrossRef] [PubMed]
  21. Čukić, M.; Stokić, M.; Simić, S.; Pokrajac, D. The Successful Discrimination of Depression from EEG Could Be Attributed to Proper Feature Extraction and Not to a Particular Classification Method. Cogn. Neurodynamics 2020, 14, 443–455. [Google Scholar] [CrossRef] [PubMed]
  22. Čukić, M.; Stokić, M.; Radenković, S.; Ljubisavljević, M.; Simić, S.; Savić, D. Nonlinear Analysis of EEG Complexity in Episode and Remission Phase of Recurrent Depression. Int. J. Methods Psychiatr. Res. 2020, 29, e1816. [Google Scholar] [CrossRef] [PubMed]
  23. De Torre-luque, A.; Bornas, X. Complexity and Irregularity in the Brain Oscillations of Depressive Patients: A Systematic Review. Neuropsychiatry 2017, 7, 466–477. [Google Scholar] [CrossRef]
  24. Pincus, S.M.; Cummins, T.R.; Haddad, G.G. Heart Rate Control in Normal and Aborted-SIDS Infants. Am. J. Physiol. 1993, 264, R638–R646. [Google Scholar] [CrossRef] [PubMed]
  25. Posener, J.A.; DeBattista, C.; Veldhuis, J.D.; Province, M.A.; Williams, G.H.; Schatzberg, A.F. Process Irregularity of Cortisol and Adrenocorticotropin Secretion in Men with Major Depressive Disorder. Psychoneuroendocrinology 2004, 29, 1129–1137. [Google Scholar] [CrossRef] [PubMed]
  26. Pincus, S.M. Quantitative Assessment Strategies and Issues for Mood and Other Psychiatric Serial Study Data. Bipolar Disord. 2003, 5, 287–294. [Google Scholar] [CrossRef]
  27. De Kwaasteniet, B.; Ruhe, E.; Caan, M.; Rive, M.; Olabarriaga, S.; Groefsema, M.; Heesink, L.; van Wingen, G.; Denys, D. Relation between Structural and Functional Connectivity in Major Depressive Disorder. Biol. Psychiatry 2013, 74, 40–47. [Google Scholar] [CrossRef]
  28. Zhang, S.; Tsai, S.J.; Hu, S.; Xu, J.; Chao, H.H.; Calhoun, V.D.; Li, C.S.R. Independent Component Analysis of Functional Networks for Response Inhibition: Inter-Subject Variation in Stop Signal Reaction Time. Hum. Brain Mapp. 2015, 36, 3289–3302. [Google Scholar] [CrossRef]
  29. Ahmadlou, M.; Adeli, H.; Adeli, A. Fractality Analysis of Frontal Brain in Major Depressive Disorder. Int. J. Psychophysiol. 2012, 85, 206–211. [Google Scholar] [CrossRef]
  30. Hosseinifard, B.; Moradi, M.H.; Rostami, R. Classifying Depression Patients and Normal Subjects Using Machine Learning Techniques and Nonlinear Features from EEG Signal. Comput. Methods Programs Biomed. 2013, 109, 339–345. [Google Scholar] [CrossRef]
  31. Bairy, G.M.; Bhat, S.; Eugene, L.W.; Niranjan, U.C.; Puthankattil, S.D.; Joseph, P.K. Automated classification of depression electroencephalographic signals using discrete cosine transform and nonlinear dynamics. J. Med. Imaging Health Inform. 2015, 5, 635–640. [Google Scholar] [CrossRef]
  32. Acharya, U.R.; Sudarshan, V.K.; Adeli, H.; Santhosh, J.; Koh, J.E.; Puthankatti, S.D.; Adeli, A. A novel depression diagnosis index using nonlinear features in EEG signals. Eur. Neurol. 2015, 74, 79–83. [Google Scholar] [CrossRef]
  33. Bachmann, M.; Päeske, L.; Kalev, K.; Aarma, K.; Lehtmets, A.; Ööpik, P.; Lass, J.; Hinrikus, H. Methods for Classifying Depression in Single Channel EEG Using Linear and Nonlinear Signal Analysis. Comput. Methods Programs Biomed. 2018, 155, 11–17. [Google Scholar] [CrossRef]
  34. Berisha, V.; Krantsevich, C.; Hahn, P.R.; Hahn, S.; Dasarathy, G.; Turaga, P.; Liss, J. Digital Medicine and the Curse of Dimensionality. npj Digit. Med. 2021, 4, 153. [Google Scholar] [CrossRef]
  35. Willner, P.; Scheel-Krüger, J.; Belzung, C. The neurobiology of depression and antidepressant action. Neurosci. Biobehav. Rev. 2013, 37, 2331–2371. [Google Scholar] [CrossRef]
  36. Čukić, M.; López, V.; Pavón, J. Classification of Depression Through Resting-State Electroencephalogram as a Novel Practice in Psychiatry: Review. J. Med. Internet Res. 2020, 22, e19548. [Google Scholar] [CrossRef] [PubMed]
  37. Čukić, M.; Savić, D. Another Godot Who Is Still Not Coming: More on Biomarkers for Depression. Rev. Psiquiatr. Salud Ment. 2022, 15, 153–154. [Google Scholar] [CrossRef]
  38. Čukić, M.; Pokrajac, D.; Lopez, V. On Mistakes we made in Prior Computational Psychiatry Data Driven Approach Projects and How they Jeopardize Translation of Those Findings in Clinical Practice. In Proceedings of the 2020 Intelligent Systems Conference (IntelliSys); Advances in Intelligent Systems and Computing; Springer Nature: Berlin/Heidelberg, Germany, 2020; Chapter 37; Volume 3. [Google Scholar] [CrossRef]
  39. Mutanen, T.; Nieminen, J.O.; Ilmoniemi, R.J. TMS-Evoked Changes in Brain-State Dynamics Quantified by Using EEG Data. Front. Hum. Neurosci. 2013, 7, 155. [Google Scholar] [CrossRef]
  40. Cukic, M.; Stokic, M.; Radenkovic, S.; Ljubisavljevic, M.; Pokrajac, D.D. The Shift in brain-state induced by tDCS: An EEG study. In Novel Approaches in Treating Major Depressive Disorder; NOVA Scientific Publishers Ltd.: New York, NY, USA, 2019; ISBN 978-1-53614-382-9. [Google Scholar]
  41. Čukić, M. The Reason Why RTMS and TDCS Are Efficient in Treatments of Depression. Front. Psychol. 2019, 10, 2923. [Google Scholar] [CrossRef] [PubMed]
  42. Lebiecka, K.; Zuchowicz, U.; Wozniak-Kwasniewska, A.; Szekely, D.; Olejarczyk, E.; David, O. Complexity Analysis of EEG Data in Persons with Depression Subjected to Transcranial Magnetic Stimulation. Front. Physiol. 2018, 9, 1385. [Google Scholar] [CrossRef] [PubMed]
  43. Kemp, A.H.; Quintana, D.S.; Quinn, C.R.; Hopkinson, P.; Harris, A.W.F. Major Depressive Disorder with Melancholia Displays Robust Alterations in Resting State Heart Rate and Its Variability: Implications for Future Morbidity and Mortality. Front. Psychol. 2014, 5, 1387. [Google Scholar] [CrossRef]
  44. Kemp, A.H.; Quintana, D.S.; Felmingham, K.L.; Matthews, S.; Jelinek, H.F. Depression, Comorbid Anxiety Disorders, and Heart Rate Variability in Physically Healthy, Unmedicated Patients: Implications for Cardiovascular Risk. PLoS ONE 2012, 7, e30777. [Google Scholar] [CrossRef]
  45. Kemp, A.H.; Quintana, D.S.; Malhi, G.S. Effects of Serotonin Reuptake Inhibitors on Heart Rate Variability: Methodological Issues, Medical Comorbidity, and Clinical Relevance. Biol. Psychiatry 2011, 69, e27–e28. [Google Scholar] [CrossRef] [PubMed]
  46. Khandoker, A.H.; Luthra, V.; Abouallaban, Y.; Saha, S.; Ahmed, K.I.; Mostafa, R.; Chowdhury, N.; Jelinek, H.F. Predicting Depressed Patients with Suicidal Ideation from ECG Recordings. Med. Biol. Eng. Comput. 2017, 55, 793–805. [Google Scholar] [CrossRef]
  47. Thorell, L.H.; Wolfersdorf, M.; Straub, R.; Steyer, J.; Hodgkinson, S.; Kaschka, W.P.; Jandl, M. Electrodermal Hyporeactivity as a Trait Marker for Suicidal Propensity in Uni- and Bipolar Depression. J. Psychiatr. Res. 2013, 47, 1925–1931. [Google Scholar] [CrossRef]
  48. Burns, T.; Rajan, R. Combining Complexity Measures of EEG Data: Multiplying Measures Reveal Previously Hidden Information. F1000Research 2015, 4, 137. [Google Scholar] [CrossRef]
  49. Weber, S.; Müller, M.; Kronenberg, G.; Seifritz, E.; Ajdacic-Gross, V.; Olbrich, S. Electrocardiography-Derived Autonomic Profiles in Depression and Suicide Risk with Insights from the UK Biobank. npj Ment. Health Res. 2025, 4, 17. [Google Scholar] [CrossRef]
  50. Kas, M.J.H.; Penninx, B.W.J.H.; Knudsen, G.M.; Cuthbert, B.; Falkai, P.; Sachs, G.S.; Ressler, K.J.; Bałkowiec-Iskra, E.; Butlen-Ducuing, F.; Leboyer, M.; et al. Precision Psychiatry Roadmap: Towards a Biology-Informed Framework for Mental Disorders. Mol. Psychiatry 2025, 30, 3846–3855. [Google Scholar] [CrossRef] [PubMed]
  51. Squitti, R.; Ventriglia, M.; Simonelli, I.; Bonvicini, C.; Crescenti, D.; Borroni, B.; Rongioletti, M.; Ghidoni, R. Copper Dysregulation in Major Depression: A Systematic Review and Meta-Analytic Evidence for a Putative Trait Marker. Int. J. Mol. Sci. 2025, 26, 9247. [Google Scholar] [CrossRef]
  52. Klonowski, W. Chaotic Dynamics Applied to Signal Complexity in Phase Space and in Time Domain. Chaos Solit. Fractals 2002, 14, 1379–1387. [Google Scholar] [CrossRef]
  53. Klonowski, W. Everything You Wanted to Ask about EEG but Were Afraid to Get the Right Answer. Nonlinear Biomed. Phys. 2009, 3, 2. [Google Scholar] [CrossRef]
  54. Porcaro, C.; Seppi, D.; Pellegrino, G.; Dainese, F.; Kassabian, B.; Pellegrino, L.; De Nardi, G.; Grego, A.; Corbetta, M.; Ferreri, F. Characterization of Antiseizure Medications Effects on the EEG Neurodynamic by Fractal Dimension. Front. Neurosci. 2024, 18, 1401068. [Google Scholar] [CrossRef] [PubMed]
  55. Zappasodi, F.; Olejarczyk, E.; Marzetti, L.; Assenza, G.; Pizzella, V.; Tecchio, F. Fractal Dimension of EEG Activity Senses Neuronal Impairment in Acute Stroke. PLoS ONE 2014, 9, e100199. [Google Scholar] [CrossRef] [PubMed]
  56. Porcaro, C.; Marino, M.; Carozzo, S.; Russo, M.; Ursino, M.; Ruggiero, V.; Ragno, C.; Proto, S.; Tonin, P. Fractal Dimension Feature as a Signature of Severity in Disorders of Consciousness: An EEG Study. Int. J. Neural Syst. 2022, 32, 2250031. [Google Scholar] [CrossRef] [PubMed]
  57. Porcaro, C.; Di Renzo, A.; Tinelli, E.; Di Lorenzo, G.; Parisi, V.; Caramia, F.; Fiorelli, M.; Di Piero, V.; Pierelli, F.; Coppola, G. Haemodynamic Activity Characterization of Resting State Networks by Fractal Analysis and Thalamocortical Morphofunctional Integrity in Chronic Migraine. J. Headache Pain 2020, 21, 112. [Google Scholar] [CrossRef]
  58. Porcaro, C.; Di Renzo, A.; Tinelli, E.; Parisi, V.; Di Lorenzo, C.; Caramia, F.; Fiorelli, M.; Giuliani, G.; Cioffi, E.; Seri, S.; et al. A Hypothalamic Mechanism Regulates the Duration of a Migraine Attack: Insights from Microstructural and Temporal Complexity of Cortical Functional Networks Analysis. Int. J. Mol. Sci. 2022, 23, 13238. [Google Scholar] [CrossRef]
  59. Porcaro, C.; Di Renzo, A.; Tinelli, E.; Di Lorenzo, G.; Seri, S.; Di Lorenzo, C.; Parisi, V.; Caramia, F.; Fiorelli, M.; Di Piero, V.; et al. Hypothalamic Structural Integrity and Temporal Complexity of Cortical Information Processing at Rest in Migraine without Aura Patients between Attacks. Sci. Rep. 2021, 11, 18701. [Google Scholar] [CrossRef]
  60. Cukic, M.; Oommen, J.; Mutavdzic, D.; Jorgovanovic, N.; Ljubisavljevic, M. The Effect of Single-Pulse Transcranial Magnetic Stimulation and Peripheral Nerve Stimulation on Complexity of EMG Signal: Fractal Analysis. Exp. Brain Res. 2013, 228, 97–104. [Google Scholar] [CrossRef]
  61. Cukic, M.; Kalauzi, A.; Ilic, T.; Miskovic, M.; Ljubisavljevic, M. The Influence of Coil-Skull Distance on Transcranial Magnetic Stimulation Motor-Evoked Responses. Exp. Brain Res. 2009, 192, 53–60. [Google Scholar] [CrossRef] [PubMed]
  62. Porcaro, C.; Cottone, C.; Cancelli, A.; Rossini, P.M.; Zito, G.; Tecchio, F. Cortical Neurodynamics Changes Mediate the Efficacy of a Personalized Neuromodulation against Multiple Sclerosis Fatigue. Sci. Rep. 2019, 9, 18213. [Google Scholar] [CrossRef]
  63. Maschke, C.; O’Byrne, J.; Colombo, M.A.; Boly, M.; Gosseries, O.; Laureys, S.; Rosanova, M.; Jerbi, K.; Blain-Moraes, S. Critical Dynamics in Spontaneous EEG Predict Anesthetic-Induced Loss of Consciousness and Perturbational Complexity. Commun. Biol. 2024, 7, 946. [Google Scholar] [CrossRef]
  64. Fiorenzato, E.; Moaveninejad, S.; Weis, L.; Biundo, R.; Antonini, A.; Porcaro, C. Brain Dynamics Complexity as a Signature of Cognitive Decline in Parkinson’s Disease. Mov. Disord. 2024, 39, 305–317. [Google Scholar] [CrossRef]
  65. Smits, F.M.; Porcaro, C.; Cottone, C.; Cancelli, A.; Rossini, P.M.; Tecchio, F. Electroencephalographic Fractal Dimension in Healthy Ageing and Alzheimer’s Disease. PLoS ONE 2016, 11, e0149587. [Google Scholar] [CrossRef]
  66. Radenkovic, M.C.; Annaheim, S.; Eggenberger, P.; Rossi, R.M. An Early Dementia Risk Screening Approach for Healthy Aging Citizens. In Proceedings of the 2024 IEEE International Workshop on Metrology for Industry 4.0 & IoT (MetroInd4.0 & IoT), Florence, Italy, 29–31 May 2024; IEEE: New York, NY, USA, 2024; pp. 417–423. [Google Scholar]
  67. Vallesi, A.; Porcaro, C.; Visalli, A.; Fasolato, D.; Rossato, F.; Bussè, C.; Cagnin, A. Resting-State EEG Spectral and Fractal Features in Dementia with Lewy Bodies with and without Visual Hallucinations. Clin. Neurophysiol. 2024, 168, 43–51. [Google Scholar] [CrossRef] [PubMed]
  68. Porcaro, C.; Mayhew, S.D.; Marino, M.; Mantini, D.; Bagshaw, A.P. Characterisation of Haemodynamic Activity in Resting State Networks by Fractal Analysis. Int. J. Neural Syst. 2020, 30, S0129065720500616. [Google Scholar] [CrossRef]
  69. Marino, M.; Liu, Q.; Samogin, J.; Tecchio, F.; Cottone, C.; Mantini, D.; Porcaro, C. Neuronal Dynamics Enable the Functional Differentiation of Resting State Networks in the Human Brain. Hum. Brain Mapp. 2019, 40, 1445–1457. [Google Scholar] [CrossRef]
  70. Olejarczyk, E.; Jozwik, A.; Valiulis, V.; Dapsys, K.; Gerulskis, G.; Germanavicius, A. Statistical Analysis of Graph-Theoretic Indices to Study EEG-TMS Connectivity in Patients with Depression. Front. Neuroinform. 2021, 15, 651082. [Google Scholar] [CrossRef]
  71. Olejarczyk, E.; Zuchowicz, U.; Wozniak-Kwasniewska, A.; Kaminski, M.; Szekely, D.; David, O. The Impact of Repetitive Transcranial Magnetic Stimulation on Functional Connectivity in Major Depressive Disorder and Bipolar Disorder Evaluated by Directed Transfer Function and Indices Based on Graph Theory. Int. J. Neural Syst. 2020, 30, 2050015. [Google Scholar] [CrossRef]
  72. Bisogno, A.L.; Moaveninejad, S.; Corbetta, M.; Porcaro, C. Pre-Stimulus Neural Dynamics Predict TMS Responses: The Role of Fractal Dimension and Oscillatory Activity. Comput. Biol. Med. 2025, 198, 111220. [Google Scholar] [CrossRef]
  73. Hausdorff, J.M.; Cudkowicz, M.E.; Firtion, R.; Wei, J.Y.; Goldberger, A.L. Gait Variability and Basal Ganglia Disorders: Stride-to-Stride Variations of Gait Cycle Timing in Parkinson’s Disease and Huntington’s Disease. Mov. Disord. Off. J. Mov. Disord. Soc. 1998, 13, 428–437. [Google Scholar] [CrossRef] [PubMed]
  74. Frischer, R.; Krejcar, O.; Abdullah, J.; Porcaro, C.; Ghosh, D.K.; Namazi, H. Analysis of Brain-Muscle Correlation in Different Hand Movements. Fractals 2025, 33, 2550064. [Google Scholar] [CrossRef]
  75. De Luca, C.J. Myoelectrical Manifestations of Localized Muscular Fatigue in Humans. Crit. Rev. Biomed. Eng. 1984, 11, 251–279. [Google Scholar]
  76. Meigal, A.I.; Rissanen, S.; Tarvainen, M.P.; Karjalainen, P.A.; Iudina-Vassel, I.A.; Airaksinen, O.; Kankaanpää, M. Novel Parameters of Surface EMG in Patients with Parkinson’s Disease and Healthy Young and Old Controls. J. Electromyogr. Kinesiol. Off. J. Int. Soc. Electrophysiol. Kinesiol. 2009, 19, e206–e213. [Google Scholar] [CrossRef]
  77. Meigal, A.Y.; Rissanen, S.M.; Tarvainen, M.P.; Georgiadis, S.D.; Karjalainen, P.A.; Airaksinen, O.; Kankaanpää, M. Linear and Nonlinear Tremor Acceleration Characteristics in Patients with Parkinson’s Disease. Physiol. Meas. 2012, 33, 395–412. [Google Scholar] [CrossRef]
  78. Meigal, A.Y.; Rissanen, S.M.; Tarvainen, M.P.; Airaksinen, O.; Kankaanpää, M.; Karjalainen, P.A. Non-Linear EMG Parameters for Differential and Early Diagnostics of Parkinson’s Disease. Front. Neurol. 2013, 4, 135. [Google Scholar] [CrossRef] [PubMed]
  79. Meigal, A.Y.; Rissanen, S.M.; Zaripova, Y.R.; Miroshnichenko, G.G.; Karjalainen, P. Nonlinear Parameters of the Surface Electromyogram for the Diagnostics of Neuromuscular Disorders and Normal Conditions of the Motor System in Human. Fiziol. Cheloveka 2015, 41, 119–127. [Google Scholar] [CrossRef]
  80. Del Santo, F.; Gelli, F.; Mazzocchio, R.; Rossi, A. Recurrence Quantification Analysis of Surface EMG Detects Changes in Motor Unit Synchronization Induced by Recurrent Inhibition. Exp. Brain Res. 2007, 178, 308–315. [Google Scholar] [CrossRef]
  81. Farina, D.; Fattorini, L.; Felici, F.; Filligoi, G. Nonlinear Surface EMG Analysis to Detect Changes of Motor Unit Conduction Velocity and Synchronization. J. Appl. Physiol. 2002, 93, 1753–1763. [Google Scholar] [CrossRef]
  82. Filligoi, G.; Felici, F. Detection of Hidden Rhythms in Surface EMG Signals with a Non-Linear Time-Series Tool. Med. Eng. Phys. 1999, 21, 439–448. [Google Scholar] [CrossRef]
  83. Ikegawa, S.; Shinohara, M.; Fukunaga, T.; Zbilut, J.P.; Webber, C.L.J. Nonlinear Time-Course of Lumbar Muscle Fatigue Using Recurrence Quantifications. Biol. Cybern. 2000, 82, 373–382. [Google Scholar] [CrossRef] [PubMed]
  84. Gitter, J.A.; Czerniecki, M.J. Fractal Analysis of the Electromyographic Interference Pattern. J. Neurosci. Methods 1995, 58, 103–108. [Google Scholar] [CrossRef] [PubMed]
  85. Gupta, V.; Suryanarayanan, S.; Reddy, N.P. Fractal Analysis of Surface EMG Signals from the Biceps. Int. J. Med. Inform. 1997, 45, 185–192. [Google Scholar] [CrossRef]
  86. Carpentier, A.; Duchateau, J.; Hainaut, K. Motor Unit Behaviour and Contractile Changes during Fatigue in the Human First Dorsal Interosseus. J. Physiol. 2001, 534, 903–912. [Google Scholar] [CrossRef]
  87. Riley, Z.A.; Maerz, A.H.; Litsey, J.C.; Enoka, R.M. Motor Unit Recruitment in Human Biceps Brachii during Sustained Voluntary Contractions. J. Physiol. 2008, 586, 2183–2193. [Google Scholar] [CrossRef]
  88. Cukic, M.B.; Platisa, M.M.; Kalauzi, A.; Oommen, J.; Ljubisavljevic, M.R. The Comparison of Higuchi Fractal Dimension and Sample Entropy Analysis of SEMG: Effects of Muscle Contraction Intensity and TMS. arXiv 2018, arXiv:1803.10753. [Google Scholar]
  89. Yahata, N.; Kasai, K.; Kawato, M. Computational neuroscience approach to biomarkers and treatments for mental disorders. Psychiatry Clin. Neurosci. 2017, 71, 215–237. [Google Scholar] [CrossRef] [PubMed]
  90. Kalauzi, A.; Bojić, T.; Vuckovic, A. Modeling the Relationship between Higuchi’s Fractal Dimension and Fourier Spectra of Physiological Signals. Med. Biol. Eng. Comput. 2012, 50, 689–699. [Google Scholar] [CrossRef] [PubMed]
  91. Spasić, S.; Kalauzi, A.; Ćulić, M.; Grbić, G.; Martać, L. Estimation of Parameter Kmax in Fractal Analysis of Rat Brain Activity. Ann. N. Y. Acad. Sci. 2005, 1048, 427–429. [Google Scholar] [CrossRef]
  92. Esteller, R.; Vachtsevanos, G.; Echauz, J.; Litt, B. A comparison of waveform fractal dimension algorithms. IEEE Trans. Circuits Syst. I Fundam. Theory Appl. 2001, 48, 177–183. [Google Scholar] [CrossRef]
  93. Vapnik, V.N. Statistical Learning Theory; John Wiley & Sons: Hoboken, NJ, USA, 1998; p. 42. Available online: https://www.vialibri.net/years/books/858660368/1998-vladimir-n-vapnik-statistical-learning-theory (accessed on 9 February 2026).
  94. Ng, A.Y. Preventing overfitting of cross-validation data. In Proceedings of the 14th International Conference on Machine Learning, Nashville, TN, USA, 23 July 1997; Presented at: ICML’97. Available online: http://robotics.stanford.edu/~ang/papers/cv-final.pdf (accessed on 9 February 2026).
  95. Ahmadi, K.; Ahmadlou, M.; Rezazade, M.; Azad-Marzabadi, E.; Sajedi, F. Brain activity of women is more fractal than men. Neurosci. Lett. 2013, 535, 7–11. [Google Scholar] [CrossRef]
  96. Ahmadlou, M.; Adeli, H.; Adeli, A. Spatiotemporal analysis of relative convergence of EEGs reveals differences between brain dynamics of depressive women and men. Clin. EEG Neurosci. 2013, 44, 175–181. [Google Scholar] [CrossRef]
  97. Čukić, M.; Galovic, S. Mathematical Modeling of Anomalous Diffusive Behavior in Transdermal Drug-Delivery Including Time-Delayed Flux Concept. Chaos Solit. Fractals 2023, 172, 113584. [Google Scholar] [CrossRef]
  98. Galovic, S.; Čukić, M.; Chevizovich, D. Inertial Memory Effects in Molecular Transport Across Nanoporous Membranes. Membranes 2025, 15, 11. [Google Scholar] [CrossRef] [PubMed]
  99. Galovic, S.; Radenkovic, M.C.; Suljovrujic, E. Impedance-Controlled Molecular Transport Across Multilayer Skin Membranes. Membranes 2026, 2026, 4144942. [Google Scholar] [CrossRef]
Figure 1. Illustration of pathological loss of complexity, from the physionet.org website: above is the heart rate time series of a healthy person, which is highly variable and complex; below is the recording from a patient with severe congestive heart failure (CHF), which visually resembles a half sinusoid, a very predictable, highly regular shape. The two series have almost identical means and variances, but very different dynamics. Source: physionet.org/Tutorials.
Figure 2. Effect sizes of HFD calculated from particular EEG positions, showing the most prominent results from our MCI early detection research. At positions P7, P3, P4 and P8, the effect sizes were the highest; at positions such as O1 and O2, which were prominent in the literature, no group-level statistical significance was detected (only individually significant detections). Our interpretation is that the first changes most likely start at parietal positions, and future detection should focus on that area.
Figure 3. Raw sEMG signal at three different contraction levels. The top panel (A) shows mild, the middle panel (B) shows medium, and the lower panel (C) shows strong contraction. Arrows indicate the beginning and the end of the segment used for analysis pre- (left side of the recording) and post-TMS (right side of the recording). The HFD values of the same segments were: 1.0921/1.0914 (mild), 1.0495/1.047 (medium) and 1.0167/1.015 (strong). The SampEn values of the same segments were: 0.036877/0.035281 (mild), 0.071382/0.056896 (medium) and 0.121227/0.104262 (strong).
Figure 4. (a) Changes in SampEn before and after spTMS. SampEn of sEMG time series before and after spTMS at three levels of contraction. p < 0.001 PRE vs. POST TMS. p < 0.001 medium vs. mild contraction and 0.001 strong vs. medium contraction. (b) Changes in HFD before and after spTMS. HFD of sEMG time series before and after spTMS at three levels of contraction. p < 0.001 PRE vs. POST TMS and p < 0.001 medium and strong vs. mild contraction.
Figure 5. Cumulative amount of particles that pass through the skin membrane and enter the bloodstream, as predicted by the classical Fickian model (green line) and three generalized models (blue, red, and magenta lines): the damped wave model, the fractional superdiffusive damped wave model, and the fractional subdiffusive damped wave model.
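For context on the classical Fickian baseline (the green line in Figure 5), cumulative permeation through a membrane with a constant donor concentration and a perfect-sink receiver has a standard closed-form series solution. The sketch below is only that baseline, not the fractional damped-wave variants, and the parameter values D, h, C0 used in the test are illustrative assumptions, not the values behind the figure.

```python
import numpy as np

def fickian_cumulative(t, D, h, C0, n_terms=50):
    """Cumulative amount per unit area that has crossed a membrane of
    thickness h (diffusivity D, constant donor concentration C0,
    perfect-sink receiver), from the classical Fickian series solution:
    Q(t) = h*C0*( D*t/h^2 - 1/6
                  - (2/pi^2) * sum_n ((-1)^n / n^2) * exp(-D*n^2*pi^2*t/h^2) )."""
    t = np.asarray(t, dtype=float)
    n = np.arange(1, n_terms + 1)[:, None]          # series index, column vector
    series = ((-1.0) ** n / n**2) * np.exp(-D * n**2 * np.pi**2 * t / h**2)
    return h * C0 * (D * t / h**2 - 1.0 / 6.0
                     - (2.0 / np.pi**2) * series.sum(axis=0))
```

At long times Q(t) approaches the straight line (D·C0/h)·(t − t_lag) with lag time t_lag = h²/(6D); measured plasma curves that deviate from this monotonic shape (as in Figure 6) are one sign that simple Fickian diffusion oversimplifies transdermal transport.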
Figure 6. Comparison of a digital-twin simulation (which uses diffusion to model transport through the skin) with the curve obtained from actual sampling of fentanyl concentrations in plasma from healthy volunteers (courtesy of a colleague who collected the samples and allowed us to reuse them for this calculation). All the measured curves (only a subsample is illustrated here) are mathematically described by a quasi-periodic function whose peaks lie far from the monotonic function generated by the oversimplified model of the process.
Table 1. Comparison of related depression-detection studies based on fractal measures combined with subsequent machine learning for automatic detection, published before the proliferation of automated protocols.
| Publication | Measures Used | ML Used | Accuracy % | N |
|---|---|---|---|---|
| Ahmadlou et al., 2012 [29] | Higuchi and Katz FD | Enhanced probabilistic neural networks | 91.3 | 24 |
| Hosseinifard et al., 2013 [30] | DFA, Higuchi, correlation dimension, Lyapunov exp. | KNN, LR, linear discriminant | 90 | 45 |
| Bairy et al., 2015 [31] | HFD, SampEn, CD, Hurst exp., LLA, DFA | DT, KNN, NB, SVM | 93.8 | 60 |
| Acharya et al., 2015 [32] | HFD, CD, DFA, H, LLE, DET, ENTR, LAM, Emph, Ent 1, Ent 2 | W_bSVM, KNN, NB, PNN, DT | 98 | 30 |
| Bachmann et al., 2018 [33] | HFD, DFA, Lempel–Ziv complexity, and SASI | Logistic regression | 88 | 26 |
| Čukić et al., 2018 & 2020 [21] | HFD, SampEn | MP, LR, SVM (with and without kernel), DT, RF, NB | 97.50 | 46 |
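The studies in Table 1 share a common pipeline: compute fractal/entropy features per subject, then feed them to a classifier. A minimal sketch of that pipeline on purely synthetic (HFD, SampEn) feature pairs, with made-up group means (the clean separation shown is illustrative, not the published effect sizes) and a hand-rolled logistic regression standing in for the listed LR/SVM/RF models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic features: one (HFD, SampEn) pair per subject.
# The patient group is simulated with higher values, mirroring the
# elevated-complexity findings in depression; numbers are illustrative.
n = 100
controls = rng.normal([1.45, 0.60], 0.05, size=(n, 2))
patients = rng.normal([1.60, 0.75], 0.05, size=(n, 2))
X = np.vstack([controls, patients])
y = np.r_[np.zeros(n), np.ones(n)]

# Standardize features, then fit a tiny logistic regression by
# gradient descent (a stand-in for the models listed in Table 1).
X = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * (p - y).mean()                # gradient step on bias

acc = float(((p > 0.5) == y).mean())         # training accuracy
```

In practice, the accuracies in Table 1 come from cross-validated evaluation on held-out subjects, not training accuracy as in this toy sketch; generalization checks are essential before any clinical claim.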
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Radenković, M.Č.; Porcaro, C.; Lopez, V. An Evolution of Our Understanding of Decomplexification Estimation for Early Detection, Monitoring and Modeling of Human Physiology. Fractal Fract. 2026, 10, 169. https://doi.org/10.3390/fractalfract10030169

