Abstract
In this paper, we investigate an application of the statistical concept of causality, based on Granger’s definition of causality, to raw increasing processes as well as to optional and predictable measures. A raw increasing process is optional (predictable) if the bounded (left-continuous) process X associated with the measure is self-caused. Also, the measure is optional (predictable) if an associated process X is self-caused, under some additional assumptions. Some of the obtained results, expressed in terms of self-causality, can be directly applied to formulating conditions for an optional stopping time to become predictable.
Keywords:
filtration; statistical causality; increasing processes; optional and predictable measures; predictable stopping time
MSC:
60G44; 60G40; 60G57; 62P20
1. Introduction
In the past few decades, scientists have been interested in probabilistic concepts of causality according to which causes should involve and induce their effects or, to put it more clearly, raise the probability of their effects. This problem has taken a central role in philosophy and has important applications in many different sciences.
The processes that we investigate take place under different conditions, but within the changes there are connections that remain constant. For example, objects released in mid-air under various conditions will usually fall to the ground; but if the object is a piece of paper and a strong breeze is blowing, the piece of paper will almost certainly rise. Hence, we can understand the laws of nature as conditional, because they apply only when unusual or rarely occurring circumstances are neglected. This example is not an isolated one; plenty of different yet similar phenomena can be observed. Based on this type of behaviour, we may suppose that during these processes some of the relationships are consistent and that the resulting changes are no coincidence. Such a consistent relationship can lead us to interpret this constancy as necessary; in other words, it could not be otherwise, because it is an essential part of the observed stochastic process. The necessary relationships between objects, events, conditions, or other things at a given time and those at later times are referred to as causal laws.
The causal laws governing a specific problem are usually not known a priori; they are discovered later in nature. The existence of a common relationship that holds between two objects regardless of the conditions is the first sign of a potential causal law. When we notice such conditions, we suppose that they reflect causal relationships instead of rejecting them as arbitrary connections. The next step in establishing that the observed regularities are a result of causality is to form a hypothesis about the observed connections, which can help us explain and understand their origin.
The first problem considered was the relationship between causality and the realisation of events through time. For example, before winter the leaves fall off the trees. Of course, the loss of leaves by the trees is not the cause of winter; it is instead an effect of the lowering of the temperature, which first leads to the loss of leaves and, later, to the coming of winter. Clearly, the concept of causal relations implies more than just the regular realisation of events, in which preceding events represent the cause while an event realised later is merely an effect. The trees lose their leaves before the coming of winter, but this is not the cause of winter. In this case it is easy to see what is the cause and what is the effect, but there are phenomena for which the conclusion is not so simple. This is the reason why some causes may be considered spurious.
If we accept the idea that each occurrence and event has many causes, new problems arise. First of all, all events and objects may have some connections, even very slight ones. On the other hand, one could say that everything is connected and that there are numerous causes of any problem of interest. Usually, however, most of these connections have insignificant effects. This is the reason why we define the “significant causes” as those occurrences which have a significant influence on the considered effects [1,2].
Sometimes, causes and effects can be observed by conducting an experiment, but this is not always possible. A well-known example of a science in which controlled experiments cannot be applied, and in which the conditions of the problem cannot be outlined very well, is geology. The typical question is: “What could have caused these present structures to be what they are?” For example, consider a set of layers of rock placed diagonally. Obviously, the layers were placed horizontally at first, when the surface was at the bottom of a sea or a lake. Then, the layers were probably pushed up and modified by earthquakes or other earth movements. Although this explanation seems very reasonable and credible, there is no way to prove it by controlled experiments or by observations carried out under prescribed conditions, because all these processes happened a long time ago and involved tectonic movements that are not easy to verify.
In this paper, we consider a concept of causality established on the basis of Granger causality, where a change in probability is understood as a comparison of conditional probabilities. Granger causality pays particular attention to discrete-time stochastic processes (time series). But in many cases (for example, in physics) it is difficult to consider relations of causality in discrete-time models, and such relations may depend on the length of the sampling interval. The continuous-time framework is fruitful for the statistical approach to causality analysis between rapidly evolving stochastic processes (see [3]), and continuous-time models are seeing increasing use in a range of applications (see, for example, [4,5,6]). This is the reason why we investigate the concept of statistical causality in continuous time. Here, the concept of causality is analysed via conditional independence among σ-fields, which is the foundation for a general probabilistic theory of causality.
The paper is organised as follows. After the Introduction, in Section 2 we present a generalisation of the concept of causality, “ is a cause of within ”, which involves prediction at any horizon in continuous time. The foundation of this concept is Granger’s definition of causality (see [4,5]), and its generalisation in terms of Hilbert spaces is formulated in [7,8]. The concept of causality in continuous time associated with stopping times, together with some basic properties, is introduced in [9]. The given concept of causality is related to separable processes [10] and extremal measures [11].
Section 3 contains our main results. In this section, we relate the given concept of causality in continuous time to the optionality and predictability of the raw increasing process associated with a measure. We also give conditions, in terms of causality, for a stopping time to be predictable, which has important applications in mathematical finance, especially in risk theory. The relation between the concept of causality and optional and predictable measures is considered, too. Section 4 provides an application of the obtained results to risk theory. We end the article with some concluding remarks.
2. Preliminaries and Notation
Causality is clearly a forecasting property, and the main question is as follows: is it possible to reduce the available information and still predict a given filtration?
A probabilistic model for a time-dependent system is represented by , where is a probability space and is a “framework” filtration satisfying the usual conditions of right continuity and completeness. is the smallest σ-algebra containing all the . Analogous notation will be used for the filtrations , and . The filtration is a subfiltration of , written , if for each t. For a stochastic process X, denotes the smallest σ-algebra with respect to which all , , are measurable, and is the natural filtration of the process X, i.e., the smallest filtration to which X is adapted.
The definition of causality uses the conditional independence of σ-algebras, and the initial step is to construct the “smallest” sub-σ-algebra of a σ-algebra conditionally on which becomes independent of another given σ-algebra .
Definition 1
(compare with [12,13]). Let be a probability space and let , and be arbitrary sub-σ-algebras of . It is said that is splitting for and , or that and are conditionally independent given (written ), if
where denote positive random variables measurable with respect to the corresponding σ-algebras and , respectively.
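Since the displayed formula was lost in extraction, the following is a hedged reconstruction in standard notation (the symbols $\mathcal{F}_1$, $\mathcal{F}_2$, $\mathcal{G}$ for the three sub-σ-algebras are our assumption, not the paper’s):

```latex
% Conditional independence of F_1 and F_2 given G, in product form:
E\left[\, X\,Y \mid \mathcal{G} \,\right]
   = E\left[\, X \mid \mathcal{G} \,\right]\,
     E\left[\, Y \mid \mathcal{G} \,\right] \quad \text{a.s.,}
% where X and Y are positive random variables measurable with respect
% to F_1 and F_2, respectively.
```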
Some elementary properties of this concept are introduced in [14].
Now, we give a definition of causality between σ-algebras using the concept of conditional independence.
Definition 2
(see [5]). Let be a σ-algebra. A σ-algebra is a sufficient cause of at time t (relative to ), or within (and written as ) if and only if
and
The intuitive notion of causality in continuous time, formulated in terms of Hilbert spaces, is investigated in [8]. Now, we consider the corresponding notion of causality for filtrations based on the conditional independence between sub--algebras of (see [12,14]).
Definition 3
(See [7,8]). It is said that is a cause of within relative to P (written ) if , and if is conditionally independent of given for each t, i.e., (i.e., holds for each t and each u), or
Intuitively, means that all the information about provided by is contained in for arbitrary t; equivalently, contains all the information from needed for predicting . The subfiltration can thus be viewed as a reduction in information.
The definition of causality introduced in [5] has the following form: “It is said that entirely causes within relative to P (written ;P) if , and if for each t”. This definition is similar to Definition 3, except that instead of it contains the condition , or equivalently for each t, without natural justification. Definition 3 is thus more general than the definition given in [5], so all results related to causality in the sense of Definition 3 hold for the concept of causality introduced in [5] (p. 3) once the condition is added.
If and are filtrations such that , we say that is self-caused (or its own cause) within (compare with [5]). It should be noted that the statement “ is self-caused” can be applied to the theory of martingales and stochastic integration (see [15]); the hypothesis introduced in [15] is equivalent to the concept of being “self-caused”. Also, the notion of being “self-caused”, as defined here, is equivalent to the notion of subordination (as introduced in [13]).
If and are such that (where is the σ-algebra defined by ), we say that does not cause . Clearly, the interpretation of Granger causality is that does not cause if holds (see [5]). It can be shown that this notion is equivalent to the statement “ does not anticipate ” (as introduced in [13]).
The given concept of causality can be applied to stochastic processes: instead of the stochastic processes themselves, we consider the corresponding induced (natural) filtrations. For example, an -adapted stochastic process is self-caused if is self-caused within , i.e., if
A self-caused process X is entirely determined by its behaviour with respect to its natural filtration (see [6]). For example, a process is a Markov process relative to the filtration on a filtered probability space if and only if X is a Markov process with respect to and is self-caused within relative to P. The same holds for Brownian motion : W is a Brownian motion with respect to the filtration on a filtered probability space if and only if it is self-caused within relative to the probability P, i.e., (see [6]).
In many situations, we observe certain systems up to some random time, for example, up to the time when something happens for the first time.
Definition 4
([16,17]). A random variable T defined on is called a -stopping time, if for each .
Let us recall some elementary properties of stopping times and the associated σ-algebras. For a given -stopping time T, the associated σ-algebra is defined by . Intuitively, is the information available at time T. For a process X, we set whenever . We define the stopped process by
Note that if X is adapted and cadlag and T is a stopping time, then the stopped process is adapted, too.
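As a hedged numerical illustration (the random walk and the level 10 are our own choices, not the paper’s), the following sketches a hitting time and the stopped process in discrete time:

```python
import numpy as np

rng = np.random.default_rng(0)

def hitting_time(path, level):
    """First index at which the path reaches `level`; this is a stopping
    time, since deciding {T <= t} only needs the path up to time t."""
    hits = np.nonzero(path >= level)[0]
    return int(hits[0]) if hits.size else len(path) - 1

def stopped_path(path, T):
    """The stopped process X^T_t = X_{t wedge T}: frozen at X_T after T."""
    out = path.copy()
    out[T:] = path[T]
    return out

# A simple symmetric random walk plays the role of a cadlag adapted process.
steps = rng.choice([-1, 1], size=1000)
X = np.concatenate(([0], np.cumsum(steps)))
T = hitting_time(X, 10)
XT = stopped_path(X, T)
```

The stopped path agrees with X up to T and is constant afterwards, mirroring the adaptedness remark above.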
So, it is natural to consider causality in continuous time which involves stopping times, a class of random variables that plays an essential role in the theory of martingales (for details, see [18,19]).
The generalisation of Definition 3 from fixed times to stopping times is introduced in [9].
Compared to the definition in (3), the definition from [9] reduces the amount of information needed to predict some other filtration.
Definition 5
([19,20]). A stopping time T is predictable if there exists a sequence of stopping times such that is increasing, on for all n and a.s. Such a sequence is said to announce T.
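A small numerical sketch of Definition 5, with illustrative values of our own: a deterministic time T is predictable because the sequence T_n = T·n/(n+1) announces it (by contrast, a totally inaccessible time such as the first jump of a Poisson process admits no announcing sequence).

```python
# A deterministic time T is predictable: the sequence T_n = T*n/(n+1)
# is increasing, stays strictly below T, and converges to T a.s.
T = 2.0
Tn = [T * n / (n + 1) for n in range(1, 200)]

increasing = all(a < b for a, b in zip(Tn, Tn[1:]))  # strictly increasing
below = all(t < T for t in Tn)                       # T_n < T on {T > 0}
gap = T - Tn[-1]                                     # -> 0 as n grows
```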
Let us mention that because of the right continuity of the filtration and Theorem 3.27 in [16], we can use the definition of predictable time in the sense of Definition 3.25 from [16], too.
3. Causality, Increasing Processes, Optional and Predictable Measures
Increasing processes are very important in the general theory of stochastic processes, and an increasing process can sometimes be considered as a random measure on .
Definition 6
([12,21]). An increasing process is any process which is non-negative, adapted, and whose paths are increasing and cadlag.
Therefore, every increasing process is optional.
Remark 1
([12,21]). A stochastic process which is non-negative, and whose paths are increasing and cadlag, but which is not adapted, is called a raw increasing process.
Raw increasing processes are not “more general” objects than increasing processes: they are simply the increasing processes for the family of σ-fields , and hence no special theory is needed for them. We mention them because we shall work with processes that are not necessarily adapted but whose paths have these properties.
The increasing process A is called integrable if . A process which can be represented as the difference of two increasing processes (resp. integrable increasing processes) is called a process of finite variation (resp. of integrable variation). An increasing process A is said to be natural if it is integrable and for any bounded martingale .
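As a hedged illustration (the Poisson process and the rate are our choices, not the paper’s), a Poisson counting process is an integrable increasing process on a finite horizon: its paths are non-negative, non-decreasing, and cadlag, and E[A_t] = λt is finite.

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_path(rate, horizon, grid):
    """Cadlag increasing path of a Poisson process: N_t = number of
    jump times <= t (side='right' gives right-continuity)."""
    n = rng.poisson(rate * horizon)
    jumps = np.sort(rng.uniform(0.0, horizon, size=n))
    return np.searchsorted(jumps, grid, side="right")

grid = np.linspace(0.0, 10.0, 1001)
A = poisson_path(2.0, 10.0, grid)

# integrability on [0, 10]: E[A_10] = rate * horizon = 20 (Monte Carlo check)
m = np.mean([poisson_path(2.0, 10.0, np.array([10.0]))[0]
             for _ in range(4000)])
```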
In many situations, it is of interest to investigate processes which are not measurable through time but which have some continuity properties. Left-continuous, adapted processes are predictable (for example, consider the cadlag Feller process). Right-continuous, adapted processes are optional (for example, Brownian motion). In mathematical finance, for instance, the consumption plan of an investor choosing a trading strategy is usually described by a non-negative optional process (see [22]), while the risk premiums associated with the risky Brownian motion are described by a predictable process, in a model of risky asset price evolution satisfying NFLVR (no free lunch with vanishing risk; see [22]). Let us mention that, in general, the integrands of stochastic integrals should be predictable processes.
The connection between the concept of causality and optional and predictable processes is considered in [23]. Now, we investigate optional and predictable measures in the sense of Definition 3.
By we denote the optional projection of the process X, while denotes its predictable projection (applications of the concept of causality to optional and predictable projections are given in [23]). According to Remark 45 in [12], if H is a random variable and denotes a cadlag version of the martingale , then the optional projection of the process that is constant through time is the process , and its predictable projection is . Let us mention that a self-caused process X admits a right-continuous modification, and therefore for the optional and predictable projections we have .
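A discrete toy version of the remark above, under assumptions of our own (a coin-flip filtration, not from the source): for the process constant in time and equal to H, the optional projection at stage t is the martingale E[H | F_t].

```python
import numpy as np

rng = np.random.default_rng(5)

n_paths, n_steps = 40000, 4
flips = rng.choice([0.0, 1.0], size=(n_paths, n_steps))
H = flips.sum(axis=1)                 # H depends on the whole path

def optional_projection(t):
    """E[H | first t flips]: observed flips plus the mean of the rest."""
    return flips[:, :t].sum(axis=1) + (n_steps - t) * 0.5

# The projection is a martingale closed by H: its mean equals E[H] at
# every stage t, and at t = n_steps it coincides with H itself.
means = [optional_projection(t).mean() for t in range(n_steps + 1)]
```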
For a given raw process A and a bounded -measurable process X, we set
Every such measure can be obtained in this way, and according to Theorem 65 in [12], the converse statement holds, too.
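A Monte Carlo sketch of the association above, under toy assumptions of our own (A_s = U·s with U uniform on (0,1), so A is a raw, non-adapted increasing process): the measure acts as mu_A(X) = E[∫ X_s dA_s].

```python
import numpy as np

rng = np.random.default_rng(2)

def mu_A(X, n_paths=10000, horizon=1.0):
    """Monte Carlo estimate of mu_A(X) = E[ int_0^1 X_s dA_s ] for the
    toy raw increasing process A_s = U*s, U ~ Uniform(0,1): here
    dA_s = U ds, so the Stieltjes integral is U times int X_s ds."""
    s = np.linspace(0.0, horizon, 101)
    total = 0.0
    for _ in range(n_paths):
        U = rng.uniform()
        xs = X(s)
        # trapezoidal approximation of int_0^1 X_s ds
        integral = np.sum(0.5 * (xs[:-1] + xs[1:]) * np.diff(s))
        total += U * integral
    return total / n_paths

# For the constant process X = 1: mu_A(1) = E[U] * horizon = 0.5
est = mu_A(lambda s: np.ones_like(s))
```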
Theorem 1.
Let be a probability space. A raw integrable increasing process associated with the measure is optional if the bounded process is self-caused, i.e., if holds.
Proof.
Suppose that is a raw integrable increasing process. Because of causality and Theorem 3 in [23], the bounded process is optional. Then, for the optional projection of , we have . Due to Theorem 59 in [12] (p. 124), for the integrable increasing process we have
Therefore, is an optional process. □
Example 1.
We can take , where a is a positive Borel function on , and let be a cadlag version of the martingale . According to Theorem 3.1 in [6], is a self-caused process, and by Theorem 1 the increasing process is optional. Indeed, we have
According to Theorem 3 in [23], is an optional process and therefore , and we have
According to Theorem 65 in [12], is an optional process.
Theorem 2.
Let be a probability space. A raw integrable increasing process associated with the measure is predictable if the bounded, left-continuous process is self-caused, i.e., if holds.
Proof.
Suppose that is a raw integrable increasing process. Because of causality and Remark 2 in [23], the bounded, left-continuous process is predictable. Then, for the predictable projection of , we have , and because of left-continuity, we have . Due to Theorem 59 in [12] (p. 124), for integrable, increasing process , we have
Therefore, is a predictable process. □
Example 2.
We can take to be a cadlag version of the left-continuous martingale . According to Theorem 3.1 in [6], is a self-caused process, and by Theorem 2 the increasing process is predictable. Indeed, we have
According to Remark 2 in [23], is a predictable process and therefore , and we have
According to Theorem 65 in [12], is a predictable process.
Suppose that X is a measurable process. The predictable projection, denoted by , is a predictable process which can be understood as a process ‘near’ to X, in the sense that for every predictable stopping time T the expected values of the stopped variables and are equal. If X is the gain process of some game and T is an exit strategy, then the stopped variable is the value of the game when one plays the exit strategy T. If a stopping time is ‘predictable’, this means that we can foresee it. Since and have the same expected value for predictable exit rules, it is irrelevant, on average, whether we play the game X or the predictable game .
The next lemma is a consequence of the previous theorem and gives conditions, in terms of causality, for a stopping time to be predictable.
Lemma 1.
Let be a probability space and T be a stopping time. T is a predictable time if every bounded, left-continuous process is self-caused, i.e., if holds.
Proof.
The proof follows directly from Theorem 2 when the optional increasing process is of the form . Then, because of Remark 2 in [23], the process is predictable, so we have
According to Theorem 2, is a predictable process. Therefore, the set for is predictable. According to Definition 3.25 in [16], T is a predictable stopping time. □
Corollary 1.
Let T be a stopping time and set . The restriction is a predictable stopping time if every bounded, left-continuous process is self-caused.
Proof.
Let T be a stopping time and let be a left-continuous, bounded, self-caused process. By Lemma 1, T is a predictable stopping time, and according to Corollary 10.15 in [24], the restriction is predictable, too. □
Theorem 3.
Let T be an optional time. The stopping time T is predictable if and only if the bounded process with is self-caused, i.e., .
Proof.
Let T be a predictable stopping time. By Lemma 10.3 in [24], the process is predictable. According to Theorem 10.12 in [24], X is a natural process and, by definition, is a martingale. Causality then follows from Theorem 4.1 in [6], by which every martingale is a self-caused process .
Conversely, suppose that is a bounded, self-caused process with . Because of causality, is an -martingale. By Theorem 10.14 in [24], T is a predictable stopping time. □
Assume that K is a local martingale. The jumps ΔK of K clearly represent an ‘unpredictable’ part of K. The basic examples of the predictable projection are the identities and . The latter shows that we cannot ‘foresee’ the size of the jumps of a local martingale.
It is very natural to ask whether anyone can forecast the jumps in stock prices. In mathematical finance, stock prices are usually driven by some local martingale, and obviously no one can predict the jumps in the price processes. Thus, the modest connection just mentioned has remarkably significant theoretical and applied implications.
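A discrete analogue of this unforeseeability, under assumptions of our own (a martingale built from centred Poisson increments, not an example from the source): for a martingale K, the conditional expectation of the next jump given the past is zero, whatever past value was observed.

```python
import numpy as np

rng = np.random.default_rng(4)

# Martingale with jumps: partial sums of centred Poisson increments.
n_paths, n_steps = 50000, 5
jumps = rng.poisson(1.0, size=(n_paths, n_steps)) - 1.0  # mean-zero jumps
K = np.cumsum(jumps, axis=1)

# E[Delta K_3 | K_2 = v] should be ~0 for every observed past value v:
past, nxt = K[:, 1], jumps[:, 2]
cond_means = {v: nxt[past == v].mean() for v in (-1.0, 0.0, 1.0)}
```

Whatever the conditioning value, the average next jump stays near zero, which is the discrete counterpart of the statement that the jump sizes of a local martingale cannot be foreseen.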
We use to denote the -field of all measurable sets augmented with all the P-evanescent sets.
Definition 7
([12,16]). Let μ be a measure on that does not charge any evanescent set. The measure μ is called optional (predictable) if, for every non-negative, bounded, measurable process X,
where
Obviously, for a bounded, measurable process X we have , where is an optional measure. Similarly, for a predictable measure we have .
Theorem 4.
Let be a probability space. Then, for a self-caused, bounded process X and a positive P measure μ on , there is a raw increasing process A such that the associated measure is an optional measure.
Proof.
Let be a probability space and a positive P measure. According to Theorem 65 in [12], there exists a raw integrable increasing process A, unique up to an evanescent process. Then, for every bounded measurable process X, the measure associated with the process A is defined by
Let hold. Then, according to Theorem 3 in [23], the process is an optional process. Therefore, , so
Due to Definition 5.12 in [16] (p. 141), is an optional measure. □
Example 3.
Let λ be a P measure on . We define a P measure μ on by setting for every self-caused, bounded measurable process X. Then, according to Theorem 4, the measure is optional.
Indeed, according to Theorem 3 in [23], a process is optional if it is self-caused; therefore, X is an optional process, so . Thus, is an optional measure, and is optional, too.
Theorem 5.
Let be a probability space. Then, for a self-caused, left-continuous, bounded process X and a positive P measure μ on , there exists a raw process of integrable variation A such that the associated measure is a predictable measure.
Proof.
Let be a probability space and be a positive P measure. According to Theorem 65 in [12], there exists a raw process of integrable variation A, unique up to an evanescent process. Then, for every bounded measurable process X, the measure associated with the process A is defined by
Let hold. Then, according to Remark 2 in [23], the self-caused, left-continuous process is an -predictable process. Therefore, , so
Due to definition 5.12 in [16], is a predictable measure. □
4. Application
Let be a filtered probability space. Mathematical risk theory investigates stochastic models of risk in finance and insurance. The basic risk model considers the value of a risk portfolio as the sum of premium payments, which increase the value of the portfolio, and claim payouts, which decrease it. Premium payments are received to cover liabilities: expected losses from claim payouts and other costs. Claims result from risk events that occur at random times, and claim payouts are very often modelled by point processes. An important problem in risk theory is the calculation of the probability of ruin, the likelihood that the value of a risk portfolio will ever become negative, and of the time to ruin.
A risk process can be represented as with initial condition . The process B is a continuous predictable process of finite variation representing the stable income payments together with premiums; N is a continuous local martingale modelling a random perturbation; D is a right-continuous jump process representing accumulated claims, which may also contain jumps caused by unanticipated falls or rises in returns on investment; and L is a left-continuous jump process modelling gains or losses in returns on investment. All these processes are optional and adapted to the filtration .
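A hedged Monte Carlo sketch of a special case of this model (a classical Cramér–Lundberg process R_t = u + c·t minus compound-Poisson claims with exponential sizes; all parameter values below are our own illustrative choices, not the paper’s):

```python
import numpy as np

rng = np.random.default_rng(3)

def ruin_probability(u, c, lam, mean_claim, horizon=100.0, n_paths=2000):
    """Estimate P(R_t < 0 for some t <= horizon) for the risk process
    R_t = u + c*t - (sum of exponential claims at Poisson arrival times).
    Ruin can only occur at claim epochs, so only those are simulated."""
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)       # next claim arrival
            if t > horizon:
                break
            claims += rng.exponential(mean_claim)  # claim payout
            if u + c * t - claims < 0.0:           # reserve below zero
                ruined += 1
                break
    return ruined / n_paths

# Positive safety loading (c > lam * mean_claim) keeps ruin unlikely.
p = ruin_probability(u=10.0, c=1.5, lam=1.0, mean_claim=1.0)
```

For these values, the classical exponential-claims formula ψ(u) = (λμ/c)·exp(−(1/μ − λ/c)u) gives roughly 0.02, so the estimate should be small but typically positive.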
Measures and are random measures describing the jumps of the processes and , while and are the unique predictable measures of the form , where A is an increasing, predictable, right-continuous process (see the Lemma in [25]). Based on the Doob–Meyer decomposition of optional semimartingales, R is a special optional semimartingale adapted to .
For the risk process R, we introduce the optional cumulant function . Let be a self-caused optional process with . Then , and by Theorem 3, T is a predictable stopping time. According to Lemma 5.2.4 in [25], for the optional cumulant process G, if , which in risk theory means that claim payouts cannot be predicted beforehand. In other words, if the optional semimartingale X is self-caused, then claims cannot be predicted beforehand.
Thus, we have the following proposition.
Proposition 1.
Let be a self-caused process with . Then, claims cannot be predicted beforehand.
5. Conclusions
In this paper, we presented several new results considering optional and predictable measures, increasing processes and predictable times. The predictable measures are very important, particularly for statistical analysis of financial data.
An open question is to consider enlargements of filtrations in the context of applications in mathematical finance, their connection to predictable and optional measures, and the preservation of these properties of measures. Also, it would be of interest to include stopping times and to consider what happens over a finite horizon.
Author Contributions
Conceptualisation, D.V.; methodology, D.V.; investigation, D.V.; writing—original draft preparation, D.V.; formal analysis, D.V.; writing—review and editing, V.S.; visualisation, A.V.; resources, A.V. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Bohm, D. Causality and Chance in Modern Physics; Routledge: London, UK, 1984. [Google Scholar]
- Eells, E. Probabilistic Causality; Cambridge University Press: Cambridge, UK, 1996. [Google Scholar]
- Comte, F.; Renault, E. Noncausality in Continuous Time Models. Economet. Theor. 1996, 12, 215–256. [Google Scholar] [CrossRef]
- Granger, C.W.J. Investigating Causal Relations by Econometric Models and Cross Spectral Methods. Econometrica 1969, 37, 424–438. [Google Scholar] [CrossRef]
- Mykland, P.A. Statistical Causality; Report No. 14; University of Bergen: Bergen, Norway, 1986. [Google Scholar]
- Petrović, L.; Stanojević, D. Statistical Causality, Extremal Measures and Weak Solutions of Stochastical Differential Equations with Driving Semimartingales. J. Math. Model. Algorithms 2010, 9, 113–128. [Google Scholar] [CrossRef]
- Gill, J.B.; Petrović, L. Causality and Stochastic Dynamic Systems. SIAM J. Appl. Math. 1987, 47, 1361–1366. [Google Scholar] [CrossRef]
- Petrović, L. Causality and Stochastic Realization Problem. Publ. Inst. Math. (Beograd) 1989, 45, 203–212. [Google Scholar]
- Petrović, L.; Dimitrijević, S.; Valjarević, D. Granger Causality and stopping times. Lith. Math. J. 2016, 56, 410–416. [Google Scholar] [CrossRef]
- Valjarević, D.; Petrović, L. Statistical causality and separable processes. Statist. Probab. Lett. 2020, 167, 108915. [Google Scholar] [CrossRef]
- Petrović, L.; Valjarević, D. Statistical causality and extremality of measures. Bull. Korean Math. Soc. 2018, 55, 561–572. [Google Scholar]
- Dellacherie, C.; Meyer, P.A. Probability and Potentials; Blaisdell Publishing Company: Waltham, MA, USA, 1966. [Google Scholar]
- Rozanov, Y.A. Innovation Processes; V. H. Winston and Sons: New York, NY, USA, 1977. [Google Scholar]
- Florens, J.P.; Mouchart, M.; Rolin, J.M. Elements of Bayesian Statistics. Pure and Applied Mathematics, A Series of Monographs and Textbooks; Marcel Dekker Inc.: New York, NY, USA; Basel, Switzerland, 1990. [Google Scholar]
- Bremaud, P.; Yor, M. Changes of Filtration and of Probability Measures. Z. Wahrscheinlichkeitstheorie Verwandte Geb. 1978, 45, 269–295. [Google Scholar] [CrossRef]
- He, S.W.; Wang, J.G.; Yan, J.A. Semimartingale Theory and Stochastic Calculus; CRC Press: Boca Raton, FL, USA, 1992. [Google Scholar]
- Revuz, D.; Yor, M. Continuous Martingales and Brownian Motion; Springer: New York, NY, USA, 2005. [Google Scholar]
- Jacod, J.; Shiryaev, A.N. Limit Theorems for Stochastic Processes; Springer: Berlin/Heidelberg, Germany, 2002. [Google Scholar]
- Protter, P. Stochastic Integration and Differential Equations; Springer: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
- Medvegyev, P. Stochastic Integration Theory; Oxford University Press: Oxford, UK, 2007. [Google Scholar]
- Nikeghbali, A. An essay on the general theory of stochastic processes. Probab. Surv. 2006, 3, 345–412. [Google Scholar] [CrossRef]
- Jarrow, R. Continuous-Time Asset Pricing Theory: A Martingale-Based Approach; Springer: New York, NY, USA, 2021. [Google Scholar]
- Valjarević, D.; Dimitrijević, S.; Petrović, L. Statistical causality and optional and predictable projections. Lith. Math. J. 2023, 63, 104–116. [Google Scholar] [CrossRef]
- Kallenberg, O. Foundations of Modern Probability, 3rd ed.; Springer Nature: Cham, Switzerland, 2021. [Google Scholar]
- Pak, A. Optional Processes and their Applications in Mathematical Finance, Risk Theory and Statistics. Ph.D. Thesis, Department of Mathematical and Statistical Sciences, University of Alberta, Edmonton, AB, Canada, 2021. [Google Scholar]