Abstract
In this paper, we introduce the concept of monotonicity according to a direction for a set of random variables. This concept extends well-known multivariate dependence notions, such as corner set monotonicity, and can be used to detect dependence in multivariate distributions not detected by other known concepts of dependence. Additionally, we establish relationships with other known multivariate dependence concepts, outline some of their salient properties, and provide several examples.
MSC:
60E15; 62H05
1. Introduction
There are numerous methodologies and approaches available for the exploration and analysis of the intricate relationships of dependence among random variables. As underscored by Jogdeo [1], this area stands as a cornerstone of extensive research within the expansive domains of probability theory and statistics. The investigation of dependence among variables is fundamental in understanding the underlying structure and behavior of complex systems, making it a focal point of study across various scientific disciplines.
When examining a multivariate model, it is imperative to conduct a thorough analysis of the specific type of dependence structure it encapsulates. This meticulous scrutiny is essential for discerning the suitability of a particular model for a given dataset or practical application. By comprehensively understanding the nature of the dependence present, researchers can make informed decisions regarding model selection and parameter estimation, thereby enhancing the robustness and reliability of their analyses.
Within the vast landscape of studied dependence types, our attention is particularly drawn to the nuanced distinctions between positive and negative dependence. Positive dependence expresses a tendency for the variables to move in the same direction, exhibiting a mutual influence that often reflects synergistic relationships. Conversely, negative dependence denotes an inverse relationship, where the movement of one variable is accompanied by a corresponding opposite movement in another, indicative of regulatory or inhibitory interactions.
By elucidating the intricacies of positive and negative dependence, researchers gain valuable insights into the underlying dynamics of the systems under study. This deeper understanding not only enriches theoretical frameworks but also has practical implications in various fields, including finance, engineering, and epidemiology. Moreover, it underscores the importance of considering diverse dependence structures in statistical modeling, ensuring that analyses accurately capture the complexities of real-world phenomena.
Positive dependence is defined by any criterion capable of mathematically characterizing the inclination of components within an n-variate random vector to assume concordant values [2]. As emphasized by Barlow and Proschan [3], the concepts of (positive) dependence in the multivariate context are more extensive and intricate compared to the bivariate case.
The literature contains various extensions of the bivariate dependence concepts to the multivariate domain (we refer to [4,5,6,7] for more details). Our objective in this study is to extend certain established notions of multivariate positive and negative dependence. This includes the exploration of concepts such as orthant dependence and corner set monotonicity, investigating their connections with other dependence concepts and presenting several associated properties.
The paper is organized as follows. We begin with some preliminaries (Section 2) pertaining to the properties of multivariate dependence. This section serves to lay the foundation for our subsequent analyses by elucidating key concepts and frameworks, essential for understanding the complexities of dependence structures among random variables. Following the preliminaries, in Section 3, we delve into the concept of monotonic random variables with respect to a given direction. This extends the notion of corner set monotonicity and provides a more nuanced understanding of the directional dependence present in multivariate systems. We further explore several properties pertaining to these monotonic random variables and provide some examples. Finally, Section 4 is dedicated to presenting our conclusions drawn from the analyses and discussions shown in the preceding sections.
2. Preliminaries
In the sequel, by convention, we will use "increasing" (respectively, "decreasing") and "nondecreasing" (respectively, "nonincreasing") interchangeably. In addition, a subset $A \subseteq \mathbb{R}^n$, with $n \geq 1$, is an increasing set if its indicator function is increasing.
Let $n$ be a natural number. Let $(\Omega, \mathcal{F}, P)$ be a probability space, where $\Omega$ is a nonempty set, $\mathcal{F}$ is a $\sigma$-algebra of subsets of $\Omega$, and $P$ is a probability measure on $\mathcal{F}$, and let $\mathbf{X} = (X_1, X_2, \ldots, X_n)$ be an $n$-dimensional random vector from $\Omega$ to $\mathbb{R}^n$.
The orthant dependence according to a direction is defined as follows [8]: Let $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)$ be a vector in $\mathbb{R}^n$ such that $\alpha_i \in \{-1, 1\}$ for all $i \in \{1, 2, \ldots, n\}$. An $n$-dimensional random vector $\mathbf{X} = (X_1, X_2, \ldots, X_n)$—or its joint distribution function—is said to be orthant positive (respectively, negative) dependent according to the direction α—written PD($\alpha$) (respectively, ND($\alpha$))—if
$$P\left[\bigcap_{i=1}^{n} \{\alpha_i X_i > \alpha_i x_i\}\right] \geq \prod_{i=1}^{n} P[\alpha_i X_i > \alpha_i x_i] \qquad (1)$$
for all $(x_1, x_2, \ldots, x_n) \in \mathbb{R}^n$ (respectively, with the reversed inequality in (1)).
Note that for some choices of the direction $\alpha$—e.g., $\alpha = (1, 1, \ldots, 1)$ or $\alpha = (-1, -1, \ldots, -1)$—we recover different dependence concepts known in the literature (both bivariate and multivariate), such as positive quadrant dependence, positive upper orthant dependence, etc. (we refer to [2,7,9,10,11,12] for more details).
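As a small numerical illustration—not taken from this paper, and with all names (`pmf`, `is_pd`) ours—inequality (1) with $\alpha = (1, 1)$ can be checked exactly for a comonotone discrete pair, which is positively orthant dependent:

```python
from itertools import product

# Exact check of inequality (1) for alpha = (1, 1) on a comonotone pair:
# Y = X with X uniform on {0, 1, 2}.
pmf = {(0, 0): 1/3, (1, 1): 1/3, (2, 2): 1/3}

def joint_upper(x, y):
    """P(X > x, Y > y)."""
    return sum(p for (a, b), p in pmf.items() if a > x and b > y)

def upper_x(x):
    return sum(p for (a, _), p in pmf.items() if a > x)

def upper_y(y):
    return sum(p for (_, b), p in pmf.items() if b > y)

def is_pd(grid):
    """PD((1,1)): P(X > x, Y > y) >= P(X > x) P(Y > y) at every grid point."""
    return all(joint_upper(x, y) >= upper_x(x) * upper_y(y) - 1e-12
               for x, y in product(grid, repeat=2))
```

Placing the mass on $(0,2), (1,1), (2,0)$ instead gives a countermonotone pair, for which the inequality tends to reverse, in line with the ND($\alpha$) notion.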
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be an $n$-dimensional random vector. The following two multivariate positive dependence notions—on corner set monotonicity—were introduced in [13], where the expression "nonincreasing in $\mathbf{x}$"—and similarly for nondecreasing—means that it is nonincreasing in each of the components of $\mathbf{x}$, and $\{\mathbf{X} \leq \mathbf{x}\}$ means $\{X_i \leq x_i\}$ for all $i \in \{1, \ldots, n\}$:
- $\mathbf{X}$ is left corner set decreasing, denoted by LCSD, if
$$P[\mathbf{X} \leq \mathbf{x} \mid \mathbf{X} \leq \mathbf{x}'] \text{ is nonincreasing in } \mathbf{x}' \text{ for all } \mathbf{x}; \qquad (2)$$
- $\mathbf{X}$ is right corner set increasing, denoted by RCSI, if
$$P[\mathbf{X} > \mathbf{x} \mid \mathbf{X} > \mathbf{x}'] \text{ is nondecreasing in } \mathbf{x}' \text{ for all } \mathbf{x}. \qquad (3)$$
The corresponding negative dependence concepts LCSI (left corner set increasing) and RCSD (right corner set decreasing) are defined in a similar manner by exchanging “nondecreasing” and “nonincreasing” in (2) and (3), respectively.
Let $H$ be the joint distribution function of $\mathbf{X}$. We note that condition (2) can be written as
$$\frac{H(\mathbf{x} \wedge \mathbf{x}')}{H(\mathbf{x}')} \text{ nonincreasing in } \mathbf{x}' \text{ for all } \mathbf{x},$$
where $\mathbf{x} \wedge \mathbf{x}' = (\min(x_1, x'_1), \ldots, \min(x_n, x'_n))$. Denoting by $\overline{H}$ the survival function of $H$, i.e., $\overline{H}(\mathbf{x}) = P[\mathbf{X} > \mathbf{x}]$, condition (3) can be written as
$$\frac{\overline{H}(\mathbf{x} \vee \mathbf{x}')}{\overline{H}(\mathbf{x}')} \text{ nondecreasing in } \mathbf{x}' \text{ for all } \mathbf{x},$$
where $\mathbf{x} \vee \mathbf{x}' = (\max(x_1, x'_1), \ldots, \max(x_n, x'_n))$.
For properties of these notions and relationships with other multivariate dependence concepts, see, e.g., refs. [7,14].
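The two corner set conditions above are ratio-monotonicity conditions on $H$ and $\overline{H}$. As a minimal sketch—assuming two independent U(0,1) variables, with all helper names ours—the LCSD ratio $H(\mathbf{x} \wedge \mathbf{x}')/H(\mathbf{x}')$ can be evaluated along an increasing sequence of $\mathbf{x}'$:

```python
# Sketch (assuming two independent U(0,1) variables; helper names are ours):
# evaluate the LCSD ratio H(x ∧ x') / H(x') as x' increases componentwise.
def H(x, y):
    """Joint distribution function of two independent U(0,1) variables."""
    clamp = lambda t: max(0.0, min(1.0, t))
    return clamp(x) * clamp(y)

def lcsd_ratio(x, xp):
    """P[X <= x | X <= x'] = H(x ∧ x') / H(x')."""
    num = H(min(x[0], xp[0]), min(x[1], xp[1]))
    return num / H(xp[0], xp[1])

x = (0.4, 0.6)
ratios = [lcsd_ratio(x, (t, t)) for t in (0.5, 0.7, 0.9, 1.0)]
# Independence sits on the LCSD boundary: the ratio never increases.
non_increasing = all(a >= b - 1e-12 for a, b in zip(ratios, ratios[1:]))
```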
3. Monotonic Dependence According to a Direction
In this section, we undertake a comprehensive examination of the concepts of corner set monotonicity according to a direction, delineating their definitions within the framework of directional dependence for a set of random variables. Building upon the foundations laid out in Section 2, where the concepts of LCSD and RCSI were recalled, we extend these notions to incorporate directional considerations, thus offering a more nuanced understanding of dependence structures. It is worth noting that a similar analytical approach could be applied to explore negative dependence concepts, mirroring the methodology employed for positive dependence. Furthermore, we not only define these directional dependence concepts but also examine some of their salient properties.
3.1. Definition
We begin with the key definition of this work, in which, for a direction $\alpha = (\alpha_1, \ldots, \alpha_n)$ and an $n$-dimensional random vector $\mathbf{X} = (X_1, \ldots, X_n)$, $\alpha\mathbf{X}$ denotes the vector $(\alpha_1 X_1, \ldots, \alpha_n X_n)$.
Definition 1.
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be an $n$-dimensional random vector and $\alpha = (\alpha_1, \ldots, \alpha_n)$ such that $\alpha_i \in \{-1, 1\}$ for all $i \in \{1, \ldots, n\}$. The random vector $\mathbf{X}$—or its joint distribution function—is said to be increasing (respectively, decreasing) according to the direction α—denoted by I(α) (respectively, D(α))—if
$$P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}']$$
is nondecreasing (respectively, nonincreasing) in $\alpha\mathbf{x}'$ for all $\mathbf{x} \in \mathbb{R}^n$.
In the sequel, we focus on the I($\alpha$) concept. Similar results can be obtained for the D($\alpha$) concept, so we omit them.
Observe that the I($\alpha$) concept generalizes the LCSD and RCSI concepts defined in Section 2; that is, I($(-1, \ldots, -1)$) (respectively, I($(1, \ldots, 1)$)) corresponds to LCSD (respectively, RCSI).
We wish to emphasize that, in general, the I($\alpha$) concept signifies positive dependence in the direction $\alpha$. This means that large values of the variables $X_i$, for $i \in A$, are associated with small values of the variables $X_j$, for $j \in B$, where $A = \{i : \alpha_i = 1\}$ and $B = \{j : \alpha_j = -1\}$. Therefore, if a random vector $\mathbf{X}$ is I($\alpha$), then
$$P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}']$$
is nondecreasing in each $x'_i$ for $i \in A$, and nonincreasing in each $x'_j$ for $j \in B$, for all $\mathbf{x} \in \mathbb{R}^n$.
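To make the mixed-direction reading concrete, consider a toy pair—not an example from the paper, with all helper names ours—where $Y$ is a strictly decreasing function of $X$, so that the pair should behave monotonically for $\alpha = (1, -1)$:

```python
# Toy mixed-direction check (helper names are ours): X uniform on {1, 2, 3}
# and Y = 4 - X, a strictly decreasing function of X, with alpha = (1, -1).
support = [1, 2, 3]

def cond_prob(x, y, xp, yp):
    """P[X > x, -Y > -y | X > x', -Y > -y'] with Y = 4 - X."""
    given = [k for k in support if k > xp and -(4 - k) > -yp]
    hit = [k for k in given if k > x and -(4 - k) > -y]
    return len(hit) / len(given)

# Tightening the conditioning in the first component (raising x') should
# not decrease the conditional probability under I((1, -1)).
p_lo = cond_prob(1, 2, 0, 5)
p_hi = cond_prob(1, 2, 1, 5)
```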
3.2. Relationships with Other Multivariate Dependence Concepts
In this subsection, our focus lies on exploring the connections between the I($\alpha$) dependence notion and several established multivariate dependence concepts within the context of directional analysis. By examining these relationships, we aim to elucidate how the I($\alpha$) concept aligns with or diverges from other well-known measures of dependence, thus providing a more comprehensive understanding of their interplay. Through this investigation, we seek to uncover potential insights into the nature of multivariate dependence and its implications in various analytical scenarios.
We begin our study by recalling the increasing in sequence dependence concept.
Definition 2.
([15]). Let $X_1, \ldots, X_n$ be $n$ random variables and $\alpha = (\alpha_1, \ldots, \alpha_n)$ such that $\alpha_i \in \{-1, 1\}$ for all $i \in \{1, \ldots, n\}$. The random variables are said to be increasing in sequence according to direction α—denoted by IS(α)—if, for any $i \in \{2, \ldots, n\}$,
$$P[\alpha_i X_i > \alpha_i x_i \mid \alpha_1 X_1 > \alpha_1 x_1, \ldots, \alpha_{i-1} X_{i-1} > \alpha_{i-1} x_{i-1}]$$
is nondecreasing in $(\alpha_1 x_1, \ldots, \alpha_{i-1} x_{i-1})$ for all $x_i \in \mathbb{R}$.
The relationship between the I($\alpha$) and IS($\alpha$) dependence concepts is given in the following result.
Proposition 1.
If a random vector is I(α), then it is IS(α).
Proof.
Let such that for . Since is I(), for any i, , we have
where
and
whence $\mathbf{X}$ is IS($\alpha$), which completes the proof. □
The converse of Proposition 1 does not hold in general: see, for instance, (Ref. [16], Exercise 5.33) for a counterexample in the bivariate case.
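A small numerical sketch of the IS($(1,1)$) condition—with helper names ours, not from the paper—uses the comonotone pair $(X_1, X_2) = (X, X)$ with $X$ a fair die:

```python
# Sketch of the IS((1,1)) condition (helper names are ours) for the
# comonotone pair (X1, X2) = (X, X), with X a fair die:
# P[X2 > x2 | X1 > x1] should be nondecreasing in x1.
def cond_seq(x2, x1):
    given = [k for k in range(1, 7) if k > x1]
    return sum(1 for k in given if k > x2) / len(given)

vals = [cond_seq(4, t) for t in range(0, 5)]
nondecreasing = all(u <= v + 1e-12 for u, v in zip(vals, vals[1:]))
```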
In [15], the authors establish a significant result demonstrating that the IS($\alpha$) condition implies PD($\alpha$). Building upon this insight, we readily derive the following result.
Corollary 1.
If a random vector X is I(α), then it is PD(α).
The next definition involves the concept of multivariate totally positive of order two.
Definition 3.
([17]). Let $\mathbf{X} = (X_1, \ldots, X_n)$ be an $n$-dimensional random vector with joint density function $f$, and $\alpha = (\alpha_1, \ldots, \alpha_n)$, with $\alpha_i \in \{-1, 1\}$ for all $i \in \{1, \ldots, n\}$. Then, $\mathbf{X}$ is said to be multivariate totally positive of order two according to direction α—denoted by MTP2($\alpha$)—if the joint density $h$ of $\alpha\mathbf{X}$ satisfies
$$h(\mathbf{x} \vee \mathbf{y})\, h(\mathbf{x} \wedge \mathbf{y}) \geq h(\mathbf{x})\, h(\mathbf{y}) \qquad (4)$$
for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$.
The relationship between the notions I($\alpha$) and MTP2($\alpha$) is given in the following result.
Proposition 2.
If a random vector X is MTP2($\alpha$), then it is I(α).
Proof.
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be an $n$-dimensional random vector such that $\mathbf{X}$ is MTP2($\alpha$). Given $\alpha = (\alpha_1, \ldots, \alpha_n)$, with $\alpha_i \in \{-1, 1\}$ for all $i \in \{1, \ldots, n\}$, we consider three cases:
- If $\alpha_i = -1$ for all $i \in \{1, \ldots, n\}$, we have that $P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}']$ is nondecreasing in $\alpha\mathbf{x}'$; this is the LCSD case.
- If $\alpha_i = 1$ for all $i \in \{1, \ldots, n\}$, we have $P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}'] = \overline{H}(\mathbf{x} \vee \mathbf{x}')/\overline{H}(\mathbf{x}')$, and hence it is nondecreasing in $\mathbf{x}'$; this is the RCSI case.
- Given $k \in \{1, \ldots, n-1\}$, consider, without loss of generality, $\alpha_i = 1$ for $i \in \{1, \ldots, k\}$ and $\alpha_i = -1$ for $i \in \{k+1, \ldots, n\}$, and write the conditional probability of Definition 1 as the quotient in (5). We have that (5) is nondecreasing in $x'_i$ for $i \in \{1, \ldots, k\}$. In order to prove that (5) is also nondecreasing in $\alpha_j x'_j$ for $j \in \{k+1, \ldots, n\}$, we need to verify (6). For that, it suffices to show that the determinant $D$ in (7) is non-positive (note that, in this case, the quotient between the elements of the first column would be less than the quotient between the elements of the second column, obtaining (6)). First, we add the second column, with its sign changed, to the first column; we then add to the second row the first row with its sign changed. Since $\mathbf{X}$ is MTP2($\alpha$), from Ref. [17] (Propositions 2 and 4), inequality (8) holds for any pair of vectors $\mathbf{x}, \mathbf{y}$ with $x_i \leq y_i$ for all $i$, where $h$ is the joint density function of the random vector $\alpha\mathbf{X}$. By integrating both sides of (8), it easily follows that the determinant $D$ in (7) is non-positive.
In all three cases, we obtain that $\mathbf{X}$ is I($\alpha$), which completes the proof. □
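Proposition 2 can be probed numerically. For a standard bivariate normal density it is known that the MTP2 inequality $f(\mathbf{x} \vee \mathbf{y})\, f(\mathbf{x} \wedge \mathbf{y}) \geq f(\mathbf{x})\, f(\mathbf{y})$ holds exactly when the correlation is non-negative; a grid-based sketch (all names ours) checks this directly:

```python
import itertools
import math

def biv_normal_pdf(x, y, rho):
    """Standard bivariate normal density with correlation rho, |rho| < 1."""
    z = (x * x - 2.0 * rho * x * y + y * y) / (1.0 - rho * rho)
    return math.exp(-z / 2.0) / (2.0 * math.pi * math.sqrt(1.0 - rho * rho))

def mtp2_holds(rho, grid):
    """Check f(x ∨ y) f(x ∧ y) >= f(x) f(y) for all pairs of grid points."""
    pts = list(itertools.product(grid, repeat=2))
    for (a1, a2), (b1, b2) in itertools.product(pts, repeat=2):
        lhs = (biv_normal_pdf(max(a1, b1), max(a2, b2), rho)
               * biv_normal_pdf(min(a1, b1), min(a2, b2), rho))
        rhs = biv_normal_pdf(a1, a2, rho) * biv_normal_pdf(b1, b2, rho)
        if lhs < rhs - 1e-12:
            return False
    return True

grid = [-1.5, -0.5, 0.5, 1.5]
```

For $\rho \geq 0$ the check passes on the grid, while a negative $\rho$ produces a violating pair of incomparable points.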
In order to conclude this subsection, we summarize the relationships among the different dependence concepts outlined above in the following scheme:
$$\text{MTP2}(\alpha) \Longrightarrow \text{I}(\alpha) \Longrightarrow \text{IS}(\alpha) \Longrightarrow \text{PD}(\alpha).$$
3.3. Properties
The subsequent results encapsulate essential properties of the I($\alpha$) families. These properties span a diverse range of scenarios, encompassing the behavior of independent random variables, subsets of I($\alpha$) random variables, transformations by strictly increasing functions, and the concatenation of I($\alpha$) random vectors. Furthermore, these results touch upon topics such as weak convergence, thereby providing a comprehensive framework for analyzing and understanding multivariate dependence within the realm of the I($\alpha$) families.
Proposition 3.
Every set of independent random variables is I(α) for any direction $\alpha$.
Proof.
If the random variables are independent, then for any and for all , we have
Given , consider the probability . We study two cases:
- If , we have
- If , we haveWe consider two subcases:
- (a)
- If , then we haveand therefore
- (b)
- If , then we have
In any case, we obtain that the probability is nondecreasing in $\alpha\mathbf{x}'$ for any $\alpha$ and for all $\mathbf{x}$, whence the result follows. □
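Proposition 3 can be illustrated by direct enumeration—a sketch with helper names ours—for two independent fair dice and $\alpha = (1, 1)$, where the conditional probability of Definition 1 is indeed monotone:

```python
from itertools import product

# Direct enumeration (helper names are ours): two independent fair dice,
# direction alpha = (1, 1).
outcomes = list(product(range(1, 7), repeat=2))

def cond(x, y, xp, yp):
    """P[X > x, Y > y | X > x', Y > y'] for two independent dice."""
    given = [(a, b) for a, b in outcomes if a > xp and b > yp]
    hit = [(a, b) for a, b in given if a > x and b > y]
    return len(hit) / len(given)

# Fix the target corner (x, y) = (3, 3) and tighten the conditioning corner:
vals = [cond(3, 3, t, t) for t in range(0, 4)]
monotone = all(u <= v + 1e-12 for u, v in zip(vals, vals[1:]))
```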
We now turn to subsets of I($\alpha$) random variables.
Proposition 4.
Every subset of I(α) random variables is I($\alpha'$), where $\alpha'$ is the vector obtained by excluding from α the components associated with the random variables not included in the subset.
Proof.
Assume that is I(), and let be a subvector of . Let . For any , and by considering for every , we have
Thus, given , , such that , and taking for every , we obtain
Since $\mathbf{X}$ is I($\alpha$), we conclude that the subvector is I($\alpha'$), completing the proof. □
We now show that applying strictly increasing functions to the components of an I($\alpha$) random vector preserves the I($\alpha$) property.
Proposition 5.
If the random vector $\mathbf{X} = (X_1, \ldots, X_n)$ is I(α), and $g_1, g_2, \ldots, g_n$ are n real-valued and strictly increasing functions, then the random vector $(g_1(X_1), g_2(X_2), \ldots, g_n(X_n))$ is I(α).
Proof.
Let $\mathbf{x}, \mathbf{x}' \in \mathbb{R}^n$, and write $\mathbf{g}(\mathbf{x}) = (g_1(x_1), \ldots, g_n(x_n))$. Since the functions $g_i$ are strictly increasing, we have $\{\alpha_i g_i(X_i) > \alpha_i g_i(x_i)\} = \{\alpha_i X_i > \alpha_i x_i\}$ for every $i \in \{1, \ldots, n\}$; hence, since $\mathbf{X}$ is I($\alpha$),
$$P[\alpha\mathbf{g}(\mathbf{X}) > \alpha\mathbf{g}(\mathbf{x}) \mid \alpha\mathbf{g}(\mathbf{X}) > \alpha\mathbf{g}(\mathbf{x}')] = P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}']$$
is nondecreasing in $\alpha\mathbf{x}'$; i.e., $(g_1(X_1), g_2(X_2), \ldots, g_n(X_n))$ is I($\alpha$), which completes the proof. □
For the next result, we need some additional notation. Given $\alpha = (\alpha_1, \ldots, \alpha_n)$ and $\beta = (\beta_1, \ldots, \beta_m)$, $\alpha\beta$ will denote their concatenation, that is, $\alpha\beta = (\alpha_1, \ldots, \alpha_n, \beta_1, \ldots, \beta_m)$. Similar notation will be used in the case of random vectors.
Proposition 6.
If $\mathbf{X}$ is I(α), $\mathbf{Y}$ is I(β), and $\mathbf{X}$ and $\mathbf{Y}$ are independent, then $\mathbf{X}\mathbf{Y}$ is I($\alpha\beta$).
Proof.
Let $\mathbf{x}, \mathbf{x}' \in \mathbb{R}^n$ and $\mathbf{y}, \mathbf{y}' \in \mathbb{R}^m$. Since $\mathbf{X}$ is I($\alpha$), $\mathbf{Y}$ is I($\beta$), and $\mathbf{X}$ and $\mathbf{Y}$ are independent, we have
$$P[\alpha\beta\,\mathbf{X}\mathbf{Y} > \alpha\beta\,\mathbf{x}\mathbf{y} \mid \alpha\beta\,\mathbf{X}\mathbf{Y} > \alpha\beta\,\mathbf{x}'\mathbf{y}'] = P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}']\; P[\beta\mathbf{Y} > \beta\mathbf{y} \mid \beta\mathbf{Y} > \beta\mathbf{y}'],$$
which is nondecreasing in $\alpha\mathbf{x}'$ and in $\beta\mathbf{y}'$, whence $\mathbf{X}\mathbf{Y}$ is I($\alpha\beta$). □
The following result pertains to a closure property of the I($\alpha$) family of multivariate distributions and, similarly, of the D($\alpha$) family.
Proposition 7.
The family of I(α) distribution functions is closed under weak convergence.
Proof.
Let $\{\mathbf{X}_m\}_{m \in \mathbb{N}}$ be a sequence of $p$-dimensional random vectors such that $\mathbf{X}_m$ is I($\alpha$) for all $m$, and such that $\{\mathbf{X}_m\}$ converges weakly to $\mathbf{X}$. If $\mathbf{x}, \mathbf{x}' \in \mathbb{R}^p$ are such that the events involved are continuity sets of the limit, then we have
$$P[\alpha\mathbf{X} > \alpha\mathbf{x} \mid \alpha\mathbf{X} > \alpha\mathbf{x}'] = \lim_{m \to \infty} P[\alpha\mathbf{X}_m > \alpha\mathbf{x} \mid \alpha\mathbf{X}_m > \alpha\mathbf{x}'];$$
therefore, $\mathbf{X}$ is I($\alpha$), whence the result follows. □
3.4. Examples
In this section, we present examples that illustrate the I($\alpha$) concept of dependence. Through the following three examples—involving both continuous and discrete cases—we aim to elucidate the behavior and implications of this type of dependence in various contexts. These examples also serve to illustrate the impact on statistical analysis, decision-making processes, and other pertinent areas of study.
Example 1.
Let $\mathbf{X}$ be a random vector with multivariate normal distribution $N_n(\mu, \Sigma)$, where $\mu \in \mathbb{R}^n$ is the mean vector and Σ is the covariance matrix. Let $\alpha$ be such that $\alpha_i = 1$ for all $i \in \{1, \ldots, n\}$—a similar study can be conducted by considering $\alpha_i = -1$ for all $i$. The probability density function of $\mathbf{X}$ is given by
$$f(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2} (\mathbf{x} - \mu)^{\top} \Sigma^{-1} (\mathbf{x} - \mu)\right).$$
Denote by $\sigma^{ij}$ the $(i, j)$ entry of the inverse $\Sigma^{-1}$. Then, for every pair $(i, j)$ with $i \neq j$, fixing the remaining components, the probability density function can be expressed as a function of $(x_i, x_j)$ whose logarithm has mixed second partial derivative equal to $-\sigma^{ij}$. Since this mixed derivative determines the sign of the expression in (9), we have that (9) is non-negative if, and only if, $\sigma^{ij} \leq 0$. Then, the random vector $\mathbf{X}$ is MTP2 if, and only if, $\sigma^{ij} \leq 0$ for any choice of $i \neq j$—see Theorem 3 of [17]. From Proposition 2, we conclude that $\mathbf{X}$ is I(α) for $\alpha = (1, \ldots, 1)$ and $\alpha = (-1, \ldots, -1)$.
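The criterion above—MTP2 holding exactly when the off-diagonal entries of $\Sigma^{-1}$ are non-positive—can be checked directly in the bivariate case, where the inverse is available in closed form (a sketch; the function name is ours):

```python
# Sketch (function name is ours): in the bivariate case, invert the 2x2
# covariance matrix in closed form and test the sign criterion.
def normal_is_mtp2(sigma):
    """True iff the off-diagonal entry of sigma^(-1) is non-positive,
    i.e., iff the correlation is non-negative."""
    (a, b), (c, d) = sigma
    det = a * d - b * c          # positive for a valid covariance matrix
    return -b / det <= 0         # off-diagonal entry of the inverse
```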
Example 2.
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be a random vector with Dirichlet distribution with positive parameters $a_1, \ldots, a_{n+1}$, and let $\alpha$ be such that $\alpha_i = 1$ for all $i \in \{1, \ldots, n\}$. The probability density function is given by
$$f(x_1, \ldots, x_n) = \frac{\Gamma(a_1 + \cdots + a_{n+1})}{\Gamma(a_1) \cdots \Gamma(a_{n+1})} \left(\prod_{i=1}^{n} x_i^{a_i - 1}\right) \left(1 - \sum_{i=1}^{n} x_i\right)^{a_{n+1} - 1},$$
with $x_i > 0$ for all $i \in \{1, \ldots, n\}$ and $\sum_{i=1}^{n} x_i < 1$. Given any selection of $i, j \in \{1, \ldots, n\}$ with $i \neq j$, and any real numbers $x_i > y_i$ and $x_j > y_j$, we have
Since
then (10) is non-positive; therefore, we have that $\mathbf{X}$ is MRR2($\mathbf{1}$); that is, $\mathbf{X}$ is a multivariate reverse rule of order two—the negative analog of (4), obtained by reversing the inequality sign in (Ref. [17], Theorem 3)—according to the direction $\mathbf{1} = (1, 1, \ldots, 1)$. Thus, by applying the corresponding negative dependence concept, in a manner similar to that provided in Proposition 2 for the corresponding positive dependence concept, we conclude that $\mathbf{X}$ is D($\mathbf{1}$).
Example 3.
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be a random vector with a multinomial distribution with parameters $N$ (number of trials) and $p_1, \ldots, p_n$ (event probabilities) such that $p_i > 0$ for all $i \in \{1, \ldots, n\}$ and $\sum_{i=1}^{n} p_i = 1$. The joint probability mass function is given by
$$P[X_1 = x_1, \ldots, X_n = x_n] = \frac{N!}{x_1! \cdots x_n!}\, p_1^{x_1} \cdots p_n^{x_n},$$
where $x_i \in \{0, 1, \ldots, N\}$ for all $i$ and $\sum_{i=1}^{n} x_i = N$. The multinomial distribution function is the conditional distribution function of independent Poisson random variables given their sum. As a consequence of (Ref. [5], Theorem 4.3) and (Ref. [17], Theorem 3), we have that $\mathbf{X}$ is MRR2($\mathbf{1}$). Thus, we conclude that $\mathbf{X}$ is D($\mathbf{1}$).
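The MRR2($\mathbf{1}$) property of the multinomial can be verified by brute force in the trinomial case, checking the reversed inequality of (4) over all pairs of integer points (a sketch with names ours; the pmf is extended by zero outside the support):

```python
from math import factorial
from itertools import product

def trinomial_pmf(x1, x2, n, p1, p2):
    """Pmf of the first two counts of a multinomial(n; p1, p2, 1-p1-p2),
    extended by 0 outside the support."""
    x3 = n - x1 - x2
    if x1 < 0 or x2 < 0 or x3 < 0:
        return 0.0
    coef = factorial(n) // (factorial(x1) * factorial(x2) * factorial(x3))
    return coef * p1**x1 * p2**x2 * (1 - p1 - p2)**x3

def mrr2_holds(n, p1, p2):
    """Check the reverse-rule inequality f(x ∨ y) f(x ∧ y) <= f(x) f(y)."""
    pts = list(product(range(n + 1), repeat=2))
    for a, b in product(pts, repeat=2):
        hi = trinomial_pmf(max(a[0], b[0]), max(a[1], b[1]), n, p1, p2)
        lo = trinomial_pmf(min(a[0], b[0]), min(a[1], b[1]), n, p1, p2)
        if hi * lo > trinomial_pmf(*a, n, p1, p2) * trinomial_pmf(*b, n, p1, p2) + 1e-12:
            return False
    return True
```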
Remark 1.
We want to note that, by a reasoning similar to that given in Example 3, any random vector with a multivariate hypergeometric distribution—the conditional distribution function of independent binomial random variables given their sum—is also D($\mathbf{1}$).
Now we provide an illustrative example demonstrating the application of Proposition 7 regarding weak convergence.
Example 4.
Let $\mathbf{X} = (X_1, \ldots, X_n)$ be a random vector with joint distribution function
$$H_\theta(x_1, \ldots, x_n) = \exp\left(-\left(\sum_{i=1}^{n} e^{-\theta x_i}\right)^{1/\theta}\right)$$
for all $(x_1, \ldots, x_n) \in \mathbb{R}^n$, and $\theta \geq 1$. This parametric family of distribution functions is a multivariate generalization of the Type B bivariate extreme-value distributions (see [16,18]). By applying (Ref. [19], Theorem 2.11)—which involves log-convex functions [20]—we have that the random vector X is I($\mathbf{1}$). Consider the sequence of distribution functions $\{H_\theta\}$. When θ goes to ∞, we get $H_\infty(x_1, \ldots, x_n) = \min\{\Lambda(x_1), \ldots, \Lambda(x_n)\}$, where $\Lambda(x_i) = \exp(-e^{-x_i})$, with $i \in \{1, \ldots, n\}$, are the one-dimensional marginals of $H_\theta$; therefore, as a consequence of Proposition 7, we obtain that $H_\infty$ is I($\mathbf{1}$) as well.
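Assuming the closed form $H_\theta(x, y) = \exp(-(e^{-\theta x} + e^{-\theta y})^{1/\theta})$ for the bivariate member of this family—an assumption on our part, for illustration only—the convergence to the minimum of the Gumbel marginals can be observed numerically:

```python
import math

# Assumed closed form (ours, for illustration): bivariate Type B
# extreme-value df with Gumbel margins.
def H_theta(x, y, theta):
    return math.exp(-((math.exp(-theta * x) + math.exp(-theta * y)) ** (1.0 / theta)))

def H_limit(x, y):
    """Comonotone df with Gumbel margins: the minimum of the marginals."""
    gumbel = lambda t: math.exp(-math.exp(-t))
    return min(gumbel(x), gumbel(y))

x, y = 0.3, 1.1
gap = abs(H_theta(x, y, 200.0) - H_limit(x, y))  # already tiny for theta = 200
```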
4. Conclusions
In this paper, we have undertaken a significant endeavor by introducing a novel concept of monotonicity, characterized by its directionality, for a set of random variables. This extension of existing multivariate dependence concepts represents a substantial contribution to the field, offering a more nuanced understanding of dependence structures. Moreover, we have not only defined this directional monotonicity concept but also delved into its implications by establishing relationships with other well-known multivariate dependence concepts. These comparative analyses shed light on the interconnectedness and compatibility between different analytical approaches, enriching our understanding of multivariate dependence. The exploration of I($\alpha$) stochastic orders—closely resembling those studied in [21]—is ongoing and represents a fertile ground for future research.
Author Contributions
Conceptualization, J.J.Q.-M.; methodology, J.J.Q.-M. and M.Ú.-F.; validation, J.J.Q.-M. and M.Ú.-F.; investigation, J.J.Q.-M. and M.Ú.-F.; writing—original draft preparation, M.Ú.-F.; writing—review and editing, M.Ú.-F.; visualization, J.J.Q.-M. and M.Ú.-F.; supervision, J.J.Q.-M. and M.Ú.-F. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Ministry of Science and Innovation (Spain) grant number PID2021-122657OB-I00.
Data Availability Statement
Data are contained within the article.
Acknowledgments
The authors thank the comments provided by three anonymous reviewers.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Jogdeo, K. Concepts of dependence. In Encyclopedia of Statistical Sciences; Kotz, S., Johnson, N.L., Eds.; Wiley: New York, NY, USA, 1982; Volume 1, pp. 324–334.
- Colangelo, A.; Scarsini, M.; Shaked, M. Some notions of multivariate positive dependence. Insur. Math. Econ. 2005, 37, 13–26.
- Barlow, R.E.; Proschan, F. Statistical Theory of Reliability and Life Testing: Probability Models; To Begin With: Silver Spring, MD, USA, 1981.
- Block, H.W.; Ting, M.-L. Some concepts of multivariate dependence. Comm. Statist. A Theory Methods 1981, 10, 749–762.
- Block, H.W.; Savits, T.H.; Shaked, M. Some concepts of negative dependence. Ann. Probab. 1982, 10, 765–772.
- Colangelo, A.; Müller, A.; Scarsini, M. Positive dependence and weak convergence. J. Appl. Prob. 2006, 43, 48–59.
- Joe, H. Multivariate Models and Dependence Concepts; Chapman & Hall: London, UK, 1997.
- Quesada-Molina, J.J.; Úbeda-Flores, M. Directional dependence of random vectors. Inf. Sci. 2012, 215, 67–74.
- Kimeldorf, G.; Sampson, A.R. A framework for positive dependence. Ann. Inst. Statist. Math. 1989, 41, 31–45.
- Lehmann, E.L. Some concepts of dependence. Ann. Math. Statist. 1966, 37, 1137–1153.
- Shaked, M. A general theory of some positive dependence notions. J. Multivariate Anal. 1982, 12, 199–218.
- Karlin, S.; Rinott, Y. Classes of orderings of measures and related correlation inequalities. I. Multivariate totally positive distributions. J. Multivariate Anal. 1980, 10, 467–498.
- Harris, R. A multivariate definition for increasing hazard rate distribution functions. Ann. Math. Statist. 1970, 41, 713–717.
- Popović, B.V.; Ristić, M.M.; Genç, A.İ. Dependence properties of multivariate distributions with proportional hazard rate marginals. Appl. Math. Model. 2020, 77, 182–198.
- Quesada-Molina, J.J.; Úbeda-Flores, M. Monotonic in sequence random variables according to a direction. University of Almería: Almería, Spain, 2024; to be submitted.
- Nelsen, R.B. An Introduction to Copulas, 2nd ed.; Springer: New York, NY, USA, 2006.
- de Amo, E.; Quesada-Molina, J.J.; Úbeda-Flores, M. Total positivity and dependence of order statistics. AIMS Math. 2023, 8, 30717–30730.
- Johnson, N.L.; Kotz, S. Distributions in Statistics: Continuous Multivariate Distributions; John Wiley & Sons: New York, NY, USA, 1972.
- Müller, A.; Scarsini, M. Archimedean copulae and positive dependence. J. Multivar. Anal. 2005, 93, 434–445.
- Kingman, J.F.C. A convexity property of positive matrices. Quart. J. Math. Oxford 1961, 12, 283–284.
- de Amo, E.; Rodríguez-Griñolo, M.R.; Úbeda-Flores, M. Directional dependence orders of random vectors. Mathematics 2024, 12, 419.