Earthquake Catastrophe Bond Pricing Using Extreme Value Theory: A Mini-Review Approach

Abstract: Earthquake catastrophe bond pricing models (ECBPMs) employ extreme value theory (EVT) to predict severe losses, although studies on EVT's use in ECBPMs are still rare. Therefore, this study aimed to use a mini-review approach (MRA) to examine the use of EVT and to identify the gaps and weaknesses in the methods or models developed. The MRA stages comprise planning, search and selection, analysis, and interpretation of the results. The selection process yielded five articles on the application of EVT in ECBPMs. The analysis found the following: First, the generalized extreme value (GEV) distribution can eliminate extreme data within a period. Second, a trigger model using two parameters is better than one using a single parameter, but the studies did not discuss the joint distribution of the two parameters. Third, the autoregressive integrated moving average (ARIMA) model allows negative values. Fourth, Cox–Ingersoll–Ross (CIR) coupon modeling is less effective in depicting reality, because it assumes constant volatility and cannot describe jumps due to monetary policy. Based on these limitations, it is hoped that future studies can develop an ECBPM that reduces the moral hazard.

The countries that have sponsored earthquake bonds and collaborated with the World Bank in their issuance are Mexico, Peru, Chile, Colombia, and the Philippines. Table 1 presents the data related to these issuances, which show that the World Bank's earthquake catastrophe bonds range from USD 160 million to USD 500 million. On 8 September 2017, Mexico was rocked by an earthquake of magnitude 8 Mw, causing investors to lose the entire cash value and coupons. Similarly, the 2018 earthquake in Peru, with a magnitude of 8 Mw, caused investors to lose 30% of the cash value. These real cases show that earthquake bonds aid the sponsoring countries in procuring disaster funds. However, they pose a moral hazard to investors [13][14][15], because when the disaster losses borne by the sponsor approach the specified trigger, part or all of the investors' principal can be eliminated. When this happens, investors will be reluctant to purchase the bonds. Therefore, an accurate and transparent earthquake catastrophe bond pricing model is required for the bonds to succeed in the market [16].
In earthquake bond pricing, the catastrophe risk needs to be modeled first. Extreme events related to historical observations can be described using EVT. In particular, the theory of extreme values in catastrophe bonds (CAT bonds) is used to study the distribution of extreme values related to the trigger parameters used in the model (e.g., loss, mortality, flood height, earthquake magnitude, etc.).
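As a concrete illustration of how EVT enters the modeling, the block-maxima approach behind the GEV distribution can be sketched in a few lines (a minimal sketch with synthetic data; the `gev_cdf` helper, the parameter values, and the simulated "magnitudes" are illustrative assumptions, not taken from the reviewed studies):

```python
import math
import random

def gev_cdf(x, mu, sigma, xi):
    """CDF of the generalized extreme value (GEV) distribution."""
    if abs(xi) < 1e-12:  # Gumbel limit as xi -> 0
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:  # outside the support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

def block_maxima(series, block_size):
    """Annual (block) maxima: the data to which a GEV distribution is fitted."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series) - block_size + 1, block_size)]

random.seed(1)
# hypothetical daily "magnitudes"; 365-day blocks stand in for years
daily = [4.0 + random.expovariate(1.0) for _ in range(10 * 365)]
maxima = block_maxima(daily, 365)
print(len(maxima))                              # one maximum per year
print(gev_cdf(7.0, mu=6.0, sigma=0.5, xi=0.1))  # P(annual max <= 7) under assumed parameters
```

In practice the GEV parameters would be estimated from the block maxima (e.g., by MLE, as in several of the reviewed studies) rather than assumed.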
The systematic reviews of CAT bonds that have been carried out previously are briefly described in this paragraph. Pizzutilo and Venezia [17] analyzed the effectiveness of catastrophe bonds for the transportation and infrastructure industries; the results showed that catastrophe bonds are beneficial for managing potential losses related to disaster events. Calvet et al. [18] reviewed statistical and machine learning methods for designing trigger mechanisms; the results indicated that statistical methods and machine learning provide great gains in accuracy and efficiency by decreasing the classification time. Wu and Zhou [19] reviewed the catastrophe bond instrument and the modeling approaches used; the study found that the three most popular models for disaster losses are the compound Poisson model, jump diffusion, and double-exponential jump diffusion. Sukono et al. [20] also reviewed the use of the compound Poisson process in 30 selected articles: 12 of the articles used constant interest rates, while 18 did not; of the latter, 8 articles used CIR, 5 used Vasicek, 2 used a robust approach, 1 used both CIR and Vasicek, 1 used the Hull–White model, and 1 used the ARIMA method.
The published literature review articles have not discussed the use of EVT. Therefore, this study aimed to use a mini-review approach (MRA) to examine the use of EVT in earthquake bonds and to determine the literature gaps and limitations of the methods or models developed in the ECBPM; this is the study's novelty. The MRA stages in this study consisted of planning, searching, analyzing, and interpreting the results. The study questions were prepared in the planning stage. The searching stage involved choosing the digital libraries, namely Scopus, ScienceDirect, and ProQuest; this stage also involved defining, executing, and refining the keywords, as well as selecting the initial articles. The selection process used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) based on inclusion and exclusion criteria. The analysis stage examined the mathematical models, compared their achievements, and discussed the literature gap by analyzing each model's limitations based on the assumptions of its methods. This stage also provided alternative models to address the limitations of previous models through theoretical studies. Co-word bibliometric analysis was performed using the VOSviewer application. The results were expected to provide knowledge about the application of EVT, to suggest alternative models for use in developing ECBPMs, and to motivate future studies on the development of ECBPMs.

Introductory Literature on the Study Topic
Hurricane Andrew (1992) sent 11 insurance firms into bankruptcy because they lacked sufficient reserves to meet claims. As a result, catastrophe bonds were developed in the mid-1990s [21]. The Chicago Board of Trade launched the first contract in 1992, introducing put-and-call options triggered by catastrophes. These options were based on the Catastrophe Loss Index compiled by Property Claim Services (PCS), the industry's statistics agency [22]. Over time, CAT bonds have come to be used by reinsurance companies and by countries with a high potential for disaster losses to obtain alternative funding for disaster management.
Earthquake catastrophe bond pricing models (ECBPMs) require a detailed study of extreme losses, and the disaster severity is analyzed using EVT [23,24]. The extreme value theory of catastrophe bonds was used by Riza et al. [25] to model the distribution of actual losses and fatalities. Using multiple triggers, the study designed a price model for determining disaster bonds with and without coupons. ARIMA was used to model interest and coupon rates, while Burr, GEV, Weibull, GPD, and logistical models were employed to model the distribution of losses and deaths. Furthermore, maximum likelihood estimation (MLE) was used to estimate the parameters, while the nonhomogeneous Poisson process (NHPP) was used to determine the aggregate loss and death. The Nuel recursive method was used to calculate the CDF value of the NHPP. The simulation used data on storm catastrophe losses and fatalities in the United States from 2012 to 2021, the number of storms that occurred in the United States from 1986 to 2021, and USD LIBOR data from 1986 to 2021. The results of the analysis showed that the intensity of the disaster and factors such as the threshold value of losses and deaths affect the price of catastrophe bonds. More intense disasters and longer bond periods reduce the bond price. Additionally, a greater threshold value of the two triggers increases the bond price.
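The aggregate-loss step described above can be sketched as a compound Poisson simulation (a simplified homogeneous stand-in for the NHPP used in [25]; the intensity, the lognormal severity, and the function names are assumptions made for illustration):

```python
import math
import random

def simulate_aggregate_loss(lam, t, draw_loss, rng):
    """Compound Poisson aggregate loss L_t: the sum of N_t i.i.d. severities,
    where N_t ~ Poisson(lam * t).  A nonhomogeneous process, as in [25],
    would replace lam * t with an integrated intensity function."""
    # inverse-transform sampling of the Poisson count N_t
    n, p = 0, math.exp(-lam * t)
    cum, u = p, rng.random()
    while u > cum:
        n += 1
        p *= lam * t / n
        cum += p
    return sum(draw_loss(rng) for _ in range(n))

rng = random.Random(42)
losses = [simulate_aggregate_loss(2.0, 1.0, lambda r: r.lognormvariate(0.0, 1.0), rng)
          for _ in range(5000)]
mean_loss = sum(losses) / len(losses)
# analytically, E[L_t] = lam * t * E[X] = 2 * exp(0.5) ~ 3.30
print(round(mean_loss, 2))
```

The Monte Carlo mean should sit close to the analytical value, which is how such simulators are usually sanity-checked before being plugged into a pricing formula.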
Marvi et al. [26] modeled multi-period bond pricing for multi-hazard and multiregional disasters. The study used GEV type II or Frechet methods to model the distribution of water levels and employed a binary function to model cash value. The Monte Carlo simulation method was used to generate water level data, and a hydraulic flow model was used to determine the flood depth and flood intensity in buildings in an area. The multi-hazard and multiregional models could reduce the residual risk and benefit the insurance companies.
Chao [27] developed a methodology to account for various risks when pricing multi-period catastrophe bonds. The Poisson process was used to model the number of disaster events up to time t, while CIR and GPD were used to model interest rates and the distribution of loss variables, respectively. Furthermore, MLE was used to estimate the parameters, while a copula was used to model the joint distribution of economic losses and deaths. The Kolmogorov–Smirnov test was used to assess the suitability of the chosen copula model, and sensitivity analysis was used to evaluate the impact of the interest rate and GPD parameters on the bonds. The findings showed that the interest rate and the average long-term interest rate have an inverse relationship with catastrophe bond prices, and that a strong correlation between economic losses and mortality raises the price of the disaster bonds.
Deng et al. [28] modeled multi-period drought bond prices. POT was used to determine the loss threshold, while GPD was employed to model the distribution of losses. Moreover, the study used a binary function to model the face value of the bonds and a Poisson process to determine the number of disasters up to time t. MLE, solved using the Lagrange method, was used to estimate the GPD distribution parameters. The results showed that a higher probability of a disaster-triggering event reduces catastrophe bond prices, and that high-risk investments provide a high-yield stimulus and attract many investors.
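The POT-plus-GPD step described above can be sketched as follows (a minimal sketch on synthetic data; a method-of-moments fit is used here in place of the MLE of [28], and the threshold and sample are illustrative assumptions):

```python
import random
import statistics

def peaks_over_threshold(data, u):
    """Excesses over threshold u: the POT sample to which a GPD is fitted."""
    return [x - u for x in data if x > u]

def gpd_fit_moments(excesses):
    """Method-of-moments GPD fit (a simple stand-in for MLE):
    xi = (1 - m^2/s^2) / 2,  sigma = m * (1 + m^2/s^2) / 2."""
    m = statistics.fmean(excesses)
    s2 = statistics.variance(excesses)
    ratio = m * m / s2
    return (1.0 - ratio) / 2.0, m * (1.0 + ratio) / 2.0

rng = random.Random(7)
# exponential "losses": their excesses over any threshold are again
# exponential, i.e., GPD with shape xi = 0 and scale sigma = 1
losses = [rng.expovariate(1.0) for _ in range(20000)]
exc = peaks_over_threshold(losses, u=2.0)
xi_hat, sigma_hat = gpd_fit_moments(exc)
print(round(xi_hat, 2), round(sigma_hat, 2))  # near 0 and 1
```

Using data whose tail behavior is known in closed form, as here, is a cheap way to validate a fitting routine before applying it to real loss data.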
Chao and Zhou [29] modeled multi-period catastrophe bond pricing. The study used the Poisson process to determine the number of disaster events up to time t and CIR to model the interest rate. POT was used to obtain extreme values of deaths and economic losses due to disasters. Moreover, a copula was used to model the dependency distribution of deaths and economic losses, while Monte Carlo simulations were utilized for the experiments. Sensitivity analysis determined the impact of disaster intensity, maturity time, and the magnitude of the tau-Kendall correlation on catastrophe bond prices. The results showed that a catastrophe bond pricing model triggered by many events has a lower risk than a model with a single triggering event. It also has a larger market potential than ordinary bonds, with low risk and high returns, reducing the moral hazard. Catastrophe bond prices have an inverse relationship with disaster intensity, maturity time, and the value of the tau-Kendall correlation, and the disaster intensity's effect on the maturity time is greater than its effect on the price.
The previously developed models were applied to storms [25], floods [26], and drought [28], and are generally not specific to drought, earthquakes, or floods [29]. The trigger types used were modeled loss [25,27,28], industry loss [29], and parametric [26]. The parametric trigger type has high transparency, high basis risk, a short settlement time, and low regulatory acceptance, while the opposite is true for the indemnity type [30]. This shows that the parametric type is better than other triggers for investors but poses a basis risk to the sponsor. In comparison, the GEV distribution can eliminate other extreme data in a period [27]. Investors also face financial risk, i.e., fluctuating interest rates, inflation, and coupons that affect the cash value of the bonds. The methods used to model these are ARIMA [25] and CIR [15,21]. However, ARIMA allows negative values, and CIR assumes constant volatility, which is unrealistic.
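The contrast between CIR and models that admit negative rates can be made concrete with a short simulation (a hedged sketch: Euler discretizations with illustrative parameters, not the calibrations used in the reviewed studies):

```python
import math
import random

def simulate_cir(r0, kappa, theta, eps, dt, n, rng):
    """Euler (full-truncation) discretization of CIR:
    dr = kappa*(theta - r)dt + eps*sqrt(r)dW.  The sqrt(r) diffusion keeps
    rates nonnegative, but the volatility parameter eps is still constant --
    the limitation noted above."""
    r, path = r0, [r0]
    for _ in range(n):
        r_pos = max(r, 0.0)
        r += kappa * (theta - r_pos) * dt + eps * math.sqrt(r_pos * dt) * rng.gauss(0.0, 1.0)
        r = max(r, 0.0)
        path.append(r)
    return path

def simulate_vasicek(r0, kappa, theta, eps, dt, n, rng):
    """Vasicek: dr = kappa*(theta - r)dt + eps*dW -- the rate can go negative."""
    r, path = r0, [r0]
    for _ in range(n):
        r += kappa * (theta - r) * dt + eps * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(r)
    return path

rng = random.Random(3)
cir = simulate_cir(0.03, 0.5, 0.04, 0.1, 1 / 252, 252 * 5, rng)
vas = simulate_vasicek(0.03, 0.5, 0.04, 0.1, 1 / 252, 252 * 5, rng)
print(min(cir) >= 0.0, min(vas) < 0.0)
```

The square-root diffusion term keeps the CIR path at or above zero, while the Vasicek path, whose volatility does not shrink near zero, dips below it; both share the constant-volatility limitation discussed above.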

Materials
The first study materials were articles discussing mathematical modeling for determining the price of earthquake bonds using EVT, published in Scopus-indexed journals between 2000 and 2022 and obtained from Scopus, ScienceDirect, and ProQuest. The selected articles were mapped using bibliometric co-word analysis with the help of the VOSviewer application. Earthquake bonds were selected due to the increased potential for losses caused by earthquakes, and EVT describes such extreme losses due to disasters [31]. Scopus and ScienceDirect were chosen because they are reliable and popular database search engines [25,32], while ProQuest offers a strong collection of scientific journals with more complete full texts. The second study materials were the mathematical models employed in the modeling.

Methods
This research is a preliminary scientific exploration of the use of EVT in the valuation of earthquake bonds. The research method consisted of tracing primary literature studies using the MRA, reflecting the limited scope of the review. This approach was used to identify literature gaps and limitations of the methods or models developed, as well as to obtain a state-of-the-art basis for developing a model for determining the price of earthquake catastrophe bonds in the future. The aim of a thorough systematic study is to review all of the reliable data; to control the volume of material with limited resources, the extent of the search in an MRA may be subject to arbitrary constraints [33], so the articles covered in this study were sourced only from Scopus, ScienceDirect, and ProQuest. The MRA stages consisted of planning, searching, analyzing, and interpreting the results [32][33][34][35][36].
The planning stage of the MRA involved compiling questions to ensure that the review was focused [32]. This stage also determined the gaps in previous studies [37] on modeling to determine the price of earthquake catastrophe bonds.
The search process involved selecting Scopus, ScienceDirect, and ProQuest, as well as defining, executing, and refining the keywords and selecting the initial articles [32]. The selection process was carried out using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) based on inclusion and exclusion criteria; this method identifies which candidate articles are present in each database [38,39].
The data analysis comprised the following stages: (a) Reviewing the ECBPM mathematical model to ascertain the models and methods used in its development by answering the questions compiled in the planning stage; (b) Comparing the results of the developed models; (c) Determining the literature gap from the earthquake catastrophe bond pricing model using EVT. The aim was to identify a gap that would be helpful in developing future study models and methods. (d) Using the VOSviewer application to conduct bibliometric co-word analysis, which is defined as scientific mapping of the relationships between studies [40].
The interpretation stage presented the results of the analysis, providing direction for future studies.

Planning
This study aimed to use the MRA to examine the use of EVT and determine the literature gaps and limitations of the methods or models developed in the ECBPM. The study questions (QR) were as follows: QR1: What is the purpose of the study? QR2: What method or model is used in the study? QR3: What type of trigger is used? QR4: What types of perils are used in the simulation?
The motivation of QR1 was to identify the study's purpose in order to determine which model to develop. QR2 was used to identify the method or model used in order to determine the limitations by analyzing the underlying assumptions. QR3 identified the trigger type used in the model in order to determine the trigger mechanism's weakness. QR4 was used to determine the types of perils used in the simulation.

Searching Strategy
Articles published between 2000 and 2022 were identified using the search engines of the Scopus, ScienceDirect, and ProQuest databases. The keyword strings used in the search strategy, and the number of hits each returned (Table 2), were as follows: (1) ("catastrophe bond" OR "CAT bond" OR "pricing cat bond" OR "valuation CAT bond"), with 194 hits in Scopus, 372 in ScienceDirect, and 317 in ProQuest; (2) keyword group (1) AND ("earthquake" OR "tsunamis"), with 73, 71, and 154 hits, respectively; and (3) keyword group (2) AND ("extreme value theory"), with 19, 11, and 14 hits, respectively. Table 2 shows that when the full keyword string, including ("extreme value theory"), was used, data for 44 articles were obtained: 19 from the Scopus database, 11 from ScienceDirect, and 14 from ProQuest.
There were two steps in the selection process: one semiautomatic and one manual. First, duplicate articles across the Scopus, ScienceDirect, and ProQuest databases were identified and removed in the semiautomatic step; we used the JabRef application to check for duplicate documents. A manual selection process was then used to find suitable articles for inclusion in the study. The inclusion and exclusion criteria used in this study are presented in Table 3. The phases of PRISMA are identification, screening, eligibility, and inclusion. In the identification phase, the exclusion criterion is duplicate articles (E1). In the screening phase, the exclusion criteria concern the relevance of the title, abstract, and keywords. The eligibility phase is divided into two steps, which check the accessibility of the article (E2) and the relevance of the whole article to our research goals (I1, I2, I3, I4, I5). The flow of information through the different phases of PRISMA is presented in Figure 1: data were obtained from the three reliable search engines, and 36 articles remained to be screened after the removal of duplicates. The next step was to filter articles relevant to the research topic. Fourteen articles were not suitable based on the relevance of their titles, abstracts, and keywords. One article was inaccessible, and seventeen did not link the catastrophe bond model to earthquakes based on the relevance of their full text. Moreover, one article was not indexed by Scopus. Thus, three articles were retained for use as primary studies.
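The semiautomatic duplicate-removal step (done in the study with JabRef) can be sketched as a normalized-title deduplication (an illustrative sketch; the record fields, titles, and normalization rule are assumptions):

```python
import re

def normalize(title):
    """Lowercase and collapse non-alphanumerics so trivially
    different records of the same article match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def remove_duplicates(records):
    """Keep the first occurrence of each normalized title."""
    seen, unique = set(), []
    for rec in records:
        key = normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"title": "Pricing CAT Bonds", "db": "Scopus"},
    {"title": "pricing cat-bonds", "db": "ProQuest"},  # duplicate after normalization
    {"title": "EVT for earthquakes", "db": "ScienceDirect"},
]
print(len(remove_duplicates(records)))  # 2
```

Real reference managers also compare DOIs and author lists, so title normalization alone is only a first pass.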
A backward and forward process was carried out by collecting relevant citations from the three articles studied. This process was repeated and stopped when the article obtained was from the most recent year and there were no more cited articles. From this process, five articles were obtained that were relevant to the research topic [23,41–44]. Table 4 shows the complete review of 23 articles based on accessibility, full-text selection, and reference relevance.

Overviews of Previous Models
Tang and Yuan [44] developed a multi-period earthquake bond pricing model with coupons based on pricing measures and a distorted probability function. The study used the Poisson process to determine the number of disaster events up to time t, while GEV and GPD were used to model the maximum annual loss caused by the earthquake. Moreover, MLE, implemented in R, was used to estimate the parameters of the GEV and GPD distributions.
Vasicek was employed to model risk-free interest and coupons, while Wang's transformation was used to model the pricing of premium disaster bonds. The study also used the modeled loss trigger designed using a compound Poisson process based on frequency and severity. However, the probability measure was distorted based on severity alone.
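Wang's transformation mentioned above can be sketched as a one-line probability distortion (a minimal sketch; the inputs are illustrative, and the actual calibration of the market price of risk λ in [44] is not reproduced here):

```python
from statistics import NormalDist

_N = NormalDist()

def wang_transform(u, lam):
    """Wang's distortion g(u) = Phi(Phi^{-1}(u) + lam): applied to a
    survival probability, a positive lam loads the tail, producing a
    risk-adjusted probability for pricing."""
    if u <= 0.0 or u >= 1.0:
        return max(0.0, min(1.0, u))
    return _N.cdf(_N.inv_cdf(u) + lam)

# lam = 0 leaves probabilities unchanged; lam > 0 inflates small
# tail probabilities (here a 5% exceedance probability grows)
print(wang_transform(0.5, 0.0))
print(wang_transform(0.05, 0.5))
```

Because the distortion is monotone, it preserves the ordering of outcomes while shifting probability mass toward the adverse tail, which is exactly the role it plays in premium calculation.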
Shao et al. [43] developed a Cox–Pedersen catastrophe bond pricing model [55] and applied it to single- and multi-period earthquake disasters. The payment function used in the model was a linear piecewise function. The study modeled interest rates and inflation using ARIMA(1, 1, 1), while CIR was used to model the coupons. Furthermore, block maxima were used to determine the maximum annual earthquake magnitude: GEV was employed to model the distribution of the annual maximum earthquake magnitude, while a gamma distribution was used to model the depth of the earthquakes. The study used a parametric trigger involving the magnitude and depth of earthquakes in California.
Zimbidis et al. [23] developed a single-and multi-period earthquake bond pricing model for the Greek area. The study used a Log-normal distribution to model the year-end deposit interest rate and a CIR to model coupons based on annual EURIBOR. GEV was used to model the maximum annual earthquake magnitude, while MLE and the FORTRAN subroutine MLEGEV were used to estimate the parameters of the GEV distribution.
Hardle and Cabrera [41] analyzed multi-period CAT bonds with coupons for earthquakes in Mexico. The study used a homogeneous Poisson process (HPP) with exponentially distributed inter-arrival times to model the arrival process of earthquakes, a nonhomogeneous Poisson process (NHPP) to determine the aggregate loss, a Poisson process to determine the intensity of natural events, and continuous compounding to calculate interest and coupon rates. The coupon used was constant and equal to LIBOR in May 2006, and the study assumed fixed quarterly coupon payments. The data used were from earthquakes that occurred in Mexico. The authors analyzed the effects of magnitude, depth, and log-magnitude on the economic losses caused by the earthquake disasters in Mexico; the coefficient of determination was 0.15 for the 1985 data and 0.129 for the 1989 and 1999 data. The authors therefore used the mean excess function to find an accurate loss distribution. Because the losses had heavy tails, they used Log-normal, Pareto, Burr, exponential, gamma, and Weibull distributions to model the loss data. The MAD and MAVRD were used to determine the impact of the loss model on zero-coupon CAT bond prices. The result was a modeled-index CAT bond that reduces both the basis risk and the moral hazard borne.
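The mean excess function used above to diagnose heavy tails can be sketched empirically (the Pareto and exponential samples are illustrative assumptions; for a Pareto(α) tail with unit minimum, e(u) = u/(α − 1) grows linearly in u, while for an exponential tail it is flat):

```python
import random

def mean_excess(data, u):
    """Empirical mean excess e(u): the average of (x - u) over x > u.
    Roughly linear growth in u signals a heavy (Pareto-type) tail."""
    exc = [x - u for x in data if x > u]
    return sum(exc) / len(exc) if exc else float("nan")

rng = random.Random(11)
# Pareto(alpha=3) samples: heavy-tailed, e(u) = u / 2 grows with u
pareto = [rng.paretovariate(3.0) for _ in range(50000)]
# exponential(1) samples: memoryless, e(u) is flat at 1
expo = [rng.expovariate(1.0) for _ in range(50000)]
print(round(mean_excess(pareto, 2.0), 2), round(mean_excess(pareto, 4.0), 2))
print(round(mean_excess(expo, 2.0), 2), round(mean_excess(expo, 4.0), 2))
```

Plotting e(u) against a grid of thresholds u is the usual graphical version of this diagnostic, and it also guides the threshold choice for a GPD fit.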
Wei et al. [42] developed a pricing model for hybrid-trigger catastrophe bonds. The Poisson distribution was used to model disaster intensity, the Poisson process was used to determine the probability of a given number of earthquakes, and a binary function was used to model the cash flows to bondholders at a given time. The study used the POT model to extract extreme data on earthquake magnitude and losses, GPD to model the magnitude and loss distributions, an Archimedean copula to model the dependence and joint distribution of magnitude and loss, CIR to model interest rates and coupons, continuous compounding to determine the interest and coupon rates, and VaR to determine maximum losses, and it analyzed the effects of the trigger, interest rates, and coupon rates on CAT bond prices. Data on the magnitudes of and losses from earthquakes in China from 1990 to 2020 were used in the simulation. The numerical experiment showed that the maturity, trigger level, dependence of the trigger indicators, and financial risk (i.e., interest and coupon rates) affected the CAT bond price. The models discussed in the primary studies are represented in Table 5.
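The Archimedean-copula step for coupling magnitude and loss can be sketched with a Clayton copula, one common Archimedean family (an illustrative sketch; [42] does not necessarily use Clayton or this value of θ):

```python
import random

def clayton_sample(theta, n, rng):
    """Conditional-inversion sampler for the Clayton copula.
    Kendall's tau for Clayton is theta / (theta + 2)."""
    pairs = []
    for _ in range(n):
        u, w = rng.random(), rng.random()
        v = (u ** -theta * (w ** (-theta / (theta + 1.0)) - 1.0) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def kendall_tau(pairs):
    """O(n^2) Kendall's tau -- fine for a small illustration."""
    n, c = len(pairs), 0
    for i in range(n):
        for j in range(i + 1, n):
            c += 1 if (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1]) > 0 else -1
    return 2.0 * c / (n * (n - 1))

rng = random.Random(5)
pairs = clayton_sample(theta=2.0, n=400, rng=rng)
print(round(kendall_tau(pairs), 2))  # close to theta / (theta + 2) = 0.5
```

The uniform marginals (u, v) would then be mapped through the fitted GPD quantile functions to obtain dependent magnitude-loss pairs for the Monte Carlo pricing step.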

The earthquake bond pricing model developed by Tang and Yuan [44] measures the expected price uncertainty, the payment function's discounted value, information on trigger developments, and financial market performance over time t ∈ [0, T]. The trigger is assumed to be a nonnegative, non-decreasing, right-continuous stochastic process, defined as follows:

Y_t = ∑_{j=1}^{N_t} 1_Δ(X_j > y), t ∈ [0, T],

where N_t is the number of earthquakes up to time t, modeled by the Poisson process; X_j is the magnitude of the j-th earthquake; y is the threshold; and 1_Δ is an indicator of the occurrence of earthquakes, with a value of 1 when Δ occurs and 0 otherwise. X_j can be modeled using the GEV or GPD distribution. The cash payment depends on the trigger Y over the time interval [0, t] through a continuous, non-increasing function Π : [0, ∞) → [0, 1] with Π(0) = 1, meaning that at any time t ∈ [0, T] the cash value of the disaster bond is KΠ(Y_t); a trigger therefore erodes part of the cash value, where Y_t is the value of Y if an earthquake occurs. The cash value function obtained by bondholders, given in Equation (2) of [44], represents the coupon at times t = 1, …, T and the face value KΠ(Y_{t−1}) at maturity when no trigger occurs. When the trigger occurs between two coupon dates, bondholders receive the proportion of the coupon accrued before the trigger, calculated using the floor function ⌊·⌋, and the amount of cash value obtained by bondholders is Kϑ. D(t, s) = e^{−∫_t^s r_u du} is the discount factor associated with the interval [t, s]; i_t is a floating coupon rate per year; R is a fixed coupon rate per year; r_t is the risk-free interest rate at time t for t ≥ 0; E_t^{Q_1} is the expected yield obtained by the earthquake catastrophe bondholders; and E_t^{Q_2} is the expected coupon size and discount factor. The notations in the work of Shao et al. [43] are as follows: K is the principal value of the earthquake catastrophe bond, r is the real risk-free interest rate for one period, and π is the inflation rate for one period. Furthermore, R is the coupon rate for one period (using 12-month US LIBOR), and M is the maximum earthquake magnitude in an area.
When two earthquakes occur simultaneously, M = max(M^1, M^2), where M^1 and M^2 are the earthquake magnitudes in the two regions; D is the earthquake's depth; and v(D) is the issue price of the earthquake catastrophe bond. Moreover, d(R; M, D) is a payoff function; T is the time to maturity; r_k is the market yield at time k, using the 1-year treasury security rate and following an ARIMA(1, 1, 1); π_k is the 1-year inflation rate at time k, following an ARIMA(1, 0, 0); R_k is the coupon rate at time k, modeled using CIR based on 12-month LIBOR; M_k = max(M_k^1, M_k^2) is the maximum annual earthquake magnitude in year k, for k = 1, 2, …, T, where M_k^1 and M_k^2 have GEV distributions; D_k is the earthquake depth in year k; d(R_k; M_k, D_k) is the payoff function that bondholders receive at k = 1, 2, …, T; and its expectation is the expected payoff function. The notations in the work of Zimbidis et al. [23] are as follows: E(Q_1) is the expected present value of the cash value associated with the probability of an earthquake; r is the risk-free interest rate; e is the extra premium; C(M) is the payoff function; and K is the cash value. The notations K, r, and e in the multi-period model are the same as in the single-period model. Furthermore, T is the maturity time; i_n is the interest rate, assumed to follow a Log-normal distribution; R_t is the coupon rate modeled using CIR; R^j is the j-th path (realization) of the EURIBOR level {R_t, t ∈ [0, T]}; M_n is the maximum earthquake magnitude in Greece in the n-th year; M^j is the j-th path (realization) of (M_n, n = 1, 2, …) of the maximum earthquake magnitude per year; and f(R^l, M^l) is the cash value obtained by earthquake catastrophe bondholders at time l = 1, 2, …, T.
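The trigger process Y_t and the cash value KΠ(Y_t) described above can be sketched as follows (a minimal sketch; the magnitudes, threshold, and the binary choice of Π are illustrative assumptions):

```python
def trigger_count(magnitudes, y):
    """Path of Y_t: the running count of earthquakes whose magnitude exceeds
    the threshold y -- a nonnegative, non-decreasing process, as assumed above."""
    count, path = 0, []
    for m in magnitudes:
        count += 1 if m > y else 0
        path.append(count)
    return path

def cash_value(K, y_t, pi):
    """Cash value K * Pi(Y_t), with Pi non-increasing and Pi(0) = 1."""
    return K * pi(y_t)

# a binary Pi: full principal until the first triggering quake, nothing after
binary_pi = lambda y: 1.0 if y == 0 else 0.0

mags = [5.1, 6.8, 7.4, 5.9]           # hypothetical magnitudes
path = trigger_count(mags, y=7.0)     # third event exceeds the threshold
print(path)                           # [0, 0, 1, 1]
print([cash_value(100.0, y, binary_pi) for y in path])
```

Replacing `binary_pi` with a decreasing function of the trigger count would give the graduated principal erosion used by linear piecewise payoff designs.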
The assumptions in the zero-coupon bond [41], which pays the principal at maturity T, are as follows: the threshold time τ = inf{t : L_t ≥ D} is the moment when the aggregate loss L_t first exceeds the threshold level D; F_t is a filtration increasing with time t ∈ [0, T]; e^{−R(t,T)} = e^{−∫_t^T r(ξ) dξ} is the continuously compounded discount factor; V_t^1 is the zero-coupon bond price; and M_t = 1(L_t > D) indicates the threshold time, which is modeled as a doubly stochastic Poisson process. The assumptions in the coupon bond, which pays the principal P at maturity T and a coupon C_s with a fixed spread z until the threshold time τ, are the same as in the zero-coupon model with respect to M_T and e^{−R(t,T)}.
Wei et al. [42] developed a hybrid-triggered CAT bond with the following notation: N_t denotes the number of earthquake disasters that occurred in (t − 1, t); d_i, i = 1, 2, …, m, with m = ∑_{t=1}^T N_t, is the occurrence time of the i-th earthquake; the economic loss caused by the earthquake at time d is denoted by X_d, d = d_1, d_2, …, d_m; the magnitude of the earthquake that occurred at time d is denoted by Y_d, d = d_1, d_2, …, d_m; CF(t) is the cash value at time t; the principal's payoff structure is F·I(τ_F > t), and F·R·I(τ_C > t) is the structure of the coupon; τ_C = min(d = d_1, …, d_m | X_d > Tr_X or Y_d > Tr_Y) is the first time when the economic loss exceeds Tr_X or the magnitude exceeds Tr_Y; τ_F = min(d = d_1, …, d_m | X_d > Tr_X and Y_d > Tr_Y) is the stopping time, i.e., the first time when the economic loss exceeds Tr_X and the magnitude exceeds Tr_Y; v_C is the probability that the trigger condition for the coupon is satisfied in the interval [t − 1, t]; v_F is the probability that the trigger condition for the principal is satisfied in the interval [0, T]; p(0, t) = A(0, t)e^{−B(0,t)r} is the price of a non-defaultable zero-coupon bond with a face value of 1 maturing at time t; κ > 0 is the mean-reverting force, ε > 0 is the volatility parameter, and θ > 0 is the long-term mean of the interest rate in the CIR model; r is the initial interest rate; I(·) is an indicator function; F is the face value; and R is the coupon rate.
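The hybrid-trigger stopping times τ_C and τ_F can be sketched directly from their definitions (an illustrative sketch; the event times, thresholds, face value, and coupon rate are assumed values):

```python
def hybrid_payoffs(events, tr_x, tr_y, F, R, T):
    """Hybrid trigger in the spirit of [42]: the coupon stops at the first
    event where loss OR magnitude exceeds its threshold (tau_C); the
    principal is lost at the first event where BOTH exceed theirs (tau_F).
    `events` maps occurrence time d -> (loss X_d, magnitude Y_d)."""
    tau_c = min((d for d, (x, y) in events.items() if x > tr_x or y > tr_y),
                default=float("inf"))
    tau_f = min((d for d, (x, y) in events.items() if x > tr_x and y > tr_y),
                default=float("inf"))
    coupons = [F * R if t < tau_c else 0.0 for t in range(1, T + 1)]
    principal = F if tau_f > T else 0.0
    return coupons, principal

events = {1.4: (80.0, 6.2), 2.7: (120.0, 7.5)}  # hypothetical (loss, magnitude)
coupons, principal = hybrid_payoffs(events, tr_x=100.0, tr_y=7.0,
                                    F=1000.0, R=0.05, T=3)
print(coupons, principal)  # the event at t = 2.7 breaches both thresholds
```

Because the OR condition (τ_C) is necessarily hit no later than the AND condition (τ_F), the coupon stream is always at least as exposed as the principal, which is the asymmetry the hybrid design exploits.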

Comparison of Previous Models
The description in Section 4.3.1 implies that modeling the trigger distribution for earthquake catastrophe bond prices entails using GEV, GPD, Burr, Log-normal, or gamma distributions; the Poisson process to estimate the number of earthquake catastrophes up to time t; NHPP to determine the aggregate loss; the Archimedean copula (AC) to model the dependency of the joint distribution of losses and earthquake magnitudes; Log-normal, CIR, ARIMA, and Vasicek models for the financial risk; and binary and linear piecewise functions for the payoff. Table 6 summarizes the techniques and models for earthquake catastrophe bond pricing in the five chosen papers.

Table 6 shows that the trigger types used are parametric and hybrid triggers. Tang and Yuan [44] and Zimbidis et al. [23] used earthquake magnitude, Shao et al. [43] used earthquake depth and magnitude, Hardle and Cabrera [41] used aggregate loss, and Wei et al. [42] used the magnitude of the earthquake and the losses caused by the disaster.

Table 7 shows that Tang and Yuan [44] used GEV and GPD to model the distribution of earthquake magnitude. Shao et al. [43] used GEV to model earthquake magnitude and GPD to model earthquake depth. Zimbidis et al. [23] and Hardle and Cabrera [41] used GPD, Burr, and Log-normal distributions to model losses due to earthquake disasters, and they used NHPP to model the aggregate loss distribution. Wei et al. [42] used GPD to model the loss and earthquake magnitude distributions, and they used AC to model the dependency of the magnitude and loss distributions.

Table 8 shows that Tang and Yuan [44] used Vasicek to model the interest and coupon rates. Shao et al. [43] used ARIMA to model interest rates and inflation, and they used CIR to model the coupon rate. Zimbidis et al. [23] used a Log-normal model for the interest rate and CIR for the coupon rate. Hardle and Cabrera [41] used constant interest and coupon rates. Wei et al. [42] used CIR to model the interest and coupon rates.
Table 9 shows that [41,42,44] used binary functions to model payoff, while [23,43] used linear piecewise payoff functions.
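The difference between the two payoff styles summarized in Table 9 can be made concrete with a short sketch; the face value, trigger level, and attachment/exhaustion points below are hypothetical.

```python
def binary_payoff(face, loss, trigger):
    """All-or-nothing: full principal if the trigger level is not reached."""
    return face if loss < trigger else 0.0

def piecewise_linear_payoff(face, loss, attach, exhaust):
    """Principal erodes linearly between the attachment and exhaustion points,
    so the payout reflects the actual severity of the loss."""
    if loss <= attach:
        return face
    if loss >= exhaust:
        return 0.0
    return face * (exhaust - loss) / (exhaust - attach)

# A loss midway between attachment and exhaustion still returns half the
# principal, whereas the binary payoff has already wiped it out entirely.
print(binary_payoff(100.0, 150.0, 120.0))                   # 0.0
print(piecewise_linear_payoff(100.0, 150.0, 120.0, 180.0))  # 50.0
```

This graded response is why the review favors the linear piecewise form: it maps the level of losses onto the payout instead of a single cliff.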

Literature Gap
The limitations of the ECBPM in the five selected articles are described as follows: (a) Earthquake magnitude triggers [23,44] have not been used to describe the severity of earthquake disasters [57]. (b) The severity of an earthquake depends on its location, magnitude, and depth, which are not discussed in the five selected articles. (c) The earthquake magnitude distribution modeled using GEV can eliminate extreme data within a period [29]. (d) Modeling interest and coupon rates using the Vasicek model [44] allows negative values, and its assumption of constant interest rate volatility is unrealistic for bond prices [58]. (e) Modeling interest rates and inflation using ARIMA [43] allows negative values [59]. (f) The coupon rate modeled using CIR is better than that using the Vasicek model [44]; however, CIR assumes constant volatility and cannot capture jumps caused by monetary policy [58]. (g) The Poisson process is used in [41,42,44] to determine the number of earthquake events at time t, which is necessary for determining the probability of a trigger occurring at time t. (h) The payoff function is modeled using a binary function in [41,42,44], while [23,43] use a linear piecewise payoff function. Linear piecewise modeling is better than a binary function because it can describe the level of losses caused by earthquakes.



Bibliometric Analysis
About 23 articles were relevant to the keywords used, as described in Table 4. They were selected based on accessibility and the relevance of their full text and references. Figure 2 shows the results of visualizing the bibliometric network using the VOSviewer application.

The network contains several nodes of various sizes and distances. The size of a node indicates how frequently the corresponding term appears in the database: the more often a term occurs, the larger its node. The distance between nodes reveals the degree of connection between the terms. Figure 2 shows that the distance from earthquake and EVT to catastrophe bond pricing is quite large, and the earthquake and EVT nodes are modest in size. In addition, the distance from copula and analysis to earthquake and catastrophe bond is large. Figure 3 shows no relationship between the earthquake catastrophe bond pricing model and regional grouping. Moreover, there are no nodes representing fuzzy time series or ECIR for modeling interest rates and inflation. These findings open up opportunities for the development of a model to determine the price of earthquake bonds.


Discussion
The extreme value theory in earthquake catastrophe bond pricing models extreme data from earthquake magnitudes and losses due to earthquake disasters. High earthquake magnitudes or losses are obtained using the block maxima method (BMM), which involves selecting the extreme datum per month, quarter, semester, or year. The process could also use the peaks-over-threshold (POT) method by choosing extreme values greater than a threshold value. Extreme values obtained using BMM follow the GEV distribution [18], whereas extreme values obtained using POT follow the GPD distribution [29]. The parameters in GEV are location, scale, and shape, while in GPD they are only scale and shape. Parameter estimation uses the MLE method; because maximizing the likelihood function does not yield a closed-form solution for the parameters, numerical methods are required.
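The BMM and POT procedures described above can be sketched with scipy; the synthetic Gumbel-distributed series, the yearly block size, and the 99% quantile threshold are illustrative assumptions, not choices from the reviewed papers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "magnitude" observations: 30 years of 365 daily values.
data = rng.gumbel(loc=4.5, scale=0.6, size=(30, 365))

# Block maxima method (BMM): one maximum per yearly block, fitted with GEV.
block_maxima = data.max(axis=1)
shape_gev, loc_gev, scale_gev = stats.genextreme.fit(block_maxima)

# Peaks over threshold (POT): all exceedances above a high quantile; the
# excesses over the threshold are fitted with GPD (location fixed at zero).
threshold = np.quantile(data, 0.99)
excesses = data[data > threshold] - threshold
shape_gpd, _, scale_gpd = stats.genpareto.fit(excesses, floc=0.0)

print(f"GEV (BMM): shape={shape_gev:.3f}, loc={loc_gev:.3f}, scale={scale_gev:.3f}")
print(f"GPD (POT): shape={shape_gpd:.3f}, scale={scale_gpd:.3f}; "
      f"{excesses.size} exceedances vs {block_maxima.size} block maxima")
```

Note that POT retains every exceedance while BMM keeps only one value per block, which illustrates the review's point that BMM can discard extreme observations occurring within the same period.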
Zimbidis et al. [23] first investigated the concept of the ECBPM in 2007, and only five articles in Scopus-indexed journals have addressed pricing models for earthquake catastrophe bonds utilizing EVT, even though many studies have worked on constructing catastrophe bond models in general. The five articles in question only developed models for the payment function, financial market risk, and trigger distribution, each with advantages and disadvantages. The trigger type created by Shao et al. [43] uses several variables, namely earthquake depth and magnitude; however, it does not describe the joint probability distribution of the two parameters. Although inflation impacts catastrophe bond prices, only Shao et al. [43] discussed the inflation variable, modeling it using ARIMA; this method allows negative values [59]. The Vasicek model [44] is inferior to CIR for interest rate modeling. Nevertheless, the CIR model must be improved to address its constant volatility and its inability to capture jumps caused by monetary policy. GPD is superior to GEV for modeling earthquake magnitude.
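The contrast between Vasicek and CIR can be shown with a short Euler-scheme simulation: Vasicek's constant-volatility diffusion lets rates wander below zero, while CIR's square-root diffusion shrinks as the rate approaches zero. The parameter values below are illustrative only, not calibrated to any of the reviewed papers.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(r0, kappa, theta, sigma, dt, n_steps, n_paths, model):
    """Euler discretization of the Vasicek and CIR short-rate models."""
    r = np.full(n_paths, r0)
    paths = [r.copy()]
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        if model == "vasicek":
            diffusion = sigma * dW                           # constant volatility
        else:  # CIR: volatility scales with sqrt(r), damping near zero
            diffusion = sigma * np.sqrt(np.maximum(r, 0.0)) * dW
        r = r + kappa * (theta - r) * dt + diffusion
        paths.append(r.copy())
    return np.array(paths)

# Illustrative parameters: low starting rate, 5-year daily horizon.
kw = dict(r0=0.01, kappa=0.5, theta=0.03, sigma=0.04, dt=1 / 252,
          n_steps=252 * 5, n_paths=2000)
vasicek = simulate(**kw, model="vasicek")
cir = simulate(**kw, model="cir")

print(f"Vasicek paths that go negative: {(vasicek.min(axis=0) < 0).mean():.1%}")
print(f"CIR paths that go negative:     {(cir.min(axis=0) < 0).mean():.1%}")
```

Even so, both models share the constant volatility parameter sigma, which is the shortcoming the ECIR extension discussed later is meant to address.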
Tang and Yuan [44] performed a sensitivity analysis of the location and shape parameters. This was insufficient, because bond prices are also significantly influenced by interest and coupon rates, maturity, and inflation. Hardle and Carbera [41] analyzed the robustness of various loss models, finding that bond prices are more dispersed under different expected loss models with the same distribution assumption than under different distribution assumptions with the same loss model. Wei et al. [42] analyzed the effects of different maturity periods, dependencies, interest rates, and coupon rates on catastrophe bond prices. The results were as follows: the greater the maturity, the cheaper the disaster bond; the greater the dependency, the cheaper the disaster bond; interest rates are inversely related to bond prices; and the higher the coupon rate, the higher the disaster bond price. In addition, increasing the trigger level makes the probability of triggering events smaller. The analyses carried out in these three articles [41,42,44] were sound, so this kind of analysis should be repeated in future studies.
The triggers that could be used in ECBPMs are indemnity, modeled loss, industry loss, and parametric or hybrid triggers. The indemnity trigger has a weakness: its payment process begins with a long loss-verification process, so the payment settlement cannot be completed quickly following the disaster-triggering event [60]. The modeled loss and industry loss triggers carry no basis risk, meaning that the insurance company's loss problem can be overcome, but they pose a moral hazard to investors [61]. The parametric trigger type does not depend on the expected loss value of the insurance company and does not cause adverse selection problems; however, it poses a basis risk in that the payout of the disaster bonds may not fully match the sponsor's losses [62]. The parametric trigger type has high transparency and basis risk, a short settlement time, and low regulatory acceptance, while the opposite is true for the indemnity type [30]. Therefore, the parametric trigger type is used as the reference in the modeling. Figure 4 shows a comparison of the four triggers.
The modeled-index loss trigger developed by Hardle and Carbera [41] can reduce the basis risk; however, if a trigger occurs, the payment takes longer than with the parametric type. The hybrid trigger model (i.e., parametric and modeled loss) developed by Wei et al. [42] has a similar limitation if the modeled-loss triggering event takes longer to settle than the parametric trigger.
The best trigger was developed by Shao et al. [43]; however, their calculation does not model the dependency between the depth and magnitude of the earthquake.
Extreme-value-based earthquake catastrophe bonds have been well studied, but they still need development due to the following gaps in the previous literature: (a) An earthquake depth parameter is required [43], since earthquake magnitude triggers [23,44] alone do not represent the severity of earthquake disasters [57]. Using the approaches described in [25,27,29,42], this necessitates developing a model of the dependence between the earthquake magnitude and depth distributions using copulas.
(b) An analysis of the characteristics of earthquakes based on seismic parameters is required, since an earthquake's severity depends on its location, magnitude, and depth [57,63]. Such a method could group an area's characteristics based on the earthquake parameters. Calvet et al. [18] reviewed eight techniques for classifying the triggering events of earthquake catastrophes. Goda et al. [64] presented a bat-in-a-box grid based on the earthquake's location and magnitude to categorize a region's seismic conditions. Hofer et al. [65] proposed the design of earthquake bonds based on the spatial distribution of the portfolio, considering a combined asset portfolio of residences in Italy and simplifying the analysis by considering the PGA value for each particular municipality. Mistry et al. [66] proposed the design of an earthquake disaster bond based on an earthquake hazard model, considering the source area model and peak ground acceleration (PGA). The Earthquake Disaster Risk Index (EDRI) can be used to categorize regions because earthquake risk has economic, social, and environmental impacts [67]. Therefore, it is possible to develop regional groupings using earthquake parameters, the EDRI, and PGA. The purpose of grouping areas is to obtain a more accurate calculation of the probability of an earthquake event and thereby a calculation of disaster bond prices that can minimize the moral hazard [26,68]. (c) The earthquake magnitude distribution was modeled using GEV [23]. This method was used along with BMM to select extreme values; however, it can eliminate extreme values within a period [29]. GPD fixes this problem by selecting trigger data that exceed a threshold using the POT method [41,42,44]. (d) The main challenge in the POT model is selecting the optimal threshold to fit the model. The threshold value could be determined using a trigger value that causes economic losses or fatalities [29] due to earthquake disasters.
(e) Vasicek's [44] modeling of interest rates and coupons permits negative values [69,70], and its assumption of constant interest rate volatility is unrealistic for bond prices [58]. In comparison to Vasicek's model [59], the coupon rate modeled using CIR is superior. However, it has limitations, including constant volatility and no jumps caused by monetary policy [58]. The model could be built using an extended Cox–Ingersoll–Ross (ECIR) approach to address these shortcomings [71]. (f) Modeling interest rates and inflation using ARIMA [43] requires a data requirement test and allows negative values. Another method for predicting time-series data without a data requirement test [72], and one that guarantees positive values as long as the data are positive, is fuzzy time series; this method can be used for predicting interest rates and inflation. (g) The Poisson process is used in [41,42,44] to determine the number of earthquake events, which is necessary for determining the probability of a trigger occurring at time t. (h) The payoff function is modeled using a binary function in [41,42,44], while [23,43] use a linear piecewise function. Linear piecewise modeling is better than a binary function because it can describe the level of losses due to earthquakes. Therefore, a linear piecewise function is recommended for modeling the payoff function.
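The copula-based dependence modeling of earthquake magnitude and depth recommended above can be sketched as follows. The Clayton copula (an Archimedean family member with a convenient frailty-based sampler), the GPD magnitude marginal, and the exponential depth marginal are all illustrative choices, not taken from the reviewed papers.

```python
import numpy as np

rng = np.random.default_rng(42)

def clayton_sample(theta, n):
    """Sample n pairs from a bivariate Clayton (Archimedean) copula via the
    Marshall-Olkin frailty construction: U_i = (1 + E_i/V)^(-1/theta),
    with V ~ Gamma(1/theta, 1) and E_i ~ Exp(1)."""
    v = rng.gamma(1.0 / theta, 1.0, size=n)
    e = rng.exponential(1.0, size=(n, 2))
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)

# Hypothetical marginals for illustration: GPD-like magnitude exceedances
# above 5.0 (shape 0.2, scale 0.8) and exponentially distributed depths
# (mean 30 km); the copula supplies the dependence between them.
theta = 2.0                       # Kendall's tau = theta/(theta + 2) = 0.5
u = clayton_sample(theta, 50_000)
magnitude = 5.0 + 0.8 * ((1 - u[:, 0]) ** -0.2 - 1) / 0.2  # GPD quantile
depth = -30.0 * np.log(1 - u[:, 1])                        # Exp quantile

corr = np.corrcoef(magnitude, depth)[0, 1]
print(f"Simulated magnitude-depth correlation: {corr:.2f}")
```

Joint samples like these would let a two-parameter trigger be priced from the joint exceedance probability P(magnitude > Tr_Y, depth < Tr_D) rather than from the marginals alone, which is precisely the dependency the reviewed models leave out.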

Conclusions
This study presents an MRA of ECBPMs using EVT. We collected articles from Scopus, ScienceDirect, and ProQuest. After the article selection using PRISMA, five papers on ECBPM using EVT were selected. After that, the articles were analyzed by reviewing the ECBPMs, comparing the developed models, identifying research gaps, and conducting co-word bibliometric analysis.
The review identified five articles discussing earthquake catastrophe bond prices using EVT to model trigger distribution for earthquake magnitude, depth, and losses due to earthquake disasters. The limitations of the previous model could be improved using multiple parametric types and copulas for modeling the distribution of earthquake magnitude and depth dependencies. They could also be improved using fuzzy time-series modeling of inflation and interest rates, ECIR modeling of coupon rates, and regional grouping based on earthquake parameters and risk index. In addition, the earthquake characteristics of an area could be considered by decomposing it into subregions based on EDRI, PGA, and earthquake parameters. Furthermore, a sensitivity analysis could determine the effects of interest and coupon rates, inflation, and maturity on earthquake disaster bond prices.
The main limitation of this study is that only three digital sources were used. It will therefore be necessary to add other digital library sources in order to identify further gaps and complete this research. In the future, we will calibrate the earthquake bond pricing model according to the methods or models proposed in Section 5.
The results of this review are expected to provide knowledge about the application of EVT and alternative methods and models that could be helpful in developing ECBPMs. They could also provide opportunities and motivation for future studies on ECBPM development.