Search Results (9)

Search Parameters:
Keywords = worst value-at-risk allocation

17 pages, 653 KB  
Article
Cross-Impact Analysis with Crowdsourcing for Constructing Consistent Scenarios
by Robyn C. Thompson, Oludayo O. Olugbara and Alveen Singh
Algorithms 2026, 19(1), 41; https://doi.org/10.3390/a19010041 - 4 Jan 2026
Viewed by 138
Abstract
Cross-impact analysis is frequently used in scenario-analogous studies to identify critical factors influencing ecological change, strategic planning, technology foresight, resource allocation, risk mitigation, cost optimization, and decision support. Scenarios enable different organizations to comprehend prevailing situations, prepare for probable futures, and mitigate conceivable risks. Unfortunately, cross-impact analysis methods are often criticized for their difficulty in handling complex interactions, cognitive bias, time-intensiveness, heavy reliance on a limited pool of experts, and inconsistency in assigning judgment, which can affect the expected outcomes. This paper introduces a novel method for constructing consistent scenarios that addresses these criticisms and those associated with scenario methods. The method combines cross-impact analysis with crowdsourcing. The cross-impact analysis component is based on advanced impact analysis and cross-impact balance analysis to, respectively, provide a time-efficient reduction in complex interdependent factors and construct consistent scenarios from the set of reduced factors. The crowdsourcing element leverages the cumulative intelligence of a group of experts to help mitigate cognitive bias and provide a more transparent and inclusive analysis. The method was implemented and validated on a practical case of renewable energy adoption, a vital challenge for socioeconomic progress and climate change resilience. The method provides a sturdy foundation for writing scenario narratives; the results confirm its robustness for constructing consistent scenarios and suggest that the future of renewable energy adoption can be enhanced through careful consideration of best-case, base-case, and worst-case scenarios, which include varying states of perceived value, awareness, and perceived support. These findings contribute to a more nuanced understanding of how socio-cognitive and institutional factors interact to influence the pace and direction of sustainable energy transitions. Full article
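
For readers unfamiliar with cross-impact balance (CIB) analysis, a minimal sketch of its consistency check may help; the descriptor count, state counts, and impact values below are invented for illustration and are not taken from the paper:

```python
import itertools
import numpy as np

# Hypothetical setup: three descriptors (e.g., perceived value, awareness,
# perceived support), each with two states. cim[(i, si), (j, sj)] holds the
# judged impact of descriptor i in state si on state sj of descriptor j;
# the values here are randomly invented for illustration.
n_states = [2, 2, 2]
rng = np.random.default_rng(0)
cim = {((i, si), (j, sj)): int(rng.integers(-3, 4))
       for i in range(3) for si in range(n_states[i])
       for j in range(3) for sj in range(n_states[j]) if i != j}

def impact_balance(scenario, j, s):
    """Impact-balance score of state s of descriptor j, given the states
    the scenario assigns to all other descriptors."""
    return sum(cim[(i, scenario[i]), (j, s)]
               for i in range(len(n_states)) if i != j)

def is_consistent(scenario):
    """A scenario is CIB-consistent if every chosen state scores at least
    as high as any alternative state of the same descriptor."""
    return all(impact_balance(scenario, j, scenario[j])
               >= max(impact_balance(scenario, j, s)
                      for s in range(n_states[j]))
               for j in range(len(n_states)))

# Enumerate all 2*2*2 candidate scenarios and keep the consistent ones.
consistent = [sc for sc in itertools.product(*map(range, n_states))
              if is_consistent(sc)]
print(consistent)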

19 pages, 2119 KB  
Article
Integrating Shapley Value and Least Core Attribution for Robust Explainable AI in Rent Prediction
by Xinyu Wang and Tris Kee
Buildings 2025, 15(17), 3133; https://doi.org/10.3390/buildings15173133 - 1 Sep 2025
Viewed by 1307
Abstract
With the widespread application of artificial intelligence in real estate price prediction, model explainability has become a critical factor influencing its acceptability and trustworthiness. The Shapley value, as a classic cooperative game theory method, quantifies the average marginal contribution of each feature, ensuring global fairness in the explanation allocation. However, its focus on average fairness lacks robustness under data perturbations, model changes, and adversarial attacks. To address this limitation, this paper proposes a hybrid explainability framework that integrates the Shapley value and Least Core attribution. The framework leverages the Least Core theory by formulating a linear programming problem to minimize the maximum dissatisfaction of feature subsets, providing bottom-line fairness. Furthermore, the attributions from the Shapley value and Least Core are combined through a weighted fusion approach, where the weight acts as a tunable hyperparameter to balance the global fairness and worst-case robustness. The proposed framework is seamlessly integrated into mainstream machine learning models such as XGBoost. Empirical evaluations on real-world real estate rental data demonstrate that this hybrid attribution method not only preserves the global fairness of the Shapley value but also significantly enhances the explanation consistency and trustworthiness under various data perturbations. This study provides a new perspective for robust explainable AI in high-risk decision-making scenarios and holds promising potential for practical applications. Full article
(This article belongs to the Section Architectural Design, Urban Science, and Real Estate)
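
A minimal sketch of the kind of fusion the abstract describes, on a toy three-player cooperative game (the characteristic function v and the fusion weight lam are invented; the paper applies this to feature-attribution games over models such as XGBoost):

```python
import itertools
from math import factorial

import numpy as np
from scipy.optimize import linprog

# Toy cooperative game over 3 "features": v maps a coalition (frozenset)
# to its value. The numbers are invented for illustration only.
players = [0, 1, 2]
v = {frozenset(): 0.0, frozenset({0}): 1.0, frozenset({1}): 2.0,
     frozenset({2}): 2.0, frozenset({0, 1}): 4.0, frozenset({0, 2}): 3.0,
     frozenset({1, 2}): 5.0, frozenset({0, 1, 2}): 7.0}

def shapley(players, v):
    """Exact Shapley values: average marginal contribution over coalitions."""
    n = len(players)
    phi = np.zeros(n)
    for i in players:
        for r in range(n):
            for S in itertools.combinations([p for p in players if p != i], r):
                S = frozenset(S)
                weight = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[i] += weight * (v[S | {i}] - v[S])
    return phi

def least_core(players, v):
    """Least core via LP: minimize eps subject to x(S) >= v(S) - eps for
    every proper nonempty coalition S, with efficiency x(N) = v(N)."""
    n = len(players)
    c = np.zeros(n + 1); c[-1] = 1.0              # variables (x_1..x_n, eps)
    A_ub, b_ub = [], []
    for r in range(1, n):
        for S in itertools.combinations(players, r):
            row = np.zeros(n + 1)
            row[list(S)] = -1.0; row[-1] = -1.0   # -x(S) - eps <= -v(S)
            A_ub.append(row); b_ub.append(-v[frozenset(S)])
    A_eq = np.ones((1, n + 1)); A_eq[0, -1] = 0.0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=A_eq, b_eq=[v[frozenset(players)]],
                  bounds=[(None, None)] * (n + 1))
    return res.x[:n]

lam = 0.5   # fusion weight, a tunable hyperparameter in the paper
phi_hybrid = lam * shapley(players, v) + (1 - lam) * least_core(players, v)
print(phi_hybrid)
```

Setting lam = 1 recovers pure Shapley attribution (global fairness), lam = 0 pure least core (worst-case robustness); intermediate values trade the two off.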

19 pages, 386 KB  
Article
Optimal Investment Strategy for DC Pension Plan with Stochastic Salary and Value at Risk Constraint in Stochastic Volatility Model
by Zilan Liu, Huanying Zhang, Yijun Wang and Ya Huang
Axioms 2024, 13(8), 543; https://doi.org/10.3390/axioms13080543 - 10 Aug 2024
Cited by 1 | Viewed by 1554
Abstract
This paper studies the optimal asset allocation problem of a defined contribution (DC) pension plan with a stochastic salary and a Value-at-Risk constraint within a stochastic volatility model. It is assumed that the financial market contains a risk-free asset and a risky asset whose price process satisfies the Stein–Stein stochastic volatility model. To comply with regulatory standards and offer a risk management tool, we integrate the dynamic versions of Value-at-Risk (VaR), Conditional Value-at-Risk (CVaR), and worst-case CVaR (wcCVaR) constraints into the DC pension fund management model. The salary is assumed to be stochastic and characterized by geometric Brownian motion. In the dynamic setting, a CVaR/wcCVaR constraint is equivalent to a VaR constraint under a higher confidence level. By using the Lagrange multiplier method and the dynamic programming method to maximize the constant absolute risk aversion (CARA) utility of terminal wealth, we obtain closed-form expressions for the optimal investment strategies with and without a VaR constraint. Several numerical examples illustrate the impact of a dynamic VaR/CVaR/wcCVaR constraint and other parameters on the optimal strategy. Full article
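
In our notation (not the paper's), the constrained problem has roughly this shape:

```latex
% Sketch (our notation): maximize expected CARA utility of terminal wealth
% X_T^\pi over admissible strategies \pi, subject to a dynamic VaR constraint.
\[
  \max_{\pi}\; \mathbb{E}\!\left[ -\tfrac{1}{\gamma}\, e^{-\gamma X_T^{\pi}} \right]
  \quad \text{subject to} \quad
  \mathbb{P}\!\left( X_T^{\pi} < L \right) \le \alpha ,
\]
% where \gamma > 0 is the absolute risk-aversion coefficient, L a regulatory
% floor, and \alpha the tolerated shortfall probability. Per the abstract, a
% dynamic CVaR or wcCVaR constraint reduces to a VaR constraint of this form
% at a higher confidence level.
```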

21 pages, 3829 KB  
Article
A Scenario-Based Multi-Criteria Decision-Making Approach for Allocation of Pistachio Processing Facilities: A Case Study of Zarand, Iran
by Mohammad Ebrahimi Sirizi, Esmaeil Taghavi Zirvani, Abdulsalam Esmailzadeh, Jafar Khosravian, Reyhaneh Ahmadi, Naeim Mijani, Reyhaneh Soltannia and Jamal Jokar Arsanjani
Sustainability 2023, 15(20), 15054; https://doi.org/10.3390/su152015054 - 19 Oct 2023
Cited by 9 | Viewed by 3775
Abstract
Site selection and allocation of manufacturing and processing facilities are essential to the sustainable economic productivity of a given product while preserving soil, the environment, and biodiversity. An essential criterion when evaluating various approaches to model land suitability for pistachio processing facilities is their adaptability to accommodate the diverse perspectives and circumstances of managers and decision makers. Incorporating the concept of risk into the decision-making process stands as a significant research gap in modeling land suitability for pistachio processing facilities. This study presents a scenario-based multi-criteria decision-making system for modeling the land suitability of pistachio processing facilities. The model was implemented based on a stakeholder analysis as well as a set of influential criteria and restrictions for an Iranian case study; Iran is among the top three pistachio producers. The weight of each criterion was determined with the best-worst method (BWM) after the stakeholder analysis. Then, the ordered weighted averaging (OWA) model was used to prepare maps of spatial potential for building a pistachio processing factory under different decision-making scenarios: very pessimistic, pessimistic, intermediate, optimistic, and very optimistic attitudes. Finally, a sensitivity analysis of the very-high- and high-potential regions to changes in the weights of the effective criteria showed that the most important criteria were proximity to pistachio orchards, proximity to residential areas, proximity to the road network, and proximity to industrial areas. Overall, 327 km² of the study area was classified as restricted, meaning it contains no suitable locations for pistachio processing. The average estimated potential values based on the proposed model for the very pessimistic, pessimistic, intermediate, optimistic, and very optimistic scenarios were 0.19, 0.47, 0.63, 0.78, and 0.97, respectively. The very-high-potential class covered 0, 0.41, 8.25, 39.64, and 99.78 percent of the study area under these scenarios, respectively. The area of suitable regions for investment decreased as risk aversion in decision making increased. The model was most sensitive to changes in the weights of proximity to residential areas, proximity to pistachio orchards, and proximity to transportation hubs. The proposed approach and findings could be of broader use to the respective stakeholders and investors. Given the suitability of arid regions for planting pistachio and its relatively high profitability, local authorities and decision makers can promote further expansion of the orchards, which can lead to better welfare for farmers and reduced rural-urban migration in the region. Full article
(This article belongs to the Special Issue Sustainable Environmental Analysis of Soil and Water)
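
A minimal sketch of how OWA order weights encode the decision attitudes named above; the site scores and weight vectors are invented for illustration:

```python
import numpy as np

def owa(scores, order_weights):
    """Ordered weighted averaging: sort the criterion scores in descending
    order, then take the dot product with the order weights."""
    return float(np.sort(scores)[::-1] @ np.asarray(order_weights))

# Criterion scores for one candidate site (already weighted by BWM-derived
# criterion weights and rescaled to [0, 1]; values invented for illustration).
site = np.array([0.9, 0.6, 0.4, 0.8])

# Order weights encode the decision attitude: all weight on the minimum is
# very pessimistic (risk-averse), all weight on the maximum very optimistic.
attitudes = {
    "very pessimistic": [0.0, 0.0, 0.0, 1.0],
    "intermediate":     [0.25, 0.25, 0.25, 0.25],
    "very optimistic":  [1.0, 0.0, 0.0, 0.0],
}
for name, ow in attitudes.items():
    print(f"{name}: {owa(site, ow):.2f}")
```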

16 pages, 2232 KB  
Article
Reserve Fund Optimization Model for Digital Banking Transaction Risk with Extreme Value-at-Risk Constraints
by Moch Panji Agung Saputra, Diah Chaerani, Sukono and Mazlynda Md. Yusuf
Mathematics 2023, 11(16), 3507; https://doi.org/10.3390/math11163507 - 14 Aug 2023
Cited by 1 | Viewed by 2014
Abstract
The digitalization of bank data and financial operations creates a large risk of loss. Losses due to errors in a bank's digital systems need to be mitigated through the readiness of reserve funds, and the size of those reserves should be optimized so that no large excess is held; the remainder can then be invested by the bank to obtain additional returns. This study aims to optimize the reserve fund allocation for digital banking transactions. The decision variables are the reserve values based on the potential loss of each digital banking service, and the objective function minimizes the total reserve fund allocation. The constraints are the rules of Basel II, Basel III, and Article 71 paragraph 1 of the Limited Liability Company Law. Since the objective function can be expressed as a linear function, a linear programming optimization approach is employed with Extreme Value-at-Risk (EVaR) constraints. In applying the EVaR approach to the digital banking problem, the losses are found to meet the criteria of extreme data based on the Generalized Pareto Distribution (GPD). The strength of linear programming optimization with EVaR constraints is that it accounts for potential losses from digital banking risks while minimizing reserves, so that the allocation of company funds becomes optimal; a standard approach that considers only historical profit data can result in excessive reserves because potential risks in future periods are ignored. For the numerical experiment, the risk data used in the modeling are a simulated sample of digital banking losses due to system downtime, system timeout, external failure, and operational user failure. The optimization model with EVaR constraints produces an optimal reserve fund value, making the allocation of bank reserve funds efficient and helping banks avoid the worst outcome, namely collapse due to unbalanced mandatory reserve funds. Full article
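
A rough sketch of the peaks-over-threshold EVaR estimate and the reserve LP, on invented loss data; the paper's actual model adds the Basel II/III and company-law constraints, which make the LP nontrivial:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import genpareto

rng = np.random.default_rng(1)

# Simulated daily losses for four digital-banking risk types (invented data
# standing in for system downtime, timeout, external and user failure).
losses = {name: rng.pareto(2.5, 1000) * 10.0
          for name in ["downtime", "timeout", "external", "user"]}

def evar(sample, q=0.999, u_quantile=0.95):
    """Peaks-over-threshold estimate of the q-quantile: fit a GPD to the
    exceedances over a high threshold u and invert the tail formula."""
    u = np.quantile(sample, u_quantile)
    exc = sample[sample > u] - u
    xi, _, sigma = genpareto.fit(exc, floc=0)
    p_u = len(exc) / len(sample)                 # P(loss > u)
    return u + (sigma / xi) * (((1 - q) / p_u) ** (-xi) - 1)

req = np.array([evar(x) for x in losses.values()])

# LP sketch: minimize the total reserve subject to covering each line's
# EVaR-based requirement (further regulatory constraints would be added here).
n = len(req)
res = linprog(c=np.ones(n), A_ub=-np.eye(n), b_ub=-req,
              bounds=[(0, None)] * n)
print(dict(zip(losses, res.x.round(2))))
```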

15 pages, 1915 KB  
Article
Conundrum of Classifying Subtypes of Pulmonary Hypertension—Introducing a Novel Approach to Classify “Borderline” Patients in a Population with Severe Aortic Stenosis Undergoing TAVI
by Elke Boxhammer, Sarah X. Gharibeh, Bernhard Wernly, Malte Kelm, Marcus Franz, Daniel Kretzschmar, Uta C. Hoppe, Alexander Lauten and Michael Lichtenauer
J. Cardiovasc. Dev. Dis. 2022, 9(9), 294; https://doi.org/10.3390/jcdd9090294 - 4 Sep 2022
Cited by 1 | Viewed by 2607
Abstract
Background: Transcatheter aortic valve implantation (TAVI) is an established therapeutic option in patients with severe aortic valve stenosis (AS) and a high surgical risk profile. Pulmonary hypertension (PH)—often co-existing with severe AS—is a limiting factor for prognosis and survival. The purpose of this study was to evaluate the prevalence of PH in patients undergoing TAVI, classify these patients into different PH subtypes based on right heart catheter (RHC) measurements, and analyze the prognostic value for survival after TAVI. Methods: 284 patients with severe AS underwent an RHC examination for hemodynamic assessment prior to TAVI and were categorized into subtypes of PH according to the 2015 European Society of Cardiology (ESC) guidelines. TAVI patients were followed up for one year, with 30-day and 1-year mortality as primary endpoints. Results: 74 of 284 participants showed a diastolic pressure gradient (DPG) < 7 mmHg and a pulmonary vascular resistance (PVR) > 3 Wood units (WU) and could not be formally allocated to either isolated post-capillary PH (ipc-PH) or combined pre- and post-capillary PH (cpc-PH). Therefore, a new subgroup called “borderline post-capillary PH” (borderlinepc-PH) was introduced. Compared with TAVI patients with pre-capillary PH (prec-PH) or ipc-PH, patients suffering from borderlinepc-PH (HR 7.114; 95% CI 2.015–25.119; p = 0.002) or cpc-PH (HR 56.459; 95% CI 7.738–411.924; p < 0.001) showed significantly increased 1-year mortality. Conclusions: Post-capillary PH was expanded to include the so-called “borderlinepc-PH” variant in addition to the ipc-PH and cpc-PH subtypes. One-year survival after TAVI differed significantly between the subgroups, with the worst prognosis for borderlinepc-PH and cpc-PH. Full article
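
The subtyping rule can be read as a small decision procedure; the sketch below is an illustrative reading of the abstract's thresholds, not clinical software:

```python
def classify_ph(mpap, pawp, dpg, pvr):
    """Hemodynamic PH subtyping following the 2015 ESC thresholds plus the
    paper's added category (pressures in mmHg, PVR in Wood units). This is
    our reading of the abstract, for illustration only."""
    if mpap < 25:
        return "no PH"
    if pawp <= 15:
        return "pre-capillary PH (prec-PH)"
    # Post-capillary PH: split by diastolic pressure gradient (DPG) and
    # pulmonary vascular resistance (PVR).
    if dpg < 7 and pvr > 3:
        # The paper's new subgroup, not formally allocatable to ipc/cpc-PH.
        return "borderline post-capillary PH (borderlinepc-PH)"
    if dpg >= 7 or pvr > 3:
        return "combined pre- and post-capillary PH (cpc-PH)"
    return "isolated post-capillary PH (ipc-PH)"

print(classify_ph(mpap=35, pawp=20, dpg=5, pvr=4))  # borderlinepc-PH
```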

13 pages, 384 KB  
Article
Taming Tail Risk: Regularized Multiple β Worst-Case CVaR Portfolio
by Kei Nakagawa and Katsuya Ito
Symmetry 2021, 13(6), 922; https://doi.org/10.3390/sym13060922 - 21 May 2021
Cited by 6 | Viewed by 3655
Abstract
Proper tail risk management is a crucial component of the investment process, and conditional Value at Risk (CVaR) is often used as a tail risk measure. CVaR is an asymmetric risk measure that controls and manages the downside risk of a portfolio, while symmetric risk measures such as variance consider both upside and downside risk. In fact, the minimum CVaR portfolio is a promising alternative to traditional mean-variance optimization. However, the minimum CVaR portfolio faces three major challenges. First, using CVaR as a risk measure requires the distribution of asset returns, which is difficult to determine in practice; we therefore need to invest while the distribution is uncertain. Second, the minimum CVaR portfolio is formulated with a single β and may output significantly different portfolios depending on the β. Finally, most portfolio allocation strategies do not account for the transaction costs incurred by each rebalancing of the portfolio. To address these challenges, we propose a Regularized Multiple β Worst-case CVaR (RM-WCVaR) portfolio. The characteristics of this portfolio are as follows: it makes CVaR robust via worst-case CVaR, which is still an asymmetric risk measure, and it is stable across multiple β and against changes in weights over time. We perform experiments on well-known benchmarks to evaluate the proposed portfolio. RM-WCVaR demonstrates superior performance, with both higher risk-adjusted returns and lower maximum drawdown. Full article
(This article belongs to the Special Issue Symmetric Distributions, Moments and Applications)
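
A compact sketch of a worst-case CVaR portfolio over multiple β using the Rockafellar–Uryasev formulation; the scenario returns, penalty weight, and turnover-style regularizer are our stand-ins, not the paper's exact RM-WCVaR specification:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
R = rng.normal(0.0005, 0.01, size=(500, 5))   # invented scenario returns (S x n)
S, n = R.shape
betas = [0.90, 0.95, 0.99]                    # multiple confidence levels
lam = 0.05                                    # regularization strength (invented)
w_prev = np.full(n, 1.0 / n)                  # hypothetical current holdings

w = cp.Variable(n)            # portfolio weights
t = cp.Variable()             # upper bound on every beta's CVaR
cons = [cp.sum(w) == 1, w >= 0]
for beta in betas:
    a = cp.Variable()         # VaR-level auxiliary variable (Rockafellar-Uryasev)
    cvar = a + cp.sum(cp.pos(-R @ w - a)) / ((1 - beta) * S)
    cons.append(cvar <= t)    # t dominates the CVaR at each beta

# Minimize the worst-case CVaR plus an L1 turnover penalty: a stand-in for
# the paper's regularizer that stabilizes weights over time.
prob = cp.Problem(cp.Minimize(t + lam * cp.norm1(w - w_prev)), cons)
prob.solve()
print(np.round(w.value, 3))
```

Because every piece (positive parts, maxima, L1 norm) is LP-representable, the whole problem remains a linear program, which keeps rebalancing cheap.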

27 pages, 9125 KB  
Article
Spatial Analysis, Interactive Visualisation and GIS-Based Dashboard for Monitoring Spatio-Temporal Changes of Hotspots of Bushfires over 100 Years in New South Wales, Australia
by Michael Visner, Sara Shirowzhan and Chris Pettit
Buildings 2021, 11(2), 37; https://doi.org/10.3390/buildings11020037 - 23 Jan 2021
Cited by 20 | Viewed by 9070
Abstract
The 2019–2020 bushfire season is estimated to be one of the worst fire seasons on record in Australia, especially in New South Wales (NSW). The devastating fire season ignited a heated public debate on whether prescribed burning is an effective tool for preventing bushfires, and how the extent of bushfires has been changing over time. The objective of this study is to answer these questions, and more specifically to identify how bushfire patterns have changed over the last 100 years in NSW. To do so, we conducted a spatio-temporal analysis of prescribed burns and bushfires using a 100-year dataset of bushfires. Three research questions were developed, each addressed differently. First, generalised linear modelling was applied to assess the changes in fire patterns. Second, a correlation analysis was conducted to examine whether prescribed burns are an effective tool for reducing bushfire risk. Third, a spatio-temporal analysis was applied to the bushfire location data to explore spatio-temporal clusters of high and low values for bushfires, known as hotspots and coldspots, respectively. The study found that the frequency of bushfires has increased over time; however, it did not identify a significant trend of change in their size. Based on the results for the relationship between prescribed burns and bushfires, it could not be determined whether prescribed burns effectively reduce bushfire risk; further analysis with more data is required. The results of the spatio-temporal analysis showed that coldspots cluster around metropolitan areas such as Sydney, while hotspots are concentrated in rural areas such as the North Coast and South Coast regions of NSW. The analysis found four statistical areas that became new bushfire frequency hotspots in the 2019–2020 bushfire season. These areas combined have about 40,000 residents and at least 13,000 built dwellings. We suggest that further analysis is needed to determine whether there is a pattern of bushfires moving towards metropolitan areas. To make the results of this research accessible to the public, an online interactive GIS-based dashboard was developed. The insight gained from the spatial and temporal analyses in this research is crucial to making smarter decisions on allocating resources and developing preventive or mitigating strategies. Full article
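
Hotspot/coldspot detection of this kind is commonly based on a local statistic such as Getis–Ord Gi*; the sketch below implements that statistic on toy data (the abstract does not name its exact statistic, so this choice is our assumption):

```python
import numpy as np

def getis_ord_gstar(x, W):
    """Getis-Ord Gi* z-scores, a standard hotspot statistic. x holds one
    value per area (e.g., fire counts); W is a symmetric spatial-weights
    matrix that includes each area as its own neighbor (the "star")."""
    n = len(x)
    xbar, s = x.mean(), x.std()
    wx = W @ x                      # weighted sum of values around each area
    wsum = W.sum(axis=1)            # sum of weights per area
    wsq = (W ** 2).sum(axis=1)      # sum of squared weights per area
    denom = s * np.sqrt((n * wsq - wsum ** 2) / (n - 1))
    return (wx - xbar * wsum) / denom

# Toy example: 5 areas on a line, binary contiguity plus self-neighbor.
counts = np.array([1.0, 2.0, 9.0, 8.0, 1.0])
W = np.eye(5)
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
print(getis_ord_gstar(counts, W).round(2))   # large positive z => hotspot
```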

28 pages, 949 KB  
Article
Implementing the Rearrangement Algorithm: An Example from Computational Risk Management
by Marius Hofert
Risks 2020, 8(2), 47; https://doi.org/10.3390/risks8020047 - 14 May 2020
Cited by 9 | Viewed by 4807
Abstract
After a brief overview of aspects of computational risk management, the implementation of the rearrangement algorithm in R is considered as an example from computational risk management practice. This algorithm is used to compute the largest quantile (worst value-at-risk) of the sum of the components of a random vector with specified marginal distributions. It is demonstrated how a basic implementation of the rearrangement algorithm can gradually be improved to provide a fast and reliable computational solution to the problem of computing worst value-at-risk. Besides a running example, an example based on real-life data is considered. Bootstrap confidence intervals for the worst value-at-risk as well as a basic worst value-at-risk allocation principle are introduced. The paper concludes with selected lessons learned from this experience. Full article
(This article belongs to the Special Issue Computational Risk Management)
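
A bare-bones Python re-implementation of the rearrangement algorithm's core loop (the paper works in R and discusses far more careful, tuned implementations; the margins below are illustrative):

```python
import numpy as np

def worst_var_ra(qfuns, alpha=0.99, N=2**10, tol=0.0, max_iter=100):
    """Basic rearrangement algorithm for the worst VaR_alpha of a sum with
    given marginals. qfuns: marginal quantile functions; N: number of
    discretization points in the alpha-tail."""
    # Discretize each marginal quantile function on [alpha, 1).
    p = alpha + (1 - alpha) * np.arange(N) / N
    X = np.column_stack([q(p) for q in qfuns])
    best = -np.inf
    for _ in range(max_iter):
        for j in range(X.shape[1]):
            rest = X.sum(axis=1) - X[:, j]
            # Oppositely order column j against the sum of the others:
            # the smallest value goes where the remaining sum is largest.
            X[:, j] = np.sort(X[:, j])[np.argsort(np.argsort(-rest))]
        m = X.sum(axis=1).min()     # minimal row sum ~ worst-VaR bound
        if m - best <= tol:         # stop once the bound stops improving
            return m
        best = m
    return best

def pareto_q(p):
    """Quantile function of a Pareto distribution with tail index 2."""
    return (1 - p) ** (-1 / 2.0) - 1

# Worst 99% VaR of the sum of three Pareto(2) risks.
print(round(worst_var_ra([pareto_q] * 3), 3))
```

Each column pass makes the row sums as flat as possible given the others, which is what drives the minimal row sum up toward the worst value-at-risk.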