Search Results (157)

Search Parameters:
Keywords = robust portfolio modelling

14 pages, 843 KB  
Article
A Scalarized Entropy-Based Model for Portfolio Optimization: Balancing Return, Risk and Diversification
by Florentin Șerban and Silvia Dedu
Mathematics 2025, 13(20), 3311; https://doi.org/10.3390/math13203311 - 16 Oct 2025
Abstract
Portfolio optimization is a cornerstone of modern financial decision-making, traditionally based on the mean–variance model introduced by Markowitz. However, this framework relies on restrictive assumptions—such as normally distributed returns and symmetric risk preferences—that often fail in real-world markets, particularly in volatile and non-Gaussian environments such as cryptocurrencies. To address these limitations, this paper proposes a novel multi-objective model that combines expected return maximization, mean absolute deviation (MAD) minimization, and entropy-based diversification into a unified optimization structure: the Mean–Deviation–Entropy (MDE) model. The MAD metric offers a robust alternative to variance by capturing the average magnitude of deviations from the mean without inflating extreme values, while entropy serves as an information-theoretic proxy for portfolio diversification and uncertainty. Three entropy formulations are considered—Shannon entropy, Tsallis entropy, and cumulative residual Sharma–Taneja–Mittal entropy (CR-STME)—to explore different notions of uncertainty and structural diversity. The MDE model is formulated as a tri-objective optimization problem and solved via scalarization techniques, enabling flexible trade-offs between return, deviation, and entropy. The framework is empirically tested on a cryptocurrency portfolio composed of Bitcoin (BTC), Ethereum (ETH), Solana (SOL), and Binance Coin (BNB), using daily data over a 12-month period. The empirical setting reflects a high-volatility, high-skewness regime, ideal for testing entropy-driven diversification. Comparative outcomes reveal that entropy-integrated models yield more robust weightings, particularly when tail risk and regime shifts are present. 
Comparative results against classical mean–variance and mean–MAD models indicate that the MDE model achieves improved diversification, enhanced allocation stability, and greater resilience to volatility clustering and tail risk. This study contributes to the literature on robust portfolio optimization by integrating entropy as a formal objective within a scalarized multi-criteria framework. The proposed approach offers promising applications in sustainable investing, algorithmic asset allocation, and decentralized finance, especially under high-uncertainty market conditions.
(This article belongs to the Section E5: Financial Mathematics)
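The scalarization idea described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' exact MDE formulation: it maximizes a weighted combination of expected return, negative mean absolute deviation, and Shannon entropy over the weight simplex, using toy simulated returns and illustrative trade-off parameters `lam` and `gam`.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
R = rng.normal(0.001, 0.02, size=(250, 4))  # toy daily returns for 4 assets
mu = R.mean(axis=0)                         # expected returns

def mad(w):
    """Mean absolute deviation of the portfolio return series."""
    p = R @ w
    return np.mean(np.abs(p - p.mean()))

def shannon(w, eps=1e-12):
    """Shannon entropy of the weight vector (diversification proxy)."""
    return -np.sum(w * np.log(w + eps))

def objective(w, lam=1.0, gam=0.1):
    # scalarized tri-objective: maximize return - lam*MAD + gam*entropy,
    # expressed as a minimization
    return -(mu @ w) + lam * mad(w) - gam * shannon(w)

n = 4
cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bnds = [(0.0, 1.0)] * n                                  # long-only
res = minimize(objective, np.full(n, 1.0 / n), bounds=bnds, constraints=cons)
w_opt = res.x
```

Varying `lam` and `gam` traces out the trade-off between deviation control and entropy-driven diversification that the scalarization technique enables.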

24 pages, 1637 KB  
Article
Inverse DEA for Portfolio Volatility Targeting: Industry Evidence from Taiwan Stock Exchange
by Temitope Olubanjo Kehinde, Sai-Ho Chung and Oludolapo Akanni Olanrewaju
Int. J. Financial Stud. 2025, 13(4), 192; https://doi.org/10.3390/ijfs13040192 - 15 Oct 2025
Abstract
This work develops an inverse data envelopment analysis (Inverse DEA) framework for portfolio optimization, treating return as a desirable output and volatility as an undesirable output. Using 20 industry-level portfolios from the Taiwan Stock Exchange (1365 stocks; FY-2020), we first evaluate efficiency with a directional-distance DEA model and identify 7 inefficient industries. We then formulate an Inverse DEA model that holds inputs and desirable outputs fixed and estimates the maximum feasible reduction in volatility. Estimated reductions range from 0.000827 to 0.007610, and substituting these targets into the base model drives each portfolio’s inefficiency score to zero (ϕ=0), thereby making them efficient. To test robustness, we extend the analysis to a calm pre-crisis year (2019) and a recovery year (2021), which confirm that inefficiency and volatility-reduction targets behave logically across regimes: smaller cuts in stable markets, larger cuts in stressed conditions, and intermediate adjustments during recovery. We interpret these targets as theoretical envelopes that inform risk-reduction priorities rather than investable guarantees. The approach adds a forward-planning layer to DEA-based performance evaluation and provides portfolio managers with quantitative, regime-sensitive volatility-reduction targets at the industry level.
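The directional-distance efficiency step that precedes the inverse model can be illustrated with a small linear program. This is a toy sketch under simplified assumptions (one input, one desirable output, one undesirable output, three hypothetical portfolios), not the paper's specification:

```python
import numpy as np
from scipy.optimize import linprog

# toy data: 3 portfolios with 1 input (capital), 1 desirable output (return),
# and 1 undesirable output (volatility)
x = np.array([1.0, 1.0, 1.0])     # inputs
y = np.array([0.10, 0.08, 0.05])  # returns (desirable)
b = np.array([0.02, 0.03, 0.04])  # volatility (undesirable)

def dd_inefficiency(o):
    """Directional-distance inefficiency beta for unit o (0 = efficient).
    Maximizes beta s.t. a reference mix expands return by beta*y_o and
    contracts volatility by beta*b_o without using more input."""
    n = len(x)
    c = np.zeros(n + 1); c[-1] = -1.0                # linprog minimizes -beta
    A = np.zeros((3, n + 1)); rhs = np.zeros(3)
    A[0, :n] = -y; A[0, -1] = y[o]; rhs[0] = -y[o]   # sum(l*y) >= y_o*(1+beta)
    A[1, :n] = b;  A[1, -1] = b[o]; rhs[1] = b[o]    # sum(l*b) <= b_o*(1-beta)
    A[2, :n] = x;  rhs[2] = x[o]                     # sum(l*x) <= x_o
    res = linprog(c, A_ub=A, b_ub=rhs, bounds=[(0, None)] * (n + 1))
    return res.x[-1]
```

In this toy data, unit 0 dominates (highest return, lowest volatility), so its score is 0, while unit 2 admits a simultaneous return expansion and volatility contraction of 50%.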

26 pages, 711 KB  
Article
Algorithmic Management in Hospitality: Examining Hotel Employees’ Attitudes and Work–Life Balance Under AI-Driven HR Systems
by Milena Turčinović, Aleksandra Vujko and Vuk Mirčetić
Tour. Hosp. 2025, 6(4), 203; https://doi.org/10.3390/tourhosp6040203 - 4 Oct 2025
Abstract
This study investigates hotel employees’ perceptions of AI-driven human resource (HR) management systems within the Accor Group’s properties across three major European cities: Paris, Berlin, and Amsterdam. These diverse urban contexts, spanning a broad portfolio of hotel brands from luxury to economy, provide a rich setting for exploring how AI integration affects employee attitudes and work–life balance. A total of 437 employees participated in the survey, offering a robust dataset for structural equation modeling (SEM) analysis. Exploratory factor analysis identified two primary factors shaping perceptions: AI Perceptions, which encompasses employee views on AI’s impact on job performance, communication, recognition, and retention, and Balanced Management, reflecting attitudes toward fairness, personal consideration, productivity, and skill development in AI-managed environments. The results reveal a complex but optimistic view, where employees acknowledge AI’s potential to enhance operational efficiency and career optimism but also express concerns about flexibility loss and the need for human oversight. The findings underscore the importance of transparent communication, contextual sensitivity, and continuous training in implementing AI systems that support both organizational goals and employee well-being. This study contributes valuable insights to hospitality management by highlighting the relational and ethical dimensions of algorithmic HR systems across varied organizational and cultural settings.
(This article belongs to the Special Issue Digital Transformation in Hospitality and Tourism)

43 pages, 4746 KB  
Article
The BTC Price Prediction Paradox Through Methodological Pluralism
by Mariya Paskaleva and Ivanka Vasenska
Risks 2025, 13(10), 195; https://doi.org/10.3390/risks13100195 - 4 Oct 2025
Abstract
Bitcoin’s extreme price volatility presents significant challenges for investors and traders, necessitating accurate predictive models to guide decision-making in cryptocurrency markets. This study compares the performance of machine learning approaches for Bitcoin price prediction, specifically examining XGBoost gradient boosting, Long Short-Term Memory (LSTM), and GARCH-DL neural networks using comprehensive market data spanning December 2013 to May 2025. We employed extensive feature engineering incorporating technical indicators, applied multiple machine learning and deep learning model configurations, including standalone and ensemble approaches, and utilized cross-validation techniques to assess model robustness. Based on the empirical results, the most significant practical implication is that traders and financial institutions should adopt a dual-model approach, deploying XGBoost for directional trading strategies and utilizing LSTM models for applications requiring precise magnitude predictions, due to their superior continuous forecasting performance. This research demonstrates that traditional technical indicators, particularly market capitalization and price extremes, remain highly predictive in algorithmic trading contexts, validating their continued integration into modern cryptocurrency prediction systems. For risk management applications, the attention-based LSTM’s superior risk-adjusted returns, combined with enhanced interpretability, make it particularly valuable for institutional portfolio optimization and regulatory compliance requirements. The findings suggest that ensemble methods offer balanced performance across multiple evaluation criteria, providing a robust foundation for production trading systems where consistent performance is more valuable than optimization for single metrics.
These results enable practitioners to make evidence-based decisions about model selection based on their specific trading objectives, whether focused on directional accuracy for signal generation or precision of magnitude for risk assessment and portfolio management.
(This article belongs to the Special Issue Portfolio Theory, Financial Risk Analysis and Applications)

20 pages, 1541 KB  
Article
Optimizing Investments in the Portfolio Intelligence (PI) Model
by Nikolaos Loukeris, Lysimachos Maltoudoglou, Yannis Boutalis and Iordanis Eleftheriadis
J. Risk Financial Manag. 2025, 18(9), 521; https://doi.org/10.3390/jrfm18090521 - 17 Sep 2025
Abstract
A new methodology is introduced that incorporates advanced higher-moment evaluation in a new approach to the Portfolio Selection problem, supported by effective Computational Intelligence models. The Portfolio Intelligence (PI) model extracts hidden patterns from extensive accounting data and financial statements, filtering out misleading effects such as noise or fraud, and offers an optimal portfolio selection method. The chaotic reflections of investors’ speculative behavior are analyzed via fractal distributions, as higher moments combined with fundamentals clear the turbulence of noise, while the PI model, with its robust AI classifiers, provides optimal investment support.
(This article belongs to the Section Mathematics and Finance)

18 pages, 1720 KB  
Article
Robust Portfolio Optimization in Crypto Markets Using Second-Order Tsallis Entropy and Liquidity-Aware Diversification
by Florentin Șerban and Silvia Dedu
Risks 2025, 13(9), 180; https://doi.org/10.3390/risks13090180 - 17 Sep 2025
Abstract
In this paper, we propose a novel optimization model for portfolio selection that integrates the classical mean–variance criterion with a second-order Tsallis entropy term. This approach enables a trade-off between expected return, risk, and diversification, extending Markowitz’s theory to account for non-Gaussian characteristics and heavy-tailed distributions that are typical in financial markets—especially in cryptocurrency assets. Unlike the first-order Tsallis entropy, the second-order version amplifies the effects of distributional structure and allows for more refined penalization of portfolio concentration. We derive the analytical solution for the optimal weights under this extended framework and demonstrate its performance through a case study using real data from selected cryptocurrencies. Efficient frontiers, portfolio weights, and entropy indicators are compared across models. This novel combination may improve portfolio selection under uncertainty, especially in the context of volatile assets such as cryptocurrencies, as the proposed model can provide a more robust and diversified portfolio structure compared to conventional theories.
(This article belongs to the Special Issue Mathematical Methods Applied in Pricing and Investment Problems)
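The role of the second-order (q = 2) Tsallis entropy as a concentration penalty is easy to see numerically. A small sketch, not tied to the paper's data: for weights w, S_q(w) = (1 − Σ w_i^q)/(q − 1), and q = 2 reduces to 1 − Σ w_i², which is maximal for equal weights and shrinks as the portfolio concentrates.

```python
import numpy as np

def tsallis_entropy(w, q=2.0):
    """Tsallis entropy S_q(w) = (1 - sum w_i^q) / (q - 1).
    For q = 2 this is 1 - sum w_i^2 (the Gini-Simpson diversity index)."""
    w = np.asarray(w, dtype=float)
    return (1.0 - np.sum(w ** q)) / (q - 1.0)

uniform = np.full(4, 0.25)
concentrated = np.array([0.85, 0.05, 0.05, 0.05])

s_uni = tsallis_entropy(uniform)       # 1 - 4*(0.25**2) = 0.75
s_con = tsallis_entropy(concentrated)  # 1 - (0.7225 + 3*0.0025) = 0.27
```

Subtracting a multiple of this term from a mean–variance objective therefore penalizes concentrated allocations, which is the diversification mechanism the abstract describes.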

35 pages, 4885 KB  
Article
Evaluating Sectoral Vulnerability to Natural Disasters in the US Stock Market: Sectoral Insights from DCC-GARCH Models with Generalized Hyperbolic Innovations
by Adriana AnaMaria Davidescu, Eduard Mihai Manta, Margareta-Stela Florescu, Robert-Stefan Constantin and Cristina Manole
Sustainability 2025, 17(18), 8324; https://doi.org/10.3390/su17188324 - 17 Sep 2025
Abstract
The escalating frequency and severity of natural disasters present significant challenges to the stability and sustainability of global financial systems, with the US stock market being especially vulnerable. This study examines sector-level exposure and contagion dynamics during climate-related disaster events, providing insights essential for sustainable investing and resilient financial planning. Using an advanced econometric framework—dynamic conditional correlation GARCH (DCC-GARCH) augmented with Generalized Hyperbolic Processes (GHPs) and an asymmetric specification (ADCC-GARCH)—we model daily stock returns for 20 publicly traded US companies across five sectors (insurance, energy, automotive, retail, and industrial) between 2017 and 2022. The results reveal considerable sectoral heterogeneity: insurance and energy sectors exhibit the highest vulnerability, with heavy-tailed return distributions and persistent volatility, whereas retail and selected industrial firms demonstrate resilience, including counter-cyclical behavior during crises. GHP-based models improve tail risk estimation by capturing return asymmetries, skewness, and leptokurtosis beyond Gaussian specifications. Moreover, the ADCC-GHP-GARCH framework shows that negative shocks induce more persistent correlation shifts than positive ones, highlighting asymmetric contagion effects during stress periods. By integrating sector-specific financial responses into a climate-disaster framework, this research supports the design of targeted climate risk mitigation strategies, sustainable investment portfolios, and regulatory stress-testing approaches that account for volatility clustering and tail dependencies. The findings contribute to the literature on financial resilience by providing a robust statistical basis for assessing how extreme climate events impact asset values, thereby informing both policy and practice in advancing sustainable economic development.

19 pages, 4009 KB  
Article
An Integrated GIS–MILP Framework for Cost-Optimal Forest Biomass-to-Bioenergy Supply Chains: A Case Study in Queensland, Australia
by Sam Van Holsbeeck, Mauricio Acuna and Sättar Ezzati
Forests 2025, 16(9), 1467; https://doi.org/10.3390/f16091467 - 15 Sep 2025
Abstract
Renewable energy expansion requires cost-effective strategies to integrate underutilized biomass resources into energy systems. In Australia, forest residues represent a significant but largely untapped feedstock that could contribute to a more diversified energy portfolio. This study presents an integrated geospatial and optimization decision-support model designed to minimize the total cost of forest biomass-to-bioenergy supply chains through optimal facility selection and network design. The model combined geographic information systems with mixed-integer linear programming to identify the optimal candidate facility sites based on spatial constraints, biomass availability and infrastructure proximity. These inputs then informed an optimization framework that determined the number, size, and geographical distribution of bioenergy plants. The model was applied to a case study in Queensland, Australia, evaluating two strategic scenarios: (i) a biomass-driven approach that maximizes the use of forest residues; and (ii) an energy-driven approach that aligns facilities with regional energy consumption patterns. Results indicated that increasing the minimum facility size reduced overall costs by capitalizing on economies of scale. Biomass collection accounted for 81%–83% of total supply chain costs (excluding capital installation), emphasizing the need for logistically efficient sourcing strategies. Furthermore, the system exhibited high sensitivity to transportation distance and biomass availability; energy demands exceeding 400 MW resulted in sharply escalating transport expenses. This study provides a scalable, data-driven framework for the strategic planning of forest-based bioenergy systems. It offers actionable insights for policymakers and industry stakeholders to support the development of robust, cost-effective, and sustainable bioenergy supply chains in Australia and other regions with similar biomass resources.
(This article belongs to the Special Issue Forest-Based Biomass for Bioenergy)

25 pages, 768 KB  
Article
Prioritizing Early-Stage Start-Up Investment Alternatives Under Uncertainty: A Venture Capital Perspective
by Mustafa Kellekci, Ufuk Cebeci and Onur Dogan
Appl. Sci. 2025, 15(18), 10060; https://doi.org/10.3390/app151810060 - 15 Sep 2025
Abstract
Early-stage start-up selection is a critical yet challenging task for venture capital (VC) investors due to high uncertainty, limited historical data, and rapidly evolving business environments. Traditional evaluation processes often fall short in systematically handling multiple qualitative and uncertain factors that influence start-up success. As a result, there is a growing demand for robust decision models that can support VC firms in identifying promising early-stage ventures more accurately and consistently. This study presents a hybrid fuzzy multi-criteria decision-making approach tailored to the needs of venture capital investment under uncertainty. The model integrates expert judgment using the proportional spherical fuzzy AHP method to evaluate the relative importance of key dimensions. Then, spherical fuzzy TOPSIS is applied to rank investment alternatives based on their overall performance. The proposed framework enables VC decision-makers to incorporate both subjective insights and data ambiguity in a structured and transparent way. It offers a practical tool to enhance the reliability of early-stage investment evaluations and improve the effectiveness of venture capital portfolio strategies.
(This article belongs to the Special Issue Applications of Fuzzy Systems and Fuzzy Decision Making)
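The ranking step can be illustrated with the classical crisp TOPSIS procedure that the spherical fuzzy version generalizes. A minimal sketch with hypothetical start-up criteria (team strength, market size, burn rate) and made-up weights; the fuzzy extensions replace the crisp scores with spherical fuzzy numbers but keep the same ideal/anti-ideal logic:

```python
import numpy as np

def topsis(X, weights, benefit):
    """Rank alternatives (rows of X) by closeness to the ideal solution.
    weights: criterion weights; benefit: True where higher is better."""
    X = np.asarray(X, dtype=float)
    norm = X / np.linalg.norm(X, axis=0)           # vector-normalize each criterion
    V = norm * weights                             # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to ideal
    d_neg = np.linalg.norm(V - anti, axis=1)       # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                 # closeness coefficient in [0, 1]

# 3 hypothetical start-ups scored on (team, market size, burn rate);
# burn rate is a cost criterion, so lower is better
X = [[8, 7, 3], [6, 9, 5], [9, 5, 8]]
scores = topsis(X, weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, True, False]))
best = int(np.argmax(scores))
```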

21 pages, 3095 KB  
Article
Volatility Analysis of Returns of Financial Assets Using a Bayesian Time-Varying Realized GARCH-Itô Model
by Pathairat Pastpipatkul and Htwe Ko
Econometrics 2025, 13(3), 34; https://doi.org/10.3390/econometrics13030034 - 9 Sep 2025
Abstract
As financial markets grow increasingly complex and high-frequency, volatility analysis is a cornerstone of modern financial econometrics with practical applications in portfolio optimization, derivative pricing, and systematic risk assessment. This paper introduces a novel Bayesian Time-varying Generalized Autoregressive Conditional Heteroskedasticity Itô (BtvGARCH-Itô) model designed to improve the precision and flexibility of volatility modeling in financial markets. Original GARCH-Itô models, while effective in capturing realized volatility and intraday patterns, rely on fixed or constant parameters and are thus limited in capturing structural changes. Our proposed model addresses this limitation by integrating the continuous-time Itô process with time-varying Bayesian inference, allowing parameters to vary over time based on prior beliefs so as to quantify uncertainty and minimize overfitting, especially in small-sample or high-dimensional settings. Through simulation studies using sample sizes of N = 100 and N = 200, we find that BtvGARCH-Itô outperformed the original GARCH-Itô model in in-sample fit and out-of-sample forecast accuracy, based on comparison of posterior estimates with true parameter values and on forecasting error metrics. For empirical validation, the model is applied to analyze the volatility of the S&P 500 and Bitcoin (BTC) using one-minute data for the S&P 500 (from 3 January 2023 to 31 December 2024) and BTC (from 1 January 2023 to 1 January 2025). The model has potential as a robust tool and a new direction in volatility modeling for financial risk management.
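Realized volatility, the quantity the GARCH-Itô family embeds, is computed from intraday data as the square root of the sum of squared high-frequency log-returns. A minimal sketch with simulated one-minute returns (illustrative only, not the paper's S&P 500 or BTC series):

```python
import numpy as np

rng = np.random.default_rng(1)
# toy one-minute log-returns for a single 390-minute trading day
r = rng.normal(0.0, 0.0005, size=390)

# realized variance is the sum of squared intraday returns;
# realized volatility is its square root
rv = np.sum(r ** 2)
realized_vol = np.sqrt(rv)
```

In a GARCH-Itô setting, a daily series of such realized measures enters the conditional-variance recursion alongside squared daily returns.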

20 pages, 1738 KB  
Article
Regime-Switching Asset Allocation Using a Framework Combining a Jump Model and Model Predictive Control
by Xianglong Li, Jianjun Chen, Xiangxing Tao and Yanting Ji
Mathematics 2025, 13(17), 2837; https://doi.org/10.3390/math13172837 - 3 Sep 2025
Abstract
This study proposes a novel hybrid framework that integrates a jump model with model predictive control (JM-MPC) for dynamic asset allocation under regime-switching market conditions. The proposed approach leverages the jump model to identify distinct market regimes while incorporating a rolling prediction mechanism to estimate time-varying asset returns and covariance matrices across multiple horizons. These regime-dependent estimates are subsequently used as inputs for an MPC-based optimization process to determine optimal asset allocations. Through comprehensive empirical analysis, we demonstrate that the JM-MPC framework consistently outperforms an equal-weighted portfolio, delivering superior risk-adjusted returns while substantially mitigating portfolio drawdowns during high-volatility periods. Our findings establish the effectiveness of combining regime-switching modeling with model predictive control techniques for robust portfolio management in dynamic financial markets.

30 pages, 651 KB  
Article
A Fusion of Statistical and Machine Learning Methods: GARCH-XGBoost for Improved Volatility Modelling of the JSE Top40 Index
by Israel Maingo, Thakhani Ravele and Caston Sigauke
Int. J. Financial Stud. 2025, 13(3), 155; https://doi.org/10.3390/ijfs13030155 - 25 Aug 2025
Abstract
Volatility modelling is a key feature of financial risk management, portfolio optimisation, and forecasting, particularly for market indices such as the JSE Top40 Index, which serves as a benchmark for the South African stock market. This study investigates volatility modelling of the JSE Top40 Index log-returns from 2011 to 2025 using a two-step hybrid approach that integrates statistical and machine learning techniques. The ARMA(3,2) model was chosen as the optimal mean model, using the auto.arima() function from the forecast package in R (version 4.4.0). Several alternative variants of GARCH models, including sGARCH(1,1), GJR-GARCH(1,1), and EGARCH(1,1), were fitted under various conditional error distributions (i.e., STD, SSTD, GED, SGED, and GHD). Model selection was based on the AIC, BIC, HQIC, and LL criteria, under which ARMA(3,2)-EGARCH(1,1) emerged as the best model. Residual diagnostic results indicated that the model adequately captured autocorrelation, conditional heteroskedasticity, and asymmetry in JSE Top40 log-returns. Volatility persistence was also detected, confirming the persistence attributes of financial volatility. Thereafter, the ARMA(3,2)-EGARCH(1,1) model was coupled with XGBoost using standardised residuals extracted from ARMA(3,2)-EGARCH(1,1) as lagged features. The data were split into training (60%), testing (20%), and calibration (20%) sets. Based on the lowest values of forecast accuracy measures (i.e., MASE, RMSE, MAE, MAPE, and sMAPE), along with prediction intervals and their evaluation metrics (i.e., PICP, PINAW, PICAW, and PINAD), the hybrid model captured residual nonlinearities left by the standalone ARMA(3,2)-EGARCH(1,1) and demonstrated improved forecasting accuracy. The hybrid ARMA(3,2)-EGARCH(1,1)-XGBoost model outperforms the standalone ARMA(3,2)-EGARCH(1,1) model across all forecast accuracy measures.
This highlights the robustness and suitability of the hybrid ARMA(3,2)-EGARCH(1,1)-XGBoost model for financial risk management in emerging markets and demonstrates the strength of integrating statistical and machine learning methods in financial time series modelling.

13 pages, 398 KB  
Article
An Approximate Algorithm for Sparse Distributionally Robust Optimization
by Ruyu Wang, Yaozhong Hu, Cong Liu and Quanwei Gao
Information 2025, 16(8), 676; https://doi.org/10.3390/info16080676 - 7 Aug 2025
Abstract
In this paper, we propose a sparse distributionally robust optimization (DRO) model incorporating the Conditional Value-at-Risk (CVaR) measure to control tail risks in uncertain environments. The model utilizes sparsity to reduce transaction costs and enhance operational efficiency. We reformulate the problem as a Min-Max-Min optimization and convert it into an equivalent non-smooth minimization problem. To address this computational challenge, we develop an approximate discretization (AD) scheme for the underlying continuous random vector and prove its convergence to the original non-smooth formulation under mild conditions. The resulting problem can be efficiently solved using a subgradient method. While our analysis focuses on the CVaR penalty, this approach is applicable to a broader class of non-smooth convex regularizers. The experimental results on the portfolio selection problem confirm the effectiveness and scalability of the proposed AD algorithm.
(This article belongs to the Special Issue Optimization Algorithms and Their Applications)
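The CVaR measure at the core of the model has a convenient empirical form via the Rockafellar–Uryasev representation, in which the VaR is the minimizer of the defining objective. A minimal numpy sketch on simulated losses (illustrative only, not the paper's DRO reformulation):

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical CVaR via the Rockafellar-Uryasev form:
    CVaR_a(L) = t + E[(L - t)+] / (1 - a), evaluated at t = VaR_a(L)."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    return var + np.mean(np.maximum(losses - var, 0.0)) / (1.0 - alpha)

rng = np.random.default_rng(2)
L = rng.normal(size=100_000)   # simulated portfolio losses
c = cvar(L, alpha=0.95)        # for N(0,1) this is roughly 2.06
```

The discretization scheme in the paper replaces the continuous loss distribution with scenario samples like `L`, which is exactly what makes this empirical evaluation tractable inside the Min-Max-Min problem.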

17 pages, 1152 KB  
Article
PortRSMs: Learning Regime Shifts for Portfolio Policy
by Bingde Liu and Ryutaro Ichise
J. Risk Financial Manag. 2025, 18(8), 434; https://doi.org/10.3390/jrfm18080434 - 5 Aug 2025
Abstract
This study proposes a novel Deep Reinforcement Learning (DRL) policy network structure for portfolio management called PortRSMs. PortRSMs employs stacked State-Space Models (SSMs) for the modeling of multi-scale continuous regime shifts in financial time series, striking a balance between exploring consistent distribution properties over short periods and maintaining sensitivity to sudden shocks in price sequences. PortRSMs also performs cross-asset regime fusion through hypergraph attention mechanisms, providing a more comprehensive state space for describing changes in asset correlations and co-integration. Experiments conducted on two different trading frequencies in the stock markets of the United States and Hong Kong show the superiority of PortRSMs compared to other approaches in terms of profitability, risk–return balancing, robustness, and the ability to handle sudden market shocks. Specifically, PortRSMs achieves up to a 0.03 improvement in the annual Sharpe ratio in the U.S. market, and up to a 0.12 improvement for the Hong Kong market compared to baseline methods.
(This article belongs to the Special Issue Machine Learning Applications in Finance, 2nd Edition)

31 pages, 1755 KB  
Article
Two-Stage Distributionally Robust Optimization for an Asymmetric Loss-Aversion Portfolio via Deep Learning
by Xin Zhang, Shancun Liu and Jingrui Pan
Symmetry 2025, 17(8), 1236; https://doi.org/10.3390/sym17081236 - 4 Aug 2025
Abstract
In portfolio optimization, investors often overlook asymmetric preferences for gains and losses. We propose a distributionally robust two-stage portfolio optimization (DR-TSPO) model, which is suitable for scenarios where the loss reference point is adaptively updated based on prior decisions. For analytical convenience, we further reformulate the DR-TSPO model as an equivalent second-order cone programming counterpart. Additionally, we develop a deep learning-based constraint correction algorithm (DL-CCA) trained directly on problem descriptions, which enhances computational efficiency for large-scale non-convex distributionally robust portfolio optimization. Our empirical results obtained using global market data demonstrate that during COVID-19, the DR-TSPO model outperformed traditional two-stage optimization in reducing conservatism and avoiding extreme losses.
(This article belongs to the Section Computer)
