
Search Results (35)

Search Parameters:
Keywords = Basel Accord

7 pages, 717 KiB  
Brief Report
Diagnostic Limitations of Applying a Human Portable Blood Glucose Meter in the Detection of Hypoglycemia in Pregnant Ewes
by José Lucas Xavier Lopes, Raquel Fraga e Silva Raimondo, Luiza Rodegheri Jacondino, Beatriz Riet Correa, Clara Satsuki Mori and Álan Gomes Pöppl
Vet. Sci. 2025, 12(1), 47; https://doi.org/10.3390/vetsci12010047 - 11 Jan 2025
Viewed by 1083
Abstract
Pregnant ewes are susceptible to hypoglycemia and ketosis; therefore, monitoring glycemic status is extremely important. Portable blood glucose meters (PBGMs) can assist in quickly and conveniently identifying glycemic disturbances in this species, provided that they meet the criteria of analytical accuracy. This study evaluated the performance of a human PBGM (Accu-Chek Performa®, Roche Diagnostics, Basel, Switzerland) in the glycemic evaluation of 34 pregnant ewes at days 90 and 120 of pregnancy in comparison with the results of glycemia determination by a reference method (RM). The device showed a high positive correlation (r = 0.71, 95%CI = 0.57–0.82, p < 0.0001) with the RM; however, 96.6% of the PBGM results (58.5 ± 9.82 mg/dL) were higher (p < 0.0001) than those obtained in the laboratory (48.6 ± 9.31 mg/dL). The PBGM tested was considered analytically inaccurate according to ISO 15197:2013, which states that when glucose levels are below 100 mg/dL, 95% of the measurements should deviate by no more than 15 mg/dL from the RM value; one-third of the PBGM results exceeded this limit. Hypoglycemia (<50 mg/dL) was documented in 60.29% of samples tested with the RM, but only 17.64% of results were below 50 mg/dL using the PBGM. Due to these limitations, Accu-Chek Performa® results should be interpreted cautiously in pregnant sheep suspected of hypoglycemia. Full article
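The ISO 15197:2013 criterion the abstract cites (for reference glucose below 100 mg/dL, at least 95% of meter readings must fall within ±15 mg/dL of the reference) can be sketched as a simple check. The function and the sample values below are illustrative, not the study's data:

```python
def iso15197_within_limits(meter, reference):
    """Fraction of meter readings within the ISO 15197:2013 tolerance:
    +/-15 mg/dL of the reference when the reference is below 100 mg/dL,
    +/-15% of the reference otherwise."""
    ok = 0
    for m, r in zip(meter, reference):
        tol = 15.0 if r < 100 else 0.15 * r
        ok += abs(m - r) <= tol
    return ok / len(meter)

# Illustrative values (not the study's data):
meter = [62, 55, 70, 48, 66]
reference = [50, 49, 52, 47, 51]
share = iso15197_within_limits(meter, reference)
passes = share >= 0.95  # ISO 15197:2013 requires at least 95%
```

On the study's reported figures (about one-third of readings beyond the tolerance), the within-tolerance share would be roughly two-thirds, well short of the required 95%.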

34 pages, 4441 KiB  
Article
Deep Fuzzy Credibility Surfaces for Integrating External Databases in the Estimation of Operational Value at Risk
by Alejandro Peña, Lina M. Sepúlveda-Cano, Juan David Gonzalez-Ruiz, Nini Johana Marín-Rodríguez and Sergio Botero-Botero
Sci 2024, 6(4), 74; https://doi.org/10.3390/sci6040074 - 5 Nov 2024
Cited by 2 | Viewed by 1579
Abstract
Operational risk (OR) is usually caused by losses due to human errors, inadequate or defective internal processes, system failures, or external events that affect an organization. According to the Basel II agreement, OR is defined by seven risk events: internal fraud, external fraud, labor relations, clients, damage to fixed assets, technical failures, and failures in the execution and administration of processes. However, the low frequency with which a loss event occurs creates a technological challenge for insurers in estimating the operational value at risk (OpVar) for the protection derived from an organization’s business activities. Against this background, this paper develops and analyzes a Deep Fuzzy Credibility Surface (DFCS) model, which allows different loss event databases to be integrated in a single structure for the estimation of OpVar, overcoming the limitations imposed by the low frequency with which a risk event occurs within an organization (sparse data). For the estimation of OpVar, the DFCS model incorporates a novel activation function based on the generalized log-logistic function to model the frequency and severity random variables that define a loss event (linguistic random variables), as well as a credibility surface to integrate the magnitude and heterogeneity of losses in a single structure as a result of the integration of databases. The stability provided by the DFCS model is evidenced by the structure of the aggregate loss distributions (ALDs), which are obtained by convolving the frequency and severity random variables for each database and which are expected to exhibit structures similar to the probability distributions suggested by the Basel II agreements for OR modeling (lean, long tail, positive skewness). These features make the DFCS model a reference for estimating the OpVar to protect against the risk arising from an organization’s business operations by integrating internal and external loss event databases. Full article
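The convolution of frequency and severity random variables into an aggregate loss distribution (ALD), from which OpVar is read off as a high quantile, can be sketched with a plain Monte Carlo loop. The Poisson and lognormal choices and all parameter values below are illustrative assumptions; the DFCS model's credibility surface and log-logistic activation are not reproduced here:

```python
import math
import random

def simulate_opvar(lam, mu, sigma, n_years=20000, q=0.999, seed=42):
    """Monte Carlo aggregate loss distribution: annual loss is a sum of
    N ~ Poisson(lam) severities drawn from LogNormal(mu, sigma).
    Returns the q-quantile of the simulated annual losses (OpVar)."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_years):
        # Poisson draw by CDF inversion (adequate for small lam)
        n, p, u = 0, math.exp(-lam), rng.random()
        c = p
        while u > c:
            n += 1
            p *= lam / n
            c += p
        losses.append(sum(rng.lognormvariate(mu, sigma) for _ in range(n)))
    losses.sort()
    return losses[int(q * n_years) - 1]

opvar = simulate_opvar(lam=3.0, mu=10.0, sigma=1.2)
```

With a heavy-tailed severity, the 99.9% quantile sits far above the mean annual loss, which is the long-tail, positively skewed shape the abstract describes.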
(This article belongs to the Special Issue Computational Linguistics and Artificial Intelligence)

21 pages, 1576 KiB  
Article
Microcredit Pricing Model for Microfinance Institutions under Basel III Banking Regulations
by Patricia Durango-Gutiérrez, Juan Lara-Rubio, Andrés Navarro-Galera and Dionisio Buendía-Carrillo
Int. J. Financial Stud. 2024, 12(3), 88; https://doi.org/10.3390/ijfs12030088 - 3 Sep 2024
Cited by 2 | Viewed by 3231
Abstract
Purpose. The purpose of this research is to propose a tool for designing a microcredit risk pricing strategy for borrowers of microfinance institutions (MFIs). Design/methodology/approach. Considering the specific characteristics of microcredit borrowers, we first estimate and measure microcredit risk through the default probability, applying a parametric technique such as logistic regression and a non-parametric technique based on an artificial neural network, looking for the model with the highest predictive power. Second, based on the Basel III internal ratings-based (IRB) approach, we use the credit risk measurement for each borrower to design a pricing model that sets microcredit interest rates according to default risk. Findings. The paper demonstrates that the probability of default for each borrower is more accurately adjusted using the artificial neural network. Furthermore, our results suggest that, given a profitability target for the MFI, the microcredit interest rate for clients with a lower level of credit risk should be lower than a standard, fixed rate to achieve the profitability target. Practical implications. This tool allows us, on the one hand, to measure and assess credit risk and minimize default losses in MFIs and, on the other, to promote their competitiveness by reducing interest rates, capital requirements, and credit losses, favoring the financial self-sustainability of these institutions. Social implications. Our findings have the potential to make microfinance institutions fairer and more equitable in their lending practices by providing microcredit with risk-adjusted pricing. Furthermore, our findings can contribute to the design of government policies aimed at promoting the financial and social inclusion of vulnerable people. Originality. The personal characteristics of microcredit clients, mainly reputation and moral solvency, are crucial to the default behavior of microfinance borrowers. These factors should have an impact on the pricing of microcredit. Full article
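The Basel III IRB step of such a pricing tool can be illustrated with the standard regulatory capital formula for "other retail" exposures; the cost components in `microcredit_rate` (funding, operating cost, target return on capital) are hypothetical placeholders, not the paper's calibration:

```python
import math
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF

def irb_capital(pd_, lgd):
    """Basel IRB capital per unit of exposure, 'other retail' asset
    correlation (no maturity adjustment for retail). Standard regulatory
    formula, shown for illustration."""
    w = (1 - math.exp(-35 * pd_)) / (1 - math.exp(-35))
    rho = 0.03 * w + 0.16 * (1 - w)
    return lgd * N.cdf((N.inv_cdf(pd_) + math.sqrt(rho) * N.inv_cdf(0.999))
                       / math.sqrt(1 - rho)) - lgd * pd_

def microcredit_rate(pd_, lgd=0.45, funding=0.06, opex=0.08, roe=0.15):
    """Risk-adjusted rate: funding cost + operating cost + expected loss
    + target return on the regulatory capital held. All cost components
    are hypothetical, for illustration only."""
    return funding + opex + pd_ * lgd + roe * irb_capital(pd_, lgd)

low = microcredit_rate(0.02)   # lower-risk borrower
high = microcredit_rate(0.10)  # higher-risk borrower
```

The rate is increasing in the default probability, which is the paper's point: lower-risk clients can be charged below a standard fixed rate while the MFI still meets its profitability target.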

16 pages, 425 KiB  
Article
Coffee Consumption and CYP1A2 Polymorphism Involvement in Type 2 Diabetes in a Romanian Population
by Laura Claudia Popa, Simona Sorina Farcas and Nicoleta Ioana Andreescu
J. Pers. Med. 2024, 14(7), 717; https://doi.org/10.3390/jpm14070717 - 3 Jul 2024
Cited by 4 | Viewed by 3560
Abstract
Cytochrome P450 1A2 (CYP1A2) is known to be the main enzyme directly responsible for caffeine metabolism. Rs762551 (NC_000015.10:g.74749576C>A) is a single nucleotide polymorphism of the CYP1A2 gene and is associated mainly with the rate of caffeine metabolism. A significant worldwide health issue, type 2 diabetes (T2DM), has been reported to be negatively associated with coffee consumption. Indeed, some studies have shown that high intakes of coffee can lead to a later onset of T2DM. Objectives: This study aims to find any significant correlations among CYP1A2 polymorphism, coffee consumption, and T2DM. Methods: A total of 358 people were enrolled in this study: 218 diagnosed with T2DM and 140 representing the control sample. The qPCR technique was performed, analyzing rs762551 (assay C_8881221) on the LightCycler 480 (Roche, Basel, Switzerland) with Gene Scanning software version 1.5.1 (Roche). Results: Our first observation was that the diabetic patients were likely to consume more coffee than the non-diabetic subjects. People with the AA genotype, the fast metabolizers, are the least common, yet they are the highest coffee consumers and present the highest glucose and cholesterol levels. Another important finding is the correlation between coffee intake and glucose level, which showed statistically significant differences between the diabetic group (p = 0.0002) and the control group (p = 0.029). Conclusions: The main conclusion of this study is that, according to genotype, caffeine levels, glucose, and cholesterol are interconnected and proportionally related, regardless of type 2 diabetes. Full article

10 pages, 214 KiB  
Article
Competition and Regulation: The Case of the UK Banking Industry
by Eleonora Muzzupappa
Mathematics 2024, 12(8), 1126; https://doi.org/10.3390/math12081126 - 9 Apr 2024
Viewed by 2733
Abstract
This study examines the impact of the Basel Accords on competition within the UK banking sector, considering variations based on bank size. The Basel Accords, designed to enhance financial stability, introduce provisions that may affect competition dynamics. Empirical analysis reveals divergent outcomes: large banks tend towards monopolization, while other banks shift towards a more competitive environment. Large banks benefit from regulatory barriers and technological advancements, while other banks face challenges from increased compliance costs. These findings highlight the complex relationship between regulation and competition in banking, emphasizing the need for balanced regulations that promote stability while fostering healthy competition. Full article
(This article belongs to the Special Issue Mathematical and Statistical Modeling of Socio-Economic Behavior)
21 pages, 1984 KiB  
Article
An Empirical Analysis of the Dynamics Influencing Bank Capital Structure in Africa
by Ayodeji Michael Obadire, Vusani Moyo and Ntungufhadzeni Freddy Munzhelele
Int. J. Financial Stud. 2023, 11(4), 127; https://doi.org/10.3390/ijfs11040127 - 1 Nov 2023
Cited by 8 | Viewed by 3972
Abstract
Financial institutions, particularly banks, have long grappled with the dilemma of structuring their capital optimally. This process, commonly referred to as capital structure decision-making, is of paramount importance, especially within the financial services sector, where strict regulations are imposed by reserve and central banks in alignment with global Basel guidelines. This study unveils the key factors that determine the capital structure choices of African banks, using panel data encompassing 45 listed banks across six nations that had embraced the Basel III Accord, spanning the years 2010 to 2019. The study used the system-generalised method of moments (sys-GMM) estimator to fit the formulated panel data regression model. The findings showed positive associations between ZSCORE, an indicator of bank financial stability, and the net interest margin ratio (NIMR) with bank leverage (TCTE). In addition, the results revealed positive correlations between earnings volatility (EV), profitability (P), and risk (R) with bank leverage (TDCE). This suggests that profitable banks are inclined to favour debt financing, a phenomenon driven by their ability to comfortably service debt obligations with free cash flows. The study's overarching conclusion underscores the dominant influence of the Liquidity Coverage Ratio (LCR) on African bank capital structures. Whether assessing traditional or Basel III-prescribed measures of bank leverage, the LCR consistently emerged as the primary determinant. This finding is of significant relevance to bank executives and regulators, offering them essential insights for striking an informed balance between equity and debt financing based on financial stability, profitability, and risk profiles. Full article
19 pages, 3163 KiB  
Article
Measurement and Calibration of Regulatory Credit Risk Asset Correlations
by Anton van Dyk and Gary van Vuuren
J. Risk Financial Manag. 2023, 16(9), 402; https://doi.org/10.3390/jrfm16090402 - 7 Sep 2023
Cited by 1 | Viewed by 3687
Abstract
Vasicek’s asymptotic single risk factor (ASRF) model is employed by the Basel Committee on Banking Supervision (BCBS) in its internal ratings-based (IRB) approach for estimating credit losses and regulatory credit risk capital. This methodology requires estimates of asset correlations; these are prescribed by the BCBS. Practitioners are interested to know market-implied asset correlations since these influence economic capital and lending behavior. These may be backed out from ASRF loan loss distributions using ex post loan losses. Prescribed asset correlations have been neither updated nor recalibrated since their introduction in 2008 with the implementation of the Basel II accord. The market milieu has undergone significant alterations and adaptations since then; it is unlikely that these remain relevant. Loan loss data from a developed (US) and developing (South Africa) economy spanning at least two business cycles for each region were used to explore the relevance of the BCBS calibration. Results obtained from three alternative methodologies are compared with prescribed BCBS values, and the latter were found to be countercyclical to empirical loan loss experience, resulting in less punitive credit risk capital requirements than required in market crises and more punitive requirements than required in calm conditions. Full article
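The backing-out step the abstract describes, recovering market-implied asset correlations from ex post loan losses, can be sketched with the Vasicek/ASRF loss-rate quantile, inverted numerically for the correlation. The PD and observed loss-rate values below are illustrative:

```python
import math
from statistics import NormalDist

N = NormalDist()  # standard normal CDF and inverse CDF

def vasicek_quantile(pd_, rho, alpha=0.999):
    """ASRF (Vasicek) loss-rate quantile used in the Basel IRB approach."""
    return N.cdf((N.inv_cdf(pd_) + math.sqrt(rho) * N.inv_cdf(alpha))
                 / math.sqrt(1 - rho))

def implied_correlation(pd_, observed_q, alpha=0.999, tol=1e-10):
    """Back out the asset correlation implied by an observed extreme loss
    rate, by bisection (the quantile is increasing in rho for alpha > 0.5)."""
    lo, hi = 1e-9, 1 - 1e-9
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if vasicek_quantile(pd_, mid, alpha) < observed_q:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative: a 2% average PD portfolio whose worst observed annual
# loss rate was 15% implies a correlation of roughly 0.12.
rho = implied_correlation(pd_=0.02, observed_q=0.15)
```

This is the single-parameter inversion idea; the paper compares three such estimation methodologies against the BCBS-prescribed values.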
(This article belongs to the Section Banking and Finance)

15 pages, 392 KiB  
Article
Investigating Causes of Model Instability: Properties of the Prediction Accuracy Index
by Ross Taplin
Risks 2023, 11(6), 110; https://doi.org/10.3390/risks11060110 - 7 Jun 2023
Cited by 2 | Viewed by 3055
Abstract
The Prediction Accuracy Index (PAI) monitors stability, defined as whether the predictive power of a model has deteriorated due to a change in the distribution of the explanatory variables since its development. This paper shows how the PAI is related to the Mahalanobis distance, an established statistic for examining high leverage observations in data. This relationship is used to explore properties of the PAI, including tools for how the PAI can be decomposed into effects due to (a) individual observations/cases, (b) individual variables, and (c) shifts in the mean of variables. Not only are these tools useful for practitioners to help determine why models fail stability, but they also have implications for auditors and regulators. In particular, reasons why models containing econometric variables may fail stability are explored and suggestions to improve model development are discussed. Full article
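The link to the Mahalanobis distance can be illustrated with a toy two-variable score: a PAI-style index computed as the average squared Mahalanobis distance of recent observations from the development-sample distribution, scaled by the number of variables. This is a hypothetical simplification for intuition, not the paper's exact definition of the PAI:

```python
def mahalanobis_sq_2d(x, mean, cov):
    """Squared Mahalanobis distance for a two-variable observation,
    inverting the 2x2 covariance matrix explicitly."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = [[cov[1][1] / det, -cov[0][1] / det],
           [-cov[1][0] / det, cov[0][0] / det]]
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

def pai_like(recent, dev_mean, dev_cov):
    """Average squared Mahalanobis distance of recent observations from
    the development distribution, divided by the number of variables, so
    a value near 1 suggests stability (a hypothetical PAI-style index)."""
    k = 2
    d2 = [mahalanobis_sq_2d(x, dev_mean, dev_cov) for x in recent]
    return sum(d2) / (len(d2) * k)

dev_mean, dev_cov = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
stable = pai_like([[0.1, -0.2], [-0.3, 0.1]], dev_mean, dev_cov)
shifted = pai_like([[2.1, 1.8], [1.9, 2.2]], dev_mean, dev_cov)
```

A mean shift in the explanatory variables inflates the index, which is exactly the decomposition-by-mean-shift diagnostic the paper develops.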
23 pages, 2132 KiB  
Article
Regulation and De-Risking: Theoretical and Empirical Insights
by Lawrence Haar and Andros Gregoriou
Risks 2023, 11(6), 104; https://doi.org/10.3390/risks11060104 - 2 Jun 2023
Viewed by 2855
Abstract
The purpose of the Bank for International Settlements regulatory agenda, as implemented by financial regulators globally, has been to make banks safer and reduce the likelihood of systemic events. Using an original model of bank profit maximisation under a regulatory constraint, we statistically examine how market risk exposure has interacted with financial performance and capital structure, to see whether the Basel regulatory agenda concerning the quantity, quality and liquidity of capital has prompted changes in banking behaviour as measured by exposure to market risk. Breaking new ground, we empirically explore how the regulatory agenda has affected the largest banks. We analyse whether the regulatory agenda has succeeded in aligning the cost of capital with exposure to market risk, measured by Value at Risk, or whether regulations have induced changes to banking activities. We find that rather than inducing changes to the rate at which unchanged risk exposure is capitalised, regulation leads to changes in the nature of exposures. Risk has declined along with financial performance, while the cost of capital is largely unchanged. A consequence of regulation may be to encourage the migration of riskier activities to organisations where they may be borne more cheaply. Full article
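Market risk exposure "measured by Value at Risk" can be sketched with the plain historical-simulation estimator; the return series below is illustrative, not drawn from the banks studied:

```python
def historical_var(returns, alpha=0.99):
    """Historical-simulation Value at Risk: the alpha-quantile of the
    empirical loss (negative return) distribution. A standard textbook
    sketch, not the paper's measurement."""
    losses = sorted(-r for r in returns)
    idx = int(alpha * len(losses))  # index of the alpha-quantile loss
    return losses[min(idx, len(losses) - 1)]

# Illustrative daily returns
rets = [0.01, -0.02, 0.005, -0.03, 0.012, -0.008, 0.02, -0.015, 0.003, -0.001]
var99 = historical_var(rets, alpha=0.99)
```

With a short sample, the high quantile collapses to the worst observed loss (here 3%), which is why regulatory VaR windows are long.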
(This article belongs to the Special Issue Risks: Feature Papers 2023)

10 pages, 263 KiB  
Article
Characteristics of Asthma Exacerbations in Emergency Care in Switzerland—Demographics, Treatment, and Burden of Disease in Patients with Asthma Exacerbations Presenting to an Emergency Department in Switzerland (CARE-S)
by Marco Rueegg, Jeannette-Marie Busch, Peter van Iperen, Joerg D. Leuppi and Roland Bingisser
J. Clin. Med. 2023, 12(8), 2857; https://doi.org/10.3390/jcm12082857 - 13 Apr 2023
Cited by 3 | Viewed by 3589
Abstract
Emergency care for asthma is provided by general practitioners, pulmonologists, and emergency departments (EDs). Although it is known that patients presenting to EDs with acute asthma exacerbations are a vulnerable population and that this mode of presentation is a risk marker for more severe complications, research on this population is scarce. We conducted a retrospective study on patients with asthma exacerbations who presented to the ED of the University Hospital Basel, Switzerland, during 2017–2020. Of the last 200 presentations, 100 were selected and analyzed to assess demographic information, the use of previous and ED-prescribed asthma medication, and clinical outcomes after a mean period of 18 months. Of these 100 asthma patients, 96 were self-presenters, and 43 had the second-highest degree of acuity (emergency severity index 2). Global Initiative for Asthma (GINA) steps 1 and 3 were the most common among patients with known GINA levels, accounting for 22 and 18 patients, respectively. A total of 4 patients were undergoing treatment with oral corticosteroids at presentation, and 34 were at discharge. At presentation, 38 patients used the combination therapy of inhaled corticosteroid/long-acting β2-agonist (ICS/LABA), and 6 patients underwent ICS monotherapy. At discharge, 68 patients were prescribed ICS/LABA. At entry to the ED, about one-third of patients did not use any asthma medication. In total, 10 patients were hospitalized; none of them needed invasive or non-invasive ventilation. This group of asthma patients seemed particularly vulnerable, as their asthma medication at presentation was often not in accordance with guidelines or even lacking, and almost all had self-presented to the ED without any referral from a physician. The majority of patients did not give consent to the collection of any follow-up information, precluding study follow-up. These shortcomings reflect an urgent medical need to improve care for patients at high risk of asthma exacerbations. Full article
(This article belongs to the Section Emergency Medicine)
13 pages, 3659 KiB  
Article
Kinetic Photovoltaic Facade System Based on a Parametric Design for Application in Signal Box Buildings in Switzerland
by Ho Soon Choi
Appl. Sci. 2023, 13(7), 4633; https://doi.org/10.3390/app13074633 - 6 Apr 2023
Cited by 6 | Viewed by 5005
Abstract
This study aims to produce renewable energy by applying a solar-energy-harvesting architectural design using solar panels on the facade of a building. To install as many solar panels as possible on the building elevation, the Signal Box auf dem Wolf, located in Basel, Switzerland, was selected as the research target. The solar panels to be installed on the facade of the Signal Box auf dem Wolf are planned such that they are able to move according to the optimal tilt angle every month to allow maximal energy generation. The kinetic photovoltaic facade system and the simulation of renewable energy generation were implemented using a parametric design. The novelty of this study is the development of a kinetic photovoltaic facade system using a parametric design algorithm. From the perspective of renewable energy in the field of architecture, the kinetic photovoltaic facade system developed in this study has the advantage of producing maximal renewable energy according to the optimal tilt angle of the solar panels. Additionally, building facades that move according to the optimal tilt angle will contribute to the expansion of the field of sustainable architectural design. Full article
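The idea of a monthly optimal tilt angle can be approximated with a first-order rule of thumb: latitude minus the mid-month solar declination (Cooper's formula) for an equator-facing panel. This is a back-of-the-envelope sketch, far coarser than the paper's parametric-design simulation, and the latitude value used for Basel is approximate:

```python
import math

def solar_declination(day_of_year):
    """Cooper's approximation of the solar declination, in degrees."""
    return 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))

def monthly_optimal_tilt(latitude_deg):
    """Rule-of-thumb monthly tilt for an equator-facing fixed panel:
    latitude minus the mid-month solar declination (Klein's recommended
    average days per month). A first-order sketch only."""
    mid_month_days = [17, 47, 75, 105, 135, 162, 198, 228, 258, 288, 318, 344]
    return [latitude_deg - solar_declination(d) for d in mid_month_days]

tilts = monthly_optimal_tilt(47.56)  # approximate latitude of Basel
```

The sketch reproduces the qualitative behaviour the facade system exploits: shallow tilts in summer (around 24° in June at Basel's latitude) and steep tilts in winter (around 70° in December).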

20 pages, 428 KiB  
Article
Exploring Industry-Distress Effects on Loan Recovery: A Double Machine Learning Approach for Quantiles
by Hui-Ching Chuang and Jau-er Chen
Econometrics 2023, 11(1), 6; https://doi.org/10.3390/econometrics11010006 - 14 Feb 2023
Cited by 2 | Viewed by 4552
Abstract
In this study, we explore the effect of industry distress on recovery rates by using unconditional quantile regression (UQR). The UQR provides better interpretative, and thus policy-relevant, information on the predictive effect of the target variable than conditional quantile regression. To deal with a broad set of macroeconomic and industry variables, we use lasso-based double selection to estimate the predictive effects of industry distress and select relevant variables. Our sample consists of 5334 debt and loan instruments in Moody’s Default and Recovery Database from 1990 to 2017. The results show that industry distress decreases recovery rates from 15.80% to 2.94% for the 15th to 55th percentile range and slightly increases recovery rates in the lower and upper tails. The UQR provides a quantitative measurement of the downturn loss given default that the Basel Capital Accord requires. Full article
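UQR is typically implemented by regressing the recentered influence function (RIF) of the target quantile on covariates, with RIF(y; q_tau) = q_tau + (tau - 1{y <= q_tau}) / f(q_tau). Below is a minimal sketch of the RIF step with an illustrative sample and a kernel density estimate for f, assuming the Firpo-style construction rather than the paper's exact pipeline:

```python
import math

def rif_quantile(y, tau):
    """Recentered influence function of the tau-quantile:
    RIF(y_i) = q_tau + (tau - 1{y_i <= q_tau}) / f(q_tau),
    with f estimated by a Gaussian kernel (Silverman bandwidth).
    Regressing these values on covariates gives the UQR."""
    n = len(y)
    q = sorted(y)[int(tau * n)]
    mean = sum(y) / n
    sd = (sum((v - mean) ** 2 for v in y) / (n - 1)) ** 0.5
    h = 1.06 * sd * n ** -0.2  # Silverman's rule-of-thumb bandwidth
    f = sum(math.exp(-0.5 * ((q - v) / h) ** 2)
            for v in y) / (n * h * math.sqrt(2 * math.pi))
    return [q + (tau - (1.0 if v <= q else 0.0)) / f for v in y]

# Illustrative recovery rates (not Moody's data)
y = [0.1, 0.3, 0.35, 0.4, 0.55, 0.6, 0.7, 0.8, 0.9, 0.95]
rif = rif_quantile(y, tau=0.5)
```

Each observation maps to one of two values, above or below the quantile; an OLS of these RIF values on an industry-distress indicator estimates its unconditional quantile effect.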

11 pages, 898 KiB  
Article
Outcome of Hospitalized Cancer Patients with Hypernatremia: A Retrospective Case-Control Study
by Jessica del Rio and Martin Buess
Curr. Oncol. 2022, 29(11), 8814-8824; https://doi.org/10.3390/curroncol29110693 - 16 Nov 2022
Cited by 3 | Viewed by 3477
Abstract
Hypernatremia (>145 mmol/L) is a relatively rare event, and the data regarding its role in the outcome of inpatients on an oncology ward are weak. The aim of this study was to describe the prevalence, prognosis, and outcome of hospitalized cancer patients with hypernatremia. We performed a retrospective case-control study of data obtained from inpatients with a solid tumor at the St. Claraspital, Basel, Switzerland, who were admitted between 2017 and 2020. The primary endpoint was overall survival. Hypernatremia was found in 93 (3.16%) of 2945 inpatients with cancer or lymphoma. From 991 eligible normonatremic control patients, 93 were matched according to diagnosis, age, and sex. The median overall survival time (OS) of patients with hypernatremia was 1.5 months, compared to 11.7 months for the normonatremic controls (HR 2.69, 95% CI 1.85–3.90, p < 0.0001). The OS of patients with irreversible compared to reversible hypernatremia was significantly shorter (23 versus 88 days, HR 4.0, 95% CI 2.04–7.70, p < 0.0001). The length of hospital stay was significantly longer for the hypernatremic than for the normonatremic group (p < 0.0001). Significantly more patients with hypernatremia died in the hospital (30.1% versus 8.6%, p < 0.001). These results suggest that hypernatremia is associated with an unfavorable outcome and a very short OS. Full article
(This article belongs to the Section Palliative and Supportive Care)

21 pages, 6151 KiB  
Article
Fast 3D Face Reconstruction from a Single Image Using Different Deep Learning Approaches for Facial Palsy Patients
by Duc-Phong Nguyen, Tan-Nhu Nguyen, Stéphanie Dakpé, Marie-Christine Ho Ba Tho and Tien-Tuan Dao
Bioengineering 2022, 9(11), 619; https://doi.org/10.3390/bioengineering9110619 - 27 Oct 2022
Cited by 5 | Viewed by 6446
Abstract
The 3D reconstruction of an accurate face model is essential for delivering reliable feedback for clinical decision support. Medical imaging and specific depth sensors are accurate but not suitable for an easy-to-use and portable tool. The recent development of deep learning (DL) models opens new challenges for 3D shape reconstruction from a single image. However, the 3D face shape reconstruction of facial palsy patients is still a challenge, and this has not been investigated. The contribution of the present study is to apply these state-of-the-art methods to reconstruct the 3D face shape models of facial palsy patients in natural and mimic postures from one single image. Three different methods (3D Basel Morphable model and two 3D Deep Pre-trained models) were applied to the dataset of two healthy subjects and two facial palsy patients. The reconstructed outcomes were compared to the 3D shapes reconstructed using Kinect-driven and MRI-based information. As a result, the best mean error of the reconstructed face according to the Kinect-driven reconstructed shape is 1.5±1.1 mm. The best error range is 1.9±1.4 mm when compared to the MRI-based shapes. Before using the procedure to reconstruct the 3D faces of patients with facial palsy or other facial disorders, several ideas for increasing the accuracy of the reconstruction can be discussed based on the results. This present study opens new avenues for the fast reconstruction of the 3D face shapes of facial palsy patients from a single image. As perspectives, the best DL method will be implemented into our computer-aided decision support system for facial disorders. Full article
(This article belongs to the Special Issue Multiscale Modeling in Computational Biomechanics)

22 pages, 6403 KiB  
Article
The COVID-19 Housing Boom: Is a 2007–2009-Type Crisis on the Horizon?
by Diamando Afxentiou, Peter Harris and Paul Kutasovic
J. Risk Financial Manag. 2022, 15(8), 371; https://doi.org/10.3390/jrfm15080371 - 22 Aug 2022
Cited by 8 | Viewed by 4894
Abstract
While the current housing market remains relatively strong, with housing prices setting records, concerns are growing about a potential housing bubble similar to that of 2007–2009. This paper compares the current housing market environment with that of 2007–2009 and concludes that many of the factors that caused the 2007–2009 crisis do not exist today. Factors associated with subprime mortgages, poor and non-existent underwriting loan requirements, weak regulatory oversight, exaggerated credit ratings, under-capitalization in the banking sector, and excessive speculative activity in the housing market have been addressed by regulation aimed at preventing another financial crisis similar to 2007–2009. Equally important, major fundamental factors affecting real estate valuation are providing support for the housing market and housing prices; these factors affect both the demand and supply sides of the housing market. They include the lack of inventories of homes available for sale, the underproduction of housing, decreased household mobility limiting supply, and the increase in housing demand from millennials and institutional investors; these fundamental factors were not evident during the 2007–2009 period. Despite a number of indicators signaling a potential topping out and overvaluation of housing prices, the authors conclude that these fundamental factors will limit the extent to which the housing market weakens over the next few years. Full article
