Search Results (16)

Search Parameters:
Authors = Martin Hecht ORCID = 0000-0002-5168-4911

16 pages, 3879 KiB  
Article
Prognostic Value of Speckle Tracking Echocardiography-Derived Strain in Unmasking Risk for Arrhythmias in Children with Myocarditis
by Nele Rolfs, Cynthia Huber, Bernd Opgen-Rhein, Isabell Altmann, Felix Anderheiden, Tobias Hecht, Marcus Fischer, Gesa Wiegand, Katja Reineker, Inga Voges, Daniela Kiski, Wiebke Frede, Martin Boehne, Malika Khedim, Daniel Messroghli, Karin Klingel, Eicke Schwarzkopf, Thomas Pickardt, Stephan Schubert, Fatima I. Lunze and Franziska Seidel
Biomedicines 2024, 12(10), 2369; https://doi.org/10.3390/biomedicines12102369 - 16 Oct 2024
Cited by 2 | Viewed by 1711
Abstract
Background/Objectives: Risk assessment in pediatric myocarditis is challenging, particularly when left ventricular ejection fraction (LVEF) is preserved. This study aimed to evaluate LV myocardial deformation using speckle-tracking echocardiography (STE)-derived longitudinal strain (LS) and to assess its diagnostic and prognostic value in children with myocarditis. Methods: Retrospective STE-derived layer-specific LV LS analysis was performed on echocardiograms from patients within the multicenter, prospective registry for pediatric myocarditis “MYKKE”. Age- and sex-adjusted logistic regression and ROC analysis identified predictors of cardiac arrhythmias (ventricular tachycardia, ventricular fibrillation, atrioventricular block III°) and major adverse cardiac events (MACE: need for mechanical circulatory support (MCS), cardiac transplantation, and/or cardiac death). Results: Echocardiograms from 175 patients (median age 15 years, IQR 7.9–16.5 years; 70% male) across 13 centers were included. Cardiac arrhythmias occurred in 36 patients (21%), and MACE in 28 patients (16%). Impaired LV LS strongly correlated with reduced LVEF (r > 0.8). Impaired layer-specific LV LS, reduced LVEF, LV dilatation, and increased BSA-indexed LV mass were associated with the occurrence of MACE and cardiac arrhythmias. In patients with preserved LVEF, LV LS alone predicted cardiac arrhythmias (p < 0.001), with optimal cutoff values of −18.0% for endocardial LV LS (sensitivity 0.69, specificity 0.94) and −17.0% for midmyocardial LV LS (sensitivity 0.81, specificity 0.75). Conclusions: In pediatric myocarditis, STE-derived LV LS is not only a valuable tool for assessing systolic myocardial dysfunction and predicting MACE but also identifies patients at risk for cardiac arrhythmias, even in the context of preserved LVEF.
14 pages, 4399 KiB  
Article
Reducing Loneliness through the Power of Practicing Together: A Randomized Controlled Trial of Online Dyadic Socio-Emotional vs. Mindfulness-Based Training
by Hannah Matthaeus, Malvika Godara, Sarita Silveira, Martin Hecht, Manuel Voelkle and Tania Singer
Int. J. Environ. Res. Public Health 2024, 21(5), 570; https://doi.org/10.3390/ijerph21050570 - 29 Apr 2024
Cited by 6 | Viewed by 2834
Abstract
Loneliness has become a pressing topic, especially among young adults and during the COVID-19 pandemic. In a randomized controlled trial with 253 healthy adults, we evaluated the differential efficacy of two 10-week app-delivered mental training programs: one based on classic mindfulness and one on an innovative partner-based socio-emotional practice (Affect Dyad). We show that the partner-based training resulted in greater reductions in loneliness than the mindfulness-based training. This effect was shown on three measures of loneliness: general loneliness assessed with the 20-item UCLA Loneliness Scale, state loneliness queried over an 8-day ecological momentary assessment in participants’ daily lives, and loneliness ratings collected before and after the daily practice. Our study provides evidence for the higher efficacy of a mental training approach based on a 12 min practice conducted with a partner in reducing loneliness and provides a novel, scalable online approach to reduce the increasing problem of loneliness in society.
(This article belongs to the Special Issue Public Health Consequences of Social Isolation and Loneliness)
17 pages, 1697 KiB  
Article
A SAS Macro for Automated Stopping of Markov Chain Monte Carlo Estimation in Bayesian Modeling with PROC MCMC
by Wolfgang Wagner, Martin Hecht and Steffen Zitzmann
Psych 2023, 5(3), 966-982; https://doi.org/10.3390/psych5030063 - 5 Sep 2023
Cited by 3 | Viewed by 1780
Abstract
A crucial challenge in Bayesian modeling using Markov chain Monte Carlo (MCMC) estimation is to diagnose the convergence of the chains so that the draws can be expected to closely approximate the posterior distribution on which inference is based. A close approximation guarantees that the MCMC error exhibits only a negligible impact on model estimates and inferences. However, determining whether convergence has been achieved can often be challenging and cumbersome when relying solely on inspecting the trace plots of the chain(s) or manually checking the stopping criteria. In this article, we present a SAS macro called %automcmc that is based on PROC MCMC and that automatically continues to add draws until a user-specified stopping criterion (i.e., a certain potential scale reduction and/or a certain effective sample size) is reached for the chain(s).
(This article belongs to the Special Issue Computational Aspects and Software in Psychometrics II)
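The stopping logic such a macro automates (keep appending draws until the potential scale reduction falls below a prespecified maximum) can be sketched in Python; the function names and the batch-wise sampler interface below are illustrative assumptions, not part of %automcmc or PROC MCMC:

```python
import numpy as np

def potential_scale_reduction(chains):
    """Gelman-Rubin PSR (R-hat) for an array of shape (n_chains, n_draws)."""
    n = chains.shape[1]
    chain_means = chains.mean(axis=1)
    within = chains.var(axis=1, ddof=1).mean()      # W: mean within-chain variance
    between = n * chain_means.var(ddof=1)           # B: between-chain variance
    var_plus = (n - 1) / n * within + between / n   # pooled variance estimate
    return np.sqrt(var_plus / within)

def run_until_converged(draw_more, max_psr=1.01, batch=1000, max_draws=100_000):
    """Append batches of draws until the PSR criterion (or a draw cap) is met.

    draw_more(k) must return a (n_chains, k) array of fresh draws."""
    chains = draw_more(batch)
    while potential_scale_reduction(chains) > max_psr and chains.shape[1] < max_draws:
        chains = np.concatenate([chains, draw_more(batch)], axis=1)
    return chains
```

An effective-sample-size criterion could be checked in the same loop alongside (or instead of) the PSR.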
13 pages, 271 KiB  
Article
Accurate Standard Errors in Multilevel Modeling with Heteroscedasticity: A Computationally More Efficient Jackknife Technique
by Steffen Zitzmann, Sebastian Weirich and Martin Hecht
Psych 2023, 5(3), 757-769; https://doi.org/10.3390/psych5030049 - 21 Jul 2023
Cited by 4 | Viewed by 2194
Abstract
In random-effects models, hierarchical linear models, or multilevel models, it is typically assumed that the variances within higher-level units are homoscedastic, meaning that they are equal across these units. However, this assumption is often violated in research. Depending on the degree of violation, this can lead to biased standard errors of higher-level parameters and thus to incorrect inferences. In this article, we describe a resampling technique for obtaining standard errors—Zitzmann’s jackknife. We conducted a Monte Carlo simulation study to compare the technique with the commonly used delete-1 jackknife, the robust standard error in Mplus, and a modified version of the commonly used delete-1 jackknife. Findings revealed that the resampling techniques clearly outperformed the robust standard error in rather small samples with high levels of heteroscedasticity. Moreover, Zitzmann’s jackknife tended to perform somewhat better than the two versions of the delete-1 jackknife and was much faster.
(This article belongs to the Special Issue Computational Aspects and Software in Psychometrics II)
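For context, the delete-1 jackknife the authors compare against removes one higher-level unit at a time and recomputes the estimate from the remaining units. A minimal sketch (the function name and list-of-groups interface are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def jackknife_se(groups, statistic):
    """Delete-1 jackknife standard error, deleting one higher-level unit
    (e.g., one cluster) at a time.

    groups: list of 1-D arrays, one per higher-level unit
    statistic: function mapping a list of groups to a scalar estimate"""
    J = len(groups)
    # leave-one-unit-out re-estimates of the statistic
    loo = np.array([statistic(groups[:j] + groups[j + 1:]) for j in range(J)])
    return np.sqrt((J - 1) / J * ((loo - loo.mean()) ** 2).sum())
```

For example, passing the grand mean over all pooled observations as `statistic` yields a cluster-level jackknife standard error of the mean.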
22 pages, 2560 KiB  
Article
Heterogeneous Mental Health Responses to the COVID-19 Pandemic in Germany: An Examination of Long-Term Trajectories, Risk Factors, and Vulnerable Groups
by Malvika Godara, Jessie Rademacher, Martin Hecht, Sarita Silveira, Manuel C. Voelkle and Tania Singer
Healthcare 2023, 11(9), 1305; https://doi.org/10.3390/healthcare11091305 - 3 May 2023
Cited by 10 | Viewed by 2776
Abstract
Abundant studies have examined mental health in the early periods of the COVID-19 pandemic. However, empirical work examining the mental health impact of the pandemic’s subsequent phases remains limited. In the present study, we investigated how mental vulnerability and resilience evolved over the various phases of the pandemic in 2020 and 2021 in Germany. Data were collected (n = 3522) across seven measurement occasions using validated and self-generated measures of vulnerability and resilience. We found evidence for an immediate increase in vulnerability during the first lockdown in Germany, a trend towards recovery when lockdown measures were eased, and an increase in vulnerability with each passing month of the second lockdown. Four different latent trajectories of resilience–vulnerability emerged, with the majority of participants displaying a rather resilient trajectory, but nearly 30% of the sample fell into the more vulnerable groups. Females, younger individuals, those with a history of psychiatric disorders, lower income groups, and those with high trait vulnerability and low trait social belonging were more likely to exhibit trajectories associated with poorer mental well-being. Our findings indicate that resilience–vulnerability responses in Germany during the COVID-19 pandemic may have been more complex than previously thought, identifying risk groups that could benefit from greater support.
14 pages, 351 KiB  
Article
What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances
by Steffen Zitzmann, Julia-Kim Walther, Martin Hecht and Benjamin Nagengast
Psych 2022, 4(3), 343-356; https://doi.org/10.3390/psych4030029 - 30 Jun 2022
Cited by 5 | Viewed by 2270
Abstract
The default procedures of the software programs Mplus and lavaan tend to yield an inadmissible solution (also called a Heywood case) when the sample is small or the parameter is close to the boundary of the parameter space. In factor models, a negatively estimated variance often occurs. One strategy to deal with this is fixing the variance to zero and then estimating the model again in order to obtain the estimates of the remaining model parameters. In the present article, we present one possible approach for justifying this strategy. Specifically, using a simple one-factor model as an example, we show that the maximum likelihood (ML) estimate of the variance of the latent factor is zero when the initial solution to the optimization problem (i.e., the solution provided by the default procedure) is a negative value. The basis of our argument is the very definition of ML estimation, which requires that the log-likelihood be maximized over the parameter space. We present the results of a small simulation study, which was conducted to evaluate the proposed ML procedure and compare it with Mplus’ default procedure. We found that the proposed ML procedure increased estimation accuracy compared to Mplus’ procedure, rendering the ML procedure an attractive option to deal with inadmissible solutions.
(This article belongs to the Special Issue Computational Aspects and Software in Psychometrics II)
30 pages, 2939 KiB  
Article
Coping with the COVID-19 Pandemic: Perceived Changes in Psychological Vulnerability, Resilience and Social Cohesion before, during and after Lockdown
by Sarita Silveira, Martin Hecht, Hannah Matthaeus, Mazda Adli, Manuel C. Voelkle and Tania Singer
Int. J. Environ. Res. Public Health 2022, 19(6), 3290; https://doi.org/10.3390/ijerph19063290 - 10 Mar 2022
Cited by 42 | Viewed by 7848
Abstract
The COVID-19 pandemic and associated lockdowns have posed unique and severe challenges to our global society. To gain an integrative understanding of pervasive social and mental health impacts in 3522 Berlin residents aged 18 to 65, we systematically investigated the structural and temporal relationship between a variety of psychological indicators of vulnerability, resilience and social cohesion before, during and after the first lockdown in Germany using a retrospective longitudinal study design. Factor analyses revealed that (a) vulnerability and resilience indicators converged on one general bipolar factor, (b) residual variance of resilience indicators formed a distinct factor of adaptive coping capacities and (c) social cohesion could be reliably measured with a hierarchical model including four first-order dimensions of trust, a sense of belonging, social interactions and social engagement, and one second-order social cohesion factor. In the second step, latent change score models revealed that overall psychological vulnerability increased during the first lockdown and decreased again during re-opening, although not to baseline levels. Levels of social cohesion, in contrast, first decreased and then increased again during re-opening. Furthermore, participants who increased in vulnerability simultaneously decreased in social cohesion and adaptive coping during lockdown. While higher pre-lockdown levels of social cohesion predicted a stronger lockdown effect on mental health, individuals with higher social cohesion during the lockdown and positive change in coping abilities and social cohesion during re-opening showed better mental health recovery, highlighting the important role of social capacities in both amplifying and overcoming the multiple challenges of this collective crisis.
(This article belongs to the Special Issue Social and Emotional Impact of the COVID-19 Pandemic)
15 pages, 311 KiB  
Article
A Bayesian EAP-Based Nonlinear Extension of Croon and Van Veldhoven’s Model for Analyzing Data from Micro–Macro Multilevel Designs
by Steffen Zitzmann, Julian F. Lohmann, Georg Krammer, Christoph Helm, Burak Aydin and Martin Hecht
Mathematics 2022, 10(5), 842; https://doi.org/10.3390/math10050842 - 7 Mar 2022
Cited by 7 | Viewed by 2617
Abstract
Croon and van Veldhoven discussed a model for analyzing micro–macro multilevel designs in which a variable measured at the upper level is predicted by an explanatory variable that is measured at the lower level. Additionally, the authors proposed an approach for estimating this model. In their approach, estimation is carried out by running a regression analysis on Bayesian expected a posteriori (EAP) estimates. In this article, we present an extension of this approach to interaction and quadratic effects of explanatory variables. Specifically, we define the Bayesian EAPs, discuss a way for estimating them, and show how their estimates can be used to obtain the interaction and the quadratic effects. We present the results of a “proof of concept” via Monte Carlo simulation, which we conducted to validate our approach and to compare two resampling procedures for obtaining standard errors. Finally, we discuss limitations of our proposed extended Bayesian EAP-based approach.
(This article belongs to the Special Issue Bayesian Inference and Modeling with Applications)
17 pages, 7602 KiB  
Article
Comparative Study on Matching Methods for the Distinction of Building Modifications and Replacements Based on Multi-Temporal Building Footprint Data
by Martin Schorcht, Robert Hecht and Gotthard Meinel
ISPRS Int. J. Geo-Inf. 2022, 11(2), 91; https://doi.org/10.3390/ijgi11020091 - 27 Jan 2022
Cited by 1 | Viewed by 3166
Abstract
We compare different matching methods for distinguishing building modifications from replacements based on multi-temporal building footprint geometries from 3D city models. Manually referenced footprints of building changes were used to determine which thresholds are suitable for distinction. In addition, since the underlying LoD1 (Level of Detail 1) data is highly accurate, randomly generated position deviations were added to allow for transferability to less well-matched data. In order to generate a defined position deviation, a novel method was developed. This allows determination of the effects of position deviations on accuracy. Determination of these methods’ suitability for manipulation of data from sources of different levels of generalization (cross-scale matching) is therefore not the focus of this work. In detail, the methods of ‘Common Area Ratio’, ‘Common Boundary Ratio’, ‘Hausdorff Distance’ and ‘PoLiS’ (Polygon and Line Segment based metric) were compared. In addition, we developed an extended line-based procedure, which we called ‘Intersection Boundary Ratio’. This method was shown to be more robust than the previous matching methods for small position deviations. Furthermore, we addressed the question of whether a minimum function at PoLiS and Hausdorff distance is more suitable to distinguish between modification and replacement.
5 pages, 193 KiB  
Editorial
Geospatial Modeling Approaches to Historical Settlement and Landscape Analysis
by Hendrik Herold, Martin Behnisch, Robert Hecht and Stefan Leyk
ISPRS Int. J. Geo-Inf. 2022, 11(2), 75; https://doi.org/10.3390/ijgi11020075 - 19 Jan 2022
Cited by 4 | Viewed by 3165
Abstract
Landscapes and human settlements evolve over long periods of time. Land change, as one of the drivers of the ecological crisis in the Anthropocene, therefore, needs to be studied with a long-term perspective. Over the past decades, a substantial body of research has accumulated in the field of land change science. The quantitative geospatial analysis of land change, however, still faces many challenges, be it methodological or related to data accessibility. This editorial introduces several scientific contributions to an open-access Special Issue on historical settlement and landscape analysis. The featured articles cover all phases of the analysis process in this field: from the exploration and geocoding of data sources and the acquisition and processing of data to the adequate visualization and application of the retrieved historical geoinformation for knowledge generation. The data used in this research include archival maps, cadastral and master plans, crowdsourced data, airborne LiDAR and satellite-based data products. From a geographical perspective, the issue covers urban and rural regions in Central Europe and North America as well as regions subject to highly dynamic urbanization in East Asia. In view of global environmental challenges, both the need for long-term studies on land change within Earth system research and the current advancement in AI methods for the retrieval, processing and integration of historical geoinformation will further fuel this field of research.
(This article belongs to the Special Issue Historic Settlement and Landscape Analysis)
29 pages, 734 KiB  
Article
Comparing the MCMC Efficiency of JAGS and Stan for the Multi-Level Intercept-Only Model in the Covariance- and Mean-Based and Classic Parametrization
by Martin Hecht, Sebastian Weirich and Steffen Zitzmann
Psych 2021, 3(4), 751-779; https://doi.org/10.3390/psych3040048 - 30 Nov 2021
Cited by 13 | Viewed by 4597
Abstract
Bayesian MCMC is a widely used model estimation technique, and software from the BUGS family, such as JAGS, has been popular for over two decades. Recently, Stan entered the market with promises of higher efficiency fueled by advanced and more sophisticated algorithms. With this study, we want to contribute empirical results to the discussion about the sampling efficiency of JAGS and Stan. We conducted three simulation studies in which we varied the number of warmup iterations, the prior informativeness, and sample sizes and employed the multi-level intercept-only model in the covariance- and mean-based and in the classic parametrization. The target outcome was MCMC efficiency measured as effective sample size per second (ESS/s). Based on our specific (and limited) study setup, we found that (1) MCMC efficiency is much higher for the covariance- and mean-based parametrization than for the classic parametrization, (2) Stan clearly outperforms JAGS when the covariance- and mean-based parametrization is used, and that (3) JAGS clearly outperforms Stan when the classic parametrization is used.
12 pages, 397 KiB  
Article
Using the Effective Sample Size as the Stopping Criterion in Markov Chain Monte Carlo with the Bayes Module in Mplus
by Steffen Zitzmann, Sebastian Weirich and Martin Hecht
Psych 2021, 3(3), 336-347; https://doi.org/10.3390/psych3030025 - 30 Jul 2021
Cited by 15 | Viewed by 5245
Abstract
Bayesian modeling using Markov chain Monte Carlo (MCMC) estimation requires researchers to decide not only whether estimation has converged but also whether the Bayesian estimates are well-approximated by summary statistics from the chain. However, in software such as the Bayes module in Mplus, which helps researchers check whether convergence has been achieved by comparing the potential scale reduction (PSR) with a prespecified maximum PSR, the size of the MCMC error (or, equivalently, the effective sample size, ESS) is not monitored. Zitzmann and Hecht (2019) proposed a method that can be used to check whether a minimum ESS has been reached in Mplus. In this article, we evaluated this method with a computer simulation. Specifically, we fit a multilevel structural equation model to a large number of simulated data sets and compared different prespecified minimum ESS values with the actual (empirical) ESS values. The empirical values were approximately equal to or larger than the prespecified minimum ones, thus indicating the validity of the method.
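The effective sample size that serves as the stopping criterion can be estimated from a chain's autocorrelations. Below is a minimal single-chain sketch using the common initial-positive-sequence truncation; this particular estimator is an illustrative assumption, not necessarily the one used by Mplus or by Zitzmann and Hecht (2019):

```python
import numpy as np

def effective_sample_size(chain):
    """ESS of a single chain: N / (1 + 2 * sum of lag autocorrelations),
    truncating the sum at the first non-positive autocorrelation."""
    x = np.asarray(chain, dtype=float)
    n = x.size
    x = x - x.mean()
    # autocovariances at all lags (O(n^2); fine for moderate chain lengths)
    acov = np.array([x[: n - k] @ x[k:] for k in range(n)]) / n
    rho = acov / acov[0]
    s = 0.0
    for k in range(1, n):
        if rho[k] <= 0:
            break
        s += rho[k]
    return n / (1 + 2 * s)
```

For a nearly independent chain the ESS approaches the number of draws, while strong autocorrelation (e.g., an AR(1) chain with coefficient 0.9) shrinks it sharply, which is why a minimum-ESS rule is a stricter stopping criterion than PSR alone.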
10 pages, 1462 KiB  
Communication
Proof of Principle for Direct Reconstruction of Qualitative Depth Information from Turbid Media by a Single Hyper Spectral Image
by Martin Hohmann, Damaris Hecht, Benjamin Lengenfelder, Moritz Späth, Florian Klämpfl and Michael Schmidt
Sensors 2021, 21(8), 2860; https://doi.org/10.3390/s21082860 - 19 Apr 2021
Cited by 3 | Viewed by 2780
Abstract
In medical applications, hyper-spectral imaging is becoming more and more common. It has been shown to be more effective for classification and segmentation than normal RGB imaging because narrower wavelength bands are used, providing a higher contrast. However, until now, the fact that hyper-spectral images also contain information about the three-dimensional structure of turbid media has been neglected. In this study, it is shown that it is possible to derive information about the depth of inclusions in turbid phantoms from a single hyper-spectral image. Here, the depth information is encoded by a combination of scattering and absorption within the phantom. While scatter-dominated regions increase the backscattering for deep vessels, absorption has the opposite effect. With this argumentation, it makes sense to assume that, under certain conditions, a wavelength is not influenced by the depth of the inclusion and acts as an iso-point. This iso-point could be used to easily derive information about the depth of an inclusion. In this study, it is shown that the iso-point exists in some cases. Moreover, it is shown that the iso-point can be used to obtain precise depth information.
(This article belongs to the Collection Advances in Spectroscopy and Spectral Imaging)
25 pages, 4733 KiB  
Article
Mapping Public Urban Green Spaces Based on OpenStreetMap and Sentinel-2 Imagery Using Belief Functions
by Christina Ludwig, Robert Hecht, Sven Lautenbach, Martin Schorcht and Alexander Zipf
ISPRS Int. J. Geo-Inf. 2021, 10(4), 251; https://doi.org/10.3390/ijgi10040251 - 9 Apr 2021
Cited by 57 | Viewed by 18433
Abstract
Public urban green spaces are important for the urban quality of life. Still, comprehensive open data sets on urban green spaces are not available for most cities. As open and globally available data sets, the potential of Sentinel-2 satellite imagery and OpenStreetMap (OSM) data for urban green space mapping is high but limited due to their respective uncertainties. Sentinel-2 imagery cannot distinguish public from private green spaces and its spatial resolution of 10 m fails to capture fine-grained urban structures, while in OSM green spaces are not mapped consistently and with the same level of completeness everywhere. To address these limitations, we propose to fuse these data sets under explicit consideration of their uncertainties. The Sentinel-2 derived Normalized Difference Vegetation Index was fused with OSM data using the Dempster–Shafer theory to enhance the detection of small vegetated areas. The distinction between public and private green spaces was achieved using a Bayesian hierarchical model and OSM data. The analysis was performed based on land use parcels derived from OSM data and tested for the city of Dresden, Germany. The overall accuracy of the final map of public urban green spaces was 95% and was mainly influenced by the uncertainty of the public accessibility model.
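The NDVI input to the fusion step is a simple band ratio of Sentinel-2's near-infrared (B8) and red (B4) reflectances. A minimal sketch (the band assignment and the zero-division guard are assumptions for illustration, not the authors' code):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - red) / (NIR + red), e.g. from Sentinel-2 bands B8 and B4.

    eps guards against division by zero over water/no-data pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

A per-pixel vegetation mask could then be a simple threshold such as `ndvi(nir, red) > 0.3` before the evidence is combined with the OSM layer.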
21 pages, 5459 KiB  
Article
Mapping Long-Term Dynamics of Population and Dwellings Based on a Multi-Temporal Analysis of Urban Morphologies
by Robert Hecht, Hendrik Herold, Martin Behnisch and Mathias Jehling
ISPRS Int. J. Geo-Inf. 2019, 8(1), 2; https://doi.org/10.3390/ijgi8010002 - 21 Dec 2018
Cited by 22 | Viewed by 6690
Abstract
Information on the distribution and dynamics of dwellings and their inhabitants is essential to support decision-making in various fields such as energy provision, land use planning, risk assessment and disaster management. However, as various approaches to estimate the current distribution of population and dwellings exist, further evidence on past dynamics is needed for a better understanding of urban processes. This article therefore addresses the question of whether and how accurately historical distributions of dwellings and inhabitants can be reconstructed with commonly available geodata from national mapping and cadastral agencies. For this purpose, an approach for the automatic derivation of such information is presented. The data basis is constituted by a current digital landscape model and a 3D building model combined with historical land use information automatically extracted from historical topographic maps. For this purpose, methods of image processing, machine learning, change detection and dasymetric mapping are applied. The results for a study area in Germany show that it is possible to automatically derive decadal historical patterns of population and dwellings from 1950 to 2011 at the level of a 100 m grid with slight underestimations and acceptable standard deviations. By a differentiated analysis we were able to quantify the errors for different urban structure types.
(This article belongs to the Special Issue Historic Settlement and Landscape Analysis)