Special Issue "Computational Aspects and Software in Psychometrics II"

A special issue of Psych (ISSN 2624-8611). This special issue belongs to the section "Psychometrics and Educational Measurement".

Deadline for manuscript submissions: 31 March 2023

Special Issue Editor

Dr. Alexander Robitzsch
Guest Editor
IPN – Leibniz Institute for Science and Mathematics Education, University of Kiel, Olshausenstraße 62, 24118 Kiel, Germany
Interests: item response models; linking; methodology in large-scale assessments; multilevel models; missing data; cognitive diagnostic models; Bayesian methods and regularization

Special Issue Information

Dear Colleagues,

Statistical software in psychometrics has made tremendous progress in providing open-source solutions. The focus of this Special Issue is on computational aspects and statistical algorithms for psychometric methods. Software articles introducing new packages, as well as software tutorials, are particularly welcome, and simulation studies and papers on implementation aspects of psychometric models fit this Special Issue equally well. We also invite software review articles that evaluate a single package, survey several packages, or compare packages empirically. Potential psychometric models include (but are not limited to) item response models, structural equation models, multilevel models, latent class models, cognitive diagnostic models, and mixture models.

Dr. Alexander Robitzsch
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Psych is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • statistical software
  • estimation algorithms
  • software tutorials
  • software reviews
  • item response models
  • multilevel models
  • structural equation models
  • latent class models
  • cognitive diagnostic models
  • mixture models
  • open source software

Published Papers (2 papers)


Research

Article
What Is the Maximum Likelihood Estimate When the Initial Solution to the Optimization Problem Is Inadmissible? The Case of Negatively Estimated Variances
Psych 2022, 4(3), 343-356; https://doi.org/10.3390/psych4030029 - 30 Jun 2022
Abstract
The default procedures of the software programs Mplus and lavaan tend to yield an inadmissible solution (also called a Heywood case) when the sample is small or the parameter is close to the boundary of the parameter space. In factor models, a negatively estimated variance often occurs. One strategy to deal with this is to fix the variance at zero and then estimate the model again in order to obtain estimates of the remaining model parameters. In the present article, we present one possible approach for justifying this strategy. Specifically, using a simple one-factor model as an example, we show that the maximum likelihood (ML) estimate of the variance of the latent factor is zero when the initial solution to the optimization problem (i.e., the solution provided by the default procedure) is a negative value. The basis of our argument is the very definition of ML estimation, which requires that the log-likelihood be maximized over the parameter space. We present the results of a small simulation study, which was conducted to evaluate the proposed ML procedure and compare it with Mplus’ default procedure. We found that the proposed ML procedure increased estimation accuracy compared to Mplus’ procedure, rendering the ML procedure an attractive option for dealing with inadmissible solutions.
(This article belongs to the Special Issue Computational Aspects and Software in Psychometrics II)
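The core of the argument summarized in the abstract above can be illustrated with a small, hypothetical Python sketch (this is not the paper's code, and the quadratic `log_lik` below is merely a stand-in for a profile log-likelihood in the factor variance `psi`): when the log-likelihood is maximized over the admissible parameter space rather than over the whole real line, a variance whose unconstrained optimum is negative attains its ML estimate at the boundary, i.e., at zero.

```python
# Toy illustration: ML estimation over the admissible parameter space
# psi >= 0 versus an unconstrained optimization that can land at an
# inadmissible (negative) variance, as in a Heywood case.
from scipy.optimize import minimize_scalar

def log_lik(psi):
    # Stand-in concave log-likelihood whose unconstrained maximum sits at
    # psi = -0.3, an inadmissible value for a variance parameter.
    return -(psi + 0.3) ** 2

# Unconstrained optimization (roughly what a default procedure may report).
unconstrained = minimize_scalar(lambda p: -log_lik(p))

# ML estimation proper: maximize the log-likelihood over psi >= 0.
constrained = minimize_scalar(lambda p: -log_lik(p),
                              bounds=(0.0, 10.0), method="bounded")

print(round(unconstrained.x, 3))  # inadmissible negative optimum
print(round(constrained.x, 3))    # ML estimate on the boundary (zero)
```

Because the log-likelihood is decreasing on the admissible side of its unconstrained maximum, the bounded search ends at the boundary value, which mirrors the paper's justification for fixing the variance to zero.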
Article
Dealing with Missing Responses in Cognitive Diagnostic Modeling
Psych 2022, 4(2), 318-342; https://doi.org/10.3390/psych4020028 - 14 Jun 2022
Abstract
Missing data are a common problem in educational assessment settings. In the implementation of cognitive diagnostic models (CDMs), the presence and/or inappropriate treatment of missingness may yield biased parameter estimates and diagnostic information. Using simulated data, this study evaluates ten approaches for handling missing data in a commonly applied CDM (the deterministic inputs, noisy “and” gate (DINA) model): treating missing data as incorrect (IN), person mean (PM) imputation, item mean (IM) imputation, two-way (TW) imputation, response function (RF) imputation, logistic regression (LR), expectation-maximization (EM) imputation, full information maximum likelihood (FIML) estimation, predictive mean matching (PMM), and random imputation (RI). Specifically, the current study investigates how the estimation accuracy of item parameters and examinees’ attribute profiles from the DINA model is impacted by the presence of missing data and the selection of missing data methods across conditions. While no single method was found to be superior to all others across all conditions, the results support the use of FIML, PMM, LR, and EM in recovering item parameters. The selected methods, except for PM, performed similarly across conditions regarding attribute classification accuracy. Recommendations for the treatment of missing responses for CDMs are provided. Limitations and future directions are discussed.
(This article belongs to the Special Issue Computational Aspects and Software in Psychometrics II)
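As a hypothetical illustration of two of the simpler treatments named in the abstract above (this is not the study's code, and the response matrix is invented), the sketch below applies the treating-missing-as-incorrect (IN) and person-mean (PM) imputation rules to a small 0/1 response matrix in which `np.nan` marks a missing response.

```python
# Toy sketch of two missing-response treatments for dichotomous item data:
# IN (score missing as incorrect) and PM (person-mean imputation).
import numpy as np

# 0/1 responses for 3 examinees on 4 items; np.nan marks a missing response.
X = np.array([
    [1.0, 0.0, np.nan, 1.0],
    [0.0, np.nan, 0.0, 0.0],
    [1.0, 1.0, 1.0, np.nan],
])

def impute_incorrect(X):
    """IN: score every missing response as incorrect (0)."""
    return np.where(np.isnan(X), 0.0, X)

def impute_person_mean(X):
    """PM: replace each missing response by that examinee's mean
    over his or her observed responses."""
    person_means = np.nanmean(X, axis=1, keepdims=True)
    return np.where(np.isnan(X), person_means, X)

print(impute_incorrect(X))
print(impute_person_mean(X))
```

The more elaborate approaches compared in the article (e.g., FIML or predictive mean matching) operate on the same kind of incomplete response matrix but use model-based information rather than simple substitution rules.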
