Special Issue "Feature Papers in Psychometrics and Educational Measurement"

A special issue of Psych (ISSN 2624-8611). This special issue belongs to the section "Psychometrics and Educational Measurement".

Deadline for manuscript submissions: 20 May 2023

Special Issue Editor

Dr. Alexander Robitzsch
Guest Editor
IPN – Leibniz Institute for Science and Mathematics Education, University of Kiel, Olshausenstraße 62, 24118 Kiel, Germany
Interests: item response models; linking; methodology in large-scale assessments; multilevel models; missing data; cognitive diagnostic models; Bayesian methods and regularization

Special Issue Information

Dear Colleagues,

This Special Issue, “Feature Papers in Psychometrics and Educational Measurement”, aims to collect high-quality original articles, reviews, research notes, and short communications in the cutting-edge field of psychometrics and educational measurement. We encourage the Editorial Board Members of Psych to contribute feature papers reflecting the latest progress in their field of research or to invite relevant experts and colleagues to do so.

Dr. Alexander Robitzsch
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Psych is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (1 paper)


Research

Article
Examining and Improving the Gender and Language DIF in the VERA 8 Tests
Psych 2022, 4(3), 357-374; https://doi.org/10.3390/psych4030030 - 06 Jul 2022
Abstract
The purpose of this study was to examine and improve differential item functioning (DIF) across gender and language groups in the VERA 8 tests. We used multigroup concurrent calibration with full and partial invariance based on the Rasch and two-parameter logistic (2PL) models, and classified students into proficiency levels based on their test scores and previously defined cut scores. The results indicated that some items showed gender- and language-specific DIF under the Rasch model, but no items with large misfit (suspected DIF) were detected under the 2PL model. When the item parameters were estimated using the 2PL model with the partial invariance assumption (PI-2PL), only small or negligible item misfit was found in the overall tests for both groups. This study argues that the 2PL model should be preferred because both of its calibration approaches (full and partial invariance) produced less bias. However, the non-German students had the highest proportions of misfit items, especially given the unequal sample sizes of German and non-German students. Although items with medium or small misfit did not have a significant effect on scores or performance classifications, items with large misfit changed the proportions of students at the highest and lowest performance levels.
(This article belongs to the Special Issue Feature Papers in Psychometrics and Educational Measurement)
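To illustrate the kind of DIF analysis the abstract describes, the following Python sketch simulates 2PL responses for a reference and a focal group, with one item carrying uniform DIF (a difficulty shift for the focal group), and then flags it with logistic-regression DIF screening (response regressed on the rest score and group membership). This is a minimal, self-contained illustration; all item parameters, sample sizes, and the flagging threshold are invented for the example and are not taken from the VERA 8 study.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Simulate 2PL responses for two groups with the same ability distribution;
# item 2 has uniform DIF: its difficulty is 1 logit higher for the focal group.
# All parameter values below are illustrative, not from the VERA 8 study.
n, n_items = 4000, 5
group = rng.integers(0, 2, n)               # 0 = reference, 1 = focal
theta = rng.normal(0.0, 1.0, n)             # latent ability
a = np.array([1.0, 1.2, 0.9, 1.1, 1.0])     # discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0, -1.0])   # difficulties (reference group)
b_focal = b.copy()
b_focal[2] += 1.0                           # uniform DIF on item 2
b_used = np.where(group[:, None] == 1, b_focal, b)
p = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - b_used)))
X = (rng.random((n, n_items)) < p).astype(float)

def logistic_nll(beta, Z, y):
    """Negative log-likelihood of a logistic regression."""
    eta = Z @ beta
    return np.sum(np.logaddexp(0.0, eta) - y * eta)

def dif_group_effect(item):
    """Logistic-regression DIF: item response ~ rest score + group."""
    rest = X.sum(axis=1) - X[:, item]       # matching variable (rest score)
    Z = np.column_stack([np.ones(n), rest, group])
    res = minimize(logistic_nll, np.zeros(3), args=(Z, X[:, item]),
                   method="BFGS")
    return res.x[2]                         # group effect = uniform DIF signal

effects = np.array([dif_group_effect(j) for j in range(n_items)])
flagged = np.where(np.abs(effects) > 0.5)[0]  # illustrative cutoff
print("group effects:", np.round(effects, 2))
print("flagged items:", flagged)
```

A negative group effect means the item is harder for the focal group at a matched rest score, which is the pattern built into item 2 here. In practice, DIF screening in operational tests such as VERA 8 would use calibrated IRT item parameters (as in the study's multigroup concurrent calibration) rather than this simple rest-score regression.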
