Open Access Article
Soc. Sci. 2019, 8(2), 46; https://doi.org/10.3390/socsci8020046

Measurement Invariance of a Direct Behavior Rating Multi Item Scale across Occasions

1 Research in Inclusive Education, Faculty of Rehabilitation Science, Technical University of Dortmund, 44227 Dortmund, Germany
2 Department of Special Education, University of Cologne, 50931 Cologne, Germany
3 Deggendorf Institute of Technology, Institute for Quality and Continuing Education, 94469 Deggendorf, Germany
4 Faculty of Rehabilitation Science, Educational Research Methods, Technical University of Dortmund, 44227 Dortmund, Germany
* Author to whom correspondence should be addressed.
Received: 16 October 2018 / Revised: 29 January 2019 / Accepted: 29 January 2019 / Published: 4 February 2019

Abstract

Direct Behavior Rating (DBR) as a behavioral progress monitoring tool can be designed as a longitudinal assessment with only short intervals between measurement points. The reliability of these instruments has mostly been evaluated in observational studies with small samples based on generalizability theory. However, for standardized use in the pedagogical field, a larger and broader sample is required in order to assess measurement invariance between different participant groups and over time. Therefore, we constructed a DBR, the Questionnaire for Monitoring Behavior in Schools (QMBS), with multiple items to measure the occurrence of specific externalizing and internalizing student classroom behaviors on a Likert scale (1 = never to 7 = always). In a pilot study, two trained raters observed 16 primary education students and rated the student behavior across all items with satisfactory reliability. In the main study, 108 regular primary school students, 97 regular secondary school students, and 14 students in a clinical setting were rated daily over one week (five measurement points). Item response theory (IRT) analyses confirmed the technical adequacy of the instrument, and latent growth models demonstrated the instrument's stability over time. Further development of the instrument and study designs to implement DBRs is discussed.
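As an illustrative sketch (not drawn from the article itself), the latent growth model referred to in the abstract can be written for a behavior score $y_{it}$ of student $i$ at occasion $t = 1, \dots, 5$ as

\[ y_{it} = \eta_{0i} + \lambda_t \, \eta_{1i} + \varepsilon_{it}, \qquad \lambda_t = t - 1, \]

where $\eta_{0i}$ is the student's latent intercept (initial level) and $\eta_{1i}$ the latent slope (change per measurement point). Longitudinal measurement invariance, the focus of the title, requires that the loadings and thresholds linking the seven-point items to the latent scores are held equal across the five occasions, so that differences between measurement points reflect behavior change rather than a shift in how the items function.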
Keywords: direct behavior rating; test; sensitivity over time; rating; school; classroom behavior; progress monitoring

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Gebhardt, M.; DeVries, J.M.; Jungjohann, J.; Casale, G.; Gegenfurtner, A.; Kuhn, J.-T. Measurement Invariance of a Direct Behavior Rating Multi Item Scale across Occasions. Soc. Sci. 2019, 8, 46.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
