Analysis of a Divergent Thinking Dataset

Special Issue Editors

Dr. Boris Forthmann
Guest Editor
Institute of Psychology, University of Münster, 48149 Münster, Germany
Interests: psychometrics; creativity; learning progress assessment

Dr. Nils Myszkowski
Guest Editor
Department of Psychology, Pace University, 1 Pace Plaza, New York, NY 10038, USA
Interests: aesthetic abilities; intelligence; creativity; personality

Special Issue Information

Dear Colleagues,

The Journal of Intelligence is planning a Special Issue on the analysis of a divergent thinking dataset.

  • The data are available at https://osf.io/a9qnc.
  • The main data are available for three objects with alternate uses (i.e., garbage bag, paperclip, and rope).
  • The dataset includes many different scores of the divergent thinking data, such as variants of originality scores (human ratings, statistical rarity, and semantic distance), as well as fluency and elaboration scores.
  • Data for each single response are also available. These include, for example, the ratings of single responses, as well as the text-mining features obtained for single responses.
  • Person-level covariates are also available (e.g., verbal fluency, typing speed, right-wing authoritarianism), but divergent thinking should always be part of the analysis performed.
  • For an example of how the data can be analyzed, please see the following:

Forthmann, B., & Doebler, P. (2022). Fifty years later and still working: Rediscovering Paulus et al.'s (1970) automated scoring of divergent thinking tests. Psychology of Aesthetics, Creativity, and the Arts. Advance online publication. https://doi.org/10.1037/aca0000518

Forthmann, B., Holling, H., Çelik, P., Storme, M., & Lubart, T. (2017). Typing speed as a confounding variable and the measurement of quality in divergent thinking. Creativity Research Journal, 29(3), 257–269. https://doi.org/10.1080/10400419.2017.1360059

Forthmann, B., Paek, S. H., Dumas, D., Barbot, B., & Holling, H. (2020). Scrutinizing the basis of originality in divergent thinking tests: On the measurement precision of response propensity estimates. British Journal of Educational Psychology, 90(3), 683–699. https://doi.org/10.1111/bjep.12325
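To give a concrete flavor of one of the scoring variants listed above, a statistical-rarity originality score treats a response as more original the less often it occurs in the sample. The sketch below is a generic illustration in Python, not the scoring pipeline used for the OSF dataset; the function name, data layout, and the simple "1 minus relative frequency" rarity definition are illustrative assumptions (published pipelines typically also normalize spelling and phrasing before counting).

```python
from collections import Counter

def rarity_scores(responses_per_person):
    """Score alternate-uses responses by statistical rarity.

    responses_per_person: list of lists of (normalized) response strings,
    one inner list per participant, for a single alternate-uses object.
    Returns per-person fluency (response count) and originality
    (mean rarity, where rarity = 1 - relative frequency in the sample).
    """
    all_responses = [r for person in responses_per_person for r in person]
    freq = Counter(all_responses)
    n = len(all_responses)
    results = []
    for person in responses_per_person:
        fluency = len(person)
        if fluency == 0:
            results.append({"fluency": 0, "originality": 0.0})
            continue
        rarities = [1 - freq[r] / n for r in person]
        results.append({"fluency": fluency,
                        "originality": sum(rarities) / fluency})
    return results

# Three hypothetical participants naming uses for a paperclip:
data = [["hook", "lockpick"], ["hook"], ["antenna", "lockpick", "zipper pull"]]
print(rarity_scores(data))
```

Note that fluency and rarity-based originality are confounded by construction (more responses means more chances to hit rare ones), which is one motivation for the model-based approaches discussed in the references above.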

We welcome all manuscripts that contribute new analytical approaches for divergent thinking research, a new understanding of the data, and/or advances in the measurement of divergent thinking, whether or not they build on the approaches used in the papers listed above. Any type of analysis qualifies, provided it is of sufficiently high quality. Specifically, the analysis could be any of the following:

  • Simple or complex;
  • Outdated (some approaches used in the past might still be useful, while other “old” methods of data analysis have simply been overlooked) or modern (i.e., the newest approaches from the methodological literature);
  • Frequentist or Bayesian;
  • Psychometric (classical test theory, item response theory, or network psychometrics);
  • Focused on prediction.

Dr. Boris Forthmann
Dr. Nils Myszkowski
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Intelligence is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • divergent thinking
  • human ratings
  • assessment
  • data analysis
  • psychometrics
  • prediction modeling
  • psychological methods

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (2 papers)


Research

34 pages, 2580 KiB  
Article
Bayesian Estimation of Generalized Log-Linear Poisson Item Response Models for Fluency Scores Using brms and Stan
by Nils Myszkowski and Martin Storme
J. Intell. 2025, 13(3), 26; https://doi.org/10.3390/jintelligence13030026 - 23 Feb 2025
Abstract
Divergent thinking tests are popular instruments to measure a person’s creativity. They often involve scoring fluency, which refers to the count of ideas generated in response to a prompt. The two-parameter Poisson counts model (2PPCM), a generalization of the Rasch Poisson counts model (RPCM) that includes discrimination parameters, has been proposed as a useful approach to analyze fluency scores in creativity tasks, but its estimation was presented in the context of generalized structural equation modeling (GSEM) commercial software (e.g., Mplus). Here, we show how the 2PPCM (and RPCM) can be estimated in a Bayesian multilevel regression framework and interpreted using the R package brms, which provides an interface for the Stan programming language. We illustrate this using an example dataset, which contains fluency scores for three tasks and 202 participants. We discuss model specification, estimation, convergence, fit and comparisons. Furthermore, we provide instructions on plotting item response functions, comparing models, calculating overdispersion and reliability, as well as extracting factor scores.
(This article belongs to the Special Issue Analysis of a Divergent Thinking Dataset)
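The core of the 2PPCM described in this abstract is a log-linear item response function for counts: the expected fluency of person p on item i is exp(b_i + a_i·θ_p), with the RPCM as the special case a_i = 1, and observed counts taken as Poisson-distributed around that rate. The paper itself fits this model with brms/Stan in R; the fragment below is only a minimal Python sketch of the response function and Poisson log-likelihood, with all parameter values chosen for illustration.

```python
import math

def expected_count(theta, a_i, b_i):
    """2PPCM item response function: expected fluency count for a person
    with ability theta on an item with discrimination a_i and easiness b_i.
    Setting a_i = 1 recovers the Rasch Poisson counts model (RPCM)."""
    return math.exp(b_i + a_i * theta)

def poisson_logpmf(y, lam):
    """Log-probability of observing y ideas given Poisson rate lam."""
    return y * math.log(lam) - lam - math.lgamma(y + 1)

# Illustrative values: a person one SD above average on a task with
# discrimination 0.8 and easiness log(5), i.e., an average person is
# expected to produce 5 ideas on this task.
lam = expected_count(theta=1.0, a_i=0.8, b_i=math.log(5))
print(round(lam, 2))                      # expected number of ideas
print(round(poisson_logpmf(11, lam), 3))  # log-likelihood of observing 11 ideas
```

Summing such log-likelihood terms over persons and items gives the quantity that the Bayesian machinery in brms/Stan combines with priors to sample the posterior.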

21 pages, 1568 KiB  
Article
Decomposing the True Score Variance in Rated Responses to Divergent Thinking-Tasks for Assessing Creativity: A Multitrait–Multimethod Analysis
by David Jendryczko
J. Intell. 2024, 12(10), 95; https://doi.org/10.3390/jintelligence12100095 - 27 Sep 2024
Abstract
It is shown how the Correlated Traits Correlated Methods Minus One (CTC(M − 1)) Multitrait-Multimethod model for cross-classified data can be modified and applied to divergent thinking (DT)-task responses scored for miscellaneous aspects of creative quality by several raters. In contrast to previous Confirmatory Factor Analysis approaches to analyzing DT-tasks, this model explicitly takes the cross-classified data structure resulting from the employment of raters into account and decomposes the true score variance into target-specific, DT-task object-specific, rater-specific, and rater–target interaction-specific components. This enables the computation of meaningful measurement error-free relative variance-parameters such as trait-consistency, object–method specificity, rater specificity, rater–target interaction specificity, and model-implied intra-class correlations. In the empirical application with alternate uses tasks as DT-measures, the model is estimated using Bayesian statistics. The results are compared to the results yielded with a simplified version of the model, once estimated with Bayesian statistics and once estimated with the maximum likelihood method. The results show high trait-correlations and low consistency across DT-measures which indicates more heterogeneity across the DT-measurement instruments than across different creativity aspects. Substantive deliberations and further modifications, extensions, useful applications, and limitations of the model are discussed.
(This article belongs to the Special Issue Analysis of a Divergent Thinking Dataset)
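The relative variance parameters this abstract mentions are ratios of individual variance components to the total true-score variance. The sketch below illustrates the arithmetic only: the component names follow the abstract's decomposition, but the function and the numeric values are hypothetical, not estimates from the paper.

```python
def relative_variances(var_target, var_object, var_rater, var_interaction):
    """Split true-score variance into relative shares, in the spirit of the
    MTMM-style decomposition described in the abstract.

    Each argument is one true-score variance component; the returned shares
    sum to 1 by construction."""
    total = var_target + var_object + var_rater + var_interaction
    return {
        "trait_consistency": var_target / total,        # target-specific share
        "object_specificity": var_object / total,       # DT-task-object share
        "rater_specificity": var_rater / total,         # rater-specific share
        "interaction_specificity": var_interaction / total,
    }

# Hypothetical components: target 0.40, task object 0.30,
# rater 0.20, rater-target interaction 0.10.
print(relative_variances(0.40, 0.30, 0.20, 0.10))
```

A low trait-consistency share relative to the object-specificity share would mirror the paper's substantive conclusion that the DT-measurement instruments are more heterogeneous than the creativity aspects being rated.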
