Open Access Article

Differences in Judgments of Creativity: How Do Academic Domain, Personality, and Self-Reported Creativity Influence Novice Judges’ Evaluations of Creative Productions?

Yale Child Study Center, 230 South Frontage Road, New Haven, CT 06520, USA
Department of Psychological, Health, and Learning Sciences, University of Houston, 3657 Cullen Boulevard, Houston, TX 77204-5029, USA
Department of Computer Science, Dickinson College, 28 North College Street, Carlisle, PA 17013-2737, USA
Psychology Department, Pace University, One Pace Place, New York, NY 10038, USA
Moscow State University of Psychology and Education, ul. Sretenka, 29, Moscow 127051, Russia
Author to whom correspondence should be addressed.
Academic Editor: Robert J. Sternberg
J. Intell. 2015, 3(3), 73-90;
Received: 1 July 2015 / Revised: 4 September 2015 / Accepted: 9 September 2015 / Published: 14 September 2015
(This article belongs to the Special Issue Challenges in Intelligence Testing)
Intelligence assessment is often viewed as a narrow and ever-narrowing field, defined (as per IQ) by the measurement of finely distinguished cognitive processes. It is instructive, however, to remember that other, broader conceptions of intelligence exist and might usefully be considered for a comprehensive assessment of intellectual functioning. This article invokes a more holistic, systems theory of intelligence—the theory of successful intelligence—and examines the possibility of including in intelligence assessment a similarly holistic measure of creativity. The time and costs of production-based assessments of creativity are generally considered prohibitive. Such barriers may be mitigated by applying the consensual assessment technique with novice raters. To investigate this possibility further, we explored the question: how much do demographic factors such as age and gender, and psychological factors such as domain-specific expertise, personality, or self-perceived creativity, affect novices’ unidimensional ratings of creativity? Fifty-one novice judges from three undergraduate programs, majoring in three disparate expertise domains (i.e., visual art, psychology, and computer science), rated 40 child-generated Lego creatures for creativity. Results showed no differences in creativity ratings based on the expertise domains of the judges. However, judges’ personality and self-perception of their own everyday creativity appeared to influence the way they scored the creatures for creativity.
Keywords: creativity; consensual assessment technique; novice judges
MDPI and ACS Style

Tan, M.; Mourgues, C.; Hein, S.; MacCormick, J.; Barbot, B.; Grigorenko, E. Differences in Judgments of Creativity: How Do Academic Domain, Personality, and Self-Reported Creativity Influence Novice Judges’ Evaluations of Creative Productions? J. Intell. 2015, 3, 73-90.

