Special Issue "Challenges in Intelligence Testing"

A special issue of Journal of Intelligence (ISSN 2079-3200).

Deadline for manuscript submissions: closed (15 July 2015).

Special Issue Editor

Special Issue Information

Dear Colleagues,

On the one hand, intelligence testing has been one of the most successful enterprises in the application of psychology. The tests have proven useful not only in predicting academic success, but also in predicting a wide range of life outcomes: health, wealth, marital stability, job success, and longevity, to name a few. At the same time, some psychologists have believed that the usefulness of the tests, both as scientific instruments and as practical predictors of various kinds of success, has been compromised by various elements. Some of the arguments that have been advanced, either against the tests or in favor of broader tests, are that: (a) there is more to intelligence than IQ; (b) the tests are differentially useful (and potentially even harmful) to members of certain groups, such as ones that vary from mainstream Western society in enculturation or socialization; (c) the tests lack adequate theoretical justification; and (d) the tests are typically static rather than dynamic, so that they are unable to measure an individual’s zone of proximal development. In this symposium, contributors will discuss the challenges that intelligence tests face—both theoretical and empirical—and will bolster their claims with data supporting their points of view.

Prof. Dr. Robert J. Sternberg

Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, you can go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Intelligence is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.



Keywords

  • intelligence
  • intelligence test
  • IQ
  • multiple intelligences
  • culture
  • theories of intelligence
  • equity
  • ethnicity
  • group differences
  • validity
  • bias
  • socialization
  • dynamic testing

Published Papers (6 papers)


Editorial


Editorial
The Gift that Keeps on Giving—But for How Long?
J. Intell. 2016, 4(1), 4; https://doi.org/10.3390/jintelligence4010004 - 04 Mar 2016
Cited by 2 | Viewed by 4469
Abstract
Some gifts keep on giving. [...]
(This article belongs to the Special Issue Challenges in Intelligence Testing)

Research


Article
Integrating Hot and Cool Intelligences: Thinking Broadly about Broad Abilities
J. Intell. 2016, 4(1), 1; https://doi.org/10.3390/jintelligence4010001 - 29 Jan 2016
Cited by 17 | Viewed by 6699
Abstract
Although results from factor-analytic studies of the broad, second-stratum abilities of human intelligence have been fairly consistent for decades, the list of broad abilities is far from complete, much less understood. We propose criteria by which the list of broad abilities could be amended and envision alternatives for how our understanding of the hot intelligences (abilities involving emotionally-salient information) and cool intelligences (abilities involving perceptual processing and logical reasoning) might be integrated into a coherent theoretical framework.

Article
Differences in Judgments of Creativity: How Do Academic Domain, Personality, and Self-Reported Creativity Influence Novice Judges’ Evaluations of Creative Productions?
J. Intell. 2015, 3(3), 73-90; https://doi.org/10.3390/jintelligence3030073 - 14 Sep 2015
Cited by 13 | Viewed by 5014
Abstract
Intelligence assessment is often viewed as a narrow and ever-narrowing field, defined (as per IQ) by the measurement of finely distinguished cognitive processes. It is instructive, however, to remember that other, broader conceptions of intelligence exist and might usefully be considered for a comprehensive assessment of intellectual functioning. This article invokes a more holistic, systems theory of intelligence—the theory of successful intelligence—and examines the possibility of including in intelligence assessment a similarly holistic measure of creativity. The time and costs of production-based assessments of creativity are generally considered prohibitive. Such barriers may be mitigated by applying the consensual assessment technique using novice raters. To investigate this possibility further, we explored the question: how much do demographic factors such as age and gender and psychological factors such as domain-specific expertise, personality or self-perceived creativity affect novices’ unidimensional ratings of creativity? Fifty-one novice judges from three undergraduate programs, majoring in three disparate expertise domains (i.e., visual art, psychology and computer science) rated 40 child-generated Lego creatures for creativity. Results showed no differences in creativity ratings based on the expertise domains of the judges. However, judges’ personality and self-perception of their own everyday creativity appeared to influence the way they scored the creatures for creativity.
Article
Why Creativity Isn’t in IQ Tests, Why it Matters, and Why it Won’t Change Anytime Soon Probably
J. Intell. 2015, 3(3), 59-72; https://doi.org/10.3390/jintelligence3030059 - 07 Aug 2015
Cited by 30 | Viewed by 8281
Abstract
Creativity is a part of most theories of intelligence—sometimes a small part and sometimes a large part. Yet even IQ tests that assess aspects of intelligence that supposedly reflect creative abilities do not actually measure creativity. Recent work has argued that intelligence and creativity are more conceptually related than we have thought. In addition, creativity offers a potential way to counter issues of test bias from several different angles. That said, inherent difficulties in measuring creativity and inherent sluggishness in the test industry mean the odds are small that creativity will find its way into IQ tests as currently defined. However, other possibilities remain in related fields.

Review


Review
Can Intelligence Testing Inform Educational Intervention for Children with Reading Disability?
J. Intell. 2015, 3(4), 137-157; https://doi.org/10.3390/jintelligence3040137 - 25 Nov 2015
Cited by 19 | Viewed by 8240
Abstract
This paper examines the value of intelligence testing for the purpose of informing us how best to intervene with children with reading disability. While the original function of IQ testing was to ascertain whether a child was capable of profiting from schooling, there are many who now claim that cognitive assessment offers a range of diagnostic and prescriptive functions which can help teachers in delivering effective educational programs. This paper interrogates such assertions in relation to the assessment of IQ, cognitive strengths and weaknesses, executive functions, and the use of dynamic testing/assessment. The paper concludes that current evidence indicates that cognitive measures have limited relevance for instructional planning, and cognitive training programs have yet to show sufficient academic gains. For these reasons, it is recommended that our energies should be directed to the continuing development of powerful forms of academic skills-based instruction operating within a response to intervention framework.

Other

Concept Paper
Contextual Responsiveness: An Enduring Challenge for Educational Assessment in Africa
J. Intell. 2016, 4(1), 3; https://doi.org/10.3390/jintelligence4010003 - 17 Feb 2016
Cited by 6 | Viewed by 4741
Abstract
Numerous studies in Africa have found that indigenous conceptualization of intelligence includes dimensions of social responsibility and reflective deliberation, in addition to the dimension of cognitive alacrity emphasized in most intelligence tests standardized in Western societies. In contemporary societies undergoing rapid socio-cultural and politico-economic change, the technology of intelligence testing has been widely applied to the process of educational selection. Current applications in Zambia rely exclusively on Western style tests and fail to respond to some enduring cultural preoccupations of many parents, educators and policymakers. We discuss how recent and ongoing research addresses the challenges of eco-culturally responsive assessment with respect to assessment of intellectual functions in early childhood, monitoring initial literacy acquisition in middle childhood, and selection for admission to secondary and tertiary education. We argue that the inherent bias of normative tests can only be justified politically if a compelling theoretical account is available of how the construct of intelligence relates to learning and how opportunities for learning are distributed through educational policy. While rapid social change gives rise to demands for new knowledge and skills, assessment of intellectual functions will be more adaptive in contemporary Zambian society if it includes the dimensions of reflection and social responsibility.
