Complex Problem Solving and its Position in the Wider Realm of the Human Intellect

A special issue of Journal of Intelligence (ISSN 2079-3200).

Deadline for manuscript submissions: closed (1 December 2016) | Viewed by 70645

Special Issue Editors


Guest Editor
Department of Educational Psychology, Goethe-University Frankfurt, 60629 Frankfurt am Main, Germany
Interests: educational psychology; assessment; problem solving; multivariate statistics

Guest Editor
Centre for Educational Measurement at University of Oslo (CEMO), Faculty of Educational Sciences, 0318 Oslo, Norway

Special Issue Information

Dear Colleagues,

Ever since the first standardized measures of human intelligence emerged, they have been criticized from various angles and for various reasons. One of these criticisms concerned the abstract nature of standard intelligence tests, such as Raven's Matrices, which were arguably out of touch with reality and had little real-world relevance. When research on complex problem solving (CPS) emerged in the 1970s, it was precisely the complex and highly face-valid nature of the problem scenarios used in this line of research, in which participants had to explore complex systems and work through environments that mimicked real-life situations, that was considered important and a viable alternative to standard measures of intelligence.
However, the high hopes placed in CPS diminished rather quickly because the conceptual delineation between CPS on the one hand and general intelligence on the other was difficult to establish and to bolster through empirical studies. In fact, conceptual fuzziness and issues in the assessment and scoring of complex scenarios hampered a thorough understanding of CPS as a latent construct, its assessment, and its utility for human performance in general. Only recently was CPS rediscovered, partly due to new assessment approaches that solved some of the former issues and partly due to the emerging conceptual relevance of CPS as a 21st-century skill. For instance, CPS was assessed in over 50 countries worldwide in the 2012 cycle of the Programme for International Student Assessment (PISA), and international reporting on country differences has had a palpable impact on research, policy, and education.
While interest in CPS has thus risen anew, a number of questions remain unanswered. These relate to the nature of CPS as a latent variable (e.g., cognitive and non-cognitive dimensions; relation to other components of human intelligence; role of prior knowledge in complex problem solving), its assessment (e.g., convergent validity between different assessment approaches; balance between ecological validity and psychometric scaling), and the practical relevance of CPS (e.g., prediction of real-world problem-solving performance; added value of CPS beyond well-established predictors). It is the aim of this Special Issue to contribute to the academic discussion concerning CPS and, in doing so, to advance our knowledge in a field of notable significance. We invite research articles, review articles, commentaries, and communications for inclusion in this Special Issue.

Prof. Dr. Samuel Greiff
Dr. Ronny Scherer
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Intelligence is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Editorial

Jump to: Research, Review

Editorial
Complex Problem Solving and Its Position in the Wider Realm of the Human Intellect
by Samuel Greiff and Ronny Scherer
J. Intell. 2018, 6(1), 5; https://doi.org/10.3390/jintelligence6010005 - 25 Jan 2018
Cited by 3 | Viewed by 9210
Abstract
We cannot solve our problems with the same thinking we used when we created them. [...]

Research

Jump to: Editorial, Review

Article
Fluid Ability (Gf) and Complex Problem Solving (CPS)
by Patrick Kyllonen, Cristina Anguiano Carrasco and Harrison J. Kell
J. Intell. 2017, 5(3), 28; https://doi.org/10.3390/jintelligence5030028 - 13 Jul 2017
Cited by 9 | Viewed by 10839
Abstract
Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human problem solving literature; a more specialized definition from the “German School” emphasizing performance in many-variable microworlds, with high domain-knowledge requirements; and a third definition based on performance in Minimal Complex Systems (MCS), with fewer variables and reduced knowledge requirements. We find a correlation of 0.86 between expert ratings of the importance of CPS and Gf across 691 occupations in the O*NET database. We find evidence that employers value both Gf and CPS skills, but CPS skills more highly, even after controlling for the importance of domain knowledge. We suggest that this may be due to CPS requiring not just cognitive ability but additionally skill in applying that ability in domains. We suggest that a fruitful future direction is to explore the importance of domain knowledge in CPS.

Article
The Impact of Symmetry: Explaining Contradictory Results Concerning Working Memory, Reasoning, and Complex Problem Solving
by Alexandra Zech, Markus Bühner, Stephan Kröner, Moritz Heene and Sven Hilbert
J. Intell. 2017, 5(2), 22; https://doi.org/10.3390/jintelligence5020022 - 18 May 2017
Cited by 9 | Viewed by 10230
Abstract
Findings of studies on the unique effects of reasoning and working memory regarding complex problem solving are inconsistent. To find out if these inconsistencies are due to a lack of symmetry between the studies, we reconsidered the findings of three published studies on this issue, which resulted in conflicting conclusions regarding the inter-relations between reasoning, working memory, and complex problem solving. This was achieved by analysing previously unpublished problem solving data from the study of Bühner, Krumm, Ziegler, and Plücken (2006) (N = 124). One of the three published studies indicated unique effects of working memory and reasoning on complex problem solving using aggregated scores, a second study found no unique contribution of working memory using only figural scores, and a third study reported a unique influence only for reasoning using only numerical scores. Our data featured an evaluation of differences across content facets and levels of aggregation of the working memory scores. Path models showed that the results of the first study could not be replicated using content aggregated scores; the results of the second study could be replicated if only figural scores were used, and the results of the third study could be obtained by using only numerical scores. For verbal content, none of the published results could be replicated. This leads to the assumption that not only symmetry is an issue when correlating non-symmetrical data, but that content also has to be taken into account when comparing different studies on the same topic.

Article
Comparing Business Experts and Novices in Complex Problem Solving
by C. Dominik Güss, Hannah Devore Edelstein, Ali Badibanga and Sandy Bartow
J. Intell. 2017, 5(2), 20; https://doi.org/10.3390/jintelligence5020020 - 13 May 2017
Cited by 9 | Viewed by 9145
Abstract
Business owners are faced with complex problems and are required to make decisions on a daily basis. The purpose of this study was to investigate complex problem solving (CPS) between experts and novices and to explore the competing theories of expert-rigidity versus expert-adaptability, as part of exploring which theory better explains crystallized intelligence. Participants were 140 business owners, business management undergraduate students and psychology students. Each participant managed a highly complex simulated chocolate company. Decisions and systems data were automatically saved in log files. Results revealed that small business owners performed best, followed by business students and then psychology students. A process analysis revealed that experts compared to novices spent more time initially exploring the complex situation. Experts were found to have greater flexibility in their decisions, having made the most personnel and advertising changes in response to situational demands. Adaptability and flexibility were predictive of performance, with results supporting the adaptability/flexibility theory of expertise. This study shows the influence of expertise on complex problem solving and the importance of flexibility when solving dynamic business problems. Complex business simulations are not only useful tools for research, but could also be used as tools in training programs teaching decision making and problem solving strategies.

Article
Missing the Wood for the Wrong Trees: On the Difficulty of Defining the Complexity of Complex Problem Solving Scenarios
by Jens F. Beckmann and Natassia Goode
J. Intell. 2017, 5(2), 15; https://doi.org/10.3390/jintelligence5020015 - 13 Apr 2017
Cited by 12 | Viewed by 8671
Abstract
In this paper we discuss how the lack of a common framework in Complex Problem Solving (CPS) creates a major hindrance to a productive integration of findings and insights gained in its 40+-year history of research. We propose a framework that anchors complexity within the tri-dimensional variable space of Person, Task and Situation. Complexity is determined by the number of information cues that need to be processed in parallel. What constitutes an information cue is dependent on the kind of task, the system or CPS scenario used and the task environment (i.e., situation) in which the task is performed. Difficulty is conceptualised as a person’s subjective reflection of complexity. Using an existing data set of N = 294 university students’ problem solving performances, we test the assumption derived from this framework that particular system features such as numbers of variables (NoV) or numbers of relationships (NoR) are inappropriate indicators of complexity. We do so by contrasting control performance across four systems that differ in these attributes. Results suggest that for controlling systems (task) with semantically neutral embedment (situation), the maximum number of dependencies any of the output variables has is a promising indicator of this task’s complexity.

Article
What Can We Learn from “Not Much More than g”?
by Kevin Murphy
J. Intell. 2017, 5(1), 8; https://doi.org/10.3390/jintelligence5010008 - 25 Feb 2017
Cited by 13 | Viewed by 9121
Abstract
A series of papers showing that measures of general cognitive ability predicted performance on the job and in training and that measures of specific cognitive abilities rarely made an incremental contribution to prediction led to a premature decline in research on the roles of specific abilities in the workplace. Lessons learned from this research include the importance of choosing the right general cognitive measures and variables, the relative roles of prediction vs. understanding and the need for a wide range of criteria when evaluating the contribution of specific skills such as complex problem solving. In particular, research published since the “not much more than g” era suggests that distinguishing between fluid and crystallized intelligence is important for understanding the development and the contribution of complex problem solving.

Review

Jump to: Editorial, Research

Review
Complex Problem Solving in Assessments of Collaborative Problem Solving
by Arthur Graesser, Bor-Chen Kuo and Chen-Huei Liao
J. Intell. 2017, 5(2), 10; https://doi.org/10.3390/jintelligence5020010 - 27 Mar 2017
Cited by 45 | Viewed by 12322
Abstract
Collaborative problem solving (ColPS) proficiency was developed as a new assessment for the Programme for International Student Assessment (PISA) in the 2015 international evaluation of student skills and knowledge. The assessment framework defined by the PISA ColPS 2015 expert group crossed three major collaboration processes with four problem solving processes that were adopted from the PISA 2012 individual problem solving assessment to form a matrix of 12 specific skills. The three major collaboration processes are (1) establishing and maintaining shared understanding; (2) taking appropriate action; and (3) establishing and maintaining team organization. The four problem solving processes are exploring and understanding the problem, representing and formulating the problem, planning and executing strategies, and monitoring and reflecting on the problem-solving activities. This article discusses how the problem-solving dimension was integrated with the collaboration dimension. We also discuss how computer agents were involved in the PISA ColPS 2015 assessment in order to ensure a satisfactory assessment of collaborative problem solving. Examples of the use of agents to assess ColPS are provided in the context of a released PISA item and a project conducted in Taiwan.