Article

What Are US Undergraduates Taught and What Have They Learned About US Continental Crust and Its Sedimentary Basins?

by Clinton Whitaker Crowley *,† and Robert James Stern †
Sustainable Earth Systems Sciences Department, University of Texas at Dallas, Richardson, TX 75080-3021, USA
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Geosciences 2025, 15(8), 296; https://doi.org/10.3390/geosciences15080296
Submission received: 20 May 2025 / Revised: 10 July 2025 / Accepted: 23 July 2025 / Published: 2 August 2025
(This article belongs to the Section Sedimentology, Stratigraphy and Palaeontology)

Abstract

We need to educate students and the public about addressing natural resource challenges if civilization is to move into a sustainable future. Because US mineral and energy resources are found in its continental crust and sedimentary basins, introductory geology students need to be well informed about US crust and basins. We think that creating effective videos about these topics is the best way to engage students and motivate them to learn more. In preparation for making these videos, we researched what introductory geology students are taught and what they learn about these topics. Student interviews informed us about the learned curriculum, and the taught curriculum was analyzed using a novel keyword-counting method applied to textbook indices. We found that geophysics is stressed twice as much as geology, radiometric dating, and sedimentary basins. We therefore expected that students would have learned more about geophysics and less about the other topics; however, this was not the case: students knew more about geology and less about geophysics, radiometric dating, and sedimentary basins. To make effective videos on these topics, we need to explain the following threshold concepts: seismic refraction, to scaffold student understanding of crustal geophysics, and radiometric dating and deep time, to scaffold understanding of crustal geology and the economic importance of sedimentary basins.

1. Introduction

US continental crust and its sedimentary basins are key geoscience concepts for US citizens and policymakers to understand because these are where we get the mineral and energy resources required to meet human needs [1,2,3]. The US crust and sedimentary basins contain many undeveloped deposits of minerals that have recently become important for communication and renewable energy networks [1,2]. These rocks also offer safe, permanent storage for toxic and radioactive wastes [4,5]. In spite of this importance, the public and policymakers (and college students) lack a clear understanding of the rocks that our nation is built upon and the key role those rocks play in a sustainable future economy [6]. For this reason, US citizens, especially students, need to know more about US continental crust and its sedimentary basins.
Basic concepts about US continental crust and sedimentary basins can and should be explained better to students, citizens, and policymakers. Accurate and engaging video has been shown to be an effective tool in geoscience education [7,8,9]. For these reasons, we are making three short videos to teach this audience about US continental crust and its sedimentary basins, emphasizing four topics: geophysics, geology, radiometric dating, and sedimentary basins. These three videos will explain how we know what we know (epistemology) about the US crust (geophysics and geology) and its sedimentary basins.
Before designing such videos, it is important to assess how textbooks for introductory geology courses present these concepts and what US undergraduates in introductory geology courses learn about these topics. Students get their information about these topics from four main sources:
(a)
from what they have learned before starting the class
(b)
from what they hear and see in lecture, lab, and from each other while studying
(c)
from reading assigned texts and other course materials
(d)
from self-motivated learning from various sources: Wikipedia, videos, web searches, social media, etc.
We cannot know what the instructor actually says in their lectures or what students hear and retain; however, knowing what is in the textbooks provides a general outline of what is likely covered in class. What is the best way to determine what students know and which of these sources were most important? The only way we can assess sources (a), (b), and (d), that is, what students knew before starting the course, what they learned in lecture and lab, and what they learned on their own from self-motivated study, is by asking them in a well-designed survey, a direct assessment [10]. But we can assess source (c), the information in the assigned textbooks, by comparing the textbooks themselves, an indirect assessment [10].

2. Materials and Methods

We performed two evaluations centered on two questions: what do introductory geology textbooks say about US continental crust and sedimentary basins, and what do students who have completed an introductory geology course know about those topics? Two approaches have been used in this kind of research: some studies examine geology textbooks [11,12,13] and others focus on students’ knowledge [14,15,16,17]. Interestingly, we have not found another instance of our approach, namely, reviewing geoscience textbook content combined with assessing student knowledge. We used both methods to research what students are taught and what they have learned. Below, we first explain why we chose our methods and then describe the methods themselves.

2.1. Textbook Assessment

It is impossible to know what is taught in a course unless we have taught it ourselves. To approximate what students are taught about the subjects our three videos will explain (geophysical and geological approaches to understanding US crust and its sedimentary basins), we surveyed six current textbooks (published since 2016) that are widely used in introductory geology courses to understand the extent to which they teach about US crust and sedimentary basins (Table 1). We evaluated the six textbooks by counting entries for keywords (Table 2, Figure 1) in the index of each and compared their contents on that basis. We have not found this method used elsewhere to compare textbook content, but we believe it is valid because publishers’ standards for indexing are similar, though not identical [18]. For the textbook review, the term keyword means a single word, hyphenated word, or two-word phrase that is essential to understanding one of the four topics and appropriate to a college-level introductory course. For example, a student could not understand the geology of the US crust without understanding what granite is. The term keyword is used in a broader sense in the student quiz and interview, as explained in that section.
Textbook indices are lists of keywords that help readers access content [18]; however, authors use different keywords for the same concepts. We began the textbook assessment by compiling our own list of keywords for each of the four topics. Because our focus was narrow, our results are not useful for comparing the overall content of the textbooks: we ignored the geophysics of the mantle and core, rock types, surface features and processes, glaciers, and other topics present in all six textbooks. We also recognized that some of our keywords appear in indices under other names. For example, Reynolds et al. did not use our keyword radiometric dating in their index, instead using the term isotopic ages for the same concept. For that reason, we made a list of alternate keywords used in the textbook indices and performed an expanded search that included these alternate terms. A Supplementary Document lists the original keywords and the additional keywords found in the indices. The results in Table 3 are based on this expanded search.
Introductory physical geology textbooks are broadly similar but approach the topic from different perspectives. All cover important topics about how the Earth system operates, including the most important natural physical processes that students should know about: those occurring within the Earth, on its surface, and at the interface between these two realms. Any of these textbooks could be used in any introductory physical geology course. They differ in the number of chapters, the titles of chapter headings, the number of pages, and the number of figures and tables. Table 1 summarizes these differences among the textbooks we surveyed.
The textbooks varied in length from 528 to 784 pages, and figures per page ranged from ~1 (0.8) to 3.
Similar to the approaches of other textbook surveys (e.g., [19]), we began by searching each book’s index for words or phrases related to each of the four topics that are key to understanding the three videos (Table 2). See Supplementary Documents for scoring examples.
We totaled the instances in each book’s index of each keyword or its equivalents to obtain a score for that topic in that book. We summed the total instances of all keywords in each book and included the average number of keywords per page.
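As an illustration, the counting procedure can be sketched in a few lines of Python; the synonym map and index entries below are invented placeholders, not the study's actual keyword list or data.

```python
# Sketch of the index keyword-counting method. The synonym map and
# index entries are invented examples, not the study's actual data.
from collections import Counter

# Canonical keywords (as in Table 2) mapped to alternate index terms
# accepted as equivalent, e.g., "isotopic ages" for "radiometric dating".
SYNONYMS = {
    "radiometric dating": {"radiometric dating", "isotopic ages"},
    "seismic refraction": {"seismic refraction"},
    "granite": {"granite"},
}

def count_keywords(index_entries, synonyms):
    """Count index entries matching each canonical keyword or an equivalent."""
    counts = Counter()
    for canonical, variants in synonyms.items():
        counts[canonical] = sum(
            any(v in entry.lower() for v in variants) for entry in index_entries
        )
    return counts

# Hypothetical index excerpt from one textbook:
index = ["Isotopic ages, 412-415", "Granite, 98", "Seismic refraction, 301"]
counts = count_keywords(index, SYNONYMS)
print(counts, "topic total:", sum(counts.values()))
# Dividing the total by the book's page count gives keywords per page.
```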

2.2. Student Assessment

We interviewed students in introductory geology classes at the University of Texas at Dallas (UTD) near the end of the semester to determine what they had learned. We interviewed (in person, one at a time) 51 students taking in-person (sample size n = 20) and online (n = 31) sections of introductory geoscience courses at UTD about their knowledge of four geoscience concepts (column headings of Table 2). The interviews were conducted at the close of the Fall 2023 semester, by which time most of the course material had been presented. Reported majors are listed in the Supplementary Document Initial Survey Data. Only one student was a Geoscience major; the other 50 had a wide range of majors, with the exception of arts and humanities, and only three science majors participated. Seventy percent (36/51) of the students said this was their first geoscience course. Of the rest, two had taken Physical Geology lab, two had taken Oceanography, and one had taken both Oceanography and Earthquakes and Volcanoes.
Personal information was not recorded. This study is similar to previous studies assessing student comprehension of foundational ideas in geoscience [15]. Students received one point of extra credit for participating. Grading is usually carried out on a percentage system in the US, with exams, labs, and other assignments each comprising a fraction of the semester’s final grade. Instructors sometimes offer extra credit assignments to give students an opportunity to improve their grade.
The survey evolved. Initially, we gave nine participants a four-question multiple-choice quiz about the four topics [Supplementary Table S1, student survey summary, page 8], followed by an interview to determine where they received the information they used in their answers. Reviewing the results, we realized the quiz had not revealed what students knew about the four geoscience concepts, because students are skilled at winnowing out incorrect multiple-choice answers and spotting words that suggest correct answers. For example, this multiple-choice question resulted in 9 out of 9 correct answers:
What are sedimentary basins? Choose the best answer:
  • Depressions of the Earth’s crust in which a thick sequence of sediments has been deposited
  • Craters left by a meteorite impact and filled with igneous rock
  • A place where Earth’s surface has collapsed into a sinkhole
  • A low-lying region carved out by glaciers
We gathered that students knew that “depressions of the Earth’s crust” and “basins” mean the same thing and could recognize that the correct answer probably had the word “sediments” in it. This experience taught us that asking multiple-choice questions—at least the ones we devised—was not the best way to assess students’ knowledge. We discussed this problem and decided to change our approach by asking each student to write answers to open-ended questions about the four geoscience topics. After students wrote their answers, we asked them to discuss where they received their information. This format, commonly used in geoscience concept surveys [20], gave us a much better assessment of student knowledge. Below are the four questions that we asked:
Crustal geophysics question: “Tell me what you can about the term Geophysics, and what data can you think of that only Geophysicists can tell us?”
Crustal geology question: “Tell me what you can about Geology? What is it, and what can it tell us?”
Absolute dating question: “Fossils can help us know the age of some sedimentary rocks. But igneous rocks don’t have fossils. Can you tell me what kind of methods we use to calculate the age of igneous rocks?”
Sedimentary basin question: “Can you tell me what a sedimentary basin is? Have you heard of one? Why would sedimentary basins be important to us?”
Objectively scoring answers to multiple-choice questions is easy: each answer is right or wrong. Objectively scoring answers to open-ended questions is more challenging, but can be done [21]. In the student assessment, we considered any word that can be associated with the topics to be a keyword, opening up a much broader vocabulary for evaluation. A student could answer the geophysics question without once using one of the nine textbook assessment keywords but receive points for using other geophysics-related terms like mantle, core, layers, magnetometer, and well log.
We assigned point values to student answers using the following rubric: (1) presence of keywords; (2) keyword usage; (3) accurate examples given; and (4) clarity vs. vagueness.
Our analysis looked for patterns in students’ responses to the questions, especially about the following:
  • Which of the four topics do students feel most confident about?
  • Which of the four topics do students feel least confident about?
  • Which, if any, of these topics can students explain clearly?
We assigned numerical scores based on this rubric. We first looked for the best answers in the group and scored them; then, we looked for the worst answers in the group and scored those. Then, the rest of the answers were scored by identifying what made them better than the worst but not as good as the best. Scores for each topic were rated from 1 (lowest) to 5 (highest) and were averaged for the group to obtain a group score for that topic. See Supplementary Table S2 summary of taught vs. learned curriculum, page 2, for scored answers.
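A minimal sketch of this aggregation step, assuming each answer on each topic has already been rated 1 (lowest) to 5 (highest); the scores below are invented for illustration.

```python
# Sketch of the rubric-score aggregation; each student's answer on each
# topic was rated 1 (lowest) to 5 (highest). Scores here are invented.
from statistics import mean, stdev

scores = {
    "geology":            [3, 4, 2, 5, 3],
    "geophysics":         [2, 1, 2, 3, 2],
    "radiometric dating": [1, 5, 1, 5, 1],  # bimodal: knew it or did not
    "sedimentary basins": [2, 1, 2, 3, 2],
}

for topic, vals in scores.items():
    print(f"{topic}: group mean = {mean(vals):.1f}, sd = {stdev(vals):.1f}")
```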
We asked students where they received the information they used to answer the questions. For crustal geophysics, 36 of 42 students reported the sources of their answers; 33 did so for crustal geology, 34 for radiometric dating, and 32 for sedimentary basins. Students were assured that it was all right to admit that they did not know the answer and guessed, or that they did not remember where they learned the information. Guess and this course are self-explanatory categories of answer, as are don’t remember and own research/interest. Every other reported source, including other science classes, videos, social media, and in one case, movies, was combined into different source. Some responses included both the current introductory geology course and another source. For these responses, we added one point apiece to the this course and different source columns of the spreadsheet (see Supplementary Table S1, student survey summary, page 7).
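A sketch of the tallying logic, with invented responses; the point-apiece rule for dual-source answers is the one described above.

```python
# Sketch of the information-source tally; a response naming both the
# course and another source contributes one point to each column.
# Categories follow the paper; the responses are invented examples.
from collections import Counter

responses = [
    {"this course"},
    {"guess"},
    {"don't remember"},
    {"this course", "different source"},  # dual-source answer
]

tally = Counter()
for sources in responses:
    tally.update(sources)  # one point apiece per reported source
print(dict(tally))
```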

3. Results

3.1. Textbook Index Review

Our results are summarized in Table 3 and Figure 1.
Let us look at how each of the six textbooks treats the 36 keywords covering the four topics of interest. The total number of keywords used in each textbook varies considerably, from a low of 249 to a high of 419, with a mean of 318 and a standard deviation of 130 (Table 3). This heterogeneity is also seen in how each textbook treats each topic. Geophysics keywords in each text vary between 40 and 60% of all keywords, with a mean of 50 ± 7.2%. Geology keywords vary between 15 and 32%, with a mean of 23 ± 7.1%. Radiometric dating keywords vary between 5 and 27%, with a mean of 17 ± 7.3%. Sedimentary basins keywords vary between 6 and 15%, with a mean of 10 ± 3.1%. A Kruskal–Wallis H test, run with an online calculator (https://www.statskingdom.com/kruskal-wallis-calculator.html, accessed 1 July 2025), showed no significant difference in the dependent variable between the groups: χ2(5) = 4.39, p = 0.494, with mean rank scores of 13.5 for Group 1 (Grotzinger and Jordan), 15 for Group 2 (Marshak), 12.75 for Group 3 (Plummer, Carlson, and Hammersley), 13 for Group 4 (Reynolds, Johnson, Morin, and Carter), 14.75 for Group 5 (Tarbuck and Lutgens), and 6 for Group 6 (Wicander and Monroe).
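For reproducibility, the same test can be run with SciPy instead of an online calculator; the group values below are placeholders standing in for the per-textbook keyword counts of Table 3, not our actual data.

```python
# Kruskal-Wallis H test in SciPy. Each group is one textbook's keyword
# counts for the four topics; values are placeholders for Table 3.
from scipy import stats

grotzinger_jordan = [160, 75, 55, 30]
marshak           = [200, 90, 80, 49]
plummer_et_al     = [130, 60, 40, 19]
reynolds_et_al    = [170, 70, 90, 35]
tarbuck_lutgens   = [150, 95, 45, 28]
wicander_monroe   = [100, 60, 70, 19]

h, p = stats.kruskal(grotzinger_jordan, marshak, plummer_et_al,
                     reynolds_et_al, tarbuck_lutgens, wicander_monroe)
print(f"H = {h:.2f} (df = 5), p = {p:.3f}")  # p > 0.05 means no significant
# difference between the textbooks' keyword distributions
```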
From this analysis, it is clear that of the four topics, geophysics receives strikingly more attention in terms of the percentage of keywords. Overall, geophysics received over twice as much attention as the second-ranking topic, geology, almost three times the attention given to radiometric dating, and five times more than sedimentary basins. The second-most important topic in terms of keyword percentage varied between geology and radiometric dating. In all but one textbook, sedimentary basins received the least attention.

3.2. Student Interviews

We calculated the mean and standard deviation for each category of the answers. Table 4 and Figure 2 compare the results; histograms are shown on page 1 of Supplementary Table S1—student survey summary.
We scored students’ responses using four categories:
(1)
Keyword presence is a simple count of all words related to the topic.
(2)
Keyword usage scores how well the word is used in the answer. A word in a sentence, “geologists find resources,” is scored higher than the word “resources” appearing in a list of words.
(3)
Examples awards points to relevant examples. “Geologists find resources like groundwater” is an example that would receive a high score.
(4)
Clarity/vagueness awards higher scores to complete sentences and full explanations and lower scores to lists and incoherent statements. A list of a dozen geophysics terms would receive a score of 5 for presence of keywords, but a 1 for Clarity/vagueness.
Consider the means, both for the categories within each topic and for the topic mean score. We might expect a group of students who are comfortable with a topic to score a mean above 2.5 on each category and, correspondingly, groups of students who are unfamiliar with a topic to score less than 2.5. For the sixteen topic categories, means ranged from a low of 1.5 for Geophysics Clarity/vagueness to a high of 3.4 for Geology Keyword presence. Means were consistently highest in Geology (2.4 to 3.4), and the other three topics scored lower in every category: Geophysics (1.5 to 2.4), Radiometric Dating (2.0), and Sedimentary Basins (1.7 to 2.3). Keyword presence was the highest-scoring category for all four topics; Keyword usage scored second-highest, Examples third, and Clarity/vagueness lowest. Considering the topic means themselves, Geology ranked highest at 2.9, and the other three were essentially tied (1.9 to 2.1).
Now consider the standard deviations. We might expect a group of students with widely different levels of understanding to produce large standard deviations, whereas a group with similar levels of understanding would produce small ones. For the sixteen topic categories, standard deviations ranged from a low of 0.8 for Geology Keyword presence to a high of 2 for Geology Examples. Surprisingly, both the two lowest (0.8 and 0.9) and the highest (2) standard deviations appear in the Geology topic. Geophysics categories have uniformly low standard deviations (1.1 to 1.2), and the sedimentary basins categories have the next lowest (1.2 to 1.4). The uniform means and standard deviations for the Radiometric Dating categories reflect that students either knew or did not know the answer.
Student answers to the open-ended questions varied more than the multiple-choice answers. The sedimentary basin question is especially illustrative. While 100% of students chose the correct multiple-choice answer, only six out of 42 students had a good answer in the open-ended format. For the sedimentary basin question, 36 out of 42 answers were similar to this:
“I don’t think I have ever heard of one. But just going based off the two words together I think it could be an area where multiple sedimentary rocks sit. I don’t know why they would be important.”
All students had heard of sedimentary rocks, and all knew that basins were low places on Earth’s surface, but most knew no examples and did not understand the importance of sedimentary basins. The sharp difference between the results of multiple-choice and open-ended questions convinced us to discard the nine surveys that only used multiple-choice questions.
Reported sources of information are summarized in Figure 3. For crustal geophysics, guess was reported as the most important source of answers, at 50%; only 26% of students reported this course as their source of answers about geophysics, and 22% reported different source. For crustal geology, this course was the most important source of answers, at 52%, with different source (usually another geoscience class) second at 34%; only 12% reported guess. For radiometric dating, this course was the most important source (49%), even though many of these students’ answers were incorrect; different source was second (26%), and 23% reported guess. For sedimentary basins, this course was the most important source (46%), with guess a close second (38%) and different source third (15%). Only a few percent of all respondents gave don’t remember as their source of answers.

4. Discussion

In this section, we explore the implications of three results: what is taught in courses or the taught curriculum [22], what students learn in the courses or the learned curriculum [22], and the relationship, if any, between the two.

4.1. Taught Curriculum

We want to understand what is taught in courses about the following topics: geophysics of continental crust, geology topics related to crustal provinces, radiometric dating, and sedimentary basins. These topics interest us because they are essential for explaining the subjects of our three videos: geophysics of the US crust, crustal provinces of the US crust, and sedimentary basins in the US crust.
What is the best way to understand what students are taught? The best way, of course, is to observe all the classes and see for yourself. Several hundred thousand students enroll annually in introductory geoscience courses in the US [23], which implies that several thousand geoscience courses are offered annually. Observing even a sample of these classes directly, an approach that is resource-heavy at any scale [24], would be operationally impossible for any small group of researchers [25]. The second-best method would be to survey geoscience instructors about the extent to which they teach our topics of interest. The 2016 National Geoscience Faculty Survey (NGFS) involved emailing a survey to 10,910 geoscience faculty, with 2615 respondents [23,26]. This approach has been used to understand motivators and inhibitors to change, that is, why and how geoscience faculty modify their course content and teaching methods [27]. That research was sponsored by the National Association of Geoscience Teachers (NAGT) and was made possible by funding from 16 National Science Foundation (NSF) grants. We did not take this approach because our effort had no external funding.
We developed what we think is the third-best approach: examining what the textbooks teach. This is a valid approach because some of the material in assigned textbooks will be used in some of the classes. We cannot know how much material from the textbooks is used in the classroom, but we are confident that there is some relationship between what textbooks say and what is taught. Some instructors assign textbooks that are never or rarely referred to in lectures; in such cases, the relationship between textbook content and students’ sources of information is likely to be weak. Other instructors follow textbooks closely in lectures; in such cases, students are more likely to read the textbook and use it as a source of information. Furthermore, publishers provide large amounts of additional material (figures, slide sets, test banks, interactive websites, etc.) that is either identical to or closely related to the textbook content; these materials encourage instructors to follow the assigned textbook more closely and students to read it. It therefore seems reasonable to assume a correlation between what is in assigned textbooks and what is taught in the classroom, although we cannot quantify this relationship. For these reasons, we focused on what is in textbooks as the third-best method of assessing the taught curriculum on our topics of interest. What is the best way to do this?
There are a number of ways to do this, but we think a reasonable approximation of a textbook’s content is to examine its index, as we have done for our topic-related keywords in Table 2 and Table 3. We believe it is valid to consider the six textbooks as a group. This is because the overall content and structure of introductory geology courses are broadly similar. The six textbooks we examined share fifteen chapter topics in common: plate tectonics, minerals, igneous rocks, volcanism, sedimentary rocks, metamorphic rocks, earthquakes, Earth’s interior, geologic structures, deep time, surface water, oceans and coasts, groundwater, deserts, and glaciers. Our topics of interest—crustal geophysics, crustal geology, radiometric dating, and sedimentary basins—should appear in some of these common chapters if they appear anywhere. Crustal geophysics topics should appear in chapters about the Earth’s interior and earthquakes. Crustal geology topics should appear in chapters about igneous rocks and plate tectonics. Radiometric dating should appear in chapters about deep time. Sedimentary basins should appear in chapters about sedimentary rocks and geologic structures.
To what extent could our use of index terms as a proxy for taught curriculum be misleading? Possible reasons are (1) that the taught curriculum has no relationship with textbook contents; (2) that index terms vary greatly from textbook to textbook; and (3) that textbooks differ in how much they emphasize different topics. The first concern is the most important, but without being in the classes, we cannot evaluate it. We can address the second and third concerns. Indexing standards are non-binding [18]; however, index terms vary only slightly by publisher, so we believe using index terms as a proxy for taught curriculum is unlikely to be more than slightly misleading on that account. The textbooks we surveyed all emphasized the same principal topics in introductory geology; therefore, we do not think their indices differ substantially because of textbook content. Supplementary Document 2 summarizes the incidence of keywords for each topic of interest. Geophysics keywords appear most frequently in all six textbooks. Sedimentary basin keywords appear least frequently in four of the six textbooks, tie for lowest frequency in a fifth, and are next to last in the sixth. Radiometric dating keywords are mentioned more frequently than geology keywords in two-thirds (4/6) of the textbooks, but neither topic approaches the sum of geophysics keywords. For these reasons, we think it is appropriate to treat the six textbooks as a group, as a way of accessing the taught curriculum for the topics we are interested in. Our method provides the “quick and dirty look” we need to guide our composition of videos on these topics, and we feel it gave us enough of an understanding of the taught curriculum to compare it meaningfully with the learned curriculum.
Some unexpected results emerged from the textbook assessment (Table 3, Figure 1). The most unexpected was the dominance of geophysics keywords over the other three categories. Our geophysics keywords in Table 2 comprised 40% to 63% of all keywords counted in the six textbooks, with a mean of 51%, whereas keywords for each of the other categories ranged from 6% to 32%. This is surprising because the purpose of all six textbooks is to introduce students to geology, not geophysics, as reflected in the titles of the books. We expected that geology keywords would appear more often than keywords for the other three topics.
A second surprising result is how little is taught about sedimentary basins. Our sedimentary basins keywords in Table 2 comprised 6% to 15% of all keywords counted in the six textbooks, with a mean of 10%. This is surprising because sedimentary basins are the principal source of the oil, gas, and coal that power America, and this industry provides a lot of jobs for geoscientists and others.
A third surprising result is the low percentages of our geology keywords from Table 2. Our geology keywords comprised 10% to 32% of all keywords counted in the six textbooks, with a mean of 22%. Part of the explanation for this is that we split the crustal province topic into two topics: geology and radiometric dating. If we combine these two topics into one, the related keywords range between 28% and 44% of the keywords we sought.
Would our results help a professor choose one of these textbooks for their class? It is not our intention to judge which of these textbooks is best for introductory courses. However, we can imagine that a glance at Table 1’s comparison of numbers of figures, tables, chapters, and pages might be useful.

4.2. Learned Curriculum

We decided not to use the results of the nine multiple-choice surveys for reasons explained in Section 2.2. We think that the second survey method’s written answers to open-ended questions are better because the students had to come up with their own answers instead of choosing between answers we provided. Using their own words meant they had to have learned keywords and examples, and learned them well enough to explain them. Here is what we think the results of the 42 open-ended surveys tell us about what introductory geoscience students learned about each of our four topics of interest:
Crustal geophysics: Students’ learning of this topic was limited, as indicated by the mean score of 1.9, statistically tied for worst along with radiometric dating and sedimentary basins. Most were not able to explain what the keywords meant, although most knew a few keywords (“Maps and readings of plate tectonics, earthquakes, volcanoes,” etc., Table 4). Most were unable to provide relevant examples, and most answers were vague. It seems that students have some knowledge about geophysics; for example, they are generally aware that movement occurs within the Earth and that earthquakes happen, but do not understand what geophysics can tell us about the crust. This result is both surprising and unsurprising; unsurprising because the class is introductory geology, not introductory geophysics, but surprising because so much emphasis is placed on geophysics in the taught curriculum.
Crustal geology: Most students knew some geology keywords, were able to explain what they meant, and offered some valid examples, and their answers were clearer than for any of the other three topics. The mean score of students’ answers was by far the highest (2.9) of the four topics, and each category in this topic scored higher than the corresponding categories in the other three topics. This result is unsurprising because introductory geology concepts are more easily expressed in plain language, without the specialized terminology required by the other three topics.
Radiometric dating: Students had either learned or not learned this topic. The mean score of students’ answers was tied for the worst along with crustal geophysics and sedimentary basins (2.0); 40 out of 42 scores were either 1 or 5. All four scoring categories reflected this bimodal learning outcome, as can be seen in radiometric dating’s high standard deviations across categories.
It is surprising that this course was the primary source of answers for radiometric dating, because many of these answers were incorrect. It is not surprising that the score was lower than for crustal geology, because the answer requires remembering specific terminology and processes, and introductory science students have been found to hold beliefs about radiation and radioactivity that differ from the settled science of radiometric dating [28].
Sedimentary basins: These scores were slightly better than for absolute dating, but it was still mostly a know-it/do-not-know-it question. The sources of information for this question were guessing and this course, with four students reporting this course and another source of information.
For sedimentary basins, this course was the primary information source. Like radiometric dating, this is surprising because many of these students’ answers were incorrect. The low emphasis on sedimentary basins in the taught curriculum may explain the poor scores. This lack of emphasis on teaching undergraduates about sedimentary basins is consistent with the results of a survey of sedimentology and stratigraphy concepts taught in undergraduate geoscience courses [29].
The combined means for each question show one topic students generally know more about (geology) and three others they know much less about (geophysics, absolute dating, and sedimentary basins). This is not surprising, since geology is the broadest of the four topics, and the other three questions are about subsets of the science. We think the casual survey and interview format is partly responsible for the low scores for clarity vs. vagueness. But answers to the geology question were expressed much more clearly than the other three, showing the connection between knowing about a topic and being able to make clear statements about it.
Figure 3 shows that many students reported learning about our topics of interest in the introductory geology course they were taking, with different sources, usually other science classes, as the second-most important source. Scoring shows students learned about crustal geology to an above-average extent, but learned much less about our other three topics. We believe this shows that what is taught about our topics of interest, and what is learned, are not the same for the low-scoring topics, namely, crustal geophysics, radiometric dating, and sedimentary basins.

4.3. Comparing Taught and Learned Curriculum

We assume that there is a positive correlation between taught curriculum and learned curriculum (ideal trend in Figure 4). If not, then what is the purpose of the taught curriculum? Building on that assumption, we examine what we have learned about that relationship. Specifically, we assume that the more a topic is taught, the more a student will learn about that topic. In our case, we think the number of index entries referring to our keywords is a proxy for the taught curriculum, and the numerical scores on the survey give us an understanding of the learned curriculum.
We expected a positive relationship between taught and learned curriculum, as illustrated by the ideal trend in Figure 4; however, this is not observed. There is no positive relationship between our measures of taught and learned curriculum; in fact, the overall trend is slightly negative (a minimal sketch of this comparison, using the topic-level values reported in this paper, follows below). What is the best way to interpret these surprising results? We consider three possibilities.
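To make the comparison concrete, here is a sketch using the topic-level values reported above (mean index-keyword share as the taught proxy, mean interview score as the learned measure); the Spearman correlation is our illustrative choice for four ranked pairs, not an analysis from the paper itself.

```python
# Sketch of the taught-vs-learned comparison. Values are the topic-level
# numbers reported in the text, in the order: geophysics, geology,
# radiometric dating, sedimentary basins.
from scipy import stats

taught_pct   = [50, 23, 17, 10]   # mean share of index keywords (%)
learned_mean = [1.9, 2.9, 2.0, 2.0]  # mean interview score (1-5 scale)

rho, p = stats.spearmanr(taught_pct, learned_mean)
print(f"Spearman rho = {rho:.2f}, p = {p:.2f}")  # rho < 0: the most-taught
# topic (geophysics) is not the best-learned one
```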
One possibility is that our procedure for assessing taught curriculum is flawed. We admit that our procedure is not the best one; however, it is the only one available to us for reasons explained in Section 4.1. Is it good enough? We believe we have developed a novel method of assessing taught curriculum. We acknowledge that we cannot reject the possibility that this novel procedure is flawed.
Another possibility is that our procedure for assessing learned curriculum is flawed. Our procedure is in common use and has been shown to be effective [10,28,30,31], which suggests it is adequate. Nevertheless, we think we can compose better questions for the student survey, as explored in Section 4.4, Designing a Better Student Survey.
A third possibility is that there may be two relationships: one for geophysics and one for the other three topics, which are related to geology. We observe that despite the dominance of crustal geophysics in our index of taught curriculum, students did not have a sound grasp of geophysics concepts. Students showed a similar grasp of geophysics, radiometric dating, and sedimentary basins, in spite of the fact that geophysics is emphasized more than the other two topics in our proxy of taught curriculum. In contrast, there were far fewer crustal geology keywords in the taught curriculum, yet the top mean score shows students learned significantly more about geology than about any of the other three topics. There may even be a separate relationship for each of the four topics, each with a different trend. If this explanation is useful, then geophysics is a much more difficult topic for students to understand than the other three, in which case the relationship between the geophysics taught and learned curriculum is weaker.
Why should this be so? We think it is because grasping geophysics imposes a large cognitive load on students [32]: it requires understanding abstract math and physics, which makes it challenging for many students to learn [33,34]. In contrast to geophysics, geology can be taught in ways that are more tangible and less abstract [35,36]. Low scores for radiometric dating and sedimentary basins correlate with the low emphasis on these topics in the taught curriculum. We conclude that there are two relationships between taught and learned curriculum: one trend for the more easily learned geology-related topics and another for the more difficult topic, geophysics. Cognitive load can be quantified [37]; however, we think our assessment and interview format provides sufficient evidence that a majority of students in our survey found some concepts easier to understand than others. This information will help us design video content that builds students’ intuition about abstract topics in the geosciences.

4.4. Designing a Better Student Survey

We learned a lot in the course of the survey, including how it might be improved in subsequent surveys in this ongoing study. We learned that evaluating the learned curriculum requires detailed student responses to interview questions that can be scored objectively. We feel that such responses could have been obtained by asking more, and more specific, questions. For example, if we wanted to know whether students knew the names of types of clouds, it would be better to ask “what are the names of types of clouds” than “tell me about clouds.” In composing new questions, we aim to be less general and more specific. Good survey questions do not hint at correct answers, but they must be posed clearly enough to rule out ambiguity.
Below, we consider the questions one by one. The first question was about geophysics: “Tell me what you can about the term geophysics, and what data can you think of that only geophysicists can tell us?” We decided this question is too vague and could be improved by making it more specific: “The science of physics applied to the Earth is called geophysics. What do we now know about the interior of the Earth that geophysicists discovered? How thick is the continental crust? What is one method that has taught us this?”
The second question was about geology: “Tell me what you can about geology? What is it, and what can it tell us?” We think this question is vague and could be improved by making it more specific: “You took a course about geology, which is the study of the Earth. What kinds of crust are there? Can you tell me what kinds of rocks make up the continental crust? About how old is the US continental crust?”
The third question was about absolute dating: “Fossils can help us know the age of some sedimentary rocks. But igneous rocks don’t have fossils. Can you tell me what kind of methods we use to calculate the age of igneous rocks?” We think this question was ambiguous because we did not specifically exclude relative dating, and it could be improved by making it more specific: “Knowing the age of rocks is critical for geology. The principle of superposition tells us that younger rock strata lie on top of older rocks. Fossils can help us know the relative age of some rocks. What kind of rocks contain fossils that can allow us to know the rock’s relative age? It is possible to get an absolute age of a rock; for example, 90 million years. What kinds of rocks are best for this? What natural process makes it possible to know this age?”
The fourth question was about sedimentary basins: “Can you tell me what a sedimentary basin is? Have you heard of one? Why would sedimentary basins be important to us?” We think this question was too vague and could be improved by making it more specific: “Sedimentary basins are large geologic structures filled with great thicknesses of sedimentary rocks, and there are several types of sedimentary basins. Can you name a type of sedimentary basin? Can you identify an important sedimentary basin in the US? Can you tell me why sedimentary basins are important to the US economy?”
All of the new questions have a multi-layered approach that begins by reminding students of the question’s topic, then asks three related and specific questions about it. We think this approach will better gauge the learned curriculum for two reasons. First, reminding students of the topic is likely to put them at ease and help them recall what they learned in the course about this topic. Second, this revised approach asks more questions about each topic, which makes the answers easier to score objectively.

4.5. Lessons for Producing the Videos

As stated in Section 1, we are making three short videos to teach introductory geology students about the US continental crust and its sedimentary basins. What did we learn from the surveys of the taught and learned curriculum that will help us make these videos more effective? We surveyed students’ understanding of four topics, and learned that they were equally uncomfortable with geophysics, radiometric dating, and sedimentary basins, and more comfortable with geology. Because of this, we think the videos need to pay more attention to each of the topics that students were uncomfortable with.
A threshold concept is defined as a concept that allows a breakthrough in conceptual understanding [38,39]. Threshold concepts are difficult to learn but transformative once mastered, such as the understanding of feedback loops in Earth systems [40]. We think we should and can incorporate threshold concepts into our videos. Below, we explore the lessons we have learned from our survey of taught and learned curriculum, and the threshold concepts we would like to present in each.
We observed that despite the dominance of geophysics in our index of taught curriculum, students did not have a sound grasp of these concepts (Section 4.3). One reason for this may be that there are so many geophysical methods: seismic refraction and reflection, measurement of crustal heat flow, magnetism, gravity surveys, electrical resistivity, and others. These are all new, unfamiliar, and intimidating to introductory geology students. Only a subset of these methods is used to determine crustal thickness, including seismic refraction, seismic reflection, gravity modeling, and receiver functions. Trying to explain even this subset in a single video would impose a heavy cognitive load and would not engage students’ interest. Effective videos must be designed to keep viewers interested, and information that does not contribute to explaining the video’s topic should be avoided; such information is extraneous cognitive load [41]. From our experience making geoscientific educational videos, we have found that the best way to hold a viewer’s attention is to focus on a single important method to scaffold their understanding [42]. Scaffolds are temporary supports used in constructing a building. In our case, the “building” is students’ understanding that geophysics is essential for determining the thickness of the continental crust. Explaining one of the most important geophysical methods used to do this, seismic refraction, rather than all of the methods, is a “scaffold” designed to ease the cognitive load [43,44] that geophysics imposes on students. Seismic refraction is not only a useful scaffold but also a threshold concept for crustal geophysics.
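To illustrate why this single method can carry that conceptual weight, consider the standard two-layer refraction result, which ties crustal thickness to one measurable quantity, the crossover distance (the velocities and distance below are typical illustrative values, not data from this study):

```latex
% Two-layer seismic refraction: the direct wave and the refracted (head)
% wave arrive simultaneously at the crossover distance x_{co}, from which
% the thickness h of the upper layer (the crust) follows directly:
h = \frac{x_{co}}{2}\sqrt{\frac{v_2 - v_1}{v_2 + v_1}}
% With typical velocities v_1 = 6 km/s (crust) and v_2 = 8 km/s (mantle)
% and an observed crossover distance of 200 km:
h = \frac{200\ \mathrm{km}}{2}\sqrt{\frac{8 - 6}{8 + 6}} \approx 38\ \mathrm{km}
```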
In the case of the second video about the age of US continental crust, we have the opportunity to present two threshold concepts: deep time and how we know this—radiometric dating [39,45]. We learned that students either knew something or did not know anything about radiometric dating. Specifically, many students are unaware of even the concept of radioactive decay, which is essential for calculating the absolute ages of crustal rocks. Every book we reviewed introduces the concept of radiometric dating, but student comprehension may be blocked by the large cognitive load of learning how radioisotopes decay and the meaning of parent/daughter ratios in rock. Students’ knowledge of time does not include deep time, and thus does not include the timescale of important Earth events [46].
Understanding the age of US crust is a threshold to understanding deep time, because its age encompasses 85% of the age of the Earth. If students can learn that the US crust formed over 3.5 billion years, they will have crossed the threshold of understanding deep time. But in order to do that, students must learn the threshold concept of radioactive decay and its utility in radiometric dating of crustal rocks.
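A minimal worked form of the age equation shows exactly what the radioactive-decay threshold concept demands of students (the isotope system and numbers below are illustrative, not from this study):

```latex
% Radiometric age equation: for decay constant \lambda and measured
% parent (P) and daughter (D) abundances, assuming no initial daughter,
t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{P}\right)
% Illustration with the Rb-Sr system (\lambda \approx 1.42\times10^{-11}
% per year): a measured D/P of 0.05 gives t = \ln(1.05)/\lambda \approx
% 3.4 billion years, an age comparable to the oldest US crust.
```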
In the case of sedimentary basins, we learned that, with few exceptions, students knew little about what these are and could not name any US sedimentary basins. Sedimentary basins are shown and sometimes explained in the textbooks we reviewed, but the concept is not stressed. Some textbooks applied the term sedimentary basin only to intracratonic basins. No textbook that we reviewed discusses the concept of sedimentary basins as important geologic structures where sediments accumulate, or the reasons these form in different tectonic environments. We learned that we need to introduce the concept itself first, because only then can we list the types of sedimentary basins and their distribution in the US.
Is there a threshold concept about sedimentary basins? We think there is. Learning the concept that not only mountains and canyons have layers and complexity, but that unseen structures lie beneath flat, featureless expanses of Earth’s surface—the Permian Basin of Texas, for example—potentially transforms the students’ understanding of Earth’s subsurface structure. Sedimentary basins, once comprehended, connect to the other two video topics, inducing an appreciation of deep time and of the utility of geophysics.

5. Conclusions

In this paper, we explored what students are taught and what they understand about how we use geophysics and geology to study continental crust, and what they are taught and know about sedimentary basins. We will use this information to help us design three videos about US continental crust and its sedimentary basins.
To do this, we had to understand both what students are taught and what they learn. We assessed the taught curriculum in six textbooks, using a novel approach based on counting keywords in the textbooks’ indices to quantify what the textbooks teach. We found that geophysics is emphasized more than geology, radiometric dating, and sedimentary basins in all six textbooks.
To assess the learned curriculum, we surveyed 42 students using a time-intensive, open-ended written survey and interview, meeting with each participant one at a time. Our purpose was to determine what students understand about four topics related to US crust and sedimentary basins: geophysics, geology, radiometric dating, and sedimentary basins. No relationship between taught and learned curriculum was observed (Figure 4). Results show students know more about geology, and know much less about the other three topics. We were surprised that most students had learned little about geophysics, despite its strong emphasis in textbooks. We believe this shows that grasping geophysics’ abstract math and physics concepts imposes a large cognitive load on students, which makes it challenging for many students to learn, and geology’s tangible and less abstract concepts impose a smaller cognitive load, making it easier to learn.
We found that there were three topics that students struggle to understand: (1) how we use geophysics to study the US continental crust; (2) how we use radiometric dating to distinguish crustal provinces and their ages; and (3) what US sedimentary basins are. Our results suggest that a well-executed video series could usefully complement a textbook-based taught curriculum. We think each video should emphasize a threshold concept: ideas that are difficult to learn, but once mastered, allow a breakthrough in conceptual understanding. Our threshold concepts are seismic refraction for crustal geophysics, deep time and radiometric dating for crustal geology, and economic importance and subsurface structure for sedimentary basins.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/geosciences15080296/s1, Supplementary Table S1: Student survey summary, Supplementary Table S2: Summary of taught vs. learned curriculum.

Author Contributions

Conceptualization, C.W.C. and R.J.S.; methodology, C.W.C. and R.J.S.; formal analysis, C.W.C. and R.J.S.; investigation, C.W.C.; data curation, C.W.C.; writing—original draft preparation, C.W.C.; writing—review and editing, C.W.C. and R.J.S.; visualization, C.W.C.; supervision, R.J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original data presented in the study are openly available in Zenodo at DOI: 10.5281/zenodo.15445404. The study was conducted in accordance with the Declaration of Helsinki, and the protocol (IRB-23-709) was approved by the Institutional Review Board on 8 August 2023. Informed consent for participation was obtained from all subjects involved in the study.

Acknowledgments

We appreciate the assistance we have received from introductory geoscience instructors at UT Dallas: Griffin, Pujana, and Sickmann. We received advice on how to assess learned curriculum from Urquhart of UT Dallas’ Math and Science Education Department and advice on statistical results from Denise Gregory. We also appreciate the publishers of the textbooks we reviewed, all of whom allowed us access to the latest editions. This is UTD geosciences contribution number 1733.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fortier, S.M.; Nassar, N.T.; Graham, G.E.; Hammarstrom, J.M.; Day, W.C.; Mauk, J.L. USGS Critical Minerals Review. Min. Eng. 2022, 74, 34–48. [Google Scholar]
  2. Schulz, K.; DeYoung, J.; Seal, R.; Bradley, D. Critical Mineral Resources of the United States—Economic and Environmental Geology and Prospects for Future Supply; U.S. Geological Survey Professional Paper 1802; U.S. Geological Survey: Reston, VA, USA, 2017. [Google Scholar]
  3. Sovacool, B.K.; Ali, S.H.; Bazilian, M.; Radley, B.; Okatz, J.; Mulvaney, D. Sustainable Minerals and Metals for a Low-Carbon Future. Science 2020, 367, 30–33. [Google Scholar] [CrossRef]
  4. Brady, P.V.; Freeze, G.A.; Kuhlman, K.L.; Hardin, E.L.; Sassani, D.C.; MacKinnon, R.J. Deep Borehole Disposal of Nuclear Waste. In Geological Repository Systems for Safe Disposal of Spent Nuclear Fuels and Radioactive Waste; Elsevier: Amsterdam, The Netherlands, 2017; pp. 89–112. ISBN 978-0-08-100642-9. [Google Scholar]
  5. Zou, L.; Cvetkovic, V. Disposal of High-Level Radioactive Waste in Crystalline Rock: On Coupled Processes and Site Development. Rock Mech. Bull. 2023, 2, 100061. [Google Scholar] [CrossRef]
  6. Petcovic, H.L.; Ruhf, R.J. Geoscience Conceptual Knowledge of Preservice Elementary Teachers: Results from the Geoscience Concept Inventory. J. Geosci. Educ. 2008, 56, 251–260. [Google Scholar] [CrossRef]
  7. Jones, J.P.; McConnell, D.A.; Wiggen, J.L.; Bedward, J. Effects of Classroom “Flipping” on Content Mastery and Student Confidence in an Introductory Physical Geology Course. J. Geosci. Educ. 2019, 67, 195–210. [Google Scholar] [CrossRef]
  8. Wiggen, J.; McConnell, D. Geoscience Videos and Their Role in Supporting Student Learning. J. Coll. Sci. Teach. 2017, 46, 44–49. [Google Scholar] [CrossRef]
  9. Willis, S.; Stern, R.J.; Ryan, J.; Bebeau, C. Exploring Best Practices in Geoscience Education: Adapting a Video/Animation on Continental Rifting for Upper-Division Students to a Lower-Division Audience. Geosciences 2021, 11, 140. [Google Scholar] [CrossRef]
  10. Suskie, L. Assessing Student Learning: A Common Sense Guide; John Wiley & Sons, Incorporated: Hoboken, NJ, USA, 2018. [Google Scholar]
  11. Suskie, L. Assessing Student Learning: A Common Sense Guide; John Wiley & Sons, Incorporated: Hoboken, NJ, USA, 2009; ISBN 978-0-470-28964-8. [Google Scholar]
  12. Bhattacharya, D.; Carroll Steward, K.; Forbes, C.T. Empirical Research on K-16 Climate Education: A Systematic Review of the Literature. J. Geosci. Educ. 2021, 69, 223–247. [Google Scholar] [CrossRef]
  13. Christman, R.; Aronoff, S.; Burmester, R.; Babcock, S.; Engebretson, D.; Schwartz, M.; Talbot, J.; Wodzicki, A. Evaluations of Some Introductory Geology Textbooks. J. Geol. Educ. 1985, 33, 188–191. [Google Scholar] [CrossRef]
  14. King, C.J.H. An Analysis of Misconceptions in Science Textbooks: Earth Science in England and Wales. Int. J. Sci. Educ. 2010, 32, 565–601. [Google Scholar] [CrossRef]
  15. Arthurs, L. What College-Level Students Think: Student Alternate Conceptions and Their Cognitive Models of Geoscience Concepts. In Qualitative Inquiry in Geoscience Education Research; Geological Society of America Special Paper 474; Geological Society of America: Boulder, CO, USA, 2011; pp. 135–152. ISBN 978-0-8137-2474-4. [Google Scholar]
  16. Czajka, C.D.; McConnell, D. An Exploratory Study Examining Undergraduate Geology Students’ Conceptions Related to Geologic Time and Rates. J. Geosci. Educ. 2018, 66, 231–245. [Google Scholar] [CrossRef]
  17. Ford, B.; Taylor, M. Investigating Students’ Ideas About Plate Tectonics. Sci. Scope 2006, 30, 38–43. [Google Scholar]
  18. Guffey, S.K.; Slater, T.F. Geology Misconceptions Targeted by an Overlapping Consensus of US National Standards and Frameworks. Int. J. Sci. Educ. 2020, 42, 469–492. [Google Scholar] [CrossRef]
  19. ANSI/NISO Z39.4-2021; Criteria for Indexes; National Information Standards Organization (NISO): Baltimore, MD, USA, 2022. [CrossRef]
  20. Stine, M.B.; Butler, D.R. A Content Analysis of Biogeomorphology within Geomorphology Textbooks. Geomorphology 2011, 125, 336–342. [Google Scholar] [CrossRef]
  21. Cheek, K.A. Commentary: A Summary and Analysis of Twenty-Seven Years of Geoscience Conceptions Research. J. Geosci. Educ. 2010, 58, 122–134. [Google Scholar] [CrossRef]
  22. Gómez-Gonçalves, A.; Corrochano, D.; Fuertes-Prieto, M.Á.; Ballegeer, A.-M. How Long Has It Taken for the Physical Landscape to Form? Conceptions of Spanish Pre-Service Teachers. Educ. Sci. 2020, 10, 373. [Google Scholar] [CrossRef]
  23. Bolin, P.E. What Is Learned? How Do We Know? Art Educ. 1999, 52, 4. [Google Scholar] [CrossRef]
  24. Egger, A. The Role of Introductory Geoscience Courses in Preparing Teachers—And All Students—For the Future: Are We Making the Grade? GSAT 2019, 29, 4–10. [Google Scholar] [CrossRef]
  25. Budd, D.A. Characterizing Teaching in Introductory Geology Courses: Measuring Classroom Practices. J. Geosci. Educ. 2013, 61, 461–475. [Google Scholar]
  26. Goldsmith, D.W. A Case-Based Curriculum for Introductory Geology. J. Geosci. Educ. 2011, 59, 119–125. [Google Scholar] [CrossRef]
  27. Egger, A.E.; Viskupic, K.; Iverson, E.R. Results of the National Geoscience Faculty Survey (2004–2016); National Association of Geoscience Teachers: Northfield, MN, USA, 2019. [Google Scholar]
  28. Riihimaki, C.A.; Viskupic, K. Motivators and Inhibitors to Change: Why and How Geoscience Faculty Modify Their Course Content and Teaching Methods. J. Geosci. Educ. 2020, 68, 115–132. [Google Scholar] [CrossRef]
  29. Prather, E. Students’ Beliefs About the Role of Atoms in Radioactive Decay and Half-Life. J. Geosci. Educ. 2005, 53, 345–354. [Google Scholar] [CrossRef]
  30. Kreager, B.Z. Concepts About Sedimentology and Stratigraphy in Undergraduate Geoscience Courses. Master’s Thesis, University of Nebraska-Lincoln, Lincoln, NE, USA, 2016. [Google Scholar]
  31. Camburn, E.M. Review of “Asking Students About Teaching: Student Perception Surveys and Their Implementation”; NEPC Review; National Education Policy Center: Boulder, CO, USA, 2012; p. 10. [Google Scholar]
  32. Tretter, T.R.; Jones, M.G.; Andre, T.; Negishi, A.; Minogue, J. Conceptual Boundaries and Distances: Students’ and Experts’ Concepts of the Scale of Scientific Phenomena. J. Res. Sci. Teach. 2006, 43, 282–319. [Google Scholar] [CrossRef]
  33. Jaeger, A.J.; Shipley, T.F.; Reynolds, S.J. The Roles of Working Memory and Cognitive Load in Geoscience Learning. J. Geosci. Educ. 2017, 65, 506–518. [Google Scholar] [CrossRef]
  34. Dentith, M.C.; Wheatley, M.R. An Introductory Geophysical Exercise Demonstrating the Use of the Gravity Method in Mineral Exploration. J. Geosci. Educ. 1999, 47, 213–220. [Google Scholar] [CrossRef]
  35. DiLeonardo, C.; James, B.R.; Ferandez, D.; Carter, D. Supporting Transfer Students in the Geosciences from Two-year Colleges to University Programs. New Dir. Community Coll. 2022, 2022, 107–118. [Google Scholar] [CrossRef]
  36. Gold, A.U.; Pendergast, P.M.; Ormand, C.J.; Budd, D.A.; Mueller, K.J. Improving Spatial Thinking Skills among Undergraduate Geology Students through Short Online Training Exercises. Int. J. Sci. Educ. 2018, 40, 2205–2225. [Google Scholar] [CrossRef]
  37. Waldron, J.W.F.; Locock, A.J.; Pujadas-Botey, A. Building an Outdoor Classroom for Field Geology: The Geoscience Garden. J. Geosci. Educ. 2016, 64, 215–230. [Google Scholar] [CrossRef]
  38. Hart, S.; Staveland, L. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload Advances in Psychology; Elsevier: Amsterdam, The Netherlands, 1988; Volume 52, pp. 139–183. ISBN 978-0-444-70388-0. [Google Scholar]
  39. Meyer, J.; Land, R. Overcoming Barriers to Student Understanding; Routledge: New York, NY, USA, 2006; ISBN 978-1-134-18995-3. [Google Scholar]
  40. Stokes, A.; King, H.; Libarkin, J.C. Research in Science Education: Threshold Concepts. J. Geosci. Educ. 2007, 55, 434–438. [Google Scholar] [CrossRef]
  41. Kastens, K.A.; Manduca, C.A.; Cervato, C.; Frodeman, R.; Goodwin, C.; Liben, L.S.; Mogk, D.W.; Spangler, T.C.; Stillings, N.A.; Titus, S. How Geoscientists Think and Learn. EoS Trans. 2009, 90, 265–266. [Google Scholar] [CrossRef]
  42. Brame, C.J. Effective Educational Videos: Principles and Guidelines for Maximizing Student Learning from Video Content. LSE 2016, 15, 1–6. [Google Scholar] [CrossRef] [PubMed]
  43. Van De Pol, J.; Volman, M.; Beishuizen, J. Scaffolding in Teacher–Student Interaction: A Decade of Research. Educ. Psychol. Rev. 2010, 22, 271–296. [Google Scholar] [CrossRef]
  44. Doering, A.; Veletsianos, G. Multi-Scaffolding Environment: An Analysis of Scaffolding and Its Impact on Cognitive Load and Problem-Solving Ability. J. Educ. Comput. Res. 2007, 37, 107–129. [Google Scholar] [CrossRef]
  45. Van Nooijen, C.C.A.; De Koning, B.B.; Bramer, W.M.; Isahakyan, A.; Asoodar, M.; Kok, E.; Van Merrienboer, J.J.G.; Paas, F. A Cognitive Load Theory Approach to Understanding Expert Scaffolding of Visual Problem-Solving Tasks: A Scoping Review. Educ. Psychol. Rev. 2024, 36, 12. [Google Scholar] [CrossRef]
  46. King, H. Student Difficulties in Learning Geoscience. Planet 2012, 25, 40–47. [Google Scholar] [CrossRef]
Figure 1. Percentage and standard deviation of topic keywords relative to the total number of keywords in all six indices.
Figure 2. Students’ mean scores with standard deviation.
Figure 3. Sources of answers reported by students.
Figure 4. Taught curriculum indicator (textbook mean keyword %) vs. learned curriculum (survey mean scores). The ideal trend line indicates the expected positive relationship between a topic’s emphasis in the taught curriculum and its learning outcome. The pink and brown lines indicate that geophysics topics are more difficult for introductory students to learn than geology-related topics. See text for further discussion.
Table 1. List of textbooks reviewed.

| Title | Author(s) | Publisher | Year | Chapters | Pages | Figures | Tables |
|---|---|---|---|---|---|---|---|
| Understanding Earth | Grotzinger, Jordan | W.H. Freeman, New York, NY, USA | 2020 | 23 | 784 | 621 | 29 |
| Essentials of Geology | Marshak | W. W. Norton & Company, New York, NY, USA | 2022 | 25 | 720 | 857 | 13 |
| Physical Geology | Plummer, Carlson, Hammersley | McGraw-Hill Education, New York, NY, USA | 2022 | 23 | 672 | 747 | 35 |
| Exploring Geology | Reynolds, Johnson, Morin, Carter | McGraw-Hill LLC, New York, NY, USA | 2019 | 19 | 704 | 2137 | 10 |
| Earth: An Introduction to Physical Geology | Tarbuck, Lutgens | Pearson Education, Inc., Hoboken, NJ, USA | 2020 | 24 | 784 | 961 | 9 |
| Physical Geology: Investigating Earth | Wicander, Monroe | Cengage Learning, Inc., Independence, KY, USA | 2023 | 18 | 528 | 496 | 24 |
Table 2. Textbook index search terms.

| Crustal Geophysics | Crustal Geology | Radiometric Dating | Sedimentary Basins |
|---|---|---|---|
| Moho | Laurentia | Absolute age | Continental rift basin |
| Seismic refraction | Crustal province | Uranium-lead | Passive margin |
| Crustal composition | Proterozoic | Isotopes | Foreland basin |
| Continental crust | Archean | Zircon | Forearc basin |
| Gravity | Granite | Radiometric dating | Intracratonic basin |
| Earthquakes | Phanerozoic | Radioactive | Back-arc basin |
| Magnetic | Paleozoic | Geochronology | Transtensional basin |
| Seismology | Mesozoic | Half-life | Hydrocarbons |
| Seismic reflection | Cenozoic |  | Subsidence |
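To make the index-search step concrete, the following is a minimal Python sketch of how entries in a textbook index could be tallied against the four topic term lists. The input format (a plain-text file with one index entry per line), the file name, and the abbreviated term lists are our illustrative assumptions, not a record of the actual workflow.

```python
# Minimal sketch of tallying textbook index entries against the
# Table 2 topic term lists. Term lists here are abbreviated; the
# input file name is hypothetical.
from collections import Counter

TOPIC_TERMS = {
    "Crustal Geophysics": ["moho", "seismic refraction", "gravity", "seismology"],
    "Crustal Geology": ["laurentia", "proterozoic", "archean", "granite"],
    "Radiometric Dating": ["zircon", "half-life", "geochronology", "isotopes"],
    "Sedimentary Basins": ["foreland basin", "passive margin", "subsidence"],
}

def tally_index(path: str) -> Counter:
    """Count index entries containing any search term for each topic."""
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for entry in f:
            entry = entry.strip().lower()
            for topic, terms in TOPIC_TERMS.items():
                if any(term in entry for term in terms):
                    counts[topic] += 1
    return counts

counts = tally_index("understanding_earth_index.txt")  # hypothetical file
total = sum(counts.values())
for topic, n in counts.items():
    print(f"{topic}: {n} entries ({100 * n / total:.0f}%)")
```

Substring matching keeps the tally permissive (e.g., “gravity anomaly” would count toward Crustal Geophysics); a stricter exact-match rule would yield lower counts.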
Table 3. Percentage and standard deviation of keywords in surveyed textbooks.

| Topics | Grotzinger, Jordan | Marshak | Plummer, Carlson, Hammersley | Reynolds, Johnson, Morin, Carter | Tarbuck, Lutgens | Wicander, Monroe | Mean | Std. Dev. |
|---|---|---|---|---|---|---|---|---|
| Geophys. % | 50 | 53 | 60 | 40 | 42 | 63 | 50 | 8.3 |
| Geology % | 19 | 32 | 15 | 29 | 22 | 10 | 24 | 7.6 |
| Radiometric dating % | 24 | 5 | 20 | 15 | 27 | 18 | 18 | 6.9 |
| Sedimentary basins % | 7 | 10 | 6 | 15 | 9 | 9 | 9 | 3.0 |
| Total # of keywords | 352 | 425 | 302 | 249 | 320 | 118 | 294 | 95.1 |
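The row statistics in Table 3 can be recomputed from the six per-textbook values; the sketch below does this with the population standard deviation. Because the printed per-book percentages are themselves rounded, values recomputed this way can differ slightly from the published means and standard deviations, which were presumably derived from unrounded keyword counts.

```python
# Sketch recomputing the Table 3 summary columns from the six
# per-textbook keyword percentages (rounded values as printed).
from statistics import mean, pstdev

keyword_pct = {
    "Geophysics": [50, 53, 60, 40, 42, 63],
    "Geology": [19, 32, 15, 29, 22, 10],
    "Radiometric dating": [24, 5, 20, 15, 27, 18],
    "Sedimentary basins": [7, 10, 6, 15, 9, 9],
    "Total # of keywords": [352, 425, 302, 249, 320, 118],
}

for topic, values in keyword_pct.items():
    # pstdev = population standard deviation (divide by n, not n - 1)
    print(f"{topic}: mean = {mean(values):.0f}, std. dev. = {pstdev(values):.1f}")
```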
Table 4. Mean and standard deviation for 42 answers to each of the four questions.

| Category | Geophysics Mean ± 1 Std. Dev. | Geology Mean ± 1 Std. Dev. | Radiometric Dating Mean ± 1 Std. Dev. | Sed. Basins Mean ± 1 Std. Dev. | Category Mean Score |
|---|---|---|---|---|---|
| Keyword presence | 2.4 ± 1.21 | 3.4 ± 0.85 | 2.0 ± 1.70 | 2.3 ± 1.44 | 2.53 |
| Keyword usage | 2.1 ± 1.24 | 3.0 ± 0.87 | 2.0 ± 1.67 | 2.2 ± 1.38 | 2.33 |
| Examples | 1.7 ± 1.13 | 2.8 ± 1.16 | 2.0 ± 1.65 | 2.0 ± 1.43 | 2.13 |
| Clarity/vagueness | 1.5 ± 1.12 | 2.4 ± 1.09 | 2.0 ± 1.67 | 1.7 ± 1.17 | 1.90 |
| Topic mean score | 1.9 ± 1.09 | 2.9 ± 0.90 | 2.0 ± 1.66 | 2.0 ± 1.21 |  |
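The marginal values in Table 4 follow directly from the 4 × 4 grid of cell means: each Category Mean Score averages its row across the four topics, and each Topic mean score averages its column across the four rubric categories. The sketch below uses only the published cell means, not the 42 individual answers; recomputed values match the table to within last-digit rounding.

```python
# Sketch recomputing the marginal means of Table 4 from the
# published 4 x 4 grid of rubric-category x topic cell means.
rubric = ["Keyword presence", "Keyword usage", "Examples", "Clarity/vagueness"]
topics = ["Geophysics", "Geology", "Radiometric dating", "Sed. basins"]
scores = [
    [2.4, 3.4, 2.0, 2.3],
    [2.1, 3.0, 2.0, 2.2],
    [1.7, 2.8, 2.0, 2.0],
    [1.5, 2.4, 2.0, 1.7],
]

# Category mean score: average each rubric row across the four topics.
for name, row in zip(rubric, scores):
    print(f"{name}: {sum(row) / len(row):.2f}")

# Topic mean score: average each topic column across the four rubric rows.
for j, topic in enumerate(topics):
    col = [row[j] for row in scores]
    print(f"{topic}: {sum(col) / len(col):.1f}")
```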