Article

Geoscience Mathematics Self-Efficacy Scale (GeoMSES) for Majors-Level Undergraduates: Psychometric Data

1 Cedar Lake Research Group LLC, Portland, OR 97293, USA
2 Science Education Resource Center, Carleton College, Northfield, MN 55057, USA
3 Earthscope Consortium, Washington, DC 20005, USA
4 Geology Department, Highline College, Des Moines, WA 98198, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 288; https://doi.org/10.3390/educsci16020288
Submission received: 18 November 2025 / Revised: 3 February 2026 / Accepted: 8 February 2026 / Published: 11 February 2026

Abstract

Self-efficacy is often investigated as a key attitudinal component of academic persistence and performance, and self-efficacy surveys can serve in research models as efficient and non-threatening assessment tools to augment or substitute for achievement or performance measures. The Geoscience Mathematics Self-Efficacy Scale (GeoMSES) builds on prior work in measurement of self-efficacy for mathematics by focusing specifically on students’ capacity to apply mathematical skills to typical problems encountered in majors-level undergraduate geoscience courses or in professional geoscience settings. The scale was developed as part of program evaluation research for a set of learning modules designed to augment existing geoscience curricula. In samples of undergraduate students in courses at 20 institutions (n = 351), data collected using the new 18-item scale had good psychometric properties including normal distributions, high internal reliability and stability, and significant predictive correlations with mathematics performance. Responses to particular items were highly intercorrelated yet not redundant; the questions may be useful when used individually or in subsets for targeted assessment of self-efficacy for particular skills. The GeoMSES may serve as a research or program evaluation tool or as a classroom assessment tool for instructors interested in using student self-efficacy to help them plan or assess their teaching.

1. Introduction

Mathematical and statistical competencies are critical components of success for undergraduate geoscience students, as quantitative skills are required across a range of prerequisite and majors-level courses. The ability to analyze data, model Earth processes, and interpret quantitative information is essential for understanding geological phenomena (Soule et al., 2018; Mosher et al., 2021; McFadden et al., 2021; Liemohn, 2024). However, many students enter geoscience programs with insufficient quantitative preparation and negative attitudes toward mathematics that can impede their success (Palestro & Jameson, 2020; Jameson et al., 2024). Given the growing emphasis on data literacy and mathematical applications in geoscience careers (Loudin, 2004; Manduca et al., 2008; Campbell et al., 2013; Viskupic et al., 2021; Lofton et al., 2025), it is imperative to assess and support students’ confidence in their mathematical abilities to understand how to enhance learning outcomes and retention. For educators and policy makers interested in broadening participation in geoscience programs and increasing the number of geoscience professionals in the U.S., fostering quantitative literacy is crucial as quantitative skills become increasingly important in geoscience careers (Macdonald et al., 2000; Manduca et al., 2008).
Self-efficacy, which is the belief in one’s capability to successfully complete a specific task, has been identified as a significant predictor of academic performance (Bandura, 1997, 2006; Finney & Schraw, 2003; Honicke & Broadbent, 2016). Within an academic context, self-efficacy is particularly relevant for assessing students’ confidence in their quantitative skills, as higher self-efficacy correlates with increased effort, perseverance, and strategic learning behaviors (Mega et al., 2014). Prior research has demonstrated that self-efficacy measures related to statistics and mathematics predict performance and attitudes toward quantitative subjects (Finney & Schraw, 2003; Nielsen & Moore, 2003; Skaalvik et al., 2015; Charalambous et al., 2021; Jameson et al., 2024). In addition, enhancing self-efficacy through meaningful, real-world applications of quantitative skills can mitigate math anxiety and improve student success in STEM disciplines (Teasdale et al., 2018; O’Leary et al., 2021).
Several recent studies have investigated self-efficacy in the context of the geosciences. Researchers have explored the role of field experiences on students’ self-efficacy and how it may affect their persistence and interest in geoscience (Kortz et al., 2019; Fairchild et al., 2023). In introductory geoscience courses, teaching activities that engage students in prediction, observation, and critical thinking, as well as authentic research paired with nature of science reflections have proved effective at increasing geoscience self-efficacy (Moss et al., 2018; James et al., 2021). These studies highlight the value of assessing self-efficacy and the potential impact of pedagogical interventions. However, these studies have not focused specifically on mathematics self-efficacy among geoscience students.
Despite the demonstrated importance of self-efficacy in academic success, there is currently no self-efficacy scale specifically designed to measure the mathematical confidence of majors-level geoscience students and no baseline data for geoscience majors’ mathematics self-efficacy. Existing self-efficacy measures are often too general and not attuned to the skills required for geoscience coursework (Nielsen & Moore, 2003). Given that self-efficacy is task-specific (Pajares, 1996, 2003), a geoscience-specific mathematics self-efficacy scale is needed to accurately assess students’ confidence in applying quantitative skills within the discipline. This scale could provide valuable insights for educators seeking to improve instructional strategies and student support mechanisms.
Building on the success of the original The Math You Need project for introductory undergraduate geoscience skills (TMYN-Intro; Wenner & Baer, 2015), the recent U.S. National Science Foundation-funded TMYN-Majors project (serc.carleton.edu/mathyouneed/geomajors; Math You Need—Majors, n.d.) developed and published 14 web-based, co-curricular modules to support the teaching and learning of quantitative skills in majors-level geoscience courses (Table 1). Each online module includes an instructor guide with learning goals, teaching recommendations and resources, and three components for students: an introduction to the quantitative concept, geoscience-situated practice problems, and a bank of math quiz questions that instructors can integrate into learning management systems. Data from these questions can be used to assess skills or self-efficacy of individual students or classes of students in order to plan instruction and gauge the success of instruction. Two-person teams of Earth science instructors from different subdisciplines created the modules at module development workshops. These module authors are faculty members at institutions of varying sizes and contexts (Table A1). Each module was peer reviewed against a materials rubric and then tested with students by both authors and non-author faculty members. They tested the modules in a wide range of Earth science related courses, including oceanography, climate science, environmental science, geomorphology, petrology, and structural geology.
For this study, we developed and then generated initial psychometric data for the Geoscience Mathematics Self-Efficacy Scale (GeoMSES), an easily administered instrument designed to measure mathematics self-efficacy of undergraduate geoscience students. The GeoMSES builds on the Mathematics Self-Efficacy Scale (MSES) developed by Nielsen and Moore (2003) for high-school students, which was tested and validated by Jameson et al. (2024) with undergraduate students in introductory geoscience courses. The GeoMSES aligns with the mathematical demands of majors-level geoscience courses, incorporating tasks related to data analysis, modeling, and applied quantitative reasoning. By evaluating the reliability and validity of the GeoMSES, this research seeks to establish a foundational tool for assessing and addressing the quantitative skills of geoscience majors. Preliminary evidence presented in this article supports the instrument’s utility, suggesting that it holds promise for further application in educational settings and future research on student success in geoscience disciplines.

2. Materials and Methods

Starting with the Mathematics Self-Efficacy Scale (MSES, Nielsen & Moore, 2003), the following modifications were made to produce a new 18-item scale designed to be more aligned with the mathematics applications typically needed in majors-level undergraduate geoscience courses and geoscience careers:
  • Instructions were modified to remove italics and specification of where mathematical tasks are to be performed
  • Response options were expanded from a 5-point scale to a 10-point scale in order to make the questions more sensitive at the individual item level (response options range from 0 to 9, anchored at 0 with “Not at all confident” and at 9 with “Extremely confident”)
  • Several item stems were retained with wording identical to the MSES:
    “Work with decimals”
    “An algebra problem”
    “A problem in trigonometry”
    “Work with fractions”
  • Some MSES item stems were retained with minor changes:
    “A simultaneous equation” became “Solve simultaneous equations”
    “Determine the degrees of a missing angle” and “Determine the value of a missing side length” were combined into “Determine the degrees of a missing angle or the length of a missing side of a triangle”
    “Calculate values of area and volume” became “Calculate the values of area and volume”
  • One original MSES item stem was removed: “Sketch a curve”
  • Eleven new item stems targeting skills relevant to geoscience were created:
    “Read a ternary diagram”
    “Find the components of a vector”
    “Solve a differential equation”
    “Use formulas in Excel or Sheets”
    “Create a histogram”
    “Calculate a linear regression”
    “Convert between different units”
    “Calculate a standard deviation from a set of numbers”
    “Work with exponential equations”
    “Write numbers in scientific notation”
    “Determine an integral”
The resulting set of questions was included in the student attitude and self-efficacy survey administered to participants on pre- and post-surveys in the TMYN-Majors project. The GeoMSES scale score for each student was calculated as the mean of their responses to the 18 items. Course instructors were also able to use WAMAP (a free web-based platform for mathematics assessment and course management, wamap.org) to include baseline and follow-up math performance tests aligned with module content in their pre- and post-surveys for students. Math performance relevant to the course content was measured by each instructor by selecting a set of questions that were aligned with the module topics assigned in the course. Students answered 3–4 questions per module on the pre- and post-surveys, with the total number of questions dependent on the number of modules used in each course. Then, each student’s math performance score was normalized to produce a percentage correct, which was analyzed as a math performance measure.
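The scoring described above can be sketched in a few lines. The following is an illustrative example (not the project's actual code) showing the scale score as the mean of a student's 18 item responses (each on the 0 to 9 scale) and the performance score normalized to a percentage correct; the response values are hypothetical.

```python
from statistics import mean

def geomses_score(item_responses):
    """Scale score = mean of the 18 item responses (0-9 each)."""
    assert len(item_responses) == 18, "GeoMSES has 18 items"
    return mean(item_responses)

def performance_percent(n_correct, n_questions):
    """Normalize a math performance test to percent correct."""
    return 100.0 * n_correct / n_questions

# Hypothetical student: 18 item responses, 9 of 12 quiz questions correct
responses = [7, 8, 6, 5, 9, 4, 6, 7, 8, 5, 6, 7, 3, 8, 9, 6, 5, 7]
scale_score = geomses_score(responses)    # about 6.44
performance = performance_percent(9, 12)  # 75.0
```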
The study was approved by the Institutional Review Board of Highline College (IRB00012012, 27 September 2022). Students were asked to complete the GeoMSES scale and the math performance test at the beginning and again near the end of their course, using an online portal built in the WAMAP system. Student data was managed locally by course instructors. The research team downloaded and de-identified the raw data using anonymous ID codes that allowed student pre- and post-surveys to be matched for analysis of math performance and self-efficacy.
Student data were collected during the academic terms Fall 2023, Spring and Fall 2024, and Spring 2025 and analyzed for internal consistency reliability and for validity in predicting student performance on quantitative skills tasks. In Spring 2025, a subset of instructors asked students who were not involved in the intervention study to complete the self-efficacy questions again, approximately two weeks after their initial pre-survey, to provide insight into the stability (test–retest reliability) of scale scores before students were potentially influenced by the TMYN-Majors modules or other learning activities in their courses. Data from students who completed the self-efficacy questions twice were included in the stability analysis.
Data collected from 500 students who participated in trials of the modules were screened for completeness on the measures pertinent to this study; the sample analyzed here included data from 351 students in 29 courses taught by 20 instructors at a variety of higher education institutions (Table A1). The data include demographic descriptors, information about prior course-taking, and pre-surveys and post-surveys on math performance and self-efficacy. This report is focused on psychometric data related to the newly developed self-efficacy scale.

3. Results

3.1. Sample and Distribution Characteristics

A total of 351 students at 20 institutions returned pre- and post-survey responses for mathematics self-efficacy and performance related to the TMYN-Majors modules. GeoMSES scores had close to normal distributions over the 0 to 9 range of possible scores (Figure 1 and Table 2). Pre-survey GeoMSES scores ranged from 0.33 to 9.00 with a mean of 5.79 (SD = 1.48) and minimal skewness (−0.40) and kurtosis (0.31). Post-survey scores ranged from 1.50 to 9.00 with a mean of 6.71 (SD = 1.36) and modest skewness (−0.62) and kurtosis (0.29).
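The skewness and kurtosis statistics reported above can be computed directly from raw scale scores. A minimal sketch follows, using the population-moment formulas; note that statistical packages often apply small-sample corrections, so reported values may differ slightly.

```python
from statistics import mean, pstdev

def skewness(xs):
    """Third standardized moment (population formula)."""
    m, s = mean(xs), pstdev(xs)
    return sum((x - m) ** 3 for x in xs) / (len(xs) * s ** 3)

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (population formula)."""
    m, s = mean(xs), pstdev(xs)
    return sum((x - m) ** 4 for x in xs) / (len(xs) * s ** 4) - 3.0

# A symmetric distribution has skewness 0; values of both statistics
# near 0 are consistent with an approximately normal distribution.
symmetric = [1, 2, 3, 4, 5]
```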

3.2. Internal Consistency Reliability

In the combined pre-survey data for all academic terms, the internal consistency of the 18-item GeoMSES at the beginning of the term, estimated using Cronbach’s Alpha, was 0.91. Inter-item correlations ranged from 0.11 to 0.63. Deletion of any single item did not cause the reliability coefficient to increase; Cronbach’s Alpha with a single item deleted ranged from 0.90 to 0.91. Results were similar in each term; considering data from the beginning of each term separately, the reliability coefficients ranged from 0.88 to 0.93.
The internal consistency of the scale in post-survey data collected at the end of the term was 0.92 (Cronbach’s Alpha). Inter-item correlations ranged from 0.15 to 0.66. Deletion of any single item did not cause the reliability coefficient to increase; Cronbach’s Alpha with a single item deleted ranged from 0.91 to 0.92. Results were similar in each term; the reliability coefficients ranged from 0.90 to 0.93 in data from different terms.
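Cronbach's Alpha, the internal consistency estimate used above, can be computed from a students-by-items response matrix as a function of the item variances and the variance of the total scores. A minimal sketch with hypothetical data (not the study's responses):

```python
from statistics import pvariance

def cronbach_alpha(data):
    """data: one row per student, each row a list of k item responses."""
    k = len(data[0])
    item_cols = list(zip(*data))                      # per-item columns
    sum_item_vars = sum(pvariance(col) for col in item_cols)
    total_var = pvariance([sum(row) for row in data])
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Perfectly consistent (parallel) items yield alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```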

3.3. Item Characteristics and Scale Factor Structure

Most individual items had relatively normal distributions, though in this sample, responses to some of the questions were negatively skewed (most students responded in the upper half of the possible score) (Table 3). Scores on individual items at pre-survey ranged from 0 to 9; average scores ranged from 3.97 (“determine an integral”) to 7.52 (“an algebra problem”). At post-survey, scores on individual items ranged from 0 to 9; average scores ranged from 5.17 (“find the components of a vector”) to 7.88 (“work with decimals”).
Skewness ranged from 0.21 to −1.48 at pre-survey and from −0.34 to −1.26 at post-survey. Items that were negatively skewed with relatively high mean scores (tasks for which many students reported higher self-efficacy) included “an algebra problem,” “work with decimals,” “calculate the values of area and volume,” “write numbers in scientific notation,” “work with fractions,” “determine the degrees of a missing angle or the length of a missing side of a triangle,” and “convert between different units.” However, even on the most skewed items, at baseline more than half of the students in these samples did not respond at the ceiling (a response of 9 on the 0 to 9 scale).
Principal axis factoring with oblique rotation (Oblimin) was applied to examine the factor structure of the pre-survey and post-survey data (Table 3). After oblique rotation to allow correlated factors, three factors with Eigenvalues greater than 1.0 were extracted, accounting for 51.00 percent of the variance in pre-survey responses and 52.92 percent of variance in post-survey responses. Communalities for individual items ranged from 0.30 to 0.62 in the pre-survey data and from 0.32 to 0.64 in the post-survey data. The correlation between factor 1 and factor 2 was 0.50 at pre-survey and 0.54 at post-survey; the correlation between factor 1 and factor 3 was 0.46 at pre-survey and 0.65 at post-survey; the correlation between factor 2 and factor 3 was 0.35 at pre-survey and 0.46 at post-survey.
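As a back-of-envelope illustration of the "Eigenvalues greater than 1.0" extraction criterion (not a reproduction of the study's analysis): for a k-by-k correlation matrix in which every pair of items shares a common correlation rho, the eigenvalues have a closed form, one eigenvalue of 1 + (k − 1)rho and (k − 1) eigenvalues of 1 − rho. The value rho = 0.37 below is an assumed average inter-item correlation chosen for illustration.

```python
def equicorrelation_eigenvalues(k, rho):
    """Eigenvalues of a k x k equicorrelation matrix."""
    return [1 + (k - 1) * rho] + [1 - rho] * (k - 1)

# With k = 18 items and an average inter-item correlation near 0.37,
# the first eigenvalue alone accounts for roughly 40% of total variance,
# comparable to a dominant first factor; additional factors emerge when
# the real correlations are not uniform across item pairs.
eigs = equicorrelation_eigenvalues(18, 0.37)
first_factor_share = eigs[0] / 18
```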
The items noted above as those with the highest mean scores at pre-survey (and the greatest negative skew in their distributions) were associated with the first factor, along with two other items addressing ability to “work with exponential equations” and solve “a problem in trigonometry.” This factor accounted for 39.30 percent of variance in the pre-survey analysis and 43.31 percent of variance in the post-survey analysis. These math tasks may be those that students were most likely to have encountered and practiced in prior coursework.
Items most associated with the second factor, which accounted for 6.62 percent of variance in the pre-survey analysis and 6.44 percent of variance in the post-survey analysis, included “solve a differential equation,” “determine an integral,” “find the components of a vector,” “calculate a linear regression,” and “solve simultaneous equations.” Students may have been less likely to have prior experience with these math tasks.
Likewise, items most associated with the third factor (accounting for 5.08 percent of variance in the pre-survey analysis and 3.18 percent of variance in the post-survey analysis) included “create a histogram,” “use formulas in Excel or Sheets,” “read a ternary diagram,” and “calculate a standard deviation from a set of numbers.” These graphical and statistical tasks may have also been less familiar to many students based on their prior course experience.
Descriptive statistics for individual items and pattern matrix loadings are displayed together in Table 3 in order of descending mean item-level self-efficacy at pre-survey to help visualize the relationship between mean self-efficacy and the three extracted factors.

3.4. Test–Retest Reliability (Stability)

In order to estimate the stability of the GeoMSES, two geoscience instructors who were not using the modules in their courses asked their students to complete the self-efficacy questions twice: once at the very beginning of the course, and then again approximately two weeks later. This yielded data from 26 students for an estimation of how stable the GeoMSES scores are over short periods of time in the absence of an intervention designed to influence them.
The correlation between the first and second pre-surveys was 0.87, and the difference between mean scale scores was not significant (t(25) = 0.98, p = 0.34). Means and standard deviations were 6.80 (1.47) for the first pre-survey and 6.66 (1.43) for the second pre-survey.
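The stability statistics above, a Pearson correlation between the two administrations and a paired t test on the mean difference, can be reproduced from matched score pairs. A minimal sketch (the formulas are standard; the data passed in would be each student's first and second pre-survey scale scores):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def paired_t(x, y):
    """t statistic for the mean paired difference (df = n - 1)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    md = sum(d) / n
    sd = sqrt(sum((v - md) ** 2 for v in d) / (n - 1))
    return md / (sd / sqrt(n))
```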

3.5. Validity

GeoMSES scores were significant predictors of math performance. At the pre-survey, the correlation between GeoMSES scores and math performance scores was 0.32 (p < 0.01) over all 4 terms. At the post-survey, the correlation between GeoMSES scores and math performance scores was 0.23 (p < 0.01). As mentioned in the Methods Section, the math performance scores were aligned with the specific modules used in each course implementation, whereas the GeoMSES score included self-efficacy for performing a uniform suite of math and statistics skills relevant to geoscience, including topics covered in TMYN-Majors modules.
In a regression analysis, GeoMSES post-survey scores accounted for significant variance in end-of-course math performance scores over and above variance accounted for by baseline math performance scores (R-squared = 0.25, p < 0.001; beta coefficient for baseline math performance = 0.45, p < 0.001, beta coefficient for GeoMSES = 0.12, p = 0.018). The semipartial correlation between math performance and GeoMSES scores was 0.11, indicating that self-efficacy as measured by the GeoMSES predicted more than 1 percent of additional variance in student math performance scores after the predictive value of baseline performance scores was taken into account.
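The link between the semipartial correlation and incremental variance is that the squared semipartial equals the gain in R-squared from adding the predictor (here, 0.11 squared is about 0.0121, just over one percent). A sketch of this identity for a two-predictor regression, computed from correlations alone; the correlation values below are illustrative, not the study's coefficients:

```python
from math import sqrt

def two_predictor_stats(r_y1, r_y2, r_12):
    """R^2 for y ~ x1 + x2 and the semipartial r of x2, from correlations
    (r_y1: y with x1, r_y2: y with x2, r_12: x1 with x2)."""
    r2_full = (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)
    sr_2 = (r_y2 - r_y1 * r_12) / sqrt(1 - r_12**2)
    return r2_full, sr_2

# Identity check: the squared semipartial equals the R^2 gained by adding
# x2 to a model that already contains x1 (illustrative correlations only).
r2_full, sr_2 = two_predictor_stats(0.45, 0.30, 0.35)
extra_variance = sr_2 ** 2  # equals r2_full - 0.45**2
```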

4. Discussion

The findings reported here support the reliability of data from the GeoMSES survey instrument and provide preliminary data for validity of the tool as a measure of undergraduate student self-efficacy for applying mathematical concepts and skills in geoscience courses and professional contexts. Data resulting from the survey instrument were found to have high internal consistency reliability, indicating that the items converge on measuring a single coherent construct. On the other hand, the items were not so highly correlated that they seemed redundant. Individual items appear to contribute to measuring the central construct of self-efficacy for applying quantitative skills in geoscience settings, while retaining value as individual items, each measuring self-efficacy for a specific quantitative skill application which may be taught or learned independently from other skills.
More detail on internal reliability was provided by an analysis of the factor structure in student pre-survey and post-survey responses. The survey was designed to assess self-efficacy for some basic math applications that are often considered prerequisites for majors-level Earth science students and also to assess other math skills that are more advanced and/or specific to Earth science courses and careers. The three self-efficacy factors that became apparent in the responses appear to correspond to these different levels of prior experience with the underlying math content. Although responses to the questions assessing self-efficacy for the less-advanced math skills exhibited moderate negative skew, this is not surprising among upper-level geoscience students, and these items may be particularly useful in student populations that are under-prepared for the quantitative demands of majors-level coursework. A planned next version of the GeoMSES scale will include additional items that target advanced math applications in geoscience settings. Acting in concert with the existing items that focus on advanced applications, these new items are expected to reduce the overall mean GeoMSES score and attenuate the mild negative skewness of the full scale score; the less-advanced math skill items will be retained in order to afford diagnosis of basic math skill self-efficacy deficits in particular student populations. The scale is intended to be sensitive to the full range of mathematics self-efficacy observed in upper-level geoscience students.
The test–retest reliability (stability) analysis indicated that student scale scores did not show volatility over short periods of time. Although the sample size for this analysis was much smaller than the main sample size, this preliminary finding suggests that the scale measures a characteristic of students that does not vary a great deal over short periods of time, and therefore, day-to-day noise (a type of measurement error) is not likely to negatively affect use of the scale in educational settings.
Regarding measurement validity, the new self-efficacy scale correlated with actual math performance and predicted a significant amount of variance in post-survey math performance even after statistically accounting for prior math performance. Given the design of this study, particularly the variability in how math performance was measured in different courses, further study of this finding is warranted to determine how robust this estimated relationship may be in different circumstances.
Because in some settings a self-report survey measure of self-efficacy for specific geoscience applications of mathematics will be more efficient to administer than a measure of actual performance, it is worthwhile to know that the GeoMSES may serve as a useful proxy measure that predicts variability in student math performance. The GeoMSES may be less time-consuming and stressful for both students and instructors than a full-fledged performance assessment, and may provide actionable insights into not only how students perceive their own capability to apply mathematics in geoscience contexts, but also their actual quantitative skills.
The GeoMSES appears promising for guiding or evaluating instructional practice in geoscience courses or for use in research aimed at building or testing models of how psychological factors interact and predict geoscience performance outcomes. Further investigation of the predictive validity of the GeoMSES scores seems warranted, focusing on better understanding the extent to which this measure of student self-efficacy predicts math performance and overall success in specific geoscience course, program, and career settings. In a meta-analysis of 241 data sets, academic self-efficacy was found to have a large correlation with undergraduate students’ academic performance—“the strongest correlate” of 50 measures examined (Richardson et al., 2012).
In addition to potentially predicting some variance in math performance, attitude measures such as the GeoMSES can serve other related roles in research and applied use. As a predictor of both performance and persistence (Bandura, 1997, 2006; Finney & Schraw, 2003; Mega et al., 2014; Honicke & Broadbent, 2016), self-efficacy scores may help identify students at risk of dropping out of geoscience courses or programs, who may need individual assistance or resources to retain them in geoscience career tracks. Instructors could use the scale and/or the individual items to assess students, plan instruction accordingly, and measure outcomes to determine whether re-teaching or additional practice is indicated to bolster skills or confidence. In particular, the GeoMSES measure could be useful for identifying students with insufficient quantitative preparation or low expectations about their mathematics skills (Palestro & Jameson, 2020; Jameson et al., 2024). Even among students who are able to successfully apply quantitative skills to solve geoscience problems, low self-efficacy or other psychological factors may indicate a likelihood of avoiding rather than embracing these skills, limiting their future prospects for further education or geoscience career paths. Instructors may augment course offerings by providing targeted support to improve particular quantitative skills and to ameliorate psychological barriers (Klyce & Ryker, 2024; Headley, 2023).
On a larger scale, the GeoMSES could be a useful instrument for program evaluation research aimed at estimating the effectiveness of geoscience courses, curriculum, or programs to support student quantitative skills and self-efficacy. In conjunction with more thorough measurement of math and geoscience performance outcomes, the instrument could aid in development of geoscience-specific research models designed to better integrate and understand the role of attitudinal factors in course and career outcomes. Domain-specific assessments of self-efficacy tend to be stronger predictors of real-world performance than more general self-efficacy measures (Bandura, 1997, 2006; Pajares, 1996). Studies of motivational and persistence-related factors in geoscience education and careers could benefit from a more focused and precise measure of self-efficacy for quantitative applications in this domain. Prior research shows that self-efficacy not only predicts academic achievement but also influences interest development, goal setting, and persistence in STEM majors and careers (Lent et al., 1994; Robbins et al., 2004). Within the geosciences—where mathematics often functions as both a skill gatekeeper and an attitudinal barrier (Jameson et al., 2024; Sexton et al., 2022)—quantifying students’ confidence in using math for majors-level domain-specific tasks could clarify how perceived competence shapes their motivation to persist in the field. Further specifying links between mathematics self-efficacy, geoscience self-efficacy, and interest could help explain gender or identity-based disparities in academic and career persistence (Jameson et al., 2024; Fairchild et al., 2023; Kortz et al., 2019). 
Moreover, in professional geoscience contexts, a domain-focused self-efficacy measure could help evaluate how confidence in quantitative reasoning evolves through graduate training, fieldwork, or applied industry roles—offering insight into how efficacy beliefs sustain continued learning and problem-solving over a career. A majors-level geoscience-specific mathematics self-efficacy instrument provides researchers and educators with a precise tool for connecting cognitive, affective, and motivational processes to performance, persistence, and identity development in geoscience education and practice.
The differing quantitative demands of geoscience courses could be important to consider when using the GeoMSES in assessment. Each department or program may offer courses at different levels in their degree programs, but often courses such as geophysics and hydrogeology have higher quantitative demands than courses such as Earth materials or sedimentology. At the program level, atmospheric science or marine science programs may have higher quantitative requirements than geology programs. However, these differences vary by institution, and they may affect the interpretation of GeoMSES scale and item scores. A useful example is the contrast between linear regression and ternary diagrams, both items with low pre-survey mean scores. If students have taken an introductory statistics course, it is likely they have been introduced to linear regression. However, they may not have applied linear regression in a geoscience context and they may still not be confident about when or how to apply it. In contrast, students are unlikely to have been exposed to ternary diagrams outside of a geoscience context, though students in an upper-level petrology or geochemistry course may have been introduced to ternary diagrams in a lower-level Earth materials course. The GeoMSES may help faculty assess the potentially uneven skill familiarity and self-efficacy development of students in relation to specific course and program content and expectations.
The study had some notable limitations in addition to those discussed above. In the absence of a control group there is no basis to draw strong inferences about possible causal effects of module implementation on self-efficacy or math performance. The data came from multiple small courses at a range of institution types across many subdisciplines in the geosciences. The range of courses and institutions provided a diverse cross-section of geoscience students, but it makes analysis of demographic factors such as class year or exploration of specific subdisciplines problematic due to small subgroup sizes. Thus, we have not provided disaggregated analysis by class year or course subdiscipline. A larger study including more students and/or less variability in student characteristics and course/program contexts could provide more focused and robust evidence concerning variability in students’ math performance and self-efficacy by class year and Earth science subdiscipline.
The reliability, validity, and utility of data produced using a measurement tool like the GeoMSES are context-dependent; indeed, these are attributes of specific data sets, not of the instrument itself. To the extent possible, users would do well to check the distribution characteristics, reliability, and validity of their own data to ensure that the instrument provides useful data with their students and in their setting. The findings reported here demonstrate that data from the GeoMSES can be reliable and valid for the intended purposes, but the extent to which the measure produces useful data may vary in different settings.
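For users who want to run such checks on their own data, the sketch below computes Cronbach's alpha and sample skewness for a simulated 351 x 18 response matrix. The simulated responses are purely illustrative stand-ins; real users would substitute their own survey data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def sample_skewness(x):
    """Adjusted Fisher-Pearson sample skewness."""
    n, m, s = len(x), x.mean(), x.std(ddof=1)
    return (n / ((n - 1) * (n - 2))) * np.sum(((x - m) / s) ** 3)

# Simulated stand-in for 18-item GeoMSES responses on the 0-9 scale.
rng = np.random.default_rng(42)
latent = rng.normal(6.0, 1.5, size=(351, 1))                       # latent self-efficacy
responses = np.clip(latent + rng.normal(0.0, 1.2, (351, 18)), 0, 9)

alpha = cronbach_alpha(responses)
skew = sample_skewness(responses.mean(axis=1))                     # scale-score skewness
```

Alpha values above roughly 0.8 are conventionally taken to indicate good internal consistency, in line with the high internal reliability reported here for the GeoMSES data.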
Data reported here were generated using the original 18-item version of the GeoMSES. A subsequently developed version includes six additional items targeting advanced math applications, aligned with new modules added to the TMYN-Majors project offerings for geoscience instructors and programs. Contact the authors for more information on the instrument reported here or newer iterations of the GeoMSES.

Author Contributions

All authors contributed to the conceptualization of this work, funding acquisition, and the overall investigation. Specific contributions included methodology, formal analysis, software, and writing of the initial draft, M.C. and R.M.; data organization, B.P.-S. and R.M.; review and editing, B.P.-S. and E.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Science Foundation, Division of Undergraduate Education, grant numbers NSF-2234225, 2234237, and 2336447. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Institutional Review Board Statement

The study was approved by the Institutional Review Board of Highline College (IRB00012012, 27 September 2022). The Highline IRB determined the study to be exempt from further review because it was part of standard educational processes, with no requirement for documentation of individual participant consent. All students were informed of their right to not participate and offered an opt-out option; in some cases, individual institutions additionally required individual documentation of consent.

Informed Consent Statement

All students were informed of their right to not participate and offered an opt-out option.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

The authors would like to thank the faculty members who generously served as authors for the TMYN-Majors modules and who tested modules with students in their classrooms, including helping to gather the raw data used in this study, as well as the TMYN-Majors Advisory Committee, which gave invaluable input at many stages of the project. We also appreciate the students who took the time to respond to the surveys.

Conflicts of Interest

Author Michael Coe was employed by the company Cedar Lake Research Group LLC. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
GeoMSES: Geoscience Mathematics Self-Efficacy Scale
STEM: Science, Technology, Engineering, and Mathematics
TMYN-Intro: The Math You Need—Introductory Geoscience
TMYN-Majors: The Math Your Earth Science Majors Need

Appendix A

Characteristics of institutions and students involved in this study.
Table A1. Institution information of module authors/testers.
Terminal Degree | Number of Institutions
Associates | 1
Bachelors | 10
Masters | 4
PhD | 8
Note: Institutions included four Hispanic-serving institutions, two historically Black colleges and universities, and one Alaska Native-serving institution and were in 18 U.S. states: Alaska, California, Colorado, Florida, Georgia, Maine, Maryland, Massachusetts, Montana, Nebraska, New Mexico, New York, North Carolina, Pennsylvania, Texas, Utah, Virginia, and Wisconsin. A total of 23 instructors at 23 institutions served as module authors, but usable paired student self-efficacy data were obtained from 20 institutions.
Table A2. Student demographic information.
Demographics | Number | Percent
Freshman | 47 | 13%
Sophomore | 80 | 23%
Junior | 107 | 31%
Senior | 115 | 33%
Female | 176 | 51%
Male | 141 | 41%
“Other” or “Prefer not to answer” | 30 | 8%
1st generation college * | 108 | 31%
Non-1st generation college | 230 | 66%
“I’m not sure” or “Prefer not to answer” | 11 | 3%
Note: N = 351, students who completed both pre- and post-survey self-efficacy questions. * Answered “No” to the question “Have any of your parents completed a 4-yr college degree?”.

References

1. Bandura, A. (1997). Self-efficacy: The exercise of control. Freeman.
2. Bandura, A. (2006). Guide for constructing self-efficacy scales. In Self-efficacy beliefs of adolescents (pp. 307–337). Information Age Publishing. Available online: https://www.emerald.com/books/edited-volume/18068/chapter-abstract/101339715/Guide-for-Constructing-Self-Efficacy-Scales (accessed on 1 May 2023).
3. Campbell, K., Overeem, I., & Berlin, M. (2013). Taking it to the streets: The case for modeling in the geosciences undergraduate curriculum. Computers & Geosciences, 53, 123–128.
4. Charalambous, M., Hodge, J. A., & Ippolito, K. (2021). Statistically significant learning experiences: Towards building self-efficacy of undergraduate statistics learners through team-based learning. Educational Action Research, 29(2), 226–244.
5. Fairchild, E., Sexton, J., Newman, H., Hinerman, K., McKay, J., & Riggs, E. (2023). A qualitative study of marginalized students’ academic, physical, and social self-efficacy in a multiweek geoscience field program. Journal of Geoscience Education, 72(2), 146–158.
6. Finney, S. J., & Schraw, G. (2003). Self-efficacy beliefs in college statistics courses. Contemporary Educational Psychology, 28(2), 161–186.
7. Headley, R. M. (2023). An intervention to address math anxiety in the geosciences. Journal of Geoscience Education, 71(1), 33–42.
8. Honicke, T., & Broadbent, J. (2016). The influence of academic self-efficacy on academic performance: A systematic review. Educational Research Review, 17, 63–84.
9. James, N. M., Kreager, B. Z., & LaDue, N. D. (2021). Predict-observe-explain activities preserve introductory geology students’ self-efficacy. Journal of Geoscience Education, 70(2), 238–249.
10. Jameson, M. M., Sexton, J., London, D., & Wenner, J. M. (2024). Relationships and gender differences in math anxiety, math self-efficacy, geoscience self-efficacy, and geoscience interest in introductory geoscience students. Education Sciences, 14(4), 426.
11. Klyce, A., & Ryker, K. (2024). Evaluating the effectiveness of spatial training for introductory geology students. Geosphere, 20(2), 350–366.
12. Kortz, K. M., Cardace, D., & Savage, B. (2019). Affective factors during field research that influence intention to persist in the geosciences. Journal of Geoscience Education, 68(2), 133–151.
13. Lent, R. W., Brown, S. D., & Hackett, G. (1994). Toward a unifying social cognitive theory of career and academic interest, choice, and performance. Journal of Vocational Behavior, 45(1), 79–122.
14. Liemohn, M. W. (2024). Data analysis for the geosciences: Essentials of uncertainty, comparison, and visualization. Wiley Blackwell.
15. Lofton, M. E., Moore, T. N., Woelmer, W. M., Thomas, R. Q., & Carey, C. C. (2025). A modular curriculum to teach undergraduates ecological forecasting improves student and instructor confidence in their data science skills. BioScience, 75(2), 127–138.
16. Loudin, M. (2004). Where in the world will we find our future geoscientists?: One employer’s perspective. Eos, Transactions American Geophysical Union, 85, ED31B-0748.
17. Macdonald, R. H., Srogi, L., & Stracher, G. B. (2000). Building quantitative skills of students in geoscience courses. Journal of Geoscience Education, 48(4), 409–412.
18. Manduca, C. A., Baer, E., Hancock, G., Macdonald, R. H., Patterson, S., Savina, M., & Wenner, J. (2008). Making undergraduate geoscience quantitative. Eos, Transactions American Geophysical Union, 89(16), 149–150.
19. Math You Need—Majors. (n.d.). Math you need—Majors. Available online: https://serc.carleton.edu/mathyouneed/geomajors/index.html (accessed on 1 September 2025).
20. McFadden, R. R., Viskupic, K., & Egger, A. E. (2021). Faculty self-reported use of quantitative and data analysis skills in undergraduate geoscience courses. Journal of Geoscience Education, 69(4), 373–386.
21. Mega, C., Ronconi, L., & De Beni, R. (2014). What makes a good student? How emotions, self-regulated learning, and motivation contribute to academic achievement. Journal of Educational Psychology, 106(1), 121–131.
22. Mosher, S., Harrison, W., Huntoon, J., Keane, C. M., McConnell, D., Miller, K., Ryan, J., Summa, L., Villalobos, J., & White, L. (2021). Vision and change: The future of undergraduate geoscience education (p. 176). American Geological Institute. Available online: https://www.americangeosciences.org/change/ (accessed on 1 May 2022).
23. Moss, E., Cervato, C., Genschel, U., Ihrig, L., & Ogilvie, C. A. (2018). Authentic research in an introductory geology laboratory and student reflections: Impact on nature of science understanding and science self-efficacy. Journal of Geoscience Education, 66(2), 131–146.
24. Nielsen, I. L., & Moore, K. A. (2003). Psychometric data on the mathematics self-efficacy scale. Educational and Psychological Measurement, 63(1), 128–138.
25. O’Leary, E. S., Sayson, H. W., Shapiro, C., Garfinkel, A., Conley, W. J., Levis-Fitzgerald, M., Eagan, M. K., & Van Valkenburgh, B. (2021). Reimagining the introductory math curriculum for life sciences students. CBE—Life Sciences Education, 20(4), ar62.
26. Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66, 543–578.
27. Pajares, F. (2003). Self-efficacy beliefs, motivation, and achievement in writing: A review of the literature. Reading & Writing Quarterly, 19(2), 139–158.
28. Palestro, J. J., & Jameson, M. M. (2020). Math self-efficacy, not emotional self-efficacy, mediates the math anxiety-performance relationship in undergraduate students. Cognition, Brain, Behavior. An Interdisciplinary Journal, 24(4), 379–394.
29. Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138(2), 353–387.
30. Robbins, S. B., Lauver, K., Le, H., Davis, D., Langley, R., & Carlstrom, A. (2004). Do psychosocial and study skill factors predict college outcomes? A meta-analysis. Psychological Bulletin, 130(2), 261–288.
31. Sexton, J., London, D., Jameson, M. M., & Wenner, J. M. (2022). Thriving, persisting, or agonizing: Integrated math anxiety experiences of university students in introductory geoscience classes. Education Sciences, 12, 577.
32. Skaalvik, E. M., Federici, R. A., & Klassen, R. M. (2015). Mathematics achievement and self-efficacy: Relations with motivation for mathematics. International Journal of Educational Research, 72, 129–136.
33. Soule, D., Darner, R., O’Reilly, C. M., Bader, N. E., Meixner, T., Gibson, C. A., & McDuff, R. E. (2018). EDDIE modules are effective learning tools for developing quantitative literacy and seismological understanding. Journal of Geoscience Education, 66(2), 97–108.
34. Teasdale, R., Selkin, P., & Goodell, L. (2018). Evaluation of student learning, self-efficacy, and perception of the value of geologic monitoring from Living on the Edge, an InTeGrate curriculum module. Journal of Geoscience Education, 66(3), 186–204.
35. Viskupic, K., Egger, A. E., McFadden, R. R., & Schmitz, M. D. (2021). Comparing desired workforce skills and reported teaching practices to model students’ experiences in undergraduate geoscience programs. Journal of Geoscience Education, 69(1), 27–42.
36. Wenner, J. M., & Baer, E. (2015). The Math You Need, When You Need It (TMYN): Leveling the playing field. Numeracy: Advancing Education in Quantitative Literacy, 8(2), 5.
Figure 1. GeoMSES score distributions at (a) pre- and (b) post-survey.
Table 1. TMYN-Majors module topics.
Module
Calculating Uncertainty
Correlation
Exponential Equations
Histograms
Introductory Statistics
Linear Regression
Logarithms
Log-Log Plots
Orders of Magnitude
Probability
Scientific Notation
Solving Math Word Problems
Ternary Diagrams
Vectors
Table 2. GeoMSES distribution characteristics at pre- and post-survey.
Statistic | Pre-Survey | Post-Survey
N | 351 | 351
Minimum | 0.33 | 1.5
Maximum | 9 | 9
Mean (SD) | 5.79 (1.48) | 6.71 (1.36)
Skewness (SE) | −0.40 (0.13) | −0.62 (0.13)
Kurtosis (SE) | 0.31 (0.26) | 0.29 (0.26)
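The standard errors reported in Table 2 are consistent with the common large-sample approximations SE(skewness) = sqrt(6/N) and SE(kurtosis) = sqrt(24/N). Assuming those are the formulas used (an assumption on our part; software packages sometimes use slightly different exact formulas), a quick check with N = 351:

```python
import math

N = 351  # students with paired pre- and post-surveys
se_skew = math.sqrt(6 / N)    # approximate standard error of skewness
se_kurt = math.sqrt(24 / N)   # approximate standard error of kurtosis
print(round(se_skew, 2), round(se_kurt, 2))  # 0.13 0.26
```

Both values match the SEs shown in the table to two decimal places.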
Table 3. GeoMSES item characteristics and pattern matrix loadings at pre- and post-survey.
Item Topic | Pre-Survey Mean (SD) | Pre Skewness | Post-Survey Mean (SD) | Post Skewness | Pre-Survey Factor Loadings | Post-Survey Factor Loadings
An algebra problem | 7.52 (1.73) | −1.48 | 7.85 (1.40) | −1.26 | 0.789 | 0.828
Work with decimals | 7.50 (1.69) | −1.21 | 7.88 (1.49) | −1.42 | 0.836 | 0.866
Calculate the values of area and volume | 7.37 (1.81) | −1.38 | 7.80 (1.56) | −1.40 | 0.751 | 0.735
Write numbers in scientific notation | 7.02 (2.23) | −1.31 | 7.70 (1.77) | −1.71 | 0.501 | 0.500
Work with fractions | 6.95 (1.93) | −0.79 | 7.45 (1.66) | −1.11 | 0.693 | 0.646
Determine the degrees of a missing angle or the length of a missing side of a triangle | 6.75 (2.13) | −1.07 | 7.11 (1.97) | −1.15 | 0.643 | 0.602
Convert between different units | 6.73 (2.26) | −1.08 | 7.35 (1.91) | −1.28 | 0.675 | 0.671
Use formulas in Excel or Sheets | 6.03 (2.57) | −0.71 | 7.23 (2.08) | −1.45 | 0.497 | 0.646
Work with exponential equations | 6.03 (2.23) | −0.66 | 6.85 (1.90) | −1.01 | 0.496, 0.317 | 0.319
Create a histogram | 5.56 (2.73) | −0.50 | 7.11 (2.01) | −1.03 | 0.726 | 0.451
A problem in trigonometry | 5.54 (2.40) | −0.50 | 6.32 (2.23) | −0.80 | 0.398, 0.332 | 0.361, 0.346
Solve simultaneous equations | 5.39 (2.24) | −0.29 | 6.01 (2.10) | −0.59 | 0.330, 0.428 | 0.452
Calculate a standard deviation from a set of numbers | 5.17 (2.61) | −0.25 | 6.32 (2.28) | −0.68 | 0.467 | 0.317, 0.528
Solve a differential equation | 4.54 (2.59) | −0.14 | 5.32 (2.46) | −0.41 | 0.765 | 0.809
Find the components of a vector | 4.10 (2.51) | 0.13 | 5.17 (2.54) | −0.31 | 0.581 | 0.571
Read a ternary diagram | 4.06 (2.98) | 0.14 | 5.95 (2.26) | −0.51 | 0.467 | 0.509
Calculate a linear regression | 4.06 (2.58) | 0.17 | 6.10 (2.44) | −0.63 | 0.551, 0.464 | 0.555
Determine an integral | 3.97 (2.64) | 0.21 | 5.41 (2.41) | −0.34 | 0.685 | 0.796
Note. N = 349 to 351. For ease of viewing, items are arranged in order of descending pre-survey self-efficacy mean responses and factor loadings lower than 0.30 have been suppressed. Differences between pre- and post-survey means were significant at p < 0.001. Students were asked to respond to each of the stems using a 10-level response scale, with zero anchored as “Not at all confident” and 9 anchored as “Extremely confident.” Survey instructions were “The following questions ask you to estimate your own mathematics ability. On a scale of 0 (Not at all confident) to 9 (extremely confident), how confident are you that you can perform each of the following math tasks?”.
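The note above reports that pre/post mean differences were significant at p < 0.001, which reflects paired comparisons. As a sketch of how such a test works, the code below computes a paired-samples t statistic on simulated scores; the distribution parameters are illustrative stand-ins, not the study data.

```python
import numpy as np

rng = np.random.default_rng(0)
pre = np.clip(rng.normal(5.79, 1.48, 351), 0, 9)       # simulated pre-survey scores
post = np.clip(pre + rng.normal(0.9, 1.0, 351), 0, 9)  # simulated individual gains

d = post - pre                                          # within-student differences
t = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))        # paired-samples t statistic
print(f"mean gain = {d.mean():.2f}, t({len(d) - 1}) = {t:.1f}")
```

With 350 degrees of freedom, a |t| above roughly 3.3 corresponds to a two-tailed p below 0.001.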
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
