Investigating Students’ Perception with an Online Dynamic Earth Course during COVID-19: A Quantitative Inquiry

Abstract: This study investigated Earth science students' experiences with online education during the COVID-19 pandemic at the University of Tennessee, Knoxville, in the US. We used an existing survey from the online education literature, the Online Learning Environment Survey (OLES), which consists of three instruments: (a) community of inquiry (CoI), (b) Institutional Support (IS), and (c) the Self-Directed Online Learning Scale (SDOLS).


Introduction
One of the key attributes of an online learning program is bi-directional communication between learners and instructors, which is primarily dependent on the delivery method and overall accessibility [1][2][3]. The online format means that course-related activities (e.g., lecturing by the instructor) may not require either a physical classroom or a designated class meeting time (i.e., asynchronous). While the transition to online education has been witnessed in many disciplines such as physics, chemistry, and mathematics [4,5], the online delivery of Earth science courses is still not well established, except for several online programs at a small number of universities (e.g., the University of Florida, Texas A&M University, Western Governors University, Ohio University).
Asynchronous online courses feature the flexibility to access teaching and learning without the simultaneous presence of students and instructors in a physical classroom. This course delivery mode can be very effective during adverse conditions, such as the recent COVID-19 global pandemic. Although online course delivery is known to be effective, when it comes to geoscience courses, the lack of an adequate field support system could pose challenges to the successful delivery of such courses [6]. On the other hand, technological advances in the last decade have had a significant impact on online education [7,8], so it may be reasonable to anticipate that the online delivery of geoscience courses could also benefit from such advances, particularly introductory-level courses (i.e., physical geology and historical geology).
This study presents a novel analysis of students' feedback survey data from an online Dynamic Earth course developed before and through the COVID-19 pandemic. The objective of the study is to evaluate student responses to an online geoscience course using the OLES and to analyze the implications for further course development. The study serves two purposes: (1) to understand students' lived experiences regarding the online delivery of Earth science courses (e.g., course accessibility), and (2) to gain insights into the development of other online Earth science courses based on the findings from this study.
In this context, our goal is to address the following research questions (RQs): RQ1: How important did students perceive various community of inquiry factors to be in terms of contributing to their online learning in the introductory-level Earth science course? RQ2: How important did students perceive various self-directed learning factors to be in terms of contributing to their online learning in the introductory-level Earth science course?
The organization of the paper is as follows. First, we present a literature review explaining the context and relevance of the study. Then, we present the method, explaining the research site, survey instrument, data collection, and data analysis. The Methodology section is followed by a synthesis of the results, discussion, and conclusion.

Literature Review
Existing studies have examined and compared learning outcomes between online and traditional classroom-based onsite classes in various disciplines, including geology. Burger (2015) [9] described an effective online Earth science course with a distinct delivery method and directly compared student performance and experience, including perceived course difficulty, learning effectiveness, student engagement, satisfaction and retention, and student comfort with technology. The study also concluded that the rarity of online Earth science classes may be attributed to employers' and professional geologists' negative perceptions of the online delivery approach, which cannot provide in-person field experiences. Ni (2013) [10] suggested that the online teaching mode can be more successful and accessible than traditional classroom teaching, primarily because of the flexibility of course delivery and independent mode of instruction. Triantafyllou and Timcenko (2016) [11] reported successful online student learning in mathematics courses delivered through media technology, including screencasts, online readings, quizzes, and lecture notes.
A typical learning goal in geoscience education is to ensure that students develop three-dimensional perception and interpretation skills. To develop a strong conceptual foundation, it is important for the student to manipulate and reconcile 3D representations of structures that are not fully elucidated by 2D textbook figures. The field component of introductory geoscience courses, therefore, is imperative because it may assist students in making inferences and reconciling classroom learning [12][13][14]. This learning goal has been a long-standing limitation for online geoscience courses, as the virtual medium was long thought not to be a viable alternative to the field component. With recent advancements in computer graphics, the increasing availability of virtual field trips, and high-resolution 3D imagery, technology-enabled Earth science education is no longer hindered by those limitations.
User experience is a crucial metric of the success of an online class [15]. Several obstacles could exist that are not directly related to the subject matter but rather to the course delivery interface and the clarity of the course structure. Challenges may include (1) the demonstration of geologic concepts with examples without in-person interaction, (2) student background in technology, and (3) student adaptability to the online environment [16][17][18]. Importantly, different students in the same course may have vastly different levels of background and experience with technology, which could lead to varied responses [19].
Key geologic topics, including earthquakes, plate tectonics, groundwater movement, contaminants, and mountain-building processes, can be better perceived through models [20][21][22], animations, and simulations than through still photographs in textbooks, as these processes are dynamic and temporally complex. The modeling of such concepts is best performed using a digital medium. For example, there are several obstacles to comprehending the concept of plate tectonics. First, it requires a general factual basis of Earth's composition. Second, unobservable multiscale processes variable in time and space (e.g., at great depth in subduction zones over tens of millions of years) pose further obstacles to comprehension [23][24][25][26].
Apart from the demonstration of key geologic concepts, efficient course delivery on an online platform can be an added challenge. For a better understanding of the online learning environment in the field of geoscience, we applied an established tool comprising community of inquiry (CoI), Institutional Support (IS), and Self-Directed Learning (SDL). The CoI framework models online learning in terms of three elements: social presence, teaching presence, and cognitive presence. It has been reported that social interaction by itself may not be as effective for critical thinking as structured online learning [27]. Online learning experiences have the potential for the deepest levels of reflective thought and learning. The integration of cognitive, social, and teaching elements expands learning opportunities beyond simple social exchanges. The CoI applies a meaningful inquiry supported by teaching, social, and cognitive elements, which are essential for promoting deep and durable learning in online learning environments [28,29]. Students benefit from the cognitive processes that online instructors encourage through collaborative learning activities for a sustained reflective discourse [30]. On the other hand, SDL has been a primary theoretical construct in adult education, and research on it has evolved over time [31]. Prior work characterizes SDL as both a process and a personal attribute [32]. Application of the OLES in online geoscience education remains largely unexplored; the potential of the online geoscience mode can be assessed through user feedback on social presence, teaching presence, cognitive presence, and self-directed learning.

Methodology
The study was approved by the Institutional Review Board (IRB) of the University of Tennessee at Knoxville (UTK), where the research was conducted. With the IRB approval, UTK students enrolled in the Dynamic Earth Online classes in summer 2019 and 2020 and in fall 2020 electronically completed the Online Learning Environment Survey [33][34][35].
Dynamic Earth (GEOL 101, or physical geology) is a heavily enrolled multi-section introductory course that includes geology majors, minors (2-5%, internal data), and non-major students. Primarily motivated by the scarcity of entirely online geology courses at UTK, we developed Dynamic Earth Online. Initially delivered as a 5-week accelerated summer course, the course was later expanded to 15 weeks and delivered in both fall and spring semesters. The course enrollment primarily consists of non-STEM major students who seek to fulfill a natural science general education requirement.

Survey Instrument
Establishing an effective e-learning system is a complex process. The design of such a system often depends on various factors, including infrastructure, content and assessment, the quality of learner support systems, assumptions made by learners and educators about the learning experience itself, and peer support networks for learners and educators [37]. Obtaining students' feedback about the learning environment is an essential step in identifying areas where improvements could be made in the future. Based on this rationale, we applied the OLES instrument in this study.
The OLES consists of three separate instruments: (a) community of inquiry (CoI), (b) Institutional Support (IS), and (c) the Self-Directed Online Learning Scale (SDOLS). As outlined in Table 1, CoI is composed of three subscales representing three subdimensions: (a) teaching presence (TP), (b) social presence (SP), and (c) cognitive presence (CP) [38,39]. IS consists of just one item. The SDOLS has two subscales representing two subdimensions: (a) autonomous learning (AUL) and (b) asynchronous online learning (ASL) [40]. The authors slightly modified the survey instruments in the OLES to make the items more relevant to the current study. This modified survey consists of a set of 46 closed-ended items (Appendix A). All OLES items were measured on a five-point Likert scale: 1 for strongly disagree (SD), 2 for disagree (D), 3 for neutral (N), 4 for agree (A), and 5 for strongly agree (SA).
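As a concrete illustration of this coding scheme, the mapping from response labels to numeric scores can be written as a small lookup. This is an illustrative sketch only; the function and variable names below are our own and are not part of the OLES.

```python
# Five-point Likert coding used for all OLES items
# (labels and scores as described in the survey instrument section).
LIKERT_SCORES = {
    "SD": 1,  # strongly disagree
    "D": 2,   # disagree
    "N": 3,   # neutral
    "A": 4,   # agree
    "SA": 5,  # strongly agree
}

def score_responses(labels):
    """Convert a list of Likert response labels to numeric scores."""
    return [LIKERT_SCORES[label] for label in labels]
```

For example, `score_responses(["SA", "N", "D"])` yields the numeric scores used in the subsequent analyses.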

Data Collection
The participants of the survey were enrolled in the Dynamic Earth Online class over the course of multiple semesters. After the last day of grade submission, the students were invited to take the survey anonymously using a link provided by the instructor through the QuestionPro survey software (versions 2019 and 2020), which was available from the Office of Information Technology (OIT) at UTK. The students had one week to complete the survey.
In total, 56 participants completed the survey with responses to all three instruments. The results included the views of non-geoscience major students from general education, architecture, construction engineering, and business departments. We believe that including the feedback from non-geoscience major students might eliminate the bias that could have been introduced if the survey participants had been limited to geoscience major students only. At the same time, the feedback from non-geoscience majors may contribute to efforts to engage students from other disciplines so that they are more likely to pursue geoscience career opportunities.

Data Analysis
The Rasch model is based on the assumption that the most parsimonious and effective predictor of a characteristic is the relationship between an item's difficulty and a person's ability. Its underlying logic is that subjects have a higher probability of correctly answering (i.e., endorsing) easier items and a lower probability of correctly answering more difficult ones [41]. Rasch analysis has been successfully used in multiple disciplines, including the human sciences, health [42], mathematics [43], and education [44]. Recently, the field of geoscience has witnessed an increasing number of Rasch analysis applications [45].
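This logic is captured by the standard dichotomous Rasch model, in which the probability that person $n$ endorses item $i$ depends only on the difference between the person's ability $\theta_n$ and the item's difficulty $b_i$:

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
```

When ability equals difficulty, the endorsement probability is 0.5; it approaches 1 as ability increasingly exceeds difficulty, and approaches 0 in the opposite direction.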
In this study, we performed Rasch analysis based on the rating scale model (RSM) separately for each of the subscales of CoI and SDOLS using the Winsteps software (version 4.3.2). In particular, the Wright item-person map of each Rasch analysis addressed the perceptions of student respondents regarding various aspects of the online Earth science course (RQ1 and RQ2).
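Under the RSM, the items of a subscale share one set of category thresholds, and the probability of each rating category follows the Andrich formulation. The sketch below illustrates that computation; it is an illustrative reimplementation, not the Winsteps code used in the study, and the parameter values in the usage example are made up.

```python
import math

def rsm_category_probs(theta, b, taus):
    """Rating scale model (Andrich): probability of each response
    category k = 0..K for a person with ability `theta` on an item
    with difficulty `b`, given shared category thresholds `taus`
    (tau_1..tau_K). Values passed in are illustrative only."""
    # Cumulative sums of (theta - b - tau_j); the k = 0 term is 0 by convention.
    cum = 0.0
    exps = [0.0]
    for tau in taus:
        cum += theta - b - tau
        exps.append(cum)
    numers = [math.exp(e) for e in exps]
    denom = sum(numers)
    return [n / denom for n in numers]
```

The returned probabilities sum to one, and a higher ability shifts probability mass toward the higher rating categories.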
Before looking at the maps, each analysis began with an investigation of the psychometric properties of all subscales, each of which had a Cronbach's alpha statistic of 0.825 or higher. First, we examined each subscale to see whether the unidimensionality assumption of the Rasch analysis was satisfied, using the Principal Component Analysis of Standardized Rasch Residuals (PCASRR). Second, we investigated how well the items of each subscale fitted the rating scale model based on the infit and outfit mean square (MNSQ) statistics. Third, we examined the reliability and separation statistics of persons/students and items. Finally, we analyzed the Wright item-person map to estimate the item difficulty and student ability together on a logit scale. In each map, the mean of the item difficulty was arbitrarily set to zero. Items of greater difficulty were placed towards the top of the map, whereas those of lesser difficulty were at the bottom. Accordingly, students with greater ability were placed at the top of the map, whereas those with less ability were at the bottom. For each subscale, the map provided the distribution of the items on the right side and that of the students on the left side.
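For reference, the Cronbach's alpha statistic mentioned above can be computed directly from an item-by-respondent score matrix. The helper below is a minimal sketch for illustration; it is not the software used in the study.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of item score columns
    (one list per item, same respondents in the same order)."""
    k = len(item_scores)          # number of items
    n = len(item_scores[0])       # number of respondents

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(col) for col in item_scores)
    totals = [sum(col[i] for col in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))
```

When all items are perfectly consistent (identical score columns), the statistic equals 1.0, its maximum.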

Results
The data from all participants who responded to one or more of the five subscales and the IS scale are summarized using descriptive statistics in Figure 1, which consists of a panel of two subfigures. For each response category of each subscale and the IS scale, the subfigure on the upper level of the panel documents the total number/count of responses, whereas the subfigure on the lower level shows the corresponding proportion of responses. Figure 1 shows a clear dominance of participants in favor of online learning. Based on Figure 1a, for each subscale/scale, most of the students responded in the strongly agree or agree category. Specifically, as per Figure 1b, for the CoI subscales, around 60% of the participants rated items using the two highest categories; for the IS scale and SDOLS subscales, the proportions were as high as 80%.
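The per-category counts and proportions summarized in such a figure can be tabulated in a few lines. This is an illustrative sketch only; the response data in the example are invented, not the study's.

```python
from collections import Counter

def category_counts_and_props(responses, categories=("SD", "D", "N", "A", "SA")):
    """Tabulate the count and proportion of responses in each Likert
    category, in the fixed category order used for plotting."""
    counts = Counter(responses)
    n = len(responses)
    counts_row = [counts.get(c, 0) for c in categories]
    props_row = [c / n for c in counts_row]
    return counts_row, props_row
```

Applied to one subscale's pooled responses, the first row would feed a count chart (as in Figure 1a) and the second a proportion chart (as in Figure 1b).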

Rasch Modeling Analyses
As per the PCASRR analyses, for each subscale, the fundamental Rasch assumption of unidimensionality was satisfied from a practical perspective because a single, primary Rasch dimension existed that explained over 50% of the raw variance (Table 2). Regarding the quality of items, among the three subscales of CoI, there were only three items in the SP and CP subscales that had a score slightly above 1.50 on the infit or outfit MNSQ statistics (or both), thus supporting the quality of the vast majority of the CoI items as adequate [46]. For the SDOLS instrument, a similar conclusion was drawn about the quality of items, except for SDL_Q01, which had a relatively high outfit MNSQ statistic of 2.19. Next, all point-measure correlations in each subscale were positive and (moderately) strong (at least 0.63), indicating that there was a proper alignment between each item and the latent construct that the subscale was designed to measure [46]. Next, regarding the reliability and separation statistics of each subscale, the CoI instrument subscales mostly had satisfactory scores on person separation, person reliability, and item reliability (CP may be a little low on item reliability), but their item separation scores were all lower than the threshold of 3.00. In contrast, the SDOLS performed more weakly than CoI on these four statistics. A major underlying reason is that the two subscales of SDOLS had fewer items than the subscales of CoI [46].
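The infit and outfit MNSQ statistics referenced above compare squared residuals with their model variances: outfit averages the standardized squared residuals, while infit weights the squared residuals by the model variances. A minimal sketch, assuming observed scores, model-expected scores, and model variances for one item are already available (as produced by Rasch software); this helper is for illustration, not the study's computation.

```python
def infit_outfit(observed, expected, variances):
    """Infit and outfit mean-square statistics for a single item.
    `observed`, `expected`, and `variances` are per-respondent values;
    both statistics equal 1.0 when the data fit the model exactly."""
    sq_resid = [(x - e) ** 2 for x, e in zip(observed, expected)]
    # Outfit: unweighted mean of standardized squared residuals.
    outfit = sum(r / w for r, w in zip(sq_resid, variances)) / len(observed)
    # Infit: information-weighted ratio of residuals to variances.
    infit = sum(sq_resid) / sum(variances)
    return infit, outfit
```

Values well above 1.50, like the outfit of 2.19 flagged for SDL_Q01, indicate more noise in the responses than the model expects.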

Community of Inquiry
Teaching Presence: Based on the Wright map for the TP subscale of the CoI instrument in Figure 2, the students most easily endorsed Item 2, suggesting that the students easily agreed that the instructor clearly communicated important course goals. Next, Items 3, 4, 1, and 8 were the four items that students almost as easily endorsed, indicating that the students very much agreed that the instructor clearly communicated instructions on participation in course learning activities, related due dates and time frames, and important course topics, and that the instructor helped keep the students on task in a way that helped them to learn. At the mean difficulty level, there was a group of three items, Items 11, 13, and 9, with the same level of difficulty estimates. This indicates that, on average, the students shared the view that the instructor helped them to focus discussions on relevant issues in a way that helped them to learn, that the instructor provided timely feedback, and that the instructor encouraged them to explore new concepts in the course. Slightly above the mean difficulty level were Items 10 and 6, where the students had some reservations regarding the instructor's actions reinforcing the development of a sense of community among students and on the instructor being helpful in guiding the class towards understanding topics in a way that helped the students clarify their thinking. The students also found it relatively difficult to endorse Item 7 (the instructor helped keep the students engaged and participating in productive dialogues) and Item 5 (the instructor was helpful in identifying areas of agreement and disagreement on course topics that helped the students to learn). Finally, the hierarchy continued to advance upward, reaching the item that the students found most difficult to endorse, Item 12, indicating that the students hardly agreed that the instructor provided feedback that helped them understand their strengths and weaknesses.

Social Presence: Based on the Wright map for the SP subscale of the CoI instrument in Figure 3, the students felt comfortable interacting with other students (Item 19), participating in the course discussions (Item 18), and disagreeing with other students without compromising a sense of trust (Item 20). Next, Items 17 and 21, with the same level of endorsability, were slightly more difficult for the students to agree on. Despite that, since the two items were still below the mean level of difficulty, it was concluded that the students felt comfortable conversing through the online medium and felt that their points of view were acknowledged by other students in the class. Next, the hierarchy continued to move upward, and right above the mean difficulty level was Item 15, indicating that the students had some reservations about being able to form distinct impressions of some of the other students in the class. Even more difficult to endorse were Items 14 and 22.
From this, it is interpreted that the students tended to disagree that getting to know other students in the class gave them a sense of belonging in the course and that online discussions helped them to develop a sense of collaboration. Finally, the most difficult item to endorse was Item 16, suggesting the students hardly agreed that online communication was an excellent medium for social interaction.

Cognitive Presence: Based on the Wright map for the CP subscale of the CoI instrument in Figure 4, the items easiest for the students to endorse indicated that some course activities piqued their curiosity and that brainstorming, finding relevant information, and reflecting on the course content helped them understand fundamental concepts. Next, at the mean difficulty level were two items: Items 32 and 33. This indicates that, on average, the students could describe ways to test and apply the knowledge from the course and that they developed solutions to course problems that could be used in practice. Next, the hierarchy continued to advance upward to reach Items 26, 34, and 28. The interpretation is that, compared with the previous items, the students found it difficult to share the views that they used various information sources to explore problems/issues presented in the course, that they could apply the knowledge created in this course to their work or other non-class related activities, and that online discussions were valuable in helping them appreciate different perspectives. Finally, Items 23 and 25 were the two most difficult items, suggesting that students hardly agreed that the problems posed in this course increased their interest in course issues and that they felt motivated to explore content-related questions.


Self-Directed Online Learning Scale
Autonomous Learning: Based on the Wright map for the AUL subscale of the SDOLS instrument in Figure 5, the easiest item was Item 38, suggesting the students believed they were in control of their online learning. In contrast, the most difficult item on the subscale was Item 36, indicating that students hardly agreed that they were able to make such decisions about their online learning as selecting online project topics. Between these two items, there were four items, in ascending order of difficulty: Items 37, 40, 39, and 41. First, below the mean level of difficulty, the students had little difficulty agreeing that they worked online during times that they found convenient and that they approached online learning in their own way. Second, on average, the students tended to agree that they played an important role in their online learning. Third, they had reservations about being able to remain motivated even though the instructor was not always online.
Asynchronous Online Learning: Based on the Wright map for the ASL subscale of the SDOLS instrument in Figure 6, the easiest item was Item 42, indicating that the students were easily able to access the discussion forum at places convenient for them. Next, they were almost equally easily able to read posted messages at times that were convenient for them (Item 43). Next, Item 46 was slightly more difficult, but still below the mean level of difficulty, suggesting that the students usually took notes while watching a video on the computer. Finally, Items 44 and 45 were the two most difficult items on the subscale, which indicates that the students had difficulty in relating the content of online course materials to the information they read from the books and in understanding course-related information when it was presented in video format.

Finally, the literature recommends that the sample size should be at least six times the number of items for stable results in a factor analysis, of which Rasch analysis is a special type for categorical data [47][48][49]. This sample size of 56 participants (relative to the number of items in each of the five subscales) failed to meet this criterion in two instances: (a) 13 items in the TP subscale and (b) 12 items in the CP subscale. To further strengthen the results presented above, an additional Monte Carlo statistical simulation study was conducted based on Linacre (1994) [50] to see whether the relatively small sample size in each of the two subscales had any negative impact on the estimates of model parameters (i.e., difficulty/endorsability of item statements and person/student participant ability). Based on the parameter estimates from the collected/real data, for each of the two subscales, 100 datasets were simulated to have the same number of 56 participants and items (13 in TP and 12 in CP) as the collected data and were each analyzed under
the RSM to produce 100 sets of difficulty and ability parameter estimates. Next, for each subscale, the 100 estimates of each model parameter were aggregated across all 100 simulated datasets to arrive at the mean and median estimates of the parameter, both of which were then compared with the estimate from the collected data. As shown in Figure 7, an overlay of three line charts documenting the three estimates of each parameter and their comparisons was created, respectively, for the item difficulty and person ability parameters (TP and CP results are in the top and bottom panels of the figure, respectively). Evidently, for virtually all parameters in each subscale, the median and mean estimates from the simulated data were very close to each other and were also close to the estimate from the collected data, which further strengthens the results of the study.
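The data-generation step of such a Monte Carlo check can be sketched as follows: given ability, difficulty, and threshold estimates, polytomous responses are drawn from the RSM category probabilities. This is an illustrative sketch with placeholder parameter values, not the actual simulation code based on Linacre (1994).

```python
import math
import random

def simulate_rsm_responses(thetas, bs, taus, seed=0):
    """Simulate one dataset of polytomous responses under the rating
    scale model: one row per person (ability in `thetas`), one column
    per item (difficulty in `bs`), with shared thresholds `taus`."""
    rng = random.Random(seed)  # seeded for reproducibility
    data = []
    for theta in thetas:
        row = []
        for b in bs:
            # RSM category terms; the k = 0 term is 0 by convention.
            cum, terms = 0.0, [0.0]
            for tau in taus:
                cum += theta - b - tau
                terms.append(cum)
            weights = [math.exp(t) for t in terms]
            # Draw one category proportional to its probability.
            u, acc = rng.random() * sum(weights), 0.0
            for k, w in enumerate(weights):
                acc += w
                if u <= acc:
                    break
            row.append(k + 1)  # report categories as 1..5, like Likert scores
        data.append(row)
    return data
```

Repeating this 100 times with the study's dimensions (56 persons, 13 TP or 12 CP items) and re-estimating each dataset yields the distribution of parameter estimates summarized in Figure 7.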

Discussion
A long-standing debate in geoscience education is the question of how we replace or recreate the field component with virtual course delivery. It may be true that field experience cannot be replaced entirely with a virtual or online modality. However, under the curriculum of certain introductory courses, rigorous fieldwork may not be required to cover the concepts and laboratory studies effectively and efficiently to meet learning goals. Such introductory Earth science courses may include Physical Geology and Historical Geology, for which logistics and materials, including rock, mineral, and fossil samples, could be arranged [7,51].
This study was conducted to investigate students' perceptions of the delivery of online Earth science courses while overcoming the challenge of such courses being field-based. We studied the outcomes of converting the Dynamic Earth (aka Physical Geology) in-person course to its equivalent online mode by evaluating students' feedback. Our study assessed student experiences of the online class using three survey instruments. Two research questions were proposed and addressed in the study.


Addressing RQ1 and RQ2
Regarding RQ1, the results of the Rasch analysis demonstrate the students' perceptions of their community of inquiry experience in this introductory geoscience course. On the TP subscale, while the students easily endorsed that the instructor clearly communicated important course goals, they found it harder to agree that the instructor provided feedback that helped them understand their strengths and weaknesses. On the SP subscale, the students indicated that they felt comfortable interacting with other students, participating in the course discussions, and disagreeing with other students without compromising a sense of trust. At the same time, they found it challenging to endorse that online communication was an excellent medium for social interaction. On the CP subscale, the students easily agreed that some course activities piqued their curiosity, that brainstorming and finding relevant information helped them resolve content-related questions, that combining new information from various sources helped them answer questions from the course activities, that learning activities in the course helped them construct explanations/solutions, and that reflecting on the course content and discussions helped them understand fundamental concepts. However, they found it harder to agree that the problems posed in the course increased their interest in course issues and that they felt motivated to explore content-related questions. Finally, there were items measuring other aspects of community of inquiry whose level of difficulty (i.e., endorsability) fell in between the most endorsable and the least endorsable items outlined above.
Regarding RQ2, the results of the Rasch analysis demonstrated the students' perceptions of their self-directed learning experience in this introductory-level geoscience course. On the AUL subscale, while they had no difficulty endorsing that they were in control of their online learning, they found it difficult to agree that they were able to make decisions about their online learning, such as selecting online project topics.
On the ASL subscale, the students easily endorsed that they could access the discussion forum at places convenient for them, but found it challenging to relate the content of the online course materials to the information they read in books and to understand course-related information presented in video format. Finally, there were items measuring other aspects of self-directed learning whose level of difficulty (i.e., endorsability) fell in between the most endorsable and the least endorsable items outlined above.
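The "difficulty (i.e., endorsability)" hierarchy discussed above follows from the polytomous Rasch model for Likert items. As a sketch, assuming Andrich's rating scale model (a common choice when all items share the same five response categories), the probability that person $n$ responds in category $k$ of item $i$ is:

```latex
P(X_{ni} = k) =
\frac{\exp\!\left(\sum_{j=0}^{k}\bigl(\theta_n - (\delta_i + \tau_j)\bigr)\right)}
     {\sum_{m=0}^{M}\exp\!\left(\sum_{j=0}^{m}\bigl(\theta_n - (\delta_i + \tau_j)\bigr)\right)},
\qquad \tau_0 \equiv 0,
```

where $\theta_n$ is the person measure, $\delta_i$ is the item difficulty (a harder-to-endorse item has a larger $\delta_i$), and $\tau_j$ are the category thresholds shared across items. A Wright map places $\theta_n$ and $\delta_i$ on this common logit scale, which is what allows persons and items to be compared directly.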
Our Rasch analysis results suggest a strong positive association between the students' experiences and the course objectives. Overall, the Rasch analysis shows that the students had little difficulty agreeing with most of the survey questions. In contrast, the students found it more difficult to agree with items concerning communication and presence. These include getting to know their peers in the class well, the limited presence of the instructor, participation in course discussions, and project work. These are common challenges for asynchronous online courses, where the physical presence of class participants and instructors may not be mandatory [52,53]. Because of this spatiotemporal flexibility, the asynchronous online mode offers the range and freedom to complete the course without being present in a physical classroom at a particular class time. We believe that the students' responses regarding the lack of sufficient peer interaction could be due to their unfamiliarity with this learning style, which could be improved through professional development and more exposure to online learning. Finally, the students agreed that the support and facilities were adequate for the online course, although they wished to have more detailed feedback on their homework submissions.

Implications for Online Earth Science Programs
Our study suggests an overall interest in asynchronous online programs in the Earth science discipline. Although the survey was administered to students enrolled in one course (introductory physical geology), this viewpoint might extend to other Earth science courses as well. At the same time, the more critical ratings on the CoI subscales point to areas for improvement. The user interface could also be made smoother, which is critical for online education. In addition, owing to diverse student backgrounds (e.g., traditional vs. non-traditional), students may not possess the same level of skill with the technology required for the various tasks involved in online course structures.
The various factors of CoI are highly useful in explaining the effectiveness of online learning [27]. Our results related to the CoI factors highlight a significant influence of all three presences (social, cognitive, and teaching) in online geoscience learning. One of the main factors that may have driven the positive feedback in this study is that students were able to express themselves without peer influence and/or confrontation. At the same time, there may be some limitations in initiating enough interaction, which would improve with familiarity with the mode, prior experience, and practice. The online learning platform is well known for its flexibility in accommodating learners' busy schedules, as it offers individuals the opportunity to take charge of their own learning. SDL is a crucial attribute for a learner's adjustment and success in online learning [54][55][56]. Interest, curiosity, and a desire for self-improvement were among the most important factors reflected in the student feedback in this study, where students could use various devices and places to learn and meet their self-directed learning requirements at their own pace [57]. Learners therefore show an increasing interest in online learning, which, in this case, was in geoscience.
Having an online Earth science program used to be a rare occurrence because of its nature as a field-based science. However, with technological advancements, laboratory materials and virtual or self-guided fieldwork are available through different service providers and publishers for introductory geoscience classes [8]. Compared with traditional, face-to-face course delivery, the online option helps to reach more diverse groups of students who may not be available for in-person classes [58] and, thus, contributes to student enrollment. Consequently, online course delivery has become an important revenue generator for colleges and universities [59] and an integral component of their long-term strategies [60].

Limitations and Future Directions
It is important to outline the authors' plans for potential future expansion of the study. Although the results show good agreement between the online delivery of a geoscience course and the user experience feedback, a wider demographic representation would provide a more universal reflection of the outcome. Our results are based on data reflecting the responses of students from one introductory-level course at one institution. The authors plan to address these two areas in a future extension of this study in two steps. First, we plan to conduct the survey over a longer period and with a broader demographic representation to obtain a larger sample size. Then, we will include other introductory core courses in the study to assess the potential for developing a comprehensive online Earth science program.

Conclusions
The current study investigated the level of endorsability of certain aspects of online learning environments and resources through Rasch modeling, found substantial evidence of the demand for online Earth science courses, and presented supportive evidence to promote online Earth science programs. Our results showed that, with advanced technology and lab services, introductory Earth science classes could be readily transformed into their online equivalents while also overcoming the challenge of geoscience education usually being field based. To implement the online delivery of introductory courses effectively and efficiently, both instructors and students need to be trained through appropriate professional development sessions, and a trial-and-error approach may need to be taken. In our analysis, fewer items were marked as difficult to endorse than as easy. These difficult items may reflect subject-specific challenges, in this case those of geoscience. The course can be updated with continued studies in the field as needed, along with modifications to the instrument over time. In the end, the study may also serve to promote similar research using surveys designed to measure geoscience students' feedback regarding the online delivery of introductory Earth science courses.

Table A1 .
Community of inquiry (Teaching Presence).
5. The instructor was helpful in identifying areas of agreement and disagreement on course topics that helped me to learn.
6. The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
7. The instructor helped to keep course participants engaged and participating in productive dialogue.
8. The instructor helped keep the course participants on task in a way that helped me to learn.
9. The instructor encouraged course participants to explore new concepts in this course.
10. Instructor actions reinforced the development of a sense of community among course participants.
11. The instructor helped to focus discussion on relevant issues in a way that helped me to learn.
12. The instructor provided feedback that helped me understand my strengths and weaknesses.
13. The instructor provided feedback in a timely fashion.

Figure 1. Survey scale comparison: (a) comparison of survey scale/subscale response counts, and (b) comparison of survey scale/subscale response proportions. Dark blue, yellow, gray, orange, and light blue represent the respective Likert categories from 5 to 1 in descending order. Notice that panel (b) shows more responses representing strongly agree (5) and agree (4).
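The tallies behind panels (a) and (b) are simple to reproduce. The sketch below uses hypothetical Likert responses for two subscales (the real values would come from the survey export) and computes the category counts and proportions that the stacked bars display:

```python
from collections import Counter

# Hypothetical Likert responses (5 = strongly agree ... 1 = strongly disagree),
# keyed by survey subscale; real data would come from the survey export.
responses = {
    "TP": [5, 5, 4, 4, 4, 3, 2, 5, 4, 4],
    "SP": [4, 4, 3, 3, 5, 2, 1, 4, 4, 3],
}

def counts_and_proportions(resp):
    """Tally category counts (panel a) and proportions (panel b) per subscale."""
    out = {}
    for scale, values in resp.items():
        c = Counter(values)
        counts = {k: c.get(k, 0) for k in range(5, 0, -1)}  # categories 5 down to 1
        total = sum(counts.values())
        props = {k: counts[k] / total for k in counts}
        out[scale] = (counts, props)
    return out

result = counts_and_proportions(responses)
```

Plotting the proportions rather than the raw counts (panel b) makes subscales with different item counts directly comparable, which is why the agree-heavy pattern is easier to see there.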


Figure 2. Wright map for the Teaching Presence (TP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A1) and the right side shows the item measures of the TP subscale. Marker X on the left side of the figure represents an individual participant. The letters positioned on the vertical axis indicate the mean (M) of person ability (left side) or item difficulty (right side), one standard deviation (S), and two standard deviations (T) from the respective mean.
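The layout these captions describe, persons and items on one shared logit ruler, is straightforward to reproduce. A minimal text-mode sketch, with made-up person measures and hypothetical item names, might look like this:

```python
import statistics

def wright_map(persons, items, lo=-3.0, hi=3.0, step=0.5):
    """Text Wright map: person measures plotted as X on the left, item
    difficulties named on the right, both on one shared logit scale.
    'M' marks the bin containing the person mean, echoing Winsteps-style output."""
    pm = statistics.mean(persons)
    nbins = int(round((hi - lo) / step))
    lines = []
    for i in range(nbins - 1, -1, -1):        # print the top of the scale first
        left, right = lo + i * step, lo + (i + 1) * step
        xs = "X" * sum(left <= p < right for p in persons)
        names = " ".join(n for n, d in items if left <= d < right)
        mark = "M" if left <= pm < right else "|"
        lines.append(f"{left:5.1f} {xs:<10}{mark} {names}")
    return lines

# Hypothetical measures; real values would come from the Rasch estimates.
persons = [-0.5, 0.0, 0.2, 0.4, 0.8, 1.1, 1.5]
items = [("Item1", -1.1), ("Item19", -0.6), ("Item16", 0.9), ("Item12", 1.2)]
for line in wright_map(persons, items):
    print(line)
```

Because both columns share one scale, an item printed above most of the X markers is one that most respondents found hard to endorse, which is exactly how the TP, SP, and CP maps are read in the text.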



Figure 3. Wright map for the Social Presence (SP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A2) and the right side shows the item measures of the SP subscale. Marker X on the left side of the figure represents an individual participant. The letters positioned on the vertical axis indicate the mean (M) of person ability (left side) or item difficulty (right side), one standard deviation (S), and two standard deviations (T) from the respective mean.

Cognitive Presence: Based on the Wright map for the CP subscale of the CoI instrument in Figure 4, the map begins with a group of five items with an identical level of endorsability, which the students found easy to agree on: (1) Item 24 (Some course activities piqued my curiosity), (2) Item 27 (Brainstorming and finding relevant information helped me resolve content-related questions), (3) Item 29 (Combining new information from a range of sources helped me answer questions raised in course activities), (4) Item 30 (Learning activities in this course helped me construct explanations/solutions), and (5) Item 31 (Reflecting on course content and discussions helped me understand fundamental concepts in this class). Next, at the mean difficulty level were two items: Items 32 and 33. This indicates that, on average, the students could describe ways to test and apply the knowledge created in the course and that they developed solutions to course problems that could be used in practice. Next, the hierarchy continued to advance upward to reach Items 26, 34, and 28. The interpretation is that, compared with the previous items, the students found it more difficult to share the views that they used a variety of information sources to explore problems/issues presented in the course, that they could apply the knowledge created in the course to their work or other non-class-related activities, and that online discussions were valuable in helping them appreciate different perspectives.

Figure 4. Wright map for the Cognitive Presence (CP) subscale of the CoI instrument. The left side of the map shows the person measures (Table A3) and the right side shows the item measures of the CP subscale. Marker X on the left side of the figure represents an individual participant. The letters positioned on the vertical axis indicate the mean (M) of person ability (left side) or item difficulty (right side), one standard deviation (S), and two standard deviations (T) from the respective mean.


Figure 5. Wright map for the Autonomous Learning (AUL) subscale of the SDOLS instrument. The left side of the map shows the person measures (Table A5) and the right side shows the item measures of the AUL subscale. Marker X on the left side of the figure represents an individual participant. The letters positioned on the vertical axis indicate the mean (M) of person ability (left side) or item difficulty (right side), one standard deviation (S), and two standard deviations (T) from the respective mean.


Figure 6. Wright map for the Asynchronous Online Learning (ASL) subscale of the SDOLS instrument. The left side of the map shows the person measures (Table A6) and the right side shows the item measures of the ASL subscale. Marker # on the left side of the figure represents multiple participants. The letters positioned on the vertical axis indicate the mean (M) of person ability (left side) or item difficulty (right side), one standard deviation (S), and two standard deviations (T) from the respective mean.


Figure 7. Monte Carlo statistical simulation for the TP and CP subscales: (a) comparisons of TP subscale item estimates; (b) comparisons of TP subscale person estimates; (c) comparisons of CP subscale item estimates; and (d) comparisons of CP subscale person estimates. Red, blue, and black lines represent the real data, the simulated mean, and the simulated median, respectively.


Table 1 .
Scales of the Online Learning Environment Survey.

Table 2 .
Rasch Analysis of the Survey Responses.

Table A2 .
Community of inquiry (Social Presence).
14. Getting to know other course participants gave me a sense of belonging in the course.
15. I was able to form distinct impressions of some course participants.
16. Online or web-based communication is an excellent medium for social interaction.
17. I felt comfortable conversing through the online medium.
18. I felt comfortable participating in the course discussions.
19. I felt comfortable interacting with other course participants.
20. I felt comfortable disagreeing with other course participants while still maintaining a sense of trust.
21. I felt that my point of view was acknowledged by other course participants.
22. Online discussions helped me to develop a sense of collaboration.

Table A3 .
Community of inquiry (Cognitive Presence).
26. I utilized a variety of information sources to explore problems/issues presented in this course.
27. Brainstorming and finding relevant information helped me resolve content-related questions.
28. Online discussions were valuable in helping me appreciate different perspectives.
29. Combining new information from a range of sources helped me answer questions raised in course activities.
30. Learning activities in this course helped me construct explanations/solutions.
31. Reflecting on course content and discussions helped me understand fundamental concepts in this class.
32. I can describe ways to test and apply the knowledge created in this course.
33. I have developed solutions to course problems that can be applied in practice.
34. I can apply the knowledge created in this course to my work or other non-class related activities.

Table A5 .
Self-directed learning (autonomous learning).
I was able to make decisions about my online learning (e.g., selecting online project topics).
I was able to remain motivated even though the instructor was not online at all times.

Table A6 .
Self-directed learning (asynchronous online learning).
43. I was able to read posted messages at times that were convenient to me.
44. I was able to relate the content of online course materials to the information I have read in books.
45. I was able to understand course-related information when it was presented in video formats.
46. I was able to take notes while watching a video on the computer.