Education Sciences
  • Article
  • Open Access

23 February 2022

Teaching Field Data Crowdsourcing Using a GPS-Enabled Cellphone Application: Soil Erosion by Water as a Case Study

1 Department of Forestry and Environmental Conservation, Clemson University, Clemson, SC 29634, USA
2 University of Arkansas Agricultural Experiment Station, Arkansas Forest Resources Center, University of Arkansas at Monticello, Monticello, AR 71655, USA
3 Department of Plant and Environmental Sciences, Clemson University, Clemson, SC 29634, USA
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Emergent Technologies to Support Active Learning in Higher Education

Abstract

Crowdsourcing is an important tool for collecting spatio-temporal data, with various applications in education. The objectives of this study were to develop and test a laboratory exercise on soil erosion by water and field data crowdsourcing in an online introductory soil science course (FNR 2040: Soil Information Systems) at Clemson University. Students from different STEM disciplines (wildlife biology, forestry, and environmental and natural resources) participated in the study in the fall of 2021. They completed a sequence of self-contained digital teaching modules, or reusable learning objects (RLOs), which are often used in online learning. The exercise included a field exercise and a learning module to teach students about different types of water-based soil erosion as well as field data collection and crowdsourcing tools. The exercise effectively increased student familiarity with crowdsourcing, as shown by the post-assessment survey, with increases of 31.2 percentage points in the “moderately familiar” category and 28.3 percentage points in the “extremely familiar” category. The online quiz contained ten questions and was taken by 56 students, with an average score of 9.5 (out of 10). A post-assessment survey found that most students indicated that the laboratory was an effective learning experience about field data crowdsourcing using a GPS-enabled cellphone application. Detailed student comments indicated enjoyment of learning (e.g., data collection, learning about different technologies), the value of multimedia (e.g., ArcGIS Survey123, cellphone), the flexibility of learning (e.g., field work), the applicability of the content (e.g., actual field examples of erosion by water), and criticism (e.g., technical issues). A word cloud derived from students’ comments about their laboratory exercise experience showed the words students used most frequently, such as “erosion”, “enjoyed”, and “different”, among others.
Incorporating a learning module and field exercise using modern data collection technology into an undergraduate soil science education course enabled students to understand the value and methods for leveraging cellphone-based field collection methods to crowdsource data for environmental assessment. Practical recommendations for planning and executing future crowdsourcing exercises were developed using the current study as an example.

1. Introduction

The crowdsourcing of location-aware information is becoming an important data source in scientific inquiry. Crowdsourcing offers an efficient and inexpensive way to collect data (e.g., field observations). Exposing students to a crowdsourcing activity can teach them critical aspects of the process (e.g., GPS, cellphone data collection, data quality), thereby providing students with a comprehensive understanding of crowdsourcing.
Crowdsourcing can be difficult to define because it can encompass many practices []. Estellés-Arolas and González-Ladrón-De-Guevara (2012) [] conducted an extensive and critical literature analysis, which resulted in an integrated definition of crowdsourcing as collaborative work to complete tasks defined by crowd characteristics, crowdsourcing initiator, and crowdsourcing process [].
The crowdsourcing concept has been applied to a broad range of collaborative activities and has various definitions [] that can make it challenging to teach in higher education. Understanding both the components and definition of crowdsourcing is essential to allow educators to teach relevant aspects of crowdsourcing as it becomes an ever more important way to collect data and ideas from the public. An existing framework to define crowdsourcing [] has been enhanced (Table 1), and will be discussed based on its components with examples from peer-reviewed articles.
Table 1. Definition and characteristics of crowdsourcing (based on Estellés-Arolas and González-Ladrón-De-Guevara, 2012 []).
  • Crowd characteristics:
1. Member composition: Type and skill of participants.
In the educational setting, students are almost universally used as the “crowd”. Their types (e.g., academic standing, subject matter, etc.) and skills (e.g., prior knowledge of crowdsourcing) can be determined prior to the crowdsourcing event by conducting a survey using freely available web-based tools (e.g., Google Forms). Differences in the skills and abilities of students have been used both to support other students through peer reviews [] and to develop course content [].
2. Member tasks: Complexity and difficulty.
Educational subject matter, topics, and learning objectives dictate member tasks []. Tasks can vary in complexity and difficulty and can involve assigning tasks on existing crowdsourcing platforms []. Task routing and assignment in crowdsourcing can be based on cognitive abilities [].
3. Member participation rewards: Types of compensation.
In the educational setting, the types of compensation often differ from those of commercial crowdsourcing platforms []. In educational exercises that are part of a course, the compensation is most often related to grades [,] or reputational rewards []. Other types of compensation may include enjoyment, the acquisition of technical skills to list on a professional resume, authorship on a peer-reviewed article resulting from crowdsourced data collection, and many others.
  • Crowdsourcing initiator:
1. Event initiator: Individual, public entity, company.
In the educational setting, teachers and professors are often the event initiators [].
2. Cost of crowdsourcing deployment: Personnel, computer resources, data validation.
Existing online platforms (e.g., ArcGIS Survey123) allow crowdsourcing systems to be developed efficiently, because the necessary computer resources consist of existing cloud infrastructure, and the data collection devices are the students’ existing smartphones.
3. Benefits of crowdsourcing: Knowledge, ideas, value creation.
Crowdsourcing can develop field data to support environmental monitoring (as in this study), can create educational content [], and can complete existing tasks through commercial crowdsourcing platforms, including digital content creation [] or content recognition []. One important benefit of an educational crowdsourcing assignment is that involving students in the crowdsourcing process makes it possible to teach the meaning and value of the crowdsourcing technique.
  • Crowdsourcing process:
1. Method used: Participatory, distributed online process.
Educational crowdsourcing exercises will often take the form of a class assignment through an online process or platform. It is possible to perform a crowdsourcing exercise without internet-connected tools, but in most cases it would not be analogous to a crowdsourcing campaign that uses the internet to connect the public or groups of experts to solve a particular task.
2. Identification of participants: Open, limited to a knowledge community, or a combination.
Within an educational setting, most examples involve a community limited to the students [,], but future exercises could include the development of public crowdsourcing campaigns, including the identification of a community, outreach to that community, and the analysis of results, if this could be completed within an academic term.
3. Medium used: Internet.
Internet connectivity is key for the collection of content and data, or the completion of tasks, within crowdsourcing. Leveraging location (e.g., GPS) and smartphone capabilities, as done in the current study, provides a way to teach structured data collection that rapidly completes a task using multiple participants, which would otherwise be time- or cost-prohibitive.
  • Ethical and regulatory considerations:
Crowdsourcing is subject to ethical and regulatory considerations, which are discussed in several studies [,].
Past educational crowdsourcing studies can be categorized as follows: (1) creating educational content, (2) providing practical experience, (3) exchanging knowledge, or (4) providing course feedback []. Many studies in the educational literature utilize crowdsourcing methodologies (including real-world data collection), but few focus on helping students understand the process and the related technologies as an educational goal. The hypothesis of this study is that involving students as agents in the crowdsourcing process helps them understand the relevance and potential of crowdsourcing in their field, while also teaching important domain-specific field knowledge. Incorporating student participation in crowdsourcing creates the opportunity for actively engaged learning, which may improve the learning process []. The objectives of this study were to develop and test a laboratory exercise on soil erosion field data crowdsourcing using a GPS-enabled cellphone application in an online introductory soil science course (FNR 2040: Soil Information Systems) taught to Clemson University students from different STEM disciplines (wildlife biology, forestry, and environmental and natural resources) in the fall of 2021. The subject of soil erosion fits well with the overall course objectives of learning fundamental soil science concepts and definitions, as well as methods of soil analysis, in lectures and laboratories.

2. Materials and Methods

2.1. Design

This exercise involves a sequence of self-contained digital teaching modules, or reusable learning objects (RLOs), delivered within a learning management system (LMS), Canvas (Table 2). The laboratory exercise consisted of a field exercise and a learning module to teach students about different types of water-based soil erosion, as well as field data collection and crowdsourcing tools (Table 2).
Table 2. Design steps of this study using a sequence of reusable learning objects (RLOs).

2.2. Background of the “Test” Course

“Test” course: Soil Information Systems (FNR 2040) is a 4-credit course in the Department of Forestry and Environmental Conservation at Clemson University, Clemson, SC, USA []. FNR 2040 is “an introductory soil course that focuses on the input, analysis, and output of soil information utilizing geographic information technologies (Global Positioning Systems, Geographic Information Systems, direct/remote sensing) and soil data systems (soil surveys, laboratory data, and soil data storage). Soil Information Systems course is a required course for forestry, wildlife, and environmental science majors” []. The course was taught online for the first time because of COVID-19, which required the development of online exercises. General course information from the survey is presented in Table 3.
Table 3. General survey information about the course (FNR 2040: Soil Information Systems course, n = 58).

2.3. Field Data Collection Using ArcGIS Survey123

A form for soil erosion data collection was created on the ArcGIS Survey123 website (https://survey123.arcgis.com (accessed on 1 September 2021)), which provides a no-code visual tool for building a form before it is shared with the students undertaking data collection. The constructed form included fields for the date and time, the location, and a photo upload widget. The form included questions on the type of soil erosion by water present, as well as whether there was any soil erosion control at the site (Figure 1). A field was also added so that students could record notes. After the form was constructed, it was deployed on the ArcGIS Survey123 website and then shared with students either through a QR code or by adding their ArcGIS.com account emails. Students were asked to download the free ArcGIS Survey123 application, which is available for both Android and iOS phones, and then to load the custom survey form before finding field sites. When a suitable field site was found, the students opened the application and form and submitted the required information and photo for each erosion location they identified.
Figure 1. Field data collection using ArcGIS Survey123.
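The structure of such a form can be sketched in code. ArcGIS Survey123 forms follow the XLSForm schema, although the specific field names, question types, and choice lists below are illustrative assumptions rather than the authors' exact form:

```python
# Illustrative XLSForm-style schema for the soil erosion survey form.
# Field names and choice labels are assumptions, not the authors' exact form.
survey_schema = [
    {"type": "dateTime", "name": "observed_at", "label": "Date and time of observation"},
    {"type": "geopoint", "name": "location", "label": "Erosion location (from phone GPS)"},
    {"type": "image", "name": "site_photo", "label": "Photo of the erosion site"},
    {"type": "select_one erosion_type", "name": "erosion_type",
     "label": "Type of soil erosion by water"},
    {"type": "select_one yes_no", "name": "erosion_control",
     "label": "Is any erosion control present at the site?"},
    {"type": "text", "name": "notes", "label": "Additional notes"},
]

# Choice lists referenced by the select_one questions above
# (a hypothetical set of water-erosion categories).
choices = {
    "erosion_type": ["sheet", "rill", "gully", "streambank"],
    "yes_no": ["yes", "no"],
}

for field in survey_schema:
    print(f"{field['name']:16s} {field['type']}")
```

A schema like this maps directly onto the survey/choices sheets of an XLSForm workbook, which Survey123 can publish as a shareable form.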

3. Results

3.1. Pre-Testing Responses to the Web-Based Survey

Pre-testing responses to the survey questions about familiarity with the subject of soil erosion revealed that most students were “slightly familiar” (34.5%) or “somewhat familiar” (46.6%) with the types of soil erosion caused by water (Table 4). In terms of familiarity with technological concepts, the survey results showed that students were more familiar with GPS (“somewhat familiar”: 44.8%; “moderately familiar”: 41.4%) than with crowdsourcing (“not at all familiar”: 56.9%; “slightly familiar”: 22.4%) and remote sensing (“not at all familiar”: 51.7%; “slightly familiar”: 31.0%) (Table 4). A quarter of the students were not aware that cellphones have a built-in GPS (Table 4). More than fifty percent of the students had previously used cellphones for data collection (Table 4). More than eighty percent of the students knew that cellphones can be used for remote sensing purposes (Table 4).
Table 4. Pre- and post-assessment results from the laboratory exercise on soil erosion field data crowdsourcing using the GPS-enabled cellphone application in the FNR 2040: Soil Information Systems course.

3.2. Laboratory Exercise and Quiz Results

Once each participant had recorded soil erosion data from the laboratory exercise within the ArcGIS Survey123 application, the results were synced with the online ArcGIS.com database for storage, either in real time or when the cellphone regained internet access, and the GPS locations were used to generate a digital map of the locations with different types of soil erosion using geographic information system (GIS) software (Figure 2). Students successfully completed this laboratory exercise.
Figure 2. Maps showing the crowdsourcing of soil erosion field data collected by the students: (a) locations of data collection, and (b) examples of photos of types of soil erosion.
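The mapping step can be approximated with a short sketch: records exported from the feature layer are grouped by erosion type, which is the structure a GIS layer symbolized by category (as in Figure 2) would consume. The coordinates and field names below are hypothetical, not the students' actual submissions:

```python
from collections import defaultdict

# Hypothetical records as exported from the ArcGIS.com feature layer;
# the field names (x, y, erosion_type) are assumptions for illustration.
records = [
    {"x": -82.8374, "y": 34.6834, "erosion_type": "rill"},
    {"x": -82.8120, "y": 34.6755, "erosion_type": "gully"},
    {"x": -82.8391, "y": 34.6790, "erosion_type": "rill"},
    {"x": -82.8402, "y": 34.6881, "erosion_type": "sheet"},
]

# Group point coordinates by erosion type; a GIS package would then
# symbolize each group with its own marker color on the map.
points_by_type = defaultdict(list)
for rec in records:
    points_by_type[rec["erosion_type"]].append((rec["x"], rec["y"]))

for etype, pts in sorted(points_by_type.items()):
    print(f"{etype}: {len(pts)} location(s)")
```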
After the laboratory exercise, students were asked to complete a quiz to assess the learning outcomes. The quiz scores were excellent and suggest that students were able to retain the knowledge presented in each RLO (Table 5).
Table 5. Responses to the quiz questions for the laboratory exercise on soil erosion field data crowdsourcing using GPS-enabled cellphone application in the FNR 2040: Soil Information Systems course (n = 56; average score: 9.5; high score: 10; low score: 8; standard deviation: 0.65; average time: 04:58).

3.3. Comparison of Pre- and Post-Testing Responses to the Web-Based Surveys

Post-assessment results showed that the lecture and laboratory exercise effectively educated students about the types of soil erosion caused by water, with an increase in the number of students classifying their familiarity with the concept of soil erosion as “moderately familiar” (a 31.7-percentage-point increase from the pre-assessment) and “extremely familiar” (a 49.1-percentage-point increase from the pre-assessment) (Table 4). Students increased their familiarity with all of the technological concepts of GPS, crowdsourcing, and remote sensing, as indicated by increases in the percentages of the “moderately” and “extremely” familiar categories. Students also increased their awareness that every cellphone has a built-in GPS, and that a cellphone can be used for remote sensing purposes.
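These familiarity shifts are percentage-point differences between matched pre- and post-assessment category shares. A minimal sketch of the calculation follows; the specific pre- and post-assessment shares are illustrative values chosen only so that their differences match the +31.7 and +49.1 point changes reported for soil erosion familiarity:

```python
# Pre- and post-assessment category shares (percent of respondents).
# These individual values are illustrative assumptions; only the
# differences correspond to the changes reported in the text.
pre = {"moderately familiar": 12.1, "extremely familiar": 1.7}
post = {"moderately familiar": 43.8, "extremely familiar": 50.8}

# Percentage-point change per familiarity category.
change = {cat: round(post[cat] - pre[cat], 1) for cat in pre}
print(change)
```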
Most students agreed or strongly agreed that the laboratory was an effective way to learn about soil erosion caused by water and about crowdsourcing (Table 4). Over 90% of students found that a cellphone data-gathering app (e.g., ArcGIS Survey123) was an accurate and efficient way to collect field data. Interestingly, although having a cellphone was not a requirement for taking the course, each student had access to a cellphone capable of running the ArcGIS Survey123 application (Android 6.0 Marshmallow or later, ARMv7 (32 bit) and ARMv8 (64 bit); iOS 13 or later, 64 bit), which limits the application to more modern Android and iOS devices.
Detailed student comments were grouped by theme, with some examples shown in Table 6. Many of the comments spanned multiple themes. Students enjoyed (T1: Enjoyment of learning) learning about soil erosion, collecting data, seeing real examples in the field, taking photos, and “looking for soil erosion in their own land”. They appreciated the value of multimedia (T2), such as ArcGIS Survey123, cellphone usage, and GIS. Collecting field data presented students with flexibility of learning (T3), whereby they had opportunities for “getting outside” and “seeing new places to gather data”. Students universally appreciated the field aspect of the data collection and, in some cases, even suggested that they would have appreciated more time to collect additional samples. Students also appreciated that the laboratory consisted of various RLOs, which provided “the breakdown steps of each part of the lab”. Students recognized the applicability of the content (T4), describing their experiences in learning about different types of erosion in the field. Students’ comments also included constructive criticism, which ranged from technical suggestions to organizational issues. These suggestions will be highly valuable for future improvements to this exercise, as well as to other exercises.
Table 6. Post-assessment students’ comments, grouped by theme [], regarding their experience with the laboratory exercise on soil erosion field data crowdsourcing using a GPS-enabled cellphone application in the FNR 2040: Soil Information Systems course.
The pre- and post-testing non-matching responses to the survey questions to define soil erosion caused by water, the Global Positioning System (GPS) and what it is used for, crowdsourcing, and remote sensing reveal that students significantly improved their ability to define these terms (Table 7). The responses in Table 7 show that most of the students could not define GPS, crowdsourcing, or remote sensing prior to the laboratory exercise. Students demonstrated an ability to define these terms after the laboratory exercise using technological terms (Table 7). This is important because these terms and the associated technologies have relevance to and impacts on additional areas of the course, and for the majors involved. For example, while this exercise used ArcGIS Survey123 for crowdsourcing, the same platform and technology could be used for individual project data collection in the era of the rapid transition from paper-based to digital data recording methodologies.
Table 7. Examples of pre- and post-testing non-matching responses to the survey questions to define soil erosion caused by water, the Global Positioning System (GPS) and what it is used for, crowdsourcing, and remote sensing.
A word cloud was created based on all of the answers to the open-ended request to write about students’ favorite experience in the exercise. This word cloud showed the most frequently used words, such as “erosion”, “different”, “enjoyed”, and “going”, which indicates that the objective of learning about soil erosion was met (Figure 3). Several frequent words also reflected the idea that going outside and using mobile application technology were important aspects of the project (e.g., “places”, “data”, “outside”, “app”, “Survey123”). The relatively small presence of words related to data collection, such as “data” and “app”, demonstrates that although the technological tools were mentioned, they were not as prominent for students as the topics of soil erosion and fieldwork.
Figure 3. Word cloud based on students’ comments about their experience with the laboratory exercise on soil erosion field data crowdsourcing using GPS-enabled cellphone application in the FNR 2040: Soil Information Systems course. The larger the word in the word cloud, the more frequent the word in the students’ comments.
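A word cloud of this kind is driven by simple token frequencies. The sketch below, using invented comments in place of the actual survey responses, shows the counting step that determines each word's size:

```python
from collections import Counter
import re

# Invented comments standing in for the actual open-ended responses.
comments = [
    "I enjoyed going outside to find different types of erosion",
    "Enjoyed using the Survey123 app to collect erosion data",
    "Seeing different places with erosion was my favorite part",
]

# Tokenize, drop a small stop-word list and very short tokens, and
# count occurrences -- the same frequencies that scale a word cloud.
stopwords = {"i", "to", "the", "was", "my", "with", "of", "using"}
words = [w for c in comments
         for w in re.findall(r"[a-z0-9']+", c.lower())
         if w not in stopwords and len(w) > 2]
freq = Counter(words)

for word, count in freq.most_common(3):
    print(word, count)
```

With these sample comments, "erosion" dominates, mirroring the pattern reported in Figure 3.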

4. Discussion

Teaching with technology can increase students’ interests in soil science and other STEM disciplines, especially when field lessons can be related to ecological issues []. Previous studies indicate that many students favor the field component and the use of advanced technologies and soil maps in the soil science discipline []. The results of this study provide practical recommendations for developing crowdsourcing exercises. Table 1 can be used to help plan a crowdsourcing exercise by identifying a suitable crowdsourcing subject and crowdsourcing components (e.g., crowd characteristics, crowdsourcing initiator, and crowdsourcing process). Table 8 demonstrates an example of how to plan and structure an exercise based on the components defined in Table 1 using soil erosion as a subject matter. The advantage of using this structured planning tool is that a crowdsourcing exercise can be created that meets content-based learning objectives while teaching the students about crowdsourcing techniques and tools []. Many other disciplines, besides soil science, have topics with a geospatial component that could benefit from the planning method and technology used here [].
Table 8. Practical recommendations for planning and executing crowdsourcing exercises using the current study as an example.
Future crowdsourcing exercises could build on this methodology to extend the participatory nature of the project through gamification [,] and/or incorporate related technologies. Students could be given the anonymous opportunity to evaluate data collected by other students to improve overall data quality [], which could also be used to create reputational rewards [,]. Projects designed to support a social cause may be well received by students in some situations []. Crowdsourced results could be combined with models to validate estimated environmental impacts. A remote sensing exercise using satellite or aerial imagery could be leveraged to identify likely locations where data are needed. Students could also be tasked with planning a crowdsourcing campaign to be carried out either by student groups or the general public.
ArcGIS Survey123 provides an accessible and easy-to-use platform for geospatial data collection that can be customized to a wide range of topics, ranging from a natural resource focus, as in this study, to, for example, economic surveys of individuals. An alternative open-source platform for crowdsourcing is the QField application []. This system also includes cellphone and desktop applications and is focused, like ArcGIS Survey123, on geospatial data collection.

5. Conclusions

This study described the development, application, and assessment of soil erosion field data crowdsourcing using a GPS-enabled cellphone application. The effectiveness of this teaching innovation was assessed using a pre- and post-testing web-based survey and online quiz. The online quiz was used to examine the learning outcomes. The web-based survey tool measured various constructs (e.g., familiarity with types of soil erosion caused by water, GPS, crowdsourcing, and remote sensing) before and after the laboratory-based activity. The reusable learning objects were effective teaching tools for explaining soil erosion as a subject matter, and crowdsourcing as a tool for field data collection. Students gained knowledge of the types of soil erosion, as demonstrated by the excellent quiz scores (average score: 9.5) received by the students. Students increased their familiarity with the concepts of soil erosion, GPS, crowdsourcing, and remote sensing. Students found that a cellphone data-gathering application (e.g., ArcGIS Survey123) was an accurate way to collect field data. The results of the study were used to develop practical recommendations for planning and executing crowdsourcing exercises, using the current study as an example. Educators can use the proposed template to plan crowdsourcing exercises for various topics and disciplines. These methods could be extended to focus on a specific research topic or societal need. Incorporating a peer-review system could improve the data quality by enabling the students to evaluate each other’s work. A reputational system could further improve the crowdsourcing results by identifying students who demonstrated accurate data evaluation skills. Gamification of the crowdsourcing process could increase student participation and satisfaction.

Author Contributions

Conceptualization, E.A.M. and C.J.P.; methodology, E.A.M., C.J.P., G.L.Y. and H.A.Z.; formal analysis, E.A.M., C.J.P. and G.L.Y.; writing—original draft preparation, E.A.M. and G.L.Y.; writing—review and editing, E.A.M. and H.A.Z.; visualization, E.A.M. and H.A.Z.; supervision, E.A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Institute of Food and Agriculture (NIFA) (U.S.A.), Grant No. 2020-70003-32310/Project Accession No. 1023532.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Clemson University (protocol code IRB2020-257 and date of approval: 25 September 2020).

Data Availability Statement

Not applicable.

Acknowledgments

We would like to thank the reviewers for their constructive comments and suggestions.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ENR: Environmental and Natural Resources
FOR: Forestry
RLO: Reusable learning object
STEM: Science, technology, engineering, and mathematics
USDA: United States Department of Agriculture
WFB: Wildlife and Fisheries Biology

References

  1. Estellés-Arolas, E.; González-Ladrón-De-Guevara, F. Towards an integrated crowdsourcing definition. J. Inf. Sci. 2012, 38, 189–200. [Google Scholar] [CrossRef] [Green Version]
  2. Pirttinen, E. Crowdsourcing in Computer Science Education. In Proceedings of the 17th ACM Conference on International Computing Education Research (ICER 2021), Virtual Event USA, 16–19 August 2021; ACM: New York, NY, USA, 2021. [Google Scholar] [CrossRef]
  3. Rahim, A.A.A.; Khamis, N.Y.; Majid, M.S.Z.A.; Uthayakumaran, A.; Isa, N.M. Crowdsourcing as content collaboration for STEM edutainment. Univers. J. Educ. Res. 2021, 9, 99–107. [Google Scholar] [CrossRef]
  4. Khan, J.; Papangelis, K.; Markopoulos, P. Completing a crowdsourcing task instead of an assignment; What do university students think? Case Study. In Proceedings of the 30th ACM Conference on Human Factors in Computing Systems (CHI ′20), Honolulu, HI, USA, 25–30 April 2020. [Google Scholar] [CrossRef]
  5. Goncalves, J.; Feldman, M.; Hu, S.; Kostakos, V.; Bernstein, A. Task routing and assignment in crowdsourcing based on cognitive abilities. In Proceedings of the 26th International World Wide Web Conference (WWW ′17), Perth, Australia, 3–7 April 2017. [Google Scholar] [CrossRef] [Green Version]
  6. Cappa, F.; Rosso, F.; Hayes, D. Monetary and social rewards for crowdsourcing. Sustainability 2019, 11, 2834. [Google Scholar] [CrossRef] [Green Version]
  7. Dunlap, J.C.; Lowenthal, P.R. Online educators’ recommendations for teaching online: Crowdsourcing in action. Open Prax. 2018, 10, 79–89. [Google Scholar] [CrossRef] [Green Version]
  8. Yang, R.; Xue, Y.; Gomes, C.P. Pedagogical value-aligned crowdsourcing: Inspiring the wisdom of crowds via interactive teaching. In Proceedings of the 17th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2018), Stockholm, Sweden, 10–15 July 2018. [Google Scholar]
  9. Yu, H.; Shen, Z.; Miao, C.; An, B. A Reputation-aware Decision-making Approach for Improving the Efficiency of Crowdsourcing Systems. In Proceedings of the 2013 International Conference on Autonomous Agents and Multi-agent Systems (AAMAS ′13), St. Paul, MN, USA, 6–10 May 2013; International Foundation for Autonomous Agents and Multiagent Systems: Richland, SC, USA, 2013. [Google Scholar]
  10. Guo, H.; Ajmeri, N.; Singh, M.P. Teaching crowdsourcing: An experience report. IEEE Internet Comput. 2018, 22, 44–52. [Google Scholar] [CrossRef]
  11. Zdravkova, K. Ethical issues of crowdsourcing in education. J. Responsible Technol. 2020, 2, 100004. [Google Scholar] [CrossRef]
  12. Zheng, F.; Tao, R.; Maier, H.R.; See, L.; Savic, D.; Zhang, T.; Chen, Q.; Assumpção, T.H.; Yang, P.; Heidari, B.; et al. Crowdsourcing methods for data collection in geophysics: State of the art, issues, and future directions. Rev. Geophys. 2018, 56, 698–740. [Google Scholar] [CrossRef]
  13. Jiang, Y.; Schlagwein, D.; Benatallah, B. A review on crowdsourcing for education: State of the art of literature and practice. In PACIS 2018—Opportunities and Challenges for the Digitized Society: Are We Ready?, Proceedings of the 2018 Pacific Asia Conference on Information Systems, Yokohama, Japan, 26–30 June 2018; Association for Information Systems (AIS): Atlanta, GA, USA, 2018; Available online: https://aisel.aisnet.org/pacis2018/ (accessed on 20 January 2022).
  14. Dahlqvist, M. Stimulating Engagement and Learning through Gamified Crowdsourcing: Development and Evaluation of a Digital Platform. Master’s Thesis, Faculty of Arts, Uppsala University, Uppsala, Sweden, 2017. Available online: http://www.diva-portal.org/smash/record.jsf?dswid=-9848&pid=diva2%3A1113649 (accessed on 20 January 2022).
  15. Clemson University. Undergraduate Announcements; Clemson University: Clemson, SC, USA, 2021. [Google Scholar]
  16. Redmond, C.; Davies, C.; Cornally, D.; Adam, E.; Daly, O.; Fegan, M.; O’Toole, M. Using reusable learning objects (RLOs) in wound care education: Undergraduate student nurse’s evaluation of their learning gain. Nurse Educ. Today 2018, 60, 3–10. [Google Scholar] [CrossRef] [PubMed]
  17. Urbańska, M.; Świtoniak, M.; Charzyński, P. Rusty soils—“lost” in school education. Soil Sci. Annu. 2021, 72, 143466. [Google Scholar] [CrossRef]
  18. Hartemink, A.E.; Balks, M.R.; Chen, Z.; Drohan, P.; Field, D.J.; Krasilnikov, P.; Lowe, D.J.; Rabenhorst, M.; van Rees, K.; Schad, P.; et al. The joy of teaching soil science. Geoderma 2014, 217, 1–9. [Google Scholar] [CrossRef]
  19. Urbańska, M.; Charzyński, P.; Gadsby, H.; Novák, T.J.; Şahin, S.; Yilmaz, M.D. Environmental threats and geographical education: Students’ sustainability awareness—Evaluation. Educ. Sci. 2022, 12, 1. [Google Scholar] [CrossRef]
  20. Urbańska, M.; Sojka, T.; Charzyński, P.; Świtoniak, M. Digital media in soil education. Geogr. Tour. 2019, 7, 41–52. [Google Scholar]
  21. Ostadabbas, H.; Weippert, H.; Behr, F.J. Using the synergy of QField for collecting data on-site and QGIS for interactive map creation by ALKIS® data extraction and implementation in PostgreSQL for urban planning processes. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43, 679–683. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
