Article

An Assessment of Educational Benefits from the OpenOrbiter Space Program

1 Department of Computer Science, 3950 Campus Road, Stop 9015, Grand Forks, ND 58202, USA
2 Department of Space Studies, 4149 University Ave, Stop 9008, Grand Forks, ND 58202, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2013, 3(3), 259-278; https://doi.org/10.3390/educsci3030259
Submission received: 20 April 2013 / Revised: 1 July 2013 / Accepted: 2 July 2013 / Published: 10 July 2013

Abstract

This paper analyzes the educational impact of the OpenOrbiter Small Spacecraft Development Initiative, a CubeSat development program underway at the University of North Dakota. OpenOrbiter includes traditional STEM activities (e.g., spacecraft engineering, software development); it also incorporates students from non-STEM disciplines, such as management, entrepreneurship, education and fine arts, that are not generally involved in aerospace engineering projects. The value of the program to participants is analyzed quantitatively, in terms of improvement related to five key learning objectives.

1. Introduction

The OpenOrbiter Small Spacecraft Development Initiative is an interdisciplinary space program operating at the University of North Dakota. A thematically-related predecessor program was initiated in 2011; OpenOrbiter started in 2012. The program was initiated with the goal of developing a CubeSat-class spacecraft; however, the exact nature of the program, its name and branding, as well as its implementation have been driven by participants. OpenOrbiter seeks to demonstrate the space worthiness and functionality of the Open Prototype for Educational NanoSats (OPEN) designs. OPEN aims to reduce the cost of CubeSat development for other institutions by providing a complete set of design documents, implementation instructions (written and video), software and testing plans. With the OPEN documentation, a CubeSat can be created with a parts budget of less than $5,000.
This paper analyzes the benefit that participating in the program has had for students during its first year of operations (1.5 years, including the thematically-related precursor program). These benefits are characterized in terms of key educational objectives that were defined before program initiation and during its early implementation. Attainment of these objectives is analyzed quantitatively, based on survey responses from participants.

2. Background

The OpenOrbiter project draws inspiration from previous work related to experiential and problem-based learning (PBL) and to CubeSat development. A brief overview of prior work on both of these topics is now presented.

2.1. Experiential Learning and Problem-Based Learning

Project-based learning (also known as problem-based learning or experiential learning) involves providing students with a challenge to solve or a problem to resolve. Students collect information, assess the nature of the challenge or problem and devise and implement a plan to achieve the assigned goal or resolve the assigned problem. The utility of PBL techniques has been demonstrated for all stages of education, ranging from primary to university level (see [1,2,3,4,5,6]). The use of PBL has also been favorably assessed in numerous disciplines, such as computer science [7,8], computer engineering [9], electrical engineering [10,11], mechanical engineering [12,13,14], aerospace engineering [15,16], management [17] and marketing [18]. Small spacecraft development, in an educational setting, is inherently an exercise in PBL. Students can be involved (depending on program particulars) in the design, development, testing and operations of the spacecraft. PBL small spacecraft programs (e.g., [6,19]) have been shown to be effective in achieving educational outcomes.

2.2. Small Spacecraft

CubeSats, such as those based on the OPEN specification, make ideal platforms for student learning, as they allow the project to fit into a timeframe that lets students be involved in the entire project (or a substantial portion of it). The CubeSat concept was developed by Robert Twiggs and Jordi Puig-Suari specifically for educational purposes, and CubeSats have been successfully used by numerous institutions for this purpose [20,21]. CubeSats, particularly larger-sized ones such as the 6-U form factor (30 cm × 20 cm × 10 cm, 8 kg mass), have also been utilized for bona fide research and other purposes. While they have always been much less expensive than typical spacecraft [22], recent work has shown that their cost can be further reduced to as low as $5,000 [23].

3. The Program

The OpenOrbiter Small Spacecraft Development Initiative is a small spacecraft development program underway at the University of North Dakota. It provides participants, which include students ranging from freshmen to graduate students (though most participants are undergraduates), with the opportunity to gain technical, group work, communications and other skills. The following sections provide an overview of the OpenOrbiter program, the benefits of interdisciplinary projects and an overview of the learning objectives of the program.

3.1. Overview of the OpenOrbiter Small Spacecraft Development Initiative

The OpenOrbiter Small Spacecraft Development Initiative was initiated in 2012 as a follow-on to a thematically-related precursor program. The primary technical goal of OpenOrbiter, namely to validate the functionality and space worthiness of the designs of the Open Prototype for Educational NanoSats (OPEN), was participant-derived based upon an initial goal of developing a CubeSat in conjunction with a prospective NASA launch opportunity. The project name and branding were also participant-created.
In addition to providing benefit for its participants, the OpenOrbiter project also serves a wider purpose. OPEN is responsive to a problem encountered at the University of North Dakota and undoubtedly at many other institutions: the initial cost of developing competency in small spacecraft development was beyond the resources available from faculty seed and startup-type funding; however, the programs that could provide sufficient resources generally require demonstrated capabilities. OPEN solves this by making all of the information required to build a 1-U CubeSat with a parts cost of under $5,000 [23] available to educators, researchers and others worldwide. The documentation that will be provided includes CAD diagrams, fabrication instructions (text and video), software and testing plans. The OPEN design [24] is also innovative: it incorporates structural, configuration and other enhancements. The structural design provides over 30% more usable volume (as compared to the base 10 cm × 10 cm × 11 cm form factor) while fully complying with PPOD (launch device) integration requirements.
At present, the vast majority of mechanical and software design work has been completed. Significant progress has been made on software development. Test structures have been developed to validate the mechanical design. Work on electrical design is still ongoing for some (more complex) components, while prototypes of others have been completed.

3.2. Benefits of Interdisciplinary Projects

Interdisciplinary projects are a typical feature of the modern workplace. Most undertakings of any size cannot be performed exclusively by practitioners of a single specialty. However, virtually all student projects in an academic environment are performed within the context of a course or a degree program. Because of this, they generally involve a set of similarly trained students working on a narrowly-defined topic. Even projects that span disciplines (e.g., teams participating in NASA’s Lunabotics competition [25]) may be limited to only closely related disciplines (e.g., electrical, mechanical and computer engineering).
Because of this, students may not gain exposure to a true interdisciplinary project (characterized by multiple specialists collaboratively performing work related to their areas of specialty) until after they enter the workforce. This may require them to unlearn practices and approaches acquired while working only in discipline-constrained teams. They may also experience frustration if the process of getting up to speed impairs their performance during their initial period (normally including some sort of evaluation/probation process) with a new employer whom they seek to impress.
Involving students in interdisciplinary work prevents ‘silo’-type work habits from developing; students instead learn how to work well in collaboration with others with skills divergent from their own. In addition to these general benefits, students also begin to learn the particular vernacular and work styles of the disciplines whose practitioners-in-training they collaborate with. Interdisciplinary projects may also be able to have a larger scale than those within a single discipline, offering an opportunity for project management practices and discipline-specific multi-person collaboration techniques (e.g., software version control management) to be learned and refined. All of this increases student participant preparation for workplace entry and success.

3.3. Learning Objectives

Five main objectives were identified prior to beginning the OpenOrbiter program. These were increasing proficiency in area-specific technical skills, spacecraft design and development skills, and presentation skills. The program also sought to increase excitement about space and participant comfort giving presentations. Each will now be discussed.
Participants gaining area-specific technical skills is an obvious outcome from the spacecraft development program. For many students, the skills that have been and will (from future involvement) be enhanced are aligned with their major (or perhaps minor). Some students, however, opted to participate in an area different from their academic work to gain an understanding of and experience in a different field. The skills gained or enhanced through program participation were, of course, different for each group and, possibly, each individual (based on what tasks they worked on).
Learning spacecraft design and development skills was another obvious outcome of the program, due to the program focus on small spacecraft design and development. For many students, this was their first exposure to this topic. The skills imparted included iterative spacecraft design and refinement and subsystem-specific design and development skills. Perhaps the single largest lesson taught was with regards to the constraints that the space environment and launch and other costs place on the mass and volume (and, as a consequence of this, virtually all aspects) of the spacecraft.
Presentation skills and comfort giving presentations were identified as key preparations for workforce success that could be enhanced by participation in this program. Success in the workplace environment requires effective use of written and verbal communications to convey highly technical information and other details. Of course, these skills cannot provide value if the individual doesn’t put them to use. Given this, skill development and creating comfort using these skills were identified as key things that could be enhanced through program participation.
Enthusing participants about space and space engineering was also identified as a desired outcome. This outcome cannot be directly traced to future workplace success requirements. However, it was a necessity for project success, as this excitement was seen as a key driver for participants to remain involved in the project. Prospective future funding sources (e.g., NASA) for a national expansion of this type of work also define this as an evaluative criterion for proposal selection, making delivery on this goal highly desirable for this purpose as well.

4. Data and Analysis

To assess the performance of the program in attaining these educational objectives, a survey instrument was designed and administered to program participants across all of the groups at regularly scheduled meetings. Twenty individuals completed the survey. These individuals included students studying computer science, electrical engineering, entrepreneurship and space studies. Overall participation in the project varied: nearly 300 students attended at least one meeting, while a smaller number (which fluctuated during the period described, but was generally between 45 and 75 students) attended weekly group and/or general meetings. These results are now presented.

4.1. Overall Results

The survey asked participants to evaluate their status prior to project participation and at present for each of the five key outcome areas (technical skill, spacecraft design comfort, excitement about space, presentation skills and presentation comfort). Participants were asked to respond on a nine-point scale for all status questions. Questions were given in the format:
On a scale of 1 to 9, ________________________________ before starting work on the project:
On a scale of 1 to 9, ________________________________ at the present time:
For each question the above blanks were filled in with the particular item of focus. For example, for questions 13 and 18 the phrase “please rate your technical skill in your area of focus” was filled in resulting in the questions “on a scale of 1 to 9, please rate your technical skill in your area of focus before starting work on the project” and “on a scale of 1 to 9, please rate your technical skill in your area of focus at the present time”. For this question, response choices ranged from 9-expert to 5-average to 1-novice. This scale was also used for questions 16 and 21 (“on a scale of 1 to 9, please rate your level of presentation skills”).
For questions 14 and 19 (“on a scale of 1 to 9, please rate your level of comfort with spacecraft design”), response choices ranged from 9-very comfortable to 5-somewhat comfortable to 1-not comfortable. This scale was also used for questions 17 and 22 (“on a scale of 1 to 9, please rate your level of comfort with giving a presentation”).
For questions 15 and 20 (“on a scale of 1 to 9, please rate your level of excitement with space before starting work on the project”), response choices ranged from 9-very excited to 5-average to 1-novice.
The average responses for each category, before and after participation, are presented in Figure 1(a). The average improvement, by category is presented in Figure 1(b). There were a few isolated cases where participants reported lower status-levels after participation as compared to before. For the skill questions, this type of response made no practical sense as there was no conceivable way that the project could have caused someone to regress in their skill level. On the excitement about space and comfort presenting questions, it is of course possible that these attitudes have declined during the time (due to program participation or otherwise). In each instance, the corresponding program impact question showed an average response (4–6 range) so it is presumed that these may be indicative of a change not caused by the program or perhaps participants not correlating their two responses.
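The before-and-after averaging described above can be sketched as follows. This is a minimal illustration of the computation, not the study's analysis code; the category names follow the paper's five outcome areas, but the response values are hypothetical placeholders, not the actual survey data.

```python
# Per-category average improvement from paired pre/post responses on a
# nine-point scale. Response values below are illustrative only.

CATEGORIES = ["technical skill", "spacecraft design comfort",
              "space excitement", "presentation skill",
              "presentation comfort"]

# Each row is one respondent: a (pre, post) pair per category.
responses = [
    [(4, 6), (2, 5), (7, 9), (5, 6), (5, 7)],
    [(6, 7), (3, 4), (8, 8), (6, 8), (7, 8)],
    [(5, 5), (4, 7), (6, 9), (4, 5), (6, 6)],
]

def average_improvement(rows):
    """Mean (post - pre) per category across all respondents."""
    n = len(rows)
    k = len(rows[0])
    return [sum(row[c][1] - row[c][0] for row in rows) / n
            for c in range(k)]
```

The same paired structure also yields the "before" and "after" averages of Figure 1(a) directly, by averaging the pre and post values separately.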
Figure 1. (a) Comparison of Beginning and Ending Status Levels. (b) Improvement by Status, Average.
Clearly, it is unrealistic to expect participants to improve in every category; some individuals may have had no or less involvement with areas of the project relevant to a particular category (e.g., presentations). It is thus also useful to look at how much skills improved for individuals who showed some improvement. Figure 2(a) presents the average improvement for individuals showing improvement in each category.
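The restricted average used in Figure 2(a) can be sketched as follows, assuming "showing improvement" means a strictly positive pre-to-post delta; the example deltas are hypothetical.

```python
# Average improvement counting only respondents who improved (delta > 0)
# in a given category, matching the measure described above.

def improvement_among_improvers(deltas):
    """Mean of the positive deltas only; 0.0 if nobody improved."""
    gains = [d for d in deltas if d > 0]
    return sum(gains) / len(gains) if gains else 0.0

# e.g. category deltas of [2, 0, -1, 3] average 1.0 overall,
# but 2.5 among the two respondents who actually improved
```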
Figure 2. (a) Average Improvement by Status for Students Showing Improvement. (b) Attribution of Program Effect on Creating Change in Status Level.
In addition to asking respondents to characterize their pre-participation and post-participation skill levels, they were also asked to characterize the impact of the program on effecting this change. Again a nine-point scale was used with responses ranging from 9-strongly agree to 7-agree to 5-no preference to 3-disagree to 1-strongly disagree. Each of the three questions (23–25) was presented in the format:
Participation in this project has improved my ____________:
Question 23 asked about “technical skills”. Question 24 had respondents characterize the project’s impact on their “interest in space”. Question 25 asked about “presentation skills”.
The average responses to these questions are presented in Figure 2(b). Note that in all cases, the average is on the agree side, to varying extents. One individual who indicated in the open-ended question (number 26) that they hadn’t “really done much” with regards to the project influenced this somewhat: with this person’s response excluded, the averages rise from 6.15 to 6.32, from 6 to 6.16 and from 5.2 to 5.32 for technical skills, space interest and presentation skills, respectively.
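The effect of excluding a single low respondent on a category mean, as done above for the individual who reported minimal involvement, can be sketched as follows. The scores here are hypothetical nine-point Likert responses, not the survey data.

```python
# Recomputing a Likert-item mean with one respondent removed.

def mean_excluding(scores, drop_index=None):
    """Mean of the scores, optionally with one respondent removed."""
    kept = [s for i, s in enumerate(scores) if i != drop_index]
    return sum(kept) / len(kept)

scores = [7, 7, 6, 7, 3]             # one low outlier at index 4
full = mean_excluding(scores)        # mean over all five responses
trimmed = mean_excluding(scores, 4)  # mean with the outlier dropped
```

With small samples such as the twenty respondents here, a single response can shift the mean noticeably, which is why the with- and without-exclusion values are both reported.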

4.2. Comparison of Results between Undergraduate and Graduate Students

As part of the survey instrument, participants were asked a variety of questions relevant to characterizing their academic status and involvement with the project. The next several sections look at starting and ending status levels and the project’s impact in terms of these conditions. This section characterizes these items by whether students were undergraduates or graduate students.
Figure 3. (a) Beginning Status Levels, Compared between Graduate and Undergraduate Students. (b) Ending Status Levels, Compared between Graduate and Undergraduate Students.
Figure 3(a) presents the pre-participation levels for each category. Figure 3(b) presents these levels after participation. As these figures demonstrate, the relative levels of pre and post status are fairly consistent between undergraduates and graduate students. Graduate students average higher status levels for space excitement, presentation skills and presentation comfort, prior to participation (undergraduates start marginally higher in the other categories). In spacecraft design, graduate students overtake undergraduates during participation. In all other cases, the group that started with a higher skill level also ended with a higher skill level.
Figure 4(a) depicts the relative average aggregate improvement (the average of the sum of the improvement values reported by each individual) between the two groups. The one previously mentioned individual who reported that he or she hadn’t “done much with this project” was included among the graduate students. Excluding this individual raises the average to 6.2 (from 5.17) for the graduate students, which significantly exceeds the level reported by the undergraduates. Note that individuals whose sum was a negative (decline) score have been excluded from this average, and negative category values were excluded in the case of individuals who had other positive scores. Figure 4(b) shows the percentage of individuals in each category that had an improvement in each particular area.
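The aggregate-improvement rule described above (per-person sums, dropping persons with a net decline, and dropping isolated negative category values) can be sketched as follows; the delta values are hypothetical and the exclusion rules are as this paper describes them, not a general statistical convention.

```python
# Average aggregate improvement: sum each person's per-category deltas,
# exclude persons whose sum is negative, and drop a person's negative
# category values when they have positive scores elsewhere.

def average_aggregate_improvement(group):
    """group: list of per-person lists of per-category improvement deltas."""
    sums = []
    for deltas in group:
        if sum(deltas) < 0:
            continue  # person with an overall decline is excluded
        if any(d > 0 for d in deltas):
            # isolated negative values dropped when positives exist
            deltas = [d for d in deltas if d >= 0]
        sums.append(sum(deltas))
    return sum(sums) / len(sums) if sums else 0.0

group = [
    [2, 1, -1, 0, 3],   # net gain: negative value dropped, sum = 6
    [-2, -1, 0, 0, 0],  # net decline: excluded entirely
    [1, 1, 1, 1, 1],    # sum = 5
]
```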
Figure 4. (a) Average Aggregate Improvement, Compared between Graduate and Undergraduate Students. (b) Percentage of Participants Showing Improvement in Each Status, Compared between Graduate and Undergraduate Students.
In Figure 5(a), the average improvement for each status has been depicted for both graduate students and undergraduates. Figure 5(b) shows the responses related to program impact. In two of the three instances (technical skill and space interest), more improvement is shown for undergraduate as compared to graduate students. In the third, presentation skills, significantly more improvement is shown for graduate students. Excluding the individual who indicated a lack of participation causes technical skills to rise from 5.57 to 6 (as compared to 6.75 for undergraduates), space interest to rise from 5.28 to 5.67 (as compared to 6.33 for undergraduates) and presentation skills to rise from 6 to 6.5 (as compared to 4.75 for undergraduates).
Figure 5. (a) Average Improvement in Status Levels (for Students Showing Improvement), Compared between Graduate and Undergraduate Students. (b) Effect of Program on Causing Improvement by Status, Compared between Graduate and Undergraduate Students.

4.3. Comparison of Results between Team Leads and Participants

The relative impact of the project on individuals who are team leads versus those who are not is now considered. Figure 6(a) presents the pre-participation status levels for both team leads and non-team leads. Figure 6(b) presents the post-participation status levels.
Figure 6. (a) Beginning Status Levels, Compared between Team Leads and Participants. (b) Ending Status Levels, Compared between Team Leads and Participants.
Figure 7. (a) Average Aggregate Improvement, Compared between Team Leads and Participants. (b) Percentage of Participants Showing Improvement in Each Status, Compared between Team Leads and Participants.
Figure 8. (a) Improvement in Status Levels for those showing improvement in each category, Compared between Team Leads and Participants. (b) Effect of Program on Causing Improvement by Status, Compared between Team Leads and Participants.
The average aggregate improvement for team leads versus non-lead participants is depicted in Figure 7(a). This shows that team leads enjoyed over double the benefit of participation as compared with non-lead participants (7.57 vs. 3.78). Excluding the one individual who indicated a lack of involvement increases the participant average to 4.25. Figure 7(b) depicts the percentage of participants showing improvement in each category for both team leads and non-lead participants. A higher percentage of leads showed improvement in spacecraft design, presentation skills and presentation confidence. A higher percentage of non-leads showed improvement in technical skills and excitement about space.
In Figure 8(a,b), the level of improvement for each category and the effect of the program are considered, respectively. The average improvement shown by the team leads exceeds the level shown by the non-lead participants across all categories. The attributed impact of the program on causing improvement is also higher across all categories for the team leads.
The data presented clearly indicates that team leads enjoyed significantly more benefit from participation as compared to the non-lead participants. Not only did they show significantly greater benefit (slightly over double), but they attributed this benefit to participation in the program to a greater extent.

4.4. Comparison of Results by Level of Weekly Participation

The impact of how much time is spent per week on the project is now considered. Respondents were asked to characterize their participation on the project into one of three categories: 1–3.99 hours per week spent, 4–7.99 hours per week spent or 8+ hours per week spent. Figure 9(a,b) show the pre-participation and post-participation status levels.
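The categorization described above can be sketched as a simple binning rule; the function name and the numeric-hours input are illustrative assumptions, since respondents selected a category directly rather than reporting exact hours.

```python
# Map a weekly-hours figure onto the survey's three participation
# categories (hypothetical helper; respondents chose a category directly).

def hours_bucket(hours_per_week):
    """Assign hours per week to one of the three survey categories."""
    if hours_per_week >= 8:
        return "8+"
    if hours_per_week >= 4:
        return "4-7.99"
    return "1-3.99"
```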
Figure 9. (a) Beginning Status Levels, Compared between Weekly Levels of Participation. (b) Ending Status Levels, Compared between Weekly Levels of Participation.
The average aggregate improvement, by level of weekly participation, is depicted in Figure 10(a). A correlation between greater work on the project and improvement is shown: those working 1–3.99 hours show an average aggregate improvement of 4.36 (4.8 with the individual who indicated minimal participation excluded), as compared to 7.75 for those spending 4–7.99 hours and 8 for those spending 8 or more hours per week on the project. Figure 10(b) indicates the percentage of participants showing improvement for each category in each condition.
Figure 10. (a) Average Aggregate Improvement, Compared between Weekly Levels of Participation. (b) Percentage of Participants Showing Improvement in Each Status, Compared between Weekly Levels of Participation.
In Figure 11(a), the average level of improvement in each category is depicted. Greater improvement is seen in all categories for the 4–7.99 category as opposed to the 1–3.99 category. Due to the limited number of individuals responding in the 8+ category, the improvement is centered in two categories (with most of the improvement being located in spacecraft design). Other categories underperform the 4–7.99 and 1–3.99 groups.
Figure 11. (a) Improvement in Status Levels, Compared between Weekly Levels of Participation. (b) Effect of Program on Causing Improvement by Status, Compared between Weekly Levels of Participation.
The impact of the program on causing the indicated improvement is now considered. The 8 hours per week or more category shows greater attribution of results to the program in each category (as compared to the 1–3.99 and 4–7.99 conditions). The 4–7.99 condition shows more attribution (as compared to the 1–3.99 condition) in two categories (technical and presentation skills), while the 1–3.99 condition shows greater attribution in the space interest category.
The foregoing shows a clear correlation between the amount of time spent weekly on the project and improvement. This is most pronounced between the 1–3.99 and 4–7.99 conditions with only minimal (average) improvement being seen between the 4–7.99 and 8+ categories.

4.5. Comparison of Results by Amount of Time Participating

Correlations between the duration of participation (how long it has been since the individual commenced participation) and results are now assessed. Figure 12(a,b) show the pre-participation and post-participation status values. There is little time-category correlation demonstrated, as would be expected.
Figure 12. (a) Beginning Status Levels, Compared by Time Participating (in Academic Years). (b) Ending Status Levels, Compared by Time Participating (in Academic Years).
Figure 13(a) shows the correlation between the duration of participation and average aggregate improvement. A marginal increase is seen between 0.5 years and 1 year. One individual indicated 0.75 years of participation (by writing this answer in on the survey sheet; it was not a choice) and showed a comparative under-performance (relative to the 0.5 and 1 year categories). Excluding the individual who indicated minimal involvement, the 0.5-year average increases to 6 (from 5.25), surpassing the 5.66 response from the 1-year category. The average increase of 8 from the 1.5-year condition still surpasses both.
Figure 13. (a) Average Aggregate Improvement, Compared by Time Participating (in Academic Years). (b) Percentage of Participants Showing Improvement in Each Status, Compared by Time Participating (in Academic Years).
Figure 14. (a) Improvement in Status Levels, Compared by Time Participating (in Academic Years). (b) Effect of Program on Causing Improvement by Status, Compared by Time Participating (in Academic Years).
It would seem that there is some correlation between the time spent involved and the average level of increase; however, this cannot be stated definitively for a number of reasons. First, it appears there was some confusion related to responses in this category altogether due to the ambiguity between calendar years and academic years. Second, the inclusion and exclusion of outlier, erroneous and ambiguous data points appears to have a particular effect in this category with the exclusion of the individual indicating limited involvement bringing the average of the 0.5 year participants above that of the 1-year participants. Another data point (where the individual indicated agreement/agreement-strong agreement with the statements regarding impact but didn’t indicate skill improvement), if excluded, would raise the 1-year condition to 6.8, bringing the two back into stronger correlation.
Figure 13(b) shows the correlation between the amount of time participating and the percentage of individuals showing improvement in each category. The limited membership of several categories makes this graph very erratic. Figure 14(a,b) show the improvement in status levels, by category and attribution by category for each duration of participation condition. Again, the limited membership of some conditions makes both of these graphs somewhat erratic.
It would appear that there is a correlation between the duration that the participant has been involved and the level of benefit attained. However, possible ambiguity in the question and limited membership in certain conditions has made this not entirely certain. Refining this question will serve as an area of improvement for future work. Longitudinal tracking is also planned.
Figure 15. (a) Beginning Status Levels, Compared by Participant GPA. (b) Ending Status Levels, Compared by Participant GPA.

4.6. Comparison of Results by GPA

This section compares the various success indicators against participant GPA to determine whether any correlation exists. Figure 15(a,b) present the pre-participation and post-participation status levels, respectively. As expected, there does not appear to be any strong GPA-correlated bias towards or away from particular categories. Figure 16(a) shows the average aggregate improvement, which indicates a slight improvement correlating with increased GPA (5.67 vs. 6.2). Again, excluding the individual who indicated limited participation causes the 3.5–3.99 condition to overtake the 4.0 GPA condition (increasing it to 6.38). The other data point (where improvement is attributed but none is shown) is the sole member of the 3.0–3.49 condition, so its exclusion has no impact on the 4.0 versus 3.5–3.99 comparison. Figure 16(b) indicates that a higher percentage of individuals in the 4.0 condition experienced an increase in each category, as compared to the 3.5–3.99 condition. Excluding the aforementioned individual causes the 3.5–3.99 condition to overtake the 4.0 condition in one area (technical skills) and match it in another (excitement about space).
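The grouping behind this comparison (bucketing respondents by GPA band and averaging each band's aggregate improvement) can be sketched as below. All records are invented for illustration; they are not the study's data, and the band boundaries simply mirror the labels used in this section.

```python
# Minimal sketch: bucket respondents by GPA band, then average the
# aggregate improvement score within each band. Records are invented.
from collections import defaultdict

def gpa_band(gpa):
    """Map a GPA to the bands used in this section."""
    if gpa == 4.0:
        return "4.0"
    if gpa >= 3.5:
        return "3.5-3.99"
    return "3.0-3.49"

records = [
    {"gpa": 4.0, "improvement": 6.2},
    {"gpa": 3.7, "improvement": 5.9},
    {"gpa": 3.6, "improvement": 5.4},
    {"gpa": 3.2, "improvement": 6.0},
]

groups = defaultdict(list)
for r in records:
    groups[gpa_band(r["gpa"])].append(r["improvement"])

averages = {band: sum(v) / len(v) for band, v in groups.items()}
print(averages)
```

Note that a band with a single member (here, 3.0–3.49) yields an "average" of one observation, which is why the text discounts that condition's apparent results.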
Figure 16. (a) Average Aggregate Improvement, Compared by Participant GPA. (b) Percentage of Participants Showing Improvement in Each Status, Compared by Participant GPA.
Figure 17. (a) Improvement in Status Levels, Compared by Participant GPA. (b) Effect of Program on Causing Improvement by Status, Compared by Participant GPA.
The average improvement values shown in Figure 17(a) indicate that, when improvement occurred, the 3.5–3.99 condition experienced more improvement in all but one category (spacecraft design). The attribution responses shown in Figure 17(b) are mixed, with the 3.5–3.99 condition scoring higher in one category (technical skills) and the 4.0 condition scoring higher in the other two. The 3.0–3.49 condition outscores the other two conditions in two categories (and outscores the 3.5–3.99 condition in all three); however, as this condition has only a single member, there is insufficient evidence to consider this significant. Excluding the previously discussed data point does not affect these results.
From the aforementioned, there is insufficient evidence to conclude that GPA had any particular correlation with gaining value from the program, as the indicators conflicted. Moreover, in the areas where one condition significantly outperformed another, the result has no practical significance.
Figure 18. (a) Beginning Status Levels, Compared by Undergraduate Class Level. (b) Ending Status Levels, Compared by Undergraduate Class Level.
Figure 19. (a) Average Aggregate Improvement, Compared by Undergraduate Class Level. (b) Percentage of Participants Showing Improvement in Each Status, Compared by Class Level.

4.7. Comparison of Results by Undergraduate Class Level

The final area of consideration is whether a correlation exists between undergraduates' class level (freshman, sophomore, junior or senior) and results. Figure 18(a,b) show the initial and ending status levels. Figure 19(a,b) show a lack of progressive increase with class level in aggregate improvement and in the percentage of individuals improving in each category, respectively. Finally, Figure 20(a,b) show a lack of progressive correlation in the level of improvement experienced and in the attribution of improvement to the program, respectively.
Figure 20. (a) Improvement in Status Levels, Compared by Undergraduate Class Level. (b) Effect of Program on Causing Improvement by Status, Compared by Undergraduate Class Level.

5. Conclusions

This paper has presented an initial assessment of the OpenOrbiter Small Spacecraft Development Initiative at the University of North Dakota. It has demonstrated benefit from participation in all of the categories of learning objectives identified prior to program initiation (and several not explicitly identified). It has also shown strong correlations between the level of improvement and participation as a team lead, between the number of hours per week that individuals participated and average aggregate improvement, and between the duration of participation and improvement. No significant confounding correlation was shown between the level of improvement and graduate versus undergraduate status, participant GPA, or undergraduate class level (with conflicting indicators or a lack of progression shown).
Future work will include continuing the assessment activities to allow for better tracking of the correlation between the duration of participation and the benefit attained. Several participants noted other areas of benefit, including leadership, communications and teamwork experience, that will be assessed in future surveys. Work on the development of the spacecraft is ongoing and should lead to further opportunities for additional development work.

Acknowledgements

Small satellite development work at the University of North Dakota is or has been supported by the North Dakota Space Grant Consortium, North Dakota NASA EPSCoR, the University of North Dakota Faculty Research Seed Money Committee, North Dakota EPSCoR (NSF Grant # EPS-814442), the Department of Computer Science, the John D. Odegard School of Aerospace Sciences and the National Aeronautics and Space Administration.
The involvement of the numerous students from multiple disciplines in this project is gratefully acknowledged. Also, thanks are given to the numerous faculty mentors who have helped make this project possible.

Conflict of Interest

The authors declare no conflict of interest.
