Utilizing a Transdisciplinary (TD) Systems Engineering (SE) Process Model in the Concept Stage: A Case Study to Effectively Understand the Baseline Maturity for a TD SE Learning Program
Abstract
1. Introduction
2. Method Overview: TD SE Process Model
2.1. Method Step 1: Disciplinary Convergence—Creating a Collective Impact
2.2. Method Step 2: Transdisciplinary Collaboration
2.3. Method Step 3: Creating Collective Intelligence
2.4. Method Step 4: Transdisciplinary Research Integration
2.5. Method Step 5: Transdisciplinary Engineering Tools
2.6. Method Step 6: Analysis and Transdisciplinary Assessment
Current State, Gaps, and Proposed Modules per Key Attribute
- Assess the current modules concerning the proposed key attribute definitions and identify whether gaps exist.
- If a gap is identified and can be filled by amending a current module, this research will provide recommendations to current module owners.
- If a gap is identified, and the current modules cannot be amended, this research will fill the gap by proposing a new module meeting the key attributes for 21st-century engineering learning.
- If applicable, pilot the new proposed modules.
- Perform a quantitative analysis of current SE modules by scoring the maturity level per key attribute. If applicable, perform a quantitative analysis of pilot SE modules by scoring the maturity level per key attribute.
- Perform a qualitative analysis for feedback received for current modules. If applicable, perform a qualitative analysis for feedback received for pilot SE modules.
3. Case Study: TD SE Learning Program
3.1. Case Study Step 1: Disciplinary Convergence—Creating a Collective Impact
3.2. Case Study Step 2: TD Collaboration
3.3. Case Study Step 3: Collective Intelligence
3.3.1. Key Attributes Identified and Defined
3.3.2. Creating the Structural Self-Interaction Matrix (SSIM)
- If the relationship between factors was from i to j, a V was entered.
- If the relationship between factors was from j to i, an A was entered.
- If the relationship between factors was bidirectional, an X was entered.
- If there was no relationship between i and j, an O was entered.
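The V/A/X/O rules above can be sketched in code. The following is an illustrative conversion of SSIM symbols into the initial reachability matrix used in Interpretive Structural Modeling; the function name and example entries are hypothetical, not the paper's data.

```python
# Hypothetical sketch: convert Structural Self-Interaction Matrix (SSIM) symbols
# into an initial reachability matrix for Interpretive Structural Modeling (ISM).
def ssim_to_reachability(symbols):
    """symbols[(i, j)] for i < j holds 'V', 'A', 'X', or 'O'.
    Returns an n x n 0/1 initial reachability matrix (diagonal = 1)."""
    n = max(max(pair) for pair in symbols) + 1
    m = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for (i, j), s in symbols.items():
        if s == 'V':              # relationship flows from i to j
            m[i][j] = 1
        elif s == 'A':            # relationship flows from j to i
            m[j][i] = 1
        elif s == 'X':            # bidirectional relationship
            m[i][j] = m[j][i] = 1
        # 'O': no relationship, both entries stay 0
    return m

# Example with three factors: 0 drives 1 ('V'), 2 drives 0 ('A'), 1 and 2 unrelated ('O')
m = ssim_to_reachability({(0, 1): 'V', (0, 2): 'A', (1, 2): 'O'})
print(m)  # [[1, 1, 0], [0, 1, 0], [1, 0, 1]]
```

In a full ISM workflow this initial matrix would then be closed transitively before deriving the digraph.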
3.4. Case Study Step 4: TD Research Integration
3.4.1. Control Group—Scoring Current Modules for Key Attributes
3.4.2. Control Group—Feedback per Key Attribute on Current Modules
3.5. Case Study Step 5: TD Engineering Tool
Interpretive Structural Modeling—Digraph and MICMAC from SSIM
- Cluster I, which holds autonomous factors, contains no factors in this study. Autonomous factors have low driving power and low dependence.
- Cluster II includes one factor (learning environment) that is dependent. Dependent factors have low driving power and high dependence.
- Cluster III includes three factors (instructor, assessment, and sustainability) that are linked. Linkage factors have high driving power and high dependence. Note: The instructor, assessment, and sustainability factors sit on the border of Clusters II and III, but because the digraph depicts linkages, these three attributes are placed in Cluster III [30].
- Cluster IV includes three factors (governance, content, and support) that are independent. Independent factors have high driving power and low dependence. These factors have a direct impact on the success of learning programs.
- The MICMAC indicates that the governance, content, and support learning attributes directly impact the success of learning programs. Therefore, in order of priority, the proposal team should address the governance, content, and support attributes for effective proposal generation. If time permits, the team should next address the instructor, assessment, and sustainability attributes; because these three are linked, they are equally important to address. Finally, the learning environment may be addressed; with its low driving power, this research shows it to be the lowest priority of the seven identified learning attributes.
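The MICMAC clustering described above follows directly from the reachability matrix: driving power is a factor's row sum and dependence is its column sum. The sketch below is illustrative only; the matrix, the factor names, and the half-of-n threshold are assumptions, not the study's data.

```python
# Hypothetical MICMAC sketch: classify factors into the four clusters from
# driving power (row sum) and dependence (column sum) of a reachability matrix.
def micmac(reach, names):
    n = len(reach)
    mid = n / 2  # a common threshold: half the number of factors
    out = {}
    for i, name in enumerate(names):
        driving = sum(reach[i])                    # row sum
        dependence = sum(row[i] for row in reach)  # column sum
        if driving <= mid and dependence <= mid:
            cluster = 'I (autonomous)'
        elif driving <= mid:
            cluster = 'II (dependent)'
        elif dependence <= mid:
            cluster = 'IV (independent)'
        else:
            cluster = 'III (linkage)'
        out[name] = (driving, dependence, cluster)
    return out

# Illustrative 4-factor reachability matrix (not the paper's data)
reach = [[1, 1, 1, 1],
         [0, 1, 0, 1],
         [0, 1, 1, 1],
         [0, 0, 0, 1]]
for name, row in micmac(reach, ['governance', 'content', 'support', 'environment']).items():
    print(name, row)
```

High-driving-power factors (Cluster IV) surface first in such an analysis, which mirrors the prioritization argument made above.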
3.6. Case Study Step 6: Analysis and TD Assessment of Learning Program
Current State, Gaps, and Proposed State of Key Attributes
4. Pilot Program
4.1. Pilot Program—Definition
- The pilot began by meeting with the SE council, a group of SE SMEs who represent and harmonize the corporation’s businesses to advance systems engineering, to agree on an SE learning vision. As proposed by the lead author of this paper, the council accepted the learning framework defined in INCOSE’s Systems Engineering Vision 2035 [31] as the SE learning vision (Table 7). The rationale for following INCOSE’s learning framework is its broad range of skillsets, including the core competencies, professional skills, technical skills, management competencies, and integrating skills required for systems engineering success in industry.
- The SE council members agreed that the INCOSE learning framework would define the SE learning content, and the framework would be utilized to identify content gaps in industry learning.
- The council agreed that the seven attributes as defined in this research, with a special emphasis on the high-driving-power attributes of governance, content, and support, would be utilized to measure existing and future learning programs’ effectiveness.
INCOSE Learning Framework Competency Areas |
---|
Core SE Principles |
Professional Competencies |
Technical Competencies |
SE Management Competencies |
Integrating Competencies |
4.2. Pilot Program—Quantitative Analysis
4.3. Pilot Program—Qualitative Analysis
5. Closing
5.1. Discussion of TD SE Process Model to Propose a TD SE Learning Program
5.2. Final Remarks
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Madni, A.M. Transdisciplinary Systems Engineering; Springer: Berlin/Heidelberg, Germany, 2018; ISBN 978-3-319-62183-8.
- INCOSE. INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, 5th ed.; Wiley: Hoboken, NJ, USA, 2023.
- De Luce, D.; Dilanian, K. Can the U.S. and NATO Provide Ukraine with Enough Weapons? NBC News, 31 March 2022. Available online: https://www.nbcnews.com/politics/national-security/can-us-nato-provide-ukraine-enough-weapons-rcna22066 (accessed on 1 September 2022).
- Flores, M. Set-Based Concurrent Engineering (SBCE): Why Should You Be Interested? Lean Analytics Association, 2 December 2018. Available online: http://lean-analytics.org/set-based-concurrent-engineering-sbce-why-should-you-be-intereste (accessed on 1 September 2022).
- Schwaber, K. Agile Project Management with Scrum, 1st ed.; Microsoft Press: Redmond, WA, USA, 2004.
- Sols Rodriguez-Candela, A. Systems Engineering Theory and Practice, 1st ed.; Universidad Pontificia Comillas: Madrid, Spain, 2014.
- Ertas, A.; Maxwell, T.; Rainey, V.P.; Tanik, M.M. Transformation of higher education: The transdisciplinary approach in engineering. IEEE Trans. Educ. 2003, 46, 289–295.
- Ertas, A. Creating a Culture of Transdisciplinary Learning in STEAM Education for K-12 Students. Transdiscipl. J. Eng. Sci. 2022, 13, 233–244.
- Ertas, A.; Rohman, J.; Chillakanti, P.; Baturalp, T.B. Transdisciplinary Collaboration as a Vehicle for Collective Intelligence: A Case Study of Engineering Design Education. Int. J. Eng. Educ. 2015, 31, 1526–1536.
- Warfield, J. Structuring Complex Systems; Battelle Monograph No. 3; Battelle Memorial Institute: Columbus, OH, USA, 1974.
- Warfield, J. A Philosophy of Design; Document Prepared for the Integrated Design & Process Technology Conference; Wiley: Austin, TX, USA, 1995; p. 10.
- Ertas, A. Transdisciplinary Design Process; John Wiley & Sons: Hoboken, NJ, USA, 2018; pp. 9–13.
- Denning, P.J. Mastering the mess. Commun. ACM 2007, 50, 21–25.
- Rittel, H.W.J.; Webber, M.M. Dilemmas in a general theory of planning. Policy Sci. 1973, 4, 155–169.
- ISO/IEC/IEEE 15288:2023; Systems and Software Engineering—System Life Cycle Processes. The International Organization for Standardization, The International Electrotechnical Commission, and The Institute of Electrical and Electronics Engineers: Geneva, Switzerland, 2023.
- Sinakou, E.; Donche, V.; Boeve-de Pauw, J.; Van Petegem, P. Designing Powerful Learning Environments in Education for Sustainable Development: A Conceptual Framework. Sustainability 2019, 11, 5994.
- Hwang, S. Effects of Engineering Students’ Soft Skills and Empathy on Their Attitudes toward Curricula Integration. Educ. Sci. 2022, 12, 452.
- Hughes, A.J.; Love, T.S.; Dill, K. Characterizing Highly Effective Technology and Engineering Educators. Educ. Sci. 2023, 13, 560.
- Hartikainen, S.; Rintala, H.; Pylvas, L.; Nokelainen, P. The Concept of Active Learning and the Measurement of Learning Outcomes: A Review of Research in Engineering Higher Education. Educ. Sci. 2019, 9, 276.
- Van Bossuyt, D.L.; Beery, P.; O’Halloran, B.M.; Hernandez, A.; Paulo, E. The Naval Postgraduate School’s Department of Systems Engineering Approach to Mission Engineering Education through Capstone Projects. Systems 2019, 7, 38.
- Kans, M. A Study of Maintenance-Related Education in Swedish Engineering Programs. Educ. Sci. 2021, 11, 535.
- Watson, R. Interpretive Structural Modeling—A Useful Tool for Technology Assessment? Technol. Forecast. Soc. Chang. 1978, 11, 165–185.
- Ertas, A.; Gulbulak, U. Managing System Complexity through Integrated Transdisciplinary Design Tools; Atlas Publishing: San Diego, CA, USA, 2021; ISBN 978-0-9998733-1-1.
- Harary, F.; Norman, R.Z.; Cartwright, D. Structural Models: An Introduction to the Theory of Directed Graphs; Wiley: New York, NY, USA, 1965.
- Duperrin, J.C.; Godet, M. Méthode de Hiérarchisation des Éléments d’un Système; Rapport Économique du CEA R-45-51; CEA: Paris, France, 1973.
- Mandal, A.; Deshmukh, S.G. Vendor selection using interpretive structural modeling (ISM). Int. J. Oper. Prod. Manag. 1994, 14, 52–59.
- ATLAS. ATLAS ISM Tool. Available online: https://theatlas.org (accessed on 1 August 2023).
- Delbecq, A.L.; Van de Ven, A.H. A group process model for problem identification and program planning. J. Appl. Behav. Sci. 1971, 7, 466–491.
- McMillan, S.S.; King, M.; Tully, M.P. How to use the nominal group and Delphi techniques. Int. J. Clin. Pharm. 2016, 38, 655–662.
- Cearlock, D.B. Common Properties and Limitations of Some Structural Modeling Techniques. Ph.D. Thesis, University of Washington, Seattle, WA, USA, 1977.
- INCOSE. Systems Engineering Vision 2035; INCOSE, 2021. Available online: https://www.incose.org/docs/default-source/se-vision/incose-se-vision-2035.pdf?sfvrsn=e32063c7_10 (accessed on 1 August 2023).
- Galindo, I. The Formula for Successful Learning: Retention. Columbia Theological Seminary, 2022. Available online: https://www.ctsnet.edu/the-formula-for-sucessful-learning-retention/ (accessed on 1 December 2023).
- NASA Jet Propulsion Laboratory. Team X, 1995. Available online: https://jplteamx.jpl.nasa.gov (accessed on 1 December 2023).
Discipline | Discipline Description |
---|---|
Program Management | Responsible for the delivery of products and services to the customer within contract budget and schedule. |
Systems Engineering | Responsible for integrating relevant disciplines to deliver a technical solution that meets mission needs within cost and schedule. |
Mechanical Engineering | Responsible for mechanics and production of tools and machinery. |
Electrical Engineering | Responsible for electricity and electronics, from microscopic computer components to large power networks. |
Software Engineering | Responsible for the design, development, testing, and maintenance of software applications. |
Finance | Responsible for evaluating the earned value of products and services. |
Whole Life Services | Responsible for repairs, obsolescence planning, and sustainment of products and services. |
Mission Assurance | Responsible for the effective application of the organization’s quality management process [15]. |
Supply Chain | Responsible for raw materials and parts that are used for the manufacturing of products. |
Configuration Management | Responsible for managing system and system element configurations over the life cycle [15]. |
Data Management | Responsible for generation, obtainment, confirmation, transformation, retainment, retrieval, dissemination, and disposal of information to designated stakeholders [15]. |
Operations | Responsible for the administration of business activities and tasks. |
Human Resources | Responsible for staffing and retainment of employees. |
Topic | Topic Discussion Ideas |
---|---|
Program | A program is defined as a set of courses or modules. Discuss the current program for early-in-career engineers. |
Modulation | Modulation is the practice of packaging a single topic in a small course form; a program comprises two or more modules. |
Audience | The audience is a new college graduate, a new hire from another company, or an early-in-career employee who self-nominates. |
Learning Environment | Learning environments to discuss include in-person, remote, hybrid, and on-demand. Learning environment also describes the type of learning: behaviorism (teacher-led), liberationism (self-directed), constructivism (social), and connectivism (internet-based) learning [16]. |
Learning Objectives | The learning objectives describe what the participants should be able to accomplish because of the study. |
Content | The content is the resources used to develop the skills and knowledge. |
TD Content | Transdisciplinary content is the resources to develop the skills and knowledge from two or more disciplines [17]. |
Length of Study | The length of study includes 5–10-min videos, 30-min modules, and 60-min modules. |
Instructor Strategies | Instructor strategies include an instructor’s ability to connect with participants and utilize available technologies to make learning effective [18]. |
Assessments | Assessment activities include pre- and post-knowledge checks to ensure that objectives were met, as well as end-of-course surveys. |
Application of New Knowledge | The application of new knowledge discusses how the new knowledge will be applied to a current program or by completing a capstone project [19,20]. |
Office Hours | Office hours are defined as a time each week when SE discipline SMEs are available to answer questions from early-in-career employees. |
Networking | Networking activities include connecting engineers with other engineers, connecting discipline SMEs and technology SMEs, and quarterly discipline meetups. |
Sustainment | Sustainment activities include module owners having identified SMEs review content every two years to keep it current, and program owners analyzing feedback and making continual improvements to the program with each cohort or, at a minimum, annually [21]. |
Repository | Current modules are listed with revision numbers in a common repository. |
Effectivity | A shared measurement system across learning modules to assess learning effectiveness. |
ROI | ROI is the ability to effectively measure the business value of the learning program or module. |
Coaching | Coaching is the ability to listen and guide engineers on their career journey. |
Governance | Governance sets the vision for SE learning, assesses current content, and identifies learning gaps. |
Topic | Definition |
---|---|
Learning Environment | The learning environment includes options such as in-person, remote, hybrid, and on-demand. Types of learning, regardless of environment, include (1) learning from the instructor flowing information to the participants, (2) learning based on their own preferences of learning environment with self-assessment of gaps, (3) learning amongst colleagues, and (4) learning how to navigate the available resources and learn autonomously. Early-in-career engineers prefer in-person and on-demand learning environments. Early-in-career engineers prefer learning based on their own preferences of learning environment with self-assessment of gaps and learning amongst colleagues. |
Sustainment | Sustainment activities include the responsibility of module owners to have identified SMEs to review content every two years to keep current and program owners to analyze feedback and make continual improvements to the program with each cohort or, at a minimum, annually. |
Assessments | Assessments are utilized to ensure that learning objectives are met. Assessment activities include pre- and post-knowledge checks to measure learning effectiveness. End-of-the-course surveys will be utilized to gain additional insight from participants for module improvements. |
Instructor | Instructor strategies include an instructor’s ability to connect with participants and utilize available technologies to make learning effective. |
Support | Support is having SE SMEs actively measure learning program performance and effectiveness. Effectiveness is measured through multiple engagements with the participants, such as (1) pre- and post-knowledge checks, (2) end-of-the-course surveys, (3) SE SMEs offering office hours at a defined time each week to answer SE discipline questions, and (4) SE SMEs offering career coaching services for systems engineers. Feedback from the engagements will expose what the participants have learned or not learned; this may lead to revising learning content, polling questions, or potentially adding learning modules. |
Content | Content is the resources used to develop the skills and knowledge. Content will state learning objectives to describe what the participants should be able to accomplish because of the study. Content will be modulated to include one topic and tagged with a difficulty level, i.e., entry, intermediate, advanced. Early-in-career engineers prefer a length of study of 30 min or less and an accessible repository of learning modules for on-demand learning. |
Governance | Governance sets the vision for SE learning. SE councils define content and explore where content gaps exist for systems engineering. SE councils establish a shared measurement system for the learning program’s effectiveness. |
Modules | Governance | Content | Support | Sustainment | Assessments | Instructor | Learning Environment |
---|---|---|---|---|---|---|---|
Intro to SE | 1 | 1 | 1 | 1 | 2 | 2 | 2 |
Requirements | 1 | 2 | 1 | 1 | 2 | 2 | 2 |
Systems Analysis | 1 | 2 | 1 | 1 | 2 | 2 | 2 |
Trade Studies | 1 | 1 | 1 | 1 | 2 | 2 | 2 |
Modeling and Simulation | 1 | 2 | 1 | 1 | 2 | 2 | 2 |
Whole Life Services | 1 | 2 | 1 | 1 | 2 | 2 | 2 |
Technical Planning | 1 | 1 | 1 | 1 | 2 | 2 | 2 |
Critical Thinking Essentials | 1 | 4 | 1 | 1 | 2 | 2 | 2 |
Average Score | 1 | 1.87 | 1 | 1 | 2 | 2 | 2 |
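The "Average Score" row above is simply the per-attribute mean across the eight modules. The sketch below recomputes it from the scores as transcribed; note that the Content mean is 15/8 = 1.875, which the table shows truncated as 1.87.

```python
# Recompute the per-attribute average maturity scores from the table above.
# Scores are transcribed from the eight current-module rows.
scores = {
    'Governance':           [1, 1, 1, 1, 1, 1, 1, 1],
    'Content':              [1, 2, 2, 1, 2, 2, 1, 4],
    'Support':              [1, 1, 1, 1, 1, 1, 1, 1],
    'Sustainment':          [1, 1, 1, 1, 1, 1, 1, 1],
    'Assessments':          [2, 2, 2, 2, 2, 2, 2, 2],
    'Instructor':           [2, 2, 2, 2, 2, 2, 2, 2],
    'Learning Environment': [2, 2, 2, 2, 2, 2, 2, 2],
}
averages = {attr: sum(vals) / len(vals) for attr, vals in scores.items()}
print(averages)  # Content averages 1.875 (shown as 1.87 in the table)
```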
Key Attributes | Summary of Feedback |
---|---|
Learning Environment | The current SE modules scored 2 for learning environment. Current SE modules offer on-demand learning with no other learning environment options. The learning environment did not allow for self-assessment of gaps and learning amongst colleagues. |
Sustainment | The current SE modules scored 1 for sustainment because there was no process to keep content current. The program did not have SE SMEs identified to review content every two years. Program owners were not utilizing polling and surveys effectively to make continual improvements with each cohort or, at a minimum, annually. |
Assessments | The current SE modules scored 2 for assessments. Current SE modules needed SE SME support to identify content objectives accurately. Pre-knowledge checks did not exist. Post-knowledge checks existed but were not focused on understanding key objectives. End-of-the-course surveys were optional, and, from the surveys obtained, no module improvements were performed for the past seven years. |
Instructor | The current SE modules scored 2 for instructor. Current SE modules do not have instructors assigned as they are on-demand learning modules. Technology is being utilized for on-demand learning, but the presentation has an outdated user experience. |
Support | The current SE modules scored 1 for support because no SE SMEs were actively measuring learning program effectiveness. Polling questions and surveys were not adequate to measure program effectivity. Modules were not revised based on engagement with the participants. Engagement with participants would include knowledge checks, surveys, office hours, and coaching sessions. |
Content | The current SE modules that scored 1 for content were outdated and did not cover the basic concepts of the topic. The modules that scored 2 for content covered some basic concepts, but others were missing or incomplete. SMEs had not reviewed modules that scored 1 or 2 for content for over seven years. The Critical Thinking Essentials course scored 4 because the content was mature and, with SME support, could be used. |
Governance | The current SE modules scored 1 for governance because a vision for SE learning did not exist. During the NGT, it was discussed that SE councils should set the vision and define content while concurrently exploring whether content gaps exist for systems engineering. A need to establish a shared measurement system for the learning program’s effectiveness to indicate where the focus needs to shift to remain in compliance with the vision was also identified. |
Key Attributes | Current State of Attributes | Gaps | Proposed State of Attributes |
---|---|---|---|
Learning Environment | The current SE modules scored 2 for learning environment. Current SE modules offer on-demand learning with no other learning environment options. The learning environment did not allow for self-assessment of gaps and learning amongst colleagues. | Need to offer a wider array of learning environments so participants can choose based on their own preferences. Need to allow for a self-assessment of gaps. Need to allow learning amongst colleagues. | The learning environment includes options such as in-person, remote, hybrid, and on-demand. Types of learning, regardless of environment, include (1) learning from the instructor flowing information to the participants, (2) learning based on their own preferences of learning environment with self-assessment of gaps, (3) learning amongst colleagues, and (4) learning how to navigate the available resources and learn autonomously. Early-in-career engineers prefer in-person and on-demand learning environments. Early-in-career engineers prefer learning based on their own preferences of learning environment with self-assessment of gaps and learning amongst colleagues. |
Sustainment | The current SE modules scored 1 for sustainment because there was no process to keep content current. The program did not have SE SMEs identified to review content every two years. Program owners were not utilizing polling and surveys effectively to make continual improvements with each cohort or, at a minimum, annually. | Need a process to keep content current. Need SE SMEs identified to review content. Need SE SMEs to analyze assessments, measure program effectiveness, and make continual improvements. | Sustainment activities include the responsibility of module owners to have identified SMEs to review content every two years to keep current and program owners to analyze feedback and make continual improvements to the program with each cohort or, at a minimum, annually. |
Assessments | The current SE modules scored 2 for assessments. Current SE modules need SE SME support to identify content objectives accurately. Pre-knowledge checks do not exist. Post-knowledge checks existed but were not focused on understanding key objectives. End-of-the-course surveys were optional, and from the surveys obtained, no module improvements were performed for the past seven years. | Need SE SME support to identify content objectives. Need SE SME support to create and analyze pre- and post-knowledge checks, and end-of-the-course surveys. Need to require pre- and post-knowledge checks and end-of-the-course surveys. Need SE SME support to measure effectivity of the program and make continual improvements. | Assessments are utilized to ensure that learning objectives are met. Assessment activities include pre- and post-knowledge checks to measure learning effectiveness. End-of-the-course surveys will be utilized to gain additional insight from participants for module improvements. |
Instructor | The current SE modules scored 2 for instructor. Current SE modules do not have instructors assigned, as they are on-demand learning modules. Technology is being utilized for on-demand learning, but the presentation has an outdated user experience. | Need to identify instructors for in-person, hybrid, and remote learning modules. Need to improve the user experience of on-demand learning modules. | Instructor strategies include an instructor’s ability to connect with participants and utilize available technologies to make learning effective. |
Support | The current SE modules scored 1 for support because no SE SMEs were actively measuring learning program effectiveness. Polling questions and surveys were not adequate to measure program effectivity. Modules were not revised based on engagement with the participants. Engagement with participants would include knowledge checks, surveys, office hours, and coaching sessions. | Need SE SMEs to actively measure learning program effectiveness. Need SE SMEs to create and analyze pre- and post-knowledge checks and end-of-course surveys. Need SE SMEs to engage with participants including knowledge checks, surveys, office hours, and coaching services, and make continual improvements to SE learning program based on feedback. | Support is having SE SMEs actively measure learning program performance and effectiveness. Effectiveness is measured through multiple engagements with the participants, such as (1) pre- and post-knowledge checks, (2) end-of-the-course surveys, (3) SE SMEs offering office hours at a defined time each week to answer SE discipline questions, and (4) SE SMEs offering career coaching services for systems engineers. Feedback from the engagements will expose what the participants have learned or not learned; this may lead to revising learning content, polling questions, or potentially adding learning modules. |
Content | The current SE modules that scored 1 for content were outdated and did not cover the basic concepts of the topic. The modules that scored 2 for content covered some basic concepts, but others were missing or incomplete. SMEs had not reviewed modules that scored 1 or 2 for content for over seven years. The Critical Thinking Essentials course scored 4 because the content was mature and, with SME support, could be used. | Need updated content. Need a process in place to ensure content is kept current. Need to ensure modules have learning objectives described. Need to ensure modules include one topic that is 30 min or less in duration. Need to ensure modules are tagged with difficulty level, i.e., entry, intermediate, advanced. Need to create a repository of SE learning modules for on-demand learning. | Content is the resources used to develop the skills and knowledge. Use the published International Council on Systems Engineering (INCOSE) Handbook that is published every 3–5 years. Content will state learning objectives to describe what the participants should be able to accomplish because of the study. Content will be modulated to include one topic and tagged with difficulty level, i.e., entry, intermediate, advanced. Early-in-career engineers prefer a length of study of 30 min or less and an accessible repository of learning modules for on-demand learning. |
Governance | The current SE modules scored 1 for governance because a vision for SE learning did not exist. During the NGT, it was discussed that SE councils should set the vision and define content while concurrently exploring if content gaps exist for systems engineering. A need to establish a shared measurement system for the learning program’s effectiveness to indicate where the focus needs to shift to remain in compliance with the vision was also identified. | Need to declare governing body for SE learning vision, content definition, and exploration of gaps in existing content. Need to establish a shared measurement system to assess learning program effectiveness. | Governance sets the vision for SE learning. SE councils to set the vision and define content while concurrently exploring whether gaps exist for systems engineering. SE councils establish a shared measurement system to assess the learning program’s effectiveness. |
A Guide for System Life Cycle Processes and Activities |
---|
Ch 1: SE Introduction |
Ch 2: System Life Cycle Concepts, Models, and Processes |
Ch 3: Life Cycle Analyses and Methods |
Ch 4: Tailoring and Application Considerations |
Ch 5: Systems Engineering in Practice |
Session Name | Available Session Points | Pre-Knowledge % | Post-Knowledge % | Z-Score | p |
---|---|---|---|---|---|
Why INCOSE? | 90 | 80% | 97% | 3.483 | 0.001 *** |
INCOSE Certifications | 54 | 69% | 98% | 4.131 | <0.001 *** |
SE Discipline Articles | 54 | 83% | 100% | 3.133 | 0.003 ** |
pp. 1–13 | 140 | 74% | 87% | 2.726 | 0.010 ** |
pp. 14–27 | 42 | 48% | 81% | 3.188 | 0.002 ** |
pp. 28–41 | 180 | 56% | 84% | 5.881 | <0.001 *** |
pp. 42–55 | 80 | 70% | 74% | 0.528 | 0.357 |
pp. 56–69 | 50 | 70% | 94% | 3.123 | 0.003 ** |
pp. 70–83 | 50 | 84% | 98% | 2.446 | 0.020 * |
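The table above reports a Z-score and p-value per session, but the exact statistical test is not restated here. As a hedged illustration, a two-proportion z-test on the pre- and post-knowledge percentages (taking n as the available session points on each side, both assumptions) yields values of similar magnitude, e.g., about 3.57 for the first row versus the reported 3.483; treat this as a sketch, not the authors' computation.

```python
# Hypothetical reconstruction: pooled two-proportion z-test on pre/post percentages.
# n (available session points) is assumed to apply to both the pre and post sides.
from math import sqrt

def two_proportion_z(p_pre, p_post, n):
    pooled = (p_pre + p_post) / 2              # pooled proportion, equal n per side
    se = sqrt(pooled * (1 - pooled) * (2 / n)) # standard error of the difference
    return (p_post - p_pre) / se

# "Why INCOSE?" session: 90 available points, 80% pre vs. 97% post
print(round(two_proportion_z(0.80, 0.97, 90), 2))
```

A library routine such as statsmodels' `proportions_ztest` would be the more robust choice in practice.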
Session | Participants | Negative Responses | Neutral Responses | Positive Responses | Positive Response Rate |
---|---|---|---|---|---|
Why INCOSE? | 11 | 2 | 8 | 100 | 92% |
INCOSE HB—Session 1 | 10 | 5 | 14 | 81 | 81% |
INCOSE HB—Session 2 | 12 | 3 | 19 | 98 | 82% |
INCOSE HB—Session 3 | 10 | 3 | 12 | 85 | 85% |
Proposed Module | Governance | Content | Support | Sustainment | Assessments | Instructor | Learning Environment |
---|---|---|---|---|---|---|---|
Why INCOSE? | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
INCOSE certification | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
SE discipline articles | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
What is SE? | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Why is SE Important? | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Systems Concepts | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
SE Foundations | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
System Science and Systems Thinking | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Life Cycle Terms and Concepts | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Life Cycle Model Approaches | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
System Life Cycle Processes | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Average Scores | 5 | 4 | 5 | 5 | 5 | 5 | 4 |
Proposed Module | Feedback from Surveys | Improvement Action Taken |
---|---|---|
Why INCOSE? | | |
INCOSE certification | | |
SE discipline articles | | |
What is SE? | | |
Why is SE Important? | | |
Systems Concepts | | |
SE Foundations | | |
System Science and Systems Thinking | | |
Life Cycle Terms and Concepts | | |
Life Cycle Model Approaches | | |
System Life Cycle Processes | | |
TD SE Process Model Steps | Discussion Highlights |
---|---|
Disciplinary Convergence—Creating a Collective Impact | Including disciplines outside of engineering was questioned in the beginning, as engineering had not included other disciplines in the past. However, once the concept stage is complete, 70% of the life cycle costs are committed [2]; this classic SE statistic made the case for accepting inputs from all disciplines. Taking a step back and creating an SE learning vision as a group was the right thing to do; previously, no learning vision had been created. |
Transdisciplinary Collaboration | Discussions gave a clear understanding of the need for industry to have a learning program for early-in-career systems engineers. Industry learning programs help train engineers to be effective on programs. |
Collective Intelligence | The NGT was an effective way to identify and define key attributes and their relationships. |
TD Research Integration | Analyzing current SE modules against the identified key learning attributes gave a clear understanding that improvements were needed. |
TD Engineering Tools | The ISM digraph and MICMAC clarified the flow of attributes and those with high driving power. Focusing on the top three attributes simplified the problem. |
Analysis and TD Assessment | Identifying the gaps between the current state and the proposed state ensured that the pilot program development comprehensively addressed the key attributes. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ford, L.; Ertas, A. Utilizing a Transdisciplinary (TD) Systems Engineering (SE) Process Model in the Concept Stage: A Case Study to Effectively Understand the Baseline Maturity for a TD SE Learning Program. Systems 2024, 12, 13. https://doi.org/10.3390/systems12010013