Meta-Governance Framework to Guide the Establishment of Mass Collaborative Learning Communities
Abstract
1. Introduction
- Massive open online courses (MOOCs), which are free online courses that deliver learning content to anyone who wants to take them. In this model of learning, designed for large numbers of geographically dispersed students, participants can learn individually (through personal tasks, which is not really mass collaboration) and through community interactions (via interactive courses, a form of mass collaboration).
- World of Warcraft (WoW), which is a massively multiplayer online role-playing game that facilitates learning through gamification. WoW creates a community of players where participants can play with others in temporary groups. In this collaborative space, learning occurs when the learner needs and wants it. Therefore, in the context of problem-solving, there are opportunities to receive the answer (from other players or more experienced peers) to a question or obtain advice quickly.
- Scratch, which is a free programming language and online community where scratchers/learners can create their own interactive stories, games, and animations. Scratch promotes problem-solving skills, self-expression, collaboration, and creative teaching and learning. There is a discussion page with multiple forums mainly used for chatting, helping (with coding), creating, sharing, and learning together [11].
- The SAP community network (SCN), which is a community of software users, developers, consultants, mentors, and students who use the network to get help, share ideas, learn, innovate, and collaborate [12].
- The community of inquiry (CoI) framework, which is a community of learners and instructors who share a virtual space, a technology-reliant environment, rule-based interaction, and course-dependent learning objectives arising from the interplay of the social, cognitive, and teaching presences [13].
- The ePals global community, which is an example of social online learning that provides the necessary tools (platform) and meeting places to build a worldwide community of learners, global citizens who can share ideas, practice communication, and offer help and guidance [14].
2. Background Information
3. Research Method
3.1. Overall Research Method
- Problem identification and motivation: identifying the research problems and justifying the value of the solutions. As mentioned in the Introduction, this work faces several key challenges, such as the complexity of the topic and the scarcity of structured information in the literature about different aspects of MCL. Thus, this step contributes to (a) identifying the main components, features, and characteristics of MCL communities, (b) finding the structural and behavioral aspects of MCL communities, and (c) addressing potential indicators for assessing learning and performance.
- Definition of the objective of the solution: defining how the identified problems should be solved. As pointed out in the Introduction, this study strives to highlight the significance of supporting the processes of implementation, operation, and management of MCL communities. To understand, summarize, and synthesize the existing research, debates, and ideas around this body of knowledge, we conducted an in-depth literature review. This review provided a foundation of (general and specialized) knowledge on the topic.
- Design and development: designing an artifact (e.g., model, framework) that can solve the problems detailed above. In responding to the identified problems (addressed in the prior step) and research questions (listed in Section 3.2), we proposed the MGF-MCL framework (presented in Section 4). MGF-MCL is the main contribution of this work. The framework emanates from background knowledge and experiences, in combination with some relevant solutions reported in the literature (and mentioned in part in Section 2).
- Demonstration: finding a suitable context by which to demonstrate the usefulness of the artifact. To measure the success of MGF-MCL in supporting and directing the MCL communities, the framework was applied to three different case studies (presented in Section 5).
- Evaluation: evaluating and observing how well the artifact works. In order to evaluate how successful the MGF-MCL is in supporting and steering the target case studies, we used different assessment methods and processes. For example, through a three-phase evaluation process, we have assessed the acceptability, capability, and effectiveness of the framework in the case studies (see Figure 1).
- Communication: reporting the effectiveness of the proposed solution (framework). The inputs and outputs of this work are shared with others through publication.
3.2. Research Questions and Hypotheses
3.3. Search Strategy
3.4. Inclusion and Exclusion Criteria
3.5. Keyword Selection
- Organizational structure: “organizational structure of collaborative communities/networks/learning communities/learning initiatives”;
- Behavioral structure: “behavioral structure of collaborative communities/networks/learning communities/learning initiatives”;
- Endogenous elements: “endogenous elements of collaborative communities/networks/learning communities/learning initiatives”;
- Exogenous elements: “exogenous elements of collaborative communities/networks/learning communities/learning initiatives”;
- Learning assessment: “assessing learning process in collaborative communities/networks/learning communities/learning initiatives”;
- Performance assessment: “assessing the performance of collaborative communities/networks/learning communities/learning initiatives”;
- Quality of content assessment: “assessing the quality of contents in collaborative communities/networks/learning communities/learning initiatives”. (A sketch of how these search strings can be composed is given after this list.)
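The sketch below shows one way the above search strings could be composed programmatically. It assumes the slash notation expands to four community-type terms (collaborative communities, collaborative networks, learning communities, and learning initiatives); the helper itself is only an illustration and is not part of the review protocol.

```python
# Illustrative only: pair each topic phrase with each community-type term to
# obtain the quoted search strings listed above. The expansion of the slash
# notation into four terms is an assumption.
from itertools import product

TOPICS = [
    "organizational structure of",
    "behavioral structure of",
    "endogenous elements of",
    "exogenous elements of",
    "assessing learning process in",
    "assessing the performance of",
    "assessing the quality of contents in",
]

COMMUNITY_TERMS = [
    "collaborative communities",
    "collaborative networks",
    "learning communities",
    "learning initiatives",
]


def search_strings():
    """Yield every topic/community-term combination as a quoted search phrase."""
    for topic, term in product(TOPICS, COMMUNITY_TERMS):
        yield f'"{topic} {term}"'


if __name__ == "__main__":
    for query in search_strings():
        print(query)
```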
3.6. Categorization
3.7. Outcome Measures
4. Proposed Meta-Governance Framework
- A. Structure Part: refers to the type of structures that can be used for building, arranging, and organizing the MCL community. It comprises two types of structure:
- Organizational structure: outlining how certain activities are delegated toward achieving the goals of the MCL community.
- Behavioral structure: identifying the behavioral patterns and culture of the MCL community.
- B. Component Part: addresses the main internal and external components of the MCL community:
- Internal components: focusing on the identification of the main set of elements/properties that can together describe the internal environment of the MCL community.
- External components: focusing on the interaction of the MCL community with its external environment.
- C. Assessment Part: emphasizes the importance of community evaluation and, thus, provides a picture of community changes, both positive and negative. It consists of three assessment principles:
- Learning assessment indicators: addressing the main indicators that reflect the individual’s learning.
- Performance assessment indicators: highlighting the main indicators to assess the community performance.
- Assessing the quality of content: identifying potential methods to assess the quality of the content shared among individuals within the community. (An illustrative data-structure sketch of the three parts above is given after this list.)
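To make the three parts more concrete, the following is a minimal data-structure sketch of MGF-MCL; the class and field names are our own illustration based on the structure described above, not a schema defined by the framework.

```python
# Illustrative skeleton only: one possible way to represent the three parts of
# MGF-MCL in code. Names are ours; the framework defines the parts conceptually.
from dataclasses import dataclass, field
from typing import List


@dataclass
class StructurePart:
    organizational_structure: List[str] = field(default_factory=list)  # e.g., roles, governance bodies
    behavioral_structure: List[str] = field(default_factory=list)      # e.g., norms, cultural patterns


@dataclass
class ComponentPart:
    internal_components: List[str] = field(default_factory=list)  # endogenous elements
    external_components: List[str] = field(default_factory=list)  # exogenous elements


@dataclass
class AssessmentPart:
    learning_indicators: List[str] = field(default_factory=list)
    performance_indicators: List[str] = field(default_factory=list)
    content_quality_methods: List[str] = field(default_factory=list)


@dataclass
class MetaGovernanceFramework:
    structure: StructurePart
    components: ComponentPart
    assessment: AssessmentPart
```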
- Part 1—identifying the principal components necessary for supporting the implementation, operation, and development of MCL communities. This part focuses on three main tasks:
- Identifying the potential organizational and behavioral models, aiming at clarifying the community scaffold, scope, authority, and human resources.
- Identifying the main internal and external elements to raise awareness of the related concepts, environments, entities, relationships, and interactions.
- Identifying the potential assessment indicators and methods for gathering relevant information about positive and negative changes, such as each individual’s learning progress, community performance, and the distribution of low-quality materials within the community.
- Part 2—evaluating the identified components against the requirements, conditions, and objectives of a target MCL community. This means that when the meta-governance framework is adjusted for application to a specific MCL community, it then needs to be evaluated to ensure that it can meet the expectations and objectives of that community. The evaluation process proceeds in three main phases, namely evaluation of the acceptability, capability, and effectiveness of the framework. This process can help to appropriately measure, develop, and validate the framework for the target community. The evaluation can be carried out by internal members of the community or be outsourced. Depending on community strategies, various methods can be used for evaluation, such as the Delphi method (a minimal sketch of aggregating such an expert evaluation round is given below).
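As an illustration of this second part, the following is a minimal sketch of how one Delphi-style evaluation round covering the three phases (acceptability, capability, and effectiveness) might be aggregated. The 1–5 rating scale and the 70% agreement threshold are assumptions made for the example, not values prescribed by the framework.

```python
# Minimal sketch: aggregate one Delphi-style round of expert ratings for the
# three evaluation phases. The 1-5 scale and 70% consensus threshold are
# illustrative assumptions.
from statistics import mean

PHASES = ("acceptability", "capability", "effectiveness")


def consensus(ratings, agreement_level=4, threshold=0.7):
    """True if enough experts rate the phase at or above agreement_level."""
    agreeing = sum(1 for r in ratings if r >= agreement_level)
    return agreeing / len(ratings) >= threshold


def summarize_round(responses):
    """responses: dict mapping phase name -> list of expert ratings (1-5)."""
    return {
        phase: {
            "mean": round(mean(responses[phase]), 2),
            "consensus_reached": consensus(responses[phase]),
        }
        for phase in PHASES
    }


if __name__ == "__main__":
    example = {
        "acceptability": [4, 5, 4, 3, 4],
        "capability": [3, 3, 4, 2, 3],
        "effectiveness": [5, 4, 4, 4, 5],
    }
    print(summarize_round(example))
```

Phases that fail to reach consensus would typically be revised and re-rated in a subsequent round, in line with the iterative nature of the Delphi method.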
4.1. Structural and Behavioral Models
- Behavioral features at the individual level: these reveal the personal attributes (e.g., personal characteristics, skills, background information, and individual capacity) that have a significant effect on individual and community learning.
- Behavioral features at the community level: these focus on a set of community features and capabilities that play a key role in the performance and success of the community from the behavioral point of view.
4.2. Internal and External Elements
- Structural dimension—refers to actors (participants) in the community, and their roles and inter-personal relationships.
- Componential dimension—refers to those physical and tangible resources (e.g., equipment and technologies) and intangible resources (e.g., knowledge, copyrights, and patents) that are normally used in the community.
- Functional dimension—refers to all the related operations, processes, methods, procedures, and functions that are used in the community.
- Behavioral dimension—refers to all those principles, governance rules, policies, and cultural issues that drive the behavior of the participants and community.
- Market dimension—deals with interactions between the community and its clientele, competitors, and potential partners and sponsors. Part of this dimension handles related issues, such as the mission of the community, its value proposition, and joint identity.
- Support dimension—deals with the interaction between the community and those external support services (e.g., technical and financial services) that are mainly provided by third-party entities.
- Societal dimension—deals with general interactions (e.g., exchange, competition, and collaboration) and social actions (e.g., communication and volunteering) between the community and society (e.g., public and private organizations or other similar communities).
- Constituency dimension—deals with interactions between the community and its potential new members (e.g., attracting, recruiting, welcoming, directing, and encouraging).
- Community identity—defining how the community presents itself to the surrounding environment and showing the position of the community in the environment in which it interacts with others.
- Interaction parties—identifying the potential and interested entities (e.g., organizations, research centers, labs, and researchers) that the community can interact with.
- Interactions—listing the various transaction types that can take place between the community and its interlocutors (e.g., partners, collaborators, and rivals).
4.3. Assessment Approaches
- Assessment at the individual level—an approach to teaching and learning that generates feedback about the learners’ performance, progress, and/or achievements. It also indicates whether or not the learners have achieved learning outcomes or met the goals of their individualized programs [36].
- Assessment at the community level—a systematic process for analyzing the strengths and weaknesses of the community in relation to its performance, and the factors that affect performance. Additionally, assessment at this level can obtain valid information about the community’s processes, work environment, and organizational structure [37].
- Input indicators—focusing on resources that are needed for creating the learning program;
- Activity indicators—focusing on activities and operations that are used in the learning program;
- Output indicators—focusing on the expected outcomes/results that can be achieved by the learning program in the short and the long run; and
- Impact indicators—focusing on the learning program’s contribution to higher-level strategic plans. (An illustrative grouping of these four categories is shown after this list.)
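For illustration, the four indicator categories above can be kept as a simple lookup structure; the concrete metric names below are hypothetical examples rather than indicators prescribed by the framework.

```python
# Illustrative grouping of learning-program indicators into the four categories.
# The metric names are hypothetical examples.
INDICATOR_CATEGORIES = {
    "input": ["available budget", "tutor availability", "platform capacity"],
    "activity": ["courses delivered", "discussion threads opened"],
    "output": ["completion rate", "skills acquired"],
    "impact": ["contribution to strategic upskilling goals"],
}


def add_indicator(name, category):
    """Attach a new indicator to one of the four categories."""
    if category not in INDICATOR_CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    INDICATOR_CATEGORIES[category].append(name)
```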
- Step 0—sharing a piece of content: in this step, the community members share a piece of content with others in the community.
- Step 1—assessment by technology: in this step, the shared content will be preliminarily checked and then filtered by means of technology (e.g., an AI software filtering system). The checked content: (a) might be rejected when it is useless or is against community rules, (b) might be referred back to the learner when it needs some changes or modification, or (c) might be passed on to the next step, when it can proceed for further assessment.
- Step 2—assessment by moderator(s): in this step, the received content (e.g., controversial cases and suspicious content) from step 1 is checked by the moderator(s) for quality assurance (manually, by computer, or both). Similar to the previous step, the received content: (a) might be rejected, (b) might be referred back to the learner, or (c) might proceed for further assessment. In this case, the checked content will then be categorized according to pre-defined fields, classes, and topics. Afterward, the content is sent to the respective assessment level (Step 3). In cases where the checked content “is not controversial”, it should then be sent to the individual level. If the checked content “is controversial” (e.g., having ethical, cultural, or critical issues), it should preferably be sent to the community level. This is because it is assumed that the community—in comparison with individuals—is better placed to assess and make decisions about such controversial content.
- Step 3a—referral to the individual level: when the content is recognized as “not controversial”, it is sent by a moderator to the individual level for quality assessment.
- Assessment by ordinary participants at the individual level: they assess the content that is recognized as neither “controversial” nor “scientific and professional”. Here, the assessed content could be rejected, accepted, or referred to the community level (ordinary participants) for further assessment.
- Assessment by expert participants at the individual level: they assess the content that is found “not controversial”, but “scientific and professional”. The assessed content at this stage could be rejected, accepted, or referred to the community level (expert participants) for further assessment.
- Step 3b—referral to the community level: when the considered content is realized as “controversial”, it should be referred to the community level.
- Assessment by ordinary participants at the community level: they assess the content that is categorized as “controversial” but “not scientific and professional”.
- Assessment by expert participants at the community level: they assess the content that is grouped as “controversial” and “scientific and professional”. (A simplified sketch of this routing logic is given after this list.)
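The following is a simplified sketch of the routing logic described in Steps 1–3. The decision labels, flag names, and helper functions are our own rendering of the workflow, not an implementation defined in the paper.

```python
# Simplified sketch of the content-routing workflow (Steps 1-3). Decision
# outcomes and routing labels are illustrative.
from enum import Enum, auto


class Decision(Enum):
    REJECT = auto()   # useless or against community rules
    REVISE = auto()   # referred back to the learner for changes
    FORWARD = auto()  # passed on for further assessment


def route_assessment(controversial: bool, scientific_professional: bool) -> str:
    """Step 3: pick the assessor group from the two moderator-assigned flags."""
    level = "community" if controversial else "individual"   # Step 3b / Step 3a
    group = "expert" if scientific_professional else "ordinary"
    return f"{group} participants at the {level} level"


def process_content(content, tech_filter, moderator):
    """Steps 1-2: technology check, then moderator check, then routing."""
    tech_decision = tech_filter(content)  # Step 1: e.g., AI-based filtering
    if tech_decision is not Decision.FORWARD:
        return tech_decision

    mod_decision, controversial, scientific = moderator(content)  # Step 2
    if mod_decision is not Decision.FORWARD:
        return mod_decision

    return route_assessment(controversial, scientific)  # Step 3a / 3b
```

In this reading, the moderator supplies the two flags (controversial and scientific/professional) that jointly select the assessor group, mirroring the four cases listed under Steps 3a and 3b.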
5. Case Studies
5.1. Case Study 1
5.2. Case Study 2
- Objective A—creating a lifelong eLearning (LeL) platform for practitioners, one that is able to support collaboration and learning within large networks of participants:
- Function 1—the platform (typically video-based) provides an online collaborative environment, where universities, teachers, and students can share their content. This function can be supported by several parts of the meta-governance framework, such as structural and behavioral models, endogenous elements, and the content assessment method.
- Function 2—teachers can construct classes, deliver courses, upload videos, and assign and grade quizzes and homework assignments. This function can receive support from some parts of the meta-governance framework, such as learning and performance assessment.
- Function 3—students can collaborate on finding courses and consuming course content. This function can be directed by some parts of the meta-governance framework, such as structural and behavioral models and endogenous elements.
- Function 4—the platform provides a built-in method for students to practice their new skills and receive feedback from teachers. This function can be guided by some parts of the meta-governance framework, such as the behavioral model.
- Function 5—the classes on the platform are self-paced, either in part or in full. This function can be supported by some parts of the meta-governance framework, such as the structural model.
- Objective B—developing a learning and knowledge transfer framework, addressing MPQ (maintenance, production, and quality) skills for Industry 4.0:
- Function 1 (in maintenance engineering)—data acquisition regarding equipment, technologies, and functions. This function can be supported by some parts of the meta-governance framework, such as exogenous elements and the content assessment method.
- Function 2 (in production engineering)—a decision support system for the evaluation of continuous production plans. This function can be informed by certain parts of the meta-governance framework, such as performance assessment.
- Function 3 (in production engineering)—data analytics for business intelligence and value creation out of production data. This function can be steered by some parts of the meta-governance framework, such as the content assessment method.
- Function 4 (in quality engineering)—real-time or near-real-time quality control in manufacturing. This function can be supported by certain parts of the meta-governance framework, such as performance assessment.
5.3. Mass Collaborative Learning Illustration
6. Limitations
- The novelty of the MCL concept;
- The complexity of its underlying processes;
- The interdisciplinary nature of the method;
- The fact that the organizational structure and associated mechanisms of MCL are still evolving;
- The still insufficient evidence regarding the successful application of MCL in various fields;
- The challenges of dealing with the diffusion of fake or low-quality information/knowledge;
- The fact that the process of MC in the learning community is not yet clearly formalized; and
- The fact that there are some ambiguities regarding the process of stimulating people to join the community and keeping them motivated to contribute.
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Cordasco, F. A Brief History of Education: A Handbook of Information on Greek, Roman, Medieval, Renaissance, and Modern Educational Practice; No. 67; Rowman & Littlefield: Lanham, MD, USA, 1976; 192p. [Google Scholar]
- Mystakidis, S. Motivation Enhancement Methods for Community Building in Extended Reality. In Augmented and Mixed Reality for Communities; Fisher, J.A., Ed.; CRC Press: Boca Raton, FL, USA, 2021; pp. 265–282. [Google Scholar]
- Mystakidis, S.; Berki, E.; Valtanen, J.-P. Deep and Meaningful E-Learning with Social Virtual Reality Environments in Higher Education: A Systematic Literature Review. Appl. Sci. 2021, 11, 2412. [Google Scholar] [CrossRef]
- Pellas, N.; Kazanidis, I.; Konstantinou, N.; Georgiou, G. Exploring the educational potential of three-dimensional multi-user virtual worlds for STEM education: A mixed-method systematic literature review. Educ. Inf. Technol. 2017, 22, 2235–2279. [Google Scholar] [CrossRef]
- Zamiri, M.; Camarinha-Matos, L.M. Mass Collaboration and Learning: Opportunities, Challenges, and Influential Factors. Appl. Sci. 2019, 9, 2620. [Google Scholar] [CrossRef] [Green Version]
- Cress, U.; Jeong, H.; Moskaliuk, J. Mass Collaboration as an Emerging Paradigm for Education? Theories, Cases, and Research Methods. In Mass Collaboration and Education; Cress, U., Moskaliuk, J., Jeong, H., Eds.; Computer-Supported Collaborative Learning Series; Springer: Cham, Switzerland, 2016; Volume 16. [Google Scholar] [CrossRef]
- Zamiri, M.; Camarinha-Matos, L.M. Learning through Mass Collaboration-Issues and Challenges. In Technological Innovation for Resilient Systems; Camarinha-Matos, L., Adu-Kankam, K., Julashokri, M., Eds.; DoCEIS 2018. IFIP Advances in Information and Communication Technology; Springer: Cham, Switzerland, 2018; Volume 521. [Google Scholar] [CrossRef]
- Zamiri, M.; Camarinha-Matos, L.M. A Mixed Method for Assessing the Reliability of Shared Knowledge in Mass Collaborative Learning Community. In Technological Innovation for Applied AI Systems; DoCEIS 2021. IFIP Advances in Information and Communication Technology; Springer: Cham, Switzerland, 2021; Volume 626. [Google Scholar] [CrossRef]
- Zamiri, M.; Marcelino-Jesus, E.; Calado, J.; Sarraipa, J.; Jardim-Goncalves, R. Knowledge Management in Research Collaboration Networks. In Proceedings of the International Conference on Industrial Engineering and Systems Management (IESM), Shanghai, China, 25–27 September 2019. [Google Scholar] [CrossRef]
- Zamiri, M.; Camarinha-Matos, L.M. Organizational Structure for Mass Collaboration and Learning. In Technological Innovation for Industry and Service Systems; DoCEIS 2019. IFIP Advances in Information and Communication Technology; Springer: Cham, Switzerland, 2019; Volume 553, pp. 14–23. [Google Scholar] [CrossRef]
- Shapiro, R.B. Toward Participatory Discovery Networks: A Critique of Current Mass Collaboration Environments and a Possible Learning-Rich Future. In Mass Collaboration and Education; Cress, U., Moskaliuk, J., Jeong, H., Eds.; Computer-Supported Collaborative Learning Series; Springer: Cham, Switzerland, 2016; Volume 16. [Google Scholar] [CrossRef]
- Fischer, G. Exploring, Understanding, and Designing Innovative Socio-Technical Environments for Fostering and Supporting Mass Collaboration. In Mass Collaboration and Education; Cress, U., Moskaliuk, J., Jeong, H., Eds.; Computer-Supported Collaborative Learning Series; Springer: Cham, Switzerland, 2016; Volume 16. [Google Scholar] [CrossRef]
- Garrison, D.R.; Anderson, T.; Archer, W. The first decade of the community of inquiry framework: A retrospective. Internet High. Educ. 2010, 13, 5–9. [Google Scholar] [CrossRef]
- Dwyer, B. Teaching and Learning in the Global Village: Connect, Create, Collaborate, and Communicate. Read. Teach. 2016, 70, 131–136. [Google Scholar] [CrossRef]
- Zamiri, M.; Camarinha-Matos, L.M. Towards a Reference Model for Mass Collaborative Learning. In Technological Innovation for Life Improvement; DoCEIS 2020, IFIP AICT.; Springer: Cham, Switzerland, 2020; Volume 577, pp. 18–30. [Google Scholar] [CrossRef]
- Illich, I. Deschooling Society; Harper and Row: New York, NY, USA, 1971. [Google Scholar]
- Zamiri, M.; Mohamed, S.; Baqutayan, S. Are We Willing to Share Knowledge Through Computer. Int. Res. J. Comput. Sci. Eng. Appl. 2012, 1, 8–13. [Google Scholar] [CrossRef]
- Corell-Almuzara, A.; López-Belmonte, J.; Marín-Marín, J.-A.; Moreno-Guerrero, A.-J. COVID-19 in the Field of Education: State of the Art. Sustainability 2021, 13, 5452. [Google Scholar] [CrossRef]
- Camarinha-Matos, L.; Afsarmanesh, H. A comprehensive modeling framework for collaborative networked organizations. J. Intell. Manuf. 2007, 18, 529–542. [Google Scholar] [CrossRef]
- Milagres, R.; Silva, S.; Rezende, R. Collaborative Governance: The Coordination of Governance Networks. Rev. de Adm. FACES J. 2019, 18, 103–120. [Google Scholar] [CrossRef]
- Emerson, K.; Nabatchi, T.; Balogh, S. An integrative framework for collaborative governance. J. Public Adm. Res. Theory 2012, 22, 1–29. [Google Scholar] [CrossRef] [Green Version]
- Filatotchev, I.; Nakajima, G. Internal and External Corporate Governance: An Interface between an Organization and its Environment. Br. J. Manag. 2010, 21, 591–606. [Google Scholar] [CrossRef]
- Xing, W.; Wadholm, R.; Petakovic, E.; Goggins, S. Group Learning Assessment: Developing a Theory-Informed Analytics. Educ. Technol. Soc. 2015, 18, 110–128. [Google Scholar]
- Domingo-Segovia, J.; Rosel, B.R.R.; Rodríguez-Fernández, S.; Bolívar, A. Professional Learning Community Assessment-Revised (PLCA-R) questionnaire: Translation and validation in Spanish context. Learn. Environ. Res. 2020, 23, 347–367. [Google Scholar] [CrossRef]
- Meijer, H.; Hoekstra, R.; Brouwer, J.; Strijbos, J. Unfolding collaborative learning assessment literacy: A reflection on current assessment methods in higher education. Assess. Eval. High. Educ. 2020, 45, 1222–1240. [Google Scholar] [CrossRef]
- Summers, J.; Gorin, J.; Beretvas, S.; Svinicki, M. Evaluating Collaborative Learning and Community. J. Exp. Educ. 2005, 73, 165–188. [Google Scholar] [CrossRef]
- Strijbos, J. Assessment of (Computer-Supported) Collaborative Learning. IEEE Trans. Learn. Technol. 2011, 4, 59–73. [Google Scholar] [CrossRef] [Green Version]
- Cen, L.; Ruta, D.; Powell, L.; Hirsch, B.; Ng, J. Quantitative approach to collaborative learning: Performance prediction, individual assessment, and group composition. Int. J. Comput.-Supported Collab. Learn. 2016, 11, 187–225. [Google Scholar] [CrossRef]
- Cooks, L.; Scharrer, E. Assessing Learning in Community Service Learning: A Social Approach. Mich. J. Community Serv. Learn. 2006, 13, 44–55. [Google Scholar]
- Klein, S.; Benjamin, R.; Shavelson, R.; Bolus, R. The Collegiate Learning Assessment: Facts and Fantasies. Eval. Rev. 2007, 31, 415–439. [Google Scholar] [CrossRef]
- Peffers, K.; Tuunanen, T.; Gengler, C.E.; Rossi, M.; Hui, W.; Virtanen, V.; Bragge, J. The Design Science Research Process: A Model for Producing and Presenting Information Systems Research. In Proceedings of the First International Conference on Design Science Research in Information Systems and Technology, Claremont, CA, USA, 24–25 February 2006; pp. 16–83. [Google Scholar]
- Brocke, J.V.; Hevner, A.; Maedche, A. Introduction to Design Science Research. In Design Science Research. Cases; vom Brocke, J., Hevner, A., Maedche, A., Eds.; Springer: Cham, Switzerland, 2020. [Google Scholar] [CrossRef]
- Soares, A.L.; De Sousa, J.P.; Barbedo, F. Modeling the Structure of Collaborative Networks: Some Contributions. In Processes and Foundations for Virtual Organizations; Camarinha-Matos, L.M., Afsarmanesh, H., Eds.; PRO-VE 2003. IFIP—The International Federation for Information Processing; Springer: Boston, MA, USA, 2004; Volume 134. [Google Scholar] [CrossRef] [Green Version]
- Camarinha-Matos, L.M. Advances in Collaborative Networked Organizations. Innovation in Manufacturing Networks; Azevedo, A., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 3–16. [Google Scholar] [CrossRef] [Green Version]
- Camarinha-Matos, L.M.; Afsarmanesh, H.; Ermilova, E.; Ferrada, F.; Klen, A.; Jarimo, T. ARCON reference models for collaborative networks. In Collaborative Network: Reference Modeling; Springer: Berlin/Heidelberg, Germany, 2008; pp. 83–112. [Google Scholar]
- Maki, P.L. Developing an assessment plan to learn about student learning. J. Acad. Librariansh. 2002, 28, 8–13. [Google Scholar] [CrossRef]
- Ferreira, R.P.; Silva, J.N.; Strauhs, F.R.; Soares, A.L. Performance Management in Collaborative Networks: A Methodological Proposal. J. Univers. Comput. Sci. 2011, 17, 1412–1429. [Google Scholar] [CrossRef]
- Habernal, I.; Daxenberger, J.; Gurevych, I. Mass Collaboration on the Web: Textual Content Analysis by Means of Natural Language Processing. In Mass Collaboration and Education. Computer-Supported Collaborative Learning Series; Cress, U., Moskaliuk, J., Jeong, H., Eds.; Springer: Cham, Switzerland, 2016; Volume 16. [Google Scholar] [CrossRef] [Green Version]
- Thota, A.; Tilak, P.; Ahluwalia, A.; Lohia, N. Fake News Detection: A Deep Learning Approach. J. SMU Data Sci. Rev. 2018, 1, 10. [Google Scholar]
- Zhang, Y.; Sun, Y.; Xie, B. Quality of Health Information for Consumers on the Web: A Systematic Review of Indicators, Criteria, Tools, and Evaluation Results. J. Assoc. Inf. Sci. Technol. 2015, 66, 2071–2084. [Google Scholar] [CrossRef]
- Ed-en hub. Available online: https://edenhub.eu/index.php/about/ (accessed on 20 October 2021).
- Enhance. Available online: http://eplus-enhance.eu/ (accessed on 20 October 2021).
- Zamiri, M.; Ferreira, J.; Sarraipa, J.; Sassanelli, C.; Gusmeroli, S.; Goncalves, R.J. Towards A Conceptual Framework for Developing Sustainable Digital Innovation Hubs. In Proceedings of the 27th ICE/IEEE International Conference on Engineering, Technology, and Innovation (ICE/ITMC), Cardiff, UK, 21–23 June 2021. in press. [Google Scholar]
- Alves, P.M.S. Glofood–Knowledge Sharing in a Community. Master’s Thesis, Department of Electrical and Computer Engineering, NOVA University Lisbon, Lisbon, Portugal, September 2021. [Google Scholar]
General Structural Model for MCL Communities

| Joining Mechanisms | Roles | Governance | Knowledge Management (KM) |
|---|---|---|---|
General Behavioral Model for MCL Communities

| Individual Level | Community Level |
|---|---|
Endogenous Elements for MCL

| Structural Dimension | Componential Dimension | Functional Dimension | Behavioral Dimension |
|---|---|---|---|
| Participants | Resources | Processes | Governance Model |
Exogenous Elements for MCL

| Market Dimension | Support Dimension | Societal Dimension | Constituency Dimension |
|---|---|---|---|
| Community Identity Level | | | |
| Community Mission | Community’s Social Nature | Community Legal Status | Attracting and Recruiting Strategies |
| Interaction Parties Level | | | |
| Potential Customers/Clients | Financial Entities | Governmental Organizations | Organizations |
| Interaction Level | | | |
| Customer/Client Interactions | Support/Service Acquisition | Political Relations | Member Searching |
Assessment Indicators for MCL Communities

| Indicators for Individual Performance Assessment | Indicators for Community Performance Assessment |
|---|---|
| Input Indicators: Having good socio-economic conditions | |