
Towards a Reflexive Framework for Fostering Co-Learning and Improvement of Transdisciplinary Collaboration

Department of Agriculture, Food and Environment, University of Pisa, via del Borghetto 80, 56124 Pisa, Italy
PRAC-Policy Research & Consultancy, Im Hopfengarten 19b, D-65812 Bad Soden a.Ts., Germany
Ruralis–Institute for Rural and Regional Research, Universitetssenteret Dragvoll, N-7491 Trondheim, Norway
Countryside and Community Research Institute, University of Gloucestershire, Swindon Road, Cheltenham, Gloucestershire GL50 4AZ, UK
Rural Sociology Group, Wageningen University, Hollandseweg 1, 6706 KN Wageningen, The Netherlands
Author to whom correspondence should be addressed.
Sustainability 2019, 11(23), 6602;
Submission received: 6 October 2019 / Revised: 22 October 2019 / Accepted: 14 November 2019 / Published: 22 November 2019


Scholars in sustainability science as well as research funders increasingly recognize that a shift from disciplinary and interdisciplinary science to transdisciplinary (TD) research is required to address ever more complex sustainability challenges. Evidence shows that real-world societal problems are best addressed through collaborative research in which diverse actors contribute different kinds of knowledge. While the potential benefits of TD research are widely recognized, its implementation remains a challenge. In this article, we develop a framework that supports reflection and co-learning. Our approach fosters the monitoring of collaboration processes, helps to assess the progress made and encourages continuous reflection on, and improvement of, research processes. The TD co-learning framework has four dimensions and 44 criteria. It is based on a substantial literature review and was tested in ROBUST, a Horizon 2020-funded research project that applies experimental governance techniques to improve rural-urban relations in eleven European regions. The results demonstrate that the framework covers the key facets of TD collaboration and that all four broad dimensions matter. Each research-practice team reflected on how their collaboration was going and what needed to be improved, and the coordination team was able to see how well TD collaboration was functioning at the project level. We believe the framework will be valuable for actors involved in the planning and implementation of any type of multi-actor, interactive, innovation-, transformation- and action-oriented research project.

1. Introduction

The global challenges societies are facing are multidimensional, transcending disciplinary boundaries, multi-actor by nature and intertwined with a diverse and dynamic socio-political context [1,2,3,4,5]. Disciplinary and orthodox research methodologies have limited capacity to deal with complex challenges [3,6,7,8]. Sustainability science was established as a new field of research in the late 1990s to respond to the multidimensionality of challenges facing societies [4,9,10,11,12,13,14,15,16,17]. It was designed as a problem-driven and solution-oriented field [4], with the aim to consolidate knowledge and methodologies among “natural sciences, social sciences and humanities to create a new, holistic science” [8] (p. 2). It was also meant to be interdisciplinary.
This shift from disciplinary to interdisciplinary science is crucial to solving pressing global challenges, particularly those related to sustainability. Takeuchi (2014) goes further, arguing that “transcending the interdisciplinary […] [and] the academic […] is the direction in which we must take scholarship [because] society must be included within the grand arena of sustainability science; it must not be shut out” [8] (p. 3). Jahn et al. (2012) arrive at a similar conclusion, emphasizing the need to link scientific progress with societal needs [18]. Many other researchers support these views, adding that academic and practitioner knowledge needs to be brought together [7,12,18,19,20,21].
As a result, over the last 30 years, theory and methodology on TD research within sustainability science have been evolving and maturing, often derived from practical experience. Early examples of the shift towards TD approaches include TD research carried out from 1999 onward at the Institute for Social-Ecological Research (ISOE) in Frankfurt am Main, which led to major socio-ecological and transformation-oriented research programs funded by the Federal Ministry of Education and Research (BMBF) from 2000 onward [22,23], and the research and educational activities carried out from 2010 to 2018 at the Lund University Centre of Excellence for Integration of Social and Natural Dimensions of Sustainability (LUCID) [11].
The need to better link scientific progress with societal needs is also increasingly recognized by research funders. Interestingly, related research projects are rarely labelled explicitly as ‘transdisciplinary’ but rather as ‘interdisciplinary with stakeholder involvement’ (examples include 94 funded projects under the Rural Economy and Land Use Program in the UK and 12 projects under the Stadt-Land-Plus program on rural-urban relations in Germany) or ‘multi-actor.’ The latter term features in the European Commission’s (EC) Horizon 2020 program. By 2019, the European Union (EU) had allocated around one billion euros to fund around 180 multi-actor projects related to agriculture, forestry and rural development over the seven years of Horizon 2020 (2014–2020), more than 120 of which have already started. ‘Multi-actor projects’ are defined as “projects in which end users and multipliers of research results such as farmers and farmers’ groups, advisers, enterprises and others, are closely cooperating throughout the whole research project period” [24].
While the potential benefits of TD research are widely recognized and such projects are increasingly being funded, implementing TD research remains a major challenge [1,4,7,18,25,26,27,28]. Challenges include aligning different values and interests, kinds of knowledge and methods; a gap between how TD research is conceptualized, planned and implemented; managing multi-actor collaboration; generating impact at different scales; and institutional barriers [1,4,7,26,28,29,30,31,32,33]. The key question is how to overcome these challenges.
Mitchell and Ross, in Fam et al. (2017) [2], present a set of guidelines for improving TD research practice in which reflexivity and learning are brought to the fore (reflexivity is defined by Bolton (2010) as being aware of, and questioning, our own attitudes, thought processes, values, assumptions, prejudices and habitual actions; it leads to an awareness of the limits of our knowledge and of how our own behavior might disregard different views [34] (pp. 14–15)). Similarly, many researchers consider reflexivity, learning (individual learning; mutual, collaborative or co-learning; and social learning), or both to be key for the success of TD research [2,4,6,7,18,20,28,35,36,37,38]. In the literature, the terms ‘mutual learning’ and ‘co-learning’ are often used interchangeably. Mutual learning or co-learning is context-dependent and occurs through social interaction when partners with different expertise and knowledge collaborate in TD research [2]. Social learning is defined as “a change in understanding that goes beyond the individual to become situated within wider social units or communities of practice through social interactions between actors within social networks” [39]. Westberg and Polk (2016) also discuss different kinds of learning in sustainability science and the TD discourse [35]. We use ‘co-learning’ in this paper to capture these different notions of learning.
Westberg and Polk (2016) emphasize that learning and its connection with reflexivity still do not receive the attention they deserve in TD research, adding that assessing progress in joint learning requires monitoring [35]. The related debate is between more positivist, technical-rational models, which assume that objective assessments are possible, and positions that argue for the need to acknowledge greater complexity, uncertainty, subjectivity and context specificity. In the latter case, the approach needs to be more deliberative, with an emphasis on inclusiveness and on recognizing the validity of a wider range of voices and perspectives in defining legitimacy [2,4,18,19,37,40,41].
In light of the above, the aim of this paper is to develop an approach, or framework, that supports reflexive activities, facilitates learning and allows the monitoring of multi-actor research collaborations in order to assess the progress made and encourage continuous improvement. We particularly focus on TD collaboration as one of the key features of TD research. The main research questions are the following:
What broad dimensions determine TD collaboration?
What are appropriate criteria and related guiding questions that can support reflexivity and co-learning in TD collaboration?
What are the lessons learned from a first application of the framework in the EU-funded research project ROBUST on urban-rural relations?
In this paper, we first provide a brief description of the methodology used. This is then followed by a presentation of the TD co-learning framework with four dimensions that we elaborated based on a review of relevant studies. The four dimensions are presented in a tabular form operationalized with a total of 44 criteria, each with guiding questions. Some first lessons learned in applying the framework in the EU-funded research project ROBUST on urban-rural relations illustrate the effectiveness of the approach. We conclude the paper with a brief reflection on the importance of the four dimensions as well as a discussion of opportunities for wider application.

2. Methodology

2.1. Elaboration of the TD Co-Learning Framework Based on a Literature Review

In order to answer the first two research questions, a significant body of literature was examined. In the review, we used the Scopus, ScienceDirect and Mendeley databases. The keywords included ‘collaboration,’ ‘teamwork,’ ‘team science,’ ‘transdisciplinary,’ ‘participatory,’ ‘living lab,’ ‘evaluation,’ ‘reflection,’ ‘learning,’ ‘monitoring,’ ‘framework,’ ‘criteria’ and ‘indicators.’ Relevant grey literature was identified through supplementary searches; it comprised various reports and project documentation, as well as material published by the EC, in particular by the Directorate-General for Agriculture and Rural Development. The search stage was continuous, which allowed us to spot relevant publications that would not have been found through a single search procedure with a fixed set of keywords. The titles, abstracts and keywords of the retrieved articles were then scanned. Additional criteria were applied at this stage to further narrow down the list of literature. We checked whether the literature included either of the following elements, both highly relevant for TD research: (1) theoretical frameworks and concepts related to teamwork, collaboration, (meta-)cognition, learning, mutual or collaborative learning, social learning, knowledge integration, social innovation, reflexivity or reflection; (2) theoretical or methodological discussions related to participatory, interdisciplinary, multidisciplinary and cross-disciplinary approaches, including the monitoring of participatory processes, reflexive assessment, evaluation methods and living lab methodology. Living Labs (LLs) can be defined as “an arena (i.e., geographically or institutionally bounded spaces) and … an approach for intentional collaborative experimentation of researchers, citizens, companies and local governments” [42] (p. 2). This is also the definition adopted in the ROBUST project.
Overall, we selected around 100 publications and several pieces of grey literature for the review. During the elaboration of the framework, particular attention was paid to existing frameworks relevant to this paper [9,36,37,43,44,45,46,47,48]. Thematically, the selected literature spans sustainability science in broad terms and, more specifically, sustainable food systems; socio-ecological and social-ecological systems; rural development; land use; environmental management and urban-rural relations. We also included literature on public health, digitalization and innovation, as well as policy, to better capture the diverse nature of TD collaboration and related learning.
The material reviewed predominantly includes studies conducted within the last 30 years. According to Jahn et al. (2012), that is the period in which transdisciplinarity is acknowledged to have gained momentum: “It is widely acknowledged that transdisciplinarity gained its current popularity through the works of Funtowicz and Ravetz (1990, 1991, 1993) on post-normal science and further still through those of Gibbons et al. (1994) on a new mode of knowledge production (‘Mode 2’)” [18] (p. 4). The reviewed literature also encompasses various cultural settings and geographical locations, including research carried out in Europe, North and South America, Australia and New Zealand.
We believe that the above steps allowed us to base the development of the framework on a sufficiently comprehensive literature review. The resulting TD co-learning framework is presented, with relevant references, in Section 3.

2.2. Testing of the Framework in the EU-Funded Research Project ROBUST

Recognizing the potential benefits of TD research, the EU-funded research project “Rural Urban Outlooks: Unlocking Synergies (ROBUST)” was designed to be TD (for more information, see the project website). LLs are one way of working in a TD fashion, as they bring actors from science, policy and practice together. In ROBUST, eleven LLs located in different European regions take center stage. Each LL consists of a research and a practice partner team. Research partners are represented by universities, research institutes and consulting firms, while practice partners most often come from a municipal government or a regional authority overseeing regional development planning and policy. The research-practice partner teams in the LLs were brought together early in the project to collaborate on specific issues considered important for their region.
To enhance the LL approach, the project planning included a dedicated task to systematically monitor and reflect on TD collaboration processes. The task was also meant to facilitate necessary adjustments over the course of the project. A first result of this task is the TD co-learning framework presented in this article. Another important outcome will be a longitudinal dataset illustrating change in TD collaboration in the eleven LLs over the four-year project period. Three surveys are planned for this purpose: a baseline, a progress and a final survey, with about a year between each.
The baseline survey was designed based on the framework presented in this paper. The criteria were chosen and the questions formulated considering the current state of affairs in the project (e.g., what needs to be, and can already be, measured). The baseline survey was run via the online survey tool Qualtrics. It was implemented in April 2019 and included 30 questions. The survey was anonymous, with only three questions related to personal data: LL team, type of partner (research or practice) and disciplinary background. Out of a total of 79 partners in the LLs, 57 completed the survey, of which 32 were research and 25 practice partners (overall response rate: 72%).
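The participation figures reported above can be reproduced with a short calculation. The sketch below is purely illustrative; the function name and data structure are our own and are not part of the project's survey tooling:

```python
# Illustrative only: reproducing the reported baseline survey
# participation figures. The helper below is hypothetical, not
# part of the ROBUST project's tooling.

def summarize_participation(total_partners, completed_by_type):
    """Return the completed count and the overall response rate (percent)."""
    completed = sum(completed_by_type.values())
    response_rate = round(100 * completed / total_partners)
    return completed, response_rate

completed, rate = summarize_participation(
    total_partners=79,
    completed_by_type={"research": 32, "practice": 25},
)
print(f"{completed} of 79 completed ({rate}%)")  # 57 of 79 completed (72%)
```

Splitting the completed responses by partner type, as done here, is what later allows the comparison of research and practice partner views reported in Section 4.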
The baseline survey was deliberately implemented only when LLs had started tangible joint work. The survey serves as a benchmark against which to measure change and it was used to refine priorities and processes. The lessons learned from the first application of the framework are presented in Section 4.

3. TD Co-Learning Framework

In this section, we will present the framework. We will describe the general structure of the framework followed by a discussion of each of the four broad dimensions with the related criteria, literature references and guiding questions.

3.1. Structure and Use of the Framework

The TD co-learning framework is structured along four dimensions as the literature review showed that these dimensions are the most essential when assessing the functioning of participatory and TD research processes (see Appendix A Table A1). The framework consists of the following components:
  • Four dimensions: context, approach, process and outcomes;
  • 44 criteria with related literature references;
  • Guiding questions for each criterion.
To capture the plurality of values and perspectives of the actors involved, we followed the four principles proposed in [43] when selecting criteria:
  • relevance, social significance and applicability;
  • credibility, integration and reflexivity, added to traditional criteria of scientific rigor;
  • legitimacy, inclusion and fair representation of stakeholder interests;
  • effectiveness, that is, actual or potential contribution to problem solving and social change.
In order to harness the potential of the framework, we suggest translating the criteria into questions that make sense in the specific context. The questions provided in the tables in Section 3.2 are indicative. The final selection of the criteria, including the number of criteria a team intends to use to monitor and assess team collaboration and the formulation of questions depend on the precise goals and needs of a particular team or project.
The framework can also be used to track change over time, for example over the course of a project. The guiding questions can therefore be used as a reference point, but they need to be adapted to the particular project stage. For example, questions relevant at the beginning of a project can be selected for a baseline survey. During later project stages (e.g., with progress and final surveys), emphasis can shift towards outputs and impact.
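The dimension-criterion-question hierarchy described above, and the selection of stage-appropriate questions from it, can be pictured as a simple nested structure. The sketch below is a hypothetical illustration: the criteria and questions shown are invented examples, not the framework's actual 44 criteria.

```python
# Illustrative sketch: one possible representation of the framework's
# dimension -> criterion -> guiding-question hierarchy, allowing a team
# to select the questions relevant to a given project stage. All
# criteria and questions here are hypothetical examples.

FRAMEWORK = {
    "context": {
        "institutional setting": "Which institutional factors shape the collaboration?",
    },
    "approach": {
        "action orientation": "Is the research explicitly action-oriented?",
    },
    "process": {
        "trust": "How well established is trust within the team?",
    },
    "outcomes": {
        "practical relevance": "Which tangible outputs does the team expect?",
    },
}

def select_questions(framework, chosen_criteria):
    """Collect the guiding questions for the criteria a team has selected."""
    return {
        criterion: question
        for dimension in framework.values()
        for criterion, question in dimension.items()
        if criterion in chosen_criteria
    }

# E.g., a baseline survey might emphasize context- and process-related criteria:
baseline_questions = select_questions(FRAMEWORK, {"institutional setting", "trust"})
```

A progress or final survey would simply pass a different set of criteria (e.g., outcome-related ones), reflecting the shift in emphasis over the project stages described above.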
Collecting data at both individual and team levels is crucial, as a team view might not always reflect individual opinions [49]. Distinguishing between the two levels also provides interesting entry points for deeper analysis, for example by examining divergences in views. More importantly, questions posed at the team level might trigger valuable team discussions; questions can therefore be addressed to teams, to individuals or to both.
Table 1 provides a summary overview of the key issues covered with the framework.

3.2. The Four Broad Dimensions in Detail

3.2.1. Context

The setting in which a research process occurs comprises institutional, political, socioeconomic, environmental, historical and cultural factors [9,36,50,51]. Available resources and infrastructure, the degrees of freedom in research and so forth are all context-dependent. Hermans et al. (2011) emphasize that context factors might differ significantly from region to region, and that as a result, the same research process may yield different results [51]. Context also influences the other three dimensions in the framework: the approaches and methods used are tailored to project- and location-specific aims; TD processes change as well and all of this has a significant influence on outcomes and how they are being perceived and assessed (Table 2).

3.2.2. Approach

Approach is included in the framework because it plays a pivotal role in TD research processes: methodologies are selected based on the broad approach taken. A key question related to ‘approach’ that is of particular interest in this paper is whether a research project is action-oriented. Ramalingam, Wild and Buffardi (2019) consider monitoring, evaluation and learning crucial in TD research processes, emphasizing that all three should be continuous [40]. “Mutual readiness to reflect, to listen to each other’s views, interests, experiences” and “reflexive monitoring in action” as a practice-oriented approach are suggested for system innovation projects by van Mierlo et al. (2010) [41]. Others emphasize the importance of co-reflection and adaptive learning [37], as well as of continuous formative evaluation for TD research in sustainability science [4,40,54] (Table 3).

3.2.3. Process

TD research and cooperation among actors with different backgrounds mean multiple perspectives, diverse sets of skills and experiences and differences in the terminology used, which often also means different and sometimes conflicting goals. Co-learning, co-creation and co-production also mean working across disciplinary and sectoral boundaries. Professional and personal relationships can therefore play a major role in TD cooperation. The common ground essential for successful collaboration, as well as trust, needs to be built, and this takes time. It follows that the levels and types of stakeholder involvement, ownership and trust, appreciation and respect, the processes in place, leadership, management, decision-making and so forth are key issues (see, for example, [36]). In our framework, the process dimension encompasses these issues and, more generally, the way the cooperation is implemented, organized, managed and functioning (Table 4).

3.2.4. Outcomes

From the point of view of practice partners and stakeholders involved in TD research projects, expected outcomes are central to their motivation and engagement. Practice partners tend to engage whenever the aim is to jointly make a real difference and to arrive at results that are tangible, meaningful and applicable. In this respect, Theory of Change is helpful because it starts with the joint articulation of long-term goals, expected outcomes and impacts; subsequent steps identify the conditions for those goals to be met. Walter et al. (2007) refer to “long-term effects representing [the] goals of the TD project” [54]. In our framework, the outcomes dimension comprises intended and unintended outputs, effects, outcomes and impacts (Table 5).

4. Lessons Learned from a First Application of the Framework

In this section, we discuss some first lessons learned when testing the framework in the ROBUST project (see Section 2.2). Overall, the results of the baseline survey on the functioning of TD collaboration were predominantly positive. They, inter alia, indicated the following:
  • The partnership spirit is strong in almost all LL teams. This was manifested for example through the joint preparation of the Research and Innovation Agenda that is central for LL work (39% of respondents reported ‘joint drafting, sharing of preparation’ and 35% of respondents described it as ‘drafted by the research partner with significant feedback from the practice partner’).
  • A blend of different kinds of expertise in LLs is viewed as very positive for achieving LL goals, with 69% of respondents ranking it highly.
  • 77% of respondents feel that their personal contribution to the teamwork is valued (this is almost equally the case for research and practice partners).
  • The three personal benefits reported so far as the most important were: the development of new, valuable relationships (72% of respondents), the acquisition of new knowledge (61%) and the opportunity to address an important issue (61%).
As the main purpose of the TD co-learning framework presented in this paper is to continuously enhance research-practice collaboration, the main interest in its use lies in identifying the implementation challenges that need to be addressed. In the following, we therefore focus on the challenges encountered at both LL and project levels and briefly discuss their implications. We also explain how we adjusted the use of the framework to better meet the needs of the project. As reflexive components were rather new for most project partners, engaging them in this exercise was a first challenge. One common concern was that we would focus too much on collaboration processes instead of the “much more important” research content-related issues.
In the following, we present the main issues that have emerged so far. They were identified based on a combination of the analysis of the baseline survey data, continuous observations during project meetings and records of relevant email exchanges and Skype calls. The challenge of building trust will not be discussed, as it has meanwhile been broadly reported and confirmed by many scholars working in TD research.
Most of the challenges we discuss are closely interconnected.
Challenge 1.
Disciplinary and professional lenses can cause underestimation of the complexity of TD collaboration.
The way partners see TD collaboration is shaped by their disciplinary and professional perspective. By professional perspective we mean the epistemological positions in the disciplines they work in and the specificity of their work, such as established professional norms, routines, processes and methods. For example, colleagues who are used to working with quantitative methods and impact evaluation might not immediately recognize the value of a qualitative or discursive approach; the method might therefore seem less rigorous or inappropriate. A quote illustrates this view: “I believe that having serious discussions based on the survey done with five participants from a team is not a serious approach and base to work with.” This phenomenon was discussed by Mitchell and Ross [2] (p. 174): “what we take to be true is deeply connected to epistemological positions and these differ radically within and between disciplines. Can truth only be found in large statistically significant results? Or are people’s experiences and stories equally powerful sources of rich and meaningful data? Transdisciplinarity requires articulating between different forms of knowledge–talking across different ways of knowing, different forms of truth.”
In the approach we apply in ROBUST, no one needs to evaluate or is evaluated; instead, we aim to trigger a joint discussion of key issues, which is crucial for meeting project goals. Furthermore, some professionals work in disciplines, such as spatial planning and economics, that do not require much interdisciplinary collaboration; collaboration with diverse actors is therefore something new for them to embark on. Professional jargon hindering effective communication and mutual understanding is a closely related problem (in line with, for example, [30]). All of the aforementioned demonstrates that partners’ views can be limited and fragmented, which leads to the next issue.
Challenge 2.
Reflexive activities are ‘not needed,’ ‘not helpful’ and ‘too demanding.’
Some partners consider the investment of time and effort in monitoring and reflexive activities not worthwhile, important or helpful. In some instances, the big picture and the complexity of the project are not recognized. This inevitably means that monitoring and reflexivity are perceived as unnecessary and are seen instead as a ‘distraction’ from the main (content-related) project activities. A quote from one of the project partners illustrates this: “Our […] partner did not join the project to participate in the experiment of TD collaboration. They joined to address the real problems they are facing in the region.” Another partner commented: “We consider this exercise demanding for many project participants and teams. [It is not clear] how it can contribute to and support the assessment of project progress.” The implication for project planning and implementation is that monitoring and reflexivity components must not become cumbersome for partners. Their benefits need to be clearly communicated, and the related activities need to be carefully planned and efficiently implemented. For example, combining project reporting with monitoring and reflexive elements reduces the ‘burden’ and motivates partners.
Challenge 3.
Psychological barriers and comfort zones.
Another issue that emerged as rather stand-alone is the reluctance to talk openly about difficulties and tensions. Interestingly, this concerns both the initiation of open discussions within a specific team and communication with ‘outsiders,’ for example, the project coordination team or another LL team in the project. At the survey design stage, when we asked the project partners for feedback, one of the partners remarked: “Many of the questions are oriented at ‘problems’ and challenges encountered in the project. This might raise reluctance among participants to provide a sincere assessment.” In general, both research and practice partners took great care to avoid potentially conflictual discussions, sometimes at the cost of continuing with unresolved issues and, in some cases, causing frustration among individual team members. Unwillingness to reveal problems to ‘outsiders’ is a common psychological phenomenon [113]: people prefer to hide difficulties from others because they do not want to show weakness and be judged. The latter was a greater concern for research partners.
Challenge 4.
Institutional frameworks can limit the nature of collaboration.
The institutional framework partners are part of can greatly affect their room for maneuver in collaboration (see also [53]). For many practice partners, such as regional administrators or spatial planners, the agenda, the processes and the way of working can be rather fixed. Activities and work routines tend to be strictly regulated. Space for participatory activities, co-learning and collaboration is often very limited, and for some practice partners this is a challenge. The following quote illustrates this view: “We normally do not interact with the general public. The results of the LL process will feed into the planning procedure; debating the topic now with planners from the municipalities would be deviating and municipal politicians have other agendas. […] This is supposed to feed into a legal procedure and needs a high degree of formality.” In some ways, this issue overlaps with individuals’ perspectives, which reflect their particular professional environment. The same applies to research partners, who often cannot understand the restrictions and routines of the private and public sectors. Getting to know each other’s ‘rules of the game’ is a precondition for good TD collaboration.
Challenge 5.
Matching the interests and competences of partners with requirements.
Reconciling the interests of research and practice partners, and their capacity and expertise, with the requirements at project and LL levels is a major issue. Partners rate the importance of various outputs and outcomes differently. Problems are further aggravated by the evolving agendas of LL teams, where the interest of one partner might move towards an area in which the other partner is neither very interested nor very competent. The original idea in the project planning was to bring complementary competences and skills together when building LL teams. Our experience shows that the supply of and demand for competences are not always well matched, and sometimes team members are not even aware of the skills and expertise their colleagues possess. Mallaband et al. (2017) come to a similar conclusion, contending that sometimes the capacity, skills and expertise of particular team members are not clear to other members, who do not understand what these members can ‘bring to the table’: “in terms of the methods used, credibility of the data and conclusions and the ‘real world’ impact” [30] (pp. 12–13). Our own conclusion at this point is that even more attention should be paid during work planning to matching the interests and competences of partners more precisely. However, we acknowledge that even paying closer attention to this aspect does not guarantee success, as agendas and relationships evolve. As one of the project partners in ROBUST noted: “Partner selection is very much dictated by circumstances. When it comes to working with local governments and public authorities, policymakers agree to join the project but at the end of the day, civil servants are those involved; both types of actors are not always aligned and most importantly, the mandate of policymakers tends to be limited, while civil servants remain.”
Challenge 6.
Gap between TD theory and practice.
Our experience so far shows that there is a significant gap between the theory, methodology and guiding principles for TD research ‘on paper’ and the way it is implemented in practice. A statement from one of the project partners illustrates this: “These ideas that LLs [meaning TD research as well] will run on their own is just not true. LL work immediately ceases once researchers stop pushing it. These LLs are just too much effort, a lot of pressure is put onto the researchers’ shoulders to implement them without much results.” In theory, LLs are meant to be a joint effort in which all partners share responsibility—which implies a rather different mindset. One of the reasons for the gap between theory and practice is the rapid evolution of TD concepts and methodologies. The problem is aggravated by the continuing predominance of disciplinary perspectives and, on the side of practice partners, rather entrenched professional routines. The evolving and adaptive nature of TD research confuses many actors about what exactly should be done and how, compared to the rather clear goals and linear processes that actors are used to within their disciplinary and professional environments (see for example [114]).
The six challenges identified above are in line with the literature touched upon in the introduction. Even though the challenges we have encountered and illustrated are so far based only on a first application of the framework and on ongoing discussions and email exchanges at LL and project levels, they by and large correspond with those previously identified. The added value of applying the TD co-learning framework in our project was that it allowed us to identify issues early and to trigger much-needed discussions. Consistency with earlier research concerns challenges related to the integration of diverse knowledge, disciplinary perspectives and methods [26,28] (Challenge 1). The gap between how TD research is conceptualized and how it is implemented [26] was also identified as a major issue in ROBUST (Challenge 6). One of the challenges identified in the literature [31] was introducing reflexivity into TD collaboration, and our experience in ROBUST is in line with that (Challenge 2 and Challenge 3).
All six challenges listed above require continuous attention from the project coordination team. In order to motivate all partners to genuinely participate in monitoring and reflexive activities, a stepwise approach was elaborated. Most importantly, we explained more clearly in what ways the project is complex, why a reflexive approach is needed and why co-learning is important. Wherever possible, we referred to relevant literature and experiences. For example, it was helpful to refer to previous research projects that had ‘overlooked’ or ‘neglected’ co-learning processes and a reflexive approach and, as a result, performed worse than they could have. We also revised the baseline survey and implemented it in a way that was more inviting and less time-consuming. This was achieved by reducing the number of questions to the most essential ones, turning most of them into multiple-choice or Likert-scale questions and running the survey via a convenient online tool. Overall, the baseline survey allowed each LL team to reflect on how their collaboration is going and what needs to be improved. The project coordination team was in turn able to see a bigger picture of how well TD collaboration in the project is functioning.

5. Conclusions

Given the growing emphasis placed on TD research methods that deliver sustainability solutions to many of the wicked problems we now face, we have discussed in this article why it is important to include reflexive components in this type of research. We ascertained that, while the TD research approach and TD research projects are well intended, thorough and well-structured frameworks are needed to enable reflexivity and to show tangible improvements, especially for soft outcomes.
In this concluding section, we will first briefly reflect on the four dimensions and key issues (and criteria) that have been identified as elemental in TD collaboration. This will be followed by some thoughts on a wider application of the framework.

5.1. Importance of the Four Dimensions and Criteria for Fostering TD Collaboration

In this article, we synthesize and enhance earlier meta-level research and reviews, for example [9,26,38,39,43,48], and many other research-based articles. We present the results of our review in the form of a TD co-learning framework comprising four dimensions with 44 criteria. We think that the framework will support the monitoring of TD processes and will therefore ultimately enhance TD collaboration and (co-)learning.
The results of a first application of the approach in the ROBUST project show that the framework covers the key facets of TD collaboration and that all four broad dimensions matter. The four dimensions allowed us to obtain more complete insights into how well the TD teams are functioning. The key issues in TD collaboration were:
  • The different disciplinary backgrounds of the project partners, their level of engagement and the perceived relevance of the problems they are jointly addressing (context).
  • The perceived usefulness of complementary types of knowledge in addressing these problems, and, closely related, attitudes towards the reflexive methods applied (approach).
  • The nature and quality of collaboration in terms of reconciling different interests in teamwork, the functioning of knowledge exchange and co-learning, levels of motivation and mutual appreciation, leadership patterns, team-level management and decision-making (process).
  • The progress being made in relation to the aims of the team and project, including whether the aims are still achievable; things that can be improved; and the personal benefits received so far, as one of the main motivational factors for staying engaged (outcomes).

5.2. Towards a Wider Application of the Framework

TD collaboration potentially entails an enormous amount of learning among all actors involved across the science-policy-practice interface. The monitoring and reflexive components of the framework can capture this learning. We therefore expect that the framework will be of much broader relevance and will assist actors involved in the planning and implementation of TD research projects. We also believe that the framework will be valuable for any type of multi-actor, interactive, innovation, transformation and action-oriented research project. We think that the framework can be used most effectively in formative evaluation, including in a participatory manner.
The main strengths of the framework can be summarized as conceptual coherence, flexibility and functionality. The same qualities were also recognized as important for evaluating sustainability projects [115,116]. An important characteristic is that the TD co-learning framework is adaptable to the needs of those who will use it. The questions indicated in the framework are by no means prescriptive. Instead, they are meant to accommodate different needs in a project, for example when designing a survey, structuring an interview, facilitating a team discussion and/or joint reflection exercise, creating a reporting template and so forth.
We identified two main groups of actors that could benefit from using this approach:
  • Multi-actor teams that do not have the time or resources to develop their own framework for monitoring and reflexive activities. They can cherry-pick from the wider list of criteria what is needed in their situation, depending on project goals, scale, actors involved and so forth. To better understand the meaning of each specific indicator for project work, the related practical questions are juxtaposed. Some questions are suitable for designing a survey with more focus on quantitative data (Likert scale, multiple choice). Other questions can be used in targeted focus groups or workshops to trigger and guide participatory reflection.
  • Teams which have already elaborated a framework but seek to widen, adjust or improve it for their specific context, and teams which need immediate practical solutions. The framework can be applied in a wide range of sectors. Applications may include tracking stakeholder engagement or satisfaction surveys. In these cases, the list of criteria and questions is meant to inspire new ways of thinking or working.
The next step for us will be to use the framework to derive practical lessons learned from other TD research projects and initiatives in the area of agri-food, sustainable rural development and the knowledge-based bioeconomy.

Author Contributions

Conceptualization: M.K., supported by K.K. and F.G.; Literature review: M.K.; Development of methodology: M.K., supported by K.K., F.G. and D.M.; First application in ROBUST: M.K., supported by D.M. and J.S.C.W.; Overall lead, original draft preparation, writing & editing: M.K.


Funding

This research was funded by the ROBUST project, as part of the Horizon 2020 Framework Programme of the European Union under Grant Agreement No. 727988. The information and views set out in this publication are those of the authors and do not necessarily reflect the official opinion of the European Union.


Acknowledgments

We gratefully acknowledge the contribution of all ROBUST project partners and thank them for their cooperation in Task 3.4. We would also like to thank the anonymous reviewers of this and previous versions of the paper for their valuable comments. We also thank Joanne Spataro, an English language instructor at the University of Pisa, for a final language check of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses or interpretation of data; in the writing of the manuscript and in the decision to publish the results.

Appendix A

Table A1. Overview of the main literature examined to select the four broad dimensions for the framework (in alphabetical order).
Source | Thematic Focus | Context | Approach | Process | Outcomes
1. Benson et al., 2014 [117] | Participatory governance, management of water resources, evaluation, learning | | | X | X
2. Blackstock, Kelly, and Horsey, 2007 [9] | Participatory research for sustainability, evaluation, social learning | X | X 1 | X | X
3. Burgess and Chilvers, 2006 [50] | Participatory technology assessment, evaluation, new governance | X | | X | X
4. Carew and Wickson, 2010 [1] | TD research, research planning, supporting, evaluation | X | | X | X 2
5. Fam, Palmer, Riedy, and Mitchell, 2017 [2] | TD research, practice for sustainability, reflexivity, learning, governance | X | X | X | X
6. Hansson and Polk, 2018 [118] | TD research, sustainable urban development, evaluation | | | X | X
7. Hermans, Haarmann, and Dagevos, 2011 [51] | Stakeholder participation, monitoring regional sustainability, evaluation | X | | X | X
8. Holzer, Carmon, and Orenstein, 2018 [44] | TD research, socio-ecological systems, methodology, evaluation | | | X | X
9. Hubeau, Marchand, Coteur, Debruyne, and Van Huylenbroeck, 2018 [36] | TD research, agri-food systems, reflexive assessment, sustainability, transformation | X | X 3 | X | X
10. OECD, 2018 [46] | Triangular cooperation, value-added, monitoring, evaluation | X 4 | | X | X
11. Schuurman, De Marez, and Ballon, 2016 [74] | Living Labs, open innovation, impact, small and medium-sized enterprises | | X | | X
12. Siebenhüner, 2018 [119] | TD research for sustainability, knowledge integration, co-learning | | | X | X
13. Stokols, Harvey, Gress, Fuqua, and Phillips, 2005 [120] | TD research, collaboration, evaluation, tobacco use science and prevention | X 5 | | X | X
14. Taplin and Clark, 2012 [121] | Theory of Change, planning, monitoring, evaluation, outcomes, indicators | X | | X 6 | X
15. Van Geenhuizen, 2018 [47] | Living Labs, user-centered innovation, boundary spanning, evaluation | X 7 | X 8 | X | X
16. Veeckman, Schuurman, Leminen, and Westerlund, 2013 [48] | Living Labs, characteristics, outcomes, user-centered innovation, evaluation | X 9 | X | | X
17. Walker, Rahman, and Cave, 2001 [122] | Adaptive policies, policy analysis, policymaking, outcomes | X 10 | | | X
18. Walter, Helgenberger, Wiek, and Scholz, 2007 [54] | TD research, evaluation, knowledge integration, co-learning | | | X | X
19. Vogel, 2012 [123] | Theory of Change, monitoring, evaluation, international development | X | | X 11 | X
1 ‘Methods’ in Blackstock et al. (2007) [9]; 2 ‘Product’ in Carew and Wickson (2010) [1]; 3 ‘Methods’ in Hubeau et al. (2018) [36]; 4 ‘Activities’ in [46]; 5 ‘Antecedents’ in Stokols et al. (2005) [120]; 6 ‘Interventions’ in Taplin and Clark (2012) [121]; 7 ‘Exogenous influences’ and ‘Inputs’ in van Geenhuizen (2018) are included under ‘Context’. According to van Geenhuizen (2018), inputs include the motivation and capabilities of actors; sets of learning tools and models; specific expertise; financial budgets and other resources, as well as the real-life environment. Van Geenhuizen (2018) describes LL processes that are beyond the control of LL managers as ‘exogenous influences’ [47]; 8 ‘Inputs’ in van Geenhuizen (2018), who lists methods and tools of LL processes as ‘Inputs’ [47]; 9 ‘Environment’ in Veeckman et al. (2013) [48]; 10 ‘Stage setting’ in Walker et al. (2001) [122], which includes important objectives, constraints and policy options; 11 ‘Activities’ in Vogel (2012) [123].


  1. Carew, A.L.; Wickson, F. The TD Wheel: A heuristic to shape, support and evaluate transdisciplinary research. Futures 2010, 42, 1146–1155. [Google Scholar] [CrossRef]
  2. Fam, D.; Palmer, J.; Riedy, C.; Mitchell, C. Transdisciplinary Research and Practice for Sustainability Outcomes, 1st ed.; Fam, D., Palmer, J., Riedy, C., Mitchell, C., Eds.; Routledge: Abingdon, UK; New York, NY, USA, 2017; Available online: (accessed on 3 March 2019).
  3. Hadorn, G.H.; Bradley, D.; Pohl, C.; Rist, S.; Wiesmann, U. Implications of transdisciplinarity for sustainability research. Ecol. Econ. 2006, 60, 119–128. [Google Scholar] [CrossRef]
  4. Lang, D.J.; Wiek, A.; Bergmann, M.; Stauffacher, M.; Martens, P.; Moll, P.; Swilling, M.; Thomas, C.J. Transdisciplinary research in sustainability science: Practice, principles, and challenges. Sustain. Sci. 2012, 7, 25–43. [Google Scholar] [CrossRef]
  5. Lawrence, R.J. Beyond Disciplinary Confinement to Imaginative Transdisciplinarity. In Tackling Wicked Problems Through Transdisciplinary Imagination; Taylor & Francis: London, UK, 2010; pp. 16–30. [Google Scholar] [CrossRef]
  6. Mitchell, C.; Cordell, D.; Fam, D. Beginning at the end: The outcome spaces framework to guide purposive transdisciplinary research. Futures 2015, 65, 86–96. [Google Scholar] [CrossRef]
  7. Popa, F.; Guillermin, M.; Dedeurwaerdere, T. A pragmatist approach to transdisciplinarity in sustainability research: From complex systems theory to reflexive science. Futures 2015, 65, 45–56. [Google Scholar] [CrossRef]
  8. Takeuchi, K. The ideal form of transdisciplinary research as seen from the perspective of sustainability science, considering the future development of IATSS. IATSS Res. 2014, 38, 2–6. [Google Scholar] [CrossRef]
  9. Blackstock, K.; Kelly, G.; Horsey, B. Developing and applying a framework to evaluate participatory research for sustainability. Ecol. Econ. 2007, 60, 726–742. [Google Scholar] [CrossRef]
  10. Clark, W.C.; Dickson, N.M. Sustainability science: The emerging research program. Proc. Natl. Acad. Sci. USA 2003, 100, 8059–8061. [Google Scholar] [CrossRef]
  11. Jerneck, A.; Olsson, L.; Ness, B.; Anderberg, S.; Baier, M.; Clark, E.; Hickler, T.; Hornborg, A.; Kronsell, A.; Lövbrand, E.; et al. Structuring sustainability science. Sustain. Sci. 2011, 6, 69–82. [Google Scholar] [CrossRef]
  12. Kates, R.W.; Clark, W.C.; Corell, R.; Hall, J.M.; Jaeger, C.C.; Lowe, I.; McCarthy, J.J.; Schellnhuber, H.J.; Bolin, B.; Dickson, N.M.; et al. Environment and Development: Sustainability Science. Science 2001, 292, 641–642. [Google Scholar] [CrossRef]
  13. Komiyama, H.; Takeuchi, K. Sustainability science: Building a new discipline. Sustain. Sci. 2006, 1, 1–6. [Google Scholar] [CrossRef]
  14. Martens, P. Sustainability: Science or Fiction? IEEE Eng. Manag. Rev. 2007, 35, 70. [Google Scholar] [CrossRef]
  15. Swart, R.; Raskin, P.; Robinson, J. The problem of the future: Sustainability science and scenario analysis. Glob. Environ. Chang. 2004, 14, 137–146. [Google Scholar] [CrossRef]
  16. Wiek, A.; Ness, B.; Schweizer-Ries, P.; Brand, F.S.; Farioli, F. From complex systems analysis to transformational change: A comparative appraisal of sustainability science projects. Sustain. Sci. 2012, 7, 5–24. [Google Scholar] [CrossRef]
  17. Wiek, A.; Withycombe, L.; Redman, C.L. Key competencies in sustainability: A reference framework for academic program development. Sustain. Sci. 2011, 6, 203–218. [Google Scholar] [CrossRef]
  18. Jahn, T.; Bergmann, M.; Keil, F. Transdisciplinarity: Between mainstreaming and marginalization. Ecol. Econ. 2012, 79, 1–10. [Google Scholar] [CrossRef]
  19. Funtowicz, S.O.; Ravetz, J.R. Science for the post-normal age. Futures 1993, 25, 739–755. [Google Scholar] [CrossRef]
  20. Šūmane, S.; Kunda, I.; Knickel, K.; Strauss, A.; Tisenkopfs, T.; Rios, I.D.I.; Rivera, M.; Chebach, T.; Ashkenazy, A.; Carmenado, I.D.L.R. Local and farmers’ knowledge matters! How integrating informal and formal knowledge enhances sustainable and resilient agriculture. J. Rural. Stud. 2018, 59, 232–241. [Google Scholar] [CrossRef]
  21. Ziman, J. Is science losing its objectivity? Nature 1996, 382, 751–754. [Google Scholar] [CrossRef]
  22. BMBF. Sozial-ökologische Forschung: Förderkonzept für Eine Gesellschaftsbezogene Nachhaltigkeitsforschung 2015–2020. Bonn. 2015. Available online: (accessed on 29 August 2019).
  23. BMBF. Forschungsagenda Green Economy. Bonn. 2016. Available online: (accessed on 27 August 2019).
  24. Van Oost, I. The European Innovation Partnership (EIP) Agricultural Productivity and Sustainability: Speeding Up Innovation. In Proceedings of the “Added Value of Cooperation in Bioeconomy Research” International Bioeast Conference, Budapest, Hungary, 20 September 2017; Available online: (accessed on 14 March 2019).
  25. Bäckstrand, K. Civic Science for Sustainability: Reframing the Role of Experts, Policy-Makers and Citizens in Environmental Governance. Glob. Environ. Politics 2003, 3, 24–41. [Google Scholar] [CrossRef]
  26. Brandt, P.; Ernst, A.; Gralla, F.; Luederitz, C.; Lang, D.J.; Newig, J.; Reinert, F.; Abson, D.J.; Von Wehrden, H. A review of transdisciplinary research in sustainability science. Ecol. Econ. 2013, 92, 1–15. [Google Scholar] [CrossRef]
  27. Jakobsen, C.H.; Hels, T.; McLaughlin, W.J. Barriers and facilitators to integration among scientists in transdisciplinary landscape analyses: A cross-country comparison. For. Policy Econ. 2004, 6, 15–31. [Google Scholar] [CrossRef]
  28. Polk, M. Transdisciplinary co-production: Designing and testing a transdisciplinary research framework for societal problem solving. Futures 2015, 65, 110–122. [Google Scholar] [CrossRef]
  29. Maasen, S.; Lieven, O. Transdisciplinarity: A new mode of governing science? Sci. Public Policy 2006, 33, 399–410. [Google Scholar] [CrossRef]
  30. Mallaband, B.; Wood, G.; Buchanan, K.; Staddon, S.; Mogles, N.; Gabe-Thomas, E. The reality of cross-disciplinary energy research in the United Kingdom: A social science perspective. Energy Res. Soc. Sci. 2017, 25, 9–18. [Google Scholar] [CrossRef]
  31. McGregor, S.L. 4/22—Challenges of Transdisciplinary Collaboration: A Conceptual Literature Review. Integral Leadersh. Rev. 2017. [Google Scholar]
  32. Schoolman, E.D.; Guest, J.S.; Bush, K.F.; Bell, A.R. How interdisciplinary is sustainability research? Analyzing the structure of an emerging scientific field. Sustain. Sci. 2012, 7, 67–80. [Google Scholar] [CrossRef]
  33. Zscheischler, J.; Rogga, S. Transdisciplinarity in land use science—A review of concepts, empirical findings and current practices. Futures 2015, 65, 28–44. [Google Scholar] [CrossRef]
  34. Bolton, G. Reflection and Reflexivity: What and Why Reflective Practice: Writing and Professional Development; Bolton, G., Ed.; SAGE: London, UK, 2010. [Google Scholar]
  35. Westberg, L.; Polk, M. The role of learning in transdisciplinary research: Moving from a normative concept to an analytical tool through a practice-based approach. Sustain. Sci. 2016, 11, 385–397. [Google Scholar] [CrossRef]
  36. Hubeau, M.; Marchand, F.; Coteur, I.; Debruyne, L.; Van Huylenbroeck, G. A reflexive assessment of a regional initiative in the agri-food system to test whether and how it meets the premises of transdisciplinary research. Sustain. Sci. 2018, 13, 1137–1154. [Google Scholar] [CrossRef]
  37. Roux, D.J.; Stirzaker, R.J.; Breen, C.M.; Lefroy, E.; Cresswell, H.P.; Lefroy, E. Framework for participative reflection on the accomplishment of transdisciplinary research programs. Environ. Sci. Policy 2010, 13, 733–741. [Google Scholar] [CrossRef]
  38. Schauppenlehner-Kloyber, E.; Penker, M. Managing group processes in transdisciplinary future studies: How to facilitate social learning and capacity building for self-organised action towards sustainable urban development? Futures 2015, 65, 57–71. [Google Scholar] [CrossRef]
  39. Reed, M.S.; Evely, A.C.; Cundill, G.; Fazey, I.; Glass, J.; Laing, A.; Newig, J.; Parrish, B.; Prell, C.; Raymond, C.; et al. What is Social Learning? Ecol. Soc. 2010, 15, r1. [Google Scholar] [CrossRef]
  40. Ramalingam, B.; Wild, L.; Buffardi, A.L. Briefing Note Making Adaptive Rigour Work Principles and Practices for Strengthening Monitoring, Evaluation and Learning for Adaptive Management. 2019. Available online: (accessed on 3 July 2019).
  41. van Mierlo, B.; Regeer, B.; van Amstel, M.; Arkesteijn, M.C.M.; Beekman, V.; Bunders, J.F.G.; de Cock Buning, T.; Elzen, B.; Hoes, A.C.; Leeuwis, C. Reflexive Monitoring in Action: A Guide for Monitoring System Innovation Projects; Communication and Innovation Studies, WUR; Athena Institute, VU: Wageningen/Amsterdam, The Netherlands, 2010; Available online: (accessed on 9 March 2019).
  42. Voytenko, Y.; McCormick, K.; Evans, J.; Schliwa, G. Urban living labs for sustainability and low carbon cities in Europe: Towards a research agenda. J. Clean. Prod. 2016, 123, 45–54. [Google Scholar] [CrossRef] [Green Version]
  43. Belcher, B.M.; Rasmussen, K.E.; Kemshaw, M.R.; Zornes, D.A. Defining and assessing research quality in a transdisciplinary context. Res. Eval. 2016, 25, 1–17. [Google Scholar] [CrossRef]
  44. Holzer, J.M.; Carmon, N.; Orenstein, D.E. A methodology for evaluating transdisciplinary research on coupled socio-ecological systems. Ecol. Indic. 2018, 85, 808–819. [Google Scholar] [CrossRef]
  45. Lasker, R.D.; Weiss, E.S.; Miller, R. Partnership Synergy: A Practical Framework for Studying and Strengthening the Collaborative Advantage. Milbank Q. 2001, 79, 179–205. [Google Scholar] [CrossRef] [Green Version]
  46. OECD. Toolkit for Identifying, Monitoring and Evaluating the Value Added of Triangular Co-Operation. 2018, pp. 1–33. Available online: (accessed on 21 January 2019).
  47. van Geenhuizen, M. A framework for the evaluation of living labs as boundary spanners in innovation. Environ. Plan. C Politics Space 2018, 36, 1280–1298. [Google Scholar] [CrossRef]
  48. Veeckman, C.; Schuurman, D.; Leminen, S.; Westerlund, M. Linking Living Lab Characteristics and Their Outcomes: Towards a Conceptual Framework. Technol. Innov. Manag. Rev. 2013, 3, 6–15. [Google Scholar] [CrossRef]
  49. Smithson, J. Using and analysing focus groups: Limitations and possibilities. Int. J. Soc. Res. Methodol. 2000, 3, 103–119. [Google Scholar] [CrossRef]
  50. Burgess, J.; Chilvers, J. Upping the ante: A conceptual framework for designing and evaluating participatory technology assessments. Sci. Public Policy 2006, 33, 713–728. [Google Scholar] [CrossRef]
  51. Hermans, F.L.P.; Haarmann, W.M.F.; Dagevos, J.F.L.M.M. Evaluation of stakeholder participation in monitoring regional sustainable development. Reg. Environ. Chang. 2011, 11, 805–815. [Google Scholar] [CrossRef] [Green Version]
  52. Center for the Advancement of Collaborative Strategies in Health. Partnership Self-Assessment Tool-Questionnaire. 2002. Available online: (accessed on 11 January 2019).
  53. Klerkx, L.; Seuneke, P.; De Wolf, P.; Rossing, W.A. Replication and translation of co-innovation: The influence of institutional context in large international participatory research projects. Land Use Policy 2017, 61, 276–292. [Google Scholar] [CrossRef]
  54. Walter, A.I.; Helgenberger, S.; Wiek, A.; Scholz, R.W. Measuring societal effects of transdisciplinary research projects: Design and application of an evaluation method. Eval. Program Plan. 2007, 30, 325–338. [Google Scholar] [CrossRef]
  55. Bammer, G. Enhancing research collaborations: Three key management challenges. Res. Policy 2008, 37, 875–887. [Google Scholar] [CrossRef]
  56. Marks, M.A.; Mathieu, J.E.; Zaccaro, S.J. A Temporally Based Framework and Taxonomy of Team Processes. Acad. Manag. Rev. 2001, 26, 356–376. [Google Scholar] [CrossRef] [Green Version]
  57. Edmondson, A.C.; Harvey, J.-F. Cross-boundary teaming for innovation: Integrating research on teams and knowledge in organizations. Hum. Resour. Manag. Rev. 2018, 28, 347–360. [Google Scholar] [CrossRef]
  58. Williams, K.Y.; O’Reilly, C.A. Demography and diversity in organizations: A review of 40 years of research. Res. Organ. Behav. 1998, 20, 77–140. [Google Scholar]
  59. Campbell, D.; Moore, G. Increasing the use of research in population health policies and programs: A rapid review. Public Health Res. Pract. 2018, 28, e2831816. [Google Scholar] [CrossRef]
  60. Meagher, L.R. Report Rural Economy and Land Use Programme Societal and Economic Impact Evaluation. 2012. Available online: (accessed on 27 January 2019).
  61. Wiek, A.; Talwar, S.; Robinson, J.; O’Shea, M.; O’Shea, M. Toward a methodological scheme for capturing societal effects of participatory sustainability research. Res. Eval. 2014, 23, 117–132. [Google Scholar] [CrossRef]
  62. Wenger, E. Communities of Practice and Social Learning Systems. Organization 2000, 7, 225–246. [Google Scholar] [CrossRef]
  63. Meagher, L.R. Report Rural Economy and Land Use Programme Societal and Economic Impact Evaluation. Part two. 2012. Available online: (accessed on 10 February 2019).
  64. Zardo, P.; Barnett, A.G.; Suzor, N.; Cahill, T. Does engagement predict research use? An analysis of The Conversation Annual Survey 2016. PLoS ONE 2018, 13, e0192290. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  65. Ponomariov, B.L.; Boardman, P.C. Influencing scientists’ collaboration and productivity patterns through new institutions: University research centers and scientific and technical human capital. Res. Policy 2010, 39, 613–624. [Google Scholar] [CrossRef]
  66. Carr, G.; Blöschl, G.; Loucks, D.P. Gaining insight into interdisciplinary research and education programmes: A framework for evaluation. Res. Policy 2018, 47, 35–48. [Google Scholar] [CrossRef]
  67. Cummings, J.N.; Kiesler, S. Coordination costs and project outcomes in multi-university collaborations. Res. Policy 2007, 36, 1620–1634. [Google Scholar] [CrossRef]
  68. Haapasaari, P.; Kulmala, S.; Kuikka, S. Growing into Interdisciplinarity: How to Converge Biology, Economics, and Social Science in Fisheries Research? Ecol. Soc. 2012, 17, 6. [Google Scholar] [CrossRef]
  69. Heinze, T.; Shapira, P.; Rogers, J.D.; Senker, J.M. Organizational and institutional influences on creativity in scientific research. Res. Policy 2009, 38, 610–623. [Google Scholar] [CrossRef]
  70. Kabo, F.W.; Cotton-Nessler, N.; Hwang, Y.; Levenstein, M.C.; Owen-Smith, J. Proximity effects on the dynamics and outcomes of scientific collaborations. Res. Policy 2014, 43, 1469–1485. [Google Scholar] [CrossRef]
  71. Moragues-Faus, A.; Marceau, A. Measuring Progress in Sustainable Food Cities: An Indicators Toolbox for Action. Sustainability 2019, 11, 45. [Google Scholar] [CrossRef] [Green Version]
  72. Steen, K.; van Bueren, E. Urban Living Labs: A living lab way of working. In Amsterdam Institute for Advanced Metropolitan Regions, 1st ed.; AMS Institute: Amsterdam, The Netherlands, 2017; Available online: (accessed on 29 January 2019).
  73. FAO. Participatory Assessment, Monitoring and Evaluation; Rome, 1989; Available online: (accessed on 18 January 2019).
  74. Schuurman, D.; De Marez, L.; Ballon, P. The Impact of Living Lab Methodology on Open Innovation Contributions and Outcomes. Technol. Innov. Manag. Rev. 2016, 6, 7–16. [Google Scholar] [CrossRef]
  75. De Moor, K.; Berte, K.; De Marez, L.; Joseph, W.; Deryckere, T.; Martens, L. User-Driven Innovation? Challenges of User Involvement in Future Technology Analysis. Sci. Public Policy 2010, 37, 51–61. [Google Scholar] [CrossRef]
  76. Kehayia, E.; Swaine, B.; Longo, C.; Ahmed, S.; Archambault, P.; Fung, J.; Kairy, D.; Lamontagne, A.; Le Dorze, G.; Lefebvre, H.; et al. Creating a Rehabilitation Living Lab to Optimize Participation and Inclusion for Persons with Physical Disabilities. Alter 2014, 8, 151–157. [Google Scholar] [CrossRef] [Green Version]
  77. Klein, J.T. Evaluation of Interdisciplinary and Transdisciplinary Research. Am. J. Prev. Med. 2008, 35, S116–S123. [Google Scholar] [CrossRef] [PubMed]
  78. Logghe, S.; Schuurman, D. Action Research as a Framework to Evaluate the Operations of a Living Lab. Technol. Innov. Manag. Rev. 2017, 7, 35–41. [Google Scholar] [CrossRef]
  79. Sauer, S. User Innovativeness in Living Laboratories: Everyday User Improvisations with Icts as a Source of Innovation. Ph.D. Thesis, University Library/University of Twente, Twente, The Netherlands, 2013. [Google Scholar]
  80. Ståhlbröst, A.; Holst, M. Reflecting on Actions in Living Lab Research. Technol. Innov. Manag. Rev. 2017, 7, 27–34.
  81. Wenger-Trayner, B.; Wenger-Trayner, E.; Cameron, J.; Eryigit-Madzwamuse, S.; Hart, A. Boundaries and Boundary Objects: An Evaluation Framework for Mixed Methods Research. J. Mix. Methods Res. 2017, 1–18.
  82. Hakkarainen, L.; Hyysalo, S. How Do We Keep the Living Laboratory Alive? Learning and Conflicts in Living Lab Collaboration. Technol. Innov. Manag. Rev. 2013, 3, 16–22.
  83. Boix-Mansilla, V.; Lamont, M.; Sato, K. Successful Interdisciplinary Collaborations: The Contributions of Shared Socio-Emotional-Cognitive Platforms to Interdisciplinary Synthesis. In Proceedings of the 4S Annual Meeting, Vancouver, BC, Canada, 16–20 February 2012; Available online: (accessed on 18 May 2019).
  84. Siedlok, F.; Hibbert, P.; Sillince, J. From Practice to Collaborative Community in Interdisciplinary Research Contexts. Res. Policy 2015, 44, 96–107.
  85. Hibbert, P.; Siedlok, F.; Beech, N. The Role of Interpretation in Learning Practices in the Context of Collaboration. Acad. Manag. Learn. Educ. 2016, 15, 26–44.
  86. Jeffrey, P. Smoothing the Waters. Soc. Stud. Sci. 2003, 33, 539–562.
  87. Leminen, S.; Westerlund, M.; Kortelainen, M. A Recipe for Innovation through Living Lab Networks. In Proceedings of the XXIII ISPIM Conference, Barcelona, Spain, 17–20 June 2012.
  88. Borrego, M.; Newswander, L.K. Characteristics of Successful Cross-Disciplinary Engineering Education Collaborations. J. Eng. Educ. 2008, 97, 123–134.
  89. Jha, Y.; Welch, E.W. Relational Mechanisms Governing Multifaceted Collaborative Behavior of Academic Scientists in Six Fields of Science and Engineering. Res. Policy 2010, 39, 1174–1184.
  90. Heslop, B.; Paul, J.; Stojanovski, E.; Bailey, K. Organisational Psychology and Appreciative Inquiry: Unifying the Empirical and the Mystical. AI Pract. 2018, 69–90.
  91. Hülsheger, U.R.; Anderson, N.; Salgado, J.F. Team-Level Predictors of Innovation at Work: A Comprehensive Meta-Analysis Spanning Three Decades of Research. J. Appl. Psychol. 2009, 94, 1128–1145.
  92. Borrego, M.; Cutler, S. Constructive Alignment of Interdisciplinary Graduate Curriculum in Engineering and Science: An Analysis of Successful IGERT Proposals. J. Eng. Educ. 2010, 99, 355–369.
  93. Vilsmaier, U.; Engbers, M.; Luthardt, P.; Maas-Deipenbrock, R.M.; Wunderlich, S.; Scholz, R.W. Case-Based Mutual Learning Sessions: Knowledge Integration and Transfer in Transdisciplinary Processes. Sustain. Sci. 2015, 10, 563–580.
  94. Hadorn, G.H.; Hoffmann-Riem, H.; Biber-Klemm, S.; Grossenbacher-Mansuy, W.; Joye, D.; Pohl, C.; Wiesmann, U.; Zemp, E. (Eds.) Handbook of Transdisciplinary Research; Springer Netherlands: Dordrecht, The Netherlands, 2008.
  95. Hoffmann, S.; Pohl, C.; Hering, J.G. Exploring Transdisciplinary Integration within a Large Research Program: Empirical Lessons from Four Thematic Synthesis Processes. Res. Policy 2017, 46, 678–692.
  96. Nesshöver, C.; Assmuth, T.; Irvine, K.N.; Rusch, G.M.; Waylen, K.A.; Delbaere, B.; Haase, D.; Jones-Walters, L.; Keune, H.; Kovacs, E.; et al. The Science, Policy and Practice of Nature-Based Solutions: An Interdisciplinary Perspective. Sci. Total Environ. 2017, 579, 1215–1227.
  97. Roux, D.J.; Nel, J.L.; Cundill, G.; O’Farrell, P.; Fabricius, C. Transdisciplinary Research for Systemic Change: Who to Learn with, What to Learn about and How to Learn. Sustain. Sci. 2017, 12, 711–726.
  98. Schut, M.; van Paassen, A.; Leeuwis, C.; Klerkx, L. Towards Dynamic Research Configurations: A Framework for Reflection on the Contribution of Research to Policy and Innovation Processes. Sci. Public Policy 2014, 41, 207–218.
  99. Schippers, M.C.; Den Hartog, D.N.; Koopman, P.L.; Wienk, J.A. Diversity and Team Outcomes: The Moderating Effects of Outcome Interdependence and Group Longevity and the Mediating Effect of Reflexivity. J. Organ. Behav. 2003, 24, 779–802.
  100. Blackstock, K.L.; Waylen, K.A.; Dunglinson, J.; Marshall, K.M. Linking Process to Outcomes: Internal and External Criteria for Stakeholder Involvement in River Basin Management Planning. Ecol. Econ. 2012, 77, 113–122.
  101. Leminen, S.; Westerlund, M.; Nyström, A.G. On Becoming Creative Consumers–User Roles in Living Labs Networks. Int. J. Technol. Mark. 2014, 9, 33–52.
  102. Nyström, A.-G.; Leminen, S.; Westerlund, M.; Kortelainen, M. Actor Roles and Role Patterns Influencing Innovation in Living Labs. Ind. Mark. Manag. 2014, 43, 483–495.
  103. MacMynowski, D.P. Pausing at the Brink of Interdisciplinarity: Power and Knowledge at the Meeting of Social and Biophysical Science. Ecol. Soc. 2007, 12, 20.
  104. OECD. DAC Principles for Evaluation of Development Assistance; OECD: Paris, France, 1991; Available online: (accessed on 16 February 2019).
  105. Arnstein, S.R. A Ladder of Citizen Participation. J. Am. Inst. Plann. 1969, 35, 216–224.
  106. Borner, K.; Contractor, N.; Falk-Krzesinski, H.J.; Fiore, S.M.; Hall, K.L.; Keyton, J.; Spring, B.; Stokols, D.; Trochim, W.; Uzzi, B. A Multi-Level Systems Perspective for the Science of Team Science. Sci. Transl. Med. 2010, 2, 1–6.
  107. Stokols, D.; Misra, S.; Moser, R.P.; Hall, K.L.; Taylor, B.K. The Ecology of Team Science. Am. J. Prev. Med. 2008, 35, S96–S115.
  108. Sutherland Olsen, D. Emerging Interdisciplinary Practice: Making Nanoreactors. Learn. Organ. 2009, 16, 398–408.
  109. Mâsse, L.C.; Moser, R.P.; Stokols, D.; Taylor, B.K.; Marcus, S.E.; Morgan, G.D.; Hall, K.L.; Croyle, R.T.; Trochim, W.M. Measuring Collaboration and Transdisciplinary Integration in Team Science. Am. J. Prev. Med. 2008, 35, S151–S160.
  110. Strype, J.; Gundhus, H.I.; Egge, M.; Ødegård, A. Perceptions of Interprofessional Collaboration. Prof. Prof. 2014, 4.
  111. Chianca, T. The OECD/DAC Criteria for International Development Evaluations: An Assessment and Ideas for Improvement. J. Multidiscip. Eval. 2008, 5, 11.
  112. Davidson, E. Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation; SAGE Publications: Thousand Oaks, CA, USA, 2005.
  113. Haas, L.J. (Ed.) Handbook of Primary Care Psychology; Oxford University Press: New York, NY, USA, 2004.
  114. Knickel, K.; Brunori, G.; Rand, S.; Proost, J. Towards a Better Conceptual Framework for Innovation Processes in Agriculture and Rural Development: From Linear Models to Systemic Approaches. J. Agric. Educ. Ext. 2009, 15, 131–146.
  115. Becker, J. Making Sustainable Development Evaluations Work. Sustain. Dev. 2004, 12, 200–211.
  116. Weaver, P.M. Evaluating Sustainability Science: A Methodological Framework. Paper for Deliverable 3; 2002; Available online: Summary.pdf (accessed on 8 December 2018).
  117. Benson, D.; Fritsch, O.; Cook, H.; Schmid, M. Evaluating Participation in WFD River Basin Management in England and Wales: Processes, Communities, Outputs and Outcomes. Land Use Policy 2014, 38, 213–222.
  118. Hansson, S.; Polk, M. Assessing the Impact of Transdisciplinary Research: The Usefulness of Relevance, Credibility, and Legitimacy for Understanding the Link between Process and Impact. Res. Eval. 2018, 27, 132–144.
  119. Siebenhüner, B. Conflicts in Transdisciplinary Research: Reviewing Literature and Analysing a Case of Climate Adaptation in Northwestern Germany. Ecol. Econ. 2018, 154, 117–127.
  120. Stokols, D.; Harvey, R.; Gress, J.; Fuqua, J.; Phillips, K. In Vivo Studies of Transdisciplinary Scientific Collaboration. Am. J. Prev. Med. 2005, 28, 202–213.
  121. Taplin, D.H.; Clark, H. Theory of Change Basics: A Primer on Theory of Change; ActKnowledge: New York, NY, USA, 2012.
  122. Walker, W.E.; Rahman, S.A.; Cave, J. Adaptive Policies, Policy Analysis, and Policy-Making. Eur. J. Oper. Res. 2001, 128, 282–289.
  123. Vogel, I. Review of the Use of ‘Theory of Change’ in International Development. Review Report; 2012; Available online: (accessed on 28 January 2019).
Table 1. Overview of the four dimensions of the TD co-learning framework with key criteria and indications.

Context: represents the setting in which TD collaboration is taking place
  • Organizational structure, resources and infrastructure; boundary setting
  • Real-world context
  • Number and diversity of actors
  • Level of openness
  • Early involvement of key actors; engaged community

Approach: defines the broad research approach taken and related methodological aspects
  • Joint learning and complementary knowledge
  • Use of participatory methods and co-creation
  • Use of an action-orientated approach
  • Use of a systems approach
  • Reflexivity and monitoring; feedback loops and refinement

Process: encompasses the way the cooperation is implemented, organized and managed
  • Common vision, genuine inclusion, common language and effective communication
  • Ownership and trust; appreciation and respect
  • Competences, knowledge integration, co-learning and co-creation
  • Leadership, roles and decision-making
  • Management and conflict resolution

Outcomes: subsumes intended and unintended outputs, effects, outcomes and impacts
  • Relevance, effectiveness, unintended effects and efficiency
  • Dissemination, networking and mobilization of additional support
  • Transformative learning and capacity-building
  • Satisfaction of core constituencies
  • Impact, comparability and transferability of findings, legacy
Table 2. Context: criteria, related literature and guiding questions.

Organizational structure, resources and infrastructure
  Related literature: Technical infrastructure [48,52]. Flexibility [37]. Access to resources [9]. Institutional context [36,53]. Historical context [36,54].
  Guiding questions (indicative):
  • To what extent does your team have what it needs to work effectively in terms of skills and expertise/data and information?
  • To what extent does your team have enough financial resources/time?
  • Does your team have effective connections to decision-makers, government agencies, relevant organizations and your target group/s?

Boundary setting and strategic planning
  Related literature: Boundary setting [2,55]. Strategic planning [37]. Explicit theory of change; relevant research objectives and design [43]. Opportunity to influence [9]. Lifespan [48,56].
  Guiding questions (indicative):
  • Which actors need to be involved in the work?
  • What can best be done with the available time, money and person power (what is central and what is marginal)?
  • Has the collaboration changed over time? If yes, how and why?

Real-world context
  Related literature: Real-world context [48]. Societally relevant research problem [43].
  Guiding questions (indicative):
  • To what extent does the work in your team address the challenges in the region perceived as important by relevant stakeholders?

Number and diversity of actors
  Related literature: Scale [48]. Actor-complexity [47]. Building a heterogeneous network [41]. Team diversity [57,58].
  Guiding questions (indicative):
  • How comfortable are you with the size/background/skillset of your team?
  • In what way does the diverse background/skillset of your team members affect the way you interact with each other?

Early involvement
  Related literature: Early involvement [47,59,60].
  Guiding questions (indicative):
  • Were you engaged in the project from early on?
  • From your experience in (research) projects, to what extent does early engagement make a difference for successful collaboration?

Transparency
  Related literature: Level of openness [47,48,60]. Internal and external transparency [4,9,36,54,61].
  Guiding questions (indicative):
  • Does everyone in your team have access to the results of the joint work and the jointly generated new knowledge?
  • Can everyone in your team use the key resources without any limitations?

Engaged community
  Related literature: Community [48,62]. Sense of belonging [57]. Continuous commitment and engagement [20,37,47,59,63,64]. Involvement of participants and sense of urgency [41]. Connectivity between researchers [65]. Supporting face-to-face interaction [66,67,68,69,70]. Empowerment of practitioners [26].
  Guiding questions (indicative):
  • How are you being informed about advances, changes etc. in the joint work in your team?
  • Do you send each other relevant materials, such as interesting papers and studies with breakthrough results?
  • How frequently, on average, do you communicate with your team colleagues?
  • Which way of communicating with your team colleagues do you find the most effective in between face-to-face meetings? What has proven to work well/not at all for your team?
  • Would you prefer to meet your team members in person more often?
  • Do your colleagues display high levels of cooperation and mutual support?
Table 3. Approach: criteria, related literature and guiding questions.

Use of an action-orientated approach
  Related literature: Action-orientated approach [47,71].
  Guiding questions (indicative):
  • To what extent is an action-orientated approach prioritized in your team?
  • How well are your partners able to elaborate strategies that are likely to work in the region?

Joint learning
  Related literature: Joint reflection and learning [41,54]. Reflexivity [2,4,7,18,28,37,40,41]. Adaptive learning [37,40,54].
  Guiding questions (indicative):
  • To what extent are you ready to learn from team members with different backgrounds?
  • Do you mind putting ideas of your own up for discussion?
  • How important do you find joint learning for your team?

Monitoring and evaluation
  Related literature: Reflexive monitoring in action [41]. Monitoring [72]. Monitoring and learning [40]. Ongoing monitoring and reflexivity [43]. Evaluation [47,48,72,73]. Continuous formative evaluation [4]. Participatory evaluation and learning [9,40,47].
  Guiding questions (indicative):
  • Does your team have mechanisms to monitor progress/results? Which ones?
  • Does your team evaluate the progress and impact? If yes, how?

Multi-method
  Related literature: Use of multiple approaches and tools [47,74].
  Guiding questions (indicative):
  • Do you combine different kinds of methods in a carefully considered way to achieve goals?

Use of participatory methods
  Related literature: Participatory approach [9,61].
  Guiding questions (indicative):
  • Do you use participatory methods in your teamwork?
  • What methods proved most effective for you personally/your team?

Systems approach
  Related literature: Systems approach [47,71].
  Guiding questions (indicative):
  • How well is your team able to carry out more encompassing analyses?
  • Have you defined the boundaries of the system you are examining?

Complementary knowledge
  Related literature: Complementary knowledge [20,46]. Contribution of different actors [43].
  Guiding questions (indicative):
  • To what extent do the complementary strengths of team members help achieve goals?
  • To what extent is an effective combination of perspectives, resources and skills of the team members actively promoted in the team?

Feedback and adjustment
  Related literature: Iteration, feedback and refinement [47,75,76,77,78,79,80,81]. Adjustment of activities [41]. Adaptive decision-making and revision [37].
  Guiding questions (indicative):
  • Do you feel that there are enough feedback loops in the team?
  • Is the feedback received from your team members sufficiently integrated into the planning of next steps?
  • How is the feedback process organized in your team? On what occasion? Who is involved?
Table 4. Process: criteria, related literature and guiding questions.

Common vision and genuine inclusion
  Related literature: Creating a common vision, alignment of priorities/interests [9,46,47,48,63,71,74,82]. Shared identity and values [66,83,84]. Inclusiveness [28,71]. Genuine and explicit inclusion [43].
  Guiding questions (indicative):
  • How well are the team members able to include (reconcile) the views and priorities of all involved in the team? Is there a positive/negative example?
  • To what extent does your team try to coordinate plans with others in the project? How is the coordination achieved?

Common language
  Related literature: Common language [26,52,59,63,64]. Clarification for a "shared interpretive horizon" [66,84,85,86].
  Guiding questions (indicative):
  • How well do you feel your colleagues understand what you are saying?
  • Do you have a common glossary of key terms?
  • How well is your team able to express goals in a way that is supported by all involved?

Personal motivation
  Related literature: Passion [87]. Motivation [47,48,61,74]. Motivation and encouragement [54]. Strategic intention [87]. Willingness to learn [9,36,54,61].
  Guiding questions (indicative):
  • How often do you go beyond what is required in the tasks assigned to you?
  • How much do you feel your team members are inspired and motivated about the work they are doing?
  • How satisfied are you with the contribution of other colleagues in the team?

Ownership and trust
  Related literature: Building ownership and trust [46,47,48,60,74]. Ownership of outcomes [9]. Trust [41,54,66,84,88]. Trust and respect [89].
  Guiding questions (indicative):
  • Are you ready to share products that are still work in progress? Why/why not?
  • How satisfied are you with the co-ownership of the results of joint work?

Appreciation and psychological safety
  Related literature: Appreciation and respect [47,48,60,74]. Psychological safety [57,90,91].
  Guiding questions (indicative):
  • To what extent do you think your contributions are valued in the team?
  • To what extent does your team provide an environment where different opinions can be voiced?
  • To what extent do you feel you can take risks in your team by voicing a less popular view?

Effective internal communication
  Related literature: Internal communication [9,36,54,61]. Participatory events [61]. Effective communication [43].
  Guiding questions (indicative):
  • To what extent are you aware of who is doing what in your team?
  • How effective are your meetings?

Competences
  Related literature: Knowledge and skills [37,47,87]. Communication skills, team-working skills and a broad perspective [92]. Capacity to participate [9,36,54,61]. Adequate competences [43].
  Guiding questions (indicative):
  • Are there any skills that you think are missing in your team to do the job more effectively? If yes, which?
  • To what extent do you feel your actual responsibilities in the team correspond well with your knowledge and skills?
  • Is there any form of training to keep team members' skills up-to-date?

Co-learning
  Related literature: Co-learning [20,41,47,54,61,76,78,80,81]. Mutual learning [1,4,33,36,50,54,93,94,95,96,97]. Knowledge exchange [26,37,60,90].
  Guiding questions (indicative):
  • Do you implement dedicated measures to support co-learning? If yes, what kind of measures?
  • To what extent do you recognize the value of your own knowledge in co-learning?
  • What are the top five most challenging obstacles limiting effective knowledge exchange?

Knowledge integration
  Related literature: Knowledge integration [2,20,44,93]. Knowledge management [98]. Knowledge diversity [57,99].
  Guiding questions (indicative):
  • Are you using particular methods or tools that connect different kinds of knowledge (e.g., when addressing an issue/problem you are working on)? If yes, can you provide examples?
  • What processes/tools do you use to keep track of available knowledge?

Co-creation
  Related literature: Co-creation [47,48,52,60,81].
  Guiding questions (indicative):
  • By working together, how well are the team members able to identify new ways to solve problems?
  • How do you share the work? How is it decided who is doing what?

Leadership
  Related literature: Leadership [4,9,36,37,52,54,57,77,100]. Presence of 'prime movers' [41].
  Guiding questions (indicative):
  • Who is leading your team? What are the reasons?
  • To what extent does the team leader encourage team members to be creative and look at things differently?
  • How effective is the team leader's communication style?

Actors' roles
  Related literature: User/stakeholder roles [43,46,61,101]. Flexibility in actor roles [47,102]. (New) roles for research(ers) [98]. Role clarity [57].
  Guiding questions (indicative):
  • Are the roles of different members in your team properly defined?
  • How balanced is the distribution of work between different team members/gender-wise?
  • Do you feel your role in the team has changed over the course of the project?

Decision-making
  Related literature: Decision-making [52]. Quality of decision-making [9].
  Guiding questions (indicative):
  • Does everyone in the team have the same influence on decisions?
  • How comfortable are you with the way decisions are made in the team?

Administration and management
  Related literature: Administration and management [45,72,77]. Appropriate project implementation [43]. Harnessing differences [55].
  Guiding questions (indicative):
  • Are the results of the team's decisions and action points documented?
  • How effective is the preparation of team-level decisions?

Conflict resolution
  Related literature: Conflict reduction/mitigation/resolution [4,47,52,54,61,82]. Harnessing differences [55,66,83,84,86,88,103].
  Guiding questions (indicative):
  • Were there diverging views when deciding on key issues? If yes, how did you deal with that?
  • Were there diverging views on the research agenda? If yes, how did you deal with that?
  • To what extent are you as a team able to work through differences of opinion without damaging relationships?
  • Do you use professional facilitators?
Table 5. Outcomes: criteria, related literature and guiding questions.

Relevance
  Related literature: Relevance [37,43,73,104].
  Guiding questions (indicative):
  • Are the aims you formulated (still) relevant?
  • Are the activities consistent with the main aims/intended impacts?

Effectiveness
  Related literature: Effectiveness [63,73,77,104,105,106,107].
  Guiding questions (indicative):
  • To what extent is progress being made in relation to the aims of the project?
  • What are the main factors influencing the achievement of objectives?

Efficiency
  Related literature: Efficiency [39,46,47,104]. Professionalism [37]. Cost effectiveness [9].
  Guiding questions (indicative):
  • How efficiently does the team use different resources in achieving goals?
  • Are the planned activities/budget lines sufficiently on schedule?
  • How productive are the meetings?

Unintended effects
  Related literature: Unintended effects [37,46,47,60,73,77].
  Guiding questions (indicative):
  • Is your research guided by the principles of Responsible Research and Innovation (RRI)?
  • Do you have processes/methods in place that aim at spotting negative impacts?

Communication and dissemination
  Related literature: Communication and dissemination [64,66,68,84,86,108]. External communication [36]. Distribution of knowledge [54,61].
  Guiding questions (indicative):
  • Are you regularly communicating and disseminating (e.g., interim) research results?
  • How do you check whether you effectively reach your target groups?

Mobilization of additional support and network building
  Related literature: Mobilization of additional resources, networks and institutions [46]. Network building [45,47,54,60]. Network relationships [9,36,37,61].
  Guiding questions (indicative):
  • How good is your team at obtaining support from individuals or organizations in the region to help move things forward?
  • What other networks/organizations have joined over time in order to foster upscaling and multiplication?
  • What additional resources were mobilized?

Satisfaction of core constituencies and community identification
  Related literature: Satisfaction with collaboration [47,109]. Community identification and "sense of belonging" [54]. Satisfaction of core constituencies, including expectations; accountability [9]. Satisfaction and commitment [99].
  Guiding questions (indicative):
  • How satisfied are you with your role in the team/the team's plans/strategy? What can be done better?
  • Are representatives of core constituencies/the local community actively engaged in project activities?
  • Does the project have an influence on governance arrangements/decision-making in your region?

Benefit(s) received, usefulness
  Related literature: Value creation/sharing "ecosystem" [48,60]. Personal value [52,110]. Impact of collaboration [109]. Recognized impact [9,36]. Proximal outputs (team members' learning and professional development) [57].
  Guiding questions (indicative):
  • To what extent has your ability to influence policy/meet the needs of your region increased?
  • Have you experienced any drawbacks as a result of the collaboration?
  • How does the value of collaboration compare to the drawbacks?

Social learning
  Related literature: Social learning [7,9,37,38,39,66,81]. Individual and transformative learning [36,66]. Emergent knowledge and influence of local knowledge on outcomes [9].
  Guiding questions (indicative):
  • Can you identify some examples of how the teamwork has changed the way you see things and your professional practice?
  • Can you give an example of how your personal expertise or experience has influenced discussions and directions taken by your team?

Capacity-building
  Related literature: Capacity-building [9,36,37,38,54,60,61,63,98].
  Guiding questions (indicative):
  • To what extent has your ability to apply scientific concepts to addressing real-world problems improved?
  • Can you identify some transdisciplinary methods you learned to use?

Comparability/transferability of findings
  Related literature: Comparability of results across regional/national/international contexts [71]. Transferability of results [43,47,111,112].
  Guiding questions (indicative):
  • To what extent can what you learned be applied in other contexts?
  • What are the main factors that influence the transferability of findings?

Impact
  Related literature: Significant outcome [43]. Impact [37,46,47,60,73,77]. Type and degree of impact observed [60]. Recognized impacts [9].
  Guiding questions (indicative):
  • Can you identify some examples of actual changes in policy or practice? (instrumental)
  • Can you identify some examples of how the broad understanding of the issues studied has been improved? (conceptual)
  • Can you identify some examples of increased willingness to engage in new collaborations? (culture change)

Legacy
  Related literature: Sustainability [46,47].
  Guiding questions (indicative):
  • To what extent do you think the benefits of the project will continue after funding ceases?
  • What are the main factors for the benefits of the project to last (e.g., institutionalization of new networks and exchanges)?


MDPI and ACS Style

Knickel, M.; Knickel, K.; Galli, F.; Maye, D.; Wiskerke, J.S.C. Towards a Reflexive Framework for Fostering Co-Learning and Improvement of Transdisciplinary Collaboration. Sustainability 2019, 11, 6602.
