Article

User-Oriented Policies in European HEIs: Triggering a Participative Process in Today’s Digital Turn—An OpenU Experimentation in the University of Paris 1 Panthéon-Sorbonne

by Marco Renzo Dell’Omodarme 1,* and Yasmine Cherif 2

1 Ecole des Arts de la Sorbonne, Paris 1 Panthéon-Sorbonne University, 75005 Paris, France
2 Direction des Relations Internationales, Paris 1 Panthéon-Sorbonne University, 75013 Paris, France
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(12), 919; https://doi.org/10.3390/educsci12120919
Submission received: 9 October 2022 / Revised: 5 December 2022 / Accepted: 6 December 2022 / Published: 14 December 2022

Abstract
As European higher education institutions (HEIs) increasingly grapple with new challenges, the importance and difficulty of massification, democratization, and inclusion have been reinforced by the recent pandemic crisis and the simultaneous need for pedagogical continuity. Meeting these challenges not only implies a profound change in organization and teaching practices, which need to focus on user-centered quality learning, but also raises questions about financing, management, and governance. Using results from two participative experiments conducted in the French University of Paris 1 Panthéon-Sorbonne in the framework of the OpenU (Online Pedagogical Resources in European Universities) project, the authors present ethical and practical issues currently facing inclusive and user-oriented policies in the European higher education area. Through this paper, we argue that creating an imaginative and inclusive participative process, in the spirit of evidence-based policies supporting digital education, is as essential today as it remains partial. We furthermore present emerging results on current needs, as well as incentives to increase participation. These results ultimately allow us to draw conclusions and recommendations for institutional and national policymakers to further improve user-oriented policies.

1. Introduction

In recent decades, social science fields such as sociology, political science, planning, and even architecture have seen a rise in the popularity of experiments. The capacity to foster experimentation is argued to be one of the key characteristics of both behavioral economics and innovation policy mixes [1]. By viewing experimentation as continuous growth, the process of iterative adaptation to new circumstances and experiences is believed to pragmatically entail a certain idea of progress and improvement [2].
Perhaps one of the best-known defenders of experimentation in the policy sciences, Donald T. Campbell considered experiments, and more particularly randomization, to be the main pathway for scientific research and even an ideal for better governance and a Utopian society [3].
Policy experimentation can be defined as “a purposeful and coordinated activity geared to producing novel policy options that are injected into official policymaking and then replicated on a larger scale, or even formally incorporated into national law” [4]. Experimentalist governance, which was developed in response to command-and-control regulation, is based on deliberation and the generation of evidence. Command-and-control approaches were argued not to work in a contemporary world that experiences fast-paced changes and problems of implementing fixed rules on the ground [5]. In contrast, experimentalist governance is increasingly considered an important driver of desirable societal transformation and, in the particular field of education, a catalyst of innovation [6].
More importantly, though, policy experimentation is an attempt to fill a gap in our knowledge of what works and what does not. As early as the 1960s, Harold Lasswell asserted that experiments were an effective way to improve policymaking practices, generate scientific knowledge, and build capacity to implement novel ways of doing policy. These purposes imply a certain level of learning and a subsequent translation into policy practices [7]. In order for educational experimentation to work, education systems must adopt an attitude of constructive skepticism that acknowledges the risk inherent in any reform or experiment and allows them to transparently govern the process [8,9]. In this framework, HEIs represent what James Mahoney and Kathleen Thelen describe as “compromises or relatively durable though still contested settlements based on specific coalitional dynamics” [10]. They are thus not only inscribed in the political and social structure, but are themselves not strictly independent of socio-political changes.
Digitalization cannot be studied while neglecting its context. In the French public higher education area, competition between institutions has furthermore been accentuated by international rankings and the establishment of evaluation agencies (e.g., the French Agency for the Evaluation of Research and Higher Education [AERES], the High Council for the Evaluation of Research and Higher Education [Hcéres], and EQUIS) [11]. Higher education institutions have been urged to stand out by demonstrating their ability to innovate in both research and teaching. In particular, digital strategies appear to be a decisive lever for competitiveness available to higher education institutions. Whether it is to adapt training to the diversity of student populations, to increase the visibility of research and teaching activities, or to provide effective management tools, digital technology is bringing about profound changes in university policies. Thus, in such a competitive higher education system, it seems that institutions have no choice but to innovate.
Because the dominant conception of experiments in the policy sciences is that they are mainly a research method, the matters of governance and leadership have not garnered enough attention and remain underexplored. Yet, such deep changes raise questions about the management process within a public French higher education institution, the current principles of which are still mostly based on the unity of time and place. When considering improvement, Gilbert developed the Behavior Engineering Model with the belief that the greatest barrier to worthy performance is a lack of information and support from management rather than an individual’s lack of desire to perform well.
Gilbert’s model focuses on two distinct factors of performance—the environment and the individual’s behaviors—which can be viewed from three perspectives—information, instrumentation, and motivation. Based on his understanding of technological improvement, Gilbert’s Behavior Engineering Model consists of three Leisurely Theorems that:
  • Distinguish between accomplishment and behavior to define worthy performance: worthy performance relates the value of a person’s accomplishments to the cost of their behavior;
  • Identify methods for determining the potential for improvement (PIP), amounting to the ratio of exemplary performance to typical performance;
  • Describe six components of behavior that can be manipulated to change performance, among which are environmental components (data, resources, and incentives), as well as knowledge, capacity, and motives [12].
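For readers unfamiliar with Gilbert’s formalism, the first two theorems can be written compactly as follows (our reconstruction of Gilbert’s usual notation; the symbols are not given in the original text):

```latex
% First theorem: worth W is the value of accomplishment A
% relative to the cost of the behavior B that produced it.
W = \frac{A}{B}

% Second theorem: the potential for improvement is the ratio
% of exemplary worth to typical worth.
\mathrm{PIP} = \frac{W_{\mathrm{exemplary}}}{W_{\mathrm{typical}}} \geq 1
```

A PIP close to 1 signals little headroom between typical and exemplary performers, while a large PIP signals substantial potential for improvement.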
Thus, the aim of this paper is to focus on motivations, participation incentives, and the ethical issues at hand during the institutional management of the technological and digital improvement process in the University of Paris 1 Panthéon-Sorbonne. This objective relies on the following observations that have been made.
First, studies show that emerging attention has been directed at the link between public governance aspects [13] and user perspective [14] since 2018. In the management, marketing, and IT fields, this focus on users through “customer-centric”, “human-centered”, “user-oriented” approaches has been expounded earlier by researchers (e.g., [15,16]) and practitioners alike [17]. In these fields, user-centered design is often branded as a means to create highly usable and accessible products through an iterative design process in which users and their needs are involved in each phase of the design process. In user-centered design, designers use a mixture of investigative methods and tools (e.g., surveys and interviews) and generative ones (e.g., brainstorming) to develop an understanding of user needs. This allows a product to reach the standard of “external integrity” [18], which refers to the match between the product and the intended user and involves managing knowledge.
Similarly, the operational success of an experiment implies that all stakeholders (school staff and parents, local and central authorities, communities, and of course the students themselves) play their parts at every step, including the evaluation, which should be performed with respect to criteria decided a priori [8], whether these relate to “relieving [users] of frustration, of confusion, of a sense of helplessness [, making] them feel in control and empowered” [15], or to human factors, user experience, or “usability” [17].
The challenge of developing successful products (here, effective policies and infrastructures that enhance the user’s learning experience) requires an interrelational approach across all the key disciplines, thus leading to a higher level of “collective creativity” [19]. This aims to generate more creative solutions than those produced by less user-oriented approaches.
Second, only a few studies have been conducted on digitalization in universities. In 2021, a survey was conducted in Paris and its surrounding areas on the use of digital technology in universities and training centers. The objective of this exploratory study was to identify pre-existing digital uses in higher education institutions before the health crisis, then to analyze the solutions implemented during the lockdown and measure their impacts on students and teachers. A survey was also carried out on the sustainability of the changes once the crisis was over and the issues that were highlighted within the higher education community. A second phase of complementary qualitative interviews involved 40 teachers and students from different institutions and levels of training in order to gather their experience of distance learning. Overall, these studies showed that the health crisis forced institutions to accelerate their transition, leading French higher education institutions to rely on a massive and unprecedented deployment and use of digital technology [20]. Six universities proposed actions related to digital or distance learning through international cooperation (Paris 1 Panthéon-Sorbonne, Sorbonne Nouvelle, and Sciences Po), geographic dispersion of sites (Paris-Saclay), professional platforms (Ensam), and educational digital third parties (Evry).
In the particular case of Paris 1, although increasing political attention has been given to digitalization in the last two years, data on Paris 1 members’ perception of digitalization remain scarce. Only one examination of digital use in Paris 1, which remains internal, has compared the first and second lockdowns, showing that the student experience was not similar to that of the teachers: an explosion of use by teachers can be observed at the beginning of March 2020, while students did not follow the same acceleration curve at all.
These studies also lacked data on particular student populations. The aforementioned study conducted in the Parisian region only included eight student experiences, among which there was only one student of Université Paris 1 Panthéon-Sorbonne. When an informal exchange forum was conducted in Paris 1, only two students came forward, reporting the same general sentiment that was reported in other institutions. On an institutional level, surveys conducted by the ORIVE of Paris 1 only concerned three questions related to digitalization prior to the health crisis: first, student perspectives on teachers’ available equipment (50% satisfied or very satisfied); second, the desire to have online or blended courses (around 40% a categorical “no”); and third, the format used to transcribe classes (more than 60% digital format) (ORIVE, 2018–2019 data, Survey on Study Conditions for Bachelor and Master Students). In 2021, a thematic survey was conducted with all the students enrolled in a Master 2 during the academic year 2020–2021, which was marked by the generalized use of distance learning. This excluded distance learning, off-site training abroad, mobilities, the online law school, Administration institutes, and preparatory classes. A total of 27% of the students responded to the survey.
However, as the survey took place in March–April 2021, the questions only covered assessments that took place before April 2021. By this date, 89% of students reported having taken distance assessments during the academic year. Furthermore, the survey focused on one particular population at one particular time.
In a particular context where this institution was not accustomed to distance learning, all levels of the university were overwhelmed and had to make do with the means at hand. However, while the health crisis has made it possible to reexamine the teaching and pedagogy provided by teachers in France [20], the digital transformation was not necessarily inscribed in strategic policies. Difficulties experienced in achieving a complete and sustainable transformation have not yet been resolved, largely due to the lack of a global policy for digital transformation in certain institutions, with very different situations from one university to another.
The situation of urgency and uncertainty that characterized the COVID-19 pandemic was therefore not necessarily compatible with the implementation of truly innovative projects [21]. For the moment, the renewal of pedagogical practices often remains at the stage of reflection, in spite of a few experiments by teachers. The adaptation of courses most often resulted in a simple transposition of classroom courses to digital media. The COVID-19 pandemic has, for many students, reinforced a precariousness with multiple origins (cultural, economic, family, social), increasing the vulnerability of some of them.
Therefore, looking at the future of higher education from an adaptive perspective requires a better understanding of where and how students learn, since some learning activities may take place off campus. This also fits with the Vantage Points model developed by the MGTaylor Corporation, which acts as a “slice of reality” serving to diagnose possible influences on behavior and identify strategies for performance improvement. Through glyphs, a certain spatial arrangement, and connections between different components, the Vantage Points model asserts that “you can never understand the philosophy of a system or enterprise until you are immersed in the tasks that comprise its daily functions. The task provides a mental elevation from which the whole essence of the system can be contemplated.”
Third, there is much to be learnt from the process itself. It is no coincidence that such surveys and studies have risen in number since the COVID-19 pandemic [22]. Previous literature suggests that “institutional settings define the degree and form of experimentation that is deemed legitimate” [23], implying that experiments are bound by institutional rules. In other terms, “experiments are infused with political ideas and […] often confirm existing ideas rather than challenge them” [24]. An “informal” policy experiment [2] can be derived from that, exceeding formal evaluation-based learning and creating an informal cognitive and normative learning that can influence further institutional changes [25]. Since learning occurs through trial and error, Popper emphasizes the importance of being able to identify the causes of success or failure of a change.
Our study builds on those observations and refers to the outputs emerging from two participative experiments conducted in the French University of Paris 1 Panthéon-Sorbonne in the framework of the OpenU project. The processes explored in this paper fall into the gradual reforms approach, i.e., the necessity of small changes in order to more clearly identify the effects of the intervention and bring stakeholders on board.
In this paper, we report the results of those two rounds of experiments investigating the extent to which the knowledge generated in the higher education community can succeed in being pertinent when aiming to orient policies in said field. In each of the two experiments, which were conducted in spring 2022, members of the academic community of Paris 1 Panthéon-Sorbonne did not update their beliefs on digitalization when presented with the opportunity to run counter to their predispositions.

2. Materials and Methods

All figures and data used in this study stem from two participative experiments conducted with members of the University of Paris 1 Panthéon-Sorbonne between January 2022 and September 2022.

2.1. Specification of Context, Population, and Field of Study

Both authors are themselves members of the University of Paris 1 Panthéon-Sorbonne, which facilitated their access to the field and allowed them to make full use of the necessary internal services and actors. The University of Paris 1 Panthéon-Sorbonne is also one of the academic partners of the OpenU project, within the framework of which this research lies. The choice of this university appeared to be particularly pertinent in the context of the stated policy experimentation process.
Paris 1 Panthéon-Sorbonne is neither the only French university in the project nor the largest one. However, it is relatively representative of public universities in the French higher education system. The university gathers around 45,000 students and 2500 professional members. It reflects the LMD (Licence-Master-Doctorat) system, which most European countries have adopted in an effort to promote coherence across borders, but also the antagonism between public entities and more selective ones (“Grandes écoles”) [26]. In the year 2021–2022, around 2950 students were enrolled in parallel in an external two-year preparatory course (cours préparatoires or prépas). Those students do not attend classes in the university but remain registered as a “back-up plan” should they not be admitted into more competitive institutions.
At the same time, Paris 1 Panthéon-Sorbonne is a social-sciences-only university revolving around three disciplines—Economics and Management, Arts and Humanities, and Law and Political Sciences—and as such is one of the largest universities of humanities and social sciences in France. This exclusive nature is particularly interesting here as this paper’s discussions include the acceptance, adoption, and use of technologies [27], which lies at the heart of the expertise of the university.
Its campus, famously based in the capital city, is characterized by its scattered nature. The University of Paris 1 Panthéon-Sorbonne is located on 25 sites in Paris and the Ile-de-France region, with more than 1500 students enrolled in non-LMD programs (capacités, university diplomas, DAEU, etc.). Its research departments are structured around three major disciplinary poles with 36 research teams, including 23 UMRs under joint supervision with the CNRS or IRD and 13 host teams, as well as 10 doctoral schools.
The university also lies at the heart of a network of international relations covering five continents. More than 670 foreign students were registered in the university in the academic year 2021–2022. This is facilitated by the University of Paris 1 Panthéon-Sorbonne’s choice since 2020 to continue applying the same registration fees to French and foreign students, whether they are of intra- or extra-European origin. Beyond mobility, more than 1100 students are enrolled in off-campus courses in nine countries abroad, while 800 students are either in joint degrees or in double degrees [28].
This large, yet exclusively humanities-oriented panel, therefore provides an appropriate field in which to conduct qualitative and reflexive experimentation on the role of community members in policy making.
In the process, we address the following questions:
  • Research Question 1 (RQ1)—What are the current barriers to the digital turn, as seen by non-strategic members of the community?
  • Research Question 2 (RQ2)—How can an inclusive, user-oriented participative approach be implemented in the digitalization of the university, i.e., how can participation and adherence be ensured?
  • Research Question 3 (RQ3)—Which ethical issues are at play when building policies based on such approaches?
Specifically, we have six principal hypotheses about how the effectiveness of policy experimentation will vary. The assumptions guiding the paper were as follows:
H1. There is a lack of information, leading to a lack of acceptance.
H2. This lack varies in accordance with social factors, including marginalization.
H3. A more participative process is requested by the concerned parties.
H4. Where they exist, participative processes remain quite lethargic due to a lack of incentives.
H5. Participative processes only concern and include those who are favourable to the topic of digital education.
H6. Participative processes only concern and include those who are interested in the topic of digital education.
In order to bypass possible bias stemming from the authors’ close relationship with the institution, the choice was made to use different sampling techniques when addressing the target groups.
Non-probability sampling techniques were useful in this exploratory and qualitative phase of the study, as the aim was not to test a hypothesis about a broad population, but to develop an initial understanding of a small one. After a limited voluntary response (discussed below), a more judgemental sampling was implemented to save time and select a committed and diverse sample for the focus group. Such diversity concerned not only the different bodies (students, administrative staff, and teachers) but also clusters such as gender, nationality, associative engagement, and interest in or position vis-à-vis digital technologies. Encapsulating such diversity required an extensive knowledge of those involved in order to select the sample most useful to the purposes of the research and gather a varied range of data on their experiences.
The second experimentation phase first built on this sampling, combining a voluntary response (in which students already involved were free to volunteer or not, while a call for contribution was also disseminated) with a snowball sampling in which students recruited other potential participants with similar characteristics. However, in order to extend the representativeness of the study, the second part of this phase, which consisted of a survey, relied on a voluntary response. The survey was sent out to all 45,000 students, so recruitment was open rather than targeted. All the same, it is still not possible to speak of a probabilistic sample, as some respondents were inherently more likely to volunteer than others, whether due to their availability, their interest, or their acceptance of using online technologies to respond. Such circumstances were taken into consideration and are further discussed in this paper.
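The non-probabilistic nature of a voluntary response can be illustrated with a minimal simulation (all numbers below are hypothetical, chosen only to demonstrate self-selection bias; they are not the study’s figures):

```python
import random

random.seed(0)

# Toy illustration (hypothetical rates, not the study's data): why a
# voluntary-response survey sent to all 45,000 students is still not a
# probabilistic sample.  Assume 40% of students are favourable to digital
# education, and that favourable students answer more often.
N = 45_000
population = [random.random() < 0.40 for _ in range(N)]  # True = favourable

def responds(favourable: bool) -> bool:
    # Assumed response probabilities: self-selection by interest.
    return random.random() < (0.35 if favourable else 0.20)

# Keep the favourability flag of each student who chose to respond.
respondents = [fav for fav in population if responds(fav)]

pop_rate = sum(population) / len(population)        # true favourable share
sample_rate = sum(respondents) / len(respondents)   # share among volunteers

print(f"favourable in population:     {pop_rate:.1%}")
print(f"favourable among respondents: {sample_rate:.1%}")
```

Even with every student invited, differential willingness to respond skews the respondent pool toward those already interested, which is precisely why the sample is treated here as non-probabilistic.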

2.2. Data Collection in Two Phases

2.2.1. First Experimentation Phase

The first experiment (hereafter Phase 1) lasted between January 2022 and April 2022.
The purpose of this phase was to examine the expectations of users of the EU universities in digital times and report on outputs stemming from the creation of an engaging and inclusive imaginative process for members of the community, in which ideas are received, deepened, and put into use. This process aimed at wholly involving diverse university members and thus strengthening their perception of themselves as key players in their universities. At the same time, it aimed to extend the network of the OpenU project within the university while engaging members in marginal discussions that would feed into the institutional level and current interrogations within policy spheres and EUAs. While working on expectations and imagination, mediation was used to lead the collective work toward defining the changes in digital policies necessary to meet and satisfy these expectations and hopes.
To that end, a series of focus group working sessions were held between members of the Paris 1 community, as shown in Figure 1. The primary aim of the focus group working sessions was to assess and identify expectations of users of the EU universities in digital times. Collective work was led to define changes in digital policies necessary to meet and satisfy the expectations and hopes. A certified professional facilitator was recruited to chair the meetings, train the trainers, and ensure impartiality.
These sessions heavily relied on the philosophy of focusing on the real needs of users. They also built on inclusion and the necessity of leaving room for plurality, as they intended to sensibly welcome differences (of means, perspectives, priorities, etc.) between the present/future of some and the present/future of others. The focus groups were based on an adapted Design Sprint process in that they worked on the basis of a rapid approach, allowing participants to understand, analyse, decide, imagine, and test thanks to user feedback, within an imposed time constraint and in the absence of iterations. While the process did not limit itself to five consecutive days, in order to adapt to the community’s calendar, its priority remained to limit the risks and uncertainties linked to innovation. Beyond reaching an immediate result, it moreover stimulated creativity, improved credibility, engaged diverse views, and generated strong motivation and training of the concerned sample.
This format proved to be well adapted to the exercise, as it clearly established who does what, when, how, and with whom, yet without interfering in the collaborative multidisciplinary innovation method that is at the heart of the focus group.
Through a call for participation, 12 members were identified from the three bodies of the academic community within the university (students, teachers, administrative staff). This group was diverse, as it included members both familiar and unfamiliar with digital tools, teachers skeptical of them, foreign students, undergraduate students, PhD candidates, alumni, administrative staff, and members involved in Una Europa. Participants were to experience a deep sharing in the thinking process, which is necessary to foster a feeling of belonging and to trigger deep involvement. They should be able to share hopes, expectations, and dreams about the digital university and about the university’s capacity for change that would allow for better experiences as students, teachers, and administrative staff.
The steering group monitored and approved the quality and ethics of the expected project results against the progress indicators and the key questions set at the beginning of the project.
By the end of this project, it was expected that participants would have engaged in an imaginative process that they found relevant to the OpenU project, that they would feel their contributions had been received with openness and had contributed to the final outcome, and that their results could feed into the downstream discussions and steps within OpenU.
Personal data was limited to what was necessary for the purposes of processing the data: name, status in the university, and discipline.
However, a privacy statement was drawn up concerning the personal data, which was all the more necessary considering the trust environment that was built. For this reason, the confidentiality of all participants’ information shared in this focus group was respected.
Figure 1. Structure of the Experimentation Phase 1 (designed by © Kumquat).
Two overarching questions were designed by participants in the experiment and guided the experimentation: How are we to consider the digital university and what does it look like? What are its challenges and limits? What are the hopes, fears, and priorities regarding the use of digital technology in order to ensure an optimal implementation of education, research, and study?
Such a qualitative method made it possible to give control to the participants, within well-defined limits, while giving them a certain liberty in their discourse. The participants take a direct part in the production of knowledge, exercise immediate control over the work and conclusions of the researchers, and directly link the analyses to a social praxis that is quite different in nature from scientific experimentation as conventionally conceived. It is important to note that this is not contradictory to the research register, because it is still a matter of taking the initiative, ensuring the methodological conduct, and assuming responsibility for the conclusions.
The role of iteration here allows “a loop-like pattern of multiple rounds of revisiting the data as additional questions emerge, new connections are unearthed, and more complex formulations develop along with a deepening understanding of the material” [29]. It is less a question of asking good questions than of asking good questions of oneself, so that they emerge naturally in the course of the interview, which is conceived as a true interlocution [30]. Such a question is therefore to be linked to a deeply reflexive process, the key to sparking insight, developing meaning, and progressively leading to refined focus and understandings.
It is also noteworthy that these two questions themselves stem from a number of diverse questions that came out of a collective discussion, such as:
  • What does the ideal digital university look like?
  • How does digital use correspond or diverge from the missions of the university?
  • How does the digital university raise concerns?
  • How does digitalization of the university have an impact on your daily life and work?
  • What are your expectations with regard to the security and privacy of digital universities (and why)?
  • Where to start?
  • Does digitalization improve pedagogical content?
  • How do the fragmentation and interoperability of digital services affect your work?
  • What did you learn from your experiences with the digital university during the pandemic? (positives and negatives, necessary/desirable/unwanted things)
  • Is digitalization necessary? Is it inevitable?
  • Can links be established between the digital university and society at large?
  • What institutional guarantees would you like for the digital university? (diploma, ECTS, transferable credits to your home university)
  • How to overcome the language barrier?
  • How can digital functionalities support mobility?
  • Which changes would be necessary for the coordination and delivery of this new pedagogy?
The objective was therefore to sufficiently engage members of the experimentation collectively so that they would provide replies to those questions.

2.2.2. Second Experimentation Phase

The second experiment (hereafter Phase 2) started in May 2022 and ended in September 2022. One of its main targets was to build and conduct a survey on digitalization among the student population of the university. The guiding principle of this phase was to include students from the outset as the main actors of EU HEIs, which implies a high degree of student involvement in the HEI’s teams. Students were involved in the project from the beginning and were called upon to lead and conduct the experimentation, and thereby the policy making, inside the project. Students were also, naturally, the target of the questionnaire and took part in the analysis of the study.
Students took up the question of digital pedagogies in order to design a survey that reflected their needs, fears, and expectations. Special attention was given to the diversity of participants. The steering committee was made up of six students from different fields (informatics, arts, economy, history), different levels (bachelor, master, doctorate), and different profiles (CPGE before university, foreign, L1 to L3 at university, reorientation during the course, etc.). The steering committee was involved in (1) the survey design, (2) the launch of the questionnaire, (3) joint meetings with JU students, (4) the interpretation of the results, and (5) the communication of the results.
The objectives of the survey, as decided by the students participating in the Steering Committee, were to learn about students’ needs and expectations (both in times of crisis and outside such periods) in order to inform the current reflections. The survey was therefore a way for students to become actors of change, to actively participate in the evolution of their university, and, potentially, to be heard. In particular, the survey aimed to examine the strengths and opportunities of digital technology in the university, to gather feedback on past experiences with it, and to entertain possible prospects for improving the digital system.
It was decided that there would be no limit on the number of students. Excluding courses outside the Learning and Research Units (IAE, ISST, EDS-IED, CIPCEA, DEVE-PSC), as well as Erasmus courses, exchange programs, delocalized courses, CPGE, etc., was perceived as a potential limit to the diversity of responses. The steering group felt that all students enrolled at Paris 1 would be concerned, and that those involved in mobility and exchange programs could have interesting opinions, even though it was acknowledged that they would quite likely not respond. The survey therefore concerned all 47,318 students, with a first question on the respondent’s profile in order to identify particular sources and contexts.
The questionnaire consisted of thirty-two questions, including four open-ended questions, covering five different aspects:
  • Identification of the respondent.
  • Distance learning conditions.
  • Students’ experiences with digital technology at the university.
  • Preferences of the student for improvement of the digital system currently in place.
  • Students’ preferences with respect to communicative and informative measures in the digital university.
Respondents to the survey were informed of the context and purpose of the data collection. The information collected was processed by computer, and the data concerning students were used in a strictly anonymous way. The recipient of the data collected is the University of Paris 1 Panthéon-Sorbonne, which will use it to establish indicators.
The survey was launched online on the BLOOM platform using the LimeSurvey tool on 17 June 2022 (LimeSurvey Community Edition Version 5.2.14+220214, LimeSurvey GmbH, Hamburg, Germany). An email reminder was sent on 28 June 2022, and the survey was closed on 7 July 2022. A total of 304 students responded. Participation was optional and anonymous. In accordance with the French Data Protection Act of 6 January 1978 and the RGPD (GDPR), students had the right to access and rectify data concerning them.
In both of these phases, the aim of such narrow but well-identified experiments was not to replicate the current disposition but rather to challenge it. Here, policy experimentation is designed as a means to possibly reach out to actors who have normally been excluded from public governance processes. Therefore, the valuing of innovation would not be specific to that publicly expressed by the education system, but would be directly linked to social representations of innovation [31].
Notably, both phases put forward a participatory approach, designed to actively involve the target population, i.e., students. While participatory evaluation processes are sometimes deemed less “scientific” and “objective” than more traditional processes, they allow stakeholders to take ownership of the results of their actions and to make them evolve according to the conclusions they have reached. This is all the more necessary as evaluation means projecting a system of values (a frame of reference) and expressing a particular point of view on the action; it is therefore important to encourage the expression of a diversity of points of view on public action, so that the social legitimacy of the evaluation is as broad as possible [32].
Participatory evaluation was deemed to offer greater external validity to the evaluation exercise, because it is discussed by concerned stakeholders, encouraging the expression of a diversity of viewpoints. The evaluative judgment is thus constructed from a multiplicity of informed opinions. The participation of stakeholders in the evaluation exercise is then seen as a guarantee that societal concerns will be better considered in the objectives of future projects, which gives these projects greater external legitimacy [33]. By organizing the exchange of points of view, participatory evaluation allows the evaluation process to be an exercise in the co-construction of public action. The confrontation of one’s point of view with that of others, the better understanding of the motivations of the other stakeholders, and the identification of points of convergence and areas of irreducible disagreement between actors will enable progress to be made in the collective construction of the decision-making problem. In a way, it is a question of betting on collective intelligence and mobilizing the energy resulting from differences to channel it towards the creation of something that has never been created before.
This evaluation, by seeking to give voice to those traditionally excluded from public debate—particularly the most disadvantaged groups—aims to broaden and enrich public debate. There is here an emancipatory purpose expected of participatory evaluation [32]. If the citizen is involved, this breaks down his or her feeling of apathy, isolation, and powerlessness.
One of the longer-term aims was that results of the evaluation will have all the more chance of being used if the students have participated in the different stages of the evaluation process and they therefore better assimilated the analyses and results of the evaluation. In addition, the more they have contributed to the evaluation process, the more likely it is that they will agree with the evaluation findings. It is therefore hoped that the recommendations will be easier to implement and that there will be fewer obstacles to the solutions adopted [34].
Among the participatory evaluation methods, the choice was made to resort to accompanied self-evaluation. This was viewed as one of the most complete in that context because (1) all the participants in the implementation of the project are the actors of the evaluation, from the definition of the objectives to the conclusions, and (2) the methodological and institutional support is provided by an external facilitator, either the ORIVE or the IAF, who brings their competence and the necessary distance from the project to learn and evolve. Accompanied self-evaluation allows the actors to retrace together the path they have taken and to form a long-term vision of what they want to pursue: to revisit the major stages of the project, to see how the objectives have been implemented, how they have evolved and why, and to identify the obstacles and the resources of which they were not necessarily aware. This is an exercise that requires a special kind of perspective and questioning. Moreover, the evaluation can lead to questioning that would be difficult for the group to assume if no one were there to regulate the process.

2.3. Data Analysis

Ultimately, by September 2022, the material used for the analysis presented in this paper consisted of the following milestones and outputs:
(1) Results of benchmark evaluations as collected at the start of the first phase from participants of focus groups and steering committee meetings.
(2) Material outputs stemming from working sessions of the first phase—here, written recommendations intended for strategic, institutional, and political levels, as well as a tool kit for facilitating interactive and inclusive pedagogies and decision processes.
(3) The final survey resulting from exchanges and meetings with the Student Steering Committee during the second phase.
(4) Data collected through the survey from 304 students of the University Paris 1 Panthéon-Sorbonne.
(5) Participatory rapid appraisals and observations emerging from working sessions and meetings of both phases, collected in a collaborative and overarching log book.
Each of the two experiments used both qualitative and quantitative methods to analyze its data.
In Phase 1, data were collected during focus groups and through pre- and post-session surveys.
Through participant observation, text analysis was implemented in order to gather sentiment information. This technique allows the intentions and emotions of discourse to be understood and monitored (whether positive, negative, or neutral) and to be analyzed in relation to certain factors. In order to maximize realism, classifications were identified only through participant observation rather than being purposely triggered. Special attention was given to discourse-generated model practices and to argumentative and/or legitimating elements.
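The sentiment classification described above was performed by hand through participant observation. Purely for illustration, a naive lexicon-based version of the same idea could look like the following sketch; the word lists are invented and are not the study’s coding scheme.

```python
# Naive lexicon-based sentiment tagging. Illustrative only: the study's
# classifications came from participant observation, and these word lists
# are hypothetical.
POSITIVE = {"open", "share", "opportunity", "confidence"}
NEGATIVE = {"fear", "isolation", "discord", "concern"}

def sentiment(utterance: str) -> str:
    """Tag an utterance as positive, negative, or neutral by word counts."""
    words = set(utterance.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("there is fear and concern about isolation"))  # negative
```

A manual coder would, of course, weigh context and irony in ways such a lexicon cannot, which is precisely why the study relied on observation rather than automation.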
Furthermore, analyzing argumentative, dramatizing, and evaluative statements, causal (cause–effect) relations and their link to responsibilities, problem dimensions, value implications, moral and aesthetic judgments, consequences, possible courses of action, and others allowed us to build a case for action-generating schemes and frames, which were reflected in further outputs.
Given the focus on discourse analysis, it was crucial to use qualitative methods in order to yield findings that reflected the participants’ perspectives, experiences, and emotions on a topic that, although part of daily life, remains tied to the private sphere.
This was also complemented with a regression analysis using historical data (pre- vs. post-) to understand how the dependent variables were affected. This method was used in analyzing the survey responses. The same questions were asked both in an initial questionnaire before the project started and in an evaluation questionnaire after it had taken place, concerning knowledge of digital university services, opportunities to access and assess digital services, satisfaction and participation, and perspectives on participative processes and channels. The results from both questionnaires were then compared to demonstrate the evolution of opinions over the course of the project. A comparison diagram presents the results of the initial questionnaire alongside those of the evaluation questionnaire. These data were also cross-referenced with outputs from the aforementioned text analysis in order to monitor changes that occurred throughout the project.
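As a minimal sketch of such a pre/post comparison (the Likert coding and the response values below are invented for illustration, not the study’s data):

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly
# agree) from the same nine respondents before and after the project.
pre  = [2, 1, 3, 2, 4, 1, 2, 3, 2]
post = [4, 4, 5, 4, 5, 4, 4, 5, 4]

def mean_shift(pre, post):
    """Average per-respondent change between the two questionnaires."""
    return mean(b - a for a, b in zip(pre, post))

print(f"mean shift: {mean_shift(pre, post):+.2f} points")  # prints "mean shift: +2.11 points"
```

A positive shift indicates that opinions moved towards agreement over the course of the project; the actual analysis further cross-referenced such shifts with the text-analysis outputs.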
Similarly, Phase 2 required both qualitative participant observation and a more quantitative analysis. By joining and participating in student discussions, we simultaneously observed and documented interactions, noting invaluable information on topics subjects would be reluctant to discuss in interviews because they are considered obvious until discordance arises. A thematic analysis was then conducted: the shared data were noted before identifying and reviewing five main themes: perception of digital tools, perception of digital practices, perception of digital use, perception of the institution, and perception of students. Each theme was examined to gain an understanding of participants’ perceptions and motivations.
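A rough sketch of how such a thematic coding pass could be tallied mechanically is given below; the keyword lexicon is hypothetical, as the study’s coding was done manually by the observers.

```python
from collections import Counter

# Hypothetical keyword lexicon for the five themes; illustrative only.
THEMES = {
    "digital tools":     {"platform", "moodle", "tool"},
    "digital practices": {"recording", "hybrid", "remote class"},
    "digital use":       {"email", "enrolment", "schedule"},
    "institution":       {"administration", "governance"},
    "students":          {"isolation", "peers", "group work"},
}

def code_note(note: str) -> Counter:
    """Return the themes a field note touches, by simple keyword matching."""
    text = note.lower()
    return Counter(theme for theme, keywords in THEMES.items()
                   if any(k in text for k in keywords))

themes = code_note("The platform works, but group work suffers from isolation.")
# themes counts "digital tools" and "students" once each
```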
For students who had already participated in Phase 1, inference served to examine whether there were significant differences between the two phases. The main effects of each phase were identified, and comparisons or contrasts were established to determine between which conditions a difference was observed.
Additionally, the survey analysis was also conducted by a student, in line with the overall structure of the phase. Before quantitative analysis, the gathered data were prepared: the dataset was checked for missing data and outliers. The responses to the 28 closed-ended questions were entered into the computer, and the analysis started with a simple presentation of the data, a “flat sort”. This was done in the form of a table (all the data were included, which is appropriate for small numbers) or in the form of a graph, to give a synthetic view of the data and a general trend.
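The “flat sort” amounts to a simple frequency table per question. A minimal sketch, with invented responses, might be:

```python
from collections import Counter

# Hypothetical answers to one closed question; the real survey had 304 respondents.
responses = ["favorable", "unfavorable", "favorable", "mixed",
             "unfavorable", "favorable", "mixed", "favorable"]

def flat_sort(responses):
    """Raw count and rounded percentage for each answer to a single question."""
    counts = Counter(responses)
    total = len(responses)
    return {answer: (n, round(100 * n / total)) for answer, n in counts.items()}

table = flat_sort(responses)
# e.g. table["favorable"] == (4, 50): 4 respondents, 50% of the sample
```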
After the simple presentation of the data, the hypotheses for analyzing and understanding the responses were discussed together with the ORIVE and the research team, question by question or criterion by criterion. This involved comparing the data with each other (from questionnaires and interviews) or with the existing literature review. Some questions were cross-referenced to identify links. These cross-references and cross-sortings were presented in the form of a graph, with mention of a possible statistical link if the data were representative. A breakdown of the results by field of study was carried out in order to extract potential similarities between the fields of study, in accordance with the grouping predefined by Université Paris 1 Panthéon-Sorbonne. Five sheets grouping the students by discipline were produced: Art; Law–Political Science; Economics–Business–Mathematics; Geography–History–Art–Philosophy; and Institutes (IAES-IDUP-IEDES-IREST). This breakdown implied that students enrolled in a double degree program across two fields of study were counted in each field.
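The breakdown by field, with double-degree students counted once per field, can be sketched as follows (the respondent records are invented for illustration):

```python
# Hypothetical respondent records: each lists every field the student is
# enrolled in, so a double-degree student appears on two sheets, as in the study.
respondents = [
    {"fields": ["Law-Political Science"], "favorable": True},
    {"fields": ["Law-Political Science",
                "Economics-Business-Mathematics"], "favorable": False},
    {"fields": ["Art"], "favorable": True},
]

def by_field(respondents):
    """Return {field: (headcount, number favorable)} with double counting."""
    sheets = {}
    for r in respondents:
        for field in r["fields"]:  # double degrees land on each field's sheet
            sheets.setdefault(field, []).append(r["favorable"])
    return {f: (len(v), sum(v)) for f, v in sheets.items()}

sheets = by_field(respondents)
# the double-degree student is counted in both of their fields
```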
Overall, the results were expressed as percentages in the graphs, and as numbers and percentages in the tables, except for the sheet concerning art students, where they were only expressed in headcount due to the small number of students (seven students). As the number of respondents was small, the results must be interpreted with caution and the percentages are given as an indication. In addition, due to rounding, some totals may be less than or greater than 100%.
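The rounding caveat is easy to reproduce: with three equally frequent answers, the rounded percentages sum to 99 rather than 100.

```python
# Three equally frequent answers: each rounds to 33%, so the total is 99%.
counts = {"yes": 1, "no": 1, "mixed": 1}
total = sum(counts.values())
percentages = {k: round(100 * n / total) for k, n in counts.items()}
print(sum(percentages.values()))  # 99
```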
For the multiple-choice questions, the results were presented in a similar way to the single choice questions.
As for the open-ended questions, a content analysis was carried out in which description and data analysis are presented together. An initial analysis of each question was carried out, reviewing the consistency of responses, possible contradictions, and statements that directly or indirectly identify the respondent. This was followed by a cross-sectional analysis, question by question. The report classically presents a question-by-question analysis or, more frequently, a criterion-by-criterion analysis of all the interviews, illustrated by excerpts (with respondent anonymity preserved). The aim is to identify the homogeneity or, on the contrary, the diversity of points of view.
The target’s perception of each organizing dimension was measured. Variables whose factor loadings were concentrated on the same factor were grouped. This also served to uncover underlying variables, which allowed specific segments to be streamlined.
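Grouping variables by the factor on which their loading is concentrated can be sketched like this; the variable names and loading values are invented and are not the study’s factor solution.

```python
# Hypothetical loading matrix: variable -> loadings on two extracted factors.
loadings = {
    "ease_of_use": (0.81, 0.10),
    "tool_access": (0.74, 0.22),
    "isolation":   (0.15, 0.79),
    "motivation":  (0.08, 0.68),
}

def group_by_dominant_factor(loadings):
    """Assign each variable to the factor with its largest absolute loading."""
    groups = {}
    for variable, row in loadings.items():
        dominant = max(range(len(row)), key=lambda i: abs(row[i]))
        groups.setdefault(dominant, []).append(variable)
    return groups

groups = group_by_dominant_factor(loadings)
# factor 0 gathers the "means" variables, factor 1 the "well-being" ones
```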
Since data analysis and interpretation can be influenced by the personality and culture of the evaluator, having unbiased data collected in a neutral and fair manner ensures validity and reliability.
In both phases, neutrality and impartiality were considered when constituting an evaluation team that would overcome biases in data collection, as well as in the analysis of the results obtained, for better objectivity when scaling up or replicating in another context. While in the first phase an impartial facilitator was recruited, in the second phase the student analyst was assigned by the ORIVE service on account of their reliability. This therefore guaranteed administrative and political independence.
In parallel, given the mixed-methods design, relying exclusively on a neutral and impartial team would risk a superficial, rather than in-depth, analysis of cases or phenomena during the process. Both independent entities were therefore supervised by institutional components, namely the International Affairs Department and the ORIVE. The ORIVE furthermore declared the survey to the data protection officer, as it was implemented not through internal services but through the OpenU project and its platform. Most importantly, the information collection stage required prior knowledge and expertise in the field to better understand the responses and collect information. The interpretation therefore relied on regular and open peer discussions with analysts and was finalized by the authors.

3. Results

3.1. Remarks on the Perception of Digital(ized) Universities

3.1.1. Digital University: Definition

In both experiments, a considerable amount of time was devoted to student and staff views on their outlook for the digital university, which first required specifying the meaning of “digitalization” and “digital university”.
One of the main questions to be answered in the first phase indeed related to the way the digital university was envisioned.
Qualitatively, an implicit association test was conducted at the start of Phase 1. Not only was this research method seen as varied and playful, but it also studies automatic, often unconscious, associations of ideas present in memory, revealing associations and preferences that an explicit method would miss or capture incorrectly, because the target group itself is often not aware of them.
The first exercise elicited the images participants spontaneously associated with the term “digital university”. Table 1 below shows the intuitive associations recorded during this first session.
It is thus possible to classify these answers in several categories relating to:
  • Means: equipment, tools, computer, functioning, wheels, slope, skidding.
  • Interaction: collaboration, exchange, open, argument, commotion, discord, argument, share, diversity, jump, multitude.
  • Environment: calmness, residence, domesticity, privacy, international, fragile, outside, window, fragile, fear, slope, language.
  • Time frame: future, old, slow, antiquated, temporality.
At the completion of this first phase, the focus group provided a definition of a university that would be digital, as stated below (Figure 2a). However, perhaps more explicit is the definition that was agreed upon by students in Phase 2.
Figure 2 below showcases specifications that were used to clarify the meaning of digital university in both phases.
In both definitions, it is notable that digitalization is placed at a distance from university rather than in juxtaposition with it. A question that was raised during both phases was indeed the contradiction existing in the term “digital university”.
During Phase 1, it was noted that by the end of the focus group sessions, the question regarding the digital university was often adapted by discussions to become “how do we envision the university in the face of the digital,” “how do we envision the digitalized university,” or “how do we envision a university where digital has its place?”
More explicitly, an exercise interrogated the participants’ perception of the university through a creative writing exercise. Participants were asked to write a letter to the university (theirs or in general). Letters were compiled into a word cloud, which is shown in Figure 3.
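Compiling the letters into a word cloud boils down to a stopword-filtered frequency count; a minimal sketch follows, in which the letter excerpts and stopword list are invented.

```python
import re
from collections import Counter

# Hypothetical letter excerpts; the real corpus was the participants' letters.
letters = [
    "Dear university, I hope change comes and students can share together.",
    "Dear university, give students time and new tools to meet.",
]

STOPWORDS = {"dear", "i", "and", "can", "to", "the", "a"}

def word_cloud_counts(texts):
    """Frequency of non-stopword words across all letters."""
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS)

counts = word_cloud_counts(letters)
# "university" and "students" dominate, as the most repeated terms
```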
Answers here were mostly classified into the categories below:
  • Participants: students, teacher, together, student, people, professors, users.
  • Timeframe: years, after, already, schedule, moment, then, time.
  • Variation: change, develop, become, new, can, project, transition, turning point, chance.
  • Means: tools, equipment.
  • Well-being: confidence, concern, request, division, justice, fear, wish.
  • Environment: life, experience, facing, have to, place, institution, opportunity, power, political, politics.
  • Interaction: open, share, talk, meet, alone.
  • Educational content: content, classes, knowledge, practices, know, work.
Table 2 below summarizes the aforementioned word associations by comparing categories used when talking about the university on one hand and when talking about the digital university on the other hand.

3.1.2. Existing Information on the Digital University

Based on the aforementioned categories, it was possible to directly collect data on the knowledge and information levels of respondents with regards to the components of the digital university.
  • Interactions
In Phase 1, each respondent was presented with two statements related to interaction in the digital university: “I reflected on my own perspectives on the digital university” (statement 1, Figure 4a) and “I know the perspectives of other members of the university community on the digital university” (statement 2, Figure 4b).
Before this first phase was undertaken, there was a wide range of answers given by respondents when asked about whether they have thought about their own perspectives on a digital university, with 4 in 9 respondents saying they had not really thought about this before. After the project, the range of responses was much narrower, with 100% of respondents in agreement.
In the same way, respondents to the initial questionnaire demonstrated a wide range of responses to the second statement, with the majority (8 out of 9) unable to agree with it. On the evaluation questionnaire, it was clear that the range of responses had narrowed, with 100% of respondents in agreement to some level with the statement.
  • Means
Means, tools, and services appeared to be at the top of the collective imagination when discussing the digital university in Phase 1 (Figure 5).
Indeed, 80% of the respondents to the initial questionnaire of Phase 1 answered that they already knew of the services. In the final questionnaire, the range of responses had narrowed, with 100% of respondents in agreement with the statement.
However, complementary answers showed that those responses were only associated with certain tools (Figure 6).
In the same way, during Phase 2, students were asked whether they felt they needed to be informed or trained for digital use, to which many answered “no” (Figure 7).
Yet, when asked about specific tools, answers were a bit more mixed, as shown in the table below (Table 3).
  • Environment and timeframe
Both temporal (before/after) and environmental (working conditions, lockdown) aspects of the digital university were heavily shaped by individual and collective experiences of online education, and particularly of the pandemic.
This was all the more the case for students in Phase 1, for whom the notion of the digital university rubbed salt into the wounds of COVID-19. Such experiences were often mentioned during focus groups.
Although the Phase 2 questionnaire specified that it aimed to assess the digital university “beyond COVID-19”, the pandemic was often referred to in unrelated open questions, showing that it was a turning point for most students. A specific section was also allocated to previous experiences, and to the particular experience of the COVID-19 pandemic (Figure 8).
In view of these answers, it seems reasonable to interrogate the perceived information, and at least to deduce that it might have been clouded by factors such as the experience of the COVID-19 pandemic and oriented towards only a few aspects of means, interactions, and environment. It is thus impossible to assert whether there is “a lack of information” (H1: there is a lack of information leading to a lack of acceptance), or whether such a lack hinders or limits any activities. It is, on the other hand, clear that participants in both phases had only a limited perception of lacking information.

3.1.3. Acceptance of a Digital University

However, this is not to say that a digital university cannot be considered, as showcased by the aforementioned definitions (Figure 2).
Indeed, when asked whether the university necessarily entails physical interactions, 63% of the students replied that, whereas physical aspects were important, they were not incompatible with online education (Figure 9). Responses regarding favorability towards digital education were slightly more mixed (Figure 10).
However, both figures show that respondents were more inclined towards compromise and that, contrary to what was put forward by H1 (there is a lack of information leading to a lack of acceptance), there is no radical lack of acceptance of digital education. H1 is thus not valid in this context.

3.2. Individual and Collective Stances towards Digitalization

Results were cross-referenced with several other variables, among which were discipline, perceived social capital (resources, gender), and interest toward digitalization.
Those results are shown in Table 4, Table 5 and Table 6.

3.2.1. Factors of Favorability

The status of favorability as a stance motivated by the field of study unfortunately remained unclear within Phase 1, mostly due to the context and the small number of participants intrinsic to this particular experiment. Aggregate trends in favorability by discipline resulting from Phase 2 are presented in Table 4.
While it is impossible to assert that a discipline would be more inherently favorable to digital education than another, it is possible to cross-check this data with other variables.
It therefore seems appropriate to review the individual and collective experience of digitalization. While this experience was strongly impacted by the COVID-19 crisis, as mentioned in the previous paragraphs, it was also affected by other variables, which we explore more extensively in the table below (Table 5).
The data show that the disciplines most favorable to digital education, namely, law and political sciences and the institutes, were not necessarily those least affected by social and financial marginalization. In fact, students from both fields recorded slightly higher rates of financial difficulties (28% and 26%). Those fields also recorded more students with good or somewhat good working conditions and environments, and especially more adequate space, compared to other fields, which might be linked to financial resources but also to other factors, such as psychological and familial ones.
However, those who expressed a more favorable stance also reported a better overall experience and slightly more adapted educational content during the COVID-19 period (Table 5), which, on the one hand, further reasserted the aforementioned impact of the pandemic and, on the other, showed the role teachers can play in facilitating favorability by adapting their pedagogies.
In order to verify this assertion further, the data were cross-validated with information on resources (namely, the question “According to you, do your resources cover the needs related to your student life and your education?”) and with replies on gender. For the latter, the choice “other” was not considered in view of its small headcount. The data are recorded in the following table (Table 6).
It is clear here that those who face higher chances of social marginalization are not necessarily less favorable to digital education; while the number of answers is not sufficient to assert this, the contrary might even seem to hold. This therefore invalidates hypothesis H2 (the aforementioned varies in accordance with social factors, including marginalization).
Conversely, it is interesting to look at those who are less favorable, or even unfavorable, to digital education. Examining their fields would mainly point to the social and human sciences, as too few respondents in the arts answered. Instead, we chose to look at responses to the open question “Why are you unfavorable?” Among the 304 respondents, 158 students answered this question, representing approximately 99% of the students who were not in favor of distance learning. As the answers were well documented, a careful reading allowed us to collect 11 expressed reasons, as shown in Figure 11.
Most students reported increasing digital and distance learning fatigue, especially as the health crisis continued to unfold. Overall, they felt more difficulty concentrating, assimilating, and motivating themselves during distance learning courses (as expressed by 46% of respondents). Nearly half of the students also expressed a sense of isolation and loneliness with a lack of interaction among students and with faculty.
Moreover, the lack of social interaction was identified as one of the greatest difficulties. Some students experienced a double distancing, both pedagogical and social, which may have led to dropout situations. Thus, 49% of respondents believed that digitalization contributes to isolation and the destruction of social ties. The confusion between the personal environment and the study environment is also difficult to manage for some, all the more so in the particular housing context of the Île-de-France region, noted by several students who suffer(ed) from living in a “slum”.
While 46% of respondents expressed difficulty concentrating and/or motivating themselves in a digital environment, the terms used and the reasons were multiple: the unsuitable environment contributed to this (“A student room is not a classroom”), as well as difficulty keeping up, but students also cited the presence of other distracting activities nearby and feeling less concerned or involved with classes.
Most of the responses were common to all levels of study and all disciplines. The only exception was mental well-being, expressed by 8% of respondents, often attributing it to personal “experience”. Approximately 75% of those expressing anxiety, depression, or psychological sequelae were undergraduate students.
Here again, the students’ positions were not inconsistent with an occasional usefulness of digital teaching, or with the fact that it is possible a few times a year. They mentioned in particular time saving, transportation, force majeure, extreme weather conditions, colloquia and conferences, and continuing education.
However, overall about 16% believed that digitization was contrary to the values of the university and to the very principle of studies (e.g., “This is not my idea of teaching”, “Nothing replaces face-to-face classes”, “A large part of the interest of the university comes from the social link”, “The university is a place of exchange”, “Learning is not a passive and isolated act”).
It is noteworthy that these principles extend beyond the university space and also raise questions about the fundamental principles that should guide our society (equality, balance, openness, etc.), e.g.: “What kind of society do we want to lead?”; “We are not robots but humans. The school, the university, is a space of confrontations and constructions of individual identity, and of the school of life, which is not learned behind a screen but lived in contact with others.”; “This is an aberration that will only contribute to the growing isolation and the rugged individualism that characterizes today’s society.”

3.2.2. Indifference and Interest towards Digitalization

The other side of the coin is indifference towards digital education and a general lack of interest.
In some theories, interest is one component of a larger construct that is mainly motivational and generally positive [35,36]. However, this view has the major limitation of being unidimensional, whereas other theories include several items targeting feeling-related valences, value-related valences, the intrinsic character of beliefs, and independent, voluntary reengagement with biology-related content and activities [37]. Most of these factors were mentioned in previous paragraphs, while reengagement is detailed further in the following section.
That being said, it is already possible to draw some conclusions on interest based on the processes of this study, namely on engagement.
Phase 1 explored the right to take an interest in your own experience, as Cavell puts it. Indeed, “The inherent issue of appropriateness in human speech (beginning with the question whether to speak, to destroy silence)—challenging as it were the issue of adequacy in making speech realistic…is bound up with seeing human speech as expressive, expressive of what matters in a human life, what counts for it, which inescapably puts at stake how much something matters, how deep or permanent or partial or unreflective our interest is in a given case” [38].
The final evaluation of this phase assessed this: 100% of respondents found that, during this phase, they were able to share their feelings and thoughts about the use of digital technology and its part in the university's ability to deliver on its mission. Responses to the final evaluation showed that the project was successful in helping the focus group better understand their own and others' perspectives on digital university services, as well as the range of digital services that exist within their university. The project did not affect the use of digital services: participants stated that they were already using the university's digital services before they were asked to take part in the questionnaire, and responses show that pre-project behaviors likely continued after the project started. Nonetheless, it is also clear that the project had a positive impact on whether members of the focus group felt able to express their thoughts on the use of digital services and technology within the university, steering away from "Never" as the majority answer.
The focus group format was also successful in rebalancing the situation by giving the floor to members of the community who are rarely heard. This is particularly true of students, some of whom did not hesitate to express their disagreement with teachers and researchers. All members, and students especially, repeatedly expressed surprise during the focus groups. Participants also clearly expressed enthusiasm, especially the students, who engaged in the debates (e.g., by taking a stand against their professors), came to better understand how the university works (e.g., employing terms usually used by staff members and analyzing existing competition between professors), and extended the horizon of their thinking. This was also evident in Figure 4a,b.
Phase 2 addressed this in a slightly more straightforward way, as students were asked whether they were looking to get engaged via information circuits (Figure 12). A total of 55% of those who answered this question responded affirmatively, showing that while neither feelings, nor value, nor engagement seemed radically positive, interest held at a stagnant, median point. The results therefore rather point to a confirmation of H3 (A more participative process is requested by concerned parties).

3.3. Participative Measures’ Impact

3.3.1. Rate vs. Content of Involvement

Despite noted interest, participation in both phases was limited.
One of the foreseen qualitative metrics related to participants' engagement in the designing process. While the expected number of participants was satisfactorily met, the process itself brought some shortcomings to light. Dissemination was inherently restricted to a smaller, more targeted audience, one interested not only in digitalization but also in its European or EU-funded dimension. The feedback received feeds into our understanding of digitalization and its perception in this university:
  • Support from the strategic and political level of the university was necessary to accompany and identify project leaders and interested networks.
  • In spite of the call explicitly mentioning "teachers-researchers, BIATSS and students", members did not necessarily identify with the word "users".
  • The call for expression of interest raised some questions regarding the legibility of the project, as well as multilingualism. Terms such as "focus groups" are still viewed as neoliberal and opaque.
  • Students also expressed the need for a dedicated, readable platform; e-mail is mainly used by professors and students' unions.
Initial adherence to such collective interfaces therefore seemed more limited than expected, in spite of using multiple channels (social media, targeted e-mails, the internal newsletter, the university website) and offering financial incentives. Several obstacles were noted, including a lack of identification with the word "user", limited ownership at the strategic level, and an existing (somewhat negative) perception of European projects.
In Phase 2, 304 participants answered the survey out of more than 40,000 students. Although the aim was not for the survey to be representative, but rather to collect insights, this is still less than 1% of the Paris 1 student population. Objective obstacles to responding were the use of an online questionnaire, as traditionally used by the institution, and the launch period of the questionnaire, which coincided with the end of the academic year. However, despite the low response rate, the sample roughly matched the distribution of students across levels, units, and fields.
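Such a claim of rough correspondence between sample and population can be checked with a chi-square goodness-of-fit test. The sketch below is purely illustrative: the level categories and all counts are hypothetical placeholders, not the study's actual data, and only the total of 304 respondents is taken from the text.

```python
import math

# Hypothetical sketch of a representativeness check: do survey respondents
# distribute across levels of study like the overall student population?
# Shares and counts below are illustrative assumptions, not the study's data.
population_shares = [0.60, 0.30, 0.10]   # assumed shares, e.g., Licence/Master/Doctorate
observed = [175, 95, 34]                 # illustrative respondent counts (sum = 304)

total = sum(observed)
expected = [share * total for share in population_shares]

# Chi-square statistic: sum of (O - E)^2 / E over the categories
stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# With 3 categories there are 2 degrees of freedom; for df = 2 the
# chi-square survival function has the closed form exp(-x / 2).
p_value = math.exp(-stat / 2)

print(f"chi2 = {stat:.3f}, p = {p_value:.3f}")
```

A large p-value (here well above 0.05) would mean the hypothesis that respondents are distributed like the population cannot be rejected, which is the weak sense in which a low-response sample can still "roughly match" the population.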
These outcomes therefore allow us to confirm H4 (When existent, participative processes remain quite lethargic due to a lack of incentive).
Nevertheless, this did not correspond to a lack of involvement throughout the project. In Phase 1, all partners were present during all of the sessions, despite other obligations (Figure 13).
In Phase 2, our attention was particularly drawn to the open-ended questions. Unlike closed-ended questions, which limit responses to options chosen by the writer, open-ended questions allow respondents to go deeper into their answers and provide valuable information about the topic at hand. Their open-ended nature allowed respondents to answer based on their knowledge, feelings, and understanding, and the responses were therefore used to obtain detailed, descriptive information.
Open questions were optional, yet had a fairly high response rate. In particular, the questions about alternatives to the way digital education was handled during the pandemic and about the reasons behind unfavorable stances had response rates of 28% and 50%, respectively. This suggests that the students who responded to the survey felt involved, as answering an open-ended question requires more involvement and time from the respondent. Moreover, most of the responses given were longer than 10 words and were often supported by examples or explanations.
Other answers addressed this paradox even more directly, such as the following one:
“Following a question, I would like to point out that the information mailing lists are non-existent in my eyes or very badly managed. I am an MBA student at IAE Paris Panthéon Sorbonne and I am bombarded with emails that make no sense for my training (language catch-up for the Bachelor’s degree, why inform me of that?) but especially my profile. Or I will be informed of initiatives (sport, language, una europa...) in which I can’t even participate when I ask for it (you are not concerned/eligible), because it seems that the pro students of the IAE Paris Panthéon Sorbonne in continuing education are not part of the students of the Sorbonne in fact, which is very sad and deprives us of exchange with other courses. We lack availability during the day of course being in professional activity in parallel with the training, but conferences or other could interest us. Also, it is curious that we receive everything (and anything?) concerning the Sorbonne, but we are not informed of the conferences of the IAE, or at least only by the screens on-site, and we are only present every 2 weeks. In short, I would be delighted to participate much more in the academic life of the Sorbonne and hope that this survey will help to mobilize in the future all the students of the university, whatever the training. Thank you for your thoughts on the subject!”
(Phase 2)
This paradox does not allow us to establish a causal link between participation and favorability (Figure 10). It is clear, however, that those who responded, even the most involved, were not necessarily the most favorable. H5 (Participative processes only concern and include those who are favourable to the topic of digital education) is therefore not confirmed.

3.3.2. Incentives for Deeper Participation

We showed above that offering incentives, namely monetary incentives to participate in the experiment, was put forward as one way to reduce attrition or outright refusal to participate. This, however, had a limited impact, as explained in the previous paragraph. We therefore explore in the following the reasons for contributing to such a participative process [39].
  • Direct, personalized interaction
Figure 14 shows the distribution of the participants (1, 2, 3…12), a very diverse group, in relation to the person in charge of the conception of the project (named A) and their own informal networks (B).
Out of 12 participants, 9 knew, directly or indirectly, the researcher who had initiated the project. Only three participated purely out of an interest in the project related to their functions in the university.
This was reaffirmed in responses given during the Phase 2 survey. When asked what would encourage them to participate in projects, students (both those who drafted the survey and those who answered it) mostly pointed to direct or personalized contact (Figure 15). Several answers were possible.
Informal networks therefore play a major role in such projects, showing that individual initiatives remain predominant in the university's landscape [40].
  • Ownership of decisions
Results of both phases draw our attention to the empowerment that stems from such participative phases, and its capacity to attract participants.
In the first phase, one of the main outputs was a policy note. According to participants, this political text aimed to establish the state of the art of digital technology in universities and to offer a number of recommendations that, although not exhaustive, seemed to the group to be essential political elements. They were conceived and organized as a political base guaranteeing that universities avoid abusing digital technology.
These recommendations were clearly informed by an ideological and political matrix advocating free and open digital practices and uses, which is why the focus group also "wanted to propose" pedagogical tools to complement the political recommendations. These "fun" tools encouraged the different actors of the universities to exercise their critical thinking and creativity through graphic supports. In both cases, the objective was to offer theoretical and methodological tools for comprehending the stakes, the limits, and the fears surrounding digital technology in French and European universities.
It is worth noting that among the political directives lies the wish to participate further, which reasserts that a more participative process is requested by members of the community. Indeed, participants in Phase 1 requested the involvement of different university bodies both in the development and definition of internal and inter-university collaborative policies and in the development of policy frameworks (especially in finding ways to enforce good practice).
In the same spirit, students who participated in Phase 2 wished to place the following in the very first lines of their survey:
“Designed by a group of students from different disciplines, this survey is aimed at the entire student community of the University of Paris 1 Panthéon-Sorbonne. Its aim is to find out your needs and expectations (both in times of crisis and outside of them) to feed into the current thinking. Your point of view and your experience will be brought to the attention of the institutions involved in the digital transformation. It is therefore a way to become actors of change, and to actively participate in the evolution of our university, and to potentially be heard. It is in this context that we would like to collect your opinion.”
  • Pedagogical gain, scientific benefit
At the beginning of both phases, participants were asked why they were interested in taking part in these processes. Their answers, provided anonymously, are highlighted below.
“I have never heard of the project, and would have never had, had I not met [person]. The topic interested me and I wanted to learn more”
(Phase 2)
“I hope that we can share the word and arrive at collective answers”
(Phase 1)
“I would like to see what this project can give as a result”
(Phase 1)
“I would like to make the project more concrete: talking about one’s fears and expectations is a first approach, but in the rest of the project, the ideal for me would be to talk about the articulations of the future digital university because how to express one’s fears vis-à-vis of a project that does not even have a substance or a form? So try to give ideas to “optimize” the platform in its practical and functional aspect.”
(Phase 1)
There is also much to be said about the tools developed at the end of Phase 1.
Reactions clearly showed that the focus groups provided participants, and trainers especially, with ideas for interactive training, facilitation, and even tool design, and opened their minds to new perspectives. Trainers expressed that they had learned from the facilitator and were keen to implement their method. The tools used and developed were requested for further use. The meeting with the two experts was also particularly enlightening, as shown in the additional optional answer given below.
“[I liked] especially:
-the context allowing the meeting between the different bodies of the universities (students, admin, profs)
-the meeting with the external speakers and the whole day of work
-the political discussions on the digital issues”
Ultimately, in addition to the policy recommendations produced, and in order to show the difficulties in constructing and guiding these recommendations, two sets of games were proposed. These were viewed as a "reflection support" that not only proposed an open and objective format, but also integrated concepts related to the political, social, educational, and ethical structures of the digital university. Game number 1 asked players to list qualities and their opposites, thus offering a large degree of freedom, while game number 2 imposed several constraints, requiring players to select some qualities among those chosen by the designers.
This device was designed to be used as "an accompaniment to the reflection process on subjects that concern the university, its internal management policy and its relations with other entities, as well as the transformations that accompany the daily life of the institution and any other form of collaborative work necessary for the proper functioning of academic institutions."
The accompanying narrative explanation further explains: "Insofar as they are collective games, they can be mobilized as much in the work within already constituted teams that compose a level of organization of the universities (a service, an office, a pedagogical department, a class) as within groups constituted ad hoc in the framework of the various projects carried out by the institution. The Spark Game can be used in the 'conciliator' or 'medium' mode to define a state of affairs based on the suggested items. Participants could engage in a 'builder' mode session linked to the IN game where their imagination is solicited to envision the shape of the project to come".
It seems reasonable to conclude, based on those elements, that participation is related to interest (H6), which is in itself dependent on other factors such as direct interaction, scientific gain, and the ability to have an impact.

4. Discussion and Conclusions

4.1. Lessons Learned

Throughout this paper, we have explored the perceptions, interest, and participation of university members within and in relation to their institution’s change.
Results showed that participants willing to contribute were interested in the topics at hand. However, this interest was not randomly or equally distributed. Theoretical models identify situational factors as critical in the development of individual interest [41,42,43], but they also show that learning contexts can promote maintained situational interest if they cause individuals to feel empowered by the knowledge presented to them in the situation [41,42]. The interest approach can also be related to situated learning as defined by J. Lave and E. Wenger [44], who stress how learning processes are always intertwined with management issues, particularly those connected to the inclusion of the "learner" in the community of practitioners that such a process implies. In the learning process, knowledge and inclusion in the community of practice can be seen as expressions of individual interest [45], explaining why our experiments' results showed no clear or radical refusal or acceptance of digitalization as such, but were instead oriented toward policies and the management that embodies them.
At the same time, the experiments conducted show that learning, and more broadly garnering interest, is congruent with the context, or process, in which it is enabled.
Lave and Wenger's works [44] have shown that learning a subject is related to the quality of the apprenticeship relationship: the peripheral position of beginners, while useful for the acquisition of skills, is at the same time a source of frustration when it does not lead to a central engagement in the activity and a clear inclusion in the community of practice [46] that emerges from the gathering of individuals engaged in a shared activity such as the "making" of HEIs. Results of the Phase 1 and Phase 2 experiments indeed draw our attention towards deontological and ethical issues underlined in the recommendations. It is assumed here that digitalization conceals ethical issues that all HEIs must deal with. Policy suggestions point to considering digitalization as an additional opportunity, but not as the solution to management problems, nor as a shortcut that allows dodging what is at stake in these ethical and deontological issues.
It therefore appears that trust was at the core of such processes. In both experimentation projects, the methodology adopted recognized that to have confidence in others is to recognize their power to act and to support them in the development of their abilities, e.g., by allowing members to express their opinions equally. This, put simply, points to trusting relations being the basis on which inclusion can be established. Conversely, an unequal distribution of technical or institutional knowledge is shown to lead towards mistrust in the technological tool and, at the same time, in European projects [47]. When digitalization becomes the restricted concern of a few people, the lack of inclusive processes entails a loss of empowerment and trust, and a growing disengagement by actors whose contributions are important for the projects, which can simply be translated as a loss of interest and exclusion. As such, the inclusion of actors in the community of practice can be considered a central factor related to the management strategies implemented in such communities. This suggests that a deontological issue is intertwined with digitalization, not because of the technical infrastructure on which it relies, but due to the social organization of work, the sharing of knowledge, and the strategies of inclusion and exclusion that such management entails, assuming that management can be understood as the way policies are embodied in social practices. In this view, "trust" in the institution can be inferred from the critical approach shown toward digitalization and technology as a whole.
At the same time, digging further into the ethics, ownership of the research, both the process of action research and its outcomes, appears to be an important issue in collaborative research due to its participatory nature and democratic approach. Dealing with ownership (the legal property and the legitimacy of any contribution) requires a higher level of attention to the management of the research process. New research ethics stress the unequal position of different participants in the process, where some are designated as leaders and others as simple contributors, reinforcing a hierarchical distribution of roles and responsibilities. Fair ethics suggest considering participants as equals, both with the researchers and with each other, even when they are designated as the "object" of the inquiry. There might be tension between the initiator of the change and the other participants, and the action research process might be seen as coercive if the necessary preparation has been lacking.
The methodology adopted for the Phase 1 and Phase 2 experiments was designed to tend towards such an ethical approach: Phase 1 gathered participants from the academic community with different statuses, but the hierarchy implied by those statuses impacted neither the cooperation nor the importance of contributions. Phase 2 was implemented by students acting as a research group, defining the targets of the survey and designing the questionnaire in collaboration with experienced actors in the field belonging to the same university. In the initial phase, an attempt was made to generate an awareness of acting together in pursuit of a common goal, sharing experiences, and considering all contributions to be of equal importance and consistency with the project. In this way, the social organization of participants (governance of the experiment) was the focus group's initial point, fixing the inclusion strategy (participation) and common rules (deontology) as the starting point for common work. Emphasis was placed on the social aspects of learning and sharing, rather than solely on the individual or the environment. Learning by 'observing and pitching in' could easily apply.
Once these conditions were settled, participants were able to engage themselves in the imaginative process required by Phase 1, suggesting that a trusting atmosphere initiated by the ethical approach had led to creative response.
In terms of policies, our analysis relied on an innovative, adaptive, and imaginative response to change, which demands an organizational climate of autonomy, immunity from interference, trust, openness, encouragement of risk-taking, and tolerance of failure; in other words, it demands the existence of freedom in Sen’s ‘process’ sense (Sen, 1998 [48]). Freedom as an ‘opportunity’ represents the driver of creativity, but this opportunity can only arise in a situation where there is knowledge, and knowledge comes into being firstly ‘on the ground’ (in the community of practice), where creativity happens. Knowledge as the fruit of creativity appears to generate opportunity and thus freedom. It is also knowledge that creates the organizational opportunities to pursue and to achieve those outcomes which are defined as ‘valuable’ in terms of organizational purpose.
The perspective adopted thus leads us to consider learning in terms of inclusion in or exclusion from the community of practice. The two experiments suggested a different consideration for the digital tool: the knowledge at stake here does not only concern the contents of the learning opportunities, whether they are online or face-to-face [49]. They suggested that digitalization is an issue insofar as we look at it as the final target of apprenticeship and not as an intermediate stage of learning. Knowledge here should be seen as a meta-tool: at a meta-learning level, it concerns the way users learn how to learn with the digital tool, leading to inclusion in the greater community of digital users and successful students. The fears expressed related to sentiments of isolation and an overall incapacity to engage in a deep and authentic learning process.

4.2. Ethical Limitations and Recommendations for Future Research

While we have already addressed the quantitative limitations challenging the representativity of respondents' answers, this study also had important limitations that should be addressed in future research.
Nowadays, as an extension of the climate of mistrust mentioned previously, policy debates appear as an oppositional moment where stakeholders raise questions and challenges to the authority and rationality of governance. For these reasons, the artificiality of the experimentation was perceived and fed mistrust in the project (which in itself proves the mistrust participants harbored toward the institution), directed not towards other participants, but towards its ultimate goal. While the group was managed so as to create a peaceful and positive climate, it was clear to all participants that the results would not be used by the project managers to intervene in a policy-making process at a high authority level [50]. Recommendations, one should say, are not policies. It was rightfully noted that participants were not official, institutional stakeholders normally engaged in local, national, or international policy making.
While this limit does not reduce the impact of the results or invalidate them, it underlines that the experimentation itself would not have resulted in an intervention in the policy-making process, despite being aligned with a top-down vision of policy making.
In opposition to a top-down framework in policy making, a bottom-up approach would have led us to conceive a different experiment grounded in field observations (an ethnography of uses and organizations), based on the lesson from cultural studies that a reflexive knowledge is always present and belongs to practitioners even if it is not formalized in a scientific format. Such a bottom-up approach would therefore also have led to recommendations pointing towards adapting governance to existing practices in order to enhance, facilitate, or support them [51], assuming that such practices result from a local, rational engagement of actors in a socially distributed knowledge environment.
Instead, it is not surprising that participants produced items and tools useful for facilitating discussions and debates about changes in HEIs, responding to the main targets of the project with a need to open a larger debate on the issue, and embodying the position of investigators and watchdogs rather than decision makers (consistent with the drafting of recommendations).
It thus seems that participants simultaneously adhered to a top-down framework and to the vision of the political importance of the digitalization and Europeanization of HEIs, and engaged themselves in a project supposedly conceived within that framework. Nonetheless, mistrust of the top institutions that must carry out changes in HEIs continues to grow, and the outcomes produced ultimately do not resemble recommendations, probably because of the ambiguous position participants occupied as both bottom users and experts. The artificiality of the situation should drive us towards a conception of research that puts users at the center of an observation process in the field, raising social knowledge about practices, and eventually engaging them in the policy-making process as experts of such practices and communities of practice within a bottom-up framework.
Commitment to research processes transforms subjects and triggers the emergence of a different consciousness about the epistemological background of social practices and the knowledge involved in them, producing performative effects and transforming a social group from an "object" of research into a "subject" of knowledge.

Author Contributions

Conceptualization, Y.C. and M.R.D.’O.; methodology, M.R.D.’O.; validation, M.R.D.’O.; formal analysis, Y.C.; investigation, Y.C.; resources, Y.C.; data curation, Y.C.; writing—original draft preparation, Y.C.; writing—review and editing, Y.C.; visualization, Y.C.; supervision, Y.C.; project administration, M.R.D.’O.; funding acquisition, N/A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was co-funded by the European Commission in the framework of the Erasmus+ OpenU project (KA3-606692-EPP-I-2018-2-FR-EPPKA3-PI-POLICY). It represents only the opinion of the authors. Neither partners nor the European Commission can be held responsible for any use that may be made of the information contained therein.

Institutional Review Board Statement

Ethical review and approval were conducted by the Steering Committee Board of the concerned study, as described in pages 8 to 9 of the paper. All quantitative data were approved by the institutional Observatory of Results, Professional Insertion and Student Life (ORIVE) in Paris 1 Panthéon-Sorbonne, in accordance with Law n° 78-17 on Information Technology and Liberties (6 January 1978) and the GDPR, and following the approval of the institutional Data Protection Officer (June 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study; the consent records are archived as public information by ORIVE.

Data Availability Statement

Data can be consulted by contacting the corresponding author indicated above.

Acknowledgments

The authors thank the participants of the focus group and the student Steering Committee for their participation. While participants of the focus group wished to remain anonymous, special thanks go to the students who were involved in those experimentations. This work would also not have been possible without the intervention, advice and support of the following: Bruno Selun (Kumquat), who facilitated and coordinated Phase 1, Johannes Posel (Freie Universität Berlin) for his technical assistance, and both DPEIP and ORIVE (Université Paris 1 Panthéon-Sorbonne) for their availability, their work and advice, especially Elodie Hutin, Elodie Mette, Pamela Torres and Agnes Garcia.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 2. Excerpts of Policy Note (Phase 1) and Questionnaire (Phase 2). (a) Description of the digital university at the outcome of Phase 1; (b) Definition of digitalization in the second phase.
Figure 3. Word cloud associated with the creative writing on the term “university” (Phase 1, © Kumquat).
Figure 4. Answers regarding knowledge of interactions, exchanges, and perspectives on digital university during Phase 1 (graphs produced by © Kumquat). (a) Statement 1; (b) Statement 2.
Figure 5. Words associated with digital in the university (Phase 1, © Kumquat).
Figure 6. Knowledge of services in the university (Phase 1, © Kumquat).
Figure 7. Answers provided to the question “Do you feel the need to benefit from an information session or a training on certain digital services or digital usage in the university?” Left to right: (1) Yes, (2) No, (3) I don’t know.
Figure 8. Answers provided by students to the question “Did the COVID-19 experience change or affect your relation to digital?” Left to right: (1) Not at all, my position remains the same, (2) Somewhat yes, my perception has remained the same, (3) A lot, the experience changed my view on digital, (4) Radically, my position is not the same, (5) I did/do not have an opinion, (6) Unconcerned, I have arrived to the university this year.
Figure 9. Answers provided to the question “According to you, does the university require physical interaction?” Downwards: (1) Yes, the university can only be on-site. (2) Yes, but this is not contradictory to online courses. (3) No, the university is not characterized by its physical attributes.
Figure 10. Answers provided to the question “Are you favorable to digital teaching?” Downwards: (1) Favorable, (2) Somewhat favorable, (3) Somewhat unfavorable, (4) Not favorable, (5) No opinion.
Figure 11. Answers provided by students to the question “If less or not favorable to digital education, explain why”.
Figure 12. Answers provided to the question “Do you seek to get informed of university news or projects held by the university?” Left to right: (1) Yes, (2) No, (3) I don’t know.
Figure 13. Workload required during Phase 1, as perceived by participants.
Figure 14. Map of participants in relation to contact person of the project (A).
Figure 15. Answers provided by students to the question “Which (communicative) factors would empower/enable you to participate in the university’s projects or activities?” Left to right: (1) Direct communication (intervention in classroom), (2) Nominative communication by mail, (3) General communication by department offices, (4) General communication by central services, (5) I don’t know.
Table 1. Words associated with the terms “digital university”.

| Participant | Function | Words Associated with Chosen Image |
|---|---|---|
| 1 | Admin. | Perspective, discord, realities, diversity |
| 2 | Admin. | Tools, functioning, collaboration, wheels |
| 3 | Admin. | Outside, window, exchange, open |
| 4 | Student | Web, fragile, solidity, domesticity, fear |
| 5 | Student | Skidding, slope, slippery, jump |
| 6 | Student | Collaboration, human touch, interaction, difficulties |
| 7 | Student | Future, English, language, international |
| 8 | Researcher | Temporality, snail, slow, residence, domesticity, privacy |
| 9 | Researcher | Commotion, argument, calmness |
| 10 | Researcher | Computer, antiquated, old, equipment, multitude |
| 12 | Researcher | Share, exchange, international |
Table 2. Categories associated with the terms “university” and “digital university”.

| Category | University | Digital University |
|---|---|---|
| Means | Tools, equipment | Equipment, tools, computer, functioning, wheels, slope, skidding |
| Interaction | Open, share, talk, meet, alone | Collaboration, exchange, open, argument, commotion, discord, share, diversity, jump, multitude |
| Environment | Life, experience, facing, have to, place, institution, opportunity, power, political, politics | Calmness, residence, domesticity, privacy, international, fragile, outside, window, fear, slope, language |
| Timeframe | Years, after, already, schedule, moment, then, time | Future, old, slow, antiquated, temporality |
| Variation | Change, develop, become, new, can, project, transition, turning point, chance | - |
| Participants | Students, teacher, together, student, people, professors, users | - |
| Well-being | Confidence, concern, request, division, justice, fear, wish | - |
| Educational content | Content, classes, knowledge, practices, know, work | - |
Table 3. Answers provided to the question “Do you know the following tools proposed by the digital services of Paris 1?”.

| Tool/Service | I Know and Use | I Know but Do Not Use | I Use If I Have To but Do Not Know | I Don’t Know |
|---|---|---|---|---|
| E-mail | 93% | 4% | 1% | 0% |
| Interactive pedagogical interface | 90% | 3% | 2% | 2% |
| Planning | 27% | 47% | 4% | 20% |
| Address list | 24% | 40% | 13% | 19% |
| Library catalogue | 43% | 28% | 12% | 15% |
| Documentary access (Mikado) | 30% | 24% | 9% | 33% |
| Documentary access (Domino) | 60% | 12% | 8% | 18% |
| Office 365 | 49% | 29% | 3% | 17% |
| Document transfer (Filex) | 9% | 8% | 4% | 77% |
| Internship and job opportunities browser | 14% | 42% | 10% | 30% |
| Mobility opportunities browser | 10% | 40% | 6% | 41% |
| Forum and chatbox | 6% | 43% | 5% | 43% |
| Collaborative document (Framapad) | 6% | 14% | 3% | 73% |
| Poll (Evento) | 5% | 14% | 3% | 74% |
| Medical appointments tool | 11% | 28% | 5% | 54% |
Table 4. Aggregate trends in favorability and discipline, Phase 2, according to the answers provided to the question “Are you favorable to digital teaching?”. For Art, raw counts are shown with percentages in parentheses.

| Field | Favorable | Somewhat Favorable | Somewhat Unfavorable | Unfavorable |
|---|---|---|---|---|
| Social and human sciences | 12% | 19% | 29% | 41% |
| Law and political science | 22% | 25% | 25% | 26% |
| Art | 0 | 3 (43%) | 1 (14%) | 3 (43%) |
| Institutes | 21% | 26% | 42% | 11% |
Table 5. Aggregate trends by discipline and social profile, working environment, and COVID-19 experiences, according to answers provided in Phase 2. For Arts, raw counts are shown with percentages in parentheses.

| Category | Item | Total | Social and Human Sciences | Law and Political Sciences | Arts | Institutes |
|---|---|---|---|---|---|---|
| Social profile: Gender | F | 64% | 67% | 70% | 5 (71%) | 69% |
| | M | 33% | 29% | 29% | 2 (29%) | 25% |
| Social profile: Age | 17–24 | 67% | 67% | 70% | 5 (71%) | 47% |
| | 25–34 | 8% | 10% | 6% | 1 (14.5%) | 21% |
| | 35+ | 25% | 23% | 25% | 1 (14.5%) | 32% |
| Social profile: Finance | Yes | 72% | 71% | 72% | 6 (85.5%) | 69% |
| | No | 26% | 28% | 25% | 1 (14.5%) | 26% |
| Social profile: Level (LMD) | Y 1–2 | 32% | 34% | 36% | 4 (57%) | 27% |
| | Y3 | 33% | 29% | 28% | 5 (71%) | 26% |
| | MA | 36% | 35% | 36% | 3 (42%) | 53% |
| | PhD | 4% | 7% | 4% | 1 (14.5%) | 0 |
| Working environment: Good working conditions | Yes | 84% | 78% | 86% | 6 (85.5%) | 100% |
| | No | 15% | 22% | 12% | 1 (14.5%) | 0 |
| Working environment: Adequate space | Yes | 63% | 57% | 71% | 1 (14.5%) | 63% |
| | No | 37% | 43% | 29% | 6 (85.5%) | 0 |
| Working environment: Equipment | Personal computer | 97% | 96% | 97% | 6 (85.5%) | 100% |
| | Internet | 76% | 68% | 82% | 5 (71%) | 84% |
| | Webcam | 93% | 90% | 97% | 5 (71%) | 100% |
| | Mic | 97% | 96% | 99% | 5 (71%) | 94% |
| COVID-19 experience: Overall | Good | 15% | 10% | 17% | 0 | 5% |
| | Somewhat good | 30% | 31% | 24% | 3 (42%) | 42% |
| | Somewhat bad | 24% | 23% | 33% | 1 (14.5%) | 11% |
| | Bad | 18% | 22% | 13% | 2 (29%) | 16% |
| COVID-19 experience: Educational content | Adapted | 48% | 48% | 50% | 2 (29%) | 63% |
| | Not adapted | 40% | 40% | 38% | 4 (57%) | 16% |
Table 6. Aggregate trends in favorability and social profile, according to answers provided in Phase 2.

| Favorability | Sufficient Resources (Num, %) | Insufficient Resources (Num, %) | Female (Num, %) | Male (Num, %) |
|---|---|---|---|---|
| Favorable | 36 (17%) | 16 (21%) | 28 (18%) | 12 (15%) |
| Somewhat favorable | 55 (26%) | 20 (27%) | 42 (28%) | 19 (24%) |
| Somewhat unfavorable | 54 (26%) | 16 (21%) | 38 (25%) | 18 (23%) |
| Unfavorable | 63 (30%) | 23 (31%) | 43 (28%) | 27 (35%) |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
