 
 
Article

Human-Centered Design as an Approach to Create Open Educational Resources

Faculty of Computer Science, Multimedia and Telecommunications, Universitat Oberta de Catalunya, 08018 Barcelona, Spain
*
Author to whom correspondence should be addressed.
Sustainability 2020, 12(18), 7397; https://doi.org/10.3390/su12187397
Submission received: 1 August 2020 / Revised: 2 September 2020 / Accepted: 3 September 2020 / Published: 9 September 2020
(This article belongs to the Special Issue Opportunities and Challenges for the Future of Open Education)

Abstract

Open educational resources (OER) play an important role in teaching and learning, especially in lifelong learning. Educational resources should be created in a way that addresses lifelong learners’ needs. Human-centered design (HCD) is a design perspective and an iterative process that involves users in all phases of design. An HCD approach can therefore provide relevant advantages when creating OER for lifelong learning. This work presents the Design Toolkit, a collection of digital open educational contents for design education created following an HCD process, as a case study. The Design Toolkit provides users with OER in a tool format rather than in a traditional manner. The main goal of this research is to contribute to the understanding of how HCD impacts OER creation. The research focuses on teachers, assessing the Design Toolkit’s content organization and analyzing teacher adoption and usage of the resources. The results show that designing OER by involving users through an HCD approach sets the focus adequately on their needs and limitations. Teachers feel satisfied with the Design Toolkit, which fosters the adoption of the OER in different educational contexts. Finally, users’ involvement in the whole HCD process points out design and educational requirements for future work.

1. Introduction

User-centered design (UCD) is defined both as a process and as a philosophy. As a design process, it is an approach to planning projects and a set of methods to be used in each phase. As a design philosophy, it aims to involve users in all phases of the design process [1]. UCD is especially useful where design solutions involve interactive technology [2]. Human-centered design (HCD) brings a broader mindset to UCD, emphasizing the humanistic approach and the way in which technology is designed [3]. This design approach has a direct link with the human pillar of sustainability, which focuses on the importance of involving users in the making of products or services [4].
Sustainable development is an oriented and heuristic process [5], in which education in all design disciplines plays an essential role in conveying human-centered values and skills to designers. In recent years, design has experienced a meaningful evolution that takes sustainable development into account, redefining the discipline and the designer’s role. This transformation has shifted the focus of design from products to ideas, people and experiences, and new design disciplines have arisen, such as design thinking, service design and open design [6].
In this context, design practitioners have become lifelong learners who need to stay up to date with new design disciplines and their related processes, methods and principles. This new design context highlights the need for easily accessible, up-to-date educational design contents that address the new challenges of the field. The main challenge is to go beyond theory-based, one-dimensional static learning resources and to provide resources in an action-oriented manner to both designers and learners. In this scenario, new design contents with a “tool perspective” have been created, mostly by companies and practitioners [7]. These actionable contents are available through repositories that follow a toolbox metaphor and are mainly aimed at practitioners. Therefore, there is an opportunity to create open educational resources (OER) for both design practitioners and learners.
This work focuses on the Design Toolkit, an open educational repository that provides content as tools. The platform allows learners to access and explore open educational design resources in multiple ways (direct access, exploration) using a navigational system designed to empower teachers and learners [8]. The Design Toolkit was designed and developed following an HCD approach through an iterative process [9].
The general purpose of this research is to contribute to the understanding of how HCD impacts OER design and development. Specifically, the work focuses on the Design Toolkit as a case study, assessing content organization and analyzing teacher adoption and usage. The HCD approach fosters teacher satisfaction in terms of user experience and promotes OER adoption.

2. Background

2.1. Lifelong Learning and OER

Lifelong learning is the process of learning that occurs throughout each person’s lifetime [10]. This way of learning often implies that learners combine their educational and professional lives. In this scenario, distance learning programs enable students to balance and adapt their learning progress to their availability and needs. Distance learning programs are usually delivered through technology, which may enhance the learning process. A key attribute of lifelong learning is that learners take greater responsibility for their own learning process [11,12]. It is commonly accepted that educational resources play a key role in distance learning programs as components of knowledge assets [13,14,15]. Open educational resources (OER) “are teaching, learning, and research resources that reside in the public domain or have been released under an intellectual property license that permits their free use and repurposing by others” [16].
OER are made possible by information and communication technologies [17], and they are released under open intellectual property licenses that allow free use or repurposing [18]. Walker [15] states that open educational resources need to be convenient, effective, affordable, sustainable and available to every learner and teacher.
Research about OER focuses on both learners and teachers. Armellini and Nie [19] state that teachers or course designers could make use of OER through four types of practices: (1) “as-is” as a planned enhancement during curriculum design, (2) “as-is” as a “just-in-time” resource during course delivery, (3) adapted OER during curriculum design, and (4) adapted OER during course delivery. Depending on how and when OER are used, these four practices can be grouped into two scenarios: curriculum design and course delivery. Additionally, Cox and Trotter present a pyramid with six essential factors of OER teacher adoption [20]. These factors are: access, permission, awareness, capacity, availability and volition.
Although the benefits of OER in education have been proven [21,22], the use of OER seems to be low in higher education [23,24,25]. Nevertheless, this does not mean that the reuse of these resources is not happening. As Glennie, Harley and Butcher [26] explain, it might be taking place “below the radar”, and Wiley uses the term “dark reuse” to refer to this unobserved behavior [27]. Thus, it is worth providing evidence of the usage of these kinds of resources in higher education. From the designer’s perspective, it is also crucial to document the process followed to design and develop those resources, seeing them as a process rather than as a product [28].
Human-centered design processes and methods aim to create solutions by focusing on the needs, wants and limitations of the users of the product [28]. Technology-enhanced learning (TEL) and OER may take advantage of HCD when designing learner-centered educational environments and contents [29,30].

2.2. OER in the Design Discipline

OER play a key role in learning processes in the design field. The design discipline is constantly evolving: new content appears while previously existing content changes. This is even more notable in the interaction design and human-computer interaction (HCI) disciplines, where constant technological changes push design practitioners to keep their knowledge updated. Although some initiatives have flourished during the last few years to provide design-related content, most of them are targeted at design practitioners. However, as models of design resources, it is worth analyzing them in depth through a benchmark method. In Table 1 of [9], “Analysis of design toolkit’s main features and contents”, we analyzed design content repositories focusing on the type of content (actionable, theoretical, etc.) and the organization of content (labeling, navigation systems, explorability, etc.). In this work we enlarge that analysis by introducing the open content perspective, analyzing the licenses used in each content repository. The results show that most of them are published under restrictive licenses that do not allow users to reuse those resources. The results are summarized in Table 1.

3. Design Toolkit

UOC’s Design Toolkit (design-toolkit.recursos.uoc.edu) is an open educational resource about design. It was developed and is being used in design-related courses at the Open University of Catalonia (UOC, www.uoc.edu). UOC is a fully online university based in Barcelona (Spain) with a community of more than 60,000 students and 3000 teachers. The UOC educational model is based on breaking space and time barriers through asynchronous distance education based on TEL. Teaching and learning mainly take place in a virtual learning environment that integrates learning contents, asynchronous communication, academic services and interaction with teachers and peers. Currently, the Design Toolkit educational content is being used in fifteen courses of five design-related programs.
The design and development of the Design Toolkit followed an iterative UCD process [24], which involved users (learners and teachers), in which a set of educational and design requirements were identified.
Educational requirements include (1) providing content as tools; (2) providing action-oriented resources; (3) allowing users to explore educational resources; and (4) distributing contents under an open license. On the other hand, the design requirements are: (5) resources must be easy to access; (6) contents should be clear, organized and up to date; (7) contents must be shareable, manageable and reusable in different learning contexts; (8) design professionals (lifelong learners) need direct and action-oriented content; (9) teachers and instructors need to be able to easily edit, organize and update content.
As a result, the Design Toolkit has four main characteristics: (1) its design followed a user-centered design process; (2) it provides design related content in an actionable way; (3) it is focused on learner autonomy and lifelong learning; and (4) the educational resources are offered under an open license.
The contents of the Design Toolkit are organized into cards, which have a common structure, and each one presents content in an actionable way, providing step-by-step guidance and examples. Some cards also provide two levels of content depth through a guide. The guide offers in-depth guidance, allowing the adaptation of contents to different educational contexts and learners. Digital content navigation is strongly linked to user satisfaction, since it allows users to understand where to find what they need. The navigation system of the Design Toolkit comprises six main categories, of which Design Methods is the most important and the one that contains the majority of the contents. The Design Methods category provides a filtering system that allows learners to filter design methods based on the design phase, the type of method (quantitative or qualitative), user involvement, as well as the difficulty, experience and duration needed. This navigation and filtering system (Figure 1 and Figure 2) empowers both teachers and learners, allowing them to explore design methods in each learning scenario.
This work mainly focuses on how the Design Toolkit open educational resource is used by teachers under a human-centered design perspective. The relationship between content organization and teachers’ needs is assessed, as well as diverse teacher adoption in educational settings.

4. Materials and Methods

In this research we follow a human-centered design iterative process and we aim to discover insights into content organization and teachers’ adoption and usage of the Design Toolkit project. In terms of the iterative process, this research constitutes a new iteration of the HCD process (Figure 3) based on the principles of ISO 9241-210 [29]. Previous work on the Design Toolkit can be found in [9].
From the educational point of view, the Design Toolkit was designed mainly for two types of users: learners and teachers. The first design iteration was primarily focused on understanding the needs, expectations and limitations of learners. Thus, learners were involved through research methods with the goal to collect their user experience with the Design Toolkit. As shown in [9], the overall experience was satisfactory, and learners preferred to use the Design Toolkit rather than the traditional educational content. That research provided insights based on learners’ needs, wants and limitations, which were mostly related to the information architecture and content organization.
Due to the importance of the information architecture on a learning content platform, collecting more information was necessary from the teachers’ point of view in order to define design requirements. Thus, three actions were carried out:
  • First, a workshop was held with information architecture experts to assess and improve the existing content organization and structure. Based on this, a card sorting exercise was defined and carried out with teachers (see Section 4.1).
  • Second, a set of focus groups was conducted with teachers to inquire about their adoption of the OER, the information architecture and how they use the Design Toolkit in teaching and learning processes (see Section 4.2).
  • Third, two validated questionnaires were used to understand the user experience with the learning contents platform: the system usability scale (SUS) and the net promoter score (NPS) (see Section 4.3).

4.1. Card Sorting

Card sorting is a useful design method in which participants need to label and classify cards in groups according to their criteria [31,32]. This user-centered design method generates an overall classification of the information and provides insights into the user’s mental model, allowing researchers to know the way users group, sort and label the content of a website [32,33].
In this research, we performed an individual card sorting with teachers who know and have used the Design Toolkit for teaching. Regarding the number of participants, in the case of card sorting, Nielsen states that a 0.90 correlation of issue identification is achieved by testing 15 users [34,35]. In our case, 15 teachers participated in the card sorting (n = 15), 60% of them male and 40% female. Their ages ranged from 32 to 63, with an average of 43.6. They had used the Toolkit for 2 to 6 semesters, with an average of 3.
A closed card sorting was carried out in which users were provided with cards already classified into specific categories [32]. This type of card sorting is more suitable for cases in which researchers aim to validate a current classification [32]. This predefined classification (Table 2) was based on the results of the work done by information architecture experts at the previously mentioned workshop. However, not only could participants arrange cards into categories, but they also could change the name of each category, and add or delete categories.

4.2. Focus Groups

A focus group is an effective qualitative method through which a broad range of opinions, feelings, attitudes and experiences arise [36,37,38]. In this method, participants discuss a topic presented by a moderator, who also drives the conversation to make important ideas flourish. We ran three different focus groups, as running two or more focus groups is thought to increase the chances of success [39]. Small groups of participants may result in too few experiences being drawn out [39], while large groups may make it difficult for participants to express their opinions. Thus, each group had between 4 and 6 participants. To recruit participants, a survey was sent to 18 teachers from courses where the toolkit had been used, and a total of 15 participants were recruited (n = 15), 46.7% of them male and 53.3% female. Their ages ranged from 30 to 55, with an average of 40.53. They had used the Toolkit for 2 to 6 semesters, with an average of 2.7. Once further information about those willing to participate was collected through this survey, two groups of similar profiles were set up to build a space for exchanging opinions and attitudes between similar participants [40,41], and a convenient time was arranged to run three focus groups: one with course designers and two with course instructors (Table 3).
Each focus group was divided into two parts. In the first, participants were asked to perform a task showing when they started to use the Design Toolkit in their teaching process. To do so, they had to point out the key timepoints on a predefined timeline (Figure 4) and explain at which points in this process the Design Toolkit was used.
The second part was an open interview guided by a moderator and analyzed by one observer, who took notes on the experience. This stage lasted between 60 and 90 min and covered the specific questions and topics of each focus group, which are summarized in Table 4.

4.3. User Experience Evaluation Methods

Usability and the perceived usefulness of a product or service have an effect on the user experience and satisfaction [42], which is of prominent relevance in TEL in higher education [43]. SUS is a popular and validated method to evaluate the usability of web applications [44,45,46], even with a small sample group [47], and it is also used to quantify user satisfaction with e-learning platforms [48]. In addition, how likely users are to recommend a product may indicate how satisfied they are with it. NPS is also a validated questionnaire, introduced by Fred Reichheld in 2003 [49]; it shows how likely participants are to recommend a company or product to a friend or colleague and categorizes participants into three categories: detractors, passives and promoters.

5. Results and Discussion

5.1. User Experience and OER Adoption

One of the goals of this research was to understand teachers’ satisfaction with the Design Toolkit educational resources. This was carried out with SUS and NPS questionnaires.
The SUS questionnaire asks participants to rate ten statements related to the user experience of a digital product using a 5-point Likert scale, from strongly disagree (1) to strongly agree (5) [45]. Each participant’s responses were scored using Equation (1), and SUS results were considered positive when higher than 70. In the Design Toolkit’s case, the SUS scale had an average value of 75.58. Cronbach’s alpha was computed to examine the internal consistency of the results, obtaining α = 0.86. It should be noted that values greater than α = 0.7 are considered reliable [50], and for SUS questionnaires (10 items) the threshold is set at α = 0.707.
SUS = 2.5 × Σ_{n=1}^{5} [(U_{2n−1} − 1) + (5 − U_{2n})]    (1)
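Equation (1) can be sketched in code. The following is a minimal, hypothetical scoring function (not part of the study's materials) that assumes the standard 10-item SUS with responses coded 1–5, where odd-numbered items are positively worded and even-numbered items are negatively worded:

```python
def sus_score(responses):
    """Score one participant's SUS questionnaire (Equation (1)).

    `responses` is a list of 10 Likert ratings from 1 to 5. Odd-numbered
    items U_{2n-1} (list indices 0, 2, ...) contribute (rating - 1);
    even-numbered items U_{2n} (indices 1, 3, ...) contribute (5 - rating).
    The sum is multiplied by 2.5 to map the result onto a 0-100 scale.
    """
    assert len(responses) == 10
    total = 0
    for i, u in enumerate(responses):
        if i % 2 == 0:
            total += u - 1   # positively worded item
        else:
            total += 5 - u   # negatively worded item
    return 2.5 * total

# The best possible answers (5 on positive items, 1 on negative items)
# yield the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A neutral participant (all 3s) scores exactly 50.0, which illustrates why the reported average of 75.58 sits comfortably above the 70 threshold.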
In the NPS questionnaire, participants were asked to answer the question “How likely are you to recommend [company or product name] to a friend or colleague?” using a 10-point Likert scale. The results were split into three categories: “detractors”, those who answered from 0 to 6; “passives”, from 7 to 8; and “promoters”, from 9 to 10 [51]. The results of the Design Toolkit’s questionnaire show that 30.77% of participants were “passives” and 69.70% were “promoters”, with scores of 9 or higher (Figure 5). The NPS was therefore 69.70, an excellent result, taking into account that results higher than 50 are considered excellent [52].
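The NPS calculation described above (percentage of promoters minus percentage of detractors) can be sketched as follows; the ratings list is illustrative, not the study's data:

```python
def nps(ratings):
    """Net promoter score on a 0-10 recommendation scale.

    Detractors answer 0-6, passives 7-8, promoters 9-10. The score is the
    percentage of promoters minus the percentage of detractors.
    """
    promoters = sum(1 for r in ratings if 9 <= r <= 10)
    detractors = sum(1 for r in ratings if 0 <= r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# When there are no detractors, the NPS equals the promoter percentage,
# which is why a ~70% promoter share yields an NPS of ~70.
sample = [9, 10, 9, 7, 8, 9, 10, 9, 9, 8]  # hypothetical ratings
print(nps(sample))  # → 70.0
```

Note that passives lower the score only indirectly, by enlarging the denominator without contributing to either extreme category.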
Although the validity of NPS is sometimes questioned [53], it is widely used to assess satisfaction in education due to its simplicity [51]. Thus, it is worth taking the results into account to get an overall satisfaction indicator. However, it should be used together with other quantitative and qualitative methods.
Focus groups revealed more detailed and qualitative information about teacher satisfaction. When asked for an overall opinion on the Design Toolkit, the majority of the participants showed their high level of satisfaction with the platform and how they adopt it in their teaching activity.
“It is an agile tool, which helps you to structure the course, it is nice to use, and I really appreciate that it is an action-oriented tool […] I also like the educational perspective, with which I agree, constructivism, which fosters the idea of creating knowledge through the interaction with the content…”
participant P1.5
“I think it is a very useful tool, not only because the explanations that it includes of each technique, model and so on…but also because it encompasses all these things in a unique website…it has a lot of potential…”
participant P3.2
As participant P1.5’s response shows, some of the feedback emphasizes the utility of the platform for designing and preparing the course. In this regard, the result of the timeline task (Figure 4), in which participants had to indicate the key points on a timeline at which they used the Design Toolkit before starting to teach the course, shows that they use the platform at the beginning of this process (Figure 6). It should be noted that while course designers start using it later, course instructors use the Design Toolkit as a tool to get an overview of the course (Figure 7).
This use of the platform fits one of the types of use that teachers can make of OER according to Armellini and Nie, in which teachers use it “as a planned enhancement during curriculum design” [19]. However, the focus groups also revealed another type of use described by Armellini and Nie once the teaching activity starts [19]: the Design Toolkit is used as “a resource during course delivery”, as participants P2.2 and P2.3 explained.
“[Marking a student’s activity], I highlight the error, and I provide them with a link to the Design Toolkit where it is explained […] if I ever avoid referring to the toolkit in the activities’ feedback, sometimes students ask me for examples, and I send them the link to the toolkit.”
participant P2.3
“I usually use it [the Design Toolkit] to validate the marking of the activities […] it is the resource that I use to check if students have done what they were asked to do”
participant P2.2
As presented in Section 3, both the design and the educational requirements aimed to provide open resources to the design community (practitioners and learners). In this regard, the majority of participants expressed satisfaction with the Design Toolkit being an open resource for the design community (participants P3.4 and P2.6).
“I like the fact that it is an open resource and that there is no private section.”
participant P3.4
“Publishing an open resource that has an academic background and a revision process, as the Design Toolkit does, has great value to the community”
participant P2.6
In fact, because the Design Toolkit is an open educational resource, several participants said that they already knew it before joining UOC’s faculty. They used the platform not only for educational purposes but also in their professional jobs. Furthermore, some of them stated that the Design Toolkit is also a beneficial resource for the institution because it increases its reputation for reliability and excellence in the design community (participant P2.1).
“From my point of view, what you have (the Design Toolkit) is branded content; in addition, it is excellent branded content.”
participant P2.1

5.2. Information Architecture

Another primary goal of this research was to get a deeper understanding of the findings obtained in previous stages of the HCD process [9] where the need for an improvement of the information architecture was raised. Information architecture relates to four areas: an organization system, a labeling system, a navigational system, and a searching system [33]. The card sorting revealed valuable information about some of these areas:
The organization system relates to how content is categorized on the platform. The card sorting results shown in Figure 8 were calculated with the unweighted pair group method with arithmetic mean (UPGMA), one of the most commonly used clustering techniques [54]. The majority of participants sorted the cards into nine groups (red horizontal line near 1 on the y-axis in Figure 8), which matches the proposal made by the experts in the workshop described in Section 4. This means that there is no evidence of a problem with the content organization of the platform.
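As an illustration of this clustering step, a dendrogram such as the one in Figure 8 can in principle be produced with average-linkage (UPGMA) hierarchical clustering. The following sketch uses SciPy with a small, made-up distance matrix; the card names and distances are hypothetical, not the study's data. In practice, each distance would be 1 minus the fraction of participants who placed that pair of cards in the same group:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Interview", "Survey", "Persona", "Wireframe"]

# Symmetric pairwise distances (1 - co-occurrence proportion), zero diagonal.
dist = np.array([
    [0.0, 0.2, 0.8, 0.9],
    [0.2, 0.0, 0.7, 0.9],
    [0.8, 0.7, 0.0, 0.6],
    [0.9, 0.9, 0.6, 0.0],
])

# UPGMA corresponds to "average" linkage on the condensed distance matrix.
Z = linkage(squareform(dist), method="average")

# Cutting the dendrogram (the horizontal red line in Figure 8) at a chosen
# height yields the final groups; here we ask for two clusters.
groups = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(cards, groups)))
```

With these illustrative distances, the two research-oriented cards cluster together and the two artifact-oriented cards form the second group, mirroring how a participant consensus emerges from the dendrogram.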
Results were also analyzed from the labeling perspective. Since dendrograms in card sorting take no notice of group labels, a qualitative analysis of the card sorting was also performed. This analysis revealed no important issues regarding the labeling system, since only two participants proposed changing group labels: one of them suggested changing two group labels, whereas the other recommended changing only one.
Since the card sorting method does not reveal information about the searching and navigation systems, related questions were discussed in the focus groups. Regarding the searching system, it should be noted that the current version of the Design Toolkit does not provide one. When participants were asked what they thought should be improved, some suggested incorporating a searching system into the platform.
“Regarding how to improve it [the Design Toolkit], I would add a searching system to find specific resources directly.”
participant P1.1
“I would appreciate a search bar […] when I need to find something, I have to use the shortcut command + F to find it.”
participant P3.1
Finally, regarding the navigation system, the majority of focus group participants expressed concern about it. Most of them referred to the difficulty of finding content in the toolkit using the filtering system. During the focus groups, the moderator showed participants a screenshot of the current filtering system (Figure 1) and asked for their opinion on it.
Although no specific improvement proposals were made, there was a consensus in all of the focus groups about the need to improve the filtering system. Some of the participants were very critical of the current filtering system (participant P3.4).
“The filters are not clear at all, and do not encourage users to explore the resources.”
participant P3.4
Since participants showed such strong concern about finding content quickly, the moderator drove the conversation in this direction to obtain more detailed information. Even though participants clarified that it was not a general issue with the platform, they also raised issues regarding access to the content. In particular, some of them pointed out the difficulty of finding the guides inside the cards (participant P2.1). Moreover, some of them proposed changes to the Design Toolkit interface to make the guides’ buttons inside each card more visible.
“Related to the guides, this second level that some of the cards have, […] it has sometimes been difficult to find them, not least for me […] it is sometimes easier to find them using a Google Search than through the Design Toolkit navigation.”
participant P2.1
In summary, the general satisfaction with the Design Toolkit in the teacher community is positive, and participants stated that they had a good user experience while exploring and using the platform. Regarding the educational aim of the toolkit, all of the participants were satisfied with the ease of use and the adaptability of the educational resources to their teaching activities. Additionally, there was a consensus among all the teachers on the benefits of publishing educational resources under open licenses. The Design Toolkit contributes to spreading design knowledge to the community, and it also improves the reputation of the university.
The results of this research also contribute to the definition of new design requirements to improve the Design Toolkit. These requirements are summarized in Table 5, and constitute the starting point for a new iteration of the Design Toolkit HCD process.

6. Conclusions

In this paper, we presented the advantages of designing and creating open educational resources as tools rather than in a traditional format through an iterative human-centered design process. This action-oriented content enabled teachers to adapt educational resources to their teaching process, both in the course design and in the course delivery phases [19]. The results reveal a high level of satisfaction with the Design Toolkit and its usability, as the SUS and NPS results show. This satisfaction is linked to the ease of use of the OER in teaching activities in different contexts, which is aligned with the OER adoption factors presented by Cox and Trotter [20].
Open licensing the Design Toolkit and making it publicly accessible contributed to the design community. The design field is mainly a professionalized area, where providing content with an educational perspective may contribute to design practitioners’ lifelong learning. Additionally, UOC, the higher education institution that promotes the Design Toolkit OER, benefits from it, as the repository is currently a well-known resource in the Spanish-speaking design community.
Additionally, we showed the advantages of creating OER through an iterative human-centered design process in which researchers, teachers and learners worked collaboratively. This constitutes an improvement in how OER can be created. Involving teachers in the design process facilitated the discovery of areas to improve, especially those based on their experiences and use, which would have been more challenging to discover without their participation. Furthermore, the card sorting method was crucial in surfacing issues related to the information architecture. This brings to light that using HCD methods in OER creation makes it possible to obtain design insights and improve the design of OER platforms.

Author Contributions

Conceptualization, C.G.-L., E.M. and S.T.; Data curation, C.G.-L., E.M. and S.T.; Formal analysis, C.G.-L. and E.M.; Investigation, C.G.-L., E.M. and S.T.; Methodology, C.G.-L., E.M. and S.T.; Supervision, E.M.; Visualization, C.G.-L. and S.T.; Writing–original draft, C.G.-L., E.M. and S.T.; Writing–review & editing, C.G.-L., E.M. and S.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Norman, D.A.; Draper, S.W. User Centered System Design: New Perspectives on Human-Computer Interaction; CRC Press: Boca Raton, FL, USA, 1986. [Google Scholar]
  2. Lowdermilk, T. User-Centered Design: A Developer’s Guide to Building User-Friendly Applications; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2013. [Google Scholar]
  3. Gasson, S. Human-centered vs. user-centered approaches to information system design. J. Inf. Technol. Theory Appl. 2003, 5, 5. [Google Scholar]
  4. Benn, S.; Edwards, M.; Williams, T. Organizational Change for Corporate Sustainability; Routledge: Abington, UK, 2014. [Google Scholar]
  5. Blewitt, J. The Ecology of Learning: Sustainability, Lifelong Learning and Everyday Life; Earthscan Publications Ltd.: London, UK, 2006. [Google Scholar]
  6. Sanders, E.B.-N.; Stappers, P.J. Co-creation and the new landscapes of design. CoDesign 2008, 4, 5–18. [Google Scholar] [CrossRef] [Green Version]
  7. IDEO. The Field Guide to Human-Centered Design: Design Kit; IDEO: Palo Alto, CA, USA, 2015. [Google Scholar]
  8. Kalantzis, M.; Cope, B. The Teacher as Designer: Pedagogy in the New Media Age. E Learn. Digit. Media 2010, 7, 200–222. [Google Scholar] [CrossRef] [Green Version]
  9. Garcia-Lopez, C.; Tesconi, S.; Mor, E. Designing Design Resources: From Contents to Tools. In Human-Computer Interaction. Perspectives on Design; Springer International Publishing: Orlando, FL, USA, 2019; pp. 87–100. [Google Scholar]
  10. Knapper, C.; Cropley, A.J. Lifelong Learning in Higher Education; Psychology Press: East Sussex, UK, 2000. [Google Scholar]
  11. Pachler, N.; Daly, C.; Mor, Y.; Mellar, H. Formative e-assessment: Practitioner cases. Comput. Educ. 2010, 54, 715–721. [Google Scholar] [CrossRef] [Green Version]
  12. Gikandi, J.W.; Morrow, D.; Davis, N.E. Online formative assessment in higher education: A review of the literature. Comput. Educ. 2011, 57, 2333–2351. [Google Scholar] [CrossRef]
  13. Carroll, J.M.; Rosson, M.B.; Dunlap, D.; Isenhour, P. Frameworks for Sharing Teaching Practices. J. Educ. Technol. Soc. 2005, 8, 162–175. [Google Scholar]
  14. Hsu, K.C.; Yang, F.-C.O. Toward an Open and Interoperable e-Learning Portal: OEPortal. J. Educ. Technol. Soc. 2008, 11, 131–148. [Google Scholar]
  15. Chen, I.Y.L.; Chen, N.-S.; Kinshuk. Examining the Factors Influencing Participants’ Knowledge Sharing Behavior in Virtual Learning Communities. J. Educ. Technol. Soc. 2009, 12, 134–148. [Google Scholar]
  16. Heath, W. Open Educational Resources: Breaking the Lockbox on Education; William and Flora Hewlett Foundation: Menlo Park, CA, USA, 2013; Available online: https://hewlett.org/open-educational-resources-breaking-the-lockbox-on-education/ (accessed on 10 July 2020).
  17. United Nations Educational, Scientific and Cultural Organization (UNESCO). Forum on the Impact of Open Courseware for Higher Education in Developing Countries; Final Report; UNESCO: Paris, France, 2002. [Google Scholar]
  18. Atkinson, R.K.; Derry, S.J.; Renkl, A.; Wortham, D. Learning from Examples: Instructional Principles from the Worked Examples Research. Rev. Educ. Res. 2000, 70, 181–214. [Google Scholar] [CrossRef]
  19. Armellini, A.; Nie, M. Open educational practices for curriculum enhancement. Open Learn. 2013, 28, 7–20. [Google Scholar] [CrossRef] [Green Version]
  20. Cox, G.; Trotter, H. Factors Shaping Lecturers’ Adoption of OER at Three South African Universities; African Minds, International Development Research Centre & Research on Open: Western Cape, South Africa, 2017. [Google Scholar]
  21. Grewe, K.; Davis, W.P. The Impact of Enrollment in an OER Course on Student Learning Outcomes. Int. Rev. Res. Open Distrib. Learn. 2017, 18. [Google Scholar] [CrossRef] [Green Version]
  22. Delgado, H.; Delgado, M.; Hilton, J., III. On the efficacy of open educational resources: Parametric and nonparametric analyses of a university calculus class. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 200. [Google Scholar]
  23. Allen, I.E.; Seaman, J. Opening the Curriculum: Open Educational Resources in U.S. Higher Education; Babson Survey Research Group: Babson Park, MA, USA, 2014; p. 52. [Google Scholar]
  24. Schuwer, R.; Janssen, B. Open Educational Resources en MOOC’s in Het Nederlandse Hoger Onderwijs: Een onderzoek Naar de Stand van Zaken Rond Productie en Hergebruik; Fontys Hogeschool ICT: Eindhoven, The Netherlands, 2016. [Google Scholar]
  25. Beaven, T. ‘Dark reuse’: An empirical study of teachers’ OER engagement. Open Prax. 2018, 10, 377–391. [Google Scholar] [CrossRef]
  26. Glennie, J.; Harley, K.; Butcher, N. Introduction: Discourses in the development of OER practice and policy. In Open Educational Resources and Change in Higher Education: Reflections from Practice; Commonwealth of Learning: Vancouver, BC, Canada, 2012; pp. 1–12. [Google Scholar]
  27. Wiley, D. Dark Matter, Dark Reuse, and the Irrational Zeal of a Believer. Available online: http://opencontent.org/blog/archives/905 (accessed on 10 June 2009).
  28. Friesen, N. Open educational resources: New possibilities for change and sustainability. Int. Rev. Res. Open Distance Learn. 2009, 10, 1–13. [Google Scholar] [CrossRef] [Green Version]
  29. ISO. Ergonomics of Human-System Interaction: Part 210: Human-Centred Design for Interactive Systems; ISO: Geneva, Switzerland, 2010. [Google Scholar]
  30. Ferran, N.; Guerrero-Roldán, A.-E.; Mor, E.; Minguillón, J. User Centered Design of a Learning Object Repository. In Human Centered Design; Springer: Berlin/Heidelberg, Germany, 2009; pp. 679–688. [Google Scholar]
  31. Sherwin, K. Card Sorting: Uncover Users’ Mental Models for Better Information Architecture. Available online: https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Card+Sorting%3A+Uncover+Users%E2%80%99+Mental+Models+for+Better+Information+Architecture&btnG= (accessed on 17 July 2020).
  32. Spencer, D. Card Sorting: Designing Usable Categories; Rosenfeld Media: New York, NY, USA, 2009. [Google Scholar]
  33. Morville, P.; Rosenfeld, L.; Arango, J. Information Architecture: For the Web and Beyond; O’Reilly Media: Sebastopol, CA, USA, 2015. [Google Scholar]
  34. Hepburn, P.; Lewis, K.M. What’s in a Name? Using Card Sorting to Evaluate Branding in an Academic Library’s Web Site. Coll. Res. Libr. 2008, 69, 242–251. [Google Scholar] [CrossRef] [Green Version]
  35. Nielsen, J. Card Sorting: How Many Users to Test. 2004. Available online: http://www.nngroup.com/articles/card-sorting-how-many-users-to-test/ (accessed on 6 June 2015).
  36. Lazar, J.; Feng, J.H.; Hochheiser, H. Research Methods in Human-Computer Interaction; Morgan Kaufmann: Cambridge, MA, USA, 2017. [Google Scholar]
  37. Hanington, B.; Martin, B. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions; Rockport Publishers: Beverly, MA, USA, 2012. [Google Scholar]
  38. Plummer, P. Focus group methodology. Part 1: Design considerations. Int. J. Ther. Rehabil. 2017, 24, 297–301. [Google Scholar] [CrossRef]
  39. Krueger, R.A. Focus Groups: A Practical Guide for Applied Research; Sage Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  40. Asbury, J.-E. Overview of Focus Group Research. Qual. Health Res. 1995, 5, 414–420. [Google Scholar] [CrossRef]
  41. Oates, B.J. Researching Information Systems and Computing; Sage: Thousand Oaks, CA, USA, 2005. [Google Scholar]
  42. Calisir, F.; Calisir, F. The relation of interface usability characteristics, perceived usefulness, and perceived ease of use to end-user satisfaction with enterprise resource planning (ERP) systems. Comput. Human Behav. 2004, 20, 505–515. [Google Scholar] [CrossRef]
  43. Gomes, A. User Needs and User Centered Design of Teaching Support Environments for E-learning. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Albuquerque, NM, USA, March 2003; pp. 3562–3565. Available online: https://www.learntechlib.org/primary/p/17906/ (accessed on 3 September 2020).
  44. Lewis, J.R. Measuring perceived usability: The CSUQ, SUS, and UMUX. Int. J. Human–Comput. Interact. 2018, 34, 1148–1156. [Google Scholar] [CrossRef]
  45. Harrati, N.; Bouchrika, I.; Tari, A.; Ladjailia, A. Exploring user satisfaction for e-learning systems via usage-based metrics and system usability scale analysis. Comput. Human Behav. 2016, 61, 463–471. [Google Scholar] [CrossRef]
  46. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum. Comput. Interact. 2018, 34, 577–590. [Google Scholar] [CrossRef]
  47. Kaewsaiha, P. Usability of the Learning Management System and Choices of Alternative. In The International Conference on Education, Psychology, and Social Sciences (ICEPS); Tokyo University of Science: Tokyo, Japan, 2019; pp. 252–259. Available online: http://www.elic.ssru.ac.th/pongrapee_ka/pluginfile.php/18/mod_page/content/13/Full%20Paper.pdf (accessed on 3 September 2020).
  48. Brooke, J. SUS: A ‘quick and dirty’ usability scale. Usability Eval. Ind. 1996, 189, 4–7. [Google Scholar]
  49. Reichheld, F.F. The one number you need to grow. Harv. Bus. Rev. 2003, 81, 46–55. [Google Scholar]
  50. Sekaran, U.; Bougie, R. Research Methods for Business: A Skill Building Approach; John Wiley & Sons: Chichester, West Sussex, UK, 2016. [Google Scholar]
  51. Palmer, K.; Devers, C. An Evaluation of MOOC Success: Net Promoter Scores. In Proceedings of EdMedia + Innovate Learning 2018; Association for the Advancement of Computing in Education (AACE): Amsterdam, The Netherlands, 2018; pp. 1648–1653. [Google Scholar]
  52. Yan, J. Net Promoter Score (NPS): What is a Good Net Promoter Score? 2017. Available online: https://www.questionpro.com/blog/nps-considered-good-net-promoter-score/ (accessed on 20 July 2020).
  53. Keiningham, T.L.; Cooil, B.; Andreassen, T.W.; Aksoy, L. A Longitudinal Examination of Net Promoter and Firm Revenue Growth. J. Mark. 2007, 71, 39–51. [Google Scholar] [CrossRef] [Green Version]
  54. Gronau, I.; Moran, S. Optimal implementations of UPGMA and other common clustering algorithms. Inf. Process. Lett. 2007, 104, 205–210. [Google Scholar] [CrossRef]
Figure 1. Design Toolkit’s filtering system.
Figure 2. User journey card.
Figure 3. ISO 9241-210:2010, Human-centred design for interactive systems.
Figure 4. Predefined timeline used in focus groups, from “course preparation” to the start of the “course delivery”.
Figure 5. Responses to “How likely are you to recommend the Design Toolkit to a friend or colleague?”
Figure 6. Examples of some of the timelines drawn by teachers during the focus groups.
Figure 7. Examples of timelines drawn by teachers during the focus group.
Figure 8. Cluster tree from card sorting result.
Table 1. Toolkits’ licenses analysis.

Toolkit | Link | Licence
Data visualization catalog | https://datavizcatalogue.com/ | Copyright
Design-led research toolkit | http://dlrtoolkit.com/ | Unspecified
DIY | https://diytoolkit.org/ | CC BY-SA 4.0
D.P.D | http://www.edu-design-principles.org | Unspecified
Dubberly Design Office | http://www.dubberly.com/ | Unspecified
IDEO’s Design Kit | https://www.designkit.org/ | Unspecified
Hi Toolbox | https://toolbox.hyperisland.com/ | Copyright
Medialab Amsterdam | https://toolkits.dss.cloud/design/ | Unspecified
Project of How | https://projectofhow.com/ | Unspecified
Service Design Toolkit | https://www.servicedesigntoolkit.org/ | CC BY-NC 4.0
Usability.gov | https://www.usability.gov/ | CC0 (public domain)
Table 2. Card sorting predefined categories.

Category | Subcategory | Number of Cards
Methods | - | 36
Principles | Cognitive | 7
Principles | Functional | 10
Principles | Perceptive | 8
Models | - | 7
Interaction | - | 16
Ideas | - | 9
Tools | Media | 8
Tools | Visual strategies | 15
Table 3. Focus groups’ participants.

FG | Num. Participants | Profiles | Participants
FG1 | 5 | Course designers and instructors | P1.1, P1.2, P1.3, P1.4, P1.5
FG2 | 6 | Course instructors | P2.1, P2.2, P2.3, P2.4, P2.5, P2.6
FG3 | 4 | Course instructors | P3.1, P3.2, P3.3, P3.4
Table 4. Focus groups’ questions, topics and tasks.

Type | Question or Topic
Task | Timeline of the usage of the Design Toolkit
Questions | General satisfaction and user experience with the Design Toolkit
Questions | Adoption of the Design Toolkit in the course design
Questions | Adoption of the Design Toolkit in the course delivery
Questions | Open feature and external use of the Design Toolkit
Questions | Design features and opportunities to improve
Table 5. Design requirements.

Number | Area | Requirement | Priority
1 | Filtering system | The filtering system needs a new visual design that makes filtering the content easier. | 1
2 | Search system | A search system needs to be implemented in the Design Toolkit. | 1
3 | Visibility of guides | Guides need to be more visible inside each card, which requires a redesign of the interface. | 1
4 | Labeling | Consider adjusting some categories’ names. | 3

Share and Cite

Garcia-Lopez, C.; Mor, E.; Tesconi, S. Human-Centered Design as an Approach to Create Open Educational Resources. Sustainability 2020, 12, 7397. https://doi.org/10.3390/su12187397