
Smart Technology-Enhanced and Sustainable Assessment

A special issue of Sustainability (ISSN 2071-1050).

Deadline for manuscript submissions: closed (31 March 2021)

Special Issue Editors


Prof. Dr. Juan Manuel Dodero
Guest Editor
Departamento de Ingeniería Informática, Escuela Superior de Ingeniería, Universidad de Cádiz

Prof. Anke Berns
Guest Editor
Departamento de Filología Francesa e Inglesa, Facultad de Filosofía y Letras, Universidad de Cádiz
Interests: Mobile Assisted Language Learning (MALL); Virtual Learning Environments (VLE)

Special Issue Information

Dear Colleagues,

It is our pleasure to announce the opening of a new Special Issue of the journal Sustainability.

While assessment has traditionally been seen as the domain of teachers and educators, whose role is to evaluate students’ learning outcomes (summative assessment), pedagogic advances and the rise of new teaching paradigms have brought into discussion not only new ways of teaching and learning but also new ways of assessment (Gibbs & Simpson, 2004–5; Boud & Soler, 2016). Assessment is no longer understood as the action that takes place at the end of a course to measure students’ learning outcomes, but as part of the learning process itself. This means that assessment must take place during the learning process (formative assessment) and aim at providing students with regular feedback on their learning, hence guaranteeing that the learning outcomes (summative assessment) meet students’ and teachers’ expectations (Boud & Molloy, 2013).

As Davies (2010, 2016) and other researchers (Dawson et al., 2019) have outlined in different studies, assessment lies at the very heart of the teaching and learning experience, shaping both learners’ understanding of the curriculum and their ability to succeed in the learning process. At the same time, assessment affects teachers’ workload. Moreover, given the increasing trend towards blended teaching practices and the use of Massive Open Online Courses (MOOCs), the need to support teachers and learners during the teaching-learning process has grown over the last decade, becoming one of the main concerns of educators and educational institutions (Lugton, 2012; Jamieson & Musumeci, 2017; Martín-Monje, Castrillo & Mañana-Rodríguez, 2018).

In this context, the concept of sustainable assessment (Boud & Soler, 2016) has been defined as a way to prepare students to meet their own future learning needs throughout life in formal and informal settings.

Sustainable assessment has thus become one of the key concerns of researchers and practitioners seeking to meet the needs of the present in terms of the demands of formative and summative assessment. It requires rethinking learning outcomes and methods so as to focus on what students can do in the world, beyond the knowledge and skills required in a given discipline.

Information Technologies (IT) have become a relevant means of achieving sustainable assessment practices, as they provide students with authentic contexts through simulations and virtual worlds within professional settings (Williams, 2008). The search for IT-enabled learning methods and strategies can also help teachers in the long term to analyse and assess students’ performance, allowing for more sustainable assessment, especially when the number of students, and thus the workload, is extremely high (Broadbent et al., 2018). As pointed out by several researchers (Johnson, 2013; Balderas et al., 2017), the analysis of students’ interaction logs could provide both teachers and students with valuable data on their teaching-learning process, detecting potential difficulties and learning needs in time and hence allowing teachers to provide personalized and immediate feedback.

In a data-driven world, IT is more than a simple implementation tool, whether used to deploy a learning setting or to perform a job in the workplace. Emerging jobs increasingly involve managing and assessing data and information in amounts that can hardly be handled manually without the tooling support that IT provides and, most importantly, without sufficient mastery of the computing-related knowledge and skills that underlie data-driven assessment tasks. Computational thinking (CT) encompasses such abilities by abstracting away from particular aspects of the computing domain, which are often hard for non-computing specialists. CT has had a strong influence on learning across all kinds of disciplines (Wing, 2017), providing an intellectual framework for thinking that will be fundamental in many of the work settings emerging in the 21st century.

As a consequence, sustainable assessment practices have to consider CT and computing-related aspects in order to successfully anticipate what students will be able to do, assess and learn in the future world. This marks new directions of research in sustainable assessment, beyond those summarized by Boud & Soler (2016) as purposes, tasks, dispositions, engagements and course designs.

One of the areas that could benefit greatly from using IT to harness CT is foreign language teaching, with a special focus on Computer- and Mobile-Assisted Language Learning (CALL/MALL). Since their rise in the 1960s and 1980s, respectively, both areas have shown great potential to enrich teaching-learning processes. However, despite the numerous advances in both areas, teachers still face important challenges in exploiting their full potential. Some of those challenges are equipping teachers with the necessary tools and knowledge to create their own teaching resources while at the same time extracting valuable information to assess their students’ learning process (Reinders, 2018; Thomas et al., 2017).

Some recent studies in this direction are those carried out by Mota et al. (2018) and Balderas et al. (2017). While the former explores the possibility of providing teachers from different areas with authoring tools that allow them to create their own e-learning activities while extracting valuable data on students’ performance, the latter focuses on the development of a domain-specific language (DSL) to help foreign language teachers assess students’ performance when learning through video games.

Other attempts in this context have been made by Palomo-Duarte et al. (2016) and Gimeno-Sanz et al. (2018), who developed virtual learning platforms that allow students to become active participants in their own learning process by taking part, amongst other things, in formative assessment, either through self-evaluation or peer evaluation. Both are considered paramount to supporting students in their lifelong learning, since they equip students with the competences and skills they need to succeed in their learning process in the long term (Boud, 2000; Davies, 2010; Falchikov, 2013).

One of the current challenges with regard to self- and peer-assessment (Davies, 2000) is the development of standards that are transparent to students, so that they can see the knowledge they are expected to achieve (Jonsson, 2014; Bearman & Ajjawi, 2019; Ajjawi et al., 2019).

The aim of this Special Issue (SI) is to provide insight into the most recent findings and teaching practices regarding sustainable assessment in education and the added value of technology within this process.

In this sense, the research papers of this SI intend to provide teachers and practitioners from a wide range of areas (from education in general to specific fields such as language teaching and computer science) with good practices and strategies that allow them to easily gather, organize and assess large amounts of data on their students’ learning process while, at the same time, providing students with valuable feedback and support during their learning process.

The methods and strategies this SI intends to present aim to guarantee that assessment meets both students’ and teachers’ present needs while, at the same time, preparing students to meet their own future learning needs (Boud & Soler, 2016; Nguyen & Walker, 2016).

From this perspective, the Special Issue seeks to contribute to the field by presenting the most relevant advances in this research area.

We hope you will contribute your high-quality research and we look forward to reading your valuable results.

References:

Ajjawi, R.; Bearman, M. & Boud, D. (2019). Performing standards: a critical perspective on the contemporary use of standards in assessment. Teaching in Higher Education, 1-13. doi: 10.1080/13562517.2019.1678579. 

Balderas, A., Berns, A., Palomo-Duarte, M., Dodero, J. M., & Ruiz-Rube, I. (2017). Retrieving Objective Indicators from Student Logs in Virtual Worlds. Journal of Information Technology Research (JITR), 10(3), 69-83. doi:10.4018/JITR.2017070105.

Bearman, M. & Ajjawi, R. (2019). Can a Rubric Do More Than Be Transparent? Invitation as a New Metaphor for Assessment Criteria. Studies in Higher Education, 1-10. doi:10.1080/03075079.2019.1637842.

Broadbent, J.; Panadero, E. & Boud, D. (2018). Implementing summative assessment with a formative flavour: a case study in a large class. Assessment & Evaluation in Higher Education, 43(2), 307-322. doi: 10.1080/02602938.2017.1343455.

Boud, D. & Molloy, E. (2013). Rethinking models of feedback for learning: the challenge of design. Assessment & Evaluation in Higher Education, 38 (6), 698-712. doi: 10.1080/02602938.2012.691462.

Boud, D. & Soler, R. (2016). Sustainable assessment revisited. Assessment & Evaluation in Higher Education, 41(3), 400-413. doi: 10.1080/02602938.2015.1018133.

Davies, P. (2000). Computerised Peer-Assessment. Innovations in Education and Training International (IETI), 37(4), 346–355. doi: 10.1080/135580000750052955.

Davies, S. (2010). Effective Assessment in a Digital Age. A guide to technology-enhanced assessment and feedback. Retrieved at: https://facultyinnovate.utexas.edu/sites/default/files/digiassass_eada.pdf

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D. & Molloy, E. (2019). What makes for effective feedback: staff and student perspectives. Assessment & Evaluation in Higher Education, 44(1), 25-36. doi: 10.1080/02602938.2018.1467877.

Falchikov, N. (2013). Improving assessment through student involvement: Practical solutions for aiding learning in higher and further education. Routledge.

Gibbs, G. & Simpson, C. (2004–5). Conditions under which assessment supports student learning. Learning and Teaching in Higher Education, 1(1), 3–31. Retrieved at http://eprints.glos.ac.uk/3609/.

Gimeno-Sanz, A.; Sevilla-Pavón, A. & Martínez-Sáez, A. (2018). From local to massive learning: unveiling the (re)design process of an English LMOOC based on InGenio materials. In P. Taalas, J. Jalkanen, L. Bradley & S. Thouësny (Eds.), Future-proof CALL: language learning as exploration and encounters – short papers from EUROCALL 2018 (pp. 77-85). Research-publishing.net. doi: 10.14705/rpnet.2018.26.9782490057221.

Jamieson, J. & Musumeci, M. (2017). Integrating assessment with instruction through technology. In C. A. Chapelle & S. Sauro (Eds.), The handbook of technology and second language teaching and learning (pp. 293-316). Wiley Blackwell.

Johnson, D. H. (2013). Teaching a “MOOC”: Experiences from the front line. Digital Signal Processing and Signal Processing Education Meeting (DSP/SPE), 2013 IEEE, 268-272. doi: 10.1109/DSP-SPE.2013.6642602.

Jonsson, A. (2014). Rubrics as a Way of Providing Transparency in Assessment. Assessment & Evaluation in Higher Education, 39(7), 840–852. doi: 10.1080/02602938.2013.875117.

Lugton, M. (2012). What is a MOOC? What are the different types of MOOC? xMOOCs and cMOOCs. Reflections.

Martín-Monje, E.; Castrillo, M. D. & Mañana-Rodríguez, J. (2018). Understanding online interaction in language MOOCs through learning analytics. Computer Assisted Language Learning, 31(3), 251-272. doi: 10.1080/09588221.2017.1378237.

Mota, J. M.; Ruiz-Rube, I.; Dodero, J. M. & Arnedillo-Sánchez, I. (2018). Augmented reality mobile app development for all. Computers & Electrical Engineering, 65, 250-260. doi: 10.1016/j.compeleceng.2017.08.025.

Nguyen, Th. T. H. & Walker, M. (2016). Sustainable assessment for lifelong learning. Assessment & Evaluation in Higher Education, 41(1), 97-111. doi: 10.1080/02602938.2014.985632.

Palomo-Duarte, M.; Berns, A.; Cejas, A.; Dodero, J. M.; Caballero, J. A. & Ruiz-Rube, I. (2016). Assessing Foreign Language Learning Through Mobile Game-Based Learning Environments. International Journal of Human Capital and Information Technology Professionals, 7(2), 53-67. doi: 10.4018/IJHCITP.2016040104.

Reinders, H. (2018). Teacher Resistance and Resilience. In J. I. Liontas & M. DelliCarpini (Eds.), The TESOL Encyclopedia of English Language Teaching (pp. 1-6). TESOL International Association. doi: 10.1002/9781118784235.eelt0270.

Thomas, M.; Reinders, H. & Gelan, A. (2017). Learning Analytics in Online Language Learning: Challenges and Future Directions. In L. L. C. Wong & K. Hyland (Eds.), Faces of English Education: Students, Teachers, and Pedagogy (pp. 1-10). London: Routledge.

Wing, J.M. (2017). Computational thinking’s influence on research and education for all. Italian Journal of Educational Technology, 25(2), 7-14. doi: 10.17471/2499-4324/922

Prof. Dr. Juan Manuel Dodero
Prof. Anke Berns
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sustainability is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Technology-enhanced learning and assessment
  • E-assessment and peer assessment
  • Formative, summative and sustainable assessment
  • Assessment in Massive Open Online Courses
  • Assessment in virtual simulations
  • Mobile-assisted learning and assessment
  • Computational thinking and assessment

Published Papers

There are no accepted submissions to this Special Issue at this moment.