Future Internet, Volume 12, Issue 6 (June 2020) – 15 articles

Cover Story: Soft sensors (SSs) are inferential models used in many industrial fields. They allow real-time estimation of hard-to-measure variables as a function of easy-to-measure variables obtained from online sensors. SSs are generally built from industries' historical databases through data-driven approaches. A critical issue in SS design concerns the selection of input variables from those available in a candidate dataset, which can reach great numbers in an industrial environment. Such large numbers of inputs would make the design computationally demanding and lead to poorly performing models, so an input selection procedure is necessary. Most input selection approaches for SS design are addressed and classified in the paper, with their benefits and drawbacks outlined, to guide the designer through this step.
23 pages, 453 KiB  
Article
A Methodology to Evaluate Standards and Platforms within Cyber Threat Intelligence
by Alessandra de Melo e Silva, João José Costa Gondim, Robson de Oliveira Albuquerque and Luis Javier García Villalba
Future Internet 2020, 12(6), 108; https://doi.org/10.3390/fi12060108 - 23 Jun 2020
Cited by 32 | Viewed by 7874
Abstract
The cyber security landscape has changed fundamentally over the past few years. While technology evolves and new, sophisticated applications are developed, a new threat scenario is emerging at an alarming rate. Sophisticated threats with multi-vectored, multi-staged and polymorphic characteristics are performing complex attacks, making the processes of detection and mitigation far more complicated. Organizations have therefore been encouraged to change their traditional defense models and to develop and use new systems with a proactive approach, because the old approaches are no longer effective at detecting advanced attacks. Organizations are also encouraged to develop the ability to respond to incidents in real time using complex threat intelligence platforms. However, since the field is growing rapidly, the Cyber Threat Intelligence concept still lacks a consistent definition, and a heterogeneous market has emerged that includes diverse systems and tools with different capabilities and goals. This work provides a comprehensive methodology for evaluating threat intelligence standards and cyber threat intelligence platforms. The proposed methodology is based on the selection of the most relevant candidates to establish the evaluation criteria. In addition, this work studies the Cyber Threat Intelligence ecosystem and the threat intelligence standards and platforms existing in the state of the art.
(This article belongs to the Collection Information Systems Security)
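As a toy illustration of the kind of criteria-based scoring such an evaluation methodology produces (the criteria names and weights below are hypothetical, not the paper's, which derives its criteria from the most relevant candidates), a weighted scoring matrix might look like this in Python:

```python
# Illustrative sketch only: weighted-criteria scoring of CTI platforms.
# Criteria, weights, and ratings are hypothetical placeholders.
CRITERIA = {"interoperability": 0.30, "automation": 0.25,
            "data_model_coverage": 0.25, "community_adoption": 0.20}

platforms = {
    "PlatformA": {"interoperability": 4, "automation": 3,
                  "data_model_coverage": 5, "community_adoption": 2},
    "PlatformB": {"interoperability": 3, "automation": 5,
                  "data_model_coverage": 3, "community_adoption": 4},
}

def score(ratings):
    # weighted sum of 1-5 ratings over all evaluation criteria
    return sum(CRITERIA[c] * r for c, r in ratings.items())

for name, ratings in sorted(platforms.items(), key=lambda p: -score(p[1])):
    print(f"{name}: {score(ratings):.2f}")
```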

17 pages, 380 KiB  
Article
Validating the Adoption of Heterogeneous Internet of Things with Blockchain
by Lulwah AlSuwaidan and Nuha Almegren
Future Internet 2020, 12(6), 107; https://doi.org/10.3390/fi12060107 - 21 Jun 2020
Cited by 15 | Viewed by 3534
Abstract
Emerging technologies such as the Internet of Things (IoT) and blockchain have driven the digital transformation. Blockchain was initially developed for financial trading because of its robustness, particularly its fault tolerance and cryptographic security, in addition to its decentralized architecture. IoT, on the other hand, is an open, interconnected network of smart devices able to communicate simultaneously, which raises privacy and security challenges, specifically for the data being exchanged. To overcome this, studies have focused on blockchain as a way to resolve the security and privacy issues of IoT, yet few studies have assessed blockchain's viability for IoT and the associated challenges. In this paper, a conceptual model is proposed to identify the crucial factors affecting the adoption of blockchain in IoT. The model consists of four dimensions of factors that we assume will affect the adoption of the two technologies: attitude-related factors, social-influence-related factors, data-related factors, and security-related factors. The model is validated through a survey distributed among professionals in blockchain and IoT. The findings show a significant impact of data-related factors on the adoption of blockchain in IoT and the intention to use them. The model can play an important role in the development of strategies, standards, and performance assessments.
(This article belongs to the Special Issue Network Economics and Utility Maximization)

20 pages, 292 KiB  
Article
Collective Intelligence in Polish-Ukrainian Internet Projects. Debate Models and Research Methods
by Rafał Olszowski and Marcin Chmielowski
Future Internet 2020, 12(6), 106; https://doi.org/10.3390/fi12060106 - 20 Jun 2020
Cited by 1 | Viewed by 2893
Abstract
In this study, we focus on models of civic debate suitable for use in Polish-Ukrainian internet projects, as well as methods of researching collective intelligence that can help to monitor particular aspects of such debates and consequently create bridging social capital between these groups. The dynamic socio-political situation of recent years, both in Ukraine and in Poland, has created new conditions: anti-government protests, the social turmoil related to the war in Crimea and Donbas, and a high level of migration in the region over a short period have led to the creation of a multi-ethnic society. This brings opportunities for the development of a new type of social capital. A new participative model of social life based on internet projects, with a relatively low entry barrier, space for creativity, and widespread use of ICT, can provide new ways of debating, civic engagement, and collective action. Our research, based on a multidisciplinary literature review as well as a series of qualitative in-depth interviews (IDIs), showed that the selected collective intelligence (CI) research methods and debate models can help to develop internet communities that will contribute to building bridging capital between Poles and Ukrainians.
(This article belongs to the Special Issue Selected Papers from the INSCI2019: Internet Science 2019)
30 pages, 1333 KiB  
Article
SOLIOT—Decentralized Data Control and Interactions for IoT
by Sebastian R. Bader and Maria Maleshkova
Future Internet 2020, 12(6), 105; https://doi.org/10.3390/fi12060105 - 16 Jun 2020
Cited by 10 | Viewed by 4412
Abstract
The digital revolution affects every aspect of society and the economy. In particular, the manufacturing industry faces a new age of production processes and connected collaboration. The underlying ideas and concepts, often framed as a new "Internet of Things", transfer IT technologies to the shop floor, entailing major challenges regarding the heterogeneity of the domain. Web technologies, on the other hand, have already proven their value in distributed settings. SOLID (derived from "social linked data") is a recent approach to decentralizing data control and standardizing interactions for social applications on the web. Extending this approach towards industrial applications has the potential to bridge the gap between the World Wide Web and local manufacturing environments. This paper proposes SOLIOT, a combination of lightweight industrial protocols with the integration and data control provided by SOLID. An in-depth requirement analysis examines the potential but also the current limitations of the approach. The conceptual capabilities are outlined, compared and extended for the IoT protocols CoAP and MQTT. The feasibility of the approach is illustrated through an open-source implementation, which is evaluated in a virtual test bed along with a detailed analysis of the proposed components.
(This article belongs to the Special Issue Internet of Things (IoT) Applications for Industry 4.0)
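To make the protocol combination concrete, here is a minimal, hypothetical Python sketch of publishing a device reading as a linked-data-style resource over MQTT; the broker address, topic layout, and payload fields are illustrative assumptions rather than SOLIOT's actual wire format, and the SOLID-style access control layer is omitted:

```python
# Hypothetical sketch: a device reading exposed as a linked-data-style
# resource over MQTT (one of the two IoT protocols considered, with CoAP).
import json
import paho.mqtt.client as mqtt

resource = {                      # illustrative JSON-LD-style payload
    "@id": "devices/sensor-1/temperature",
    "value": 21.5,
    "unit": "Cel",
}

client = mqtt.Client()            # paho-mqtt 1.x constructor; 2.x also
                                  # takes a callback API version argument
client.connect("broker.example.org", 1883)
# a SOLID-style pod would enforce per-resource read/write control here
client.publish("devices/sensor-1/temperature", json.dumps(resource), qos=1)
client.disconnect()
```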

31 pages, 423 KiB  
Article
Improvements of the xAAL Home Automation System
by Christophe Lohr and Jérôme Kerdreux
Future Internet 2020, 12(6), 104; https://doi.org/10.3390/fi12060104 - 12 Jun 2020
Cited by 5 | Viewed by 3228
Abstract
The xAAL home automation system has been designed on the basis of distributed-systems principles, with message passing and home-network communications over IP. The proposal makes extensive use of standards and provides a clear separation of roles across the distributed system, with no predominant actor. This allows openness and interoperability, an objective that can be reached once all parties are convinced: consumers, manufacturers, service providers, etc. To achieve broad adoption, the proposal comes with fine-tuned communication, architecture, security, and simplicity. Long-term tests and experiments have led us to optimize the protocol, adjust the architecture, and rearrange device descriptions. This paper provides a full description of the improved system, with all the details needed to make compatible alternative implementations feasible. It also discusses alternatives and the considerations that led us to make structuring choices: CBOR messages on an IP multicast channel, intranet communication, ciphering with Poly1305/ChaCha20, a structured and extensible abstract device description, and a distributed system architecture.
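For readers unfamiliar with this transport layer, the following Python sketch shows the general idea of sending a CBOR message on an IP multicast channel; the field names, group address, and port are illustrative assumptions, and the ChaCha20/Poly1305 ciphering step that xAAL applies to its messages is omitted:

```python
# Minimal sketch of the transport idea: a CBOR-encoded message on an IP
# multicast channel. Requires the cbor2 package; field names and the
# group/port values are illustrative, not the exact xAAL schema.
import socket
import cbor2

message = cbor2.dumps({
    "version": "0.7",            # hypothetical field names and values
    "targets": [],
    "msgtype": "notify",
    "body": {"temperature": 21.5},
})

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay on LAN
sock.sendto(message, ("224.0.29.200", 1236))  # illustrative group/port
```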

24 pages, 2385 KiB  
Review
Risk-Based Access Control Model: A Systematic Literature Review
by Hany F. Atlam, Muhammad Ajmal Azad, Madini O. Alassafi, Abdulrahman A. Alshdadi and Ahmed Alenezi
Future Internet 2020, 12(6), 103; https://doi.org/10.3390/fi12060103 - 11 Jun 2020
Cited by 23 | Viewed by 5543
Abstract
Most current access control models are rigid, as they are designed using static policies that always give the same outcome in different circumstances. In addition, they cannot adapt to environmental changes and unpredicted situations. For dynamic systems such as the Internet of Things (IoT), with billions of things distributed everywhere, these access control models are obsolete, and dynamic access control models are required instead. Such models utilize not only access policies but also contextual and real-time information to determine the access decision. One of these dynamic models is the risk-based access control model, which dynamically estimates the security risk value associated with each access request to determine the access decision. Recently, the risk-based access control model has attracted the attention of several organizations and researchers as a way to provide more flexibility in accessing system resources. This paper therefore provides a systematic review and examination of the state of the art of the risk-based access control model to provide a detailed understanding of the topic. Based on the selected search strategy, 44 articles (of 1044 articles) were chosen for closer examination. The contributions of the selected articles were summarized, the risk factors used to build risk-based access control models were extracted and analyzed, and the risk estimation techniques used to evaluate the risks of access control operations were identified.
(This article belongs to the Special Issue Emerging Trends of Fog Computing in Internet of Things Applications)
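As a minimal sketch of the dynamic idea reviewed here, the Python fragment below computes a risk score from contextual inputs at request time; the risk factors, weights, and threshold are illustrative assumptions, and the surveyed models use far richer estimation techniques (e.g., fuzzy logic or machine learning):

```python
# Toy risk-based access decision: unlike a static policy, the outcome
# changes with the real-time context of each request. All constants are
# illustrative placeholders.
RISK_THRESHOLD = 0.5

def risk_score(context):
    # weighted combination of normalised risk factors in [0, 1]
    weights = {"resource_sensitivity": 0.4, "user_history": 0.3,
               "location_anomaly": 0.2, "time_of_day": 0.1}
    return sum(w * context[f] for f, w in weights.items())

def decide(context):
    return "permit" if risk_score(context) < RISK_THRESHOLD else "deny"

print(decide({"resource_sensitivity": 0.9, "user_history": 0.1,
              "location_anomaly": 0.8, "time_of_day": 0.2}))  # deny
```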

15 pages, 1689 KiB  
Article
CMS: A Continuous Machine-Learning and Serving Platform for Industrial Big Data
by KeDi Li and Ning Gui
Future Internet 2020, 12(6), 102; https://doi.org/10.3390/fi12060102 - 10 Jun 2020
Cited by 8 | Viewed by 3358
Abstract
The life-long monitoring and analysis of complex industrial equipment demand a continuously evolvable machine-learning platform in which models can be quickly regenerated and updated. This in turn demands the careful orchestration of trainers (for model generation) and modelets (for model serving) without interrupting normal operations. This paper proposes a container-based Continuous Machine-Learning and Serving (CMS) platform. By providing an out-of-the-box common architecture for trainers and modelets, it simplifies the model training and deployment process with minimal human interference. An orchestrator is proposed that manages the trainers' execution and enables model updating without interrupting the online operation of model serving. CMS has been deployed in a 1000 MW thermal power plant for about five months. The running results show that the accuracy of eight models remained at a good level even through major renovations. Moreover, CMS proved to be resource-efficient, providing effective resource isolation and seamless model switching with little overhead.
(This article belongs to the Special Issue Network Architectures and Protocols for Industrial IoT)
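The core serving trick, updating a model without interrupting online operation, can be illustrated with a toy atomic-reference swap in Python; this is a single-process stand-in under stated assumptions, not CMS's container-based orchestration of trainers and modelets:

```python
# Sketch of "update the model without interrupting serving" via an
# atomic reference swap guarded by a lock.
import threading

class ModelRef:
    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def get(self):
        with self._lock:
            return self._model

    def swap(self, new_model):
        with self._lock:          # requests in flight keep their old reference
            self._model = new_model

ref = ModelRef(lambda x: x * 2)   # stand-in for a trained model

def serve(x):
    return ref.get()(x)           # each request reads the current model

print(serve(3))                   # 6, old model
ref.swap(lambda x: x * 10)        # trainer publishes a regenerated model
print(serve(3))                   # 30, new model, no downtime
```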

17 pages, 5049 KiB  
Article
Performance Analysis of a Novel TCP Protocol Algorithm Adapted to Wireless Networks
by Gonzalo Olmedo, Román Lara-Cueva, Diego Martínez and Celso de Almeida
Future Internet 2020, 12(6), 101; https://doi.org/10.3390/fi12060101 - 09 Jun 2020
Cited by 6 | Viewed by 3672
Abstract
As telecommunication systems evolve towards new-generation architectures, new protocols are created to improve efficiency. One of these protocols is the Transmission Control Protocol (TCP), which controls the transmission bit rate as a function of network congestion. In wireless communications, however, problems such as noise and interference appear, for which TCP was not designed. Some existing methods try to mitigate these problems, such as explicit loss notifications and end-to-end coding. The aim of this work was to propose an improvement to TCP for wireless links based on a negative acknowledgment (NACK), which makes it possible to differentiate between losses due to congestion and losses due to wireless channel issues. NACK employs a small protocol packet and improves quality-of-service metrics. The experiments were carried out in indoor and outdoor environments, over an online video game scenario, and over a long-distance wireless link between two islands. On average, the results show a 25% delay improvement and a 5% jitter improvement compared to the original TCP Reno protocol, while throughput improved by 90% for distances between 100 and 414 m.
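A toy Python sketch of the loss-differentiation idea follows; the constants and the Reno-style halving are illustrative, not the paper's exact algorithm:

```python
# Toy sketch: only losses attributed to congestion shrink the congestion
# window, while NACK-signalled wireless losses are retransmitted at the
# current rate. Constants are illustrative.
def on_loss(cwnd, cause):
    if cause == "congestion":        # duplicate ACKs / timeout
        return max(1, cwnd // 2)     # multiplicative decrease, as in TCP Reno
    elif cause == "wireless":        # explicit NACK from the receiver
        return cwnd                  # retransmit without reducing the rate
    raise ValueError(cause)

cwnd = 32
cwnd = on_loss(cwnd, "wireless")     # stays 32: channel noise, not congestion
cwnd = on_loss(cwnd, "congestion")   # halves to 16
print(cwnd)
```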

21 pages, 6656 KiB  
Article
Performance Model for Video Service in 5G Networks
by Jiao Wang, Jay Weitzen, Oguz Bayat, Volkan Sevindik and Mingzhe Li
Future Internet 2020, 12(6), 99; https://doi.org/10.3390/fi12060099 - 08 Jun 2020
Cited by 1 | Viewed by 2984
Abstract
Network slicing allows operators to sell customized slices to various tenants at different prices. To provide better-performing and cost-efficient services, network slicing calls for intelligent resource management approaches aligned with users' activities per slice. In this article, we propose a radio access network (RAN) slicing design methodology for quality of service (QoS) provisioning for differentiated services in a 5G network. A performance model is constructed for each service using machine learning (ML)-based approaches, optimized using interference coordination approaches, and used to map service level agreements (SLAs) to radio resources. The optimal bandwidth allocation is dynamically adjusted based on instantaneous network load conditions. We investigate the application of machine learning to the radio resource slicing problem and demonstrate its advantage through extensive simulations. A case study demonstrates the effectiveness of the proposed radio resource slicing approach.
(This article belongs to the Special Issue Future Networks: Latest Trends and Developments)

20 pages, 2543 KiB  
Article
Patient Privacy Violation Detection in Healthcare Critical Infrastructures: An Investigation Using Density-Based Benchmarking
by William Hurst, Aaron Boddy, Madjid Merabti and Nathan Shone
Future Internet 2020, 12(6), 100; https://doi.org/10.3390/fi12060100 - 08 Jun 2020
Cited by 10 | Viewed by 4127
Abstract
Hospital critical infrastructures have a distinct threat vector, due to (i) a dependence on legacy software; (ii) the vast level of interconnected medical devices; (iii) the use of multiple bespoke software packages; and (iv) the fact that electronic devices (e.g., laptops and PCs) are often shared by multiple users. In the UK, hospitals are currently upgrading to electronic patient record (EPR) systems. EPR systems and their data are replacing traditional paper records, providing access to patients' test results and details of their overall care more efficiently. Paper records are no longer stored at patients' bedsides; instead, data are accessible via electronic devices for direct insertion. With over 83% of hospitals in the UK moving towards EPRs, access to this healthcare data needs to be monitored proactively for malicious activity. It is paramount that hospitals maintain patient trust and ensure that the information security principles of integrity, availability and confidentiality are upheld when deploying EPR systems. In this paper, an investigation methodology is presented for identifying anomalous behaviours within EPR datasets. Many security solutions focus on a perimeter-based approach; however, this approach alone is not enough to guarantee security, as the many examples of breaches show. Our proposed system is complementary to existing security perimeter solutions. It employs an internally focused methodology for anomaly detection, using the Local Outlier Factor (LOF) and Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithms to benchmark behaviour and assist healthcare data analysts. Out of 90,385 unique IDs, DBSCAN finds 102 anomalies, whereas LOF detects 358.
(This article belongs to the Section Internet of Things)
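For reference, a minimal scikit-learn sketch of the two benchmarking algorithms follows, run on a synthetic two-dimensional stand-in for the (confidential) EPR access-log features; parameters such as n_neighbors and eps are illustrative:

```python
# Density-based anomaly benchmarking with LOF and DBSCAN on synthetic data.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(500, 2))        # typical access behaviour
outliers = rng.uniform(-6, 6, size=(10, 2))     # injected anomalies
X = np.vstack([normal, outliers])

lof = LocalOutlierFactor(n_neighbors=20)
lof_labels = lof.fit_predict(X)                  # -1 marks an outlier
print("LOF anomalies:", (lof_labels == -1).sum())

db = DBSCAN(eps=0.5, min_samples=5).fit(X)
print("DBSCAN anomalies:", (db.labels_ == -1).sum())  # noise points
```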

20 pages, 749 KiB  
Perspective
Volunteer Down: How COVID-19 Created the Largest Idling Supercomputer on Earth
by Nane Kratzke
Future Internet 2020, 12(6), 98; https://doi.org/10.3390/fi12060098 - 06 Jun 2020
Cited by 5 | Viewed by 3969
Abstract
Almost from scratch, the COVID-19 pandemic created the largest volunteer supercomputer on earth. Sadly, processing resources assigned to the corresponding Folding@home project cannot be shared efficiently with other volunteer computing projects. Consequently, the largest supercomputer had significant idle times. This perspective paper investigates how resource sharing in future volunteer computing projects could be improved. Notably, efficient resource sharing has been optimized in cloud computing throughout the last ten years. This perspective paper therefore reviews the current state of volunteer and cloud computing to analyze what the two domains could learn from each other. It turns out that the disclosed resource-sharing shortcomings of volunteer computing could be addressed by technologies that have been invented, optimized, and adapted for entirely different purposes by cloud-native companies like Uber, Airbnb, Google, and Facebook. Promising technologies include containers, serverless architectures, image registries, and distributed service registries, and all have one thing in common: they already exist and are tried and tested in large web-scale deployments.
(This article belongs to the Special Issue Cloud-Native Applications and Services)

24 pages, 1188 KiB  
Review
Input Selection Methods for Soft Sensor Design: A Survey
by Francesco Curreri, Giacomo Fiumara and Maria Gabriella Xibilia
Future Internet 2020, 12(6), 97; https://doi.org/10.3390/fi12060097 - 04 Jun 2020
Cited by 19 | Viewed by 4849
Abstract
Soft Sensors (SSs) are inferential models used in many industrial fields. They allow for real-time estimation of hard-to-measure variables as a function of available data obtained from online sensors. SSs are generally built from industries' historical databases through data-driven approaches. A critical issue in SS design concerns the selection of input variables among those available in a candidate dataset. In industrial processes, candidate inputs can reach great numbers, making the design computationally demanding and leading to poorly performing models; an input selection procedure is then necessary. This work addresses and classifies the most used input selection approaches for SS design, outlining their benefits and drawbacks to guide the designer through this step.
(This article belongs to the Collection Featured Reviews of Future Internet Research)
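As a small example of one filter-type approach from the families this survey classifies, the following scikit-learn sketch ranks candidate inputs by mutual information with the hard-to-measure target; the data are synthetic and k is an illustrative choice:

```python
# Filter-style input selection for a soft sensor: rank candidate inputs
# by mutual information with the target and keep the top k.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_regression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))            # 30 candidate inputs from plant sensors
y = 2 * X[:, 3] - X[:, 17] + 0.1 * rng.normal(size=200)  # hard-to-measure target

selector = SelectKBest(mutual_info_regression, k=5).fit(X, y)
print("selected inputs:", np.flatnonzero(selector.get_support()))
```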

14 pages, 1008 KiB  
Article
Multi-Source Neural Model for Machine Translation of Agglutinative Language
by Yirong Pan, Xiao Li, Yating Yang and Rui Dong
Future Internet 2020, 12(6), 96; https://doi.org/10.3390/fi12060096 - 03 Jun 2020
Cited by 9 | Viewed by 3404
Abstract
Benefitting from the rapid development of artificial intelligence (AI) and deep learning, machine translation based on neural networks has achieved impressive performance on many high-resource language pairs. However, neural machine translation (NMT) models still struggle with agglutinative languages that have complex morphology and limited resources. Inspired by the finding that source-side linguistic knowledge can further improve NMT performance, we propose a multi-source neural model that employs two separate encoders to encode the source word sequence and the linguistic feature sequences. Compared with the standard NMT model, we utilize an additional encoder to incorporate the linguistic features of lemma, part-of-speech (POS) tag, and morphological tag by extending the input embedding layer of the encoder. Moreover, we use a serial combination method to integrate the conditional information from the encoders with the outputs of the decoder, which helps the model learn a high-quality context representation of the source sentence. Experimental results show that our approach is effective for agglutinative language translation, achieving improvements of up to +2.4 BLEU points on the Turkish–English translation task and +0.6 BLEU points on the Uyghur–Chinese translation task.
(This article belongs to the Section Big Data and Augmented Intelligence)
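A rough PyTorch sketch of the two-encoder idea follows; the GRU layers, sizes, and tanh merge are illustrative assumptions standing in for the paper's architecture and its serial combination method:

```python
# Sketch: one encoder for source words, one for linguistic features
# (lemma / POS / morphological tags), fused to condition the decoder.
import torch
import torch.nn as nn

class MultiSourceEncoder(nn.Module):
    def __init__(self, word_vocab, feat_vocab, emb=128, hid=256):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, emb)
        self.feat_emb = nn.Embedding(feat_vocab, emb)
        self.word_enc = nn.GRU(emb, hid, batch_first=True)
        self.feat_enc = nn.GRU(emb, hid, batch_first=True)
        self.merge = nn.Linear(2 * hid, hid)   # stand-in for the paper's merge

    def forward(self, words, feats):
        _, h_w = self.word_enc(self.word_emb(words))
        _, h_f = self.feat_enc(self.feat_emb(feats))
        # fuse both source-side representations to initialise the decoder
        return torch.tanh(self.merge(torch.cat([h_w[-1], h_f[-1]], dim=-1)))

enc = MultiSourceEncoder(word_vocab=1000, feat_vocab=200)
words = torch.randint(0, 1000, (4, 12))   # batch of token ids
feats = torch.randint(0, 200, (4, 12))    # aligned linguistic-feature ids
print(enc(words, feats).shape)            # torch.Size([4, 256])
```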

25 pages, 283 KiB  
Review
Simulating Resource Management across the Cloud-to-Thing Continuum: A Survey and Future Directions
by Malika Bendechache, Sergej Svorobej, Patricia Takako Endo and Theo Lynn
Future Internet 2020, 12(6), 95; https://doi.org/10.3390/fi12060095 - 29 May 2020
Cited by 27 | Viewed by 4785
Abstract
In recent years, there has been significant advancement in resource management mechanisms for cloud computing infrastructure, improving performance in terms of cost, quality of service (QoS) and energy consumption. The emergence of the Internet of Things has led to the development of infrastructure that extends beyond centralised data centers, from the cloud to the edge: the so-called cloud-to-thing continuum (C2T). This infrastructure is characterised by extreme heterogeneity, geographic distribution, and complexity, where the key performance indicators (KPIs) for the traditional model of cloud computing may no longer apply in the same way. Existing resource management mechanisms may not be suitable for such complex environments and therefore require thorough testing, validation and evaluation before even being considered for live system implementation. Similarly, previously discounted resource management proposals may be more relevant and worthy of revisiting. Simulation is a widely used technique in the development and evaluation of resource management mechanisms for cloud computing but is a relatively nascent research area for new C2T computing paradigms such as fog and edge computing. We present a methodical literature analysis of C2T resource management research using simulation software tools to assist researchers in identifying suitable methods, algorithms, and simulation approaches for future research. We analyse 35 research articles from a total collection of 317 journal articles published from January 2009 to March 2019. We present our descriptive and synthetic analysis from a variety of perspectives including resource management, C2T layer, and simulation.
(This article belongs to the Special Issue Network Cost Reduction in Cloud/Fog Computing Environments)
20 pages, 3784 KiB  
Article
COVID-19 Epidemic as E-Learning Boost? Chronological Development and Effects at an Austrian University against the Background of the Concept of “E-Learning Readiness”
by Martin Ebner, Sandra Schön, Clarissa Braun, Markus Ebner, Ypatios Grigoriadis, Maria Haas, Philipp Leitner and Behnam Taraghi
Future Internet 2020, 12(6), 94; https://doi.org/10.3390/fi12060094 - 26 May 2020
Cited by 102 | Viewed by 16352
Abstract
The COVID-19 crisis affected universities worldwide in early 2020. In Austria, all universities were closed in March 2020 as a preventive measure, meetings with over 100 people were banned, and a curfew was imposed. This development also had a massive impact on teaching, which in Austria takes place largely face-to-face. In this paper, we describe the situation of an Austrian university regarding e-learning before and during the first three weeks of the changeover of the teaching system, using the example of Graz University of Technology (TU Graz). The authors provide insights into the internal procedures, processes and decisions of their university and present figures on the changed usage behaviour of their students and teachers. As a theoretical reference, the article uses the e-learning readiness assessment according to Alshaher (2013), which provides a framework for describing the status of e-learning before the crisis. The paper concludes with a description of enablers, barriers and bottlenecks from the perspective of the members of the Educational Technology department.
(This article belongs to the Special Issue Computational Thinking)
