Article

Towards Automatic Collaboration Analytics for Group Speech Data Using Learning Analytics

1 Educational Science Faculty, Open University of the Netherlands, 6419 AT Heerlen, The Netherlands
2 Institute of Education Science, Ruhr-Universität Bochum, 44801 Bochum, Germany
3 Information Communication Technology Faculty, Zuyd University of Applied Sciences, 6419 DJ Heerlen, The Netherlands
4 Computer Science Faculty, Delft University of Technology, 2628 CD Delft, The Netherlands
5 Information Center for Education, DIPF Leibniz Institute for Research and Information in Education, 60323 Frankfurt am Main, Germany
6 Computer Science Faculty, Goethe University, 60323 Frankfurt am Main, Germany
* Authors to whom correspondence should be addressed.
Current address: Educational Science Faculty, Open Universiteit, Valkenburgerweg 177, 6419 AT Heerlen, The Netherlands.
Academic Editors: José A. Ruipérez-Valiente and Jenny Benois-Pineau
Sensors 2021, 21(9), 3156; https://doi.org/10.3390/s21093156
Received: 26 March 2021 / Revised: 27 April 2021 / Accepted: 29 April 2021 / Published: 2 May 2021
(This article belongs to the Special Issue From Sensor Data to Educational Insights)
Collaboration is an important 21st-century skill. Analytics for co-located (face-to-face) collaboration (CC) has gained momentum with the advent of sensor technology. Most of this work has used the audio modality to detect the quality of CC, which can be inferred from simple indicators of collaboration such as total speaking time, or from complex indicators such as synchrony in the rise and fall of the average pitch. Most past studies focused on "how group members talk" (i.e., spectral and temporal features of audio such as pitch) rather than "what they talk about". The "what" of a conversation is more overt than the "how". Very few studies have examined what group members talk about, and those studies were lab-based and presented a representative overview of specific words as topic clusters, instead of analysing the richness of the conversational content by examining the linkage between those words. To overcome this, we take a first step in this technical paper, based on field trials, towards prototyping a tool for automatic collaboration analytics. We designed a technical setup to collect, process and visualize audio data automatically. Data were collected while university staff with pre-assigned roles played a board game designed to create awareness of the connection between learning analytics and learning design. We performed not only a word-level analysis of the conversations but also analysed their richness by interactively visualizing the strength of the linkage between words and phrases. In this visualization, we used a network graph to show turn-taking exchanges between the different roles alongside the word-level and phrase-level analysis. We also applied centrality measures to the network graph to assess how much hold certain words have over the network and how influential they are.
Finally, we found that this approach has certain limitations in terms of automating speaker diarization (i.e., who spoke when) and text data pre-processing. We therefore conclude that, even though the technical setup was only partially automated, it offers a way forward to understand the richness of conversations between different roles and makes a significant step towards automatic collaboration analytics.
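The word-network and centrality analysis described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it assumes a simple co-occurrence definition (words are linked when they appear in the same utterance, with edge weight equal to the number of shared utterances) and plain weighted degree centrality as the measure of how much "hold" a word has over the network.

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence_graph(utterances):
    """Build an undirected word co-occurrence graph: two words are linked when
    they appear in the same utterance; edge weight counts shared utterances."""
    weights = defaultdict(int)
    for text in utterances:
        words = sorted(set(text.lower().split()))
        for a, b in combinations(words, 2):
            weights[(a, b)] += 1
    return weights

def degree_centrality(weights):
    """Weighted degree centrality: the sum of a word's edge weights,
    normalised by the total degree, so all values sum to 1."""
    degree = defaultdict(int)
    for (a, b), w in weights.items():
        degree[a] += w
        degree[b] += w
    total = sum(degree.values()) or 1
    return {word: d / total for word, d in degree.items()}

# Illustrative (invented) utterances from a game session
utterances = [
    "learning analytics needs learning design",
    "learning design drives the dashboard",
    "the dashboard shows learning analytics",
]
central = degree_centrality(cooccurrence_graph(utterances))
print(max(central, key=central.get))  # prints "learning"
```

In a richer setting, a library such as networkx would also provide betweenness or eigenvector centrality to capture how influential a word is as a bridge between topic clusters, which is closer to the "influence" reading in the abstract.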
Keywords: collaboration; collaboration analytics; co-located collaboration analytics; group speech analytics; multimodal learning analytics

MDPI and ACS Style

Praharaj, S.; Scheffel, M.; Schmitz, M.; Specht, M.; Drachsler, H. Towards Automatic Collaboration Analytics for Group Speech Data Using Learning Analytics. Sensors 2021, 21, 3156. https://doi.org/10.3390/s21093156


