Proceeding Paper

UCAmI Cup. Analyzing the UJA Human Activity Recognition Dataset of Activities of Daily Living †

Macarena Espinilla 1, Javier Medina 1 and Chris Nugent 2
1 Department of Computer Science, University of Jaén, 23071 Jaén, Spain
2 School of Computing, Ulster University, Jordanstown Campus, Belfast BT37 0QB, UK
* Author to whom correspondence should be addressed.
Presented at the 12th International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2018), Punta Cana, Dominican Republic, 4–7 December 2018.
Proceedings 2018, 2(19), 1267; https://doi.org/10.3390/proceedings2191267
Published: 26 October 2018
(This article belongs to the Proceedings of UCAmI 2018)

Abstract

Many real-world applications, which are focused on addressing the needs of a human, require information pertaining to the activities being performed. The UCAmI Cup is an event held within the context of the International Conference on Ubiquitous Computing and Ambient Intelligence, where delegates are given the opportunity to use their tools and techniques to analyse a previously unseen human activity recognition dataset and to compare their results with others working in the same domain. In this paper, the human activity recognition dataset used relates to activities of daily living generated in the UJAmI Smart Lab, University of Jaén. The dataset chosen for the first edition of the UCAmI Cup represents 246 activities performed over a period of ten days carried out by a single inhabitant. The dataset includes four data sources: (i) event streams from 30 binary sensors, (ii) intelligent floor location data, (iii) proximity data between a smart watch worn by the inhabitant and 15 Bluetooth Low Energy beacons and (iv) acceleration of the smart watch. In this first edition of the UCAmI Cup, 26 participants from 10 different countries contacted the organizers to obtain the dataset.

1. Introduction

Activity recognition systems deployed in smart homes are characterized by their ability to detect Activities of Daily Living (ADL) in order to improve assistance. Such solutions have been adopted by smart homes in practice and have delivered promising results for improving the quality of care services for elderly people and responsive assistance in emergency situations [1].
Data driven approaches [2] developed for the purposes of Human Activity Recognition (HAR) of ADLs require large annotated datasets which offer high levels of quality in terms of both the ground truth and the generalisation of the underlying data. A limited number of online repositories have supported the notion of providing openly available datasets for research and development purposes. Two key examples are the UC Irvine Machine Learning Repository [2] and PhysioNet [3]. The former has recently extended its datasets to include a small number of HAR related resources. The European Union funded project OPPORTUNITY created a common platform whereby researchers working in different organizations could have access to a common dataset and were therefore able to compare their results with others [4]. Beyond the aforementioned, efforts to provide high quality, openly available, large scale datasets have been largely uncoordinated. There remains a lack of frameworks in which multiple researchers can compare the results of their tools and techniques on the same HAR problem relating to an ADL dataset. The competition closest to the UCAmI Cup is the recently announced Sussex-Huawei Locomotion Challenge [5], in which the Sussex-Huawei Locomotion Dataset is used to recognize 8 modes of locomotion and transportation (Car, Bus, Train, Subway, Walk, Run, Bike and Still) from the inertial sensor data of a smartphone (accelerometer, gyroscope, magnetometer, linear acceleration, gravity, orientation quaternions and ambient pressure). Nevertheless, this competition does not aim to develop solutions for smart homes to improve assistance.
The concept of comparing techniques on openly available data is well established in other domains, such as indoor localization [6], automatic image classification [7], the PhysioNet Computing in Cardiology (CinC) Challenges [8], the IJCAI competitions [9] and the KDD Cup challenge [10].
In order to address this gap in the domain of HAR for ADL, the UCAmI Cup has been announced. The UCAmI Cup aims to be an annual event within the forum of the International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI) where delegates are provided with the opportunity to use their tools and techniques to analyse a HAR dataset and to compare their results with others in the ADL context. Each year, the dataset and the problem to be addressed will be changed to align with state-of-the-art research topics. The dataset selected for the 1st UCAmI Cup [11] is the HAR dataset of ADL generated by the University of Jaén (UJA) in its newly created UJAmI Smart Lab. This paper aims to review the infrastructure of the UJAmI Smart Lab and to present in detail the selected dataset used in the 1st UCAmI Cup, together with the details of the competition.
The remainder of the paper is structured as follows: Section 2 presents the UJAmI Smart Lab of the University of Jaén where the HAR dataset of ADL was generated. Section 3 presents a general description of the dataset in addition to its structure and its format. Section 4 presents each kind of data source that is contained in the dataset: binary sensor data, proximity data, acceleration data and, finally, location data of the smart floor. Section 5 presents details of the competition in the 1st UCAmI Cup and the results attained. Finally, Section 6 presents the conclusions and future works.

2. UJAmI Smart Lab of the University of Jaén

The University of Jaén Ambient Intelligence (UJAmI) Smart Lab [12,13,14] represents an innovative space that plays a key role in the implementation of new ground-breaking research within the realms of Ambient Intelligence (AmI) [4], a paradigm in information technology aimed at empowering people’s capabilities by means of digital environments.
The UJAmI Smart Lab [14] was created in 2014 with the aim of producing a real apartment that is sensitive, adaptive and responsive to human needs (habits, gestures and emotions), thereby underpinning assistive technology based solutions in the home.
The UJAmI SmartLab measures approximately 25 square meters; its measurements are 5.8 m long and 4.6 m wide. It is divided into five regions: entrance, kitchen, workplace, living room and a bedroom with an integrated bathroom. The layout of the UJAmI SmartLab is presented in Figure 1.
A set of heterogeneous sensors has been deployed in different areas of the environment in order to capture human-environment interactions in addition to inhabitant behaviour. Currently, a web-based system for managing and monitoring smart environments is deployed [15], based on openHAB [16], with an approach for distributing and processing heterogeneous data based on a representation with fuzzy linguistic terms [17]. It is, however, beneficial to utilize a framework that includes a common protocol for data collection, a common format for data exchange, and a data repository and related tools to underpin research within the domain of activity recognition. For this reason, the UJAmI SmartLab is moving towards the deployment of a common middleware platform referred to as SensorCentral [18], which is compatible with an open data format referred to as the Open Data Initiative (ODI) [19].

3. General Description of UJAmI HAR Dataset

In this section, the HAR dataset of ADL from the UJAmI SmartLab used in the 1st UCAmI Cup is described.
The UJA dataset from the UJAmI SmartLab is composed of four data sources that have been obtained whilst an inhabitant performed 246 instances of activity classes over a period of 10 days. The dataset is divided into two sets:
- Part 1: Labelled training set with seven days of recordings that contains 169 instances.
- Part 2: Unlabelled test set with three days of recordings that contains 77 instances.
The four data sources are as follows:
  • Event stream generated by 30 binary sensors.
  • Proximity information between a smart watch worn by an inhabitant and a set of 15 Bluetooth Low Energy (BLE) beacons deployed in the UJAmI SmartLab.
  • Acceleration generated by the smart watch.
  • An intelligent floor with 40 modules that provides location data.
The inhabitant who performed the activities was a 24-year-old male student from the University of Jaén. During data collection, an LG Urbane smart watch [20] was worn on the participant’s right hand. For reasons of energy saving, recording of acceleration data and proximity related information ceased when the inhabitant went to bed, in addition to when he left the UJAmI SmartLab.
The dataset includes 24 different types of activities, as presented in Table 1, together with the frequency of each activity in the training set.
The activities being undertaken during data collection were annotated by using NFC tags and a smartphone. This process was used to label the beginning and end of each activity.
The root folder of the dataset contains the folders and files as illustrated in Figure 2.
  • The folder named “Pictures” (UCAmI Cup\Pictures\) contains:
    o A folder named “Binary Sensors” with pictures of each binary sensor used for data collection in the UJAmI SmartLab.
    o A folder named “BLE sensor” with pictures of each BLE sensor used in the UJAmI SmartLab during data collection.
    o A folder named “Smart Lab” which contains pictures of each area in the UJAmI SmartLab.
  • The folder named “Layout” (UCAmI Cup\Layout\) contains:
    o A file named “sensors.png” which shows the layout of the UJAmI SmartLab and where each of the binary sensors is located.
    o A file named “proximity.png” which shows the layout of the UJAmI SmartLab and where each of the BLE sensors is located.
    o A file named “Coordinates.docx” which contains a table with the X and Y coordinates of each binary sensor and each BLE sensor in the UJAmI SmartLab.
    o A file named “floor.png” which shows the layout of the smart floor in the UJAmI SmartLab.
    o A file named “floor-modules.png” which shows the layout of the smart floor in the UJAmI SmartLab with the ID of each module in the layout.
  • The folder named “Data” (UCAmI Cup\Data\) contains 10 days of recordings divided into the following two folders (refer to Figure 3):
    o The folder named Test contains the data for 3 days and is unlabelled.
    o The folder named Training contains the data for 7 days and is fully labelled.
Each of the 10 sub-folders contains the data for one recording day.
The name of each folder for a recording day has the following format: YYYY-MM-DD, with YYYY representing the year, MM the month and DD the day. Each of these folders contains three sub-folders, one for each time routine of the day. The time routines are represented by T, which can take the following values: A for the morning, B for the afternoon and C for the evening.
In a similar manner, each of the three sub-folders is named according to the day of the recording and the time routine (YYYY-MM-DD-T). Each routine folder contains the following files according to the four data sources: Binary Sensors, Proximity (BLE sensors), Acceleration and Floor.
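To make this folder convention concrete, the following Python sketch (not part of the dataset release; the local root path and the exact file-name suffixes are assumptions based on the description above) enumerates the routine folders of a split and lists the data files found in each one.

```python
# Sketch for walking the UJA dataset folder structure described above.
# DATASET_ROOT and the file-name suffixes are assumptions.
from pathlib import Path

DATASET_ROOT = Path("UCAmI Cup/Data")          # hypothetical local path
SOURCES = ["activity", "sensors", "proximity", "acceleration", "floor"]

def list_routine_folders(split: str):
    """Yield (date, routine, folder) for every YYYY-MM-DD-T folder in a split."""
    for day_folder in sorted((DATASET_ROOT / split).iterdir()):
        if not day_folder.is_dir():
            continue
        for routine_folder in sorted(p for p in day_folder.iterdir() if p.is_dir()):
            # Folder names follow the YYYY-MM-DD-T pattern, e.g. 2017-11-08-A
            date, routine = routine_folder.name[:10], routine_folder.name[-1]
            yield date, routine, routine_folder

for date, routine, folder in list_routine_folders("Training"):
    files = [folder / f"{folder.name}-{source}.csv" for source in SOURCES]
    print(date, routine, [f.name for f in files if f.exists()])
```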
Furthermore, each routine folder in the training set contains the file YYYY-MM-DD-T-activity.csv with the sequence of activities that were carried out, together with the timestamps of the beginning and the end of each activity (refer to Figure 4).
The file named YYYY-MM-DD-T-activity.csv contains the following fields:
  • DATE BEGIN: Timestamp when the inhabitant starts the activity.
  • DATE END: Timestamp when the inhabitant finishes the activity.
  • ACTIVITY: Name of the activity carried out by the inhabitant.
  • HABITANT: Person that carries out the activity.
The name of the inhabitant is imaginary and has been included to support the future extension of the AR evaluation for multiple occupancy scenarios. The 1st UCAmI Cup is, however, only concerned with a single inhabitant scenario.
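As an illustration, a minimal Python sketch for reading one of these activity files is given below; the field separator and timestamp format are assumptions and may need to be adjusted to match the actual files.

```python
# Sketch (assumption-laden): loading one YYYY-MM-DD-T-activity.csv into a list
# of labelled intervals. The ";" separator and the timestamp format are guesses.
import csv
from datetime import datetime

TS_FORMAT = "%Y-%m-%d %H:%M:%S"                # assumed timestamp format

def load_activity_intervals(path):
    intervals = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter=";"):
            intervals.append({
                "begin": datetime.strptime(row["DATE BEGIN"], TS_FORMAT),
                "end": datetime.strptime(row["DATE END"], TS_FORMAT),
                "activity": row["ACTIVITY"],
                "inhabitant": row["HABITANT"],
            })
    return intervals
```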
As an example, folders and files included in the day-folder named “2017-11-08-A” are listed in Figure 5. This folder is contained in the training set.
As an example, the files included in the day-folder named “2017-11-09-A” are presented in Figure 6. This day is contained in the test set and therefore does not include any labelling of the data.
  • The file named “time-slots-Training.csv” (UCAmI Cup\time-slots training.csv) stores the annotations of the activities. The dataset is divided into 30 s timeslots, and each timeslot is labelled only with the activities carried out in that time period. An excerpt from this file is presented in Figure 7.
  • The file named “results.csv” (UCAmI Cup\results.csv) contains the timeslots for the test set; however, none of the activities have been labelled. This labelling exercise is to be completed by the participants in the UCAmI Cup. An excerpt from this file is presented in Figure 8.
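The sketch below illustrates, under the same assumptions as before, how the labelled intervals of a routine could be mapped onto 30 s timeslots of the kind stored in time-slots-Training.csv and results.csv; it is only one possible way of producing such a view.

```python
# Sketch: split a routine into 30-second timeslots and attach the activities
# that overlap each slot. `intervals` is a list of dicts with "begin"/"end"
# datetimes and an "activity" label (e.g. the output of load_activity_intervals).
from datetime import timedelta

SLOT = timedelta(seconds=30)

def timeslot_labels(intervals, start, end):
    slots = []
    t = start
    while t < end:
        labels = sorted({iv["activity"] for iv in intervals
                         if iv["begin"] < t + SLOT and iv["end"] > t})
        slots.append((t, t + SLOT, labels))
        t += SLOT
    return slots
```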

4. Data Sources of the UJA HAR Dataset

In this Section, the four data sources of the UJAmI dataset are described in detail.

4.1. Binary Sensor Data File

In the UJAmI SmartLab, a set of 30 binary sensors was deployed. All of them transmit a binary value together with a timestamp. The set of binary sensors is categorised into the following three sensor types, for which the meaning of the values is described:
  • Magnetic contact. This is a wireless magnetic sensor [21] that works with the Z-Wave protocol. When the sensor detects that its two pieces have been separated, it sends an event with a value that represents “open”. When the pieces are put back together, it sends an event with a value that represents “close”. In our dataset, this kind of sensor is used to track the position of doors (open or closed), and is also placed on objects that have a fixed place when they are not being used, for example a TV remote control, medicine box or bottle of water. In these instances, when the value is “close” the object is not being used, whereas when the value is “open” the object is being used.
  • Motion. This is a wireless PIR sensor that works with the ZigBee protocol and is used to detect whether an inhabitant has moved in or out of the sensor’s range. It has a maximum IR detection range of 7 metres with a sample rate of 5 s. When motion is detected, the sensor sends a value that represents movement. When the movement ceases, the sensor sends a value that represents no movement.
  • Pressure. This is a wireless sensor that works with the Z-Wave protocol and is connected to a textile layer. When pressure is detected in the textile layer, the sensor sends a value that represents pressure. When the pressure ceases, the sensor sends a value that represents no pressure. Usually, this kind of sensor is used in sofas, chairs or beds.
The details of the objects/sensors and their locations are presented in Table 2.
In the folder “UCAmI Cup\Pictures\Binary sensors”, pictures of each binary sensor can be found. The coordinates of these sensors are listed in the file “UCAmI Cup\Layout\Coordinates.docx” and, in addition, the approximate position of each can be found in the file “UCAmI Cup\Layout\sensors.png”.
The files named YYYY-MM-DD-T-sensors.csv contain the following fields:
  • TIMESTAMP: This indicates when a sensor sends an event.
  • OBJECT: ID of the object associated with the sensor that sends the event.
  • STATE: Value of the sensor event.
  • HABITANT: Person who is performing the activity.
Figure 9 contains an excerpt from a file YYYY-MM-DD-T-sensors.csv.
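A minimal sketch for consuming this event stream is shown below; as before, the separator, column names and timestamp format are assumptions rather than a specification of the files.

```python
# Sketch: load a YYYY-MM-DD-T-sensors.csv event stream and query the last known
# state of every binary sensor at an arbitrary point in time.
import csv
from datetime import datetime

TS_FORMAT = "%Y-%m-%d %H:%M:%S"                # assumed timestamp format

def load_sensor_events(path):
    with open(path, newline="", encoding="utf-8") as f:
        return [(datetime.strptime(r["TIMESTAMP"], TS_FORMAT), r["OBJECT"], r["STATE"])
                for r in csv.DictReader(f, delimiter=";")]

def states_at(events, when):
    """Return {object_id: last state emitted at or before `when`}."""
    state = {}
    for ts, obj, value in sorted(events):
        if ts > when:
            break
        state[obj] = value
    return state
```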

4.2. Proximity Data

The proximity data was collected through an Android application installed on the smart watch of the inhabitant and a set of 15 BLE beacons with a sample frequency of 0.25 Hz. The beacon model used was the Sticker from Estimote [22].
When the smart watch reads the signal from a BLE beacon, it collects a Received Signal Strength Indicator (RSSI) measurement. Each BLE beacon is configured with the broadcasting power with which it transmits its signal. The smart watch is able to read the RSSIs from several BLE beacons when they are in range. The proximity between the wearable device and a BLE beacon impacts upon the RSSI: the greater the RSSI received by the smart watch, the smaller the distance between it and the BLE beacon.
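Although the dataset provides raw RSSI values rather than distances, the relationship can be illustrated with the common log-distance path-loss model sketched below; the reference power at one metre and the path-loss exponent are assumptions that would have to be calibrated for the Estimote stickers and the UJAmI SmartLab, and this model is not part of the dataset itself.

```python
# Illustrative sketch only: log-distance path-loss model for converting an RSSI
# reading into a rough distance estimate. tx_power (RSSI at 1 m) and the
# path-loss exponent n are assumed, uncalibrated values.
def rssi_to_distance(rssi_dbm: float, tx_power: float = -60.0, n: float = 2.0) -> float:
    """Estimate the distance in metres implied by an RSSI measurement."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * n))

print(rssi_to_distance(-70))   # ~3.2 m under the assumed parameters
```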
A total of 15 BLE beacons were deployed in the UJAmI SmartLab, as presented in Table 3. For small items, for example the toothbrush and the medicine box, the BLE broadcasting power (measured in decibels) was set to a smaller range in an effort to reduce false positives.
In the folder “UCAmI Cup\Pictures\BLE sensors” pictures of each BLE sensor according to the code of each sensor can be found. The coordinates of these sensors are illustrated in the file “UCAmI Cup\Layout\Coordinates.docx” with the approximate position being specified in the file “UCAmI Cup\Layout\proximity.png”.
The files named YYYY-MM-DD-T-proximity.csv contain the following fields:
  • TIMESTAMP: This indicates when the data of a BLE beacon is read.
  • ID: Unique identifier of the BLE beacon associated to an object.
  • OBJECT: Object where the BLE beacon has been deployed.
  • RSSI: RSSI read by the smart watch.
Further information relating to the methods used to obtain the proximity and the RSSI from the BLE beacons can be found in the product’s SDK [22]. Figure 10 illustrates an excerpt from a file YYYY-MM-DD-T-proximity.csv.
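As an example of how this file could be used, the sketch below (column names and separator assumed) picks, for each timestamp, the beacon with the strongest RSSI, i.e. the object the inhabitant is presumably closest to.

```python
# Sketch: for every timestamp in a proximity file, keep the object whose BLE
# beacon reported the strongest (largest) RSSI.
import csv

def nearest_object_per_timestamp(path):
    best = {}                                   # timestamp -> (rssi, object)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter=";"):
            ts, obj, rssi = row["TIMESTAMP"], row["OBJECT"], float(row["RSSI"])
            if ts not in best or rssi > best[ts][0]:
                best[ts] = (rssi, obj)
    return {ts: obj for ts, (rssi, obj) in best.items()}
```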

4.3. Acceleration Data

The acceleration data were collected through an Android application installed on the smart watch of the inhabitant, with a sample frequency of 50 Hz. The acceleration was collected in three axes and is expressed in metres per second squared (m/s²) [23].
The files named YYYY-MM-DD-T-acceleration.csv contain the acceleration data generated by the smart watch while the inhabitant carried out the different activities.
The files named YYYY-MM-DD-T-acceleration.csv contain the following fields:
  • TIMESTAMP: This indicates when the data is collected.
  • X: The acceleration in the x-axis.
  • Y: The acceleration in the y-axis.
  • Z: The acceleration in the z-axis.
Figure 11 illustrates an excerpt from a file YYYY-MM-DD-T- acceleration.csv.
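One simple way to exploit this stream is to compute windowed statistics of the acceleration magnitude, as in the sketch below; the 2 s window length (100 samples at 50 Hz) is an arbitrary illustrative choice, not a recommendation from the dataset.

```python
# Sketch: mean and standard deviation of the acceleration magnitude over
# non-overlapping windows of a tri-axial (X, Y, Z) signal sampled at 50 Hz.
import numpy as np

def windowed_features(xyz: np.ndarray, window: int = 100) -> np.ndarray:
    """xyz has shape (n_samples, 3); returns an array of shape (n_windows, 2)."""
    magnitude = np.linalg.norm(xyz, axis=1)
    n_windows = len(magnitude) // window
    windows = magnitude[: n_windows * window].reshape(n_windows, window)
    return np.stack([windows.mean(axis=1), windows.std(axis=1)], axis=1)
```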

4.4. Floor Capacitance Data

The UJAmI SmartLab has a SensFloor® [24], which consists of a suite of capacitive sensors that lie below the floor.
The floor of the UJAmI SmartLab is formed by 40 modules that are distributed in a matrix of 4 rows and 10 columns. A module is composed of eight sensor fields, each sensor in a module is associated with an id-number. The layout of the SensFloor in the UJAmI SmartLab is presented in Figure 12.
The files named YYYY-MM-DD-T-floor.csv contain the following fields:
  • TIMESTAMP: This indicates when the capacitance data of a module is collected.
  • DEVICE: Identifies a module, per row and per column of the floor matrix.
  • CAPACITANCE: Values of the eight sensors of a module when the capacitances change. The first value corresponds to the sensor with id-number 1 and the last value to the sensor with id-number 8.
Figure 13 presents an excerpt from the file YYYY-MM-DD-T-floor.csv.
In the folder “UCAmI Cup\Layout\” a file named “floor.png” can be found that shows the layout of the smart floor in the UJAmI SmartLab and a file named “floor-modules.png” that shows the layout of the smart floor in the UJAmI SmartLab with the ID of the modules.
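The sketch below (field separator and in-field value separator are assumed, and the exact encoding of the DEVICE field is not reproduced) illustrates one crude use of this file: ranking floor modules by the total capacitance they report, as a rough indication of where the inhabitant has been stepping.

```python
# Sketch: aggregate the reported capacitance values per floor module and rank
# the modules by total activity.
import csv
from collections import Counter

def module_activity(path):
    activity = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f, delimiter=";"):
            # CAPACITANCE holds the eight per-sensor values of one module;
            # the value separator inside the field is assumed to be ",".
            values = [float(v) for v in row["CAPACITANCE"].split(",")]
            activity[row["DEVICE"]] += sum(values)
    return activity.most_common()
```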

5. Competition

The dataset presented in Section 3 and Section 4 was available for participants to train their methods and tools. All four data sources were available; participants could use one source, several of them or all of them. In order to evaluate participants’ approaches and compare results within the community, the unlabelled test set with three days of recordings that contains 77 instances was provided.
A total of 26 participants from 10 countries (Spain, China, U.K., Argentina, Mexico, Ireland, Colombia, Sweden, South Korea and Japan) made contact with the organizers of the 1st UCAmI Cup to obtain the UJA dataset. Participants were required to use their trained methods and tools to recognize each activity in the test dataset. Participants were subsequently required to submit their predicted activities from the benchmarking test to the organizers of the UCAmI Cup in the format of the file named results.csv, which was described in Section 3.
By the closing date of the competition, six contributions were submitted and the organizers computed the results in terms of classification accuracy. Let $N_{A_i}$ be the number of activities from each class $A_i$, and $TP_{A_i}$ the number of activities of class $A_i$ correctly classified; the classification accuracy was then defined by Equation (1):

$$\text{Accuracy} = \frac{\sum_{i} TP_{A_i}}{\sum_{i} N_{A_i}} \quad (1)$$
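Equation (1) corresponds to the usual overall accuracy; a short sketch for computing it from two aligned lists of ground-truth and predicted labels (one entry per evaluated instance or timeslot) is given below.

```python
# Sketch: overall classification accuracy over aligned label lists.
def accuracy(y_true, y_pred):
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

print(accuracy(["Act01", "Act05", "Act17"], ["Act01", "Act05", "Act09"]))  # 0.666...
```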
Once the deadline to participate in the UCAmI Cup had expired (10 May 2018), an Excel file containing the ground truth of each activity in the test set was included in the shared UJA dataset [25].

6. Conclusions

In this paper, the human activity recognition dataset of activities of daily living generated at the University of Jaén has been presented within the context of the first edition of the UCAmI Cup, held within the International Conference on Ubiquitous Computing and Ambient Intelligence. To this end, the UJAmI Smart Lab, where the dataset was generated, was described. A general description of the dataset, its structure and its format has been presented. Furthermore, the four data sources included in the dataset have been presented in detail: (i) event streams from 30 binary sensors, (ii) location data from an intelligent floor, (iii) proximity data between a smart watch worn by the inhabitant and 15 Bluetooth Low Energy beacons and (iv) acceleration of the smart watch. Finally, the details of the competition in the UCAmI Cup have been provided. Our future work is focused on gathering all the techniques associated with the analysis of the 1st UCAmI Cup in order to publish, for the first time, a consolidated report of the performance of HAR approaches on a common dataset. Planning for the 2nd UCAmI Cup, which will involve an ADL dataset focused on multi-occupancy, is currently underway.

Supplementary Materials

The UJA HAR dataset used in this contribution is available online at https://drive.google.com/open?id=1Ntu2DfQbHqsCpdHSnXVK6I6eJe7qpffk

Author Contributions

Conceptualization, M.E. and C.N.; Methodology, M.E., J.M.Q. and C.N.; Data collection and pre-processing, M.E. and C.N.; Validation, M.E. and J.M.; Writing-Review & Editing, M.E., J.M.Q. and C.N.

Funding

This work was supported by the REMIND project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 734355. This contribution has also been supported by the project PI-0203-2016 from the Council of Health for the Andalusian Health Service, Spain, together with the research project TIN2015-66524-P from the Spanish government. Invest Northern Ireland is acknowledged for partially supporting this project under the Competence Centre Programme Grant RD0513853–Connected Health Innovation Center.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

HAR  Human Activity Recognition
ADL  Activities of Daily Living
UJAmI  University of Jaén Ambient Intelligence
UCAmI  Ubiquitous Computing and Ambient Intelligence

References

  1. Rashidi, P.; Mihailidis, A. A survey on ambient-assisted living tools for older adults. IEEE J. Biomed. Health Inf. 2013, 17, 579–590.
  2. Machine Learning Repository. Available online: http://archive.ics.uci.edu/ml/index.php (accessed on 13 August 2018).
  3. PhysioNet. Available online: https://www.physionet.org/ (accessed on 13 August 2018).
  4. Chen, L.; Nugent, C.D.; Wang, H. A knowledge-driven approach to activity recognition in smart homes. IEEE Trans. Knowl. Data Eng. 2012, 24, 961–974.
  5. Sussex-Huawei Locomotion Challenge. Available online: http://www.shl-dataset.org/activity-recognition-challenge/ (accessed on 13 August 2018).
  6. Indoor Positioning and Indoor Navigation Conference. Available online: http://www.ipin2017.org/ (accessed on 13 August 2018).
  7. iNaturalist Challenge. Available online: https://www.kaggle.com/c/inaturalist-challenge-at-fgvc-2017 (accessed on 13 August 2018).
  8. Computing in Cardiology Challenges. Available online: https://www.physionet.org/challenge/ (accessed on 13 August 2018).
  9. AI Video Competition. Available online: https://ijcai-17.org/competitions.html (accessed on 13 August 2018).
  10. Data Mining and Knowledge Discovery competition. Available online: http://www.kdd.org/kdd-cup (accessed on 13 August 2018).
  11. UCAmI Cup. Available online: http://mamilab.esi.uclm.es/ucami2018/UCAmICup.html (accessed on 13 August 2018).
  12. UJAmI Smart Lab. Available online: http://ceatic.ujaen.es/ujami/en (accessed on 13 August 2018).
  13. Sagha, H. Benchmarking classification techniques using the opportunity human activity dataset. In Proceedings of the 2011 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Anchorage, AK, USA, 9–12 October 2011; pp. 36–40.
  14. Espinilla, M.; Martínez, L.; Medina, J.; Nugent, C. The Experience of Developing the UJAmI Smart Lab. IEEE Access 2018, 6, 34631–34642.
  15. Zafra, D.; Medina, J.; Martínez, L.; Nugent, C.; Espinilla, M. A Web System for Managing and Monitoring Smart Environments. In Proceedings of the Bioinformatics and Biomedical Engineering–4th International Conference, IWBBIO, Granada, Spain, 20–22 April 2016; Springer: Berlin, Germany, 2016; Volume 9656, pp. 677–688.
  16. OpenHab. Available online: https://www.openhab.org/ (accessed on 13 August 2018).
  17. Medina, J.; Martínez, L.; Espinilla, M. Subscribing to fuzzy temporal aggregation of heterogeneous sensor streams in real-time distributed environments. Int. J. Commun. Syst. 2017, 30, 3238.
  18. Rafferty, J.; Nugent, C.D. A Scalable, Research Oriented, Generic, Sensor Data Platform. IEEE Access 2018.
  19. McChesney, I.; Nugent, C.; Rafferty, J.; Synnott, J. Exploring an Open Data Initiative ontology for Shareable Smart Environment Experimental Datasets. In Lecture Notes in Computer Science; Volume 10586 LNCS; Springer: Berlin, Germany, 2017; pp. 400–412.
  20. Wearable Smart Watch Website. Available online: http://www.lg.com/es/wearables/lg-LGW150-g-watch-urbane (accessed on 13 August 2018).
  21. Everspring Website. Available online: http://www.everspring.com/portfolio-item/sm810-doorwindow-contact-sensor/ (accessed on 13 August 2018).
  22. Estimote Website. Available online: https://estimote.com/ (accessed on 13 August 2018).
  23. Android SDK. Available online: https://developer.android.com/reference/android/hardware/SensorManager.html#SENSOR_ACCELEROMETER (accessed on 13 August 2018).
  24. SensFloor®. Available online: http://future-shape.com/en/system (accessed on 13 August 2018).
  25. Dataset 1st UCAmI Cup. Available online: https://drive.google.com/open?id=1Ntu2DfQbHqsCpdHSnXVK6I6eJe7qpffk (accessed on 13 August 2018).
Figure 1. Layout of the UJAmI SmartLab: (a) plan view; (b) isometric view.
Figure 2. Folders and files in the root folder of the UJA dataset.
Figure 3. File structure of the data folder.
Figure 4. Excerpt from the file activity.csv.
Figure 5. Examples of files included in the data folder called “2017-11-08-A” in the training set.
Figure 6. Files in the data folder called “2017-11-09-A” in the test set.
Figure 7. An excerpt from the file named time-slots-Training.xls.
Figure 8. An excerpt from the file named results.xls.
Figure 9. Excerpt from the file sensors.csv.
Figure 10. Excerpt from a proximity.csv file.
Figure 11. Excerpt from an acceleration.csv file.
Figure 12. Layout of the smart floor in the UJAmI SmartLab.
Figure 13. Excerpt from a floor.csv file.
Table 1. Activities recorded in the UJA dataset.
ID | Activity Name | Freq. | Description
Act01 | Take medication | 7 | This activity involved the inhabitant going to the kitchen, taking some water, removing medication from a box and swallowing the pills.
Act02 | Prepare breakfast | 7 | This activity involved the inhabitant going to the kitchen and taking some products for breakfast. This activity can involve (i) making a cup of tea with the kettle or (ii) making a hot chocolate drink with milk in the microwave. It also involves placing things to eat in the dining room, but not sitting down to eat.
Act03 | Prepare lunch | 6 | This activity involved the inhabitant going to the kitchen and taking some products from the refrigerator and pantry. This activity can involve (i) preparing a plate of hot food on the stove, for example pasta, or (ii) heating a precooked dish in the microwave. It also involves placing things to eat in the dining room, but not sitting down to eat.
Act04 | Prepare dinner | 7 | This activity involved the inhabitant going to the kitchen and taking some products from the refrigerator and pantry. This activity can involve (i) preparing a plate of hot food on the stove, for example pasta, or (ii) heating a precooked dish in the microwave. It also involves placing things to eat in the dining room, but not sitting down to eat.
Act05 | Breakfast | 7 | This activity involved the inhabitant going to the dining room in the kitchen in the morning and sitting down to eat. When the inhabitant finishes eating, he places the utensils in the sink or in the dishwasher.
Act06 | Lunch | 6 | This activity involved the inhabitant going to the dining room in the kitchen in the afternoon and sitting down to eat. When the inhabitant finishes eating, he places the utensils in the sink or in the dishwasher.
Act07 | Dinner | 7 | This activity involved the inhabitant going to the dining room in the kitchen in the evening and sitting down to eat. When the inhabitant finishes eating, he places the utensils in the sink or in the dishwasher.
Act08 | Eat a snack | 5 | This activity involved the inhabitant going to the kitchen to take fruit or a snack and to eat it in the kitchen or in the living room. This activity can imply that the utensils are placed in the sink or in the dishwasher.
Act09 | Watch TV | 6 | This activity involved the inhabitant going to the living room, taking the remote control and sitting down on the sofa; when he finished, the remote control was left close to the TV.
Act10 | Enter the SmartLab | 12 | This activity involved the inhabitant entering the SmartLab through the main door and putting the keys into a small basket.
Act11 | Play a videogame | 1 | This activity involved the inhabitant going to the living room, taking the remote controls of the TV and XBOX and sitting on the sofa. When the inhabitant finishes playing, he gets up from the sofa and places the controls near the TV.
Act12 | Relax on the sofa | 1 | This activity involved the inhabitant going to the living room, sitting on the sofa and, after several minutes, getting up off the sofa.
Act13 | Leave the SmartLab | 9 | This activity involved the inhabitant going to the entrance, opening the main door, leaving the SmartLab and then closing the main door.
Act14 | Visit in the SmartLab | 1 | This activity involved the inhabitant going to the entrance, opening the main door, chatting with someone at the main door and then closing the door.
Act15 | Put waste in the bin | 11 | This activity involved the inhabitant going to the kitchen, picking up the waste, then taking the keys from a small basket in the entrance and exiting the SmartLab. Usually, the inhabitant comes back after around 2 min, leaving the keys back in the small basket.
Act16 | Wash hands | 6 | This activity involved the inhabitant going to the bathroom, opening/closing the tap, lathering his hands, and then rinsing and drying them.
Act17 | Brush teeth | 21 | This activity involved the inhabitant going to the bathroom, brushing his teeth and opening/closing the tap.
Act18 | Use the toilet | 10 | This activity involved the inhabitant going to the bathroom and using the toilet, opening/closing the toilet lid and flushing the cistern.
Act19 | Wash dishes | 2 | This activity involved the inhabitant going to the kitchen, placing the dirty dishes in the dishwasher and then placing the dishes back in the right place.
Act20 | Put washing into the washing machine | 6 | This activity involved the inhabitant going to the bedroom, picking up the laundry basket, going to the kitchen, putting clothes in the washing machine, waiting around 20 min and then taking the clothes out of the washing machine and placing them in the bedroom closet.
Act21 | Work at the table | 2 | This activity involved the inhabitant going to the workplace, sitting down, doing work and, finally, getting up.
Act22 | Dressing | 15 | This activity involved the inhabitant going to the bedroom, putting dirty clothes in the laundry basket, opening the closet, putting on clean clothes and then closing the closet.
Act23 | Go to the bed | 7 | This activity involved the inhabitant going to the bedroom, lying in bed and sleeping. This activity is terminated once the inhabitant has stayed 1 min in bed.
Act24 | Wake up | 7 | This activity involved the inhabitant getting up and out of the bed.
Table 2. Details of the binary sensors deployed in the UJAmI Smart Lab.
ID | OBJECT | X | Y | Type | STATE 1 | STATE 2
M01 | Door | 450 | 460 | Contact | Open | Close
TV0 | TV | 119 | 252 | Contact | Open | Close
SM1 | Motion sensor–Kitchen | 580 | 260 | Motion | Movement | No movement
SM3 | Motion sensor–bathroom | 270 | 128 | Motion | Movement | No movement
SM4 | Motion sensor–bedroom | 146 | 0 | Motion | Movement | No movement
SM5 | Motion sensor–sofa | 164 | 249 | Motion | Movement | No movement
D01 | Refrigerator | 510 | 144 | Contact | Open | Close
D02 | Microwave | 480 | 37 | Contact | Open | Close
D03 | Wardrobe | 59 | 169 | Contact | Open | Close
D04 | Cupboard cups | 546 | 104 | Contact | Open | Close
D05 | Dishwasher | 487 | 63 | Contact | Open | Close
D07 | Top WC | 254 | 56 | Contact | Open | Close
D08 | Closet | 546 | 194 | Contact | Open | Close
D09 | Washing machine | 408 | 63 | Contact | Open | Close
D10 | Pantry | 546 | 149 | Contact | Open | Close
H01 | Kettle | 467 | 24 | Contact | Open | Close
C01 | Medication box | 471 | 0 | Contact | Open | Close
C02 | Fruit platter | 434 | 0 | Contact | Open | Close
C03 | Cutlery | 515 | 116 | Contact | Open | Close
C04 | Pots | 515 | 116 | Contact | Open | Close
C05 | Water bottle | 567 | 170 | Contact | Open | Close
C07 | Remote XBOX | 117 | 252 | Contact | Present | No present
C08 | Trash | 489 | 233 | Contact | Open | Close
C09 | Tap | 306 | 107 | Contact | Open | Close
C10 | Tank | 310 | 44 | Contact | Open | Close
C12 | Laundry basket | 46 | 163 | Contact | Present | No present
C13 | Pyjamas drawer | 59 | 169 | Contact | Open | Close
C14 | Bed | 140 | 94 | Pressure | Pressure | No Pressure
C15 | Kitchen faucet | 558 | 98 | Contact | Open | Close
S09 | Pressure sofa | 130 | 407 | Pressure | Pressure | No Pressure
Table 3. Details of the set of BLE sensors deployed in the UJAmI SmartLab and their respective BLE broadcasting power.
Name | Broadcasting Power (dB)
1–TV controller | −12
2–Book | −12
3–Entrance door | −12
4–Medicine box | −16
5–Food cupboard | −12
6–Fridge | −12
7–Pot drawer | −12
8–Water bottle | −12
9–Garbage can | −12
10–Wardrobe door | −12
11–Pyjama drawer | −12
12–Bed | −12
13–Bathroom tap | −12
14–Toothbrush | −16
15–Laundry basket | −12

