Proceedings
  • Proceeding Paper
  • Open Access

26 October 2018

UCAmI Cup. Analyzing the UJA Human Activity Recognition Dataset of Activities of Daily Living †

1 Department of Computer Science, University of Jaén, 23071 Jaén, Spain
2 School of Computing, Ulster University, Jordanstown Campus, Belfast BT37 0QB, UK
* Author to whom correspondence should be addressed.
Presented at the 12th International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI 2018), Punta Cana, Dominican Republic, 4–7 December 2018.
This article belongs to the Proceedings UCAmI 2018

Abstract

Many real-world applications that focus on addressing the needs of a human require information pertaining to the activities being performed. The UCAmI Cup is an event held within the context of the International Conference on Ubiquitous Computing and Ambient Intelligence, where delegates are given the opportunity to use their tools and techniques to analyse a previously unseen human activity recognition dataset and to compare their results with others working in the same domain. In this paper, the human activity recognition dataset used relates to activities of daily living generated in the UJAmI Smart Lab, University of Jaén. The dataset chosen for the first edition of the UCAmI Cup represents 246 activities performed over a period of ten days by a single inhabitant. The dataset includes four data sources: (i) event streams from 30 binary sensors, (ii) location data from an intelligent floor, (iii) proximity data between a smart watch worn by the inhabitant and 15 Bluetooth Low Energy beacons and (iv) acceleration of the smart watch. In this first edition of the UCAmI Cup, 26 participants from 10 different countries contacted the organizers to obtain the dataset.

1. Introduction

Activity recognition systems deployed in smart homes are characterized by their ability to detect Activities of Daily Living (ADL) in order to improve assistance. Such solutions have been adopted by smart homes in practice and have delivered promising results for improving the quality of care services for elderly people and responsive assistance in emergency situations [].
Data driven approaches [] developed for the purposes of Human Activity Recognition (HAR) of ADLs require large annotated datasets which offer high levels of quality in terms of both the ground truth and the generalisation of the underlying data. A limited number of online repositories have supported the notion of providing openly available datasets for research and development purposes. Two key examples are the UC Irvine Machine Learning repository [] and PhysioNet []. The former has recently extended its datasets to include a small number of HAR related resources. The European Union funded project OPPORTUNITY created a common platform whereby researchers working in different organizations could access a common dataset and were therefore able to compare their results with others []. Beyond the aforementioned, efforts to provide high quality, openly available, large scale datasets have been largely uncoordinated. There remains a lack of frameworks in which multiple researchers can compare their results by using their own tools and techniques to analyse the same HAR problem relating to an ADL dataset. The competition closest to the UCAmI Cup is the recently announced Sussex-Huawei Locomotion Challenge [], where the Sussex-Huawei Locomotion Dataset is used to recognize 8 modes of locomotion and transportation (Car, Bus, Train, Subway, Walk, Run, Bike and Still) from the inertial sensor data of a smartphone (accelerometer, gyroscope, magnetometer, linear acceleration, gravity, orientation (quaternions) and ambient pressure). Nevertheless, this competition does not aim to develop solutions for smart homes to improve assistance.
The concept of comparing techniques on openly available data has also been adopted in other domains, such as indoor localization [] and automatic image classification [], as well as in the PhysioNet/CinC challenges [], the IJCAI competitions [] and the KDD Cup challenge [].
In order to address this gap in the domain of HAR for ADL, the UCAmI Cup has been announced. The UCAmI Cup aims to be an annual event within the forum of the International Conference on Ubiquitous Computing and Ambient Intelligence (UCAmI), where delegates are provided with the opportunity to use their tools and techniques to analyse a HAR dataset and to compare their results with others in the ADL context. Each year, the dataset and the problem to be addressed will change to align with the major research topics considered as state-of-the-art trends. The dataset selected for the 1st UCAmI Cup [] is the HAR dataset of ADL generated by the University of Jaén (UJA) in its newly created UJAmI Smart Lab. This paper aims to review the details of the infrastructure of the UJAmI Smart Lab, in addition to presenting in detail the selected dataset used in the 1st UCAmI Cup together with the details of the competition.
The remainder of the paper is structured as follows: Section 2 presents the UJAmI Smart Lab of the University of Jaén, where the HAR dataset of ADL was generated. Section 3 presents a general description of the dataset in addition to its structure and its format. Section 4 presents each kind of data source contained in the dataset: binary sensor data, proximity data, acceleration data and, finally, location data from the smart floor. Section 5 presents details of the competition in the 1st UCAmI Cup and the results attained. Finally, Section 6 presents the conclusions and future work.

2. UJAmI Smart Lab of the University of Jaén

The University of Jaén Ambient Intelligence (UJAmI) [] represents an innovative space that plays a key role in the implementation of new ground-breaking research within the realms of Ambient Intelligence (AmI) [], which is a paradigm in information technology aimed at empowering people’s capabilities through the means of digital environments.
The aim of the creation of the UJAmI Smart Lab [] in 2014 was to produce a real apartment that is sensitive, adaptive and responsive to human needs (habits, gestures and emotions), subsequently underpinning assistive technology-based solutions in the home.
The UJAmI SmartLab measures approximately 25 square metres (5.8 m long by 4.6 m wide). It is divided into five regions: entrance, kitchen, workplace, living room and a bedroom with an integrated bathroom. The layout of the UJAmI SmartLab is presented in Figure 1.
Figure 1. Layout of the UJAmI SmartLab (a) plan view (b) isometric view.
A set of multiple and heterogeneous sensors has been deployed in different areas of the environment in order to capture human-environment interactions in addition to inhabitant behaviour. Currently, a web-based system for managing and monitoring smart environments is deployed [], based on openHAB [], with an approach for distributing and processing heterogeneous data based on a representation with fuzzy linguistic terms []. It is, however, beneficial to utilize a framework that includes a common protocol for data collection, a common format for data exchange, and a data repository and related tools to underpin research within the domain of activity recognition. For this reason, the UJAmI SmartLab is moving towards the deployment of a common middleware platform referred to as SensorCentral [] that is compatible with an open data format referred to as the Open Data Initiative (ODI) [].

3. General Description of UJAmI HAR Dataset

In this Section, the HAR of the ADL dataset from the UJAmI SmartLab used in the 1st UCAmI Cup is described.
The UJA dataset from the UJAmI SmartLab is composed of four data sources that have been obtained whilst an inhabitant performed 246 instances of activity classes over a period of 10 days. The dataset is divided into two sets:
- Part 1: A labelled training set with seven days of recordings, containing 169 instances.
- Part 2: An unlabelled test set with three days of recordings, containing 77 instances.
The four data sources are as follows:
  • Event stream generated by 30 binary sensors.
  • Proximity information between a smart watch worn by an inhabitant and a set of 15 Bluetooth Low Energy (BLE) beacons deployed in the UJAmI SmartLab.
  • Acceleration generated by the smart watch.
  • An intelligent floor with 40 modules that provides location data.
The inhabitant who performed the activities was a 24-year-old male student from the University of Jaén. During data collection, the LG Urbane smart watch [] was worn on the participant’s right hand. For reasons of energy saving, the recording of acceleration data and proximity related information ceased when the inhabitant went to bed, as well as when he left the UJAmI SmartLab.
The dataset includes 24 different types of activities, as presented in Table 1, together with the frequency of each activity in the training set.
Table 1. Activities recorded in the UJA dataset.
The activities being undertaken during data collection were annotated by using NFC tags and a smartphone. This process was used to label the beginning and end of each activity.
The root folder of the dataset contains the folders and files as illustrated in Figure 2.
Figure 2. Folders and files in the root folder of the UJA dataset.
  • The folder named “Pictures” (UCAmI Cup\Pictures\) contains:
    • A folder named “Binary Sensors” with pictures of each binary sensor used for data collection in the UJAmI SmartLab.
    • A folder named “BLE sensor” with pictures of each BLE sensor used in the UJAmI SmartLab during data collection.
    • A folder named “Smart Lab” which contains pictures of each area in the UJAmI SmartLab.
  • The folder named “Layout” (UCAmI Cup\Layout\) contains:
    • A file named “sensors.png” which shows the layout of the UJAmI SmartLab and where each of the binary sensors is located.
    • A file named “proximity.png” which shows the layout of the UJAmI SmartLab and where each of the BLE sensors is located.
    • A file named “Coordinates.docx” which contains a table with the X and Y coordinates of each binary sensor and each BLE sensor in the UJAmI SmartLab.
    • A file named “floor.png” which shows the layout of the smart floor in the UJAmI SmartLab.
    • A file named “floor-modules.png” which shows the layout of the smart floor in the UJAmI SmartLab with the ID of each module in the layout.
  • The folder named “Data” (UCAmI Cup\Data\) contains 10 days of recordings divided into the following two folders (refer to Figure 3):
    Figure 3. File structure of the data folder.
    • The folder named “Test” contains the data for 3 days and is unlabelled.
    • The folder named “Training” contains the data for 7 days and is fully labelled.
Each of the 10 sub-folders contains data for each recording day.
The name of each folder for each recording day has the format YYYY-MM-DD, with YYYY representing the year, MM the month and DD the day. Each of these folders contains three sub-folders, one for each time routine of the day. The time routines are represented by T, which can take the following values: A for the morning, B for the afternoon and C for the evening.
In a similar manner, each of the 3 sub-folders is named according to the day of the recording and the time routine (YYYY-MM-DD-T). Each routine-folder has one file per data source: Binary Sensors, Proximity (BLE sensors), Acceleration and Floor.
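As an illustration, the folder convention above can be traversed programmatically. The following Python sketch is only an example of the convention; the local path is an assumption and should be adjusted to your own copy of the dataset.

from pathlib import Path
import re

# Hypothetical local path to the unpacked dataset; adjust to your own copy.
TRAINING_DIR = Path("UCAmI Cup/Data/Training")

# Routine folders are named YYYY-MM-DD-T, where T is A (morning),
# B (afternoon) or C (evening).
ROUTINE = re.compile(r"^\d{4}-\d{2}-\d{2}-[ABC]$")

for day_dir in sorted(TRAINING_DIR.iterdir()):        # day folders: YYYY-MM-DD
    if not day_dir.is_dir():
        continue
    for routine_dir in sorted(day_dir.iterdir()):     # routine sub-folders
        if routine_dir.is_dir() and ROUTINE.match(routine_dir.name):
            # Each routine folder holds one file per data source,
            # e.g. 2017-11-08-A-sensors.csv, -proximity.csv, etc.
            print(routine_dir.name, [p.name for p in routine_dir.glob("*.csv")])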
Furthermore, each routine-folder in the training set contains the file YYYY-MM-DD-T-activity.csv with the sequence of activities carried out, together with the timestamps of the beginning and end of each activity (refer to Figure 4).
Figure 4. Excerpt from the file activity.csv.
The file named YYYY-MM-DD-T-activity.csv contains the following fields:
  • DATE BEGIN: Timestamp when the inhabitant starts the activity.
  • DATE END: Timestamp when the inhabitant finishes the activity.
  • ACTIVITY: Name of the activity carried out by the inhabitant.
  • HABITANT: Person who carries out the activity.
The name of the inhabitant is fictitious and has been included to support the future extension of the AR evaluation to multiple-occupancy scenarios. The 1st UCAmI Cup is, however, only concerned with a single-inhabitant scenario.
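A minimal sketch for loading such a file is shown below. It assumes a comma-separated file with a header row matching the field names above; the actual delimiter and timestamp format are assumptions and should be checked against the dataset files.

import csv
from datetime import datetime

def read_activities(path):
    """Read a YYYY-MM-DD-T-activity.csv file into a list of dictionaries.
    The timestamp format is an assumption and may need adjusting."""
    activities = []
    with open(path, newline="", encoding="utf-8") as f:
        # Header row expected: DATE BEGIN, DATE END, ACTIVITY, HABITANT
        for row in csv.DictReader(f):
            activities.append({
                "begin": datetime.strptime(row["DATE BEGIN"], "%Y-%m-%d %H:%M:%S"),
                "end": datetime.strptime(row["DATE END"], "%Y-%m-%d %H:%M:%S"),
                "activity": row["ACTIVITY"],
                "inhabitant": row["HABITANT"],
            })
    return activities

# Example usage:
# activities = read_activities("2017-11-08-A/2017-11-08-A-activity.csv")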
As an example, the folders and files included in the routine-folder named “2017-11-08-A” are listed in Figure 5. This folder is contained in the training set.
Figure 5. Examples of files included in the data folder called “2017-11-08-A” in the training set.
As an example, the files included in the routine-folder named “2017-11-09-A” are presented in Figure 6. This day is contained in the test set and therefore does not include any labelling of the data.
Figure 6. Files in the data folder called “2017-11-09-A” in the test set.
  • The file named “time-slots-Training.csv” (UCAmI Cup\time-slots-Training.csv) stores the annotations of the activities. The dataset is divided into 30 s time slots; within each slot, only the activities carried out in that time period are labelled (a short sketch of this slotting follows the list). An excerpt from this file is presented in Figure 7.
    Figure 7. An excerpt from the file named time-slots-Training.csv.
  • The file named “results.csv” (UCAmI Cup\results.csv) contains the time slots for the test set; however, none of the activities are labelled. This labelling exercise is to be completed by the participants in the UCAmI Cup. An excerpt from this file is presented in Figure 8.
    Figure 8. An excerpt from the file named results.csv.
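To make the 30 s time-slot representation concrete, the sketch below computes which slots an activity interval overlaps. Numbering the slots from the start of the day is an assumption made purely for illustration.

from datetime import datetime, timedelta

SLOT = timedelta(seconds=30)

def slots_covered(begin, end, day_start):
    """Indices of the 30 s time slots, counted from day_start, that the
    activity interval [begin, end] overlaps."""
    first = int((begin - day_start) / SLOT)
    last = int((end - day_start) / SLOT)
    return list(range(first, last + 1))

# Illustrative timestamps only:
day_start = datetime(2017, 11, 8)
begin = datetime(2017, 11, 8, 10, 0, 12)
end = datetime(2017, 11, 8, 10, 1, 5)
print(slots_covered(begin, end, day_start))  # [1200, 1201, 1202]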

4. Data Sources of the UJA HAR Dataset

In this Section, the four data sources of the UJAmI dataset are described in detail.

4.1. Binary Sensor Data File

In the UJAmI SmartLab, a set of 30 binary sensors was deployed. All of them transmit a binary value together with a timestamp. The binary sensors are categorised into the following three sensor types, whose value semantics are described below (a small mapping sketch follows the list):
  • Magnetic contact. This is a wireless magnetic sensor [] that works with the Z-Wave protocol. When the sensor detects that its two pieces have been separated, it sends an event with a value that represents “open”. When the pieces are put back together, it sends an event with a value that represents “close”. In our dataset, this kind of sensor is used to track the state of doors (open or closed), in addition to being placed on objects that have a fixed place when they are not in use, for example a TV remote control, a medicine box or a bottle of water. In these instances, the value “close” means that the object is not being used, whereas the value “open” means that the object is being used.
  • Motion. This is a wireless PIR sensor that works with the ZigBee protocol and is used to detect whether an inhabitant has moved in or out of the sensor’s range. It has a maximum IR detection range of 7 metres and a sampling period of 5 s. When motion is detected, the sensor sends a value that represents movement. When the movement ceases, the sensor sends a value that represents no movement.
  • Pressure. This is a wireless sensor that works with the Z-Wave protocol that is connected to a textile layer. When pressure is detected in the textile layer the sensor sends a value that represents press. When the pressure ceases, the sensor sends a value that represents no press. Usually, this kind of sensor is used in sofas, chairs or beds.
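The value semantics described above can be summarised in a small lookup table, as in the sketch below. The value strings are assumptions made for illustration; the authoritative encoding is the one found in the sensors.csv files themselves.

# Illustrative mapping from raw binary sensor values to their meaning,
# following the three sensor types described above. The value strings are
# assumptions; check them against the actual sensors.csv files.
SENSOR_SEMANTICS = {
    "magnetic": {"open": "door open / object in use",
                 "close": "door closed / object idle"},
    "motion": {"movement": "inhabitant within PIR range",
               "no movement": "no motion detected"},
    "pressure": {"press": "sofa/chair/bed occupied",
                 "no press": "surface released"},
}

def interpret(sensor_type, value):
    return SENSOR_SEMANTICS.get(sensor_type, {}).get(value, "unknown")

print(interpret("magnetic", "open"))  # -> door open / object in use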
The details of the objects/sensors and their locations are presented in Table 2.
Table 2. Details of the binary sensors deployed in the UJAmI Smart Lab.
In the folder “UCAmI Cup\Pictures\Binary sensors”, pictures of each binary sensor can be found. The coordinates of these sensors are listed in the file “UCAmI Cup\Layout\Coordinates.docx” and, in addition, the approximate position of each can be found in the file “UCAmI Cup\Layout\sensors.png”.
The files named YYYY-MM-DD-T-sensors.csv contain the following fields:
  • TIMESTAMP: This indicates when a sensor sends an event.
  • OBJECT: ID of the object associated with the sensor that sent the event.
  • STATE: Value of the sensor event.
  • HABITANT: Person who is performing the activity.
Figure 9 contains an excerpt from a file YYYY-MM-DD-T-sensors.csv.
Figure 9. Excerpt from the file sensors.csv.

4.2. Proximity Data

The proximity data was collected through an Android application installed on the smart watch of the inhabitant and a set of 15 BLE beacons with a sample frequency of 0.25 Hz. The beacon model used was the Sticker from Estimote [].
When the smart watch reads the signal from a BLE beacon, it collects a Received Signal Strength Indicator (RSSI) measurement. Each BLE beacon is configured with a broadcasting power with which it transmits its signal. The smart watch has the capability to read the RSSIs from several BLE beacons when they are in range. The proximity between a wearable device and a BLE beacon impacts upon the RSSI: the greater the RSSI received by the smart watch, the smaller the distance between it and the BLE beacon.
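As a rough illustration of this relationship, a generic log-distance path-loss model can map an RSSI reading to a distance estimate. This is not the organisers’ method; the constants below are assumptions that would need calibration per beacon and per environment.

def estimate_distance(rssi_dbm, tx_power_dbm=-60.0, path_loss_exponent=2.0):
    """Rough distance estimate (metres) from a single RSSI reading using a
    generic log-distance path-loss model. tx_power_dbm is the assumed RSSI
    at 1 m; both constants are illustrative, not calibrated values."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

print(round(estimate_distance(-70.0), 2))  # ~3.16 m with the assumed constants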
A total of 15 BLE beacons were deployed in the UJAmI SmartLab, as presented in Table 3. For small items, for example a toothbrush or a medicine box, the BLE broadcasting power (measured in decibels) was set to a smaller range in an effort to reduce false positives.
Table 3. Details of the set of BLE sensors deployed in the UJAmI SmartLab and their respective BLE broadcasting power.
In the folder “UCAmI Cup\Pictures\BLE sensors” pictures of each BLE sensor according to the code of each sensor can be found. The coordinates of these sensors are illustrated in the file “UCAmI Cup\Layout\Coordinates.docx” with the approximate position being specified in the file “UCAmI Cup\Layout\proximity.png”.
The files named YYYY-MM-DD-T-proximity.csv contain the following fields:
  • TIMESTAMP: This indicates when the data of a BLE beacon is read.
  • ID: Unique identifier of the BLE beacon associated with an object.
  • OBJECT: Object where the BLE beacon has been deployed.
  • RSSI: RSSI read by the smart watch.
Further information relating to the methods used to obtain the proximity and the RSSI from the BLE beacons can be found in the product’s SDK []. Figure 10 illustrates an excerpt from a file YYYY-MM-DD-T-proximity.csv.
Figure 10. Excerpt from a proximity.csv file.
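Since a larger RSSI implies a smaller distance, one simple use of these records is to pick the closest tagged object within a time window. A minimal sketch follows, assuming the readings have already been parsed into (object, RSSI) pairs; the object names are illustrative.

def closest_object(readings):
    """Return the object of the beacon with the strongest (least negative)
    RSSI in a window of (object, rssi_dbm) pairs, or None if empty."""
    if not readings:
        return None
    obj, _ = max(readings, key=lambda r: r[1])
    return obj

window = [("TV controller", -85), ("medicine box", -62), ("fridge", -90)]
print(closest_object(window))  # -> medicine box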

4.3. Acceleration Data

The acceleration data was collected through an Android application installed on the smart watch of the inhabitant, with a sample frequency of 50 Hz. The acceleration was collected along three axes and is expressed in metres per second squared (m/s²) [].
The files named YYYY-MM-DD-T-acceleration.csv contain the acceleration data generated by the smart watch while the inhabitant carried out the different activities. Each of these files contains the following fields:
  • TIMESTAMP: This indicates when the data is collected.
  • X: The acceleration in the x-axis.
  • Y: The acceleration in the y-axis.
  • Z: The acceleration in the z-axis.
Figure 11 illustrates an excerpt from a file YYYY-MM-DD-T- acceleration.csv.
Figure 11. Excerpt from an acceleration.csv file.
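As an illustration of how such a stream is commonly pre-processed for HAR, the sketch below computes the acceleration magnitude and simple per-window statistics at the stated 50 Hz sample rate. The window length and the features are illustrative choices, not part of the dataset specification.

import math

SAMPLE_RATE_HZ = 50                  # as stated above
WINDOW = SAMPLE_RATE_HZ * 2          # 2 s windows -> 100 samples (assumption)

def magnitudes(samples):
    """samples: iterable of (x, y, z) accelerations in m/s^2."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]

def window_features(samples):
    """Mean and standard deviation of the acceleration magnitude per
    non-overlapping window; a simple, common HAR feature set."""
    mags = magnitudes(samples)
    features = []
    for i in range(0, len(mags) - WINDOW + 1, WINDOW):
        w = mags[i:i + WINDOW]
        mean = sum(w) / len(w)
        std = math.sqrt(sum((m - mean) ** 2 for m in w) / len(w))
        features.append((mean, std))
    return features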

4.4. Floor Capacitance Data

The UJAmI SmartLab has a SensFloor® [] that consists of a suite of capacitive sensors that lie below the floor.
The floor of the UJAmI SmartLab is formed by 40 modules distributed in a matrix of 4 rows and 10 columns. Each module is composed of eight sensor fields, and each sensor field in a module is associated with an id-number. The layout of the SensFloor in the UJAmI SmartLab is presented in Figure 12.
Figure 12. Layout of the smart floor in the UJAmI SmartLab.
The files named YYYY-MM-DD-T-floor.csv contain the following fields:
  • TIMESTAMP: This indicates when the capacitance data of a module is collected.
  • DEVICE: Identifies a module by its row and column in the floor matrix.
  • CAPACITANCE: Values of the 8 sensors of a module, recorded when the capacitances change. The first value corresponds to the sensor with id-number 1 and the last value to the sensor with id-number 8.
Figure 13 presents an excerpt from the file YYYY-MM-DD-T-floor.csv.
Figure 13. Excerpt from a floor.csv file.
In the folder “UCAmI Cup\Layout\” a file named “floor.png” can be found that shows the layout of the smart floor in the UJAmI SmartLab and a file named “floor-modules.png” that shows the layout of the smart floor in the UJAmI SmartLab with the ID of the modules.
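A minimal sketch for decoding one floor record into a module position and its eight sensor values is given below. The exact textual encoding of the DEVICE and CAPACITANCE fields is an assumption based on the field descriptions above and should be verified against the actual files.

def parse_floor_record(device, capacitance):
    """Decode one floor.csv record. Assumes DEVICE is encoded as
    "row,column" and CAPACITANCE as eight comma-separated values ordered
    by sensor id-number 1..8; verify against the actual files."""
    row, col = (int(v) for v in device.split(","))
    values = [float(v) for v in capacitance.split(",")]
    assert len(values) == 8, "expected one value per sensor field (ids 1..8)"
    return (row, col), values

module, values = parse_floor_record("2,7", "0,0,12,40,38,5,0,0")
print(module, values[2])  # sensor id-number 3 of the module at row 2, column 7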

5. Competition

The dataset presented in Section 3 and Section 4 was made available for participants to train their methods and tools. All four data sources were available; participants could use one source, several of them, or all of them. In order to evaluate the participants’ approaches and to compare results within the community, the unlabelled test set with three days of recordings containing 77 instances was provided.
A total of 26 participants from 10 countries (Spain, China, U.K., Argentina, Mexico, Ireland, Colombia, Sweden, South Korea and Japan) contacted the organizers of the 1st UCAmI Cup to obtain the UJA dataset. Participants were required to use their trained methods and tools to recognize each activity in the test dataset. Participants were subsequently required to submit their predicted activities from the benchmarking test to the organizers of the UCAmI Cup in the format of the file named results.csv, which was described in Section 3.
By the closing date of the competition, six contributions had been submitted, and the organizers computed the results in terms of classification accuracy. Let $N_{A_i}$ be the number of activities of each class $A_i$ and $TP_{A_i}$ the number of activities of class $A_i$ correctly classified; the classification accuracy is then defined by Equation (1):
$$\mathrm{Accuracy} = \frac{\sum_{i=1}^{N} TP_{A_i}}{\sum_{i=1}^{N} N_{A_i}} \quad (1)$$
where $N$ is the number of activity classes.
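Equivalently, over a flat list of test instances the metric reduces to the fraction of correctly predicted activities, as in the short sketch below; the activity labels are illustrative only.

def classification_accuracy(y_true, y_pred):
    """Fraction of instances whose predicted activity matches the ground
    truth, i.e. total correct over total instances."""
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

y_true = ["Prepare breakfast", "Watch TV", "Sleep", "Prepare breakfast"]
y_pred = ["Prepare breakfast", "Sleep", "Sleep", "Prepare breakfast"]
print(classification_accuracy(y_true, y_pred))  # 0.75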
Once the deadline to participate in the UCAmI Cup had passed (10 May 2018), an Excel file containing the ground truth of each activity in the test set was added to the shared UJA dataset [].

6. Conclusions

In this paper, the human activity recognition dataset of activities of daily living generated at the University of Jaén has been presented within the context of the first edition of the UCAmI Cup at the International Conference on Ubiquitous Computing and Ambient Intelligence. To do so, the UJAmI Smart Lab, where the dataset was generated, was described. A general description of the dataset, its structure and its format has been presented. Furthermore, the four data sources included in the dataset have been presented in detail: (i) event streams from 30 binary sensors, (ii) location data from an intelligent floor, (iii) proximity data between a smart watch worn by the inhabitant and 15 Bluetooth Low Energy beacons and (iv) acceleration of the smart watch. Finally, the initial details of the competition in the UCAmI Cup have been provided. Our future work is focused on gathering all the techniques associated with the analysis of the 1st UCAmI Cup in order to publish, for the first time, a consolidated report of the performance of HAR on a common dataset. Planning for the 2nd UCAmI Cup, which will involve an ADL dataset focused on multi-occupancy, is currently underway.

Supplementary Materials

The UJA HAR dataset used in this contribution is available online at https://drive.google.com/open?id=1Ntu2DfQbHqsCpdHSnXVK6I6eJe7qpffk.

Author Contributions

Conceptualization, M.E. and C.N.; Methodology, M.E., J.M.Q. and C.N.; Data Collection and Pre-Processing, M.E. and C.N.; Validation, M.E. and J.M.; Writing-Review & Editing, M.E., J.M.Q. and C.N.

Funding

This work was supported by the REMIND project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 734355. This contribution has also been supported by the project PI-0203-2016 from the Council of Health for the Andalusian Health Service, Spain, together with the research project TIN2015-66524-P from the Spanish government. Invest Northern Ireland is acknowledged for partially supporting this project under the Competence Centre Programme Grant RD0513853-Connected Health Innovation Center.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

HAR    Human Activity Recognition
ADL    Activities of Daily Living
UJAmI  University of Jaén Ambient Intelligence
UCAmI  Ubiquitous Computing and Ambient Intelligence

References

  1. Rashidi, P.; Mihailidis, A. A survey on ambient-assisted living tools for older adults. IEEE J. Biomed. Health Inf. 2013, 17, 579–590. [Google Scholar] [CrossRef] [PubMed]
  2. Machine Learning Repository. Available online: http://archive.ics.uci.edu/ml/index.php (accessed on 13 August 2018).
  3. PhysioNet. Available online: https://www.physionet.org/ (accessed on 13 August 2018).
  4. Chen, L.; Nugent, C.D.; Wang, H. A knowledge-driven approach to activity recognition in smart homes. IEEE Trans. Knowl. Data Eng. 2012, 24, 961–974. [Google Scholar] [CrossRef]
  5. Sussex-Huawei Locomotion Challenge. Available online: http://www.shl-dataset.org/activity-recognition-challenge/ (accessed on 13 August 2018).
  6. Indoor Positioning and Indoor Navigation Conference. Available online: http://www.ipin2017.org/ (accessed on 13 August 2018).
  7. iNaturalist Challenge. Available online: https://www.kaggle.com/c/inaturalist-challenge-at-fgvc-2017 (accessed on 13 August 2018).
  8. Computing in Cardiology Challenges. Available online: https://www.physionet.org/challenge/ (accessed on 13 August 2018).
  9. AI Video Competition. Available online: https://ijcai-17.org/competitions.html (accessed on 13 August 2018).
  10. Data Mining and Knowledge Discovery competition. Available online: http://www.kdd.org/kdd-cup (accessed on 13 August 2018).
  11. UCAmI Cup. Available online: http://mamilab.esi.uclm.es/ucami2018/UCAmICup.html (accessed on 13 August 2018).
  12. UJAmI Smart Lab. Available online: http://ceatic.ujaen.es/ujami/en (accessed on 13 August 2018).
  13. Sagha, H. Benchmarking classification techniques using the opportunity human activity dataset. In Proceedings of the 2011 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Anchorage, AK, USA, 9–12 October 2011; pp. 36–40. [Google Scholar]
  14. Espinilla, M.; Martínez, L.; Medina, J.; Nugent, C. The Experience of Developing the UJAmI Smart Lab. IEEE Access 2018, 6, 34631–34642. [Google Scholar] [CrossRef]
  15. Zafra, D.; Medina, J.; Martínez, L.; Nugent, C.; Espinilla, M. A Web System for Managing and Monitoring Smart Environments. In Proceedings of the Bioinformatics and Biomedical Engineering–4th International Conference, IWBBIO, Granada, Spain, 20–22 April 2016; Springer: Berlin, Germany, 2016; Volume 9656, pp. 677–688. [Google Scholar]
  16. OpenHab. Available online: https://www.openhab.org/ (accessed on 13 August 2018).
  17. Medina, J.; Martínez, L.; Espinilla, M. Subscribing to fuzzy temporal aggregation of heterogeneous sensor streams in real-time distributed environments. Int. J. Commun. Syst. 2017, 30, 3238. [Google Scholar] [CrossRef]
  18. Rafferty, J.; Nugent, C.D. A Scalable, Research Oriented, Generic, Sensor Data Platform. IEEE Access 2018. [Google Scholar] [CrossRef]
  19. McChesney, I.; Nugent, C.; Rafferty, J.; Synnott, J. Exploring an Open Data Initiative ontology for Shareable Smart Environment Experimental Datasets; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); 10586 LNCS; Springer: Berlin, Germany, 2017; pp. 400–412. [Google Scholar]
  20. Wearable Smart Watch Website. Available online: http://www.lg.com/es/wearables/lg-LGW150-g-watch-urbane (accessed on 13 August 2018).
  21. Everspring Website. Available online: http://www.everspring.com/portfolio-item/sm810-doorwindow-contact-sensor/ (accessed on 13 August 2018).
  22. Estimote Website. Available online: https://estimote.com/ (accessed on 13 August 2018).
  23. Android SDK. Available online: https://developer.android.com/reference/android/hardware/SensorManager.html#SENSOR_ACCELEROMETER (accessed on 13 August 2018).
  24. SensFloor®. Available online: http://future-shape.com/en/system (accessed on 13 August 2018).
  25. Dataset 1st UCAmI Cup. Available online: https://drive.google.com/open?id=1Ntu2DfQbHqsCpdHSnXVK6I6eJe7qpffk (accessed on 13 August 2018).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
