# Using Ontologies for the Online Recognition of Activities of Daily Living †

† This paper is an extended version of our paper published in *Ubiquitous Computing and Ambient Intelligence (UCAmI 2017)*; Ochoa, S., Singh, P., Bravo, J., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10586; pp. 381–393.

*Best Paper Award*.

## Abstract


## 1. Introduction

- Automatic generation of features for DDA: The dataset is described in the form of an ontology. The relevant concepts and properties in the ontology are combined and transformed to generate new concepts, which are then evaluated to determine their relevance. This process is repeated until a certain number of concepts describing the activities of the dataset is obtained. All the new concepts are then taken as input features by the supervised learning algorithms to generate the activity classification models. These concepts also provide knowledge that describes the activities with a richer, more interpretable representation.
- An ontology for the mining of ADL: One of the main drawbacks of ontologies is their low computational performance. In this work, we describe an ontology that has been specifically designed for the mining of ADL. This ontology greatly reduces the amount of resources needed by reasoners and improves the efficiency of the whole process.
- Online activity recognition: In this paper, we extend our previous work [8] and propose an approach for the online recognition of activities. In that work, we proposed a hybrid approach for recognizing activities after a pre-segmentation process in which the sensor data stream was divided into segments, also known as temporal windows, by using the *begin* and *end* labels of each activity [9,10]. The *begin* label indicates the starting time of an activity and the *end* label its ending time, so each segment of the sensor data stream corresponds exactly to one activity. The DDA approach offers excellent results for offline activity recognition, where the *begin* and *end* labels of each activity in the dataset are known. However, these results cannot be transferred to online activity recognition, since the *begin* label cannot be predicted [9,11]. In this paper, we propose a mechanism to dynamically calculate the size of the temporal window for each type of activity.
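The per-activity window sizing described above can be sketched as follows. This is a minimal illustration under assumptions, not the authors' exact implementation: the data format is invented, and the weighting factor `c` mirrors the $c=0$ setting mentioned later (Table 5), where the plain mean duration is used as the window length.

```python
from statistics import mean, stdev

def window_lengths(train_segments, c=0.0):
    """Estimate one temporal-window length per activity class.

    train_segments: dict mapping activity name -> list of observed
                    durations (seconds) of that activity in the
                    labeled training data (a hypothetical format).
    c: weighting factor for the standard deviation; c=0 uses the
       plain mean duration as the window length.
    """
    lengths = {}
    for activity, durations in train_segments.items():
        mu = mean(durations)
        sd = stdev(durations) if len(durations) > 1 else 0.0
        lengths[activity] = mu + c * sd  # window grows with variability
    return lengths

# Hypothetical durations (seconds) for two activity classes.
train = {
    "Get drink": [40, 55, 65],
    "Use the toilet": [90, 110, 115, 100],
}
print(window_lengths(train, c=0.0))
```

At recognition time, the classifier for each candidate activity would then be fed the sensor events falling inside a window of that activity's estimated length, instead of waiting for an unavailable *begin* label.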

## 2. Background

#### 2.1. Ontologies

#### 2.2. Related Works

- Based on activity: This is a popular segmentation approach, also called explicit segmentation, adopted by a wide range of offline DDA proposals for sensor-based activity recognition because of its excellent results [25]. Typically, the sensor data stream is divided into segments that coincide with the starting and ending times of each activity; the activation of binary sensors within each segment then provides a straightforward feature representation [24]. The main disadvantage of this approach is that it is not feasible for online activity recognition, because it is not possible to know in advance when an activity will start or end.
- Based on time: In this approach, the sensor data stream is divided into segments of a given duration [26]. The main problem is identifying the optimal length of the segments, which is critical for the performance of activity recognition [27]. Initial approaches proposed a fixed time segmentation for evaluating the activation of binary sensors within temporal windows, usually employing windows of 60 s in length, which provide good performance in the recognition of daily human activities [9,23]. Recent works propose more elaborate methods that identify the optimal window size per activity through statistical analysis: (i) the average length of the activities and the sampling frequency of the sensors [28]; or (ii) an average weighted by the standard deviation of the activity durations [22]. This approach has also been used in activity recognition based on wearable sensor devices [29].
- Based on events: Another approach consists of dividing the sensor data stream according to changes in the sensor events [30]. The main problem with this approach is that sensor events corresponding to different activities may be included in the same segment. On the one hand, this approach is adopted by some research works that analyze sensors providing continuous data from wearable devices, such as accelerometers [10]. On the other hand, event-based segmentation of binary sensors has been proposed for evaluating activity recognition together with dynamic windowing approaches [21].
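The time-based strategy can be sketched as follows. This is a simplified illustration, not code from the paper: the event format is an assumption, and the fixed 60 s default matches the window length used by the early approaches cited above [9,23].

```python
def segment_by_time(events, window_s=60.0):
    """Split a time-ordered sensor event stream into fixed-length windows.

    events: list of (timestamp_s, sensor_id) tuples, sorted by timestamp
            (a hypothetical format for a binary-sensor stream).
    Returns a list of windows, each a list of events; windows with no
    events (idle periods) come out as empty lists.
    """
    if not events:
        return []
    windows, current = [], []
    t0 = events[0][0]                    # start of the current window
    for t, sensor in events:
        while t >= t0 + window_s:        # advance to the window containing t
            windows.append(current)
            current = []
            t0 += window_s
        current.append((t, sensor))
    windows.append(current)
    return windows

stream = [(0.5, "door"), (10.2, "fridge"), (61.0, "stove"), (125.9, "tap")]
print(segment_by_time(stream, window_s=60.0))
```

The drawback discussed above shows up directly in this sketch: a single fixed `window_s` cannot suit both short activities such as "Get drink" and long ones such as "Go to bed", which motivates the per-activity window lengths proposed in this paper.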

## 3. Methodology

#### 3.1. An Ontology for the Description of Activities

#### 3.2. Extended Features Generation

## 4. Experiment

#### 4.1. From the Sensor Data Stream to Feature Vectors

#### 4.2. Experiment Description

#### 4.3. Results

#### 4.4. Simulation of a Real Scenario

## 5. Conclusions and Future Work

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

1. Chen, L.; Hoey, J.; Nugent, C.; Cook, D.; Yu, Z. Sensor-based activity recognition. *IEEE Trans. Syst. Man Cybern. Part C Appl. Rev.* **2012**, *42*, 790–808.
2. Korhonen, I.; Parkka, J.; Van Gils, M. Health Monitoring in the Home of the Future. *IEEE Eng. Med. Biol. Mag.* **2003**, *22*, 66–73.
3. Li, C.; Lin, M.; Yang, L.; Ding, C. Integrating the enriched feature with machine learning algorithms for human movement and fall detection. *J. Supercomput.* **2014**, *67*, 854–865.
4. Chen, L.; Nugent, C. Ontology-based activity recognition in intelligent pervasive environments. *Int. J. Web Inf. Syst.* **2009**, *5*, 410–430.
5. Chen, L.; Nugent, C.; Wang, H. A knowledge-driven approach to activity recognition in smart homes. *IEEE Trans. Knowl. Data Eng.* **2012**, *24*, 961–974.
6. Chen, L.; Nugent, C.; Okeyo, G. An ontology-based hybrid approach to activity modeling for smart homes. *IEEE Trans. Hum.-Mach. Syst.* **2014**, *44*, 92–105.
7. Rafferty, J.; Chen, L.; Nugent, C.; Liu, J. Goal lifecycles and ontological models for intention based assistive living within smart environments. *Comput. Syst. Sci. Eng.* **2015**, *30*, 7–18.
8. Salguero, A.; Espinilla, M. Improving Activity Classification Using Ontologies to Expand Features in Smart Environments. In *Ubiquitous Computing and Ambient Intelligence*; Ochoa, S.F., Singh, P., Bravo, J., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 381–393.
9. Ordóñez, F.J.; de Toledo, P.; Sanchis, A. Activity Recognition Using Hybrid Generative/Discriminative Models on Home Environments Using Binary Sensors. *Sensors* **2013**, *13*, 5460–5477.
10. Banos, O.; Galvez, J.M.; Damas, M.; Pomares, H.; Rojas, I. Window Size Impact in Human Activity Recognition. *Sensors* **2014**, *14*, 6474–6499.
11. Ordóñez, F.J.; Iglesias, J.A.; de Toledo, P.; Ledezma, A.; Sanchis, A. Online activity recognition using evolving classifiers. *Expert Syst. Appl.* **2013**, *40*, 1248–1255.
12. Chandrasekaran, B.; Josephson, J.; Benjamins, V. What are ontologies, and why do we need them? *IEEE Intell. Syst. Their Appl.* **1999**, *14*, 20–26.
13. Uschold, M.; Gruninger, M. Ontologies: Principles, methods and applications. *Knowl. Eng. Rev.* **1996**, *11*, 93–136.
14. Knijff, J.; Frasincar, F.; Hogenboom, F. Domain taxonomy learning from text: The subsumption method versus hierarchical clustering. *Data Knowl. Eng.* **2013**, *83*, 54–69.
15. Wei, T.; Lu, Y.; Chang, H.; Zhou, Q.; Bao, X. A semantic approach for text clustering using WordNet and lexical chains. *Expert Syst. Appl.* **2015**, *42*, 2264–2275.
16. Horrocks, I. Ontologies and the semantic web. *Commun. ACM* **2008**, *51*, 58–67.
17. Kohler, J.; Philippi, S.; Specht, M.; Ruegg, A. Ontology based text indexing and querying for the semantic web. *Knowl.-Based Syst.* **2006**, *19*, 744–754.
18. Maedche, A.; Staab, S. Ontology learning for the semantic web. *IEEE Intell. Syst. Their Appl.* **2001**, *16*, 72–79.
19. Horrocks, I.; Patel-Schneider, P.; Van Harmelen, F. From SHIQ and RDF to OWL: The making of a Web Ontology Language. *Web Semant.* **2003**, *1*, 7–26.
20. Sirin, E.; Parsia, B.; Grau, B.; Kalyanpur, A.; Katz, Y. Pellet: A practical OWL-DL reasoner. *Web Semant.* **2007**, *5*, 51–53.
21. Shahi, A.; Woodford, B.J.; Lin, H. Dynamic Real-Time Segmentation and Recognition of Activities Using a Multi-feature Windowing Approach. In Proceedings of the Pacific-Asia Conference on Knowledge Discovery and Data Mining, Jeju, Korea, 23–26 May 2017; Springer: Cham, Switzerland, 2017; pp. 26–38.
22. Espinilla, M.; Medina, J.; Hallberg, J.; Nugent, C. A new approach based on temporal sub-windows for online sensor-based activity recognition. *J. Ambient Intell. Hum. Comput.* **2018**.
23. Van Kasteren, T.; Noulas, A.; Englebienne, G.; Kröse, B. Accurate activity recognition in a home setting. In Proceedings of the UbiComp 2008—10th International Conference on Ubiquitous Computing, Seoul, Korea, 21–24 September 2008; pp. 1–9.
24. Espinilla, M.; Rivera, A.; Pérez-Godoy, M.D.; Medina, J.; Martinez, L.; Nugent, C. Recognition of Activities in Resource Constrained Environments; Reducing the Computational Complexity. In *Ubiquitous Computing and Ambient Intelligence*; Springer: Cham, Switzerland, 2016; pp. 64–74.
25. Junker, H.; Amft, O.; Lukowicz, P.; Tröster, G. Gesture spotting with body-worn inertial sensors to detect user activities. *Pattern Recognit.* **2008**, *41*, 2010–2024.
26. Van Kasteren, T.L.M. Activity Recognition for Health Monitoring Elderly Using Temporal Probabilistic Models. Ph.D. Thesis, University of Amsterdam, Amsterdam, The Netherlands, 2011.
27. Gu, T.; Wu, Z.; Tao, X.; Pung, H.K.; Lu, J. Epsicar: An emerging patterns based approach to sequential, interleaved and concurrent activity recognition. In Proceedings of the International Conference on Pervasive Computing and Communications, Galveston, TX, USA, 9–13 March 2009; pp. 1–9.
28. Krishnan, N.C.; Cook, D.J. Activity recognition on streaming sensor data. *Pervasive Mob. Comput.* **2014**, *10*, 138–154.
29. Wang, L.; Gu, T.; Tao, X.; Lu, J. A hierarchical approach to real-time activity recognition in body sensor networks. *Pervasive Mob. Comput.* **2012**, *8*, 115–130.
30. Patterson, T.; Khan, N.; McClean, S.; Nugent, C.; Zhang, S.; Cleland, I.; Ni, Q. Sensor-based change detection for timely solicitation of user engagement. *IEEE Trans. Mob. Comput.* **2017**, *16*, 2889–2900.
31. Chen, B.; Fan, Z.; Cao, F. Activity recognition based on streaming sensor data for assisted living in smart homes. In Proceedings of the 2015 International Conference on Intelligent Environments (IE), Prague, Czech Republic, 15–17 July 2015; pp. 124–127.
32. Triboan, D.; Chen, L.; Chen, F.; Wang, Z. Semantic segmentation of real-time sensor data stream for complex activity recognition. *Pers. Ubiquitous Comput.* **2017**, *21*, 411–425.
33. Singh, D.; Merdivan, E.; Hanke, S.; Kropf, J.; Geist, M.; Holzinger, A. Convolutional and Recurrent Neural Networks for Activity Recognition in Smart Environment. In *Towards Integrative Machine Learning and Knowledge Extraction*; Springer: Cham, Switzerland, 2017; pp. 194–205.
34. Cheng, W.; Kasneci, G.; Graepel, T.; Stern, D.; Herbrich, R. Automated feature generation from structured knowledge. In Proceedings of the 20th ACM International Conference on Information and Knowledge Management, Glasgow, Scotland, UK, 24–28 October 2011; ACM: New York, NY, USA, 2011; pp. 1395–1404.
35. Terziev, Y. Feature generation using ontologies during induction of decision trees on linked data. In Proceedings of the SWC PhD Symposium, Kobe, Japan, 17–21 October 2016.
36. Hirst, G.; St-Onge, D. Lexical chains as representations of context for the detection and correction of malapropisms. In *WordNet: An Electronic Lexical Database*; The MIT Press: Cambridge, MA, USA, 1998; Volume 305, pp. 305–332.
37. Paulheim, H. Generating possible interpretations for statistics from linked open data. In *The Semantic Web: Research and Applications*; Springer: Berlin/Heidelberg, Germany, 2012; pp. 560–574.
38. Yan, S.; Liao, Y.; Feng, X.; Liu, Y. Real time activity recognition on streaming sensor data for smart environments. In Proceedings of the 2016 International Conference on Progress in Informatics and Computing (PIC), Shanghai, China, 23–25 December 2016; pp. 51–55.
39. Wemlinger, Z.; Holder, L. The COSE ontology: Bringing the semantic web to smart environments. In Proceedings of the International Conference on Smart Homes and Health Telematics, Montreal, QC, Canada, 20–22 June 2011; Springer: Cham, Switzerland, 2011; pp. 205–209.
40. Baryannis, G.; Woznowski, P.; Antoniou, G. Rule-Based Real-Time ADL Recognition in a Smart Home Environment. In Proceedings of the International Symposium on Rules and Rule Markup Languages for the Semantic Web, Stony Brook, NY, USA, 6–9 July 2016; Springer: Cham, Switzerland, 2016; pp. 325–340.
41. Bae, I.H. An ontology-based approach to ADL recognition in smart homes. *Future Gener. Comput. Syst.* **2014**, *33*, 32–41.
42. Noor, M.H.M.; Salcic, Z.; Wang, K.I.-K. Enhancing ontological reasoning with uncertainty handling for activity recognition. *Knowl.-Based Syst.* **2016**, *114*, 47–60.
43. Lehmann, J.; Auer, S.; Bühmann, L.; Tramp, S. Class expression learning for ontology engineering. *Web Semant. Sci. Serv. Agents World Wide Web* **2011**, *9*, 71–81.
44. Aloulou, H.; Mokhtari, M.; Tiberghien, T.; Endelin, R.; Biswas, J. Uncertainty handling in semantic reasoning for accurate context understanding. *Knowl.-Based Syst.* **2015**, *77*, 16–28.
45. Singla, G.; Cook, D.J.; Schmitter-Edgecombe, M. Tracking activities in complex settings using smart environment technologies. *Int. J. Biosci. Psychiatry Technol.* **2009**, *1*, 25.

| Constructor | DL Syntax | Manchester Syntax | Semantics |
|---|---|---|---|
| $\mathcal{I}$ | $C_1 \sqcap C_2$ | $C_1 \text{ and } C_2$ | $(C_1 \sqcap C_2)^I = C_1^I \cap C_2^I$ |
| $\mathcal{U}$ | $C_1 \sqcup C_2$ | $C_1 \text{ or } C_2$ | $(C_1 \sqcup C_2)^I = C_1^I \cup C_2^I$ |
| $\mathcal{C}$ | $\neg C$ | $\text{not } C$ | $(\neg C)^I = \Delta^I \setminus C^I$ |
| $\mathcal{S}$ | $\exists R.C$ | $R \text{ some } C$ | $(\exists R.C)^I = \{x \mid \exists y.\ \langle x,y\rangle \in R^I \wedge y \in C^I\}$ |
| $\mathcal{A}$ | $\forall R.C$ | $R \text{ only } C$ | $(\forall R.C)^I = \{x \mid \forall y.\ \langle x,y\rangle \in R^I \rightarrow y \in C^I\}$ |
| $\mathcal{X}$ | $\le n\,R.C$ | $R \text{ max } n\ C$ | $(\le n\,R.C)^I = \{x \mid \mathrm{card}\,\{y \mid \langle x,y\rangle \in R^I \wedge y \in C^I\} \le n\}$ |
| $\mathcal{M}$ | $\ge n\,R.C$ | $R \text{ min } n\ C$ | $(\ge n\,R.C)^I = \{x \mid \mathrm{card}\,\{y \mid \langle x,y\rangle \in R^I \wedge y \in C^I\} \ge n\}$ |

| Activity | startsWith some Hall-Bedroom_Door_Set | hasItem min 2 Hall-Bedroom_Door_Set | Positive |
|---|---|---|---|
| 1 | 1 | 0 | 1 |
| 2 | 0 | 1 | 0 |
| 3 | 1 | 1 | 1 |
| 4 | 0 | 0 | 1 |
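Read as a dataset, each row above is an activity instance, each generated concept is one binary feature (1 if the instance belongs to the concept, 0 otherwise), and *Positive* is the class label. A minimal sketch of this conversion follows; the helper name and the data format are assumptions for illustration:

```python
def to_feature_vectors(instances, concepts):
    """Turn ontology concept membership into binary feature vectors.

    instances: dict mapping instance id -> set of generated concept
               names the instance belongs to (e.g., as classified by
               an OWL reasoner).
    concepts:  ordered list of generated concept names (the features).
    """
    return {
        inst: [1 if c in members else 0 for c in concepts]
        for inst, members in instances.items()
    }

concepts = [
    "startsWith some Hall-Bedroom_Door_Set",
    "hasItem min 2 Hall-Bedroom_Door_Set",
]
instances = {1: {concepts[0]}, 2: {concepts[1]}, 3: set(concepts), 4: set()}
print(to_feature_vectors(instances, concepts))
# → {1: [1, 0], 2: [0, 1], 3: [1, 1], 4: [0, 0]}  (matches the table rows)
```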

| Activity | Instances | Duration, Average (s) | Duration, SD (s) |
|---|---|---|---|
| Get drink | 20 | 53.25 | 68.81 |
| Go to bed | 24 | 29,141.63 | 10,913.63 |
| Leave the house | 34 | 39,823.09 | 42,045.64 |
| Prepare breakfast | 20 | 202.75 | 153.61 |
| Prepare dinner | 10 | 2054.00 | 1185.22 |
| Take a shower | 23 | 573.39 | 158.53 |
| Use the toilet | 114 | 104.62 | 101.01 |
| **Total** | **245** | | |

| Activity | Sensor Events | Sensor Events (Average) | Sensor Events (SD) |
|---|---|---|---|
| Get drink | 69 | 3.45 | 1.00 |
| Go to bed | 74 | 3.08 | 1.06 |
| Leave the house | 113 | 3.32 | 2.06 |
| Prepare breakfast | 100 | 5.00 | 1.30 |
| Prepare dinner | 64 | 6.40 | 1.58 |
| Take a shower | 53 | 2.30 | 0.56 |
| Use the toilet | 376 | 3.30 | 0.81 |
| **Total** | **1319** | | |

**Table 5.** Performance of the classic approach when using the mean duration of activities as the lengths of temporal windows ($c=0$). SMO, Sequential Minimal Optimization; VP, Voted Perceptron; DT, Decision Table.

*The first six classifier columns report Percent Correct; the last six report F-Measure.*

| Activity | C4.5 | SMO | VP | DT | RF | Best | C4.5 | SMO | VP | DT | RF | Best |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Get drink | 96.86 | 99.32 | 95.37 | 96.87 | 100.00 | 100.00 | 0.83 | 0.96 | 0.57 | 0.80 | 1.00 | 1.00 |
| Go to bed | 89.26 | 89.54 | 90.36 | 90.23 | 91.43 | 91.43 | 0.00 | 0.00 | 0.03 | 0.00 | 0.30 | 0.30 |
| Leave the house | 97.96 | 97.96 | 94.56 | 97.02 | 97.42 | 97.96 | 0.94 | 0.94 | 0.74 | 0.91 | 0.92 | 0.94 |
| Prepare breakfast | 96.19 | 95.91 | 94.82 | 91.02 | 97.82 | 97.82 | 0.75 | 0.71 | 0.56 | 0.29 | 0.83 | 0.83 |
| Prepare dinner | 95.51 | 96.18 | 96.06 | 94.39 | 97.82 | 97.82 | 0.42 | 0.51 | 0.30 | 0.33 | 0.64 | 0.64 |
| Take a shower | 98.36 | 98.23 | 94.04 | 90.63 | 98.63 | 98.63 | 0.90 | 0.89 | 0.44 | 0.00 | 0.91 | 0.91 |
| Use the toilet | 89.26 | 88.19 | 84.91 | 87.08 | 91.44 | 91.44 | 0.89 | 0.87 | 0.84 | 0.88 | 0.91 | 0.91 |

*The first six classifier columns report Percent Correct; the last six report F-Measure.*

| DL Operators | $\vert F^{k}\vert$ | C4.5 | SMO | VP | DT | RF | Best | C4.5 | SMO | VP | DT | RF | Best |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Classic | 14 | 89.26 | 89.54 | 90.36 | 90.23 | 91.43 | 91.43 | 0.00 | 0.00 | 0.03 | 0.00 | 0.30 | 0.30 |
| $\mathcal{ACIXMSU}$ | 20 | 90.23 | 90.23 | 90.23 | 90.23 | 89.69 | 90.23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| $\mathcal{ACIXMSU}$ | 40 | 90.23 | 89.70 | 90.23 | 90.23 | 89.14 | 90.23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| $\mathcal{ACIXMSU}$ | 60 | 91.29 | 93.46 | 90.23 | 90.23 | 93.33 | 93.46 | 0.45 | 0.57 | 0.00 | 0.00 | 0.50 | 0.57 |
| $\mathcal{ACIXMSU}$ | 80 | 98.78 | 98.64 | 92.55 | 98.78 | 98.51 | 98.78 | 0.92 | 0.91 | 0.29 | 0.92 | 0.91 | 0.92 |
| $\mathcal{ACIXMSU}$ | 100 | 98.78 | 98.78 | 91.06 | 98.78 | 98.37 | 98.78 | 0.92 | 0.92 | 0.14 | 0.92 | 0.90 | 0.92 |
| $\mathcal{CSM}$ | 20 | 90.23 | 90.23 | 90.23 | 90.23 | 89.69 | 90.23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| $\mathcal{CSM}$ | 40 | 90.23 | 89.70 | 90.23 | 90.23 | 89.14 | 90.23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| $\mathcal{CSM}$ | 60 | 91.29 | 93.46 | 90.23 | 90.23 | 93.33 | 93.46 | 0.45 | 0.57 | 0.00 | 0.00 | 0.50 | 0.57 |
| $\mathcal{CSM}$ | 80 | 98.78 | 98.78 | 93.36 | 98.78 | 98.51 | 98.78 | 0.92 | 0.92 | 0.40 | 0.92 | 0.91 | 0.92 |
| $\mathcal{CSM}$ | 100 | 98.78 | 98.78 | 93.22 | 98.78 | 98.51 | 98.78 | 0.92 | 0.92 | 0.35 | 0.92 | 0.91 | 0.92 |
| $\mathcal{S}$ | 20 | 90.23 | 90.23 | 90.23 | 90.23 | 90.23 | 90.23 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| $\mathcal{S}$ | 40 | 98.78 | 98.78 | 92.27 | 98.78 | 98.64 | 98.78 | 0.92 | 0.92 | 0.24 | 0.92 | 0.91 | 0.92 |
| $\mathcal{S}$ | 60 | 98.78 | 98.23 | 91.73 | 98.78 | 98.37 | 98.78 | 0.92 | 0.89 | 0.22 | 0.92 | 0.89 | 0.92 |
| $\mathcal{S}$ | 80 | 98.78 | 97.97 | 90.51 | 98.78 | 98.37 | 98.78 | 0.92 | 0.87 | 0.04 | 0.92 | 0.89 | 0.92 |
| $\mathcal{S}$ | 100 | 98.78 | 97.97 | 90.91 | 98.78 | 98.09 | 98.78 | 0.92 | 0.87 | 0.09 | 0.92 | 0.87 | 0.92 |

*The first six classifier columns report Percent Correct; the last six report F-Measure.*

| DL Operators | $\vert F^{k}\vert$ | C4.5 | SMO | VP | DT | RF | Best | C4.5 | SMO | VP | DT | RF | Best |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Classic | 14 | 89.26 | 88.19 | 84.91 | 87.08 | 91.44 | 91.44 | 0.89 | 0.87 | 0.84 | 0.88 | 0.91 | 0.91 |
| $\mathcal{ACIXMSU}$ | 20 | 80.94 | 80.12 | 79.71 | 76.72 | 81.76 | 81.76 | 0.79 | 0.78 | 0.78 | 0.76 | 0.80 | 0.80 |
| $\mathcal{ACIXMSU}$ | 40 | 84.82 | 83.99 | 84.12 | 82.90 | 85.89 | 85.89 | 0.85 | 0.84 | 0.84 | 0.83 | 0.86 | 0.86 |
| $\mathcal{ACIXMSU}$ | 60 | 95.68 | 96.77 | 89.73 | 94.72 | 96.49 | 96.77 | 0.95 | 0.97 | 0.90 | 0.94 | 0.96 | 0.97 |
| $\mathcal{ACIXMSU}$ | 80 | 94.88 | 97.72 | 89.29 | 94.99 | 97.03 | 97.72 | 0.95 | 0.98 | 0.89 | 0.94 | 0.97 | 0.98 |
| $\mathcal{ACIXMSU}$ | 100 | 94.88 | 97.18 | 89.56 | 94.99 | 96.89 | 97.18 | 0.95 | 0.97 | 0.89 | 0.94 | 0.97 | 0.97 |
| $\mathcal{CSM}$ | 20 | 80.94 | 80.12 | 79.71 | 76.72 | 81.76 | 81.76 | 0.79 | 0.78 | 0.78 | 0.76 | 0.80 | 0.80 |
| $\mathcal{CSM}$ | 40 | 84.82 | 83.99 | 84.12 | 82.90 | 85.89 | 85.89 | 0.85 | 0.84 | 0.84 | 0.83 | 0.86 | 0.86 |
| $\mathcal{CSM}$ | 60 | 95.68 | 96.77 | 89.73 | 94.72 | 96.49 | 96.77 | 0.95 | 0.97 | 0.90 | 0.94 | 0.96 | 0.97 |
| $\mathcal{CSM}$ | 80 | 94.88 | 97.72 | 90.26 | 94.99 | 97.57 | 97.72 | 0.95 | 0.98 | 0.90 | 0.95 | 0.97 | 0.98 |
| $\mathcal{CSM}$ | 100 | 94.88 | 97.86 | 88.32 | 94.86 | 97.02 | 97.86 | 0.95 | 0.98 | 0.88 | 0.94 | 0.97 | 0.98 |
| $\mathcal{S}$ | 20 | 80.86 | 79.52 | 81.81 | 80.44 | 84.77 | 84.77 | 0.80 | 0.79 | 0.81 | 0.80 | 0.85 | 0.85 |
| $\mathcal{S}$ | 40 | 95.54 | 97.04 | 91.06 | 94.59 | 96.90 | 97.04 | 0.95 | 0.97 | 0.91 | 0.94 | 0.97 | 0.97 |
| $\mathcal{S}$ | 60 | 95.54 | 97.71 | 89.31 | 94.99 | 96.76 | 97.71 | 0.95 | 0.98 | 0.89 | 0.94 | 0.97 | 0.98 |
| $\mathcal{S}$ | 80 | 95.54 | 97.30 | 88.47 | 94.99 | 97.02 | 97.30 | 0.95 | 0.97 | 0.89 | 0.94 | 0.97 | 0.97 |
| $\mathcal{S}$ | 100 | 95.54 | 97.31 | 88.73 | 94.72 | 96.48 | 97.31 | 0.95 | 0.97 | 0.88 | 0.94 | 0.96 | 0.97 |

*The first six classifier columns report Percent Correct; the last six report F-Measure.*

| DL Operators | $\vert F^{k}\vert$ | C4.5 | SMO | VP | DT | RF | Best | C4.5 | SMO | VP | DT | RF | Best |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Classic | 14 | 95.51 | 96.18 | 96.06 | 94.39 | 97.82 | 97.82 | 0.42 | 0.51 | 0.30 | 0.33 | 0.64 | 0.64 |
| $\mathcal{ACIXMSU}$ | 20 | 97.96 | 97.83 | 96.33 | 97.96 | 97.42 | 97.96 | 0.50 | 0.49 | 0.13 | 0.50 | 0.47 | 0.50 |
| $\mathcal{ACIXMSU}$ | 40 | 97.56 | 98.79 | 96.47 | 96.06 | 98.78 | 98.79 | 0.49 | 0.77 | 0.13 | 0.12 | 0.72 | 0.77 |
| $\mathcal{ACIXMSU}$ | 60 | 97.28 | 99.06 | 96.87 | 96.20 | 98.51 | 99.06 | 0.46 | 0.88 | 0.23 | 0.16 | 0.63 | 0.88 |
| $\mathcal{ACIXMSU}$ | 80 | 97.28 | 98.92 | 96.73 | 96.20 | 98.65 | 98.92 | 0.46 | 0.84 | 0.20 | 0.16 | 0.69 | 0.84 |
| $\mathcal{ACIXMSU}$ | 100 | 97.28 | 98.92 | 96.18 | 96.20 | 98.37 | 98.92 | 0.46 | 0.80 | 0.10 | 0.16 | 0.62 | 0.80 |
| $\mathcal{CSM}$ | 20 | 97.96 | 97.83 | 96.33 | 97.96 | 97.42 | 97.96 | 0.50 | 0.49 | 0.13 | 0.50 | 0.47 | 0.50 |
| $\mathcal{CSM}$ | 40 | 97.56 | 98.79 | 96.47 | 96.06 | 98.78 | 98.79 | 0.49 | 0.77 | 0.13 | 0.12 | 0.72 | 0.77 |
| $\mathcal{CSM}$ | 60 | 97.28 | 99.06 | 96.87 | 96.20 | 98.51 | 99.06 | 0.46 | 0.88 | 0.23 | 0.16 | 0.63 | 0.88 |
| $\mathcal{CSM}$ | 80 | 97.28 | 99.06 | 96.59 | 96.20 | 98.66 | 99.06 | 0.46 | 0.88 | 0.17 | 0.16 | 0.69 | 0.88 |
| $\mathcal{CSM}$ | 100 | 97.28 | 98.92 | 96.46 | 96.20 | 98.91 | 98.92 | 0.46 | 0.84 | 0.13 | 0.16 | 0.76 | 0.84 |
| $\mathcal{S}$ | 20 | 97.29 | 96.34 | 96.33 | 97.43 | 96.07 | 97.43 | 0.48 | 0.36 | 0.24 | 0.48 | 0.39 | 0.48 |
| $\mathcal{S}$ | 40 | 97.69 | 98.37 | 97.01 | 96.61 | 98.36 | 98.37 | 0.60 | 0.74 | 0.27 | 0.26 | 0.60 | 0.74 |
| $\mathcal{S}$ | 60 | 97.69 | 98.92 | 96.86 | 96.61 | 97.95 | 98.92 | 0.60 | 0.85 | 0.31 | 0.26 | 0.57 | 0.85 |
| $\mathcal{S}$ | 80 | 97.69 | 99.59 | 96.73 | 96.61 | 98.36 | 99.59 | 0.60 | 0.97 | 0.20 | 0.26 | 0.69 | 0.97 |
| $\mathcal{S}$ | 100 | 97.28 | 99.45 | 97.69 | 96.48 | 98.37 | 99.45 | 0.50 | 0.89 | 0.43 | 0.26 | 0.62 | 0.89 |

| Measure | Group | Classifier | Mean | SD |
|---|---|---|---|---|
| Precision | a | SMO | 96.581 | 4.395 |
| | a | RF | 96.390 | 3.759 |
| | ab | C4.5 | 95.708 | 4.006 |
| | b | DT | 95.087 | 4.444 |
| | c | VP | 91.958 | 4.161 |
| F-measure | a | SMO | 0.815 | 0.246 |
| | ab | RF | 0.758 | 0.244 |
| | b | C4.5 | 0.708 | 0.283 |
| | c | DT | 0.608 | 0.378 |
| | d | VP | 0.414 | 0.034 |

| Measure | Group | DL Operators | Mean | SD |
|---|---|---|---|---|
| Precision | a | $\mathcal{S}$ | 96.662 | 4.095 |
| | a | $\mathcal{ACIXMSU}$ | 95.117 | 4.574 |
| | a | $\mathcal{CSM}$ | 95.025 | 4.641 |
| | b | classic | 91.444 | 3.758 |
| F-measure | a | $\mathcal{S}$ | 0.697 | 0.308 |
| | a | $\mathcal{ACIXMSU}$ | 0.655 | 0.335 |
| | a | $\mathcal{CSM}$ | 0.647 | 0.345 |
| | b | classic | 0.461 | 0.359 |

*Columns 2–5 report Percent Correct; columns 6–9 report F-Measure.*

| Activity | Classic | Proposal | Gain | % Max. Gain | Classic | Proposal | Gain | % Max. Gain |
|---|---|---|---|---|---|---|---|---|
| Go to bed | 91.43 | 98.78 | 7.35 | 86% | 0.30 | 0.92 | 0.62 | 89% |
| Prepare dinner | 97.82 | 99.59 | 1.77 | 81% | 0.83 | 0.97 | 0.14 | 82% |
| Use the toilet | 91.44 | 97.86 | 6.42 | 75% | 0.91 | 0.98 | 0.07 | 78% |
| Get drink | 100.00 | 99.59 | −0.41 | - | 1.00 | 0.97 | −0.03 | - |
| Leave the house | 97.96 | 100.00 | 2.04 | 100% | 0.94 | 1.00 | 0.06 | 100% |
| Take a shower | 98.63 | 98.77 | 0.14 | 10% | 0.91 | 0.93 | 0.02 | 22% |
| Prepare breakfast | 97.82 | 98.78 | 0.96 | 44% | 0.83 | 0.92 | 0.09 | 53% |
| **Average** | 96.44 | 99.07 | 2.63 | 66% | 0.79 | 0.96 | 0.17 | 72% |

*Columns 3–5 report Percent Correct; columns 6–8 report F-Measure.*

| Dataset | Activity | Classic | Proposal | Gain | Classic | Proposal | Gain |
|---|---|---|---|---|---|---|---|
| Singla | Answer the phone | 98.22 | 98.81 | 0.59 | 0.89 | 0.94 | 0.05 |
| Singla | Choose outfit | 100.00 | 100.00 | 0.00 | 1.00 | 1.00 | 0.00 |
| Singla | Clean | 98.81 | 98.80 | −0.01 | 0.95 | 0.95 | 0.00 |
| Singla | Fill medication dispenser | 99.02 | 97.81 | −1.21 | 0.94 | 0.89 | −0.05 |
| Singla | Prepare birthday card | 100.00 | 98.39 | −1.61 | 1.00 | 0.93 | −0.07 |
| Singla | Prepare soup | 100.00 | 100.00 | 0.00 | 1.00 | 1.00 | 0.00 |
| Singla | Wash DVD | 98.60 | 100.00 | 1.40 | 0.95 | 1.00 | 0.05 |
| Singla | Water plants | 98.60 | 98.80 | 0.20 | 0.94 | 0.95 | 0.01 |
| Ordoñez (a) | Breakfast | 99.55 | 99.55 | 0.00 | 0.90 | 0.94 | 0.04 |
| Ordoñez (a) | Grooming | 96.53 | 97.12 | 0.59 | 0.92 | 0.94 | 0.02 |
| Ordoñez (a) | Leaving | 99.55 | 100.00 | 0.45 | 0.98 | 1.00 | 0.02 |
| Ordoñez (a) | Lunch | 99.50 | 100.00 | 0.50 | 0.88 | 0.90 | 0.02 |
| Ordoñez (a) | Showering | 100.00 | 100.00 | 0.00 | 1.00 | 1.00 | 0.00 |
| Ordoñez (a) | Sleeping | 100.00 | 100.00 | 0.00 | 1.00 | 1.00 | 0.00 |
| Ordoñez (a) | Snack | 100.00 | 99.70 | −0.30 | 1.00 | 0.98 | −0.02 |
| Ordoñez (a) | Spare_Time_TV | 98.49 | 100.00 | 1.51 | 0.98 | 1.00 | 0.02 |
| Ordoñez (a) | Toileting | 97.29 | 97.29 | 0.00 | 0.86 | 0.86 | 0.00 |
| Ordoñez (b) | Breakfast | 96.06 | 96.80 | 0.74 | 0.41 | 0.62 | 0.21 |
| Ordoñez (b) | Dinner | 97.55 | 97.62 | 0.07 | 0.00 | 0.33 | 0.33 |
| Ordoñez (b) | Grooming | 93.60 | 92.71 | −0.89 | 0.86 | 0.83 | −0.03 |
| Ordoñez (b) | Leaving | 95.38 | 97.70 | 2.32 | 0.71 | 0.86 | 0.15 |
| Ordoñez (b) | Lunch | 97.10 | 97.10 | 0.00 | 0.00 | 0.42 | 0.42 |
| Ordoñez (b) | Showering | 100.00 | 100.00 | 0.00 | 0.60 | 0.60 | 0.00 |
| Ordoñez (b) | Sleeping | 94.05 | 100.00 | 5.95 | 0.34 | 1.00 | 0.66 |
| Ordoñez (b) | Snack | 90.93 | 91.97 | 1.04 | 0.40 | 0.60 | 0.20 |
| Ordoñez (b) | Spare_Time_TV | 88.55 | 92.35 | 3.80 | 0.73 | 0.85 | 0.12 |
| Ordoñez (b) | Toileting | 93.08 | 94.57 | 1.49 | 0.80 | 0.85 | 0.05 |
| **Average** | | 97.42 | 98.04 | 0.62 | 0.78 | 0.86 | 0.08 |

*Columns 2–3 report Percent Correct; columns 4–5 report F-Measure.*

| Activity | Classic | Proposal | Classic | Proposal |
|---|---|---|---|---|
| Go to bed | 56.91 | 60.16 | 0.00 | 0.20 |
| Prepare dinner | 90.46 | 96.29 | 0.51 | 0.39 |
| Use the toilet | 84.84 | 80.16 | 0.76 | 0.82 |
| Get drink | 92.54 | 92.30 | 0.52 | 0.52 |
| Leave the house | 76.39 | 78.56 | 0.82 | 0.83 |
| Take a shower | 92.30 | 94.13 | 0.41 | 0.44 |
| Prepare breakfast | 93.86 | 95.13 | 0.36 | 0.50 |
| **Average** | 83.90 | 85.25 | 0.48 | 0.53 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Citation

Salguero, A.G.; Espinilla, M.; Delatorre, P.; Medina, J. Using Ontologies for the Online Recognition of Activities of Daily Living. *Sensors* **2018**, *18*, 1202. https://doi.org/10.3390/s18041202