Mitigating the Impact on Users’ Privacy Caused by over Specifications in the Design of IoT Applications
Abstract
1. Introduction
2. Related Work
3. Research Method
- Questionnaire survey [40]: A description of PATH could be given to persons in the target category (i.e., knowledgeable about software development, yet not privacy experts), together with some questions exploring whether they find the guidelines of PATH understandable and whether they think these would help them avoid overspecification. The advantage of a questionnaire survey is that one might reach a high number of respondents quickly, and thus gather a large amount of data. The main weakness is that findings from questionnaires might be shallow and would mainly concern the respondents’ perceptions of the understandability and effectiveness of PATH. For example, a respondent who reads the guidelines might find them understandable and respond positively about their perceived effectiveness, whereas actual usage of the framework might have uncovered guidelines the participant was unable to follow or important issues that were misunderstood. A particular weakness of questionnaires for our RQs is that the questions would be somewhat hypothetical, making it hard for respondents to answer reliably (e.g., it is hard to estimate the effectiveness of a method one has not tried). Therefore, the questionnaire approach was not chosen for this research, as its disadvantages and shortcomings were considered to far outweigh the possibility of getting more data quickly.
- Case study [41]: For example, having somebody use PATH in a software development project for a considerable period of time, with several approaches to data collection along the way (e.g., observing them at work, interviewing them, and studying design documents and the developed software). This would indeed have been very interesting, especially if it were a real industrial software project where privacy issues were a key concern. However, it is difficult to convince someone to use a yet untried framework in a real project, and such a case study would be quite time-consuming. Therefore, this type of evaluation was considered overambitious at the current point. It would be more feasible to obtain some low-threshold evidence for the effectiveness of the framework first, and then possibly build on that for more ambitious evaluations.
- Experiment [42], with recruited participants performing a rather short task using the PATH framework, as administered by the researcher. The advantage of such an experiment compared to the case study is that it is easier to conduct because of its short duration, and the researcher has more control, e.g., over what task the participants perform. It is also regarded as the classical method for establishing a cause–effect relationship [42]. However, the increased control of the experiment comes at the cost of reduced realism, as participants can only be kept in a controlled setting for a limited period of time; this means that the task must be fairly small, and thus more limited in scope than a typical industrial software development task.
- Having all participants use PATH only.
- Comparing PATH (as the new treatment) to some kind of control, which might be: (a) using just an ad hoc, common-sense approach, i.e., no method at all related to privacy issues, or (b) using another privacy method, preferably one that is well established or the most suitable one available.
- An individual set-up has a number of advantages. The number of data points would be higher, making it easier to find statistically significant results. Also, some research indicates that, during brainstorming, individuals might achieve higher productivity, in terms of the quantity and quality of the generated ideas, than if they are arranged in groups [47,48].
- A group-based set-up seems more realistic for industrial software projects, where this type of design work and decision-making is typically done by a team larger than one individual [49].
3.1. Assignment Description
3.2. Privacy Frameworks
- Capture: What kind of information is being picked up? Candidates include voices, actual speech, moving video or frame-grabbed images (close-up or not), personal identity, and work activity and its products, such as key presses, applications used, files accessed, and documents.
- Construction: What happens to the information? Is it encrypted or processed at some point or combined with other information and, if so, how? Is it stored? In what form?
- Accessibility: Is the information available publicly, to particular groups, to certain people, or just to oneself?
- Purpose: What is the information used for? How might it be used in the future? Is it going to be combined with information from third parties?
- Intentionality: Is user intent necessary for the interaction to take place?
- Visibility: Is it necessary to see that the interaction takes place?
- Precision: Is the information obtained from the system precise?
- Continuity: Is it possible for the system to measure how long the interaction is taking?
- Understandability: Is the interaction easy to understand or to use?
- Mutability: Is the information sent or received always the same or does it change?
- Segmentation: Is it possible to send different information to different receivers or is it always the same?
- Directionality: Can the information flow in both directions or only in one direction?
- Mediation: Does the transmission of information require a third-party mediator?
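The question lists above function as per-design checklists: for each candidate interaction mechanism, the designer works through every dimension. As an illustrative sketch only (both frameworks are paper-based methods; the data structure and the `unanswered` helper are our own, and the grouping of dimensions into the two checklists follows the concept lists given later in Section 3.3):

```python
# Illustrative sketch: encoding the frameworks' question dimensions as
# checklists a designer walks through for each candidate design.
# Dimension names come from the lists above; the structure is our own.
CHECKLISTS = {
    "QOC-FC": ["Capture", "Construction", "Accessibility", "Purpose"],
    "PATH": ["Intentionality", "Visibility", "Precision", "Continuity",
             "Understandability", "Mutability", "Segmentation",
             "Directionality", "Mediation"],
}

def unanswered(framework: str, answers: dict) -> list:
    """Return the dimensions of a framework that a design review has
    not yet answered for a candidate interaction mechanism."""
    return [dim for dim in CHECKLISTS[framework] if dim not in answers]

# A review of a "video capture" alternative that has only addressed intent:
remaining = unanswered("PATH", {"Intentionality": "visitors must opt in"})
print(remaining)  # the eight PATH dimensions still to be considered
```

Each dimension left unanswered is a privacy aspect of the design that has not yet been examined.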
3.3. Data Analysis
- RF (Radio Frequency): Any device that sends information wirelessly through electromagnetic signals. (keywords “rf”, “nfc”, “rfid”, “bluetooth”, “wifi”)
- Mobile: Usage of portable devices, be it through a smartphone or tablet app or a visitor guide device. (keywords “guide”, “tablet”, “app”)
- Video: Video or audio capturing devices. (keywords “video”, “ir-cameras”, “audio”)
- Self: Self-reported mechanisms, i.e., those where the visitor actively reports time spent or level of engagement by pressing a button or scanning a ticket or QR code next to the exhibitions. (keywords “happy-or-not”, “ticket”, “qr”)
- Others: Other interaction mechanisms mentioned by the students, combinations of different mechanisms, Brain-Computer Interfaces, beacons, or presence sensors. (keywords “combination”, “bci”, “beacon”, “sensor”)
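The keyword-based categorization above is mechanical enough to sketch in code. The category names and keyword lists are taken directly from this section; the function itself is our own illustration, not part of the study's actual analysis tooling:

```python
# Hypothetical sketch of the keyword-based categorization of the
# interaction mechanisms proposed by the students. Categories and
# keywords are from the text; the matching logic is our own assumption
# (case-insensitive substring match).
CATEGORIES = {
    "RF":     ["rf", "nfc", "rfid", "bluetooth", "wifi"],
    "Mobile": ["guide", "tablet", "app"],
    "Video":  ["video", "ir-cameras", "audio"],
    "Self":   ["happy-or-not", "ticket", "qr"],
    "Others": ["combination", "bci", "beacon", "sensor"],
}

def categorize(alternative: str) -> list:
    """Return every category whose keywords appear in the description
    of a proposed interaction mechanism."""
    text = alternative.lower()
    return [cat for cat, keywords in CATEGORIES.items()
            if any(kw in text for kw in keywords)]

print(categorize("Track visitors with a Bluetooth tablet app"))
# -> ['RF', 'Mobile']  (a proposal may match several categories)
```

In practice the coding was of course done by the researchers reading the transcripts, so the sketch only illustrates the mapping from keywords to categories.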
4. Results
- Alternatives: What alternatives have the students found? Do the students consider there might be another alternative they have not thought about?
- Method: Which method have the students followed to elaborate the list of alternatives? How did they compare the alternatives? Do they consider that the framework was helpful? What were the pros and cons of the method or methods they used?
- Accuracy trade-off: Have the students considered any alternative that provides less information than video recording? Do the students consider that an interaction mechanism other than video recording would be sufficient?
- Technical limitations dismissal: What alternatives could be considered if there was no technical limitation of what is possible with current technology?
- Information: References to a type of information that the students considered relevant to capture from the museum’s visitors (age, gender, time spent, location, and engagement).
- PATH concept: A concept that is mentioned or explained by the PATH framework (intentionality, understandability, precision, etc.).
- QOC-FC concept: A concept that is mentioned or explained by the QOC-FC framework (capture, construction, control, feedback, and purpose).
- Privacy trade-off: A concept that the students identify as a trade-off between the privacy of the visitors and another factor (cost of implementation, architecture complexity, amount of information retrieved, and reward bias).
- Privacy concept: A concept related to privacy that is not mentioned in the two proposed frameworks and that is not considered a trade-off (consent and proportionality).
5. Discussion
6. Threats to Validity
6.1. Conclusion Validity
6.2. Internal Validity
- Selection bias: Assume two types of subjects: type A, more prone to providing a wider variety of alternatives (e.g., because they have more extensive knowledge about technology, are less susceptible to overspecifications, or have a more critical mindset), and type B, more prone to sticking with the proposed alternative (e.g., because they have limited knowledge about technological alternatives, are more easily influenced by overspecifications, or simply are not motivated enough to challenge the proposed problem). A selection bias would occur if the selection method caused individuals of type A to be assigned the PATH method while individuals of type B were assigned the QOC-FC method. As the students were assigned to one session or the other based on their availability, students and researchers did not know each other beforehand, and the students did not know any details about the experiment, it is not likely that this type of bias had an impact on the results. We could have applied a more thorough random selection method; however, that would have forced us to mix both methods within the two sessions, risking that the results would be altered by a diffusion bias (treatment subjects observing or learning from control subjects, or the other way around). Although the selection process was not biased in any way, it must be acknowledged that one group of students may have been better than the other just by random chance. With the low number of participants, a small random unevenness between the two groups could have an effect. For instance, if two students had been a lot more competent than the rest (e.g., with a better understanding of privacy or a stronger ability to challenge premature design assumptions), and both had ended up in the PATH session, perhaps each dominating a group in a positive direction, this, rather than a difference between the two approaches, could explain some of the differences in student results.
At least, it can be said that, from observation of the experiment groups’ work sessions, all participants seemed to be eagerly contributing to the discussions, and no group had individual students who strongly dominated the discussions. Nevertheless, it cannot be completely excluded that the groups differed by chance, and this is a definite weakness that must be acknowledged given the limited number of participants.
- Diffusion bias: As the two sessions of the evaluation took place on consecutive days, there was a chance that subjects who attended on the first day would leak information to those attending the day after. To prevent this, the students were explicitly requested not to share information about the experiment with others. On the second day, during the interview process, the students were asked a few key questions to verify that they had received no information from previous participants.
- Maturation: As each participant was involved in the experiment once and no more than one sample was obtained from the same individual, there is no threat of maturation in the experiment.
- Timing effect: The two sessions took place on two consecutive working days (Thursday and Friday) from 10:00 to 14:00. No special dates or holidays took place during the week. We consider that the allocated time slots had a negligible impact on the students’ cognitive performance.
- Experimenter bias: As one of the methods (PATH) was designed by the researchers, there was a high risk of experimenter bias altering the results of the evaluation. We were especially aware of this and adopted a precautionary approach. An experimenter bias could be introduced at three different points during the evaluation, namely, task communication (i.e., by giving unconscious hints to the students about what type of response the researchers expected them to provide), the interview (i.e., by leading the conversation towards a point at which the students reached the desired response), and data analysis (i.e., by focusing our attention on specific segments of the obtained data and ignoring others). It would have been ideal to have an independent third party conduct the experiment; however, a lack of resources made this impossible. Instead, we elaborated a thorough research protocol to avoid biases in the experiment. For task communication, all information was given to the students through a printed document (one page for the problem description and one page for the method to use, Figure 1 and Figure 2). The initial presentation consisted only of reading the documents out loud to the students. A special effort was put into not disclosing which method was our own. Since neither method is part of the study program, the students were not aware of the authorship of either. The provided documents did not include any author information (i.e., names, affiliations, or dates). All documents were read in a uniform way so that the students would not suspect that one of the methods was preferred over the other. Any questions from the students that could be resolved by re-reading parts of the document were answered this way. For questions about aspects not specified in the documents, the students were informed that they had freedom of choice in the matter.
To avoid biases during the interview process, a semi-structured approach was taken, following a script of topics to prevent inducing responses from the students. To prevent a biased analysis of the data, the whole interview was recorded and transcribed, using the method proposed by Bryman [51] (ch. 13), as described in Section 3.3.
6.3. Construct Validity
- Would it be possible that a group provided a wide variety of alternatives to the design while still suffering the consequences of the overspecification? (false positive)
- Would it be possible that a group provided only solutions based on the given alternative (video capture) even though they were not conditioned by the overspecification? (false negative)
6.4. External Validity
7. Conclusions and Future Work
- Conducting similar experiments with a larger number of participants and with a higher level of expertise.
- Comparing the PATH framework with other privacy methods. One possibility would be to compare PATH with STRAP, as both include GOA as part of the method, even though STRAP does not target overspecifications.
- Evaluating PATH in a long-term experiment or case study to better understand the iterative aspect of the framework. This way, it could be observed whether the framework remains effective for managing overspecification while preventing new types of overspecifications from being introduced in the design.
- Investigating other types of overspecifications. So far, the PATH framework is designed to address overspecifications similar to the one proposed in the science museum scenario (overspecified interaction mechanisms). The cost of implementation, for example, is another type of nonfunctional overspecification, and it could be interesting to study since most of the groups interviewed were concerned about this matter.
- Extending the PATH vocabulary with some of the attributes mentioned by the students: wearability, since wearable devices are more likely to expose personal information (e.g., location) than non-wearable devices, and identifiability, i.e., the capability of an interaction mechanism to identify the user.
- Proposing other application scenarios with different types of interaction mechanisms. The impact on users’ privacy might differ depending on the technology used to implement the system, so it is interesting to find better ways to guide the design.
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
BCI | Brain Computer Interface |
NFC | Near Field Communication |
GDPR | General Data Protection Regulation |
IoT | Internet of Things |
JAD | Joint Application Design |
PATH | Privacy Aware Transmission Highway |
QOC-FC | Question and Options Criteria-Feedback and Control |
RAD | Rapid Application Development |
RFID | Radio Frequency IDentification |
STRAP | STRuctured Analysis of Privacy |
References
- Warren, S.D.; Brandeis, L.D. The right to privacy. Harv. Law Rev. 1890, 4, 193–220. [Google Scholar] [CrossRef]
- Westin, A.F. Privacy and Freedom; Bodley Head: London, UK, 1967. [Google Scholar]
- Altman, I. The Environment and Social Behavior: Privacy, Personal Space, Territory, and Crowding; Brooks/Cole Pub. Co.: Monterey, CA, USA, 1975. [Google Scholar]
- Solove, D.J. Understanding Privacy; Harvard University Press: Cambridge, MA, USA, 2008. [Google Scholar]
- Williams, M.; Nurse, J.R.; Creese, S. Privacy is the boring bit: user perceptions and behaviour in the Internet-of-Things. In Proceedings of the 2017 IEEE 15th Annual Conference on Privacy, Security and Trust (PST), Calgary, AB, Canada, 28–30 August 2017; pp. 181–189. [Google Scholar]
- Ziegeldorf, J.H.; Morchon, O.G.; Wehrle, K. Privacy in the Internet of Things: threats and challenges. Secur. Commun. Netw. 2014, 7, 2728–2742. [Google Scholar] [CrossRef]
- Pérez Fernández, A. Towards the Tangible Hyperlink. In Proceedings of the Seventh International Conference on Advances in Computer-Human Interactions, ACHI 2014, Barcelona, Spain, 23–27 March 2014; pp. 17–20. [Google Scholar]
- Hoepman, J.H. Privacy design strategies. In Proceedings of the IFIP International Information Security Conference, Berlin, Germany, 4 June 2014; pp. 446–459. [Google Scholar]
- Spiekermann, S.; Cranor, L.F. Engineering privacy. IEEE Trans. Softw. Eng. 2009, 35, 67–82. [Google Scholar] [CrossRef]
- Langheinrich, M. Privacy by design—Principles of privacy-aware ubiquitous systems. In Proceedings of the Ubicomp 2001: Ubiquitous Computing, Berlin, Germany, 2 October 2001; pp. 273–291. [Google Scholar]
- Yamin, M.; Alsaawy, Y.; Alkhodre, A.; Abi Sen, A.A. An Innovative Method for Preserving Privacy in Internet of Things. Sensors 2019, 19, 3355. [Google Scholar] [CrossRef]
- Yin, X.C.; Liu, Z.G.; Ndibanje, B.; Nkenyereye, L.; Riazul Islam, S.M. An IoT-Based Anonymous Function for Security and Privacy in Healthcare Sensor Networks. Sensors 2019, 19, 3146. [Google Scholar] [CrossRef] [PubMed]
- Cavoukian, A. Privacy by Design: The 7 Foundational Principles. Information and Privacy Commissioner of Ontario, Canada. 2009. Available online: https://www.iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf (accessed on 3 October 2019).
- Iachello, G.; Abowd, G.D. Privacy and proportionality: adapting legal evaluation techniques to inform design in ubiquitous computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; pp. 91–100. [Google Scholar]
- Abdulghani, H.A.; Nijdam, N.A.; Collen, A.; Konstantas, D. A Study on Security and Privacy Guidelines, Countermeasures, Threats: IoT Data at Rest Perspective. Symmetry 2019, 11, 774. [Google Scholar] [CrossRef]
- Abdul-Ghani, H.A.; Konstantas, D. A Comprehensive Study of Security and Privacy Guidelines, Threats, and Countermeasures: An IoT Perspective. J. Sens. Actuator Netw. 2019, 8, 22. [Google Scholar] [CrossRef]
- Shmueli, O.; Pliskin, N.; Fink, L. Explaining over-requirement in software development projects: an experimental investigation of behavioral effects. Int. J. Proj. Manag. 2015, 33, 380–394. [Google Scholar] [CrossRef]
- Shmueli, O.; Ronen, B. Excessive software development: Practices and penalties. Int. J. Proj. Manag. 2017, 35, 13–27. [Google Scholar] [CrossRef]
- Pérez Fernández, A.; Sindre, G. The privacy aware transmission highway framework. Int. J. Inf. Priv. Secur. Integr. 2018, 3, 327–350. [Google Scholar] [CrossRef]
- Pérez Fernández, A.; Sindre, G. Software Assisted Privacy Impact Assessment in Interactive Ubiquitous Computing Systems. 2019; accepted. [Google Scholar]
- Bednar, K.; Spiekermann, S.; Langheinrich, M. Engineering Privacy by Design: Are engineers ready to live up to the challenge? Inf. Soc. 2019, 35, 122–142. [Google Scholar] [CrossRef]
- Boehm, B. Anchoring the software process. IEEE Softw. 1996, 13, 73–82. [Google Scholar] [CrossRef]
- Jones, C. Strategies for managing requirements creep. Computer 1996, 29, 92–94. [Google Scholar] [CrossRef]
- Yu, E.; Mylopoulos, J. Why goal-oriented requirements engineering. In Proceedings of the 4th International Workshop on Requirements Engineering: Foundations of Software Quality, Pisa, Italy, 8–9 June 1998; Volume 15, pp. 15–22. [Google Scholar]
- Buschmann, F. Learning from failure, part 2: featuritis, performitis, and other diseases. IEEE Softw. 2010, 27, 10–11. [Google Scholar] [CrossRef]
- Coman, A.; Ronen, B. Icarus’ predicament: managing the pathologies of overspecification and overdesign. Int. J. Proj. Manag. 2010, 28, 237–244. [Google Scholar] [CrossRef]
- Simon, H.A. Models of Man; Social and Rational; Wiley: Oxford, UK, 1957. [Google Scholar]
- Shmueli, O.; Pliskin, N.; Fink, L. Can the outside-view approach improve planning decisions in software development projects? Inf. Syst. J. 2016, 26, 395–418. [Google Scholar] [CrossRef]
- Hendrix, T.D.; Schneider, M.P. NASA’s TReK project: A case study in using the spiral model of software development. Commun. ACM 2002, 45, 152–159. [Google Scholar] [CrossRef]
- Lee-Kelley, L.; Sankey, T. Global virtual teams for value creation and project success: A case study. Int. J. Proj. Manag. 2008, 26, 51–62. [Google Scholar] [CrossRef]
- Bjarnason, E.; Wnuk, K.; Regnell, B. Are you biting off more than you can chew? A case study on causes and effects of overscoping in large-scale software engineering. Inf. Softw. Technol. 2012, 54, 1107–1124. [Google Scholar] [CrossRef] [Green Version]
- Damian, D.; Chisan, J. An empirical study of the complex relationships between requirements engineering processes and other processes that lead to payoffs in productivity, quality, and risk management. IEEE Trans. Softw. Eng. 2006, 32, 433–453. [Google Scholar] [CrossRef]
- Choi, K.; Bae, D.H. Dynamic project performance estimation by combining static estimation models with system dynamics. Inf. Softw. Technol. 2009, 51, 162–172. [Google Scholar] [CrossRef]
- Gellman, R. Fair Information Practices: A Basic History. 2017. Available online: http://dx.doi.org/10.2139/ssrn.2415020 (accessed on 3 October 2019).
- Bellotti, V.; Sellen, A. Design for privacy in ubiquitous computing environments. In Proceedings of the Third European Conference on Computer-Supported Cooperative Work, ECSCW’93, Milan, Italy, 13–17 September 1993; pp. 77–92. [Google Scholar]
- MacLean, A.; Young, R.M.; Bellotti, V.M.; Moran, T.P. Questions, options, and criteria: Elements of design space analysis. Hum.-Comput. Interact. 1991, 6, 201–250. [Google Scholar]
- Jensen, C.; Tullio, J.; Potts, C.; Mynatt, E.D. STRAP: A Structured Analysis Framework for Privacy. 2005. Available online: https://smartech.gatech.edu/handle/1853/4450 (accessed on 4 October 2019).
- Thomas, K.; Bandara, A.K.; Price, B.A.; Nuseibeh, B. Distilling privacy requirements for mobile applications. In Proceedings of the 36th International Conference on Software Engineering, Hyderabad, India, 31 May 2014; pp. 871–882. [Google Scholar]
- ElShekeil, S.A.; Laoyookhong, S. GDPR Privacy by Design. 2017. Available online: https://dsv.su.se/polopoly_fs/1.351720.1507815130!/menu/standard/file/Stipendie2017_ElShekeil-Laoyookhong.pdf (accessed on 4 October 2019).
- Gillham, B. Developing a Questionnaire; A&C Black: London, UK, 2000. [Google Scholar]
- Zainal, Z. Case study as a research method. J. Kemanus. 2007, 5. Available online: https://jurnalkemanusiaan.utm.my/index.php/kemanusiaan/article/view/165 (accessed on 4 October 2019).
- Sjøberg, D.I.; Hannay, J.E.; Hansen, O.; Kampenes, V.B.; Karahasanovic, A.; Liborg, N.K.; Rekdal, A.C. A survey of controlled experiments in software engineering. IEEE Trans. Softw. Eng. 2005, 31, 733–753. [Google Scholar] [CrossRef]
- Falessi, D.; Juristo, N.; Wohlin, C.; Turhan, B.; Münch, J.; Jedlitschka, A.; Oivo, M. Empirical software engineering experts on the use of students and professionals in experiments. Empir. Softw. Eng. 2018, 23, 452–489. [Google Scholar] [CrossRef]
- Höst, M.; Regnell, B.; Wohlin, C. Using students as subjects—A comparative study of students and professionals in lead-time impact assessment. Empir. Softw. Eng. 2000, 5, 201–214. [Google Scholar] [CrossRef]
- Svahnberg, M.; Aurum, A.; Wohlin, C. Using students as subjects-an empirical evaluation. In Proceedings of the Second ACM-IEEE International Symposium on Empirical Software Engineering and Measurement, Kaiserslautern, Germany, 9–10 October 2008; pp. 288–290. [Google Scholar]
- Salman, I.; Misirli, A.T.; Juristo, N. Are students representatives of professionals in software engineering experiments? In Proceedings of the 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, Florence, Italy, 16–24 May 2015; Volume 1, pp. 666–676. [Google Scholar]
- Diehl, M.; Stroebe, W. Productivity loss in brainstorming groups: Toward the solution of a riddle. J. Pers. Soc. Psychol. 1987, 53, 497. [Google Scholar] [CrossRef]
- Rietzschel, E.F.; Nijstad, B.A.; Stroebe, W. Productivity is not enough: A comparison of interactive and nominal brainstorming groups on idea generation and selection. J. Exp. Soc. Psychol. 2006, 42, 244–251. [Google Scholar] [CrossRef]
- Rodríguez, D.; Sicilia, M.A.; García, E.; Harrison, R. Empirical findings on team size and productivity in software development. J. Syst. Softw. 2012, 85, 562–570. [Google Scholar] [CrossRef]
- Iachello, G.; Abowd, G.D. From privacy methods to a privacy toolbox: Evaluation shows that heuristics are complementary. ACM Trans. Comput.-Hum. Interact. (TOCHI) 2008, 15, 8. [Google Scholar] [CrossRef]
- Bryman, A. Social Research Methods; Oxford University Press: New York, NY, USA, 2016. [Google Scholar]
- Wolpaw, J.R.; Birbaumer, N.; Heetderks, W.J.; McFarland, D.J.; Peckham, P.H.; Schalk, G.; Donchin, E.; Quatrano, L.A.; Robinson, C.J.; Vaughan, T.M. Brain-computer interface technology: A review of the first international meeting. IEEE Trans. Rehabil. Eng. 2000, 8, 164–173. [Google Scholar] [CrossRef] [PubMed]
- Huth, D.; Matthes, F. “Appropriate Technical and Organizational Measures”: Identifying Privacy Engineering Approaches to Meet GDPR Requirements. Available online: https://aisel.aisnet.org/amcis2019/info_security_privacy/info_security_privacy/5/ (accessed on 4 October 2019).
- Stålhane, T.; Sindre, G. Safety hazard identification by misuse cases: Experimental comparison of text and diagrams. In Proceedings of the International Conference on Model Driven Engineering Languages and Systems, Toulouse, France, 28 September–3 October 2008; pp. 721–735. [Google Scholar]
- Hong, J.I.; Ng, J.D.; Lederer, S.; Landay, J.A. Privacy risk models for designing privacy-sensitive ubiquitous computing systems. In Proceedings of the 5th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, Cambridge, MA, USA, 1–4 August 2004; pp. 91–100. [Google Scholar]
- Wohlin, C.; Runeson, P.; Höst, M.; Ohlsson, M.C.; Regnell, B.; Wesslén, A. Experimentation in Software Engineering; Springer Science & Business Media: Berlin, Germany, 2012. [Google Scholar]
- Seddon, P.B.; Scheepers, R. Towards the improved treatment of generalization of knowledge claims in IS research: Drawing general conclusions from samples. Eur. J. Inf. Syst. 2012, 21, 6–21. [Google Scholar] [CrossRef]
- Yamamoto, J.; Inoue, K.; Yoshioka, M. Investigation of customer behavior analysis based on top-view depth camera. In Proceedings of the 2017 IEEE Winter Applications of Computer Vision Workshops (WACVW), Santa Rosa, CA, USA, 24–31 March 2017; pp. 67–74. [Google Scholar]
- Paolanti, M.; Romeo, L.; Liciotti, D.; Pietrini, R.; Cenci, A.; Frontoni, E.; Zingaretti, P. Person Re-Identification with RGB-D Camera in Top-View Configuration through Multiple Nearest Neighbor Classifiers and Neighborhood Component Features Selection. Sensors 2018, 18, 3471. [Google Scholar] [CrossRef] [PubMed]
Group | Number of Members | Method Used | Scenario |
---|---|---|---|
A | 3 | QOC-FC | Visitors Tracking |
B | 4 | QOC-FC | Visitors Tracking |
C | 4 | PATH | Visitors Tracking |
D | 3 | PATH | Visitors Tracking |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Pérez Fernández, A.; Sindre, G. Mitigating the Impact on Users’ Privacy Caused by over Specifications in the Design of IoT Applications. Sensors 2019, 19, 4318. https://doi.org/10.3390/s19194318