Design and Framework of Non-Intrusive Spatial System for Child Behavior Support in Domestic Environments
Abstract
1. Introduction
- To translate behavior-guided spatial strategies into implementable system logic.
- To compare feasible sensing architectures in terms of behavioral resolution, privacy, and spatial adaptability.
- To propose an integrated, low-cost sensing model suitable for compact child-centered environments.
2. Background and Related Work
2.1. Spatial and Ambient Cues for Behavioral Support
2.2. Behavioral Sensing in Domestic Environments
2.3. Privacy-Conscious and Ethical Sensing
2.4. Research Gap and Technical Challenges
- Translating behavioral strategies into implementable sensing logics.
- Evaluating sensor configurations in terms of privacy, fidelity, and spatial adaptability.
- Proposing a technically feasible and ethically aligned sensing system for practical usage.
3. Design Framework and Technical Translation
3.1. Core Strategies for Behavior-Guided Design
3.2. Translating Design Strategies into System Logic
- Input Conditions: What the system observes (e.g., lack of movement, emotional input).
- Trigger Thresholds: When the system decides to act (e.g., inactivity longer than 15 min).
- Feedback Outputs: How the system responds (e.g., ambient cue, notification, color shift).
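The input–threshold–output decomposition above can be sketched as a simple rule structure. The Python sketch below is illustrative only: the `Rule` class, the observation dictionary, and the cue name are assumptions, with the 15-minute threshold taken from the example in the text.

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """One behavior-guidance rule: observe -> decide -> respond."""
    input_condition: Callable[[dict], float]  # what the system observes
    trigger_threshold: float                  # when it decides to act
    feedback_output: str                      # how it responds (ambient cue)

    def evaluate(self, observation: dict) -> str | None:
        value = self.input_condition(observation)
        return self.feedback_output if value >= self.trigger_threshold else None

# Example rule: act once inactivity exceeds 15 minutes (900 seconds).
inactivity_rule = Rule(
    input_condition=lambda obs: obs.get("inactive_seconds", 0),
    trigger_threshold=900,
    feedback_output="ambient_color_shift",
)

print(inactivity_rule.evaluate({"inactive_seconds": 1200}))  # ambient_color_shift
print(inactivity_rule.evaluate({"inactive_seconds": 300}))   # None
```

Keeping each strategy as data (condition, threshold, output) rather than hard-coded branches makes the later adaptive calibration step a matter of adjusting `trigger_threshold` in place.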
3.3. Sensor-Centered Implementation
3.3.1. From Strategy to System Logic
- Presence and Routine Detection is used to assess whether expected behaviors (e.g., movement during routine times) are taking place. It supports strategies like ‘Routine Recovery’ and ‘External Linkage’.
- Posture and Stillness Interpretation captures behavioral stasis or disengagement by recognizing prolonged stillness or inactive posture. It is relevant to ‘Behavioral Transition Induction’ and ‘Emotion-Responsive Adjustment’.
- Voluntary Feedback Capture collects self-reported emotional cues through child-initiated inputs such as tactile buttons or emotion cards. This logic enhances ‘Emotion-Responsive Adjustment’ and offers ethical interaction that does not rely solely on passive sensing.
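As a concrete illustration of the Presence and Routine Detection logic, a minimal sketch might check for expected movement inside predefined routine windows. The windows and function name below are hypothetical, not taken from the described system.

```python
from datetime import time

# Hypothetical routine windows (start, end) during which movement is expected.
ROUTINE_WINDOWS = [
    (time(7, 0), time(8, 30)),   # morning routine
    (time(16, 0), time(18, 0)),  # after-school activity
]

def routine_disrupted(now: time, motion_detected: bool) -> bool:
    """Flag a possible routine disruption: no motion inside an expected window."""
    in_window = any(start <= now <= end for start, end in ROUTINE_WINDOWS)
    return in_window and not motion_detected

print(routine_disrupted(time(7, 30), motion_detected=False))  # True
print(routine_disrupted(time(12, 0), motion_detected=False))  # False
```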
3.3.2. Sensing Modalities and Their Roles
- Environmental Sensors (e.g., PIR and pressure mats) detect presence or motion. They are discreet, low-cost, and minimally invasive, making them ideal for routine-based logic. Pressure-based systems in particular have been shown to support cost-effective, privacy-preserving Human Activity Recognition [22].
- Posture-Sensitive Sensors (e.g., low-resolution LiDAR and mmWave radar) identify stillness, body orientation, and depth cues, enabling higher-fidelity interpretation of behavioral disengagement. Fusing multiple sensor types is critical for detecting, tracking, and identifying people in realistic scenarios, enhancing overall system robustness [21].
- User-Triggered Inputs (e.g., tactile interfaces) provide explicit emotional signals. They promote autonomy, reduce interpretive ambiguity, and avoid privacy concerns. Fusing these active inputs with passive data (e.g., from LiDAR or mmWave radar) lets the system cover both implicit behavioral cues and explicit self-reported states, which is especially important for complex strategies such as Emotion-Responsive Adjustment while respecting user autonomy.
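One way to make this fusion concrete: explicit self-reports take precedence over passively inferred cues, keeping the child's own input authoritative. The sketch below is a simplified assumption of how the two channels could be combined; the labels and the 15-minute stillness threshold are illustrative.

```python
from __future__ import annotations

def interpret_state(stillness_minutes: float, self_report: str | None) -> str:
    """Fuse passive stillness sensing with an explicit child-initiated input.

    An explicit self-report (e.g., an emotion card) always overrides
    passively inferred cues, preserving the child's autonomy.
    """
    if self_report is not None:
        return f"reported:{self_report}"
    if stillness_minutes >= 15:  # illustrative disengagement threshold
        return "inferred:disengaged"
    return "inferred:engaged"

print(interpret_state(20, None))    # inferred:disengaged
print(interpret_state(20, "calm"))  # reported:calm
```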
3.3.3. Evaluation Criteria for Sensor Suitability
3.3.4. Strategy-to-Sensor Mapping Summary
3.3.5. Reframing Design as Interaction
4. System Architecture and Workflow
4.1. From Design Logic to System Architecture
4.1.1. Sensing Layer
4.1.2. Interpretation Layer
4.1.3. Feedback Layer
4.2. Sensor Configuration Spectrum and Tradeoffs
4.2.1. Configuration Options
4.2.2. Comparison by Key Evaluation Criteria
- In a shared bedroom for siblings, Option 1 may be sufficient and least disruptive.
- In a solo care room with emotional disengagement concerns, Option 2 or 3 may better support subtle behavioral tracking without visual monitoring.
- Where ethical constraints prohibit any implicit monitoring, manual input tools (e.g., emotion cards) can complement sensing for respectful engagement.
4.2.3. Design Implication
4.3. Behavioral Feedback Workflow: A 7-Stage Adaptive Loop
- Input Collection: Behavioral signals such as movement, posture, stillness, or prolonged inactivity are continuously gathered through a configured sensor array, including passive infrared sensors, pressure mats, and posture-sensitive technologies.
- State Classification: The system interprets raw input data to classify the current behavioral state. Predefined rules identify patterns such as activity, inactivity, routine disruption, or disengagement.
- Strategy Selection: Based on the classified state, one of the four core strategies (Routine Recovery, Emotion-Responsive Adjustment, Behavioral Transition Induction, External Linkage) is selected to guide the system’s feedback logic.
- Feedback Mapping: The selected strategy is translated into an ambient output plan, such as color-shifting lights, soft audio, or a glowing prompt, based on the pre-configured associations between strategy and sensory cues.
- Soft Guidance: Feedback is delivered in a non-intrusive manner to nudge behavior gently, e.g., encouraging reengagement, facilitating activity transitions, or validating emotional expression. The system aims to support, not interrupt.
- Behavior Monitoring: The system continues to monitor for signs of renewed activity or state resolution. If the intended engagement resumes, the system returns to a passive state, and the feedback is withdrawn.
- Adaptive Calibration: Short-term behavioral data are stored locally to adjust timing thresholds and sensitivity parameters. This allows the system to refine its responsiveness over time while respecting privacy. Specifically, the system continuously analyzes localized, temporary behavioral logs (e.g., historical durations of activity/inactivity over a period) to dynamically tune trigger thresholds for each strategy. This prevents over-intervention by adjusting sensitivity based on observed patterns, fostering a more personalized and less intrusive supportive environment aligned with the child’s evolving rhythm. Adaptation can leverage simple statistical methods like moving averages or rule-based adjustments.
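The seven stages above can be condensed into a minimal control-loop sketch. This is an illustrative reading of the workflow, not the authors' implementation: the class and strategy labels, the 0.8/0.2 smoothing weights, and the 600 s floor are assumptions, with the moving-average calibration following the rule-based adaptation the text describes.

```python
from __future__ import annotations
from collections import deque
from statistics import mean

STRATEGY_FEEDBACK = {  # Stage 4: strategy -> ambient output (illustrative)
    "routine_recovery": "warm_light_pulse",
    "behavioral_transition": "soft_glow_prompt",
}

class AdaptiveLoop:
    """Minimal sketch of the 7-stage loop: classify, select, map, calibrate."""

    def __init__(self, base_threshold: float = 900.0):
        self.threshold = base_threshold  # inactivity trigger, in seconds
        self.history = deque(maxlen=20)  # Stage 7: short-term local log only

    def step(self, inactive_seconds: float, in_routine_window: bool) -> str | None:
        # Stages 1-2: input collection and state classification.
        self.history.append(inactive_seconds)
        if inactive_seconds < self.threshold:
            return None  # Stage 6: engagement resumed, stay passive
        # Stage 3: strategy selection based on context.
        strategy = "routine_recovery" if in_routine_window else "behavioral_transition"
        # Stage 7: adaptive calibration via a moving average of observed inactivity.
        self.threshold = max(600.0, 0.8 * mean(self.history) + 0.2 * self.threshold)
        # Stages 4-5: feedback mapping and soft guidance.
        return STRATEGY_FEEDBACK[strategy]

loop = AdaptiveLoop()
print(loop.step(1200, in_routine_window=True))   # warm_light_pulse
print(loop.step(120, in_routine_window=False))   # None (below threshold)
```

The moving-average update raises the threshold when long inactivity turns out to be routine for a given child, which is one simple way to realize the over-intervention safeguard described in the Adaptive Calibration stage.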
4.4. Spatial Integration and Ethical Sensor Deployment
4.5. Technical Considerations and Ethical Design Principles
5. Comparative Evaluation of Sensor Architectures
5.1. Purpose and Evaluation Framework
- Detection Precision: Can the system accurately detect and differentiate the key behavioral indicators, such as posture changes, immobility, or prolonged disengagement, at a sufficient resolution?
- Privacy Profile: Does the system intrinsically preserve the dignity and autonomy of the user, particularly children in sensitive situations, by avoiding identifiable data capture and ensuring local processing?
- Spatial Integration: Can the sensors be discreetly and easily deployed in everyday home environments without being obtrusive or stigmatizing?
- Implementation Feasibility: Are the components affordable, readily available, and easily integrable with existing domestic technologies?
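Detection precision under this framework can be quantified with standard event-level metrics, for instance by comparing system-flagged events against caregiver logs. The helper below (names and counts are illustrative) shows the computation:

```python
def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Precision, recall, and F1 for behavioral-event detection,
    counted at the event level (e.g., flagged disengagement episodes)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

# Illustrative counts: 18 correct detections, 2 false alarms, 6 missed events.
m = detection_metrics(tp=18, fp=2, fn=6)
print(round(m["precision"], 2), round(m["recall"], 2), round(m["f1"], 2))
# 0.9 0.75 0.82
```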
5.2. Strategic Fit and Selection Guidelines
5.2.1. Overview of Sensor Options
5.2.2. Alignment with Behavioral Design Strategies
5.3. Contextual Factors in Sensor Selection
- Low-resolution LiDAR provides useful behavioral detail without compromising privacy and is especially suitable for shared domestic spaces like bedrooms and study areas where posture and motion inform engagement patterns. However, its performance can be affected by direct sunlight or highly reflective surfaces, requiring careful calibration.
- mmWave radar excels in protecting privacy and detecting subtle movements even through obstacles. However, it requires more sophisticated algorithmic control for precise behavioral interpretation and, given its current complexity and limited precedent in consumer home settings, is better reserved for future applications in sensitive care environments.
- Environmental sensors provide a lightweight entry point for deployment, particularly where user familiarity and unobtrusiveness are priorities. Nonetheless, their functional range is narrow: they primarily detect presence or absence, making them better suited to routine pattern detection than to fine-grained, real-time behavioral adaptation.
5.4. Contextual Selection and Design Implications
6. Toward Empirical Validation: Prototype and Research Plan
6.1. Prototype Configuration and Use Scenario
6.2. Planned Pilot and Evaluation Parameters
- Participants: Fifteen to twenty households with children aged six to ten.
- Location: Bedrooms, study nooks, or shared domestic play areas.
- Duration: One month per household (four weeks).
- System Goals:
  - A. Detect presence, absence, and posture changes in daily routines.
  - B. Trigger ambient, child-appropriate feedback without capturing identifiable data.
  - C. Operate reliably with minimal maintenance.
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Mostafa, M. An Architecture for Autism: Concepts of Design Intervention for the Autistic User. Archnet-IJAR Int. J. Archit. Res. 2008, 2, 189–211. [Google Scholar]
- Almaz, A.; Mohamed, I. The Role of Architectural and Interior Design in Creating an Autism-Friendly Environment to Promote Sensory-Mitigated Design. Int. Des. J. 2024, 14, 239–255. [Google Scholar]
- Deng, L.; Rattadilok, P. A Sensor and Machine Learning-Based Sensory Management Recommendation System for Children with Autism Spectrum Disorders. Sensors 2022, 22, 5803. [Google Scholar] [CrossRef]
- Voss, C.; Schwartz, J.; Daniels, J.; Kline, A.; Haber, N.; Washington, P.; Tariq, Q.; Robinson, T.N.; Desai, M.; Phillips, J.M.; et al. Effect of Wearable Digital Intervention for Improving Socialization in Children with Autism Spectrum Disorder: A Randomized Clinical Trial. JAMA Pediatr. 2019, 173, 446–454. [Google Scholar] [CrossRef] [PubMed]
- Diraco, G.; Rescio, G.; Caroppo, A.; Manni, A.; Leone, A. Human Action Recognition in Smart Living Services and Applications: Context Awareness, Data Availability, Personalization, and Privacy. Sensors 2023, 23, 6040. [Google Scholar] [CrossRef] [PubMed]
- Toscos, T.; Connelly, K.; Rogers, Y. Best intentions: Health monitoring technology and children. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 1431–1440. [Google Scholar] [CrossRef]
- Pandria, N.; Bamidis, P.D. Technology-based solutions for behavior monitoring: Lessons learned. Front. Educ. 2023, 8, 1150239. [Google Scholar] [CrossRef]
- Alazman, A.; O’Shea, P.; Zolfaghari, S. Technology-based self-monitoring system for on-task behavior of students with disabilities: A quantitative meta-analysis of single-subject research. Dev. Med. Child Neurol. 2021, 63, 1403–1412. [Google Scholar]
- Brown, S.M.; Doom, J.R.; Lechuga-Peña, S.; Watamura, S.E.; Koppels, T. Stress and Parenting During the Global COVID-19 Pandemic. Child Abuse Neglect 2020, 110, 104699. [Google Scholar] [CrossRef]
- Jones, D.J.; Forehand, R.; Cuellar, J.; Parent, J.; Honeycutt, A.; Khavjou, O.; Gonzalez, M.; Anton, M.; Newey, G.A. Technology-enhanced program for child disruptive behavior disorders: Development and pilot randomized control trial. J. Child Psychol. Psychiatry 2020, 61, 1221–1230. [Google Scholar] [CrossRef]
- Bronfenbrenner, U. The Ecology of Human Development: Experiments by Nature and Design; Harvard University Press: Cambridge, MA, USA, 1979. [Google Scholar]
- Kopec, D.A. Environmental Psychology for Design; Fairchild Books: New York, NY, USA, 2012. [Google Scholar]
- Obrusnikova, I.; Cavalier, A.R. Perceived Barriers and Facilitators of Participation in After-School Physical Activity by Children with Autism Spectrum Disorders. J. Dev. Phys. Disabil. 2011, 23, 195–211. [Google Scholar] [CrossRef]
- Gibson, J.J. The Ecological Approach to Visual Perception; Houghton Mifflin: Boston, MA, USA, 1979. [Google Scholar]
- Gross, J.J. The Emerging Field of Emotion Regulation: An Integrative Review. Rev. Gen. Psychol. 1998, 2, 271–299. [Google Scholar] [CrossRef]
- Berrezueta-Guzmán, J.; Pau, I.; Martín-Ruiz, M.L.; Máximo-Bocanegra, N. Smart-home environment to support homework activities for children. IEEE Access 2020, 8, 160251–160267. [Google Scholar] [CrossRef]
- Consolvo, S.; McDonald, D.W.; Landay, J.A. Theory-Driven Design Strategies for Technologies that Support Behavior Change in Everyday Life. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009; pp. 405–414. [Google Scholar]
- Wilson, D.H.; Atkeson, C. Simultaneous Tracking and Activity Recognition (STAR) Using Many Anonymous, Binary Sensors. In Lecture Notes in Computer Science; Gellersen, H.W., Want, R., Schmidt, A., Eds.; Springer: Berlin, Germany, 2005; Volume 3468, pp. 62–79. [Google Scholar] [CrossRef]
- Cook, D.J.; Crandall, A.S.; Singh, R. Detecting anomalies in daily activity routines of older persons in single resident smart homes: Proof-of-concept study. J. Ambient. Intell. Smart Environ. 2013, 5, 521–534. [Google Scholar]
- Ahmad, M.; Shah, S.A.; Khan, Z.A. Fundamentals, Algorithms, and Technologies of Occupancy Detection for Smart Buildings Using IoT Sensors. Sensors 2020, 20, 7215. [Google Scholar] [CrossRef]
- Teixeira, T.V.S. A Sensor-Fusion System to Detect, Track, and Identify People in Realistic Scenarios; Yale University: New Haven, CT, USA, 2010. [Google Scholar]
- Jain, A.; Akerkar, R.; Srivastava, A. Privacy-Preserving Human Activity Recognition System for Assisted Living Environments. IEEE Trans. Artif. Intell. 2023, 5, 2342–2357. [Google Scholar] [CrossRef]
- Rinchi, O.; Ghazzai, H.; Alsharoa, A.; Massoud, Y. LiDAR Technology for Human Activity Recognition: Outlooks and Challenges. IEEE Internet Things Mag. 2023, 6, 143–150. [Google Scholar] [CrossRef]
- Pinheiro, A.; Canedo, E.D.; Junior, R.T.D.S.; Albuquerque, R.D.O.; Villalba, L.J.G.; Kim, T.-H. People Detection and Tracking Using LIDAR Sensors. Sensors 2018, 18, 753. [Google Scholar] [CrossRef]
- Koide, K.; Miura, J.; Menegatti, E. A portable three-dimensional LIDAR-based system for long-term and wide-area people behavior measurement. Int. J. Adv. Robot. Syst. 2019, 16, 1–16. [Google Scholar] [CrossRef]
- Krupke, C.; Eling, C.; Schütt, F. Assessing temporal behavior in lidar point clouds of urban environments. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, X-1, 195–202. [Google Scholar]
- Joo, J.-E.; Hu, Y.; Kim, S.; Kim, H.; Park, S.; Kim, J.-H.; Kim, Y.; Park, S.-M. An Indoor-Monitoring LiDAR Sensor for Patients with Alzheimer Disease Residing in Long-Term Care Facilities. Sensors 2022, 22, 7934. [Google Scholar] [CrossRef]
- Delgado-Santos, P.; Stragapede, G.; Tolosana, R.; Guest, R.; Deravi, F.; Vera-Rodriguez, R. A Survey of Privacy Vulnerabilities of Mobile Device Sensors. ACM Comput. Surv. 2021, 54, 1–30. [Google Scholar] [CrossRef]
- Zhao, M.; Li, T.; Alsheikh, M.A.; Tian, Y.; Zhao, H.; Torralba, A. Through-Wall Human Pose Estimation Using Radio Signals. Nat. Commun. 2021, 12, 3140. [Google Scholar]
- Senstar Corporation. Smart 3D LiDAR for Security Applications. 2023. Available online: https://senstar.com/products/above-ground-sensors/smart-3d-lidar-for-security-applications/ (accessed on 1 July 2025).
- Yazar, A. Multi-Sensor Based Ambient Assisted Living System. Master’s Thesis, Bilkent University, Ankara, Turkey, 2013. Available online: https://repository.bilkent.edu.tr/items/c66561b6-f8be-4ccb-99a5-1bb532fa21d8 (accessed on 31 May 2025).
- Xie, Y.; Jiang, R.; Guo, X.; Wang, Y.; Cheng, J.; Chen, Y. mmEat: Millimeter wave-enabled environment-invariant eating behavior monitoring. Smart Health 2022, 23, 100236. [Google Scholar] [CrossRef]
- Wang, T.; Sakamoto, T.; Oshima, Y.; Iwata, I.; Kato, M.; Kobayashi, H.; Wakuta, M.; Myowa, M.; Nishimura, T.; Senju, A. Detection and Classification of Teacher-Rated Children’s Activity Levels Using Millimeter-Wave Radar and Machine Learning: A Pilot Study in a Real Primary School Environment. IEEE Access 2025, 13, 23156–23170. [Google Scholar] [CrossRef]
- Rashidi, P.; Mihailidis, A. Review and challenges of technologies for real-time human behavior monitoring. IEEE Trans. Biomed. Circuits Syst. 2012, 42, 1152–1164. [Google Scholar]
- Bandura, A. Self-Efficacy: The Exercise of Control; W.H. Freeman: New York, NY, USA, 1997. [Google Scholar]
Evaluation Criteria | Description |
---|---|
Privacy Sensitivity | Level of intrusiveness and potential for re-identification; assessed by: (1) Identifiability of raw data (e.g., facial features), (2) Level of data abstraction (e.g., point cloud vs. bounding box), (3) Processing location (edge vs. cloud), and (4) Data retention policy. |
Detection Fidelity | Accuracy (precision, recall, F1-score) in recognizing key behavioral indicators (e.g., posture shifts, sustained stillness) and differentiating subtle human activities. |
Spatial Integration | Ease of physical deployment (e.g., installation complexity, calibration requirements), aesthetic compatibility with domestic settings, and minimal physical disruption. |
Feedback Synchronization | Timeliness of system response (latency from input to output) and contextual relevance of feedback based on real-time sensor data, ensuring effective behavioral nudges. |
Strategy | High-Priority Criteria | Preferred Sensor Types |
---|---|---|
Routine Recovery | Feedback Synchronization, Spatial Integration | PIR, pressure mat |
Emotion-Responsive Adjustment | Privacy, Detection Fidelity | Emotion card, mmWave radar |
Behavioral Transition Induction | Detection Fidelity, Feedback Synchronization | mmWave radar, low-res LiDAR |
External Linkage | Privacy, Spatial Integration | PIR, tactile button |
Option | Sensor Types | Key Advantages | Key Limitations |
---|---|---|---|
1 | Environmental sensors (PIR, pressure mat, ambient light) | Low cost, easy installation, privacy-preserving | Cannot detect posture or nuanced stillness |
2 | Low-resolution LiDAR | Captures spatial engagement, posture, and inactivity accurately | Requires careful placement and calibration |
3 | mmWave radar, thermal IR | Works in low light, enables non-visual tracking, protects privacy | High algorithmic complexity, limited deployment history |
Criteria | Option 1 | Option 2 | Option 3 |
---|---|---|---|
Privacy | High | Moderate | High |
Detection Fidelity | Low | Moderate to high | High |
Installability | Very high | Moderate (requires clear space) | Moderate to low (sensitive to layout) |
Feedback Timing | High (fast and simple signals) | High | High (if algorithms are tuned well) |
Sensor Type | Recommended Location | Design Considerations |
---|---|---|
PIR | Entrances, desks | Detects transitions between spaces; should avoid persistent monitoring zones |
Pressure Mat | Beds, chairs | Embedded in seating or bedding surfaces; must remain unobtrusive and passive |
Light Sensor | Ceilings, upper corners | Used to monitor ambient brightness; should not interfere with visual comfort |
LiDAR | Ceiling-mounted, angled downward | Requires careful cone-of-view design to avoid facial recognition or direct gaze |
mmWave Radar | Upper wall, high corners | Non-visible; maintains privacy even in dark; requires directional calibration |
Principle | Description |
---|---|
Edge Processing | All data is processed locally within the system device. No cloud connectivity is used, ensuring that behavioral data remains within the child’s space. |
No Biometric Capture | The system avoids cameras, microphones, or any biometric sensors that could identify or surveil the child. This maintains dignity and eliminates profiling risks. |
Temporary Data Use | Behavioral logs are stored only temporarily to allow adaptive calibration. No long-term personal data is retained. |
Informed Consent | The system operates transparently. Where developmentally appropriate, children are informed of how the system works and are given the option to engage. |
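The Edge Processing and Temporary Data Use principles imply a local log whose entries expire automatically. A minimal sketch follows; the class name and the one-week default retention window are illustrative assumptions, not values from the paper.

```python
from __future__ import annotations
import time

class TemporaryLog:
    """Edge-local behavioral log whose entries expire automatically,
    so no long-term personal data accumulates on the device."""

    def __init__(self, retention_seconds: float = 7 * 24 * 3600):
        self.retention = retention_seconds
        self._entries = []  # list of (timestamp, event label) pairs

    def record(self, event: str, now: float | None = None) -> None:
        now = time.time() if now is None else now
        self._entries.append((now, event))
        self._purge(now)

    def events(self, now: float | None = None) -> list[str]:
        now = time.time() if now is None else now
        self._purge(now)
        return [e for _, e in self._entries]

    def _purge(self, now: float) -> None:
        cutoff = now - self.retention
        self._entries = [(t, e) for t, e in self._entries if t >= cutoff]

log = TemporaryLog(retention_seconds=60)
log.record("inactive", now=0)
log.record("active", now=30)
print(log.events(now=50))   # ['inactive', 'active']
print(log.events(now=120))  # [] -- all entries expired
```

Because purging happens on every read and write, the log never needs a background cleanup task, which suits a small always-on edge device.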
Design Strategy | Sensor Fit | Justification |
---|---|---|
Routine Recovery | All options viable | Basic presence/inactivity detection is sufficient |
Emotion-Responsive Adjustment | LiDAR or mmWave | Captures behavioral indicators for emotional states; complements direct input |
Behavioral Transition Induction | LiDAR or mmWave | Requires detection of immobility or spatial stagnation |
External Linkage | All (if logic applied) | Triggered by repeated absence or failure patterns; logic-based rather than resolution-based |
Parameter | Values | Unit |
---|---|---|
Size | | mm |
Weight | | g |
Power supply | | V |
Maximum power consumption | | W |
Operating temperature | | °C |
Infrared VCSEL emitter wavelength | | nm |
Emitting angle | | ° |
Maximum measurable distance | | m |
Minimum measurable distance | | m |
Maximum output frame rate | | fps |
Category | Details |
---|---|
Sensor Setup | Low-resolution LiDAR with a local edge processor (e.g., Raspberry Pi, Jetson Nano) |
Core Functions | Detects posture, immobility, presence; operates in real-time; no image or biometric storage |
Feedback Modality | Visual cues only; color-shifting lights, ambient icons (no sound unless user-triggered) |
Target Spaces | Shared domestic settings (bedrooms, study areas, play zones) |
Design Principles | Privacy-preserving (non-visual sensing), Seamless spatial integration, User-controllable feedback |
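For the stillness and immobility detection named among the prototype's core functions, a frame-differencing sketch over low-resolution depth readings illustrates one plausible approach. The threshold, frame format, and function names are assumptions, not the prototype's actual LiDAR pipeline.

```python
def frame_motion(prev: list, curr: list) -> float:
    """Mean absolute depth change between two consecutive low-res frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def trailing_still_frames(frames: list, motion_eps: float = 0.02) -> int:
    """Count consecutive trailing frame pairs whose motion stays below
    `motion_eps` -- a simple proxy for sustained stillness."""
    still = 0
    for prev, curr in zip(frames, frames[1:]):
        still = still + 1 if frame_motion(prev, curr) < motion_eps else 0
    return still

# Three nearly identical frames followed by a large movement:
frames = [[1.0, 2.0, 1.5], [1.0, 2.0, 1.5], [1.0, 2.01, 1.5], [1.2, 2.4, 1.9]]
print(trailing_still_frames(frames))      # 0 -- the final movement resets the count
print(trailing_still_frames(frames[:3]))  # 2 -- two still transitions in a row
```

On a Raspberry Pi-class edge processor this per-frame difference is cheap enough to run in real time, and only the scalar stillness count, never the depth frames themselves, would need to be retained.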
Category | Details |
---|---|
Participant Households | 6 to 10 families with children aged 6–10 |
Installation Areas | Daily-use child spaces (e.g., bedroom, desk area) |
Trial Duration | Two weeks per household |
Evaluation Criteria | Detection accuracy (vs. caregiver logs), System reliability (uptime, fault tolerance), Social fit (child response, behavioral influence) |
Future Expansion | Applicable to group housing, welfare shelters, and learning spaces; potential for mmWave integration in sensitive environments |
Share and Cite
Yoo, D.-U.; Kang, J.; Park, S.-M. Design and Framework of Non-Intrusive Spatial System for Child Behavior Support in Domestic Environments. Sensors 2025, 25, 5257. https://doi.org/10.3390/s25175257