Review

A Review of Methods for Unobtrusive Measurement of Work-Related Well-Being

1 Jožef Stefan Institute, 1000 Ljubljana, Slovenia
2 Jožef Stefan International Postgraduate School, 1000 Ljubljana, Slovenia
3 Faculty of Informatics, Università della Svizzera Italiana, 6900 Lugano, Switzerland
4 Institute of Work and Organizational Psychology, University of Neuchâtel, 2000 Neuchâtel, Switzerland
5 Faculty of Computer and Information Science, University of Ljubljana, 1000 Ljubljana, Slovenia
* Author to whom correspondence should be addressed.
Mach. Learn. Knowl. Extr. 2025, 7(3), 62; https://doi.org/10.3390/make7030062
Submission received: 28 May 2025 / Revised: 24 June 2025 / Accepted: 28 June 2025 / Published: 1 July 2025
(This article belongs to the Special Issue Sustainable Applications for Machine Learning)

Abstract

Work-related well-being is an important research topic, as it is linked to various aspects of individuals’ lives, including job performance. To measure it effectively, unobtrusive sensors are desirable to minimize the burden on employees. Because the psychological literature lacks consensus on the definition of well-being and its dimensions, our work begins by proposing a conceptualization of well-being based on the refined definition of health provided by the World Health Organization. We then review the existing literature on the unobtrusive measurement of well-being, focusing on affect, engagement, fatigue, stress, sleep deprivation, physical comfort, and social interactions. Our initial search resulted in a total of 644 studies, of which we reviewed 35, covering behavioral markers such as facial expressions, posture, eye movements, and speech. The most commonly used sensing devices were red, green, and blue (RGB) cameras, followed by microphones and smartphones, and the most commonly captured behavioral markers were body movement, facial expressions, and posture. Our work serves as an investigation into unobtrusive measurement methods applicable to the workplace context, aiming to foster a more employee-centric approach to the measurement of well-being and to emphasize its affective component.

1. Introduction

Well-being measurement has become an increasingly prominent research topic across various disciplines due to its impact on numerous aspects of life. As per the American Psychological Association (APA) Dictionary of Psychology, well-being is defined as a state characterized by happiness, low levels of distress, good physical and mental health, a positive outlook, and high quality of life [1]. In the psychological literature, however, various definitions of well-being exist, and despite extensive research, open questions regarding them persist. Similarly, there is a large body of literature on the definition of work-related well-being. Wijngaards et al. [2], for example, define well-being at work as “the experience or state of well-being in the work setting or when working” ([2], p. 798).
Within the organizational context, workplace practices and work-related well-being are linked to several organizational improvements [3], such as employee performance [4].
Researching well-being has its origins in psychology, where a wide range of instruments exist for measuring it (e.g., [5,6,7]). However, most of them rely on self-reports, which are impractical and obtrusive for employees, as filling out questionnaires is time-consuming and disruptive to their workflow. Recently, however, researching well-being has garnered interest in other fields, such as computer science, where sensing technologies that have a less disruptive nature have increasingly been employed. Research is available utilizing a variety of sensors in diverse contexts, such as support tools for chronic disease management [8]. In this review, we therefore focus on the use of unobtrusive sensors, which are designed to collect data without requiring active participation or causing disruption to normal activities. In consequence, unobtrusive sensors are subtle and can be easily integrated into daily scenarios [8]. Given their numerous advantages, unobtrusive sensors are increasingly featured in the scientific literature and are being integrated into novel research methods.
When utilizing unobtrusive methods, researchers can focus on behavioral signals, physiological signals, or both. Significant challenges arise when measuring physiological signals in naturalistic environments where conditions are highly variable (e.g., movement, changing lighting, makeup, and eyewear). These conditions often lead to decreased accuracy and stability of the measurements. For instance, Slapničar et al. [9] conducted a study in which they estimated blood pressure using a modified red, green, and blue (RGB) camera, which required stable environmental conditions (e.g., consistent lighting, minimal movement). This suggests that remote physiological monitoring may not be practical for large-scale naturalistic applications. In contrast, behavioral signals are more directly observable, making them more robust and less dependent on environmental variables. Furthermore, monitoring physiological signals often relies on wearable devices, which would increase obtrusiveness for employees. Given the aforementioned limitations of physiological signal measurement in settings such as the workplace, we will investigate behavioral markers instead of physiological signals in this review.
A variety of studies have been conducted across diverse contexts, such as the well-being of residents in long-term care facilities [10] and the well-being of students in educational settings [11]. In this review, we will focus on investigating well-being in the workplace. We do this because, as mentioned in the previous paragraphs, work-related well-being is linked to several organizational improvements [3], such as employee performance [4], making it an important topic of investigation. Furthermore, monitoring well-being in the workplace using methods such as questionnaires is obtrusive to employees’ workflow, emphasizing the importance of investigating alternative methods such as unobtrusive sensors. Given the significant variations among workplaces, we decided to focus on the workplace of knowledge workers, making our approach context-specific.
To the best of our knowledge, there is currently a lack of research that focuses on monitoring well-being in the workplace using unobtrusive sensors. Additionally, well-being is frequently used as an umbrella term, which often hinders the comparability of existing studies. For example, Wallace et al. [12] investigated the potential of a pressure-sensitive mat for assessing well-being, which is particularly suitable for deployment in the homes of aging adults. However, the presented method focuses mainly on assessing the physical aspect of well-being, which is not suitable in the context of knowledge workers, where other (i.e., psychological) aspects are equally important. Because of these issues, we decided to define multiple aspects of well-being, as presented in the following paragraphs.
To tackle the challenges related to defining the dimensions of well-being and to ensure clarity, we draw on the definition of health provided by the World Health Organization (WHO) [13], by refining it and proposing a conceptual framework of well-being. We do this by defining the dimensions and sub-dimensions of well-being based on the existing psychological literature and explaining the relevance of each sub-dimension to the overall construct. Then, we present the most suitable and developed unobtrusive methods for measuring each sub-dimension and discuss how these methods can be applied in the context of the workplace. Lastly, we mention the limitations of our approach and the general issues that this field is facing.

2. Conceptualizing Well-Being

As already mentioned, various definitions of well-being exist in the psychological literature, and despite extensive research, open questions regarding them persist. The complexity and variability of well-being have prevented a consensus on its definition [14]. Nevertheless, the importance of its affective components is well-recognized (e.g., [15,16]). A model (with proposed dimensions of well-being) reaching high agreement among professionals that could be used in practice remains to be established [5]. Furthermore, well-being is often used as an umbrella term [16], leading to ambiguous interpretations and questionable comparability among studies.

2.1. Taxonomies of Well-Being

Well-being is a complex construct with numerous influencing factors. Due to its dynamic structure that changes over time and fluctuates within a person [17], an extensive body of literature explores the factors that lead to well-being and the outcomes it is connected to. In this section, we review several highly influential studies on well-being from psychology with the aim of introducing the reader to the main approaches in this field.
First, in 1984, Diener [15] investigated well-being, including happiness, life satisfaction, and positive affect, highlighting that various factors influence well-being at different analytical levels. Second, another early work was presented by Ryff and Keyes [18], who focused specifically on psychological well-being and empirically investigated the construct’s six-factor structure: autonomy, environmental mastery, personal growth, positive relations with others, purpose in life, and self-acceptance. By comparing this approach to others and using empirical data for validation, the authors reported that the proposed structure effectively captured the main aspects of psychological well-being. Third, the WHO provided a broader framework [13] by introducing a definition of health in 1995 as not only the absence of disease but also a state of physical, mental, and social well-being. This conceptualization indicated three important aspects of well-being, which remain some of the most widely investigated today.
Fourth, emphasizing its dynamic and changing structure [17], in their proposed framework, Dodge et al. [14] viewed well-being as the interplay between an individual’s resources and challenges. Both resources and challenges can be psychological, social, or physical. Well-being is achieved when resources and challenges are in equilibrium, meaning the individual has the resources to meet their challenges. Fifth, Sonnentag [17] complemented this work by gathering empirical evidence supporting the fluctuating nature of well-being, particularly in the workplace, where it manifests in various on-the-job behaviors, including performance. She investigated the role of job stressors, job resources, interpersonal factors, personal resources, and work–home interface in relation to work-related well-being and its variability.
Sixth, Fisher [16] focused more on the context of the workplace and proposed a conceptualization of work-related well-being, defining three major components: social, eudaimonic, and subjective. Social well-being involves satisfaction with peers, leaders, etc. Eudaimonic well-being includes job involvement, engagement, flow, and intrinsic motivation. Lastly, subjective well-being pertains to attitudes toward work and experiences of both positive and negative affect.
In summary, various taxonomies have been proposed, yet the field still lacks a consensus due to the complexity of well-being. For the purpose of our research, we use the definition of health provided by the WHO [13], specifically the three aspects of well-being, because it captures the broadness of the construct and is well-known and established in the field. We further divide the three proposed dimensions into sub-dimensions to create a clearer and more defined framework because the definition provided by the WHO [13] is too general for the purpose of our review. We do this by combining the findings of the presented taxonomies, as certain proposed dimensions are shared. Refining the three proposed dimensions in a more concrete and precise manner allows us to partially bridge the gap between the different terminologies used to assess well-being in psychology and computer science.
In this review, we therefore adopt a pragmatic approach by proposing a concept of well-being that not only focuses on affect but also highlights the importance of engagement, fatigue, stress, physical comfort, sleep deprivation, and social interactions, as further described in the next section. Our approach is pragmatic in that, instead of conducting an exhaustive and systematic review of various well-being taxonomies from the psychological literature, we selectively reviewed some of the most prominent ones and used them to shape our own conceptual framework. The proposed conceptualization with clearly defined components should help us investigate the available methods for the unobtrusive measurement of well-being.

2.2. Proposed Concept of Well-Being

As highlighted in the previous section, well-being is a broad and complex construct and is often used as an umbrella term. In our review, we decided to use the definition provided by the WHO [13], which we further divided into sub-dimensions. In our concept, we focused on affect while also emphasizing the importance of other dimensions. It is important to note, however, that certain dimensions now serve as predictors, while others act as indicators of well-being. As this falls outside the scope of the current review, it will not be explored further. The dimensions and sub-dimensions that we adopt in our work are presented in Figure 1. We define each of the sub-dimensions as follows:
  • Psychological well-being: As per Ryff [19], psychological well-being includes happiness (the experience of pleasure) and eudaimonic well-being (referring to flourishing, engagement, feeling a sense of purpose):
    Affect: Affect is defined as the experiences of negative and positive affect. As per the circumplex model [20], each affect can be described as a combination of two independent dimensions: pleasure and arousal. Negative affect is a negative well-being indicator, whereas positive affect is a positive well-being indicator.
    Engagement: Engagement is defined as a state where an employee has a high level of energy, is enthusiastic, and is immersed in their work. Engagement is composed of vigor, dedication, and absorption. Vigor refers to high energy and resilience levels, dedication refers to being strongly involved in one’s work, and absorption refers to being focused on and immersed in one’s work [21]. Engagement is a positive well-being indicator.
    Fatigue: Fatigue is defined as a state that arises as a consequence of prolonged periods of demanding cognitive activity [20]. Fatigue is a negative well-being indicator.
    Stress: As per the APA dictionary [22], stress is recognized as the physiological or psychological response to stressors, which can be either internal or external. The physiological response can manifest in sweating, a dry mouth, accelerated speech, etc., and influences how people behave and feel. Stress is a negative well-being indicator.
  • Physical well-being: As per Seligman [23], physical well-being extends beyond the absence of sickness, capturing an individual’s capability to realize their fullest wellness potential:
    Physical comfort: As defined by Kölsch et al. [24], the physical comfort zone is composed of postures and motions that are voluntarily adopted, as opposed to those that are avoided. Physical comfort positively impacts well-being.
    Sleep deprivation: Sleep deprivation occurs when there is either a total absence of sleep or a reduction in sleep duration [25]. Sleep deprivation negatively impacts well-being.
  • Social well-being: As per Pressman [26], social well-being is experienced when various social needs, such as the feeling of support and belonging, are met:
    Social interactions: A social interaction is defined as a process that entails mutual interaction or responses between two or more individuals [27]. There is evidence of the link between frequent and deeper social interactions and well-being [28], indicating that enriching interactions and social support can contribute to increased well-being.
All sub-dimensions were defined by us based on the existing literature. Affect is crucial in investigating well-being, as evidenced by the research of Diener [15] and Fisher [16], with the latter also emphasizing the importance of engagement. Fatigue is linked to occupational safety through its impact on cognitive and motor performance [29], while stress is associated with various negative organizational outcomes, including turnover intentions and job burnout [30]. Physical comfort, particularly posture, is recognized as a significant aspect of workplace experience and well-being [31]. Furthermore, the importance of sleep in various domains of our lives is extensively documented in the literature [32]. Lastly, social interactions and related elements are incorporated into the taxonomy proposed by Fisher [16], since studies such as [33] indicate that quality connections can be sources of energy and well-being at work.

3. Paper Selection Method

For our review, we adopted an exploratory approach. We primarily limited our search to two databases, IEEE Xplore and the ACM Digital Library, chosen for their rich collections of relevant articles from computer science. Our search focused on articles published in the last five years, covering each sub-dimension that comprises well-being, as portrayed in Figure 1. Consequently, we conducted seven different keyword-based searches of both libraries, as presented in Appendix A, Table A1. Our interest in methods for unobtrusive sensing guided our keyword selection. Each study had to meet two criteria for inclusion in our review:
  • Focus on one of the identified sub-dimensions of well-being.
  • Focus on unobtrusive sensing methods.
Due to the large number of reviews already conducted on sensing affect and stress, we introduced an additional criterion for these sub-dimensions: to include only review studies.
As presented in Table 1 and Figure 2, our initial search resulted in 644 articles investigating the various identified sub-dimensions. We then screened each article to determine its relevance and quality and decided whether to include it in the subsequent analysis. Specifically, we focused on whether each article met our inclusion criteria, as specified above, and excluded articles with poor grammar.
After this process, the number of relevant articles decreased greatly, and sometimes, we could not find many suitable articles using the described methodology. In such cases, we conducted manual searches, including studies found on Google Scholar or in the cited literature from already included articles. Additionally, highly relevant papers older than five years were also considered where necessary.
Overall, the screening process resulted in 35 articles, which were thoroughly reviewed by the authors and included in the analysis. In the following sections, we provide an overview of the relevant papers related to the unobtrusive measurement of the sub-dimensions of well-being we proposed in Section 2.

4. Psychological Well-Being

4.1. Affect

Because emotion and affect are widely researched, we decided to only investigate existing reviews from the chosen libraries. Six reviews were deemed relevant, as they focused on detecting current or short-term emotions and affect using unobtrusive sensors. In the reviewed articles, affect was most often investigated through facial expressions and facial motions [34,35] and speech [34,36,37]. Additionally, manifestations through different auricular positions (positions of the ear) [38] and facial temperature changes [39] were investigated. The most commonly used sensors were an RGB camera [34,35], a microphone [34,36,37,38], a thermal camera [39], and radar sensors [40]. It is important to highlight the extensive body of literature focused on monitoring physiological signals associated with affect (e.g., [41]). However, these methods have various limitations, as discussed in Section 1. Notably, some studies complement behavioral markers with physiological signals, which we include in this section due to their high relevance.
In a systematic literature review of wearable sensors by Röddiger et al. [36], the use of wearable devices for detecting emotions and other physiological states was investigated. Specifically, for capturing emotions, an electroencephalogram (EEG), an accelerometer, and a microphone were mentioned. Furthermore, a system for measuring the ear position using reflected sound was mentioned, indicating a relationship between auricular positions and emotions [38]. Gong et al. [35] investigated the use of inertial data from smartphones and wearables to capture characteristic facial motions.
In addition to Röddiger’s findings, Stampf et al. [42] presented a review in which various sensor modalities for detecting emotions and physiological states in highly automated vehicles were investigated for monitoring stress and arousal levels. The review mentioned a study that explored auditory signals like paralinguistic cues, where an automated emotion recognition system was created using multiple microphones to infer seven different emotion categories, such as boredom, sadness, happiness, etc. [37]. The model used various acoustic features processed using a neural network classifier and reached accuracies between 70% and 90% for the classification of different groups of emotions [43]. Similarly, a survey of Ambient Intelligence by Dunne et al. [44] addressed various methods for emotion detection, emphasizing the role of computer vision and physiological signals. Techniques such as gesture analysis and body language interpretation were discussed, most often for recognizing basic emotions (e.g., anger, joy, pleasure, sadness, fear) and focusing on the upper body, hands, and arms.
In a review by Braun et al. [34], the contactless detection of drivers’ emotional states was investigated. The review discussed various methods for emotion recognition, including facial expressions, speech, physiological states, and body gestures, emphasizing the potential for combining these techniques to enhance the accuracy of emotion detection. Differences in body temperature in some facial regions were also mentioned as an indicator of emotional reactions, therefore demonstrating the potential of using thermal imaging technology for emotion detection [39]. Additionally, Soundariya et al. [45] proposed a visual method for emotion recognition: eye-tracking with electrooculography (EOG) signals, which detects changes in eye positions and recognizes affective states using the valence-arousal model and a multi-class SVM classifier. Although this method is not inherently unobtrusive, it can be adapted to reduce its level of intrusion.
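To illustrate the kind of classification step such eye-movement-based approaches rely on, the following minimal sketch trains a multi-class support vector machine on synthetic stand-ins for eye-movement features, labeled with the four valence-arousal quadrants. The feature set and data are illustrative assumptions, not those of [45].

```python
# Minimal sketch (not the method of [45]): multi-class SVM over hypothetical
# eye-movement features, with valence-arousal quadrants as class labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))     # synthetic features (e.g., fixation duration, saccade amplitude, blink rate)
y = rng.integers(0, 4, size=200)  # 4 quadrants of the valence-arousal plane

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))  # near chance here, since the data is synthetic
```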
Lastly, Nocera et al. [40] provided an overview of radar-based methods for the unobtrusive measurement of various signals, which are then used to build machine learning (ML) models. Despite focusing primarily on studies detecting physiological signals, the review also mentioned methods for affective computing. Highlighting the high potential of emotion recognition, the review summarized various use cases from recognizing the emotional response of patients to recognizing the responses of consumers. One of the studies included in this review was conducted by Zeng and Liu [46], who proposed a method for emotion recognition based on millimeter-wave radar. Their unobtrusive method reached an accuracy of 86.50% when detecting emotional states.
In this section, a wide range of methods and sensors for detecting and recognizing various emotions and types of affect were presented. Covering different approaches, some of the studies mentioned relied on both physiological signals and behavioral markers in their research, making it challenging to assess the contribution of the latter. Furthermore, the review by Röddiger et al. [36] focused on investigating wearable sensors, and the unobtrusiveness of these methods is questionable. The wearable sensors are not contactless, but due to their resemblance to earphones (which are used by numerous people every day in the office), they were nevertheless deemed unobtrusive in the context of the workplace and therefore included in our review. Reviewing such methods highlighted the potential and accuracy of unobtrusive techniques for emotion recognition.

4.2. Engagement

We included a total of seven studies focusing on measuring engagement in our review. In the reviewed studies, engagement was most often investigated via its manifestation through facial expressions [47,48,49,50,51,52] and posture [47,48]. Manifestation through hand gestures [47], upper-body motion [53], gaze estimation, and head rotation [52] was also investigated. Accordingly, the most commonly used sensor in these studies was an RGB camera [47,49,50,52], followed by a microphone array [48], wearables with inertial sensors [51], and a depth camera [53].
Ashwin et al. [47] investigated student engagement unobtrusively using an RGB camera. They utilized various features derived from facial expressions, hand gestures, and posture. Employing a convolutional neural network, they classified engagement into four categories, achieving a classification accuracy of 71%. Similarly, Whitehill et al. [49] investigated student engagement using an RGB camera and features based on facial expressions. They tested various ML approaches, and their binary classification algorithm (high vs. low engagement) achieved a classification accuracy comparable to human annotators. The most discriminating feature referred to the in-plane rotation of the face, which was negatively associated with high engagement. The second most discriminating feature referred to the raising of the upper lip (positive correlation with engagement), followed by the features referring to the raising of the inner brow and eye closure (negative correlation with engagement). Aslan et al. [50] proposed another approach using an RGB camera to infer student engagement based on facial expressions. In addition to video data, the approach incorporated student-specific performance data. The authors introduced a platform for monitoring student engagement to aid teachers in making their teaching methods more engaging. The platform monitored engagement at both individual and group levels, proving to be a promising and beneficial tool in supporting teachers’ roles.
Gao et al. [48] introduced another method for monitoring engagement, using a device equipped with a speaker and a microphone array to track user engagement based on facial expressions and emotional gestures (i.e., hand gestures that accompany emotions). The speaker emits near-ultrasound signals (16 kHz to 20 kHz) directed toward the user’s face and upper body, and the reflected echo is received by the microphone array, enabling unobtrusive and contact-free monitoring of the user.
A depth camera is also an effective sensor for monitoring engagement [53]. Huynh et al. demonstrated its use in inferring features related to upper-body motion. They described a protocol for predicting player engagement using data from a mobile phone, depth camera, and wristband. Besides upper-body motion, physiological data and touchscreen events were utilized to predict engagement. Evaluated across six different games, random forest (RF) emerged as the most promising ML approach among the ones investigated, achieving classification accuracies of 85% in cross-sample evaluation and 77% in cross-subject evaluation, thus successfully classifying player engagement into three categories.
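The distinction between cross-sample and cross-subject evaluation mentioned above can be made concrete with a small sketch: a random forest scored once with ordinary k-fold splits and once with subject-grouped splits. The features, labels, and subject assignment below are synthetic placeholders, not the data or protocol of Huynh et al. [53].

```python
# Minimal sketch of per-sample vs. per-subject evaluation for an engagement classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))            # hypothetical upper-body motion features
y = rng.integers(0, 3, size=300)         # three engagement levels
subjects = np.repeat(np.arange(10), 30)  # 10 participants, 30 samples each

clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc_sample = cross_val_score(clf, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
acc_subject = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=subjects)
print("cross-sample accuracy:", acc_sample.mean())
print("cross-subject accuracy:", acc_subject.mean())
```

Cross-subject splits keep every sample from a held-out participant out of training, which typically yields lower but more realistic accuracy than per-sample splits.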
As seen in previous studies, engagement can be monitored not only directly but also indirectly by inferring facial expressions. There is extensive work in this field, but given its indirect connection to engagement, we limit our review to the most relevant studies identified in our search. A study by Verma et al. [51] introduced a system for recognizing 32 facial action units using wearables with inertial sensors, achieving an accuracy of 89.9% using deep learning. Similarly, the work presented by Vedernikov et al. [52] used facial expressions to detect the engagement of participants during online meetings using non-contact sensors (i.e., an RGB camera and a remote pulse oximeter). Vedernikov et al. [52] used a fusion of various physiological and behavioral features to predict engagement, with the latter including facial expressions and motion features such as gaze tracking, gaze direction, eye landmarks, and head rotation. Using an ensemble of KNN and RF, their model predicted engagement with an accuracy of 96% when using the fusion of both types of features.
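As a rough illustration of the ensemble idea used by Vedernikov et al. [52], the sketch below combines k-nearest neighbors and a random forest in a soft-voting ensemble over a hypothetical fused feature matrix; it is a generic scikit-learn example, not their implementation.

```python
# Minimal sketch of a KNN + RF ensemble over fused behavioral/physiological features (assumed).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(240, 12))    # hypothetical fused feature vectors
y = rng.integers(0, 2, size=240)  # engaged vs. not engaged

ensemble = VotingClassifier(
    estimators=[("knn", KNeighborsClassifier(n_neighbors=5)),
                ("rf", RandomForestClassifier(n_estimators=100, random_state=0))],
    voting="soft",  # average predicted probabilities of both models
)
print("CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```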
As outlined in the previous paragraphs, various methods and sensors are suitable for measuring and predicting engagement. However, most available studies focus on investigating engagement in specific scenarios (e.g., in the classroom or while watching a video clip), raising questions about the generalizability of these results to different contexts. This also prompts inquiry into the specificity of engagement as a construct, questioning whether employee engagement significantly differs from student engagement. Furthermore, most of the aforementioned studies focus on detecting behavioral engagement, focusing less on the emotional and cognitive aspects of it, therefore deviating slightly from our definition of engagement presented in Section 2. Nonetheless, the presented methods allow for the unobtrusive and comprehensive measurement of engagement, and they can be easily adapted for use in the workplace, making them relevant for the aim of this review.

4.3. Fatigue

Our search revealed that the literature often lacks clear distinctions between different types of fatigue. Additionally, there is sometimes a poor distinction between different constructs, e.g., mental fatigue, cognitive load, and effort. We reviewed two studies that focused on measuring and/or predicting mental fatigue using unobtrusive sensors. The methods used relied on an analysis of blinking behavior, namely the blink rate, duration, pupil measurement, gaze point, etc. [54]. Another relevant method found in the literature relied on an analysis of breathing frequency and respiratory cycle duration [55]. The sensors used in these works were an eye tracker [54] and a radar [55].
Li et al. [54] investigated the accuracy of various ML approaches in predicting the mental fatigue of construction industry operators. They used eye-movement data obtained with a wearable eye tracker and self-report questionnaires as ground truth for training the models. The proposed approach resulted in the highly accurate classification (accuracies between 79.5% and 85.0%) of operators’ mental fatigue into three distinct categories. Among the evaluated methods, support vector machine (SVM) and linear discriminant analysis (LDA) yielded the most accurate results.
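A minimal sketch of this kind of model comparison is given below: SVM and LDA are evaluated with cross-validation on synthetic stand-ins for eye-movement features and three fatigue levels. It illustrates the evaluation pattern only, not the pipeline or data of [54].

```python
# Minimal sketch: comparing SVM and LDA for three-class mental fatigue classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(180, 4))     # hypothetical features (blink rate, blink duration, pupil diameter, fixation duration)
y = rng.integers(0, 3, size=180)  # low / medium / high mental fatigue (from self-reports)

for name, model in [("SVM", make_pipeline(StandardScaler(), SVC())),
                    ("LDA", LinearDiscriminantAnalysis())]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```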
Furthermore, Turetskaya et al. [55] investigated the differences in respiratory patterns detected by two radars under different conditions: resting state and mental fatigue/effort. Although the study did not directly investigate mental fatigue, the proposed methods remain relevant for exploring it, provided they are appropriately adapted. The authors of [55] used bioradiolocation, a non-contact radar method for detecting organisms, to gather data. Their study concluded that significant differences in breathing patterns exist between the two conditions, making the proposed method suitable for detecting respiratory pattern changes due to mental effort.
Our search revealed a variety of unobtrusive methods available for measuring mental fatigue, as summarized in the previous paragraphs. However, there is a poor distinction between the different types of fatigue. Many studies from our initial keyword search focused on physical fatigue [56] or driver fatigue/drowsiness [57,58]. Adão Martins et al. [29] conducted a literature review on monitoring fatigue using wearable sensors and similarly found a low number of studies investigating mental fatigue (8 out of 59). These studies spanned various domains such as healthcare and transportation, using diverse fatigue-inducing tasks and types of input data, achieving generally good model performance.

4.4. Stress

Stress is a construct often present in the literature, and there are three main methods used to measure stress [59]: self-reports, either using standardized questionnaires, interviews, or diaries; observational methods of behavior and emotional expressions; and measurements of physiological correlates.
Stress physiology has been of central importance ever since the introduction of the term [60] and plays an important role in automatic stress detection (see related reviews [61,62,63,64,65]), especially since it is seen as an objective measure of stress. As such, we can reduce the problem of unobtrusive stress monitoring to the non-intrusive detection of vital signs. As a good example of an overview of one such technique, namely millimeter-wave sensing technologies, we refer the reader to the work of Wu et al. [66]. However, using physiological markers to monitor stress is often affected by motion artifacts, making it challenging to implement in workplace settings [65].
Due to the limitations of the unobtrusive monitoring of physiological signals described in Section 1, in this work, we focus instead on behavioral manifestations of stress, which are crucial, since they can mediate affective and physiological responses [59]. They are often framed in terms of coping with stress and thus alleviating it, but some dysfunctional behaviors, such as drinking, smoking, or aggressive behaviors, can even exacerbate stress. Behavioral stress responses have received less attention in stress detection studies (see [61]) but lend themselves well to unobtrusive monitoring. In fact, observational methods are widely used by psychologists when studying the stress responses of very young children and adults with cognitive impairments. When traditional pen-and-paper techniques are employed for this purpose, observation is an expensive and time-consuming method. This highlights the motivation for developing unobtrusive and automatic monitoring methods for behavioral stress responses.
We identified three studies that approached stress detection by unobtrusively observing people’s behaviors. The reviewed studies employed a variety of methods, relying on body movements [67], activity information [68], communication patterns, phone usage, and location [69]. The sensors used included a millimeter-wave sensor [67], a WiFi-based localization system [68], and a smartphone [69].
Using a millimeter-wave sensor, Ha et al. [67] implemented a system to monitor stress in a contact-free manner. Their approach was based on both physiological signals and behavioral markers. Besides measuring heart rate variability and breathing rate, they also included body movements as features that captured activities such as shaking the leg or stretching the neck. These features were described by the movement intensity, number of high-activity occurrences, and mean intensity of high activity. Motion features were used alongside heartbeat interval and respiratory features to predict stress levels. This approach allowed for a classification of stress into three classes, reaching a median accuracy of over 84% when trained and tested across different subjects. Zakaria et al. [68] made use of a passive WiFi-based localization system to track students’ locations and the presence of other people. With these features, they heuristically mapped the locations to activities, broadly classified into work, non-work, and group activities. In their main study, they compared the logistic regression (LR), SVM, and RF classifiers. Among the three classifiers, RF achieved the highest area under the ROC curve (AUC) score of 0.97, demonstrating that activity information can be successfully used to differentiate between severely and normally stressed students. However, the subsequent validation study encountered numerous false positives, likely due to the low number of severely stressed participants. Nosakhare and Picard [69] used a mobile application in addition to a smartwatch to determine communication patterns (call and SMS exchanges), phone usage, and locations (time spent on campus, indoors, and outdoors). They used these features to uncover latent behavioral patterns using supervised latent Dirichlet allocation (sLDA), from which they were able to detect stress. Using sLDA, they were able to classify low vs. high stress levels with an accuracy of approximately 60%.
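To make the classifier comparison in [68] concrete in a generic way, the sketch below scores logistic regression, SVM, and random forest by cross-validated ROC AUC on synthetic activity features with a binary stress label; the feature names and values are illustrative assumptions, not the features of the cited study.

```python
# Minimal sketch: comparing LR, SVM, and RF by ROC AUC for binary stress classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(150, 3))     # hypothetical weekly hours of work, non-work, and group activity
y = rng.integers(0, 2, size=150)  # severely vs. normally stressed

models = {
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True)),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.2f}")
```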
In the studies described, body movements, student activity derived from locations, phone usage, and communication patterns were all behaviors that were monitored and related to stress. It should be noted, however, that the relationship between stress and behavior is highly dependent on the population being studied. While predicting stress based on physiological signals could offer more opportunities for generalization, the same cannot be concluded for behavioral markers. For example, the features in [68,69] were developed with students as participants and would be difficult to generalize to other populations. Some behaviors, such as phone usage, might show more consistent patterns, while others, such as student activity derived from locations, are only applicable to the particular population with whom the method was developed. Thus, it is paramount to consider whether a feature or method is appropriate for the population or the culture it is employed in [59].

5. Physical Well-Being

5.1. Physical Comfort

Our search initially revealed several methods for the unobtrusive measurement of employee physical comfort. We discovered a high number of studies focusing on health monitoring and various physiological signals, but the prevalence of research specifically addressing employee physical comfort remained rather low. This resulted in a decrease in the number of relevant studies compared to the initial pool of identified studies. We included three studies that focused on measuring physical comfort in our review, even though one study only partially met our criteria for inclusion. Specifically, the method was not entirely non-contact, and its unobtrusiveness was not clear-cut.
Two of the reviewed studies used posture as a descriptor of physical comfort [31,70], and one study used repetitive strain injuries of the wrists and elbows [71]. The sensors used included inclinometers [70], an RGB camera [31], and resistive flex sensors [71].
Olsen et al. [70] proposed a system for monitoring the posture of dentists using three inclinometers positioned on the dentist’s coat to ensure unobtrusiveness. The system required individual calibration and was assessed in a user study in which multiple ML algorithms were used for posture classification. The k-nearest neighbor (kNN) algorithm was reported as being the most suitable approach in terms of accuracy and other characteristics, namely simplicity and speed.
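A minimal, hypothetical sketch of such posture classification is given below: a k-nearest-neighbor classifier applied to synthetic inclinometer angles. The angle features and posture categories are illustrative assumptions, and the per-user calibration of [70] is not modeled.

```python
# Minimal sketch: kNN posture classification from hypothetical inclinometer angles.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.uniform(-45, 45, size=(200, 3))  # trunk flexion, lateral bend, head tilt (degrees, assumed)
y = rng.integers(0, 3, size=200)         # e.g., neutral / mildly awkward / awkward posture

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
print("CV accuracy:", cross_val_score(knn, X, y, cv=5).mean())
```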
Chen et al. [31] proposed a system for monitoring the health and well-being of employees using an RGB camera, offering personalized recommendations to improve well-being and productivity. The system addressed multiple aspects of well-being, including posture. It utilized semi-Markov models to learn user behavior, habits, and preferences, after which ergonomic recommendations were given. Consisting of two layers, the model incorporated the influence of contextual information on the current states, thereby improving its ability to accurately identify user behavior and deliver more relevant recommendations.
Lastly, Wac et al. [71] proposed a prototype for monitoring and assessing users’ hand and elbow movements to prevent repetitive strain injuries, which are increasingly common due to the rise in office work. Three resistive flex sensors were implemented in a sleeve—two of them on the wrist and one on the elbow. Although this was only preliminary work, the initial evaluation showed that 91% of users would use the system again. Although the system was not non-contact, as it was implemented in a sleeve, its obtrusiveness can be debated. Nevertheless, due to its high relevance to the context of knowledge workers, we include it in our review.
In summary, there are few studies investigating the measurement of employee physical comfort, as outlined in the previous paragraphs. Despite not being entirely unobtrusive, the presented methods are beneficial for measuring the physical comfort of knowledge workers.

5.2. Sleep Deprivation

Among the articles initially identified, eight were relevant. The reviewed studies mainly focused on detecting a lack of sleep, either as micro-sleep events or drowsiness during a specific activity [57,72,73,74]. This was often measured through eye blinking [72,73,74,75], facial expressions with an emphasis on yawning [72,73,74], and nodding [74]. For these manifestations, an RGB camera was the most utilized sensor. Despite our focus on behavioral markers, we included studies that complement them with physiological signals due to their high relevance, similar to the approach taken in Section 4.1.
Jan et al. [72] developed a method for detecting driver drowsiness, in which they recorded ten drivers with two RGB cameras—one facing the driver and the other facing the road. They concluded that the behaviors that can define a driver’s behavior or drowsiness are closing the eyes (for at least 3 s), turning the head or distractions, yawning, using a smartphone, crossing lanes, and being too close to another car. Their approach, evaluated using jackknife cross-validation, resulted in a mean accuracy of 88.75%.
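The jackknife-style evaluation reported above corresponds to holding out one driver at a time. The sketch below shows this leave-one-group-out pattern on synthetic data; it is not a reconstruction of the features or model of [72].

```python
# Minimal sketch: leave-one-driver-out (jackknife-style) evaluation of a drowsiness classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(6)
X = rng.normal(size=(400, 6))           # hypothetical features (eye closure, yawning, head turns, ...)
y = rng.integers(0, 2, size=400)        # drowsy vs. alert
drivers = np.repeat(np.arange(10), 40)  # ten drivers, 40 windows each

scores = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                         cv=LeaveOneGroupOut(), groups=drivers)
print("mean accuracy across held-out drivers:", scores.mean())
```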
Next, Soleymanpour et al. [57] compared behavior to physiological signals for detecting driver drowsiness. They monitored metrics like the percentage of eyelid closure (PERCLOS) and sleepiness ratings via the Karolinska Sleepiness Scale (KSS) and observed yawning and head movements with a front RGB camera. Additionally, they used a Neuro-Bio Monitor (NBM) to detect driver drowsiness using non-intrusive sensors located in the headrest. The NBM detects the brain’s electromagnetic field from neuronal activity and detects fatigue thresholds by analyzing the ratio of theta and alpha to beta brainwave activity. In the evaluations, the authors achieved an accuracy of 78.79% and a detection rate of 95% compared to the KSS. The NBM also exhibited trends similar to the PERCLOS and blink-duration measurements, indicating the comparable performance of behavioral markers and physiological signals. In the study, eye-tracking glasses were utilized to measure the PERCLOS, which raises concerns about the method’s unobtrusiveness. However, we included this study in our review due to its potential for adaptation to a less intrusive approach. Additionally, the other behavioral markers examined were derived using unobtrusive methods, ensuring the overall relevance and applicability of the findings to the focus of our review.
Yamamoto et al. [75] introduced a novel method for estimating blink duration using a Doppler sensor, which is a device that uses the Doppler effect to measure the velocity and movement of objects based on changes in frequency or wavelength, and spectrogram analysis, focusing on the dynamics of eyelid movement. The method detects eyelid movements using Doppler-shifted microwaves, avoiding the privacy and low-light issues of camera-based systems. The system uses a Doppler sensor to monitor the speed and direction of eyelid movements, with the blink duration estimated from the timing differences between the onset of eyelid-closing and eyelid-opening energies on the spectrogram. The method successfully detected drowsiness, achieving an average root mean squared error (RMSE) of 49.3 milliseconds across the tests.
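For reference, the reported error metric can be computed as follows; the blink-duration values in the sketch are invented purely for illustration.

```python
# Minimal sketch: RMSE between estimated and reference blink durations (illustrative values).
import numpy as np

estimated = np.array([210.0, 305.0, 180.0, 250.0])  # e.g., from spectrogram timing (assumed)
reference = np.array([230.0, 290.0, 200.0, 240.0])  # e.g., from a camera-based ground truth (assumed)
rmse = np.sqrt(np.mean((estimated - reference) ** 2))
print(f"RMSE = {rmse:.1f} ms")
```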
Zhang et al. [73] proposed another method for detecting driver drowsiness using second-order blind identification, relying on smartphone-based videos. This method simultaneously separates multiple physiological sources, such as blood volume pulse, and certain behaviors, such as eye blinks and yawning, directly from facial video streams without needing detailed localization of the eyes or mouth. Additionally, by analyzing these separated signals in parallel, driver drowsiness is determined. The research confirmed that integrating multiple physiological signals and behaviors using this method effectively enhances the accuracy and reliability of drowsiness detection.
Another method combining behavioral markers and physiological signals was introduced by Kundinger et al. [74], who utilized an RGB camera to obtain data on micro-sleep events and observer ratings to assess driver drowsiness. Trained observers analyzed video recordings to rate the driver’s level of drowsiness on a six-item scale based on facial expressions and visible behaviors such as eye closure and nodding off. Concurrently, micro-sleep events were identified using an image-processing method when the driver’s eyes closed for at least one second, indicating significant lapses in attention. The observer ratings and detected micro-sleep events were then combined into an assessment of the drowsiness level of each participant. Additionally, heart rate data was collected from wrist-worn sensors and an electrocardiogram. Using the assessment of the drowsiness level as ground truth, a binary classification of drowsiness (drowsy vs. not drowsy) was performed based on physiological signals. A range of classifiers, including RF, random tree, decision stump, decision table, and k-nearest neighbor, were compared for developing both person-specific and person-independent models. The results indicated that the person-specific models generally achieved higher accuracy, with some classifiers reaching approximately 90%. This suggests that tailoring models to individual users can significantly enhance classification performance, highlighting the importance of personalized approaches in achieving optimal accuracy.
Our review of methods for the unobtrusive measurement of sleep deprivation revealed a lack of diversity in studies that focus on this construct. The majority of articles identified in the initial search predominantly examined sleep quality recorded during the night. These methods cannot be directly applied to the workplace environment, as people do not sleep during work hours. Therefore, we decided not to include such studies in our review. Moreover, a large body of literature on driver drowsiness was examined, which primarily involved recording participants for any signs of sleepiness or drowsiness. These findings are promising for potential application to workplace research, suggesting that similar methods could be adapted to measure employee alertness.

6. Social Well-Being

Social Interactions

From the initial search, six studies were found to be relevant, as they focused on using unobtrusive sensors to track and record social interactions. Most of the reviewed studies utilized microphones to record different types of interactions. The reviewed studies predominantly focused on detecting spoken conversations and analyzing aspects such as their content and duration [76,77,78], frequency [79], and the frequency of different content categories [79]. Moreover, some studies investigated texting and calling, where the frequency and duration of calls [80,81] and the number of texts [81] were recorded. Consequently, the two devices most often used were microphones [76,77,78,79] and smartphones [80,81].
Tan et al. [79] investigated team communication as a measure of cohesion in pre-formed League of Legends teams. Using the microphones in participants’ smartphones, they recorded voice communication and analyzed two measures: communication frequency (words per minute) and category frequency (instances per category per minute). Categories included commands, suggestions, disagreements, social interactions, encouragement, and emotional expressions. The results indicated that word frequency and the sequence of communication categories could serve as proxies for team cohesion, with social interactions emerging as the key predictor.
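Both communication measures can be computed directly from an annotated, time-stamped transcript. The sketch below uses an invented toy transcript and category labels purely to show the arithmetic; it is not the processing pipeline of [79].

```python
# Minimal sketch: communication frequency (words per minute) and category frequency
# (instances per category per minute) from a hypothetical annotated transcript.
from collections import Counter

utterances = [  # (minute, text, category) -- illustrative data only
    (0.5, "push mid now", "command"),
    (1.2, "nice one", "encouragement"),
    (2.0, "maybe we should ward here", "suggestion"),
    (3.4, "that was so funny", "social"),
]
session_minutes = 5.0

words_per_minute = sum(len(text.split()) for _, text, _ in utterances) / session_minutes
category_per_minute = {cat: n / session_minutes
                       for cat, n in Counter(c for _, _, c in utterances).items()}
print(words_per_minute, category_per_minute)
```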
Jo et al. [77] developed an application called MAMAS for monitoring parent–child mealtime conversations and food intake. The MAMAS app records a dialogue between a parent and child using the phone’s microphone and creates an analysis report based on the interactions, using speech recognition software and a self-report survey. Bi et al. [78] similarly used microphones, among other sensors, to log conversations and other activities during family mealtimes.
Shash et al. [76] extracted voice conversations from the phone’s microphone data and used them to infer the duration of students’ social interactions during the day and to assess their behavioral patterns involving smartphones.
Furthermore, in the study by Sefidgar et al. [80], social interactions were tracked using phone call data to understand the short-term behavioral impact of discrimination. The researchers analyzed the number and duration of incoming, outgoing, and missed calls, leveraging these metrics as one of the indicators of social support and interactions.
Similarly, Maxhuni et al. [81] used smartphones to continuously record and classify human voice interactions and to analyze phone call logs and SMS logs. They examined the number, duration, and frequency of calls and messages to investigate their correlation with perceived stress levels, aiming to enhance stress prediction models based on objective smartphone data.
In summary, our review identified many studies that focused on detecting and logging social interactions, predominantly using (smartphone) microphones. These studies employed various methods, such as analyzing voice conversations, phone call data, and SMS (text message) logs to infer social behaviors and psychological states. Despite the diversity in the methods, the (smartphone) microphone consistently proved to be a crucial tool, helping achieve state-of-the-art performance in these studies. It is important to note, however, that some of the approaches discussed are still in the early stages of development. Additionally, the complexity and diversity of social interactions make this area of research particularly challenging. As a result, providing a straightforward overview of the algorithms and methods employed was not always feasible. These challenges highlight the need for continued investigation in this field to better understand and model social dynamics. Furthermore, our review found no studies that specifically addressed the detection of social interactions in the workplace, but the approaches discussed could be adapted and applied in the context of the workplace.

7. Privacy-Aware Sensing

Data privacy has long been a primary concern in mobile sensing systems, as the use of unobtrusive sensing technologies, such as video and audio data collection methods, can significantly intrude on users’ privacy [82]. One of the main issues is that, despite their convenience, physically unobtrusive technologies often collect personal and sensitive data, such as video and audio recordings. For the purpose of analyzing well-being in research, users are typically more willing to participate in controlled studies but may hesitate when asked to participate in field studies. Thus, to develop a well-being measurement approach that users are likely to accept, experimenters must establish trust through the implementation of robust privacy measures. A critical first step in ensuring privacy is risk assessment, which identifies the primary risks and vulnerabilities of sensing systems and guides the prioritization of privacy measures. These measures typically require ad hoc customization, as privacy is both user- and context-dependent, meaning that what one user considers private may not be a concern for another [83,84]. To improve user privacy, particularly in safeguarding third parties, several methods have been developed, including statistical and ML approaches, to automatically detect privacy-sensitive situations and ensure that only users with valid consent are sensed. These techniques also allow users to set their preferences regarding when sensing occurs, such as only during video calls or when actively using the device. Privacy-sensitive situations can be detected using methods like identifying sensitive scenes in videos (e.g., bathrooms) [85,86], sound shredding or subsampling for audio [87], and applying differential privacy [88] to eye-tracking data [88,89]. In recent years, privacy-by-design approaches have also included deep learning-based strategies, such as encoding raw data in non-interpretable, low-dimensional representations (known as embeddings) or using federated learning [90], where user data is stored in model weights rather than in raw format. This ensures that user identification remains infeasible even when data is processed by state-of-the-art deep learning technologies.
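As one concrete example of the privacy techniques mentioned above, the sketch below applies the Laplace mechanism of differential privacy to an aggregate gaze statistic before release. The statistic, sensitivity, and privacy budget are illustrative assumptions, not values from [88,89].

```python
# Minimal sketch: Laplace mechanism for differentially private release of an aggregate statistic.
import numpy as np

def laplace_mechanism(value: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Add Laplace noise with scale = sensitivity / epsilon to a single numeric query result."""
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(7)
mean_fixation_ms = 243.0  # hypothetical aggregate gaze statistic to be released
noisy = laplace_mechanism(mean_fixation_ms, sensitivity=5.0, epsilon=1.0, rng=rng)
print(noisy)
```

Smaller epsilon values give stronger privacy guarantees at the cost of noisier released statistics.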

8. Proposed Setup

As highlighted in the previous sections, a variety of methods exist for measuring different sub-dimensions of well-being. Based on our analysis, we selected the most frequently used methods from Table 2 that are suitable for measuring well-being in the workplace without interfering with employees’ work. The most suitable setup is presented in Figure 3. It is composed of an RGB camera with a microphone, an eye tracker, and a radar. The RGB camera and microphone can be utilized to measure engagement, physical comfort, sleep deprivation, social interactions, and affect. The eye tracker and radar can be utilized to measure fatigue. Furthermore, data collection from the individual’s smartphone should be included with the aim of measuring stress.
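For clarity, the proposed mapping between sensors and sub-dimensions can be written down as a simple lookup structure. The sketch below mirrors the text above and Figure 3; the encoding itself is only an illustration, not part of an implemented system.

```python
# Illustrative encoding of the proposed setup as a sensor-to-sub-dimension map.
PROPOSED_SETUP = {
    "rgb_camera_with_microphone": ["engagement", "physical_comfort",
                                   "sleep_deprivation", "social_interactions", "affect"],
    "eye_tracker": ["fatigue"],
    "radar": ["fatigue"],
    "smartphone": ["stress"],
}

def sensors_for(sub_dimension: str) -> list[str]:
    """Return the sensors in the proposed setup that cover a given sub-dimension."""
    return [sensor for sensor, dims in PROPOSED_SETUP.items() if sub_dimension in dims]

print(sensors_for("fatigue"))  # ['eye_tracker', 'radar']
```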

9. Discussion

As highlighted in this review, measuring work-related well-being using unobtrusive sensors is a promising field, yet it is still in its early stages of development. In this article, we aimed to provide a framework to guide research in this field. As outlined in Section 1, there is a lack of consensus among professionals regarding the definition of well-being [14], which is also often an issue in the broader field of ML in mental health [91]. Due to this gap, we decided to use the definition provided by the WHO [13] and refine it based on the relevant literature, as presented in Section 2. By explicitly defining what we mean by well-being and corroborating it with related works, we showed how insights from studies across various domains can be combined to contribute to the measurement of well-being. Our definition aims to provide a foundation for developing methods that effectively capture work-related well-being in a non-invasive manner, thereby advancing the field and improving practical applications in workplace settings.
The sensors and behavioral markers used to monitor each sub-dimension of well-being we defined are summarized in Table 2, illustrating various approaches used to infer well-being from behavior. The most commonly used sensors include different types of cameras (RGB, depth), microphones, and smartphones. These sensors capture a range of behavioral markers, with the most common being facial expressions, posture, and body movements.
There are several limitations of our work. Specifically, we did not follow robust guidelines for reporting systematic literature reviews, such as the PRISMA 2020 guidelines [92], which are widely used in the literature to ensure that systematic reviews are valuable, transparent, and complete. Although this might have influenced our analysis, adhering to these guidelines was challenging due to the lack of consensus regarding the definition of well-being and the scarcity of relevant articles. Consequently, we could not always strictly apply our inclusion criteria (see Section 3 above). We faced challenges regarding the unobtrusiveness of certain methods and included some older articles if we deemed them still highly relevant. While we focused on behavioral markers, many studies also included physiological signals. In those cases, we reported both to give a complete picture, which sometimes made it difficult to highlight the specific impact of behavioral markers. Additionally, workplace-specific studies were limited, so we examined studies from other settings that could be adapted to the context of knowledge workers.
Thieme et al. [91] pointed out that most studies in the field of ML in mental health are proof-of-concept studies that “focus on technical or algorithmic development of (initial) ML models” ([91], p. 21). The same can be said of the studies included in our review. This means that the insights should be considered preliminary and are not necessarily immediately applicable to real-world scenarios.
Another challenge in the field of unobtrusive well-being measurement is that, to the best of our knowledge, most studies tend to investigate only specific aspects of the construct, rarely approaching it in a holistic manner. Given that well-being consists of a broad range of states [93], this narrow focus may overlook important dimensions that contribute to its comprehensive understanding. Adopting a more holistic approach could lead to a more accurate and complete measurement of the construct, capturing the full spectrum of factors that influence it.

10. Conclusions

Our review is an initial attempt to explore existing methods for measuring work-related well-being in an unobtrusive, privacy-aware, and holistic manner. Despite the challenges posed by an ill-defined and complex field, as mentioned above, we provide an overview of possible approaches for addressing this issue. Our findings suggest that a wide variety of methods can provide insight into well-being without disrupting daily work activities.
However, the field remains in an early stage, with most studies focusing on isolated well-being components (e.g., stress or fatigue), often within narrow experimental contexts. Current research, therefore, rarely approaches well-being holistically, which limits its applicability to real-world organizational settings. With our review, we hope to facilitate the establishment of integrated, multi-dimensional frameworks and cross-disciplinary collaboration between psychology and computer science. Future work in this field should aim to validate existing methods and approaches across diverse contexts, with the aim of making them more applicable and generalizable. Furthermore, diverse cultural backgrounds should also be considered. More longitudinal studies would also be beneficial for obtaining a more in-depth understanding of the studied constructs.
In terms of broader implications, our work provides a framework that can guide the future development of mature, privacy-aware, and unobtrusive systems, as this could facilitate the early detection (and possible prevention) of decline in employee well-being. Considering the unobtrusiveness of the proposed framework, our approach could also be extended to studies involving vulnerable populations, such as individuals with physical, cognitive, or other disabilities. Because the framework and the proposed setup rely on unobtrusive sensing that does not require active user participation or the wearing of any devices, it is particularly well-suited for populations who may have reduced mobility or increased sensitivity to monitoring technologies. Therefore, this approach can facilitate the inclusive research of vulnerable populations and enable well-being assessment in contexts where traditional methods may be impractical or ethically problematic. Future studies are necessary, however, to explore how such unobtrusive methods can be tailored to accommodate the specific needs of various populations.
Our work also lays the foundation for applications in the increasingly common context of remote work. Naturally, such applications require careful ethical consideration to avoid the risk of worker exploitation or excessive pressure to boost productivity. Nonetheless, they have the potential to enhance employee well-being, particularly in remote settings where reduced direct contact gives supervisors limited insight into workers’ mental (and physical) states. This underscores the value of unobtrusive sensing technologies, which can support the remote monitoring of well-being without being invasive. Recent studies, such as [94], have demonstrated how sensors can be integrated into home environments for non-intrusive monitoring. Although applied primarily in clinical populations, such studies indicate promising directions for future applications.
To conclude, by laying this foundation, we aim to facilitate further research into unobtrusive and user-centric well-being measurement methods. As workplace well-being becomes an increasingly critical concern for both employees and employers, scalable technological solutions will be essential for fostering sustainable and healthy work environments.

Author Contributions

Conceptualization, Z.A., K.Ž. and J.L.; methodology, Z.A., K.Ž. and J.L.; validation, J.L.; investigation, Z.A., K.Ž. and J.L.; writing—original draft preparation, Z.A., K.Ž., J.L. and P.B.; writing—review and editing, P.B., G.S., M.L. (Mohan Li), M.G., M.E.D., S.T., M.L. (Mitja Luštrek) and M.L. (Marc Langheinrich); visualization, Z.A. and J.L.; supervision, M.L. (Mitja Luštrek) and M.L. (Marc Langheinrich); project administration, P.B., G.S., M.G., M.L. (Mitja Luštrek) and M.L. (Marc Langheinrich); funding acquisition, P.B., G.S., M.G., M.L. (Mitja Luštrek) and M.L. (Marc Langheinrich). All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the TRUST-ME project, jointly supported by the Swiss National Science Foundation (SNSF) (grant agreement 205121L_214991) and the Slovenian Agency of Research and Innovation (ARIS) (grant agreement N1-0319). Dr. Gjoreski’s work was funded by the SNSF through the XAI-PAC project (PZ00P2_216405).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created, as this is a review study.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
APA	American Psychological Association
ML	machine learning
PERCLOS	percentage of eyelid closure
RGB	red, green, and blue
WHO	World Health Organization

Appendix A

Table A1. Keywords used in paper selection.
Sub-Dimension | Keywords Used
Physical comfort | (“comfort” OR “physical comfort”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”))
Sleep deprivation | (“sleepiness” OR “sleep deprivation”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”))
Engagement | (“engagement”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”))
Affect | (“emotions” OR “emotion” OR “affect” OR “affects”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”)) AND (“review” OR “literature review” OR “survey”)
Fatigue | (“fatigue”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”))
Stress | (“stress”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”)) AND (“review” OR “literature review” OR “survey”)
Social interactions | (“social interaction” OR “social relations” OR “relationships”) AND ((“unobtrusive” OR “non-contact” OR “contact-free” OR “contact free”) AND (“sensors” OR “sensing”))
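For illustration, the queries above share a common structure; the following short Python sketch reconstructs them from a list of sub-dimension terms (the helper function and its names are our own and were not part of the original search procedure):

# Shared clauses from Table A1.
UNOBTRUSIVE_CLAUSE = ('(("unobtrusive" OR "non-contact" OR "contact-free" OR "contact free") '
                      'AND ("sensors" OR "sensing"))')
REVIEW_CLAUSE = '("review" OR "literature review" OR "survey")'

def build_query(terms, restrict_to_reviews=False):
    # Combine sub-dimension terms with the shared unobtrusive-sensing clause.
    quoted_terms = " OR ".join('"{}"'.format(term) for term in terms)
    query = "({}) AND {}".format(quoted_terms, UNOBTRUSIVE_CLAUSE)
    if restrict_to_reviews:
        query += " AND {}".format(REVIEW_CLAUSE)
    return query

print(build_query(["stress"], restrict_to_reviews=True))  # reproduces the Stress row of Table A1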

References

  1. American Psychological Association. Well-being. In APA Dictionary of Psychology; American Psychological Association: Washington, DC, USA, 2018. [Google Scholar]
  2. Wijngaards, I.; King, O.C.; Burger, M.J.; van Exel, J. Worker well-being: What it is, and how it should be measured. Appl. Res. Qual. Life 2021, 17, 795–832. [Google Scholar] [CrossRef]
  3. Grawitch, M.; Gottschalk, M.; Munz, D. The Path to a Healthy Workplace A Critical Review Linking Healthy Workplace Practices, Employee Well-being, and Organizational Improvements. Consult. Psychol. J. Pract. Res. 2006, 58, 129–147. [Google Scholar] [CrossRef]
  4. Isham, A.; Mair, S.; Jackson, T. Worker wellbeing and productivity in advanced economies: Re-examining the link. Ecol. Econ. 2021, 184, 106989. [Google Scholar] [CrossRef]
  5. Pradhan, R.K.; Hati, L. The Measurement of Employee Well-being: Development and Validation of a Scale. Glob. Bus. Rev. 2022, 23, 385–407. [Google Scholar] [CrossRef]
  6. Czerw, A. Diagnosing Well-Being in Work Context—Eudemonic Well-Being in the Workplace Questionnaire. Curr. Psychol. 2019, 38, 331–346. [Google Scholar] [CrossRef]
  7. Chari, R.; Sauter, S.L.; Petrun Sayers, E.L.; Huang, W.; Fisher, G.G.; Chang, C.C. Development of the National Institute for Occupational Safety and Health Worker Well-Being Questionnaire. J. Occup. Environ. Med. 2022, 64, 707–717. [Google Scholar] [CrossRef]
  8. Guo, Y.; Liu, X.; Peng, S.; Jiang, X.; Xu, K.; Chen, C.; Wang, Z.; Dai, C.; Chen, W. A review of wearable and unobtrusive sensing technologies for chronic disease management. Comput. Biol. Med. 2021, 129, 104163. [Google Scholar] [CrossRef]
  9. Slapničar, G.; Wang, W.; Luštrek, M. Feasibility of Remote Blood Pressure Estimation via Narrow-band Multi-wavelength Pulse Transit Time. ACM Trans. Sen. Netw. 2024, 20, 77. [Google Scholar] [CrossRef]
  10. Carucci, K.; Toyama, K. Making Well-being: Exploring the Role of Makerspaces in Long Term Care Facilities. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; CHI ’19. pp. 1–12. [Google Scholar] [CrossRef]
  11. Ola, O.; Harrington, B. Exploring Lightweight Practices to Support Students’ Well-being. In Proceedings of the 53rd ACM Technical Symposium on Computer Science Education V. 2, Providence, RI, USA, 3–5 March 2022; SIGCSE 2022. pp. 1070–1071. [Google Scholar] [CrossRef]
  12. Wallace, B.; Larivière-Chartier, J.; Liu, H.; Sloan, T.; Goubran, R.; Knoefel, F. Frequency Response of a Novel IR Based Pressure Sensitive Mat for Well-being Assessment. In Proceedings of the 2020 IEEE 20th International Conference on Bioinformatics and Bioengineering (BIBE), Cincinnati, OH, USA, 26–28 October 2020; pp. 481–486. [Google Scholar] [CrossRef]
  13. World Health Organization. Constitution of the World Health Organization; World Health Organization: Geneva, Switzerland, 1995. [Google Scholar]
  14. Dodge, R.; Daly, A.; Huyton, J.; Sanders, L. The challenge of defining wellbeing. Int. J. Wellbeing 2012, 2, 222–235. [Google Scholar] [CrossRef]
  15. Diener, E. Subjective well-being. Psychol. Bull. 1984, 95, 542. [Google Scholar] [CrossRef]
  16. Fisher, C. Conceptualizing and Measuring Wellbeing at Work. In Work and Wellbeing; Wiley Blackwell: Hoboken, NJ, USA, 2014; pp. 1–25. [Google Scholar] [CrossRef]
  17. Sonnentag, S. Dynamics of well-being. Annu. Rev. Organ. Psychol. Organ. Behav. 2015, 2, 261–293. [Google Scholar] [CrossRef]
  18. Ryff, C.D.; Keyes, C.L.M. The structure of psychological well-being revisited. J. Personal. Soc. Psychol. 1995, 69, 719. [Google Scholar] [CrossRef] [PubMed]
  19. Ryff, C.D. Happiness is everything, or is it? Explorations on the meaning of psychological well-being. J. Personal. Soc. Psychol. 1989, 57, 1069. [Google Scholar] [CrossRef]
  20. Russell, J. A Circumplex Model of Affect. J. Personal. Soc. Psychol. 1980, 39, 1161–1178. [Google Scholar] [CrossRef]
  21. Bakker, A.B.; Demerouti, E. Towards a model of work engagement. Career Dev. Int. 2008, 13, 209–223. [Google Scholar] [CrossRef]
  22. American Psychological Association. Stress. In APA Dictionary of Psychology; American Psychological Association: Washington, DC, USA, 2018. [Google Scholar]
  23. Seligman, M.E. Positive health. Appl. Psychol. 2008, 57, 3–18. [Google Scholar] [CrossRef]
  24. Kölsch, M.; Beall, A.C.; Turk, M. An objective measure for postural comfort. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; SAGE Publications Sage CA: Los Angeles, CA, USA, 2003; Volume 47, pp. 725–728. [Google Scholar]
  25. Orzeł-Gryglewska, J. Consequences of sleep deprivation. Int. J. Occup. Med. Environ. Health 2010, 23, 95–114. [Google Scholar] [CrossRef]
  26. Pressman, S.D.; Kraft, T.; Bowlin, S. Well-Being: Physical, Psychological, Social. In Encyclopedia of Behavioral Medicine; Gellman, M.D., Turner, J.R., Eds.; Springer: New York, NY, USA, 2013; pp. 2047–2052. [Google Scholar] [CrossRef]
  27. American Psychological Association. Social interaction. In APA Dictionary of Psychology; American Psychological Association: Washington, DC, USA, 2018. [Google Scholar]
  28. Sun, J.; Harris, K.; Vazire, S. Is well-being associated with the quantity and quality of social interactions? J. Personal. Soc. Psychol. 2020, 119, 1478. [Google Scholar] [CrossRef]
  29. Adão Martins, N.R.; Annaheim, S.; Spengler, C.M.; Rossi, R.M. Fatigue Monitoring Through Wearables: A State-of-the-Art Review. Front. Physiol. 2021, 12, 790292. [Google Scholar] [CrossRef]
  30. Salama, W.; Abdou, A.H.; Mohamed, S.A.K.; Shehata, H.S. Impact of Work Stress and Job Burnout on Turnover Intentions among Hotel Employees. Int. J. Environ. Res. Public Health 2022, 19, 9724. [Google Scholar] [CrossRef]
  31. Chen, C.W.; Määttä, T.; Wong, K.B.Y.; Aghajan, H. A collaborative framework for ergonomic feedback using smart cameras. In Proceedings of the 2012 Sixth International Conference on Distributed Smart Cameras (ICDSC), Hong Kong, China, 30 October–2 November 2012; pp. 1–6. [Google Scholar]
  32. Forrester, N. How better sleep can improve productivity. Nature 2023, 619, 659–661. [Google Scholar] [CrossRef] [PubMed]
  33. Dutton, J.E. Energize Your Workplace: How to Create and Sustain High-Quality Connections at Work; John Wiley & Sons: Hoboken, NJ, USA, 2003; Volume 5. [Google Scholar]
  34. Braun, M.; Weber, F.; Alt, F. Affective Automotive User Interfaces–Reviewing the State of Driver Affect Research and Emotion Regulation in the Car. ACM Comput. Surv. 2021, 54, 137. [Google Scholar] [CrossRef]
  35. Gong, J.; Zhang, X.; Huang, Y.; Ren, J.; Zhang, Y. Robust Inertial Motion Tracking through Deep Sensor Fusion across Smart Earbuds and Smartphone. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 62. [Google Scholar] [CrossRef]
  36. Röddiger, T.; Clarke, C.; Breitling, P.; Schneegans, T.; Zhao, H.; Gellersen, H.; Beigl, M. Sensing with Earables: A Systematic Literature Review and Taxonomy of Phenomena. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2022, 6, 135. [Google Scholar] [CrossRef]
  37. Jones, C.; Jonsson, I.M. Using Paralinguistic Cues in Speech to Recognise Emotions in Older Car Drivers. In Affect and Emotion in Human-Computer Interaction: From Theory to Applications; Peter, C., Beale, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 229–240. [Google Scholar] [CrossRef]
  38. Huang, D.Y.; Seyed, T.; Li, L.; Gong, J.; Yao, Z.; Jiao, Y.; Chen, X.A.; Yang, X.D. Orecchio: Extending Body-Language through Actuated Static and Dynamic Auricular Postures. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14 October 2018; UIST ’18. pp. 697–710. [Google Scholar] [CrossRef]
  39. Abdelrahman, Y.; Schmidt, A. Beyond the visible: Sensing with thermal imaging. Interactions 2018, 26, 76–78. [Google Scholar] [CrossRef]
  40. Nocera, A.; Senigagliesi, L.; Raimondi, M.; Ciattaglia, G.; Gambi, E. Machine learning in radar-based physiological signals sensing: A scoping review of the models, datasets and metrics. IEEE Access 2024, 12, 156082–156117. [Google Scholar] [CrossRef]
  41. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef]
  42. Stampf, A.; Colley, M.; Rukzio, E. Towards Implicit Interaction in Highly Automated Vehicles—A Systematic Literature Review. Proc. ACM Hum.-Comput. Interact. 2022, 6, 191. [Google Scholar] [CrossRef]
  43. Jones, C.; Sutherland, J. Acoustic Emotion Recognition for Affective Computer Gaming. In Affect and Emotion in Human-Computer Interaction: From Theory to Applications; Peter, C., Beale, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 209–219. [Google Scholar] [CrossRef]
  44. Dunne, R.; Morris, T.; Harper, S. A Survey of Ambient Intelligence. ACM Comput. Surv. 2021, 54, 73. [Google Scholar] [CrossRef]
  45. Soundariya, R.; Renuga, R. Eye movement based emotion recognition using electrooculography. In Proceedings of the 2017 Innovations in Power and Advanced Computing Technologies (i-PACT), Vellore, India, 21–22 April 2017; pp. 1–5. [Google Scholar] [CrossRef]
  46. Zeng, K.; Liu, G. Emotion recognition based on millimeter wave radar. In Proceedings of the 2023 3rd International Conference on Bioinformatics and Intelligent Computing, Sanya, China, 10–12 February 2023; BIC ’23. pp. 232–236. [Google Scholar] [CrossRef]
  47. Ashwin, T.S.; Guddeti, R.M.R. Unobtrusive Behavioral Analysis of Students in Classroom Environment Using Non-Verbal Cues. IEEE Access 2019, 7, 150693–150709. [Google Scholar] [CrossRef]
  48. Gao, Y.; Jin, Y.; Choi, S.; Li, J.; Pan, J.; Shu, L.; Zhou, C.; Jin, Z. SonicFace: Tracking Facial Expressions Using a Commodity Microphone Array. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 156. [Google Scholar] [CrossRef]
  49. Whitehill, J.; Serpell, Z.; Lin, Y.C.; Foster, A.; Movellan, J.R. The Faces of Engagement: Automatic Recognition of Student Engagementfrom Facial Expressions. IEEE Trans. Affect. Comput. 2014, 5, 86–98. [Google Scholar] [CrossRef]
  50. Aslan, S.; Alyuz, N.; Tanriover, C.; Mete, S.E.; Okur, E.; D’Mello, S.K.; Arslan Esme, A. Investigating the Impact of a Real-time, Multimodal Student Engagement Analytics Technology in Authentic Classrooms. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; CHI ’19. pp. 1–12. [Google Scholar] [CrossRef]
  51. Verma, D.; Bhalla, S.; Sahnan, D.; Shukla, J.; Parnami, A. ExpressEar: Sensing Fine-Grained Facial Expressions with Earables. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 129. [Google Scholar] [CrossRef]
  52. Vedernikov, A.; Sun, Z.; Kykyri, V.L.; Pohjola, M.; Nokia, M.; Li, X. Analyzing Participants’ Engagement during Online Meetings Using Unsupervised Remote Photoplethysmography with Behavioral Features. In Proceedings of the 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Seattle, WA, USA, 17–18 June 2024; pp. 389–399. [Google Scholar] [CrossRef]
  53. Huynh, S.; Kim, S.; Ko, J.; Balan, R.K.; Lee, Y. EngageMon: Multi-Modal Engagement Sensing for Mobile Games. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 13. [Google Scholar] [CrossRef]
  54. Li, J.; Li, H.; Umer, W.; Wang, H.; Xing, X.; Zhao, S.; Hou, J. Identification and classification of construction equipment operators’ mental fatigue using wearable eye-tracking technology. Autom. Constr. 2020, 109, 103000. [Google Scholar] [CrossRef]
  55. Turetskaya, A.; Anishchenko, L.; Ivanisova, E. Non-Contact Detection of Respiratory Pattern Changes due to Mental Workload. In Proceedings of the 2020 Ural Symposium on Biomedical Engineering, Radioelectronics and Information Technology (USBEREIT), Yekaterinburg, Russia, 14–15 May 2020; pp. 125–127. [Google Scholar] [CrossRef]
  56. Zhou, L.; Fischer, E.; Brahms, C.M.; Granacher, U.; Arnrich, B. Using Transparent Neural Networks and Wearable Inertial Sensors to Generate Physiologically-Relevant Insights for Gait. In Proceedings of the 2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA), Nassau, Bahamas, 12–14 December 2022; pp. 1274–1280. [Google Scholar] [CrossRef]
  57. Soleymanpour, R.; Shishavan, H.H.; Heo, J.S.; Kim, I. Novel Driver’s Drowsiness Detection System and its Evaluation in a Driving Simulator Environment. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–20 October 2021; pp. 1204–1208. [Google Scholar] [CrossRef]
  58. Chen, H.; Han, X.; Hao, Z.; Yan, H.; Yang, J. Non-contact Monitoring of Fatigue Driving Using FMCW Millimeter Wave Radar. ACM Trans. Internet Things 2023, 5, 3. [Google Scholar] [CrossRef]
  59. Ice, G.H.; James, G.D. Conducting a field study of stress. In Measuring Stress in Humans; Ice, G.H., James, G.D., Eds.; Cambridge University Press: Cambridge, UK, 2006; Chapter 1; pp. 3–24. [Google Scholar]
  60. Selye, H. The general adaptation syndrome and the diseases of adaptation. J. Clin. Endocrinol. Metab. 1946, 6, 117–230. [Google Scholar] [CrossRef]
  61. Alberdi, A.; Aztiria, A.; Basarab, A. Towards an automatic early stress recognition system for office environments based on multimodal measurements. J. Biomed. Inform. 2016, 59, 49–75. [Google Scholar] [CrossRef]
  62. Arsalan, A.; Anwar, S.M.; Majid, M. Mental Stress Detection using Data from Wearable and Non-wearable Sensors: A Review. arXiv 2023, arXiv:2202.03033. [Google Scholar] [CrossRef]
  63. Gedam, S.; Paul, S. A Review on Mental Stress Detection Using Wearable Sensors and Machine Learning Techniques. IEEE Access 2021, 9, 84045–84066. [Google Scholar] [CrossRef]
  64. Taskasaplidis, G.; Fotiadis, D.A.; Bamidis, P.D. Review of Stress Detection Methods Using Wearable Sensors. IEEE Access 2024, 12, 38219–38246. [Google Scholar] [CrossRef]
  65. Masri, G.; Al-Shargie, F.; Tariq, U.; Almughairbi, F.; Babiloni, F.; Al-Nashash, H. Mental Stress Assessment in the Workplace: A Review. IEEE Trans. Affect. Comput. 2024, 15, 958–976. [Google Scholar] [CrossRef]
  66. Wu, Y.; Ni, H.; Mao, C.; Han, J.; Xu, W. Non-intrusive Human Vital Sign Detection Using mmWave Sensing Technologies: A Review. ACM Trans. Sens. Netw. 2023, 20, 16. [Google Scholar] [CrossRef]
  67. Ha, U.; Madani, S.; Adib, F. WiStress: Contactless Stress Monitoring Using Wireless Signals. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2021, 5, 103. [Google Scholar] [CrossRef]
  68. Zakaria, C.; Balan, R.; Lee, Y. StressMon: Scalable Detection of Perceived Stress and Depression Using Passive Sensing of Changes in Work Routines and Group Interactions. Proc. ACM Hum.-Comput. Interact. 2019, 3, 37. [Google Scholar] [CrossRef]
  69. Nosakhare, E.; Picard, R. Probabilistic Latent Variable Modeling for Assessing Behavioral Influences on Well-Being. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019. KDD ’19. [Google Scholar] [CrossRef]
  70. Olsen, G.F.; Brilliant, S.S.; Primeaux, D.; Najarian, K. Signal processing and machine learning for real-time classification of ergonomic posture with unobtrusive on-body sensors; application in dental practice. In Proceedings of the 2009 ICME International Conference on Complex Medical Engineering, Tempe, AZ, USA, 9–11 April 2009; pp. 1–11. [Google Scholar] [CrossRef]
  71. Wac, M.; Kou, R.; Unlu, A.; Jenkinson, M.; Lin, W.; Roudaut, A. TAILOR: A Wearable Sleeve for Monitoring Repetitive Strain Injuries. In Proceedings of the CHI ’20: CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; CHI EA ’20. pp. 1–8. [Google Scholar] [CrossRef]
  72. Jan, M.T.; Hashemi, A.; Jang, J.; Yang, K.; Zhai, J.; Newman, D.; Tappen, R.; Furht, B. Non-intrusive Drowsiness Detection Techniques and Their Application in Detecting Early Dementia in Older Drivers. In Proceedings of the Future Technologies Conference (FTC) 2022, Volume 2; Arai, K., Ed.; Springer International Publishing: Berlin/Heidelberg, Germany, 2023; pp. 776–796. [Google Scholar]
  73. Zhang, C.; Wu, X.; Zheng, X.; Yu, S. Driver drowsiness detection using multi-channel second order blind identifications. IEEE Access 2019, 7, 11829–11843. [Google Scholar] [CrossRef]
  74. Kundinger, T.; Sofra, N.; Riener, A. Assessment of the Potential of Wrist-Worn Wearable Sensors for Driver Drowsiness Detection. Sensors 2020, 20, 1029. [Google Scholar] [CrossRef]
  75. Yamamoto, K.; Toyoda, K.; Ohtsuki, T. Doppler Sensor-Based Blink Duration Estimation by Analysis of Eyelids Closing and Opening Behavior on Spectrogram. IEEE Access 2019, 7, 42726–42734. [Google Scholar] [CrossRef]
  76. Shah, D.; Upasini, A.; Sasidhar, K. Findings from an experimental study of student behavioral patterns using smartphone sensors. In Proceedings of the 2020 International Conference on COMmunication Systems & NETworkS (COMSNETS), Bengaluru, India, 7–11 January 2020; pp. 768–772. [Google Scholar] [CrossRef]
  77. Jo, E.; Bang, H.; Ryu, M.; Sung, E.J.; Leem, S.; Hong, H. MAMAS: Supporting Parent–Child Mealtime Interactions Using Automated Tracking and Speech Recognition. Proc. ACM Hum.-Comput. Interact. 2020, 4, 66. [Google Scholar] [CrossRef]
  78. Bi, C.; Xing, G.; Hao, T.; Huh-Yoo, J.; Peng, W.; Ma, M.; Chang, X. FamilyLog: Monitoring Family Mealtime Activities by Mobile Devices. IEEE Trans. Mob. Comput. 2019, 19, 1818–1830. [Google Scholar] [CrossRef]
  79. Tan, E.T.S.; Rogers, K.; Nacke, L.E.; Drachen, A.; Wade, A. Communication Sequences Indicate Team Cohesion: A Mixed-Methods Study of Ad Hoc League of Legends Teams. Proc. ACM Hum.-Comput. Interact. 2022, 6, 225. [Google Scholar] [CrossRef]
  80. Sefidgar, Y.S.; Seo, W.; Kuehn, K.S.; Althoff, T.; Browning, A.; Riskin, E.; Nurius, P.S.; Dey, A.K.; Mankoff, J. Passively-sensed Behavioral Correlates of Discrimination Events in College Students. Proc. ACM Hum.-Comput. Interact. 2019, 3, 114. [Google Scholar] [CrossRef] [PubMed]
  81. Maxhuni, A.; Hernandez-Leal, P.; Morales, E.F.; Sucar, L.E.; Osmani, V.; Mayora, O. Unobtrusive Stress Assessment Using Smartphones. IEEE Trans. Mob. Comput. 2020, 20, 2313–2325. [Google Scholar] [CrossRef]
  82. Langheinrich, M. Privacy by design—Principles of privacy-aware ubiquitous systems. In Proceedings of the International Conference on Ubiquitous Computing; Springer: Berlin/Heidelberg, Germany, 2001; pp. 273–291. [Google Scholar]
  83. Du, W.; Li, A.; Zhou, P.; Niu, B.; Wu, D. Privacyeye: A privacy-preserving and computationally efficient deep learning-based mobile video analytics system. IEEE Trans. Mob. Comput. 2021, 21, 3263–3279. [Google Scholar] [CrossRef]
  84. Hoyle, R.; Templeman, R.; Anthony, D.; Crandall, D.; Kapadia, A. Sensitive lifelogs: A privacy analysis of photos from wearable cameras. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015; pp. 1645–1648. [Google Scholar]
  85. Korayem, M.; Templeman, R.; Chen, D.; Crandall, D.; Kapadia, A. Enhancing lifelogging privacy by detecting screens. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 4309–4314. [Google Scholar]
  86. Templeman, R.; Korayem, M.; Crandall, D.J.; Kapadia, A. PlaceAvoider: Steering First-Person Cameras away from Sensitive Spaces. In Proceedings of the NDSS, San Diego, CA, USA, 23–26 February 2014; Volume 14, pp. 23–26. [Google Scholar]
  87. Kumar, S.; Nguyen, L.T.; Zeng, M.; Liu, K.; Zhang, J. Sound shredding: Privacy preserved audio sensing. In Proceedings of the 16th International Workshop on Mobile Computing Systems and Applications, Santa Fe, NM, USA, 12–13 February 2015; pp. 135–140. [Google Scholar]
  88. Dwork, C. Differential privacy. In Proceedings of the International Colloquium on Automata, Languages, and Programming; Springer: Berlin/Heidelberg, Germany, 2006; pp. 1–12. [Google Scholar]
  89. Dedovic, K.; Rexroth, M.; Wolff, E.; Duchesne, A.; Scherling, C.; Beaudry, T.; Lue, S.D.; Lord, C.; Engert, V.; Pruessner, J.C. Neural correlates of processing stressful information: An event-related fMRI study. Brain Res. 2009, 1293, 49–60. [Google Scholar] [CrossRef]
  90. McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; y Arcas, B.A. Communication-efficient learning of deep networks from decentralized data. In Proceedings of the Artificial Intelligence and Statistics, Lauderdale, FL, USA, 20–22 April 2017; pp. 1273–1282. [Google Scholar]
  91. Thieme, A.; Belgrave, D.; Doherty, G. Machine Learning in Mental Health: A Systematic Review of the HCI Literature to Support the Development of Effective and Implementable ML Systems. ACM Trans. Comput.-Hum. Interact. 2020, 27, 34. [Google Scholar] [CrossRef]
  92. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  93. VanderWeele, T.J. On the promotion of human flourishing. Proc. Natl. Acad. Sci. USA 2017, 114, 8148–8156. [Google Scholar] [CrossRef]
  94. Menniti, M.; Laganà, F.; Oliva, G.; Bianco, M.; Fiorillo, A.S.; Pullano, S.A. Development of Non-Invasive Ventilator for Homecare and Patient Monitoring System. Electronics 2024, 13, 790. [Google Scholar] [CrossRef]
Figure 1. Proposed concept of work-related well-being.
Figure 2. Review steps.
Figure 3. Proposed setup for the unobtrusive measurement of work-related well-being.
Table 1. Number of articles included.
Sub-Dimension | No. of Articles After the Keyword Search | No. of Articles After the Screening
Affect | 55 | 6
Engagement | 98 | 7
Fatigue | 112 | 2
Stress | 32 | 3
Physical comfort | 176 | 3
Sleep deprivation | 34 | 8
Social interactions | 162 | 6
Sum | 644 | 35
Table 2. Framework for inferring work-related well-being from behavior.
Sub-Dimension | Behavioral Marker | Sensors | Relevant Literature
Affect | Facial expressions | RGB camera, microphone | [34,35]
 | Speech | Microphone | [34,37]
 | Auricular positions | RGB camera | [38]
 | Facial temperature changes | Thermal camera | [39]
 | Eye movement and position | RGB camera, EOG signals | [45]
Engagement | Facial expressions | RGB camera, microphone array | [47,48,49,50,51,52]
 | Posture | RGB camera, microphone array | [47,48]
 | Hand gestures | RGB camera | [47]
 | Upper-body motion | Depth camera | [53]
 | Gaze tracking and direction | RGB camera | [52]
 | Head rotation | RGB camera | [52]
Fatigue | Eye movement | Eye tracker | [54]
 | Respiratory pattern | Radar | [55]
Stress | Body movements | Millimeter-wave sensor | [67]
 | Activity information | WiFi-based localization system | [68]
 | Communication patterns | Smartphone | [69]
 | Phone usage | Smartphone | [69]
 | Location | Smartphone | [69]
Physical comfort | Posture | Inclinometer, RGB camera | [31,70]
 | Hand and elbow movement | Wearable sleeve | [71]
Sleep deprivation | Eye blinking | RGB camera | [72,73,74]
 | Distraction | RGB camera | [72]
 | Yawning | RGB camera | [57,72,73,74]
 | Head movement | RGB camera | [57,72]
 | Blink duration | Doppler sensor | [75]
 | Micro-sleep events | RGB camera | [74]
 | Nodding | RGB camera | [74]
Social interactions | Communication frequency | Microphone | [79]
 | Category frequency | Microphone | [79]
 | Spoken conversations | Microphone | [77,78]
 | Duration of interaction | Microphone | [76]
 | Number and duration of phone calls | Smartphone | [80,81]
 | Number of SMSs | Smartphone | [81]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
