Sensors 2012, 12(5), 6075-6101; doi:10.3390/s120506075

Article
A Stress Sensor Based on Galvanic Skin Response (GSR) Controlled by ZigBee
María Viqueira Villarejo, Begoña García Zapirain and Amaia Méndez Zorrilla *
DeustoTech-Life Unit, Deusto Institute of Technology, University of Deusto, Avda. de las Universidades, 24, Bilbao 48007, Spain; E-Mails: maria.viqueira@opendeusto.es (M.V.V.); mbgarciazapi@deusto.es (B.G.Z.)
*
Author to whom correspondence should be addressed; E-Mail: amaia.mendez@deusto.es; Tel.: +34-944-139-000.
Received: 29 January 2012; in revised form: 18 April 2012 / Accepted: 27 April 2012 /
Published: 10 May 2012

Abstract

Sometimes it is necessary to monitor emotional states that can, in the short and medium term, put the person experiencing them at risk. Studies indicate that stress increases the risk of cardiac problems. In this study we designed and built a stress sensor based on Galvanic Skin Response (GSR) and controlled via ZigBee. To check the device's performance, 16 adults (eight women and eight men) completed different tests requiring a certain degree of effort, such as mathematical operations or deep breathing. On completion, we found that GSR is able to detect the different states of each user with a success rate of 76.56%. In the future, we plan to create an algorithm able to differentiate between each state.
Keywords:
GSR; skin resistance; stress; ZigBee

1. Introduction

Stress, explained in more detail in [1], is a response to particular events. It is the way our body prepares itself to face a difficult situation with focus, strength and heightened alertness. When we perceive a threat, our nervous system responds by releasing a flood of stress hormones, including adrenaline and cortisol. These hormones rouse the body for emergency action. In some cases it is necessary to collect feedback in order to monitor this response, because it can become dangerous in certain situations. We therefore set out to build a device to detect stress.

For this purpose, we have designed a Galvanic Skin Response (GSR) device to detect the change in skin conductance that occurs when a person is under stress [2]. It uses just two electrodes, which are placed on the fingers and act as the two terminals of a resistance [3,4].

This device sends data to a coordinator via ZigBee and, at the same time, the coordinator sends the information on to a computer. The final objective is to integrate this GSR sensor into an application which controls different medical devices [5,6]. Figure 1 shows the communication scheme of the final application.

The user can use the stress sensor anywhere in his home, provided he is at a distance of less than 10 meters [7]. The wireless communication system gives the user a certain degree of freedom when using the device. The final user could manage the different devices from his television, and the control center could take different actions to change a person's stress levels. For example, the coordination center could use different systems to help the person relax, such as turning down the lights or changing the kind of music the user is listening to. There are two main reasons why we decided to work with ZigBee:

  • Its low power consumption.

  • It is possible to connect as many as 255 nodes.

This means of communication has been used before in healthcare applications, as can be seen in [8] and [9]. In order to verify the stress sensor, we carried out different trials with 16 adult subjects. The idea was to establish one threshold for each person, because some people are more nervous than others, so a single shared threshold could produce unreliable results. Despite this, there are studies which have obtained good results by establishing the same limit for all subjects [10].

Therefore, there are two parts: the hardware design of the GSR device and the algorithm which detects the emotional state of the user. This first study focuses on the hardware, so the trials were conducted to verify that the device detects changes in the person's condition, more specifically whether it detects an effort being made.

The output voltage of the designed circuit is connected to the ADC of a ZigBee board. There are two ZigBee boards: one for acquiring the data, and a second one to send it to the computer. This second board (the coordinator) also receives information about other devices and functions of the domotic application.

This paper is divided into the following sections: first, the state of the art is described. Secondly, a complete methodology covering all the technologies involved in this system is presented; then the system design is explained, as well as the results obtained during the tests carried out on the system. The document ends with the conclusions and the discussion arising from the topic.

2. State of the Art

There are different studies which try to detect people's emotional states, including attempts to find out whether someone is suffering from stress or not. Studies [11] and [12] use EEG to classify data acquired from brain activity. They extract different frequency features from the signals for subsequent classification, with good results.

Heart Rate Variability is another parameter used to measure stress levels [13]. To induce stress, the authors of [14] propose using hyperventilation and talk preparation; they then present a method based on fuzzy logic to analyze the data from HR and GSR. An ambulatory device is developed in [4] to evaluate stress in blind people. This device also measures skin temperature, another parameter used to analyze stress [15].

As regards Galvanic Skin Response (GSR), there are several studies which propose different methods of detecting stress levels by measuring skin conductance [16]. The study described in [17] aims to detect sweat levels for the diagnosis of sudomotor dysfunction, something that can help in the diagnosis of diabetes. There are other medical applications based on skin conductance, such as epilepsy control, since sweaty hands may be a warning sign of an epileptic attack [18], or support for the diagnosis and treatment of bipolar disorder patients [19].

By combining the sweat of the hands with the temperature of the skin, it is possible to build a truth meter [20], since when a person is lying his hands are colder and his skin resistance is lower. In this case it is not necessary to include an ADC, because the variation in skin resistance occurs only at specific moments, so with different resistors and transistors it is possible to build a lie detector.

In [10], different videos are shown to the participants in order to induce different emotions. The data acquired by GSR are classified by Immune Particle Swarm Optimization, achieving a high average accuracy when classifying different emotions from skin conductance. The study presented in [21] shows a method based on Support Vector Regression for recognizing emotions by combining different devices.

Continuing with the differentiation of emotional states, the literature includes other studies such as [22], where different devices are combined so that, by means of Cross-Correlation and Fisher analysis, it is possible to distinguish six different kinds of emotions.

In [23], a method based on Principal Component Analysis (PCA) is proposed to reduce the dimension of the GSR data while preserving as much information as possible. The devices described above are also used in the analysis of different bio-signals [24].

3. Methods

In order to develop the GSR device, a mechanism is needed to send the data via ZigBee, as well as the corresponding algorithm to determine the stress level from the different tests. The methods used in this first study are described below:

3.1. Hardware

We use Jennic JN-5148 boards (Figure 2) for data acquisition and its subsequent transmission to the computer. These were chosen because of the ease of implementing a communication protocol between the coordinator board and the sensor board, and because they are part of the ZigBee Alliance. Other devices, like those used in [5], were not appropriate for developing a global domotic system. The resolution of the Analog-to-Digital Converter (ADC) is appropriate for the needs of the device.

The output signal of the device (Vo) is connected to pin 34 of the sensor board, while the reference signal is connected to pin 40. Through the ZigBee communication protocol, the sensor board sends the data to the coordinator board which, by means of USB with a JWT terminal, forwards the data to a computer. The weight and size of these boards are quite small, so they can easily be housed in the same kit as the device.

3.2. Signal

After some tests, it was seen that the Analog-to-Digital Converter saturates at 2.35 V. It is a 12-bit ADC, so the resolution is:

2.35 V / 4096 = 0.573 mV

The galvanic skin resistance ranges between 10 kΩ and 10 MΩ [25,26], as can be seen in existing studies on skin conductance obtained at different applied voltages [27,28]. After initial contact with the subjects, we established an input voltage of 1.8 V. We took measurements from our circuit using different resistances within the range of skin resistance (Table 1). These values were chosen in order to know the theoretical behavior of the output voltage as a function of skin conductance. The different values of Rs (Table 1) were obtained by combining real resistors.

As the board's ADC has a resolution of 0.573 mV and the minimum output voltage is 136 mV, an operational amplifier does not have to be included. We can also observe that the differences between the various resistance values are larger than the ADC resolution. There are GSR devices which use an amplifier before the ADC [6].
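As a quick numerical sanity check (a sketch, not code from the paper; the 2.35 V saturation, 12-bit depth and 136 mV minimum are taken from the text above):

```python
# Checking that the ADC resolution is fine enough for the GSR signal.
# Values from the text: the ADC saturates at 2.35 V, has 12 bits, and
# the minimum output voltage measured was 136 mV.
FULL_SCALE_V = 2.35
ADC_BITS = 12

resolution_v = FULL_SCALE_V / 2 ** ADC_BITS  # volts per ADC step
print(f"ADC resolution: {resolution_v * 1e3:.3f} mV")

min_output_v = 0.136
print(f"ADC steps spanned by the minimum output: {min_output_v / resolution_v:.0f}")
```

Since the smallest output is more than two hundred ADC steps above zero, no amplification stage is needed.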

3.3. Trials

Different tests were conducted in order to verify the behavior of the GSR device:

  • In a calm state, trying to feel relaxed.

  • Trying to be nervous: at the second stage, we asked the subjects to think about something which makes them nervous or produces anxiety.

  • Taking in air and expelling it forcefully: the subject is relaxed and, after one minute, is asked to take in air and expel it, pushing himself as hard as possible. As a result, the nervous system signals the sweat glands that an effort has been made.

  • Mathematical operations: the subject is relaxed and, after one minute, the computer shows him different mathematical operations. The subject is asked to feed the different results into the computer.

  • Reading: after 90 seconds, the screen shows some words that the user has to read as fast as possible (Figure 3).

  • Another test, in which the computer shows different images to the subject, was also tried. Some of these pictures are “emotional” (they should affect the subject's emotional state) and the others are “neutral” (they have no influence on the subject). It was expected that the emotional ones would produce a response but, after trying with some subjects, we have not included this test because the results were insignificant.

Table 2 shows the places where the different studies took place. The studies were conducted at the subjects' homes and offices because the GSR device is intended to work in daily situations.

3.4. Matlab

In order to analyze and manipulate the different data, we use Matlab.

3.5. WEKA

To verify the different tests, we used the WEKA machine learning tool, testing the following methods: Bayesian Network, J48 and Sequential Minimal Optimization (SMO). They were chosen because other studies have obtained good results with them [29]. Cross-validation was used to evaluate the results.
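The classifiers themselves run in WEKA; purely to illustrate the evaluation scheme, the sketch below cross-validates a trivial nearest-mean classifier (not one of the WEKA methods) on labelled window averages. It assumes roughly balanced classes within each fold:

```python
def k_fold_accuracy(samples, labels, k=10):
    """Average accuracy of a nearest-mean classifier over k folds.

    samples: list of scalar features (e.g. mean output voltage per window)
    labels:  matching class labels (e.g. 0 = relaxed, 1 = effort)
    """
    n = len(samples)
    fold = max(1, n // k)
    accs = []
    for i in range(0, n, fold):
        test_idx = range(i, min(i + fold, n))
        train_idx = [j for j in range(n) if j not in test_idx]
        # Class mean on the training portion (assumes every class
        # still has training samples in each fold).
        means = {}
        for lbl in set(labels):
            vals = [samples[j] for j in train_idx if labels[j] == lbl]
            means[lbl] = sum(vals) / len(vals)
        # Classify each held-out sample by its nearest class mean.
        correct = sum(
            1 for j in test_idx
            if min(means, key=lambda l: abs(samples[j] - means[l])) == labels[j]
        )
        accs.append(correct / len(test_idx))
    return sum(accs) / len(accs)
```

With well-separated relaxed and effort averages the cross-validated accuracy approaches 1.0, which mirrors how Table 6 is produced per subject in WEKA.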

4. System Design

We now present different diagrams showing the application's system design.

4.1. High Level Design

The main application performs the operations shown in Figure 4.

The data acquisition and subsequent sending of information to the computer is done by two different boards, which are connected to each other via ZigBee (see Figure 5).

The device built performs the operations shown in Figure 6:

The GSR electrodes measure the skin resistance and, from this, the device can determine the person's stress level.

4.2. Processing Stage

At this stage (Figure 7) it is necessary to develop an algorithm which must be able to differentiate between the stress levels.

We separate the data into stress situations and relaxed situations.

4.3. Low Level Design

Hardware

A person's skin acts as a resistance to the passage of electrical current. By placing two electrodes on the fingers, we can measure the Galvanic Skin Response (GSR). To find this value, we use one resistor, as can be seen in Figure 8, in series with the skin resistance, forming a voltage divider.

Vo = (R2 / (Rs + R2)) × Vcc
where Rs is the resistance of the skin.

It can be observed that the output voltage Vo decreases as the skin resistance increases. The more stressed the person is, the more his hands will sweat, so his skin resistance will decrease. Therefore, we can conclude that the more stress the person is under, the higher the output voltage will be.
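The voltage-divider relation Vo = R2 / (Rs + R2) × Vcc can be sketched numerically. Here R2 = 100 kΩ is an assumed illustrative value (the actual components are given in Table 1), while the 1.8 V supply and the 10 kΩ–10 MΩ skin-resistance range are taken from the text:

```python
VCC = 1.8   # input voltage used in this design (V)
R2 = 100e3  # series resistor (assumed value for illustration, Ω)

def gsr_output(rs_ohms: float) -> float:
    """Output voltage of the divider for a given skin resistance Rs."""
    return R2 / (rs_ohms + R2) * VCC

# Skin resistance spans roughly 10 kOhm to 10 MOhm [25,26]:
for rs in (10e3, 100e3, 1e6, 10e6):
    print(f"Rs = {rs:>10.0f} Ohm -> Vo = {gsr_output(rs):.3f} V")
```

Running the loop confirms the inverse relationship: the lowest skin resistance (sweaty hands, stressed) yields the highest output voltage.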

The circuit also includes a low-pass filter, made up of a capacitor and a resistor, to filter out high frequencies. The resulting circuit is shown in Figure 9.

We use a 1.5 V battery as the power supply. The device (Figure 10) has the following form:

5. Results

We conducted several tests designed to change the emotional state of the subjects. Knowing the moments when the person should be stressed and those when he should not, we can analyze each kind of data separately. We used 16 subjects aged between 23 and 56 (eight women and eight men). The sampling rate is 4 Hz. Once the data had been obtained, we smoothed them with a size-5 moving-average window.
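The smoothing step can be sketched as follows (an assumed implementation; the paper's Matlab code is not given). Edge samples use as much of the window as is available:

```python
def smooth(signal, window=5):
    """Smooth a sequence with a centered moving-average window."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Example: a constant trace is unchanged, while a spike is spread out.
print(smooth([0, 0, 5, 0, 0]))
```

At a 4 Hz sampling rate, a 5-sample window averages over roughly 1.25 seconds, enough to suppress sensor jitter without hiding the slow GSR response.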

All the users have done the following tests:

  • Staying relaxed

  • Mathematical operations

  • Breathing deeply

  • Reading as fast as possible

Table 3 shows the output voltage averages for the different tests in those cases where the differences are better appreciated.

The responses to the different situations can be seen below. Figures 11, 16 and 17 show the variation of the output voltage when the users are reading; Figures 12 and 15 represent the effort of doing mathematical operations; and Figures 13, 14 and 18 show the response of the user breathing deeply.

It was observed that when the participants were asked to relax, there was a decrease in the output voltage for that period of time. Figures 19–22 show the decrease in the signal when the user is relaxed.

Table 4 shows whether changes were observed in these four tests for each user.

With these data, we obtained an average success rate of 76.56%. The users who had done some trials beforehand (Users 1, 2, 3 and 12) were more successful than the rest. Additionally, some users were asked to think of something that makes them nervous, for later comparison (Table 5):

Figures 23 and 24 show the responses of User 1 and User 3, trying to be nervous and relaxed.

User 4 was a special case: after the stage in which she had to think about something that makes her nervous, she said that she had not been able to do so. We then told her that the next stage was mathematical operations, something which made her nervous (Figure 25). We therefore decided to repeat the acquisitions:

Figures 26 and 27 show additional situations tested by the GSR:

After drinking coffee, User 1 presents a higher output voltage.

While User 4 was supposed to be relaxed, she became nervous, which is why an increase in output voltage can be observed.

Figure 28 represents the following three situations:

  • Upon arrival at the department, done fairly quickly (blue).

  • After a while, much more calmly (green).

  • In the afternoon, thinking about something she had to do, and that made her nervous (red).

In order to verify the different tests, we fed the data into the WEKA machine learning tool, using BayesNet, J48 and SMO. We separated the measurements which are supposed to represent an effort from those where the user was relaxed. The data were processed individually for each participant, since each one has a different threshold. Cross-validation was used to test the different classifiers. Below, in Table 6, we present the results:

The confusion matrix is shown in Table 7:

Figures 29–31 show the graphs of the classifier errors:

Figures 32–34 show the ROC curves for some of the different subjects:

Below is a comparative graph (Figure 35) with the results obtained from the different subjects, showing the averages when relaxed and when in situations requiring effort:

Table 8 contains the voltage differences between relaxing situations and effort situations:

We conducted more trials for Users 2, 3, 4, 5 and 13 in order to determine whether they were stressed or not. In these trials they were asked to make themselves feel both relaxed and nervous. After talking to them, we concluded that most of them were not able to make themselves nervous (Figures 36–40). Because of this, the measurements were labelled not according to what the protocol intended, but according to how each participant actually felt. The results are reflected in Tables 9–13.

Table 14 shows the classification average.

For this initial study, we have established the following limit to differentiate being relaxed from being nervous:

Limit = (stress average × 0.6 + relax average × 0.4) / 2

We separated the data into windows of 10 seconds, overlapping by 5 seconds. If the window average is higher than the limit, the result is 1; if it is lower, the result is −1 (Figures 41–46). These are the results:
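This decision rule can be sketched as follows (an assumed implementation matching the description: 10 s windows with 5 s overlap at the 4 Hz sampling rate, compared against the limit built from the stress and relax averages):

```python
SAMPLE_RATE = 4           # Hz, as stated in the text
WIN = 10 * SAMPLE_RATE    # 40 samples per 10 s window
STEP = 5 * SAMPLE_RATE    # 20-sample hop, giving 5 s of overlap

def decision_limit(stress_avg, relax_avg):
    """Limit between relaxed and nervous states."""
    return (stress_avg * 0.6 + relax_avg * 0.4) / 2

def classify(signal, limit):
    """Return +1 (above the limit) or -1 for each overlapping window."""
    results = []
    for start in range(0, len(signal) - WIN + 1, STEP):
        win = signal[start:start + WIN]
        avg = sum(win) / len(win)
        results.append(1 if avg > limit else -1)
    return results
```

For example, a 20-second trace (80 samples) yields three overlapping windows, each labelled +1 or −1 as plotted in Figures 41–46.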

6. Discussion

The main part of this study involved the design of a device able to detect skin resistance in different situations. It also includes an initial threshold between being stressed and being relaxed, but this is not the algorithm that will be implemented in the final application.

From the different graphs, it can be observed that the signals increase or decrease depending on the effort or the mental state of the user. Users 2, 3 and 4 had done some tests before these results were taken. This may explain why their graphs are clearer.

The main problem is that, for the moment, we cannot differentiate being stressed from making an effort. This is clearly seen in User 13's last graph, where a laugh produces a response similar to feeling stressed. Apart from the trials reported, we also collected data while the user was playing different games, such as Tetris or PacMan. However, no significant results were obtained, so they are not included in this study. Other studies [30] have used different games in longer tests and obtained good results.

7. Conclusions

The GSR device detects whether there has been an effort or a situation different from being relaxed with a success rate of 90.97%. It was observed that participants who had done some trials beforehand showed the largest differences, so the average could be higher once the user is familiar with the device. The next stage is to design an algorithm to establish the threshold between different emotional situations, because this first algorithm does not distinguish between being stressed and making an effort. Two tasks lie ahead of us:

  • Improving the algorithm to establish more reliable thresholds;

  • Using different tests for the calibration state: conducting tests that last longer [17,30].

Acknowledgments

This research work is partly funded by the Basque Government Department of Education and Research, as well as the Department of Industry, Trade and Tourism.

References

  1. Cooper, C.L.; Dewe, P.J. Stress: A Brief History; Blackwell Publishing: Oxford, UK, 2004.
  2. Handri, S.; Nomura, S.; Kurosawa, Y.; Yajima, K.; Ogawa, N.; Fukumura, Y. User Evaluation of Student's Physiological Response Towards E-Learning Courses Material by Using GSR Sensor. Proceedings of 9th IEEE/ACIS International Conference on Computer and Information Science, Yamagata, Japan, 18–20 August 2010.
  3. Zhai, J.; Barreto, A. Stress Detection in Computer Users Based on Digital Signal Processing of Noninvasive Physiological Variables. Proceedings of IEEE 28th Annual International Conference on Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006.
  4. Massot, B.; Baltenneck, N.; Gehin, C.; Dittmarm, A.; McAdams, E. EmoSense: An Ambulatory Device for the Assessment of ANS Activity—Application in the Objective User Evaluation of Stress with the Blind. Sensors J. 2011, 12, 543–551.
  5. Burns, A.; Greene, B.R.; McGrath, M.J.; O'Shea, T.J.; Kuris, B.; Ayer, S.M.; Stroiescu, F.; Cionca, V. SHIMMER—A Wireless Sensor Platform for Noninvasive Biomedical Research. Sensors J. 2010, 10, 1527–1534, doi:10.1109/JSEN.2010.2045498.
  6. Huang, T.; Xiong, J.; Xuan, L. Design and Implementation of a Wireless Healthcare System Based on Galvanic Skin Response. Commun. Comput. Inform. Sci. 2011, 225, 337–343.
  7. Farahani, S. ZigBee Wireless Networks and Transceivers; Newness: Oxford, UK, 2008.
  8. Zhao, Z.; Cui, L. EasiMed: A Remote Health Care Solution. Proceedings of 27th Annual International Conference of the Engineering in Medicine and Biology Society, Shanghai, China, 17–18 January 2006.
  9. Jung, J.Y.; Lee, J.W. ZigBee Device Access Control and Reliable Data Transmission in ZigBee Based Health Monitoring System. Proceedings of 10th International Conference on Advanced Communication Technology, Phoenix Park, Korea, 17–20 February 2008.
  10. Wu, G.; Liu, G.; Hao, M. The Analysis of Emotion Recognition from GSR Based on PSO. Proceedings of International Symposium on Intelligence Information Processing and Trusted Computing, Huanggang, China, 28–29 October 2010.
  11. Hosseini, S.A.; Khalilzadeh, M.A.; Branch, M. Emotional Stress Recognition System Using EEG and Psychophysiological Signals: Using New Labelling Process of EEG Signals in Emotional Stress State. Proceedings of International Conference on Biomedical Engineering and Computer Science, Wuhan, China, 23–25 April 2010.
  12. Khosrowabadi, R.; Quek, C.; Ang, K.K.; Tung, S.W.; Heijnen, M. A Brain-Computer Interface for classifying EEG Correlates of Chronic Mental Stress. Proceedings of International Joint Conference on Neural Networks (IJCNN), San Jose, CA, USA, 31 July–5 August 2011.
  13. Jongyoon, C.; Gutierrez-Osuna, R. Removal of Respiratory Influences from Heart Rate Variability in Stress Monitoring. IEEE Sensor J. 2011, 11, 2649–2656, doi:10.1109/JSEN.2011.2150746.
  14. Santos, A.; Sánchez, C.; Guerra, J.; Bailador del Pozo, G. A Stress-Detection System Based on Physiological Signals and Fuzzy Logic. IEEE Trans. Industr. Electrs. 2011, 58, 4857–4865, doi:10.1109/TIE.2010.2103538.
  15. Yamakoshi, T.; Yamakoshi, K.; Tanaka, S.; Nogawa, M.; Park, S. B.; Shibata, M.; Sawada, Y.; Rolfe, P.; Hirose, Y. Feasibility Study on Driver's Stress Detection from Differential Skin Temperature Measurement. Proceedings of 30th Annual Conference of the IEEE in Engineering in Medicine and Biology Society, Vancouver, Canada, 20–25 August 2008.
  16. Jing, Z; Barreto, A.B.; Chin, C.; Chao, L. Realization of stress Detection Using Psychophysiological Signals for Improvement of Human-Computer Interactions. Proceedings of the IEEE Southeastcon, Fort Lauderdale, FL, USA, 8–10 April 2005.
  17. Khalfallah, K.; Ayoub, H.; Calvet, J. H.; Neveu, X.; Brunswick, P.; Griveau, S.; Lair, V.; Cassir, M.; Bedioui, F. Noninvasive Galvanic Skin Sensor for Early Diagnosis Of Sudomotor Dysfunction: Application to Diabetes. IEEE Sensor J. 2010, 12, 456–463.
  18. Takahashi, K. Epilepsy Research Progress; Nova Science Publishers: New York, NY, USA, 2008.
  19. Kappeler-Setz, C.; Gravenhorst, F.; Schumm, J.; Arnrich, B.; Tröster, G. Towards Long Term Monitoring of Electrodermal Activity in Daily Life. Proceedings of 5th International Workshop on Ubiquitous Health and Wellness, Copenhagen, Denmark, 26 September 2010.
  20. Li, J.; Zhuo, Q.; Wang, W. A Novel Approach to Analyze the Result of Polygraph. Proceedings of IEEE International Conference on Systems, Man and Cybernetics, Nashville, TN, USA, 8–11 October 2000.
  21. Chang, C.-Y.; Zheng, J.-Y.; Wang, C.-J. Based on Support Vector Regression for Emotion Recognition Using Pshysiological Signals. Proceedings of 2010 International Joint Conference on Neural Networks, Barcelona, Spain, 18–23 July 2010.
  22. Wen, W.H.; Qiu, Y. H.; Liu, G. Y.; Cheng, N.P.; Huang, X.T. Construction and Cross-Correlation Analysis of the Affective Physiological Response Database. Sci. China Inf. Sci. 2010, 53, 1774–1784.
  23. Tarvainen, M.P.; Karj, P.A.; Koistinen, A.S.; Valkonen-Korhonen, M.V. Principal Component Analysis of Galvanic Skin Responses. Proceedings of 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Piscataway, NJ, USA, 23–28 July 2000.
  24. Ranta-aho, P.O.; Tarvainen, M.T.; Koistinen, A.S.; Karj, P.A. Software Package for Bio-Signal Analysis. Proceedings of 23rd Annual International Conference on Engineering in Medicine and Biology Society, Istanbul, Turkey, 25–28 October 2001.
  25. Nabours, R.E.; Fish, R.M.; Hill, P.F. Electrical Injuries: Engineering, Medical and Legal Aspects, 2nd ed.; Lawyers & Judges Publishing Company, Inc.: Tucson, AZ, USA, 2004.
  26. Aguado, M.J. Resistencia de la piel al paso de la corriente eléctrica en adultos trabajadores. E-Prints Complutense; Biblioteca Universidad Complutense: Madrid, Spain, 2004.
  27. Kallman, R. Realities of Wrist Strap Monitoring Systems. Proceedings of Electrical Overstress-Electrostatic Discharge Symposium, Las Vegas, NV, USA, 27–29 September 1994.
  28. Andrews, H.L. Skin Resistance Changes and Measurements of Pain Threshold. Clin. Investig. J. 1943, 22, 517–520, doi:10.1172/JCI101421.
  29. Sun, F.; Kuo, C.; Cheng, H.; Buthpitiya, S.; Collins, P.; Griss, M. Activity-Aware Mental Stress Detection Using Physiological Sensors. Proceedings of 2nd International Conference on Mobile Computing, Applications and Services (MobiCASE), Santa Clara, CA, USA, 25–28 October 2010.
  30. Wu, S.; Lin, T. Exploring the Use of Physiology in Adaptive Game Design. Proceedings of International Conference on Consumer Electronics Communications and Networks, Xianning, China, 16–18 April 2011.
Figure 1. Final application.
Figure 2. Jennic board.
Figure 3. Prototype and test.
Figure 4. General diagram.
Figure 5. Acquisition diagram.
Figure 6. Device function.
Figure 7. Processing stage.
Figure 8. Voltage divider.
Figure 9. General circuit.
Figure 10. Device.
Figure 11. Output voltage of User 4 reading.
Figure 12. Output voltage of User 5 doing mathematical operations.
Figure 13. Output voltage of User 13 breathing.
Figure 14. Output voltage of User 6 breathing.
Figure 15. Output voltage of User 10 doing mathematical operations.
Figure 16. Output voltage of User 11 reading.
Figure 17. Output voltage of User 12 reading.
Figure 18. Output voltage of User 14 breathing.
Figure 19. Output voltage of User 9 being relaxed.
Figure 20. Output voltage of User 6 being relaxed.
Figure 21. Output voltage of User 10 being relaxed.
Figure 22. Output voltage of User 16 being relaxed.
Figure 23. Output voltage of User 1 trying to be relaxed and nervous.
Figure 24. Output voltage of User 3 trying to be relaxed and nervous.
Figure 25. Output voltage of User 4 trying to be relaxed and nervous.
Figure 26. Output voltage of User 4 before and after drinking coffee.
Figure 27. Output voltage of User 4 in the morning.
Figure 28. Output voltage of User 1 in different situations.
Figure 29. Classifier error BayesNet.
Figure 30. Classifier error J48.
Figure 31. Classifier error SMO.
Figure 32. User 1 BayesNet ROC curve.
Figure 33. User 10 BayesNet ROC curve.
Figure 34. User 9 J48 ROC curve.
Figure 35. Average comparison between being relaxed and making an effort.
Figure 36. Output voltage of User 2 at the prediction stage.
Figure 37. Output voltage of User 3 at the prediction stage, nervous-relaxed.
Figure 38. Output voltage of User 4 at the prediction stage, nervous-relaxed.
Figure 39. Output voltage of User 5 at the prediction stage, relax 1.
Figure 40. Output voltage of User 13 at the prediction stage, nervous-relaxed.
Figure 41. State of User 2 at the prediction stage.
Figure 42. State of User 3 at the prediction stage, nervous-relaxed.

Figure 42. State of the User 3 at the prediction stage, nervous-relaxed.
Sensors 12 06075f42 1024
Sensors 12 06075f43 200
Figure 43. State of the User 3 at the prediction stage, relaxed.

Click here to enlarge figure

Figure 43. State of the User 3 at the prediction stage, relaxed.
Sensors 12 06075f43 1024
Sensors 12 06075f44 200
Figure 44. State of the User 4 at the prediction stage, nervous-relaxed.

Click here to enlarge figure

Figure 44. State of the User 4 at the prediction stage, nervous-relaxed.
Sensors 12 06075f44 1024
Sensors 12 06075f45 200
Figure 45. State of the User 5 at the prediction stage, relaxed 1.

Click here to enlarge figure

Figure 45. State of the User 5 at the prediction stage, relaxed 1.
Sensors 12 06075f45 1024
Sensors 12 06075f46 200
Figure 46. State of the User 13 at the prediction stage, nervous-relaxed.

Click here to enlarge figure

Figure 46. State of the User 13 at the prediction stage, nervous-relaxed.
Sensors 12 06075f46 1024
Table 1. Different values of the resistances.

Rs         Vout
10 kΩ      1.755 V
49.5 kΩ    1.677 V
100 kΩ     1.587 V
200 kΩ     1.434 V
560 kΩ     1.054 V
760 kΩ     0.923 V
1 MΩ       0.813 V
3.3 MΩ     0.357 V
9.93 MΩ    0.136 V
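The calibration pairs in Table 1 map the sensor's series resistance to the measured output voltage. As an illustrative sketch (the interpolation step and the clamping behaviour are our own assumptions, not part of the paper's firmware), skin resistance can be estimated from a measured output voltage by piecewise-linear interpolation over the table:

```python
# Calibration pairs from Table 1: series resistance Rs (ohms) -> output voltage (V).
CAL = [(10e3, 1.755), (49.5e3, 1.677), (100e3, 1.587), (200e3, 1.434),
       (560e3, 1.054), (760e3, 0.923), (1e6, 0.813), (3.3e6, 0.357),
       (9.93e6, 0.136)]

def resistance_from_voltage(v):
    """Estimate Rs (ohms) for a measured Vout by piecewise-linear
    interpolation; voltages outside the calibrated range are clamped."""
    if v >= CAL[0][1]:
        return CAL[0][0]
    if v <= CAL[-1][1]:
        return CAL[-1][0]
    for (r1, v1), (r2, v2) in zip(CAL, CAL[1:]):
        if v2 <= v <= v1:  # Vout falls as Rs rises
            t = (v1 - v) / (v1 - v2)
            return r1 + t * (r2 - r1)
```

For example, a reading of 1.5 V falls between the 100 kΩ and 200 kΩ rows and interpolates to roughly 157 kΩ.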
Table 2. Places of the tests.

User 1   Work    User 9    Home
User 2   Work    User 10   Home
User 3   Work    User 11   Home
User 4   Work    User 12   Work
User 5   Home    User 13   Home
User 6   Home    User 14   Work
User 7   Home    User 15   Work
User 8   Home    User 16   Home
Table 3. Average of different situations.

User (age)     Relax (V)   Operations (V)   Breathing (V)   Reading (V)
User 4 (25)    1.4068      1.6945           1.6476          1.6712
User 5 (30)    1.1123      1.1383           1.1484          1.1426
User 6 (27)    0.876       1.0381           0.9567          1.0609
User 7 (24)    1.0011      1.0868           1.0786          1.1176
User 8 (26)    0.8238      0.9904           0.8864          0.9975
User 9 (26)    1.101       1.1145           1.1105          1.1439
User 10 (27)   1.060       1.1197           1.0695          1.1022
User 11 (55)   0.7096      0.7546           0.7840          0.8408
User 12 (28)   1.0529      1.0685           1.0893          1.0856
User 13 (23)   1.3699      1.5542           1.5599          1.6468
User 14 (30)   0.8238      0.9904           0.8864          0.9975

Table 4. Success of each test by subject.

                Reading   Breathing   Operations   Relaxing
User 1          YES       YES         YES          YES
User 2          NO        YES         YES          YES
User 3          YES       YES         YES          YES
User 4          YES       YES         YES          YES
User 5          NO        YES         YES          YES
User 6          YES       YES         YES          YES
User 7          NO        YES         NO           YES
User 8          YES       NO          NO           YES
User 9          YES       NO          YES          YES
User 10         NO        YES         YES          YES
User 11         YES       YES         NO           YES
User 12         YES       YES         YES          NO
User 13         NO        YES         NO           YES
User 14         YES       NO          YES          YES
User 15         NO        NO          YES          YES
User 16         YES       YES         YES          YES

Total success   10        12          12           15
% success       62.5      75          75           93.75
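The totals in Table 4 follow directly from the per-subject outcomes: each percentage is the YES count over the 16 subjects, and pooling the four tests reproduces the 76.56% overall rate quoted in the abstract. A small sketch of that arithmetic:

```python
# YES counts per test taken from Table 4, out of 16 subjects.
yes_counts = {"Reading": 10, "Breathing": 12, "Operations": 12, "Relaxing": 15}
N_SUBJECTS = 16

# Per-test success rate: YES count over the number of subjects.
per_test = {test: 100 * n / N_SUBJECTS for test, n in yes_counts.items()}

# Pooled success rate over all 4 tests x 16 subjects = 64 trials.
overall = 100 * sum(yes_counts.values()) / (len(yes_counts) * N_SUBJECTS)
# overall = 49/64 trials = 76.5625%, the 76.56% reported in the abstract.
```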

Table 5. Differences between trying to be relaxed and trying to be nervous.

User (age)     Relaxed (V)   Nervous (V)
User 1 (26)    1.6118        1.7396
User 2 (26)    1.5535        1.5379
User 3 (24)    1.5576        1.6153
User 4 (25)    1.4068        1.3839
User 5 (30)    1.1123        1.1266
User 15 (26)   1.0902        1.1388

Table 6. Global results.

                              BayesNet   J48      SMO
Relative absolute error       15.86%     16.91%   17.17%
Root relative squared error   26.77%     27.20%   41.61%
Correctly classified          93.73%     93.79%   91.95%
Incorrectly classified        6.27%      6.21%    8.05%

Table 7. Confusion matrix.

     BayesNet        J48             SMO
     P      N        P      N       P      N
T    8861   1014     8867   1008    8546   1329
F    442    10347    435    10354   497    10292
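Standard per-class metrics can be derived from the Table 7 counts. The sketch below assumes that rows T/F are the actual classes and columns P/N the predicted ones; that orientation is our assumption, as the paper does not state it explicitly:

```python
# Confusion-matrix entries from Table 7, stored as ((TP, FN), (FP, TN))
# under the assumed orientation rows = actual, columns = predicted.
MATRICES = {
    "BayesNet": ((8861, 1014), (442, 10347)),
    "J48":      ((8867, 1008), (435, 10354)),
    "SMO":      ((8546, 1329), (497, 10292)),
}

def precision_recall(matrix):
    """Precision and recall for the positive class P."""
    (tp, fn), (fp, tn) = matrix
    return tp / (tp + fp), tp / (tp + fn)

for name, m in MATRICES.items():
    p, r = precision_recall(m)
    print(f"{name}: precision={p:.3f} recall={r:.3f}")
```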
Table 8. Difference between being relaxed and effort situations.

          Calm (V)   Effort (V)   Difference (%)
User 1    1.6118     1.7396       7.929
User 2    1.6216     1.6309       0.5735
User 3    1.5576     1.6153       3.70445
User 4    1.4068     1.6711       18.7873
User 5    1.1123     1.1431       2.7690
User 6    0.876      1.0186       16.2747
User 7    1.0011     1.0943       9.3131
User 8    0.8238     0.9581       16.3025
User 9    1.101      1.123        1.9952
User 10   1.060      1.0971       3.5129
User 11   0.7096     0.7931       11.7718
User 12   1.0529     1.0811       2.6814
User 13   1.3699     1.587        15.84548
User 14   0.8238     0.9581       16.3023
User 15   1.0902     1.1388       4.4499
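The difference column in Table 8 is consistent with the relative increase of the effort voltage over the calm baseline, 100 · (effort − calm) / calm; a quick spot-check:

```python
def pct_increase(calm, effort):
    """Relative increase (%) of the effort voltage over the calm baseline."""
    return 100.0 * (effort - calm) / calm

# Spot-checks against Table 8:
print(round(pct_increase(1.6118, 1.7396), 3))   # User 1 -> 7.929
print(round(pct_increase(1.4068, 1.6711), 4))   # User 4 -> 18.7873
```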

Table 9. Results of the different methods for User 2.

User 2, Trial 1: stressed
                          BayesNet   J48       SMO
Relative absolute error   0.95%      0.00%     0.00%
Correctly classified      100.00%    100.00%   100.00%
Incorrectly classified    0.00%      0.00%     0.00%

Table 10. Results of the different methods for User 3.

User 3, Trial 1: stressed and relaxed
                          BayesNet   J48       SMO
Relative absolute error   30.29%     43.61%    21.35%
Correctly classified      85.55%     85.55%    89.37%
Incorrectly classified    14.45%     14.45%    10.63%

User 3, Trial 2: relaxed
                          BayesNet   J48       SMO
Relative absolute error   38.05%     42.40%    29.87%
Correctly classified      84.00%     84.00%    90.90%
Incorrectly classified    16.00%     16.00%    9.10%

Table 11. Results of the different methods for User 4.

User 4, Trial 1: stressed and relaxed
                          BayesNet   J48       SMO
Relative absolute error   29.43%     40.73%    24.91%
Correctly classified      85.35%     79.63%    87.54%
Incorrectly classified    14.65%     20.37%    12.46%

User 4, Trial 2: relaxed
                          BayesNet   J48       SMO
Relative absolute error   0.21%      7.84%     0.00%
Correctly classified      100.00%    96.17%    100.00%
Incorrectly classified    0.00%      3.83%     0.00%

Table 12. Results of the different methods for User 5.

User 5, Trial 1: relaxed
                          BayesNet   J48       SMO
Relative absolute error   0.00%      0.00%     0.00%
Correctly classified      100.00%    100.00%   100.00%
Incorrectly classified    0.00%      0.00%     0.00%

User 5, Trial 2: relaxed
                          BayesNet   J48       SMO
Relative absolute error   0.12%      0.00%     0.00%
Correctly classified      100.00%    100.00%   100.00%
Incorrectly classified    0.00%      0.00%     0.00%

Table 13. Results of the different methods for User 13.

User 13, Trial 1: stressed and relaxed
                          BayesNet   J48       SMO
Relative absolute error   32.50%     35.11%    40.94%
Correctly classified      86.96%     86.96%    79.46%
Incorrectly classified    13.04%     13.04%    20.54%

User 13, Trial 2: relaxed
                          BayesNet   J48       SMO
Relative absolute error   42.17%     45.61%    51.85%
Correctly classified      71.43%     71.43%    71.43%
Incorrectly classified    28.57%     28.57%    28.57%

Table 14. Classification average of the different methods.

BayesNet   90.37%
J48        89.30%
SMO        90.97%
Sensors EISSN 1424-8220. Published by MDPI AG, Basel, Switzerland.