Article

Perception of a Haptic Stimulus Presented Under the Foot Under Workload

by Landry Delphin Chapwouo Tchakoute * and Bob-Antoine J. Menelas
Department of Mathematics and Computer Sciences, University of Quebec at Chicoutimi, Chicoutimi, QC G7H 2B1, Canada
* Author to whom correspondence should be addressed.
Sensors 2020, 20(8), 2421; https://doi.org/10.3390/s20082421
Submission received: 27 January 2020 / Revised: 27 March 2020 / Accepted: 17 April 2020 / Published: 24 April 2020

Abstract

It is clear that the haptic channel can be exploited as a communication medium for several tasks of everyday life. Here we investigated whether such communication can be altered in a cognitive load situation. We studied the perception of a vibrotactile stimulus presented under the foot when the attention is loaded by another task (cognitive load). The results demonstrated a significant influence of workload on the perception of the vibrotactile stimulus. Overall, we observed that the average score in the single-task (at rest) condition was greater than the overall mean score in the dual-task conditions (counting forwards, counting backwards, and walking). The walking task was the task that most influenced the perception of the vibrotactile stimulus presented under the foot.

1. Introduction

Humans constantly interact with their environment, which exposes them to conditions that can be distracting (public places, walking, metro, commercial spaces, etc.). In motion, for example, humans’ attention is mostly occupied by tasks such as walking while scanning their surroundings or walking while speaking to someone. Many of these activities and tasks involve haptic perception through the haptic channel [1]. Indeed, haptic perception is a process mediated by cutaneous and kinesthetic afferent subsystems [2]. Meanwhile, there is an increasing need for haptic interactions in everyday products to improve the user experience [3]. In designing such products, miniaturization and portability constraints may considerably limit computation capabilities. This affects the quality and the perception of the transmitted signal, especially in noisy environments [1]. Moreover, research in human–computer interaction (HCI) has shown the importance of choosing the haptic modality as a suitable channel for communication with humans [4,5,6]. In the area of health, the haptic channel can be used to orient people with low mobility [7]. It can also be used to reduce the risk of falling at home or in mobile situations by taking into account some aspects of the external environment, such as the types of soil [7,8]. In this context, we have previously demonstrated that it is possible to use the haptic channel to inform the user of a potential risk (low, medium, high, and very high) by using a vibrotactile stimulus sent to the plantar surface of the user’s foot [4]. Moreover, we have shown that the time taken to perceive a vibrotactile stimulus can vary on certain types of soil [9], and this variation was independent of the device positioning on the human body [10]. We have also found that the perception of a vibrotactile stimulus transmitted to the plantar surface of the foot can be influenced by auditory stimuli [9]. However, in everyday life, humans are constantly moving and spend most of their time engaged in dual-task situations (e.g., walking and talking on the phone), which can lead to focusing their attention on a specific task. One question thus arises about the perception of the transmitted information. Will the user be able to perceive this information, especially in the presence of external distractors? Thus, the main purpose of this study is to evaluate the haptic perception of information transmitted to the sole of the foot under workload (cognitive tasks and while walking).
This article is organized as follows. The introduction in Section 1 gives an overview of basic topics that are important for this study; Section 2 presents the background and related works. Section 3 describes the apparatus, and Section 4 presents our research methodology and procedure, while Section 5 and Section 6 present the results and discussion, respectively. Section 7 presents the conclusion, with recommendations for future work.

2. Background

Attention is a complex cognitive function that is paramount in human behaviour. It allows for focusing on a main task while ignoring other aspects, but it also underlies cognitive efficiency, whether in perceiving, memorizing, or solving problems [11]. In everyday life, humans are often required to perform several tasks simultaneously, such as having a conversation while driving. Their attention can thus be shared between the main task and the secondary one. In this case, humans must concentrate on managing many pieces of information because of the workload, and this requires more cognitive resources. A dual-task situation may therefore require more resources, and this may cause a decline in performance in the main task [12].
When humans interact with computers, visual rendering is the modality mainly exploited. Although this modality is suitable in many situations, there are scenarios in which the visual channel may be overloaded by the amount of presented data. To fully take advantage of the sensory capabilities of the human system, a benefit would certainly come from exploiting other sensory channels, such as the haptic and audio ones. Moreover, several studies support the use of haptic feedback as a means of communication [13,14] because, unlike visual feedback [15], it can improve human performance. Such observations explain why we are interested in using haptics as a communication channel under workload.
Oakley and Park evaluated the recognition rate of tactons (structured vibrotactile messages) under workload [16]. The tactons were presented on a wearable device worn on the wrist while users performed three different distraction tasks. Two of the tasks involved completing work on a computer; the other was a mobile task requiring the participants to walk around. The tasks were chosen to represent common activities and to explore different aspects of distraction. The results demonstrated a strong influence of cognitive disturbances on recognition. Tang et al. conducted a study to explore how haptic feedback could be used as another channel for transmitting information when the visual system is saturated [17]. They showed that people can accurately perceive and process haptic renderings of ordinal data under cognitive workload. They evaluated three haptic models for rendering ordinal data with participants who were performing a taxing visual tracking task. The results demonstrated that information rendered by these models is perceptually available, even when users are visually busy. In the absence of other tasks, participants detected and identified a change in 1.8 s and 2.5 s, respectively, but with the addition of visual and auditory distraction tasks, detection times increased significantly to 4.0 s [18]. In the same way, Chan et al. discussed the use of vibrotactile feedback to convey basic information when recipients are absorbed by a visual and/or auditory primary task [14].
Psychologists have studied various types of activity encountered in daily living situations. For instance, Kaber and Zhang reviewed the use of virtual reality and haptic technologies for studying human performance in tasks involving the tactile sense [19]. Studies have revealed that tasks that are apparently dissimilar (e.g., talking and driving) can strongly interfere with each other when performed together [12] or in real-world challenges [20]. These tasks are mainly haptic tasks and mobility tasks [19]. Montero-Odasso et al. studied walking, a complex learned task that becomes automatic for most people from early childhood onwards and comes under cognitive control of gait in older adults [21,22]. They hypothesized that walking while performing a secondary task (the dual-task paradigm) is a way to assess the relationship between cognition and gait and reflects the dual-task demands of the daily living activities of elderly people [23]. In their procedure, they used counting forwards and counting backwards as the secondary task. Similarly, Timmermans et al. examined the effects of different walking environments on cognitive-motor interference and task prioritization in dual-task walking in stroke patients [24]. They found that the walking environment strongly influenced cognitive-motor interference and task prioritization during dual-task walking in stroke patients. All these studies evaluated various aspects of haptic communication in dual tasks, sometimes under workload. The major observation is that there is no clear study regarding the influence of a dual task and/or a cognitive load on haptic perception at the foot. In particular, these studies did not examine how workload influences the perception of a haptic vibrotactile stimulus presented under the foot. The present study intended to fill this gap. We studied the perception of a vibrotactile stimulus presented under the foot when attention is loaded by a main task (cognitive task or walking). To examine this question, we designed the procedure described below.

3. Apparatus: Enactive Shoe

3.1. Wearable Device

The wearable device was designed based on some previous works [4,25,26]. We used an enactive shoe, controlled by a smartphone, designed to prevent falls related to a person’s immediate environment or to an abnormal gait [8,27,28]. It is a wearable device with a mini-computer-based processing system that includes a set of sensors and actuators distributed in strategic positions (Figure 1). It can be attached to the body, communicate wirelessly with smartphones (Figure 1f) via the Bluetooth and Wi-Fi protocols, and convey a vibrotactile stimulus to a specific location on the body during walking. The device has two separate enclosures, as suggested in [29]. Figure 1b shows the processing system responsible for computing and conveying the signal, storing data, and waiting for the participant’s input via the tactile screen. It is mainly composed of a Raspberry Pi 3 B mini-computer. Figure 1c shows the haptic system responsible for transmitting the vibrotactile stimulus to a specific location on the body through haptuators; it is linked to the smartphone through a Bluetooth connection. The entire system is powered by a 9 V, 600 mAh rechargeable lithium battery (Figure 1a). Each haptuator is mounted on a removable strap that can be placed directly on a specific body location. Each haptuator is driven by a stable 3.3 V signal at a frequency of 100 Hz, with an impedance of 10 ohms. This setup is not only capable of communicating information [4], but has also been used to evaluate the impact of auditory distraction [30] and the time taken to react to vibrotactile stimuli in the rest position [9]. Each haptuator measures 32 mm × 9 mm × 9 mm, with a frequency range between 90 and 1000 Hz [10].
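To make this architecture concrete, the sketch below illustrates, under stated assumptions, how the processing system could accept trigger commands from the smartphone and drive the haptuator. It is only a minimal illustration: the GPIO pin, the RFCOMM channel, the one-byte command protocol, and the square-wave PWM drive are hypothetical choices, not the authors' actual firmware.

    # Minimal sketch (our assumptions): a Raspberry Pi listens on a Bluetooth
    # RFCOMM socket and pulses the haptuator for one second when commanded.
    import socket
    import time
    import RPi.GPIO as GPIO  # available on Raspberry Pi OS

    HAPTUATOR_PIN = 18                    # assumed GPIO pin wired to the haptuator driver
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(HAPTUATOR_PIN, GPIO.OUT)
    pwm = GPIO.PWM(HAPTUATOR_PIN, 100)    # 100 Hz drive, as in Section 3.1

    def pulse(duration_s=1.0):
        """Drive the haptuator for duration_s seconds, then stop."""
        pwm.start(50)                     # 50% duty cycle (square-wave approximation)
        time.sleep(duration_s)
        pwm.stop()

    # Wait for one-byte trigger commands from the smartphone on RFCOMM channel 1.
    server = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM, socket.BTPROTO_RFCOMM)
    server.bind((socket.BDADDR_ANY, 1))
    server.listen(1)
    conn, _ = server.accept()
    while True:
        cmd = conn.recv(1)
        if not cmd:                       # connection closed by the smartphone
            break
        if cmd == b"V":                   # hypothetical command: deliver the stimulus
            pulse(1.0)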

3.2. Haptic Stimulus Used

One vibrotactile stimulus (Equation (1)) with a duration of one second was designed according to various studies [9,30,31,32] and was also used in our previous works [4,10]. The vibrotactile stimulus was generated at frequency ω (Equation (1)) and delivered to the field of Pacinian corpuscle mechanoreceptors under the plantar surface of the foot.
W = a sin(2πωt),  (1)
where a is the amplitude of the signal, ω its frequency, and t the time.
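As an illustration of Equation (1), the short sketch below generates the one-second stimulus as a sampled waveform. The unit amplitude, the 100 Hz carrier (taken from Section 3.1), and the 8 kHz sampling rate are illustrative assumptions, not parameters reported by the authors beyond the drive frequency.

    # Sketch: sample W(t) = a * sin(2*pi*omega*t) over one second.
    import numpy as np

    def vibrotactile_stimulus(a=1.0, omega_hz=100.0, duration_s=1.0, fs=8000):
        """Return samples of Equation (1) at sampling rate fs."""
        t = np.arange(0.0, duration_s, 1.0 / fs)
        return a * np.sin(2.0 * np.pi * omega_hz * t)

    samples = vibrotactile_stimulus()
    print(samples.shape)  # (8000,): one second of samples at 8 kHz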

4. Experiment

This experiment aimed to evaluate the influence of dual tasks on the perception of a vibrotactile stimulus conveyed randomly to the plantar surface of the foot.

4.1. Participants

Twenty-eight healthy students, aged from 20 to 35, from the University of Quebec at Chicoutimi (UQAC) participated in the study (Table 1). They were recruited by means of randomized sampling after a written electronic invitation to participate in a study related to the response time (RT) to a vibrotactile stimulus. All the participants attended the session voluntarily, and informed consent was obtained before the experimental sessions. Further, all the participants were novices to haptic technologies. Each participant filled out a short questionnaire on their health history and underwent a touch inspection assessing their foot sensitivity. The experiment and consent form were approved by the local ethics committee (certificate number: 602.434.01).

4.2. Experimental Setup

What follows is a description of the setup, including the protocol.

4.2.1. Test Environment

The experimental phase took place in a calm space, specifically our laboratory at the university, equipped with chairs and a table for the preparation of the participants. The laboratory was equipped with the flooring surfaces used in the experiment and a hygiene kit to clean the device after each session. This environment remained constant during the experiments.

4.2.2. Experiment Conditions

To achieve the goal set in this study, we chose four tasks/activities that, based on preliminary studies [10,19,33], would highlight the influence of workload on haptic perception with the foot. We defined four conditions. The first condition was the at-rest condition, in which there was no distraction. The second was counting forwards to 100, the third was counting backwards from 100 to 0, and the fourth consisted of walking on the ground.
Our protocol is summarized in Table 2. We had two main sessions, which are described as follows:

4.3. Experimental Sessions: Control and Experimental

The experiments conducted to achieve the goal of this study involved two sessions: a control session and an experimental session. The control session concerned the evaluation of vibrotactile stimulus perception in the normal condition (at rest, without any distraction), whereas the experimental session concerned the evaluation of vibrotactile stimulus perception under the influence of distractors. Each session (control and experimental) featured a familiarization phase followed by a test phase. For all conditions, the vibrotactile stimulus was conveyed when the participant had the left foot on the ground. The control session had an average duration of 25 min, whereas the experimental session lasted around 45 min; there was a break of 5 min between the two sessions and between each phase. The space used for walking measured 470 cm × 80 cm (width × length). The entire protocol (control and experimental) consisted of 336 measures (28 participants × 3 trials × 1 vibrotactile stimulus × 4 test phases). In all phases, a signal (vibrotactile stimulus) was sent. Successive stimuli were spaced by at least five seconds, and the stimulus was delivered only while the participant’s foot was on the ground. A delay of three seconds was granted for the participant’s response; a response outside this window was recorded as a false positive.
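The sketch below summarizes this trial logic under stated assumptions. The helper callables (send_stimulus, wait_for_tap, foot_on_ground) are hypothetical placeholders for the device and smartphone interfaces, and the extra random onset jitter is an illustrative choice; only the five-second minimum spacing, the stance-phase constraint, and the three-second response window come from the protocol above.

    # Sketch of one trial: >= 5 s spacing, stimulate only with the foot on the
    # ground, and allow 3 s for the response; late or missing taps are not
    # counted as perceptions.
    import time
    import random

    MIN_GAP_S = 5.0          # minimum spacing between stimuli (protocol)
    RESPONSE_WINDOW_S = 3.0  # delay granted for the participant's response (protocol)

    def run_trial(send_stimulus, wait_for_tap, foot_on_ground):
        time.sleep(MIN_GAP_S + random.uniform(0.0, 2.0))  # assumed random jitter
        while not foot_on_ground():                        # wait for stance phase
            time.sleep(0.05)
        sent_at = time.monotonic()
        send_stimulus()
        tap_at = wait_for_tap(timeout=RESPONSE_WINDOW_S)   # None if no tap in time
        if tap_at is None:
            return {"perceived": False, "rt": None}
        return {"perceived": True, "rt": tap_at - sent_at}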

4.4. Familiarization Phase

We had different configurations for the familiarization phase. It was performed at rest (sitting on a chair) for the control session and while walking for the experimental session. During this phase, we explained and demonstrated all aspects of the experiment to the participants. For both configurations, the participants wore ear protection and had the haptic device attached to the left foot. We chose the left foot for technical reasons; nothing in the literature suggests that there may be differences in lower-extremity perception due to dominance. The participants held a smartphone in their hand, outside of their field of vision. They also had to look at a black spot on the opposite wall. The participants were asked to press the smartphone screen whenever they perceived a vibrotactile stimulus. We recorded 28 participants × 1 vibrotactile stimulus × 3 trials (84 measures). Each correct detection contributed to the haptic perception recognition rate (score). Once each participant was trained (after achieving a recognition rate of 95%), the test phase began.
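As a small illustration of this training criterion, the sketch below computes the recognition rate as the fraction of delivered stimuli that were correctly detected and checks it against the 95% threshold; the function names and the example counts are hypothetical.

    # Sketch: recognition rate and the 95% familiarization criterion.
    TRAINING_THRESHOLD = 0.95

    def recognition_rate(detected, delivered):
        """Fraction of delivered stimuli that were correctly detected."""
        return detected / delivered if delivered else 0.0

    def ready_for_test_phase(detected, delivered):
        return recognition_rate(detected, delivered) >= TRAINING_THRESHOLD

    print(ready_for_test_phase(detected=20, delivered=21))  # True (about 95.2%)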

4.5. Test Phase

The test phase was performed at rest (sitting on a chair) for the control session. During this phase, the participants wore the haptic device and ear protection. For each condition phase, their vibrotactile stimulus perception was tested: when they perceived the vibrotactile stimulus, they pressed the smartphone screen, and the identification was then saved in the database. No results were shown to the participants during the test phase. All identifications were considered for the measures. The test phase conditions were counterbalanced (randomized) between participants. When the test phase was finished, the participants were invited to fill in a form about the experience. The device was cleaned after each test condition before the next participant.

5. Results

All the participants completed the experiment successfully, with 336 measures recorded (28 participants × 1 vibrotactile stimulus × 3 trials × 4 conditions). We took the mean of the three trials for each measure. Overall, we observed that the best mean perception score was achieved during the single-task condition (at rest; mean = 10.11 ± 0.72) and the worst in the presence of distractors (walking; mean = 5.75 ± 2.89), as illustrated in Figure 2 (with mean value inside the bars). The perception accuracy of each condition is depicted in Figure 2. In the control condition (at rest), the mean score of 10.11 ± 0.72 was the highest mean perception score among all conditions. For the dual-task conditions with cognitive tasks, there was a mean of 7.00 ± 2.17 in the counting forwards condition, compared to a mean of 6.39 ± 1.65 in the counting backwards condition. Finally, in the dual-task condition with the motor task (walking), there was a mean of 5.75 ± 2.89, which was the smallest mean perception score among all conditions.
These results indicate that the perception accuracy was greater in the single-task condition than in the dual-task conditions, as illustrated in Figure 3 (with mean value inside the bars).
To detect any effect of the conditions (single task and dual task), we performed an analysis of variance with repeated measures on the accuracy of the perception score. Indeed, as our goal was to examine the effect of dual tasks on the perception of a vibrotactile stimulus presented under the foot, we investigated whether distractions (dual task) had an impact on vibrotactile stimulus perception under the foot. Therefore, we made the following assumption:
Hypothesis 1 (H1).
A dual task affects vibrotactile stimulus perception.
We assumed that all the means would be equal if H1’s null hypothesis (H01) were true and that at least one mean would be different from the others if H1’s alternative hypothesis (Ha1) were true. Our significance level was α = 0.05. The dependent variable was the score, and our independent variable was the condition. All the tests were performed using Minitab version 17, and the visualizations were created using Power BI Desktop, January 2019 release. The analysis of variance (ANOVA) was followed by post-hoc Tukey HSD (honest significant difference) tests. The levels of the condition factor were at rest (single task), counting forwards, counting backwards, and walking (dual tasks). Pairwise comparisons, which identified significant differences between conditions, were used in all analyses. Statistical significance was set at the 95% confidence level (p < 0.05). The sample size was n = 28. The data satisfied the conditions justifying the use of ANOVA; analysis of the distribution of the data suggested that they were normally distributed.
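For readers who prefer to reproduce this kind of analysis in code, the sketch below shows an equivalent one-way repeated-measures ANOVA with a post-hoc Tukey HSD test using Python's statsmodels (the authors used Minitab). The long-format data layout (columns participant, condition, score) and the file name are assumptions, not the authors' dataset.

    # Sketch: repeated-measures ANOVA and Tukey HSD over the four conditions.
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # One row per participant x condition, holding the mean perception score.
    df = pd.read_csv("perception_scores.csv")  # hypothetical file

    # One-way repeated-measures ANOVA with condition as the within-subject factor.
    anova = AnovaRM(df, depvar="score", subject="participant", within=["condition"]).fit()
    print(anova)

    # Post-hoc Tukey HSD pairwise comparisons at alpha = 0.05.
    tukey = pairwise_tukeyhsd(endog=df["score"], groups=df["condition"], alpha=0.05)
    print(tukey.summary())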
A one-way ANOVA with repeated measures was conducted to determine whether the perception of the vibrotactile stimulus under the foot was different for the participants in the dual-task conditions. Overall, there was a statistically significant difference between the conditions, as determined by the one-way ANOVA: F(3,108) = 24.71, Fcritical = 2.70, p < 0.05. This result suggests that one or more of the conditions (dual task) influenced the perception of the vibrotactile stimulus. According to Cohen’s guidelines [34], a small effect size is 0.01, a medium effect size is 0.059, and a large effect size is 0.138. The effect size we obtained, η² = 0.407, is therefore very large; it indicates that 41% of the change in the perception accuracy (score) can be accounted for by the conditions (dual task), as depicted in Figure 4 (the circles in the figure are mean values). With a Cohen’s d [34] of 0.9, 82% of the participants at rest (single task) would score above the mean of the participants in the dual-task conditions (counting forwards, counting backwards, and walking). Further, 65% of the two groups would overlap, and there is a 74% chance that a person chosen at random from the single-task group will have a higher perception score (accuracy) than a person selected at random from the dual-task group (probability of superiority). Moreover, to obtain one more favorable outcome in the single-task (at rest) group than in the dual-task group, we would need to treat 3.1 people; that is, if 100 people perceived the vibrotactile stimulus at rest, 32.3 more people would have a favorable score than if they had perceived the stimulus under a dual task.
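The reported interpretations of Cohen's d follow standard normal-model conversions (Cohen's U3 for the percentage above the comparison mean, the distribution overlap, and the probability of superiority). The sketch below reproduces them approximately for d = 0.9; η² would normally come from the ANOVA sums of squares, written here as a simple ratio.

    # Sketch: effect-size quantities used above (values are approximate).
    from scipy.stats import norm

    def eta_squared(ss_effect, ss_total):
        """Proportion of total variance accounted for by the conditions."""
        return ss_effect / ss_total

    def cohens_d_interpretation(d=0.9):
        above_mean = norm.cdf(d)                   # Cohen's U3: ~0.82 for d = 0.9
        overlap = 2 * norm.cdf(-abs(d) / 2)        # distribution overlap: ~0.65
        prob_superiority = norm.cdf(d / 2 ** 0.5)  # ~0.74
        return above_mean, overlap, prob_superiority

    print(cohens_d_interpretation())  # approximately (0.816, 0.653, 0.738)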
To identify the conditions that most influenced the perception of the vibrotactile stimulus, we performed the Tukey simultaneous test for differences of means (Table 3).
We observed a significant difference between the single-task condition and the three dual-task conditions (Table 3). This means that the dual tasks (cognitive and motor) used in this experiment decreased the perception accuracy of the vibrotactile stimulus presented under the foot. However, as illustrated in Table 3, the differences among the dual tasks were not significant.
We also performed an analysis of false positives to verify that the participants’ responses referred to the delivered stimulus. An ANOVA on the false-positive responses between conditions showed no significant difference, indicating that the participants’ answers indeed referred to the delivered vibrotactile stimulus.
We also checked the normality of the sample using the Anderson–Darling test. The resulting diagrams and plots are reported in Figure 5.
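A per-condition Anderson–Darling check of this kind can be sketched as follows; the synthetic score arrays below are hypothetical placeholders standing in for the recorded data, with means loosely based on the values reported above.

    # Sketch: Anderson-Darling normality test per condition (placeholder data).
    import numpy as np
    from scipy.stats import anderson

    rng = np.random.default_rng(0)
    scores_by_condition = {
        "at rest": rng.normal(10.1, 0.7, size=28),
        "CF": rng.normal(7.0, 2.2, size=28),
        "CB": rng.normal(6.4, 1.7, size=28),
        "walking": rng.normal(5.8, 2.9, size=28),
    }

    for condition, scores in scores_by_condition.items():
        result = anderson(scores, dist="norm")
        idx = list(result.significance_level).index(5.0)   # 5% significance level
        crit_5 = result.critical_values[idx]
        print(f"{condition}: A2 = {result.statistic:.3f}, "
              f"consistent with normality at 5%: {result.statistic < crit_5}")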

6. Discussion

The objective of this study was to evaluate the influence of dual tasks on the perception of a vibrotactile stimulus presented under the foot. The data were analyzed in terms of the overall mean score of perception within each condition, single task (at rest) and dual tasks (counting forwards, counting backwards, and walking). The results obtained in this study indicate that being in a dual-task condition is detrimental to the perception of stimuli presented under the foot.

6.1. Influence of Walking on the Perception of a Vibrotactile Stimulus

During our study, we mainly used two types of task: a purely cognitive one (counting) and a cognitive-motor one (walking). The results revealed a significant decrease in the perception of the stimulus in the walking task. According to Table 3, the walking task yielded the worst mean score obtained by the participants; this indicates that walking can affect perception when a user perceives a vibrotactile stimulus under the foot. This result can be explained by the fact that walking is a complex motor act, a multisegmental task involving the semi-automatic control of rhythmic movements by the central nervous system [10]. This implies that sensory information (haptic and other) coming from the lower limbs reaches the sensory cortex during the walking motor task through a neurological network. This sensory information can facilitate perception when the task is repetitive (motor coordination) or bias vibrotactile stimulus perception when it is not. Moreover, the interpretation of psychophysiological responses becomes doubly difficult when a person is engaged in a multi-task environment [12]. Indeed, it has been shown that in daily life, tasks that are apparently dissimilar (e.g., talking and driving) can strongly interfere with each other when done together, both in laboratory conditions [35] and in real-world challenges [20]. One of the most useful methods for studying such situations is the dual-task paradigm, which involves performing two tasks concurrently, resulting in impaired behavioural performance on one or both tasks. In our experiment, we found that walking was the task that most influenced the perception of a vibrotactile stimulus presented under the foot. This finding is in line with other works, which have reported that the delivery of vibrotactile stimuli to the foot for communicating information is affected under some conditions [4,9,10,28,29]. While most of the participants reported the system to be comfortable, they all reported that the walking condition was much more difficult than the others.

6.2. Influence of Cognitive Tasks on Vibrotactile Stimulus Perception

As pointed out previously, there was a significant difference between vibrotactile stimulus perception at rest (single task) and vibrotactile stimulus perception when the participants were counting forwards and/or counting backwards (dual task). However, the difference between counting forwards and counting backwards (Table 3) had an adjusted p-value of 0.272, which is not significant. This is not surprising, because counting forwards and counting backwards engage the same memory-related cortical resources as cognitive tasks [36]. Thus, on average, there was no difference in perception when the participants were engaged in either cognitive dual task, because the workload remained the same. In other words, the dual tasks, whether motor (walking) or cognitive (counting forwards and counting backwards), can be associated with a comparable workload when delivering a stimulus to the foot. This is in line with other works [12,37], which have found that perception measures can discriminate between normal performance and mental overload. As in our study, the two counting tasks can be expected to at least partially share the same mental resources, since they are both arithmetic tasks [36]. Our results also tend to be in line with the work of Oakley and Park, who found that distraction (dual or multitasking situations) masks the perception of vibrotactile signals and influences the main task [16]. In summary, our results point out the influence of cognitive tasks; one could state that cognitive distractors should be taken into account when designing and conveying information with the foot, as mentioned in [16]. All this reinforces the idea that the haptic channel can be used as a communication channel [38,39].

6.3. Implication of the Results from This Study

The results of this study show poor performance in the perception of vibrotactile messages when walking, in contrast to the cognitive tasks. These results indicate that it is not appropriate to communicate a risk of falling through a haptic message presented under the foot, as suggested in [1,5]. Indeed, when walking, friction is exerted on the receptors of the surface of the foot; this friction can cause perceptual conflicts, especially when the sole has different textures [8,9,10]. As a result, information transmitted through the haptic modality in such conditions will be poorly perceived because of the dual-task situation. A future improvement could therefore rely on a new method that takes these results into account. One approach would be to choose another body location that does not suffer from this perceptual conflict and would offer a better haptic perception rate. To this end, the literature proposes connected jackets [40,41,42,43] or the wrist [16] for communicating non-visual information. Such a method would have a wider range of applications, including spatial orientation and guidance, attention management, haptic guidance, assistance, and better learnability [44,45].

7. Conclusions and Future Works

Following the design and development of a shoe capable of preventing accidental falls linked to balance problems, we studied the use of vibrotactile feedback to communicate a risk of falling. The objective of this study was to investigate the influence of workload on the perception of a vibrotactile stimulus presented under the foot. The apparatus used was a wearable device capable of conveying a vibrotactile stimulus. The experimental protocol included four conditions: one single task and three dual tasks. The measures were performed in the controlled environment of our laboratory. Overall, the results revealed a significant influence of dual tasks on the perception of a vibrotactile stimulus. Specifically, we observed that the dual task involving walking influenced the perception of the vibrotactile stimulus far more than the cognitive tasks (counting forwards or backwards). These findings suggest that dual tasks could bias perception and should be taken into account when conveying vibrotactile stimuli under the foot. As an alternative, we intend to investigate the use of clothing for the communication of haptic information. This avenue is favoured because such a method appears to offer a wider range of applications, including spatial orientation and guidance, and attention management.

Author Contributions

Supervision: B.-A.J.M.; Design, setup, experiment, data analysis: L.D.C.T.; Writing: L.D.C.T. and B.-A.J.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC). Grant numbers from NSERC are 418624-2013 and RGPIN-2019-07169.

Acknowledgments

The authors would like to thank Florian Mazière for the Python algorithm.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Brewster, S.; Chohan, F.; Brown, L. Tactile feedback for mobile interactions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07), San Jose, CA, USA, 28 April–3 May 2007.
2. Lederman, S.J.; Klatzky, R.L. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459.
3. MacLean, K.E. Haptic Interaction Design for Everyday Interfaces. Rev. Hum. Factors Ergon. 2008, 4, 149–194.
4. Tchakouté, L.D.C.; Gagnon, D.; Menelas, B.-A.J. Use of tactons to communicate a risk level through an enactive shoe. J. Multimodal User Interfaces 2018, 12, 41–53.
5. Hoggan, E.; Brewster, S. New parameters for tacton design. In CHI ’07 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’07); Association for Computing Machinery: New York, NY, USA, 2007; pp. 2417–2422.
6. Brewster, S.; Brown, L.M. Tactons: Structured tactile messages for non-visual information display. In Proceedings of the Fifth Conference on Australasian User Interface—Volume 28 (AUIC ’04); Australian Computer Society Inc.: Darlinghurst, Australia, 2004; pp. 15–23.
7. Lévesque, V. Blindness, Technology and Haptics; Center for Intelligent Machines: Montréal, QC, Canada, 2005; pp. 19–21.
8. Otis, M.J.-D.; Ayena, J.C.; Tremblay, L.E.; Fortin, P.E.; Ménélas, B.-A.J. Use of an Enactive Insole for Reducing the Risk of Falling on Different Types of Soil Using Vibrotactile Cueing for the Elderly. PLoS ONE 2016, 11, e0162107.
9. Tchakouté, L.D.C.; Ménélas, B.A.J. Reaction Time to Vibrotactile Messages on Different Types of Soil. In Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Funchal, Portugal, 27–29 January 2018.
10. Tchakouté, L.C.; Tremblay, L.; Menelas, B.-A. Response Time to a Vibrotactile Stimulus Presented on the Foot at Rest and During Walking on Different Surfaces. Sensors 2018, 18, 2088.
11. Engle, R.W. Working Memory Capacity as Executive Attention. Curr. Dir. Psychol. Sci. 2002, 11, 19–23.
12. Novak, D.; Mihelj, M.; Munih, M. Dual-task performance in multimodal human-computer interaction: A psychophysiological perspective. Multimedia Tools Appl. 2012, 56, 553–567.
13. Levitin, D.J. The perception of cross-modal simultaneity (or “the Greenwich Observatory Problem” revisited). In AIP Conference Proceedings; AIP Publishing: Melville, NY, USA, 2000.
14. Chan, A.; MacLean, K.; McGrenere, J. Learning and Identifying Haptic Icons under Workload. In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics Conference, Pisa, Italy, 18–20 March 2005.
15. Boy, G.A. A Human-Centered Design Approach. In The Handbook of Human-Machine Interaction; CRC Press: Boca Raton, FL, USA, 2017; pp. 1–20.
16. Oakley, I.; Park, J. Did you feel something? Distracter tasks and the recognition of vibrotactile cues. Interact. Comput. 2008, 20, 354–363.
17. Tang, A.; McLachlan, P.; Lowe, K.; Saka, C.R.; MacLean, K. Perceiving ordinal data haptically under workload. In Proceedings of the 7th International Conference on Multimodal Interfaces—ICMI ’05, Trento, Italy, 4–6 October 2005.
18. Kaber, D.B.; Zhang, T. Human Factors in Virtual Reality System Design for Mobility and Haptic Task Performance. Rev. Hum. Factors Ergon. 2011, 7, 323–366.
19. Chen, J.Y.C. Concurrent Performance of Military and Robotics Tasks and Effects of Cueing in a Simulated Multi-Tasking Environment. Presence Teleoper. Virtual Environ. 2009, 18, 1–15.
20. Alexander, N.B. Gait Disorders in Older Adults. J. Am. Geriatr. Soc. 1996, 44, 434–451.
21. Woollacott, M.; Shumway-Cook, A. Attention and the control of posture and gait: A review of an emerging area of research. Gait Posture 2002, 16, 1–14.
22. Montero-Odasso, M.; Casas, A.; Hansen, K.T.; Bilski, P.; Gutmanis, I.; Wells, J.L.; Borrie, M.J. Quantitative gait analysis under dual-task in older people with mild cognitive impairment: A reliability study. J. Neuroeng. Rehabil. 2009, 6, 35.
23. Timmermans, C.; Roerdink, M.; Janssen, T.W.J.; Meskers, C.G.M.; Beek, P.J. Dual-Task Walking in Challenging Environments in People with Stroke: Cognitive-Motor Interference and Task Prioritization. Stroke Res. Treat. 2018, 2018, 7928597.
24. Menelas, B.-A.J.; Otis, M.J. Design of a serious game for learning vibrotactile messages. In Proceedings of the 2012 IEEE International Workshop on Haptic Audio Visual Environments and Games (HAVE 2012), Munich, Germany, 8–9 October 2012.
25. Otis, M.J.-D.; Otis, M.J.; Menelas, B.-A.J. Toward an augmented shoe for preventing falls related to physical conditions of the soil. In Proceedings of the 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Korea, 14–17 October 2012.
26. Zanotto, D.; Turchet, L.; Boggs, E.M.; Agrawal, S.K. SoleSound: Towards a novel portable system for audio-tactile underfoot feedback. In Proceedings of the 5th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, São Paulo, Brazil, 12–15 August 2014.
27. Velázquez, R.; Pissaloux, E.E.; Hafez, M.; Szewczyk, J. Toward Low-Cost Highly Portable Tactile Displays with Shape Memory Alloys. Appl. Bionics Biomech. 2007, 4, 57–70.
28. Meier, A.; Matthies, D.J.C.; Urban, B.; Wettach, R. Exploring vibrotactile feedback on the body and foot for the purpose of pedestrian navigation. In Proceedings of the 2nd International Workshop on Sensor-Based Activity Recognition and Interaction—WOAR ’15, Rostock, Germany, 25–26 June 2015.
29. Tchakouté, L.D.C.; Ménélas, B.A.J. Impact of Auditory Distractions on Haptic Messages Presented Under the Foot. In Proceedings of the VISIGRAPP (2: HUCAPP), Funchal, Portugal, 27–29 January 2018; pp. 55–63.
30. Menelas, B.-A.J.; Otis, M.J.D. Toward an Automatic System for Training Balance Control Over Different Types of Soil. In Virtual, Augmented Reality and Serious Games for Healthcare 1; Springer: Berlin, Germany, 2014; pp. 391–408.
31. Yang, T.; Xie, D.; Li, Z.; Zhu, H. Recent advances in wearable tactile sensors: Materials, sensing mechanisms, and device performance. Mater. Sci. Eng. R Rep. 2017, 115, 1–37.
32. Bucks, R.S.; Ashworth, D.L.; Wilcock, G.K.; Siegfried, K. Assessment of activities of daily living in dementia: Development of the Bristol Activities of Daily Living Scale. Age Ageing 1996, 25, 113–120.
33. Lachenbruch, P.A.; Cohen, J. Statistical Power Analysis for the Behavioral Sciences (2nd ed.). J. Am. Stat. Assoc. 1989, 84, 1096.
34. Posner, M.I.; Sandson, J.; Dhawan, M.; Shulman, G.L. Is word recognition automatic? A cognitive-anatomical approach. J. Cogn. Neurosci. 1989, 1, 50–60.
35. Clair-Thompson, H.L.S.; Allen, R.J. Are forward and backward recall the same? A dual-task study of digit recall. Mem. Cogn. 2012, 41, 519–532.
36. Wilson, G.F.; Russell, C.A. Operator functional state classification using multiple psychophysiological features in an air traffic control task. Hum. Factors J. Hum. Factors Ergon. Soc. 2003, 45, 381–389.
37. Chan, A.; MacLean, K.; McGrenere, J. Designing haptic icons to support collaborative turn-taking. Int. J. Hum. Comput. Stud. 2008, 66, 333–355.
38. Menelas, B.-A.J.; Picinali, L.; Bourdot, P.; Katz, B.F.G. Non-visual identification, localization, and selection of entities of interest in a 3D environment. J. Multimodal User Interfaces 2014, 8, 243–256.
39. Ménélas, B.; Picinali, L.; Katz, B.F.; Bourdot, P.; Ammi, M. Haptic audio guidance for target selection in a virtual environment. In Proceedings of the 4th International Haptic and Auditory Interaction Design Workshop (HAID’09), Dresden, Germany, 10–11 September 2009; Volume 2.
40. Hunter, S.W.; Divine, A.; Frengopoulos, C.; Montero-Odasso, M. A framework for secondary cognitive and motor tasks in dual-task gait testing in people with mild cognitive impairment. BMC Geriatr. 2018, 18, 202.
41. Jones, L.A.; Nakamura, M.; Lockyer, B. Development of a tactile vest. In Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS ’04), Chicago, IL, USA, 27–28 March 2004; pp. 82–89.
42. Van Erp, J.; Van Veen, H.A.H.C.; Jansen, C.; Dobbins, T. Waypoint navigation with a vibrotactile waist belt. ACM Trans. Appl. Percept. 2005, 2, 106–117.
43. Karuei, I.; MacLean, K.E.; Foley-Fisher, Z.; MacKenzie, R.; Koch, S.; El-Zohairy, M. Detecting vibrations across the body in mobile contexts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 3267–3276.
44. Chapwouo Tchakoute, L.D. Exploitation of Haptic Renderings to Communicate Risk Levels of Falling. Ph.D. Thesis, Université du Québec à Chicoutimi, Chicoutimi, QC, Canada, 2018.
45. Menelas, B.-A.; Benaoudia, R.S. Use of Haptics to Promote Learning Outcomes in Serious Games. Multimodal Technol. Interact. 2017, 1, 31.
Figure 1. The wearable device worn on the left foot with a strap to hold the haptuator. (I) The device component used for the experiment. The haptuator is located under the arch of the second toe fixed by the black strap. (II) The electronic diagram of the device showing how the components are joined together in order to deliver the vibrotactile stimulus.
Figure 2. Score mean perception of each condition (with mean value inside the bars).
Figure 3. Overall score mean perception of single task compared to dual task (with mean value inside the bars).
Figure 4. Box plot of score perception between conditions: at rest; counting forwards (CF); counting backwards (CB); and walking.
Figure 5. Normality test results: residual plots of conditions at rest, counting forward (CF), counting backward (CB), and walking.
Table 1. Participants’ characteristics.

Participants    Value
Age (Y)         26.45 ± 4.45 *
Height (cm)     162.95 ± 29.42 *
Weight (kg)     78.98 ± 33.26 *
Gender          Men (n = 14) | Women (n = 14)

* Values represented as mean ± standard deviation (SD).
Table 2. Experiment protocol summary.

Sessions        Conditions               Phase                   Distractions         Positioning
Control         1: At rest               Familiarization phase   None                 Static: At rest
                                         Test phase              None                 Static: At rest
Experimental    2: Counting forwards     Familiarization phase   Counting forwards    At rest
                                         Test phase              Counting forwards    At rest
                3: Counting backwards    Familiarization phase   Counting backwards   At rest
                                         Test phase              Counting backwards   At rest
                4: Walking               Familiarization phase   Walking              Moving
                                         Test phase              Walking              Moving
Table 3. Tukey simultaneous tests for differences of means.

Difference of Levels   Difference of Means   SE of Difference   95% CI              T-Value   Adjusted p-Value
CF – At rest           −3.107                0.550              (−4.541; −1.673)    −5.65     0.000
CB – At rest           −3.714                0.550              (−5.148; −2.280)    −6.76     0.000
Walking – At rest      −4.357                0.550              (−5.791; −2.923)    −7.93     0.000
CB – CF                −0.607                0.550              (−2.041; 0.827)     −1.10     0.687
Walking – CF           −1.250                0.550              (−2.684; 0.184)     −2.27     0.110
Walking – CB           −0.643                0.550              (−2.077; 0.791)     −1.17     0.647
