Article

Advancing Mobile Neuroscience: A Novel Wearable Backpack for Multi-Sensor Research in Urban Environments

1 Institute of Physiology, Lisbon School of Medicine, University of Lisbon, 1649-028 Lisbon, Portugal
2 Associate Laboratory TERRA, Centre of Geographical Studies, Institute of Geography and Spatial Planning, University of Lisbon, 1649-004 Lisbon, Portugal
3 NeuroGEARS Ltd., London NW1 7EA, UK
4 NeuroGEARS Portugal Lda, 1600-514 Lisbon, Portugal
5 Climateflux GmbH, 80337 Munich, Germany
* Author to whom correspondence should be addressed.
Sensors 2025, 25(23), 7163; https://doi.org/10.3390/s25237163
Submission received: 29 September 2025 / Revised: 4 November 2025 / Accepted: 18 November 2025 / Published: 24 November 2025

Abstract

Rapid global urbanization has intensified the demand for sensing solutions that can capture the complex interactions between urban environments and human physical and mental health. Conventional laboratory-based approaches, while offering high experimental control, often lack ecological validity and fail to represent real-world exposures. To address this gap, we present the eMOTIONAL Cities Walker, a portable multimodal sensing platform designed as a wearable backpack unit for the synchronous collection of multimodal data in both indoor and outdoor settings. The system integrates a suite of environmental sensors (covering microclimate, air pollution and acoustic monitoring) with physiological sensing technologies, including electroencephalography (EEG), mobile eye-tracking and wrist-based physiological monitoring. This configuration enables real-time acquisition of environmental and physiological signals in dynamic, naturalistic settings. Here, we describe the system’s technical architecture, sensor specifications, and field deployment across selected Lisbon locations, demonstrating its feasibility and robustness in urban environments. By bridging controlled laboratory paradigms with ecologically valid real-world sensing, this platform provides a novel tool to advance translational research at the intersection of sensor technology, human experience, and urban health.

1. Introduction

The 21st century has been described as the Century of the City, reflecting the unprecedented demographic shift toward urban living [1]. Global population growth is projected to continue for several decades, reaching approximately 10.3 billion by 2080, with the majority concentrated in urban areas [2]. While urban development has created opportunities and improved quality of life, it also generates challenges that negatively affect physical and mental health [3]. Urban residents, for example, are more likely to experience anxiety and depression compared to rural populations [4]. At the same time, exposure to restorative environments such as green or blue spaces has been shown to promote psychological recovery [5,6,7,8]. Yet the physiological and neural mechanisms underlying these associations remain poorly understood, as only a limited number of studies have examined brain and body responses that link environmental exposures to health outcomes [9].
Traditional research on human–environment interactions has relied heavily on stationary participants in controlled laboratory settings. While these methods offer strong experimental control, they often lack ecological validity. Recent advances in mobile neuroscience technologies now permit the recording of brain activity during freely moving behaviour [10], opening new possibilities for studying human experience in real-world urban contexts. The emerging fields of Neurourbanism and Environmental Neuroscience embrace an interdisciplinary framework, integrating robust neuroscientific methods with environmental assessment to achieve a more objective and comprehensive understanding of how built and natural urban features influence subjective experience, behaviour, and physical and mental well-being [11,12].
Within this framework, wearable and mobile sensors play a central role, as they enable the direct measurement of brain and body physiological responses in naturalistic conditions. Mobile neuroimaging has emerged as a prominent set of tools for investigating cognitive and affective neural processes beyond the confines of the laboratory, and particularly within city environments [13]. By combining such approaches with environmental sensing, researchers can obtain richer insights into the interplay between urban context and mental health, with implications for evidence-based urban planning and public health strategies [14,15,16].
Achieving high ecological validity in contemporary research requires capturing human responses during daily interactions in complex and dynamic urban environments [17]. This, however, presents methodological challenges. Multisensory and naturalistic paradigms, such as outdoor walks, must be adapted to ensure validity and comparability with laboratory-based protocols [18,19]. Furthermore, it remains technically demanding to synchronously measure environmental exposures (e.g., air quality, noise, microclimate, visual stimuli) alongside physiological responses (e.g., brain, cardiovascular, or autonomic signals) in real-world conditions.

1.1. Wearable Sensors and Outdoor Studies

Modelling and simulation approaches to urban comfort often underestimate the human dimension [20]. Human behaviour is strongly shaped by environmental context, and people in urban settings tend to act in ways that optimize comfort [21]. Furthermore, various features of the environment (e.g., the urban microclimate) that shape our comfort and well-being are dynamic, varying with the season [22] or time of day [23]. Hence, mobile and wearable sensors are becoming essential tools for assessing how microscale environmental conditions affect human behaviour [24], with a wide range of devices now available to monitor thermal, acoustic, or air quality parameters [25,26,27]. However, these environmental sensors are rarely integrated with physiological measures [27,28]. Conversely, many health-oriented studies still rely primarily on subjective assessments such as questionnaires, and only a minority incorporate portable biosensors [29]. Existing work has often focused on cardiovascular and respiratory outcomes of air quality exposure, while other environmental factors such as noise and temperature remain underexplored [28,29]. Some attempts have been made to integrate environmental and physiological sensing, such as combining meteorological monitoring with skin temperature recordings to assess thermal comfort [25], but these remain limited in scope.
Crucially, assessing mental states in urban contexts requires brain imaging methods, with electroencephalography (EEG) and, to a lesser extent, functional near-infrared spectroscopy (fNIRS) emerging as the most suitable techniques for outdoor use [11,13]. EEG is generally favoured due to its millisecond temporal resolution and portability, which make it well suited for mobile applications; moreover, it is a well-established and extensively validated neuroimaging technique [30,31]. The emergence of consumer-grade EEG devices has further expanded its use beyond laboratory settings, particularly in brain–computer interface research [19,32]. This availability has stimulated studies examining the performance of mobile EEG in moving participants, including analyses of signal quality and artifact management [33,34,35]. Some work has attempted to link mobile EEG recordings to environmental exposures—for example, treadmill-based paradigms [36] or outdoor walks with wireless EEG systems [37,38]. However, many of these studies rely on low-density EEG systems or setups vulnerable to motion artifacts, limiting their robustness in naturalistic conditions.
Growing interest in ecological validity has driven progress in mobile cognitive neuroscience, in parallel with advances in wearable recording technologies [16,18]. Yet significant challenges remain as urban environments are inherently complex and dynamic—requiring simultaneous and synchronized measurement of multiple influencing variables. Current studies rarely achieve this level of integration, which constrains their capacity to fully characterize human–environment interactions.
To address this gap, we developed the eMOTIONAL Cities Walker, a wearable, reconfigurable backpack system capable of synchronously acquiring both physiological responses and environmental exposures during real-world urban walks.

1.2. Scope and Objectives

This study was conducted within the framework of the eMOTIONAL Cities project (https://emotionalcities-h2020.eu/), a European Union Horizon 2020 Research and Innovation Action that investigates how natural and built urban environments shape human cognition and emotion through their neurobiological underpinnings. The project integrates advanced neuroscience methodologies, including neuroimaging and physiological monitoring with environmental sensing to generate robust evidence on the links between urban characteristics and individual well-being.
In this paper, we present the methodological foundations for conducting neuroscience research with freely moving participants in real-world urban environments. Specifically, we introduce a new multimodal wearable experimental platform that combines multiple environmental sensors and biosensors, including electroencephalography (EEG) for mental state assessment, enabling both indoor and outdoor studies in naturalistic conditions. We also describe the implementation of a flexible experimental framework designed to ensure comparability, reliability, and reproducibility of data across laboratory and field paradigms.
By bridging laboratory-based and field-based approaches, this work contributes to the emerging domains of Environmental Neuroscience and Neurourbanism by providing validated tools to capture the complexity of human experiences in urban spaces. Ultimately, the proposed framework supports the development of evidence-based strategies for urban planning and design aimed at promoting mental health and well-being.
The wearable data collection unit, termed the eMOTIONAL Cities Walker, is an ergonomically designed backpack apparatus developed to acquire multimodal data streams from both the user and the surrounding environment (Figure 1). At its core, the system employs a mobility-focused computer, the HP® VR Backpack PC (HP Inc., Palo Alto, CA, USA), optimized for wearable use. The backpack’s design allows participants to walk naturally in outdoor environments while carrying the system comfortably. When fully equipped, the backpack weighed approximately 9 kg, with an adjustable height ranging from 86 cm to 118 cm, a depth between 23 cm and 27 cm, and a width of 28 cm. Dual hot-swappable external batteries ensure uninterrupted field operation for up to two hours. Its compact design, wireless connectivity, and portability make the system suitable for unobtrusive outdoor use with multiple biosensors and environmental sensors. To support continuous experimental oversight, a touchscreen interface was integrated, enabling real-time monitoring and interaction between the researcher and the system.
The eMOTIONAL Cities Walker comprises two main components—human-centred sensors and environmental sensors, supported by integration modules. These sensor systems were selected to work in a complementary manner, enabling the collection of data on human–environment interactions across multiple levels.
Human-centred sensors: the EEG was included to measure neural activity underlying cognitive functions (e.g., attention) and context-dependent mental states, including signals often associated with stress and restoration [17,31]. Eye-tracking provides geotagged, egocentric video data, enabling identification of environmental features attracting visual attention [39,40]. Combined with computer vision algorithms, eye-tracking facilitates object and landmark recognition in real-world settings [41]. Peripheral cardiovascular and autonomic sensors included electrocardiography (ECG), blood volume pulse (BVP), and galvanic skin response (GSR), providing indices of arousal and stress levels also used in other studies [42,43,44,45]. In addition, microphones were included to record ambient noise, a well-established environmental stressor.
Environmental sensors: these enable the estimation of thermal comfort metrics, such as the universal thermal climate index (UTCI) [46]. This index accounts for variables including air temperature, mean radiant temperature (via black globe temperature), relative humidity, and wind speed:
UTCI = f(T_a, T_g, RH, v)
where T_a = air temperature, T_g = globe temperature, RH = relative humidity, and v = wind speed. Additionally, real-time monitoring of particulate matter (PM) concentrations permits assessment of air quality and pollution exposure [47].
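For illustration, the sketch below (not part of the authors’ pipeline) shows how such weather-station readings could feed the calculation: mean radiant temperature is first derived from the black globe reading using the standard forced-convection globe formula, and the resulting inputs can then be passed to a UTCI implementation such as the pythermalcomfort package. The globe diameter, emissivity, and sample values are assumptions.

```python
def mean_radiant_temperature(t_air, t_globe, wind_speed,
                             globe_diameter=0.15, emissivity=0.95):
    """Estimate mean radiant temperature (deg C) from black globe readings.

    Standard forced-convection globe formula (ISO 7726); the globe diameter
    and emissivity defaults are assumptions for a typical 150 mm black globe,
    not values reported in the paper.
    """
    t_mrt_k4 = (t_globe + 273.15) ** 4 + (
        1.1e8 * wind_speed ** 0.6 / (emissivity * globe_diameter ** 0.4)
    ) * (t_globe - t_air)
    return t_mrt_k4 ** 0.25 - 273.15


# Example: one sample of weather-station readings (hypothetical values).
t_air, t_globe, rh, v = 24.0, 31.0, 45.0, 1.2
t_mrt = mean_radiant_temperature(t_air, t_globe, v)

# UTCI itself is a polynomial approximation of f(Ta, Tmrt, RH, v); rather than
# reproducing it here, one option is the pythermalcomfort package:
#   from pythermalcomfort.models import utci
#   utci_value = utci(tdb=t_air, tr=t_mrt, v=v, rh=rh)
print(f"Mean radiant temperature: {t_mrt:.1f} degC")
```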
The various sensor types were selected based on their prior validation and demonstrated utility across a range of research questions in urban and environmental neuroscience studies (Table 1); portability, robustness, and compatibility were also prioritized for integration into a mobile platform. High sampling rates across modalities ensure the capture of both transient and sustained responses, facilitating advanced multimodal analysis of human experience in urban contexts.

2. System Concept and Design

2.1. Sensors for Human Physiology and Behavioural Signals

2.1.1. Electroencephalogram (EEG)

EEG signals were acquired using a 32-channel Enobio® system (Neuroelectrics®, Barcelona, Spain). The setup involved fitting a neoprene cap with dry electrodes positioned according to the international 10–20 system. Electrode cable sets were connected to a lightweight amplifier containing a rechargeable internal battery. The amplifier was secured with a hook-and-loop fastener and linked to the backpack computer via USB. Although the backpack can accommodate different EEG systems, outdoor use is constrained to portable, low-weight amplifiers. Dry electrodes were selected to enable rapid setup, despite their lower signal quality compared to wet electrodes [19,32]; however, the system also supports gel-based electrodes if required. EEG signals were streamed in real time to the Neuroelectrics® Instrument Controller (NIC2) software (v2.0.11). Event markers were transmitted through the Lab Streaming Layer (LSL) to synchronize EEG with other sensor modalities [51].
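As an illustration of the LSL marker mechanism, the minimal Python sketch below pushes string event markers to an LSL outlet that an LSL-aware recorder such as NIC2 can store alongside the EEG samples. The stream name, source ID, and marker labels are hypothetical; in the actual system, markers were generated from the Bonsai workflow rather than from a standalone Python script.

```python
from pylsl import StreamInfo, StreamOutlet

# Marker stream with irregular rate (nominal_srate=0) and string payloads.
# The stream name and source_id are illustrative placeholders.
info = StreamInfo(name="WalkerMarkers", type="Markers", channel_count=1,
                  nominal_srate=0, channel_format="string",
                  source_id="walker-markers-01")
outlet = StreamOutlet(info)

# Any LSL-aware recorder can subscribe to this outlet and store the markers
# with LSL timestamps for offline alignment with the EEG stream.
outlet.push_sample(["checkpoint_1_stop"])
outlet.push_sample(["checkpoint_1_walk"])
```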

2.1.2. Peripheral Physiological Signals

Peripheral physiology was monitored using the E4 wristband (Empatica Inc., Boston, MA, USA). The device, worn on the participant’s non-dominant wrist to reduce motion artifacts, contains an internal rechargeable battery. The wristband integrates a photoplethysmography (PPG) sensor providing blood volume pulse (BVP), inter-beat interval (IBI), and heart rate (HR), alongside a three-axis accelerometer, skin temperature sensor, and electrodermal activity sensor. Data were transmitted wirelessly via Bluetooth to the E4 Streaming Server application and subsequently integrated into the Bonsai dataflow framework.
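For readers who wish to consume the same wristband streams outside Bonsai, a minimal client for the E4 Streaming Server’s TCP interface is sketched below. The default port, placeholder device ID, and command strings are assumptions based on Empatica’s published streaming-server protocol, not the configuration used in this study.

```python
import socket

HOST, PORT = "127.0.0.1", 28000      # default E4 Streaming Server address (assumption)
DEVICE_ID = "XXXXXX"                 # placeholder for the wristband's device ID

def send(sock, command):
    """Send one protocol command terminated by CRLF."""
    sock.sendall((command + "\r\n").encode())

with socket.create_connection((HOST, PORT)) as sock:
    send(sock, f"device_connect {DEVICE_ID}")
    send(sock, "device_subscribe gsr ON")   # electrodermal activity
    send(sock, "device_subscribe bvp ON")   # blood volume pulse
    buffer = b""
    while True:                              # sketch: runs until interrupted
        buffer += sock.recv(4096)
        *lines, buffer = buffer.split(b"\r\n")
        for line in lines:
            # Data lines look like: "E4_Gsr <timestamp> <value>"
            print(line.decode())
```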

2.1.3. Electrocardiogram (ECG)

The ECG recordings were obtained using the SparkFun® AD8232 board (SparkFun Electronics, Niwot, CO, USA). Disposable gel electrodes (Ambu® Blue Sensor®) were positioned on the torso following a three-lead configuration: right abdomen (red), right chest (white), and left chest (black). The electrodes were connected to the board with a 3.5 mm jack cable. This sensor provided more reliable cardiac signals than the wrist-worn PPG device, which often yielded missing data in the presence of arm movement or loose wristband placement. The AD8232 board was placed in the backpack’s electronics compartment and connected via USB to the processing unit.

2.1.4. Eye-Tracker

Eye movements were captured with Pupil Invisible® glasses (Pupil Labs GmbH, Berlin, Germany). After fitting, calibration was performed using a dedicated smartphone running the Pupil Invisible app, enabling rapid deployment. The glasses contain two inward-facing eye cameras and an outward-facing scene camera that recorded the participant’s visual field. The smartphone, carried in the participant’s pocket or strapped to the backpack, connected to the local Wi-Fi network to wirelessly transmit data to the backpack computer. Recorded variables included gaze position, blink events, fixation metrics, and synchronized video from the scene camera.

2.1.5. Nine-Axis Inertial Measurement Unit (IMU)

Movement and orientation were measured using the BNO055 intelligent nine-axis absolute orientation sensor (Bosch Sensortec GmbH). The module integrates a 14-bit triaxial accelerometer, a 16-bit triaxial gyroscope, a triaxial geomagnetic sensor, and a 32-bit ARM Cortex-M0+ microcontroller running onboard sensor fusion algorithms. This device captured accelerations, heading changes, and stationary periods with higher temporal precision than Global Navigation Satellite System (GNSS) data. The sensor was directly interfaced with the backpack computer.
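The snippet below is a generic, host-side illustration of the orientation and motion quantities such a sensor exposes, assuming the Adafruit CircuitPython BNO055 driver over I2C; it is a sketch only and not the hardware-triggered, Harp-timestamped acquisition path used in the backpack.

```python
import board
import adafruit_bno055

# Generic I2C host connection (assumption: Adafruit Blinka/CircuitPython host,
# not the Walker's motherboard interface).
i2c = board.I2C()
sensor = adafruit_bno055.BNO055_I2C(i2c)

print("Euler angles (deg):", sensor.euler)                # heading, roll, pitch
print("Linear acceleration (m/s^2):", sensor.linear_acceleration)
print("Quaternion:", sensor.quaternion)                   # fused orientation
print("Calibration status:", sensor.calibration_status)   # sys, gyro, accel, mag
```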

2.1.6. Microphone

Binaural audio was recorded using OKM II Solo microphones (Soundman®, Berlin, Germany). Each microphone earpiece was inserted through the openings of the EEG cap to ensure stability and participant comfort.

2.2. Environmental Sensors

Environmental variables were recorded using sensors from Tinkerforge (Tinkerforge GmbH, Schloß Holte-Stukenbrock, Germany) and METER Group (Pullman, WA, USA). The Tinkerforge modules included the Humidity Bricklet (relative humidity), Thermocouple Bricklet (black globe temperature), Particulate Matter Bricklet (airborne particles), Sound Pressure Level Bricklet (acoustic intensity), Industrial Dual 0–20 mA Bricklet (irradiance), and Air Quality Bricklet (air pressure and air quality index). Complementary measurements of air temperature, wind velocity, and wind direction were obtained with the ATMOS 22 ultrasonic anemometer (METER Group), thereby enabling the characterization of microclimatic conditions during outdoor experiments. Together, these environmental sensors constitute the backpack’s mobile weather station module, whose specifications and data quality were assessed and compared with other mobile weather stations in a separate study [52].
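As a minimal illustration of how such modules can be read programmatically, the sketch below polls a Humidity Bricklet through the standard Tinkerforge Python bindings. The host, port, bricklet UID, and bricklet generation (2.0) are assumptions; in the backpack, these modules are instead routed through Bonsai with Harp timestamping.

```python
from tinkerforge.ip_connection import IPConnection
from tinkerforge.bricklet_humidity_v2 import BrickletHumidityV2

HOST, PORT = "localhost", 4223   # default brickd address (assumption)
UID = "XYZ"                      # placeholder UID of the Humidity Bricklet

ipcon = IPConnection()
ipcon.connect(HOST, PORT)

humidity = BrickletHumidityV2(UID, ipcon)
# get_humidity() reports relative humidity in units of 1/100 %RH.
print(f"Relative humidity: {humidity.get_humidity() / 100:.1f} %RH")

ipcon.disconnect()
```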

2.3. Spatiotemporal Sensor Synchronization

Temporal synchronization of multimodal data streams was managed by Pluma, a Hardware Research Platform (Harp) compatible device. Harp is a standard for self-synchronizing hardware widely adopted in neuroscience for asynchronous real-time data acquisition and experimental control [53,54,55]. Synchronization between non-Harp-compatible devices was performed via Harp-timestamped TTL trigger pulses delivered by Pluma to all devices, both at the start and periodically throughout the experiment, to minimize desynchronization and timing drift. The temporal jitter between periodic random interval pulses is used as a reference for subsequent alignment of the data streams. All temporal alignment and synchronization code is available at https://github.com/emotional-cities/pluma-analysis (accessed on 17 November 2025). Spatial positioning was obtained via the ZED-F9R high-precision dead-reckoning GNSS module (u-blox, Thalwil, Switzerland). Each GNSS position sample was paired with a high precision Coordinated Universal Time (UTC) timestamp, which provided a common temporal reference. This timestamp was subsequently used to align and contextualize physiological, behavioral, and environmental data, ensuring accurate spatiotemporal integration across all sensor modalities (Table 2). Synchronization quality was assessed for all collected datasets as part of the quality control procedure described and illustrated in Figure S1.
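One common way to realize this alignment offline is to fit a linear clock model between each device’s local timestamps of the shared TTL pulses and the Harp timestamps of the same pulses, matched by their random inter-pulse intervals. The sketch below illustrates the idea with hypothetical pulse times; the actual procedure implemented in pluma-analysis may differ in detail.

```python
import numpy as np

def fit_clock_model(device_pulse_times, harp_pulse_times):
    """Least-squares linear map from a device's local clock to Harp time.

    Both inputs are timestamps of the *same* TTL pulses as seen by the device
    and by the Harp controller; matching by pulse order relies on the random
    inter-pulse intervals being preserved in both records.
    """
    slope, intercept = np.polyfit(device_pulse_times, harp_pulse_times, 1)
    return slope, intercept

def to_harp_time(device_times, slope, intercept):
    """Map arbitrary device-clock samples onto the Harp timeline."""
    return slope * np.asarray(device_times) + intercept

# Hypothetical pulse times (seconds) with drift and offset between the clocks.
device_pulses = np.array([1.002, 3.517, 7.281, 12.044])
harp_pulses = np.array([101.000, 103.515, 107.280, 112.041])
m, b = fit_clock_model(device_pulses, harp_pulses)
print(to_harp_time([5.0, 10.0], m, b))   # device samples on the Harp timeline
```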

2.4. Integration Software

To ensure spatiotemporal synchronization of multimodal data, all acquisition processes were integrated within a unified framework based on Bonsai, an open-source visual programming language for reactive data streams [56]. Standard interfaces were used to access each sensor, allowing seamless data routing and management. An overview of the integrated sensors and their communication protocols with the Bonsai software (v2.8.1) is presented in Figure 2. All experimental acquisition and control code, including descriptions and technical details of all system streams are available at https://github.com/emotional-cities/walker-experiments (accessed on 17 November 2025).
Bonsai generated random timing events, which were converted by the eMOTIONAL Cities Walker’s motherboard into Harp timestamps. These timestamps were routed as transistor–transistor logic (TTL) pulses to synchronize sensors capable of hardware-level alignment, including Tinkerforge modules, the Enobio® EEG system, and the u-blox GNSS. Sampling of the nine-axis IMU was also hardware-triggered and Harp-timestamped by the motherboard.
This dual approach—hardware TTL pulses for compatible devices and software timestamping for others—ensured reliable temporal alignment across all data streams. Random timing events coupled with Harp timestamps enabled accurate reconstruction of temporal order, even in cases of missing data or dropped events. Ultimately, all sensor signals were synchronized to GNSS-derived UTC timestamps, providing a high-precision spatiotemporal reference for multimodal data integration.

3. Experimental Setting for Testing and Results

As a proof of principle, this section describes an experimental task in which the eMOTIONAL Cities Walker was deployed to collect multimodal data along predefined urban paths in Lisbon, Portugal.

3.1. Participants

Participants were recruited through convenience sampling using informal outreach strategies, including personal and professional networks (e.g., colleagues, friends, and acquaintances). Although this approach may have introduced some degree of selection bias, it was considered appropriate for this study’s aims and is not expected to have substantially influenced the present description. Eligibility criteria required participants to be ≥18 years of age, residing in Lisbon, fluent in Portuguese or English, and with no history of major psychiatric or neurological disorders. Volunteers were invited to participate in both indoor and outdoor experiments, although only three completed both. Each participant could repeat the task along different routes, with a maximum of three paths per individual. In total, 43 participants contributed to 60 walks (10 acquisitions per path). Participants received a voucher as compensation for their time. All subjects provided written informed consent, and the study protocol was approved by the Ethics Committee of the Institute of Geography and Spatial Planning, University of Lisbon (IGOT) (reference: ETHIC-07/2022).

3.2. Experimental Procedures and Stimuli

Participants were randomly assigned to one of six predefined paths, each corresponding to distinct urban environments in Lisbon. These paths were selected in a stakeholder workshop that brought together professionals from multiple domains to identify city areas capturing a wide range of urban characteristics [57].
Prior to the walk, participants were fitted with the multimodal backpack apparatus (see Section 2). Instructions emphasized that participants should behave as naturally as possible, abstracting from the experimental context to simulate an everyday walk. Each path was approximately 1 km in length. A researcher followed the participant at a discreet distance, providing verbal guidance if necessary.
Along each route, a few checkpoints were defined. These locations were also filmed for subsequent use in a complementary laboratory-based EEG experiment, designed to validate data across paradigms (full analysis reported separately). At each checkpoint, participants completed three tasks sequentially: (i) observing the surrounding environment while standing still; (ii) walking continuously for 20 s; and (iii) answering short questions to assess perceived naturalness, crowdedness, and subjective feelings of valence and arousal (Figure 3).

3.3. Sensor Reliability

The eMOTIONAL Cities Walker backpack proved overall reliable for outdoor multimodal data acquisition. Setup time was approximately 10 min per participant, which is efficient given the number and complexity of sensors. Participants across age groups completed the approximately 25 min walking protocol with minimal difficulty, highlighting the ergonomic suitability of the system.
Nevertheless, sensor data acquisition varied throughout the 60 sessions (Figure 4B). While most sensors functioned consistently, failures occasionally occurred due to: (i) mechanical wear of connectors with repeated use; (ii) cable strain and spurious disconnections during movement; and (iii) signal interference in outdoor settings. In practice, several hardware-related issues were encountered during data collection. The Empatica® E4 wristband occasionally suffered from Bluetooth connectivity interruptions. The Pupil Labs® Invisible eye-tracking glasses showed susceptibility to USB-C connectivity failures after extensive use, particularly at the connection port embedded in the glasses, and occasional data loss due to temporary Wi-Fi disconnections. The Enobio® EEG system experienced similar mechanical wear at the connection interface between the amplifier and the backpack unit. Missing ECG data primarily resulted from suboptimal electrode placement leading to detachment during walking or, less frequently, from incorrect port connections. The causes of failure in the remaining sensors were more difficult to identify, as these devices operate within sealed internal modules whose connectivity is not directly accessible or handled by the researcher on site.
Whenever a sensor lost one or more samples during a given path, its corresponding data stream was labelled as incomplete and as complete otherwise. If a sensor failed to record any data throughout the walk (e.g., due to hardware malfunction, communication failure, or data corruption—see also example Figure 4A), it was labelled as absent (Figure 4B). Percent mean uptime was calculated for each sensor across all recording sessions to provide a fine-grained quantitative assessment of effective data acquisition. The environmental sensors within the Tinkerforge module achieved the highest reliability, with mean uptimes exceeding 99%. These were followed by the Empatica® E4 wristband’s skin temperature and EDA streams, accelerometer data, and ECG signals, all showing mean uptimes of approximately 95%. The u-blox GPS module (elevation, longitude, and latitude) and the audio recordings each presented mean uptimes of around 84%—the reduction in the former primarily attributable to signal loss in underground areas or in locations surrounded by tall buildings. The Neuroelectrics® Enobio EEG system exhibited a lower mean uptime of 74%, while the Empatica® E4 heart rate stream showed the lowest value, with a mean uptime of about 16%.
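A simple way to compute such uptime figures, shown below as an illustrative sketch rather than the exact quality-control code, is to compare the number of received samples against the number expected from each sensor’s nominal sampling rate over the session duration; the sensor names, rates, and counts in the example are hypothetical.

```python
import pandas as pd

def classify_and_uptime(n_received, duration_s, nominal_rate_hz):
    """Per-sensor uptime (%) and completeness label for one session.

    Illustrative only: assumes uptime can be approximated by the ratio of
    received to expected samples at the sensor's nominal rate.
    """
    expected = duration_s * nominal_rate_hz
    uptime = 100.0 * min(n_received / expected, 1.0) if expected else 0.0
    if n_received == 0:
        label = "absent"
    elif uptime < 100.0:
        label = "incomplete"
    else:
        label = "complete"
    return uptime, label

# Hypothetical session: 25 min walk; nominal rates (Hz) and received counts.
duration = 25 * 60
sensors = {"EEG": (500, 740_000), "EDA": (4, 5_950), "GPS": (1, 1_260)}
rows = [
    {"sensor": name, "uptime_%": round(u, 1), "status": s}
    for name, (rate, received) in sensors.items()
    for u, s in [classify_and_uptime(received, duration, rate)]
]
print(pd.DataFrame(rows))
```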

3.4. Output Data Structure

For each acquisition, raw sensor outputs were stored in a dedicated folder. Electroencephalography (EEG) recordings were saved separately by the NIC software and subsequently integrated into the session dataset. All data streams were then consolidated into a spatiotemporal index using a Python-based (v3.11) pipeline (available at https://github.com/emotional-cities/notebooks, accessed on 17 November 2025), which enabled joint analysis of multimodal data.
This framework allows environmental, physiological, and behavioural signals to be plotted alongside Global Positioning System (GPS) coordinates (Figure 5). Since GPS trajectories were consistent across sessions for each predefined path, the spatiotemporal structure facilitates direct comparative analyses both between participants and across sessions.
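As a hedged sketch of this spatiotemporal indexing step (the published notebooks may differ in detail), the example below resamples one physiological stream to 1 Hz and attaches the nearest GPS fix to each sample by UTC timestamp using pandas; all values are hypothetical.

```python
import pandas as pd

# Minimal stand-in for the spatiotemporal indexing step: each stream is
# resampled to a common rate and joined to the nearest GPS fix by UTC time.
gps = pd.DataFrame({
    "utc": pd.to_datetime(["2024-05-01 10:00:00", "2024-05-01 10:00:01",
                           "2024-05-01 10:00:02"]),
    "lat": [38.7223, 38.7224, 38.7225],
    "lon": [-9.1393, -9.1392, -9.1391],
})
eda = pd.DataFrame({
    "utc": pd.to_datetime(["2024-05-01 10:00:00.25", "2024-05-01 10:00:01.25",
                           "2024-05-01 10:00:02.25"]),
    "eda_microsiemens": [0.41, 0.43, 0.47],
})

# Resample EDA to 1 Hz and attach the nearest GPS coordinate to each sample.
eda_1hz = eda.set_index("utc").resample("1s").mean().reset_index()
geotagged = pd.merge_asof(eda_1hz.sort_values("utc"), gps.sort_values("utc"),
                          on="utc", direction="nearest")
print(geotagged)
```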

4. Discussion

We have described a novel wearable data collection unit that seamlessly integrates environmental and physiological sensors within a unified, synchronized, and portable system. Its innovation lies not only in the combined acquisition of biosensing and environmental data, enabling real-world investigations at the interface between urban environments, brain function, and human well-being, but also in the precise temporal synchronization achieved through a central computational unit implementing Harp-based alignment, ensuring millisecond-level correspondence across heterogeneous data streams.
One of the main advantages of equipping a wearable backpack with a powerful computer is the ability to acquire multimodal data through wired connections, which reduces the risk of data loss often associated with wireless communication (e.g., via Bluetooth). The apparatus is also highly modular and built around open-source hardware and software, granting researchers a level of flexibility rarely available in outdoor EEG studies within the field of Neurourbanism [58]. A particularly valuable feature is the use of Harp devices, which enable synchronization across sensors operating at different sampling rates—a longstanding challenge in multimodal integration [59]. With two fully charged batteries, the system supports approximately two hours of continuous recording, assuming all individual devices with internal batteries begin fully charged. In addition, the backpack’s computer itself possessed an internal battery, which extended acquisition time. In practice, the battery autonomy of individual sensors, such as the EEG amplifier, was usually the limiting factor for acquisition time. As such, the choice of external hardware needs to be considered when assessing the system’s overall autonomy. Furthermore, the backpack can be adapted for virtual reality tasks: the climate station may be removed to increase freedom of movement, while the system’s broad port selection allows compatibility with external devices such as VR headsets or GPU-accelerated computers for demanding VR modelling tasks.
The use of EEG is particularly valuable as it provides direct access to neural activity with millisecond temporal precision, offering insights into brain processes that underpin perception, cognition, and emotion. This makes EEG uniquely positioned to bridge neuroscience and urban environmental research, while also informing our understanding of mental health in real-world contexts. According to recent recommendations for the use of mobile EEG in outdoor settings [58], the Enobio® 32-channel system offers several advantages: a lightweight amplifier, a cap with standard electrode positioning, and seamless integration with the LSL protocol. However, it also presents notable limitations: a lower channel count compared with the 64 channels often recommended for high-quality mobile EEG; occasional cable sway during walking; and mastoid reference electrodes that were prone to displacement. For this task, dry electrodes were preferred over gel electrodes to speed up preparation and increase participant convenience; although data quality was known to be substantially reduced, this provided an opportunity to assess the applicability of dry electrodes during outdoor walks. Nonetheless, the 32-channel Enobio® system also supports gel electrodes. Importantly, the modularity of the eMOTIONAL Cities Walker makes it sensor-agnostic, allowing straightforward integration of alternative EEG systems without constraining the user to any particular device (see [60,61] for a review of other devices).
Beyond EEG, the platform is designed to incorporate a comprehensive suite of multimodal body sensors, enabling a richer characterization of human–environment interactions. These include eye-tracking devices, which provide precise measures of visual attention; physiological sensors that capture cardiovascular metrics, electrodermal activity, and ECG signals to assess autonomic arousal and stress responses; and movement-tracking sensors that quantify accelerations, heading changes, and stationary periods with high temporal precision. Together, these complementary data streams enrich EEG-derived neural dynamics and support efforts to understand how cognitive, emotional, and behavioural states unfold in real-world environments [17,18]. This multimodal approach substantially strengthens ecological validity and opens new pathways for nuanced interpretations of how urban settings impact both mental and physiological well-being. It should be noted, however, that a thorough understanding of the interaction between environmental features and EEG data is not specifically addressed in this paper (as it will be the focus of future works within our group).
Although the outdoor paradigm aims to simulate a natural walk, participants are inevitably affected by various sources of discomfort, including the backpack’s weight, the sensation of EEG electrodes, and the social exposure in public spaces where photos may be taken. Nonetheless, a dropout rate of 0% was achieved across all 60 sessions. In the qualitative analysis of post-walk interviews, 16 participants mentioned the backpack in their responses, with 14 of these references relating to perceived discomfort associated with its volume or visibility (i.e., drawing attention in public spaces), rather than to any significant physical strain or pain. Moreover, most sensors are sensitive to movement, leading to degraded signal quality. The EEG signal is particularly susceptible to motion artifacts, with signal distortions that often exceed the amplitude of actual brain activity. Mitigation strategies include instructing participants to maintain a steady gait, using active electrodes (gel-based or dry electrodes designed for better hair penetration), and employing higher-density EEG systems with more channels, which benefit more from advanced preprocessing tools (see the sketch below). Gait dynamics contribute significantly to periodic cable sway that can pull on the EEG electrodes and cause large artifacts. Solutions to this problem include the use of strain-relief connectors and wireless alternatives. Likewise, ECG and EDA sensors are sensitive to movement artifacts. For example, wrist-based EDA sensors require a tight fit on the participant’s wrist and minimal arm sway to guarantee correct signal acquisition, which explains why the 3-lead ECG exhibited fewer data losses than the Empatica® E4 heart-rate stream during field recordings.
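As an example of such offline preprocessing (not a procedure reported in this paper), the sketch below applies a band-pass filter and ICA-based artifact removal with the MNE-Python library; the file name, component count, and excluded components are placeholders.

```python
import mne

# Load a walk's EEG recording; the file name and format are hypothetical
# (Enobio recordings can be converted to formats readable by MNE).
raw = mne.io.read_raw_edf("walk_session_01.edf", preload=True)

# Band-pass filter to attenuate slow drifts and high-frequency noise.
raw.filter(l_freq=1.0, h_freq=40.0)

# ICA to isolate stereotyped motion/ocular components; the number of
# components and which ones to exclude are study-specific choices.
ica = mne.preprocessing.ICA(n_components=20, random_state=42)
ica.fit(raw)
ica.exclude = [0, 3]              # indices chosen after visual inspection (example)
raw_clean = ica.apply(raw.copy())
```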
It should be noted, however, that the primary aim of this study was not to directly evaluate signal quality in ambulatory conditions, but rather to demonstrate the successful integration of multiple sensing devices within a unified acquisition framework. Validation of signal quality and reliability for the individual sensors in relatively similar settings has been extensively reported elsewhere (see, for example [62] for the Empatica® E4 wristband and [63] for the Neuroelectrics® Enobio 32-channel EEG system with dry electrodes).
Despite these challenges, the eMOTIONAL Cities Walker Backpack demonstrates the feasibility of integrating complex, multimodal data streams essential for investigating brain–environment interactions in naturalistic settings. A key strength of the system lies in its use of redundancy to increase reliability and safeguard against individual sensor failure. Among all data types, spatiotemporal streams are the most critical, as they provide the framework for synchronizing all other signals. GPS plays a central role in spatial referencing, though its accuracy can be compromised in dense urban environments such as tunnels or areas with tall buildings. In such cases, video footage from the eye-tracking device’s external camera offers a valuable fallback for inferring spatial context. Temporal synchrony—ensured by Harp devices—is irreplaceable, as it anchors all data streams to a precise timeline.
Looking ahead, this apparatus sets a new benchmark for mobile neuroscience platforms. Future developments should prioritize enhancing participant comfort and discretion to further increase ecological validity and participant compliance in outdoor research, while also expanding interoperability with emerging sensing technologies.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/s25237163/s1, Figure S1: Example quality control metrics collected to assess spatiotemporal synchronization for every acquisition session.

Author Contributions

Conceptualization, P.M. and B.M.; Data curation, J.A., R.R. and A.B.; Formal analysis, J.A. and R.R.; Funding acquisition, P.M. and B.M.; Investigation, J.A., R.R. and A.B.; Methodology, A.A., J.F., B.F.C., A.E., F.C., G.L., A.C. and D.S.; Project administration, P.M. and B.M.; Resources, P.M. and B.M.; Software, A.A., J.F., B.F.C., A.E., F.C., G.L., A.C. and D.S.; Supervision, P.M. and B.M.; Visualization, J.A., R.R., A.B. and G.L.; Writing—original draft, J.A. and R.R.; Writing—review and editing, J.A., G.L. and B.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 945307 (eMOTIONAL Cities).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the Institute of Geography and Spatial Planning (IGOT), University of Lisbon (ETHIC-07/2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

Authors André Almeida, Bruno F. Cruz, Andrew Erskine and Gonçalo Lopes are employed by the company NeuroGEARS Ltd. Authors João Frazão and Filipe Carvalho are employed by the company NeuroGEARS Portugal Lda. Authors Ata Chokhachian and Daniele Santucci are employed by the company Climateflux GmbH. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Corburn, J. Equitable and Healthy City Planning: Towards Healthy Urban Governance in the Century of the City. In Healthy Cities; Leeuw, E., Simos, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 31–41. [Google Scholar]
  2. United Nations, Department of Economic and Social Affairs, Population Division. 2024. Available online: https://unhabitat.org/sites/default/files/2022/04/nua_tomorrow_today_together_digital_a.pdf (accessed on 17 November 2025).
  3. Joint Research Centre (European Commission); Baranzelli, C.; Vandecasteele, I.; Aurambout, J.-P.; Siragusa, A. The Future of Cities: Opportunities, Challenges and the Way Forward; EUR (Luxembourg. Online); Publications Office of the European Union: Luxembourg, 2019; ISBN 978-92-76-03847-4.
  4. Peen, J.; Schoevers, R.A.; Beekman, A.T.; Dekker, J. The Current Status of Urban-Rural Differences in Psychiatric Disorders. Acta Psychiatr. Scand. 2010, 121, 84–93. [Google Scholar] [CrossRef]
  5. Mitchell, R.J.; Richardson, E.A.; Shortt, N.K.; Pearce, J. Neighborhood Environments and Socioeconomic Inequalities in Mental Well-Being. Am. J. Prev. Med. 2015, 49, 80–84. [Google Scholar] [CrossRef] [PubMed]
  6. Alcock, I.; White, M.P.; Wheeler, B.W.; Fleming, L.E.; Depledge, M.H. Longitudinal Effects on Mental Health of Moving to Greener and Less Green Urban Areas. Environ. Sci. Technol. 2014, 48, 1247–1255. [Google Scholar] [CrossRef] [PubMed]
  7. White, M.P.; Elliott, L.R.; Grellier, J.; Economou, T.; Bell, S.; Bratman, G.N.; Cirach, M.; Gascon, M.; Lima, M.L.; Lõhmus, M.; et al. Associations between Green/Blue Spaces and Mental Health across 18 Countries. Sci. Rep. 2021, 11, 8903. [Google Scholar] [CrossRef] [PubMed]
  8. Bonifácio, A. The Role of Bluespaces for Well-Being and Mental Health. Rivers as Catalysts for the Quality of Urban Life. In Urban and Metropolitan Rivers: Geomorphology, Planning and Perception; Farguell Pérez, J., Santasusagna Riu, A., Eds.; Springer International Publishing: Cham, Switzerland, 2024; pp. 207–222. ISBN 978-3-031-62641-8. [Google Scholar]
  9. Tost, H.; Champagne, F.A.; Meyer-Lindenberg, A. Environmental Influence in the Brain, Human Welfare and Mental Health. Nat. Neurosci. 2015, 18, 1421–1431. [Google Scholar] [CrossRef]
  10. Stangl, M.; Maoz, S.L.; Suthana, N. Mobile Cognition: Imaging the Human Brain in the ‘Real World’. Nat. Rev. Neurosci. 2023, 24, 347–362. [Google Scholar] [CrossRef]
  11. Kuhn, R.L. A Landscape of Consciousness: Toward a Taxonomy of Explanations and Implications. Prog. Biophys. Mol. Biol. 2024, 190, 28–169. [Google Scholar] [CrossRef]
  12. Adli, M.; Berger, M.; Brakemeier, E.-L.; Engel, L.; Fingerhut, J.; Gomez-Carrillo, A.; Hehl, R.; Heinz, A.; Mayer, H.J.; Mehran, N.; et al. Neurourbanism: Towards a new discipline. Lancet Psychiatry 2017, 4, 183–185. [Google Scholar] [CrossRef]
  13. Ancora, L.A.; Blanco-Mora, D.A.; Alves, I.; Bonifácio, A.; Morgado, P.; Miranda, B. Cities and Neuroscience Research: A Systematic Literature Review. Front. Psychiatry 2022, 13, 983352. [Google Scholar] [CrossRef]
  14. Buttazzoni, A.; Doherty, S.; Minaker, L. How Do Urban Environments Affect Young People’s Mental Health? A Novel Conceptual Framework to Bridge Public Health, Planning, and Neurourbanism. Public Health Rep. 2022, 137, 48–61. [Google Scholar] [CrossRef]
  15. Bonifácio, A.; Morgado, P.; Peponi, A.; Ancora, L.; Blanco-Mora, D.A.; Conceição, M.; Miranda, B. Musings on Neurourbanism, Public Space and Urban Health. Finisterra 2023, 58, 63–88. [Google Scholar] [CrossRef]
  16. Pykett, J.; Osborne, T.; Resch, B. From Urban Stress to Neurourbanism: How Should We Research City Well-Being? Ann. Am. Assoc. Geogr. 2020, 110, 1936–1951. [Google Scholar] [CrossRef]
  17. Vigliocco, G.; Convertino, L.; De Felice, S.; Gregorians, L.; Kewenig, V.; Mueller, M.A.E.; Veselic, S.; Musolesi, M.; Hudson-Smith, A.; Tyler, N.; et al. Ecological Brain: Reframing the Study of Human Behaviour and Cognition. R. Soc. Open Sci. 2024, 11, 240762. [Google Scholar] [CrossRef] [PubMed]
  18. Vallet, W.; Van Wassenhove, V. Can Cognitive Neuroscience Solve the Lab-Dilemma by Going Wild? Neurosci. Biobehav. Rev. 2023, 155, 105463. [Google Scholar] [CrossRef] [PubMed]
  19. Casson, A.J. Wearable EEG and Beyond. Biomed. Eng. Lett. 2019, 9, 53–71. [Google Scholar] [CrossRef] [PubMed]
  20. Kowalski, L.F.; Lopes, A.M.S.; Masiero, E. Integrated Effects of Pavement Simulation Models and Scale Differences on the Thermal Environment of Tropical Cities: Physical and Numerical Modeling Experiments. City Built Environ. 2024, 2, 9. [Google Scholar] [CrossRef]
  21. Melnikov, V.R.; Christopoulos, G.I.; Krzhizhanovskaya, V.V.; Lees, M.H.; Sloot, P.M.A. Behavioural Thermal Regulation Explains Pedestrian Path Choices in Hot Urban Environments. Sci. Rep. 2022, 12, 2441. [Google Scholar] [CrossRef]
  22. Silva, T.; Reis, C.; Braz, D.; Vasconcelos, J.; Lopes, A. Climate Walking and Linear Mixed Model Statistics for the Seasonal Outdoor Thermophysiological Comfort Assessment in Lisbon. Urban Clim. 2024, 55, 101933. [Google Scholar] [CrossRef]
  23. Reis, C.; Nouri, A.S.; Lopes, A. Human Thermo-Physiological Comfort Assessment in Lisbon by Local Climate Zones on Very Hot Summer Days. Front. Earth Sci. 2023, 11, 1099045. [Google Scholar] [CrossRef]
  24. Silva, T.; Lopes, A.; Vasconcelos, J. A Micro-Scale Look into Pedestrian Thermophysiological Comfort in an Urban Environment. Bull. Atmos. Sci. Technol. 2024, 5, 18. [Google Scholar] [CrossRef]
  25. Chokhachian, A.; Ka-Lun Lau, K.; Perini, K.; Auer, T. Sensing Transient Outdoor Comfort: A Georeferenced Method to Monitor and Map Microclimate. J. Build. Eng. 2018, 20, 94–104. [Google Scholar] [CrossRef]
  26. Santucci, D.; Chokhachian, A. Climatewalks: Human centered environmental sensing. In Conscious Cities Anthology 2019: Science-Informed Architecture and Urbanism; 2019; Available online: https://theccd.org/article/climatewalks-human-centered-environmental-sensing/ (accessed on 17 November 2025).
  27. Helbig, C.; Ueberham, M.; Becker, A.M.; Marquart, H.; Schlink, U. Wearable Sensors for Human Environmental Exposure in Urban Settings. Curr. Pollut. Rep. 2021, 7, 417–433. [Google Scholar] [CrossRef]
  28. Salamone, F.; Masullo, M.; Sibilio, S. Wearable Devices for Environmental Monitoring in the Built Environment: A Systematic Review. Sensors 2021, 21, 4727. [Google Scholar] [CrossRef] [PubMed]
  29. Lin, X.; Luo, J.; Liao, M.; Su, Y.; Lv, M.; Li, Q.; Xiao, S.; Xiang, J. Wearable Sensor-Based Monitoring of Environmental Exposures and the Associated Health Effects: A Review. Biosensors 2022, 12, 1131. [Google Scholar] [CrossRef]
  30. Gramann, K.; Gwin, J.T.; Ferris, D.P.; Oie, K.; Jung, T.-P.; Lin, C.-T.; Liao, L.-D.; Makeig, S. Cognition in Action: Imaging Brain/Body Dynamics in Mobile Humans. Rev. Neurosci. 2011, 22, 593–608. [Google Scholar] [CrossRef]
  31. Mavros, P.; Austwick, M.Z.; Smith, A.H. Geo-EEG: Towards the Use of EEG in the Study of Urban Behaviour. Appl. Spat. Anal. 2016, 9, 191–212. [Google Scholar] [CrossRef]
  32. Zander, T.O.; Lehne, M.; Ihme, K.; Jatzev, S.; Correia, J.; Kothe, C.; Picht, B.; Nijboer, F. A Dry EEG-System for Scientific Research and Brain–Computer Interfaces. Front. Neurosci. 2011, 5, 53. [Google Scholar] [CrossRef]
  33. Gorjan, D.; Gramann, K.; Pauw, K.D.; Marusic, U. Removal of Movement-Induced EEG Artifacts: Current State of the Art and Guidelines. J. Neural Eng. 2022, 19, 011004. [Google Scholar] [CrossRef]
  34. Klug, M.; Jeung, S.; Wunderlich, A.; Gehrke, L.; Protzak, J.; Djebbara, Z.; Argubi-Wollesen, A.; Wollesen, B.; Gramann, K. The BeMoBIL Pipeline for Automated Analyses of Multimodal Mobile Brain and Body Imaging Data. BioRxiv 2022. preprint. [Google Scholar] [CrossRef]
  35. Klug, M.; Gramann, K. Identifying Key Factors for Improving ICA-Based Decomposition of EEG Data in Mobile and Stationary Experiments. Eur. J. Neurosci. 2021, 54, 8406–8420. [Google Scholar] [CrossRef]
  36. Mavros, P.; Wälti, M.J.; Nazemi, M.; Ong, C.H.; Hölscher, C. A Mobile EEG Study on the Psychophysiological Effects of Walking and Crowding in Indoor and Outdoor Urban Environments. Sci. Rep. 2022, 12, 18476. [Google Scholar] [CrossRef] [PubMed]
  37. Aspinall, P.; Mavros, P.; Coyne, R.; Roe, J. The Urban Brain: Analysing Outdoor Physical Activity with Mobile EEG. Br. J. Sports Med. 2015, 49, 272–276. [Google Scholar] [CrossRef] [PubMed]
  38. Debener, S.; Minow, F.; Emkes, R.; Gandras, K.; de Vos, M. How about Taking a Low-Cost, Small, and Wireless EEG for a Walk? Psychophysiology 2012, 49, 1617–1621. [Google Scholar] [CrossRef] [PubMed]
  39. Buttazzoni, A.; Parker, A.; Minaker, L. Investigating the Mental Health Implications of Urban Environments with Neuroscientific Methods and Mobile Technologies: A Systematic Literature Review. Health Place 2021, 70, 102597. [Google Scholar] [CrossRef]
  40. Kiefer, P.; Giannopoulos, I.; Raubal, M.; Duchowski, A. Eye Tracking for Spatial Research: Cognition, Computation, Challenges. Spat. Cogn. Comput. 2017, 17, 1–19. [Google Scholar] [CrossRef]
  41. Kiefer, P.; Giannopoulos, I.; Raubal, M. Where Am I? Investigating Map Matching During Self—Localization with Mobile Eye Tracking in an Urban Environment. Trans. GIS 2014, 18, 660–686. [Google Scholar] [CrossRef]
  42. Bolpagni, M.; Pardini, S.; Dianti, M.; Gabrielli, S. Personalized Stress Detection Using Biosignals from Wearables: A Scoping Review. Sensors 2024, 24, 3221. [Google Scholar] [CrossRef]
  43. Reeves, J.P.; Knight, A.T.; Strong, E.A.; Heng, V.; Neale, C.; Cromie, R.; Vercammen, A. The Application of Wearable Technology to Quantify Health and Wellbeing Co-Benefits from Urban Wetlands. Front. Psychol. 2019, 10, 1840. [Google Scholar] [CrossRef]
  44. Kim, J.; Bouchard, C.; Bianchi-Berthouze, N.; Aoussat, A. Measuring Semantic and Emotional Responses to Bio-Inspired Design. In Design Creativity 2010; Taura, T., Nagai, Y., Eds.; Springer: London, UK, 2011; pp. 131–138. [Google Scholar]
  45. Kyriakou, K.; Resch, B.; Sagl, G.; Petutschnig, A.; Werner, C.; Niederseer, D.; Liedlgruber, M.; Wilhelm, F.; Osborne, T.; Pykett, J. Detecting Moments of Stress from Measurements of Wearable Physiological Sensors. Sensors 2019, 19, 3805. [Google Scholar] [CrossRef]
  46. Jendritzky, G.; De Dear, R.; Havenith, G. UTCI—Why Another Thermal Index? Int. J. Biometeorol. 2012, 56, 421–428. [Google Scholar] [CrossRef]
  47. Hojaiji, H.; Goldstein, O.; King, C.E.; Sarrafzadeh, M.; Jerrett, M. Design and Calibration of a Wearable and Wireless Research Grade Air Quality Monitoring System for Real-Time Data Collection. In Proceedings of the 2017 IEEE Global Humanitarian Technology Conference (GHTC), San Jose, CA, USA, 19–22 October 2017; pp. 1–10. [Google Scholar]
  48. Osborne, T. Restorative and Afflicting Qualities of the Microspace Encounter: Psychophysiological Reactions to the Spaces of the City. Ann. Am. Assoc. Geogr. 2022, 112, 1461–1483. [Google Scholar] [CrossRef]
  49. Johnson, T.; Kanjo, E.; Woodward, K. DigitalExposome: Quantifying Impact of Urban Environment on Wellbeing Using Sensor Fusion and Deep Learning. Comput. Urban Sci. 2023, 3, 14. [Google Scholar] [CrossRef] [PubMed]
  50. Askari, F.; Habibi, A.; Fattahi, K. Advancing Neuro-Urbanism: Integrating Environmental Sensing and Human-Centered Design for Healthier Cities. Contrib. Sci. Technol. Eng. 2025, 2, 37–50. [Google Scholar] [CrossRef]
  51. Kothe, C.; Shirazi, S.Y.; Stenner, T.; Medine, D.; Boulay, C.; Grivich, M.I.; Artoni, F.; Mullen, T.; Delorme, A.; Makeig, S. The Lab Streaming Layer for Synchronized Multimodal Recording. Imaging Neurosci. 2025, 3, IMAG.a.136. [Google Scholar] [CrossRef] [PubMed]
  52. Silva, T.; Ramusga, R.; Matias, M.; Amaro, J.; Bonifácio, A.; Reis, C.; Chokhachian, A.; Lopes, G.; Almeida, A.; Frazão, J.; et al. Climate Walking: A Comparison Study of Mobile Weather Stations and Their Relevance for Urban Planning, Design, Human Health and Well-Being. City Environ. Interact. 2025, 27, 100212. [Google Scholar] [CrossRef]
  53. Cruz, B.F.; Carriço, P.; Teixeira, L.; Freitas, S.; Mendes, F.; Bento, D.; Silva, A. A Flexible Fluid Delivery System for Rodent Behavior Experiments. eNeuro 2025, 12, ENEURO.0024–25.2025. [Google Scholar] [CrossRef]
  54. Silva, A.; Carvalho, F.; Cruz, B.F. High-Performance Wide-Band Open-Source System for Acoustic Stimulation. HardwareX 2024, 19, e00555. [Google Scholar] [CrossRef]
  55. Carvalho, F.; Silva, A.; Cruz, B.; Frazao, J.; Lopes, G. Harp: A Standard for Reactive and Self-Synchronising Hardware for Behavioural Research. 2024. Available online: https://zenodo.org/records/15874648 (accessed on 17 November 2025).
  56. Lopes, G.; Bonacchi, N.; Frazão, J.; Neto, J.P.; Atallah, B.V.; Soares, S.; Moreira, L.; Matias, S.; Itskov, P.M.; Correia, P.A.; et al. Bonsai: An Event-Based Framework for Processing and Controlling Data Streams. Front. Neuroinform. 2015, 9, 7. [Google Scholar] [CrossRef]
  57. Miranda, B.; Bonifácio, A.; Ancora, L.; Blanco Mora, D.A.; Conceição, M.A.; Amaro, J.; Meshi, D.; Kaur, A.; Hoogstraten, S.; Blanco Casares, Á.D.; et al. eMOTIONAL Cities Deliverable 5.3.—Report on the Results of the Indoor Lab Experiments. 2025. Available online: https://zenodo.org/records/15204462 (accessed on 17 November 2025).
  58. Gramann, K. Mobile EEG for Neurourbanism Research—What Could Possibly Go Wrong? A Critical Review with Guidelines. J. Environ. Psychol. 2024, 96, 102308. [Google Scholar] [CrossRef]
  59. Zhang, Z.; Amegbor, P.M.; Sabel, C.E. Assessing the Current Integration of Multiple Personalised Wearable Sensors for Environment and Health Monitoring. Sensors 2021, 21, 7693. [Google Scholar] [CrossRef]
  60. Niso, G.; Romero, E.; Moreau, J.T.; Araujo, A.; Krol, L.R. Wireless EEG: A Survey of Systems and Studies. NeuroImage 2023, 269, 119774. [Google Scholar] [CrossRef] [PubMed]
  61. Lau-Zhu, A.; Lau, M.P.H.; McLoughlin, G. Mobile EEG in Research on Neurodevelopmental Disorders: Opportunities and Challenges. Dev. Cogn. Neurosci. 2019, 36, 100635. [Google Scholar] [CrossRef] [PubMed]
  62. Milstein, N.; Gordon, I. Validating Measures of Electrodermal Activity and Heart Rate Variability Derived from the Empatica E4 Utilized in Research Settings That Involve Interactive Dyadic States. Front. Behav. Neurosci. 2020, 14, 148. [Google Scholar] [CrossRef] [PubMed]
  63. Ruffini, G.; Dunne, S.; Farres, E.; Cester, I.; Watts, P.C.P.; Silva, S.R.P.; Grau, C.; Fuentemilla, L.; Marco-Pallares, J.; Vandecasteele, B. ENOBIO Dry Electrophysiology Electrode; First Human Trial plus Wireless Electrode System. In Proceedings of the 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 6689–6693. [Google Scholar]
Figure 1. (A) Schematic of the eMOTIONAL Cities Walker backpack, illustrating the integrated devices and sensors of the data collection unit. (B) Side-view photograph of a participant wearing the backpack during an outdoor walk experiment.
Figure 2. Schematic representation of the integration framework and communication protocols for each data stream. The Pupil Labs eye-tracking system required an external smartphone connected to the glasses via USB, which stored data locally and simultaneously transmitted it to the backpack computer over Wi-Fi. Empatica E4 data were streamed via Bluetooth. EEG signals were transferred to the backpack through either USB or Wi-Fi, with event synchronization managed using the LSL protocol. All other sensors communicated directly with the processing board through UART or USB interfaces, with temporal alignment provided by the Pluma Harp device.
Figure 3. The outdoor walking task was conducted across six different paths in the city of Lisbon, which were chosen based on certain geographical and sociodemographic characteristics. Between the start (A) and end (B), each path also had some “checkpoints” (marked with stars), where participants were asked to pause and observe the surroundings for one minute, walk for 20 s, and then pause again to answer several questions about the walking experience. The data gathered at these checkpoints were then compared to data acquired in a related laboratory-based experiment that used first-person video walks of the same section of the path.
Figure 4. (A) Dashboard interface for real-time visualisation of multimodal data streams. The example illustrates a moment when the recording of Pupil Labs® eye-tracking data (“Pupil”) failed, indicated by the corresponding red circle. Similarly, the red “Start” circle denotes that the participant has moved away from the initial GPS-calibration location. Green circles indicate that the remaining sensors are acquiring data as expected. (B) Summary of sensor recording status across sixty data acquisition sessions conducted in Lisbon. For each path, individual sensors were classified as: (i) complete (data recorded throughout the entire path); (ii) incomplete (data recorded for only part of the path); or (iii) absent (no data recorded for that session).
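The green/red status circles in Figure 4A suggest a simple freshness check per data stream: a stream is shown as healthy while new samples keep arriving and flagged once it goes silent. The sketch below illustrates this idea; the five-second threshold and the stream names are hypothetical and not the dashboard’s actual settings.

```python
# Hedged sketch of the per-sensor status logic suggested by Figure 4A.
# The staleness threshold and the example streams are assumptions.
import time

STALE_AFTER_S = 5.0  # assumed freshness threshold in seconds

def stream_status(last_sample_time, now=None):
    """Return 'green' if the stream produced data recently, else 'red'."""
    now = time.time() if now is None else now
    return "green" if (now - last_sample_time) <= STALE_AFTER_S else "red"

# Example: the eye tracker stopped 12 s ago, the EEG updated 1 s ago.
now = time.time()
print("Pupil:", stream_status(now - 12.0, now))  # -> red
print("EEG:  ", stream_status(now - 1.0, now))   # -> green
```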
Figure 5. (A) Example of environmental data streams extracted from a single subject’s raw data for one path and mapped to the corresponding route. Illustrated variables include PM2.5, UTCI, and wind speed, each plotted along the acquired GPS trajectory. (B) Radar chart displaying averages of min-max normalized values of climate sensor metrics from the same path across multiple participants, highlighting variability between sessions. (C) Derived time-series plots from one subject. The plots show different data streams, each re-sampled at 1 Hz and synchronized from −40 to 40 s relative to the onset of the “stop walking” event. Phasic EDA values correspond to high-pass-filtered raw EDA values, and EEG band power was computed with Welch’s method on two-second windows with 50% overlap. The heart rate plot shows data from the E4 wristband. The accelerometer and sound pressure level streams were plotted directly as raw sensor outputs.
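The derived measures described for Figure 5C can be reproduced in outline with standard SciPy routines. The following sketch is not the authors’ analysis code; it assumes the EEG sampling rate from Table 2 (500 Hz), the Empatica E4 EDA rate (4 Hz), an illustrative 0.05 Hz high-pass cut-off for the phasic EDA component (the caption only states that a high-pass filter was used), and the alpha band (8–12 Hz) as an example frequency band.

```python
# Sketch of the Figure 5C derivations under stated assumptions.
import numpy as np
from scipy.signal import welch, butter, sosfiltfilt

EEG_FS, EDA_FS = 500.0, 4.0  # assumed sampling rates (Table 2 / Empatica E4)

def eeg_band_power(eeg, band=(8.0, 12.0)):
    """Welch PSD on 2 s windows with 50% overlap, integrated over `band`."""
    nperseg = int(2 * EEG_FS)                    # two-second windows
    freqs, psd = welch(eeg, fs=EEG_FS, nperseg=nperseg, noverlap=nperseg // 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def phasic_eda(eda, cutoff_hz=0.05):
    """High-pass filter the raw EDA signal to keep the fast (phasic) part."""
    sos = butter(2, cutoff_hz, btype="highpass", fs=EDA_FS, output="sos")
    return sosfiltfilt(sos, eda)

# Toy usage on synthetic data (10 s of EEG, 80 s of EDA):
rng = np.random.default_rng(0)
print(eeg_band_power(rng.standard_normal(int(10 * EEG_FS))))
print(phasic_eda(rng.standard_normal(int(80 * EDA_FS)))[:5])
```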
Table 1. Some examples of previous applications and outcomes of different sensor modalities in environmental psychology or neurourbanism research.
Modality | Use Cases | Outcomes
EEG | Band power activity and time-domain neural responses associated with emotional states or cognitive tasks in urban environments [13,39]. | Increases in global alpha and theta power have been associated with relaxation states, particularly in naturalistic environmental settings [13].
Eye-tracker | Eye-tracking metrics such as fixations, saccades, blink rate, and scanpath length provide valuable indices of attentional processes in real-world environments [40]. | In spatial navigation tasks, increased visual attention to salient landmarks, as reflected in gaze fixations, enhances self-localization performance in urban environments [41].
Cardiovascular | Cardiovascular metrics such as heart rate or blood volume pulse are widely used as real-time indicators of autonomic stress responses in urban contexts [42]. | Wearable monitoring has revealed significant heart rate differences between wetland and urban settings, indicating measurable physiological benefits of exposure to natural environments [43].
Electrodermal activity (EDA) | EDA provides a reliable index of emotional arousal, allowing researchers to capture how different environmental features elicit affective responses [44]. | Wearable sensors for EDA have identified acute moments of stress during city walks, providing fine-grained insights into how urban environments impact human well-being [45].
Environmental | Identification of emotional “micro-spaces” within the built environment that contribute to better wellbeing [48,49]. | Air pollution, high temperature, and inadequate light intensity have been shown to modulate physiological responses, acting as potential urban stressors [48,49,50].
Table 2. Summary of the sensors in the eMOTIONAL Cities Walker.
Category | Name | Brand | Data Type | Units | Sampling Rate
Biosensors | Enobio® 32 headset | Neuroelectrics® | EEG | μV | 500 Hz
Biosensors | Pupil Invisible® | Pupil Labs GmbH | World Camera | Image Frame | 32 Hz
Biosensors | Pupil Invisible® | Pupil Labs GmbH | Gaze | Pixels | 250 Hz
Biosensors | AD8232 Heart Rate Monitor | SparkFun® | ECG | mV | 1000 Hz
Biosensors | Binaural OKM II Solo microphones | Soundman® | Audio | dBV/Pa | 44,100 Hz
Biosensors | E4 wristband | Empatica Inc. | Heart rate | bpm | 1.56 Hz
Biosensors | E4 wristband | Empatica Inc. | Blood Volume Pulse | mmHg | 64 Hz
Biosensors | E4 wristband | Empatica Inc. | Skin Temperature | °C | 4 Hz
Biosensors | E4 wristband | Empatica Inc. | Electrodermal Activity | μS | 4 Hz
Biosensors | BNO055 9-axis IMU | Bosch Sensortec GmbH | Orientation | degrees | 50 Hz
Biosensors | BNO055 9-axis IMU | Bosch Sensortec GmbH | Gyroscope | – | 50 Hz
Biosensors | BNO055 9-axis IMU | Bosch Sensortec GmbH | Magnetometer | – | 50 Hz
Biosensors | BNO055 9-axis IMU | Bosch Sensortec GmbH | Accelerometer | m/s² | 50 Hz
Biosensors | BNO055 9-axis IMU | Bosch Sensortec GmbH | Gravity | m/s² | 50 Hz
Spatiotemporal Sensors * | Harp Clock | Harp | Harp Timestamp | μs | 31,250 Hz
Spatiotemporal Sensors * | GNSS ZED-F9R | u-blox | Latitude | DD° MIN′ SEC″ DIRECTION | 1 Hz
Spatiotemporal Sensors * | GNSS ZED-F9R | u-blox | Longitude | DD° MIN′ SEC″ DIRECTION | 1 Hz
Spatiotemporal Sensors * | GNSS ZED-F9R | u-blox | Altitude | meters | 1 Hz
Spatiotemporal Sensors * | GNSS ZED-F9R | u-blox | Time | hh:mm:ss | 1 Hz
Environmental Sensors | PTC Bricklet | Tinkerforge | Temperature | °C | 100 Hz
Environmental Sensors | Industrial Dual 0–20 mA Bricklet | Tinkerforge | Irradiance | mA | 100 Hz
Environmental Sensors | Thermocouple Bricklet | Tinkerforge | Black Globe Temperature | °C | 100 Hz
Environmental Sensors | Particulate Matter Bricklet | Tinkerforge | Particulate Matter | μg/m³ | 100 Hz
Environmental Sensors | Sound Pressure Level Bricklet | Tinkerforge | Sound Pressure Level | dB(A) | 100 Hz
Environmental Sensors | Humidity Bricklet | Tinkerforge | Humidity | %RH | 1 Hz
Environmental Sensors | Air Quality Bricklet | Tinkerforge | Air Pressure | hPa | 1 Hz
Environmental Sensors | Atmos 22 | METER Group | North Wind Speed | m/s | 2 Hz
Environmental Sensors | Atmos 22 | METER Group | East Wind Speed | m/s | 2 Hz
Environmental Sensors | Atmos 22 | METER Group | Gust Wind | m/s | 2 Hz
* The synchronization between the GPS spatial signal and the Harp clock signal is described in detail in Section 2.3.
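The Harp Clock entry in Table 2 (31,250 Hz) corresponds to a 32 μs tick (1/31,250 Hz = 32 μs). In the openly documented Harp protocol, hardware timestamps combine a whole-second count with a 16-bit tick count in 32 μs units; the sketch below shows that conversion under this assumption, since the article itself does not detail the message format.

```python
# Hedged sketch of converting a Harp hardware timestamp to seconds.
# Field layout follows the public Harp protocol specification, not this article.
HARP_TICK_S = 1.0 / 31_250.0  # 32 microseconds per tick

def harp_to_seconds(seconds, ticks):
    """Combine the whole-second and 32 us tick fields into one float time."""
    return seconds + ticks * HARP_TICK_S

# Example: 1,024 whole seconds plus 12,500 ticks (= 0.4 s) after device start.
print(harp_to_seconds(1_024, 12_500))  # -> 1024.4
```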
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
