
Introducing the Architecture of FASTER: A Digital Ecosystem for First Responder Teams

Evangelos Katsadouros
Dimitrios G. Kogias
Charalampos Z. Patrikakis
Gabriele Giunta
Anastasios Dimou
and
Petros Daras
Department of Electrical and Electronics Engineering, University of West Attica, 12241 Egaleo, Greece
Smart Transport and Infrastructures Unit with the IS3 R&D Lab, Engineering Ingegneria Informatica Spa, 90146 Palermo, Italy
Visual Computing Lab, Information Technologies Institute, Center for Research and Technology Hellas, 57001 Thessaloniki, Greece
Author to whom correspondence should be addressed.
Information 2022, 13(3), 115;
Submission received: 27 December 2021 / Revised: 21 February 2022 / Accepted: 23 February 2022 / Published: 26 February 2022


Emergency first responders play an important role during search and rescue missions, helping people and saving lives. It is therefore important to provide them with technology that maximizes their performance and their safety on the field of action. IFAFRI, the International Forum to Advance First Responder Innovation, has pointed out several capability gaps in existing solutions. Based on these, there is a need for the development of novel, modern digital solutions that better assist responders on the field and, at the same time, better protect them. The work presented here introduces the logical architecture implemented in the Horizon 2020 project FASTER (First responders Advanced technologies for Safe and efficienT Emergency Response), an innovative digital ecosystem for emergency first response teams. It is a system that meets the requirements of the consortium members, fills the gaps that IFAFRI has pointed out, and consists of mechanisms and tools for data communication, data analysis, monitoring, privacy protection and smart detection.

1. Introduction

Europe is experiencing an increasing number of disasters derived either from natural phenomena, technological accidents or human actions [1]. The effects of these disasters are severe, impacting the lives of millions of people and the economies of European countries [2]. In particular, over the period of 1980–2019, the total reported losses caused by weather and climate-related extremes in the EEA member countries amounted to EUR 446 billion [3]. The economic and societal impact will continue to escalate, as weather-related disasters alone could affect about two-thirds of the EU population annually by the year 2100, according to a recent data-driven forecast study [4].
First responders (FRs) are among the first people to arrive and provide assistance at a disaster scene. They are typically professionals with specialized training, including law enforcement agencies (LEAs), firefighters, emergency medical personnel, rescuers, animal units (e.g., K9 dogs), civil protection authorities and other related organizations. Due to the nature of their work, FRs often operate in risky and hazardous conditions, in disaster sites including demolished, burnt or flooded districts, and are exposed to non-visible threats (such as high temperatures and dangerous gases). Furthermore, FRs may experience personal incidents during operations (e.g., sudden illness, dizziness, or exhaustion), which can prevent them from completing their mission and, more importantly, place their own health at risk. Moreover, overzealous FRs may not notice early warning signs, or may choose to ignore them in favor of accomplishing their mission, which can turn them into additional casualties of the disaster.
Despite their willingness and proper training, FRs’ capabilities may be limited by chaotic environments, making it extremely difficult for them to operate and, for example, estimate the exact position of the victims, recognize dangerous areas or territories, contact other FR teams, or reach and utilize valuable resources. The combination of the overwhelming amount of information available to them along with the dynamically changing environment where they operate may result in reducing, rather than increasing, their situational awareness—personal and local. Often, the problem lies not in the lack of resources and willingness to provide help, but in the logistics to efficiently direct and deliver assistance to the right places where and when it is most needed. These problems raise the need to exploit rapidly evolving technological advances toward protecting FRs from multiple and unexpected dangers and to provide solutions enabling them to operate in a seamless and efficient way in any environment and in cooperation with the community and their peers.
“First responders Advanced technologies for Safe and efficienT Emergency Response”, or FASTER [5], is a European Horizon 2020 project that acknowledges the special needs and requirements arising under these difficult conditions for FRs and pursues the development of beyond state-of-the-art technological tools addressing issues concerning data collection, operation capabilities, risk assessment, improved ergonomics, resilient communications, tactical situational awareness, and efficient co-operation and interoperability for the FRs. To achieve all this, FASTER is based on an architecture that, following the advice of the FRs involved in the project, builds an innovative digital ecosystem for emergency first response teams, enhancing situational awareness even in the dynamic and dangerous environments where FRs are often deployed. FASTER’s architecture presents a complete solution consisting of tools and mechanisms for data communication, data analysis, monitoring, privacy protection and smart detection, providing FRs with the needed solutions with efficiency and high usability.
The contribution of this paper focuses on:
  • Highlighting the details of FASTER’s architecture and presenting the outcomes of the implementation experience;
  • Sharing the results from the evaluation events that have been completed and presenting a list of events that are scheduled until the end of the project;
  • Discussing possible next steps in an effort to go beyond the delivered architecture of FASTER, using the experience coming from the design and implementation of it, and to provide ideas and suggestions on possible future extensions that can address more sophisticated hazards (e.g., fake news detection regarding on-going disasters).
The rest of the paper is structured as follows: in Section 2, the requirements and goals of the designed architecture are discussed, while in Section 3, detailed presentation of the FASTER logical architecture is provided. In Section 4, the evaluation results of the implementation from the piloting events in FASTER are presented, and in Section 5, a discussion for the possible extensions to the existing architecture is provided. Finally, the paper concludes in Section 6.

2. FASTER Architecture: Requirements and Goals

The FASTER architecture has been designed with the goal and the requirement of integrating innovative and emerging tools and technologies to assist first responders before, during and after their operations on the field. Research was conducted to this end, and the most relevant results can be found in [6,7,8,9,10,11].
More specifically, the tools have been divided into the following macro-categories for easier understanding of their role and functionality in FASTER architecture but also for better reference throughout the paper:
  • Category 1—Augmented Reality (AR) for Operational Situational Awareness: the tools in this category aim to offer more efficient situational awareness and decision making to practitioners in critical conditions that require full attention and focus from involved FRs. To achieve this, AR technology is used to deliver, in real-time, information gathered from the other components of the ecosystem (e.g., alerts, team status and location, sensor values), filtering the information and providing targeted content to the AR user. AR is supplied both through mobile phones and AR glasses (e.g., HoloLens) and enables FRs to access previously unreachable information in a contextual fashion, by superimposing the data to the real world.
    The specific modules/tools in the architecture that are included in this category are:
    Mobile Augmented Reality for Operational Support;
    Extended Vision Technologies using Commercial Light-Weight UAVs.
  • Category 2—Mobile and Wearable Technologies: Mobile and wearable technologies are developed in FASTER to enable communication between and to FRs. FRs will be able to receive and send information in an easy way that does not hinder their work, enabled by the technologies’ recognition of gestures and voice commands. Control centers can oversee the location and personal information of FRs in the field and relay messages and alerts. Such technologies are being developed in FASTER, not only for humans but for animals (e.g., dogs/K9s) as well, in order to collect valuable information about the animals’ activities.
    The specific modules/tools in the architecture that are included in this category are:
    Smart wearables and textiles;
    Animal Wearables for Behavior Recognition;
    Sensory data fusion.
  • Category 3—Body and Gesture-based User Interfaces: FASTER provides a framework for wearable devices that capture and identify arm/body movements exploiting artificial intelligence (AI). It provides non-visual/nonaudible communication capabilities, translating movements or critical readings from paired wearable devices to coded messages, able to be communicated to cooperating agents on the field through vibrations on wearable devices and following Morse code.
    The specific modules/tools in the architecture that are included in this category are:
    Hand gesture recognition for remote FRs communication;
    Gesture-based UxV.
  • Category 4—Autonomous Vehicles: The FASTER robotic vehicle includes a robotic platform able to integrate different sensors (e.g., optical and thermal cameras, along with environmental and chemical, biological, radiological, nuclear and explosives detectors). Moreover, an array of drones of different sizes, equipped with different payloads, capable of providing different services to the first responder operators can also be deployed.
    The specific modules/tools in the architecture that are included in this category are:
    Robotic Platform;
    Swarm operational capabilities to allow complex tasks.
  • Category 5—Resilient Communications Support: FASTER provides a resilient communication infrastructure allowing FRs to easily communicate during a disaster. In particular, a novel, low-cost device called ResCuE is developed, which is capable of delivering, through broadcasting, critical information to first responders or instructions to a crowd of civilians.
Moreover, a 5G-enabled infrastructure offers the means to manage and orchestrate resources of an edge cloud in proximity to the geographic area under investigation. A resilient communication service based on devices from the ecosystem, with augmented communication support through opportunistic relay services (e.g., the use of a swarm of drones), is also delivered, as is a resilient communication service employing common devices. Finally, a solution based on distributed ledger technology allows the central systems of FRs and other relief mission participants, including social networks and IoT control systems, to connect via a distributed network.
The specific modules/tools in the architecture that are included in this category are:
    Emergency communication box (ResCuE);
    5G-enabled communication infrastructure;
    Communication mesh through opportunistic relay services;
    Blockchain distributed network (distributed ledger technology—AIngle).
  • Category 6—Common Operational Picture: In disaster scenarios, a portable control center (PCC) collectively visualizes all the information needed for having a better understanding of the emergency, such as event location and people involved, enabling immediate reactions and supporting decision making. In particular, the PCC provides to the emergency management teams a portable common operational picture (PCOP) with a clear perception of the scene, highlighting the relations between the actors involved (first responders, victims, NGOs, etc.) and environmental factors (risks, hazards, points-of-interest) with respect to time and space. The PCOP constitutes an information integration and visualization medium for data coming from heterogeneous information sources (e.g., sensors, robots, UAVs and mobile devices).
    The specific modules/tools in the architecture that are included in this category are:
    Portable control center;
    Social media analysis;
    Mission management and progress monitoring.
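The Morse-coded vibration messaging described in Category 3 can be sketched in a few lines. The mapping below uses standard Morse timing (dot = one unit, dash = three units); the function name and the gesture-to-message step are illustrative assumptions, not FASTER's actual implementation:

```python
# Hedged sketch: how a recognized gesture or critical reading might be
# relayed as Morse-coded vibration pulses on a paired wearable device.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".", "F": "..-.",
    "G": "--.", "H": "....", "I": "..", "J": ".---", "K": "-.-", "L": ".-..",
    "M": "--", "N": "-.", "O": "---", "P": ".--.", "Q": "--.-", "R": ".-.",
    "S": "...", "T": "-", "U": "..-", "V": "...-", "W": ".--", "X": "-..-",
    "Y": "-.--", "Z": "--..",
}

def message_to_pulses(message: str, unit_ms: int = 100) -> list:
    """Translate a text message into (duration_ms, pause_ms) vibration pulses.

    Dot = 1 unit, dash = 3 units; intra-letter gap = 1 unit,
    inter-letter gap = 3 units (standard Morse timing).
    """
    pulses = []
    for letter in message.upper():
        code = MORSE.get(letter)
        if code is None:
            continue  # skip characters without a Morse encoding
        for i, symbol in enumerate(code):
            duration = unit_ms if symbol == "." else 3 * unit_ms
            gap = unit_ms if i < len(code) - 1 else 3 * unit_ms
            pulses.append((duration, gap))
    return pulses

# A gesture classifier could map a recognized arm movement to a code word:
pulses = message_to_pulses("SOS")
```

A cooperating agent's wearable would then replay these pulses on its vibration motor, so the message is received without any visual or audible channel.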
In parallel with the aforementioned goals, in the development of its innovative tools and services, FASTER has taken into consideration two sources of information regarding the user requirements that need to be addressed. These sources are:
The IFAFRI guidelines: IFAFRI is the International Forum to Advance First Responder Innovation, an organization created by international government leaders that gives a greater voice to FRs. IFAFRI has tried to identify potential areas of research and development where there may be opportunities for industry and academia to propose and develop innovative solutions. The forum focuses on the technologies needed to help FRs conduct their missions safely, effectively and efficiently and provides an overall umbrella for the requirements that should be met; FASTER has been aligned with IFAFRI’s main aims.
Members of the consortium: FASTER is a project whose consortium includes (among others) eight first responder teams from seven European countries. These teams have been presented with the potential of each tool and have provided a list of user requirements explicitly for them.
One of the IFAFRI strategic objectives is to identify and prioritize common capability gaps that pinpoint and prioritize areas that can benefit from new or improved solutions for FRs. To arrive at a set of capability gaps, IFAFRI’s membership conducted analyses of FR capability gaps in their countries and reached consensus on ten common FR capability gaps.
FASTER’s architecture, which is presented here, addresses the gaps defined by IFAFRI by addressing challenges associated with the protection of FRs in hazardous environments, while at the same time enhancing their capabilities in terms of situational awareness and communication.
In particular, FASTER’s architecture covers the IFAFRI common capability gaps by including:
Innovative and efficient tools covering real-time gathering and processing of heterogeneous physiological and critical environmental data from smart textiles, wearables, sensors, and social media;
Extended inspection capabilities and physical mitigation;
Tools for individual health assessment and disaster scene analysis for early warnings and risk mitigation;
Improved ergonomics providing augmented reality tools for enhanced information streaming;
Resilient communication at the field level and at the infrastructure level;
Tactical situational awareness providing innovative visualization services for a portable common operational picture (COP) for both indoor and outdoor scenario representation;
Efficient cooperation and interoperability among first responders, law enforcements agencies (LEAs), community members and other resource providers.
Table 1 shows the IFAFRI common capability gaps along with FASTER’s approach/tools to address them:

3. FASTER Architecture Design

FASTER logical architecture, as seen in Figure 1, consists of three different layers in vertical order:
The bottom layer is called the “EDGE-Layer” and consists of IoT devices, ranging from data aggregators to monitoring tools. Data aggregators can be smartphone applications, single-board computer applications and unmanned aerial vehicles (UAVs) that share information with the layer above (i.e., the FOG-Layer). The monitoring tools are applications that assist FRs on the field by providing them with crucial information (e.g., biometrics, environmental data, video streaming and area mapping).
The next layer, in the middle of the three-layer approach, is called the “FOG-Layer”, and its architecture is based on event-driven microservices. The core service of this layer is a pub/sub system (e.g., a Kafka implementation), and all the other services function based on updates to the pub/sub system’s topics. The FOG layer supports the EDGE layer even without connectivity to the INTERNET layer: it is deployed locally and ensures that the FASTER tools operate and co-operate as defined; when an Internet connection to the backbone communication network is achieved, it synchronizes with the INTERNET layer to include the features supported by it.
The top layer is called the “INTERNET-Layer”, and its purpose is to provide more in-depth analysis and central processing of several emergency scenes. Similar to the FOG-Layer, the INTERNET-Layer follows an event-driven microservices architecture. The core part, once more, is a pub/sub system, and any changes to it trigger the rest of the services.
In general, the FASTER architecture includes three main features: connectivity, interoperability (using a pub/sub data delivery system), and security with authentication.
The connectivity between the layers varies. For the connectivity between the INTERNET and EDGE layers, Wide Area Network (WAN) technologies are used, such as 5G, 4G, satellite, etc. This form of connectivity is called the “Primary Communication Network”, which connects the INTERNET layer with the other layers in the FASTER architecture. The “Secondary Communication Network” handles the communication between the FOG and EDGE layers and uses Local Area Network (LAN) technologies such as Wi-Fi, ZigBee, Bluetooth, etc. In every type of communication, data encryption is applied, foremost to protect the personal data of the users and the mission against malicious activities that could expose users’ personal data and harm the operation.
At the same time, the INTERNET and FOG layers are both designed as event-driven systems with services that are triggered by events produced in a pub/sub system, which constitutes the main component of those layers. Interoperability is achieved by replicating this pub/sub system on the two upper layers of the architecture, while the data that enter from the lowest layer (i.e., EDGE) follow a specific JSON structure that allows for their processing in the upper layers. In addition, based on [12], event-driven architecture offers benefits such as high agility, performance and scalability. To run the services of this architecture, FASTER uses containerization, which, along with the event-driven architecture, offers great benefits, including improved security through isolation, improved agility and flexibility.
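As a rough illustration of this event-driven pattern, the sketch below wires a FOG-layer service to an in-memory pub/sub bus. A real deployment would use a broker such as Kafka; the topic name, JSON fields and alert threshold here are invented for the example and are not FASTER's actual schema:

```python
import json
from collections import defaultdict

class PubSub:
    """Minimal in-memory stand-in for a broker such as Kafka."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message: dict):
        # EDGE data follows a JSON structure; serialize/deserialize to
        # mimic what would cross the wire to each subscribed service.
        payload = json.dumps(message)
        for handler in self.subscribers[topic]:
            handler(json.loads(payload))

bus = PubSub()
alerts = []

# A FOG-layer analysis service triggered by wearable readings
# (hypothetical topic and fields):
def scene_analysis(event):
    if event["heart_rate"] > 180:
        alerts.append({"fr_id": event["fr_id"], "type": "health_alert"})

bus.subscribe("wearables.biometrics", scene_analysis)

# An EDGE device publishing a biometric reading:
bus.publish("wearables.biometrics", {"fr_id": "FR-07", "heart_rate": 192})
```

The same decoupling is what lets the FOG and INTERNET layers replicate the pub/sub system and synchronize when connectivity returns: services never call each other directly, only react to topic updates.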
Security is especially important in order to have a reliable system that will assist first responding teams. To this end, FASTER uses an authentication–authorization–accounting (AAA) server to protect the information and the services of the FOG layer. Every request to these services has to be authorized by the AAA server. In this way, the system is protected from unauthorized use (coming either from lower or upper layers), and the confidentiality and integrity of data are increased. Furthermore, the TLS cryptographic protocol is used to protect the confidentiality and integrity of the data communicated among all the services.
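A minimal sketch of this authorization step might look as follows. The HMAC-based token scheme, client identifiers and scope strings are assumptions chosen for illustration, since the paper does not specify the AAA server's actual protocol:

```python
import hashlib
import hmac

# Shared only between the AAA server and the protected FOG services
# (an illustrative stand-in for real key management).
SERVER_SECRET = b"demo-secret-key"

def issue_token(client_id: str, scope: str) -> str:
    """AAA server side: bind a token to a client identity and a scope."""
    msg = f"{client_id}:{scope}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

def authorize(client_id: str, scope: str, token: str) -> bool:
    """Service side: verify the token before serving the request."""
    expected = issue_token(client_id, scope)
    return hmac.compare_digest(expected, token)  # constant-time comparison

token = issue_token("edge-device-42", "publish:wearables")
assert authorize("edge-device-42", "publish:wearables", token)
assert not authorize("edge-device-42", "publish:video", token)  # wrong scope rejected
```

In the real architecture the token exchange itself would additionally travel over TLS, matching the confidentiality and integrity guarantees described above.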
In general, first responding systems are critical due to their nature; thus, they demand quality characteristics such as: high performance, high availability, reliability and security. FASTER’s architecture was designed to fulfil all these requirements. Figure 1 illustrates the presented FASTER logical architecture while the following subsections describe in depth the structure of each layer and the technologies that are used to fulfil the aforementioned requirements and deliver the expected performance.

3.1. INTERNET Layer

The INTERNET layer lies at the top of the FASTER digital ecosystem. It demands a high level of security and availability because it controls all the operations in different responding cases. It is important for this layer to be up and to perform without interruptions. The services of this layer are deployed and run in a cloud infrastructure using Internet connectivity. The main purpose of this layer is to enhance global situational awareness, and to accomplish this task, it makes use of the following services:
  • Social Media Analysis gathers information from Twitter streams and analyzes them in order to extract further information that will enhance the operation on the field.
  • QoS Monitoring Framework monitors all systems in the FASTER ecosystem by collecting information such as latency, bandwidth and computation efficiency, as well as other application metrics. Based on these data, the team that manages the ecosystem makes decisions about service replacement, configuration, scaling up, etc.
  • Scene Analysis from Building Sensor Data analyzes data from sensors in a building to trigger events such as high levels of CO2 and high temperature.
  • Mission Management service provides functionality that supports the operation on the field, using a chatbot application for in-field responders and processing and monitoring the gathered information.
  • Portable Command and Control Center is a visualization system with a crucial role in the ecosystem, because it is the service that monitors the operation on the field.
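As a hedged sketch of the decision support the QoS monitoring described above could feed, the snippet below aggregates latency samples and flags a service for scaling; the threshold, percentile choice and field names are illustrative assumptions, not part of the FASTER specification:

```python
from statistics import mean

# Illustrative service-level threshold (assumption for the example).
LATENCY_THRESHOLD_MS = 250

def summarize(samples):
    """Return mean and approximate p95 latency for millisecond samples."""
    ordered = sorted(samples)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]  # nearest-rank estimate
    return {"mean_ms": mean(ordered), "p95_ms": p95}

def needs_scaling(samples):
    """Flag a service whose tail latency exceeds the threshold."""
    return summarize(samples)["p95_ms"] > LATENCY_THRESHOLD_MS

# One slow outlier does not trigger scaling if the tail stays healthy:
samples = [120, 130, 110, 140, 400, 125, 135, 115, 130, 128]
```

Using a tail percentile rather than the mean keeps a single slow request from masking (or faking) a genuine degradation, which matters when the operators decide on replacement or scaling of a service.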

3.2. FOG Layer

The FOG layer stands between the INTERNET and EDGE layers and acts as a link between those layers by providing tools for gathering, processing and monitoring information from sources in the EDGE layer. FOG can be deployed locally near the field of action, helping with the propagation of the information regardless of an Internet connection to the INTERNET layer.
The services that the FOG layer offers are:
  • AR for operational support creates and augments reality to allow first responders to work hands-free by displaying useful information (commands, information from PCOP, alerts etc.) and sharing annotations.
  • Portable command operation picture (PCOP) is the eye of the team that manages the operation. It visualizes all the incoming data and processes information from the EDGE layer.
  • Building sensor data visualization service creates all the appropriate visualizations with building sensor data gathered from sources in the EDGE layer.
  • Scene analysis from video identifies potential risks on the field by using video data (AR devices, UxVs) and a trained AI algorithm.
  • Information processing for enhanced COP assists PCOP by analyzing the data produced by the rest of the components of the FASTER ecosystem to extract useful knowledge and events.
  • Scene analysis from IoT sensor data generates local awareness alerts by analyzing alerts produced by wearables and textiles in the EDGE layer.
  • UxVs gesture control component enables first responders to control UxVs intuitively without using controllers.
  • Extended vision extends the field of view of first responders by using UAVs to gather video from the field and visualizing it to first responders on AR devices such as Microsoft’s HoloLens.
  • 2D/3D mapping provides first responders with 2D and 3D mapping of the affected area of the mission. This service uses images gathered from UAVs to provide updated maps along with location information.
  • The Secure IoT Middleware (SIM) provides a set of services/functionalities that secure the operation in the FOG layer. These include authentication, authorization and accounting (AAA), which protects all the services in this layer; traffic analysis to protect the infrastructure from malicious users; mechanisms that detect personal information in incoming packets and anonymize it; and, finally, a mechanism that validates data before producing them to the appropriate topic.
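The SIM's anonymization mechanism could, in its simplest form, redact personal identifiers from incoming data with pattern matching. The patterns and placeholder tokens below are illustrative assumptions, not FASTER's actual implementation:

```python
import re

# Illustrative redaction rules: email addresses and phone numbers.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),
    (re.compile(r"\+?\d[\d\s-]{7,}\d"), "<PHONE>"),
]

def anonymize(text: str) -> str:
    """Replace personal identifiers with placeholders before publishing."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

msg = "Victim contact: maria@example.com, phone +30 210 123 4567"
```

A production mechanism would of course need far more robust detection (names, addresses, locale-specific formats), but the principle is the same: scrub incoming payloads before they reach the pub/sub topics.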

3.3. EDGE Layer

The EDGE layer is the edge device layer, which has limited computing power and acts as a set of information sources. The EDGE layer communicates with FOG’s pub/sub system to provide information about the field and the operation. The pub/sub system is protected by the AAA server on the FOG layer; thus, devices on the EDGE layer communicate with the AAA server to be granted the permission required to communicate with the pub/sub system. The reason for duplicate AAA servers is that the local operation might not have Internet connectivity; thus, the FOG layer should provide a local AAA solution that is set up according to the devices and the needs of the EDGE layer. The devices/applications on this layer are:
  • Building sensors on the EDGE layer produce information to the pub/sub system, and the building sensor data module sends them to the INTERNET layer for analysis purposes.
  • Smart Textiles Framework (STF) consists of sensors integrated in wearable textiles, providing biometric and environmental data for each FR to the FOG layer. Smart textiles are interconnected to smartphones in order to produce the generated data to the FOG layer.
  • Behavioral recognition Animal Wearable is a custom wearable solution that uses AI algorithms to provide notifications when the animal barks or repeats specific movements that are performed when a victim is located.
  • Movement recognition tool (MORSE) provides a novel mechanism for translating critical arm moves and gestures to messages using AI techniques.
  • Mini-UAV capabilities provide video streaming of the surrounding areas.
  • AR device and mobile application helps the FRs see the needed information on their glasses and act on it, allowing them to move freely in the surroundings.
  • Chatbot client helps share mission updates and critical information.
  • ResCuE is a low-cost resilient communication device that supports the broadcasting of short text messages when no communication infrastructure is available.
  • Weather station delivers local weather information to the FRs deployed in the area.
  • UGV with navigation capabilities allows the FRs to control the vehicle with their hand.
  • Swarm of UAVs capabilities can be deployed when communications are damaged and used to deliver messages, forming an ad hoc communication network.

4. Evaluation Results

FASTER’s architecture has been constantly evolving and, as time passes, becomes more and more mature and efficient. To test its performance, several evaluation events (or piloting activities) have been performed, although the COVID-19 pandemic has limited travelling for the involved partners. To overcome these difficulties, more piloting events were organized in different countries that host FASTER consortium members (especially end-users, i.e., FRs), allowing all the FRs participating in the project to use the tools and to provide feedback on the overall system performance.
To this end, in 2021, more than eight piloting events took place, in Europe and in Japan, allowing most of the tools and technologies to be tested and to receive constructive feedback for improvements. These piloting events were part of the first evaluation round and covered several use cases: an earthquake, a search and rescue operation, flooding, fire, and a possible terrorist attack in a school building. With four more major piloting events coming at the beginning of 2022 as part of the second evaluation round, thorough testing of all the FASTER tools will be achieved, along with thorough testing of the system’s performance and capabilities. Table 2 shows the participation of each tool in the scheduled piloting activities.
Given the dynamic progression of the implementation of FASTER’s architecture, two piloting rounds are included. In the initial round of evaluation tests, not all the features were tested, while the whole system will be tested in the second iteration of pilots. Table 3 presents the features that are included for testing in each one.
The results from the first round were promising and focused mainly on improvements to the devices that the FRs interact with, along with suggestions and improvements on the data representation in the INTERNET and FOG layers.

4.1. A Use Case Scenario

A use case scenario from one of the piloting events is described here to provide a better understanding of the performance of the FASTER architecture and of the involved tools. This simple scenario shows how all the layers of the FASTER architecture can work together during one mission. In more detail, it shows how the IoT devices deployed on the EDGE layer (drones, robots, wearables and UAVs in this scenario) communicate with the FOG layer (e.g., the portable command center), while the data are also delivered to the INTERNET layer when the connection is active. In addition, commands from the control center (INTERNET layer) can be delivered to EDGE devices.

4.2. Scenario Description

A fire alert is triggered in an industrial area close to the urban perimeter of Grândola (Portugal). First responders arrive at the scene and notify the command center of the possibility of multiple fires being started due to the breakdown of a power line that crossed the airspace above the industrial area. Conventional communication lines are useless due to the power line breakdown. The global control center (Category 6) is up and running in the command center and needs Internet access to receive any updates.
Due to the high electromagnetic radiation on site, the UAV communication relay service (Category 5) is established to support first responders’ communications (EDGE to FOG communication path and vice versa). Work on using this tool to extend the network coverage area and provide a link with the main communication infrastructure is initiated (FOG to INTERNET and vice versa).
Responders want to have an idea of the affected area, so they perform some mapping (extended vision technologies using commercial light-weight UAVs, Category 1) to visualize it on the portable control center (Category 6), which is located in the area of interest (EDGE to FOG communication path). This will help with response planning as well.
Two FRs with USAR training arrive at the scene in order to perform a search and rescue operation inside one of the affected buildings—the FR team is equipped with a wearable (Category 2) that transmits their biometric information to the portable control center (Category 6), tracking their health status (EDGE to FOG). These data will also be delivered to the global communication center (FOG to INTERNET layer) for further assessment of the situation when the communication path is active.
An autonomous vehicle (robotic platform, Category 4) is deployed to better inspect, with the installed cameras and sensors, the existence of hidden ignitions, existence of radiation and gas leaks (EDGE to FOG and FOG to INTERNET communication paths).
At the same time, the connection to the main communication network is established. Using the social media analysis tool (Category 6), the global command center can then receive additional information from civilians about developments in the area and send mission alerts to the FRs’ wearables to investigate possible dangers (INTERNET to FOG and FOG to EDGE communication path).
Finally, the FRs bring the situation under control, and all operations terminate.
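The store-and-forward behavior across the EDGE, FOG and INTERNET layers in this scenario can be sketched as a chain of brokers, one per layer. The following minimal Python sketch (class names, topics and payloads are all invented for illustration; the real system uses a full event-driven pub/sub infrastructure) shows a biometric reading stopping at the FOG layer while the uplink is down and reaching the INTERNET layer once the UAV relay restores it:

```python
class Broker:
    """Toy publish/subscribe node standing in for one FASTER layer."""

    def __init__(self, name, uplink=None):
        self.name = name
        self.uplink = uplink        # next broker towards the INTERNET layer
        self.uplink_active = True   # is the link to the next layer currently up?
        self.log = []               # messages seen at this layer

    def publish(self, topic, payload):
        self.log.append((topic, payload))
        # Forward towards the global command center only while the link is up.
        if self.uplink is not None and self.uplink_active:
            self.uplink.publish(topic, payload)

internet = Broker("INTERNET")           # global command center
fog = Broker("FOG", uplink=internet)    # portable control center
edge = Broker("EDGE", uplink=fog)       # FR wearable

# Power-line failure: the FOG-to-INTERNET link is down, so the biometric
# reading reaches the portable control center but not the global one.
fog.uplink_active = False
edge.publish("biometrics/fr-1", {"heart_rate": 112})

# The UAV relay restores the uplink; later readings propagate all the way.
fog.uplink_active = True
edge.publish("biometrics/fr-1", {"heart_rate": 118})
```

After the second publish, the FOG log holds both readings while the INTERNET log holds only the one sent after the uplink was restored.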

5. Expanding FASTER’s Architecture

FASTER’s logical architecture constitutes a novel solution for FR teams, providing technologies and tools for controlling and maximizing the performance of the FR units deployed on the field. Given the complete integration of the aforementioned tools and applications, a number of possible extensions that could improve its performance and usage are presented below, covering use cases that were not initially conceived for FASTER.
The first is an important extension that adds fake-news detection to the ecosystem. FASTER already uses a social media analysis tool to analyze data and extract information useful for the operation on the field. Extending it with fake-news detection would add a layer of protection, since fake news can harm field operations by injecting misleading information. Examples in the literature support this idea and show that fake-news detection can be effectively addressed with machine learning techniques. In [13], researchers combined convolutional neural networks (CNNs) with bidirectional long short-term memory (LSTM) networks to identify fake news with an accuracy of 88.78%. In [14], researchers exploited the power of graphs, using a graph-aware co-attention network (GCAN) to detect fake news on Twitter by analyzing the source tweet and its propagation through users. Finally, in [15], the authors introduced a novel mechanism for paraphrase detection using bidirectional encoder representations from transformers (BERT) to measure the similarity of two posts. Such mechanisms can be readily integrated into the proposed architecture, making social media analysis a more trustworthy source of information and widening the range of supported use cases.
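As a toy illustration of the underlying classification task (not the CNN-BiLSTM, GCAN or BERT models used in the cited works), the following self-contained Python sketch trains a bag-of-words naive Bayes classifier; the tiny training set of headlines is invented for the example:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) pairs; returns per-class word counts and priors."""
    counts = {"fake": Counter(), "real": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def predict(counts, totals, text):
    vocab = set(counts["fake"]) | set(counts["real"])
    best, best_lp = None, float("-inf")
    for label in counts:
        # log prior plus Laplace-smoothed log likelihoods
        lp = math.log(totals[label] / sum(totals.values()))
        denom = sum(counts[label].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

docs = [
    ("wildfire contained by fire brigade overnight", "real"),
    ("officials confirm power line failure caused the fire", "real"),
    ("secret lasers started the fire claims anonymous post", "fake"),
    ("miracle cure stops all fires instantly share now", "fake"),
]
counts, totals = train(docs)
verdict = predict(counts, totals, "anonymous post claims lasers started wildfire")
```

A production detector would replace this with one of the cited deep models, but the pipeline shape (collect labeled posts, train, score incoming posts before they reach the command center) is the same.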
In addition, since FRs are called to operate in natural disasters such as wildfires and earthquakes, where they are exposed to difficult conditions, it would be helpful to provide them with natural-disaster prediction tools, allowing them to be better prepared and to organize their protection during action. AI could be used to create such mechanisms. In [16,17], researchers used deep neural networks, such as long short-term memory (LSTM) networks, for earthquake prediction, achieving an accuracy above 80%. In [18], researchers used support vector machines (SVMs) and artificial neural networks (ANNs) for wildfire prediction, achieving an accuracy above 97%. In [19], researchers used AI approaches such as SVM, ANN and random forest (RF) to predict wildfire susceptibility from GPS field-survey data, with the last approach (RF) achieving the highest cross-validation score of 88%. Another example of natural-disaster prediction is the work in [20], which introduces a mechanism for tornado prediction using active learning along with SVM. Such mechanisms could be used in the INTERNET layer for global scene management; such a service could also be included in the FOG layer for local scene management.
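As a minimal illustration of the prediction setup (the cited studies train LSTMs, SVMs, ANNs and random forests on real meteorological and survey data), the sketch below fits a logistic regression on synthetic wildfire-risk samples; all features, thresholds and labels are invented for the example:

```python
import math
import random

random.seed(0)

def make_sample():
    temp = random.uniform(10, 45)      # air temperature (degrees C)
    humidity = random.uniform(5, 95)   # relative humidity (%)
    wind = random.uniform(0, 80)       # wind speed (km/h)
    # Synthetic ground truth: hot, dry and windy means high fire risk.
    label = 1 if (temp > 30 and humidity < 40 and wind > 20) else 0
    return [temp / 45, humidity / 95, wind / 80], label

data = [make_sample() for _ in range(400)]

w, b, lr = [0.0, 0.0, 0.0], 0.0, 1.0
for _ in range(500):                   # batch gradient descent
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        for i in range(3):
            gw[i] += (p - y) * x[i]
        gb += p - y
    w = [wi - lr * gi / len(data) for wi, gi in zip(w, gw)]
    b -= lr * gb / len(data)

def fire_risk(temp, humidity, wind):
    """Probability-like risk score in (0, 1) for the given conditions."""
    z = w[0] * temp / 45 + w[1] * humidity / 95 + w[2] * wind / 80 + b
    return 1 / (1 + math.exp(-z))
```

After training, hot, dry and windy conditions such as `fire_risk(42, 10, 60)` should score higher than cool, humid and calm ones such as `fire_risk(15, 80, 5)`; in FASTER such a score could be computed at the INTERNET layer and pushed to the FOG layer for local planning.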
As already mentioned, a crucial feature of the FASTER architecture is security. First-response systems demand a high level of security for operations to be secure and successful; to this end, some additions that enhance the security of the existing ecosystem can be considered. In [21], researchers introduced a reliable and efficient pub/sub system based on a blockchain distributed network. It offers enhanced security through the blockchain, providing a distributed solution that bypasses the single point of failure of other pub/sub systems (e.g., Kafka using ZooKeeper) and a novel mechanism for detecting illegal operations by malicious parties. Another addition could be a stronger login method for the edge devices on the field, such as the NFC-based login proposed in [22], where researchers introduced a near-field communication (NFC) method combining an NFC tag with a passcode. In that way, only known team members could gain access to edge devices from a secured area on the field without the risk of being hacked. Alternatively, the implementation of a named data networking (NDN) approach can be considered for better security and better control over the data that flow from the EDGE layer to the FOG layer. Recent research [23,24,25] has shown that such an approach can reduce communication delays and energy consumption (which is important for edge devices); however, it could require significant changes to the existing implementation.
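The two-factor edge-device login discussed above can be sketched as follows. The tag UIDs, passcodes and storage format here are invented for the example; a real deployment would use the device's NFC stack and a salted credential store:

```python
import hashlib
import hmac

# Registered credentials: NFC tag UID (hex) -> SHA-256 digest of the passcode.
# Both the UID and the passcode below are made up for this illustration.
REGISTERED = {
    "04a2b91c7f5d80": hashlib.sha256(b"fr-team-7-pass").hexdigest(),
}

def unlock(tag_uid: str, passcode: str) -> bool:
    """Unlock an edge device only if both the tag and the passcode match."""
    expected = REGISTERED.get(tag_uid)
    if expected is None:
        return False  # unknown tag: reject immediately
    digest = hashlib.sha256(passcode.encode()).hexdigest()
    # Constant-time comparison avoids leaking digest prefixes via timing.
    return hmac.compare_digest(digest, expected)
```

Requiring both factors means a stolen tag or an observed passcode alone is not enough to access the device on the field.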
Finally, at the moment, FASTER’s functionality mainly focuses on the preparation and guidance of the FR teams before, during and, for assessment, after their operation. Another important factor is monitoring the physical and mental health of individual FRs after their participation in difficult missions. Recent research [26] suggests that post-traumatic stress disorder (PTSD) is a serious problem among FRs; thus, there is a need for mechanisms that detect PTSD in individual FRs, so that their teams can handle the situation before further harm is caused. The works in [27,28,29] have shown that it is feasible to use machine learning to detect mental health issues; thus, a PTSD classifier could be trained on data collected from wearable devices to identify whether a first responder is facing such issues.
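As an illustration of the kind of wearable-derived input such a classifier could use, the sketch below computes standard heart-rate-variability (HRV) features from a series of RR intervals; the interval values are invented, and on a real system they would stream from the FR's chest-strap or smart-textile sensor:

```python
import math

def hrv_features(rr_ms):
    """rr_ms: successive RR (beat-to-beat) intervals in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: overall variability of the intervals.
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    # RMSSD: short-term, beat-to-beat variability.
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {
        "mean_hr_bpm": 60000 / mean_rr,  # average heart rate
        "sdnn_ms": sdnn,
        "rmssd_ms": rmssd,
    }

# An invented resting series: slow, variable beats.
calm = hrv_features([850, 900, 870, 910, 880, 860, 895])
```

These three values would form one input vector for the classifier; acute stress typically shows as a higher mean heart rate and lower RMSSD, which is what such a model would learn to pick up.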

6. Conclusions

In this paper, the logical architecture of a modern cyber-physical ecosystem implemented in the FASTER project was presented, with the connectivity, interoperability and security of the system and its layers as its main characteristics. This complex architecture aims to meet the requirements of FR teams, following both the requirements of the members of the consortium and the guidelines of IFAFRI, a worldwide organization that represents FRs and highlights the basic needs and gaps in existing implementations.
A list of evaluation events that are scheduled or have already been delivered is also discussed, along with possible extensions to the implemented system that enhance its security and extend its performance, covering more use cases and focusing even more on FRs, both as individuals and as a team.

Author Contributions

Conceptualization, E.K., D.G.K. and C.Z.P.; Funding acquisition, C.Z.P., G.G., A.D. and P.D.; Methodology, E.K. and D.G.K.; Project administration, C.Z.P., G.G., A.D. and P.D.; Software, E.K.; Supervision, C.Z.P.; Validation, E.K.; Writing—original draft, E.K. and D.G.K.; Writing—review and editing, G.G. All authors have read and agreed to the published version of the manuscript.


Funding

This research was funded by the European Union’s Horizon 2020 research and innovation program under the FASTER project, grant number 833507.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.


Acknowledgments

The work presented in this paper has received funding from the European Union’s Horizon 2020 research and innovation program under the FASTER project, grant agreement no. 833507.

Conflicts of Interest

The authors declare no conflict of interest.

References


  1. European Environment Agency. Available online: (accessed on 13 December 2021).
  2. Wehrli, A.; Herkendell, J.; Jol, A. Mapping the Impacts of Natural Hazards and Technological Accidents in Europe; European Environment Agency (EEA): København, Denmark, 2010. [Google Scholar]
  3. European Environment Agency. Available online: (accessed on 13 December 2021).
  4. Forzieri, G.; Cescatti, A.; e Silva, F.B.; Feyen, L. Increasing risk over time of weather-related hazards to the European population: A data-driven prognostic study. Lancet Planet. Health 2017, 1, e200–e208. [Google Scholar] [CrossRef]
  5. First Responder Advanced Technologies for Safe and Efficient Emergency Response. Available online: (accessed on 13 December 2021).
  6. Piscitelli, S.; Arnaudo, E.; Rossi, C. Multilingual Text Classification from Twitter during Emergencies. In Proceedings of the 2021 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 10–12 January 2021; pp. 1–6. [Google Scholar] [CrossRef]
  7. Sainidis, D.; Tsiakmakis, D.; Konstantoudakis, K.; Albanis, G.; Dimou, A.; Daras, P. Single-handed Gesture UAV Control and Video Feed AR Visualization for First Responders. In Proceedings of the International Conference on Information Systems for Crisis Response and Management (ISCRAM), Blacksburg, VA, USA, 23–26 May 2021. [Google Scholar]
  8. Patrikakis, C.Z.; Kogias, D.G.; Chatzigeorgiou, C.; Kalyvas, D.; Katsadouros, E.; Giannousis, C. A method for measuring urban space density of people and deliver notification, with respect to privacy. In Proceedings of the 2021 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 10–12 January 2021; pp. 1–6. [Google Scholar] [CrossRef]
  9. Ragab, A.R.; Isaac, M.S.A.; Luna, M.A.; Peña, P.F. Unmanned Aerial Vehicle Swarming. In Proceedings of the 2021 International Conference on Engineering and Emerging Technologies (ICEET), Istanbul, Turkey, 27–28 September 2021; pp. 1–6. [Google Scholar] [CrossRef]
  10. Luna, M.A.; Ragab, A.R.; Isac, M.S.A.; Peña, P.F.; Cervera, P.C. A New Algorithm Using Hybrid UAV Swarm Control System for Firefighting Dynamical Task Allocation. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–20 October 2021; pp. 655–660. [Google Scholar] [CrossRef]
  11. Kasnesis, P.; Doulgerakis, V.; Uzunidis, D.; Kogias, D.G.; Funcia, S.I.; González, M.B.; Giannousis, C.; Patrikakis, C.Z. Deep Learning Empowered Wearable-Based Behavior Recognition for Search and Rescue Dogs. Sensors 2022, 22, 993. [Google Scholar] [CrossRef] [PubMed]
  12. Richards, M. Event-Driven Architecture. In Software Architecture Patterns; O’REILLY: Newton, MA, USA, 2015; pp. 18–19. [Google Scholar]
  13. Kumar, S.; Asthana, R.; Upadhyay, S.; Upreti, N.; Akbar, M. Fake news detection using deep learning models: A novel approach. Trans. Emerg. Telecommun. Technol. 2020, 31, e3767. [Google Scholar] [CrossRef]
  14. Lu, Y.-J.; Li, C.T. GCAN: Graph-aware Co-Attention Networks for Explainable Fake News Detection on Social Media. arXiv 2020, arXiv:2004.11648. [Google Scholar]
  15. Kasnesis, P.; Heartfield, R.; Toumanidis, L.; Liang, X.; Loukas, G.; Patrikakis, C. A prototype deep learning paraphrase identification service for discovering information cascades in social networks. In Proceedings of the 2020 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), London, UK, 6–10 July 2020; pp. 1–4. [Google Scholar]
  16. Wang, Q.; Guo, Y.; Yu, L.; Li, P. Earthquake prediction based on spatio-temporal data mining: An LSTM network approach. IEEE Trans. Emerg. Top. Comput. 2017, 8, 148–158. [Google Scholar] [CrossRef]
  17. Asim, K.M.; Idris, A.; Iqbal, T.; Martínez-Álvarez, F. Earthquake prediction model using support vector regressor and hybrid neural networks. PLoS ONE 2018, 13, e0199004. [Google Scholar] [CrossRef]
  18. Sayad, Y.O.; Mousannif, H.; Al Moatassime, H. Predictive modeling of wildfires: A new dataset and machine learning approach. Fire Saf. J. 2019, 104, 130–146. [Google Scholar] [CrossRef]
  19. Ghorbanzadeh, O.; Valizadeh Kamran, K.; Blaschke, T.; Aryal, J.; Naboureh, A.; Einali, J.; Bian, J. Spatial prediction of wildfire susceptibility using field survey GPS data and machine learning approaches. Fire 2019, 2, 43. [Google Scholar] [CrossRef]
  20. Trafalis, T.B.; Adrianto, I.; Richman, M.B. Active learning with support vector machines for tornado prediction. In International Conference on Computational Science; Springer: Berlin/Heidelberg, Germany, 2007; pp. 1130–1137. [Google Scholar]
  21. Huang, B.; Zhang, R.; Lu, Z.; Zhang, Y.; Wu, J.; Zhan, L.; Hung, P.C. BPS: A reliable and efficient pub/sub communication model with blockchain-enhanced paradigm in multi-tenant edge cloud. J. Parallel Distrib. Comput. 2020, 143, 167–178. [Google Scholar] [CrossRef]
  22. Hufstetler, W.A.; Ramos, M.J.H.; Wang, S. NFC unlock: Secure two-factor computer authentication using NFC. In Proceedings of the 2017 IEEE 14th International Conference on Mobile Ad Hoc and Sensor Systems (MASS), Orlando, FL, USA, 22–25 October 2017; pp. 507–510. [Google Scholar]
  23. Ali, Z.; Shah, M.A.; Almogren, A.; Ud Din, I.; Maple, C.; Khattak, H.A. Named data networking for efficient iot-based disaster management in a smart campus. Sustainability 2020, 12, 3088. [Google Scholar] [CrossRef]
  24. Wang, X.; Cai, S. Secure healthcare monitoring framework integrating NDN-based IoT with edge cloud. Future Gener. Comput. Syst. 2020, 112, 320–329. [Google Scholar] [CrossRef]
  25. Rawat, D.B.; Doku, R.; Adebayo, A.; Bajracharya, C.; Kamhoua, C. Blockchain enabled named data networking for secure vehicle-to-everything communications. IEEE Netw. 2020, 34, 185–189. [Google Scholar] [CrossRef]
  26. Wilson, S.; Guliani, H.; Boichev, G. On the economics of post-traumatic stress disorder among first responders in Canada. J. Community Saf. Well-Being 2016, 1, 26–31. [Google Scholar] [CrossRef]
  27. Subhani, A.R.; Mumtaz, W.; Saad, M.N.B.M.; Kamel, N.; Malik, A.S. Machine learning framework for the detection of mental stress at multiple levels. IEEE Access 2017, 5, 13545–13556. [Google Scholar] [CrossRef]
  28. Pandey, P.S. Machine learning and IoT for prediction and detection of stress. In Proceedings of the 2017 17th International Conference on Computational Science and Its Applications (ICCSA), Trieste, Italy, 3–6 July 2017; pp. 1–5. [Google Scholar]
  29. Vuppalapati, C.; Raghu, N.; Veluru, P.; Khursheed, S. A system to detect mental stress using machine learning and mobile development. In Proceedings of the 2018 International Conference on Machine Learning and Cybernetics (ICMLC), Chengdu, China, 15–18 July 2018; Volume 1, pp. 161–166. [Google Scholar]
Figure 1. FASTER logical architecture.
Table 1. IFAFRI communication gaps and FASTER solutions/tools for them.
A/A | Requirement/Capability Gap | Categories of Tools Used to Meet the Requirement | FASTER Solution/Tool
1 The ability to know the location of responders and their proximity to threats and hazards in real time
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Mission management and Progress Monitoring (Category 6),
  • Smart Wearables and Textiles (Category 2),
  • Portable Control Center (Category 6)
2The ability to detect, monitor, and analyze passive and active threats and hazards at incident scenes in real time
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies), 3 (Body and Gesture based UI), 4 (Autonomous Vehicles) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Extended Vision Technologies using Commercial Light-Weight UAVs (Category 3),
  • Animal Wearable for Behavior Recognition (Category 2),
  • Gesture-based UxV (Category 3),
  • Robotic Platform (Category 4),
  • Swarm operational capabilities to allow complex tasks (Category 4),
  • Portable control center, social media analysis (Category 6)
3The ability to rapidly identify hazardous agents and contaminants
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies) and 4 (Autonomous Vehicles)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Smart wearables and textiles (Category 2),
  • Robotic Platform (Category 4)
4The ability to incorporate information from multiple and non-traditional sources into incident command operations
  • Categories 2 (Mobile and Wearable Technologies), 5 (Resilient Communications Support) and 6 (Common Operational Picture)
  • Sensory data fusion (Category 2),
  • Blockchain distributed network (Distributed Ledger Technology) (Category 5),
  • Portable control center (Category 6),
  • Social media analysis (Category 6)
5The ability to maintain interoperable communications with responders in any environmental conditions
  • Categories 1 (Situational awareness), 3 (Body and Gesture based UI), 5 (Resilient Communications Support) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Mission management and Progress Monitoring (Category 6),
  • Hand gesture recognition for remote FRs communication (Category 3),
  • Swarm operational capabilities to allow complex tasks (Category 5),
  • Emergency communication box (Category 5),
  • 5G-enabled communication infrastructure (Category 5),
  • Communication mesh through opportunistic relay services (Category 5),
  • Blockchain distributed network (Distributed Ledger Technology) (Category 5)
6The ability to obtain critical information remotely about the extent, perimeter, or interior of the incident
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies), 4 (Autonomous Vehicles) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Extended Vision Technologies using Commercial Light-Weight UAVs (Category 1),
  • Mission management and Progress Monitoring (Category 6),
  • K9 Behavior Recognition (Category 2),
  • Wearable Gesture-based UxV (Category 2),
  • Robotic Platform (Category 4),
  • Swarm operational capabilities to allow complex tasks (Category 4),
  • Portable control center (Category 6),
  • Social media analysis (Category 6)
7The ability to conduct on-scene operations remotely without endangering responders
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies), 4 (Autonomous Vehicles) and 6 (Common Operational Picture)
  • Extended Vision Technologies using Commercial Light-Weight UAVs (Category 1),
  • Animal Wearable for Behavior Recognition (Category 2),
  • Wearable Gesture-based UxV (Category 2),
  • Robotic Platform (Category 4),
  • Swarm operational capabilities to allow complex tasks (Category 4),
  • Portable control center (Category 6)
8The ability to monitor the physiological signs of emergency responders
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Mission management and Progress Monitoring (Category 6),
  • Smart wearables and textiles (Category 2),
  • Portable control center (Category 6)
9The ability to create actionable intelligence based on data and information from multiple sources
  • Categories 1 (Situational awareness), 2 (Mobile and Wearable Technologies), 5 (Resilient Communications Support) and 6 (Common Operational Picture)
  • Augmented Reality (AR) for Operational Support (Category 1),
  • Mission management and Progress Monitoring (Category 6),
  • Sensory data fusion (Category 2),
  • Blockchain distributed network (Distributed Ledger Technology) (Category 5),
  • Portable control center (Category 6),
  • Social media analysis (Category 6)
10The ability to provide appropriate and advanced personal protective equipment
  • Categories 2 (Mobile and Wearable Technologies),
  • Smart wearables and textiles (Category 2)
Table 2. Participation List of FASTER tools in piloting activities for evaluation.
Module Name | Total Pilots (of: PT, JP 1, ES, GR, FR, JP 2, PL, IT Final, FI Final, ES Final)
Portable command and control center | 10
Smart textiles framework | 8
Animal harness for behavior recognition | 5
MORSE—gesture communication | 7
RESCUE—communication box | 5
Extended vision using mini-UAVs + gesture control | 8
2D/3D mapping and AI scene analysis (aerial) | 10
Ground autonomous vehicles and 3D mapping | 6
Swarm of drones for complex tasks | 6
Augmented reality for operational support | 6
Mission management tool | 8
Social media analysis | 3
Building Situation Tool (BUST) | 1
Local weather station | 6
UAV relay for extended communication capabilities | 5
5G-enabled network | 2
Table 3. FASTER architecture features scheduled to be tested on each piloting/testing round.
First Testing/Piloting Round: Communication, Interoperability, Security
Second Testing/Piloting Round: Communication, Interoperability, Security, Authentication, Encryption
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Katsadouros, E.; Kogias, D.G.; Patrikakis, C.Z.; Giunta, G.; Dimou, A.; Daras, P. Introducing the Architecture of FASTER: A Digital Ecosystem for First Responder Teams. Information 2022, 13, 115.
