On the Adoption of Modern Technologies to Fight the COVID-19 Pandemic: A Technical Synthesis of Latest Developments

Abstract: During the ongoing COVID-19 pandemic, digital technologies have played a vital role in minimizing the spread of COVID-19 and in controlling its consequences for the general public. Without such technologies, bringing the pandemic under control would have been slow and difficult, and exploring the status of the pandemic and devising appropriate mitigation strategies would likewise have been difficult. In this paper, we present a comprehensive analysis of community-beneficial digital technologies that were employed to fight the COVID-19 pandemic. Specifically, we demonstrate the practical applications of ten major digital technologies that have effectively served mankind in different ways during the pandemic crisis. We chose these technologies based on their technical significance and large-scale adoption in the COVID-19 arena. The selected technologies are the Internet of Things (IoT), artificial intelligence (AI), natural language processing (NLP), computer vision (CV), blockchain (BC), federated learning (FL), robotics, tiny machine learning (TinyML), edge computing (EC), and synthetic data (SD). For each technology, we demonstrate the working mechanism, technical applications in the context of COVID-19, and major challenges from the perspective of COVID-19. Our analysis can pave the way to understanding the roles of these digital COVID-19-fighting technologies, which can also be used to fight future infectious diseases and prevent global crises. Moreover, we discuss the heterogeneous data that, when fed to the aforementioned technologies, significantly contributed to addressing multiple aspects of the ongoing pandemic. To the best of the authors' knowledge, this is a pioneering work on community-beneficial and transformative technologies in the context of COVID-19 with broader coverage of studies and applications.


Introduction
During the coronavirus disease 2019 (COVID-19) pandemic, there was tremendous growth in digital technologies, especially in the healthcare sector [1]. There were two major driving forces behind this digital revolution: (i) controlling the pandemic [2], and (ii) keeping business (or education) running [3]. COVID-19 has accelerated the digital revolution, and companies reluctant to adopt new tools/technologies amid this paradigm shift will likely face dropping sales, or even financial difficulty in sustaining their market value [4]. This pandemic has highlighted that exploiting the power of digital tools is an effective way to contain any infectious disease [5]. Digital technologies (IoT, AI, DL, ML, blockchain, augmented and virtual reality, cloud and fog computing, big data, computational intelligence, robots, thermal scanning technologies, drones, contact tracing mobile apps, telecommunications, and 5G) have played a dominant role in early detection and diagnosis, pre-screening, contact tracing, monitoring infected/quarantined/exposed people, trend analysis, risk estimation, mask checking, forecasting future infection rates, predicting the possibility of COVID-19 from images, analyzing cough samples, and more [6]. Apart from these technologies, many fused technologies, such as blockchain and federated learning, differential privacy and federated learning, and big data analytics and AIoT, have also helped in controlling the risks of this pandemic [7][8][9]. Some studies have proposed ways to enforce/monitor the precautionary measures recommended by the World Health Organization (WHO) to control the pandemic [10]. Big data technologies have played a vital role in restricting the spread of this deadly virus by fusing huge volumes of data and by forecasting trends [11,12]. In addition to these technologies, a great deal of software/apps were developed to find the contacts of an infected person and lower the spread rate.
As companies and tech giants seek new technical solutions to curb the COVID crisis, some interesting technical themes have emerged during this pandemic period. New technical developments can be classified into certain categories. In Figure 1, we present category- and region-wise use cases of digital developments in the COVID-19 era. Based on the analysis of the Boston Consulting Group (shown in Figure 1a,b), 25% of digital solutions focused on containment and detection, 20% addressed problems concerning healthcare provider enablement, and about 21% focused on economic resilience. As shown in Figure 1b, nearly a third of these use cases were global, owing to the interconnected and collaborative nature of the COVID-19 response. As the world soberly marks 2 years and 11 months since the virus emerged, identifying such use cases can be a gentle reminder of the creativity and resilience the world continues to show in tackling this global crisis (https://www.weforum.org/agenda/2020/08/5-technology-advancementsduring-covid-19-wearables-ai/, accessed on 7 December 2022).
As shown in Figure 1, digital tools have made huge progress in certain aspects of bringing the pandemic under control. A recent study briefly presented the role of IT in the era of COVID-19 [13]. Another study discussed the technologies used in planning responses to COVID-19, including pandemic planning, testing, surveillance, quarantine, contact tracing, and health care [14]. The technologies that have provided support in the prevention of COVID-19 were discussed in [15]. The status of privacy during the COVID-19 pandemic was analyzed in [16]. The need for digital infrastructure to contain COVID-19-like pandemics in the future was highlighted in a recent study [17]. The role of mobile technologies in fighting the COVID-19 pandemic was demonstrated in a practical study [18], whose authors compared the use of digital technologies in many countries of the world. The acceptance of COVID-19 digital tackling technologies (CDTT) in various countries and the need for revision of health protection regulations targeting the COVID-19 pandemic were assessed in [19]. Digital technologies that can forecast the course of the pandemic in India using the ARIMA model were proposed in [20]. The role of AI in the context of COVID-19 was comprehensively discussed in a recent study [21]. Unfortunately, most of these studies provided limited coverage of the available technologies that played a vital role in this pandemic. Furthermore, comprehensive knowledge related to the various services (e.g., detection, tracing, tracking, surveillance, monitoring of diseases, therapeutic responses, etc.) offered by the latest digital technologies remained unexplored. To this end, we present the first work that highlights the sustainable digital technologies and corresponding services that have played a vital role in the COVID-19 pandemic. Specifically, we make the following contributions to the body of knowledge targeting COVID-19.

•	We explore digital and transformative technologies that have firmly contributed to constraining the spread of COVID-19, and we identify opportunities to highlight and thoroughly discuss those underrated technologies along with their unique services.
•	We identify the ten latest technologies (i.e., IoT, AI, NLP, computer vision, blockchain, federated learning, robotics, TinyML, edge computing, and synthetic data) through rigorous analysis of research papers, developed tools, blogs, and industry leader talks.
•	We identify and present the services offered by these digital technologies in a systematic way that remained unexplored in the current literature.
•	We extract and present the heterogeneous data that have played a vital role in the management and containment of COVID-19 when used in the above-cited technologies.
•	We identify challenges faced by these technologies and pinpoint various promising research trajectories that can enable rapid development to contain future pandemics/epidemics.
•	To the best of our knowledge, this is the first work that targets COVID-19-fighting technologies, the unique services provided by them, and the heterogeneous data used in them. With this article, we aim to provide comprehensive coverage of the technical developments of the past 2.5 years in the COVID-19 context that will provide a ground-breaking foundation for future research.

Community-Beneficial Digital Technologies in the Context of COVID-19
Digital technologies have now been implemented and applied to almost every aspect of health care [22], and personal data in the healthcare sector are regarded as a living thing [23]. Figure 2 groups different digital health tools into a dozen corresponding application arenas, but the individual applications number in the thousands. The authors in [22] highlight the potential of digital innovation in healthcare in the following emerging areas: ensuring care continuity, advancing diagnosis and treatment, partnering with individuals to support self-management, facilitating off-site patient management through telemedicine, and reducing error and waste in the healthcare delivery system. Further information concerning these aspects can be found in a recent study [22]. In the early years of the pandemic, most countries developed sophisticated tools and smart apps to control the spread of this deadly disease. For example, in South Korea, two smart apps, for quarantine management and symptom reporting, were developed to manage incoming travelers from abroad. Furthermore, a robotics-based system was developed to check temperatures, masks, hand sanitizing, etc., at airports. In addition, a privacy-preserving entry-log collection system was developed to store the contact information of individuals visiting coffee shops, restaurants, and universities/colleges. A comprehensive contact tracing platform that identifies the close contacts of an infected person by leveraging mobile phone, credit card, and CCTV data was also developed. These commercial technologies helped contain the virus, and South Korea was considered a role model for these innovative technologies [24]. A growing number of smart apps were also developed to prove vaccination status upon entry into restaurants/bars. Figure 3 presents the details of innovative applications from the perspective of COVID-19.

Top Ten COVID-19-Fighting Digital Technologies: An Insightful Analysis
COVID-19-fighting digital technologies have helped mankind in various ways, such as improved healthcare, ambient assisted living, smart services, and awareness/forecasting of future events [25]. The fuel of these technologies was data concerning an individual or group of individuals, which can be of multiple types such as spatio-temporal activities, demographics, medical data, and physiological readings, to name a few [26]. Although many technologies were used to fight the ongoing pandemic, only a few have shown promising results in combating it. In this work, ten major technologies were chosen for painstaking analysis concerning COVID-19. Each of these technologies has multiple subtypes that were also employed to fight the pandemic. Figure 4 demonstrates the top ten technologies and their subtypes that have significantly helped society during the pandemic crisis. To the best of our knowledge, the role of these technologies remained unexplored in the current literature. A concrete discussion of the need for such technologies and of how each emerged as one of the successful technologies to fight COVID-19 is given in Table 1.

Internet of Things (IoT)
Recently, Internet of Things (IoT) technology has attracted significant attention from the research community due to the various benefits it brings to society. IoT is considered a leading technology of the future and will encompass billions of intelligent, communicating 'things/devices' [27]. IoT implementation and use have significant impacts on lowering healthcare costs, enhancing treatment performance, reducing mistakes, enabling digital twins, significantly improving treatment results for patients, and controlling pandemics/epidemics. IoT can assist multiple stakeholders (patients, clinicians, manufacturers, developers, drug and treatment developers, etc.) in the healthcare industry. This technology has played a vital role in the recent COVID-19 pandemic by lowering the virus spread as well as supporting intervention planning [28]. In the medical context, IoT is often called the Internet of Medical Things (IoMT). Figure 5 presents a system-level overview of IoT/IoMT technology in realistic COVID-19 scenarios. IoMT technology has played a vital role in collecting heterogeneous data about personalized healthcare (symptoms, breathing patterns, etc.) and individuals' movements (spatial and temporal stay points) that were used to evaluate different interventions. Furthermore, this technology has contributed to lowering mortality in this pandemic [29]. The synergy of IoT with other latest technologies has helped restrict the far-reaching consequences of this pandemic on the general public. Table 2 presents the unique applications/services of IoT technology in the context of COVID-19.

Table 2. Unique applications/services of IoT technology in the context of COVID-19.

Application | Service Offered | Study
Finding infected patients | Real-time findings collection | Goar et al. [35]
Monitoring patients remotely | Good-quality data acquisition | Awotunde et al. [36]
Sending relevant information | Retrieving health data | Hanuman et al. [37]
Scanning for COVID-19 | Assisting in raw data collection | Barnawi et al. [38]
Detecting and controlling the pandemic | Image data collection | Herath et al. [39]
Remote healthcare | Vital parameters collection | Mukati et al. [40]
Breaking the chain of virus transmission | Heterogeneous data collection and processing | Mohammed et al. [41]
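The remote patient monitoring service listed above can be sketched as a simple triage rule over wearable telemetry. The thresholds and field names below are illustrative assumptions, not clinical guidance; real IoMT platforms use clinically validated rules and device-specific calibration.

```python
from dataclasses import dataclass

# Hypothetical alert thresholds, for illustration only.
FEVER_C = 38.0
LOW_SPO2 = 92.0

@dataclass
class VitalsReading:
    """One telemetry sample from a wearable/IoMT sensor."""
    patient_id: str
    temperature_c: float
    spo2_percent: float
    heart_rate_bpm: int

def triage(reading):
    """Return alert flags for a single telemetry reading."""
    alerts = []
    if reading.temperature_c >= FEVER_C:
        alerts.append("fever")
    if reading.spo2_percent < LOW_SPO2:
        alerts.append("low_oxygen")
    return alerts

sample = VitalsReading("patient-001", 38.4, 90.5, 101)
print(triage(sample))  # ['fever', 'low_oxygen']
```

In a deployed system, such readings would stream from the device to an edge gateway or cloud service, with only flagged readings escalated to clinicians.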

Artificial Intelligence (AI)
AI is at the top of the list of transformative technologies of 2022, and the horizon of its application is increasing day by day. AI has become the focus of research across the globe, and its adoption has increased in many commercial sectors [42].
AI has tremendous applications in the healthcare sector, such as drug discovery, pandemic management, diagnosis, disease prediction, decision-making, health informatics, and disease surveillance, to name a few. AI has revolutionized the healthcare industry in many ways, and modern healthcare systems are adopting AI-led methods and tools [43,44]. In the COVID-19 era, AI has remained in the spotlight and has helped curb the spread of COVID-19 in many ways [45]. Applications of AI, such as identifying the possibility of infection from cough samples, have proved handy in constraining this deadly disease [46]. In the coming years, AI will be the leading technology in the healthcare sector, with innovative applications that can eliminate/reduce time-consuming data collection/processing methods. Furthermore, the development of pruning and quantization techniques will likely increase the performance of AI models two- or three-fold [47]. In addition, the new wave of revolution in AI, such as data-centric AI, is expected to increase the reputation of AI in many industrial sectors [48][49][50]. Lastly, the ability of AI to process heterogeneous data is assisting health professionals in various ways. Figure 6 presents a brief overview of AI applications with heterogeneous data in the context of COVID-19. Table 3 presents the latest and most promising applications of AI in the era of COVID-19, along with experimental details. Specifically, we discuss the applications of AI, the datasets used in the experiments, and the AI models employed to achieve the desired results. This detailed analysis can pave the way to understanding the pertinent role of AI in this pandemic. To the best of the authors' knowledge, such analysis has not been reported in previous research.
As shown in Table 3, diverse types of data were used in AI models to address multiple aspects (e.g., diagnosis, dynamics modeling, severity estimation, prediction of trends, flow modeling, contact tracing, suspect finding, etc.) of the COVID-19 pandemic. Figure 7 demonstrates some representative data about the use of AI for COVID-19 diagnosis. As discussed in Table 3, there exist many applications of AI in the context of COVID-19. Apart from these, there are many practical examples such as treatment (drug discovery/repurposing), estimation (future spread dynamics, infection rates), diagnosis (text-, image-, and sound-data-based analytics), association analysis, etc. Many studies have described various AI examples in the context of COVID-19 leveraging heterogeneous data [87,88]. The latest review has presented many innovative applications of AI in terms of prediction, diagnosis, drug discovery, and vaccine development [89]. All these examples have contributed to lowering the effects of COVID-19 and developing appropriate treatments for it.
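To make the diagnostic use of AI concrete, a minimal sketch of a symptom-based classifier is shown below. The logistic regression is trained from scratch on a tiny synthetic dataset; the symptom features and labels are purely illustrative assumptions, not clinical data, and real diagnostic models are trained on large validated datasets.

```python
import math

# Toy synthetic data: [fever, dry_cough, loss_of_smell] -> COVID-like (1) or not (0).
X = [[1, 1, 1], [1, 1, 0], [1, 0, 1], [0, 1, 1],
     [0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]]
y = [1, 1, 1, 1, 0, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Plain stochastic gradient descent on the logistic log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            g = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

w, b = train(X, y)

def predict(x):
    """Probability that a symptom vector is COVID-like."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

print(predict([1, 1, 1]))  # high probability for all three symptoms
print(predict([0, 0, 0]))  # low probability with no symptoms
```

The same training loop generalizes to the richer feature sets (imaging embeddings, cough spectrograms, demographics) surveyed in Table 3, where deep models replace the linear scorer.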

Computer Vision (CV)
Various CV approaches have been developed so far, dealing with multiple aspects of combating COVID-19. These CV approaches vary in terms of their applications and fundamental benefits. Figure 8 highlights some of the key areas in which CV approaches have played a vital role in the era of COVID-19. CV has played a vital role in detecting COVID-19 using image data [90]. In most COVID-19-related applications (e.g., predictive modeling with ML, medical imaging, disease spread modeling and control, symptom clustering and analysis, etc.), CV has been integrated with AI models to control the pandemic [91]. Specifically, CV has been widely used in three different aspects of COVID-19: diagnosis and treatment, control and prevention, and clinical treatment [92]. Due to its ability to work with image data, CV has been widely used to identify infected lung regions, identify pneumonia and COVID-19, segment images, and select regions of interest [93]. Furthermore, CV-based approaches have been widely used to enforce social distancing, which is regarded as a well-known non-pharmaceutical intervention [94]. CV approaches have also been integrated with other emerging technologies to control the effects of this pandemic [95,96], and have assisted in controlling the spread of COVID-19 through, for example, face mask detection [97]. A detailed analysis of CV applications in the context of COVID-19 can be found in a recent study [98]. In conclusion, CV approaches have been widely used to control and manage this pandemic effectively.
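The social distancing enforcement described above typically runs a person detector over CCTV frames and then checks pairwise distances. A minimal sketch of the distance-checking stage follows; it assumes an upstream detector has already produced ground-plane coordinates in metres (e.g., after homography calibration), which is an assumption of this illustration.

```python
import math
from itertools import combinations

def distancing_violations(centroids, min_distance_m=2.0):
    """Return index pairs of detected people standing closer than min_distance_m.

    centroids: list of (x, y) ground-plane positions in metres.
    """
    violations = []
    for (i, a), (j, b) in combinations(enumerate(centroids), 2):
        if math.dist(a, b) < min_distance_m:
            violations.append((i, j))
    return violations

# Illustrative detections: persons 0 and 1 are ~1.3 m apart, person 2 is far away.
people = [(0.0, 0.0), (1.2, 0.5), (5.0, 5.0)]
print(distancing_violations(people))  # [(0, 1)]
```

In a full pipeline, the flagged pairs would be drawn on the frame or used to trigger an audible alert, as in the deployments cited in [94].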

Blockchain
Like other technologies, blockchain (BC) technology has played a vital role in the pandemic arena, especially in resource deployment and planning operations [100]. This technology can assist in this pandemic crisis by providing effective solutions such as user privacy protection, outbreak tracking, performance enhancement of the medical supply chain, safe day-to-day operations, and donation tracking. BC technology has been widely used in contact tracing, data sharing, resource planning, and medical supplies distribution.
Sharma et al. [101] discussed nine practical applications of BC in the era of COVID-19: (i) information sharing, (ii) contact tracing, (iii) supply chains, (iv) contactless delivery, (v) insurance management, (vi) online education, (vii) resource management, (viii) response planning, and (ix) privacy preservation. Fusco et al. [102] presented a SWOT analysis of adopting BC-based prediction models in the healthcare sector and for SARS-CoV-2 infection. In some cases, BC technology was integrated with other technologies to combat the virus effectively [103]. Interestingly, BC technology was extensively used in developing and distributing COVID-19 vaccines [104,105]. BC technology has assisted in many ways during the ongoing pandemic [106]. Many contact tracing protocols have been developed to preserve the privacy of infected individuals and their contacts [107]. Digital technologies such as BC can play a vital role in combating future infectious diseases. There exists a variety of prospects/opportunities in which BC technology can be integrated with the healthcare sector to serve mankind (i.e., as a community-beneficial technology). BC technology can play a vital role in the post-pandemic era as well [108]. Due to its decentralized nature, it can overcome privacy issues in the healthcare sector [109]. Figure 9 presents the key applications of BC technology in the context of COVID-19. As shown in Figure 9, BC technology has been widely used to solve many problems in the ongoing pandemic. In addition, this technology was involved in more aspects than other COVID-19-fighting technologies. The adoption of BC-powered technologies was also higher due to minimal privacy concerns [110]. BC technology is expected to contribute effectively in the post-pandemic era in various ways.
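The tamper-evidence property underlying these BC applications (e.g., verifiable vaccination records or donation logs) can be sketched with a minimal hash-chained ledger. This is a simplified illustration, not a full consensus blockchain; the record fields are hypothetical.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over a block's linked-hash and data fields."""
    payload = {"prev": block["prev"], "data": block["data"]}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, data):
    block = {"prev": prev_hash, "data": data}
    block["hash"] = block_hash(block)
    return block

def verify(chain):
    """A chain is valid if every block's hash matches its contents
    and every block links to its predecessor's hash."""
    return (all(b["hash"] == block_hash(b) for b in chain)
            and all(c["prev"] == p["hash"] for p, c in zip(chain, chain[1:])))

genesis = make_block("0" * 64, {"event": "ledger-init"})
record = make_block(genesis["hash"], {"event": "vaccination", "subject": "sha256(citizen-id)"})
chain = [genesis, record]
print(verify(chain))             # True
record["data"]["event"] = "x"    # any tampering breaks verification
print(verify(chain))             # False
```

Real deployments add distributed consensus, signatures, and access control on top of this linkage, but the hash chain is what makes retroactive edits detectable.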

Federated Learning (FL)
Federated learning (FL) is a state-of-the-art (SOTA) technology that can work with heterogeneous data sources without centralizing the data [111]. FL has revolutionized the privacy domain with its unique concept (i.e., algorithms → data); in contrast, centralized learning (CL) brings data close to algorithms. The difference between CL and FL is demonstrated in Equation (1):

CL: w* = arg min_w L(w; D_1 ∪ D_2 ∪ ... ∪ D_K),   FL: w* = arg min_w Σ_{k=1}^{K} (n_k/n) L_k(w; D_k),   (1)

where D_k is the local dataset held by client k, n_k = |D_k|, n = Σ_k n_k, and L_k is the local loss function; in CL the data move to a central algorithm, whereas in FL the algorithm moves to the data and only model parameters are exchanged. Training large-scale AI models without centralizing data makes FL a leading technology of the future. FL has many potential applications in sectors such as healthcare, finance, supply chains, and social network analysis. Since its inception in 2017, it has been widely investigated from multiple perspectives. In the COVID-19 era, it has effectively served mankind by creating synergies with other technologies such as IoT, BC, CV, and robotics [112][113][114]. FL can provide all the services that AI provides without acquiring data in a central place. Thanks to FL, the data silo and data winter problems are being effectively solved across the globe [115]. In the COVID-19 era, FL has played a vital role in management, COVID-19 detection, pandemic control, privacy protection, and policy planning [116]. Figure 10 presents an overview of the data island problem and the FL working mechanism that can be regarded as the solution to this practical problem. As shown in Figure 10a, all hospitals individually train the model with their own data. However, there exist structural and quantity problems with the data at each site (i.e., hospital), and therefore the quality of most AI models is low. One promising solution would be to share the data and train a powerful AI model; however, recent privacy regulations prevent data sharing due to privacy risks. In this case, all hospitals face data quality issues and low AI model performance.
In contrast, FL (Figure 10b) can solve these problems by performing local training and sharing only the model parameters with the server for aggregation purposes; since FL does not orchestrate data, it is a legally compliant technology. Recently, FL has created synergies with many emerging technologies to enhance its technical persuasiveness in the context of COVID-19. Figure 11 demonstrates the synergies of FL with other emerging technologies. The purpose of these synergies is to accomplish multiple tasks such as privacy preservation, servicing end users, data sharing, and response planning for the COVID-19 pandemic. Through a detailed analysis of the SOTA published in the past 3 years, we summarize the applications of FL in Table 4. To the best of our knowledge, these applications remained unexplored in the recent literature. There exist many successful examples in which FL has significantly contributed in the context of COVID-19. FL has contributed to preserving the privacy of patient data while still permitting the training of AI models. It has helped efficiently and correctly diagnose COVID-19 patients by leveraging image data. It has also contributed to predicting the dynamics of infection and the medical supplies needed to combat the challenges of COVID-19. We refer interested readers to our recent publication that solely describes the examples and applications of FL in the context of COVID-19 [117]. Many examples of FL on biomedical data concerning the COVID-19 pandemic have also been reported in a recent study [118]. Based on the above analysis, it can be concluded that there exist many successful examples of FL in the context of COVID-19.
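The server-side aggregation step at the heart of FL can be sketched as a weighted parameter average, as in the standard FedAvg scheme: each client trains locally and the server averages the received parameters weighted by local sample counts. The client parameters and sample counts below are illustrative.

```python
def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: average parameter vectors weighted by sample counts.

    client_weights: list of equal-length parameter vectors, one per client.
    client_sizes:   number of local training samples at each client.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(dim)]

# Three hypothetical hospitals with locally trained 2-parameter models.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [100, 100, 200]
print(fedavg(weights, sizes))  # [3.5, 4.5]
```

In a full FL round, the server would broadcast the aggregated vector back to the hospitals for the next round of local training; at no point do the raw patient records leave any site.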

Robotics
Robotics has also played a vital role during this ongoing pandemic in terms of medical supplies, temperature checking, data collection, alerting people to stay away from contaminated places, making announcements, analyzing mask-wearing status, and much more [150]. In South Korea, robots were used at crowded places such as airports for social distance monitoring, mask checking, and alerting people when their distance from others was less than 2 m. Such applications have proved the effective use of robot technology in the ongoing pandemic. In most cases, robots were used to collect data that could be analyzed with the latest AI technologies. Robots were able to perform multiple tasks involving the delivery of samples, patient information, and equipment in COVID-19 hospitals, reducing the possibility of infection. In the era of COVID-19, online shopping was significantly enhanced through fast logistics systems in which robots/UAVs were employed to deliver food (or medical) supplies and other commodities when in-person delivery was not allowed. Countries such as South Korea, China, the UAE, and the United States launched contactless delivery systems in which products ordered online are dropped off at designated areas/locations rather than handed to customers directly. All these measures were adopted to lower the risk of contact and corresponding infection [151,152].
Robotics has been widely integrated with many of the latest AI techniques to enhance quality and make healthcare systems cost-effective [153]. Similarly, robotics combined with IoT systems has helped infected and disabled people during the ongoing pandemic [154]. During this pandemic, robotic technologies were used in different scenarios, including clinical care, disease prevention and monitoring, laboratory automation, medical supplies delivery, logistics, alerting, and the maintenance of socioeconomic activities across the globe [155]. Recently, robots were employed to collect relevant data and send them to servers for analytics purposes [156]. Robotics-powered medical applications are advancing day by day, and robots have been used in the medical domain for well over 30 years [157]. These applications prove the significant role of robotics in the COVID-19 era.

Tiny Machine Learning (TinyML)
Tiny machine learning (TinyML) is the latest development in the world of AI and deep learning. TinyML brings the capability to run ML models on ubiquitous microcontroller units (MCUs) [158]. The MCU is a small electronic chip that is present almost everywhere these days. MCUs are the brains of many devices: from an elevator to a TV remote controller to a smart speaker, they are present everywhere. Multiple sensors or wearables that can send telemetry data are connected to an MCU, as are actuators such as motors and switches. The MCU carries embedded code/logic that can obtain data from the wearables/sensors and control the actuators. The evolution of TinyML marks a significant paradigm shift in how end-users benefit from AI [159]. Vendors from the software and hardware industries are collaborating to efficiently bring AI models close to MCUs. The ability to run sophisticated AI models embedded within an electronic device (or even inside the human body) opens up many commercial avenues.
TinyML does not require a cloud, edge, or Internet connection; it runs locally on the same MCU that has the logic and control to effectively manage the connected actuators and sensors. Two prior stages of evolution provide the context for TinyML: (i) AI in the cloud and (ii) AI at the edge [160]. TinyML is regarded as the big future of ML. Figure 12 presents the composition of the TinyML paradigm: TinyML is the amalgamation of three main components, namely software, hardware, and algorithms. Table 5 presents the promising applications of TinyML in the era of COVID-19. As shown in Table 5, TinyML has played a vital role in many aspects concerning COVID-19. Detailed information related to the use of TinyML in healthcare can be learned from recent studies [173,174]. The TinyML field is relatively new, but many applications will be developed in the coming years, especially in the medical field [175]. The synergy of TinyML with other technologies such as IoT is also increasing day by day [176]. TinyML demands more effort from vendors such as chip makers, compiler companies, and service providers to push ML to the deepest possible edge of the IoT.
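A core step in fitting models onto MCU flash/RAM is post-training quantization. The sketch below shows a simplified symmetric int8 quantization of a weight vector, a stripped-down version of what TinyML toolchains perform; the weight values are illustrative.

```python
def quantize_int8(weights):
    """Map float weights to int8 codes with a single symmetric scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    codes = [round(w / scale) for w in weights]  # each code fits in [-127, 127]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from int8 codes."""
    return [c * scale for c in codes]

w = [0.12, -0.5, 0.33, 0.98, -0.07]
q, s = quantize_int8(w)
restored = dequantize(q, s)
max_err = max(abs(a - b) for a, b in zip(w, restored))
print(q)        # int8 codes, 1 byte each instead of 4
print(max_err)  # reconstruction error is bounded by scale / 2
```

Storing one byte per weight instead of four cuts model size roughly 4x, at the cost of a bounded rounding error, which is why quantization (often with per-channel scales and calibration) is standard practice for MCU deployment.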

Edge Computing
Many cloud-based applications use data centers as central servers to process the huge amounts of data produced by edge devices such as tablets, wearables, smartphones, and industrial units. This cloud-centric model increases both computational and communication overheads, adversely affecting Quality of Service (QoS) and Quality of Experience (QoE). To resolve these issues, the concept of edge computing (EC) was proposed, which moves some of the computational burden toward the edge of the network to exploit computational capabilities that are currently untapped in edge nodes such as routers, base stations, and switches [177]. Figure 13 shows a schematic of EC. The EC paradigm is imperative for performing most calculations/computations at the edge of the network, and it can solve many latency challenges of traditional cloud-centric approaches. EC is one of the emerging technologies, as it is lightweight and can contribute to small-scale data preservation and processing [178,179]. Due to its ability to process data at a much faster rate than cloud computing, EC has been widely used in the COVID-19 era for disease mitigation, detection, management, and mask detection [180][181][182]. EC has created synergies with AI approaches to resolve many challenges concerning the ongoing pandemic [183]. Due to its ability to process data quickly, it has significantly contributed to controlling the pandemic [184]. In this pandemic, EC-based technologies have assisted in detecting COVID-19 from X-ray images [185,186]. EC technology is beneficial in terms of minimizing data transmissions to cloud servers and performing most processing at the edge, thereby improving users' privacy [187]. Furthermore, EC can acquire data from heterogeneous sources, thereby contributing to the clinical screening of COVID-19 [188].
EC technologies have also been used to reduce communication, energy, and computation overheads while detecting COVID-19 from X-ray images [189]. In addition, EC techniques contribute to enhancing training efficiency without sacrificing model accuracy. In some cases, EC technology was used to monitor social distancing to curb the spread of COVID-19 [190]. There exist some explainable AI- and EC-based implementations with promising results in COVID-19 scenarios, which prove the technical significance of EC in the COVID-19 era [191]. EC can work with AI techniques without orchestrating data to a central server, and therefore some applications of EC combined with AI have contributed to ambient assisted living [192]. The integration of the IoE with EC is expected to further enhance the application scenarios and promising trends of EC in healthcare [193]. The application of EC in healthcare is increasing day by day, and a cooperative architecture that can contribute to disease diagnosis has been developed recently [194]. Cloud-, fog-, and edge-computing-based architectures have been developed to make appropriate decisions to control the ongoing pandemic [195,196]. EC architectures have been widely used in analyzing the dynamics of infectious diseases, which can assist in mitigating the spread of COVID-19 [197]. Detailed information about EC applications, service scenarios, and use cases can be learned from previous studies [198][199][200][201]. EC technology is going through an immense revolution and is a SOTA technology of the future with many applications [202,203]. Unique applications such as the privacy preservation of personal data have enhanced the acceptance of EC-based technologies across the world [204][205][206]. From the discussion of its promising applications, it can be seen that EC is a promising technology for fighting current and future infectious diseases.
The promising applications of EC in the COVID-19 arena are given in Figure 14.

Natural Language Processing
During the ongoing pandemic, natural language processing (NLP) models have been increasingly used to detect misinformation, recognize entities, answer questions, and discover symptoms/knowledge [207,208]. NLP has also assisted in understanding the temporal evolution of COVID-19 through robust analysis of the published studies [209]. NLP techniques were used to identify key issues (i.e., topics and sentiment polarity) from social media data [210], and were extensively used to analyze the side effects of the COVID-19 pandemic on the general public [211]. NLP techniques are designed to significantly reduce the time a doctor spends on documentation, freeing doctors to work with patients directly during the ongoing pandemic [212]. Furthermore, NLP techniques have greatly contributed to identifying populations at higher risk during the COVID-19 pandemic [213], and have been vital in extracting knowledge that is imperative for controlling, treating, and managing strains of COVID-19 [214]. As a tool to understand and analyze human language, NLP has become an integral part of smart healthcare [215,216]. In this pandemic, NLP-based predictive analytics successfully enabled medical data-driven patient guidance and pooled testing [217]. The latest applications of NLP contributed to knowledge extraction, topic modeling, and entity recognition using data of various kinds [218]. A comprehensive analysis of SOTA NLP approaches and their efficacy in the COVID-19 era is given in a recent study [219]. Figure 15 presents the applications of NLP in the context of COVID-19 along with the data sources; from it, it can be observed that NLP has contributed to multiple aspects of COVID-19. Sengupta et al.
[220] discussed an NLP-based approach for sentiment analysis to determine the impact of the pandemic on mental health using tweet data. Similarly, Ye et al. [221] analyzed public sentiment toward COVID-19 vaccination using social media data in the U.S., discussing public concerns and attitudes regarding vaccination. In some cases, NLP techniques have helped identify outbreaks of COVID-19 in public places [222]. NLP-based techniques have also increasingly been used to study the spread of COVID-19 through analysis of genes present in the complete sequence [223]. Heider et al. [224] developed an NLP-based tool, DECOVRI, for extracting relevant information from clinical notes that can be used to control (or plan a response to) the COVID-19 pandemic; DECOVRI is expected to be released as an open-source tool. Detailed information about NLP models and the data used to fight the pandemic can be found in recent studies [225,226]. Soon, the role of NLP as a mobile doctor or voice assistant will be imperative to improving healthcare for the general public [227,228]. The analysis cited above can assist in understanding the role of NLP technology in the era of COVID-19. Recently, NLP-powered robots and chatbots have also played a vital role in the healthcare industry.
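As an illustration of the sentiment-polarity pipelines described above, the sketch below scores tweets with a toy word-list approach. The lexicons and function names are hypothetical stand-ins for the resources (e.g., curated sentiment lexicons or transformer classifiers) that studies such as [220,221] actually use; the point is only the shape of the pipeline: tokenize, score, aggregate.

```python
import re

# Toy polarity lexicons; real systems rely on curated resources or
# learned classifiers. These word lists are illustrative only.
POSITIVE = {"safe", "effective", "hope", "relief", "recovered", "protected"}
NEGATIVE = {"fear", "anxious", "died", "worse", "unsafe", "hoax", "scared"}

def polarity(tweet: str) -> int:
    """Return +1 (positive), -1 (negative), or 0 (neutral/mixed)."""
    tokens = re.findall(r"[a-z']+", tweet.lower())
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return (score > 0) - (score < 0)

def vaccine_sentiment(tweets):
    """Aggregate per-tweet polarity into overall sentiment counts."""
    labels = [polarity(t) for t in tweets]
    return {"positive": labels.count(1),
            "negative": labels.count(-1),
            "neutral":  labels.count(0)}
```

The same aggregate-counts interface is what allows such analyses to track attitude shifts over time, e.g., by bucketing tweets per week before calling the aggregator.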

Synthetic Data (SD)
Synthetic data (SD) is becoming a leading privacy-enhancing technology (PET): it offers analytic results similar to those of real data even when access to the real/reference data is restricted [229]. SD can be generated with either mathematical models or machine/deep learning methods that replicate the structure and distributions of real data [230]. SD is a main pillar of the well-being and health domains for verifying multiple hypotheses as well as generating new hypotheses in biomedical research [231]. SD has many benefits in advancing science and influencing societies, especially in the era of artificial intelligence (AI) and big data [232]. We demonstrate the efficacy of SD in real scenarios from six aspects as follows.

1. SD can be shared on a large scale, which may not be possible with real data due to growing privacy concerns and legal enforcement measures.
2. SD can augment the performance of AI models in most real-world scenarios by supplying large amounts, and a variety, of SD for training ML/DL models [233].
3. It can be a pertinent solution to the data island problem (e.g., lower performance of AI models due to large differences in the sizes and distributions of data at each site).
4. It can provide early access to data when real data cannot be accessed due to small size or unexpected circumstances (taking COVID-19 as an example).
5. SD can be generated in different formats, such as medical images, electronic health records, time series, biomedical signals, or activity data, which can be vital for research (or policy-making) and analytical tasks [234].
6. It can be a leading PET (like FL) by restricting access to the real data while still permitting analytics of various kinds (e.g., visualizing the data).
SD can be generated with the help of AI models (i.e., generative adversarial networks and their variants); an example of SD generation with a conditional GAN is shown in Figure 16. Recently, SD has become one of the most prominent technologies for fulfilling the demand for high-quality data that is not easily accessible for analytics and mining [235][236][237]. SD has contributed to solving many research problems, such as speech recognition [238], privacy preservation [239], money laundering detection [240], medical image segmentation [241], shockable rhythm detection [242], protection against membership inference attacks [243], innovative healthcare applications [244], privacy and utility enhancement [245], mobility modelling [246], object detection/manipulation [247], crack identification [248], crowd counting [249,250], vague data classification [251], grammatical error detection [252], AI model training for medical applications [253], support in clinical developments [254], time series analysis [255], data loss prevention [256], and surgical planning [257]. SD has contributed significantly in the era of COVID-19; Table 6 summarizes its potential applications in the COVID-19 context.
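As a minimal example of the "mathematical model" route to SD generation mentioned above (as opposed to GAN-based generation), the sketch below fits an independent Gaussian to each column of a small tabular dataset and samples synthetic rows from it. The example records and column choices are invented for illustration; they merely stand in for restricted patient data.

```python
import random
import statistics

def fit_gaussian_model(real_rows):
    """Estimate a per-column (mean, stdev) from real tabular data."""
    cols = list(zip(*real_rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def sample_synthetic(model, n, rng=None):
    """Draw n synthetic rows from the fitted column-wise Gaussians."""
    rng = rng or random.Random()
    return [[rng.gauss(mu, sd) for mu, sd in model] for _ in range(n)]

# Hypothetical (age, oxygen saturation) pairs standing in for real records.
real = [(34, 97.0), (61, 91.5), (48, 95.0), (72, 88.0), (55, 93.5)]
model = fit_gaussian_model(real)
synthetic = sample_synthetic(model, 1000, rng=random.Random(0))
```

Because the columns are sampled independently, correlations in the real data are lost; this is precisely the gap that GANs, copulas, and similar learned generators aim to close while preserving the privacy benefit.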
As shown in Table 6, SD has significantly contributed to addressing many concerns related to the ongoing pandemic. SD is a mainstream technology in the medical field for augmenting the performance of many AI models and making data broadly available. Further information on SD in medical research can be obtained from a recent study [258]. In the coming years, SD will be an integral component of many AI models; it is already widely used to satisfy data governance and use requirements in most sectors. Importance of good-quality data in the context of COVID-19: Most of the above technologies used heterogeneous data stemming from clinical analysis or collected directly from people. Since data has replaced oil as the most economically desirable resource in the world, its collection and utilization have become challenging [280]. Some studies have highlighted the need for new, good-quality datasets to fight the pandemic [281]; recent AI-based techniques require good data to predict the course of this disease [282]. Figure 17 presents the data sources and digital solutions that were employed to fight the pandemic. From this analysis, it can be observed that a variety of data was used to bring the pandemic under control in the absence of vaccines. Similarly, future infectious diseases can be handled well using diverse datasets and advanced AI (or digital) technologies. Recently, the quality and availability of data have become a hot research topic in many disciplines; to this end, a new discipline named data-centric AI has emerged [283,284], and exploring its use in the COVID-19 era has become more urgent than ever [285]. COVID-19 is pushing us toward a new (fifth) industrial revolution and, therefore, the role of data-related movements and technological advancements is imperative in the context of future infectious diseases [286].
The analysis and discussion above are imperative for understanding the role of technology in the era of COVID-19. In addition, they help identify research gaps that, in turn, can strengthen the response to this pandemic.

Summary and Comparisons
In this paper, we described the technical efficacy and applications of the top ten digital technologies that have proven successful in the era of COVID-19. Specifically, we selected digital technologies that have had commercial success in many countries across the globe. Our analysis helps readers quickly learn which technologies played a vital role in the ongoing pandemic. Moreover, we present the analysis in pictorial form, which is convenient for researchers and experts, and appropriate references have been inserted for detailed discussion and further reading. To the best of our knowledge, we are the first to propose such an in-depth analysis of emerging technologies in the context of COVID-19. We presented a detailed analysis in tabular and textual form for each identified technology. For five key technologies, we presented and compared the SOTA studies in tabular form, focusing on experimental details; for the other five, the related literature was analyzed in textual, mostly descriptive form. Table 7 presents further information concerning the analysis of SOTA studies in each technology. There also exist hybrid approaches that used more than one technology to fight the pandemic. Nair et al. [287] used IoMT and FL together to restrict privacy issues in big data analytics. Aich et al. [288] employed BC and FL to protect the privacy of sensitive EHRs. Zhang et al. [289] fused FL models with EC to robustly detect COVID-19 infection from X-ray or CT scan data. Jat et al. [290] integrated drones and EC to control outbreaks in various situations. Firouzi et al. [291] employed four different technologies (IoT, BC, AI, and robotics) in the healthcare sector to manage the pandemic. Sahu et al.
[292] integrated AI and drones to prevent COVID-19 infection. Feldman et al. [293] employed NLP and AI to plan responses to COVID-19 scenarios. Jaimin et al. [294] employed NLP and EC to develop interactive chatbots for COVID-19 scenarios. Poongodi et al. [295] highlighted the role of multiple technologies in the context of COVID-19. Kanade et al. [296] developed a full technology stack to remotely monitor COVID-19 patients. A comparative analysis of all ten identified technologies is given in Table 8. We compare each technology on three grounds (i.e., use in the COVID-19 context, main contributions, and performance). It is important to note that each technology has contributed to multiple aspects of COVID-19; however, we present only the major contributions here. This analysis can pave the way to a meticulous understanding of the technologies' roles in the era of COVID-19. Our study is more comprehensive and systematic than previous surveys, which either focused on COVID-19 detection (i.e., only one scenario) or on generic applications of these technologies in the healthcare sector. Furthermore, most previous studies covered the roles of only one or two technologies in the pandemic era, provided very limited experimental details (models, data, and applications), and mostly covered studies from the early days of the pandemic that are not fully reliable. Our work addresses all these limitations and presents extended knowledge extracted from high-quality research papers along with experimental details. It aligns with recent trends toward developing and analyzing community-beneficial technologies.

Future Research Challenges and Directions
It is important to note that COVID-19 has accelerated digital innovation, and many digital tools have been developed to either handle the pandemic or regulate businesses. This rapid trend of digital innovation and software development will likely persist in the post-COVID-19 era. However, data modalities and scale differ greatly from the pre-pandemic arena, so many challenges exist in data collection, processing, and use. For example, the scale and scope of data have changed significantly amid the pandemic, and more robust techniques are required to handle them. Furthermore, highly diverse types of data (e.g., life logs, location, working environment, etc.) are being collected, which can lead to privacy issues of diverse kinds. To provide a clear overview, we arrange future research challenges and directions into six broad categories (i.e., data, software, hardware, AI, privacy, and general), as shown in Figure 18.
From the perspective of data, handling diverse data types (e.g., tables, graphs, metrics, location traces, mobility graphs, heat maps, etc.) and extracting the knowledge enclosed in them is very challenging. Processing and storing the huge volumes of data stemming from different automated tools (e.g., contact tracing apps, quarantine monitoring apps, proximity detection software, etc.) is also difficult. Furthermore, a large amount of pandemic data is noisy, so sophisticated techniques are required to enhance its quality, yet there is a lack of domain experts who can contribute to this task. On the other hand, most contact tracing applications collect and store data that may not be needed at all; therefore, appropriate data collection methodologies are required. In conclusion, multiple challenges concerning data quality exist in the COVID-19 arena. From the perspective of software, reducing computing overheads while processing substantial amounts of data is very challenging, as is ensuring robustness while querying large amounts of data. Developing software (e.g., decision support systems) that can diagnose COVID-19 patients effectively is also difficult: although much medical software has been developed, its efficacy in correctly analyzing all dynamics of COVID-19 was relatively low compared to other well-known diseases. From the perspective of hardware, some architectures yield below-par performance due to complex arithmetic operations. Therefore, upgrading hardware architectures and developing dedicated hardware that can speed up computation is challenging; in the future, dedicated hardware will be of paramount importance for accelerating computation on large datasets.
From the perspective of AI, lowering the overall complexity of neural network models while building AI models for diagnosis is very challenging. Quantizing AI models and pruning redundant weights so that the models can run on edge devices is also challenging and requires expertise, as is reducing the computing overhead of AI models. In many settings, such as supervised learning, considerable human involvement is needed for feature engineering, feature selection, model selection, hyperparameter tuning, etc.; developing automated tools that limit human involvement as far as possible is therefore difficult. In addition, developing AI models that are free from biases is one of the biggest challenges when applying AI to any real-world problem, and providing explanations along with AI models' decisions is also hard. Lastly, improving AI models' performance when data is scarce or of poor quality remains an open challenge.
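The pruning and quantization steps mentioned above can be sketched in a few lines. The following magnitude-based pruner and symmetric integer quantizer operate on a flat weight list for clarity; production toolchains (which this sketch does not attempt to reproduce) work tensor by tensor and typically use calibration data.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest |w|.

    Ties at the threshold may zero slightly more than the requested
    fraction; a sketch-level simplification.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)
    threshold = flat[k - 1] if k > 0 else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_symmetric(weights, bits=8):
    """Map floats to signed integers of the given bit width (symmetric scheme).

    Returns the integer codes and the scale needed to dequantize them.
    """
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale
```

Dequantizing (`code * scale`) recovers each weight up to half a quantization step, which is the accuracy/size trade-off edge deployment accepts.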
Privacy has remained one of the biggest challenges and barriers to the adoption of digital technologies in the context of COVID-19. In the COVID-19 era, maintaining the balance between public safety and privacy was very difficult; in many countries, the adoption of contact tracing apps was very low due to privacy concerns. Apple and Google devoted efforts to developing a contact tracing platform that collects no identity-related information [297], and many BC-related tools were also developed to combat the privacy paradox [298,299]. Despite these developments, privacy preservation in the digital tools employed to control the pandemic is still very challenging, as is developing privacy pipelines that can guarantee end-to-end privacy [300]. Lastly, deriving useful knowledge with the least data (e.g., without collecting explicit identity-related data) is also an open challenge.
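One standard building block for such privacy pipelines is differentially private release of aggregate statistics: adding Laplace noise calibrated to a query's sensitivity and a privacy budget epsilon. The sketch below shows this for a simple case count (sensitivity 1). It is the textbook Laplace mechanism, not the mechanism of any specific tool cited above.

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample a Laplace(0, scale) variate via inverse-CDF sampling."""
    u = rng.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, rng=None):
    """Release a count under epsilon-DP using the Laplace mechanism.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so the noise scale is 1/epsilon.
    """
    rng = rng or random.Random()
    return true_count + laplace_noise(1.0 / epsilon, rng)
```

Smaller epsilon means stronger privacy but noisier releases; a health authority publishing daily case counts per district would pick epsilon to balance the two.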
Lastly, developing contact tracing apps that can robustly find the contacts of an infected individual was also very challenging; due to privacy issues, their adoption was low, and contact tracing was ineffective in many countries of the world. Furthermore, getting a correct picture of the pandemic by leveraging heterogeneous data was difficult: for example, quantities such as the peak time of the pandemic, its likely end, and case counts were very hard to estimate. In addition, the use of digital technologies to find a candidate drug for COVID-19 was also challenging. All these challenges require robust solutions in the near future.
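A minimal version of the contact-finding step can be sketched as follows: given co-location intervals (however derived, e.g., from Bluetooth proximity events) and a set of infected individuals, flag everyone whose cumulative exposure exceeds a threshold. The data layout and the 15-minute threshold are assumptions for illustration; real deployments operate on rotating anonymous identifiers rather than names, precisely because of the privacy issues discussed above.

```python
from collections import defaultdict

def find_contacts(colocation_log, infected, window_s=900):
    """Return people whose total co-location time with any infected
    person reaches window_s seconds (15 minutes by default).

    colocation_log: iterable of (person_a, person_b, start_s, end_s)
    tuples describing overlapping presence intervals.
    """
    exposure = defaultdict(float)
    for a, b, start, end in colocation_log:
        duration = end - start
        if a in infected and b not in infected:
            exposure[b] += duration
        elif b in infected and a not in infected:
            exposure[a] += duration
    return {p for p, t in exposure.items() if t >= window_s}
```

Accumulating exposure across intervals (rather than testing each interval alone) matters: several short encounters with infected individuals can add up to a notifiable exposure.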
Based on our painstaking analysis of the published literature, we suggest the following topics that require further research/development from the research community: (i) development of privacy-enhancing technologies, (ii) development of data fusion and knowledge discovery methods, (iii) development of dedicated software and hardware for COVID-19 scenarios, (iv) development of strong security mechanisms and secure multiparty computing techniques, (v) development of automated data processing methods, (vi) development of data-centric AI methods for COVID-19, (vii) development of methods for solving data imbalance and heterogeneity problems, (viii) development of innovative technologies for tracing infected people, (ix) development of decentralized apps that do not leak data, and (x) development of self-diagnosis apps or frameworks. All of these directions can contribute to developing community-beneficial technologies that, in turn, can serve humans in various ways.

Conclusions and Future Work
This paper presented systematic coverage of the top ten technologies that significantly helped the general public in lowering the effects of COVID-19, in particular when vaccines were unavailable. We provided extended details (e.g., applications, data details, model names, pertinent references, etc.) about each technology and discussed state-of-the-art studies. Different from previous work, our analysis targets mainstream technologies and their benefits/applications solely in the context of COVID-19. Our painstaking analysis can help the computer science community quickly grasp the research status of major technologies in the fight against the COVID-19 pandemic. Lastly, our work aligns with recent trends toward developing/analyzing digital technologies that can serve mankind effectively. In the future, we intend to explore the role of all information and communication technologies (ICTs) that helped the general public during the ongoing pandemic.

Conflicts of Interest:
The authors declare no conflict of interest.