Search Results (7)

Search Parameters:
Authors = Insoo Sohn

37 pages, 2323 KiB  
Article
Smart Lithium-Ion Battery Monitoring in Electric Vehicles: An AI-Empowered Digital Twin Approach
by Mitra Pooyandeh and Insoo Sohn
Mathematics 2023, 11(23), 4865; https://doi.org/10.3390/math11234865 - 4 Dec 2023
Cited by 18 | Viewed by 6863
Abstract
This paper presents a transformative methodology that harnesses the power of digital twin (DT) technology for the advanced condition monitoring of lithium-ion batteries (LIBs) in electric vehicles (EVs). In contrast to conventional solutions, our approach eliminates the need to calibrate sensors or add additional hardware circuits. The digital replica works seamlessly alongside the embedded battery management system (BMS) in an EV, delivering real-time signals for monitoring. Our system is a significant step forward in ensuring the efficiency and sustainability of EVs, which play an essential role in reducing carbon emissions. A core innovation lies in the integration of the digital twin into the battery monitoring process, reshaping the landscape of energy storage and alternative power sources such as lithium-ion batteries. Our comprehensive system leverages a cloud-based IoT network and combines both physical and digital components to provide a holistic solution. The physical side encompasses offline modeling, where a long short-term memory (LSTM) algorithm trained with various learning rates (LRs) and optimized by three types of optimizers ensures precise state-of-charge (SOC) predictions. On the digital side, the digital twin takes center stage, enabling the real-time monitoring and prediction of battery activity. A particularly innovative aspect of our approach is the utilization of a time-series generative adversarial network (TS-GAN) to generate synthetic data that seamlessly complement the monitoring process. This pioneering use of a TS-GAN offers an effective solution to the challenge of limited real-time data availability, thus enhancing the system’s predictive capabilities. By seamlessly integrating these physical and digital elements, our system enables the precise analysis and prediction of battery behavior. 
This innovation, particularly the application of a TS-GAN for data generation, significantly contributes to optimizing battery performance, enhancing safety, and extending the longevity of lithium-ion batteries in EVs. Furthermore, the model developed in this research serves as a benchmark for future digital energy storage in lithium-ion batteries and comprehensive energy utilization. According to statistical tests, the model has a high level of precision. Its exceptional safety performance and reduced energy consumption offer promising prospects for sustainable and efficient energy solutions. This paper signifies a pivotal step towards realizing a cleaner and more sustainable future through advanced EV battery management.
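As a rough illustration of the offline-modeling side described above, the sketch below runs a single NumPy LSTM cell over a window of battery measurements (e.g., current, voltage, temperature) and squashes the final hidden state into a state-of-charge estimate. The weights, layer sizes, and sigmoid readout are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_soc_estimate(seq, Wx, Wh, b, w_out, b_out):
    """seq: (T, n_in) measurement window; returns an SOC estimate in (0, 1)."""
    n_h = Wh.shape[0]
    h = np.zeros(n_h)
    c = np.zeros(n_h)
    for x in seq:
        z = x @ Wx + h @ Wh + b               # stacked gate pre-activations (4 * n_h)
        i, f, o, g = np.split(z, 4)           # input, forget, output gates; candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)            # cell-state update
        h = o * np.tanh(c)                    # hidden state
    return float(sigmoid(h @ w_out + b_out))  # squash readout into (0, 1)

rng = np.random.default_rng(0)
n_in, n_h, T = 3, 8, 20
Wx = rng.normal(0, 0.1, (n_in, 4 * n_h))      # random stand-in weights
Wh = rng.normal(0, 0.1, (n_h, 4 * n_h))
b = np.zeros(4 * n_h)
w_out = rng.normal(0, 0.1, n_h)
seq = rng.normal(0, 1, (T, n_in))             # stand-in for a sensor window
soc = lstm_soc_estimate(seq, Wx, Wh, b, w_out, 0.0)
```

In the full system, such a model would be trained offline (the paper sweeps learning rates and optimizers) and then queried by the digital twin in real time.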

23 pages, 1757 KiB  
Article
DB-COVIDNet: A Defense Method against Backdoor Attacks
by Samaneh Shamshiri, Ki Jin Han and Insoo Sohn
Mathematics 2023, 11(20), 4236; https://doi.org/10.3390/math11204236 - 10 Oct 2023
Cited by 4 | Viewed by 1936
Abstract
With the emergence of COVID-19 in 2019, machine learning (ML) techniques, specifically deep neural networks (DNNs), played a key role in diagnosing the disease in the medical industry due to their superior performance. However, the computational cost of DNNs can be quite high, making it often necessary to outsource the training process to third-party providers, such as machine learning as a service (MLaaS). Therefore, careful consideration is required to achieve robustness in DNN-based systems against cyber-security attacks. In this paper, we propose a method called the dropout-bagging (DB-COVIDNet) algorithm, which works as a robust defense mechanism against poisoning backdoor attacks. In this model, trigger-related features are removed by a modified dropout algorithm, and a new voting method in the bagging algorithm then produces the final results. We considered AC-COVIDNet, an attention-guided contrastive convolutional neural network (CNN), as the main inducer of the bagging algorithm, and evaluated the performance of the proposed method on the malicious COVIDx dataset. The results demonstrated that DB-COVIDNet has strong robustness and can significantly reduce the effect of the backdoor attack. The proposed DB-COVIDNet nullifies backdoors before the attack has been activated, reducing the attack success rate from 99.5% to 3% while maintaining high accuracy on clean data.
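The ensemble mechanism named above, bagging with majority voting, can be sketched in a few lines. The paper's inducers are AC-COVIDNet CNNs; here each inducer is a trivial threshold classifier fitted to a bootstrap sample, purely to show the voting structure.

```python
import random
from collections import Counter

def train_inducer(sample):
    # Hypothetical inducer: threshold at the mean feature value of its
    # bootstrap sample (a stand-in for training a CNN on that sample).
    thr = sum(x for x, _ in sample) / len(sample)
    return lambda x: 1 if x >= thr else 0

def bagging_fit(data, n_inducers=5, seed=0):
    rng = random.Random(seed)
    # Each inducer sees its own bootstrap resample of the training data.
    return [train_inducer([rng.choice(data) for _ in data])
            for _ in range(n_inducers)]

def bagging_predict(inducers, x):
    votes = Counter(clf(x) for clf in inducers)   # majority vote across inducers
    return votes.most_common(1)[0][0]

# Toy 1-D dataset: (feature, label).
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
ensemble = bagging_fit(data)
pred_low = bagging_predict(ensemble, 0.05)
pred_high = bagging_predict(ensemble, 0.95)
```

The defensive intuition is that a backdoor trigger memorized by one inducer is unlikely to dominate a majority vote across independently trained inducers, especially once dropout has pruned trigger-related features.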

12 pages, 1088 KiB  
Review
Neural Network Optimization Based on Complex Network Theory: A Survey
by Daewon Chung and Insoo Sohn
Mathematics 2023, 11(2), 321; https://doi.org/10.3390/math11020321 - 7 Jan 2023
Cited by 14 | Viewed by 4063
Abstract
Complex network science is an interdisciplinary field of study based on graph theory, statistical mechanics, and data science. With the powerful tools now available in complex network theory for the study of network topology, it is obvious that complex network topology models can be applied to enhance artificial neural network models. In this paper, we provide an overview of the most important works published within the past 10 years on the topic of complex network theory-based optimization methods. This review of the most up-to-date optimized neural network systems reveals that the fusion of complex and neural networks improves both accuracy and robustness. By setting out our review findings here, we seek to promote a better understanding of basic concepts and offer a deeper insight into the various research efforts that have led to the use of complex network theory in the optimized neural networks of today.
(This article belongs to the Special Issue Big Data and Complex Networks)
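One recurring idea in this line of work is replacing dense layer connectivity with a complex network topology. The sketch below generates a Watts-Strogatz small-world edge set that could serve as a sparse connectivity mask; the parameters are illustrative and not taken from any specific surveyed system.

```python
import random

def watts_strogatz_edges(n, k, p, seed=0):
    """Ring lattice of n nodes, each linked to its k nearest neighbours on one
    side, with every edge rewired to a random target with probability p."""
    rng = random.Random(seed)
    edges = set()
    for u in range(n):
        for j in range(1, k + 1):
            v = (u + j) % n                      # regular lattice neighbour
            if rng.random() < p:                 # rewire this edge
                v = rng.choice([w for w in range(n) if w != u])
            edges.add((min(u, v), max(u, v)))    # store undirected, deduplicated
    return edges

# A sparse connectivity pattern for a hypothetical 16-unit layer.
mask = watts_strogatz_edges(n=16, k=2, p=0.1)
```

With low rewiring probability p, the result keeps the lattice's high clustering while shortcuts shorten average path length, the small-world property the surveyed optimizations exploit.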

23 pages, 1625 KiB  
Review
Cybersecurity in the AI-Based Metaverse: A Survey
by Mitra Pooyandeh, Ki-Jin Han and Insoo Sohn
Appl. Sci. 2022, 12(24), 12993; https://doi.org/10.3390/app122412993 - 18 Dec 2022
Cited by 93 | Viewed by 17744
Abstract
The Metaverse is a multi-user virtual world that combines physical reality with digital virtual reality. The three basic technologies for building the Metaverse are immersive technologies, artificial intelligence, and blockchain. Companies are making significant investments into creating an artificially intelligent Metaverse, with the consequence that cybersecurity has become more crucial. As cybercrime increases exponentially, it is evident that a comprehensive study of Metaverse security based on artificial intelligence is lacking. A growing number of distributed denial-of-service attacks and thefts of user identification information make it necessary to conduct comprehensive and inclusive research in this field in order to identify the Metaverse's vulnerabilities and weaknesses. This article summarizes existing research on AI-based Metaverse cybersecurity and discusses the relevant security challenges. The results show that user identification plays a central role in the surveyed works, with biometric methods the most commonly used. While biometric data are considered the safest option due to their uniqueness, they are also susceptible to misuse. A cyber-situation management system based on artificial intelligence should be able to analyze data of any volume with the help of algorithms. To prepare researchers who will pursue this topic in the future, this article provides a comprehensive summary of research on cybersecurity in the Metaverse based on artificial intelligence.
(This article belongs to the Special Issue Enabling Technologies and Critical Applications of Metaverse)

10 pages, 2781 KiB  
Communication
A Study on the Performance of a Silicon Photodiode Sensor for a Particle Dosimeter and Spectrometer
by Bobae Kim, Uk-Won Nam, Sunghwan Kim, Sukwon Youn, Won-Kee Park, Jongdae Sohn, Hong Joo Kim, Seh-Wook Lee, Junga Hwang, Sung-Joon Ye, Insoo Jun and Young-Jun Choi
Sensors 2021, 21(23), 8029; https://doi.org/10.3390/s21238029 - 1 Dec 2021
Cited by 2 | Viewed by 3572
Abstract
A lunar vehicle radiation dosimeter (LVRAD) has been proposed for studying the radiation environment on the lunar surface and evaluating its impact on human health. The LVRAD payload comprises four systems: a particle dosimeter and spectrometer (PDS), a tissue-equivalent dosimeter, a fast neutron spectrometer, and an epithermal neutron spectrometer. A silicon photodiode sensor with compact readout electronics was proposed for the PDS. The PDS system aims to measure protons with energies of 10–100 MeV and to assess the dose in the lunar space environment. The manufactured silicon photodiode sensor has an effective area of 20 mm × 20 mm and a thickness of 650 μm; the electronics consist of an amplifier, an analog pulse processor, and a 12-bit analog-to-digital converter for signal readout. We studied the responses of the manufactured silicon sensors, read out with the self-made electronics, to gamma rays over a wide range of energies and to proton beams.
(This article belongs to the Special Issue Sensors and X-ray Detectors)
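The readout chain described above can be illustrated with a back-of-the-envelope sketch: a pulse amplitude is clamped, digitized by a 12-bit ADC, and mapped back to a deposited-energy estimate through a linear calibration. The reference voltage and calibration slope below are hypothetical values, not the instrument's.

```python
VREF = 3.3            # ADC full-scale reference voltage (V), assumed
N_BITS = 12           # resolution of the readout ADC
KEV_PER_VOLT = 3.0e4  # illustrative sensor+amplifier calibration slope

def adc_code(v_pulse):
    """Clamp a pulse amplitude to the input range and quantize to a 12-bit code."""
    v = min(max(v_pulse, 0.0), VREF)
    return min(int(v / VREF * (2 ** N_BITS)), 2 ** N_BITS - 1)

def energy_kev(code):
    """Invert the quantization to a deposited-energy estimate (keV)."""
    return code / (2 ** N_BITS) * VREF * KEV_PER_VOLT

code = adc_code(1.0)   # a 1.0 V pulse out of the analog pulse processor
e = energy_kev(code)   # corresponding energy estimate under the assumed slope
```

In practice the calibration slope and offset would be fitted from the measured gamma-ray and proton-beam responses rather than assumed linear from the start.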

16 pages, 1669 KiB  
Review
Edge Network Optimization Based on AI Techniques: A Survey
by Mitra Pooyandeh and Insoo Sohn
Electronics 2021, 10(22), 2830; https://doi.org/10.3390/electronics10222830 - 18 Nov 2021
Cited by 20 | Viewed by 7277
Abstract
The network edge is becoming a new solution for reducing latency and saving bandwidth in the Internet of Things (IoT) network. The goal of the network edge is to move computation from cloud servers to the edge of the network, near the IoT devices. The network edge, which must make smart decisions with fast response times, requires intelligent processing based on artificial intelligence (AI). AI is becoming a key component in many edge devices, including cars, drones, robots, and smart IoT devices. This paper describes the role of AI in the network edge. Moreover, this paper elaborates and discusses optimization methods for an edge network based on AI techniques. Finally, the paper considers security as a major concern and surveys prospective approaches to addressing it in an edge network.
(This article belongs to the Topic Internet of Things: Latest Advances)

11 pages, 3495 KiB  
Article
Reconstructing Damaged Complex Networks Based on Neural Networks
by Ye Hoon Lee and Insoo Sohn
Symmetry 2017, 9(12), 310; https://doi.org/10.3390/sym9120310 - 9 Dec 2017
Cited by 9 | Viewed by 4373
Abstract
Despite recent progress in the study of complex systems, the reconstruction of networks damaged by random and targeted attacks has not been addressed before. In this paper, we formulate the network reconstruction problem as an identification of network structure based on much reduced link information. Furthermore, a novel method based on a multilayer perceptron neural network is proposed as a solution to the network reconstruction problem. Simulation results demonstrate that the proposed scheme achieves very high reconstruction accuracy on the small-world network model and robust performance on the scale-free network model.
(This article belongs to the Special Issue Graph Theory)
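The problem framing above can be illustrated with a toy sketch: a single logistic unit (a minimal stand-in for the paper's multilayer perceptron) learns to score node pairs by their number of common neighbours in the damaged graph. The graph, feature, and training setup are illustrative, not the paper's.

```python
import math

def common_neighbours(adj, u, v):
    """Number of shared neighbours of u and v in the (damaged) graph."""
    return len(adj[u] & adj[v])

def train_link_scorer(examples, epochs=200, lr=0.1):
    """examples: list of (feature, label). Fits a logistic unit by SGD on
    log-loss; returns (w, b) and the per-epoch average loss history."""
    w, b = 0.0, 0.0
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, y in examples:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            total += -(y * math.log(p + 1e-12) + (1 - y) * math.log(1 - p + 1e-12))
            g = p - y                 # gradient of log-loss w.r.t. the logit
            w -= lr * g * x
            b -= lr * g
        losses.append(total / len(examples))
    return (w, b), losses

# Damaged toy graph: adjacency as neighbour sets.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}, 4: set()}
# (pair feature, is-link) examples: linked pairs tend to share neighbours.
examples = [(common_neighbours(adj, 0, 1), 1), (common_neighbours(adj, 0, 2), 1),
            (common_neighbours(adj, 3, 4), 0), (common_neighbours(adj, 0, 4), 0)]
(w, b), losses = train_link_scorer(examples)
```

A multilayer perceptron generalizes this by stacking hidden layers over richer pair features, letting it recover structure from much-reduced link information.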
