Search Results (2)

Search Parameters:
Keywords = gigantic jet

12 pages, 2619 KB  
Case Report
Implication of Subsequent Leaders in the Gigantic Jet
by Wen-Qian Chang, Yan-Mou Lai, Cheng-Ling Kuo, Janusz Mlynarczyk and Zhong-Yi Lin
Atmosphere 2024, 15(7), 781; https://doi.org/10.3390/atmos15070781 - 29 Jun 2024
Viewed by 1620
Abstract
Most lightning occurs below or inside the cloud. Unlike conventional lightning, blue jets and gigantic jets (GJs) produce upward discharges, since the electric discharge occurs in the form of a cloud-to-air leader. We analyzed a gigantic jet recorded during the 2022 Taiwan campaign. In the color photograph recorded during the observation, high spatial resolution (150 m) at a close distance (140 km) resolves the important spatial features of the GJ. First, the GJ propagated upward as a fully developed jet, reaching a maximum height of ~80 km from a cloud top of ~17 km. After the fully developed stage, a subsequent leader reached a top height of ~30 km with a width of 0.5–1.0 km. The subsequent leader attempted, but failed, to develop from a leader into a fully developed jet; it may be interpreted as a negative stepped leader associated with cloud re-brightening, analogous to a subsequent stroke in multi-stroke lightning. In addition, the relatively high intracloud (IC) flash rates associated with rising cloud tops favor the meteorological conditions required for developing gigantic jets.
(This article belongs to the Special Issue Recent Advances in Lightning Research)
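The quoted 150 m spatial resolution at a 140 km range follows from small-angle geometry; a minimal sketch of that arithmetic (the per-pixel angular scale is inferred here from those two figures and is not a value stated by the authors):

```python
import math

# Small-angle estimate: a spatial resolution s at range R implies
# an angular pixel scale theta ~ s / R (radians).
range_m = 140e3        # distance to the gigantic jet (from the abstract)
resolution_m = 150.0   # stated spatial resolution (from the abstract)

theta_rad = resolution_m / range_m
theta_deg = math.degrees(theta_rad)

# About 1.07 mrad, i.e. a few arcminutes of angular resolution.
print(f"angular scale = {theta_rad * 1e3:.2f} mrad = {theta_deg * 60:.1f} arcmin")
```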

21 pages, 1647 KB  
Article
Artificial Intelligence Approach for Classifying Images of Upper-Atmospheric Transient Luminous Events
by Axi Aguilera and Vidya Manian
Sensors 2024, 24(10), 3208; https://doi.org/10.3390/s24103208 - 18 May 2024
Cited by 1 | Viewed by 2103
Abstract
Transient Luminous Events (TLEs) are short-lived, upper-atmospheric optical phenomena associated with thunderstorms. Their rapid and random occurrence makes manual classification laborious and time-consuming. This study presents an effective approach to automating the classification of TLEs using state-of-the-art Convolutional Neural Networks (CNNs) and a Vision Transformer (ViT). The ViT architecture and four different CNN architectures, namely, ResNet50, ResNet18, GoogLeNet, and SqueezeNet, are employed and their performance is evaluated based on their accuracy and execution time. The models are trained on a dataset that was augmented using rotation, translation, and flipping techniques to increase its size and diversity. Additionally, the images are preprocessed using bilateral filtering to enhance their quality. The results show high classification accuracy across all models, with ResNet50 achieving the highest accuracy. However, a trade-off is observed between accuracy and execution time, which should be considered based on the specific requirements of the task. This study demonstrates the feasibility and effectiveness of using transfer learning and pre-trained CNNs for the automated classification of TLEs.
(This article belongs to the Special Issue Applications of Video Processing and Computer Vision Sensor II)
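The rotation, translation, and flipping augmentations named in the abstract can be sketched with plain NumPy; this is a hypothetical minimal illustration of those three operations on an image array, not the authors' actual preprocessing pipeline (the function name, shift sizes, and wrap-around translation are all assumptions):

```python
import numpy as np

def augment(img: np.ndarray) -> list[np.ndarray]:
    """Return augmented variants of an image array via the three
    operations named in the abstract: rotation, flipping, translation."""
    variants = [np.rot90(img, k) for k in (1, 2, 3)]   # 90/180/270-degree rotations
    variants += [np.flipud(img), np.fliplr(img)]       # vertical / horizontal flips
    variants += [np.roll(img, shift=s, axis=1)         # crude wrap-around translations
                 for s in (-2, 2)]
    return variants

frame = np.arange(16, dtype=float).reshape(4, 4)  # stand-in for a TLE image
print(len(augment(frame)))  # 7 variants per input frame
```

In a real pipeline the translations would pad rather than wrap, and each variant would be fed to the CNN alongside the original to enlarge the training set.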
