Search Results (3)

Search Parameters:
Keywords = Fair DARTS

28 pages, 24617 KB  
Article
Noise-Disruption-Inspired Neural Architecture Search with Spatial–Spectral Attention for Hyperspectral Image Classification
by Aili Wang, Kang Zhang, Haibin Wu, Shiyu Dai, Yuji Iwahori and Xiaoyu Yu
Remote Sens. 2024, 16(17), 3123; https://doi.org/10.3390/rs16173123 - 24 Aug 2024
Cited by 6 | Viewed by 2152
Abstract
Given the complexity and diversity of hyperspectral images (HSIs), classification has long been a major challenge in remote sensing image processing. Hyperspectral classification (HSIC) methods based on neural architecture search (NAS) are an attractive frontier: they automatically search for the neural network architecture best suited to the characteristics of HSI data and avoid the limitations of manually designed networks when dealing with new classification tasks. However, existing NAS-based HSIC methods have two limitations: (1) the search space lacks efficient convolution operators that can fully extract discriminative spatial–spectral features, and (2) NAS based on traditional differentiable architecture search (DARTS) suffers performance collapse caused by unfair competition among candidate operations. To overcome these limitations, we propose a neural architecture search method with receptive field spatial–spectral attention (RFSS-NAS), designed to automatically search the optimal architecture for HSIC. Considering the model's core need to extract more discriminative spatial–spectral features, we design a novel and efficient attention search space whose core component is the receptive field spatial–spectral attention convolution operator, which focuses precisely on the critical information in the image and thus greatly enhances the quality of feature extraction. To solve the unfair competition issue in the traditional DARTS strategy, we introduce the Noisy-DARTS strategy, which ensures the fairness and efficiency of the search process and effectively avoids the risk of performance collapse. In addition, to further improve the model's robustness and its ability to recognize difficult-to-classify samples, we propose a fusion loss function that combines the advantages of the label smoothing loss and the polynomial expansion perspective loss: it smooths the label distribution, reducing the risk of overfitting, and effectively handles difficult-to-classify samples, improving overall classification accuracy. Experiments on three public datasets fully validate the superior performance of RFSS-NAS.
(This article belongs to the Special Issue Recent Advances in the Processing of Hyperspectral Images)
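The abstract describes the fusion loss only at a high level. As a hedged illustration, the sketch below combines label-smoothed cross-entropy with a Poly-1-style correction term in the spirit of the polynomial expansion perspective on loss functions; the paper's exact formulation and weighting are not given in the abstract, so equal weighting and the hyperparameter names `smoothing` and `epsilon` are assumptions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fusion_loss(logits, target, num_classes, smoothing=0.1, epsilon=1.0):
    """Label-smoothed cross-entropy plus a Poly-1-style term (a sketch,
    not the paper's exact loss)."""
    p = softmax(logits)
    # Smoothed target: 1 - smoothing on the true class,
    # smoothing / (K - 1) spread over the remaining classes.
    q = np.full(num_classes, smoothing / (num_classes - 1))
    q[target] = 1.0 - smoothing
    ce = -np.sum(q * np.log(p))
    # Poly-1 term: epsilon * (1 - p_t) penalizes low-confidence
    # (difficult-to-classify) samples more strongly.
    poly1 = epsilon * (1.0 - p[target])
    return ce + poly1
```

A confidently classified sample (a large logit on the true class) yields a smaller loss than a uniform-logit one, which is the behavior the abstract attributes to the fusion of the two terms.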

19 pages, 1760 KB  
Review
Gradient-Based Neural Architecture Search: A Comprehensive Evaluation
by Sarwat Ali and M. Arif Wani
Mach. Learn. Knowl. Extr. 2023, 5(3), 1176-1194; https://doi.org/10.3390/make5030060 - 14 Sep 2023
Cited by 11 | Viewed by 4893
Abstract
One of the challenges in deep learning is discovering the optimal architecture for a specific task, a problem effectively tackled through Neural Architecture Search (NAS). NAS encompasses three prominent approaches—reinforcement learning, evolutionary algorithms, and gradient descent—each of which has demonstrated noteworthy potential in identifying good candidate architectures. However, approaches based on reinforcement learning and evolutionary algorithms often demand extensive computational resources, requiring hundreds of GPU days or more. We therefore confine this work to the gradient-based approach, owing to its lower computational cost. Our objective is to identify the best gradient-based NAS method and to pinpoint opportunities for future enhancement. To this end, we provide a comprehensive evaluation of four major gradient-descent-based architecture search methods for image classification tasks: DARTS, PDARTS, Fair DARTS and Att-DARTS. We present an overview of these methods, followed by a theoretical comparison of their search spaces, continuous relaxation strategies and bi-level optimization for deriving the best neural architecture, and we list the strengths and weaknesses of each method. Experimental results comparing the error rate and computational cost of these gradient-based methods on the benchmark datasets CIFAR-10, CIFAR-100 and ImageNet are then analyzed. The results show that PDARTS is more accurate and faster than the other examined methods, making it a potent candidate for automating Neural Architecture Search. By conducting this comparative analysis, our research provides valuable insights and future research directions that address criticisms and gaps in the literature.
(This article belongs to the Special Issue Deep Learning and Applications)
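All four methods surveyed above build on DARTS's continuous relaxation: each edge of the cell computes a softmax-weighted mixture of candidate operations, and the architecture weights are learned by gradient descent alongside the network weights. A minimal NumPy sketch of that mixed operation (the operation set and scalar input here are illustrative, not the paper's):

```python
import numpy as np

def mixed_op(x, alphas, ops):
    """DARTS-style mixed operation for one edge.

    alphas: architecture parameters, one per candidate op.
    ops:    candidate operations (callables on x).
    """
    # Softmax over alphas gives each candidate a continuous weight,
    # making the architecture choice differentiable.
    w = np.exp(alphas - alphas.max())
    w = w / w.sum()
    # Edge output is the weighted sum of all candidate outputs.
    # After search, the discrete architecture keeps only argmax(alphas).
    return sum(wi * op(x) for wi, op in zip(w, ops))
```

Fair DARTS replaces this softmax with a per-operation sigmoid so candidates no longer compete for a shared probability mass, which is the "unfair competition" issue the compared methods address in different ways.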

9 pages, 1450 KB  
Article
Could Prosthesis Use Provide a Competitive Advantage in Darts?
by Jack DeVillez, Jenna Sabato and Goeran Fiedler
Prosthesis 2022, 4(2), 244-252; https://doi.org/10.3390/prosthesis4020024 - 20 May 2022
Cited by 6 | Viewed by 4214
Abstract
Competitive darts has become increasingly popular over the past few decades, and efforts have been made to have the game recognized as an Olympic sport in the future. The raised profile of the sport and the associated rewards bring new challenges for the integrity of the game, as athletes are incentivized to exploit rule ambiguities to gain competitive advantages. In this research, it was hypothesized that uneven leg lengths and weights, which are comparatively easy to realize in prosthetic limbs, allow players to lean closer to the target and thus improve their throwing accuracy. This hypothesis was tested in a sample of 13 able-bodied subjects who completed three sets of throwing trials: one to establish a baseline, and two with a longer and a heavier trailing leg, respectively. The findings suggest that these modifications are indeed beneficial, resulting in significantly shorter throwing distances and average accuracy improvements of up to 11%. The debate about the potential competitive advantages of prosthesis-wearing Paralympic athletes over their able-bodied peers has previously focused on short track running events, where rules now govern the allowable geometry and configuration of sprint prostheses. It appears that comparable regulations should be considered for darts competitions to ensure fair conditions for all participants.
(This article belongs to the Section Orthopedics and Rehabilitation)