Article

Multi-Stage Meta-Learning for Few-Shot with Lie Group Network Constraint

School of Computer Science and Technology, Soochow University, Suzhou 215006, China
* Author to whom correspondence should be addressed.
Entropy 2020, 22(6), 625; https://doi.org/10.3390/e22060625
Received: 14 May 2020 / Accepted: 29 May 2020 / Published: 5 June 2020
Deep learning has achieved many successes in different fields but can encounter overfitting when labeled samples are insufficient. To address learning with limited training data, meta-learning has been proposed: it captures common knowledge by leveraging a large number of similar few-shot tasks and learns how to adapt a base-learner to a new task for which only a few labeled samples are available. Current meta-learning approaches typically use shallow neural networks (SNNs) to avoid overfitting, which wastes much of the available information when adapting to a new task. Moreover, the Euclidean-space gradient descent used in existing meta-learning approaches often leads to inaccurate updates of the meta-learner, which makes it harder for meta-learning models to extract features from samples and update network parameters. In this paper, we propose a novel meta-learning model called Multi-Stage Meta-Learning (MSML) to break through this bottleneck in the adaptation process. The proposed method constrains the network to the Stiefel manifold so that the meta-learner can perform more stable gradient descent within a limited number of steps, thereby accelerating adaptation. Experiments on mini-ImageNet demonstrate that the proposed method achieves better accuracy under 5-way 1-shot and 5-way 5-shot conditions.
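
The core idea the abstract describes, constraining network weights to the Stiefel manifold so that gradient updates remain well-behaved, can be illustrated with a short sketch. The NumPy code below is only a rough illustration of a Stiefel-constrained gradient step, not the authors' MSML implementation: it projects the Euclidean gradient onto the tangent space at the current weight matrix and retracts the result back onto the manifold with a QR decomposition. The function name stiefel_step, the learning rate, and the toy dimensions are assumptions made for this example.

```python
import numpy as np

def stiefel_step(W, G, lr=0.1):
    """Illustrative Riemannian gradient step on the Stiefel manifold.

    W : (n, p) matrix with orthonormal columns (W.T @ W = I).
    G : Euclidean gradient of the loss with respect to W.
    Projects G onto the tangent space at W, takes a step, and
    retracts the result back onto the manifold via QR decomposition.
    (A sketch under the assumptions stated above, not the paper's code.)
    """
    # Tangent-space projection: R = G - W * sym(W^T G), sym(A) = (A + A^T)/2.
    WtG = W.T @ G
    R = G - W @ ((WtG + WtG.T) / 2.0)

    # Gradient step in the tangent space, then QR retraction so the
    # updated matrix again has orthonormal columns.
    Q, Rq = np.linalg.qr(W - lr * R)
    # Fix column signs (diagonal of Rq positive) to make the retraction unique.
    Q = Q * np.sign(np.diag(Rq))
    return Q

# Toy usage: a random orthonormal 8x4 weight block and a random gradient.
rng = np.random.default_rng(0)
W0, _ = np.linalg.qr(rng.standard_normal((8, 4)))
grad = rng.standard_normal((8, 4))
W1 = stiefel_step(W0, grad)
print(np.allclose(W1.T @ W1, np.eye(4)))  # True: W1 is still on the Stiefel manifold
```

The QR retraction used here is just one common choice; retractions based on the polar decomposition or the Cayley transform would serve the same illustrative purpose.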
Keywords: meta-learning; Lie group; machine learning; deep learning; convolutional neural network

MDPI and ACS Style

Dong, F.; Liu, L.; Li, F. Multi-Stage Meta-Learning for Few-Shot with Lie Group Network Constraint. Entropy 2020, 22, 625. https://doi.org/10.3390/e22060625

