Article

Design Patterns for Resource-Constrained Automated Deep-Learning Methods

1. Institute of Applied Information Technology, School of Engineering, Zurich University of Applied Sciences ZHAW, 8400 Winterthur, Switzerland
2. Faculty of Informatics, Università della Svizzera italiana USI, 6900 Lugano, Switzerland
3. Institute of Neural Information Processing, Computer Science Department, Faculty of Engineering, Computer Science and Psychology, Ulm University, 89081 Ulm, Germany
4. Machine Learning and Optimization Laboratory, École Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland
5. ECLT—European Centre for Living Technology, 30123 Venice, Italy
* Author to whom correspondence should be addressed.
The authors contributed equally to this work.
AI 2020, 1(4), 510-538; https://doi.org/10.3390/ai1040031
Received: 18 September 2020 / Revised: 28 October 2020 / Accepted: 30 October 2020 / Published: 6 November 2020
(This article belongs to the Section AI Systems: Theory and Applications)
Abstract
We present an extensive evaluation of a wide variety of promising design patterns for automated deep-learning (AutoDL) methods, organized according to the problem categories of the 2019 AutoDL challenges, which set the task of optimizing both model accuracy and search efficiency under tight time and computing constraints. Because strong theoretical support is lacking, we propose structured empirical evaluations as the most promising avenue to obtain design principles for deep-learning systems. From these evaluations, we distill relevant patterns that give rise to neural network design recommendations. In particular, we establish (a) that very wide fully connected layers learn meaningful features faster; we illustrate (b) how the lack of pretraining in audio processing can be compensated for by architecture search; we show (c) that in text processing, deep-learning-based methods only pull ahead of traditional methods for short texts of fewer than a thousand characters under tight resource limitations; and lastly, we present (d) evidence that in very data- and computing-constrained settings, hyperparameter tuning of more traditional machine-learning methods outperforms deep-learning systems.
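To make finding (a) concrete, the following is a minimal sketch of an image-classification baseline whose first hidden layer is a very wide fully connected layer. This is not the authors' released code; the 8192-unit width, the input size, and the class count are illustrative assumptions.

import torch
import torch.nn as nn

class WideFCBaseline(nn.Module):
    """Baseline classifier whose first hidden layer is a very wide fully connected layer."""

    def __init__(self, in_features: int, num_classes: int, width: int = 8192):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                   # flatten e.g. an image or spectrogram into a vector
            nn.Linear(in_features, width),  # very wide fully connected layer
            nn.ReLU(),
            nn.Linear(width, num_classes),  # linear classification head
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Illustrative usage: 32x32 RGB inputs and 10 target classes.
model = WideFCBaseline(in_features=3 * 32 * 32, num_classes=10)
logits = model(torch.randn(8, 3, 32, 32))   # batch of 8 dummy images -> (8, 10) logits

The width used here is only a placeholder for experimentation; the paper's claim is that, under the challenge's tight time budget, such very wide layers learned meaningful features faster than narrower alternatives.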
Keywords: automated machine learning; architecture design; computer vision; audio processing; natural language processing; weakly supervised learning

Graphical abstract

MDPI and ACS Style

Tuggener, L.; Amirian, M.; Benites, F.; von Däniken, P.; Gupta, P.; Schilling, F.-P.; Stadelmann, T. Design Patterns for Resource-Constrained Automated Deep-Learning Methods. AI 2020, 1, 510-538. https://doi.org/10.3390/ai1040031
