
Open Access Article

Statistical Mechanics of On-Line Learning Under Concept Drift

Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence, University of Groningen, Nijenborgh 9, 9747 AG Groningen, The Netherlands
Center of Excellence—Cognitive Interaction Technology (CITEC), Bielefeld University, Inspiration 1, 33619 Bielefeld, Germany
Author to whom correspondence should be addressed.
Entropy 2018, 20(10), 775.
Received: 4 September 2018 / Revised: 3 October 2018 / Accepted: 8 October 2018 / Published: 10 October 2018
(This article belongs to the Special Issue Statistical Mechanics of Neural Networks)


We introduce a modeling framework for the investigation of on-line machine learning processes in non-stationary environments. We exemplify the approach in terms of two specific model situations: In the first, we consider the learning of a classification scheme from clustered data by means of prototype-based Learning Vector Quantization (LVQ). In the second, we study the training of layered neural networks with sigmoidal activations for the purpose of regression. In both cases, the target, i.e., the classification or regression scheme, is considered to change continuously while the system is trained from a stream of labeled data. We extend and apply methods borrowed from statistical physics which have been used frequently for the exact description of training dynamics in stationary environments. Extensions of the approach allow for the computation of typical learning curves in the presence of concept drift in a variety of model situations. First results are presented and discussed for stochastic drift processes in classification and regression problems. They indicate that LVQ is capable of tracking a classification scheme under drift to a non-trivial extent. Furthermore, we show that concept drift can cause the persistence of sub-optimal plateau states in gradient-based training of layered neural networks for regression.
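The paper's results are derived analytically in the thermodynamic limit; the following is only a hypothetical toy simulation illustrating the basic scenario from the abstract: on-line LVQ1 trained from a stream of labeled examples whose cluster centers drift stochastically while learning proceeds. The learning rate `eta`, drift amplitude `drift`, and cluster geometry are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lvq1_step(prototypes, labels, x, y, eta):
    """One on-line LVQ1 update: move the nearest prototype toward
    example x if its label matches y, and away from x otherwise."""
    distances = np.linalg.norm(prototypes - x, axis=1)
    j = np.argmin(distances)                 # winner-takes-all
    sign = 1.0 if labels[j] == y else -1.0   # attract or repel
    prototypes[j] += eta * sign * (x - prototypes[j])

# Two drifting cluster centers define the "concept"; each step they
# perform a small random walk, so the target changes during training.
centers = np.array([[-1.0, 0.0], [1.0, 0.0]])
prototypes = rng.normal(scale=0.1, size=(2, 2))  # random initialization
labels = np.array([0, 1])                        # one prototype per class

eta, drift = 0.05, 0.005
for t in range(3000):
    centers += drift * rng.normal(size=centers.shape)  # concept drift
    y = int(rng.integers(2))                           # draw a class
    x = centers[y] + rng.normal(scale=0.3, size=2)     # noisy example
    lvq1_step(prototypes, labels, x, y, eta)

# Each prototype should track "its" drifting center to within a
# residual error set by the noise, learning rate, and drift strength.
err = np.linalg.norm(prototypes - centers, axis=1)
print(err)
```

The qualitative behavior matches the abstract's claim: the prototypes do not converge to fixed points but track the moving concept with a non-zero residual error, which grows with the drift amplitude and shrinks (up to a noise floor) with the learning rate.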
Keywords: concept drift; on-line learning; continual learning; neural networks; learning vector quantization; statistical physics of learning

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Straat, M.; Abadi, F.; Göpfert, C.; Hammer, B.; Biehl, M. Statistical Mechanics of On-Line Learning Under Concept Drift. Entropy 2018, 20, 775.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.