Special Issue "Information in Dynamical Systems and Complex Systems"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (28 February 2014)

Special Issue Editors

Guest Editor
Prof. Dr. Erik M Bollt
Department of Mathematics, Clarkson University, Potsdam, NY 13699-5815, USA
Website: http://www.clarkson.edu/~bolltem
E-Mail: bolltem@clarkson.edu
Phone: +1 315 268 2307
Fax: +1 315 268 2371
Interests: dynamical systems; chaos theory; control of chaos; time-series analysis; Frobenius-Perron operators; stochastic dynamical systems; measurable dynamics; symbol dynamics; connections to information theory; image processing; data assimilation; connections between models and observed data; complex and networked coupled systems

Guest Editor
Dr. Jie Sun
Department of Mathematics, Clarkson University, Potsdam, NY 13699-5815, USA
E-Mail: sunj@clarkson.edu
Phone: +1 315 268 2307
Fax: +1 315 268 2388
Interests: dynamical systems; complex networks; information theory; time series analysis

Special Issue Information

Dear Colleagues,

On July 18-19, 2013, the workshop "Information in Dynamical Systems and Complex Systems, Summer 2013 Workshop" was held in Burlington, VT, organized by Erik M. Bollt and Jie Sun (Clarkson University). Invited attendees were Erik Bollt, Ned Corron (U.S. Army), James Crutchfield (University of California, Davis), David Feldman (College of the Atlantic), Adom Giffin (Clarkson University), Kevin Knuth (University at Albany, SUNY), Ying-Cheng Lai (Arizona State University), John Mahoney (University of California, Merced), Konstantin Mischaikow (Rutgers University), Edward Ott (University of Maryland, College Park), Milan Palus (Academy of Sciences of the Czech Republic), Shawn Pethel (U.S. Army), Maurizio Porfiri (Polytechnic Institute of New York University), Samuel Stanton (U.S. Army), Jie Sun, and James Yorke (University of Maryland, College Park).

This Special Issue of Entropy offers a venue to collect the synergy, consensus, and collective thoughts on the topical themes of the workshop. The topics and themes of the workshop are listed below, and participants are invited to submit papers summarizing the collective discussions.

To that end, submissions should take one of the following forms:

  • Research articles related to a presentation given at the workshop.
  • Research articles related to the themes of the workshop stated below.
  • Commentaries on future directions of information and complexity in large-scale systems, as related to the themes below.
  • Commentaries regarding connections and contrasts among the directions below.
  • Commentaries on wisdom from experience and theory related to misuses of concepts and tools from this area (the so-called "stop the insanity" thoughts).
  • Discussions of other related themes connecting to the broader themes, such as connections between observer-based and intrinsic information and flow.

Workshop themes were stated as follows.  Given the modern focus of dynamical systems on coupled oscillators that form complex networks, it is important to move forward and explore these problems from the perspective of information content, flow and causality. The following general areas are central in this endeavor:

Information flow. Transfer entropy in particular has gained a great deal of interest in recent years: future states are conditioned on past states, both with and without access to another stochastic process, and the difference is measured through the Kullback-Leibler divergence. Recent work, however, suggests that transfer entropy can be misinterpreted when used for causality inference.
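The conditioning described above can be made concrete with a minimal sketch: a plug-in (histogram) estimate of transfer entropy for two scalar time series, using quantile binning and a history length of one. The function name, the binning scheme, and the history length are choices made for this illustration, not part of any standard API:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate (in bits) of the transfer entropy T_{Y->X} with
    history length 1: the expected KL divergence between
    p(x_{t+1} | x_t, y_t) and p(x_{t+1} | x_t)."""
    # Symbolize each series by quantile binning into `bins` symbols.
    edges = lambda s: np.quantile(s, np.linspace(0, 1, bins + 1)[1:-1])
    xs = np.digitize(x, edges(x))
    ys = np.digitize(y, edges(y))
    trip = list(zip(xs[1:], xs[:-1], ys[:-1]))      # (x_{t+1}, x_t, y_t)
    n = len(trip)
    p3  = Counter(trip)                             # counts of (x_{t+1}, x_t, y_t)
    pxy = Counter((b, c) for _, b, c in trip)       # counts of (x_t, y_t)
    pxx = Counter((a, b) for a, b, _ in trip)       # counts of (x_{t+1}, x_t)
    px  = Counter(b for _, b, _ in trip)            # counts of x_t
    # T_{Y->X} = sum p(x',x,y) * log2[ p(x',x,y) p(x) / (p(x,y) p(x',x)) ]
    te = 0.0
    for (a, b, c), k in p3.items():
        te += (k / n) * np.log2(k * px[b] / (pxy[(b, c)] * pxx[(a, b)]))
    return te
```

With `x` a lagged copy of `y`, the estimator reports roughly one bit of flow from `y` to `x` and nearly zero in the reverse direction, which is the asymmetry that motivates its use (and misuse) as a causality indicator.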
Causality, and information signatures of causation. A central question in science is what causes outcomes of interest; for affecting and controlling those outcomes, the question is even more important. From the perspective of information flow, causation and causal inference become particularly pointed.
Symmetries and reversibility may be exploited in special circumstances to enlighten understanding of causality, structure, and clustering.
Scales and hierarchies lead to understanding of relevance and of nested topological partitions when defining the scale at which information symbols are defined.
Variational and optimization principles of information in physics, in particular maximum entropy principles that lead to understanding of underlying laws.
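The maximum entropy principle can be illustrated with Jaynes's classic loaded-die example: among all distributions on the faces 1-6 with a prescribed mean, the entropy maximizer has Gibbs (exponential-family) form p_i ∝ exp(λi), and the single Lagrange multiplier λ can be found by bisection on the mean. The function name and tolerances below are illustrative assumptions:

```python
import numpy as np

def maxent_die(target_mean, lo=-10.0, hi=10.0, tol=1e-10):
    """Maximum-entropy distribution on die faces 1..6 with a fixed mean.
    The maximizer has the form p_i proportional to exp(lam * i); since the
    mean is monotone increasing in lam, bisect on lam to match the target."""
    faces = np.arange(1, 7)
    def mean(lam):
        w = np.exp(lam * faces)
        p = w / w.sum()
        return p @ faces
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    w = np.exp(0.5 * (lo + hi) * faces)
    return w / w.sum()
```

A target mean of 3.5 recovers the uniform (fair-die) distribution, while a biased mean such as 4.5 tilts the probabilities exponentially toward the high faces, exactly as the maximum entropy principle prescribes.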
Randomness, structure and causality. In some sense, randomness may be described as external and unmodelled effects, which we may interpret in this context as "unknown information." This leads to:
Hidden states and hidden processes, including such methods as hidden Markov models and, more generally, Bayesian inference methods. In the context of the information content of a dynamical system, such a perspective may yield better understanding.
Measures and metrics of complexity and information content. The term "complexity" is commonly used for a wide variety of systems, behaviors, and processes, and yet a commonly agreed definition of what the term means is lacking.
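One concrete candidate among the many competing measures, the Bandt-Pompe permutation entropy, illustrates how such a measure can be built: symbolize the series by the ordinal pattern of each length-m window, then take the normalized Shannon entropy of the pattern distribution. A minimal sketch (the function name and the normalization by log2(m!) are our choices):

```python
import math

def permutation_entropy(series, order=3):
    """Bandt-Pompe permutation entropy: Shannon entropy (bits) of the
    distribution of ordinal patterns of length `order`, normalized to [0, 1].
    0 means a single repeating pattern; 1 means all patterns equally likely."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # The ordinal pattern: argsort of the window, as a hashable tuple.
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    n = sum(counts.values())
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(math.factorial(order))
```

A monotone series yields 0 (one pattern dominates), while an i.i.d. random series yields a value near 1; structured chaotic signals typically fall in between, which is why ordinal measures of this kind are popular in the complexity literature.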
Physical laws as information filters or algorithms. Physical laws lead to evolution equations, which from the perspective of this discussion define the evolution from one information state to a new information state; physical laws may therefore be described as algorithms, or as information filters that translate states.
Some questions to consider:

  • Can we develop a general mechanistic description of what renders a real complex system different from a large but perhaps simpler system (particularly from an information theory perspective)?
  • Can physical laws be defined in an algorithmic and information-theoretic manner?
  • Can we identify engineering applications, especially those that benefit directly from information theory perspectives and methods?
  • How can this perspective impact design?
  • Can specific control methods be developed that benefit from it?
  • Are piecewise impulsive systems from mechanical and electronic engineering design particularly well suited?
  • Can group and cooperative behaviors, such as those of animals and humans, be better understood in terms of information-theoretic descriptions? What role do hierarchical structures play?
  • Can synchronization be understood as the counterpoint to complex behavior?
  • Can methods designed to identify causal influences be adapted to further adjust and define control strategies for complex systems in biological, social, physical, and engineering contexts?
  • Is there a minimal information description of a dynamical system that will facilitate engineering design? Does an approximate description of the formal language suffice for approximate modeling, leading to faster and easier design?

Discuss the validity of popular approaches that use information and entropy measures as a systems probe for change detection, damage detection, and systems health monitoring.

Prof. Dr. Erik M Bollt
Dr. Jie Sun
Guest Editors

Published Papers (6 papers)

Entropy 2014, 16(7), 3889-3902; doi:10.3390/e16073889
Received: 4 March 2014; in revised form: 23 June 2014 / Accepted: 7 July 2014 / Published: 15 July 2014

Entropy 2014, 16(6), 3416-3433; doi:10.3390/e16063416
Received: 28 April 2014; in revised form: 14 May 2014 / Accepted: 9 June 2014 / Published: 20 June 2014

Entropy 2014, 16(6), 3379-3400; doi:10.3390/e16063379
Received: 23 March 2014; in revised form: 29 May 2014 / Accepted: 6 June 2014 / Published: 19 June 2014

Entropy 2014, 16(5), 2839-2849; doi:10.3390/e16052839
Received: 18 February 2014; in revised form: 15 May 2014 / Accepted: 20 May 2014 / Published: 23 May 2014

Entropy 2014, 16(3), 1396-1413; doi:10.3390/e16031396
Received: 20 December 2013; in revised form: 24 February 2014 / Accepted: 5 March 2014 / Published: 10 March 2014

Entropy 2014, 16(3), 1315-1330; doi:10.3390/e16031315
Received: 30 December 2013; in revised form: 18 February 2014 / Accepted: 25 February 2014 / Published: 28 February 2014
Cited by 1

Last update: 11 December 2013

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland