Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage

By • Oct 1, 2013 • Section: Philosophy

Ryan G. James, Korana Burke, James P. Crutchfield

Abstract: The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information—the ephemeral information—is forgotten and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.

arXiv:1309.5504v1 [nlin.CD]
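The decomposition the abstract describes splits the entropy rate h_mu into an ephemeral part r_mu (created then forgotten) and a bound part b_mu (created and stored), with h_mu = r_mu + b_mu. As a minimal illustrative sketch — not the paper's method — one can estimate these rates empirically for a simple symbolic process. The golden-mean process (a Markov chain in which a 1 is never followed by a 1) is chosen here only because it is a standard example with nonzero bound information; the estimator uses plain conditional block entropies, which suffice for a process that is Markov of order 1 in both time directions.

```python
import random
from collections import Counter
from math import log2

def golden_mean(n, seed=1):
    # Sample the golden-mean process: after a 1 the next symbol must be 0;
    # after a 0 the next symbol is 0 or 1 with equal probability.
    rng = random.Random(seed)
    x, out = 0, []
    for _ in range(n):
        x = 0 if x == 1 else rng.randint(0, 1)
        out.append(x)
    return out

def cond_entropy(seq, past, future):
    # Empirical H[X_0 | context] in bits, where the context is `past`
    # symbols before X_0 and `future` symbols after it.
    joint = Counter()
    for i in range(past, len(seq) - future):
        ctx = tuple(seq[i - past:i]) + tuple(seq[i + 1:i + 1 + future])
        joint[(ctx, seq[i])] += 1
    total = sum(joint.values())
    ctx_counts = Counter()
    for (ctx, _), c in joint.items():
        ctx_counts[ctx] += c
    # -sum p(ctx, x0) * log2 p(x0 | ctx)
    return -sum((c / total) * log2(c / ctx_counts[ctx])
                for (ctx, _), c in joint.items())

seq = golden_mean(200_000)
h = cond_entropy(seq, past=1, future=0)  # entropy rate h_mu
r = cond_entropy(seq, past=1, future=1)  # ephemeral information r_mu
b = h - r                                # bound information b_mu
print(f"h_mu ~ {h:.3f}  r_mu ~ {r:.3f}  b_mu ~ {b:.3f} bits/symbol")
```

For this process the exact values are h_mu = 2/3 bit and b_mu ≈ 0.208 bit per symbol, so a nontrivial fraction of the information the process creates is actively stored rather than forgotten. The paper's own calculations proceed via symbolic dynamics rather than this naive counting estimator.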


