Articles tagged ‘Chaos and dynamics’

Justifying Typicality Measures of Boltzmannian Statistical Mechanics and Dynamical Systems

By • Oct 14, 2013 • Category: Laws

A popular view in contemporary Boltzmannian statistical mechanics is to interpret the measures as typicality measures. In measure-theoretic dynamical systems theory, measures can similarly be interpreted as typicality measures. However, a justification of why these measures are a good choice of typicality measures has been missing, and this paper attempts to fill that gap. The paper first argues that Pitowsky’s (2012) justification of typicality measures does not fit the bill. Then a first proposal for how to justify typicality measures is presented. The main premises are that typicality measures are invariant and are related to the initial probability distributions of interest (which are translation-continuous or translation-close). The conclusions are two theorems showing that the standard measures of statistical mechanics and dynamical systems are typicality measures. There may be other typicality measures, but they agree in their judgements of typicality. Finally, it is proven that if systems are ergodic or epsilon-ergodic, uniqueness results about typicality measures obtain.
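To fix ideas, the two main premises rest on standard notions from ergodic theory. A minimal formal sketch (textbook definitions, not quoted from the paper):

```latex
% A measure \mu on a dynamical system (X, T) is invariant under the
% dynamics T when it is preserved by pulling back along T:
\mu\bigl(T^{-1}A\bigr) = \mu(A) \qquad \text{for every measurable } A \subseteq X .

% A property P is typical (relative to the typicality measure \mu)
% when the set of initial conditions lacking P is negligible:
\mu\bigl(\{\, x \in X : x \text{ lacks } P \,\}\bigr) = 0 .
```

On this reading, saying that a behaviour is typical is a measure-zero claim about the exceptions, which is why the choice of measure needs justifying in the first place.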

Are Deterministic Descriptions And Indeterministic Descriptions Observationally Equivalent?

By • Oct 14, 2013 • Category: Science and Technology

The central question of this paper is: are deterministic and indeterministic descriptions observationally equivalent, in the sense that they yield the same predictions? I tackle this question for measure-theoretic deterministic systems and stochastic processes, both of which are ubiquitous in science. I first show that for many measure-theoretic deterministic systems there is a stochastic process which is observationally equivalent to the deterministic system. Conversely, I show that for all stochastic processes there is a measure-theoretic deterministic system which is observationally equivalent to the stochastic process. Still, one might guess that the measure-theoretic deterministic systems which are observationally equivalent to the stochastic processes used in science do not include any deterministic systems used in science. I argue that this is not so, because deterministic systems used in science even give rise to Bernoulli processes. Despite this, one might guess that measure-theoretic deterministic systems used in science cannot give the same predictions at every observation level as stochastic processes used in science. By proving results in ergodic theory, I show that this guess, too, is misguided: there are several deterministic systems used in science which give the same predictions at every observation level as Markov processes. All these results show that measure-theoretic deterministic systems and stochastic processes are observationally equivalent more often than one might perhaps expect. Furthermore, I criticise claims about observational equivalence made in the previous philosophy papers by Suppes (1993, 1999), Suppes and de Barros (1996), and Winnie (1998).
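As a toy illustration of the idea (my sketch, not the paper's construction): the fully chaotic logistic map is a deterministic system that is measure-theoretically isomorphic to a fair-coin Bernoulli process, so observing its orbit through a coarse two-cell partition produces a symbol sequence statistically indistinguishable from coin flips.

```python
# Sketch: a deterministic system, observed coarsely, mimics a
# stochastic process. The logistic map x -> 4x(1-x) on [0, 1] is
# isomorphic to a fair-coin Bernoulli process; recording only which
# half of the interval the orbit visits yields "coin flips".

def logistic(x):
    """One step of the fully chaotic logistic map."""
    return 4.0 * x * (1.0 - x)

def observe(x0, n):
    """Coarse-grained observation: 0 if x < 1/2, else 1."""
    symbols, x = [], x0
    for _ in range(n):
        symbols.append(0 if x < 0.5 else 1)
        x = logistic(x)
    return symbols

seq = observe(0.1234, 100_000)
freq_one = sum(seq) / len(seq)
# Under the map's invariant measure the two cells each have measure
# 1/2, so freq_one comes out close to 0.5, as for a fair coin.
```

The deterministic dynamics is fully specified by the initial condition, yet at this observation level the predictions coincide with those of a genuinely stochastic model.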

Justifying Definitions in Mathematics—Going Beyond Lakatos

By • Oct 9, 2013 • Category: Education

This paper addresses the actual practice of justifying definitions in mathematics. First, I introduce the main account of this issue, namely Lakatos’s proof-generated definitions. Based on a case study of definitions of randomness in ergodic theory, I identify three other common ways of justifying definitions: natural-world-justification, condition-justification and redundancy-justification. Also, I clarify the interrelationships between the different kinds of justification. Finally, I point out how Lakatos’s ideas are limited: they fail to show that various kinds of justification can be found and can be reasonable, and they fail to acknowledge the interplay between the different kinds of justification.

Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage

By • Oct 1, 2013 • Category: Philosophy

The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information—the ephemeral information—is forgotten and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
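The information-creation rate mentioned above can be estimated numerically from symbolic dynamics. A minimal sketch (my own illustration, not the authors' code): the entropy rate is approximated by differences of block entropies, h ≈ H(L) − H(L−1), which for the fully chaotic logistic map with the generating partition at x = 1/2 converge to the Kolmogorov-Sinai entropy ln 2.

```python
# Sketch: estimating the Kolmogorov-Sinai entropy rate of the
# logistic map x -> 4x(1-x) from its symbolic dynamics. The
# block-entropy difference H(L) - H(L-1) estimates the entropy rate,
# which for this map is ln 2 nats per symbol.
from collections import Counter
from math import log

def symbolic_orbit(x0, n):
    """Binary symbol sequence from the generating partition at 1/2."""
    seq, x = [], x0
    for _ in range(n):
        seq.append('0' if x < 0.5 else '1')
        x = 4.0 * x * (1.0 - x)
    return ''.join(seq)

def block_entropy(seq, L):
    """Shannon entropy (in nats) of the length-L word distribution."""
    counts = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum(c / total * log(c / total) for c in counts.values())

s = symbolic_orbit(0.3141, 200_000)
h = block_entropy(s, 6) - block_entropy(s, 5)  # entropy-rate estimate
# h lands close to ln 2 (about 0.693 nats per symbol).
```

Decomposing this created information into its ephemeral and bound components, as the paper proposes, requires finer information-theoretic quantities than this single-rate estimate.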