Articles tagged ‘Chaotic Dynamics (nlin.CD)’

Frontiers of chaotic advection

By • 13 Mar, 2014 • Category: Laws

We review the present position of, and survey future perspectives in, the physics of chaotic advection: the field that emerged three decades ago at the intersection of fluid mechanics and nonlinear dynamics. It encompasses a range of applications at length scales from micrometers to hundreds of kilometers, including systems as diverse as mixing and thermal processing of viscous fluids, microfluidics, biological flows, and large-scale dispersion of pollutants in oceanographic and atmospheric flows.
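The mechanism behind chaotic advection can be illustrated with a toy model not taken from the review itself: the alternating sine shear flow on the unit torus, a standard test bed in this literature. Each half-period is an exactly integrable shear, so the tracer map is explicit, and a small blob of passive tracers is rapidly stretched across the domain. The parameter values and grid diagnostic below are illustrative assumptions, not the review's.

```python
import math

def sine_flow_period(x, y, A=1.0):
    """One period of the alternating sine shear flow on the unit torus.

    Each half-period is an exactly integrable shear, so the tracer map
    is explicit: a horizontal shear driven by y, then a vertical shear
    driven by the updated x.
    """
    x = (x + A * math.sin(2.0 * math.pi * y)) % 1.0
    y = (y + A * math.sin(2.0 * math.pi * x)) % 1.0
    return x, y

def occupied_cells(points, grid=8):
    """Count occupied cells of a grid x grid partition of the torus."""
    return len({(int(px * grid), int(py * grid)) for px, py in points})

# A small blob of tracers spreads over the domain under chaotic advection.
blob = [(0.4 + 0.01 * i, 0.5 + 0.01 * j) for i in range(5) for j in range(5)]
before = occupied_cells(blob)
for _ in range(10):
    blob = [sine_flow_period(x, y) for x, y in blob]
after = occupied_cells(blob)
```

Even though each tracer obeys a trivially simple velocity field, the blob that initially occupies a single coarse-grained cell ends up scattered over much of the 8 x 8 grid, which is the essence of mixing by chaotic advection.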

Symbolic Toolkit for Chaos Explorations

By • 27 Oct, 2013 • Category: Science and technology

A new computational technique based on a symbolic description utilizing kneading invariants is used to explore parametric chaos in two exemplary systems with the Lorenz attractor: a normal model from mathematics and a laser model from nonlinear optics. The technique uncovers the stunning complexity and universality of the patterns discovered in bi-parametric scans of the given models and detects their organizing centers: codimension-two T-points and separating saddles.
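The kneading-invariant machinery of the paper is more involved, but the symbolic description it builds on can be sketched minimally: integrate the Lorenz flow and, at each turn around either wing (a local maximum of z), record which side of x = 0 the trajectory is on. The integration scheme, step size, and initial condition below are illustrative assumptions.

```python
def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz system at the classic parameters."""
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(s, dt):
    """One fourth-order Runge-Kutta step of the Lorenz flow."""
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k1)))
    k3 = lorenz_rhs(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k2)))
    k4 = lorenz_rhs(tuple(si + dt * ki for si, ki in zip(s, k3)))
    return tuple(si + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

# One symbol per loop around a wing: at each local maximum of z(t),
# note the side of x = 0 the trajectory is on (0: left, 1: right).
dt, s = 0.005, (1.0, 1.0, 1.0)
xs, zs = [], []
for _ in range(20000):          # integrate up to t = 100
    s = rk4_step(s, dt)
    xs.append(s[0])
    zs.append(s[2])
symbols = [int(xs[i] > 0.0)
           for i in range(1, len(zs) - 1)
           if zs[i - 1] < zs[i] > zs[i + 1]]
```

The irregular alternation of 0s and 1s in `symbols` is the raw material from which kneading invariants, and the bi-parametric scans described above, are constructed.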

Justifying Typicality Measures of Boltzmannian Statistical Mechanics and Dynamical Systems

By • 14 Oct, 2013 • Category: Laws

A popular view in contemporary Boltzmannian statistical mechanics is to interpret the measures as typicality measures. In measure-theoretic dynamical systems theory, measures can similarly be interpreted as typicality measures. However, a justification of why these measures are a good choice of typicality measures is missing, and this paper attempts to fill that gap. The paper first argues that Pitowsky’s (2012) justification of typicality measures does not fit the bill. Then a first proposal for justifying typicality measures is presented. The main premises are that typicality measures are invariant and are related to the initial probability distributions of interest (which are translation-continuous or translation-close). The conclusions are two theorems showing that the standard measures of statistical mechanics and dynamical systems are typicality measures. There may be other typicality measures, but they agree on judgements of typicality. Finally, it is proven that if systems are ergodic or epsilon-ergodic, there are uniqueness results about typicality measures.
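A standard illustration, not from the paper, of what an invariant typicality measure underwrites: by Birkhoff's ergodic theorem, typical initial conditions all produce the same time averages, equal to the space average under the invariant measure. For the logistic map at r = 4, whose invariant density is 1/(pi * sqrt(x(1-x))), the space average of x is 1/2 by symmetry; the initial conditions below are arbitrary choices.

```python
def logistic(x):
    """The logistic map at r = 4."""
    return 4.0 * x * (1.0 - x)

def time_average(x0, n=100000):
    """Birkhoff time average of f(x) = x along the orbit of x0."""
    x, total = x0, 0.0
    for _ in range(n):
        x = logistic(x)
        total += x
    return total / n

# The invariant measure has density 1 / (pi * sqrt(x(1-x))), so the
# space average of x is 1/2; typical orbits reproduce it as a time average.
averages = [time_average(x0) for x0 in (0.1234, 0.3456, 0.7891)]
```

That measure-zero exceptional sets of initial conditions can be disregarded is exactly the judgement a typicality measure is meant to license.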

Are Deterministic Descriptions And Indeterministic Descriptions Observationally Equivalent?

By • 14 Oct, 2013 • Category: Science and technology

The central question of this paper is: are deterministic and indeterministic descriptions observationally equivalent in the sense that they give the same predictions? I tackle this question for measure-theoretic deterministic systems and stochastic processes, both of which are ubiquitous in science. I first show that for many measure-theoretic deterministic systems there is a stochastic process which is observationally equivalent to the deterministic system. Conversely, I show that for all stochastic processes there is a measure-theoretic deterministic system which is observationally equivalent to the stochastic process. Still, one might guess that the measure-theoretic deterministic systems which are observationally equivalent to stochastic processes used in science do not include any deterministic systems used in science. I argue that this is not so, because even deterministic systems used in science give rise to Bernoulli processes. Despite this, one might guess that measure-theoretic deterministic systems used in science cannot give the same predictions at every observation level as stochastic processes used in science. By proving results in ergodic theory, I show that this guess, too, is misguided: there are several deterministic systems used in science which give the same predictions at every observation level as Markov processes. All these results show that measure-theoretic deterministic systems and stochastic processes are observationally equivalent more often than one might expect. Furthermore, I criticise the claims made in previous philosophy papers by Suppes (1993, 1999), Suppes and de Barros (1996), and Winnie (1998) on observational equivalence.
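The claim that a deterministic system can "give rise to a Bernoulli process" is concrete and checkable, in a hedged sketch not drawn from the paper: coarse-grain the logistic map at r = 4 with the two-cell partition at x = 1/2. Via its conjugacy with the tent map, the resulting symbol process is Bernoulli(1/2, 1/2), so observing only the symbols, the deterministic orbit looks exactly like fair coin flips.

```python
def logistic_symbols(x0, n):
    """Binary itinerary of x -> 4x(1-x) for the partition {[0,1/2), [1/2,1]}.

    The resulting symbol process is Bernoulli(1/2, 1/2), although every
    symbol is produced by a deterministic map.
    """
    x, out = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        out.append(int(x >= 0.5))
    return out

n = 100000
s = logistic_symbols(0.3141592, n)
# Empirical frequencies of the blocks '1' and '11' match a fair coin.
freq1 = sum(s) / n
freq11 = sum(1 for i in range(n - 1) if s[i] == 1 and s[i + 1] == 1) / (n - 1)
```

The observed block frequencies are statistically indistinguishable from those of an i.i.d. fair coin, which is the observational equivalence the paper is about.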

On the Observational Equivalence of Continuous-Time Deterministic and Indeterministic Descriptions

By • 10 Oct, 2013 • Category: Philosophy

This paper presents and philosophically assesses three types of results on the observational equivalence of continuous-time measure-theoretic deterministic and indeterministic descriptions. The first results establish observational equivalence to abstract mathematical descriptions. The second results are stronger because they show observational equivalence between deterministic and indeterministic descriptions found in science; here I also discuss Kolmogorov’s contribution. For the third results I introduce two new meanings of ‘observational equivalence at every observation level’, and then show the even stronger result of observational equivalence at every (and not just some) observation level between deterministic and indeterministic descriptions found in science. These results imply the following: suppose one wants to find out whether a phenomenon is best modeled as deterministic or indeterministic. Then one cannot appeal to differences in the probability distributions of deterministic and indeterministic descriptions found in science to argue that one of the descriptions is preferable, because there is no such difference. Finally, I criticise the extant claims of philosophers and mathematicians on observational equivalence.

Justifying Definitions in Mathematics—Going Beyond Lakatos

By • 9 Oct, 2013 • Category: Education

This paper addresses the actual practice of justifying definitions in mathematics. First, I introduce the main account of this issue, namely Lakatos’s proof-generated definitions. Based on a case study of definitions of randomness in ergodic theory, I identify three other common ways of justifying definitions: natural-world-justification, condition-justification and redundancy-justification. Also, I clarify the interrelationships between the different kinds of justification. Finally, I point out how Lakatos’s ideas are limited: they fail to show that various kinds of justification can be found and can be reasonable, and they fail to acknowledge the interplay between the different kinds of justification.

Production and Transfer of Energy and Information in Hamiltonian Systems

By • 9 Oct, 2013 • Category: Environment

We present novel results that relate energy and information transfer to sensitivity to initial conditions in chaotic multidimensional Hamiltonian systems. We show the relation among the Kolmogorov-Sinai entropy, the Lyapunov exponents, and upper bounds for the Mutual Information Rate calculated in the Hamiltonian phase space and on bi-dimensional subspaces. Our main result is that the net amount of transfer from kinetic to potential energy per unit of time is a power law of the upper bound for the Mutual Information Rate between kinetic and potential energies, and also a power law of the Kolmogorov-Sinai entropy. Therefore, transfer of energy is related to both transfer and production of information. Moreover, the power-law nature of this relation means that a small increment of energy transferred leads to a much larger increase in the information exchanged. Finally, a relation between our results and important thermodynamic quantities is presented.
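The paper works with multidimensional Hamiltonian flows; as a minimal stand-in, not the authors' setup, one can estimate the quantities involved on the area-preserving kicked rotor (the Chirikov standard map), a Hamiltonian kick system. For a closed two-dimensional area-preserving map, Pesin's identity equates the Kolmogorov-Sinai entropy with the positive Lyapunov exponent, computed here by Benettin's tangent-vector method; the kick strength and initial condition are illustrative choices.

```python
import math

def standard_map(x, p, K):
    """Chirikov standard map, an area-preserving (Hamiltonian) kick system."""
    p = (p + K * math.sin(x)) % (2.0 * math.pi)
    x = (x + p) % (2.0 * math.pi)
    return x, p

def largest_lyapunov(K, n=20000):
    """Benettin's method: push a tangent vector through the Jacobian,
    renormalizing each step and averaging the log growth."""
    x, p = 1.0, 0.7
    dx, dp = 1.0, 0.0
    total = 0.0
    for _ in range(n):
        # Jacobian of the map at (x, p) applied to (dx, dp)
        c = K * math.cos(x)
        dp_new = dp + c * dx
        dx_new = dx + dp_new
        x, p = standard_map(x, p, K)
        norm = math.hypot(dx_new, dp_new)
        total += math.log(norm)
        dx, dp = dx_new / norm, dp_new / norm
    return total / n

lam = largest_lyapunov(K=6.0)
# By Pesin's identity, h_KS = lam for this map; for large K the
# theoretical estimate is lam ~ ln(K/2) ~ 1.10 at K = 6.
```

The same exponent controls how fast nearby phase-space points separate, which is why sensitivity to initial conditions, entropy production, and information transfer can be related at all.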

Novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis

By • 9 Oct, 2013 • Category: Criticism

We propose novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We review the background of the Kolmogorov complexity and discuss its physical meaning as well as other notions of complexity. To gain better insight into the complexity of complex systems and time series, we introduce three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the highest value of the Kolmogorov complexity spectrum, and (iii) the overall Kolmogorov complexity. The characteristics of these measures are tested using a generalized logistic equation. Finally, the proposed measures are applied to different time series originating from a model output (biochemical substance exchange in a multi-cell system), four geophysical phenomena (the dynamics of river flow, long-term precipitation, indoor 222Rn concentration, and UV radiation dose), and economics (stock price dynamics). The results offer deeper insight into the complexity of system dynamics when the proposed complexity measures are applied.
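Since Kolmogorov complexity is uncomputable, measures of this kind are in practice estimated with the Lempel-Ziv (1976) parsing. The sketch below is a hedged reconstruction of measures (i) and (ii) under that assumption: binarize the series at a sweep of thresholds, compute the normalized LZ complexity at each threshold to get a spectrum, and take its maximum as the "highest value" measure. The normalization and threshold sweep are illustrative choices, not necessarily the paper's exact recipe.

```python
import math

def lz76_phrases(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of string s."""
    n, i, c = len(s), 0, 0
    while i < n:
        l = 1
        # extend the phrase while it can be copied from the earlier text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def kc(binary):
    """Normalized complexity estimate, c(n) * log2(n) / n (~1 for random)."""
    n = len(binary)
    return lz76_phrases(binary) * math.log2(n) / n

def kc_spectrum(series, num_thresholds=9):
    """Complexity spectrum: normalized LZ complexity of the series
    binarized at a sweep of amplitude thresholds."""
    lo, hi = min(series), max(series)
    spectrum = []
    for k in range(1, num_thresholds + 1):
        t = lo + (hi - lo) * k / (num_thresholds + 1)
        spectrum.append(kc("".join("1" if v > t else "0" for v in series)))
    return spectrum

# A chaotic logistic series is near-maximally complex; a periodic one is not.
x, chaotic = 0.271828, []
for _ in range(2000):
    x = 4.0 * x * (1.0 - x)
    chaotic.append(x)
spec = kc_spectrum(chaotic)
kc_max = max(spec)           # measure (ii): the spectrum's highest value
kc_periodic = kc("01" * 1000)
```

Sweeping the threshold rather than fixing it at the mean is what turns a single complexity number into a spectrum sensitive to different amplitude ranges of the signal.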

What Are the New Implications of Chaos for Unpredictability?

By • 9 Oct, 2013 • Category: Science and technology

From the beginning of chaos research until today, the unpredictability of chaos has been a central theme. It is widely believed and claimed by philosophers, mathematicians and physicists alike that chaos has a new implication for unpredictability, meaning that chaotic systems are unpredictable in a way that other deterministic systems are not. Hence one might expect that the question ‘What are the new implications of chaos for unpredictability?’ has already been answered in a satisfactory way. However, this is not the case. I will critically evaluate the existing answers and argue that they do not fit the bill. Then I will approach the question by showing that chaos can be defined via mixing, a characterization that has not been explicitly argued for before. Based on this insight, I will propose that the sought-after new implication of chaos for unpredictability is the following: for predicting any event, all sufficiently past events are approximately probabilistically irrelevant.
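Mixing implies decay of correlations: observations far enough in the past become approximately irrelevant to prediction. A hedged numerical illustration, not from the paper: for the logistic map at r = 4, track the indicator of x exceeding a threshold (0.7 here, chosen so that correlations are not trivially zero) and compare the covariance at lag 1 with the covariance at lag 10.

```python
def symbol_series(x0, n, threshold=0.7):
    """Indicator of x > threshold along a logistic-map (r = 4) orbit."""
    x, out = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        out.append(1 if x > threshold else 0)
    return out

def lag_covariance(s, tau):
    """Cov(s_0, s_tau) estimated from one long orbit."""
    n = len(s) - tau
    mean = sum(s) / len(s)
    return sum(s[i] * s[i + tau] for i in range(n)) / n - mean * mean

s = symbol_series(0.37, 200000)
c1, c10 = lag_covariance(s, 1), lag_covariance(s, 10)
# The lag-1 covariance is clearly nonzero; by lag 10 it is
# statistically indistinguishable from zero.
```

Knowing the state one step ago genuinely changes the odds of the next observation, while knowing the state ten steps ago changes almost nothing, which is exactly the "sufficiently past events are approximately probabilistically irrelevant" implication proposed above.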

Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage

By • 1 Oct, 2013 • Category: Philosophy

The hallmark of deterministic chaos is that it creates information, the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information (the ephemeral information) is forgotten, and a portion (the bound information) is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
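The ephemeral/bound decomposition itself conditions on semi-infinite pasts and futures, but the symbolic-dynamics estimation it rests on can be sketched simply, as an illustration not taken from the paper: estimate the total information creation rate h (the metric entropy) from block-entropy differences of a symbol sequence. For the logistic map at r = 4 with the generating partition at x = 1/2, h = ln 2; since that symbol process is memoryless, essentially all of the created information is ephemeral here, so richer examples are needed to see a nonzero bound component.

```python
import math
from collections import Counter

def block_entropy(symbols, k):
    """Shannon entropy (in nats) of length-k blocks along the sequence."""
    counts = Counter(tuple(symbols[i:i + k])
                     for i in range(len(symbols) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

# Symbolic dynamics of the logistic map at r = 4 with the generating
# partition at x = 1/2; the block-entropy difference H(k) - H(k-1)
# estimates the Kolmogorov-Sinai entropy, which is ln 2 for this map.
x, symbols = 0.61803, []
for _ in range(200000):
    x = 4.0 * x * (1.0 - x)
    symbols.append(int(x >= 0.5))
h_est = block_entropy(symbols, 8) - block_entropy(symbols, 7)
```

The same block-statistics machinery, refined to condition on both past and future blocks, is what makes the ephemeral and bound components "directly and accurately calculable via symbolic dynamics."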