Articles tagged ‘information theory’

Information Content of Elementary Systems as a Physical Principle

By • 30 Mar, 2014 • Category: Environment

Quantum physics has remarkable characteristics, such as quantum correlations, uncertainty relations, and no-cloning, which open an interpretative and conceptual gap between the classical and the quantum world. To provide a more unified framework, generalized probabilistic theories were formulated. Recently, it turned out that such theories include so-called “postquantum” ones, which share many of the typical quantum characteristics but predict supraquantum effects, such as correlations stronger than the quantum ones. This reveals an even more dramatic gap between the classical/quantum and the post-quantum world. It is therefore imperative to search for information principles characterizing physical theories. In recent years, different principles have been proposed; however, all of the principles considered so far have been correlation principles. Here, we introduce an elementary-system information content principle (ICP) whose basic ingredient is the phenomenon of Heisenberg uncertainty.

“Information-Friction” and its implications on minimum energy required for communication

By • 10 Jan, 2014 • Category: Environment

Just as there are frictional losses associated with moving masses on a surface, what if there were frictional losses associated with moving information on a substrate? Indeed, many methods of communication suffer from such frictional losses. We propose to model these losses as proportional to “bit-meters,” i.e., the product of the mass of information (the number of bits) and the distance of information transport. We use this “information-friction” model to understand fundamental energy requirements on encoding and decoding in communication circuitry. First, for communication across a binary-input AWGN channel, we arrive at limits on bit-meters (and thus on energy consumption) for decoding implementations that have a predetermined, input-independent message length.
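As a rough sketch of the bit-meters idea, the cost model can be written down in a few lines. The friction coefficient `mu` and the example figures below are hypothetical placeholders, not values from the paper:

```python
def bit_meters(num_bits, distance_m):
    """Bit-meters: the product of the 'mass' of information (number of
    bits) and the distance over which it is transported, in meters."""
    return num_bits * distance_m

def friction_energy(num_bits, distance_m, mu=1e-12):
    """Energy (J) lost to 'information friction', modeled as proportional
    to bit-meters. The coefficient mu (joules per bit-meter) is a
    hypothetical value chosen only for illustration."""
    return mu * bit_meters(num_bits, distance_m)

# Illustrative only: moving a 1 KiB message across 2 mm of circuitry.
print(friction_energy(8 * 1024, 2e-3))
```

Under this model, halving the wire length halves the frictional energy for the same message, which is the intuition behind deriving energy lower bounds from decoder wiring geometry.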

Information-theoretic interpretation of quantum formalism

By • 6 Jan, 2014 • Category: Laws

We propose an information-theoretic interpretation of quantum formalism based on Bayesian probability and free of any additional axiom. Quantum information is construed as a technique of statistical estimation of the variables within an information manifold. We start from a classical register. The input data are converted into a Bayesian prior, conditioning the probability of the variables involved. In static systems, this framework leads to solving a linear programming problem, which is then transcribed into a Hilbert space using Gleason’s theorem. General systems are introduced in a second step by quantum channels. This provides an information-theoretic foundation for quantum information, including the commutation rules for observables. We conclude that the theory, while dramatically expanding the scope of classical information, is not different from information itself and is therefore a universal tool of reasoning.

Complexity measurement of natural and artificial languages

By • 2 Dec, 2013 • Category: Environment

We compared the entropy of texts written in natural languages (English, Spanish) and in artificial languages (computer software), based on a simple expression for the entropy as a function of message length and specific word diversity. Code written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibited more symbolic diversity than English ones. The results showed that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures allows the unveiling of important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and to estimate the values of entropy, emergence, self-organization, and complexity based on specific diversity and message length.
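A minimal sketch of this kind of measurement, assuming word-level Shannon entropy and one common normalization (emergence E = H/H_max, self-organization S = 1 − E, complexity C = 4ES); the paper's exact expressions may differ:

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (bits per word) of the empirical word distribution."""
    words = text.lower().split()
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def complexity(text):
    """Return (E, S, C): emergence E = H / log2(diversity), where
    diversity is the number of distinct words; S = 1 - E; C = 4 * E * S.
    This normalization is an illustrative assumption, not the paper's."""
    words = text.lower().split()
    diversity = len(set(words))
    if diversity < 2:
        return 0.0, 1.0, 0.0
    e = word_entropy(text) / math.log2(diversity)
    s = 1.0 - e
    return e, s, 4 * e * s

print(complexity("the cat sat on the mat"))
```

Applied to a source file versus a prose paragraph of similar length, the code sample would be expected to score higher on the raw entropy term, in line with the abstract's finding.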

Production and Transfer of Energy and Information in Hamiltonian Systems

By • 9 Oct, 2013 • Category: Environment

We present novel results that relate energy and information transfer with sensitivity to initial conditions in chaotic multidimensional Hamiltonian systems. We show the relation among the Kolmogorov–Sinai entropy, the Lyapunov exponents, and upper bounds for the Mutual Information Rate calculated in the Hamiltonian phase space and on bi-dimensional subspaces. Our main result is that the net amount of energy transferred from kinetic to potential form per unit of time is a power law of the upper bound for the Mutual Information Rate between kinetic and potential energies, and also a power law of the Kolmogorov–Sinai entropy. Therefore, the transfer of energy is related to both the transfer and the production of information. However, the power-law nature of this relation means that a small increment in the energy transferred leads to a relatively much larger increase in the information exchanged. Finally, a relation between our results and important quantities of thermodynamics is presented.
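The link between information production and chaos runs through Pesin's identity, which bounds the Kolmogorov–Sinai entropy by the sum of the positive Lyapunov exponents. A minimal sketch estimating the largest exponent of the Chirikov standard map, used here only as an illustrative area-preserving (Hamiltonian-type) system, not one taken from the paper:

```python
import math

def standard_map(theta, p, K):
    """One step of the Chirikov standard map, an area-preserving map."""
    p_new = (p + K * math.sin(theta)) % (2 * math.pi)
    theta_new = (theta + p_new) % (2 * math.pi)
    return theta_new, p_new

def largest_lyapunov(K, steps=20000):
    """Estimate the largest Lyapunov exponent by evolving a tangent
    vector alongside the orbit and averaging its log-growth. For a
    positive exponent, Pesin's identity ties this growth rate to the
    Kolmogorov-Sinai entropy production."""
    theta, p = 0.5, 0.5
    dtheta, dp = 1.0, 0.0   # tangent vector
    total = 0.0
    for _ in range(steps):
        # Jacobian of the map applied to the tangent vector:
        # dp' = dp + K cos(theta) dtheta;  dtheta' = dtheta + dp'
        dp_new = dp + K * math.cos(theta) * dtheta
        dtheta_new = dtheta + dp_new
        theta, p = standard_map(theta, p, K)
        norm = math.hypot(dtheta_new, dp_new)
        total += math.log(norm)
        dtheta, dp = dtheta_new / norm, dp_new / norm  # renormalize
    return total / steps

print(largest_lyapunov(K=6.0))  # strongly chaotic regime: positive exponent
```

For large K the exponent grows roughly like ln(K/2), so stronger chaos means a faster rate of information production, which is the quantity the abstract relates to the energy transfer rate.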

Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage

By • 1 Oct, 2013 • Category: Philosophy

The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information—the ephemeral information—is forgotten and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
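A sketch of the block-entropy machinery that underlies such symbolic-dynamics estimates, using the logistic map at r = 4, whose binary symbolic dynamics has a metric entropy of 1 bit per symbol. The full ephemeral/bound decomposition requires additional conditioning on past and future blocks that is not shown here:

```python
import math
from collections import Counter

def symbolic_sequence(n, r=4.0, x=0.4):
    """Binary symbolic dynamics of the logistic map, partitioned at 1/2."""
    seq = []
    for _ in range(n):
        x = r * x * (1 - x)
        if x <= 0.0 or x >= 1.0:
            x = 0.3141592653589793  # guard against finite-precision collapse
        seq.append('1' if x > 0.5 else '0')
    return ''.join(seq)

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of L-blocks."""
    blocks = Counter(seq[i:i + L] for i in range(len(seq) - L + 1))
    n = sum(blocks.values())
    return -sum((c / n) * math.log2(c / n) for c in blocks.values())

seq = symbolic_sequence(200000)
# Entropy-rate estimate h ~ H(L) - H(L-1); for r = 4 it converges to 1 bit/symbol.
h = block_entropy(seq, 8) - block_entropy(seq, 7)
print(h)
```

Block-entropy differences at increasing L estimate the metric entropy; the decomposition in the abstract then asks how much of that created information is immediately forgotten (ephemeral) versus stored (bound).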

Problem Complexity Research from Energy Perspective

By • 21 Sep, 2013 • Category: Criticism

Computational complexity is a particularly important subject. In this paper, the idea of Landauer’s principle is extended by mapping three classic problems (sorting, ordered searching, and finding the maximum of N unordered numbers) onto the Maxwell’s demon thought experiment. The problems’ complexity is defined on an entropy basis, and the minimum energy required to solve them is rigorously deduced from the perspective of energy (entropy) and the second law of thermodynamics. The theoretical energy consumed by real programs and by the basic operators of a classical computer is then analyzed, and in this way lower bounds on the time complexity of all possible algorithms for the three problems are derived. A lower bound is also deduced for the multiplication of two n×n matrices. Finally, the reason why reversible computation is impossible and the possibility of a super-linear energy consumption capacity, which may be the power behind quantum computation, are discussed, and a conjecture is proposed which may help prove NP ≠ P. The study brings a fresh and profound understanding of computational complexity.
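A back-of-envelope sketch of this style of Landauer accounting, assuming the standard bound of k_B·T·ln 2 joules per erased bit at room temperature, and log2(n!) bits to resolve one of n! permutations; this is the same quantity behind the Ω(n log n) comparison-sorting bound, not the paper's own derivation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(bits, T=300.0):
    """Minimum energy (J) to erase `bits` bits of information at
    temperature T kelvin: k_B * T * ln(2) per bit (Landauer's principle)."""
    return bits * K_B * T * math.log(2)

def sorting_entropy_bits(n):
    """Information needed to identify one of n! input orderings:
    log2(n!) bits, which is Theta(n log n) by Stirling's approximation."""
    return math.log2(math.factorial(n))

n = 1000
bits = sorting_entropy_bits(n)
print(bits, landauer_bound(bits))  # bits resolved; minimum joules to sort
```

Even for a thousand elements the thermodynamic floor is on the order of 10^-17 joules, vastly below what real hardware dissipates, which is why such bounds constrain principle rather than practice.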

Towards a mathematical theory of meaningful communication

By • 24 Aug, 2013 • Category: Laws

Despite its obvious relevance, meaning has been outside most theoretical approaches to information in biology. As a consequence, functional responses based on an appropriate interpretation of signals have been replaced by a probabilistic description of correlations between emitted and received symbols. This assumption leads to potential paradoxes, such as the presence of a maximum of information associated with a channel that would actually create completely wrong interpretations of the signals. Game-theoretic models of language evolution use this view of Shannon’s theory, but other approaches considering embodied communicating agents show that the correct (meaningful) match resulting from agent-agent exchanges is always achieved, and natural systems obviously solve the problem correctly. How can Shannon’s theory be expanded in such a way that meaning, at least in its minimal referential form, is properly incorporated?

Where does the “it from bit” come from?

By • 10 Jun, 2013 • Category: Philosophy

In his 1989 essay, John Archibald Wheeler tried to answer the eternal question of existence. He did so by searching for links between information, physics, and quanta. The main concept emerging from his essay is that “every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications”. This concept has been summarized in the catchphrase “it from bit”. In Wheeler’s essay one can hear, several times, echoes of the philosophy of Niels Bohr. The Danish physicist pointed out how quantum and relativistic physics, by forcing us to abandon the anchor of the visual reference of common sense, have demanded greater attention to language. Bohr did not deny physical reality, but he recognized that a language is always needed, no matter what a person wants to do. To put it as Carlo Sini does, language is the first toolbox that man has at hand to analyze experience. It is not a thought translated into words, because to think is to operate with signs, as various philosophers from Leonardo da Vinci to Ludwig Wittgenstein have reminded us. […]

Information and Computation

By • 7 Apr, 2013 • Category: Opinion

In this chapter, concepts related to information and computation are reviewed in the context of human computation. A brief introduction to information theory and different types of computation is given. Two examples of human computation systems, online social networks and Wikipedia, are used to illustrate how these can be described and compared in terms of information and computation.