Articles tagged ‘Information Theory (cs.IT)’

Information Content of Elementary Systems as a Physical Principle

By • Mar 30, 2014 • Category: Ambiente

Quantum physics has remarkable characteristics, such as quantum correlations, uncertainty relations, and no-cloning, which create an interpretative and conceptual gap between the classical and the quantum world. To provide a more unified framework, generalized probabilistic theories were formulated. Recently, it turned out that such theories include so-called “post-quantum” ones, which share many of the typical quantum characteristics but predict supra-quantum effects, such as correlations stronger than quantum ones. This reveals an even more dramatic gap between the classical/quantum and post-quantum worlds. It is therefore imperative to search for information principles characterizing physical theories. In recent years, different principles have been proposed; however, all of the principles considered so far have been correlation principles. Here, we introduce an elementary-system information content principle (ICP) whose basic ingredient is the phenomenon of Heisenberg uncertainty.



“Information-Friction” and its implications on minimum energy required for communication

By • Jan 10, 2014 • Category: Ambiente

Just as there are frictional losses associated with moving masses on a surface, what if there were frictional losses associated with moving information on a substrate? Indeed, many methods of communication suffer from such losses. We propose to model these losses as proportional to “bit-meters,” i.e., the product of the mass of information (the number of bits) and the distance of information transport. We use this “information-friction” model to understand fundamental energy requirements for encoding and decoding in communication circuitry. First, for communication across a binary-input AWGN channel, we arrive at limits on bit-meters (and thus energy consumption) for decoding implementations that have a predetermined, input-independent message length.
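The “bit-meters” cost model described above can be sketched in a few lines. The function names and the friction coefficient below are illustrative assumptions, not values taken from the paper:

```python
def bit_meters(num_bits: int, distance_m: float) -> float:
    """Mass of information (number of bits) times transport distance (meters)."""
    return num_bits * distance_m

def friction_energy(num_bits: int, distance_m: float,
                    mu_joules_per_bit_meter: float = 1e-21) -> float:
    """Energy lost to 'information friction', modeled as proportional to bit-meters.

    The coefficient mu is a hypothetical placeholder for the substrate's
    friction constant.
    """
    return mu_joules_per_bit_meter * bit_meters(num_bits, distance_m)

# Moving a 1000-bit message across 1 mm of circuitry:
e = friction_energy(num_bits=1000, distance_m=1e-3)
print(f"{e:.2e} J")
```

Under this model, halving wire lengths in a decoder halves the frictional energy for the same number of bits moved, which is what ties circuit geometry to the energy limits discussed in the abstract.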



Information-theoretic interpretation of quantum formalism

By • Jan 6, 2014 • Category: Leyes

We propose an information-theoretic interpretation of the quantum formalism based on Bayesian probability and free from any additional axiom. Quantum information is construed as a technique of statistical estimation of the variables within an information manifold. We start from a classical register. The input data are converted into a Bayesian prior, conditioning the probability of the variables involved. In static systems, this framework leads to solving a linear programming problem, which is then transcribed into a Hilbert space using Gleason's theorem. General systems are introduced in a second step via quantum channels. This provides an information-theoretic foundation for quantum information, including the commutation rules for observables. We conclude that the theory, while dramatically expanding the scope of classical information, is not different from information itself and is therefore a universal tool of reasoning.



Complexity measurement of natural and artificial languages

By • Dec 2, 2013 • Category: Ambiente

We compared entropy for texts written in natural languages (English, Spanish) and artificial languages (computer software) based on a simple expression for the entropy as a function of message length and specific word diversity. Code written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibit more symbolic diversity than English ones. The results showed that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures allows the unveiling of important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and to estimate the values of entropy, emergence, self-organization, and complexity based on specific diversity and message length.
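A minimal version of this kind of measurement is the Shannon entropy of the empirical word distribution. This is a generic word-level entropy, not the paper's specific expression, and the two sample strings are invented for illustration:

```python
import math
from collections import Counter

def word_entropy(text: str) -> float:
    """Shannon entropy (bits per word) of the empirical word distribution."""
    words = text.lower().split()
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy samples: a natural-language sentence vs. tokenized code-like text.
natural = "the cat sat on the mat and then the cat slept on the mat"
code = "for i in range n if a i less than b i then swap a i and b i"
print(word_entropy(natural), word_entropy(code))
```

Entropy grows with word diversity at a given length, which is the intuition behind the comparison in the abstract: code tends to draw on a more uniform mix of distinct tokens than natural prose of similar length.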



Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage

By • Oct 1, 2013 • Category: Filosofía

The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information—the ephemeral information—is forgotten and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.



Problem Complexity Research from Energy Perspective

By • Sep 21, 2013 • Category: Crítica

Computational complexity is a particularly important object of study. In this paper, the idea of Landauer's principle is extended by mapping three classic problems (sorting, ordered searching, and finding the maximum of N unordered numbers) onto the Maxwell's demon thought experiment. The problems' complexity is defined on an entropy basis, and the minimum energy required to solve them is rigorously deduced from the perspective of energy (entropy) and the second law of thermodynamics. The theoretical energy consumed by a real program and by the basic operators of a classical computer is then analyzed, and in this way the time-complexity lower bounds of all possible algorithms for the three problems are derived. A lower bound is also deduced for the problem of multiplying two n×n matrices. Finally, the reason why reversible computation is impossible and the possibility of a super-linear energy consumption capacity, which may be the power behind quantum computation, are discussed, and a conjecture is proposed which may prove NP ≠ P. The study brings a fresh and profound understanding of computational complexity.
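The flavor of this entropy-based bound can be illustrated with a textbook Landauer-style calculation: sorting n distinct items resolves log2(n!) bits of uncertainty, and each bit costs at least kT ln 2 to erase. This is a standard back-of-the-envelope version of the mapping, not the paper's own derivation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit_joules(bits: float, temperature_k: float = 300.0) -> float:
    """Minimum energy to irreversibly process the given number of bits."""
    return bits * K_B * temperature_k * math.log(2)

def sorting_entropy_bits(n: int) -> float:
    """Sorting n distinct items resolves log2(n!) bits of uncertainty."""
    return math.log2(math.factorial(n))

n = 1000
bits = sorting_entropy_bits(n)
print(f"sorting {n} items: {bits:.0f} bits, "
      f">= {landauer_limit_joules(bits):.2e} J at 300 K")
```

Since log2(n!) grows as n log n, the same counting argument recovers the familiar Omega(n log n) lower bound for comparison sorting, here expressed as an energy floor.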



Towards a mathematical theory of meaningful communication

By • Aug 24, 2013 • Category: Leyes

Despite its obvious relevance, meaning has been left outside most theoretical approaches to information in biology. As a consequence, functional responses based on an appropriate interpretation of signals have been replaced by a probabilistic description of correlations between emitted and received symbols. This assumption leads to potential paradoxes, such as the presence of a maximum of information associated with a channel that would actually create completely wrong interpretations of the signals. Game-theoretic models of language evolution use this view of Shannon's theory, but other approaches considering embodied communicating agents show that the correct (meaningful) match resulting from agent-agent exchanges is always achieved, and natural systems obviously solve the problem correctly. How can Shannon's theory be expanded in such a way that meaning, at least in its minimal referential form, is properly incorporated?
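The paradox in the abstract has a simple numerical form: mutual information is invariant under relabeling of symbols, so a channel whose receiver systematically misinterprets every signal can carry exactly as much Shannon information as one interpreted correctly. A small check of this, with toy joint distributions chosen for illustration:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A noiseless binary channel with the "correct" interpretation ...
correct = {(0, 0): 0.5, (1, 1): 0.5}
# ... and the same channel with every signal decoded as the wrong referent.
swapped = {(0, 1): 0.5, (1, 0): 0.5}
print(mutual_information(correct), mutual_information(swapped))
```

Both channels achieve the maximum of 1 bit, even though the second one gets every referent wrong, which is why Shannon information alone cannot capture meaningful communication.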



Information and Computation

By • Apr 7, 2013 • Category: Opinion

In this chapter, concepts related to information and computation are reviewed in the context of human computation. A brief introduction to information theory and different types of computation is given. Two examples of human computation systems, online social networks and Wikipedia, are used to illustrate how these can be described and compared in terms of information and computation.



Measurement of statistical evidence on an absolute scale following thermodynamic principles

By • Jun 27, 2012 • Category: Crítica

Statistical analysis is used throughout biomedical research and elsewhere to assess the strength of evidence. We have previously argued that typical outcome statistics (including p-values and maximum likelihood ratios) have poor measure-theoretic properties: they can erroneously indicate decreasing evidence as data supporting a hypothesis accumulate, and they are not amenable to calibration, which is necessary for meaningful comparison of evidence across different study designs, data types, and levels of analysis. We have also previously proposed that thermodynamic theory, which for the first time allowed the derivation of an absolute measurement scale for temperature (T), could be used to derive an absolute scale for evidence (E). Here we present a novel thermodynamically based framework in which measurement of E on an absolute scale, for which “one degree” always means the same thing, becomes possible for the first time. The new framework invites us to think about statistical analyses in terms of the flow of (evidential) information, placing this work in the context of a growing literature on connections among physics, information theory, and statistics.



Introducing the Computable Universe

By • Jun 18, 2012 • Category: Leyes

Some contemporary views of the universe assume information and computation to be the key to understanding and explaining the basic structure underpinning physical reality. We introduce the Computable Universe, exploring some of the basic arguments that give foundation to these visions. We focus on the algorithmic and quantum aspects, and on how these may fit into and support the computable-universe hypothesis.