Algorithmic identification of probabilities is hard

By • 30 May 2014 • Section: Education

Laurent Bienvenu, Benoit Monin, Alexander Shen

Abstract: Suppose that we are given an infinite binary sequence which is random for a Bernoulli measure of parameter p. By the law of large numbers, the frequency of zeros in the sequence tends to p, and thus we can get better and better approximations of p as we read the sequence. We study in this paper a similar question, but from the viewpoint of inductive inference. We suppose now that p is a computable real, but one asks for more: as we are reading more and more bits of our random sequence, we have to eventually guess the exact parameter p (in the form of a Turing code). Can one do such a thing uniformly on all sequences that are random for computable Bernoulli measures, or even on a "large enough" fraction of them? In this paper, we give a negative answer to this question. In fact, we prove a very general negative result which extends far beyond the class of Bernoulli measures.
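As a rough illustration of the law-of-large-numbers approximation mentioned in the abstract, here is a minimal Python sketch (not taken from the paper; the parameter value and function names are invented for the example). It samples a pseudo-random Bernoulli sequence with a chosen parameter p and shows the empirical frequency of zeros in growing prefixes approaching p. Note the contrast with the paper's question: this estimate only approximates p, whereas the inductive-inference setting asks to eventually output the exact computable parameter.

```python
# Minimal sketch, assuming a pseudo-random source stands in for a sequence
# that is random for a Bernoulli measure of parameter p.
import random

def bernoulli_prefix(p, n, seed=0):
    """Return n bits, each equal to 0 with probability p."""
    rng = random.Random(seed)
    return [0 if rng.random() < p else 1 for _ in range(n)]

def estimate_p(bits):
    """Empirical frequency of zeros in the observed prefix."""
    return bits.count(0) / len(bits)

if __name__ == "__main__":
    p = 0.3  # hypothetical (computable) parameter, unknown to the observer
    sequence = bernoulli_prefix(p, 100_000)
    for n in (100, 1_000, 10_000, 100_000):
        # Longer prefixes give better and better approximations of p,
        # but never the exact value.
        print(n, estimate_p(sequence[:n]))
```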

arXiv:1405.5139v1 [math.LO]

Logic (math.LO)
