
Universal Convergence of Semimeasures on Individual Random Sequences [chapter]

Marcus Hutter, Andrej Muchnik
2004 Lecture Notes in Computer Science  
Solomonoff's central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior µ, if the latter is computable  ...  We show that W converges to D and D to µ on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.  ...  -random sequence and show the existence of a universal semimeasure which does not converge on this sequence, hence answering the open question negatively for some M.  ...
doi:10.1007/978-3-540-30215-5_19 fatcat:gmlx75srpff6rdtqtkowth2up4
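
The Hellinger distance mentioned in the abstract above is a standard metric between probability distributions; for two Bernoulli measures the squared distance reduces to a one-line formula. The function name below is my own, and this is a minimal sketch of the quantity, not code from the paper.

```python
from math import sqrt

def hellinger_sq_bernoulli(p, q):
    """Squared Hellinger distance between Bernoulli(p) and Bernoulli(q):
    H^2(p, q) = 1 - sum_x sqrt(p(x) * q(x)), summing over x in {0, 1}."""
    return 1.0 - (sqrt(p * q) + sqrt((1.0 - p) * (1.0 - q)))

# Identical distributions are at distance 0; disjoint ones at distance 1.
print(hellinger_sq_bernoulli(0.5, 0.5))  # 0.0
print(hellinger_sq_bernoulli(0.0, 1.0))  # 1.0
```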

Universal Convergence of Semimeasures on Individual Random Sequences [article]

Marcus Hutter, Andrej Muchnik
2004 arXiv   pre-print
Solomonoff's central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior mu, if the latter is computable  ...  We show that W converges to D and D to mu on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.  ...  -random sequence and show the existence of a universal semimeasure which does not converge on this sequence, hence answering the open question negatively for some M.  ... 
arXiv:cs/0407057v1 fatcat:khwjqxwkczh2nafiul2zjpn64q

On semimeasures predicting Martin-Löf random sequences

Marcus Hutter, Andrej Muchnik
2007 Theoretical Computer Science  
We show that there are universal semimeasures M which do not converge to µ on all µ-random sequences, i.e. we give a partial negative answer to the open problem.  ...  Solomonoff's central result on induction is that the prediction of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating predictor µ, if the latter is computable  ...  A shorter version appeared in the proceedings of the ALT 2004 conference [1].  ...
doi:10.1016/j.tcs.2007.03.040 fatcat:smm3ujv6k5cxlbhjlqm77tuqma

On Semimeasures Predicting Martin-Loef Random Sequences [article]

Marcus Hutter, Andrej Muchnik
2007 arXiv   pre-print
Solomonoff's central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence generating posterior mu, if the latter is computable  ...  We show that W converges to D and D to mu on all random sequences. The Hellinger distance measuring closeness of two distributions plays a central role.  ...  In Section 4 we investigate whether convergence for all Martin-Löf random sequences holds. We construct a µ-M.L.-random sequence on which some universal semimeasures M do not converge to µ.  ...
arXiv:0708.2319v1 fatcat:6ew7f75zejgftnwvjwwa74c3oa

On the Existence and Convergence of Computable Universal Priors [article]

Marcus Hutter
2003 arXiv   pre-print
We introduce a generalized concept of randomness for individual sequences and use it to exhibit difficulties regarding these issues.  ...  His central result is that the posterior of his universal semimeasure M converges rapidly to the true sequence generating posterior mu, if the latter is computable.  ...  (iii) uses Martin-Löf's notion of randomness of individual sequences to define convergence M.L.  ... 
arXiv:cs/0305052v1 fatcat:gvyokczehzakpfnb5xgp5dpu7q

On the Existence and Convergence of Computable Universal Priors [chapter]

Marcus Hutter
2003 Lecture Notes in Computer Science  
We introduce a generalized concept of randomness for individual sequences and use it to exhibit difficulties regarding these issues.  ...  His central result is that the posterior of his universal semimeasure M converges rapidly to the true sequence generating posterior µ, if the latter is computable.  ...  (iii) uses Martin-Löf's notion of randomness of individual sequences to define convergence M.L.  ... 
doi:10.1007/978-3-540-39624-6_24 fatcat:3dle7qlw6jayvairul3ry36pty

On Martin-Löf Convergence of Solomonoff's Mixture [chapter]

Tor Lattimore, Marcus Hutter
2013 Lecture Notes in Computer Science  
We study the convergence of Solomonoff's universal mixture on individual Martin-Löf random sequences.  ...  A new result is presented extending the work of Hutter and Muchnik (2004) by showing that there does not exist a universal mixture that converges on all Martin-Löf random sequences.  ...  Martin-Löf randomness is the usual characterisation of the randomness of individual sequences [6] .  ... 
doi:10.1007/978-3-642-38236-9_20 fatcat:xsljijw47zhnrhymaoeb4k4y7m

On Generalized Computable Universal Priors and their Convergence [article]

Marcus Hutter
2005 arXiv   pre-print
In particular, we show that convergence fails (holds) on generalized-random sequences in gappy (dense) Bernoulli classes.  ...  We introduce a generalized concept of randomness for individual sequences and use it to exhibit difficulties regarding these issues.  ...  (iii) uses Martin-Löf's notion of randomness of individual sequences to define convergence M.L.  ... 
arXiv:cs/0503026v1 fatcat:5zya6ovf5jf6vept2po7kpkjtq

On generalized computable universal priors and their convergence

Marcus Hutter
2006 Theoretical Computer Science  
In particular, we show that convergence fails (holds) on generalized-random sequences in gappy (dense) Bernoulli classes.  ...  We introduce a generalized concept of randomness for individual sequences and use it to exhibit difficulties regarding these issues.  ...  (iii) uses Martin-Löf's notion of randomness of individual sequences to define convergence M.L.  ...
doi:10.1016/j.tcs.2006.07.039 fatcat:zidqw3nhhrhujpscobtrwd6wiu
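
The "gappy (dense) Bernoulli classes" result above concerns uncomputable objects, but the qualitative phenomenon can be illustrated with a finite Bayes mixture: when the model class contains (or densely approximates) the true parameter, the predictive probability converges to it; when the class has a gap at the true parameter, predictions lock onto the nearest class member instead. All names and class choices below are illustrative, not from the papers.

```python
import random

def mixture_prediction(bits, ps):
    """Predictive probability that the next bit is 1 under a uniform-prior
    Bayes mixture over the Bernoulli(p) models in `ps`, after seeing `bits`."""
    w = [1.0 / len(ps)] * len(ps)
    for b in bits:
        # Bayes update: reweight each model by its likelihood of bit b.
        w = [wi * (p if b == 1 else 1.0 - p) for wi, p in zip(w, ps)]
        s = sum(w)
        w = [wi / s for wi in w]
    return sum(wi * p for wi, p in zip(w, ps))

random.seed(1)
true_p = 0.7
bits = [1 if random.random() < true_p else 0 for _ in range(5000)]

dense = [i / 100 for i in range(1, 100)]  # fine grid, contains 0.70
gappy = [0.1, 0.5, 0.9]                   # class with a gap at 0.70

pred_dense = mixture_prediction(bits, dense)
pred_gappy = mixture_prediction(bits, gappy)
# pred_dense ends up near 0.7; pred_gappy locks onto 0.5, the class
# member closest in KL divergence to Bernoulli(0.7).
```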

Ray Solomonoff, Founding Father of Algorithmic Information Theory

Paul M.B. Vitanyi
2010 Algorithms  
Suppose we have an infinite binary sequence every odd bit of which is uniformly random and every even bit is a bit of pi = 3.1415... written in binary.  ...  For prediction one uses not the universal a priori probability which is a probability mass function, but a semimeasure which is a weak form of a measure.  ... 
doi:10.3390/a3030260 fatcat:ypsejgoql5bgni4qksom72alxi

Sequential predictions based on algorithmic complexity

Marcus Hutter
2006 Journal of computer and system sciences (Print)  
This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = − log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff's universal prior  ...  We show that for deterministic computable environments, the "posterior" and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence convergence can be slow.  ...  M can be used to characterize randomness of individual sequences: a sequence x 1:∞ is (Martin-Löf) µ-random, iff ∃c : M(x 1:n ) ≤ c·µ(x 1:n ) ∀n.  ...
doi:10.1016/j.jcss.2005.07.001 fatcat:tpvox6g6jbakjlfnu6ti2aw2vy
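
The m-versus-M contrast in the entry above (one-part MDL against the full mixture) also has a finite toy analogue: predict with only the single highest-posterior model, versus averaging over all models. This is a loose analogy under assumed names — the actual Km and M are uncomputable — but it shows the two prediction styles side by side.

```python
import random

def posterior_weights(bits, ps):
    """Sequential Bayes posterior over a finite class of Bernoulli(p) models,
    starting from a uniform prior."""
    w = [1.0 / len(ps)] * len(ps)
    for b in bits:
        w = [wi * (p if b == 1 else 1.0 - p) for wi, p in zip(w, ps)]
        s = sum(w)
        w = [wi / s for wi in w]
    return w

random.seed(2)
ps = [0.25, 0.5, 0.75]
bits = [1 if random.random() < 0.75 else 0 for _ in range(1000)]
w = posterior_weights(bits, ps)

# One-part, MAP-style prediction (toy analogue of predicting with m):
map_pred = ps[max(range(len(ps)), key=lambda i: w[i])]
# Full mixture prediction (toy analogue of predicting with M):
mix_pred = sum(wi * p for wi, p in zip(w, ps))
# In this computable setting both settle near the true parameter 0.75.
```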

Sequential Predictions based on Algorithmic Complexity [article]

Marcus Hutter
2005 arXiv   pre-print
This paper studies sequence prediction based on the monotone Kolmogorov complexity Km=-log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff's universal prior M,  ...  We show that for deterministic computable environments, the "posterior" and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence convergence can be slow.  ...  M can be used to characterize randomness of individual sequences: A sequence x 1:∞ is (Martin-Löf) µ-random, iff ∃c : M(x 1:n ) ≤ c·µ(x 1:n )∀n.  ... 
arXiv:cs/0508043v1 fatcat:psltsxgxeba5pk6bb7af3l27fm

Sequence Prediction based on Monotone Complexity [article]

Marcus Hutter
2003 arXiv   pre-print
We show that for deterministic computable environments, the "posterior" and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence behavior is unclear.  ...  This paper studies sequence prediction based on the monotone Kolmogorov complexity Km=-log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff's prior M, the latter  ...  M can be used to characterize randomness of individual sequences: A sequence x 1:∞ is (Martin-Löf) µ-random, iff ∃c : M(x 1:n ) ≤ c·µ(x 1:n )∀n.  ... 
arXiv:cs/0306036v1 fatcat:qhcwtzup4vhs5lpuisbfph6gx4

On prediction by data compression [chapter]

Paul Vitányi, Ming Li
1997 Lecture Notes in Computer Science  
Making these ideas rigorous involves the length of the shortest effective description of an individual object: its Kolmogorov complexity.  ...  In a previous paper we have shown that optimal compression is almost always a best strategy in hypotheses identification (an ideal form of the minimum description length (MDL) principle).  ...  In fact, classical probability theory cannot express the notion of randomness of an individual sequence.  ... 
doi:10.1007/3-540-62858-4_69 fatcat:wikhwtgwlnd7hjpnpy675akfmi