id: work_mr277dtjk5cjlaldsrhnizxngq
author: Tom F. Sterkenburg
title: Solomonoff Prediction and Occam's Razor
date: 2016
pages: 18
extension: .pdf
mime: application/pdf
words: 10026
sentences: 755
flesch: 56
summary:
  Algorithmic information theory gives an idealized notion of compressibility that is often presented as an objective measure of simplicity.
  ... inductive assumption that, by a basic property of Bayesian prediction methods, entails reliability under that very same assumption, leaving the conclusion of the ...
  ... a representation theorem that bridges Solomonoff's predictors and Bayesian prediction.
  ... machines, we have an infinite class of algorithmic probability predictors.
  Denote Q := {Q_U}_U, the class of algorithmic probability predictors via all universal machines.
  We can prove that any Bayesian predictor, operating under the inductive assumption of S, is reliable under the assumption that the data is indeed generated ...
  ... probability predictors, and a particular class of Bayesian mixtures, the effective ...
  ... assumption of an i.i.d. source; and Theorem 6 shows that the algorithmic probability predictors operate under the inductive assumption of an effective source.
  Solomonoff's algorithmic probability predictors are precisely the Bayesian predictors operating under the inductive assumption of effectiveness.
cache: ./cache/work_mr277dtjk5cjlaldsrhnizxngq.pdf
txt: ./txt/work_mr277dtjk5cjlaldsrhnizxngq.txt
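For reference, a minimal LaTeX sketch of the notation the summary relies on: the standard textbook definitions of the algorithmic probability predictor Q_U and its Bayesian-mixture form. Only Q := {Q_U}_U appears in the summary itself; the symbols U, x, nu_i, and w_i are the usual conventions, assumed here rather than quoted from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Algorithmic probability of a finite string $x$ via a universal monotone machine $U$:
% every (minimal) program $p$ whose output starts with $x$ contributes weight $2^{-|p|}$.
\[
  Q_U(x) \;=\; \sum_{p \,:\, U(p) \succeq x} 2^{-|p|}.
\]

% Prediction proceeds by conditioning on the data seen so far.
\[
  Q_U(x_{n+1} \mid x_1 \dots x_n) \;=\; \frac{Q_U(x_1 \dots x_n x_{n+1})}{Q_U(x_1 \dots x_n)}.
\]

% One predictor per universal machine: the class $\mathcal{Q}$ from the summary.
\[
  \mathcal{Q} \;:=\; \{\, Q_U \,\}_U.
\]

% Bayesian-mixture form alluded to by the representation theorem: $Q_U$ written as a
% mixture over a countable class of effective hypotheses $\nu_i$ with prior weights $w_i > 0$.
\[
  Q_U(x) \;=\; \sum_i w_i \, \nu_i(x), \qquad \sum_i w_i \le 1.
\]

\end{document}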