So I've written before about Logical Uncertainty in a very vague way. And a few weeks ago I wrote about a specific problem of Logical Uncertainty which was presented in the MIRI workshop. I'm gonna reference definitions and results from that post here, too, though I'll redefine:

**Definition 1 (Coherence).** A distribution $\mathbb{P}$ over a set of sentences is coherent if:

- Normalisation: $\mathbb{P}(\phi) = 1$ for every provable sentence $\phi$;
- Additivity: $\mathbb{P}(\phi \lor \psi) = \mathbb{P}(\phi) + \mathbb{P}(\psi)$ whenever $\phi$ and $\psi$ are mutually exclusive.

If $\top$ stands for a tautology (e.g. $\phi \lor \lnot\phi$), and if $\bot$ stands for a contradiction (e.g. $\phi \land \lnot\phi$), then normalisation and additivity imply $\mathbb{P}(\bot) = 0$, and the whole definition implies $\mathbb{P}(\lnot\phi) = 1 - \mathbb{P}(\phi)$.

TL;DR: a distribution is coherent if it's an actual probability distribution and obeys the basic laws of probability (to perform inference you'd define $\mathbb{P}(\phi \mid \psi) = \mathbb{P}(\phi \land \psi)/\mathbb{P}(\psi)$).

**Theorem.** Any coherent distribution over the set of all first-order logical sentences is uncomputable.

In first-order logic, as the Completeness Theorem says, all true statements are provable, and soundness of the first-order logic axioms means all provable statements are true. Ergo, a coherent distribution over the set of all logical sentences is Gaifman, since all true sentences are provable and therefore must have $\mathbb{P}(\phi) = 1$, as per Normalisation. Now suppose this distribution were computable. That would imply it's also computably approximable: just take $f(\phi, n) = \mathbb{P}(\phi)$ for all $n$, where $\mathbb{P}$ is the computable function that outputs each sentence's probability. But then this would mean that $\mathbb{P}$ is coherent, Gaifman, and computably approximable, and according to Theorem 1 of that post it would have to give probability less than $1$ to some true, provable sentence, violating Normalisation: contradiction. □

I think this would probably be doubly true for second-order logical sentences, but I haven't thought about this enough to come up with a proof.

Let's define something else, then:

**Definition 2 (Trivial equivalence).** We define a relation $\sim$ as the minimal equivalence relation satisfying a short list of purely syntactic conditions for all sentences $\phi$, $\psi$, and $\xi$. Two sentences linked to each other by this relation are said to be trivially equivalent.

Paul Christiano proves that for any sentences $\phi$ and $\psi$ you can determine whether $\phi \sim \psi$ in time polynomial in $n$, where $n$ is the total length of these sentences, which motivates the usefulness of this definition as opposed to fully general logical equivalence, which is computationally intractable.

And with that we have another definition:

**Definition 3 (Local Coherence).** Given a finite set of sentences $S$, we say that a distribution defined on that set is locally coherent if:

- Weak normalisation: $\mathbb{P}(\phi) = 1$ for all sentences $\phi \in S$ that are axioms or have been proven to be true.

This is clearly a strict weakening of the notion of coherence, and it might better characterise realistic, deductively limited logical uncertainty. Before we know that two sentences are logically equivalent, or that one implies the other, it may very well be the case that their probabilities aren't coherent. However, if we could guarantee local coherence, i.e. that the things we actually know/have proven constrain our distribution, then that'd be ideal.

Let's talk about some more of the actual workshop. On the first day we went over the basics: what MIRI's project is, why we care about Logical Uncertainty, and what the interesting and/or open questions were. Specifically, Patrick LaVictoire was talking about a certain hierarchy of constructions:

- Philosophical question
- Infinite computing power
- Computable but intractable models
- P-time models
- Actually usable algorithms

The question at hand is "How do we deal with uncertainty about the output of deterministic algorithms?" And of course, if you have infinite computing power, there's not much of a question of the matter: you just know all your (at least first-order) theorems, the end. It only really becomes interesting/hard when we get to the non-omniscient, finite, deductively limited case. Interesting questions, topics, and results about this are:

- Integrating logical and physical uncertainty
- Reflective oracles and logical uncertainty
- Optimal predictors (complexity theory approach)
- Benford's test and irreducible patterns
- Polynomial-time approximate Bayesian updating via Kullback–Leibler divergence