
Pure inductive logic


Pure inductive logic (PIL) is the area of mathematical logic concerned with the philosophical and mathematical foundations of probabilistic inductive reasoning. It combines classical predicate logic and probability theory (Bayesian inference). Probability values are assigned to sentences of a first-order relational language to represent degrees of belief that should be held by a rational agent. Conditional probability values represent degrees of belief based on the assumption of some received evidence.

PIL studies prior probability functions on the set of sentences and evaluates their rationality through principles that such functions should arguably satisfy. Each principle directs the function to assign probability and conditional probability values to sentences in a way that is rational in some particular respect. Not all desirable principles of PIL are compatible, so no prior probability function satisfies them all. Some prior probability functions, however, are distinguished by satisfying an important collection of principles.

History


Inductive logic started to take a clearer shape in the early 20th century in the work of William Ernest Johnson and John Maynard Keynes, and was further developed by Rudolf Carnap. Carnap introduced the distinction between pure and applied inductive logic,[1] and modern pure inductive logic develops along the lines of the pure, uninterpreted approach envisaged by Carnap.

Framework


General case


In its basic form, PIL uses first-order logic without equality, with the usual connectives $\wedge$, $\vee$, $\neg$, $\rightarrow$ (and, or, not and implies respectively), quantifiers $\forall$, $\exists$, finitely many predicate (relation) symbols $R_1, R_2, \ldots, R_q$, and countably many constant symbols $a_1, a_2, a_3, \ldots$.

There are no function symbols. The predicate symbols can be unary, binary or of higher arities. The finite set of predicate symbols may vary while the rest of the language is fixed. It is a convention to refer to the language as $L$ and write

$L = \{ R_1, R_2, \ldots, R_q \}$

where the $R_1, R_2, \ldots, R_q$ list the predicate symbols. The set of all sentences of $L$ is denoted $SL$. If a sentence $\theta$ is written as $\theta(a_{i_1}, a_{i_2}, \ldots, a_{i_n})$ with constants appearing in it listed then it is assumed that the list includes at least all those that appear. $TL$ is the set of structures for $L$ with universe $\{a_1, a_2, a_3, \ldots\}$ and with each constant symbol interpreted as itself.

A probability function $w$ for sentences of $L$ is a function with domain $SL$ and values in the unit interval $[0,1]$ satisfying the following conditions:

– if $\theta$ is logically valid then $w(\theta) = 1$,
– if sentences $\theta$ and $\varphi$ are mutually exclusive (that is, $\models \neg(\theta \wedge \varphi)$) then $w(\theta \vee \varphi) = w(\theta) + w(\varphi)$,
– for a formula $\theta(x)$ with one free variable $x$, the probability $w(\exists x\, \theta(x))$ is the limit of the probabilities $w(\theta(a_1) \vee \theta(a_2) \vee \cdots \vee \theta(a_n))$ as $n$ tends to infinity.

This last condition, which goes beyond the standard Kolmogorov axioms (for finite additivity), is referred to as Gaifman's Axiom and it is intended to capture the idea that the constants $a_1, a_2, a_3, \ldots$ exhaust the universe.
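
These conditions already yield the familiar consequences of finite additivity; for example, since $\theta \vee \neg\theta$ is logically valid and $\theta$, $\neg\theta$ are mutually exclusive,

$1 = w(\theta \vee \neg\theta) = w(\theta) + w(\neg\theta), \quad \text{so} \quad w(\neg\theta) = 1 - w(\theta).$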

For a probability function $w$ and a sentence $\varphi$ with $w(\varphi) > 0$, the corresponding conditional probability function $w(\,\cdot \mid \varphi)$ is defined by

$w(\theta \mid \varphi) = \frac{w(\theta \wedge \varphi)}{w(\varphi)}.$

Unlike belief functions in many-valued logics, it is not the case that the probability value of a compound sentence is determined by the probability values of its components. Probability respects the classical semantics: logically equivalent sentences must be given the same probability. Hence logically equivalent sentences are often identified.
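
For instance (with purely illustrative values, not tied to any particular probability function), knowing only that $w(\theta) = w(\varphi) = \tfrac{1}{2}$ leaves $w(\theta \wedge \varphi)$ undetermined; the above conditions only give the bounds

$\max(0,\, w(\theta) + w(\varphi) - 1) \,\leq\, w(\theta \wedge \varphi) \,\leq\, \min(w(\theta), w(\varphi)),$

here $0 \leq w(\theta \wedge \varphi) \leq \tfrac{1}{2}$, with the extremes attained when $\varphi$ is logically equivalent to $\neg\theta$ and to $\theta$ respectively.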

A state description for a finite set of constants is a conjunction of atomic sentences (predicates or their negations) instantiated exclusively by these constants, such that for any eligible atomic sentence either it or its negation (but not both) appears in the conjunction.
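
For illustration, in a language with a single binary predicate $R$ (a choice made only for this example), one of the $2^4 = 16$ state descriptions for the constants $a_1, a_2$ is

$R(a_1,a_1) \wedge \neg R(a_1,a_2) \wedge R(a_2,a_1) \wedge \neg R(a_2,a_2),$

since each of the four atomic sentences $R(a_i,a_j)$, $i,j \in \{1,2\}$, must appear either negated or unnegated.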

Any probability function is uniquely determined by its values on state descriptions. To define a probability function, it suffices to specify nonnegative values of all state descriptions for $a_1, \ldots, a_n$ (for all $n$) so that the values of all state descriptions for $a_1, \ldots, a_{n+1}$ extending a given state description for $a_1, \ldots, a_n$ sum to the value of the state description they all extend, with the convention that the (only) state description for no constants is a tautology and that it has value $1$.
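
For example, in a language with a single unary predicate $R_1$ (again chosen only for illustration), the state description $R_1(a_1)$ for $a_1$ has exactly two extensions to state descriptions for $a_1, a_2$, so the condition requires

$w(R_1(a_1)) = w(R_1(a_1) \wedge R_1(a_2)) + w(R_1(a_1) \wedge \neg R_1(a_2)).$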

If $\Theta$ is a state description for a set of constants including $a_i$, $a_j$ then it is said that $a_i$, $a_j$ are indistinguishable in $\Theta$, written $a_i \sim_\Theta a_j$, just when upon adding equality to the language (and the axioms of equality to the logic) the sentence $\Theta \wedge a_i = a_j$ is consistent. $\sim_\Theta$ is an equivalence relation.
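
In the binary example above, $a_1$ and $a_2$ are not indistinguishable in $R(a_1,a_1) \wedge \neg R(a_1,a_2) \wedge R(a_2,a_1) \wedge \neg R(a_2,a_2)$, since adding $a_1 = a_2$ would yield both $R(a_1,a_1)$ and $\neg R(a_1,a_1)$; they are indistinguishable in, say,

$R(a_1,a_1) \wedge R(a_1,a_2) \wedge R(a_2,a_1) \wedge R(a_2,a_2),$

which remains consistent when $a_1 = a_2$ is added.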

Unary case


In the special case of Unary PIL, all the predicates are unary. Formulae of the form

$\pm R_1(x) \wedge \pm R_2(x) \wedge \cdots \wedge \pm R_q(x),$

where $\pm R_i(x)$ stands for one of $R_i(x)$, $\neg R_i(x)$, are called atoms. It is assumed that they are listed in some fixed order as $\alpha_1(x), \alpha_2(x), \ldots, \alpha_{2^q}(x)$.

A state description specifies an atom for each constant involved in it, and it can be written as a conjunction of these atoms instantiated by the corresponding constants. Two constants are indistinguishable in the state description if it specifies the same atom for both of them.
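
For instance, with two unary predicates $R_1, R_2$ (an illustrative choice), there are $2^2 = 4$ atoms, which may be listed as

$\alpha_1(x) = R_1(x) \wedge R_2(x), \quad \alpha_2(x) = R_1(x) \wedge \neg R_2(x), \quad \alpha_3(x) = \neg R_1(x) \wedge R_2(x), \quad \alpha_4(x) = \neg R_1(x) \wedge \neg R_2(x),$

and $\alpha_1(a_1) \wedge \alpha_3(a_2)$ is then a state description for $a_1, a_2$ in which the two constants are distinguishable.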

Central question


Assume a rational agent inhabits a structure in $TL$ but knows nothing about which one it is. What probability function $w$ should the agent adopt when $w(\theta)$ is to represent the agent's degree of belief that a sentence $\theta$ is true in this ambient structure?

Rational principles


General rational principles


The following principles have been proposed as desirable properties of a rational prior probability function $w$ for $L$.

The constant exchangeability principle, Ex. The probability of a sentence $\theta(a_{i_1}, \ldots, a_{i_n})$ does not change when the constants $a_{i_1}, \ldots, a_{i_n}$ in it are replaced by any other $n$-tuple of (distinct) constants.
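
For example, for a binary predicate $R$ (illustrative), Ex requires

$w(R(a_1,a_2) \wedge \neg R(a_2,a_1)) = w(R(a_5,a_3) \wedge \neg R(a_3,a_5)).$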

The principle of predicate exchangeability, Px. If $R$, $R'$ are predicates of the same arity then for a sentence $\theta$,

$w(\theta) = w(\theta')$

where $\theta'$ is the result of simultaneously replacing $R$ by $R'$ and $R'$ by $R$ throughout $\theta$.

The strong negation principle, SN. For a predicate $R$ and sentence $\theta$,

$w(\theta) = w(\theta')$

where $\theta'$ is the result of simultaneously replacing $R$ by $\neg R$ and $\neg R$ by $R$ throughout $\theta$.

The principle of regularity, Reg. If a quantifier-free sentence $\theta$ is satisfiable then $w(\theta) > 0$.

The principle of super regularity (universal certainty), SReg. If a sentence $\theta$ is satisfiable then $w(\theta) > 0$.

The constant irrelevance principle, IP. If sentences $\theta$ and $\varphi$ have no constants in common then $w(\theta \wedge \varphi) = w(\theta) \cdot w(\varphi)$.

The weak irrelevance principle, WIP. If sentences $\theta$ and $\varphi$ have no constants nor predicates in common then $w(\theta \wedge \varphi) = w(\theta) \cdot w(\varphi)$.

Language invariance principle, Li. There is a family of probability functions $w^{\mathcal{L}}$, one on each language $\mathcal{L}$, all satisfying Px and Ex, and such that $w^{L} = w$ and if all predicates of $\mathcal{L}$ belong also to $\mathcal{L}'$ then $w^{\mathcal{L}}$ and $w^{\mathcal{L}'}$ agree on sentences of $\mathcal{L}$.

The (strong) counterpart principle, CP. If $\theta$, $\theta'$ are sentences such that $\theta'$ is the result of replacing some constant/relation symbols in $\theta$ by new constant/relation symbols of the same arity not occurring in $\theta$ then

$w(\theta \mid \theta') \geq w(\theta).$

(SCP) If moreover $\theta''$ is the result of replacing the same and possibly also additional constant/relation symbols in $\theta$ by new constant/relation symbols of the same arity not occurring in $\theta$ then

$w(\theta \mid \theta') \geq w(\theta \mid \theta'').$

The Invariance Principle, INV. If $\sigma$ is an isomorphism of the Lindenbaum–Tarski algebra of sentences of $L$ supported by some permutation $\tau$ of $TL$, in the sense that for sentences $\theta$ and structures $M \in TL$,

$M \models \sigma(\theta)$

just when

$\tau(M) \models \theta,$

then $w(\sigma(\theta)) = w(\theta)$.

The Permutation Invariance Principle, PIP. As INV except that $\sigma$ is additionally required to map (equivalence classes of) state descriptions to (equivalence classes of) state descriptions.

The Spectrum Exchangeability Principle, Sx. The probability $w(\Theta)$ of a state description $\Theta$ depends only on the spectrum of $\Theta$, that is, on the multiset of sizes of the equivalence classes with respect to the equivalence relation $\sim_\Theta$.
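
For example, if a state description $\Theta$ for $a_1, \ldots, a_5$ has $a_1 \sim_\Theta a_3$ and $a_2 \sim_\Theta a_4 \sim_\Theta a_5$ (and no further identifications), its equivalence classes are $\{a_1, a_3\}$ and $\{a_2, a_4, a_5\}$, so its spectrum is the multiset

$\{2, 3\},$

and Sx requires $w(\Theta) = w(\Theta')$ for every state description $\Theta'$ for five constants with spectrum $\{2, 3\}$.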

Li with Sx. As the Language Invariance Principle but all the probability functions in the family also satisfy Spectrum Exchangeability.

The Principle of Induction, PI. Let $\Theta$ be a state description and $a$ a constant not appearing in $\Theta$. Let $\Phi$, $\Psi$ be state descriptions extending $\Theta$ to include (just) $a$. If $a$ is $\sim_\Phi$-equivalent to some constant of $\Theta$ and to at least as many constants as it is $\sim_\Psi$-equivalent to, then $w(\Phi \mid \Theta) \geq w(\Psi \mid \Theta)$.

Further rational principles for unary PIL


The Principle of Instantial Relevance, PIR. For a sentence $\theta$, atom $\alpha$ and constants $a_i$, $a_j$ not appearing in $\theta$,

$w(\alpha(a_j) \mid \alpha(a_i) \wedge \theta) \geq w(\alpha(a_j) \mid \theta).$

The Generalized Principle of Instantial Relevance, GPIR. For quantifier-free formulas $\theta(x)$, $\varphi(x)$ and a quantifier-free sentence $\psi$ in which the constants $a_i$, $a_j$ do not appear, if $\varphi(x) \models \theta(x)$ then

$w(\theta(a_j) \mid \varphi(a_i) \wedge \psi) \geq w(\theta(a_j) \mid \psi).$

Johnson Sufficientness Principle, JSP. For a state description $\Theta$ for $n$ constants, an atom $\alpha_i$ and a constant $a$ not appearing in $\Theta$, the probability

$w(\alpha_i(a) \mid \Theta)$

depends only on $n$ and on the number of constants for which $\Theta$ specifies $\alpha_i$.

The Principle of Atom Exchangeability, Ax. If $\tau$ is a permutation of $\{1, 2, \ldots, 2^q\}$ and $\Theta$ is a state description expressed as a conjunction of instantiated atoms then $w(\Theta) = w(\Theta')$ where $\Theta'$ obtains from $\Theta$ upon replacing each $\alpha_i$ by $\alpha_{\tau(i)}$.

Reichenbach's Axiom, RA. Let $\alpha_{h_i}$ for $i = 1, 2, 3, \ldots$ be an infinite sequence of atoms and $\alpha$ an atom. Then as $n$ tends to infinity, the difference between the conditional probability

$w(\alpha(a_{n+1}) \mid \alpha_{h_1}(a_1) \wedge \alpha_{h_2}(a_2) \wedge \cdots \wedge \alpha_{h_n}(a_n))$

and the proportion of occurrences of $\alpha$ amongst $\alpha_{h_1}, \alpha_{h_2}, \ldots, \alpha_{h_n}$ tends to $0$.

Principle of Induction for Unary languages, UPI. For a state description $\Theta$, atoms $\alpha$, $\beta$ and a constant $a$ not appearing in $\Theta$, if $\Theta$ specifies $\alpha$ for at least as many constants as $\beta$ then

$w(\alpha(a) \mid \Theta) \geq w(\beta(a) \mid \Theta).$

Recovery. Whenever $\Theta$ is a state description then there is another state description $\Phi$ such that $\Phi \models \Theta$ and for any quantifier-free sentence $\theta$ whose constants do not appear in $\Phi$,

$w(\theta \mid \Phi) = w(\theta).$

Unary Language Invariance Principle, ULi. As Li, but with the languages restricted to the unary ones.

ULi with Ax. As ULi but with all the probability functions in the family also satisfying Atom Exchangeability.

Relationships between principles


General case


Sx implies Ex, Px and SN.

PIP + Ex implies Sx.

INV implies PIP and Ex.

Li implies CP and SCP.

Li with Sx implies PI.

Unary case


Ex implies PIR.

Ax is equivalent to PIP.

Ax+Ex implies UPI.

Ax+Ex is equivalent to Sx.

ULi with Ax implies Li with Sx.

Important probability functions


General probability functions


Functions $V_M$. For a given structure $M \in TL$ and sentence $\theta \in SL$,

$V_M(\theta) = \begin{cases} 1 & \text{if } M \models \theta, \\ 0 & \text{otherwise.} \end{cases}$

Functions $w_\Theta$. For a given state description $\Theta$, $w_\Theta$ is defined via specifying its values for state descriptions as follows. $w_\Theta(\Phi(a_{i_1}, \ldots, a_{i_m}))$ is the probability that when constants $b_1, \ldots, b_m$ are randomly picked from the constants appearing in $\Theta$, with replacement and according to the uniform distribution, then

$\Theta \models \Phi(b_1, \ldots, b_m).$

Functions $w_\Theta$ with non-standard $\Theta$. As above but employing a non-standard universe (starting with a possibly non-standard state description $\Theta$) to obtain the standard $w_\Theta$.

The $w_\Theta$ (allowing non-standard $\Theta$) are the only probability functions that satisfy Ex and IP.

Functions $u^{\bar{p},L}$. For a given infinite sequence $\bar{p} = \langle p_0, p_1, p_2, p_3, \ldots \rangle$ of non-negative real numbers such that

$p_1 \geq p_2 \geq p_3 \geq \cdots$

and $\sum_{i \geq 0} p_i = 1$,

$u^{\bar{p},L}$ is defined via specifying its values for state descriptions as follows:

For a sequence $\vec{c} = \langle c_1, c_2, \ldots, c_n \rangle$ of natural numbers and a state description $\Theta(a_1, \ldots, a_n)$, $\Theta$ is consistent with $\vec{c}$ if whenever $c_i = c_j \neq 0$ then $a_i \sim_\Theta a_j$. $N(\vec{c})$ is the number of state descriptions for $a_1, \ldots, a_n$ consistent with $\vec{c}$. $u^{\bar{p},L}(\Theta)$ is the sum, over those $\vec{c}$ with which $\Theta$ is consistent, of

$\frac{p_{c_1} p_{c_2} \cdots p_{c_n}}{N(\vec{c})}.$

The $u^{\bar{p},L}$ are the only probability functions that satisfy WIP and Li with Sx. (The language invariant family witnessing Li with Sx consists of the functions $u^{\bar{p},\mathcal{L}}$ with fixed $\bar{p}$, where $u^{\bar{p},\mathcal{L}}$ is as $u^{\bar{p},L}$ but defined with language $\mathcal{L}$.)

Further probability functions (unary PIL)


Functions $w_{\vec{x}}$. For a vector $\vec{x} = \langle x_1, x_2, \ldots, x_{2^q} \rangle$ of non-negative real numbers summing to one, $w_{\vec{x}}$ is defined via specifying its values for state descriptions as follows:

$w_{\vec{x}}(\Theta) = \prod_{i=1}^{2^q} x_i^{n_i}$

where $n_i$ is the number of constants for which $\Theta$ specifies the atom $\alpha_i$.
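
For illustration, with a single unary predicate (so the two atoms are $\alpha_1(x) = R_1(x)$ and $\alpha_2(x) = \neg R_1(x)$) and the arbitrarily chosen vector $\vec{x} = \langle \tfrac{2}{3}, \tfrac{1}{3} \rangle$,

$w_{\vec{x}}(\alpha_1(a_1) \wedge \alpha_1(a_2) \wedge \alpha_2(a_3)) = \left(\tfrac{2}{3}\right)^2 \cdot \tfrac{1}{3} = \tfrac{4}{27}.$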

The $w_{\vec{x}}$ are the only probability functions that satisfy Ex and IP (they are also expressible as functions $w_\Theta$ as in the general case above).

Carnap continuum functions $c_\lambda$. For $0 < \lambda < \infty$, the probability function $c_\lambda$ is uniquely determined by the values

$c_\lambda(\alpha_i(a_{n+1}) \mid \Theta) = \frac{n_i + \frac{\lambda}{2^q}}{n + \lambda}$

where $\Theta$ is a state description for $n$ constants not including $a_{n+1}$ and $n_i$ is the number of constants for which $\Theta$ specifies $\alpha_i$.

Furthermore, $c_\infty$ is the probability function that assigns $2^{-qn}$ to every state description for $n$ constants and $c_0$ is the probability function that assigns $2^{-q}$ to any state description in which all constants are indistinguishable, and $0$ to any other state description.
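
For illustration, with a single unary predicate ($q = 1$) and $\lambda = 2$, the defining values above reduce to Laplace's rule of succession: if $\Theta$ is a state description for $n$ constants of which $n_1$ satisfy the atom $\alpha_1$, then

$c_2(\alpha_1(a_{n+1}) \mid \Theta) = \frac{n_1 + 1}{n + 2}.$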

The $c_\lambda$ are the only probability functions that satisfy Ex and JSP.

They also satisfy Li – the functions $c_\lambda^{\mathcal{L}}$ with fixed $\lambda$, where $c_\lambda^{\mathcal{L}}$ is as $c_\lambda$ but defined with language $\mathcal{L}$, provide the unary language-invariant family members.

Functions $w^\delta$. For $0 \leq \delta \leq 1$, $w^\delta$ is the average of the functions $w_{\vec{x}^{(j)}}$ where $\vec{x}^{(j)}$ has all but one coordinate equal to each other, with the odd coordinate differing from them by $\delta$, so

$w^\delta = \frac{1}{2^q} \sum_{j=1}^{2^q} w_{\vec{x}^{(j)}}$

where $\vec{x}^{(j)} = \langle \gamma, \ldots, \gamma, \gamma + \delta, \gamma, \ldots, \gamma \rangle$ (with $\gamma + \delta$ in the $j$th place) and $\gamma = \frac{1 - \delta}{2^q}$.

For $0 \leq \delta \leq 1$, the $w^\delta$ are equal to $u^{\bar{p},L}$ for

$\bar{p} = \langle 1 - \delta, \delta, 0, 0, 0, \ldots \rangle$

and as such they satisfy Li.

The $w^\delta$ are the only functions that satisfy GPIR, Ex, Ax and Reg.

The $w^\delta$ with $\delta < 1$ are the only functions that satisfy Recovery, Reg and ULi with Ax.

Representation theorems


A representation theorem for a class of probability functions provides means of expressing every probability function in the class in terms of generic, relatively simple probability functions from the same class.

Representation Theorem for all probability functions. Every probability function $w$ for $L$ can be represented as

$w(\theta) = \int_{TL} V_M(\theta) \, d\mu(M)$

where $\mu$ is a $\sigma$-additive measure on the $\sigma$-algebra of subsets of $TL$ generated by the sets

$\{ M \in TL : M \models \theta \}, \qquad \theta \in SL.$

Representation Theorem for Ex (employing non-standard analysis and Loeb Integration Theory[2]). Every probability function $w$ for $L$ satisfying Ex can be represented as

$w(\theta) = \int_{\mathcal{B}} w_\Theta(\theta) \, d\mu(\Theta)$

where $\mathcal{B}$ is an internal set of state descriptions for $a_1, \ldots, a_\nu$ (with $\nu$ a fixed infinite natural number) and $\mu$ is a $\sigma$-additive measure on a $\sigma$-algebra of subsets of $\mathcal{B}$.

Representation Theorem for Li with Sx. Every probability function $w$ for $L$ satisfying Li with Sx can be represented as

$w(\theta) = \int_{\mathbb{D}} u^{\bar{p},L}(\theta) \, d\mu(\bar{p})$

where $\mathbb{D}$ is the set of sequences

$\bar{p} = \langle p_0, p_1, p_2, p_3, \ldots \rangle$

of non-negative reals summing to $1$ and such that $p_1 \geq p_2 \geq p_3 \geq \cdots$, and $\mu$ is a $\sigma$-additive measure on the Borel subsets of $\mathbb{D}$ in the product topology.

de Finetti's Representation Theorem (unary). In the unary case (where $L$ is a language containing $q$ unary predicates), the representation theorem for Ex is equivalent to:

Every probability function $w$ for $L$ satisfying Ex can be represented as

$w(\theta) = \int_{\mathbb{D}_{2^q}} w_{\vec{x}}(\theta) \, d\mu(\vec{x})$

where $\mathbb{D}_{2^q}$ is the set of vectors $\vec{x} = \langle x_1, \ldots, x_{2^q} \rangle$ of non-negative real numbers summing to one and $\mu$ is a $\sigma$-additive measure on $\mathbb{D}_{2^q}$.

Notes

  1. Rudolf Carnap (1971). A Basic System of Inductive Logic, in Studies in Inductive Logic and Probability, Volume 1, pp. 69–70.
  2. Cutland, N. J., Loeb measure theory, in Developments in Nonstandard Mathematics, eds. N. J. Cutland, F. Oliveira, V. Neves, J. Sousa-Pinto, Pitman Research Notes in Mathematics Series, Vol. 336, Longman Press, 1995, pp. 151–177.
