Interesting Esoterica

A categorical foundation for Bayesian probability

Article by Culbertson, Jared and Sturtz, Kirk
  • Published in 2012
  • Added on 2012-05-08
In the collection Probability and statistics
Given two measurable spaces $H$ and $D$ with countably generated $\sigma$-algebras, a prior probability measure $P_H$ on $H$ and a sampling distribution $\mathcal{S}:H \rightarrow D$, there is a corresponding inference map $\mathcal{I}:D \rightarrow H$ which is unique up to a set of measure zero. Thus, given a data measurement $\mu:1 \rightarrow D$, a posterior probability $\hat{P}_H=\mathcal{I} \circ \mu$ can be computed. This procedure is iterative: with each updated probability $P_H$, we obtain a new joint distribution which in turn yields a new inference map $\mathcal{I}$, and the process repeats with each additional measurement. The main result shows that the assumption of Polish spaces to obtain regular conditional probabilities is not necessary---countably generated spaces suffice. This less stringent condition then allows for non-trivial decision rules (Eilenberg--Moore algebras) on finite (as well as non-finite) spaces, and also provides a common framework for decision theory and Bayesian probability.
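To make the construction concrete, here is a minimal finite-space sketch in Python (not taken from the paper; the names prior, S and inference_map are illustrative). The prior is a probability vector, the sampling distribution $\mathcal{S}$ is a row-stochastic matrix, the inference map $\mathcal{I}$ is recovered from the joint distribution by Bayes' rule, and the posterior is $\mathcal{I}$ applied to an observed data point.

import numpy as np

prior = np.array([0.5, 0.5])              # P_H on a two-point hypothesis space H
S = np.array([[0.9, 0.1],                 # S[h, d]: probability of datum d given hypothesis h
              [0.2, 0.8]])

def inference_map(prior, S):
    """Return I with I[d, h] = P(h | d), the kernel D -> H induced by the joint distribution."""
    joint = prior[:, None] * S            # joint[h, d] = P_H(h) * S(h, d)
    marginal = joint.sum(axis=0)          # pushforward P_D(d) of the prior along S
    return (joint / marginal).T           # rows indexed by data points d

I = inference_map(prior, S)
posterior = I[1]                          # updated P_H after observing the datum d = 1
print(posterior)                          # [0.1111..., 0.8888...]

# The procedure iterates: the posterior becomes the new prior and yields a new inference map.
I_next = inference_map(posterior, S)

In this finite setting, composition of kernels is just matrix multiplication, so the posterior $\hat{P}_H = \mathcal{I} \circ \mu$ is literally the row of the inference map selected by the data measurement $\mu$ (a point mass on the observed datum).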

Links

  • http://arxiv.org/abs/1205.1488
  • http://arxiv.org/pdf/1205.1488v3

Other information

key
Culbertson2012
type
article
date_added
2012-05-08
date_published
2012-05-01
arxivId
1205.1488
keywords
bayesian probability,categorical foundation for probability,giry,monad,msc 2000 subject,primary 60a05,probabilistic logic,regular conditional probability,secondary 62c10
pages
18

BibTeX entry

@article{Culbertson2012,
	key = {Culbertson2012},
	type = {article},
	title = {A categorical foundation for Bayesian probability},
	author = {Culbertson, Jared and Sturtz, Kirk},
	abstract = {Given two measurable spaces $H$ and $D$ with countably generated $\sigma$-algebras, a prior probability measure $P_H$ on $H$ and a sampling distribution $\mathcal{S}:H \rightarrow D$, there is a corresponding inference map $\mathcal{I}:D \rightarrow H$ which is unique up to a set of measure zero. Thus, given a data measurement $\mu:1 \rightarrow D$, a posterior probability $\hat{P}_H=\mathcal{I} \circ \mu$ can be computed. This procedure is iterative: with each updated probability $P_H$, we obtain a new joint distribution which in turn yields a new inference map $\mathcal{I}$, and the process repeats with each additional measurement. The main result shows that the assumption of Polish spaces to obtain regular conditional probabilities is not necessary---countably generated spaces suffice. This less stringent condition then allows for non-trivial decision rules (Eilenberg--Moore algebras) on finite (as well as non-finite) spaces, and also provides a common framework for decision theory and Bayesian probability.},
	comment = {},
	date_added = {2012-05-08},
	date_published = {2012-05-01},
	urls = {http://arxiv.org/abs/1205.1488,http://arxiv.org/pdf/1205.1488v3},
	collections = {Probability and statistics},
	url = {http://arxiv.org/abs/1205.1488 http://arxiv.org/pdf/1205.1488v3},
	archivePrefix = {arXiv},
	arxivId = {1205.1488},
	eprint = {1205.1488},
	keywords = {bayesian probability,categorical foundation for probability,giry,monad,msc 2000 subject,primary 60a05,probabilistic logic,regular conditional probability,secondary 62c10},
	month = {may},
	pages = 18,
	year = 2012,
	urldate = {2012-05-08}
}