Markov kernel

In probability theory, transition probabilities describe the probabilities with which a Markov chain moves from its state at the current time of observation into the other states. More generally, a stochastic kernel or Markov kernel is a mapping that assigns to each state a probability measure on the set of states. For Markov processes with a general state space, this mapping plays the same role that the transition probabilities play for Markov chains. The definition is extended to stochastic kernels between different spaces; an example is the regular conditional probability.

General case

In the general case, a kernel specifies the probabilities with which one moves from a state to any event. Let (X, 𝒜) and (Y, ℬ) be measurable spaces. A mapping K : X × ℬ → [0, 1] is called a stochastic kernel or Markov kernel from (X, 𝒜) to (Y, ℬ) if:

  • for each x ∈ X, the map K(x, ·) is a probability measure on (Y, ℬ);
  • for each B ∈ ℬ, the map x ↦ K(x, B) is an 𝒜-measurable function.
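
To make the two defining conditions concrete, here is a minimal numerical sketch; the particular kernel, K(x, B) = P(N(x, 1) ∈ B) on the real line, is an assumption chosen for illustration and not taken from the article.

```python
# Minimal sketch (illustrative assumption): the Gaussian kernel
# K(x, (a, b]) = P(N(x, 1) lies in (a, b]) from the real line to itself.
from math import erf, sqrt

def std_normal_cdf(z):
    """Cumulative distribution function of the standard normal distribution."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def K(x, a, b):
    """Kernel evaluated on the interval (a, b]: K(x, (a, b])."""
    return std_normal_cdf(b - x) - std_normal_cdf(a - x)

# Property 1: for fixed x, K(x, .) behaves like a probability measure
# (total mass 1, here checked on a very wide interval).
print(K(0.0, -50.0, 50.0))          # ~1.0

# Property 2: for a fixed event B = (0, 1], x -> K(x, B) is an ordinary
# (measurable) function of the starting point x.
for x in (-1.0, 0.0, 1.0):
    print(x, K(x, 0.0, 1.0))
```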

Each measure μ on (X, 𝒜) is mapped, by assigning

  (μK)(B) = ∫_X K(x, B) μ(dx)   for all B ∈ ℬ,

to a measure on (Y, ℬ). This measure is commonly denoted μK. If μ is a probability measure, that is, if μ(X) = 1, then μK is also a probability measure.

If X = Y and a measure μ satisfies μK = μ, then μ is called a stationary measure. A stationary probability measure is also called a stationary distribution.
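
As a hedged illustration of μK and stationarity (the concrete kernel is an assumption chosen for this sketch, not part of the article): for the kernel K(x, ·) = N(a·x, σ²) on ℝ, the image of μ = N(0, τ²) is μK = N(0, a²τ² + σ²), so the choice τ² = σ²/(1 − a²) gives a stationary distribution. A quick Monte Carlo check with NumPy:

```python
# Sketch (assumed example): kernel K(x, .) = N(a*x, sigma^2) on the real line.
# For mu = N(0, tau^2) the image measure muK is N(0, a^2*tau^2 + sigma^2),
# so tau^2 = sigma^2 / (1 - a^2) yields a stationary distribution (muK = mu).
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 0.8, 1.0
tau2_stationary = sigma**2 / (1.0 - a**2)          # stationary variance

n = 200_000
x = rng.normal(0.0, np.sqrt(tau2_stationary), n)   # sample from mu
y = a * x + rng.normal(0.0, sigma, n)              # one kernel step: sample from muK

print(tau2_stationary)    # theoretical stationary variance (~2.778)
print(x.var(), y.var())   # both close to tau2_stationary, so muK is close to mu
```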

Note: In some definitions the arguments are written in the reverse order, K(B, x) or K(B | x), by analogy with conditional probabilities.

Discrete case

In the discrete case, where X and Y are finite or countable sets, it suffices to specify the probabilities k_ij with which state j is reached from state i. In the notation of the general case, K(i, {j}) = k_ij. These probabilities form a transition matrix (k_ij), which has the property that all entries lie between 0 and 1 and that the row sums equal 1. Such a matrix is called a stochastic matrix. It assigns to each probability distribution with probability density (f_i) the probability distribution with probability density

  g_j = Σ_i f_i k_ij,

which is computed by ordinary matrix multiplication when probability densities are regarded as row vectors.
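
A short NumPy sketch of the discrete case (the particular 3-state matrix is an assumption for illustration): the density is kept as a row vector and its image is computed by ordinary matrix multiplication.

```python
# Sketch of the discrete case (the 3x3 matrix is an illustrative assumption).
import numpy as np

# Transition matrix: entry K[i, j] = k_ij is the probability of moving
# from state i to state j; every row sums to 1 (stochastic matrix).
K = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])
print(K.sum(axis=1))        # [1. 1. 1.] -> row sums are 1

# Probability density f as a row vector; g_j = sum_i f_i * k_ij is exactly
# the row-vector / matrix product f @ K.
f = np.array([0.2, 0.5, 0.3])
g = f @ K
print(g, g.sum())           # image distribution, still sums to 1
```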

Note: In some definitions the rows and columns of the matrix are interchanged.

View as Daniell-continuous mappings and composition

Each Markov kernel K from (X, 𝒜) to (Y, ℬ) is associated, on the space of numerical, non-negative, ℬ-measurable functions, with a mapping T_K defined by

  (T_K f)(x) = ∫_Y f(y) K(x, dy)

and having the following properties:

  • T_K is additive and positively homogeneous;
  • T_K is Daniell-continuous, i.e. f_n ↑ f pointwise implies T_K f_n ↑ T_K f;
  • T_K maps the constant function 1 to the constant function 1.

Conversely, for each mapping with these properties there is exactly one kernel that is represented in this way.
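
In the discrete case the associated mapping reduces to the transition matrix acting on column vectors of function values, (T_K f)(i) = Σ_j k_ij f(j). A minimal sketch, reusing an assumed 3-state matrix:

```python
# Sketch (illustrative assumption): in the discrete case T_K acts on a
# non-negative function f, viewed as a column vector of its values, by
# (T_K f)(i) = sum_j k_ij * f(j), i.e. the matrix-vector product K @ f.
import numpy as np

K = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])

f = np.array([2.0, 0.0, 5.0])   # a non-negative function on the 3 states
print(K @ f)                    # values of T_K f

# T_K maps the constant function 1 to itself, because each K(i, .) is a
# probability measure (row sums are 1).
print(K @ np.ones(3))           # [1. 1. 1.]
```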

From the composition of these mappings a definition of the composition of the associated kernels can be derived: if K is a stochastic kernel from (X, 𝒜) to (Y, ℬ) and L is a stochastic kernel from (Y, ℬ) to (Z, 𝒞), then by

  (K L)(x, C) = ∫_Y L(y, C) K(x, dy)   for x ∈ X, C ∈ 𝒞,

a stochastic kernel K L from (X, 𝒜) to (Z, 𝒞) is defined, which is referred to as the composition of K and L. In the discrete case this corresponds to the multiplication of the two transition matrices.
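
In the discrete case the composition of two kernels is therefore the product of their transition matrices; a quick check with two assumed stochastic matrices:

```python
# Sketch (assumed matrices): composing two discrete kernels corresponds to
# multiplying their transition matrices; the product is again stochastic.
import numpy as np

K = np.array([[0.5, 0.5],       # kernel from X (2 states) to Y (2 states)
              [0.2, 0.8]])
L = np.array([[0.9, 0.1, 0.0],  # kernel from Y (2 states) to Z (3 states)
              [0.3, 0.3, 0.4]])

KL = K @ L                      # (K L)(i, {k}) = sum_j K(i, {j}) * L(j, {k})
print(KL)
print(KL.sum(axis=1))           # row sums are 1 -> K L is again a Markov kernel
```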

Special Applications

Markov kernels are widely used in modelling with the help of Markov chains and hidden Markov models. In quantum physics, transition probabilities between quantum mechanical states are often investigated.
