Moments of random variables are parameters of descriptive statistics and play an important theoretical role in probability theory. The notions of mean, variance, skewness and kurtosis used to describe a random variable are closely related to its moments.
The concept of a moment also plays an important role outside probability theory, for example in engineering mechanics. If a distribution is interpreted as an assignment of mass along the abscissa, then the first moment (the mean) corresponds to the centre of mass, and the second central moment (the variance) corresponds to the moment of inertia of this mass distribution with respect to rotation about an axis perpendicular to the abscissa through the centre of mass.
A distribution function is uniquely determined by specifying all moments of the corresponding random variable, provided the moments exist and the series of the moment-generating function converges. Determining a distribution from prescribed moments is called the moment problem.
There are distributions whose moments exist only up to a certain order. These include the t-distribution, whose moments exist only for orders smaller than the number of degrees of freedom. In the special case of the Cauchy distribution, not even the first moment (the mean) exists; the same holds for the Lévy distribution.
Let X be a random variable and k a natural number. Then the moment of order k of X, or the k-th moment of X for short, is defined as the expected value of the k-th power of X (provided this expected value exists):

m_k := E(X^k).
The k-th absolute moment of X is defined as the expected value of the k-th power of the absolute value of X:

M_k := E(|X|^k).
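Both definitions can be estimated from samples by replacing the expected value with a sample average. A minimal sketch using NumPy (the function names and the standard-normal test data are illustrative choices, not part of the text):

```python
import numpy as np

def moment(x, k):
    """Estimate the k-th (raw) moment E[X^k] from samples x."""
    return np.mean(np.asarray(x, dtype=float) ** k)

def abs_moment(x, k):
    """Estimate the k-th absolute moment E[|X|^k] from samples x."""
    return np.mean(np.abs(np.asarray(x, dtype=float)) ** k)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)  # standard normal: E[X] = 0, E[X^2] = 1

print(moment(x, 1))      # close to 0
print(moment(x, 2))      # close to 1
print(abs_moment(x, 1))  # close to sqrt(2/pi) = 0.7979...
```

By the law of large numbers these sample moments converge to the true moments, provided the corresponding moment exists.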
In theoretical investigations, moments of non-integer order are sometimes considered as well.
The existence of moments of a given order provides general information about how the probability mass is distributed.
The first moment is the expected value. It is usually denoted by μ and can be interpreted as the average or mean value.
Representation for real random variables
If X is a real random variable defined on a probability space with distribution function F, then it follows from the definition of the expected value that

m_k = E(X^k) = ∫ x^k dF(x).
If X is a continuous random variable with density function f, then

m_k = ∫_{−∞}^{∞} x^k f(x) dx,
and for a discrete random variable with values x_i and associated probabilities p_i = P(X = x_i),

m_k = Σ_i x_i^k p_i.
With the help of the Lebesgue integral with respect to the probability measure P, these cases can be written uniformly as

m_k = ∫ X^k dP.
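The continuous and discrete representations can be checked numerically. A minimal sketch, assuming NumPy; the Exp(1) density and the die-roll distribution are illustrative examples, not taken from the text:

```python
import numpy as np

k = 2  # order of the moment to compute

# Continuous case: Exp(1) with density f(x) = e^{-x} for x >= 0.
# Theory: E[X^k] = k!, so E[X^2] = 2.
# Midpoint-rule approximation of the integral of x^k f(x) dx on [0, 50].
dx = 0.001
x = np.arange(0.0, 50.0, dx) + dx / 2  # midpoints of the grid cells
continuous = np.sum(x**k * np.exp(-x)) * dx

# Discrete case: a fair die with values 1..6, each with probability 1/6.
# E[X^2] = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6.
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)
discrete = np.sum(values**k * probs)

print(continuous)  # close to 2
print(discrete)    # 91/6 = 15.1666...
```

The truncation of the integral at 50 is harmless here because the exponential tail beyond that point is negligibly small.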
Central moments
In addition to the moments defined above, the central moments can be defined, in which the distribution of the probability mass around the expected value of the random variable is considered:

μ_k := E((X − E(X))^k).
From the definition it follows immediately that the first central moment is always 0:

μ_1 = E(X − E(X)) = E(X) − E(X) = 0.
The second central moment is the variance:

μ_2 = E((X − E(X))^2) = Var(X).
The third central moment yields, after normalisation by σ^3, the skewness:

γ = μ_3 / σ^3.
The fourth central moment yields, after normalisation by σ^4, the kurtosis:

β = μ_4 / σ^4.
Skewness and kurtosis are referred to as higher moments. Kurtosis is often used as a measure of the deviation from the normal distribution, and skewness as a measure of the deviation from a symmetric distribution.
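Central moments, skewness and kurtosis can be estimated from samples in the same way as the raw moments. A minimal sketch assuming NumPy (function names are illustrative); for the normal distribution the skewness is 0 and the kurtosis is 3, which the sample estimates should approximately reproduce:

```python
import numpy as np

def central_moment(x, k):
    """Estimate the k-th central moment E[(X - E X)^k] from samples x."""
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean()) ** k)

def skewness(x):
    """Third central moment normalised by sigma^3."""
    return central_moment(x, 3) / central_moment(x, 2) ** 1.5

def kurtosis(x):
    """Fourth central moment normalised by sigma^4 (3 for a normal)."""
    return central_moment(x, 4) / central_moment(x, 2) ** 2

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)

print(skewness(x))  # close to 0: the normal distribution is symmetric
print(kurtosis(x))  # close to 3: the normal's reference kurtosis
```

Some texts and libraries report the excess kurtosis μ_4/σ^4 − 3 instead, so that the normal distribution gets the value 0.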
Moments and characteristic function
By repeatedly differentiating the formula for the characteristic function φ_X(t) = E(e^{itX}), one obtains a representation of the ordinary moments in terms of the characteristic function:

E(X^k) = φ_X^{(k)}(0) / i^k.
The k-th moment can also be determined using the moment-generating function M_X(t) = E(e^{tX}), provided it exists in a neighbourhood of 0:

E(X^k) = M_X^{(k)}(0).
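This differentiation can be carried out symbolically. A minimal sketch, assuming SymPy is available; the moment-generating function of the normal distribution N(μ, σ²), M(t) = exp(μt + σ²t²/2), is used as the worked example:

```python
import sympy as sp

t = sp.symbols('t')
mu, sigma = sp.symbols('mu sigma', positive=True)

# Moment-generating function of N(mu, sigma^2)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

# k-th moment = k-th derivative of M, evaluated at t = 0
m1 = sp.diff(M, t, 1).subs(t, 0)               # first moment: mu
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # second moment: mu^2 + sigma^2

print(m1, m2)
```

The second moment μ² + σ² is consistent with Var(X) = E(X²) − E(X)² = σ².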
Markov 's inequality
The importance of the moments is illustrated by the following theorem:
If the k-th absolute moment E(|X|^k) of the random variable X exists, then for every ε > 0,

P(|X| ≥ ε) ≤ E(|X|^k) / ε^k.
This is Markov's inequality, which gives information about the probability of values of X of large absolute magnitude. In the special case k = 2, applied to X − E(X), it yields the well-known Chebyshev inequality

P(|X − E(X)| ≥ ε) ≤ Var(X) / ε^2,

which makes a statement about the probability of large deviations of a random variable from its expected value.
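The Chebyshev bound can be verified empirically. A minimal sketch assuming NumPy; the N(5, 2²) distribution and the deviation ε = 3 are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=2.0, size=500_000)
eps = 3.0

# Empirical probability of a deviation of at least eps from the mean ...
empirical = np.mean(np.abs(x - x.mean()) >= eps)

# ... compared with the Chebyshev bound Var(X) / eps^2 = 4/9.
bound = x.var() / eps**2

print(empirical, bound)  # the empirical value stays below the bound
```

For this example the true probability is about 0.13, well below the bound of roughly 0.44; Chebyshev's inequality is valid for every distribution with finite variance, which is why it is often far from tight.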
Moments of several random variables

The moment concept can be extended to several random variables. In the case of two random variables X and Y, the joint moments of X and Y are defined as

m_{kl} := E(X^k Y^l) = ∬ x^k y^l f(x, y) dx dy

with the joint density f(x, y). The joint central moments of X and Y are analogously defined as

μ_{kl} := E((X − E(X))^k (Y − E(Y))^l).

In particular, μ_{11} = Cov(X, Y) is the covariance of X and Y.
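The identification of the (1,1) joint central moment with the covariance can be illustrated numerically. A minimal sketch assuming NumPy; the linear construction of Y from X is an illustrative choice giving Cov(X, Y) = 0.5:

```python
import numpy as np

def joint_central_moment(x, y, k, l):
    """Estimate E[(X - E X)^k (Y - E Y)^l] from paired samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.mean((x - x.mean()) ** k * (y - y.mean()) ** l)

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)  # correlated with x: Cov(X, Y) = 0.5

cov = joint_central_moment(x, y, 1, 1)  # the (1,1) central moment
print(cov)  # close to 0.5, matching np.cov(x, y)[0, 1]
```

Setting k = 2, l = 0 in the same function recovers the ordinary variance of X, consistent with the one-variable definitions above.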