
# Covariance

In stochastics, the covariance is a non-standardized measure of association for a monotonic relationship between two random variables with a joint probability distribution. Its sign indicates whether high values of one random variable tend to occur together with high values or with low values of the other random variable.

## Definition

Let $X$ and $Y$ be two real-valued, integrable random variables whose product is also integrable, i.e. the expected values $\operatorname{E}(X)$, $\operatorname{E}(Y)$ and $\operatorname{E}(XY)$ exist. Then

$$\operatorname{Cov}(X, Y) := \operatorname{E}\bigl[(X - \operatorname{E}(X))\,(Y - \operatorname{E}(Y))\bigr]$$

is called the covariance of $X$ and $Y$. The required existence of the expected values is satisfied in particular if $X$ and $Y$ are square-integrable, i.e. if $\operatorname{E}(X^2) < \infty$ and $\operatorname{E}(Y^2) < \infty$ hold.
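As an illustration (a minimal sketch, not part of the original article; the joint distribution below is made up), the defining formula can be evaluated directly for a finite joint distribution:

```python
# Direct implementation of the definition
# Cov(X, Y) = E[(X - E(X)) (Y - E(Y))] for a finite joint distribution.

def covariance(joint):
    """joint maps (x, y) -> P(X = x, Y = y)."""
    ex = sum(p * x for (x, y), p in joint.items())
    ey = sum(p * y for (x, y), p in joint.items())
    return sum(p * (x - ex) * (y - ey) for (x, y), p in joint.items())

# X and Y each take the values 0 and 1 and tend to agree, so Cov(X, Y) > 0.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
cov_xy = covariance(joint)  # E(XY) - E(X)E(Y) = 0.4 - 0.25 = 0.15
```

Here $\operatorname{E}(X) = \operatorname{E}(Y) = 0.5$, so the positive value $0.15$ reflects the tendency of the two variables to agree.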

## Properties and calculation rules

### Interpretation of the covariance

• The covariance is positive if $X$ and $Y$ have a monotonic relationship, i.e. high (low) values of $X$ tend to go together with high (low) values of $Y$.
• The covariance is negative if $X$ and $Y$ have an opposing monotonic relationship, i.e. high values of one random variable tend to go together with low values of the other, and vice versa.
• If the result is zero, there is no monotonic relationship between $X$ and $Y$ (non-monotonic relationships are still possible, however).

The covariance thus indicates the direction of a relationship between two random variables, but it makes no statement about the strength of the association. This is due to the linearity of the covariance. To make relationships comparable, the covariance must be normalized. The most common normalization, by the product of the standard deviations, leads to the correlation coefficient.

### Shift theorem

For an often simpler calculation of the covariance, the shift theorem can be applied as an alternative representation of the covariance.

Theorem (shift theorem for the covariance):

$$\operatorname{Cov}(X, Y) = \operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y)$$

Proof:

$$\begin{aligned}
\operatorname{Cov}(X, Y) &= \operatorname{E}\bigl[(X - \operatorname{E}(X))(Y - \operatorname{E}(Y))\bigr] \\
&= \operatorname{E}\bigl[XY - X\operatorname{E}(Y) - \operatorname{E}(X)Y + \operatorname{E}(X)\operatorname{E}(Y)\bigr] \\
&= \operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y) - \operatorname{E}(X)\operatorname{E}(Y) + \operatorname{E}(X)\operatorname{E}(Y) \\
&= \operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y)
\end{aligned}$$
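The shift theorem can be checked numerically against the defining formula (a small sketch using a made-up finite joint distribution, not from the original text):

```python
# Checking the shift theorem Cov(X, Y) = E(XY) - E(X) E(Y)
# against the defining formula on a finite joint distribution.

joint = {(1, 2): 0.2, (1, 5): 0.3, (4, 2): 0.25, (4, 5): 0.25}

ex  = sum(p * x for (x, y), p in joint.items())       # E(X)  = 2.5
ey  = sum(p * y for (x, y), p in joint.items())       # E(Y)  = 3.65
exy = sum(p * x * y for (x, y), p in joint.items())   # E(XY) = 8.9

cov_definition = sum(p * (x - ex) * (y - ey) for (x, y), p in joint.items())
cov_shift      = exy - ex * ey   # shift theorem: 8.9 - 9.125 = -0.225
```

Both routes give the same value; the shift form needs only the three raw moments.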

### Relationship to the variance

Theorem: The covariance is a generalization of the variance, because

$$\operatorname{Var}(X) = \operatorname{Cov}(X, X)$$

Proof:

$$\operatorname{Cov}(X, X) = \operatorname{E}\bigl[(X - \operatorname{E}(X))(X - \operatorname{E}(X))\bigr] = \operatorname{E}\bigl[(X - \operatorname{E}(X))^2\bigr] = \operatorname{Var}(X)$$

The variance is therefore the covariance of a random variable with itself.

Using covariances, the variance of a sum of square-integrable random variables can also be calculated. In general,

$$\operatorname{Var}\Bigl(\sum_{i=1}^{n} X_i\Bigr) = \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2 \sum_{i=1}^{n} \sum_{j=i+1}^{n} \operatorname{Cov}(X_i, X_j)$$

For the sum of two random variables in particular, the formula

$$\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2 \operatorname{Cov}(X, Y)$$

holds.
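The variance-of-a-sum formula holds exactly for empirical moments as well, which makes it easy to verify on sample data (an illustrative sketch; the data below are simulated, not from the text):

```python
import random

# Check Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on sample data.
# The identity holds exactly for empirical (population-style) moments.

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(1000)]
ys = [0.5 * x + random.gauss(0, 1) for x in xs]  # correlated with xs

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return mean([(a - m) ** 2 for a in v])

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return mean([(a - mu) * (b - mv) for a, b in zip(u, v)])

zs  = [x + y for x, y in zip(xs, ys)]
lhs = var(zs)                                # Var(X + Y)
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)    # right-hand side of the formula
```

Since the samples are positively correlated, the variance of the sum exceeds the sum of the variances by twice the covariance.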

### Linearity, symmetry and definiteness

Theorem: The covariance is a positive semidefinite symmetric bilinear form on the vector space of square-integrable random variables.

This comprises the following three theorems:

Theorem (bilinearity): For $a, b \in \mathbb{R}$ the following holds:

$$\operatorname{Cov}(aX + b,\, Y) = a \operatorname{Cov}(X, Y)$$

$$\operatorname{Cov}(X,\, Y + Z) = \operatorname{Cov}(X, Y) + \operatorname{Cov}(X, Z)$$

Proof:

$$\operatorname{Cov}(aX + b, Y) = \operatorname{E}\bigl[(aX + b - \operatorname{E}(aX + b))(Y - \operatorname{E}(Y))\bigr] = \operatorname{E}\bigl[a (X - \operatorname{E}(X))(Y - \operatorname{E}(Y))\bigr] = a \operatorname{Cov}(X, Y)$$

$$\operatorname{Cov}(X, Y + Z) = \operatorname{E}\bigl[(X - \operatorname{E}(X))\bigl((Y - \operatorname{E}(Y)) + (Z - \operatorname{E}(Z))\bigr)\bigr] = \operatorname{Cov}(X, Y) + \operatorname{Cov}(X, Z)$$

The covariance is thus obviously invariant under the addition of constants to the random variables. Because of the symmetry shown below, the second equation also yields linearity in the first argument.

Theorem (symmetry):

$$\operatorname{Cov}(X, Y) = \operatorname{Cov}(Y, X)$$

Proof:

$$\operatorname{Cov}(X, Y) = \operatorname{E}\bigl[(X - \operatorname{E}(X))(Y - \operatorname{E}(Y))\bigr] = \operatorname{E}\bigl[(Y - \operatorname{E}(Y))(X - \operatorname{E}(X))\bigr] = \operatorname{Cov}(Y, X)$$

Theorem (positive semidefiniteness):

$$\operatorname{Cov}(X, X) \geq 0$$

Proof:

$$\operatorname{Cov}(X, X) = \operatorname{E}\bigl[(X - \operatorname{E}(X))^2\bigr] \geq 0$$

Overall, as for any positive semidefinite symmetric bilinear form, the Cauchy–Schwarz inequality follows:

$$\bigl(\operatorname{Cov}(X, Y)\bigr)^2 \leq \operatorname{Var}(X) \operatorname{Var}(Y)$$

The linearity of the covariance has the consequence that the covariance depends on the scale of the random variables. For example, one obtains ten times the covariance if one considers the random variable $10X$ instead of $X$. In particular, the value of the covariance depends on the units of measurement of the random variables. Since this property makes the absolute value of the covariance difficult to interpret, when studying a linear relationship between $X$ and $Y$ one often considers instead the scale-independent correlation coefficient

$$\rho(X, Y) = \frac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var}(X) \operatorname{Var}(Y)}}$$
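The scale dependence of the covariance, and the scale invariance of the correlation coefficient, can be seen directly on simulated data (an illustrative sketch; the data are made up):

```python
import random

# Cov(10X, Y) = 10 Cov(X, Y), while the correlation coefficient
# rho(10X, Y) = rho(X, Y) is unchanged by rescaling.

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(2000)]
ys = [x + random.gauss(0, 1) for x in xs]

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return mean([(a - mu) * (b - mv) for a, b in zip(u, v)])

def corr(u, v):
    return cov(u, v) / (cov(u, u) * cov(v, v)) ** 0.5

xs10 = [10 * x for x in xs]  # same data in different units
cov_plain,  cov_scaled  = cov(xs, ys),  cov(xs10, ys)
corr_plain, corr_scaled = corr(xs, ys), corr(xs10, ys)
```

Changing units multiplies the covariance by the scale factor but leaves the correlation coefficient untouched, which is why the latter is used to compare strengths of association.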

### Non-correlation and independence

Definition (uncorrelated): If $\operatorname{Cov}(X, Y) = 0$, the random variables $X$ and $Y$ are called uncorrelated.

Theorem: Let $X$ and $Y$ be stochastically independent random variables; then

$$\operatorname{Cov}(X, Y) = 0$$

Proof: For stochastically independent random variables, $\operatorname{E}(XY) = \operatorname{E}(X)\operatorname{E}(Y)$, so by the shift theorem

$$\operatorname{Cov}(X, Y) = \operatorname{E}(XY) - \operatorname{E}(X)\operatorname{E}(Y) = 0$$

The converse, however, is not true in general, as the following theorem shows.

Theorem: Let $\operatorname{Cov}(X, Y) = 0$; then the random variables $X$ and $Y$ are in general not stochastically independent.

Proof: Let $X$ be a random variable uniformly distributed on the interval $[-1, 1]$ and $Y = X^2$. Then $X$ and $Y$ are obviously dependent. Moreover, since $\operatorname{E}(X) = \operatorname{E}(X^3) = 0$ by symmetry,

$$\operatorname{Cov}(X, X^2) = \operatorname{E}(X^3) - \operatorname{E}(X)\operatorname{E}(X^2) = 0 - 0 \cdot \operatorname{E}(X^2) = 0$$

Stochastically independent random variables whose covariance exists are therefore also uncorrelated. Conversely, however, uncorrelatedness does not necessarily imply stochastic independence, since a non-monotonic dependence may exist that the covariance does not capture.
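The counterexample above can be reproduced numerically. As a sketch, the continuous uniform distribution on $[-1, 1]$ is replaced here by a symmetric discrete stand-in so that the computation is exact:

```python
# X uniform on the symmetric grid {-1, -0.5, 0, 0.5, 1} and Y = X^2:
# the variables are clearly dependent, yet their covariance vanishes,
# because E(X) = E(X^3) = 0 by symmetry.

xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
p = 1 / len(xs)

ex  = sum(p * x for x in xs)        # E(X)   = 0
ey  = sum(p * x * x for x in xs)    # E(X^2) > 0
exy = sum(p * x ** 3 for x in xs)   # E(X^3) = 0

cov_x_y = exy - ex * ey             # Cov(X, X^2) = 0
```

Zero covariance here certainly does not mean independence: knowing $X$ determines $Y = X^2$ completely.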

Other examples of uncorrelated, but stochastically dependent random variables:

• Let $X$ and $Y$ be random variables with
• Let the random variables $X$ and $Y$ be Bernoulli distributed with the same parameter and independent; then $X + Y$ and $X - Y$ are uncorrelated, but not independent.

## Examples

• Let $(X, Y)$ be two-dimensionally normally distributed with $\operatorname{Cov}(X, Y) = 0$; then $X$ and $Y$ are even stochastically independent, since for jointly normally distributed random variables uncorrelatedness and independence coincide.
• Let $X = (X_1, \dots, X_k)$ be $k$-dimensionally multinomially distributed with parameters $n$ and $(p_1, \dots, p_k)$; then for $i \neq j$:

$$\operatorname{Cov}(X_i, X_j) = -n p_i p_j$$
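The multinomial covariance $\operatorname{Cov}(X_i, X_j) = -n p_i p_j$ can be verified exactly for a small case by enumerating all outcome sequences (an illustrative sketch with made-up parameters $n = 3$, $p = (0.2, 0.3, 0.5)$):

```python
from itertools import product

# Exact check of Cov(X_i, X_j) = -n * p_i * p_j for a small multinomial
# distribution: n = 3 trials, category probabilities p = (0.2, 0.3, 0.5).

n = 3
p = (0.2, 0.3, 0.5)

e0 = e1 = e01 = 0.0
for seq in product(range(len(p)), repeat=n):  # all 3^3 outcome sequences
    prob = 1.0
    for c in seq:
        prob *= p[c]
    n0, n1 = seq.count(0), seq.count(1)       # counts X_0, X_1
    e0  += prob * n0
    e1  += prob * n1
    e01 += prob * n0 * n1

cov_01 = e01 - e0 * e1   # shift theorem; expected: -3 * 0.2 * 0.3 = -0.18
```

The covariance is negative: the total count is fixed at $n$, so more outcomes in one category force fewer in the others.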