Dependent and independent variables

Statistically dependent (also: explained) variables and independent (also: explanatory) variables are terms used in the empirical sciences. In contrast to deterministic dependencies, statistical dependencies are described not by deterministic coefficients but by estimated regression coefficients.

In a scientific experiment, the experimenter deliberately changes a manipulated variable, the independent (exogenous) variable, and measures the effect of these manipulations on a measured variable, the dependent (endogenous) variable. If the data match the values predicted by a theory, this result supports the theory.

Model approach

In a simple stochastic linear model

y = β₀ + β₁·x + ε

y denotes the response (see stimulus-response model); it is also called the variable of interest, the endogenous variable, or the target variable. x denotes the explanatory variable; it is also called the independent variable, the predictor variable, or an exogenous variable. ε denotes the residual: over a series of measurements it contains all stochastic error components of the measurement relative to the ideal deterministic model.

Typically, one chooses as the response a variable that has a natural variability. A simple example is the representation of body weight in kg (here Y) as a function of body height in cm (here X). It is evident that the response Y and the covariate X are not interchangeable, since body height remains unchanged after a certain age.
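As a minimal sketch of the simple linear model above, the coefficients can be estimated by ordinary least squares; the height/weight pairs below are hypothetical illustration values, not real measurements.

```python
# Ordinary least squares fit of the simple linear model
#   weight = b0 + b1 * height + residual
# using hypothetical (made-up) height/weight pairs.
heights = [160.0, 170.0, 175.0, 180.0, 190.0]  # covariate X in cm
weights = [56.0, 64.0, 70.0, 76.0, 84.0]       # response Y in kg

n = len(heights)
mean_x = sum(heights) / n
mean_y = sum(weights) / n

# Slope: sample covariance of X and Y divided by sample variance of X.
b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(heights, weights))
      / sum((x - mean_x) ** 2 for x in heights))
b0 = mean_y - b1 * mean_x

# The residuals collect the stochastic error component of each measurement.
residuals = [y - (b0 + b1 * x) for x, y in zip(heights, weights)]
```

Note that the roles are not interchangeable here: regressing height on weight would in general yield a different line, even though the correlation between the two series is the same in both directions.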

Measurement

Although its importance is controversial, it is often noted that common-method variance can occur when the dependent and the independent variable are measured with the same method, for example in the same questionnaire.

Regression

In regression, regressors denote explanatory variables (statistically independent variables) that exert a decisive influence on the variable to be explained (the regressand, the statistically dependent variable). The regressand is always the dependent variable; a regressor is considered significant if its influence on the regressand is detected as non-random. Besides significance, further properties such as independence and completeness can be formulated.

The correlation coefficient merely indicates the degree of dependence between two variables. The assumed statistical independence of the regressors is only a model assumption and must be confirmed with the help of statistical methods. In empirical data there is always a certain degree of multicollinearity, which need not be disruptive. However, there is no fixed limit above which multicollinearity counts as "serious".
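As a small illustration with made-up numbers, the Pearson correlation coefficient quantifies only the degree of linear dependence and is symmetric in the two variables, so it says nothing about the direction of influence:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equally long series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical data: r measures the degree of dependence only.
a = [1.0, 2.0, 3.0, 4.0, 5.0]
b = [2.0, 1.0, 4.0, 3.0, 5.0]

r = pearson_r(a, b)
# Symmetry: swapping the variables leaves r unchanged.
assert pearson_r(a, b) == pearson_r(b, a)
```

In contrast, the regression coefficients change when the roles of regressand and regressor are swapped, which is why only the chosen model fixes the direction of the relationship.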

Complex dependencies, such as those of the regressors among each other (multicollinearity) or of the residuals (autocorrelation), lead to inefficiency of the methods used to determine the influence of the regressors. The completeness assumption requires that all relevant explanatory variables be included in the model.

Only regression analysis or analysis of variance evaluates the relationship of the variables with respect to a previously chosen model. Going beyond the correlation coefficient, the direction of the relationship in the model is uniquely determined by the regression coefficients.

Special features of the regression analysis

Regressors are part of regression analysis and must not be regarded as independent variables without qualification. Regression analysis is a structure-testing procedure and makes independence and other assumptions about the regressors. A regressor is thus more than an independent variable, because its independence is merely assumed. Whether a regressor actually has this property can be determined only in the course of the analysis. It is therefore not clear a priori (in advance) whether a regressor is actually an explanatory variable and whether it exerts a significant effect on the regressand.

Examples

Example 1: In an equation, for example a physical formula, one variable is often regarded as the one that must change (the dependent variable) in order to satisfy the equation when any of the other variables (the independent variables) are changed.

Example 2: In general, one can assume that both the advertising expenditure for products, which can be set freely, and their quality, which can be set independently of it, have a positive effect on sales. With the help of statistical analysis one can now check whether more advertising or more quality can significantly increase sales, and how strong the (relative) effect of the two regressors advertising expenditure and quality on sales is.
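A sketch of such an analysis with purely hypothetical figures: the two regressors below are constructed to be uncorrelated, so each OLS coefficient can be computed on its own (with real data, multicollinearity would make this decoupling invalid and a joint fit would be needed).

```python
# Hypothetical data: advertising spend, product quality, and sales.
# The two regressors are chosen uncorrelated here, so each OLS
# coefficient can be computed separately (no multicollinearity).
ads     = [1.0, 2.0, 3.0, 4.0]
quality = [11.0, 9.0, 9.0, 11.0]
sales   = [40.0, 36.0, 38.0, 46.0]

def dev(v):
    """Deviations of a series from its mean."""
    m = sum(v) / len(v)
    return [x - m for x in v]

da, dq, dy = dev(ads), dev(quality), dev(sales)

# Zero sample covariance between the regressors:
assert abs(sum(a * q for a, q in zip(da, dq))) < 1e-12

# Relative effect of each regressor on sales:
b_ads = sum(a * y for a, y in zip(da, dy)) / sum(a * a for a in da)
b_quality = sum(q * y for q, y in zip(dq, dy)) / sum(q * q for q in dq)
```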

Example 3: There is an unlabelled knob on a radio. One possible hypothesis is that the volume can be adjusted with it. Prediction: turning the knob in one direction should make the radio quieter, turning it in the other direction louder. In the experiment this prediction is tested.

Example 4: If, for example, the constant speed of a uniformly moving body is given, one can examine how the distance travelled (the dependent variable) behaves when the time (the independent variable) is varied. Conversely, one can also examine after what time (now the dependent variable) a certain predetermined distance (now the independent variable) has been covered.
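The role swap in this example can be sketched directly; the speed and the numbers are arbitrary illustrative values:

```python
# Uniform motion: distance = speed * time, with a fixed speed.
# Which variable counts as "dependent" depends on the question asked.
speed = 3.0  # constant speed in m/s (arbitrary illustrative value)

def distance_after(t):
    """Time as independent variable, distance as dependent variable."""
    return speed * t

def time_to_cover(s):
    """Roles swapped: distance independent, time dependent."""
    return s / speed
```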

Example 5: Suppose you want to know whether the colour of a car influences its sales. Then the colour is the independent variable and the sales figure the dependent variable. If, however, you want to know whether previous sales influence the colour, then sales are the independent variable and the colour the dependent variable.
