Basis (linear algebra)

In linear algebra, a basis is a subset of a vector space with the help of which every vector of the space can be represented uniquely as a finite linear combination. The coefficients of this linear combination are called the coordinates of the vector with respect to this basis; an element of the basis is called a basis vector. If confusion with other basis concepts is possible (e.g. the Schauder basis), such a subset is also called a Hamel basis (after Georg Hamel). A vector space generally has several different bases; a change of basis corresponds to a coordinate transformation.

Definition and basic concepts

A basis of a vector space V is a subset B of V with the following equivalent properties:

  1. Every vector of V can be represented uniquely as a linear combination of vectors from B.
  2. B is a minimal generating system of V.
  3. B is a maximal linearly independent subset of V.
  4. B is a linearly independent generating system of V.

The elements of a basis are called basis vectors. If the vector space is a function space, the basis vectors are also called basis functions. A basis can be described with the help of an index set I in the form B = {b_i : i ∈ I}; a finite basis, for example, in the form {b_1, …, b_n}. If such an index set is used, one usually denotes the basis by the family notation (b_i)_{i∈I} instead of the set notation.

Note that in the family notation an order relation on the index set I induces an arrangement of the basis vectors; (b_i)_{i∈I} is then called an 'ordered basis'. This is advantageous when describing the orientation of vector spaces: an index set with an order relation makes it possible to divide the bases into orientation classes (handedness). Examples: I = ℕ (countably infinite basis), I = {1, …, n} (finite basis).

The coefficients that occur in the representation of a vector v as a linear combination of vectors from the basis B are called the coordinates of v with respect to B. They are elements of the field K underlying the vector space (e.g. ℝ or ℂ). Together they form a coordinate vector, which however lives in a different vector space, the coordinate space K^I. Caution: since the assignment of the coordinates to their respective basis vectors is essential, here, in the absence of a common index set, the basis vectors themselves must be used for indexing.

Although bases are usually written down as sets, an 'indexing' given by an index set I is therefore more practical. The coordinate vectors then have the form (v_i)_{i∈I}, that is, they lie in the coordinate space K^I. If I is equipped with an order relation, the coordinate vector becomes a sequence of coordinates. In the example I = {1, 2, …, n} the coordinate vector has the form (v_1, v_2, …, v_n) ('numbered' coordinates). The coordinate space here is K^n, for real or complex vector spaces ℝ^n or ℂ^n.
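For real coordinate spaces the coordinates with respect to a basis can be computed numerically by solving a linear system. A minimal Python/NumPy sketch, with illustratively chosen vectors:

```python
import numpy as np

# An illustrative basis of R^2, written as the columns of the matrix B.
b1 = np.array([1.0, 2.0])
b2 = np.array([3.0, 1.0])
B = np.column_stack([b1, b2])

# Coordinates c of v with respect to (b1, b2): solve B @ c = v.
v = np.array([5.0, 5.0])
c = np.linalg.solve(B, v)  # c == [2., 1.]

# The representation v = c[0]*b1 + c[1]*b2 is unique.
assert np.allclose(c[0] * b1 + c[1] * b2, v)
```

The solve call succeeds exactly because (b1, b2) is a basis; for linearly dependent columns it would raise a `LinAlgError`.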

Key Features

  • Every vector space has a basis. A proof of this statement is given in the section Proof of existence.
  • All bases of a vector space have the same number of elements. This number, which may be an infinite cardinal number, is called the dimension of the vector space.
  • A subset {b_1, …, b_n} of a vector space V uniquely determines a linear map φ: K^n → V with φ(e_i) = b_i, where e_i denotes the i-th standard unit vector. This map is
  • injective if and only if the b_i are linearly independent;
  • surjective if and only if the b_i form a generating system;
  • bijective if and only if the b_i form a basis.
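These three criteria can be checked numerically through the rank of the matrix whose columns are the given vectors. A short Python/NumPy sketch with made-up example vectors:

```python
import numpy as np

# Columns of A are candidate vectors b_1, ..., b_n in K^m (here R^3).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
m, n = A.shape
rank = np.linalg.matrix_rank(A)

injective = rank == n                 # the b_i are linearly independent
surjective = rank == m                # the b_i form a generating system
bijective = injective and surjective  # the b_i form a basis
print(injective, surjective, bijective)  # True True True
```

Here the three columns are linearly independent and span R^3, so the induced map φ is bijective and the columns form a basis.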

Examples

  • In the Euclidean plane ℝ² there is the so-called standard basis {(1, 0), (0, 1)}. Moreover, two vectors form a basis of this plane if and only if they do not point in the same (or opposite) direction.
  • The standard basis of the vector space K^n is the set of canonical unit vectors {e_1, …, e_n}.
  • The standard basis of the space K^(m×n) of matrices is formed by the standard matrices E_ij, in which exactly one entry is 1 and all other entries are 0.
  • The zero vector space has dimension zero; its only basis is the empty set.
  • As a vector space over ℝ, ℂ is usually given the basis {1, i}. A set {a, b} is a basis of ℂ over ℝ if and only if a/b is not a real number. As a vector space over ℚ, ℝ has a basis, but one cannot specify it explicitly.
  • The vector space K[X] of polynomials over a field K has the basis {1, X, X², X³, …}. There are also many other bases which, although more cumbersome to write down, are more practical in concrete applications, for example the Legendre polynomials.
  • In the vector space of real number sequences, the vectors e_i = (0, …, 0, 1, 0, …) form a linearly independent system, but not a basis, since for example the sequence (1, 1, 1, …) cannot be generated from them: a combination of infinitely many vectors is not a linear combination.
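The first example, two plane vectors forming a basis exactly when they are not parallel, can be tested with a determinant. A small sketch (the vector values are arbitrary):

```python
import numpy as np

def is_basis_of_plane(u, w, eps=1e-12):
    """Two vectors form a basis of R^2 iff they are not parallel,
    i.e. iff the determinant of the 2x2 matrix (u | w) is nonzero."""
    return abs(np.linalg.det(np.column_stack([u, w]))) > eps

print(is_basis_of_plane([1.0, 2.0], [3.0, 1.0]))    # True: not parallel
print(is_basis_of_plane([1.0, 2.0], [-2.0, -4.0]))  # False: opposite direction
```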

Proof of the equivalence of the definitions

The following considerations sketch a proof that the four characterizing properties stated in this article as the definition of a basis are equivalent. (The axiom of choice or Zorn's lemma is not needed for this proof.)

  • If every vector can be represented uniquely as a linear combination of vectors from B, then B is in particular a generating system (by definition). If B were not a minimal generating system, there would be a proper subset B′ of B that is also a generating system. Now let b be an element of B that does not lie in B′. Then b can be written in at least two different ways as a linear combination of vectors from B: once as a linear combination of vectors from B′, and once as b = 1·b. This yields a contradiction, so B is minimal. Thus (1) → (2).
  • Every minimal generating system must be linearly independent. For if B is not linearly independent, then there is a vector b in B that can be represented as a linear combination of the other vectors in B. But then every linear combination of vectors from B can be rewritten as a linear combination of vectors from B \ {b}, and B would not be minimal. Thus (2) → (4).
  • Every linearly independent generating system must be a maximal linearly independent set. For if B were not maximal linearly independent, there would be a vector b (not lying in B) that together with B is linearly independent. But b can be represented as a linear combination of elements of B, which contradicts this linear independence. Consequently (4) → (3).
  • A maximal linearly independent system is a generating system: let w be an arbitrary vector. If w lies in B, then w can be written as a linear combination of elements of B. If w does not lie in B, then the set B ∪ {w} is a proper superset of B and therefore no longer linearly independent. In a resulting nontrivial linear dependence, w must occur with a coefficient a ≠ 0, since otherwise B itself would be linearly dependent; dividing by −a expresses w as a linear combination of elements of B. Since the linear independence of B also makes such a representation unique, (3) → (1).

Proof of existence

With Zorn's lemma one can prove that every vector space must have a basis, even though one often cannot specify it explicitly.

Let V be a vector space. One would like to find a maximal linearly independent subset of V. It is therefore natural to consider the system of sets

M = { A ⊆ V : A is linearly independent },

which is partially ordered by the inclusion relation ⊆. It can be shown that the union of every chain (totally ordered subset) of M is itself linearly independent and is therefore an upper bound of that chain in M.

From Zorn's lemma it now follows that M has a maximal element. The maximal elements of M are, however, exactly the maximal linearly independent subsets of V, that is, the bases of V. Therefore V has a basis, and moreover every linearly independent subset of V is contained in some basis of V.

Basis extension theorem

If a set S of linearly independent vectors is given and one starts, in the above proof, from the system of sets

M′ = { A ⊆ V : A is linearly independent and S ⊆ A },

one obtains the statement that S is contained in a maximal element of M′. Since such a maximal element again turns out to be a basis of V, this shows that every set of linearly independent vectors can be extended to a basis of V. This statement is called the basis extension theorem.
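In finite dimensions the basis extension can even be carried out constructively: greedily adjoin standard unit vectors whenever they increase the rank. A Python/NumPy sketch under the assumption that we work in R^dim (the function name and starting vectors are illustrative):

```python
import numpy as np

def extend_to_basis(vectors, dim):
    """Extend a list of linearly independent vectors in R^dim to a
    basis by greedily adding standard unit vectors that raise the rank."""
    basis = list(vectors)
    for i in range(dim):
        e_i = np.eye(dim)[i]
        candidate = basis + [e_i]
        # The rank rises iff e_i is independent of the current vectors.
        if np.linalg.matrix_rank(np.column_stack(candidate)) > len(basis):
            basis = candidate
        if len(basis) == dim:
            break
    return basis

start = [np.array([1.0, 1.0, 0.0])]
basis = extend_to_basis(start, 3)
assert len(basis) == 3
assert np.linalg.matrix_rank(np.column_stack(basis)) == 3
```

The greedy loop terminates after at most dim steps because the dim standard unit vectors already span the whole space.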

Further statements about bases

  • Exchange lemma of Steinitz (after E. Steinitz): If {b_1, …, b_n} is a basis of a vector space V and b is a further vector of V different from the zero vector, then one of the basis vectors can be 'exchanged' for b; that is, there exists an index i such that {b_1, …, b_{i−1}, b, b_{i+1}, …, b_n} is also a basis of V. This statement is often used to show that all bases of a vector space consist of the same number of vectors.
  • Every vector space is a free object over its basis. This is a universal property of vector spaces in the sense of category theory. Concretely, this means that every map from the basis set into a vector space W extends uniquely to a linear map V → W.
  • An n-dimensional vector space over a finite field with q elements has (q^n − 1)(q^n − q)(q^n − q²) ⋯ (q^n − q^(n−1)) different ordered bases.
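The last counting statement can be checked by brute force for a small case. The following sketch counts the ordered bases of the 2-dimensional space over the field with 2 elements:

```python
from itertools import product

# Brute-force check of the counting formula over F_2 = {0, 1}:
# an n-dimensional space over a field with q elements has
# (q^n - 1)(q^n - q) ... (q^n - q^(n-1)) ordered bases.
q, n = 2, 2
vectors = list(product(range(q), repeat=n))  # all vectors of (F_2)^2

def is_basis(b1, b2):
    # Two vectors form a basis of a 2-dimensional space iff the
    # determinant b1[0]*b2[1] - b1[1]*b2[0] is nonzero in F_q.
    return (b1[0] * b2[1] - b1[1] * b2[0]) % q != 0

count = sum(is_basis(b1, b2) for b1 in vectors for b2 in vectors)
expected = (q**n - 1) * (q**n - q)  # = 3 * 2 = 6
assert count == expected
```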

Basis concepts in special vector spaces

Real and complex vector spaces usually carry an additional topological structure. From this structure a concept of basis can arise that differs from the one described here.

A differing basis concept in inner product spaces

In the study of real or complex inner product spaces, particularly of Hilbert spaces, there is another, there more appropriate, way of representing the elements of the space. A basis then consists of pairwise orthogonal unit vectors, and not only finite but also infinite sums (called series) of basis vectors are allowed. Such a complete orthonormal system in an infinite-dimensional space is never a basis in the sense defined here; for better distinction it is also called a Schauder basis. The type of basis described in this article is, for distinction, also called a Hamel basis.

Auerbach bases

An Auerbach basis is a Hamel basis for a dense subspace of a normed vector space such that the distance of each basis vector from the span of the remaining vectors is equal to its norm.

Distinguishing the basis concepts

  • Both a Hamel basis and a Schauder basis are linearly independent sets of vectors.
  • A Hamel basis, or simply basis, as described in this article, is a generating set of the vector space; that is, every vector of the space can be represented as a linear combination of finitely many vectors of the Hamel basis.
  • In a finite-dimensional real or complex inner product space, an orthonormal basis (that is, a minimal generating system of normalized, pairwise orthogonal vectors) is both a Hamel basis and a Schauder basis.
  • In an infinite-dimensional, complete real or complex inner product space (in particular in an infinite-dimensional Hilbert space), a Schauder basis is never a Hamel basis, and vice versa. In the infinite-dimensional case, a Hamel basis often cannot even be orthonormalized.
  • A Hamel basis of an infinite-dimensional, separable Hilbert space consists of uncountably many elements. A Schauder basis, by contrast, consists in this case of only countably many elements.
  • In Hilbert spaces, 'basis' (without qualification) usually means a Schauder basis; in vector spaces without an inner product it always means a Hamel basis.