Adaptive resonance theory

Adaptive Resonance Theory (ART) is an architectural concept for neural networks; it was developed by Stephen Grossberg and Gail A. Carpenter.

Basic Structure

The basic ART system belongs to the class of unsupervised learning algorithms.

Typically it consists of a comparison field F1, a recognition field F2, a vigilance parameter P and a reset module R. Neurons in F1 represent attributes, neurons in F2 represent categories; the two fields therefore usually consist of different numbers of neurons. F1 and F2 are fully connected in both directions. Each connection carries a specific weight, and these weights form the long-term memory of ART. All weights from F1 to F2 are collected in a weight matrix W, the weights in the opposite direction in a weight matrix Z. The vigilance parameter has considerable influence on the system: a high value leads to fine-grained memories (many small categories), a low value leads to more abstraction (fewer, coarser categories).
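A minimal sketch of this structure in Python (the field sizes, the vigilance value and the initial weight values are illustrative assumptions, not part of the theory itself) could look like this:

    import numpy as np

    # assumed field sizes: m attribute neurons in F1, n category neurons in F2
    m, n = 16, 8
    P = 0.75                                # vigilance: high -> many fine-grained categories

    # long-term memory of the network
    W = np.full((n, m), 1.0 / (1.0 + m))    # bottom-up weights F1 -> F2 (matrix W)
    Z = np.ones((n, m))                     # top-down weights  F2 -> F1 (matrix Z)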

Classification

For classification, the attributes of a one-dimensional vector I are applied to the F1 neurons. The signal is weighted by W and transmitted to the F2 neurons. The F2 neuron that receives the strongest signal determines the category to which I belongs. The output of all other F2 neurons is set to zero.

The result is weighted by Z and returned to the F1 layer. If the difference between I and the returned signal lies within the bound set by P, the system is in resonance and the weights can be trained. Otherwise, the reset signal disables the active F2 neuron until a new input vector is applied. A new cycle follows, which leads to a different F2 neuron (a different category). During the search, category after category is tried in this way until either a category has been found that is sufficiently similar or all categories have been compared.

If I does not fit any of the stored categories, a previously unused F2 neuron is used and its weights are initialized with the values of I. If no free F2 neuron is available (memory exhausted), the new pattern cannot be learned.

In a state of resonance, the weights of the selected category are adapted to the attributes of I. This is where the system gets its name: it learns in a state of resonance by adjusting stored categories to new cases, provided that the new case is not too different from the stored category.
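The search cycle described above can be summarized in a short sketch (ART 1 style, binary inputs; the helper names, the match formula and the fast-learning update at the end follow common ART 1 formulations and are assumptions, not a literal reproduction of the original equations):

    import numpy as np

    def classify(I, W, Z, P):
        # search for a resonating F2 category for the binary input vector I
        n = W.shape[0]
        disabled = np.zeros(n, dtype=bool)       # reset module: temporarily switched-off categories
        while not disabled.all():
            T = W @ I                            # signal from F1, weighted by W
            T[disabled] = -np.inf
            j = int(np.argmax(T))                # the F2 neuron with the strongest signal wins
            match = np.minimum(Z[j], I).sum() / max(I.sum(), 1e-9)   # compare returned signal with I
            if match >= P:                       # resonance: category j is similar enough
                Z[j] = np.minimum(Z[j], I)       # fast-learning update (illustrative)
                W[j] = 2.0 * Z[j] / (1.0 + Z[j].sum())
                return j
            disabled[j] = True                   # reset signal: try another category
        return None                              # memory exhausted, the pattern cannot be learned

In this sketch, uncommitted F2 neurons keep their initial top-down weights of 1 and therefore always resonate, so a new input that fits no stored category is assigned to a previously unused neuron, as described above.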

Training

There are two different ways to train ART networks: slow and fast.

Slow learning takes place once a suitable category has been found (in analogy to oscillating systems, this state is called resonance). The weights of this category are adjusted toward the attributes of I using differential equations. The degree of adjustment depends on how long I is presented.

In fast learning, the adaptation of the weights is determined by algebraic equations.

While fast learning is effective and efficient for many applications, slow learning is more biologically plausible.
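The difference can be illustrated with a small sketch (the concrete update rules are assumptions chosen for illustration; the fast rule mirrors the ART 1 update, the slow rule is a simple differential equation integrated step by step):

    import numpy as np

    def fast_learn(w, I):
        # fast learning: the new prototype follows directly from an algebraic equation
        # (ART 1 style: keep only the attributes shared by prototype and input)
        return np.minimum(w, I)

    def slow_learn(w, I, rate=0.05, steps=10):
        # slow learning: numerically integrate dw/dt = rate * (min(w, I) - w);
        # the longer I is presented (more steps), the closer w moves to the fast-learning result
        for _ in range(steps):
            w = w + rate * (np.minimum(w, I) - w)
        return w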

ART types

ART 1

ART 1 is the simplest and original variant. It can only process binary attributes.

ART 2

ART 2 extends ART 1 and enables the processing of continuous attributes.

ART 2-A

ART 2-A is a streamlined form of ART 2 with drastically improved running time and only rarely poorer results.

ART 3

ART 3 builds on ART 2 and additionally mimics chemical processes of biological neural networks.

Fuzzy ART

Fuzzy ART extends ART 1 by the use of fuzzy logic. This allows a gradual assignment to several categories simultaneously. Complement coding of the attributes makes it possible to handle missing attributes, which reduces unnecessary proliferation of categories.
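A brief sketch of complement coding and the fuzzy category functions (the choice and match formulas follow the usual Fuzzy ART formulation; the parameter alpha is an assumed choice parameter):

    import numpy as np

    def complement_code(a):
        # complement coding: each attribute a_i in [0, 1] is stored together with 1 - a_i,
        # so missing or absent attributes are represented explicitly
        return np.concatenate([a, 1.0 - a])

    def choice_and_match(I, w, alpha=0.001):
        overlap = np.minimum(I, w).sum()    # fuzzy AND = component-wise minimum
        T = overlap / (alpha + w.sum())     # choice value used to select the F2 winner
        m = overlap / I.sum()               # match value compared against the vigilance P
        return T, m

    I = complement_code(np.array([0.2, 0.9]))   # example attribute vector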

Distributed ART


ARTMAP

ARTMAP, also called Predictive ART, combines two ART modules through a connection V into a supervised learning system. The first module processes the attribute vector I and the second the result vector C. The control system S adjusts the vigilance parameter of the attribute module depending on the result categories. In this way, attribute categories of different coarseness can be produced.

If only a simple classification (supervised learning) is required, the second module can be simplified.
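A compact sketch of such a simplified ARTMAP (the result vector is replaced by a class label; the prototype representation, the match formula and the match-tracking rule are illustrative assumptions):

    import numpy as np

    def art_winner(x, prototypes, vigilance):
        # return the best-fitting category whose match reaches the vigilance, else None
        order = np.argsort([-np.minimum(p, x).sum() for p in prototypes])
        for j in order:
            match = np.minimum(prototypes[j], x).sum() / max(x.sum(), 1e-9)
            if match >= vigilance:
                return j, match
        return None, 0.0

    def artmap_train_step(I, label, prototypes, V, baseline=0.5, eps=1e-3):
        # V associates attribute categories with result labels; the control system S raises
        # the vigilance ("match tracking") until the chosen category predicts the correct label
        P = baseline
        while True:
            j, match = art_winner(I, prototypes, P)
            if j is None:                               # no fitting category: recruit a new one
                prototypes.append(I.astype(float).copy())
                V[len(prototypes) - 1] = label
                return len(prototypes) - 1
            if V.get(j, label) == label:                # prediction agrees with the desired result
                V[j] = label
                prototypes[j] = np.minimum(prototypes[j], I)  # fast learning in resonance
                return j
            P = match + eps                             # raise vigilance just above the failed match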

Fuzzy ARTMAP

Fuzzy ARTMAP is simply an ARTMAP consisting of two Fuzzy ART modules.

Distributed ARTMAP


ARTMAP-IC


Default ARTMAP

Default ARTMAP is a combination of Fuzzy ARTMAP and, in part, Distributed ARTMAP.

ARTMAP Overview

WTA = " winner takes all" Only the F2 neuron with the largest output is active, the output of all other is zero.

Pooled = All F2 neurons are active.

Instance counting = The outputs of the F2 neurons are weighted by the F3 neurons. Categories that occurred more frequently in the training cases get a larger vote.
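A tiny illustration of instance counting (the activations and counts are made-up example values):

    import numpy as np

    f2_output = np.array([0.5, 0.9, 0.4])     # assumed F2 activations for three categories
    instance_counts = np.array([25, 3, 12])   # assumed number of training cases per category
    f3_vote = f2_output * instance_counts     # F3 weights each category by its frequency
    winner = int(np.argmax(f3_vote))          # category 0 outvotes category 1 despite its lower raw output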
