Biological neural network

Neural networks are an attempt to map, in reduced form, the structure and information architecture of the nervous system of animals or humans. A nervous system consists of neurons and glial cells, together with their environment. The neurons are linked to one another via synapses, which can be interpreted as the connection points or nodes of an interneuronal network. In addition, an exchange of chemical and electrical signals takes place between the neurons and the neuroglia, in particular the oligodendroglia and astroglia, which can change the weighting of signals.

The "circuitry" of a neuron usually has several inputs and one output. If the sum of the input signals exceeds a certain threshold, the neuron "fires": an action potential (AP) is triggered at the axon hillock, formed in the initial segment, and propagated along the axon. Action potentials, or series of action potentials, are the output signals of neurons. Through the branches of its axon, a neuron can transmit its signal to several other neurons (divergence). Likewise, a neuron can receive signals from several different neurons (convergence), mainly via its dendrites as inputs. Inhibitory synapses can hinder or suppress the formation of an AP. By combining the signals received in this way, a signal can thus be attenuated or amplified at the connection points. Frequently repeated transmission in rapid succession leads to longer-lasting changes in the mode of synaptic transmission (long-term potentiation), which shape the network (neural plasticity).
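The summing and thresholding behavior described above can be sketched as a minimal threshold neuron in the style of McCulloch and Pitts. The weights and threshold below are assumed values chosen only for illustration; a negative weight plays the role of an inhibitory synapse.

```python
# Minimal sketch of a threshold neuron: it sums its weighted inputs
# and "fires" only if the sum exceeds a threshold. A negative weight
# models an inhibitory synapse.

def fires(inputs, weights, threshold):
    """Return True if the weighted input sum exceeds the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total > threshold

# Two excitatory inputs active, inhibitory input silent: the neuron fires.
print(fires([1, 1, 0], [0.6, 0.5, -1.0], 1.0))  # True  (sum = 1.1 > 1.0)
# With the inhibitory input active, firing is suppressed.
print(fires([1, 1, 1], [0.6, 0.5, -1.0], 1.0))  # False (sum = 0.1)
```

Convergence corresponds to the several input slots of one such unit; divergence would be modeled by feeding its output into several downstream units.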

Learning

There are now several well-established theories about learning in neural networks. The first neural learning rule was described in 1949 by Donald O. Hebb (Hebbian learning rule); significant developments were made, among others, through the work of the Finn Teuvo Kohonen in the mid-1980s.
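In its simplest form, Hebb's rule strengthens a connection whenever pre- and postsynaptic activity coincide. A minimal sketch, with an assumed learning rate `eta` chosen only for illustration:

```python
# Minimal sketch of the Hebbian learning rule: the weight change is
# proportional to the product of pre- and postsynaptic activity
# ("cells that fire together, wire together").

def hebb_update(weight, pre, post, eta=0.1):
    """Return the weight after one Hebbian step: w + eta * pre * post."""
    return weight + eta * pre * post

w = 0.5
# Correlated activity strengthens the connection ...
w = hebb_update(w, pre=1.0, post=1.0)
# ... while inactivity on either side leaves it unchanged.
w = hebb_update(w, pre=0.0, post=1.0)
print(w)  # about 0.6
```

Note that this bare form only ever increases weights; later rules (e.g. Kohonen's self-organizing maps) add normalization and competition to keep learning stable.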

This research has identified typical properties of neural networks that apply equally to natural and artificial "neural systems". These include the property that they can learn complex patterns without any abstraction of the rules that may underlie those patterns. That is, neural networks do not follow the laws of so-called artificial intelligence, but rather a kind of "natural intelligence". In particular, this means that the rules do not have to be worked out before learning. On the other hand, it is also not possible to extract from the neural network afterwards any logic that might account for its learning success.

This does not mean that logical behavior and precise rules do not exist; they are simply not acquired "by themselves" through experience, but must be developed through years of more or less tedious "training". Neural networks, in other words, learn not explicitly but implicitly.

Science and technology are concerned in particular with the simulation of artificial neural networks.

The "right" training of a neural network is a prerequisite for successful learning and for the correct processing of a pattern in a nervous system.

Conversely, a prediction about the "correct" interpretation of a pattern by a neural network is not precisely possible unless that specific network is applied or that specific learning experience is taken into account. Neural networks thus have the problem that, after the learning process, patterns that do not resemble the models implemented in the learning set cause stochastic (i.e. seemingly "random") behavior of the output neurons. This is the main difficulty that has so far limited the use of neural networks for pattern recognition.

Research

  • The study of the biochemical and physiological properties of neural networks is a subject of neurophysiology.
  • Neuroinformatics attempts to simulate neural networks on computers by means of artificial neural networks, and to make the properties of neural networks available for software applications.
  • A conceptual abstraction of neural networks also takes place in theoretical biology.
  • In computational neuroscience in particular, model neurons, which abstract from the biological conditions to varying degrees, are connected into networks via simulated synapses in order to investigate their dynamics and their capacity for information processing. For mathematically simple models this is done by mathematical analysis, but mostly also by computer simulation.
  • In the eighties and nineties of the last century, physicists entered this field and contributed significantly to its understanding. Currently, neural networks are used for analysis in high-energy physics, e.g. in top-quark physics. So-called multivariate methods are a key component here for the separation of convolved experimental data.
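One of the simplest model neurons used in such computer simulations is the leaky integrate-and-fire neuron. The sketch below is only illustrative; all constants (time step, time constant, threshold, input current) are assumed values, not taken from any particular study.

```python
# Minimal sketch of a leaky integrate-and-fire model neuron.
# The membrane potential v integrates the input current I and leaks
# back toward rest; on reaching threshold it emits a spike and resets.

def simulate_lif(current, steps=100, dt=1.0, tau=10.0,
                 v_thresh=1.0, v_reset=0.0):
    """Euler-integrate dv/dt = (-v + I) / tau; return spike times."""
    v = v_reset
    spikes = []
    for t in range(steps):
        v += dt * (-v + current) / tau
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
    return spikes

# A supra-threshold input produces regular spiking ...
print(len(simulate_lif(current=1.5)) > 0)   # True
# ... while a sub-threshold input produces none (v converges to 0.5).
print(len(simulate_lif(current=0.5)))       # 0
```

Connecting many such units via simulated synapses, and analyzing the resulting dynamics, is exactly the program described in the computational-neuroscience item above.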

Alzheimer's disease

Alzheimer's disease, a disease of old age, seems mainly to result from damage to the neural network in the brain, through damage to the myelin responsible for communication between neurons.
