A network is a set of units (nerves, individuals, institutions, states) together with a rule that defines whether, how, and to what extent any two units are tied to each other
The processing ability of a neural network is stored in the inter-unit connection strengths, or weights, which are obtained through a process of adaptation to, or learning from, a set of training patterns
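The idea that knowledge lives in adaptable connection strengths can be made concrete with a minimal sketch: a single perceptron whose weights are nudged toward correct outputs on a set of training patterns. The patterns here (the logical AND function), the learning rate, and the epoch count are all illustrative choices, not taken from the text.

```python
# Minimal sketch: connection strengths (weights) adapted from training
# patterns. A single perceptron learns the AND function; all values
# below are illustrative.

patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]   # inter-unit connection strengths
bias = 0.0
rate = 0.1             # learning rate

def predict(x):
    # weighted sum of inputs, thresholded at zero
    s = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s > 0 else 0

# Adaptation: repeatedly nudge weights toward the target outputs
for _ in range(20):
    for x, target in patterns:
        error = target - predict(x)
        weights = [w + rate * error * xi for w, xi in zip(weights, x)]
        bias += rate * error

print([predict(x) for x, _ in patterns])
```

After training, the "knowledge" of the AND function exists nowhere except in the final values of `weights` and `bias`, which is the point of the statement above.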
A neural network can be likened to the brain in two ways: 1) Knowledge is acquired by the network from its environment through a learning process, and 2) Interneuron connection strengths, known as synaptic weights, store the acquired knowledge
The human nervous system can be viewed as a three-stage system: 1) the receptors, 2) the neural or nerve net, i.e., the brain, at its center, and 3) the effectors; the brain continually receives information (stimuli), perceives it, and makes appropriate decisions in response
The receptors translate stimuli from the human body or the external environment into electrical impulses that convey information to the neural net (the brain), while the effectors convert the electrical impulses produced by the neural net into visible responses as system outputs
The brain has both small-scale and large-scale anatomical organization, with different functions taking place at lower and higher levels: from synapses to neural microcircuits, dendritic trees, neurons, local circuits, and interregional circuits
Neural networks are software simulations that behave as though they're built from billions of highly interconnected brain cells working in parallel, but they are not actual brains
Neural networks process information like the human brain: they are composed of a large number of highly interconnected processing elements (neurons) working in parallel to solve specific problems
Neural networks are "connectionist" computational systems where information is processed collectively and in parallel throughout a network of nodes (neurons), unlike procedural programs that execute instructions in a linear manner
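The contrast between collective, parallel processing and linear instruction execution can be sketched with a single network layer: every neuron computes its output from all inputs at once through a weight matrix, rather than one instruction at a time operating on one value. The layer sizes, weights, and sigmoid activation below are illustrative assumptions, not drawn from the text.

```python
# Hedged sketch of "connectionist" processing: each neuron in a layer
# combines ALL inputs through its row of the weight matrix, so the
# layer's outputs are computed collectively. Values are illustrative.
import math

def layer(inputs, weights, biases):
    # one output per neuron; every neuron sees every input
    return [
        1.0 / (1.0 + math.exp(-(b + sum(w * x for w, x in zip(row, inputs)))))
        for row, b in zip(weights, biases)
    ]

x = [0.5, -1.0]                  # input signals
W = [[1.0, 2.0], [-1.0, 0.5]]    # 2 neurons x 2 inputs
b = [0.0, 0.1]                   # per-neuron biases
print(layer(x, W, b))
```

In a procedural program these two neuron outputs would be two separate sequential computations; in the connectionist view they are one collective transformation of the input, which is why such layers map naturally onto parallel hardware.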