A **radial basis function** or **RBF** network is a neural network that has many similarities to the perceptron in how it is trained and used. An RBF network has three layers:

- an **input layer** whose neurons accept a vector of values;

- a single **hidden layer** consisting of **radial basis function neurons**. Each neuron in the hidden layer receives the n-dimensional vector of values from the input layer via unweighted connections and uses a function, typically a Gaussian, to measure the deviation of that point from a defined centre, scaled by a standard deviation. It then propagates this deviation to the output layer;

- an **output layer** in which each neuron represents a classification. In a trained RBF network, each output neuron is activated in proportion to the probability that the data supplied to the input layer fits its classification.

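The three layers described above can be sketched as a single forward pass. This is a minimal illustration in plain Python, not any particular library's API; all names and values are invented for the example:

```python
import math

def rbf_forward(x, centres, sigmas, weights):
    """One forward pass through a minimal RBF network.

    x       : input vector (list of floats)
    centres : one centre vector per hidden neuron
    sigmas  : one standard deviation per hidden neuron
    weights : weights[k][j] connects hidden neuron j to output neuron k
    """
    # Hidden layer: Gaussian of the squared distance from x to each centre.
    hidden = []
    for c, s in zip(centres, sigmas):
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        hidden.append(math.exp(-dist2 / (2 * s ** 2)))
    # Output layer: weighted sum of hidden activations per classification.
    return [sum(w * h for w, h in zip(row, hidden)) for row in weights]

# Two hidden neurons (centres at the origin and at (3, 3)), two output classes.
outputs = rbf_forward(
    x=[0.0, 0.0],
    centres=[[0.0, 0.0], [3.0, 3.0]],
    sigmas=[1.0, 1.0],
    weights=[[1.0, 0.0], [0.0, 1.0]],
)
```

An input at the origin activates the first hidden neuron strongly and the second hardly at all, so the first output neuron dominates.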
Mathematically, a weighted sum of Gaussian functions can be used to approximate any shape, so an RBF network can learn any classification even though it has only one hidden layer. Compare this to a perceptron, which requires at least three layers in total (input, hidden and output) to represent an arbitrarily complex function.
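The claim that a sum of Gaussians can represent any shape can be demonstrated by exact interpolation: place one Gaussian on each sample point and solve a small linear system for the weights. This is a self-contained sketch with made-up sample values, using hand-rolled Gaussian elimination to avoid external dependencies:

```python
import math

def gaussian(x, c, s):
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def solve(a, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(b)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for k in range(col, n + 1):
                m[r][k] -= f * m[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

# Three sample points of an arbitrary target shape.
xs = [0.0, 1.0, 2.0]
ys = [1.0, -0.5, 2.0]
sigma = 0.7

# One Gaussian per sample point, centred on it; solve for the weights
# that make the weighted sum pass through every sample exactly.
phi = [[gaussian(x, c, sigma) for c in xs] for x in xs]
w = solve(phi, ys)

def model(x):
    return sum(wi * gaussian(x, c, sigma) for wi, c in zip(w, xs))
```

With more sample points the same construction approximates increasingly complex shapes, which is the intuition behind the single hidden layer sufficing.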

The difficulty, however, lies in supplying the right centre and deviation for each neuron in the hidden layer. In some RBF variants both of these values and the weights connecting the hidden layer to the output layer can be learned. In others, additional hidden-layer neurons are added to the network whenever detected errors cannot be eliminated by adjusting the weights. The best results, however, tend to be obtained when the centres and deviations are already known from the problem domain and can be hard-coded; generally, it makes sense to use an RBF network in that case and a perceptron otherwise. A further important consideration is that an RBF network trains better than a perceptron when there are many output neurons (many output classifications), because the RBF network has fewer hidden layers.
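The recommended setup, hard-coded centres and deviations with learned output weights, can be sketched with the delta rule applied to the output layer only. The centres, learning rate and training samples below are invented for illustration:

```python
import math

def activations(x, centres, sigma):
    return [math.exp(-((x - c) ** 2) / (2 * sigma ** 2)) for c in centres]

# Centres hard-coded from the "problem domain": class 0 clusters near 0,
# class 1 clusters near 4.
centres = [0.0, 4.0]
sigma = 1.0
weights = [[0.0, 0.0], [0.0, 0.0]]  # weights[k][j]: hidden neuron j -> output k

samples = [(-0.5, 0), (0.3, 0), (3.6, 1), (4.4, 1)]
lr = 0.5
for _ in range(200):
    for x, label in samples:
        h = activations(x, centres, sigma)
        for k in range(2):
            y = sum(w * hj for w, hj in zip(weights[k], h))
            target = 1.0 if k == label else 0.0
            err = target - y
            for j in range(len(h)):
                # Delta rule: only the hidden-to-output weights are adapted;
                # centres and sigma stay fixed.
                weights[k][j] += lr * err * h[j]

def predict(x):
    h = activations(x, centres, sigma)
    scores = [sum(w * hj for w, hj in zip(row, h)) for row in weights]
    return scores.index(max(scores))
```

Because only the output layer is trained, the error surface is quadratic in the weights and the delta rule converges reliably, which is one reason this configuration tends to give the best results.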

- has functional building block: FBB_Classification
- has input data type: IDT_Vector of quantitative variables
- has internal model: INM_Neural network
- has output data type: ODT_Classification, ODT_Probability
- has learning style: LST_Supervised
- has parametricity: PRM_Nonparametric with hyperparameter(s)
- has relevance: REL_Relevant