General regression neural network
Generalized regression neural network (GRNN) is a variation of radial basis function neural networks. GRNN was proposed by D. F. Specht in 1991.[1]
GRNN can be used for regression, prediction, and classification. It can also be a good solution for online dynamical systems.
GRNN is an improved neural network technique based on nonparametric regression. The idea is that every training sample represents the mean of a radial basis neuron.[2]
Mathematical representation
The GRNN prediction for an input \(x\) is a kernel-weighted average of the training targets:

\[
\hat{Y}(x) = \frac{\sum_{k=1}^{n} y_k \, K(x, x_k)}{\sum_{k=1}^{n} K(x, x_k)}
\]

where:
- \(\hat{Y}(x)\) is the prediction value for the input \(x\)
- \(y_k\) is the activation weight for the pattern layer neuron at \(k\)
- \(K(x, x_k)\) is the radial basis function kernel (Gaussian kernel), as formulated below:

\[
K(x, x_k) = e^{-d_k^2 / (2\sigma^2)}, \qquad d_k^2 = (x - x_k)^\top (x - x_k)
\]

where \(d_k^2\) is the squared Euclidean distance between the training sample \(x_k\) and the input \(x\), and \(\sigma\) is a smoothing (bandwidth) parameter.
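As an illustration of the formula, here is a minimal NumPy sketch of the GRNN prediction for a single query point (function and parameter names are illustrative, not a standard API):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN prediction: a Gaussian-kernel-weighted average of the
    training targets, per the formula above."""
    # Squared Euclidean distance d_k^2 from x to each training sample x_k
    d2 = np.sum((X_train - x) ** 2, axis=1)
    # Gaussian kernel weights K(x, x_k) = exp(-d_k^2 / (2 sigma^2))
    K = np.exp(-d2 / (2 * sigma ** 2))
    # Weighted average: sum_k y_k K(x, x_k) / sum_k K(x, x_k)
    return np.sum(y_train * K) / np.sum(K)
```

With a small \(\sigma\), the prediction near a training sample is dominated by that sample's target; a larger \(\sigma\) averages over more neighbors.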
Implementation
GRNN has been implemented in many software packages, including MATLAB,[3] R, and Python.
Advantages and disadvantages
Similar to RBFNN, GRNN has the following advantages:
- Single-pass learning, so no backpropagation is required.
- High accuracy in the estimation, since it uses Gaussian functions.
- It can handle noise in the inputs.
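The single-pass property can be made concrete: "training" a GRNN amounts to one pass that stores the samples, with no iterative weight updates. A hedged sketch (the class and parameter names are hypothetical, not a library API):

```python
import numpy as np

class GRNN:
    """Minimal GRNN sketch: fit() is a single pass that memorizes the
    training set; all computation happens at prediction time."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma  # kernel bandwidth

    def fit(self, X, y):
        # Single-pass "learning": no backpropagation, just store the data
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self

    def predict(self, X_query):
        preds = []
        for x in np.asarray(X_query, dtype=float):
            d2 = np.sum((self.X - x) ** 2, axis=1)        # squared distances
            K = np.exp(-d2 / (2 * self.sigma ** 2))       # Gaussian kernel
            preds.append(np.sum(self.y * K) / np.sum(K))  # weighted average
        return np.array(preds)
```

Because every training sample is stored as a pattern neuron, the noise-robustness and the size disadvantage below are two sides of the same design.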
The main disadvantages of GRNN are:
- Its size can grow very large, which is computationally expensive, because the network stores every training sample.
- There is no optimal method to improve it.
References
- ^ Specht, D. F. (1991). "A general regression neural network". IEEE Transactions on Neural Networks. Retrieved 2017-03-13 – via IEEE Xplore (ieeexplore.ieee.org).
- ^ https://minds.wisconsin.edu/bitstream/handle/1793/7779/ch2.pdf?sequence=14
- ^ "Generalized Regression Neural Networks". MathWorks. https://au.mathworks.com/help/nnet/ug/generalized-regression-neural-networks.html