General regression neural network
Generalized regression neural network (GRNN) is a variation of radial basis function neural networks. GRNN was proposed by D. F. Specht in 1991.[1]
GRNN can be used for regression, prediction, and classification. GRNN can also be a good solution for online dynamical systems.
GRNN is a neural network technique based on nonparametric regression. The idea is that every training sample serves as the mean of a radial basis neuron.[2]
Mathematical representation
\hat{Y}(x) = \frac{\sum_{k=1}^{n} y_k \, K(x, x_k)}{\sum_{k=1}^{n} K(x, x_k)}

where:
- \hat{Y}(x) is the prediction value for the input x
- y_k is the activation weight for the pattern layer neuron at k
- K(x, x_k) is the radial basis function (Gaussian) kernel, formulated as

K(x, x_k) = e^{-d_k^2 / (2\sigma^2)}, \quad d_k^2 = (x - x_k)^\top (x - x_k)

where d_k^2 is the squared Euclidean distance between the training sample x_k and the input x, and \sigma is the smoothing (spread) parameter.
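To make the formula above concrete, the following is a minimal NumPy sketch of the GRNN predictor. The function name grnn_predict and the default value of the smoothing parameter sigma are illustrative choices, not part of the original text.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """Minimal GRNN predictor: kernel-weighted average of the training targets.

    X_train : (n, d) training inputs
    y_train : (n,)   training targets
    X_query : (m, d) query inputs
    sigma   : smoothing (spread) parameter of the Gaussian kernel
    """
    # Squared Euclidean distance between every query point and every training sample
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)  # (m, n)
    # Gaussian kernel weights K(x, x_k) = exp(-d_k^2 / (2 sigma^2))
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    # Prediction: sum_k y_k K(x, x_k) / sum_k K(x, x_k)
    return (K @ y_train) / K.sum(axis=1)
```

Note that the entire training set is kept and used at prediction time; there is no iterative weight fitting, which is the single-pass property discussed below.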
Implementation
GRNN has been implemented in many programming languages, including MATLAB[3], R, and Python.
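As an illustration only (not a reference to any particular library), the sketch above can be applied to a small synthetic regression problem; the data and the choice sigma=0.3 are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(200, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(200)

# Evaluate the GRNN at a few query points; the output approximates sin(x)
X_query = np.linspace(-3, 3, 5).reshape(-1, 1)
print(grnn_predict(X_train, y_train, X_query, sigma=0.3))
```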
Advantages and disadvantages
Similar to RBFNN, GRNN has the following advantages:
- Single-pass learning so no backpropagation is required.
- High accuracy in the estimation since it uses Gaussian functions.
- It can handle noise in the inputs.
The main disadvantages of GRNN are:
- Because the network stores every training sample, its size can be huge, which makes it computationally expensive.
- There is no optimal method to improve it.
References
- ^ Specht, D. F. (1991). "A general regression neural network". IEEE Transactions on Neural Networks. 2 (6): 568–576. doi:10.1109/72.97934. PMID 18282872.
- ^ https://minds.wisconsin.edu/bitstream/handle/1793/7779/ch2.pdf?sequence=14
- ^ "Generalized Regression Neural Networks - MATLAB & Simulink - MathWorks Australia".