General regression neural network


Generalized regression neural network (GRNN) is a variation of radial basis function neural networks (RBFNN). GRNN was suggested by D. F. Specht in 1991.[1]

GRNN can be used for regression, prediction, and classification. GRNN can also be a good solution for online dynamical systems.

GRNN is an improved neural network technique based on nonparametric regression. The idea is that every training sample represents the mean of a radial basis neuron.[2]

Mathematical representation

The GRNN estimate of the output for an input x is a kernel-weighted average of the training targets:

  \hat{y}(x) = \frac{\sum_{k=1}^{N} y_k \, K(x, x_k)}{\sum_{k=1}^{N} K(x, x_k)}

where:

  • \hat{y}(x) is the prediction value for the input x,
  • y_k is the activation weight for the pattern layer neuron at k (the target of the k-th training sample x_k),
  • K(x, x_k) is the radial basis function kernel (Gaussian kernel), as formulated below:

  K(x, x_k) = e^{-d_k / (2\sigma^2)}, \qquad d_k = (x - x_k)^\top (x - x_k)

where d_k is the squared Euclidean distance between the training sample x_k and the input x, and \sigma is the smoothing (spread) parameter of the kernel.[3]
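
A minimal NumPy sketch of this estimate follows; the function name grnn_predict and its parameter names are illustrative and not taken from the cited sources:

    import numpy as np

    def grnn_predict(X_train, y_train, x, sigma=1.0):
        # X_train: (N, d) training samples, y_train: (N,) targets,
        # x: (d,) query input, sigma: smoothing parameter of the Gaussian kernel.
        # Squared Euclidean distance between each training sample and the input.
        d = np.sum((X_train - x) ** 2, axis=1)
        # Gaussian kernel activations of the pattern layer.
        k = np.exp(-d / (2.0 * sigma ** 2))
        # Weighted average of the targets (summation and division layers).
        return np.dot(k, y_train) / np.sum(k)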

Implementation

GRNN has been implemented in many software packages, including MATLAB[4], R, and Python.
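
For example, the Python sketch above can be exercised on synthetic one-dimensional data (purely illustrative; not drawn from the cited documentation):

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))  # 50 one-dimensional samples
    y = np.sin(X[:, 0])                               # noiseless targets
    print(grnn_predict(X, y, np.array([np.pi / 2]), sigma=0.3))  # close to 1.0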

Advantages and disadvantages

Similar to RBFNN, GRNN has the following advantages:

  • Single-pass learning, so no backpropagation is required.
  • High accuracy in the estimation, since it uses Gaussian functions.
  • It can handle noise in the inputs.

The main disadvantages of GRNN are:

  • Its size can grow huge, which is computationally expensive, since every training sample is stored as a pattern neuron.
  • There is no optimal method to improve it.

References

  1. ^ "A general regression neural network - IEEE Xplore Document". Ieeexplore.ieee.org. 2002-08-06. Retrieved 2017-03-13.
  2. ^ https://minds.wisconsin.edu/bitstream/handle/1793/7779/ch2.pdf?sequence=14
  3. ^ https://minds.wisconsin.edu/bitstream/handle/1793/7779/ch2.pdf?sequence=14
  4. ^ https://au.mathworks.com/help/nnet/ug/generalized-regression-neural-networks.html