General regression neural network

Generalized regression neural network (GRNN) is a variation of radial basis function networks. GRNN was proposed by D. F. Specht in 1991.[1]

GRNN can be used for regression, prediction, and classification. It is also well suited to modelling online dynamical systems.

GRNN is a neural-network formulation of nonparametric regression. The idea is that every training sample serves as the mean (center) of a radial basis neuron.[2]

Mathematical representation

 $\hat{y}(x) = \dfrac{\sum_{k=1}^{N} y_k \, K(x, x_k)}{\sum_{k=1}^{N} K(x, x_k)}$

where:

  • $\hat{y}(x)$ is the prediction value of input $x$,
  • $y_k$ is the activation weight for the pattern layer neuron at $k$,
  • $K(x, x_k)$ is the radial basis function kernel (Gaussian kernel), as formulated below:

 $K(x, x_k) = e^{-d_k^2 / (2\sigma^2)}, \qquad d_k^2 = (x - x_k)^{\mathsf{T}} (x - x_k),$

where $d_k^2$ is the squared Euclidean distance between the training sample $x_k$ and the input $x$, and $\sigma$ is the smoothing (bandwidth) parameter.
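A minimal NumPy sketch of this prediction rule is shown below. The function name, toy data, and choice of σ are illustrative assumptions, not part of any standard library:

 import numpy as np
 
 def grnn_predict(X_train, y_train, x, sigma):
     # squared Euclidean distances d_k^2 between the query x and every training sample x_k
     d2 = np.sum((X_train - x) ** 2, axis=1)
     # Gaussian kernel activations of the pattern layer: K(x, x_k) = exp(-d_k^2 / (2 sigma^2))
     K = np.exp(-d2 / (2.0 * sigma ** 2))
     # GRNN prediction: kernel-weighted average of the training targets
     return np.dot(y_train, K) / np.sum(K)
 
 # toy usage: regress a noisy sine curve (purely illustrative data)
 rng = np.random.default_rng(0)
 X_train = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
 y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(50)
 print(grnn_predict(X_train, y_train, np.array([np.pi / 2]), sigma=0.3))

Note that training amounts to storing the samples; the only free parameter is the smoothing parameter σ.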

Implementation

GRNN has been implemented in many programming languages, including MATLAB,[3] R, and Python.

Advantages and disadvantages

Similar to RBFNN, GRNN has the following advantages:

  • Single-pass learning, so no backpropagation is required.
  • High accuracy in the estimation, since it uses Gaussian functions.
  • It can handle noise in the inputs.

The main disadvantages of GRNN are:

  • Its size can be huge, since the pattern layer stores every training sample, which makes it computationally expensive.
  • There is no optimal method to improve it.

References

  1. ^ Specht, D. F. (November 1991). "A general regression neural network". IEEE Transactions on Neural Networks. 2 (6): 568–576. doi:10.1109/72.97934. PMID 18282872.
  2. ^ https://minds.wisconsin.edu/bitstream/handle/1793/7779/ch2.pdf?sequence=14
  3. ^ "Generalized Regression Neural Networks - MATLAB & Simulink - MathWorks Australia".