Nonlinear regression

From Wikipedia, the free encyclopedia

Nonlinear regression is the problem of fitting a model y = f(x, theta) to multidimensional x, y data, where f is a nonlinear function of the parameters theta. Unlike linear regression, whose least-squares estimates can be computed analytically by solving a system of linear equations, a model that is nonlinear in its parameters generally leads to normal equations that are themselves nonlinear in theta, so there is no general closed-form procedure for finding the optimal parameters. Instead, iterative algorithms such as gradient descent are used, but these can become stuck in local minima of the error function.
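As a rough illustration of the iterative approach described above, the following Python sketch fits a hypothetical exponential model y = a * exp(b * x) by plain gradient descent on the sum of squared residuals. The model, the synthetic data, the starting guess, and the step size are all illustrative assumptions, not part of the article.

```python
import numpy as np

# Synthetic data from an assumed exponential model y = a * exp(b * x) plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
a_true, b_true = 2.5, -1.3
y = a_true * np.exp(b_true * x) + 0.05 * rng.standard_normal(x.size)

def residuals(theta):
    """Difference between the model's predictions and the observed y."""
    a, b = theta
    return a * np.exp(b * x) - y

def loss(theta):
    """Sum-of-squared-residuals objective that the iteration tries to minimize."""
    r = residuals(theta)
    return 0.5 * np.sum(r ** 2)

def gradient(theta):
    """Gradient of the loss with respect to the parameters (a, b)."""
    a, b = theta
    r = residuals(theta)
    df_da = np.exp(b * x)          # partial derivative of the model w.r.t. a
    df_db = a * x * np.exp(b * x)  # partial derivative of the model w.r.t. b
    return np.array([np.sum(r * df_da), np.sum(r * df_db)])

# Plain gradient descent from a starting guess.  A poor guess or step size can
# stall, diverge, or settle in a different local minimum of the loss surface.
theta = np.array([2.0, -1.0])
step = 1e-3
for _ in range(20000):
    theta = theta - step * gradient(theta)

print("estimated (a, b):", theta)   # should end up close to (2.5, -1.3)
print("final loss:", loss(theta))
```

In practice one would usually rely on a library routine such as scipy.optimize.curve_fit or scipy.optimize.least_squares, which implement more robust least-squares methods (for example Levenberg-Marquardt), often restarting from several different initial guesses to reduce the risk of ending in a poor local minimum.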

Two open questions remain: is there a rigorous theory of when analytic solutions are possible? And are there models that have no analytic solution, but for which an iterative method is guaranteed to converge to the global optimum?