Talk:Gaussian function

FWHM

Someone should write something relating the Gaussian full width at half maximum (FWHM) to the given parameters. That way, using the fact that it's a Gaussian function and given the FWHM, I could construct one that works.
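
For reference, if the Gaussian is written as f(x) = a exp(−(x − b)²/(2c²)) as in the article, the FWHM follows directly from solving f(x) = a/2, so the conversion in both directions is:

\mathrm{FWHM} = 2\sqrt{2\ln 2}\; c \approx 2.3548\, c, \qquad c = \frac{\mathrm{FWHM}}{2\sqrt{2\ln 2}}.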

Do we mean to say that

Gaussian functions are eigenfunctions of the Fourier transform,

or that

eigenfunctions of the Fourier transform are Gaussian functions

or neither? -- Miguel

Not all eigenfunctions of the Fourier transform are Gaussian. See Hermite polynomials. Michael Hardy 15:10, 30 Aug 2003 (UTC)
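
For the record, with the unitary (angular-frequency) convention of the Fourier transform, the Hermite functions are eigenfunctions with eigenvalues (−i)^n, and only the n = 0 case is a pure Gaussian:

\psi_n(x) = H_n(x)\, e^{-x^2/2}, \qquad \mathcal{F}[\psi_n] = (-i)^n\, \psi_n .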

maximum entropy

Could someone add something about Gaussian functions being the ones with maximum entropy? I think this can also be related to the Heisenberg uncertainty principle since momentum and position are canonical conjugate variables.

This article links to normal distribution, which I suspect already gives that information. For non-normalized Gaussian functions, I'm not sure at this moment what the maximum-entropy statement would say. Michael Hardy 23:46, 27 Feb 2005 (UTC)
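
For the normalized case, the standard statement is the one from information theory: among all probability densities on the real line with a given variance σ², the normal density attains the maximum differential entropy,

h = \tfrac{1}{2}\ln\!\left(2\pi e \sigma^2\right).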

image

Are they all the same bell shape? If so, let's get a picture! - Omegatron 17:51, Mar 15, 2005 (UTC)

Definition of parameters

The function definition uses a, b and c as parameters, while the graph uses μ and σ² as parameters. What is the relation between the two sets of parameters?

--NeilenMarais 20:18, 24 May 2006 (UTC)[reply]

Answering myself, it seems from looking at Gaussian function that b = μ, c = σ, and a = 1/(σ√(2π)). One could mention this, or perhaps even better, generate an image using the correct parameters. Opinions? --NeilenMarais 20:27, 24 May 2006 (UTC)[reply]

Not entirely correct. Not all Gaussian functions are probability density functions, so a need not be a normalizing constant that makes the integral equal to 1.

But certainly I think the caption should explain the notation used in the illustration. Michael Hardy 21:29, 24 May 2006 (UTC)[reply]
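
As a possible way to act on the "generate an image using the correct parameters" suggestion, here is a minimal sketch assuming NumPy and Matplotlib are available; the parameter values are arbitrary examples, not the ones in the current figure:

import numpy as np
import matplotlib.pyplot as plt

def gaussian(x, a, b, c):
    # the article's parameterization: a = height, b = centre, c = width
    return a * np.exp(-(x - b)**2 / (2 * c**2))

x = np.linspace(-5, 5, 500)
for a, b, c in [(1.0, 0.0, 1.0), (0.5, -1.0, 2.0), (1.5, 2.0, 0.5)]:
    plt.plot(x, gaussian(x, a, b, c), label="a={}, b={}, c={}".format(a, b, c))
plt.xlabel("x")
plt.ylabel("f(x)")
plt.legend()
plt.savefig("gaussian_curves.png")

A caption generated from the same a, b, c values would then match the notation used in the article's definition.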


Yes, I'd also like a better explanation of this. And also, for the 2D case, σ_x and σ_y are said to be the "spread" of the function, a term which is not explained. Is it related to the FWHM?

213.115.59.220 08:49, 17 July 2007 (UTC)[reply]

Gaussian Function ..

Would I be correct in saying that a Gaussian function represents the values a variable can take, that is, that it gives us a range of possible values of the variable, or shows the region in which the value of that variable lies?

Is that what the Gaussian function does? —The preceding unsigned comment was added by Hari krishnan07 (talkcontribs) 04:36, 3 December 2006 (UTC).[reply]

Not directly. The Gaussian function is the name for a function with specific properties, e.g. as illustrated by the curves in the article. What you refer to is a probability distribution, which can have the form of a Gaussian. Kghose 16:01, 16 December 2006 (UTC)[reply]


Are functions of the form x^{2m} e^{-x^2/c^2} some kind of Gaussian functions? --Karl-H 11:12, 27 January 2007 (UTC)[reply]

I think not. This is related to Gaussian functions of course, but a true Gaussian function should not have the x^{2m} term in front. Oleg Alexandrov (talk) 18:41, 27 January 2007 (UTC)[reply]

Ambiguity or error in definition of sigma?

There seems to be an ambiguity or error in the definition of sigma here. If I am not mis-informed, sigma-x and sigma-y are the standard deviations of the function along the x and y axis respectively? If this is correct, then if a 2-d Gaussian ellipse is inclined at theta = 45 degrees it would have the same sigma-x and sigma-y as a circular 2-d Gaussian, but with the covariance = 0 for the circular Gaussian and non-zero for the elliptical Gaussian.

In the 3 plots showing rotation of the ellipse from theta = 0 to theta = pi/3, the values for sigma-x and sigma-y are the same, 1 and 2 respectively. This implies that here sigma-x and sigma-y are the standard deviations along the minor and major axis of the ellipse, not along the x and y axis of the function.

Could someone please clear this up? Also, an equation that relates the angle theta to the covariance term would be helpful.

jgreen —Preceding unsigned comment added by 75.75.90.207 (talk) 21:22, 20 September 2007 (UTC)[reply]

Is there a spurious factor of two in front of 'b' for the 2D Gaussian? The Matlab code contains no 2, whilst the LaTeX image of the equation does. —Preceding unsigned comment added by 220.239.69.107 (talk) 05:52, 16 October 2007 (UTC)[reply]

I'm inclined to agree with the last statement re:factor of two. See mathworld... —Preceding unsigned comment added by 74.74.223.195 (talk) 10:22, 21 June 2008 (UTC)[reply]
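
One way to see why the conventions differ: with the explicit factor of two, the exponent is exactly the quadratic form of a symmetric matrix, so both versions can be correct as long as each is self-consistent (the code's b then simply absorbs the 2):

a\,\Delta x^2 + 2b\,\Delta x\,\Delta y + c\,\Delta y^2
= \begin{pmatrix}\Delta x & \Delta y\end{pmatrix}
\begin{pmatrix} a & b \\ b & c \end{pmatrix}
\begin{pmatrix}\Delta x \\ \Delta y\end{pmatrix}.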

I seem to remember that all derivatives of a Gaussian are again Gaussian, but there may also be an additional condition on the polynomial. Could someone shed some light on this? —Preceding unsigned comment added by 84.227.21.231 (talk) 16:53, 18 January 2009 (UTC)[reply]
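
For reference, the n-th derivative of a Gaussian is not itself a Gaussian, but a Gaussian multiplied by a polynomial of degree n; in the standard form this is the Rodrigues formula for the Hermite polynomials:

\frac{d^n}{dx^n} e^{-x^2} = (-1)^n H_n(x)\, e^{-x^2}.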

I changed the wording of the definition of a Gaussian derivative; I suggest a math expert review it to ensure the new description is accurate. So far this is the best resource on the web that I can find, particularly for explaining Gaussian derivatives. Jon.N. —Preceding undated comment added 21:56, 8 August 2009 (UTC).[reply]

Algorithm? A misspelling?

Gaussian functions arise by applying the exponential function to a general quadratic function. The Gaussian functions are thus those functions whose algorithm is a quadratic function.

Did the author mean "logarithm" instead of "algorithm" here?

147.8.235.63 (talk) 10:18, 1 December 2008 (UTC)[reply]

FFT?

I'm confused... I see that the FFT of a Gaussian is Gaussian, but in a discrete implementation using scipy's fft to transform Gaussian functions, I get that σ -> N*5/(32σ) where N is the number of bins. This 5/32 seems like a weird magic number. What am I missing? —Ben FrantzDale (talk) 16:28, 2 April 2010 (UTC)[reply]

I agree, and can add that the magic number is 5/32 ≈ 1/(2π). So (at least in Matlab), doing an FFT on a Gaussian with sigma c results in a Gaussian with sigma N/(2πc). --12.yakir (talk) 18:22, 27 August 2012 (UTC)[reply]
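
A minimal NumPy sketch of that relation, assuming the widths are measured in samples and bins (the 1/(2π) comes from the DFT kernel exp(−2πikn/N); the values of N and sigma here are arbitrary examples):

import numpy as np

N = 1024
sigma = 20.0                        # width of the input Gaussian, in samples
n = np.arange(N)
g = np.exp(-(n - N / 2)**2 / (2 * sigma**2))

G = np.abs(np.fft.fft(g))           # magnitude of the DFT (the phase only encodes the shift)

k = np.fft.fftfreq(N) * N           # signed bin indices matching the FFT output order
sigma_freq = np.sqrt(np.sum(k**2 * G) / np.sum(G))   # second-moment width of |G|

print(sigma_freq, N / (2 * np.pi * sigma))   # the two numbers should roughly agree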

No mention of parabola

So, no mention of the fact that a Gaussian on linear-logarithmic axes is a parabola on linear-linear axes. I would imagine this would be considered a feature of note, but I don't know whether it is or not, or how to say it in a way that gives some substance to the fact. ᛭ LokiClock (talk) 06:21, 8 July 2010 (UTC)[reply]
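
For concreteness, taking the logarithm of the article's one-dimensional form shows the parabola directly:

\ln f(x) = \ln a - \frac{(x-b)^2}{2c^2},

a downward-opening parabola in x, which is why a Gaussian looks parabolic on a plot with a logarithmic vertical axis.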

Matlab code better in Python?

Because not everybody has access to Matlab due to its costs, I would like to see Python/Numpy code here instead. It is very similar, but nevertheless differs in some details. I could do the conversion myself, if people agree. --maye (talk) 10:12, 27 October 2010 (UTC)[reply]

I agree. There has been an attempt in the past to write this into policy, but the discussion was sidetracked and then ceased. The policies on media content do not merely disallow content Wikipedia can't legally republish, but also disallow content under non-commercial licenses, because anyone should be able to use all of the encyclopedia freely. At least in this case, you would be acting towards that end, and I see no reason why MATLAB has particular significance in this article. ᛭ LokiClock (talk) 14:10, 27 October 2010 (UTC)[reply]
Most Matlab code can be run using Octave, which is free and open-source. Alhead (talk) 21:00, 7 December 2011 (UTC)[reply]
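
As a rough sketch of what such a conversion could look like (this is not the article's current Matlab code, just an illustrative NumPy/Matplotlib equivalent that evaluates a rotated 2-D Gaussian on a grid; all parameter values are arbitrary examples):

import numpy as np
import matplotlib.pyplot as plt

def gauss2d(x, y, amp=1.0, x0=0.0, y0=0.0, sigma_x=1.0, sigma_y=2.0, theta=np.pi / 6):
    # coefficients of the rotated quadratic form in the exponent
    a = np.cos(theta)**2 / (2 * sigma_x**2) + np.sin(theta)**2 / (2 * sigma_y**2)
    b = -np.sin(2 * theta) / (4 * sigma_x**2) + np.sin(2 * theta) / (4 * sigma_y**2)
    c = np.sin(theta)**2 / (2 * sigma_x**2) + np.cos(theta)**2 / (2 * sigma_y**2)
    return amp * np.exp(-(a * (x - x0)**2 + 2 * b * (x - x0) * (y - y0) + c * (y - y0)**2))

x, y = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
z = gauss2d(x, y)
plt.imshow(z, extent=(-5, 5, -5, 5), origin="lower")
plt.colorbar()
plt.savefig("gaussian_2d.png")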

Exponent's quadratic's form

The article states:

"Gaussian functions arise by applying the exponential function to a general quadratic function."

But how does applying the one to the other lead to that form for the quadratic? This should probably go in a History section. ᛭ LokiClock (talk) 00:47, 26 March 2011 (UTC)[reply]
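
To spell the quoted sentence out: exponentiating a general quadratic with a negative leading coefficient and completing the square recovers the standard three-parameter form,

e^{\alpha x^2 + \beta x + \gamma} = a\, e^{-(x-b)^2/(2c^2)}, \qquad \alpha < 0,\;\; b = -\frac{\beta}{2\alpha},\;\; c^2 = -\frac{1}{2\alpha},\;\; a = e^{\gamma - \beta^2/(4\alpha)},

which is presumably how the general quadratic and the (a, b, c) form are meant to be connected.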

Merging

I guess they mean exactly the same thing, and normal distribution is the more formal name from the mathematical point of view. (Unsigned post.)

This was a suggestion to merge with Normal distribution.
The Gaussian function also has numerous applications outside the field of statistics, for example in solving diffusion equations, in connection with Hermite functions, and in feature detection in computer vision. If this article were merged into normal distribution, these connections would be lost. Hence, I think it is more appropriate to keep this article, with appropriate cross-referencing. Tpl (talk) 11:53, 8 June 2011 (UTC)[reply]
Merge templates go on the articles not on the discussion page. New stuff on discussion page goes at the end. Have removed merge template as merge would not be good. Melcombe (talk) 14:44, 8 June 2011 (UTC)[reply]

Typo?

Can someone verify that the equation for the 2-dimensional elliptical Gaussian is correct as currently stated:

f(x,y) = A \exp\!\left(-\left(a(x-x_0)^2 + 2b(x-x_0)(y-y_0) + c(y-y_0)^2\right)\right)

Or should the second term in the exponent actually be negative, like so:

f(x,y) = A \exp\!\left(-\left(a(x-x_0)^2 - 2b(x-x_0)(y-y_0) + c(y-y_0)^2\right)\right)

I'm drawing from this page (wikipedia.org) and other sources for the bivariate gaussian pdf. — Preceding unsigned comment added by 129.123.61.172 (talk) 21:04, 28 September 2011 (UTC)[reply]

Well, it seems the only difference is the change of sign of the parameter b. Then it is simply a matter of definition. Bakken (talk) 10:52, 1 October 2011 (UTC)[reply]
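
A short way to see that both sign choices are legitimate: replacing b by −b is the same as reflecting one axis (or reversing θ), and either version is a valid Gaussian as long as the quadratic form stays positive definite:

a\,\Delta x^2 - 2b\,\Delta x\,\Delta y + c\,\Delta y^2 = a\,\Delta x^2 + 2b\,\Delta x\,(-\Delta y) + c\,(-\Delta y)^2, \qquad a > 0,\; ac - b^2 > 0.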

Multivariate gaussian

An undefined variable called B keeps showing up inside the "Multivariate gaussian" paragraph. I have no idea if it is simply: B = A, or if there is something subtle going on here, but either we need to add a definition for B, or just remove it from the math.