
Talk:Moment-generating function

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Michael Hardy (talk | contribs) at 22:12, 10 July 2009 (Terms). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.
WikiProject Statistics (Unassessed)
This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has not yet received a rating on Wikipedia's content assessment scale.
This article has not yet received a rating on the importance scale.
WikiProject Mathematics (Start-class, Mid-priority)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Mid-priority on the project's priority scale.

Hyphenation

Spelling question: I've never (before now) seen the name spelled with a hyphen. Searches of Math Reviews (MathSciNet) and Current Index to Statistics show an overwhelming preference for no hyphen. Should the title, at least, be changed (move the article to "Moment generating function" with a redirect)? Zaslav 18:08, 8 December 2006 (UTC)[reply]

Sir Ronald Fisher always used the hyphen in "moment-generating function". This is an instance of the fact that in this era the traditional hyphenation rules are usually not followed in scholarly writing, nor in advertising or package labelling, although they're still generally followed in newspapers, magazines, and novels. This particular term seldom appears in novels, advertisements, etc. Personally I prefer the traditional rules because in some cases they are a very efficient disambiguating tool. Michael Hardy 20:03, 8 December 2006 (UTC)[reply]

Terms

I would like _all_ the terms such as E to be defined explicitly. Otherwise these articles are unintelligible to the casual reader. I would have thought that all terms in any formula should be defined in every article, or else reference should be made to some common source of definitions for that context. How about a bit more help for the randomly browsing casual student? I would like to see a recommendation in the Wikipedia "guidelines for authors" defining some kind of standard for this; otherwise it is very arbitrary which terms are defined and which are expected to be known. — Preceding unsigned comment added by 220.253.60.249 (talk)

Seriously, this article is written for people who already know this stuff apparently. The article doesn't really say what E is, and for that matter what is t? Seriously, for M(t) what is t? mislih 20:33, 10 July 2009 (UTC)[reply]
t is the dummy argument used to define the function, like the x when one defines the squaring function ƒ by saying ƒ(x) = x². And it says so in the second line of the article, where it says t ∈ R. If you're not following that, then what you don't know that you'd need to know to understand it is the topic of other articles, not of this one. In cases where the random variable is measured in some units, the units of t must be the reciprocal of the units of x. As for what E is, there is a link to expectation in the next line. Anyone who knows the basic facts about probability distributions has seen this notation. Michael Hardy (talk) 22:08, 10 July 2009 (UTC)[reply]
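For a browsing student, a quick numerical sketch may make the role of t concrete: once the distribution of X is fixed, M(t) = E[e^{tX}] is just a function of t. (This is only an illustration, not from the article; the choice of a standard normal X, the sample size, and the seed are mine.)

```python
import math
import random

def mgf_estimate(sample, t):
    """Monte Carlo estimate of M(t) = E[exp(t*X)] from a sample of X."""
    return sum(math.exp(t * x) for x in sample) / len(sample)

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(200_000)]

# For X ~ N(0, 1) the exact value is M(t) = exp(t^2 / 2),
# so the estimate should track that curve as t varies.
for t in (0.0, 0.5, 1.0):
    print(t, mgf_estimate(sample, t), math.exp(t * t / 2))
```

Note that at t = 0 the estimate is exactly 1, since e^{0·X} = 1 for every sample point.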

Certainly one could put in links to those things, but this article is the wrong place to explain what "E" is, just as an article about Shakespeare's plays is the wrong place to explain how to spell "Denmark", saying that the "D" represents the initial sound in "dog", etc.

This is not written for people who already know this material.

It is written for people who already know what probability distributions are and the standard basic facts about them. Michael Hardy (talk) 21:47, 10 July 2009 (UTC)[reply]

Definition

The definition of the n-th moment is wrong: the last equality is identically zero, as the nth derivative of 1 evaluated at t = 0 will always be zero. The evaluation bar must be placed at the end (so we know we are differentiating M_X(t) n times and then evaluating at zero).

The only thing I find here that resembles a definition of the nth moment is where it says:
the nth moment is given by
E[Xⁿ] = (dⁿ/dtⁿ) M_X(t) |_{t=0}.
That definition is correct.
I take the expression
(dⁿ/dtⁿ) M_X(t) |_{t=0}
to mean that we are first differentiating n times and then evaluating at zero. Unless you were referring to something else, your criticism seems misplaced. Michael Hardy 22:05, 8 April 2006 (UTC)[reply]
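The order of operations can be checked numerically: differentiate the MGF first, then evaluate at zero. Below is a sketch using finite differences on the Exponential(1) MGF, M(t) = 1/(1 − t), whose nth moment is n! (the distribution choice and step size are mine, for illustration only).

```python
def M(t):
    # MGF of an Exponential(1) random variable: M(t) = 1 / (1 - t), valid for t < 1.
    return 1.0 / (1.0 - t)

h = 1e-4

# central finite differences approximate the derivatives AT t = 0:
first  = (M(h) - M(-h)) / (2 * h)            # ≈ M'(0)  = E[X]   = 1
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # ≈ M''(0) = E[X^2] = 2
print(first, second)

# differentiating the constant M(0) = 1 instead would give 0, which is the
# error the comment above points out.
```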

Please provide a few examples, e.g. for a Gaussian distribution.

How about adding something like this?
For the Gaussian distribution with density
f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)),
the moment-generating function is
M(t) = E[e^{tX}] = ∫ e^{tx} f(x) dx.
Completing the square and simplifying, one obtains
M(t) = exp(μt + σ²t²/2).
(mostly borrowed from the article normal distribution.) I don't know if there's enough space for a complete derivation. The "completing the square" part is rather tedious. -- 130.94.162.64 00:37, 17 June 2006 (UTC)[reply]
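If the full completing-the-square derivation is too long for the article, a numerical cross-check at least confirms the closed form. Here is a sketch that integrates ∫ e^{tx} f(x) dx by a midpoint Riemann sum and compares it to exp(μt + σ²t²/2) (the parameter values and truncation range are my own choices).

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def mgf_numeric(t, mu, sigma, lo=-30.0, hi=30.0, n=200_000):
    # midpoint Riemann sum for E[exp(tX)] = integral of exp(t*x) f(x) dx
    # over a wide truncated range (tails are negligible here)
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += math.exp(t * x) * normal_pdf(x, mu, sigma) * dx
    return total

mu, sigma, t = 1.0, 2.0, 0.5
exact = math.exp(mu * t + sigma ** 2 * t ** 2 / 2)  # closed form from completing the square
print(mgf_numeric(t, mu, sigma), exact)
```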

I would also like to see some more in the article about some basic properties of the moment-generating function, such as convexity, non-negativity, the fact that M(0) always equals one, and also some other not-so-obvious properties (of which I lack knowledge) indicating what the mgf is used for. --130.94.162.64 00:55, 17 June 2006 (UTC)[reply]
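The elementary properties mentioned above are easy to verify numerically for any concrete MGF. A small sketch (using the standard normal MGF exp(t²/2) as my example) checks M(0) = 1, positivity, and midpoint convexity on a grid:

```python
import math

def M(t):
    # MGF of a standard normal random variable: M(t) = exp(t^2 / 2)
    return math.exp(t * t / 2)

# M(0) = 1 for every MGF, since E[e^{0*X}] = E[1] = 1
assert M(0) == 1.0

ts = [i / 10 for i in range(-20, 21)]

# positivity: M(t) = E[e^{tX}] is an expectation of a positive quantity
assert all(M(t) > 0 for t in ts)

# midpoint convexity: M((a+b)/2) <= (M(a) + M(b)) / 2
assert all(M((a + b) / 2) <= (M(a) + M(b)) / 2 + 1e-12 for a in ts for b in ts)
print("all checks passed")
```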

Also, is it true that "Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral"? When you calculate the MGF of the Poisson distribution, X ~ Poisson(λ), M_X(t) = E[exp(tX)] = sum(exp(tx)·p(x), x = 0, ..., ∞) is the correct formula to use. This is clearly not an integral. Does the Riemann-Stieltjes integral include summation as well? If not, the quoted statement is wrong and should be removed from the article. —Preceding unsigned comment added by Sjayzzang (talkcontribs) 20:02, 15 April 2009 (UTC)[reply]

Certainly a sum is a Riemann–Stieltjes integral. One would hope that that article would be crystal-clear about that. I'll take a look. Michael Hardy (talk) 21:55, 10 July 2009 (UTC)[reply]
...I see: that article is not explicit about that. Michael Hardy (talk) 21:56, 10 July 2009 (UTC)[reply]

Vector of random variables or stochastic process

We should mention the case when X is a vector of random variables or a stochastic process. Jackzhp 22:29, 3 September 2006 (UTC)[reply]

I've added a brief statement about this. (Doubtless more could be said about it.) Michael Hardy 03:48, 4 September 2006 (UTC)[reply]
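One fact worth stating in the vector case is that for independent components the joint MGF M_X(t) = E[e^{t·X}] factorizes into the product of the marginal MGFs. A Monte Carlo sketch (the component distributions, sample size, and seed are my own choices for illustration):

```python
import math
import random

def joint_mgf_estimate(pairs, t1, t2):
    # Monte Carlo estimate of M(t1, t2) = E[exp(t1*X1 + t2*X2)] for a random vector
    return sum(math.exp(t1 * x1 + t2 * x2) for x1, x2 in pairs) / len(pairs)

random.seed(1)
# X1 ~ N(0, 1) and X2 ~ Exponential(1), drawn independently
pairs = [(random.gauss(0.0, 1.0), random.expovariate(1.0)) for _ in range(200_000)]

t1, t2 = 0.3, 0.4
# with independent components the joint MGF factorizes:
# M(t1, t2) = exp(t1^2 / 2) * 1 / (1 - t2)
exact = math.exp(t1 ** 2 / 2) * (1.0 / (1.0 - t2))
print(joint_mgf_estimate(pairs, t1, t2), exact)
```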

Properties would be nice

There are a whole bunch of properties of MGFs that it would be nice to include -- e.g. the MGF of a linear transformation of a random variable, MGF of a sum of independent random variables, etc.
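Both properties mentioned are one-line identities: M_{aX+b}(t) = e^{bt} M_X(at), and for independent X, Y, M_{X+Y}(t) = M_X(t) M_Y(t). A sketch verifying them against the normal closed form exp(μt + σ²t²/2) (all parameter values are my own illustrative choices):

```python
import math

def mgf_normal(t, mu, sigma):
    # MGF of N(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2)
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

a, b, t = 2.0, 3.0, 0.25

# linear transformation: M_{aX+b}(t) = exp(b*t) * M_X(a*t).
# Here X ~ N(0, 1), so aX + b ~ N(b, a^2), giving the left-hand side directly.
lhs = mgf_normal(t, b, a)
rhs = math.exp(b * t) * mgf_normal(a * t, 0.0, 1.0)
assert math.isclose(lhs, rhs)

# independent sum: M_{X+Y}(t) = M_X(t) * M_Y(t).
# X ~ N(1, 0.5^2), Y ~ N(2, 1.5^2), so X + Y ~ N(3, 0.5^2 + 1.5^2).
lhs2 = mgf_normal(t, 1.0 + 2.0, math.sqrt(0.5 ** 2 + 1.5 ** 2))
rhs2 = mgf_normal(t, 1.0, 0.5) * mgf_normal(t, 2.0, 1.5)
assert math.isclose(lhs2, rhs2)
print("identities hold")
```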

Discrete form of mgf

something should be added about the discrete form of the mgf, no? 24.136.121.150 08:37, 20 January 2007 (UTC)[reply]
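In the discrete case the defining integral becomes a sum over the support, M(t) = Σ_x e^{tx} P(X = x). A sketch for the binomial distribution, compared against its closed form (1 − p + pe^t)^n (the values of n, p, t are my own choices for illustration):

```python
import math

def binomial_mgf_sum(t, n, p):
    # discrete form of the MGF: sum of exp(t*x) * P(X = x) over the support {0, ..., n}
    return sum(math.exp(t * x) * math.comb(n, x) * p ** x * (1 - p) ** (n - x)
               for x in range(n + 1))

n, p, t = 10, 0.3, 0.7
closed_form = (1 - p + p * math.exp(t)) ** n
print(binomial_mgf_sum(t, n, p), closed_form)
```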

Common Moment Generating Functions

It would seem like an obviously good thing to include a link to a Wikipedia page which tabulates common moment-generating functions (i.e., the moment-generating functions for common statistical distributions), placing them online. The information is already there on Wikipedia; it would just be a case of organising it a little better.

Also, there is probably some efficient way in which the set of all functions which commonly occur when dealing with statistical distributions can be organised to highlight possible inter-relationships (perhaps some mgfs are nested mgfs, so that the fact that

 

could be highlighted in a list of mgf interdependencies...).

ConcernedScientist (talk) 00:47, 18 February 2009 (UTC)[reply]

Uniqueness

We have a theorem that if two mgfs coincide in some region around 0, then the corresponding random variables have the same distribution. There is, however, a concern that this statement, while true from the point of view of a mathematician, is not so reliable from the point of view of an applied statistician. McCullagh (1994)[1] gives the following example:

with cumulant generating functions

Although the densities are visibly different, their corresponding cgfs are virtually indistinguishable, with a maximum difference of less than 1.34·10⁻⁹ over the entire range. Thus, from a numerical standpoint, mgfs fail to uniquely determine the distribution.

On the other hand, Waller (1995)[2] shows that the characteristic function does a much better job of numerically distinguishing distributions.

  1. ^ McCullagh, P. (1994). Does the moment generating function characterize a distribution? The American Statistician, 48, p. 208.
  2. ^ Waller, L. A. (1995). Does the characteristic function numerically distinguish distributions? The American Statistician, 49, pp. 150-151.