<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.sarg.dev/index.php?action=history&amp;feed=atom&amp;title=Moment-generating_function</id>
	<title>Moment-generating function - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.sarg.dev/index.php?action=history&amp;feed=atom&amp;title=Moment-generating_function"/>
	<link rel="alternate" type="text/html" href="https://wiki.sarg.dev/index.php?title=Moment-generating_function&amp;action=history"/>
	<updated>2026-04-18T06:19:50Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.44.2</generator>
	<entry>
		<id>https://wiki.sarg.dev/index.php?title=Moment-generating_function&amp;diff=137015&amp;oldid=prev</id>
		<title>2A00:23C8:DE24:801:414:35E9:4E90:9155: /* Important properties */ more on log-normal</title>
		<link rel="alternate" type="text/html" href="https://wiki.sarg.dev/index.php?title=Moment-generating_function&amp;diff=137015&amp;oldid=prev"/>
		<updated>2025-10-26T13:17:13Z</updated>

		<summary type="html">&lt;p&gt;&lt;span class=&quot;autocomment&quot;&gt;Important properties: &lt;/span&gt; more on log-normal&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{Short description|Concept in probability theory and statistics}}&lt;br /&gt;
In [[probability theory]] and [[statistics]], the &amp;#039;&amp;#039;&amp;#039;moment-generating function&amp;#039;&amp;#039;&amp;#039; of a real-valued [[random variable]] is an alternative specification of its [[probability distribution]]. Thus, it provides the basis of an alternative route to analytical results compared with working directly with [[probability density function]]s or [[cumulative distribution function]]s. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables. However, not all random variables have moment-generating functions.&lt;br /&gt;
&lt;br /&gt;
As its name implies, the moment-[[generating function]] can be used to compute a distribution’s [[Moment (mathematics)|moments]]: the {{mvar|n}}-th moment about 0 is the {{mvar|n}}-th derivative of the moment-generating function, evaluated at 0.&lt;br /&gt;
&lt;br /&gt;
In addition to univariate real-valued distributions, moment-generating functions can also be defined for vector- or matrix-valued random variables, and can even be extended to more general cases.&lt;br /&gt;
&lt;br /&gt;
The moment-generating function of a real-valued distribution does not always exist, unlike the [[Characteristic function (probability theory)|characteristic function]]. There are relations between the behavior of the moment-generating function of a distribution and properties of the distribution, such as the existence of moments.&lt;br /&gt;
&lt;br /&gt;
==Definition==&lt;br /&gt;
Let &amp;lt;math&amp;gt; X &amp;lt;/math&amp;gt; be a [[random variable]] with [[Cumulative distribution function|CDF]] &amp;lt;math&amp;gt;F_X&amp;lt;/math&amp;gt;. The moment generating function (mgf) of &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; (or &amp;lt;math&amp;gt;F_X&amp;lt;/math&amp;gt;), denoted by &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt;, is&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt; M_X(t) = \operatorname E \left[e^{tX}\right] &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
provided this [[expected value|expectation]] exists for &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt; in some open [[Neighborhood (mathematics)|neighborhood]] of 0. That is, there is an &amp;lt;math&amp;gt;h &amp;gt; 0&amp;lt;/math&amp;gt; such that for all &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt; satisfying   &amp;lt;math&amp;gt;-h &amp;lt; t &amp;lt; h&amp;lt;/math&amp;gt;,  &amp;lt;math&amp;gt;\operatorname E \left[e^{tX}\right] &amp;lt;/math&amp;gt; exists. If the expectation does not exist in an open neighborhood of 0, we say that the moment generating function does not exist.&amp;lt;ref&amp;gt;{{cite book |last1=Casella |first1=George|last2= Berger|first2= Roger L. |title=Statistical Inference |publisher=Wadsworth &amp;amp; Brooks/Cole|year=1990 |page=61 |isbn=0-534-11958-1 }}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In other words, the moment-generating function of {{mvar|X}} is the [[expected value|expectation]] of the random variable &amp;lt;math&amp;gt; e^{tX}&amp;lt;/math&amp;gt;. More generally, when &amp;lt;math&amp;gt;\mathbf X = ( X_1, \ldots, X_n)^{\mathrm{T}}&amp;lt;/math&amp;gt;, an &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt;-dimensional [[random vector]], and &amp;lt;math&amp;gt;\mathbf t&amp;lt;/math&amp;gt; is a fixed vector, one uses &amp;lt;math&amp;gt;\mathbf t \cdot \mathbf X = \mathbf t^\mathrm T\mathbf X&amp;lt;/math&amp;gt; instead of&amp;amp;nbsp;{{nowrap|&amp;lt;math&amp;gt;tX&amp;lt;/math&amp;gt;:}}&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt; M_{\mathbf X}(\mathbf t) := \operatorname E \left[e^{\mathbf t^\mathrm T\mathbf X}\right].&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt; M_X(0) &amp;lt;/math&amp;gt; always exists and is equal to&amp;amp;nbsp;1. However, a key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the [[Characteristic function (probability theory)|characteristic function]] or Fourier transform always exists (because it is the integral of a bounded function on a space of finite [[measure (mathematics)|measure]]), and for some purposes may be used instead.&lt;br /&gt;
&lt;br /&gt;
The moment-generating function is so named because it can be used to find the moments of the distribution.&amp;lt;ref&amp;gt;{{cite book |last=Bulmer |first=M. G. |title=Principles of Statistics |publisher=Dover |year=1979 |pages=75–79 |isbn=0-486-63760-3 }}&amp;lt;/ref&amp;gt;  The series expansion of &amp;lt;math&amp;gt;e^{tX}&amp;lt;/math&amp;gt; is&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;&lt;br /&gt;
e^{t X} = 1 + t X + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots + \frac{t^n X^n}{n!} + \cdots.&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Hence,&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\begin{align}&lt;br /&gt;
M_X(t) &amp;amp;= \operatorname E [e^{t X}] \\[1ex]&lt;br /&gt;
&amp;amp;= 1 + t \operatorname E[X] + \frac{t^2 \operatorname E[X^2]}{2!} + \frac{t^3 \operatorname E[X^3]}{3!} + \cdots + \frac{t^n\operatorname E [X^n]}{n!}+\cdots \\[1ex]&lt;br /&gt;
&amp;amp; = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots + \frac{t^n m_n}{n!} + \cdots,&lt;br /&gt;
\end{align}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where &amp;lt;math&amp;gt;m_n&amp;lt;/math&amp;gt; is the {{nowrap|&amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt;-th}} [[moment (mathematics)|moment]]. Differentiating &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt; &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt; times with respect to &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt; and setting &amp;lt;math&amp;gt;t = 0&amp;lt;/math&amp;gt;, we obtain the &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;-th moment about the origin, &amp;lt;math&amp;gt;m_i&amp;lt;/math&amp;gt;.&lt;br /&gt;
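&lt;br /&gt;
This moments-from-derivatives relation can be checked numerically. The sketch below (Python; the exponential rate and the finite-difference step are arbitrary illustrative choices) differentiates the exponential moment-generating function &amp;lt;math&amp;gt;\lambda/(\lambda - t)&amp;lt;/math&amp;gt; at 0 and compares the results with the known raw moments &amp;lt;math&amp;gt;n!/\lambda^n&amp;lt;/math&amp;gt;:&lt;br /&gt;

```python
import math

# MGF of an exponential distribution with rate lam (illustrative choice):
# M(t) = lam / (lam - t), defined for t strictly below lam.
lam = 2.0
def M(t):
    return lam / (lam - t)

# n-th derivative at 0 via a central finite difference of order n.
def nth_derivative_at_zero(f, n, h=1e-3):
    total = 0.0
    for k in range(n + 1):
        total += (-1) ** k * math.comb(n, k) * f((n / 2 - k) * h)
    return total / h ** n

# Raw moments of Exp(lam) are n! / lam**n, i.e. 0.5, 0.5, 0.75 here.
m1 = nth_derivative_at_zero(M, 1)
m2 = nth_derivative_at_zero(M, 2)
m3 = nth_derivative_at_zero(M, 3)
```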
&lt;br /&gt;
If &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; is a continuous random variable, the following relation between its moment-generating function &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt; and the [[two-sided Laplace transform]] of its probability density function &amp;lt;math&amp;gt;f_X(x)&amp;lt;/math&amp;gt; holds:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;M_X(t) = \mathcal{L}\{f_X\}(-t),&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
since the PDF&amp;#039;s two-sided Laplace transform is given as&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\mathcal{L}\{f_X\}(s) = \int_{-\infty}^\infty e^{-sx} f_X(x)\, dx,&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and the moment-generating function&amp;#039;s definition expands (by the [[law of the unconscious statistician]]) to&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;M_X(t) = \operatorname E \left[e^{tX}\right] = \int_{-\infty}^\infty e^{tx} f_X(x)\, dx.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This is consistent with the characteristic function of &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; being a [[Wick rotation]] of &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt; when the moment generating function exists, as the characteristic function of a continuous random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; is the [[Fourier transform]] of its probability density function &amp;lt;math&amp;gt;f_X(x)&amp;lt;/math&amp;gt;, and in general when a function &amp;lt;math&amp;gt;f(x)&amp;lt;/math&amp;gt; is of [[exponential order]], the Fourier transform of &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; is a Wick rotation of its two-sided Laplace transform in the region of convergence. See [[Fourier transform#Laplace transform|the relation of the Fourier and Laplace transforms]] for further information.&lt;br /&gt;
&lt;br /&gt;
==Examples==&lt;br /&gt;
Here are some examples of the moment-generating function and the characteristic function for comparison. It can be seen that the characteristic function is a [[Wick rotation]] of the moment-generating function &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt; when the latter exists.&lt;br /&gt;
{|class=&amp;quot;wikitable&amp;quot; style=&amp;quot;padding-left:1.5em;&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Distribution&lt;br /&gt;
! Moment-generating function &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt;&lt;br /&gt;
! Characteristic function &amp;lt;math&amp;gt;\varphi (t)&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Degenerate distribution|Degenerate]] &amp;lt;math&amp;gt;\delta_a&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;e^{ta}&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;e^{ita}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Bernoulli distribution|Bernoulli]] &amp;lt;math&amp;gt;P(X = 1) = p&amp;lt;/math&amp;gt; &lt;br /&gt;
| &amp;lt;math&amp;gt;1 - p + pe^t&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;1 - p + pe^{it}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Binomial distribution|Binomial]] &amp;lt;math&amp;gt;B(n, p)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\left(1 - p + pe^t\right)^n&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\left(1 - p + pe^{it}\right)^n&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Geometric distribution|Geometric]]  &amp;lt;math&amp;gt;(1 - p)^{k}\,p&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{p}{1 - (1 - p) e^t}, ~ t &amp;lt; -\ln(1 - p)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{p}{1 - (1 - p)\,e^{it}}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Negative binomial distribution|Negative binomial]] &amp;lt;math&amp;gt;\operatorname{NB}(r, p)&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\left(\frac{p}{1 - e^t + pe^t}\right)^r, ~ t&amp;lt;-\ln(1-p)&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\left(\frac{p}{1 - e^{it} + pe^{it}}\right)^r&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Poisson distribution|Poisson]] &amp;lt;math&amp;gt;\operatorname{Pois}(\lambda)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{\lambda(e^t - 1)}&amp;lt;/math&amp;gt; &lt;br /&gt;
| &amp;lt;math&amp;gt;e^{\lambda(e^{it} - 1)}&amp;lt;/math&amp;gt; &lt;br /&gt;
|- &lt;br /&gt;
| [[Uniform distribution (continuous)|Uniform (continuous)]] &amp;lt;math&amp;gt;\operatorname U(a, b)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{e^{tb} - e^{ta}}{t(b - a)}&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{e^{itb} - e^{ita}}{it(b - a)}&amp;lt;/math&amp;gt;&lt;br /&gt;
|- &lt;br /&gt;
| [[Discrete uniform distribution|Uniform (discrete)]] &amp;lt;math&amp;gt;\operatorname{DU}(a, b)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{e^{at} - e^{(b + 1)t}}{(b - a + 1)(1 - e^t)}&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\frac{e^{ait} - e^{(b + 1)it}}{(b - a + 1)(1 - e^{it})}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Laplace distribution|Laplace]] &amp;lt;math&amp;gt;L(\mu, b)&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\frac{e^{t\mu}}{1 - b^2t^2}, ~ |t| &amp;lt; 1/b&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\frac{e^{it\mu}}{1 + b^2t^2}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Normal distribution|Normal]] &amp;lt;math&amp;gt;N(\mu, \sigma^2)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{t\mu + \sigma^2 t^2 / 2}&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{it\mu - \sigma^2 t^2 / 2}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Chi-squared distribution|Chi-squared]] &amp;lt;math&amp;gt;\chi^2_k&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;{\left(1 - 2t\right)}^{-k/2}, ~ t &amp;lt; 1/2&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;{\left(1 - 2it\right)}^{-{k}/{2}}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Noncentral chi-squared distribution|Noncentral chi-squared]] &amp;lt;math&amp;gt;\chi^2_k(\lambda)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{\lambda t/(1-2t)} {\left(1 - 2t\right)}^{-k/2}&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{i\lambda t/(1-2it)} {\left(1 - 2it\right)}^{-k/2}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Gamma distribution|Gamma]] &amp;lt;math&amp;gt;\Gamma(k, \tfrac{1}{\theta})&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;{\left(1 - t\theta\right)}^{-k}, ~ t &amp;lt; \tfrac{1}{\theta}&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;{\left(1 - it\theta\right)}^{-k}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Exponential distribution|Exponential]] &amp;lt;math&amp;gt;\operatorname{Exp}(\lambda)&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\left(1 - t\lambda^{-1}\right)^{-1}, ~ t &amp;lt; \lambda&amp;lt;/math&amp;gt;&lt;br /&gt;
| &amp;lt;math&amp;gt;\left(1 - it\lambda^{-1}\right)^{-1}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Beta distribution|Beta]]&lt;br /&gt;
|&amp;lt;math&amp;gt;1  +\sum_{k=1}^{\infty} \left( \prod_{r=0}^{k-1} \frac{\alpha+r}{\alpha+\beta+r} \right) \frac{t^k}{k!}&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;{}_1F_1(\alpha; \alpha+\beta; i\,t)\! &amp;lt;/math&amp;gt;{{br}}(see [[Confluent hypergeometric function]])&lt;br /&gt;
|-&lt;br /&gt;
| [[Multivariate normal distribution|Multivariate normal]] &amp;lt;math&amp;gt;N(\mathbf{\mu}, \mathbf{\Sigma})&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\exp\left[\mathbf{t}^\mathrm{T} \left(  \boldsymbol{\mu} + \tfrac{1}{2} \boldsymbol{\Sigma} \mathbf{t}\right)\right]&amp;lt;/math&amp;gt;&lt;br /&gt;
|&amp;lt;math&amp;gt;\exp\left[\mathbf{t}^\mathrm{T} \left(i \boldsymbol{\mu} - \tfrac{1}{2} \boldsymbol{\Sigma} \mathbf{t}\right)\right]&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| [[Cauchy distribution|Cauchy]] &amp;lt;math&amp;gt;\operatorname{Cauchy}(\mu, \theta)&amp;lt;/math&amp;gt;&lt;br /&gt;
|[[Indeterminate form|Does not exist]]&lt;br /&gt;
| &amp;lt;math&amp;gt;e^{it\mu - \theta|t|}&amp;lt;/math&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
|[[Multivariate Cauchy distribution|Multivariate Cauchy]] &lt;br /&gt;
&amp;lt;math&amp;gt;\operatorname{MultiCauchy}(\mu, \Sigma)&amp;lt;/math&amp;gt;&amp;lt;ref&amp;gt;Kotz et al.{{full citation needed|date=December 2019}} p. 37 using 1 as the number of degrees of freedom to recover the Cauchy distribution&amp;lt;/ref&amp;gt;&lt;br /&gt;
|Does not exist&lt;br /&gt;
|&amp;lt;math&amp;gt;\exp\left(i\mathbf{t}^{\mathrm{T}}\boldsymbol\mu - \sqrt{\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma} \mathbf{t}}\right)&amp;lt;/math&amp;gt;&lt;br /&gt;
|}&lt;br /&gt;
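&lt;br /&gt;
The table entries can be verified directly from the definition. A short sketch (Python; all parameter values are arbitrary) compares the Poisson and binomial closed forms against the defining expectation &amp;lt;math&amp;gt;\operatorname E \left[e^{tX}\right]&amp;lt;/math&amp;gt;:&lt;br /&gt;

```python
import math

t = 0.3  # arbitrary evaluation point

# Poisson(lam): closed form exp(lam (e^t - 1)) versus the defining sum E[e^{tX}].
lam = 1.5
poisson_closed = math.exp(lam * (math.exp(t) - 1.0))
poisson_direct = sum(math.exp(t * k) * math.exp(-lam) * lam ** k / math.factorial(k)
                     for k in range(60))

# Binomial(n, p): closed form (1 - p + p e^t)^n versus the defining sum.
n, p = 10, 0.4
binom_closed = (1.0 - p + p * math.exp(t)) ** n
binom_direct = sum(math.exp(t * k) * math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)
                   for k in range(n + 1))
```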
&lt;br /&gt;
==Calculation==&lt;br /&gt;
The moment-generating function is the expectation of a function of the random variable, so it can be written as:&lt;br /&gt;
&lt;br /&gt;
* For a discrete [[probability mass function]], &amp;lt;math&amp;gt;M_X(t)=\sum_{i=0}^\infty e^{tx_i}\, p_i&amp;lt;/math&amp;gt;&lt;br /&gt;
* For a continuous [[probability density function]], &amp;lt;math&amp;gt; M_X(t)  = \int_{-\infty}^\infty e^{tx} f(x)\,dx &amp;lt;/math&amp;gt;&lt;br /&gt;
* In the general case: &amp;lt;math&amp;gt;M_X(t) = \int_{-\infty}^\infty e^{tx}\,dF(x)&amp;lt;/math&amp;gt;, using the [[Riemann&amp;amp;ndash;Stieltjes integral]], and  where &amp;lt;math&amp;gt;F&amp;lt;/math&amp;gt; is the [[cumulative distribution function]]. This is simply the [[Laplace-Stieltjes transform]] of &amp;lt;math&amp;gt;F&amp;lt;/math&amp;gt;, but with the sign of the argument reversed.&lt;br /&gt;
&lt;br /&gt;
Note that for the case where &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; has a continuous [[probability density function]] &amp;lt;math&amp;gt;f(x)&amp;lt;/math&amp;gt;,  &amp;lt;math&amp;gt;M_X(-t)&amp;lt;/math&amp;gt; is the [[two-sided Laplace transform]] of &amp;lt;math&amp;gt;f(x)&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\begin{align}&lt;br /&gt;
M_X(t) &amp;amp; = \int_{-\infty}^\infty e^{tx} f(x)\,dx \\[1ex]&lt;br /&gt;
&amp;amp; = \int_{-\infty}^\infty \left( 1+ tx + \frac{t^2 x^2}{2!} + \cdots + \frac{t^n x^n}{n!} + \cdots\right) f(x)\,dx \\[1ex]&lt;br /&gt;
&amp;amp; = 1 + tm_1 + \frac{t^2 m_2}{2!} + \cdots + \frac{t^n m_n}{n!} +\cdots,&lt;br /&gt;
\end{align}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where &amp;lt;math&amp;gt;m_n&amp;lt;/math&amp;gt; is the &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt;th [[moment (mathematics)|moment]].&lt;br /&gt;
&lt;br /&gt;
===Linear transformations of random variables ===&lt;br /&gt;
If random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; has moment generating function &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt;, then &amp;lt;math&amp;gt;\alpha X + \beta&amp;lt;/math&amp;gt; has moment generating function &amp;lt;math&amp;gt;M_{\alpha X + \beta}(t) = e^{\beta t}M_X(\alpha t)&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;&lt;br /&gt;
M_{\alpha X + \beta}(t) = \operatorname{E}\left[e^{(\alpha X + \beta) t}\right] = e^{\beta t} \operatorname{E}\left[e^{\alpha Xt}\right] = e^{\beta t} M_X(\alpha t)&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
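&lt;br /&gt;
A quick numerical sanity check of this rule (Python; the normal parameters and the constants are arbitrary choices) uses the fact that a linear transformation of a normal variable is again normal, so both sides are available in closed form:&lt;br /&gt;

```python
import math

# X ~ Normal(mu, sigma^2) has MGF exp(mu t + sigma^2 t^2 / 2).
mu, sigma = 1.0, 2.0
def M_X(t):
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

# alpha X + beta is Normal(alpha mu + beta, alpha^2 sigma^2), so its MGF is
# known directly; the linear-transformation rule must reproduce it.
alpha, beta = 3.0, -1.0
def M_Y_direct(t):
    return math.exp((alpha * mu + beta) * t + (alpha * sigma) ** 2 * t ** 2 / 2)

t = 0.25
via_rule = math.exp(beta * t) * M_X(alpha * t)
```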
&lt;br /&gt;
===Linear combination of independent random variables===&lt;br /&gt;
If &amp;lt;math display=&amp;quot;inline&amp;quot;&amp;gt;S_n = \sum_{i=1}^n a_i X_i&amp;lt;/math&amp;gt;, where the {{math|&amp;#039;&amp;#039;X&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;&amp;#039;&amp;#039;i&amp;#039;&amp;#039;&amp;lt;/sub&amp;gt;}} are independent random variables and the {{math|&amp;#039;&amp;#039;a&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;&amp;#039;&amp;#039;i&amp;#039;&amp;#039;&amp;lt;/sub&amp;gt;}} are constants, then the probability density function for {{math|&amp;#039;&amp;#039;S&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;&amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;lt;/sub&amp;gt;}} is the [[convolution]] of the probability density functions of each of the {{math|&amp;#039;&amp;#039;X&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;&amp;#039;&amp;#039;i&amp;#039;&amp;#039;&amp;lt;/sub&amp;gt;}}, and the moment-generating function for {{math|&amp;#039;&amp;#039;S&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;&amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;lt;/sub&amp;gt;}} is given by&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;&lt;br /&gt;
M_{S_n}(t) = M_{X_1}(a_1t) M_{X_2}(a_2t) \cdots M_{X_n}(a_nt) \, .&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
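&lt;br /&gt;
For instance, a sum of independent exponential variables illustrates the product rule (Python sketch; all parameter values are arbitrary). The product of the exponential moment-generating functions matches the Gamma entry from the table above, and a direct numerical integration of the defining expectation agrees:&lt;br /&gt;

```python
import math

# Each X_i ~ Exp(lam) has MGF (1 - t/lam)**(-1) for t strictly below lam, so the
# product rule gives (1 - t/lam)**(-k) for S = X_1 + ... + X_k, which is the
# Gamma MGF. Cross-check against E[e^{tS}] by numerically integrating the
# Gamma(k, 1/lam) density.
lam, k, t = 2.0, 5, 0.7
product_of_mgfs = (1.0 - t / lam) ** (-k)

def gamma_pdf(x):
    return lam ** k * x ** (k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

# Midpoint rule on [0, 40]; the integrand decays exponentially beyond that.
dx = 1e-3
direct = sum(math.exp(t * x) * gamma_pdf(x) * dx
             for x in ((i + 0.5) * dx for i in range(40000)))
```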
&amp;lt;!----------&lt;br /&gt;
Below was lifted from [[generating function]] ... there should be an &lt;br /&gt;
analog for the moment-generating functionbuted with common probability-generating function &amp;#039;&amp;#039;G&amp;#039;&amp;#039;&amp;lt;sub&amp;gt;X&amp;lt;/sub&amp;gt;, then&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;G_{S_N}(z) = G_N(G_X(z)).&amp;lt;/math&amp;gt;&lt;br /&gt;
--------&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Vector-valued random variables===&lt;br /&gt;
For [[random vector|vector-valued random variables]] &amp;lt;math&amp;gt;\mathbf X&amp;lt;/math&amp;gt; with [[real number|real]] components, the moment-generating function is given by&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt; M_X(\mathbf t) = \operatorname{E}\left[e^{\langle \mathbf t, \mathbf X \rangle}\right] &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where &amp;lt;math&amp;gt;\mathbf t&amp;lt;/math&amp;gt; is a vector and &amp;lt;math&amp;gt;\langle \cdot, \cdot \rangle&amp;lt;/math&amp;gt; is the [[dot product]].&lt;br /&gt;
&lt;br /&gt;
==Important properties==&lt;br /&gt;
&lt;br /&gt;
Moment generating functions are positive and [[Logarithmically convex function|log-convex]],{{Citation needed|reason=log-convexity|date=June 2023}} with &amp;#039;&amp;#039;M&amp;#039;&amp;#039;(0) = 1.&lt;br /&gt;
&lt;br /&gt;
An important property of the moment-generating function is that it uniquely determines the distribution. In other words, if &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; are two random variables and for all values of&amp;amp;nbsp;{{mvar|t}},&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;M_X(t) = M_Y(t), &amp;lt;/math&amp;gt;&lt;br /&gt;
then&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;F_X(x) = F_Y(x) &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
for all values of {{mvar|x}} (or equivalently {{mvar|X}} and {{mvar|Y}} have the same distribution). This statement is not equivalent to the statement &amp;quot;if two distributions have the same moments, then they are identical at all points.&amp;quot; This is because in some cases, the moments exist and yet the moment-generating function does not, because the limit&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\lim_{n \to \infty} \sum_{i=0}^n \frac{t^i m_i}{i!}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
may not exist. The [[log-normal distribution]] is an example of when this occurs: its moments &amp;lt;math&amp;gt;\operatorname{E}[X^n] = e^{n\mu+n^2\sigma^2/2}&amp;lt;/math&amp;gt; are all finite, but its moment generating function &amp;lt;math&amp;gt;\operatorname E \left[e^{tX}\right]&amp;lt;/math&amp;gt; diverges for every positive {{mvar|t}} and therefore does not exist in any neighbourhood of 0; moreover, there are other distributions with the same moments.&amp;lt;ref name=&amp;quot;Heyde&amp;quot;&amp;gt;{{Citation | last = Heyde | first = CC. | title = On a Property of the Lognormal Distribution | work = Journal of the Royal Statistical Society, Series B | date = 2010 | volume = 25 | issue = 2 | pages = 392–393 | doi = 10.1007/978-1-4419-5823-5_6 | isbn = 978-1-4419-5822-8 | doi-access = free}}&amp;lt;/ref&amp;gt;&lt;br /&gt;
&amp;lt;!--&lt;br /&gt;
If the moment generating function is defined on such an interval, then it uniquely determines a probability distribution. --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Calculations of moments===&lt;br /&gt;
The moment-generating function is so called because if it exists on an open interval around {{math|1=&amp;#039;&amp;#039;t&amp;#039;&amp;#039; = 0}}, then it is the [[exponential generating function]] of the [[moment (mathematics)|moments]] of the [[probability distribution]]:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;m_n = \operatorname{E}\left[ X^n \right] = M_X^{(n)}(0) = \left. \frac{d^n M_X}{dt^n}\right|_{t=0}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
That is, with {{mvar|n}} being a nonnegative integer, the {{mvar|n}}-th moment about 0 is the {{mvar|n}}-th derivative of the moment generating function, evaluated at {{math|1=&amp;#039;&amp;#039;t&amp;#039;&amp;#039; = 0}}.&lt;br /&gt;
&lt;br /&gt;
==Other properties==&lt;br /&gt;
[[Jensen&amp;#039;s inequality]] provides a simple lower bound on the moment-generating function:&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt; M_X(t) \geq e^{\mu t}, &amp;lt;/math&amp;gt;&lt;br /&gt;
where &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; is the mean of {{mvar|X}}.&lt;br /&gt;
&lt;br /&gt;
The moment-generating function can be used in conjunction with [[Markov&amp;#039;s inequality]] to bound the upper tail of a real random variable {{mvar|X}}. This statement is also called the [[Chernoff bound]]. Since &amp;lt;math&amp;gt;x \mapsto e^{xt}&amp;lt;/math&amp;gt; is monotonically increasing for &amp;lt;math&amp;gt;t&amp;gt;0&amp;lt;/math&amp;gt;, we have&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt; \Pr(X \ge a) = \Pr(e^{tX} \ge e^{ta}) \le e^{-at} \operatorname{E}\left[e^{tX}\right] = e^{-at}M_X(t)&amp;lt;/math&amp;gt;&lt;br /&gt;
for any &amp;lt;math&amp;gt;t&amp;gt;0&amp;lt;/math&amp;gt; and any {{mvar|a}}, provided &amp;lt;math&amp;gt;M_X(t)&amp;lt;/math&amp;gt; exists. For example, when {{mvar|X}} is a standard normal distribution and &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, we can choose &amp;lt;math&amp;gt;t=a&amp;lt;/math&amp;gt; and recall that &amp;lt;math&amp;gt;M_X(t)=e^{t^2/2}&amp;lt;/math&amp;gt;. This gives &amp;lt;math&amp;gt;\Pr(X\ge a)\le e^{-a^2/2}&amp;lt;/math&amp;gt;, which is within a factor of {{math|1+&amp;#039;&amp;#039;a&amp;#039;&amp;#039;}} of the exact value.&lt;br /&gt;
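&lt;br /&gt;
This standard-normal case can be checked numerically (Python sketch; the threshold value &amp;lt;math&amp;gt;a = 2&amp;lt;/math&amp;gt; is an arbitrary choice):&lt;br /&gt;

```python
import math

# Chernoff bound for X ~ N(0, 1) with the choice t = a: the bound is
# exp(-a t) M_X(t) = exp(-a^2 / 2). Compare with the exact tail 1 - Phi(a).
def normal_tail(a):
    return 0.5 * math.erfc(a / math.sqrt(2.0))

a = 2.0
chernoff = math.exp(-a * a / 2.0)   # about 0.135
exact = normal_tail(a)              # about 0.0228
```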
&lt;br /&gt;
Various lemmas, such as [[Hoeffding&amp;#039;s lemma]] or [[Bennett&amp;#039;s inequality]] provide bounds on the moment-generating function in the case of a zero-mean, bounded random variable.&lt;br /&gt;
&lt;br /&gt;
When &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; is non-negative, the moment generating function gives a simple, useful bound on the moments:&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\operatorname{E}[X^m] \le \left(\frac{m}{te}\right)^m M_X(t),&amp;lt;/math&amp;gt;&lt;br /&gt;
for any &amp;lt;math&amp;gt;X, m\ge 0&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;t&amp;gt;0&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
This follows from the inequality &amp;lt;math&amp;gt;1+x&amp;#039;\le e^{x&amp;#039;}&amp;lt;/math&amp;gt;: substituting &amp;lt;math&amp;gt;x&amp;#039;=tx/m-1&amp;lt;/math&amp;gt; gives &amp;lt;math&amp;gt;tx/m\le e^{tx/m-1}&amp;lt;/math&amp;gt; for any {{nowrap|&amp;lt;math&amp;gt;x, t, m \in \mathbb R&amp;lt;/math&amp;gt;.}}&lt;br /&gt;
Now, if &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;x,m\ge 0&amp;lt;/math&amp;gt;, this can be rearranged to &amp;lt;math&amp;gt;x^m \le (m/(te))^m e^{tx}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Taking the expectation on both sides gives the bound on &amp;lt;math&amp;gt;\operatorname{E}[X^m]&amp;lt;/math&amp;gt; in terms of &amp;lt;math&amp;gt;\operatorname{E}[e^{tX}]&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
As an example, consider a chi-squared random variable &amp;lt;math&amp;gt;X \sim \chi_k^2&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; degrees of freedom. Then from the [[Moment-generating function#Examples|examples]] above, &amp;lt;math&amp;gt;M_X(t) = (1-2t)^{-k/2}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Picking &amp;lt;math&amp;gt;t=m/(2m+k)&amp;lt;/math&amp;gt; and substituting into the bound:&lt;br /&gt;
&amp;lt;math display=&amp;quot;block&amp;quot;&amp;gt;\operatorname{E}[X^m] \le {\left(1 + 2m/k\right)}^{k/2} e^{-m} {\left(k + 2m\right)}^m.&amp;lt;/math&amp;gt;&lt;br /&gt;
We know that [[Chi-square distribution#Noncentral moments|in this case]] the exact value is &amp;lt;math&amp;gt;\operatorname{E}[X^m] = 2^m \Gamma(m+k/2)/\Gamma(k/2)&amp;lt;/math&amp;gt;.&lt;br /&gt;
To compare the two, we can consider the asymptotics for large &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;.&lt;br /&gt;
Here the moment-generating function bound is &amp;lt;math&amp;gt;k^m(1+m^2/k + O(1/k^2))&amp;lt;/math&amp;gt;,&lt;br /&gt;
while the exact value is &amp;lt;math&amp;gt;k^m(1+(m^2-m)/k + O(1/k^2))&amp;lt;/math&amp;gt;.&lt;br /&gt;
The moment-generating function bound is thus very strong in this case.&lt;br /&gt;
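&lt;br /&gt;
A numerical comparison (Python sketch; the values of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; are arbitrary) confirms how close the two quantities are:&lt;br /&gt;

```python
import math

# Compare the moment-generating-function bound with the exact chi-squared raw
# moment 2^m Gamma(m + k/2) / Gamma(k/2) for illustrative values k = 100, m = 3.
k, m = 100, 3
exact = 2.0 ** m * math.gamma(m + k / 2) / math.gamma(k / 2)
mgf_bound = (1.0 + 2.0 * m / k) ** (k / 2) * math.exp(-m) * (k + 2.0 * m) ** m
ratio = mgf_bound / exact   # roughly 1 + m/k for large k
```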
&lt;br /&gt;
==Relation to other functions==&lt;br /&gt;
Related to the moment-generating function are a number of other [[integral transform|transforms]] that are common in probability theory:&lt;br /&gt;
&lt;br /&gt;
;[[Characteristic function (probability theory)|Characteristic function]]: The [[characteristic function (probability theory)|characteristic function]] &amp;lt;math&amp;gt;\varphi_X(t)&amp;lt;/math&amp;gt; is related to the moment-generating function via &amp;lt;math&amp;gt;\varphi_X(t) = M_{iX}(t) = M_X(it):&amp;lt;/math&amp;gt; the characteristic function is the moment-generating function of &amp;#039;&amp;#039;iX&amp;#039;&amp;#039; or the moment generating function of &amp;#039;&amp;#039;X&amp;#039;&amp;#039; evaluated on the imaginary axis.  This function can also be viewed as the [[Fourier transform]] of the [[probability density function]], which can therefore be deduced from it by inverse Fourier transform.&lt;br /&gt;
;[[Cumulant-generating function]]: The [[cumulant-generating function]] is defined as the logarithm of the moment-generating function; some instead define the cumulant-generating function as the logarithm of the [[Characteristic function (probability theory)|characteristic function]], while others call this latter the &amp;#039;&amp;#039;second&amp;#039;&amp;#039; cumulant-generating function.&lt;br /&gt;
;[[Probability-generating function]]: The [[probability-generating function]] is defined as &amp;lt;math&amp;gt;G(z) = \operatorname{E}\left[z^X\right].&amp;lt;/math&amp;gt; This immediately implies that &amp;lt;math&amp;gt;G(e^t) = \operatorname{E}\left[e^{tX}\right] = M_X(t).&amp;lt;/math&amp;gt;&lt;br /&gt;
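&lt;br /&gt;
The identity &amp;lt;math&amp;gt;G(e^t) = M_X(t)&amp;lt;/math&amp;gt; is easy to confirm numerically, for example for the Poisson distribution (Python sketch; parameter values are arbitrary):&lt;br /&gt;

```python
import math

# For X ~ Poisson(lam), the probability-generating function is
# G(z) = exp(lam (z - 1)), so G(e^t) must equal the MGF exp(lam (e^t - 1)).
lam, t = 1.5, 0.4
def G(z):
    return math.exp(lam * (z - 1.0))

via_pgf = G(math.exp(t))
mgf = math.exp(lam * (math.exp(t) - 1.0))
```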
&lt;br /&gt;
==See also==&lt;br /&gt;
* [[Characteristic function (probability theory)]]&lt;br /&gt;
* [[Factorial moment generating function]]&lt;br /&gt;
* [[Rate function]]&lt;br /&gt;
* [[Hamburger moment problem]]&lt;br /&gt;
&lt;br /&gt;
{{More footnotes|date=February 2010}}&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
===Citations===&lt;br /&gt;
{{Reflist}}&lt;br /&gt;
&lt;br /&gt;
===Sources===&lt;br /&gt;
{{Refbegin}}&lt;br /&gt;
* {{cite book |last1=Casella |first1=George |last2=Berger |first2=Roger |title=Statistical Inference |year=2002 |edition=2nd |isbn = 978-0-534-24312-8 |pages=59–68 |publisher=Thomson Learning }}&lt;br /&gt;
{{Refend}}&lt;br /&gt;
&lt;br /&gt;
{{Clear}}&lt;br /&gt;
{{Theory of probability distributions}}&lt;br /&gt;
{{Authority control}}&lt;br /&gt;
&lt;br /&gt;
{{DEFAULTSORT:Moment-Generating Function}}&lt;br /&gt;
[[Category:Moments (mathematics)]]&lt;br /&gt;
[[Category:Generating functions]]&lt;/div&gt;</summary>
		<author><name>2A00:23C8:DE24:801:414:35E9:4E90:9155</name></author>
	</entry>
</feed>