Zeta distribution

{{Probability distribution
 | name       = Zeta distribution
 | parameters = <math>s > 1</math>
 | support    = <math>k \in \{1, 2, 3, \ldots\}</math>
 | pmf        = <math>\frac{k^{-s}}{\zeta(s)}</math>
 | cdf        = <math>\frac{H_{k,s}}{\zeta(s)}</math>
 | mean       = <math>\frac{\zeta(s-1)}{\zeta(s)}~\textrm{for}~s>2</math>
 | mode       = <math>1\,</math>
 | variance   = <math>\frac{\zeta(s)\zeta(s-2) - \zeta(s-1)^2}{\zeta(s)^2}~\textrm{for}~s>3</math>
 | entropy    = <math>\sum_{k=1}^\infty\frac{\log (k^s \zeta(s))}{k^s\zeta(s)}</math>
 | mgf        = does not exist
 | char       = <math>\frac{\operatorname{Li}_s(e^{it})}{\zeta(s)}</math>
 | pgf        = <math>\frac{\operatorname{Li}_s(z)}{\zeta(s)}</math>
}}

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s, then the probability that X takes the positive integer value k is given by the probability mass function

<math display="block">f_s(k) = \frac{k^{-s}}{\zeta(s)} </math>

where ζ(s) is the Riemann zeta function (which is undefined for s = 1).

The multiplicities of distinct prime factors of X are independent random variables.

Since the Riemann zeta function is the sum of the terms <math>k^{-s}</math> over all positive integers k, it appears here as the normalization of the Zipf distribution. The terms "Zipf distribution" and "zeta distribution" are often used interchangeably, but while the zeta distribution is a probability distribution in its own right, it is not associated with Zipf's law with the same exponent.

Definition

The Zeta distribution is defined for positive integers <math>k \geq 1</math>, and its probability mass function is given by <math display="block"> P(X=k) = \frac 1 {\zeta(s)} k^{-s}, </math> where <math>s>1</math> is the parameter, and <math>\zeta(s)</math> is the Riemann zeta function.

The cumulative distribution function is given by <math display="block">P(X \leq k) = \frac{H_{k,s}}{\zeta(s)},</math> where <math>H_{k,s}</math> is the generalized harmonic number <math display="block">H_{k,s} = \sum_{i=1}^k \frac 1 {i^s}.</math>
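Both functions are easy to evaluate numerically. The following is a minimal Python sketch (assuming SciPy is available; scipy.stats.zipf implements this distribution, with its parameter a playing the role of s here):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import zeta   # zeta(s) is the Riemann zeta function
from scipy.stats import zipf     # SciPy's name for the zeta distribution

s = 3.0
k = np.arange(1, 6)

pmf = k**(-s) / zeta(s)          # P(X = k) = k^{-s} / zeta(s)
cdf = np.cumsum(pmf)             # P(X <= k) = H_{k,s} / zeta(s)

# Cross-check against SciPy's built-in implementation.
assert np.allclose(pmf, zipf.pmf(k, s))
assert np.allclose(cdf, zipf.cdf(k, s))
print(pmf)
</syntaxhighlight>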

Moments

The nth raw moment is defined as the expected value of <math>X^n</math>:

<math display="block">m_n = E(X^n) = \frac{1}{\zeta(s)}\sum_{k=1}^\infty \frac{1}{k^{s-n}}</math>

The series on the right is just the series representation of the Riemann zeta function evaluated at <math>s-n</math>, and it converges only when <math>s-n</math> is greater than one. Thus:

<math display="block">m_n = \begin{cases} \zeta(s-n)/\zeta(s) & \text{for } n < s-1 \\ \infty & \text{for } n \ge s-1 \end{cases} </math>

The ratio of the zeta functions is well-defined, even for n > s − 1 because the series representation of the zeta function can be analytically continued. This does not change the fact that the moments are specified by the series itself, and are therefore undefined for large n.
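As a quick numerical illustration of the two cases (a Python sketch using SciPy; the truncated sums below stand in for the infinite series):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import zeta

s = 3.5
k = np.arange(1, 1_000_001, dtype=float)
f = k**(-s) / zeta(s)                    # pmf values f_s(k)

# n < s - 1: the truncated sum approaches zeta(s - n) / zeta(s).
for n in (1, 2):
    print(n, (k**n * f).sum(), zeta(s - n) / zeta(s))

# n = 3 >= s - 1: the terms k^{n-s} are not summable, so E[X^3] is infinite;
# the partial sum below keeps growing as the truncation point is raised.
print((k**3 * f).sum())
</syntaxhighlight>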

Moment generating function

The moment generating function is defined as

<math display="block">M(t;s) = E(e^{tX}) = \frac{1}{\zeta(s)} \sum_{k=1}^\infty \frac{e^{tk}}{k^s}.</math>

The series is just the definition of the polylogarithm, valid for <math>e^t<1</math> so that

<math display="block">M(t;s) = \frac{\operatorname{Li}_s(e^t)}{\zeta(s)}\text{ for }t<0.</math>

Since this does not converge on an open interval containing <math> t=0</math>, the moment generating function does not exist.
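The identity for t < 0 can be checked numerically, for example with Python's mpmath library (whose polylog function is Li<sub>s</sub>):

<syntaxhighlight lang="python">
import mpmath as mp

s, t = mp.mpf(2), mp.mpf(-0.5)           # any s > 1 and t < 0

# Direct series: sum_{k >= 1} e^{tk} / (k^s zeta(s)) ...
series = mp.nsum(lambda k: mp.exp(t*k) / k**s, [1, mp.inf]) / mp.zeta(s)
# ... versus the closed form Li_s(e^t) / zeta(s).
closed = mp.polylog(s, mp.exp(t)) / mp.zeta(s)

print(series, closed)                    # the two agree for t < 0
# For any t > 0 the terms e^{tk} k^{-s} grow without bound, so E[e^{tX}] diverges.
</syntaxhighlight>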

The case s = 1

ζ(1) is infinite (the defining series is the harmonic series), so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if

<math display="block">\lim_{n\to\infty}\frac{N(A,n)}{n}</math>

exists, where N(A, n) is the number of members of A less than or equal to n, then

<math display="block">\lim_{s\to 1^+}P(X\in A)\,</math>

is equal to that density.
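For instance, if A is the set of positive integers divisible by a fixed integer m, a set of density 1/m, then

<math display="block">P(X\in A) = \sum_{j=1}^\infty \frac{(mj)^{-s}}{\zeta(s)} = m^{-s},</math>

which indeed tends to 1/m as <math>s\to 1^+</math>.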

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first digit is d, then A has no density, but nonetheless, the second limit given above exists and is proportional to

<math display="block">\log(d+1) - \log(d) = \log\left(1+\frac{1}{d}\right),\,</math>

which is Benford's law.
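This limit can also be checked numerically. The sketch below (Python with mpmath, assumed available) sums <math>k^{-s}</math> over each block of integers with leading digit d using the Hurwitz zeta function, and compares the result with Benford's law as s decreases toward 1; the outer sum over blocks is truncated, which is harmless because its terms decay geometrically with ratio <math>10^{1-s}</math>:

<syntaxhighlight lang="python">
import mpmath as mp

def first_digit_prob(d, s, blocks=300):
    """P(leading digit of X is d) for X ~ zeta(s): sum k^{-s} over the blocks
    [d*10^j, (d+1)*10^j), using sum_{k=a}^{b-1} k^{-s} = zeta(s, a) - zeta(s, b),
    where zeta(s, a) is the Hurwitz zeta function."""
    total = mp.mpf(0)
    for j in range(blocks):
        a, b = mp.mpf(d) * 10**j, mp.mpf(d + 1) * 10**j
        total += mp.zeta(s, a) - mp.zeta(s, b)
    return total / mp.zeta(s)

for s in (2.0, 1.2, 1.05):
    print(s, [mp.nstr(first_digit_prob(d, s), 4) for d in (1, 2, 9)])

# Benford's law, log10(1 + 1/d), for comparison:
print([mp.nstr(mp.log10(1 + mp.mpf(1)/d), 4) for d in (1, 2, 9)])
</syntaxhighlight>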

Infinite divisibility

The Zeta distribution can be constructed from a sequence of independent geometrically distributed random variables, one for each prime. Let <math>p</math> be a prime number and <math>X(p^{-s})</math> be a random variable with a geometric distribution of parameter <math>p^{-s}</math>, namely

<math display="block"> \mathbb{P}\left( X(p^{-s}) = k \right) = p^{-ks } (1 - p^{-s} )</math>

If the random variables <math>( X(p^{-s}) )_{p \in \mathcal{P} }</math>, indexed by the set of prime numbers <math>\mathcal{P}</math>, are independent, then the random variable <math>Z_s</math> defined by

<math display="block"> Z_s = \prod_{p \in \mathcal{P} } p^{ X(p^{-s}) }</math>

has the zeta distribution: <math>\mathbb{P}\left( Z_s = n \right) = \frac{1}{ n^s \zeta(s) }</math>.
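This construction lends itself to simulation. The following Python sketch (assuming NumPy and SciPy) draws the geometric exponents for each prime and compares the empirical law of the product with <math>1/(n^s\zeta(s))</math>; the product over all primes is truncated at the primes below 100, which matters only with the small probability, for s = 2, that a larger prime divides <math>Z_s</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import zeta

rng = np.random.default_rng(0)
s = 2.0
primes = np.array([2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47,
                   53, 59, 61, 67, 71, 73, 79, 83, 89, 97], dtype=float)

# numpy's geometric distribution counts trials until the first success
# (support 1, 2, ...), so subtracting 1 gives P(X = k) = p^{-ks}(1 - p^{-s})
# on k = 0, 1, 2, ... with success probability 1 - p^{-s}.
n_samples = 100_000
exponents = rng.geometric(1 - primes**(-s), size=(n_samples, primes.size)) - 1
Z = np.prod(primes**exponents, axis=1)

for n in (1, 2, 3, 4, 6):
    print(n, np.mean(Z == n), 1 / (n**s * zeta(s)))
</syntaxhighlight>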

Stated differently, the random variable <math>\log(Z_s) = \sum_{p \in \mathcal{P} } X(p^{-s}) \, \log(p)</math> is infinitely divisible with Lévy measure given by the following sum of Dirac masses:

<math display="block"> \Pi_s(dx) = \sum_{p \in \mathcal{P} } \sum_{k \geqslant 1 } \frac{p^{-k s}}{k} \delta_{k \log(p) }(dx)</math>

See also

Other "power-law" distributions

