Parameters

Probability mass function: the horizontal axis is the index k; the function is non-zero only at integer values of k, and connecting lines in plots are guides for the eye, not indications of continuity. Cumulative distribution function: the CDF is discontinuous at the integers of k and flat everywhere else, because a Poisson-distributed variable takes only integer values.

• Parameter: $\lambda \in (0,\infty)$
• Support: $k \in \{0,1,2,\ldots\}$
• Probability mass function: $\frac{e^{-\lambda} \lambda^k}{k!}$
• Cumulative distribution function: $\frac{\Gamma(\lfloor k+1\rfloor, \lambda)}{\lfloor k\rfloor !}\text{ for }k\ge 0$, where Γ(x,y) is the incomplete gamma function
• Mean: $\lambda$
• Median: $\text{usually about }\lfloor\lambda+1/3-0.02/\lambda\rfloor$
• Mode: $\lfloor\lambda\rfloor$, and also λ − 1 if λ is an integer
• Variance: $\lambda$
• Skewness: $\lambda^{-1/2}$
• Excess kurtosis: $\lambda^{-1}$
• Entropy: $\lambda[1\!-\!\ln(\lambda)]\!+\!e^{-\lambda}\sum_{k=0}^\infty \frac{\lambda^k\ln(k!)}{k!}$; for large λ, $\frac{1}{2}\log(2 \pi e \lambda) - \frac{1}{12 \lambda} - \frac{1}{24 \lambda^2} - \frac{19}{360 \lambda^3} + O\!\left(\frac{1}{\lambda^4}\right)$
• Moment-generating function: $\exp(\lambda (e^t-1))$
• Characteristic function: $\exp(\lambda (e^{it}-1))$

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a number of events occurring in a fixed period of time if these events occur with a known average rate and independently of the time since the last event. The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area or volume.

The distribution was discovered by Siméon-Denis Poisson (1781–1840) and published, together with his probability theory, in 1838 in his work Recherches sur la probabilité des jugements en matière criminelle et en matière civile ("Research on the Probability of Judgments in Criminal and Civil Matters"). The work focused on certain random variables N that count, among other things, a number of discrete occurrences (sometimes called "arrivals") that take place during a time-interval of given length. If the expected number of occurrences in this interval is λ, then the probability that there are exactly k occurrences (k being a non-negative integer, k = 0, 1, 2, . . . ) is equal to

$f(k; \lambda)=\frac{\lambda^k e^{-\lambda}}{k!},\,\!$

where

• e is the base of the natural logarithm (e = 2.71828. . . )
• k is the number of occurrences of an event, the probability of which is given by the function
• k! is the factorial of k
• λ is a positive real number, equal to the expected number of occurrences during the given interval. For instance, if the events occur on average 4 times per minute, and you are interested in the number of events occurring in a 10-minute interval, you would use as model a Poisson distribution with λ = 10 × 4 = 40.

As a function of k, this is the probability mass function. The Poisson distribution can be derived as a limiting case of the binomial distribution.
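The probability mass function is straightforward to evaluate numerically; the sketch below uses the λ = 10 × 4 = 40 example from above (the function name and the truncation at k = 120 are illustrative choices, not from the original):

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing exactly k events when the expected count is lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# The example from the text: events at 4 per minute over a 10-minute interval.
lam = 10 * 4
print(poisson_pmf(40, lam))                          # probability of exactly 40 events
print(sum(poisson_pmf(k, lam) for k in range(120)))  # probabilities sum to ~1
```

Summing the pmf over a range well past the mean confirms that the probabilities total one, up to floating-point rounding.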

The Poisson distribution can be applied to systems with a large number of possible events, each of which is rare. A classic example is the nuclear decay of atoms.

The Poisson distribution is sometimes called a Poissonian, analogous to the term Gaussian for a Gaussian or normal distribution.

## Poisson noise and characterizing small occurrences

The parameter λ is not only the mean number of occurrences $\scriptstyle\langle k \rangle$, but also its variance $\scriptstyle\sigma_k^2 \ =\ \langle k^{2} \rangle - \langle k \rangle^{2}$ (see Table). Thus, the number of observed occurrences fluctuates about its mean λ with a standard deviation $\scriptstyle\sigma_{k}\, =\, \sqrt{\lambda}$. These fluctuations are denoted as Poisson noise or (particularly in electronics) as shot noise.

The correlation of the mean and standard deviation in counting independent, discrete occurrences is useful scientifically. By monitoring how the fluctuations vary with the mean signal, one can estimate the contribution of a single occurrence, even if that contribution is too small to be detected directly. For example, the charge e on an electron can be estimated by correlating the magnitude of an electric current with its shot noise. If N electrons pass a point in a given time t on the average, the mean current is I = eN / t; since the current fluctuations should be of the order $\scriptstyle\sigma_{I} = e\sqrt{N/t\ }$ (i.e. the variance of the Poisson process), the charge e can be estimated from the ratio $\scriptstyle\sigma_{I}^{2}/I$. An everyday example is the graininess that appears as photographs are enlarged; the graininess is due to Poisson fluctuations in the number of reduced silver grains, not to the individual grains themselves. By correlating the graininess with the degree of enlargement, one can estimate the contribution of an individual grain (which is otherwise too small to be seen unaided). Many other molecular applications of Poisson noise have been developed, e.g., estimating the number density of receptor molecules in a cell membrane.
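A toy simulation can illustrate how the ratio $\sigma_{I}^{2}/I$ recovers the single-electron charge. All numbers below are illustrative, and the observation window is taken as the unit of time so that the current in each window is simply I = eN:

```python
import random
from statistics import mean, pvariance

random.seed(1)
e_true = 1.602e-19   # elementary charge in coulombs (illustrative target)
lam = 10.0           # mean number of electrons per unit-time window

def poisson_sample(lam):
    # Count arrivals of a rate-lam process in unit time via exponential gaps.
    total, k = random.expovariate(lam), 0
    while total < 1.0:
        k += 1
        total += random.expovariate(lam)
    return k

# Current in each unit window is I = e * N (charge per unit time).
currents = [e_true * poisson_sample(lam) for _ in range(20000)]
e_est = pvariance(currents) / mean(currents)   # sigma_I^2 / I recovers e
print(e_est)
```

Since Var(I) = e²λ and E[I] = eλ for unit windows, the ratio isolates e even though no single electron is resolved.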

For a Poisson process with rate λ, the number of events N_t occurring in an interval of length t is Poisson-distributed with mean λt:

$\Pr(N_t=k)=f(k;\lambda t)=\frac{e^{-\lambda t} (\lambda t)^k}{k!}.\,\!$

## Related distributions

• If $X_1 \sim \mathrm{Pois}(\lambda_1)\,$ and $X_2 \sim \mathrm{Pois}(\lambda_2)\,$ then the difference Y = X1 − X2 follows a Skellam distribution.
• If $X_1 \sim \mathrm{Pois}(\lambda_1)\,$ and $X_2 \sim \mathrm{Pois}(\lambda_2)\,$ are independent, and Y = X1 + X2, then the distribution of X1 conditional on Y = y is binomial. Specifically, $X_1|(Y=y) \sim \mathrm{Binom}(y, \lambda_1/(\lambda_1+\lambda_2))\,$. More generally, if X1, X2, . . . , Xn are Poisson random variables with parameters λ1, λ2, . . . , λn then $X_i \left|\sum_{j=1}^n X_j\right. \sim \mathrm{Binom}\left(\sum_{j=1}^nX_j,\frac{\lambda_i}{\sum_{j=1}^n\lambda_j}\right)$
• The Poisson distribution can be derived as a limiting case of the binomial distribution as the number of trials goes to infinity and the expected number of successes remains fixed. Therefore it can be used as an approximation of the binomial distribution if n is sufficiently large and p is sufficiently small. There is a rule of thumb stating that the Poisson distribution is a good approximation of the binomial distribution if n is at least 20 and p is smaller than or equal to 0.05. According to this rule the approximation is excellent if n ≥ 100 and np ≤ 10. [1]
• For sufficiently large values of λ (say λ > 1000), the normal distribution with mean λ and variance λ is an excellent approximation to the Poisson distribution. If λ is greater than about 10, then the normal distribution is a good approximation if an appropriate continuity correction is performed, i.e., P(X ≤ x), where (lower-case) x is a non-negative integer, is replaced by P(X ≤ x + 0.5).
$F_\mathrm{Poisson}(x;\lambda) \approx F_\mathrm{normal}(x;\mu=\lambda,\sigma^2=\lambda)\,$
• If the number of arrivals in a given time interval follows the Poisson distribution with mean λ, then the lengths of the inter-arrival times follow the exponential distribution with mean 1/λ.
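The binomial approximation above is easy to check numerically. The sketch below picks n = 100 and p = 0.05 (the "excellent approximation" regime of the rule of thumb; these values are illustrative) and measures the largest pointwise difference between the two pmfs:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# n >= 100 and np = 5 <= 10: the "excellent" regime from the rule of thumb.
n, p = 100, 0.05
lam = n * p
worst = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(n + 1))
print(f"largest pointwise pmf discrepancy: {worst:.4f}")
```

The discrepancy stays well below one percentage point, consistent with the rule of thumb.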

## Occurrence

The Poisson distribution arises in connection with Poisson processes. It applies to various phenomena of discrete nature (that is, those that may happen 0, 1, 2, 3, . . . times during a given period of time or in a given area) whenever the probability of the phenomenon happening is constant in time or space. Examples of events that may be modelled as a Poisson distribution include:

• The number of soldiers killed by horse-kicks each year in each corps in the Prussian cavalry. This example was made famous by a book of Ladislaus Josephovich Bortkiewicz (1868–1931).
• The number of phone calls at a call center per minute.
• The number of times a web server is accessed per minute.
• The number of mutations in a given stretch of DNA after a certain amount of radiation.

[Note: the intervals between successive Poisson events follow the exponential distribution; examples are the lifetime of a lightbulb or the waiting time between buses.]
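The connection between exponential gaps and Poisson counts can be demonstrated by simulation. In this sketch (the rate λ = 4 and the 5000-window horizon are arbitrary choices) arrivals are generated as a running sum of exponential gaps, then counted per unit-length window:

```python
import random
from statistics import mean, pvariance

random.seed(7)
lam = 4.0       # illustrative arrival rate: mean events per unit time
horizon = 5000  # number of unit-length windows to observe

# Draw arrival times as a running sum of exponential gaps with mean 1/lam,
# then count how many arrivals fall in each unit-length window.
counts = [0] * horizon
arrival = random.expovariate(lam)
while arrival < horizon:
    counts[int(arrival)] += 1
    arrival += random.expovariate(lam)

# For a Poisson process, each window's count is Poisson(lam): mean == variance.
print(f"sample mean {mean(counts):.2f}, sample variance {pvariance(counts):.2f}")
```

Both sample statistics land near λ = 4, the mean-equals-variance signature of the Poisson distribution.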

## How does this distribution arise? — The law of rare events

In several of the above examples (such as the number of mutations in a given sequence of DNA), the events being counted are actually the outcomes of discrete trials, and would more precisely be modelled using the binomial distribution. However, the binomial distribution with parameters n and λ/n, i.e., the probability distribution of the number of successes in n trials with probability λ/n of success on each trial, approaches the Poisson distribution with expected value λ as n approaches infinity. This provides a means by which to approximate such random variables using the Poisson distribution rather than the more cumbersome binomial distribution.

This limit is sometimes known as the law of rare events, since each of the individual Bernoulli events rarely occurs. The name may be misleading because the total count of success events in a Poisson process need not be rare if the parameter λ is not small. For example, the number of telephone calls to a busy switchboard in one hour follows a Poisson distribution with the events appearing frequent to the operator, but they are rare from the point of view of the average member of the population, who is very unlikely to make a call to that switchboard in that hour.

Here are the details. First, recall from calculus that

$\lim_{n\to\infty}\left(1-{\lambda \over n}\right)^n=e^{-\lambda}.$

Let p = λ/n. Then we have

$\lim_{n\to\infty} P(X=k)=\lim_{n\to\infty}{n \choose k} p^k (1-p)^{n-k}=\lim_{n\to\infty}{n! \over (n-k)!k!} \left({\lambda \over n}\right)^k \left(1-{\lambda\over n}\right)^{n-k}$
$=\lim_{n\to\infty}\underbrace{\left[\frac{n!}{n^k\left(n-k\right)!}\right]}_F\left(\frac{\lambda^k}{k!}\right)\underbrace{\left(1-\frac{\lambda}{n}\right)^n}_{\approx\exp\left(-\lambda\right)}\underbrace{\left(1-\frac{\lambda}{n}\right)^{-k}}_{\approx 1} =F\exp\left(-\lambda\right)\left(\frac{\lambda^k}{k!}\right)$

For the F term, first take its logarithm:

$\log\left(F\right) =\log\left(n!\right) - k\log\left(n\right) - \log\left[\left(n-k\right)!\right]$

Using Stirling's formula for large n,

$\log\left(n!\right) \approx n\log\left(n\right) - n$

The expression for $\log\left(F\right)$ can be further simplified to

$\log\left(F\right) \approx \left[n\log\left(n\right) - n\right] -\left[k\log\left(n\right)\right] -\left[\left(n-k\right)\log\left(n-k\right)-\left(n-k\right)\right]$
$= \left(n-k\right)\log\left(\frac{n}{n-k}\right) - k$
$= \underbrace{-\left(1-\frac{k}{n}\right)}_{\approx -1}\ \underbrace{\log\left[\left(1-\frac{k}{n}\right)^n\right]}_{\to\, -k} - k \to k - k = 0$

Therefore $\lim_{n\to\infty}F = \exp\left(0\right) = 1$.

Consequently the limit of the distribution becomes

${\lambda^k \exp\left(-\lambda\right) \over k!}.\,\!$

which is the probability mass function of the Poisson distribution.

More generally, whenever a sequence of binomial random variables with parameters n and pn is such that

$\lim_{n\rightarrow\infty} np_n = \lambda,$

the sequence converges in distribution to a Poisson random variable with mean λ (see, e.g., the law of rare events above).

## Properties

• The expected value of a Poisson-distributed random variable is equal to λ and so is its variance. The higher moments of the Poisson distribution are Touchard polynomials in λ, whose coefficients have a combinatorial meaning. In fact, when the expected value of the Poisson distribution is 1, then Dobinski's formula says that the nth moment equals the number of partitions of a set of size n.
• The mode of a Poisson-distributed random variable with non-integer λ is equal to $\scriptstyle\lfloor \lambda \rfloor$, the largest integer less than or equal to λ (also written as floor(λ)). When λ is a positive integer, the modes are λ and λ − 1.
• Sums of Poisson-distributed random variables:
If $X_i \sim \mathrm{Poi}(\lambda_i)\,$ follow a Poisson distribution with parameter $\lambda_i\,$ and Xi are independent, then $Y = \sum_{i=1}^N X_i \sim \mathrm{Poi}\left(\sum_{i=1}^N \lambda_i\right)\,$ also follows a Poisson distribution whose parameter is the sum of the component parameters.
• The moment-generating function of the Poisson distribution with expected value λ is
$\mathrm{E}\left(e^{tX}\right)=\sum_{k=0}^\infty e^{tk} f(k;\lambda)=\sum_{k=0}^\infty e^{tk} {\lambda^k e^{-\lambda} \over k!} =e^{\lambda(e^t-1)}.$
• All of the cumulants of the Poisson distribution are equal to the expected value λ. The nth factorial moment of the Poisson distribution is $\lambda^n$.
• The Poisson distributions are infinitely divisible probability distributions.
• The directed Kullback–Leibler divergence between Poi(λ0) and Poi(λ) is given by
$\Delta(\lambda||\lambda_0) = \lambda \left( 1 - \frac{\lambda_0}{\lambda} + \frac{\lambda_0}{\lambda} \log \frac{\lambda_0}{\lambda} \right).$
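The closed-form divergence above can be checked against the definition as an explicit sum over the pmf. This is a sketch; the truncation at k = 100 assumes the tail is negligible, and λ = 3, λ0 = 5 are arbitrary test values:

```python
import math

def pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def delta(lam, lam0):
    # Closed form from the text.
    r = lam0 / lam
    return lam * (1 - r + r * math.log(r))

def kl_by_sum(lam, lam0):
    # Divergence of Poi(lam0) from Poi(lam) as a sum over the pmf,
    # truncated at k = 100 where the Poi(lam0) tail is negligible.
    return sum(pmf(k, lam0) * math.log(pmf(k, lam0) / pmf(k, lam))
               for k in range(100))

print(delta(3.0, 5.0), kl_by_sum(3.0, 5.0))  # the two values agree
```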

### Generating Poisson-distributed random variables

A simple way to generate random Poisson-distributed numbers is given by Knuth, see References below.

algorithm poisson random number (Knuth):
    init:
        Let L ← e^(−λ), k ← 0 and p ← 1.
    do:
        k ← k + 1.
        Generate uniform random number u in [0,1] and let p ← p × u.
    while p ≥ L.
    return k − 1.

While simple, the complexity is linear in λ. There are many other algorithms to overcome this. Some are given in Ahrens & Dieter, see References below.
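A direct transcription of the pseudocode above into Python might look like the following (the function name and the λ = 3.5 test value are illustrative):

```python
import math
import random

def knuth_poisson(lam):
    """One Poisson(lam) variate: multiply uniforms together until the
    product drops below e^(-lam), as in Knuth's pseudocode above."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p >= L:
        k += 1
        p *= random.random()
    return k - 1

random.seed(0)
samples = [knuth_poisson(3.5) for _ in range(20000)]
print(sum(samples) / len(samples))  # close to lam = 3.5
```

The expected number of uniform draws per variate is λ + 1, which is the linear cost in λ noted above.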

## Parameter estimation

### Maximum likelihood

Given a sample of n measured values ki we wish to estimate the value of the parameter λ of the Poisson population from which the sample was drawn. To calculate the maximum likelihood value, we form the log-likelihood function

$L(\lambda) = \ln \prod_{i=1}^n f(k_i \mid \lambda) \!$
$= \sum_{i=1}^n \ln\!\left(\frac{e^{-\lambda}\lambda^{k_i}}{k_i!}\right) \!$
$= -n\lambda + \left(\sum_{i=1}^n k_i\right) \ln(\lambda) - \sum_{i=1}^n \ln(k_i!). \!$

Take the derivative of L with respect to λ and equate it to zero:

$\frac{\mathrm{d}}{\mathrm{d}\lambda} L(\lambda) = 0\iff -n + \left(\sum_{i=1}^n k_i\right) \frac{1}{\lambda} = 0 \!$

Solving for λ yields the maximum-likelihood estimate of λ:

$\widehat{\lambda}_\mathrm{MLE}=\frac{1}{n}\sum_{i=1}^n k_i. \!$

Since each observation has expectation λ, so does the sample mean. Therefore the maximum likelihood estimate is an unbiased estimator of λ. It is also an efficient estimator, i.e., its estimation variance achieves the Cramér–Rao lower bound (CRLB).
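As a concrete check that the sample mean maximizes the log-likelihood, one can evaluate L(λ) at the sample mean and at nearby candidate values (the observed counts below are hypothetical):

```python
import math

def log_likelihood(lam, ks):
    # The log-likelihood L(lambda) derived above.
    return sum(-lam + k * math.log(lam) - math.log(math.factorial(k)) for k in ks)

ks = [2, 5, 3, 4, 1, 6, 3, 2]   # hypothetical observed counts
lam_mle = sum(ks) / len(ks)     # closed-form MLE: the sample mean

# The sample mean beats nearby candidate values of lambda.
for lam in (lam_mle - 0.5, lam_mle, lam_mle + 0.5):
    print(f"lambda = {lam:.2f}: log-likelihood = {log_likelihood(lam, ks):.4f}")
```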

### Bayesian inference

In Bayesian inference, the conjugate prior for the rate parameter λ of the Poisson distribution is the Gamma distribution. Let

$\lambda \sim \mathrm{Gamma}(\alpha, \beta) \!$

denote that λ is distributed according to the Gamma density g parameterized in terms of a shape parameter α and an inverse scale parameter β:

$g(\lambda \mid \alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \; \lambda^{\alpha-1} \; e^{-\beta\,\lambda} \qquad \mbox{for}\ \lambda>0 \,\!.$

Then, given the same sample of n measured values ki as before, and a prior of Gamma(α, β), the posterior distribution is

$\lambda \sim \mathrm{Gamma}(\alpha + \sum_{i=1}^n k_i, \beta + n). \!$

The posterior mean E[λ] approaches the maximum likelihood estimate $\widehat{\lambda}_\mathrm{MLE}$ in the limit as $\alpha\to 0,\ \beta\to 0$.

The posterior predictive distribution of additional data is a Gamma-Poisson (i.e., negative binomial) distribution.
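The conjugate update above amounts to adding the total event count to α and the number of observations to β. A sketch (the counts and the Gamma(1, 1) prior hyperparameters are arbitrary illustrative choices):

```python
ks = [2, 5, 3, 4, 1, 6, 3, 2]   # hypothetical observed counts
alpha, beta = 1.0, 1.0          # arbitrary Gamma(alpha, beta) prior

# Conjugate update: shape gains the total count, inverse scale the sample size.
alpha_post = alpha + sum(ks)
beta_post = beta + len(ks)

posterior_mean = alpha_post / beta_post
mle = sum(ks) / len(ks)
print(posterior_mean, mle)  # posterior mean lies between prior mean and MLE
```

Setting α = β = 0 in the update reproduces the maximum likelihood estimate exactly, matching the limit stated above.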

## The "law of small numbers"

The word law is sometimes used as a synonym of probability distribution, and convergence in law means convergence in distribution. Accordingly, the Poisson distribution is sometimes called the law of small numbers because it is the probability distribution of the number of occurrences of an event that happens rarely but has very many opportunities to happen. The Law of Small Numbers is a book by Ladislaus Bortkiewicz about the Poisson distribution, published in 1898. Some historians of mathematics have argued that the Poisson distribution should have been called the Bortkiewicz distribution. [2]

## See also

• Anscombe transform, a variance-stabilising transformation for the Poisson distribution
• Compound Poisson distribution
• Tweedie distributions
• Poisson process
• Poisson regression
• Poisson sampling
• Queueing theory
• Erlang distribution, which describes the waiting time until n events have occurred. For temporally distributed events, the Poisson distribution is the probability distribution of the number of events that would occur within a preset time; the Erlang distribution is the probability distribution of the amount of time until the nth event.
• Skellam distribution, the distribution of the difference of two Poisson variates, not necessarily from the same parent distribution.
• Incomplete gamma function, used to calculate the CDF.
• Dobinski's formula (on combinatorial interpretation of the moments of the Poisson distribution)
• Schwarz formula
• Robbins lemma, a lemma relevant to empirical Bayes methods relying on the Poisson distribution
• Coefficient of dispersion, a simple measure to assess whether observed events are close to Poisson

## References

1. ^ NIST/SEMATECH, '6.3.3.1. Counts Control Charts', e-Handbook of Statistical Methods, accessed 25 October 2006
2. ^ I. J. Good, Some statistical applications of Poisson's work, Statist. Sci. 1 (2) (1986), 157–180. JSTOR link
• Donald E. Knuth (1969). Seminumerical Algorithms, The Art of Computer Programming, Volume 2. Addison-Wesley.
• Joachim H. Ahrens, Ulrich Dieter (1974). "Computer Methods for Sampling from Gamma, Beta, Poisson and Binomial Distributions". Computing 12 (3): 223–246. doi:10.1007/BF02293108.
• Joachim H. Ahrens, Ulrich Dieter (1982). "Computer Generation of Poisson Deviates". ACM Transactions on Mathematical Software 8 (2): 163–179. doi:10.1145/355993.355997.
• Ronald J. Evans, J. Boersma, N. M. Blachman, A. A. Jagers (1988). "The Entropy of a Poisson Distribution: Problem 87-6". SIAM Review 30 (2): 314–317.