We now turn our attention to discrete distributions. In particular, the focus is on counting distributions. These are the discrete distributions that have positive probabilities only on the non-negative integers 0, 1, 2, 3, …. One important application is finding suitable counting distributions for modeling the number of losses or the number of claims to an insurer, or more generally the number of other random events of interest in actuarial applications. From a claims perspective, these counting distributions are models for claim frequency. Combining frequency models with models for claim severity provides a more complete picture of the insurer's risk exposure than using claim severity alone. This post and several subsequent posts are preparation for the discussion on modeling aggregate losses and claims. Another upcoming topic is the effect of insurance coverage modifications (e.g. deductibles) on claim frequency and claim severity.
This post focuses on the three commonly used counting distributions – Poisson, binomial and negative binomial (the big 3). These three distributions are the basis for defining a large class of other counting distributions.
Probability Generating Function
Let $N$ be a random variable with positive probabilities only on the non-negative integers, i.e. $P[N=k]$ is positive only for $k=0,1,2,\dots$. The function $p_k=P[N=k]$ is the probability of the occurrence of the event $N=k$, i.e. the observed value of the random variable $N$ is $k$. It is called the probability function of the random variable $N$ (also called probability mass function). From the probability function, many other distributional quantities can be derived, e.g. mean, variance and higher moments.
We can also elicit information about $N$ from its generating function. The generating function (or probability generating function) of $N$ is defined by:

$$P_N(z)=E[z^N]=\sum_{k=0}^\infty p_k \, z^k$$

where each $p_k=P[N=k]$. The generating function is defined wherever the infinite sum converges. At minimum, $P_N(z)$ converges for $\lvert z \rvert \le 1$. Some converge for all real numbers $z$, e.g. when $N$ has a Poisson distribution (see below).
One reason for paying attention to the generating function is that the moments of $N$ can be generated from $P_N(z)$. Taking the $k$th derivative of $P_N(z)$ and evaluating it at $z=1$ gives:

$$P_N^{(k)}(1)=E[N(N-1)\cdots(N-k+1)]$$

The above expectation is called the $k$th factorial moment. It follows that $E[N]=P_N^{(1)}(1)$. Since $P_N^{(2)}(1)=E[N(N-1)]=E[N^2]-E[N]$, the second moment is $E[N^2]=P_N^{(2)}(1)+P_N^{(1)}(1)$. In general, the $k$th moment $E[N^k]$ can be expressed in terms of the factorial moments $P_N^{(j)}(1)$ for $j=1,\dots,k$.
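As a quick numerical sketch (in Python, with an illustrative parameter value), the first two factorial moments of a Poisson distribution recover its mean and variance exactly as the formulas above say:

```python
import math

lam = 2.5   # illustrative Poisson parameter
K = 60      # truncation point; the tail beyond K is negligible here

# build the Poisson probability function p_k recursively to avoid overflow
probs = [math.exp(-lam)]
for k in range(1, K + 1):
    probs.append(probs[-1] * lam / k)

# first two factorial moments: P'(1) = E[N], P''(1) = E[N(N-1)]
m1 = sum(k * p for k, p in enumerate(probs))
m2_fact = sum(k * (k - 1) * p for k, p in enumerate(probs))

mean = m1
second_moment = m2_fact + m1          # E[N^2] = P''(1) + P'(1)
variance = second_moment - mean ** 2

print(mean, variance)   # both should be close to lam = 2.5
```

Both printed values come out to (approximately) $\lambda=2.5$, consistent with the Poisson facts discussed below.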
Another application of the generating function is that $P_N(z)$ encodes the probability function, which is obtained by taking the derivatives of $P_N(z)$ and evaluating them at $z=0$:

$$p_k=P[N=k]=\frac{P_N^{(k)}(0)}{k!}$$

where $k=0,1,2,\dots$. Another useful property of the generating function is that the probability distribution of a random variable is uniquely determined by its generating function. This fundamental property is useful in determining the distribution of an independent sum. The generating function of an independent sum of random variables is simply the product of the individual generating functions. If the product is the generating function of a certain distribution, then the independent sum must be of that distribution.
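To sketch the extraction of probabilities from the generating function, take the Poisson case, where the derivatives of $P_N(z)=e^{\lambda(z-1)}$ have the closed form $\lambda^k e^{\lambda(z-1)}$. Evaluating at $z=0$ and dividing by $k!$ reproduces the probability function (the parameter value below is illustrative):

```python
import math

lam = 1.7   # illustrative Poisson parameter

# the k-th derivative of P(z) = exp(lam*(z-1)) is lam**k * exp(lam*(z-1));
# evaluated at z = 0 this is lam**k * exp(-lam)
def pgf_deriv_at_zero(k):
    return lam ** k * math.exp(-lam)

for k in range(6):
    from_pgf = pgf_deriv_at_zero(k) / math.factorial(k)   # p_k = P^(k)(0)/k!
    direct = math.exp(-lam) * lam ** k / math.factorial(k)
    print(k, from_pgf, direct)
```

The two columns agree for every $k$, as the formula $p_k=P_N^{(k)}(0)/k!$ requires.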
For a more detailed discussion on probability generating function, see this blog post in a companion blog.
Poisson Distribution

We now describe the three counting distributions indicated at the beginning of the post. We start with the Poisson distribution. Consider a random variable $N$ that takes on only the non-negative integers. For each $k=0,1,2,\dots$, let $p_k=P[N=k]$.

The random variable $N$ has a Poisson distribution if its probability function is:

$$P[N=k]=\frac{e^{-\lambda} \, \lambda^k}{k!} \ \ \ \ \ k=0,1,2,\dots$$

for some positive constant $\lambda$. This constant $\lambda$ is the parameter of the Poisson distribution in question. It is also the mean and variance of the Poisson distribution. The following is the probability generating function of the Poisson distribution.

$$P_N(z)=e^{\lambda (z-1)}$$

The Poisson generating function is defined for all real numbers $z$. The mean and variance and higher moments can be computed using the generating function: $E[N]=P_N^{(1)}(1)=\lambda$ and $Var(N)=P_N^{(2)}(1)+P_N^{(1)}(1)-\left(P_N^{(1)}(1)\right)^2=\lambda$.
One interesting characteristic of the Poisson distribution is that its mean is the same as its variance. From a mathematical standpoint, the Poisson distribution arises from the Poisson process (see a more detailed discussion here). Another discussion of the Poisson distribution is found here.
One useful characteristic of the Poisson distribution is that combining independent Poisson distributions results in another Poisson distribution. Suppose that $N_1,N_2,\dots,N_n$ are independent Poisson random variables with means $\lambda_1,\lambda_2,\dots,\lambda_n$, respectively. Then the probability generating function of the sum $N=N_1+N_2+\cdots+N_n$ is simply the product of the individual probability generating functions:

$$P_N(z)=\prod_{i=1}^n e^{\lambda_i (z-1)}=e^{\lambda (z-1)}$$

where $\lambda=\lambda_1+\lambda_2+\cdots+\lambda_n$. The probability generating function of the sum is the generating function of a Poisson distribution. Thus an independent sum of Poisson distributions is a Poisson distribution with parameter being the sum of the individual Poisson parameters.
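This fact can be checked numerically: convolving two Poisson probability functions recovers the Poisson probability function with the summed parameter (the parameter values below are illustrative):

```python
import math

def poisson_pf(lam, K):
    """Poisson probabilities p_0, ..., p_K, built recursively."""
    probs = [math.exp(-lam)]
    for k in range(1, K + 1):
        probs.append(probs[-1] * lam / k)
    return probs

lam1, lam2, K = 1.2, 2.3, 40
p1 = poisson_pf(lam1, K)
p2 = poisson_pf(lam2, K)

# P[N1 + N2 = k] by direct convolution of the two probability functions
conv = [sum(p1[j] * p2[k - j] for j in range(k + 1)) for k in range(K + 1)]

target = poisson_pf(lam1 + lam2, K)
max_diff = max(abs(a - b) for a, b in zip(conv, target))
print(max_diff)   # essentially zero
```

The convolution matches the Poisson probability function with parameter $\lambda_1+\lambda_2$ to machine precision.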
Another useful property is that of splitting a Poisson distribution. For example, suppose that the number of claims $N$ in a given year follows a Poisson distribution with mean $\lambda$ per year. Also suppose that the claims can be classified into $m$ distinct types such that the probability of a claim being of type $i$ is $p_i$, and such that $p_1+p_2+\cdots+p_m=1$. If we are interested in studying the numbers of claims in a year that are of type $i$, $i=1,2,\dots,m$, then $N_1,N_2,\dots,N_m$ are independent Poisson random variables with means $\lambda p_1,\lambda p_2,\dots,\lambda p_m$, respectively. For a mathematical discussion of this Poisson splitting phenomenon, see this blog post in a companion blog.
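The splitting property can also be verified numerically: summing over a Poisson total and thinning it binomially with type probability $p_i$ reproduces a Poisson probability function with mean $\lambda p_i$ (illustrative parameters):

```python
import math

lam, p_i, K = 3.0, 0.25, 80   # total claim rate and probability of type i

def poisson_pf(lam, K):
    probs = [math.exp(-lam)]
    for k in range(1, K + 1):
        probs.append(probs[-1] * lam / k)
    return probs

total = poisson_pf(lam, K)

# P[type-i count = k] = sum over n of P[N = n] * P[Binomial(n, p_i) = k]
def split_pf(k):
    return sum(total[n] * math.comb(n, k) * p_i ** k * (1 - p_i) ** (n - k)
               for n in range(k, K + 1))

thinned = [split_pf(k) for k in range(15)]
target = poisson_pf(lam * p_i, 14)
max_diff = max(abs(a - b) for a, b in zip(thinned, target))
print(max_diff)   # essentially zero
```

The thinned probabilities agree with a Poisson probability function with parameter $\lambda p_i$, as claimed.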
Binomial Distribution

Consider a series of independent events each of which results in one of two distinct outcomes (one is called Success and the other Failure) in such a way that the probability of observing a Success in a trial is constant across all trials (these are called Bernoulli trials). For a binomial distribution, we observe $n$ such trials and count the number of Successes in these trials.

More specifically, let $p$ be the probability of observing a Success in a Bernoulli trial. Let $N$ be the number of Successes observed in $n$ independent trials. Then the random variable $N$ is said to have a binomial distribution with parameters $n$ and $p$.
Note that the random variable $N$ is the independent sum $N=X_1+X_2+\cdots+X_n$ where $X_i$ is the number of Successes in the $i$th Bernoulli trial. Thus $X_i$ is 1 with probability $p$ and is 0 with probability $1-p$. Its probability generating function would be:

$$P_{X_i}(z)=(1-p)+p z$$

As a result, the probability generating function for $N$ would be $P_{X_i}(z)$ raised to $n$:

$$P_N(z)=\left[(1-p)+p z\right]^n$$

The generating function $P_N(z)$ is defined for all real values $z$. Differentiating twice produces the mean and variance: $E[N]=n p$ and $Var(N)=n p (1-p)$.
By differentiating $P_N(z)$ and evaluating at $z=0$, we obtain the probability function:

$$P[N=k]=\binom{n}{k} p^k (1-p)^{n-k} \ \ \ \ \ k=0,1,2,\dots,n$$
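As a check, the $k$th derivative of $[(1-p)+pz]^n$ at $z=0$ is $\frac{n!}{(n-k)!} p^k (1-p)^{n-k}$; dividing by $k!$ gives exactly the binomial probability function. A short numerical sketch (illustrative parameters):

```python
import math

n, p = 6, 0.3   # illustrative binomial parameters

for k in range(n + 1):
    # k-th derivative of ((1-p) + p*z)**n at z = 0
    deriv_at_zero = (math.factorial(n) // math.factorial(n - k)) \
        * p ** k * (1 - p) ** (n - k)
    from_pgf = deriv_at_zero / math.factorial(k)        # p_k = P^(k)(0)/k!
    direct = math.comb(n, k) * p ** k * (1 - p) ** (n - k)
    print(k, from_pgf, direct)
```

The two columns agree for every $k$, and the probabilities sum to 1.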
By taking the product of probability generating functions, it follows that the independent sum $N=N_1+N_2+\cdots+N_m$, where each $N_i$ has a binomial distribution with parameters $n_i$ and $p$, has a binomial distribution with parameters $n=n_1+n_2+\cdots+n_m$ and $p$. In other words, as long as the probability of success $p$ is identical across the binomial distributions, the independent sum is always a binomial distribution.
Note that the variance of the binomial distribution is less than the mean. Thus the binomial distribution is a suitable candidate for modeling claim frequency in situations where the sample variance is smaller than the sample mean.
Negative Binomial Distribution
As mentioned above, the Poisson distribution requires that the mean and the variance are equal. The binomial distribution requires that the variance is smaller than the mean. Thus these two counting distributions are not appropriate in all cases. The negative binomial distribution is an excellent alternative to the Poisson distribution and the binomial distribution, especially in cases where the observed variance is greater than the observed mean.
The negative binomial distribution naturally arises from the same probability experiment that generates the binomial distribution. Consider a series of independent Bernoulli trials each of which results in one of two distinct outcomes (called success and failure) in such a way that the probability of success is constant across the trials. Instead of observing the outcomes in a fixed number of trials, we now observe the trials until a fixed number $r$ of successes have occurred.
As we observe the Bernoulli trials, let $Y$ be the number of failures until the $r$th success has occurred. The random variable $Y$ has a negative binomial distribution with parameters $r$ and $p$. The parameter $r$ is necessarily a positive integer and the parameter $p$ is a real number between 0 and 1. The following is the probability function for the random variable $Y$:

$$P[Y=k]=\binom{k+r-1}{k} p^r (1-p)^k \ \ \ \ \ k=0,1,2,\dots$$
In the above probability function, the parameter $r$ must be a positive integer. The binomial coefficient $\binom{k+r-1}{k}$ is computed by its usual definition. The above probability function can be relaxed so that $r$ can be any positive real number. The key to the relaxation is a reformulation of the binomial coefficient:

$$\binom{k+r-1}{k}=\frac{(k+r-1)(k+r-2)\cdots(r+1) \, r}{k!} \ \ \ \ \ k=1,2,3,\dots$$

with $\binom{k+r-1}{k}=1$ for $k=0$. Note that in the above formulation, the $r$ in $\binom{k+r-1}{k}$ does not have to be an integer. If $r$ were a positive integer, the usual definition would lead to the same calculation. The reformulation is a generalization of the usual binomial coefficient definition.
With the new definition of the binomial coefficient, the following is the probability function of the negative binomial distribution in the general case:

$$P[Y=k]=\binom{k+r-1}{k} p^r (1-p)^k \ \ \ \ \ k=0,1,2,\dots$$

The following is the same probability function with the binomial coefficient explicitly written out:

$$P[Y=k]=\frac{(k+r-1)(k+r-2)\cdots(r+1) \, r}{k!} \ p^r (1-p)^k \ \ \ \ \ k=1,2,3,\dots$$

with $P[Y=0]=p^r$. For either of the above versions, the mean and variance are:

$$E[Y]=\frac{r (1-p)}{p} \ \ \ \ \ \ Var(Y)=\frac{r (1-p)}{p^2}$$
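For a non-integer $r$, the generalized probability function and these moment formulas can be checked numerically (illustrative parameters):

```python
import math

r, p = 2.5, 0.4   # r need not be an integer here
K = 400           # truncation point; the tail beyond K is negligible

# negative binomial probabilities built recursively:
# q_k = q_{k-1} * (r + k - 1)/k * (1 - p), starting from q_0 = p**r
probs = [p ** r]
for k in range(1, K + 1):
    probs.append(probs[-1] * (r + k - 1) / k * (1 - p))

mean = sum(k * q for k, q in enumerate(probs))
variance = sum(k * k * q for k, q in enumerate(probs)) - mean ** 2

print(mean, variance)   # r(1-p)/p = 3.75 and r(1-p)/p**2 = 9.375
```

The recursion uses $\binom{k+r-1}{k}=\binom{k+r-2}{k-1}\frac{k+r-1}{k}$, so it is just the generalized binomial coefficient computed incrementally.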
Another formulation of the negative binomial distribution is that it is a Poisson-gamma mixture. The following is the probability function:

$$P[Y=k]=\binom{k+r-1}{k} \left(\frac{1}{1+\beta}\right)^r \left(\frac{\beta}{1+\beta}\right)^k \ \ \ \ \ k=0,1,2,\dots$$

It is still a 2-parameter discrete distribution. The parameters $r$ and $\beta$ originate from the parameters of the gamma distribution in the Poisson-gamma mixture. The mean and variance are:

$$E[Y]=r \beta \ \ \ \ \ \ Var(Y)=r \beta (1+\beta)$$
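As a numerical sketch of the mixture interpretation, take the gamma mixing distribution with shape $r$ and scale $\beta$ (the standard parametrization for this result); integrating the Poisson probability function against that gamma density reproduces the negative binomial probabilities above. The integration here is a simple midpoint rule with illustrative parameters:

```python
import math

r, beta = 1.8, 2.0   # illustrative gamma shape and scale

def gamma_pdf(x):
    """Gamma density with shape r and scale beta."""
    return x ** (r - 1) * math.exp(-x / beta) / (math.gamma(r) * beta ** r)

def mixed_pf(k, upper=60.0, n=20000):
    """P[Y = k] = integral of Poisson(k; lam) times gamma_pdf(lam), midpoint rule."""
    h = upper / n
    return sum(math.exp(-x) * x ** k / math.factorial(k) * gamma_pdf(x) * h
               for x in (h * (i + 0.5) for i in range(n)))

def neg_binomial_pf(k):
    coef = math.prod(r + j for j in range(k)) / math.factorial(k)
    return coef * (1 / (1 + beta)) ** r * (beta / (1 + beta)) ** k

for k in range(5):
    print(k, round(mixed_pf(k), 6), round(neg_binomial_pf(k), 6))
```

The numerically mixed probabilities agree with the closed-form negative binomial probability function with parameters $r$ and $\beta$.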
The negative binomial distribution has been discussed at length in blog posts in several companion blogs. For the natural interpretation of negative binomial distribution based on counting the number of failures until the th success, see this blog post. This is an excellent introduction.
For the general version of the negative binomial distribution where the parameter can be any positive real number, see this blog post.
For the version of negative binomial distribution from a Poisson-Gamma mixture point of view, see this blog post.
More Counting Distributions
The three counting distributions – Poisson, binomial and negative binomial – provide a versatile tool kit in modeling the number of random events such as losses to the insured or claims to the insurer. The tool kit can be greatly expanded by modifying these three distributions to generate additional distributions. The new distributions belong to the (a,b,0) and (a,b,1) classes. This topic is discussed in the subsequent posts.