The moment generating function (mgf) and the probability generating function (pgf) are useful tools in probability theory. Since Loss Models involves a lot of probability, the mgf and pgf are essential techniques, so I am posting some notes about them.

The definition of the moment generating function (univariate case) is

$$M_X(t) = \mathrm{E}\left[e^{tX}\right], \quad t \in \mathbb{R}.$$

More generally, if $\mathbf{X}$ is an $n$-dimensional random vector and $\mathbf{t}$ is a fixed vector, we use $\mathbf{t}^{\mathsf{T}}\mathbf{X}$ instead of $tX$:

$$M_{\mathbf{X}}(\mathbf{t}) = \mathrm{E}\left[e^{\mathbf{t}^{\mathsf{T}}\mathbf{X}}\right].$$
The definition of the mgf may seem complicated, so why define it this way? According to Wikipedia, defining it that way lets us find all the moments of the distribution. Employing the Taylor series to expand $e^{tX}$, we have

$$e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots$$

such that

$$M_X(t) = \mathrm{E}\left[e^{tX}\right] = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots,$$

where $m_n = \mathrm{E}[X^n]$ denotes the $n$-th moment.
It is straightforward to differentiate $M_X(t)$ $n$ times with respect to $t$ and set $t = 0$ to get $m_n = \mathrm{E}[X^n]$.
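As a quick sanity check, we can approximate the derivatives of an mgf at $t = 0$ numerically and compare them with the known moments. The exponential distribution, its rate parameter, and the finite-difference step `h` below are my own illustrative choices, not from the post:

```python
import math

# mgf of an Exponential(lam) distribution: M(t) = lam / (lam - t), for t < lam
# (example distribution chosen for its simple closed-form mgf)
def mgf_exponential(t, lam=2.0):
    return lam / (lam - t)

def derivative_at_zero(f, order, h=1e-3):
    """Central finite-difference approximation of f^(order)(0)."""
    if order == 1:
        return (f(h) - f(-h)) / (2 * h)
    if order == 2:
        return (f(h) - 2 * f(0.0) + f(-h)) / h ** 2
    raise ValueError("only first and second derivatives implemented")

lam = 2.0
m1 = derivative_at_zero(lambda t: mgf_exponential(t, lam), 1)  # E[X]   = 1/lam  = 0.5
m2 = derivative_at_zero(lambda t: mgf_exponential(t, lam), 2)  # E[X^2] = 2/lam^2 = 0.5

print(round(m1, 4), round(m2, 4))
```

Both approximations agree with the exact moments $1/\lambda$ and $2/\lambda^2$ to several decimal places.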

And if $X_1, X_2, \ldots, X_n$ is a sequence of independent random variables and $S_n = \sum_{i=1}^{n} a_i X_i$, the mgf of $S_n$ is

$$M_{S_n}(t) = \mathrm{E}\left[e^{t \sum_{i=1}^{n} a_i X_i}\right] = \prod_{i=1}^{n} M_{X_i}(a_i t).$$
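Here is a small numerical illustration of this product rule for independent sums, using two Bernoulli variables with made-up parameters `p` and `q`:

```python
import math

# Check M_{X+Y}(t) = M_X(t) * M_Y(t) for independent X ~ Bernoulli(p), Y ~ Bernoulli(q)
def mgf_bernoulli(t, p):
    # M(t) = (1 - p) + p * e^t for a Bernoulli(p) variable
    return (1 - p) + p * math.exp(t)

p, q, t = 0.3, 0.6, 0.7

# Distribution of S = X + Y, built directly from independence:
# P(S=0) = (1-p)(1-q), P(S=1) = p(1-q) + (1-p)q, P(S=2) = pq
probs = {0: (1 - p) * (1 - q), 1: p * (1 - q) + (1 - p) * q, 2: p * q}
mgf_sum_direct = sum(pr * math.exp(t * s) for s, pr in probs.items())

mgf_product = mgf_bernoulli(t, p) * mgf_bernoulli(t, q)
print(abs(mgf_sum_direct - mgf_product) < 1e-12)  # True
```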
It is worth noting that some distributions have no mgf, because in some cases $\mathrm{E}\left[e^{tX}\right]$ does not exist for any $t > 0$; the lognormal distribution is one example.

For the pgf, the definition is here:

$$G_X(z) = \mathrm{E}\left[z^X\right].$$

If we do a little bit of transformation, substituting $z = e^t$, we could drive our car to the mgf:

$$G_X(e^t) = \mathrm{E}\left[e^{tX}\right] = M_X(t).$$
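A quick sketch of this relation, using the Poisson distribution as an example (the parameter `lam` and the point `t` are arbitrary choices); its pgf $e^{\lambda(z-1)}$ and mgf $e^{\lambda(e^t-1)}$ are standard closed forms:

```python
import math

# Verify G(e^t) = M(t) for a Poisson(lam) variable at one illustrative point
lam, t = 1.5, 0.3

def pgf_poisson(z, lam):
    return math.exp(lam * (z - 1))            # G(z) = e^{lam (z - 1)}

def mgf_poisson(t, lam):
    return math.exp(lam * (math.exp(t) - 1))  # M(t) = e^{lam (e^t - 1)}

print(abs(pgf_poisson(math.exp(t), lam) - mgf_poisson(t, lam)) < 1e-12)  # True
```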
When I read the introduction to the pgf on Wikipedia, it sounded like the pgf is more appropriate for discrete random variables. Indeed, the pgf is usually defined only for random variables taking values in the non-negative integers, since then the coefficients of its power series are exactly the probabilities $P(X = k)$.

For the univariate case, a more detailed pgf definition is here:

$$G(z) = \mathrm{E}\left[z^X\right] = \sum_{k=0}^{\infty} p(k)\, z^k,$$

where $p(k) = P(X = k)$.

And for the multivariate case, the definition is here:

$$G(z_1, \ldots, z_d) = \mathrm{E}\left[z_1^{X_1} \cdots z_d^{X_d}\right].$$
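To see the power-series form in action, we can sum $p(k)z^k$ for a Poisson distribution and compare with its closed-form pgf $e^{\lambda(z-1)}$ (the truncation at $k = 50$ and the parameters are arbitrary illustrative choices):

```python
import math

# Power-series pgf of a Poisson(lam) variable, truncated at k = 50
# (the neglected tail is negligibly small for these parameters)
lam, z = 2.0, 0.9

series = sum(math.exp(-lam) * lam**k / math.factorial(k) * z**k
             for k in range(50))
closed_form = math.exp(lam * (z - 1))
print(abs(series - closed_form) < 1e-10)  # True
```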
From its definition, the pgf is obviously a power series whose coefficients are probabilities, which guarantees that any $|z| \le 1$ will make the power series converge. If we set $z = 1$, we get

$$G(1) = \sum_{k=0}^{\infty} p(k) = 1.$$
And if $X_1, X_2, \ldots, X_n$ is a sequence of independent random variables and $S_n = \sum_{i=1}^{n} a_i X_i$, the pgf of $S_n$ is

$$G_{S_n}(z) = \mathrm{E}\left[z^{\sum_{i=1}^{n} a_i X_i}\right] = \prod_{i=1}^{n} G_{X_i}\left(z^{a_i}\right).$$
And particularly, if the $X_i$ are also identically distributed with common pgf $G_X$ and $a_i = 1$, we have

$$G_{S_n}(z) = \left[G_X(z)\right]^n.$$
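As a check on this iid special case, the sum of $n$ iid Bernoulli($p$) variables is Binomial($n, p$), so the binomial pgf should equal the Bernoulli pgf raised to the $n$-th power (the parameters below are illustrative):

```python
from math import comb

# pgf of a single Bernoulli(p) variable: E[z^X] = (1-p)*z^0 + p*z^1
def pgf_bernoulli(z, p):
    return (1 - p) + p * z

# pgf of Binomial(n, p), computed directly from its pmf
def pgf_binomial(z, n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * z**k
               for k in range(n + 1))

n, p, z = 5, 0.4, 0.8
print(abs(pgf_binomial(z, n, p) - pgf_bernoulli(z, p)**n) < 1e-12)  # True
```

The agreement is exact up to floating-point rounding, by the binomial theorem.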
Note: all the material in this post comes from wikipedia.org; you could check it out if you want something more detailed.