Central moment

In probability theory and statistics, a central moment is a moment of a probability distribution of a random variable about the random variable's mean; that is, the nth central moment is the expected value of the nth power of the deviation of the random variable from its mean. The various moments form one set of values by which the properties of a probability distribution can be usefully characterised. Central moments are used in preference to ordinary moments, which are computed in terms of deviations from zero rather than from the mean, because the higher-order central moments relate only to the spread and shape of the distribution, rather than also to its location.

Sets of central moments can be defined for both univariate and multivariate distributions.

Univariate moments

The nth moment about the mean (or nth central moment) of a real-valued random variable X is the quantity μn := E[(X − E[X])n], where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the nth moment about the mean μ is

\mu_n = \operatorname{E} \left[ ( X - \operatorname{E}[X] )^n \right] = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\,dx. [1]

For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.
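As a concrete illustration, the defining expectation can be evaluated directly for a finite discrete distribution, where a sum replaces the integral. The following is a minimal Python sketch; the helper name `central_moment` and the fair-die example are illustrative, not part of the article.

```python
# Sketch: nth central moment of a finite discrete distribution,
# given outcome values and their probabilities (illustrative helper).
def central_moment(values, probs, n):
    mean = sum(v * p for v, p in zip(values, probs))
    return sum((v - mean) ** n * p for v, p in zip(values, probs))

# Fair six-sided die: mean 3.5, variance 35/12.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
mu1 = central_moment(values, probs, 1)  # first central moment: 0 up to rounding
mu2 = central_moment(values, probs, 2)  # second central moment: the variance, 35/12
```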

The first few central moments have intuitive interpretations:

- The "zeroth" central moment μ0 is 1.
- The first central moment μ1 is 0 (not to be confused with the first raw moment, the expected value or mean).
- The second central moment μ2 is the variance, usually denoted σ2, where σ is the standard deviation.
- The third and fourth central moments are used to define the standardized moments known as skewness and kurtosis, respectively.

The nth central moment is translation-invariant, i.e. for any random variable X and any constant c, we have

\mu_n(X+c)=\mu_n(X).\,

For all n, the nth central moment is homogeneous of degree n:

\mu_n(cX)=c^n\mu_n(X).\,

Only for n such that 1 ≤ n ≤ 3 do we have an additivity property for random variables X and Y that are independent:

\mu_n(X+Y)=\mu_n(X)+\mu_n(Y)\text{ provided }1 \leq n\leq 3.\,
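These three properties can be checked numerically on small finite distributions. In the Python sketch below (the `central_moment` helper and the example distributions are illustrative assumptions), the distribution of X + Y is built by convolution, which encodes independence; additivity holds for n = 2 and 3 but fails for n = 4.

```python
# Sketch: translation invariance, homogeneity, and (limited) additivity
# of central moments, checked on small finite distributions.
from itertools import product

def central_moment(dist, n):
    # dist: dict mapping outcome -> probability
    mean = sum(v * p for v, p in dist.items())
    return sum((v - mean) ** n * p for v, p in dist.items())

X = {0: 0.5, 1: 0.3, 4: 0.2}
Y = {-1: 0.4, 2: 0.6}
c = 3.0

# Translation invariance: mu_n(X + c) = mu_n(X)
shifted = {v + c: p for v, p in X.items()}
assert abs(central_moment(shifted, 3) - central_moment(X, 3)) < 1e-9

# Homogeneity: mu_n(cX) = c^n mu_n(X)
scaled = {c * v: p for v, p in X.items()}
assert abs(central_moment(scaled, 3) - c ** 3 * central_moment(X, 3)) < 1e-9

# Distribution of X + Y under independence, by convolution.
XY = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    XY[x + y] = XY.get(x + y, 0.0) + px * py

# Additivity holds for n = 2, 3 ...
for n in (2, 3):
    assert abs(central_moment(XY, n)
               - central_moment(X, n) - central_moment(Y, n)) < 1e-9
# ... but fails for n = 4.
assert abs(central_moment(XY, 4)
           - central_moment(X, 4) - central_moment(Y, 4)) > 1e-6
```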

A related functional that shares the translation-invariance and homogeneity properties with the nth central moment, but continues to have this additivity property even when n ≥ 4, is the nth cumulant κn(X). For n = 1, the nth cumulant is just the expected value; for n = 2 or 3, the nth cumulant is just the nth central moment; for n ≥ 4, the nth cumulant is an nth-degree monic polynomial in the first n moments (about zero), and is also a (simpler) nth-degree polynomial in the first n central moments.
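The n = 4 case makes the cumulant's advantage concrete: κ4 = μ4 − 3μ2², and this combination is additive for independent variables even though μ4 alone is not. A Python sketch (helper names and the example distributions are illustrative):

```python
# Sketch: the fourth cumulant kappa_4 = mu_4 - 3*mu_2^2 remains additive
# for independent variables where the fourth central moment does not.
def central_moment(dist, n):
    mean = sum(v * p for v, p in dist.items())
    return sum((v - mean) ** n * p for v, p in dist.items())

def kappa4(dist):
    return central_moment(dist, 4) - 3 * central_moment(dist, 2) ** 2

X = {0: 0.5, 2: 0.5}
Y = {1: 0.25, 3: 0.75}

# Distribution of X + Y under independence, by convolution.
XY = {}
for x, px in X.items():
    for y, py in Y.items():
        XY[x + y] = XY.get(x + y, 0.0) + px * py

assert abs(kappa4(XY) - kappa4(X) - kappa4(Y)) < 1e-9          # cumulant: additive
assert abs(central_moment(XY, 4)
           - central_moment(X, 4) - central_moment(Y, 4)) > 1e-6  # moment: not
```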

Relation to moments about the origin

Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the nth-order moment about the origin to the moment about the mean is

\mu_n = \sum_{j=0}^n {n \choose j} (-1)^{n-j} \mu'_j \mu^{n-j},

where μ is the mean of the distribution, and the moment about the origin is given by

\mu'_j = \int_{-\infty}^{+\infty} x^j f(x)\,dx.

For the cases n = 2, 3, 4 — which are of most interest because of the relations to variance, skewness, and kurtosis, respectively — this formula becomes (noting that \mu = \mu'_1):

\mu_2 = \mu'_2 - \mu^2\,
\mu_3 = \mu'_3 - 3 \mu \mu'_2 + 2 \mu^3\,
\mu_4 = \mu'_4 - 4 \mu \mu'_3 + 6 \mu^2 \mu'_2 - 3 \mu^4.\,
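The general conversion formula can be implemented directly and checked against the definition. A Python sketch using `math.comb` for the binomial coefficient (the helper names and example distribution are illustrative assumptions):

```python
# Sketch: converting moments about the origin to central moments with the
# binomial formula, and checking against the direct definition.
from math import comb

def raw_moment(dist, j):
    # dist: dict mapping outcome -> probability
    return sum(v ** j * p for v, p in dist.items())

def central_from_raw(dist, n):
    mu = raw_moment(dist, 1)
    return sum(comb(n, j) * (-1) ** (n - j) * raw_moment(dist, j) * mu ** (n - j)
               for j in range(n + 1))

def central_moment(dist, n):
    mu = raw_moment(dist, 1)
    return sum((v - mu) ** n * p for v, p in dist.items())

X = {-1: 0.2, 0: 0.5, 3: 0.3}
for n in (2, 3, 4):
    assert abs(central_from_raw(X, n) - central_moment(X, n)) < 1e-9

# Special case n = 2: the familiar mu_2 = mu'_2 - mu^2.
assert abs(central_from_raw(X, 2)
           - (raw_moment(X, 2) - raw_moment(X, 1) ** 2)) < 1e-9
```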

Symmetric distributions

In a symmetric distribution (one that is unaffected by being reflected about its mean), all odd central moments that exist equal zero, because in the formula for the nth moment, each term involving a value of X less than the mean by a certain amount exactly cancels out the term involving a value of X greater than the mean by the same amount.
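A small numerical check in Python (the example distribution and helper are illustrative assumptions): for a distribution symmetric about its mean, the odd central moments vanish while the even ones need not.

```python
# Sketch: odd central moments of a symmetric distribution are zero.
def central_moment(dist, n):
    mean = sum(v * p for v, p in dist.items())
    return sum((v - mean) ** n * p for v, p in dist.items())

# Symmetric about its mean 2: the outcomes 0 and 4 mirror each other.
sym = {0: 0.25, 2: 0.5, 4: 0.25}
for n in (1, 3, 5):
    assert abs(central_moment(sym, n)) < 1e-12  # odd moments cancel
# The even moments do not cancel; mu_2 here is 2.
```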

Multivariate moments

For a continuous bivariate probability distribution with probability density function f(x,y) the (j,k) moment about the mean μ = (μX, μY) is

\mu_{j,k} = \operatorname{E} \left[ ( X - \operatorname{E}[X] )^j ( Y - \operatorname{E}[Y] )^k \right] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} (x - \mu_X)^j (y - \mu_Y)^k f(x,y )\,dx \,dy.
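For a finite bivariate distribution the double integral becomes a double sum over the joint probabilities. A Python sketch (the helper name and the joint distribution are illustrative assumptions); note that μ1,1 is the covariance of X and Y, and μ2,0 is the variance of X.

```python
# Sketch: (j, k) central moment of a finite bivariate distribution.
def bivariate_central_moment(dist, j, k):
    # dist: dict mapping (x, y) outcome pairs -> probability
    mx = sum(x * p for (x, y), p in dist.items())
    my = sum(y * p for (x, y), p in dist.items())
    return sum((x - mx) ** j * (y - my) ** k * p
               for (x, y), p in dist.items())

# A tiny joint distribution with positive dependence between X and Y.
joint = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}
cov = bivariate_central_moment(joint, 1, 1)    # mu_{1,1}: the covariance
var_x = bivariate_central_moment(joint, 2, 0)  # mu_{2,0}: Var(X)
```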

  1. ^ Grimmett, Geoffrey; Stirzaker, David (2009). Probability and Random Processes. Oxford, England: Oxford University Press.