Applications of information measures to assess convergence in the central limit theorem

Ranjani Atukorala, Maxwell Leslie King, Sivagowry Sriananthakumar

Research output: Contribution to journal › Article › Research › peer-review

Abstract

The Central Limit Theorem (CLT) is an important result in statistics and econometrics, and econometricians often rely on the CLT for inference in practice. Even though different conditions apply to different kinds of data, the CLT results are believed to be generally available for a range of situations. This paper illustrates the use of the Kullback-Leibler Information (KLI) measure to assess how close an approximating distribution is to a true distribution in the context of investigating how different population distributions affect convergence in the CLT. For this purpose, three different non-parametric methods for estimating the KLI are proposed and investigated. The main findings of this paper are 1) the distribution of the sample means better approximates the normal distribution as the sample size increases, as expected, 2) for any fixed sample size, the distribution of means of samples from skewed distributions converges faster to the normal distribution as the kurtosis increases, 3) at least in the range of values of kurtosis considered, the distribution of means of small samples generated from symmetric distributions is well approximated by the normal distribution, and 4) among the nonparametric methods used, Vasicek's [33] estimator seems to be the best for the purpose of assessing asymptotic approximations. Based on the results of this paper, recommendations on minimum sample sizes required for an accurate normal approximation of the true distribution of sample means are made.
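The spacing-based entropy estimator of Vasicek [33], singled out in finding 4, can be turned into a nonparametric KLI estimate against a fitted normal. The sketch below is an illustrative reconstruction, not the paper's code: the window-length heuristic and the plug-in normal cross-entropy term are assumptions, and the function names are invented for this example.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek spacing estimator of differential entropy:
    H_mn = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        # common heuristic window; the paper's exact choice may differ
        m = max(1, int(round(np.sqrt(n) / 2)))
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]
    lower = x[np.maximum(idx - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def kli_to_normal(x):
    """Estimate KL(f || N(mu, sigma^2)): the negative Vasicek entropy
    supplies the E_f[log f] term, and the cross-entropy term is the
    average log-density of a normal fitted to the same sample."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=0)
    cross = np.mean(-0.5 * np.log(2 * np.pi * sigma**2)
                    - (x - mu) ** 2 / (2 * sigma**2))
    return -vasicek_entropy(x) - cross
```

Applied to simulated sample means, such an estimate should shrink toward zero as the sample size underlying each mean grows, mirroring finding 1; in finite samples the estimator is biased, so small negative values can occur even though the true KLI is non-negative.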
Original language: English
Pages (from-to): 265-276
Number of pages: 12
Journal: Model Assisted Statistics and Applications
Volume: 10
Issue number: 3
DOI: 10.3233/MAS-150330
Publication status: Published - 2015

Cite this

@article{213fdd0f5ddb4b9492b869f9f0493817,
title = "Applications of information measures to assess convergence in the central limit theorem",
abstract = "The Central Limit Theorem (CLT) is an important result in statistics and econometrics and econometricians often rely on the CLT for inference in practice. Even though different conditions apply to different kinds of data, the CLT results are believed to be generally available for a range of situations. This paper illustrates the use of the Kullback-Leibler Information (KLI) measure to assess how close an approximating distribution is to a true distribution in the context of investigating how different population distributions affect convergence in the CLT. For this purpose, three different non-parametric methods for estimating the KLI are proposed and investigated. The main findings of this paper are 1) the distribution of the sample means better approximates the normal distribution as the sample size increases, as expected, 2) for any fixed sample size, the distribution of means of samples from skewed distributions converges faster to the normal distribution as the kurtosis increases, 3) at least in the range of values of kurtosis considered, the distribution of means of small samples generated from symmetric distributions is well approximated by the normal distribution, and 4) among the nonparametric methods used, Vasicek's [33] estimator seems to be the best for the purpose of assessing asymptotic approximations. Based on the results of this paper, recommendations on minimum sample sizes required for an accurate normal approximation of the true distribution of sample means are made.",
author = "Ranjani Atukorala and King, {Maxwell Leslie} and Sivagowry Sriananthakumar",
year = "2015",
doi = "10.3233/MAS-150330",
language = "English",
volume = "10",
pages = "265 -- 276",
journal = "Model Assisted Statistics and Applications",
issn = "1574-1699",
publisher = "IOS Press",
number = "3",

}

Applications of information measures to assess convergence in the central limit theorem. / Atukorala, Ranjani; King, Maxwell Leslie; Sriananthakumar, Sivagowry.

In: Model Assisted Statistics and Applications, Vol. 10, No. 3, 2015, p. 265 - 276.


TY - JOUR

T1 - Applications of information measures to assess convergence in the central limit theorem

AU - Atukorala, Ranjani

AU - King, Maxwell Leslie

AU - Sriananthakumar, Sivagowry

PY - 2015

Y1 - 2015

N2 - The Central Limit Theorem (CLT) is an important result in statistics and econometrics and econometricians often rely on the CLT for inference in practice. Even though different conditions apply to different kinds of data, the CLT results are believed to be generally available for a range of situations. This paper illustrates the use of the Kullback-Leibler Information (KLI) measure to assess how close an approximating distribution is to a true distribution in the context of investigating how different population distributions affect convergence in the CLT. For this purpose, three different non-parametric methods for estimating the KLI are proposed and investigated. The main findings of this paper are 1) the distribution of the sample means better approximates the normal distribution as the sample size increases, as expected, 2) for any fixed sample size, the distribution of means of samples from skewed distributions converges faster to the normal distribution as the kurtosis increases, 3) at least in the range of values of kurtosis considered, the distribution of means of small samples generated from symmetric distributions is well approximated by the normal distribution, and 4) among the nonparametric methods used, Vasicek's [33] estimator seems to be the best for the purpose of assessing asymptotic approximations. Based on the results of this paper, recommendations on minimum sample sizes required for an accurate normal approximation of the true distribution of sample means are made.

AB - The Central Limit Theorem (CLT) is an important result in statistics and econometrics and econometricians often rely on the CLT for inference in practice. Even though different conditions apply to different kinds of data, the CLT results are believed to be generally available for a range of situations. This paper illustrates the use of the Kullback-Leibler Information (KLI) measure to assess how close an approximating distribution is to a true distribution in the context of investigating how different population distributions affect convergence in the CLT. For this purpose, three different non-parametric methods for estimating the KLI are proposed and investigated. The main findings of this paper are 1) the distribution of the sample means better approximates the normal distribution as the sample size increases, as expected, 2) for any fixed sample size, the distribution of means of samples from skewed distributions converges faster to the normal distribution as the kurtosis increases, 3) at least in the range of values of kurtosis considered, the distribution of means of small samples generated from symmetric distributions is well approximated by the normal distribution, and 4) among the nonparametric methods used, Vasicek's [33] estimator seems to be the best for the purpose of assessing asymptotic approximations. Based on the results of this paper, recommendations on minimum sample sizes required for an accurate normal approximation of the true distribution of sample means are made.

U2 - 10.3233/MAS-150330

DO - 10.3233/MAS-150330

M3 - Article

VL - 10

SP - 265

EP - 276

JO - Model Assisted Statistics and Applications

JF - Model Assisted Statistics and Applications

SN - 1574-1699

IS - 3

ER -