The Central Limit Theorem (CLT) is a cornerstone of statistics and econometrics, and econometricians routinely rely on it for inference in practice. Although different conditions apply to different kinds of data, CLT results are generally believed to hold across a wide range of situations. This paper illustrates the use of the Kullback-Leibler Information (KLI) measure to assess how closely an approximating distribution matches a true distribution, in the context of investigating how different population distributions affect convergence in the CLT. For this purpose, three non-parametric methods for estimating the KLI are proposed and investigated. The main findings are: (1) the distribution of sample means approximates the normal distribution better as the sample size increases, as expected; (2) for any fixed sample size, the distribution of means of samples from skewed distributions converges to the normal distribution faster as the kurtosis increases; (3) at least over the range of kurtosis values considered, the distribution of means of small samples generated from symmetric distributions is well approximated by the normal distribution; and (4) among the non-parametric methods used, Vasicek's estimator appears to be the best for assessing asymptotic approximations. Based on these results, recommendations are made on the minimum sample sizes required for an accurate normal approximation of the true distribution of sample means.
Atukorala, R., King, M. L., & Sriananthakumar, S. (2015). Applications of information measures to assess convergence in the central limit theorem. Model Assisted Statistics and Applications, 10(3), 265-276. https://doi.org/10.3233/MAS-150330
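The general approach described in the abstract can be sketched as follows: estimate the differential entropy of the sample-mean distribution with Vasicek's spacing estimator, then combine it with the average log-density of a fitted normal to obtain an estimate of the KL divergence from the true distribution of means to the normal approximation. This is a minimal illustration, not the paper's exact procedure; the spacing parameter `m = round(sqrt(n))`, the exponential parent distribution, and the fitted-normal plug-in are all assumptions chosen for the sketch.

```python
# Sketch: assessing CLT convergence with a Vasicek-style KL estimate.
# NOTE: the window choice m ~ sqrt(n) and the exponential parent are
# illustrative assumptions, not the settings used in the cited paper.
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek spacing estimator of differential entropy."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))
    # Clamp the spacing indices at the sample boundaries.
    upper = x[np.minimum(np.arange(n) + m, n - 1)]
    lower = x[np.maximum(np.arange(n) - m, 0)]
    return np.mean(np.log(n * (upper - lower) / (2 * m)))

def kl_to_normal(x):
    """Estimate KL(f || N(mu, sigma^2)) as -H_hat(f) - mean log-density
    under a normal fitted to the same sample."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    log_phi = -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)
    return -vasicek_entropy(x) - log_phi.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Means of n draws from a skewed (exponential) parent: the KL estimate
    # should shrink toward zero as n grows, mirroring finding (1).
    for n in (5, 30, 200):
        means = rng.exponential(size=(20000, n)).mean(axis=1)
        print(n, round(kl_to_normal(means), 4))
```

Running the loop shows the estimated divergence decreasing with the sample size of the mean, which is the qualitative pattern the abstract reports for skewed parent distributions.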