Data Research, Vol. 2, Issue 4, Aug 2018, Pages 111-119; DOI: 10.31058/j.data.2018.24006

A Modified Review and Proof of Central Limit Theorem in Relation with Law of Large Numbers


Casmir Chidiebere Onyeneke 1*

1 Department of Mathematics and Statistics, Hezekiah University, Umudi, Nigeria

Received: 24 August 2018; Accepted: 20 October 2018; Published: 16 November 2018

Abstract

The study of the central limit theorem and the law of large numbers is paramount to mathematicians and statisticians because of the undisputable role they play in probability, estimation, decision making and the analysis of confidence intervals. As fundamental theorems underlying applied mathematics, it can never be overemphasized that the distribution of the sum of a large number of independent and identically distributed random variables approximates the normal distribution irrespective of the nature of the underlying distribution. This is the central claim in the study of the central limit theorem and the law of large numbers. The essence of this modified review and study of these powerful keys of mathematics is to elaborate the essential proofs and applications that uphold their position. Indeed, it is hard to overstate the importance of the central limit theorem, as it is the major reason many statistical and mathematical procedures work. When the distributions of statistical cumulative functions are viewed in the course of this study, they all approximate the normal distribution under repeated observation in the same condition over a long period of occurrences. The first step for any given distribution is to standardize the function if it is non-normal; the next is to normalize it by the assumptions of the central limit theorem and the law of large numbers. In this work, different approaches are used to tackle different distributions in order to obtain their approximation to the normal distribution by application of the central limit theorem and the law of large numbers.

Keywords

Central Limit Theorem, Law of Large Number, Discrete Random Variables, Cumulative, Moments, Probability Magnitude, Normal Distribution, Variance

1. Introduction

The central limit theorem and the law of large numbers are applied in probability theory for conditions under which the mean of an adequately large number of independent random variables, each with finite mean and variance, is approximately normally distributed [1]. One of the underlying assumptions is that the random variables must be identically distributed to depict the central limit theorem in its common form. In real-life applications, societal quantities comprise the balanced sum of many events that are not observed individually. As a result, the two theorems provide a limited justification for the prevalence of the normal probability distribution. It is not easy to separate the central limit theorem from the law of large numbers, because the central limit theorem substantiates the approximation of large-sample statistics to the normal distribution in controlled experiments. This study examines the different approaches used to prove the central limit theorem and the law of large numbers by some respective professional approaches. This work presents the proof which shows that the normal form variate of independent and identically distributed functions has a limiting cumulative distribution function which approaches the normal distribution. The methods of cumulant and moment functions are used to back the evidence guiding the assumptions of the central limit theorem and the law of large numbers. A brief view of the Martingale and Yuval central limit theorems is given. At the end, it is shown that the central limit theorem concludes with the characteristic function of a standard normal distribution with mean 0 and variance 1, which has a continuous limiting distribution. There are also some illustrations of the law of large numbers both in discrete and continuous distributions. The central limit theorem and the law of large numbers have a relationship with Chebyshev's inequality, which states that for a random variable $X$ with expected value $\mu$ and variance $\sigma^2$, $P(|X - \mu| \ge \varepsilon) \le \sigma^2/\varepsilon^2$ for every $\varepsilon > 0$. In thinking about a sequence $X_1, X_2, \dots$ of independent random variables with the same distribution, $\mu$ is the common expected value and $\sigma^2$ is the common variance; it is assumed that $\sigma^2$ is positive. Let $S_n^*$ represent the normalized sums denoted by $S_n^* = (X_1 + \dots + X_n - n\mu)/(\sigma\sqrt{n})$. The notion of the standard normal distribution gives the understanding that the $S_n^*$ have expected value 0 and variance 1. The central limit theorem now states that $P(S_n^* \le x) \to \Phi(x)$ for $n \to \infty$ for all $x$, where $\Phi$ is the distribution function of the standard normal distribution:

$\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2/2}\, dt$          (1)

The distribution function of the sums $S_n^*$ converges to $\Phi$ when n converges to infinity ($n \to \infty$). This is a quite amazing result and the absolute climax of probability theory. The surprising thing is that the limit distribution of the standard normal is independent of the distribution of the $X_i$. Consider a sequence of independent random variables $X_1, X_2, \dots$ all having the same point probabilities $P(X_i = 1) = p$ and $P(X_i = 0) = 1 - p$. The sum $S_n = X_1 + \dots + X_n$ is binomially distributed with expected value $np$ and variance $np(1-p)$. The standard normal form thus becomes

$S_n^* = \frac{S_n - np}{\sqrt{np(1-p)}}$          (2)

The distribution of $S_n^*$ is given by the distribution function $F_n(x) = P(S_n^* \le x)$. The Central Limit Theorem states that $F_n$ converges to the standard normal distribution function $\Phi$ for $n \to \infty$ [2].
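As a quick numerical companion (an added sketch, not part of the original text, assuming numpy and scipy are available), the exact binomial distribution function of $S_n^*$ can be compared directly with $\Phi$:

    # Numerical check of equations (1)-(2): the standardized binomial sum
    # S_n* = (S_n - np)/sqrt(np(1-p)) has a distribution function close to Phi.
    import numpy as np
    from scipy.stats import binom, norm

    n, p = 1000, 0.3
    x = np.linspace(-3, 3, 7)
    k = n * p + x * np.sqrt(n * p * (1 - p))   # de-standardized cut points
    F_n = binom.cdf(np.floor(k), n, p)         # exact P(S_n* <= x)
    print(np.max(np.abs(F_n - norm.cdf(x))))   # small gap, shrinking as n grows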

Consider a sequence $X_1, X_2, \dots$ of independent random variables with the same distribution and let $\mu$ be the common expected value, $\mu = E(X_i)$. The law of large numbers then states that $P\left(\left|\frac{X_1 + \dots + X_n}{n} - \mu\right| \ge \varepsilon\right) \to 0$ for $n \to \infty$ for every $\varepsilon > 0$ [3]. In other words, the mean value of a sample from any given distribution converges to the expected value of that distribution when the size n of the sample approaches infinity. At this point, examine the law of large numbers for discrete random variables. Let $X_1, X_2, \dots, X_n$ be an independent trials process with finite expected value $\mu = E(X_j)$ and finite variance $\sigma^2 = V(X_j)$ [4]. Let $S_n = X_1 + \dots + X_n$; then for any $\varepsilon > 0$, $P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) \to 0$ as $n \to \infty$, and equally $P\left(\left|\frac{S_n}{n} - \mu\right| < \varepsilon\right) \to 1$ as $n \to \infty$. Since $X_1, \dots, X_n$ are independent and have the same distributions, $V(S_n) = n\sigma^2$, $V\left(\frac{S_n}{n}\right) = \frac{\sigma^2}{n}$ and $E\left(\frac{S_n}{n}\right) = \mu$. Chebyshev's inequality, for $\varepsilon > 0$, gives $P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) \le \frac{\sigma^2}{n\varepsilon^2}$. Then, for fixed $\varepsilon$, this gives $P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) \to 0$ as $n \to \infty$, or equivalently $P\left(\left|\frac{S_n}{n} - \mu\right| < \varepsilon\right) \to 1$ as $n \to \infty$.

The fact about the law of large numbers for continuous probability distributions is that if $X_1, X_2, \dots, X_n$ is an independent trials process with a continuous density function, finite expected value $\mu$ and finite variance $\sigma^2$, and $S_n = X_1 + \dots + X_n$ is the sum of the $X_i$, then for any real number $\varepsilon > 0$, $\lim_{n \to \infty} P\left(\left|\frac{S_n}{n} - \mu\right| \ge \varepsilon\right) = 0$. In other words, $S_n/n$ converges to $\mu$ in probability. However, this is not necessarily true if $\sigma^2$ is infinite.
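The Chebyshev estimate used in this proof can be checked by simulation. The following sketch (added for illustration, assuming numpy; the centered exponential distribution with $\mu = \sigma^2 = 1$ is an arbitrary choice) compares the empirical frequency of large deviations with the bound $\sigma^2/(n\varepsilon^2)$:

    # Law of large numbers: frequency of |S_n/n - mu| >= eps vs Chebyshev bound.
    import numpy as np

    rng = np.random.default_rng(0)
    eps = 0.1
    for n in (100, 400, 1600):
        means = rng.exponential(scale=1.0, size=(5000, n)).mean(axis=1)
        freq = np.mean(np.abs(means - 1.0) >= eps)  # empirical P(|S_n/n - mu| >= eps)
        print(n, freq, 1.0 / (n * eps**2))          # frequency stays below sigma^2/(n eps^2)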

2. Methodological Proofs

Let $X_1, X_2, \dots, X_N$ be a set of $N$ independent random variates where each $X_i$ has an arbitrary probability distribution $P(x_1, \dots, x_N)$ with mean $\mu_i$ and a finite variance $\sigma_i^2$. Then the standard normal form of the variates, stated as $X_{\mathrm{norm}} = \left(\sum_{i=1}^{N} x_i - \sum_{i=1}^{N} \mu_i\right)\Big/\sqrt{\sum_{i=1}^{N} \sigma_i^2}$, has a limiting cumulative distribution function which approaches a normal distribution [5]. Under additional conditions on the distribution of the addends, the probability density itself is also normal, with mean $\mu = 0$ and variance $\sigma^2 = 1$ [6]. If conversion to normal form is not performed, then the sum $\sum_i x_i$ is approximately normally distributed with mean $\sum_i \mu_i$ and variance $\sum_i \sigma_i^2$.
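A brief simulation sketch (added here for illustration, assuming numpy and scipy; mixing uniform and exponential addends is an arbitrary choice) shows that the standardized sum of variates that are not identically distributed still approaches the standard normal:

    # Standardized sums of non-identically distributed independent variates.
    import numpy as np
    from scipy.stats import kstest, norm

    rng = np.random.default_rng(1)
    N, reps = 300, 4000
    u = rng.uniform(0, 1, (reps, N))      # N uniform addends: mean 1/2, variance 1/12
    e = rng.exponential(2.0, (reps, N))   # N exponential addends: mean 2, variance 4
    s = (u + e).sum(axis=1)               # sum of 2N non-identical variates
    mu = N * (0.5 + 2.0)
    var = N * (1/12 + 4.0)
    z = (s - mu) / np.sqrt(var)           # the normal form X_norm above
    print(kstest(z, norm.cdf))            # large p-value: consistent with N(0, 1)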

Consider the Fourier transform $P_X(f)$ of the probability density $P(x)$ of each variate for a further proof of the central limit theorem. According to [8],

$P_X(f) = \int_{-\infty}^{\infty} e^{2\pi i f x} P(x)\, dx$          (3)

$= \int_{-\infty}^{\infty} \sum_{n=0}^{\infty} \frac{(2\pi i f x)^n}{n!} P(x)\, dx$          (4)

$= \sum_{n=0}^{\infty} \frac{(2\pi i f)^n}{n!} \langle x^n \rangle$          (5)

So, for the sum $X = x_1 + x_2 + \dots + x_N$ of $N$ independent variates with common density $P(x)$, mean $\mu$ and variance $\sigma^2$,

$P_X(f) = \prod_{i=1}^{N} P_{x_i}(f) = \left[P_x(f)\right]^N$          (6)

$\ln P_X(f) = N \ln P_x(f)$          (7)

$P_x(f) = 1 + 2\pi i f \langle x \rangle - 2\pi^2 f^2 \langle x^2 \rangle + O(f^3)$          (8)

$\ln P_X(f) = N \ln\left[1 + 2\pi i f \langle x \rangle - 2\pi^2 f^2 \langle x^2 \rangle + O(f^3)\right]$          (9)

$= N\left[\left(2\pi i f \langle x \rangle - 2\pi^2 f^2 \langle x^2 \rangle\right) - \frac{1}{2}\left(2\pi i f \langle x \rangle\right)^2 + O(f^3)\right]$          (10)

$= 2\pi i f N \langle x \rangle - 2\pi^2 f^2 N \left(\langle x^2 \rangle - \langle x \rangle^2\right) + O(f^3)$          (11)

$= 2\pi i f N \mu - 2\pi^2 f^2 N \sigma^2 + O(f^3)$          (12)

The expansion of $\ln(1 + u) = u - \frac{u^2}{2} + \dots$ gave, for large $N$,

$\ln P_X(f) \approx 2\pi i f N \mu - 2\pi^2 f^2 N \sigma^2$          (13)

$P_X(f) \approx e^{2\pi i f N \mu - 2\pi^2 f^2 N \sigma^2}$          (14)

Since $P(X)$ is recovered from $P_X(f)$ by inversion, taking the inverse Fourier transform,

$P(X) = \int_{-\infty}^{\infty} e^{-2\pi i f X} P_X(f)\, df \approx \int_{-\infty}^{\infty} e^{-2\pi i f X} e^{2\pi i f N \mu - 2\pi^2 f^2 N \sigma^2}\, df$          (15)

It can be written as

$P(X) = \int_{-\infty}^{\infty} e^{-(a f^2 + b f)}\, df$          (16)

In the formula, $a = 2\pi^2 N \sigma^2$ and $b = 2\pi i (X - N\mu)$, and it represents a Fourier transform of a Gaussian function, so

$\int_{-\infty}^{\infty} e^{-(a f^2 + b f)}\, df = \sqrt{\frac{\pi}{a}}\, e^{b^2/(4a)}$          (17)

Therefore,

$P(X) = \sqrt{\frac{\pi}{2\pi^2 N \sigma^2}}\, \exp\left[\frac{\left(2\pi i (X - N\mu)\right)^2}{4 \cdot 2\pi^2 N \sigma^2}\right]$          (18)

$= \frac{1}{\sigma\sqrt{2\pi N}} \exp\left[-\frac{(X - N\mu)^2}{2N\sigma^2}\right]$          (19)

$= \frac{1}{\sigma_X \sqrt{2\pi}} \exp\left[-\frac{(X - \langle X \rangle)^2}{2\sigma_X^2}\right]$          (20)

But $\langle X \rangle = N\mu$ and $\sigma_X = \sigma\sqrt{N}$, so the sum of a large number of independent variates is normally distributed with mean $N\mu$ and variance $N\sigma^2$.
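Equations (3)-(20) can be illustrated numerically (an added sketch, assuming numpy; the skewed three-point distribution is an arbitrary choice): the N-fold convolution of a density approaches the Gaussian with mean $N\mu$ and variance $N\sigma^2$.

    # N-fold convolution of a skewed discrete density vs the Gaussian of eq. (19).
    import numpy as np

    p = np.array([0.6, 0.1, 0.3])             # P(x=0), P(x=1), P(x=2), deliberately skewed
    mu = np.dot(np.arange(3), p)               # mean 0.7
    var = np.dot(np.arange(3)**2, p) - mu**2   # variance 0.81
    N = 200
    dist = p.copy()
    for _ in range(N - 1):
        dist = np.convolve(dist, p)            # density of x_1 + ... + x_N
    k = np.arange(dist.size)
    gauss = np.exp(-(k - N*mu)**2 / (2*N*var)) / np.sqrt(2*np.pi*N*var)
    print(np.abs(dist - gauss).max())          # small; the gap shrinks like 1/sqrt(N)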

The central limit theorem says that, under general circumstances, if independent random variables are summed and normalized accordingly, then the limit approaches a normal distribution [9]. Consider the density of the normal distribution with mean $\mu$ and variance $\sigma^2$, that is $N_{\mu,\sigma^2}(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-(x-\mu)^2/(2\sigma^2)}$. Suppose that $X_1, X_2, \dots$ are independent, identically distributed random variables with zero mean and variance $\sigma^2$; then $\frac{X_1 + \dots + X_n}{\sqrt{n}} \longrightarrow N(0, \sigma^2)$. If the variables do not have zero mean, they can easily be normalized by subtracting the expectation from them. The meaning of the convergence is that for each interval $[a, b]$, $P\left(a \le \frac{X_1 + \dots + X_n}{\sqrt{n}} \le b\right) \to \int_a^b N_{0,\sigma^2}(x)\, dx$.

The definition of the moment generating function of a random variable X is $E\left[e^{tX}\right]$ [10]. This implies that it equals 1 at $t = 0$, so that it is logical to take its logarithm. In fact, using the Taylor series of $e^x$, it expands as a power series which begins as $1 + E[X]t + \frac{E[X^2]}{2}t^2 + \dots$, so its first two coefficients beyond the constant term are the expectation and (essentially) the second moment. Ignoring questions of convergence, the logarithm $K_X(t) = \log E\left[e^{tX}\right]$ is called the cumulant generating function, and its coefficients $\kappa_m$ in $K_X(t) = \kappa_1 t + \frac{\kappa_2}{2}t^2 + \dots$ are the cumulants. In particular, it shows that $\kappa_1 = E[X]$ is the expectation and $\kappa_2 = E[X^2] - E[X]^2 = V[X]$ is the variance. In general, using the Taylor series of $\log(1+x)$, each cumulant can be expressed as a polynomial in the moments; conversely, using the Taylor series of $e^x$, the moments can be expressed as polynomials in the cumulants. This provides an example of Moebius inversion [11].

The formulas for scaling and summing extend to cumulant generating functions as $K_{cX}(t) = K_X(ct)$ and, for independent X and Y, $K_{X+Y}(t) = K_X(t) + K_Y(t)$. Suppose that $X_1, \dots, X_n$ are independent random variables with zero mean, and let $S_n = \frac{X_1 + \dots + X_n}{\sqrt{n}}$. It can be written in terms of the cumulant generating function as $K_{S_n}(t) = n K_X\left(\frac{t}{\sqrt{n}}\right)$. It implies that $\kappa_m(S_n) = n^{1 - m/2}\kappa_m(X)$, so the first cumulant, which is zero here, is not perturbed. The second cumulant, the variance, is simply averaged. What happens to all the higher cumulants? If the cumulants are bounded by some constant C, then for $m > 2$, $|\kappa_m(S_n)| \le \frac{C}{n^{m/2 - 1}} \to 0$.

In other words, all the higher cumulants disappear in the limit. The cumulants of the normalized sums tend to the cumulants of some fixed distribution, which must be the normal distribution. The limit cumulant generating function, which can be expressed as $K(t) = \frac{\sigma^2}{2}t^2$, indeed corresponds to a normal variable. In particular, if normally distributed variables are summed, then $S_n$ is exactly a normal distribution of the form $N(0, \sigma^2)$.
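The decay of the higher cumulants can be observed in simulation (an added sketch, assuming numpy and scipy; scipy.stats.kstat returns unbiased sample cumulants up to order four, and the centered exponential distribution is an arbitrary choice):

    # Sample cumulants of the normalized sums S_n: kappa_2 stays put while
    # kappa_3, kappa_4 decay like n^(1 - m/2), as derived above.
    import numpy as np
    from scipy.stats import kstat

    rng = np.random.default_rng(2)
    x = rng.exponential(1.0, size=(10000, 400)) - 1.0   # zero mean, skewed
    for n in (1, 4, 400):
        s = x[:, :n].sum(axis=1) / np.sqrt(n)           # normalized sums S_n
        print(n, [round(kstat(s, k), 3) for k in (2, 3, 4)])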

There is an advantage in providing a direct calculation of the moments of a normal distribution; in other words, it is important to calculate these very moments. Note that the k-th moment of a normal distribution with zero mean and unit variance can be written as

$m_k = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^k e^{-x^2/2}\, dx$          (21)

The recurrence relation, obtained by integration by parts, is $m_k = (k-1)m_{k-2}$ [12]. Since $m_0 = 1$ and $m_1 = 0$, all odd moments are zero because the distribution is symmetric, and the even moments are $m_{2k} = (2k-1)!! = \frac{(2k)!}{2^k k!}$. The proof using moments briefly computes the limit of the moments of $S_n = \frac{X_1 + \dots + X_n}{\sqrt{n}}$. It is assumed that the variables $X_i$ are independent with zero mean, unit variance and bounded moments; the proof can be adapted to the case of varying variances. It implies that $E[S_n] = 0$, and the second moment is $E[S_n^2] = \frac{1}{n}\sum_{i,j} E[X_i X_j] = 1$, since $E[X_i X_j] = 0$ when $i \ne j$ whereas $E[X_i^2] = 1$.
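The recurrence $m_k = (k-1)m_{k-2}$ can be verified directly (an added sketch, assuming scipy, whose norm.moment returns the raw moments of the standard normal):

    # Checking equation (21) and the recurrence m_k = (k-1) m_{k-2}.
    from scipy.stats import norm

    m = [1.0] + [norm.moment(k) for k in range(1, 9)]   # m_0 .. m_8
    for k in range(2, 9):
        assert abs(m[k] - (k - 1) * m[k - 2]) < 1e-8
    print(m[2], m[4], m[6], m[8])                       # 1, 3, 15, 105 = (k-1)!!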

Then, the third moment assumes $E[S_n^3] = n^{-3/2}\sum_{i,j,k} E[X_i X_j X_k]$; a term vanishes unless each index appears at least twice, so only the n diagonal terms $E[X_i^3]$ survive and the moment is at most $C n^{-1/2}$, which tends to zero. The fourth moment brings forth a more interesting calculation, where $E[S_n^4] = n^{-2}\sum_{i,j,k,l} E[X_i X_j X_k X_l]$. The contribution of a term with variables of multiplicities $m_1, \dots, m_r$ is at most C, where C is a bound on the moments of the $X_i$, and the term is identically zero unless every multiplicity is at least 2. Thus the term is asymptotically negligible unless the indices pair up exactly: for the fourth moment there are three pairings $E[X_i^2]E[X_j^2]$ with $i \ne j$, each contributing 1, plus n diagonal terms $E[X_i^4]$. In this case, the contribution of the terms is $\frac{3n(n-1) + n E[X^4]}{n^2} \to 3$. Since the random variables have unit variance, the moment converges to the number of such pairings, matching the moments of the standard normal.
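A short simulation (added for illustration, assuming numpy; the symmetric $\pm 1$ distribution has zero mean, unit variance and bounded moments, giving exactly $E[S_n^4] = 3 - 2/n$) shows the fourth moment approaching $m_4 = 3$:

    # Fourth moment of S_n = (X_1 + ... + X_n)/sqrt(n) approaching m_4 = 3.
    import numpy as np

    rng = np.random.default_rng(3)
    for n in (2, 10, 100):
        x = rng.choice([-1.0, 1.0], size=(50000, n))   # zero mean, unit variance
        s = x.sum(axis=1) / np.sqrt(n)
        print(n, round(np.mean(s**4), 2))              # approaches 3 - 2/n -> 3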

3. Results and Discussion

In distinguishing the facts in the law of large numbers and the central limit theorem, the law of large numbers enables the investigation of the conditions under which the mean of a random sample tends to the mean of the population from which the sample is drawn, while the central limit theorem assists in finding the probability magnitude of the discrepancy, or the size of the sample that can give a reliable estimate.

Let $\{X_k\}$ be a sequence of independently and identically distributed random variables such that $E(X_k) = \mu$ and $\mathrm{var}(X_k) = \sigma^2 < \infty$ [13]. Then $Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}} \longrightarrow N(0, 1)$, where $S_n = X_1 + \dots + X_n$. Note that $E(Z_n) = 0$ and $\mathrm{var}(Z_n) = 1$ [13]. Let $\varphi$ denote the characteristic function of the standardized variable $Y_k = \frac{X_k - \mu}{\sigma}$, whose distribution function is F, so that $\varphi(t) = E\left(e^{itY_k}\right)$. Since the first and the second moments of $Y_k$ exist, with $E(Y_k) = 0$ and $E(Y_k^2) = 1$, a Taylor expansion of $\varphi$ about $t = 0$ gives

$\varphi(t) = E\left(1 + itY_k + \frac{(itY_k)^2}{2} + o(t^2)\right)$

$= 1 + itE(Y_k) - \frac{t^2}{2}E(Y_k^2) + o(t^2)$

$= 1 - \frac{t^2}{2} + o(t^2), \quad t \to 0$          (22)

Consider the characteristic function of $Z_n = \frac{Y_1 + \dots + Y_n}{\sqrt{n}}$; then

$\varphi_{Z_n}(t) = E\left(e^{itZ_n}\right) = \prod_{k=1}^{n} E\left(e^{itY_k/\sqrt{n}}\right),$

where the $Y_k$ are identically distributed. Thus

$\varphi_{Z_n}(t) = \left[\varphi\left(\frac{t}{\sqrt{n}}\right)\right]^n$          (23)

The logarithm of both sides of equation (23) gives $\ln \varphi_{Z_n}(t) = n \ln \varphi\left(\frac{t}{\sqrt{n}}\right)$. By replacing t in equation (22) by $\frac{t}{\sqrt{n}}$, it gives $\varphi\left(\frac{t}{\sqrt{n}}\right) = 1 - \frac{t^2}{2n} + o\left(\frac{t^2}{n}\right)$, and hence

$\ln \varphi_{Z_n}(t) = n \ln\left[1 - \frac{t^2}{2n} + o\left(\frac{t^2}{n}\right)\right]$          (24)

Consider the identity $\ln(1 + x) = x + O(x^2)$, where $|x| < 1$. Application of this identity to (24) gives $\ln \varphi_{Z_n}(t) = -\frac{t^2}{2} + o(1)$ as $n \to \infty$, so that

$\lim_{n \to \infty} \varphi_{Z_n}(t) = e^{-t^2/2}$          (25)

Equation (25) is the characteristic function of a standard normal distribution with mean 0 and variance 1, and the limiting distribution is continuous. When a die is rolled 420 times, the central limit theorem can be used to determine the probability that the sum of the rolls lies between 1400 and 1550. The sum is a random variable $S_{420} = X_1 + \dots + X_{420}$, where each $X_j$ has the uniform distribution on $\{1, 2, \dots, 6\}$. We have seen that $\mu = E(X_j) = \frac{7}{2}$ and $\sigma^2 = V(X_j) = \frac{35}{12}$; thus $E(S_{420}) = 420 \cdot \frac{7}{2} = 1470$ and $V(S_{420}) = 420 \cdot \frac{35}{12} = 1225$, so the standard deviation of $S_{420}$ is 35. Therefore, $P(1400 \le S_{420} \le 1550) \approx \Phi\left(\frac{1550 - 1470}{35}\right) - \Phi\left(\frac{1400 - 1470}{35}\right) = \Phi(2.286) - \Phi(-2) \approx 0.966$.
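The dice probability can be checked against the exact distribution (an added sketch, assuming numpy and scipy; the exact law of $S_{420}$ is obtained by repeated convolution):

    # Exact probability of 1400 <= S_420 <= 1550 vs the normal approximation.
    import numpy as np
    from scipy.stats import norm

    die = np.ones(6) / 6.0
    pmf = np.array([1.0])
    for _ in range(420):
        pmf = np.convolve(pmf, die)            # pmf of S_420, support 420..2520
    sums = np.arange(420, 2521)
    exact = pmf[(sums >= 1400) & (sums <= 1550)].sum()
    approx = norm.cdf((1550 - 1470) / 35) - norm.cdf((1400 - 1470) / 35)
    print(round(exact, 4), round(approx, 4))   # ~0.967 vs ~0.966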

Again, let $X_1, X_2, \dots, X_n$ be a Bernoulli trials process with probability 0.3 for success and 0.7 for failure. Let $X_j = 1$ if the jth outcome is a success and 0 otherwise. Then $\mu = E(X_j) = 0.3$ and $\sigma^2 = V(X_j) = (0.3)(0.7) = 0.21$. If $A_n = \frac{S_n}{n}$ is the average of the $X_j$, then $E(A_n) = 0.3$ and $V(A_n) = \frac{\sigma^2}{n} = \frac{0.21}{n}$. For example, let $\varepsilon = 0.1$. Chebyshev's inequality gives: if $n = 100$, $P(|A_{100} - 0.3| \ge 0.1) \le \frac{0.21}{100(0.1)^2} = 0.21$; if $n = 1000$, $P(|A_{1000} - 0.3| \ge 0.1) \le 0.021$. These can be rewritten as $P(0.2 < A_{100} < 0.4) \ge 0.79$ and $P(0.2 < A_{1000} < 0.4) \ge 0.979$. These can be compared with the actual values, which are (to six decimal places) $P(0.2 < A_{100} < 0.4) \approx 0.962549$ and $P(0.2 < A_{1000} < 0.4) \approx 1$.
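The quoted actual values can be reproduced from the exact binomial distribution (an added sketch, assuming scipy):

    # Exact P(0.2 < A_n < 0.4) for S_n ~ Binomial(n, 0.3), vs the Chebyshev bounds.
    from scipy.stats import binom

    for n in (100, 1000):
        # strict inequalities: P(0.2 n < S_n < 0.4 n)
        prob = binom.cdf(int(0.4 * n) - 1, n, 0.3) - binom.cdf(int(0.2 * n), n, 0.3)
        print(n, round(prob, 6))   # 0.962549 and ~1.0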

Suppose we choose at random n numbers from the interval $[0, 1]$ with uniform distribution. Then if $X_i$ describes the ith choice, it gives $\mu = E(X_i) = \frac{1}{2}$ and $\sigma^2 = V(X_i) = \frac{1}{12}$. Hence $E(A_n) = \frac{1}{2}$ and $V(A_n) = \frac{1}{12n}$, and for any $\varepsilon > 0$, $P\left(\left|A_n - \frac{1}{2}\right| \ge \varepsilon\right) \le \frac{1}{12n\varepsilon^2}$. This says that if we choose n numbers at random from $[0, 1]$, then the chances are better than $1 - \frac{1}{12n\varepsilon^2}$ that the difference $\left|A_n - \frac{1}{2}\right|$ is less than $\varepsilon$. Suppose instead that we choose n real numbers at random, using a normal distribution with mean 0 and variance 1. Then $\mu = 0$ and $\sigma^2 = 1$, hence $E(A_n) = 0$, $V(A_n) = \frac{1}{n}$, and $P(|A_n| \ge \varepsilon) \le \frac{1}{n\varepsilon^2}$.
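Both bounds can be compared with empirical frequencies (an added sketch, assuming numpy; $n = 1000$ and $\varepsilon = 0.05$ are arbitrary choices):

    # Empirical deviation frequencies vs the two Chebyshev bounds above.
    import numpy as np

    rng = np.random.default_rng(4)
    n, eps, reps = 1000, 0.05, 5000
    a_unif = rng.uniform(0, 1, (reps, n)).mean(axis=1)
    a_norm = rng.normal(0, 1, (reps, n)).mean(axis=1)
    print(np.mean(np.abs(a_unif - 0.5) >= eps), 1 / (12 * n * eps**2))  # freq << 1/30
    print(np.mean(np.abs(a_norm) >= eps), 1 / (n * eps**2))             # freq << 0.4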

4. Conclusions

Without getting into the mathematical details, the Central Limit Theorem states that if you take a lot of samples from a certain probability distribution, the distribution of their sum (and therefore their mean) will be approximately normal, even if the original distribution was not normal. Furthermore, it gives you the standard deviation of the distribution of the mean, which is $\sigma/\sqrt{n}$. When testing a statistical hypothesis or calculating a confidence interval, we generally take the mean of a certain number of samples from a population, and assume that this mean is a value from a normal distribution. The Central Limit Theorem tells us that this assumption is approximately correct for large samples, and tells us the standard deviation to use.

The Martingale central limit theorem generalizes the central limit result for sums of independent random variables to martingales, which are stochastic processes where the change in the value of the process from time t to time t + 1 has expectation zero, even conditioned on previous outcomes [14]. Let $X_1, X_2, \dots$ be a martingale with bounded increments; that is, suppose $E\left(X_{t+1} - X_t \mid X_1, \dots, X_t\right) = 0$ and $|X_{t+1} - X_t| \le K$ almost surely for some fixed bound K and all t. Also assume that $|X_1| \le K$ almost surely. Define $\sigma_t^2 = E\left((X_{t+1} - X_t)^2 \mid X_1, \dots, X_t\right)$ and let $\tau_\nu = \min\{t : \sigma_1^2 + \dots + \sigma_t^2 \ge \nu\}$ [15]. Then $\frac{X_{\tau_\nu}}{\sqrt{\nu}}$ converges in distribution to the normal distribution with mean 0 and variance 1 as $\nu \to \infty$. More explicitly, $\lim_{\nu \to \infty} P\left(\frac{X_{\tau_\nu}}{\sqrt{\nu}} < x\right) = \Phi(x)$.
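For a concrete illustration (an added sketch, assuming numpy and scipy; the symmetric $\pm 1$ random walk is the simplest bounded-increment martingale, with $\sigma_t^2 = 1$ and hence $\tau_\nu = \nu$):

    # Martingale CLT for the +-1 random walk: X_nu / sqrt(nu) is ~ N(0, 1).
    import numpy as np
    from scipy.stats import kstest, norm

    rng = np.random.default_rng(5)
    nu, reps = 2500, 2000
    steps = rng.choice([-1.0, 1.0], size=(reps, nu))   # increments, |inc| <= 1
    x = steps.sum(axis=1)                              # X_{tau_nu} = X_nu here
    print(kstest(x / np.sqrt(nu), norm.cdf))           # consistent with N(0, 1)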

Conflicts of Interest

The author declares that there is no conflict of interest regarding the publication of this article.

Copyright

© 2017 by the authors. Licensee International Technology and Science Press Limited. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

[1] Abramowitz, M.; Stegun, I. A. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 9th printing. National Bureau of Standards Applied Mathematics Series - 55. Dover, New York, 1972, Chapter 3, 53-60.

[2] Trotter, H. F. An Elementary Proof of the Central Limit Theorem. Archiv der Mathematik, 1959, 10, 226-234.

[3] Brink, D. Essentials of Statistics. Ventus Publishing ApS, 2010, Chapter 5, 26-27.

[4] Lindeberg, J. W. Eine neue Herleitung des Exponentialgesetzes in der Wahrscheinlichkeitsrechnung. Mathematische Zeitschrift, 1922, 15, 211-225.

[5] Feller, W. The Fundamental Limit Theorems in Probability. Bulletin of the American Mathematical Society, 1945, 51, 800-832.

[6] Feller, W. An Introduction to Probability Theory and Its Applications 3rd Edition, Volume 2, Wiley Publishing, New York, 1971, Chapter 2, 31-98.

[7] Kallenberg, O. Foundations of Modern Probability. Springer-Verlag, New York, 1997, Chapters 3, 254-618.

[8] Feller, W. An Introduction to Probability Theory and Its Applications, 3rd Edition, Volume 1, Wiley Publishing, New York, 1968, Chapter 5, 229.

[9] Filmus, Y. Two Proofs of the Central Limit Theorem. Lecture notes, 2010.

[10] Mlodinow, L. The Drunkard's Walk: How Randomness Rules Our Lives. Random House, New York, 2008, Chapter 2, 45-50.

[11] Spiegel, M. R. Theory and Problems of Probability and Statistics. McGraw-Hill, New York, 1992, Chapter 6, 112-113.

[12] Zabell, S. L. Alan Turing and the Central Limit Theorem. American Mathematical Monthly, 1995, 102(6), 483-494.

[13] Artstein, S.; Ball, K.; Barthe, F.; Naor, A. Solution of Shannon's Problem on the Monotonicity of Entropy. Journal of the American Mathematical Society, 2004, 17(4), 975-982.

[14] Hall, P.; Heyde, C. C. Martingale Limit Theory and Its Application. Academic Press, New York, 1980, Chapter 2, 17-25.

[15] Bradley, R. On Some Results of M. I. Gordin: A Clarification of a Misunderstanding. Journal of Theoretical Probability, 1988, 1, 115-119.