A consistent way of ensuring the i.i.d. condition for infinitely many random variables
In probability theory, the term i.i.d. condition loosely describes random variables that are independent of each other and identically distributed. This is equivalent to saying that there is a joint distribution whose marginals are identical and whose coordinates are independent of each other. In an infinite-dimensional setting, the Daniell-Kolmogorov theorem is invoked to explore the conditions under which such a joint distribution can be established.
Main question about the i.i.d. condition
In an elementary probability class, we learn that $F(x)$ is defined to be the probability that a certain random variable $X$ is less than or equal to some value $x$, i.e., $F(x) = \mathbb{P}(X \le x)$. Can we reverse this? In other words, given a distribution function $F$, can we ensure that there is a corresponding random variable $X$ satisfying $\mathbb{P}(X \le x) = F(x)$?
Well, this is a consequence of the existence of the Lebesgue-Stieltjes measure, which in turn follows from the renowned Carathéodory extension theorem: as long as $F$ is right-continuous, non-decreasing, and satisfies $\lim_{x \to -\infty} F(x) = 0$ and $\lim_{x \to \infty} F(x) = 1$, the existence is ensured.
The Carathéodory extension theorem proceeds by initially assuming a pre-measure on an algebra, which is a collection of sets containing the whole space and closed under finite unions and complements. This pre-measure is then expanded to an outer measure defined on all subsets of the given set $\Omega$. This outer measure qualifies as a measure on the collection of Carathéodory-measurable sets, which form a $\sigma$-algebra. The theorem additionally stipulates that $\sigma$-finiteness of the pre-measure makes this extension unique.
Now, going back to our case of finding a random variable $X$ with distribution function $F$, we can simply put $(\Omega, \mathcal{F}, \mathbb{P}) = (\mathbb{R}, \mathcal{B}(\mathbb{R}), \mu_F)$ and set $X(\omega) = \omega$, where $\mu_F$ is the Lebesgue-Stieltjes measure associated with $F$. Then we have $\mathbb{P}(X \le x) = \mu_F((-\infty, x]) = F(x)$.
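As a quick empirical illustration (a numpy sketch, not part of the measure-theoretic argument; the exponential $F$ below is my own choice), the closely related quantile-transform construction $X = F^{-1}(U)$, with $U$ uniform on $(0, 1)$, also produces a random variable whose distribution function is $F$:

```python
import numpy as np

# Target distribution function: F(x) = 1 - exp(-x) (Exp(1)), which is
# right-continuous, non-decreasing, with limits 0 and 1 at -/+ infinity.
def F(x):
    return 1.0 - np.exp(-x)

# Generalized inverse (quantile function) of F.
def F_inv(u):
    return -np.log(1.0 - u)

rng = np.random.default_rng(0)
u = rng.uniform(size=200_000)   # U ~ Uniform(0, 1)
x = F_inv(u)                    # X = F^{-1}(U) satisfies P(X <= t) = F(t)

# Empirical check: the empirical CDF of the samples should match F.
for t in [0.5, 1.0, 2.0]:
    assert abs(np.mean(x <= t) - F(t)) < 0.01
print("P(X <= 1) ~", round(np.mean(x <= 1.0), 3), "vs F(1) =", round(F(1.0), 3))
```

The assertion is exactly the statement $\mathbb{P}(X \le x) = F(x)$, checked by simulation at a few test points.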
More generally, this tells us how to construct a random vector $(X_1, \dots, X_n)$ out of a joint distribution function $F(x_1, \dots, x_n)$.
How about an entire sequence $X_1, X_2, \dots$ for which we prescribe the finite-dimensional distribution function $F_{t_1, \dots, t_n}$ for every $n$?
In probability theory textbooks, this condition is loosely referred to as the “i.i.d. condition” for infinitely many random variables. In order to show that it is feasible, one way is to establish a probability measure on the sequence space whose finite-dimensional marginals coincide with the prescribed distributions of the random variables in the sequence.
Daniell-Kolmogorov theorem
Let $\{F_{t_1, \dots, t_n}\}$ ($n \ge 1$, $t_i \in T$) be a given family of finite-dimensional probability distribution functions, and denote by $Q_{t_1, \dots, t_n}$ the corresponding (induced) distributions on $\mathcal{B}(\mathbb{R}^n)$. Suppose these distributions satisfy the consistency conditions
1) If $(s_1, \dots, s_n)$ is a permutation of $(t_1, \dots, t_n)$, say $s_i = t_{\pi(i)}$, then for any Borel sets $A_1, \dots, A_n$ of the real line we have $Q_{s_1, \dots, s_n}(A_{\pi(1)} \times \cdots \times A_{\pi(n)}) = Q_{t_1, \dots, t_n}(A_1 \times \cdots \times A_n)$.
2) If $t = (t_1, \dots, t_n)$ and $s = (t_1, \dots, t_{n-1})$, then for any Borel set $A \in \mathcal{B}(\mathbb{R}^{n-1})$ we have $Q_s(A) = Q_t(A \times \mathbb{R})$.
Then on the “canonical” space $(\Omega, \mathcal{F}) = (\mathbb{R}^T, \mathcal{B}(\mathbb{R}^T))$, there exists a probability measure $\mathbb{P}$ such that the coordinate process $X_t(\omega) = \omega(t)$ satisfies $Q_{t_1, \dots, t_n}(A_1 \times \cdots \times A_n) = \mathbb{P}(X_{t_1} \in A_1, \dots, X_{t_n} \in A_n)$ for every $n \ge 1$, $t_1, \dots, t_n \in T$, and Borel sets $A_1, \dots, A_n$.
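Before turning to the proof, the two consistency conditions can be sanity-checked numerically for the product family that will reappear in the i.i.d. construction below (the discrete measure here is a hypothetical example; for product measures both conditions reduce to commutativity of multiplication and $\mu(\mathbb{R}) = 1$, which is exactly why the family is consistent):

```python
import itertools
import numpy as np

# A toy discrete measure mu on {0, 1, 2} (hypothetical example).
support = [0, 1, 2]
p = [0.2, 0.5, 0.3]                     # mu({0}), mu({1}), mu({2})

def mu(A):
    return sum(p[i] for i, s in enumerate(support) if s in A)

def Q(sets):
    # Product measure of a rectangle: Q(A_1 x ... x A_n) = prod mu(A_i).
    return float(np.prod([mu(A) for A in sets]))

A1, A2, A3 = {0}, {1, 2}, {0, 2}

# (1) Permutation consistency: permuting the coordinates together with the
# sides of the rectangle leaves the measure unchanged.
base = Q([A1, A2, A3])
for perm in itertools.permutations([A1, A2, A3]):
    assert abs(Q(list(perm)) - base) < 1e-12

# (2) Marginal consistency: appending a full-space factor (measure 1)
# does not change the value.
full = set(support)
assert abs(Q([A1, A2, full]) - Q([A1, A2])) < 1e-12
print("consistency conditions hold for the product family")
```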
The proof follows the natural logic of reduction, detailed here, where the author conveniently assumed that the index set $T$ is the time domain $[0, \infty)$. It largely proceeds in the following four steps.
1) We begin by defining $\mathbb{P}(C) = Q_{t_1, \dots, t_n}(B)$ for every cylinder set $C = \{\omega \in \mathbb{R}^T : (\omega(t_1), \dots, \omega(t_n)) \in B\}$ with $B \in \mathcal{B}(\mathbb{R}^n)$. This is well-defined due to conditions 1 and 2 (think of the case when $B$ is a measurable rectangle; the permutation condition is necessary because the same cylinder set admits several representations).
2) We can check that the cylinder sets form a field satisfying the hypotheses of the Carathéodory extension theorem, and since $\mathbb{P}$ has total mass 1, it is automatically $\sigma$-finite. Therefore, it suffices to show that $\mathbb{P}$ is countably additive on this field.
3) Countable additivity of a finitely additive set function on a field is equivalent to continuity at the empty set, so we reformulate the problem into showing that $\mathbb{P}(C_k) \downarrow 0$ for every “decreasing” sequence of cylinder sets $C_k \downarrow \emptyset$. Writing $C_k = \{\omega : (\omega(t_1), \dots, \omega(t_{m_k})) \in B_k\}$ for some “decreasing” sequence of Borel sets $B_k$ reduces the problem to one involving Borel sets of finite-dimensional spaces, which is easier to handle.
4) By inner regularity, we can approximate each $B_k$ from within by compact sets $K_k \subset B_k$. This approximation allows us to use a limiting argument: assuming that $\mathbb{P}(C_k)$ does not converge to zero, we can extract some nice sequence of points $\omega_k$ satisfying $(\omega_k(t_1), \dots, \omega_k(t_{m_j})) \in K_j$ for every $j \le k$, and a diagonal argument then produces a point belonging to every $C_k$. This leads to a contradiction, because $C_k \downarrow \emptyset$, and so it implies the absurd statement $\emptyset = \bigcap_k C_k \neq \emptyset$.
Continuing with i.i.d.
Let $\mu$ be a probability measure on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$ with distribution function $F$, and define $F_{t_1, \dots, t_n}(x_1, \dots, x_n) = F(x_1) \cdots F(x_n)$ for $x_i \in \mathbb{R}$ and distinct indices $t_1, \dots, t_n \in \mathbb{N}$. This is a probability distribution function on $\mathbb{R}^n$ and induces the probability measure $Q_{t_1, \dots, t_n} = \mu \times \cdots \times \mu$, the $n$-fold product of $\mu$ with itself, on $\mathcal{B}(\mathbb{R}^n)$. It is clear that the family $\{Q_{t_1, \dots, t_n}\}$ satisfies the consistency conditions of the theorem above.
According to the theorem, there exists a probability measure $\mathbb{P}$ on $(\mathbb{R}^\infty, \mathcal{B}(\mathbb{R}^\infty))$ with $\mathbb{P}(X_{t_1} \in A_1, \dots, X_{t_n} \in A_n) = \mu(A_1) \cdots \mu(A_n)$ for every choice of distinct indices $t_1, \dots, t_n$ in $\mathbb{N}$ and Borel sets $A_1, \dots, A_n$. But this means that the coordinate random variables $X_1, X_2, \dots$ are independent with common distribution $\mu$, i.e., they are i.i.d.
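A small simulation (a numpy sketch, not part of the argument; the choice $\mu = N(0, 1)$ is my own) illustrates the factorization of the finite-dimensional distributions that the theorem guarantees:

```python
import numpy as np

# Empirical illustration: draw the first two coordinates of the i.i.d.
# sequence many times and check that the joint distribution factorizes,
# P(X1 <= a, X2 <= b) = F(a) * F(b).
rng = np.random.default_rng(1)
n_trials = 200_000
X = rng.normal(size=(n_trials, 2))     # mu = N(0, 1), a hypothetical choice

a, b = 0.3, -0.5
joint = np.mean((X[:, 0] <= a) & (X[:, 1] <= b))
prod = np.mean(X[:, 0] <= a) * np.mean(X[:, 1] <= b)
assert abs(joint - prod) < 0.01        # independence of the coordinates
print(f"joint {joint:.3f} vs product {prod:.3f}")
```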
Application of Daniell-Kolmogorov in the construction of Brownian motion
On a side note, this is very similar to how Brownian motion is constructed.
Consider cylinder sets of the form
$$C = \{\omega \in \mathbb{R}^{[0,\infty)} : (\omega(t_1), \dots, \omega(t_n)) \in B\}, \qquad B \in \mathcal{B}(\mathbb{R}^n),\ 0 \le t_1 < \cdots < t_n,$$
and let $\mathcal{C}$ denote the field of all such sets. Further, let the $\sigma$-algebra generated by this field be denoted by $\mathcal{B}(\mathbb{R}^{[0,\infty)})$.
Using the Daniell-Kolmogorov theorem, we can establish a probability measure $\mathbb{P}$ on $(\mathbb{R}^{[0,\infty)}, \mathcal{B}(\mathbb{R}^{[0,\infty)}))$ under which the coordinate mapping process $B_t(\omega) = \omega(t)$ has stationary, independent increments. Further, an increment $B_t - B_s$, where $0 \le s < t$, is normally distributed with mean zero and variance $t - s$ (we can arrange this by explicitly writing down the joint Gaussian density as the finite-dimensional distributions).
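A Monte Carlo sketch (assuming numpy; the step size and horizon are arbitrary choices of mine) of the increment law encoded by these finite-dimensional Gaussian densities:

```python
import numpy as np

# Sanity check of the increment law: B_t - B_s ~ N(0, t - s), and disjoint
# increments are independent.
rng = np.random.default_rng(2)
n_paths, dt, n_steps = 100_000, 0.01, 100
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(increments, axis=1)      # B at times 0.01, 0.02, ..., 1.0

s_idx, t_idx = 29, 79                  # s = 0.3, t = 0.8
inc = B[:, t_idx] - B[:, s_idx]
assert abs(inc.mean()) < 0.01          # mean zero
assert abs(inc.var() - 0.5) < 0.02     # Var(B_t - B_s) = t - s = 0.5
# Independence of increments: B_t - B_s is uncorrelated with B_s - B_0.
assert abs(np.corrcoef(inc, B[:, s_idx])[0, 1]) < 0.02
print("increment mean/var:", round(inc.mean(), 3), round(inc.var(), 3))
```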
In order to establish continuity of the sample paths, we invoke the modification theorem of Kolmogorov and Čentsov, whose proof is detailed in Karatzas and Shreve:
Suppose that a process $X = \{X_t;\ 0 \le t \le T\}$ on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ satisfies the condition
$$\mathbb{E}|X_t - X_s|^{\alpha} \le C|t - s|^{1+\beta}, \qquad 0 \le s, t \le T,$$
for some positive constants $\alpha$, $\beta$, and $C$. Then there exists a continuous modification $\tilde{X} = \{\tilde{X}_t;\ 0 \le t \le T\}$ of $X$, which is locally Hölder-continuous with exponent $\gamma$ for every $\gamma \in (0, \beta/\alpha)$, i.e.,
$$\mathbb{P}\Big[\omega:\ \sup_{0 < t - s < h(\omega),\ s, t \in [0, T]} \frac{|\tilde{X}_t(\omega) - \tilde{X}_s(\omega)|}{|t - s|^{\gamma}} \le \delta\Big] = 1,$$
where $h(\omega)$ is an a.s. positive random variable and $\delta > 0$ is an appropriate constant.
In our case, the condition holds in particular with $\alpha = 4$, $\beta = 1$, and $C = 3$, since $\mathbb{E}|B_t - B_s|^4 = 3|t - s|^2$. Since Hölder continuity implies continuity, we are done with the construction of Brownian motion.
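The moment identity behind this choice of constants, $\mathbb{E}|B_t - B_s|^4 = 3|t - s|^2$ (the fourth moment of a centered Gaussian with variance $t - s$), can be checked by simulation; the value of $t - s$ below is arbitrary:

```python
import numpy as np

# For a Gaussian increment with variance t - s, E|B_t - B_s|^4 = 3 (t - s)^2,
# giving alpha = 4, beta = 1, C = 3 in the Kolmogorov-Centsov condition.
rng = np.random.default_rng(3)
t_minus_s = 0.25
inc = rng.normal(0.0, np.sqrt(t_minus_s), size=1_000_000)
fourth_moment = np.mean(inc ** 4)
assert abs(fourth_moment - 3 * t_minus_s ** 2) < 0.01
print("E|B_t-B_s|^4 ~", round(fourth_moment, 4), "vs 3(t-s)^2 =", 3 * t_minus_s ** 2)
```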
By the way, there are other ways of constructing Brownian motion. One of them constructs Brownian motion on the dyadic rationals of the interval $[0, 1]$ (by explicitly writing down the joint distribution), fills in the gaps (using uniform continuity), and patches countably many copies together to cover $[0, \infty)$; this is also elaborated in Karatzas and Shreve.
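The dyadic construction can be sketched in code (a numpy sketch; `levy_construction` is a hypothetical helper name): starting from the endpoints, each refinement level fills in midpoints using the Brownian-bridge conditional law $B_m \mid B_s, B_t \sim N\big((B_s + B_t)/2,\ (t - s)/4\big)$.

```python
import numpy as np

def levy_construction(levels, rng):
    # Start with B_0 = 0 and B_1 ~ N(0, 1) on the grid {0, 1}.
    t = np.array([0.0, 1.0])
    B = np.array([0.0, rng.normal(0.0, 1.0)])
    for _ in range(levels):
        # Midpoints of the current grid and their bridge conditional law:
        # mean = average of the endpoints, variance = (spacing) / 4.
        mid_t = (t[:-1] + t[1:]) / 2.0
        mid_mean = (B[:-1] + B[1:]) / 2.0
        mid_sd = np.sqrt((t[1:] - t[:-1]) / 4.0)
        mid_B = rng.normal(mid_mean, mid_sd)
        # Interleave the new midpoints with the existing grid points.
        t = np.insert(t, range(1, len(t)), mid_t)
        B = np.insert(B, range(1, len(B)), mid_B)
    return t, B

rng = np.random.default_rng(4)
t, B = levy_construction(10, rng)      # dyadic grid with spacing 2^-10
print(len(t), "grid points; B_1 =", round(B[-1], 3))
```

Filling the gaps between dyadic points by uniform continuity then yields a continuous path on $[0, 1]$, matching the outline above.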