Def: We say that a measure space (\Omega, \mathcal F, P) is a probability space if P(\Omega) = 1; in turn, a random variable X is a measurable function X: \Omega \to S, for some measurable space S, usually either \mathbb{R}^n or \mathbb{C}^n.
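For example, a fair coin flip is modeled by \Omega = \{H, T\}, \mathcal F = 2^\Omega, and P(\{H\}) = P(\{T\}) = 1/2; the function X with X(H) = 1 and X(T) = 0 is then a real valued random variable recording whether the flip came up heads.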
From now on, all random variables mentioned together will implicitly be defined on the same probability space, and we will only consider real valued random variables unless otherwise stated.
Def: The expectation of a random variable X is simply the integral E[X] = \int_\Omega X dP.
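When X takes only countably many values x_1, x_2, \dots, this integral reduces to the familiar sum E[X] = \sum_i x_i P(X = x_i), provided the sum converges absolutely.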
Def: The covariance between two random variables X and Y is the expectation \mathop{\mathrm{Cov}}(X, Y) = E[(X - E[X])(Y - E[Y])] and the variance is simply \mathop{\mathrm{Var}}(X) = \mathop{\mathrm{Cov}}(X, X) = E[(X - E[X])^2].
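Expanding the product and using linearity of expectation gives the frequently used identities \mathop{\mathrm{Cov}}(X, Y) = E[XY] - E[X]E[Y] and \mathop{\mathrm{Var}}(X) = E[X^2] - (E[X])^2.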
Def: A finite collection of events E_1, \dots, E_n is independent if P \left( \bigcap_{i=1}^m E_{k_i} \right) = \prod_{i=1}^m P(E_{k_i}) for all 2 \leq m \leq n and all indices 1 \leq k_1 < k_2 < \dots < k_m \leq n. An infinite collection of events is independent if every finite subcollection of those events is independent.
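A classical example shows why all subcollections must be checked: for two independent fair coin flips, let E_1 = \{\text{first flip is heads}\}, E_2 = \{\text{second flip is heads}\}, and E_3 = \{\text{the two flips agree}\}. Each pair is independent, since each pairwise intersection has probability 1/4 = 1/2 \cdot 1/2, yet P(E_1 \cap E_2 \cap E_3) = 1/4 \neq 1/8 = P(E_1) P(E_2) P(E_3), so the three events are not independent.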
Def: A collection of classes of sets \{\mathcal F_n\}_{n=1}^\infty is independent if, for every choice of events E_n \in \mathcal F_n, the events \{E_n\}_{n=1}^\infty are independent. Similarly, a sequence of random variables \{X_n\}_{n=1}^\infty is independent if \{\sigma(X_n)\}_{n=1}^\infty is independent.
Def: For a sequence of events \{E_n\}_{n=1}^\infty, we say that E is the event that E_n happens infinitely often (oft abbreviated i.o.) if E = \{E_n \text{ i.o.} \} = \bigcap_{m=1}^\infty \bigcup_{n=m}^\infty E_n = \limsup_n E_n. The counterpart is eventually, where E = \{E_n \text{ eventually} \} = \bigcup_{m=1}^\infty \bigcap_{n=m}^\infty E_n = \liminf_n E_n.
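For instance, if E_n is the event that the n-th toss of a fair coin lands heads, then \{E_n \text{ i.o.}\} is the event that heads appears infinitely many times, while \{E_n \text{ eventually}\} is the event that every toss from some point onward is heads; in general \{E_n \text{ eventually}\} \subseteq \{E_n \text{ i.o.}\}, since \liminf_n E_n \subseteq \limsup_n E_n.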
Def: Given a sequence of random variables \{X_n\}_{n=1}^\infty as well as another random variable X, we say that
We begin with the three big guns for interchanging integration and limits.