
Monday, July 7, 2014

Law of Iterated Expectation

Let $X$ and $Y$ denote continuous real-valued random variables with joint probability density function $f(x, y)$. The marginal density function of $Y$ is $f_Y(y) := \int_{x \in \mathbb R} f(x, y) \, dx$. The expectation $E(Y)$ of $Y$ can be recovered by integrating against the marginal density function. In particular, \begin{equation}\label{eq:E(Y)} E(Y)=\int_{y \in \mathbb R} y f_Y(y) \, dy = \int_{y \in \mathbb R} \int_{x \in \mathbb R} y f(x, y) \ dx \ dy. \end{equation} The conditional probability density function of $Y$ given that $X$ is equal to some value $x$ (defined wherever $f_X(x) > 0$) is \begin{equation}\label{eq:cdf} f_{Y \mid X} (y \mid x) := \frac{f(x, y)}{f_X(x)} = \frac{f(x,y)}{\int_{t \in \mathbb R} f(x, t) \ dt}. \end{equation} The conditional expectation $E(Y \mid X = x)$ of $Y$ given that $X$ has value $x$ is given by \begin{equation}\label{eq:cond exp} E(Y \mid X = x) = \int_{y \in \mathbb R} y f_{Y \mid X} (y \mid x) \ dy. \end{equation} Since $E(Y \mid X = x)$ is a function of $x$, composing it with $X$ yields a random variable, denoted $E(Y \mid X)$, whose expectation we can in turn compute. Now \begin{align*} E (E (Y \mid X)) &= \int_{x \in \mathbb R} E(Y \mid X = x) f_X(x) \ dx \\ & = \int_{x \in \mathbb R} f_X(x) \left( \int_{y \in \mathbb R} y f_{Y \mid X}(y \mid x) \ dy \right) \ dx \quad \text{(by \ref{eq:cond exp})} \\ & = \int_{x \in \mathbb R} \int_{y \in \mathbb R} f_X(x) \cdot y \, \frac{f(x,y)}{f_X(x)} \ dy \ dx \quad \text{(by \ref{eq:cdf})}\\ & = \int_{y \in \mathbb R} \int_{x \in \mathbb R} y f(x, y) \ dx \ dy \quad \text{(swapping the order of integration by Fubini's theorem)} \\ & = E(Y). \quad \text{(by \ref{eq:E(Y)})} \end{align*} This result is sometimes called the law of iterated expectation.
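
As a quick numerical sanity check (not part of the derivation above), here is a small Monte Carlo sketch assuming the concrete model $X \sim \operatorname{Exp}(1)$ and $Y \mid X = x \sim N(x, 1)$, for which $E(Y \mid X) = X$ and hence $E(E(Y \mid X)) = E(X) = 1 = E(Y)$; both sample means should land near $1$.

```python
import numpy as np

# Monte Carlo check of E(E(Y | X)) = E(Y) for the illustrative model
# X ~ Exponential(1), Y | X = x ~ Normal(x, 1), so that E(Y | X) = X.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.exponential(scale=1.0, size=n)   # draws of X
y = rng.normal(loc=x, scale=1.0)         # draws of Y given X

print(x.mean())   # estimates E(E(Y | X)) = E(X); should be close to 1
print(y.mean())   # estimates E(Y) directly; should also be close to 1
```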

Tuesday, July 1, 2014

Distributions, Densities, and Mass Functions

One unfortunate aspect of probability theory is that common measure theoretic constructions are given different, often conflated, names. The technical definitions of the probability distribution, probability density function, and probability mass function for a random variable are all related to each other, as this post hopes to make clear.

We begin with some general measure theoretic constructions. Let $(A, \mathcal A)$ and $(B, \mathcal B)$ be measurable spaces and suppose that $\mu$ is a measure on $(A, \mathcal A)$. Any $(\mathcal A, \mathcal B)$-measurable function $X \colon A \to B$ induces a push-forward measure $X_*(\mu)$ on $(B, \mathcal B)$ via \[ [X_*(\mu)](S) := \mu(X^{-1}(S)) \quad \text{for all } S \in \mathcal B. \]
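
For illustration (the finite sets and maps below are invented for the example, not taken from the post), the push-forward of a discrete measure can be computed directly from this definition.

```python
# Push-forward of a discrete measure mu on a finite set A along a map X : A -> B.
# [X_*(mu)](S) := mu(X^{-1}(S)) is the total mass of the points that X sends into S.

mu = {"a1": 0.1, "a2": 0.4, "a3": 0.2, "a4": 0.3}   # a measure on (A, 2^A)

def X(a):
    # a measurable map X : A -> B with B = {0, 1}
    return 0 if a in {"a1", "a2"} else 1

def pushforward(mu, X, S):
    # mass of X^{-1}(S) under mu
    return sum(mass for a, mass in mu.items() if X(a) in S)

print(pushforward(mu, X, {0}))      # mu({a1, a2}) = 0.5
print(pushforward(mu, X, {0, 1}))   # mu(A) = 1.0
```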

Thursday, April 24, 2014

Ring Endomorphisms of the Reals

Today we present a (mostly) algebraic proof that the only field endomorphism of the real numbers $\mathbb R$ is the identity map. Many proofs of this fact invoke continuity, which we deliberately avoid here. To be clear, we require in this discussion that ring morphisms fix the multiplicative identity. First we recall that any field of characteristic $0$ has a prime subfield isomorphic to the rationals $\mathbb Q$, so we will always assume that characteristic $0$ fields contain $\mathbb Q$.
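
For concreteness, here is the standard purely algebraic computation (a sketch; the symbol $\varphi$ is introduced here for illustration) showing that a ring endomorphism $\varphi$ of such a field fixes this copy of $\mathbb Q$ pointwise: \[ \varphi(n) = \varphi(\underbrace{1 + \cdots + 1}_{n \text{ times}}) = n \varphi(1) = n \quad \text{for } n \in \mathbb Z_{\geq 0}, \] and $\varphi(-n) = -\varphi(n) = -n$, so $\varphi$ fixes $\mathbb Z$; then for integers $p, q$ with $q \neq 0$, \[ q \cdot \varphi\!\left(\tfrac{p}{q}\right) = \varphi(q) \, \varphi\!\left(\tfrac{p}{q}\right) = \varphi(p) = p, \qquad \text{whence } \varphi\!\left(\tfrac{p}{q}\right) = \tfrac{p}{q}. \]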

Sunday, January 12, 2014

Categorical Limits and Colimits

Many types of universal constructions that appear in a wide variety of mathematical contexts can be realized as categorical limits and colimits. To define a limit we first need the notion of a cone of a diagram. A diagram of type $J$ in a category $\mathcal C$ is simply a functor $F \colon J \to \mathcal C$. We imagine $J$ as indexing the objects and morphisms of $\mathcal C$ under consideration. When $J$ is a finite category it can be visualized as a directed graph.

A cone of $F$ is a pair $(N, \Psi)$ where $N$ is an object of $\mathcal C$ and $\Psi$ is a collection of $\mathcal C$-morphisms $\Psi_X \colon N \to F(X) $ (one for each object $X$ in $J$) such that $\Psi_Y = F(f) \circ \Psi_X$ for every $J$-morphism $f \colon X \to Y$. Given two cones $(N, \Psi)$ and $(M, \Phi)$ we call a $\mathcal C$-morphism $u \colon N \to M$ a cone morphism from $(N, \Psi)$ to $(M, \Phi)$ provided that $u$ respects the cone property, which is to say that for every object $X$ in $J$ the morphism $\Psi_X$ factors through $u$, i.e., $\Psi_X = \Phi_X \circ u$. We will write $u \colon (N, \Psi) \to (M, \Phi)$ to denote that $u$ is a cone morphism.
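
To make the commuting condition concrete, here is a small illustrative sketch (all of the sets, functions, and names below are invented for the example) of a diagram of shape $J = \{X \xrightarrow{f} Y\}$ in the category of sets, together with a check that a candidate pair $(N, \Psi)$ really is a cone, i.e., that $\Psi_Y = F(f) \circ \Psi_X$.

```python
# A tiny diagram F of shape J = {X --f--> Y} in Set, with functions encoded as dicts.
F_X = {0, 1, 2}                                 # F(X)
F_Y = {"even", "odd"}                           # F(Y)
F_f = {0: "even", 1: "odd", 2: "even"}          # F(f) : F(X) -> F(Y)

# A candidate cone (N, Psi): an apex N with legs Psi_X : N -> F(X) and Psi_Y : N -> F(Y).
N = {"a", "b"}
Psi_X = {"a": 0, "b": 1}
Psi_Y = {"a": "even", "b": "odd"}

def is_cone(apex, legs_X, legs_Y, edge):
    # Cone condition: Psi_Y(n) == F(f)(Psi_X(n)) for every n in the apex.
    return all(legs_Y[n] == edge[legs_X[n]] for n in apex)

print(is_cone(N, Psi_X, Psi_Y, F_f))   # True: (N, Psi) is a cone of F
```

Encoding the legs as dictionaries keeps the check a direct transcription of the defining equation $\Psi_Y = F(f) \circ \Psi_X$.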