Basic Concept of Probability Distributions 8: Normal Distribution

PDF & CDF

The probability density function of the normal distribution is $$f(x; \mu, \sigma) = {1\over\sqrt{2\pi}\sigma}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}$$ The cumulative distribution function is defined by $$F(x; \mu, \sigma) = \Phi\left({x-\mu\over\sigma}\right)$$ where $$\Phi(z) = {1\over\sqrt{2\pi}} \int_{-\infty}^{z}e^{-{1\over2}x^2}\ dx$$ is the standard normal cumulative distribution function.
Proof:
$$
\begin{align*}
\int_{-\infty}^{\infty}f(x; \mu, \sigma)\ dx &= \int_{-\infty}^{\infty}{1\over\sqrt{2\pi}\sigma}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}\ dx\\
&= {1\over\sqrt{2\pi}\sigma}\int_{-\infty}^{\infty}e^{-{1\over2}{(x-\mu)^2\over\sigma^2}}\ dx\\
&= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy\quad\quad\quad\quad\quad(\mbox{setting}\ y={x-\mu\over\sigma} \Rightarrow dx = \sigma dy)\\
\end{align*}
$$
Let $I = \int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy$, then
$$
\begin{eqnarray*}
I^2 &=& \int_{-\infty}^{\infty}e^{-{1\over2}y^2}\ dy\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx\\
&=& \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}e^{-{1\over2}(y^2+x^2)}\ dydx\quad\quad\quad\quad(\mbox{setting}\ x=r\cos\theta, y=r\sin\theta)\\
&=& \int_{0}^{\infty}\int_{0}^{2\pi}e^{-{1\over2}r^2}\ rd\theta dr \\
& & (\mbox{double integral}\ \iint\limits_{D}f(x, y)\ dxdy = \iint\limits_{D^*}f(r\cos\theta, r\sin\theta)r\ drd\theta) \\
&=& 2\pi\int_{0}^{\infty}re^{-{1\over2}r^2}\ dr\\
&=& -2\pi e^{-{1\over2}r^2}\Big|_{0}^{\infty}\\
&=& 2\pi
\end{eqnarray*}
$$
Hence $$\int_{-\infty}^{\infty}f(x; \mu, \sigma)\ dx = {1\over\sqrt{2\pi}} \cdot\sqrt{2\pi} = 1$$
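As a quick numerical sanity check (a minimal sketch assuming NumPy and SciPy are available; the parameter values are arbitrary), we can confirm that the density integrates to 1:

```python
import numpy as np
from scipy.integrate import quad

def normal_pdf(x, mu, sigma):
    """Normal density f(x; mu, sigma) as defined above."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

# Integrate over the whole real line for arbitrary parameters.
total, _ = quad(normal_pdf, -np.inf, np.inf, args=(1.5, 2.0))
print(total)  # ~1.0
```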

Standard Normal Distribution

If $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then $$Z = {X-\mu\over\sigma}$$ is normally distributed with parameters 0 and 1.

Proof:
An important conclusion is that if $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then $Y = aX + b$ is normally distributed with parameters $a\mu + b$ and $a^2\sigma^2$. We take $a > 0$ below (for $a < 0$ the inequality in the third step reverses, and the same argument goes through with $|a|$ in place of $a$). Let $F_{Y}$ denote the cumulative distribution function of $Y$:
$$
\begin{align*}
F_{Y}(x) &= P(Y \leq x)\\
&= P(aX + b \leq x)\\
&= P(X \leq {x-b\over a})\\
&= F_{X}\left({x-b\over a}\right)
\end{align*}
$$
where $F_{X}(x)$ is the cumulative distribution function of $X$. By differentiation, the probability density function of $Y$ is
$$
\begin{align*}
f_{Y}(x) &= {1\over a}f_{X}\left({x-b\over a}\right)\\
&= {1\over\sqrt{2\pi}a\sigma}e^{-{1\over2}{({x-b\over a} - \mu)^2\over \sigma^2}}\\
&= {1\over\sqrt{2\pi}(a\sigma)}e^{-{1\over2}{(x-b - a\mu)^2\over a^2\sigma^2}}\\
&= {1\over\sqrt{2\pi}(a\sigma)}e^{-{1\over2}{(x-(b + a\mu))^2\over (a\sigma)^2}}
\end{align*}
$$
which shows that $Y$ is normally distributed with parameters $a\mu + b$ and $a^2\sigma^2$.
According to the above result, taking $a = {1\over\sigma}$ and $b = -{\mu\over\sigma}$, we deduce that $Z = {X-\mu\over\sigma}$ is normally distributed with parameters 0 and 1.
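A small simulation illustrates the standardization (a sketch assuming NumPy; the sample size and parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 3.0, 2.0

X = rng.normal(mu, sigma, size=100_000)  # X ~ N(mu, sigma^2)
Z = (X - mu) / sigma                     # standardized samples

print(Z.mean(), Z.std())  # ~0.0 and ~1.0, as expected for N(0, 1)
```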

Mean

The expected value is $$E[X] = \mu$$
Proof:
$$
\begin{align*}
E[Z] &= \int_{-\infty}^{\infty}xf_{Z}(x)\ dx\quad\quad\quad \quad\quad \quad\quad (\mbox{setting}\ Z={X-\mu\over\sigma})\\
&= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}xe^{-{1\over2}x^2}\ dx\\
&= -{1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ d\left(-{1\over2}x^2\right)\\
&= -{1\over\sqrt{2\pi}}e^{-{1\over2}x^2}\Big|_{-\infty}^{\infty}\\
&= 0
\end{align*}
$$
Hence
$$
\begin{align*}
E[X] &= E\left[\sigma Z+\mu\right]\\
&= \sigma E[Z] + \mu\\
&= \mu
\end{align*}
$$
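The same conclusion can be checked symbolically (a sketch assuming SymPy):

```python
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# Normal density f(x; mu, sigma) as defined above
pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sp.sqrt(2 * sp.pi) * sigma)

# E[X] = integral of x * f(x) over the real line
print(sp.simplify(sp.integrate(x * pdf, (x, -sp.oo, sp.oo))))  # mu
```
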
Variance

The variance is $$\mbox{Var}(X) = \sigma^2$$
Proof:
$$
\begin{align*}
E\left[Z^2\right] &= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}x^2e^{-{1\over2}x^2}\ dx\quad\quad\quad \quad\quad \quad\quad\quad\quad\quad (\mbox{setting}\ Z={X-\mu\over\sigma})\\
&= {1\over\sqrt{2\pi}}\left(-xe^{-{1\over2}x^2}\Big|_{-\infty}^{\infty} +\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx\right)\quad\quad\quad(\mbox{integrating by parts})\\
&= {1\over\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-{1\over2}x^2}\ dx \quad\quad\quad\quad\quad\quad\quad(\mbox{standard normal distribution})\\
&= 1
\end{align*}
$$
where the integration by parts uses $$u= x,\ dv = xe^{-{1\over2}x^2}\ dx$$ $$\implies du = dx,\ v = \int xe^{-{1\over2}x^2}\ dx = -e^{-{1\over2}x^2}$$ $$\implies \int x^2e^{-{1\over2}x^2}\ dx =-xe^{-{1\over2}x^2} +\int e^{-{1\over2}x^2}\ dx$$ Since $E[Z] = 0$, we have $\mbox{Var}(Z) = E\left[Z^2\right] - E[Z]^2 = 1$. Hence $$\mbox{Var}(X) = \mbox{Var}(\sigma Z + \mu)= \sigma^2\mbox{Var}(Z) = \sigma^2$$
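A quick numerical check of $E\left[Z^2\right] = 1$ (a sketch assuming SciPy):

```python
import numpy as np
from scipy.integrate import quad

# Second moment of the standard normal: integral of x^2 * phi(x)
second_moment, _ = quad(
    lambda x: x**2 * np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi),
    -np.inf, np.inf,
)
print(second_moment)  # ~1.0, so Var(Z) = 1 and Var(X) = sigma^2
```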

Basic Concept of Probability Distributions 7: Uniform Distribution

PDF & CDF

The probability density function of the uniform distribution is $$f(x; \alpha, \beta) = \begin{cases}{1\over\beta-\alpha} & \mbox{if}\ \alpha < x < \beta\\ 0 & \mbox{otherwise} \end{cases} $$ The cumulative distribution function of the uniform distribution is $$F(x) = \begin{cases}0 & x\leq\alpha \\ {x-\alpha\over \beta-\alpha} & \alpha < x < \beta\\ 1 & x \geq \beta \end{cases}$$ Proof:
$$
\begin{align*}
\int_{-\infty}^{\infty}f(x; \alpha, \beta)\ dx &= \int_{\alpha}^{\beta}{1\over\beta-\alpha}\ dx\\
&= {x\over\beta-\alpha}\Big|_{\alpha}^{\beta}\\
&= {\beta\over\beta-\alpha} - {\alpha\over\beta-\alpha}\\
&= 1
\end{align*}
$$
And for $\alpha < x < \beta$
$$
\begin{align*}
F(x; \alpha, \beta) &= \int_{-\infty}^{x}f(t; \alpha, \beta)\ dt\\
&= \int_{\alpha}^{x}{1\over\beta-\alpha}\ dt\\
&= {t\over\beta-\alpha}\Big|_{\alpha}^{x}\\
&= {x - \alpha\over\beta-\alpha}
\end{align*}
$$
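The derived CDF matches SciPy's parameterization of the uniform distribution, which uses `loc` $= \alpha$ and `scale` $= \beta - \alpha$ (a sketch assuming SciPy; the values are arbitrary):

```python
from scipy.stats import uniform

alpha, beta = 2.0, 5.0
dist = uniform(loc=alpha, scale=beta - alpha)

for x in (1.0, 3.0, 6.0):
    # Hand-derived piecewise CDF from above
    hand = 0.0 if x <= alpha else (1.0 if x >= beta else (x - alpha) / (beta - alpha))
    print(x, hand, dist.cdf(x))  # hand-derived and library values agree
```
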
Mean

The expected value is $$\mu = E[X] = {\beta + \alpha \over 2}$$
Proof:
$$
\begin{align*}
E[X] &= \int_{-\infty}^{\infty}xf(x; \alpha, \beta)\ dx\\
&= \int_{\alpha}^{\beta}{x\over\beta-\alpha}\ dx\\
&= {x^2\over2(\beta - \alpha)}\Big|_{\alpha}^{\beta}\\
&= {\beta^2-\alpha^2\over2(\beta-\alpha)}\\
&= {\beta + \alpha \over 2}
\end{align*}
$$
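A direct numerical integration confirms the mean (a sketch assuming SciPy; the endpoints are arbitrary):

```python
from scipy.integrate import quad

alpha, beta = 2.0, 5.0

# E[X] = integral of x / (beta - alpha) over (alpha, beta)
mean, _ = quad(lambda x: x / (beta - alpha), alpha, beta)
print(mean, (alpha + beta) / 2)  # both 3.5
```
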
Variance

The variance is $$\sigma^2 = \mbox{Var}(X) = {(\beta - \alpha)^2 \over 12}$$
Proof:
$$
\begin{align*}
E\left[X^2\right] &= \int_{-\infty}^{\infty}x^2f(x;\alpha, \beta)\ dx\\
&= \int_{\alpha}^{\beta}{x^2\over\beta-\alpha}\ dx\\
&= {x^3\over 3(\beta - \alpha)}\Big|_{\alpha}^{\beta}\\
&= {\beta^3 - \alpha^3\over 3(\beta - \alpha)}\\
&= {\beta^2 + \alpha\beta + \alpha^2\over 3}
\end{align*}
$$
Hence
$$
\begin{align*}
\mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\
&= {\beta^2 + \alpha\beta + \alpha^2\over 3} - {\alpha^2+2\alpha\beta +\beta^2 \over 4}\\
&= {\beta^2 + \alpha^2 -2\alpha\beta \over 12}\\
&= {(\beta - \alpha) ^2 \over 12}
\end{align*}
$$
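And by simulation (a sketch assuming NumPy; the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 5.0

samples = rng.uniform(alpha, beta, size=1_000_000)
print(samples.var(), (beta - alpha) ** 2 / 12)  # ~0.75 for both
```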

Basic Concept of Probability Distributions 6: Exponential Distribution

PDF & CDF

The exponential probability density function (PDF) is $$f(x; \lambda) = \begin{cases}\lambda e^{-\lambda x} & x\geq0\\ 0 & x < 0 \end{cases}$$ The exponential cumulative distribution function (CDF) is $$F(x; \lambda) = \begin{cases}1 - e^{-\lambda x} & x\geq0\\ 0 & x < 0 \end{cases}$$ Proof:
$$
\begin{align*}
F(x; \lambda) &= \int_{0}^{x}f(t; \lambda)\ dt\\
&= \int_{0}^{x}\lambda e^{-\lambda t}\ dt \\
&= \lambda\cdot\left(-{1\over\lambda}\right)\int_{0}^{x}e^{-\lambda t}\ d(-\lambda t)\\
&= -e^{-\lambda t}\Big|_{0}^{x}\\
&= 1 - e^{-\lambda x}
\end{align*}
$$
valid for $x \geq 0$. And $$F(\infty) = \lim_{x\to\infty}\left(1 - e^{-\lambda x}\right) = 1$$ which confirms that $f$ integrates to 1.
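SciPy parameterizes the exponential distribution by `scale` $= 1/\lambda$, so the derived CDF can be checked as follows (a sketch assuming SciPy; the rate is arbitrary):

```python
import numpy as np
from scipy.stats import expon

lam = 0.5
dist = expon(scale=1 / lam)

for x in (0.5, 1.0, 3.0):
    print(1 - np.exp(-lam * x), dist.cdf(x))  # hand-derived vs. library, equal
```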

Mean

The expected value is $$\mu = E[X] = {1\over\lambda}$$
Proof:
$$
\begin{align*}
E\left[X^k\right] &= \int_{0}^{\infty}x^kf(x; \lambda)\ dx\\
&= \int_{0}^{\infty}x^k\lambda e^{-\lambda x}\ dx\\
&= -x^ke^{-\lambda x}\Big|_{0}^{\infty} + \int_{0}^{\infty}e^{-\lambda x}kx^{k-1}\ dx\quad\quad\quad\quad(\mbox{integrating by parts})\\
&= 0 + {k\over \lambda}\int_{0}^{\infty}x^{k-1}\lambda e^{-\lambda x}\ dx\\
&= {k\over\lambda}E\left[X^{k-1}\right]
\end{align*}
$$
Using integration by parts: $$u= x^k\Rightarrow du = kx^{k-1}\ dx,\ dv = \lambda e^{-\lambda x}\ dx\Rightarrow v = \int\lambda e^{-\lambda x}\ dx = -e^{-\lambda x}$$ $$\implies \int x^k\lambda e^{-\lambda x}\ dx =uv - \int v\ du = -x^ke^{-\lambda x} + \int e^{-\lambda x}kx^{k-1}\ dx$$
Hence setting $k=1$: $$E[X]= {1\over\lambda}E\left[X^0\right] = {1\over\lambda}$$
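More generally, the recursion $E\left[X^k\right] = {k\over\lambda}E\left[X^{k-1}\right]$ unrolls to $E\left[X^k\right] = {k!\over\lambda^k}$; a quick numerical check (a sketch assuming SciPy; the rate is arbitrary):

```python
import math
import numpy as np
from scipy.integrate import quad

lam = 0.5
for k in (1, 2, 3):
    # k-th moment: integral of x^k * lam * exp(-lam * x) over (0, inf)
    moment, _ = quad(lambda x: x**k * lam * np.exp(-lam * x), 0, np.inf)
    print(k, moment, math.factorial(k) / lam**k)  # numerical vs. closed form
```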

Variance

The variance is $$\sigma^2 = \mbox{Var}(X) = {1\over\lambda^2}$$

Proof:
$$
\begin{align*}
E\left[X^2\right] &= {2\over\lambda} E[X] \quad\quad \quad\quad (\mbox{setting}\ k=2)\\
&= {2\over\lambda^2}
\end{align*}
$$
Hence
$$
\begin{align*}
\mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\
&= {2\over\lambda^2} - {1\over\lambda^2}\\
&= {1\over\lambda^2}
\end{align*}
$$
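A simulation agrees (a sketch assuming NumPy; note that NumPy's `exponential` takes the scale $1/\lambda$, not the rate):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5

samples = rng.exponential(scale=1 / lam, size=1_000_000)
print(samples.var(), 1 / lam**2)  # ~4.0 for both
```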