Probability Notes

  • $Var(X) = E((X - E(X))^2) = E(X^2) - E(X)^2 = M_2 - M_1^2$, where $M_k = E(X^k)$ is the $k$-th raw moment (see the numerical check after this list)

  • $Cov(X,Y) = E((X - E(X))(Y - E(Y))) = E(XY) - E(X)E(Y)$
    • $Var(X) = Cov(X,X)$
  • Standard Deviation = $\sqrt{Var(X)}$
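
A quick numerical sanity check of the identities above; a minimal sketch assuming NumPy, with made-up distribution parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=100_000)
y = 0.5 * x + rng.normal(size=100_000)

# Var(X) = E(X^2) - E(X)^2  (= M2 - M1^2)
var_direct = np.mean((x - x.mean()) ** 2)
var_moments = np.mean(x**2) - x.mean() ** 2
print(var_direct, var_moments)                     # both close to 9.0

# Cov(X,Y) = E(XY) - E(X)E(Y), and Var(X) = Cov(X,X)
cov_moments = np.mean(x * y) - x.mean() * y.mean()
print(cov_moments, np.cov(x, y, bias=True)[0, 1])  # agree

# Standard deviation = sqrt(Var(X))
print(np.sqrt(var_direct), x.std())                # agree
```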

  • $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
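
Checked on a fair die, sketched with Python sets (A = "even roll", B = "roll is at most 3"):

```python
from math import isclose

omega = {1, 2, 3, 4, 5, 6}      # fair die: each outcome has probability 1/6
A, B = {2, 4, 6}, {1, 2, 3}

def p(event):
    return len(event) / len(omega)

assert isclose(p(A | B), p(A) + p(B) - p(A & B))  # 5/6 = 1/2 + 1/2 - 1/6
```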

  • Marginal distribution: $P(X = x) = \sum_y P(X = x, Y = y)$
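
For example, summing out one axis of a small joint table (the numbers here are made up):

```python
import numpy as np

# Joint table P(X=x, Y=y): rows index x, columns index y.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

p_x = joint.sum(axis=1)  # P(X=x) = sum_y P(X=x, Y=y) -> [0.3, 0.7]
p_y = joint.sum(axis=0)  # P(Y=y) = sum_x P(X=x, Y=y) -> [0.4, 0.6]
```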

  • If A and B are independent events: $P(A \vert B) = P(A)$ and $P(A \cap B) = P(A) \cdot P(B)$ (see the sample-space check after this list)
    • If X and Y are independent random variables: $Cov(X,Y) = 0$ (the converse does not hold in general)
  • A and B are conditionally independent given C if: $P(A \cap B \vert C) = P(A \vert C) \cdot P(B \vert C)$
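
A sample-space check of independence on two fair coin flips (A = "first flip is heads", B = "second flip is heads"):

```python
from itertools import product
from math import isclose

omega = set(product("HT", repeat=2))   # all four equally likely outcomes
A = {w for w in omega if w[0] == "H"}
B = {w for w in omega if w[1] == "H"}

def p(event):
    return len(event) / len(omega)

assert isclose(p(A & B), p(A) * p(B))  # 1/4 = 1/2 * 1/2, so A, B independent
```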

Bayes’ Formula:

$$P(A \vert B) = {P(B \vert A) P(A) \over P(B)}$$

$$P(A \vert B) = {P(A \cap B) \over P(B)} \quad \text{or} \quad P(A \cap B) = P(A \vert B) \cdot P(B) = P(B \vert A) \cdot P(A)$$

In other words:

$$\text{posterior} = {\text{likelihood} \cdot \text{prior} \over \text{evidence}}$$
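
For instance, a diagnostic-test example with made-up rates, computing the posterior directly from the formula:

```python
prior = 0.01            # P(disease)
sensitivity = 0.95      # P(positive | disease): the likelihood
false_positive = 0.05   # P(positive | no disease)

# Evidence P(positive), expanded with the law of total probability (below).
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence   # P(disease | positive)
print(posterior)                             # ~0.161
```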

Law of Total Probability

$$P(A) = P(A \vert B_1) \cdot P(B_1) + P(A \vert B_2) \cdot P(B_2) + \dots + P(A \vert B_n) \cdot P(B_n)$$

where $B_1, \dots, B_n$ partition the sample space.

or,

$$p(x) = \sum_{i=1}^{c} p(x \vert y_i) P(y_i)$$
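
As a worked example (illustrative numbers): three machines produce an item with known shares, each with its own defect rate, and the overall defect probability is the weighted sum:

```python
shares = [0.50, 0.30, 0.20]        # P(y_i): a partition, sums to 1
defect_rates = [0.01, 0.02, 0.05]  # P(defect | y_i)

p_defect = sum(rate * share for rate, share in zip(defect_rates, shares))
print(p_defect)                    # 0.005 + 0.006 + 0.010 = 0.021
```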

Counting

  • Permutation (ordered selections of $k$ items from $n$): $P_{k,n} = {n! \over (n-k)!}$
  • Combination (unordered selections of $k$ items from $n$): $C_{k,n} = {P_{k,n} \over k!} = {n! \over k!(n-k)!}$
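
These match Python's built-ins (math.perm and math.comb, available since Python 3.8):

```python
from math import comb, factorial, perm

n, k = 10, 3
assert perm(n, k) == factorial(n) // factorial(n - k)              # 720
assert comb(n, k) == perm(n, k) // factorial(k)                    # 120
assert comb(n, k) == factorial(n) // (factorial(k) * factorial(n - k))
```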

ToDo:

  • Distributions
  • Maximum Likelihood Estimator
  • Maximum a posteriori probability (MAP) estimator
  • The Central Limit Theorem