Probability Notes
- $Var(X) = E\big((X - E(X))^2\big) = E(X^2) - E(X)^2 = M_2 - M_1^2$ (in terms of the raw moments $M_1 = E(X)$, $M_2 = E(X^2)$)
- $Cov(X,Y) = E\big((X - E(X))(Y - E(Y))\big) = E(XY) - E(X)E(Y)$
- $Var(X) = Cov(X,X)$
- Standard deviation: $\sqrt{Var(X)}$
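A quick sketch (not from the notes) checking that the two variance formulas above agree, using a fair six-sided die as the example distribution:

```python
# Fair six-sided die: P(X = x) = 1/6 for x in 1..6 (illustrative choice).
outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6

m1 = sum(x * p for x in outcomes)       # M_1 = E(X) = 3.5
m2 = sum(x**2 * p for x in outcomes)    # M_2 = E(X^2)

var_def = sum((x - m1) ** 2 * p for x in outcomes)  # E((X - E(X))^2)
var_alt = m2 - m1**2                                # E(X^2) - E(X)^2

std = var_def ** 0.5                                # standard deviation
print(var_def, var_alt)  # both 35/12 ≈ 2.9167
```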
- $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
- Marginal distribution: $P(X = x) = \sum_y P(X = x, Y = y)$
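The marginalization sum can be sketched in Python with a small joint pmf stored as a dict (the numbers here are made up for illustration):

```python
# Hypothetical joint pmf P(X = x, Y = y) for X, Y in {0, 1}.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal of X: sum the joint probabilities over all values of y.
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)  # P(X=0) ≈ 0.3, P(X=1) ≈ 0.7
```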
- If A and B are independent events: $P(A \vert B) = P(A)$ and $P(A \cap B) = P(A) \cdot P(B)$
- If X and Y are independent random variables: $Cov(X,Y) = 0$ (the converse does not hold in general)
- A and B are conditionally independent given C if: $P(A \cap B \vert C) = P(A \vert C) \cdot P(B \vert C)$
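A minimal exact check of the product rule for independent events, using two events on a fair die (my own example, not from the notes): A = "roll is even" and B = "roll is 1 or 2" are independent, since $P(A \cap B) = 1/6 = 1/2 \cdot 1/3$.

```python
from fractions import Fraction

# Uniform distribution over a fair six-sided die.
omega = range(1, 7)

def prob(event):
    """Exact probability of an event under the uniform distribution on omega."""
    return Fraction(sum(1 for w in omega if event(w)), 6)

def A(w):  # roll is even
    return w % 2 == 0

def B(w):  # roll is 1 or 2
    return w <= 2

p_a, p_b = prob(A), prob(B)
p_ab = prob(lambda w: A(w) and B(w))
print(p_a, p_b, p_ab)  # 1/2 1/3 1/6
```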
Bayes’ Formula:
- $P(A \vert B) = {P(B \vert A) \, P(A) \over P(B)}$
In other words:
- $\text{posterior} = {\text{likelihood} \times \text{prior} \over \text{evidence}}$
Law of Total Probability
- $P(B) = \sum_i P(B \vert A_i) \, P(A_i)$, where $\{A_i\}$ is a partition of the sample space
or,
- $P(B) = P(B \vert A) \, P(A) + P(B \vert A^c) \, P(A^c)$
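A worked sketch combining Bayes with total probability, using made-up diagnostic-test numbers (prevalence, sensitivity, and false-positive rate are all illustrative assumptions):

```python
# Hypothetical numbers for a diagnostic test.
p_d = 0.01          # prior P(D): disease prevalence
p_pos_d = 0.95      # likelihood P(+|D): sensitivity
p_pos_not_d = 0.05  # P(+|D^c): false-positive rate

# Law of total probability gives the evidence P(+).
p_pos = p_pos_d * p_d + p_pos_not_d * (1 - p_d)

# Bayes' formula gives the posterior P(D|+).
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))  # ≈ 0.161: most positives are false positives here
```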
Counting
- Permutation: $P_{k,n} = {n! \over (n-k)!}$
- Combination: $C_{k,n} = {P_{k,n} \over k!} = {n! \over k!(n-k)!}$
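Both counting formulas can be verified directly against Python's standard library (`math.perm` and `math.comb`, available since Python 3.8), e.g. for $n = 5$, $k = 2$:

```python
import math

n, k = 5, 2
perm = math.factorial(n) // math.factorial(n - k)  # n! / (n-k)!
comb = perm // math.factorial(k)                   # n! / (k! (n-k)!)

print(perm, comb)  # 20 10
assert perm == math.perm(n, k) and comb == math.comb(n, k)
```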
ToDo:
- Distributions
- Maximum Likelihood Estimator
- Maximum a posteriori probability (MAP) estimator
- The Central Limit Theorem