Summary: Counting and Probability | N00tc0d3r
Permutation
A permutation is an arrangement of several items selected from a given set, where order matters.
P(n, k) = n! / (n-k)!
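The formula can be computed directly with integer factorials (a small sketch; the book-shelf example is illustrative, not from the original):

```python
from math import factorial

def permutations(n, k):
    """Number of ordered arrangements of k items chosen from n: n! / (n-k)!."""
    return factorial(n) // factorial(n - k)

# Arranging 3 of 5 distinct books on a shelf: 5 * 4 * 3 = 60 ways.
print(permutations(5, 3))  # 60
```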
Combination
Mathematically, a combination is a selection of several items from a given set, where order does not matter.
C(n, k) = n! / (k! * (n-k)!)
Useful properties:
- C(n, k) = C(n, n-k)
- C(n, 0) = C(n, n) = 1
- C(n, k) = (n/k) * C(n-1, k-1) = (n/(n-k)) * C(n-1, k)
- C(n, k) = C(n-1, k-1) + C(n-1, k)
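The last property, Pascal's recurrence, gives a direct (if naive) way to compute C(n, k) without factorials; a minimal sketch:

```python
def comb(n, k):
    """C(n, k) via the recurrence C(n, k) = C(n-1, k-1) + C(n-1, k)."""
    if k < 0 or k > n:
        return 0
    if k == 0 or k == n:
        return 1
    return comb(n - 1, k - 1) + comb(n - 1, k)

print(comb(5, 2))  # 10; note comb(5, 2) == comb(5, 3) by the symmetry property
```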
Binomial Coefficients
In binomial expansion, we have:
(x+y)^n = SUM_{k=0}^n ( C(n, k) * x^k * y^(n-k) )
Let x = y = 1.
2^n = SUM_{k=0}^n C(n, k)
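The identity can be spot-checked numerically for a small n (n = 10 is an arbitrary choice):

```python
from math import comb

n = 10
# Summing the whole row of Pascal's triangle should give 2^n.
row_sum = sum(comb(n, k) for k in range(n + 1))
print(row_sum)  # 1024
```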
Statistics
Probability
Given a sample space S containing events A1, A2, ..., An:
- P(A) >= 0 for any event A
- P(S) = 1
For any two events A and B, the union A ∪ B is the set of outcomes in A, in B, or in both, whereas the intersection A ∩ B is the set of outcomes in both A and B.
The probability of A or B is P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
The probability of A and B is P(A ∩ B) = P(A) * P(B | A), which reduces to P(A) * P(B) when A and B are independent.
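These rules can be checked exactly on a fair die using rational arithmetic (the events A = "even" and B = "greater than 3" are made up for illustration):

```python
from fractions import Fraction

sample = set(range(1, 7))   # fair six-sided die
A = {2, 4, 6}               # even roll
B = {4, 5, 6}               # roll greater than 3

def P(event):
    """Probability of an event under the uniform distribution on the sample space."""
    return Fraction(len(event), len(sample))

# Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B)
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))  # 2/3
```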
Bayes' Theorem
Bayes' theorem connects a conditional probability to its inverse:
P(A | B) = P(B | A) * P(A) / P(B)
It can be extended as follows, using the law of total probability over a partition A1, A2, ..., An of the sample space:
P(Ai | B) = P(B | Ai) * P(Ai) / SUM_{j=1}^n ( P(B | Aj) * P(Aj) )
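A classic worked example is inverting a diagnostic test (the specific numbers below are hypothetical, chosen only to illustrate the formula):

```python
from fractions import Fraction

# Hypothetical test parameters (for illustration only):
p_d = Fraction(1, 100)        # P(disease): prevalence
p_pos_d = Fraction(99, 100)   # P(positive | disease)
p_pos_nd = Fraction(5, 100)   # P(positive | no disease)

# Law of total probability: P(positive) summed over the partition {disease, no disease}.
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_d_pos = p_pos_d * p_d / p_pos
print(p_d_pos)  # 1/6
```

Even with a 99%-sensitive test, a positive result here implies only a 1/6 chance of disease, because the disease is rare.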
Discrete Random Variables
A discrete random variable X is a variable whose value is drawn from a countable sample space S, each value with an associated probability. A probability mass function is a function that gives the probability that X is exactly equal to some specified value x.
Note that random variables can also be defined on continuous sample spaces, in which case they are characterized by a probability density function.
The expected value of a discrete random variable is
E[X] = SUM_x ( x * P(X = x) )
Properties of expected value
- E[X + Y] = E[X] + E[Y]
- E[X + a] = E[X] + a, where a is a constant.
- E[aX + Y] = aE[X] + E[Y], where a is a constant.
- E[XY] = E[X]E[Y], if X and Y are independent and each has a defined expectation.
- E[X | Y = y] = SUM_x ( x * P(X = x | Y = y) )
- E[X] = E[E[X|Y]]
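The definition and the linearity property E[X + Y] = E[X] + E[Y] can be verified exactly on fair dice (a sketch; the dice example is not from the original):

```python
from fractions import Fraction

# Expected value of one fair die: SUM_x ( x * P(X = x) ).
E_die = sum(x * Fraction(1, 6) for x in range(1, 7))
print(E_die)  # 7/2

# Linearity: the expected sum of two independent dice, computed directly
# over the joint sample space, equals E[X] + E[Y].
E_sum = sum((x + y) * Fraction(1, 36) for x in range(1, 7) for y in range(1, 7))
assert E_sum == E_die + E_die  # 7
```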
The variance of a discrete random variable is
Var[X] = E[X^2] - (E[X])^2
Properties of variance
- Var[aX] = a^2 * Var[X]
- Var[X + Y] = Var[X] + Var[Y], if X and Y are independent.
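Both the definition Var[X] = E[X^2] - (E[X])^2 and the additivity under independence can be checked on fair dice (again an illustrative example, not from the original):

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die

E_X = sum(x * p for x, p in pmf.items())
E_X2 = sum(x * x * p for x, p in pmf.items())
var_X = E_X2 - E_X ** 2
print(var_X)  # 35/12

# Independence: Var[X + Y] = Var[X] + Var[Y] for two independent dice.
joint = {(x, y): px * py for x, px in pmf.items() for y, py in pmf.items()}
E_S = sum((x + y) * p for (x, y), p in joint.items())
E_S2 = sum((x + y) ** 2 * p for (x, y), p in joint.items())
assert E_S2 - E_S ** 2 == 2 * var_X
```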
Distributions
Bernoulli trial
An experiment with probability p of success and probability 1-p of failure.
Geometric distribution
The distribution of the number of Bernoulli trials needed to get the first success: P(X = k) = p * (1 - p)^(k-1).
The expectation of a geometric distribution is E[X] = 1/p.
The variance of a geometric distribution is Var[X] = (1 - p) / p^2.
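Both formulas can be sanity-checked by summing a long prefix of the series (p = 0.25 is an arbitrary choice; the truncation at 500 terms leaves a negligible tail):

```python
p = 0.25

# Partial sum of SUM_k ( k * p * (1-p)^(k-1) ) should approach E[X] = 1/p = 4.
E = sum(k * p * (1 - p) ** (k - 1) for k in range(1, 500))
assert abs(E - 1 / p) < 1e-9

# Likewise E[X^2] - (E[X])^2 should approach (1 - p) / p^2 = 12.
E2 = sum(k * k * p * (1 - p) ** (k - 1) for k in range(1, 500))
assert abs(E2 - E * E - (1 - p) / p ** 2) < 1e-9
```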
Binomial distribution
The distribution of the number of successes in n independent Bernoulli trials: P(X = k) = C(n, k) * p^k * (1 - p)^(n-k).
The expectation of a binomial distribution is E[X] = np.
The variance of a binomial distribution is Var[X] = np(1 - p).
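A quick numeric check of the mean and variance formulas from the full pmf (n = 10 and p = 0.3 are arbitrary illustrative values):

```python
from math import comb

n, p = 10, 0.3
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

assert abs(sum(pmf) - 1) < 1e-12                       # probabilities sum to 1
mean = sum(k * q for k, q in enumerate(pmf))
var = sum(k * k * q for k, q in enumerate(pmf)) - mean ** 2
assert abs(mean - n * p) < 1e-12                       # E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-12              # Var[X] = np(1-p)
```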