I. I. D. Random Variables
(independent and identically distributed)
- Two random variables are identically distributed if they have the same probability distribution
- Where is the I.I.D. property assumed?
- The role of I.I.D. in the factorisation of a joint probability
- A and B are independent variables if knowing the value of B gives us no information about the value of A, and vice versa
- Independent and identically distributed random variables satisfy both of these properties
Identically Distributed
random variables that have the same probability distribution
- X and Y are identically distributed iff they have the same cumulative distribution function
- the same CDF implies the same mean and the same variance (the converse is not true: equal mean and variance alone do not guarantee the same distribution)
- random variables with different distributions (e.g. different shape, mean, or variance) are not identically distributed
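Identically distributed does not require independence. A small sketch (the fair-die example here is hypothetical, not from the slides): for a fair six-sided die X, the variable Y = 7 - X is completely determined by X, yet X and Y have exactly the same distribution.

```python
# Identically distributed without being independent (illustrative example):
# X is a fair die, Y = 7 - X is fully determined by X.
from fractions import Fraction

faces = range(1, 7)
p_x = {x: Fraction(1, 6) for x in faces}            # distribution of X
p_y = {}                                            # distribution of Y = 7 - X
for x in faces:
    p_y[7 - x] = p_y.get(7 - x, Fraction(0)) + p_x[x]

# Same probability mass function => same CDF, hence same mean and variance.
print(p_x == p_y)  # True
```

Since the two probability mass functions coincide, every distributional summary (CDF, mean, variance) coincides as well, even though Y carries full information about X.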
Independence
- two random variables are statistically independent if P(X|Y) = P(X), or equivalently P(X,Y) = P(X)P(Y)
- two random variables are independent if knowing the outcome of one of them does not give us information about the other
Independence
Example: drawing two cards from a deck without replacement is not independent:
P(secondCard|firstCard) != P(secondCard)
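The card example can be checked with exact arithmetic. A minimal sketch, assuming a standard 52-card deck with 4 aces and taking "ace" as the event of interest:

```python
# Drawing two cards without replacement: the second draw depends on the first.
from fractions import Fraction

p_second_ace = Fraction(4, 52)                  # unconditionally, by symmetry
p_second_ace_given_first_ace = Fraction(3, 51)  # one ace already removed

print(p_second_ace)                  # 1/13
print(p_second_ace_given_first_ace)  # 1/17

# Sanity check via the law of total probability:
check = (Fraction(3, 51) * Fraction(4, 52)      # first card was an ace
         + Fraction(4, 51) * Fraction(48, 52))  # first card was not an ace
print(check == p_second_ace)  # True

print(p_second_ace == p_second_ace_given_first_ace)  # False -> dependent
```

Since the conditional probability (1/17) differs from the unconditional one (1/13), the two draws fail the independence condition P(X|Y) = P(X).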
Where is I.I.D. assumed?
The I.I.D. assumption is central to simplifying machine-learning training
...in Machine Learning
Supervised learning: labeled data set
- the distributions of the training set and the test set are assumed to be equal, with no built-in sampling dependencies
- the data distribution is assumed not to change after the model is deployed
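One common way to avoid built-in ordering dependencies when forming train and test sets is to shuffle the data before splitting. A minimal stdlib sketch; the helper name `train_test_split` and its parameters are hypothetical, not a reference to any particular library:

```python
import random

def train_test_split(data, test_fraction=0.2, seed=0):
    """Shuffle, then split, so any ordering in the raw data
    does not leak into the train/test partition (hypothetical helper)."""
    rng = random.Random(seed)       # seeded for reproducibility
    shuffled = data[:]              # copy; leave the caller's list intact
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(100)))
print(len(train), len(test))  # 80 20
```

Shuffling only addresses ordering effects; it cannot fix a genuine distribution shift between collection time and deployment time.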
...in Statistical Modelling
- Building blocks for:
- Bernoulli processes, uniform processes, Gaussian processes
- the probability distribution stays the same throughout time
- error comparison for model building and model selection
... in Probability Theory
- Law of large numbers: the larger the sample we draw, the closer the observed sample average gets to the true population average.
- Central limit theorem: if we take sufficiently large random samples from a population, the sample means will be approximately normally distributed.
- both results require that the samples are independent and that the distribution of the random variable does not change
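Both results above can be illustrated by simulation. A minimal sketch, assuming i.i.d. Uniform(0, 1) draws (true mean 0.5) and a fixed seed; the tolerances are chosen loosely, not derived from the slides:

```python
# Law of large numbers: the average of many i.i.d. Uniform(0, 1) draws
# concentrates around the true mean 0.5.
import random
import statistics

rng = random.Random(42)
n = 100_000
sample_mean = statistics.fmean(rng.random() for _ in range(n))
print(abs(sample_mean - 0.5) < 0.01)  # True (up to simulation noise)

# Central limit theorem, informally: means of repeated small samples
# spread out approximately like a normal distribution.
means = [statistics.fmean(rng.random() for _ in range(30)) for _ in range(2000)]
sd = statistics.stdev(means)  # theory predicts about sqrt((1/12)/30) ~ 0.053
within_one_sd = sum(abs(m - 0.5) <= sd for m in means) / len(means)
print(0.6 < within_one_sd < 0.75)  # close to the normal ~68% rule
```

If the draws were dependent, or the distribution drifted over time, neither guarantee would hold, which is exactly why the i.i.d. assumption appears here.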
Factorisation of a joint probability p(a,b,c)
p(a,b,c) = p(a|b,c)*p(b,c)
p(a,b,c) = p(a|b,c)*p(b|c)*p(c)
Different factorisations depending on which variable we condition on first:
p(a,b,c)=p(b|a,c)p(a|c)p(c)
p(a,b,c)=p(a|b,c)p(b|c)p(c)
p(a,b,c)=p(c|a,b)p(b|a)p(a)
p(a,b,c)=p(a|b,c)p(c|b)p(b)
p(a,b,c)=p(c|a,b)p(a|b)p(b)
p(a,b,c)=p(b|a,c)p(c|a)p(a)
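Any of the factorisations above can be verified numerically. A sketch checking p(a,b,c) = p(a|b,c)·p(b|c)·p(c) with exact fractions; the joint table below is made up for illustration, not taken from the slides:

```python
# Numerical check of the chain-rule factorisation
# p(a,b,c) = p(a|b,c) * p(b|c) * p(c) on a small, arbitrary joint table.
from fractions import Fraction
from itertools import product

vals = (0, 1)
weights = [1, 2, 3, 4, 5, 6, 7, 8]          # arbitrary positive weights
total = sum(weights)
joint = {abc: Fraction(w, total)
         for w, abc in zip(weights, product(vals, repeat=3))}

def p_c(c):                                  # marginal p(c)
    return sum(joint[(a, b, c)] for a in vals for b in vals)

def p_bc(b, c):                              # marginal p(b, c)
    return sum(joint[(a, b, c)] for a in vals)

for a, b, c in product(vals, repeat=3):
    p_a_given_bc = joint[(a, b, c)] / p_bc(b, c)   # p(a|b,c)
    p_b_given_c = p_bc(b, c) / p_c(c)              # p(b|c)
    assert joint[(a, b, c)] == p_a_given_bc * p_b_given_c * p_c(c)
print("chain rule holds")
```

The identity holds for every joint distribution, which is why all six orderings above are equally valid factorisations.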
Factorisation of a joint probability p(a,b,c)
- If the variables are mutually independent, the joint probability factorises into the product of the marginal probabilities: p(a,b,c) = p(a)p(b)p(c)
- The general factorisation follows from repeated application of the product rule, p(a,b) = p(a|b)p(b)
- The factorisation can therefore be shortened whenever variables are independent
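The independent case can be sketched the same way. Assuming mutual independence and some made-up marginals, the joint is just the product of marginals, and every conditional collapses back to its marginal:

```python
# When a, b, c are mutually independent: p(a,b,c) = p(a) * p(b) * p(c).
from fractions import Fraction
from itertools import product

p_a = {0: Fraction(1, 4), 1: Fraction(3, 4)}   # illustrative marginals
p_b = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p_c = {0: Fraction(2, 5), 1: Fraction(3, 5)}

# Build the joint from the independence assumption...
joint = {(a, b, c): p_a[a] * p_b[b] * p_c[c]
         for a, b, c in product((0, 1), repeat=3)}

# ...then confirm each conditional reduces to its marginal,
# e.g. p(a|b,c) = p(a,b,c) / p(b,c) = p(a).
for a, b, c in product((0, 1), repeat=3):
    p_bc = p_b[b] * p_c[c]
    assert joint[(a, b, c)] / p_bc == p_a[a]
print("independent: conditionals equal marginals")
```

This is the "shortened" factorisation: under independence the conditionals p(a|b,c) and p(b|c) carry no extra information, so only the marginals remain.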