A little disclaimer:
I started learning ML and am currently reading Ian Goodfellow's "Deep Learning". These are my first attempts to apply my Statistics and Probability Theory knowledge, so some concepts are not immediately clear to me.
Everything was more or less clear up to Maximum Likelihood Estimation. Below is how the author tries to explain it:
"Consider a set of m examples ... drawn independently from the true but unknown data generating distribution pdata(x)"
What is meant here? That x is some random variable and pdata is some unknown probability density function?
Then "Let pmodel(x;θ) be a parametric family of probability distributions over the same space indexed by θ"
So a "parametric family of probability distributions" means any PDF with some parameters, right? For example, the Normal distribution with its mean and variance? But what does "over the same space, indexed by θ" mean? What is θ in this context? Is it a parameter (or a set of parameters) of the PDF?
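To make my question concrete, here is a small sketch of how I currently picture it (an assumption on my part: I take the family to be the Normal distributions, so θ = (μ, σ), and use the well-known closed-form MLE for that family):

```python
import numpy as np

rng = np.random.default_rng(0)

# "True but unknown" data-generating distribution p_data:
# here I pretend it is N(2.0, 0.5^2), but in principle we never see these numbers.
x = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Parametric family p_model(x; theta): the Normal family over the same
# space (the real line). Each choice of theta = (mu, sigma) indexes one
# member of the family.
def log_likelihood(x, mu, sigma):
    # Sum of log N(x_i; mu, sigma^2) over the i.i.d. sample
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2 * sigma**2))

# For the Gaussian family the maximum likelihood estimates have closed forms:
mu_hat = x.mean()           # MLE of mu
sigma_hat = x.std(ddof=0)   # MLE of sigma (the biased estimator, ddof=0)

# The MLE should give at least as high a log-likelihood as any other theta:
print(log_likelihood(x, mu_hat, sigma_hat) >= log_likelihood(x, 1.0, 1.0))
```

So if I understand correctly, θ is just the index (μ, σ) that selects one concrete PDF out of the family, and MLE picks the θ under which the observed sample is most probable?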