A little disclaimer:
I have just started learning ML and am currently reading Ian Goodfellow's "Deep Learning". These are my first attempts to apply my Statistics and Probability Theory knowledge, so some concepts are not immediately clear to me.
Everything was more or less clear up to Maximum Likelihood Estimation.
Below, the author tries to explain how it works.
"Consider a set of m examples ... drawn independently from the true but unknown data generating distribution pdata(x)"
What is meant here? That x is some random variable and pdata is its unknown probability density function?
Then "Let pmodel(x;θ) be a parametric family of probability distributions over the same space indexed by θ"
So "parametric family of probability distributions" means any PDF with some parameters, right? For example, the Normal distribution with its mean and variance? But what does "over the same space indexed by θ" mean? What is θ in this context? Is it a parameter (or set of parameters) of the PDF?
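To make my understanding concrete, here is a small sketch (the Normal family and the specific numbers are just my own example, not from the book): θ = (μ, σ) would index the family of Normal PDFs over the same space that x lives in, and maximum likelihood would pick the θ that best matches the m examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# m examples drawn i.i.d. from the "true but unknown" p_data,
# which here I pretend is N(mu=2.0, sigma=1.5).
m = 1000
x = rng.normal(loc=2.0, scale=1.5, size=m)

def p_model(x, theta):
    """One member of the parametric family: a Normal PDF indexed by theta = (mu, sigma)."""
    mu, sigma = theta
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# For the Normal family the maximum likelihood estimate has a closed form:
# sample mean, and the biased (1/m) standard deviation.
mu_hat = x.mean()
sigma_hat = x.std()
theta_hat = (mu_hat, sigma_hat)

print(theta_hat)  # close to the true (2.0, 1.5)
```

Is this the right way to read it: each value of θ picks out one PDF from the family, and all these PDFs are defined over the same sample space as x?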