Are these two inequalities equivalent?
Assume that $I_j \in \mathcal{J}$, where $\mathcal{J}$ is a set of correctly classified images and $p(I)$ is the output probability distribution of the underlying model. Out of $\mathcal{J}$, we select $\hat{I}_{\!j^*}$ according to a well-defined metric, namely the inequality shown in (2), so that $p(\hat{I}_{\!j^*} \mid \hat{y}= y) \leq p(\hat{I}_{\!j} \mid \hat{y}= y)$, where $y$ is the ground-truth label. At the same time, $p(\hat{I}_{\!j^*} \mid \hat{y}= y) > p(\hat{I}_{\!j^*} \mid \hat{y}\neq y)$ and $p(I_j \mid \hat{y}= y) < p(I_j \mid \hat{y}\neq y)$.
How can we show that inequality (1) is equivalent to inequality (2)?
$$ \sum p(\hat{I}_{\!j^*})\log p(\hat{I}_{\!j^*}) \leq \sum p(\hat{I}_{\!j}) \log p(\hat{I}_{\!j}) \tag{1} $$
$$ \sum \log p(\hat{I}_{\!j^*}) \leq \sum \log p(\hat{I}_{\!j}) \tag{2} $$
A possible argument that I believe might solve the problem is the following:
The difference between the two inequalities is that (1) can be viewed as a weighted version of (2), in which each $\log p$ term is multiplied by the corresponding probability $p$. Since the logarithm is monotonically increasing, we can roughly argue that the two inequalities are equivalent.
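To make that "weighted version" observation explicit, the term-wise relationship can be written out as below (the index $i$ over the entries of the output distribution is introduced here purely for illustration; it is not part of the original notation):

$$ \sum_i p_i(\hat{I}_{\!j^*}) \log p_i(\hat{I}_{\!j^*}) \;=\; \sum_i w_i \log p_i(\hat{I}_{\!j^*}), \qquad w_i := p_i(\hat{I}_{\!j^*}) \in [0,1], $$

so each term of (2) appears in (1) scaled by a weight equal to the corresponding probability.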
I have tested it numerically, and both inequalities are equivalent, but I cannot prove it mathematically.
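For reference, here is a minimal sketch of how such a numerical check could be set up (this is illustrative only, not the test actually used, and it assumes the $p(\cdot)$ values are entries of a categorical output distribution):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_distribution(k):
    """Draw a random categorical distribution over k classes."""
    v = rng.random(k)
    return v / v.sum()

def directions_agree(k=10):
    # p_star plays the role of p(I_hat_{j*}), p_j the role of p(I_hat_j).
    p_star = random_distribution(k)
    p_j = random_distribution(k)

    # Left/right sides of (1): sum p log p.
    lhs1 = np.sum(p_star * np.log(p_star))
    rhs1 = np.sum(p_j * np.log(p_j))

    # Left/right sides of (2): sum log p.
    lhs2 = np.sum(np.log(p_star))
    rhs2 = np.sum(np.log(p_j))

    # Do (1) and (2) point in the same direction for this draw?
    return (lhs1 <= rhs1) == (lhs2 <= rhs2)

# Count how often the two inequalities agree over many random draws.
agree = sum(directions_agree() for _ in range(10_000))
print(f"(1) and (2) agreed on {agree} of 10000 random draws.")
```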