Optimize a linear function to reduce misses

Let $V$ and $W$ be two $\mathbb{K}$-vector spaces with dimensions $n := \dim(V)$ and $m := \dim(W)$, where $n > m$. $V$, $W$ and $\mathbb{K}$ are finite. Furthermore, there is a probability distribution $pr$ on $V$, i.e. $pr(v) \in [0,1]$ for all $v \in V$ and $\sum_{v \in V} pr(v) = 1$.

We define:
Given a linear function $f: V \rightarrow W$ and a pair $(v_1, v_2) \in V^2$, we call $(v_1, v_2)$ a $\textit{miss}$ iff $f(v_1) = f(v_2) \land v_1 \neq v_2$.

Now, our goal is the following:
We want to find a linear function $f: V \rightarrow W$ such that, when two elements $v_1, v_2 \in V$ are drawn randomly and independently according to $pr$, the probability of $(v_1, v_2)$ being a miss is as low as possible.
In particular: how should the vectors $v \in V$ be distributed among the fibers of $f$ (the cosets of $\ker(f)$, i.e. the elements of $V / \ker(f)$) according to their probabilities $pr(v)$?
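To make the objective concrete, here is a small sketch in Python. It assumes $\mathbb{K} = \mathrm{GF}(2)$ and represents $f$ by an $m \times n$ matrix $A$ (both choices are mine for illustration, not part of the question). Since $(v_1, v_2)$ is a miss exactly when both vectors land in the same fiber but are distinct, the miss probability is $\sum_w pr(f^{-1}(w))^2 - \sum_v pr(v)^2$:

```python
def apply_map(A, v):
    """f(v) = A v over GF(2); A is a list of m rows, v a tuple of n bits."""
    return tuple(sum(a * x for a, x in zip(row, v)) % 2 for row in A)

def miss_probability(A, pr):
    """Miss probability for f given by matrix A over GF(2).

    pr: dict mapping each vector of V (a tuple of bits) to its probability.
    """
    # Total probability mass of each fiber f^{-1}(w).
    fiber_mass = {}
    for v, p in pr.items():
        w = apply_map(A, v)
        fiber_mass[w] = fiber_mass.get(w, 0.0) + p
    # P(f(v1) = f(v2)) minus the diagonal P(v1 = v2).
    same_fiber = sum(p * p for p in fiber_mass.values())
    diagonal = sum(p * p for p in pr.values())
    return same_fiber - diagonal

# Example with n = 2, m = 1: all mass sits on (0,0) and (0,1).
pr = {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.0, (1, 1): 0.0}
A1 = [[1, 0]]  # f(x, y) = x: both heavy vectors share one fiber
A2 = [[0, 1]]  # f(x, y) = y: the heavy vectors are separated
print(miss_probability(A1, pr))  # 0.5
print(miss_probability(A2, pr))  # 0.0
```

The example illustrates the question: the diagonal term is independent of $f$, so only the distribution of probability mass over the fibers matters.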

  • This question, as is, doesn't quite make sense. Is your field K finite? If not, is your probability distribution atomic? If not, shouldn't all vectors have probability 0? Can't you just assign probability 0 outside a copy of W in V? Am I missing something?

    • Yes, sorry. $V$, $W$ and $\mathbb{K}$ are all finite/discrete. I just added it to the question.

    • Also, we cannot influence the probability distribution; we can only choose the function $f$ as we like.

  • I think I have an intuition about the solution, but the bounty is a bit too low to make it worth expanding on. A guideline on how to price questions should be available in the site's info section, if I recall correctly.
