
There's one problem with considering the geometric mean to be the "average probability": the geometric mean of p and q isn't 1 minus the geometric mean of 1 - p and 1 - q. Ideally you'd like the average probability of something happening and the average probability of it not happening to add up to one. Depending on what you do with it, this may or may not be a problem.
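A quick numeric sketch of this asymmetry, using two made-up forecast probabilities:

```python
import math

def geom_mean(ps):
    # geometric mean: n-th root of the product of n values
    return math.prod(ps) ** (1 / len(ps))

p, q = 0.9, 0.5  # hypothetical probabilities assigned to two events
avg_yes = geom_mean([p, q])          # ~0.671
avg_no = geom_mean([1 - p, 1 - q])   # ~0.224

# The two "averages" fail to sum to 1 (~0.894 here),
# unlike the arithmetic mean, where they always would.
print(avg_yes + avg_no)
```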


That's true, but it's not supposed to add up to 1. Likelihood represents the probability you assigned to the actual outcome, so it's the probabilities of all possible outcomes that need to sum to 1. But because each question multiplies the number of possible outcomes, the probability mass you assign to any specific outcome must become very, very small. It also becomes uninterpretable, because it is such a small number.

You can recover the likelihood from the geometric mean of likelihood by raising it to the power of the number of questions. So if you were predicting 10 coin flips and the geometric mean was 0.5, you can find the original likelihood by taking 0.5^10. Then if you do that for every possible outcome, the probabilities will sum to 1.
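The coin-flip recovery described above can be checked directly (the 10-flip setup is the example from the comment; everything else is just arithmetic):

```python
import math

# Probability assigned to the actual outcome of each of 10 fair coin flips
per_flip = [0.5] * 10

joint = math.prod(per_flip)                    # the raw likelihood, 0.5**10
gm = joint ** (1 / len(per_flip))              # geometric mean = 0.5
recovered = gm ** len(per_flip)                # raise back to the 10th power

print(joint, recovered)  # both 0.0009765625

# Each of the 2**10 equally likely outcomes gets this same probability,
# so the total probability over all outcomes sums to 1:
print(joint * 2 ** 10)  # 1.0
```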



