Sure, but we can’t say someone has a 1-in-12B intelligence when we only have 8B (or whatever) people. We can only go as high as 1-in-<current_pop>
Matter of perspective; count the 100–150 billion (or more) sapiens that have gone before and the rates go up. It's also possible the rates are underestimated, since we've only tested a relatively small sample compared to N=all.
We can compare their intelligence to that of those who have previously lived?
A quick googling gives estimates that ~117B humans have ever been born.
So if you were the cleverest person on the planet, ever, you'd have 1-in-117B intelligence?
That sounds wrong. Perhaps you're confusing frequency with likelihood?
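A rough sketch of the difference (assumes scipy is available, and treats IQ as exactly normal with mean 100 and SD 15, which it isn't out in the tails): the top frequency rank out of ~117B maps to a likelihood-based score of only about 201.

```python
from scipy.stats import norm

# "1 in 117 billion" as a frequency rank, converted to the IQ score that a
# normal(100, 15) model assigns to that quantile (illustrative only).
n = 117e9
iq = norm.isf(1 / n, loc=100, scale=15)  # inverse survival function
print(round(iq))  # roughly 201
```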
IQ is standardised so that population scores on a standard test have a mean of 100 and standard deviation of 15. It's possible to obtain this with fewer than 100 people, distributed as:
* One "genius" who scores 200
* One "dumbass" who scores 0
* 87 "everymen" who score 100.
The mean here is clearly 100, and the standard deviation is sqrt(20000/89) ≈ 14.99.
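A quick sanity check of that arithmetic (plain Python, using the population standard deviation):

```python
scores = [200, 0] + [100] * 87   # 89 people in total

n = len(scores)
mean = sum(scores) / n                                   # 100.0
sd = (sum((x - mean) ** 2 for x in scores) / n) ** 0.5   # sqrt(20000/89) ≈ 14.99
print(n, mean, round(sd, 2))
```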
Of course this is very contrived and doesn't look much like a bell curve in the first place. But with, say, a million people it wouldn't take much to come up with a more realistic-looking example.
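For instance, a quick simulation sketch (plain Python, purely illustrative): draw a million scores from a normal(100, 15), rescale them so the sample mean/SD come out as exactly 100/15, and see where the "cleverest of a million" lands.

```python
import random

random.seed(0)
raw = [random.gauss(100, 15) for _ in range(1_000_000)]

# Standardise so the sample mean is exactly 100 and the sample SD exactly 15,
# mimicking how IQ scores are normed against the tested sample.
m = sum(raw) / len(raw)
s = (sum((x - m) ** 2 for x in raw) / len(raw)) ** 0.5
iq = [100 + 15 * (x - m) / s for x in raw]

print(round(max(iq)))  # the top score out of a million is typically around 170-175
```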