That sounds wrong. Perhaps you're confusing frequency with likelihood?
IQ is standardised so that population scores on a standard test have a mean of 100 and a standard deviation of 15. It's possible to obtain this with fewer than 100 people, distributed as:
* One "genius" who scores 200
* One "dumbass" who scores 0
* 87 "everyman" who score 100.
The mean here is clearly 100, and the standard deviation is sqrt(20000/89) ≈ 14.99 (that's the SD, not the variance — the variance is 20000/89 ≈ 224.7).
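If you want to check the arithmetic yourself, here's a quick Python sketch (using the population variance, i.e. dividing by n rather than n−1):

```python
# Quick check of the contrived 89-person distribution.
scores = [200, 0] + [100] * 87

n = len(scores)                 # 89 people
mean = sum(scores) / n          # (200 + 0 + 87*100) / 89 = 100.0

# Population variance: average squared deviation from the mean.
var = sum((x - mean) ** 2 for x in scores) / n  # 20000 / 89 ≈ 224.7
sd = var ** 0.5                                 # ≈ 14.99

print(mean, sd)  # 100.0 14.99...
```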
Of course this is very contrived and doesn't look much like a bell curve in the first place. But with, say, a million people it wouldn't take much to come up with a more realistic-looking example, along the lines of the sketch below.
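As a rough illustration (not a claim about how real test norms are actually constructed), here's one way to do it with numpy: draw a million normally distributed scores, round them to whole numbers like real test scores, and check that the sample mean and SD land very close to 100 and 15:

```python
import numpy as np

# A million IQ-like scores: normal with mean 100, SD 15, rounded to integers.
rng = np.random.default_rng(0)
scores = np.rint(rng.normal(loc=100, scale=15, size=1_000_000))

print(scores.mean())  # ≈ 100.0
print(scores.std())   # ≈ 15.0 (rounding barely changes the spread)
```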