Hacker News

gus_massa yesterday at 8:25 PM

Nitpicking:

> You’re never going to get error less than 10E−15 since that’s the error in the tabulated values,

If you have, say, 100 (or 400) values in the table, you can squeeze out one more digit.

In the simplest case, imagine you have the constant pi, with 15 digits, repeated 100 times. If the rounding were done in a random direction, like

  round(pi * 1E15 + random() - 1/2) / 1E15
then you can use statistics to get an average whose error is about K/sqrt(N), where K is a constant and N is the number of samples. (If you were paying attention in your statistics classes: for uniform rounding noise, K is the standard deviation of a uniform distribution over one unit in the last place, 1/sqrt(12) ≈ 0.29.) Anyway, just pick N = 100 or N = 400 or something like that and you get another digit for "free".
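A quick way to see this (a sketch, not the comment's actual setup; the 1E−15 precision is swapped for a coarser 1e-4 step so float64 roundoff doesn't drown the signal):

```python
import math
import random

random.seed(0)
ULP = 1e-4   # tabulation step; coarser than 1e-15 so float64 noise doesn't interfere
N = 400      # number of dithered "table entries"

# Each sample is pi rounded to the nearest ULP with a symmetric random dither,
# as in the formula above.
samples = [round(math.pi / ULP + random.random() - 0.5) * ULP for _ in range(N)]
mean = sum(samples) / N

err_single = abs(samples[0] - math.pi)  # up to about one ULP
err_mean = abs(mean - math.pi)          # shrinks like ULP / sqrt(12 * N)
print(err_single, err_mean)
```

With N = 400 the error of the average lands well below a tenth of the tabulation step, i.e. the extra digit.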

Obviously, nobody builds a table for a constant and uses a uniform offset for rounding. Anyway, for a smooth function with no special values at the points where it's sampled, the exact values of the cut decimals are quite "random", and you get a similar randomization.
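To illustrate that last claim (a sketch; the function, grid, and precision are arbitrary choices, not from the comment): the digits cut off when tabulating sin to six decimals spread out roughly uniformly over [0, 1):

```python
import math

ULP = 1e-6  # tabulation precision for the demo
# The "cut decimals": fractional part of sin(x)/ULP at 1000 evenly spaced points.
fracs = [(math.sin(0.1 * k) / ULP) % 1.0 for k in range(1, 1001)]

mean_frac = sum(fracs) / len(fracs)   # near 0.5 if the cut digits behave randomly
spread = max(fracs) - min(fracs)      # near 1.0 if they cover the whole unit interval
print(mean_frac, spread)
```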


Replies

_0ffh yesterday at 11:53 PM

Also, nobody in their right mind uses lookup tables where the table value is just the float approximation of the true f(x). You choose the support values to minimize an error (e.g. MSE) over a dense sampling of the interpolated values across x (or, in the limit, the integral of the chosen error function between the true curve and the interpolation of your supports). If you want to, e.g., approximate a convex function using linear interpolation, all the tabulated values would be <= the true f(x).
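A minimal sketch of the idea with a single linear segment (exp on [0, 1] is an arbitrary convex example, not from the comment): solve for the two support values that minimize MSE over a dense sampling, and both land strictly below the true endpoint values:

```python
import math

f = math.exp                           # convex example function
xs = [i / 2000 for i in range(2001)]   # dense sampling of [0, 1]

# One linear segment y(x) = a*(1-x) + b*x; a and b play the role of the two
# tabulated support values.  Minimize the mean squared error over the dense
# sampling by solving the 2x2 normal equations directly.
S_aa = sum((1 - x) ** 2 for x in xs)
S_ab = sum((1 - x) * x for x in xs)
S_bb = sum(x * x for x in xs)
S_af = sum((1 - x) * f(x) for x in xs)
S_bf = sum(x * f(x) for x in xs)
det = S_aa * S_bb - S_ab ** 2
a = (S_af * S_bb - S_bf * S_ab) / det
b = (S_bf * S_aa - S_af * S_ab) / det

def mse(p, q):
    return sum((p * (1 - x) + q * x - f(x)) ** 2 for x in xs) / len(xs)

# Since exp is convex, the chord through the exact values f(0), f(1) lies above
# the curve, so the MSE-optimal support values sit below the exact ones.
print(a - f(0.0), b - f(1.0))
print(mse(a, b), mse(f(0.0), f(1.0)))
```

The same construction scales to a full table: with more support points the interpolant stays linear in the table values, so the fit is still a linear least-squares problem.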