Hacker News

jmalicki, yesterday at 3:16 AM (2 replies)

I love seeing a Shewchuk citation outside my ML background, where I learned conjugate gradient from him! He is truly a great educator!


Replies

LegionMammal978, today at 12:05 PM

Yeah, in general, this is a problem that people have spent a lot of time thinking about; while floating-point numbers can be finicky, they're what you have to work with if you have inputs at multiple scales.
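To make the multiple-scales point concrete, here's a minimal sketch of the "two-sum" error-free transformation (due to Knuth and Møller), which is the basic building block of Shewchuk's adaptive-precision expansion arithmetic. This is not code from either paper; the function and variable names are my own.

```python
def two_sum(a, b):
    """Return (s, e) where s = fl(a + b) and a + b = s + e exactly."""
    s = a + b
    b_virtual = s - a          # the part of b that actually made it into s
    a_virtual = s - b_virtual  # the part of a that actually made it into s
    b_roundoff = b - b_virtual
    a_roundoff = a - a_virtual
    e = a_roundoff + b_roundoff
    return s, e

# Inputs at very different scales: the small addend vanishes in the
# rounded sum s, but is recovered exactly in the error term e.
s, e = two_sum(1e16, 1.0)
print(s, e)  # 1e+16 1.0
```

Chaining this transformation is how Shewchuk's predicates keep exact results across wildly different input magnitudes without arbitrary-precision libraries.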

(Meanwhile, I wonder why Ozaki et al.'s optimized version [0] is a fair bit harder to look up than Shewchuk's original paper [1]; perhaps later authors found it to be no improvement after all.)

[0] https://www.tuhh.de/ti3/paper/rump/OzBueOgOiRu15.pdf

[1] https://people.eecs.berkeley.edu/~jrs/papers/robust-predicat...

cremer, yesterday at 8:35 AM

His predicates paper opens with "Computational geometers despise floating-point arithmetic." Same trick as the CG title: write the sentence a frustrated reader would write, then answer it. If you like those, the Triangle paper is the third one in the same key.