
cies | 01/22/2025

I disagree that 1.0 is more precise than 1.

Both in spoken language and in quite a few programming languages, "1" is assumed to be an integer, while "1.0" is assumed to be a number with one decimal place (something akin to a float). And I'd say the integer "1" is the most precise kind of one.

If we are rounding numbers, though, you are right...

round_to_int(0.500000 to 1.499999) -> 1

round_to_one_decimal(0.950000 to 1.049999) -> 1.0
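To make those intervals concrete, here is a small Python sketch using the decimal module with explicit half-up rounding (an assumption about the rounding rule meant here; Python's built-in round() rounds halves to even, so round(0.5) is 0). The round_half_up helper is a hypothetical name standing in for the pseudocode above:

    from decimal import Decimal, ROUND_HALF_UP

    def round_half_up(x: str, places: int) -> Decimal:
        # Quantum of 1, 0.1, 0.01, ... depending on `places`.
        quantum = Decimal(1).scaleb(-places)
        return Decimal(x).quantize(quantum, rounding=ROUND_HALF_UP)

    # Any value in [0.5, 1.499999...] rounds to the integer 1:
    print(round_half_up("0.5", 0))       # 1
    print(round_half_up("1.499999", 0))  # 1

    # But only values in [0.95, 1.049999...] round to 1.0 at one decimal:
    print(round_half_up("0.95", 1))      # 1.0
    print(round_half_up("1.049999", 1))  # 1.0
    print(round_half_up("1.05", 1))      # 1.1, just outside the interval

Note that Decimal keeps the trailing zero when printing "1.0", which is exactly the precision information at stake in this thread.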


Replies

Terr_ | 01/22/2025

> I disagree that 1.0 is more precise than 1.

It depends on the context/subtext: Is the other person trying to communicate something extra by adding the .0 portion?

Some are, some aren't. A programmer might use it to distinguish a data type even though the two values are otherwise equal; an engineer might use it to indicate significant figures; etc.
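As a quick illustration of the data-type case (a Python sketch; any language with distinct integer and float literals would do):

    a, b = 1, 1.0
    print(a == b)            # True: numerically equal
    print(type(a).__name__)  # int
    print(type(b).__name__)  # float
    # The ".0" changes nothing about the value, only the type,
    # which is exactly the extra information the writer may intend.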