
adrian_blast | Thursday at 12:37 PM

For serialization and deserialization, when the goal is to produce strings that will be read again by a computer, I consider the use of decimal numbers a serious mistake.

The conversion to string should produce a hexadecimal floating-point number (e.g. with the "%a" or "%A" printf conversion specifiers, available since C99), not a decimal number, so that both serialization and deserialization are trivial and cannot introduce any errors.
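
A minimal sketch of the round trip in C, assuming a C99-conforming library: "%a" writes the exact binary value of the double, and strtod accepts the hexadecimal form back, so the restored value compares bit-for-bit equal to the original.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        double original = 0.1 + 0.2;   /* a value with no exact decimal representation */
        char buf[64];

        /* Serialize: "%a" prints the exact binary value as hexadecimal
           floating point, e.g. 0x1.3333333333334p-2 */
        snprintf(buf, sizeof buf, "%a", original);

        /* Deserialize: strtod parses hexadecimal floating point (C99) */
        double restored = strtod(buf, NULL);

        printf("serialized: %s\n", buf);
        printf("bit-exact round trip: %s\n",
               original == restored ? "yes" : "no");
        return 0;
    }

No rounding decisions are involved in either direction, which is the whole point: the string is just another spelling of the bits.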

Even if a human inspects the strings produced in this way, comparing numbers to see which is greater or less, and examining the order of magnitude, can be done as easily as with decimal numbers. Nobody will want to do exact arithmetic computations mentally with such numbers anyway.
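
For illustration (a hypothetical snippet, not from the comment), printing a few values side by side shows why this works: the "p" exponent is a power of two, so it gives the order of magnitude at a glance, and values with the same exponent can be ordered by comparing their hex mantissas left to right.

    #include <stdio.h>

    int main(void) {
        double values[] = { 3.5, 100.24, 0.001953125 };
        for (int i = 0; i < 3; i++)
            printf("%12g  ->  %a\n", values[i], values[i]);
        return 0;
    }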