> formatting is more common than parsing.
Is it, though? It's genuinely hard for me to tell.
Data sets are both serialized and deserialized (e.g., JSON containing floating-point numbers), which implies formatting and parsing, respectively.
Source code (including unit tests etc.) with hard-coded floating-point values is compiled, linted, and automatically formatted again and again, implying lots of parsing.
The code I usually work with ingests a lot of floating-point numbers, but whatever is calculated is seldom displayed as formatted strings and more often gets plotted on graphs.
Think about things like logging and all the uses of printf whose output is never parsed back. But I agree that parsing is extremely common, just not at the same level.
For serialization and deserialization, when the goal is to produce strings that will be read again by a computer, I consider the use of decimal numbers as a serious mistake.
The conversion to string should produce a hexadecimal floating-point number (e.g. with the "a" or "A" printf conversion specifier of recent C library implementations), not a decimal number, so that both serialization and deserialization are trivial and they cannot introduce any errors.
Even if a human inspects the strings produced in this way, comparing numbers to see which is greater or less and examining the order of magnitude can be done as easily as with decimal numbers. Nobody will want to do exact arithmetic mentally with such numbers anyway.