You could save a bunch of space by encoding the data in a compact binary format and then loading it into a Float16Array.
In a .js file, each character is UTF-16 (2 bytes). Your current encoding uses 23 characters per coordinate, or 46 bytes.
Using 16-bit floats for lat/lon gives you accuracy down to 1 meter. You'd only need 4 bytes per coordinate, a 91% reduction.
You can't store raw binary bytes in a .js file, so it would need to be a separate file. Alternatively, you could use base64 encoding (33% bigger than raw binary) in the .js file, which comes out to more like 6 bytes per coordinate.
(Edited to reflect .min.js)
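For what it's worth, a rough sketch of that base64 round-trip might look something like this (assuming a runtime that supports Float16Array, which is still fairly new; the coordinate values are made up for illustration):

    // Pack some made-up lat/lon pairs as 16-bit floats, then base64 them.
    const values = new Float16Array([40.7, -74.0, 34.1, -118.2]);
    const b64 = btoa(String.fromCharCode(...new Uint8Array(values.buffer)));

    // The shipped .js would only contain the b64 string; decoding it back:
    const bytes = Uint8Array.from(atob(b64), c => c.charCodeAt(0));
    const coords = new Float16Array(bytes.buffer); // [lat0, lon0, lat1, lon1, ...]
    console.log(coords);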
> In a .js file, each character is UTF-16 (2 bytes).
What? I'd like to challenge this. The in-memory representation of a character may be UTF-16, but the file on disk can be UTF-8. Also UTF-16 doesn't mean "2 bytes per character": https://stackoverflow.com/a/27794229
The file https://github.com/AZHenley/coord2state/blob/main/dist/coord... doesn't use anything other than 1-byte ASCII characters.
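A quick way to see both points at once (TextEncoder here is just to show the UTF-8 byte count):

    // In memory, a JS string is UTF-16 code units, and some characters need two:
    console.log("🙂".length); // 2 (surrogate pair)

    // On disk, ASCII-only source saved as UTF-8 is 1 byte per character:
    console.log(new TextEncoder().encode("-104.0459").length); // 9 bytes for 9 chars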
> Using 16-bit floats for lat/lon gives you accuracy down to 1 meter.
Not for longitude it doesn't, once the absolute value exceeds 128: for example, the next representable value after 132.0 is 132.125.
float16 precision at values > 16 is pretty poor.
Converting that discrepancy (132.125 - 132.0 = 0.125 degrees of longitude) to distance gives roughly 10 km.
Did you maybe mean fixed-point? (But even then, 16 bits isn't enough precision for 1 m.)
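A quick sanity check of those numbers (needs Float16Array support; the ~45 degree latitude used for the km conversion is just an assumption):

    // float16 steps are 0.125 in the [128, 256) range, so nothing sits between these:
    const f16 = new Float16Array([132.0, 132.1]);
    console.log(f16[0], f16[1]); // 132 and 132.125

    // 0.125 degrees of longitude at ~45 degrees latitude is roughly 10 km:
    const kmPerDegLon = 111.32 * Math.cos(45 * Math.PI / 180);
    console.log((0.125 * kmPerDegLon).toFixed(1)); // ~9.8

    // 16-bit fixed point over -180..180 is ~611 m per step at the equator,
    // still nowhere near 1 m:
    console.log((360 / 65536) * 111.32 * 1000);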