Interesting! I don't share that view at all. Everything running locally is just bits too, right? Your CPU doesn't care about monads or integers or characters or strings or functors either. But ultimately your higher-level code does expect data to conform to some invariants, whether you explicitly model them or not.
IMO the right approach is to parse everything into a known type at the point of ingress, and from there you can deal with your language's native data structures.
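For concreteness, here's a minimal sketch of what that looks like, assuming a JSON payload and aeson for parsing; the `User` type and its fields are made up for illustration:

```haskell
{-# LANGUAGE DeriveGeneric #-}
import Data.Aeson (FromJSON, eitherDecode)
import qualified Data.ByteString.Lazy as BL
import GHC.Generics (Generic)

-- Hypothetical domain type; invariants live here, not in raw bytes.
data User = User
  { name :: String
  , age  :: Int
  } deriving (Show, Generic)

instance FromJSON User

-- Reject malformed input once, at the boundary. Everything past
-- this point works with a plain User value, no re-validation.
ingress :: BL.ByteString -> Either String User
ingress = eitherDecode
```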
I know everything reduces to bits eventually, but modern CPUs and memory aren't as "lossy" as the network is, meaning you can assume the data is intact and stays intact (especially if you have ECC).
Once you add distribution, you have to account for the fact that the network is terrible.
You absolutely can parse at ingress, but there are issues with that. If the data you got is 3/4 good but one field is corrupted, do you reject everything? Sometimes, but often not: network calls are too expensive, so you encode the possible corruption into the type with a Maybe. But of course any field could be corrupt, so you end up encoding lots of fields as Maybes. Suddenly you've reinvented dynamic typing, but it's LARPing as a static type system.
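To make that failure mode concrete, here's a hedged sketch (the type and field names are invented): once any field might be corrupt, every field becomes a Maybe, and every consumer has to re-check presence.

```haskell
-- Every field wrapped in Maybe because the wire can corrupt any of them.
data UserWire = UserWire
  { wireName  :: Maybe String
  , wireAge   :: Maybe Int
  , wireEmail :: Maybe String
  } deriving Show

-- Each consumer must pattern-match on presence before doing anything,
-- which is exactly the check dynamic typing would force at runtime,
-- just spelled with case expressions.
greet :: UserWire -> String
greet u = case wireName u of
  Just n  -> "hello " ++ n
  Nothing -> "hello stranger"
```

The static types are still there, but they no longer guarantee anything useful: "is this field actually present?" has been deferred to every call site.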