It's of course a trade-off in how far you want to go with special types.
Booleans and numbers have extremely common use cases, more so than date/times.
But perhaps more importantly, they are quite easy to define. Date/time is a surprisingly complex topic with many variants (date, time, datetime, local/relative date/time, point in time, offset-based, timezone-based...), all of them important. The spec to define date/time types would likely end up longer than the spec for the whole rest of JSON, and even then you couldn't correctly interpret a date/time from the spec alone, since timezone data and designations change over time.
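To get a feel for how many distinct concepts hide behind "date/time", here's a small sketch using Java's java.time API, chosen purely to illustrate the variety, not as a proposal for JSON. The type names map roughly onto the variants listed above.

```java
import java.time.*;

public class DateTimeVariants {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2024, 1, 15);          // calendar date, no time, no zone
        LocalTime time = LocalTime.of(13, 30);               // wall-clock time, no date, no zone
        LocalDateTime local = LocalDateTime.of(date, time);  // "local" date+time, no offset or zone

        Instant instant = Instant.parse("2024-01-15T12:30:00Z");        // absolute point in time
        OffsetDateTime offset = local.atOffset(ZoneOffset.ofHours(1));  // fixed UTC offset
        ZonedDateTime zoned = local.atZone(ZoneId.of("Europe/Berlin")); // real timezone, DST rules

        // The offset-based and zone-based values happen to denote the same instant here,
        // but only because Berlin is at UTC+1 on that date; the zone-based result depends
        // on tzdata, which changes over time and isn't something a format spec can freeze.
        System.out.println(instant.equals(offset.toInstant())); // true
        System.out.println(instant.equals(zoned.toInstant()));  // true
    }
}
```

Each of those six types answers a different question, and a serialization format that wants "a date/time type" has to pick which of them it means, or define all of them.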
Now the question is: what value does this extra complexity bring? I'm not saying there isn't any, but it doesn't seem to justify the cost.
Numbers aren't "quite easy to define", either. In fact, JSON is a very good example of how not to define numbers for interoperability. The original spec places no limits at all on range or precision, with the result that the later RFC (8259) notes that "in practice" you probably want to assume 64-bit floating point because that's what most parsers use (and even that much precision still isn't actually guaranteed!).
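For a concrete illustration of that gap (a minimal sketch, assuming a consumer that stores JSON numbers as IEEE 754 doubles, as JavaScript engines and many other parsers do): 2^53 + 1 is a perfectly valid JSON number per the grammar, but it silently loses information on the way through a double.

```java
public class JsonNumberPrecision {
    public static void main(String[] args) {
        // 2^53 + 1: a perfectly valid JSON number according to the grammar...
        long exact = 9007199254740993L;

        // ...but a consumer that stores numbers as IEEE 754 doubles (the "in practice"
        // assumption the RFC describes) silently rounds it to the nearest representable value.
        double asDouble = (double) exact;

        System.out.println(exact);                    // 9007199254740993
        System.out.printf("%.0f%n", asDouble);        // 9007199254740992
        System.out.println((long) asDouble == exact); // false: information was lost
    }
}
```

Nothing in the number grammar warns either the producer or the consumer that this round-trip is lossy; the spec leaves it entirely to convention.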