I love how this proposal "fixes" a bunch of non-issues, while making it much more difficult to write a JSON parser, and then doesn't address the only real problem with JSON:
Defined numbers as int64? Or as arbitrary-precision bignums? Then JavaScript would not have been able to support JSON without an external bignum library, which many (if not most) applications do not actually need, and JSON would be less convenient to use.
Or would you prefer they had specifically defined IEEE double precision as the numeric representation? Then JSON numbers would be useless for qemu's offsets and for other applications that need values not representable as an IEEE double.
Leaving it unspecified means that implementations support what they can. If you end up needing actual int64s in JavaScript, you can drop in a BigNum library and get them. It's true that not all numbers can be represented by all implementations, but that was true already.
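As a small illustration of "implementations support what they can" (in Python rather than JavaScript, since its stdlib parser keeps integer precision out of the box — the document and value here are made up for the example):

```python
import json

# Python's json parser maps JSON numbers without a fraction or exponent
# to arbitrary-precision ints, so an offset one past 2^53 survives
# parsing exactly, even though it is not representable as an IEEE double.
doc = '{"offset": 9007199254740993}'
parsed = json.loads(doc)
assert parsed["offset"] == 9007199254740993  # exact, not rounded to ...992
```

A stock JavaScript `JSON.parse` of the same document would round that value to the nearest double; hence the "drop in a bignum library" caveat above.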
Provided a rich range of integer types, with their representations specified in the standard. Leaving it unspecified is really the worst choice if you're trying to interoperate with real applications and libraries. Supporting "what they can" is great for crappy implementations, and terrible if you're trying to consume this stuff and get work done.
A "rich range" of anything is contrary to the spirit of JSON. JSON became popular in part because it was a much simpler contrast to technologies like XML that provided a "rich range" of everything.
Leaving it unspecified means that JSON as a format is capable of arbitrary precision, without requiring that implementations carry around bignum libraries if their particular application doesn't need them.
JSON also has no floating-point type. It has a generic number type, and it is up to the parsing application to decide how to handle it. You have complete freedom; there is nothing to stop you treating it as an integer, float, decimal, or bignum.
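That freedom is concrete in parsers that expose number-handling hooks. A sketch using Python's stdlib, whose `json.loads` takes `parse_float` and `parse_int` callbacks (the document is invented for the example):

```python
import json
from decimal import Decimal

doc = '{"price": 0.1, "count": 3}'

# Default behavior: JSON floats become binary doubles.
as_float = json.loads(doc)

# Same document, but every JSON float is handed to Decimal instead,
# preserving the exact decimal value written in the text.
as_decimal = json.loads(doc, parse_float=Decimal)

assert isinstance(as_float["price"], float)
assert as_decimal["price"] == Decimal("0.1")
```

The wire format never changed; only the consuming application's choice of representation did.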
The lack of a standard date/time type means that some people use UNIX time, others use ISO 8601 strings (which you then need to detect, parse, and handle errors for), and others are crazy and use their own format, localised just to make it fun (are people writing JSON dates with Django templates or PHP date formatting? Yes, yes they are).
To have one standard way that everyone does dates would be nice.
PS: And the HN measure of "More comments than upvotes?" clearly works as controversial/bad ideas do generally follow that rule.
It's fine if you are expecting a Date. But when you want to write generic code that "discovers" the properties, you have to do dirty things, like trying to parse every single string as a date.
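The "dirty things" look roughly like this — a hypothetical reviver (Python sketch, names invented here) that speculatively tries to parse every string as an ISO 8601 date, which is exactly the ambiguity the complaint is about:

```python
import json
from datetime import datetime

def revive(value):
    """Recursively walk parsed JSON; promote ISO-looking strings to datetimes."""
    if isinstance(value, str):
        try:
            # Heuristic: anything fromisoformat() accepts becomes a datetime.
            # False positives and misses are inherent to guessing.
            return datetime.fromisoformat(value)
        except ValueError:
            return value
    if isinstance(value, dict):
        return {k: revive(v) for k, v in value.items()}
    if isinstance(value, list):
        return [revive(v) for v in value]
    return value

doc = json.loads('{"created": "2024-05-01T12:00:00", "name": "qemu"}')
revived = revive(doc)
# revived["created"] is now a datetime; revived["name"] stays a string.
```

A standard date/time type would let the parser do this unambiguously instead of by guesswork.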
That it doesn't have a date/time type.