The 64-bit issue is real, but very much overblown.
First of all, in systemd, which is a heavy D-Bus user, the only integers > 2^53 we effectively send in real life are UINT64_MAX used as a special "niche" marker meaning "unlimited"/"unset"/"undefined". But JSON has a proper concept for exactly that: the "null" value (or simply omitting the field). Hence, in reasonably clean APIs this is mostly a non-issue.
That said, sd-varlink/sd-json (i.e. systemd's implementation of it) is of course 64-bit signed and unsigned integer clean when it processes JSON. Moreover, it automatically handles what the various specs on the internet suggest you do with an integer > 2^53: encode it as a decimal string.
Would it be better if JSON had been more precise on this? Yes. Is it a big issue? No, not at all.
If you want finer than 1-second precision, 64 bits are not enough. (Hmm, do all C++ std::chrono implementations utilize 128-bit integers for nanosecond precision?)