Two's complement didn't predominate until the late 1970s and early 1980s; before that, ones' complement did.
And there are plenty of processors today that use only sign-magnitude, in particular floating-point-only CPUs. On those, compilers must emulate two's complement for unsigned arithmetic, so signed arithmetic is significantly faster.
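To see what that emulation costs, here's a rough sketch (hypothetical, not modeled on any real compiler) of the extra masking needed to make unsigned results wrap modulo 2^N on hardware without native wraparound; the 16-bit width and function names are made up for illustration:

    #include <stdio.h>

    /* Hypothetical: a compiler targeting sign-magnitude hardware must
       mask every unsigned result back to the type's width to get the
       modulo-2^N wrapping the C standard requires of unsigned types. */
    #define MASK16 0xFFFFu

    static unsigned add_u16(unsigned a, unsigned b) { return (a + b) & MASK16; }
    static unsigned mul_u16(unsigned a, unsigned b) { return (a * b) & MASK16; }

    int main(void) {
        printf("%u\n", add_u16(0xFFFFu, 1u)); /* wraps to 0 */
        printf("%u\n", mul_u16(0x8000u, 2u)); /* wraps to 0 */
        return 0;
    }

Signed arithmetic needs none of that masking, which is where the speed difference comes from.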
The C standard is what it is for good reason. It's not anachronistic. Rather, now there are a million little tyrants who can't be bothered to read and understand the fscking standard (despite it being effectively free, and despite it being 1/10th the size of the C++ standard) and who are convinced that the C standard is _obviously_ wrong.
Which isn't a comment on this friendly-C proposal. But the vast majority of people have no idea what the differences are between well-defined, undefined, implementation-defined, and unspecified behavior, or why those distinctions exist.
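For anyone who wants the distinctions made concrete, here's a minimal C sketch (my own illustration, not lifted from the standard) with one example of each category:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* Well-defined: unsigned arithmetic wraps modulo 2^N. */
        unsigned u = UINT_MAX + 1u;        /* exactly 0, guaranteed */

        /* Implementation-defined: right-shifting a negative value.
           A result must exist and be documented, but it varies. */
        int r = -1 >> 1;

        /* Unspecified: the order in which f() and g() are evaluated
           in f() + g(). Some order is chosen; none need be documented. */

        /* Undefined: signed overflow. The standard imposes no
           requirements at all, so this line stays commented out: */
        /* int i = INT_MAX + 1; */

        printf("%u %d\n", u, r);
        return 0;
    }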
I think the point is that (integral) numbers stored in hardware naturally wrap, and that this behaviour is not restricted to two's complement. For that matter it's not even restricted to binary: the mechanical adding machines based on odometer-like gear wheels, operating in decimal, would wrap around in much the same way, from the largest value back to the smallest... and those were around for several centuries before computers: http://en.wikipedia.org/wiki/Pascal%27s_calculator
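A toy sketch of that analogy, if it helps (the three-digit decimal width is chosen arbitrarily, and the binary half assumes an 8-bit char):

    #include <stdio.h>

    /* A three-digit decimal "odometer", like the gear wheels in
       Pascal's calculator: it wraps from 999 back to 000, just as an
       n-bit binary register wraps from 2^n - 1 back to 0. */
    int main(void) {
        unsigned counter = 998;
        for (int step = 0; step < 4; step++) {
            printf("%03u\n", counter);      /* prints 998 999 000 001 */
            counter = (counter + 1) % 1000; /* decimal wraparound */
        }

        unsigned char byte = 255;           /* binary analogue */
        byte = byte + 1;                    /* wraps to 0 (8-bit char) */
        printf("%u\n", (unsigned)byte);
        return 0;
    }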