It doesn't make sense in the context of today's relatively uniform hardware, but I think the goal wasn't so much portably correct code as portably efficient code that avoided the need to write everything in assembly language.
The first version of UNIX was written in assembly language for an 18-bit machine. C was invented as part of rewriting UNIX for a 16-bit machine, with no telling what would come after that. Machine architectures were a lot less uniform back in the day, and most of those minicomputers are easily beaten by some of today's 20 cent microcontrollers.
Given that context, loosely defining the relative sizes of data types based on how efficiently the machine can represent them makes more sense.
But you're right, that's quite alien to where we are today.
By the 1980s, Ada was already making the right call: let the developer specify the exact range of an integer type (and the defined wrapping behavior of modular types) for correctness, and let the compiler pick an efficient representation for the target machine. It'd be nice to see that approach become more popular.
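The same idea shows up in modern languages. As a rough analogue (not Ada itself), a sketch in Rust: the type pins down the exact range on every target, and modular wrap-around has to be requested explicitly rather than happening implicitly.

```rust
// Rust's fixed-width integer types have the same range on every target,
// and modular arithmetic is opt-in via the wrapping_* methods --
// loosely analogous to an Ada modular type like `type Byte is mod 256;`.
fn main() {
    let x: u8 = 250; // exactly 0..=255, regardless of the machine
    let wrapped = x.wrapping_add(10); // defined as (250 + 10) mod 256
    assert_eq!(wrapped, 4);
    // A plain `x + 10` would panic in debug builds instead of
    // silently wrapping, so overflow is a stated choice, not an accident.
    println!("{wrapped}");
}
```

The compiler is still free to hold a `u8` in a 32-bit register if that's faster; the source only constrains the observable behavior.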