
Tangentially related, but I find it somewhat baffling that at some point people thought that using data types with unknown ranges was a viable way of producing robust portable programs. It took until C99 to get basic int32_t and friends; before that you were mostly relying on a bunch of ifdefs to get portable results. The whole mindset for things like short/int/long must have been just so very different from what I'm used to.


It doesn't make sense in the context of today's relatively uniform hardware, but I think the goal wasn't so much portably correct code, but portably efficient code that avoided the need to use assembly language for everything.

The first version of UNIX was written in assembly language for an 18-bit machine. C was invented as part of rewriting UNIX to work on a 16-bit machine, and who knew what to expect after that. Machine architectures were a lot less uniform back in the day, and most of those minicomputers are easily beaten by some of today's 20 cent microcontrollers.

Given that context, loosely defining data type sizes relative to one another, based on how efficiently each could be represented on the machine, makes more sense.

But you're right, that's quite alien to where we are today.

By the 1980s, Ada was already making the right call: it let the developer specify the exact range (or modular behavior) of an integer type for correctness, and left the compiler to pick an efficient representation. It'd be nice to see that approach become more popular.




