rygorous / gist:e0f055bfb74e3d5f0af20690759de5a7

submitted by
Style Pass
2021-06-25 02:00:06

Really small correction though, simply because you mentioned Go and Rust: Rust doesn't have an int type; it has explicit i32/u32 and i64/u64 types. It does, however, have a usize type that is "The pointer-sized unsigned integer type", which is almost always used for indexing and loop counters (and of course also isize, its signed counterpart). This aligns with what you're saying here and is generally a Good Thing™, especially the "Make sure that your idiomatic loop counter is register width" part.
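To illustrate the usize point, here's a minimal sketch: usize is the same width as a pointer on the target, and slice indexing and len() use it, so the idiomatic loop counter is register width by construction.

```rust
use std::mem::size_of;

fn main() {
    // usize is defined as the pointer-sized unsigned integer type,
    // so it always matches the width of a raw pointer on the target.
    assert_eq!(size_of::<usize>(), size_of::<*const u8>());

    let data = [10, 20, 30, 40];

    // Indexing takes usize; an i32 index would be a type error,
    // and data.len() already returns usize.
    let mut sum = 0;
    for i in 0..data.len() {
        sum += data[i];
    }
    assert_eq!(sum, 100);

    println!("usize is {} bytes on this target", size_of::<usize>());
}
```

(In practice you'd write `for &x in &data` and skip the index entirely, but the explicit counter shows the type involved.)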

I really enjoyed this, thanks. Do you have pointers to any sort of write-up about why C chose to go the "leave int at 32 bits" route? (And why wasn't the same done for the 16- to 32-bit transition?)

@jacobsa For the 16-bit to 32-bit transition, widening made sense: it only added two bytes per variable, and quite a lot of counters in typical programs can easily exceed 65536 (i.e. there were already lots of programs that either had arbitrary limits or were actually broken, and those that weren't already had to use 32-bit values all over the place anyway). So the cost of the change was minimal. Furthermore, not all of the "16-bit" platforms were really 16-bit to begin with; the MC68K-based machines were really 32-bit from the get-go, and so even on 68000- and 68010-based systems, some compilers used 32-bit "int" by default anyway (the main downside, until the 68020, being a performance penalty, particularly if multiplication was involved; but since programmers knew that, they could always declare things as "short int" if they knew values would fit in that range).

For the 32-bit to 64-bit transition, things were rather different. Most values in most programs will happily fit in 32 bits (i.e. they stay below ±2 billion signed or 4 billion unsigned), so extending them all to 64 bits wastes four bytes each time. This creates extra pressure on processor caches and on system memory, and has very little benefit otherwise.
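The cache/memory cost is easy to put a number on. A quick sketch (the million-element array is just an illustrative size, not from the original discussion):

```rust
use std::mem::size_of;

fn main() {
    // Widening an integer from 32 to 64 bits doubles its footprint.
    assert_eq!(size_of::<i32>(), 4);
    assert_eq!(size_of::<i64>(), 8);

    // For a hypothetical array of one million counters, that is an
    // extra 4 MB of memory that must also move through the caches.
    const N: usize = 1_000_000;
    let narrow = size_of::<[i32; N]>();
    let wide = size_of::<[i64; N]>();
    assert_eq!(wide - narrow, 4 * N);

    println!("extra bytes from widening: {}", wide - narrow);
}
```

If most of those values never exceed 32 bits, those extra bytes buy nothing, which is the pressure the comment above describes.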
