Re: Re: Re: Re: Re: Re: Re: When will this happen?
Originally posted by hacurio1
Funny!!! I still remember studies saying that the transition from 16 to 32 bits would take longer than it actually did. People used to say, "there is no need for 32 bits yet, 16 is fine," and here we are!!
(Sorry if this is Intel-ish instead of Motorola ... I wasn't on the "Mac scene" during Moto's 32-bit transition)
Really? I don't remember any studies saying that the 32-bit transition on Intel would take roughly a decade (from the introduction of the 386, a purely 32-bit processor, in 1985, to the release of Windows 95, the first 32-bit-ish OS from MS, in 1995).
Yes, people will always underestimate our ability to put more power to use, and hindsight will always show such predictions to be foolish. On the other hand, certainly by the time Windows 3.1 was out (1992) the computer industry as a whole had a very well-defined sense that 32-bit processing was necessary for many activities. Once Win 95 came out, yes, there were "experts" who claimed 16-bit software was still fine, but the reality of mode-switching (the CPU had to switch between 32-bit and 16-bit code, which is reportedly not the case on the 970 between 64-bit and 32-bit code) meant that running 16-bit software on a 32-bit OS was, overall, a pretty bad idea.
Now, contrast 32 bits with 64 bits. When Intel moved its processors to 32 bits, users were already bumping into the limits of 16 bits: memory allocation and management had been kludged to address more than 16-bit pointers allowed, and even those kludges' boundaries were being hit by the majority of developers. By and large, developers are happy with 32 bits today, except in a few areas (video processing and databases foremost among them).

When MS moved its OS to 32 bits, many applications that required 32-bit calculations had already been written to thunk the processor into 32-bit mode while they were in control and thunk it back down to 16-bit mode when they surrendered the CPU. That was terrible both for OS stability (applications often left the CPU in a bad state) and for multitasking (such apps tended to relinquish control sparingly, to avoid the performance hit of thunking/dethunking the processor when no other app needed the timeslice). Today, while there are certainly apps that use 64-bit ints (though obviously not 64-bit memory addressing and the like), 64-bit integer support amidst regular 32-bit code is quite well supported on most processors. It isn't efficient - such operations can take roughly 5x as long as 32-bit int operations - but they cannot leave the CPU in a bad state, and they require no expensive mode-switching of the sort that makes developers want to hoard the CPU for as long as they have it.
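For concreteness, here's a minimal C sketch (my own illustration, not taken from any particular app) of the kind of 64-bit integer work I mean. On a 32-bit target the compiler simply lowers the 64-bit math to pairs of 32-bit instructions, so nothing about it can leave the CPU in a bad state:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* 64-bit integer math embedded in otherwise 32-bit code. On a 32-bit
           CPU the compiler emits pairs of 32-bit operations (add/add-with-carry,
           a helper routine for the divide, etc.) - slower than native 32-bit
           math, but no mode switching is involved. */
        uint64_t bytes_per_gib = 1024ULL * 1024ULL * 1024ULL;
        uint64_t file_size     = 5ULL * bytes_per_gib;   /* > 4 GiB: won't fit in 32 bits */
        uint64_t blocks        = file_size / 512ULL;

        printf("size: %" PRIu64 " bytes (%" PRIu64 " 512-byte blocks)\n",
               file_size, blocks);
        return 0;
    }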
The 128-bit discussion here is more ludicrous: 64-bit pointers can address far more memory than has been produced in the entire history of the personal computer (2^64 bytes, roughly 18 billion billion), and it will be a while before desktop machines have enough memory to take full advantage of 64-bit addressing. I can think of no application today that uses 128-bit integer calculations - I'm sure they exist, but they are nowhere near mainstream. Will 128 bits be needed eventually? Eventually. I just don't see it coming as quickly as the 16->32-bit transition (~10 years) or even the 32->64-bit transition (~16 years). Integer/pointer width, once the bottleneck and constraining factor for the majority of computing applications, is no longer the bottleneck.
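To put those addressing numbers in perspective, a quick back-of-the-envelope sketch (illustrative only; doubles are used just so 2^64 can be printed, since it doesn't fit in a 64-bit int):

    #include <stdio.h>

    int main(void)
    {
        /* Address-space sizes for the pointer widths discussed above. */
        double p16 = 65536.0;        /* 2^16 bytes = 64 KiB */
        double p32 = 4294967296.0;   /* 2^32 bytes = 4 GiB  */
        double p64 = p32 * p32;      /* 2^64 bytes ~ 1.8e19 (16 exbibytes) */

        printf("16-bit pointers: %.0f bytes\n", p16);
        printf("32-bit pointers: %.0f bytes\n", p32);
        printf("64-bit pointers: %.0f bytes\n", p64);
        return 0;
    }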