
Puevlo
Original poster
The fact of the matter is that there is no reason computers need to be made any faster than they are currently. I would say realistically we reached that point somewhere around 2008. Productivity would increase much faster if software were simply coded more efficiently. What can you really do in OS X Lion that you couldn't do in Mac OS 9? Yet the requirements are infinitely higher.

I personally blame it on the proliferation of high level programming languages. A real OS upgrade would be to find a way to maintain the level of functionality whilst decreasing system resource requirements.
 
What can you really do in OS X Lion that you couldn't do in Mac OS 9?
Operate without the proliferation of malware, for one thing. Prior to Mac OS X, there were numerous viruses and other malware types affecting Mac OS 9 and earlier. Since Mac OS X, not a single virus has appeared in the wild, and only a handful of trojans. There are many other vast improvements over OS 9, but that's a significant one.
 
You make a good point.

I'm willing to bet that 95% of users don't need anything greater than the technology we had in 2008. And even the remaining 5% would be more than satisfied with 2010 technology.

But what you haven't accounted for is the fact that Intel is a business first and the sole purpose of a business is to earn revenue. To do that, they need to continue making the "latest and greatest" for both marketing purposes and for competition.

For that reason alone, we will always have improvements in technology that surpass our ability to fully utilize them.
 
I am sorry, but yes, we do need faster computers. Saying that technology is fine now is like saying we don't need more fuel-efficient cars that can still maintain performance. Technology must progress. Perhaps you don't need the extra power, but the CPU is critical for some fields. And the GPU is lagging. The one thing that makes me a little mad is that Apple uses top-of-the-line CPUs but makes horrible choices when deciding on the GPU. Apple should not have released the 2012 MBP with a 650M. They could have made it an option, or the only choice for the non-Retina MBPs... but the Retina MBP needs something better. I think they should have gone with a 670M.
 
Processors do need to be faster, especially in single-threaded performance.

And Apple needs to stop ignoring legacy.
 
I am sorry, but yes, we do need faster computers. Saying that technology is fine now is like saying we don't need more fuel-efficient cars that can still maintain performance. Technology must progress. Perhaps you don't need the extra power, but the CPU is critical for some fields. And the GPU is lagging. The one thing that makes me a little mad is that Apple uses top-of-the-line CPUs but makes horrible choices when deciding on the GPU. Apple should not have released the 2012 MBP with a 650M. They could have made it an option, or the only choice for the non-Retina MBPs... but the Retina MBP needs something better. I think they should have gone with a 670M.

You'd be able to cook an egg with a 670M, and it would decrease the battery life considerably.
 
The fact of the matter is that there is no reason computers need to be made any faster than they are currently. I would say realistically we reached that point somewhere around 2008. Productivity would increase much faster if software were simply coded more efficiently. What can you really do in OS X Lion that you couldn't do in Mac OS 9? Yet the requirements are infinitely higher.

I personally blame it on the proliferation of high level programming languages. A real OS upgrade would be to find a way to maintain the level of functionality whilst decreasing system resource requirements.



If Jobs & Wozniak had shared that philosophy, we'd all still be using typewriters. :rolleyes:
 
There's a saying that has become more and more true as the years go by:


Processor time (and memory) is cheap
Programmer time is expensive


Apps are more bloated now because of the libraries that make programming easier and more powerful.

Sure, if everything were written in assembly, we'd have smaller, faster applications. Without the libraries, though, we would have FAR fewer applications.

We'd also have a lot less features, because writing a non-trivial application in assembly language is bloody difficult, time consuming, and error prone. It simply isn't practical.


Hardware is cheap. Unless you really need the very fastest software that can be written, use higher-level languages to get fewer errors and faster time to completion.

Even if you DO need something to be fast, most of the time, you can throw more memory at the problem to cater to the application size increase (your DATA is generally far larger these days anyway), and optimise the 10% that accounts for 90% of the CPU time in a lower level language.

Writing 100% of your application in a low-level language is a waste of effort for the vast majority of software out there. Also, modern CPUs are so complex that the number of programmers who could do a better job than an optimising compiler is shrinking by the day.
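
As a rough sketch of that 90/10 idea (the image-summing workload and function names here are invented for the example), the bulk of the code stays plain and readable, and only the loop a profiler flags as hot gets the lower-level treatment:

```c
/* Minimal sketch of "optimise the hot 10%": the program is written plainly,
 * and only the loop a profiler identifies as hot is rewritten in a
 * lower-level style. Names and workload are invented for the example. */
#include <stdio.h>
#include <stdlib.h>

#define W 4096
#define H 4096

/* Straightforward version: clear, easy to maintain. */
static long sum_plain(const unsigned char *img)
{
    long total = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            total += img[y * W + x];
    return total;
}

/* Hand-tuned version of the same hot loop: walk one pointer over the image
 * as a flat buffer, a shape compilers vectorise well. */
static long sum_tuned(const unsigned char *img)
{
    long total = 0;
    const unsigned char *p = img;
    const unsigned char *end = img + (size_t)W * H;
    while (p != end)
        total += *p++;
    return total;
}

int main(void)
{
    unsigned char *img = malloc((size_t)W * H);
    if (!img)
        return 1;
    for (size_t i = 0; i < (size_t)W * H; i++)
        img[i] = (unsigned char)(i & 0xFF);

    printf("plain: %ld\n", sum_plain(img));
    printf("tuned: %ld\n", sum_tuned(img));
    free(img);
    return 0;
}
```

In a real app, everything outside that one loop would stay in whatever higher-level language gets the thing shipped; only the hot path earns the extra care.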


However, all that aside : people's expectations are changing.

What used to be good enough, in terms of processing throughput, no longer is.

DVD -> Blu-ray. 800x600 screen res -> retina display. 8 bit single channel audio -> 24 bit multi-track audio. 2d bitmapped graphics -> OpenGL. Processing all that extra data is not free.
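
To put rough numbers on just the screen side of that (assuming 8 bits per pixel then and 32 bits per pixel now): 800 x 600 x 1 byte is about 0.5 MB per frame, while 2880 x 1800 x 4 bytes is roughly 20 MB per frame. That's around 40 times the raw data, before you even think about pushing 30 or 60 of those frames a second.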

Even if you were to write in assembly language, system requirements will still go up as people's multimedia expectations rise.
 
I personally blame it on the proliferation of high level programming languages. A real OS upgrade would be to find a way to maintain the level of functionality whilst decreasing system resource requirements.

The language Apple promotes is not very high level. Apple is a laggard here.
 
The language Apple promotes is not very high level. Apple is a laggard here.

It is, and it isn't.

C itself is reasonably low level, but the Cocoa frameworks are fairly abstracted.

Objective-C has quite a bit of overhead for all the message passing and libraries that are included in even a trivial app.

It's the price you pay for flexibility and ease of application development.
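
As a rough plain-C model of where that overhead comes from (this is an invented illustration, not the real Objective-C runtime or objc_msgSend): every message send pays for a selector lookup plus an indirect call, where a plain C call is a direct jump the compiler can often inline.

```c
/* Simplified, invented model of dynamic message dispatch versus a direct
 * C call. It is NOT the real Objective-C runtime; it only illustrates the
 * extra lookup and indirect call each "message send" pays for. */
#include <stdio.h>
#include <string.h>

typedef struct Object Object;
typedef int (*Method)(Object *self);

/* A "class" here is just a table mapping selector names to function pointers. */
typedef struct {
    const char *selector;
    Method imp;
} MethodEntry;

struct Object {
    const MethodEntry *methods;
    int method_count;
    int value;
};

/* The dynamic path: search the table, then call through a pointer. */
static int send_message(Object *self, const char *selector)
{
    for (int i = 0; i < self->method_count; i++)
        if (strcmp(self->methods[i].selector, selector) == 0)
            return self->methods[i].imp(self);
    return -1; /* "does not respond to selector" */
}

static int counter_value(Object *self) { return self->value; }

/* The static path: a direct call the compiler can inline. */
static int counter_value_direct(Object *self) { return self->value; }

int main(void)
{
    MethodEntry table[] = { { "value", counter_value } };
    Object counter = { table, 1, 42 };

    printf("dynamic: %d\n", send_message(&counter, "value"));
    printf("direct:  %d\n", counter_value_direct(&counter));
    return 0;
}
```

The real runtime caches those lookups aggressively, but the indirection, and the everything-is-an-object model built on top of it, is the overhead being described.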
 
It is, and it isn't.

C itself is reasonably low level, but the Cocoa frameworks are fairly abstracted.

Objective-C has quite a bit of overhead for all the message passing and libraries that are included in even a trivial app.

It's the price you pay for flexibility and ease of application development.

Reference-counted imperative programming. Doubly primitive.
 
Reference-counted imperative programming. Doubly primitive.

Primitive, or efficient?

Yes, they've forced the programmer to work a bit harder than with other languages, but making virtually everything an object carries a hefty overhead before you start.

It depends what you're comparing to.

Compared to, say, AmigaOS's Intuition libraries, Cocoa is massively bloated.

But... it's a lot more powerful, too. It's a trade-off. The features everyone wants in their new applications don't come free.
 
I can see a lot of reasons for hardware upgrades:

1) Notebook battery life / energy efficiency
2) USB 3.0 speeds
3) Graphics performance and resolution
4) Weight and thickness of portable computers (I'm happy with an Air)
5) SSD speed improvements
6) Faster network connections

Well, I'm going to stop here... what do you think? :p

But I agree that programmers could be more efficient in their coding... sometimes it's just horrible!!!
 
Primitive, or efficient?

Yes, they've forced the programmer to work a bit harder than with other languages, but making virtually everything an object carries a hefty overhead before you start.

It depends what you're comparing to.

Compared to, say, AmigaOS's Intuition libraries, Cocoa is massively bloated.

But... it's a lot more powerful, too. It's a trade-off. The features everyone wants in their new applications don't come free.

You don't need everything to be an object to have real garbage collection.

I don't care about comparing it to even older stuff.
 
Maybe you're right to an extent, but which is easier to come by: the next great advance in silicon tech, or a decent programmer? There's a limit to how much software can be optimised. As features advance, demand for more hardware is inevitable. It's not like we've reached a point where we could keep using this hardware forever.
 
You don't need everything to be an object to have real garbage collection.

I don't care about comparing it to even older stuff.

Who said anything about garbage collection?

That's not the be-all and end-all of high-level coding, and it adds little to executable size really. You can write garbage-collected Objective-C for OS X anyway.

The bloat is caused by libraries doing more. You want to use one function from library/framework "foo"? The entire library is linked into your application.

The alternative is that you write the function yourself, which takes programmer time and money.



It is far cheaper, developer-wise, and far more reliable to just use well-tested, well-debugged library code.

Sure, your RAM requirement goes up, but RAM is cheap.
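
To make that trade-off concrete, here's a small C sketch (the helper names are invented for the example): one version leans on the well-tested library routine, the other hand-rolls the same check and becomes code you now own, test, and debug.

```c
/* Sketch of the trade-off: lean on a well-tested library routine, or
 * hand-roll it yourself. Both versions work; the hand-rolled one is
 * extra code that someone has to maintain. */
#include <stdio.h>
#include <string.h>

/* Library version: one call into string.h's strstr. */
static int contains_lib(const char *haystack, const char *needle)
{
    return strstr(haystack, needle) != NULL;
}

/* Hand-rolled version: more code, more places for off-by-one bugs. */
static int contains_diy(const char *haystack, const char *needle)
{
    if (*needle == '\0')
        return 1;
    for (const char *h = haystack; *h != '\0'; h++) {
        const char *a = h;
        const char *b = needle;
        while (*a != '\0' && *b != '\0' && *a == *b) {
            a++;
            b++;
        }
        if (*b == '\0')
            return 1;
    }
    return 0;
}

int main(void)
{
    printf("%d %d\n", contains_lib("macrumors", "rumor"),
                      contains_diy("macrumors", "rumor"));
    return 0;
}
```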
 
I think a good example has always been the Xbox 360.

Released in November 2005, I remember them showing a photo-realistic demo of a game, and the graphics were mind-blowing. Has this or any other title been released showing this off? Nope!

I'm pretty sure it was the Call of Duty guys who said there is no point putting in the hours to code a game that pretty, as it would be a waste. People want a balance of good looks and quick release times, not waiting 12-18 months for a title because it needs to look stunning.

Seven years on, that very same console is still working perfectly and selling titles that modern machines struggle to run at 30-60 fps, just because the games are coded specifically for that device.
 
Who said anything about garbage collection?

That's not the be-all and end-all of high-level coding, and it adds little to executable size really. You can write garbage-collected Objective-C for OS X anyway.

I said reference counting is primitive. It is not real garbage collection.
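
For anyone following along, a minimal C sketch of the difference (names invented for the example): reference counting just keeps a per-object count that retain bumps and release drops, freeing at zero, whereas a tracing collector walks what is actually reachable, which is why it can reclaim reference cycles that plain counting never will.

```c
/* Minimal manual reference counting in C (invented example, not Apple's
 * implementation). retain/release keep a per-object count and free the
 * object when it reaches zero. Unlike a tracing garbage collector, this
 * scheme cannot reclaim two objects that only point at each other. */
#include <stdio.h>
#include <stdlib.h>

typedef struct RefObject {
    int refcount;
    const char *name;
} RefObject;

static RefObject *ref_new(const char *name)
{
    RefObject *obj = malloc(sizeof *obj);
    if (!obj)
        exit(1);
    obj->refcount = 1;            /* the creator owns one reference */
    obj->name = name;
    return obj;
}

static RefObject *ref_retain(RefObject *obj)
{
    obj->refcount++;
    return obj;
}

static void ref_release(RefObject *obj)
{
    if (--obj->refcount == 0) {
        printf("freeing %s\n", obj->name);
        free(obj);
    }
}

int main(void)
{
    RefObject *doc = ref_new("document");
    RefObject *view = ref_retain(doc);   /* a second owner appears */

    ref_release(doc);    /* first owner done: count drops to 1 */
    ref_release(view);   /* last owner done: count hits 0, freed here */
    return 0;
}
```

The counting itself is cheap; the "primitive" complaint is that the programmer has to get every retain/release pair right, and cycles still leak.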
 
The biggest hurdle isn't faster computers or software... it's advancement in battery technology.
 
Software adds features, some of which are bloat, but some features require faster CPUs. For instance, my Core 2 Duo strains under the load of Lightroom and Aperture. I need a more powerful CPU for this reason.
 