We don't need faster computers, just better software

Discussion in 'MacBook Pro' started by Puevlo, Jun 17, 2012.

  1. Puevlo macrumors 6502a

    Joined:
    Oct 21, 2011
    #1
    Fact of the matter is there is no reason computers need to be made any faster than they are currently. I would say realistically we reached that point somewhere around 2008. Productivity would increase much faster if software were simply coded more efficiently. What can you really do in OS X Lion that you couldn't do in Mac OS 9? Yet the requirements are infinitely higher.

    I personally blame it on the proliferation of high level programming languages. A real OS upgrade would be to find a way to maintain the level of functionality whilst decreasing system resource requirements.
     
  2. GGJstudios macrumors Westmere

    GGJstudios

    Joined:
    May 16, 2008
    #2
    Operate without the proliferation of malware, for one thing. Prior to Mac OS X, there were numerous viruses and other malware types affecting Mac OS 9 and earlier. Since Mac OS X, not a single virus has appeared in the wild, and only a handful of trojans. There are many other vast improvements over OS 9, but that's a significant one.
     
  3. Suno macrumors 6502

    Suno

    Joined:
    Dec 12, 2011
    #3
    You make a good point.

    I'm willing to bet that 95% of users don't need anything greater than the technology we had in 2008. And even the remaining 5% would be more than satisfied with 2010 technology.

    But what you haven't accounted for is the fact that Intel is a business first and the sole purpose of a business is to earn revenue. To do that, they need to continue making the "latest and greatest" for both marketing purposes and for competition.

    For that reason alone, we will always have improvements in technology that surpass our ability to fully utilize them.
     
  4. taedouni macrumors 65816

    Joined:
    Jun 7, 2011
    Location:
    California
    #4
    I am sorry, but yes, we do need faster computers. Saying that technology is fine now is like saying we don't need more fuel-efficient cars that can maintain performance. Technology must progress. Perhaps you don't need the extra power, but the CPU is critical for some fields. And the GPU is lagging. The one thing that makes me a little mad is that Apple uses top-of-the-line CPUs but makes horrible choices when deciding the GPU. Apple should not have released the MBP 2012 with a 650M. They could have made it an option, or the only choice for non-Retina MBPs... but the Retina MBP needs something better. I think they should have gone with a 670M.
     
  5. cube macrumors G5

    Joined:
    May 10, 2004
    #5
    Processors do need to be faster, especially in single-threaded performance.

    And Apple needs to stop ignoring legacy.
     
  6. BlazednSleepy macrumors 6502a

    Joined:
    Apr 15, 2012
    #6
    You'd be able to cook an egg on a 670M, and it would decrease the battery life considerably.
     
  7. tersono macrumors 68000

    tersono

    Joined:
    Jan 18, 2005
    Location:
    UK
    #7


    If Jobs & Wozniak had shared that philosophy, we'd all still be using typewriters. :rolleyes:
     
  8. throAU, Jun 17, 2012
    Last edited: Jun 17, 2012

    throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #8
    There's a saying that has been becoming more true as the years go by:


    Processor time (and memory) is cheap
    Programmer time is expensive


    Apps are more bloated now because of the libraries that make programming easier and more powerful.

    Sure, if everything was written in assembly, we'd have smaller, faster applications. Without the libraries, though, we would have FAR fewer applications.

    We'd also have a lot less features, because writing a non-trivial application in assembly language is bloody difficult, time consuming, and error prone. It simply isn't practical.


    Hardware is cheap. Unless you really need the very fastest software that can be written, use higher-level languages to get fewer errors and faster time to completion.

    Even if you DO need something to be fast, most of the time, you can throw more memory at the problem to cater to the application size increase (your DATA is generally far larger these days anyway), and optimise the 10% that accounts for 90% of the CPU time in a lower level language.

    Writing 100% of your application in a low-level language is a waste for the vast majority of software out there. Also, modern CPUs are so complex that the number of programmers who could do a better job than an optimising compiler shrinks by the day.


    However, all that aside : people's expectations are changing.

    What used to be good enough, in terms of processing throughput, no longer is.

    DVD -> Blu-ray. 800x600 screen res -> retina display. 8 bit single channel audio -> 24 bit multi-track audio. 2d bitmapped graphics -> OpenGL. Processing all that extra data is not free.

    Even if you were to write in assembly language, system requirements will still go up as people's multimedia expectations rise.
     
  9. cube macrumors G5

    Joined:
    May 10, 2004
    #9
    The language Apple promotes is not very high level. Apple is a laggard here.
     
  10. throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #10
    It is, and it isn't.

    C itself is reasonably low level, but the Cocoa frameworks are fairly abstracted.

    Objective-C has quite a bit of overhead for all the message passing and libraries that are included in even a trivial app.

    It's the price you pay for flexibility and ease of application development.
     
  11. cube macrumors G5

    Joined:
    May 10, 2004
    #11
    Reference-counted imperative programming. Doubly primitive.
     
  12. throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #12
    Primitive, or efficient?

    Yes, they've forced the programmer to work a bit harder than with other languages, but making virtually everything an object carries a hefty overhead before you start.

    It depends what you're comparing to.

    Compared to say, AmigaOS's intuition libraries, Cocoa is massively bloated.

    But... it's a lot more powerful, too. It's a trade-off: the features everyone wants in their new applications don't come free.
     
  13. Monkeyat macrumors regular

    Joined:
    Feb 14, 2009
    #13
    I can see a lot of reasons for hardware upgrades:

    1) Notebook battery life / energy efficiency
    2) USB 3.0 speeds
    3) Graphics performance and resolution
    4) Weight and thickness of portable computers (I'm happy with an Air)
    5) SSD speed improvements
    6) Faster network connections

    Well, I'm going to stop here... what do you think? :p

    But I agree that programmers could be more efficient in their coding... sometimes it's just horrible!
     
  14. cube macrumors G5

    Joined:
    May 10, 2004
    #14
    You don't need everything to be an object to have real garbage collection.

    I don't care about comparing it to even older stuff.
     
  15. Dangerous Theory macrumors 68000

    Joined:
    Jul 28, 2011
    Location:
    UK
    #15
    Maybe you're right to an extent, but which is easier to come by, the new greatest advance in silicon tech, or a decent programmer? There's a limit to the optimisation of software code. As features advance, demand for hardware is inevitable. It's not like we've got to a point where we could be using this hardware forever.
     
  16. throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #16
    Who said anything about garbage collection?

    That's not the be-all and end-all of high-level coding, and it adds little to executable size, really. You can write garbage-collected Objective-C for OS X anyway.

    The bloat is caused by libraries doing more. You want to use one function from library/framework "foo"? The entire library is linked into your application.

    The alternative is to write the function yourself, which takes programmer time and money.



    It is far cheaper, developer-wise, and far more reliable to just use well-tested, well-debugged library code.

    Sure, your RAM requirement goes up, but RAM is cheap.
     
  17. Gav2k macrumors G3

    Gav2k

    Joined:
    Jul 24, 2009
    #17
    I think a good example has always been the Xbox 360.

    Released in Nov 2005. I remember them showing a photorealistic demo of a game, and the graphics were mind-blowing. Has this or any other title been released showing this off? Nope!

    I'm pretty sure it was the Call of Duty guys who said there is no point putting in the hours to code a game that pretty, as it would be a waste. People want a balance of good looks and quick release times, not waiting 12-18 months for a title because it needs to look stunning.

    7 years on, that very same console is still working perfectly and selling titles that hold 30-60 fps where more modern machines struggle, just because the games are coded for that device.
     
  18. cube macrumors G5

    Joined:
    May 10, 2004
    #18
    I said reference counting is primitive. It is not real garbage collection.
     
  19. throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #19
    You can turn on real garbage collection.
     
  20. inaka macrumors 6502

    Joined:
    Apr 26, 2010
    #20
    The biggest hurdle isn't faster computers or software... it's advancement in battery technology.
     
  21. maflynn Moderator

    maflynn

    Staff Member

    Joined:
    May 3, 2009
    Location:
    Boston
    #21
    Software adds features, some of which are bloat, but some features require faster CPUs. For instance, my Core 2 Duo strains under the load of Lightroom and Aperture. I need a more powerful CPU for this reason.
     