Retina Display and CPU speed

Discussion in 'iMac' started by Jeremy M., May 14, 2012.

  1. Jeremy M. macrumors member

    Joined:
    Mar 7, 2012
    #1
     Okay, just wanna clear this up. There is a new rumour today stating that we could see Retina Display iMacs this refresh. Although I doubt it, if, hypothetically speaking, these 2012 iMacs did indeed have such displays, would that actually make the computer slower than the 2011 version in terms of CPU and GPU performance?

     I use Logic on my MacBook Pro at the moment, and while I think this type of display would be awesome, I would rather it not affect the CPU performance. Making music does not need a beautiful screen, does it? :)

     I don't believe the rumour anyway. I mean, do these HiDPI screens even exist at the moment?
     
  2. Comeagain? macrumors 68020

    Comeagain?

    Joined:
    Feb 17, 2011
    Location:
    Spokane, WA
    #2
    I tend to use logic all the time, not just on my Mac...;)

     The larger graphics would affect the GPU. And, as has been said, you don't need to upgrade to use Logic. And we don't know all of the details of these rumored displays. Who knows, it could be much faster, or exactly the same, with the new Intel processors.
     
  3. Jeremy M. thread starter macrumors member

    Joined:
    Mar 7, 2012
    #3
    So driving more pixels does not take a hit on the CPU, only the GPU?

    If so, that sounds great!
     
  4. theSeb macrumors 604

    theSeb

    Joined:
    Aug 10, 2010
    Location:
    Poole, England
    #4
    Yes, the GPU.
     
  5. senseless macrumors 68000

    senseless

    Joined:
    Apr 23, 2008
    Location:
    Pennsylvania, USA
    #5
    I do notice blurrier scrolling on the iPad 3; enough to make me keep the iPad 2. I'm wondering if this is a limitation of the display itself or a processing lag.
     
  6. leman macrumors 604

    Joined:
    Oct 14, 2008
    #6
     Yes it will. Don't believe people who say that only the GPU will be taxed. Even if the window compositing is done on the GPU, quadrupling the pixels is still fairly trivial. In another thread I explained that even the crappy old Intel GMA IGP has enough fill rate to redraw a 4K display often enough. However, things like font rendering and other complex graphics operations are still done on the CPU. Increasing the resolution will thus mean more work for the CPU as well, in addition to more RAM copying and GPU sync work. I don't think you will notice any slowdowns, but the additional work for the CPU will be there.
     
  7. Whargoul macrumors member

    Joined:
    Apr 27, 2012
    Location:
    Denver
    #7
     Sorry, that's flat-out BS.
     
  8. leman macrumors 604

    Joined:
    Oct 14, 2008
    #8
     Well, then I expect some arguments other than "that's BS". You could start by researching the fill rates of modern GPUs and how desktop compositing works.
     
  9. Giuly macrumors 68040

    Giuly

    #9
    Is an iPhone 4 slower than an iPhone 3GS? :rolleyes:
     
  10. blackhand1001 macrumors 68030

    blackhand1001

    Joined:
    Jan 6, 2009
    #10
     The higher resolution will affect the CPU as well. The CPU is still used to calculate the positions of the elements on the screen, and most of the 2D drawing is performed on the CPU in Mac OS X.
     
  11. nuckinfutz macrumors 603

    nuckinfutz

    Joined:
    Jul 3, 2002
    Location:
    Middle Earth
    #11
     It's important to realize that there are multiple elements to delivering a good display system.

     The CPU, GPU and software all work in concert to deliver great results. CPU and GPU speed isn't standing still, and neither is Apple's evolution of Quartz and OpenGL.

     All things considered, I want higher resolution, and if there are problems they'll eventually be fixed in software and faster hardware.
     
  12. leman macrumors 604

    Joined:
    Oct 14, 2008
    #12
    Excellent post!

     If Apple indeed goes ultra-high-res on their products, they will make sure that the performance is adequate on the existing hardware. Higher res will always mean more work for the whole system (CPU, GPU, RAM), but it's fine as long as the overhead is negligible.
     
  13. Whargoul macrumors member

    Joined:
    Apr 27, 2012
    Location:
    Denver
    #13
     An iPhone 4 has a completely different CPU that not only does more work per MHz but also runs at a higher clock.
     
  14. Chippy99 macrumors 6502a

    Joined:
    Apr 28, 2012
    #14
    They haven't done so with their current equipment so I don't know how you can be so blindly confident.

     Anyway, 3840x2880 displays aren't coming any time soon, so this is all irrelevant. 12 months away at least in my view, probably 24 months.
     
  15. omvs macrumors 6502

    Joined:
    May 15, 2011
    #15
     It was possible to buy a 22" 3840x2400 LCD in 2001 (http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors). The pixel response time was pretty awful and resulted in some visible ghosting (100 ms, maybe?), but scrolling/rendering on the computer seemed okay.

     GPUs and CPUs have come a long, long way in the past 10 years. As long as they're not using low-performance integrated GPUs, I don't think the new Macs will have many problems.
     
  16. russofris macrumors regular

    Joined:
    Mar 20, 2012
    #16
     Increasing the resolution of a display device does not noticeably influence the performance of the CPU. There are corner cases, such as UMA devices (where the graphics processor and CPU share physical memory), where the GPU and CPU contend for resources, but overall the CPU is unaffected.

     In cases where there is a discrete graphics processor (iMac), display performance is almost entirely constrained by the performance of the GPU. Modern GPUs are more than capable of pushing the number of pixels needed by Retina, which is easily demonstrated by looking at any multi-monitor setup. An iMac with three monitors pushes 3x the number of pixels and works fine. An iMac with a 4K screen (3840x2160) would have to push 4x the number of pixels. There will be a need for additional frame buffer memory (currently an option at the Mac store in the form of a 2GB video card), but that will manifest itself as an increase in the minimum amount of GPU RAM in the iMac's default configuration.
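
     A quick back-of-the-envelope in Python, for the sceptics (the 32-bit colour depth and triple buffering are my assumptions, not anything Apple has announced):

     Code:
         # Pixel and framebuffer arithmetic for a hypothetical 4K iMac panel.
         # 4 bytes per pixel and triple buffering are assumptions, not specs.
         pixels_1080p = 1920 * 1080              # 2,073,600 pixels
         pixels_4k = 3840 * 2160                 # 8,294,400 pixels
         print(pixels_4k / pixels_1080p)         # 4.0 -> exactly 4x the pixels

         bytes_per_pixel = 4                     # assumed 32-bit colour
         buffers = 3                             # assumed triple buffering
         mib = pixels_4k * bytes_per_pixel * buffers / 2**20
         print(mib)                              # ~94.9 MiB just for framebuffers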

     Fortunately, Retina displays do not have the same issues scaling lower resolutions that other displays do. Running a game at 1080p on a 4K display 'should' look exactly the same as running it on a 1080p display, because the GPU doesn't have to do any funky blending. A single 1K pixel becomes a 2x2 square of four pixels on a 4K screen, which occupies the same physical real estate as its 1K counterpart.
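
     To illustrate the "no funky blending" point, here is a minimal Python sketch of that 2x2 pixel doubling (nearest-neighbour, purely illustrative): each source pixel simply becomes a 2x2 block, with no colour mixing between neighbours.

     Code:
         # Nearest-neighbour 2x upscale: each source pixel maps to a 2x2
         # block of identical pixels, so no blending is ever required.
         def pixel_double(image):
             """image: list of rows, each row a list of pixel values."""
             out = []
             for row in image:
                 doubled = []
                 for px in row:
                     doubled.extend([px, px])   # duplicate horizontally
                 out.append(doubled)
                 out.append(list(doubled))      # duplicate vertically
             return out

         for row in pixel_double([[1, 2], [3, 4]]):
             print(row)
         # [1, 1, 2, 2]
         # [1, 1, 2, 2]
         # [3, 3, 4, 4]
         # [3, 3, 4, 4]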

     One downside is that items that required blending on a 1K screen (text, icons, etc.) need to be recreated, otherwise they will look bad. Blending tricks that made things look better on a 1K screen don't really work on a Retina screen, where no blending is necessary.

    One last note on 3D game performance....

     With 3D Retina displays at native resolution, you no longer need anti-aliasing. Since you can no longer perceive the staircase progression of an angled line, it is not necessary to combine/blend multiple samples to mimic it. Diagonal lines will look like diagonal lines, without having to resort to mathematical tricks to fake it.

     For all intents and purposes (and in a digital vacuum), a 1920x1080 game with 4xAA and a 3840x2160 game with no antialiasing should perform identically (yes, there are many corner cases, exceptions, and different types of AA with varying efficiency gains, but the point should ultimately stand). If you currently run 1920x1080@4xAA in your games, you will be able to achieve nearly the same performance on a Retina device at native resolution.
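
     The in-a-vacuum equivalence is just sample-count arithmetic (a Python sketch; note it only holds if every AA sample is fully shaded, i.e. brute-force supersampling):

     Code:
         # Shaded-sample arithmetic behind "1080p @ 4xAA ~= 4K with no AA".
         # Only valid for supersampling, where every sample is fully shaded.
         samples_1080p_4xaa = 1920 * 1080 * 4   # 8,294,400 shaded samples
         samples_4k_noaa = 3840 * 2160          # 8,294,400 shaded pixels
         print(samples_1080p_4xaa == samples_4k_noaa)   # True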

    F
     
  17. iMacFarlane macrumors 65816

    iMacFarlane

    Joined:
    Apr 5, 2012
    Location:
    Adrift in a sea of possibilities
    #17
     Absolutely brilliant! You're right: at the logic/code level, the time needed to figure out what to put into four pixels should be the same as the time needed to figure out what to put into one pixel four times. Good argument for Retina feasibility. That being said, with faster GPUs we should realistically be able to expect improved performance at Retina resolution over what today's iMac does at standard resolution. Salud!
     
  18. leman macrumors 604

    Joined:
    Oct 14, 2008
    #18
     Because of the many inaccuracies, I have to comment on this again, sorry.

     Many elements are still rendered on the CPU (text, for instance; I have no idea how many drawing operations are offloaded to the GPU). We are by no means talking about a significant performance hit, but the CPU will have to do more work.

     Second, when talking about GPU performance, you talk about fill rate. Naively speaking, desktop compositing involves projecting textures (images of windows and UI elements) onto 'flat' 3D quads. The current Sandy Bridge IGP offers fill rates well over 3 GTexels/s, which translates to rendering 4K fullscreen quads at over 250 fps. In practice, you don't need to redraw the full screen often, as only small parts of interface elements are being updated. So even an integrated GPU won't have any problems at all with compositing a 4K desktop - in theory. In practice, memory bandwidth and CPU/GPU sync will play an important role. Because a 4K desktop would require lots of texture RAM, the compositor is likely to stream UI buffers into texture memory. This can result in slowdowns when you suddenly recall many buffers, e.g. when launching Exposé or similar tools.
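
     The fill-rate arithmetic, as a quick Python sketch (the ~3 GTexel/s figure is a rough one, and this deliberately ignores overdraw, bandwidth and sync):

     Code:
         # How often a ~3 GTexel/s IGP could repaint a full 4K desktop,
         # ignoring overdraw, memory bandwidth and CPU/GPU sync costs.
         fill_rate = 3e9                  # texels/s, rough Sandy Bridge figure
         pixels_4k = 3840 * 2160          # 8,294,400 pixels per fullscreen pass
         print(fill_rate / pixels_4k)     # ~361 repaints/s, i.e. "over 250 fps"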


     Again, it is a total waste to store everything in video memory. Desktop compositing does not need such high framerates anyway, and most of the stuff (everything that has text on it) has to be rendered on the CPU side of things. Streaming textures makes much more sense here.

    This is correct

     And this is very, very wrong. You are confusing multisampling and supersampling. The commonly used multisampling AA is a clever hack which allows one to smooth edges without too big an increase in memory/bandwidth and processing cost. It's been some time since I have done serious graphics programming so I can't recall the specific details, but basically, with multisampling AA you only render the shapes of elements at a higher (subpixel) resolution; your actual color/depth/etc. buffers are at native resolution and are then blended using the higher-res subpixel buffer. This creates the illusion that the image was actually rendered at a higher resolution, without too much additional work.

     Supersampling AA is entirely different, as it involves truly rendering to a higher-res color/depth/etc. buffer and then downscaling the image. Now, playing at 3840x2160 is effectively 2x2 supersampling of 1080p, and the performance impact is huge. Basically, the performance will be 1/4 of 1080p performance without any AA (and that is if your card is not bottlenecked by VRAM bandwidth, which it probably will be). In contrast, 4x multisampling usually carries less than a 20% performance hit.
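
     As a toy cost model in Python (the baseline framerate and the 20% figure are illustrative numbers, not measurements):

     Code:
         # Supersampling shades every subsample; multisampling shades once
         # per pixel and only supersamples coverage/depth.
         fps_1080p_noaa = 60.0                    # assumed baseline framerate
         fps_4k_ssaa = fps_1080p_noaa / 4         # 4x shaded pixels -> 15 fps
         fps_1080p_4xmsaa = fps_1080p_noaa / 1.2  # ~20% hit -> 50 fps
         print(fps_4k_ssaa, fps_1080p_4xmsaa)     # 15.0 50.0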

     So when gaming on a super-high-res screen, you WILL be forced to use a lower resolution. Which is still OK, if the screen size stays around the same.

    ----------

     Please read my post. It's really not the way it works.
     
  19. russofris macrumors regular

    Joined:
    Mar 20, 2012
    #19
     I am neither incorrect nor confused, nor are you, with the exception of you having imagined me saying something I didn't. I didn't think it was prudent to go into detail about edge versus sample, or FSAA versus SSAA versus MSAA, and I actually made a reference to these exceptions in my statement: "yes, there are many corner cases, exceptions, different types of AA with varying efficiency gains, etc, but the point should ultimately stand".

    My point is: For what performance you lose by going to a higher resolution, you can make up for a great deal of it by disabling antialiasing.

    Unfortunately, there are other things that add onto the performance cost of a higher resolution (any post-processing, including many HDR implementations) that you won't magically get back.

     A 512x512 texture consumes the same amount of memory when displayed on a 4K screen as it does on a 1K screen. If you're saying that the gaming industry is going to react to Retina by releasing HQ textures (4096x4096, for example), then I'll concede the point. You're going to need more texture memory.
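
     The texture-memory point as plain arithmetic (a Python sketch; sizes are illustrative and ignore mipmaps and compression):

     Code:
         # A texture's footprint depends on its own resolution, not on the
         # screen it is drawn to. Ignores mipmaps and texture compression.
         def texture_bytes(w, h, bytes_per_texel=4):
             return w * h * bytes_per_texel

         print(texture_bytes(512, 512) / 2**20)     # 1.0 MiB on any screen
         print(texture_bytes(4096, 4096) / 2**20)   # 64.0 MiB for "HQ" assets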

     In addition, you seem to imply that a higher resolution will increase the triangle count. A 20k-poly scene is a 20k-poly scene regardless of the resolution. It's the same as texturing. As you said, fill rate would be the main bottleneck, and that's not a problem with today's hardware.

     You will be forced to use a resolution that is immersive at best and playable at worst. For some games this means you will have to drop back to 1080p, which will look exactly the same as it would on a 1080p monitor. For others, 4K will do just fine.

     I get the feeling I'm arguing with a Californian. You know, one of those guys who, when you say it's "quarter of four", will chime in with "Actually, it's 3:43". It's not that you're wrong; it's just that I hate you, and it is better for me to concede the argument to spare myself the embarrassment of playing "who can be the bigger jerk" in a public forum. The only way to win is not to play.
     
  20. leman macrumors 604

    Joined:
    Oct 14, 2008
    #20
     Haha, how did you get that impression? :) No, the reason I argue is that some of the things you wrote might confuse people who are a bit further away from technology and give them the wrong idea. And if I sound a bit aggressive, it's just because I am very tired and rushing while writing.

     And this is, alas, where you are wrong. The times when anti-aliasing had a big performance impact are long gone. As I mentioned before, 4x multisampling will often cost less than 20% with modern cards and clever algorithms, while the cost of supersampling is much higher, as it is the brute-force approach. Many cards can do multi-sampled 1080p, but how many can do true super-sampled 1080p gaming? And which of them are likely to find their way into the iMac?

    I was not talking about gaming here, I was talking about desktop composition. Higher resolution = bigger UI controls = more memory required to store their contents.
     
  21. russofris macrumors regular

    Joined:
    Mar 20, 2012
    #21
     Ahh, we're cool then. I tend to overreact when someone acts like I've wasted the most valuable two minutes of their lifetime by rounding.

     You're saying that I overstated the performance gains at the theoretical, in-a-vacuum, 100%-FSAA level. To that I concede, though I did make an attempt to convey that my statement had a number of exceptions and that AA was more efficient than what I was presenting.

     I'm saying that the 20% @ 4xMSAA is an understatement, and I am trying to assure potential Retina customers that their Batman game will probably perform about the same on the 2012 Retina iMac at 4K as it did on their 2011 iMac at 2K. I was directing my comments at the "games will totally suck on Retina" crowd. I'd give you 35% on games without post-processing and 50% on games with heavy post-processing, but the numbers are pretty arbitrary, and we're both just guesstimating based on our prior experience. When you factor in the new CPUs, GPUs, and architectural advancements like GPU IOMMU and pinned memory, the potential exists for a bright and high-resolution 2012.

    The only games that will suck on retina are the ones that currently suck without it.
     
  22. nuckinfutz macrumors 603

    nuckinfutz

    Joined:
    Jul 3, 2002
    Location:
    Middle Earth
    #22
    Russofris and Leman

    Way to be civil in your exchanges and find common ground. I enjoyed kibitzing on the conversation.

     It'll be interesting to see how Retina displays affect composition and how the various industries engineer their way around the problems.
     
  23. russofris macrumors regular

    Joined:
    Mar 20, 2012
    #23
     Optimistically: take what humanity has achieved with paper + what humanity has achieved with pixels = what humanity can do with Retina.

     Pessimistically: we will receive a mandatory direct-to-streaming release of "Green Lantern II" at 4K resolution for download on the iTunes Store.
     
  24. Giuly macrumors 68040

    Giuly

    #24
    That's the whole ***** point, Walter.
     
