Screenshot "as it appears" at scaled res?

Discussion in 'MacBook Pro' started by Qaanol, Jan 23, 2014.

  1. Qaanol macrumors 6502a

    Joined:
    Jun 21, 2010
    #1
    I want to take screenshots that come out at 2880×1800 and show the exact color displayed on each pixel of my 15.4" rMBP, no matter what scaled resolution I am using. How can I do this?

    In particular, if I set my screen to 1680×1050, I want my screenshots to be 2880×1800, not 3360×2100. And if I set my screen to 1920×1200, I want my screenshots to be 2880×1800, not 3840×2400. You know, a snapshot of the screen.
     
  2. ayeying macrumors 601

    ayeying

    Joined:
    Dec 5, 2007
    Location:
    Yay Area, CA
    #2
    It's not quite that simple. When you're running 1680x1050, the OS renders everything at 3360x2100 so it still looks sharp; likewise, 1920x1200 is rendered at 3840x2400. The only way I know of to get 2880x1800 is to run at 1440x900.
     
  3. priitv8 macrumors 68020

    Joined:
    Jan 13, 2011
    Location:
    Estonia
    #3
    Or natively at 2880x1800.
    In all other cases the framebuffer is exactly double the virtual resolution.
    Since the screen snapshot copies the framebuffer 1:1, you can't get a 2880x1800 snapshot from any other resolution.
    If you run your screen natively at 2880x1800, you get a 2880x1800 screenshot at 72 dpi.
    If you run the 1440x900 HiDPI mode, you get a 2880x1800 screenshot at 144 dpi.
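    If you want to check which framebuffer a given screenshot was copied from, here is a minimal sketch in Swift (the file path is just a placeholder) that reads the saved pixel size and dpi back out with ImageIO:

    Code:
    import Foundation
    import ImageIO

    // Read the pixel size and dpi a screenshot was saved with, to see
    // which framebuffer it was copied from. The path is a placeholder.
    let url = URL(fileURLWithPath: "/tmp/screenshot.png") as CFURL

    if let source = CGImageSourceCreateWithURL(url, nil),
       let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
        let width  = props[kCGImagePropertyPixelWidth]  as? Int ?? 0
        let height = props[kCGImagePropertyPixelHeight] as? Int ?? 0
        let dpi    = props[kCGImagePropertyDPIWidth]    as? Double ?? 72
        print("\(width)x\(height) at \(dpi) dpi")
        // Native 2880x1800:  expect 2880x1800 at 72 dpi
        // HiDPI 1440x900:    expect 2880x1800 at 144 dpi
        // HiDPI 1920x1200:   expect 3840x2400 (the doubled backing store)
    }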
     
  4. Qaanol thread starter macrumors 6502a

    Joined:
    Jun 21, 2010
    #4
    Yes, I understand all of that. The stock screenshot implementation does not do what I want; that is why I made this thread. I want to create an image that shows exactly what appears on the screen, pixel for pixel. How can that be done?
     
  5. alphaod macrumors Core

    alphaod

    Joined:
    Feb 9, 2008
    Location:
    NYC
    #5
    Easy: take the screenshot, downsample to 2880x1800 in software.
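    As a sketch of what that could look like in Swift (file paths are placeholders, and the resampling filter here is just a generic high-quality one, not necessarily the same filter OS X uses when it scales the framebuffer to the panel):

    Code:
    import Foundation
    import CoreGraphics
    import ImageIO

    // Downsample an oversized screenshot (e.g. 3840x2400 from the 1920x1200
    // HiDPI mode) to the panel's 2880x1800. Paths are placeholders.
    let inURL  = URL(fileURLWithPath: "/tmp/screenshot.png") as CFURL
    let outURL = URL(fileURLWithPath: "/tmp/screenshot-2880x1800.png") as CFURL

    guard let src = CGImageSourceCreateWithURL(inURL, nil),
          let image = CGImageSourceCreateImageAtIndex(src, 0, nil),
          let ctx = CGContext(data: nil, width: 2880, height: 1800,
                              bitsPerComponent: 8, bytesPerRow: 0,
                              space: CGColorSpace(name: CGColorSpace.sRGB)!,
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { fatalError("could not load screenshot") }

    // Generic high-quality interpolation; may differ from the window server's filter.
    ctx.interpolationQuality = .high
    ctx.draw(image, in: CGRect(x: 0, y: 0, width: 2880, height: 1800))

    guard let scaled = ctx.makeImage(),
          let dest = CGImageDestinationCreateWithURL(outURL, "public.png" as CFString, 1, nil)
    else { fatalError("could not write output") }
    CGImageDestinationAddImage(dest, scaled, nil)
    CGImageDestinationFinalize(dest)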
     
  6. Qaanol thread starter macrumors 6502a

    Joined:
    Jun 21, 2010
    #6
    Is there software that guarantees to use the same downsampling method as OS X natively uses for displays?

    That downsampling is certainly already taking place somewhere along the graphics pipeline, between the display buffer and the hardware itself. I want to ensure that the image I create is pixel-for-pixel identical to what appears on the screen.

    Moreover, if Apple subsequently decides to alter its downsampling algorithm, I want to immediately and perpetually be guaranteed to obtain precisely the image as displayed, every time.

    So I repeat my query: are there any tools to capture the image *as shown on the screen*, pixel for pixel, on a retina MacBook?
     
  7. alphaod macrumors Core

    alphaod

    Joined:
    Feb 9, 2008
    Location:
    NYC
    #7
    No, there isn't. The only tool you need is built into OS X. Capturing an oversampled 1680x1050 HiDPI or 1920x1200 HiDPI screen at only 2880x1800 would require some downsampling as well.
     
  8. durkkin macrumors regular

    Joined:
    Sep 23, 2013
    #8
    No. The only people who know that are at Apple.

    And how exactly do you expect that to happen? You think Apple allows access to that for any developer to exploit at will?

    Yes, it's called a camera. Take a picture. That will capture the image being shown on screen. Or just take a screenshot. That's going to show the exact image that's being displayed on screen. Why do the dimensions of that image matter?
     
  9. Qaanol thread starter macrumors 6502a

    Joined:
    Jun 21, 2010
    #9
    Okay, walk me through this.

    The image to appear on the screen is prepared by a processor. Is it the CPU or GPU?

    That image is rendered for retina at twice the final target resolution. So if the screen is set to 1920×1200 the image is rendered at 3840×2400.

    Is that 3840×2400 image what is known as a frame buffer?

    Where is it stored: in a cache, in main memory, or in VRAM?

    That image gets sent to a processor for downsampling to the display’s actual resolution of 2880×1800. Where does this downsampling take place, in the CPU or the GPU?

    What is the resulting 2880×1800 image, to be displayed on screen, known as?

    Where is it stored?

    Which processor decides when to push that 2880×1800 image out to be displayed?
     
  10. ayeying macrumors 601

    ayeying

    Joined:
    Dec 5, 2007
    Location:
    Yay Area, CA
    #10
    The image is prepared by the software, not the hardware.

    The doubled resolution is how the software, in this case OS X, renders the image at 1920x1200 even though the screen has a fixed number of physical pixels. "Retina" basically means rendering everything at 2x. You cannot get a 1920x1200 scaled mode unless it is backed by 3840x2400. Remember the 2x rule.

    Here's how the process works. What you want is almost impossible.

    The OS renders everything you see on the screen at 2x (HiDPI mode) of whatever resolution you select. If it's 1280x800, the OS renders at 2560x1600; if it's 1440x900, it renders at 2880x1800, and so forth. It doesn't matter how many physical pixels the panel has; that's just what OS X does.

    If you're running at 1920x1200, you're actually rendering at 3840x2400, which is then downscaled to the screen to appear as 1920x1200. Since you're running HiDPI, this works; otherwise you wouldn't have enough physical pixels to map it 1:1.

    If you want 2880x1800, you can only be running at 1440x900 HiDPI or 2880x1800 native (1:1). Otherwise you won't get the resolution you want.
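    You can see the 2x rule directly by asking Core Graphics for the current mode. A small sketch in Swift that prints the selected size in points next to the backing framebuffer size in pixels:

    Code:
    import CoreGraphics

    // Compare the resolution you selected (points) with the backing
    // framebuffer (pixels) for the current display mode -- e.g. a
    // 1920x1200 scaled mode is backed by 3840x2400 pixels.
    let display = CGMainDisplayID()
    if let mode = CGDisplayCopyDisplayMode(display) {
        print("points: \(mode.width) x \(mode.height)")
        print("pixels: \(mode.pixelWidth) x \(mode.pixelHeight)")
    }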
     


  11. priitv8 macrumors 68020

    Joined:
    Jan 13, 2011
    Location:
    Estonia
    #11
    Well, here's an example for you: this screenshot was made with the desktop in native 2880x1800 resolution. It is a 1:1 pixel representation of the framebuffer, at 72 dpi.
    The only glitch: you need SwitchResX to switch the GPU into this resolution, as it is not possible from the Displays preference pane.
    http://forums.macrumors.com/showpost.php?p=18353372&postcount=18
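    For what it's worth, you can also do the capture programmatically. A minimal Swift sketch, assuming (as above) that the capture mirrors the framebuffer, so it only comes out at 2880x1800 when the framebuffer itself is 2880x1800 (native res, or the 1440x900 HiDPI mode):

    Code:
    import CoreGraphics

    // Grab the main display's contents and report the capture size.
    if let capture = CGDisplayCreateImage(CGMainDisplayID()) {
        print("captured \(capture.width) x \(capture.height)")
    }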
     
