It means they're not going to look too good. See here please.
It means absolutely nothing.
Web images are measured in pixels, not pixels per inch. The 72 DPI "standard" for the web is a myth and has no effect on anything. Read here: http://www.webdesignerdepot.com/2010/02/the-myth-of-dpi/
If you compare the same image on a 220 PPI monitor and a 110 PPI monitor, each running at native resolution, the picture will simply be twice as big on the 110 PPI monitor. If that's confusing, imagine watching a Full HD movie on a 9.7" iPad 3 versus a 42" Full HD TV: the picture quality is the same on both screens, but it's a lot smaller, and therefore crisper, on the iPad 3 because its pixels are smaller.
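To make that concrete, here's a minimal sketch of the arithmetic (TypeScript; the function name and the 1920-pixel example width are mine, not from any real API):

    // Physical size of an image is its pixel count divided by the
    // display's pixel density, not some DPI value embedded in the file.
    function physicalWidthInches(widthPx: number, ppi: number): number {
      return widthPx / ppi;
    }

    console.log(physicalWidthInches(1920, 220).toFixed(1)); // "8.7"  -> inches on a 220 PPI panel
    console.log(physicalWidthInches(1920, 110).toFixed(1)); // "17.5" -> twice as big on a 110 PPI panel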
The pixelation of images on the Retina MBP is due to Apple's unusual resolution scaling. If you could run the desktop at a true 2880x1800 resolution, there would be no pixelation or blurriness in images.
Well, you could always use Google; there are tons of tutorials on adapting a website for HiDPI monitors. For example: http://www.kylejlarson.com/blog/2012/creating-retina-images-for-your-website/
http://retina-images.complexcompulsions.com/
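Those tutorials mostly come down to the same trick: detect the device pixel ratio and serve a double-resolution image. A minimal sketch of the idea in TypeScript (my own illustration, not code from either link; the "@2x" file-naming convention and the existence of those files on the server are assumptions):

    // If the display is high-density, swap "photo.png" for "photo@2x.png".
    // The <img> should have an explicit CSS width/height so the larger
    // bitmap is drawn into the same logical box.
    function swapToRetina(img: HTMLImageElement): void {
      if (window.devicePixelRatio >= 2) {
        img.src = img.src.replace(/(\.\w+)$/, "@2x$1"); // photo.png -> photo@2x.png
      }
    }

    document.querySelectorAll("img").forEach(swapToRetina);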
Well, it works a bit differently once resolution independence comes into play (as in OS X's HiDPI modes). There, a logical pixel (a point) equals two real pixels in each dimension. For example, the 'best for retina' mode is a logical 1440x900 and is seen by applications as such; it's just that the real resolution is 2880x1800. So in HiDPI mode, the physical size of a 'normal' 100x100 image and a HiDPI 200x200 image will be the same: 200x200 real pixels (the normal one is upscaled using bilinear filtering).

I agree that it's a bit confusing, but that's how it works. We are used to pixels representing an image's size. With resolution independence, pixels represent image data and no longer the image's size, which becomes a separate parameter. So a 1x1 cm image could be 50x50 pixels, 100x100 pixels, or 1000x1000 pixels; hence HiDPI. Of course, OS X uses a somewhat simplified notion, distinguishing only between HiDPI images (1 image pixel = 1 native pixel) and non-HiDPI images (1 image pixel = a 2x2 block of native pixels). It would probably be less confusing to work with real units, like millimetres, and design the UI in a truly resolution-independent way, but that's something for a next-gen OS.
Apple developer documents provide a nice overview: http://tinyurl.com/d7kr95d
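The same point-versus-pixel split is visible from the browser on a Retina screen. Here's a sketch of the standard HiDPI canvas technique (my own example, not from the Apple docs above): the element's CSS size stays in logical pixels (points), while its backing store is doubled.

    // A canvas that occupies 100x100 points but is backed by 200x200
    // real pixels when devicePixelRatio is 2.
    function makeHiDpiCanvas(cssWidth: number, cssHeight: number): CanvasRenderingContext2D {
      const dpr = window.devicePixelRatio || 1;   // 2 in 'best for retina' mode
      const canvas = document.createElement("canvas");
      canvas.width = cssWidth * dpr;              // backing store in real pixels
      canvas.height = cssHeight * dpr;
      canvas.style.width = cssWidth + "px";       // logical (point) size
      canvas.style.height = cssHeight + "px";
      document.body.appendChild(canvas);
      const ctx = canvas.getContext("2d")!;
      ctx.scale(dpr, dpr);                        // draw in logical units from here on
      return ctx;
    }

    const ctx = makeHiDpiCanvas(100, 100);        // crisp on the Retina panel, same physical size as before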
Actually, the UI elements in OS X carry four times as many pixels as they traditionally would in order to display properly on Retina: a typical 512x512 icon gets a 1024x1024 Retina-aware counterpart. This is because the "best for retina" setting is a 2x scale, and you are scaling 2x on a screen with twice the resolution in each dimension of what it is scaling to, so a UI element needs twice the width and twice the height, i.e. 4x the pixels, to render at the correct size at full resolution.
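The arithmetic, spelled out (a quick sketch; the function name is mine):

    // Doubling each dimension quadruples the pixel count.
    function retinaAssetPixels(baseSide: number, scale: number = 2): number {
      return (baseSide * scale) * (baseSide * scale);
    }

    console.log(retinaAssetPixels(512)); // 1048576 pixels (1024x1024)
    console.log(512 * 512);              // 262144 pixels -> exactly 4x fewer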
Understood, but the thread starter's question was about the rMBP's 220 PPI rather than Mountain Lion's odd resolution scaling. If the rMBP could display a true 2880x1800 resolution, it would not have any quality issues with web images.