BTW, I saw the 17" glossy regular-resolution one (not the 1920x1200), and it looks like UTTER CRAP. The graininess was horrible, and you could see the scan rate! The 20" iMac and below are also terribly grainy, along with the 15" glossy MBP.

Quibble: LCD screens do not have a "scan rate". Only CRT monitors.
 
Hopefully we will see a new batch of MBPs with the CMO LED panel soon.
http://www.cmo.com.tw/cmo/english/about/shownews1.jsp?flag=20070612154119

From all the MBP screen experience so far, the CMO (9C57) is still preferred over the other panels we have seen on the CD and C2D machines. One would expect CMO's LED panel to also show higher quality than the others.

Which manufacturer are the current screens from?

Here is my screen's information:

MyDisplayPref.png
 
I just noticed this thread and what you guys are talking about. On my new MBP with the glossy screen, when I look at the screen from top to bottom there is no yellowish color whatsoever, just white. However, when I look from either the left or the right side, the entire screen becomes yellowish. Is that the problem, or is it just how the glossy finish looks combined with the new LED screen?
If you only get questionable results when looking at it from an angle, I think it's all down to the limited viewing angles.

If this issue exists and is a problem, it's something you see even while looking straight at the screen.
 
Which manufacturer are the current screens from?

Here is my screen's information:

Well, it seems to say there that the model is 9C67, which one would assume is newer and better than model number 9C57...
It probably doesn't work like that, but the fact remains there are a lot of "Apple should have"s, when the fact is that the SR MBP is an absolutely fantastic computer in ALL respects. I do not believe that can be successfully denied.
 
The MBP I bought on Monday had a definite yellow cast over the bottom 1/3 of the screen, that was not due to viewing angle or any other factor. The screen lighting was just yellow, period.

Well, I just exchanged mine, and I can say the replacement has no problems. Absolutely beautiful screen.
I think you are the first one who has gotten it exchanged for one without the colors. There have been people who didn't see the issue, but you have both seen it and then not seen it on the later one. So at least some screens must be good, and there is hope :)
 
Well I took my MBP to the Apple store. The Genius noted my issue with the screen and said he didn't want to exchange it for a new MBP, since a new one could have the same problem. So they are ordering me a new screen from Apple and will replace it when it comes in. And for good measure a new keyboard too, since I have a squeaky spacebar. And they will fix and replace the parts in the store, so I don't have to send it off for a long time.

Well once again Apple has proven their service to be impeccable. I'm a happy customer :D

-=DG=-
Sweet.
 
I think you are the first one who has gotten it exchanged for one without the colors. There have been people who didn't see the issue, but you have both seen it and then not seen it on the later one. So at least some screens must be good, and there is hope :)

Damn right. Problem fixed, now everyone stop worrying!!!
 
there are a lot of "Apple should have"s, when the fact is that the SR MBP is an absolutely fantastic computer in ALL respects. I do not believe that can be successfully denied.
OK OK OK, I'm willing to say "the MacBook Pro's specs don't suck, it's just that 256MB of VRAM would be nice insurance for future games and would have come at a very low cost if Apple had put it in all models". There you go, happy?

If I had bought a MBP I would just have gotten a Dell 2007WFP S-IPS panel for it as well, so I would probably not even look at the screen it came with that much.

And 128MB isn't much of an issue in Doom 3, Quake 4, UT2004, and so on, as Barefeats has shown; it only matters in very high-end benchmarks like 3DMark06, but I'm not sure games at that level can be played on the 8600M GT no matter how much memory it has anyway.
 
I went to the Apple store to get it exchanged, and the Genius checked my gradients and confirmed there was a flaw, so I went ahead and exchanged it for the matte screen (I almost got the glossy one, but I liked the matte better). Anyway, I'm at home now and I can confirm that it's 100% evenly lit. I don't see the yellow tint anymore, and there's no backlight bleeding, only barely a mm at each bottom corner; other than that the screen is superb. Go and get it exchanged, I think some of the screens just have that flaw.

Also, the 15" LED glossy MBP I saw on display had the yellow tint just like my first one. I checked and tested at the Apple store for about an hour this morning, and the LED matte display next to that glossy one had no yellow tint, so I went home, came back, and did the exchange. =D Now all is good!! And no dead pixels either, very nice quality!!

jjahshik also seems to have it fixed. Anybody else?
 
Which manufacturer are the current screens from?
Could you run this program and do a screen dump (export the DDC) using the SwitchResX control?
http://www.switchres.info/html/SRX/DL.shtml

The 9C67 LED panel (matte) should be the LP154WP2-TLA1, which is an LG panel. It is supposed to be a decent panel. People who noticed a more distinct yellow tint on the 9C68 LED panel (glossy) can also run the SwitchResX control to determine the screen manufacturer.
 
ok ok ok ok, I will be willing to say "The Macbook Pros spec doesn't suck, it's only that 256MB vram would be a nice insurrance for future games and would come at a very low cost if Apple would had put it in all modells", there you go, happy?

If I would have bought a MBP I would just have got a Dell 2007WFP S-IPS panel for it aswell so I would probably not even look at the screen it came with that much.

And 128MB isn't much of an issue in doom3, quake4, ut 2004, and so on which barefeats have shown, only in very high-end game type benchmarks in 3dmark06 but I'm not sure games at that level can be played on the 8600M GT no matter how much memory anyway.

Remember that the MBP is NOT a gaming computer. It is designed to be a work computer that has the capability to play great games at fairly high quality. I would say the closest thing Apple has to a gaming computer is the 24" iMac. Anyone wanna argue? I seriously doubt any professional application would benefit more from extra VRAM than from a faster processor. If you happen to have a 500MB CAD file you want to work with, and so would expect a use for an extra 256MB of VRAM, I should think the GPU's TurboCache would cover it. The extra VRAM is only good for games. I'm not saying the MBP should not have 256/512MB of dedicated VRAM, just that I believe it is not necessary for what the MBP was primarily designed for.

Apologies, I believe I am in somewhat of a bad mood at the moment, but the MBP and Mac Pro are primarily work computers, not gaming computers. That's easily seen from the use of a 2.2/2.4 GHz processor, which is no doubt complete overkill for almost every game, especially when paired with an 8600M GT. Maybe it wouldn't be GPU-limited with an 8800 or something, but as it stands it is.

And after all that (bitching, I admit) I agree with what you said.
 
Well, it seems to say there that the model is 9C67, which one would assume is newer and better than model number 9C57...
It probably doesn't work like that, but the fact remains there are a lot of "Apple should have"s, when the fact is that the SR MBP is an absolutely fantastic computer in ALL respects. I do not believe that can be successfully denied.

Believe me when I say this, I am actually very happy with mine, and many of you guys know how pissed I was with the older ones.

Still I am rather curious about the issues you guys noticed.
 
Well, it seems to say there that the model is 9C67, which one would assume is newer and better than model number 9C57...
It probably doesn't work like that, but the fact remains there are a lot of "Apple should have"s, when the fact is that the SR MBP is an absolutely fantastic computer in ALL respects. I do not believe that can be successfully denied.
The screen code used by Apple identifies different screen manufacturers. A higher number doesn't necessarily mean it is better. This was the case with the CD and C2D, where people preferred the 9C57 (crisp, evenly lit, nearly no grain) after Apple later switched to the AUO 9C60 and 9C61 (on most C2Ds), which turned out to be more grainy and unevenly lit. When LED panel production ramps up, we will see more LED panels from different manufacturers, which will help us determine which panel performs better.
 
Quibble: LCD screens do not have a "scan rate". Only CRT monitors.

No, but they do still have a refresh rate. Though only the "moving" pixels are changed during a refresh, the "scan" can be visible if the picture is moving fast and the GPU/screen is acting slow.

Upping the LCD refresh rate can also make the display act weird if the refresh rate matches the screen response badly. I mean, if you have a 60Hz refresh, that's 16.67ms per refresh. If it's a 10ms screen, for example, there are probably going to be visible problems in fast-moving scenes. It would be optimal to use a 100Hz refresh, which the GPU probably won't do, so the next best thing would be a 50Hz refresh, which is doable. Then each frame would be visible for 20ms, which suits the 10ms screen perfectly. In this scenario a 50Hz refresh would produce a better picture than a 60Hz refresh; this example is not taken from the real world, but simply made up to show that bigger is not always better.

That's also the reason I wonder why gamers want the biggest possible frame rate. It would be optimal to have a constant frame rate that is some multiple of the screen refresh; for example, this 10ms screen would work optimally if the GPU refreshed the screen 50 times a second and the game provided the GPU a new picture every time it does an update, making 50fps the optimal frame rate (and 100/150/200fps optimal for those who want bragging rights). Why games have a variable frame rate goes beyond my understanding; IMO it would be a lot more beneficial to have a user preference for a constant frame rate, for example 50fps (or 25fps if the game is very taxing).
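That timing argument can be sketched in a few lines (a toy check; the 10ms panel and the refresh rates are the made-up numbers from above, not measurements):

```python
# A refresh rate "suits" a panel, in the sense above, when the frame
# interval is a clean multiple of the panel's pixel response time.

def frame_interval_ms(refresh_hz):
    """Time each frame is held on screen, in milliseconds."""
    return 1000.0 / refresh_hz

def is_clean_multiple(interval_ms, response_ms, tol=1e-6):
    """True when the pixel transition divides evenly into the frame time."""
    ratio = interval_ms / response_ms
    return ratio >= 1 and abs(ratio - round(ratio)) < tol

for hz in (50, 60, 100):
    interval = frame_interval_ms(hz)
    ok = is_clean_multiple(interval, 10)  # the 10 ms panel from the example
    print(f"{hz:>3} Hz -> {interval:5.2f} ms/frame, suits a 10 ms panel: {ok}")
```

Which matches the example: 50Hz and 100Hz line up with a 10ms panel, while 60Hz (16.67ms per frame) does not.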

OK, end of this off-topic nonsense :)
 
No, but they do still have a refresh rate. Though only the "moving" pixels are changed during a refresh, the "scan" can be visible if the picture is moving fast and the GPU/screen is acting slow.

It was my belief that the GPU calculated what all the pixels of a specific frame should be, held them in some sort of buffer (I assume), and then changed the screen all at once. Is this true, or does the GPU send stuff directly to the screen, so pixels are changed running down (or up) the screen, resulting in pixels at one end being refreshed before pixels at the other? That would constitute a scan, but if they're held in a buffer and changed all at once, it would not. I guess the fact that 60Hz apparently looks worse than 50Hz on such a screen would suggest the buffer.

So the 100Hz screen refresh is what the screen does, changing the pixels on the screen when it gets the new data, regardless of when it gets that data? I would have thought the screen would wait until a full frame had arrived, put it up, and then wait for the next one, in which case the 100Hz refresh rate would simply be the point where the pixels can't physically change fast enough, and I suppose you would get some kind of physical dithering effect in the LCD crystals themselves.

Anyway, sorry to quiz you on the exact workings of an LCD screen...
 
Apologies, I believe I am in somewhat of a bad mood at the moment, but the MBP and Mac Pro are primarily work computers, not gaming computers.
Well, OK, Macs aren't gaming computers, mostly because of:
1) There are few games for Macs.
2) Therefore gamers don't buy Macs.
3) If gamers don't buy Macs, why release games for them?
4) The hardware most often isn't very good for games.
5) The hardware most often can't be upgraded.
6) OS X graphics drivers and OpenGL are slow and outdated.
7) Therefore, even on the same hardware, games are faster in Windows, so people will dual-boot anyway.
8) Gamers still run the games in Windows.

Anyway, I'm an old Amiga nerd, and later an OS nerd in general, and I just want a decent OS. Back in the Amiga days MS-DOS 6.22 + Win 3.11 sucked, and that's where my hate for Windows comes from, which is quite illogical considering how stable and good XP SP2 (and probably Vista) is nowadays. Anyway, I would prefer "to be different" and run something else.

I'm perfectly happy with FreeBSD + KDE, but I would like to be able to run, for instance, Photoshop and games, and I don't want to dual-boot. That's where OS X comes in, since it offers some commercial apps and games. If I had a Mac I would probably play in OS X even though I'd get worse performance, just to avoid installing and dual-booting Windows.

So for me the huge advantage of a Mac and OS X is being able to game without dual-booting, and nothing else. Many of the smart utilities for OS X cost money, while on BSD/Linux/Solaris they tend to be free, so as far as the OS itself goes I'm quite happy with an open one.

I already know my needs might not be everyone else's, and even less what Apple wants to sell. But they are mine, and therefore I will have the issues I have, whether or not others have them too. I'm a consumer, not a donor or an Apple employee, so I will argue for what fits my consumer needs.
That's easily seen from the use of a 2.2/2.4 GHz processor, which is no doubt complete overkill for almost every game, especially when paired with an 8600M GT. Maybe it wouldn't be GPU-limited with an 8800 or something, but as it stands it is.

And after all that (bitching, I admit) I agree with what you said.
I googled for prices and it seems a 2.2 GHz Core 2 Duo (the T7500, is that the one? Whatever, it doesn't matter much, it's the principle) is around $330, while the 2.4GHz is around $650, so that's a $320 higher price for a CPU that's 1/11th faster. However, an upgrade from 256 to 512MB on the desktop version of the 8600 GT here in Sweden costs around 150 SEK, which is around $20. So say 128MB more VRAM might cost $10, but it doesn't matter; my argument is valid at $50 as well.
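Just to spell out that arithmetic (all prices are rough 2007 estimates from a quick search, not quotes):

```python
# The CPU premium versus the clock gain it buys:
cpu_22_usd, cpu_24_usd = 330, 650
extra_cost = cpu_24_usd - cpu_22_usd   # the $320 premium
clock_gain = 2.4 / 2.2 - 1             # fraction faster: exactly 1/11
print(f"${extra_cost} extra for a {clock_gain:.1%} clock bump")
```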

For Apple to add that $10 or whatever to the low-end model wouldn't have had a huge effect on the price, but it would have kept me happy. Instead they are using 128MB just to push people into the mid-range model, on which they probably earn a little more, but that one comes with a $320 or so more expensive CPU for which I have no ****ing use at all. And that's what's making me insane. I won't fall for that trick and buy that model, especially as I know that in half a year's time that CPU won't be top of the range; it won't be much slower than another one, but it will be much cheaper, and then it's stupid to have paid a lot for it.

128MB of VRAM, however, is cheap.

To me, 15.4" 2.2GHz, 2GB, 120GB, 256MB 8600M GT and 17" 2.4GHz, 2GB, 200GB, 512MB, 1920x1200 models alone would have been enough. Then add the 13.3" with integrated graphics and 1GB of RAM, put the same shell on all of them, and just sell them as MacBooks.

There you go, three models of which none really sucks (well, a MacBook with a real GPU would be better, but one could argue that the lack of one is good for portability, although I would have preferred even a low-end GPU to integrated graphics).

Edit: I'm angry because it's so obvious this was done to create a larger difference between the low- and mid-range models in order to sell more of the latter, not because the cost of 256MB of VRAM was so high that they had to do it.
 
Another question, which I asked before but got no answer to.
If LED screens have the LEDs arranged around the outside of the screen, how is the light transported to the centre of the screen to keep the backlight even? I imagine it's not fibre optics, because that would just be insane...

Sorry to quiz you guys so much, but I am interested.

aliquis-: I don't know why Apple didn't use 256/512MB of VRAM in the MBPs, but I'm sure there was a deeper reason than just getting people to pay for the 2.4 model. I think we need more benchmarks, but it is quite likely Apple experimented with the different configurations and deemed that the 8600M GT would not benefit from 512MB of VRAM, especially as it uses TurboCache. Maybe it's not powerful enough to process more than 256MB of textures in less time than it would take to just load the new textures into VRAM as it finishes with the old ones?

I'm sure there is a legitimate reason why an 8600M GT with 512MB of dedicated VRAM has no benefit over one with 256MB. There *may* also be issues with GDDR3 vs GDDR2: perhaps Apple could not get a GDDR3 version with 512MB, and so chose to go smaller so they could keep memory clocks high while keeping power consumption low?

I expect more in-depth benchmarks between the 2.2 and 2.4 models, along with other machines such as the 'G1S' etc., could reflect this.
 
Yep; the "scan" is visible because the timings do not match. You can test this yourself by staring at the "flurry" screen saver for some time. It tries to refresh the picture as often as it can, eventually resulting in the kind of "scan distortion" mentioned above. You may notice that sometimes there's a break in the picture; it does not mean the GPU hiccuped, just that the timings are not matching perfectly.

An LCD panel only changes a pixel's content when the content actually changes, so there's no need for very high refresh rates. Actually, 25fps would be "enough" for a smoothly moving SD picture (or 50fps for HD). That would also mean 40ms panels would be perfectly okay for a smoothly moving SD picture (or 20ms panels for HD). But there are always people who think they notice a difference between 8ms and 16ms, while in practice what they're seeing is different distortions from the occasional timing mismatches.

If there were a 50Hz screen refresh for a 20ms LCD panel, there would be no timing distortions and the picture would be awesome, even while playing at a lowly 25fps, which would just mean every frame gets displayed twice by the GPU ;)
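That frame-doubling arithmetic is trivial to write out (the 50Hz/25fps figures are the hypothetical ones from above):

```python
# At a constant 25 fps on a 50 Hz refresh, each game frame is simply
# scanned out twice; no timing mismatch, no distortion.
refresh_hz = 50
game_fps = 25
assert refresh_hz % game_fps == 0, "fps should divide the refresh evenly"
repeats = refresh_hz // game_fps
frame_ms = 1000 // refresh_hz
print(f"each game frame is scanned out {repeats} times ({frame_ms} ms per refresh)")
```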
 
Is this true, or does the GPU send stuff directly to the screen, so pixels are changed running down (or up) the screen
To begin with, the GPU works on many pixels at once while calculating them, not one at a time, but I guess the result ends up in a buffer which is then sent to the screen. Anyway, I doubt all pixels are updated at once; more likely with addressing and values, but I don't know for sure. Sure, that would give you the chance to see them getting redrawn, if it happened slowly, which I doubt it does.

It would still not be the same as on a CRT, because at least the pixels glow all the time. So sure, they might not all be updated at once, but if the image is static you will never notice any flicker, and if it moves a lot I guess the movement distracts you from the redraw anyway. On a CRT, however, with a static image you will see flicker, because each pixel only starts to emit light when the electron beam passes through, and it fades a little until the next pass.
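A toy sketch of that buffered redraw idea (purely illustrative; real GPU drivers are far more involved, and all the names here are made up):

```python
# The GPU draws into a back buffer while the panel scans out the front
# buffer; swapping the two means no half-drawn frame reaches the screen.

class FrameBuffers:
    def __init__(self, size):
        self.front = [0] * size   # what the screen currently shows
        self.back = [0] * size    # what the GPU is drawing into

    def draw(self, pixels):
        self.back = list(pixels)  # GPU finishes a whole frame off-screen

    def swap(self):
        self.front, self.back = self.back, self.front

fb = FrameBuffers(4)
fb.draw([1, 2, 3, 4])
fb.swap()
print(fb.front)  # the completed frame is now the one being scanned out
```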
 
Another question, which I asked before but got no answer to.
If LED screens have the LEDs arranged around the outside of the screen, how is the light transported to the centre of the screen to keep the backlight even? I imagine it's not fibre optics, because that would just be insane...

Sorry to quiz you guys so much, but I am interested.

Until somebody breaks one apart (or someone's LED backlight breaks), we can only guess; the most likely answer is that "backlight" means having LEDs on the back of the panel. However, because a backlight only has to provide even lighting across the whole screen, it could in theory be done with fiber optics too; whichever would be more practical.

The former guess would mean more LEDs and a simpler design, while the latter would mean fewer LEDs (and more optical cable) and a possibly thinner backlight. A broken backlight would also reveal how it's done: if the former, there could be "dead pixels" due to a broken backlight LED; if the latter, there could be "dead lines", which would make a partial backlight failure more severe.
 
That would also mean 40ms panels would be perfectly okay for a smoothly moving SD picture (or 20ms panels for HD). But there are always people who think they notice a difference between 8ms and 16ms, while in practice what they're seeing is different distortions from the occasional timing mismatches.
That is wrong, since no matter what the frame rate, as long as there is any delay at all it will affect the next frame to some extent. With a 40ms delay on 25Hz content, the previous frame's pixel value would still partly linger during the whole time the next value is supposed to be shown, and obviously that will affect how the current frame looks.
 
If there were a 50Hz screen refresh for a 20ms LCD panel, there would be no timing distortions and the picture would be awesome, even while playing at a lowly 25fps, which would just mean every frame gets displayed twice by the GPU ;)

Almost agree. (Not quite.)
I believe this would be true except for one thing: ghosting.
I'm pretty sure the eye can only respond to 25-30 fps, but at 25 fps (using a game as an example) any fast movement would result in an object "jumping" across the screen. This would be incredibly obvious at a refresh rate of 25 fps, even 30 fps. Of course, it probably wouldn't be a real bother, but it wouldn't look as good as it could. At higher frame rates the eye just blurs the many closely positioned objects, so you get a blur, which looks natural.
So I reckon a game such as Crysis, which uses motion blur, should look really nice even at frame rates of 25-30, whereas other games would need a bit more.
This suggests other places motion blur could be used to make things look better, such as movies, although it's probably already there to some extent in the data itself at capture. Perhaps if the GPU rendered one frame ahead, compared two consecutive frames to see how and where things move, and applied blurring appropriately, you could get much nicer-looking videos of artificial content; an excellent example would be a fast-moving CAD movie. I wouldn't imagine it would be that hard for the GPU to pull off.
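That blend-two-frames idea can be mocked up in a few lines (a toy 1-D model with made-up pixel values, nothing like a real renderer):

```python
# Render one frame ahead and blend two consecutive frames so a
# fast-moving object leaves a blur trail instead of "jumping".

def blend(frame_a, frame_b, weight=0.5):
    """Linear per-pixel mix of two same-sized frames."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

# A bright object (255) jumping two pixels between frames:
frame1 = [255, 0, 0, 0]
frame2 = [0, 0, 255, 0]

blurred = blend(frame1, frame2)
print(blurred)  # half-intensity ghosts at both positions: a crude motion blur
```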



Oh, and thanks for all the other explanations. Cheers!
 
Got my screen information

DDC block report generated by SwitchResX for display
Color LCD

0 1 2 3 4 5 6 7 8 9 A B C D E F
-----------------------------------------------------
0 | 00 FF FF FF FF FF FF 00 06 10 67 9C 00 00 00 00
1 | 00 11 01 03 80 21 15 78 0A 9C 60 99 58 51 8E 26
2 | 12 50 54 00 00 00 01 01 01 01 01 01 01 01 01 01
3 | 01 01 01 01 01 01 9F 25 A0 40 51 84 0C 30 40 20
4 | 33 00 4C CF 10 00 00 18 00 00 00 01 00 06 10 30
5 | 00 00 00 00 00 00 00 00 0A 20 00 00 00 FE 00 4C
6 | 50 31 35 34 57 50 32 2D 54 4C 41 31 00 00 00 FE
7 | 00 43 6F 6C 6F 72 20 4C 43 44 0A 20 20 20 00 46

-----------------------------------------------------
Valid DDC block: checksum passed

EDID Version........1.3
Manufacturer........APP
Product Code........26524 (679C) (9C67)
Serial Number.......0

Manufactured........Week 0 of year 2007
Max H Size..........33 cm
Max V Size..........21 cm
Gamma...............2.20

DPMS Supported Features:
------------------------


Display type:
-------------
RGB color display


Input signal & sync:
--------------------
Digital

Color info:
------------
Red x = 0.600 Green x = 0.319 Blue x = 0.149 White x = 0.312
Red y = 0.345 Green y = 0.555 Blue y = 0.072 White y = 0.328

Established Timings:
--------------------

Manufacturer Reserved Timings:
------------------------------

Standard Timing Identification:
-------------------------------

Monitor Description blocks:
---------------------------
Descriptor #0 is Timing definition:
Mode = 1440 x 900 @ 60Hz
Pixel Clock.............96.31 MHz Non-Interlaced

Horizontal Vertical
Active..................1440 pixels 900 lines
Front Porch............. 64 pixels 3 lines
Sync Width.............. 32 pixels 3 lines
Back Porch.............. 224 pixels 6 lines
Blanking................ 320 pixels 12 lines
Total...................1760 pixels 912 lines
Scan Rate............... 54.72 kHz 60.00 Hz

Image Size.............. 332 mm 207 mm
Border.................. 0 pixels 0 lines

Sync: Digital separate with
* Negative vertical polarity
* Negative horizontal polarity

Descriptor #1 is Manufacturer specific data (not interpreted here)

Descriptor #2 is ASCII data:
LP154WP2-TLA1
Descriptor #3 is ASCII data:
Color LCD
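For anyone curious how SwitchResX gets the manufacturer and model out of that dump: the EDID base block packs a three-letter PNP ID into bytes 8-9 (big-endian, 5 bits per letter, 1 = 'A') and the product code into bytes 10-11 (least-significant byte first). A minimal sketch against the first bytes of the dump above (the function and variable names are mine):

```python
# Decode the manufacturer and product code fields from an EDID block.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_manufacturer(edid):
    """Unpack three 5-bit letter codes from the big-endian word at bytes 8-9."""
    word = (edid[8] << 8) | edid[9]
    codes = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") - 1 + c) for c in codes)

def decode_product_code(edid):
    """Product code at bytes 10-11, least-significant byte first."""
    return edid[10] | (edid[11] << 8)

# Bytes 8-11 from row 0 of the dump above: 06 10 67 9C
edid = EDID_HEADER + bytes([0x06, 0x10, 0x67, 0x9C])

print(decode_manufacturer(edid))            # APP -> Apple
print(f"{decode_product_code(edid):04X}")   # 9C67, the panel code above
```

That's also why the report shows both 26524 (679C) and (9C67): they are the two byte orders of the same 16-bit field.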
 
Is it true that the more VRAM a GPU has, the more/better colors show up on the screen? I heard someone say it before and am wondering if it's true.
 