I don't know what you've seen that is so flawed, but his methodologies are very sound.
Since you don't seem to know what "tinging of the edges" is actually called in photography, I'll clue you in on the term, "vignetting", and explain why it doesn't really affect what Btom does here. Lenses introduce several kinds of artifacts: barrel distortion, pincushion distortion, and vignetting. Vignetting is not going to affect the color cast of the screen, only the brightness, since it's an optical effect caused by the falloff of light through the lens array. Besides, it doesn't matter much if you are far enough away from the screen when taking the picture (as Btom has suggested many times), since, as you said, it really only affects the corners.
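To make that concrete, here's a toy numpy sketch (my own illustration, nothing to do with Btom's pictures) of the classic cos^4 falloff model: the same scalar multiplies R, G, and B at every pixel, so vignetting darkens the corners but can't change their color.

```python
import numpy as np

# Toy cos^4 vignetting model: brightness falloff toward the corners.
# The key point: one scalar per pixel multiplies ALL channels equally,
# so vignetting can darken the corners but cannot shift their color.
h, w = 480, 640
yy, xx = np.mgrid[0:h, 0:w]
# Normalized distance off the lens axis, exaggerated for illustration
r = np.hypot(xx - w / 2, yy - h / 2) / np.hypot(w / 2, h / 2)
falloff = np.cos(np.arctan(r * 0.8)) ** 4          # shape (h, w)

flat_grey = np.full((h, w, 3), 128.0)              # a uniform grey screen
vignetted = flat_grey * falloff[..., None]         # same factor on R, G, B

center = vignetted[h // 2, w // 2]
corner = vignetted[5, 5]
print("center RGB:", center)                       # [128. 128. 128.]
print("corner RGB:", corner)                       # darker, but R == G == B
print("corner R/G ratio:", corner[0] / corner[1])  # exactly 1.0 -> no color cast
```

The corner ends up darker, but its R/G ratio stays exactly 1.0: dimmer, not yellower.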
Also, I've never heard of a single consumer, prosumer, or professional digital camera that exhibits any kind of noticeable color gradient or cast across some portion of the image, mainly because it's essentially impossible due to, I dunno, physics. CCD/CMOS sensors are quite exact things, and with millions of pixels on a sensor, any defects in the microlenses, filters, or the actual pixel structure are distributed at random and within a fairly small variance. Pair that with other things like signal-to-noise issues, and you get fairly random uniformity issues, NOT deliberate color gradients in a camera sensor. So large color gradients that tend toward a yellowish tinge on the right or bottom-right side of a camera sensor, across many different cameras, but only those belonging to people who post images of their iMac in this thread, have NO chance of being the correct explanation; it's far more likely that these screens look just the way they do in the image (or worse).
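Just to put rough numbers on that (my own sketch, with a made-up 2% defect spread): give every pixel a small random per-channel gain error, then average the kind of regions you'd sample from a screen photo, and the regional means come out essentially identical everywhere.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor: each pixel gets a small random per-channel gain
# error (a made-up 2% spread) standing in for microlens/filter/pixel
# defects scattered at random.
h, w = 1000, 1500
true_signal = 128.0
gains = rng.normal(1.0, 0.02, size=(h, w, 3))
image = true_signal * gains

# Average 100x100 patches (10,000 pixels each), like sampling regions
# of a screen photo.
def patch_mean(img, y, x, size=100):
    return img[y:y + size, x:x + size].mean(axis=(0, 1))

regions = {"top-left": (0, 0), "top-right": (0, w - 100),
           "center": (h // 2, w // 2),
           "bottom-left": (h - 100, 0), "bottom-right": (h - 100, w - 100)}
for name, (y, x) in regions.items():
    print(f"{name:12s}", np.round(patch_mean(image, y, x), 2))
# Every region lands within ~0.1 of 128.0 on all three channels: random
# defects wash out over thousands of pixels; they can't fake a one-sided
# yellow gradient.
```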
And after seeing dozens of screens in this thread myself, having owned an iMac with the problem, AND having taken many pictures of it in various situations, I can say that what my ~$150 compact camera captures is extremely accurate once I do some normalization in Photoshop.
The grey bar test is probably more useful for identifying the issue with one's eyes alone. If a person can't see it, they may simply not be very observant, and they can then take an image for us to look at here. There is nothing wrong with doing this, because it is still very apparent whether or not there is an issue. Personally, I think the best way to test an image taken of the screen is to set the display to 50% brightness with a solid ~50% grey background, and hide the dock and all the icons. Then it's quite easy to grey balance on the dead center of the screen, over a several-thousand-pixel area, removing any possibility of high-ISO noise interfering with the grey balance. Then, of course, adjust the levels so the grey slider sits in the center of the very large column of data in the histogram (since most of the pixels are going to be about the same luminance despite uniformity issues).
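For anyone who'd rather script it than eyeball it in Photoshop, here's roughly that procedure in Python with numpy and Pillow (a sketch of the manual steps, not a polished tool; the filename and patch sizes are placeholders I picked):

```python
import numpy as np
from PIL import Image

# Rough script version of the manual Photoshop steps:
# 1) grey-balance on a several-thousand-pixel patch at dead center,
# 2) then compare other regions against it.
img = np.asarray(Image.open("imac_grey_screen.jpg").convert("RGB"),
                 dtype=np.float64)
h, w, _ = img.shape

# Center patch, 80x80 = 6,400 pixels: averaging this many samples
# drowns out high-ISO noise in the balance itself.
cy, cx, s = h // 2, w // 2, 40
center = img[cy - s:cy + s, cx - s:cx + s].mean(axis=(0, 1))

# Scale each channel so the center patch becomes neutral grey.
balanced = img * (center.mean() / center)

# Any remaining per-region color shift is the panel's, not the camera's.
def region_cast(y, x):
    r, g, b = balanced[y - s:y + s, x - s:x + s].mean(axis=(0, 1))
    return (r + g) / 2 - b   # positive = yellowish relative to neutral

print("center       :", round(region_cast(cy, cx), 2))  # ~0 by construction
print("bottom-right :", round(region_cast(h - s - 1, w - s - 1), 2))
print("bottom-left  :", round(region_cast(h - s - 1, s + 1), 2))
# A clearly positive number in one region = yellow tint on the panel.
```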
In fact, I think your argument is what is fatally flawed here. Yes, you are correct that we have no idea of the quality of the cameras being used, but that would actually only work to MASK the issue from us, not accentuate it or create an issue where there is none. Images taken at higher ISOs have increasingly inaccurate color representation and would only make a person's panel seem more uniform. I recently saw an attachment here where the screen looked very good, but viewed at 100% it had clearly been taken at a high ISO. If anything, he was told his screen was fine when it probably is not, given the incidence rate of the yellowing issue (every screen I've ever seen). Pair that with the many proclamations of "it looks worse than this in person" from those who can actually detect the issue, and I would say it's more likely that posting an image of the screen with a crappy camera actually helps to hide the problem, not accentuate it.
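To illustrate the high-ISO point (a toy model with made-up numbers for the noise level and noise-reduction strength, not how any particular camera's pipeline actually works): give a fake panel a mild yellow cast on one side, add heavy per-pixel noise, then apply the kind of crude chroma suppression JPEG engines lean on at high ISO, and the measured tint comes out at about half its true size.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake panel: neutral grey on the left, mild yellow cast on the right
# (blue channel drops by up to 10 levels).
h, w = 400, 600
img = np.full((h, w, 3), 128.0)
img[..., 2] -= np.linspace(0, 10, w)       # yellow = blue deficit

# High-ISO shot: heavy per-pixel chroma noise...
noisy = img + rng.normal(0, 12, img.shape)

# ...which the camera's JPEG engine fights by suppressing chroma
# (modelled crudely here as pulling each pixel toward its own luminance).
lum = noisy.mean(axis=2, keepdims=True)
denoised = lum + 0.5 * (noisy - lum)       # 0.5 = made-up NR strength

def right_edge_cast(im):
    r, g, b = im[:, -50:].mean(axis=(0, 1))
    return (r + g) / 2 - b

print("true cast      :", round(right_edge_cast(img), 2))       # ~9.6
print("after ISO + NR :", round(right_edge_cast(denoised), 2))  # roughly half
```

In other words, the noisy shot under-reports the tint, exactly the direction of error that would tell someone a bad panel is fine.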
Anyway, I'll soon be replacing this one with a 2010 version, so expect to hear a "fatally flawed" analysis of my old screen versus my new screen.