I did not know that they were changeable, thank you Hmac.

To restate my question from a couple of posts up: I meant, do humans see in pixels? Not whether I can physically see the pixels on my screen. It seems like all the colors and whatnot are perfect, but in reality, don't we have to see in some type of pixel? Oh well, maybe my strange questions just shouldn't be answered, but I feel that I could possibly narrow down my options for a graphics card/Cinema Display "scientifically." It has seemed to help so far. I was also wondering if anyone has encountered a plexiglass, basically "see-thru," side wall for the MP, since it's boastfully beautiful inside, at least in my opinion.
 
Yeah, you can find a see-thru side wall for the MP, but that's a little off topic, Ben. I've seen them around, though I'm not sure where you can get one.

I'm not exactly sure what you are asking for, but I am just going to say one random thing, lol :rolleyes:.

I have a ColorVision Spyder2PRO monitor calibrator. Very good investment: easy setup, and it sets up the right colors on your screen, so that every pixel is the color it should be. But I'm just stating the random obvious, since I'm not understanding your question!! :eek:
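
If you're curious what a calibrator does under the hood, here's a rough sketch in Python. The gamma values are made-up examples, not Spyder2PRO specifics; a real calibrator measures many colour patches and builds a full ICC profile, but the basic idea is the same: remap requested values so the panel displays them correctly.

```python
# Rough sketch of the idea behind calibration (assumed example numbers,
# not Spyder2PRO specifics): measure the panel's actual response, then
# build a correction curve so requested values display correctly.
measured_gamma = 2.4   # assumption: what the probe measured on the panel
target_gamma = 2.2     # assumption: the response you actually want

def correct(v: float) -> float:
    """Remap a 0..1 value so a gamma-2.4 panel behaves like gamma 2.2."""
    return v ** (target_gamma / measured_gamma)

# An 8-bit lookup table like the one loaded into the video card
lut = [round(correct(i / 255) * 255) for i in range(256)]
print(lut[128])  # mid-grey gets nudged up to compensate (136, not 128)
```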
 
"Ahhh"med, im saying that what do humans see, regard computers for 5 seconds, i know its hard Ahmed, but like why buy an extreme monitor if its not worth it? You know?

Ahh. You buy high-end monitors for their higher resolution and truer color. If you buy a dull Dell (haha, that's funny) 15" LCD monitor, you are gonna see the pixels nice and fine from 2 feet away (yeah, trust me). So with a higher resolution you get more detail, and if you want to push more detail to that fine, expensive display of yours, we have invented graphics accelerators, which then evolved into $500 video cards with flames and cool figures on 'em :D (like the 2600XT you saw in the Mac Pro; the MB that you have uses an integrated graphics accelerator, more common in laptops).
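
To put rough numbers on the "seeing pixels from 2 feet" claim, here's a quick back-of-the-envelope in Python. The 1024x768 resolution and the ~1 arc-minute acuity limit are my assumptions, not from the post:

```python
import math

# Back-of-the-envelope: are the pixels of a 15" 4:3 LCD (assumed
# 1024x768) visible from 2 feet, given ~1 arc-minute visual acuity?
diag_in, horiz_px = 15.0, 1024
width_in = diag_in * 4 / 5            # 3-4-5 triangle: width = diag * 4/5
ppi = horiz_px / width_in             # ~85 pixels per inch
pitch_in = 1.0 / ppi                  # one pixel is ~0.3 mm wide

viewing_in = 24.0                     # 2 feet away
arc_min = math.degrees(math.atan(pitch_in / viewing_in)) * 60
print(f"{ppi:.0f} ppi; one pixel subtends {arc_min:.2f} arc-minutes")
# ~1.7 arc-minutes, above the ~1 arc-minute limit, so the grid shows
```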
 
Right! Science alert!

The answer is: "sort of" :)

Check some of these links.

Summary from deviantart.com: (my emphasis)

The average human retina has five million cone receptors on it. Since the cones are responsible for colour vision, you might suppose that this equates to a five-megapixel equivalent for the human eye.

But there are also a hundred million rods that detect monochrome contrast, which plays an important role in the sharpness of the image you see. And even this 105 MP is an underestimate, because the eye is not a still camera.

You have two eyes (no kidding!) and they continually flick around to cover a much larger area than your static field of view, and the composite image is assembled in the brain - not unlike stitching together a panoramic photo. In good light, you can distinguish two fine lines if they are separated by at least 0.6 arc-minutes (0.01 degrees).

This gives an equivalent pixel size of 0.3 arc-minutes. If you take a conservative 120 degrees as your horizontal field of view and 60 degrees in the vertical plane, this translates to ...

576 megapixels of available image data.
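
(Aside: here's the arithmetic behind that figure, sketched in Python. One caveat: the 120 x 60 degree field quoted above would actually give 288 MP; the widely cited 576 MP figure assumes a 120 x 120 degree field.)

```python
# The arithmetic behind the 576 MP figure. Caveat: the 120 x 60 degree
# field quoted above would give 288 MP; 576 MP assumes 120 x 120.
pixel_arcmin = 0.3                    # equivalent pixel size, arc-minutes
px_per_degree = 60 / pixel_arcmin     # 200 "pixels" per degree
h_deg = v_deg = 120                   # field-of-view assumption for 576 MP
megapixels = (h_deg * px_per_degree) * (v_deg * px_per_degree) / 1e6
print(f"{megapixels:.0f} megapixels")  # 576 megapixels
```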

Curiously - as a counterpoint to this - most people cannot distinguish the difference in quality between a 300dpi and a 150dpi photo when printed at 6x4", when viewed at normal viewing distances.

So: although the human eye and brain when combined can resolve massive amounts of data, for imaging purposes, 150dpi output is more than enough to provide adequate data for us to accept the result as photographic quality.
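
The print claim checks out with the same kind of arithmetic. A sketch, again assuming a ~1 arc-minute acuity limit:

```python
import math

# At what viewing distance does a 150 dpi print dot shrink below the
# ~1 arc-minute acuity limit? (assumed threshold, as in the sketch above)
dot_in = 1.0 / 150                    # size of one 150 dpi dot in inches
for dist_in in (12, 18, 24):
    arc_min = math.degrees(math.atan(dot_in / dist_in)) * 60
    print(f'{dist_in}" away: dot subtends {arc_min:.2f} arc-minutes')
# By ~24" (a normal distance for a print in hand) the dot sits right at
# the threshold, which is why 150 dpi already reads as photo quality.
```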


But don't forget that women have more cones and men have more rods - I kid you not. Therefore the ladies see colours more brightly than gents, but can't see as well when it gets dark.

The main property that defines a high quality monitor, at least in the context of Mac use, is its colour fidelity and that's where the Apple displays offer good value. If refresh rates are the most important consideration, as when selecting a gaming monitor, one needs to look elsewhere.
 
Oh, I get it. Thanks, dude. So you recommend an Apple brand? I think there are only three options of theirs: the 20", 23", and 30".

Mmm, Apple's displays haven't been updated since 2004, though the 30-inch got a spec bump in August 2006. So they are very outdated, but people still buy them; they're still very good, and they're what the majority on these forums recommends. Recently, though, since the displays really are starting to "look bad," people have been opting for high-end Dell displays that have more features for a lower price, but they look really crappy. Silver is SEXY!!! :apple::apple:
 
For comparison's sake, the 2600 is what the iMac uses. The Mac Pro will be a little faster than the iMac with that card because you have a more powerful CPU, but you'll get more or less similar results, within 5-10 fps. There are plenty of threads on iMac performance, so I suggest you look there for a good idea of the power. Generally, though, the 8800 GT is a great card that just came out. It's not the most powerful card on the market and there's better stuff coming soon, but at the moment it is very powerful. Apple is charging a bit more for it than it's worth, though: the 8800 GT is a ~$250 card and the 2600 XT is ~$120, so the real-world difference is about $130, yet Apple charges $200 for the upgrade. So... $70 more than it should be. But hey, that's better than charging $500 for a 1900 XT (Newegg hardly even sells it anymore because it is so old, but I bet you could buy it for under $150).
 
GIMME A QUADRO!!!!
 
For comparison's sake, the 2600 is what the iMac uses. The Mac Pro will be a little faster than the iMac with that card because you have a more powerful CPU, but you'll get more or less similar results, within 5-10 fps.

*snip*

Er, No.

The top-of-the-line iMac uses a 2600 Pro, and it's probably the mobile version, without dedicated cooling at that. The Mac Pro uses the dedicated PCIe 2600 XT. Two different cards. Here's a comparison of the two dedicated PCIe cards; it's for PC, but their relative performance won't change much across platforms. Expect at least a 25% improvement in the Mac Pro's favour, probably even more if, as I suspect, the iMac has a mobile chip.
 
WRONG. There is no ATI Mobility 2600 Pro, as I've said. So, again as I've said, they bought the 2600 XT, since it was available as a mobile part, put in 256MB of RAM instead of 512MB, and clocked it down.
 
Link, please, or I call BS...

This is the thread where the magic happened:

https://forums.macrumors.com/threads/339616/

At first they all said no, it was a Pro, but then later they found out the model number was that of the XT. That thread also has the info about the mobile 2600 XT, and NOT the Pro.

The model number of the iMac's video card is 0x9583. Search around for it, take some time: it is that of the 2600 XT. Also, in the thread above, one guy who had Boot Camp downloaded an app similar to System Profiler, and it stated he has a 2600 XT. There are many such signs. It's more 2600 Pro and less 2600 XT; it's right in the middle. The difference between it and the Mac Pro's card is that the iMac's is slower, but it has the same chip, and the iMac's is mobile.

I call BS :p
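
If anyone with one of these iMacs wants to check for themselves, the device ID is easy to read out under OS X with the built-in system_profiler tool. A quick sketch in Python (the 0x9583 comparison is just against the ID claimed above):

```python
import subprocess

# Read the GPU's chipset name and PCI device ID from OS X's built-in
# system_profiler, and compare against 0x9583 (the ID claimed above
# for the Mobility Radeon HD 2600 XT).
report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True,
).stdout

for line in report.splitlines():
    line = line.strip()
    if line.startswith(("Chipset Model:", "Device ID:")):
        print(line)
        if line == "Device ID: 0x9583":
            print("-> matches the ID the thread pins on the 2600 XT")
```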
 
I actually meant a proper reference, from a tech site. Not just another chunk of your speculation. Even so, from page 25:

Very confusing, but I think I get it: the video card in the iMac is the 2600 XT, but it has the clock speed of the Pro and the RAM of a Pro, and it's also a Mobility model. Either way, I'm happy. I'm not sure if I'm thinking correctly, but I believe that whatever video card is in the iMac is better than the desktop 2600 Pro Radeon (not the Mobility one).

Sorry, but your belief doesn't make it so...

It's not an XT, nor is it a hybrid card as some have claimed. If you install the proper ATI drivers, it shows up as what it is - a regular 2600 Pro. As nice as the XT would have been, the Pro still plays Bioshock and Crysis just fine, so I don't care so much.

Here is the money shot from that thread. Use ATI's own drivers under Windows and the chipset reveals its true nature.

In any case, the way Apple juggles its clock speeds, no one is going to know for sure until the iMac and Mac Pro are benchmarked. Until then, it's all just hot air.
 
Personally, I'd go for the 8800 GT. The upgrade price is actually very reasonable in comparison to the real-world cost, so it's worth the upgrade. It also beats the ATI Radeon HD 3870 hands down.
 
Well, I get it now. I did say to search that model number, and that post may just be one guy's opinion. It is a hybrid of the XT and the Pro: more of a Pro overall, but with the XT's core chip. But as you said, we have to wait for benchmarks.

BACK ON TOPIC.
 
Man, there is so much more to graphics cards than I ever imagined... gosh, I'm in over my head! But reading you guys' brawl, I'm understanding more, and also what you guys are looking for in a GC. Continue.

-Thanks!!!
Benn
 