Discussion in 'Buying Tips and Advice' started by ajo, Feb 3, 2006.
Will the iMac screens ever be HD in the near future?
The cinema displays are lovely.
They are. Even the 17" is > 1280x720 so it is fully capable of 720p, which is officially HD.
You must be one of those "if it ain't 1080p, it ain't HD" people.
I'm one of those people. Just get a 24" display to share with your iMac.
Screen size does not make for HD. If that were the case, any widescreen TV would be HD compatible.
There's a big difference between 1280x720 and 720p.
The p stands for progressive scan.
I could be wrong, though!
1280x720 IS 720p, and 1920x1080 is 1080p. As for the comment that any widescreen is HD: not true, because HD is a resolution, not an aspect ratio, meaning you can still receive HD on a 4:3 television.
If I understand correctly, anything over a certain resolution, like you say, is HD. 720p is HD; in fact, it's the only HD being broadcast right now and will more than likely be the only one for years to come. 1080p is great for downloads such as trailers and movies, but broadcast television won't be going 1080p anytime soon.
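To make the resolution-vs-aspect-ratio point concrete, here's a minimal sketch. The thresholds follow the standard 720p (1280x720) and 1080p (1920x1080) definitions; the function names are just illustrative:

```python
# Minimal sketch: classify a display mode against the standard HD formats.
# 720p = 1280x720, 1080p = 1920x1080; aspect ratio is independent of this.
from fractions import Fraction

def hd_class(width: int, height: int) -> str:
    """Return the highest standard HD format the panel can fully display."""
    if width >= 1920 and height >= 1080:
        return "1080p-capable"
    if width >= 1280 and height >= 720:
        return "720p-capable"
    return "below HD"

def aspect_ratio(width: int, height: int) -> str:
    """Reduced aspect ratio, e.g. 1440x900 -> '8:5'."""
    r = Fraction(width, height)
    return f"{r.numerator}:{r.denominator}"

# The 17" iMac panel (1440x900) exceeds 1280x720, so it can show 720p,
# even though its aspect ratio isn't 16:9.
print(hd_class(1440, 900), aspect_ratio(1440, 900))    # 720p-capable 8:5
print(hd_class(1920, 1200), aspect_ratio(1920, 1200))  # 1080p-capable 8:5
```

This is why a 4:3 set can still receive HD: the classification depends only on pixel counts, not on the ratio between them.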
Are the iMac screens digital?
What about the Rev. B's?
Huh? Are you asking if LCD screens are digital? Um... yeah. Digital describes the signal, not the screen per se.
I was wondering what the internal screen connections were, are they VGA (analog) or DVI (digital) or none of the above?
I know my Rev. B has a mini VGA out, but does that mean my internal screen is connected with VGA? I know recently (I think with the iSight G5) they switched to mini DVI out.
Integrated displays are almost always all-digital. It wouldn't make sense to convert to analog and back again when you can just wire the whole thing locally. The only reason LCDs ever had a VGA port was for compatibility and interchangeability with CRT monitors.
So, in other words, the internal connection is digital on all the iMacs.
It wouldn't be like Apple to screw everyone over and cripple the internals, though.
Can I say...
I'm not impressed with the 1080p that I've seen, which is admittedly only the trailers on Apple's site; the shadows seem really blocky.
Maybe it's just me being a photographer and having seen awesome-quality photos?
If you are going to use a statement like [sic] "infact" you should at least have the correct facts. 720p is NOT the only HDTV broadcast standard being used now. IN FACT, both NBC and CBS use 1080i. FOX and ABC use 720p. Those are the true facts.
Broadcast TV won't be going 1080p anytime soon, true. But Blu-Ray and HD-DVD will, so 1080p will become more relevant as these formats increase in popularity.
Most of us do not have access to £1000+ digital cameras, pro setups, huge canvas backgrounds, etc.
Must admit it's very frustrating having a 24" Dell monitor and not being able to take advantage of it properly; my PC can only do the 480 QT movie trailers, and even then it stutters at times.
One reason why I want an Intel Power Mac, another being WOW.
Maybe they will release a true HD 30-inch iMac with the top-end 2.16GHz Core Duo for the 30th anniversary.
Worth selling my monitor for.
Personally, I always thought the whole 'HD' thing was a big marketing gimmick. Also, someone told me that you can tell the screen is connected digitally because the refresh rate box in the display prefs says N/A; digital displays don't have a refresh rate, they only refresh what's changed.
Are you sure about this? On one hand, I know that LCDs don't have the "flicker" that CRT screens do, but on the other hand, I seem to recall a patent being filed very recently for a technology that would only refresh certain parts of the screen.
That's what someone told me. They might be wrong, but if so, why isn't there a refresh rate setting for digital displays? It could be that its refresh rate is variable, but I really don't know.
From what I can find, LCDs don't have a "refresh rate"; rather, it's called "pixel response time" and is measured in milliseconds (not hertz).
...But for all general purposes, it's the same thing.
So the N/A in System Prefs will tell you if you have an LCD attached, but won't tell you whether it's digital or not (I think).
Pretty much correct. The lack of a refresh rate setting is more a result of the LCD display technology than the fact that it's digital (though they are interlinked in several ways). The refresh rate is an element of analog technology that is irrelevant for LCD displays in most cases, especially when connected over DVI, because the signal for DVI is regulated independent of OS settings.
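The hertz-vs-milliseconds distinction above is easy to relate numerically: a refresh rate is frames per second, and a response time is how long a pixel takes to transition. A minimal sketch (the panel figures are just illustrative examples, not specs for any particular display):

```python
# Minimal sketch relating refresh rate (Hz) to frame period (ms).
# A 60 Hz signal delivers a new frame roughly every 16.7 ms; a panel whose
# pixel response time exceeds that can't finish transitioning between frames.

def frame_period_ms(refresh_hz: float) -> float:
    """Time between successive frames, in milliseconds."""
    return 1000.0 / refresh_hz

def can_settle(response_ms: float, refresh_hz: float) -> bool:
    """True if pixels can complete a transition within one frame period."""
    return response_ms <= frame_period_ms(refresh_hz)

print(round(frame_period_ms(60), 1))  # 16.7
print(can_settle(8.0, 60))            # True  (an 8 ms panel keeps up)
print(can_settle(25.0, 60))           # False (a 25 ms panel would smear)
```

So the two numbers measure different things, but they meet at the frame period: the response time tells you whether the panel can keep up with the signal's refresh rate.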
Telling if you have a digital connection is as simple as looking at the cable. If it's a white DVI connection, it's almost certainly digital. If the cable has blue caps, it's not a digital connection. If you have a notebook PC, 95% of them have solid digital connections all the way through because it's cheaper.
Yes, you can easily tell by the cable, but what about an internal connection like with the iMac? That's the tricky part.
It's not, though. It's digital. Why would anyone spend more money to degrade the quality and add another part to a computer?
I don't know. Why did Apple put a mini VGA out when DVI would have been better?