Who says Apple TV does 1080i???
It quite clearly does not.

AppleTV most definitely does do 1080i. Read the Tech Specs.

Look at the formats supported under "TV Compatibility".

The AppleTV can output 1080i, but what it doesn't do is support 1080i (or 1080p) material. That's where a lot of confusion comes in. I don't have an Apple TV, but I'm guessing that when you have an AppleTV set to output 1080i to a TV that can receive a 1080i signal, you're set. Any 720p24 video will be scaled up to 1920x1080 and sent out as a 1080i signal.
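To picture what that output stage involves, here's a toy sketch (my own illustration, nothing to do with Apple's actual scaler, which will use far better filtering): vertically scale a 720-line frame to 1080 lines, then split it into the two 540-line fields of an interlaced signal.

```python
# Toy model of 720p -> 1080i: upscale vertically, then split into fields.
# Nearest-neighbor is used only to keep the sketch short; real hardware
# uses proper filtered scaling.

def upscale_lines(frame, target_lines):
    """Vertically upscale a frame (a list of rows) by nearest-neighbor."""
    src_lines = len(frame)
    return [frame[i * src_lines // target_lines] for i in range(target_lines)]

def split_fields(frame):
    """Split a progressive frame into top (even-row) and bottom (odd-row) fields."""
    return frame[0::2], frame[1::2]

# Stand-in 720-line frame: each "row" is just its source line index.
frame_720 = list(range(720))
frame_1080 = upscale_lines(frame_720, 1080)
top_field, bottom_field = split_fields(frame_1080)

print(len(frame_1080))                    # 1080
print(len(top_field), len(bottom_field))  # 540 540
```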

It would probably look like ABC/Fox/ESPN output from a HD cable box at 1080i.

ft
 
I think you're right in the sense that people don't/won't wake up and change their habits in such a conscious way so quickly. But, it's happening, and it's major. For the first time people are online more than they watch television, and it's due to content. People now read news and consume media online more than in print. These are such HUGE indicators that a revolution is going on, and the computer is at the core/heart of it all. I find that hugely fascinating, but also as a consumer, hugely exciting in that all this content I have stored on my computer will be available on my television in my living room. Apple has a reputation for innovating, but not really inventing, and that's what this device is all about.

I see your point and definitely agree that there is a whole new ballgame just waiting to take shape -- I guess I'm just largely disappointed by the way Apple has dipped their foot in with the iTV. I was hoping/expecting so much more from them, especially since it was announced so long ago and they had so much time to stick in those magic "surprise features".....they just never came to light unfortunately. Only in my dreams I guess.
 
Point #1: Never heard of an up-converting DVD player, eh? And yes, I keep saying HDMI, but what I'm really referring to are the HDTV-compatible inputs found on HDTVs... be they DVI or HDMI or component... not S-Video or composite...

Point #2: Whoever those 'many people' you've been speaking to are, you really have to ditch them, and fast, because they simply have no idea what they're saying. :lol:

Dave

I also want to add: if you have a nice plasma, take what I have, a Pioneer Elite, for example, then all of the "up-converting" is performed by the TV.

Sure, this is not the case on low-end plasmas and LCDs, but then again you get what you pay for.

Also, I think most people are on the same page: when we refer to HDMI, we're referring to actual HDMI. Composite is composite. DVI is DVI.

My TV/Monitor included the following: Independent Dual HDMI, Single DVI-D, Component RGB HV, Composite, S-Video, SR (In/Out) and RS-232C. My TV/Monitor is basically as HD as you can get, but no matter how I spin it, the S-Video connector is not digital, yet it's there.
 
Before I start, I do not claim to be an expert in HDMI, and I've discovered that there are many different views on this technology. For the last couple of months I have been buying new equipment for my home theater, and my original plan was to go all digital (which meant HDMI).

This is when I first learned of many of the problems found in HDMI 1.0, 1.1, 1.2 and 1.2a. Everything I've read and everyone I talked to said that the promise of HDMI is welcome, but it just wasn't there yet. Basically, any device with HDMI before 1.2 most likely won't work well with other HDMI devices (so that eliminates any device that came with HDMI before 2006: most DVRs, cable and satellite boxes, and sadly Verizon's FiOS (according to early reports the FiOS boxes use HDMI 1.1!)).

Now these weren't salespeople that I was talking to, but actual home theater installers. When I told them my plan, they strongly recommended that I run a Component RGB HV cable (I also ran HDMI).

Currently my TV/Monitor and A/V receiver support HDMI 1.3 (maybe 1.3a, I'm not sure), my HD Digital Cable/DVR box supports HDMI 1.0, and I don't know what my future Apple TV's HDMI version is.

The funny thing is, the main reason I want to go with HDMI is that the spec doesn't just carry audio and video; it also supports the Consumer Electronics Control (CEC) channel, which means (if implemented properly) I can point my A/V receiver's remote at my TV in one room and communicate with the A/V receiver in my media closet in a different room.

Oh yeah, can anyone name the first consumer HDMI device? It's the Sony PlayStation 3.
I have two devices connected to my TV through HDMI. The first is my wife's Macbook (DVI-HDMI) and the second is my Sony HD-DVR. Both work fine with HDMI. I don't know what problems these installers are referring to, but for me, I plug it in, I get video, I get audio, it's good. What can go wrong?

Also, there were many, many consumer products with HDMI way before the PS3. The PS3 may have been the first HDMI 1.3 device, but it was in no way the first HDMI device.

ft
 
I have two devices connected to my TV through HDMI. The first is my wife's Macbook (DVI-HDMI) and the second is my Sony HD-DVR. Both work fine with HDMI. I don't know what problems these installers are referring to, but for me, I plug it in, I get video, I get audio, it's good. What can go wrong?

Also, there were many, many consumer products with HDMI way before the PS3. The PS3 may have been the first HDMI 1.3 device, but it was in no way the first HDMI device.

ft
Yes, two devices, but only one HDMI device.

The HDMI spec for video is DVI, which is why DVI-to-HDMI cables are so cheap: basically, all it is is a DVI cable with an HDMI end. If you have patience you can easily make one yourself, though of course it will cost more than buying one retail. So your laptop is actually speaking DVI to DVI.

The problems with HDMI mostly appear on the audio side of things, which is why the most common symptom found in HDMI installs is a great picture but no sound. It appears that all of the security embedded in HDMI is located on the audio side.
 
Yes, two devices, but only one HDMI device.

The HDMI spec for video is DVI, which is why DVI-to-HDMI cables are so cheap: basically, all it is is a DVI cable with an HDMI end. If you have patience you can easily make one yourself, though of course it will cost more than buying one retail. So your laptop is actually speaking DVI to DVI.

The problems with HDMI mostly appear on the audio side of things, which is why the most common symptom found in HDMI installs is a great picture but no sound. It appears that all of the security embedded in HDMI is located on the audio side.

Well, my Sony HD-DVR has HDMI and my audio and video are fine when connected to my Sharp LCD. Neither are HDMI 1.3, I'm sure.

For me, HDMI just works. Now, I am planning on swapping my Sony DVR from HDMI to component, but it's more of a quirk with the Sony using HDMI. Has nothing to do with HDMI in general.

If you want to know, the Sony DVRs always revert to an "Auto HDMI" setting when you power off. It means that the DVR will pass through the original format, be it 480i, 720p, 1080i. It's annoying because you have to wait a few seconds when you switch channels. I like to just leave it set at 1080i, but you have to use component for the setting to "take". But I digress ...



ft
 
AppleTV most definitely does do 1080i. Read the Tech Specs.
Thanks ft.

I had thought that references to 1080i were simply referring to TVs which can accept AppleTV signals. I thought the AppleTV did no upscaling and no interlacing. I really wonder how well it does both.

I don't have an Apple TV, but I'm guessing that when you have an AppleTV set to output 1080i to a TV that can receive a 1080i signal, you're set. Any 720p24 video will be scaled up to 1920x1080 and sent out as a 1080i signal.
That would be quite a quality loss though wouldn't it?

For example, my parents' new TV accepts 1080i signals (or 1080p24), but scales to 720p (it's a 768-line TV, like most current HD plasmas). How much quality is lost when the AppleTV upscales to 1080, interlaces the signal, and sends it to the TV, which de-interlaces it and downscales to 720p?
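Out of the same curiosity, here's a crude way to see why the extra hop can hurt. This is my own toy model using nearest-neighbor line mapping (real scalers filter properly, so treat the numbers as illustrative only): compare mapping 720 source lines straight to a 768-line panel against going 720 -> 1080 first and letting the TV take 1080 -> 768.

```python
# Toy comparison: one-step vs two-step vertical rescale (nearest-neighbor).

def map_lines(src, dst):
    """For each destination line, the source line it samples (nearest-neighbor)."""
    return [i * src // dst for i in range(dst)]

direct = map_lines(720, 768)     # TV scales the 720p input directly

step1 = map_lines(720, 1080)     # AppleTV upscales to 1080 lines first
step2 = map_lines(1080, 768)     # TV then downscales to its 768-line panel
round_trip = [step1[j] for j in step2]

mismatches = sum(1 for a, b in zip(direct, round_trip) if a != b)
print(mismatches)  # nonzero: some output lines now sample a different source line
```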

I mean this as a serious question - I'll configure AppleTV the moment it's released here in Oz. I assume I should tell the AppleTV to output at whatever res the original file is...? I guess waiting a week may be quite useful :)

ps. Their old plasma accepted 1080i but scaled to 480p.
 
For example, my parents' new TV accepts 1080i signals (or 1080p24), but scales to 720p (it's a 768-line TV, like most current HD plasmas). How much quality is lost when the AppleTV upscales to 1080, interlaces the signal, and sends it to the TV, which de-interlaces it and downscales to 720p?
I'm sure you could do it that way, but it would be counter-intuitive.

Most HDTVs out there today can accept both 1080i and 720p. So if you have a 720p plasma (i.e. 768p - darned marketers), you set the aTV to 720p. If you have a 1080p display, you set the aTV to 1080i. It's as easy as that. No point in setting the aTV to output 1080i to a 768p plasma.

ft
 
I'm sure you could do it that way, but it would be counter-intuitive.

Most HDTVs out there today can accept both 1080i and 720p. So if you have a 720p plasma (i.e. 768p - darned marketers), you set the aTV to 720p. If you have a 1080p display, you set the aTV to 1080i. It's as easy as that. No point in setting the aTV to output 1080i to a 768p plasma.

ft
I agree not to do that (though I am intellectually curious what would happen to quality in the upscale/interlace/de-interlace/downscale chain). I just can't see any advantage to EVER setting the AppleTV to output 1080i, since every TV that can accept that could accept 720p natively, and most TVs would downscale to 720p anyway.

When you say "set the aTV to 1080i. It's as easy as that." - I'm wondering how easy 'that' is and which aTV settings will get best results.
 
Well, my Sony HD-DVR has HDMI and my audio and video are fine when connected to my Sharp LCD. Neither are HDMI 1.3, I'm sure.

For me, HDMI just works. Now, I am planning on swapping my Sony DVR from HDMI to component, but it's more of a quirk with the Sony using HDMI. Has nothing to do with HDMI in general.

If you want to know, the Sony DVRs always revert to an "Auto HDMI" setting when you power off. It means that the DVR will pass through the original format, be it 480i, 720p, 1080i. It's annoying because you have to wait a few seconds when you switch channels. I like to just leave it set at 1080i, but you have to use component for the setting to "take". But I digress ...
ft
I know this statement will open up a can of worms, but everything I've read states that 720p looks better than 1080i, so you may want to set it to that.
 
You do know that both 480i and 480P have a 720x480 resolution, right? If 480P is supported on the AppleTV, there is no reason what so ever that 480i can't be supported.
Sure there is. The difference between 480i and 480p is marked. There's such a thing as integrity.
Ummmm, Component video is analog... Why would a set need a Digital to Analog Converter on an analog input?
What I meant to say was progressive scan, not digital. I mistakenly oversimplified.
That's pretty much it: DVD, cable & game, so TV makers have in most cases built TVs with 3 or 4 HD inputs (at the most)...
There was a time when TVs just had one or two. It's not a component manufacturer's responsibility to plan around how many free inputs you have any more than it's a peripheral manufacturer's responsibility to gauge the number of USB ports you might have.
I just can't see any advantage to EVER setting the AppleTV to output 1080i - since every TV that can accept that could accept 720p natively, and most TVs would downscale to 720p anyway.
The advantage is that if your TV *is* 1080 natively, then it avoids the downsampling to 720p on 1080 content. Other than that, it's unimportant.
 
I never understand people who want to Handbrake their DVDs, let alone their entire DVD collection. I love movies enough to give them my undivided attention for two hours, enjoying them in front of large TV and surround sound, and for that, Handbraking seems totally unnecessary.

How about watching movies on iPod or laptop? No thanks. I have better ways to entertain myself than watching a movie while on the road.

That's fine and good, but better is fairly relative. I'd like to read a book, unfortunately reading in a moving vehicle proves detrimental to my gastrointestinal tract.

I could always just harass fellow travelers. It's far more interesting than any movie. :)
 
But AppleTV can't supply 1080 content.
It can't display video at 1080 lines (currently). But video is only one element of the Apple TV: drawing the interface at 1080 lines will be superior to upsampling a 720 interface, both for pixel size and for overall smoothness. So when you're listening to music, scrolling through trailer previews, or displaying photo slideshows of high-resolution images, you'd still benefit over 720p on a sufficiently large TV.

But a video-only argument, put another way: why would you want to upsample twice on typical content? Anything not natively 720p (for example, current iTunes videos; DVD content; EyeTV recordings) would be upsampled by the Apple TV to 720p, which would then be handed off to the TV to upsample to 1080i. Why not skip the middle man and just have the Apple TV upsample to 1080i and display natively?
 
It can't display video at 1080 lines (currently). But video is only one element of the Apple TV: drawing the interface at 1080 lines will be superior to upsampling a 720 interface, both for pixel size and for overall smoothness. So when you're listening to music, scrolling through trailer previews, or displaying photo slideshows of high-resolution images, you'd still benefit over 720p on a sufficiently large TV.
What are you basing this on? What do you mean "drawing at 1080" - it's still a conversion of the 720 signal.

I've read many reviews saying that upscaling in the best DVD players gives a better result than doing it in the TVs, but every review has said this is because the quality of the signal processing is higher in those DVD processors.

ie: The actual process is the same.
But a video-only argument, put another way: why would you want to upsample twice on typical content? Anything not natively 720p (for example, current iTunes videos; DVD content; EyeTV recordings) would be upsampled by the Apple TV to 720p, which would then be handed off to the TV to upsample to 1080i. Why not skip the middle man and just have the Apple TV upsample to 1080i and display natively?
Well...

Firstly - I agree that you shouldn't upsample twice. Upscaling (or downscaling) will cause problems similar to audio concatenation problems... errors just stand out.

However - most HDTVs are 768 lines at the moment, so the TV is not going to be upscaling to 1080 lines.

Also, forgetting 1080 vs 768, no plasma TV displays interlaced signals. They're all progressive, they all de-interlace the signal first. If the AppleTV interlaces its signal, and the TV can determine which method of Interlacing was used, it'll deinterlace the signal cleanly. But why bother?
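For the curious, the clean case described above, where the two fields pair back up into the original frame, is "weave" deinterlacing, and it's trivial to sketch (a hypothetical illustration, not any particular TV's implementation):

```python
# "Weave" deinterlacing: interleave two fields back into one progressive frame.
# This only works cleanly when both fields came from the same original frame.

def weave(top_field, bottom_field):
    frame = [None] * (len(top_field) + len(bottom_field))
    frame[0::2] = top_field      # top field fills the even rows
    frame[1::2] = bottom_field   # bottom field fills the odd rows
    return frame

top = ["T"] * 540
bottom = ["B"] * 540
frame = weave(top, bottom)
print(len(frame))   # 1080
```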

(EDIT: CRT HDTVs are apparently mostly 1080i, so they upconvert and interlace if they get a 720p signal. The plasmas do the opposite by downconverting and deinterlacing if they get a 1080i signal)

All in all - you've convinced me that either the TV or AppleTV should do ALL the work on upscaling. That's very useful to me. But I'm leaning towards letting the TV do all the work since
1) This takes some pressure off the AppleTV video card (though I think this is probably unimportant)
2) If I set the AppleTV to my TV resolution (720), it'll have to convert 480p30 stuff to 720p24. Whereas the TV will convert 480p30 to 720p30.
3) The TV is actually 768 lines. I imagine that converting 480p (or 576p) video to 768 directly is better than having the AppleTV convert to 720, and then having the TV convert that to 768. (not so sure about this!!!)

Can you give me any reason that having the AppleTV do the upscaling will result in better quality than having the TV do it (given that I have a good TV too! which makes a difference).
 
I may be playing devil's advocate here, but when was the last time Mossberg gave an Apple product a bad review?

Don't get me wrong, I usually agree with him, but does anyone else think he comes across to the general public as an unabashed fanboy?

If someone who reviews many products and is honest about them all... and one of the makers of products he reviews seems to get consistently good reviews ... then there are two possible conclusions:

One, he is biased and a 'fanboy' of that manufacturer
... or
Two, that manufacturer consistently makes great products.

Personally the latter case seems in evidence here ... but maybe I am biased :)
 
What are you basing this on? What do you mean "drawing at 1080" - it's still a conversion of the 720 signal.

I've read many reviews saying that upscaling in the best DVD players gives a better result than doing it in the TVs, but every review has said this is because the quality of the signal processing is higher in those DVD processors.

ie: The actual process is the same.
Well...

Firstly - I agree that you shouldn't upsample twice. Upscaling (or downscaling) will cause problems similar to audio concatenation problems... errors just stand out.

However - most HDTVs are 768 lines at the moment, so the TV is not going to be upscaling to 1080 lines.

Also, forgetting 1080 vs 768, no plasma TV displays interlaced signals. They're all progressive, they all de-interlace the signal first. If the AppleTV interlaces its signal, and the TV can determine which method of Interlacing was used, it'll deinterlace the signal cleanly. But why bother?

(EDIT: CRT HDTVs are apparently mostly 1080i, so they upconvert and interlace if they get a 720p signal. The plasmas do the opposite by downconverting and deinterlacing if they get a 1080i signal)

All in all - you've convinced me that either the TV or AppleTV should do ALL the work on upscaling. That's very useful to me. But I'm leaning towards letting the TV do all the work since
1) This takes some pressure off the AppleTV video card (though I think this is probably unimportant)
2) If I set the AppleTV to my TV resolution (720), it'll have to convert 480p30 stuff to 720p24. Whereas the TV will convert 480p30 to 720p30.
3) The TV is actually 768 lines. I imagine that converting 480p (or 576p) video to 768 directly is better than having the AppleTV convert to 720, and then having the TV convert that to 768. (not so sure about this!!!)

Can you give me any reason that having the AppleTV do the upscaling will result in better quality than having the TV do it (given that I have a good TV too! which makes a difference).



Look, I may be totally wrong here, but as far as I can see this is not quite correct (I'm not singling out GregA; I just picked up the drift of his post). Many people are saying 720p is the same as 1080i and that it is only a matter of conversion in either direction.

I shoot 1080i video (USA) and can pull up a frame on screen. It is 1920 x 1080 pixels and looks fabulous. Now, when viewed on a 1080i screen it is interlaced, i.e. 60 fields per second, and is an awesome picture, quite good at tracking motion without too much pixelation.

Now, if I have 720p material on my Mac screen, it is only 1280 x 720 pixels. It is also a great picture, but if I resize it to 1920 x 1080 I can see it is obviously not as sharp or as clear as native 1920 x 1080 footage (any more than upsizing any image can be). The 720p is viewed on a TV in a progressive way, so there are 30 frames per second, and it tends to be better at showing fast motion than the 60 fields per second.

So if motion is important, 720p seems better, but to my eyes 1080i is simply a better picture, and the reason seems simple to me ... it has more data.
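The "more data" point is easy to put rough numbers on (my own arithmetic, assuming US 60 Hz rates; it ignores compression and everything else that matters in practice):

```python
# Pixels per frame: a full 1080 frame (both fields woven) vs a 720 frame.
frame_1080 = 1920 * 1080   # 2,073,600 pixels
frame_720 = 1280 * 720     # 921,600 pixels
print(frame_1080 / frame_720)   # 2.25x the pixels per frame

# Pixels per second at 60 fields/s (1080i) vs 60 frames/s (broadcast 720p):
px_1080i = 1920 * 540 * 60   # 62,208,000 pixels/s
px_720p = 1280 * 720 * 60    # 55,296,000 pixels/s
```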

The progressive form of 1080 must be something to behold, but I have no such camera as of yet and my HDTV isn't up to it, but it must look amazing.

Having said that ... 720p will be just fine for most purposes and my ATV arrives tomorrow and I have a few samples to test.

One thing though ... making anything from a standard definition DVD into 720p is a waste of time since the 480i or 480p material would simply be upscaled in size. Obtaining 720p from your own HD material or Apple iTunes ... (hopefully) which was made from true HD source or film is another story, well worth obtaining. :)

Just my input ... I welcome those who know more chipping in. :)
 
Apple TV

A lot of people complain about everything here. Sure, you're upset that whatever it is isn't suitable for you, but just don't buy it.

I do like the Apple TV, but if only it had these features:

5.1 surround sound
A movie rental store on the iTunes Store (movies available outside the US would be a start).

That's pretty much about it. If I could rip my existing DVDs and get 5.1 sound, I'd be happy with just that. Maybe in the future that is something Apple is thinking of including.

As for 1080p, people have already pointed out that this doesn't seem to be a priority. Most people would have a 720p TV, and the difference between the two isn't that much. Maybe this is a hint that iTunes movies/TV shows will be available in 720p soon? It's certainly more feasible than downloading a 1080p file.
 
A lot of people complain about everything here. Sure, you're upset that whatever it is isn't suitable for you, but just don't buy it.

I do like the Apple TV, but if only it had these features:

5.1 surround sound
A movie rental store on the iTunes Store (movies available outside the US would be a start).

That's pretty much about it. If I could rip my existing DVDs and get 5.1 sound, I'd be happy with just that. Maybe in the future that is something Apple is thinking of including.

As for 1080p, people have already pointed out that this doesn't seem to be a priority. Most people would have a 720p TV, and the difference between the two isn't that much. Maybe this is a hint that iTunes movies/TV shows will be available in 720p soon? It's certainly more feasible than downloading a 1080p file.

This is interesting regarding 5.1
http://www.thismuchiknow.co.uk/?p=24


Damn still no FedEx delivery here yet ...
 
Remote Volume

This isn't exactly surprising, as there's probably no standard way to change the TV volume via some input cable, and doing it directly via the remote would mean you would have to program your Apple remote like you would do a universal remote -- i.e.: a pain in the butt.

Umm, what about adjusting the output volume from the Apple TV device itself. I have no problems with either the Elgato eyeHome (an Apple TV forerunner) or Mac Mini running eyeTV or Media Central plugged in to a TV using a remote tied to the device.

I would be very surprised if Apple missed this bit out as usually they think very intuitively about the user interface.
 
Umm, what about adjusting the output volume from the Apple TV device itself. I have no problems with either the Elgato eyeHome (an Apple TV forerunner) or Mac Mini running eyeTV or Media Central plugged in to a TV using a remote tied to the device.

I would be very surprised if Apple missed this bit out as usually they think very intuitively about the user interface.

As has been mentioned earlier in this thread, attenuating the audio line levels at the source is only going to reduce the signal-to-noise ratio and be generally detrimental to sound quality (you might hear it as static, crackling, or 60 Hz background hum). It would also be annoying to have to keep track of where you changed volumes: at the source, at the receiver, or at the TV.
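A toy calculation makes the SNR cost concrete (the numbers here are made up purely for illustration): if the source cuts its output by 20 dB and the downstream noise floor stays put, you lose 20 dB of signal-to-noise.

```python
import math

def to_db(ratio):
    """Convert an amplitude ratio to decibels."""
    return 20 * math.log10(ratio)

signal = 1.0    # nominal full line level (arbitrary units, assumed)
noise = 0.001   # downstream noise floor (arbitrary units, assumed)

snr_full = to_db(signal / noise)        # ~60 dB at full output
snr_cut = to_db((signal / 10) / noise)  # ~40 dB after a 20 dB cut at the source
print(round(snr_full), round(snr_cut))
```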

Apple didn't miss out, they realized that it's not a practical solution.
 
I agree not to do that (though I am intellectually curious what would happen to quality in the upscale/interlace/de-interlace/downscale chain). I just can't see any advantage to EVER setting the AppleTV to output 1080i, since every TV that can accept that could accept 720p natively, and most TVs would downscale to 720p anyway.

When you say "set the aTV to 1080i. It's as easy as that." - I'm wondering how easy 'that' is and which aTV settings will get best results.

Here are my thoughts on the 720p/1080i subject. Most new plasmas and LCDs are indeed 720/768p native. For these sets, setting the aTV to 720p would theoretically offer the best visuals. For 1080p native sets (of which I own one), I find that I get better visuals from my Sony DVR when I set it to 1080i. I can compare/contrast since ABC and Fox are 720p. On my TV, 1080i output with 720p material is better than 720p output with 720p material. I attribute this to the Sony DVR having a better scaler than my TV.

As for the ease of setting the aTV to various output resolutions, I do believe that it would be very easy. In reading the manual, there's a set-up menu and one of the choices is to set the output format of the aTV. Simple, just like any other CE device these days.

One last thing. There are many HDTVs out there that do not have 720p compatibility. Among them are CRT RPTVs and the 2005 Panasonic Plasmas. These TVs accept 1080i.

ft
 
Here's what I'm banking on:

When Apple releases the 6GB full-screen iPod, they will also announce HD content on iTunes that is subscription-based, and you will be able to download it straight to your aTV or to your computer.

I think Apple will surely release a rental service in time, or so I think/hope

In regards to a DVR, I just don't see Apple doing this, since it would hurt the sales of their TV shows on the iTMS.

Just a thought, take it as you will :eek:
 