This doesn't bother me at all. Connecting an iPad or iPhone directly to a TV or projector via HDMI really isn't practical anymore. :apple:TV & AirPlay is how I do all TV & projector connections now. I really don't understand why some of you are so upset by this adapter. What are you going to do, get a 30 ft HDMI cable so you can sit on the couch and mirror your iPad or iPhone to a TV? Connecting a computer or even an iOS device to a TV via a cable is what I would consider legacy hardware. I'm willing to try to understand the frustration some of you have. I just don't get it.

If you're happy to watch stuff that's already lossy, then you probably don't care about the additional compression and errors that get introduced with AirPlay and wireless transmission.

Sometimes you want pixel-perfect 1:1 display... like, say, a business presentation.
 
Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on-the-fly *encoding* of the video stream, which is very computationally expensive and isn't handled in hardware except on high-end, *expensive* gear (such as top-of-the-line, dedicated PCIe GPUs).
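
If you're a developer and control the app, you don't even need mirroring: UIKit's external-display support lets you render straight to the second screen at its native mode, with no encoding step involved. A minimal sketch (modern Swift for brevity; VideoViewController is a placeholder, not a real class):

```swift
import UIKit

// Sketch: drive an external display directly instead of mirroring.
// A window placed on the external UIScreen is rendered at that
// screen's native mode - no on-the-fly re-encoding involved.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    func startObserving() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main
        ) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.attach(to: screen)
        }
    }

    private func attach(to screen: UIScreen) {
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen                            // render here, not on the device
        window.rootViewController = VideoViewController() // placeholder view controller
        window.isHidden = false
        externalWindow = window
    }
}
```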

Yeah, exactly, it re-encodes (read: loss of quality) to the TV. It's like hooking up a Super Nintendo to an HDTV :rolleyes:

TL;DR: You have no idea what you're talking about.
 
It's very difficult to unbrick a device that has no physical ports if the wireless stack is not functioning properly. That's why the Apple TV has a USB port, for service and support.

Unbricking. That's your argument for inventing the Lightning adapter?

I suppose dual-purposing the SIM card slot, or adding microSD or micro-HDMI, would also be absurd.
 
Yeah, exactly, it re-encodes (read: loss of quality) to the TV. It's like hooking up a Super Nintendo to an HDTV :rolleyes:

TL;DR: You have no idea what you're talking about.

You mean LOSS of quality. And it's nothing like hooking an SNES to an HDTV. Your math is bad: 1600x900 >> 720x480. 99% of people won't notice a difference between 1600x900 and 1920x1080. Stop pretending like you know anything.
 
Do me a solid and PM me your address so I can send you some coffee or Xanax or something.

Jesus. Chill, man. I did take some time to read through other parts of this thread and you need to seriously tone it down. Bullying people on a rumors site doesn't make you a superior being.

It makes you a, well, you get the idea.

Get ready for another long reply telling you how you don't know what you're talking about, ad nauseam. The guy can't let anyone else have an opinion contrary to his own. Sounds like a future politician to me.... :rolleyes:
 
If you're happy to watch stuff that's already lossy, then you probably don't care about the additional compression and errors that get introduced with AirPlay and wireless transmission.

Sometimes you want pixel-perfect 1:1 display... like, say, a business presentation.

Let me be honest, I really don't see much lag or loss of fidelity using AirPlay. For business purposes I'm never streaming hi-def video. I do, however, use it at home for those pesky shows on Hulu that are computer-only. Yeah, there is some degradation of video quality, but I trade a 30-40 ft HDMI cable for the ease of picking up my MacBook or iPad and firing the video to my TV without going "wait, let me go grab that cable..." I'm 32, so I remember that fuzzy '70s & '80s era TV quality. I've got to tell you, compression artifacts and buffering and 120 Hz refresh rates from 10-20 feet away from the TV really are hard to notice. I just think people are being overly picky about a feature that, to me, is more of an inconvenience to use. I don't know, just my opinion.
 
Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on-the-fly *encoding* of the video stream, which is very computationally expensive and isn't handled in hardware except on high-end, *expensive* gear (such as top-of-the-line, dedicated PCIe GPUs).

That was also my assertion in an earlier post - something along the lines of: scaling a 1600x900 image to a 1080p HDTV would cause artifacts regardless.
 
Sometimes you want pixel-perfect 1:1 display... like, say, a business presentation.

Sorry, but I have seen almost no presentations, business or otherwise, that are truly 1:1 pixel-perfect. There is always some kind of scaling or video processing in the loop. Not to mention that the majority of projectors I see businesses bring in are still only 1024x768.
 
You mean LOSS of quality. And it's nothing like hooking an SNES to an HDTV. Your math is bad: 1600x900 >> 720x480. 99% of people won't notice a difference between 1600x900 and 1920x1080. Stop pretending like you know anything.

Why are you comparing 720x480 when I explicitly mentioned 1920x1080? Apparently my math isn't as bad as your reading skills.
 
It's ridiculous that the Lightning to 30-pin adapter doesn't support video. That means I can't watch any video on my car stereo anymore. I would need to go back to the 4th gen. The new iPhone and iPod touch have been around long enough - what's up?
 
And I'd prefer one with a separate micro-HDMI port, as is done by Nokia. The best of both worlds (Lightning with its advantages + direct, cheap, high-quality, true HDMI).
MicroHDMI has three problems:
1) Density of the 19 pins creates a weak solder point for the connector on the phone;
2) Consumers often confuse the MicroHDMI and microUSB connectors; the result is that they jam the wrong cable into the connector and damage their phone;
3) HDMI requires the source device (phone) to provide power to the TV.

These are the reasons why most Android phones use MHL: same microUSB connector, and the display (or dongle) provides charging power to the phone.

Apple could provide this interface over lightning, as far as I can tell. I doubt they're saving power by compressing and decompressing their video signal.
 
It's ridiculous that the Lightning to 30-pin adapter doesn't support video. That means I can't watch any video on my car stereo anymore. I would need to go back to the 4th gen. The new iPhone and iPod touch have been around long enough - what's up?

BTW, for video playback, all iDevices starting with the iPhone 3GS / 3rd-gen iPod touch / iPad 1 are fine - all are able to properly decode even 1080p H.264 videos. (The 2009 models up until around 15 Mbps; over that, they'll stutter - for example with the 40 Mbps video track of THIS standardized test video.) Assuming you use hardware decoding, that is, play back iOS-native video containers (mov / mp4 / m4v) in either the stock Videos app or a third-party one with (enabled) hardware acceleration support.

Of course, video out is restricted in the 2009 and 2010 models (composite/component SD only in the former; up to 720p over HDMI / XGA over VGA in the latter), but at least decoding works, which means you won't encounter stuttering.

All in all, your 4th-gen device will work just fine as a video player.

(Let me know if you need to rely on software decoding.)
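
(If anyone wants to verify the hardware-decode path themselves, this is the AVFoundation route I mean - a minimal sketch in modern Swift, with the URL and host view as placeholders:)

```swift
import AVFoundation
import UIKit

// Sketch: playing an iOS-native container (mov / mp4 / m4v) through
// AVFoundation - the path that gets hardware H.264 decoding.
// The URL and host view are placeholders.
func playVideo(at url: URL, in view: UIView) {
    let player = AVPlayer(url: url)            // e.g. a local .mp4
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.bounds
    playerLayer.videoGravity = .resizeAspect   // letterbox rather than crop
    view.layer.addSublayer(playerLayer)
    player.play()
}
```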


----

BTW, I'll receive the Lightning > HDMI adapter tomorrow. Hopefully I'll be able to publish a decent evaluation and comparison to both the "old", cabled HDMI / VGA adapters and the ATV 3 tomorrow evening.

(HDMI and VGA essentially give the same image quality, BUT the VGA adapter, obviously, carries no audio and can only go up to XGA on the 4th-gen iPhone / iPod touch / 1st-gen iPad, not 720p, unlike the HDMI adapter. On newer models, both are capable of 1080p.)
 
OK, we get it - Apple engineers are very smart. Could you - as a user - name a single benefit of this new adapter other than the fact that the cable can be connected to the device regardless of plug orientation? Does that compensate for a 400% increase in the price of the HDMI adapter (compared to MHL), poor video quality, and pricey cables in general? I suspect that Apple engineers were busy all those 5 years thinking about how to achieve this technological marvel.

The advantage of Apple's weird computer-in-adapter over MHL is this:
Higher resolution and frame rate, software upgradability, and the ability to support any future display standard by changing just the adapter and not the phone.

Most Android phones, including the Galaxy S2 and S3, will only be able to get 720p60. Xperia ZLs can get 1080p30 or 1080i60. This is limited by the MHL converter on the phone. Rooting the phone may allow some Android phones to switch to the same modes as the Xperia, but I can't verify that. Regardless, no Android user/dev I've heard from could name an MHL setup that gets 1080p60. To get beyond these limits, you need to switch both your adapter and your phone, because all the MHL chips need to be replaced.

Lightning is limited by the adapter and the processing power of your phone. The adapter itself can handle 1080p60. If a better transport codec or setting is found, it can be pushed to all devices; improving the situation is then simply a firmware update. The lifespans of the adapter and the phone are not as tied together.

As of right now, the Lightning display mirroring solution on an iPad mini is inferior to the 30-pin solution on an iPhone 4S, but still better than MHL.
 
You said hooking up an SNES to an HDTV, so you're not even making sense. At least you can admit your math is bad.

 
The advantage of Apple's weird computer-in-adapter over MHL is this:
Higher resolution and frame rate, software upgradability, and the ability to support any future display standard by changing just the adapter and not the phone.

Most Android phones, including the Galaxy S2 and S3, will only be able to get 720p60. Xperia ZLs can get 1080p30 or 1080i60. This is limited by the MHL converter on the phone. Rooting the phone may allow some Android phones to switch to the same modes as the Xperia, but I can't verify that. Regardless, no Android user/dev I've heard from could name an MHL setup that gets 1080p60. To get beyond these limits, you need to switch both your adapter and your phone, because all the MHL chips need to be replaced.

Lightning is limited by the adapter and the processing power of your phone. The adapter itself can handle 1080p60. If a better transport codec or setting is found, it can be pushed to all devices; improving the situation is then simply a firmware update. The lifespans of the adapter and the phone are not as tied together.

As of right now, the Lightning display mirroring solution on an iPad mini is inferior to the 30-pin solution on an iPhone 4S, but still better than MHL.

The Apple engineer who provided information to Panic stated that Lightning's bandwidth limit is lower than what's required for 1080p.

On the other hand, according to Samsung's specs, their MHL adapter is 1080p. So, as far as interfaces are concerned, MHL comes out on top. Whether a specific phone can deliver 1080p is a different matter. If the S3 can't do it, perhaps the S4 will.
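
For what it's worth, simple arithmetic makes that bandwidth claim plausible. A quick back-of-the-envelope, assuming 24-bit RGB and a USB 2.0-class link for Lightning (my assumptions, not published specs):

```swift
// Back-of-the-envelope: uncompressed 1080p60 vs. a USB 2.0-class link.
// (24-bit RGB and the 480 Mbps figure are assumptions, not published
// Lightning specs.)
let bitsPerSecond = 1920.0 * 1080.0 * 24.0 * 60.0
print(bitsPerSecond / 1_000_000_000)   // ≈ 2.99 Gbps uncompressed
print(bitsPerSecond / 480_000_000)     // ≈ 6.2x a 480 Mbps link, before overhead
```

If those assumptions hold, compression of some kind is unavoidable, no matter how clever the adapter is.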
 
Yup. This one just has pixies inside, with notepads, who transcribe USB to HDMI.

That's about your level of understanding. There's no "transcribing" - the data is delivered pixel for pixel. The Apple hack transcodes the data, reducing the quality.

----------

Completely wrong. Either you put a static port on the device or have an adapter. Lightning is incredibly adaptable and will be able to cope with any future connectors that come along.

It can take very high power loads - something that mini and micro USB cannot at all... in fact, all the current Samsung etc. tablets and the iPad break the USB power specs.
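
For the curious, the rough arithmetic behind that claim, using the USB 2.0 spec figure and Apple's published iPad charger rating:

```swift
// Rough numbers behind "breaks the USB power specs":
let usb2Port = 5.0 * 0.5      // USB 2.0 high-power port: 5 V x 500 mA = 2.5 W
let iPadCharger = 5.1 * 2.1   // Apple's 10 W iPad charger rating ≈ 10.7 W
print(iPadCharger / usb2Port) // ≈ 4.3x what a spec USB 2.0 port supplies
```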

And as the great post above shows, Samsung has exactly the same issue. Someone needs to crack one open - it will have some form of processor.

A lot of devices still require specific ports and power... you cannot charge the Nexus 7 properly from other USB devices. Don't think that Apple is the only one that wants you to buy their own adapters and chargers. Just because they use a low-spec microUSB that allows a simple charge... the rest of the functionality requires device-specific adapters.

Video over MHL is connection-agnostic - it's not even tied to a specific type of port - so companies are sticking whatever port they want on their devices.

"The Galaxy S III is the first MHL device to use a different connector - one that is not compatible with all other MHL devices and accessories.[9] Consumers assumed that the MHL branding ensured compatibility so they were surprised when MHL accessories did not work with the Galaxy S III[10] (the incompatibility is due to the S3 using an 11 pin connector rather than a 5 pin)."

You don't understand even the basics here. The standard devices are just delivering the pixel data verbatim. The Apple hack is decoding compressed video data, causing artifacts and output lag. It's a horrible, horrible solution.
 
The advantage of Apple's weird computer-in-adapter over MHL is this:
Higher resolution and frame rate, software upgradability, and the ability to support any future display standard by changing just the adapter and not the phone.

Most Android phones, including the Galaxy S2 and S3, will only be able to get 720p60. Xperia ZLs can get 1080p30 or 1080i60. Regardless, no Android user/dev I've heard from could name an MHL setup that gets 1080p60.
You're creating a fallacy by comparing the theoretical abilities of Lightning to the actual abilities of shipping MHL-based products. You also assume that an iPhone 5 can render and compress these larger resolutions, which may not be so.

Your claim of software upgradability is also dubious, because you typically can't reprogram a hardware-based decoder to support new codecs or double the bandwidth.
 
Nice, can't wait!

It's much better than expected. Some preliminary results, all with the iPhone 5 (will also test with the iPad 4):

- when mirroring,

a, it mirrors at 60 fps, as opposed to the 30 fps-only AirPlay mirroring over an ideal connection (cabled ATV3 and a quality Wi-Fi access point some two metres from the iPhone 5)

b, the lag, while somewhat higher than with an iPad 3 + (old) VGA / (old) HDMI adapter combo (83 ms / 5 frames when playing back a 60 fps video, vs. 50 ms / 3 frames), is still considerably lower than with the iPhone 5 + cabled, ideal ATV3 combo (150 ms / 9 frames). That is, the lag is a bit over half that of standard wireless AirPlay mirroring, while about 66% more than with the old adapters on previous-gen iDevices (here, iPad 3). Whether this is sufficient for fast action games like Real Racing 2/3 - well, it's on the borderline, I'd say.

- playback of iOS-native(!!!) videos is done on the adapter itself. (In several of my previous posts here, I've emphasized I didn't know whether this would be the case.) This means videos are played back at true 1080p (I've tested this with rescharts, of course) and at their full quality, making use of the entire screen real estate. 60 fps decoding is supported (when driven from, say, It's Playing - you can't sync 60 fps videos to the stock Videos app), as are Apple's closed captions.

I'll go on reporting / making all this into a full article with comparative lag shots (120 fps videos and the like) so that anyone can verify all this. I'll, for example, test how much better this H.264 decoder is than the one in the single-core A5 of the ATV3, which is known for not really liking 60p (or >20 Mbps 24p/30p) content. Hope it's better.
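
(For anyone checking my numbers, this is simply how the frame counts convert to milliseconds at 60 fps:)

```swift
// How the frame counts above map to milliseconds at 60 fps:
let msPerFrame = 1000.0 / 60.0          // ≈ 16.7 ms
print(5.0 * msPerFrame)  // ≈ 83 ms  (Lightning AV adapter)
print(3.0 * msPerFrame)  // ≈ 50 ms  (old 30-pin adapters)
print(9.0 * msPerFrame)  // ≈ 150 ms (iPhone 5 + cabled ATV3)
```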
 
Thanks a lot! Looking forward to your article.
 
That's about your level of understanding. There's no "transcribing" - the data is delivered pixel for pixel. The Apple hack transcodes the data, reducing the quality.

Well, you seem to be quite a knowledgeable fellow. I assume you have examined the signal transmission between the iOS device and the adapter to confirm that it does compress the picture and it's not due to some crappy upscaler on a cheap TV.

Also, it should be rather easy for an expert like you to enhance the firmware that gets uploaded to the adapter (or however this thing works) and the kernel extension that handles things on the phone, to output uncompressed 1080p on jailbroken devices. If the processor in the adapter is too slow, you can probably mass-produce your own, faster version.

And then you can go out and implement 4K over MHL on the Galaxy S4. On iOS devices, all it would take is maybe a faster processor in the adapter - on the Galaxy S4, probably a new connector and a new adapter, as compared to the Galaxy S3.
 