The obvious thing to do is load some test patterns, such as the AVS709HD collection, and see how it does on resolution, overscanning, etc. 1080p doesn't come up on their display, which they don't bother to name. The first thing I would do is try it on a different display.
 
adapters

I'm tired of buying adapters and cables; I have boxes full of them.

VGA/DVI/HDMI and every combination going to and from PowerBook, MacBook Pro, Air, iPad, Mac mini, iMac, iPhone, etc.
 
The truth is:
1) Nobody has done an objective comparison of the video quality or the data transmission rate compared to other products and connectivity options. Yet everyone has an opinion.

2) At least for Apple, size matters, as they are trying to make thinner, stronger, and lighter phones. I think in order to cram so much into a little space, the 30-pin connector had to go. In my humble opinion, creating a lighter, stronger, and thinner phone demonstrates more innovation than creating a larger screen.

3) Apple designed this connector for future products and specifications neither you nor I are aware of. Yet so many here seem to second-guess their decision without knowledge of why Apple may have made certain trade-offs.

4) It would be really interesting to get statistics on how many people use their iPhone connected by a cable to view video on a TV. To be honest, I know many people with iPhones and iPads and have never heard of anyone using this function.
Personally, when I give presentations with video from my iPad, I stream it wirelessly. What's your guess, if you were going to predict the future: more wire, or more wireless? Of course, nobody has yet shown any evidence that the video is poor with the wire option.

The greatest thing is there are so many options out there; if you don't like it, don't use it. If the other options are so great, why would you waste your time here?
I would be the first to admit I am a fanboy loser who drinks Apple Kool-Aid by the gallon. As much of a loser as I am, I spend hours surfing Apple news sites. I can't imagine the level of despair a human would sink to, to waste their limited precious time on this planet hanging out on these sites and be an Apple basher.
Of course we all know the real reason you guys are here
Just saying.
 
4) It would be really interesting to get statistics on how many people use their iPhone connected by a cable to view video on a TV. To be honest, I know many people with iPhones and iPads and have never heard of anyone using this function.
Personally, when I give presentations with video from my iPad, I stream it wirelessly. What's your guess, if you were going to predict the future: more wire, or more wireless? Of course, nobody has yet shown any evidence that the video is poor with the wire option.

Well, the main limitation would be that the majority of places are still using VGA projectors, so you would still be limited by their 800x600 resolution anyway. I am using AirServer to mirror my iPad to my MacBook (and to the projector), so I don't need an iPad adaptor (but I do use a Mini DisplayPort to VGA cable).

I also have a 30-pin to VGA adaptor for my iPad for when I do not have time to set up (I use my own router because of my workplace's network restrictions).

What does that make me? :p
 
Apple is pushing Lightning for profits, not for the benefit of the user. I still haven't heard of a practical advantage over the old connector, or even traditional USB 3, as far as an iOS device is concerned.

What profits? Selling Lightning to 30-pin adaptors? That would be a stupid move.

Lightning is better than micro-USB in many ways, and one of them is that it won't have to be changed for a long, long time to keep up with the new technology. Oh, USB also can't carry video signals directly, can it?

----------

I'm tired of buying adapters and cables; I have boxes full of them.

VGA/DVI/HDMI and every combination going to and from PowerBook, MacBook Pro, Air, iPad, Mac mini, iMac, iPhone, etc.

Apple's PCs really sucked with their video outputs for a long time. Mini-VGA? WTF? Mini-DVI? Who comes up with this stuff? At least they have gone standard now with DisplayPort and (stupid) HDMI.
 
I disagree.

If anything, we now know why the lightning adaptors are so expensive. Not so much because Apple wants to leech us of our money, but because they really are that costly to produce.

Also, Apple is pragmatic above everything else (at least where managing supply-side costs is concerned). They are not going to waste money outfitting every connector with a processor and RAM if there isn't a legitimate need, especially when a cheaper alternative presents itself.

They clearly have some great plans for it down the road. Just as iTunes would go on to become a great selling point for the iPhone and iPad, the Lightning connector may well prove to be such in due time. :)

I agree; there's something about all this we don't understand. It makes no sense to create a cable like this, and make such dramatic changes to replace the 30-pin connector, unless there's something in the future.

The fact is that some of us don't want to let go of Composite and Component video, and our analog RCA audio connectors (I still need these!), even while living in the digital era. I have both Component and HDMI cables/adapters for my iPad 3, and don't want to let go yet.

I guess in Apple's plan there's more in place than an HDMI cable. It may be AirPlay, as the original post highlights. It would be nice to have a small AirPlay receiver (the size of a Flash drive, but it connects to an HDMI port on a TV or Projector) to carry around in case we need to connect to a TV.

I'm sure Apple will solve the low-quality/artifact issues with this cable sometime soon. This is no ordinary cable, and the Lightning port is still a Pandora's box to most of us. From what I was able to gather, there are lots of possible uses for this adapter; we just need to wait.
 
This AC comment on Panic's blog potentially sheds light on all of this... sounds right to me.

https://www.panic.com/blog/2013/03/the-lightning-digital-av-adapter-surprise/#comment-16841

Airplay is not involved in the operation of this adapter.
It is true that the kernel the adapter SoC boots is based off of XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, there’s no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low level data transfer and HDMI output, but that’s about it. I wish I could offer more details then this but I’m posting as AC for a damned good reason.
The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this to specifically shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns in regards to what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software- not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.
It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
This system essentially allows us to output to any device on the planet, irregardless of the endpoint bus (HDMI, DisplayPort, and any future inventions) by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.
Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather then ROM), updates **will** be made available as a part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.
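If the engineer's description is accurate, the whole path can be sketched roughly like this (illustrative Python only; none of these names correspond to real Apple code, and the "encoder" is a crude stand-in for the hardware H.264 encoder):

```python
# Illustrative sketch of the pipeline the anonymous engineer describes:
# the host re-encodes its screen output, ships packetized data across
# the serial Lightning bus, and the adapter SoC decodes it for HDMI.
from dataclasses import dataclass

@dataclass
class Packet:
    seq: int
    payload: bytes  # a chunk of (notionally H.264) bitstream

def encode(frame: bytes) -> bytes:
    # Stand-in for the hardware encoder: a LOSSY step, for illustration.
    return frame[::2]

def decode(payload: bytes) -> bytes:
    # Stand-in for the adapter SoC's decoder.
    return payload

def host_side(frames):
    """iDevice side: encode each frame and packetize it for the bus."""
    return [Packet(seq, encode(f)) for seq, f in enumerate(frames)]

def adapter_side(packets):
    """Adapter SoC: reorder, decode, and 'present' each frame over HDMI."""
    return [decode(p.payload) for p in sorted(packets, key=lambda p: p.seq)]

frames = [bytes([i]) * 8 for i in range(3)]
shown = adapter_side(host_side(frames))
# What reaches the display is NOT bit-identical to the original frames;
# the extra encode/decode round trip is where the reported artifacts
# would come from.
assert shown != frames
```

The key structural point is that the host never emits an HDMI signal at all; it only ever emits packets on a serial bus, which is why any endpoint (HDMI, VGA, or something future) can hang off the other side.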
 
This AC comment on Panic's blog potentially sheds light on all of this... sounds right to me.

https://www.panic.com/blog/2013/03/the-lightning-digital-av-adapter-surprise/#comment-16841

AC's comment makes a lot of sense.
This would mean that there's virtually no digital/analog conversion going on inside iOS devices, with the exception of the audio DAC for the speaker and headphones/mic.
Analog video components are prone to causing interference and take up space, as they require extra shielding.
I think it's a great idea. I want a Lightning to RS-232C cable to be able to configure routers.
 
And that is why I am not drawing a million dollar salary at Apple. This is serious far-sighted stuff if it is true. Now this is the Apple I know! :D
 
I like how some folks here are claiming that this is future-proofing when in fact Apple routinely obsoletes its older products for arbitrary reasons anyway.

I really like the people saying HDMI is stupid. I don't understand how you can say that given that it works immediately and delivers much better video quality than this cable is offering.
 
And that is why I am not drawing a million dollar salary at Apple. This is serious far-sighted stuff if it is true. Now this is the Apple I know! :D

It's not that great. Apple's planned obsolescence of products is significantly more common than changes to video cable standards. When was the last time you wanted to output video from your iPad / iPhone and the destination socket was too exotic? It just doesn't happen. Most companies follow international standards; only Apple doesn't. (See: mini-USB in Nokia and Samsung vs. Apple's proprietary connectors, non-standard headphone jacks in Apple microphones, etc.)

It's an over-engineered solution, designed around producing a proprietary format to generate additional expensive cable sales. It also allows removing hardware from each iPhone unit to save production costs. This reduction is not passed on to the consumer. Yes, it's probably 10c worth, but over 100m products that's some serious money.

Not that I'm surprised; Apple has been doing this since the first iPod I bought. Each one used to come with a plug, USB cable, docking station, etc. The price stayed the same, but all the extras became $24.99 additional purchases. That's business, I suppose.
 
Well, in theory at least, this allows any lightning adaptor to remain compatible with any Apple product sporting that port, compared to Samsung or any other company, which would effectively obsolete their earlier port standards should they jump to the next global standard in future.

Correct me if I am wrong, but wouldn't this mean that in 3 years' time, my Lightning to HDMI adaptor should still work on any iPhone or iPad with a Lightning port? The same goes for any accessory.

Conversely, if Samsung were to adopt a new port due to technological advancements (say nano-USB or USB10 or whatever, I am pulling names out of thin air here), would this not render their product incompatible with their earlier offerings? So a cable for their Note3 (should one come out) may not work on an earlier Note2 and so on.

Likewise, since the Lightning adaptor is in essence a mini-CPU, Apple could theoretically program it to perform functions normal cables can't. Which may be a good or bad thing, I suppose, depending on the function in question.

It's clearly aimed at keeping you locked into the Apple ecosystem, but we all knew that the day we bought into Apple's walled garden, did we not? :eek:
 
It's not that great. Apple's planned obsolescence of products is significantly more common than changes to video cable standards.

The only comparable standard - MHL - has its compatibility and obsolescence issues too - http://en.wikipedia.org/wiki/Mobile_High-Definition_Link#The_Galaxy_S_III_MHL_port_controversy - and doesn't actually define a standard connector. Samsung use a proprietary connector with extra pins so that they can have simultaneous MHL and USB links.

Incidentally, if you read the Wikipedia article you'll see that MHL is not simply HDMI-over-micro-USB, so MHL-to-HDMI cables will also have chips in them, which may or may not be "computers" (using programmable microcontrollers rather than hard-wired circuitry is common practice... probably not the same processing power as the Apple cable, though).

From Apple's point of view, using the 'soft' adapter will have its advantages: the Lightning-to-VGA adapter (still essential if you want to give presentations) probably re-uses much of the same hardware/firmware design, and if MHL- or DisplayPort-enabled televisions take off it won't be a major re-design to produce a Lightning-to-MHL adapter.

The other issue that nobody has mentioned is that HDMI and HDCP carry royalties, and by putting all this in the adapter Apple may only need to pay royalties on the adapters, rather than on every iDevice sold.
 
AC's comment makes a lot of sense.
This would mean that there's virtually no digital/analog conversion going on inside iOS devices, with the exception of the audio DAC for the speaker and headphones/mic.
Analog video components are prone to causing interference and take up space, as they require extra shielding.
I think it's a great idea. I want a Lightning to RS-232C cable to be able to configure routers.
It's such a great idea that it is seriously questionable on many levels. Yes, it is very flexible; yes, it is able to adapt to future 3rd-party connectors/standards, and it could adapt to almost every protocol, because it's simply a bus that is not bound by such things.

What you get in return, though, is a product that is not on par with even current technology needs (being able to display unaltered images/videos/photos on your TV, projector or otherwise).

From an engineering standpoint, you start at a base level and create a solution that is at least able to meet today's requirements without too many sacrifices. Apple seems not to have had quality at the top of their priority list, but rather design, flexibility and "future-proofness", for obvious marketing reasons.

They are not even able to transfer 1080p or better (the iPad is already at 2048x1536 resolution) without some serious video or image quality degradation TODAY.

The H.264-compressed 1080p video on your iPad is re-encoded by the iPad to be able to deliver it over Lightning to the adapter, which then decompresses that already-compressed picture all over again. The sheer dumbness of this approach is totally out of this world.

Now, the question is: why do they even need to introduce a 2nd encode/decode step? Because bandwidth on Lightning is not enough; it's a serial bus that is already bandwidth-constrained by even today's requirements. And why? Because Apple wants one bus to use across all of their products, and some coming products are going to be even smaller, thinner and lighter.

And if it's that seriously bandwidth-constrained, the quality can only get better if they reduce bandwidth needs further, which can only happen with the use of, e.g., H.265/HEVC (which requires a lot less bandwidth than H.264 at comparable quality), but it will take at least 1-2 years for Apple to be able to include encoders for that.

And even that would still not be on par with a raw video or other picture data transfer. So much for future-proof.
 
If any of this speculation is true about how this cable operates, it raises two important questions.

1. Why does the "new" cable have less bandwidth than the "old" cable?

2. If Airplay is the bridge, how long will it be until the cable itself is eliminated from that system?

It may be this cable is a "behavioral bridge" for people who expect cables to interconnect devices, until the next thing where all devices will be interconnected wirelessly with a dongle on the "dumb" devices.

Rocketman
 
Some developer already chimed in that the output is full 1080p, indistinguishable from previous versions.

Well, then, he may not know much about video (en)coding, compression artefacts, AirPlay etc., or, as has been explained in several posts of mine,

1, the adapter does decode iOS-native video formats instead of relying on the much inferior mirroring (as does the ATV) AND

2, the dev referred to this, not the AirPlay-based mirroring.
 
If any of this speculation is true about how this cable operates, it raises two important questions.

1. Why does the "new" cable have less bandwidth than the "old" cable?

2. If Airplay is the bridge, how long will it be until the cable itself is eliminated from that system?

It may be this cable is a "behavioral bridge" for people who expect cables to interconnect devices, until the next thing where all devices will be interconnected wirelessly with a dongle on the "dumb" devices.

Rocketman
Well, AirPlay doesn't need to be involved; they are simply using the encode hardware (already present in, e.g., the A5X/A6X of the iPad; it's done by the ImgTec IP encoder) and the decoder, which also seems to be in the adapter itself. Future ImgTec chipsets have even better encoders to cope with 4K video (they are already out), but it would still not be a raw data stream then.

The thing is, I'm not even sure why they need that additional encoding/decompression step, because they could just transfer the raw compressed data (a simple bitstream copy mechanism) to the adapter, which would then do the H.264 decompression itself. The decompression hardware is there, so why not use it like that? Even if they are bandwidth-constrained (which we cannot prove), 256MB of memory should be more than enough to store the OS, the compressed data received from the iPad, and several seconds of the decompressed image before it gets sent to the display. The whole approach just doesn't make any sense to me. Why worsen the quality if you don't really need to?
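To make the suggestion concrete, here is a rough sketch (plain Python; all names and the link-budget figure are made up for illustration) of the bitstream-copy path being proposed, with re-encoding kept only as a fallback:

```python
# Sketch of the proposed "bitstream copy" alternative: if the source is
# already H.264 and fits the link budget, ship the existing bitstream to
# the adapter untouched instead of decoding and re-encoding it.
# All names and numbers here are hypothetical.
from dataclasses import dataclass

LINK_BUDGET_MBPS = 300  # made-up figure for the Lightning link

@dataclass
class Source:
    codec: str
    bitrate_mbps: float
    chunks: list  # pre-compressed bitstream chunks

def reencode(chunk: bytes) -> bytes:
    # Stand-in for the lossy hardware re-encode path.
    return b"reenc:" + chunk

def frames_for_adapter(src: Source) -> list:
    if src.codec == "h264" and src.bitrate_mbps <= LINK_BUDGET_MBPS:
        # Passthrough: no generational loss; the adapter decodes once.
        return list(src.chunks)
    # Fallback (screen mirroring, other codecs, too-fat streams):
    # the extra encode step that degrades quality.
    return [reencode(c) for c in src.chunks]

movie = Source("h264", 25.0, [b"chunk0", b"chunk1"])   # typical iTunes movie
mirror = Source("raw", 995.3, [b"frame0"])             # live screen output
assert frames_for_adapter(movie) == [b"chunk0", b"chunk1"]  # untouched
assert frames_for_adapter(mirror) == [b"reenc:frame0"]      # lossy path
```

Whether the real firmware could branch like this is exactly the open question in this thread; nothing public confirms or rules it out.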
 
If any of this speculation is true about how this cable operates, it raises two important questions.

1. Why does the "new" cable have less bandwidth than the "old" cable?

Power usage? Size? Price? Licensing issues? Incompatibility with ARM chipsets (see Intel's Thunderbolt)? Nothing is really known. Only one thing is certain: the anonymous Apple engineer stated the following (original post):

"The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. "


2. If Airplay is the bridge, how long will it be until the cable itself is eliminated from that system?

It may be this cable is a "behavioral bridge" for people who expect cables to interconnect devices, until the next thing where all devices will be interconnected wirelessly with a dongle on the "dumb" devices.

It simply makes it possible to operate without an external (or P2P, on-the-fly created) Wi-Fi network and all the issues involved in creating it (speed, latency etc.), without the need to configure anything. In that regard, the cabled solution is still superior to the "legacy" wireless AirPlay solution up to now (see connection to the Apple TV). Of course, it's still not known whether the adapter supports direct iOS-native MP4 / M4V / MOV file decoding to offer great quality at least with those kinds of media, as the ATV does. Hopefully it does.

I'll promptly order an adapter (hope it's already on sale here in Europe) and report back.

----------

The thing is, I'm not even sure why they need that additional encoding/decompression step, because they could just transfer the raw compressed data (a simple bitstream copy mechanism) to the adapter, which would then do the H.264 decompression itself. The decompression hardware is there, so why not use it like that? Even if they are bandwidth-constrained (which we cannot prove), 256MB of memory should be more than enough to store the OS, the compressed data received from the iPad, and several seconds of the decompressed image before it gets sent to the display. The whole approach just doesn't make any sense to me. Why worsen the quality if you don't really need to?

It's not known whether this is done or not, at least for iOS-native local or networked (HTTP / iTunes Home Sharing) video files. Hope Apple didn't screw this up.

Other container formats (still with non-Hi10P H.264) have never been supported by Apple in any way. This means, for example, that the H.264 video track in MKV files can surely not be transferred to the adapter by third-party apps. (After all, Apple has never let devs use hardware acceleration for those kinds of containers, while it's certainly there, and jailbroken media players like XBMC and RushPlayer+ do make use of it.)

----------

Well, Airplay doesn´t need to be involved, they are simply using the encode hardware (that is already present in the e.x. A5X/A6X of the iPad, it´s done by the ImgTec IP encoder) and the decoder which also seems to be in the adapter itself. Future IMGTec chipsets have even better encoders to cope with 4K video (they are already out) but it would still not be a raw data stream then.

I think they are using the HW encoder right now; after all, it's able to encode the camera / screen stream at 1080p30. They may have chosen AirPlay on top of that for handshaking etc., to make development faster.
 
What a mess. This is somehow commendable? They took something so simple, over-engineered it, and downgraded the whole user experience.
 
I'll promptly order an adapter (hope it's already on sale here in Europe) and report back.

That would be great and I would be very thankful if you could post some results back in this thread or somewhere else. I only own an iPad 3 so I cannot verify it myself.

I think they are using the HW encoder right now; after all, it's able to encode the camera / screen stream at 1080p30. They may have chosen AirPlay on top of that for handshaking etc., to make development faster.
Yes, it's capable of that, indeed. I can also play very high bitrate 1080p@59.94fps on my iPad 3, and I was very pleasantly surprised when I tested the limits of the decoder hardware. That's one of the reasons why I haven't upgraded yet. Not worth it from my standpoint. Even more so if these issues are verified.

No matter how they do it currently, if it leads to such serious quality issues, they are doing it wrong. For example, Air Video has extremely good quality over WiFi and you cannot see any image degradation, because it uses a different approach. It supports native H.264 decoding (through bitstream copy) and re-encoding for files Apple doesn't want to handle natively (like MKV containers, etc.).

If a 3rd-party developer can do it with Apple's hardware, why can't Apple?
 
I'm sure Apple will solve the low-quality/artifact issues with this cable sometime soon. This is no ordinary cable, and the Lightning port is still a Pandora's box to most of us. From what I was able to gather, there are lots of possible uses for this adapter; we just need to wait.

Now that Apple themselves have stated that Lightning is simply not able to transfer at 746.50 Mbit/s (the lowest-quality (4:2:0) uncompressed 1080p30 HDMI; with better color resolution it'd be 995.33 Mbit/s (4:2:2) / 1.49 Gbit/s (4:4:4)), there is no way they'll ever manage to make it work without the en/decompression step. That is, it'll never have the same quality as just outputting a pure HDMI or VGA signal, as was done before Lightning.
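The arithmetic behind those figures is just pixels x frame rate x bits per pixel (active pixels only; a real HDMI link also carries blanking overhead, so the wire rate is even higher):

```python
# Checking the uncompressed-1080p30 bandwidth figures quoted above.
def hdmi_rate_mbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    """Raw (active-pixel) video payload rate in Mbit/s."""
    return width * height * fps * bits_per_pixel / 1e6

# Chroma subsampling averages 12 bits/pixel at 4:2:0,
# 16 at 4:2:2, and 24 at full 4:4:4 (8 bits per channel).
print(hdmi_rate_mbps(1920, 1080, 30, 12))  # 746.496  -> ~746.50 Mbit/s
print(hdmi_rate_mbps(1920, 1080, 30, 16))  # 995.328  -> ~995.33 Mbit/s
print(hdmi_rate_mbps(1920, 1080, 30, 24))  # 1492.992 -> ~1.49 Gbit/s
```

All three values match the numbers in the post, which is why any serial link well below ~750 Mbit/s forces a compression step for even the cheapest-looking 1080p30 signal.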

All this on current hardware. There may be a Rev. 2 some day, with faster hardware (in both the iDevice and the adapter itself) allowing for higher data rates. But that won't help owners of current Lightning models, stuck with slower, non-upgradeable hardware.

----------

For example, Air Video has extremely good quality over WiFi and you cannot see any image degradation, because it uses a different approach. It supports native H.264 decoding (through bitstream copy) and re-encoding for files Apple doesn't want to handle natively (like MKV containers, etc.).

If a 3rd-party developer can do it with Apple's hardware, why can't Apple?

By the last sentence, you mean hardware, direct decoding of H.264 video tracks in non-iOS-native containers like MKVs via AirPlay? It can't be done from third-party apps either, only inside the iDevice. (With App Store apps, by silently remuxing the MKV to an iOS-native container in the background and playing those video chunks; with JB'n apps, by directly using the hardware decoder without having to rely on background remuxing.)
 