...Why switch to Lightning before they had SoCs with faster serial interfaces that could provide a better user experience though? It makes very little sense. I have trouble imagining what compelled them to switch to Lightning in a holiday quarter when there was no SoC support to make it an obvious improvement compared to the alternatives or even its predecessor. Now I guess we have to sit back and wait for the A7 to see what an uncrippled version of Lightning might look like?
Rather the contrary. From Apple's perspective, it makes a lot of sense.

They wanted the Lightning connector out fast because they wanted everyone to adopt it; they wanted to force it on everyone. What better way to do it than with the new iPhone 5 and the new, better iPad 4, which were the crown jewels of Apple's lineup back then.

Now they even have the iPad mini, and there's no way for a customer or peripheral maker to go back, so peripheral makers are forced into this solution. Yes, there are adapters, but adapters are always worse than using something natively.

I am very sceptical that Apple can fix this with just a SoC upgrade.
 
256MB?!??!?!?!!
That's how much RAM my iMac G5 had.

The same amount of RAM as my first Mac... nuts! :eek:

That was the size of the hard disk drive in my first Mac that had an HDD, a Performa. And its RAM was originally 5 MB, which I upgraded to 20 MB. Those were the days. Programs had to be concise to run on those machines.

Going to try my 30 Pin to HDMI connected to the 30 pin to Lightening.....


Wish me luck lol

What is Lightening?

I am beginning to wonder if Apple is taking a loss for each Lighting cable they produce!

LED lighting?

Listen sonny!

Back in my day the only thing our cables had in them was Lead and Copper, and that is how we liked it!!!

Copper? How about steel?
 
Yet more proof that Lightning is a junky and expensive connector. It's 2013 and it can't output 1080p?

This is the year of the Android.

Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on the fly *encoding* of the video stream, which is very computationally expensive, and not handled in hardware except on high end, *expensive* hardware (such as top of the line, dedicated PCI-e GPUs).
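To put some rough numbers behind the point above, here is a back-of-the-envelope sketch of why mirroring forces on-the-fly compression. The resolutions and the USB 2.0-class link rate are published figures; nothing here is a measurement of the actual adapter.

```python
# Back-of-the-envelope: why mirroring requires on-the-fly encoding.
# Figures are standard published rates, not measurements of the adapter.

def raw_bitrate(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in bits per second."""
    return width * height * fps * bits_per_pixel

# Uncompressed 1080p60 is ~2.99 Gbit/s -- far beyond what a
# USB 2.0-class serial link (480 Mbit/s) can carry.
print(raw_bitrate(1920, 1080, 60) / 1e9)  # ~2.99 (Gbit/s)

# Even the adapter's 1600x900 mirroring cap is ~2.07 Gbit/s uncompressed,
# so the frames must be compressed (e.g. to H.264) before crossing the cable.
print(raw_bitrate(1600, 900, 60) / 1e9)   # ~2.07 (Gbit/s)
```

Decoding an already-compressed movie is the cheap direction; it is producing a compressed stream in real time from raw framebuffer contents that costs.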
 
It's a ridiculous solution - did they realise that Lightning wouldn't be able to output data at the rate that HDMI demands at any point during its development?

Occurs to me that Bluetooth was not intended for audio use (much less good-sounding stereo) during its development. IIRC, audio was kinda wedged into BT after introduction, sounded like crap at first (choppy low-fidelity single-channel), and over time the protocol evolved to address audio as a primary purpose instead of just a hacked-on afterthought.

BTW: HDMI is a pretty fast interface, not just high data rate but pumped thru several wires in parallel. Full blown HDMI support quite reasonably exceeds the purpose of the interface. Reviewing the capability, intentions, and implementation methinks :apple: has done quite a good job with this interface. It's a dongle to support something few people will need for a device which must sacrifice everything which isn't a common use to most users.
 
Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on the fly *encoding* of the video stream, which is very computationally expensive, and not handled in hardware except on high end, *expensive* hardware (such as top of the line, dedicated PCI-e GPUs).

This is correct! With the 30-pin connector, iOS devices could mirror at full quality but could only play video at 720p because of the on-the-fly *decoding* in the device. With Lightning, iOS devices play video at 1080p and mirror at up to 1600x900 because of the on-the-fly *encoding*.
 
This is correct! With the 30-pin connector, iOS devices could mirror at full quality but could only play video at 720p because of the on-the-fly *decoding* in the device. With Lightning, iOS devices play video at 1080p and mirror at up to 1600x900 because of the on-the-fly *encoding*.

So to cut a long story short, the Lightning adaptor actually works better than the old 30-pin adaptor for HDMI, and we are really all just throwing a hissy fit over nothing?
 
Can't resist a "in my day" post...

Compared to my first (original release) IBM PC, this little "connector" has 100x the video capacity, 16,000x the RAM, and somewhere around 1,000,000x the bandwidth - all in a small matchbox instead of several shoeboxes, at 2% the price.

Rule of thumb when tempted to complain about how new technology sucks: be amazed that it works at all.

----------

So to cut a long story short, the Lightning adaptor actually works better than the old 30-pin adaptor for HDMI, and we are really all just throwing a hissy fit over nothing?

Bingo!

Most posters aren't responding to the technology, they're responding to a few out-of-context comments construed to fit whatever whiny ":apple: sux" narrative they've chosen.
 
This is correct! With the 30-pin connector, iOS devices could mirror at full quality but could only play video at 720p because of the on-the-fly *decoding* in the device.

Not exactly. 1080p H.264 decoding was supported by the A4 already, and the A5 even handles 1080p60. Heck, even the 3GS can decode 1080p pretty well (but doesn't support HDMI/VGA output, of course).
 
Lightning doesn't do USB 3.0, so you can safely assume that the bandwidth of the port is less than 5 Gbit/s.

HDMI runs at 10.2 Gbit/s. At the bare minimum, Lightning has less than half the bandwidth required to properly support a full HDMI link. There simply aren't enough pins, and data doesn't move fast enough to make this happen.
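The gap being argued here can be sketched numerically. The link rates below are the published nominal figures (HDMI 1.3 aggregate TMDS rate and USB 2.0 high-speed signaling); the adapter's real throughput is not publicly documented, so the comparison is illustrative only.

```python
# Nominal link rates (published figures; actual adapter throughput unknown).
HDMI_1_3_GBITS = 10.2   # full HDMI 1.3 aggregate TMDS rate
USB2_GBITS = 0.48       # USB 2.0 high-speed signaling rate

# A USB 2.0-class serial link carries under 5% of a full HDMI link,
# which is why a raw HDMI pass-through over Lightning is implausible.
shortfall = HDMI_1_3_GBITS / USB2_GBITS
print(f"Full HDMI needs ~{shortfall:.0f}x the bandwidth of USB 2.0")
```

Even if Lightning ran somewhat faster than USB 2.0, it would still be well short of carrying uncompressed full-rate HDMI, so some form of compression on the device side is unavoidable.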

Are you assuming Lightning runs at the USB 2 bit rate? (480 Mbit/s)

As Thunderbolt shows, it is possible to move data at extremely high speed, so it is possible Lightning has the bandwidth.

Lightning has a lot of secrets.
 
Can't resist a "in my day" post...

Compared to my first (original release) IBM PC, this little "connector" has 100x the video capacity, 16,000x the RAM, and somewhere around 1,000,000x the bandwidth - all in a small matchbox instead of several shoeboxes, at 2% the price.

Rule of thumb when tempted to complain about how new technology sucks: be amazed that it works at all.

----------



Bingo!

Most posters aren't responding to the technology, they're responding to a few out-of-context comments construed to fit whatever whiny ":apple: sux" narrative they've chosen.

You seem to forget that there was already flawless HDMI/VGA output before Lightning. This is why we're b1tching - a step in the wrong direction, you see.

----------

Are you assuming Lightning runs at the USB 2 bit rate? (480 Mbit/s)

As Thunderbolt shows, it is possible to move data at extremely high speed, so it is possible Lightning has the bandwidth.

It doesn't - see the update of the original article.

----------

Are you assuming Lightning runs at the USB 2 bit rate? (480 Mbit/s)

As Thunderbolt shows, it is possible to move data at extremely high speed, so it is possible Lightning has the bandwidth.

It doesn't - see the update of the original article.
 
Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on the fly *encoding* of the video stream, which is very computationally expensive, and not handled in hardware except on high end, *expensive* hardware (such as top of the line, dedicated PCI-e GPUs).

Most ARM SoCs include on-the-fly encoding of video streams. Imagination, ARM, etc, all sell the modules required, and Apple will incorporate one (probably the Imagination one) in their Ax SoCs. It'll be used for video recording, and also this video transmission. It would be interesting to see if recording a video via the camera AND connecting via the HDMI adaptor is possible.

However the quality is probably not close to a decent software implementation on a PC.
 
So to cut a long story short, the Lightning adaptor actually works better than the old 30-pin adaptor for HDMI, and we are really all just throwing a hissy fit over nothing?

I think the poster is talking about playing back videos, which is a decoding issue.

What this issue is about is the fact that the phone/tablet has to ENCODE the video to send it to the cable, which then has to DECODE it. The encoding is lossy, and because it has to happen in real time, it is done in hardware on the device using a less-than-perfect implementation - more than adequate for recording videos with the device's camera, but not for encoding text, menus, UI elements, etc. - the things you may use when your device is connected via the HDMI adaptor.

The lucky case, as I mentioned earlier, is if the device can send pre-existing video streams (e.g., from the internet, storage, etc.) straight to the adaptor to display, without going through an internal decode-encode cycle.
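The hoped-for pass-through behavior described above can be sketched as a simple decision. To be clear, this is pure speculation about what the device firmware might do; the `Stream` type and codec check are hypothetical, invented for illustration.

```python
# Hypothetical sketch of the pass-through optimization hoped for above.
# Nothing here reflects Apple's actual firmware; it only illustrates the idea.

from dataclasses import dataclass

@dataclass
class Stream:
    codec: str    # e.g. "h264" for a stored movie, "raw" for UI framebuffer
    width: int
    height: int

def path_over_lightning(stream: Stream) -> str:
    """Decide how a stream would cross the Lightning link."""
    if stream.codec == "h264":
        # Best case: an already-compressed movie is forwarded as-is,
        # avoiding the lossy decode -> re-encode round trip entirely.
        return "pass-through"
    # Worst case: UI/mirroring frames must be encoded in real time,
    # which is where the artifacts on text and menus come from.
    return "re-encode"

print(path_over_lightning(Stream("h264", 1920, 1080)))  # pass-through
print(path_over_lightning(Stream("raw", 1600, 900)))    # re-encode
```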
 
Apple taking a loss on something they produce? You're not being serious, I hope. This is Apple we're talking about. . .not Google or Amazon. Apple wants to make money from selling content AND hardware to run the content.

Well, consider putting an entire computer into a tiny adapter cable that will not sell in anywhere near the volume of the devices it connects to. We are talking about an entire computer, and that means a huge amount of development and software.

That little adapter sounds like it is an Apple TV ($99) without WiFi and its own power supply, hardware-wise (neither of which comes close to the $49 difference).

Speaking of which, an Apple TV could do the work of the adapter, and has more advantages as well.

Lightning is much more than what Apple has told us.

----------

It doesn't - see the update of the original article.

I was replying to post which seemed to suggest it was.

And the update only said "Serial Bus", and that could mean anything.
USB is a serial bus.

Edit: I mean it could have the bandwidth, but muxing the data to a few wires is a challenge.
Addition: See the HDMI details, technically complex, but not impossible.
 
Thinking about this, I think that this adaptor is a custom version of a USB Video Card, using a variant of the existing DisplayLink hardware.

Right now they've probably only got the drivers to display video, hence everything is a video, encoded on the iDevice.

Further down the line, they will incorporate drivers into the OS to actually use it as a video card, hopefully with lossless rendering of UIs on the adaptor itself. I.e., this adaptor is a full graphics card with 256MB memory, connected via USB2 to Lightning to the iDevice (possibly bypassing USB2 stage, depending on the implementation).

http://www.displaylink.com/technology/technology_overview.php for those who think a USB2 video card is cranky.
 
Wow. There have been articles about this for days, and you still don't have the facts right. The cable can and does output 1080p video just fine. What it *can't* do is *MIRROR* the display at resolutions higher than 1600x900.

Mirroring the display involves on the fly *encoding* of the video stream, which is very computationally expensive, and not handled in hardware except on high end, *expensive* hardware (such as top of the line, dedicated PCI-e GPU).

LOLno. Surface and Surface Pro can project, mirror, or extend and it's no big deal. Surface Pro can extend the display to 2 additional displays... one over Mini DisplayPort, one over USB 3.
 
Video Mirroring has never been upscaled to 1080p. The max has always been 720p.
Nothing new here.
 
So to cut a long story short, the Lightning adaptor actually works better than the old 30-pin adaptor for HDMI, and we are really all just throwing a hissy fit over nothing?
That certainly sounds like a valid summary of the thread.

----------

LOLno. Surface and Surface Pro can project, mirror, or extend and it's no big deal. Surface Pro can extend the display to 2 additional displays... one over Mini DisplayPort, one over USB 3.

Surely you aren't comparing the graphics capability of a PC (even with integrated HD4000) to an iPhone and expecting similarity? The actual article by Panic says they put the phone in mirroring mode and are shocked that it is mirroring the video. That's something they need to figure out.

Now, if there are speed issues, artifacts, etc., that's something to worry about.
 
Surely you aren't comparing the graphics capability of a PC (even with integrated HD4000) to an iPhone and expecting similarity?

Except it's also in the iPad mini and iPad 4... is there any chance this isn't the connector going forward, at least for a while? Basically you're saying "good enough", i.e. 720p mirroring is acceptable on a current-generation tablet... when another current-generation tablet can push external displays up to 2560x1440. Certainly doesn't sound competitive.
 
That's why I said 'but other devices are better value & more flexible.'

You seem to have not heard of Roku, Boxee, Netgear, Western Digital, Android TV…
Many are cheaper than an ATV & support more features (clearly some are worth avoiding too). Your points are valid about the cost of Pi extras, however it does mean you have a custom device that does exactly what you want.

I misunderstood your statement on other devices ...

Of course I know of Roku, Boxee, etc. However, if a user wants AirPlay/AirTunes support, the best options are an Apple TV and an AirPort Express. Software that tries to emulate AirPlay/AirTunes receiving just is not good enough.
 
Except it's also in the iPad mini and iPad 4... is there any chance this isn't the connector going forward, at least for a while? Basically you're saying "good enough", i.e. 720p mirroring is acceptable on a current-generation tablet... when another current-generation tablet can push external displays up to 2560x1440. Certainly doesn't sound competitive.
Maybe you don't know what the Surface Pro is? It is different from other tablets. And costs much more than others. Apple does not even make a product with direct feature comparisons. The closest is probably the MBA, not the iPad.
 
Maybe you don't know what the Surface Pro is? It is different from other tablets. And costs much more than others. Apple does not even make a product with direct feature comparisons. The closest is probably the MBA, not the iPad.

The more storage you put in an iPad, the more expensive it gets. It's pretty close to the cost of a Surface Pro at the 64 and 128 GB sizes... RT tablets do extension, mirroring, and projection without display artifacts at the same price as the iPad.
 
This doesn't bother me at all. Connecting an iPad or iPhone directly to a TV or projector via HDMI really isn't practical anymore. :apple:TV and AirPlay are how I do all TV and projector connections now. I really don't understand why some of you are so upset by this adapter. What are you going to do, get a 30 ft HDMI cable so you can sit on the couch and mirror your iPad or iPhone to a TV? Connecting a computer or even an iOS device to a TV via a cable is what I would consider legacy hardware. I'm trying to understand the frustration some of you have. I just don't get it.
 