Micro USB isn't an answer either.

So if I go to a friend's house and want to watch a video from my iPad, I first have to check they have an Apple TV; if not, I have to unplug mine from the back of my setup, take it round, and set it all up, instead of just taking a RELIABLE cable.

Lightning seems to me like a massive backward step by Apple. If they wanted to do everything wirelessly, they should have just used micro USB!

Micro USB doesn't solve anything, because it comes with its own set of baggage. The Lightning adapters from Apple say, "We know these functions will work and can be used with any software that adopts the appropriate portions of the SDK out-of-the-box." Micro USB says, "This might work, and you'll be disappointed and confused when it doesn't." Not to mention micro USB would end up having the same exact issues with video, but probably more so, since video wasn't a major focus of that technology in the first place. I'm sure Lightning had a plan for video in mind at its inception.

Those people who do need to pop into a random location and display video will need to carry the Lightning AV adapter and a lot of cable. Of course, you'd better hope that the connection to the TV is accessible or that the projector is compatible. If you're going to do it a lot, buy an Apple TV. In either case, though, video out from an iPad or iPhone isn't the common case.

I leave you with xkcd.
 
Elegant

I do not understand the point of this article. The use of the ARM chip is simple: it is more flexible than discrete or programmable logic chips.

In the past such an adaptor would have been hundreds of times the size in discrete logic (74xx logic); smaller with programmable array logic (PALs and GALs); smaller still with a field-programmable gate array (FPGA); and even smaller with a CPU. It is the natural progression of digital circuits, nothing else.

Exactly. It is an elegant solution from Apple.

Not to mention, if the first version of the software on the adapter isn't satisfactory or has bugs, then they can just update it by updating iOS. No need to flash the chip.

----------

:O AND THEY ARE STILL 2 YEARS BEHIND EVERYONE ELSE #WirelessCharging.

Let me get this straight... Now I need to buy a special base station to charge my device on, which it cannot be removed from or it stops charging. Kind of like an electric toothbrush.

I'm just not sure how this solves an actual problem. In fact it creates a new one: I still need the cable connection every time I want to charge in a new location (often), and now I need to buy cradles for every location where I want to enjoy the novelty of wireless charging, all without any additional benefit. Yay!

As with the electric toothbrush, you need not only a plug but a surface within the cord's reach that is stable enough to hold the device in place. Seems like a traveller's delight.
 
So will people now hold off for version 2.0 of the adapter?

Surely if it can't output native 1080p, it's not working as advertised?
 
Absolutely different

Like micro USB, then, but much more expensive.

Micro USB is not a solution; it is just a different solution. It is the latest connector in a long series of connectors for the USB protocol, and it has a whole huge standard of backwards compatibility and flexibility of purpose built into it. The primary issues with USB are limitations, expectations, and supporting software.

Most consumers wouldn't know an Ethernet cable from a USB cable, but they would assume that if it fits, it should work. With USB devices on a mobile platform, that just isn't the case. The Lightning adapter is a guarantee: this will do exactly what I want. I will venture to say many are completely willing to pay a few bucks for something that works the first time without installing, hacking, or praying.
 
I do not understand the point of this article. The use of the ARM chip is simple: it is more flexible than discrete or programmable logic chips.

In the past such an adaptor would have been hundreds of times the size in discrete logic (74xx logic); smaller with programmable array logic (PALs and GALs); smaller still with a field-programmable gate array (FPGA); and even smaller with a CPU. It is the natural progression of digital circuits, nothing else.

No, it's an ugly hack due to Apple not wanting to implement anything they don't have total control over. It's seriously ugly, seriously inefficient and performs poorly. When you need to put a processor in your cable, you know something is desperately wrong.

----------

Does this mean Apple is doing more to achieve less with this hacky method to output via HDMI?

Exactly.
 
...
They clearly have some great plans for it down the road. Just like how iTunes would go on to become a great selling point for the iPhone and iPad, the Lightning connector may well prove to be one in due time. :)

I was thinking the same thing. At the moment, it doesn't make a lot of sense, but Apple doesn't think about TODAY; they think about 5+ YEARS down the road.

Their ability to engineer and manufacture such cables is technological wizardry. The inability of the cable to provide premium video is a great failure. They are making these now to improve them over the next decade and to pave the way for better and better data-transfer capabilities.
 
Wait, ALL Lightning cables have 256MB of RAM? Why can't my iPod touch 5 use it? :c

Seriously though, what the **** could they POSSIBLY need a complete computer in a goddamn cable for?

Not all cables. Just ones that need to handle the decoding of a signal that would need too much bandwidth to be sent through the Lightning connector in raw form.
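To put rough numbers on that (a back-of-the-envelope sketch of my own; the figure of roughly USB 2.0-class bandwidth of ~480 Mbit/s for Lightning is an assumption, not a published spec):

```python
# Back-of-the-envelope check: does raw 1080p video fit through a
# USB 2.0-class link? (ASSUMPTION: Lightning ~ 480 Mbit/s; Apple has
# not published the connector's actual bandwidth.)

WIDTH, HEIGHT = 1920, 1080   # one 1080p frame
BYTES_PER_PIXEL = 4          # uncompressed 32-bit RGBA
FPS = 30

raw_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS  # bits per second
usb2_bps = 480_000_000                                # USB 2.0 theoretical max

print(f"Raw 1080p30:  {raw_bps / 1e9:.2f} Gbit/s")
print(f"USB 2.0 max:  {usb2_bps / 1e9:.2f} Gbit/s")
print(f"Raw video needs ~{raw_bps / usb2_bps:.1f}x the available bandwidth")
```

If those assumptions are anywhere near right, raw frames can't possibly fit, which would explain why the adapter receives a compressed stream and needs its own SoC and RAM to decode it.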
 

My first computer had an 8-megabyte hard drive and, I think, 512KB of RAM, lol.
 
Apple's Lightning Digital AV adapter IS NOT a computer, because it requires one to function (it doesn't operate standalone; if it did, it would be a computer).

That difference does not make something not a computer. Many mainframes and big supercomputers have required, and still require, another computer (a service processor, console processor, or front-end minicomputer) to boot them or to do any I/O.

Even your older MacBook won't boot without the ARM processor inside the HDD controller chip.

----------

Shame they couldn't do the same with the iPhone!

They could do that with a phone if people didn't need a display, speakers, batteries, and antennas big enough not to have the "holding it wrong" problem.
 
I wonder how many of the people actually complaining here intend to use the new Lightning AV cable, or even used the old Dock HDMI cable? I don't actually know anyone who bought the old one, and I know lots of iPhone users. Most people are happy with mirroring to Apple TV.

For me, the small size and reversible nature of the Lightning connector is a win, especially if it means smaller, thinner, lighter devices.
 
Now, I know very little about how this all works, but if my calculations are correct... (trying to calculate the size of 30 uncompressed 1080p frames...)

(1920 × 1080 × 4) / 1,000,000 = 8.2944 MB per frame...

8.2944 × 30 (fps) = 248.8 MB. Is it perhaps coincidence that it's right near the 256MB mark? Or is there some reason to be able to store an entire second of video in memory?
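The arithmetic checks out, for what it's worth; here's a quick sketch to verify it (the 4 bytes per pixel and 30 fps figures are the assumptions from the post above, and whether the adapter actually buffers a full second of video is pure speculation):

```python
# Verify the poster's arithmetic: how much memory would one second
# of uncompressed 1080p video occupy? (4 bytes/pixel and 30 fps are
# the assumptions from the post above, not known adapter behaviour.)

frame_bytes = 1920 * 1080 * 4          # one 32-bit RGBA frame
frame_mb = frame_bytes / 1_000_000     # = 8.2944 MB

second_mb = frame_mb * 30              # 30 frames ≈ 248.8 MB
print(f"One frame:  {frame_mb:.4f} MB")
print(f"One second: {second_mb:.1f} MB (vs. 256 MB of RAM)")
```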
 
My point is that the iPad never had equivalent components that were "shifted" to the adapter, since its previous video output didn't require encoding. You didn't read the article and tried to spin the story into Apple squeezing extra profit out of the iPad by taking components off it, which obviously isn't the case, since the earlier iPad just output the raw stream.

I think your point was more to try and call me out for being anti-Apple, which, again, I wasn't being. It's cool. I was calling it good business if true.

Your point falls apart once you acknowledge that you honestly don't know which chips did what. Can you assume? Sure. But that doesn't mean you know. Nor does Panic. That's why I asked you to define conjecture for me.

You could have just written your post arguing against my assertion. But that wasn't enough. You took it a step further by making insinuations (which were wrong) about my motivation.

I understood your point crystal clearly. I can't say I agree or disagree with it, since it's not based on facts that are known.
 
Basically a USB-to-HDMI adapter.

It's basically Apple's mobile version of a USB-to-DVI adapter.

iOS=Lightning=USB=HDMI..?
 
Also, doesn't the Lightning adaptor cost more than the old 30-pin cable? Where exactly is Apple shifting costs to if they are actually earning lower margins overall? Every iOS device still has to come standard with a Lightning port and charging cable anyway.
 
It looks like it does output video via AirPlay.

http://store.apple.com/uk/product/MD826ZM/A/lightning-digital-av-adapter?fnode=3a

Beware if you want to stream on demand!

Written by Gurminder P from Bristol
08-Jan-2013

I bought this adapter with the intention of connecting my iPad to my TV and streaming apps such as Lovefilm, BBC iPlayer and 4oD. Be aware that none of these applications will work when connected via HDMI to a display. You will receive a notification stating that the app does not support this.

Possibly ignorance on my part, but to me, this was a big plus point of buying the adapter in the first place. I contacted 4oD and received the response below when I asked why there was no support:

"Unfortunately the reason Airplay no longer works is that we don't have the rights to broadcast all our shows using Airplay, so this feature is not enabled."

Hope this helps.

and

No Dolby Digital 5.1

Written by TC Y from Pontypool
30-Jan-2013

What is the point of technology moving backwards?
This Lightning adapter does not allow 5.1 passthrough; the 30-pin adapter for the iPad 2 and 3 does. Disappointing.
 
Does this mean that none of the future iPhones and iPads will be able to output a real HDMI signal? Why didn't they just make the port a little longer, with enough pins?

Oh, forget about the larger speaker and smaller body. But you knew that.
 
No, it's an ugly hack due to Apple not wanting to implement anything they don't have total control over. It's seriously ugly, seriously inefficient and performs poorly. When you need to put a processor in your cable, you know something is desperately wrong.
Yup. This one just has pixies inside, with notepads, who transcribe USB to HDMI.
 
No, it's an ugly hack due to Apple not wanting to implement anything they don't have total control over. It's seriously ugly, seriously inefficient and performs poorly. When you need to put a processor in your cable, you know something is desperately wrong.



Completely wrong. Either you put a static port on the device or you have an adaptor. Lightning is incredibly adaptable and will be able to cope with any future connectors that come along.

It can take very high power loads, something that mini and micro USB cannot at all... in fact, all the current Samsung etc. tablets, and the iPad, break the USB power specs.

And as the great post above shows, Samsung has exactly the same issue. Someone needs to crack it open; it will have some form of processor.

A lot of devices still require specific ports and power... you cannot charge the Nexus 7 properly with other USB chargers. Don't think that Apple is the only one that wants you to buy their own adaptors and chargers. They may use a low-spec micro USB that allows a simple charge, but the rest of the functionality requires device-specific adaptors.

Video through MHL is connector agnostic (it's not even a type of port), so companies are sticking whatever port they want on the device.

"The Galaxy S III is the first MHL device to use a different connector - one that is not compatible with all other MHL devices and accessories.[9] Consumers assumed that the MHL branding ensured compatibility so they were surprised when MHL accessories did not work with the Galaxy S III[10] (the incompatibility is due to the S3 using an 11 pin connector rather than a 5 pin)."
 
It's basically Apple's mobile version of a USB-to-DVI adapter.

iOS=Lightning=USB=HDMI..?

I wish it were. Unfortunately, it isn't. It seems it's just an AirPlay receiver (without forcing the user to explicitly set the client to stream), with all of AirPlay's associated problems: when not streaming iOS-native video files but, say, mirroring or playing back non-native videos, you get a low framerate, lowish resolution, and a lag that makes it impossible to play fast-paced (action / racing etc.) games.

(Note that, not having purchased the new adapter myself, I don't know about whether iOS-native video files are handled the same way as on the AppleTV. As the latter has 8GB caching storage, it can pre-buffer streamed video just fine. Here, the lack of buffer MAY mean this adapter can't even play back iOS-native video files properly, unlike the ATV.)

That is, unlike the old HDMI / VGA adapters (which did mirroring without problems), there's no point in preferring this adaptor over the old, true AirPlay-to-Apple-TV way. Don't purchase this if you already have an Apple TV; unlike the previous adapters, it won't offer any resolution, framerate, or lag advantage over streaming to the Apple TV.

All in all: I'm VERY disappointed. I loved the old adapters, which I used a lot when I needed much better performance / quality than AirPlay offers (again, except for playing back iOS-native videos, which are played back flawlessly if your network is fast enough to avoid buffering problems). However, given that this adapter doesn't seem to offer anything over AirPlay (on the contrary, it may not even decode iOS-native files natively), I surely won't purchase it.

Apple should have gone the separate micro-HDMI way, as Nokia has, instead of just offering us a "solution" that is technically vastly inferior, even compared to Apple's previous tech.
 
Bunch o' younguns around here. My first Mac had a whopping 4MB of RAM. Thankfully I upgraded to a IIsi and its massive 16MB of RAM, which maxed out at 64MB (although RAM Doubler could theoretically take it beyond its max, to 128MB). My B&W only came with 64MB. Wow, how times have really changed.

My first Intel box had a 210 meg hard drive. I splurged and got the big one. The standard size in those days was 170 meg.

Oh - and yes, I got double the normal amount of RAM. I got 2 Megs, instead of the standard 1 meg.

That machine, an 80486/66, was a screamer. I could start up a fractal in Fractint before I went to bed, and sometimes, by the time I had gotten up in the morning, it would have finished drawing it! Woo Hoo! I'd stand there and admire it on the big 14 inch screen and decide whether to store it on a floppy disk so I could look at it later.
 