
Smartass

macrumors 65816
Dec 18, 2012
1,450
1,701
So they went through all this trouble to make a small computer inside an adapter, and it still produces lower-quality video than a normal HDMI adapter? My god, Apple completely botched this one.
 

thadoggfather

macrumors P6
Oct 1, 2007
15,551
16,286
Everything recently feels 'jury-rigged.'

Between this, the iPad 3 being powered by the A5X (really just an A5 CPU with quad-core graphics to drive that display), the 13" rMBP's integrated Intel HD 4000 pushing all those pixels, and the 15" rMBP, where neither the NVIDIA GPU nor the HD 4000 was really meant to push that many pixels either, yet both models are probably doing it thanks to an Apple-made, baked-in software hack.

Did Apple not think somebody nerdy and inquisitive would eventually do a teardown of the cable?
 

iSunrise

macrumors 6502
May 11, 2012
382
118
How stupid is it to force customers onto the new Lightning connector on their mobile devices, force them to buy expensive and technically inefficient active adapters, and even lower the quality of the transferred image while doing so? If I have a 1080p HD movie on my iPad, I want it displayed on my TV in the same quality it is stored in, not run through another transcoding step that degrades it further. I can live with adapters any day, but not if they alter the image of my stored media.

Not even talking about the fact that this adapter is a complete and utter waste of resources. Every engineer who reads this should at least shake their head or fall off their chair, because it makes no technical sense at all; its only purpose is to outsource components that might otherwise increase the BOM of their mobile devices by a tiny bit.

Instead of just building a bit more ridiculously cheap RAM into their mobile devices and using the native processing power of their already extremely fast and future-proof A6(X) chips (which will only get faster), they made an active adapter that needs additional memory and even an iOS of its own just to boot and transfer an image. Seriously, how dumb is that, Apple?
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
256MB of RAM in an adapter? I hope they patented every little piece of this design. But it's definite overkill.

No, it's not. Given that it's a complete AirPlay receiver, it does need RAM to function.

I'd even add that it should have MORE RAM (or flash storage) to properly buffer iOS-native video files, if it buffers/decodes them at all the way the Apple TV does (not sure about this adapter)...
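
Just to put the 256MB figure in perspective, some back-of-the-envelope math (my own assumptions, nothing confirmed from the teardown): decoded 4:2:0 frames cost about 1.5 bytes per pixel, so at 1080p even 256MB fills up quickly.

# Rough sketch of how much RAM decoded video frames alone would eat.
# Assumes 1080p frames stored as NV12 / 4:2:0 (1.5 bytes per pixel);
# illustrative numbers only, not anything confirmed about the adapter.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 1.5  # 1 byte luma + 0.5 bytes chroma (4:2:0)

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"One decoded frame: {frame_bytes / 2**20:.1f} MiB")            # ~3.0 MiB
print(f"Frames fitting in 256 MiB: {256 * 2**20 / frame_bytes:.0f}")  # ~86

That's only about three seconds of decoded 30fps video before you even subtract what the OS itself needs, which is why I'd argue for more.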

----------

How stupid is it to force customers ...

... Seriously, how dumb is that, Apple?

While you're absolutely right, you'll surely receive a LOT of flames from Apple fanboys for this :)
 

mcfmullen

macrumors member
Feb 6, 2012
71
1
You forgot to add -

Apple's Lightning Digital AV adapter IS NOT a computer, because it requires one to function (no processor, unless it's standalone, in which case it would be a computer). This is just following your LOGIC :D

P.S. The use of capital letters is a doubtful method even for convincing your pals, let alone on MacRumors, where a lot of people work in IT.

Incorrect. The adapter does not require a computer to function; it requires input. The computations all take place inside the adapter, not the computer. The same cannot be said of the hub, which requires the CPU, or of the RAID controller.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
So they went through all this trouble to make a small computer inside an adapter, and it still produces lower-quality video than a normal HDMI adapter? My god, Apple completely botched this one.

Apple doesn't give a sh*t about technically literate folks. This is why they even had the "you're holding it wrong" "excuse". And that's only one of their blatant lies showing they think everyone else is stupid. For example, when asked about dropping the antialiasing in the iPhone 2x mode on the iPad 1/2 in iOS 5 (and thus REALLY reducing the quality of the displayed iPhone screen), they stated "antialiasing was a bug". Anyone even somewhat tech-literate knows that's a blatant lie.
 

mcfmullen

macrumors member
Feb 6, 2012
71
1
While those devices might not be full-fledged computers, technically they are, or very well could be. I'm not saying they would be powerful computers, or computers that could do anything "useful" in the sense we're used to, but technically they are computers.

Did you even read my comment? No processor, no computer.
 

KanosWRX

macrumors 6502
Jul 14, 2008
417
396
That is cool! It is amazing how much tech they managed to put inside such a small adapter. :eek:

How is this amazing? They had to waste more materials just to make a product that is worse than the last one! They should have thought of this before making the Lightning adapter. Now we have to deal with inferior TV-out on all Apple devices for the next 10 years, until Apple puts a new port on its devices :( Apple engineers, you really dropped the ball on this one. Guess your quality is going down just like your stock!
 

iSunrise

macrumors 6502
May 11, 2012
382
118
While you're absolutely right, you'll surely receive a LOT of flames from Apple fanboys for this :)
So be it. Another reason to hold onto my iPad 3 even longer. It's just sad, as people won't have any choice with future devices.
 

lolkthxbai

macrumors 65816
May 7, 2011
1,426
489
I think Apple is providing fan service, in a way, by even offering these adapters. Pretty soon signals will more commonly be transmitted wirelessly, and Apple knew this when they started campaigning on the whole post-PC and cut-the-cord concepts with sync over Wi-Fi and AirPlay. Honestly, I wouldn't bother connecting my laptop to my TV, because it's annoying and uncomfortable to have my laptop so close to and physically attached to my TV. The same goes for my iPhone and iPad. And at $50, it'd be a better investment to put another $50 toward an Apple TV and do AirPlay over Wi-Fi, with the obvious additional benefits of having an Apple TV. Like I said, fan service. And to some extent this really only benefits those who choose not to get an Apple TV or don't have Wi-Fi in their home/office yet.
 

odedia

macrumors 65816
Nov 24, 2005
1,044
149
They pretty much stuck a Raspberry Pi Model A into this thing... amazing, and at the same time it feels almost pointless (besides serving Apple's obsession with thinness and proprietary connectors).
 

aerok

macrumors 65816
Oct 29, 2011
1,491
139
This is why I don't really come around anymore. Stupid comments everywhere and the mods don't care because page views are awesome. Money > Quality. It's just the way it is.

It was only one guy and one post in the whole thread... easily ignored.
 

Alameda

macrumors 6502a
Jun 22, 2012
927
546
Micro USB doesn't solve anything, because it comes with its own set of baggage... microUSB would end up having the same exact issues with video, but probably more so.
That's incorrect. Smartphones with microUSB, such as Samsung's Galaxy S II, use a technology called MHL, which combines the three RGB/YUV pairs of the HDMI signal into a single twisted pair with a clock. Another pin carries data, and the other two carry power. Televisions from Sony, Samsung, LG, and Toshiba can accept this signal directly, with a passive cable that lets the TV charge AND control the phone; for any other HDMI TV, a dongle can be used. Resolution can be as high as 1080p60, without compression.

I think the teardown article makes a lot of assumptions which are likely false. For one thing, just because the chip has an embedded ARM microcontroller doesn't mean it's running an OS. That is very likely wrong. It probably has 256 kbits of on-chip non-volatile memory, not 256MB. And the likely purpose of this is to manage the HDMI and HDCP protocols, not to convert H.264 into HDMI. The artifacts that were observed could be caused by a lot of things, such as video compression or decompression, OR maybe the video output to the Lightning connector subsamples video to 4:2:2 or 4:2:0, or maybe some video scaling is performed. But I doubt that the Lightning dongle decompresses the video signal. An easy test would be to observe how much lag is present when you move your finger around the display. With HDMI there is no lag, but if the video is compressed and decompressed, there will likely be lag.
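
For what it's worth, here's the raw arithmetic on why compression would even come up (my own rough numbers, plus an assumed USB 2.0-class link speed for Lightning, which nobody outside Apple has confirmed):

# Raw bit rate of uncompressed 1080p60 RGB video, the signal MHL can carry
# without compression. Illustrative arithmetic only.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Uncompressed 1080p60: {raw_gbps:.2f} Gbit/s")  # ~2.99 Gbit/s

# If the Lightning data link were only USB 2.0-class (~0.48 Gbit/s), as some
# have speculated, video would have to be compressed to fit, and the
# finger-lag test above would show it.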
 

SmoMo

macrumors regular
Aug 20, 2011
218
21
FDvHD

... I'd stand there and admire it on the big 14 inch screen and decide whether to store it on a floppy disk so I could look at it later.

Why not just store it on your hard drive?
 

iSunrise

macrumors 6502
May 11, 2012
382
118
That's incorrect. Smartphones with microUSB, such as Samsung's Galaxy S II, use a technology called MHL, which combines the three RGB/YUV pairs of the HDMI signal into a single twisted pair with a clock. Another pin carries data, and the other two carry power. Televisions from Sony, Samsung, LG, and Toshiba can accept this signal directly, with a passive cable that lets the TV charge AND control the phone; for any other HDMI TV, a dongle can be used. Resolution can be as high as 1080p60, without compression.

I think the teardown article makes a lot of assumptions which are likely false. For one thing, just because the chip has an embedded ARM microcontroller doesn't mean it's running an OS. That is very likely wrong. It probably has 256 kbits of on-chip non-volatile memory, not 256MB. And the likely purpose of this is to manage the HDMI and HDCP protocols, not to convert H.264 into HDMI. The artifacts that were observed could be caused by a lot of things, such as video compression or decompression, OR maybe the video output to the Lightning connector subsamples video to 4:2:2 or 4:2:0, or maybe some video scaling is performed. But I doubt that the Lightning dongle decompresses the video signal. An easy test would be to observe how much lag is present when you move your finger around the display. With HDMI there is no lag, but if the video is compressed and decompressed, there will likely be lag.
Even if they are wrong on the technical details, what it comes down to in the end is that the old adapter with the old connector offers visibly higher quality than the new adapter with the new Lightning connector (which delivers 900p instead of 1080p, plus image artifacts). No one really cares how it works, as long as it doesn't produce worse quality on screen. And if Apple praises the new Lightning adapter as more "future-proof", that should translate to "we can make even more money in the future".
 

Abazigal

Contributor
Jul 18, 2011
19,586
22,043
Singapore
How stupid is it to force customers onto the new Lightning connector on their mobile devices, force them to buy expensive and technically inefficient active adapters, and even lower the quality of the transferred image while doing so? If I have a 1080p HD movie on my iPad, I want it displayed on my TV in the same quality it is stored in, not run through another transcoding step that degrades it further. I can live with adapters any day, but not if they alter the image of my stored media.

Not even talking about the fact that this adapter is a complete and utter waste of resources. Every engineer who reads this should at least shake their head or fall off their chair, because it makes no technical sense at all; its only purpose is to outsource components that might otherwise increase the BOM of their mobile devices by a tiny bit.

Instead of just building a bit more ridiculously cheap RAM into their mobile devices and using the native processing power of their already extremely fast and future-proof A6(X) chips (which will only get faster), they made an active adapter that needs additional memory and even an iOS of its own just to boot and transfer an image. Seriously, how dumb is that, Apple?
That's the whole point - Apple can't be that dumb. I want to believe there has to be some deeper reason behind all this. I am no tech expert, so I can't know whether this is possible or not, but what if the circuitry is there to future-proof the cable for the next 10 years?

This probably seems redundant, even wasteful, now, but I daresay that years down the road we will be praising Apple for their foresight when their future products start tapping into features of the Lightning cable.
 

Alameda

macrumors 6502a
Jun 22, 2012
927
546
Even if they are wrong on the technical details, what it comes down to in the end is that the old adapter with the old connector offers visibly higher quality than the new adapter with the new Lightning connector (which delivers 900p instead of 1080p, plus image artifacts). No one really cares how it works, as long as it doesn't produce worse quality on screen.
That's true, but I'd like to see independent verification of their claims, since they got a few details wrong.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
The artifacts which were observed can be caused by a lot of things, such as the video compression or decompression, OR, maybe the video output to the lightning connector sub-samples video to 4:2:2 or 4:2:0, or, maybe some video scaling is performed. But I doubt that the lightning dongle decompresses the video signal.

It does - see the Panic image at http://www.panic.com/blog/wp-content/uploads/2013/03/jpegcompression.jpg . Those are definitely decompression artefacts (blocking), not those of plain 4:2:x subsampling, which would "only" slightly reduce the bandwidth needed (by one-third for 4:2:2 and by half for 4:2:0).

Those kinds of blocks would ONLY happen with interlaced content (and only with 4:2:0), not progressive content - and all iDevices emit progressive video. An example:

interlaced, moving image (the worst case): http://upload.wikimedia.org/wikipedia/commons/b/b9/420-interlaced-single-field.png

(this isn't the case here, as the Panic screenshot shows a non-moving image)

With progressive input, such blocks are far less visible at 4:2:0.

Note that even the above worst-case 4:2:0 subsampling (which surely doesn't apply here, given the non-interlaced content) would only introduce blocks in the one-pixel border areas between different colors. Here, in the Panic shot, anyone can see the blocks are 2-3 pixels high/wide, that is, much larger than anything 4:2:x subsampling could ever introduce.
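
And to make the quoted savings concrete, here's the trivial arithmetic behind the one-third and one-half figures:

# Sanity check of the subsampling savings quoted above (plain arithmetic):
# 4:4:4 carries 3 samples per pixel, 4:2:2 carries 2, and 4:2:0 carries 1.5.
samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
for fmt, s in samples_per_pixel.items():
    saved = 1 - s / samples_per_pixel["4:4:4"]
    print(f"{fmt}: {saved:.0%} less data than 4:4:4")
# -> 4:2:2 saves 33% (one-third) and 4:2:0 saves 50% (half), as stated above.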
 

Alameda

macrumors 6502a
Jun 22, 2012
927
546
Apple's Lightning Digital AV Adapter is a Full-Fledged Computer

It does - see the Panic image... Those are definitely decompression artefacts (blocking), not those of plain 4:2:x subsampling, which would "only" slightly reduce the bandwidth needed (by one-third for 4:2:2 and by half for 4:2:0).
Thank you! That was very useful information! What I meant was that I'd like to see someone independently verify the artifact issue on the iPhone 5 vs., for example, the 4S.
 