256MB of RAM in an adapter? I hope they patented this whole thing, down to every little piece of the design. But it's definite overkill.
How stupid is it to force customers ...
... Seriously, how dumb is that, Apple?
You forgot to add -
Apple's Lightning Digital AV adapter IS NOT a computer because it requires one to function (no processor unless standalone, in which case it would be a computer). This is just following your LOGIC.
P.S. The use of capital letters is a doubtful method even to convince your pals, not to mention MacRumors, where a lot of people work in IT.
So they went through all this trouble to make a small computer inside an adapter, and it still produces lower-quality video than a normal HDMI adapter? My god, Apple really went overboard on this one.
While those devices might not be full-fledged computers in the everyday sense, technically they are, or very well could be. I'm not saying they would be powerful computers, or computers that could do anything "useful" in the way we're used to, but in all technicality they are computers.
That is cool! It is amazing how much tech they managed to put inside such a small adapter.
So be it. Another reason to hold onto my iPad 3 even longer. It's just sad, as people won't have any choice with future devices.

While you're absolutely right, you'll surely receive a LOT of flames from Apple fanboys for this.
This is why I don't really come around anymore. Stupid comments everywhere and the mods don't care because page views are awesome. Money > Quality. It's just the way it is.
That's incorrect. Smartphones with microUSB, such as Samsung's Galaxy SII, use a technology called MHL, which combines the three RGB/YUV pairs of the HDMI signal into a single twisted pair with a clock. Another pin carries data, and the other two carry power. Televisions from Sony, Samsung, LG, and Toshiba can accept this signal directly, with a passive cable that lets the TV charge AND control the phone. Or, a dongle can be used for any other HDMI TV. Resolution can be as high as 1080p60, without compression.

Micro USB doesn't solve anything, because it comes with its own set of baggage... microUSB would end up having the same exact issues with video, but probably more so.
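As a back-of-the-envelope check on the 1080p60 figure above, here is a short Python sketch (my own illustration, not taken from the MHL specification) of the raw active-pixel data rate an uncompressed link would have to carry:

```python
# Back-of-the-envelope estimate of the raw (uncompressed) video data rate
# for 1080p60 at 24 bits per pixel. Active pixels only; blanking intervals
# and protocol overhead are ignored, so a real link needs somewhat more.
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24          # 8 bits each for the three RGB/YUV components
FRAMES_PER_SECOND = 60

raw_bits_per_second = WIDTH * HEIGHT * BITS_PER_PIXEL * FRAMES_PER_SECOND
raw_gbps = raw_bits_per_second / 1e9

print(f"Uncompressed 1080p60 payload: {raw_gbps:.2f} Gbit/s")
```

That works out to roughly 3 Gbit/s of pixel data, which is why such a link relies on fast serialization over the pair rather than compression.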
... I'd stand there and admire it on the big 14-inch screen and decide whether to store it on a floppy disk so I could look at it later.
Even if they are wrong on the technical details, what it comes down to in the end is the visibly lower quality the new adapter with the new Lightning connector offers (900p instead of 1080p, image artifacts) compared to the old adapter with the old connector. No one really cares how it works as long as it doesn't offer worse quality on screen. And if Apple praises the new Lightning adapter as more "future-proof", that should translate to "we can make even more money in the future".

That's incorrect. Smartphones with microUSB, such as Samsung's Galaxy SII, use a technology called MHL, which combines the three RGB/YUV pairs of the HDMI signal into a single twisted pair with a clock. Another pin carries data, and the other two carry power. Televisions from Sony, Samsung, LG, and Toshiba can accept this signal directly, with a passive cable that lets the TV charge AND control the phone. Or, a dongle can be used for any other HDMI TV. Resolution can be as high as 1080p60, without compression.
I think the teardown article makes a lot of assumptions which are likely false. For one thing, just because the chip has an embedded ARM microcontroller doesn't mean it's running an OS; that is very likely wrong. It probably has 256 kbits of on-chip non-volatile memory, not 256MB. And the likely purpose of this is to manage the HDMI and HDCP protocols, not to convert H.264 into HDMI. The artifacts which were observed can be caused by a lot of things, such as the video compression or decompression; or maybe the video output to the Lightning connector sub-samples video to 4:2:2 or 4:2:0, or maybe some video scaling is performed. But I doubt that the Lightning dongle decompresses the video signal. An easy test would be to observe how much lag is present when you move your finger about the display. With HDMI there is no lag, but if the video is compressed and decompressed, there will likely be lag.
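The suggested lag test can be reasoned about with simple arithmetic. This Python sketch (an illustration with assumed buffer depths, not measurements of the actual adapter) shows how much latency each frame of pipeline buffering would add at 60 fps:

```python
# Rough estimate of the extra latency a compress/transmit/decompress
# pipeline would add, assuming it holds whole frames in flight.
# The buffer depths used below are assumptions for illustration only.
FRAME_RATE = 60.0
FRAME_TIME_MS = 1000.0 / FRAME_RATE   # ~16.7 ms per frame at 60 fps

def pipeline_latency_ms(buffered_frames: int) -> float:
    """Latency added by holding `buffered_frames` full frames in the pipeline."""
    return buffered_frames * FRAME_TIME_MS

# A direct HDMI path buffers essentially nothing:
print(f"0 frames buffered: {pipeline_latency_ms(0):.1f} ms")
# A codec path that buffers, say, 3 frames (encode + transfer + decode):
print(f"3 frames buffered: {pipeline_latency_ms(3):.1f} ms")
```

A few frames of buffering already adds tens of milliseconds, which should be visible when dragging a finger across the display.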
That's the whole point - Apple can't be that dumb. I want to believe that there has to be some deeper reason behind all this. I am no tech expert, so I can't know if all this is possible or not, but what if the circuitry was there to future-proof the cable for the next 10 years?

How stupid is it to force customers onto the new Lightning connector on their mobile devices, force them to buy expensive and technically inefficient active adapters, and even lower the quality of the transferred image while doing that? If I have a 1080p HD movie on my iPad, I want it displayed on my TV in the same quality it is stored in, not processed through another transcoding step that lowers the quality even more. I can live with adapters any day, but not if they alter the image of my stored media.
Not even talking about the fact that this adapter is a complete and utter waste of resources. Every engineer who reads this should at least shake their head or fall off their chair, because it just doesn't make any technical sense, other than to outsource components that might increase the BOM of their mobile devices by a tiny bit.
Instead of just building a bit more ridiculously cheap RAM into their mobile devices and using the native processing power of their already extremely fast and future-proof A6(X) chips (which are only going to get faster), they made an active adapter that needs additional memory and even boots an iOS just to transfer an image. Seriously, how dumb is that, Apple?
That's true, but I'd like to see independent verification of their claims, since they got a few details wrong.

Even if they are wrong on the technical details, what it comes down to in the end is the visibly lower quality the new adapter with the new Lightning connector offers (900p instead of 1080p, image artifacts) compared to the old adapter with the old connector. No one really cares how it works as long as it doesn't offer worse quality on screen.
Thank you! That was very useful information! What I meant was that I'd like to see someone independently verify the artifact issue of the iPhone 5 vs., for example, the 4S.

It does - see the Panic image... Those are definitely decompression artefacts (blocking), not those of plain 4:2:x subsampling, which would "only" slightly reduce the bandwidth needed (by one-third for 4:2:2 and by half for 4:2:0).
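The subsampling savings quoted above check out arithmetically. Here is a small Python sketch (my own illustration) of the average bits per pixel for the common chroma-subsampling schemes with 8-bit components:

```python
# Average bits per pixel for common YCbCr chroma-subsampling schemes,
# assuming 8 bits per component. In J:a:b notation, `a` chroma samples are
# taken in the first row of a 4-pixel-wide block and `b` in the second row.
def bits_per_pixel(a: int, b: int, bit_depth: int = 8) -> float:
    pixels = 4 * 2                 # J:a:b is defined over a 4x2 pixel block
    luma_samples = pixels          # one luma (Y) sample per pixel
    chroma_samples = 2 * (a + b)   # two chroma planes (Cb and Cr)
    return (luma_samples + chroma_samples) * bit_depth / pixels

bpp_444 = bits_per_pixel(4, 4)   # no subsampling
bpp_422 = bits_per_pixel(2, 2)   # chroma halved horizontally
bpp_420 = bits_per_pixel(2, 0)   # chroma halved horizontally and vertically

print(f"4:4:4 -> {bpp_444:.0f} bpp")  # 24
print(f"4:2:2 -> {bpp_422:.0f} bpp")  # 16, i.e. one-third saved
print(f"4:2:0 -> {bpp_420:.0f} bpp")  # 12, i.e. half saved
```

So 4:2:2 trims the raw bandwidth by a third and 4:2:0 by half, which matches the figures quoted above; blocky macroblock artifacts, by contrast, point to lossy compression rather than subsampling.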