
iSunrise

macrumors 6502
May 11, 2012
382
118
That's the whole point - Apple can't be that dumb. I want to believe that there has to be some deeper reason behind all this. I am no tech expert, so I can't know if all this is possible or not, but what if the circuitry was to future-proof the cable for the next 10 years?
I hope you're right. I really do. Because if it's meant to be future-proof, it needs to be technically better as a whole: not just thinner and more flexible, but retaining the quality levels we had before instead of making them worse, at least for the 4th/5th generation of mobile devices. We don't know what Apple will do in the future and I am no prophet, but currently you are pretty much forced to buy it when you're on a Lightning mobile device. Maybe they are going to change something in the 6th or a coming generation again, but it seems very plausible that this is Lightning connector, bandwidth or DRM related.
 

fertilized-egg

macrumors 68020
Dec 18, 2009
2,109
57
Your point falls apart once you acknowledge that you honestly don't know which chips did what. Can you assume? Sure. But that doesn't mean you know. Nor does Panic. That's why I asked you to define conjecture for me.

Even if you refuse to believe the claim that this is a decoding chip, which is actually based on good evidence (shocker!), we know there's 256MB of RAM because it's clearly labelled. In other words, it's 100% certain that this whole thing isn't just a part that was shifted over from the earlier iPad, which means your assertion was wrong no matter what. Plus, why the heck would an earlier iPad have a decoding chip that decodes its own video stream? That's completely nonsensical, but then you didn't read the article.

You could have just written your post arguing against my assertion. But that wasn't enough. You took it a step further by trying to make insinuations (which were wrong) about my motivation.

Sorry, but that's what you do. I don't know if you do it without realizing it, but it's easy to go back through your posting history and show that you almost always lean toward the argument against Apple, no matter what the story is.

I understood your point crystal clear. I can't say I agree or disagree with it since it's not based on facts that are known.

No, it's simpler than that: you didn't read the article, but when someone said something positive about Apple being generous with the profit margin on the adapter, you tried to spin it into Apple being greedy.

I see that you're trying to cling to an "if you cannot prove it 100%, you cannot say I'm wrong" argument. But there is a very clear label on the RAM showing this isn't something that was simply taken off the iPad. Or are you going to say I'm still wrong because I cannot 100% prove the iPad 1, 2, 3 didn't have a hidden 256MB of RAM somewhere? ;)
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
My other question is... can USB 2 carry enough bandwidth to handle video? I'd wager a guess of no... not without the same technique. So yes, the Lightning connector may top out at a specific configurable digital signal and have to send this data in a compressed form, but I also wager a guess that if there is something wrong with the MPEG decoding, Apple will be able to fix it with an update.

Well, they could have used the same technology as in Thunderbolt: 10 Gbit/s per device (20 total), which would be enough even for HDMI 1.3 (allowing for WQXGA (2560×1600) at 60 Hz and requiring 10.5 Gbit/s), let alone older / less capable HDMI versions (for example, the sub-5 Gbit/s, 165 MHz HDMI 1.0, which allows for up to 1920×1200 (that is, vertically more than Full HD) at 60 Hz). Then they wouldn't need to compress / decompress the video signal and could use a far simpler (and cheaper) adapter.
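For anyone who wants to sanity-check those figures: raw (uncompressed) video bandwidth is just pixels × bits × refresh rate. A minimal sketch with my own back-of-the-envelope numbers, not anything from Panic's article:

```python
# Raw pixel data rate for uncompressed video; the on-the-wire figure is
# higher because HDMI/TMDS adds blanking intervals and 10-bit encoding.

def raw_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (24 bpp = 8-bit RGB, no subsampling)."""
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(2560, 1600, 60))  # WQXGA@60  -> ~5.9 raw; ~10.5 on the wire with blanking/TMDS, hence HDMI 1.3
print(raw_gbps(1920, 1200, 60))  # WUXGA@60  -> ~3.3 raw; fits HDMI 1.0's 165 MHz link
print(raw_gbps(1920, 1080, 60))  # 1080p60   -> ~3.0 raw
```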

Of course, this may have cost them a lot more (I don't know the actual prices for Thunderbolt vs. Lightning circuitry; I only assume the latter costs less than the former because it's far slower) - maybe this is why they went with the far inferior solution, quality-wise.

----------

I hope you're right. I really do. Because if it's meant to be future-proof, it needs to be technically better: not just thinner and more flexible, but retaining the quality levels we had before instead of making them worse, at least for the 4th/5th generation of mobile devices. We don't know what Apple will do in the future and I am no prophet, but currently you are pretty much forced to buy it when you're on a Lightning mobile device. Maybe they are going to change that to native processing in the 6th or a coming generation again

I'm afraid they won't be able to do this. Even if you subsample colors with 4:2:0 (which halves the bandwidth needed, see my previous comment above: https://forums.macrumors.com/posts/16932363/ ) and "only" use 1920×1080 at 60p (which the iDevice hardware H.264 decompressor has always been capable of decoding, starting with the A4 CPU), you need to transfer around 2 Gbit/s. (With 30p, half that.) If the hardware simply can't sustain that bitrate, then you will never be able to transfer uncompressed signals, no matter how hard you try. It's simple physics.
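If you want to check that arithmetic yourself, here is a minimal sketch (my own numbers; the ~2 Gbit/s above presumably includes some link-level overhead on top of the raw pixel data):

```python
# 4:2:0 keeps full-resolution luma (8 bits/pixel) plus two chroma planes
# at quarter resolution (2 x 2 bits/pixel) = 12 bits/pixel, i.e. half
# of 24-bit 4:4:4 RGB.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 12
gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{gbps:.2f} Gbit/s")  # ~1.49 raw; around 2 once blanking/encoding overhead is added
```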

All in all, Apple should have used the same chipset as Thunderbolt. Then it'd be easily possible to drive external HDMI 1.3-capable monitors at even native Retina (!) iPad resolutions, at a native 60p framerate.
 

mdelvecchio

macrumors 68040
Sep 3, 2010
3,151
1,149
Agreed. Lightning is a horrible adapter. It's not meant to benefit the consumer, but to shift extra cost onto the consumer. In order to make the iPhone lighter and thinner (something not demanded by the customer, but by marketing), they took out most of the onboard processing of video, audio, etc. But they didn't cut the cost of the unit.

The lightning connector could easily have been twice as wide with the same utility, the same ability to flip it, etc., and then the phone could be doing the processing, we would still have analog audio and video out, and HDMI out. But Apple wanted to cut corners and screw over customers in the process.

Lightning is just one of the negatives of the iPhone 5, but I'm stuck with mine.

so, where'd you get your electrical engineering degree from? I'd really like to see your product portfolio so I can better weigh your technical opinion. have a link to share?
 

WestonHarvey1

macrumors 68030
Jan 9, 2007
2,773
2,191
This cable is a stopgap convenience measure. Eventually all mobile devices and monitors will use AirPlay or similar for this purpose. Nobody will be using cables.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
Eventually all mobile devices and monitors will use AirPlay or similar for this purpose. Nobody will be using cables.

See my posts above: AirPlay just can't be as effective as (real; not that of Lightning) cabled connections, unless all you do is play back iOS-native video files. It's only in the latter case that AirPlay can deliver the same quality as (old) cabled solutions. This is all laws of physics (en/decoding lag and resolution restrictions, Wi-Fi lag).
 

iSunrise

macrumors 6502
May 11, 2012
382
118
I'm afraid they won't be able to do this. Even if you subsample colors with 4:2:0 (which halves the bandwidth needed, see my previous comment above: https://forums.macrumors.com/posts/16932363/ ) and "only" use 1920×1080 at 60p (which the iDevice hardware H.264 decompressor has always been capable of decoding, starting with the A4 CPU), you need to transfer around 2 Gbit/s. (With 30p, half that.) If the hardware simply can't sustain that bitrate, then you will never be able to transfer uncompressed signals, no matter how hard you try. It's simple physics.

All in all, Apple should have used the same chipset as Thunderbolt. Then it'd be easily possible to drive external HDMI 1.3-capable monitors at even native Retina (!) iPad resolutions, at a native 60p framerate.
Yes, that's what I fear as well.

Unfortunately, I could not find any document on the web that says anything about the bandwidth of the new Lightning connector. What would theoretically be possible is that the connector is just a first generation, which leaves the door open for future, faster chipsets that are also compatible with older Lightning connectors (like FireWire did), and then we would have enough bandwidth to at least get 1080p@23.976/24Hz/30Hz uncompressed. Or Apple simply doesn't care and we need to wait for faster hardware/encoders or better codecs like H.265/HEVC to squeeze about 50% more data in. Forcing AirPlay onto people must make a lot of sense for Apple, because licensing should also get a bit easier and it's a closed system.

The Lightning connector may be flexible (and thus future-proof), but it looks more and more like they had to make a lot of sacrifices, because their major selling point, "thinner and lighter", is marketable to the masses, and that drives buyer interest.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
so, where'd you get your electrical engineering degree from? I'd really like to see your product portfolio so I can better weigh your technical opinion. have a link to share?

He is at least partly right - see my Lightning vs. Thunderbolt post at https://forums.macrumors.com/posts/16932463/ . Apple does have the technology that would make it possible to easily transfer even the highest-resolution uncompressed HDMI signals, let alone "simple" Full HD ones.

Of course, as we don't know how big / power-hungry / costly Thunderbolt chipsets are, it's not possible to tell whether they can be used in as small / constrained a device as an iPhone.
 

linuxcooldude

macrumors 68020
Mar 1, 2010
2,480
7,232
Reading through this blog, I see the author makes several assumptions he calls "theories", even going as far as saying "Are we off base? Let us know".

He appears to be a programmer and not an electronics engineer.
 

iSunrise

macrumors 6502
May 11, 2012
382
118
Reading through this blog, I see the author makes several assumptions he calls "theories", even going as far as saying "Are we off base? Let us know".

He appears to be a programmer and not an electronics engineer.
Yet he provided images and comparisons, along with a maximum output resolution of 900p with the Lightning adapter. He doesn't need a Ph.D. in physics or engineering, because he provided proof for his written claims.

It would of course be possible to fake something like that. I highly doubt it, though.
 

Fatalbert

macrumors 6502
Feb 6, 2013
398
0
That's the whole point - Apple can't be that dumb. I want to believe that there has to be some deeper reason behind all this. I am no tech expert, so I can't know if all this is possible or not, but what if the circuitry was to future-proof the cable for the next 10 years?

This probably seems redundant, even wasteful now, but I daresay that years down the road, we will be praising Apple for their foresight when their future products start tapping into features in the Lightning cable.

You're mixing up the Lightning cable and this adapter. Yes, the Lightning cable is built very well for the future. The adapter is not. Sure, you can update the software in the adapter, but you can't update the insufficient hardware it uses. It already can't output 1080p, according to the article (which conflicts with store.apple.com's claims).

I think Apple CAN be dumb sometimes. Ping? Game Center? The buttonless iPod shuffle? I think there are just a few idiots responsible for these screwups in every company... like whoever made the Google Pixel.
 

APlotdevice

macrumors 68040
Sep 3, 2011
3,145
3,861
See my posts above: AirPlay just can't be as effective as (real; not that of Lightning) cabled connections, unless all you do is play back iOS-native video files. It's only in the latter case that AirPlay can deliver the same quality as (old) cabled solutions. This is all laws of physics (en/decoding lag and resolution restrictions, Wi-Fi lag).

The laws of physics? Hardly. It's simply a limitation of our current technology. After all, a radio wave, like all forms of light, travels at almost three hundred million meters per second. That's safely beyond human perception.
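Purely as an illustration of how negligible the propagation delay is (my own hypothetical numbers):

```python
# Radio propagation across a living room takes nanoseconds; any lag you
# can actually perceive with AirPlay comes from encoding/decoding and
# Wi-Fi buffering, not from the speed of the radio signal itself.
c = 299_792_458              # speed of light in m/s
distance_m = 10              # hypothetical iPhone-to-Apple-TV distance
print(distance_m / c * 1e9)  # ~33 nanoseconds one way
```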
 

linuxcooldude

macrumors 68020
Mar 1, 2010
2,480
7,232
Yet he provided images and comparisons, along with a maximum output resolution of 900p with the Lightning adapter. He doesn't need a Ph.D. in physics or engineering, because he provided proof for his written claims.

It would of course be possible to fake something like that. I highly doubt it, though.

Theories, as he calls them, are not proof; even he is not sure. Just because you read it on the internet does not make it true or factual.

I'm not convinced.
 

Fatalbert

macrumors 6502
Feb 6, 2013
398
0
you would be crazy to buy this over an Apple TV

If what Panic says is true, yes. But Apple's site claims otherwise. Why am I the only person noticing this besides one other guy? It's an important mystery.

The Apple TV is an amazing device. For $100, you get a wireless video receiver at least, which alone would normally cost more, and at most a great movie-watching and music-playing box. All they have to do is release a little software update, and it can have apps... GAMES.
 

iSunrise

macrumors 6502
May 11, 2012
382
118
Theories, as he calls them, are not proof; even he is not sure.
I agree with you, but it's as close to proof as he can get with an article published on the web. I like his honest approach, in that he openly admits it's possible he made a stupid mistake somewhere and that he may be wrong.
 

yakapo

macrumors 6502
Jul 11, 2008
254
235
Since we're "back in my day"-ing, it's 12 times the hard drive space of my first Mac... and 2048 times the RAM in my first Apple device (a //c).

It's a little silly to say that Lightning can't output raw HDMI--of course it can't. Raw HDMI requires nineteen wires, which Lightning has nowhere near. The legitimate question would be whether an iOS device can output uncompressed digital 1080p video, which the existence of a mini computer in the adapter in no way answers.

It's unlikely it does, [corrected my statement after reading the linked article] given the compression artifacts noticed, but it's not impossible that higher-throughput, less- or un-compressed video could be output via a later adapter, or maybe even a firmware-modified version of this one (given that it is, after all, a full computer--it depends on the max bandwidth of its Lightning connection to the host iOS device).

Personally, I think the Lightning connector is awesome, because it's massively future-proofed; since the pins are fully reconfigurable, you can add just about anything you want to it down the line. If/when MHL (a phone-centric HDMI replacement that uses fewer wires) hits the market, you just ship a new adapter with a different internal processor to handle the transcoding. Or you produce a new 4K adapter that takes advantage of a more powerful CPU in future iDevices to stream more data through the same connector and drive a higher-res display. Or whatever Apple decides to do with it down the road.

Yes, geeks can whine about expensive cables, but the fact is that the vast majority of consumers just buy whatever overpriced cable is on the shelf at RadioShack anyway, so they're highly unlikely to notice or care. At least in this case they're actually getting some technology for their buck, rather than a gold-plated placebo with an 800% markup.

Quote
So the adapter is sort of a hack from apple to emulate airplay.
Because the lightning connector cannot output raw hdmi signal, damn that is awful. Something always gotta give with apple.

-----

2 different perspectives. One of them is totally crazy.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
If what Panic says is true, yes. But Apple's site claims otherwise.

Well, there have been numerous cases where Apple was caught blatantly lying. (Just one example: "You're holding it wrong", while at the same time posting job ads for antenna / wireless engineers, and posting (and then silently removing) videos "proving" that other manufacturers' phones also have a "death grip".) I wouldn't trust them when the Panic guys have posted at least one screenshot clearly showing the video stream is recompressed.
 

SgtPepper12

macrumors 6502a
Feb 1, 2011
697
673
Germany
All in all, Apple should have used the same chipset as with the Thunderbolt. Then, it'd be easily possible to drive external HDMI1.3-capable monitors at even native Retina (!) iPad resolutions at native 60p framerate.
Then again Thunderbolt is not Apple's technology. It's Intel's. Would Intel implement Thunderbolt on an ARM device? Don't think so. I don't even think it's as easy as putting a chipset on the logic board.
 

Exhale

macrumors 6502a
Sep 20, 2011
512
145
Well, they could have used the same technology as in Thunderbolt: 10 Gbit/s per device (20 total), which would be enough even for HDMI 1.3 (allowing for WQXGA (2560×1600) at 60 Hz and requiring 10.5 Gbit/s), let alone older / less capable HDMI versions (for example, the sub-5 Gbit/s, 165 MHz HDMI 1.0, which allows for up to 1920×1200 (that is, vertically more than Full HD) at 60 Hz). Then they wouldn't need to compress / decompress the video signal and could use a far simpler (and cheaper) adapter.
Because Thunderbolt requires an on-board chip. One that is not only large, but also extremely power hungry.
(It requires far more power than any MacBook SSD, and even many HDDs.)

You really don't want that on a Mobile Phone.

In addition, Intel sells the controllers (in bulk) for 40-50 USD, if I recall correctly.
It's for this reason that Thunderbolt isn't appearing in any devices but RAID enclosures, monitors, and occasionally connectivity hubs.

The price of a hard drive would jump by a huge amount if they used it, and power consumption would basically double. For obvious reasons, that makes it even less viable for mass storage devices like USB sticks. Forget about using it in keyboards and mice, or most other cheaper USB devices. It could be implemented in webcams to allow them to transmit an uncompressed feed, but the question is whether that matters when that stream is then compressed back down to less than a megabyte when transmitted over the Internet. Particularly now, when USB 3 already has enough bandwidth to transfer an uncompressed 1920×1080×24-bit×60 Hz stream.
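That last claim checks out on paper; a quick sketch with my own numbers:

```python
# Uncompressed 1080p60 RGB vs. USB 3.0's nominal 5 Gbit/s SuperSpeed link.
stream_gbps = 1920 * 1080 * 24 * 60 / 1e9  # ~2.99 Gbit/s of raw pixel data
usb3_payload_gbps = 5.0 * 8 / 10           # 8b/10b line coding leaves ~4 Gbit/s
print(stream_gbps, usb3_payload_gbps)      # ~2.99 < ~4.0, so it fits (before protocol overhead)
```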

Future revisions of Thunderbolt will reduce these issues - but that future was not 6 months ago, and it's not 6 months from now either.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
Then again Thunderbolt is not Apple's technology. It's Intel's. Would Intel implement Thunderbolt on an ARM device? Don't think so. I don't even think it's as easy as putting a chipset on the logic board.

Yup, this is why I posted at https://forums.macrumors.com/posts/16932541/ :

" Of course, as we don't know how big / power hungry / costly Thunderbolt chipsets are, it's not possible to tell whether they can be used in as small / constrained device as an iPhone. "
 

samcraig

macrumors P6
Jun 22, 2009
16,779
41,982
USA
No, it's simpler than that: you didn't read the article, but when someone said something positive about Apple being generous with the profit margin on the adapter, you tried to spin it into Apple being greedy.

I see that you're trying to cling to an "if you cannot prove it 100%, you cannot say I'm wrong" argument. But there is a very clear label on the RAM showing this isn't something that was simply taken off the iPad. Or are you going to say I'm still wrong because I cannot 100% prove the iPad 1, 2, 3 didn't have a hidden 256MB of RAM somewhere? ;)

No. You interpreted my comments as Apple being "greedy". Let's be clear about that. It's your assumption that that's what I meant. You'll never know what I meant for sure, because you aren't me at the keyboard. I do know what I meant, so it's all good.

I never said I was right. In fact, I'm most likely not. But just because I'm wrong doesn't make your opinions correct or factual.

Keep your winks. We both know they aren't genuine.
 

Menneisyys2

macrumors 603
Jun 7, 2011
5,997
1,101
Because Thunderbolt requires an on-board chip. One that is not only large, but also extremely power hungry.
(It requires far more power than any MacBook SSD, and even many HDDs.)

You really don't want that on a Mobile Phone.

In addition, Intel sells the controllers (in bulk) for 40-50 USD, if I recall correctly.
It's for this reason that Thunderbolt isn't appearing in any devices but RAID enclosures, monitors, and occasionally connectivity hubs.

The price of a hard drive would jump by a huge amount if they used it, and power consumption would basically double. For obvious reasons, that makes it even less viable for mass storage devices like USB sticks. Forget about using it in keyboards and mice, or most other cheaper USB devices. It could be implemented in webcams to allow them to transmit an uncompressed feed, but the question is whether that matters when that stream is then compressed back down to less than a megabyte when transmitted over the Internet.

Future revisions of Thunderbolt will reduce these issues - but that future was not 6 months ago, and it's not 6 months from now either.

Thanks for clearing this up!
 

linuxcooldude

macrumors 68020
Mar 1, 2010
2,480
7,232
I wouldn't trust them when the Panic guys have posted at least one screenshot clearly showing the video stream is recompressed.

Depends on how he did his testing. 1600 × 900 is the native resolution of the iPhone 5. So did he mirror the screen, or did he play a 1080p video, where the video would not have to be upscaled to 1080p? Not sure if that has anything to do with it or not.

Having a screenshot of some artifacts does not tell us how he did his testing.
 