First, as you note, it's an Intel-created standard, just like USB was. Apple didn't create USB but they did popularize it. (Can you name a single PC, motherboard, or peripheral that supported it before the iMac?)
Second, Apple's market share was even lower than 8% when the iMac put USB on the map. Market share can't be a prerequisite for popularizing a standard.
I see it's time to resurrect dead threads again. What, do you just search my old posts now and try to start new arguments? It sure as hell looks that way to me.
Just because you added USB first does not in ANY WAY SHAPE OR FORM mean you
POPULARIZED it. The two don't add up! If you're 4-6% of the market (at the time), how the hell do you "popularize" something? If you look at actual history, Apple AVOIDED USB 2.0 when it came out. They PUSHED Firewire 400 instead, even going as far as making it a requirement on their 1st generation iPod (the one thing Apple DID "popularize"). This was a horrible mistake (fortunately, PowerMacs could easily add USB 2.0 with a simple card, unlike today's Macs that can't add a damn thing without some massively expensive external Thunderbolt rig) that even Apple realized after a year of complaints. (How the hell do you get Firewire 400 on a PC when 95% of them never included it back then, and probably around 75% never EVER included it in the entire history of Firewire?) Apple soon caved, added USB 2.x to Macs, and switched iPods from Firewire to USB. We all know that USB 2.0 was inferior to Firewire 400, but it's precisely because Apple CANNOT "popularize" something with a small market share that their bid to make Firewire 400 the next big thing failed.
Your version of history has a 4-7% market share "popularizing" USB 1.0. Apple did switch to USB 1.0 early. But they avoided USB 2.0 for over a year after it came out (and did it again with USB 3.0), so your argument falls apart. Do you seriously think PCs adopted USB because a couple of million Macs (out of 300-400 million total PCs at the time) had it? Did the PC market adopt Firewire because ALL Macs had it? No. PCs never cared much for Firewire. Did PCs adopt AppleTalk? No. Did PCs adopt PowerPC? No. Did many PCs adopt Thunderbolt even though Apple is way more popular in general now than 20 years ago? NO, NO and still NO.
You have to have a significant market share to "popularize" something. The iPhone has a significant market share; iPhone and iPad changes do start trends. Even the Apple Watch has a chance to popularize things, because it's a new, slow-to-take-off market and people are watching Apple now. But is Apple making Lightning available to others? Is it doing something new, or just repackaging something in an EXPENSIVE proprietary connector like Apple seems to like to do (e.g. the iPhone's entire history)? No one else asked to adopt Apple connection standards because Apple would charge too much even if they offered (they like charging their customers exorbitant amounts for dongles and cables). But
no one was really watching Apple back in 1996 except to watch them self-destruct and nearly go bankrupt, which is exactly where they were quickly heading at the time. Yeah, what an excellent model to copy. A company that's going bankrupt because they lost the PC wars. Let's get us some of THAT! NOT!
Did you know that in the ten years from its release in August of 1982, Commodore sold three times more C64s (more or less feature-identical to each other) than Apple sold Macs up until the point Steve Jobs returned? Apple sold 20 million Macs between 1984 and 1997. Commodore sold over 60 million C64s by 1992, and unlike the Mac, they were the same computer designed in 1982, save cost-saving motherboard changes and the like (plus different case designs and colors). And yet Commodore still died because they could not adapt the Amiga to confront the juggernaut that was the PC. That same juggernaut would have killed Apple too if Steve hadn't come back when he did.
Your reasoning is backwards, which is why you have to stretch it in funny ways to try to make it fit the history.
There's a big difference between Thunderbolt and something like AppleTalk. Thunderbolt was created to be a STANDARD, and despite you trying to say it's all Intel's baby, you must have missed the original articles that stated Apple was working WITH Intel to create Thunderbolt. That's a big change in Apple's role: they helped "create" something that was supposed to become a standard. Apple preferred the superior Firewire 400 over USB 2.0 and rightly so. (Believe it or not, Jobs was interested in standards when he came back; every move he made pushed the Mac away from its proprietary crap and towards standards, or towards creating new standards, including the move to Intel from PowerPC.) But
Apple played a part in Thunderbolt's initial development and that's the key difference you seem to be missing when you want to paint Thunderbolt as just another Firewire adoption.
Meanwhile, Lightning is a joke. It was created just on the cusp of USB-C's appearance, which would have solved the same problem, been a standard, AND had USB 3.x speeds from the start. Now Apple can't admit that it was a mistake despite the technical advantage of USB-C over Lightning. Arguing Lightning is slightly smaller is a weak argument when most people don't want phones to get any thinner, especially if it means sacrificing battery life. Lightning's ONLY advantage over USB 2.x is the reversible connector (and some creative types made a spring-loaded, backwards-compatible reversible connector even there). Other than on the new iPad Pro, Lightning is still stuck at USB 2.x speeds, and I find it sad that Microsoft's new Lumia 950 series phones have faster wired connections than the latest iPhones, and faster charging, as a result of using USB-C. They also have wireless charging (something Apple fanboys will claim Apple "invented" on the next iPhone, or maybe the one after, when Android has had it available for years).
What I wrote that you quoted was over a year ago, and as you can see by now, Thunderbolt has made a huge change in order to try to save/re-invent itself by moving to USB-C connectors (when the USB guys didn't even want Thunderbolt to use the same connector from day one), but somehow made it "OK" by including USB 3.x within the Thunderbolt standard itself as a subset controller (even though that doesn't change the confusion factor when wondering why a Thunderbolt 3 device doesn't work when plugged into a PC that only has USB-C). Thunderbolt is
desperately trying to stay relevant in a world that doesn't really need or want it. Still, this is a good move for it for now, but the problem is it MUST stay well ahead of all future USB changes in order to keep this ruse of "1 connector to rule them all" relevant. If USB4 got a jump on Thunderbolt 4 and was faster, they'd have a real problem on their hands and Thunderbolt would be moot once again. Nothing changes the fact that Thunderbolt connections (despite the same connector cord) require more hardware to function at those speeds and thus cost more for the same device. A 6TB conventional "media" hard drive capable of a mere ~180MB/sec at maximum is not going to be ANY faster with a Thunderbolt controller than with a USB 3.1 controller, and thus it makes ZERO sense to buy a Thunderbolt version that will inevitably cost $200 more than the USB 3.1 version, ESPECIALLY if they use the same freaking port on your next-generation computer! Hey, charge me $200 for NOTHING!
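The bottleneck arithmetic is easy to sanity-check, by the way. Here's a minimal sketch (the ~180MB/sec drive figure is from above; the interface figures are nominal signaling rates I'm assuming here, and real-world throughput is always lower):

```python
# Rough bandwidth-bottleneck check: a spinning drive can't go faster than its
# platters, no matter how fast the interface is. All figures are nominal.

GBPS_TO_MBPS = 1000 / 8  # gigabits/sec -> megabytes/sec (decimal units)

drive_max_mb_s = 180                  # ~6TB "media" drive, max sequential speed
usb_3_1_mb_s = 10 * GBPS_TO_MBPS      # USB 3.1 Gen 2: 10 Gbit/s nominal
tb3_mb_s = 40 * GBPS_TO_MBPS          # Thunderbolt 3: 40 Gbit/s nominal

def effective(drive_mb_s, interface_mb_s):
    """Real throughput is limited by the slower of the drive and the link."""
    return min(drive_mb_s, interface_mb_s)

print(effective(drive_max_mb_s, usb_3_1_mb_s))  # 180 either way --
print(effective(drive_max_mb_s, tb3_mb_s))      # the drive is the bottleneck
```

Same 180MB/sec on both links, so the $200 Thunderbolt premium buys you nothing for this class of drive.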
I see ONE major advantage of Thunderbolt III that might make it viable even so, and that is external graphics card hubs that would essentially let you use ONE WIRE to connect and turn your notebook into a desktop capable of gaming or graphics workstation duty when docked. (Yeah, you "can" do this with Thunderbolt II, but it's too slow and wasn't directly designed around this, which means it costs more, whereas Thunderbolt III has this in mind from day one.) Given desktops are already a dying breed compared to mobile platforms, this would indeed be a useful trick to spare someone from having to own an entire second computer just for home/game use, but ONLY if said dock costs significantly less than an entire second computer, and sadly that hasn't really been the case in the PC world, where $600 + a graphics card is insane when you can build a gaming computer for $1200 no problem (less if you don't need 4K gaming). Mac users would crap their pants for one, though, since there is no gaming Mac at any cost.
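To put the "too slow" part in rough numbers, here's a back-of-envelope sketch (nominal link rates, my own assumed figures, not benchmarks; in practice Thunderbolt 3 also only tunnels roughly a PCIe x4 link to the device, so the real gap is even narrower than raw link rate suggests):

```python
# Rough comparison of external-GPU link bandwidth vs. a desktop PCIe slot.
# All figures are nominal link rates, not measured benchmarks.

pcie3_x16_gbps = 16 * 8   # PCIe 3.0: ~8 Gbit/s usable per lane x 16 lanes
tb2_gbps = 20             # Thunderbolt 2 link rate
tb3_gbps = 40             # Thunderbolt 3 link rate (double TB2)

for name, gbps in [("Thunderbolt 2", tb2_gbps), ("Thunderbolt 3", tb3_gbps)]:
    share = gbps / pcie3_x16_gbps
    print(f"{name}: {gbps} Gbit/s, ~{share:.0%} of a PCIe 3.0 x16 slot")
```

So even Thunderbolt III gives an external GPU only a fraction of a desktop slot's bandwidth, but doubling Thunderbolt II's rate is what makes docked gaming plausible at all.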
I don't know how you come to that conclusion.
Apple (post-1997) hates creating standards. They learned that the hard way, from their first 20 years. They use whatever's out there, as long as it's good enough for them. (This is similar to what Seymour Cray did.)
I think you mean pre-1997 and post-2011, as Jobs clearly pushed for standards: moving the Mac from an obscure IBM processor to the world's x86 home standard and making Mac software easier to port, making sure Macs had standard PCI buses on the Pro "trucks" (broken by Jony Ive with his custom GPU connectors and no expansion slots), supporting OpenGL (a standard) instead of licensing DirectX (which I'm sure Microsoft would have licensed to them at a price, especially given they needed the Mac to pretend they weren't a total monopoly), standard gigabit Ethernet at a time when most PCs only had 100Mbit, adding HDMI even to notebooks (now largely gone again), etc. etc. Steve's Apple only went proprietary when necessary, not just to make an extra buck on sucker users. Going proprietary to soak users was the OLD Apple that wasn't under Steve's control, and sadly it appears to be the new Apple under Tim Cook's control.
They needed a good cheap pluggable interconnect for the iMac to replace ADB, and USB was good enough, so they used it.
Don't you mean that Apple was in dire straits at this point and couldn't afford to roll their own incompatible standard?
I maintain the
old non-Jobs Apple would have gladly sacrificed compatibility with the IBM PC clone market if it meant they could SOAK a large enough Mac market for overpriced proprietary hardware. They are still soaking the market today with dongles that cost 2-4x what they should, released at times when it's questionable to do so. (I.e., no one is using Mini DisplayPort, so shouldn't it at least come with a free dongle? No, why give away for free what you can charge over $20 for when it's a $1 part and you can pocket $19 profit?) Hey, the new post-Jobs Apple isn't so different, after all.
When they were looking into networking solutions in the late 1990's, WiFi was not yet a clear winner (remember those days?), but Apple found it to be good enough so they put their weight behind it.
Apple was also nearly bankrupt at the time, as I've mentioned, so there was no capital to invent something new. Besides, as I've indicated, once Steve returned circa 1997 it was more like NeXT than Apple at that point (you know, the Apple that put Steve in virtual "Siberia" in 1985 and caused him to leave since they would not let him work on the Macintosh any longer).
Just as the iMac was the first major computer to have USB, the iBook was the first major computer to have built-in wireless.
And again, being one of the first doesn't mean you "popularized" something. Major computer? In what sense was the iBook ever a "major" computer anyway? It was a blip on the radar at most.
When Apple needed a fast pluggable interconnect in the 1990's, they found (but did not create) IEEE 1394, and rebranded it as Firewire. When USB finally caught up, Apple dropped Firewire.
You mean a few years ago, with USB 3.0? Apple didn't "drop" Firewire until long after it was an obvious commercial "failure", and some would say keeping it was a good move, since professionals used Firewire even if most regular users did not. Even so, dropping it has had consequences for many commercial audio rigs that still use Firewire (somehow adapters never work quite as well).
Regardless, I don't know WTF you're trying to argue, really. Your entire post is predicated on the difference between "helped create" vs. "adopted" (despite Apple's obvious hand in Thunderbolt's development; they're also on the USB board, BTW, so they have a "hand" in USB standards as well) and on trying to change the REASONS Apple did what it did from "greed" to "convenience". Otherwise, you're preaching to the choir.
When Apple needed a fast flexible thin connection for iPods in the 2000's, they created the 30-pin connector. Then when they needed something smaller, they created the Lightning connector; Micro-USB lacked the features they needed (reversibility, physical robustness). Today, the new USB-C connector has finally caught up, so they'll eventually replace their custom connector with that.
You went from arguing that Lightning was better than USB-C (being slightly smaller than it) in the other thread to "eventually replace" here.
That's like saying I was right all along without admitting it, and then starting an argument here about something I said over a year ago in the context of "they don't create standards; they adopt them!" when clearly modern Apple does help create standards, and they conveniently switch to whatever sells them high-priced adapters and dongles as well. Apple still loves to overcharge. The only difference is that for a while there, the higher prices bought you premium features (there was no better notebook for the dollar in 2008 than my Macbook Pro; it even ran Windows Vista better than the rest of the PC world). Now it just buys thinner and thinner and thinner instead of better and better and better, and sadly Steve's own gauntness in his last few years on earth seemed to mirror his obsession with "thin". Apple is still obsessed with it even when it means their GPUs are pitiful compared to machines that cost half as much.
If you assume that Apple's goal is to create proprietary standards
It's funny, but I don't recall saying that about modern Apple. The OLD Apple used to do that (well, I wouldn't call them "standards" exactly, since only Apple ever really used them) in the '80s and early '90s (got to be different), and it nearly bankrupted them. The Tim Cook Apple had better be careful what it does, because history has a way of repeating itself. For a company that touts the latest Intel CPUs all the time, they ignored USB 3.0 for a long time while throwing Thunderbolt out there when it wasn't useful to anyone (nothing available for it) and Mini DisplayPort out there when no one had it (and DisplayPort hadn't taken off), leading to dongle land once again.
NO ONE ELSE has adopted Thunderbolt + Mini DisplayPort in any quantity, and now it's moving to USB-C connectors while Apple seems to be sitting on the fence trying to make up its mind whether it wants to support that standard or not. They put it on ONE 12-inch MacBook and NOTHING else since then (and there's been a whole new release cycle since then), and even the model they put it on has only ONE port (including charging), so it's a rather poor, POOR implementation requiring (you guessed it) more adapters and hubs. They SHOULD have put it on ALL the rest of the Macs as an extra port for now until it catches on, giving flexibility, and all larger models should have at LEAST two ports, preferably more, since it does everything all the other ports put together did.
If Apple doesn't put Thunderbolt III in a USB-C package next spring, then they are stupid and the Mac deserves to die. This is their one chance to get out there early with a notebook that can connect with one wire to an external graphics card + hub AND support the new standard that will be EVERYWHERE within three years. Wait on this and I know I'm moving on. I've been holding off on my next notebook AND desktop purchase because of the Thunderbolt III + USB-C promise of a combination computer/dock setup that will make my life simpler. But I'm not going to wait forever when PCs will be all over this.