Technical paper viewed by non-technical audience, hilarity ensues.
"So while Apple may not have used user attributed data, Google still knows exactly what Apple is doing in AI, exactly how their models perform, and can compare their own AI with Apple's AI since they have both. Sounds like a win-win for Google, Apple not so much. I wonder how much intellectual property Apple is giving up just to say 'We can do that too!' This sounds like a pretty big failure at the executive level."

Depends. It sounds more like hiring a private chef into your kitchen and assuming you'll become a good cook.
"Yeah I'm sure Apple would put all of the associated IP on their servers without any restrictions... and of course Google would love the potential liabilities 🙄"

This isn't exactly true. Cloud-based infrastructure usually has pretty good checks in place to protect customer privacy exactly because of this sort of thing. Microsoft can't/doesn't look at everything that's on Azure, and Amazon doesn't look at everything on their web services, for exactly this reason.
I can't believe Google has a vision for AI while Apple wasted time on a useless headset vanity project for dim Tim.
"I wonder is Apple using the Tensor chip because it uses way less power than the Nvidia chip in the same application?"

No. We are talking about server chips used for training AI models, not mobile chips. While power consumption still matters in this case, it's not as important.
Responses here are hilarious.
Apple partners with both Google and Microsoft for various cloud things. What they used (the hardware) has absolutely no bearing on the model or Apple's implementation of the model, no? That's kind of like saying "Apple sucks because they use HP in the data center (vs. Dell for example)".
Apple has an initiative called ACDC (Apple Chips in Data Center), which is how they are creating an infrastructure for AI training using their own M-series silicon.
Tensor vs. Nvidia is kind of like arguing over the brand of shovel you use to dig a trench. It does not mean that Apple Intelligence is all of a sudden a rebranded Google Gemini.
Unlike Nvidia, which sells its chips and systems as standalone products, Google provides access to its TPUs through cloud services. Customers using Google's TPUs have to develop their software within Google's ecosystem, which offers integrated tools and services to streamline the development and deployment of AI models.
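To make that lock-in concrete, here is a minimal, hypothetical sketch of what developing against Cloud TPUs looks like, using JAX (one of the TPU-supported frameworks). It assumes a Google Cloud TPU VM with jax[tpu] installed; the shapes and names are purely illustrative, not anything Apple disclosed:

    import jax
    import jax.numpy as jnp

    # On a Cloud TPU VM, JAX enumerates the attached TPU cores.
    print(jax.devices())  # e.g. [TpuDevice(id=0), TpuDevice(id=1), ...]

    @jax.jit  # XLA-compiles the function for whatever backend JAX finds, here the TPU
    def matmul(a, b):
        return a @ b

    # bfloat16 is the TPU's native matrix-unit format.
    a = jnp.ones((2048, 2048), dtype=jnp.bfloat16)
    b = jnp.ones((2048, 2048), dtype=jnp.bfloat16)
    print(matmul(a, b).block_until_ready().sum())

The point being: the toolchain (JAX/TensorFlow, XLA, the TPU runtime) and the hardware both live inside Google's cloud, whereas Nvidia GPUs can be bought outright and driven with CUDA on anyone's premises.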
"And don't forget their bad blood with Nvidia"

Apple is reportedly developing their own AI chips. To buy themselves time, it makes sense they are going with a rented solution rather than buying a lot of hardware they'd want to scrap in 2-3 years. And maybe they evaluated both Google and Nvidia and found the former to be more cost-effective.
"And don't forget their bad blood with Nvidia"

Holding a grudge doesn't make economic sense. They could easily design a contractual agreement that prevents the earlier disasters they had with Nvidia. Maybe they even did now, but Nvidia didn't accept it.
"Apple IS already reliant on Google services. Where do you think iCloud resides?? Of course they are going to choose a partner they are already doing business with. That's called business sense."

Brother, I think you are getting Google's search/ad services confused with AWS 😆
Securing Nvidia chips takes time and money; Apple only had one of the two, so they settled for what they could get. I'm concerned about the precedent this sets, because it caught them flat-footed and they're in damage-control mode: compromising on the foundational piece instead of delaying like they say they do, going with best of breed, and being best rather than first. This is a move exuding an utter lack of confidence. It's got a shiny veneer, though! Leaves me concerned for Apple Intelligence. Let's hope that doesn't become a running gag like the mention of Siri.
"While your statement is true it has no bearing on the current topic."

I would have hoped people would realize that it's an analogy and not just a random statement, lol. The bearing it has on the conversation is that just because Apple is using Google's tech doesn't mean they are throwing privacy out the window.
Apple PUBLISHED THE INFORMATION THEMSELVES in a research paper! That's the entire reason for this article.
The research paper tells the whole world how they did it, the tricks they used, and how good the results were.
There is no need for spying, Google can just read the research paper!
So at the risk of not having all the info (I did not read the white paper): did Apple use Google's chips to 'train' the models, or are they planning to use them to 'run' the models? Those could be two different use cases.
So not only was Apple reliant on Google services, but it was also comically lagging behind Nvidia’s hardware offerings. It makes you wonder what outdated tech Siri is still running on.
Apple was already in bed with Google, but this is a new leap forward that undercuts the company's privacy stance. How can you promise to maintain privacy while using Google's servers? That's idiotic! Are they giving Google all our data? Even if they remove the identifiers, it's still fingerprintable.
This disclosure feels like a way out when Apple Intelligence doesn’t perform as expected.
“It’s because we used Google Tensor chips.”
Either that or Google paid them to use Tensor. Apple rarely discloses the hardware they use to develop stuff. There was no reason to do so here, especially given the cloud-based computing and the privacy stance they have toward it.
So... are they acknowledging that the Apple Silicon Neural Engine is crap? I thought they always marketed it as the best for machine learning.