lol at everyone complaining about the lack of new hardware at a DEVELOPER conference.
You should really look up all the amazing and iconic HARDWARE releases that have happened at the DEVELOPER conference
> I couldn't care less for a new Apple TV, the current one is already capable enough. Beyond a price drop, or an update to a newer SoC so that Apple doesn't have to keep producing a three-year-old design for another year, I don't think there's any need for an updated unit.

As you said, the 3rd-generation ATV 4K is pretty adequate, but it can't decode AV1 for YouTube playback and it can't output upscaled content at 8K. I think A17 Pro or M3 GPU processing with ray tracing/mesh shading would modernize arcade gaming on the ATV, and Bluetooth/Wi-Fi could see a bump too. So Apple could update it sometime for that stuff.
However, I would be disappointed if at least a new Mac Studio wasn't announced at WWDC. It's the only machine (along with the Mac mini and the Mac Pro) that's still on the M2 generation. It wouldn't make much sense for developers' machines to be kept two generations behind the latest Apple products (M4 and A17 Pro; but also M3, which is an A15-gen SoC).
> I would not be surprised if this hard push into AI makes Siri no longer optional.

The last time Apple did this, putting Siri suggestions everywhere, I immediately turned them all off. Siri is useless for anything beyond setting timers and basic stuff like that. I'm not expecting anything better now. They've had over a decade to improve Siri and they have no ideas, so there's no way they've suddenly figured it out in the last year.
> M3 is A17 Pro gen (N3B). M4 is based on the upcoming A18 Pro (N3E). M2 is an A15-gen SoC.

You're absolutely right. Keeping the Studio on M2 would still be two generations behind, though.
> As you said, the 3rd-generation ATV 4K is pretty adequate, but it can't decode AV1 for YouTube playback and it can't output upscaled content at 8K. I think A17 Pro or M3 GPU processing with ray tracing/mesh shading would modernize arcade gaming on the ATV, and Bluetooth/Wi-Fi could see a bump too. So Apple could update it sometime for that stuff.

Agreed. But 8K is so niche that they don't need to rush, and regarding AV1... well, I don't know, but I've never played a 4K HDR video on YouTube on my Apple TV 4K that seemed to need a missing hardware decoder. Ever.
> Hardware isn't Apple's problem, it's software. Most of their devices have processors that are more than enough to last many years, but the software is lackluster. The iPad is a prime example… all that hardware power but nothing on the software side to take advantage of it.

I beg to differ. First, because Apple is no longer the company of “we sell you a computer that will last for decades because OS updates are smooth and unbloated” (they used to be, but that ended after the Snow Leopard era, more or less, and now every macOS minor update is gigabytes of iOS-like bloat that makes your four-year-old Mac crawl). Besides, there is a big problem in hardware at this time: hardware-accelerated ray tracing. Because Apple decided to keep desktops on M2 (except the iMac), we are going to have to wait one more year until all Macs have hardware ray tracing. And that's a bummer. In my case, for example, I'll delay my support for ray tracing just because of this.
I would not be surprised if this hard push into AI makes Siri no longer optional.
Good, no one needs yearly updates on this stupid thing. I have the last three models in various TVs and can’t tell the difference.
> I just wish they'd spec bump things like the Apple TV more regularly to keep them as up to date as possible.

What exactly is a spec bump going to get you?
They charge a LOT for what they are, and never budge on MSRP, so the value gets worse with every day that goes by.
I just think a premium manufacturer like Apple should be held to a little higher standard here.
Nobody wants a redesign or a bunch of work going in -- just spec bump the internals to keep it up to spec, so that whoever walks into an Apple Store and forks over $130-$150 (+tax) isn't getting reamed for doing so.
> I think the most important thing right now for the Apple TV is AV1 hardware decoding. Other than that, the current hardware is kinda overpowered unless you are trying to run AV1 content.

Currently the A17 Pro is the only A-series chip that supports AV1 decoding. I don’t see them putting that into an Apple TV before it has trickled down to the regular iPhone, given Apple’s focus on supply-chain streamlining under Tim Cook. They won’t create a custom chip just for the Apple TV, and they also won’t give it an iPhone flagship chip.
> I am fine with a focus on software, but I'm more interested in, say, overhauling the macOS Music app, adding features to iPadOS so the iPad can be more of a laptop substitute, and ironing out long-standing bugs. But I know that's not what's coming. It's just going to be two hours of "AI" because that's the industry buzzword right now.

Yes. I also think it is going to be boring AI stuff that I am not that interested in.
> Neural engines, AI, Apple's just selling hype and buzzwords now.

Especially with so many bugs in basic software like networking and Bluetooth.
Right, the M4 iPads came as a surprise to me, so quickly on the heels of the M3 laptops. Had they released M3 iPads, I'd be inclined to agree with Gurman that we're not likely to see new Studios or Pros (with high-end M4 chips) until at least WWDC 2025, if not later, since the M3 Max lacks the die-to-die interconnect needed for an Ultra and Apple making a monolithic Ultra seems unlikely right now. But since they did release the M4 iPads, that changes things.
The high-end desktops are running on two-generation-old M2 technology originally released in 2022, and the latest MacBook Pro with M3 Max can meet or exceed the performance of an M2 Ultra in some respects. It makes zero sense for Apple to wait around and then release their high-end machines on aged M4 technology toward the end of 2025, when all the hype will be on the upcoming M5. It leaves their allegedly most powerful machines perpetually lagging a year or more behind in technology and eclipsed by laptops. It would also be a complete backtrack, in opposition to everything Apple's been doing with their M-series chips, which have shown nothing but an acceleration in release dates so far.
> How would there be hardware anyway? iOS 18 is a big release, and they have visionOS now; it will be a packed software event with no time for anything else.

How does automating complex tasks with AI really empower iOS 18 iPhones compared to iPadOS 18 iPad Pros with their larger displays and an optional external display? Then you have AI used on Macs with advanced Apple Silicon processors capable of driving multiple displays. You mentioned the Vision Pro, but that is using the older M2. Hopefully WWDC addresses how far Apple has taken AI in each of those OSes.
> Lots of people these days believe whatever they read on the web.

I suppose there are folks who've never learned critical thinking and might just accept any answer generative AI gives them, not verifying it for accuracy and not considering whether it is content manipulated by biased censors or safety curation. That could be considered taking away the ability to think. But we already have that particular societal problem with all manner of things other than generative AI. Another might be where it's used as a shortcut for writing "news" articles or academic papers rather than reporters, students, or researchers writing original content themselves.
I see mainstream generative AI as it is right now being useful for general questions or improving "smart" functionality, and eventually in medical and scientific fields, where it can be used to great effect in a variety of ways. What we saw with Microsoft certainly emphasizes its power to be dystopian in the wrong hands, but fearing the technology instead of learning about it does very little good. In my opinion, generative AI has the potential to become really powerful for individual users through the ability to do additional training with your own data, and with live data that it can crunch through and give preliminary results on far faster than a human. Of course the output should still be reviewed and verified by a human, since the technology tends to have flaws, but it can be a useful tool.
> Hardware announcements have been going on at WWDC since the early 2000s.

Oh, for sure; I've been tuning into the keynotes since roughly '09, for context. I was only really positing two main points: 1) Apple Silicon has been an inevitable reason for the heavy hardware focus in recent years' keynotes; 2) I feel that the pre-recorded nature of the keynotes allows them to demo products in a much more "free" fashion, unlike when Steve demoed the original iPhone (I know that wasn't WWDC, just an example) and it was the most tight-knit operation on the planet, with multiple backup units on hand in case something went wrong live.
I would also love this