I couldn't care less about a new Apple TV; the current one is already capable enough. Beyond a price drop, or an update to a newer SoC so that Apple doesn't have to keep producing a three-year-old design for another year, I don't see any need for an updated unit.

However, I would be disappointed if at least a new Mac Studio wasn't announced at WWDC. It's the only machine (along with the Mac mini and the Mac Pro) that's still on the M2 generation. It wouldn't make much sense for developer machines to be kept two generations behind the latest Apple products (M4 and A18; but also M3, which is an A17 Pro-generation SoC). (UPDATED: I was mistaken on the A-core generations.)
 
I couldn't care less about a new Apple TV; the current one is already capable enough. Beyond a price drop, or an update to a newer SoC so that Apple doesn't have to keep producing a three-year-old design for another year, I don't see any need for an updated unit.

However, I would be disappointed if at least a new Mac Studio wasn't announced at WWDC. It's the only machine (along with the Mac mini and the Mac Pro) that's still on the M2 generation. It wouldn't make much sense for developer machines to be kept two generations behind the latest Apple products (M4 and A17 Pro; but also M3, which is an A15-generation SoC).
As you said, the 3rd-generation ATV 4K is pretty adequate, but it can't decode AV1 for YouTube playback and it can't output upscaled content as 8K. I think the A17 Pro or M3 GPU, with ray tracing and mesh shading, would modernize arcade gaming on the ATV, and Bluetooth/Wi-Fi could see a bump to newer standards. So Apple could update it sometime for that stuff.
 
The last time Apple did something like this, putting Siri suggestions everywhere, I immediately turned them all off. Siri is useless for anything beyond setting timers and basic stuff like that, and I'm not expecting anything better now. They've had over a decade to improve Siri and they have no ideas, so there's no way they've suddenly figured it out in the last year.
I would not be surprised if this hard push into AI makes Siri no longer optional.
 
As you said, the 3rd-generation ATV 4K is pretty adequate, but it can't decode AV1 for YouTube playback and it can't output upscaled content as 8K. I think the A17 Pro or M3 GPU, with ray tracing and mesh shading, would modernize arcade gaming on the ATV, and Bluetooth/Wi-Fi could see a bump to newer standards. So Apple could update it sometime for that stuff.
Agreed. But 8K is so niche that they don't need to rush, and regarding AV1... well, I don't know, but I've never played a 4K HDR video on YouTube on my Apple TV 4K that seemed to need a missing hardware decoder. Ever.

Modernization is nice, but a new Apple TV would be, in my opinion, an update nobody is waiting for.
 
Hardware isn't Apple's problem; software is. Most of their devices have processors that are more than enough to last many years, but the software is lackluster. The iPad is a prime example: all that hardware power but nothing on the software side to take advantage of it.
I beg to differ. First, because Apple is no longer the company of "we sell you a computer that will last for decades because OS updates are smooth and unbloated" (they used to be, but that ended after the Snow Leopard era, more or less, and now every macOS minor update is gigabytes of iOS-like bloat that makes your four-year-old Mac crawl). Besides, there is a big problem in hardware right now: hardware-accelerated ray tracing. Because Apple decided to keep the desktops on M2 (except the iMac), we are going to have to wait one more year until all Macs have hardware ray tracing, and that's a bummer. In my case, for example, I'll delay my support for ray tracing just because of this.
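(For what it's worth, that's easy to gate at runtime rather than drop entirely. A minimal Swift sketch, assuming Metal is the renderer and treating the Apple9 GPU family, roughly A17 Pro / M3 and later, as the hardware ray tracing tier; the tier names are just illustrative:)

import Metal

// Minimal sketch: pick a render path based on ray tracing support.
// Assumption: MTLGPUFamily.apple9 (A17 Pro / M3-class GPUs) is treated here as
// the "hardware-accelerated" tier; older chips that only expose the ray tracing
// API through compute fall back to the raster path.
func rayTracingTier(for device: MTLDevice) -> String {
    guard device.supportsRaytracing else { return "none" }   // API unavailable
    return device.supportsFamily(.apple9) ? "hardware" : "software"
}

if let device = MTLCreateSystemDefaultDevice() {
    print("Ray tracing tier:", rayTracingTier(for: device))
}

That way a ray-traced path could ship now and simply light up once the M2-era desktops are finally replaced.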
 
I just wish they'd spec bump things like the Apple TV more regularly to keep them as up to date as possible.

They charge a LOT for what they are, and never budge on MSRP, so the value gets worse with every day that goes by.

I just think a premium manufacturer like Apple should be held to a little higher standard here.

Nobody wants a redesign or a bunch of work going in -- just spec bump the internals so that whoever walks into an Apple Store and forks over $130-$150 (plus tax) isn't getting reamed for doing so.
 
I think the most important thing right now for the Apple TV is AV1 hardware decoding. Other than that the current hardware is kinda overpowered unless you are trying to run AV1 content.
 
My 2021 does the thing where the screen turns black when you switch from SDR to HDR or switch frame rates, which I think is solved in a newer chipset or something on the 2022 model, but I can't be bothered to upgrade just for that.
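(For context, that blackout is presumably just the TV re-syncing when the box switches HDMI display modes; apps trigger the switch by opting into display criteria matching. A minimal tvOS Swift sketch, with a placeholder stream URL:)

import AVKit
import AVFoundation

// Minimal tvOS sketch: opting a player into "Match Content" style switching.
// The HDMI mode change this triggers (SDR -> HDR, or a frame-rate change) is
// what produces the brief black screen on the TV.
let playerVC = AVPlayerViewController()
playerVC.appliesPreferredDisplayCriteriaAutomatically = true
playerVC.player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)  // placeholder URL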
 
I just wish they'd spec bump things like the Apple TV more regularly to keep them as up to date as possible.

They charge a LOT for what they are, and never budge on MSRP, so the value gets worse with every day that goes by.

I just think a premium manufacturer like Apple should be held to a little higher standard here.

Nobody wants a redesign or a bunch of work going in -- just spec bump the internals so that whoever walks into an Apple Store and forks over $130-$150 (plus tax) isn't getting reamed for doing so.
What exactly is a spec bump going to get you?

If there are no features requiring it, other than for tech nerds who "know better", why would anyone give a damn which processor is in their TV box if it does everything it's supposed to with ease?

If your argument is that they should lower the cost the longer it goes without an update, I'd agree with you 100%.
 
How would there be hardware anyway? iOS 18 is a big release, and they have visionOS now; it will be a packed software event with no time for anything else.
 
I think the most important thing right now for the Apple TV is AV1 hardware decoding. Other than that the current hardware is kinda overpowered unless you are trying to run AV1 content.
Currently the A17 Pro is the only A-series chip that supports AV1 decoding. I don't see them putting that into an Apple TV before it has trickled down to the regular iPhone, given Apple's focus on supply-chain streamlining under Tim Cook. They won't create a custom chip just for the Apple TV, and they also won't give it an iPhone flagship chip.
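(If anyone wants to check a given device themselves, here's a minimal Swift sketch using VideoToolbox, assuming the AV1 codec constant and this call are available on the OS version you're targeting:)

import CoreMedia
import VideoToolbox

// Minimal sketch: ask VideoToolbox whether this device can decode AV1 in hardware.
// Chips without an AV1 decoder (anything before A17 Pro / M3 on Apple's side)
// should report false, and players fall back to software decode or VP9/HEVC.
let hardwareAV1 = VTIsHardwareDecodeSupported(kCMVideoCodecType_AV1)
print("Hardware AV1 decode:", hardwareAV1)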
 
I am fine with a focus on software, but I'm more interested in say, overhauling the macOS Music app, adding features to iPadOS so the iPad can be more of a laptop substitute, and ironing out long-standing bugs. But I know that's not what's coming. It's just going to be two hours of "AI" because that's the industry buzzword right now.
Yes. I also think it is going to be boring AI stuff that I am not that interested in.

But I will still eagerly download and try out the beta, as I did in previous years 😂😆
 
Right, M4 iPads came as a surprise to me, so quickly on the heels of M3 laptops. Had they released M3 iPads I'd be inclined to agree with Gurman that we're not likely to see new Studios or Pros (with high-end M4 chips) until at least WWDC 2025, if not later, since M3 Max lacks interconnects and Apple making a monolithic Ultra seems unlikely right now. But since they did release the M4 iPads, that changes things.

Apple photoshopped the M1 and M2 Max die photos. The M3 Max may genuinely not have the interconnect this time because it is a 'short term' solution. Doing an Ultra (or, even worse, a bigger package) on every single iteration doesn't make economic sense at all. Apple doesn't do the kind of volume where they can toss that class of package into the trash can every 12-14 months. Nobody else does either, even at substantially higher volumes.



The high-end desktops are running on two-generation-old M2 technology originally released in 2022, and the latest MacBook Pro M3 Max can meet or exceed the performance of an M2 Ultra in some respects. It makes zero sense for Apple to wait around and then release their high-end machines on aged M4 technology toward the end of 2025, when all the hype will be on the upcoming M5. It leaves their allegedly most powerful machines perpetually lagging a year or more behind in technology and eclipsed by laptops. It would also be a complete backtrack and in opposition to everything Apple's been doing with their M-series chips, which have shown nothing but an acceleration in release dates so far.

This is leaping to an "end of 2025" conclusion that isn't well supported. Just because Apple isn't going to do an M4 Ultra real soon doesn't mean it will then take as long as possible. WWDC 2025 is likely late anyway; the original Studio launched in March. The bigger gap in reasoning is why Apple would tie the Ultra SoC to WWDC in any year. There is about zero rational reason for that.

If the M4 Max is getting close to more complete M2 Ultra performance coverage, then Apple could just sell a Max-only Studio (and skip the Ultra 'half' of the lineup). Similar issue with the M3 Max. It is primarily Apple OCD that they would "have to update" both variants of the Studio. An M3 Max and M2 Ultra wouldn't make the sky fall for six or seven months. (Still likely gapped on RAM capacity and multithreaded performance; buying a Studio or Mac Pro primarily to win some single-threaded drag racing contest makes about zero sense.)

The whole "most powerful machines" framing is a bit of a skewed perspective. Apple sells about 75 (if not 80) percent laptops. If they have an M4 Max in a 16" MBP in Q4 2024, then Apple won't shed any tears over getting higher economies of scale by selling more laptops than they otherwise would have (i.e., folks buying more laptops and fewer desktops isn't going to faze them; that is the same general path the overall market has been moving along for over a decade).

Similarly, if the Mini/Mini Pro gets the M4/M4 Pro before the Studio and folks go to a more affordable Mac system... is that really a 'bad thing' for the overall aggregate Mac ecosystem? No. Raising aggregate unit volume will help.
[And the M1 Studio 'ran past' the most popular MP 2019 configurations on a wide set of workloads... that didn't make the M-series Mac Pro come any sooner.]


Apple's M-series ties RAM capacity to the package size. You can wave a hand at the "two generations old" M2, but if the working-set data size is bigger than a smaller-die M4 package can hold, the 'newer' silicon in the M4 isn't going to make up the difference, because it cannot hold the data in RAM. Similar with internal SSD size: the plain Mn can't address as much SSD as the larger dies, so for folks who require a larger native drive, an M(n+1) or M(n+2) doesn't do much because it doesn't meet a necessary criterion. (The M2 Mac Pro can also have more than one internal SSD, which the rest of the lineup can't. An Mn+2 isn't going to make a difference if there's no radical change in the chassis the SoC goes into.)
 
How would there be hardware anyway? iOS 18 is a big release, and they have visionOS now; it will be a packed software event with no time for anything else.
How does automating complex tasks with AI really empower iOS 18 iPhones compared to iPadOS 18 iPad Pros with their larger displays and an optional external display? Then you have AI used with Macs, with advanced Apple Silicon processors and support for multiple displays. You mentioned Vision, but that is using the older M2. Hopefully WWDC addresses how far Apple has implemented AI into those OSes. :cool:
 
I suppose there are folks who've never learned critical thinking and might just accept any answer generative AI gives them, not verifying it for accuracy and not considering whether it is content manipulated by biased censors or safety curation. That could be considered taking away the ability to think. But we already have that particular societal problem with all manner of things other than generative AI. Another example might be where it's used as a shortcut for writing "news" articles or academic papers rather than reporters, students, or researchers writing original content themselves.

I see mainstream generative AI, as it is right now, being useful for general questions or improving "smart" functionality, and eventually in medical and scientific fields where it can be used to great effect in a variety of ways. What we saw with Microsoft certainly emphasizes the power it has to be dystopian in the wrong hands, but fearing the technology instead of learning about it does very little good. In my opinion, generative AI has the potential to become really powerful for individual users through the ability to do additional training on it with your own data, and with live data that it can crunch through and give preliminary results on far faster than a human. Of course it should still be reviewed and verified by a human, since the technology tends to have flaws in its output, but it can be a useful tool.
Lots of people these days believe whatever they read on the web.
 
Hardware announcements have been going on at WWDC since the early 2000s



I would also love this
oh, for sure; I've been tuning into the keynotes since roughly '09, for context. I was only really positing two main points: 1) Apple Silicon has inevitably led them to focus heavily on hardware in keynotes of recent years; 2) I feel that the pre-recorded nature of the keynotes allows them to demo products in a much more "free" fashion, unlike when (I know this wasn't WWDC, just an example) Steve demoed the original iPhone and it was the most tight-knit operation on the planet, with multiple backup units on hand in case something went wrong live.

always love to hear someone else who wants a WWDC focused on optimization. (that’s the word I was looking for when typing out my original comment!) hope it happens sooner rather than later. iPadOS is in such a sad state and I wouldn’t mind upgrading my M1 Pro to the new M4…if only they’d make the OS less of a mess. also, for the love of ****, get the Settings app right on macOS. they really didn’t need to screw it that badly.
 