Source please. They both have neural accelerators in each GPU core.
It seems to be what the M4 Mac Mini tops out at as well. I haven't seen any third parties being able to offer more than that. Only the M4 Pro chip and above seems to offer storage above 2TB. If there is an M5 Mini, the option of 4TB internal storage might prove more interesting than incremental improvements in processor performance for those who don't need the extra grunt (and cost) of an AS Pro chip.

This article reads like it was AI generated and not fact checked. In particular "Support for up to 2TB storage" ... this is an Apple configuration max for the assembled M4 MacBook Pro (non-Pro M4) and not an M4 chip limitation.
The M5 Pro and Max versions are coming later. It was just the base model MacBook Pro that was updated. The M4 Pro and M4 Max versions are still on sale and will be updated to the M5 equivalents likely after the New Year, with the Air following sometime in the spring.

So the MacBook Pro only has a base chip now? Will the Air also have this base M5 chip? Is it the same chip inside the iPad Pro?
Why do they keep screwing with the naming? The Pro should have the Pro chip, like it has been for M1 through M4. What's wrong with these people?
This is such an important point. Apple's software does a ton of background machine learning that most people have no idea is happening. This has been true for years. Apple first included a "Neural Engine" in the A11 back in 2017 (iPhone 8 and X). That was to handle various "AI" tasks. Apple's machine learning efforts started well before then, but that was the first step with dedicated hardware.

While AI and LLMs get all the headlines these days, the underlying neural acceleration hardware boosts a host of other tasks that rely on machine learning routines. Things like the blurred backgrounds in FaceTime calls or copying text out of a photo in the Photos app all use ML and the neural hardware. Third-party developers have access to this hardware, so even if something isn't billed as AI (although everything today seems to be billed as AI for marketing purposes) it can still get a boost from the hardware improvements.
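To make the "developers have access" point concrete, here's a minimal Core ML sketch in Swift (the "ImageClassifier.mlmodelc" file name is just a placeholder, not a real bundled model): the app simply opts in to all available compute units and the framework decides whether the work runs on the CPU, GPU, or Neural Engine.

```swift
import CoreML
import Foundation

// Minimal sketch: let Core ML pick the best compute units for this model.
// Nothing here is branded as "AI"; the app just loads a model and
// benefits automatically when the underlying neural hardware gets faster.
let config = MLModelConfiguration()
config.computeUnits = .all  // CPU, GPU, and Neural Engine, as Core ML sees fit

do {
    // "ImageClassifier.mlmodelc" is a hypothetical compiled model path.
    let modelURL = URL(fileURLWithPath: "ImageClassifier.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print("Loaded model:", model.modelDescription)
} catch {
    print("Failed to load model:", error)
}
```

The point is that the scheduling is Apple's problem, not the developer's, so features like text extraction or background blur get faster on new silicon without any app changes.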
This has been discussed for years and it's ignored. Similar to how they only compare the latest iPhones to each other and ignore previous gens until a month or more later, if at all.
I believe it's down to the fact that they are using the latest device, so they don't care about other ones.
Also, this article having no real data about what it does, because "up to 45% faster" includes 0% as a number, makes it all pretty useless anyway. I'm sure the new device is faster, but I'm not sure we can state for a fact what the gain is until there are real-world benchmarks.
Do I have a YouTuber for you...

He does exactly what you are asking for: compares M-series SoC performance for Lightroom, etc.
Not much different to any other company. At the end of the day you buy what you want or what you can afford. I have an M2 Pro Mini, happy with it, and can't see me changing for a while.

I am getting really tired of Apple playing this game of "here's our brand new processor! OR…you can buy the super maxxed ultra crazy version of the previous processor at a super maxxed premium price!!!"
Isn't this what happens everywhere? AMD, Intel, and all those that use Arm chips in other devices. You buy a nice router and a few months later they bring out another one with better specs. Video cards for PCs: a new one comes out often, offering this and that. Phones are the same.

Apple needs to stop improving their silicon every year because it's angering too many people.
Why on Earth would offering more performance each year at the same or lower price anger people? Don't understand that at all. Nobody makes you upgrade until you need to.

Apple needs to stop improving their silicon every year because it's angering too many people.
Genuine question... why? I have an M1 and would like to see how it compares with the M5; why would you start the comparison at the M1 Max?
I guess if you are sticking with a laptop (M1-M4) and did not have 32GB, this new one would get you a bit more local LLM performance. You would really have to know the exact models you use in your workflow to know if the extra RAM and bandwidth would make a big enough difference to upgrade. But if you needed it, you would probably choose to get an M4 Max MBP with faster memory bandwidth and potentially more RAM.

Compared to the M4 chip that Apple launched in May 2024, the M5 delivers:
- 27.5% higher unified memory bandwidth
In addition to general performance claims, Apple published a set of specific real-world workload results showing measurable gains in AI-driven applications:
- Over 4× peak GPU compute performance for AI
- 3.6× faster time to first token (LLM)
| M4 Chip | M5 Chip |
| --- | --- |
| 120 GB/s unified memory bandwidth | 153 GB/s unified memory bandwidth |
For users whose workloads include on-device AI inference, complex 3D rendering, or other GPU-bound or memory-intensive tasks, the jump from M4 to M5 is material. The combination of per-core Neural Accelerators, higher memory bandwidth, and new GPU architecture produces multi-fold speed-ups in certain AI operations. In environments where time-to-result directly affects workflow, such as local LLMs, diffusion models, video enhancement, or ray-traced production or gaming, the M5 represents a meaningful step-change rather than a minor iteration.
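To give a rough sense of why the memory bandwidth figure matters for local LLMs: if token generation is memory-bandwidth-bound, an upper bound on decode speed is roughly bandwidth divided by the bytes streamed per token, which for a dense model is about the size of its weights. A back-of-envelope sketch in Swift follows; the 8 GB weight size is an assumed example, not any particular model.

```swift
import Foundation

// Back-of-envelope sketch: bandwidth-bound decode ceiling = bandwidth / bytes per token.
// The 8 GB figure is a hypothetical quantized model size, used only for illustration.
let weightsGB = 8.0          // assumed model weight footprint in GB
let m4BandwidthGBs = 120.0   // M4 unified memory bandwidth (GB/s)
let m5BandwidthGBs = 153.0   // M5 unified memory bandwidth (GB/s)

let m4Ceiling = m4BandwidthGBs / weightsGB   // ~15 tokens/s upper bound
let m5Ceiling = m5BandwidthGBs / weightsGB   // ~19 tokens/s upper bound

print(String(format: "M4 ceiling: %.1f tok/s, M5 ceiling: %.1f tok/s", m4Ceiling, m5Ceiling))
```

Real throughput lands below these ceilings, but the gap between the two columns tracks the roughly 27.5% bandwidth difference, which is why bandwidth-bound workloads follow that number more closely than peak CPU or GPU claims.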