> I think they should use a ‘G’ prefix for higher-end mobile chips... then in a few years they’ll finally ship a G5 laptop!

It will be called something like M1x.
No. People on this forum and people who watch keynotes are not where Apple makes or will make its hay. The “average consumer” just needs a price and a value proposition, along with a little desirability (aka pretty engineering or status appeal). People who really buy Apple receive “empty marketing” very well. They could give two plops about gigahertz or the number of cores.
> Imagine a Beowulf cluster of these!

It’s a Mac, it won’t even run Tetris 😹
Apple said the MacBook Air was faster than 98% of Windows PCs. That’s a simple and powerful message. If these specs are true (and there’s no reason to think they aren’t, since they’re likely being posted by official reviewers), then that claim looks substantiated.
> Yawn. 99.9% of people don't need anything faster than what was available 5 years ago. This is all fascinating from an academic perspective but means very little from a practical perspective. So I can land the Space Shuttle from a MacBook Air. Awesome. Will I? No.

I think the exciting part of all this is having a powerful CPU while at the same time having low power consumption. Up until now you had to give up one to get the other. Instead of just throwing in a bigger battery, they tackle battery longevity through efficiency. Apple’s DNA has always been doing more with less. This is a new model for desktop CPU design.
I don’t work in the industry anymore, but I did work at Exponential, Sun and AMD for a decade. And my PhD dissertation was on CPU memory hierarchies. So ...
Cost? Hard to say. Depends on how complicated the package is. But generally I would expect it to be cheaper, because connectors, boards, etc. cost money, soldering costs money, slots cost money, etc.
Repair? Harder. You have to replace the entire CPU package. Though Apple likes to solder RAM in a lot of machines anyway, so it’s not like it’s that easy to repair either way.
Physical limitations? Good question. Apple appears to be putting all the RAM on the same plane as the CPU. This means the more memory you have, the farther it is from the CPU, which increases latency (and/or increases driver power). Of course, when it’s not in the socket it may be stacked, but then it’s even farther from the CPU. So overall, for now, it makes more sense to keep it in the CPU package (which also reduces impedance, making it more efficient).
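As a rough sketch of the distance/latency point, assuming signal propagation at about half the speed of light in a PCB trace (the numbers here are illustrative, not Apple’s):

```swift
// Back-of-envelope trace-delay estimate. Assumes ~0.5c propagation in a
// PCB trace, i.e. roughly 15 cm/ns (illustrative figure).
let cmPerNs = 15.0

func roundTripNs(traceCm: Double) -> Double {
    // The address has to travel out and the data has to travel back.
    2.0 * traceCm / cmPerNs
}

print(roundTripNs(traceCm: 2.0))   // on-package RAM, ~2 cm  -> ~0.27 ns
print(roundTripNs(traceCm: 10.0))  // DIMM slot, ~10 cm      -> ~1.33 ns
```

Flight time is small next to the DRAM access itself, so the bigger wins from short traces are lower drive power and cleaner signaling, which is the impedance point above.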
There are also ways to stack them - I have an example in my PhD dissertation using a well filled with thermally conductive goop, with RAM chips on interposers separated by diamond sheets that transmit the heat into the goop. Apple isn’t doing that!
Performance: this is a different issue than where you put the memory. You can share memory even if it’s external. Or you can have partitioned memory even if the memory is in the same package as the CPU.
As for the performance impact, Apple says it is better. I can’t answer either way. You obviously have to worry about contention - it’s difficult for the CPU and GPU to write to the same memory area at the same time. But Apple knows a lot more about how its GPU and CPU work in concert than I do.
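To make the contention point concrete, here’s a toy sketch (my own illustration, not how Apple’s hardware actually arbitrates memory): two writers sharing one buffer have to serialize their overlapping writes, a cost that partitioned memory never pays.

```swift
import Foundation

// Two "clients" (think CPU and GPU) sharing one buffer. Unsynchronized
// concurrent writes to overlapping ranges are a data race, so shared
// regions need a lock, fence, or ownership handoff.
final class SharedBuffer {
    private var storage = [Float](repeating: 0, count: 1_024)
    private let lock = NSLock()

    func write(_ value: Float, at index: Int) {
        lock.lock()               // the price of sharing: serialization
        defer { lock.unlock() }
        storage[index] = value
    }
}

let buffer = SharedBuffer()
let group = DispatchGroup()

for writer in 0..<2 {             // "CPU" writer and "GPU" writer
    DispatchQueue.global().async(group: group) {
        for i in 0..<1_024 {
            buffer.write(Float(writer), at: i)
        }
    }
}
group.wait()
```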
> Apple has done a terrible job in that industry (the Mac gets practically no AAA games at launch, and the iPhone still has a collection of mostly casual games) but if they made the proper investments I think there’s a lot of potential for them to step in. I just don’t think they have the interest. Sony and Microsoft’s launches for this generation have been disappointing, even if the PlayStation 5/Xbox Series X are incredibly powerful.

It’s a multi-billion-dollar industry - now’s the time to go after a slice of the pie. I won’t say a bigger slice, because right now it’s just crumbs.
It’s difficult to understand when the paradigm shifts.
“Up until now, the only electric cars are radio-controlled toys, so no way an electric car can carry four people and go 0-60 in 3 seconds.”
While one group of PA Semi employees set to work on the Apple A4 processor using an ARM CPU core, another group began defining the microarchitecture for the new CPU. According to one source, Steve Jobs initially set an "insanely great" bar for the performance of the new CPU, but he eventually realized that his CPU team was limited by the same laws of physics that apply to everyone else.
> It’s a multi-billion-dollar industry - now’s the time to go after a slice of the pie.

They’re throwing tonnes at Apple TV+; they’d make their money back faster in gaming if done properly.
> There are also ways to stack them - I have an example in my PhD dissertation using a well filled with thermally conductive goop, with RAM chips on interposers separated by diamond sheets that transmit the heat into the goop. Apple isn’t doing that!

I recall attempts to make other materials meet or exceed diamond's thermal conductivity, but I think diamond still holds the crown. Does anyone use industrial diamonds in CPU assemblies these days?
> Wow, just imagine what the higher-end products are going to deliver. As others are saying, Intel and AMD are going to be scrambling to find a way to get just close to this kind of performance per watt in the next 12-18 months, and by that time Apple will be at the next level again. I wonder if the PC crowd really understands what Apple has been able to deliver, or if they’ll just be in denial?

This is good for the entire tech industry, not just Apple. Calm down, fanboi.
> I think they should use a ‘G’ prefix for higher-end mobile chips... then in a few years they’ll finally ship a G5 laptop!

Which reminds me - I wonder how many more posts I have to make on here before I earn the label “M1”?
> With this level of performance, we could see developers and software companies turning their priorities to the Mac.

It took Adobe years just to get a decent version of Photoshop to the A-series chips on iOS, and Autodesk software on macOS is an awful buggy mess with 10-year-old bugs still hanging around.
What if devs like Adobe and Autodesk decided to push full apps to macOS on ARM and only limited versions of AutoCAD or Photoshop to Windows in the next 5 to 10 years?
Can Intel x86 get this performance in the long term?
> If the benchmarks are so amazing and blow Intel out of the water, why didn't Apple do direct benchmarks against specific Intel chips in its keynote as it has done traditionally?
> Why put up meaningless "2x", "3x", "4x" stuff that can easily be disregarded as empty marketing?
> Bring back Steve Jobs' Photoshop benchmarks, or some other real-world, meaningful test, to keynotes.

They still sell many Intel Macs and maybe don't want to talk dirty about their supplier.
> I recall attempts to make other materials meet or exceed diamond's thermal conductivity, but I think diamond still holds the crown. Does anyone use industrial diamonds in CPU assemblies these days?

I think diamond still holds the crown, with silver #2 (at a fraction of the thermal conductivity). But I seem to recall some compound using boron and arsenic that was pretty close to diamond. Don’t know how far that’s gotten.
Also reminds me of reading about Akhan Semiconductor's attempts to make an entire chip out of diamond (for both thermal conductivity and potential device-size reasons). No idea how close they are to succeeding.
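For context, here are the rough room-temperature values I’ve seen published (ballpark figures from memory; worth double-checking):

```swift
// Approximate room-temperature thermal conductivities in W/(m·K).
// Ballpark published values; exact figures vary by sample and purity.
let thermalConductivity: [String: Double] = [
    "diamond (type IIa)":   2200,  // isotopically pure samples run higher
    "cubic boron arsenide": 1300,  // the boron/arsenic compound mentioned above
    "silver":                429,
    "copper":                401,
]
```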
> I think they should use a ‘G’ prefix for higher-end mobile chips... then in a few years they’ll finally ship a G5 laptop!

MacBook Pro G5....... ..... ....
> Well - I don't know - that's pretty sound reasoning.

What has changed? Macs have always been way down the list in gaming. Windows PCs have always provided the best gaming experience, with consoles second best if you can't afford a decent gaming rig.
What is your longer-term plan for your Windows-side usage (since Boot Camp is gone)?
I'm excited about Apple chips, but I'm bummed that I'm now likely going to have to run multiple machines to maintain the Windows gaming side.
> That's damn impressive. I think I'll still be holding off until they update the design, but for a fanless machine these numbers are insane.

The Air is fanless....
> Graph x86 performance vs Apple CPU performance over the last decade or so. Intel would have to radically bend their curve just to keep up, and there’s no indication they can do it. Even switching to TSMC as a fab wouldn’t get them there.

I am curious how you think the Unified Memory Architecture will affect GPU performance. It seems that not having to move data to and from the GPU should have a fairly positive impact on overall performance. Also, what do you expect the upper limit will be for on-package SoC RAM? I would love 32GB or 64GB for my next mini (not buying until it has 10Gb/s Ethernet again, but I can wait).
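For a concrete picture of what “not having to move data” means, here’s a minimal Metal sketch (a rough illustration of the shared-storage path, not a claim about how Apple’s silicon works internally):

```swift
import Metal

// On a unified-memory machine, a .storageModeShared buffer lives in the
// single RAM pool, so the CPU can fill it and the GPU can read it with
// no staging copy or blit in between.
guard let device = MTLCreateSystemDefaultDevice(),
      let buffer = device.makeBuffer(length: 1024 * MemoryLayout<Float>.stride,
                                     options: .storageModeShared)
else { fatalError("no Metal device") }

// CPU writes directly into the same memory a GPU kernel would see.
let values = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
for i in 0..<1024 { values[i] = Float(i) }

// With a discrete GPU you'd typically copy this into a private (VRAM)
// buffer before dispatching work; here the GPU can bind `buffer` as-is.
```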
Reminds me of this old article about the A6 development history.
I guess the spirit of Jobs lived on through Cook, so to speak. I don't think Cook gets enough credit for pouring what must've been a crazy amount of money into the project. Business-wise it would've been so much easier to avoid the huge R&D outlay by just going with semi-custom or even off-the-shelf chips. That's the kind of long-term planning and execution you don't see very often.
> I'm really glad I decided not to get a new MacBook Pro yet. I will gladly wait for Apple to release the chip in the MacBook Pro 16.

Well, that was the right decision: wait till this event and then just decide what's best for you. But many instead chose to rush into Intel Macs "before it's too late", out of fear and ignorance. I'd love to see their faces when they see the ARM Air beating their new $3k machine.
> The Air is fanless....

Yes? That's why I said those numbers are insane for a fanless machine.