No. People on this forum and people who watch keynotes are not where Apple makes, or will make, its hay. The “average consumer” just needs a price and a value proposition, along with a little desirability (aka pretty engineering or status appeal). People who really buy Apple receive “empty marketing” very well. They couldn’t give two plops about gigahertz or the number of cores.

I agree the target market is general consumers, but the keynote was aimed heavily at the enthusiast community, and this is why we are reacting. The general consumer who cares about social media, photos, and Facebook has no clue that there was a Mac keynote; heck, they probably don't know what a keynote is.

You have to understand that the tech sites, enthusiasts, and tech YouTubers are the primary target for the keynote, and they will be the messengers to the masses. You will soon see YouTubers explain why this new technology will suck or why it'll be great, why you should buy it now or why you should wait. The masses will watch the YouTubers for advice. The general consumer will probably watch YouTube rather than the keynote.

To give you another example of why the keynote is aimed at techies: everyone has a 'tech' person in their lives, the 'advisor' on whether something is good or bad and why. I'm sure if a friend or family member needs advice on a purchase, they are likely going to turn to this person.

From personal experience, no general consumer that I have ever known cares about Mac keynotes. They just don't care, period. It's true what you say: they care about pricing, if and when they have to purchase a computer (every 5 to 10 years).
 
Apple said the MacBook Air was faster than 98% of Windows PCs. That’s a simple and powerful message. If these specs are true (no reason to think they aren’t since they are likely being posted by official reviewers) then that claim looks substantiated.

but you can always say "we are faster than most of our competitors" just for marketing alone..
The true test is in the real world, where everyone is pounding on it.

the 4 "efficiency cores" seem OK, as they would make sure they each run at full speed, but like everything there is some compromise; there has to be.

and that will come out only when people have run it through its paces.
 
Yawn. 99.9% of people don't need anything faster than what was available 5 years ago. This is all fascinating from an academic perspective but means very little from a practical perspective. So I can land the space shuttle from a MacBook Air. Awesome. Will I? No.
I think the exciting part of all this is having a powerful CPU while at the same time having low power consumption. Up until now you had to give up one to get the other. Instead of just throwing in a bigger battery, they tackle battery longevity through efficiency. Apple’s DNA has always been doing more with less. This is a new model for desktop CPU design.
 
I don’t work in the industry anymore, but I did work at Exponential, Sun, and AMD for a decade. And my Ph.D. dissertation was on CPU memory hierarchies. So ...

Cost? Hard to say. Depends on how complicated the package is. But generally I would expect it to be cheaper, because connectors, boards, etc. cost money, soldering costs money, slots cost money, etc.

Repair? Harder. You have to replace the entire CPU package. Though Apple likes to solder RAM in a lot of machines anyway, so it’s not like it’s that easy to repair either way.

Physical limitations? Good question. Apple appears to be putting all the RAM on the same plane as the CPU. This means the more memory you have, the further it is from the CPU, which increases latency (and/or increases driver power). Of course, when it’s in a socket, it may be stacked, but then it’s even farther from the CPU. So overall, for now, it makes more sense to keep it in the CPU package (it also reduces impedance, making it more efficient).
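To put rough numbers on the distance-versus-latency point above, here’s a back-of-envelope sketch. The trace lengths and the ~15 cm/ns PCB signal speed are illustrative assumptions, not Apple’s actual figures:

```python
# Back-of-envelope: extra round-trip wire delay from moving RAM farther away.
# Signal speed on a PCB trace is roughly half the speed of light, ~15 cm/ns.
# Both distances below are illustrative assumptions, not measured values.

SIGNAL_SPEED_CM_PER_NS = 15.0  # ~0.5c in FR-4 board material

def round_trip_delay_ns(trace_length_cm: float) -> float:
    """Signal propagation delay there and back, in nanoseconds."""
    return 2.0 * trace_length_cm / SIGNAL_SPEED_CM_PER_NS

on_package = round_trip_delay_ns(1.0)  # RAM ~1 cm from the die
dimm_slot = round_trip_delay_ns(8.0)   # RAM in a socket ~8 cm away

print(f"on-package: {on_package:.3f} ns, DIMM slot: {dimm_slot:.3f} ns")
```

The absolute numbers are small, but at multi-GHz clock speeds a fraction of a nanosecond is a meaningful slice of a cycle, which is part of why on-package RAM helps.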

There are also ways to stack them - I have an example in my Ph.D. dissertation using a well filled with thermally-conductive goop, with RAM chips on interposers separated by diamond sheets that transmit the heat into the goop. Apple isn’t doing that :)

Performance: this is a different issue than where you put the memory. You can share memory even if it’s external. Or you can have partitioned memory even if the memory is in the same package as the CPU.

As for the performance impact, Apple says it is better. I can’t answer either way. You obviously have to worry about contention - it’s difficult for the CPU and GPU to write to the same memory area at the same time. But Apple knows a lot more about how its GPU and CPU work in concert than I do.
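The contention point can be sketched with a toy analogue: two Python threads standing in for a CPU and a GPU writing to the same shared location. This only illustrates why concurrent writers must serialize, not how Apple’s hardware actually arbitrates memory:

```python
import threading

# Toy analogue of memory contention: two writers hammering the same
# location must synchronize, and that synchronization costs time.
# These are Python threads, not a CPU/GPU pair, but the principle holds.

counter = 0
lock = threading.Lock()

def writer(iterations: int) -> None:
    global counter
    for _ in range(iterations):
        with lock:  # serialize access to the shared location
            counter += 1

threads = [threading.Thread(target=writer, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # correct total, but the lock serialized the two writers
```

Without the lock the two writers could lose updates; with it, they take turns. Hardware shared-memory designs face the same tradeoff at much finer granularity.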

thank you, I always wished the second-best model had more RAM (similar to the storage option, 512GB) instead of configure-to-order ... $1249 MBA and $1499 MBP and so on, even if it means increasing the price of the second-best model.

with the appliance model and pricing, RAM should also be part of the strategy, not just storage. That is where my RAM cost question comes from: integrated, soldered, or not, RAM is still key to the overall performance of the system.
 
Apple has done a terrible job in that industry (the Mac gets practically no AAA games at launch, and iPhone still has a large collection of mostly casual games) but if they made the proper investments I think there’s a lot of potential for them to step in. I just don’t think they have the interest. Sony and Microsoft’s launches for this generation have been disappointing, even if the PlayStation 5/Xbox Series X are incredibly powerful.
It’s a multi billion dollar industry - now’s the time to go after a slice of the pie. I won’t say bigger because right now it’s just crumbs.

They’re throwing tonnes of dollars at Apple TV+, they’d probably make their money back faster in gaming, if done properly.
 
It’s difficult to understand when the paradigm shifts.

”Up until now, the only electric cars are radio controlled toys, so no way an electric car can carry four people and go 0-60 in 3 seconds.”

Reminds me of this old article about the A6 development history.

While one group of PA Semi employees set to work on the Apple A4 processor using an ARM CPU core, another group began defining the microarchitecture for the new CPU. According to one source, Steve Jobs initially set an "insanely great" bar for the performance of the new CPU, but he eventually realized that his CPU team was limited by the same laws of physics that apply to everyone else.

I guess the spirit of Jobs lived on through Cook, so to speak. I don't think Cook gets enough credit for pouring what must've been a crazy amount of money into the project. Business-wise it would've been so much easier to save the huge outlay of R&D by just going with semi-custom or even off-the-shelf chips. That's the type of long-term planning & execution not frequently seen.
 
It’s a multi billion dollar industry - now’s the time to go after a slice of the pie.

They’re throwing tonnes at Apple TV+, they’d make their money back faster in gaming if done properly.

I was thinking the same.

It would take an innovative approach for Apple to make waves in the gaming industry, as the big players have established themselves over many years and consoles.
 
There are also ways to stack them - I have an example in my Ph.D. dissertation using a well filled with thermally-conductive goop, with RAM chips on interposers separated by diamond sheets that transmit the heat into the goop. Apple isn’t doing that :)
I recall attempts to make other materials meeting or exceeding diamond's thermal conductivity, but I think diamond still holds the crown. Does anyone use industrial diamonds in CPU assemblies these days?

Also reminds me of reading about Akhan semiconductor's attempts to make an entire chip out of diamond (for both thermal conductivity and potential device size reasons). No idea how close they are to succeeding.
 
Wow, just imagine what the higher-end products are going to deliver. As others are saying, Intel and AMD are going to be scrambling to find a way to get even close to this kind of performance per watt in the next 12-18 months, by which time Apple will be at the next level again. I wonder if the PC crowd really understands what Apple has been able to deliver, or if they’ll just be in denial?
This is good for the entire tech industry, not just Apple. Calm down, fanboi.

-texted from my iPhone 12 Pro Max
 
With this level of performance, we could see developers and software companies turning their priorities to the Mac.

What if devs like Adobe and Autodesk decided to push full apps to macOS on ARM and only limited versions of AutoCAD or Photoshop for Windows in the next 5 to 10 years?

Can Intel x86 get this performance in the long term?
It took Adobe years just to get a decent version of Photoshop to the A chips on iOS, and Autodesk software on macOS is an awful buggy mess with 10-year-old bugs still hanging around.

I understand your point but Adobe and Autodesk are two of the slowest moving corporations around.
 
If the benchmarks are so amazing and blow Intel out of the water, why didn't Apple do direct benchmarks against specific Intel chips in its keynote as it has done traditionally?

Why put meaningless "2x", "3x", "4x" stuff that can easily be disregarded as empty marketing?

Bring back Steve Jobs' Photoshop benchmarks, or some real-world meaningful test, to keynotes.
They still sell many Intel Macs and maybe don't want to bad-mouth their supplier.
 
I recall attempts to make other materials meeting or exceeding diamond's thermal conductivity, but I think diamond still holds the crown. Does anyone use industrial diamonds in CPU assemblies these days?

Also reminds me of reading about Akhan semiconductor's attempts to make an entire chip out of diamond (for both thermal conductivity and potential device size reasons). No idea how close they are to succeeding.
I think diamond still holds the crown, with silver #2 (at a fraction of the thermal conductivity). But I seem to recall some compound using boron and arsenic that was pretty close to diamond. Don’t know how far that’s gotten.

I don’t know of anyone actually using diamond, though.
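For reference, here are the commonly quoted room-temperature values behind that ranking. These are approximate textbook/press figures (the boron arsenide number comes from reported lab samples), not measurements of mine:

```python
# Approximate room-temperature thermal conductivities, in W/(m*K).
# Textbook/press-release ballpark figures, listed for comparison only.
conductivity = {
    "diamond": 2200,
    "cubic boron arsenide": 1300,  # reported for small lab-grown samples
    "silver": 429,
    "copper": 401,
}

for material, k in sorted(conductivity.items(), key=lambda kv: -kv[1]):
    print(f"{material:>22}: {k:5d} W/(m*K)  ({k / conductivity['diamond']:.0%} of diamond)")
```

Even silver, the best metallic conductor, sits at roughly a fifth of diamond's conductivity, which is why diamond keeps coming up in exotic packaging ideas.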
 
That's damn impressive. I think I'll still be holding off until they update the design but for a fanless machine, these numbers are insane.
 
Well - I don't know - that's pretty sound reasoning.

What is your plan longer term for your Windows side usage (since Bootcamp is gone)?

I'm excited about Apple chips, but I'm bummed that I'm now likely going to have to run multiple machines to maintain the Windows gaming side.
What has changed? Macs have always been way down the list in gaming. Windows PCs have always provided the best gaming experience, with consoles second best if you can't afford a decent gaming rig.
 
Graph x86 performance vs Apple CPU performance over the last decade or so. Intel would have to radically bend their curve just to keep up, and there’s no indication they can do it. Even switching to TSMC as a fab wouldn’t get them there.
I am curious how you think the Unified Memory Architecture will affect GPU performance. It seems that not having to move data to and from the GPU should have a fairly positive impact on overall performance. Also, what do you expect the upper limit will be for SoC on-chip RAM? I would love 32GB or 64GB for my next mini (not buying until it has 10Gb/s Ethernet again, but I can wait). :)
 
Reminds me of this old article about the A6 development history.



I guess the spirit of Jobs lived on through Cook, so to speak. I don't think Cook gets enough credit for pouring what must've been a crazy amount of money into the project. Business-wise it would've been so much easier to save the huge outlay of R&D by just going with semi-custom or even off-the-shelf chips. That's the type of long-term planning & execution not frequently seen.

You can’t differentiate unless you own the whole stack. iPhone would be where the Mac is if they hadn’t put in all this hard work.
 
I'm really glad I decided not to get a new MacBook Pro yet. I will gladly wait for Apple to release the chip in the MacBook Pro 16.
Well, that was the right decision: wait till this event and then decide what's best for you. But many instead chose to rush into Intel Macs "before it's too late," out of fear and ignorance. I'd love to see their faces when they see the ARM Air beating their new $3k machine.
 