It’s not clear what you mean by going through Rosetta - Rosetta is a translator, not an emulator. It does a one-time translation (with some exceptions due to peculiarities of how code pages work in x86) when an app is installed or first run, but after that it is running native code. If you’re going to think about the performance impact, think of the penalty of Rosetta as being more akin to using a bad compiler (or a compiler with the optimization flags turned off) rather than the penalty that occurs when you use an emulator.
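To make that distinction concrete, here is a toy sketch (in Swift, and not remotely how Rosetta works internally; GuestOp and both functions are invented purely for illustration). An emulator re-decodes every guest instruction on every run, while a translator pays the decode cost once up front and then just runs host-native code:

```swift
// Toy illustration only: GuestOp and these functions are made up for this sketch,
// not taken from Rosetta or any real translator.
enum GuestOp {
    case add(Int)   // add a constant to the accumulator
    case mul(Int)   // multiply the accumulator by a constant
}

// "Emulation": decode and dispatch every guest instruction on every single run.
func emulate(_ program: [GuestOp], start: Int) -> Int {
    var acc = start
    for op in program {
        switch op {
        case .add(let n): acc += n
        case .mul(let n): acc *= n
        }
    }
    return acc
}

// "Translation": decode once into host-native closures; later runs skip decoding entirely.
func translate(_ program: [GuestOp]) -> (Int) -> Int {
    let compiled: [(Int) -> Int] = program.map { (op: GuestOp) -> (Int) -> Int in
        switch op {
        case .add(let n): return { $0 + n }
        case .mul(let n): return { $0 * n }
        }
    }
    return { start in compiled.reduce(start) { acc, step in step(acc) } }
}

let program: [GuestOp] = [.add(2), .mul(3), .add(1)]
print(emulate(program, start: 5))   // pays the decode cost every call
let native = translate(program)     // one-time translation cost up front
print(native(5))                    // later calls run the already-translated code
```

The point of the analogy: after the up-front translation, the steady-state cost is the quality of the generated code, not per-instruction interpretation overhead.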
True, but give me real-world application tests once the applications have been translated. A benchmark number from a processor speed tester tells me nothing about how a particular application actually runs once it has been translated by Rosetta 2. This is like getting test results at my old General Motors engine testing facility: just because the motor runs fast in a lab does not mean that, if it is not matched up with the correct transmission or tuned correctly once it is installed, you will get the same results you got when you originally tested it in the lab.
 
If the benchmarks are so amazing and blow Intel out of the water, why didn't Apple do direct benchmarks against specific Intel chips in its keynote, as it has done traditionally?

Why put meaningless "2x", "3x", "4x" stuff that can easily be disregarded as empty marketing?

Bring back Steve Jobs's Photoshop benchmarks, or some real-world, meaningful test, to keynotes.

Well they don’t have Apple Silicon ready for the 16” MacBook Pro or iMacs, so maybe they want to fully transition before they start slamming the CPUs that they are still trying to sell in some of their more expensive machines.
 
I interviewed with Dobberpuhl twice.

The first time was in 1991 or so. I remember very little about it, but I got the general impression it didn’t go well, and I did not get the job. Honestly I cannot even remember *where* the interview happened, though it must have been in Massachusetts for the Alpha team, and I do vaguely remember talking about their emulation technology (they had a way to run x86 and SPARC code on alpha).

Around 1995 I interviewed again, this time in Palo Alto for the StrongARM team. I remember that one a bit better. I recall being asked to design a RAM cell, and nailing it. But then it was time for Dobberpuhl to talk to me. And he starts with “so I remember you from 4 years ago. That didn’t go well. Let’s hope you did better today. Why did you get a Ph.D., anyway? That was dumb.” :)

Six months later I was at Exponential Technology interviewing one of the guys who interviewed me at DEC. I think we made him an offer. Can’t recall if DEC made me an offer, though I would bet they did not. If they had I wouldn’t have taken it, anyway, because the Exponential job was such a perfect match for my technical experience (I had been designing bipolar CPUs for 4 years, which was super unusual, and the folks at Exponential had actually been monitoring our research to see what we were up to since they were also doing BiCMOS).

At AMD I worked with a half dozen or so folks from the Alpha team. They were all incredibly smart (and they still are :)
I don’t understand the tech side, but this is cool to read because we had an NT Alpha. I even had my own direct line to a high-up Exchange engineer because of a bug no one else had; after all, how many people were actually running an NT Alpha Exchange server? Lol, I bet they never fixed it.

I also marvel when my Dad gets nostalgic holding his iPhone in his hand, marveling at what he could do with all that power. He was excited to get the upper 4K back in the '70s to write his telemetry code.

thank you for the stories.
 
That's a good point, but I didn't trust a single one of the '2x', '3x', '4x' claims throughout the keynote for the same reason: 4x the performance of what? To me they lost credibility with the entire Mac enthusiast community, and we are all waiting for real testing to learn the truth.
Certainly what we need are multiple, carefully-conducted, real-world benchmarks from reputable, independent sources. But now that they will finally be released into the wild, those should be coming in soon.

But as to the 2x, 3x, etc., the footnotes at Apple.com show what they are specifically in comparison to.
 
In the fine print Apple explained that many of the comparisons were to the i7 Ice Lake Air and the i7 Coffee Lake Pro. Since those are both upgraded models, if anything Apple understated the performance gains.

I really hope that's true, and if it is, I can't help but think that the message didn't really come through. That presentation magic from Steve Jobs would have amplified the product so much more.
 
What?!?!? The latest game consoles support ray tracing, 120Hz at 4K, and 60Hz at 8K, as well as adaptive sync technologies.
True. Also, for gaming I think you need to look squarely at the GPU. This GPU is more comparable to "integrated" graphics offerings such as Intel's Iris and Ryzen chips with integrated graphics. In order to have a serious gaming rig, you need a GPU comparable to the big boys, Nvidia and AMD, along with the supporting software APIs. Of course... maybe someday this will happen.
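For what it's worth, if you want to see what a given Mac's GPU actually reports (its name, and whether it shares memory with the CPU), a minimal Metal query looks something like this. Just an illustrative sketch using the standard Metal API; nothing here is specific to the M1:

```swift
import Metal

// Print what macOS reports for the system's default GPU.
// On Apple Silicon the integrated GPU should report unified memory with the CPU.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Unified memory with CPU: \(device.hasUnifiedMemory)")
    print("Recommended max working set: \(device.recommendedMaxWorkingSetSize / (1024 * 1024)) MB")
} else {
    print("No Metal-capable GPU found.")
}
```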
 
Apple said the MacBook Air was faster than 98% of Windows PCs. That’s a simple and powerful message. If these specs are true (no reason to think they aren’t since they are likely being posted by official reviewers) then that claim looks substantiated.
I can believe that, because most consumers buy Microsoft Surface products, which count as Windows PCs. I should know: I took care of Microsoft displays at Best Buy for years, and I also worked in Apple computer retail in the past, before the Apple Stores. It's just like the huge number of people who now buy Chromebooks.
 
Graph x86 performance vs Apple CPU performance over the last decade or so. Intel would have to radically bend their curve just to keep up, and there’s no indication they can do it. Even switching to TSMC as a fab wouldn’t get them there.

Hey cmaier, you seem to work in the industry, so a few questions for you:

Integrating RAM into the CPU die, what are the pros and cons?

Cost?
Physical limitations?
Repairability?
Performance, since everything will be fighting for the same 8GB or so of RAM?

I am concerned that Apple hasn't been able to offer higher-density RAM in the default configuration for the last few years; getting Macs with 16GB of RAM costs a lot ...
 
I think the question now is what Apple will do for the high-end products... We currently have a standard M1 and a modified M1 with an extra graphics core. Unfortunately these options won’t cut it for those of us who actually do need a discrete GPU. The M1 can already handle some pretty GPU-heavy tasks, but obviously Apple designed the M1 GPU to be comparable to other iGPUs, not necessarily a replacement for the AMD GPUs in their high-end products (see the comparison graphs on their website).

If Apple is indeed planning to do a 16” MacBook Pro and an iMac sometime around WWDC, perhaps they have an M1X already in the works with a graphics option that can really push things to the next level. If Apple is already getting these kinds of numbers for the low end, the high-end chip must be insane!
 
I get all the excitement, but I'm still skeptical until we see real-world numbers.

Otherwise, why would Apple have kept the "higher end" specs on the 13" MBP with Intel, if that is actually the lower end? Would they be selling a more expensive yet worse-performing machine just to keep Intel users happy? (Unless you really need 32GB RAM or 4TB storage.)
 
I can believe that, because most consumers buy Microsoft Surface products, which count as Windows PCs. I should know: I took care of Microsoft displays at Best Buy for years, and I also worked in Apple computer retail in the past, before the Apple Stores. It's just like the huge number of people who now buy Chromebooks.
The presentation said: "faster than 98% of PC laptops sold in the last year."
 
Hey cmaier, you seem to work in the industry, so a few questions for you:

Integrating RAM into the CPU die, what are the pros and cons?

Cost?
Physical limitations?
Repairability?
Performance, since everything will be fighting for the same 8GB or so of RAM?

I am concerned that Apple hasn't been able to offer higher-density RAM in the default configuration for the last few years; getting Macs with 16GB of RAM costs a lot ...

I feel like RAM with these ARM-based chips is a different animal compared to an Intel PC. Just look at the iPhone/iPad: consistently they have used less RAM than competitors. Maybe 16GB really is more than plenty.
 
Honestly, I think Intel and AMD are busily building a faster horse (to borrow from the apocryphal Henry Ford saying). Look at the Linus Tech Tips video, where he's dismissive precisely because he's coming at it from the established-approach perspective: "So you can't upgrade your memory. waaaaah waaaaah".

In my opinion, most people who buy Macs never think about things like upgrading the memory, and they don't actually care what chip is in it. They care what the machine can do. So giving up the ability to upgrade your memory by putting the memory IN THE CHIP, instead of inventing faster and faster pipelines from the CPU to the memory... it's the sort of stuff Intel and AMD aren't even thinking about, because they are stuck in the established approach of "CPU, GPU, memory, motherboard, I/O controller, etc." They make chips. Someone else has to turn them into a working physical system, and someone else has to supply an operating system that can make use of the features...

Vertical integration is incredibly powerful and I'm very hopeful for what Apple is going to do next.

I'm just waiting cautiously to see what happens with virtualisation as I use my Mac for development and I need to be able to use Postgres, Node, etc etc.

Exactly. You get it.

If the benchmarks are so amazing and blow Intel out of the water, why didn't Apple do direct benchmarks against specific Intel chips in its keynote, as it has done traditionally?

Why put meaningless "2x", "3x", "4x" stuff that can easily be disregarded as empty marketing?

Bring back Steve Jobs's Photoshop benchmarks, or some real-world, meaningful test, to keynotes.

No. People on this forum and people who watch keynotes are not where Apple makes, or will make, its hay. The “average consumer” just needs a price and a value proposition, along with a little desirability (aka pretty engineering or status appeal). People who really buy Apple receive “empty marketing” very well. They couldn’t give two plops about gigahertz or number of cores.

That's a good point, but I didn't trust a single one of the '2x', '3x', '4x' claims throughout the keynote for the same reason: 4x the performance of what? To me they lost credibility with the entire Mac enthusiast community, and we are all waiting for real testing to learn the truth.

See above. The “Mac enthusiast community” is awesome, but not remotely the target market. We got a nod with the clever “Safari is snappier” reference Tuesday, but that’s all we should expect. You are really great, but please step aside for the paying customers. ;)
 
I don’t understand the tech side, but this is cool to read because we had an NT Alpha. I even had my own direct line to a high-up Exchange engineer because of a bug no one else had; after all, how many people were actually running an NT Alpha Exchange server? Lol, I bet they never fixed it.

I also marvel when my Dad gets nostalgic holding his iPhone in his hand, marveling at what he could do with all that power. He was excited to get the upper 4K back in the '70s to write his telemetry code.

thank you for the stories.

At Exponential we spent a lot of time thinking about Alpha as our real competition. In our test lab we had NT running on machines with the PowerPC processors we had designed, and we’d informally compare our performance to theirs. I don’t remember how we compared, but I do vividly recall that Windows NT ran *much* faster than MacOS on our processors. MacOS, back then, was quite a kludgey pile of code. There were chips on Mac motherboards that nobody at Apple still understood - a lot of 68k code was still floating around in firmware, etc.

I also remember that, since our first chips came back a little slower than we wanted, while we waited for the second batch to come back we had a water cooling rig in the lab in the back of the building, and we’d test things on it to see how they’d scale, etc. (I wasn’t involved in that, other than one night, around midnight or 1am, when my buddy Alan and I started hearing alarms go off back there. As the most senior of the junior people there, I bravely went back to see what was going on, and found that one of the water cooling rigs had failed. I had to call my boss to let her know, and I’m sure she didn’t appreciate the call at that time of night.) I wish I could remember what Alan and I were working on that night. He was CAD, and I was designing the floating point unit, or maybe the floating point for the next chip, or maybe we were just both playing Joust on the arcade machine in the break room. Or maybe foosball.
 
I can’t wait until the first folks show some actual performance data. I’m a bit worried about the 16GB being shared with the graphics, especially when using Photoshop. Hopefully my mind will be at ease once these are in users’ hands.
 
So, I am out of the tech loop: are there going to be different versions of their chips, like i3, i5, etc.? Will these chips replace the ones in all their other products?
 
What?!?!? The latest game consoles support ray tracing, 120Hz at 4K, and 60Hz at 8K, as well as adaptive sync technologies.
Would have to agree. I have a 2019 16" MacBook Pro with 32GB RAM, 2TB of storage, and a Razer Core X with a 3080 eGPU running Windows 10, and it kicks butt for PC gaming. But I just got a new Microsoft Xbox Series X to have a gaming console, so once Boot Camp is no longer supported on the Mac, once the Intel processors are gone from Macintosh systems, I still have a platform to play some games. And the new Xbox is very impressive for $500. I have it running on the HDMI port of my Samsung Odyssey G7 32" gaming monitor, alongside the 3080 on DisplayPort on the same monitor. A gaming console that can run 1440p at 120Hz with 120fps is pretty amazing, but the Windows PC side still kicks its butt in a lot of aspects.
 
Hell, even Linus Sebastian is in denial...watch his latest YouTube video on the topic. I think they will see the “light” soon enough...
It is frustrating that they always seem to talk about aspects not related to the CPU (design, cost of RAM, the "walled garden", etc.).
 