Not going to change a thing for Apple and where they're headed... AAPL has known about these chips for quite a while, and they're still not delivering what Apple wants/needs.
 
But they aren’t.
The problem with this transition is that instead of increasing industry compatibility with different software solutions, we are moving away from it. Ultimately it isn't about what developers/engineers think is best; it's all about how businesses react to Apple's products. An awful lot of people buy computers based on their work environment. :)
 
No mention at all yesterday of NVIDIA blowing the competition away with their new Ampere RTX 3000 series lineup, while Apple is about to transition to in-house graphics across the line.

I have no doubt Apple can compete on CPUs in the desktop space, but come on: dropping AMD graphics from Apple Silicon Macs and not working with Nvidia over a past feud is just going to cripple them in the graphics space. Even the new PS5 and Xbox Series X look pretty pathetic compared to these new chips.
I think you should stop comparing the new unreleased consoles to these Nvidia chips. The graphics cards will start at $500-700 and go up for higher-end configurations. You pay $500 for the console and get the whole package, unlike the graphics card at $500 with nothing else. Your full PC configuration will cost about $1,000, if not more. That's twice the price of the new consoles (if the rumored $500 price is true). On top of that, games always run better on consoles and require less horsepower, since they are specifically programmed for that one machine, unlike PCs, where developers have to account for a myriad of different hardware configurations, which is why the minimum requirements are always higher.
 
ROFL! Where have you been the past few years? Intel is done. AS is the future. Intel is not gonna be able to compete.

Don't count Chipzilla out yet. They were in a similar position 15 years ago, when the Pentium 4 was a dog and AMD passed it by. Intel got their act together, the Core CPUs came out, and the rest is history: AMD was reduced to the bargain-basement junk PCs you find at Walmart for the next 10 years.

They'll figure it out.

People think the ARM Macs will deliver some sort of next-level performance, but all anyone knows is Geekbench scores, because there's no comparison against desktop- and workstation-class software or computers; you can't run those types of apps on iPads.

Just wait till they come out and ARM-optimized software arrives. Then you can judge.
 
I would reiterate that any CPU where all the cores are created equal is not a real apples-to-apples contender to Apple's upcoming A14 family of processors, especially if we're talking mobile PCs (laptops).

Intel does have Turbo Boost, though, which tackles a similar issue.

You need something with big performance cores and small efficiency cores, and an OS that fully leverages that.
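For what it's worth, macOS already exposes that leverage to developers: Grand Central Dispatch's quality-of-service classes are the hint the scheduler uses when deciding whether work belongs on performance or efficiency cores. A minimal sketch (the queue labels and workloads here are made up for illustration):

```swift
import Foundation

// Latency-critical work: eligible for the big/performance cores.
let renderQueue = DispatchQueue(label: "com.example.render", qos: .userInteractive)
renderQueue.async {
    // e.g. prepare the next frame for display
}

// Deferrable work: the scheduler may keep this on the efficiency cores.
let indexQueue = DispatchQueue(label: "com.example.indexing", qos: .background)
indexQueue.async {
    // e.g. rebuild a search index in the background
}
```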

Intel's promise for 2021 in that regard is Alder Lake. By that time we'll probably know everything about the AS MacBook and the AS MacBook Pro 14". And those may sport (per Bloomberg rumors) an 8 big + 4 small core configuration.

Right.
 
I think you should stop comparing the new unreleased consoles to these Nvidia chips. […]

Only mentioned it once and I should stop comparing? I didn't want to get into a big console discussion as it's far off topic, but I'll just say this.

As discussed many times around here in the Epic 30% battle, consoles are subsidized, so they cost more than the sale price, and they are designed to last 6-7 years without upgrades. The new-gen consoles clearly are not going to be competing on graphics. Yes, you can squeeze more performance out of a consistent console platform, but that's no longer how game developers work. They aren't hand-optimizing games; they're targeting Xbox One S, Xbox One X, Xbox Series S, and Xbox Series X all at the same time, not to mention PCs, and using the same DirectX APIs to go cross-platform. Gone are the days when Xbox console games were well optimized for a single platform. Even Sony is allowing exclusives to launch on the PC at the same time.

So they are, in fact, directly competing, and you can't fake ray-tracing performance: you either have it or you don't. Nvidia had a huge swing and a miss with the RTX 2000 series; the cards were overpriced and way underperforming for ray tracing, and some people thought it was a gimmick that wouldn't take off. Now we have the ability to ray-trace at 4K 60fps. Nvidia is clearly targeting people who are waiting to buy new consoles this fall with RTX 3070/3080 pricing that is low compared to where the RTX 2070/2080 launched. Last I heard, 4K performance on the Xbox Series X is actually upsampled, not real 4K. AMD probably could have done better with a discrete chip like older consoles had, even if it still didn't match.

Now, back to my original point: Apple needs to let Nvidia back into the game and continue working with AMD. Without discrete graphics as an option on the new Apple Silicon Macs, they are going to be unappealing to many. It would be even worse if they drop third-party graphics drivers on Apple Silicon for eGPUs and Mac Pro expansion cards.

"Go buy a PC to game" shouldn't be the refrain. Apple isn't even used that much in business, its become for of a luxury consumer brand. We need more options to make the graphics appealing, not less.
 
Boom, that's it, Apple Silicon is dead. Next step: Apple apologizes and prepares to put Tiger Lake 10nm with large cooling fans in next-gen iPads and iPhones.
I hear the next iPad will come with an Intel CPU and a water tank for cooling, with water tubes and water tank cases in a variety of matching colors.
 
In the business world, it's the software that matters, NOT the hardware!

Apple's custom Si Macs will very likely establish a beachhead with Gamers & Hobbyists first!

What good is having a Mac that mostly runs iOS apps?

Isn't that what an iPad is?

Very uninformed comment.

1. Gamers and hobbyists? Are you joking? Gamers who play on personal computers like dedicated GPUs and need DirectX and Windows to play 90% of games. Not saying the gamer cadre is exclusively like that, but most of the market is. Hobbyists? Hobbyists like to tinker and build tower PCs with interchangeable parts and overclocking potential. None of that fits Apple Silicon.
2. A Mac with Apple Silicon does not run "mostly iOS apps," as it runs macOS and has access to Rosetta 2 translation and the whole back catalog of 64-bit Mac apps that run on modern Intel Macs, along with newly gaining access to the entire iOS App Store as well.
 
Its success is going to rely on NATIVE software from the likes of Adobe, Microsoft, and others. The iPad versions of these apps are not going to cut it, and neither is emulated software. It will take native app support for this push to ARM to be successful.

It sounds like you think native support is any real amount of work.

The reality is: you start Xcode, go to project settings, turn on the ARM architecture, and build. That's it. Developers already have the test hardware in their hands, so any reasonably important app will be ready when the first ARM Mac is released.
 
These processors are being produced on a new process and won't be released "in volume" for a month or two yet. Few design wins (50) and a later release (November?) seem to corroborate that these processors won't have significant availability for months. They'll be expensive and limited to higher-end laptops, which is fine for halo products but not something the average consumer will buy. It seems unlikely Apple could acquire these in the required volumes at a price point they'd be comfortable with. The performance improvements are there (though battery life is a concern if the rumors about power consumption are true), but they don't really factor into the equation for Apple, certainly not enough to halt a transition to its own silicon.
 
So this Tiger Lake series has worse performance than the current 16” MBP?
To be fair, this is a lower-wattage part than what they put in the 16” MBP.

But I won't be the slightest bit surprised if Apple releases Apple Silicon-powered MBP13s and MBAs (which used a similar-wattage Intel chip) that beat these new chips handily.
 
I disagree. These look like great processors and might do well against AMD’s 4000 series, which are excellent. Intel is playing catch-up to AMD and these might just do it.

I'm looking forward to Apple's offerings, though. It'll be interesting to see how they stack up against these new Intel chips and what AMD has out already.
I don't see how they can increase the speed without increasing the power requirements or reducing the die size. My guess is this increases speed at the cost of battery life. The performance-per-watt metric just took a dive.
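For what it's worth, the first-order physics behind that worry: dynamic CPU power scales roughly as

$$P_{\text{dyn}} \approx C \cdot V^2 \cdot f$$

where $C$ is the switched capacitance, $V$ the supply voltage, and $f$ the clock. Since higher clocks usually require higher voltage, power tends to grow faster than linearly with frequency unless the process or design claws back $C$ or $V$.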
 
Apple still uses Intel processors in its Macs and will continue to for the announced two more years; I am somewhat confident we have not seen the last Mac refresh with Intel processors. That said, Apple does not use AMD processors. So is it really a surprise that MacRumors continues to report on Intel processors more frequently than AMD ones?

This transition will take a while and probably have some bumps along the way, just as other transitions have. There will be a lot of FUD posted here at MacRumors because, well, that's what people do. Readers have the option of getting angry about misinformation or doing a moment's research to find where the truth lies. I enjoy the angry posts until I remember their vote counts the same as mine; then I just feel fear, uncertainty, and despair. Ah, FUD-it.
Yeah, I know. There will also be a transition at MacRumors toward possibly not covering Intel at all anymore. We all have to guess when each Mac product line will receive its last Intel update. My point was that it isn't the fact that Intel keeps releasing new processors that makes them worth covering; it's that we don't know at the moment which Intel processors will be Apple's last.
 
Apple needs to let Nvidia back into the game and continue working with AMD.

There are a lot of issues that come with "letting Nvidia back", one of them being the parasitic CUDA.

Without discrete graphics as an option on the new Apple Silicon Macs, they are going to be unappealing to many.

You are arguing semantics. It doesn't matter whether you call a GPU "discrete" or not; what matters is how it performs. Apple's integrated GPUs do quite well against Nvidia and AMD GPUs when you consider the differences in configuration and power usage. If Apple manages to ship a GPU that performs similarly to what anyone else could have offered, what's the problem?
 
The reality is: you start Xcode, go to project settings, turn on the ARM architecture, and build. That's it. […]
That is, if your code doesn't contain any low-level calls to hardware-specific features (or embed third-party libraries that do). It's the code in large software packages, which contains some very old stuff, that will cause complications. The bigger the code base, the higher the risk that it has old dependencies, and the more monumental the task of going through all of it.
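A tiny, concrete illustration of the kind of baked-in assumption that survives the recompile but misbehaves at runtime. The constant here is hypothetical, purely for illustration, though the underlying difference is real (Intel Macs use 4 KB virtual memory pages, Apple Silicon uses 16 KB):

```swift
import Foundation

// Old code bases are full of Intel-era constants like this one.
let legacyPageSize = 4096              // held on x86_64, silently wrong on arm64
let actualPageSize = Int(vm_page_size) // ask the kernel instead

#if arch(arm64)
print("Native Apple Silicon build, page size: \(actualPageSize)")
#elseif arch(x86_64)
print("Intel (or Rosetta 2) build, page size: \(actualPageSize)")
#endif
```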
 
Why do some of you talk about the future with such confident certainty? Some of you are acting like you're a time traveler from 2030 who's telling us events that have already happened. In life, the people who act most confidently are often the people who are least knowledgeable and least capable.
 
Samsung can manufacture Apple Silicon, and has in the past.

Before you say that, you need to understand that it would mean a complete backend re-implementation for Samsung, since Samsung's process and TSMC's are not compatible.
It means different memory, layout, design rules, and mask sets.
Yes, they could second-source from Samsung... maybe.
 
Only mentioned it once and I should stop comparing? […] Consoles are subsidized, so they cost more than the sale price... Even Sony is allowing exclusives to launch on the PC at the same time. […] Apple needs to let Nvidia back into the game and continue working with AMD. […]
Man, you're so off topic! What does subsidizing have to do with consumers? Nothing. We pay $500, and that's what matters; the rest is for another conversation. Also, what Sony exclusives get released on PC at the same time? You're wrong here. Sony has only now started releasing some old exclusives like Horizon Zero Dawn, and that game is over 3 years old. It's been confirmed that they'll continue doing this, releasing some older games on PC going forward.
The bottom line is you can’t compare a console at $400-500 and a PC at $1000-1200. That’s apples to oranges.
 
Actually, in single-core, even Ice Lake is a fair bit faster than the Coffee Lake Refresh-H in the 16-inch. Tiger Lake even more so.

The fastest 16-inch scores 1110 in Geekbench; the fastest 13-inch Pro scores 1233. With Tiger Lake, it would score ~1400. And that's despite half the TDP.

I don't think comparing these figures is that straightforward. Geekbench is a bit weird in this regard: it is basically a sequence of very brief, bursty workloads. Ice Lake and Tiger Lake have a more agile dynamic clocking subsystem and can burst up and down a bit faster, which may skew the numbers. I think longer, intense workloads are a better basis for comparison. Notebookcheck did publish some preliminary Cinebench scores; based on those, Tiger Lake is around 5-7% faster at the same clock, which is not bad, but also not particularly impressive considering we are comparing it to a five-year-old architecture (albeit one optimized to its limits). Intel has managed to push Skylake to around 5.0 GHz; I wonder what the limit of the new Willow Cove will be.

At the same time, we don't know anything about power consumption. TDP is meaningless here; it refers to the CPU running 100% at base frequency. One thing seems clear: Willow Cove is more power-efficient at lower frequencies (a 3.0 GHz Willow Cove core reportedly consumes a similar amount of energy to a 2.3 GHz Sunny Cove, roughly a 30% improvement). But we have no idea how much energy it needs to boost to 4.8 GHz.
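Spelling out where that 30% comes from: equal energy at the two clocks means

$$\frac{3.0\ \text{GHz}}{2.3\ \text{GHz}} \approx 1.30$$

i.e. about 30% more clock cycles for the same power budget.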



Yes, it does. They have incrementally increased the cache sizes and adjusted the exclusivity a bit. IPC depends more on the architecture implementation and cache synergies than on clock speeds.

That clock-speed boost percentage is mainly the top-end turbo mark. The larger dynamic range that Tiger Lake can operate in is far more important than that percentage, especially in the context of battery life and perf/watt.

Based on the preliminary benchmarks, a Tiger Lake running at 4.8 GHz seems to be around 10% faster than a Skylake running at 5.0 GHz. I am looking forward to seeing in-depth analysis with power consumption figures. We can hope that Intel has fixed its biggest problem: ridiculous power usage at higher frequencies. If Willow Cove can hit 4.8 GHz while using 20W or less per core, that's great. If it needs 50+ watts, not much has changed.

At any rate, the GPU alone is a huge improvement.
 