Intel's Anti-Mac Ad Campaign Highlights M1 Shortcomings

In my case, using my Mac for work is quite a pain: I can't read work emails on it, etc., due to "security" and a lack of IT support. The official solution is Citrix, which is quite unpleasant. That said, I do use my Mac for work :)
Do you guys run Exchange for mail, or use some lawyer-specific solution? :) Based on what you said, I presume the issue is that your firm does not support device management for macOS, and so they will not let you connect to the corporate mail server (and other corporate systems), rather than a lack of macOS clients for those systems?
 
Like you, I run an internet-based business, but ours is cloud native running in Kubernetes. I don't buy machines, I just rent cores on demand. My workload doesn't care about the physical CPU, and my cloud provider just tells me they can either be Intel or AMD. I could get Intel specifically, and pay a premium, but it would be a waste.

For workloads like mine, which are becoming far more common, the CPU is a commodity. If Intel depends on some kind of brand differentiation, then expect a long, slow decline as servers migrate to this model.
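The "CPU is a commodity" point shows up in how cloud workloads are declared: you ask for quantities of cores and memory, not a vendor. A minimal sketch of a Kubernetes-style pod spec written as a Python dict (a real manifest would be YAML; the image name is a made-up placeholder):

```python
# A Kubernetes pod spec asks for amounts of compute, not a CPU vendor.
# Sketch of the relevant fragment as a Python dict (real manifests are YAML).
pod_spec = {
    "containers": [{
        "name": "api",
        "image": "example/api:1.0",   # hypothetical image name
        "resources": {
            "requests": {"cpu": "2", "memory": "4Gi"},
            "limits":   {"cpu": "4", "memory": "8Gi"},
        },
    }],
    # The architecture *can* be pinned via the standard node label if needed,
    # but the vendor (Intel vs AMD) normally cannot be expressed at all:
    "nodeSelector": {"kubernetes.io/arch": "amd64"},
}

# Nothing in the spec names Intel or AMD; the scheduler just finds capacity.
print("cpu request:", pod_spec["containers"][0]["resources"]["requests"]["cpu"])
```

Note that `kubernetes.io/arch` distinguishes `amd64` from `arm64`, so even the Intel-vs-AMD distinction is invisible at this layer.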

The way you're doing it is by far the more popular way, and you're completely right that people buying into cloud infrastructure won't see or get to choose the hardware beyond "Intel, AMD or ARM" and RAM/SSD-quantity type choices.

For us, we've always owned our infrastructure, as it's been very cost-effective and gives us total control. If we need a server with a QuickAssist cryptographic card or an FPGA card, we can just send one to the DC and have it installed for us; if we need several servers connected together with 100 Gb/s networking, we can send the cards and cables and have it done, etc. But I totally understand that our situation is not at all normal.

I'm friends with a few people who run large datacentres, whom I met through their use of our service, and they generally use whatever gives them the best return on investment. At the moment that's used Intel kit, but one of my friends has invested in over 1,500 EPYC 7002-based machines to offer to their cloud customers.

Taking the 32-, 48- or 64-core models and dicing them up into 1-2 core instances with 2-4 GB of RAM each has been very profitable for them. It has really helped them put more customers in the same physical area while simultaneously reducing power and cooling costs, which together make up the vast majority of the DC's operating cost.
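The density arithmetic above can be sketched with a quick calculation. All the host figures here (thread count, RAM size, reserved overhead) are illustrative assumptions, not the poster's actual configuration:

```python
# Illustrative sketch: how many small VM instances fit on one EPYC host.
# All host numbers below are hypothetical, chosen just to show the arithmetic.

def instances_per_host(host_cores, host_ram_gb, inst_cores, inst_ram_gb,
                       reserved_cores=2, reserved_ram_gb=8):
    """Instances that fit, bounded by whichever of CPU or RAM runs out first.

    reserved_* models the hypervisor/host overhead set aside per machine.
    """
    usable_cores = host_cores - reserved_cores
    usable_ram = host_ram_gb - reserved_ram_gb
    return int(min(usable_cores // inst_cores, usable_ram // inst_ram_gb))

# A 64-core/128-thread EPYC 7002 host with 512 GB RAM, sold as 2-core/4 GB VMs:
n = instances_per_host(host_cores=128, host_ram_gb=512, inst_cores=2, inst_ram_gb=4)
print(n)  # 63 instances per box: this is where the density gains come from
```

Packing dozens of paying customers onto one physical box is exactly why the per-customer share of power and cooling drops so sharply.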

I know Amazon EC2 and Microsoft's Azure both already have EPYC 7003 "Milan" CPUs, as AMD gave them a whole bunch before the whitebox crowd gets access, and I think they're pretty much all-in on EPYC for the same reasons as my friend: the RoI if you're buying new is off the charts with these processors.
 
Those are really bad ads... they could have just talked to people using Macs to get better digs on performance. Looking forward to finally upgrading to the M1.

Exactly. The Mac users who are using M1 are typically pretty happy with its performance.

And here's the funny part... the M1 is the slowest Apple Silicon chip that will ever exist. And it's still amazing.

Imagine what an M1X or M2 will be able to do. Apple has only just begun!
 
I remember a couple of years ago many lamenting that Apple had turned its back on the Mac line and was just going to pursue iOS and iPadOS devices. The M1 has ushered in a new era for the Mac. I’m looking forward to getting an M-powered Mac.
 
I can order a 10700 or 10700K right now on Amazon for MSRP or below. I could get a 5800X, 5900X, or 5950X, but I’d have to pay a $300 to $500 premium from outfits I’ve never heard of.

The 10700K scores a pathetic benchmark result: less than 70% of the speed of a two-year-old Ryzen 9 3900, and half the performance of the 5900. If you want to compare on performance, the Intel is more on par with the old AMD Ryzen 7 4700, which is not only the same price as the Intel but still faster as well.

 
Do you guys run Exchange for mail, or use some lawyer-specific solution? :) Based on what you said, I presume the issue is that your firm does not support device management for macOS, and so they will not let you connect to the corporate mail server (and other corporate systems), rather than a lack of macOS clients for those systems?

Correct. Device management for iOS, but not the Mac. The technology exists; they just don’t want to use it.
 
Someone got super defensive 🤣

But did you know:

- MetaHumans is a browser tool that will run on a Mac
- Unreal Engine is primarily accelerated by GPUs (which current Mac Pros have plenty of juice to run), not CPUs, which is what the joke is referring to (more specifically Intel, as AMD is doing wonders for the PC)
- Unreal Engine on macOS is still supported by Epic Games

Why you chose the worst example for your argument of choosing a PC over a Mac is beyond me. You could have selected the most popular games on PC and chosen the worst one that runs on a Mac, and that would have worked much better, but you couldn’t even do that. 🤣

Anyways enjoy your coffee. I’m done with you 👋
Err...

MetaHumans is browser-based, yes, but it only serves the compositing purpose, not the rendering purpose; for the final result you need UE.

Yes, Unreal is accelerated thanks to GPUs, but UE support on the Mac is far from good, not because of Epic, but due to Apple's older hardware choices and poor rendering-API feature support. On Macs it only serves lower-end, older graphics games well. The current Mac Pro has nothing near the top-of-the-line NVIDIA Quadro or GeForce cards to offer.

I don't think you'll see an AMD/NVIDIA/Apple Silicon mixture in the next Mac Pro, which makes me wonder what Apple has to offer in comparison in an SoC that includes the graphics.

They would be competing with NVIDIA, AMD, and Intel at the same time, and Apple surely has nothing (feature-wise) like that in the pipeline.

Thanks, but it's too late for coffee at 20:55; drinking tea!
 
As per Robert Frost, instructor and flight controller at NASA: “Apple computers are quite common at the more research-oriented centers and very much less common at the operations-oriented centers.”
 
Because I write CAD software, use CAD software all the time, and it works fine with Metal. There are lots of CAD, EDA, and other CAM tools that support the Mac. Those that don't hold back for reasons other than technical ones: they simply find the potential audience too small.
Nice, I don't write CAD applications, but CAD on Macs is not an industry standard.

I use mainly CATIA, PRO/E here in Germany (car industry), and I never saw a Mac being used for CAD.
It's always Windows and other UNIXes. I doubt Tesla, SpaceX, or General Motors mainly use Macs to build their vehicles. iMacs in the marketing department and for desktop publishing, yes.

Even the movie industry, which builds heavily on Autodesk Maya, RealFlow, Houdini, etc., uses Windows and Linux for all its main tasks.

Apple will never ever get deep into these areas, regardless of which GPU or CPU they announce, which makes these "Apple will crush PCs (Intel/AMD)" statements moot. Apple will always stay mainly in the lifestyle, gadget, and general-purpose consumer device market.
 
Even the movie industry, which builds heavily on Autodesk Maya, RealFlow, Houdini, etc., uses Windows and Linux for all its main tasks.

Apple will never ever get deep into these areas, regardless of which GPU or CPU they announce, which makes these "Apple will crush PCs (Intel/AMD)" statements moot. Apple will always stay mainly in the lifestyle, gadget, and general-purpose consumer device market.

So it's especially weird that Intel seems to be worried or bothered by what Apple is doing.

Didn't the new CEO of Intel call Apple a "lifestyle company" the other day?

But with these ads... it seems that Intel clearly has a bee in their bonnet about Apple.
 
While Alder Lake might improve things, Rocket Lake at 14nm is only competitive with Zen 3 in single-core, at the expense of massive power; maybe that matters a little less in a desktop, but there it is.
I’ve heard that before, and I don’t get how “only by drawing more power” is meant to diminish the success of Intel’s work. Intel’s design can handle that power and turn it into speed. I may be being naive here, but I’m assuming that in desktop, where power consumption isn’t a concern, both chip makers will draw as much power as necessary to get the maximum speed they can. If Intel can take more power and turn it into more speed, then that reflects well on them, doesn’t it? Wouldn’t AMD do the same if they could?
 
Sure it is true.
Name me a single Mac that had the top-of-the-line graphics card at its release date.
The graphics cards they shipped were always 1-2 years behind. Well, they were a "special" edition, but tech-wise behind the rest. Apple's M[n] will get its feet into the plain consumer area and push ARM development, but you won't see the professional 3D and high-end game creation industries flooding over to Apple.

Thanks for making it super easy for me! ;) The 2020 MacBook Pros have AMD 5000-series GPUs. That is currently the best lineup AMD has for laptops. If you want to argue that AMD laptop GPUs are behind NVIDIA laptop GPUs, sure! But that somewhat undercuts your proposition that AMD GPUs are light-years ahead of Apple, when Apple uses their GPUs and the M1's integrated GPU has only just been released.

Also, technically the M1 GPU is an answer to your query as well. It's currently one of, if not the, best integrated graphics chips on the market.

Others have already dismissed your statement that CAD programs don't work on the M1; they do. And your argument that Intel GPUs are superior to Apple's GPU is that Intel won an award for its API? That doesn't speak to its consumer hardware (in fact, any of its hardware), which is currently only the Xe (well, competitive hardware anyway).

You're left with the claim that Apple won't release server-grade/workstation GPUs. Possibly true; at this point it's totally unclear what Apple's roadmap is. But I highly doubt Apple will stay with just M1-class GPUs over the next two years, and the rumors indicate that Apple will be releasing their own dGPUs and possibly workstation GPUs as well.

This is how people who follow the industry for a living view Apple and graphics:

Apple, of course, has long held a reputation for demanding better GPU performance than the average PC OEM. Whereas many of Intel’s partners were happy to ship systems with Intel UHD graphics and other baseline solutions even in some of their 15-inch laptops, Apple opted to ship a discrete GPU in their 15-inch MacBook Pro. And when they couldn’t fit a dGPU in the 13-inch model, they instead used Intel’s premium Iris GPU configurations with larger GPUs and an on-chip eDRAM cache, becoming one of the only regular customers for those more powerful chips.


So it’s been clear for some time now that Apple has long-wanted better GPU performance than what Intel offers by default. By switching to their own silicon, Apple finally gets to put their money where their mouth is, so to speak, by building a laptop SoC with all the GPU performance they’ve ever wanted.

Also I don't expect Apple to "crush" Nvidia, AMD, or Intel. But they have released a very good, very competitive integrated GPU. While Apple has annoyed me in more ways than one about this, arguing that Apple doesn't care about graphics and will never ship a competitive GPU is easily disproved by the fact that they already have.
 
Nice, I don't write CAD applications, but CAD on Macs is not an industry standard.
There is lots of CAD done on Macintosh systems, and with the improved performance, I would not be surprised to see some of the Linux applications offer macOS versions (even running under X). Until now, there was no reason to bother to port to macOS. They offered the same Intel chips and no matter how great their industrial design or how reliable their systems were, it did not matter to that crowd. However, Apple Silicon will give them a chance to distinguish themselves. If Apple can continue to deliver better performance than their competition, they will finally have a compelling reason for these companies to consider porting.
I use mainly CATIA, PRO/E here in Germany (car industry), and I never saw a Mac being used for CAD.
Certainly not for PRO/E as despite their promises, Parametric never released a macOS version.
It's always Windows and other UNIXes. I doubt Tesla, SpaceX, or General Motors mainly use Macs to build their vehicles. iMacs in the marketing department and for desktop publishing, yes.
You do understand that there are lots of other places that do design besides automotive and aerospace, right? Many smaller shops use Macintoshes, as they have software that solves their CAD needs and is cheaper/easier for them to maintain. That is why AutoCAD, ArchiCAD, and Graphisoft all have macOS versions.
Even the movie industry, which builds heavily on Autodesk Maya, RealFlow, Houdini, etc., uses Windows and Linux for all its main tasks.
Given that you are now talking about my industry, I have to say you are wrong. Some studios are primarily Linux, some primarily Windows, some primarily macOS. Many have a mixture. The fact that Maya, Cinema 4D, etc. all have macOS versions indicates that there is a profitable market for them to maintain the software.
Apple will never ever get deep into these areas, regardless of which GPU or CPU they announce, which makes these "Apple will crush PCs (Intel/AMD)" statements moot. Apple will always stay mainly in the lifestyle, gadget, and general-purpose consumer device market.
Apple Silicon will make them competitive in many markets for the first time in a long time. I expect to see Macintosh systems in more areas moving forward, rather than fewer.
 
I’ve heard that before, and I don’t get how “only by drawing more power” is meant to diminish the success of Intel’s work. Intel’s design can handle that power and turn it into speed. I may be being naive here, but I’m assuming that in desktop, where power consumption isn’t a concern, both chip makers will draw as much power as necessary to get the maximum speed they can. If Intel can take more power and turn it into more speed, then that reflects well on them, doesn’t it? Wouldn’t AMD do the same if they could?
It's because drawing more power is generally a sign of an inefficient design/fabrication. You've got it back to front: AMD can achieve the same (and better in multicore) with less power; if Intel could do that, they would. It's one of the reasons (along with die size) why Intel dropped core counts in its desktop line. It is true that in desktops, well, big towers/workstations anyway, power draw is less of a concern than in laptops, but it still means more/louder cooling is needed to keep the chip from melting and, for anything meant for more continuous operation like HPC, higher costs. In smaller desktops, power draw can still matter for performance, as they can be harder to cool.

Oh, and I forgot a big one: sustaining higher power through the silicon to achieve better, stable performance also means tighter binning (quality control) on that silicon, which is more expensive, so higher costs. Truthfully, Rocket Lake was never meant for 14nm; it had to be backported from 10nm because of Intel's fabrication woes (again, why Intel had to drop total cores). It still wouldn't quite be as good as Zen 3, just as Tiger Lake (essentially mobile Rocket Lake) isn't as good as mobile Zen 3, but it would be better.
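The efficiency argument can be made concrete with a toy performance-per-watt calculation. The scores and wattages below are invented round numbers for illustration, not real benchmark results:

```python
# Toy perf-per-watt comparison: two chips hit the same benchmark score, but
# one draws twice the power. All figures are made-up round numbers.

def perf_per_watt(score, watts):
    """Efficiency metric: benchmark points delivered per watt consumed."""
    return score / watts

chip_a = perf_per_watt(score=1000, watts=250)  # the "more power" design
chip_b = perf_per_watt(score=1000, watts=125)  # the efficient design

# Same headline performance, but chip_b does it on half the power: twice the
# efficiency, meaning less heat to remove, quieter cooling, and lower running
# costs at the same speed.
print(chip_b / chip_a)  # 2.0
```

This is why "matching performance by drawing more power" reads as a weakness rather than a strength: the efficient design has headroom left, and the inefficient one is paying for every extra point in heat and cooling.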
 
Because products like Octane X, Cinema 4D, AutoCAD, and others with more demanding requirements work fine with Metal. That those companies have never bothered to port has nothing to do with limitations in Metal and everything to do with their perceived market dynamics.
Btw, I also started using Maya back when it was called Alias PowerAnimator, with a nice Motif-based UI. Maya on macOS has always been a "game of luck": it loves to lag, slows down, crashes with heavier meshes, and PaintFX has hardware-overlay problems. Dunno about AutoCAD, but I expect the same. C4D works great on macOS! The ones who "really" use Octane use it mainly on PCs with the newest NVIDIA+CUDA tech. Cost/value/speed-wise it isn't even worth buying a Mac for this kind of work.
 
My only question at this point is this: do M1 chips throttle due to heat?
Sure they will, and it's good that they do, but does it matter that much?
I'm happy that my general-purpose M1 MBA is quiet and fanless, and I accept the throttling (haven't noticed it so far).
Know your usage, and choose wisely...
 
M1 Macs are going to be a hard sell to big corporations with no Boot Camp or Windows/macOS dual boot.
Not to mention the lack of upgradability, and they're probably tough to repair.
Whatever you buy at the time of sale on an M1 Mac, you're stuck with.

But of course the die-hard Apple fanboys don't, or refuse to, think of this.

I have worked in IT for Big Corporations.

I know how they feel about Windows, backwards compatibility, cost of ownership, and ease of repairs.
The lack of x86 virtualization hurts, but...

Big corporations don’t tinker with their notebooks. Heck, we’re only 850 users and we don’t even do that.

Not being able to upgrade a notebook is a bad straw man IMO.

I could see this being a problem for a small business or non-profit, but honestly, at a large corporation, who still has a notebook after 3 or 4 years and then expects some guy to come blow it out and drop SODIMMs in it!? 😂
 
I can edit 8K RAW video in real time on a Mac that costs $699; try that on a $699 PC. If I'm going to play games, I'll buy a gaming PC or a console. You can't build iOS apps on a PC either, but I can build Android apps on a Mac. I also have access to a Unix terminal, which you have to add on to a PC. On a PC, I have to use Windows. Eww.
 