When you’re wrong, you’re wrong. And the fundamental point is an A-series processor may very well NEVER have to throttle when placed in an enclosure as big as a MacBook or iMac or the like. If you don’t want to be corrected, don’t say one thing, get it wrong, and then pretend you said something completely different.

I'm getting tired of people trying to put words in my mouth. You should understand that I don't care what your education or position is at ANY company. You are insulting me. Your not understanding what I am saying is your failure, not mine.

A chipset is designed with a preset thermal cap to keep it from frying itself.
 
I'm getting tired of people trying to put words in my mouth. You should understand that I don't care what your education or position is at ANY company. You are insulting me. Your not understanding what I am saying is your failure, not mine.

I didn’t put words in your mouth. I cited your exact words! You said “The laws of physics will always thermally throttle any processor.”

I am not insulting you. I am telling you that the words that you said are not true. You can say I am “not understanding what [you] are saying,” but your words are not open to interpretation. You made a statement that is quite clear, and also quite wrong.
 
I didn’t put words in your mouth. I cited your exact words! You said “The laws of physics will always thermally throttle any processor.”

I am not insulting you. I am telling you that the words that you said are not true. You can say I am “not understanding what [you] are saying,” but your words are not open to interpretation. You made a statement that is quite clear, and also quite wrong.

*Limited by the laws of physics at the time of production. Better?
 
I think you mean patents, not copyrights. An x86 processor can be reverse engineered and built without violating any of Intel's patents, I'm sure. It just isn't cost effective to do so.

What in the HELL are you blathering about?!?

Copyright. Yes, copyright. That's what Intel is using to stop x86-64 emulation and likely also translation.

Imagine the next J K Rowling writes a series of children's books where the characters "speak" in a strange dialect. Copyright could be used to prevent other authors from writing stories set in that universe - they wouldn't be able to use the dialect. The other authors could certainly write books, but they'll be unconnected stories of wizards, magic, or whatnot.

Back to CPUs: I'm sure Intel has plenty of patents on implementing their instruction set, but those can be worked around via a different implementation. And a software emulation is certainly a different implementation. But a specific instruction encoding is a copyright matter. ("Why did you encode your instruction set exactly the same as ours?")

What about 32-bit x86 - why isn't Intel pursuing copyright claims there? In years past, nobody thought such matters were important, so such claims weren't pursued. And thus 32-bit x86 has escaped into the public domain.

P.S. In a related matter, back in the '70s, Zilog came up with their own (much nicer!) mnemonics for 8080 instructions due to copyright issues.
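
For anyone who wants that encoding point made concrete, here's a toy Python sketch. The byte sequences are genuine x86-64 encodings, but everything else is purely illustrative - this is not how any real emulator or disassembler is built:

[CODE=python]
# Toy illustration: the exact byte encodings of a few x86-64 instructions.
# Any chip or emulator that runs existing binaries must reproduce these
# byte sequences precisely - which is the crux of the copyright argument.

X86_64_ENCODINGS = {
    bytes([0x48, 0xC7, 0xC0, 0x01, 0x00, 0x00, 0x00]): "mov rax, 1",
    bytes([0x48, 0x31, 0xC0]):                         "xor rax, rax",
    bytes([0xC3]):                                     "ret",
}

def disassemble(code: bytes) -> str:
    """Look up a byte sequence in the (tiny) encoding table."""
    return X86_64_ENCODINGS.get(code, "unknown")

# A clean-room design is free to pick a different *implementation*,
# but to run existing software it cannot pick different *encodings*:
print(disassemble(bytes([0x48, 0x31, 0xC0])))  # -> xor rax, rax
[/CODE]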
 
*Limited by the laws of physics at the time of production. Better?
No. It remains untrue. Many processors never need to be throttled. The laws of physics do not dictate that any given processor has its clock frequency and/or voltage dictated by thermal considerations. Many processors have a maximum frequency that is dictated by delay path and not by any thermal consideration.

And, of course, NO processor has its frequency limited at the time of production by thermal considerations - when you produce a processor you can always put a big enough heat sink, aerosol cooling, active cooler, liquid cooler, or the like on it and guarantee that you can handle the heat flux. Throttling only has to happen when your cooling solution dissipates less heat than the processor can generate. And physics certainly does not demand that any processor must generate more heat than can be dissipated by a suitable thermal solution.

As someone else pointed out, numerous CPUs in real-world computers have never had to be throttled.
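
To put rough numbers on that, here's a back-of-the-envelope steady-state model in Python. The formula (T_junction = T_ambient + P * theta_ja) is the standard first-order one, but every figure below is made up for illustration - none of it is any real chip's spec:

[CODE=python]
# Rough steady-state thermal model: T_junction = T_ambient + P * theta_ja,
# where theta_ja (degC/W) is the total junction-to-ambient thermal resistance
# of the cooling solution. All numbers are illustrative, not real chip specs.

T_AMBIENT = 25.0   # degC
T_J_MAX   = 100.0  # degC: junction limit above which throttling kicks in

def must_throttle(power_w: float, theta_ja: float) -> bool:
    """True if the steady-state junction temperature would exceed the limit."""
    t_junction = T_AMBIENT + power_w * theta_ja
    return t_junction > T_J_MAX

# The same 30 W chip with two different cooling solutions:
print(must_throttle(30.0, theta_ja=3.0))  # cramped passive cooling: True  (25 + 90 = 115 degC)
print(must_throttle(30.0, theta_ja=1.5))  # decent heatsink + fan:   False (25 + 45 = 70 degC)
[/CODE]

Pick a cooling solution with a low enough theta_ja for the chip's worst-case power, and the throttle condition is simply never met - which is the whole point.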
Copyright. Yes, copyright. That's what Intel is using to stop x86-64 emulation and likely also translation.

Imagine the next J K Rowling writes a series of children's books where the characters "speak" in a strange dialect. Copyright could be used to prevent other authors from writing stories set in that universe - they wouldn't be able to use the dialect. The other authors could certainly write books, but they'll be unconnected stories of wizards, magic, or whatnot.

Back to CPUs: I'm sure Intel has plenty of patents on implementing their instruction set, but those can be worked around via a different implementation. And a software emulation is certainly a different implementation. But a specific instruction encoding is a copyright matter. ("Why did you encode your instruction set exactly the same as ours?")

What about 32-bit x86 - why isn't Intel pursuing copyright claims there? In years past, nobody thought such matters were important, so such claims weren't pursued. And thus 32-bit x86 has escaped into the public domain.

P.S. In a related matter, back in the '70s, Zilog came up with their own (much nicer!) mnemonics for 8080 instructions due to copyright issues.

Actually, Intel does enforce its 32-bit opcode copyrights; it’s just that they licensed them to a bunch of companies and fabs. And the 64-bit instructions are mostly owned by AMD, not Intel.
 
I might still buy an ARM-based Mac if it were possible to run Windows and Linux in a virtual machine. But I don't see how this could happen.

My guess is this rumor is wrong. The ARM is not able to replace high-end Intel chips. I think what might happen is that Apple adds the ARM chip alongside the Intel chip.

Have to wait and see if it is time to bail on Apple.
 
I might still buy an ARM-based Mac if it were possible to run Windows and Linux in a virtual machine. But I don't see how this could happen.

My guess is this rumor is wrong. The ARM is not able to replace high-end Intel chips. I think what might happen is that Apple adds the ARM chip alongside the Intel chip.

Have to wait and see if it is time to bail on Apple.

Why can’t ARM replace the Intel chip?
 
It is not about "moving on", or not for everyone at least. It is about weighing the pros and cons, depending on your workflow, attitudes, needs, etc. Regarding the Surface Pro, it may be overpriced, but I've found myself doing a lot of things with it that weren't possible with the iPad Pro. So for me, the overpriced tool is the second one. That said, I understand that my position is anecdotal.
If it’s all cons and no pros, and has been for years (at least from a generic “your” perspective), why bother hanging out here?
There are lots of “Apple hasn’t done anything right since (a really long time ago)” posts and “Apple long abandoned users like me” posts. If these are true, let it go. If not, own that and actually mention those upsides.

Do I wish the next Apple laptop to be insane fast? Sure. Do I care what chip runs it? Not as long as it works insane fast. I don’t go posting on Windows forums about how much I dislike the user experience there.
 
No. It remains untrue.... when you produce a processor you can always put a big enough heat sink, aerosol cooling, active cooler, liquid cooler, or the like on it and guarantee that you can handle the heat flux.

And you continue to use your standing as a very educated person to try and remove the fact that my statement is based on many considerations -which aren't your concern beyond trying to understand that I'm saying Apple has made some crap that wasn't worth the heatsink they threw at it.

Thats a fact you can't calculate yourself out of-or else the MP 6,1 might have sold better. Stop your assumptions that I need to provide any calculations beyond where my money may be spent. I may not have Phd, but I don't need one to know you seem to think you are superior when your stats are anything beyond speculation until Apple produces a retail machine.
 
Last edited:
And you continue to use your standing as a very educated person to try and remove the fact that my statement is based on many considerations -which aren't your concern beyond trying to understand that I'm saying Apple has made some crap that wasn't worth the heatsink they threw at it.

Thats a fact you can't calculate yourself out of-or else the MP 6,1 might have sold better. Stop your assumptions that I ned to provide any calculations beyond where my money may be spent. I may not have Phd but I don't need one to know you seem to think your stats are anything beyond speculation until Apple produces a retail machine.

I’m sorry - perhaps there’s a communications issue here? Is English your first language? (I am not trying to insult you - it’s just that you keep saying things and then saying we are misunderstanding you, and this last response by you is very hard to understand and is not grammatical).

What do you mean by “Apple has made some crap that wasn’t worth the heatsink they threw at it?” What crap? The A-series chips? These are universally considered to be great chips, and they blow away all the competition. Why are they not worth the heatsinks? The heatsinks are too good for them? What you are saying simply does not make any sense. And that’s putting aside the fact that you are changing the subject. All I said was that the law of physics you recited is not a law of physics and is wrong. Now you are accusing me of making assumptions (not sure what these assumptions are) and of inventing things you said (despite me having quoted you word for word).

I don’t care how you spend your money and I don’t care about your opinions as to what Apple should do next, but I do care when you state as laws of physics things which are not at all laws of physics.
 
I’m sorry - perhaps there’s a communications issue here? Is English your first language? (I am not trying to insult you - it’s just that you keep saying things and then saying we are misunderstanding you, and this last response by you is very hard to understand and is not grammatical).

What do you mean by “Apple has made some crap that wasn’t worth the heatsink they threw at it?”

...or else the MP 6,1 might have sold better.

Yes, english is my first language, I just got distracted-this is getting old. I'm done since you should easily understand what I've said many times-it better dump heat if I put it to work or its a paperweight. Grammatically correct enough?
 
Yes english is my first language, I just got distracted-this is getting old. I'm done since you should easily understand what I've said many times-it better dump heat if I put it to work or its a paperweight. Grammatically correct enough?
Nobody is disagreeing with the concept that if you generate heat you need to dissipate it away from the chip. We only disagree with your claim that every chip has to throttle.
 
Nobody is disagreeing with the concept that if you generate heat you need to dissipate it away from the chip. We only disagree with your claim that every chip has to throttle.

When you DON'T provide a proper heatsink, it will - just as Apple has had a bad habit of doing. I don't like dumping on obvious errors, since it's not helpful, but you seem content on making me prove I'm actually in the ecosystem. That is wasteful of both our time.

When I was studying for my Ph.D. with a concentration in solid-state physics, they didn’t teach me that one. Care to elaborate?

Built-up kinetic energy in the machine will eventually throttle any chipset when poor cooling is used, which hasn't historically been great. I think anyone with a current machine would know that, and it doesn't require any advanced training to realize it if the machine is really put to work. I would hope they covered the laws of thermodynamics in your program - you seem quite competent in them.

Yup. Many real-world usage situations do not saturate processor functional units nearly as much as benchmarks do, which allows the machine to seem even faster to the user under that real-world usage than the benchmarks would indicate.

To a typical consumer, you are correct, but I've never been one of those, so I don't take that attitude unless a friend or family member at the consumer level asks my opinion. This may be good for them, but not for me; I would never base an opinion on a spec sheet or GB score, or a rumor for that matter. I try to focus most of my hardware suggestions on benefiting the user experience, which this may very well end up doing, but I won't be impressed until it's in my hands and I can stress it myself.
 
Last edited:

Except I really think the thing is, when they finally decide to go to ARM, they're not going to even bother with a laptop form factor. They will ditch it altogether in favor of a tablet form factor. You can still use a Bluetooth keyboard if you want, but I really believe they're going to just embrace the tablet form factor as what they consider the next generation of computing.
 
I've really just resigned myself to all of this happening. The future always brings change, miniaturization. It is what it is. I'm still going to need a truck for work, and I'll either have to rely on my old x86 diesel Mac Pros, or get a new Windows Cummins turbo diesel with a 32-core Xeon.
I'll still want an iPad Porsche for the weekends, though. And the occasional spirited commute.
 
I might still buy an ARM-based Mac if it were possible to run Windows and Linux in a virtual machine. But I don't see how this could happen.

My guess is this rumor is wrong. The ARM is not able to replace high-end Intel chips. I think what might happen is that Apple adds the ARM chip alongside the Intel chip.

Have to wait and see if it is time to bail on Apple.
Linus Torvalds seems to understand this (and the article even uses Steve Jobs to back him up):

https://www.theregister.co.uk/2019/02/23/linus_torvalds_arm_x86_servers/
 
Based on what, other than rank paranoia?
Eh, maybe based on past events like the introduction of SIP or the T2 Chip with Secure Boot? They just have to flip a switch in the firmware so that you can't install or even load another OS from USB.
No paranoia, just some possible scenarios.
 
ARM chips will allow Apple to release another laptop line as well as an iMac line. They will not replace the MBP or other Pro versions.
What's so impressive about an i7? I bought a brand new iMac last summer with one, and it feels more sluggish than my 2013 and has more issues. In the past, a 3+ year gap between computers was a wayyy bigger upgrade.

The iMac is a poorly engineered piece, and it always has been. The CPU options they give you are just a marketing trick, because all of them end up thermally throttled, and their sustainable performance is crippled by the tight chassis, improper cooling, PCB shortcomings, etc. The same spec CPU and GPU in some normal Windows laptop running resource-hungry Windows 10 would give you better average performance.
 
A huge win for Apple, since the ARM design goes back to licensees, where PC makers will pick it up. Intel x86 chips don't make sense for many device makers going forward.
 
Not categorically disagreeing with you - just injecting a few notes of caution:

Windows 10 for ARM 64 bit has been a PRODUCT from Microsoft for over a YEAR now (and for 32 bit ARM back to 2016!). It handles x86 by automatically and silently Cross-Compiling the code into ARM-Native Code. Then it runs THAT. No "Emulation" Required!!! Also, An ARM-Based Mac would have an ARM Boot Camp. Boot Camp is pretty much just a set of Hardware Drivers.

True - but AFAIK you can't actually buy Windows 10 for ARM unless your name is "Dell" or "Lenovo" - that might change, but it might also be because there's no real standard ARM hardware platform or even a single "standard" ARM processor (part of the appeal of ARM is that manufacturers can pick'n'mix cores, GPUs, various codecs, vector processors and other hardware acceleration features). Intel Macs have always been, basically, PC clones built around bog-standard Intel chips, PCIe and AMD/NVIDIA GPUs. There's no guarantee that Windows-for-ARM would be compatible with an ARM Mac (esp. if Apple uses proprietary A-series GPUs). Even Linux/BSD will depend on someone compiling an ARM/Mac compatible distribution and suitable hardware drivers (which will need open specifications and could be hindered by dependence on copyrighted firmware 'blobs').

Emulators/cross compilers etc. certainly have a role to play - I've used the 68k emulator on PPC, 'Classic mode' on OS X and Rosetta in my time and although they've done a great job as 'bridges', performance has always been 'meh' - not everything is compatible and optimisations and processor-specific hardware accelerations tend to get stripped out. The result was fine for 'light' applications (MS Office wasn't really CPU-bound) but if your daily driver was, say, Photoshop with CPU-heavy plug-ins, an Intel Mac was hardly a compelling upgrade until Adobe had ported CS for Mac to Intel.
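
For what it's worth, the "cross-compile once, then run it natively" approach described above usually amounts to dynamic binary translation with a cache. Here's a minimal Python sketch of the idea (the helper names are hypothetical - this is not Microsoft's or Apple's actual implementation):

[CODE=python]
# Minimal sketch of translate-once, run-native dynamic binary translation.
# translate_block() stands in for a real x86-to-ARM translator; here it's a stub.

translation_cache = {}  # guest code address -> "native" translated code

def translate_block(guest_addr: int) -> str:
    """Stub for an expensive one-time x86 -> ARM translation of one block."""
    return f"native code for guest block at {guest_addr:#x}"

def run_block(guest_addr: int) -> str:
    # The translation cost is paid only on the first visit; every later
    # execution of this block runs the cached native code directly, which
    # is why warmed-up translated code can feel close to native speed.
    if guest_addr not in translation_cache:
        translation_cache[guest_addr] = translate_block(guest_addr)
    return translation_cache[guest_addr]

run_block(0x401000)  # first visit: translate, then run
run_block(0x401000)  # later visits: cache hit, no translation cost
[/CODE]

The catch, as noted above, is exactly those stripped-out optimisations: the translator only sees generic instructions, so processor-specific acceleration on the source side rarely survives the trip.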

Plus, if the emulators/cross-compilers get too good, Intel or AMD might set the lawyers on them: https://www.theregister.co.uk/2017/06/09/intel_sends_arm_a_shot_across_bow/

Actually, I don't think backwards

I'm fairly certain that they won't do something as foolhardy as immediately commit the entire Mac line to an "off the cliff" transition like that.

Well, that's the $64,000 question: they did it with XServe (no, a Mac Pro or Mini with a rack adapter is not a substitute for a proper rackmount server with redundant PSUs and lights-out management), they did it in 2013 with the Mac Pro (by letting the cheesegrater get thoroughly out-of-date before switching to a radically different concept), they did it with the 2016 MBP (if you don't like it, here's a 2-year-old entry-level rMBP at a new higher price), to a certain extent with the Mini and the Air, they're doing it yet again with the Trashcan/Mythical Modular Mac Pro transition and, while we're at it, the iMac is heading for its second birthday...

See also Apple Maps, FCPx and the Photos app - which may be OK today but were all half-baked when Apple prematurely tried to push them on users.

So, unfortunately, although I'd hope otherwise, botching the Intel/ARM transition by letting the Intel models get horribly out-of-date first would be pretty much par for the course.

Why would they have to?

Because by (say) locking down MacOS, iOS-style, so you could only install Apps from the store, Apple get a slice of all App sales and in-App purchases. As with the above - that's in Apple's hands and doesn't depend on technology.

Apple isn't a charity and whatever they do, PCs and phones are a maturing technology that isn't going to generate the sort of growth that it used to. They'll naturally be looking for ways to force obsolescence on hardware and make more money from services.

I think this speculation is making the (allegedly) forthcoming announcement of the Mythical Modular Mac Pro interesting: will they pile a lot of investment into doing something innovative with a Xeon (in which case they'll need to support Intel for the foreseeable future) or will they come up with something courageous with 32 ARM Cores driving a shedload of Metal-optimised shaders/vector processors? The latter would be really exciting and innovative... if they already had a credible, up-to-date Mac Pro for customers who needed one three years ago.
 
Dude, exactly what I wanted to say. The chip transition has nothing to do with how the OS and its apps are supposed to run; the focus of the article should be that, with Apple in control of the chips, they no longer have to wait on Intel for the next release of each Mac model. It's really that simple. Apple will be making "all the widgets", as they used to say, or similar.
 