Why better architecture? Intel chips can run software from 1981 (DOS, or later Windows) as well as support 64-bit and 32-bit simultaneously, not to mention the long lifespan of their chips.

I believe RISC is better than CISC for modern computing platforms. Also, the ability to run 35-year-old code isn't relevant.
 
Ah, CISC is more flexible, and people do use older programs. RISC already lost, but I get your point.

RISC lost? Is that why Microsoft is moving Windows to ARM? Is that why pretty much all mobile devices are on ARM? The future is RISC.
 
> RISC lost? Is that why Microsoft is moving Windows to ARM? Is that why pretty much all mobile devices are on ARM? The future is RISC.
Microsoft has done multiple platforms over the years; x86 wasn't the only one. But RISC lost the computing war to Intel years ago for heavy computational supremacy, e.g. HP's RISC chips. ARM chips in low-power devices do have their place, though.
 
> Microsoft has done multiple platforms over the years; x86 wasn't the only one. But RISC lost the computing war to Intel years ago for heavy computational supremacy, e.g. HP's RISC chips. ARM chips in low-power devices do have their place, though.

There are more ARM chips in use today than x86. I don't see how that's RISC losing anything.
 
Right, all low power, not quite like the huge Microsoft server farms that sit in corporate data centers. All standalone on iDevices.

Okay? And? You said they lost, period. If you're saying servers are all that matter, that's a bizarre argument. That being said, you should read up on the first real ARM server chips.

http://www.anandtech.com/show/10353/investigating-cavium-thunderx-48-arm-cores
http://www.anandtech.com/show/10918...riq-2400-server-soc-in-action-begins-sampling
 
> Okay? And? You said they lost, period. If you're saying servers are all that matter, that's a bizarre argument. That being said, you should read up on the first real ARM server chips.

> http://www.anandtech.com/show/10353/investigating-cavium-thunderx-48-arm-cores
> http://www.anandtech.com/show/10918...riq-2400-server-soc-in-action-begins-sampling
Most of the world runs its server farms on Intel chips. The keyword is most. Desktops and laptops are mostly Intel. Mobile devices and Chromebooks are ARM.
 
> Most of the world runs its server farms on Intel chips. The keyword is most. Desktops and laptops are mostly Intel. Mobile devices and Chromebooks are ARM.

And starting this year, laptops will start to be on ARM. Why are you cheerleading for Intel?
 
Not cheerleading; it's just the way it is. Why are you cheerleading ARM, and what laptops?

I don't even buy AMD laptops.

I'm not cheerleading ARM. I just feel Intel has (a) chosen the wrong architecture for the future (which shows in its poor performance per watt compared to ARM) and (b) screwed over everyone else because of poor decisions made in the '90s and early 2000s.

As for what laptops? We'll know around Redstone 3; that's when Microsoft is releasing Windows 10 on ARM. Also, it will be on the Snapdragon 835 only at first.
 
> There are more ARM chips in use today than x86. I don't see how that's RISC losing anything.

My head swims with the dense fog that permeates my brain when I think about how one could possibly compare a PHONE to a real desktop computer and then claim it means a damn thing that there are more phone CPUs out there than x86... SO WHAT? There are a lot of Prius cars out there; that doesn't mean I want one instead of a Nissan GT-R. Comparing a phone to a high-powered, workstation-class desktop is at cross-purposes and illogical. Furthermore, the PPC was a RISC CPU and Apple moved AWAY from it to Intel x86. Why? More frequent, faster, cheaper CPU updates and Windows (Boot Camp) compatibility.

The really funny thing is that modern x86 CPUs are RISC internally with a CISC translation layer (gauging by some posts on here, I think some people don't have a flipping clue about this, or they wouldn't act like x86 ISN'T RISC when internally it already is). If Intel is behind the "battery curve," it's because they have had better things to do. How much profit is in the high-competition cell phone CPU market? Why would Intel want to focus on that instead of high-profit Xeon E7 workstation CPUs that an ARM couldn't touch? (Look at the 8890 with 24 cores and tell me which you'd rather use to edit 4K or even 8K video: that, or a cell phone CPU?)

Battery technology is the real holdup, both for smartphones that shouldn't need charging every day or every other day (the old Motorola pre-"smart" phones' charge would last a week) and for electric cars. Make a better battery and small gains in power efficiency won't seem very important.
 
> I'm not cheerleading ARM. I just feel Intel has (a) chosen the wrong architecture for the future (which shows in its poor performance per watt compared to ARM) and (b) screwed over everyone else because of poor decisions made in the '90s and early 2000s.
>
> As for what laptops? We'll know around Redstone 3; that's when Microsoft is releasing Windows 10 on ARM. Also, it will be on the Snapdragon 835 only at first.
It seems Intel will not be in mobile devices. As for whether or not a non-Intel laptop that can run Windows software will be successful, we will see. It could go the way of their phones.
 
People like being able to use Boot Camp at full native speed, or to run Parallels and VMware with OK performance.

Emulation is a lot slower.

At this point, there is very little to gain from moving from Intel to ARM except a whole lot of disruption. Apple is also having issues with production yields; ARM manufacturers will soon have the same issues as Intel.



 
> My head swims with the dense fog that permeates my brain when I think about how one could possibly compare a PHONE to a real desktop computer and then claim it means a damn thing that there are more phone CPUs out there than x86... SO WHAT?

That's like what I say when people go, "There are a BILLION Windows machines out there," and I say yeah, and there were about a billion VCRs or more connected to TVs from the 1980s to 2000, even the ones in hospitals and doctors' offices, same as Windows! Who freaking cares! :p
 
> That's like what I say when people go, "There are a BILLION Windows machines out there," and I say yeah, and there were about a billion VCRs or more connected to TVs from the 1980s to 2000, even the ones in hospitals and doctors' offices, same as Windows! Who freaking cares! :p

The problem with your analogy is that the latest Intel processors are the equivalent of 4K Blu-ray, while ARM is closer to the equivalent of a DVD. That is why I went with Prius vs. Nissan GT-R. Your analogy makes zero sense because Intel CPUs as a whole are much FASTER. :D

Have a nice day. :cool:
 
> The problem with your analogy is that the latest Intel processors are the equivalent of 4K Blu-ray, while ARM is closer to the equivalent of a DVD. That is why I went with Prius vs. Nissan GT-R. Your analogy makes zero sense because Intel CPUs as a whole are much FASTER. :D
>
> Have a nice day. :cool:
Huh? I'm not even talking about the 4K-to-ARM comparison, or your Intel-to-ARM one. I'm just saying there are a billion Windows devices most likely used to browse the web, just as there were a billion old VCRs used to watch TV.

"box (vcr) connected to display (tv)" = "box (windows machine) connected to display (screen)"?
1 billion VCRs
1 billion Windows PCs

You can't see the similarity, huh?
Don't worry about it...

Similarity? A billion people still doing absolutely nothing but sitting on their A$$!

If these 1 billion PCs were so great (besides some being in a botnet), where is all the great productivity that "should" have come about in the last 20+ years? NONE...

Oh well... I was agreeing with you in the first place, and somehow it got twisted? Weird...
 
> Huh? I'm not even talking about the 4K-to-ARM comparison, or your Intel-to-ARM one. I'm just saying there are a billion Windows devices most likely used to browse the web, just as there were a billion old VCRs used to watch TV.

I'm simply not seeing what this has to do with ARM processors, then. There are billions of very current ARM processors out there doing nothing but checking Facebook pages and making tweets. I don't call that productive either, especially when many people today can't even do basic math in their heads and have no knowledge of history, etc. (the old "Jaywalking" bit on the previous Tonight Show comes to mind). Just because more people are connected to the Internet doesn't make the Internet more useful or productive. It just seems to attract more criminals to prey on people who don't know anything about such scams, since they are too busy texting that they're at the mall or McDonald's to notice anything outside their sphere of influence.

I'm simply against moving to ARM for its own sake. There has to be a good reason to move, and it has to be better than gaining two extra hours of battery life on my notebook while potentially losing tons of software and compatibility with Windows, etc.
 
It has absolutely nothing to do with ARM, or Intel. I was just passing by and saw your comparison, and laughed because you are right, and said I make the same comparison. Mine comes up when Windows users brag that there are 1 billion Windows machines compared to however many Macs there are. Hmm...
 
I don't think the debate is ever going to stay dead.


Even funnier: the languages that really work these architectural differences are the ones that fell out of favor a while ago, or live on very specific systems. If one's company has invested in some Oracle SPARC systems, it probably has a dev or two who knows how to write code that leverages those SPARC systems more fully, or application vendors that do.


They have compilers and libraries that work this stuff. Off the top of my head: C/C++ and Fortran; Intel conveniently makes compilers for them.


Universal compilers like gcc run on anything. Is gcc going into the nitty-gritty to get every ounce of power out of Intel? Not likely. It compiles, check; now on to making sure AMD works, and hell, SPARC too.



Are they used? Yes, but not a lot, and usually for specific reasons: namely, you know exactly where your end product will be running, and your clients will be on Intel systems.

Beyond that we have the rise of the higher-level languages, which are liked precisely because they don't muck about much in the nitty-gritty. I gather that delving deep into the innards of Intel is exactly the sort of nitty-gritty that gets glossed over.



As for the rest...
I don't see Apple doing this. Their next-gen OS, revealed last year as far as I know, is for Intel, and it has what, a projected 2-3 years until we even see it? Tied to this OS will in theory be a new file system (assumed to go gold when the OS is released). The new file system alone will be enough fun for Apple in getting devs on board and up to speed.

Mix in a new FS and a new architecture: that's one hell of a party, or one hell of a way to cause a train wreck of too much change at once. I've got $5 on the latter.
 
Just look at the Surface range. I run a Surface 3: quad-core Atom, 4GB RAM. It's fast. Couple that approach for the many low-power tasks with leveraging a current i7 only when needed, and I just might think about a new Mac. Will watch the news on this closely.
 
If Apple eventually attains full, or close to full, control of the silicon, we can assume that will give Apple a greater ability to combat backdoors, and potentially the most secure hardware and operating systems on the planet for the end user.

When you see how Chinese-made security cameras are loaded with malware (I have direct experience of hardware flooring a network because it was compromised out of the box), the NSA intercepting hardware, Android devices coming loaded with malware, etc., this has to be a real goal for Apple.

I assume ARM would allow this to be one of the outcomes of such a strategy.

Then the main weak points, if such a position is achieved, will be the Apple board and employees, maybe subcontractors; I can't think of anything else. I'm assuming control of the silicon will reduce interception-and-compromise strategies, as tampering will be easier to detect and manage at the hardware level.
 