AMD's CPUs are horrible... I thought Apple cared about power usage. Intel has done quite a bit of work on that; plus, AMD's equivalent of Intel's Turbo Boost is pretty bad...
I have an i5 MacBook Pro, an i5 HTPC, and an i7 powerhouse for my 3D-related work.

But that doesn't make any sense when you see how AMD really kicks the crap out of Intel's integrated stuff. With regard to low-power x86-based systems, AMD runs rings around Intel if you look at the broader picture and don't focus only on Linpack results.

I have a "fast" (I put in into quotes because it depends of the task) i5 macbook pro but for some parts it is as slow as hell because it is seriously hampered by that Intel HD crap. There is a reason why I still keep that i7 around even relatively it isn't that much faster then my macbook... .

Especially now that so much heavy work relies more and more on GPGPU computing. It is funny that for a technology developed by Apple (OpenCL) I need to turn to other manufacturers, because Intel can't (or won't) write OpenCL GPU drivers for their HD range (and please, those extremely slow CPU drivers don't cut it). Then again, drivers would only prove how abysmal the performance of those chips is.

If I didn't also use my HTPC as a rendering slave, then - as a technology enthusiast - I would certainly go the AMD route.
 
All this hate and misinformation.

Intel does one thing worse than any other chip manufacturer: graphics! Whether it's integrated or standalone, Intel is the worst.

Granted, their processing power is better than AMD's, but not to the extent that AMD processors could even be called bad.

Also, the HD 3000 is not just a little worse than the AMD APU. The AMD integrated graphics get DOUBLE the frame rates!
 
I'm talking about using AMD processors in the MacBook Air. It would lower manufacturing costs and thus let Apple pocket the profits, because people wouldn't notice whether it is AMD or Intel based. Likewise, Apple could use an AMD solution to resurrect the white MacBook. I say it is likely because Apple largely doesn't care about the pro market any more. Just my opinion.

I'm with you on Apple's attitude, but the stated reasoning seems more logical (that they felt more limited by the GPU options offered by Intel than by the CPU offerings). It wouldn't be much of a stretch given that they both run on x86, but I don't see it happening for these reasons alone unless the price difference is huge and AMD's solution offers at least a sidegrade.

The MacBook Airs suck for any kind of serious work anyway. Power users running intense Photoshop documents or anything of the like aren't going to be running out to buy a MacBook Air.

The MBA is really geared toward basic use, like sending photos of your cat to grandma or typing a document in MS Word for your university paper. And in that regard, the AMD CPUs are more than capable.

It's possible to use them for more. There's definitely a scale to it. Even heavy software these days has really soft minimum requirements; it's just a matter of what you wish to do within it. Photoshop runs okay on an Air if you're dealing with 8-bit images. If you're dealing with huge files at 16 bits or higher, it would definitely choke the Air. It's also definitely limited on RAM, but someone will obviously tell me that doesn't matter because of the SSD :rolleyes:. To me the Air definitely feels RAM-limited, so if I were going to try anything that's harsh on RAM, I'd dial down settings. If it was Photoshop, I'd turn history states down, turn off thumbnails, and do a few other things to conserve RAM. This was necessary with large files when it was still a 32-bit program; it was quite common for people working on things like assembling movie posters. Photoshop can actually run on really minimal hardware if you dial the settings down enough. It's just not necessarily fun to work that way, and there's still a breaking point.
 
Another example of Intel's anticompetitive and monopolistic business practices hurting the consumer.

Apple should have the ability to use an Intel CPU without an Intel GPU. Other GPUs can interoperate with Intel CPUs; Intel is specifically disallowing it in an attempt to extend its monopoly. Just as Microsoft was forced to open specifications for parts of Windows to allow competitive products, Intel should be forced to do likewise.

Any AMD bashing is just ridiculous: as the article states, Apple discovered many of them faulty during testing. When any major company has a major supplier decision like this to make, they evaluate parts from all suppliers and pick the best one.
 
It's possible to use them for more. There's definitely a scale to it. Even heavy software these days has really soft minimum requirements; it's just a matter of what you wish to do within it. Photoshop runs okay on an Air if you're dealing with 8-bit images. If you're dealing with huge files at 16 bits or higher, it would definitely choke the Air. It's also definitely limited on RAM, but someone will obviously tell me that doesn't matter because of the SSD :rolleyes:. To me the Air definitely feels RAM-limited, so if I were going to try anything that's harsh on RAM, I'd dial down settings. If it was Photoshop, I'd turn history states down, turn off thumbnails, and do a few other things to conserve RAM. This was necessary with large files when it was still a 32-bit program; it was quite common for people working on things like assembling movie posters. Photoshop can actually run on really minimal hardware if you dial the settings down enough. It's just not necessarily fun to work that way, and there's still a breaking point.

Agreed 100%. A MacBook Air will definitely run most software, but it'll be a pain in the a$$. A few years ago I bought myself a small 10" netbook for a two-month vacation in Europe. I really only needed it for email, TV shows when bored, and storing photographs from my DSLR. It served its purpose perfectly. I had Photoshop installed on it, as well as Lightroom, but doing any kind of serious work on it wasn't feasible.

Now, given that the Atom processor inside that laptop sucked for Photoshop purposes (I'm a graphic designer), it really boggles my mind how I used to work on a dual-processor Power Mac G4 @ 867 MHz with less than a gigabyte of RAM... for 7 years!
 
It's possible to use them for more. There's definitely a scale to it. Even heavy software these days has really soft minimum requirements; it's just a matter of what you wish to do within it. Photoshop runs okay on an Air if you're dealing with 8-bit images. If you're dealing with huge files at 16 bits or higher, it would definitely choke the Air. It's also definitely limited on RAM, but someone will obviously tell me that doesn't matter because of the SSD :rolleyes:. To me the Air definitely feels RAM-limited, so if I were going to try anything that's harsh on RAM, I'd dial down settings. If it was Photoshop, I'd turn history states down, turn off thumbnails, and do a few other things to conserve RAM. This was necessary with large files when it was still a 32-bit program; it was quite common for people working on things like assembling movie posters. Photoshop can actually run on really minimal hardware if you dial the settings down enough. It's just not necessarily fun to work that way, and there's still a breaking point.

Just out of curiosity, what do you edit that eats up that large an amount of resources? I use PS mostly to make textures. The highest resolution I usually go for is 4096x4096. I've edited PS documents that size with 30+ layers in them. Multiple image layers with alphas, various adjustment layers, all that neat stuff. In all the years I've been using PS, I don't think I've ever seen it peg higher than 2GB before.

I'm not calling you out or anything. I'm honestly curious here. Unless you're editing 20 MP RAW photos with 60+ layers, I can't imagine what could push you beyond the 4GB mark.

edit: I thought about it for a second, and pretty much answered my own question. If you're doing pro photography or advertisement work, you're gonna be dealing with tons of lossless quality RAW images, vector graphics, and who knows what else. Texture work in comparison is considerably less strenuous on a computer. I'm usually dealing with much smaller .jpgs and .tga files.
 
It wouldn't be much of a stretch given that they both run on X86, but I don't see it happening solely on these reasons unless the price difference is huge and AMD's solution offers at least a sidegrade.


To be more precise, they both run the x86 instruction set and the AMD64 instruction set. See that, AMD haters? Yes, Intel had to license the 64-bit instruction set from AMD, because AMD had it LONG before Intel came up with a 64-bit architecture.
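This vendor neutrality is visible from software, too: a program queries the architecture, not the manufacturer. Here is a minimal Python sketch (standard library only; the exact string returned depends on the OS, e.g. 'x86_64' on Mac/Linux versus 'AMD64' on Windows):

```python
import platform

# The reported machine type names the instruction set, not the
# chip vendor: a 64-bit Intel box and a 64-bit AMD box both
# report the same x86-64 architecture string.
arch = platform.machine()
is_x86_64 = arch.lower() in ("x86_64", "amd64")
print(arch, is_x86_64)
```

Operating systems and compilers target that architecture string, which is why Windows and Ubuntu run identically on either vendor's chips.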

Price is an issue. When I buy a server, I usually get the SAME server with a 6-core AMD processor and double the RAM for several hundred bucks LESS than the 4-core Intel version would cost us. And the AMD CPUs even consume less power than the Intel processors. So I get more processing cores and more RAM, AND I save money on both the machine AND power. And both Windows Server and Ubuntu Server run as well on AMD as they run on Intel.

Why again should I insist on having an Intel processor? Because of twelve-year-old prejudices and online forum anecdotes posted by noobs?

Heck, AMD even has factories here in Germany where they build their stuff. Another reason, for me as a German, to buy AMD instead of Intel.
 
I have yet to see my MacBook Air struggle at anything except games.

Photoshop (runs like butter)
After Effects
Final Cut


They don't just work; they work well.
 
Just out of curiosity, what do you edit that eats up that large an amount of resources? I use PS mostly to make textures. The highest resolution I usually go for is 4096x4096. I've edited PS documents that size with 30+ layers in them. Multiple image layers with alphas, various adjustment layers, all that neat stuff. In all the years I've been using PS, I don't think I've ever seen it peg higher than 2GB before.

I'm not calling you out or anything. I'm honestly curious here. Unless you're editing 20 MP RAW photos with 60+ layers, I can't imagine what could push you beyond the 4GB mark.

edit: I thought about it for a second, and pretty much answered my own question. If you're doing pro photography or advertisement work, you're gonna be dealing with tons of lossless quality RAW images, vector graphics, and who knows what else. Texture work in comparison is considerably less strenuous on a computer. I'm usually dealing with much smaller .jpgs and .tga files.

I used to work at a marketing agency doing the design for all promotional pieces: posters, flyers, banners, display booths, etc.

I specifically remember getting materials (digital files) from one of our clients to wrap a small hockey rink so consumers could shoot at a small net. The whole setup was about six feet long. The files we received were uncompressed, multi-layer, 16-bit TIFFs with many transparencies, at a resolution surpassing 5,000x5,000 pixels, and on top of it all, CMYK, which just kills the entire experience (RGB makes files significantly smaller and faster to work with).

The files themselves were 1 GB in size. Opening them took longer than booting up the computer. Once loaded, and with enough RAM, things ran fine.

That's just graphic design; video editing can be (and probably is) just as daunting, if not more so.

That said, I would never buy or use a MacBook Air for my line of work.
 
I didn't see this mentioned anywhere in the thread, but the reason I don't want to see AMD chipsets in a machine I'm interested in buying is that they produce Linux drivers that are completely useless. I love OS X, but every now and again my job forces me to work in a Linux environment for long periods of time, and AMD GPUs have in the past made that experience horrible. I don't know if they have improved their drivers since then, but if they haven't, then I'm not looking forward to buying an AMD-powered Mac.
 
Oh, this is bad news. I remember purchasing two Mac minis in 2006, about the time Macs switched to Intel from PowerPC. If I'm not mistaken, the chips were Core Duos.

The Macs were so slow they barely ran the OS. These Macs were badly configured. If Apple ever does change again, I fear I'll wait at least a year (after the switch), read the reviews, try it out for myself, and then buy new Macs.

:confused:
 
I didn't see this mentioned anywhere in the thread, but the reason I don't want to see AMD chipsets in a machine I'm interested in buying is that they produce Linux drivers that are completely useless. I love OS X, but every now and again my job forces me to work in a Linux environment for long periods of time, and AMD GPUs have in the past made that experience horrible. I don't know if they have improved their drivers since then, but if they haven't, then I'm not looking forward to buying an AMD-powered Mac.

Apple's been using AMD (er, ATI) GPUs in their desktops and laptops for a while now. There also haven't been any horror stories from the 6XXX series GPUs in the laptops like there were with the Nvidia cards a few years back.

Unless your job requires you to use Linux with potentially shoddy drivers on your own computer, I don't see how an AMD CPU/GPU solution would give you grief.

It doesn't matter much, though. Apple's made wise decisions in the last 6+ years, and it doesn't look like things are going to change. If the AMD solution had been made available, it would have been a good one. Apple hasn't settled for crap in a long time. That said, nobody is using an AMD-powered Mac, so the whole argument is moot.

What's more important is: how good WOULD an AMD-powered Mac have been? I'm sure Apple will make the right call in terms of CPU hardware. (I can't say the same for their software side; Lion has been a complete mess.)
 
I would like Apple to use AMD CPUs in mass-market products. It would surely wake Intel up, but I suspect Intel wouldn't think twice before pressuring and blackmailing Apple over Mac exclusivity.
 
I've seen videos of people removing the CPU fan from an Intel processor and causing the computer to crash, and videos of an AMD processor that kept chugging along after its fan was removed.

You've actually got those exactly the wrong way round. The P4 was the first CPU to have thermal throttling; AMD got it a lot later. Processors from before that time would burn themselves to death if given half a chance. These days, removing the fan won't do anything except throttle either system.

Removing the entire heatsink may cause either one to go into thermal shutdown, which is *not* a crash.
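The throttling behavior both vendors now ship can be sketched as a simple control loop. This is purely illustrative Python with made-up frequencies and thresholds, not actual firmware logic:

```python
def throttle_step(temp_c, freq_mhz, t_max=100.0,
                  f_min=800, f_nom=3000, step=100):
    """One tick of a simplified thermal-throttling loop:
    above the thermal limit, back the clock off toward a floor;
    below it, recover toward the nominal frequency."""
    if temp_c >= t_max:
        return max(f_min, freq_mhz - step)  # throttle, don't crash
    return min(f_nom, freq_mhz + step)      # recover when cool
```

The point is that a modern CPU degrades gracefully: the clock drops toward a floor instead of the chip frying, and thermal *shutdown* only kicks in as a separate last resort.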
 
My i5 Air has been great. My main issue with its graphics chipset is that it gets a little laggy when outputting to a high-resolution external monitor along with the internal one. Mainly this is an issue because clamshell mode stopped working for me in 10.7.3, but that is beside the point.

It runs completely silent 95% of the time, even when compiling and running a game in the simulator or working through photos in Aperture/Photoshop. It still gets warm, but it is pretty much the closest you can get to iPad-esque always-on computing while still keeping great performance.
 

I think Apple will create a hybrid motherboard with both an ARM and an Intel/AMD processor. It will be just like the graphics switching now.
 
First off, MR, this is old news. Just because Forbes is confirming it, we are not given any new information of any real interest or value here.

That said, anti-AMD sentiments here are unfounded and ignorant. Anyone who has ever owned and used an AMD machine knows that AMD provides great bang for the buck and is no more or less unreliable than Intel.

Not AMD please!

And what exactly would be so horrible about AMD? Yeah, fine, the Phenom II is very Core 2, but I've read nothing to the effect of Bulldozer not being up to snuff in terms of speed or reliability.

If it works better, fine by me. AMD is underrated.

Truth.

While there's probably some truth to how Apple tested AMD chips in MBAs and whatnot, I doubt they were even really considering it. Why? Thunderbolt.

They advertised Thunderbolt heavily with the MBP and iMac months before the MBA launched. So why would they produce a line of computers that weren't compatible? That couldn't use the iMac's target display mode? That wouldn't work with a TBD?

So unless AMD had a related technology, I'm guessing we would have seen it come with the MBP and iMac first, long before the MBA.

My guess is that we would've seen it with the iMac first; AMD has never really had a strong mobile CPU. In any event, while I don't think that Thunderbolt is necessarily worth it, I agree with your assertion that Apple had enough invested in it to not consider abandoning Intel lightly.

AMD's CPUs are horrible... I thought Apple cared about power usage. Intel has done quite a bit of work on that; plus, AMD's equivalent of Intel's Turbo Boost is pretty bad...

What's so bad about AMD CPUs? Every PC tower that I've ever owned has used an AMD CPU; they're fast for their cost and plenty reliable. The only Intel machines I've owned were either second hand PC laptops that I didn't pay for or Macs.

Wasn't this old news from a year ago? AMD again. I like ATI (I use them in my gaming rig), but would it really benefit end users if Apple went over to AMD? I don't think so. I would find it hard to believe that AMD had actually come out with something that surpasses Intel by enough for Apple to change over.:rolleyes:

I don't think it'd hurt consumers. In fact, it'd probably lower the cost of Macs which would help them. Plus, I don't see them being THAT much worse in terms of performance than Intel.

Intel rocks and has the money and talent to move forward. Plus, can AMD even pump out enough CPUs for Apple to care?

I for one will stay with Intel. I can't say anything bad about their Core 2 Quad or their i5 and i7 CPUs; they just work.

The GPU is another thing, but from what I have read, the next generation will be a large leap in capability. Not that I need to play Skyrim on my MacBook Air. :D

----------



Why would Apple want to become Dell? Not going to happen.

They will continue to use Intel and ARM happily, and we (or at least I) will enjoy continuing to buy their wonderful iOS and OS X products for years to come. :cool:

Having been a many-time owner of AMD CPU-based PCs, I can attest that AMD CPUs also "just work". I swear, people on here read about a few failures from almost a decade ago and suddenly, it's a blacklisted brand.

Pshh. I think the whole AMD vs Intel argument is like Nikon vs Canon. Comparable technologies, but two sides with people that have strong feelings about why their side is better.

Truth. I don't see any reason why one is necessarily "better" than the other one. Intel is typically ahead of the curve and will probably always be the leader when it comes to mobile CPUs. AMD offers great bang for buck and is always putting up a good fight against Intel in terms of desktop CPUs. They're both fine and reliable. It's a stupid debate.

Remove the fan or heatsink from an AMD processor and say goodbye to your $10 CPU!

Intel processors have built-in sensors that would shut the system down and prevent the CPU from frying itself.

Your information is old. AMD has long since changed that. Update your information and your prejudices.

I made that fairly obvious in my post; it was the FIRST thing on my list. I'll post it again, since I don't think you even read my post:

1) Mac users are generally noobs without a clue about how to overclock, swap hard drives, flash ROMs, etc. Development for such fine-tuning, computer-savvy people would be limited on the Mac side.

I'll agree that it's limiting, though there is a growing number of people who subscribe to that kind of tinkering. Case in point: the Hackintosh community.

Please no AMD, with their crappy unstable Nvidia chipset junk.

What...are...you...even saying here?

Oh, this is bad news. I remember purchasing two Mac minis in 2006, about the time Macs switched to Intel from PowerPC. If I'm not mistaken, the chips were Core Duos.

The Macs were so slow they barely ran the OS. These Macs were badly configured. If Apple ever does change again, I fear I'll wait at least a year (after the switch), read the reviews, try it out for myself, and then buy new Macs.

:confused:

PowerPC to Intel was an architecture switch. Intel and AMD use the same architecture. Also, Intel Macs were pretty freakin' fast when they first came out. Tiger on Intel ran faster than Tiger ever did on PowerPC, so I don't know what you're talking about there. A switch to AMD, especially for someone upgrading from either a Core Duo or an early Core 2 Duo, would STILL result in a faster machine.

Apple leaving Thunderbolt?

I don't think so.

This is probably one of the better reasons as to why Apple won't be making AMD Macs anytime soon, unless Apple can license it from Intel onto their logic boards with an AMD chip, though even that sounds unlikely.
 
The MacBook Airs suck for any kind of serious work anyway. Power users running intense Photoshop documents or anything of the like aren't going to be running out to buy a MacBook Air.

The MBA is really geared toward basic use, like sending photos of your cat to grandma or typing a document in MS Word for your university paper. And in that regard, the AMD CPUs are more than capable.

This is the MOST ridiculous comment ever.

There are professionals out there still using PowerBooks and PowerMacs for serious design or scientific work.

You do NOT need an Intel quad-core to be productive. A computer is just a stupid tool - stupid, but extremely FAST at getting your work done.
 