Why can't there be the best of both worlds? Intel CPUs and ATI GPUs. AMD's current CPUs lag behind Intel's, and their notebook parts seem to run hot. Maybe they have some new ones coming soon that improve on those things.

AMD owns ATI
 
Can't deliver based on what evidence? They've delivered very nice, very fast processors in the past. They delivered chips that matched and surpassed Pentium III and IV chips in speed and price/performance ratio.

Last century. The best that can be said is that they're not too far behind today, at least on desktop chips. While they have a nice roadmap, so does Intel. There's nothing to indicate any significant advantage to AMD in the near future.

So far, everyone in this long thread has missed the big story. Apple's new GPU-switching technology on the laptops is not from nVidia; Apple developed it themselves. That means a significant chance that it would work with AMD GPUs. The entire AMD/Apple discussion could be about GPUs.

In the desktop market, Apple is associated with underperforming machines that are more expensive. That's already a bad thing!

Oh, yes. All those 27" all-in-ones out there are far less expensive than the iMac.

Oh, wait.
 
Why on earth do Windows shills hang out on this forum? I mean... seriously, what is that? (You know who you are. You sound like a bunch of bitter ex-girlfriends. It's pathetic.)

What kind of ignorant statement is that? Some of us have Windows machines, and some of us also have Windows on our Macs. Some of us don't happen to like Microsoft OR Apple as far as companies go (i.e. greed greed greed). That does not make one a "Windows shill". I don't read Windows forums at all. I use OSX 90% of the time these days (the PC is mostly for gaming). But that doesn't mean Steve Jobs gets a free ride to be a controlling, greedy SOB with no one having anything to say about it. Look at all the areas the Mac could be SO much better in (e.g. better GPUs, SLI support in the OS, either helping to advance OpenGL or licensing DirectX because it kicks OpenGL's butt these days, etc.).

You simply cannot (but Apple does anyway) claim to have the most advanced OS on the planet while not even supporting hardware acceleration in the OS for 3rd-party software (and hardly supporting it yourself for anything except the one 9400 chipset). It's absolutely EMBARRASSING that XBMC has no hardware acceleration for video on the Mac (100% Apple's fault) and that Flash is a total turd on OSX (80% Apple's fault for no hardware API support, which is what lets Flash run on a PC with almost no CPU use at all while it pegs the CPU at 100% on Macs, because Apple REFUSES to offer modern OS-level support for GPU hardware). But the standard fanboy response is to blame anyone BUT Apple for these problems.

They'll scream all day long about how much Adobe sucks when Apple is the one being uncooperative. Apple is the one that purposely excludes specific companies in licenses. Apple is the one that won't let you buy better hardware for less, or even have the option of a matte screen if they don't feel like offering one at any given moment. It's those sorts of stick-it-to-you decisions that rub some of us the wrong way all the time. Apple doesn't want to do it right, and they don't want to let anyone else do it right for them.

WHY is Apple talking to AMD after giving them the middle finger for the past four years? This isn't rocket science, folks! It's OBVIOUS. Apple finally decided to include a better baseline standard of graphics than Intel GMA (the absolute PITS compared to either ATI or NVidia, and it shows no signs of EVER remotely catching up). Intel decided to leverage its CPU standing (which currently wipes the floor with AMD on power, performance, and heat) to try and FORCE sales of its crappy GPUs. Steve was counting on the bottom-line Macs using a cheap integrated chip, and this killed that vision dead. Steve doesn't like ANYONE killing his visions (even if they're not always the best visions), so he's playing retaliatory games by now dealing with AMD for the next-generation 13" Mac and Mac mini, which have nowhere to go in the future without either jacking up the price (Steve would NEVER let Apple take a small drop in profits to cover a discrete GPU) or going backwards (like the 2nd-generation Mac mini did) to completely crappy Intel integrated graphics. Steve doesn't need a top-of-the-line CPU in the Mini or the 13" notebooks. What he does need is SOMEWHERE for the CPU to go in the next few years, when the C2D will be old news and even AMD will have better chips to offer. At the very least, it might get Intel to reconsider its licensing. Either way, Intel has become a thorn in Steve's side. Steve likes giving thorns, not receiving them. He doesn't like "bags of hurt".

It's why you STILL do not see Blu-ray support in OSX, even though the licensing is now quite reasonable, the drives are fast becoming dirt cheap, and the storage possibilities are quite nice for certain things. Too bad. Steve doesn't want Blu-ray to compete with iTunes HD. Never mind that iTunes HD has about 40 movies in total (at 720p) to offer compared to thousands of BD titles in 1080p. It doesn't matter. Steve has his vision, and he will try to move heaven and earth before admitting he made a STUPID decision. It's been so long that it's now hard for Steve to ADMIT not including BD is a dumb move. He's built his hardware base around things other than HDMI, so including support wouldn't help older Macs use Blu-ray anyway. But then he expects you to buy a new Mac at least every other year anyway, so I don't see where it would matter. It's like not including hardware acceleration for older Mac GPUs (or even newer ones) in the OS. You don't NEED hardware acceleration! Your dual cores and Grand Central will take care of it! Yes, that's why when I encoded a video last night in Final Cut Pro on my MBP running Snow Leopard, it was only using 54% of total CPU power (i.e. just over one core) despite FCP having 11 threads. That Grand Central REALLY makes a BIG difference! Now it uses 54% instead of 50%, and it only took 2 hours to compress the video instead of 2 hours and 5 minutes! Woohoo!
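To make the Grand Central point concrete: GCD doesn't parallelize anything on its own; the app has to hand it independent chunks of work, which a largely serial encoder pipeline doesn't do. A minimal sketch of the difference (illustrative only, not FCP's actual code; "encode_frame" is a made-up stand-in):

```c
#include <dispatch/dispatch.h>
#include <stdio.h>

#define FRAMES 1000

/* Stand-in for real per-frame compression work. */
static void encode_frame(size_t i) {
    volatile double x = 0;
    for (int k = 0; k < 1000000; k++) x += k * 0.5;
    (void)i; (void)x;
}

int main(void) {
    /* Serial loop: one core busy, the rest idle -- roughly the
       "54% of a dual-core" situation described above. */
    for (size_t i = 0; i < FRAMES; i++)
        encode_frame(i);

    /* GCD version: dispatch_apply fans the same loop out across the
       global concurrent queue, one block per frame, and only THEN do
       all the cores light up. */
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply(FRAMES, q, ^(size_t i) { encode_frame(i); });

    puts("done");
    return 0;
}
```

The catch, of course, is that video encoding has serial dependencies between frames, so it can't always be split this cleanly; the point is only that Grand Central is an opt-in mechanism, not a magic speedup.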
 
It's absolutely EMBARRASSING that XBMC has no hardware acceleration for video on the Mac (100% Apple's fault) and that Flash is a total turd on OSX (80% Apple's fault for no hardware API support, which is what lets Flash run on a PC with almost no CPU use at all while it pegs the CPU at 100% on Macs, because Apple REFUSES to offer modern OS-level support for GPU hardware). But the standard fanboy response is to blame anyone BUT Apple for these problems.

Same for VLC 1.1!
"Using DxVA2 on Windows Vista and 7 and VAAPI on Linux, the decoding stage of VLC framework can now be done by the GPU."
http://www.jbkempf.com/blog/post/2010/03/15/On-the-road-to-VLC-1.1.0:-faster

Edit:
Plex and OSX: http://ryan.plexapp.com/?p=34
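For the curious, "the decoding stage done by the GPU" comes down to the player asking the driver for a hardware decode entrypoint and then feeding it compressed slices instead of decoding them on the CPU. A rough sketch of that first step using VA-API on Linux (illustrative only, not VLC's actual code; error handling trimmed):

```c
#include <va/va.h>
#include <va/va_x11.h>
#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

    VADisplay va = vaGetDisplay(dpy);
    int major, minor;
    if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
        fprintf(stderr, "VA-API init failed\n");
        return 1;
    }
    printf("VA-API %d.%d\n", major, minor);

    /* Ask the driver which entrypoints it offers for H.264 High
       profile. VAEntrypointVLD means full slice-level decode on the
       GPU -- the path a player like VLC wants. */
    int n = vaMaxNumEntrypoints(va);
    VAEntrypoint *ep = malloc(n * sizeof *ep);
    if (vaQueryConfigEntrypoints(va, VAProfileH264High, ep, &n)
            == VA_STATUS_SUCCESS) {
        for (int i = 0; i < n; i++)
            if (ep[i] == VAEntrypointVLD)
                printf("hardware H.264 decode available\n");
    }

    free(ep);
    vaTerminate(va);
    XCloseDisplay(dpy);
    return 0;
}
```

DXVA2 on Windows is the same idea behind a different API; the complaint above is that Mac OS X of this era offered no equivalent public API for third parties to call.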
 
They should have at least one facility in Dresden (Germany). I don't really know why you are bashing AMD so hard in this thread: maybe you were fired and now you are angry?

You are plain wrong. AMD owns no fabs anymore. You are thinking of Global Foundries, which AMD spun off and is majority owned by Arab investors.

I was not fired. I quit, 2 months before the California bar exam, so I would have time to study. I am bashing them because they had a huge lead and squandered it. I am bashing them because they lost money every quarter from Q4 2006 (except for the quarter where they got a one-time payment from Intel, and last quarter, where they got a one-time benefit from the fab spin-off). Their stock used to be in the $40s. It's not just me that's bashing them - it's the marketplace.
 
They should have at least one facility in Dresden (Germany). I don't really know why you are bashing AMD so hard in this thread: maybe you were fired and now you are angry?

Nothing says that Apple doesn't have their own custom ASIC design put together, in combination with the PA Semi and recently acquired former Exponential crew, built around x86. Leveraging AMD in this vein would be their only option, as Apple 1) doesn't have any fabs of its own (AMD does, or at least has arrangements with them) and 2) doesn't have an x86 license (AMD does) and more than likely wouldn't be granted one anyhow. In truth, I don't think they want one... they just want CPUs and GPUs that suit their needs, and if that means using Peter (AMD in this case) to push Paul (Otellini, as it were), then so be it.

Apple *WILL NOT* stray from x86 unless they can come up with something significantly faster on their own, at least when it comes to the desktop (as good as Intel is, portables = all ARM/Snapdragon/Cortex/Tegra/A4, etc.), but that doesn't mean Apple couldn't reach an agreement or maybe even a joint venture with AMD around modified versions of their core designs.

Beyond that... the biggest factor with ATi is that their newer GPUs are faster than (or at least on par with) NVidia's recently launched Fermi lineup, and at a significant power savings. Fermi took forever to come to fruition, ATi still has newer tech coming down the pipeline, and NVidia's desktop roadmap is a bit slower. (In truth, NVidia has some minor graphics advantages, but it's a bit of a Hail Mary: the feature they're pushing isn't implemented heavily in gaming thus far, and the overhead for the gains might not be desirable for Apple and others.) In that vein, since AMD owns ATi... there's a good chance that Apple might be looking at embracing an AMD/ATi GPU in place of the secondary NVidia chip (in conjunction with an Intel processor and IGP), or perhaps even a CPU/GPU combo.

Before I hear groans of "OH GOD, NOT AMD!", keep in mind... the core CPU performance of the x86 chip only has to be "relatively close" in today's game. With Apple's focus on OpenCL and Grand Central, it's more about multi-core, and potentially a heavily threaded multi-core GPU (and/or a primary GPU on die with a secondary GPU or GPUs alongside), than anything. A faster Intel Core i3/i5/i7 is all well and good... but when saddled with a subpar Intel IGP, a slower AMD processor could still come out superior in price/performance, with a more efficient and significantly more powerful GPU taking up the slack. Keep in mind, you can still get brute force out of either option with secondary GPUs (Apple is using NVidia chips alongside Intel's IGP with the new Core i3/i5/i7 laptop chips), but it's in "base" performance where an AMD/ATi Vision setup could have significant legs over the Intel/Intel IGP setup, even at lower power consumption and better performance per watt. I don't know that for a fact, but anything is possible, so we'll just have to see.
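That division of labor is the crux of the OpenCL argument: the host CPU mostly sets up buffers and queues work, while the data-parallel grunt work runs on whatever GPU is present. A hedged sketch of the model in host-side C (illustrative only; error handling omitted):

```c
#include <OpenCL/opencl.h>   /* <CL/cl.h> on non-Apple platforms */
#include <stdio.h>

/* Trivial data-parallel kernel: each work-item handles one element. */
static const char *src =
    "__kernel void add(__global const float *a,"
    "                  __global const float *b,"
    "                  __global float *c) {"
    "    size_t i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* Prefer the GPU -- the whole point of the setup: a "relatively
       close" CPU is fine when this part is fast. */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_mem ba = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem bb = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem bc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", NULL);

    clSetKernelArg(k, 0, sizeof ba, &ba);
    clSetKernelArg(k, 1, sizeof bb, &bb);
    clSetKernelArg(k, 2, sizeof bc, &bc);

    /* 1024 work-items spread across the GPU's cores. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, bc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}
```

Swap CL_DEVICE_TYPE_GPU for CL_DEVICE_TYPE_CPU and the same kernel runs on the processor instead, which is exactly the flexibility that makes the raw CPU benchmark matter less.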

By the way... for those preaching about AMD's laggard designs... in honesty, they were never laggard on the design front. They simply got beat wholesale on the fab front and were never remotely as good on the efficiency front (that is, since the Pentium M). They couldn't keep up with Intel, who was shrinking their dies and scaling better than AMD's fabs could. It wasn't until Core i3/i5/i7 that Intel finally matched AMD and moved the memory controller off the board and onto the CPU. In truth, yes, the AMD processors have always been power monsters... but Opteron, despite its expense and energy consumption, was still significantly more powerful than a comparable Xeon-equipped system. The difference that fell in Intel's favor? For the price of one Opteron system, you could likely build 3-4 Xeon boxes and still come out with a power-savings advantage.

Intel likely will eventually buy NVidia, because their GPU segment is failing them and they could seriously use some firepower here. I think this move with the i3/i5/i7 was an attempt to strongarm NVidia and whittle them down so they could swoop in and scoop them up (and pay mind to the reality that NVidia scared them with CUDA; the promise of using GPUs as additional CPU power literally began destroying Intel's stronghold on the budget GPU sales lead). Keep in mind, Intel was developing Larrabee as their own "Hail Mary" answer to the growth of GPUs as processing units (Larrabee, though, is just some neutered/scaled CPUs trying to serve the same sector), and it failed to prove competitive (hence it never shipped), sending Intel back to the drawing board to improve it (or, who knows, maybe even clean-sheet redesign it). This alone is proof that while AMD might be behind on CPU might, they're far from behind overall. In fact, they might just come out ahead (with major thanks to ATi).
 
Before I hear groans of "OH GOD, NOT AMD!", keep in mind... the core CPU performance of the x86 chip only has to be "relatively close" in today's game. With Apple's focus on OpenCL and Grand Central, it's more about multi-core, and potentially a heavily threaded multi-core GPU (and/or a primary GPU on die with a secondary GPU or GPUs alongside), than anything. A faster Intel Core i3/i5/i7 is all well and good... but when saddled with a subpar Intel IGP, a slower AMD processor could still come out superior in price/performance, with a more efficient and significantly more powerful GPU taking up the slack. Keep in mind, you can still get brute force out of either option with secondary GPUs (Apple is using NVidia chips alongside Intel's IGP with the new Core i3/i5/i7 laptop chips), but it's in "base" performance where an AMD/ATi Vision setup could have significant legs over the Intel/Intel IGP setup, even at lower power consumption and better performance per watt. I don't know that for a fact, but anything is possible, so we'll just have to see.

Great post Mr. Mackey.

I certainly don't want an all-AMD lineup, but Apple would do well to take the entry-level Macs and utilize AMD hardware. Fusion would be a nice platform for the Mac mini.
 
Nothing says that Apple doesn't have their own custom ASIC design put together, in combination with the PA Semi and recently acquired former Exponential crew, built around x86.

There are maybe 2 or 3 guys at Intrinsity from Exponential. I can only recall one (Blomgren) working on the PowerPC x704. Of the members of the PowerPC x704 team, 2 are at Sun, one became a patent agent, one went to AMD then retired, one went to Sun, then AMD, then became a lawyer, and the rest are scattered to various other companies.

Leveraging AMD in this vein would be their only option, as Apple 1) doesn't have any fabs of its own (AMD does, or at least has arrangements with them)

Their "arrangement" is they pay a fab to make their chips. Just like Apple does. Apple doesn't need AMD for that.

and 2) doesn't have an x86 license (AMD does) and more than likely wouldn't be granted one anyhow. In truth, I don't think they want one...

True. But the license applies to designs by AMD.

Apple *WILL NOT* stray from x86 unless they can come up with something significantly faster on their own, at least when it comes to the desktop (as good as Intel is, portables = all ARM/Snapdragon/Cortex/Tegra/A4, etc.)

Also true.



By the way... for those preaching of AMD's laggard designs... in honesty, they were never laggard on the design front.

Sure they were. Ever hear of K5? And Athlon was kind of crummy (though good enough to keep us in the game). After Clawhammer/Sledgehammer, they've done almost nothing other than slapping down more cores. The integrated memory controller, the point-to-point bus, the x86-64 instruction set, etc. were all done before 2002.

 
They are probably just talking about GPUs for the new Mac Pros that should (fingers crossed) be coming out in June.
 
They are probably just talking about GPUs for the new Mac Pros that should (fingers crossed) be coming out in June.

FFS, the thread in question mentions AMD CPUs.

The meetings have reportedly included briefings by AMD that have since enabled Apple to begin working with AMD processors in its labs as part of an initiative to position the chips inside some of the company's forthcoming products. While AMD offers a variety of embedded processors, Apple is believed to be investigating the chipmaker's workstation and notebook class CPUs.

It is believed that Apple is working with AMD to expand its potential sources for CPUs in order to increase its flexibility and broaden its competitive options, but also likely in response to problems it has encountered with Intel. These issues include limited availability of new processors (which is rumored to have slowed Apple's notebook refreshes) as well as new chipset designs imposed by Intel that have blocked the Mac maker's plans to continue a partnership with NVidia to deliver a standardized chipset for use with its Intel processors across all of its consumer computer offerings.


Why would there be "advanced meetings" to discuss GPU technology? Apple has already used AMD GPUs; there wouldn't need to be high-level meetings.
 
Maybe Apple is holding on to the Core 2 Duo in the lower-end Macs until AMD makes a CPU that's faster and more efficient, even if not as much so as the Core i3. Then Apple could sell the lower-end Macs at a lower price with a better GPU, since AMD is cheaper.
 
Why would there be "advanced meetings" to discuss GPU technology? Apple has already used AMD GPUs; there wouldn't need to be high-level meetings.

There are always high-level meetings to discuss these things. AMD used to send Jerry Sanders to Compaq and Sony all the time, even though they were already using our chips. It's how salesmen work. Keep the relationship going.
 
There are maybe 2 or 3 guys at Intrinsity from Exponential. I can only recall one (Blomgren) working on the PowerPC x704. Of the members of the PowerPC x704 team, 2 are at Sun, one became a patent agent, one went to AMD then retired, one went to Sun, then AMD, then became a lawyer, and the rest are scattered to various other companies.

That's fine, but the point is still that there's new CPU-related help that has at least "some" experience working on a desktop-capable CPU. Toss that in with the PA Semi people, and while I know their primary focus has been the A4 and embedded-class CPU/GPU advancement... none of what they've done has inherently been clean sheet. After all, the designs are based off of an existing architecture (ARM) that has been around for years, just as x86 has been. Exponential's experience lay in modifying an IBM/Freescale (née Motorola) design. Granted, I think PA Semi's experiences with PPC went better than Exponential's, as none of Exponential's efforts seemed to bear fruit. PA at least had a successful business model.

Their "arrangement" is they pay a fab to make their chips. Just like Apple does. Apple doesn't need AMD for that.

They need AMD for the x86 license, pure and simple. If not them, then Via. Those are the only two options outside of Intel, and obviously, of the two... which one is more "in the game" across the board?

Doesn't take a rocket scientist to figure that one out.


True. But the license applies to designs by AMD.

Hence, Apple rubs AMD's back, and in turn AMD gets a new jointly developed chip design to sell as their own (in fact, Apple wouldn't even have to be mentioned in the same breath as the chipset, and nobody would know any better). Apple benefits and helps push Intel's hand in a vein more favorable to their way of thinking: Intel either buying NVidia and getting up to speed with GPU-based computing via OpenCL, or at least granting NVidia a contract to keep them in the game against a more competitive AMD/ATi. Whether that comes at AMD/ATi's hands alone or with Apple's gentle poking and prodding... it doesn't really matter now, does it?

You have to realize... Intel at this point has held all the cards. When NVidia stepped up with CUDA and, later, Apple embraced OpenCL (which others got on board with)... it got Intel scrambling. True, Larrabee was probably in development before that started, but I'd easily wager there was snooping and rumor-mongering going on from inside that kicked Larrabee off as a test bed to combat the looming potential for GPUs to start eating into Intel's performance lead. After all, Intel is keen to NOT lose market share in any way. That includes keeping any advantage any way possible. They did that with NVidia by locking them out of the latest designs, and they obviously will be keen to thwart a resurging AMD/ATi if that indeed happens. It could prove harder, though, since they don't hold anything over either's head on the level they do NVidia's.

Never mind the fact that while much of AMD's design is still based around an older design, that design is NOT inherently flawed nor completely out of the game (in fact, Via is more out of the game than AMD; even though they have low-power chips, those chips don't have the grunt that Atom has, much less the ARM-based juggernaut). Keep in mind that Intel themselves turned the clock back from the PIV to the PIII version of the x86 architecture to build the Pentium M and the various Core processors that have come since. AMD is not THAT far out of the game, and as I noted earlier, it took Intel until the Core i3/i5/i7 family to catch up to what AMD has had since 2002! They lost out on 64-bit computing to AMD, they had the failure that has been Itanium, etc. etc. They're not infallible or unflappable either.

In truth, yes... the AMD processors are not as efficient, and that is somewhere Apple's acquisitions of PA Semi and Intrinsity could benefit AMD, and therefore Apple itself. Competition is good for the bottom line, as it brings forth greater price battles. After all, both have significant experience with processors that are quite powerful for being more embedded-class parts. In fact, this is another chink in Intel's armor, as let's face it... Atom is a flop in this sector and is more of a "tweener", falling between the desktop CPU and the mobile handheld CPU.

So while Intel "has been better" at efficiency on the desktop, it doesn't mean they always will be. I'm not saying Apple will buy AMD. In fact, I don't see a ton of value there other than the x86 license, which they stand to lose anyway. Yet DO NOT count AMD out for the benefits they could bring Apple in a joint venture. Apple, via PA Semi and Intrinsity, could all but fix AMD's woes for them, for the benefit of pushing Intel to give Apple the direction they want, or at least getting greater performance and efficiency for their software roadmap. Don't think that makes sense? Apple put a ton of investment into OpenCL and Grand Central and doesn't want to see it squandered. They had put a considerable effort into an NVidia/Intel arrangement that Intel has mucked with. Jobs, as anyone knows, doesn't take lightly to people fumbling with his plans. He parked use of ATi GPUs in the Macs for far less than that. Jobs may be friends with people at Intel, but friends or no friends... Apple = his true love, and he is shrewder than shrewd when it comes to advancing them.

Also, as I noted earlier... Fermi came way too late for NVidia, and its benefits vs. power usage aren't nearly as big as previous NVidia designs had shown. Apple could well be talking to AMD simply for ATi, although my gut tells me they want something along the lines of Vision, and that in and of itself requires an AMD processor. The only chink in the AMD roadmap is the efficiency of the Athlon, Phenom, etc. cores, and that can be rectified. Especially when you've got your own team of chip engineers/designers with knowledge of how to achieve just that.

As far as fabs... I know that Apple can go to a fab of their choosing (much as I assume AMD could, although more than likely none outside of Global Foundries or Intel's [obviously not going to happen] could meet their significant needs, I'm sure), but they can't obtain the x86 license any other way than going through Via or AMD. Bottom line, if they go through AMD or Via to get access to tweak an x86 chip... they're going to need AMD or Via for that license and to sell/brand that chip design (i.e. return on design investment via sales = greater financials to R&D future generations; where PowerPC failed is that it didn't have the volume sales to improve the breed), and more than likely they would just ride along with whatever fabs those companies elect to use. Bottom line, they can't do it themselves. There's nothing on the books that says Apple can't design a revised x86 chip and gift it to AMD or, better yet, sell it to AMD for considerations to come later (i.e. reduced costs per part, added control over the design of the ASICs, access to AMD's engineering patents for use in the Apple A# series of handheld/mobile processors, etc.).


Sure they were. Ever hear of K5? And Athlon was kind of crummy (though good enough to keep us in the game). After Clawhammer/Sledgehammer, they've done almost nothing other than slapping down more cores. The integrated memory controller, the point-to-point bus, the x86-64 instruction set, etc. were all done before 2002.

Once again... the big advances Intel made were efficiency improvements to an existing architecture that they've incrementally refined. If AMD could achieve greater efficiency per cycle (their key weakness) and find a way of fabbing at smaller process sizes on an equal scale to Intel (the $64,000,000 question, and the weakest chance of the lot IMHO), they could most definitely go toe to toe. Maybe not in total volume (they lacked the fabs even before their woes), but they're far from out of the game. In fact, they're really the only ones still left in the game. If we lose them, it's an end-game for competition on x86, the only thing left driving competition on the desktop. I think Apple is shrewd enough to see that themselves. Apple wants to have a part in shaping where x86 goes, and I think they're hitting their stride enough to help drive that. An Apple/AMD/ATi arrangement could give Apple that in, even if Apple elects not to exclusively align themselves or even knowingly show their hand. In fact, the less Apple becomes the direct "known" enemy of Intel's agendas, the better. That's another benefit of Apple throwing AMD a bone and getting out of the way.
 
God, I hope not. AMD processors are no longer in the same league as Intel's. Maybe for the lowest-end Macs, if any. :eek:

Really, where do you read that kind of crap?

If you put Intel and AMD on a price/performance line, AMD comes out slightly ahead of Intel.

Plus, the new six-cores are competitive with the i7 line. They seem to be practically identical.


I wish they'd update their lower iMacs soon. I hope with a nice six-core :D

And the MacBook update without ATI was disappointing.

I would keep Intel in the high-end iMacs and Mac Pro, just because there is little reason to change the motherboard and the internal design.
 
That's fine, but the point is still that there's new CPU-related help that has at least "some" experience working on a desktop-capable CPU. Toss that in with the PA Semi people, and while I know their primary focus has been the A4 and embedded-class CPU/GPU advancement... none of what they've done has inherently been clean sheet. After all, the designs are based off of an existing architecture (ARM) that has been around for years, just as x86 has been. Exponential's experience lay in modifying an IBM/Freescale (née Motorola) design.

Only in the sense that we started with the IBM/Motorola instruction set. We didn't have any access to their design, and every single transistor was designed by us from scratch.


They need AMD for the x86 license, pure and simple. If not them, then Via. Those are the only two options outside of Intel, and obviously, of the two... which one is more "in the game" across the board?

Doesn't take a rocket scientist to figure that one out.
Except the AMD and VIA licenses do not apply if the design is not an "AMD" or "VIA" design. However, you forgot the most obvious choice - IBM. They have a license to fab anyone's x86 design. All you have to do is pay them.



Never mind the fact that while much of AMD's design is still based around an older design, that design is NOT inherently flawed nor completely out of the game (in fact, Via is more out of the game than AMD; even though they have low-power chips...

Except it is a nearly 10-year-old microarchitecture.


As far as fabs... I know that Apple can go to a fab of their choosing (much as I assume AMD could, although more than likely none outside of Global Foundries or Intel's [obviously not going to happen] could meet their significant needs, I'm sure), but they can't obtain the x86 license any other way than going through Via or AMD.

AMD actually already has a deal in place to use IBM's fabs if necessary. And, as I pointed out, the easiest way to get an x86 license is to do what Montalvo, Transmeta, Rise, etc. do/did - go to IBM.
 
But I think it just goes to show that PCs easily last 6-8 years if you know what you are doing (and I'm sure Macs do too).

Here's how... http://boards.ign.com/teh_vestibule/b5296/191015261/

This is priceless. Are you serious? Run ccleaner on a regular basis and your PC will last 6-8 years? Only on Windows 7 of course, which hasn't been out 6-8 years so no one can call you out.

I don't know anyone that's owned a PC for more than 3-4 years that hasn't had to reinstall Windows at one time or another. That includes myself, and my PCs for work.

Conversely, I have a 5-6 year old PowerBook G4 that I have never reinstalled the OS on.

I'm not saying all PCs are junk. But I am saying that, in my experience (and many, many others'), computers running versions of Windows that have existed for 6-8 years require A LOT more maintenance than computers running versions of OS X that have existed for the same amount of time.

You can put your head in the sand if you like, but ccleaner is not the answer to that.
 
As Intel has no replacement for Apple's two-chip Core 2 Duo solution in the 13" MacBook, Mac mini, and entry iMac, AMD's Llano processors may be the answer.
 
I'm hoping Apple and AMD are able to collaborate on some designs. Ever since the original Amiga, I've been waiting for a system with multiple hardware chips, each designed to do one (or two) things well, rather than necessarily being multi-purpose.

If Apple and AMD were to collaborate on a system with a good enough CPU and a _lot_ of surrounding hardware, the result could be incredible.

Not to say that Intel couldn't do the same thing, just that they won't.
 
They should have at least one facility in Dresden (Germany). I don't really know why you are bashing AMD so hard in this thread: maybe you were fired and now you are angry?

Same thing everyone says to me as a former employee of Apple.

It's a tired argument. Some people decide to just change direction. I did, and it nets me an additional $50k a year.
 
Same thing everyone says to me as a former employee of Apple.

It's a tired argument. Some people decide to just change direction. I did, and it nets me an additional $50k a year.

Exactly. It's not like I left to go to Intel. In 2003 I saw that AMD was going badly awry due to a major change in management and I started law school. In 2006, when things had only gotten worse, I decided to follow through on the career change. (Though in my case I took a huge pay cut :( )

Weird that he'd believe me if I had GOOD things to say. After all, if a former employee had a bias, it would generally be positive.
 