I don't think Apple would use AMD unless their chips are as fast as or faster than Intel's current and likely future offerings. The secret talks Apple might be having with AMD might just be AMD showing Apple its future chips.
 
Maybe Apple is holding on to the Core 2 Duo in the lower-end Macs until AMD makes a CPU that's faster and more efficient than it (though not quite at Core i3 level), so Apple could sell the lower-end Macs at a lower price with a better GPU, since AMD is cheaper.
 
Ummm. Apple was actually interested in AMD BEFORE the company went with Intel. This is just another mouse peeking its head out of the hole.
 
I think I just had to pick my jaw up off the floor. I have both an Intel Mac and an AMD computer, and I have to say, I like the AMD better. It (an Athlon II X2 240) runs cooler than my dual-core Intel machine (my iMac). All in all, I really think that Apple partnering with AMD would be a good move on their part, because they would be able to lower prices, both for themselves and for the consumer, because AMD/ATI is cheaper. This ought to be interesting.
 
I've still got an iBook G3 from 2000 running 10.4 right next to me. What's the problem again?

I really believe that. The old Apple notebooks were very high quality. A friend has a PowerBook from 2000 or so that still works very well and even looks much more like a "modern" machine than some cheap notebooks sold today.

But Apple did not always do so well. I have a MacBook from 2006 (first generation). It suffered from case discoloration (fixed by Apple for free), the battery failed very quickly (replaced for free), the webcam failed after a year, the display started flickering for the first five minutes or so after being turned on, and the mouse button does not work any more (I don't care, because "tapping" still works). A few weeks ago all the USB ports failed (it does not recognize any external displays in either OS X or Windows). That finally made the unit quite useless for me.

Because I don't need much power on the go, I bought a netbook from Lenovo to replace it. While it is not truly a "high tech" computer, it is not as crappy as many people, including Steve Jobs, want to tell us. Speed (including boot time) is OK even with 1 GB RAM (Win 7 Starter, so no Aero etc.).

I will see how long it lasts, but if it fails after 2 or 3 years, I won't care much because of the low price.

And hey, it has built-in 3G. Something I like very much and that is not offered on any notebook by Apple (not even on a 2000€+ unit as a BTO option). Your opinions may vary, but I consider an iPad too limited for my use.

Christian
 
MBP 13 summer update

Anyone know if it's possible that Apple would launch another update of the 13-inch MBP this summer? (I heard somewhere that they are planning to cancel Core 2 Duo production during the summer.) And if they update the regular MacBook, they would have to update the MBP too, because the difference would be so minor... any opinions?
 
Presentation by one of AMD's chiefs of technology (esp. on Bulldozer):


http://webcastingplayer.corporate-ir.net/player/playerHOST.aspx?c=74093&EventId=2457769&StreamId=1370910&TIK={7315d172-dbd3-4fad-9751-e71d39c185f2}&RGS=3&IndexId=
 
In 2003 I saw that AMD was going badly awry due to a major change in management and I started law school. In 2006, when things had only gotten worse, I decided to follow through on the career change.
So the three years covering 2003 through 2006 was enough to sour you on AMD's capabilities forever, but the three years covering 2007 through 2010 is somehow not enough for AMD's design teams to get their act together again? If you haven't been with the company in the last few years then you're still just speculating like everyone else who is not in the know. Nobody here has debated ATI's current superiority over Nvidia and yet nobody is claiming that Apple should actively avoid all Nvidia chips due to their lack of cutting edge hardware at the top end. So why make such insinuations about AMD? It smacks of a reactionary knee-jerk response as opposed to an honest and thoughtful debate.

Not to mention that much of the current hierarchy and protocol at AMD is based on ATI's processes. If it worked for ATI, it just might work for AMD as well. In all honesty there isn't much, if any, downside for Apple to consider using AMD chips where they make sense, and plenty of upside potential both for Apple and their customers. I'm not an emotional promoter or detractor of either brand, but I certainly recognize the inherent benefits of active competition. Intel is an absolutely massive conglomerate compared to the likes of AMD, but Intel still doesn't always have the best chip for the best price in every possible segment. They never have and they probably never will.

Weird that he'd believe me if I had GOOD things to say. After all, if a former employee had a bias, it would generally be positive.
That comment is pure conjecture in the service of selling manufactured irony and these types of unqualified statements only draw the rest of your conclusions further into question. Not to mention that most of the people I know are either indifferent or actively negative toward previous employers. How many people do you hear waxing poetic over a former employer? The more they would profess their love the more curious it would seem that they're no longer working for them.
 
interesting:

http://www.theinquirer.net/inquirer/news/1601521/apple-shafts-nvidia-amd-employees

"
Apple shafts Nvidia using ex-AMD employees
A love triangle
By Lawrence Latif
Fri Apr 16 2010, 03:46

FLOGGER OF SHINY TOYS Apple has managed to shaft Nvidia, implementing its own graphics core switching technology by hiring former AMD employees.

The INQUIRER reported last year that Jobs' Mob welcomed Bob Drebin and Raja Koduri, who were senior figures at AMD. Sure enough, a year later Apple managed to produce a system very similar to Nvidia's Optimus.

The Green Goblin's clever system allows dynamic changing between integrated graphics and a more powerful discrete GPU depending on application profiles. Its critical feature was to do this seamlessly, without requiring the user to log out or flick a switch. At first it was widely thought that Apple had incorporated this technology in its latest Macbook Pro line.

The Macbook Pro has had switchable graphics since the late 2008 models, however those units required the user to log out and login again to their user account to make the change. The system incorporated in the latest Macbook Pro launched earlier this week, "toggles seamlessly" between Intel's integrated graphics and Nvidia's 320M/330M GPU chips, much like Nvidia's Optimus.

Nvidia has confirmed to us that the technology in the Macbook Pros isn't Optimus and that the switching technology incorporated in Apple's latest high-end laptops was all its own work, presumably with help from the firm's new employees.

For Apple, creating its own technology might help the company gain a bit more leverage against GPU designers allowing it, as with the Macbook Pro, to use multiple vendors' graphics hardware. Apple was caught out more than once with Nvidia's chips and while the cappuccino firm hasn't dumped the Green Goblin yet, it is clear that Jobs' Mob is reducing its reliance on any single vendor.

AMD on the other hand, having lost two of its engineers, must be happy that its alumni still manage to stick it to Nvidia when given a chance."
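
For anyone curious how that switching looks from an application's side, here is a minimal sketch using today's macOS Metal API (which postdates this article, so treat it as an illustration rather than how Apple's 2010 OpenGL driver stack actually worked). An app can enumerate the GPUs the OS exposes, spot the low-power integrated part, and declare NSSupportsAutomaticGraphicsSwitching in its Info.plist so that simply launching it doesn't force the discrete GPU on:

import Metal

// Illustrative sketch only: list the GPUs macOS exposes and note which is the
// low-power (integrated) part versus the discrete one. The seamless switching
// itself is handled by the OS and drivers, not by application code.
for gpu in MTLCopyAllDevices() {
    let kind = gpu.isLowPower ? "integrated (low power)" : "discrete"
    print("\(gpu.name): \(kind)")
}

// An app that is happy on the integrated GPU can pick it explicitly (and set
// NSSupportsAutomaticGraphicsSwitching = YES in its Info.plist so launching it
// doesn't wake the discrete GPU).
let device = MTLCopyAllDevices().first(where: { $0.isLowPower })
    ?? MTLCreateSystemDefaultDevice()
print("Rendering on: \(device?.name ?? "no Metal device found")")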
 
So the three years covering 2003 through 2006 was enough to sour you on AMD's capabilities forever, but the three years covering 2007 through 2010 is somehow not enough for AMD's design teams to get their act together again? If you haven't been with the company in the last few years then you're still just speculating like everyone else who is not in the know.

Nonsense. You act like I'm not still plugged into what's going on there. That I don't go drinking with some of the current employees. That I don't know the people in charge VERY well, having had years of experience working with them before (and while) they were in charge. And 2002, 2003, 2004, 2005, 2006 is a lot of years of screwups while I was still there. In 2007, 2008, 2009 AMD continued on the path from the previous 5 years - they did not tape out anything interesting, and continued making spins on designs from early 2002. They spun off their fabs, so they can no longer influence the process the way they used to. Their CEO had to resign in disgrace. The new CEO is a guy that once told the design team (when he was a manager in Texas) "if you don't like it, quit" - and 60 out of 100 people in Sunnyvale quit within a month. They used to hand-place and hand-instantiate each cell in the design for maximum efficiency and speed - now they rely on tools which perform 20% worse than humans (in order to save money).

I know EXACTLY what chips they are currently working on and WHO is working on those chips.

Think what you want. I find it hilarious that if someone says something bad about a former employer, everyone wants to assume it's some sort of evil grudge. It's just honesty.
 
Think what you want. I find it hilarious that if someone says something bad about a former employer, everyone wants to assume it's some sort of evil grudge. It's just honesty.
Saying something bad about your former employer isn't the issue. Claiming each and every single decision made has resulted (or will result) in a wholly negative outcome is the issue. Like Nvidia, AMD is currently in a transition phase that has yet to be fully realized. There are likely to be positive and negative repercussions of changes that have been and will be made to AMD's business model. In your comments virtually all these changes are known quantities that can be quickly and easily ruled as permanent failures. You claim to have inside information and based on that information you assert that AMD is perpetually doomed. If that's true then perhaps you could inquire as to why your friends on the inside are still working for a company they see going nowhere.
 
Saying something bad about your former employer isn't the issue. Claiming each and every single decision made has resulted (or will result) in a wholly negative outcome is the issue. Like Nvidia, AMD is currently in a transition phase that has yet to be fully realized. There are likely to be positive and negative repercussions of changes that have been and will be made to AMD's business model. In your comments virtually all these changes are known quantities that can be quickly and easily ruled as permanent failures. You claim to have inside information and based on that information you assert that AMD is perpetually doomed. If that's true then perhaps you could inquire as to why your friends on the inside are still working for a company they see going nowhere.

Maybe it's because they want to keep a job in this down economy and they have limited options. :rolleyes:

cmaier was clearly an engineer. Engineers are not stupid. They're trained to make good judgments and predict outcomes based on what's presented to them.
 
Saying something bad about your former employer isn't the issue. Claiming each and every single decision made has resulted (or will result) in a wholly negative outcome is the issue. Like Nvidia, AMD is currently in a transition phase that has yet to be fully realized. There are likely to be positive and negative repercussions of changes that have been and will be made to AMD's business model. In your comments virtually all these changes are known quantities that can be quickly and easily ruled as permanent failures. You claim to have inside information and based on that information you assert that AMD is perpetually doomed. If that's true then perhaps you could inquire as to why your friends on the inside are still working for a company they see going nowhere.

My friends work there because of golden handcuffs.

I can't speak as to the ATI division, but, yes, the processor division is doomed. You don't have to believe me. Look at their track record for the last 8-10 years, and keep in mind that any new microarchitecture that is sold takes 2-3 years of design time. Then look at their 10K's, their desperation moves (selling the fab, Arab investors, etc.), the well-publicized defections (Fred Weber, etc.) and put it all together. Look at the benchmarks over time. Look at their stock over time. If you choose to write off my statements, there are plenty of objective facts out there for you.

Sometimes a duck is a duck, and the fact that I am an ex-AMD employee calling it a duck doesn't make it any less of a duck.
 
All you need to evaluate AMD are the ****** chips that they produce. They are just a shadow of their former selves. It is a shame, because competition is what keeps things interesting. Sadly, Apple + AMD = the next Amiga.
 
My friends work there because of golden handcuffs.

I can't speak as to the ATI division, but, yes, the processor division is doomed. You don't have to believe me. Look at their track record for the last 8-10 years, and keep in mind that any new microarchitecture that is sold takes 2-3 years of design time. Then look at their 10K's, their desperation moves (selling the fab, Arab investors, etc.), the well-publicized defections (Fred Weber, etc.) and put it all together. Look at the benchmarks over time. Look at their stock over time. If you choose to write off my statements, there are plenty of objective facts out there for you.

Sometimes a duck is a duck, and the fact that I am an ex-AMD employee calling it a duck doesn't make it any less of a duck.


I take it in your opinion the last good thing that came out of AMD was the Athlon line.

I will say that your voice bashing AMD does not look very good, as you are a former employee.
AMD used to be a really great company, and I think they produced things far better than Intel's. Now, AMD's mobile line even back in the Athlon days was pretty much crap, but most people knew that.

I think AMD got hurt pretty badly by Intel's illegal business practices, everything from bribing to threatening the companies it sold to into buying only Intel. I know you, cmaier, have stated you know first-hand about some of the low-ball crap Intel pulled, and more than likely still pulls, to keep AMD from getting back up and going. AMD gets screwed from all sides and just cannot compete with Intel abusing its monopoly power.
 
I take it in your opinion the last good thing that came out of AMD was the Athlon line.

I will say that your voice bashing AMD does not look very good, as you are a former employee.
AMD used to be a really great company, and I think they produced things far better than Intel's. Now, AMD's mobile line even back in the Athlon days was pretty much crap, but most people knew that.

Athlon 64/Opteron were excellent chips. So were the x2's and the initial x4's. But they waited too long and Intel caught up by putting its own memory controller on-board and by using a point-to-point interconnect, and they have no visionary. They don't even have a real architect.

And AMD's graphics chips are pretty good (different team than the CPU guys, though).
 
Exactly. It's not like I left to go to Intel. In 2003 I saw that AMD was going badly awry due to a major change in management and I started law school. In 2006, when things had only gotten worse, I decided to follow through on the career change. (Though in my case I took a huge pay cut :( )

Weird that he'd believe me if I had GOOD things to say. After all, if a former employee had a bias, it would generally be positive.

For me, I was at a wall. I couldn't go any further. I took all the wonderful things Apple taught me, all the great experiences, and the memories with me to the corporate world. I think you and I took similar paths, in that a career one sees as amazing led us to another career that open-minded people would understand why we chose. I left Apple, and I know to the die-hards that means I am stupid, but I never really left Apple. I develop a corporate roadmap for them every day. Kudos to you, cmaier; we don't always agree, but I give you kudos on what you have accomplished.
 
Athlon 64/Opteron were excellent chips. So were the x2's and the initial x4's. But they waited too long and Intel caught up by putting its own memory controller on-board and by using a point-to-point interconnect, and they have no visionary. They don't even have a real architect.

And AMD's graphics chips are pretty good (different team than the CPU guys, though).

I have to say that I love the Athlon 3000+ that is currently running in my desktop rig. It does pretty well considering the computer is nearing 6 years old. At the time I built my computer, Intel was getting the crap beaten out of it by the Athlons, and I think an Intel chip costing over twice as much as my Athlon ($300 at the time) was losing big time to it.
 
Except the AMD and VIA licenses do not apply if the design is not an "AMD" or "VIA" design. However, you forgot the most obvious choice - IBM. They have a license to fab anyone's x86 design. All you have to do is pay them.

Once again... if AMD sells them with their name on them, well then... they're an AMD design. Just because AMD opens their engineering drawings up for scrutineering, tweaking, and collaboration with the PA Semi/Intrinsity team at Apple, only for the result to be sold as a *NEW* AMD processor, doesn't make it not an AMD part. If it's based off of their design and sold as their design, it's still an AMD design. It's not that hard to figure out. I mean, Chrysler sold the Crossfire despite it being engineered by Mercedes on one of their platforms, with their engines, and built at Karmann. It was still a Chrysler design, though. The same is true of the Chrysler 300, which Chrysler (under Fiat control) still sells, and which is built on a reengineered, two-generation-old E-Class Mercedes platform. Semantics can take you farther than you'd think, and if the core of the new AMD architecture has ties to Athlon, Phenom, etc. but is tweaked by PA Semi/Intrinsity for sale by AMD, it's still an AMD chip.

If taken another way, if you're going to get so tight regarding what is and isn't an AMD or VIA design, then neither ultimately has the right to upgrade their own design to anything new because it would deviate from their original core design. If that's how the license is written and interpreted, well folks... it's an end-game before it even started. You can't compete with Intel or anyone else that has the rights and ability to alter their original design on a whim. I'm pretty confident that that was NEVER the case. After all, if it was... we'd never have had AMD's x86-64 taking the lead over EM64T, because it wouldn't be allowed. Helping develop a chip is no different from a company changing its own inherent designs, so the real testament is who builds it and sells it. Just because Apple collaborates on the design and direction of future AMD product doesn't mean that the ultimate greater benefits of the design wouldn't go straight to AMD.

Similarly, though, nothing says that Apple couldn't rub AMD's back in exchange for sharing some of the processor patents from AMD's portfolio to help them with their A# chips (i.e. A4, A5, A6...) and the on-die GPUs, using some of ATi's patents as well. If AMD is as laggard and behind the times as you say, then this is a perfect opportunity for Apple to step in, have PA Semi/Intrinsity people help fix what ails them, and gain some IP in the process. It'd still be an AMD design in the end.

Except it is a nearly 10-year-old microarchitecture.

x86 is older than that as a whole... what's your point? For a 10-year-old architecture it's still holding its own remarkably well in multi-core Phenom machines against Intel's latest. They're not THAT far behind Intel's latest. They might not be the standard-bearers anymore (that, to be honest, has been gone for quite a while [I already highlighted that even though Opteron, pound for pound, was more powerful... it was also energy-thirsty and expensive, and you gained more in overall performance per watt by building multiple Intel Xeon machines over one Opteron box], yet they've still managed to keep bumping things to stay in the game)... but once again, it's not like Intel is mopping the floor with the ARM-based processors out there with Atom either. By that reckoning, should we consider Intel doomed? Sure, AMD hasn't done as well as Intel overall (although, once again... Intel has no real competitor for Tegra or Tegra 2)... but who has on the desktop, really? The reality is... AMD has done better than everyone else but Intel. VIA isn't remotely in the game there. IBM? Yeah... umm... not many new G6s or POWER5s on the consumer/business desktop, are there? Hrm... not many Cell desktops either?

AMD actually already has a deal in place to use IBM's fabs if necessary. And, as I point out, the easiest way to get an x86 license is to do what Montalvo, Transmeta, Rise, etc. do/did - go to IBM.

IBM's fab numbers have never been nearly as large as what AMD themselves had. True, they've had some bleeding-edge tech, but they've also had problems with yields on the PowerPCs over the years, just as Freescale/Motorola have. How is that any major advantage? Sure, IBM's found ways with their fabs to get down to 32nm, but so has AMD; it's just that they never had enough of them to compete with Intel. Intel could have problems with yields at over half of their fabs and still produce multiple times more than IBM and AMD combined could at theirs on a good day with high yields. They also scale better to smaller dies quicker than AMD, IBM, and Freescale have. While I don't doubt they will use IBM's fabs to meet demand if necessary, I also believe they'll continue to use their old fabs.

Bottom line... AMD will still be a huge customer of their old fabs. They might tap into some of IBM's unused production, but I still wager they'll stick primarily with their own (former) fabs first and foremost. The reason for the spin-off is simple... look at Intel and how many fabs they've closed as they continue to shrink die manufacturing to smaller and smaller sizes. As the chips shrink, the need for capacity has shrunk. That's a big part of why Intel is looking into the SSD market: from a financial standpoint, it saves them $ vs. closing down more fabs and laying off skilled labor. Fab costs = a huge expense, and by AMD freeing themselves of that, they place the burden of updating fabs to stay competitive into someone else's hands. Since they've been playing cat and mouse with Intel for an eternity at this level, pushing the fabs out into a separate company that they license manufacturing through helps them out, and it allows the chip fabs to expand their business beyond AMD's work. Not that this wasn't necessarily the case before (I believe I read at one time that IBM and NVidia both hired some fab work through AMD), but I'm sure it was more of a sensitive issue for companies within the same fields concerned about confidentiality of their designs.

As far as an x86 license goes... yeah, Apple could go to IBM, but what gain is there? I mean, if Apple wants to clean-sheet their own x86 design and look at integrating their own GPU (from where? The tech they bought is primarily based around more compact embedded graphics chip designs, not desktops) on silicon, great... but let's face the reality here. Apple isn't going to go that far with the desktop. They don't have enough overall volume to justify it. They're better off collaborating with an existing partner to drive the tech and improve it than clean-sheeting it and building their own desktop chips. It's not that they couldn't do it; it's just that there aren't as many positives in doing so as you'd think, and the strict revenue stream for continued R&D is much the same as it was for both IBM and Motorola/Freescale. By working with AMD or VIA, a co-designed chip that's sold as an AMD or VIA chip could be sold in enough volume to justify the custom work. The reasons for AMD over VIA = many, as AMD is more competitive overall, and AMD has their own graphics processors that are competitive, if not standard-bearing, in price, performance, and efficiency. That's the primary reason why Apple helping design AMD's next-gen chips makes some element of sense. AMD reaps the overall rewards in sales and sells parts to Apple at cost, and likely grants some IP access on patents to use for future Apple processors and graphics processors for the iPhone/iPod/iPad. Call it... synergistic, and without a need for Apple to buy AMD, without a need for Apple to invest heavily in AMD or ATi, and also without the requirement of an exclusive contract. In doing so, they also push Intel more into favorability with their needs... whether Intel buys NVidia or at least re-opens contracts with NVidia for Ion 2... either way it's beneficial to Apple's direction with OpenCL.

Now, working in collaboration with AMD, they can "assist" them with improving their 10-year-old design, speed-bumping it or improving its efficiency to better compete with Intel. In doing so, with the ATI video accelerators that AMD has in their portfolio, Apple could gain a valuable alternative to Intel's chips, one with better spec performance across the board than Intel offers with its IGP plus any discrete secondary processor. This is especially true considering how far NVidia has lost their way on the desktop with regards to Fermi, which, while quite powerful, is also a huge energy hog, and the performance isn't that much better than ATI's tech *TODAY*. ATI still has better stuff coming down the pipe later in the year to trump Fermi.

The point being... it's taken 8-10 years for Intel to get where AMD was in 2002. In some ways, yeah they've actually surpassed them as they're more energy efficient and have been long prior to the Core i3/i5/i7. In some ways though, from a financial "per part" case, AMD still holds some significant advantages over Intel. That's before you get to the reality that as anemic as Atom is... the Athlon in the Zino HD has better performance on the desktop and is cheaper than the Pentium Dual-Core, Celeron, Core 2 Duo, etc. For some applications, AMD has their advantages still. Not too bad for a company that, according to many, has sat on their hands for 10 years, eh?

I personally in no way believe that AMD hasn't been working on advancing what they have, but considering Intel's kickback schemes with the OEMs and their marketing muscle, strength in their number of fabs, ability to continuously and feverishly upgrade fabs, etc., they've been able to keep pace or prove superior even when their architecture wasn't necessarily the best all-around. That said, AMD obviously hasn't made anything landmark, but once again... what Intel has done has largely been to play catch-up on chip designs while racing AMD to get to smaller dies more quickly, until they could get their chipsets on par or superior. When AMD was at 65nm, Intel was at 45nm. That was how they kept a performance and efficiency advantage even with chipsets whose designs were clearly not superior to AMD's on paper, but were superior in how they were scaled down. AMD's energies over that time were spent trying to catch Intel at the fab level, which was never a war they could win, but they had to at least try to keep pace. If they didn't, they would've been out of the game long ago. That took a lot of $ and energy, and ultimately there was little they could've done to beat Intel at that level.

Oh, and for the person questioning my comment on an NVidia sale to Intel... I don't want to see it either, but they are vulnerable at this stage and it's a fit that suits both. Intel's GPU segment has been crap for forever and a day. NVidia has sort of diversified themselves to the point where they're in a quandary. Tegra and Tegra 2 are great but don't offer much over some of their competition at this stage (i.e. A4, Snapdragon, Cortex-A9, etc.). Fermi came way too late to the market, and sure, it's the overall performance champ, but it barely beat a year-old ATi design, requires way too much power to do so, and is more costly. Don't get me wrong... I'm an NVidia fanboy and honestly prefer them over ATi (ATi's drivers/software are garbage), but... I'm just being realistic here. After Intel locked them out of the more recent chips with Ion, to the point that NVidia has had to take extra time on workarounds and look at discrete options as their primary focus even as AMD and ATi use combined solutions (which almost lock them out of that market as well), it's pretty safe to say that NVidia has significant needs, as does Intel, when it comes to GPU-on-chip performance.

Similar to the fact that Apple won't develop an x86 chip of their own for sale, bet your bottom dollar that NVidia won't either, and NVidia ultimately has greater benefits for doing so (they know how to market to PC builders, they have a desktop GPU line that would integrate well with their Ion tech, they have experience in motherboard and chipset manufacturing, and they would gain from the development and volume). Reality is... you're not going to remotely compete with Intel on sales, and you're better off working with Intel or AMD than going it on your own. It's not like Transmeta, Montalvo, and Rise are selling in anywhere near the volume AMD and Intel are, much less VIA for that matter. If Intel won't work with NVidia, there are two options... assimilate or die. I'd love to see a third option for NVidia, but I just don't see it with the way things are going. I mean, yeah, they can stick around and be a niche player in the high-end graphics card market for gaming and workstation-level work, but they've even sort of slipped up with Quadro there too. They're becoming a jack of all trades, master of none, too quickly. I'd hate to see them go the way of 3dfx, but it's looking that way.
 
Once again... if AMD sells them with their name on them, well then... they're an AMD design.

I'm assuming you haven't actually read the license agreement.

Semantics can take you farther than you'd think, and if the core of the new AMD architecture has ties to Athlon, Phenom, etc. but is tweaked by PA Semi/Intrinsity for sale by AMD, it's still an AMD chip.
Not according to the license agreement, it's not.

If taken another way, if you're going to get so tight regarding what is and isn't an AMD or VIA design, then neither ultimately has the right to upgrade their own design to anything new because it would deviate from their original core design.
That doesn't follow at all. It's an AMD design if it is designed by AMD, and not by other parties.



x86 is older than that as a whole... what's your point? For a 10-year-old architecture it's still holding its own remarkably well in multi-core Phenom machines against Intel's latest.

Irrelevant. I said MICROarchitecture. A microarchitecture is a completely different thing than an Instruction Set Architecture. Just because it has "architecture" in the name doesn't mean it's at all related.



IBM's fab numbers have never been nearly as large as what AMD themselves had.

Huh? AMD had one fab - in Dresden. IBM had many fabs, including Fishkill, Poughkeepsie, etc. What are you talking about?

True, they've had some bleeding-edge tech, but they've also had problems with yields on the PowerPCs over the years, just as Freescale/Motorola have. How is that any major advantage? Sure, IBM's found ways with their fabs to get down to 32nm, but so has AMD; it's just that they never had enough of them to compete with Intel.

You are being silly. AMD and IBM co-developed their process. They are identical processes so that AMD could use IBM as a second source fab if needed. This jointly developed process has been used on all AMD chips for the past several years.



As far as an x86 license goes... yeah, Apple could go to IBM, but what gain is there?

I told you what there is to gain. AMD's license doesn't apply to things it "co-designs" with other parties. And IBM is in the fab outsourcing business while AMD has no fabs. IBM is in the business. AMD is not.


The point being... it's taken 8-10 years for Intel to get where AMD was in 2002.
6 years to get to where they were in 2004.


Similar to the fact that Apple won't develop an x86 chip of their own for sale, bet your bottom dollar that NVidia won't either,

Also funny - nVidia is working on an x86.
 
They can stick an AMD CPU in the Apple TV for all I care. Just upgrade the hardware on that thing already (or, better yet, let the Mac mini run the Apple TV software like it's Front Row or something, or just add the capability to rent HD movies to Front Row and stick an HDMI output on the thing). Apple TV doesn't need a "great" CPU. It just needs a BETTER CPU (or a GPU that can offload the decoding).
 
They can stick an AMD CPU in the Apple TV for all I care. Just upgrade the hardware on that thing already (or, better yet, let the Mac mini run the Apple TV software like it's Front Row or something, or just add the capability to rent HD movies to Front Row and stick an HDMI output on the thing). Apple TV doesn't need a "great" CPU. It just needs a BETTER CPU (or a GPU that can offload the decoding).

Yep. But a better idea would be ARM. You want it to run cool and fanless.
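
To put a finer point on "a GPU that can offload the decoding": what matters for a set-top box is hardware video decode, not raw CPU grunt. A minimal sketch, using today's VideoToolbox API on the Mac (which postdates this thread, so take it purely as illustration), simply asks whether the hardware can decode a given codec; if it can't, playback falls back to chewing up the CPU in software:

import CoreMedia
import VideoToolbox

// Illustrative check only: can the GPU/media engine decode these codecs in
// hardware? (VTIsHardwareDecodeSupported is a modern API, not something that
// existed when this thread was written.)
let codecs: [(name: String, type: CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]
for codec in codecs {
    let hw = VTIsHardwareDecodeSupported(codec.type)
    print("\(codec.name): \(hw ? "hardware decode available" : "software (CPU) decode only")")
}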
 