In all honesty I believe there's more motivation to put leverage on Intel or nVidia instead of actual adoption.
I honestly feel this is the case, too.
Otherwise, if it's true, it makes for a very depressing rumor.
The new AMD factory in New York will be a "foundry."
This may be good for Apple, as they would have more control over the functionality of their processor/platform. Does that mean more control over the user experience and more restrictive hardware configurations? I don't know.
Running your product through a foundry-based fab poses some challenges. Apple will be allotted blocks of time. If the yields are not what they're expecting, this could actually increase the cost of the processors. In that case, more fab time would be needed to prove the process, at a cost of course.
Intel is more flexible in that arena as they have more fab floor to absorb extra engineering experiments without taking massive hits on overall throughput.
This could be good for Apple, but with minimal gains for the consumer.
Precisely. If you wouldn't perceive a difference, wouldn't you rather pay $400 less for an AMD-based Mac Pro?
Intel has just announced record profits, which means they're once again taking advantage of weaker competition, and we're paying through the nose.
Intel's yield is much higher than AMD's.
And, at the end of the day, this factor is paramount.
HT is not the reason for Intel's greater performance with the i7s (and some i5s) over AMD's offerings. In other words, AMD can't compete even when you discount Intel's artificially raised thread count.
First of all, Apple wouldn't buy both AMD and nVidia. While I don't think it's an impossible rumor, I do think the chances of Apple doing any such thing are vastly remote. Given that a lot of nVidia's revenue comes from products focused on the PC market, it would cost Apple a considerable amount of money to purchase them, only to then either a) continue to produce GPU/GPGPU designs that are sold to competitors, so as to make a return on the investment, or b) focus them on designs for use in Apple's own computers, which would seem like a questionable investment at best. Apple has had no interest in incorporating a lot of nVidia's selling points, so I don't see why they'd rush to do so now.
When it comes to AMD, it's a bit more clouded. On the one hand, Apple would be acquiring a CPU and GPU design firm that also has a large interest in a foundry (Global Foundries). That could obviously provide a benefit in terms of tighter system design and integration with OS X, as they could dictate design goals to match what they wish to do. On the other hand, a lot of people think that purchasing AMD would allow Apple to manufacture x86 processors, but if I recall correctly from what I've read in the past, the x86 license is non-transferable upon AMD being purchased. I may be wrong about this, but if I'm right, a purchase by any company would essentially allow Intel to terminate the cross-licensing agreement.
There's no difference between a fab and a foundry. I think what you're trying to get at is the difference between owning a fab and contracting one. When you contract a fab, you are not "allotted blocks of time." You pay for wafer starts. The fab can only make X wafer starts per day, and you can buy some portion of that. In some cases you can even buy partial wafers, and share a wafer with other folks.
Wafer starts = fab time.
I spent countless hours trying to recapture <1%, which equated to hundreds of thousands of die.
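The points above about wafer starts and yield can be made concrete with a little arithmetic: when you buy wafer starts from a foundry, you pay per wafer regardless of how many dies work, so per-die cost scales inversely with yield. A minimal sketch, with entirely made-up numbers (no real foundry pricing is implied):

```python
# Hypothetical illustration of why yield drives per-die cost when buying
# wafer starts from a foundry. All figures below are invented examples.

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Effective cost of each working die from one wafer start."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_cost / good_dies

# Same wafer price, two yields: a yield drop raises unit cost sharply.
high = cost_per_good_die(wafer_cost=5000, dies_per_wafer=400, yield_rate=0.90)
low = cost_per_good_die(wafer_cost=5000, dies_per_wafer=400, yield_rate=0.60)

print(f"90% yield: ${high:.2f} per die")  # ~$13.89
print(f"60% yield: ${low:.2f} per die")   # ~$20.83
```

This is also why recapturing even <1% of yield matters: across millions of dies per month, a fraction of a percent is a large absolute number of sellable parts.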
I read only the first 4-5 pages, so I apologize if someone mentioned this before, but if Apple buys AMD (and NVIDIA), wouldn't that lawsuit between AMD and Intel be null? And wouldn't it give Apple an advantage, because they work exclusively with Intel, so they could make the NVIDIA graphics cards work more integrated with the Intel chipsets (like NVIDIA was trying to do, but Intel doesn't want them to)? Also, by buying AMD (for NVIDIA), they could get their next OS X to work better with the graphics cores? I'm no tech, but this seems like a good reason to buy AMD and/or NVIDIA.
What is wrong with AMD chips? That they benchmark slightly lower than Intel chips? That doesn't mean they aren't powerful enough to get work done, especially the kind of work average consumers do.
Can we stop this "sucks major nuts" and "doesn't compete for the same market" thing now? AMD and Intel both compete for the same market. It just depends on how you class your markets.
On the other hand, a lot of people think that purchasing AMD would allow Apple to manufacture X86 processors, but if I recall correctly from what I've read in the past, the X86 license is non-transferable upon AMD being purchased. I may be wrong about this, but if I'm right, a purchase by any company would essentially allow Intel to terminate the cross-licensing agreement.
You are making the same point. I say market, you can call it "class of market." The point I was making was that they are not after the same consumer. You seem to agree but like to argue? It's just as Apple isn't after the same consumer as other computer brands, or Lamborghini as Toyota. There is no $185+ AMD desktop CPU because they don't want to sell in the high-end CPU market. There is no $250 Apple netbook because they don't want to sell in the low-end market.
Where are the 930, 950, 960, 975, 980, etc. in your benchmarks? They're not there because AMD doesn't have a CPU to compete with them.
I give up.
(found the horse icon)
Not yet it doesn't.
If I can't do actual work on it for my actual job, then it's not a computer. It's as simple as that. I can do everything my job would ever ask of me on a netbook. The iPad? Not even close. But then maybe that's because I expect more than "pretty darn close" when a real paycheck is on the line.
You just compared a single core to a dual core. Quite likely the RAM was slower too, and the hard drive was definitely slower. Very poor reasoning.

I hate AMD. I used to have a Gateway MX6426 with an AMD Turion 64 mobile single-core chip @ 2.2GHz. Not only did that chip suck so much, it gave off a ton of heat. It would idle at 140F and run at 210F when intensive apps were used. The fan was fine. I applied Arctic Silver 5 to it; it ran a little cooler, but not much. The integrated ATi graphics would sit at 170F doing nothing and at 200-215F in games like NFSU. Its 4200RPM PATA hard drive and 1GB of RAM also contributed to its slowness. I sold that crap on eBay and bought a Dell Latitude D620. Its Intel Core Duo @ 1.66GHz is still faster. Yes, it is dual core, though. With Arctic Silver on the D620, it idles at 90F and hits 130F under full load; when I stop the fan, it can peak at 180F. The GMA doesn't go over 110F even under full load. It came with 1GB of RAM and a 40GB 5400RPM SATA hard drive. With those specs, and both systems running XP, my Dell outperformed the Gateway, yet both were from 2006: the Dell from January 2006 and the Gateway from the end of 2006.
My Dell Latitude D620 now has the same 1.66GHz T2300E Core Duo as before, but with 3GB of RAM and a 250GB 7200RPM hard drive inside. Add Mac OS X 10.6.3, and it runs seamlessly. I'm typing this on it right now.
That's not relevant, and Preston is a known Microsoft advocate.
I can tell from those specs that it is an underperforming piece of crap.
I have a netbook very similar to that one, except I have more memory, and it sucks.
There is no such thing as a good netbook. You could have stopped after the first line.