Brazos is superior to Atom, but is slower CPU-wise than the ULV Core 2 Duo and slower GPU-wise than the 320M. Whereas Apple going Sandy Bridge at least has upside for the CPU, Apple going Brazos looks to be a downgrade all around.
Brazos makes for a nice $500 netbook compared to Atom.

The socketed Mini-ITX and Micro ATX boards with an overclockable E-350 appear to be starting at $119 with USB 3.0 and $99 without.
 
The article update sure makes it sound like the door is very much open for a future MacBook Air/MBP 13" using BOTH Sandy Bridge AND an NVIDIA GPU for graphics. In fact, this might mean a closer partnership in general for Intel and NVIDIA, and Apple might have better options for utilizing latest-gen Intel processors AND NVIDIA GPUs, even in the small-form-factor machines.

I'm very very very very very happy with my MacBook Air 13" 2.13GHz 4GB RAM 256GB SSD (latest gen.) with Core2Duo, but when I'm ready to refresh in a few years, I have a feeling some really sweet options might be available!
 
added one more square inch to the board for a non-integrated external GPU solution.

Really Steve Jobs, one square inch isn't going to kill your design aesthetic.

No. It won't.

Or nVidia could produce a GPU that doesn't need a square inch of board.
Why is the 330M the same size as the 320M, which has a whole chipset's worth of additional functions that need associated pins?
Plus, Intel could come to the party as well. Why does it need a square inch of board for the chipset when nVidia could fit all that functionality plus a GPU and memory controller in the same space?

If each of them halved their footprint, there would be plenty of room.
 
Or nVidia could produce a GPU that doesn't need a square inch of board.
Why is the 330M the same size as the 320M, which has a whole chipset's worth of additional functions that need associated pins?
Plus, Intel could come to the party as well. Why does it need a square inch of board for the chipset when nVidia could fit all that functionality plus a GPU and memory controller in the same space?

If each of them halved their footprint, there would be plenty of room.

Because NVIDIA, AMD, and Intel make one or two models of a CPU/GPU and disable circuitry depending on demand or manufacturing issues. In the old days they would junk chips that didn't meet spec; now they disable circuitry and label it as a lower-model SKU.
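A toy sketch of that binning process (the model names, block counts, and clock cutoffs here are made up for illustration, not real NVIDIA/AMD/Intel rules):

```python
def bin_die(working_blocks, max_clock_mhz):
    # Hypothetical yield rules: fully validated silicon ships as the top SKU;
    # dies with some defective blocks get those blocks fused off and are sold
    # as a lower model; anything below the floor is scrapped.
    if working_blocks >= 48 and max_clock_mhz >= 550:
        return "Model 480"   # full-fat part
    if working_blocks >= 32:
        return "Model 460"   # same die, circuitry disabled
    return "scrap"           # in the old days, everything below spec went here

assert bin_die(48, 600) == "Model 480"
assert bin_die(40, 500) == "Model 460"   # too slow or too defective for the top bin
assert bin_die(16, 400) == "scrap"
```

The point is that both SKUs come off the same wafer; the lower model is just the same silicon with parts switched off.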
 
Can someone explain to me how what Intel is doing isn't antitrust/a monopoly?

Intel basically just monopolized the entire integrated graphics market, despite making a far inferior product to its competitors'.

How is this any different from when Microsoft bundled IE with Windows to try to monopolize the entire internet browser market, despite making an inferior browser?

Microsoft was sued for that and lost, so why not Intel?

At least with what MS did, there was an easy fix: anyone who realized that Netscape (which evolved into Mozilla Firefox) was the superior product could simply download and install it. That, however, is not an option for Apple and other consumers of chipsets; Intel will sue anyone that rips out Intel's crappy integrated graphics and puts in an integrated graphics solution that actually works well. I am still shocked by the fact that when NVIDIA started to do just that, Intel sued THEM. Imo, it should be the other way around: Intel should be the ones getting sued.

The FCC investigated Apple and Google for their purchase of AdMob and ---. Why aren't they investigating Intel about this?

Because if they go to court, Intel will argue that the computer market includes ARM devices as well, and then they don't have a large market share.
 
Because NVIDIA, AMD, and Intel make one or two models of a CPU/GPU and disable circuitry depending on demand or manufacturing issues. In the old days they would junk chips that didn't meet spec; now they disable circuitry and label it as a lower-model SKU.

So why don't they put the lower-function chips on smaller carriers that could go into smaller-footprint devices and be sold at a premium?
 
So why don't they put the lower-function chips on smaller carriers that could go into smaller-footprint devices and be sold at a premium?

A lot of TVs are using Atom these days, but it's not advertised anywhere. ARM is pretty good, but Atom still beats it for performance.
 
Hooray for anti-competitive behavior! Oh, we can't compete with this smaller company, so let's just buy them out and shut them down! The feds won't do anything about it. They like monopolies!!! Whee!!!!!
 
There is no way in hell they can justify still using the Core 2 Duos, at least not anywhere near the price they're demanding for them. Yeah, they're fine chips, but they're old as balls. Come on, seriously?

I really hope Apple starts offering an AMD option or something, at least if nothing else to light a fire under Intel's ass. Not sure if Apple has some kind of deal with Intel that would not allow that though.

In any case, I'm sure they'll use the latest Sandy Bridge CPUs (with their new integrated GPU) and pair them with a semi-new NVIDIA or ATI GPU.

The CPU is a minor portion of the cost of a notebook, so the 'price they are demanding for them' really wouldn't be that different with a Core 2 vs. an i3. What you are really paying for is the high-quality screen and unibody chassis (and the "Apple Tax").
 
I don't think that's pure licensing cost. Part of that is an agreement to avoid having to pay punitive damages as enforced by the court. Still, no way there will be a discrete nVidia GPU on an Intel processor die.

Well, no, certainly not, but there'll at least be NVIDIA influence to make it less crappy than it could otherwise be. And after reading a little more of the details, I don't think NVIDIA will leave having been too hurt by it. Still, it sucks that this took them out of the chipset business. They could've made some pretty kick-ass computers, PC or Mac.

If something gets discontinued, I'd think it'd be the white MacBook. Its price point is already overlapped by the MacBook Air, although the use case is slightly different, and could be further mitigated by bringing the 13.3" MBP down $100. I'd then see the 13.3" MacBook Pro get a low-end discrete GPU and Sandy Bridge. Eliminating the HDD for the blade SSD should free up enough room to put in a discrete GPU with some redesign. I'm hoping the ODD is retained, since that's a logical part of the Pro differentiation and is still necessary for installing larger software, given the common internet bandwidth limits in Canada, for instance.

I had that idea too, though every time I bring it up, it gets shot down because blade SSDs cost too much; though for the price point the 13" MacBook Pro is already at, I don't think it'd cost all that much more. I also get into some stupid argument with someone who is fervently bent on the death of the optical drive in computers that don't rhyme with "Gack Look Where?" or "Smack Eeny Swerver", and frankly, I'm with you on the retention of the ODD, because download speeds, even in the States, aren't enough to replace DVDs (for use with software and movies).

Otherwise, that idea is the only way I can see Apple successfully differentiating the 13" Pro from the other two 13" Mac laptops (along with the Mac mini) internally, though I doubt they'll do it. Plus, a previous rumor stated that the next MacBook Pro refresh would likely only see four updated models instead of the current six, which conveniently takes out the 13" MacBook Pro.

Otherwise, the fact that whitey costs the same as the 11" MacBook Air doesn't mean much to consumers who'd rather buy more computer for the same money. Make the 13" Air $999 and you'll sell me on that notion. Plus, the education market needs a Mac laptop made of a more durable material than aluminum, and polycarbonate fits the bill. I could see the two lines (white MB and 13" MBP) merge, though how they'd accomplish that, I couldn't even begin to guess. Either way, the next refresh will be very telling for both lines.


I hope Apple avoids AMD's first Fusion products. Llano may have a better GPU, but the CPU is Core 2 Duo caliber. It's missing out on Sandy Bridge's faster hardware video encoder and lacks SSE4.1 and SSE4.2, much less AVX, which is a big benefit for multimedia applications and is likely to be adopted by software much faster than getting developers to rewrite things in OpenCL to take advantage of the GPU.
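For context on why those instruction sets matter to multimedia code: SSE works on 128-bit registers (four 32-bit floats per instruction), while AVX widens that to 256 bits (eight floats). A toy Python sketch of the idea; real SIMD lives in compiled code, and the 8-wide grouping here just illustrates AVX's lane width:

```python
def scalar_scale(samples, gain):
    # Pre-SIMD style: one multiply per sample.
    return [s * gain for s in samples]

def vector_scale(samples, gain, lanes=8):
    # AVX-style: conceptually eight single-precision floats per instruction.
    out = []
    for i in range(0, len(samples), lanes):
        block = samples[i:i + lanes]          # one 256-bit "register" worth
        out.extend(s * gain for s in block)   # a single vector multiply in hardware
    return out

audio = [0.5, -1.0, 0.25, 2.0, 1.5, -0.5, 0.75, 1.0, 0.1]
assert vector_scale(audio, 2.0) == scalar_scale(audio, 2.0)
```

Same results either way; the win is that hardware with wider registers retires eight multiplies per instruction instead of one, with no algorithm rewrite needed.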

Yeah, Fusion seems to be a bit on the low end, like for netbooks. That said, it seems like it could wreck the hell out of Atom. Too bad Apple's starting CPU is the ULV Core 2 Duo. They really could've produced a lower-end version of the 11" MacBook Air with an Atom, though I suppose it's a much more capable machine with the ULV C2D.

This is bad news...

Intel and ATI have NEVER been good at making good mobile GPUs.

Intel has not made a single worthwhile GPU... EVER.

And for the people who say that these new Sandy Bridge GPUs are as good as the 320M, I say they are missing the point. THE 320M IS OLD... If NVIDIA were making a GPU to go along with the Sandy Bridge chips, it would more than likely be 2x the speed of what Intel will be offering.

ATI hasn't sucked all that badly with mobile, though I suppose Apple hasn't used them since the Mobility Radeon X1600 for a reason. The point to be made with the 320M against Sandy Bridge's IGP isn't that the former is older than the latter. It's that the former still beats the latter with an older CPU, and that the tests where the latter holds its own against the former are those in which the CPU is more heavily used.

Umm, why would the FCC investigate a chip company over chipset patents?

Because they effectively chased another company that should've had proper license of those patents out of business? Out of the chipset business, at least.

The article update sure makes it sound like the door is very much open for a future MacBook Air/MBP 13" using BOTH Sandy Bridge AND an NVIDIA GPU for graphics. In fact, this might mean a closer partnership in general for Intel and NVIDIA, and Apple might have better options for utilizing latest-gen Intel processors AND NVIDIA GPUs, even in the small-form-factor machines.

I'm very very very very very happy with my MacBook Air 13" 2.13GHz 4GB RAM 256GB SSD (latest gen.) with Core2Duo, but when I'm ready to refresh in a few years, I have a feeling some really sweet options might be available!

Nope. Not even close. The MacBook Air uses an integrated graphics processor. NVIDIA can't make one of those for use in the MacBook Air without doing so with a chipset that only works with the Core 2 Duo. They are STILL not allowed to develop a chipset that works with the Core i Series chips. The update said that Intel would be licensing NVIDIA's patents in Sandy Bridge. This only means that Intel can draw from NVIDIA's technology, not use its GPUs or chipsets, both of which aren't possible or feasible, respectively. Really, the update doesn't change anything, save for the notion that Intel has the potential to make their IGP not suck. But knowing Intel...

Or nVidia could produce a GPU that doesn't need a square inch of board.
Why is the 330M the same size as the 320M, which has a whole chipset's worth of additional functions that need associated pins?
Plus, Intel could come to the party as well. Why does it need a square inch of board for the chipset when nVidia could fit all that functionality plus a GPU and memory controller in the same space?

If each of them halved their footprint, there would be plenty of room.

I don't think it works that way. For one, the 320M isn't a discrete GPU; it is a chipset that has an integrated GPU on the same die. The GT 330M is a discrete GPU. You can't compare the two, as they are two different things. Intel could in theory make a chipset that (given that it is the same size as the 320M) also has an IGP, and then you could have the two IGPs do some unholy CrossFire/SLI-type thing with each other for better performance. But I don't think it works that way either.

**** Intel and their crappy graphics, put AMD in the darn computers

+1
 
Because they effectively chased another company that should've had proper license of those patents out of business? Out of the chipset business, at least.

So, Intel has invested tons of money in research, and patented the hell out of it. They should give that away?

Seems the consensus at MacRumors is that Apple should sue the hell out of anyone stepping on iPhone patents.

Is there a double standard here? :rolleyes:
 
So, Intel has invested tons of money in research, and patented the hell out of it. They should give that away?

Seems the consensus at MacRumors is that Apple should sue the hell out of anyone stepping on iPhone patents.

Is there a double standard here? :rolleyes:

They both were infringing on the other one's patents. And frankly, the notion of one of them suing the other is just as stupid as Apple suing Motorola or HTC or Google for infringing on an iPhone patent. It's as though I could patent oxygen and then sue everyone using an oxygen tank. Okay, fine, maybe not quite that bad, but still. For Intel to not even let NVIDIA license the patents it has so that NVIDIA could make chipsets for the Core i3/i5/i7 chips STILL strikes me as anti-competitive.
 
They both were infringing on the other one's patents. And frankly, the notion of one of them suing the other is just as stupid as Apple suing Motorola or HTC or Google for infringing on an iPhone patent. It's as though I could patent oxygen and then sue everyone using an oxygen tank. Okay, fine, maybe not quite that bad, but still. For Intel to not even let NVIDIA license the patents it has so that NVIDIA could make chipsets for the Core i3/i5/i7 chips STILL strikes me as anti-competitive.

In Intel's defense, though, Intel is fairly rapidly moving functions from the North and South bridges into the CPU package (for all intents and purposes, the North bridge is now gone; it's inside the CPU package).

If the CPU-bridge architecture is changing radically at each tick-tock, it would be difficult for a third party to keep up, or Intel would be forced to freeze some kinds of innovation to keep from breaking the licensees' products.

As NVIDIA is finding, building the CPU/GPU/chipset together is the winning strategy.
 
I hope that money wasn't spent just to make more GeForce GTX edition video cards for the upcoming Mac Pros or other Mac machines with Intel chips.

That would be BS! Slow, and it should have been out a while ago. Get it together and get updated.
 
Does Apple need to worry about games ever again?

Now that they have the marketplace and are pulling in all the iOS games to run on the Mac, they will be able to boast about the many thousands of new games for the Mac.

That's my concern: rather than building the Mac UP to run great games, they are building games DOWN to run on the Mac.

:(
 
download speeds, <b>even in the states</b>, aren't enough to replace DVDs

This made me laugh :D, bearing in mind that:

Top 5 Countries by Download Speed are:
South Korea : 36.5 Mbps
Latvia : 23.3 Mbps
Republic of Moldova : 21.5 Mbps
Japan : 20.3 Mbps
Sweden : 19.8 Mbps

(Source: Net Index)

The only thing in which we in the States are at the top is the <i>price</i> of broadband :mad:
 
Well, in the "country" I live in, I've got 60Mbps (about 7MBytes/sec), and at my uni with Ethernet I get about 600 to 700Mbps. The only thing I use the optical drive for is ripping audio CDs.
And I use an old Plextor drive for that, since the new drives are utter garbage.

Apple should just ditch the optical drive, put an SSD in its place, put in a bigger battery, add an nvidia/amd/whatever half-decent graphics chip, and call it a day.
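As a sanity check on the Mbps figures in the last few posts (the DVD size and speeds below are just the ones quoted in this thread): network speeds are megabits per second, so divide by 8 to get megabytes per second.

```python
def mbps_to_mbytes_per_sec(mbps):
    # Network speeds are quoted in megabits; 8 bits per byte.
    return mbps / 8

assert mbps_to_mbytes_per_sec(60) == 7.5   # matches the "about 7 MBytes/sec" above

# Minutes to download a single-layer DVD (~4.7 GB) at speeds from this thread:
dvd_megabits = 4.7 * 1000 * 8
for mbps in (20.3, 36.5, 60.0):
    print(f"{mbps:>5} Mbps -> {dvd_megabits / mbps / 60:.0f} min")
```

Even at the fastest connections above, a full DVD is a ten-minute-plus download, which is the crux of the ODD-retention argument earlier in the thread.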
 
Seems like NVIDIA got the short end of the stick on this deal. Intel doesn't have to invest any of their own time to make a decent GPU.
 
Very rarely do you tax a CPU anymore. GPUs get hit continuously, and storage I/O is key.

Unless you are playing games, doing high-end 3D content creation, or running GPGPU tasks, this is plainly untrue. For example, my PC (I have a MBP as well, before the tiresome "why are you PC owners posting here" clowns pipe up) in general daily use (stuff like web browsing, playing back video, and the like) never clocks my GTX 480 (actually a watercooled SLI pair :p ) above 50MHz, yes 50MHz. Oh, and that is under W7, which has a GPU-accelerated 3D-composited desktop. Also, my MBP has both a 9400M and 9600M GPU installed, and I have never noticed any performance difference between the two, even when driving both the laptop display and my 30" 2560x1600 Dell 3008. I expect I would if I played games or did high-end 3D content creation, but then if I did, a 320M would not provide suitable performance either. Something much more common to see is an app or thread maxing out one or two of my CPU cores.
 
Unless you are playing games, doing high-end 3D content creation, or running GPGPU tasks, this is plainly untrue. For example, my PC (I have a MBP as well, before the tiresome "why are you PC owners posting here" clowns pipe up) in general daily use (stuff like web browsing, playing back video, and the like) never clocks my GTX 480 (actually a watercooled SLI pair :p ) above 50MHz, yes 50MHz. Oh, and that is under W7, which has a GPU-accelerated 3D-composited desktop. Also, my MBP has both a 9400M and 9600M GPU installed, and I have never noticed any performance difference between the two, even when driving both the laptop display and my 30" 2560x1600 Dell 3008. I expect I would if I played games or did high-end 3D content creation, but then if I did, a 320M would not provide suitable performance either. Something much more common to see is an app or thread maxing out one or two of my CPU cores.

My MacBook with 9400M graphics is not smooth like my Hackintosh was. The Dock animation is jerky, and opening big stacks too. So either the software is poorly optimized or the graphics card really makes a difference. And if used with my full HD display, forget it, it lags all over the place.
 
Something else must be causing that, because, as I say, I have not had any issues like that on my setup, even when using a 4-megapixel display. Similarly, my work desktop has two 1600x1200 displays driven by a very weak GeForce 9300 GE (a whole 8 CUDA cores!), and that has no problem running the Aero desktop of W7 with over 30 application windows in memory.
 