After three pages and nobody bringing this up I'm sure I'm wrong, but I thought the Intel chips themselves were restricted to 4GB RAM. I don't know how that works, but that's what I had heard somewhere. Can anybody say whether there could be a limitation in any of the other hardware that would restrict the MBP to 4GB?

What does the FSB have to do with the RAM? The FSB is the link between the Northbridge and the CPU; it has nothing to do with the amount of RAM the system supports.

IIRC, Santa Rosa and Penryn only support 4GB of RAM. And I'm talking about the mobile chipsets here.

People, there have been a few users on these boards who have their previous-model Penryns running fine w/ 6GB of RAM. In fact they are far snappier that way. Keep in mind it's DDR2 memory.
 
Hybrid SLI will never happen

Apple won't enable those nVidia features, because it would KILL their claim of a 5-hour battery life. Probably the same thing with the memory, though I suspect the current HIGH cost of DDR3 may have prevented them from offering it as an option. DDR3 SO-DIMMs are still priced too high, with DDR2 costing nearly half as much. :apple:
 
I would settle for a little more thickness if they made part of the bottom shell vented, put on some little feet (so that air circulates under the machine), and put in some fans blowing outward. :p

No No NO NO !!!!!!!!!!
Vents in the bottom of a laptop are the dumbest idea ever. Apple laptops are the only ones without this MAJOR design flaw. It's just plain backwards to design a portable computer that has to sit on a clean, flat, hard surface not to overheat. Again: No NO no NO no. Never. :mad::mad::mad:
 
8GB, crap. Imagine the cost for 1x 4GB DDR3 chip

Costs about $300 for the full 8GB upgrade. Can't remember the site, but it has been talked about on here for a while now.

A good place to look for RAM is dealram.com. It lists RAM prices for different systems (MacBooks, Mac Pros, PowerBooks, along w/ computers from Acer, HP, etc.). It also covers flash memory (CompactFlash, SD, USB drives, et al). Pretty cool.

As for the graphics, I'd want to be able to switch on the fly as well as use both at the same time. But I think I'll wait for 4 things before buying something: a 17" MBP upgrade, Nehalem, an upgraded 30" Cinema Display, & Snow Leopard. Okay, you could probably combine the 17" MBP & Nehalem. By waiting, I'll be able to save up enough $$$ for all this. Besides, I'm saving up to buy my own car. Driving my parents' car gets kinda old. :(
 
Not just for games.....

Apple does still have a focus on things like video production and editing, and a properly implemented "Geforce Boost" type setup could really improve applications like Motion.

I think the bigger question is, how difficult would it be to implement this on the current Macbook Pro's configuration, where the 9600GT video is far faster than the integrated 9400M video? To prevent the faster chip from getting stuck in idle clock cycles, waiting for the slower one to finish, they'd have to devise some kind of setup where the 9600GT performed 2 instructions for every 1 instruction the 9400M was doing in parallel.

As I understand it, nVidia has no such setup in current Windows notebooks using Geforce Boost. Rather, they've used it when the 2 video chipsets in question were far closer together in speed, so they could simply do "1 to 1" splitting of instructions between them.

So like you say, they may well find the better alternative is to wait for Snow Leopard to tackle the "simultaneous use of video chips" issue, and do it more like you suggest - letting one GPU handle some of the number crunching tasks related to video rendering, while the other draws the actual dots and lines on the display.


Personally, I don't think GeForce Boost will ever be implemented on Apple computers. The main benefit of having GeForce Boost and getting 2 GPUs to directly work together is for games, which isn't exactly an Apple focus or of much benefit to the OS itself.

With Grand Central, Snow Leopard should be able to see the IGP and the discrete GPU as 2 separate processors and allocate tasks to each. SLI/GeForce Boost merges 2 GPUs to appear as 1 GPU, but that isn't necessary for GPGPU or OpenCL. Tasks can be distributed quite well to each separate GPU rather than 1 big hybrid GPU.
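A toy scheduler makes the point about treating the two GPUs as separate compute devices. This is only a sketch in plain Python, and all the speeds and task costs are invented for illustration (nothing here comes from real 9400M/9600M numbers): independent tasks get handed to whichever device frees up first, so no SLI-style merging of the two chips is needed.

```python
import heapq

def schedule(tasks, gpu_speeds):
    """Greedily assign independent tasks to whichever GPU frees up first.

    tasks: list of task costs (arbitrary work units)
    gpu_speeds: work units per time unit for each GPU
    Returns the makespan (time until the last task finishes).
    """
    # Priority queue of (time_when_free, gpu_index)
    free_at = [(0.0, i) for i in range(len(gpu_speeds))]
    heapq.heapify(free_at)
    for cost in sorted(tasks, reverse=True):  # biggest tasks first
        t, gpu = heapq.heappop(free_at)
        heapq.heappush(free_at, (t + cost / gpu_speeds[gpu], gpu))
    return max(t for t, _ in free_at)

# Made-up workload: discrete GPU ~4x the IGP's throughput
tasks = [8, 5, 5, 3, 2, 2, 1, 1]
only_discrete = schedule(tasks, [4.0])
both = schedule(tasks, [4.0, 1.0])
print(only_discrete, both)  # prints 6.75 6.0
```

Even with the made-up 4:1 speed ratio, adding the slow device still shortens the total time, which is the argument for letting the OS schedule work per-device instead of pretending the pair is one big GPU.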

This won't directly improve gaming, but nVidia already has the GPU-accelerated PhysX engine. Games that use PhysX can use the discrete GPU for all the visual work while the PhysX engine runs on the IGP. This doesn't require SLI/GeForce Boost to work. As OpenCL and CUDA become popular, I'm sure this will be the way to go, rather than SLI/GeForce Boost, which only benefits the visual work in the game and leaves nothing free for the physics and other computational work.

And someone mentioned before that the impediment to on-the-fly switching between the IGP and the discrete GPU may well be Quartz Extreme, since it's probably difficult to shunt the frame buffer from one GPU to the other without logging out. It may never be truly on-the-fly switching, but hopefully Apple will get it down to a few seconds' pause while it switches, without a full log-out.
 
Are you disabling the speed throttling? What do the thermals of the GPU look like? What program are you running to test the GPU?

RivaTuner, and yeah, I've disabled the throttling. It gets up into the high 80s / low 90s F but never crashes, and the case is cooler than my old X1600 MBP was when just browsing the freaking net. ATITool detects no artifacts and neither do mine eyes during gaming.

But as impressive as that sounds, and as high as the 3DMark score might be, the real-world performance is a bit meh. It's fine for games a year old or more; you can play those butter smooth at max settings. Some well-optimised new games like Spore, and probably Left 4 Dead and Far Cry 2, will run OK on medium settings. But anything like Crysis/Crysis Warhead, STALKER, World in Conflict, or other high-end games out or coming out from now on has to have its settings turned down so low to play smoothly that it looks worse than two-year-old titles.

It's not the clock speeds that are the problem with the 9600M GT, it's the number of shaders and the memory bandwidth. They're not so hot. This card is basically just rehashed tech from over a year ago, slightly overclocked now that it runs cooler.
 
I think the bigger question is, how difficult would it be to implement this on the current Macbook Pro's configuration, where the 9600GT video is far faster than the integrated 9400M video? To prevent the faster chip from getting stuck in idle clock cycles, waiting for the slower one to finish, they'd have to devise some kind of setup where the 9600GT performed 2 instructions for every 1 instruction the 9400M was doing in parallel.

It doesn't work that way. If you want both cards to work on the same frame, you have to give, say, one third of the frame to the slower card to render and two thirds to the faster card (and that is difficult enough). But you never know ahead of time how long it will take to render some part of a scene. Say you have a video game with a really nice graphical effect when someone shoots a gun. That effect might take a lot of time to render, and if you are unlucky, you gave that part of the screen to the slow card (remember: the application doesn't know where things are on the screen). So everything slows down enormously while that part renders. Or you have an otherwise empty scene with a group of people in really elaborate clothing. The third of the screen with those people might take longer than the rest of the scene. Bit of bad luck, and all the hard work is given to the slow card.

And you can't set up the graphics card so the fast one draws 20 people and the slow one draws the other ten, because they can't both draw into the same screen areas.

The best you can do is to assign each GPU to its own monitor, if you have more than one (built-in + external monitor).
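The unlucky-split scenario described above can be shown with a toy model in plain Python (all region costs and speeds are invented for illustration). With a static screen split, the frame is only finished when both GPUs are done, so one expensive effect landing in the slow GPU's region stalls the whole frame:

```python
def frame_time(region_costs, split, fast_speed, slow_speed):
    """Time to render one frame with a static vertical screen split.

    region_costs: per-column render cost across the frame, left to right
    split: fraction of columns (from the left) given to the slow GPU
    The frame is done only when BOTH GPUs finish their portions.
    """
    cut = int(len(region_costs) * split)
    slow_work = sum(region_costs[:cut])
    fast_work = sum(region_costs[cut:])
    return max(slow_work / slow_speed, fast_work / fast_speed)

# 10 columns of screen; a muzzle-flash effect makes the left edge expensive
quiet_frame = [1] * 10
flashy_frame = [9, 9, 1, 1, 1, 1, 1, 1, 1, 1]  # effect lands on the left

# Slow GPU (speed 1) gets 30% of the screen, fast GPU (speed 3) the rest
print(frame_time(quiet_frame, 0.3, 3.0, 1.0))   # prints 3.0 (well balanced)
print(frame_time(flashy_frame, 0.3, 3.0, 1.0))  # prints 19.0 (slow GPU got the flash)
```

The split was tuned for the quiet frame, but the moment the expensive effect lands on the slow card's side, the frame takes over six times longer, even though the fast card finished its portion almost immediately.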
 
Personally, I don't think GeForce Boost will ever be implemented on Apple computers. The main benefit of having GeForce Boost and getting 2 GPUs to directly work together is for games, which isn't exactly an Apple focus or of much benefit to the OS itself.

Perhaps in the past, but that seems to be more & more a priority for Apple.
http://www.apple.com/games/

Apple used to ignore games and not take them seriously. They focused on the fine-arts pros... but then they realized games move machines and did an about-face, riiiiight around the time Jobs came back and introduced the iMac. Look at the new MBPs: they're built with gamers in mind much more so than musicians, film makers & photographers, imo.
 
Okay, MBP = possible dual GPU... but no Apple support, because there's no software.
But what's up with possible Windows software for the dual GPUs?
 
When we say Apple will ultimately allow us to use both at once, I think that is more likely to take the form of one GPU acting normally and the other acting as an OpenCL device.

I don't expect they would enable the two GPUs to act as one in an SLI type arrangement. That would seem like a massive driver effort.

The fast switching will surely be coming, maybe even before 10.6

I disagree - nVidia already has the working binaries for this - it's simply a matter of Apple wrapping them into their driver framework and extending the kernel enough to acknowledge more than one GPU.
 
Costs about $300 for the full 8GB upgrade. Can't remember the site, but it has been talked about on here for a while now.

http://www.crucial.com/store/partspecs.aspx?imodule=ct2kit51264bc1067

No, more like $1200 for the 8GB kit! It would be cool, but I think 8GB would probably consume more battery; that's probably why Apple caps it at 4GB, just like they're not enabling SLI in the new MacBook Pro. Think about it: 8GB, an Intel Core 2 Duo @ 2.8GHz, and SLI NVIDIA graphics. How much battery life would you have? I say an hour! :D
 
Okay, MBP = possible dual GPU... but no Apple support, because there's no software.
But what's up with possible Windows software for the dual GPUs?

nVidia stated that Apple's implementation isn't exactly the same. I'm guessing this is largely due to the difference between having a BIOS versus EFI. They did say they would specifically be writing new drivers to support running both GPUs under Windows Vista properly. There is no support for Hybrid SLI under XP however.

SLI shouldn't be hard to incorporate, since the code is out there and OS X is now 100% binary compatible with nVidia's driver base; it's just a matter of wrapping those binaries up into a proper KEXT and ensuring that everything else in the OS (pretty much only the kernel and the OpenGL implementation) can make use of dual GPUs. Again, nVidia should be able to help with the OpenGL part.
 
Wirelessly posted (Iron Man/3.0 (Iron Man Suit; U; Arc Reactor 5_5; en-us) IronManKit/7.18 (KHTML, like Gecko) Version/3 Iron Man Web Browser/885.30.9)

Hey guys, I'm just on the road right now with the top down, but I thought I'd just say that I totally believe this rumor! Can't wait to see it!
 
http://www.crucial.com/store/partspecs.aspx?imodule=ct2kit51264bc1067

No, more like $1200 for the 8GB kit! It would be cool, but I think 8GB would probably consume more battery; that's probably why Apple caps it at 4GB, just like they're not enabling SLI in the new MacBook Pro. Think about it: 8GB, an Intel Core 2 Duo @ 2.8GHz, and SLI NVIDIA graphics. How much battery life would you have? I say an hour! :D

Hi,

I would have assumed that more memory would result in better battery life, since the hard drive would be accessed less often. Maybe I'm mistaken.

Still, I agree with other posters. 640k is more than enough memory.

s.

;)
 
Actually... it's currently only in the MacBook line, not in the iMacs, Mac Pros, or Mac mini yet.

Well, there's no real need to put it into a desktop computer. But the real issue here is that we're talking about the dual graphics chips, which only the MBP has, so my comment was off. If it's only in the MBP, I don't know how likely Apple is to implement it.
 
Originally Posted by mgpg89
NO - BECAUSE neither Apple nor Nvidia has released drivers to support it under Windows. Plus you have to run Vista to get the benefits.
And if a solution does come out from someone other than Apple, to those who are curious, be careful about temperatures. You don't want to damage your system due to the chip proximity in the MBP. I'm sure some groups like Bare Feats will publish information if that time comes.
 
Apple won't [....]DDR3 so-dimms are still priced too high, with DDR2 costing nearly half that much. :apple:

Same could be said about DDR when DDR2 came out. So what's your point?

 