That's a MacBook Pro with a dedicated graphics chip.

I've downloaded the demo and have been playing it. It works very well.
It even supports widescreen. Very playable. I'm going to buy the game. I wanted to check out the demo first, and I'm very pleased. Almost everything is set to max too.
I have a 15" 2.16 GHz Core 2 Duo MBP.
"GMA950 is not for games, full stop."

C'mon! That's just not true and you know it. I can play WoW on my MacBook, and many older games or new games with low requirements run perfectly fine on a MacBook.
"Bioshock on a 24" iMac at full res/max settings. http://www.youtube.com/watch?v=FbY1kTP79vc"

The only Apple computer that I would recommend playing Bioshock on is the Macbook Pro. It has a DX10 video card, and it's much faster than the one in the new iMacs. (DX10 cards are still not available for the Mac Pro platform, although ATI's X1900XT is a solid DX9 performer, despite being severely out of date now.)
"The GMA950 is not for games, full stop."

That's strange, because I could have sworn I've played games on my MacBook. But I guess if socam says it, it must be true.
Hahah, nice try, but you got the wrong card. That article you refer to was the press release for the 'cut down' 256MB version of the card.
Unfortunately, that video shows that the iMac is not fast enough to play the game at maximum resolution and settings. The YouTube video just shows the first 10 minutes of the game, which is entirely scripted. The video you see at the beginning, 'Atlantic 1960', is just an AVI... the first bit of 'almost' actual gameplay comes after 8 minutes of the video, and even then it's still scripted, with relatively little required of the GPU. And yet it is still jerky: well below 30 fps, I'd say.
"The only Apple computer that I would recommend playing Bioshock on is the Macbook Pro. It has a DX10 video card, and it's much faster than the one in the new iMacs."

Rob-ART disagrees with you. His results show the iMac being very marginally faster, except in Quake 4 for some strange reason.
"Hahah, nice try, but you got the wrong card. That article you refer to was the press release for the 'cut down' 256MB version of the card."

Apple has always been a little bit behind on their video cards, and I doubt this will change. But I guess you're right, 18 months is too old; let us all throw out anything that's of that age or older, including pets.
The press release for the X1900XT 512MB card is here:

And it is from January 2006, which makes the card approximately 18 months old - as old as the Intel Core Solo processor!
Get my drift?
Add to this the fact that nVidia's 8800 series cards have been out for nearly a year (since November 2006), and Apple really need to get with the programme...
Because that is typical of hard-core PC gamers. One year is severely out of date.
I don't care what the article says. The X1900 line came out at the end of January 2006; I know because I bought mine at release. For a video card, that is ancient history.
"Because it is typical of the hard-core PC gamers. One year is severely out of date."

Welcome to consumer electronics. If you can't handle the PC upgrade schedule, stick to consoles.
I'm sorry, I'm just sick of the ridiculous upgrade cycle. There's enormous online peer pressure of sorts about graphics cards. They're almost instantly 'outdated' in everyone's eyes, and decent budget cards supposedly suck if they can't run the latest games at max resolution. It's just silly.
The results are all close because those games are all old news. The Doom 3 engine should no longer be used for benchmarking.