Uh, no it's not. The X3100 has a lot of the latest features and support:
http://softwarecommunity.intel.com/articles/eng/1488.htm
http://en.wikipedia.org/wiki/Intel_GMA
The Mac drivers might not be quite there yet but it's tons better...
As an "old school" PC gamer (and console gamer), I learned MANY years ago that what's said on paper is always VERY different from real-world performance.
This is especially true of Intel's paper specifications for its GPUs.
Look at the GMA 950 for a second. Many of its paper specs put it above all of nVidia's and ATI's current and past integrated GPUs. In some respects, on paper, it's on the same level as the GeForce 8400M GS in my HP.
The GMA X3100, on paper, looks better than the GeForce 8400M GS in my HP.
However, what is the real world performance like?
Well, I had another HP with the ATI Xpress 200M chipset (an integrated GPU, but with 128MB of dedicated memory) and a Turion 64 ML-37 (2GHz), along with 1GB of RAM and Windows XP MCE 2005. Unreal Tournament 2004 held a solid 30fps in ALL situations at 1280x800 with medium settings across the board on that system. Half-Life 2 also ran at 30fps at the same settings as UT2k4.
On my MacBook? Well, it has a 2.16GHz Core 2 Duo, GMA 950, and 1GB of RAM. In XP, using the latest drivers, UT2k4 can barely choke out 30fps at 800x600. Half-Life 2, at any resolution, is a slide show in most circumstances unless certain hardware and software modes are forced and visual quality is greatly sacrificed (and graphical atmosphere plays a HUGE role in this game).
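For anyone wondering what "forcing modes" means in practice, here's a rough sketch of the kind of Source engine tweaks people (myself included) fall back on. The launch options and console variables below are real Half-Life 2-era Source settings, but the exact combination that helps varies from machine to machine, so treat the values as illustrative:

```shell
# Steam > Half-Life 2 > Properties > Set Launch Options...
# Force the DirectX 7 rendering path and a low resolution.
# This guts the shader effects (and the atmosphere with them),
# which is exactly the visual sacrifice I'm talking about.
-dxlevel 70 -width 800 -height 600
```

You can also knock down individual effects from the in-game developer console (`mat_specular 0`, `r_shadowrendertotexture 0`, `r_waterforceexpensive 0`), but in my experience no amount of this makes the game look like it's supposed to on Intel integrated graphics.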
The GMA X3100, in nearly all benchmarks, scores roughly the same as the GMA 950, give or take a couple of frames here and there.
What about the GeForce 8400M GS in my HP? Despite the fact that its paper specs are roughly the same as the GMA 950's and X3100's, the real-world performance is another story. I get a rock-solid 60fps in UT2k4 at 1280x800 with everything maxed, and the same goes for Half-Life 2 and most of Half-Life 2: Episode 2. Doom 3 runs smooth as butter at any setting. Call of Duty 4, a game that would make the X3100 cry, runs just as well as it does on the consoles. It even runs UT3 great, as long as I use reasonable settings. This is all on a Core 2 Duo at 2GHz (Santa Rosa) with 2GB of RAM on Vista. I just gave Vista another try with the latest WHQL-certified nVidia drivers available through Windows Update, and my gaming performance is now BETTER in Vista than it was in XP.
Long story short, despite what Intel says about the X3100 on paper, real-world tests (except for 2 or so) show the GMA 950 still being the better performer, with much more stable drivers across the board. Ironically, even though the GMA 950 does NOT support OpenGL 2.0, its OpenGL support actually works better in practice! Even user experiences I've read on other forums back this up. I remember a thread on one notebook forum where people were trying to figure out how to disable hardware T&L on the X3100 just to get games playable. Essentially, they were taking it "down" to the level of the GMA 950 so they could actually play games on it.
The GMA X3100 is truly a joke. I would MUCH rather have the proven performer with the stable drivers and be "stuck" at "only" 3GB of RAM than go with something that has a terrible track record after a year of availability and the ability to handle one more gigabyte of memory.