Video memory is not the bottleneck in modern GPUs; the number of shaders and the memory clock speed are.
A pixel = 1 byte. 1920x1080 (aka Full HD/1080p) = 2,073,600 bytes, or about 1.98MB.
Displaying that image 60 times per second on a screen? About 118.65MB. See where I'm going? Even 256MB, provided the memory itself is fast enough, is plenty to drive a very large screen at very high refresh rates.
There are quite a few things wrong with this.
a) A pixel is usually 4 bytes when you use a 32-bit color space: 1 byte per RGB subpixel and 1 byte for the alpha channel.
b) All you are talking about is the framebuffer, which is only a small part of what is stored in VRAM; the GPU needs quite a lot more than that for one frame.
c) A framebuffer only stores one frame; there are other buffers that hold 1 or 2 more frames, and usually no more than 2-3 frames need to be stored at any given time. In this respect the refresh rate really doesn't matter. So the framebuffer memory is actually less than you calculated (see the sketch after this list).
d) Back in the very old days it really was just framebuffers, but today the GPU is responsible for drawing the picture itself and needs all kinds of additional data for that. In games that means textures, geometry, overlays and so on. All of this can add up to a lot, especially in big outdoor levels.
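To put numbers on a) and c), here is a quick back-of-envelope sketch (Python, just for the arithmetic). It assumes 4 bytes per pixel and triple buffering; the exact number of buffered frames depends on the driver and the app.

# Back-of-envelope framebuffer math: 4 bytes per pixel (RGBA) and 2-3
# buffered frames (double/triple buffering). The refresh rate never enters it.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffered_frames=3):
    return width * height * bytes_per_pixel * buffered_frames / (1024 * 1024)

for w, h in [(1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.1f} MB for 3 buffered frames")

# prints roughly 23.7 MB for 1920x1080 and 46.9 MB for 2560x1600

So even at 2560x1600 the display buffers themselves stay well under 50MB; it's everything else the GPU keeps around that eats the VRAM.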
Even the standard 2D OSX desktop can eat memory like crazy. Drivers use loads of tricks to conceal a lack of video memory. Saying that 256MB is easily enough is just nonsense.
Here is a bit of info about how much video memory an OSX desktop wants:
http://www.anandtech.com/show/2804
A single Safari 4 window required 7MB of video memory at 2560 x 1600.
Each 12MP image from my digital camera that I open in Photoshop eats up around 56MB of video memory.
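That 56MB figure is roughly what you'd expect. A quick sanity check, assuming the 12MP image sits uncompressed in VRAM at 4 bytes per pixel (the rest would be working buffers and the like):

# Rough check on the ~56MB Photoshop figure: a 12MP image kept uncompressed
# at 4 bytes per pixel, before any extra working buffers.
pixels = 12_000_000
bytes_per_pixel = 4
print(f"~{pixels * bytes_per_pixel / (1024 * 1024):.0f} MB")  # ~46 MB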
And here is something about gaming and system + video memory:
http://www.tomshardware.com/reviews/ram-memory-upgrade,2778-6.html
--------------
What is the best choice for a computer that should last you 3 years?
Currently, with most engines, the 6750M with 512MB is either too slow anyway or just as fast as it would be with 1024MB on OSX. I just wanted to point out that snaky's theory is wrong. Still, right now you won't feel much of a difference with either VRAM size.
The problems, as described at THG but to a bigger extent, are with what you can do. Today, and for a long while to come, all developers will optimize enough for 512MB, so there will be little performance difference on default settings. Change some settings, though, and with enough VRAM the frame rate drops by maybe 5%, while with too little VRAM it drops by 30%.

Artifacts and popping textures are another problem. If there isn't enough VRAM and some texture isn't resident, it will be loaded, but not before the engine, for performance's sake, just says "get me that data, I will go on for now." Some other service loads the data and the texture pops up in a later frame. That can be annoying, but it rarely (if ever) happens in tight CoD environments, and many people just ignore it or don't even notice it.
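For anyone wondering what that "get me that data, I will go on for now" looks like, here is a toy sketch of the idea. All the names (get_texture, load_from_disk, the placeholder) are made up for illustration; a real engine uses a dedicated streaming system, but the pattern is the same: render with a stand-in now, swap in the real texture a few frames later.

# Toy sketch of asynchronous texture streaming: if a texture isn't resident
# in VRAM, the current frame gets a placeholder and the real data "pops in"
# once a background loader has finished. Names here are purely illustrative.
import threading

resident = {}                      # texture_id -> data currently in VRAM
PLACEHOLDER = "low_res_placeholder"

def load_from_disk(texture_id):
    # Stands in for a slow disk read plus upload over PCIe.
    resident[texture_id] = f"full_res_{texture_id}"

def get_texture(texture_id):
    if texture_id in resident:
        return resident[texture_id]              # no popping, already loaded
    threading.Thread(target=load_from_disk, args=(texture_id,)).start()
    return PLACEHOLDER                           # shows up as popping later

print(get_texture("rock_wall"))    # first frame: low_res_placeholder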
A lack of VRAM is something few people notice, but there are problems. For the next 3 years you will most likely do fine with 512MB. If you like playing ARMA 2, 1GB might be worth it. 512MB configurations won't die quickly, because most IGPs still don't exceed that and won't for a while to come. 256MB is less and less supported; that is where you run into trouble at times. I doubt 512MB will reach that state in the next 2-3 years.
In OSX, 512MB is enough for all kinds of casual work. If you open multiple big pictures in PS (which isn't a casual program anyway) or do some serious movie editing, some things might not be as smooth anymore, but with video editing other things usually cause more problems. CAD, rendering and other pro workloads rely much more heavily on lots of VRAM, which is why Tesla, Quadro, ... GPUs used to come with the biggest VRAM configs the chips supported.
As a normal user, I would only worry about games.