Agreed 100%. A MacBook Air will definitely run most software, but it'll be a pain in the a$$. A few years ago I bought myself a small 10" netbook for a 2-month vacation in Europe. I really only needed it for email, TV shows when bored, and storing photographs from my DSLR. It served its purpose perfectly. I had Photoshop installed on it, as well as Lightroom, but doing any kind of serious work on it wasn't feasible.
Now, given that the Atom processor inside that netbook sucked for Photoshop purposes (I'm a graphic designer), it really boggles my mind how I used to work on a dual-processor Power Mac G4 at 867MHz with less than a gigabyte of RAM... for 7 years!
I remember those. I didn't own one, but I worked on a few. You really had to dial down your settings and set up a scratch disk to keep it from being an annoying experience. We can get more power than that from a Mac mini today, but the problem size hasn't stayed static either. Sometimes updating a computer can change the way you work. As for RAM, I think attitudes will shift as applications become better optimized for 64-bit. For a long time the application itself was the soft bottleneck, so there wasn't a huge drive for higher-density RAM in laptops and desktops for a couple of years; the pressure came more from mobile phones on the low end and possibly servers on the high end, but I haven't kept up with that so I'm not sure (others on here know way more about it than me).
Just out of curiosity, what do you edit that eats up that many resources? I use PS mostly to make textures. The highest resolution I usually go for is 4096x4096. I've edited PS documents that size with 30+ layers in them: multiple image layers with alphas, various adjustment layers, all that neat stuff. In all the years I've been using PS, I don't think I've ever seen it peg higher than 2GB.
I'm not calling you out or anything; I'm honestly curious here. Unless you're editing 20MP RAW photos with 60+ layers, I can't imagine what could push you beyond the 4GB mark.
edit: I thought about it for a second and pretty much answered my own question. If you're doing pro photography or advertising work, you're going to be dealing with tons of lossless RAW images, vector graphics, and who knows what else. Texture work is considerably less strenuous on a computer in comparison; I'm usually dealing with much smaller .jpg and .tga files.
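A quick back-of-the-envelope sketch of why that texture workload stays small, assuming each layer is held uncompressed as 8-bits-per-channel RGBA (a simplification; Photoshop's real in-memory layout is more involved):

```python
# Rough memory footprint of a layered document, assuming uncompressed
# 8-bit-per-channel RGBA layers (a simplification of how Photoshop
# actually stores things in memory).
def layer_bytes(width, height, channels=4, bytes_per_channel=1):
    return width * height * channels * bytes_per_channel

def document_bytes(width, height, layers, channels=4, bytes_per_channel=1):
    return layers * layer_bytes(width, height, channels, bytes_per_channel)

GIB = 1024 ** 3
size = document_bytes(4096, 4096, layers=30)  # 30 RGBA texture layers
print(f"{size / GIB:.2f} GiB")  # 1.88 GiB -- right around that 2GB ceiling
```

Each 4096x4096 RGBA layer is only 64 MiB at 8 bpc, so even 30 of them barely cross 1.9 GiB, which lines up with never seeing PS peg past 2GB.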
Heh... 30MP seems like the low end these days. It's a matter of size and settings. If you need to composite a lot of 32-bit files at sizes of 6K and up, it takes a lot of RAM. The thing is that with 64-bit application builds and cheap RAM, you can use RAM for much of what used to be handed off to scratch disks. If you look back a few years, making this stuff tolerable generally meant 8 bpc was your only option; if you encountered banding in smooth areas, you blended it with noise and made sure one channel wasn't blocked up and causing the issue. Dedicated, or sometimes RAIDed, scratch disks were common for large images, and you had to watch settings like thumbnails, history and cache levels, and everything else with large files. The G3 and on really started to make the price tags of the older graphics workstations feel redundant, but back then you still had to adjust your work to what the hardware would support.
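The jump from 8 bpc textures to 32-bit composites is easy to put numbers on. A sketch assuming uncompressed 32-bit-float RGBA layers on a hypothetical 6K-square canvas (the 6144 edge length is just an illustrative choice for "6K"):

```python
# One 32-bit-float channel is 4 bytes, so a 32-bit RGBA layer costs
# 16 bytes per pixel -- four times an 8 bpc layer of the same size.
def layer_bytes(width, height, channels=4, bytes_per_channel=4):
    return width * height * channels * bytes_per_channel

GIB = 1024 ** 3
per_layer = layer_bytes(6144, 6144)  # one 6K 32-bit RGBA layer
print(f"{per_layer / GIB:.2f} GiB per layer")       # 0.56 GiB
print(f"{8 * per_layer / GIB:.1f} GiB for 8 layers")  # 4.5 GiB
```

At over half a GiB per layer, a modest stack of 32-bit layers blows past a 4GB machine before you've touched history states or caches, which is why cheap RAM changed what felt feasible.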
Today you can get away with almost anything. I was just saying that a lot of RAM gives you quite a bit of freedom with your settings, regardless of bit depth or layer count (32-bit isn't entirely uncommon if you're comping renders and photography). I think the next step would be a functional linear workflow for digital camera files, with the typical gamma-correction curve and standard profile applied for viewing only until the very end. Floating-point math would open up a lot of editability, but I don't know whether they'd have to reconsider rasterization and channel-interpolation methods given the nature of digital cameras and their RGBG Bayer sensor arrays. Basically, the pixels not only have gaps between them, but each one records only one color, with the other two interpolated; the doubled green channel is there to maintain ideal perceived acutance.
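The linear-workflow point can be shown in a few lines: blending gamma-encoded values directly gives a different answer than blending in linear light. This sketch uses a plain 2.2 power curve as a stand-in for the real sRGB transfer function (which is actually piecewise, with a linear toe):

```python
# Averaging two pixels in gamma-encoded space vs. in linear light.
# A simple 2.2 power curve stands in for the true sRGB curve here.
GAMMA = 2.2

def to_linear(v):  # gamma-encoded [0, 1] -> linear light
    return v ** GAMMA

def to_gamma(v):   # linear light -> gamma-encoded [0, 1]
    return v ** (1.0 / GAMMA)

black, white = 0.0, 1.0

naive = (black + white) / 2  # blend the encoded values directly
linear = to_gamma((to_linear(black) + to_linear(white)) / 2)  # blend in linear light

print(f"naive blend:  {naive:.3f}")   # 0.500
print(f"linear blend: {linear:.3f}")  # ~0.730 -- noticeably lighter
```

A 50/50 mix of black and white comes out darker than real light would, which is exactly the kind of error a view-only gamma curve over a linear working space avoids.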
This seems to be turning into a post on why I wish Photoshop was more like Nuke.

Anyway, it's quite liberating not having to close out programs or keep history settings low to avoid lag. I'm not sure about his testing methods, but diglloyd showed gains on many of his tests up to around 32GB of RAM using an image around 10K or so. That's large, but it's not that uncommon.