Look in the news about Dell and NVidia GPUs - they were probably second or third, after HP, to 'discover' that the GPUs literally burn or melt the chip's package.
Perhaps a 'new' Dell will be better, but news has it (mixed with a little rumor, but then look at this website's name) that all NVidia chips from about 18 months ago to the present are actually affected; they just won't (or can't) own up to it yet.
They'll send you a patch to keep the fan running, and hope the thing lasts until your warranty expires.
I'm no particular fan of any brand, including Apple - everyone has had their fair share of problems - so double-check, with skepticism, any laptop with NVidia GPUs, even Apple's machines.
However, I'll bet you get better service from Apple than from Dell if there is a problem.
A friend of mine (a professional, non-tech person) has a Dell laptop purchased perhaps a year or so ago. It came with Vista when XP was still an option (she didn't purchase one of the 'business' machines that continued to have XP downgrade rights).
Her cell-based mobile broadband device, which still had another 8 months on contract, had no driver for Vista - something she didn't realize until after she and her husband had spent 3 weeks trying to get it to work. I had to locate and install beta drivers, developed by an open-source volunteer, to get it to function. It worked, and they paid me $25 to find and install the driver (they offered $100, but really - they're friends, not clients).
Anyway, since then she has received a replacement keyboard, which she tried to install herself (that took 2 days before she called me over to do it for her). The keyboard was a replacement for the one she ruined trying to replace something else that was under it (I forget what that was).
I remember how 'nice' the machine looked (and still looks): Vista Home Premium with the glowing buttons (the first time I had seen the Vista interface). It seemed a bit foreign compared to XP, but 'pretty' - and slow for a dual-core Intel machine with 256 MB of GPU memory - but nice looking.
If you can find a legitimate XP disc in your collection, and assuming you can or are willing to 'transfer' that license to your 'new' Dell, I'd personally choose to run XP if I'm going to run Windows at all. I highly recommend the 64-bit version, too - it's faster, more stable, and actually uses the RAM you give it.
As a developer, I can tell you without hesitation that, although XP and its NT ancestors were genuinely good technical accomplishments (rumors to the contrary are not as 'real' as many profess), the interior of OSX is, to a developer and engineer, considerably more 'modern' and advanced in design. OSX is based on Unix, which has a 30+ year history of reliability. This particular branch might be aged only 20, and Apple's 'generation' of the kernel and the other choices that went into putting it together mean some of it is even more recent, but the point is that the technological roots of OSX are extremely well exercised and proven. Unix was a great choice to use as the basis for OSX. Windows struck out on its own, re-invented the wheel, and didn't do as well.
OSX's Unix-based file system approach is much better than Windows'. One place you see this is in managing large files on disks where the write cache is disabled. OSX's method for mapping the location of files is more sophisticated (and the Unix design allows for pluggable file system alternatives). In Windows, for example, if you have a file that's 4 GB (typical enough these days) and the allocation size is 4 KB (common), there will be roughly a million entries in the file table to map that file. If you copy the file from one drive to another with write caching disabled, the machine may appear to hang for several minutes (sometimes 15 or 20), even though the file COULD be copied in about 3 minutes. It makes you wait through hundreds of thousands of directory-management steps. Of course, turning on the write cache 'fixes' this - at the risk of losing directories on every errant shutdown.
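To put a number on that, here's a minimal sketch of the arithmetic, assuming the 4 GB file and 4 KB cluster size from the example above (hypothetical figures, not taken from any particular machine):

    #include <stdio.h>

    int main(void) {
        /* Assumed figures matching the example above: one 4 GB file
           on a volume formatted with a 4 KB allocation unit (cluster). */
        unsigned long long file_size    = 4ULL * 1024 * 1024 * 1024; /* 4 GiB */
        unsigned long long cluster_size = 4ULL * 1024;               /* 4 KiB */

        /* Each cluster the file occupies needs its own allocation entry,
           and every one gets touched again when a copy is flushed to disk
           with write caching off. */
        unsigned long long entries = file_size / cluster_size;

        printf("Allocation entries for one file: %llu\n", entries); /* 1,048,576 */
        return 0;
    }

That's where the "million entries" figure comes from, and why the copy stalls when each of those bookkeeping steps has to hit the disk synchronously.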
The long-awaited WinFS file system, which was supposed to be in Vista, never appeared. They had to strip it because they couldn't get the code into 'release condition' soon enough. They still haven't shipped it, as far as I know.
Unix methods are far superior in this regard, and Apple inherited that advantage when it decided to base OSX on Unix and then offered a range of file systems to choose from. I don't see this mentioned much in the argument between OSX and Vista - I usually hear more generic terms like reliability, virus susceptibility, and user interface quality - but there are mundane specifics like this, forming the real guts of your computer system, that you would never know about unless you had a reason to look, as I do (as a developer).
Everywhere you turn to examine the differences, you find points like this to be made. There are a few in Windows' favor (especially Windows 2000 and XP), but they are few and of limited value. Critical sections, for example - a technique used to protect memory shared by multiple threads in an application - are much faster and lighter-weight in Windows than the equivalent locks on Unix/Linux and the Mac. The Mac is more like Linux than you might believe, because under the hood is the ancestor of the whole Linux/Unix family line.
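For the curious, here's a minimal sketch (mine, not anything shipped by either vendor) of the two primitives being compared: a Win32 critical section versus a POSIX mutex guarding the same shared counter. The Win32 version avoids a trip into the kernel when the lock is uncontended, which is where the speed advantage comes from.

    #ifdef _WIN32
    #include <windows.h>

    static CRITICAL_SECTION cs;
    static long counter = 0;

    static void bump(void) {
        EnterCriticalSection(&cs);   /* user-mode fast path when uncontended */
        counter++;
        LeaveCriticalSection(&cs);
    }

    int main(void) {
        InitializeCriticalSection(&cs);
        bump();
        DeleteCriticalSection(&cs);
        return 0;
    }
    #else
    #include <pthread.h>

    static pthread_mutex_t mtx = PTHREAD_MUTEX_INITIALIZER;
    static long counter = 0;

    static void bump(void) {
        pthread_mutex_lock(&mtx);    /* the portable equivalent being compared */
        counter++;
        pthread_mutex_unlock(&mtx);
    }

    int main(void) {
        bump();
        return 0;
    }
    #endif

The API surface is nearly identical; the difference the point above is about lies in how much work each lock does under the hood on the common, uncontended path.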
In most ways that matter - and the closer you look to compare - you'll realize there's a lot to appreciate about the design of Mac OSX compared to anything from Windows.
With all of that said, I still develop applications that target Windows - I'm in business, so I have to cover a wide audience. The only 'catch' I ever agree with for designating a Windows machine for a particular purpose is when a required application is only available on Windows; there are a few such applications, but most users don't need them.