Mundy said:
Clovertown is a 64-bit CPU.

Ask your PC-loving IT guy if he uses Windows XP64 and more than 4 gigabytes of RAM. If not, then 32-bit processors are apparently okay for him, too.

hee hee - thanks.
 
ncook06 said:
I'm just wondering if I can drop one of these into an iMac... Are they pin-compatible? Also sort of wondering about a heat issue.
Nope - different processor architecture. Even if the pins were the same, the motherboards are different.

Both Clovertown and Woodcrest are 'Xeon' chips, which is a particular processor family. Chips in one processor family may be replaceable with others from that family, but generally different families are not replaceable with each other (unless specifically designed to be).

Also, the iMac is a 32-bit computer, and these are 64-bit chips, reducing any possibility to zero.
 
epitaphic said:
Probably due to latency involved in distributing the load across the two processors. That's the same problem a single Clovertown would have. Only true quads wouldn't suffer from these problems (the earliest seems to be Harpertown in 2009; I don't know if there are any non-MCM Xeons scheduled before then).

What about Tigerton (2007)? Isn't that a "true" quad?
 
I remember thinking "My goodness, what would I ever do with all this power" when playing with the Apple Store quad Woodcrests. The future is very bright :)
 
RichP said:
Personally, I still see data transfer, namely from storage media, as a huge bottleneck in performance. Unless you are doing something really CPU-intensive (video editing, rendering, etc.), most of the average "wait time" is the damn hard drive.

Arrays of cheap RAM on a PCIe card?

The RAM companies don't seem interested in making wodges of slow cheap hi-cap ram, only in bumping up the speed and upping the capacity. For the last 10 years, a stick of decent RAM has always been about £100/ $100 no matter what the capacity / flavour of the moment is.

Even slow RAM is still orders of magnitude faster than a HD, hence my point. There are various historical and technical factors as to why we have the current situation.

I've also looked at RAID implementations (I run a RAID5) but each RAID level has its own problems.

I've recently seen that single-user RAID3 might be one way forward for the desktop, but don't really know enough about it yet.
 
relimw said:
Very cool. Now to find apps (os10.5 direct blind support?) that can make use of all those cores. :cool:

One app would be iTunes. I noticed iTunes was running 14 threads last night. Any time you have a multithreaded application, or are running multiple single-threaded applications, more cores can help.

Some server applications (the Apache web server and many DBMSes) use a "process per client" model, where a new process (another instance of the server) is created for each client connection. A busy web server might have 100 copies of Apache all running at once. 8 cores would help there.
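For anyone curious, here's roughly what that "process per client" model looks like; this is a minimal, hypothetical sketch in Python (nothing to do with Apache's real code), where each accepted connection gets its own forked process and the kernel is free to spread those processes across all available cores.

[CODE]
# Minimal sketch of a "process per client" echo server (Unix only).
# Purely illustrative -- real servers like Apache are far more elaborate.
import os
import signal
import socket

def handle(conn):
    """Echo whatever the client sends, then hang up."""
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break
            conn.sendall(data)

def serve(host="127.0.0.1", port=8080):
    signal.signal(signal.SIGCHLD, signal.SIG_IGN)   # let the kernel reap finished children
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen(64)
        while True:
            conn, _addr = server.accept()
            if os.fork() == 0:          # child: serve this one client, then exit
                server.close()
                handle(conn)
                os._exit(0)
            conn.close()                # parent: immediately go back to accepting

if __name__ == "__main__":
    serve()
[/CODE]

With 100 clients connected you'd have 100 of those child processes alive at once, and an 8-core box really can run 8 of them in the same instant.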
 
Where's Multimedia? This is exciting!

Wow...a user upgradable Mac. Good stuff indeed.

I am anxiously awaiting better utilization of all the cores, but the ability to multitask without hiccups is still great for now!

--HG
 
This is pretty neat news.

Means people like me can buy a Mac Pro tower with the 2.0 GHz cores and a good video card.

Then upgrade later on when I have more money. That, and it will be powerful as hell.
Super nice!
 
brianus said:
What about Tigerton (2007)? Isn't that a "true" quad?

Intel has two lines of Xeon processors:

* The 5000 series is DP (dual processor, like Woodcrest, Clovertown)
* The 7000 series is MP (multi-processor, e.g. 4+ processors)

Tigerton is supposed to be an MP version of Clovertown. Meaning, you can have as many chips as the motherboard supports, and just like Clovertown it's an MCM (two processors in one package). 7000s are also about 5-10x the price of 5000s.

So unless the specs for Tigerton change severely, there's no point even considering it for a Mac Pro (a high-end Xserve is plausible).
 
RedTomato said:
Arrays of cheap RAM on a PCIe card?

The RAM companies don't seem interested in making wodges of slow cheap hi-cap ram, only in bumping up the speed and upping the capacity. For the last 10 years, a stick of decent RAM has always been about £100/ $100 no matter what the capacity / flavour of the moment is.

Even slow RAM is still orders of magnitude faster than a HD, hence my point. There are various historical and technical factors as to why we have the current situation.

I've also looked at RAID implementations (I run a RAID5) but each RAID level has its own problems.

I've recently seen that single-user RAID3 might be one way forward for the desktop, but don't really know enough about it yet.

The reason for the RAM improvements in speed and size is that RAM (not the CPU) is the main bottleneck in performance. A CPU can only execute instructions as fast as they can be pulled out of RAM. Now you go and put multiple cores in the box, and the demand on RAM doubles.
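If you want to see that for yourself, here's a toy benchmark I'd sketch in Python (nothing official, and the numbers will vary wildly by machine): run a memory-bound task and a CPU-bound task on one worker and then on several. The buffer-copying task tends to stop scaling once the cores saturate the shared memory bus, while the arithmetic loop keeps scaling with the core count.

[CODE]
# Toy benchmark: memory-bound vs CPU-bound scaling across processes.
# Treat the numbers as illustrative only.
import multiprocessing as mp
import time

BUF = bytes(200 * 1024 * 1024)      # ~200 MB read-only buffer

def memory_bound(_):
    # Mostly waiting on RAM: copy the big buffer once.
    return len(bytearray(BUF))

def cpu_bound(_):
    # Mostly arithmetic in registers/cache, little memory traffic.
    total = 0
    for i in range(5_000_000):
        total += i * i
    return total

def run(task, workers):
    with mp.Pool(workers) as pool:           # pool startup kept out of the timing
        start = time.time()
        pool.map(task, range(workers))       # one unit of work per worker
        return time.time() - start

if __name__ == "__main__":
    for task in (memory_bound, cpu_bound):
        print(f"{task.__name__}: 1 worker {run(task, 1):.2f}s, "
              f"4 workers {run(task, 4):.2f}s")
[/CODE]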

As for RAID, I think the way forward is Sun's "ZFS" file system. There was talk of that moving into Mac OS X, and we know it is being ported to BSD Unix and Linux. Basically, ZFS makes the RAID layer just go away.

Read more here...
http://www.sun.com/2004-0914/feature/index.html

Sun has released this as open source, so it will get ported around to other OSes. I hear Sun's DTrace is already in Leopard.
 
Would it be smart to wait for these 8-core Mac Pros, or are they still a long way away?
 
ChrisA said:
Logically, the next question is if ZFS' 128 bits is enough. According to Bonwick, it has to be. "Populating 128-bit file systems would exceed the quantum limits of earth-based storage. You couldn't fill a 128-bit storage pool without boiling the oceans."
Wow. Boiling the oceans. There's a thought that never crossed my mind ;)
 
Quoting myself, bad boy,

RedTomato said:
Arrays of cheap RAM on a PCIe card?

http://www.superssd.com/products/tera-ramsan/indexb.htm

That's one answer. 1 TB of DDR on a (rather big) card. Takes 2500 watts to power, but gives you 32GB/sec continuous bandwidth.

Would that be enough to feed an 8-core Mac Pro? (4GB/sec per core, running through the entire 1TB in 32 seconds.... hmmm)

Wonder when products like that will filter down?

There's a rather sad Gigabyte RAM-disk card at

http://www.gigabyte.com.tw/Products...ew.aspx?ProductID=2180&ProductName=GC-RAMDISK

Costs only £100 but has a max capacity of 4GB. You'd be better off spending the money on more system RAM.
 
meaning that unless you have a way of really stressing 8 cores, you may be better off with 4 faster cores in your Mac Pro

Drool - I'll take 8 cores for my 3D rendering :D

I think I'll be selling my quad G5 next year for an 8-core Mac Pro.

D
 
FleurDuMal said:
A bit pointless given that no software utilises the extra cores yet. But nice to know, I guess.

I'm still getting used to having two cores in my laptop!

whooleytoo said:
What I couldn't understand - and I couldn't see it explained in the article - is why the dual-core Mac Pro (i.e. the current Mac Pro with 2 cores disabled) is faster in so many tests than the 4-core Mac Pro.

I think part of the reason so many people seem to be hung up on the "software doesn't utilize multiple cores" mantra is because benchmarks tend to test only one software component at a time. If a given app isn't multithreaded, then it doesn't benefit from multiple cores in these tests. But that doesn't mean that multiple cores don't affect the overall system speed.

What we need is some kind of a super benchmark: how fast is my computer while I'm watching a QuickTime stream of Steve demoing the latest insanely great stuff, while ripping my CD collection to iTunes, while surfing complex CNET.com pages (w/ animation), and compiling the latest version of my Java app, every once in a while flipping over to Dashboard (Dashboard seems to take up a lot of system resources every time I invoke it, not just on startup).

At this point I would rather push towards more cores than more raw speed in a single core, since I don't tend to wait on any single process. If something is taking a long time, like loading a page or compiling code, I switch to something else and come back later. I would much rather have the whole system retain its responsive feel than have one app finish its task a few seconds quicker.
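In that spirit, here's a homemade "mixed workload" sketch (Python, with invented stand-in tasks, not a real benchmark suite): run several dissimilar jobs at the same time and compare the wall-clock time against running them back to back. On a machine with enough cores, the concurrent time should look more like the slowest single job than like the sum of all of them.

[CODE]
# Homemade mixed-workload sketch: dissimilar jobs run serially vs all at once.
import concurrent.futures as cf
import hashlib
import time
import zlib

DATA = b"x" * (20 * 1024 * 1024)     # 20 MB of dummy data

def rip_cd():        # stand-in for an encode job: repeated compression
    return sum(len(zlib.compress(DATA, 6)) for _ in range(5))

def compile_code():  # stand-in for a compiler: lots of hashing
    digest = b""
    for _ in range(200):
        digest = hashlib.sha256(DATA + digest).digest()
    return digest

def render_page():   # stand-in for browser/animation work: pure arithmetic
    return sum(i * i for i in range(10_000_000))

TASKS = [rip_cd, compile_code, render_page]

def run_task(task):
    return task()

if __name__ == "__main__":
    start = time.time()
    for task in TASKS:                          # one after another
        task()
    serial = time.time() - start

    start = time.time()
    with cf.ProcessPoolExecutor() as pool:      # all at once, one process each
        list(pool.map(run_task, TASKS))
    parallel = time.time() - start
    print(f"serial {serial:.1f}s vs concurrent {parallel:.1f}s")
[/CODE]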
 
ChrisA said:
Sun has released this as Open Source. so it will get ported around to other OSes. I hear Sun's Dtrace is already in Leopard

Great. Um... What's their patent licensing scheme on this? (Since they proudly announce they've patented parts of it...)
 
FleurDuMal said:
A bit pointless given that no software utilises the extra cores yet. But nice to know, I guess.

I'm still getting used to having two cores in my laptop!


Not pointless at all if a person uses a lot of applications. You can justify all 8 cores right now. For sure. My quad core shines in multitasking.
 
zero2dash said:
Sheesh...just when I'm already high up enough on Apple for innovating, they throw even more leaps and bounds in there to put themselves even further ahead. I can't wait 'til my broke @$$ can finally get the money to buy a Mac and chuck all my Windows machines out the door.

How is this Apple "innovating"? AnandTech just put a pre-release quad-core Intel processor into an Apple computer. Apple itself had nothing to do with it. They could have used a quad-core Dell machine just as well.
 
mmmcheese said:
Do you mean like how BeOS did things?

Yeah BeOS had this great feature called magic pixel dust. :rolleyes:

All that BeOS had was a separate thread per window at the UI level. This does nothing for parallelizing compute tasks. Those extra threads that BeOS had spent most of their time doing absolutely nothing.

What Mac OS X has now is several operating system services that will automatically scale up to use as many cores as possible (while still making sense). Many of the "Core" frameworks do this without any work by application authors, other than those authors deciding to use those services instead of rolling their own.

For example: ColorSync color correction, audio conversion, audio mixing, etc.

...and yes, Mac OS X 10.5 is expanding the set of OS services that will do the right thing (TM), as well as making it easier for developers to transparently and directly utilize the cores available in a system.
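To make that "use the service instead of rolling your own" idea concrete, here's an analogy-only sketch in Python (the real Core frameworks are C/Objective-C, and I'm not claiming they look anything like this internally): the application hands per-item work to a library-provided pool, and the pool decides how many cores to fan it out across.

[CODE]
# Analogy only: a library-provided parallel map plays the role of an OS
# service -- the app supplies the per-item work and never touches threads.
import multiprocessing as mp

def color_correct(pixel):
    """Toy stand-in for per-item work (say, one pixel or tile)."""
    r, g, b = pixel
    return (min(255, int(r * 1.1)), g, min(255, int(b * 0.9)))

def correct_image(pixels):
    # Pool() sizes itself to the machine (os.cpu_count() by default),
    # so the exact same code uses 2, 4, or 8 cores without changes.
    with mp.Pool() as pool:
        return pool.map(color_correct, pixels, chunksize=4096)

if __name__ == "__main__":
    image = [(120, 80, 200)] * 1_000_000
    print(correct_image(image)[0])
[/CODE]

The app never asks "how many cores do I have?"; the service does, which is exactly why the same code speeds up when you drop a Clovertown into the box.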
 
Oh Yeah! Core 2 Quadro - C2Q - Looking Great! October In Stores. Wow!!

I wouldn't want to say I told you so but... :eek: :p :D
Half Glass said:
Where's Multimedia? This is exciting!
Wow...a user upgradable Mac. Good stuff indeed.

I am anxiously awaiting better utilization of all the cores, but the ability to multitask without hiccups is still great for now!
Must Crush Video...Must Crush Video...Must Crush Video...Must Crush Video...Must Crush Video...Must Crush Video...Must Crush Video...

I'm still gonna wait for the Clovertown option to appear on the BTO page, then price retail Clovertowns at Fry's before I decide whether I'll let Apple do my upgrade or do it myself, according to which way costs less. But I really don't want to kill my warranty on day one. So it'll be academic, since they are going retail in a month, prolly before Apple adds the Clovertown option to the BTO page, although they were pretty Johnny-on-the-spot with the C2D iMacs.
 
Evangelion said:
Most people run more than one app at once.

Yes, that's true.

It's also true that most of the time, most people aren't even maxing out ONE core, never mind eight.

And when they do, their program won't get any faster unless it's multithreaded and able to run on multiple cores at once.
 
iMac Upgradable

So the question I have is: can the latest iMac be CPU-upgraded like the Mac Pro?
 