View Full Version : G5s at SIGGRAPH


actionslacks
Jul 31, 2003, 01:14 AM
Just got back from SIGGRAPH and I was surprised that Apple had very little presence, especially with the G5. The Apple booth was set up only for software presentations - Shake, DVD Studio Pro, Final Cut, etc.

I saw 2 G5s on the floor. The first was in the Apple booth but was not featured; the other was at the Pixar booth and was being used for demonstrations. The Pixar demonstrations were impressive. The rep had the CPU monitor up the whole time he was rendering and it was never maxed out. Also, the machine was VERY quiet. Until I saw the monitor, I thought for sure that it was not turned on.

I would have thought that this would have been a good place to show off the G5s against the Wintel boxes, especially since there were several vendors showing off Opteron systems curiously running XP.

I can only think that there are very few of these machines assembled if they couldn't even get a few out for people to get their hands on.

acj
Jul 31, 2003, 02:01 AM
Originally posted by actionslacks

The rep had the CPU monitor up the whole time he was rendering and it was never maxed out.

This is not good! It's like needing to go faster but your shoe's stuck under the gas pedal! They've got some optimizations to do.

So if it is rendering fast, this could be great, because it could get faster.

cb911
Jul 31, 2003, 02:39 AM
I guess SIGGRAPH isn't really the place to show off new hardware. But I guess that they could have if they wanted...

MisterMe
Jul 31, 2003, 08:01 AM
Originally posted by acj
This is not good! It's like needing to go faster but your shoe's stuck under the gas pedal! They've got some optimizations to do.

So if it is rendering fast, this could be great, because it could get faster.

Huh? The G5 was doing everything asked of it without breathing hard, and you say that is a bad thing? Just exactly what would be a good thing in your book?

Mr. Anderson
Jul 31, 2003, 08:15 AM
Originally posted by MisterMe
Huh? The G5 was doing everything asked of it without breathing hard, and you say that is a bad thing? Just exactly what would be a good thing in your book?

I'm thinking that he's talking about using everything the machine has when rendering, maxing it out. If they were doing a render and it wasn't maxing the CPU, then something is a little wrong there. I do 3D animation and I have to wait hours and sometimes days to get an animation done. If you can render frames faster by using all of the CPU, great!

I'm thinking that they weren't really rendering, or had something rigged for the demo so that the rendering didn't take up all the CPU time and other apps could run smoothly.

D

actionslacks
Jul 31, 2003, 11:53 AM
Originally posted by acj
This is not good! It's like needing to go faster but your shoe's stuck under the gas pedal! They've got some optimizations to do.

So if it is rendering fast, this could be great, because it could get faster.

First, don't take this too seriously. It was just something that I noticed. And he WAS switching back and forth between programs. Beyond that I don't really know what to make of it.

What I think you should be more concerned with is the fact that Apple was not promoting their hardware for 3D animation at all. Pixar was using the G5 for a demo, but Apple's booth consisted of about 20 G4s and 1 G5, and they were only talking about their apps.

Mr. Anderson
Jul 31, 2003, 12:18 PM
Originally posted by actionslacks
What i think you should be more concerned with is the fact that Apple was not promoting their hardware for 3D animation at all. Pixar was using the G5 for a demo, but Apple's booth consisted of about 20 G4s and 1 G5 and they were only talking about their apps.

I'm sure they had a reason, but it doesn't make any sense to me. Maybe they're worried about the 65k units preordered and want to keep people from ordering any more until they can meet the demand....

I'll be more worried if this is the case in MWSF :eek:

D

actionslacks
Jul 31, 2003, 07:48 PM
www.architosh.com/news/2003-07/2003c1-0730-richardkerris.phtml

Apparently Macs caused a bigger stir than I witnessed. This being my first SIGGRAPH, I didn't have anything to compare it to. I am already looking forward to next year in LA when people will have had the G5s in their hands for a full year.

acj
Jul 31, 2003, 11:53 PM
Originally posted by MisterMe
Huh? The G5 was doing everything asked of it without breathing hard, and you say that is a bad thing? Just exactly what would be a good thing in your book?

Someone mentioned what I was worried about, but I just wanted to confirm. I've done a bunch of 3D rendering, and I wouldn't be too happy if the computer was only putting 50% into it. So what if it was responsive for other programs? I want my movie in one day, not two! Besides, OSX should be good enough to multitask even when something is taking 100% of the CPU.

So like I said before, this could be seen as a good thing because there is possibly a lot more optimization that will happen.

MisterMe
Aug 1, 2003, 07:45 AM
Originally posted by acj
.... Besides, OSX should be good enough to multitask even when something is taking 100% of the CPU.

So like I said before, this could be seen as a good thing because there is possibly a lot more optimization that will happen.

Young man, 100% means all, everything, there ain't no more. It is impossible to run a GUI-based OS with one application getting the entire CPU.
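For what it's worth, the two claims being argued here can actually be tested. A preemptive OS like OS X time-slices the CPU, so a process spinning at "100%" doesn't starve everything else; other processes still get scheduled and make progress. A minimal sketch of that idea in Python (the busy-loop child and the counting task are just illustrative stand-ins, not anything Apple or Pixar ran):

```python
# Sketch: even while one process pegs a core with pure computation,
# the OS scheduler still gives other processes CPU time.
import multiprocessing
import time

def busy_loop(seconds):
    # Spin in pure computation until the deadline - this is the
    # "application getting the entire CPU" from the argument above.
    deadline = time.time() + seconds
    x = 0
    while time.time() < deadline:
        x += 1

def main():
    child = multiprocessing.Process(target=busy_loop, args=(2.0,))
    child.start()
    # The parent still makes progress while the child spins:
    total = sum(range(1_000_000))
    child.join()
    print(total)  # the parent's work completed despite the busy child

if __name__ == "__main__":
    main()
```

On a multi-core (or even a preemptively scheduled single-core) machine the parent's sum finishes long before the child's two-second spin ends, which is the sense in which a GUI stays usable under a "100%" renderer.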

Mr. Anderson
Aug 1, 2003, 07:56 AM
Originally posted by MisterMe
Young man, 100% means all, everything, there ain't no more. It is impossible to run a GUI-based OS with one application getting the entire CPU.

Ok, I render my animations and do 'little' tasks with all that running - email, web, maybe another app if I really need to (Photoshop, etc.). And my dual 1.25 works great, not that much slower. But the CPUs are both pegged at 100%.

To render and not be using 100% seems wrong to me.

D

MisterMe
Aug 1, 2003, 03:23 PM
Originally posted by Mr. Anderson
Ok, I render my animations and do 'little' tasks with all that running. Email, web, maybe another app if I really need to (photoshop, etc.). And my dual 1.25 works great, not that much slower. But the CPUs are both pegged at 100%.

To render and not be using 100% seems wrong to me.

D

There you have a much less capable CPU pegged at 100% while multitasking. Even then, the 100% includes your rendering app and all the other processes being run at the same time. Rendering is not a religious issue; it is a computer task. Right and wrong do not enter into the equation. In the case of the G4, rendering took as much of the CPU as was available. In the case of the G5, rendering took as much of the CPU as was needed.

daveL
Aug 1, 2003, 03:41 PM
Originally posted by MisterMe
There you have a much less capable CPU pegged at 100% while multitasking. Even then, the 100% includes your rendering app and all the other processes being run at the same time. Rendering is not a religious issue, it is a computer task. Right and wrong do not enter into the equation. In the case of the G4, rendering took as much of the CPU as was available. In the case of the G5, rendering took as much of the CPU as was needed.
I disagree. Rendering is like folding. They will both use every CPU cycle available until they are finished. On a faster machine, they will simply finish earlier. The only exception I can see is if the rendering process also had to do a load of disk I/O or process data across a slow network. In short, if the rendering process is not I/O bound, it *will* be CPU bound until it completes.
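The CPU-bound vs. I/O-bound distinction here is measurable: for a CPU-bound job, the process's CPU time tracks wall-clock time (utilization near 1.0), while an I/O-bound job spends wall time waiting and accrues little CPU time. A hedged sketch in Python, using a squaring loop as a toy stand-in for a render pass and `sleep` as a stand-in for slow I/O:

```python
# Sketch: CPU-bound work keeps utilization near 1.0 until it finishes;
# I/O-bound work (simulated with sleep) burns wall time, not CPU time.
import time

def cpu_bound(n):
    # Pure computation - no waiting, so the scheduler gives it a full core.
    total = 0
    for i in range(n):
        total += i * i
    return total

def io_bound(seconds):
    # Waiting on "I/O": wall clock advances, process CPU time barely moves.
    time.sleep(seconds)

def utilization(fn, *args):
    wall0, cpu0 = time.perf_counter(), time.process_time()
    fn(*args)
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    return cpu / wall  # near 1.0 if CPU bound, near 0.0 if I/O bound

print(utilization(cpu_bound, 2_000_000))  # close to 1.0
print(utilization(io_bound, 0.5))         # close to 0.0
```

By this measure, a renderer that never maxes the CPU monitor is either I/O bound somewhere (disk, network, as daveL says) or artificially throttled, which is what the earlier posts were guessing about the demo.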