slughead said:
Well, look who's the scientist all of a sudden.

Hey, no need to be a dick about it. 24fps is a fact; they talked about it in both the biology & media classes I took a few years ago. Granted, a frame of a movie might be a different measurement, but 100 to 120 fps seems a bit of a stretch. We're talking consumer-level machines...
 
Trekkie said:
Hey, no need to be a dick about it. 24fps is a fact; they talked about it in both the biology & media classes I took a few years ago. Granted, a frame of a movie might be a different measurement, but 100 to 120 fps seems a bit of a stretch. We're talking consumer-level machines...

Wow, slow down a second here: suppose you're speaking 24 letters a minute. Is that enough for verbal communication? Definitely not. But we put punctuation between the words, as well as pauses, continuations, and so on... The point is, this 24 frames per second is the biological limit of perception neurologically.

When we watch motion, we capture the motion blur as well, and this hints at where the object is going. If you use a camera to capture a fast-moving object, you will notice this: the blur trail tells you where the object is headed next. TV shows and movies all have this. If you freeze a frame on your DVD or VCR during a lot of action, you'll see that the motion is not sharply defined, but captured over a range of exposure time to preserve perceived motion.

Video games, on the other hand, DO NOT capture frame-to-frame motion. If you do a frame capture, you'll get just the characters and their positions; you don't capture the first derivative of the motion, the velocity, because your capture time IS ZERO. Hence, 24 frames a second in a first-person shooter seems a lot choppier than 24 frames a second in your favourite TV show or movie, even though the update rate of movement is the same.

The key here is inter-frame interpolation, which exists in the form of motion blur in motion-captured sequences but not in video games; that blur is more than enough information to trick your brain into perceiving fluid motion. If the motion gap is large, for example in an FPS when someone gets blown away by a fast rocket, even 60fps seems jumpy, since the distance travelled by a fast object between consecutive frames is still high.
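To make the difference concrete, here's a toy Python sketch (the moving dot, its speed, and the sample counts are all made up for illustration, not taken from any real camera or renderer): a "filmed" frame integrates the object's position across the exposure window, while a "game" frame samples a single instant, so only the filmed frame carries the blur trail.

```python
# Toy illustration: a camera frame integrates motion over its exposure
# window (producing a blur trail), while a game frame is an instantaneous
# snapshot. All numbers here are invented for demonstration.

FPS = 24
EXPOSURE = 1.0 / FPS   # assume the shutter is open for the whole frame
SUBSAMPLES = 16        # sub-positions blended into one "filmed" frame

def position(t, velocity=100.0):
    """Position of a dot moving at a constant velocity (units/second)."""
    return velocity * t

def game_frame(n):
    # A game samples the world at one instant: capture time is zero,
    # so there is no velocity information in the frame itself.
    return [position(n / FPS)]

def filmed_frame(n):
    # A camera accumulates the dot's positions across the exposure:
    # the resulting smear tells your eye where the dot is headed.
    t0 = n / FPS
    return [position(t0 + i * EXPOSURE / SUBSAMPLES) for i in range(SUBSAMPLES)]

print("game frame 1:  ", game_frame(1))    # one sharp point
print("filmed frame 1:", filmed_frame(1))  # a trail of points (the blur)
```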
 
Trekkie said:
24fps is a fact; they talked about it in both the biology & media classes I took a few years ago.

This is like a classic gaming debate I wish I wouldn't enter into, but here I go:

The game isn't just displaying 24 frames per second, it's only calculating 24 frames per second. So if you are playing against someone who is running at 48 frames per second, what happens to their bullets? What happens to their movement? How do you reconcile those two things?

Speaking from experience, the game reconciles those things by having all of the 48fps shots land exactly between the eyes of the 24fps player and the 24fps player's shots hit the moon.
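In all seriousness, the usual answer (a sketch of the general technique, not any particular game's actual code) is to decouple the simulation from the rendering: everyone's physics runs at the same fixed tick rate, and the render framerate only decides how many snapshots of that simulation you get to see.

```python
# Sketch of a fixed-timestep game loop: the simulation ticks at a fixed
# rate regardless of render framerate, so a 24fps player and a 48fps
# player step through identical physics. Numbers are illustrative only.

TICK_RATE = 60           # simulation updates per second, same for everyone
DT = 1.0 / TICK_RATE

def ticks_simulated(render_fps, seconds=1.0):
    accumulator = 0.0
    ticks = 0
    for _ in range(int(render_fps * seconds)):   # one pass per rendered frame
        accumulator += 1.0 / render_fps          # wall time since last frame
        while accumulator >= DT - 1e-9:          # consume whole ticks
            ticks += 1                           # bullets/movement advance here
            accumulator -= DT
    return ticks

print(ticks_simulated(24))   # 60 ticks: same simulation...
print(ticks_simulated(48))   # 60 ticks: ...regardless of framerate
```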


And 24fps isn't some biological limit. It's arbitrary, just like a 55 mph speed limit; it just got agreed upon at some point.

15 is about the minimum for "convincing" motion, and it goes up from there. 30 looks pretty smooth, 60 looks glassy, and 120 looks surreal.

Edit houses spend countless hours of computer time trying to make 30-ish frames per second video look more like the choppy 24fps of film. Why would they bother if no one can see the difference?

But more to the point: so it can't play Doom 3. It also can't run any of the best viruses out there, it never forces you to look at any hexadecimal, and there is no Add/Remove Hardware wizard.
 
Maxx Power said:
You are forgetting to mention that those performance numbers are from PC games, as well as PC versions of the video cards.

oingoboingo said:
Here are a few benchmarks of modern games which are available on the Mac platform (although the tests were performed on PCs, so the actual FPS scores may bear no resemblance to those achievable on the iMac G5. The relative performance of the various GPUs is the important thing here).

I did actually mention it in the first paragraph of my original post :) for exactly the reasons you mention. There are way too many differences between platforms and game code for any of the FPS figures to be taken literally. The figures were to illustrate the relative performances of various GPUs.

Maxx Power said:
Secondly, if you've noticed the trend in performance graphics cards in the last little while (and here I give you an example), the 9600 Pro that is bundled with the G5 PowerMacs only runs at 364MHz, compared to the stock standard of 400MHz on the PC.

Yes, I did notice the downclocking of the Radeon 9600 Pros. If you check some of my posts from earlier this week, I've been messing around with the ATIcceleratorII overclocking utility, and sure enough, the default clock speeds of my G5's Radeon 9600 Pro are well below standard. As you say, it would not be at all surprising to see a similar downclocking used in the iMac G5, but I hope against hope that this is not the case.

Maxx Power said:
This means one thing --> the "Steam Powered" (creatively said by some clever bloke here in this thread; credits go to him/her), coal-burning Nvidiot 5200 Ultra Slow Edition is not acceptable, especially considering the price of the computers it is in. Further exacerbating this is the fact that these cards are non-upgradable, perhaps forever. Investing in one of these machines is like buying several thousand dollars of lottery tickets: you might get what you want, but be prepared to be severely disappointed.

Yes, the smart buyer will definitely be waiting a few weeks to see how the benchmarks turn out. At this stage it is unknown what effect a) reduced G5 FSB speeds, b) single-channel DDR RAM, and c) the nVidia FX 5200 Ultra will have on the overall system performance of the iMac G5. My opinion is that the FSB and RAM changes will not have all that great an impact, but the FX 5200 is going to drag things down.

I've mentioned this in some other posts, but I'd love for the iMac G5 to be a good overall performer (including 3D graphics and games), so then I could replace my bulky 1.6GHz G5 PowerMac / 19" CRT setup with a svelte iMac G5. Unfortunately, I fear that the new iMac will be a little underwhelming in some key areas (most notably, 3D graphics and gaming).
 
oingoboingo said:
I did actually mention it in the first paragraph of my original post :) for exactly the reasons you mention. [...]

Oh yes, about that first quote you had there: I had forgotten about it by the time I posted my reply, sorry.
 
I think the new iMac follows Apple's li'l pattern they have: colorful and curvy, then white and curvy, and now white and boxy. I like it a lot, but it took me like 20 minutes to realize it was an all-in-one :p Do you guys think there'll eventually be a 23" one with a better graphics card? It looks perfect to mount on a wall in a studio with a wireless mouse + keyboard.
 
For a good joke, look at the first responses to the introduction of the iPod on this forum. For that matter, look at the responses to the intro of the G4 iMac. So many said that it was the "end of the world, it would never sell, it's underpowered"... HA HA HA!


Apple will sell a ton of the new G5 iMacs; too many people on this board need a reality check!
 
nagromme said:
Too cool.

Now let some people whine--EVERY Apple product gets that phase... followed by great reviews :D (Remember the iPod whining?)
Wow!

Unfortunately that was before my time here on MacRumors, but that old thread gave me a LOT of laughs!

So much talk of "Where's the Innovation?! This thing won't sell!! It's an overpriced external hard drive!!"

Ha ha ha ha ha!

We are really messed up in the head.

Personally I think the new iMac is an INCREDIBLE machine. I must admit, I never really liked the iMac G4 design...but this one...:D:D I really like it! The iMac G5 will definitely be finding a place in my house someday. :D

Now, where were we? ....Oh, yeah, back to the WHINING!! ;)
 
Quickie

In the midst of your video struggle, does anyone know if the new iMac comes with a keyboard? According to the web store, keyboards and mice are separate orders. Anyone?
 
I'm sure everyone will fall in love with it when they see it in person... if not, don't buy it.

I, personally, will be saving all my pennies and dimes to buy this sucker :p
 
When the original iMac G4 debuted, I seem to recall Jonathan Ive talking about the different design concepts that they considered. The new iMac G5 was one of those original designs; he also mentioned that they chose the iMac G4 design because mounting an optical drive vertically slowed down performance. Evidently, that is still true. That is why the eMac's SuperDrive is 4x faster than the new iMac G5's.

The fact is, the iMac *may* be targeted at the average consumer, but a lot of folks who fall between the consumer and professional ends of the product spectrum choose the iMac simply because they don't have the money, or can't justify spending a huge amount, for a PowerMac G5 and an Apple Cinema Display. Apple tends not to address this type of consumer.

And just because dual-layer drives aren't the norm doesn't mean Apple shouldn't or won't put them in their products. Apple has not been the norm since Steve Jobs returned. Don't you remember a time when SuperDrives (aka DVD-R drives) weren't readily available in other consumer offerings? That didn't stop Apple from using them!

Cless said:
... Obviously you're quite removed from the reality of consumer computers. Dual layer is strictly the realm of enthusiasts and professionals right now. Even DVD burning in general isn't a very widespread activity among the bulk of computer users.

Are we? Since when do consumers need a 64-bit processor? Where is the 64-bit software/operating system? If Apple used your argument as a basis for future products, we would be stuck with the G4 for another 4 years. The fact is, you might not need it, but the computer industry is a highly competitive business, and you can gain an edge over your competitor by using ploys such as 64-bit processors, aluminum or plastic chassis, high-end video cards, and dual-layer drives.
 
I like it...

I was worried how it would look with just a monitor, but the negative space underneath makes it look great. Is anyone else thinking about how a chameleon effect could play out on that lower-lip surface (more so than the sleep light)? I wonder if they have any LEDs under there we don't know about. I like the way it looks like an old Macintosh with the backside cut off! I can see people having one around just for iChat. I wish it ran on a battery and had a Cintiq-like screen, though... I guess that will have to wait.
 
miketcool said:
In the midst of your video struggle, does anyone know if the new iMac comes with a keyboard? According to the web store, keyboards and mice are separate orders. Anyone?
Yeah, it comes with a keyboard and mouse. If you go to the purchase screen for any of the iMacs and scroll down, you'll see that you get the keyboard/mouse.

sushi said:
ChipNoVaMac said:
The lack of FW800 is one big downfall IMO

Okay, how many consumers have external peripherals that use FW800?

FW400 yes, but 800 not that many.

Sushi

I agree with Sushi; I haven't seen very many FW800 peripherals out there. I'd love to see more, and to see FW800 implemented in all Apple products, but oh well...
 
Good good!

When I heard the rumored specs of the iMac G5, I thought, well, maybe I should just get an eMac... then I saw the iMac G5, saw the specs weren't that bad, and thought it kicked ass. This is a "consumer" PC, but by no means will it be slow. This is more than enough to handle any program designed for the Mac, and just a year ago, to get something similar to this you had to pay what? $2000-3000? It is definitely a great value, and if Apple's plan is to get the 64-bit generation going into high gear, this will accomplish it.

I don't think the lack of FireWire 800 matters much, as FireWire 800 is not used widely enough; even if the iPod used 800, if you don't have one second to spare to upload a song to your iPod over 400, you need to cut down on your daily activities. Being a college student, this will be great for me: I will sell my monitor and G4, get this more powerful package, and not pay much.

The biggest gripes I could think of are that it loads on the side (why not the top?), and that you sort of take the swivel out of it once you hook all this crap to the back; you're liable to accidentally disconnect a cord if you swivel. Other than that, I'm not terribly worried about screen quality. I know this screen will be fantastic for 2 years and sort of crap out after the 4th year, but it would still be usable.

I am just wondering why it doesn't come with Tiger, and what the damn holdup with Tiger is.
 
Yes, and your competitor has 64-bit CPUs *and* 64-bit operating systems

joshuawaire said:
The fact is, you might not need it, but the computer industry is a highly competitive business, and you can gain an edge over your competitor by using ploys such as 64-bit processors, aluminum or plastic chassis, high-end video cards, and dual-layer drives.


Hmmm. The dark side has 64-bit processors (amd64/em64t), 64-bit operating systems (Linux / XP 64-bit preview), CPUs at up to 3.6 GHz, all kinds of chassis, expandability with PCI/PCI-X/PCI Express/AGP, just about every video card made (easily upgradeable), and every optical drive that's out there.

The new iMac has a low end 64-bit CPU with a completely 32-bit OS, only 2 DIMM memory slots, one highly constrained chassis without any slots, a definitely low-end non-upgradeable video card, and a fairly low-end non-upgradeable laptop optical drive.

What *is* your argument? Are you saying that the iMac isn't competitive?
 
XboxEvolved said:
The biggest gripes I could think of are that it loads on the side (why not the top?), and that you sort of take the swivel out of it once you hook all this crap to the back; you're liable to accidentally disconnect a cord if you swivel. Other than that, I'm not terribly worried about screen quality. I know this screen will be fantastic for 2 years and sort of crap out after the 4th year, but it would still be usable.

The right side does feel kinda natural for a drive. Besides, on top? Wanna get toaster jokes started? (Roxio legal steps in here) The space heater jokes are already in order....
 
FPS

OK, I'll butt in :p When talking FPS, you're actually talking about at least SIX separate issues. Getting them confused is a common pitfall.

1. How many FPS the eye can "notice." I hear lots of numbers thrown around... 24, 30, 60... but in the end, I can detect a slight flicker in the refresh rate of my eMac at 80 Hz vs. 89... so the eye can certainly detect over 80 FPS at least in some subtle way. (Otherwise there wouldn't BE higher refresh rates.)

2. How smooth the resulting motion seems. You can have a light flickering at 200 pulses per second and it will look steady. Now fly it through the air fast and you'll see a trail of dots: it's not a smooth line. You CAN see that. Motion blur can help "connect the dots"... but simply hiding the frames that way (like in most movies) isn't exactly the same as truly having more frames precisely rendered (or having true continuous movement like real life). Sometimes a blur-trail effect looks more natural than sharp rendering. Sometimes not.

3. How many FPS the display can actually render. A CRT has a given refresh rate (typically 72+). A TV/console game (NTSC) has 60 fields-per-second. CRTs and LCDs alike take some time for an image to actually fade into the next frame. Thus, a GPU may be rendering more FPS than your display can actually show--so you don't even get the CHANCE to perceive them.

4. Min and max framerates. If your average is 60 fps, that may look smooth--but there will be moments in many games where you drop below or rise above that average. If you frequently drop down to 12, then that 60 is not so good. An average framerate of 120 might keep your "low points" from being noticeable. In other words, extra FPS is a nice cushion to have (see the quick sketch after this list).

5. How many times per second the in-game actions like collisions are calculated--and with what lag/delay. That need not be tied to what's rendered visibly at all! In online play, the network can be a far bigger factor.

6. The subjective effect of all that on what you are doing (gaming or whatever) at a given moment. That's going to vary based on a million things, from lighting in the room to personal preferences to the style of play your fellow combatants have. There's no formula to tell you that 120 fps is always good enough, or that 30 fps is never good enough. It's just not that simple. You might notice a given framerate in one situation and not in another. And if you notice... you might not care! In the end, you may care more about having fun than poring over the math :D I know my mind is on the fun when gaming (UT2004) on my G4 laptop... which is totally crushed by the new iMac :)
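And the quick sketch for point 4, a back-of-the-envelope calculation in Python (the frame times are invented for the example, not measured from anything): two setups can share the same average while feeling completely different, because the average hides the hitches.

```python
# Back-of-the-envelope for point 4: the same average fps can hide
# very different "low points". Frame times below are invented.

frame_times_ms = [16.7] * 55 + [83.3] * 5   # 55 smooth frames, 5 hitches

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")   # ~45 fps, sounds fine on paper
print(f"worst:   {worst_fps:.0f} fps") # ~12 fps during the hitches
```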
 
HiRez said:
Unfortunately, no. They for some stupid reason removed this capability a few years ago, and it's really annoying because I have a PowerBook that I usually run with an external monitor (lid closed). To restart I have to pull it out of its little cubbyhole, dragging the cables with it, open it up, press the power button, and then quickly close the lid and put it back. PITA.

Never run a PowerBook with the lid closed. Apple explicitly warns against this due to the heat that radiates from the keyboard.
 
AidenShaw said:
Hmmm. The dark side has 64-bit processors (amd64/em64t), 64-bit operating systems (Linux / XP 64-bit preview), CPUs at up to 3.6 GHz, all kinds of chassis, expandability with PCI/PCI-X/PCI Express/AGP, just about every video card made (easily upgradeable), and every optical drive that's out there.

The new iMac has a low end 64-bit CPU with a completely 32-bit OS, only 2 DIMM memory slots, one highly constrained chassis without any slots, a definitely low-end non-upgradeable video card, and a fairly low-end non-upgradeable laptop optical drive.

What *is* your argument? Are you saying that the iMac isn't competitive?

I don't see your argument either. There are plenty of MP3 players out there that many consider a better choice than the iPod; the iPod may be a good seller in its own right, but as in the computer market, the majority of options are non-Apple and could be considered better. So now, what was -your- argument? My argument is that power and style in compact packages sell, and while there are similar products to the iMac in both design and CPU, none of them can match how form-fitted everything is on the Mac platform.
 
Trekkie said:
Hey, no need to be a dick about it. 24fps is a fact; they talked about it in both the biology & media classes I took a few years ago. Granted, a frame of a movie might be a different measurement, but 100 to 120 fps seems a bit of a stretch. We're talking consumer-level machines...

I'm not going to go into the "I can get an XYZ PC for Q dollars that would run games P hundred times faster" debate, but I will say that I did notice the frame rate wax and wane from 120 to 100 and back again when playing Quake III on my old video card. I then capped the FPS at 80 to keep it constant.

You don't _need_ to go past 60FPS for any game; however, when games lag down to 5-10 FPS in certain situations, there's no mistaking the need for something better.
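For anyone curious how a cap like that works, here's a minimal sleep-based limiter in Python (a generic sketch of the idea only; it is not necessarily how Quake III implements its own framerate cap internally):

```python
import time

# Minimal sleep-based frame cap: if a frame finishes early, idle away
# the rest of its time budget so the loop never exceeds CAP fps.

CAP = 80
FRAME_BUDGET = 1.0 / CAP   # 12.5 ms per frame at an 80fps cap

def render_frame():
    pass   # stand-in for the real per-frame work

for _ in range(CAP * 5):   # run five capped seconds, then exit
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # sleep off the leftover budget
```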

The thing about it is, at some point EVERYONE has to draw the line. What if the card were so crappy that people were saying "Come on, Apple, at least let me run Quake 2!"? Some of us choose to hold Apple up to the PC standard: just give me enough to do everything OK, instead of running Photoshop well and games crappily. Instead, Apple has given this new iMac a *modern* processor and an "its resolutions go high but nothing else" video card.

Again, let me emphasize that the 5200 is not a mid-range video card; it is an entry-level, ultra-cheapo, browse-the-internet-and-write-letters-to-the-grandchildren card. It will not run Doom 3 here or there; it will not run Doom 3 ANYWHERE.

This computer is Apple's ONLY PC. The PowerMac is a workstation; this is a PC; the difference is speed and cost. This is the sort of thing that makes game companies ignore Apple: why the hell should Carmack make Doom 3 for the Mac? Why, I ask you? The ONLY people who can run it are people with G5s. Mind you, I think they will release it, but what about Far Cry, Half-Life 2, and so many others? Apple is killing the gaming market for the Mac, not Windows or the game companies. It's the people who MADE the iMac who turned it into a gaming crapbox.
 
just a thought said:
I'd like to weigh in as a "professional" user. I run a small publishing house. Our two "workhorse" design computers are a G4 iMac and a Cube. I just bought a bottom-of-the-line G5 iMac to replace the Cube (moving to one of our editors who will replace her clamshell iBook with it).

The fact of the matter is, even though we do graphics-intensive stuff, I've NEVER felt that any of the video cards (all stock) we've worked with have lacked in any way, shape or form. And we are doing NOTHING but graphics work. We don't play games. We run Photoshop and Quark and Illustrator day in and day out and the only reason we're replacing the Cube with the G5 is because the PROCESSOR is much faster.

The fact of the matter is that this is about as close to a perfect machine as I could imagine: it's fast, it's got a good screen, and it takes up almost no space on the workbench. I ordered it as soon as I got in to work this morning.

The reality is that most professionals don't seek out the top-of-the-line stuff--they don't need the fastest graphics cards; they don't even need the fastest processors. Every design professional I've talked to--and I know a lot of heavy hitters--makes a decision of cost vs. performance and goes from there. A friend of mine who has won countless publishing design awards heads up a shop that up until this year was still running entirely on blue-and-white G3s.

Based on that, I can assure you that, among professional graphic designers, this thing is going to be HUGE.

Just wanted to say thanks for this. Maybe some of the people who are focusing solely on the gaming aspect will realize there's more to judging the quality of a graphics card than how fast and how well it can play games. Graphics cards are key for individuals like yourself, and the stock ones are obviously more than usable for your needs, so let's not get too caught up in this whole thing, shall we? Ah, who am I kidding, it's already too late...

Thank you for your perspective and input, it is needed in threads like this one.
 