PS8 Will not run on OS 9

PS8 Won't Work on OS 9

Could this mean something more than a simple Carbon to Cocoa conversion? I think it might. What other reason would they have for dropping OS 9?

*drum roll*

Perhaps it's because they will be incorporating 970 enhanced code? Is there any reason why supporting the 970 would exclude OS 9? If not, it may just be a reason to stop QA'ing OS 9 support? Who knows.
 
Re: Re: Re: AltiVec

Originally posted by MisterMe
Unlike you, I actually know string theorists and they like Macs. Real scientists like to get their work done, not futz around with computers. As for the term velocity, there is really only one relevant dimension in computation, the line between the beginning and end of the job. In that respect, velocity would denote the rate at which you are moving forward or backward. Since your computer task can't move backward even on a Windows machine, speed is the more appropriate term.

Man, not to be a hard-a$$ or anything, but:
ve·loc·i·ty
n. pl. ve·loc·i·ties
Rapidity or speed of motion; swiftness.

Physics. A vector quantity whose magnitude is a body's speed and whose direction is the body's direction of motion.

The rate of speed of action or occurrence.

The rate at which money changes hands in an economy.

Velocity only implies that there could be more than one direction of motion, it doesn't (as far as I can see) require there to be. And, the guy wasn't just speaking about computation, if you looked at the original post; he also said: m/s speed, m/s/s acceleration. So, in that case, I said he should...

Oh, never %*#(& mind. ;)

And, it does look like the term "frequency acceleration" can be used: http://216.239.41.100/search?q=cach...quency+definition+acceleration&hl=en&ie=UTF-8. Though, I have to say that I, myself, prefer velocyples, now that someone said it! :D

And (to whomever it was, I can't remember), nice try to deflect the angst over the 970 by bringing up John Manzione... :p

[EDIT: Stupid mistakes]
 
Re: Re: Re: Re: Woo Hoo - they get faster and faster

Originally posted by Bengt77
Wrong! The G4 still has some powers not unleashed yet. Just throw in a true DDR FSB (say 400MHz) and use corresponding (duh) DDR RAM and you'll see the (dual) G4's performance go up to quite a competitive level with today's Pentium 4s. At least, that's what I think. The G4 has never been utilised to its full potential, which is a terrible shame, really.
:(

The G4s that exist are being used even beyond their full potential. They don't have a true DDR FSB. You are describing the 7457-RM, which doesn't exist and probably never will.
 
Re: Re: Woo Hoo - they get faster and faster

Originally posted by Masker
And, if you're going to get all physics on us, use the correct term: velocity not speed. [Edit: Hmmmm. For cps or Hz, velocity isn't exactly right, either...]

Frequency is the word you are looking for. "Velocity" and "Speed" both refer to distance per second. "Frequency" refers to cycles per second.

:)

Not sure if a steadily increasing frequency would be referred to as acceleration, though ...

Gosh, all this talk reminds me of the "Windows Accelerating" video cards of the early '90s ...
 
Re: Could IBM be using the 970 elsewhere?

Originally posted by P-Worm
Do you think that IBM will be using the 970 for things other than Macintoshes? Maybe their own desktops? Seems a little unlikely, but we don't want to be swept with the current...

P-Worm

IBM has already announced a line of blade servers based on the 970.

So, Apple exclusivity was never assumed nor accurate.
 
Originally posted by hayesk
If Intel can say the P4 makes your colours more vibrant and your internet surfing more rewarding, then why not fight fire with fire?


True, true ... but I'd like to think despite all the evidence to the contrary that we're better than that ...
 
Re: Re: Re: Re: PPC 970 for Apple... Confirmed?

Originally posted by Cubeboy
Another thing to note is that a program that fits into a 32 bit model and doesn't take advantage of a specific 64 bit capability (scalability, memory, high precision arithmetic) will probably perform better compiled as a 32-bit binary than a 64 bit one due to more cache misses associated with the size of the 64 bit binary.

Um, no. If your application doesn't use 64-bit ints, compiling with the "use64" (or whatever) compiler switch will change nothing besides the pointer sizes (and I'd suspect that the compiler will have a separate switch for allowing >4GB of addressable memory, without which the pointer size would stay at 32 bits, since 90% of all apps won't need that much addressable memory). The instructions are still, as always with PPC, 32 bits wide. The only thing compiling specifically for the 970 would do is allow you to use native 64-bit int registers (and memory addresses).

But, yes, blindly changing 32-bit ints to long longs (64-bit ints) will increase your memory usage, which is the #1 performance bottleneck on the G4 (although not as bad on the 970). This is the same as using 32-bit ints for 8-bit or 16-bit data types today. Proper variable sizing will remain a primary concern for developers who want optimized code.
 
No, it'll have magnetic core memory

Originally posted by Vlade
Will the 970 have DDR400 or what?

No, it'll have magnetic core memory. :cool:


Seriously, though, the PPC970 has no memory other than cache and scratch space.

Whatever memory it uses will be decided by the chipset and motherboard designers.

Since there haven't been any specs announced, anything could happen. MacB did claim PC3200, however, in the proto system that they supposedly saw.
 
Originally posted by mkaake
i don't think the g4 is all that bad - i think we're all getting caught up in the mHz myth that we love to bash others about. i mean, i'm writing this on a 266 g3, and it runs just about everything i want it to (except x, of course...) (and yeah, i know there are ways to make it, but it's not worth it on this machine). pump it with ram, and the only times when you see the age of the proc is when you rip and encode in itunes. and the occasional bout when you're running 4 or 5 programs at a time.

i think the g4 bashing has become kind of a sport and we've forgotten that it's still pretty powerful...

oh well . think what you will.

<braces for impact>

matt

"occasional bout when you're running 4 or 5 programs at a time" ... Dude, I'm hardly ever running fewer than 4 programs! I mean, you've got (aside from Finder), Mail and iTunes and Project Builder going 100% of the time, throw in Safari and you're up to 4!

Having a beefier machine broadens your horizons in ways you'll never understand until you have one ...

The G4 is a fine chip, with a really crappy front-side bus design. It just physically can't get enough data to keep the processor busy. That having been said, even with a significantly better FSB I'd be surprised if the G4 core could compete with the top-of-the-line P4s. It just doesn't do that much more per clock cycle to counter its lower frequency.
 
Re: Re: Re: Re: Re: PPC 970 for Apple... Confirmed?

Originally posted by jettredmont
Um, no.

Maybe yes and no.

The "size of the binary" statement is pretty far off (unless all data is statically allocated).

Using 64-bit pointers, however, does increase the runtime footprint of a program. For some programs, the pointers represent a fair proportion of the total data. Programs that keep their data in lists, double-linked lists, or trees often have that issue.

The larger data size lowers performance in two ways:

- more memory bus bandwidth is needed to move the pointers
- the pointers occupy more space in cache, thereby reducing the effective size of the caches

So, the original poster had more or less the right idea, but made a mistake by referring to the size of the binary image.
 
Re: Re: Re: AltiVec

Originally posted by matznentosh
Actually, there have to be at least 4 spatial dimensions plus the time dimension. Space is curved as can be seen clearly by the bending of light rays coming from distant stars. That curve has to occur in a different spatial dimension than the 3 we experience with our senses. As for string theory, most of my physics friends use unix boxes. They haven't gotten the Mac bug yet.

The effects of any dimension above 3 are manifested as time. Any series of events in the n-dimensional universe can therefore, from our 3-dimensional perspective, be described using 3+1 dimensions.

For purposes of predicting events, it is often useful to count on a fourth dimension, and sometimes useful to utilize further dimensions. If it takes 6 dimensions to explain an occurrence which could also be explained using 4, then the 4-D explanation wins (re Occam). However, in the absence of a 4-D simplification, one must run with 6 dimensions.

That having been said, the correct answer to "how many dimensions are there?" is one of "3+time" or "infinite".

Okay, let's drop the physics now except as it applies to the 970 processor ... :)
 
Re: PS8 Will not run on OS 9

Originally posted by Frobozz
PS8 Won't Work on OS 9

Could this mean something more than a simple Carbon to Cocoa conversion? I think it might. What other reason would they have for dropping OS 9?

Perhaps it's because they will be incorporating 970 enhanced code? Is there any reason why supporting the 970 would exclude OS 9? If not, it may just be a reason to stop QA'ing OS 9 support? Who knows.

Or it could simply be that they are using Carbon calls that are exclusive to MacOS X. Or they have switched to Mach-O and are using other MacOS X frameworks. It is completely possible (and I bet very likely) that Adobe will continue to use Carbon for its existing applications. Remember, they were one of the early companies that complained about having to rewrite for Cocoa - why would they do so now _after_ getting a Carbon version on MacOS X (as opposed to before, when they had tons of lead time and inside access that numerous other devs didn't)?

And of course, it is also possible that they will be putting out a 64-bit version of Photoshop - but they will still put out a 32-bit version for the majority of people that won't have 970s.
 
Snowy_River:

The truth is that the G4 is a modern processor. It happens to have some problems (too-slow FSB, etc.), but it is a robust, powerful, modern processor.
It's no more modern than an AMD K6-3 (which was of course a K6-2 with the on-die L2 and motherboard L3). In addition to the similar caches, they have similar slow FSBs, weak scalar floating point units, poor clock speed scaling, and no hardware prefetch. I'm not interested in looking up K6 details, but I'd be surprised if it had significantly worse out-of-order or superscalar execution capabilities than a G4. Perhaps someone else feels motivated to continue the comparison.

As for your assertion that the G4 is powerful, well, that's just ridiculous. There are plenty of benchmarks that illustrate what happens to G4s that venture outside of a hand-sewn AltiVec cocoon.

And what we're excited about is not the prospect that Apple will be competitive again, but that we may be coming into a new 'Golden Age', akin to when the G3 came out; that 970 based Power Macs will not simply be competitive with P4s, but will significantly exceed their performance
There is nothing that suggests the PPC-970 will outperform the Intel chips of its time in most apps. Intel already has P4s with an equally fast FSB and similar caches that score better in SPEC (the only performance numbers really available for a PPC970). Intel's next revision, needless to say, will not be slower. The current P4 core has been around since January 2002 and the next one will be out this year. Intel is a tough competitor.

jeffosx:

So what do 1000 ancient NT boxes have to do with modern PCs?

Bengt77:

The G4 still has some powers not unleashed yet. Just throw in a true DDR FSB (say 400MHz) and use corresponding (duh) DDR RAM and you'll see the (dual) G4's performance go up to quite a competitive level with today's Pentium 4s.
This looks to be pure speculation. Wishful speculation at that. Yes, a 400MHz FSB and solid RAM would boost certain AltiVec benchmarks a lot, but since when has that been the problem with the G4? If bandwidth were the problem, don't you think that Apple's new 166MHz FSB computers would be able to demonstrate a nice performance boost? Don't know about you, but I was troubled by the 17" AluBook's unimpressive CPU performance compared to the old PC133 15" TiBook.
 
Re: Re: Re: Re: Re: PPC 970 for Apple... Confirmed?

Originally posted by jettredmont
(and, I'd suspect that the compiler will have a separate switch for allowing >4GB addressable memory which would twiddle the pointer size back down to 32 bits as 90% of all apps won't need that much addressable memory).

Actually, I suspect that such a switch would be incompatible with the runtime. After all, there would then need to be some kind of switch to tell memory allocators that they couldn't go above 4GB, even though they have access to the memory. 64-bit support in MacOS X would already require 2 versions of every framework (you can't use a 32-bit framework in a 64-bit application and vice versa). Allowing a 64-bit runtime with 32-bit pointers would only make the problem worse.
 
Re: Re: Re: Re: Re: Re: PPC 970 for Apple... Confirmed?

Originally posted by AidenShaw
Using 64-bit pointers, however, does increase the runtime footprint of a program. For some programs, the pointers represent a fair proportion of the total data. Programs that keep their data in lists, double-linked lists, or trees often have that issue.

The larger data size lowers performance in two ways:

- more memory bus bandwidth is needed to move the pointers
- the pointers occupy more space in cache, thereby reducing the effective size of the caches

But if you are using so many pointers, then your data has already violated the locality-of-reference principle, and thus you would likely have made minimal use of the caches anyway. There are data structures that make heavy use of pointers and can still take advantage of being in the cache (hash tables come to mind). But if you are using that many pointers, even at 32 bits you are losing a lot of performance.
 
sometimes the algorithms demand it...

Originally posted by Rincewind42
But if you are using so many pointers, then your data has already violated .... If you are using that many pointers, even at 32 bits you are losing a lot of performance.

Losing performance compared to what?

Your statements seem rather naive - some algorithms naturally run the fastest by using pointers. Think of a b-tree with small nodes, sure the pointers will be a fair percentage of the data - but you're quite unlikely to come up with a faster way of solving the problem.

Many other important applications (fluid dynamics, semiconductor design, ...) also have a fair number of pointers.
 
Originally posted by ddtlm
Snowy_River:

As for your assertion that the G4 is powerful, well, that's just ridiculous. There are plenty of benchmarks that illustrate what happens to G4s that venture outside of a hand-sewn AltiVec cocoon.

Agreed. A G4 doesn't scare anyone anymore. It had a much longer competitive life than a Pentium or AMD chip, but as of now, it's not in the same league as the others. It's true that other factors were holding it back, but the G4 has to go.
 
Re: Re: PS8 Will not run on OS 9

Originally posted by Rincewind42
Or it could simply be that they are using Carbon calls that are exclusive to MacOS X. Or they have switched to Mach-O and are using other MacOS X frameworks. It is completely possible (and I bet very likely) that Adobe will continue to use Carbon for its existing applications. Remember, they were one of the early companies that complained about having to rewrite for Cocoa - why would they do so now _after_ getting a Carbon version on MacOS X (as opposed to before, when they had tons of lead time and inside access that numerous other devs didn't)?

And of course, it is also possible that they will be putting out a 64-bit version of Photoshop - but they will still put out a 32-bit version for the majority of people that won't have 970s.

well i don't know if cocoa will improve the performance but the recent versions of Photoshop and Illustrator for OS X really really blow. They need to do SOMETHING to make them run better.
 
Rewriting it exclusively for Cocoa should help the speed, since all of the legacy code will finally be gone. Of course, for me Photoshop 7 doesn't really blow now... Illustrator is a slug... but Photoshop is fairly snappy on my Dualie 1 Gig.
 
Re: sometimes the algorithms demand it...

Originally posted by AidenShaw
Losing performance compared to what?

Your statements seem rather naive - some algorithms naturally run the fastest by using pointers. Think of a b-tree with small nodes, sure the pointers will be a fair percentage of the data - but you're quite unlikely to come up with a faster way of solving the problem.

Many other important applications (fluid dynamics, semiconductor design, ...) also have a fair number of pointers.

I never said that there were not algorithms that would benefit from structures that rely strictly on pointers - your example was a linked list. A linked list where you have 1 or 2 pointers per node and a small amount of data is grossly inefficient - such lists are typically implemented as pointing to a block of data, chaining additional blocks that each point to the next (a linked list of arrays). I can easily think of an algorithm that would get a huge performance win because of pointers: page flipping. But the speedup comes from not having to do a block copy on a huge amount of data; the data size still dwarfs the pointer size regardless of whether it is 32 or 64 bits.

But regardless, if you're jumping all over memory, then you are losing the benefit of the cache to some degree. 4-8 extra bytes in the cache per node aren't going to increase that hit significantly. Caches are designed to take advantage of linear access patterns, not the effectively random ones you get with linked lists and trees (especially when you are allowed to delete and insert at any location within the data structure).
 
Re: Re: Re: PS8 Will not run on OS 9

Originally posted by dongmin
well i don't know if cocoa will improve the performance but the recent versions of Photoshop and Illustrator for OS X really really blow. They need to do SOMETHING to make them run better.

And I'm sure that they are; the real question is whether you will be willing to live with the minimum requirements :D. (I refer to OS version - there are far too many who have bitched about upgrading to 10.2 and say that they may not upgrade to 10.3 for one reason or another. I'm willing to bet that Adobe will require 10.3 for their next versions of software, if only because the rest of the industry will likely head that way by the time PS8 comes out.)

Originally posted by Rustus Maximus
Rewriting it exclusively for Cocoa should help the speed, since all of the legacy code will finally be gone. Of course, for me Photoshop 7 doesn't really blow now... Illustrator is a slug... but Photoshop is fairly snappy on my Dualie 1 Gig.

Rewriting in Cocoa won't necessarily do anything at all for the speed. And they could just as easily remove legacy code while staying in Carbon. Cocoa isn't a magic bullet, and Carbon isn't an instant indicator of legacy.
 