When you wish upon a PowerMac...

GFLPraxis said:
Sorry, the last one NO WAY.

Blu-ray readers will not be available until 2006. Further, Blu-ray burners are estimated to cost $300 in 2007.

HD-DVD is the only one that will be available in 2005. Certainly no Blu-ray in June.

I can dream, can't I? :) I do think Blu-ray could be here by the fourth quarter, though; that might be in the cards.
 
Where is the HD content?

First, let me preface my comments by stating I am not an engineer and don't understand all the technical aspects of this CPU/GPU. But I can read and I will quote from Sony's press release and others in answer to your questions.

GFLPraxis said:
You're still not showing why it will be useful to the consumer. Some good quotes for you:

"Sure, the Cell will probably have a huge number of vector units but most work on the PC side is done by integer and floating point operations. SIMD/vector operations are not as common, especially because such things are hard to do."

"Specifically, the companies confirmed that Cell is a multicore chip comprising a 64-bit Power processor core and multiple synergistic processor cores capable of massive floating point processing. Cell is optimized for compute-intensive workloads and broadband rich media applications, including computer entertainment, movies and other forms of digital content."

I would say that describes a lot of Apple's technologies very well. Don't you? The issue here is how applications will be developed to take advantage of this type of processing power. But it is application development that spurred me to my earlier posts with respect to the Powermac. Apple's Pro applications (Final Cut Studio) are state of the art, and Sony needs these.

GFLPraxis said:
Yes, rendering would see quite a boost, but it's not something that enough Mac users do to increase the costs of Apple CPUs by that much, which is why it would make more sense to sell it as an accelerator like Durandal suggested, much like the old x87s.

"Only in VERY select pro applications. Why do you think someone hasn't done this before? Why did the drive to use GPU's for software besides games largely fail? Because it's a lot of silicon that isn't going to be of much use outside of HPC."

"Ever take a look at the design philosophy of the PowerPC 970? It excels at unoptimized code. The Cell ... excels (heh) at highly parallelized code, in other words, code that was written for it.

Putting a Cell as the CPU in a PowerMac would be akin to putting just the AltiVec from a G4/G5 core as a CPU. It's good at a very limited set of things. That doesn't make it bad or anything, but just because it's "scalable" doesn't mean that it can do everything under the sun equally well."

"IBM, Sony Group and Toshiba are collaborating on the design and implementation of Cell which is expected to deliver vast floating point capabilities, massive data bandwidth and scalable, supercomputer-like performance."

Sounds good to me. Supercomputer-like; I like the sound of that. Sounds like marketing for the G5. Perhaps these chips can't compare with current chip technologies given the way current applications are implemented, but I read these comments as saying a new development paradigm is about to take place that will take advantage of the Cell's unique characteristics.

I quote from my other source:

"According to Petrov Group, IBM’s “GHz U-turn,” away from frequency and toward System-on-Chip integration and memory density, will have profound consequences for all major players in the digital entertainment, enterprise computing, and semiconductor industry sectors. It could lead to mass extinctions and the emergence of new vendors and businesses; it will profoundly change the landscapes of entire industries and create new configurations of business innovation, productivity, and added value. The changes are imminent although still poorly, if at all, understood.

“This year the era of entirely new organic-like computing technology is starting. It will be based on software-enabled computing cells. These building blocks will be highly integrated and super-dense, have very low power, and will be cost-effectively produced in ultra-large volumes. Perhaps most importantly, it will be software, rather than hardware, that will fuel the computing performance of the new systems,” Mr. Petrov said.

What the Petrov Group is getting at is a completely new way of creating applications that takes advantage of this processor's capabilities, which, by the way, he indicates very few people understand yet. I take solace in believing Apple engineers understand all too well what it's capable of. So, without answering the technical aspects of your question, I will punt by speculating that "software-enabled computing cells" means something no one has seen yet, but that will change the way software applications are engineered. This in turn will lead to new and exciting breakthroughs in computer technology, and IBM is the one holding the cards.

Oh, and "mass extinctions" I believe refers to Microsoft. ;)

So you see, this suggests that the Cell chip is a whole new way of developing applications and implementing technologies. But the technologies in question (media content creation) are Apple's core strength.

But there is more. Hmm, can't get a current G5 in a Powerbook because of heat? How about a new Cell multi-core Powerbook? And with precise processor clock control to enable power savings? Doesn't seem implausible to me.

GFLPraxis said:
The only feasible application a Cell processor would have in Macs would be as some sort of dedicated media processor for high-end workstations. You won't be seeing Cell-based PowerMacs."

Exactly what is a "high-end workstation" if not a Powermac? Here is where your argument seems to be at odds with what is being published about the Cell. First, Sony is going to develop a Cell-based workstation:

"The Cell processor-based workstation will totally change the digital content creation environment," said Masayuki Chatani

Digital content creation environment? I would say that describes the Apple Powermac's primary function perfectly. So, exactly how the Cell helps with this I can't say definitively, but it seems from the sources I quoted above that it involves a whole new way of implementing software and developing media content. Here is the rub of my contention: if Sony is going to create a new "workstation," what software for content creation is going to run on it? Why reinvent the wheel? And, more to the point, can they? Apple already has the state-of-the-art media content creation suite; it's called Final Cut Studio. A Sony-branded Apple Powermac? An HP-branded iPod? Who would have thunk? Didn't we hear that major PC makers are after Apple to license OS X? I think we did, and guess who's at the top of the list?

"Sony Corporation expects to launch home servers for broadband content as well as high-definition television (HDTV) systems powered by Cell in 2006.
Sony Computer Entertainment Inc. also expects to launch its next generation computer entertainment system powered by Cell to revolutionize the experience of computer entertainment. "


Home servers for broadband content? Can we say MacMini? What software will run on them? Something from Microsoft? Not a chance. Something Sony will create? Now why was Ando on stage at the keynote? This is why. Apple software and Apple workstations for creating this new digital entertainment content for Sony TVs and Playstations is a marriage made in heaven (or Cupertino).

So will we see Cells in Powermacs and Powerbooks? I believe we will after Tiger is released. How does this benefit the consumer? Isn't it obvious? Apple software running on these new media products to interact with new HD media content benefits consumers, and Apple Powermac/Powerbook workstations running some form of Cell technology to develop said content benefits developers--like me. :)
 
fpnc said:
To compare you have to look at the percent change, not raw MHz. Thus, even with your numbers the change for the Pentium 4 is:

0.93GHz / 2.8GHz = 0.33 or 33% improvement

and for the rumored 2.7GHz G5:

0.70GHz / 2.0GHz = 0.35 or 35% improvement

And if you want to talk about dates, the 2.0GHz G5 systems from Apple didn't ship until well into summer 2003. So, if the 2.7GHz G5 ships within the next few weeks that will also be a less than two year period.

Frankly, clock speed and performance changes over the last two years between the G5 and Pentium 4 have been pretty much the same.

Thanks fpnc

To correct the numbers: in 2.5 years' time, 3800MHz - 3066MHz = 734MHz

0.734 / 3.066 = 0.24, or only a 24% increase in 2.5 years.
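For anyone who wants to double-check these figures, here is a quick Python sketch of the percentage calculation (the clock speeds are the ones quoted in this thread):

```python
def percent_improvement(old_ghz, new_ghz):
    """Relative clock-speed gain, as a percentage of the old clock."""
    return (new_ghz - old_ghz) / old_ghz * 100

# Pentium 4: 3.066 GHz -> 3.8 GHz over roughly 2.5 years
p4 = percent_improvement(3.066, 3.8)

# PowerPC G5: 2.0 GHz -> rumored 2.7 GHz over roughly 2 years
g5 = percent_improvement(2.0, 2.7)

print(f"Pentium 4: {p4:.0f}%")  # Pentium 4: 24%
print(f"G5:        {g5:.0f}%")  # G5:        35%
```

The point stands either way: measured as a percentage of the starting clock, the rumored G5 bump is actually the larger of the two.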
 
Even if the P4 has increased by more clock cycles per second than the G5... I would think that the G5 has still increased more than the P4 in terms of performance since the G5 can do more per clock cycle than the P4... so in that respect, 2.7 GHz is quite good, especially when you take into account the speed of the FSB on the G5 system vs the P4 system.
 
nexusfx said:
I like your take on this and your rationale. The one thing I disagree with is: once you throw those chips into a hopper and mix in manufacturing factors, market forces, software support, operating systems, and other concerns, the IBM PowerPC/G5 makes for a more efficient and better multitasking system. The speed, as you said, can be a wash; I won't exactly argue with that, but I'll tell you that in my experience the G5 is much faster for doing my graphic design and video rendering. Take that as you will. I respect and appreciate your comments and opinions.

Well, I can only assume that when you say that your experience on the G5 is "much faster" you're probably comparing a dual-processor Power Mac G5 to a single-core, single-processor Pentium or Athlon system. The dual processors in Apple's Power Mac are what have given it an advantage over the typical Pentium 4 system. Of course, that's going to change a bit now that dual-core Pentium systems are coming online (thankfully for Apple, at lower clock speeds and higher costs than the single-core Pentiums). But, yes, in my opinion the dual Power Mac G5s are still pretty competitive systems.
 
Well, I know they're faster... but do they have as much bandwidth?

GFLPraxis said:
You just completely skipped over his post, didn't you. He was pointing out that the benchmarks were not an accurate measurement: due to the more powerful processor itself, the scores of the x86 will be higher, but due to the more efficient design of the PowerPC, it is vastly better at multitasking and has no bandwidth problems.

Individual benchmarks don't take bandwidth into account.

So as AMD users kick the crap out of your last year model G5, hey at least you have great bandwidth.
 
Just took my 2.5 PM off Craigslist. If this rumor is even marginally correct, this is not going to be worth the upgrade dollars.

I honestly don't know what Apple is thinking with this upgrade. But no point going on and on about it, since everyone here seems to agree.
 
Blue Moon said:
Yeah, good thing you didn't get in on an investment that has jumped over 200% in the last year. You narrowly avoided disaster there, buddy.

Sure, but the 200% jump was mainly due to the iPod. I've felt that Apple has been concentrating too much on the iPod/iTMS and not enough on their computers -- particularly their Pro line.

Come on Apple, have you forgotten that you're a "computer" company?
 
There's no reason for an attitude like yours...

memofromturner said:
So as AMD users kick the crap out of your last year model G5, hey at least you have great bandwidth.

OK, "kick the crap out of your last year model G5"? Read some of fpnc's responses to me if you want to learn something; he and I have been replying with "real" answers and opinions, with respect and understanding, not unwarranted sarcasm and arrogance.

I'm trying to be reasonable here; there's no reason to have an attitude towards anyone, and he made a valid point. You, however, just decided to throw insults to feel better about yourself and your somewhat ill-informed understanding of system architecture. Because if you "did know" what your "ego" claims you do, you would have countered his point with a valid one of your own.
 
nexusfx said:
With all due respect "I did my Research." You looked at the results of someone else's and tried to pass it off like you did it yourself.

Actually I just pasted some links for your viewing pleasure... I never put my initials next to them. Have another look.

nexusfx said:
I wouldn't either, Doom 3 was a HORRIBLE port, they didn't do anything to the source code to take advantage of the G5 Architecture, that game didn't even run well on a Dual 2.5 with a 9600XT and 1.5 Gigs of RAM (A feat easily done on even an Athlon XP 3000 with a lower end card) I've got it on my 1.6, it runs fine, but well, I prefer the PC version.

Was it a horrible port? Because that's not what I remember reading. I recall reading it was down to the fundamental design of OS X and how it never allocates resources entirely to any single task.


nexusfx said:
Overpriced, that's debatable, there's the software you get too, not much but you still use more of it than you would from an out-of-the-box Wintel Machine. Though I DO AGREE myself on that one, if they do the updates they need to lower the price. OUTDATED, yeah ok that's why people in the creative and cinema business (PIXAR, WETA, Lucasfilm, you get the idea) use G5's instead of Wintel, I mean, those guys need their renders to be slower "THAT's HILARIOUS, I think that's better than my benchmark joke......oh wait, was that a joke..... Oye, jokes aside, I don't mean to insult, but an eye for an eye. NOW THEN

So I assume DreamWorks used G5s to render the new movie "Madagascar"? Nope; they used Opterons.

Pixar didn't purchase a new Intel Xeon RenderFarm in 2003?

Can't quote benchmarks? What then would be a good indicator of performance?

The average person couldn't care less about specs; they mean nothing except in terms of the performance they deliver. Most users want to know what the machine is capable of, not what's under the hood. But if you still want to compare hardware, then sadly you're last year's model. Period. Apple is still trying to play catch-up... and doing a poor job, according to these rumors.
 
nexusfx said:
Overpriced, that's debatable, there's the software you get too, not much but you still use more of it than you would from an out-of-the-box Wintel Machine. Though I DO AGREE myself on that one, if they do the updates they need to lower the price. OUTDATED, yeah ok that's why people in the creative and cinema business (PIXAR, WETA, Lucasfilm, you get the idea) use G5's instead of Wintel, I mean, those guys need their renders to be slower "THAT's HILARIOUS, I think that's better than my benchmark joke......oh wait, was that a joke..... Oye, jokes aside, I don't mean to insult, but an eye for an eye. NOW THEN

Well, actually, Lucasfilm's ILM division uses Athlon 64s, and Episode III was fully rendered using Athlon 64s running Windows XP 64-bit. As for WETA, R.O.T.K.'s effects were rendered on dual Xeons as well as G5s.

http://www.planetamd64.com/lofiversion/index.php/t2007.html
 
Here we go again....I need to go to bed.

memofromturner said:
So I assume DreamWorks used G5s to render the new movie "Madagascar"? Nope; they used Opterons.

You're right, but oh wait, did I mention DreamWorks at all? No, I sure didn't. Their designers use HPs too; watch the special features on the Shrek 2 DVD.

memofromturner said:
Pixar didn't purchase a new Intel Xeon RenderFarm in 2003?

Yes, they sure did. At that point, having a farm that big, Xeons do a good job, and they were probably cheaper (refer to fpnc's responses to my posts for more answers). Oh, and you might want to walk through Pixar's studios and take a look at what's sitting on the designers' desks; "those" would be G5s.

memofromturner said:
Can't quote benchmarks? What then would be a good indicator of performance?

I dunno; it's not like we can rewrite OS X to run like Windows or vice versa, or do the same with the hardware in some strange mixed-up way. Which is exactly what my point was to begin with.
 
Single G5 processor discontinued ??

What is going to happen to the single-processor G5 :confused:?
Is it going to be discontinued :(? Or do you expect a re-launch of a single-processor model :rolleyes:?

I was on the verge of buying the single 1.8GHz G5 when I read the negative buyer advice. What is going to be the "logical" equivalent (since I want the Mac separate from the screen)?

Coen
 
Interesting...

jiggie2g said:
Well, actually, Lucasfilm's ILM division uses Athlon 64s, and Episode III was fully rendered using Athlon 64s running Windows XP 64-bit. As for WETA, R.O.T.K.'s effects were rendered on dual Xeons as well as G5s.

http://www.planetamd64.com/lofiversion/index.php/t2007.html

Interesting, considering they used 600 G5's in the rendering of the Trilogy DVDs. Wonder how many different farms they have; yeesh, wish I had that money.

Oh, BTW, that's on Apple's website in their archived hot news somewhere... MAN, I really should be resting right now :)
 
NEW MACS ANNOUNCED TODAY!?

madrobby said:
There's a promotion at http://www.apple.com/de/promo/ (and uk, and at, and ?) which runs out on April 20. Maybe we'll see new PM on April 21.

Also interesting is the June 30th, 2005 expiration of another offer on that page, regarding Xserves.

The Stereophonics are playing at the Apple Store London tonight (8pm GMT). Could this be a launch backdrop?
 
It just makes no sense

IBM's POWER4 and POWER5 have been dual-core for ages; it would be a snip to make a dual-core G5. Of course, Apple would also have to make their own chipset to support it.
 

WOW... 678 people rated this story negative. That's got to be some record lol
 
If they up the default RAM, throw in a 300GB drive, and/or lower prices, I'm in for one (most likely the "middle" model). Oh, and make the X800 an option!
 
aafuss1 said:
I think the next update wouldn't feature PCI Express, because Final Cut Studio requires an AGP graphics card (and it's software that you may run on a G5). Dual-layer-capable SuperDrives would be welcome, as would price cuts.

That might be just for backward compatibility; if they required PCIe, then none of the current systems would be able to use it...
 
Xbox 2 (or 360 or whatever) is supposed to be revealed within a month and rumored to come with 3GHz Power PC processors. So why oh why would Apple settle for 2.7GHz processors if IBM has proven to be able to make 3GHz?
 
So far, Apple haven't settled for a 2.7GHz G5... as far as the public knows, anyway. It's a rumour, after all, and could be completely wrong. Maybe Apple will release 3-3.5GHz in June.
 
B-52 Macer said:
Xbox 2 (or 360 or whatever) is supposed to be revealed within a month and rumored to come with 3GHz Power PC processors. So why oh why would Apple settle for 2.7GHz processors if IBM has proven to be able to make 3GHz?
Those 3 GHz CPUs are positively wimpy, and therefore far easier to make, when compared to the PowerPC 970 and its derivatives. The IPC (Instructions Per Clock [cycle]) for the Xbox processors is said to be just 2, which is about half of that for a PPC 970 (normally 4, but it can be as high as 5 when a branch instruction is involved).
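To put rough numbers on that argument, here is a back-of-the-envelope Python sketch. The IPC figures are the estimates quoted above, and a peak throughput like this ignores memory, branches, and everything else that matters in practice, so treat it as illustration only:

```python
def peak_instruction_rate(clock_ghz, ipc):
    """Peak throughput in billions of instructions per second."""
    return clock_ghz * ipc

# Rumored Xbox CPU: 3 GHz, said to retire ~2 instructions per clock
xbox = peak_instruction_rate(3.0, 2)

# PowerPC 970 at the rumored 2.7 GHz, ~4 instructions per clock
ppc970 = peak_instruction_rate(2.7, 4)

print(xbox)    # 6.0
print(ppc970)  # 10.8
```

By this crude measure, the "slower" 2.7GHz 970 would still come out well ahead of a 3GHz chip with half the IPC.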
 
meh... just be cool

ok - time for my sixpence worth.

It's easy - do what all Mac users do - WAIT ! :D

We are all used to doing this so what is a couple more months?? :rolleyes: The more time Apple has to develop hardware the better. As the old saying in the computer world goes - don't buy until you absolutely positively need to.

Fair enough, if you have money burning a hole in your pocket ready to be thrown at a G5 - then do it after this conference, updated Macs or not, you will still be satisfied.

Either way - I will be throwing some $$$ at Apple shortly for a G5 - updates or not - who will be doing the same???

aussie_geek
 
wrldwzrd89 said:
Those 3 GHz CPUs are positively wimpy, and therefore far easier to make, when compared to the PowerPC 970 and its derivatives. The IPC (Instructions Per Clock [cycle]) for the Xbox processors is said to be just 2, which is about half of that for a PPC 970 (normally 4, but it can be as high as 5 when a branch instruction is involved).
Ok, then the difference in clock speed makes sense, damn. :(
 