dicklacara said:
Mmmm.... the Justice Department would certainly be interested in that!

With the primary competitors being Japanese, and one of them (Sony) at least also selling at a discount to attract customers, I don't think they'll care! :p
 
No

acidreflux said:
I think you need to read point 5 before you continue your train of thought.

1. From a business standpoint, what IBM does or doesn't do doesn't transfer responsibility to Apple. Success or failure, yes, but not responsibility. Like I said, I am not under the impression that Steve lied; he said "probably," as in likely, and saying "probably" is not a promise. That is forecasting, not a commitment.

2. The Macintosh may be a concern for these companies, but in the case of graphics cards and processors, software and hardware vendors will, for financial reasons, put Apple aside when it comes to production priorities.

3. Your reasoning is weak, if you rule out the premise that Jobs lied.

1. I'm not talking about legal responsibility, and I assume you aren't either, since you say "from a business standpoint." It seems to me that, from a business standpoint, when you give someone (or several million someones) your personal assurance, you are assuming a kind of responsibility. Jobs basically asked us to trust him: not to trust that he was not lying, but rather to trust that IBM would deliver. He did not say "maybe"; he did not say "IBM has told me." Whether he should have said what he said is another question.

2. Yeah, and Jobs knew that. If Boeing considered development of my helicopter-lawnchair device to be top priority, I would be a millionaire by now.

3. No. See 1. People assume all sorts of responsibility that does not fall into the category of lying. In a typical corporate environment, if someone like Steve Jobs made this kind of commitment to a major client and did not deliver, he would be fired. Not for lying. For making a commitment and not delivering.

I'm not suggesting that Jobs should be fired, just pointing out that he bears responsibility, which seems to me to be painfully obvious.
 
Jibber Jabber

With all this babble, I'm sure one of you can point me in the right direction: can a 15" PowerBook with 2 GB of RAM handle video editing OK? It would be used for occasional editing with an external OWC Elite hard drive. Please let me know if it would be a waste of money to get one; I was just wondering, given the small FSB and other hindrances it may have.
 
Object-X said:
First, let me preface my comments by stating I am not an engineer and don't understand all the technical aspects of this CPU/GPU. But I can read and I will quote from Sony's press release and others in answer to your questions.



"Specifically, the companies confirmed that Cell is a multicore chip comprising a 64-bit Power processor core and multiple synergistic processor cores capable of massive floating point processing. Cell is optimized for compute-intensive workloads and broadband rich media applications, including computer entertainment, movies and other forms of digital content."

I would say that describes a lot of Apple's technologies very well. Don't you? The issue here is how applications will be developed to take advantage of this type of processing power. But it is application development that spurred me to my earlier posts with respect to the Powermac. Apple's pro applications (Final Cut Studio) are state of the art, and Sony needs these.

So it will give a good boost to Final Cut, but will suck for stuff outside of that. Like I said, it's for a limited crowd. So, like I said, it would be better as an upgrade card or as a separate video-designed Mac machine (vMac?), rather than going in all Macs, because the Cell sucks at anything integer-related.

Further, this is a big problem because the PowerPC excels at non-parallelized code while the Cell excels at parallelized code. So what do developers code for?
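To make that split concrete, here's a toy C sketch (my own illustrative example, nothing from any Cell SDK or Apple toolchain):

```c
#include <stddef.h>

/* Data-parallel: every element is independent, so the work could be
   split across Cell-style SPE cores or SIMD lanes with no coordination. */
void scale(float *out, const float *in, float k, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = in[i] * k;
}

/* Loop-carried dependency: each iteration needs the previous result,
   so this runs serially no matter how many cores are available
   (ignoring reduction tricks a compiler might apply). */
float running_sum(const float *in, size_t n) {
    float acc = 0.0f;
    for (size_t i = 0; i < n; i++)
        acc += in[i];  /* depends on acc from the previous iteration */
    return acc;
}
```

Code shaped like the first loop is where a Cell-style design shines; code shaped like the second is where a conventional PowerPC core does better.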


"IBM, Sony Group and Toshiba are collaborating on the design and implementation of Cell which is expected to deliver vast floating point capabilities, massive data bandwidth and scalable, supercomputer-like performance."

Sounds good to me. Supercomputer-like. I like the sound of that. Sounds like marketing for the G5. Perhaps these chips can't compare with current chip technologies given the way current applications are implemented, but I am reading into these comments that there is a new development paradigm about to take place that will take advantage of the Cell's unique characteristics.

Don't fall for the hype. The PS2's Emotion Engine was supposed to be super-computer like as well.
I quote from my other source:

"According to Petrov Group, IBM’s “GHz U-turn,” away from frequency and toward System-on-Chip integration and memory density, will have profound consequences for all major players in the digital entertainment, enterprise computing, and semiconductor industry sectors. It could lead to mass extinctions and the emergence of new vendors and businesses; it will profoundly change the landscapes of entire industries and create new configurations of business innovation, productivity, and added value. The changes are imminent although still poorly, if at all, understood.

“This year the era of entirely new organic-like computing technology is starting. It will be based on software-enabled computing cells. These building blocks will be highly integrated and super-dense, have very low power, and will be cost-effectively produced in ultra-large volumes. Perhaps most importantly, it will be software, rather than hardware, that will fuel the computing performance of the new systems,” Mr. Petrov said.

What the Petrov Group is getting at is a completely new way of creating applications that takes advantage of this processor's capabilities, which, by the way, he indicates very few people understand yet. I take solace in believing Apple engineers understand all too well what it's capable of. So, without answering the technical aspects of your question, I will punt by speculating that "software-enabled computing cells" means something no one has seen yet, but something that will change the way software applications are engineered. This in turn will lead to new and exciting breakthroughs in computer technology, and IBM is the one holding the keys.

None of this contradicts the lack of practical consumer applications. It might be nice in designer machines for rendering and video editing, but that's ALL. It would be useless in a normal server (though not in a rendering-node server) and in a home machine, and it's a lot more expensive than a normal processor.
Oh, and "mass extinctions" I believe refers to Microsoft. ;)

We can only hope...>- )

So you see, this suggests that the Cell chip is a whole new way of developing applications and implementing technologies. But the technologies in question (media content creation) are Apple's core strength.

I've already read all this stuff before, you know. It still does not address the issues. You've failed to address ANY of them in fact.

But there is more. Hmm, can't get a current G5 in a Powerbook because of heat? How about a new Cell multi-core Powerbook? And with precise processor clock control to enable power savings? Doesn't seem implausible to me.

What the heck makes you think the Cell is cooler than a G5?


Exactly what is a "high-end workstation" if not a Powermac? Here is where your argument seems to be at odds with what is being published about the Cell. First, Sony is going to develop a Cell based workstation:

"The Cell processor-based workstation will totally change the digital content creation environment," said Masayuki Chatani

Digital content creation environment? I would say that describes the Apple Powermac's primary function perfectly. So, exactly how the Cell helps with this I can't say definitively, but it seems from the sources I quoted above that it involves a whole new way of implementing software and developing media content. Here is the rub of my contention. If Sony is going to create a new "Workstation," what software for content creation is going to run on it? Why reinvent the wheel? And, more to the point, can they? Apple already has the state-of-the-art media content creation suite--it's called Final Cut Studio. A Sony-branded Apple Powermac? An HP-branded iPod? Who would have thunk? Didn't we hear that major PC makers are after Apple to license OS X? I think we did, and guess who's at the top of the list?

See above. Apple might make a video-editing station with Cell, because Cell is great for that kind of stuff, BUT if they made the whole PowerMac line Cell-only, they would alienate a lot of other people. The Cell's performance on regular Mac apps wouldn't be so good.


"Sony Corporation expects to launch home servers for broadband content as well as high-definition television (HDTV) systems powered by Cell in 2006.
Sony Computer Entertainment Inc. also expects to launch its next generation computer entertainment system powered by Cell to revolutionize the experience of computer entertainment. "


Home servers for broadband content? Can we say MacMini? What software will run on them? Something from Microsoft? Not a chance. Something Sony will create? Now why was Ando on stage at the keynote? This is why. Apple software and Apple workstations for creating this new digital entertainment content for Sony TVs and Playstations is a marriage made in heaven (or Cupertino).

Yeah, the Emotion Engine was supposed to be in TVs and PDAs too...
 
memofromturner said:
So as AMD users kick the crap out of your last year's model G5, hey, at least you have great bandwidth.


Absolutely not, it means that if I run two programs at once I kick the crap out of the AMD user, and my peripherals transfer data faster so I get work done faster and my apps load a tad faster.
 
Get A Grip, Folks...

Reading this thread, I am amazed at the number of people here still doing clockspeed vs. clockspeed comparisons between the PowerPC and Intel architecture chips. It is obvious that the GHz Myth is still alive and well, and is still doing its damage to the way processors are discussed.

All of this talk about the G5 Powermac being "an embarrassment" to Apple is insane. Just to clear my head of FUD, I revisited Barefeats' and a couple of other head-to-head tests of current G5 systems against current Wintel systems. Interestingly, the G5 holds up very well, especially in the midrange systems, in all operations. And the G5 wins many of the tests outright, in multi-processor and Altivec-enabled operations. And... that is in benchmark testing that pretty much obviates any impact of the operating system on user productivity.

I slam a computer very frequently, doing high frame rate video and multi-track audio work. First hand, I can tell you that my dual 2.5 G5 Powermac continues to amaze me with the grace and raw speed with which it dances through demanding tasks, and does so across multiple running applications.

I have a feeling that nobody here who has alluded to 'G5 Powermac embarrassment' actually owns and uses one every day. The truth is that the G5 Powermac flat-out rocks at getting real work done. And any incremental improvement will only make this already-terrific platform even better.
 
Hey guys...

Has anyone gone through the source code of Apple's various web pages to see if they left anything out there by mistake? They have let graphics and text scraps slip onto the server a couple of times in past years, which has served well as a tip-off to upcoming updates. Those with some web-programming background who are interested may well find a clue or two for us floating out there...
 
quta, you speak more like a veteran than a "newbie." :)

For those of you new to the rumor scene, learn the following 6 points. (I got my name for a reason, ya know.)

1. Apple is not closing its doors. (Just in case you have never heard that one before)

2. Apple wants to upgrade the specifications on its computers as much as possible and as quickly as possible. They never "hold back" for ANY reason. They would have released speed bumped/improved Macs last week if they could have.

3. Apple is not releasing lower upgrade specifications to the rumor sites so they can "wow" you when the "real" upgrades are announced. This is the real world. In the real world, things do not advance as quickly as either the seller or the buyer wishes.

4. ThinkSecret's track record on reporting rumors is extraordinary. Sometimes they might miss the mark on one upgraded feature, like a video card upgrade on a system, but if they say the next upgrade will be 2.0/2.3/2.7 G5's, you can bet that's what it is. Sorry, there will not be 3.0 dual cores this time around.

5. Related to #4, your opinion of what the next upgrade will be is usually more accurate than what other Mac rumor sites report (other than TS). Macrumors.com has realized this, too, and basically only reports rumors from TS and sometimes AppleInsider.

6. You would be fine with a dual 1GHz G4 if you just got to work instead of wasting your time worrying about the next upgrade and wasting hours reading 20 pages of a rumor message board in which people vent their feelings with no actual knowledge of what is coming down the pipe. Some gripe and complain about minor upgrades and how they adversely affect productivity, but instead of blaming Apple, look and see if your time management skills need improvement. Yes, it can be therapeutic to know others feel the same way you do (only 200 MHz?!? :mad: ), but seriously, GET TO WORK ALREADY!

Armed with the knowledge from #6, you can all go buy last year's Mac (which may still be this year's Mac :( ) and be more productive than actually having the Mac that you wish Apple would release in the coming update.
 
drewyboy said:
With all this babble, I'm sure one of you can point me in the right direction: can a 15" PowerBook with 2 GB of RAM handle video editing OK? It would be used for occasional editing with an external OWC Elite hard drive. Please let me know if it would be a waste of money to get one; I was just wondering, given the small FSB and other hindrances it may have.
Considering Apple marketed the 867MHz TiBook as a "mobile video editing workstation" and showed it running Final Cut Pro, I would think you would be just fine with a newer 1.67GHz AlBook. One thing that would help would be to remove the stock drive and put in a 7200RPM 60GB Travelstar hard drive, and/or get an external 3.5" 7200RPM FireWire drive. Obviously the PB will not be as fast at rendering as a desktop G5, but it will still perform fine if a PB is the form factor you need.
 
GFLPraxis said:
So it will give a good boost to Final Cut, but will suck for stuff outside of that. Like I said, it's for a limited crowd. So, like I said, it would be better as an upgrade card or as a separate video-designed Mac machine (vMac?), rather than going in all Macs, because the Cell sucks at anything integer-related.

Further, this is a big problem because the PowerPC excels at non-parallelized code while the Cell excels at parallelized code. So what do developers code for?
<snip>
In the end, I don't think this will matter. If Apple puts a Cell chip in their Macs, they'll throw it in as an accelerator for highly parallel code, write up some SDKs (software development kits) for the hardware, release it to their developers, and encourage them to make use of it. In fact, once GCC gets auto-vectorization capabilities, encouraging the developers won't be necessary.
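For what it's worth, here's the kind of loop GCC's auto-vectorizer can already handle (a minimal sketch; `-ftree-vectorize` is a real GCC flag enabled at -O3, but whether a Cell backend would ever pick this up is pure speculation on my part):

```c
#include <stddef.h>

/* saxpy: the classic vectorizable loop. With `restrict` promising no
   aliasing and no cross-iteration dependencies, `gcc -O3` (which enables
   -ftree-vectorize) can emit SIMD code for it without any source changes. */
void saxpy(float *restrict y, const float *restrict x, float a, size_t n) {
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```

Developers would just recompile; the compiler decides whether the loop maps onto whatever vector hardware is underneath.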
 
MacWhispers said:
All of this talk about the G5 Powermac being "an embarrassment" to Apple is insane.

You know, I think the G5 is an embarrassment, but not for the lack of 3.0 GHz. It's an oversized case with none of the expandability of previous Apple towers. Two hard drives? One optical? No BTO hardware RAID option, and three slots with one being hard to use because of the AGP card? It's kind of silly that the case has gotten bigger and heavier while the amount of stuff you can put inside has shrunk.
 
jbh001 said:
This doesn't make sense. Why add functionality for a technology (AGP 8x) that is already on its way out the door?

Why not PCIe?

This would be akin to replacing the modem with an ISDN terminal while willfully ignoring that DSL has already supplanted ISDN in the marketplace as the preferred technology.

Either Apple is experiencing a major attack of dumbness, or they are working overtime in the disinformation department to try and trace the leaks to Think Secret.


OR the G5s already support AGP 8x and Apple is just using the same motherboard, because these are minor updates and not worth the cost of engineering a new board.
 
yes, some good points, but...

...if I am going to drop $2500-2700 on a desktop that can't really play games (video card and simple availability issues) and has fundamentally limited program compatibility, then of course I will expect something else super-excellent, not just OS X. People aren't buying Power Macs for word processing or business work; they are buying them to render the next generation of films, do massive multitrack recording sessions, etc. Many (most cutting-edge virtual) audio apps are still only optimized for the PC, which means Macs must run at FASTER speeds simply to compete. For example, a new dual 2.7 would not run Ableton Live or Reaktor nearly as fast as a year-old Compaq AMD-64 *notebook*, which is somewhat sad. How can they be considered next-gen or a good investment for the future? Or, at the very least, how can they be considered worth upgrading to from a Rev. A? They need some major new hardware innovation!! At this point they still seem to be struggling with cooling and clock speed!? As for a mouse, I am very happy with my Logitech and wish Apple would quit wasting time tweaking vanity issues [glances at iPod/PowerBooks] vs. delivering real innovation...
 
GFLPraxis said:
So it will give a good boost for Final Cut, but will suck for stuff outside of that.
If it can assist in decoding h264, it's good for lots of things on lots of devices including laptop, desktop and floortop computers...

... any box that might be involved in playing QuickTime AV.

In the New Internet, we will be browsing/interacting with AV media, not just hyperlinked text pages.

IMO, CELLs, or the like, will be everywhere!
 
wellllllll

looks like no PM updates at NAB. I guess I can stop reading this thread and stop being curious about it til next week... hmmm, I wonder how many combined man-hours were spent just sitting around, speculating and BSing about this. Chalk me up for a few hours :(


...as my money piles up and continues to burn in my pocket
 
Don't lose hope

crpchristian said:
looks like no PM updates at NAB. I guess I can stop reading this thread and stop being curious about it til next week... hmmm, I wonder how many combined man-hours were spent just sitting around, speculating and BSing about this. Chalk me up for a few hours :(


...as my money piles up and continues to burn in my pocket
There's always tomorrow.
 
crpchristian said:
looks like no PM updates at NAB. I guess I can stop reading this thread and stop being curious about it til next week... hmmm, I wonder how many combined man-hours were spent just sitting around, speculating and BSing about this. Chalk me up for a few hours :(


...as my money piles up and continues to burn in my pocket

You and me both, brother. Been saving since September of last year.

And who needs more useless posts on the speed of Athlon 64s and Northbridge? Or the random "powerbook g5" posts.

Gah.

The thinksecret forums were pretty insightful, but they still had people doing that too.
 
Newtek Lightwave

Newtek Lightwave's next update, Lightwave 8.3, will support dual-core... why would they support it when the Mac doesn't have dual-core? So I am hoping the next update is dual-core.
 
dicklacara said:
If it can assist in decoding h264, it's good for lots of things on lots of devices including laptop, desktop and floortop computers...

... any box that might be involved in playing QuickTime AV.

In the New Internet, we will be browsing/interacting with AV media, not just hyperlinked text pages.

IMO, CELLs, or the like, will be everywhere!

So, increase the cost of every Apple machine by a couple hundred dollars, and force every programmer to redesign their apps for better performance with parallelized code, so... we can get slightly better H.264 performance? ...yeah.
 
kaneda said:
Newtek Lightwave next update will support dual-core...Lightwave 8.3...why would they support it when MAC doesn't have dual-core...so I am hoping next update is dual-core
This has already been posted and discussed earlier in this thread.

Anyway...we'll see dual-cores in Macs as soon as IBM's ready to release them. Exactly when that is is open to speculation.
 
GFLPraxis said:
So, increase the cost of every Apple machine by a couple hundred dollars, and force every programmer to redesign their apps for better performance with parallelized code, so... we can get slightly better H.264 performance? ...yeah.
We won't have to force programmers to do anything except recompile. Auto-vectorization combined with Cell support in a future version of GCC will do the redesigning for us. I agree with you about the price increase, although I'm not 100% sure the increase would amount to "a couple hundred dollars" - maybe more, maybe less. I'm just not sure.
 
myapplseedshurt said:
what kind of crack-laced, methamphetamine-impregnated ecstasy are you smokin'???? where did you get THAT information, thinksecret?? :D :D :D :D :D :D :D :D :D :D

Actually he got it from Microsoft's official developer specs, but he got it wrong.

It's three 3 GHz PowerPCs, but NOT G5s. They're far, FAR slower than normal G5s, issuing only 2 instructions per clock cycle and possibly not even having AltiVec.
 
wrldwzrd89 said:
We won't have to force programmers to do anything except recompile. Auto-vectorization combined with Cell support in a future version of GCC will do the redesigning for us. I agree with you about the price increase, although I'm not 100% sure the increase would amount to "a couple hundred dollars" - maybe more, maybe less. I'm just not sure.

IBM said it would increase the cost of the processor by 3.5 times, whatever that means (3.5 times which processor's cost?).
 
GFLPraxis said:
So, increase the cost of every Apple machine by a couple hundred dollars, and force every programmer to redesign their apps for better performance with parallelized code, so... we can get slightly better H.264 performance? ...yeah.

We're initially talking about things like TVs, etc.--things that will have a much higher volume than just computers.

When low-power, portable versions of the CELL become available, they will be used in cameras, cell phones, iPods, etc.

One of the beauties of the CELL architecture is that all apps do not need to be redesigned (maybe just recompiled, or not).

Optimization can be limited to critical AV interface apps such as QuickTime.

There are already special-purpose chips announced/available to do low-power H.264 decoding and normal-power H.264 encoding/decoding... these will prolly be used in the near future, then superseded by the more general-purpose CELL.

The potential is for billions of chips (in the China cell phone market, alone).
 