~Shard~ said:
You'll be fine - I'm doing it with my G4 iMac and have no concerns whatsoever. ;)

Guess you're not trying to run Motion or Aperture! You're going to find more and more apps requiring graphics cards that will only work in the Intel boxes or G5 towers. The next Photoshop or After Effects might be candidates.
 
I am sorry, but the IBM announcement seems to be a day late and a dollar short.

IBM had their chance with Apple for many years. They failed to deliver even after a long wait.

As much as I liked the PPC, given the new MacIntels..... May IBM rot in hell for holding Apple back.....
 
aegisdesign said:
Kind of pointless if your apps and OS aren't highly multi-threaded. Eight 2GHz cores running a single-threaded application are only as fast as a single 2GHz core.

(Insert obligatory grumble ...if only Apple had bought BeOS... Way ahead of its time)

No kidding! I downloaded the thing a while back when I had a Windows box (and that was legal, too :cool: ) and I became a huge fan of it.

I was watching some video where this Be guy says, 'this computer's got 2 processors in it, but one's off', then opens 4 video streams and 5 applications. Things start slowing down, he clicks a button to turn the 2nd chip back on, and everything flows smoothly again (suddenly I'm not impressed with the PS3's HD playback ability...).

And he said that they rigged a box with 8 processors in it! And it took advantage of every single one!!!

Imagine just buying another chip, instead of a new computer, and seeing huge increases in speed... ah... I hate M$... :mad:
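For what it's worth, here's a minimal sketch of the point in the quoted post, in Python with a made-up CPU-bound workload (the function name and numbers are purely illustrative): the same batch of work run one task at a time, then fanned out across all the cores.

```python
# A single-threaded job can only ever keep one core busy, no matter how many
# are in the box. Spreading the work over worker processes is what lets the
# extra cores help.
from concurrent.futures import ProcessPoolExecutor
import os, time

def crunch(n):
    """Stand-in CPU-bound task (purely illustrative workload)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [5_000_000] * 8

    start = time.perf_counter()
    [crunch(n) for n in work]                   # one core does everything
    print("sequential:", time.perf_counter() - start)

    start = time.perf_counter()
    with ProcessPoolExecutor(os.cpu_count()) as pool:
        list(pool.map(crunch, work))            # work spread over all cores
    print("parallel:  ", time.perf_counter() - start)
```

On a single-core machine the two timings come out about the same; only on a multi-core box does the second version pull ahead, which is exactly why an app that isn't written to split up its work sees no benefit from extra cores.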
 
Chip NoVaMac said:
Thanks, but it seems that we have heard a lot from IBM lately about the future of the PPC chip, after Apple decided to go to Intel.
And then there was that whole thing...
Days after Apple announced the Intel switch, IBM decided that they could have made processors that suited Apple better... They just didn't feel like it.
 
cgratti said:
Same here, I'll just move along slowly with my iMac G5, until 2007...

Haha, damn you! Just look at my Macs in my sig. I have zero sympathy for you!!! :p

45nm??? Holy hell... well over a hundred times smaller than a red blood cell??? It just boggles my mind how people can do this kind of stuff.
 
shyataroo said:
Why not just make the chips bigger (the same size transistors, just more of them) or go up to 128-bit computing?

The problem is this: say you want a processor to run at 3GHz. That means a signal needs to be able to travel through the processor (which is certainly not a straight line, and some paths are much longer than others) in one third of one billionth of a second, otherwise you have signals still propagating through the chip when the next clock cycle starts. In that time, light in a vacuum can only travel about 10 centimeters, and electricity flows through a microchip at less than the speed of light. So you can only make a chip so big before you run into interference and timing problems.
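A quick back-of-the-envelope check of that 10 centimeter figure, assuming nothing beyond the vacuum speed of light and the clock rate mentioned above (the "half the speed of light" case is just an illustrative assumption; real on-chip propagation speed depends on the interconnect):

```python
# How far can a signal travel during one clock cycle?
C = 299_792_458  # speed of light in a vacuum, m/s

def distance_per_cycle(clock_hz, fraction_of_c=1.0):
    """Distance covered in one clock period at a given fraction of c."""
    return C * fraction_of_c / clock_hz

# At 3GHz, light in a vacuum covers roughly 0.1 m (10 cm) per cycle.
print(distance_per_cycle(3e9))        # ~0.0999 m
# On-chip signals are slower; at, say, half the speed of light the
# per-cycle budget shrinks to about 5 cm, before counting gate delays.
print(distance_per_cycle(3e9, 0.5))   # ~0.05 m
```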
 
AidenShaw said:
And maybe by then Apple will expand their product line when Steve's replacement realizes that he has customers who say

I'll pay whatever you ask, just give me a portable workstation with 8 cores and loads of power. I don't care how thick, or how heavy, or how long the battery lasts!

and other customers who say

I'll pay whatever you ask, just give me a lightweight, compact portable with 10 to 12 hours of battery life in real use. I don't care if it has one core or two, or a built-in optical drive!

You'll be able to get A: and B: from the other Intel vendors....


Well, let's just hope that Steve won't be replaced on, before, or anytime soon after 2007. Not really sure why you would even bring that up.
 
AidenShaw said:
Unfortunately, that ignores the needs of both the people who want powerful, and the people who want small.


Again, I don't see the point of your post.
You obviously hate Apple, so why would you find it unfortunate?
 
Remember history; Apple has a real knack for picking the processor that initially looks like the clear winner but turns out to lag behind in the long run: 68000 eventually usurped by x86, PPC eventually usurped by x86, Intel usurped by AMD. Just enjoy the computer you've got till you need a new one, then buy a new one and enjoy that until it too no longer meets your wants and needs. Wash, rinse, repeat.
 
Agree

TheMasin9 said:
This can only go so far: 90, 65, 45... when we start approaching 20-5 nanometers, things are going to get really scary, because then you are dealing with elements on the atomic scale, and they start acting way differently than they do in these kinds of chip fabs.
I totally agree with you.
It's going to be very interesting to see what the major companies (IBM, Intel, AMD) are going to do. This time they should be aware of the problem and can prepare a lot, compared to the 90nm transition that took everybody by surprise.
I've spoken to some of the guys working at the nanometer lab at my university, and the message from them is that you really need to change the dev staff from electro-engineers to physisists (help with spelling :) ).
All previous experience won't help a bit.
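To put the "atomic scale" worry into rough numbers (using the silicon lattice constant of about 0.543nm and the process nodes mentioned above):

```python
# How many silicon unit cells fit across a feature at each process node?
SI_LATTICE_NM = 0.543  # silicon lattice constant, roughly 0.543 nm

for node_nm in (90, 65, 45, 20, 5):
    print(f"{node_nm:>3} nm ~ {node_nm / SI_LATTICE_NM:.0f} unit cells across")

# 45nm is still ~83 unit cells wide, but at 5nm you're down to roughly 9,
# which is where the quantum effects mentioned above stop being ignorable.
```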
 
aegisdesign said:
The elephant in the corner is wearing an AMD t-shirt, however.

Not really. AMD is having a hard time with their 45nm process. Hell, they aren't even shipping 65nm chips in bulk yet. Intel has remained in the lead when it comes to shipping processors at smaller process nodes.
 
Super Dave said:
Be was way ahead of its time. Has Palm done anything with it since buying it? I can never figure out why companies buy things like that and shelve them.

David:cool:

Death throes. You buy up small-time companies to try and fill the holes. Witness the purchase of some Chinese Linux company in 2004. Or was that 2005?
Palm's leadership was about as coherent as a grade-school economics class. Witness the "CEO of the year" at Palm. I'm not kidding when I say there was a new CEO ALMOST once a year over the last 5 years. The best thing to happen to PalmSource was when Access bought them out.

Chip NoVaMac said:
May IBM rot in hell for holding Apple back.....

A little extreme, don't you think? I leave such strong words for the MPAA and RIAA execs. IBM wasn't doing this on purpose; it's all about R&D funds. When the chip you design is going to be used predominantly in 5% of a market, someone has to pony up the R&D money, and that someone was Apple, who I'm pretty sure balked at that. Intel, on the other hand, sells so many CPUs that they easily make up the cost.
 
AidenShaw said:
And maybe by then Apple will expand their product line when Steve's replacement realizes that he has customers who say

I'll pay whatever you ask, just give me a portable workstation with 8 cores and loads of power. I don't care how thick, or how heavy, or how long the battery lasts!

and other customers who say

I'll pay whatever you ask, just give me a lightweight, compact portable with 10 to 12 hours of battery life in real use. I don't care if it has one core or two, or a built-in optical drive!

You'll be able to get A: and B: from the other Intel vendors....

Don't rule out another notebook range! Apple have released the MacBook Pro, soon(ish) we'll have the iBook replacement, the MacBook, and then we'll see a whole new notebook line, the MacBook Spine Compressor... It'll be a beast: 10 dual-core processors, 1TB of storage, 50GB of RAM, a 1kW battery providing 10 minutes of uptime, 55 FireWire ports, 150 USB 2 ports, 1 FW800 port ;), 2 x 16x DL DVD burners... It'll come with a 1-year warranty and 5 free chiropractic sessions. I can't wait... here's to the insane...

Actually, although I'm taking the p*ss a little, I do think that this year will see Apple pay some attention to gamers. I have told friends that I expect to see a gaming laptop in the next year... I think Steve is in talks with the major game developers, as it seems clear that he wants to maintain Apple's core professional market but not grow it... his focus for growth is on the home market, and that has to include games.
 
beatle888 said:
Again, I don't see the point of your post.
You obviously hate Apple, so why would you find it unfortunate?
But AidenShaw has a very good point here. Previously the chips and CPU were so thick that adding a thick optical drive didn't change the total package. For some years now, and certainly in the future, that won't be true. If you exclude all the fluff, you could build an awesome paper-thin laptop that I think a lot of people would like to have.

And there is also the portable workstation market; they want all the fluff, and of course you try to make their package as small as possible, but you can never get it as small as the first example.
 
ZLurker said:
... from electro-engineers to physisists (help with spelling :)

Physicists :) Like me!

What you say is true, because all kinds of un-obvious things happen when scaling down below classical length scales. Interference effects between individual electrons, coupling between them to form new states, scattering from the boundaries of thin wires, all kinds of interesting physics to explore. Eventually we'll get to where only 1 electron at a time can move through the interconnect, in single file lines...kind of crazy to think about.
 
gauchogolfer said:
Physicists :) Like me!

What you say is true, because all kinds of un-obvious things happen when scaling down below classical length scales. Interference effects between individual electrons, coupling between them to form new states, scattering from the boundaries of thin wires, all kinds of interesting physics to explore. Eventually we'll get to where only 1 electron at a time can move through the interconnect, in single file lines...kind of crazy to think about.
Woohoo, bring it on!
I think we will see some awesome new products in the next 20 years. I think comparing today's Quad to the next generation will be just as awesome as comparing a Commodore 64 to the Quad.

OK, a little off topic, but still nice to think about :)
 
jhu said:
I thought leakage increased with a decrease in process size?

Leakage is only part of the problem; the bigger problem is yield, caused by errors in the lithography process. There are now whole ranges of software from companies like Mentor Graphics (who I believe led the charge with the Calibre range, though I'm not certain) that take IC designs and adjust them so that they are more manufacturable. On top of that, whole other areas of DFM (design for manufacture) tools have been tackling these problems. Each reduction exaggerates old problems and introduces new ones, but I think the IC industry is now saying they can pretty much see their way to 10nm; beyond that, who knows?

It's all impossibly small to me!
 
plastique45 said:
Yes and no.

Yes, but IBM was the great loser with the G5. They promised 3GHz, they got 2.2GHz, and had to have them overclocked and water-cooled to reach 2.5GHz, while Intel not only got a speed increase but also DELIVERED those chips to their clients while IBM simply couldn't ship any chips for a long time :rolleyes:

Intel promised 5GHz processors by 2005, 10GHz by 2008.

We all know how that went...

Instead they increased their clock speeds minutely over a period of 4 years (3.06GHz -> 3.6GHz is not very good compared to 2GHz -> 2.7GHz in half the time).

I wish people would stop thinking that Intel are some kind of great company simply because Apple have switched to using Intel chips. Until the Pentium M and Yonah their processors were a joke (performance-wise and power-consumption-wise).

Apple switched at the right time. Anything older than Yonah wouldn't have been worth using.
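Just to put rough numbers on that comparison, taking the clock speeds and timeframes exactly as stated above:

```python
# Average yearly clock-speed growth, compounded, for the figures quoted above.
def annual_growth(start_ghz, end_ghz, years):
    return (end_ghz / start_ghz) ** (1 / years) - 1

print(f"Intel: 3.06 -> 3.6 GHz over 4 years: {annual_growth(3.06, 3.6, 4):.1%}/yr")
print(f"G5:    2.0 -> 2.7 GHz over 2 years:  {annual_growth(2.0, 2.7, 2):.1%}/yr")
# Roughly 4% a year versus 16% a year - which is the point being made.
```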
 
Hattig said:
Intel promised 5GHz processors by 2005, 10GHz by 2008.

We all know how that went...

Instead they increased their clock speeds minutely over a period of 4 years (3.06GHz -> 3.6GHz is not very good compared to 2GHz -> 2.7GHz in half the time).

I wish people would stop thinking that Intel are some kind of great company simply because Apple have switched to using Intel chips. Until the Pentium M and Yonah their processors were a joke (performance-wise and power-consumption-wise).

Apple switched at the right time. Anything older than Yonah wouldn't have been worth using.

Could not agree more. For me the advantage of the switch to Intel is that it doesn't even need to stay Intel; AMD is always there as an option. In fact, without AMD I'm not sure we would have seen Intel's focus shift from GHz to design efficiency. IMHO, AMD shamed Intel into doing something better.
 