With:
  • 2.66GHz quad core
  • X58 chipset
  • eSATA
  • Radeon HD 4850
  • 3 GiB DDR3 (1066)
  • 750 GB drive (space for 3 more)
  • space for 2nd optical
  • 15 in 1 media card reader
  • Vista x64
  • HDMI
  • 1394
Yeah, a lot of AMD fans started trashing i7 launch prices until I showed them that system. Then again, my friends like to build their own computers.
 
I wish Apple would invest that kind of $ in testing their hardware.

Apple just assembles hardware other manufacturers make, although they may create specifications like PCB shape and such.

The sum of the parts does require testing, I'm sure, but not $500M worth.
 
Then again, my friends like to build their own computers.

So do I...

I saw http://techreport.com/articles.x/15816/1 - and I think the ASUS P6T will be replacing my P5E3/Q6600 system before the end of the year ;) .

The kicker - two SAS ports on the mobo! My Q6600 system has a U320 SCSI card, and a pair of 15K RPM U320 SCSI drives for the system and application disks. I can get rid of a card in the next system...
 
I'm not sure why this is news.

I used to work in a validation unit; they obviously do some random testing, and random testing obviously can't cover every scenario.

Newer architectures, or architectures with new components or instructions, obviously get longer or new tests compared to simple steps between processors with the same architecture (since there you can just do regression testing).

Well, the news is pretty sparse on details but mentions formal methods, which are a completely different approach from 'random' testing. Formal methods consist of creating a mathematical model of the system, proving that the model has certain (interesting) properties, and, to some extent, arguing that the model is a reasonable approximation of the actual system.

Formal methods were thought to provide a whole new way of doing software development in the 1980s but have since fallen from grace somewhat. I was not aware that they were still a big thing in hardware development. If someone has some references, I would love to see them.
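
To make the contrast with random testing concrete, here is a toy sketch in Python (my own illustration, not anything from the news item): it checks a tiny shift-and-add multiplier model against its specification, first by random sampling and then exhaustively. Real formal verification proves the property symbolically with model checkers or theorem provers rather than by enumerating inputs, which is exactly why it scales where exhaustive testing cannot.

# Toy illustration only (not Intel's methodology): random testing vs. an
# exhaustive check of a tiny multiplier model against its specification.
import random

WIDTH = 4            # 4-bit operands: only 256 input pairs, so enumeration is feasible
MASK = (1 << WIDTH) - 1

def shift_add_mul(a, b):
    # Model of a simple shift-and-add multiplier, truncated to WIDTH bits.
    acc = 0
    for i in range(WIDTH):
        if (b >> i) & 1:
            acc += a << i
    return acc & MASK

def spec(a, b):
    # Specification: ordinary multiplication modulo 2**WIDTH.
    return (a * b) & MASK

# Random testing: samples the input space and may miss a corner case.
for _ in range(100):
    a, b = random.randrange(MASK + 1), random.randrange(MASK + 1)
    assert shift_add_mul(a, b) == spec(a, b)

# Exhaustive check: covers every input pair, feasible only because WIDTH is tiny.
assert all(shift_add_mul(a, b) == spec(a, b)
           for a in range(MASK + 1) for b in range(MASK + 1))
print("model matches spec for all", (MASK + 1) ** 2, "input pairs")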
 
The best thing about i7, which I have said in nearly every thread I have posted in, is no more 775/771; just 1366.
 
I'm not even sure why this is news. It's almost certain that every processor since the Pentium has had very rigorous testing, and Xeons, which Apple uses in the Mac Pros and Xserves, get even more testing since they are more mission-critical. Being able to update processors via microcode has also been part of Intel processors for a long time, and in fact it was used recently in Core 2 Duo processors to address a TLB issue (probably less serious than the one AMD had with the Phenom). I'm pretty sure AMD CPUs have a microcode update system too, but I believe the Phenom's differs from the Athlon's and was not fully implemented yet.

http://www.rage3d.com/board/showthread.php?threadid=33889730
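
As an aside, on a Linux box you can see which microcode revision the kernel has loaded; a rough sketch, assuming your kernel exposes the "microcode" field in /proc/cpuinfo (reasonably recent ones do):

# Rough sketch: report the loaded microcode revision(s) on Linux.
# Assumes /proc/cpuinfo exposes a "microcode" field; not applicable on OS X or Windows.
def microcode_revisions(path="/proc/cpuinfo"):
    revisions = set()
    with open(path) as f:
        for line in f:
            if line.startswith("microcode"):
                revisions.add(line.split(":", 1)[1].strip())
    return revisions

if __name__ == "__main__":
    revs = microcode_revisions()
    print("loaded microcode revision(s):", ", ".join(sorted(revs)) or "not reported")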
 
Ah, you're back. Neato
Yes, and I am glad to be. I have no dislike for anyone; I like everyone here, even if they have a different opinion of me or anything else.
Let's just hope they're a lot more efficient than the PowerPC chips, though, right?
Yes, PowerPC is dead technology now. :)
Although I do want Apple to stick with Nvidia chipsets to set themselves apart from the average Intel-chipset PC. ;)
If the Mac Pro, iMac, and Mac mini can get Nvidia motherboards, it will be great.
 
What's the bottom line?

The new Intel i7 processors and chipsets that were launched today are for desktop computers.

Apple doesn't use Intel's desktop processors in its consumer or professional line. Instead it uses Intel's portable and server processors.

Therefore, we won't see the new i7 architecture in Apple's Mac Pro until the first half of 2009, or in the iMac and MacBook until late 2009 at the earliest.

It appears there's going to be a substantial period where Apple's desktop offerings won't be as competitive (based on performance) as other computer makers' — kind of reminds me of the late PowerPC days.
 
The new Intel i7 processors and chipsets that were launched today are for desktop computers.

Apple doesn't use Intel's desktop processors in its consumer or professional line. Instead it uses Intel's portable and server processors.

Therefore, we won't see the new i7 architecture in Apple's Mac Pro until the first half of 2009, or in the iMac and MacBook until late 2009 at the earliest.

It appears there's going to be a substantial period where Apple's desktop offerings won't be as competitive (based on performance) as other computer makers' — kind of like the late PowerPC days.

Alright, I'm not going to act like I know much about computers, but if these chips run cooler, then couldn't Apple just put them in the iMac?
 
The best thing about i7, which I have said in nearly every thread I have posted in, is no more 775/771; just 1366.



That will change next year when the mainstream and server-based Nehalems come out. Remember, these are desktop chips, and desktop chips have always had one socket and servers another. This isn't changing for Nehalem either, so what you just said doesn't really apply. LGA771 was the socket for Core 2 based Xeons. The initial (Nehalem) DP Xeons will share the LGA1366 socket because they will use standard DDR3 like the desktops, but the MP Xeons will use LGA1567 because they use FB-DIMM based DDR3. Later in '09, the mainstream/low-power Nehalems (Havendale and Lynnfield) will use LGA1156. So in essence we'll have three sockets for Nehalem, and that doesn't include the mobile sockets. So yeah, it's not the best thing about i7, because it isn't true.
 
Apple doesn't use Intel's desktop processors in its consumer or professional line. Instead it uses Intel's portable and server processors.

Which is so sad....

The Lord God Jobs' obsession with "thin" won't let Apple put a desktop quad package in the iMac - forcing an overpriced, underpowered system on the faithful.

The "allergy" to a mini-tower desktop forces people to buy a Dell or HP - if they don't want an underpowered toy (the MacMini) or a workstation that's far more than they need (the Mac"Pro"), or a system that's an all-in-one with a monitor that they don't want (the Imac).

I can't fathom why his holiness doesn't sell a mini-tower aimed at the midrange buyer who doesn't want the restrictions inherent in an all-in-one.
 
Alright, I'm not going to act like I know much about computers, but if these chips run cooler, then couldn't Apple just put them in the iMac?

Not the iMac. You would have to compare each processor's TDP (Thermal Design Power), the maximum amount of power the cooling system in a computer is required to dissipate.

The iMac currently uses Intel's 3.06GHz X9100 Core 2 Extreme processor at 45W (TDP).*

PC desktops spec out at either the Intel 3.33GHz E8600 Core 2 Duo processor at 65W (TDP) or the 3.0GHz QX9650 Core 2 Extreme at 130W (TDP).

The slowest i7 chip is the Intel 2.66GHz Core i7 920 quad-core processor at 130W (TDP).

* The iMac originally used a custom-built, limited pre-Montevina 3.06GHz processor that was similar to the X9100 but ran at 55W; I'm assuming Apple has moved over to the now-released X9100.
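
To eyeball it, here is a trivial comparison of those TDP numbers against an assumed cooling budget for the iMac; the 50W budget is my own guess for illustration, not an Apple spec.

# Toy comparison of the TDP figures above against an assumed iMac cooling budget.
# The 50W budget is a guess for illustration, not an Apple number.
IMAC_COOLING_BUDGET_W = 50

cpus = {
    "Core 2 Extreme X9100 (3.06GHz, mobile)": 45,
    "Core 2 Duo E8600 (3.33GHz, desktop)": 65,
    "Core 2 Extreme QX9650 (3.0GHz, desktop)": 130,
    "Core i7 920 (2.66GHz, desktop)": 130,
}

for name, tdp in cpus.items():
    verdict = "fits" if tdp <= IMAC_COOLING_BUDGET_W else "too hot"
    print(f"{name}: {tdp}W -> {verdict}")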
 
So do I...

I saw http://techreport.com/articles.x/15816/1 - and I think the ASUS P6T will be replacing my P5E3/Q6600 system before the end of the year ;) .

The kicker - two SAS ports on the mobo! My Q6600 system has a U320 SCSI card, and a pair of 15K RPM U320 SCSI drives for the system and application disks. I can get rid of a card in the next system...
I was tempted to ask how much you'd want for your P5E3, but it's X38 and DDR3.
 
If Intel tested every possible 32-bit by 32-bit multiplication, how long would that take?

What about every 64-bit by 64-bit multiplication?

(The answer will reveal why they have to use theoretical or formal methods.)
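
For the curious, a quick back-of-the-envelope sketch in Python; the rate of one billion verified multiplications per second is just an assumption to make the arithmetic concrete.

# Back-of-the-envelope answer to the question above.
# The test rate is an assumption: one billion multiplications verified per second.
TESTS_PER_SECOND = 10**9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for bits in (32, 64):
    cases = (2 ** bits) ** 2                      # every pair of operands
    years = cases / TESTS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits}-bit x {bits}-bit: {cases:.3e} cases, roughly {years:.3e} years")

# 32x32: 2**64 ~ 1.8e19 cases -> about 585 years at 1e9 tests/s.
# 64x64: 2**128 ~ 3.4e38 cases -> about 1e22 years, far beyond the age of the
# universe, which is why exhaustive testing is simply off the table.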
 
As an additional safety net, Intel has included software in the Nehalem chips which can be changed after they ship.

Aha, firmware in the i7.

From a "time to fix" point of view I prefer a software solution over a hardware solution, but my first choice is to buy a product that does not need fixing.

Unfortunately, the computer industry launches unfinished products at such speed that it needs the firmware escape hatch just to deliver the promised product performance.

I think it is a sad excuse for mediocre design, development and testing.

Dear supernatural being, save us from more (firmware) update mayhem.

Regards
Coen
 
If Intel tested every possible 32-bit by 32-bit multiplication, how long would that take?

What about every 64-bit by 64-bit multiplication?

(The answer will reveal why they have to use theoretical or formal methods.)

That is assuming you could even test that way. It is quite possible that you would only ever get a wrong result if you performed three particular multiplications in a row.
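
To put rough numbers on that point: if a wrong result only shows up after a particular sequence of k multiplications, the space to cover is roughly the k-th power of the already-hopeless single-operation space. A tiny sketch (figures purely illustrative):

import math

# If a bug only triggers after a specific sequence of k multiplications, the
# number of cases grows as (2**128)**k for 64-bit operands (illustrative only).
single_op_cases = (2 ** 64) ** 2        # every 64-bit x 64-bit operand pair: 2**128
for k in (1, 2, 3):
    magnitude = math.floor(k * math.log10(single_op_cases))
    print(f"sequences of {k} multiplication(s): on the order of 10^{magnitude} cases")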
 
The Lord God Jobs' obsession with "thin" won't let Apple put a desktop quad package in the iMac - forcing an overpriced, underpowered system on the faithful. Yadda yadda yadda.

And for that we should be grateful. The iMac is one of the most exciting developments in home computing in many years.

From a "time to fix" point of view I prefer a software solution over a hardware solution, but my first choice is to buy a product that does not need fixing.

Unfortunately, the computer industry launches unfinished products at such speed that it needs the firmware escape hatch just to deliver the promised product performance.

I think it is a sad excuse for mediocre design, development and testing.

I think you lack an understanding of software engineering. To just throw down a blanket claim that mediocre design, development or testing is responsible for bugs is just plain silly. The smartest and hardest-working people in the world will create software that has bugs, because bugs are an unavoidable fact of software engineering. To TRY and mitigate against ALL defects is outrageously expensive and time-consuming - a price that you probably would not be willing to pay. There is an interesting story of how the team writing the software for the NASA space shuttle worked to try to prevent all bugs (and even then they still got some, just very, very few). They take years and years to write very little software (compared to what Intel, MS, or Apple produce, for example), but at least it is nearly perfect.

They Write the Right Stuff (the story about space shuttle software)
 
Adrian,

... because bugs are an unavoidable fact of software engineering.

This is exactly my point! This is why I inserted the "funny story" about the comparison between Micros~1 and GM. Why do we label "bugs" in cars as mediocre design, but call them an "unavoidable fact" in software engineering?

To TRY and mitigate against ALL defects is outrageously expensive and time-consuming - a price that you probably would not be willing to pay.

I am not convinced by the "price" and "time" argument. I believe it is a matter of mindset.
In the 1980s, cars manufactured in the US and Europe were full of "bugs". We thought those "bugs" were an "unavoidable fact" of car manufacturing.
The Japanese brands changed our way of thinking. Their concept of quality changed the entire industry. It did not lead to an explosion of car prices, and it did not slow down the new-model cycle.

So why can't the same thing happen in software engineering?

Intel, let the i7 be a good one from the start. I want/need a new Mac Pro.

Coen
 