Any gains you would have from Apple using Core i3s would be offset by the poor performance of the Intel GPU. End of story. You can argue the metrics/error margins/nerd porn all day, there's no changing that fact.
Perhaps, but that depends on what you're doing. The integrated GPU that Intel is using is pretty crummy, but not nearly as crummy as their past GPUs, at least. The desktop versions score a bit lower than the 9400M, and the laptop versions are clocked a fair bit lower still. If you're not doing much of anything that heavily uses the video card, then it probably wouldn't be noticed.

Now if Apple slapped an i3 in there and also added a halfway-decent dedicated video card, that would be even better. Not that it will happen. They'll probably use the Sandy Bridge integrated graphics, which might actually match the GT320 in performance. Well, that's the hope, anyway. The clock speeds will go as high as 1.4 GHz on the integrated GPU once SB hits.
 
Any gains you would have from Apple using Core i3s would be offset by the poor performance of the Intel GPU. End of story. You can argue the metrics/error margins/nerd porn all day, there's no changing that fact.

How come the i3s have to use an Intel GPU? The current Mac mini uses a mobile version of the Nvidia 9400. In fact, there isn't a single Mac that Apple currently sells that uses an Intel GPU.
 
How come the i3s have to use an Intel GPU? The current Mac mini uses a mobile version of the Nvidia 9400. In fact, there isn't a single Mac that Apple currently sells that uses an Intel GPU.
The integrated graphics of Clarkdale/Arrandale are on the same package as the CPU.

The clock speeds will go as high as 1.4 GHz on the integrated GPU once SB hits.
1.4 GHz is probably going to be desktop. Regardless, it looks like a clock speed bump.
 
I looked at the page. It compared a Core 2 Duo to a similarly clocked Core i3. No one here is arguing that the Core i3 isn't better than the Core 2 Duo clock for clock. But from what I did see, it pretty much confirms that the 3.33 GHz E8600 will fare very well against the Core i3 chips. Your benchmarks show nothing new. Do you actually think you're proving something here?
That depends on what you're doing. Like I said, multithreaded benches show the best improvement. The i3 consistently wins. You seemed to think that it didn't matter.
Boy, you really are drinking the Intel Kool-Aid when it comes to these i3 processors. It was your quote, not mine: "Take a look at the real-world stuff. x264 encoding, CS4, some games"

Let's do a real comparison of these Anandtech benchmarks that you posted, shall we? http://www.anandtech.com/show/2921/2

The E8600 outperformed the Core i3 in EVERY game. The Core 2 Duo outperformed the Core i3 in Sysmark 2007. The only time the Core i3 was in front, it was by 1 or 2 seconds, which proves you're just wasting your time trying to argue.
For crying out loud...SYSMARK IS SYNTHETIC. Synthetic benchmarks are CRAP. I don't trust a single one. I trust a comprehensive suite of real-world benches, not some program that runs a set series of functions that may or may not correspond to real-world performance. Did you see how the e8600 beat the quad-core Intel chips in Sysmark when the quads are inarguably more powerful CPUs? And that doesn't strike you as odd?

I also got my links mixed up with the Techpowerup/Anandtech benches, so yes, that was my fault. Like I said earlier, most games don't utilize more than 2 cores. Those that do will show the i3 soundly beating the Core 2, even the e8600, which is clocked 10% higher than the e8400 (and you certainly won't see a raw 10% boost from that). Run it against the numbers in the Techpowerup bench. You're now completely ignoring the Techpowerup benches, which are much more comprehensive than the Anandtech ones. Raw seconds are also irrelevant; it's the percentage gains you want to look at. Let's say one CPU gets a job done in 3 seconds and another in 2. Gee, only a one-second difference...except it's a 33% reduction in overall time.
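To spell out that arithmetic, here's a quick sketch with made-up timings (nothing from the reviews, purely illustration):

```swift
// Hypothetical timings for the same job on two CPUs, in seconds.
let slowCPU = 3.0
let fastCPU = 2.0

// Reduction in total time: (3 - 2) / 3 = 33%.
let timeSaved = (slowCPU - fastCPU) / slowCPU * 100

// The same gap expressed as throughput: 3 / 2 = 1.5x as fast.
let speedup = slowCPU / fastCPU

print(timeSaved, speedup) // 33.3 1.5
```

The one-second gap never changes, but framed as a percentage (or a throughput multiplier) it tells the real story.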
 
How come the i3s have to use an Intel GPU? The current Mac mini uses a mobile version of the Nvidia 9400. In fact, there isn't a single Mac that Apple currently sells that uses an Intel GPU.
Wrong. The 15"/17" MBPs have the Intel GPU. It's just that they also have a dedicated GPU and the computer will automatically switch to the dedicated chip when necessary. You get the integrated GPU whether you like it or not with the i3/i5/i7 mobile chips because it's integrated into the package itself. It's not integrated into the CPU itself, but sits right next to it on the same chip package. You can't separate them.
 
1.4 GHz is probably going to be desktop. Regardless, it looks like a clock speed bump.
True, but we may see a good 1 GHz on the mobile chips, and hopefully Intel makes further improvements that boost the chip's performance. Time will tell, but hopefully we'll see performance that's competitive with existing integrated GPUs.
 
That depends on what you're doing. Like I said, multithreaded benches show the best improvement. The i3 consistently wins. You seemed to think that it didn't matter.
I seemed to think that it didn't matter? Where did you pull that out of your ass? I clearly stated from the beginning that, clock for clock, the i3 will win. Go back and read my posts so I don't have to repeat myself.

For crying out loud...SYSMARK IS SYNTHETIC. Synthetic benchmarks are CRAP. I don't trust a single one. I trust a comprehensive suite of real-world benches, not some program that runs a set series of functions that may or may not correspond to real-world performance. Did you see how the e8600 beat the quad-core Intel chips in Sysmark when the quads are inarguably more powerful CPUs? And that doesn't strike you as odd?
This is too funny. You seem not to have a grasp on synthetic versus real-world application performance, because you have it backwards. Compare an E8600 to something like a Lynnfield Core i7: in a synthetic benchmark, the i7 will trounce the E8600; in real-world applications, like iMovie and Final Cut, the difference won't be as great. You have it backwards and don't understand what you're saying.
And no, it really doesn't strike me as odd that the Core 2 Quad was below the Duo; the Core 2 Quad wasn't that great.

I also got my links mixed up with the Techpowerup/Anandtech benches, so yes, that was my fault. Like I said earlier, most games don't utilize more than 2 cores. Those that do will show the i3 soundly beating the Core 2, even the e8600, which is clocked 10% higher than the e8400 (and you certainly won't see a raw 10% boost from that). Run it against the numbers in the Techpowerup bench. You're now completely ignoring the Techpowerup benches, which are much more comprehensive than the Anandtech ones. Raw seconds are also irrelevant; it's the percentage gains you want to look at. Let's say one CPU gets a job done in 3 seconds and another in 2. Gee, only a one-second difference...except it's a 33% reduction in overall time.
No, you said the Core i3 performed better in some games, and you were lying. You want real-world applications? Games are some of them. Also, how am I ignoring the Techpowerup benchmarks? They also showed the rather mediocre performance of the i3 in comparison to the Core 2 Duo.

x264 benchmark - hardly any improvement
Handbrake - 300 second improvement
Xiisoft Converter - hardly any improvement
Divx Converter - ZERO improvement

There you have it, folks: one "real world" application shows an amazing 20% improvement. LOL
 
It needs to be cheaper. I suspect the UK price will increase to something like £550, which is over £200 more than the first incarnation of the Mac Mini at £339.
 
I seemed to think that it didn't matter? Where did you pull that out of your ass? I clearly stated from the beginning that, clock for clock, the i3 will win. Go back and read my posts so I don't have to repeat myself.
Gee, how about the parts where you said that most applications aren't even multithreaded:
Generally, real-world applications don't even support multithreading, so your argument is blown straight out of the window. That synthetic benchmark is the best result you're gonna get.
And then you say that synthetics are the best you're gonna get. Stop posting. Now. Synthetics are garbage. A comprehensive bench of real-world applications is the best measure. Why? BECAUSE PEOPLE ACTUALLY USE THEM. When was the last time you used Sysmark to do actual work?

Now, for the average user, it may not matter because they're never going to push the system. For those who do, there is indeed a difference.

This is too funny. You seem to not have a grasp on synthetic and real world applications because you have it backwards. When comparing a E8600 to something like a Lynnfield Core i7, in the synthetic benchmark it will trounce the E8600. In real world applications, like iMovie and Final Cut, the difference won't be as great. You seem to have it backwards and not understand what you are saying.
Funny, the e8600 beats the Core 2 Quads in Sysmark. I'm the one who has it backwards? That tells me that Sysmark has piss-poor multithreading. And you're telling me that Final Cut won't see a big difference from an e8600 to a friggin' i7? Are you utterly HIGH? You're looking at two physical cores vs. four cores + four virtual cores. There's no contest! Final Cut is heavily multithreaded. If you're rendering or encoding a bunch of video, then the difference is night and day! I went from an early 2008 iMac (2.8 GHz, 6MB cache) to an i7 Hackintosh, and even at stock speeds the i7 destroyed the Core 2 completely and utterly. Just stop posting, for the love of God.
And no, it really doesn't strike me as odd that the Core 2 Quad was below the Duo; the Core 2 Quad wasn't that great.
There isn't enough facepalm in the world for this. Let's see: two cores at 3.33 GHz or four cores at 2.66 GHz. Which do you think has more horsepower? Now look at the Sysmark chart again. Funny, the chip with significantly less horsepower scores higher. That tells me that Sysmark is NOT a good measure of overall performance.
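Back-of-the-envelope: since both chips share the same Core 2 architecture, cores times clock is a fair first-order proxy here (a rough sketch, ignoring everything but core count and frequency):

```swift
// Core 2 Duo E8600: 2 cores at 3.33 GHz.
let duoGHz = 2.0 * 3.33    // 6.66 aggregate GHz

// Core 2 Quad (Q9400-class): 4 cores at 2.66 GHz.
let quadGHz = 4.0 * 2.66   // 10.64 aggregate GHz

// The quad has roughly 60% more raw horsepower on tap,
// yet Sysmark ranks the duo higher.
print(quadGHz / duoGHz)    // ~1.6
```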
No, you said the Core i3 performed better in some games, and you were lying. You want real-world applications? Games are some of them. Also, how am I ignoring the Techpowerup benchmarks? They also showed the rather mediocre performance of the i3 in comparison to the Core 2 Duo.
READ THE DAMNED TECHPOWERUP BENCH. Since you seem incapable of clicking further into it, I'll link the exact page for you:

http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/11.html

The i3 comes out significantly ahead in games that actually use multithreading. Are you just sticking your fingers in your ears and saying "LALALA I CAN'T HEAR YOU" all day long? Now look at the rest of the review. I've cited this multiple times already. Try actually looking. The Anandtech article briefly touched a few points. Techpowerup was much more comprehensive.

x264 benchmark - hardly any improvement
Handbrake - 300 second improvement
Xiisoft Converter - hardly any improvement
Divx Converter - ZERO improvement

There you have it, folks: one "real world" application shows an amazing 20% improvement. LOL

Gee, that's funny....

http://www.anandtech.com/show/2901/8

From a more detailed Anandtech comparison:

x264 - A bit over 20%. What are you smoking?
Handbrake - 20% improvement, as you said.
Xiisoft - You're estimating as Xiisoft was benched with an e8400.
Divx - A tiny improvement, which says more about Divx than the i3 when you compare other encoding benchmarks.

Now let's look at some others from both the detailed Anandtech review and the Techpowerup. A few points, some of which you oh so conveniently left out:

WM Encoder - About 12%
POV Raytracer - About 30%!
Blender - About 20%
Par2 Data Recover - Over 20%
Sony Vegas Bluray encode - Over 10%
Sorenson Squeeze FLV encode - About 15%
7zip compression, 32MB dictionary - Over 25%. Granted, it loses by a slim margin in another 7zip test.

Gaming is about the only area in which the e8600 comes out ahead most of the time, but even then, the i3 does have an edge when games utilize more than two cores as the Techpowerup benches show. Gaming is far more dependent upon the GPU than the CPU these days. As more games utilize more cores, you'll see the i3 gain an edge.

All rough estimates, but accurate within a percentage point or two. You can stop posting now. An i3 clocked over 10% lower than the e8600 still soundly beats it overall (at worst about matching it, and losing by more than a tiny margin in very few tests) in anything built to take advantage of more than two cores, which is becoming more common as developers catch up. It's not the most colossal leap, but considering the lower clock, it's certainly a nice upgrade, not to mention the availability of higher-clocked i3s. I'm comparing the 2.93 GHz model, not the 3.06 GHz one. And I haven't even mentioned price: the i3 530 is less than HALF the price of the e8600!
 
Gee, how about the parts where you said that most applications aren't even multithreaded:

And then you say that synthetics are the best you're gonna get. Stop posting. Now. Synthetics are garbage. A comprehensive bench of real-world applications is the best measure. Why? BECAUSE PEOPLE ACTUALLY USE THEM. When was the last time you used Sysmark to do actual work?
Pathetic. You still have no grasp of the difference between synthetic benchmarks and real application performance. People who know what they're talking about invoke real-world application performance when they're defending a WEAKER processor against a stronger one, showing that the stronger processor doesn't show as many gains in the real world as it does in the synthetic benchmark. You have it backwards, buddy. How about you stop posting, because it's pretty clear you have no idea what you're talking about.

Now, for the average user, it may not matter because they're never going to push the system. For those who do, there is indeed a difference.
I find this funny. A power user with an i3? LMAO!

Funny, the e8600 beats the Core 2 Quads in Sysmark. I'm the one who has it backwards? That tells me that Sysmark has piss-poor multithreading. And you're telling me that Final Cut won't see a big difference from an e8600 to a friggin' i7? Are you utterly HIGH? You're looking at two physical cores vs. four cores + four virtual cores. There's no contest! Final Cut is heavily multithreaded. If you're rendering or encoding a bunch of video, then the difference is night and day! I went from an early 2008 iMac (2.8 GHz, 6MB cache) to an i7 Hackintosh, and even at stock speeds the i7 destroyed the Core 2 completely and utterly. Just stop posting, for the love of God.
Funny stuff going on here; where do I begin? For one, the Core 2 Quad is a rather mediocre chip. For two, Geekbench (a well-known benchmarking tool with multithreading) has already proven the differences are minimal.

Also, the only part of Final Cut that is multithreaded is the effects. If the application is not built with something like Grand Central Dispatch in mind, then there's no benefit. It's really becoming clear what type of person I'm talking to here.
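"Built with Grand Central Dispatch in mind" just means the app explicitly carves its work into chunks and hands them to the system's queues; a minimal sketch of the idea, where renderFrame() is a hypothetical stand-in for the real per-frame work:

```swift
import Dispatch

// Hypothetical stand-in for CPU-heavy per-frame work
// (an effect render, an encode pass, etc.).
func renderFrame(_ index: Int) {
    // ...heavy lifting here...
}

let frameCount = 240

// concurrentPerform fans the iterations out across the available cores.
// An app that never expresses its work this way (or via threads/queues)
// runs on one core no matter how many the CPU offers.
DispatchQueue.concurrentPerform(iterations: frameCount) { frame in
    renderFrame(frame)
}
```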

There isn't enough facepalm in the world for this. Let's see: two cores at 3.33 GHz or four cores at 2.66 GHz. Which do you think has more horsepower? Now look at the Sysmark chart again. Funny, the chip with significantly less horsepower scores higher. That tells me that Sysmark is NOT a good measure of overall performance.
Oh, how funny it is watching someone criticize a benchmark they posted. Hilarious.

READ THE DAMNED TECHPOWERUP BENCH. Since you seem incapable of clicking further into it, I'll link the exact page for you:

http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/11.html

The i3 comes out significantly ahead in games that actually use multithreading. Are you just sticking your fingers in your ears and saying "LALALA I CAN'T HEAR YOU" all day long? Now look at the rest of the review. I've cited this multiple times already. Try actually looking. The Anandtech article briefly touched a few points. Techpowerup was much more comprehensive.
Wow, the Core i3 uses two fewer watts than the E8400???? HOLY **** ALERT THE PRESSES!!!!

Is that what you wanted me to see? I've looked through it. It's not impressive, and overclocking is pointless in a Mac discussion.
Gee, that's funny....

http://www.anandtech.com/show/2901/8

From a more detailed Anandtech comparison:

x264 - A bit over 20%. What are you smoking?
Handbrake - 20% improvement, as you said.
Xiisoft - You're estimating as Xiisoft was benched with an e8400.
Divx - A tiny improvement, which says more about Divx than the i3 when you compare other encoding benchmarks.

Now let's look at some others from both the detailed Anandtech review and the Techpowerup. A few points, some of which you oh so conveniently left out:

WM Encoder - About 12%
POV Raytracer - About 30%!
Blender - About 20%
Par2 Data Recover - Over 20%
Sony Vegas Bluray encode - Over 10%
Sorenson Squeeze FLV encode - About 15%
7zip compression, 32MB dictionary - Over 25%. Granted, it loses by a slim margin in another 7zip test.

Gaming is about the only area in which the e8600 comes out ahead most of the time, but even then, the i3 does have an edge when games utilize more than two cores as the Techpowerup benches show. Gaming is far more dependent upon the GPU than the CPU these days. As more games utilize more cores, you'll see the i3 gain an edge.

All rough estimates, but accurate within a percentage point or two. You can stop posting now. An i3 clocked over 10% lower than the e8600 still soundly beats it overall (at worst about matching it, and losing by more than a tiny margin in very few tests) in anything built to take advantage of more than two cores. It's not the most colossal leap, but considering the lower clock, it's certainly a nice upgrade, not to mention the availability of higher-clocked i3s. I'm comparing the 2.93 GHz model.
x264 - What am I smoking? I'm smoking nothing, because once you bump that E8400 up to an E8600, that amazing gap of yours will vanish.
Xiisoft - LOL, again, you posted the benchmark. It showed no gains. Not my problem.
Divx - OK, criticize another benchmark YOU posted. Boy, isn't backpedaling fun?

LOL? Bragging about 10-20% more performance is absolutely hilarious coming from the guy who said the i3 would "destroy" the Core 2 Duo. Give me a break, buddy. That Core 2 Duo is 2 years old, and the best Intel can squeeze out is a 10-20% performance increase?

I rest my case: the i3 is essentially rebranded garbage from 2 years ago. I said the performance differences were minimal when I first responded to you, and they are. Thanks for proving my point.
 
Pathetic. You still have no grasp of the difference between synthetic benchmarks and real application performance. People who know what they're talking about invoke real-world application performance when they're defending a WEAKER processor against a stronger one, showing that the stronger processor doesn't show as many gains in the real world as it does in the synthetic benchmark. You have it backwards, buddy. How about you stop posting, because it's pretty clear you have no idea what you're talking about.
HAHAHHAHAHAA! Yeah, sure, you just keep telling yourself that, pal. Synthetics are not real world. Real world is real world. It's showing how ACTUAL APPLICATIONS perform. How exactly does a synthetic measure that? Synthetics may give you a rough idea, but it's a comprehensive suite of real world applications in different categories that will show you how the CPU really does...IN THE REAL WORLD. I can't believe anyone would try to argue to the contrary. Well, unless they're an obstinate chunkhead who cannot admit when he's completely wrong.

Please show me some of these "people who know what they're talking about", because I sure as hell am not talking to one.
I find this funny. A power user with an i3? LMAO!
Not everyone beats their CPU 24/7 and not everyone has a large budget. Performance is performance. I'm not trying to say that a pro user will buy an i3. I'm saying that the i3 is a superior CPU.
Funny stuff going on here; where do I begin? For one, the Core 2 Quad is a rather mediocre chip. For two, Geekbench (a well-known benchmarking tool with multithreading) has already proven the differences are minimal.
And funny, real-world multithreaded apps show that the differences are huge. Do you really think that a Core 2 Duo at 3.33 GHz somehow has greater potential than a quad at 2.66 GHz, both using the same friggin' architecture? Are you nuts? Did you notice how the Q9400 whomped the e8600 (and the i3, too) in nearly every multimedia bench in that list? And you're going to tell me that a goddamned synthetic bench is a better indication of performance?

Please justify your answer. Don't just tell me, "Well, smart people say so!" with nothing to back it up. Tell me logically or cite a source that definitively says a synthetic benchmark is somehow a better indication of performance than a comprehensive suite of real-world applications. Go for it, slugger.
Also, the only part of Final Cut that is multithreaded is the effects. If the application is not built with something like Grand Central Dispatch in mind, then there's no benefit. It's really becoming clear what type of person I'm talking to here.
Congrats on proving that you're clueless. Final Cut is heavily multithreaded. Why? IT HAS TO BE. Apple has had dual CPUs for nearly ten years now, and you're telling me they haven't optimized everything they can in their flagship HD video editor for multiple CPUs/cores? Show me some proof or get the hell out. Even if it were just the effects, that's often the most CPU-intensive thing you can do with Final Cut! You're telling me that playback of 1080p HD isn't written for multiple cores in Final Cut? Get lost!

Also, Compressor is very heavily optimized for multiple cores.
Oh, how funny it is watching someone criticize a benchmark they posted. Hilarious.
Sysmark is ONE PART of the bench, you toolbox, and the results are very often contrary to the rest of the benchmarks. Reading comprehension. Learn it.
Wow, the Core i3 uses two fewer watts than the E8400???? HOLY **** ALERT THE PRESSES!!!!
I didn't even mention power consumption. You're now deliberately dodging because you know that you're full of crap and won't admit it. Try looking at the rest of the benchmarks they posted. Go ahead.
Is that what you wanted me to see? I've looked through it. It's not impressive, and overclocking is pointless in a Mac discussion.
Confirmed. You completely ignore EVERY OTHER BENCH in there. I said nothing about overclocking. I did point out that it beats the e8400 soundly in games that actually use more than two cores, another point you completely ignore, along with every additional benchmark they had. Smooth move.
x264 - What am I smoking? I'm smoking nothing, because once you bump that E8400 up to an E8600, that amazing gap of yours will vanish.
Xiisoft - LOL, again, you posted the benchmark. It showed no gains. Not my problem.
Divx - OK, criticize another benchmark YOU posted. Boy, isn't backpedaling fun?

LOL? I don't see anything related to POV Raytracer, Blender, Par2 Data Recover, Sony Vegas Bluray encode, Sorenson Squeeze FLV encode and 7zip compression. Maybe you should check your links or stop wasting my time. And bragging about 10-20% more performance is absolutely hilarious.
Try clicking on that Anandtech link (the one you just ignored), which I friggin' SAID was a more comprehensive review. Go on, give it a try. Not that it'll matter. You pretty well ignore everything to the contrary anyway. What you'll see will back up everything I said, but I don't expect you to admit to anything. You'll deliberately ignore everything to stay in your insulated bubble. Have fun in there.

This level of ignorance makes me wonder if you're just a troll that crawled out of 4chan. Considering you're thick enough to think that Apple could cram an incredibly hot, power-hungry monster like the GTX 480M (100 W in a mobile chip, nearly double that of the 4850) into an iMac when they've never gone for top-end GPUs in that line...yeah. They'd have to blare the fans just to keep it from overheating, and Apple's tendency is more heat instead of more noise.

LOL? Bragging about 10-20% more performance is absolutely hilarious coming from the guy who said the i3 would "destroy" the Core 2 Duo. Give me a break, buddy. That Core 2 Duo is 2 years old, and the best Intel can squeeze out is a 10-20% performance increase?

I rest my case: the i3 is essentially rebranded garbage from 2 years ago. I said the performance differences were minimal when I first responded to you, and they are. Thanks for proving my point.
Yes, 10-20%...in a lower-end chip that costs less than HALF of what the earlier generation top-end cost and runs at a lower clock speed with lower power consumption. My, what a horrid upgrade! Now compare the e8600 to the top-end i5 dual core, which is much closer in price and is at the HIGH END of what Intel is selling in a dual package. The e8600 gets inarguably destroyed. When the lower end comes out 10-20% over the previous high end, I call that some good progress. Are you somehow under the impression that the i3 is supposed to be a high-end chip in Intel's current lineup? It's a rung above the bottom of this generation, yet it still manages to edge out the top end of the previous dual cores. Go ahead, let that sink in for a while. Now look at how much faster the i3 is than the rest of the Core 2 Duo lineup. Pit the i3 against an e7x00, which is about the same price. I'd say destroy isn't a bad term. Congratulations on showing yet another facet of your staggering ignorance. You have no clue how the current processor lineup is situated. The i3 is on the lower end, and you're laughing about how a lower-end chip only manages to beat the higher end chip of yesteryear by a measly 10-20%. You're unbelievable.

Rebranded? You really do have no clue. None. I guess the i3 is just a Core 2 Duo...with hyperthreading. And more aggressive power management (2 W difference at idle, much wider spread under load). And DMI. And the memory controller and PCIe controller moved onboard. And the integrated GPU. And numerous other improvements. The dual-core i5s also have Turbo Boost and additional AES instructions. Yep, that's just rebranded, all right! No real improvements at all! And on what planet is 10-20% (which you yourself just said) minimal?

You don't have a case to rest unless you're resting on being an ignorant toolbox. You've proven that time and time and time again in here.
 
HAHAHHAHAHAA! Yeah, sure, you just keep telling yourself that, pal. Synthetics are not real world. Real world is real world. It's showing how ACTUAL APPLICATIONS perform. How exactly does a synthetic measure that? Synthetics may give you a rough idea, but it's a comprehensive suite of real world applications in different categories that will show you how the CPU really does...IN THE REAL WORLD. I can't believe anyone would try to argue to the contrary. Well, unless they're an obstinate chunkhead who cannot admit when he's completely wrong.
Oh, how you try to twist things, acting like you know what's going on. It really isn't that tough a concept to understand. Pay close attention now; class is in session.
http://www.macworld.com/reviews/product/343881/review/27inch_imac_core_i5266ghz.html
Scroll down to the benchmarks. Look at iMovie, then look at a synthetic benchmark like Cinebench. If you really can't see how wrong you are, then that's just sad.

Please show me some of these "people who know what they're talking about", because I sure as hell am not talking to one.
It's pretty clear you are still confused; perhaps my lesson above will make it clearer for you?

Not everyone beats their CPU 24/7 and not everyone has a large budget. Performance is performance. I'm not trying to say that a pro user will buy an i3. I'm saying that the i3 is a superior CPU.
Oh, but you're trying to play it off like a 10-20% performance increase over a 2-year-old processor is the bee's knees? LOL

And funny, real-world multithreaded apps show that the differences are huge. Do you really think that a Core 2 Duo at 3.33 GHz somehow has greater potential than a quad at 2.66 GHz, both using the same friggin' architecture? Are you nuts? Did you notice how the Q9400 whomped the e8600 (and the i3, too) in nearly every multimedia bench in that list? And you're going to tell me that a goddamned synthetic bench is a better indication of performance?

Please justify your answer. Don't just tell me, "Well, smart people say so!" with nothing to back it up. Tell me logically or cite a source that definitively says a synthetic benchmark is somehow a better indication of performance than a comprehensive suite of real-world applications. Go for it, slugger.
You posted it, not me, so now your source of benchmarks isn't reliable? ;) It seems you only take notice of benchmarks that are beneficial to your argument and ignore the rest just because THEY'RE SYNTHETIC HOLY **** I CAN'T BELIEVE IT, I DON'T EVEN KNOW WHAT A SYNTHETIC BENCHMARK IS BUT I'M COMPLAINING ABOUT IT LIKE I DO

Congrats on proving that you're clueless. Final Cut is heavily multithreaded. Why? IT HAS TO BE. Apple has had dual CPUs for nearly ten years now, and you're telling me they haven't optimized everything they can in their flagship HD video editor for multiple CPUs/cores? Show me some proof or get the hell out. Even if it were just the effects, that's often the most CPU-intensive thing you can do with Final Cut! You're telling me that playback of 1080p HD isn't written for multiple cores in Final Cut? Get lost!

Also, Compressor is very heavily optimized for multiple cores.
http://forums.creativecow.net/thread/8/1017479

Sysmark is ONE PART of the bench, you toolbox, and the results are very often contrary to the rest of the benchmarks. Reading comprehension. Learn it.
Right, if it's such an awful benchmark then why did your wonderful source use it? ;)

I didn't even mention power consumption. You're now deliberately dodging because you know that you're full of crap and won't admit it. Try looking at the rest of the benchmarks they posted. Go ahead.
I did look at their mediocre benchmarks. 10-20 percent, LOL

Confirmed. You completely ignore EVERY OTHER BENCH in there. I said nothing about overclocking. I did point out that it beats the e8400 soundly in games that actually use more than two cores, another point you completely ignore, along with every additional benchmark they had. Smooth move.
Funny, I just went back and looked at those benchmarks just to get a glimpse of more mediocre 10-20 percent (if even) improvement.

Try clicking on that Anandtech link (the one you just ignored), which I friggin' SAID was a more comprehensive review. Go on, give it a try. Not that it'll matter. You pretty well ignore everything to the contrary anyway. What you'll see will back up everything I said, but I don't expect you to admit to anything. You'll deliberately ignore everything to stay in your insulated bubble. Have fun in there.
I've seen plenty of your mediocre benchmarks that had 10-20 percent improvement. WHICH IS NO DIFFERENT THAN THE ORIGINAL SYNTHETIC GEEKBENCH SCORES I POSTED WHEN I FIRST REPLIED

This level of ignorance makes me wonder if you're just a troll that crawled out of 4chan. Considering you're thick enough to think that Apple could cram an incredibly hot, power-hungry monster like the GTX 480M (100 W in a mobile chip, nearly double that of the 4850) into an iMac when they've never gone for top-end GPUs in that line...yeah. They'd have to blare the fans just to keep it from overheating, and Apple's tendency is more heat instead of more noise.
It seems you have a tendency to call anyone a troll from 4chan the second someone disagrees with you. You seem to have quite a fascination with the place; makes me wonder.
https://forums.macrumors.com/showthread.php?p=10021527&#post10021527

Oh, and I did not check the watt usage on the GPU at the time, as it was not listed in Engadget's article; don't get your panties in a bunch over a single mistake I made.
Yes, 10-20%...in a lower-end chip that costs less than HALF of what the earlier generation top-end cost and runs at a lower clock speed with lower power consumption. My, what a horrid upgrade! Now compare the e8600 to the top-end i5 dual core, which is much closer in price and is at the HIGH END of what Intel is selling in a dual package. The e8600 gets inarguably destroyed. When the lower end comes out 10-20% over the previous high end, I call that some good progress. Are you somehow under the impression that the i3 is supposed to be a high-end chip in Intel's current lineup? It's a rung above the bottom of this generation, yet it still manages to edge out the top end of the previous dual cores. Go ahead, let that sink in for a while. Now look at how much faster the i3 is than the rest of the Core 2 Duo lineup. Pit the i3 against an e7x00, which is about the same price. I'd say destroy isn't a bad term.
It seems you are somehow under the impression that you've actually proved something. I've been stating this entire time that the i3 was an improvement, but not a very good one; do me a favor and get that through your thick skull.

Rebranded? You really do have no clue. None. I guess the i3 is just a Core 2 Duo...with hyperthreading. And much more aggressive power management (2 W difference at idle, much wider spread under load). And DMI. And the memory controller and PCIe controller moved onboard. And the integrated GPU. And numerous other improvements. Yep, that's just rebranded, all right! No real improvements at all! And on what planet is 10-20% (which you yourself just said) minimal?
Rebranded performance with their garbage GPU, boy what a fun upgrade that is!

You don't have a case to rest unless you're resting on being an ignorant toolbox. You've proven that time and time and time again in here.
Keep telling yourself that; this entire conversation could've been over with after I posted the original Geekbench scores that showed 10-20% improvement. You lose.

https://forums.macrumors.com/threads/899572/
https://forums.macrumors.com/threads/899572/
https://forums.macrumors.com/threads/899572/
https://forums.macrumors.com/threads/899572/
https://forums.macrumors.com/threads/899572/

10-20%
10-20%
10-20%
10-20%
10-20%
 
Follow the money and you'll get it. Apple doesn't want to cannibalize Mac Pro sales for one.

I might spend the money on a Mac Pro even if it is overkill for my needs, but it is just too big. I don't need or want portability, so a laptop is out. I like having choices, like what monitor I use; plus, if the monitor dies, I don't have to replace everything. So an iMac is out. But I want a little bit of expandability, even if it is just adding a second hard drive without losing the optical drive. So the Mac Mini is out.

So here Apple sees a repeat customer who got his start on Apple computers and wants to stay with them, but who sees very little resembling the computers that drew him to Apple in the first place.

So I buy nothing while I continue to wait. No money for Apple. What if everyone like me gives up and buys something else? Is that better than selling fewer Mac Pros? Lose not only a sale but a repeat customer too.
 
Oh, how you try to twist things, acting like you know what's going on. It really isn't that tough a concept to understand. Pay close attention now; class is in session.
http://www.macworld.com/reviews/product/343881/review/27inch_imac_core_i5266ghz.html
Scroll down to the benchmarks. Look at iMovie, then look at a synthetic benchmark like Cinebench. If you really can't see how wrong you are, then that's just sad.
1. There are no i5 benchmarks on that page unless you count a quick Speedmark number. Check your link. The only thing beyond that I see are for nothing but Core 2 Duo iMacs from a link toward the bottom.
2. That's a quad-core 2.66 GHz i5, not a dual. Are you too dense to know the difference?
3. If you think that iMovie itself is somehow comprehensive then you're even dumber than I thought.

Try linking to the actual benchmark next time.
It's pretty clear you are still confused; perhaps my lesson above will make it clearer for you?
Still waiting on any lesson.
Oh, but you're trying to play it off like a 10-20% performance increase over a 2-year-old processor is the bee's knees? LOL
Once again you completely ignore the place that each CPU has in Intel's product matrix. Don't worry, I'm sure it'll click in a few days. Considering that, 10-20% is impressive. We're looking at lower end dual vs previous top-end dual.
You posted it, not me, so now your source of benchmarks isn't reliable? ;) It seems you only take notice of benchmarks that are beneficial to your argument and ignore the rest just because THEY'RE SYNTHETIC HOLY **** I CAN'T BELIEVE IT, I DON'T EVEN KNOW WHAT A SYNTHETIC BENCHMARK IS BUT I'M COMPLAINING ABOUT IT LIKE I DO
It's one benchmark out of dozens. The results are highly contrary to almost every other benchmark. This isn't surprising at all with synthetics. Most review sites like Anandtech include them for the sake of completeness, not necessarily because it's a definitive measure of pure performance.

You still haven't made a single point on how synthetics are somehow a better measure of real world performance than performance in real world applications. I'm not surprised. Keep dodging the point so you don't have to admit anything. Just stick with "Because smart people say they're better!" without backing anything up. Make sure to never admit when you're wrong even if you know that you are.

http://forums.creativecow.net/thread/8/1017479

Right, if it's such an awful benchmark then why did your wonderful source use it? ;)
Some guy on a forum said so! It must be true!

Come back when you have an answer from a real source, not a goddamned forum post. Hell, it even contradicts your earlier assertion that only effects in FCP are multithreaded! And you accuse me of selectively picking info from sources?

I already stated this above. If it was one of the only ones they ever used, then you'd have a point. If it was so incredible, then they wouldn't bother with dozens of tests in different apps, now would they? They'd just push a button and let the synthetic test everything.

I did look at their mediocre benchmarks. 10-20 percent, LOL

Funny, I just went back and looked at those benchmarks just to get a glimpse of more mediocre 10-20 percent (if even) improvement.

Except you said that they didn't even have those benchmarks earlier, which means you didn't even bother to look. Oops. You again ignore the place of the i3 in the product line. A chip that costs less than half what the previous generation cost scores higher.
I've seen plenty of your mediocre benchmarks that had 10-20 percent improvement. WHICH IS NO DIFFERENT THAN THE ORIGINAL SYNTHETIC GEEKBENCH SCORES I POSTED WHEN I FIRST REPLIED
And then you tried saying that the i3 was no better, pointed at Sysmark, yadda yadda yadda, it's pathetic, etc. I don't care if the synthetic corresponds with later numbers or not. I don't use it as a benchmark, period. If the synthetic corresponds with the real numbers, then that's fine, but that's not always the case.
It seems you have a tendency to call anyone a troll from 4chan the second someone disagrees with you. You seem to have quite a fascination with the place; makes me wonder.
Or when they display an ignorance so monumental that I wonder if they're just trolling.
Oh, and I did not check the watt usage on the GPU at the time; don't get your panties in a bunch over a single mistake I made.
It seems you are somehow under the impression that you've actually proved something. I've been stating this entire time that the i3 was an improvement, but not a very good one; do me a favor and get that through your thick skull.
And you continuously ignore the rest of the Core 2 Duo line, pitting the new low-end against the old high-end. You completely ignore the product matrix because you can't accept that 10-20% is more than just "mediocre" when you take that into account.
Rebranded performance with their garbage GPU, boy what a fun upgrade that is!
"Rebranded" again, same old wrong tune. Get over it. If you don't like the GPU, then don't use it. Get something with a dedicated GPU. At least it's a step up from the even crappier Intel integrated video they've used in the past.
Keep telling yourself that; this entire conversation could've been over with after I posted the original Geekbench scores that showed 10-20% improvement. You lose.
As you ignore all the other bullcrap you've spouted. Your Geekbench scores are all quite varied, as they're comparing different laptop systems as well, which makes comparison more difficult. You also ignore that I've been comparing the desktop variants...oh, except that YOU were using desktop variants as well in the later argument! Oopsie. Did you miss that one?

Keep snaking around and screaming "YOU LOSE" as if that'll somehow make you smart. Punctuating statements with "LOL" doesn't do anything to help you on that front. You're not fooling anyone.
 
1. There are no i5 benchmarks on that page unless you count a quick Speedmark number. Check your link. The only thing beyond that I see are for nothing but Core 2 Duo iMacs from a link toward the bottom.
2. That's a quad-core 2.66 GHz i5, not a dual. Are you too dense to know the difference?
3. If you think that iMovie itself is somehow comprehensive then you're even dumber than I thought.

Try linking to the actual benchmark next time.

Still waiting on any lesson.
Click "More" where the review is. Now who can't navigate through a site? ;)

Once again you completely ignore the place that each CPU has in Intel's product matrix. Don't worry, I'm sure it'll click in a few days. Considering that, 10-20% is impressive. We're looking at lower end dual vs previous top-end dual.
10-20% packed in with Intel's wasted graphics? That's complete garbage.

It's one benchmark out of dozens. The results are highly contrary to almost every other benchmark. This isn't surprising at all with synthetics. Most review sites like Anandtech include them for the sake of completeness, not necessarily because it's a definitive measure of pure performance.
Yet they still include them, meaning they serve a purpose.

You still haven't made a single point on how synthetics are somehow a better measure of real world performance than performance in real world applications. I'm not surprised. Keep dodging the point so you don't have to admit anything.
Yeah, I'm dodging the point because you don't know how to click "More" on a review page then scroll down.

I already stated this above. If it was one of the only ones they ever used, then you'd have a point. If it was so incredible, then they wouldn't bother with dozens of tests in different apps, now would they? They'd just push a button and let the synthetic test everything.
You're still wrong on Final Cut. What do you think, it runs on magic? It runs on the QuickTime 7 framework, which is FAR from optimized for multithreading, and Final Cut Pro never will be until QuickTime X has all of QuickTime 7's features. You're just assuming it has multithreaded support because you don't know how applications work.

Except you said that they didn't even have those benchmarks earlier, which means you didn't even bother to look. Oops. You again ignore the place of the i3 in the product line. A chip that costs less than half what the previous generation cost scores higher.
What the hell are you talking about? They didn't even have those benchmarks earlier? Please quote me on where I said this. I'd love to see it. The scores still stand; you're just dancing around them with cost, and these garbage i3 processors should cost half the price in this day and age.

And then you tried saying that the i3 was no better, pointed at Sysmark, yadda yadda yadda, it's pathetic, etc. I don't care if the synthetic corresponds with later numbers or not. I don't use it as a benchmark, period. If the synthetic corresponds with the real numbers, then that's fine, but that's not always the case.
Right, but you ignore all of the games on Anandtech that show the Core 2 Duo performing better than the Core i3? What a joke.

Or when they display an ignorance so monumental that I wonder if they're just trolling.
I wonder the same about you; do me a favor and go to 4chan if you love to talk about them so much.

And you continuously ignore the rest of the Core 2 Duo line, pitting the new low-end against the old high-end. You completely ignore the product matrix because you can't accept that 10-20% is more than just "mediocre" when you take that into account.
What the hell? Since when was the Core 2 Duo exclusively high end? What a joke that you're trying to claim the Core 2 Duo was high end 2 years ago. It was mid-range or lower.

"Rebranded" again, same old wrong tune. Get over it. If you don't like the GPU, then don't use it. Get something with a dedicated GPU. At least it's a step up from the even crappier Intel integrated video they've used in the past.
Umm, I'd take their older GPUs, back when you could still remove them, over their current horse **** any day of the week.

As you ignore all the other bullcrap you've spouted. Your Geekbench scores are all quite varied, as they're comparing different laptop systems as well, which makes comparison more difficult. You also ignore that I've been comparing the desktop variants...oh, except that YOU were using desktop variants as well in the later argument! Oopsie. Did you miss that one?
Different laptop systems? All of the systems compared have the same RAM and use 32-bit Geekbench; do you think the scores will change that much otherwise? What a joke.

Keep snaking around and screaming "YOU LOSE" as if that'll somehow make you smart. Punctuating statements with "LOL" doesn't do anything to help you on that front. You're not fooling anyone.
Mhm, keep telling people to go crawl back to 4chan. Pathetic.

Anyway, I've got REAL LIFE things to do; I'll be back later to see if you have learned anything by then.
 
Click "More" where the review is. Now who can't navigate through a site? ;)
I can admit that I missed it. I still don't see your point as it's comparing an i5 quad to Core 2 Duos and the Core 2s get stomped in anything multithreaded. One iMovie result (two of them posted, which differ pretty drastically) does not equal comprehensive. You still don't have a point here.
10-20% packed in with Intel's wasted graphics? That's complete garbage.
Aaaaand you still ignore the low vs high end. I expected as much. Just keep ignoring it.
Yet they still include them, meaning they serve a purpose.
To be thorough. You still haven't stated WHY they're better beyond "People use them!". You still don't have a point yet again.
You're still wrong on Final Cut. What do you think, it runs on magic? It runs on the QuickTime 7 framework, which is FAR from optimized for multithreading, and Final Cut Pro never will be until QuickTime X has all of QuickTime 7's features. You're just assuming it has multithreaded support because you don't know how applications work.
Final Cut is aging, no doubt. The next one had better be reworked. Could the multithreading be better? Of course. Are effects the only multithreaded part of Final Cut? Nope. Not even close. Final Cut doesn't scale terribly well across a bunch of cores, but if you think it's largely single-threaded then you're the one with no clue here.
What the hell are you talking about? They didn't even have those benchmarks earlier? Please quote me on where I said this. I'd love to see it. The scores still stand; you're just dancing around them with cost, and these garbage i3 processors should cost half the price in this day and age.
Gee, how about this:
LOL? I don't see anything related to POV Raytracer, Blender, Par2 Data Recover, Sony Vegas Bluray encode, Sorenson Squeeze FLV encode and 7zip compression. Maybe you should check your links or stop wasting my time.
Funny, YOU EDITED IT AND YOU F**KING KNOW IT. It's in the quotes of one of my posts, directly quoting your earlier post. You cut it out to cover your ass. Nice try, and you can deny it until you're blue in the face, but we both KNOW you're a f**king liar now.

And yes, the "garbage" i3 (which still beats the former high end, even by your own admission) should cost half when that former high end still costs double and it eviscerates other Core 2s that cost half of what it costs. Once again, you completely fail to make a valid point.
Right, but you ignore all of the games on Anandtech that show the Core 2 Duo performing better than the Core i3? What a joke.
And you ignore that most games don't use more than 2 cores and then ignore EVERY OTHER BENCH where the i3 comes out ahead, then ignore the Techpowerup benches in games that use more than 2 cores showing the i3 pulling ahead, constantly hiding behind this...even after you said 10-20% yourself. Nice.
I wonder the same about you; do me a favor and go to 4chan if you love to talk about them so much.
I'd rather not waste my time in that festering anus of the internet.
What the hell? Since when was the Core 2 Duo exclusively high end? What a joke that you're trying to claim the Core 2 Duo was high end 2 years ago. It was mid-range or lower.
Did you miss the part where I said it was the high end DUAL CORE, idiot? Oh, wait, you conveniently ignored it like so many other things. And the e8600 still costs nearly $300, which is pretty damned spendy for a desktop CPU.
Umm, I'd take their older GPUs, back when you could still remove them, over their current horse **** any day of the week.
You couldn't remove any of Intel's integrated video if it was built into the board. And I'll say this again: If you don't want Intel integrated, you can get something that uses a dedicated GPU. Simple. Most people don't use their GPU beyond playing solitaire and watching movies, so the integrated video is just fine for them. If you need more, then get more. I don't see why this is hard to grasp.
Different laptop systems? All of the systems compared have the same RAM and use 32-bit Geekbench; do you think the scores will change that much otherwise? What a joke.
Then how do you account for the variance in the scores, idiot? The Core 2 is all over the 3000 range with those same specs!
Mhm, keep telling people to go crawl back to 4chan. Pathetic.

Anyway, I've got REAL LIFE things to do; I'll be back later to see if you have learned anything by then.
I didn't say that in the very text you quoted, so stop making things up.

You've proven yourself a liar, a fool, and so many other things. Keep editing things and hope people don't notice, you pathetic jackass. We both know it and I'm done with this if you're just going to edit your posts and claim you never said something. You'd make a great politician.
 
I can admit that I missed it. I still don't see your point as it's comparing an i5 quad to Core 2 Duos and the Core 2s get stomped in anything multithreaded. One iMovie result (two of them posted, which differ pretty drastically) does not equal comprehensive. You still don't have a point here.
What the hell are you talking about? The one that matters (the export to iTunes) was not impressive. The i5 stomping all over the Core 2 Duo? LMAO! What page are you looking at? The differences between iTunes, iPhoto, iMovie and Photoshop all make it very apparent that if the app isn't multithread-aware, it doesn't matter how powerful your CPU is, meaning that the 10-20 percent you are gloating about will be even LESS noticeable. Do you not see this?
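That "multithread-aware or it doesn't matter" point is basically Amdahl's law: the serial portion of an app caps what extra cores can deliver. A quick sketch with made-up parallel fractions, purely for illustration:

```swift
// Amdahl's law: overall speedup when only a fraction p of the work
// benefits from n cores; the remaining (1 - p) stays serial.
func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// A mostly single-threaded app barely notices four cores:
print(amdahlSpeedup(parallelFraction: 0.2, cores: 4)) // ~1.18x
// A heavily threaded encoder is another story:
print(amdahlSpeedup(parallelFraction: 0.9, cores: 4)) // ~3.08x
```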

Aaaaand you still ignore the low vs high end. I expected as much. Just keep ignoring it.
Err, I keep ignoring it? I clearly discussed it below.

To be thorough. You still haven't stated WHY they're better beyond "People use them!". You still don't have a point yet again.
And you do? You claim they use it just because other people use it. Well why do other people use it? ***** and giggles? :rolleyes:

Final Cut is aging, no doubt. The next one had better be reworked. Could the multithreading be better? Of course. Are effects the only multithreaded part of Final Cut? Nope. Not even close. Final Cut doesn't scale terribly well across a bunch of cores, but if you think it's largely single-threaded then you're the one with no clue here.
So you're wrong and now trying to spin it around. Nice strawman argument.

Gee, how about this:

Funny, YOU EDITED IT AND YOU F**KING KNOW IT. It's in the quotes of one of my posts, directly quoting your earlier post. You cut it out to cover your ass. Nice try, and you can deny it until you're blue in the face, but we both KNOW you're a f**king liar now.
You're right, I cut it because I hadn't noticed the other pages. This is all coming from the guy who edited his post about CS4 and games and whatnot before they were deleted. AND YOU ****ING KNOW IT LOLCAPSLOCK

And yes, the "garbage" i3 (which still beats the former high end, even by your own admission) should cost half when that former high end still costs double and it eviscerates other Core 2s that cost half of what it costs. Once again, you completely fail to make a valid point.
Right, I fail to make a valid point because you say so. The Core 2 Duos are 2 years old, and I love how you are leaving out the Arrandale i3s just to fit your "high end" dual-core ********.

And you ignore that most games don't use more than 2 cores and then ignore EVERY OTHER BENCH where the i3 comes out ahead, then ignore the Techpowerup benches in games that use more than 2 cores showing the i3 pulling ahead, constantly hiding behind this...even after you said 10-20% yourself. Nice.
Mhm, so you are assuming the game engines used were utilizing just two cores. Well, guess what: if most games utilize two cores, then what the **** is the point of upgrading to an i3? For that 10-20% gain in a portion of my games? Yippee!

Did you miss the part where I said it was the high end DUAL CORE, idiot? Oh, wait, you conveniently ignored it like so many other things. And the e8600 still costs nearly $300, which is pretty damned spendy for a desktop CPU.
Much like you are conveniently ignoring the lower-end Arrandale Core i3s. How swell.

You couldn't remove any of Intel's integrated video if it was built into the board. And I'll say this again: If you don't want Intel integrated, you can get something that uses a dedicated GPU. Simple. Most people don't use their GPU beyond playing solitaire and watching movies, so the integrated video is just fine for them. If you need more, then get more. I don't see why this is hard to grasp.
Why is this hard to grasp? Because Intel's GPU is utter horse ****. It's so bad it can't even display colors right on the new MacBook Pro. You're seriously defending that garbage?

Then how do you account for the variance in the scores, idiot? The Core 2 is all over the 3000 range with those same specs!
You take the average score; it's not that difficult, idiot.
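Averaging away run-to-run variance really is that simple; a tiny sketch, with made-up scores for a single machine:

```swift
// Hypothetical Geekbench results from repeated runs on one machine.
let scores = [3012.0, 3087.0, 2954.0, 3120.0, 3043.0]

let mean = scores.reduce(0, +) / Double(scores.count)

// The spread around the mean shows how far a single run can stray.
let maxDeviation = scores.map { abs($0 - mean) }.max() ?? 0

print(mean, maxDeviation) // ~3043, ~89
```

Of course, an average only smooths noise within one configuration; it can't reconcile scores taken from different laptops.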

I didn't say that in the very text you quoted, so stop making things up.

You've proven yourself a liar, a fool, and so many other things. Keep editing things and hope people don't notice, you pathetic jackass. We both know it and I'm done with this if you're just going to edit your posts and claim you never said something. You'd make a great politician.
Oh, you clearly talked about telling someone to crawl back to 4chan in your previous posts. You also called someone a bona fide retard. It seems you have a lot of class.

You'd make a great politician too, considering how you tried to cover your ass when you posted your first Anandtech benchmarks, claiming that some games performed better on the i3 when none of their benchmarks showed any improvement at all.
 
Would it be too much to ask for a "Mini Node"?

I don't know what the next entry-level consumer Mini will have for features (graphics, CPU, HDD, etc.). The heated debate over the relative value of Core 2 Duos and i3s notwithstanding, I would submit that it would be nice for Apple to cough up a super-cheap "Mini Node" machine for those of us using Logic and FCP.

Take a Mini case. Throw out the optical drive. Add in its place bigger drives with SSD options. 4 GB RAM. Graphics would be of no concern, so crappy integrated Intel would be more than fine. Maybe take out the extra USB and FW ports and add more Gigabit Ethernet ports. Preinstall all of the necessary Logic node and/or QMaster stuff for node use. Then price them at $399.

Anyone running FCP or Logic would pick one or two up, for sure. Heck, Apple could sell rack-mount kits so FCP/Logic users could group several units into their own low-rent render farms. I know Apple sells Xserves for clustering applications, but audio and video projects don't need the overengineering that 24/7 file servers do.

So that's my utilitarian Mac fantasy. Mac Mini Nodes.

Now back to the i3/C2D flame war. (sigh)
 
Pretty good idea. Would Mac mini Server hardware with the modifications you mention work for that?

The Mini Server is overkill and costs $999. A node wouldn't have to have a TB of storage or a server OS. I was hoping that gutting the thing might bring the price down.

I guess my "Mac Mini Node" will probably just end up being refurbed entry level Minis.
 
For one, Arrandale = Clarkdale - 1MB cache. Same goes for the i5 mobile chips, the ones in the MacBook Pro that spank a Core 2 in multithreaded benches. For two, as far as performance goes, that's the only real difference. For three, GEEKBENCH = SYNTHETIC. Synthetic benchmarks are crap. Also, I dug up a more comprehensive bench suite and edited the post a bit later. It shows the i3 wrecking the Core 2 Duo with a much wider variety of tests.

http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/7.html

Except that said post has now disappeared. Why in the heck are the mods on this site so touchy and trigger-happy?

Spank ... Wreck ... Destroy ... Dude, you drank too much marketing Kool-Aid. We are talking about a marginal difference, if any. And very poor integrated graphics.
 
He thinks that PCs will still have their uses and people will still want them. He also said that PCs will not be replaced anytime soon, but it will happen.

lol...that statement is silly and based on what?


Until I see Apple make any real headway into the business world, it's all just pillow talk from Mr. Jobs.
 
Having owned four of them (one actually belonged to a client for a year), I sincerely believe the Mac Mini to be Apple's most reliable and underrated product. I sit here typing this on what is currently the highest-specced non-Server model (2.53 GHz, 4GB RAM, etc.), connected to a Dell U2711 27" monitor (awesome!), and she runs cool, only struggling if I fire up InDesign CS5 with a few other CS5 apps open at the same time, and even then only if Chrome is struggling with Flash, which it does with aplomb!

A speedier Mac Mini would be great, but I cannot think of much else to make it better. Perhaps removing the CD/DVD drive and making it even smaller would be cool, with a black case option for home media centre use.

Apple went wrong with Apple TV by focusing too much on the hardware. They should have simply prepared some excellent OS X software and produced a Mac Mini without the media bay, with HDMI out of course.

I love my Mac Mini because it's 'invisible' and quiet.

Why do you love yours?



I agree. All things considered, the Mini is the best computer I ever bought. I bought it as a refurb for under $700 two years ago and the thing is absolutely rock solid. I could probably sell it and get up to $400 for it right now, which is crazy.

My biggest gripe is that Apple will never give these things any decent upgrades, because it would kill their other hardware sales, like the iMac and Mac Pro. There would be a lot of graphics and video editors who would save the cash on a Mini if they could suddenly get one with an Intel i7, 8 gigs of RAM and a decent video card. It just doesn't make sense, given that Apple makes its profit on hardware sales, to have a powerful Mini. That's how Apple is able to sell Final Cut Studio for a mere $999: they are selling those pricey Mac Pros to go with it.

It would basically just be an iMac without a display. Now, is that so much to ask for? ...Answer: Yes.
 
My crystal ball says:

New model called Mac Mini Pro
w/ Quad Core
w/ Nvidia 320M graphics module
w/ HDMI port
w/ Embedded WiFi and Bluetooth
w/ Blu-ray SuperDrive option
w/ front USB port
w/ terabyte drive option

There is also another model with a bunch of options to replace the Apple TV and become a living-room entertainment hub, with LOTS of WiFi suite software to use your iPad, iPhone or iPod touch as you watch your favorite show or movie.

Wrong, wrong, wrong, wrong, wrong.
Doubtful HDMI, NO BLU-RAY!
Definitely not quad core.
Why even post this? Too much Kool-Aid and imagination. :mad:
 