I wish apple would make them in black to match monitors and home theaters.
black would look cool
> Any gains you would have from Apple using Core i3s would be offset by the poor performance of the Intel GPU. End of story. You can argue the metrics/error margins/nerd porn all day, there's no changing that fact.

Perhaps, but that depends on what you're doing. The integrated GPU that Intel is using is pretty crummy, but not nearly as crummy as their past GPUs, at least. The desktop GPUs score a bit lower than the 9400M, but the laptop GPUs are clocked a fair bit lower. If you're not doing much of anything that heavily uses the video card then it probably wouldn't be noticed.
> How come the i3s have to use an intel GPU? The current mac mini uses a mobile version of the Nvidia 9400. In fact there is not a single mac computer currently that Apple sells that uses an Intel GPU.

The integrated graphics of Clarkdale/Arrandale are on the same package as the CPU.
> The clock speeds are going to go as high as 1.4ghz on the integrated GPU once SB hits.

1.4 GHz is probably going to be desktop. Regardless, it looks like a clock speed bump.
> I looked at the page. It was them comparing a similarly clocked Core 2 Duo to a similarly clocked Core i3. No one here is arguing the Core i3 isn't better than the Core 2 Duo clock for clock. But from what I did see, it pretty much confirms that the 3.33GHz E8600 will fare very well against the Core i3 chips. Your benchmarks show nothing new, you actually think you are proving something here?

That depends on what you're doing. Like I said, multithreaded benches show the best improvement. The i3 consistently wins. You seemed to think that it didn't matter.
> Boy, you really are drinking the Intel kool-aid when it comes to these i3 processors. It was your quote, not mine. "Take a look at the real-world stuff. x264 encoding, CS4, some games"
>
> Let's do a real comparison of these anandtech benchmarks that you posted shall we? http://www.anandtech.com/show/2921/2
>
> The E8600 outperformed the Core i3 in EVERY game. The Core 2 Duo outperformed the Core i3 in sysmark 2007. The only time the Core i3 was in front it was by 1 or 2 seconds proving you are just wasting your time trying to argue.

For crying out loud...SYSMARK IS SYNTHETIC. Synthetic benchmarks are CRAP. I don't trust a single one. I trust a comprehensive suite of real world benches, not some program that runs a set series of functions that may or may not see an equivalent in real world performance. Did you see how the e8600 beat the quad core Intel chips in Sysmark when the quads are inarguably more powerful CPUs? And that doesn't strike you as odd?
> How come the i3s have to use an intel GPU? The current mac mini uses a mobile version of the Nvidia 9400. In fact there is not a single mac computer currently that Apple sells that uses an Intel GPU.

Wrong. The 15"/17" MBPs have the Intel GPU. It's just that they also have a dedicated GPU and the computer will automatically switch to the dedicated chip when necessary. You get the integrated GPU whether you like it or not with the i3/i5/i7 mobile chips because it's integrated into the package itself. It's not integrated into the CPU itself, but sits right next to it on the same chip package. You can't separate them.
> 1.4 GHz is probably going to be desktop. Regardless, it looks like a clock speed bump.

True, but we may see a good 1ghz on the mobile chips, and hopefully Intel makes some improvements in the chip that will boost performance further. Time will tell, but hopefully we'll see performance that is competitive with existing integrated GPUs.
> That depends on what you're doing. Like I said, multithreaded benches show the best improvement. The i3 consistently wins. You seemed to think that it didn't matter.

I seemed to think that it didn't matter? Where did you pull that out of your ass from? I clearly stated from the beginning that clock for clock i3 will win. Go back and read my posts so I don't have to repeat myself.
> For crying out loud...SYSMARK IS SYNTHETIC. Synthetic benchmarks are CRAP. I don't trust a single one. I trust a comprehensive suite of real world benches, not some program that runs a set series of functions that may or may not see an equivalent in real world performance. Did you see how the e8600 beat the quad core Intel chips in Sysmark when the quads are inarguably more powerful CPUs? And that doesn't strike you as odd?

This is too funny. You seem to not have a grasp on synthetic and real world applications because you have it backwards. When comparing a E8600 to something like a Lynnfield Core i7, in the synthetic benchmark it will trounce the E8600. In real world applications, like iMovie and Final Cut, the difference won't be as great. You seem to have it backwards and not understand what you are saying.
> No, you said some games the Core i3 performed better and you were lying. You want real world applications? Games are some of them. Also how am I ignoring techpowerup benchmarks? They also showed the rather mediocre performance of the i3 in comparison to the Core 2 Duo.

I also got my links mixed up with the Techpowerup/Anandtech benches, so yes, that was my fault. Like I said earlier, most games don't utilize more than 2 cores. Those that do will show the i3 soundly beating the Core 2, even the e8600, which is 10% faster than the e8400, and you certainly won't see a raw 10% boost from using it. Run it against the numbers in the Techpowerup bench. You're now completely ignoring the Techpowerup benches, which are much more comprehensive than the Anandtech ones. Seconds are also irrelevant. It's the percentage gains that you want to look at. Let's say one CPU gets a job done in 3 seconds, another in 2. Gee, only a one second difference...except it's a 33% reduction in overall time.
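Spelled out, using the hypothetical 3-second and 2-second runs from that example:

$$\frac{t_{\text{slow}} - t_{\text{fast}}}{t_{\text{slow}}} = \frac{3 - 2}{3} \approx 33\%$$

Put the other way around, the slower chip takes 50% longer for the same job.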
> I seemed to think that it didn't matter? Where did you pull that out of your ass from? I clearly stated from the beginning that clock for clock i3 will win. Go back and read my posts so I don't have to repeat myself.

Gee, how about the parts where you said that most applications aren't even multithreaded:

> Generally, real world applications don't even support multithreading, so your argument is blown straight out of the window. That synthetic benchmark is the best results you're gonna get.

And then you say that synthetics are the best you are gonna get. Stop posting. Now. Synthetics are garbage. A comprehensive bench of real world applications are the best measure. Why? BECAUSE PEOPLE ACTUALLY USE THEM. When was the last time you used Sysmark to do actual work?
> This is too funny. You seem to not have a grasp on synthetic and real world applications because you have it backwards. When comparing a E8600 to something like a Lynnfield Core i7, in the synthetic benchmark it will trounce the E8600. In real world applications, like iMovie and Final Cut, the difference won't be as great. You seem to have it backwards and not understand what you are saying.

Funny, the e8600 beats the Core 2 Quads in Sysmark. I'm the one who has it backward? That tells me that Sysmark has piss-poor multithreading. And you're telling me that Final Cut won't see a big difference from an e8600 to a friggin' i7? Are you utterly HIGH? You're looking at two physical cores vs four cores + four virtual cores. There's no contest! Final Cut is heavily multithreaded. If you're rendering or encoding a bunch of video, then the difference is night and day! I went from a 2.8ghz iMac (early 2008, 2.8ghz w/ 6MB cache) to an i7 Hackintosh, and even at stock speeds the i7 destroyed the Core 2 completely and utterly. Just stop posting, for the love of God.
> And no, it really doesn't strike me odd that the Core 2 Quad was below the Duo, the Core 2 Quad wasn't that great.

There isn't enough facepalm in the world for this. Let's see, two cores at 3.33ghz or four cores at 2.66ghz. Which do you think has the most horsepower? Now look at the Sysmark chart again. Funny, the chip with significantly less horsepower scores higher. That tells me that Sysmark is NOT a good measure of overall performance.
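As a crude back-of-the-envelope for that comparison (same architecture, and assuming a perfectly multithreaded workload):

$$2 \times 3.33\ \text{GHz} = 6.66\ \text{GHz} \qquad \text{vs} \qquad 4 \times 2.66\ \text{GHz} = 10.64\ \text{GHz}$$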
> No, you said some games the Core i3 performed better and you were lying. You want real world applications? Games are some of them. Also how am I ignoring techpowerup benchmarks? They also showed the rather mediocre performance of the i3 in comparison to the Core 2 Duo.
>
> x264 benchmark - hardly any improvement
> Handbrake - 300 second improvement
> Xiisoft Converter - hardly any improvement
> Divx Converter - ZERO improvement
>
> There you have it folks, one "real world" application shows an amazing 20% improvement. LOL

READ THE DAMNED TECHPOWERUP BENCH. Since you seem incapable of clicking further into it, I'll link the exact page for you:

http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/11.html
> Gee, how about the parts where you said that most applications aren't even multithreaded:
>
> And then you say that synthetics are the best you are gonna get. Stop posting. Now. Synthetics are garbage. A comprehensive bench of real world applications are the best measure. Why? BECAUSE PEOPLE ACTUALLY USE THEM. When was the last time you used Sysmark to do actual work?

Pathetic. You still have no grasp on the difference between synthetic benchmarks and real application performance. People who know what they're talking about will use the real world application performance excuse when they are trying to defend a WEAKER processor to a stronger one showing that the stronger processor does not show as many gains as it does in the synthetic benchmark. You have it backwards buddy. How about you stop posting because it's pretty clear you have no idea what you are talking about.
> Now, for the average user, it may not matter because they're never going to push the system. For those who do, there is indeed a difference.

I find this funny. A power user with an i3? LMAO!
> Funny, the e8600 beats the Core 2 Quads in Sysmark. I'm the one who has it backward? That tells me that Sysmark has piss-poor multithreading. And you're telling me that Final Cut won't see a big difference from an e8600 to a friggin' i7? Are you utterly HIGH? You're looking at two physical cores vs four cores + four virtual cores. There's no contest! Final Cut is heavily multithreaded. If you're rendering or encoding a bunch of video, then the difference is night and day! I went from a 2.8ghz iMac (early 2008, 2.8ghz w/ 6MB cache) to an i7 Hackintosh, and even at stock speeds the i7 destroyed the Core 2 completely and utterly. Just stop posting, for the love of God.

Funny stuff going on here, where do I begin? For one, the Core 2 Quad is a rather mediocre chip. For two, Geekbench (a known benchmarking tool with multithreading) has already proven the differences are minimal.
> There isn't enough facepalm in the world for this. Let's see, two cores at 3.33ghz or four cores at 2.66ghz. Which do you think has the most horsepower? Now look at the Sysmark chart again. Funny, the chip with significantly less horsepower scores higher. That tells me that Sysmark is NOT a good measure of overall performance.

Oh how it's funny watching someone criticize a benchmark they posted. Hilarious.
> READ THE DAMNED TECHPOWERUP BENCH. Since you seem incapable of clicking further into it, I'll link the exact page for you:
>
> http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/11.html
>
> The i3 comes out significantly ahead in games that actually use multithreading. Are you just sticking your fingers in your ears and saying "LALALA I CAN'T HEAR YOU" all day long? Now look at the rest of the review. I've cited this multiple times already. Try actually looking. The Anandtech article briefly touched a few points. Techpowerup was much more comprehensive.

Wow, the Core i3 uses two less watts than the E8400???? HOLY **** ALERT THE PRESSES!!!!
> Gee, that's funny....

x264 - What am I smoking? I'm smoking nothing because of the fact that once you bump up that E8400 to an E8600 that amazing gap you have will vanish.
http://www.anandtech.com/show/2901/8
From a more detailed Anandtech comparison:
x264 - A bit over 20%. What are you smoking?
Handbrake - 20% improvement, as you said.
Xiisoft - You're estimating as Xiisoft was benched with an e8400.
Divx - A tiny improvement, which says more about Divx than the i3 when you compare other encoding benchmarks.
Now let's look at some others from both the detailed Anandtech review and the Techpowerup. A few points, some of which you oh so conveniently left out:
WM Encoder - About 12%
POV Raytracer - About 30%!
Blender - About 20%
Par2 Data Recover - Over 20%
Sony Vegas Bluray encode - Over 10%
Sorenson Squeeze FLV encode - About 15%
7zip compression, 32MB dictionary - Over 25%. Granted, it loses by a slim margin in another 7zip test.
Gaming is about the only area in which the e8600 comes out ahead most of the time, but even then, the i3 does have an edge when games utilize more than two cores as the Techpowerup benches show. Gaming is far more dependent upon the GPU than the CPU these days. As more games utilize more cores, you'll see the i3 gain an edge.
All rough estimates, but accurate within a percentage point or two. You can stop posting now. An i3 at a clock that's over 10% lower than the e8600 still soundly beats the e8600 overall (at worst, about matching it, losing by more than a tiny margin in very few tests) in anything that is built to take advantage of more than two cores. It's not the most colossal leap, but considering the lower clock, it's certainly a nice upgrade, not to mention the availability of higher clocked i3s. I'm comparing the 2.93ghz model.
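For reference on that clock gap, using the numbers cited in this thread (the 3.33GHz e8600 against the 2.93ghz i3):

$$\frac{3.33 - 2.93}{3.33} \approx 12\%$$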
> Pathetic. You still have no grasp on the difference between synthetic benchmarks and real application performance. People who know what they're talking about will use the real world application performance excuse when they are trying to defend a WEAKER processor to a stronger one showing that the stronger processor does not show as many gains as it does in the synthetic benchmark. You have it backwards buddy. How about you stop posting because it's pretty clear you have no idea what you are talking about.

HAHAHHAHAHAA! Yeah, sure, you just keep telling yourself that, pal. Synthetics are not real world. Real world is real world. It's showing how ACTUAL APPLICATIONS perform. How exactly does a synthetic measure that? Synthetics may give you a rough idea, but it's a comprehensive suite of real world applications in different categories that will show you how the CPU really does...IN THE REAL WORLD. I can't believe anyone would try to argue to the contrary. Well, unless they're an obstinate chunkhead who cannot admit when he's completely wrong.
> I find this funny. A power user with an i3? LMAO!

Not everyone beats their CPU 24/7 and not everyone has a large budget. Performance is performance. I'm not trying to say that a pro user will buy an i3. I'm saying that the i3 is a superior CPU.
> Funny stuff going on here, where do I begin? For one, the Core 2 Quad is a rather mediocre chip. For two, Geekbench (a known benchmarking tool with multithreading) has already proven the differences are minimal.

And funny, real world multithreaded apps show that the differences are huge. Do you really think that a Core 2 Duo at 3.33ghz somehow has greater potential than a quad at 2.66ghz, both using the same friggin' architecture? Are you nuts? Did you notice how the Q9400 whomped the e8600 (and the i3, too) in nearly every multimedia bench in that list? And you're going to tell me that a goddamned synthetic bench is a better indication of performance?
> Also, the only part of Final Cut that is multithreaded are the effects. If the application is not built with something like Grand Central Dispatch in mind, then there's no benefit. It's really coming clear as to what type of person I'm talking to here.

Congrats on proving that you're clueless. Final Cut is heavily multithreaded. Why? IT HAS TO BE. Apple has had dual CPUs for nearly ten years now, and you're telling me that they haven't optimized everything they can in their flagship HD video editor for multiple CPUs/cores? Show me some proof or get the hell out. Even if it were just the effects, that's often the most CPU-intensive thing you can do with Final Cut! You're telling me that playback of 1080p HD isn't written for multiple cores in Final Cut? Get lost!
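For illustration, the Grand Central Dispatch pattern being argued about looks roughly like the sketch below (a minimal sketch, assuming Apple's C blocks extension; render_frame is a hypothetical stand-in for per-frame work, not anything taken from Final Cut itself):

```c
#include <dispatch/dispatch.h>
#include <stddef.h>

/* Hypothetical per-frame workload; stands in for whatever
   CPU-heavy processing a video app does on each frame. */
static void render_frame(size_t frame) {
    (void)frame; /* ...filter/encode one frame... */
}

int main(void) {
    /* dispatch_apply submits one block per iteration to a global
       concurrent queue, spreads the work across available cores,
       and returns only once every iteration has finished. */
    dispatch_apply(240, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),
                   ^(size_t frame) { render_frame(frame); });
    return 0;
}
```

An app whose hot loops are written this way scales with core count; one that funnels everything through a single thread sees little from extra cores, which is the crux of the disagreement here.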
> Oh how it's funny watching someone criticize a benchmark they posted. Hilarious.

Sysmark is ONE PART of the bench, you toolbox, and the results are very often contrary to the rest of the benchmarks. Reading comprehension. Learn it.
> Wow, the Core i3 uses two less watts than the E8400???? HOLY **** ALERT THE PRESSES!!!!

I didn't even mention power consumption. You're now deliberately dodging because you know that you're full of crap and won't admit it. Try looking at the rest of the benchmarks they posted. Go ahead.
> Is that what you wanted me to see? I've looked through it. It's not impressive and overclocking is pointless in a Mac discussion.

Confirmed. You completely ignore EVERY OTHER BENCH in there. I said nothing of overclocking. I did point out that it beats the e8400 soundly in games that actually use more than four cores, another point you completely ignore, along with every additional benchmark they had. Smooth move.
> x264 - What am I smoking? I'm smoking nothing because of the fact that once you bump up that E8400 to an E8600 that amazing gap you have will vanish.
>
> Xiisoft - LOL, again you posted the benchmark. It showed no gains. Not my problem
>
> Divx - Ok, criticize another benchmark YOU posted. Boy, isn't backpeddling fun?
>
> LOL? I don't see anything related to POV Raytracer, Blender, Par2 Data Recover, Sony Vegas Bluray encode, Sorenson Squeeze FLV encode and 7zip compression. Maybe you should check your links or stop wasting my time. And bragging about 10-20% of performance is absolutely hilarious.

Try clicking on that Anandtech link (the one you just ignored), which I friggin' SAID was a more comprehensive review. Go on, give it a try. Not that it'll matter. You pretty well ignore everything to the contrary anyway. What you'll see will back up everything I said, but I don't expect you to admit to anything. You'll deliberately ignore everything to stay in your insulated bubble. Have fun in there.
> LOL? Bragging about 10-20% of performance is absolutely hilarious coming from the guy who said the i3 would "destroy" the Core 2 Duo. Give me a break buddy. That Core 2 Duo is 2 years old and the best Intel can squeeze out is 10-20% performance increase?
>
> I rest my case, the i3 is essentially rebranded garbage from 2 years ago. I said the performance differences were minimal when I first responded to you. And they are. Thanks for proving my point.

Yes, 10-20%...in a lower-end chip that costs less than HALF of what the earlier generation top-end cost and runs at a lower clock speed with lower power consumption. My, what a horrid upgrade! Now compare the e8600 to the top-end i5 dual core, which is much closer in price and is at the HIGH END of what Intel is selling in a dual package. The e8600 gets inarguably destroyed. When the lower end comes out 10-20% over the previous high end, I call that some good progress. Are you somehow under the impression that the i3 is supposed to be a high-end chip in Intel's current lineup? It's a rung above the bottom of this generation, yet it still manages to edge out the top end of the previous dual cores. Go ahead, let that sink in for a while. Now look at how much faster the i3 is than the rest of the Core 2 Duo lineup. Pit the i3 against an e7x00, which is about the same price. I'd say destroy isn't a bad term. Congratulations on showing yet another facet of your staggering ignorance. You have no clue how the current processor lineup is situated. The i3 is on the lower end, and you're laughing about how a lower-end chip only manages to beat the higher end chip of yesteryear by a measly 10-20%. You're unbelievable.
> HAHAHHAHAHAA! Yeah, sure, you just keep telling yourself that, pal. Synthetics are not real world. Real world is real world. It's showing how ACTUAL APPLICATIONS perform. How exactly does a synthetic measure that? Synthetics may give you a rough idea, but it's a comprehensive suite of real world applications in different categories that will show you how the CPU really does...IN THE REAL WORLD. I can't believe anyone would try to argue to the contrary. Well, unless they're an obstinate chunkhead who cannot admit when he's completely wrong.

Oh how you try and twist things trying to act like you know what's going on. It really isn't that tough of a concept to understand. Pay close attention now, class is in session.
> Please show me some of these "people who know what they're talking about", because I sure as hell am not talking to one.

It's pretty clear you are still confused, perhaps my lesson above will make it clearer for you?
> Not everyone beats their CPU 24/7 and not everyone has a large budget. Performance is performance. I'm not trying to say that a pro user will buy an i3. I'm saying that the i3 is a superior CPU.

Oh, but you're trying to play it off like a 10-20% performance increase over a 2 year old processor is the bees knees? LOL
> And funny, real world multithreaded apps show that the differences are huge. Do you really think that a Core 2 Duo at 3.33ghz somehow has greater potential than a quad at 2.66ghz, both using the same friggin' architecture? Are you nuts? Did you notice how the Q9400 whomped the e8600 (and the i3, too) in nearly every multimedia bench in that list? And you're going to tell me that a goddamned synthetic bench is a better indication of performance?
>
> Please justify your answer. Don't just tell me that, "Well, smart people say so!" with nothing to back it up. Tell me logically or cite a source that definitely says that a synthetic benchmark is somehow a better indication of performance than a comprehensive suite of real-world applications. Go for it, slugger.

You posted it not me, so now your source of benchmarks aren't reliable? It seems you only take notice to benchmarks that are beneficial to your argument and ignore the rest just because THEYRE SYNTHETIC HOLY **** I CANT BELIEVE IT, I DONT EVEN KNOW WHAT A SYNTHETIC BENCHMARK IS BUT IM COMPLAINING ABOUT IT LIKE I DO
http://forums.creativecow.net/thread/8/1017479
Also, Compressor is very heavily optimized for multiple cores.
> Sysmark is ONE PART of the bench, you toolbox, and the results are very often contrary to the rest of the benchmarks. Reading comprehension. Learn it.

Right, if it's such an awful benchmark then why did your wonderful source use it?
> I didn't even mention power consumption. You're now deliberately dodging because you know that you're full of crap and won't admit it. Try looking at the rest of the benchmarks they posted. Go ahead.

I did look at their mediocre benchmarks. 10-20 percent, LOL
> Confirmed. You completely ignore EVERY OTHER BENCH in there. I said nothing of overclocking. I did point out that it beats the e8400 soundly in games that actually use more than four cores, another point you completely ignore, along with every additional benchmark they had. Smooth move.

Funny, I just went back and looked at those benchmarks just to get a glimpse of more mediocre 10-20 percent (if even) improvement.
> Try clicking on that Anandtech link (the one you just ignored), which I friggin' SAID was a more comprehensive review. Go on, give it a try. Not that it'll matter. You pretty well ignore everything to the contrary anyway. What you'll see will back up everything I said, but I don't expect you to admit to anything. You'll deliberately ignore everything to stay in your insulated bubble. Have fun in there.

I've seen plenty of your mediocre benchmarks that had 10-20 percent improvement. WHICH IS NO DIFFERENT THAN THE ORIGINAL SYNTHETIC GEEKBENCH SCORES I POSTED WHEN I FIRST REPLIED
> This level of ignorance makes me wonder if you're just a troll that crawled out of 4chan. Considering you're thick enough to think that Apple could cram an incredibly hot, power-hungry monster like the GTX 480M (100w in a mobile chip, nearly double that of the 4850) into an iMac when they've never gone for top-end GPUs in that line...yeah. They'd have to blare the fans just to keep it from overheating, and Apple's tendency is more heat instead of more noise.

It seems you have the tendency to call anyone a troll from 4chan the second someone disagrees with you. You seem to have quite a fascination with the place, makes me wonder.
> Yes, 10-20%...in a lower-end chip that costs less than HALF of what the earlier generation top-end cost and runs at a lower clock speed with lower power consumption. My, what a horrid upgrade! Now compare the e8600 to the top-end i5 dual core, which is much closer in price and is at the HIGH END of what Intel is selling in a dual package. The e8600 gets inarguably destroyed. When the lower end comes out 10-20% over the previous high end, I call that some good progress. Are you somehow under the impression that the i3 is supposed to be a high-end chip in Intel's current lineup? It's a rung above the bottom of this generation, yet it still manages to edge out the top end of the previous dual cores. Go ahead, let that sink in for a while. Now look at how much faster the i3 is than the rest of the Core 2 Duo lineup. Pit the i3 against an e7x00, which is about the same price. I'd say destroy isn't a bad term.

It seems you are somehow under the impression that you've actually proved something. I've been stating this entire time that the i3 was an improvement but not a very good one, do me a favor and get that through your thick skull.
> Rebranded? You really do have no clue. None. I guess the i3 is just a Core 2 Duo...with hyperthreading. And much more aggressive power management (2w difference idle, much wider spread under load). And DMI, which moves the memory controller and PCIe controller onboard. And the integrated GPU. And numerous other improvements. Yep, that's just rebranded, all right! No real improvements at all! And on what planet is 10-20% (which you yourself just said) minimal?

Rebranded performance with their garbage GPU, boy what a fun upgrade that is!
> You don't have a case to rest unless you're resting on being an ignorant toolbox. You've proven that time and time and time again in here.

Keep telling yourself that, this entire conversation could've been over with after I posted the original GeekBench scores that showed 10-20% improvement. You lose.
Follow the money and you'll get it. Apple doesn't want to cannibalize Mac Pro sales for one.
> Oh how you try and twist things trying to act like you know what's going on. It really isn't that tough of a concept to understand. Pay close attention now, class is in session.
>
> http://www.macworld.com/reviews/product/343881/review/27inch_imac_core_i5266ghz.html
>
> Scroll down to the benchmarks. Look at iMovie, then look at a synthetic benchmark like cinebench. If you really can't see how wrong you are then that's just sad.

1. There are no i5 benchmarks on that page unless you count a quick Speedmark number. Check your link. The only thing beyond that I see are for nothing but Core 2 Duo iMacs from a link toward the bottom.

2. That's a quad core 2.66ghz i5, not a dual. Are you too dense to know the difference?

3. If you think that iMovie itself is somehow comprehensive then you're even dumber than I thought.

Try linking to the actual benchmark next time.
> It's pretty clear you are still confused, perhaps my lesson above will make it clearer for you?

Still waiting on any lesson.
> Oh, but you're trying to play it off like a 10-20% performance increase over a 2 year old processor is the bees knees? LOL

Once again you completely ignore the place that each CPU has in Intel's product matrix. Don't worry, I'm sure it'll click in a few days. Considering that, 10-20% is impressive. We're looking at lower end dual vs previous top-end dual.
> You posted it not me, so now your source of benchmarks aren't reliable? It seems you only take notice to benchmarks that are beneficial to your argument and ignore the rest just because THEYRE SYNTHETIC HOLY **** I CANT BELIEVE IT, I DONT EVEN KNOW WHAT A SYNTHETIC BENCHMARK IS BUT IM COMPLAINING ABOUT IT LIKE I DO

It's one benchmark out of dozens. The results are highly contrary to almost every other benchmark. This isn't surprising at all with synthetics. Most review sites like Anandtech include them for the sake of completeness, not necessarily because it's a definitive measure of pure performance.
> http://forums.creativecow.net/thread/8/1017479

Some guy on a forum said so! It must be true!
> I've seen plenty of your mediocre benchmarks that had 10-20 percent improvement. WHICH IS NO DIFFERENT THAN THE ORIGINAL SYNTHETIC GEEKBENCH SCORES I POSTED WHEN I FIRST REPLIED

And then you tried saying that the i3 was no better, pointed at Systemmark, yadda yadda yadda, it's pathetic, etc. I don't care if the synthetic corresponds with later numbers or not. I don't use it as a benchmark, period. If the synthetic corresponds with the real numbers, then that's fine, but that's not always the case.
> It seems you have the tendency to call anyone a troll from 4chan the second someone disagrees with you. You seem to have quite a fascination with the place, makes me wonder.

Or when they display an ignorance so monumental that I wonder if they're just trolling.
> Oh and I did not check the watt usage on the GPU at the time, don't get your panties in a bunch over a single mistake I made.

And you continuously ignore the rest of the Core 2 Duo line, pitting the new low-end against the old high-end. You completely ignore the product matrix because you can't accept that 10-20% is more than just "mediocre" when you take that into account.
"Rebranded" again, same old wrong tune. Get over it. If you don't like the GPU, then don't use it. Get something with a dedicated GPU. At least it's a step up from the even crappier Intel integrated video they've used in the past.Rebranded performance with their garbage GPU, boy what a fun upgrade that is!
> Keep telling yourself that, this entire conversation could've been over with after I posted the original GeekBench scores that showed 10-20% improvement. You lose.

As you ignore all the other bullcrap you've spouted. Your Geekbench scores are all quite varied as they're comparing different laptop systems as well, which makes comparison more difficult. You also ignore that I've been comparing the desktop variants, oh, except that YOU were using desktop variants as well in the later argument! Oopsie. Did you miss that one?
> 1. There are no i5 benchmarks on that page unless you count a quick Speedmark number. Check your link. The only thing beyond that I see are for nothing but Core 2 Duo iMacs from a link toward the bottom.
>
> 2. That's a quad core 2.66ghz i5, not a dual. Are you too dense to know the difference?
>
> 3. If you think that iMovie itself is somehow comprehensive then you're even dumber than I thought.
>
> Try linking to the actual benchmark next time.

Click more where the review is, now who can't navigate through a site?
> Once again you completely ignore the place that each CPU has in Intel's product matrix. Don't worry, I'm sure it'll click in a few days. Considering that, 10-20% is impressive. We're looking at lower end dual vs previous top-end dual.

10-20% packed in with Intel's wasted graphics? That's complete garbage.
> It's one benchmark out of dozens. The results are highly contrary to almost every other benchmark. This isn't surprising at all with synthetics. Most review sites like Anandtech include them for the sake of completeness, not necessarily because it's a definitive measure of pure performance.

Yet they still include them meaning they serve a purpose.
> You still haven't made a single point on how synthetics are somehow a better measure of real world performance than performance in real world applications. I'm not surprised. Keep dodging the point so you don't have to admit anything.

Yeah, I'm dodging the point because you don't know how to click "More" on a review page then scroll down.
> I already stated this above. If it was one of the only ones they ever used, then you'd have a point. If it was so incredible, then they wouldn't bother with dozens of tests in different apps, now would they? They'd just push a button and let the synthetic test everything.

You're still wrong on Final Cut. What do you think, it runs on magic? It runs on the Quicktime 7 framework which is FAR from being optimized for multithreading and Final Cut Pro will never be until Quicktime X has all of Quicktime 7's features. You are just assuming it has multithreaded support because you don't know how applications work.
> Except you said that they didn't even have those benchmarks earlier, which means you didn't even bother to look. Oops. You again ignore the place of the i3 in the product line. A chip that costs less than half of the previous generation scores higher.

What the hell are you talking about? They didn't even have those benchmarks earlier? Please quote me on where I said this. I'd love to see it. The scores still stand, you're just dancing around it with cost, which these garbage i3 processors should cost half the price in this day and age.
> And then you tried saying that the i3 was no better, pointed at Systemmark, yadda yadda yadda, it's pathetic, etc. I don't care if the synthetic corresponds with later numbers or not. I don't use it as a benchmark, period. If the synthetic corresponds with the real numbers, then that's fine, but that's not always the case.

Right, but you ignore all of the games on anandtech that show that the Core 2 Duo performed better than the Core i3? What a joke.
> Or when they display an ignorance so monumental that I wonder if they're just trolling.

I wonder the same about you, do me a favor and go to 4chan if you love to talk about them so much.
> And you continuously ignore the rest of the Core 2 Duo line, pitting the new low-end against the old high-end. You completely ignore the product matrix because you can't accept that 10-20% is more than just "mediocre" when you take that into account.

What the hell? Since when was the Core 2 Duo a high end exclusive? What a joke that you are trying to claim that the Core 2 Duo was high end 2 years ago. It was mid-range or lower.
> "Rebranded" again, same old wrong tune. Get over it. If you don't like the GPU, then don't use it. Get something with a dedicated GPU. At least it's a step up from the even crappier Intel integrated video they've used in the past.

Umm, I'd take their older GPU's if you could still remove them over their current horse **** any day of the week.
> As you ignore all the other bullcrap you've spouted. Your Geekbench scores are all quite varied as they're comparing different laptop systems as well, which makes comparison more difficult. You also ignore that I've been comparing the desktop variants, oh, except that YOU were using desktop variants as well in the later argument! Oopsie. Did you miss that one?

Different laptop systems? All of the systems compared have the same RAM and using 32 bit geekbench, you think the scores will change that much otherwise? What a joke.
> Keep snaking around and screaming "YOU LOSE" as if that'll somehow make you smart. Punctuating statements with "LOL" doesn't do anything to help you on that front. You're not fooling anyone.

Mhm, keep telling people to go crawl back to 4chan. Pathetic.
> Click more where the review is, now who can't navigate through a site?

I can admit that I missed it. I still don't see your point as it's comparing an i5 quad to Core 2 Duos and the Core 2s get stomped in anything multithreaded. One iMovie result (two of them posted, which differ pretty drastically) does not equal comprehensive. You still don't have a point here.
> 10-20% packed in with Intel's wasted graphics? That's complete garbage.

Aaaaand you still ignore the low vs high end. I expected as much. Just keep ignoring it.
> Yet they still include them meaning they serve a purpose.

To be thorough. You still haven't stated WHY they're better beyond "People use them!". You still don't have a point yet again.
> You're still wrong on Final Cut. What do you think, it runs on magic? It runs on the Quicktime 7 framework which is FAR from being optimized for multithreading and Final Cut Pro will never be until Quicktime X has all of Quicktime 7's features. You are just assuming it has multithreaded support because you don't know how applications work.

Final Cut is aging, no doubt. The next one had better be reworked. Could the multithreading be better? Of course. Are effects the only multithreaded part of Final Cut? Nope. Not even close. Final Cut doesn't scale terribly well across a bunch of cores, but if you think it's largely single-threaded then you're the one with no clue here.
> What the hell are you talking about? They didn't even have those benchmarks earlier? Please quote me on where I said this. I'd love to see it. The scores still stand, you're just dancing around it with cost, which these garbage i3 processors should cost half the price in this day and age.

Gee, how about this:

> LOL? I don't see anything related to POV Raytracer, Blender, Par2 Data Recover, Sony Vegas Bluray encode, Sorenson Squeeze FLV encode and 7zip compression. Maybe you should check your links or stop wasting my time.

Funny, YOU EDITED IT AND YOU F**KING KNOW IT. It's in the quotes of one of my posts, directly quoting your earlier post. You cut it out to cover your ass. Nice try, and you can deny it until you're blue in the face, but we both KNOW you're a f**king liar now.
> Right, but you ignore all of the games on anandtech that show that the Core 2 Duo performed better than the Core i3? What a joke.

And you ignore that most games don't use more than 2 cores and then ignore EVERY OTHER BENCH where the i3 comes out ahead, then ignore the Techpowerup benches in games that use more than 2 cores showing the i3 pulling ahead, constantly hiding behind this...even after you said 10-20% yourself. Nice.
> I wonder the same about you, do me a favor and go to 4chan if you love to talk about them so much.

I'd rather not waste my time in that festering anus of the internet.
> What the hell? Since when was the Core 2 Duo a high end exclusive? What a joke that you are trying to claim that the Core 2 Duo was high end 2 years ago. It was mid-range or lower.

Did you miss the part where I said it was the high end DUAL CORE, idiot? Oh, wait, you conveniently ignored it like so many other things. And the e8600 still costs nearly $300, which is pretty damned spendy for a desktop CPU.
> Umm, I'd take their older GPU's if you could still remove them over their current horse **** any day of the week.

You couldn't remove any of Intel's integrated video if it was built into the board. And I'll say this again: If you don't want Intel integrated, you can get something that uses a dedicated GPU. Simple. Most people don't use their GPU beyond playing solitaire and watching movies, so the integrated video is just fine for them. If you need more, then get more. I don't see why this is hard to grasp.
> Different laptop systems? All of the systems compared have the same RAM and using 32 bit geekbench, you think the scores will change that much otherwise? What a joke.

Then how do you account for the variance in the scores, idiot? The Core 2 is all over the 3000 range with those same specs!
> Mhm, keep telling people to go crawl back to 4chan. Pathetic.

I didn't say that in the very text you quoted, so stop making things up.
Anyway, I've got REAL LIFE things to do, I'll be back later to see if you have learned anything by then.
> I can admit that I missed it. I still don't see your point as it's comparing an i5 quad to Core 2 Duos and the Core 2s get stomped in anything multithreaded. One iMovie result (two of them posted, which differ pretty drastically) does not equal comprehensive. You still don't have a point here.

What the hell are you talking about? The one that matters (the export to iTunes) was not impressive. The i5 stomping all over the Core 2 Duo? LMAO! What page are you looking at? The differences between iTunes, iPhoto, iMovie and Photoshop all make it very apparent that if the app isn't multithread aware it doesn't matter how powerful your CPU is, meaning that that 10-20 percent that you are gloating about will be even LESS noticeable. Do you not see this?
> Aaaaand you still ignore the low vs high end. I expected as much. Just keep ignoring it.

Err, I keep ignoring it? I clearly discussed it below.
> To be thorough. You still haven't stated WHY they're better beyond "People use them!". You still don't have a point yet again.

And you do? You claim they use it just because other people use it. Well why do other people use it? ***** and giggles?
> Final Cut is aging, no doubt. The next one had better be reworked. Could the multithreading be better? Of course. Are effects the only multithreaded part of Final Cut? Nope. Not even close. Final Cut doesn't scale terribly well across a bunch of cores, but if you think it's largely single-threaded then you're the one with no clue here.

So you're wrong and now trying to spin it around, nice strawman argument.
> Gee, how about this:
>
> Funny, YOU EDITED IT AND YOU F**KING KNOW IT. It's in the quotes of one of my posts, directly quoting your earlier post. You cut it out to cover your ass. Nice try, and you can deny it until you're blue in the face, but we both KNOW you're a f**king liar now.

You're right, I cut it because I hadn't noticed the other pages. This is all coming from the guy who edited his post about CS4 and games and whatnot before they were deleted. AND YOU ****ING KNOW IT LOLCAPSLOCK
> And yes, the "garbage" i3 (which still beats the former high end, even by your own admission) should cost half when that former high end still costs double and it eviscerates other Core 2s that cost half of what it costs. Once again, you completely fail to make a valid point.

Right, I fail to make a valid point because you say so. The Core 2 Duo's are 2 years old, and I love how you are leaving out Arrandale i3's just to fit your "high end" dual core ********.
> And you ignore that most games don't use more than 2 cores and then ignore EVERY OTHER BENCH where the i3 comes out ahead, then ignore the Techpowerup benches in games that use more than 2 cores showing the i3 pulling ahead, constantly hiding behind this...even after you said 10-20% yourself. Nice.

Mhm, so you are assuming the game engines used were just utilizing two cores. Well guess what, if most games utilize two cores then what the **** is the point of upgrading to an i3? For that 10-20% gain in a portion of my games? Yippee!
> Did you miss the part where I said it was the high end DUAL CORE, idiot? Oh, wait, you conveniently ignored it like so many other things. And the e8600 still costs nearly $300, which is pretty damned spendy for a desktop CPU.

Much like you are conveniently ignoring the lower end Arrandale Core i3's. How swell.
> You couldn't remove any of Intel's integrated video if it was built into the board. And I'll say this again: If you don't want Intel integrated, you can get something that uses a dedicated GPU. Simple. Most people don't use their GPU beyond playing solitaire and watching movies, so the integrated video is just fine for them. If you need more, then get more. I don't see why this is hard to grasp.

Why this is hard to grasp? Because Intel's GPU is utter horse ****. It's so bad it can't even display colors right on the new MacBook Pro, you're seriously defending that garbage?
> Then how do you account for the variance in the scores, idiot? The Core 2 is all over the 3000 range with those same specs!

You take the average score, it's not that difficult, idiot.
> I didn't say that in the very text you quoted, so stop making things up.

Oh you clearly talked about telling someone to crawl back to 4chan in your previous posts. You also called someone a bonafied retard. It seems you have a lot of class.
You've proven yourself a liar, a fool, and so many other things. Keep editing things and hope people don't notice, you pathetic jackass. We both know it and I'm done with this if you're just going to edit your posts and claim you never said something. You'd make a great politician.
Pretty good idea. Would Mac mini Server hardware with the modifications you mention work for that?
For one, Arrandale = Clarkdale - 1MB cache. Same goes for the i5 mobile chips, the ones in the Macbook Pro that spank a Core 2 in multithreaded benches. For two, as far as performance goes, that's the only real difference. For three, GEEKBENCH = SYNTHETIC. Synthetic benchmarks are crap. Also, I dug up a more comprehensive bench suite and edited the post a bit later. It shows the i3 wrecking the Core 2 Duo with a much wider variety of tests.
http://www.techpowerup.com/reviews/Intel/Core_i3_540_530/7.html
Except that said post has now disappeared. Why in the heck are the mods on this site so touchy and trigger-happy?
He thinks that PCs will still have their uses and people will still want them. He also said that PCs will not be replaced anytime soon, but it will happen.
Having owned four of them (one actually belonged to a client for a year), I sincerely believe the Mac Mini to be Apple's most reliable and underrated product. I sit here typing this on what is currently the highest specced non Server model (2.53GHz, 4GB RAM etc) connected to a Dell U2711 27" monitor (awesome!) - and she runs cool, only struggling if I fire up InDesign CS5 with a few other CS5 apps open at the same time, but only then if Chrome is struggling with Flash, which it does with aplomb!
A speedier Mac Mini would be great, but I cannot think of much else to make it better. Perhaps removing the CD/DVD drive and making it even smaller would be cool, with a black case option for home media centre use.
Apple went wrong with Apple TV by focusing on the hardware too much. They should have simply prepared some excellent OS X software and produced a media-bay-free Mac Mini - with HDMI out of course.
I love my Mac Mini because it's 'invisible' and quiet.
Why do you love yours?
My crystal ball says:
New model called Mac Mini Pro
w/ Quad Core
w/ nvidia 320m graphics module
w/ HDMI port
w/ Embedded WiFi and Bluetooth
w/ Blu Ray super drive option
w/ front USB port
w/ terabyte drive option
There is also another model with a bunch of options to replace Apple TV and become a living room entertainment hub, with LOTS of WiFi suite software to use your iPad, iPhone or iPod Touch as you watch your favorite show or movie.