Right. That would be a nice summer surprise!
And it would end this thread...
I feel like next week there will be a massive surge of rMBP purchases regardless of whether or not Apple releases the Haswell rMBPs.
If there is a rMBP update next week, the surge is self-explanatory. If there's no release next week, I think many customers (myself included) won't want to wait until October and will just plop down some cash for the current rMBP.
Then here's a sign for you: Apple is waiting for the Intel Core i7 4750HQ to be released so that they can put it in the base configuration.
Yeah, that's the low-end Iris Pro CPU. 4850HQ and 4950HQ have already been released. For some reason, Intel is taking its sweet time with the 4750HQ.
Source of info:
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/19
Q3 would point to it being after August.
And if you have to ask why wait, then it's just the usual...
Apple needs 3 CPUs: one low-end, one mid-range, and one high-end.
Low-end goes into the base config.
Mid-range goes into the high config.
High-end is a BTO option.
Judging from the fact that Intel's database has only 3 Iris Pro CPUs (seriously!), there's really no other chip Apple could put into the base config.
"Launched" does not mean "available to OEMs right now".
Unless you're saying Anandtech is wrong... but then they did receive an official sample (4950HQ) from Intel to benchmark, so I'd think they exchanged some words with Intel prior to the writeup.
And I don't think they would give availability information like that without first checking in with Intel.
Cliffs:
- In games, the i7-4700HQ + GT 750M is only ~25% faster on average than the i7-4750HQ + Iris Pro 5200
- Iris Pro beats all in GPU computing and video transcoding
- Around the same power consumption at idle and under load :O
So, Iris Pro is on par with GT 750M in Crysis 3, and performs a little worse in other games.
However, Iris Pro shows its advantages in GPU computing. Seriously, check this:
CLBenchmark (OpenCL performance): (higher is better)
Physics:
Iris Pro: 3277
750M: 2378
Graphics:
Iris Pro: 56516
750M: 35608
In the other CLBenchmark tests, Iris Pro and the 750M are almost on par.
In Folding@Home, Iris Pro is 0.2 behind the 750M.
In Rightware Basemark CL, Iris Pro is much, much better than the 750M. Video encoding is also better, and the WebGL benchmark says it all:
Iris Pro: 118
750M: 76.8
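To put those raw scores in perspective, here's a quick sketch that computes Iris Pro's percentage lead over the 750M from the exact figures quoted above (all three benchmarks are higher-is-better; the numbers come from the post, not from my own testing):

```python
# Percentage lead of Iris Pro 5200 over the GT 750M, using the
# benchmark figures quoted above (higher is better in all of them).
scores = {
    "CLBenchmark Physics":  (3277, 2378),
    "CLBenchmark Graphics": (56516, 35608),
    "WebGL":                (118, 76.8),
}

for name, (iris, gt750m) in scores.items():
    lead = (iris / gt750m - 1) * 100
    print(f"{name}: Iris Pro ahead by {lead:.0f}%")
```

So the compute-side lead is in the 38-59% range, which is a much bigger gap than the ~25% gaming deficit in the other direction.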
Since you made a new account just to bash on this john123 guy, I made an account just to praise him.
I've been reading this thread daily starting about two weeks before the WWDC, and I have to say that john123 is one of the few in the thread that actually THINKS through his responses before replying.
IN FACT: I do think he is an "intelligent, well-educated man" and does NOT come off as "insecure". I don't really care about him as a person, but I definitely care about his responses and what he says because, quite frankly, his responses are a lot better than yours.
I start school on July 31st and will be buying a MacBook Pro with Retina display. Haswell battery life is what I'm really hoping for, being in school and studying in the library all evening.
Crossing my fingers to be able to get Haswell before July 28th.
It's almost as if what I have been saying all along in the dGPU vs. iGPU debate has been correct. Slightly worse game performance for much better compute performance.

I actually don't think any of us (myself included) have disagreed with your overall characterization here. We just disagree with the "slightly" part. For most serious gamers, 30 FPS is considered atrocious. That said, I do believe you're right that the Apple laptop gaming market is small enough that Apple wouldn't necessarily worry about those lost sales.
30 fps is good. 30 fps on average isn't.
This right here, except 30 fps is mediocre at best. But dipping below 30... oh boy.
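A quick way to see why "30 fps on average" is worse than a locked 30: the average hides the dips. A minimal sketch with made-up per-second fps samples (the numbers are purely illustrative, not from any benchmark):

```python
# Hypothetical per-second fps samples for two runs with the same average.
locked_30 = [30] * 8                           # steady 30 fps, never dips
avg_30 = [45, 42, 38, 30, 22, 18, 25, 20]      # also averages 30, but dips to 18

for name, samples in [("locked 30", locked_30), ("average 30", avg_30)]:
    avg = sum(samples) / len(samples)
    worst = min(samples)
    print(f"{name}: avg {avg:.0f} fps, worst {worst} fps")
```

Both runs report "30 fps", but the second one spends whole seconds in slideshow territory, which is exactly the "dipping below 30" problem.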
Wow, you're really cool with your fancy-schmancy N= phrases. Yes, dude, we all know that you've read statistics. Good for you.
You might think that you come across as an intelligent, well-educated man.
But what you really come across as is insecure. Nobody cares about you, and nobody ever will. Just write simply, and write the truth. No need to play fancy; nobody gives a ****.
Me too!
I have a Synology 2-bay NAS and it easily reaches 60-70 MB/s over Ethernet, but barely makes 30 MB/s over WiFi. You have to consider that with WiFi, your real-world speed is close to half the theoretical. So with 802.11n you're getting 20-30 MB/s, and unless you're sitting beside your router, no more.
So if my router and MBP both have 802.11ac, I'm pretty sure I'll be able to hit the NAS's max of 70 MB/s, with the theoretical max of 802.11ac right now being 130 MB/s. I spent approximately $350 on my NAS, and it is most definitely not a high-end one.
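The arithmetic behind that works out like this. A rough sketch using the post's rule of thumb (real-world throughput ≈ half the theoretical link rate); the link rates below are illustrative examples I picked, not measurements:

```python
# Rule-of-thumb from the post: real-world WiFi throughput is roughly
# half the theoretical link rate. Example link rates are illustrative.
def real_world_mbps(link_rate_mbit):
    """Estimate usable throughput in MB/s from a link rate in Mbit/s."""
    usable_mbit = link_rate_mbit * 0.5   # ~50% protocol/air-time overhead
    return usable_mbit / 8               # bits -> bytes

for name, rate in [("802.11n, 450 Mbit/s link", 450),
                   ("802.11ac, 1300 Mbit/s link", 1300)]:
    print(f"{name}: ~{real_world_mbps(rate):.0f} MB/s usable")
```

A 3-stream 802.11n link at 450 Mbit/s lands at roughly 28 MB/s usable, matching the 20-30 MB/s the post describes, while a 1300 Mbit/s 802.11ac link comfortably clears the NAS's 70 MB/s ceiling.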
You shall NOT underestimate 802.11n. Just because your wireless router's maximum throughput is 30 MB/s doesn't mean there aren't others that are better.
http://www.smallnetbuilder.com/wireless/wireless-charts/bar/113-5-ghz-dn-c?task=archiveon
Another lazy man's chart. In the table, look at the Rank column: N900 and below is 802.11n, and anything 1000 and above is 802.11ac.
Again, some of the 802.11n routers that topped the chart can reach the 70 MB/s range (except that Asus RT-N66U... he's cheating), utilizing all your NAS's speed.
Know thy router. 802.11ac is good to have, but currently (and maybe for a few years into the future) 802.11n still covers most use-case scenarios.