If they made it optional, that's fine, but taking it away completely would be too soon. There are still a lot of people (including me) who use & like physical media. I still buy CDs the majority of the time and use them, as well as DVDs I rent from Netflix, in my PowerBook G4.

I really don't need 2 HDDs or an extra battery. If I want an extra battery, I'll buy a spare to carry.

I really don't see Apple making that move yet.
I can see something like the Mac mini and Mac mini Server happening with the MacBook Pros.
 
I've been reading gamer forums since the 1990s. Most of the people on there are idiots who think they need to buy the newest thing out there just to surf the internet or play a game.

Doesn't matter, since PC gaming is dying and everything is going to consoles except for strategy games. Most people don't want to spend $500 on a graphics card, and most think you are insane for buying 2 of the same cards just to play a game. That means most PC games will be designed for the lower-end hardware, just as they have been since the 1990s. The last time I bought a new graphics card the day it came out was the Nvidia Ti4600. I didn't see games that used any of its features for 18 months, and the idiots on AnandTech were saying how you had to buy it to get decent framerates. Same thing when I bought a Voodoo2 the day it launched; nothing used a fraction of its power for 12-18 months.

Excellent post.
 
That changes nothing. Physics says you get to choose: at a given level of heat/power, you can have four clocks running at (average) frequency "f," or two clocks running at frequency "2f." These are average frequencies, taking into account "turbo," clock throttling, etc.

So if you have a laptop thermal solution designed to handle 2 cores and you double the number of cores (assuming the same core design, same process technology, etc.), you have to run the four cores at an average speed half of what you could run the 2 cores at.

Sorry, but when all else is equal, power consumption grows with the square of the frequency, not linearly. So you can have four cores at frequency "f", two cores at frequency 1.414f, or one core at frequency 2f. So what does that mean?

1. If you don't need much processor power, you are better off if your applications can use multiple cores, so you can spread the work over them at the lowest possible clock speed, and you actually save power and battery life doing that.

2. If you need a lot of processor power, you have the choice to use all cores at full speed to reduce the time your task takes. This will also produce more heat and cost battery life. On the other hand, it is surely better to get a task done in four hours and then have an empty battery than to empty the battery in six hours without getting the task done.

3. I think Arrandale is two cores + hyperthreading, so the maths would be slightly different.
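To put rough numbers on that trade-off, here's a quick Python sketch under the same assumption stated above (per-core power scaling with the square of frequency). The figures are purely illustrative, not measurements of any real chip:

Code:
def total_power(cores, freq):
    # Relative package power if each core's power scales with freq squared.
    return cores * freq ** 2

def total_throughput(cores, freq):
    # Crude throughput proxy: cores x clock, assuming perfect parallel scaling.
    return cores * freq

# The three configurations compared above, all at the same power budget:
configs = [(4, 1.0), (2, 1.414), (1, 2.0)]   # (cores, relative clock)

for cores, freq in configs:
    print(f"{cores} core(s) @ {freq:.3f}f: "
          f"power {total_power(cores, freq):.2f}, "
          f"throughput {total_throughput(cores, freq):.2f}")

# All three land near relative power 4.0, but throughput drops from 4.0
# to ~2.83 to 2.0 as cores are traded for clock speed -- which is why
# spreading parallel work over more, slower cores can save energy.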
 
I have a 5-year-old PowerBook G4... I can imagine the performance boost I'll get from upgrading...

Hahahaha; I'm with you. I'm on a PowerBook G4 as well. It's been a great machine, but it's starting to show that it's straining. It's definitely time for an upgrade.

Can't even watch a YouTube video without some hesitation.
 
Core i7/i5
USB 3
SATA 3
DX11 GPU (mainstream and HIGH END!)
Blu-ray option
HDMI or mini HDMI would be nice though
 
Sorry, but when all else is equal, power consumption grows with the square of the frequency, not linearly. So you can have four cores at frequency "f", two cores at frequency 1.414f, or one core at frequency 2f. So what does that mean?

Wrong. It's the square of voltage, not frequency. P=1/2 C V^2 f.

See, e.g.: http://en.wikipedia.org/wiki/CMOS#Power:_switching_and_leakage

Update: the 1/2 factor depends on whether you count each clock edge or every other. I designed CPUs at AMD (K6, Athlon 64 and Opteron), Exponential (PowerPC x704), Sun (UltraSparc IV), RPI (F-RISC/G), etc., and we usually followed the 1/2 convention, but many other people don't.
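For anyone who wants to plug numbers into that formula, here's a tiny Python sketch; the capacitance, voltage, and clock values are made up purely for illustration, not the specs of any actual CPU:

Code:
def dynamic_power(c_farads, v_volts, f_hertz):
    # Dynamic (switching) power of a CMOS circuit, using the 1/2 convention
    # mentioned in the post: P = 1/2 * C * V^2 * f.
    return 0.5 * c_farads * v_volts ** 2 * f_hertz

C = 1e-9      # 1 nF of effective switched capacitance (hypothetical)
V = 1.2       # 1.2 V core voltage (hypothetical)
f = 2.0e9     # 2 GHz clock (hypothetical)

print(f"baseline:     {dynamic_power(C, V, f):.2f} W")        # ~1.44 W

# Doubling frequency at fixed voltage only doubles dynamic power...
print(f"2x frequency: {dynamic_power(C, V, 2 * f):.2f} W")    # ~2.88 W

# ...while a 20% voltage bump alone costs ~44% more power (the V^2 term).
print(f"1.2x voltage: {dynamic_power(C, 1.2 * V, f):.2f} W")  # ~2.07 W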
 
If they made it optional, that's fine, but taking it away completely would be too soon. There are still a lot of people (including me) who use & like physical media. I still buy CDs the majority of the time and use them, as well as DVDs I rent from Netflix, in my PowerBook G4.

I really don't need 2 HDDs or an extra battery. If I want an extra battery, I'll buy a spare to carry.

I really don't see Apple making that move yet.

Wasn't Apple the first to pull the floppy from their notebooks?

I think it could very well happen sooner rather than later, with the rise of streaming, music, movie, and software downloads, cheaper flash drives, etc.

I believe I read somewhere that the HP Envy doesn't have an optical drive, which made room for a standalone graphics card. That would be a nice addition for the 13".
 
I can't wait XD
 
I guess THIS article will come as a surprise - if true of course...

Apple ditching Arrandale??? Comments?

Intel has already built in a way to bypass the internal graphics, so there is no need to create a special processor for Apple, and the info about Nvidia is just fan talk. Plus, using the built-in graphics together with an external graphics chip gives more value than shutting off the internal graphics and using the same external graphics solo.
 
I guess THIS article will come as a surprise - if true of course...

Apple ditching Arrandale??? Comments?


Bright Side of News said:
...which only leaves us to wonder where things went wrong for AMD and their Manhattan line of Mobility Radeon products. As you probably know, ATI Manhattan is a codename for mobile versions of DirectX 11 parts [Evergreen family].
There's nothing wrong since they aren't even late yet. :rolleyes:

I'm sure we'd all enjoy another round of Core 2 based products.
 
I guess THIS article will come as a surprise - if true of course...

Apple ditching Arrandale??? Comments?

Um, how about "no"? nVidia and Intel are just starting a massive court case to figure out if nVidia is even allowed to make chipsets for the i7 series. Ref:

http://www.pcmag.com/article2/0,2817,2353938,00.asp

Nobody knows which external graphics solution will end up in the next MBPs, but it'd be silly to disable the internal video.
 
There's nothing wrong since they aren't even late yet. :rolleyes:

I'm sure we'd all enjoy another round of Core 2 based products.

I wouldn't have a problem with that. If Apple can think of a creative solution to overcome the inherent handicap that Intel poses with the inclusion of a very, very, very inferior GPU in their processors, I will be happy. If they decide they are not going to go along with this pimping and ditching of a bad product on the shoulders of a good one, and the monopolising tactics it entails, this will be perfectly understandable, and I will gladly accept another round of C2D. It's not like that extra combined 5% of advantage would make a difference to anyone.

But more so than ever I hope Apple are motivated to acquire, or at the very least strike a deal with, AMD+ATI, so that the Mac world gains a solid future with a great integrated CPU+GPU and, again, their very own superior processor, like the PowerPC was, with the added compatibility incentive. I wouldn't give a toss if in the interim they used the C2D; for 95% of users the bottleneck is in the RAM and the SSD in any case, so big f. deal if they use the C2D. I'd take a C2D with an upgraded Nvidia integrated GPU and/or a discrete Nvidia/ATI option over the Arrandales any day.

Like I said, I will be looking to see what option Apple adopt, but I will be very happy if Intel is punished for doing a Microsoft pimp act on the computing industry by seeing the most profitable and fastest-growing platform (Apple) move away from them.
 
I think we would all like that Bright Side of News article to be true, and to see the Arrandale CPU freed from its IGP counterpart and placed on a magical chipset with next-gen SLI/CrossFireX-capable integrated graphics and USB 3.0.

But I'm afraid the future isn't that bright.
 
I can see an Intel chipset and a discrete ATI 4350. Don't s'pose it'll be too much to ask of AMD to make an ATI chipset for the i7. It's not likely, but then Apple has done stranger things. Like the Mighty Mouse.... :D

Did you mean the Mobility Radeon HD 4830? ;) A newish GPU with GDDR5 would be more to my liking.
 
Chipset? Video?

Given that NVidia still hasn't been able to get Intel to play nice enough to make a chipset for Nehalem, presumably these machines would use Intel chipsets. Presumably Intel will integrate some sort of graphics solution into these laptop chipsets. Hopefully said graphics solution will be something closer to the Larrabee graphics architecture they're working on, and not the horrible GMA-based platforms of MacBooks gone by. Clearly only the higher-end MBPs will have discrete NVidia graphics as they do now.

Apple will look pretty stupid after all the NVidia noise they've made over the past year if they release a new entry-level "MacBook Pro" with worse graphics than the current model.

This leads me to believe that the new Nehalem chips will lead to a re-forking of the MacBook lineup, probably not in name, as they just got finished "Pro'ing" everything, but I think only the MBP models that currently have discrete NVidia graphics will get the Nehalem treatment until Intel gets its integrated graphics act together.
 
Wrong. It's the square of voltage, not frequency. P=1/2 C V^2 f.

Yeah, um, clearly you're not very good at physics. Your argument is flawed, because to increase the clock frequency you would clearly have to increase the voltage applied. How else are you going to squeeze sufficient electrons down a wire to be registered by the next component in less time? Furthermore, this capacitive power loss would be swamped by leakage and inductance. If you design chips for AMD, surely you would know that at high frequencies capacitive impedance is small, but inductive impedance is large.
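To illustrate the point being argued here, a small Python sketch: if you assume, purely for illustration, that core voltage has to rise roughly in proportion to clock frequency, then the dynamic power from the formula quoted above ends up growing roughly with the cube of frequency:

Code:
C = 1e-9              # effective switched capacitance, hypothetical
V0, f0 = 1.0, 1.0e9   # baseline core voltage (V) and clock (Hz), hypothetical

def power(freq, volts_per_hz=V0 / f0, c=C):
    voltage = volts_per_hz * freq            # assume V rises in proportion to f
    return 0.5 * c * voltage ** 2 * freq     # dynamic power, 1/2 * C * V^2 * f

for mult in (1.0, 1.5, 2.0):
    print(f"{mult:.1f}x clock -> {power(mult * f0) / power(f0):.2f}x power")

# With V tied to f, power grows roughly as f^3: 1.5x clock is ~3.4x power
# and 2x clock is 8x power; at a fixed voltage it would grow only linearly.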

alent1234 said:
since PC gaming is dying...
...most PC games will be designed for the lower-end hardware, just as they have been since the 1990s.

And for the record, console gaming is not taking over the market, and PC gaming isn't dying. How can it, when there are so many more PC games? How can a gamepad replace a keyboard and mouse? How can a 3-5 year old piece of technology like a PS3 or Xbox 360 be expected to keep up with a brand new CPU and GPU? So what if most games are designed for lowish-end systems? A lowish-end system is still going to cane some 5-year-old silicon in a fancy box.
 
Hopefully said graphics solution will be something closer to the Larrabee graphics architecture they're working on, and not the horrible GMA-based platforms of MacBooks gone by.

Unlikely. I'm pretty sure the general consensus is that the Intel on-chip GPU will be dismal. Not that that is necessarily a bad thing, as it will therefore run on effectively negligible power and would be perfect for watching videos on. Although I still wouldn't be that surprised if Apple disabled the Intel GPU to save a couple of watts and relied on the new low-power states of modern GPUs (which of course requires them to upgrade the existing GPUs :))

Clearly only the higher-end MBPs will have discrete NVidia graphics as they do now.

As I've said before, I think it is very likely that all models will get discrete graphics of some sort, probably AMD solutions. Not that it particularly matters; Nvidia would be fine too. Apple need only put low-power discrete graphics in the lower models and performance-level graphics in the high end, and everyone will be happy.
 
No Larrabee. Ever.

However, the idea that the models might be "forked" as Funkboy suggested is relatively well supported by the whole 6,1 6,2 deal, which suggests only two models will go through an architecture change (presumably the 15" and 17").
 