The at-home service only comes after the online diagnostic software says there is a problem.

And this is a problem? How? Obviously it makes sense to do a quick online diagnostic first.

The upgrade to 3 years costs a lot more.

Yes, but it's a much better service than AppleCare. HP's plans are different. The AppleCare-style plan is offered for only two years, and it's just $200.

It's $2063 for the i7 Envy with a 15" screen.

You got confused here, which happens a lot because of the new Intel naming convention. The i7 you selected is totally different from what Apple uses. Apple uses dual-core Arrandale chips (available for the Envy too). What you selected is a quad-core Clarksfield chip. It's a much more powerful chip, and it costs much more. Besides, in my comparison both laptops had the identical i5 chip.

and that doesn't include Windows 7 Ultimate

Not only do you not need Ultimate, you do not need the Professional edition either, unless you need features like "Location Aware Printing", "Offline Files and Folder Redirection", Windows Server domain joining, etc. So yes, Windows 7 has a lot to offer and some people might need those features, but mostly it's for the enterprise.

so you will still have to buy photo editing, DVD authoring and movie creation programs.

Last time I checked Photoshop was not free for OS X either. And if you need something simpler, there is plenty of free stuff to choose from (like Picasa etc.)
 
MacBook Pro ship date slips

When a friend ordered a customized 15" MacBook Pro yesterday (April 14th), the estimated shipping timeframe was within 24 hours. When he checked back today (April 15th), the timeframe had slipped to 2-4 business days!
 
...
Well, Dell's RGBLED displays are some of the few that can claim true 100% color reproduction.

With Apple's edge-lit LED LCD displays, they're just giving you a thinner display that uses less power. RGBLED actually has hundreds of LEDs behind the LCD panel that can change color along with the picture being displayed on the LCD. Obviously that enhances the color quality dramatically. But none of Apple's displays use this technology. They're all edge-lit.
...

I have now talked to a few people and done some research.

Wow! This is a huge feature. Why is this not part of the Apple "Pro" machines' specs?

Why aren't more creative "Pro"fessionals clamoring for this?
 
After reading through these forums, it never ceases to amaze me how many people complain about these updates. However, I do understand why this is happening, and I think this information should be made a sticky (though in an easier-to-understand post).

1. Why does the 13" still have Core 2 Duo technology?

Nvidia and Intel had a chipset dispute; basically, Nvidia can no longer make any integrated graphics chipsets for processors that use the QuickPath Interconnect, in other words the mobile Core i3, i5, and i7 processors.

The previous-gen 13" MacBook Pro has the Core 2 Duo and Nvidia 9400M integrated graphics, which is a great combination of graphics, performance, and battery life of up to 8 hours. However, think about this: if Apple were to put a Core i3/i5 in the new 13" MBP, they could no longer use any Nvidia integrated graphics chip and would have to use a dedicated one instead.

Dedicated chips increase the cost and lower the battery life, two things they don't want. Apple does not like backing down on their word; in other words, if they gave you 8 hours of battery life last gen, they will not give you 6 or 7 the next.

From a performance/power-consumption perspective, Nvidia beats AMD in this mobile sector, so Nvidia is the choice for laptops.

A: The MacBook Pro 13"s are using a discrete 320M. The 9400M blurb is meaningless as it doesn't have switching graphics. They could be using an Intel chipset for all we know.

B: The ATI Mobility series uses less power for more performance than the nVidia M series.

C: ATi and Intel graphics are in far more laptops than nVidia right now.
 
A: The MacBook Pro 13"s are using a discrete 320M. The 9400M blurb is meaningless as it doesn't have switching graphics. They could be using an Intel chipset for all we know.

The 320M in the MBP 13" is not a discrete solution, it's an integrated solution. The poster you were responding to was right. If Apple had gone Core i3/i5 in the 13", they'd have had to either stuff a discrete nVidia chip in there or use Intel HD graphics, which is barely on par with a 9400M.

This was confirmed by Steve himself in his interview. Maybe all the whiners still don't get it. There would be no nVidia graphics in the MBP 13" if it had used a Core i3/i5.
 
A: The MacBook Pro 13"s are using a discrete 320M. The 9400M blurb is meaningless as it doesn't have switching graphics. They could be using an Intel chipset for all we know.

B: The ATI Mobility series uses less power for more performance than the nVidia M series.

C: ATi and Intel graphics are in far more laptops than nVidia right now.

The 320M in the MBP 13" is not a discrete solution, it's an integrated solution. The poster you were responding to was right. If Apple had gone Core i3/i5 in the 13", they'd have had to either stuff a discrete nVidia chip in there or use Intel HD graphics, which is barely on par with a 9400M.
The GeForce 320M looks like it is literally the rumored MCP89 for Core 2/Atom that was touted in July/August of 2009. Things really headed south after Intel denied nVidia the license for QPI/DMI. There was an MCP99 in the cards as well for Arrandale/Clarksfield. Both parts were rumored to be 32/48-shader GPUs based on nVidia's newer 40 nm GT21x cores.

That's the first time I've seen that.

I wasn't even aware nVidia was making chipsets for Intel anymore, especially since nVidia announced they would be withdrawing from the motherboard market.
There is a footnote that it uses shared video RAM. The MCP89 is a pin-for-pin drop-in replacement for the MCP79, so there is no need to redesign the logic board. nVidia can still make chipsets for front-side-bus processors, and they more than likely have a contract with Apple to provide them.
 
The 320M in the MBP 13" is not a discrete solution, it's an integrated solution. The poster you were responding to was right. If Apple had gone Core i3/i5 in the 13", they'd have had to either stuff a discrete nVidia chip in there or use Intel HD graphics, which is barely on par with a 9400M.

This was confirmed by Steve himself in his interview. Maybe all the whiners still don't get it. There would be no nVidia graphics in the MBP 13" if it had used a Core i3/i5.

That's the first time I've seen that.

I wasn't even aware nVidia was making chipsets for Intel anymore, especially since nVidia announced they would be withdrawing from the motherboard market.
 
That's the first time I've seen that.

I wasn't even aware nVidia was making chipsets for Intel anymore, especially since nVidia announced they would be withdrawing from the motherboard market.

They don't make chipsets for new Intel processors, but they haven't stopped making chipsets for older processors for which they made chipsets before.

And as Eidorian states, it says so on Apple's tech spec page:

NVIDIA GeForce 320M graphics processor with 256MB of DDR3 SDRAM shared with main memory [5]

[5] Memory available to Mac OS X may vary depending on graphics needs. Minimum graphics memory usage is 256MB.

So, again, the MBP 13" uses a Core 2 Duo so that it can use nVidia graphics. Otherwise, it would have a Core i3/i5 with subpar graphics. Guess which performs better?
 
Look, this argument just doesn't hold water at all. I work with text, coding, etc. all the time. And the argument for a few extra vertical lines of resolution just doesn't work when a WIDER display allows you to have more open at a time.

Being able to have more windows displayed side by side is a lot better than being able to pull a window down a little bit more. 16x10 is just an inefficient waste of space.



Yeah. And the funny thing is that since screens went 16x9, computer sales are UP. People want proper widescreen.



That's true. However, on a 16x10 screen, like the MacBook screen, the image of a 2.35:1 film is as small as it is on a 4x3 TV. Not the case with a proper 16x9 screen.



And guess what? The iMacs are selling better than ever now. Apparently Apple made the right move.



Well, you're in a very, very small minority. The rest of the industry has moved to 16x9. They did it virtually overnight. Sales are up. Apple did it on the iMac and sales are up. The only two places you really can't find 16x9 now are on Apple notebooks and Apple "Cinema" displays. But it's only a matter of time with them.

I'm sure Apple already has working prototypes.



Again, your argument just doesn't work. I've lost NOTHING in "work" (if working on a computer can even be called "work") by going 16x9 and gained everything when it comes to entertainment. You don't know how many times I've had to explain to people why their widescreen movies look so small on 16x10 displays and how those very same people jump at the opportunity to replace their system with a 16x9 display. And yes these are people who "work" on their computers.

Just like me. 16x10 should have never been introduced to begin with.



Which is ironic because HD DVD used the same copy protection as Blu-ray Disc: HDCP, AACS. BD+ wasn't even around at that time, and it's something that doesn't affect anyone.



Which is exactly the same for iTunes HD downloads. So it's okay for Apple to enforce the same type of HDCP requirements for HD video, but it's not okay for Blu-ray to do it?



You need to take your own advice and not cherry pick replies.

It's funny how you call me a troll and ignore the fact that Apple enforces the same HDCP requirements for their "high definition" content. http://www.engadget.com/2008/11/17/apple-itunes-multimedia-throwing-hdcp-flags-on-new-macbook-mac/ http://gizmodo.com/5177075/itunes-hd-movies-wont-play-on-older-non+hdcp-monitors

You know what you need for HDCP compliance? A modern video card with a manufacturer-provided driver (i.e. a driver directly from Nvidia or AMD), a non-Apple-manufactured display from within the last 4 years, and an HDMI or DVI cable. That's it. Scary stuff, huh?



Well, Dell's RGBLED displays are some of the few that can claim true 100% color reproduction.

With Apple's edge-lit LED LCD displays, they're just giving you a thinner display that uses less power. RGBLED actually has hundreds of LEDs behind the LCD panel that can change color along with the picture being displayed on the LCD. Obviously that enhances the color quality dramatically. But none of Apple's displays use this technology. They're all edge-lit.

Holy cow. I gather you're highly enamored of 16x9 and Blu-ray, etc. :rolleyes: That's all very fascinating and so forth... but it still seems like a lot of talk which accomplishes nothing (or very little of practical value). It just doesn't matter to anyone who doesn't care about that stuff. What... you expect Mac users (with tons of Mac OS experience and software) to rush out and purchase some Windows-running PeeCee laptop just because it can play Blu-Ray discs and has a 16x9 display? Sure.

You're so wrapped up in your mission that you've misinterpreted stuff I said. Of course iTunes movies have the same protections... because the movie business moguls demand it. [and I never said it didn't.] But *your* claim was that all such complaints about Blu-Ray came from Mac users spreading FUD. I proved you wrong... but obviously you didn't follow those links or read what PeeCee users said. You didn't acknowledge those facts. Ergo, the troll accusation shall stand as delivered.

Anyway, no matter how many pages you type about 16x9 it won't change anything. I'll be getting a new MBP because it's the right laptop for me. What you choose to buy doesn't interest me so much. [even less at this point.]
 
That bit isn't relevant at all.
I guess we're waiting for a teardown and a GPU-Z screenshot then?

What exactly would you like to know? I'll agree that Intel and ATI have a much higher mobile share right now and that ATI has better performance/watt.

The IGPs we have today are a far cry from early "graphics processors". We're talking about dedicated hardware shaders limited only by their shared video RAM, in some instances dedicated sideport memory, or a low-budget shader count.
 
Wahoooo!

FINALLY!!! WHAT EVERYONE WAS WAITING FOR. THE UPDATE!!! NOW I REAAAALY REGRET GETTING MY 15"MBP TWO MONTHS AGO... SHOULD HAVE WAITED, SO I COULD DO SOOOO MUCH MORE...or not. Hahaha, silly nerds. What a HUGE update(NOT!). Let's see, who can download a photoshopped pic of Megan Fox or stream the latest Attack of the Show w/ Munn eating hot-dogs faster. Uh, mine is pretty quick.... pretty sure I can wait the extra second longer to punch the clown that all the new MBP's seem to have on mine. Comical...
 
That bit isn't relevant at all.

Not relevant to what? It's basically the complaint with the 13", and it's because people misunderstand why it's using a Core 2 Duo. It is using it so that it can have an nVidia IGP; otherwise it would have Intel's IGP, which would make it perform much worse in games and other GPU-intensive apps.

Why increase the CPU's speed by 15% if you lose 85% of the graphics performance? This isn't 1996 anymore; GPUs nowadays are a much greater factor in performance than CPUs in most consumer-related tasks.

Heck, until 10 minutes ago, you had the same misunderstanding in thinking the 320M was a discrete part when it isn't.
 
I guess we're waiting for a teardown and a GPU-Z screenshot then?

What exactly would you like to know? I'll agree that Intel and ATI have a much higher mobile share right now and that ATI has better performance/watt.

The IGPs we have today are a far cry from early "graphics processors". We're talking about dedicated hardware shaders limited only by their shared video RAM, in some instances dedicated sideport memory, or a low-budget shader count.

I made the mistake of thinking the 320M was discrete, as I thought nVidia had stopped making chipsets completely.

There is nothing more.
 
Can someone post credible sources for AMD's mobile graphics performance/watt being better than Nvidia's?

As far as I know, integrated-graphics-wise Nvidia is on top; performance/watt on the Nvidia 320M should be the best. Remember, this is an integrated graphics chip.

In the mobile discrete graphics sector, I guess the competition opens up a bit more. I'm wondering who the clear winner is, or if solutions from Nvidia and AMD are about equal here.
 
Can someone post credible sources for AMD's mobile graphics performance/watt being better than Nvidia's?

As far as I know, integrated-graphics-wise Nvidia is on top; performance/watt on the Nvidia 320M should be the best. Remember, this is an integrated graphics chip.

In the mobile discrete graphics sector, I guess the competition opens up a bit more. I'm wondering who the clear winner is, or if solutions from Nvidia and AMD are about equal here.

http://www.notebookcheck.net/Comparison-of-Graphic-Cards.130.0.html
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
 

Seems this is mostly at the high end, not across the whole board, but yeah. Aside from that, let's keep the topic going, people lol :D
Any more complainers? Alright, I'll whine a little... nothing is perfect, right :)

My only complaints with the MacBook Pro are the lack of USB 3.0 and Blu-ray.

Blu-ray is not really that important to me, but the inclusion of USB 3.0 would have been nice.

Aside from that, I think this was a very good update from Apple.
 
Seems this is mostly at the high end, not across the whole board, but yeah. Aside from that, let's keep the topic going, people lol :D
Any more complainers? Alright, I'll whine a little... nothing is perfect, right :)

My only complaints with the MacBook Pro are the lack of USB 3.0 and Blu-ray.

Blu-ray is not really that important to me, but the inclusion of USB 3.0 would have been nice.

Aside from that, I think this was a very good update from Apple.
ATI's 400-shader Mobility HD 5600/5700 line is rather impressive.
 
I have my credit card in hand, ready to order. Two things are keeping me:
Besides browsing, documents, email, messengers, music, pictures, etc., I make music, mainly in Logic, and I will be getting into Ableton Live. I'll also be getting into making short films, but very amateur, so I'll stick to iMovie and get Final Cut somewhere along the line. So here are my doubts:

1) i5 (2.53) vs i7 (2.66), based on what I do/might get into in the future.

2) High resolution: I have Netflix and watch a lot of movies (streaming/DVD), but I don't want to get the computer and realize it's way too small and be squinting all the time. Advice?

thanks in advance:apple:
 
I have my credit card in hand, ready to order. Two things are keeping me:
Besides browsing, documents, email, messengers, music, pictures, etc., I make music, mainly in Logic, and I will be getting into Ableton Live. I'll also be getting into making short films, but very amateur, so I'll stick to iMovie and get Final Cut somewhere along the line. So here are my doubts:

1) i5 (2.53) vs i7 (2.66), based on what I do/might get into in the future.

2) High resolution: I have Netflix and watch a lot of movies (streaming/DVD), but I don't want to get the computer and realize it's way too small and be squinting all the time. Advice?

thanks in advance:apple:

Get the higher resolution; it won't be that bad at all. The extra screen real estate will be worth it. I have a 15" and wish it had more screen space. As for the processor, if you plan to do video encoding, go with the i7.
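If it helps to put numbers on the "too small / squinting" worry, here is a quick pixel-density sketch. The 1440x900 standard panel and 1680x1050 high-res option are my assumptions about the 15" configurations being discussed, so treat this as illustrative rather than a spec sheet:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given pixel dimensions and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed 15.4" panel options: 1440x900 standard vs. 1680x1050 high-res BTO.
for name, (w, h) in {"standard 1440x900": (1440, 900),
                     "high-res 1680x1050": (1680, 1050)}.items():
    print(f"{name}: {ppi(w, h, 15.4):.0f} PPI")

# Rough output:
#   standard 1440x900: 110 PPI
#   high-res 1680x1050: 129 PPI
# Everything on screen gets roughly 15% smaller linearly, in exchange for
# about 36% more pixels of working space.
```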
 
Look, this argument just doesn't hold water at all. I work with text, coding, etc. all the time. And the argument for a few extra vertical lines of resolution just doesn't work when a WIDER display allows you to have more open at a time.

Being able to have more windows displayed side by side is a lot better than being able to pull a window down a little bit more. 16x10 is just an inefficient waste of space.



Yeah. And the funny thing is that since screens went 16x9, computer sales are UP. People want proper widescreen.



That's true. However, on a 16x10 screen, like the MacBook screen, the image of a 2.35:1 film is as small as it is on a 4x3 TV. Not the case with a proper 16x9 screen.



And guess what? The iMacs are selling better than ever now. Apparently Apple made the right move.



Well, you're in a very, very small minority. The rest of the industry has moved to 16x9. They did it virtually overnight. Sales are up. Apple did it on the iMac and sales are up. The only two places you really can't find 16x9 now are on Apple notebooks and Apple "Cinema" displays. But it's only a matter of time with them.

I'm sure Apple already has working prototypes.



Again, your argument just doesn't work. I've lost NOTHING in "work" (if working on a computer can even be called "work") by going 16x9 and gained everything when it comes to entertainment. You don't know how many times I've had to explain to people why their widescreen movies look so small on 16x10 displays and how those very same people jump at the opportunity to replace their system with a 16x9 display. And yes these are people who "work" on their computers.

Just like me. 16x10 should have never been introduced to begin with.
......
One can work on 16:9, but if you want maximum screen real estate vs. portability, then 16:9 is a lose-lose. For the same portability you get less screen, or for the same screen real estate, less portability. At the same portability a movie is displayed at the same size on 16:10 as it is on 16:9. There is no difference.
Why? Because, at least in my opinion, portability depends hugely on the width of the notebook, because I have to fit it into a bag with all my other stuff, and my bags are designed to hold A4-shaped stuff like notepads...
I don't really care about weight. A 16:10 15.4" is not matched in screen real estate by a 15.6" notebook; you need to go above 16", and even at 15.6" the portability suffers. At the same width, a movie is always displayed at the same size.
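For anyone who wants to check that reasoning with actual numbers, here's a rough back-of-the-envelope sketch in Python. The 15.4"/15.6" panel sizes and the 2.35:1 film ratio come from the posts above; everything else is plain geometry, so take it as an illustration rather than a spec comparison:

```python
import math

def panel(diagonal_in, aspect_w, aspect_h):
    """Return (width, height, area) in inches / square inches for a panel."""
    k = diagonal_in / math.hypot(aspect_w, aspect_h)
    width, height = k * aspect_w, k * aspect_h
    return width, height, width * height

# Assumed panels: the 15.4" 16:10 MacBook Pro vs. a typical 15.6" 16:9 notebook.
for name, (d, aw, ah) in {'15.4" 16:10': (15.4, 16, 10),
                          '15.6" 16:9': (15.6, 16, 9)}.items():
    w, h, area = panel(d, aw, ah)
    movie_h = w / 2.35  # a 2.35:1 film scaled to fill the full panel width
    print(f"{name}: {w:.1f} x {h:.1f} in, {area:.0f} sq in, "
          f"2.35:1 image {w:.1f} x {movie_h:.1f} in")

# Rough output:
#   15.4" 16:10: 13.1 x 8.2 in, 107 sq in, 2.35:1 image 13.1 x 5.6 in
#   15.6" 16:9: 13.6 x 7.6 in, 104 sq in, 2.35:1 image 13.6 x 5.8 in
```

On these assumptions the 15.6" 16:9 panel is actually a bit wider but has slightly less total area, which is consistent with the claim that you need to go above 16" in 16:9 to match a 15.4" 16:10 in real estate; and at equal width a 2.35:1 film ends up the same physical size regardless of the panel's aspect ratio.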

BTW, I hope that with OLEDs they finally kill the thick bezels around screens and give us more screen at the same size. OLEDs are almost indestructible, as a video of an AMOLED screen and a hammer showed, so you don't need to protect them, and they are extremely thin too.
 