I don't think he was calling you dumb... he was simply trying to rationalize going out of your way to make something work on a Mac that typically only works on a PC: burning Xbox games, downloading software, doing anything for the sake of being productive rather than masturbating to fancy toolbar icons and sleek, skinny unibodies... stuff like that. I own two Macs btw so I'm not a h8r.. LOL

Wasn't trying to call him dumb... it probably came across that way. I meant that OpenCL is the standard for making this work across all cards, so that is probably the way development should be leaning. Also, there is a lot of device optimization that is useful in CUDA or OpenCL that I prefer not to mess with on a laptop graphics card like the 330M when I'm going to be running all the code on at least a GTX 295 or a Fermi card (where I get true out-of-order execution).

It's far, far more practical, in my opinion, to write code on my laptop for the target machine, using SSH or transferring it via SCP, than it is to write code natively on my laptop and then have to completely rewrite sections for device optimization later. If he isn't trying to do all of that, sure, it works fine to write CUDA for a small test set on the laptop and then just run it on a larger, more performance-oriented device.

Also, SSH is fast enough that I've never seen it as a major hit, so I never feel the need to complain if my mobile computer lacks something vendor-specific that I can get by just remoting into a server. It's a straightforward workaround that doesn't affect my workflow or productivity, so I just get frustrated with the whole attitude of "no, I need this to have a vendor-specific device" in what is essentially a mobile device that you don't have a lot of control over.
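To make the device-optimization point concrete, here's a rough sketch of the kind of thing I mean. This is purely an illustration: error handling is stripped and the blocks-per-SM multiplier is a made-up heuristic, not a tuning recipe. The idea is that a launch configuration hard-coded for a 330M will underfill a GTX 295 or a Fermi card, so you size it from the device at runtime instead. It compiles as plain C against the CUDA runtime.

Code:
/* Sketch: derive a launch configuration from whatever device is
 * actually present instead of hard-coding laptop-GPU numbers.
 * Build (paths vary by install): nvcc probe.c -o probe */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    struct cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);   /* query device 0 */

    /* A GT 330M reports far fewer multiprocessors than a GTX 295 or a
     * Fermi part, so a grid tuned on the laptop underfills the server
     * card. Derive the grid size from the hardware instead. */
    int threads = 256;                          /* common, safe choice   */
    int blocks  = prop.multiProcessorCount * 8; /* placeholder heuristic */

    printf("%s: %d SMs -> launch %d blocks x %d threads\n",
           prop.name, prop.multiProcessorCount, blocks, threads);
    return 0;
}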
 
Mac is first and foremost a design and development platform. That's what it has always excelled at. If it can satisfy gamers on the side, more power to them.

This was true in 2005, but Apple has outgrown the designer/pro market. The designer/Adobe pro user market represents a truly small percentage of the revenue stream these days. The Mac's phenomenal growth over the past 20 quarters has been fueled by general-purpose users who also play games.
 

wolfenkraft said:
Nooooooo!!!!! I need CUDA cores! :eek: :eek:


Absolutely. Not supporting CUDA will make me a sad panda. :(

You should be supporting OpenCL if you are coding yourself. Even as a user you should be pressuring vendors to support OpenCL as the default GPGPU platform.
CUDA will die with NVIDIA.
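To put something concrete behind that advice: below is a minimal vector-add in plain C against the OpenCL API. It's only a sketch; all error checking is omitted, and it naively grabs the first GPU it finds. The point is that the kernel source is compiled at runtime for whatever device is present, so the same code runs on NVIDIA, AMD, or anything else with an OpenCL driver, which is exactly the portability CUDA can't give you.

Code:
/* Minimal OpenCL vector-add sketch (C99). No error checking. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    int i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* First platform, first GPU -- naive, but fine for a sketch. */
    cl_platform_id plat;  clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* The kernel is built at runtime for this device -- this is what
     * makes the same source portable across GPU vendors. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[42] = %.1f (expect 126.0)\n", c[42]);
    return 0;
}

On Mac OS X 10.6+ this builds with gcc vadd.c -framework OpenCL (calling the file vadd.c); elsewhere, link against the vendor's OpenCL library with -lOpenCL.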
 
I don't know about ATI/AMD, but I can tell you that NVIDIA sux.

(MBP, late 2009)

Why do they 'sux'? I own many devices with NVIDIA GPUs. They perform great, the drivers have always been stable, and they are updated periodically with tweaks & refinements.

I don't understand the hate. I think a lot of people will moan about anything. My experience of ATI/AMD GPUs is nothing to write home about. They cost less than their NVIDIA counterparts, but that's about the only major plus point.

NVIDIA dominate the top positions (I have a GeForce GTX 570 on order with my delayed SB PC):

http://www.videocardbenchmark.net/high_end_gpus.html
 
Why do they 'sux'? I own many devices with NVIDIA GPUs… [snip]

I know that recently, when getting cards sorted for my desktop PC, I found that power consumption was far greater on NVIDIA cards, as was heat, in a like-for-like card comparison. AMD cards also provided much better value for money: in the low/mid range similar cards are cheaper, and even in the higher ranges too.

That's mostly just my opinion from testing out a few different options.
 
Why do they 'sux'? I own many devices with NVIDIA GPUs… [snip]
http://support.apple.com/kb/TS2377 (Apple's support article on the defective NVIDIA GeForce 8600M GT graphics in 2007-2008 MacBook Pros)

FYI.
 
Taken from Apple's website:

Requirements to install all Final Cut Studio applications

* Mac computer with an Intel processor
* 1GB of RAM (2GB of RAM recommended when working with compressed HD and uncompressed SD sources; 4GB of RAM recommended when working with uncompressed HD sources)
* ATI or NVIDIA graphics processor (integrated Intel graphics processors not supported)

Very nice to spend 1300 bucks on a MacBook PRO and not be able to use some PRO applications. I PROtest.

Yes, that's the situation right NOW. That does not mean it won't change with the introduction of the Sandy Bridge integrated GPU, which is a lot better than previous Intel integrated GPUs. Just wait until the computers are actually announced before making judgements like this…
 
Benchmarks put the Sandy Bridge IGP right around the 320M's performance.

First people complained about the last-generation 13-inch MBPs not using Nehalem and sticking with the Core 2 Duo in order to keep the 320M, and now they're complaining about dropping the 320M in exchange for a much more competent CPU? People are hard to please.

There are 'different' sets of people making the comments, dude.

Some would rather have the faster cpu, others would settle for a little less cpu in exchange for a better gpu.

Just because different people prefer different things doesn't mean that 'people' are hard to please... it just means there are different opinions out there.

Geeez! :rolleyes:
 
Light Peak and FireWire have the fewest syllables, so are the best names to use, IMO.

Fire-wire
Light-Peak

You-ess-bee
Thun-der-bolt

Still, USB is the easiest to type.
Not as easy as "TB."

TB port, TB cable, etc. Hey, we're all ADD on the net, and TXT and tweet character limits demand all these elisions. We have tons of Mac-specific acronyms on these forums. Who wants to make 11 keystrokes over and over when "two will do"?

TTYL from my new MBP with SSD, TB, SB CPU & AMD GPU homies....
 
Is the Sony VAIO Z still the best 13-incher in the business?

It is disappointing that, after all Apple did to elevate the power and speed of the MacBook Air, they didn't elevate the 13" MacBook Pro to its true status as a Pro machine: full SSD, superb (not just good enough) discrete graphics (facilitated by removal of the optical drive).

I don't know if product development at Apple knows that a higher-end MBP 13 with substantial discrete graphics could be a big winner among the gaming crowd who want to play graphics-intensive FPS titles like Modern Warfare, SOCOM 4, or Crysis 2 on the Mac platform, Boot Camp or native, but don't want a bulky 15 or 17.

I think "thin" should not be the design principle for the MBP line. We already have "thin" in the fantastic MBA line. We don't mind a little thickness, or the price. We do mind the size of that screen. We want to carry this thing around town in a sling bag. I guess the Sony VAIO Z is still the best full-featured 13" laptop (quad SSD RAID 0, Blu-ray writer, 1GB GPU, Core i7) in the business. Too bad it runs Windows.
 
SlickShoes said:
I know that recently, when getting cards sorted for my desktop PC, I found that power consumption was far greater on NVIDIA cards… [snip]

You have no idea what you are talking about.

In the desktop market, NVIDIA is destroying AMD at the moment in both price and performance. The GTX 580 leaps ahead of the 6970.

Also, the new GTX 560 is the best bang for the buck you can get. It even outperforms the 6950 and the 6970 in many games, and it's an easy 1 GHz overclock.

AMD was great with the 5 series, but they've lost a lot of momentum now.
 
Possibly some misunderstanding about Thunderbolt?

I read in the article about Thunderbolt on the main MR page that the name is an Intel trademark.
Searching about it further, I saw this article about Intel's ThunderBolt:
http://savolainen.wordpress.com/2008/04/27/intel’s-mobile-strategy/
"Thunderbolt will be available soon in either 2 GB or 4 GB densities and may be combined with additional Intel BGA NAND packages to extend the density to 16 GB." (that was way back in 2008!)
But it's all about NAND...
Could it possibly be that this Thunderbolt is referring to a separate SSD for the OS?
 
You have no idea what you are talking about… [snip]

The Mac Pro is using AMD Radeon. :)
 
So I'm looking to buy the new MBP, and I will occasionally use it for some gaming, Call of Duty specifically. Will I be better off with the 15" with a proper graphics card, or will the 13" be okay?
 
Rumor: Quad-core i7s for the 15/17-inch models. 6950M graphics.

http://hardforum.com/showthread.php?p=1036888693


This user posted the same thing on MacRumors (MBP forum). The post, however, disappeared while he was offline - I checked his profile. It was not moved to the Wasteland, and it was not merged with another thread. Is it possible MacRumors got a cease and desist?

I'd call the guy a troll - but he has been a member here for a few years and on HardForum for over 5, and from a quick look at his posts he doesn't seem like the troll type. The specs sound ridiculous - but I think they could be possible if Apple removes the optical drives on the 15/17, uses the new battery technology, and increases airflow.

Thoughts?

The specs indeed sound ridiculous.

Sadly, ironically, he probably got the 256MB VRAM right on the lower-end video card. That's right: I bet Apple is still gonna offer a lousy 256MB of VRAM on the base 15" model so that you have to upgrade to the costlier higher-end 15" or 17" in order to get 512MB of VRAM or more.
 
Just got back from my local Apple store...

The Apple sales guy was trying to help a student choose a new computer. She repeatedly asked if there was a 13" MacBook Pro, and he kept saying it was out of stock. I should have told her that a new one is coming out tomorrow with a THUNDERBOLT port! :D

I think she may have gone with the white MacBook though... Apple sales guys are so tricky...
 



[Image: AMD Radeon logo]


While we've heard a lot about Apple's new low-end 13-inch MacBook Pro, there's been hardly any information about the upcoming 15" and 17" models. CNet provides a rundown of what they've heard about today's announcements.

First, as we've heard, Light Peak is officially being branded as "Thunderbolt"; we've already seen the photos showing it on the 13" MacBook Pro, and it will be found across the new MacBook Pro line. Next, as we knew, Intel's latest Sandy Bridge processors will be used in the new MacBook Pros, and the 13" MacBook Pro will only use Intel's integrated graphics chip.

CNet, however, also reveals that the 15" and 17" MacBook Pros will use AMD (formerly ATI) discrete graphics cards alongside Intel's integrated solution. CNet offers no specifics about which graphics cards Apple will be offering. Current 15" and 17" MacBook Pros have NVIDIA graphics cards built in.

Article Link: MacBook Pros with Thunderbolt, ATI/AMD Graphics?

As long as they don't generate the kind of heat that the Mobility Radeon X1600s did in the first- and second-revision (2006: Core Duo, then first-gen Merom Core 2) MacBook Pros, I'm all for it.

So the 13" PRO will have Intel IGP? :eek:

Such a deal...

:rolleyes:

The 13" Pro has always been (at least since the white Unibody MacBook) the biggest rip-off in Apple's entire Mac line. Period.

The "refresh" is looking sweeter with each leak.

It's kinda hard to mess up a 15"/17" MacBook Pro release.

I have a Mobility Radeon 5870 in my Asus G73JH and have had problems with it (GSOD), so I hope this time they took the time to test it properly.

How is it with Apple and drivers? Do they use generic drivers or their own modified versions?

Apple works with the GPU manufacturer. Recently, AMD has been more responsive than NVIDIA about putting out up-to-date drivers for their Apple-supplied GPUs.

Can anybody shed some light on which generation of mobile GPUs could be included? I'm really surprised drivers haven't leaked from OS X betas...

NVIDIA GeForce GT 4xxM or 5xxM where xx is either 25, 30, or 35, or an AMD Radeon HD 6xx0M where xx is either 65, 67, 75, 77, or 83 (skeptical as all hell on that one though).

I was hoping the 13" would get better graphics. I had heard good things about the Intel IGP until recently, when I heard it is less powerful than the NVIDIA 320M.

I wonder how quickly Apple will update its MacBook Air site, which says both:

"The new MacBook Air features the fastest integrated graphics on the market. The NVIDIA GeForce 320M graphics processor with 48 processing cores . . ."

and

"MacBook Air features the NVIDIA GeForce 320M graphics processor — the same one used in the 13-inch MacBook Pro. "

Apple might try to position the Intel IGP as faster than the 320M--you think?

I've been wondering how many pages will get refreshed in addition to the MacBook Pro page. I was shocked to find that, this time last refresh, they even edited their Snow Leopard system requirements for OpenCL to include the 320M.


This is ********. Give the 13" like 256MB of low-voltage discrete VRAM. Instead, we get graphics slower than last gen (on medium settings) that also take almost 400MB out of my RAM.

That exists? Or rather, does that exist in a form that there'd be room for in a 13" Pro?!

Learn OpenCL? It's a little more clunky than CUDA, but it's a pretty good standard IMO. Also, if you are doing CUDA work, just SSH into a desktop with a card worth using for CUDA... It's really not hard to SSH...

Nooooooooooooooooo!
I need a 13" with decent graphics, not a 15"!!! :( 15" is too big for me.

It's really not that much larger, honestly.

6830 isn't the highest end. It has 6850, 6870, 6950 and 6970 above it. The 68xx cards are 58xx rebrands.

Nope, the 68xx cards are the successors to the 5770 and the 5830 cards, no rebranding. The 67xx series, on the other hand, is a rebranding. Though I think that only applies to desktop cards.

I've been reading all these rumors for the last couple of weeks and had managed to convince myself that there was no good reason for me to upgrade, BUT if this is true I'll be getting a 15-incher, especially since mine evolved into this over the last few days

...And you don't have AppleCare?

Newbie question:
Compared with the current Mac 13" model, is the rumored new model better, similar, or worse?

Worse on graphics, better on CPU. Though I'm still skeptical that the model we saw is going to be the one announced tomorrow.

I seriously think the 13" MBP should be renamed the "Aluminium MacBook" or something like that. It really doesn't live up to the "Pro" tag.

It makes sense, especially given the negative tech PR associated with another Intel IGP.

It has already been confirmed that the leaked 13" with the Sandy Bridge integrated graphics is called a MacBook Pro. Check this post. Although the pic is not showing up for me atm, it has the new MBP box with Thunderbolt and all. :p

I for one won't be upgrading my 2010 13" MBP to these. I'll most likely go with the 15" one next time I upgrade as I'm disappointed with how "Pro" the 13" is.

Edit: Also, I can't see them dropping the white plastic MacBook; the chix dig it, and some other people too. It's a simple computer for simple people, kinda. For people who'd use the SuperDrive, the Air isn't the best option, so I do think it's nice to have a very basic model in the lineup even if I'd personally never buy one.

Edit 2: Quote the post and paste the link into a new browser tab if you can't see the pic.

(a) Pix can be deceiving; I'm still not ruling out a Photoshop job. (b) The 13" Pro has always been a joke. Though to be fair, the 320M is the closest it ever got to even semi-Pro stature, relative to every other IGP that came before it.

Wow, AMD GPUs? I guess Apple really doesn't want me as a customer again :rolleyes:

ATI/AMD still can't write drivers to save their lives. AMD does make good main CPUs though.

I take it you're not a gamer? AMD was the only one of the two providing driver fixes for things like Steam (the Source engine games) and StarCraft II on the Mac and having those drivers work on the first go. No CUDA cores, yeah, that sucks. AMD has Stream processors, and for everything else there's OpenCL, which ought to be mass-adopted anyway.
 
Just got back from my local Apple store… [snip]

There's always a thin line between tricky and clueless.

Guess which one it was in your example, IMHO.

Tod
 
Taken from Apple's website:
* ATI or NVIDIA graphics processor (integrated Intel graphics processors not supported)
Very nice to spend 1300 bucks on a MacBook PRO and not be able to use some PRO applications… [snip]

I don't understand what you're talking about. The current MBPs already have NVIDIA graphics cards in them.
 
Sadly, ironically, he probably got the 256MB VRAM right on the lower-end video card… [snip]

Tests of the GeForce GT 330M showed that the extra VRAM made no noticeable real-world difference unless you were using a 24" Apple external monitor; otherwise, performance for most things was more or less on par. Somehow, I doubt a changeover to AMD would change this.

I don't understand what you're talking about. The current MBPs have NVIDIA graphics cards in them.

The 13" MacBook Pro rumored to be released tomorrow, apparently doesn't.
 