So, if you want support for the current OpenGL version on your Apple computer, simply boot Windows 7.

What's the problem?
Well, I'm not a content developer, so it's not as though I'd need a Quadro, and I use a PC with a GTX 470 for gaming (not that there are any OpenGL 3.0 games). But it's been pointed out that some 3D development tools are starting to move to OpenGL 3.0. Ars Technica noted that MARI by The Foundry requires at least OpenGL 3.0, and as such it only works on Windows and Linux, whereas the rest of The Foundry's products typically support the Mac. For those affected, it would be nice for Apple to build up a large installed base of OpenGL 3.x-capable Macs (i.e. do it in Snow Leopard rather than waiting for Lion adoption), so that OS X is ready when developers move their tools to OpenGL 3.x, instead of Apple waiting for a critical mass of OpenGL 3.x applications that can't run on Macs before adding support. I'm sure developers who bought Mac Pros and Mac Quadro GPUs weren't planning on booting into Windows to use some of their development tools.

On an idealistic level, I thought one of OpenGL's main features was cross-platform compatibility, which in practical terms means hoping Windows developers use OpenGL engines for their games instead of DirectX, so that it's easier for them to also support OS X and Linux. But that cross-platform purpose seems blunted when one of the platforms you'd want to reach, OS X, isn't keeping up with even the two-year-old OpenGL 3.0 standard. With Valve reintroducing game developers to Mac gaming, this is the perfect time to push OpenGL: a developer using, say, an OpenGL 3.1 engine can target Windows (including Windows XP), OS X, and Linux from a single code base, rather than shipping a DX9 version for Windows XP, a DX10/11 version for Windows Vista and 7, and, if we're lucky, an OpenGL version for OS X and Linux.
 
NVIDIA Quadro...Hmm.

Is it still possible to make a Quadro out of a GeForce by simply changing a few solder joints?
 
3D graphics GPUs are massively parallel, with dozens or hundreds of cores running very limited instruction sets that are highly suitable for vector calculations but not so much for general computing. That said, CPUs and GPUs seem to be converging, so years from now I wouldn't be surprised if they merge. Then we can go back to software rendering. (Which sounds bad at first when you're used to GPUs, but in reality not only would speed not be an issue, it would also let you ditch OpenGL/Direct3D, with their attendant limitations, driver problems, etc., and just do whatever you want. That's right: Mac versions of games would finally perform the same as Windows versions, and be a lot easier to port.)

--Eric

Not different. Just a very well-defined subset. By creating a functional unit optimized only for that specific 3D graphics subset (and not for general-purpose, do-everything capability like a CPU), you can make each processing unit a lot smaller, and thus fit 256 processing units on a chip instead of just 4 (as in a quad-core CPU) or 8.



You can, but it would take 64 times as many expensive quad-core CPU chips to get the same number of processing units. Actually, you might be able to get by with only a half dozen or so dual quad-core Mac Pros, since each CPU core can do more triangles per second than the much smaller and simpler GPU core.

But one card versus several boxes wins on cost, as long as you stick to that well-defined subset of operations (in OpenGL and OpenCL).
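The back-of-the-envelope numbers in the two posts above work out like this; the per-core speed ratio is an assumption for illustration, not a measurement.

```python
gpu_cores = 256            # simple processing units on one card
cores_per_cpu = 4          # a quad-core CPU chip

# Matching the raw unit count takes 64 quad-core chips:
chips_needed = gpu_cores // cores_per_cpu
print(chips_needed)        # 64

# But if each big CPU core pushes, say, 5x the triangles of a tiny
# GPU core (assumed ratio), a dual quad-core Mac Pro (8 cores)
# stands in for 40 GPU cores:
assumed_speed_ratio = 5
mac_pros_needed = gpu_cores / (2 * cores_per_cpu * assumed_speed_ratio)
print(mac_pros_needed)     # 6.4, i.e. "half dozen or so"
```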

Graphics functions are quite CPU-intensive, and they're almost completely separate from the OS's main processes. That's why it makes sense to have a dedicated processor (GPU) and dedicated RAM for such tasks. Basically, in theory, yes, the CPU and system RAM could render everything, but they're already extremely bogged down by the rest of the OS. Having an über-powerful (and in this case extremely precise) GPU with video RAM lets the machine process extremely complex scenes without slowing everything else down. There's a lot more going on under the hood than one might think!

Thanks! :D
 
What does that cable do besides make an odd design statement? Is it supposed to say: all those unneeded 890,000,000 triangles will now be squeezed through this ugly cable?
 
This would have been helpful to me earlier in the year. However, as a professional 3D artist, I had to switch platforms back to Windows 7 64-bit for numerous reasons. In part, it was because there were no NVIDIA cards when I wanted to upgrade, only ATI, with the 5870 offered as the top end.

I think this card is great for those still doing 3D art on a Mac Pro. The programs I use, such as Maya, Vue, etc., all take a great deal of power and use the CUDA cores. What some don't realize is just how limited the Mac Pro has become: this is the first time in well over a year that professional 3D artists have had an option beyond the poor video card offerings typically available for the Mac Pro.

I continue to see Apple as more consumer-focused than professional, which is why I went back. However, I recommend Apple to anyone who is a typical computer user. It's just not for people like me, because it is too little, too late now.

The Quadro 4800 Mac Edition has been available for some time as an aftermarket offering, including through the Apple Store; the Quadro 4000 Mac Edition replaces it with a modest speed bump and a few hundred dollars in savings. What's nice about the Mac Pro, and has been for a while, is that Apple doesn't skimp on the power supply like some lower-end desktops, so I would be able to put two Quadro 4000s in and still not exceed the 300W available on the PCIe bus.
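For the power-budget point, the arithmetic is simple; the per-card figure below is an assumed typical board power for the Quadro 4000 (around 142W is the commonly quoted number), not something stated in this thread, so check NVIDIA's specs before relying on it.

```python
pcie_budget_w = 300   # power available on the Mac Pro's PCIe bus (per the post)
card_draw_w = 142     # assumed Quadro 4000 board power; verify against NVIDIA specs

# Two cards still fit under the budget with headroom to spare:
print(2 * card_draw_w, "W of", pcie_budget_w, "W")
print(2 * card_draw_w <= pcie_budget_w)  # True
```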

At least a few renderers will take advantage of a large number of CPU cores and GPUs; I'm interested in KeyShot Professional, as it supports the MCAD platforms that I use and runs well in a mixed OS X/Windows 7 environment.

I'm sympathetic to your situation, and while it isn't a necessity that I even have a Mac Pro, with an FCP update on the horizon for early next year (Steve's text!) the Mac Pro platform seems to be the best future solution for me, eliminating a couple of PCs that I need today for MCAD.
 
What does that cable do besides make an odd design statement? Is it supposed to say: all those unneeded 890,000,000 triangles will now be squeezed through this ugly cable?

If you'd read the thread, you'd have noticed the same question was asked and answered several times...

"3D Stereo Synchronization
Enables robust control of stereo effect through a dedicated 3-pin mini-din connection between the graphics card and 3D stereo hardware. Support via optional 3-pin mini din connector."
 
Nice to know the Quadro is still alive in the Mac Pro. I plan to get a Mac Pro sometime after I finish with my military ********s, and I plan on getting a pro card to help with some heavy 3D scenes I make in Vue.

I see a lot of confusion about what these pro cards are and why they cost so much.

These 2 articles will explain a lot:

http://www.xbitlabs.com/articles/video/display/quadrofx-firepro.html
http://www.xbitlabs.com/articles/video/display/nvidia-quadro-5000.html

And remember, the days when a Quadro was the same as its gaming counterpart are far, far behind us; you can't softmod, or mod the hardware, to turn a gaming NVIDIA card into a Quadro.

Have a nice day!
 
What I don't get is how hardware can be optimized for graphics. Are graphical functions different at the hardware level from non-graphical functions? Why can't someone just have a very powerful CPU with lots of RAM and dedicate a certain percentage (depending on demand) of the CPU and RAM to graphics? Why would that (it's obviously the case) result in lower performance? It's all just bits; I would think bits resulting from graphical operations are processed the same way as any other bits...

x86 architecture is CISC and microcode based.

Due to the relatively limited number of graphics operations, you can hardwire those into the silicon, effectively parallelising many operations.

We see this all the time in FPGA programming. Sure, your 3 GHz Xeon can do a 4096-point FFT, but it does so in software, using loops.
A 200 MHz FPGA can just set aside the gates and do that FFT in a single clock cycle, effectively making it a few-hundred-core CPU.

The disadvantage of hardwiring instructions: it uses silicon area whether or not you are using the instruction.
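To make the FFT point concrete, here is what the "software way" looks like: a minimal recursive radix-2 Cooley-Tukey FFT in Python, working through the butterflies one at a time, exactly the serial work an FPGA or GPU would instead lay out as parallel gates (a sketch only; the input length must be a power of two).

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT the software way: recursion and loops
    on a general-purpose core, one butterfly after another."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):            # each butterfly runs serially here;
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw          # dedicated hardware would compute
        out[k + n // 2] = even[k] - tw  # them all in parallel
    return out

# Sanity check: the FFT of an impulse is flat (all ones).
print(fft([1, 0, 0, 0]))
```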
 
x86 architecture is CISC and microcode based.

The original x86 CPUs were CISC, but today's Intel CPUs are a hybrid of CISC and RISC.
 
Folks, please learn to differentiate between workstation-class cards and regular video cards.

This is not for gaming. It's for quality over performance.

About $800.

EXPLAIN THIS TO ME PLEASE?

Is the former for Photoshop/video editing/etc. and the latter for games???
 
Please answer dumb question

Honestly, it won't run it fine. Workstation cards suck for gaming.
OK, I admit it: I'm not a gamer. Granted, you don't want to pay extra for a "Photoshop card" you don't need, but based on the specs, why wouldn't it also be just fine for gaming? Can you give some examples or quantify this somehow?
 
Maybe, but this graphics card targets professional customers, or maybe some rich gamers.
 
roland.g said:
A graphics card that runs more than a MacBook Air. :eek:

The pricing is just a pure ripoff. I wonder how much more it costs to produce than a MacBook Air, or what the engineering labor to design it cost.
 
Geez, I'm sorry I opened this whole Gaming vs. Workstation can of worms. I know there's a difference between the two; I sold AutoCAD workstations for a living, for a brief time. I thought the "/snark" I put in my initial comment made it obvious I was joking about the Crysis thing.

Again, I'm just happy that the Mac Pro hasn't been completely ignored by hardware vendors.
 
if only Apple had OpenGL 4 to go with it...

oh, wait. We don't even have 3 yet. :rolleyes:
Hopefully they'll just skip to 4. Considering that good OpenGL support on the Mac is crucial to all things graphics-related, I can't understand how they're lagging so far behind. But then I suspect iOS has everything to do with it, since they're focused on mobile OpenGL support.

OpenGL 4 is meant to have a more standardised interface across the full and mobile versions, though, I think? So here's hoping it may be part of 10.7! Valve and other game developers will hopefully be pushing Apple hard for this, and if Apple is focusing on consumers like it claims, then it really needs to get off its ass regarding gaming.
 
Hopefully they'll just skip to 4...

No, soon they will change the Apple.com homepage to read something like:

"Don't worry, we've got it under control.
Check back here tomorrow for exciting news from Apple!"

and there will be some kind of cryptic graphic and the Macrumors nerds will froth at the mouth with conspiracy theories linking the symbologies together, postulating about streaming iTunes, while someone from some news agency will "confirm" that the "exciting news" has something to do with OGL, and thus, OpenGL.

Then the time will come, and Apple will put up some kind of animation that says:

"OpenGL.
2.5.
Now on the Mac.
This changes everything. Again. Back the way it was."
 
High-end vs gaming cards

People doing high-end video work that requires a lot of processing power will use these cards: programs like Maya, Houdini, DaVinci Resolve, Autodesk Smoke, RealFlow, V-Ray, RenderMan, Vue, possibly boujou, pieces of Adobe's CS5 suite, etc.

Pretty much anything dealing with 3D content creation, or video creation where you are doing a lot of editing.

I don't doubt that these high-end cards run Autodesk and Adobe CS5 etc. better. What I don't understand are the claims posted here that these cards are actually bad for games. :confused: From the specs, it appears that the main differences are more memory and, in some cases, enabling the high-performance double-precision floating point that has been deliberately crippled in the "gaming" version. It isn't clear why adding these features would make the card perform more poorly in games. Can someone in the know educate us non-gaming heathens?
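As a side note on the double-precision point: the snippet below uses only Python's standard library to mimic single-precision rounding, just to illustrate the accuracy gap that makes DP matter for workstation workloads. It says nothing about game performance either way.

```python
import struct

def to_f32(x):
    """Round a 64-bit Python float to IEEE 754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Accumulate 0.1 a hundred thousand times in both precisions.
step32 = to_f32(0.1)         # 0.1 is not exact in either format,
total32, total64 = 0.0, 0.0  # but single precision drifts far more
for _ in range(100_000):
    total32 = to_f32(total32 + step32)
    total64 += 0.1

print(total32, total64)      # single drifts visibly from 10000.0
```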
 
There is already support for the GTX 460/470/480.

1. I am not sure what you mean by "do 3d glasses"
2. assuming you are talking about wearing special glasses that allow you to see stereoscopic imagery.... why would a professional workstation card have this feature?
3. what does the hardware have to do with what the software is displaying that would allow you to see stereoscopic imagery?

There are a few people out there who might need 3D glasses for editing, modeling, and animating those 3D movies, say at Pixar. It might also help those using 3D CAD; I've seen repair simulations using stereoscopic 3D.

That is a very poor generalization. In most cases it's the software, with improper process weighting, and the hardware gets the blame.

Can you explain that in a bit more detail? Why would the software behave differently like that with a workstation card?

He just doesn't like the idea that any graphics card - OF ANY KIND - *can* cost over about $300.

It's not a consumer card, get a GeForce that fits your budget. Some niches require something specific to their needs, and this is intended to fill that.
 
Good to see some new Mac Pro options, even if it is only for the minority. After all, options are what is lacking in the Mac world.

Very clever avatar; for a moment I thought I had a bug in my system. Anyway, the card seems pretty nice although I'm not sure what realized value it may have on my day-to-day activities.

Say, anyone know what that odd cable arrangement is all about? It almost looks like it's feeding into itself.
 