I just stated I use an iMac... I just don't consider it a Professional machine because it isn't one.

I'm a professional, but there is nothing about the machine that is professional. I don't see why you find this so difficult a concept to grasp.

So you're the one who dictates what counts as a professional tool for all of us, then? Well, good for you. Your mother must be proud of the power you've accumulated over your professional career, telling other professionals that what they use doesn't count as a professional computer because you, the one and only entity who gets a say, decided so.

The iMac has Thunderbolt; the Mac Pro does not. Thunderbolt is a professional interface. The iMac has SATA 3.0; the Mac Pro does not. SATA 3.0 is a professional interface. Therefore, your argument is invalid on two counts.

Look, I've worked in my industry for 20 years. My colleagues and I use iMacs to professionally produce product for our clients and various networks. If you're so bothered by the fact that an iMac can be used by consumers and professionals alike to get real work done, then I suggest you get over yourself and move on to more productive arguments. Unless you want to unzip your pants, pull out a ruler and just measure it out, like you seem to be doing in this thread.
 

+1 to this.

The term "Professional" is hugely variable and to pigeon-hole someone based on the specs or name of the computer they use to be productive at their job is highly short-sighted.
 
That a professional is able to make do with a piece of equipment does not make it pro.

Wrong. The company forces the professional to "make do"; don't point the finger at professionals who produce consistent and growing work. Professionals don't make do, and when we do, and cut corners, we ultimately suffer and ruin the entire industry we're in. So yeah, professionals are not making do; they are making great use of the technology they're given.

The company in question, the "big red crispy shiny fruity" company we all pay into, makes its consumer products out to cater to professionals, but in reality the ad reads "The pro in everyone", which means NOTHING. So yeah, maybe the "Pro" at the end of MacBook is just there to make the buyer feel higher up in status?
 
OK, here are some hard numbers from notebookcheck.net.
You can check the reviews of the K2000M and the GT650M (the same chip architecturally, but the K2000M can take advantage of the pro drivers) with their respective benchmarks. You can find them here: http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html
and here: http://www.notebookcheck.net/NVIDIA-Quadro-K2000M.76893.0.html

To recap:

K2000M
  1. SolidWorks: 34.1 fps
  2. Pro/E: 13.3 fps
  3. LightWave: 45 fps
  4. Siemens NX: 24.8 fps

GT650M
  1. SolidWorks: 7.8 fps
  2. Pro/E: 1.2 fps
  3. LightWave: 14.5 fps
  4. Siemens NX: 3.1 fps

You can do the math on the percentage difference (rough sketch below).
But hey, Diablo III runs faster on the GeForce, lol...
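
For anyone who doesn't want to do the math by hand, here's a rough sketch (Python; the figures are just the notebookcheck numbers quoted above, nothing else) that prints how much faster the K2000M is in each benchmark:

```python
# Rough sketch: speed-up and percentage difference between the K2000M and
# GT650M viewport benchmarks quoted above (figures from notebookcheck.net).
k2000m = {"SolidWorks": 34.1, "Pro/E": 13.3, "LightWave": 45.0, "Siemens NX": 24.8}
gt650m = {"SolidWorks": 7.8, "Pro/E": 1.2, "LightWave": 14.5, "Siemens NX": 3.1}

for app in k2000m:
    quadro, geforce = k2000m[app], gt650m[app]
    speedup = quadro / geforce                  # how many times faster the Quadro is
    pct = (quadro - geforce) / geforce * 100.0  # percentage difference
    print(f"{app:>10}: {speedup:4.1f}x faster ({pct:+.0f}%)")
```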

Guys, I think most people here are still missing the point. Check out the above links, look at the benchmarks, and you will see that in their current iteration, the existing cards are only better for one thing: GAMES!!!!!!........

And since we all agree here (I hope) that the Mac is not the right platform for games, why not agree that a workstation card would definitely serve more people for the (graphics- or engineering-related) work they get paid to do? Especially when other companies can fit those same cards in laptops just as thin and for the same cost?

Why so much resistance for requesting the Mac laptops to appeal to a wider range of professionals?
 

I can play games on my Mac just fine. All the games I ever want to play run on my Mac at 60 fps or better, with no complaints: BioShock, Kerbal, Civilization V, etc. No problems here.

Every professional I know and work with has a MacBook Pro and no regrets. The current offering of GPUs is more than adequate for everything we do, since we aren't in need of render farms and supercomputing clusters. I think you assume we want more than we actually need, but that's a conclusion you've reached on your own, without any real research, evidence, surveys, or input from "professionals".
 

I'm glad you can play your last-generation video games on a Mac at 60 fps... but a lot of people buy a Mac to do some work on. Any 3D (or graphically intensive) application runs like a snail on a rMBP compared to the many thin laptops other companies offer, so why not have another OPTION for those of us who do that kind of work?
I never requested that they eliminate the existing cards... I just requested an additional option for the growing segment of my industry.

P.S. The people who make the games you love to play also use applications such as Modo, LightWave, 3ds Max, etc. Modeling does not require render farms, just a decent workstation card. And yes, some modelers have to work on the go and actually like OS X too!
 

If you're buying a rMBP to do heavy LightWave and 3D modeling work on, you have bigger problems than whether that notebook can run it properly, primarily the fact that you didn't do any research into what kind of tool you actually need to do the job properly. Apple won't make laptops that do that kind of heavy lifting. They never have and they never will. Sure, it can do light stuff in that field, maybe some spec and pre-viz work, but you aren't doing WETA-class, production-level crunching, much less final-pass shading, lighting and rendering, on a laptop. Those are workstation-class tasks. Get the new Mac Pro in November when it comes out and use it like Pixar does if you want that kind of performance.
 
So all these companies like Dell, Lenovo, HP, Boxx, etc. that DO in fact have mobile workstation offerings are delusional and appealing to a delusional demographic...
 
Honest question.

If you're doing a workload heavy enough to require a workstation that can cost $4k+, why would you expect a small laptop that costs half that to be as good?
 
So all these companies like Dell, Lenovo, HP, Boxx, etc. that DO in fact have mobile workstation offerings are delusional and appealing to a delusional demographic...

Then buy one of those. They're twice as thick and their battery life is about half an hour when working in those applications. You obviously don't understand the goals of Apple as a laptop maker. They aren't after that market and haven't been in a long time. No bitching or moaning of yours can or will ever change that. If you REALLY want to do 3D modeling on a laptop, there are plenty of other alternatives available to you. Trust me, I want that kind of performance from Apple too, but I'm mature enough to understand that Apple won't make a 2" thick notebook that panders to the likes of your kind and other 3D VFX professionals too.
 
Honest question.

If you're doing a workload heavy enough to require a workstation that can cost $4k+, why would you expect a small laptop that costs half that to be as good?

Honest answer.

I desperately need mobility. I didn't request the performance of a $4k workstation (I already have one for that)... I requested a laptop workstation-card OPTION similar to this: http://www.dell.com/learn/us/en/04/campaigns/precision-m3800-workstation.

I'm not asking for anything that doesn't exist or hasn't been done before, and I'm sure I'm not the only one in the industry who likes the rMBP form factor and OS X...
 
Honest question.

If you're doing a workload heavy enough to require a workstation that can cost $4k+, why would you expect a small laptop that costs half that to be as good?

My point as well.

----------

I'm not asking for anything that doesn't exist or hasn't been done before, and I'm sure I'm not the only one in the industry who likes the rMBP form factor and OS X...

You're looking in the wrong place, friend. Apple will never give that to you. They are a thin-and-light laptop maker. You can't break the laws of physics to get your precious workstation-class GPUs into a retina-MBP-sized notebook.
 
Honest answer.

I desperately need mobility. I didn't request the performance of a $4k workstation (I already have one for that)... I requested a laptop workstation-card OPTION similar to this: http://www.dell.com/learn/us/en/04/campaigns/precision-m3800-workstation.

I'm not asking for anything that doesn't exist or hasn't been done before, and I'm sure I'm not the only one in the industry who likes the rMBP form factor and OS X...

I understand that. I really do. But the problem comes in many ways.

Take that Dell, for example: an i7 plus a workstation-class GPU running at a 45W TGP. The i7 is Haswell, where the max TDP can be 47W. That's over 90W combined; Ivy Bridge + dGPU was about 90W, and Haswell is supposed to be the power-saving generation, etc.

If you actually use that laptop for work on battery (hitting that GPU), the battery will die pretty fast.
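
Just to put rough numbers on that, here's a back-of-the-envelope sketch; the wattage and battery figures below are assumptions for illustration (the rated TDP/TGP from the discussion above, a guessed overhead for the rest of the system, and a guessed battery capacity), not measurements:

```python
# Back-of-the-envelope estimate of battery life when CPU and GPU both run
# near their rated power limits. All figures are illustrative assumptions.
cpu_tdp_w  = 47.0   # quad-core Haswell i7, rated TDP
gpu_tgp_w  = 45.0   # workstation-class mobile GPU, rated TGP
other_w    = 10.0   # assumed draw of display, RAM, SSD, etc.
battery_wh = 61.0   # assumed battery capacity in watt-hours

total_draw_w = cpu_tdp_w + gpu_tgp_w + other_w
hours = battery_wh / total_draw_w
print(f"Total draw ~{total_draw_w:.0f} W -> roughly {hours * 60:.0f} minutes on battery")
```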
 
I find that

1. I have to repeat myself a lot to thoroughly reply to your posts

2. You offer nothing of value or of remote intelligence to the discussion

.....so go play some Bioshock, it will be best for all.

I've said repeatedly that you are sniffing the wrong **** pile. Apple will never give you what you want. Buy the Dell if it gives you what you want. You are on a quixotic mission, my friend. But by all means, keep chasing those windmills on your mule, or have you not read literary works? In which case let me spell it out for you:

APPLE.....WILL.....NEVER.....MAKE.....A.....WORKSTATION....CLASS....NOTEBOOK.....UNTIL....WORKSTATION....CLASS....GPUS....CAN.....FIT.....INSIDE....A.....RETINA....MACBOOKPRO....CHASSIS....WITHOUT....OVERHEATING....OR.....REDUCING....BATTERY....LIFE.....TO....THE....DURATION....OF....TIME....IT.....TAKES....TO.....TAKE......A......PISS....

Until then, relax and buy your Dell, if it's ever released. Remember how often they've pre-announced things they couldn't deliver as specified. And I wish you well in your endeavors. This argument has reached its pinnacle of intelligence (mostly from my end), so I bid you a fair adieu.

P.S. No time for games as I need to edit a feature on my MacBook Pro. You thought I couldn't do that....acting....
 
Apple probably puts "gaming" GPUs in its systems because these chips have the horsepower for your typical professional tasks as well as the ability to play games. Try running games on a Quadro or FireGL; from what I've been told, the end results aren't too great. Using "gaming" GPUs makes the systems good for both work and play.
 
Guys, I think most people here are still missing the point. Check out the above links, look at the benchmarks, and you will see that in their current iteration, the existing cards are only better for one thing: GAMES!!!!!!........

Well, you are forgetting the other side of the equation: why does CAD software perform so much better on the workstation GPU? If it's because of artificially crippled double-sided rendering, then it can easily be fixed with a few additional lines of code. I am not a big fan of conspiracy theories, but this is very likely to be one ;) So why are you complaining about Apple and not about Nvidia/ATI, who 'trick' you into paying more money for essentially the same product?


And since we all agree here (I hope) that the Mac is not the right platform for games, why not agree that a workstation card would definitely serve more people for the (graphics- or engineering-related) work they get paid to do? Especially when other companies can fit those same cards in laptops just as thin and for the same cost?

That's exactly the problem: a MacBook with a workstation GPU would cost at least several hundred dollars more; these GPUs have always carried a hefty price premium. It might be good for the engineers, but it would make the MBP much less attractive for the remaining 95% of potential customers who don't need a workstation GPU. Now, why doesn't Apple at least offer a Quadro as an option? I have no idea, honestly. Then again, it's possible that the OS X GPU drivers are not crippled the way the Windows drivers are. Did anyone actually do any benchmarks on that?

----------


Let's wait for the specs, shall we? It sounds very impressive indeed, but I am always sceptical about great announcements like that. For all we know, it might use a ULV CPU and a K1000M or something...

----------

Apple probably puts "gaming" GPUs in its systems because these chips have the horsepower for your typical professional tasks as well as the ability to play games. Try running games on a Quadro or FireGL; from what I've been told, the end results aren't too great. Using "gaming" GPUs makes the systems good for both work and play.

Workstation GPUs are almost as good at games (again, the trick is in driver optimisation). The reason this myth is still around is that workstation GPUs usually pack much less horsepower for a similar price. E.g. a Quadro K2000 costs more than $400, but it's the same GPU as a GT 640 (a $100 part), and both will provide similar performance in games.
 
Could you give some examples? I was thinking along the lines of hidden-geometry culling and emulating double-sided surfaces with two one-sided triangles; that technique works quite well. Of course I am not suggesting any of the common 'illusion' techniques often employed in games, just using the APIs in a smarter way. The 650M can process over 900 million polygons per second; even if you render each line as a quad (two triangles), that is still over 200M lines per second, or 10 million lines rendered at 20 fps. Now, I don't know what the average complexity of the models you are working with is...
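
To spell out that arithmetic (a sketch: the 900M triangles/s figure is the quoted peak, and the 50% real-world factor is my own conservative assumption):

```python
# Sketch of the line-throughput arithmetic above. The peak triangle rate is
# the quoted spec for the 650M; the real-world factor is an assumption to
# stay conservative, since real scenes never hit the peak rate.
peak_triangles_per_sec = 900e6
tris_per_line          = 2      # one line drawn as a quad = two triangles
realworld_factor       = 0.5    # assumed fraction of peak actually achieved

lines_per_sec = peak_triangles_per_sec / tris_per_line * realworld_factor
fps = 20
lines_per_frame = lines_per_sec / fps
print(f"~{lines_per_sec / 1e6:.0f}M lines/s, i.e. ~{lines_per_frame / 1e6:.1f}M lines per frame at {fps} fps")
```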

I did a quick search and found only bits and pieces (xbitlabs.com and tomshardware); I was referring to the (maybe outdated?) discussion that the same scene looks different on different consumer graphics cards. There was some discussion a few years back about things being "optimized" at the driver level to get better benchmark scores by sacrificing image quality (in the hope that nobody would notice)...
Also, Nvidia is advertising the new Quadro 6000 with "precision" results (though they may refer to the ECC bit, not sure).

I remember there were comparisons of the same scene rendered on different platforms where the difference was shown (a pixel-by-pixel comparison; the difference should be 0, but it wasn't across the setups).
By the way, I am not a 3D artist (I did some things a long time ago, but only for fun); if somebody can render a somewhat complex scene on different setups (mobile, consumer/workstation) with exactly the same settings and post the output, I'd happily do the pixel-by-pixel checking!
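
For anyone who wants to run that check, something along these lines would do it; a minimal sketch using Pillow and NumPy, where the filenames are placeholders for two renders of the same scene at the same resolution and settings:

```python
# Minimal pixel-by-pixel comparison of two renders of the same scene
# (e.g. one from a GeForce setup, one from a Quadro setup).
# The filenames are placeholders; both images must match in size.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("render_geforce.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("render_quadro.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                               # per-channel absolute difference
mismatched = np.count_nonzero(diff.any(axis=-1))   # pixels that differ in any channel
total = a.shape[0] * a.shape[1]

print(f"Differing pixels: {mismatched} of {total}")
print(f"Max channel difference: {diff.max()}, mean: {diff.mean():.4f}")
```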
 

Yea it's to do with things such as floating point precision. It's why quadro cards can be worse when running games because they don't take the same shortcuts as non-pro hardware.

The geforce cards have less precision because it doesn't matter for games - no one will notice. But they can cause inaccuracies that cannot be tolerated in various applications. I've never tried to do things such as fluid simulations on geforce cards but I'd imagine that would be one such application where you wouldn't want to deal with geforce cards.
 
I was referring to the (maybe outdated?) discussion that the same scene looks different on different consumer graphics cards.

[...]

I remember there were comparisons of the same scene rendered on different platforms where the difference was shown (a pixel-by-pixel comparison; the difference should be 0, but it wasn't across the setups).

OpenGL is not a pixel-perfect specification, so some differences in results are to be expected on different hardware; this is quite clearly stated in the OpenGL specification. It is true, however, that drivers sometimes take shortcuts to increase performance, so this is a valid concern.

Also, Nvidia is advertising the new Quadro 6000 with "precision" results (though they may refer to the ECC bit, not sure).

I think this refers to improved performance when working with double-precision data (this is indeed a big reason to get a Quadro, but not really useful for CAD).

----------

Yea it's to do with things such as floating point precision. It's why quadro cards can be worse when running games because they don't take the same shortcuts as non-pro hardware.

The geforce cards have less precision because it doesn't matter for games - no one will notice. But they can cause inaccuracies that cannot be tolerated in various applications. I've never tried to do things such as fluid simulations on geforce cards but I'd imagine that would be one such application where you wouldn't want to deal with geforce cards.

I remember doing some tests on shader precision years ago, when I had a GeForce FX card; the results for trigonometric functions were absolutely horrifying. But I think this will have changed by now, and there are also ways to control it. Of course, the problem is that vendors are often not very particular about the specification... What we really need is a new, clean graphics API that gets rid of all the bloat accumulated over the years.

Of course, if you do heavy GPU computation that requires precision, a Quadro is pretty much a must.
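
As a rough illustration of the precision point (a CPU-side NumPy sketch, not an actual shader test; it just shows how single-precision results drift from double precision on trig-heavy math):

```python
# Compare single- vs double-precision results on trig-heavy math.
# This is a CPU-side stand-in for a shader precision test, not the real thing.
import numpy as np

x64 = np.linspace(0.0, 1000.0, 1_000_000, dtype=np.float64)
x32 = x64.astype(np.float32)

y64 = np.sin(x64) * np.cos(x64)
y32 = np.sin(x32) * np.cos(x32)

err = np.abs(y64 - y32.astype(np.float64))
print(f"max error: {err.max():.3e}  mean error: {err.mean():.3e}")
```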
 