As a point of interest I'm looking forward to seeing how the base model iMac Pro compares with a top-of-the-line standard 27" iMac. The price difference may work out to be worth it for "non-Pro" users to buy the Pro model if it means they get a couple more years out of it before upgrading.

Given the growth in the use of iMacs in various creative industries over the past few years, I don't doubt there's a market for this machine.

I'd say that the architecture firms I've been in that run ArchiCad will get them for sure.
 
As I was reading some comments regarding the display, I'd say this is why the Mac Pro would be something I would go for. I can at least choose a top-of-the-line display, as long as it connects with no issues.
Was it Casey Neistat who ended up using his Mac Pro as a file server?
 
Starting at $5,000, then the dongles, since it probably won't have an Ethernet port or any other ports you actually need or want, and soon you're talking real money.

"Dongles." How adorable. Kind of like watchbands. The currency of people not be taken seriously.

What do you need adapters for? SCSI or Centronics?
 
Conversely, there's plenty of pro software out there (which coincidentally happens to be Apple's pro software) that is optimised for AMD.

Also, they did say they'll be making a modular Mac Pro, which I'd bet my bottom tooth means you won't be limited to only AMD GPUs. Something for everyone and every workflow.


I will definitely keep an eye on the modular Mac Pro then. August/September will be magical months. So many high-end platforms rolling out from Intel and AMD. We'll see which route to go: Apple vs. a DIY PC build.
 
I don't see the point of a beefy pro machine that is stuck in the shell of the screen.
Screens have a longer life than next year's graphics cards etc...

Are the RAM and SSD soldered in, and if not, why are they soldered into the regular iMac?
I am serious.

I have the current Mac Pro and I never use it anymore. It's gathering dust while my iPad Pro devices help me get all my important work done.

The future Steve Jobs spoke of has finally arrived.
I'd hazard a guess that if you can get all your work done on an iPad, you are the class of person that didn't need a computer.

Everyone's work is important; curious as to your choice of words...

I couldn't do my work on an iPad. I guess I could call that important to me, but importance does not equal needing a certain class of machine.

To me, the class of machine depends on many aspects:
  • Power required
  • Software availability
  • Hardware aspects available, such as input method and displays
The iPad Pro may be getting there in terms of CPU power, but it lacks the keyboard, mouse, software, screen real estate, etc. for me to get my work done efficiently as a software developer.
I'm not sure I see the point of devices like this anymore. The iPad Pro can literally do it all at a fraction of the cost.
such as?
 
Screens have a longer life than next year's graphics cards etc...
you'll definitely be able to get more than a year's usage out of a GPU in one of these (or any) imac.
if there's a newer model GPU that comes out next year or the next or the next.. this one will still work just as well.

Are the RAM and SSD soldered in, and if not, why are they soldered into the regular iMac?
neither of those are soldered on the regular imac.. (nor the CPU)
 
you'll definitely be able to get more than a year's usage out of a GPU in one of these (or any) imac.
if there's a newer model GPU that comes out next year or the next or the next.. this one will still work just as well.


neither of those are soldered on the regular imac.. (nor the CPU)
Try telling that to those whose games' spec requirements increase every year.
Yes, if all things are static, there is no need to swap out the GPU, but in reality, things move on, software improves.
After all, why is there a Sony PlayStation 1, 2, 3, 4 and beyond....
you'll definitely be able to get more than a year's usage out of a GPU in one of these (or any) imac.
if there's a newer model GPU that comes out next year or the next or the next.. this one will still work just as well.


neither of those are soldered on the regular imac.. (nor the CPU)
I remember reading that at least the RAM on the 21.5-inch iMac was soldered.
 
Try telling that to those whose games' spec requirements increase every year.
i think apple's solution for people like that is eGPU.. gamers themselves will probably prefer that method as well.

Yes, if all things are static, there is no need to swap out the GPU, but in reality, things move on, software improves.
After all, why is there a Sony PlayStation 1, 2, 3, 4 and beyond....
software does improve.. but not at extremes per year.. more like extremes per decade (and even then, not really 'extreme'.)

gamers though? me personally, i don't care about that in a computing system.. at all.
so it's likely we have different needs/wants if that's what you're into.
I remember reading that at least the RAM on the 21.5-inch iMac was soldered.
not on the latest model.

(that said, it's not super user friendly to get to.. once you're there though, an easy swap)
 
i think apple's solution for people like that is eGPU.. gamers themselves will probably prefer that method as well.


software does improve.. but not at extremes per year.. more like extremes per decade (and even then, not really 'extreme'.)

gamers though? me personally, i don't care about that in a computing system.. at all.
so it's likely we have different needs/wants if that's what you're into.
not on the latest model.

(that said, it's not super user friendly to get to.. once you're there though, an easy swap)
The machines are not marketed at typical people. The type of people who are likely to use that kind of power are those for whom rendering times matter, and if each new GPU that comes out annually halves the time taken to render (I don't know the actual figure; I'm just using it as an example), then any time saving like that is saving money.
I guess Apple has tried to address this by showing off an external GPU, but I don't know why they didn't just go with a replaceable box that screws onto the VESA mount on the back.
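
A rough back-of-the-envelope of that "time saved is money saved" point, in Python. Every number here is hypothetical; it's just to show the shape of the argument:

    # toy numbers: renders per week, render length, and hourly cost are all made up
    renders_per_week = 50
    hours_per_render = 2.0
    hourly_cost = 60.0                  # hypothetical cost of an artist/workstation sitting on a render

    def weekly_render_cost(render_hours):
        return renders_per_week * render_hours * hourly_cost

    saving = weekly_render_cost(hours_per_render) - weekly_render_cost(hours_per_render / 2)
    print(f"weekly saving if render time halves: ${saving:,.0f}")   # $3,000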

As for me, the GPU wouldn't speed things up. I have software builds that have gotten down to under an hour, and any increase in speed there saves time and money. So if my builds were on a Mac, it would have to be the cheese-grater style that could be upgraded. I'd hate it if my build times stayed static for a decade, or even a couple of years.
 
I would also like to point out that in every computer I've owned, the major issue with longevity has been the motherboard, when CPU sockets change or new technologies come about. At that point it doesn't matter what components you have; the machine itself becomes "disposable".
GPUs have been the exception. GPUs fit in those unchanging PCIe slots and, until recently, haven't needed anything faster than PCIe 2.0. People put 980s and such in their 2006-2012 Mac Pros, and if what you're doing involves graphics, that alone extends the life of your computer. The thing is, eGPUs will now be supported, so I don't know if it matters so much anymore.

Besides that, CPU and RAM are worth upgrading just once if you're keeping the machine for a long time, especially because old CPUs and RAM are very cheap. My brother upgraded his 2009 Mac Pro to newer Xeons (5670) and 32GiB RAM for around $130 total, and it's definitely more usable than many new Macs.

Oh yeah, and if a RAM chip breaks in a new Mac, the whole darn machine is toast unless you pay a lot to repair it.
 
I don't mind this move, a very powerful iMac.
But I would like something the other way around.

A headless Mac; I don't really need a Xeon ... the fastest i7s are fine.
I just don't need another screen. I have a 23" 1920x1200, a 24" 1920x1080 that is rotated, next to a 43" 4K,
all calibrated and IPS.
On a GTX 680, 64GB RAM, 2 SSDs and 2 spinners for massive storage. And 20TB on USB 3.0 and FW800.
I edit medium format photos and have 3 Epson printers, the largest printing A0.

It is nice, if you need to print a series of 24 A0s, to have them all open in PS
and be able to work on them while test printing,
so they match perfectly in color and sharpness.

These are mostly not PSD files, as they quickly go over 3 GB when you are working on images for campaigns
that are shot at a minimum of over 40MP and up to 100MP.

I work on two cMPs ... one for raw editing and one for editing and printing, to divide the workload.

If cameras go to 200MP, high-end photo studios will too.

An iMac is not really the machine for this. The screen is useless ...
external storage is mostly used for projects that are over 2 years old, in case a client loses their content.

This is just a scenario, but it's not very rare in the imaging industry.
Graphic design is done on MacBook Pros ...
And web design, the coding part, is just done on two 2015 MacBook Airs;
one is only an 11" with 4GB RAM ... the girl using it is the fastest, most adaptable designer on our team.
Anyone will plug into a 24" screen or work on a cMP from time to time ...
The two oldest Macs in full-time use are a unibody MacBook and a cMP 1,1, both with Snow Leopard
(file server; printers are also connected ... and older material, drum scanners, Imacon stuff ...).

We don't really do video.
Nobody works on an iPad, except the curators we work with; when expos are getting set up,
they can walk around quickly showing the latest "blueprint" kind of images ... never created on iPads.

Those are just about our demands. I don't care if we are "pro" or not; we do make money.
A lot of it gets made on those MacBook Airs ...

If cameras get to 200MP, I think it will get hard for the cMP ... nMPs are only preferred
when a studio has to be set up on location ...

Then again, the record label we work with is loaded with iMacs and MacBook Pros. :)
Also nMPs, because cMPs are a lot noisier.
If this iMac is quiet, it looks amazing for those kinds of workplaces.

I don't know how fast an iPad Pro is, but I don't see it running small-to-medium imaging companies:
10 files open in PS with multiple layers at around 100MP, converting batches of 100MP files in the background,
and up to 4 calibrated screens, multiple keyboards, tablets, and mice, so 3 people can edit one file together on a big screen.
 
The machines are not marketed at typical people. The type of people who are likely to use that kind of power are those for whom rendering times matter, and if each new GPU that comes out annually halves the time taken to render (I don't know the actual figure; I'm just using it as an example), then any time saving like that is saving money.

that's pretty much the problem with your argument.. you don't actually know.

fwiw, i'm exactly the type of person you're trying to tell me about.. i do CAD/3D Modeling/Rendering.. daily.. as a profession. (using mainly Rhino3D, Indigo (local rendering), and Autodesk (cloud rendering))

i don't want to sit here and make a wall of words in an attempt to tell you how it is.. but if you have any particular questions about this stuff, feel free to ask.

---
or, just talk about the stuff you personally need/want in a computing system.. if everyone did that around here, the conversations would be much better and more beneficial as well as paint a much clearer real-world usage picture.. instead, everyone just wants to talk about and argue about 'pros' .. which leads to tons of misinformation being thrown around.
 
Another expensive appliance by Apple. It is part of their formula: attach a "Pro" name to an existing product, make it so it cannot be upgraded after purchase, and slap a high price on it. What a joke.
 
Another expensive appliance by Apple. It is part of their formula: attach a "Pro" name to an existing product, make it so it cannot be upgraded after purchase, and slap a high price on it. What a joke.

The billions in revenue these "appliances" generate every quarter are no joke.
 
that's pretty much the problem with your argument.. you don't actually know.

fwiw, i'm exactly the type of person you're trying to tell me about.. i do CAD/3D Modeling/Rendering.. daily.. as a profession. (using mainly Rhino3D, Indigo (local rendering), and Autodesk (cloud rendering))

i don't want to sit here and make a wall of words in an attempt to tell you how it is.. but if you have any particular questions about this stuff, feel free to ask.

---
or, just talk about the stuff you personally need/want in a computing system.. if everyone did that around here, the conversations would be much better and more beneficial as well as paint a much clearer real-world usage picture.. instead, everyone just wants to talk about and argue about 'pros' .. which leads to tons of misinformation being thrown around.
I see where you are coming from but this is a machine that most on these forums would have no use for.

If you are a professional CAD user then I guess you will understand that each new generation of graphics card improves rendering time. As I don't know, you could tell me how much it reduces rendering time by; I am just making educated guesses.

I do have a friend who does a lot of rendering using programs like Maya and also uses Amazon farms because it is quicker than his multi-GPU machine.

As a developer I could get some benefit from this type of machine for local build times but I would baulk at the price of having to replace it every few years. I don't build every minute of every day and for the most part I wouldn't benefit from this machine.
 
I see where you are coming from but this is a machine that most on these forums would have no use for.

If you are a professional CAD user then I guess you will understand that each new generation of graphics card improves rendering time. As I don't know, you could tell me how much it reduces rendering time by; I am just making educated guesses.

I do have a friend who does a lot of rendering using programs like Maya and also uses Amazon farms because it is quicker than his multi-GPU machine.

As a developer I could get some benefit from this type of machine for local build times but I would baulk at the price of having to replace it every few years. I don't build every minute of every day and for the most part I wouldn't benefit from this machine.
rendering is traditionally CPU based.. most legacy software is doing it this way.

the GPU in imac pro will allow for real-time rendering.. if/when the software developers re-write their applications to GPU based rendering.. when rendering becomes real-time, there are no more time reductions possible..

until some of the big players in CAD world adopt this type of processing, the GPU doesn't really matter for rendering times.. if they introduce the ability in 4 years from now, 2017 imac pro will be able to run it as intended.

all of that aside-- there are far more beneficial workflow/user speed enhancements than raw rendering times..
i might model a project for two weeks.. prep it for rendering for 4 hours.. computer processes the rendering for 2 hours..

if my rendering time were cut in half, to one hour.. then so what.. nothing changed.. the project still took 10.5 days.
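
quick sketch of that timeline math in python.. the numbers are the example ones above, and i'm assuming 8-hour working days:

    # rough project-timeline math, using the example numbers above
    # (8-hour working days assumed.. same ballpark as the 10.5 days mentioned)
    modeling = 10 * 8   # two working weeks of modeling, in hours
    prep     = 4        # render prep
    for render in (2, 1):               # current GPU vs. one twice as fast
        total_days = (modeling + prep + render) / 8
        print(f"{render}h render -> {total_days:.2f} working days total")
    # 2h render -> 10.75 working days total
    # 1h render -> 10.62 working days total   (about a 1% difference)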

lots of stuff needs to happen prior to pushing the 'render now' button.. the most beneficial enhancements in software should be (and generally are) focused around improving user experience prior to 'render now'.

----
fwiw, i mostly use autodesk's cloud rendering service these days.. doing that, you're leasing time on a supercomputer (ie- 64,000 CPU cores) instead of buying a much more expensive and much slower 40+ core local rendering machine.

like, for less than $1, i can have a full-size full-quality image in less than a minute.. and i can do this from a mbp or imac or whatever.
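
rough sketch of the cloud-vs-local cost trade-off in python.. everything except the ~$1/render figure above is an assumption, just to show the shape of it:

    # cloud-vs-local, very rough.. workstation price, render volume, and lifespan are all made up
    renders_per_month = 100
    cloud_cost_per_render = 1.00        # the ~$1 figure above
    local_workstation = 8000.00         # hypothetical 40+ core box
    months = 36                         # keep the box ~3 years

    cloud_total = renders_per_month * cloud_cost_per_render * months
    print(f"cloud over {months} months: ${cloud_total:,.0f}")      # $3,600
    print(f"local box up front:       ${local_workstation:,.0f}")  # $8,000

and per above, the cloud renders come back faster anyway.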

until the developers get real-time GPU based rendering functioning properly in their software, this is the fastest and cheapest way to go about it (for me personally at least)..

but again- it's nothing to do with ability to swap GPUs in a computer.. the hardware is already ready (or will be very shortly).. it's a software problem now.
 
rendering is traditionally CPU based.. most legacy software is doing it this way.

the GPU in imac pro will allow for real-time rendering.. if/when the software developers re-write their applications to GPU based rendering.. when rendering becomes real-time, there are no more time reductions possible..

until some of the big players in CAD world adopt this type of processing, the GPU doesn't really matter for rendering times.. if they introduce the ability in 4 years from now, 2017 imac pro will be able to run it as intended.

all of that aside-- there are far more beneficial workflow/user speed enhancements than raw rendering times..
i might model a project for two weeks.. prep it for rendering for 4 hours.. computer processes the rendering for 2 hours..

if my rendering time were cut in half, to one hour.. then so what.. nothing changed.. the project still took 10.5 days.

lots of stuff needs to happen prior to pushing the 'render now' button.. the most beneficial enhancements in software should be (and generally are) focused around improving user experience prior to 'render now'.

----
fwiw, i mostly use autodesk's cloud rendering service these days.. doing that, you're leasing time on a supercomputer (ie- 64,000 CPU cores) instead of buying a much more expensive and much slower 40+ core local rendering machine.

like, for less than $1, i can have a full-size full-quality image in less than a minute.. and i can do this from a mbp or imac or whatever.

until the developers get real-time GPU based rendering functioning properly in their software, this is the fastest and cheapest way to go about it (for me personally at least)..

but again- it's nothing to do with ability to swap GPUs in a computer.. the hardware is already ready (or will be very shortly).. it's a software problem now.

That may be true in the CAD world, but in the 3D rendering programs of animators and such, they do use the GPU for rendering, and a lot of their workflow is rendering. I don't know how far along the GPU they are using in the iMac Pro is, but I know my friend had render farms going for hours and was using more than a single GPU. In movie making (CGI/animation), I don't believe real-time rendering is there yet.

My use case is similar to yours: the GPU doesn't help, and most work is CPU-bound, so like you say, our experience can be better enhanced through other improvements. That's not to say it should come at the expense of other professionals, whom this machine is probably targeted at.

As for swapping GPUs, that is probably more targeted at gamers, VR and AR. Not that gamers use Macs anyway, because they are so poor in that respect.

Not sure if you use Autodesk Fusion 360; does opening large files take a while on your Mac? Some files I am opening as part of a course take a while to load.

PS: for what it's worth, Autodesk don't use a supercomputer as such; they rent machines from Amazon, which max out at a virtual core count of about 128. (As far as I am aware.)
 
That may be true in the CAD world, but in the 3D rendering programs of animators and such, they do use the GPU for rendering, and a lot of their workflow is rendering. I don't know how far along the GPU they are using in the iMac Pro is, but I know my friend had render farms going for hours and was using more than a single GPU. In movie making (CGI/animation), I don't believe real-time rendering is there yet.
right.. it was using all those CPU cores ;)

My use case is similar to yours: the GPU doesn't help, and most work is CPU-bound, so like you say, our experience can be better enhanced through other improvements. That's not to say it should come at the expense of other professionals, whom this machine is probably targeted at.
the GPU helps.. just that most mid to high end GPUs these days are pretty great for non-compute tasks.. and very, very little software, even the software that would benefit from GPU compute, is doing it yet.
so in a way, yes, we're at a particular time where GPU doesn't really matter too much.. get something 'good' and you'll be fine for a while.

Not sure if you use Autodesk Fusion 360; does opening large files take a while on your Mac? Some files I am opening as part of a course take a while to load.

i model in Rhino then go into fusion for CAM and rendering.. i export from rhino as .step, which opens locally in Fusion instead of uploading to the cloud.. so without the upload part, the files open quickly.

(granted, i might not be using files as large as you.. typically, i'm in the 5 - 50MB range for my .3dm.. 200MB file is gigantic to me.)

PS: for what it's worth, Autodesk don't use a supercomputer as such; they rent machines from Amazon, which max out at a virtual core count of about 128. (As far as I am aware.)
right, but there are far more available cores than 128.. or, it would take a crapton of users simultaneously rendering in order to saturate all available resources.

with rendering, it's not like you could render on 64,000 cores anyway.. the process won't scale that high and there's a law of diminishing returns you'll encounter when multi-threading single processes.. like, if you divide the process up amongst 2000 cores, it takes longer to add all 2000 pieces back up into a single whole than any benefit you'd have experienced with so many cores..

the rendering software (that i know of) tops out support at a little over 100 cores.. so a 200-core renderfarm would give the same results as 100.
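
the usual way to put a number on that diminishing-returns point is amdahl's law.. rough python sketch (the 95% parallel fraction is just an assumption for illustration):

    # amdahl's law: speedup when only part of a job can be parallelised
    def speedup(parallel_fraction, cores):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    for cores in (4, 16, 100, 2000, 64000):
        print(f"{cores:>6} cores -> {speedup(0.95, cores):5.1f}x")

    #      4 cores ->   3.5x
    #     16 cores ->   9.1x
    #    100 cores ->  16.8x
    #   2000 cores ->  19.8x
    #  64000 cores ->  20.0x   (capped near 1 / 0.05 = 20x)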
 
This right here is the problem. People bitch and moan when Apple doesn't make a Mac, then people bitch and moan when Apple does make a Mac.

More like people bitch and moan when Apple simply does a rehash of the same old design. Gone are the days of Apple impressing us.

I'm in the market for an iMac, and I have my reservations right now, especially with the noisy fans on the i7 being reported.
 
right.. it was using all those CPU cores ;)
Well, all I can say is that he has many graphics cards in his machine that he added to do rendering. I'll need to ask him. As far as I am aware though, graphics programs such as Maya use the GPU to do rendering. Autodesk may not do it, but from what I have read this is more to do with GPUs not being a constant thing, whereas there are fewer choices of CPU (or something along those lines).

the GPU helps.. just that most mid to high end GPUs these days are pretty great for non-compute tasks.. and very, very little software, even the software that would benefit from GPU compute, is doing it yet.
so in a way, yes, we're at a particular time where GPU doesn't really matter too much.. get something 'good' and you'll be fine for a while.
There is lots of software that makes use of the GPU.



right, but there are far more available cores than 128.. or, it would take a crapton of users simultaneously rendering in order to saturate all available resources.
If you want more cores than 128 at Amazon, you spin up more EC2 instances.

with rendering, it's not like you could render on 64,000 cores anyway.. the process won't scale that high and there's a law of diminishing returns you'll encounter when multi-threading single processes.. like, if you divide the process up amongst 2000 cores, it takes longer to add all 2000 pieces back up into a single whole than any benefit you'd of experienced with so many cores..

the rendering software (that i know of) tops out support at a little over 100cores.. so a 200core renderfarm would give the same results as 100.

If you look at it from a different perspective, not all tasks are difficult to put back into a single whole. For example, rendering a movie: each frame can be rendered independently, so the work splits into (threads per frame) x (number of frames in the movie). I know that Disney has rendered on 50k+ cores.
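
A minimal sketch of that frame-level parallelism in Python; render_frame is just a placeholder for whatever the real renderer does:

    # each frame renders independently, so the work fans out across cores/machines
    # with almost no cost to stitch the results back together
    from multiprocessing import Pool

    def render_frame(frame_number):
        # placeholder for the real per-frame render
        return f"frame_{frame_number:05d}.png"

    if __name__ == "__main__":
        frames = range(1, 1441)        # a one-minute shot at 24 fps
        with Pool() as pool:           # one worker per local core
            outputs = pool.map(render_frame, frames)
        print(f"rendered {len(outputs)} frames")

On a farm, the same idea distributes frame ranges across machines instead of local cores.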
 
you'll definitely be able to get more than a year's usage out of a GPU in one of these (or any) imac.
if there's a newer model GPU that comes out next year or the next or the next.. this one will still work just as well.


neither of those are soldered on the regular imac.. (nor the CPU)


Neither is soldered, but there's no access panel for the user to make upgrades. You'll either have to take it to Apple to pop the hood (if they're allowed to upgrade the memory), or buy a new machine if you want more memory. ...or figure out how to remove the case yourself.
 