Why would Apple not drop the Mac Pro? They probably aren't making much money on it anyway. Why keep it around?

Because Pixar uses Mac Pros, and so do countless other businesses.

Apple will keep the Mac Pro line profitable, because some people just CAN'T use an iMac.

The MP will stay; whether the price starts increasing again is another question altogether.
 
Not my bag, but that quick turnaround is part of their deliverable. Most of it ends up sounding godawful and usually gets even more abuse in post.

LOL! nice one! as they say - it's not what you have but how you use it! ... that's my excuse anyways!!!
 
The MP will stay; whether the price starts increasing again is another question altogether.

Imo, a professional has to have a Mac Pro even if the price goes up.
For many pros, the only other alternative is a PC. Yuk!
If things ever get that bad, I'll just get the latest Mac Pro and max it out to the highest heavens.
I'd rather spend a little more money on a new Mac Pro than tons of money to update an old one.
 
For many pros, the only other alternative is a PC. Yuk!

Windows 7 is lovely. A lot of the creative stuff runs as well or better on Windows 7 than on Mac OS X these days. Adobe and Avid products have come a long way on the Windows side.

If you do not need Final Cut or Logic, a 6-core Windows 7 machine is rather appealing these days.
 
Because Pixar uses Mac Pros, and so do countless other businesses...

I'm not saying Apple is doing away with the Pro but if I remember correctly, Pixar doesn't use as many Mac Pros as one would think. I believe they use mainly Linux boxes for rendering. As for individual artist stations, don't most places leave that up to the individual artist and what they are most comfortable with? As long as the files are interoperable, which most are these days.
 
How many applications fully exploit current hardware?
Photoshop doesn't
Final Cut Pro doesn't
Logic Studio doesn't

Logic does, in a BIG way. All cores, all threads. It also scales almost perfectly. Not sure why you included it. PS does exploit a decent amount of power. At least better than the rest of the suite.

----------

Cringely is a maroon of the first order. How he continues to garner respect baffles me. I'll stick to this article, though. He claims to be a tech writer but makes this idiotic statement:

"...I use the term Light Peak, which is what Thunderbolt is called in the non-Apple world,..."

Of course, any part-time tech nerd knows that "Thunderbolt" is Intel's marketing name for what was formerly known as "Light Peak"; it's not an Apple term, as he suggests, almost mockingly, as if it were silly Apple nomenclature. He thinks he is so superior because he is going to use what he thinks is the "real" name of Thunderbolt, Light Peak.

Well, Cringely, as a member of the so-called professional media, ought to at least do research before making asinine statements. Intel is quite clear that "Thunderbolt" is the name of the standard for Macs and PCs, and is the proper term for it, not "Light Peak." But I guess Cringely is still calling Istanbul Constantinople.

So when you realize he has this basic fact wrong, you can't believe anything else he says.

Also add that there is a distinction between the copper Thunderbolt shipping today and the promised optical Light Peak due out later.
 
Logic does, in a BIG way. All cores, all threads. It also scales almost perfectly. Not sure why you included it.
My bad, I was stuck in the past, when the app still had huge chunks of 32-bit code limiting its use of memory.

The same goes for my Photoshop remark: CS4 was poorly optimized (usually using a maximum of two cores, and still not fully Cocoa). In the case of CS5 (for OS X), reviews (like the one on Ars Technica) mentioned that multithreading optimization was still missing from some time-consuming tasks, and, more importantly, that GPU optimization used CUDA (Nvidia-proprietary) instead of OpenCL.

About TB, a lot of people seem to think that going from copper to optical will magically give ten times the bandwidth. The interface bandwidth is limited by the underlying technologies (CPU I/O -> PCIe).

In my mind, to reach the 100 Gb/s Intel has been talking about, they are projecting a future where the TB interface sits on top of PCIe v3, with more lanes available per TB port.
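To put rough numbers on that, here is a back-of-the-envelope sketch. The per-lane transfer rates and encodings are the published PCIe figures; the lane counts per port are illustrative, since how Intel would actually wire a future TB port is anyone's guess.

```python
# Back-of-the-envelope PCIe bandwidth: raw transfer rate per lane
# times line-encoding efficiency gives the usable bit rate.

def lane_gbps(gt_per_s, payload_bits, total_bits):
    """Usable Gb/s per PCIe lane after line-encoding overhead."""
    return gt_per_s * payload_bits / total_bits

pcie2 = lane_gbps(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b  -> 4.0 Gb/s/lane
pcie3 = lane_gbps(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b -> ~7.9 Gb/s/lane

# Four PCIe 2.0 lanes comfortably feed today's 10 Gb/s TB channel...
print(pcie2 * 4)    # -> 16.0
# ...but 100 Gb/s only pencils out with PCIe 3.0 and many more lanes.
print(pcie3 * 16)   # ~126 Gb/s
```

Which is the poster's point: going optical changes the cable, not the PCIe plumbing behind the port.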
 
That guy is a moron. Can't he read Intel's roadmap?
Can't we just wait until the chips get released before sounding the death knell? If they come and go and there's no Mac Pro, then great, happy editorializing.
Not sure what the point is in calling the guy a moron.

He's entitled to his opinion just like the rest of us.

I'd rather give the guy the benefit of the doubt.

After all it's not like the rest of us haven't made a mistake from time to time.

Not only that, we don't even know if he's wrong.... yet.
 
Not sure what the point is in calling the guy a moron.

He's entitled to his opinion just like the rest of us.

I'd rather give the guy the benefit of the doubt.

After all it's not like the rest of us haven't made a mistake from time to time.

Not only that, we don't even know if he's wrong.... yet.

Correct. We don't know if he is wrong yet. That's not the point. His writing up the possibility of said scenario comes from him being too lazy to even recognize cycles in part availability. Since he is writing about hardware, I would think checking that would be the very first thing he'd do prior to writing his piece.
When you reach my age you stop giving anyone the "benefit of the doubt".
People acting as authoritative sources on tech need to vet their data first. If this were politics he'd get outed. Tomorrow he could be my champion. This is the net, and all things get blown out of proportion. And honestly, you're way too polite :p
 
Not only that, we don't even know if he's wrong.... yet.

We don't know if he's wrong, but his statements are very questionable.
Quote from the article:
"Mac Pro’s are Apple’s big box PCs. They haven’t been refreshed since last summer and new models were expected this month with the new Minis, but for some reason the new Mac Pro’s failed to appear."

New models were expected this month? Failed to appear, huh?
99.999 percent of the people in this forum are more knowledgeable than he is.
 
The thing is, while Apple has been promoting technologies (Grand Central Dispatch, OpenCL) that would ensure better use of big workstations, were devs following? (No.)

More importantly, is pro software in the media world in need of such power?

Engineers may need every last bit of processing power available, so the next-generation Mac Pro (if it follows the current line's philosophy) makes sense to them: they are in the market for workstations with multiple multi-core processors and multiple CUDA- or OpenCL-capable GPUs. Image, video, and audio professionals, on the other hand, might be near the point where investing in far more powerful workstations would bring only marginal benefits compared to investing in better software.

How many applications fully exploit current hardware?
Photoshop doesn't
Final Cut Pro doesn't
Logic Studio doesn't

Final Cut Pro X does, and we can see that on current hardware it manages to pull off most calculations in almost real time. While the software isn't ready to replace its ancestor, the benefits of a full rewrite are quite apparent.
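GCD and OpenCL are C-level Apple APIs, but the scaling argument in these posts is easy to demonstrate with plain Python's standard multiprocessing module. The workload below is invented busywork, purely to illustrate why an app whose hot loop splits into independent chunks scales across cores and one that doesn't sees nothing:

```python
import multiprocessing as mp
import time

def burn(n):
    # CPU-bound busywork standing in for a render/bounce/filter task.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 8               # eight independent work items

    t0 = time.perf_counter()
    serial = [burn(n) for n in jobs]     # one core, one item at a time
    t1 = time.perf_counter()

    with mp.Pool() as pool:              # one worker per core by default
        parallel = pool.map(burn, jobs)  # items fan out across cores
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial {t1 - t0:.2f}s vs parallel {t2 - t1:.2f}s")
```

On a quad-core box the parallel pass finishes in roughly a quarter of the serial time; an app whose work can't be chunked this way (old Photoshop filters, pre-X Final Cut) gets none of that benefit no matter how many cores you buy.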

In the end, Apple doesn't cater to engineers (unfortunately :( being one, I dreamt of CATIA on OS X). For media professionals, a single workstation processor (6+ cores) with enough PCIe (v3) slots and enough memory (RAM and SATA) slots available would be more than enough.

Even more so now, that Thunderbolt exists, permitting the use of fast external storage systems.

For me the Mac Pro isn't dead, but it will evolve into something less epic that should still perform very well for the pro, while the price might go down enough to interest more prosumers (geeks, gamers, or rich kids ^^).

Anyway, what Apple is waiting for is Intel's next workstation processors, and maybe even AMD's next-generation GPUs (more focused on calculation than graphics).
Sandy Bridge chips are great but have a lot of limitations (like the number of PCIe lanes), and their integrated GPUs are a waste of electricity.

Photoshop uses the cores, although not multiple processors. That is why I buy the single-processor, multi-core Mac Pros. They still run circles around iMacs and MacBook Pros, although not lately. The Xeon seems to hold its value better than other processors. My 2006 is the only Mac from its era that can be upgraded to Lion.
 
Profitable business unit.

I would expect Apple to have a business unit that develops the Mac Pro range. As long as that unit is satisfying its customers and turning a profit, why ditch it? There are a lot more Mac Pros out there than you'd expect. Over the last 6 years I've noticed a lot more Macs in the businesses I visit, and a lot of those are either top-end iMacs or Mac Pros. If they do ditch it, is the iMac, with its horribly glossy screen, an option? Not for everyone. I bought my Pro in 2008 and intended it to last for 5 years. I've yet to find anything that can really tax it. I've had Windows boxes that become very unresponsive when undertaking intensive work, but my Pro just keeps chugging along. Mind you, I use a Mac mini for everyday use, and only fire up the Pro for CAD work (PCB design). Long may it last!
 
I don't really buy this article. The Mac Pro is dated, but its streamlined replacement consists of a bunch of boxes all daisy-chained together? (Each of which requires its own power supply?)

Sheesh. Yeah, right.

The Cinema Display with a built-in GPU is an interesting concept. It would definitely be useful in the consumer market. A bit harder to do in a pro market, where pros need a lot of specialty displays.
 
The Cinema Display with a built-in GPU is an interesting concept. It would definitely be useful in the consumer market. A bit harder to do in a pro market, where pros need a lot of specialty displays.
I don't see GPU-equipped displays as good for users, as they tend to keep displays longer than computers. So as such a display ages, the end user would end up having to replace the entire display due to the GPU's performance over time rather than due to the display itself (gone bad outside of warranty, outgrown it, ... whatever).

Rather expensive way to keep up with GPUs for users (paying for both a GPU and a display, regardless of what they actually need), and not a good idea IMO.

And then there are performance users/professionals who need more than whatever GPU may ship as part of a display.
 
I don't see GPU-equipped displays as good for users, as they tend to keep displays longer than computers. So as such a display ages, the end user would end up having to replace the entire display due to the GPU's performance over time rather than due to the display itself (gone bad outside of warranty, outgrown it, ... whatever).

Rather expensive way to keep up with GPUs for users (paying for both a GPU and a display, regardless of what they actually need), and not a good idea IMO.

And then there are performance users/professionals who need more than whatever GPU may ship as part of a display.

Sounds like something Apple would go for then.
 
I don't see GPU-equipped displays as good for users, as they tend to keep displays longer than computers. So as such a display ages, the end user would end up having to replace the entire display due to the GPU's performance over time rather than due to the display itself (gone bad outside of warranty, outgrown it, ... whatever).

Rather expensive way to keep up with GPUs for users (paying for both a GPU and a display, regardless of what they actually need), and not a good idea IMO.

And then there are performance users/professionals who need more than whatever GPU may ship as part of a display.

Well, my thinking is I would like this for a MacBook Air. I've been thinking about switching from my MacBook Pro to an Air, but the lack of GPU power is making me second-guess that. A display with a higher-end GPU would solve that problem when I'm tethered to it (which is probably when I'd be doing my most intense work anyway).

My Mac Pro does a great job now, but it would still be nice to have a display I could plug into in my bedroom in case I have company over or something and can't use my home office. Or, if I'm at work, and I need a machine that's light enough to haul around for meetings/notetaking, but takes a jump in performance when I sit back down at my desk for coding.

Think Duo Dock. :p
 
Edit: I forgot to say the one thing I set out to say to begin with. Reading the comments on that Cringely article, I can't help but notice a trend I always seem to see: consumers, especially computer-literate consumers, tend to COMPLETELY FORGET THAT PRO AUDIO EVEN EXISTS. That is in spite of the fact that it is almost as big a market as video - look at Avid's numbers, both for MC and Pro Tools - it just isn't as flashy or as high-profile. My theory? Laymen "get" video - it is something they SEE, after all, and can conceptualize doing - but audio is completely alien. How many people use their phones to make music, compared to those who do the same for videos? Hell, the iPhone has a "video" button on its camera - where's the audio button? Yeah, I know, "voice notes", but seriously, who uses that much?

I'll say here what I said on that moron's site - if you think the Mac Pro is dying, or at least that the high-end PCI-supporting Mac is dying, then you just plain don't understand WHO THE MAC PRO IS FOR. While it is priced low enough for the hobbyist, it is designed as a PROFESSIONAL WORKSTATION.

I can't tell you how many times people have tried to convince me that "an iMac is all you need for Audio" or some argument akin to it, then balked when I showed them an actual PRO mix. Convolution Verb, massive numbers of tracks and Auxes, plug-in counts into the hundreds, and EVERY parameter automated are not unusual - there are mixes I did 3 years ago, on Logic 8 on a 1,1 MP, that I can't open on my last-year MBP without freezing every track (yay for Logic 9).

Not to mention the fact that Avid - who, no matter how much people (mostly Europeans, I've noticed) bitch and moan, owns the pro studio world - JUST RELEASED NEW PCIe CARDS. Apogee is just now getting Symphony 64 into actual use, as well. There's UAD, Focusrite, Metric Halo, Prism, and several others who also have PCIe cards - all of which I've seen in use at one PTHD studio or another, not to mention "high-end project studios" (which often are better equipped than so-called pro studios).

The Mac Pro as it stands now (an 8-year-old tower design, etc.) may be dying - I would LOVE to see a rackmountable (preferably 2U [yeah right] or 3U) design instead of a tower - but the actual concept of a Pro Apple Workstation isn't going anywhere, and anyone who thinks it is just doesn't have experience in the markets it is really intended for. I would say QED, but I didn't actually make an argument - more just a general "AARRGGGHHH"!
 