…dead, but it certainly isn't vibrantly alive either.
I thought the most interesting idea presented in this article is the display with built-in GPU.
Imagine the heat in there.
Cringely needs to learn how to use the apostrophe correctly.
Why would Apple not drop the Mac Pro? They probably aren't making much money on it anyway. Why keep it around?
Not my bag, but that quick turnaround is part of their deliverable. Most of it ends up sounding godawful and usually gets even more abuse in post.
The MP will stay; whether the price starts increasing again is another question altogether.
For many pros, the only other alternative is a PC. Yuk!
Because Pixar uses Mac Pros, and so do countless other businesses...
How much software fully exploits current hardware?
Photoshop doesn't
FinalCut Pro doesn't
Logic Studio doesn't
Cringely is a maroon of the first order. How he continues to garner respect baffles me. I'll stick to this article, though. He claims to be a tech writer but makes this idiotic statement:
"...I use the term Light Peak, which is what Thunderbolt is called in the non-Apple world,..."
Of course any part-time tech nerd knows that "Thunderbolt" is Intel's marketing name for what was formerly known as "Light Peak"; it's not an Apple term as he suggests, almost mockingly, as if it were silly Apple nomenclature. He thinks he is so superior because he is going to use what he thinks is the "real" name of Thunderbolt, Light Peak.
Well, Cringely, as a member of the so-called professional media, ought to at least do some research before making asinine statements. Intel is quite clear that "Thunderbolt" is the standard for Macs and PCs, and is the proper term for the standard, not "Light Peak." But I guess Cringely is still calling Istanbul Constantinople.
So when you realize he has this basic fact wrong, you can't believe anything else he says.
But higher end "pro" cards run hotter and are bigger.
Like an iMac?
Logic does in a BIG way. All cores, all threads. It also scales almost perfectly. Not sure why you included it.
My bad; I was stuck in the past, when the app still had huge chunks of 32-bit code limiting its use of memory.
That guy is a moron. Can't he read Intel's roadmap?
Can we just wait until the chips get released before sounding the death knell? If they come and go and there's no Mac Pro, then great, happy editorializing.
Not sure what the point is in calling the guy a moron.
He's entitled to his opinion just like the rest of us.
I'd rather give the guy the benefit of the doubt.
After all it's not like the rest of us haven't made a mistake from time to time.
Not only that, we don't even know if he's wrong.... yet.
The thing is, while Apple has been promoting technologies (Grand Central Dispatch, OpenCL) that would ensure better use of big workstations, were devs following? (No.)
More importantly, is pro software in the media world in need of such power?
While engineers may need every last bit of processing power available, the next-generation Mac Pro (if it follows the current line's philosophy) makes sense for them, since they are in the market for workstations with multiple multi-core processors and multiple CUDA- or OpenCL-capable GPUs. Image, video and audio professionals, on the other hand, might be near the point where investing in far more powerful workstations would bring only marginal benefits compared to investing in better software.
How much software fully exploits current hardware?
Photoshop doesn't
FinalCut Pro doesn't
Logic Studio doesn't
FinalCut X does, and we can see that on current hardware it manages to pull off most calculations in almost real time. While the software isn't ready to replace its ancestor, the benefits of a full rewrite are quite apparent.
In the end, Apple doesn't cater to engineers (unfortunately; being one, I dreamt of Catia on OS X). For media professionals, a single workstation processor (6+ cores) with enough PCIe (v3) ports and memory (RAM and SATA) slots available would be more than enough.
Even more so now, that Thunderbolt exists, permitting the use of fast external storage systems.
For me the MacPro isn't dead, but will evolve into something less epic that should still perform very well for the pro, while the price might go down enough to interest more prosumers (geeks, gamers or rich kids ^^).
Anyway, what Apple is waiting for is Intel's next workstation processors, and maybe even AMD's next-generation GPUs (more focused on computation than graphics).
Sandy Bridge chips are great, but they have a lot of limitations (like the number of PCIe lanes), and their integrated GPUs are a waste of electricity.
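The gap described above (hardware with many cores, software that mostly uses one) comes down to whether developers explicitly split work into tasks, which is exactly what GCD and OpenCL push them toward. As a rough, hypothetical sketch of that pattern in standard-library Python (not Apple's actual libdispatch API, and the workload is invented for illustration):

```python
# Sketch of the "split work into tasks" pattern that GCD/OpenCL encourage.
# Pure standard-library Python, not Apple's API; workload is made up.
from concurrent.futures import ThreadPoolExecutor

def render_tile(n):
    """Stand-in for one CPU-bound chunk of work (e.g. one tile of a filter)."""
    return sum(i * i for i in range(n))

def render_serial(tiles):
    # The "Photoshop doesn't" case: one thread grinds through everything,
    # no matter how many cores the machine has.
    return [render_tile(n) for n in tiles]

def render_pooled(tiles):
    # The GCD-style case: chunks are queued to a pool and dispatched as
    # workers free up. (A real CPU-bound Python job would use a
    # ProcessPoolExecutor to dodge the GIL; GCD itself hands blocks to
    # kernel-managed threads.)
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(render_tile, tiles))

tiles = [50_000] * 8
assert render_serial(tiles) == render_pooled(tiles)  # same result either way
```

The point of the sketch: extra cores buy nothing unless the code is written in the second shape, which is why an 8- or 12-core Mac Pro gains little for software stuck in the first.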
The Cinema display with a built-in GPU is an interesting concept. It would definitely be useful in the consumer market, but a bit harder to do in the pro market, where pros need a lot of specialty displays.
I don't see GPU-equipped displays as good for users, as they tend to recycle displays vs. computers. As such a display ages, the end user would end up having to replace the entire display due to the GPU's performance over time rather than due to the display itself (gone bad outside of warranty, outgrown it... whatever).
Rather expensive way for users to keep up with GPUs (paying for both a GPU and a display, regardless of what they actually need), and not a good idea IMO.
And then there are performance users/professionals who need more than whatever GPU may ship as part of a display.
Unfortunately, you may be right (performance upgrades could be seen as a way to…
Sounds like something Apple would go for then.