
jeff7117

macrumors regular
Jul 22, 2009
174
456
How am I trolling?

I did NOT say I upscale 1080p to 4K. Where did I say that? How is that a troll when I corrected his statement?


But wait, I thought 4K was the standard for a while now? According to res0lve, 4K is the ABSOLUTE MINIMUM you should EVER WORK ON. 6K is apparently mainstream. And 8K is widely adopted now. How are those movies "fake" 4K then? Even the ones from 2016.
You should not blow up 1080 to 4K. Glad you asked. Hope that helps you out in the future.

Mandatory minimum is what many of us have to work under.

We don't all capture video at 720.

For the MAJORITY of us, 4K is usually the minimum. That's because WE POST IN 4K. It does not mean we DISTRIBUTE IN 4K. Subtle but major difference.

You're trolling hard at this point. And yes, the nMP still sucks for the above reasons in post #118.
 

steveOooo

macrumors 6502a
Jun 30, 2008
743
89
UK
Apple Park launches in April - opening the Steve Jobs Theater of nothing

'Ya know, Steve always used to say people will always need trucks, so reluctantly, here it is: the Apple Truck. Today Apple reinvents the truck, and here's Johnny to tell you all about it in our 12-hour t-shirt-against-white vomit video.'

Actually, just a thought: with all the millions of unsold trash cans, they could stockpile them and use them as mortar rounds in the event of an invasion.
 

Ethosik

Contributor
Oct 21, 2009
7,797
6,714
You should not blow up 1080 to 4K. Glad you asked. Hope that helps you out in the future.

Mandatory minimum is what many of us have to work under.

We don't all capture video at 720.

For the MAJORITY of us, 4K is usually the minimum. That's because WE POST IN 4K. It does not mean we DISTRIBUTE IN 4K. Subtle but major difference.

You're trolling hard at this point. And yes, the nMP still sucks for the above reasons in post #118.

I did not say I was doing that. WHERE did I say I was blowing up 1080p to 4K? What I described was a scenario where 4K is NOT the absolute MINIMUM requirement. I do NOT blow up 1080p to 4K. I NEVER have blown up 1080p to 4K. WOW, did you miss what I was saying ENTIRELY. Good for you that you need 4K at a minimum. I DO NOT.

And we don't all capture video at 4K or 6K or 8K. You guys know there are different workflows, right? I primarily use 720p. So the Mac Pro is good for me. It does not OVERALL SUCK like people here say.
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
And we don't all capture video at 4K or 6K or 8K. You guys know there are different workflows, right? I primarily use 720p. So the Mac Pro is good for me. It does not OVERALL SUCK like people here say.

I don't think the nMP is a bad idea, but this is not a winning argument.

Who edits or even captures in 720p? Why would you do that?
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
What's not a winning argument? Everybody should be required to work on videos at 4K at a minimum?

Why not work in 1080p? Much less 4k?

And if you're working in 720p, why buy a Mac Pro? Just buy a MacBook Air at that point...

Arguing that the Mac Pro is really great for 720p workflow is a bad argument.

Is this real life?
 

Ethosik

Contributor
Oct 21, 2009
7,797
6,714
Why not work in 1080p? Much less 4k?

And if you're working in 720p, why buy a Mac Pro? Just buy a MacBook Air at that point...

Arguing that the Mac Pro is really great for 720p workflow is a bad argument.

Is this real life?

Um, because most of my clients can only stream 720p footage with their crappy internet. Why do I need to be forced to 1080p if there is no reason to? What about 4K? There is NO reason for online training sessions to be in 4K resolution. 1080p for some things and 720p for the rest works JUST fine. Why force users to buffer unnecessarily just so I can say "OMG LOL I HAVE 4K VIDEOS - GOOD LUCK BUFFERING FOR A LONG TIME"? If I cannot reliably buffer 4K footage with 300 Mbps internet, how can my clients with 5 Mbps internet?
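
To put very rough numbers behind that buffering point, here is a minimal sketch; the per-resolution bitrates are generic streaming ballparks I'm assuming, not figures from this thread:

# Rough sketch: can a connection keep up with a stream in real time?
# The bitrates below are assumed ballpark streaming rates, not measured values.
STREAM_BITRATE_MBPS = {"720p": 4.0, "1080p": 8.0, "4K": 20.0}

def can_stream_realtime(resolution: str, connection_mbps: float) -> bool:
    """True if the connection is at least as fast as the stream's bitrate."""
    return connection_mbps >= STREAM_BITRATE_MBPS[resolution]

for res in ("720p", "1080p", "4K"):
    print(res, "on a 5 Mbps connection:", can_stream_realtime(res, 5.0))
# Under these assumed rates, 720p fits comfortably; the higher resolutions would buffer.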

And that is my point. Why was someone arguing with me that a $10,000 computer would make things even faster when my $3,000 Mac Pro is already overpowered for what I do? My 2010 Mac Pro is STILL not breaking a sweat with my workflow, and it is the 6-core, single-CPU version.
 

jeff7117

macrumors regular
Jul 22, 2009
174
456
And that is my point. Why was someone arguing with me that a $10,000 computer would make things even faster when my $3,000 Mac Pro is already overpowered for what I do? My 2010 Mac Pro is STILL not breaking a sweat with my workflow, and it is the 6-core, single-CPU version.

Perhaps you need a slower computer then. I would also suggest a MacBook Air.
 

Ethosik

Contributor
Oct 21, 2009
7,797
6,714
Perhaps you need a slower computer then. I would also suggest a MacBook Air.

I already have a 2016 MacBook Pro for on-the-go use and some of my workflow too.

I am not denying that the Mac Pro is not ideal for quite a few workflows, but you cannot simply state that it just sucks. What would the reaction be if I stated that the GTX 1080 just plain sucks? Just because it doesn't fit my workflow doesn't mean it sucks. It is a real shame that the Mac Pro doesn't have a config that benefits a lot of those workflows.
 

aaronhead14

macrumors 65816
Original poster
Mar 9, 2009
1,226
5,289
But wait, I thought 4K was the standard for a while now? According to res0lve, 4K is the ABSOLUTE MINIMUM you should EVER WORK ON. 6K is apparently mainstream. And 8K is widely adopted now. How are those movies "fake" 4K then? Even the ones from 2016.

The highest resolution most ALEXA cameras (a very common digital cinema camera line among filmmakers) can capture is either 2.8K or 3.5K (depending on the model, unless it's the ALEXA 65, which shoots 6K). The standard resolutions for the final delivery of a film are variations of 2K and 4K (there isn't an option for anything in between). So filmmakers upscale their movies to 4K in order to retain the detail, rather than downscale to 2K.

It's also important to note that resolution isn't always the thing that makes an image have lots of detail. Bitrate and compression are much more important than resolution. Using high resolutions, however, is a really common way of retaining detail upon compression. So filmmakers often upscale 2.8K or 3.5K footage to 4K upon mastering the film so that compression doesn't cause as much artifacting.

So these people who are saying that these films are "fake" 4K are *technically* correct, but they're also ignoring the important reason for why filmmakers do it: the movies look better in the end.

The other huge camera company is RED, and most of their cameras shoot at least 5K (with some models capable of doing 6K and even 8K). Those movies are in the "real" 4K section. So are the movies shot on film and then scanned in 4K (film isn't restricted to a resolution; you could scan a film from 20 years ago in 8K and it would be 8K). Digital IMAX cameras and the ALEXA 65 also both shoot 6K, so movies shot on those cameras are also in the "real" 4K section.

Sorry for my rambling. But basically, what I'm saying is that it's still really common to not shoot a film in 4K+; however, the specific types of cameras that shoot at a lower resolution have other attributes that make their footage look really great. So when people upscale to 4K, it's because they want to retain that great-looking footage.
Bottom of #114.
"Why should I blow up a raw 1080p footage to 4K?"

It's actually really common to upscale RAW 1080p footage. RAW 1080p footage contains much more detail than standard compressed 1080p footage. So the only way to retain that detail through web compression (and other forms of compression) is by upscaling.
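
To make that upscale-before-compress step concrete, here is a minimal sketch of how a mastering upscale might be run; it assumes ffmpeg (built with the prores_ks ProRes encoder) is installed, and the file names and source resolution are placeholders, not anything from this thread:

# Minimal sketch: upscale a 2.8K (or RAW-derived 1080p) master to UHD 4K with a
# high-quality Lanczos scaler before final compression. Assumes ffmpeg is on the
# PATH and was built with the prores_ks encoder; paths/resolutions are placeholders.
import subprocess

def upscale_to_uhd(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,                               # e.g. a 2880x1620 graded master
            "-vf", "scale=3840:2160:flags=lanczos",  # upscale to UHD 4K
            "-c:v", "prores_ks", "-profile:v", "3",  # high-bitrate mezzanine (ProRes 422 HQ)
            "-c:a", "copy",                          # pass the audio through untouched
            dst,
        ],
        check=True,
    )

upscale_to_uhd("graded_master_2.8k.mov", "delivery_master_uhd.mov")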
 

itdk92

macrumors 6502a
Nov 14, 2016
504
180
Copenhagen, Denmark
....

And that it's a horrible step backwards for people with MP5,1 systems.

^^THIS!


The computer in my signature can easily handle 4K and 6K from RED cameras, including some 3D content, so the nMP is definitely NOT the only solution available, and arguably not the best Mac available.

Plenty of workarounds are needed to make the 5,1 machine in my signature work, but IT WORKS GREAT.

My point being, there is still stuff you can do to maximize performance until Apple gets their s**t together and releases some new toys.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,264
3,861
Are you talking about gaming on your PC or Mac?

....
Here's some very recent Premiere benchmarks comparing the fastest machine Apple makes to a couple of standard fairly light weight NON Xeon workstations.

Light-weight non-Xeon is a bit off the mark. The 6900K and 6950X processors list for $1,000-$1,700:

https://ark.intel.com/products/94196/Intel-Core-i7-6900K-Processor-20M-Cache-up-to-3_70-GHz ($1089.00 - $1109.00)
http://ark.intel.com/products/94456...ssor-Extreme-Edition-25M-Cache-up-to-3_50-GHz ($1723.00 - $1743.00)

Basically the same die as the Xeon E5 v4 1660 and 1680 (a match on price; two fewer cores but a higher clock):

https://ark.intel.com/products/92985/Intel-Xeon-Processor-E5-1660-v4-20M-Cache-3_20-GHz

and

https://ark.intel.com/products/92992/Intel-Xeon-Processor-E5-1680-v4-20M-Cache-3_40-GHz

Labeling $1K processors as lightweight is a stretch. Sure, there are E5 4000-series and E7-series processors that are 2x-3x the cost with 2x the cores, but the implicit gist here is that because these are labeled "Core i7" they are in the affordable mainstream zone. The Core i7 products are not uniform in implementation. The Core i7 x9xx stuff is in the same range as the Xeon E5 16xx stuff: same dies with certain features flipped on/off.

The performance gap is a total cluster with CUDA acceleration (Rendering, NOT gaming) since you can't run any Nvidia cards on a nMP without external expansion, and you can't run any Pascal series cards at all. eGPU solutions are also not officially supported by Apple and they have intentionally made it this way.

The 'war' Nvidia and Apple have had over GPGPU programming spans more than just the Mac Pro. Nvidia wanted a proprietary tar pit (the closed CUDA API) and Apple wanted an open-standards approach (OpenCL). Between Nvidia and Microsoft throwing roadblocks at OpenCL (e.g., Nvidia foot-dragging on OpenCL 1.2 and later), Nvidia threatening to sue Apple over mobile GPUs, mediocre OpenCL implementations by Apple and AMD, and a few other factors, the Mac Pro has been hung up. Nvidia, Apple, and others have all played a role in the absence of Nvidia GPUs in Macs over the last couple of years.


Apple switched to a large bet on Metal (their own standard, so they don't have to play the fractious standards-committee game; Microsoft has 'DirectCompute' and DirectX, and Apple is a big enough block now that others have to adopt its standard to be a player), but that doesn't cover the entire scope of OpenCL (and SPIR-V). Apple balked at Flash (Adobe) as the foundation for web graphics; it took years for HTML5+JavaScript improvements to get there, but the open-standards foundation is winning. CUDA isn't the same level of security drama, but it is vendor lock-in nonetheless. Apple has enough money, momentum, and resources that they are going to push back against that. Adobe has unwound themselves from CUDA-only solutions in several areas.


Apple has chosen Intel up to this point, but AMD has always been an x86 option. Apple swapped out Samsung fabs for TSMC fabs; if it worked better for a certain set of contracts, they could switch back. Apple is now running Intel and Qualcomm modems in different iPhone models, etc. Folks screaming at Apple to pick one and only one vendor 'forever' for a specific component are probably going to get sent off to 'cat rant > /dev/null'.


If Intel had joined up with HSA (http://www.hsafoundation.com/members/) then perhaps Apple would have fallen into place on that (ARM (MediaTek, Samsung) and Imagination Tech mean the iOS space is already covered by that architecture). Not currently using any AMD CPUs is the primary blocker there, along with not being able to switch back to Intel.




If you just need to edit FCPX, Apple has you covered and they think you can get by just fine editing on a MBP instead of the trash can.

Poo-pooing FCPX for being locked into Apple while cheerleading CUDA at the same time is bizarro, hypocritical stuff.
To be a viable Mac software product means running on multiple Mac devices and multiple workloads. Applications that are "only good" for a couple thousand Macs typically have problems long term. The Mac is about 6-7% of the overall market; if you aim at 0.1% of that 6%, the user base is awfully small to sustain a long-term developer and support base. 0.1% of 90% is a more viable number, since it's about an order of magnitude bigger.
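
As a rough sketch of that last bit of arithmetic (the 6% / 90% / 0.1% figures come from the paragraph above; the total market size is an arbitrary illustrative assumption):

# Sketch of the addressable-user arithmetic above. TOTAL_MARKET is an arbitrary
# illustrative figure; only the percentages come from the post.
TOTAL_MARKET = 100_000_000                        # hypothetical total computer market

mac_share, pc_share, niche = 0.06, 0.90, 0.001    # ~6% Mac, ~90% PC, 0.1% niche app

mac_niche = TOTAL_MARKET * mac_share * niche      # 0.1% of the Mac base
pc_niche = TOTAL_MARKET * pc_share * niche        # 0.1% of the PC base

print(mac_niche, pc_niche, pc_niche / mac_niche)  # 6000.0 90000.0 15.0 -- roughly an order of magnitude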
 

jeff7117

macrumors regular
Jul 22, 2009
174
456
Poo-pooing FCPX for being locked into Apple while cheerleading CUDA at the same time is bizarro, hypocritical stuff.
To be a viable Mac software product means running on multiple Mac devices and multiple workloads. Applications that are "only good" for a couple thousand Macs typically have problems long term. The Mac is about 6-7% of the overall market; if you aim at 0.1% of that 6%, the user base is awfully small to sustain a long-term developer and support base. 0.1% of 90% is a more viable number, since it's about an order of magnitude bigger.

I'm not saying CUDA is better, but other platforms have a choice to use CUDA or OpenCL. Up until the 6,1, we had that choice too. For many applications, CUDA is faster. I'm not saying that the vendor lock-in is ideal, but when speed matters, it is what it is.
 

Jack Burton

macrumors 6502a
Feb 27, 2015
782
1,272
I'm not saying CUDA is better, but other platforms have a choice to use CUDA or OpenCL. Up until the 6,1, we had that choice too. For many applications, CUDA is faster. I'm not saying that the vendor lock-in is ideal, but when speed matters, it is what it is.

The impression I've gotten from multiple developers of GPU renderers is that they want to support OpenCL, but limitations on AMD's and/or Apple's side prevent feature parity in the few that even bother to attempt an OpenCL version.

At least one dev waffles back and forth on what they believe can work on AMD cards. One week they say it's coming and they've had a breakthrough, and the next week they're throwing up their hands and giving up. As frustrating as that is to me, it's likely more so for the developers working on the problem.

The Cinema 4D owner base is roughly 50% Mac according to Maxon, the makers of C4D. If there were a great renderer that supported all of my rendering needs on AMD-based hardware, I'd stay on the Mac in a heartbeat, as my tasks vary greatly. I don't only do 3D. But when I do, I need the speed provided by multi-GPU rendering.
 

jeff7117

macrumors regular
Jul 22, 2009
174
456
The impression I've gotten from multiple developers of GPU renderers is that they want to support OpenCL, but limitations on AMD's and/or Apple's side prevent feature parity in the few that even bother to attempt an OpenCL version.

At least one dev waffles back and forth on what they believe can work on AMD cards. One week they say it's coming and they've had a breakthrough, and the next week they're throwing up their hands and giving up. As frustrating as that is to me, it's likely more so for the developers working on the problem.

The Cinema 4D owner base is roughly 50% Mac according to Maxon, the makers of C4D. If there were a great renderer that supported all of my rendering needs on AMD-based hardware, I'd stay on the Mac in a heartbeat, as my tasks vary greatly. I don't only do 3D. But when I do, I need the speed provided by multi-GPU rendering.


You know, it's funny you mention AMD failures and driver issues because this same thing happened with Octane. Version 3.1 of Octane was supposed to support AMD cards, but it kept getting delayed. They finally came out and said that they were waiting on AMD.

I pulled this from the user forum: "We have an AMD branch we made for 3.0, but as I said earlier, we can't support a commercial release (or spend time optimizing) until AMD brings their driver stack to the level they have conceded they need to achieve for Octane to be as stable as it is on NVIDIA - so this is on them, and we are moving onto other features until this changes. "

And
"i think AMD drivers will never be fixed. lol


I think they have to, and they know it, which is why they agreed to fix it. Otherwise, they are by default ceding the high-end commercial GPGPU market to NVIDIA. OpenCL 2 doesn't exist (Linux and macOS are still on 1.2, and Apple is only supporting Metal, so even 1.2 is not a sure thing in the future). We did all this crazy work to cross-compile CUDA to AMD IL. We had AMD Octane 3 running on a demo machine at SIGGRAPH and would have done a first test release around then if they had addressed this when they were supposed to.

In any case, at least headless rendering will bring some relief to frustrated Mac users. We can still use the local GPU for OctaneImager or the host app's rasterized viewport."


I had high hopes for an AMD rendering option from Octane so I could keep my Mac; instead, Maxon integrated AMD ProRender into their software. I have no doubt it was done to keep Mac users happy, but almost all mainstream GPU rendering is done on CUDA at this point. Arnold is even working on a GPU-based version.

Like I said, I hate the hardware lock-in but at this point, there just isn't much else to choose from.
 

pat500000

Suspended
Jun 3, 2015
8,523
7,515
You know, it's funny you mention AMD failures and driver issues because this same thing happened with Octane. Version 3.1 of Octane was supposed to support AMD cards, but it kept getting delayed. They finally came out and said that they were waiting on AMD.

I pulled this from the user forum: "We have an AMD branch we made for 3.0, but as I said earlier, we can't support a commercial release (or spend time optimizing) until AMD brings their driver stack to the level they have conceded they need to achieve for Octane to be as stable as it is on NVIDIA - so this is on them, and we are moving onto other features until this changes. "

And
"i think AMD drivers will never be fixed. lol


I think they have to, and they know it, which is why they agreed to fix it. Otherwise, they are by default ceding the high-end commercial GPGPU market to NVIDIA. OpenCL 2 doesn't exist (Linux and macOS are still on 1.2, and Apple is only supporting Metal, so even 1.2 is not a sure thing in the future). We did all this crazy work to cross-compile CUDA to AMD IL. We had AMD Octane 3 running on a demo machine at SIGGRAPH and would have done a first test release around then if they had addressed this when they were supposed to.

In any case, at least headless rendering will bring some relief to frustrated Mac users. We can still use the local GPU for OctaneImager or the host app's rasterized viewport."


I had high hopes for an AMD rendering option from Octane so I could keep my Mac; instead, Maxon integrated AMD ProRender into their software. I have no doubt it was done to keep Mac users happy, but almost all mainstream GPU rendering is done on CUDA at this point. Arnold is even working on a GPU-based version.

Like I said, I hate the hardware lock-in but at this point, there just isn't much else to choose from.
Sounds like AMD is dead...
 

Jack Burton

macrumors 6502a
Feb 27, 2015
782
1,272
You know, it's funny you mention AMD failures and driver issues because this same thing happened with Octane. Version 3.1 of Octane was supposed to support AMD cards, but it kept getting delayed. They finally came out and said that they were waiting on AMD.

I pulled this from the user forum: "We have an AMD branch we made for 3.0, but as I said earlier, we can't support a commercial release (or spend time optimizing) until AMD brings their driver stack to the level they have conceded they need to achieve for Octane to be as stable as it is on NVIDIA - so this is on them, and we are moving onto other features until this changes. "

And
"i think AMD drivers will never be fixed. lol


I think they have to, and they know it, which is why they agreed to fix it. Otherwise, they are by default ceding the high-end commercial GPGPU market to NVIDIA. OpenCL 2 doesn't exist (Linux and macOS are still on 1.2, and Apple is only supporting Metal, so even 1.2 is not a sure thing in the future). We did all this crazy work to cross-compile CUDA to AMD IL. We had AMD Octane 3 running on a demo machine at SIGGRAPH and would have done a first test release around then if they had addressed this when they were supposed to.

In any case, at least headless rendering will bring some relief to frustrated Mac users. We can still use the local GPU for OctaneImager or the host app's rasterized viewport."


I had high hopes for an AMD rendering option from Octane so I could keep my Mac; instead, Maxon integrated AMD ProRender into their software. I have no doubt it was done to keep Mac users happy, but almost all mainstream GPU rendering is done on CUDA at this point. Arnold is even working on a GPU-based version.

Like I said, I hate the hardware lock-in but at this point, there just isn't much else to choose from.

Yep, that was the one. I would have dug up the quote that you did but couldn't find it.

I think many people here (including me) were hoping that AMD would be viable. AMD seems eager to win people over with reasonably priced hardware. But SIGGRAPH was an nVidia show.

I'd even settle for an external GPU solution with a modern nVidia card. I'd pay a premium for that to stay on the Mac. But there are no nVidia drivers for Mac for the 10XX series cards.

Software. It's the darn software. Whether it's AMD giving devs what they need or nVidia releasing drivers for the external GPU market (and Apple supporting that!), people like me are painted into a corner with Apple.

It's a damn shame.
 