AMD Radeon Pro 580

Discussion in 'iMac' started by elmarjazz, Jun 8, 2017.

  1. elmarjazz macrumors member

    Joined:
    May 26, 2010
    #1
    Since I'm trying to economise... but need this to work well at the same time...

    Is it worth getting the top-end iMac with the AMD Radeon Pro 580, or is the lesser (and less expensive) 575 enough? Add an SSD and self-installed RAM on both.

    The CPU boost from 3.5 to 3.8 GHz is a plus, but does it make that much of a difference?

    Photo, video editing
    Thanks
     
  2. Darajavahus macrumors member

    Darajavahus

    Joined:
    Aug 8, 2015
    #2
  3. Sirmausalot macrumors 6502a

    Sirmausalot

    Joined:
    Sep 1, 2007
    #3
    Video editing and compression will benefit from an i7.
     
  4. Outrigger macrumors 68000

    Outrigger

    Joined:
    Dec 22, 2008
    #4
    Video editing is more dependent on the CPU than the GPU.
     
  5. Mac32 Suspended

    Joined:
    Nov 20, 2010
    #5
    AMD 580... hmm. Sorry, but it is a lot slower than what would be possible with an Nvidia GPU. Why can't Apple offer both (obviously the Nvidia 1070/1080 would be more expensive), so there would be more options?
     
  6. CWallace, Jun 9, 2017
    Last edited: Jun 9, 2017

    CWallace macrumors 603

    CWallace

    Joined:
    Aug 17, 2007
    Location:
    Seattle, WA
    #6
    Well, in terms of raw teraflops the 580 is close to the 1070 (6.2 TF vs. 6.5 TF), though Apple's woefully out-of-date OpenCL support means that in the real world it's not going to be anywhere near as good as OpenCL under Windows (to say nothing of CUDA under Windows).
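    For reference, those teraflop figures fall out of a simple formula: FP32 TFLOPS ≈ shader count × boost clock × 2 (one fused multiply-add per shader per cycle). A quick sketch, assuming the RX 580's reference boost clock (~1340 MHz) and the GTX 1070's (~1683 MHz); Apple's Pro 580 bin may clock lower:

    ```swift
    import Foundation

    // Back-of-the-envelope FP32 throughput: shaders × boost MHz × 2 ops (FMA).
    // Clocks are approximate reference boost clocks, not Apple's exact bins.
    func teraflops(shaders: Double, boostMHz: Double) -> Double {
        shaders * boostMHz * 2 / 1_000_000  // MHz -> TFLOPS
    }

    let rx580   = teraflops(shaders: 2304, boostMHz: 1340)  // ~6.2 TF
    let gtx1070 = teraflops(shaders: 1920, boostMHz: 1683)  // ~6.5 TF
    print(String(format: "RX 580: %.1f TF, GTX 1070: %.1f TF", rx580, gtx1070))
    ```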
     
  7. PJivan macrumors 6502

    PJivan

    Joined:
    Aug 19, 2015
    #7
    Actually, Apple uses Metal and OpenCL; OpenGL is just there for compatibility, and I can see it being gone next year...
     
  8. Zarniwoop, Jun 9, 2017
    Last edited: Jun 9, 2017

    Zarniwoop macrumors 65816

    Joined:
    Aug 12, 2009
    Location:
    West coast, Finland
    #8
    The Radeon Pro 580 is the only iMac GPU that can be used for virtual reality; the 570 and 575 are not recommended for that. Note that the Radeon Pro 580 is not the Radeon RX 580: the Pro version is more power efficient (around 33%), but also a bit slower (~10%), than the RX version. The RX 580 has a TDP of 185W (OC models even more), the Pro 580 ~125W.
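    The ~33% efficiency figure follows directly from those numbers. A quick sanity check, treating the RX 580 as a 1.0 performance baseline:

    ```swift
    import Foundation

    // Perf-per-watt sanity check using the figures above (RX 580 = 1.0 baseline).
    let rxPerf = 1.0,  rxTDP  = 185.0   // RX 580
    let proPerf = 0.9, proTDP = 125.0   // Pro 580: ~10% slower at ~125W

    let gain = (proPerf / proTDP) / (rxPerf / rxTDP) - 1
    print(String(format: "Pro 580 is about %.0f%% more efficient", gain * 100))  // ~33%
    ```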
     
  9. fokmik macrumors 68030

    Joined:
    Oct 28, 2016
    Location:
    USA
    #9
    I wonder if anyone has chosen the i7+580 combination, and whether we are going to see the heating problems of the M295X all over again.
     
  10. Zarniwoop macrumors 65816

    Joined:
    Aug 12, 2009
    Location:
    West coast, Finland
    #10
    The Radeon Pro 580 should have a thermal envelope similar to the M395X's, and that one didn't have the heating problems the previous model had.

    But we really need to see more reviews, and they've been scarce so far...
     
  11. dlewis23, Jun 9, 2017
    Last edited: Jun 9, 2017

    dlewis23 macrumors 6502a

    Joined:
    Oct 23, 2007
    #11
    So far, no heating issues. I got mine today and it runs significantly cooler than both previous-gen iMacs. I have only played a game on it twice so far to test, and the fan went no higher than 1800 RPM.

    My previous iMac with the M295X would have the fan at max almost right away after getting into a game. The M395X wasn't much better than that.
     
  12. inhalexhale1 macrumors 6502a

    Joined:
    Jul 17, 2011
    Location:
    Ridgewood, NJ
    #12
    If I had to choose, I would definitely pick the CPU over the GPU. They're going to officially support eGPUs, so you can spend the money on something that will be upgradeable and way better than a Pro 580. Official support for eGPUs was one of the most unexpected and awesome announcements from Apple.
     
  13. triple-tap macrumors 6502

    Joined:
    Feb 18, 2013
    #13
    Agreed 100% that we should have Nvidia options, but I don't think the 580 is "a lot slower" than a 1070. In fact, the 580 sits between the 1060 and the 1070. The 580s are respectable cards from AMD.
     
  14. diamond.g macrumors 603

    diamond.g

    Joined:
    Mar 20, 2007
    Location:
    Virginia
    #14
    On an iMac, I am not sure an eGPU is going to be such a good deal. You end up having to use an external monitor, thus negating the beautiful built-in screen. Unless Apple allows the eGPU to drive the internal monitor...
     
  15. koyoot, Jun 10, 2017
    Last edited: Jun 10, 2017

    koyoot macrumors 603

    koyoot

    Joined:
    Jun 5, 2012
    #15
    I think this video will be very helpful for anyone asking themselves whether the 580 is worth the money.



    I suggest EVERYONE watch this video. The RX 480 is the SAME chip that is in the Radeon Pro 580.

    And here is a comparison of CUDA (GTX 1060) vs. OpenCL (RX 480):
    https://wiki.blender.org/index.php/Dev:Source/Render/Cycles/OpenCL
    Guys, please give them credit where it is due ;).

    Software is catching up ;).
     
  16. inhalexhale1 macrumors 6502a

    Joined:
    Jul 17, 2011
    Location:
    Ridgewood, NJ
    #16
    Yeah, it seems internal acceleration is not an option at the moment.
     
  17. epca12 macrumors regular

    Joined:
    Jun 11, 2017
    Location:
    UK
    #17
    I'm looking into an iMac and will probably end up buying the 580, but now that macOS supports external graphics I might look into that route in the future. There looks to be a substantial performance jump, so I think the 580's worth it.
     
  18. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    Location:
    Geneva
    #18
    Interesting leak this weekend: benchmark sites have spotted an Intel Kaby Lake processor coming with an integrated AMD GPU based on Vega.
     
  19. jjjoseph macrumors 6502

    Joined:
    Sep 16, 2013
    #19
    Using what program? It's all GPU these days. If you mean encoding and exporting using multithreading, yes, that is CPU-dependent, but the actual editing process is very GPU-dependent.

    Which editing program are you specifically referring to?
    --- Post Merged, Jun 11, 2017 ---
    OpenGL is a legacy product, a way to teach people horrible methods of coding. OpenGL will soon be gone forever. It's a dinosaur.
    --- Post Merged, Jun 11, 2017 ---
    Using the eGPU to drive the internal monitor will be supported in High Sierra.
     
  20. Asclepio macrumors 6502a

    Asclepio

    Joined:
    Jul 11, 2011
    #20
    Source?
     
  21. jjjoseph macrumors 6502

    Joined:
    Sep 16, 2013
    #21
    I don't code anymore, but I have a friend who is familiar with how Apple is rolling out the eGPU and the Metal 2 framework; he is very close to these developments. He will be coding against the eGPU dev kit from Apple, along with other people on his team.

    Basically, the only way Apple can do their "component" eGPU in the iMac Pro and future Mac Pro is by having the eGPU drive an internal display or a specific graphics window, i.e. a headless eGPU.

    How this will roll out with Metal 2 and High Sierra remains to be seen, whether it's in the current Metal 2 framework or still being figured out.

    As people start getting the Metal 2 eGPU dev boxes Apple is releasing, we will know for sure whether it's happening now or later, but I do know for certain they have been working on it; when it reaches us is up to Apple.

    I can try to find out more information, but it probably won't be until Metal 2 comes out, because even though the eGPU dev kits are starting to ship, I think they are being delivered with Metal, not Metal 2.
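    For what it's worth, the Metal APIs already hint at this direction: MTLCopyAllDevices() lists every GPU the system can see, and MTLDevice gains an isRemovable flag for eGPUs in macOS 10.13. A minimal sketch of how an app might tell them apart, assuming the 10.13 API:

    ```swift
    import Metal

    // Minimal sketch: enumerate every GPU macOS can see and flag removable
    // (eGPU) devices. isRemovable requires macOS 10.13 / Metal 2.
    for device in MTLCopyAllDevices() {
        let kind = device.isRemovable ? "eGPU"
                 : device.isLowPower  ? "integrated" : "discrete"
        print("\(device.name): \(kind)")
    }
    ```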
     
  22. joema2 macrumors 65816

    joema2

    Joined:
    Sep 3, 2013
    #22
    In general, video editing is very CPU-dependent. This should be obvious: that's why the Mac Pro is available with 12 cores and the upcoming iMac Pro with 18 cores. If it were "all GPU," they wouldn't need all those cores.

    Anybody can see this for themselves in Premiere by simply importing some H.264 4K content and using the JKL keys to scrub forward and back on the timeline. All CPU cores will often be pegged.

    Likewise, here are two videos where Dave Dugdale (learningvideo.com) complains about how slow Resolve 12 is on 4K, even using a GTX 1080 Ti on a Windows PC. If it were "all GPU," certainly the 1080 Ti would be fast enough.




    FCPX is more efficient than either Premiere or Resolve, but even it requires a lot of multi-core CPU horsepower. On my top-spec 2015 iMac 27, H.264 4K content can be sluggish to skim through in FCPX. That's with no effects whatsoever, and it is likely more a CPU than a GPU limitation.
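    If you'd rather measure this than eyeball the scrubbing, a rough sketch is to time a pure decode pass over a clip with AVFoundation: no effects, no display, just the decoder doing its work. The clip path below is a placeholder; point it at any H.264 4K file:

    ```swift
    import AVFoundation

    // Rough sketch: time a pure H.264 decode pass (no effects, no display).
    // The clip path is a placeholder.
    let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/4k-clip.mov"))
    guard let track = asset.tracks(withMediaType: .video).first,
          let reader = try? AVAssetReader(asset: asset) else {
        fatalError("could not open clip")
    }

    // Request decoded BGRA frames so the decoder actually does the work.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    let start = Date()
    var frames = 0
    while output.copyNextSampleBuffer() != nil { frames += 1 }
    let elapsed = Date().timeIntervalSince(start)
    print(String(format: "%d frames in %.1fs (%.1f fps)", frames, elapsed,
                 Double(frames) / elapsed))
    ```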
     
  23. jjjoseph, Jun 11, 2017
    Last edited: Jun 11, 2017

    jjjoseph macrumors 6502

    Joined:
    Sep 16, 2013
    #23
    Yeah. Not true. I'm a professional colorist; I've worked in TV and film for almost 20 years. I work in a studio with 20+ edit bays, and we pump out TV commercials and spots on a daily basis. Our technical expertise in this field is light years beyond these YouTubers.

    This Dave Dugdale is an amateur wedding-video guy who probably clickbaits you to get Amazon discounts on gear. Looking through his website, he is not an authority on anything but being a wedding DP.

    First, he's comparing apples and oranges when he compares Premiere to DaVinci to Final Cut Pro X as it relates to rendering on the CPU and GPU.

    Resolve holds the entire image and any color nodes on its GPU for playback and then creates the desired effect, whereas Premiere uses the CPU and then the GPU to decode the frame and hold it in memory for you to see. Once you see it in Premiere, the process is done, while Resolve is still holding all the images and nodes in memory. A different technology. Apple Final Cut Pro X uses proxies and background rendering to achieve most of its speed. None of these technologies are one-to-one. A YouTube novice can say "hey, option A exports in 1 minute and option B in 1.5 minutes, option A is better!" But it's not better. They're two totally different technologies.

    For example: when I grade feature films and am using, say, a RAW 4K R3D file, the entire image is held in RAM; when I add nodes and effects, if I have enough GPU power, it plays back in real time or better. The CPU is only used the way a CPU is used to run any computer, or if you have a CPU-dependent source codec. Those are usually consumer codecs like AVCHD or H.264 variants, or codecs with interframe compression. Say you are using DPX files: there is barely any compression, so your CPU is not a factor in your speed. In Premiere, for example, the CPU is not used to hold this image in memory; you decode the image, do the mathematical operations to see your results, and then do your final playback and/or render. A higher core count or a faster CPU will not make the GPU encoding process any faster, since in Resolve this is all done on the GPU.
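    To make that distinction concrete: once a frame lives in a GPU texture, a chain of color "nodes" can be encoded back to back with no CPU round trip, which is essentially the model being described. A toy Metal sketch, where a trivial gain kernel stands in for a real grading node (illustrative only, not Resolve's actual implementation):

    ```swift
    import Metal

    // Toy "node chain": the frame stays GPU-resident while 40 color nodes
    // (here, a trivial gain kernel) are encoded back to back.
    let kernelSrc = """
    #include <metal_stdlib>
    using namespace metal;
    kernel void gain(texture2d<float, access::read>  inTex  [[texture(0)]],
                     texture2d<float, access::write> outTex [[texture(1)]],
                     constant float &g                      [[buffer(0)]],
                     uint2 gid [[thread_position_in_grid]]) {
        if (gid.x >= outTex.get_width() || gid.y >= outTex.get_height()) return;
        outTex.write(inTex.read(gid) * g, gid);
    }
    """

    let device   = MTLCreateSystemDefaultDevice()!
    let queue    = device.makeCommandQueue()!
    let library  = try! device.makeLibrary(source: kernelSrc, options: nil)
    let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "gain")!)

    // Ping-pong between two 4K textures that never leave VRAM.
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba16Float, width: 4096, height: 2160, mipmapped: false)
    desc.usage = [.shaderRead, .shaderWrite]
    var ping = device.makeTexture(descriptor: desc)!
    var pong = device.makeTexture(descriptor: desc)!

    let cmd = queue.makeCommandBuffer()!
    for _ in 0..<40 {                                // 40 chained "color nodes"
        var gainValue: Float = 1.01
        let enc = cmd.makeComputeCommandEncoder()!
        enc.setComputePipelineState(pipeline)
        enc.setTexture(ping, index: 0)
        enc.setTexture(pong, index: 1)
        enc.setBytes(&gainValue, length: MemoryLayout<Float>.size, index: 0)
        enc.dispatchThreadgroups(
            MTLSize(width: 4096 / 16, height: 2160 / 16, depth: 1),
            threadsPerThreadgroup: MTLSize(width: 16, height: 16, depth: 1))
        enc.endEncoding()
        swap(&ping, &pong)
    }
    cmd.commit()  // one submission; all 40 nodes run on the GPU
    ```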

    If you take my example of the feature films I grade, take that same R3D file and add the same level of effects and color in Premiere, you cannot play it back in real time; it has to be rendered, and then, if you change anything, rendered again. When Premiere added Mercury Playback and you're using CUDA, that is a similar technology to CUDA in Resolve, but it's not program-wide and doesn't work for all effects and operations inside the program.

    A novice video editor like these YouTubers doesn't work heavily enough in these programs to have a valid opinion. He or she is not using the program the way a professional colorist uses it, and they aren't pushing it anywhere close to the level I would.

    I usually have 40+ color operations on a complicated project, and it plays back in real time on the GPU. Put 40+ color operations on an R3D file in a Premiere timeline and you'd be waiting forever for the CPU to render it.

    Also, I work directly with Blackmagic a lot, and the drivers for Pascal GPUs are not as fast on macOS as the drivers for Windows or even Linux. So if you're comparing Pascal on macOS, it's not using its full power.

    Also, Final Cut Pro X is kind of a toy, and it cheats. For one-to-one comparisons with DaVinci and Premiere, you have to disable all the background rendering and proxy features. Rendering a "4K" image that is actually a 2K or HD file being held in proxy is not a real comparison.

    A simple timing of one process in these tests is not a true portrayal of these programs. A quick little YouTube snippet of an export or a process shows nothing. Ask real experts who use these programs all day, every day to produce TV and commercial content under extreme pressure and real deadlines, and you will know the truth.

    To say this work is more CPU-dependent than GPU-dependent is just silly and not at all accurate. Also, I don't know who the YouTube guy is or what his angle is, but he obviously doesn't use any of this stuff professionally.

    My take: don't trust YouTube clickbait.
     
  24. poematik13 macrumors 6502a

    Joined:
    Jun 5, 2014
    #24
    You nuked most of your credibility when you called FCP X a toy, tbh. You can turn the proxy function off and work with the footage natively, and it's still GPU-accelerated and fast as hell. I agree with your industry views on YouTubers, and of course your description of Resolve's GPU activity is fairly accurate.

    What you're missing here is that all of these compressed codecs (H.264, ProRes, etc.) are CPU-dependent because they need to be decompressed and compressed (a cycle) during playback. To get real-time playback, a strong CPU is needed, especially when the files are 4K+ resolution. This is universal across all NLEs/apps, because this CPU cycle is intrinsic to the codec's behavior.

    So that's why CPUs are important when working with digital media files, and why the Mac Pro and iMac Pro give you 8-, 12-, and 18-core options.
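    One practical wrinkle: whether that decompress/compress cycle actually lands on the CPU cores depends on whether the machine has a hardware decode path for the codec. VideoToolbox on macOS 10.13+ exposes a simple query for this; a small sketch (at the time, ProRes notably had no hardware decoder, so it always decodes on the CPU):

    ```swift
    import VideoToolbox

    // Quick check (macOS 10.13+): is there a hardware decode path for a codec?
    // If not, playback decode falls entirely on the CPU cores.
    let codecs: [(String, CMVideoCodecType)] = [
        ("H.264", kCMVideoCodecType_H264),
        ("HEVC",  kCMVideoCodecType_HEVC),
    ]
    for (name, codec) in codecs {
        let hw = VTIsHardwareDecodeSupported(codec)
        print("\(name): \(hw ? "hardware decode" : "software (CPU) decode")")
    }
    ```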
     
  25. jjjoseph macrumors 6502

    Joined:
    Sep 16, 2013
    #25
    Most of the CPU-dependent codecs are prosumer codecs; MXF, DPX, and ProRes are not as CPU-dependent. Most of the professional codecs use less CPU power. And yes, Final Cut Pro X is a toy. No one in any professional edit or TV environment, or in film for that matter, uses Final Cut Pro X. It is a toy.

    I could name a million reasons why Final Cut Pro X is a toy. Just ask why we can't, and never will, use Final Cut Pro X for professional TV work. It can't be done.
     
