
Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
PRO SUB-FORUM. Mac Pro. Pro users. Pro apps.

GeForce doesn't do 10-bit. You're talking like we don't have GeForce cards and don't know. Even the Nvidia sales shill who said he only uses Windows has come to post to tell us we should upgrade to a $3000 card (with no macOS drivers) to get this basic modern feature. Guys... seriously. Brand tribalism makes brains look very deficient. It's a very unevolved and unenlightened waste of energy.

I agree about the Mac Pro (i.e. the machine itself) sub-forum. Where does it say this is for discussing pro users or pro apps only? Many people, myself included, use(d) their Mac Pro for playing games. We have just as much right to post on this sub-forum as you, and just as much right to discuss non-pro usage cases like playing games.

Assuming the ad hominem attacks are aimed at me, please re-read my post -- I specifically said you should not buy a Quadro RTX card for use with macOS. My point was that if you are going to talk about pro usage cases, then it might be worth actually comparing the Radeon VII with the corresponding pro card from NVIDIA (that does support 10-bit color and has tons of memory).

I still find it funny how many "pro" users think $3000 is too much money for a top-of-the-line graphics card, when a fully decked out classic Mac Pro would've cost 6 or 7 grand back in the day. Or when the most expensive iMac Pro is over $13,000 now.
 

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
PRO SUB-FORUM. Mac Pro. Pro users. Pro apps.

This is your assumption, your opinion, not a fact.

The Mac Pro is a piece of hardware, a tool. We can use it in whatever way we want, not just for pro apps or gaming; a few posts even share how to use the Mac Pro case to make chairs. IMO, those are still related to the Mac Pro, and there's nothing wrong with discussing that in this forum.

If you want a thread that only discusses pro apps on the cMP, you may create one. But you can't hijack the whole forum and force others to discuss only pro usage in this hardware forum.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
8-core Intel was more expensive.
Because it was an HEDT CPU. I asked a simple question: how expensive were mainstream CPUs before?

They topped out at $350. The 1800X raised that ceiling to $500, but everybody was OK with it, because it brought HEDT-class performance to the mainstream. That does not mean AMD is a cheap brand that always has to price its products cheaply to be competitive.

PRO SUB-FORUM. Mac Pro. Pro users. Pro apps.

GeForce doesn't do 10-bit. You're talking like we don't have GeForce cards and don't know. Even the Nvidia sales shill who said he only uses Windows has come to post to tell us we should upgrade to a $3000 card (with no macOS drivers) to get this basic modern feature. Guys... seriously. Brand tribalism makes brains look very deficient. It's a very unevolved and unenlightened waste of energy.
Stop arguing with people on the internet. They will call you a shill, a fanboy, and will be too blind to see their own biases in the first place. This is the wrong forum to bring logic and reason to, in any way, shape or form: it's an Apple users' forum, after all. Attachment to brand will always be strong with users here.


P.S. It's funny that people calling AMD uncompetitive and saying that Nvidia is better are happy to pay 5-6 times more to get from Nvidia the same thing that comes "free" with an AMD GPU. But hey, we are on an Apple forum.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
It does not matter.
It does matter. It's market segmentation. You cannot compare HEDT CPUs to mainstream ones, because they are different platforms. It's like comparing the Core i7-8700K to the AMD Threadripper 2990WX and claiming on that basis that Intel is a "cheap" brand.

People were not OK with it. It was not worth it over the 1700.
And yet, people still bought it ;) It was not worth it over the 1700, but it still increased the ASP of a mainstream CPU. AMD is not a cheap company, whether people like it or not. They design products of great value. But they are not cheap.
 

cube

Suspended
May 10, 2004
17,011
4,972
It does matter. It's market segmentation. You cannot compare HEDT CPUs to mainstream ones, because they are different platforms. It's like comparing the Core i7-8700K to the AMD Threadripper 2990WX and claiming on that basis that Intel is a "cheap" brand.


And yet, people still bought it ;) It was not worth it over the 1700, but it still increased the ASP of a mainstream CPU. AMD is not a cheap company, whether people like it or not. They design products of great value. But they are not cheap.
People who don't care or don't know better bought it.

You can indeed compare segments. AMD provides value; I did not say it was "cheap".
 

splifingate

macrumors 65816
Nov 27, 2013
1,246
1,043
ATL
Geforce is a gaming card.

TMBTC, but my Card gets me to Recovery/Desktop/etc. @2x4K on my MP, which is all I really need/want.

We ("there is no we" (IYGTR)) don't Game here--and we (remember: 'TINW') don't care one-whit about 10-bit (vapor-where?, anyone).

I know it's important to you, CreeptoLoser, to be "correct"--and I will be the last one to convince you otherwise, Brother--but the 1-up game is becoming increasingly-boring.

Since you seem reasonably-intelligent, and erudite vis-a-vis 'Current Events', it seems odd that you insist on deprecating others' use of their Systems . . .

. . . is there really so little room for others in your World View?

Regards, splifingate
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I am actually quite interested in compute benchmarks on an AMD GPU for once. It seems like this may be a little powerhouse that will give even the RTX 2080 Ti a run for its money. The ALU difference in Nvidia's favor is not that big, so the Radeon VII should be close.
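
Quick napkin math on peak FP32 throughput; the shader counts and boost clocks below are the assumed launch specs, so treat the exact figures loosely:

Code:
# Back-of-the-envelope peak FP32 throughput; specs are assumed launch
# figures (not measured), each ALU doing one FMA (2 FLOPs) per clock.
def peak_fp32_tflops(alus, boost_ghz):
    return alus * 2 * boost_ghz / 1000.0

print(f"Radeon VII : {peak_fp32_tflops(3840, 1.80):.1f} TFLOPS")   # ~13.8
print(f"RTX 2080 Ti: {peak_fp32_tflops(4352, 1.545):.1f} TFLOPS")  # ~13.4

On paper they land within a few hundred GFLOPS of each other, which is why I expect the compute results to be close.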
 

deconstruct60

macrumors G5
Mar 10, 2009
12,298
3,893
If it's just 7nm Vega, then it can highly likely use the current driver.

It is actually a different "Vega". Same lineage ... different implementation.


Just like the RX 590 actually uses the same macOS driver as the RX 580.

Because it is the same design, just "recompiled" from 14nm to 12nm. Primarily, the only design change was adjusting for slightly denser transistors. Same functional units with the same abilities, just clocked a bit higher thanks to the "gain" of the slightly smaller process.



If AMD is lazy enough to let it share the same Device ID as well, then the driver is already available.

The double-precision rate was changed. There is new INT4 and INT8 math. The ROPs went from 64 to 128 (instead of one ROP per CU there are two per CU; or look at it as one group of four ROPs per four CUs before, and now one group of four per two CUs). It has two more memory channels. It likely has a different L2 cache (bigger and/or with far more ports, since the number of ROP consumers doubled; the memory bandwidth is up, so the cache needs to go up too). It has external Infinity Fabric links and can be linked together in a network.

There are about 0.7 billion more transistors on this than on Vega 64... those transistors are doing something. That's probably past the threshold for a minimal device ID change.


The central generic graphics pipeline is the same, so many current apps will just see a boost from the higher clocks (which also came along with the shrink). What would be broken, however, would be keeping the device ID the same. This is not purely an implementation shrink. Most of the additions here are not aimed at the gaming or mainstream app market, but that doesn't mean this isn't substantially different.

If some application threw those new instructions at a Vega 56/64, it would hiccup. The two chips accept slightly different code. The ROP balance is substantively different (+100%).

P.S. Since the Vega VII has a few things turned off that the MI50/MI60 cards have turned on, it shouldn't share their device ID either. Elements that are on the die but switched off at the "fuse" level also call for a different ID (the capabilities aren't there, so software/firmware shouldn't ask for them).
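
(For anyone who wants to see what their box actually reports: the driver keys off the PCI vendor/device ID pair. A minimal Linux-side sketch is below; the macOS equivalent is digging the device-id out of ioreg. The Vega 10 ID in the comment is from memory, so treat it as illustrative, and whatever ID the retail Vega 20 card gets is AMD's call.)

Code:
# Minimal sketch: list AMD graphics adapters by PCI device ID on Linux.
# The OS driver matches on these IDs; the Vega 10 value in the comment
# is from memory and only illustrative.
import glob, os

AMD_VENDOR_ID = "0x1002"

for dev in glob.glob("/sys/bus/pci/devices/*"):
    try:
        vendor = open(os.path.join(dev, "vendor")).read().strip()
        device = open(os.path.join(dev, "device")).read().strip()
    except OSError:
        continue
    if vendor == AMD_VENDOR_ID:
        # e.g. 0x687f shows up for RX Vega 56/64; a Vega 20 card needs its own ID.
        print(f"{os.path.basename(dev)}: device id {device}")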
 

Kpjoslee

macrumors 6502
Sep 11, 2007
416
266

deconstruct60

macrumors G5
Mar 10, 2009
12,298
3,893
.....

Just for the record, I'm not very excited about Radeon 7, because frankly - who cares? The price is steep, but that is actually fine, it will help AMD make money.....

The price gouging talk is a bit overblown. For the MI50/MI60 price points, perhaps. But for the Vega VII, the HBM2 is going to drive up costs. AMD is pushing a 'bigger' die through a brand new process tech, and there hasn't been a "pipe cleaner" product to shake out the bugs. I'm not saying AMD is losing money on the MI50 and Vega VII, but the margins are probably thinner than most of the folks complaining think.

AMD probably needs the volume of the Vega VII to spread the costs out more. Nvidia keeping their prices very high in the data center space probably led to higher MI50/MI60 prices, which somewhat created a volume vacuum that AMD can fill with the slightly tweaked MI50 sold as the Vega VII.

It doesn't look like a wide "top to bottom" range of 7nm Vega parts is coming, since these design changes were skewed heavily toward being data center useful. The 12nm shrink of 14nm Vega, by contrast, is a far more cost-effective "recompile" of the same baseline tech: no new feature testing, just the same stuff clocked a bit faster on a slightly denser process (some tweaks due to transistor proximity changes, but no working out of new stuff).

The "bottom" of what 7nm Vega could have been targeted at will get covered by Navi. So the limited volume of the $699 (and up) zone isn't going to drive wafer starts down to dirt-cheap levels.

That makes sense. The MI50/MI60 pragmatically have no video out ports, so more ROPs weren't going to help directly connected display performance. I doubt virtual GPUs driving virtual screens would be a primary focus (although they are pitching that a bit also; multiple users per card helps offset the sky-high price). I thought perhaps the ROPs were coupled to the other computational upgrades somehow, because the virtual screen thing seems like a huge stretch.

The L2 cache is probably better off without 64 more consumers. It probably is bigger (bigger flows of double precision will soak up space in the cache).

From a gamer's perspective, the faster clocks are largely the extent of the faster performance. The parts with new functionality will be much bigger 'wins' elsewhere. The big question is whether Apple is going to put significant effort into covering those differences with software to leverage them.

P.S. I have a suspicion that there are only 4 output sockets on the Vega VII because they traded off some of the DisplayPort transistor space and pin budget for the new Infinity Fabric links (again, not a loss on cards that mostly have zero physically connected displays).

P.P.S. AMD's miscommunication on the ROP count makes their benchmarks a bit more wishy-washy too; who knows how carefully the narrow set was picked to paint a better picture. Apple Mac systems as computational content engines will probably be a better fit than mainstream gaming.
 

cube

Suspended
May 10, 2004
17,011
4,972
Well, it is now claimed the VII will be sold at a loss.

It is after all a crippled MI50.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,298
3,893
Well, it is now claimed the VII will be sold at a loss.

It is after all a crippled MI50.

What is the price of the MI50? 'Crippled' is relative. It will probably cost hundreds of dollars less than an MI50, but that doesn't mean it won't do productive work for some.

If this is the claim that "less than 5000" will be made, that seems rather 'thin' as claims go. It is probably correct that AMD isn't going to "over stuff" the retail pipeline with the cards. They will likely control and dribble them out so that the $699 price doesn't crater after 3-6 months. As most of the typical AMD retail conduits are gamer centric, it probably won't sell at high rates into that market. AMD is also not going to pull a GPU+HBM assembly they could sell as an MI50 or MI60 just to sell it as a Vega VII.

I think the notion that the Navi cards of 2Q-3Q 2019 are going to 'wipe out' the Vega VII is likely a bit of hype. AMD will certainly probably sell more of a much lower priced card, but it seems doubtful the two will really sit in a highly overlapping performance spot.

I think there are two groups. A very small group of speculators who might want to piggyback on the Vega VII's early scarcity to make money reselling them. And another group that wants to throw "doom" on it because they aren't going to buy it but want/need it to fail so that something else they want to buy gets cheaper (e.g., if AMD fails with the Vega VII, then they'll jump into a race-to-the-bottom price war with Nvidia).
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
The loss may not come from manufacturing costs, but from design costs. 7 nm designs are very expensive. Even at $10k per wafer and a 331 mm² die, AMD should be comfortably in the black at the $699 price point, if we count only manufacturing costs.

The other side of this coin is the design cost and the volume of those GPUs. 5,000 available is not that many, and it is not even the highest end (well, for AMD, $699 is the highest end...). 5,000 GPUs sold for $700 is $3.5 million. The design cost for a 7 nm product is around $250 million. HPC Vega 20 GPUs are being sold for $8,000 a piece, which may still not be enough.
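
Rough check on the manufacturing side, with the wafer price and yield as assumptions rather than disclosed numbers:

Code:
# Back-of-the-envelope cost for a 331 mm^2 die on an assumed $10k, 300 mm wafer.
import math

wafer_cost = 10_000.0   # USD, assumed
wafer_diam = 300.0      # mm
die_area   = 331.0      # mm^2 (Vega 20)
yield_rate = 0.6        # assumed for a young 7 nm process

# Common gross-die-per-wafer approximation, accounting for edge loss.
gross = (math.pi * (wafer_diam / 2) ** 2) / die_area \
        - (math.pi * wafer_diam) / math.sqrt(2 * die_area)
good  = gross * yield_rate

print(f"~{gross:.0f} gross dies, ~{good:.0f} good dies")        # ~177 / ~106
print(f"silicon cost per good die: ~${wafer_cost / good:.0f}")  # ~$94

Even with that pessimistic yield, the bare silicon is a small fraction of $699; it's the HBM2, the packaging, and the design cost on top that eat the margin.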

But we have to remember: Vega 20 is a pipe cleaner for 7 nm, and is based on alpha silicon.

I think the notion that the Navi cards of 2Q-3Q 2019 are going to 'wipe out' the Vega VII is likely a bit of hype. AMD will certainly probably sell more of a much lower priced card, but it seems doubtful the two will really sit in a highly overlapping performance spot.

I think there are two groups. A very small group of speculators who might want to piggyback on the Vega VII's early scarcity to make money reselling them. And another group that wants to throw "doom" on it because they aren't going to buy it but want/need it to fail so that something else they want to buy gets cheaper (e.g., if AMD fails with the Vega VII, then they'll jump into a race-to-the-bottom price war with Nvidia).
No, the Navi GPUs most likely will not wipe out the Radeon 7, unless AMD has made the most badass, overkill geometry pipeline in the history of GPUs.

Remember that Navi 10 and 12 are gaming cards and are mostly focused on that part of the market. Navi 20, on the other hand, is supposed to be the replacement for Vega 20, made on the 7 nm EUV process.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,298
3,893
It is not relative if working functionality is just disabled.

For something like DaVinci, FCPX, or some other video data manipulation app, the relative difference between a GPU that does or doesn't have FP64 is about zero. For apps that primarily use FP32 (or smaller), the absence of 1:2 FP64 has about no impact. Same thing for the machine learning tasks, which are trying to get away with the crappiest resolution possible (4-, 8-, 16-bit representations of the data).

Same thing for folks whose budgets go up to $800. Having some other products at $2,000 and $4,000 as options doesn't mean much to them.

It is simply common practice these days that one silicon die is used to fill multiple product SKUs with different target audiences. A 4-, 6-, 8-, or 10-core Intel Xeon W is basically the same die with parts "fused off". Same for several Core i9 x9xx models: the same thing as the Xeon W equivalent with parts turned off. 'Crippled' is not a particularly appropriate adjective. The parts are useful for who/what they are targeted at.
 

cube

Suspended
May 10, 2004
17,011
4,972
For something like DaVinci, FCPX, or some other video data manipulation app, the relative difference between a GPU that does or doesn't have FP64 is about zero. For apps that primarily use FP32 (or smaller), the absence of 1:2 FP64 has about no impact. Same thing for the machine learning tasks, which are trying to get away with the crappiest resolution possible (4-, 8-, 16-bit representations of the data).

Same thing for folks whose budgets go up to $800. Having some other products at $2,000 and $4,000 as options doesn't mean much to them.

It is simply common practice these days that one silicon die is used to fill multiple product SKUs with different target audiences. A 4-, 6-, 8-, or 10-core Intel Xeon W is basically the same die with parts "fused off". Same for several Core i9 x9xx models: the same thing as the Xeon W equivalent with parts turned off. 'Crippled' is not a particularly appropriate adjective. The parts are useful for who/what they are targeted at.
They are crippled. They would be useful for people who need FP64 but cannot afford Instinct.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
They are crippled. They would be useful for people who need FP64 but cannot afford Instinct.

Can you elaborate on who falls into this category? FP64 is not useful for any of the video apps deconstruct60 mentioned, as far as I know. What apps do actually need FP64? Does Metal even expose native FP64?
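
As far as I can tell, the Metal shading language doesn't have a double type at all, so on the Mac the practical way to look for native FP64 is through OpenCL. A minimal check with pyopencl (assuming it's installed; the Apple OpenCL stack is deprecated but still present):

Code:
# Minimal sketch: report whether each OpenCL device advertises FP64 support.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        fp64 = "cl_khr_fp64" in device.extensions
        print(f"{device.name.strip()}: FP64 {'yes' if fp64 else 'no'}")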
 

cube

Suspended
May 10, 2004
17,011
4,972
Can you elaborate on who falls into this category? FP64 is not useful for any of the video apps deconstruct60 mentioned, as far as I know. What apps do actually need FP64? Does Metal even expose native FP64?
Scientific applications, for example.

Windows and Linux do not use Metal.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
Scientific applications, for example.

Windows and Linux do not use Metal.

Maybe people doing scientific research shouldn't be using cheap consumer gaming-focused GPUs for their work? I'm really not sure I see what the big deal is here; if you need FP64, then have your company buy you an MI50 or MI60.
 