Tempting price, but ultimately not worth it ;) There is a large disparity between individual machines. I'm not going to go looking for the post, but I think it was on this forum that one fellow had really poor scores, did just that, and saw substantial improvements. My scores were not that far off his after a repaste, though, so the potential of destroying $4.9k (with AppleCare and tax) worth of equipment does not justify the expected performance gains. In my case at least, I apparently have a good copy.

EDIT - I forgot: the GPU scores would not improve at all, since it can maintain max clock without a problem. The CPU would improve.
On my 2012 rMBP it's the GPU that feels the heat (no pun intended); the CPU does fine, more or less. But since the two are like conjoined twins, you're bound to get at least one of them throttling at some point, and sometimes even if you gimp the other chip with a third-party tool.

Really, a metal-based TIM isn't all that necessary, since heat trapping eventually takes place anyway with a closed lid. A high-quality paste can do wonders in some instances, since Apple's application is luck of the draw, I'm afraid.
 
The issue, as I see it, is that Apple did very little (nothing?) to accommodate the increased heat of the 2018 MBPs. They're not the only ones, but given the high price tag, it does leave a bad taste in one's mouth. Add in the fact that Apple's proprietary technology (the T2) is causing kernel panics for many owners, and it's unbelievable. Looks like someone who bought the Mac Mini is having this issue as well. So, to confirm, Apple's T2 chip is causing problems in the iMac Pro, the MacBook Pro and the Mac Mini. I'm sure we'll hear about it for the MBA as well.

Hopefully Apple has reworked the cooling to handle the Vega more efficiently; only time will tell.

The real purpose of T1 and T2 is beyond my comprehension.

I don't think there was an immediate need for it, and even if there were, it should be failproof. Like all previous computers that came before this goddamn thing. The best hardware is the one that doesn't get in the way. If this crap serves a secondary purpose, why even advertise it?

The latest news is that T2 prevents Linux from being installed. Imagine spending $1500 on a Mac Mini with the sole purpose of using Linux and being gifted an expensive Space Gray brick.

Ugh.
 
Apple prioritizes the GPU: minimal throttling on the GPU, huge throttling on the CPU. The GPU seems to start throttling when the CPU goes below 1.6 GHz. And the funny thing is that it stays there when you remove the CPU load, forever, or until you unload the GPU as well.

Sounds weird; I never saw frequencies drop like that when running intensive stuff on the GPU and CPU simultaneously. Then again, I didn't try to torture-test the CPU at the same time. Anyway, I think it makes more sense to look at realistic use of the computer. 100% utilisation of both CPU and GPU, as previously mentioned, is not realistic.

Ahhh - the most reliable GPU benchmark in the world. Here are my own scores in Cinebench OpenGL. Notice anything, ehm, suspicious?

Oh, I completely agree that Cinebench is pointless as a GPU benchmark, and I don't have a high opinion of the CPU benchmark either. But I don't think they used Cinebench. Rather, they rendered the same project in Cinema 4D R20, and they claim the Vega 20 machine took 60% less time doing it, which is a rather valid way of testing a popular professional workflow. Of course, it doesn't say anything about how the GPU will perform in other applications.

My understanding is that it doesn't automatically result in double performance, and a 50% performance gain is what's to be expected by going from promoted fp32 to 2xfp16 CUs. To show double or more, you would pretty much need to write your own low-level code that pairs fp16 data types and packs them into 32-bit registers.

Not at all; as long as your numbers are laid out in a contiguous array, you get double throughput automatically. A single 32-bit FP lane can do two 16-bit FP ops simultaneously. It's really just like SSE/AVX on the CPU: you can use your 256-bit ALU to process 4 doubles in one operation, or 8 floats, and the overall time is the same. So stuff like vector/matrix products etc., which are a common application for fp16 types in machine learning, gets the performance benefit automatically. Unless your programmer is an idiot and has laid out the basic data types in a stupid way, of course :)
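
To make the SSE/AVX analogy concrete, here's a quick C++ sketch (my own illustration, nothing Vega-specific): the same 256-bit AVX register holds either 4 doubles or 8 floats, and a single instruction processes the whole register either way, so halving the element width doubles the elements handled per instruction. Packed fp16 on the GPU is the same trick, with two 16-bit values per 32-bit lane.

```cpp
// Illustrative only: one 256-bit AVX add over 4 doubles vs one over 8 floats.
// Compile with e.g. g++ -mavx example.cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    alignas(32) double d_a[4] = {1, 2, 3, 4}, d_b[4] = {10, 20, 30, 40}, d_r[4];
    alignas(32) float  f_a[8] = {1, 2, 3, 4, 5, 6, 7, 8},
                       f_b[8] = {10, 20, 30, 40, 50, 60, 70, 80}, f_r[8];

    // One instruction adds 4 doubles...
    _mm256_store_pd(d_r, _mm256_add_pd(_mm256_load_pd(d_a), _mm256_load_pd(d_b)));
    // ...and one instruction adds 8 floats: same register width, twice the elements.
    _mm256_store_ps(f_r, _mm256_add_ps(_mm256_load_ps(f_a), _mm256_load_ps(f_b)));

    std::printf("doubles: %.0f %.0f %.0f %.0f\n", d_r[0], d_r[1], d_r[2], d_r[3]);
    std::printf("floats:  %.0f %.0f ... %.0f\n", f_r[0], f_r[1], f_r[7]);
    return 0;
}
```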
The real purpose of T1 and T2 is beyond my comprehension.

I don't think there was an immediate need for it

I think the reason for T2 is fairly obvious. Apple needed a way to differentiate their products from others. For a long time, their "thing" was making computers that were thinner and lighter but would still perform the same as, or better than, many of their larger, uglier brethren, while also featuring top battery life. In the last few years, the competition has caught up and there are now a lot of premium laptops in a similar category. The T2 chip is something that those competitors can't copy that easily. And it gives Apple a decisive edge in SSD performance and security. Of course, assuming it works correctly :)
 
If you're buying with the intention of having it outlast 4-6 years of hardware updates in new releases, I don't really get why you're panicking because the first hardware update happens sooner than anticipated. You were obviously not worried about having the newest hardware for years.
To be fair, I do understand that an upgrade just weeks after one’s purchase does burn, especially if you’ve chosen the “latest and greatest” model. The expectation, I guess, was to have the best machine for at least several months, though of course not for the entire projected time of use.
I also bought it with the conviction that it would work properly, which it doesn't, so there's that.
I fail to understand why you haven’t yet brought in a lawyer to defend your rights, which are clear and strong, especially given the high price of what you bought. I’m sure there are consumer protection agencies in your country who’d be willing to help without you needing to pay a lawyer’s normal fees. In my experience (though not with Apple) a single letter written by a lawyer was sufficient to get the company to contact me with a proposal.
As was mentioned somewhere above, Apple is usually quite flexible when your claims are justifiable. They DO care about bad press. If your local Apple office is not listening, escalate to their UK or Luxembourg HQ. If that fails, call the US. You do have several possibilities, use them.
 
Looks like you never had a good PC laptop.

Actually, it looks more like you have not used one recently with the latest Windows 10. These used to work fine, but with all this Modern Standby madness it has gone from good to bad.
 
Rather, they rendered the same project in Cinema 4D R20, and they claim the Vega 20 machine took 60% less time doing it, which is a rather valid way of testing a popular professional workflow. Of course, it doesn't say anything about how the GPU will perform in other applications.

This was released two months ago? Maybe they rewrote it in a way that takes advantage of half-precision math. I couldn't find any comparison between the RX 580 and Vega in actual Cinema 4D that would give a ballpark value. It looks like they switched to AMD ProRender in R19, so one could expect it to support all the latest bells and whistles of the Vega cards. Bottom line: in general use, Vega 56 is about 30% faster than the RX 580, while using 55% more CUs, a faster clock and more power. And I'm not even talking about Vega 64, which needs an additional 50% more power for a further 15% or so improvement.

Not at all; as long as your numbers are laid out in a contiguous array, you get double throughput automatically.

I'm not going to pretend I know how this should be implemented to take full advantage of it. But I'm pretty sure that if the software is fp32-based it needs to be rewritten, identifying areas that can use half precision. And by the time that happens on a large scale, as is always the story with AMD's improvements, Nvidia will pull something a lot faster from their closet, Apple will be using ARM CPUs and making laptops that self-destruct after 2 years of use, and I'll be back on Windows.
 
Actually, it looks more like you have not used one recently with the latest Windows 10. These used to work fine, but with all this Modern Standby madness it has gone from good to bad.
I have 2 Razer Blades, an ASUS ROG, a Toshiba ultrabook, a Yoga Book 2 and an Aorus X5; all came with Windows 10, and I've had no problems with sleep/wake.

Maybe Microsoft messed up by adding some extra "chip", something similar to Apple's T2.

Personally, I don't like Microsoft hardware, because everything is soldered and glued and they are terribly overpriced; I prefer other manufacturers.
 
I think the reason for T2 is fairly obvious. Apple needed a way to differentiate their products from others. For a long time, their "thing" was making computers that were thinner and lighter but would still perform the same as, or better than, many of their larger, uglier brethren, while also featuring top battery life. In the last few years, the competition has caught up and there are now a lot of premium laptops in a similar category. The T2 chip is something that those competitors can't copy that easily. And it gives Apple a decisive edge in SSD performance and security. Of course, assuming it works correctly :)

It's so secure it locks you out of your own laptop.

That, coupled with the Touch Bar, thermal throttling, and charging an obscene amount of money for OK-ish performance, is what's killing the Mac.
 
Bottom line: in general use, Vega 56 is about 30% faster than the RX 580, while using 55% more CUs, a faster clock and more power. And I'm not even talking about Vega 64, which needs an additional 50% more power for a further 15% or so improvement.

True, but mobile cards have lower frequencies, undergo more rigorous selection, are die-thinned etc. It's quite difficult to predict the end result. At any rate, we should see some benchmarks fairly soon.

I'm not going to pretend I know how this should be implemented to take full advantage of it. But I'm pretty sure that if the software is fp32-based it needs to be rewritten, identifying areas that can use half precision.

Sure, if your software uses fp32, you'll have to rewrite the parts where you want to use half precision instead.

And by the time that happens on a large scale, as is always the story with AMD's improvements, Nvidia will pull something a lot faster from their closet

It already happened at a large scale? The half-precision format has been supported on GPUs for over a decade now and has been widely used in games and pro-level software. Half-precision data uses half the storage and half the bandwidth, so it is attractive for many applications no matter how fast your ALUs can process it.
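
Just to put the storage point in concrete terms, here's a toy C++ example of my own (it uses the CPU's F16C conversion instructions, nothing GPU-specific): converting an fp32 buffer to fp16 literally halves the bytes you have to keep around and move over the bus, regardless of how fast anything does math on it.

```cpp
// Toy example of the storage/bandwidth point: 8 floats (32 bytes) stored as
// 8 half-precision values (16 bytes) via F16C, then expanded back for use.
// Compile with e.g. g++ -mavx -mf16c example.cpp
#include <immintrin.h>
#include <cstdint>
#include <cstdio>

int main() {
    alignas(32) float src[8] = {0.5f, 1.0f, 1.5f, 2.0f, 3.25f, 4.0f, 100.0f, 0.125f};
    alignas(16) uint16_t half_bits[8];   // fp16 storage: half the footprint of src
    alignas(32) float back[8];

    // fp32 -> fp16, rounding to nearest even
    _mm_store_si128((__m128i*)half_bits,
                    _mm256_cvtps_ph(_mm256_load_ps(src), _MM_FROUND_TO_NEAREST_INT));
    // fp16 -> fp32 again when the values are actually needed
    _mm256_store_ps(back, _mm256_cvtph_ps(_mm_load_si128((const __m128i*)half_bits)));

    std::printf("fp32 buffer: %zu bytes, fp16 buffer: %zu bytes\n",
                sizeof(src), sizeof(half_bits));
    for (int i = 0; i < 8; ++i)
        std::printf("%g -> %g\n", src[i], back[i]);
    return 0;
}
```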

At any rate, I don't think Vega's ability to process half precision at higher speed is, or should be, its major selling point. Sure, it's nice to have, but unless you do a lot of deep learning, who cares, really. And it's not exclusive to Vega: Pascal cards have it too (it's just that Nvidia made a very weird choice, as they often do, of deliberately slowing down half-precision computation on consumer cards). The great things about mobile Vega are a) HBM2 in a laptop, b) the ability to efficiently schedule heterogeneous programs, and c) its "memory is cache" design.

As to how it will perform in practice, we will see. I am fairly sure that 50% at the same power consumption is possible.
 
I have 2 Razer Blades, an ASUS ROG, a Toshiba ultrabook, a Yoga Book 2 and an Aorus X5; all came with Windows 10, and I've had no problems with sleep/wake.

Maybe Microsoft messed up by adding some extra "chip", something similar to Apple's T2.

Personally, I don't like Microsoft hardware, because everything is soldered and glued and they are terribly overpriced; I prefer other manufacturers.

It's not about the machines, it's about the software, i.e. Windows 10, which has had problems with this lately but used to work. But that is probably a discussion for another thread.
 
It's not about the machines, it's about the software, i.e. Windows 10, which has had problems with this lately but used to work. But that is probably a discussion for another thread.

It's not a Windows 10 problem, it's a hardware problem, sometimes BIOS/UEFI.

I've been working on Windows 10 machines for the past 3 years. On most ultrabooks the sleep/wake problem was solved by a BIOS/UEFI update; other things are the ACPI drivers, which should be the newest available from the manufacturer's website. Gaming laptops are usually more robust and less problematic.
 
It already happened at a large scale? The half-precision format has been supported on GPUs for over a decade now and has been widely used in games and pro-level software.
True, but it didn't come with the benefit of double performance. I don't know about pro software, but games definitely don't use half precision. A year ago I remember the fuss about Ubisoft using fp16 and AMD's victorious road to domination of the gaming market. It turned out to be used only for water, and that's the only time I've heard about it in any game.
As to how it will perform in practice, we will see. I am fairly sure that 50% at the same power consumption is possible.
I'm trying to figure out some sensible bet to make ;)
 
It's not a Windows 10 problem, it's a hardware problem, sometimes BIOS/UEFI.

I've been working on Windows 10 machines for the past 3 years. On most ultrabooks the sleep/wake problem was solved by a BIOS/UEFI update; other things are the ACPI drivers, which should be the newest available from the manufacturer's website. Gaming laptops are usually more robust and less problematic.

Just google it and you will find several reports about battery drain, poor performance after resume, etc., and this is with the latest BIOS, drivers and so on. I have experienced it myself with Dells and ThinkPads. It might not be apparent unless you know what to look for, though. However, that specific discussion is probably better suited elsewhere.
 
True, but it didn't come with the benefit of double performance. I don't know about pro software, but games definitely don't use half precision.

Of course games use half precision. It's a common format for HDR render targets, for example. It's used more as a means to save memory and bandwidth, though. Shading languages generally have rather underdeveloped type systems (Metal is one very fortunate exception), but they do include hints about which precision is sufficient. These are used widely in OpenGL ES games. Other than that, the driver can optimise certain variables to use half precision if their inputs and outputs are low-precision formats and no high-precision math is required.
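
For what it's worth, this is roughly what it looks like on the Metal side, as a sketch of my own (Metal Shading Language is a C++14 dialect; the identifiers here are invented): the texture is declared and sampled as half, and the colour work stays in half precision, which is plenty for an LDR result and keeps register pressure and bandwidth down.

```cpp
// Metal Shading Language sketch: half precision end to end in a fragment shader.
// Illustrative only; all identifiers are made up.
#include <metal_stdlib>
using namespace metal;

struct VertexOut {
    float4 position [[position]];   // positions stay in full precision
    float2 uv;
};

fragment half4 tintedFragment(VertexOut in              [[stage_in]],
                              texture2d<half> colorTex  [[texture(0)]],
                              sampler colorSampler      [[sampler(0)]],
                              constant float4 &tint     [[buffer(0)]])
{
    half4 c = colorTex.sample(colorSampler, in.uv);  // sampled straight into half4
    return c * half4(tint);                          // half math for the colour work
}
```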
 
Personally, I don't like Microsoft hardware, because everything is soldered and glued and they are terribly overpriced; I prefer other manufacturers.

Hmmm... soldered and glued... terribly overpriced... what other manufacturer does that remind me of? It's on the tip of my tongue... but the name escapes me. Some kind of fruit name of some sort...
 
Of course games use half precision. It's a common format for HDR render targets, for example. It's used more as a means to save memory and bandwidth, though. Shading languages generally have rather underdeveloped type systems (Metal is one very fortunate exception), but they do include hints about which precision is sufficient. These are used widely in OpenGL ES games. Other than that, the driver can optimise certain variables to use half precision if their inputs and outputs are low-precision formats and no high-precision math is required.
Does it have any meaningful impact on performance when using Vega, and, in any case, which DirectX games are utilizing it? I don't think anybody is going to program games in OpenGL ES with Vega in mind. I've found an interesting article on this subject by a Codemasters programmer. It seems like utilizing fp16 in DirectX is quite a bit of effort, and they got a 10% improvement in shader performance on Vega.
https://gpuopen.com/first-steps-implementing-fp16/
 
I fail to understand why you haven’t yet brought in a lawyer to defend your rights, which are clear and strong, especially given the high price of what you bought. I’m sure there are consumer protection agencies in your country who’d be willing to help without you needing to pay a lawyer’s normal fees. In my experience (though not with Apple) a single letter written by a lawyer was sufficient to get the company to contact me with a proposal.
As was mentioned somewhere above, Apple is usually quite flexible when your claims are justifiable. They DO care about bad press. If your local Apple office is not listening, escalate to their UK or Luxembourg HQ. If that fails, call the US. You do have several possibilities, use them.

Thanks for the pep talk, I appreciate it. FWIW, I’m now playing the “waiting game” while the Apple legal team handles the case. So after a month of waiting and no responses... I have to wait a little more.
I hope it gets solved
 
Does it have any meaningful impact on performance when using Vega, and, in any case, which DirectX games are utilizing it? I don't think anybody is going to program games in OpenGL ES with Vega in mind.

It has a meaningful impact on performance on all GPUs, not just Vega, simply because of the saved memory and bandwidth. As to the rest, I do not know, and as I said, I do not think it is interesting. I certainly won't spend any time trying to optimise my shaders for Vega; I think it's a dumb time investment. When I work on the Metal part of my renderer, I use half precision whenever it makes sense for the algorithm. In the Vulkan backend I simply don't bother.
 
It has a meaningful impact on performance on all GPUs, not just Vega, simply because of the saved memory and bandwidth. As to the rest, I do not know, and as I said, I do not think it is interesting. I certainly won't spend any time trying to optimise my shaders for Vega; I think it's a dumb time investment. When I work on the Metal part of my renderer, I use half precision whenever it makes sense for the algorithm. In the Vulkan backend I simply don't bother.
So you're programming renderers for a living? Why didn't you just say so initially? It would have saved me some time. You knew what I meant when I said games don't use fp16: not in general, but in the context of improving Vega performance.
To summarize with carefully chosen words: unlike mobile and consoles, the PC gaming market (the main one, DirectX-based) doesn't utilize half-precision math to exploit the hardware capabilities offered by specialized compute units, and will still not use it, because nobody will invest the engineering effort to optimize for a small subset of customers. And this has nothing to do with the general good programming practice of choosing the smallest possible data type and not holding a logical value in 32 bits of memory. Does that statement sound accurate?
 
So you're programming renderers for a living?

Not for a living, it's a hobby for now. I do have an aspiration to develop it into a business :)

Does that statement sound accurate?

Pretty much. You do see press releases from gaming studios here and there declaring that they do such optimisations, but it's all just marketing anyway. As I said, I don't think it's worth it. At any rate, both AMD and Nvidia have teams of devs whose task is to analyse popular games and write driver-side optimisations for them. It's a terrible industry.
 
To be fair, I do understand that an upgrade just weeks after one’s purchase does burn, especially if you’ve chosen the “latest and greatest” model. The expectation, I guess, was to have the best machine for at least several months, though of course not for the entire projected time of use.
I fail to understand why you haven’t yet brought in a lawyer to defend your rights, which are clear and strong, especially given the high price of what you bought. I’m sure there are consumer protection agencies in your country who’d be willing to help without you needing to pay a lawyer’s normal fees. In my experience (though not with Apple) a single letter written by a lawyer was sufficient to get the company to contact me with a proposal.
As was mentioned somewhere above, Apple is usually quite flexible when your claims are justifiable. They DO care about bad press. If your local Apple office is not listening, escalate to their UK or Luxembourg HQ. If that fails, call the US. You do have several possibilities, use them.

I don't think you are failing to see anything; there is no case, because there are no damages and/or breach of contract. In the US there are no consumer rights that mandate a company lay out its business plans and update schedules for consumers buying the previous product.

The machine offers the performance and capability it was sold with, which is unaffected by the release of a new GPU. A computer is purchased based on the requirements of the workload it will be doing. Making matters worse, Apple's support is arbitrarily tied to model year: a 2018 MBP base model and a 2018 MBP maxed-out model will stop receiving Apple support at the same time. Someone who doesn't edit video/photos could literally have no measurable difference in performance between the iGPU and dGPU models.

Wanting to have the "best" for X amount of time is just self-entitlement (emotions). Apple's lawyers would tear this down before it ever made it into court. Does the customer require the best for their workload without compromise? If so, then they will need receipts showing they upgraded every iteration. What is "the best" anyway? There are competing laptops that are more powerful with better specs. Why weren't those laptops purchased?

Most obviously, the product hasn't even been released yet. If it gets delayed 2 months, is it a problem? 3 months? 4 months? How much does Apple owe him? Is that amount based on time since purchase? Apple also said the pricing would be announced, so if it's just more expensive options, what then? We can't even establish how badly someone's feelings are or aren't hurt for the counts at this moment in time! Lol

Does the opposite apply as well, btw? If Apple goes 2 years without updating a product, do the early adopters of that product need to pay Apple more money, since their model was the best twice as long?

Honestly, I completely understand and can totally empathize with this situation. I would be very annoyed, but the reality of the matter is that this is all just emotional, because of what someone considers a lot of money.

Side note: don't buy specs you don't need with Apple products. System requirements for support are based on years and are otherwise arbitrary. The lack of hardware is what ends support, not the speed of it, e.g. macOS using a proprietary wireless protocol that requires BT 5.0. If you don't actually use the hardware to its fullest, then one day you'll end up replacing hardware you never used...
 
I don't think you are failing to see anything; there is no case, because there are no damages and/or breach of contract. In the US there are no consumer rights that mandate a company lay out its business plans and update schedules for consumers buying the previous product.

The machine offers the performance and capability it was sold with, which is unaffected by the release of a new GPU. A computer is purchased based on the requirements of the workload it will be doing. Making matters worse, Apple's support is arbitrarily tied to model year: a 2018 MBP base model and a 2018 MBP maxed-out model will stop receiving Apple support at the same time. Someone who doesn't edit video/photos could literally have no measurable difference in performance between the iGPU and dGPU models.

Wanting to have the "best" for X amount of time is just self-entitlement (emotions). Apple's lawyers would tear this down before it ever made it into court. Does the customer require the best for their workload without compromise? If so, then they will need receipts showing they upgraded every iteration. What is "the best" anyway? There are competing laptops that are more powerful with better specs. Why weren't those laptops purchased?

Most obviously, the product hasn't even been released yet. If it gets delayed 2 months, is it a problem? 3 months? 4 months? How much does Apple owe him? Is that amount based on time since purchase? Apple also said the pricing would be announced, so if it's just more expensive options, what then? We can't even establish how badly someone's feelings are or aren't hurt for the counts at this moment in time! Lol

Does the opposite apply as well, btw? If Apple goes 2 years without updating a product, do the early adopters of that product need to pay Apple more money, since their model was the best twice as long?

Honestly, I completely understand and can totally empathize with this situation. I would be very annoyed, but the reality of the matter is that this is all just emotional, because of what someone considers a lot of money.

Side note: don't buy specs you don't need with Apple products. System requirements for support are based on years and are otherwise arbitrary. The lack of hardware is what ends support, not the speed of it, e.g. macOS using a proprietary wireless protocol that requires BT 5.0. If you don't actually use the hardware to its fullest, then one day you'll end up replacing hardware you never used...

My specific case is not "Vega" only; my case is a computer that's faulty:
- it throttles to 2.7 GHz (can't maintain base speed)
- the display flickers (proven and recorded on a clean system)
- the audio crackles
and a few other minor (more or less) issues.
And they released a new one before mine was even fixed, and right now I have to wait for Apple's legal team and have had to find a replacement computer so I can continue working...

My first case number goes back to when the computer was 13 days old, still within the legal return period. I just "trusted" Apple (my bad on that count) that they would make it right...
and after a month and a half they still haven't.

So, in short, what Apple owes me: they owe me a working machine for 5.3 grand... not a new machine that I have to take in for service.

So in my particular case (which is what you're commenting on) you were mistaken. If the case were just Vega, then you would be correct.

Vega was just salt in the wound.
 