Hello folks,
I haven't seen a page about this, but I'm rather curious.
So I know the rated max iGPU clocks (by CPU base frequency):
HD 5000 - 1100 & 1200 MHz
Iris 5100 - 2.4 GHz - 1100 MHz
Iris 5100 - 2.6 & 2.8 GHz - 1200 MHz
Iris 5200 - 2.0 GHz - 1200 MHz
Iris 5200 - 2.3 & 2.6 GHz - 1300 MHz
What I'm curious about is: for each of these models, after sustained GPU-and-CPU-intensive work (or GPU-intensive work alone), what steady-state clock does the iGPU settle at?
For example, I believe the HD 5000 drops to around 400 MHz while gaming, underclocking itself because of heat. So choosing an i7 over an i5 in, say, a MacBook Air might not make sense for gaming.
The point I'm getting at: when these other chips throttle, at what MHz do they steady-state? If you buy the upgraded chip but it's constantly underclocking itself, do you ever actually benefit from the faster iGPU?
Apologies if this sounds dumb; I'm just curious whether there's any point in upgrading the chip if you're chasing better iGPU performance.