I disabled automatic graphics switching because I wanted to see what the impact on battery life would be. At the end of my test, I'm left wondering what the exact benefit of using the Intel chip really is.
With graphics switching enabled, the temperature during basic web browsing is around 45-50 degrees. When using the Radeon only, it's between 42 and 44 degrees. That just seems odd to me.
Also, on a positive note, battery life doesn't even seem to be negatively impacted by using the Radeon chip. I got the same ~7 hours of basic usage.
There have also been stories of graphics switching causing issues again, as with the 2010 line and the Nvidia chips. At this point I don't know what to think. I mean, isn't the Intel GPU the one that's on the same die as the CPU?