Integrated graphics share the same total TDP (power/thermal budget) as the CPU. So a 45W chip can either give most of that budget to the CPU cores (e.g. 40W CPU / 5W iGPU) or shift a big chunk of it to the iGPU (e.g. 25W CPU / 20W iGPU). It will actually exceed 45W for a short time (turbo), but as soon as it hits ~100°C it starts throttling and the power draw pulls back closer to its max TDP of around 45W.
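Just to make the budget-sharing idea concrete, here's a toy sketch. The numbers and the split heuristic are made up for illustration; the real firmware/driver power management is far more sophisticated:

```python
# Toy model of a shared package power budget (illustrative only).
PACKAGE_TDP_W = 45.0   # sustained package limit for this hypothetical 45W chip
IGPU_FLOOR_W = 5.0     # assume the iGPU always needs a small baseline

def split_budget(igpu_demand_w: float) -> tuple:
    """Give the iGPU what it asks for (within the budget); the CPU gets the rest."""
    igpu_w = min(max(igpu_demand_w, IGPU_FLOOR_W), PACKAGE_TDP_W - 10.0)
    cpu_w = PACKAGE_TDP_W - igpu_w
    return cpu_w, igpu_w

for demand in (5, 20, 35):
    cpu_w, igpu_w = split_budget(demand)
    print(f"iGPU wants {demand:>2}W -> CPU gets {cpu_w:.0f}W, iGPU gets {igpu_w:.0f}W")
```

The point is simply that every watt the iGPU pulls is a watt the CPU cores can't use, and vice versa.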
For normal workloads (no CAD/gaming/3D graphics), this isn't a problem because the load is generally intermittent, so the CPU/iGPU will hit its turbo speeds and you'll have a smooth, fast computer.
However, as soon as you run a game that needs sustained high power, that budget has to be shared between your CPU and iGPU. Your 3.5GHz (turbo-boosted) CPU will under-clock itself to keep the heat within limits and end up running well below its rated frequency, often around 1.5GHz, while the iGPU is limited too because it sits in the same power and heat envelope. This is why iGPUs still suck for games.
Benchmarks usually look okay because they only run for a short time. In real workloads, things run for much longer and the accumulated heat eventually slows everything down.
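If you want a rough cross-platform way to watch this happen, you can hammer the cores and log the reported clock speed. Here's a sketch using the third-party psutil package (pip install psutil); whether a frequency reading is available depends on your OS, and the exact numbers will vary by machine:

```python
# Rough sketch: load every core and log the reported clock once a second.
# On a thermally limited laptop you should see the frequency climb to turbo
# and then sag as the package heats up over a few minutes.
import multiprocessing
import time

import psutil


def burn() -> None:
    """Busy-loop forever to keep one core fully loaded."""
    while True:
        pass


if __name__ == "__main__":
    workers = [multiprocessing.Process(target=burn, daemon=True)
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()

    try:
        for elapsed in range(300):                  # log for roughly 5 minutes
            freq = psutil.cpu_freq()                # may be None on some systems
            util = psutil.cpu_percent(interval=1)   # blocks ~1s between samples
            clock = f"{freq.current:7.0f} MHz" if freq else "    n/a"
            print(f"t~{elapsed:3d}s  load={util:5.1f}%  clock={clock}")
    finally:
        for w in workers:
            w.terminate()
```

Run a quick benchmark and you'll see the early samples look great; leave it running and the clock drifts down as the heat builds.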
Have a look at the Intel Power Gadget to see what I'm talking about in action:
https://software.intel.com/en-us/articles/intel-power-gadget-20 - If you want to game on it, the external TB3 GPU enclosure from Razer is a good option.