To all of you saying 100% is 100%: you are wrong.

It always depends on the circumstances: you can have 100% GPU LOAD with less than 100% GPU POWER DRAW, and that is not a problem, since the graphics card is designed for exactly that usage. But it is not designed for sustained 100% power draw, because that usually exceeds the TDP the cooler was built around.
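You can watch this distinction yourself. Here is a minimal monitoring sketch, assuming an NVIDIA card and the official pynvml bindings (`pip install nvidia-ml-py`); device index 0 and the one-second polling interval are placeholder choices:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if you have several

try:
    for _ in range(10):  # poll for ten seconds
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)            # .gpu is load in %
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0      # NVML reports milliwatts
        limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
        print(f"load: {util.gpu:3d}%   power: {power_w:6.1f} W / {limit_w:.1f} W limit")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run it under a real game and then under a synthetic stress test and you will typically see the two numbers diverge: 100% load with power well under the limit in the first case, both pegged in the second.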

The card does not mind short peaks of 100% power draw: they are short-lived, and the thermal mass of the parts keeps the core temperature from exceeding the maximum. But if the heat generated by the consumed power stays continuously above the TDP (or, for a modified card, above the dissipation limit of its cooling system), the card will inevitably overheat, and nothing anyone modifies in the case will help against that. This is why overclocked cards also ship with a beefed-up fan setup; that does not happen because two fans look cooler than one!

When designing a graphics card, you have to estimate its normal duty cycle the same way you would when designing a vehicle engine. You take a base level of usage and then add headroom to exceed it, but only for a short time. It is much like the red zone on a car's rev counter: you can push into it briefly to get the most out of the engine, but staying there constantly means kissing your cylinder head gasket goodbye.

The whole concept of load vs. power is something game developers need to consider when programming. There have been plenty of cases where a game that was graphically demanding for its time ran at 100% load without maxing out power, which is fine. The bad part starts when switching to the graphically simple 2D menu still shows 100% load but pushes power consumption to its limit, because with nothing else to do, the card redraws that menu a few hundred times a second just because it can. And that heats the card beyond its safety limits. Capping the frame rate in such scenes avoids this, as the sketch below shows.
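A minimal sketch of such a frame cap, sleeping away the time the GPU would otherwise spend redrawing; the 60 fps target and the `render_menu` stub are assumptions standing in for a real engine's draw call:

```python
import time

TARGET_FPS = 60                  # assumed cap; plenty for a static 2D menu
FRAME_TIME = 1.0 / TARGET_FPS

def render_menu():
    """Stand-in for the engine's actual menu draw call."""
    pass

def menu_loop(menu_open):
    """Redraw the menu at most TARGET_FPS times per second while menu_open() is True."""
    while menu_open():
        start = time.perf_counter()
        render_menu()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            # Sleep off the rest of the frame budget instead of immediately
            # redrawing; the GPU idles and power draw drops accordingly.
            time.sleep(FRAME_TIME - elapsed)
```

Enabling v-sync has a similar effect, since it blocks the render loop until the display is ready for the next frame.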


tl;dr: GPU load is not the same as GPU power, and neither can be judged in isolation when it comes to GPU heat.