I actually thought about giving my post another paragraph to make sure I closed all the holes, but I have been criticized by several posters when I do that. I shall have to ignore them in the future.
The lithium-ion battery used in the iPhone is a consumer-grade cell that can be recharged roughly 300 to 500 times before oxidation on the anode leaves it unable to hold a charge. In other words, the battery is guaranteed to wear out. To extend its life, there are three major things to worry about:
1) Temperature.
2) Amperage.
3) Rate of charge.
That is why the iPhone charges at a rate of 1% a minute for the first 85%. When it reaches 85% it switches to a lower amperage, and at 96% it switches to the lowest amperage. At 100% it does a periodic top-off charge at a very, very low amperage. Which means that an iPhone running the screen, CPU, and GPU at 100% can drain the battery faster than the charge rate used above 96% can replenish it.
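To make the arithmetic concrete, here is a minimal sketch of that staged charging. The 85% and 96% thresholds and the 1%/min full rate come from the description above; the reduced rates and the drain figure are illustrative assumptions, not measured values:

```python
def charge_rate(level: float) -> float:
    """Charging rate in %/min as a function of battery level."""
    if level < 85.0:
        return 1.0   # full-rate charging: ~1% per minute (from the post)
    elif level < 96.0:
        return 0.5   # lower amperage (assumed value)
    elif level < 100.0:
        return 0.1   # lowest amperage (assumed value)
    else:
        return 0.05  # periodic top-off trickle (assumed value)

GPU_DRAIN = 0.3  # assumed %/min drained by screen + CPU + GPU at 100%

level = 94.0
for minute in range(60):
    net = charge_rate(level) - GPU_DRAIN
    level = min(100.0, level + net)

print(f"after an hour under full load: {level:.1f}%")
```

Run it and the level climbs to 96%, then oscillates just below it: above that threshold the trickle rate is smaller than the drain, so the phone loses charge even while plugged in.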
Now, if you remember, the posts have all stated that somewhere between 94 and 96% is where charging seems to stop, which is exactly the toggle point between two of the charging rates. The GPU can easily pull the amperage needed to force that toggle.
That is why the theory holds water in my opinion. It fits all the anecdotal evidence presented in this thread alone, not to mention at least two others on this site.
I will bow to your knowledge of battery charge rates, but that was not really my point. I was referring more to why your phone shows a power drop whilst plugged in, when mine (and in fact everyone else's I have asked about this) has never shown one when loading the GPU to 100%. And more importantly, how do you think this relates to the suspected bug in the GPU firmware? If the bug, as you say, is causing or contributing to the plugged-in power drop when loading the GPU, how can this be with no hardware issue on your phone?
Let's say, for example, my hardware is the same as yours, i.e. non-faulty, but you somehow activate the troublesome code and I do not. We both run 3D games using 100% GPU, but yours drops when plugged in? That does not make sense. As I said before, unless the error in the code can somehow overclock your GPU and make it pull more current, I really can't see from a technical perspective how this is possible just from, as you say, buggy code. Both our GPUs will be at 100%, and if neither has a hardware fault they will pull the same current.
As a follow-up, let's say we both have the buggy code running, which, by your logic, we do, since we both get around 1-2% drain at idle. We both run the GPU at 100%; you get a drop whilst plugged in, I don't. That sounds like the problem is your hardware.
And finally, let's say you have the buggy code and I don't. Again we both load the GPU to 100%; you get drain, and I don't. The GPU cannot be loaded past 100% by running buggy code: 100% is 100%. If the buggy code pulls, for example, 5% at idle, that does not get added on top of the 100% when a 3D game loads the GPU, so you won't end up running at 105%. Basically, the only difference is that I will be at zero at idle and you at 5% at idle (in theory), but once we both load to 100% we are the same, and hence the only difference left is the hardware. Once you have loaded the GPU to 100%, it makes no odds at all what the GPU pulls at idle. See the sketch below.
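Here is a minimal sketch of that saturation argument, assuming utilization is capped at 100% and current scales roughly linearly with it; the 5% idle load and the full-load current figure are purely hypothetical:

```python
GPU_CAP = 100.0          # utilization cannot exceed 100%
FULL_LOAD_CURRENT = 1.2  # hypothetical amps drawn at 100% GPU

def total_utilization(game_load: float, buggy_idle_load: float) -> float:
    """Combined GPU utilization, clamped at the 100% cap."""
    return min(GPU_CAP, game_load + buggy_idle_load)

def current_draw(utilization: float) -> float:
    """Assume current scales linearly with utilization (a simplification)."""
    return FULL_LOAD_CURRENT * utilization / GPU_CAP

# Phone A: buggy code pulls 5% at idle; Phone B: clean, 0% at idle.
for idle in (5.0, 0.0):
    util = total_utilization(game_load=100.0, buggy_idle_load=idle)
    print(f"idle={idle}%: utilization={util}%, draw={current_draw(util):.2f} A")
```

Both phones print the same utilization and the same draw: once the game saturates the GPU, the idle load contributed by any buggy code makes no difference.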
Maybe the bug loads the CPU at the same time, but that cannot be the case, as you have already shown that you only see the drop in power when plugged in with 100% GPU loading and zero CPU loading. I cannot see how your drop when plugged in has anything to do with software (based on your tests, anyway).