Just out of curiosity (helps to be retired and have time to do completely unimportant stuff that takes a little time)...
I wondered what the temperature rise of a battery somewhere in the range of 200 amp hours of capacity would be if 50 amps were put into it for one hour.
Some assumptions that won't match reality, but here they are: 50 amps is constant for the whole hour. Lead acid batteries are roughly 85 percent efficient in charging, so 7.5 amps go into heating the battery and 42.5 amps go into stored electrical energy. Assume the whole battery has the heat capacity of 150 pounds (68 kg) of lead. So the power that goes into heating the battery at 13.3 volts is 7.5 amps * 13.3 volts ≈ 100 watts.
100 watts for one hour is 100 watt hours. Convert this to kJ and it comes out to about 360 kJ (Google will do this conversion for you if you want to check).
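If you want to play with the numbers, here is a quick Python sketch of the arithmetic so far. The current, efficiency, and voltage are just the assumptions from this post, not measured values:

```python
# Assumptions from the post: 50 A constant charge, ~85% charge efficiency, 13.3 V.
charge_current_a = 50.0
charge_efficiency = 0.85
charge_voltage_v = 13.3

heating_current_a = charge_current_a * (1 - charge_efficiency)  # 7.5 A worth of losses
heating_power_w = heating_current_a * charge_voltage_v          # ~100 W of heat
heat_energy_kj = heating_power_w * 3600 / 1000                  # over one hour -> ~360 kJ

print(f"Heating power: {heating_power_w:.1f} W")        # 99.8 W
print(f"Heat over one hour: {heat_energy_kj:.0f} kJ")   # 359 kJ
```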
Use the equation in this link https://www.engineeringtoolbox.com/specific-heat-metals-d_152.html and use 0.13 kJ/kg·°C as the specific heat of lead.
The equation is Q = cp * mass * ΔT (check the units). Plugging the numbers above into it:
360 kJ = (0.13 kJ/kg·°C) * (68 kg) * (ΔT in °C)
ΔT ≈ 40.7 °C
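And a short continuation of the sketch that solves Q = cp * mass * ΔT for the temperature rise, again assuming the whole battery heats like 68 kg of lead and nothing is lost to the surroundings:

```python
# Q = cp * mass * dT, solved for dT.
heat_energy_kj = 360.0          # ~100 W for one hour
cp_lead_kj_per_kg_c = 0.13      # specific heat of lead
battery_mass_kg = 68.0          # 150 lb treated as all lead

delta_t_c = heat_energy_kj / (cp_lead_kj_per_kg_c * battery_mass_kg)
print(f"Temperature rise: {delta_t_c:.1f} C")   # ~40.7 C
```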
I have checked this a couple of times because it seems way too high, but it says that if you charge a lead acid battery that has about 150 pounds of lead in it at 50 amps for one hour, and you do not lose any heat to cooling while you are charging, the battery temperature would rise a really astounding 41 degrees C or so.
This of course makes a bunch of assumptions that won't happen (like being able to charge at 50 amps for a full hour), but that is still only putting 42.5 amp hours of electrical energy back into the battery.
So... not exactly a real-world example, but I would keep battery heating in mind if you charge at a high current rate for a long time.