Another question for the mathematically minded. A 12V 95Ah AGM battery in good condition.
Using a 500W inverter and powering a 130W unit, will the unit run for 4 to 5 hours? Thanks.
I would think a 95Ah AGM battery in good condition and fully charged should survive a 130W load for 4 to 5 hours, but as Roger L has pointed out, we don't know the efficiency of the inverter, so we can't factor in its real power losses.
As the battery ages, its ability to perform will diminish.
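As a rough sanity check on the 4 to 5 hour figure, here is a sketch in Python that ignores inverter losses entirely (the 130W load and 95Ah / 12V battery are the numbers from the question):

```python
# Quick check: amp-hours drawn at a steady 130 W from a 12 V battery,
# ignoring inverter losses (which will make the real figures worse).
load_w = 130.0
nominal_v = 12.0
capacity_ah = 95.0

current_a = load_w / nominal_v          # battery-side current, amps
for hours in (4, 5):
    used_ah = current_a * hours         # amp-hours drawn in that time
    print(f"{hours} h -> {used_ah:.1f} Ah ({used_ah / capacity_ah:.0%} of capacity)")
```

So 4 to 5 hours uses roughly half the battery's rated capacity before losses are counted, which is why "should survive, but it depends on the inverter" is about as firm as the answer can be.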
Just a note about inverters and transformers: they are known as constant-power converters. This means:
Power In (Watts) = Power Out (Watts)
It's also useful to know that Watts are the product of Voltage x Current (Amps)
For example, your 130W consumption translates to:
230V x 0.56A = 130W
12V x 10.8A = 130W
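The same arithmetic, done for both sides of the inverter:

```python
# Watts = Volts x Amps: the same 130 W load seen on the 230 V mains
# side and on the 12 V battery side of the inverter.
power_w = 130.0
for volts in (230.0, 12.0):
    amps = power_w / volts
    print(f"{volts:g} V x {amps:.2f} A = {volts * amps:.0f} W")
```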
If they were 100% efficient that would be perfectly true, but inverters and transformers do incur some losses, usually as heat, so the true relationship is:
Power In (Watts) = Power Out (Watts) + Power lost in conversion (Watts)
Unfortunately we do not know the power loss in Buckman's inverter, and as expressed in previous comments that can vary depending on the type of inverter he has and the work it is doing. However, it will always increase the consumption from the supply.
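To put some illustrative numbers on that, here is a sketch of what the battery has to supply at a few assumed efficiency figures (the 85% and 90% values are only examples, not Buckman's actual inverter):

```python
# The battery must supply the load power divided by the inverter's
# efficiency; the shortfall is dissipated as heat in the inverter.
# The efficiency figures below are assumptions for illustration only.
load_w = 130.0
for efficiency in (1.0, 0.90, 0.85):
    battery_w = load_w / efficiency
    print(f"eta = {efficiency:.0%}: battery supplies {battery_w:.0f} W "
          f"({battery_w - load_w:.0f} W lost as heat)")
```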
All types of batteries exhibit some sort of discharge voltage curve. As they are discharged, the terminal voltage they produce also falls. Unfortunately this is not a linear relationship, and when a battery is considered fully discharged it will still have some residual terminal voltage. Each battery type and chemistry does have some "typical" values that are well documented.
For example, a 12V AGM battery exhibits terminal voltages as shown here:
Voltage | Capacity |
--------|----------|
12.85V | 100% (resting) |
12.80V | 99% |
12.75V | 90% |
12.50V | 80% |
12.30V | 70% |
12.15V | 60% |
12.05V | 50% |
11.95V | 40% |
11.81V | 30% |
11.66V | 20% |
11.51V | 10% |
10.50V | 0% |
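The table above can be used as a lookup. Here is a sketch that linearly interpolates state of charge from a measured resting voltage (a rough approximation only; real curves vary with temperature and with the particular battery):

```python
# The AGM resting-voltage table as (volts, % capacity) pairs, sorted
# ascending, with linear interpolation between rows.
TABLE = [
    (10.50, 0), (11.51, 10), (11.66, 20), (11.81, 30), (11.95, 40),
    (12.05, 50), (12.15, 60), (12.30, 70), (12.50, 80), (12.75, 90),
    (12.80, 99), (12.85, 100),
]

def soc_percent(volts: float) -> float:
    """Interpolate capacity (%) from a resting terminal voltage."""
    if volts <= TABLE[0][0]:
        return 0.0
    if volts >= TABLE[-1][0]:
        return 100.0
    for (v0, c0), (v1, c1) in zip(TABLE, TABLE[1:]):
        if v0 <= volts <= v1:
            return c0 + (c1 - c0) * (volts - v0) / (v1 - v0)

print(round(soc_percent(12.40), 1))  # midway between the 70% and 80% rows
```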
It should be realised that as the battery voltage drops, the inverter has to draw more current to compensate, since it maintains the 130W output power.
Consequently, the current the inverter pulls rises as the battery discharges:
100% charged: 130W = 12.85V x 10.1A
0% charged: 130W = 10.5V x 12.4A
Plus whatever the inverter's inefficiency requires to be covered.
As the battery discharges, the current it needs to supply creeps up, and as that happens the battery's ability to maintain that current also reduces. For example:
A 100Ah battery should be able to supply a current of 1A continuously for 100 hours, so the product of current x time represents Ah. But as the discharge current increases, the effective capacity shrinks: you would expect a 10A current to be maintained for 10 hours, yet in practice it will be slightly shorter, and a 100A current is likely to last significantly less than 1 hour. The rate at which the discharge current affects the discharge time depends on the design of the battery and how old it is.
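This shrinking-runtime effect is commonly modelled with Peukert's equation. A sketch, assuming a rated 20-hour discharge and a Peukert exponent of 1.15 (both illustrative values; the real figures depend on the specific battery):

```python
# Peukert's equation: t = H * (C / (I * H)) ** k, where C is the rated
# capacity in Ah at the rated discharge time H (hours), I is the actual
# discharge current (A), and k is the Peukert exponent. H = 20 and
# k = 1.15 are assumed here for illustration only.
def runtime_hours(capacity_ah: float, current_a: float,
                  rated_hours: float = 20.0, k: float = 1.15) -> float:
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

for amps in (1, 10, 100):
    print(f"{amps:>3} A -> {runtime_hours(100.0, amps):.1f} h")
```

With those assumed constants, the 10A discharge comes out just under 10 hours and the 100A discharge well under 1 hour, matching the pattern described above.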