freeradical wrote:
[quote=AllGold]
I could be wrong, but I think the VA (Volt-Amp) setting gives you a little more "resolution" than the Watts setting.
Not exactly. Watts refers to real power, while volt-amps refers to apparent power. In a DC circuit, you simply multiply voltage times current to get watts, since both are constant. In an AC circuit, the voltage and current are constantly changing, so you multiply the instantaneous voltage times the instantaneous current, and then average that product over a full cycle to get real power. Apparent power is just the rms voltage times the rms current.
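To make that concrete, here's a quick numerical sketch (the 120 V rms, 1 A rms, and 60-degree phase lag are made-up example values, not from any particular device). Averaging v(t)*i(t) over one cycle gives real power in watts, while rms voltage times rms current gives apparent power in volt-amps:

```python
import math

# Made-up example: 120 V rms, 1 A rms, current lagging voltage by 60 degrees.
samples = 10000
v_rms, i_rms = 120.0, 1.0
phase = math.radians(60)

real_power = 0.0
for n in range(samples):
    t = 2 * math.pi * n / samples                      # step through one full cycle
    v = v_rms * math.sqrt(2) * math.sin(t)             # instantaneous voltage
    i = i_rms * math.sqrt(2) * math.sin(t - phase)     # instantaneous current
    real_power += v * i
real_power /= samples                                  # average of v(t)*i(t) -> watts

apparent_power = v_rms * i_rms                         # volt-amps

print(round(real_power, 1))    # 60.0 W
print(apparent_power)          # 120.0 VA
```

Note the real power comes out to half the apparent power here, because cos(60°) = 0.5 — that ratio is the power factor.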
In practical terms, both numbers matter, but the volt-ampere specification is more important when it comes to safety, and the watts specification is more important when it comes to figuring out your electric bill. This is because you buy energy from your electric company by the watt (kWh, actually). The volt-ampere specification tells you what gauge of wiring you need to safely run some equipment. Since we get AC power at 120 volts rms, you simply divide the volt-ampere rating by 120 to get the current. That maximum current is what you need in order to safely choose a wire gauge.
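That division is trivial, but here it is spelled out (the 1800 VA rating is a made-up example, roughly what a typical 15 A household circuit can supply):

```python
# Made-up example: an 1800 VA load on a 120 V rms circuit.
va_rating = 1800.0
v_rms = 120.0

max_current = va_rating / v_rms  # amperes the wiring must safely carry
print(max_current)               # 15.0 -> wire/breaker rated for at least 15 A
```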
The reason I qualified it so much is that I wasn't going on actual knowledge of how that stuff works; I was basing it on observation. I noticed on some low-current devices I was testing that 3 VA equaled about 1 watt. That doesn't hold for higher-current devices, so maybe it was all down to the inaccuracy of the Kill A Watt at low current.