
Why is the rating of transformers given in kVA and not in kW?


kVA is the unit of apparent power. Apparent power is made up of active and reactive power. Active power is the share of the apparent power that transmits energy from the source (generator) to the user. Reactive power is the share that represents a useless oscillation of energy between the source and the user. It occurs when, on account of some inertia in the system, there is a phase shift between voltage and current, meaning that the current does not change polarity synchronously with the voltage. But the heat generated in a winding, as well as the eddy current losses generated in a transformer core, depend only on the current, regardless of whether it is in phase with the voltage or not. The heat is therefore always proportional to the square of the current amplitude, irrespective of the phase angle (the shift between voltage and current). So a transformer has to be rated, and selected, by apparent power.

It is often helpful to think of an extreme example: imagine a case where the one and only load is a static var compensator (such cases do exist). Would the load then be zero because the active power is zero? Most certainly not. Caution: in this situation the voltage across the output terminals will rise with load rather than drop!
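As a rough illustration (all numbers below are hypothetical), the following sketch compares loads that draw the same current at different power factors from the same transformer: the apparent power and the winding heat are identical in every case, while the active power, which is all a kW rating would describe, varies widely.

```python
# Minimal sketch with hypothetical values: three loads draw the same current from
# a small transformer at different power factors. The I^2*R winding heat is the
# same for all of them, even though the active power (kW) differs widely.

V = 230.0          # secondary voltage in volts (assumed)
I = 43.5           # load current in amperes (assumed, roughly 10 kVA at 230 V)
R_winding = 0.05   # winding resistance in ohms (assumed)

for power_factor in (1.0, 0.7, 0.0):        # 0.0 ~ purely reactive load, e.g. a var compensator
    S = V * I                               # apparent power in VA: what the transformer "sees"
    P = V * I * power_factor                # active power in W: what a kW rating would capture
    copper_heat = I**2 * R_winding          # winding loss in W: depends on the current only
    print(f"pf={power_factor:0.1f}  S={S/1000:5.2f} kVA  P={P/1000:5.2f} kW  "
          f"copper loss={copper_heat:5.1f} W")
```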

Supplement: Special care has to be taken if the load current of a transformer contains higher frequencies such as harmonics. The transformer may then overheat even though the TRMS load current, correctly measured with a TRMS meter, does not exceed the current rating. Why is this? It is because the copper loss includes a share of about 5% to 10% of so-called supplementary losses. These arise from eddy currents in mechanical, electrically conductive parts made of ferromagnetic materials and especially in the low-voltage windings with their large conductor cross sections.

The stray magnetic fields originating from the imperfect magnetic coupling between the HV and LV windings (the main leakage channel) induce what could be called an eddy voltage inside the conductors, which drives an eddy current circulating across the conductor, perpendicular to the main load current. The amplitude of this eddy voltage is proportional to the rate of change of the magnetic field strength, and that rate of change is proportional to both the amplitude and the frequency of the load current. Since the eddy current is limited only by Ohm's law, it increases in proportion to the load current and to the operating frequency. The supplementary power loss caused by the eddy current is the product of eddy current and eddy voltage. Hence the supplementary losses increase with the square of the load current, which excites the stray field, and with the square of the frequency, while the main copper loss increases only with the square of the load current amplitude. Therefore the transformer runs hotter when the load current has the same amplitude but contains superimposed components at frequencies above the rated frequency.
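The sketch below works this model out for an arbitrary, hypothetical harmonic spectrum: for each harmonic of order h with RMS current I_h, the main copper loss is taken as proportional to I_h squared and the supplementary loss as proportional to (h x I_h) squared, with the supplementary loss at rated sinusoidal current assumed to be 10% of the main copper loss as stated above. Two load currents with practically the same TRMS value then produce clearly different amounts of heat.

```python
import math

# Simplified loss model from the text, applied to a hypothetical harmonic spectrum:
# per harmonic of order h with RMS current I_h (in per unit of rated current),
#   main copper loss        ~ I_h**2
#   supplementary eddy loss ~ (h * I_h)**2   (square of amplitude AND of frequency)
# At rated sinusoidal current the supplementary loss is taken as 10 % of the main loss.

P_SUPP_RATED = 0.10   # supplementary loss as a fraction of main copper loss (from the text)

def relative_copper_loss(spectrum):
    """spectrum: dict {harmonic order: RMS current in per unit of rated current}."""
    i_rms_sq = sum(i**2 for i in spectrum.values())
    main = i_rms_sq                                                     # ~ sum of I_h^2
    supp = P_SUPP_RATED * sum((h * i)**2 for h, i in spectrum.items())  # ~ sum of (h*I_h)^2
    return main + supp, math.sqrt(i_rms_sq)

# Purely sinusoidal rated current:
sine_loss, sine_rms = relative_copper_loss({1: 1.0})

# Distorted current with practically the same TRMS value (hypothetical 5th/7th harmonics):
dist_loss, dist_rms = relative_copper_loss({1: 0.90, 5: 0.38, 7: 0.22})

print(f"sinusoidal: TRMS={sine_rms:0.2f} pu, relative copper loss={sine_loss:0.2f}")
print(f"distorted:  TRMS={dist_rms:0.2f} pu, relative copper loss={dist_loss:0.2f}")
```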

This additional heat loss is difficult to quantify, especially as the transformer's stray reactance limits the passage of higher-frequency currents to some extent, but in an extreme case it may drive the supplementary loss up from 10% to 80% of the copper loss. This means that the transformer may run some 70% hotter (in terms of temperature rise above ambient) than specified for rated (sinusoidal) current. Since the ohmic heat loss depends on the square of the current, however, it is enough to limit the load current to some 65% of its rating to avoid overheating.
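As a quick check of the "some 70% hotter" figure, the extreme case above can be worked out as follows (assuming, as a simplification, that the temperature rise is roughly proportional to the total copper loss):

```python
# Loss figures relative to the main copper loss at rated sinusoidal current
# (values taken from the text; proportionality of temperature rise to loss is an assumption).
main_loss = 1.00
supp_rated = 0.10     # supplementary loss under rated sinusoidal conditions
supp_extreme = 0.80   # supplementary loss in the extreme, harmonic-rich case

loss_rated = main_loss + supp_rated       # total relative loss the rating is based on (1.10)
loss_extreme = main_loss + supp_extreme   # total relative loss with heavy distortion (1.80)

extra_heat = loss_extreme / loss_rated - 1.0
print(f"additional temperature rise: about {extra_heat:.0%}")  # ~64 %, i.e. "some 70 % hotter"
```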
