Ohm’s law gives a relation between the voltage, current, and resistance of a circuit. Namely, V = IR, or
Voltage = Current x Resistance
Note that this equation states that, for a fixed resistance, when voltage goes up, current goes up.
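For concreteness, Ohm’s law can be checked with a few lines of Python (the function name and the numbers here are illustrative, not from the text):

```python
def current(voltage_volts, resistance_ohms):
    """Current (in amperes) through a resistor, per Ohm's law: I = V / R."""
    return voltage_volts / resistance_ohms

# Doubling the voltage across a fixed 2-ohm resistor doubles the current:
print(current(10, 2))  # 5.0 amperes
print(current(20, 2))  # 10.0 amperes
```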
A transformer is used to change the voltage of an AC circuit – a 1:10 transformer increases the voltage to 10x its old value, while a 10:1 transformer decreases it to 1/10th its old value.
Real-life transformers consist of two coils of wire wound around a shared iron core.
Transformers must follow the rule

Voltage_old x Current_old = Voltage_new x Current_new (Power Conservation)

Thus as Voltage_new goes up (by, say, replacing the 1:10 transformer with a 1:11), Current_new goes down.
One equation says that current goes up when voltage goes up, the other says that current goes down. Does this mean that the two equations are incompatible and one must be thrown away when working with transformers? No! The reason is that Ohm’s law applies to single circuits, while the other equation applies to how two separate circuits are related by a transformer.
To see this more clearly, imagine the following thought experiment: we connect a 1 volt AC generator to a 1 ohm resistor and measure the current. By Ohm’s Law, we should get 1 ampere of current.*
Now imagine we insert a 1:10 transformer into the circuit, splitting our one circuit into two electrically-separate circuits. The confusion arises from the following question: does the current through the resistor go up because the voltage went up, or down because the transformer needs to conserve power?
Treating the transformer as a 10-volt AC voltage source in the right circuit, we use Ohm’s law to see that the current through the resistor has gone up – it is now 10 amps. In order to preserve power, this means that in the left circuit our original AC power source is now drawing 100 amps of current, 100x what it was drawing before.
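The arithmetic of the thought experiment, step by step (variable names are mine; the numbers are the ones from the text):

```python
# Thought experiment: 1 V AC source, 1:10 step-up transformer, 1-ohm resistor.
n = 10           # turns ratio of the 1:10 transformer
v_source = 1.0   # volts, left (primary) circuit
r_load = 1.0     # ohms, right (secondary) circuit

v_secondary = v_source * n          # 10 V across the resistor
i_secondary = v_secondary / r_load  # Ohm's law in the right circuit: 10 A
p = v_secondary * i_secondary       # 100 W delivered to the resistor
i_primary = p / v_source            # power conservation: source supplies 100 A
```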
This is why high voltage power lines are still dangerous. It’s not that they would cause less current to flow through your body than low-voltage power lines; by Ohm’s law, a higher voltage across your body drives a larger current through it. The current through your body would merely be less than the (enormous) current drawn at the lower-voltage end of the power plant’s transformer.
The most common answer to this question is "No, because transformers are non-ohmic devices." This is baloney – a transformer is nothing more than a pair of inductors, which are perfectly ohmic (i.e., they obey Ohm’s law). For our purposes, an inductor can be thought of as a resistor that only resists AC current (this sort of “resistance” is called impedance). This is why the left circuit (above diagram) still obeys Ohm’s law: the transformer acts somewhat like a resistor (due to its impedance) whose resistance goes up as the current demand of the right circuit goes down.
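One way to make the “transformer acts like a resistor” picture quantitative is the standard reflected-impedance formula for an ideal transformer: a load R on the secondary of a 1:n transformer looks like R / n² from the primary side. (This formula is standard circuit theory, not stated in the text; the function name is mine.)

```python
def reflected_impedance(r_load, n):
    """Apparent resistance, seen from the primary, of a load r_load
    on the secondary of an ideal 1:n transformer."""
    return r_load / n**2

# The 1-ohm resistor behind a 1:10 step-up looks like 0.01 ohms to the source,
z = reflected_impedance(1.0, 10)
# so Ohm's law in the left circuit reproduces the 100 A we computed earlier:
i_primary = 1.0 / z
```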
- (Un)Common Questions about Electricity
- Why can the resistor go at the beginning of the circuit OR at the end?
- What’s the difference between “voltage at a point” and “voltage between two points?” Also, what is ‘ground?’
- How is it possible that there’s not a complete circuit to the power plant?
- Why do we use Alternating Current (AC) instead of Direct Current (DC) in power lines?