/oʊm/

noun … “Unit of electrical resistance.”

The ohm is the standard unit used to quantify resistance in an electrical circuit. One ohm (Ω) is defined as the resistance through which a current of one ampere flows when a potential difference of one volt is applied across it, according to Ohm’s law (V = I × R).
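A minimal sketch of that definition, using illustrative values of exactly one volt and one ohm:

-- Definition check: one volt across one ohm drives one ampere
voltage = 1          -- volts
resistance = 1       -- ohms
current = voltage / resistance   -- I = V / R
print(current)       -- 1 A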

Key characteristics of the ohm include:

  • Unit symbol: Ω.
  • Relation to Ohm’s law: R = V / I.
  • Material dependence: resistance in ohms varies with the conductor’s material (resistivity), length, and cross-sectional area (see the sketch after this list).
  • Temperature effect: resistance measured in ohms changes with temperature, typically rising for metallic conductors.
  • Applications: specifying resistors, calculating currents and voltages, and designing circuits.
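The material dependence follows the standard relation R = ρL/A (resistivity times length divided by cross-sectional area). A minimal sketch, assuming a copper wire with a typical resistivity of about 1.68 × 10⁻⁸ Ω·m and illustrative dimensions:

-- Resistance of a wire: R = rho * L / A (illustrative values)
resistivity = 1.68e-8        -- Ω·m, approximate value for copper
length = 10                  -- metres of wire
diameter = 0.001             -- metres (1 mm)
area = math.pi * (diameter / 2)^2    -- cross-sectional area in m²

resistance = resistivity * length / area
print(resistance)            -- roughly 0.21 Ω

-- Temperature effect: R_T = R0 * (1 + alpha * (T - T0))
alpha = 0.00393              -- per °C, approximate for copper
delta_t = 50                 -- °C above the reference temperature
print(resistance * (1 + alpha * delta_t))   -- roughly 0.26 Ω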

Workflow example: Calculating resistance:

voltage = 12    -- volts
current = 0.02   -- amperes
resistance = voltage / current
print(resistance)  -- 600 Ω

Here, a 12 V source driving 0.02 A corresponds to a resistance of 600 ohms.
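The same law can be rearranged to find current or voltage when the resistance is known; a short follow-up sketch using the values above:

-- Rearranging Ohm's law for current and voltage
resistance = 600                 -- ohms, from the calculation above
voltage = 12                     -- volts
current = voltage / resistance   -- I = V / R
print(current)                   -- 0.02 A
print(current * resistance)      -- V = I * R, gives back 12 V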

Conceptually, an ohm is like a measure of friction in a water pipe: it quantifies how strongly a material resists the flow of electric charge.

See Resistance, Current, Voltage, Power, Electricity.