Ohm's law is a fundamental principle in electrical engineering and physics that describes the relationship between voltage, current, and resistance in an electrical circuit. It is named after the German physicist Georg Simon Ohm, who first formulated the law in the early 19th century.
Ohm's law states that the current passing through a conductor between two points is directly proportional to the voltage across the two points, given that the temperature and other physical conditions remain constant. In mathematical form, Ohm's law is expressed as:
V = I * R
where:
V is the voltage across the conductor in volts (V).
I is the current flowing through the conductor in amperes (A).
R is the resistance of the conductor in ohms (Ω).
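Since the formula can be solved for any one of the three quantities, it can be sketched as three small helper functions. This is a minimal illustration of the algebra, not part of any particular library; the function and parameter names are my own.

```python
def voltage(current_a: float, resistance_ohm: float) -> float:
    """Ohm's law solved for voltage: V = I * R (volts)."""
    return current_a * resistance_ohm

def current(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's law solved for current: I = V / R (amperes)."""
    return voltage_v / resistance_ohm

def resistance(voltage_v: float, current_a: float) -> float:
    """Ohm's law solved for resistance: R = V / I (ohms)."""
    return voltage_v / current_a

# Example: a 9 V source across a 330 ohm resistor
print(current(9.0, 330.0))
```

For the example values, the current works out to 9 / 330 ≈ 0.027 A, i.e. about 27 mA.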
This relationship implies that if you increase the voltage (V) across a resistor or a conductor, the current (I) through it will also increase, provided the resistance (R) remains the same. Conversely, reducing the voltage will decrease the current as long as the resistance remains constant.
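The proportionality described above can be checked numerically: holding R fixed and sweeping the voltage, the computed current scales by exactly the same factor. The resistance value below is an arbitrary illustration.

```python
R = 100.0  # resistance held constant, in ohms (arbitrary example value)

for v in (5.0, 10.0, 20.0):
    i = v / R  # Ohm's law: I = V / R
    print(f"V = {v:5.1f} V -> I = {i:.3f} A")
```

Doubling V from 5 V to 10 V doubles I from 0.050 A to 0.100 A, as the proportionality predicts.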
Ohm's law is an essential principle in the analysis and design of electrical circuits and is the basis for many practical applications in electronics and electrical engineering. In its basic form it describes resistive components such as resistors; in AC circuit analysis it generalizes, via the concept of impedance, to inductors and capacitors as well. It plays a crucial role in determining circuit performance and safety.