In 1827, German physicist Georg Ohm published a groundbreaking paper describing the relationship between voltage, current, and resistance in electrical circuits, a relationship that would come to be known as Ohm’s law. This simple yet powerful equation has since become one of the most fundamental principles of electrical physics and has countless applications in the real world.
Ohm’s law states that the current flowing through a conductor is directly proportional to the voltage applied across it. In other words, doubling the voltage will double the current. This relationship is represented by the equation I = V/R, where I is current (measured in amperes), V is voltage (measured in volts), and R is resistance (measured in ohms). The equation can be rearranged to solve for any of the three quantities: current equals voltage divided by resistance (I = V/R), resistance equals voltage divided by current (R = V/I), and voltage equals current times resistance (V = I×R). A closely related formula gives electrical power P (measured in watts) as the product of current and voltage: P = I×V.
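As a quick illustration, here is a short Python sketch that applies these formulas. The specific values (a 9 V supply across a 450 Ω resistor) are assumed purely for the example:

```python
# Assumed example values: a 9 V supply across a 450-ohm resistor.
voltage = 9.0       # V, in volts
resistance = 450.0  # R, in ohms

current = voltage / resistance  # I = V / R  -> 0.02 A (20 mA)
power = current * voltage       # P = I * V  -> 0.18 W

print(f"Current: {current:.3f} A")
print(f"Power:   {power:.3f} W")
```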
While Ohm’s law may seem like a simple concept, it can be used to solve complex problems involving electrical circuits. For example, if you know any two of the three values (voltage, current, or resistance), you can use Ohm’s law to calculate the third with basic algebra, as in the sketch below. Additionally, this principle can be used to design circuit components such as resistors with specific resistance values needed for a given application.
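A minimal Python sketch of that "solve for the missing value" idea might look like the following. The solve_ohms_law helper is a hypothetical name introduced here for illustration, not part of any library:

```python
def solve_ohms_law(voltage=None, current=None, resistance=None):
    """Given any two of voltage (V), current (A), and resistance (ohms),
    return all three values computed via Ohm's law."""
    provided = sum(v is not None for v in (voltage, current, resistance))
    if provided != 2:
        raise ValueError("Provide exactly two of voltage, current, resistance")

    if voltage is None:
        voltage = current * resistance    # V = I * R
    elif current is None:
        current = voltage / resistance    # I = V / R
    else:
        resistance = voltage / current    # R = V / I

    return {"voltage": voltage, "current": current, "resistance": resistance}

# Example: what resistance drops 5 V at 10 mA?  -> 500 ohms
print(solve_ohms_law(voltage=5.0, current=0.010))
```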