In voltage measurement, a **potentiometer is preferred to a voltmeter** primarily because it **draws negligible current from the circuit under test**. Let me explain this in simpler terms:
A **voltmeter** is an instrument used to measure the potential difference (voltage) across two points in an electrical circuit. However, a voltmeter must draw a small current from the circuit to make its measurement, which slightly alters the very voltage being measured. This is known as the **loading effect**, and it is particularly problematic in high-resistance circuits, where even a small current draw can significantly change the reading.
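The loading effect is easy to see numerically. Here is a minimal sketch (the component values are illustrative assumptions, not from the text): a voltmeter with a finite input resistance is placed across one arm of a high-resistance voltage divider, and its parallel path pulls the measured voltage below the true value.

```python
# Sketch of voltmeter "loading" error (illustrative values, assumed).
# A source V_s drives a divider R1-R2; a voltmeter with input resistance
# R_v across R2 forms a parallel path that lowers the reading.

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

V_s = 10.0    # source voltage, volts (assumed)
R1 = 1e6      # 1 MOhm series resistance (high-resistance circuit)
R2 = 1e6      # 1 MOhm measured branch
R_v = 10e6    # 10 MOhm voltmeter input resistance (typical of a DMM)

true_v = V_s * R2 / (R1 + R2)            # divider voltage with no meter
loaded = parallel(R2, R_v)               # R2 shunted by the voltmeter
measured = V_s * loaded / (R1 + loaded)  # lower, because of loading
error_pct = 100 * (true_v - measured) / true_v

print(f"true: {true_v:.3f} V, measured: {measured:.3f} V, error: {error_pct:.1f}%")
```

With these assumed values the true divider voltage is 5.000 V but the meter reads about 4.762 V, an error of nearly 5% caused purely by the measurement itself.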
On the other hand, a **potentiometer** is a device designed to measure voltage by comparing it with a known reference voltage. It is adjusted to a balance point at which no current flows through the detecting galvanometer, so at the moment of measurement it draws no current from the circuit under test and does not disturb it. This makes it an effectively non-invasive method of measuring voltage, which is particularly valuable for precise measurements in sensitive, high-resistance circuits.
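The balance principle can be expressed with one proportion. In a classic slide-wire potentiometer, a standard cell of known EMF balances at one wire length and the unknown EMF at another; since the wire voltage drop is proportional to length, the two balance lengths give the unknown directly. The numbers below are an illustrative sketch, not values from the text:

```python
# Sketch of the slide-wire potentiometer balance principle (assumed setup).
# A standard cell E_std balances at wire length l_std; the unknown EMF
# balances at l_x. At each balance point the galvanometer reads zero,
# so no current is drawn from either source.

def unknown_emf(e_std, l_std, l_x):
    """At the two balance points, E_x / E_std = l_x / l_std."""
    return e_std * l_x / l_std

E_std = 1.018  # Weston standard cell EMF, volts
l_std = 50.9   # balance length for the standard cell, cm (illustrative)
l_x = 75.0     # balance length for the unknown EMF, cm (illustrative)

print(f"unknown EMF = {unknown_emf(E_std, l_std, l_x):.3f} V")
```

Here the wire is effectively calibrated at 0.02 V/cm by the standard cell, so a 75.0 cm balance length corresponds to an unknown EMF of 1.500 V, obtained without ever drawing current from that source.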
Here’s briefly why the other options listed are less relevant:
- A **wider range** is not a definitive advantage of potentiometers over voltmeters; range depends on the design of the specific instrument, not on the measurement principle.
- **Bulkiness** is not a factor that inherently makes a potentiometer preferable; in fact, it can sometimes be a disadvantage due to space constraints.
- **Faster response** is not an advantage of potentiometers; if anything, a manual potentiometer is slower than a voltmeter, since it must be adjusted to balance before a reading can be taken.
Therefore, the key advantage of the potentiometer is its **ability to measure voltage without altering the circuit**, which follows from the fact that it draws no current at balance. This **ensures more accurate and reliable measurements**, especially in high-resistance or sensitive circuits.