Voltmeter sensitivity, in the context of measurement and instrumentation, refers to a voltmeter's ability to detect and measure small changes in voltage accurately. It is an important characteristic of the instrument. For voltmeters with a graduated display it is usually expressed in volts per division (V/div) or millivolts per division (mV/div); for analog (moving-coil) voltmeters it is also commonly quoted in ohms per volt (Ω/V), a figure that sets the meter's input resistance on each range.
The sensitivity of a voltmeter is determined by its internal circuitry and components. A more sensitive voltmeter can resolve smaller voltage changes, making it suitable for measuring low-amplitude signals, while a less sensitive one is better suited to higher-amplitude signals.
For example, if you have a voltmeter with a sensitivity of 1 mV/div, each vertical division on the voltmeter's display represents a change of 1 millivolt in the measured voltage. Similarly, a voltmeter with a sensitivity of 0.1 V/div would mean that each vertical division represents a change of 0.1 volts in the measured voltage.
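The division-to-voltage arithmetic above can be sketched in a few lines. This is only an illustrative helper, not any particular instrument's API; the function name and values are assumptions:

```python
# Minimal sketch: converting a deflection in display divisions to voltage,
# given the instrument's sensitivity setting. Names are illustrative.

def reading_to_volts(divisions: float, sensitivity_v_per_div: float) -> float:
    """Convert a deflection in display divisions to volts."""
    return divisions * sensitivity_v_per_div

# At 1 mV/div, a deflection of 3.5 divisions corresponds to 3.5 mV:
v_low = reading_to_volts(3.5, 0.001)   # 0.0035 V
# At 0.1 V/div, the same deflection corresponds to 0.35 V:
v_high = reading_to_volts(3.5, 0.1)    # ≈ 0.35 V
```

The same deflection thus maps to very different voltages depending on the sensitivity setting, which is why the setting must be known before a reading means anything.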
When selecting a voltmeter for a particular measurement task, it's important to choose a sensitivity that matches the expected voltage range of the signal you'll be measuring, so that the signal spans a useful fraction of the display: with too little sensitivity, small changes disappear within a single division, while with too much, the reading runs off scale.
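The matching step can be sketched as picking the most sensitive setting whose full-scale span still covers the expected signal. The available settings and the division count here are assumptions for illustration only:

```python
# Hypothetical range-selection helper. The sensitivity settings and the
# 10-division display are assumed values, not a specific instrument's spec.

RANGES_V_PER_DIV = [0.001, 0.01, 0.1, 1.0]  # most to least sensitive
DIVISIONS = 10                               # assumed vertical divisions

def best_range(expected_peak_v: float) -> float:
    """Return the most sensitive setting that keeps the signal on scale."""
    for sensitivity in RANGES_V_PER_DIV:
        if expected_peak_v <= sensitivity * DIVISIONS:
            return sensitivity
    raise ValueError("signal exceeds the instrument's least sensitive range")

best_range(0.25)  # 0.1 V/div: full scale 1 V comfortably covers 0.25 V
```

A 0.25 V signal overloads the 1 mV/div and 10 mV/div settings (full scales of 10 mV and 100 mV), so the helper lands on 0.1 V/div.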
Sensitivity is not the only consideration when choosing a measurement instrument. Factors like accuracy, precision, input impedance, and the type of signal (AC or DC) also play important roles in determining whether a voltmeter suits a given application.
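These factors are not independent. For analog voltmeters, the classic ohms-per-volt sensitivity figure fixes the input resistance on each range, which in turn determines how much the meter loads the circuit under test. A rough sketch, with all values assumed for illustration:

```python
# Sketch of meter loading for an analog voltmeter, assuming the classic
# ohms-per-volt sensitivity convention. All numeric values are illustrative.

def input_resistance(sensitivity_ohms_per_volt: float, full_scale_v: float) -> float:
    """Meter input resistance on a given range: sensitivity x full-scale voltage."""
    return sensitivity_ohms_per_volt * full_scale_v

def loaded_reading(true_v: float, source_r: float, meter_r: float) -> float:
    """Voltage the meter actually indicates once it loads the source
    (voltage-divider formed by the source and meter resistances)."""
    return true_v * meter_r / (source_r + meter_r)

r_meter = input_resistance(20_000, 10)        # 20 kΩ/V meter, 10 V range: 200 kΩ
reading = loaded_reading(5.0, 100_000, r_meter)  # 5 V source behind 100 kΩ
# The meter indicates roughly 3.3 V instead of 5 V: a large loading error
# on this high-impedance source, despite adequate sensitivity.
```

This is why a meter's sensitivity and its input impedance have to be considered together: a setting sensitive enough to resolve the signal can still give a badly wrong reading if the meter loads the circuit.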