Most handheld and benchtop digital multimeters display measurements digitally. The number of digits on the screen usually indicates the measurement resolution of the meter. Resolution is the smallest detail a digital multimeter can quantitatively determine when making a measurement, so the more digits a multimeter displays, the higher its resolution. Handheld digital multimeters typically display three and a half or four and a half digits; benchtop digital multimeters typically display five and a half, six and a half, seven and a half, or even eight and a half digits.
A detailed look at the number of digits, accuracy and resolution of digital multimeters
What are the digits, accuracy and resolution of a digital multimeter?
The digits, accuracy, and resolution of a digital multimeter (DMM) are its core performance parameters, and they directly determine the accuracy of the measurement results and the range of applications. The following is a detailed explanation:
1. Digits
Definition: The number of digits indicates how many digits the multimeter can display, usually expressed with a "½ digit" (e.g., 3½ digits, 4½ digits).
½ digit: the most significant digit can only display 0 or 1 (for example, a 3½-digit multimeter has a maximum display value of 1999).
Full digits: every digit can display 0–9 (for example, a 4-digit multimeter has a maximum display value of 9999).
Significance:
The more digits, the wider the display range and the higher the resolution (e.g., 4½ digits is finer than 3½ digits).
Common digit grades:
3½ digits: basic models, suitable for everyday maintenance (display range 0–1999).
4½ digits: mid- to high-end models, suitable for laboratory use (display range 0–19999).
6½ digits and above: high-precision scientific research or calibration use.
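To make the digit counts concrete, the maximum display count of an "N½-digit" meter can be worked out as 2 × 10^N − 1, since the leading half digit shows only 0 or 1 in front of N full digits. The short Python sketch below is illustrative only; the function name is made up for this example, not a standard API:

```python
# Illustrative sketch (not from the article): maximum display count of an
# "N-and-a-half digit" meter, where the extra half digit can only be 0 or 1
# in front of N full digits.

def max_count_half_digit(full_digits: int) -> int:
    """Largest value an N½-digit display can show, e.g. 3½ digits -> 1999."""
    return 2 * 10**full_digits - 1

if __name__ == "__main__":
    for n in (3, 4, 5, 6):
        print(f"{n}½ digits -> max display {max_count_half_digit(n)}")
```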
2. Accuracy
Definition: Accuracy indicates how close the measurement result is to the true value. It is usually expressed as a percentage error plus a fixed error term, e.g., ±(1% + 3d).
Percentage error: depends on the range (for example, on the 2 V range, ±1% corresponds to ±0.02 V).
Fixed error (d): expressed in multiples of the minimum resolution (for example, 3d means ±3 least-significant display counts).
Typical example: if the multimeter's accuracy on the 2 V range is ±(1% + 3d) and its resolution is 1 mV:
For a reading of 1.500 V, the error is 1.500 V × 1% + 3 × 0.001 V = 0.015 V + 0.003 V = ±0.018 V.
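The same worst-case calculation can be written as a short Python sketch; the function and its parameters are illustrative assumptions rather than any instrument's documented formula:

```python
# Illustrative sketch (function name and arguments are assumptions, not a
# library API): worst-case error for a spec of the form ±(% of reading + d counts).

def worst_case_error(reading: float, percent: float, counts: int,
                     resolution: float) -> float:
    """Return the ± error bound, in the same unit as `reading`."""
    return reading * percent / 100.0 + counts * resolution

# Example from the text: 1.500 V reading, ±(1% + 3d) spec, 1 mV resolution.
error = worst_case_error(1.500, percent=1.0, counts=3, resolution=0.001)
print(f"±{error:.3f} V")  # prints ±0.018 V
```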
Significance:
The higher the accuracy, the more reliable the measurements, but also the higher the cost.
Calibration, temperature drift and component quality all affect accuracy.
3. Resolution
Definition: Resolution is the smallest input change the multimeter can distinguish, i.e., the smallest display unit (e.g., 1 mV).
It is determined jointly by the range and the number of digits: for example, a 4½-digit multimeter on the 2 V range has a resolution of 2000 mV / 19999 ≈ 0.1 mV.
Difference from accuracy:
High resolution does not mean high accuracy (for example, the display may resolve 0.1 mV, but the actual error may be much larger).
Resolution reflects "sensitivity"; accuracy reflects "trustworthiness".
Practical application:
In auto-range mode, the multimeter switches ranges automatically according to the input, which may affect the resolution (for example, resolution decreases when it switches to a higher range).
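As a rough illustration of the range/digits relationship above, the following Python sketch (an assumed model with illustrative values, not a manufacturer formula) reproduces the 0.1 mV figure and shows how switching to a higher range coarsens the resolution:

```python
# Illustrative sketch (assumed model): resolution as the full-scale range
# divided by the maximum display count for a given digit count.

def resolution(full_scale: float, full_digits: int) -> float:
    """Smallest display step on `full_scale` for an N½-digit meter."""
    max_count = 2 * 10**full_digits - 1  # e.g. 19999 for 4½ digits
    return full_scale / max_count

# 4½-digit meter on the 2 V range: 2 V / 19999 ≈ 0.1 mV
print(f"2 V range:  {resolution(2.0, 4) * 1000:.4f} mV per count")
# Auto-ranging up to the 20 V range lowers the resolution to about 1 mV
print(f"20 V range: {resolution(20.0, 4) * 1000:.3f} mV per count")
```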
Relationship between the three and recommendations for selection
Digits and resolution: the more digits, the higher the resolution (e.g., 4½ digits is finer than 3½ digits).
Accuracy and resolution: high resolution needs to be backed by high accuracy, otherwise the extra display digits are meaningless.
Electronic design/debugging: 4½ digits or more (e.g., for analyzing sensor signals).
Precision measurement: 6½ digits or more, combined with high accuracy (e.g., calibration instruments).
Summary table
Parameter | Definition | Example (3½-digit meter) | Impact
Digits | Number of digits displayed (including the ½ digit) | Maximum display of 1999 | Measurement range and the upper limit of resolution
Accuracy | Deviation of the measurement result from the true value | ±(0.5% + 2d) | Reliability of the results
Resolution | Smallest recognizable change | 1 mV (on the 2 V range) | Sensitivity and ability to capture detail
Selection needs to balance all three according to your needs: a high digit count and high accuracy suit professional scenarios, while basic applications can prioritize cost-effectiveness.