The adjustable DC power supply is a mainstay of the electrical and electronics laboratory. It is indispensable in the prototyping of electronic circuits and extremely useful when examining the operation of DC systems. Of equal importance is the handheld digital multimeter or DMM. This device is designed to measure voltage, current, and resistance at a minimum, although some units may offer the ability to measure other parameters such as capacitance or transistor beta. Along with general familiarity with the operation of these devices, it is very important to keep in mind that no measurement device is perfect; its relative accuracy, precision, and resolution must be taken into account. Accuracy refers to how far a measurement is from that parameter's true value. Precision refers to the repeatability of the measurement, that is, the sort of variance (if any) that occurs when a parameter is measured several times. For a measurement to be valid, it must be both accurate and repeatable. Related to these characteristics is resolution. Resolution refers to the smallest change in measurement that may be discerned. For digital measurement devices this is ultimately limited by the number of significant digits available to display.
A typical DMM offers 3 1/2 digits of resolution, the half-digit referring to a leading digit that is limited to zero or one. This is also known as a “2000 count display”, meaning that it can show a minimum of 0000 and a maximum of 1999. The decimal point is “floating” in that it could appear anywhere in the sequence. Thus, these 2000 counts could range from 0.000 volts up to 1.999 volts, or 00.00 volts to 19.99 volts, or 000.0 volts to 199.9 volts, and so forth. With this sort of limitation in mind, it is very important to set the DMM to the lowest range that won’t produce an overload in order to achieve the greatest accuracy.
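The relationship between the selected range and the finest displayable digit can be sketched as follows (a hypothetical illustration; the function name and the assumption of a 2000 count display are for demonstration only):

```python
# Sketch: resolution of a 3 1/2 digit (2000 count) DMM.
# The finest digit equals the selected full-scale range divided
# by the number of display counts.
def resolution(full_scale_volts, counts=2000):
    """Smallest displayable change on the selected range, in volts."""
    return full_scale_volts / counts

for fs in (2.0, 20.0, 200.0):
    print(f"{fs:>6} V range -> resolution {resolution(fs) * 1000:.0f} mV")
# 2 V range resolves 1 mV, 20 V range 10 mV, 200 V range 100 mV
```

This makes plain why the lowest non-overloading range is preferred: each step up in range makes the finest digit ten times coarser.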
A typical accuracy specification would be 1% of the reading plus two counts. "Reading" refers to the value displayed. If the 2 volt range were selected to read 1 volt (a measurement range of 0.000 to 1.999 for a 3 1/2 digit meter), 1% would be 10 millivolts (0.01 volts). To this a further uncertainty of two counts (i.e., the finest digit) must be added. In this example, the finest digit is one millivolt (0.001 volts), so this adds another 2 millivolts for a total of 12 millivolts of potential inaccuracy. In other words, the value displayed by the meter could be as much as 12 millivolts higher or lower than the true value. For the 20 volt range the inaccuracy would be computed in like manner, but notice that accuracy is lost because the lowest digit is larger (i.e., the "counts" represent a larger value). In this case, the counts portion jumps up to 20 mV for a total inaccuracy of 30 mV. Obviously, if a signal in the vicinity of, say, 1.3 volts were to be measured, greater accuracy would be obtained on the 2 volt scale than on either the 20 or 200 volt scales. In contrast, the 200 millivolt scale would produce an overload situation and cannot be used. Overloads are often indicated by either a flashing display or a readout of "OL". Finally, analog meters typically give a base accuracy in terms of a percentage of "full scale" (i.e., the selected scale or range) and not the signal itself, and obviously, there is no "counts" specification.
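The "percent of reading plus counts" computation above can be captured in a short sketch (the function name and its parameters are hypothetical, chosen to mirror the 1% + 2 counts example):

```python
# Sketch: worst-case uncertainty for a "percent of reading plus counts"
# accuracy specification, assuming a 2000 count (3 1/2 digit) display.
def dmm_uncertainty(reading, full_scale, pct=1.0, counts=2,
                    display_counts=2000):
    """Worst-case uncertainty, in the same units as the reading."""
    finest_digit = full_scale / display_counts   # value of one count
    return reading * pct / 100 + counts * finest_digit

# Reading 1 V on the 2 V range: 10 mV + 2 x 1 mV = 12 mV
print(dmm_uncertainty(1.0, 2.0))
# Same 1 V reading on the 20 V range: 10 mV + 2 x 10 mV = 30 mV
print(dmm_uncertainty(1.0, 20.0))
```

Running both cases reproduces the figures worked out in the text: the percent-of-reading term is unchanged, while the counts term grows tenfold with each range step, which is why the lowest usable range gives the best accuracy.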