Terminologies used in Measurement

An instrument is commonly used in measurement as a physical means of determining a quantity or variable. It extends human faculties, aiding a person in measuring the desired quantity or variable.

An instrument can be defined as a device for determining the value or magnitude of a quantity or variable. Electronic instruments use the electrical or electronic principles for their functioning.
Some instruments are of simple construction, while many others are of complicated construction. Naturally, as technology develops, the demand for highly accurate and precise instruments increases. As a result, we find more and more intelligent instruments in the field of electronic instrumentation, with improved techniques and user-friendly test procedures.
Reduced physical size, multifunction capability, and adaptability are the special features of world-class measurement technology. However, the choice of an instrument depends on its suitability for a particular application.

In the field of electronic measurements we come across terms that need to be defined. In this topic, such terminologies used in measurement are defined with brief explanations below.


An instrument may be defined as a device for determining the value or magnitude of a quantity or variable.

Example - Ammeter, Voltmeter, Digital Multimeter, Frequency counter, Cathode Ray Oscilloscope, etc.


Accuracy may be defined as the closeness with which an instrument reading approaches the true value of the variable being measured.

Example: Let us assume that the voltage across a resistor is 2.45 V, and that it is measured by two different voltmeters with identical manufacturing specifications, made by the same firm. If one instrument indicates 2.75 V and the other indicates 2.5 V, the latter is more accurate than the former, because 2.5 V is closer to the true value of 2.45 V.
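The comparison above can be sketched as a small computation. This is a minimal illustration, not a standard routine; the function name and scale values are taken from the example (2.45 V true value, 2.75 V and 2.5 V readings).

```python
# Compare two meter readings against a known true value: the reading
# with the smaller deviation is the more accurate one.

def absolute_error(reading, true_value):
    """Deviation of a reading from the true value."""
    return abs(reading - true_value)

true_v = 2.45
meter_a = 2.75   # first voltmeter's indication
meter_b = 2.5    # second voltmeter's indication

err_a = absolute_error(meter_a, true_v)   # deviation of meter A
err_b = absolute_error(meter_b, true_v)   # deviation of meter B

# meter B deviates less from 2.45 V, so it is the more accurate one
more_accurate = "meter B" if err_b < err_a else "meter A"
```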


Precision is a measure of the consistency or reproducibility of a set of measurements.

Example: Let us assume that a resistor has a true value of 1,487 ohm (1.487 k ohm). If its value is measured with an ohmmeter, one will read it as 1.5 k ohm, and even on repeated readings of the scale the result will remain 1.5 k ohm. This reading is close to the true value, but the scale can be read only by estimation.

The point to be noted here is that precision consists of two characteristics:

Conformity, and the number of significant figures to which a measurement may be made. On repeated readings the observed value remains 1.5 k ohm, which is close to the true value, since the reading is obtained by estimation over the scale. The true value itself cannot be read because of the limitation of the meter's scale. Although there are no deviations among the observed values, the inability to arrive at the true value because of this scale limitation leads to a precision error.

The conclusion from the above example is that conformity is a necessary, but not a sufficient, condition for precision, because of the limited number of significant figures obtained. Similarly, precision is a necessary, but not a sufficient, condition for accuracy.
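The ohmmeter example above can be sketched numerically: the repeated readings agree perfectly with one another (high conformity), yet each one carries a fixed error set by the scale. The values are taken from the example; the variable names are illustrative.

```python
# Conformity vs. accuracy from the ohmmeter example: every repeated
# reading of the 1,487 ohm resistor is 1.5 k ohm, because the scale can
# only be read to two significant figures.

true_ohms = 1487.0          # true resistance, in ohm
readings = [1500.0] * 5     # five repeated scale readings, all 1.5 k ohm

# spread of zero means the readings conform perfectly to one another
spread = max(readings) - min(readings)

# yet every reading carries the same scale-limitation error of 13 ohm
scale_error = abs(readings[0] - true_ohms)
```

A spread of zero with a nonzero scale error is exactly the situation described above: conformity without enough significant figures.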


Sensitivity may be defined as the ratio of the output signal or response of the instrument to a change in the input or measured variable.

Example: A 50 μA meter is more sensitive than a 1 mA meter, since the former needs only 50 μA to give full-scale deflection, whereas the latter needs 1 mA.
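Treating sensitivity as deflection produced per unit of input, the comparison above works out as follows. The scale length of 100 divisions is an assumption for illustration only; the two full-scale currents are from the example.

```python
# Sensitivity as the ratio of output response (deflection) to input
# (current). A meter that needs less current for full-scale deflection
# has a higher sensitivity.

FULL_SCALE_DEFLECTION = 100.0   # scale divisions, assumed for illustration

def sensitivity(full_scale_current):
    """Deflection produced per ampere of input (divisions per A)."""
    return FULL_SCALE_DEFLECTION / full_scale_current

s_50uA = sensitivity(50e-6)   # 50 uA meter
s_1mA = sensitivity(1e-3)     # 1 mA meter

# the 50 uA meter turns out 20 times more sensitive
ratio = s_50uA / s_1mA
```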


Resolution may be defined as the smallest change in the measured value to which the instrument will respond.

Example: Let us assume that we are measuring the output voltage of a circuit. An instrument with good resolution will show a change in its reading even for an incremental change in the measured quantity; an instrument that cannot respond to such small changes has poor resolution.
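For a digital instrument, the smallest detectable change is set by the least significant digit of the display. The 10 V range and three-digit display below are assumptions chosen to illustrate the idea; they are not from the text.

```python
# Resolution of a hypothetical digital voltmeter: on a given range,
# the least significant displayed digit fixes the smallest change in
# the measured value that the instrument can show.

full_scale = 10.0   # volts; assumed measurement range
digits = 3          # number of displayed digits; assumed

# one count of the least significant digit = 10 V / 1000 = 0.01 V
resolution = full_scale / (10 ** digits)
```

Any change in the measured voltage smaller than 0.01 V would leave this display unchanged, which is precisely what "the instrument will not respond" means.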


Error may be defined as the amount of deviation from the true value of the variable or quantity being measured.

(a) Positive Error: When the indicated value of the instrument is more than the true value, the error is said to be positive.
(b) Negative Error: When the indicated value of the instrument is less than the true value, the error is said to be negative.
(c) Gross Errors: These are mostly human errors, caused by wrong reading of the instruments, improper adjustment or application of the instruments, and mistakes in computation.
(d) Systematic Errors: These errors are due to shortcomings of the instrument itself, such as wear and tear or aging of parts. Equipment is also affected by its environment.
(e) Random Errors: These are due to random variations in the parameter or the system of measurement that cannot be directly established.
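Definitions (a) and (b) above can be sketched as a small classifier. This is an illustration only; the function name is hypothetical, and the sample values reuse the voltmeter figures from the accuracy example.

```python
# Classify the sign of a static error from the indicated and true
# values, following definitions (a) and (b): indicated above true is a
# positive error, below true is a negative error.

def error_sign(indicated, true_value):
    """Return 'positive', 'negative', or 'none' for the error."""
    if indicated > true_value:
        return "positive"
    if indicated < true_value:
        return "negative"
    return "none"

# e.g. a meter showing 2.75 V for a true 2.45 V has a positive error
sign = error_sign(2.75, 2.45)
```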


Measurement may be defined as a process in which an instrument is used to assign a numerical value to a parameter. In measurement, a specific number or value is found.


A test may be defined as a process of observation or examination. A test need not necessarily lead to any numerical value.


A parameter may be defined as a circuit dimension or element, such as current, voltage, power, or resistance.
