Precision & Accuracy

Precision

       It is the degree of closeness of the successive outputs of a sensor for the same input. In simple words, it is the spread or dispersion of repeated results about a common value. The closer the outputs are to each other, the greater the precision of the sensor. For example, for the same input, a sensor that gives outputs 3.2, 3.4, 3.4, 3.3, 3.2, 3.3 is more precise than one that gives 3.0, 3.4, 3.8, 3.8, 3.2, 4.0. Precision is independent of the true value of the quantity being measured.
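To make this concrete, here is a minimal sketch (in Python, which the text itself does not use) that quantifies the spread of the two sets of readings quoted above; using the population standard deviation as the dispersion measure is an assumption, since the text does not fix a particular metric.

    import statistics

    # Repeated outputs of the two sensors quoted above, for the same input.
    sensor_a = [3.2, 3.4, 3.4, 3.3, 3.2, 3.3]
    sensor_b = [3.0, 3.4, 3.8, 3.8, 3.2, 4.0]

    # Smaller spread about the mean means higher precision.
    print(statistics.pstdev(sensor_a))  # ~0.082
    print(statistics.pstdev(sensor_b))  # ~0.359

Sensor A shows the smaller spread, so it is the more precise of the two, regardless of what the true value is.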


Accuracy

      It is the closeness with which the output of a sensor approaches the exact or true value of the quantity being measured. A sensor is more accurate if its output is closer to the true value. For example, if the true value is 5.8, then a sensor giving outputs 5.7, 5.75, 5.82, 5.81 is more accurate than a sensor giving outputs 5.65, 5.66, 5.58, 6.1, and so on. The difference between precision and accuracy is that precision is only the closeness of the outputs to each other, whereas accuracy means closeness of the outputs to each other as well as to the expected or true value being measured. Therefore, a sensor which is accurate is rarely very imprecise; however, a sensor being precise gives no indication of its accuracy.
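The same comparison can be sketched in code by measuring closeness to the true value as the mean absolute error of the readings (again an assumed metric, not one fixed by the text):

    # True value and the two sets of outputs quoted above.
    true_value = 5.8
    sensor_a = [5.7, 5.75, 5.82, 5.81]
    sensor_b = [5.65, 5.66, 5.58, 6.1]

    def mean_abs_error(readings, true):
        # Average distance of the readings from the true value.
        return sum(abs(r - true) for r in readings) / len(readings)

    print(mean_abs_error(sensor_a, true_value))  # ~0.045: more accurate
    print(mean_abs_error(sensor_b, true_value))  # ~0.20: less accurate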



Error is defined as:

Error = Measured Value - Expected Value

The smaller the error, the more accurate the sensor.
An instrument or a sensor having an accuracy of ±0.05% is more accurate than one having an accuracy of ±0.1%.
This is because accuracy is always expressed in terms of error: a smaller stated percentage means a smaller permissible error.
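As a direct transcription of this definition (a sketch; the function name is illustrative):

    def error(measured, expected):
        # Error = Measured Value - Expected Value
        return measured - expected

    # A reading of 5.75 against an expected value of 5.8 gives an error
    # of about -0.05; the smaller its magnitude, the more accurate the sensor.
    print(abs(error(5.75, 5.8)))  # ~0.05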

[Figure: Sensor characteristics. (a) Low precision, low accuracy; (b) high precision, high accuracy; (c) high precision, low accuracy.]

Accuracy can be defined in any one of the following three ways:

A] Point Accuracy:
     Here, accuracy is specified at a single value lying within the range of the instrument. This method is rarely used nowadays.

B] Percentage of Full Scale Reading:
     This method defines accuracy with respect to the full-scale range of the sensor, so the absolute error is the same at every reading. For example, for a thermometer with a range of 500 degrees and an accuracy of ±0.5% of full-scale reading, the error at every reading is 2.5 deg. At 500 deg. this is only 0.5% of the reading, which is negligible; however, at a reading of 25 deg. the same 2.5 deg. error amounts to (500/25) * 0.5% = 10% of the reading.
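This arithmetic can be sketched as follows (constant names are illustrative):

    FULL_SCALE = 500.0    # thermometer range, degrees
    SPEC = 0.005          # ±0.5% of full-scale reading

    abs_error = SPEC * FULL_SCALE  # 2.5 deg, the same at every reading

    # The fixed absolute error becomes a larger fraction of small readings.
    for reading in (500, 100, 25):
        print(reading, abs_error / reading * 100, "% of reading")
    # 500 -> 0.5%, 100 -> 2.5%, 25 -> 10.0%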

C] Percentage of True Value:
     This is the best way to express accuracy. In this method, the absolute error shrinks in proportion to the reading, so the percentage error stays constant, contrary to the above method. For example, the table below compares an accuracy specified as ±1% of true value with one specified as ±1% of full-scale reading for different readings:

       


                             ±1% of reading        ±1% of full-scale reading
Instrument reading (newton)  Accuracy (newton)     Uncertainty (newton)   Equivalent percentage of reading
1000                         10                    10                     1%
500                          5                     10                     2%
100                          1                     10                     10%
50                           0.5                   10                     20%
10                           0.1                   10                     100%
5                            0.05                  10                     200%
0                            0.00                  10                     infinite
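A short sketch reproducing the table's arithmetic; the 1000 N full scale is inferred from the constant 10 N uncertainty (1% of full scale):

    FULL_SCALE = 1000.0   # newtons; 1% of full scale = 10 N, as in the table
    SPEC = 0.01           # ±1%, applied either to the reading or to full scale

    for reading in (1000, 500, 100, 50, 10, 5, 0):
        err_reading = SPEC * reading      # shrinks with the reading
        err_fs = SPEC * FULL_SCALE        # constant 10 N at every reading
        # The full-scale error expressed as a percentage of the reading.
        equiv = f"{err_fs / reading * 100:g}%" if reading else "infinite"
        print(f"{reading:>6} N  {err_reading:>6g} N  {err_fs:>4g} N  {equiv}")

The percent-of-true-value error stays at 1% of whatever is read, while the percent-of-full-scale error, fixed at 10 N, overwhelms small readings.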