ORC Radio Chatter

Experimenter's Corner => Test Equipment => Topic started by: K9DJT on June 29, 2013, 03:57:32 PM

Title: Accuracy, Resolution & Repeatability
Post by: K9DJT on June 29, 2013, 03:57:32 PM
Before we start discussing any particular test instrument, let’s take a look at something which applies to all of them, i.e., Accuracy, Resolution & Repeatability.  The aforementioned is not only of interest in test equipment but in measurements of all types.  Examples include dimensional (rulers, micrometers, calipers), temperature, weight, time, and even a rifle scope.  Basically, if it can be measured, Accuracy, Resolution & Repeatability apply.

So what is the difference between the three?  What makes for a good measurement device?  Do more digits on a meter or markings on a ruler make them more accurate?  And what is this repeatability thing?  In order to demonstrate the difference, I would like to draw an analogy to a bow and arrow target.  If you envision it as a gun shooting target, that is OK too.

We all know that being Accurate is the ability to hit the bull’s-eye.  Figure 1 shows three arrows all placed exactly in the center of the target, which makes for not only a very Accurate shooter, but also a high-Resolution and Repeatable one.  OK, now let’s look at figure 2.  This time the shooter again placed three arrows into the bull’s-eye, but not dead center as before.  Is it still Accurate?  For practical purposes we can say it is.  After all, they are in the bull’s-eye.  So how Accurate do you need to be?  And how much are you willing to pay for it?  Instrument Accuracy is typically specified as a +/- percentage of reading.  This example fits the expression often heard, “It’s good enough for ham radio.”  I am not sure if it is acceptable for a guidance system though.

When looking at figure 3, you will notice I added some additional rings to the bull’s-eye, or more digits to the meter.  Has the Accuracy changed?  No, but the Resolution of the target has.  The point being, more digits do not make for a more accurate instrument.  Now that the instrument has better resolution, we can see it is not as accurate as it could be.  But again, how important is that for the type of work you are doing?

Ah, now for Repeatability.  Referring to figure 4, we can see three arrows which hit the target, but each in a different place, none really close to any other.  It’s easy to see this is not Repeatable shooting.  Now looking at figure 5, we again have three arrows, but this time grouped together in one spot.  This is very Repeatable shooting, but not Accurate.  Figure 2 is reasonably repeatable and reasonably accurate.
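The archer analogy can also be put into numbers: accuracy is how far the average reading sits from the true value (the bull’s-eye), while repeatability is how tightly repeated readings cluster.  Here is a minimal sketch; the readings are made-up illustration data, not from any real meter:

```python
import statistics

TRUE_VALUE = 12.00  # volts: the "bull's-eye"

# Hypothetical repeated readings from two meters (illustrative numbers only)
meter_a = [12.01, 11.99, 12.00, 12.02, 11.98]  # accurate and repeatable (like figure 1)
meter_b = [12.41, 12.40, 12.42, 12.41, 12.40]  # repeatable but not accurate (like figure 5)

for name, readings in [("meter A", meter_a), ("meter B", meter_b)]:
    bias = statistics.mean(readings) - TRUE_VALUE  # accuracy: offset from the true value
    spread = statistics.stdev(readings)            # repeatability: scatter of the readings
    print(f"{name}: bias = {bias:+.3f} V, spread = {spread:.3f} V")
```

Meter B in the sketch is the figure 5 shooter: its readings barely scatter, yet every one of them is far from the true value.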

In conclusion, when purchasing a piece of test equipment, know in advance how accurate it needs to be for your application, and remember that more digits do not necessarily mean it is any more accurate.  Look for something which fits your accuracy needs, and which is repeatable, meaning each time you probe a test point you get the same reading.  It shouldn’t be different each time you go back to it.  Make some known measurements to build confidence and trust in the instrument so you can focus on your analysis without questioning your test equipment.
Title: Re: Accuracy, Resolution & Repeatability
Post by: N9LOO on July 17, 2013, 07:10:41 PM
Great explanation of the topic.  I'd be interested in knowing how to convert the various accuracy specs I see for pieces of test equipment -- for instance:
[Please see attachment]

Is this considered "It's good enough for ham radio"?
Title: Re: Accuracy, Resolution & Repeatability
Post by: K9DJT on July 18, 2013, 10:04:24 AM
Looking at the accuracy specification you provided, I can tell it is a digital multimeter (DMM), because the accuracy spec brings in a new term, i.e., COUNTS.  Counts refers to the actual numbers the DMM is capable of displaying.  For example, a 3000-count DMM can display numbers from 0 to 2999, a 4000-count from 0 to 3999, and a 5000-count from 0 to 4999.
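The relationship between counts, range, and resolution is simple arithmetic, sketched below.  The 6000-count / 60 V pairing is an assumption chosen to match the .01 VDC resolution on the 60 VDC range quoted later in this thread:

```python
def resolution(full_scale, counts):
    """Smallest step (one count) a meter can display on a given range."""
    return full_scale / counts

# A 3000-count DMM displays 0-2999, a 4000-count 0-3999, and so on:
for counts in (3000, 4000, 5000):
    print(f"{counts}-count meter: displays 0 to {counts - 1}")

# Assumed pairing: a 6000-count meter on a 60 VDC range resolves
# 60 / 6000 = .01 VDC per count.
print(f"{resolution(60.0, 6000):.2f} VDC per count")
```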

Now let’s look at computing the accuracy of an actual measurement.  For example, you might measure your auto battery voltage at 12.00 VDC (Volts DC).  The spec says +/- .5% of reading +/- 2 counts.  You would multiply 12.00 VDC by .5% (.005) and then add (or subtract) 2 counts at the last digit of the measurement.  The percentage term alone is +/- .06 VDC, which means the voltage can be as high as 12.06 VDC plus 2 counts, or as low as 11.94 VDC minus 2 counts.

The +/- 2 counts refers to the very last digit of the measurement.  With the meter having a resolution of .01 VDC in the 60 VDC range, 2 counts equal .02 VDC, which we add to or subtract from the calculation above.  Our measurement accuracy now equates to a high of 12.08 VDC or a low of 11.92 VDC.  If the resolution of the meter were .001 instead of .01, the high would be 12.062 VDC with a low of 11.938 VDC.
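The arithmetic above can be checked with a short script.  The helper below is hypothetical (not taken from any meter’s manual); it applies the +/-(% of reading + counts) formula, where one count equals the resolution of the range in use:

```python
def dmm_bounds(reading, pct, counts, resolution):
    """Worst-case bounds for a DMM spec of +/-(pct% of reading + counts).

    `resolution` is the size of one count on the range in use.
    """
    margin = reading * pct / 100 + counts * resolution
    return reading - margin, reading + margin

# 12.00 VDC measured on a range with .01 V resolution, spec +/-(0.5% + 2 counts)
low, high = dmm_bounds(12.00, 0.5, 2, 0.01)
print(f"{low:.2f} .. {high:.2f} VDC")   # 11.92 .. 12.08

# Same spec, but on a meter with .001 V resolution
low, high = dmm_bounds(12.00, 0.5, 2, 0.001)
print(f"{low:.3f} .. {high:.3f} VDC")   # 11.938 .. 12.062
```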

At this point YOU need to determine if it is good enough for ham radio…  :-)

73, Gary
Title: Re: Accuracy, Resolution & Repeatability
Post by: N9LOO on July 18, 2013, 09:45:27 PM
Yes, it was for a DMM.  Thanks for the detailed explanation, as I wasn't sure how the Counts spec factored in.  Your example of how this translates into real-world values helps me understand it much better now.

Is there any spec that covers the repeatability component that you refer to?
Title: Re: Accuracy, Resolution & Repeatability
Post by: K9DJT on July 19, 2013, 02:09:39 PM
I am unaware of a spec relating to "Repeatability," which is unfortunate.  You need to rely on your own testing or on word of mouth within the technical community.

73, Gary