Accuracy Versus Precision

Precision and accuracy are characteristics of measurement, and measurement plays an important role in quality management. If the collected measurements do not meet the requirements, the deliverable cannot be accepted and is said to be of poor quality.

You analyze the collected data for the accuracy and precision of the deliverable. If the measurements are accurate and precise, you will accept the deliverable. Otherwise, you will ask for corrective action.

A perfectly acceptable deliverable must be accurate and precise.


Precision and accuracy are often incorrectly assumed to be synonyms. Therefore, let me clarify our understanding of these quality management concepts.

Precision

Measurements are precise when the values of repeated measurements are clustered tightly, with little scatter.

Precise measurements are not necessarily close to the target value; precision just means that the results are close to one another. They may or may not be near the target.

Measurements are said to have high precision when there is little scatter.

A Real-World Example of Precision

Assume that you received an order to supply 10,000 rods, 10 meters in length, to your client. You started production, and during the quality inspection, you randomly measure five rods.

The length of each rod is as follows:

[Image: five measured rod lengths, all very close to one another but well off the 10-meter target.]

If you analyze these measurements, you will notice that they are not close to the target (10 meters) but are very close to one another. There is very little difference in their lengths, so their measurements have very little scatter.

In this case, the measurements are precise.

Precision is a measure of the variation among values.
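To make this concrete, here is a minimal Python sketch using hypothetical rod lengths (the original image's values are not reproduced here): the sample standard deviation quantifies the scatter (precision), while the mean's distance from 10 meters quantifies the accuracy.

```python
from statistics import mean, stdev

target = 10.0                                # required rod length in meters
rods = [10.29, 10.30, 10.28, 10.31, 10.30]   # hypothetical measurements (m)

print(f"mean length: {mean(rods):.3f} m")                # about 10.30 m
print(f"scatter (sample std dev): {stdev(rods):.4f} m")  # about 0.011 m: precise
print(f"offset from target: {mean(rods) - target:+.3f} m")  # large: inaccurate
```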

Accuracy

Accuracy is defined as how close the measured values are to the target value.

Scatter does not play a significant role here: accurate measurements may or may not be close to one another. In other words, accurate data does not have to be precise, although that is the ideal.

A Real-World Example of Accuracy

Let us consider the example discussed earlier. From a production lot, you randomly pick five rods for a quality inspection. You measure their lengths, and the dimensions are as follows:

[Image: five measured rod lengths, all close to the 10-meter target but widely scattered around it.]

You can see that all measurements are very close to the target length of the rod, which is 10 meters. Although the values are close to the target value, they are not close to each other, so the scatter is high.

So, you can say that these measurements are accurate but not precise.

You may be wondering which characteristic of measurement is more desirable. The answer is “accuracy.”

This is because all data are close to the actual value, which is a sign of a correct deliverable. The best case, however, is when the measurements are precise as well: close to the target value and very close to each other.

The Difference Between Accuracy and Precision

There are a few differences between accuracy and precision:

  • Accurate data are close to the target value, while precise data are close to each other.
  • Accuracy is always desired while precision is desirable when it is coupled with accuracy.
  • Accurate data may or may not be precise, and precise data may or may not be accurate. 
  • Precision and accuracy are independent of each other.
  • One measurement is enough for accuracy, while precision requires many measurements.

The Significance of Accuracy and Precision

Measurements are important for quality management. If the measurements are precise as well as accurate, you can say that the product is defect free.

However, if the measured data are neither precise nor accurate, the product is defective; i.e., it lacks correctness and exactness at the same time, and you have to take corrective and preventive action.

Summary

Precision and accuracy are vital quality management concepts. Accuracy is about closeness to the required value while precision measures repeatability.

Precision alone is not as important unless it is coupled with accuracy. It is not necessary for precise measurements to be accurate, or for accurate measurements to be precise.

Precise measurements can be accurate or inaccurate, and accurate measurements can be precise or imprecise. 

It is the responsibility of the project management team to decide the level of accuracy and precision for their project deliverables during the quality inspection.

These topics are important from a PMP exam point of view; therefore, you must know these concepts well and understand the differences between them.

I hope I have clarified a few things for you. If you have any thoughts or feedback, please share them in the comments section below.

Image credit: NOAA’s National Ocean Service


Accuracy Versus Precision

A surveyor strives for both accuracy and precision.  Many people use the terms “accuracy” and “precision” interchangeably.  However, for those in the surveying profession (as well as other technical and scientific fields), these words have different meanings.

  To surveyors, “accuracy” refers to how closely a measurement or observation comes to measuring a “true value,” since measurements and observations are always subject to error.

  “Precision” refers to how closely repeated measurements or observations come to duplicating measured or observed values. 

Using four cases of rifle shots fired at a bull’s eye target, each with different results, helps to distinguish the meaning of these two terms.

[Image: These four sets of rifle shots illustrate the distinction that surveyors make between the terms “accuracy” and “precision,” as applied to surveying measurements and observations.]

Case 1: Not accurate, not precise: A shooter stands, aims through the rifle’s telescopic sight, and fires four shots at a target. Upon examining the target, the shooter sees that all four shots are high and to the left, scattered all around that part of the target. These shots were neither accurate (not close to the center) nor precise (not close to each other).

Case 2: Precise, not accurate: The shooter assumes a prone position, rests the barrel of the rifle on a support, takes careful aim, holds his breath, and gently squeezes the trigger.

  The target shows that these four shots are very close together, but all four are high and to the left of the bull’s eye.

  These shots are precise (close together), but not accurate (not close to the center of the target).

Case 3: Accurate, not precise:  The shooter adjusts the rifle’s telescopic sight and, full of confidence that the problem of inaccuracy has been solved, stands and quickly fires four shots. Upon studying the target, the shooter sees that the four holes are scattered across it, but each one is very close to the bull’s eye. These shots are accurate, but not precise.

Case 4: Accurate, precise:  The shooter again assumes a prone position, rests the barrel of the rifle on a support, takes careful aim, holds his breath, and gently squeezes the trigger four times.  This time, the four holes are very close to the center of the target (accurate) and very close together (precise).
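As an illustration of these four cases, here is a small Python sketch (not NOAA's method; the coordinates and tolerances are invented for the example) that labels a group of shots as accurate and/or precise:

```python
from math import hypot
from statistics import mean

def classify(shots, accuracy_tol=1.0, precision_tol=1.0):
    """Label a shot group: is its center near the bull's eye (accuracy),
    and do the shots cluster tightly around their own center (precision)?"""
    cx = mean(x for x, _ in shots)
    cy = mean(y for _, y in shots)
    accurate = hypot(cx, cy) <= accuracy_tol          # bull's eye at (0, 0)
    precise = max(hypot(x - cx, y - cy) for x, y in shots) <= precision_tol
    return accurate, precise

# Case 2: a tight cluster, high and to the left of the bull's eye.
print(classify([(-3.0, 2.9), (-3.1, 3.0), (-2.9, 3.1), (-3.0, 3.0)]))
# -> (False, True): precise, but not accurate
```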

To illustrate the distinction between terms using a surveying example, imagine surveyors very carefully measuring the distance between two survey points about 30 meters (approximately 100 feet) apart 10 times with a measuring tape.

  All 10 of the results agree with each other to within two millimeters (less than one-tenth of an inch).  These would be very precise measurements.  However, suppose the tape they used was too long by 10 millimeters.

  Then the measurements, even though very precise, would not be accurate.

  Other factors that could affect the accuracy or precision of tape measurements include:  incorrect spacing of the marks on the tape, use of the tape at a temperature different from the temperature at which it was calibrated, and use of the tape without the correct tension to control the amount of sag in the tape.
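Once a systematic tape error like this is known, it can be corrected. A minimal sketch, assuming the nominal 30 m tape is actually 30.010 m long (the 10 mm error described above) and that a simple proportional correction applies:

```python
NOMINAL_TAPE = 30.000   # marked length of the tape (m)
ACTUAL_TAPE = 30.010    # assumed calibrated length: 10 mm too long

def corrected_distance(reading_m: float) -> float:
    """Scale a reading from the faulty tape to the true distance."""
    return reading_m * (ACTUAL_TAPE / NOMINAL_TAPE)

reading = 30.000                               # what the surveyors read
print(f"{corrected_distance(reading):.3f} m")  # -> 30.010 m: precise but biased
```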


Accuracy and Precision

They mean slightly different things!

Accuracy

Accuracy is how close a measured value is to the actual (true) value.

Precision

Precision is how close the measured values are to each other.

Examples

Here is an example of several values on the number line:

[Image: measured values plotted on a number line around the true value.]

And an example on a target:

[Image: three targets labeled “High Accuracy, Low Precision,” “Low Accuracy, High Precision,” and “High Accuracy, High Precision.”]


If you are playing football and you always hit the right goal post instead of scoring, then you are not accurate, but you are precise!

How to Remember?

  • aCcurate is Correct (a bullseye).
  • pRecise is Repeating (hitting the same spot, but maybe not the correct spot).

Bias (don't let precision fool you!)

When we measure something several times and all values are close, they may still all be wrong if there is a “bias.”

Bias is a systematic (built-in) error which makes all measurements wrong by a certain amount.

Examples of Bias

  • The scales read “1 kg” when there is nothing on them.
  • You always measure your height wearing shoes with thick soles.
  • A stopwatch that takes half a second to stop when clicked

In each case all measurements are wrong by the same amount. That is bias.
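Because bias shifts every reading by the same amount, it can be removed by subtracting a known zero reading (a “tare”). A minimal Python sketch, reusing the 1 kg scales example above with hypothetical readings:

```python
zero_reading = 1.0                        # the scales read 1 kg when empty
raw = [73.4, 73.6, 73.5]                  # hypothetical raw readings (kg)

corrected = [r - zero_reading for r in raw]
print([round(r, 1) for r in corrected])   # -> [72.4, 72.6, 72.5]
```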

Degree of Accuracy

Degree of Accuracy depends on the instrument we are measuring with. But as a general rule:


The Degree of Accuracy is half a unit each side of the unit of measure.

Examples:

When an instrument measures in “1”s, any value between 6½ and 7½ is measured as “7”.
When an instrument measures in “2”s, any value between 7 and 9 is measured as “8”.

(Notice that the same true value can be measured differently depending on the instrument! Read more at Errors in Measurement.)

We should show final values that match the accuracy of our least accurate value used.

Example: We are told the dog is about 2 feet high.

We can convert that to 609.6 mm, but that suggests we know the height to within 0.1 mm!

So we should use 600 mm
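Here is the same arithmetic as a short Python sketch; rounding to the nearest 100 mm is one reasonable way to express the looser accuracy of the original "about 2 feet" estimate:

```python
MM_PER_FOOT = 304.8

height_ft = 2                        # known only "about" to the nearest foot
exact_mm = height_ft * MM_PER_FOOT   # 609.6 mm, misleadingly precise
reported_mm = round(exact_mm, -2)    # round to the nearest 100 mm
print(exact_mm, reported_mm)         # -> 609.6 600.0
```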

Copyright © 2017 MathsIsFun.com

Accuracy vs. Precision

Precision Machined Products Association

Accuracy describes “close-to-true value.” Precision describes “repeatability.”


Accuracy in measurement describes how closely the measurement from your system matches the actual or true measurement of the thing being measured.

It is the difference between the observed average of measurements and the true average.

Think of accuracy as the “trustworthiness” of a measurement system.

Precision in measurement describes how well a measurement system will return the same measure; that is its repeatability.

As the target illustrations above show, it is important to be both accurate and precise if you are to get usable information from your measurement system.

But the repeatability has two components: that of the measurement system (gage) itself and that of the operator(s). The differences resulting from different operators using the same measurement device is called reproducibility.

In our shops, we cannot tell if our measurement system has repeatability or reproducibility issues without doing a Long Form Gage R&R study. Gage repeatability and reproducibility studies (GR&R) use statistical techniques to identify and discern the sources of variation in our measurement system: is it the gage, or is it the operator?

Gage error determined by the GR&R is expressed as a percentage of the tolerance that you are trying to hold.

Typically, 10 percent or less gage error is considered acceptable. More than 30 percent is unacceptable; between 10 and 30 percent gage error may be acceptable depending on the application. 

Regardless, any level of gage error is an opportunity for continuous improvement.
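The quoted rule of thumb is easy to express in code. A sketch, with illustrative (not real) numbers:

```python
def judge_gage_error(gage_error, tolerance):
    """Express gage error as a percentage of the tolerance and apply the
    rule of thumb quoted above (10% / 30% breakpoints)."""
    pct = 100.0 * gage_error / tolerance
    if pct <= 10:
        return pct, "acceptable"
    if pct <= 30:
        return pct, "may be acceptable, depending on the application"
    return pct, "unacceptable"

# Hypothetical numbers: 0.004 mm of gage error against a 0.050 mm tolerance.
pct, verdict = judge_gage_error(0.004, 0.050)
print(f"{pct:.0f}% of tolerance: {verdict}")  # -> 8% of tolerance: acceptable
```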

Accuracy, Precision, and Error

  • Describe the difference between accuracy and precision, and identify sources of error in measurement
    • Accuracy refers to how closely the measured value of a quantity corresponds to its “true” value.
    • Precision expresses the degree of reproducibility or agreement between repeated measurements.
    • The more measurements you make and the better the precision, the smaller the error will be.

Accuracy is how close a measurement is to the correct value for that measurement. The precision of a measurement system refers to how close the agreement is between repeated measurements (which are repeated under the same conditions).

Measurements can be both accurate and precise, accurate but not precise, precise but not accurate, or neither.

High accuracy, low precision: On this bullseye, the hits are all close to the center, but none are close to each other; this is an example of accuracy without precision.

Low accuracy, high precision: On this bullseye, the hits are all close to each other, but not near the center of the bullseye; this is an example of precision without accuracy.

Precision is sometimes separated into:

  • Repeatability — The variation arising when all efforts are made to keep conditions constant by using the same instrument and operator, and repeating the measurements during a short time period.
  • Reproducibility — The variation arising using the same measurement process among different instruments and operators, and over longer time periods.
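A minimal sketch of this distinction, using hypothetical data for two operators measuring the same part: repeatability shows up as scatter within each operator's readings, reproducibility as the gap between the operators' averages. (A real GR&R study partitions these variance components more formally.)

```python
from statistics import mean, stdev

# Hypothetical measurements of the same part by two operators.
by_operator = {
    "operator A": [10.02, 10.01, 10.03],
    "operator B": [10.11, 10.12, 10.10],
}

for name, vals in by_operator.items():
    # Repeatability: scatter within one operator using one instrument.
    print(f"{name}: within-operator std dev = {stdev(vals):.3f}")

# Reproducibility: how far apart the operators' averages sit.
averages = [mean(v) for v in by_operator.values()]
print(f"spread of operator means = {max(averages) - min(averages):.3f}")
```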

Accuracy and Precision (YouTube): an easy-to-understand introduction to accuracy and precision.

Error

All measurements are subject to error, which contributes to the uncertainty of the result. Errors can be classified as human error or technical error. Perhaps you are transferring a small volume from one tube to another and you don’t quite get the full amount into the second tube because you spilled it: this is human error.

Technical error can be broken down into two categories: random error and systematic error. Random error, as the name implies, occurs unpredictably, with no recognizable pattern.

Systematic error occurs when there is a problem with the instrument. For example, a scale could be improperly calibrated and read 0.5 g with nothing on it. All measurements would therefore be overestimated by 0.5 g.

Unless you account for this in your measurement, your measurement will contain some error.
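A short simulated example of the two kinds of technical error, reusing the 0.5 g calibration offset above (the noise level is an assumption chosen for illustration):

```python
import random

TRUE_MASS = 10.0   # grams
BIAS = 0.5         # systematic error: the scale reads 0.5 g when empty

random.seed(1)     # make the sketch reproducible
# Each reading = truth + systematic offset + random scatter.
readings = [TRUE_MASS + BIAS + random.gauss(0, 0.05) for _ in range(5)]

print([round(r, 2) for r in readings])   # every reading roughly 0.5 g high
avg = sum(readings) / len(readings)
print(round(avg - TRUE_MASS, 2))         # the systematic part, about 0.5
```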



Here Is the Difference Between Accuracy and Precision

Accuracy and precision are two important factors to consider when taking data measurements. Both accuracy and precision reflect how close a measurement is to an actual value, but accuracy reflects how close a measurement is to a known or accepted value, while precision reflects how reproducible measurements are, even if they are far from the accepted value.

  • Accuracy is how close a value is to its true value. An example is how close an arrow gets to the bull's-eye center.
  • Precision is how repeatable a measurement is. An example is how close a second arrow is to the first one (regardless of whether either is near the mark).
  • Percent error is used to assess whether a measurement is sufficiently accurate and precise.

You can think of accuracy and precision in terms of hitting a bull's-eye.

Accurately hitting the target means you are close to the center of the target, even if all the marks are on different sides of the center.

Precisely hitting a target means all the hits are closely spaced, even if they are very far from the center of the target. Measurements that are both precise and accurate are repeatable and very near true values.

There are two common definitions of accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value.

The ISO (International Organization for Standardization) applies a more rigid definition, where accuracy refers to a measurement with both true and consistent results. The ISO definition means an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that accurate be used when a measurement is both accurate and precise.

Precision is how consistent results are when measurements are repeated. Precise values differ from each other because of random error, which is a form of observational error. 

You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy.

If he doesn't make many baskets but always strikes the same portion of the rim, he has a high degree of precision.

A player whose free throws always make the basket the exact same way has a high degree of both accuracy and precision.

Take experimental measurements for another example of precision and accuracy. If you take measurements of the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise, but not very accurate. If your scale gives you values of 49.8, 50.5, 51.0, and 49.6, it is more accurate than the first balance but not as precise.

The more precise scale would be better to use in the lab, providing you made an adjustment for its error.
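Using the values given in the text, a quick Python check makes the comparison explicit: the mean's error measures accuracy, and the standard deviation measures precision.

```python
from statistics import mean, stdev

standard = 50.0                       # grams, the known sample mass
scale_1 = [47.5, 47.6, 47.5, 47.7]    # precise but inaccurate
scale_2 = [49.8, 50.5, 51.0, 49.6]    # more accurate, less precise

for name, data in (("scale 1", scale_1), ("scale 2", scale_2)):
    print(f"{name}: mean error {mean(data) - standard:+.2f} g, "
          f"std dev {stdev(data):.2f} g")
# scale 1: mean error about -2.4 g, std dev about 0.10 g
# scale 2: mean error about +0.2 g, std dev about 0.64 g
```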

An easy way to remember the difference between accuracy and precision is:

  • ACcurate is Correct (or Close to real value)
  • PRecise is Repeating (or Repeatable)

Do you think it's better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different, yet it's close to your true weight, the scale is accurate.

Yet it might be better to use a scale that is precise, even if it is not accurate. In this case, all the measurements would be very close to each other and “off” from the true value by about the same amount.

This is a common issue with scales, which often have a “tare” button to zero them.

While scales and balances might allow you to tare or make an adjustment to make measurements both accurate and precise, many instruments require calibration. A good example is a thermometer.

Thermometers often read more reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside that range. To calibrate an instrument, record how far off its measurements are from known or true values.

Keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.

Accuracy and precision are only two important concepts used in scientific measurements. Two other important skills to master are significant figures and scientific notation. Scientists use percent error as one method of describing how accurate and precise a value is. It's a simple and useful calculation.
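As a minimal sketch of the percent-error calculation mentioned above (the sample reading reuses the 50.0-gram balance example from earlier):

```python
def percent_error(measured, accepted):
    """Size of the error relative to the accepted value, as a percentage."""
    return abs(measured - accepted) / abs(accepted) * 100.0

# One reading from the accurate-but-imprecise balance above:
print(f"{percent_error(49.8, 50.0):.1f}%")   # -> 0.4%
```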
