Caliper accuracy question

I'm using a cheap dial caliper (for years) and was questioning its precision for reloading. So, with the tooling available, I measured the thickness of the leaves of a Starrett feeler gauge set from 0.001" to 0.030" in one-thou increments, and every one was spot on. I also measured 0.050", 0.075", 0.080" and 0.100" accurately.

Can I call it good to go?
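Just to put numbers on that kind of check, here's a minimal Python sketch (the readings below are invented for illustration, not actual data):

Code:
# Sketch: tabulate caliper readings against known feeler gauge sizes.
# These readings are made-up examples, not real measurements.
known    = [0.001, 0.002, 0.003, 0.050, 0.075, 0.080, 0.100]   # gauge sizes, inches
readings = [0.001, 0.002, 0.003, 0.050, 0.075, 0.0805, 0.100]  # what the dial showed

worst = max(abs(r - k) for r, k in zip(readings, known))
print(f"worst-case error over the tested range: {worst:.4f} in")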
 
A good set of Japanese, Swiss, or English calipers, be they vernier, dial, or digital, is only ever good for +/- .002". If you need to measure something to a higher degree of accuracy than that, you need a micrometer, which is accurate to +/- .0002".
All that being said, the same tool in a different individual's hands will yield a different measurement.
If you have taken measurements of knowns and found your caliper to display said sizes, you'll be good. Make a mental note of the amount of force required to yield those measurements, and be sure to use the same force when measuring unknowns.
 
It is good to go. Calipers are more accurate than the people who use them. The cheap ones aren't always spot on, so just adjust with a large hammer, put it in the garbage and get another.
 
I've been pondering the same question myself. I ended up buying a second caliper just to make sure. Unfortunately, I'll rarely get the same measurement twice, which can make it hard to know if I have to trim a case or not.
 
Typical accuracy for a 6" caliper is +/- .001", for a total error band of .002". Micrometers are typically +/- .0001", for a total error band of .0002". This is as advertised by "name brand" manufacturers like Starrett or Mitutoyo.

While good tools are nice to have, I find a cheap dial caliper from Princess Auto to be satisfactory for my reloading needs. Using something of known size to calibrate it (e.g. feeler gauge, bullet) can be helpful.
 
While the manufacturers may not back their products beyond a certain accuracy, depending on the product, the weakest link is generally the guy running the tool.

Qualify your own readings on a known dimension until you are happy with the accuracy and precision (two different things!). Precision = sameness (getting the same reading every time); accuracy = exactitude of dimension (how close that reading is to the true size). Bearing shells (ball bearing housings) are cheap and wonderfully accurate devices (<- hint) to check repeatability against.
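To make that distinction concrete, here's a minimal sketch (invented readings) of checking both against a bearing shell of known size:

Code:
import statistics

# Sketch with made-up numbers: repeated caliper readings of a bearing
# shell with a known 1.0000" outside diameter.
true_size = 1.0000
readings = [1.0005, 1.0004, 1.0006, 1.0005, 1.0004,
            1.0006, 1.0005, 1.0004, 1.0005, 1.0006]

offset = statistics.mean(readings) - true_size  # accuracy: closeness to true size
spread = statistics.stdev(readings)             # precision: repeatability

print(f"offset from true size (accuracy): {offset:+.4f} in")
print(f"spread of readings (precision):   {spread:.5f} in")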

If you are not building parts for NASA by mail order, it really doesn't matter if your tools are reading off a bit, as long as they are reading the same amount off all the time.

If you pick up a bit of crap on the rack of a dial indicator, unless you are a thick f**k, you should be able to feel the lump as the pinion rolls over the bump.

Quoting silly random stuff from an exam question is not as relevant to real-life uses as the instructors would like to have you believe.

But, without having a system of reference dimensions and a means of qualifying the tools and the user's feel, all tools are suspect until proven otherwise, eh?

Short version, if it works, repeats for you, it's fine!

Cheers
Trev
 
I'm using a cheap dial caliper (for years) and was questioning its precision for reloading. So, with the tooling available, I measured the thickness of the leaves of a Starrett feeler gauge set from 0.001" to 0.030" in one-thou increments, and every one was spot on. I also measured 0.050", 0.075", 0.080" and 0.100" accurately.

Can I call it good to go?

Call it good to go from 0.0" up to 0.100" but not past. That's only a small percentage of the total range of your instrument.

For instance: 0.1" is only about 1.7% of the total range on a 6" caliper. You can't tell anything about the other 98.3% of the measurement range by testing only the bottom 1.7%.
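The arithmetic, as a trivial sketch:

Code:
# Sketch: how much of a 6" caliper's travel a 0.100" check actually covers.
tested, travel = 0.100, 6.0  # inches
covered = tested / travel
print(f"tested: {covered:.1%} of travel, untested: {1 - covered:.1%}")
# -> tested: 1.7% of travel, untested: 98.3%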
 
It's a caliper. Not gonna happen. Too much is influenced by variables out of the manufacturer's control, i.e. the human error factor.

They certify the gauge, not the person running it. We have calipers certified to 0.0005"

But that's getting into the expensive range.
 
Call it good to go from 0.0" up to 0.100" but not past. That's only a small percentage of the total range of your instrument.

For instance: 0.1" is only about 1.7% of the total range on a 6" caliper. You can't tell anything about the other 98.3% of the measurement range by testing only the bottom 1.7%.

^^This guy got the right answer. You've basically calibrated (or, more accurately, verified the calibration of) your instrument over a given range. Anything outside that range might be accurate or not. Basically, just because your caliper is accurate from 0 to 0.1" doesn't mean it's accurate measuring a length of 2.1" or 4.5". If you want to calibrate correctly, begin with 0.05" like you did, then double until you reach the end of your caliper (see the sketch after this list). So, for a 6" caliper:
0.05
0.1
0.2
0.4
0.8
1.6
3.2
6.0
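Here's that doubling scheme as a small sketch (the 6" range is just the example above):

Code:
# Sketch: generate doubling check points for a caliper, starting at 0.05"
# and finishing at the end of travel.
def check_points(start=0.05, travel=6.0):
    points, x = [], start
    while x < travel:
        points.append(round(x, 3))
        x *= 2
    points.append(travel)
    return points

print(check_points())  # [0.05, 0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.0]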
 
I've been using and trusting basic dial calipers for years. What I watch for is the zeroing. If it's out, I wipe the jaw faces clean, and 999 times out of a thousand that restores the zero. If it's out by much, I'll zero the dial and then check a diameter or two with a mic and then the calipers. I've also got 1 inch and 2 inch standards from my mic set which I'll check with the calipers.

Out of the half dozen sets of calipers I've got around the shops I've had two of them that got sloppy and were restored to accuracy by simply removing, cleaning and resetting the slide gib screws with a touch of blue thread locker for a slight drag and no play.

Part of this is having a consistent "touch" with how you handle the calipers. And caring for them so they don't get dropped on the floor or crushed under heavy items. But treated with respect, there's no reason at all that you can't get true results to within .001" once you've tested them against another more accurate tool like a micrometer.

So far I've yet to see any of my dial calipers be more than .0005" out when checked with those calibration standards from my mic set. And most of them are accurate to within a needle width.

That's me and my tools though.

For those like Sam that don't get the same reading twice, you may want to check the gib strip screws for play and adjust each in turn for a good fit that produces a slight drag. If the calipers won't give a consistent reading, it's likely that there's some play in one or both of the gib screws.

There's no reason why a set of calipers can't work consistently to at worst .001" unless they are the cheapest junk out there or you're not using them correctly.

You want a firm but light contact. And the calipers need to be square to the work in all respects. So you want to move the calipers around a little or move the piece around a little to find the lowest reading for external measurements or the largest reading for internal measurements. This eliminates any error due to taking a reading across the piece at an angle, which would produce a larger or smaller reading. That minimum reading for external and max for internal measurements is the proper reading. And normally you want to take the reading without removing the calipers.
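In other words, rock the tool and keep the extreme value. A sketch with invented readings:

Code:
# Sketch: several readings taken while rocking the caliper on the work.
# Numbers are invented for illustration.
external = [0.4535, 0.4520, 0.4512, 0.4518, 0.4526]  # across a pin
internal = [0.7480, 0.7495, 0.7502, 0.7491]          # inside a bore

print(f"external size: {min(external):.4f} in")  # lowest reading is the square one
print(f"internal size: {max(internal):.4f} in")  # highest reading is the square one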

So much like anything it's more a matter of HOW we use this tool than an issue with the tool itself.
 
As mentioned, you are likely good to go.

Back in the day, C-type micrometers with rotating drums for handles to adjust the spindle against the anvil came with little ratchet mechanisms on the tip. This supposedly stopped the units from being over-tensioned. Some ham-fisted fools just have to crank them as hard as they possibly can. When I was in charge of quality control we had a couple of people like that. They quite literally destroyed the units by distorting the threads.

Also, most precision measuring devices, even some of the cheap ones, came with a certified "standard" to set your devices to. Like any other mechanical device, these tools wear with use or something else goes awry. If your vernier is skipping a tooth or has done it at all, you're using too much force or it's been dropped.

For precise measuring, the tips of some tools are made with ground points.

The biggest issue when it comes to measuring anything is keeping the measuring faces square to whatever you're measuring. The unit will only measure the distance between its faces. If the unit is off square, that extra distance between the faces will be included in the readout.

One other thing with cheap precision measuring instruments, and even some high end units: they are mostly fine under an inch or a couple of centimeters; after that they start becoming erratic. I have 50 year old Starrett micrometers and verniers that are extremely accurate. Right in the lovely hardwood box they come in are 5 different "standards" to go along with each diameter they are designed to measure, in one inch increments from 0-1 up to 5-6 inches. The vernier isn't electronic, nor does it have a dial indicator. It also has a couple of standards to check accuracy at 1 in / 3 in / 6 in. At anything over an inch they guarantee +/- .001 inch at 70 degrees Fahrenheit. Yes, temperature makes a huge difference, especially at extremes and with larger devices.

For most reloading purposes, +/- .002" is fine. With the later measuring devices, especially dial indicators and the electronic units that take their readout from a strip along the beam, this now seems to be the accepted norm. If for some reason you require tighter precision, then be prepared to lay out a substantial amount of cash, and make darn sure you UNDERSTAND what you're looking at and that the device you want to purchase is capable of meeting your requirements.

As far as how they got away without precision measuring devices back in the day: each person or company had their devices made up individually at great expense. It wasn't that long ago that a tradesman with his own certified, accurate measuring devices and tools was paid more and was the last to be laid off. Precision tools were that valuable. Also, it wasn't that long ago that an accurate hunting rifle was considered acceptable if it shot 4 inch groups with factory ammo, and a very accurate rifle would shoot the same ammo into 2 inches on a repeatable basis. One inch rifles were considered very rare birds. I've heard all the stories about those rare birds, but believe me when I say they were few and far between. Not today. Today's offerings are the best ever for off-the-shelf, mass produced firearms and ammo.
 
They certify the gauge, not the person running it. We have calipers certified to 0.0005"

But that's getting into the expensive range.
Maybe you're confusing discrimination with accuracy.
Just because it reads to .0005" doesn't mean it's accurate to that level.
In fact, you generally need another order of magnitude of discrimination beyond what your accuracy needs to be.
So to be accurate to .0005", you'd need to display to .00005", and you're not getting that on a caliper.
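A hypothetical sketch of the difference: an instrument displaying 0.0005" steps but carrying a 0.0015" error just shows you the wrong number more finely.

Code:
# Sketch: fine display resolution does not imply matching accuracy.
# Hypothetical caliper that reads 0.0015" high, displayed in 0.0005" steps.
def displayed(true_value, offset=0.0015, step=0.0005):
    raw = true_value + offset        # instrument error
    return round(raw / step) * step  # quantized to display resolution

for size in (0.500, 1.000, 2.000):
    print(f"true {size:.4f}  displayed {displayed(size):.4f}")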
 