Caliper accuracy question

Today's offerings are the best ever for off the shelf, mass produced firearms and ammo.

Yep!

One of the reasons there were thousands of gunsmiths all over the place was that production guns were mostly crap! By today's production standards, in any case.

If you look into the manufacturing technology used at Winchester during, essentially, their 'glory days', most of the machinery was set up to make a single part, or even a single cut on a part, using a master part set onto the machine to set the depths of cut.

There were very few actual measuring tools in use on the production floor, as the operators were there to feed the machines and pull levers, not to make changes above their pay grade. A few more skilled workers did the set-ups.

Looking at the drawings that the tool room used to make their masters from, for the Winchester 1885 High Wall rifles, tolerances of plus or minus 5 thousandths were pretty common, as were tolerances of several degrees. This has been one of the eternal thorns in the arses of the guys trying to reverse engineer the parts by copying a single example, as they have to guess what the dimension was supposed to be, based on the likelihoods for its purpose and the dimensions of the finished part in hand.

In the case of measuring tools and repeatability, accuracy, and precision, there are several large and dry books about the subject. Foundations of Mechanical Accuracy, by Moore, is a pretty good one if you wish to read up on the stuff.
Testing any measuring tool only on full, even increments of its thread, rack, etc., opens you up to becoming a victim of cyclical errors that you believe you have avoided. What I mean is, if you only check every even inch, you do not know whether there is a fault with the rack or screw thread that manifests at a different portion of the reading, if that makes any sense. It gets covered a lot when discussing calibration and ensuring accuracy. An example would be one bent tooth on the pinion of a dial indicator: as the pointer goes around and around, there will be an area where it reads correctly, and a smaller area where it does not. The dimensions used to calibrate tools in a calibration center are chosen to search those errors out.
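A quick way to see why the choice of check points matters: the sketch below (Python, with made-up numbers) simulates a dial indicator whose pinion has one bad tooth, so the error repeats once per revolution. Checks at whole inches land at the same pinion phase every time and show nothing; points spread across a revolution expose the fault.

```python
import math

# Hypothetical indicator: one pinion revolution = 0.100" of travel.
# Work in integer ten-thousandths of an inch to avoid float surprises.
REV_TENTHS = 1000          # 0.100" expressed in 0.0001" units
ERR_AMPLITUDE = 0.0005     # worst-case error from the bent tooth (assumed)

def indicated_error(true_tenths):
    """Error added to the reading, as a function of pinion phase."""
    phase = (true_tenths % REV_TENTHS) / REV_TENTHS   # 0.0 .. 1.0 per rev
    return ERR_AMPLITUDE * math.sin(2 * math.pi * phase)

# Whole-inch checks (10000 = 1.000") always land at phase 0, so the
# fault looks like zero error at every check point.
for tenths in (10000, 20000, 30000):
    print(tenths / 10000, round(indicated_error(tenths), 5))

# Check points spread across the revolution catch it.
for tenths in (10125, 10250, 10375):   # 1.0125", 1.025", 1.0375"
    print(tenths / 10000, round(indicated_error(tenths), 5))
```

This is why calibration labs pick test dimensions that sample different phases of the screw or rack, not just the even graduations.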

Like I said before though, if you are not making parts for NASA and mailing them in to be measured there by tools other than the ones used for making them, you pretty much need to be reasonably sure that the parts you deal with are consistent and measure the same each time with the same tools. Having a few Standards around (accurately made items of known dimension) goes a long way towards that peace of mind. Measuring anything several times in a row, and coming up with the same measurement, is a pretty solid way to add some confidence to your repertoire. It gets you used to the feel needed to get the best out of what you are doing, and for most real-world applications of measuring tools in uses like reloading, the differences between very expensive tools and merely OK ones are largely academic.
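For the "measure it several times" habit, a few lines of Python make the idea concrete (the readings below are invented for illustration):

```python
import statistics

# Hypothetical repeated readings of the same 0.5000" gauge block, in inches.
readings = [0.5002, 0.5001, 0.5002, 0.5003, 0.5002]

spread = max(readings) - min(readings)   # total range of the readings
sigma = statistics.stdev(readings)       # sample standard deviation

print(f"range = {spread:.4f} in, stdev = {sigma:.5f} in")

# If the range sits comfortably inside the tolerance you care about
# (say, .001" for a reloading dimension), the tool-plus-operator
# combination is repeatable enough for that job.
```

Note this only demonstrates repeatability, not accuracy; comparing the mean against the block's certified size is what checks the latter.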

Cheers
Trev
 
Calipers are good enough for reloaders measuring COL. I have 4 sets of calipers. They are close enough for most of what I do, but if I really need to know a dimension within .001", then it's a micrometer every time.
 
Maybe you're confusing discrimination with accuracy.
Just because it reads to .0005", doesn't mean it's accurate to that level.
In fact, you generally need another order of magnitude of discrimination beyond what your accuracy needs to be.
So to be accurate to .0005", you'd need to display to .00005", and you're not getting that on a caliper.
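That rule of thumb (one extra order of magnitude of discrimination beyond the accuracy you want to claim) is just a division; a purely illustrative Python sketch:

```python
def required_display_resolution(target_accuracy, ratio=10):
    """Rule of thumb (assumed 10:1): the display should resolve about an
    order of magnitude finer than the accuracy you want to claim."""
    return target_accuracy / ratio

# To claim .0005" accuracy you'd want a .00005" display -- finer than
# any caliper offers.  A .0001" caliper display pairs with ~.001" claims.
print(required_display_resolution(0.0005))
print(required_display_resolution(0.001))
```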

Yes we do. Each level of certification is 4x more accurate than what it's certified against, as a NIST requirement. Our gauge blocks, for example, are verified to +/-0.00003".

Our measurements are for threads that have to be set up to within 0.0005" so our gauging has to be very specific.

The floor use calipers that just came in were $300-$600 each and those will be verified to 0.001" only.
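The 4x rule described above (each gauge verified against a reference at least four times more accurate, often called a 4:1 test accuracy ratio) forms a chain; the sketch below just echoes the tolerance from the post, and the level names are made up:

```python
# Each certification level must be at least 4x more accurate than the
# level it verifies -- a 4:1 test accuracy ratio.
levels = ["floor caliper", "working master", "reference standard"]
tolerance = 0.001                  # inches: the floor calipers' tolerance

for name in levels:
    print(f"{name}: +/-{tolerance:.6f} in")
    tolerance /= 4                 # the next level up must be 4x tighter
```

Two levels up from a .001" floor caliper, the reference already has to hold a few hundred-thousandths, which is gauge-block territory.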
 

You have calipers that display .00005"? Are you sure you counted the zeros correctly?
 
Yes. The masters are for verification. The working (floor) gauges are 4 digits.

You have a "master" caliper that is accurate to .00005"? What brand and model is that - I'd like to look up the specs, because I have trouble accepting that.
Perhaps you mean .0005", and not .00005" ?
 
The books, or "they", might say that this is the case. But when I've tested mine against my mic references, and on random items which I compare to actual micrometer readings, I've found that my own dial calipers are far better than +/-.002". For absolute measurements I still rely on a mic, but particularly for comparative measuring I'm more than happy with my calipers, even when working to within .0005" between one item and another. And whenever I've taken the time to compare readings from both mic and calipers, they have always agreed to within .001" at worst.
 
Even that isn't the .0005" you're claiming.

Yes I know. Those are the "working" gauges we give to the floor workers. I'll find out the info for the more accurate one when I'm back at work.

And I'm well aware of micrometers. We have ~30 calipers and ~150 micrometers. Some measurements work better, or quicker at the required accuracy, with a caliper than with a micrometer.
 

I went to a course put on by Starrett concerning their micrometers and the standards they come with. Their mandrels and anvils are set to be within .00001" when completely closed to the prescribed tension, and within the same tolerance through the one-inch travel of the mandrel. The same goes for each measurement in between. This can change if too much tension is applied or the temperatures are at extremes.

I prefer the micrometers that only have one inch of travel with a fixed anvil. For example, if you are measuring between 5 and 6 inches, the zero setting should be at 5 inches within .00001", and the maximum diameter/thickness at 6 inches within the same.

It's only when the anvils/mandrels are being changed out that discrepancies occur. I personally don't trust such setups, and even Starrett admits their tolerances can't be maintained with this system.

As far as verniers go: about 15 years ago I had the opportunity to test several different units. Some of them had dial indicators or digital readouts, and other, lower-priced units had linear scales that had to be matched up and read directly. Surprisingly, it was the latter tools, without the bells and whistles, that were the most accurate. The dial indicator/digital readout units all varied in their readings. They were perfectly acceptable for most work, but if tolerances are +/-.001" or less, they should be set aside and other measuring devices used.

There are some extremely accurate measuring instruments available, easily capable of measuring to one degree of light, which is one six-millionth of an inch. These are used to measure the faces of mechanical seals used in the stuffing boxes of some pumps. These seals have tolerances so tight that if you touch the mating faces with bare fingers before installation, the acidic sweat will cause them to fail prematurely and leak.
 
Wow. Spot on .308.

Problem is that's a bullet for 223.

What am I doing wrong?

[attached photo: caliper reading]
 
What are you doing wrong? My guess is reading the box incorrectly. Lol.

Pretty good odds! LOL!

For what the OP wanted to know, he has the tool that he needs. He will not get any more accuracy out of his ammo by buying any more expensive measuring equipment.

For the guy that wants to make bench rest grade swaged bullets, and wishes to know the difference in size between the base end of his die and the swell behind the ogive, not so much. Extra places after the decimal point tend to cost more money.

Different horses for different courses.

There are ways to ensure that you are getting an accurate reading, with measuring equipment that normally would not be expected to perform at the level sometimes required.
Mostly, that consists of training yourself to feel the correct amount of pressure to apply, in the correct manner, to get the best possible results, using reference Standards of a similar size to the work being measured.
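That comparative technique amounts to reading a known standard with the same tool and feel, taking the difference, and applying it to the work; a minimal sketch with invented numbers:

```python
# Comparative measurement: cancel out tool bias using a standard of known
# size, close to the size of the work.  All values here are illustrative.
standard_nominal = 0.5000     # certified size of the standard, inches
standard_reading = 0.5003     # what this caliper reads on that standard
bias = standard_reading - standard_nominal

part_reading = 0.3085         # raw caliper reading on the work
part_estimate = part_reading - bias   # correct the reading by the bias
print(f"estimated part size: {part_estimate:.4f} in")
```

The closer the standard is to the size of the work, the better this cancels the tool's error at that point of its travel, which is exactly why a standard of similar size is recommended above.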

Again, though, this is far more important for someone making stuff that will be tested for specs/tolerances in someone else's facility.
 
Thanks guys for all the info.

All my 308 bullets do measure 0.308" dead-on on that caliper. I'll keep going with this one then. I still have many reloading tools to buy and will concentrate my $$$ elsewhere.
 