Router Bit Accuracy
From contributor C:
You should pat your operator on the back. I have had new bits from Vortex and others that were as much as .015 undersize. Very few cutters measure dead on the callout; usually they are at least .001-.003 under.
From contributor L:
You have a good operator. After a bit is made, it must be sharpened, and sharpening reduces its diameter. The machinery manufacturers built the offset feature into these machines for exactly this reason.
From contributor W:
I am new in the CNC world. We just got our machine last week. When we set up our tooling in the tool catalogue, all of the new bits were off when measured with a digital mic.
From contributor M:
Typically endmills are precision ground to a unilateral tolerance (+.000/-.0005 to -.003). The primary reason for this in precision machining situations is that, in theory, there will always be material left on critical machined dimensions if no compensation is entered. While not as exacting, router bits are manufactured by a similar process. It has been my experience that router bits measure similarly (a few tenths to a few thousandths undersize at the cutting edge extremities). I machined metal for a long time (age check!) and I hope I do not offend anyone, but you're fooling yourself if you think you can accurately check a cutting edge with digital calipers. You can get close, but close is relative. In fact, even nice (Starrett, Brown & Sharpe, Mitutoyo) OD micrometers take practice to measure a cutting edge (I always kept a second set of micrometers just for measuring cutting tools so as not to damage my good mics). Even then, with 3 or 5 flute cutters, you need V-anvil mics to get a good measurement.
A page at centaurtools.com explains the DIN 6499 standard to which precision and high precision DIN 6499 collets are made. It is geared towards metal, but it illustrates that all collets are not created equal, in fact some manufacturers only guarantee .0005 - .001 runout at the face of the collet. If your tool is standing out 3" from the face, that could equal .0015 - .003 runout at the cutting edge. That's assuming everything is clean! If the .0015/.003 runout is all to one side, that is .003/.006 swath of cut. It can add up! If your spindle taper and holders are older, that can affect this also.
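The runout arithmetic in the post above can be sketched in a few lines of Python. The 3x growth factor and the doubling are assumptions drawn directly from the numbers quoted here, not a general rule; real runout depends on the collet, holder, and spindle and should be indicated directly at the cutting edge.

```python
# Sketch of the runout arithmetic above. The 3x growth factor comes
# from the example of a tool standing out 3" from the collet face.

def runout_at_edge(face_runout, growth_factor=3.0):
    """Runout at the cutting edge, assuming it grows roughly
    linearly with tool stickout from the collet face."""
    return face_runout * growth_factor

def extra_swath(edge_runout):
    """If the runout is all to one side, the cut path widens by
    twice the runout at the edge."""
    return 2.0 * edge_runout

for face in (0.0005, 0.001):
    edge = runout_at_edge(face)
    print(f'face {face:.4f}" -> edge {edge:.4f}" -> swath +{extra_swath(edge):.4f}"')
```

Run with the collet-face figures quoted above, this reproduces the .0015-.003 edge runout and .003-.006 wider swath the contributor describes.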
I would be curious for some of the tooling manufacturers to weigh in. I have never seen new tooling sold -.015 (1/64th) undersize, but that is not to say it has not happened!
From contributor L:
And that is why the right way to check is to measure the cut, the end result, and set your offsets from it.
From contributor S:
I know when we first got our machine, we were checking bits by hand also. As contributor M indicated, it is difficult to check bits by hand, especially with multiple operators throughout the shop. We use the machine itself to check. Anytime we change a bit, we run a program that cuts a six inch square with that bit while leaving about a quarter of an inch of material so the part doesn't move. We mic that square, divide the difference between the programmed and measured size by two, and that is our cutter comp number. It has been more consistent than hand checking for us.
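Contributor S's divide-by-two step can be sketched as a tiny calculation (the function name and the example numbers are mine, chosen to match the description):

```python
# Sketch of the test-cut method described above: program a square of
# known size, measure the finished part, and split the size error in
# half to get the per-side cutter comp number.

def cutter_comp(programmed, measured):
    """Half the difference between measured and programmed size;
    the error is shared by the two opposing sides of the square."""
    return (measured - programmed) / 2.0

# A 6.000" programmed square that measures 6.006" was cut .003" heavy
# per side, so the comp number is .003".
comp = cutter_comp(6.000, 6.006)
print(f'{comp:.4f}"')
```

The same split applies whether the error comes from an undersize bit, runout, or deflection, which is why measuring the finished cut captures more than a hand measurement of the bit can.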
From contributor N:
I agree with contributor S - this method is exactly what I have my CNC operators do. We also dial in the length of the tool by checking the depth of the cut. All bits must be dialed in, which you will find out when trying to fit things together during assembly. .015" might not seem like a lot, but if you are cutting a dado it might be the difference between parts fitting together or not.
From contributor O:
I believe the variation introduced with panel thickness is in most cases greater than new router bit diameter or length variation. Add to it variation caused by measuring technique and it is easy to see that there are more factors that influence total variation. However, the proof is in the pudding (cut depth does vary) and the fact that your operator (very good operator) is taking extra care to modify tool offset is very admirable.
From contributor J:
I think if you can maintain +/-.1mm with your router, then you are doing good when you consider machine accuracy and material expansion and shrinkage. But I have sometimes seen that much or more variation simply by changing cut direction. Most of our sharpened tools are marked with the diameter. Most of the time that's good enough.
From the original questioner:
These are some great replies. What we are seeing is that the bit is cutting a slot wider than the stated diameter and my thought is that it is caused by runout due to poorly maintained collets and tool holders.
From contributor V:
The wide cut could also be caused by tool deflection, if you are feeding too fast.
From contributor G:
All very good answers. As a manufacturer of a number of different cutting tools: do cutting diameters ever come in exact (PCD tools aside)? Let's talk about solid carbide. When we get our raw material, it is oversized, with a dull gray appearance. It is put into a rotary grinder to ensure it is round, which takes the shank to .500 (+0/-.001). If it were oversized, it wouldn't fit into the collet. It is then put into a 5-7 axis CNC grinder and the cutting edges are produced. The cutting diameter has to be smaller than the shank; if it were the same size, you wouldn't have a cutting edge. Tolerance on the diameter is -.002 to -.003, so in reality the .500 tool is about .498. If a tool comes in smaller than that, QC missed a step.
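Contributor G's numbers amount to a simple acceptance band. A hypothetical incoming-inspection check (the constants, names, and sample measurements are mine, based only on the tolerances quoted above) might look like:

```python
# Hypothetical QC check based on the post above: a nominal .500" tool
# is expected to measure .002" to .003" under at the cutting edge.

NOMINAL = 0.500
UNDER_MIN = 0.002   # least expected undersize
UNDER_MAX = 0.003   # most expected undersize

def within_spec(measured):
    """True if the measured cutting diameter falls inside the band
    the grinder is expected to hold."""
    return NOMINAL - UNDER_MAX <= measured <= NOMINAL - UNDER_MIN

print(within_spec(0.4975))  # inside the expected band
print(within_spec(0.485))   # .015" under: QC missed a step
```

A tool that fails such a check is the "setup piece or overlooked by the operator" case contributor U describes below.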
From contributor U:
When spiral or other solid carbide tooling is made, most companies start with rod that is +.0000 to -.0005. We try to hold .001 to .003, but sometimes it does get to .005. If you have some that are more than that, it may have been a setup piece or just overlooked by the operator. You will also get somewhat different numbers on the diameter when it is checked by hand with a mic. Note that if it is a single flute tool, you cannot check it in this manner at all.