Does anyone have a standard definition of the phrase "significantly out of tolerance"? If so, where is it documented? It appears to me that different entities define this in somewhat different ways. My interest in this centers around a policy at my company that requires an investigation (& possible corrective action) anytime a device fails calibration.
For instance, a gage with a tolerance of ±1% will require the same level of investigation with an error of 1.05% as it would with an error of 10%. In reality, investigations have shown that errors of 1.05% have no bearing on OUR process or product quality, while errors of 10% certainly require containment of product. There is no question that a gage that is 0.05% beyond its tolerance limit should be adjusted and the calibration interval shortened. There is also no question that the 1.05% error is out of tolerance, just not "significantly" so.

The question is: at what level would the gage be considered "significantly out of tolerance"? I want to propose a revision to our procedure, based upon good measurement science, that would define levels of out-of-tolerance conditions and appropriate reactions. It would seem that defining such levels would allow us to focus on real issues when devices are found out of tolerance. THANKS!
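For what it's worth, one way to sketch the kind of tiered procedure I have in mind is to classify each calibration result by the ratio of observed error to the tolerance limit. The cutoffs below are purely hypothetical placeholders for illustration, not a documented standard; any real values would have to come from a site's own measurement data and risk analysis:

```python
# Illustrative sketch of a tiered out-of-tolerance (OOT) classification.
# The ratio cutoffs (1.0 and 2.0) are hypothetical placeholders only --
# they are NOT drawn from any standard and would need to be justified
# from process/product risk data.

def classify_oot(error_pct: float, tolerance_pct: float) -> str:
    """Classify a calibration result by |error| relative to the tolerance limit."""
    ratio = abs(error_pct) / tolerance_pct
    if ratio <= 1.0:
        return "in tolerance"       # no action beyond routine recalibration
    elif ratio <= 2.0:              # hypothetical cutoff between tiers
        return "marginally OOT"     # e.g. adjust gage, shorten interval
    else:
        return "significantly OOT"  # e.g. full investigation + product containment

# The two cases from the post: a gage with a +/-1% tolerance
print(classify_oot(1.05, 1.0))   # marginally OOT
print(classify_oot(10.0, 1.0))   # significantly OOT
```

Something this simple at least separates the "adjust and shorten the interval" cases from the "contain product" cases, which is the distinction our current procedure fails to make.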