
How to Correctly Perform a System Accuracy Test and Stay Compliant

28 August 2020

Despite the challenges that the past several months have thrown at industries around the world, maintaining the accuracy of heat treatment equipment remains a key responsibility for quality and maintenance managers everywhere.

And it has been challenging. Many of the contractors that some businesses depend on to perform their system accuracy tests (SATs) are operating at reduced capacity. On-site teams have faced similar difficulties as various lockdowns and remote-working guidelines disrupt standard procedure. In some cases, plants may find themselves without the personnel needed to carry out these essential tests, putting compliance at risk.

But for all the added challenges 2020 has heaped on manufacturing and production, the issues around the correct performance of SATs have existed for much longer. What do these common pitfalls look like, and how can the managers responsible avoid them to ensure their equipment is calibrated as accurately as possible within the permitted tolerances?

System Accuracy Test (SAT) definition

Companies that have not been carrying out SATs thus far should read on for an overview of what a SAT is and why they are important. Otherwise, scroll down to ‘Common pitfalls when conducting SATs’.

Known elsewhere as in-situ calibration, a System Accuracy Test is generally performed as a system calibration in which reference sensors are temporarily placed next to the sensors to be calibrated. The equipment is then operated at its normal working temperature(s).

The measurements from the reference sensors are recorded alongside the measurements from the equipment’s instrument display, so the comparison captures any errors within the sensor, leadwires, connections and instrument. This is particularly important for detecting drift in thermocouples, for example.

These calibrations depend on the use of completely independent reference sensors, as well as the stability of the equipment itself. Occasionally they are performed at a single point, perhaps in the centre of the workspace, to calibrate the temperature of the space rather than the sensor specifically. They may be required as often as weekly, and yet this calibration is often overlooked entirely because its importance is not understood.
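The pass/fail arithmetic behind the comparison described above can be sketched in a few lines. The function names, the correction terms and the ±2.2 °C tolerance below are illustrative assumptions for this sketch, not values prescribed by any particular standard or equipment class:

```python
# Minimal sketch of the arithmetic behind an SAT reading comparison.
# The 2.2 degC tolerance is an illustrative value only; real tolerances
# depend on the applicable standard and equipment classification.

def sat_difference(instrument_reading, reference_reading,
                   instrument_correction=0.0, reference_correction=0.0):
    """Corrected instrument value minus corrected reference value (degC)."""
    return ((instrument_reading + instrument_correction)
            - (reference_reading + reference_correction))

def in_tolerance(difference, tolerance=2.2):
    """True if the SAT difference lies within the permitted tolerance."""
    return abs(difference) <= tolerance

diff = sat_difference(instrument_reading=852.1, reference_reading=850.4)
print(round(diff, 1), in_tolerance(diff))  # 1.7 True
```

The point of the sketch is simply that an SAT is a difference measurement: every element between the sensor tip and the display contributes to that difference, which is why the reference chain must be independent of the system under test.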

Many companies that do carry out SATs seek shortcuts around these complex, often fiddly tests. Unfortunately, such shortcuts lead to failures in accuracy and results. Here we look at three common traps companies fall into when conducting SATs and how to avoid them.

Common pitfalls when conducting SATs

  1. Not conducting a truly independent test. This is commonly seen when the technician attaches their instrument via the sockets inside the machine being tested. What they don’t realise is that the thermocouple wire and even the sockets can be a source of error. Without understanding the effect of using the machine’s sockets and pass-throughs, a technician can easily make measurements that appear in tolerance when they are not, or stop the machine for a non-conformance when it is actually in tolerance. Standards ask for independent measurements for good reason: only a truly independent system produces results that can be relied upon, yet technicians often fail to follow the standard rigorously on this point.

  2. Seeking to get around this by first determining the accuracy of the socket using a calibrator. Many technicians recognise the failings of conducting the test through a part of the system that could contribute to the error, but rather than using an independent system, they think they can get around it by plugging their calibrator into the socket to check its accuracy. In that setup, all of the voltage is generated in the calibration device, not in the wire. The wire remains cold, so injecting a signal into it cannot simulate the voltage generated along the wire when the machine is heated. The technician assumes the system is calibrated correctly, when in fact they are ‘flying blind’.

  3. Technicians who get failures but don’t understand why. This is a commonplace issue, particularly when the people conducting the SAT know the steps they need to follow but not the science behind them. Adjustment and recalibration may be achieved through a correction applied to the instrument reading or by replacement of the sensor, but because the technician doesn’t understand how a thermocouple works, for example, they might replace the wrong part of the system or fail to replace all the parts that need adjusting.
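The thermocouple physics behind pitfalls 2 and 3 can be sketched with a deliberately simplified linear model. The constant Seebeck coefficient below is a rough approximation (real thermocouple output is non-linear and tabulated in reference standards); it is used here only to show that the EMF arises from the temperature difference spanned by the wire, which is exactly what a cold, signal-injected wire never exercises:

```python
# Rough linearised model of thermocouple EMF. Type K output is roughly
# 41 uV per degC; treating it as constant is an approximation for
# illustration, not how real conversions are done.

SEEBECK_UV_PER_C = 41.0

def emf_microvolts(hot_junction_c, cold_junction_c):
    """Approximate EMF (in microvolts) for a wire spanning two temperatures."""
    return SEEBECK_UV_PER_C * (hot_junction_c - cold_junction_c)

# A wire sitting entirely at room temperature generates no EMF of its
# own, so injecting a calibrator signal at the socket tells you nothing
# about the wire's behaviour in service:
print(emf_microvolts(25.0, 25.0))   # 0.0
print(emf_microvolts(850.0, 25.0))  # 33825.0
```

This is why checking the socket with a calibrator cannot stand in for an independent reference sensor: the error contribution of the heated wire simply never appears in that test.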

The risks associated with non-compliant SATs

In each case, further testing will be needed to determine the source of error and ensure the right corrective action is taken. Not doing so risks the continued production of products that may not be compliant with industry standards, potentially for months on end. From a commercial perspective, this can lead to product recalls, incurring substantial administrative effort and cost. From a human perspective, it is a significant health and safety risk.

With the current climate putting greater stress on many companies’ ability to perform these tests correctly, and with the compliance of their operations at stake, what options can companies consider to get their SATs right and maximise the accuracy of the results?

How to perform a System Accuracy Test and stay compliant with VFE

If accountable managers recognise any of the pitfalls and challenges referenced above, or if they need help interpreting results and taking corrective action, one of the quickest and simplest solutions is to reach out to us at Vacuum Furnace Engineering (VFE) for support.

At VFE, we have over 20 years of experience helping our customers to rectify non-conformities in SATs, and in that time we have found many ways to help companies perform these tests correctly. In half a day, our engineers can determine whether a system really is at fault and diagnose where the errors are coming from. They can estimate the uncertainty of measurement so that the risks of false accept/reject can be quantified and, if necessary, carry out adjustments there and then to maximise the accuracy of a company’s systems and bring them back within compliance.
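To illustrate what quantifying false accept/reject risk can look like in practice, here is a simplified guard-banding sketch in which the acceptance zone is shrunk by the expanded measurement uncertainty. The function name and all of the numbers are assumptions for illustration, not VFE’s actual method or any standard’s decision rule:

```python
# Simplified guard-band check: a reading only passes if its SAT
# difference is within the tolerance minus the expanded uncertainty,
# reducing the chance of falsely accepting a borderline system.
# All values are illustrative.

def guarded_accept(difference, tolerance, expanded_uncertainty):
    """Conservative accept decision using a guard band (all in degC)."""
    guard_band = tolerance - expanded_uncertainty
    return abs(difference) <= guard_band

# A 2.0 degC difference against a +/-2.2 degC tolerance passes a naive
# check, but fails once a 0.5 degC expanded uncertainty is considered:
print(guarded_accept(2.0, tolerance=2.2, expanded_uncertainty=0.5))  # False
print(guarded_accept(1.5, tolerance=2.2, expanded_uncertainty=0.5))  # True
```

The design point is that an accept decision is only as trustworthy as the uncertainty of the measurement behind it, which is why estimating that uncertainty is part of a properly conducted SAT.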

The global climate continues to test industries everywhere, but with an increased awareness of the pitfalls surrounding SATs, quality and maintenance managers can minimise the risks associated with non-compliance and have confidence in the integrity of their production line.

For more information about our Calibration Services or to request a System Accuracy Test, click the button below to get in touch.

Request a System Accuracy Test