Supporting accuracy supports the bottom line

Published: 04 June 2015

Trevor Dunger, product specialist for pressure and level for ABB’s UK Measurement Products business, explains the growing importance of calibration in promoting good practice and profitability.

We all know that you can’t manage what you can’t measure, so maintaining an accurate monitoring and control regime is essential. That’s just one reason why it’s the right time to think about how well your calibration regime is supporting your business.

Correct instrument calibration is vital to ensuring accurate measurement performance.

Instruments such as pressure and temperature sensors and transmitters will have been calibrated when they were first manufactured to check their performance under a known set of operating conditions.

Although this calibration will be valid when the instrument is first installed, it cannot be assumed that it will remain valid throughout its operational life. Wear and tear, vibration, ambient temperatures and exposure to the elements can all cause an instrument’s performance to stray from its original calibrated values.

Arduous process conditions in particular will cause instruments to drift, such that a failure to routinely take a device out of service and calibrate it could lead to measurement error. Drift is also more prevalent in older instruments than in the latest generation, which feature improved electronics with built-in self-checking routines and a more robust mechanical design. Nothing in life is constant, and this is equally true of electrical components, which can undergo small chemical and/or physical changes over time, resulting in unavoidable long-term drift.

The calibration of an instrument can also be compromised as soon as it is installed. In most cases, installers will calibrate a device to the installation using their own test equipment, thereby overwriting the original factory calibration. The resulting new calibration will only be as good as the equipment used to perform it, which may itself not be properly calibrated.

It is important to be aware that any of these factors could undermine a manufacturer’s guidelines on how frequently its instruments should be calibrated. Even where a manufacturer recommends a longer period between calibration checks, the characteristics of the installation can affect the performance of the transmitter and/or primary sensing element. In such applications, more frequent calibrations, or at least inspections, may be necessary.

Technological progress

Improving technology is also driving calibration standards: as the measurement accuracy of instruments improves, so does the accuracy demanded of the equipment used to calibrate them. IEC 61298 states that the measuring equipment used to calibrate process measurement and control devices should have a measurement uncertainty four times better than that of the device being calibrated. This can be difficult to achieve, as the instrument being calibrated may be almost as accurate as the working standard itself. There are strategies that calibration specialists can use to get around the problem, but end users need to be clear about how they go about it.
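As a rough illustration of that 4:1 requirement, the short sketch below checks whether a working standard is accurate enough to calibrate a given transmitter. The percentage figures are hypothetical and are used purely for illustration.

```python
# Minimal sketch of the 4:1 test uncertainty ratio described in IEC 61298.
# All figures are hypothetical, expressed as a percentage of span.

device_accuracy = 0.10        # quoted accuracy of the transmitter under test
standard_uncertainty = 0.025  # uncertainty of the working standard

ratio = device_accuracy / standard_uncertainty
print(f"Test uncertainty ratio: {ratio:.1f}:1")

if ratio >= 4:
    print("Working standard meets the 4:1 guideline")
else:
    print("Working standard is not accurate enough for this transmitter")
```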

The result is that calibration can be a difficult and expensive challenge. The annual cost of supporting some plant-based instrumentation can reach 10 percent of its original purchase price. Nevertheless, calibration is essential, so the key thing is to get the right regime in place to ensure that the investment in testing is not wasted.

When should you calibrate?

The right time to calibrate a specific piece of equipment will depend on the type of instrument and the job it is doing. There are, however, some common situations that typically call for calibration, such as:

• when an instrument is new

• at time intervals specified by the manufacturer or regulator

• after a specified number of operating hours or usage cycles

• when an instrument has experienced a physical shock or vibration that could put it out of calibration

• whenever the output appears doubtful

The frequency with which you check your pressure transmitter calibration will also depend on how critical it is to the process. Section 7.6 of ISO 9001:2008 requires that measurement instruments are maintained and calibrated on a regular basis, with the frequency being dictated by the specific requirements and demands of the application.

If high performance and accuracy are crucial to production or to health and safety, then the transmitter should be checked regularly. Some applications also have a financial implication, for instance where flow rates in the petrochemical industry are measured for fiscal purposes.

In safety-critical applications, companies will need to have their pressure transmitters proof tested, with the exact frequency determined by the target reliability required. This may be every 12 months in a SIL1 application, for example, or every three months in a SIL3 application.

Look beyond the hype

It is important to remember that the figures quoted in a manufacturer’s specifications may be based on a specific set of conditions for temperature and pressure that bear little resemblance to real plant conditions.

Depending on the application, the accuracy needed, the ambient conditions and a range of other factors, the actual calibration frequency of a device may differ markedly from that quoted by its maker. The user will only know if he takes his particular conditions into account and calculates the calibration frequency himself.

Finding the real calibration frequency

The calibration frequency of any pressure transmitter depends on three things: the application of the device, the performance the user needs from it and the inherent operating conditions.

When calculating calibration frequency, the following five-stage process should be followed:

1. Determine the performance required for the application – is it a safety-critical application requiring high accuracy or a more straightforward application where accuracy is less important?

2. Determine the operating conditions – operating conditions such as static pressure and ambient temperature can have an impact on transmitter performance, resulting in potential errors, and need to be factored in

3. Calculate the Total Probable Error (TPE) or Total Performance – this is determined by a formula that combines the device’s quoted base accuracy with the likely effects of static pressure and ambient temperature on measurement performance

4. Determine the stability for a month – calculating the stability on a monthly basis will provide a benchmark for measuring ongoing performance

5. Calculate the calibration frequency – using the results of steps 1 to 4, the calibration frequency can then be calculated by subtracting the Total Probable Error from the desired performance and dividing the result by the stability per month (a worked sketch follows below).

The resulting figure from this calculation can then be used to set the frequency with which the calibration needs to be checked in order to achieve the desired accuracy.
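To make the arithmetic concrete, the sketch below walks through steps 3 to 5 with hypothetical figures. It assumes the commonly used root-sum-square form of the Total Probable Error calculation; the exact formula and the individual error figures should always be taken from the transmitter manufacturer’s own data.

```python
import math

# Sketch of steps 3 to 5 of the calibration-frequency calculation.
# All figures are hypothetical and expressed as a percentage of span.

base_accuracy = 0.04           # quoted base accuracy of the transmitter
temperature_effect = 0.08      # error caused by ambient temperature at the installation
static_pressure_effect = 0.05  # error caused by line (static) pressure

# Step 3: Total Probable Error, here assumed to combine the three error
# sources as a root sum of squares
tpe = math.sqrt(base_accuracy**2 + temperature_effect**2 + static_pressure_effect**2)

# Step 4: stability (drift) per month, taken from the manufacturer's data
stability_per_month = 0.01

# Step 1: the performance the application actually requires
required_performance = 0.15

# Step 5: calibration interval = (required performance - TPE) / stability per month
calibration_interval_months = (required_performance - tpe) / stability_per_month

print(f"Total Probable Error: {tpe:.3f} % of span")
print(f"Check calibration roughly every {calibration_interval_months:.1f} months")
```

With these example figures the transmitter would need checking roughly every four to five months to stay within the required performance; better stability, or a less demanding performance target, would lengthen that interval.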

Asking the right questions

To determine the calibration frequency, the user must ensure he gets certain information from the pressure transmitter’s manufacturer. This includes temperature error, stability (drift), static pressure error and base accuracy.

Above all, a customer must not rely solely on the assurances of a manufacturer, as different models in a manufacturer’s range will have differing performance. Making a purchasing decision based on ‘headline’ statements in literature can result in over-specified instrumentation or a poorly performing pressure transmitter, which can adversely affect process control, resulting in poor product quality and increased waste. Checking pressure transmitter calibration at the right intervals will avoid these problems, while keeping the cost of ownership to a minimum.

Who can help?

Calibration is all about standards, so it’s advisable to stick to using accredited calibration services. Suppliers and laboratories offering calibration services in Britain are accredited by The United Kingdom Accreditation Service (UKAS). UKAS is the sole national accreditation body recognised by government to assess, against internationally agreed standards, organisations that provide certification, testing, inspection and calibration services.

For further information please visit: www.abb.com
