CALIBRATION PRINCIPLES
After completing this chapter, you should be able to:
• Define key terms relating to calibration and interpret the meaning of each.
• Understand traceability requirements and how they are maintained.
• Describe characteristics of a good control system technician.
• Describe differences between bench calibration and field calibration. List the advantages and disadvantages of each.
• Describe the differences between loop calibration and individual instrument calibration. List the advantages and disadvantages of each.
• List the advantages and disadvantages of classifying instruments according to process importance—for example, critical, non-critical, reference only, OSHA, EPA, etc.
1.1 WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods.
According to ISA’s The Automation, Systems, and Instrumentation Dictionary,
the word calibration is defined as “a test during which known values of
measurand are applied to the transducer and corresponding output
readings are recorded under specified conditions.” The definition includes
the capability to adjust the instrument to zero and to set the desired span.
An interpretation of the definition would say that a calibration is a
comparison of measuring equipment against a standard instrument of
higher accuracy to detect, correlate, adjust, rectify and document the
accuracy of the instrument being compared.
Typically, calibration of an instrument is checked at several points
throughout the calibration range of the instrument. The calibration range is
defined as “the region between the limits within which a quantity is
measured, received or transmitted, expressed by stating the lower and
upper range values.” The limits are defined by the zero and span values.
The zero value is the lower end of the range. Span is defined as the
algebraic difference between the upper and lower range values. The
calibration range may differ from the instrument range, which refers to the
capability of the instrument. For example, an electronic pressure
transmitter may have a nameplate instrument range of 0–750 pounds per
square inch, gauge (psig) and output of 4-to-20 milliamps (mA). However,
the engineer has determined the instrument will be calibrated for 0-to-300
psig = 4-to-20 mA. Therefore, the calibration range would be specified as
0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig
and zero output value is 4 mA. The input span is 300 psig and the output
span is 16 mA.
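To make the zero/span arithmetic concrete, here is a minimal Python sketch of the linear input-to-output relationship described above. The function name and its defaults are illustrative only, using the 0-to-300 psig = 4-to-20 mA example:

```python
def expected_output(psig, in_zero=0.0, in_span=300.0, out_zero=4.0, out_span=16.0):
    """Expected output (mA) for a linear transmitter calibrated
    0-to-300 psig = 4-to-20 mA."""
    fraction = (psig - in_zero) / in_span     # fraction of the input span
    return out_zero + fraction * out_span     # same fraction of the output span

print(expected_output(0))      # 4.0 mA  (zero output value)
print(expected_output(150))    # 12.0 mA (mid-span)
print(expected_output(300))    # 20.0 mA (upper range value)
```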
Different terms may be used at your facility. Just be careful not to
confuse the range the instrument is capable of with the range for which
the instrument has been calibrated.
1.2 WHAT ARE THE CHARACTERISTICS OF A
CALIBRATION?
Calibration Tolerance: Every calibration should be performed to a specified
tolerance. The terms tolerance and accuracy are often used incorrectly. In
ISA’s The Automation, Systems, and Instrumentation Dictionary, the
definitions for each are as follows:
Accuracy: The ratio of the error to the full scale output or the ratio of the
error to the output, expressed in percent span or percent reading,
respectively.
Tolerance: Permissible deviation from a specified value; may be expressed
in measurement units, percent of span, or percent of reading.
As you can see from the definitions, there are subtle differences
between the terms. It is recommended that the tolerance, specified in
measurement units, be used for the calibration requirements performed at
your facility. By specifying an actual value, mistakes caused by calculating
percentages of span or reading are eliminated. Also, tolerances should be
specified in the units measured for the calibration.
For example, you are assigned to perform the calibration of the previously mentioned 0-to-300 psig pressure transmitter with a specified calibration tolerance of ±2 psig. The output tolerance would be:

±2 psig ÷ 300 psig × 16 mA = ±0.1067 mA

The calculated tolerance is rounded down to ±0.10 mA, because rounding up to ±0.11 mA would exceed the calculated tolerance. It is recommended that both the ±2 psig and ±0.10 mA tolerances appear on the calibration data sheet if the remote indications and output milliamp signal are recorded.
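The same proportionality converts an input tolerance to an output tolerance. A minimal sketch, assuming the values from the example above:

```python
import math

input_tolerance = 2.0     # ± psig, specified calibration tolerance
input_span = 300.0        # psig (0 to 300)
output_span = 16.0        # mA  (4 to 20)

output_tolerance = input_tolerance / input_span * output_span
print(output_tolerance)   # 0.10666... mA

# Round DOWN so the documented tolerance never exceeds the calculated one.
print(math.floor(output_tolerance * 100) / 100)   # 0.1 mA
```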
Note that the manufacturer’s specified accuracy for this instrument may
be ±0.25% full scale (FS). Calibration tolerances should not be assigned
based on the manufacturer’s specification only. Calibration tolerances
should be determined from a combination of factors. These factors
include:
• Requirements of the process
• Capability of available test equipment
• Consistency with similar instruments at your facility
• Manufacturer’s specified tolerance
Example: The process requires ±5°C; available test equipment is capable
of ±0.25°C; and manufacturer’s stated accuracy is ±0.25°C. The specified
calibration tolerance must be between the process requirement and
manufacturer’s specified tolerance. Additionally, the test equipment must
be capable of the tolerance needed. A calibration tolerance of ±1°C might
be assigned for consistency with similar instruments and to meet the
recommended accuracy ratio of 4:1.
Accuracy Ratio: This term was used in the past to describe the relationship
between the accuracy of the test standard and the accuracy of the
instrument under test. The term is still used by those who do not
understand uncertainty calculations (uncertainty is described below). A
good rule of thumb is to ensure an accuracy ratio of 4:1 when performing
calibrations. This means the instrument or standard used should be four
times more accurate than the instrument being checked. Therefore, the test
equipment (such as a field standard) used to calibrate the process
instrument should be four times more accurate than the process
instrument, the laboratory standard used to calibrate the field standard
should be four times more accurate than the field standard, and so on.
With today's technology, an accuracy ratio of 4:1 is becoming more
difficult to achieve. Why is a 4:1 ratio recommended? Ensuring a 4:1 ratio
will minimize the effect of the accuracy of the standard on the overall
calibration accuracy. If a higher level standard is found to be out of
tolerance by a factor of two, for example, the calibrations performed using
that standard are less likely to be compromised.
Suppose, continuing our previous example, the test equipment with a
tolerance of ±0.25°C is found to be 0.5°C out of tolerance during a
scheduled calibration. Since we took into consideration an accuracy ratio
of 4:1 and assigned a calibration tolerance of ±1°C to the process
instrument, it is less likely that our calibration performed using that
standard is compromised.
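In code, the 4:1 check is a one-line comparison. The helper below is a hypothetical illustration using the temperature tolerances from this example:

```python
def meets_accuracy_ratio(dut_tolerance, standard_tolerance, required=4.0):
    """True if the instrument-under-test tolerance is at least `required`
    times wider than the tolerance of the standard used to check it."""
    return dut_tolerance / standard_tolerance >= required

print(meets_accuracy_ratio(1.0, 0.25))   # True  -> 4:1, as assigned above
print(meets_accuracy_ratio(0.25, 0.25))  # False -> 1:1, no assurance margin
print(meets_accuracy_ratio(1.0, 1.0))    # False -> standard is too coarse
```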
The out-of-tolerance standard still needs to be investigated by reverse
traceability of all calibrations performed using the test standard.
However, our assurance is high that the process instrument is within
tolerance. If we had arbitrarily assigned a calibration tolerance of ±0.25°C
to the process instrument, or used test equipment with a calibration
tolerance of ±1°C, we would not have the assurance that our process
instrument is within calibration tolerance. This leads us to traceability.
Traceability: All calibrations should be traceable to a nationally
or internationally recognized standard. For example, in the United States,
the National Institute of Standards and Technology (NIST), formerly
National Bureau of Standards (NBS), maintains the nationally recognized
standards. Traceability is defined by ANSI/NCSL Z540-1-1994 (which
replaced MIL-STD-45662A) as “the property of a result of a measurement
whereby it can be related to appropriate standards, generally national or
international standards, through an unbroken chain of comparisons.”
Note this does not mean a calibration shop needs to have its standards
calibrated with a primary standard. It means that the calibrations
performed are traceable to NIST through all the standards used to
calibrate the standards, no matter how many levels exist between the shop
and NIST.
Traceability is accomplished by ensuring the test standards we use
are routinely calibrated by “higher level” reference standards. Typically,
the standards we use in the shop are sent out periodically to a
standards lab that has more accurate test equipment. The standards
from the calibration lab are periodically checked for calibration by “higher
level” standards, and so on until eventually the standards are tested
against Primary Standards maintained by NIST or another internationally
recognized standard.
The calibration technician’s role in maintaining traceability is to
ensure the test standard is within its calibration interval and the unique
identifier is recorded on the applicable calibration data sheet when the
instrument calibration is performed. Additionally, when test standards
are calibrated, the calibration documentation must be reviewed for
accuracy and to ensure it was performed using NIST traceable equipment.
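A scheduling system can enforce the first part of that role automatically. The sketch below refuses a standard whose calibration interval has lapsed; the record layout and identifier are hypothetical:

```python
from datetime import date

def standard_usable(cal_due, today=None):
    """A test standard may be used only while it is inside its
    calibration interval (i.e., not past its due date)."""
    return (today or date.today()) <= cal_due

# Hypothetical record; the unique identifier must be copied onto the
# calibration data sheet for every calibration this standard supports.
standard = {"id": "STD-0042", "cal_due": date(2026, 6, 30)}
if standard_usable(standard["cal_due"]):
    print(f"OK to use {standard['id']}; record this ID on the data sheet")
else:
    print(f"{standard['id']} is past its calibration due date; do not use")
```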
FIGURE 1-1. Traceability Pyramid (from top to bottom: National Measurement Standard, e.g., NIST; Primary Standards; Secondary Standards; Working Standards, i.e., “normal” shop instruments; Process Instrument)
Uncertainty: Parameter, associated with the result of a measurement, that
characterizes the dispersion of the values that could reasonably be
attributed to the measurand. Uncertainty analysis is required for
calibration labs conforming to ISO/IEC 17025 requirements. Uncertainty
analysis is performed to evaluate and identify factors associated with the
calibration equipment and process instrument that affect the calibration
accuracy. Calibration technicians should be aware of basic uncertainty
analysis factors, such as environmental effects and how to combine
multiple calibration equipment accuracies to arrive at a single calibration
equipment accuracy. Combining multiple calibration equipment or
process instrument accuracies is done by calculating the square root of the
sum of the squares, illustrated below:
Calibration equipment combined accuracy = √[(calibrator 1 error)² + (calibrator 2 error)² + (etc.)²]

Process instrument combined accuracy = √[(sensor error)² + (transmitter error)² + (indicator error)² + (etc.)²]
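A minimal Python sketch of this root-sum-square combination; the error values passed in are illustrative only:

```python
import math

def combined_accuracy(*errors):
    """Root-sum-square (RSS) combination of individual accuracies."""
    return math.sqrt(sum(e ** 2 for e in errors))

# Calibration equipment: e.g., two calibrators used in one test setup
print(combined_accuracy(0.05, 0.02))          # ~0.0539

# Process instrument loop: e.g., sensor + transmitter + indicator
print(combined_accuracy(0.10, 0.25, 0.50))    # ~0.5679
```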
1.3 WHY IS CALIBRATION REQUIRED?
It makes sense that calibration is required for a new instrument. We
want to make sure the instrument is providing accurate indication or
output signal when it is installed. But why can’t we just leave it alone as
long as the instrument is operating properly and continues to provide the
indication we expect?
Instrument error can occur due to a variety of factors: drift,
environment, electrical supply, addition of components to the output
loop, process changes, etc. Since a calibration is performed by comparing
or applying a known signal to the instrument under test, such errors are
detected. An error is the algebraic difference
between the indication and the actual value of the measured variable.
Typical errors that occur include:
FIGURE 1-2. Span Error (% output vs. % input)
FIGURE 1-3. Zero Error (% output vs. % input)
FIGURE 1-4. Combined Zero and Span Error (% output vs. % input)
FIGURE 1-5.
Linearization Error
Zero and span errors are corrected by performing a calibration. Most
instruments are provided with a means of adjusting the zero and span of
the instrument, along with instructions for performing this adjustment.
The zero adjustment is used to produce a parallel shift of the input-output
curve. The span adjustment is used to change the slope of the input-output
curve. Linearization error may be corrected if the instrument has a
linearization adjustment. If the magnitude of the nonlinear error is
unacceptable and it cannot be adjusted, the instrument must be replaced.
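The zero/span behavior is easy to model with a straight line: a zero error adds a constant offset at every point, while a span error changes the slope. A small illustrative sketch, with hypothetical parameter names:

```python
def indicated(pct_input, zero_error=0.0, span_gain=1.0):
    """Linear instrument model: the ideal output equals pct_input;
    zero error shifts the curve, span error changes its slope."""
    return zero_error + span_gain * pct_input

# Zero error: parallel shift, constant over the whole range
print(indicated(0, zero_error=2.0))     # 2.0   (should be 0)
print(indicated(100, zero_error=2.0))   # 102.0 (should be 100)

# Span error: zero is unaffected, error grows with input
print(indicated(0, span_gain=1.05))     # 0.0   (correct at zero)
print(indicated(100, span_gain=1.05))   # 105.0 (should be 100)
```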
To detect and correct instrument error, periodic calibrations are
performed. Even if a periodic calibration reveals the instrument is perfect
and no adjustment is required, we would not have known that unless we
performed the calibration. And even if adjustments are not required for
several consecutive calibrations, we will still perform the calibration check
at the next scheduled due date. Periodic calibrations to specified
tolerances using approved procedures are an important element of any
quality system.
1.4 WHO PERFORMS CALIBRATIONS? – THE
CONTROL SYSTEM TECHNICIAN
A control system technician (CST) is a skilled craftsperson who
knows pneumatic, mechanical, and electrical instrumentation. He or she
understands process control loops and process control systems, including
those that are computer-based. Typically, he or she has received training
in such specialized subjects as theory of control, analog and/or digital
electronics, microprocessors and/or computers, and the operation and
maintenance of particular lines of field instrumentation.
A CST performs calibration, documentation, loop checks,
troubleshooting, and repair or replacement of instrumentation. These
tasks relate to systems that measure and control level, temperature,
pressure, flow, force, power, position, motion, physical properties,
chemical composition and other process variables.
1.5 CHARACTERISTICS OF A CONTROL SYSTEM
TECHNICIAN
Honesty and Integrity: A CST must possess honesty and integrity above all
else. Most technicians work independently much of the time. Calibrations
must be performed in accordance with procedures and must be properly
documented. Additionally, the calibration department may be
understaffed and production schedules may demand unrealistic
completion requirements. These factors can have a real impact on proper
performance and documentation of calibrations. Remember: Nobody can
take away your integrity; only you can give it away.
Attention to Detail: Calibrations should be performed in accordance with
detailed instructions. Each different make/model instrument is adjusted
differently. Each instrument is installed in a different physical and loop
configuration. Because of these and many other differences, attention to
detail is very important. The minute a technician is not paying attention to
detail, safety and proper performance are jeopardized.
Excellent Documentation Practices: In many facilities, the impression of
quality is determined by the content and appearance of documentation.
Many technicians complain the paperwork is 90% of the work. In today’s
world of ISO 9000, cGMPs, A2LA, and other quality standards,
documentation is essential. If it isn’t documented, it wasn’t done.
Calibration Data Sheets must be neat, complete, signed and, if required,
reviewed in a timely manner. When changes occur, all related
documentation, such as drawings, manuals, specifications and databases
must also be updated.
Understanding of Processes: One thing that sets technicians apart is an
understanding of the process, particularly how the instruments monitor
and control the process. There is a difference between calibrating an
individual component and calibrating an instrument as part of the bigger
process control loop. For example, knowing when a controller can be
placed in manual without affecting the process, and what to do while that
controller is in manual, requires an understanding of the process.
Additionally, when an operator says there is a problem with his
indication, a technician who knows the instrument loop and process will
be more capable of identifying the cause of the problem.
Before we go on, some basic concepts about how calibrations should be
performed need to be discussed. Some of these concepts may be new and
not used in your facility, and some practices are industry dependent,
but you should be familiar with them. Although calibrations are
generally performed the same way, different practices have developed.
These practices are:
• Loop Calibration vs. Individual Instrument Calibration
• Bench Calibration vs. Field Calibration
• Classification of Instruments as Critical, Non-Critical, For
Reference Only, etc.
1.6 LOOP CALIBRATION VS. INDIVIDUAL
INSTRUMENT CALIBRATION
An individual instrument calibration is a calibration performed only on
one instrument. The input and output are disconnected. A known source
is applied to the input, and the output is measured at various data points
throughout the calibration range. The instrument is adjusted, if necessary,
and the calibration is rechecked.
ADVANTAGES OF INDIVIDUAL CALIBRATION
1. The correct instrument will be adjusted.
2. More compatible with multifunction calibrators.

DISADVANTAGES OF INDIVIDUAL CALIBRATION
1. The entire loop is not verified to be within tolerance.
2. Mistakes can be made on re-connect.
3. Less efficient use of time: one calibration is performed for each loop instrument, as opposed to one calibration for the whole loop.