Instrumentation and Process Control
BS (Eng.) Chemical Engineering 6th Semester
Engr. Muhammad Suleman
Lecturer
Study of Scientific Principles Employed in Instruments
• As future chemical engineers, understanding these principles is crucial because
instruments are the backbone of process control, analysis, and research in our
field. Whether it’s a simple thermometer or a sophisticated gas chromatograph,
every instrument operates based on fundamental scientific principles.
Importance of Scientific Instruments in Chemical
Engineering
• Process Monitoring and Control: Instruments help monitor parameters like
temperature, pressure, flow rate, and composition in real-time, ensuring optimal
process conditions.
• Quality Control: Analytical instruments ensure the quality of raw materials and
final products.
• Research and Development: Instruments are indispensable in developing new
processes, materials, and products.
• Safety: Instruments detect hazardous conditions, such as leaks or overheating,
preventing accidents.
Scientific Principles Behind Instruments
• Thermodynamics
• Thermodynamics is the study of energy, heat, and work. Many instruments rely
on thermodynamic principles:
• Thermocouples: Measure temperature based on the Seebeck effect, where a
voltage is generated due to a temperature difference between two dissimilar
metals.
• Calorimeters: Measure heat changes in chemical reactions using the principle of
heat transfer.
• Can anyone explain why thermocouples are widely used in industrial
processes compared to other temperature sensors?
Scientific Principles Behind Instruments
• Fluid mechanics deals with the behavior of fluids (liquids and gases). Instruments
based on fluid mechanics include:
• Flow Meters: Measure flow rates using principles like Bernoulli’s equation or the
Coriolis effect.
• Manometers: Measure pressure differences using the height of a fluid column.
• Question: How does a Venturi meter use Bernoulli’s principle to measure
flow rate?
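• A minimal Python sketch of the Venturi calculation, combining Bernoulli’s equation with continuity; the pipe/throat diameters, fluid density and discharge coefficient below are illustrative assumptions, not values from the slides:

```python
import math

def venturi_flow(dp, d_pipe, d_throat, rho, cd=0.98):
    """Volumetric flow rate (m^3/s) from a Venturi pressure drop dp (Pa)."""
    a_pipe = math.pi * d_pipe ** 2 / 4.0      # pipe cross-sectional area, m^2
    a_throat = math.pi * d_throat ** 2 / 4.0  # throat cross-sectional area, m^2
    area_ratio = a_throat / a_pipe
    # Bernoulli + continuity: Q = Cd * A_throat * sqrt(2*dp/rho) / sqrt(1 - (A_throat/A_pipe)^2)
    return cd * a_throat * math.sqrt(2.0 * dp / rho) / math.sqrt(1.0 - area_ratio ** 2)

# Example: water (1000 kg/m^3), 100 mm pipe, 50 mm throat, 20 kPa pressure drop
print(f"Q = {venturi_flow(20e3, 0.10, 0.05, 1000.0):.4f} m^3/s")
```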
• Electrochemistry involves the interaction of electrical energy and chemical
reactions. Instruments based on electrochemistry include:
• pH Meters: Measure the acidity or alkalinity of a solution using the potential
difference between two electrodes.
• Conductivity Meters: Measure the ability of a solution to conduct electricity,
which depends on ion concentration.
• Question: Why is it important to calibrate a pH meter before use?
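• A minimal Python sketch of the ideal pH-electrode relationship (Nernst slope of about 59.16 mV per pH unit at 25°C, zero potential assumed at pH 7); real electrodes drift in both slope and offset, which is why calibration against buffer solutions before use matters:

```python
NERNST_SLOPE_MV = 59.16  # ideal electrode slope at 25 degC, mV per pH unit

def ph_from_potential(e_mv, e_zero_mv=0.0):
    """Estimate pH from measured electrode potential (mV), zero point at pH 7."""
    return 7.0 - (e_mv - e_zero_mv) / NERNST_SLOPE_MV

print(f"pH = {ph_from_potential(177.5):.2f}")   # ~4.0 (acidic solution)
print(f"pH = {ph_from_potential(-118.3):.2f}")  # ~9.0 (alkaline solution)
```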
Signal Processing and Data Interpretation
• Signal processing is a critical aspect of instrumentation because raw
data from sensors often needs to be conditioned, amplified, and
converted into a usable format.
• Transducers: Devices that convert one form of energy into another. They typically convert physical quantities (e.g., temperature, pressure, flow rate) into electrical signals.
• Types of Transducers:
• Active Transducers: Generate electrical signals directly from the
physical input (e.g., thermocouples, piezoelectric sensors).
• Passive Transducers: Require an external power source to produce
an output signal (e.g., strain gauges, capacitive sensors).
• Example: Thermocouple converts temperature differences into a
voltage signal using the Seebeck effect.
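• A minimal Python sketch of this idea, using a constant Seebeck coefficient of about 41 µV/°C (roughly a type-K thermocouple); this linear approximation is illustrative only, since real instruments use standard polynomial reference tables and cold-junction compensation hardware:

```python
SEEBECK_UV_PER_C = 41.0  # approximate type-K sensitivity, microvolts per degC

def thermocouple_temperature(emf_uv, t_cold_junction_c=25.0):
    """Estimate hot-junction temperature (degC) from measured EMF (microvolts)."""
    # The EMF depends on the temperature DIFFERENCE between the two junctions,
    # so the cold-junction temperature must be added back.
    return t_cold_junction_c + emf_uv / SEEBECK_UV_PER_C

# Example: 3075 uV (3.075 mV) measured with the reference junction at 25 degC
print(f"T = {thermocouple_temperature(3075.0):.1f} degC")  # ~100 degC
```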
Signal Conditioning
• Purpose: to prepare a transducer’s output for further processing by amplifying, filtering, and linearizing the signal.
• Components:
• Amplifiers: Increase the amplitude of weak signals. Operational amplifiers are commonly used for this purpose.
• Example: Strain gauge outputs a small change in resistance, which is
converted into a voltage signal and then amplified.
• Filters: Remove unwanted noise or frequencies from the signal.
• Linearization: Some transducers produce non-linear outputs, which
need to be linearized for accurate interpretation.
• Example: Thermistors have a non-linear resistance-temperature
relationship, which can be linearized using software or hardware
techniques.
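• A minimal Python sketch of thermistor linearization in software using the Beta-parameter model; the R0, T0 and Beta constants below are typical illustrative values, not taken from the slides:

```python
import math

R0, T0, BETA = 10_000.0, 298.15, 3950.0  # 10 kOhm at 25 degC, assumed Beta value

def thermistor_temperature(resistance_ohm):
    """Convert measured NTC thermistor resistance (Ohm) to temperature (degC)."""
    # Beta model: 1/T = 1/T0 + ln(R/R0)/Beta, with T in kelvin
    inv_t = 1.0 / T0 + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

for r in (32650.0, 10000.0, 3603.0):  # roughly 0, 25 and 50 degC
    print(f"R = {r:8.0f} Ohm -> T = {thermistor_temperature(r):5.1f} degC")
```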
Calibration and Validation
• Essential to ensure that instruments provide accurate, reliable, and
consistent measurements.
• Calibration is the process of comparing an instrument’s output to a known standard under specific conditions.
• Steps in Calibration:
• Select a Standard: Use a reference material or instrument with
traceable accuracy.
• Perform Measurements: Compare the instrument’s output to the
standard at multiple points across its range.
• Adjust Instrument: Correct any deviations from the standard (if
possible).
• Example: Calibrating a pressure sensor using a deadweight tester.
• Calibration must be performed under controlled environmental
conditions (e.g., temperature, humidity) to minimize errors.
• Question : Why is it necessary to calibrate an instrument at
multiple points across its range rather than just one point?
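• A minimal Python sketch of a multi-point calibration check; the standard and instrument readings below are invented for illustration. Fitting a straight line to the comparison reveals both an offset and a gain (span) error, which is exactly what a single-point check cannot show:

```python
standard   = [0.0, 25.0, 50.0, 75.0, 100.0]   # reference (standard) values
instrument = [0.4, 25.6, 50.9, 76.1, 101.3]   # readings of the device under test

# Error at each calibration point
for s, m in zip(standard, instrument):
    print(f"standard {s:6.1f}  reading {m:6.1f}  error {m - s:+.1f}")

# Least-squares fit: reading = gain * standard + offset
n = len(standard)
sx, sy = sum(standard), sum(instrument)
sxx = sum(x * x for x in standard)
sxy = sum(x * y for x, y in zip(standard, instrument))
gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
offset = (sy - gain * sx) / n
print(f"gain = {gain:.4f}, offset = {offset:+.2f}")
```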
Calibration and Validation
• Temperature Sensor to be Calibrated: Thermocouple, RTD
(Resistance Temperature Detector), or thermistor.
• Standard Thermometer: High-precision reference thermometer
(e.g., a platinum resistance thermometer or a calibrated mercury-in-
glass thermometer).
• Temperature Bath: Stable and uniform heat source to create a
controlled temperature environment.
• Data Acquisition System: Record the readings from both the sensor
and the standard thermometer.
• Insulated Container or Well: Ensure both the sensor and the
standard thermometer are exposed to the same temperature.
Calibration and Validation
• Suppose you are calibrating a thermocouple at 100°C using a standard platinum resistance thermometer (SPRT).
• The recorded error at 100°C is +0.5°C, indicating that the sensor reads 0.5°C higher than the standard.
• If the sensor has an adjustable offset, you can apply a correction
of -0.5°C to align it with the standard.
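• A minimal Python sketch of applying this offset correction to subsequent readings (assuming a simple fixed offset is adequate over the range of interest):

```python
OFFSET_CORRECTION_C = -0.5  # sensor reads 0.5 degC high at 100 degC

def corrected_reading(raw_reading_c):
    """Apply the fixed offset correction to a raw sensor reading (degC)."""
    return raw_reading_c + OFFSET_CORRECTION_C

print(corrected_reading(100.5))  # -> 100.0, in agreement with the SPRT
```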
Calibration and Validation
• Level Sensor: The sensor to be calibrated.
• Reference Measurement: Calibrated tape measure, dipstick, or sight glass to
measure the actual liquid level.
• Test Vessel or Tank: A container with known dimensions to hold the liquid.
Static characteristics of instruments
• If a thermometer in a room reads 20°C, it does not really matter whether the true temperature of the room is 19.5°C or 20.5°C.
• Such small variations around 20°C are too small to affect whether we feel warm enough or not. Our bodies cannot discriminate between such close levels of temperature, and therefore a thermometer with an inaccuracy of ±0.5°C is perfectly adequate.
• If we had to measure the temperature of certain chemical processes, however, a
variation of 0.5°C might have a significant effect on the rate of reaction or even
the products of a process.
• A measurement inaccuracy much less than ±0.5°C is therefore clearly required.
Accuracy of measurement is thus one consideration in the choice of instrument
for a particular application.
• Other parameters also matter, such as sensitivity, linearity and the reaction to ambient temperature changes. These attributes are collectively known as the static characteristics of instruments.
Accuracy and Inaccuracy (measurement
uncertainty)
•Static Properties:
•Accuracy: Closeness of the measured value to the true value. For example, a
pressure sensor with ±0.1% accuracy is more reliable than one with ±1%
accuracy.
•The degree to which measured value matches the true or accepted value.
•Factors Affecting Accuracy:
•Calibration errors.
•Environmental conditions (temperature, humidity, etc.).
•Wear and tear of the instrument.
•Improving Accuracy:
•Regular calibration.
•Using higher-quality instruments.
•Compensating for environmental factors.
Accuracy and Inaccuracy (measurement
uncertainty)
Accuracy of an instrument is a measure of how close the output reading of the
instrument is to the correct value.
•Inaccuracy is the extent to which a reading might be wrong, and is often quoted as a
percentage of the full-scale (f.s.) reading of an instrument.
•E.g. a pressure gauge of range 0–10 bar with a quoted inaccuracy of ±1.0% f.s. (±1% of full-scale reading) has a maximum expected error in any reading of 0.1 bar. This means that when the instrument is reading 1.0 bar, the possible error is 10% of this value.
•An important system design rule is that instruments are chosen such that their range is appropriate to the spread of values being measured, so that the best possible accuracy is maintained in instrument readings.
•Thus, if we were measuring pressures with expected values between 0 and 1 bar,
we would not use an instrument with a range of 0–10 bar.
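• A minimal Python sketch of this arithmetic, showing how an error quoted as a percentage of full scale becomes a large fraction of the reading at the bottom of the range:

```python
FULL_SCALE_BAR = 10.0
INACCURACY_FS = 0.01  # +/-1.0% of full-scale reading

max_error_bar = INACCURACY_FS * FULL_SCALE_BAR  # 0.1 bar anywhere on the scale

for reading in (10.0, 5.0, 1.0):
    relative = 100.0 * max_error_bar / reading
    print(f"reading {reading:4.1f} bar -> possible error {max_error_bar:.1f} bar "
          f"({relative:.0f}% of the reading)")
```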
Precision/repeatability/reproducibility
•Precision: Consistency of repeated measurements. A flow meter that gives the
same reading multiple times for the same flow rate is highly precise.
•High precision means low random error (repeatability).
Precision describes an instrument’s degree of freedom from random errors.
•If a large number of readings are taken of the same quantity by a high precision
instrument, spread of readings will be very small.
•Precision is often, though incorrectly, confused with accuracy. High precision does
not imply anything about measurement accuracy.
•High precision instrument may have a low accuracy.
•Low accuracy measurements from a high precision instrument are normally caused
by a bias in the measurements, which is removable by recalibration.
•Repeatability and reproducibility mean approximately the same but are applied in
different contexts.
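• A minimal Python sketch separating the two ideas for repeated readings of a known true value (the readings are invented): the mean error is the bias (accuracy, removable by recalibration) and the standard deviation is the spread (precision/repeatability):

```python
import statistics

TRUE_VALUE = 50.00
readings = [50.82, 50.79, 50.81, 50.80, 50.83, 50.78]  # tight spread, but offset high

bias = statistics.mean(readings) - TRUE_VALUE   # systematic error (low accuracy)
spread = statistics.stdev(readings)             # random error (high precision if small)

print(f"bias  = {bias:+.3f}  (removable by recalibration)")
print(f"stdev = {spread:.3f}  (small spread -> high precision/repeatability)")
```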
Precision/repeatability/reproducibility
• Repeatability is the closeness of output readings when the same input is applied repetitively over a short period of time, with the same measurement conditions, same instrument and observer, same location and same conditions of use maintained throughout.
• Reproducibility is the closeness of output readings for the same input when there are changes in the method of measurement, observer, measuring instrument, location, conditions of use and time of measurement.
• Both terms thus describe the spread of output readings for the same input. This
spread is referred to as repeatability if measurement conditions are constant and
as reproducibility if the measurement conditions vary.
Precision/repeatability/reproducibility
The degree of repeatability or reproducibility in measurements from an instrument is an
alternative way of expressing its precision. Figure 2.5 illustrates this more clearly.
Dynamic and Static Properties of Instruments
• Sensitivity: The sensitivity of measurement is a measure of the change in instrument output that occurs when the quantity being measured changes by a given amount. Thus, sensitivity is the ratio:
sensitivity = change in instrument output reading / change in the quantity being measured
• Sensitivity is therefore the slope of the straight line drawn on Figure 2.6. If, for example, a pressure of 2 bar produces a deflection of 10 degrees in a pressure transducer, the sensitivity of the instrument is 5 degrees/bar (assuming that the deflection is zero with zero pressure applied).
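• A minimal Python sketch of the same calculation, treating sensitivity as the slope of the output-versus-input line (the calibration points below reproduce the 5 degrees/bar example):

```python
pressure_bar   = [0.0, 1.0, 2.0, 3.0]     # input (measured quantity)
deflection_deg = [0.0, 5.0, 10.0, 15.0]   # output reading (linear, zero at zero pressure)

# Sensitivity = change in output reading / change in measured quantity
sensitivity = (deflection_deg[-1] - deflection_deg[0]) / (pressure_bar[-1] - pressure_bar[0])
print(f"sensitivity = {sensitivity:.1f} degrees/bar")  # 5.0 degrees/bar
```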
Dynamic and Static Properties of Instruments
• Ability to detect small changes in the measured variable. For example, a
highly sensitive pH sensor can detect changes of 0.01 pH units.
• The ratio of the change in the output to the change in the input.
• Indicates how effectively the instrument responds to small changes in the
measured quantity.
Dynamic and Static Properties of Instruments
•Span
•The difference between the maximum and minimum values of the input
that the instrument can measure.
Dynamic and Static Properties of Instruments
•Threshold
•If the input to an instrument is gradually increased from zero, the input will have to reach a certain minimum level before the change in the instrument output reading is of a large enough magnitude to be detectable. This minimum level of input is known as the threshold of the instrument. Threshold is quoted either as an absolute value or as a percentage of the full-scale reading. As an illustration, a car speedometer typically has a threshold of about 15 km/h. This means that, if the vehicle starts from rest and accelerates, no output reading is observed on the speedometer until the speed reaches 15 km/h.
Dynamic and Static Properties of Instruments
• Range: Minimum and maximum values an instrument can measure. For example, a temperature sensor with a range of -50°C to 150°C is suitable for low-temperature and moderate-temperature applications.
• For example, a thermometer with a range of 0°C to 100°C.
• Resolution: Smallest change in the measured variable that the instrument can
detect. For example, a digital thermometer with a resolution of 0.1°C can detect
small temperature changes.
• The smallest change in the input quantity that the instrument can detect and
display.
• For example, a digital multimeter with a resolution of 0.01 V can detect changes as
small as 0.01 V.
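• A minimal Python sketch of the effect of finite resolution, quantizing a reading to 0.1°C steps as in the digital thermometer example:

```python
RESOLUTION_C = 0.1  # smallest displayable step

def displayed(value_c):
    """Round a true value to the nearest displayable step."""
    return round(value_c / RESOLUTION_C) * RESOLUTION_C

print(displayed(25.04), displayed(25.06))  # 25.0 vs 25.1 -> change is detected
print(displayed(25.01), displayed(25.03))  # both 25.0    -> change is not resolved
```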
Dynamic and Static Properties of
Instruments
• Hysteresis
• The difference in the output for the same input when the input is increased and decreased (a simple way of quantifying it is sketched after this list).
• Caused by internal friction, magnetic effects, or mechanical play in the instrument.
• Dead Zone
• The range of input values for which the instrument does not respond or produce any output.
• Often caused by friction or backlash in mechanical systems.
• Repeatability
• The ability of the instrument to produce the same output for the same input under the same
conditions over multiple trials.
• Reproducibility
• The ability of the instrument to produce the same output for the same input under different
conditions (e.g., different operators, environments).
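• A minimal Python sketch quantifying hysteresis from increasing- and decreasing-input sweeps over the same input points (the readings are invented for illustration):

```python
inputs      = [0, 25, 50, 75, 100]
output_up   = [0.0, 24.2, 49.0, 73.8, 99.5]  # readings with the input increasing
output_down = [0.0, 26.1, 51.3, 75.9, 99.5]  # readings with the input decreasing

# Maximum hysteresis = largest difference between the two sweeps at the same input
max_hysteresis = max(abs(u - d) for u, d in zip(output_up, output_down))
print(f"maximum hysteresis = {max_hysteresis:.1f} output units")  # 2.3
```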
Dynamic and Static Properties of Instruments
• Threshold
• The minimum input required to produce a detectable output.
• Also known as the minimum detectable signal.
• Zero Drift
• The change in the instrument's output over time when the input is zero.
• Caused by factors like temperature changes or aging of components.
• Stability
• The ability of the instrument to maintain its performance over time without significant drift or degradation.
• Error
• The difference between the measured value and the true value.
• Types of errors include:
• Systematic Error: Consistent deviation due to instrument flaws.
• Random Error: Unpredictable variations due to noise or external factors.
Active and Passive Instruments
Instruments are divided into active or passive ones according to whether the instrument output is entirely produced by the quantity being measured or whether the quantity being measured simply modulates the magnitude of some external power source.
An example of a passive instrument is the pressure-measuring device shown in the figure. The pressure of the fluid is translated into a movement of a pointer against a scale. The energy expended in moving the pointer is derived entirely from the change in pressure measured: there are no other energy inputs to the system.
An example of an active instrument is the float-type petrol tank level indicator sketched in the figure. A change in petrol level moves a potentiometer arm, and the output signal consists of a proportion of the external voltage source applied across the two ends of the potentiometer. The energy in the output signal comes from the external power source: the primary transducer (the float system) is merely modulating the value of the voltage from the external power source.
In active instruments, the external power source is usually in electrical form, but in some cases it can be other forms of energy, such as pneumatic or hydraulic.
Null-type and deflection-type instruments
• The pressure gauge just mentioned is a good example of a deflection-type instrument, in which the value of the quantity being measured is displayed in terms of the amount of movement of a pointer.
• An alternative type of pressure gauge is the deadweight gauge shown in the figure, a null-type instrument. Weights are put on top of the piston until the downward force balances the fluid pressure.
• Weights are added until the piston reaches a datum level, known as the null point. The pressure measurement is made in terms of the value of the weights needed to reach this null position.
• The accuracy of these two instruments depends on different things. The first depends on the linearity and calibration of the spring, whilst the second relies on the calibration of the weights.
• As the calibration of weights is much easier than the careful choice and calibration of a linear-characteristic spring, the second type of instrument will normally be more accurate. This is in accordance with the general rule that null-type instruments are more accurate than deflection types.
Null-type and deflection-type instruments
• In terms of usage, the deflection type instrument is clearly more convenient.
• It is far simpler to read the position of a pointer against a scale than to add and subtract weights
until a null point is reached.
• The deflection-type instrument is therefore the one that would normally be used in the workplace.
• However, for calibration duties, the null-type instrument is preferable because of its superior accuracy.
• The extra effort required to use such an instrument is perfectly acceptable in this case because of the infrequent nature of calibration operations.
Analogue and digital instruments
• An analogue instrument gives an output that varies continuously as the quantity being measured changes.
• The output can have an infinite number of values within the range that the instrument is designed to measure.
• The deflection-type pressure gauge described earlier in this chapter (Figure 2.1) is a good example of an analogue instrument.
• As the input value changes, the pointer moves with a smooth continuous motion. Whilst the pointer can therefore be in an infinite number of positions within its range of movement, the number of different positions that the eye can discriminate between is strictly limited, this discrimination being dependent upon how large the scale is and how finely it is divided.
• A digital instrument has an output that varies in discrete steps and so can only have a finite number of values. The rev counter sketched in Figure 2.4 is an example of a digital instrument.
• A cam is attached to the revolving body whose motion is being measured, and on each revolution the cam opens and closes a switch. The switching operations are counted by an electronic counter. This system can only count whole revolutions and cannot discriminate any motion that is less than a full revolution.
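• A minimal Python sketch of why such an instrument is digital: the counter registers only whole switch closures, so any motion of less than a full revolution is invisible in the output:

```python
def counted_revolutions(angle_turned_deg):
    """Whole revolutions registered by the cam-and-switch counter."""
    return int(angle_turned_deg // 360)

for angle in (350, 360, 719, 720):
    print(f"{angle:4d} deg turned -> counter reads {counted_revolutions(angle)}")
```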