
ME-415-ME-Laboratory 1-Pressure Measurement PDF


Summary

This document provides an introduction to pressure measurement, covering its importance in various industries and applications. It discusses different types of pressure measurement, such as absolute and gauge pressure, and explains pressure-measuring instruments like manometers and Bourdon tubes. The document also touches upon the significance of precise pressure measurement in safety, process control, efficiency, and equipment protection.

Full Transcript


II. PRESSURE MEASUREMENT

INTRODUCTION

Pressure measurement is one of the most common of all measurements made on systems. Pressure measurements, along with flow measurements, are used extensively in industry, laboratories and many other fields for a wide variety of reasons. Pressure measurements are concerned not only with the determination of force per unit area but are also involved in many liquid level, density, flow and temperature measurements. The measurement of pressure is one of the most important measurements, as it is used in almost all industries. Some important applications of pressure measurement are listed below.

1. The pressure of steam in a boiler is measured to ensure safe operating conditions of the boiler.
2. Pressure measurement is performed in continuous processing industries such as manufacturing and chemical industries.
3. Pressure measurement helps in determining the liquid level in tanks and containers.
4. Pressure measurement helps in determining the density of liquids.
5. In many flow meters (such as the venturi meter, orifice meter, flow nozzle, etc.), pressure measurement serves as an indication of flow rate.
6. Measurement of pressure change serves as an indication of temperature (as used in fluid-expansion-type pressure thermometers).
7. Apart from this, pressure measurement is also required in day-to-day situations such as maintaining optimal pressure in vehicle tires.

Pressure measurement is crucial in process industries for several reasons:

1. Safety: Accurate pressure monitoring helps prevent over-pressurization, which can lead to equipment failure, explosions, or the release of hazardous substances. Maintaining proper pressure levels is essential for ensuring the safety of personnel and equipment.
2. Process Control: Pressure is a key parameter in many industrial processes. Precise pressure measurement allows for better control of processes like chemical reactions, fluid flow, and gas compression.
This ensures that the process operates within desired parameters, leading to consistent product quality.
3. Efficiency: Proper pressure management can optimize energy consumption and reduce waste. For example, in a steam system, maintaining the correct pressure can improve energy efficiency, reducing costs and minimizing environmental impact.
4. Equipment Protection: Continuous pressure monitoring helps protect sensitive equipment, such as pumps, compressors, and vessels, from damage due to abnormal pressure conditions. This extends the lifespan of the equipment and reduces maintenance costs.
5. Compliance and Regulation: Many process industries are subject to strict regulations regarding pressure levels. Accurate pressure measurement is necessary to ensure compliance with industry standards and regulatory requirements.
6. Early Detection of Issues: Deviations in pressure can indicate potential problems, such as leaks, blockages, or equipment malfunctions. Early detection through pressure monitoring allows for timely intervention, preventing costly downtime or accidents.

PRESSURE

Pressure is the force exerted per unit area on the surface of an object. It is a measure of how a force is distributed over a specific area in fluids (gases and liquids) and solids, and it is typically expressed in units such as pascals (Pa), atmospheres (atm), or pounds per square inch (psi). Mathematically, pressure is defined as:

Pressure (P) = Force (F) / Area (A)

Where:
Force (F) is the perpendicular force applied to the surface, measured in newtons (N) or pounds (lb).
Area (A) is the area over which the force is distributed, measured in square meters (m²) or square inches (in²).

TYPES OF PRESSURE MEASUREMENT

Pressure can be measured in various ways depending on the application and the reference point used. Here are the main types of pressure measurement:

1. Absolute Pressure is the pressure measured relative to a perfect vacuum (zero reference point).
It is used in applications where a true reference point is needed, such as in vacuum systems, altimeters, and when measuring atmospheric pressure. Example: Atmospheric pressure measured at sea level is approximately 101.3 kPa (absolute).
2. Gauge Pressure is the pressure measured relative to the ambient atmospheric pressure. It is common in industrial and automotive applications, such as measuring tire pressure or pressure in a pressurized vessel. Example: A tire gauge reading of 30 psi means the pressure inside the tire is 30 psi above the atmospheric pressure.
3. Differential Pressure is the difference in pressure between two points in a system. It is used in flow measurement, filtration, level measurement, and in systems where it is important to know the pressure difference across components like filters or orifices. Example: Measuring the pressure drop across a filter to determine its condition.
4. Atmospheric Pressure is the pressure exerted by the weight of the Earth's atmosphere. It varies with altitude and weather conditions. It is used in weather forecasting, altimetry, and aviation. Example: The standard atmospheric pressure at sea level is 101.325 kPa (kilopascals).
5. Vacuum Pressure is a pressure lower than the atmospheric pressure. It is often measured as a gauge pressure with reference to atmospheric pressure. It is used in vacuum systems, such as those in manufacturing, scientific research, and space applications. Example: A vacuum pump creating a vacuum of 0.1 atm (absolute) has a gauge pressure of approximately -0.9 atm.
6. Hydrostatic Pressure is the pressure exerted by a fluid at rest due to the force of gravity. It is relevant in applications involving liquids, such as in dams, water towers, and blood pressure measurement. Example: The pressure at the bottom of a 10-meter column of water is approximately 98.1 kPa.
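The gauge/absolute relationship and the hydrostatic-column example above can be sketched in a few lines of Python. This is an illustrative sketch: the function names and the psi-per-kPa factor are assumptions for the example, not part of the original document.

```python
import math

ATM_KPA = 101.325        # standard atmospheric pressure at sea level, kPa
KPA_PER_PSI = 6.894757   # approximate conversion factor, kPa per psi

def gauge_to_absolute(gauge_kpa, atmospheric_kpa=ATM_KPA):
    """Absolute pressure = gauge pressure + local atmospheric pressure (kPa)."""
    return gauge_kpa + atmospheric_kpa

def hydrostatic_pressure(height_m, density_kg_m3=1000.0, g=9.81):
    """Hydrostatic pressure P = rho * g * h, returned in kPa."""
    return density_kg_m3 * g * height_m / 1000.0

# A 30 psi tire-gauge reading means 30 psi above atmospheric pressure:
tire_gauge_kpa = 30 * KPA_PER_PSI
print(f"absolute tire pressure: {gauge_to_absolute(tire_gauge_kpa):.1f} kPa")

# Pressure at the bottom of a 10 m water column (the example above):
print(f"10 m water column: {hydrostatic_pressure(10):.1f} kPa")  # ~98.1 kPa
```

Running the sketch reproduces the hydrostatic figure quoted above (about 98.1 kPa for 10 m of water).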
UNITS OF PRESSURE MEASUREMENT

Pressure can be measured in various units, depending on the system of measurement and the specific application. Here are some common units of pressure:

1. Pascal (Pa): 1 pascal is equal to 1 newton per square meter (N/m²). It is widely used in scientific and engineering applications.
2. Kilopascal (kPa): 1 kilopascal is equal to 1,000 pascals. It is often used in meteorology and for tire pressure.
3. Megapascal (MPa): 1 megapascal is equal to 1,000,000 pascals. It is common in hydraulic systems and high-pressure applications.
4. Pound per Square Inch (psi): The pressure resulting from a force of one pound-force applied to an area of one square inch. It is widely used in the United States for tire pressure, hydraulic systems, and other industrial applications.
5. Pound per Square Foot (psf): The pressure resulting from a force of one pound-force applied to an area of one square foot. It is used in structural engineering and wind pressure measurement.
6. Bar: 1 bar is equal to 100,000 pascals. It is used in meteorology, oceanography, and automotive applications.
7. Atmosphere (atm): 1 atmosphere is the average atmospheric pressure at sea level. It is often used in chemistry and physics to express pressures.
8. Torr (Torr): 1 torr is approximately equal to the pressure exerted by a 1 mm column of mercury at 0°C. It is commonly used in vacuum measurements and barometry.
9. Millimeter of Mercury (mmHg): The pressure exerted by a 1 mm column of mercury. It is widely used in medicine, particularly in blood pressure measurement.
10. Inch of Mercury (inHg): The pressure exerted by a 1-inch column of mercury. It is used in aviation, weather reports, and some vacuum systems.
11. Millibar (mbar): 1 millibar is equal to 100 pascals. It is often used in meteorology to measure atmospheric pressure.

Table 1.
Pressure Conversion Matrix

STATIC AND DYNAMIC PRESSURE

Static pressure is uniform in all directions, so pressure measurements are independent of direction in an immovable (static) fluid. Flow, however, applies additional pressure on surfaces perpendicular to the flow direction, while having little impact on surfaces parallel to the flow direction. This directional component of pressure in a moving (dynamic) fluid is called dynamic pressure. An instrument facing the flow direction measures the sum of the static and dynamic pressures; this measurement is called the total pressure or stagnation pressure. Since dynamic pressure is referenced to static pressure, it is neither gauge nor absolute; it is a differential pressure. While static gauge pressure is of primary importance in determining net loads on pipe walls, dynamic pressure is used to measure flow rates and airspeed. Dynamic pressure can be measured by taking the differential pressure between instruments parallel and perpendicular to the flow. Pitot-static tubes, for example, perform this measurement on airplanes to determine airspeed. The presence of the measuring instrument inevitably acts to divert flow and create turbulence, so its shape is critical to accuracy, and the calibration curves are often non-linear.

PRESSURE MEASURING INSTRUMENTS

Many instruments have been invented to measure pressure, with different advantages and disadvantages. Pressure range, sensitivity, dynamic response and cost all vary by several orders of magnitude from one instrument design to the next. The following are the instruments used in various situations:

1. Manometer: A manometer is a device used to measure pressure, typically of gases or liquids, by balancing the pressure against a column of liquid. The basic principle behind a manometer is that the pressure exerted by a fluid in a column is proportional to the height of the fluid column.
Manometers are simple, reliable, and widely used in various applications, particularly in laboratories, HVAC systems, and industrial processes. The main purpose of a manometer is to compare the pressure of a fluid with a known reference pressure. It typically consists of a glass tube or a flexible diaphragm that reacts to pressure changes. The device can measure both positive and negative pressures, providing valuable information for different systems and processes. The fundamental relationship for the pressure expressed by a liquid column is

P = P2 − P1 = ρgh

where
P = differential pressure
P1 = pressure at the low-pressure connection
P2 = pressure at the high-pressure connection
ρ = density of the liquid
g = acceleration of gravity
h = height of the liquid column

In all forms of manometers (U-tubes, well types, and inclined types) there are two liquid surfaces. Pressure determinations are made from how the fluid moves when pressures are applied to each surface. For gauge pressure, P1 is equal to zero (atmospheric reference), simplifying the equation to

P = ρgh

Types of Manometers

a) U-Tube Manometer: The U-tube manometer consists of a U-shaped glass tube filled with a liquid, usually mercury or water. One end of the tube is open to the atmosphere or a reference pressure, while the other end is connected to the pressure source being measured. It is used to measure small pressure differences, such as in laboratory experiments or ventilation systems. The principle of the manometer is that the pressure to be measured is applied to one side of the tube, producing a movement of liquid, as shown in the figure above. It can be seen that the level of the filling liquid in the leg where the pressure is applied, i.e. the left leg of the tube, has dropped, while that in the right-hand leg has risen. A scale is fitted between the tubes to enable us to measure this displacement.
Let us assume that the pressure we are measuring and have applied to the left-hand side of the manometer is of constant value. The liquid will only stop moving when the pressure exerted by the column of liquid, h, is sufficient to balance the pressure applied to the left side of the manometer, i.e. when the head pressure produced by column h is equal to the pressure to be measured. Knowing the length of the column of liquid, h, and the density of the filling liquid, we can calculate the value of the applied pressure as ρgh.

b) Inclined Manometer: An inclined manometer is a specialized type of manometer designed to measure small pressure differences with greater accuracy and sensitivity than standard U-tube manometers. It is commonly used in applications where precise low-pressure measurements are required. The key feature of an inclined manometer is the inclination of the tube, which allows for a larger scale of measurement for small pressure changes, making it easier to detect and measure minute pressure differences. The inclined manometer consists of a transparent tube filled with a liquid, typically water or a low-density oil. The tube is mounted on a scale that is inclined at a specific angle to the horizontal plane. One end of the tube is connected to the pressure source, and the other end is open to the atmosphere or connected to another pressure source for differential pressure measurement. When pressure is applied, the liquid in the tube moves along the inclined plane. The movement of the liquid creates a height difference, which corresponds to the pressure difference between the two ends of the manometer. The pressure difference is calculated using the equation:

ΔP = ρgh sin θ

Where:
ρ is the density of the liquid.
g is the acceleration due to gravity.
h is the length of the liquid column along the inclined plane.
θ is the angle of inclination.
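As a quick numerical check of the two relations above, the following Python sketch (function and variable names are illustrative assumptions) computes a U-tube reading P = ρgh and shows how the incline stretches the same pressure difference over a longer column of scale travel:

```python
import math

def u_tube_pressure(height_m, density_kg_m3, g=9.81):
    """Gauge pressure indicated by a U-tube liquid column: P = rho*g*h (Pa)."""
    return density_kg_m3 * g * height_m

def inclined_pressure(column_length_m, angle_deg, density_kg_m3, g=9.81):
    """Inclined manometer: dP = rho*g*h*sin(theta), with h measured along the tube."""
    return density_kg_m3 * g * column_length_m * math.sin(math.radians(angle_deg))

# 100 mm of water in a vertical U-tube:
print(u_tube_pressure(0.100, 1000.0))  # ~981 Pa

# The same ~981 Pa read on a tube inclined at 10 degrees spreads the liquid
# over a much longer column, improving the readable resolution:
length_m = 0.100 / math.sin(math.radians(10))
print(f"{length_m * 1000:.0f} mm of scale travel")  # ~576 mm
```

This illustrates why the inclined type is preferred for very small pressure differences: the same vertical head maps to several times the scale length.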
Because the tube is inclined, a small change in pressure results in a more significant movement of the liquid along the scale. The length h is measured along the inclined scale, and since the tube is tilted, small pressure differences cause larger displacements, making the manometer more sensitive to low pressures. The length of the liquid column along the inclined plane is directly proportional to the pressure difference. The inclination allows for finer resolution in measurement, making it easier to read small changes in pressure.

c) Well-Type Manometer: A well-type manometer is a variant of the traditional U-tube manometer, designed to measure pressure with enhanced readability and convenience, especially over larger pressure ranges. This type of manometer is commonly used in both industrial settings and laboratories due to its ability to provide clear and accurate pressure readings. A well-type manometer uses a large reservoir or "well" on one side of the tube, which helps stabilize the liquid level, making the measurement process easier and more reliable. As illustrated in the figure, if one leg of the manometer is increased many times in area relative to the other, the volume of fluid displaced will represent very little change of height in the smaller-area leg. This condition results in an ideal arrangement whereby it is necessary to read only one convenient scale adjacent to a single indicating tube rather than two as in the U-type. The larger-area leg is called the "well". The higher-pressure source being measured must always be connected to the well connection "P". A lower-pressure source must always be connected to the top of the tube, and a differential pressure measurement must always have the higher-pressure source connected at the well connection "P". In any measurement the source of pressure must be connected in a manner that will cause the indicating fluid to rise in the indicating tube.
The true pressure still follows the principles previously outlined and is measured by the difference between the fluid surfaces. It is apparent that there must be some drop in the well level.

d) Digital Manometer: Unlike traditional manometers, digital manometers do not rely on a hydrostatic balance of fluids (water or mercury) to detect pressure. Rather, they contain a component known as a pressure transducer that converts the level of pressure observed into an electric signal or value; the value can then be recorded as the amount of pressure present. There are usually three types of electrical variables used by pressure transducers: resistive, capacitive, and inductive. Digital manometers represent a significant advancement in pressure measurement technology, offering high accuracy, ease of use, and a range of features that make them suitable for various applications. Whether in industrial process control, HVAC systems, automotive diagnostics, or scientific research, digital manometers provide reliable and precise pressure readings, contributing to the efficiency, safety, and effectiveness of operations. Despite their higher cost and complexity, the benefits of digital manometers make them a valuable tool in modern industry and research.

2. Bourdon Tube: The Bourdon tube is the most widely used pressure sensing element. It consists of a narrow-bore tube of elliptical cross-section, sealed at one end. The pressure is applied at the other end, which is open and fixed. The tube is formed into a curve, a flat spiral or a helix. When the pressure is applied, the effect of the forces is to straighten the tube so that the closed end is displaced. Figure 6 illustrates a C-type Bourdon tube as used in a direct indicating gauge, which usually has an arc of 250°. The process pressure is connected to the fixed socket end of the tube while the tip end is sealed.
Because of the difference between the inside and outside radii, the Bourdon tube presents different areas to pressure, which causes the tube to tend to straighten when pressure is applied. The resulting tip motion is non-linear because less motion results from each increment of additional pressure. This non-linear motion has to be converted to a linear rotational pointer response. This is done mechanically by means of a geared sector-and-pinion movement, as shown in the figure. The tip motion is transferred to the tail of the movement sector by the connector link. The angle between the connecting link and the sector tail is called the travelling angle. This angle changes with tip movement in a non-linear fashion, and so the movement of the pinion, and therefore of the pointer, is linear. Frequently used Bourdon tube materials include bronze, alloy steel and stainless steel. These elements are not ideally suited for low-pressure, vacuum or compound measurements because the spring gradient of the Bourdon tube is too low. The advantages of Bourdon tube pressure gauges are that they give accurate results. Bourdon tubes are simple in construction and their cost is low. They can be modified to give electrical outputs. They are safe even for high-pressure measurement, and the accuracy is high, especially at high pressures. The Bourdon gauge coupled with a stainless-steel capsule-type sensing bulb is used in milk homogenizers. The Bourdon tube pressure gauges have some limitations as well. They respond slowly to changes in pressure. They are subject to hysteresis and are sensitive to shocks and vibrations. As the displacement of the free end of the Bourdon tube is low, it requires amplification. Moreover, they cannot be used for precision measurement.

3. Diaphragm Pressure Gauge: A diaphragm pressure gauge consists of a diaphragm isolator and a general pressure gauge.
It is suitable for measuring the pressure of media that are highly corrosive, of high temperature or high viscosity, or that crystallize, solidify, or carry suspended solids easily, where direct contact between the medium and a general pressure gauge must be avoided, and for occasions where accumulation of sediment or liquid in the gauge must be prevented. Diaphragm pressure gauges are mainly used in industries such as petrochemicals, alkalis, chemical fibers, printing and dyeing, pharmaceuticals, food, and dairy, to measure the pressure of flowing fluid media during the production process. The diaphragm isolator seals off the measured medium. When the measured medium pressure P acts on the diaphragm, the diaphragm deforms and compresses the working fluid filled in the system, so that the working fluid develops a pressure ΔP equivalent to P. Transmission through the liquid causes the free end of the elastic element (spring tube) in the pressure gauge to undergo a corresponding elastic deformation and displacement, and the measured pressure value is then displayed according to the working principle of the pressure gauge that matches it.

4. Bellows Pressure Gauge: A bellows pressure gauge is a type of pressure measuring instrument that uses a bellows-like expansion and contraction mechanism to measure the pressure of gases or liquids. The bellows, which is a flexible metal element, responds to changes in pressure by expanding or contracting. This movement is then mechanically translated into a pressure reading displayed on a dial. Bellows pressure gauges are often used in applications where high sensitivity and accuracy are required, such as laboratory settings or applications involving low-pressure measurements. The central feature of a bellows pressure gauge is the bellows itself. The bellows is a cylindrical, accordion-like structure made of thin, flexible metal.
When pressure is applied to the inside of the bellows, it expands, and when pressure decreases, it contracts. This expansion and contraction movement is directly proportional to the pressure changes within the system being measured. The working principle of a bellows pressure gauge is based on the expansion and contraction of the bellows due to pressure changes. The bellows is connected to a movement mechanism that translates its motion into the rotation of a pointer on a dial. As the bellows expands or contracts in response to pressure variations, the pointer moves along the dial, indicating the pressure reading.

5. McLeod Vacuum Gauge: The McLeod gauge is used to measure vacuum pressure. It also serves as a reference standard to calibrate other low-pressure gauges. The components of the McLeod gauge include a reference column with a reference capillary tube. The reference capillary tube has a point called the zero reference point. This reference column is connected to a bulb and measuring capillary, and the place of connection of the bulb with the reference column is called the cut-off point. It is called so because if the mercury level is raised above this point, it will cut off the entry of the applied pressure to the bulb and measuring capillary. Below the reference column and the bulb, there is a mercury reservoir operated by a piston. The pressure to be measured (P1) is applied to the top of the reference column of the McLeod gauge as shown in the figure. The mercury level in the gauge is raised by operating the piston to fill the volume shown by the dark shade in the diagram. When the applied pressure fills the bulb and the capillary, the piston is operated again so that the mercury level in the gauge rises. When the mercury level reaches the cut-off point, a known volume of gas (V1) is trapped in the bulb and measuring capillary tube.
The mercury level is further raised by operating the piston so that the trapped gas in the bulb and measuring capillary tube is compressed. This is done until the mercury level reaches the zero reference point marked on the reference capillary. In this condition, the volume of the gas in the measuring capillary tube is read directly from a scale beside it. That is, the difference in height H between the measuring capillary and the reference capillary becomes a measure of the volume (V2) and pressure (P2) of the trapped gas. Now, as V1, V2, and P2 are known, the applied pressure P1 can be calculated using Boyle's law:

P1 V1 = P2 V2

The working of the McLeod gauge is independent of the gas composition. A linear relationship exists between the applied pressure and the height, and there is no need to apply corrections to the readings. The limitations are that the gas whose pressure is to be measured should obey Boyle's law, and the presence of vapours in the gauge affects the performance.

6. Pirani Gauge: The Pirani gauge consists of a metal wire open to the pressure being measured. The wire is heated by a current flowing through it and cooled by the gas surrounding it. If the gas pressure is reduced, the cooling effect will decrease; hence the equilibrium temperature of the wire will increase. The resistance of the wire is a function of its temperature, and by measuring the voltage across the wire and the current flowing through it, the resistance can be determined and so the gas pressure is evaluated. The figure shows a Pirani gauge with two platinum alloy filaments which act as resistances in two arms of a Wheatstone bridge. One filament is the reference filament and the other is the measurement filament. The reference filament is immersed in a fixed gas pressure, while the measurement filament is exposed to the system gas. A current through the bridge heats both filaments. Gas molecules hit the heated filaments and conduct away some of the heat.
If the gas pressure around the measurement filament is not identical to that around the reference filament, the bridge is unbalanced, and the degree of unbalance is a measure of the pressure. The unbalance is adjusted, and the current needed to bring about balance is used as a measure of the pressure.

7. Ionization Gauge: These gauges are the most sensitive gauges for measuring very low pressures or high vacuum. The principle of operation of these gauges is sensing the pressure of a gas by measuring the electrical ions produced when the gas is bombarded with electrons. Fewer ions will be produced by lower-density gases. The electrons are generated by thermionic emission. These electrons collide with gas atoms and generate positive ions. The ions are attracted to a suitably biased electrode known as the collector. The current in the collector is proportional to the rate of ionization, which is a function of the pressure in the system. Hence, measuring the collector current gives the gas pressure. The ionization gauges are of two types: hot cathode ionization gauges and cold cathode ionization gauges. In the hot cathode version, an electrically heated filament produces an electron beam. The electrons travel through the gauge and ionize gas molecules around them. The resulting ions are collected at a negative electrode. The current depends on the number of ions, which depends on the pressure in the gauge. The working of the cold cathode gauge is the same, with the only difference being in the production of the electrons, which are produced in the discharge of a high voltage.

8. Thermal Conductivity Vacuum Gauge: The thermal conductivity vacuum gauge works on the principle that at low pressure the thermal conductivity of a gas is a function of pressure. The figure shows the basic elements of a thermocouple vacuum gauge. It consists of a linear element which is heated by a known current source and is in contact with a thermocouple attached to its centre.
The heater element together with the thermocouple is enclosed in a glass enclosure. The vacuum system to be evaluated is connected to this enclosure. The heater element is supplied with constant electrical energy. The temperature of the heating element is a function of the heat loss to the surrounding gas, which in turn is a function of the thermal conductivity of the gas, which depends on the pressure of the gas. The temperature is measured by the thermocouple and is calibrated to read the pressure of the gas.

9. Dead Weight Tester: The dead weight tester is basically a pressure-producing and pressure-measuring device. It is used to calibrate pressure gauges. The dead weight tester apparatus consists of a piston-cylinder combination fitted above the chamber, as shown in Figure 13. The chamber below the cylinder is filled with oil. The top portion of the piston is fitted with a platform to carry weights. A plunger with a handle is provided to vary the pressure of the oil in the chamber. The pressure gauge to be tested is fitted at an appropriate place, as shown in Figure 13.

Figure 13. Dead Weight Tester

To calibrate a pressure gauge, an accurately known sample pressure is introduced to the gauge under test and the response of the gauge is then observed. In order to create this accurately known pressure, the valve of the apparatus is closed and a known weight is placed on the platform above the piston. By operating the plunger, fluid pressure is applied to the other side of the piston until the force developed is enough to lift the piston-weight combination. When this happens, the piston-weight combination floats freely within the cylinder between limit stops. In this condition of equilibrium, the pressure force of the fluid is balanced against the gravitational force of the weights plus the friction drag on the piston. The pressure caused by the weights placed on the platform is calculated using the area of the piston.
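The equilibrium condition described above (weight force balanced by fluid pressure acting over the piston area) can be sketched numerically. The 10 kg / 10 mm figures below are illustrative assumptions, not values from the document:

```python
import math

def dead_weight_pressure(mass_kg, piston_diameter_m, g=9.81):
    """Calibration pressure P = m*g / A for a piston of the given diameter (Pa).

    Friction drag on the piston is neglected in this simple sketch.
    """
    area_m2 = math.pi * (piston_diameter_m / 2) ** 2
    return mass_kg * g / area_m2

# e.g. 10 kg of weights floating on a 10 mm diameter piston:
p = dead_weight_pressure(10.0, 0.010)
print(f"{p / 1000:.1f} kPa")  # ~1249 kPa
```

A gauge connected to the tester in this condition should read this computed pressure; any deviation is the calibration error to be corrected.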
Now the pressure gauge to be calibrated is fitted at an appropriate place on the dead weight tester. The valve in the apparatus is then opened so that the fluid pressure P is transmitted to the gauge, which makes the gauge indicate a pressure value. This pressure value shown by the gauge should be equal to the known input pressure P. If the gauge indicates some other value, the gauge is calibrated by adjusting it so that it reads a value equal to the input pressure. The dead weight tester is used to calibrate all kinds of pressure gauges and a wide range of pressure measuring devices. It is simple in construction and easy to use.

10. Strain Gauge Pressure Transducer: The strain gauge, as explained in Lesson 16, is a fine wire which changes its resistance when mechanically strained. A strain gauge may be attached to the diaphragm so that when the diaphragm flexes due to the process pressure applied on it, the strain gauge stretches or compresses. This deformation of the strain gauge causes a variation in its length and cross-sectional area, due to which its resistance changes. The small change in resistance that occurs in the strain gauge is measured using a Wheatstone bridge. Figure 14 shows the null-type bridge circuit. The strain gauge represents the resistance R4, whose value depends upon the physical variable being measured. Under balanced conditions:

R4 = R2 (R3 / R1)

The ratio of resistors R3 and R1 is fixed for a particular measurement. The bridge is balanced by varying the value of resistor R2. Thus, if three resistances are known, the fourth may be determined.

11. Capacitive Pressure Transducer: The capacitance between two metal plates changes if the distance between the plates changes. A variable-capacitance pressure transducer is shown in the figure. The movable plate in the capacitor is the diaphragm. When pressure is applied on the diaphragm, it deflects and changes its position, due to which the distance between the plates is changed.
The change in capacitance between a metal diaphragm and a fixed metal plate is measured and calibrated to the change in pressure. These pressure transducers are generally very stable and linear, and they can withstand vibrations. But they are sensitive to high temperatures and are more complicated to set up than most pressure sensors. Their performance is also affected by dirt and dust, as these change the dielectric constant.

12. Potentiometric Pressure Sensors: The potentiometric pressure sensor provides a simple method for obtaining an electrical output from a mechanical pressure gauge. The device consists of a precision potentiometer whose wiper arm is mechanically linked to a Bourdon or bellows element. The movement of the wiper arm across the potentiometer converts the mechanically detected sensor deflection into a resistance measurement, using a Wheatstone bridge circuit. Potentiometric pressure sensors drive a wiper arm on a resistive element. The sensor consists of a potentiometer (a variable resistance) which is made by winding resistance wire around an insulated cylinder. A movable electrical contact, called a wiper, slides along the cylinder, touching the wire at one point on each turn. The position of the wiper determines the resistance between the end of the wire and the wiper. A potentiometric pressure sensor has a Bourdon tube as the detecting element that moves the wiper. As the wiper moves, the change in resistance between the terminals is equivalent to the pressure sensed by the Bourdon tube. These devices are simple and inexpensive. Resistance can easily be converted into a standard voltage or current signal. They also provide a strong output that can be read without additional amplification. This permits them to be used in low-power applications. They are, however, used in low-performance applications, such as dashboard oil pressure gauges.
For reliable operation the wiper must bear on the element with some force, which leads to repeatability and hysteresis errors. These sensors also have finite resolution: as the wiper moves from one turn of wire to the next, the resistance jumps from one value to the next. Errors also develop from mechanical wear of the components and of the contacts. Each time the wiper makes and breaks contact with a turn of wire, it generates an extra electrical signal, which is called noise. The noise added to the standard electrical signal makes the signal harder to interpret, and the amount of noise grows as the potentiometer wears out. To reduce noise, some potentiometers are made by depositing a resistive material on a non-conducting ceramic surface. The wiper moves over this surface just as in a wire-wound potentiometer, but the resistance changes continuously rather than in increments, and there is less electrical noise. 13. Inductive Pressure Transducer: An inductive pressure transducer is an electronic device that converts pressure into an electrical signal through the principle of inductance. This type of transducer is commonly used in industrial and automotive applications where precise and reliable pressure measurements are required. Inductive pressure transducers use a coil whose electrical inductance is affected by the position of a movable core or diaphragm that moves in response to pressure. The movement of the core alters the inductance of the coil, which is then converted into an electrical signal proportional to the applied pressure. 14. Barometer: A barometer is an instrument used to measure atmospheric pressure, which is essential for weather forecasting, altitude determination, and various scientific applications. The barometer has been a fundamental tool in meteorology since its invention in the 17th century by Evangelista Torricelli.
Barometers operate on the principle that atmospheric pressure exerts a force on a fluid or mechanical element, and this force can be measured to determine the pressure. The two most common types of barometers are the mercury barometer and the aneroid barometer. A mercury barometer consists of a glass tube sealed at one end and open at the other, with the open end immersed in a reservoir of mercury. The tube is typically about 1 meter (about 3 feet) long. When the tube is inverted in the mercury reservoir, atmospheric pressure pushes down on the mercury in the reservoir, forcing mercury up the tube. The height of the mercury column is proportional to the atmospheric pressure: a higher column indicates higher atmospheric pressure, and a lower column indicates lower pressure. The height is measured in millimeters or inches of mercury (mmHg or inHg). Standard atmospheric pressure at sea level is defined as 760 mmHg. An aneroid barometer does not use liquid. Instead, it contains a small, flexible metal box called an aneroid cell, which is partially evacuated of air, creating a vacuum inside. Changes in atmospheric pressure cause the aneroid cell to expand or contract. This movement is mechanically linked to a needle that moves across a calibrated scale to indicate the pressure. The reading is usually in units such as hectopascals (hPa), millibars (mbar), or inches of mercury (inHg). STANDARDS The American Society of Mechanical Engineers (ASME) has developed two separate and distinct standards on pressure measurement, B40.100 and PTC 19.2. B40.100 provides guidelines on dial-type and digital pressure indicating gauges, diaphragm seals, snubbers, and pressure limiter valves. PTC 19.2 provides instructions and guidance for the accurate determination of pressure values in support of the ASME Performance Test Codes.
The choice of method, instruments, required calculations, and corrections to be applied depends on the purpose of the measurement, the allowable uncertainty, and the characteristics of the equipment being tested. The methods for pressure measurement and the protocols used for data transmission are also provided, together with guidance for setting up the instrumentation and determining the uncertainty of the measurement. Information regarding instrument type, design, applicable pressure range, accuracy, output, and relative cost is provided, as is information on pressure-measuring devices used in field environments, i.e., piston gauges, manometers, and low-absolute-pressure (vacuum) instruments. These methods are designed to assist in the evaluation of measurement uncertainty based on current technology and engineering knowledge, taking into account published instrumentation specifications and measurement and application techniques. This Supplement provides guidance in the use of methods to establish the pressure-measurement uncertainty. US ASME Standards: B40.100-2013, Pressure Gauges and Gauge Attachments; PTC 19.2-2010, Performance Test Code for Pressure Measurement. CALIBRATION OF A PRESSURE GAUGE Regular calibration of pressure gauges is essential for maintaining measurement accuracy, ensuring process safety, and complying with industry standards such as ISO, ANSI, or ASME. By following a systematic calibration procedure, you can identify and correct any inaccuracies in pressure gauges, ensuring reliable and precise pressure measurements in your applications. 1. Prepare the gauge that requires calibration and gather all the necessary equipment, such as a pressure source and connection accessories (fittings, hoses, and adapters) to connect the gauge to the reference standard (e.g., a deadweight tester or a precision digital pressure calibrator).
Bring a data recording device such as a notepad, calibration software, or spreadsheet to record the readings. To ensure safety, wear appropriate personal protective equipment (PPE) as needed. Verify that the pressure gauge is disconnected from the process line and depressurized, and check that the pressure source and reference standard cover the range of the gauge being calibrated. 2. Securely connect the pressure gauge to the calibration setup. Ensure all connections are tight to prevent leaks. Connect the reference pressure standard to the same setup, in parallel with the gauge being calibrated so that both see the same pressure. Remove any air or gas trapped in the system by bleeding it through a bleed valve or similar device; this is especially important in liquid-filled systems to avoid measurement errors. 3. Before applying pressure, check that the gauge needle is at zero (or the lowest value) when the system is not pressurized. If it is not at zero, note the deviation or adjust it if the gauge has an adjustable zero. Gradually apply pressure using the pressure source. Start at 0% of the gauge's full scale and increase in known increments (e.g., 25%, 50%, 75%, 100% of full scale). At each pressure point, allow the system to stabilize and then record the reading from both the reference standard and the pressure gauge. Record readings as you increase pressure (upward calibration) and as you decrease pressure back to zero (downward calibration) to check for hysteresis. Apply the maximum rated pressure of the gauge and record the reading, then slowly decrease the pressure back to zero, recording at the same intervals. 4. For each pressure point, compare the gauge reading to the reference standard reading and calculate the error. The difference between these two readings is the gauge's error at that point. Determine whether the gauge's error is within the acceptable tolerance for your application.
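The per-point error and hysteresis calculation in step 4 can be sketched as follows. All readings and the acceptance tolerance below are invented for illustration.

```python
# Sketch of the step-4 comparison, under assumed data (all values in psi):
# compute each point's error against the reference standard, and hysteresis
# as the disagreement between the upward and downward runs at that point.

reference  = [0.0, 25.0, 50.0, 75.0, 100.0]   # reference standard readings
gauge_up   = [0.2, 25.4, 50.6, 75.5, 100.3]   # gauge, increasing pressure
gauge_down = [0.5, 25.9, 51.0, 75.8, 100.3]   # gauge, decreasing pressure

for ref, up, down in zip(reference, gauge_up, gauge_down):
    error_up = up - ref                # gauge error on the upward run
    hysteresis = abs(up - down)        # up/down disagreement at this point
    print(f"{ref:6.1f} psi: error {error_up:+.2f}, hysteresis {hysteresis:.2f}")

# The worst-case error over both runs decides pass/fail against the tolerance.
worst = max(abs(g - r) for r, g in zip(reference + reference,
                                       gauge_up + gauge_down))
tolerance = 1.0  # assumed acceptance limit, psi
print("PASS" if worst <= tolerance else "FAIL")
```

In practice the tolerance would come from the gauge's accuracy class or the application's requirements, as the next sentence notes.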
Tolerances are typically defined by industry standards or specific application requirements. 5. If the gauge is adjustable and the error is outside the acceptable range, adjust the gauge according to the manufacturer’s instructions. Typically, adjustments can be made via a screw or knob on the back or side of the gauge. After making adjustments, repeat the calibration process to verify that the gauge now reads within the acceptable tolerance. 6. Document the original readings, the reference standard readings, any adjustments made, and the final readings after calibration. Attach a calibration label to the gauge with the date of calibration, next due date, and the name of the person who performed the calibration. 7. Once calibrated, reinstall the gauge into the system if it was removed. Monitor the gauge during initial operation to ensure it functions correctly in the system. PROCEDURE TO READ PRESSURE FROM A PRESSURE GAUGE Reading pressure from a pressure gauge is a straightforward process, but it's important to follow the correct procedure to ensure an accurate reading and to avoid damaging the gauge or the system it's attached to. 1. Depending on the environment and the fluid being measured, wear appropriate safety gear, such as gloves, goggles, or protective clothing. Ensure the system is stable and not undergoing rapid pressure changes that could affect the reading. If applicable, vent or bleed any trapped air from the pressure line to prevent erroneous readings. 2. Locate the pressure gauge on the system. Ensure that it is the correct gauge for the system or process you are monitoring, especially regarding the pressure range. 3. Visually inspect the gauge for any signs of damage, such as cracks in the glass, bent needles, or leakage around the fittings. Ensure the gauge is within its calibration date and is suitable for the pressure range you expect to measure. 4. 
Stand directly in front of the gauge to avoid parallax error (misreading the needle due to viewing at an angle). Look at the needle or pointer on the gauge dial; it should be stable and not fluctuating rapidly. Note where the needle points on the scale. The scale is usually marked in units such as psi (pounds per square inch), bar, kPa (kilopascals), or other units depending on the gauge. If the needle falls between two marks on the scale, estimate the value between those marks; most gauges have major and minor graduations to assist with this. 5. Write down the pressure reading and the units. Note the time and date if the reading is part of a routine check or a log. 6. If possible, take a second reading to confirm the first. This is especially important in critical applications. 7. If you had to change the system's status (e.g., open or close valves) to read the pressure, return the system to its normal operating condition. 8. If the reading is abnormal, or if the gauge shows signs of wear or damage, report this immediately for maintenance or calibration. If the gauge needs to be calibrated, follow the appropriate procedures or notify the responsible personnel. For an accurate reading, consider the effect of temperature, as extreme temperatures can affect the accuracy of the reading; some gauges are temperature-compensated. Ensure that the gauge is appropriate for the pressure range of the system; ideally, the normal operating pressure should fall within the middle third of the gauge's range. Pressure gauges should also be regularly calibrated to maintain accuracy. A poorly calibrated gauge can lead to incorrect readings and potentially unsafe conditions.
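Since gauge dials may be marked in psi, bar, or kPa, a small conversion helper is handy when logging readings in a single unit. The conversion factors below are standard; the function name is our own.

```python
# Convert a gauge reading among the common units mentioned above, by going
# through kPa as an intermediate unit.

KPA_PER_UNIT = {
    "kpa": 1.0,
    "bar": 100.0,      # 1 bar = 100 kPa exactly
    "psi": 6.894757,   # 1 psi is approximately 6.894757 kPa
}

def convert_pressure(value, from_unit, to_unit):
    """Convert a pressure reading between psi, bar, and kPa."""
    kpa = value * KPA_PER_UNIT[from_unit.lower()]
    return kpa / KPA_PER_UNIT[to_unit.lower()]

print(f"100 psi = {convert_pressure(100, 'psi', 'bar'):.3f} bar")
print(f"1 bar   = {convert_pressure(1, 'bar', 'psi'):.2f} psi")
```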
