PCC-ME 304 - Measurements and Metrology

Module I: Measurement Purpose and Parameters

Parameters - geometry (straightness, flatness, roundness, etc.), displacement, force, speed, torque, flow, level, pressure, temperature, acceleration, etc.; Definitions: accuracy, precision, range, resolution, uncertainty and error sources; Regression analysis.

RATIONALE/ABSTRACT: This chapter explores the fundamental aspects of measurement parameters, beginning with geometric properties such as straightness, flatness and roundness. It also explains additional geometric parameters such as cylindricity, profile, parallelism, perpendicularity, symmetry, concentricity and angularity, and introduces dynamic variables such as displacement, distance, force, speed, torque and temperature. It discusses the significance of these parameters in various fields, emphasizing their importance in engineering, science and industrial processes. The chapter then defines key concepts including accuracy, precision, range, resolution, uncertainty and sources of error, and sheds light on their implications for data reliability and interpretation. Finally, it presents regression analysis as a statistical tool for exploring relationships between variables.

1.1. INTRODUCTION

Measurements are part of our daily life: through measurement, one can communicate unambiguously with a user, an industrialist, or any other interested person. The basic measurements are mass, length and time, and these basic measurements help estimate other, secondary parameters. The quantities measured vary with the field of work; measurement in chemistry, for example, involves pH, concentration, and so on.

A measurement of length can be made using an ordinary scale, a vernier caliper, a micrometer or other special equipment. A meter scale reads in centimeters or millimeters (10⁻² m to 10⁻³ m). A vernier caliper can resolve below 1 mm (down to about 10⁻⁴ m). A micrometer, screw gauge or spherometer can resolve 0.01 mm (10⁻⁵ m). For lengths below these ranges (10⁻² to 10⁻⁵ m), special instruments, including electronically operated devices with nanometer resolution, are used; nanometer-scale measurements cover atoms, molecules and microorganisms smaller than 10⁻⁶ m. Similarly, small-scale mass measurement is used in air- and water-pollution monitoring. Pollution means the amount of contaminating substance present per unit volume of air or water; for example, the arsenic content of water must be less than 5 × 10⁻⁹ gram per litre, since higher concentrations are harmful. Time measurement is also very critical in certain applications and must be very accurate: in a 100 m race, an athlete can lose a medal by a margin of 0.01 s. All such parameters must therefore be measured very sensitively, and these large-scale and small-scale length, mass and time measurements are performed by a variety of instruments and measuring devices.

1.2. GEOMETRY PARAMETERS

Geometry parameters such as straightness, flatness and roundness are essential for an error-free final machine or machine component. In the modern world especially, high accuracy is expected on the edges and surfaces of machine components. A machine is an assembly of a number of components, and each component bears a definite relation to the others. The surfaces of components that join each other are called mating surfaces, and their dimensions are called mating dimensions.
If the edges or surfaces of components are made inaccurately, the expected fit between the mating parts (more or less freedom in their relative movement, or tightness in a fixed assembly) may not be achieved. Such failures in individual components produce an inaccurate machine or final assembly. In any machine or final assembly, the manufacturer must inspect the following dimensional parameters:

(i) dimensional accuracy;
(ii) accuracy of geometrical form (surface macro-geometry): flat, cylindrical, tapered or spherical surfaces, or combinations of these;
(iii) surface waviness;
(iv) geometric structure, which covers surface roughness and micro-geometry.

Various metrology tools and techniques are employed to measure and verify straightness, flatness and roundness. These include coordinate measuring machines (CMMs), laser scanners and optical comparators. Quality control typically compares the measured geometric features against specified tolerances to ensure compliance with design requirements. Statistical process control (SPC) methods may also be used to monitor and maintain the consistency of geometric features during production.

1.2.1. Straightness or alignment

The shortest distance between two points is a straight line. In practice, an exact straight line may not be achievable, or a straight path may not exist in a physical sense, so the linearity of a nominally straight line can be in doubt; this is resolved using a length-measuring instrument. Straightness is shown symbolically and as a tolerance zone in Fig. 1.1 (a) and (b), respectively.

Fig. 1.1. Symbolic and tolerance-zone representation of straightness

The straightness of a line or edge is measured by its parallelism with a straight edge of known accuracy, or by direct contact with a tool of calibrated straightness; in short, it states how accurately straight a target is. Gauge blocks can be used to measure straightness. The gauge block was invented in 1896 by Carl Edvard Johansson, a Swedish machinist. It is a ceramic or metal block (Fig. 1.2) used for producing precision lengths; each block is made to a specific standard thickness and/or length, and gauge blocks are used as references for standard length measurement and inspection.

Fig. 1.2. Gauge block

Straightness measures how closely the elements of a surface or line follow a straight path. For a shaft or rod, for example, straightness assesses how much the axis of the shaft deviates from being perfectly straight. In manufacturing, straightness is crucial for components such as shafts, rails and guides, where any deviation from straightness can lead to operational issues such as misalignment or excessive friction.
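As a numerical illustration of how a straightness error can be evaluated from probe readings, the sketch below fits a least-squares reference line to heights sampled along an edge and reports the peak-to-peak deviation. This is one common evaluation approach, not a method prescribed by this chapter, and the probe readings are purely illustrative:

```python
import numpy as np

def straightness_error(x, y):
    """Least-squares straightness: fit a reference line to probe
    readings y taken at positions x, then return the peak-to-peak
    deviation of the points from that line."""
    slope, intercept = np.polyfit(x, y, 1)        # best-fit reference line
    residuals = y - (slope * x + intercept)       # deviation from the line
    return residuals.max() - residuals.min()      # total straightness error

# Illustrative probe readings (mm) along a 200 mm edge
x = np.linspace(0, 200, 9)
y = np.array([0.000, 0.004, 0.007, 0.006, 0.010, 0.008, 0.005, 0.003, 0.001])

print(f"straightness error = {straightness_error(x, y):.4f} mm")
# The result is then compared against the straightness tolerance on the drawing.
```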
1.2.2. Flatness

Flatness is a measure of the geometric concept of a plane or machined surface, and it is a precondition for the parallelism of nominally flat surfaces. Flatness refers to the deviation of a surface from being perfectly flat; it measures how uniformly the surface lies relative to a reference plane. It can be measured by a mechanical device, or by a combination of mechanical devices and optical instruments. Such controlled flatness is required on many machine surfaces so that their mating with other surfaces can be assured. The symbol and its representation are shown in Fig. 1.3 (a) and (b), respectively; Fig. 1.3 (b) specifies flatness over a length of 200 units with a tolerance value of 0.1 unit.

The flatness tolerance zone ensures that any deviation from a perfectly flat surface stays within acceptable limits: in Fig. 1.3 (b), the flatness tolerance (the distance between the two parallel planes bounding the surface) must not exceed 0.1 unit. Examples: a carriage mated on a lathe bed, or a seal mated between two surfaces to avoid oil leakage.

Fig. 1.3. (a) Symbol of flatness and (b) its representation with tolerance

The factors affecting flatness are:
(i) the surface area of the machine component;
(ii) its interrelation with other surfaces;
(iii) the expected measuring accuracy;
(iv) the size and shape of the component, etc.

Deficiency in flatness occurs because of deflection. While measuring flatness, the gauge block (or inspection bar) and the surface to be calibrated must rest on a surface of inspection grade to avoid deflection. In manufacturing, flatness is essential for components such as bearing surfaces, sealing surfaces and mounting surfaces, where proper contact and sealing are required. On a machining table or mounting plate, for example, flatness ensures that workpieces are held securely and machined accurately.

1.2.3. Roundness (or circularity)

Roundness, also called circularity, refers to the deviation of a shape from being perfectly round: it assesses how closely the surface of a part approaches a perfect circle. In 2D geometry, roundness quantifies how far a profile departs from a true circle, that is, its out-of-roundness. The symbol of roundness is shown in Fig. 1.4 (a) and its representation in Fig. 1.4 (b). The tolerance value of 0.03 denotes the allowable deviation from a perfect circle: the maximum acceptable difference between the actual shape of a circular object and its ideal geometric form, a quantitative measure of deviation.

In manufacturing, roundness is critical for components such as bearings, pistons and cylindrical shafts, where rotational symmetry is essential for proper functioning. Achieving roundness involves ensuring uniformity of diameter along the entire circumference of a cylindrical or circular part.

Fig. 1.4. (a) Symbol of roundness and (b) its representation
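To make the roundness evaluation concrete, the sketch below fits a least-squares reference circle to measured profile points and reports the spread of radial deviations. The least-squares circle is only one of several reference circles used in practice (minimum-zone and minimum/maximum inscribed circles are others), and the data here are synthetic:

```python
import numpy as np

def roundness_error(x, y):
    """Least-squares-circle roundness: fit a reference circle to measured
    profile points (x, y), then return the spread of radial deviations."""
    # Linearized circle fit: x^2 + y^2 = 2a*x + 2b*y + c, centre at (a, b)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radii = np.hypot(x - a, y - b)          # radial distance of each point
    return radii.max() - radii.min()        # out-of-roundness

# Synthetic profile: nominal 10 mm radius with a small three-lobe form error
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
r = 10 + 0.01 * np.sin(3 * theta)
x, y = r * np.cos(theta), r * np.sin(theta)

print(f"roundness error = {roundness_error(x, y):.4f} mm")
```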
1.2.4. Additional parameters

Apart from straightness, flatness and roundness, the additional geometric parameters discussed below contribute to the overall quality, functionality and performance of manufactured parts and assemblies. They are essential for meeting design specifications, ensuring proper fit and alignment, and optimizing the performance and reliability of engineered products.

1.2.4.1. Cylindricity

Cylindricity refers to the overall deviation of a surface of revolution from a perfect cylinder: it measures how closely the surface of the part approaches a true cylindrical form. This parameter is essential for components such as pistons, hydraulic cylinders and shafts, where precise circularity along the entire length of the part is critical for proper functionality and fit. The symbol and its representation are shown in Fig. 1.5 (a) and (b), respectively. Cylindricity with a tolerance value of 0.03 unit requires that the entire actual surface lie within a tolerance zone of that width about the perfect cylindrical form. Ensuring precise circularity along the whole length of such components is essential to the function of the final product: cylindricity lets these parts run smoothly within their respective systems without causing friction or misalignment issues.

Fig. 1.5. (a) Symbol of cylindricity and (b) its representation

1.2.4.2. Profile

Profile refers to the deviation of a surface from its true shape or form; it evaluates the contour or outline of a part relative to a reference shape. Line profile and surface profile are the two main types of profile measurement, as shown in Fig. 1.6. A line profile measures dimensional characteristics along a specified linear path, whereas a surface profile encompasses the three-dimensional shape and texture of an entire surface. Line-profile measurements focus on individual features and are crucial for dimensional verification and geometric analysis, while surface-profile measurements provide comprehensive insight into overall surface quality, form and texture, essential for assessing surface finishes and detecting defects. Profile measurements are used in manufacturing complex parts with intricate shapes, such as turbine blades, propeller blades, gears and impellers; profile control ensures that components meet specified dimensional requirements and geometric tolerances. Profile measurements are used extensively in industries such as aerospace, automotive and precision engineering for quality control and inspection of complex components. In the aerospace industry, for example, profile measurements are crucial for ensuring the precise contours and dimensional accuracy of turbine blades and engine components.

Fig. 1.6. Profile of a surface and a line

1.2.4.3. Parallelism

Parallelism measures the deviation between two parallel surfaces or features: it assesses how uniformly spaced two surfaces are from each other along their entire length or width. Parallelism is critical where parts must fit together or move relative to each other with minimal clearance or interference. The symbol and representation are given in Fig. 1.7 (a) and (b), respectively. The tolerance value of 0.001 unit indicates the precision required in aligning the geometric features; achieving parallelism within such a narrow tolerance is essential for the performance and functionality of mechanical systems and components. Examples: machine-tool beds, guide rails and gear teeth.

Fig. 1.7. (a) Symbol of parallelism and (b) its representation

1.2.4.4. Perpendicularity

Perpendicularity evaluates the deviation of a surface or feature from a true right angle (90°) relative to a reference plane; it describes the orientation of one surface perpendicular to another, reference surface. It can be measured with a height gauge, provided the gauge is aligned truly perpendicular to the reference, and the whole surface must be checked to confirm perpendicular behaviour. This parameter is essential for the proper alignment and assembly of components such as mounting surfaces, mating parts and precision fixtures: parts with perpendicular surfaces must be aligned accurately to ensure proper fit and functionality. The symbol of perpendicularity is shown in Fig. 1.8 (a) and its representation in Fig. 1.8 (b).
The tolerance value of 0.001 unit indicates the allowable deviation from a perfect right angle in the given geometric feature; the specification ensures that perpendicularity is maintained within this precise limit. Note that perpendicularity does not control the angle of the referenced feature. Example: a buffer stop, used on railways to stop rail vehicles at the end of a track.

Fig. 1.8. (a) Symbol and (b) representation of perpendicularity

1.2.4.5. Symmetry

Symmetry measures the uniformity of the distribution of mass or material around a central axis or plane: it assesses how well balanced a part is relative to its intended design. Symmetry is important in manufacturing; components with rotational symmetry, such as rotating shafts, impellers and turbine rotors, must always maintain their symmetry for good performance and long life. A symmetrical distribution of mass or material around the central axis ensures balanced operation and minimizes vibration, wear and stress concentrations. The symbol and its representation are given in Fig. 1.9 (a) and (b), respectively. Symmetry with a tolerance value of 0.02 unit indicates the permissible deviation from perfect symmetry: the maximum allowable difference between corresponding dimensions on opposite sides of the symmetry axis.

Fig. 1.9. (a) Symbol and (b) representation of symmetry

1.2.4.6. Concentricity

Concentricity evaluates the alignment of two or more cylindrical features on a common axis: it assesses how well the centers of the different circular elements coincide. Concentricity is critical for components such as bearings, shafts, bushings and seals, where precise alignment and coaxiality are necessary for smooth operation, minimal wear and efficient power transmission in rotating assemblies. The symbol and representation are shown in Fig. 1.10 (a) and (b), respectively. A tolerance value of 0.25 unit indicates the permissible deviation from perfect alignment, allowing variations in concentricity within the specified range.

Fig. 1.10. (a) Symbol and (b) actual representation of concentricity

1.2.4.7. Angularity

Angularity measures the deviation of a surface or feature from a specified angle or angular orientation relative to a reference plane or axis. It is important for the proper fit and function of parts with angled surfaces or features, such as mating components, bevelled edges and machined angles; angularity control ensures that parts with specified angular orientations fit together accurately and function properly. Figs. 1.11 (a) and (b) show the symbol and representation of angularity. Angularity with a tolerance value of 0.01 unit specifies the permissible deviation from the intended angle, ensuring precise alignment and conformity to the specified angular requirement. The sine bar (Fig. 1.12) is used to measure angularity.

Fig. 1.11. (a) Symbol and (b) representation of angularity

Fig. 1.12. Sine bar
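To make the sine-bar measurement concrete: with gauge blocks of combined height h placed under one roller of a sine bar whose roller-center distance is L, the angle set is θ = sin⁻¹(h/L). A minimal sketch of this standard relation follows; the 200 mm bar length and block height are illustrative values, not taken from Fig. 1.12:

```python
import math

def sine_bar_angle(gauge_height_mm: float, bar_length_mm: float) -> float:
    """Angle set by a sine bar: theta = asin(h / L), returned in degrees."""
    return math.degrees(math.asin(gauge_height_mm / bar_length_mm))

# Illustrative: a 200 mm sine bar raised on a 34.73 mm gauge-block stack
print(f"angle = {sine_bar_angle(34.73, 200.0):.3f} deg")   # ~10.000 deg
```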
Other geometrical parameters include:

Length: the extent of an object, or the distance between two points, typically measured in units such as meters, centimeters or inches.
Width: the extent of an object from side to side, perpendicular to its length.
Height: the vertical extent of an object from its base to its top.
Diameter: the distance across the widest part of a circle or cylinder.
Angle: the measure of separation between two intersecting lines or surfaces, typically measured in degrees.
Surface area: the total area of the exposed surface of an object, calculated by adding the areas of its individual faces.
Volume: the amount of space occupied by an object, typically measured in cubic units such as cubic meters or cubic centimeters.

1.2.5. Applications of parameters

These parameters are essential in a wide range of industries, including automotive, aerospace, precision engineering and medical devices. In automotive manufacturing, for instance, straightness, flatness and roundness are crucial for the proper fit and function of engine components, transmission parts and chassis components. In aerospace applications, precise geometric tolerances are necessary to ensure the structural integrity and performance of aircraft components such as turbine blades, wing sections and fuselage panels. In medical-device manufacturing, components such as implants and surgical instruments require high precision in straightness, flatness and roundness to ensure compatibility, functionality and patient safety.

1.3. PHYSICAL PARAMETERS

Physical parameters are measurable characteristics or properties of a physical system, substance or phenomenon; they describe various aspects of its physical nature. Physical parameters are used in scientific research, engineering, industry and everyday life to describe, analyze and understand the behaviour and properties of materials, objects and systems. They play a crucial role in fields such as physics, chemistry, biology, engineering, environmental science and medicine, which are discussed in this section.

1.3.1. Displacement

Displacement is a fundamental physical quantity that represents the change in position of an object from its initial location to its final location. It is a vector quantity, meaning it has both magnitude and direction. Displacement is generally denoted S or Δx. In one-dimensional motion, displacement is measured along a straight line, while in two- or three-dimensional motion it accounts for changes in position along multiple axes:

Displacement, Δx = xf − xi    (1.1)

where xf is the value of the final position and xi is the value of the initial position.
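A minimal numerical sketch of Eq. (1.1), extended to the multi-axis case mentioned above by subtracting position vectors; the coordinate values are illustrative:

```python
import numpy as np

# One-dimensional displacement, Eq. (1.1): delta_x = x_f - x_i
x_i, x_f = 2.0, 7.5                      # initial and final positions (m)
print(f"1-D displacement = {x_f - x_i:.2f} m")

# In 2-D or 3-D the same subtraction is applied per axis,
# giving a vector with both magnitude and direction.
r_i = np.array([1.0, 2.0, 0.0])          # initial position (m)
r_f = np.array([4.0, 6.0, 0.0])          # final position (m)
d = r_f - r_i                            # displacement vector
print(f"vector displacement = {d}, magnitude = {np.linalg.norm(d):.2f} m")
```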
1.3.1.1. Importance of Displacement

Displacement measurements are crucial for understanding how objects move and where they are located. By tracking changes in position over time, we can determine the path, speed and acceleration of moving objects; this is key in fields such as physics and engineering, where motion is studied without considering the forces behind it. Displacement measurements are also vital for navigation systems such as the Global Positioning System (GPS) and for robotics, helping vehicles and robots know where they are and how to reach their destinations accurately. In engineering, displacement measurements are essential for analyzing structures and materials: by measuring how things move or deform under different conditions, engineers can assess the strength and stability of buildings, bridges and mechanical systems. These measurements are also crucial for materials testing, where researchers study how materials behave under stress.

1.3.1.2. Measurement Techniques of Displacement

Linear displacement sensors: linear displacement sensors, such as potentiometers, linear variable differential transformers (LVDTs), linear encoders and linear potentiometric sensors, are commonly used to measure one-dimensional displacement along a straight path. These sensors provide direct measurements of linear position and are widely used in industrial automation, machine tools and position-feedback systems.

Rotary displacement sensors: rotary displacement sensors, such as rotary encoders, resolvers and potentiometric rotary sensors, measure angular displacement or rotation about a fixed axis. They are essential for monitoring the position and orientation of rotating shafts, wheels and joints in machinery, robotics and automotive applications.

Non-contact displacement measurement: non-contact techniques, including laser displacement sensors, ultrasonic sensors and optical triangulation sensors, enable contactless measurement of displacement with high accuracy and resolution. They suit applications where physical contact with the object is impractical or would affect the measurement accuracy.

1.3.2. Force

Force is a fundamental physical quantity that represents the interaction between objects that causes a change in motion, such as acceleration or deformation. It is described by Newton's second law of motion, which states that the force acting on an object equals the mass of the object multiplied by its acceleration (F = m × a). Force is a vector quantity and is typically measured in newtons (N).

1.3.2.1. Importance of Force

In fields such as engineering and mechanics, force measurements are very important: they help engineers design and optimize structures and devices by revealing how forces affect their stability, strength and performance. By measuring forces, engineers can also assess the loads, stresses and strains on materials, ensuring they can withstand expected conditions without failing. Furthermore, force measurements are crucial for analyzing dynamic systems, where forces change over time due to motion, vibration or other factors; by studying these dynamic forces, engineers can identify potential issues such as vibration-induced fatigue and design systems to mitigate them, ensuring long-term reliability and safety. Understanding reaction forces, which occur in response to applied forces according to Newton's third law of motion, is also important for designing mechanisms and connections that can carry forces and maintain stability.

1.3.2.2. Measurement Techniques of Force

Load cells: load cells are transducers that convert an applied force into an electrical signal, typically using strain-gauge technology. They are widely used for measuring forces in applications such as materials testing, weighing scales and industrial automation.

Strain gauges: strain gauges are sensors that detect changes in strain, or deformation, in materials under load. From the measured strain, engineers can calculate the applied force using the principles of elasticity and the material properties.
Pressure sensors: pressure sensors measure force indirectly by sensing the pressure exerted by a fluid or gas. They are commonly used in hydraulic systems, pneumatic systems and automotive applications to measure the forces exerted by fluids or gases.

Force plates: force plates are devices used to measure ground-reaction forces in biomechanics and sports science. Often embedded in floors or platforms, they are used to analyze human movement, gait and athletic performance.

1.3.3. Speed

Speed is a measure of how fast an object is moving; speed in a specified direction is called velocity. Speed quantifies the rate at which distance is covered over time and is calculated as the distance travelled divided by the time taken to travel that distance. It is a scalar quantity, meaning it has magnitude only and no direction, unlike velocity, which is a vector quantity.

1.3.3.1. Importance of Speed

Speed is incredibly important in two key areas: transportation and sports. In transportation (cars, trains, planes and ships), speed determines how quickly people and goods can move from one place to another; it is crucial to making travel faster, more efficient and safer. In sports such as running, swimming and cycling, athletes always aim to go as fast as they can to beat their competitors, and speed measurements help coaches and athletes track progress, find areas to improve and refine training. In fluid dynamics, the study of how liquids and gases move, speed measurements help engineers design equipment such as pumps and turbines to work better. In aerospace engineering, where aircraft and rockets fly very fast, speed measurements help ensure these vehicles are safe and efficient in the sky. Whether it is getting from A to B, winning a race or designing high-tech machines, speed is key to making things work smoothly and efficiently.

1.3.3.2. Measurement Techniques of Speed

Speedometers: speedometers are instruments used in vehicles to display the instantaneous speed of the vehicle. They typically use mechanical or electronic sensors to measure wheel rotation or vehicle movement and convert it into the speed reading displayed on the dashboard.

Radar guns: radar guns use the Doppler effect to measure the speed of moving objects, such as vehicles on roads or balls in sports like baseball and cricket. They emit radio waves that bounce off the moving object and return to the radar gun, allowing accurate speed measurement.

Global Positioning System (GPS): GPS technology is used to measure the speed of vehicles, ships and aircraft by tracking their movement and position changes over time. GPS systems provide real-time speed data and are widely used in navigation, logistics and fleet-management applications.

1.3.4. Torque

Torque is a measure of the force that can cause an object to rotate about an axis; it is often described as the rotational equivalent of force. Represented by the symbol τ, it is calculated as the product of the applied force and the distance from the axis of rotation to the point of application of the force. Mathematically, τ = r × F, where r is the distance from the axis of rotation to the point of application and F is the applied force perpendicular to the distance vector. Torque is typically measured in newton-meters (N·m) or pound-feet (lb-ft).
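A minimal sketch of τ = r × F as a vector cross product, with illustrative lever-arm and force values; for a force applied at right angles to the lever arm, the magnitude reduces to r·F:

```python
import numpy as np

# Torque as a vector cross product: tau = r x F
r = np.array([0.30, 0.0, 0.0])     # lever arm: 0.30 m along x
F = np.array([0.0, 50.0, 0.0])     # 50 N force along y, perpendicular to r

tau = np.cross(r, F)               # torque vector (N*m), here along z
print(f"torque vector = {tau}, magnitude = {np.linalg.norm(tau):.1f} N*m")
# Perpendicular force, so |tau| = r * F = 0.30 * 50 = 15.0 N*m
```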
Torque plays a critical role in various engineering fields, including automotive engineering, robotics, machinery and manufacturing. In automotive engineering, torque is essential to engine performance, being the rotational force generated at the engine's crankshaft.

1.3.4.1. Measurement Techniques of Torque

Torque wrenches: torque wrenches are hand tools used to apply a specific amount of torque to a fastener such as a bolt or nut. They typically have a calibrated scale or digital display indicating the torque being applied, and are widely used in automotive repair, assembly and maintenance tasks where precise torque application is essential.

Torque sensors: torque sensors are devices that directly measure torque by detecting the twist or deformation of a shaft or other rotating component. These sensors may use strain gauges, piezoelectric elements or optical encoders to convert the mechanical deformation into an electrical signal proportional to the applied torque.

Reaction torque sensors: reaction torque sensors measure torque indirectly by detecting the reaction force exerted on a stationary component when torque is applied to a rotating shaft or member. They are often used where direct measurement of torque is challenging or impractical, such as in-line torque monitoring of rotating machinery or torque measurement in confined spaces.

Rotary torque transducers: rotary torque transducers are precision instruments for accurate, continuous measurement of torque in rotating shafts or systems. They typically consist of a stationary stator and a rotating rotor connected to the shaft or component being measured; the applied torque causes the rotor to twist relative to the stator, generating an electrical signal proportional to the torque.

Strain-gauge techniques: strain-gauge techniques involve attaching strain gauges to a shaft or member to measure the deformation caused by torque. As torque is applied, the gauges register changes in resistance, which can be correlated to the applied torque using calibration curves or mathematical equations. These techniques are widely used in laboratory settings for experimental studies and research involving torque measurement.

Torque meters: torque meters are instruments designed specifically for measuring torque in various applications. They may use different measurement principles, such as strain gauges, magnetoelastic effects or optical sensing, to provide accurate torque readings, and are available in handheld, benchtop and in-line configurations.

1.3.5. Distance

Distance is the length of the path travelled by an object, measured in meters (m) or other length units. It is a scalar quantity, meaning it has magnitude only and no direction. The distance between the points (x1, y1) and (x2, y2) is given by

Distance = √((x2 − x1)² + (y2 − y1)²)    (1.2)
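Eq. (1.2) in code, with illustrative coordinates:

```python
import math

def distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    """Euclidean distance between two points, Eq. (1.2)."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)

# 3-4-5 right triangle, so the distance is exactly 5.00
print(f"distance = {distance((1.0, 2.0), (4.0, 6.0)):.2f} m")
```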
1.3.5.1. Importance of Distance

Distance is fundamental for describing the spatial separation between two points or the extent of motion along a path. It is used in various fields, including physics, engineering, navigation and geography.

Navigation and mapping: distance is fundamental to navigation, cartography and mapping. Accurate distance measurements are essential for creating maps, determining travel routes and navigating unfamiliar terrain. From GPS systems in vehicles to digital mapping applications on smartphones, distance plays a crucial role in guiding travellers and ensuring efficient transportation.

Surveying and construction: in the surveying and construction industries, precise distance measurements are critical for laying out property boundaries, designing infrastructure and building structures. Surveyors use advanced instruments such as total stations and laser rangefinders to measure distances accurately, ensuring that construction projects adhere to specifications and regulations.

Engineering and design: distance measurements are indispensable in engineering and design processes across various disciplines. Engineers use distance data to model and simulate complex systems, analyze spatial relationships and optimize product designs.

Science and research: distance measurements are essential for scientific research in fields such as astronomy, geology and physics. Astronomers use distance measurements to study celestial objects and cosmic phenomena, while geologists use distance data to analyze terrain features and geological formations.

Industrial applications: distance measurements play a crucial role in industrial automation, robotics and manufacturing processes. Automated systems use distance sensors and vision systems to control the movement of machinery, robots and conveyor systems precisely.

1.3.5.2. Measurement Techniques of Distance

Distance can be measured using instruments such as rulers, tape measures, odometers, GPS devices and surveying equipment.

Tape measure: one of the simplest and most common distance-measuring tools is the tape measure, a flexible ribbon or metal strip marked with linear graduations (such as inches or centimeters) for direct measurement of distances.

Laser distance measurement: laser distance meters use laser technology to determine the distance to a target object accurately. They emit a laser beam and measure the time the beam takes to reflect off the object and return to the sensor, enabling precise distance calculation (see the time-of-flight sketch after this list).

Ultrasonic distance measurement: ultrasonic devices emit ultrasonic waves that bounce off objects and return to a sensor. By measuring the round-trip travel time of the waves, these devices can calculate distances accurately.

Total station: total stations are advanced surveying instruments used for precise distance and angle measurement in land surveying, construction and engineering projects. They combine electronic distance measurement (EDM) technology with angular measurement to provide accurate three-dimensional coordinates of points.

Global navigation satellite systems (GNSS): GNSS systems, such as GPS (Global Positioning System), use satellites to provide accurate positioning and distance measurement on the Earth's surface. GNSS receivers pick up signals from multiple satellites to determine the receiver's location and calculate distances between points with high precision.

Wheel odometers: wheel odometers, commonly used on vehicles and machinery, measure distance by counting the revolutions of a wheel. Knowing the wheel's circumference, the device calculates the distance travelled from the number of rotations.

Photogrammetry: photogrammetry is a technique that uses overlapping photographs to measure distances and create three-dimensional models of objects or terrain. By analyzing the parallax and perspective distortion in the images, software algorithms can determine distances between points accurately.

Trilateration: trilateration is a geometric technique used in surveying and navigation to determine position from distances, based on the intersection of circles or spheres of known radii. By measuring the distances from multiple reference points, trilateration can pinpoint the location of an unknown point.
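The laser and ultrasonic techniques above share one computation: distance from round-trip time of flight, d = v·t/2, where v is the propagation speed (about 3 × 10⁸ m/s for light, about 343 m/s for sound in air at 20 °C). A minimal sketch with illustrative echo times:

```python
# Time-of-flight ranging: the wave travels out and back,
# so distance = propagation_speed * round_trip_time / 2
SPEED_OF_LIGHT = 3.0e8      # m/s, laser rangefinder
SPEED_OF_SOUND = 343.0      # m/s, ultrasonic sensor in air at ~20 C

def tof_distance(round_trip_s: float, speed_m_per_s: float) -> float:
    return speed_m_per_s * round_trip_s / 2

print(f"laser echo 66.7 ns    -> {tof_distance(66.7e-9, SPEED_OF_LIGHT):.2f} m")
print(f"ultrasonic echo 29 ms -> {tof_distance(29e-3, SPEED_OF_SOUND):.2f} m")
```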
1.3.6. Time

Time is the duration between two events, measured in seconds (s) or other time units; it is the interval during which an action occurs. It is a scalar quantity and is measured in units such as seconds, minutes, hours or days.

1.3.6.1. Importance of Time

Time measurement is essential for describing the sequence of events, the duration of processes and the timing of phenomena. It is used across all scientific disciplines, engineering fields and everyday activities. Navigation systems such as GPS depend on it for accurate positioning. In scientific research, it is crucial for studying celestial events and biological processes. Industries rely on precise timing for efficient operations and productivity, and in communication, synchronized time is vital for accurate data transmission and smooth network functioning.

1.3.6.2. Measurement Techniques of Time

Time can be measured using devices such as clocks, stopwatches, timers, sundials and atomic clocks.

1.3.7. Velocity

Velocity is the rate of change of displacement over time, measured in meters per second (m/s) or other velocity units. It is a vector quantity, meaning it has both magnitude and direction.

1.3.7.1. Importance of Velocity

Velocity describes how fast an object is moving and in which direction; it is crucial for analyzing motion, predicting future positions and designing systems involving moving objects. Velocity is key information for understanding movement in fields such as engineering, sports and environmental science: it helps engineers design safer cars, athletes improve their performance and scientists track environmental changes. In automotive engineering, for example, velocity measurements are essential for assessing vehicle performance and ensuring safe operation. In sports, velocity helps athletes monitor their progress and optimize training. Similarly, in environmental science, velocity measurements provide insight into weather patterns, ocean currents and airflows, aiding the understanding of climate dynamics and environmental impacts. Velocity measurements are also vital for optimizing systems and structures in fluid dynamics and engineering applications: by studying velocity profiles and flow rates, engineers can design efficient pumps, turbines, pipelines and HVAC systems. These measurements likewise play a crucial role in environmental monitoring, allowing scientists to track currents in oceans and rivers, wind speeds in the atmosphere and airflow in urban environments. Overall, velocity is a key parameter for analyzing motion and optimizing processes in engineering, sports and environmental science.

1.3.7.2. Measurement Techniques of Velocity

Velocity can be measured using techniques such as radar guns, Doppler radar, speedometers, GPS devices and velocity sensors.

Speedometers: speedometers are instruments used in vehicles to measure and display instantaneous velocity.
They typically use mechanical or electronic sensors to detect wheel rotation or vehicle movement and convert it into the speed reading displayed on the dashboard.

Radar and lidar: radar and lidar (light detection and ranging) systems are remote-sensing technologies used for velocity measurement in various applications. They emit radio waves or laser pulses and measure the Doppler shift of the reflected signals to determine the velocity of moving objects such as vehicles, aircraft and weather systems (see the sketch after this list).

Anemometers: anemometers are devices used to measure wind velocity and airflow in meteorology, environmental monitoring and HVAC systems. They employ different sensing principles, including cup anemometers, hot-wire anemometers and sonic anemometers, to quantify wind speed and direction accurately.

Doppler ultrasound: Doppler ultrasound is a medical imaging technique that measures blood velocity by detecting the Doppler shift of ultrasound waves reflected from moving blood cells. It is used in diagnostic procedures such as vascular ultrasound to assess blood flow and velocity in arteries and veins.

Particle image velocimetry (PIV): PIV is an optical technique used in fluid-dynamics research to measure velocity fields in fluid flows. The flow is seeded with tracer particles, high-speed cameras capture images of the particle motion, and the velocity vectors are then calculated using image-processing algorithms.

Global navigation satellite systems (GNSS): GNSS systems, such as GPS, provide velocity information by tracking changes in the position of receivers over time. By analyzing the displacement of GNSS receivers relative to the satellite positions, velocity vectors can be calculated to give the speed and direction of movement.
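Several of these instruments (radar guns, Doppler radar, Doppler ultrasound) infer velocity from the Doppler frequency shift. For a wave of transmit frequency f₀ reflected off a target moving at speed v directly toward the sensor, the two-way shift is approximately Δf ≈ 2·v·f₀/c, so v ≈ Δf·c/(2·f₀). A minimal sketch with illustrative traffic-radar numbers:

```python
# Doppler-shift velocity for a reflected (two-way) signal:
# delta_f ~= 2 * v * f0 / c   =>   v ~= delta_f * c / (2 * f0)
C = 3.0e8                    # propagation speed of radio waves (m/s)

def doppler_velocity(delta_f_hz: float, f0_hz: float) -> float:
    """Target speed toward the sensor from the measured frequency shift."""
    return delta_f_hz * C / (2 * f0_hz)

# Illustrative: a 24 GHz traffic radar measuring a 4.45 kHz shift
v = doppler_velocity(4450.0, 24.0e9)
print(f"speed = {v:.1f} m/s  ({v * 3.6:.0f} km/h)")   # ~27.8 m/s, ~100 km/h
```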
1.3.8. Mass

Mass is the amount of matter in an object. It is a scalar quantity and is measured in units such as kilograms (kg), grams, milligrams or pounds.

1.3.8.1. Importance of Mass

Mass determines the inertia of an object and its gravitational interaction with other objects. It is a fundamental property of matter and plays a key role in physics, engineering and chemistry. It is also one of the primary parameters measured directly from a substance and is used to estimate secondary parameters such as force, molar number, density, specific volume, enthalpy and specific heat.

1.3.8.2. Measurement Techniques of Mass

Mass can be measured using instruments such as balance scales, spring scales and mass meters.

Balance scales: traditional balance scales compare the mass of an object with known masses on the opposite side of a beam or lever. They provide accurate mass measurements by achieving equilibrium between the unknown mass and the known masses.

Spring scales: spring scales measure mass by quantifying the force exerted by the object's mass on a spring system. As the mass is placed on the scale, the spring stretches or compresses, and the amount of deformation is proportional to the mass.

Weighing machines: weighing machines, such as electronic balances and weighing platforms, use load cells or strain gauges to measure the force exerted by the object's mass on a support structure. This force is then converted into a mass reading using calibration and conversion algorithms.

Gravitational methods: gravitational methods measure mass indirectly by quantifying the gravitational force exerted on an object. Techniques such as gravimetry and gravimetric analysis involve measuring gravitational acceleration or gravitational attraction to determine mass.

Magnetic levitation: magnetic levitation (maglev) systems use magnetic fields to suspend objects in mid-air and determine their mass from the strength of the magnetic field required for levitation. This technique is used in precision mass-measurement applications such as analytical balances and microbalances.

Nuclear magnetic resonance (NMR): NMR spectroscopy is a technique used in chemistry and biochemistry to determine the mass and molecular structure of compounds by analyzing their magnetic properties in a magnetic field.

Interferometry: interferometric methods, such as optical interferometry and laser interferometry, measure mass indirectly by detecting changes in the interference patterns of light caused by the gravitational influence of mass on space-time.

1.3.9. Energy

Energy is the capacity of a system to do work or produce heat, measured in joules (J) or other energy units. It exists in various forms, including kinetic, potential, thermal, electrical, chemical, nuclear and electromagnetic energy, among many others.

1.3.9.1. Importance of Energy

Energy is essential for describing and understanding physical processes, transformations and interactions; it is the basis of all mechanical, thermal, electrical and chemical phenomena, serving as the fundamental driving force behind motion, chemical reactions and heat transfer. Energy plays a pivotal role in shaping various sectors, from industry and transportation to healthcare and agriculture. Understanding energy sources and consumption patterns provides the basis for addressing challenges such as climate change, energy security and resource depletion; transitioning to renewable energy sources and improving energy efficiency are essential steps toward a greener and more sustainable future for generations to come.

1.3.9.2. Measurement Techniques of Energy

A diverse array of measurement techniques is employed to quantify and analyze energy usage, production and efficiency accurately across different domains. Direct measurement methods involve specialized instruments such as calorimeters and energy meters, which directly quantify the energy exchanged in processes such as chemical reactions and electrical power generation. Indirect measurement techniques, including fuel-consumption analysis and temperature differentials, provide insight into energy consumption and production rates. Remote-sensing technologies, such as satellite imagery and aerial surveys, enable large-scale assessment of energy resources and infrastructure. Energy audits and life-cycle analysis further enhance understanding by evaluating the performance and environmental impact of energy systems, guiding strategic decision-making for sustainable energy management.

1.3.10. Power

Power is the rate at which work (W) is done or energy is transferred, measured in watts (W) or other power units. Power (P) represents the amount of energy transferred or converted per unit time (t):

P = W / t    (1.3)

Since power lacks directionality, it is a scalar quantity. The SI unit of power is the joule per second (J/s), commonly referred to as the watt: one watt is the power required to perform one joule of work in one second. The unit is named after James Watt (1736-1819), the Scottish engineer renowned for his improvements to the steam engine.
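Eq. (1.3) in code, together with the rotational form used by the dynamometers discussed below, P = τω (torque times angular speed); the numbers are illustrative:

```python
import math

# Eq. (1.3): power = work / time
work_j, time_s = 3000.0, 60.0
print(f"P = {work_j / time_s:.1f} W")           # 50.0 W

# Rotational form used by dynamometers: P = tau * omega
tau_nm = 120.0                                  # shaft torque (N*m)
rpm = 3000.0                                    # shaft speed
omega = rpm * 2 * math.pi / 60                  # convert rpm to rad/s
print(f"P = {tau_nm * omega / 1000:.1f} kW")    # ~37.7 kW
```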
1.3.10.1. Importance of Power

Power quantifies how quickly energy is generated, consumed or transferred in various systems, and it is crucial for assessing the performance of machines, engines and electrical devices. It plays a crucial role in various sectors, including industry, transportation and infrastructure: power is essential for driving machinery, generating electricity and powering vehicles, making it indispensable for economic growth, technological advancement and societal well-being. Understanding power generation, distribution and consumption patterns is vital for ensuring energy security, promoting sustainability and mitigating environmental impacts; transitioning to renewable energy sources and improving power efficiency are key strategies for addressing climate change and achieving a more sustainable energy future.

1.3.10.2. Measurement Techniques of Power

Direct measurement methods involve using power meters or wattmeters to measure electrical power consumption or generation. These devices provide accurate real-time readings of power usage and enable monitoring of energy-consumption trends over time. Indirect measurement methods calculate power from other parameters such as force, velocity and energy; a dynamometer, for example, calculates power from the torque and rotational speed of an engine or motor (P = τω). Indirect techniques also include analyzing factors such as fuel-consumption rates, temperature differentials or pressure differentials to infer power generation or consumption. Remote-sensing technologies, such as satellite imagery and aerial surveys, provide valuable data for assessing power infrastructure and energy resources on a large scale. Energy audits and performance evaluations further enhance understanding by identifying opportunities for improving power efficiency and optimizing energy systems. Overall, these measurement techniques play a crucial role in quantifying power usage, optimizing energy efficiency and guiding informed decision-making for sustainable energy management.

1.3.11. Flow

Flow refers to the motion of a fluid or gas, characterized by its velocity, direction, and volume or mass per unit time. It can occur in various forms, including laminar, turbulent and transitional flow, depending on factors such as velocity, viscosity and the geometry of the system. Understanding flow dynamics is crucial for analyzing and optimizing processes involving fluid transport, mixing and distribution.

1.3.11.1. Importance of Flow

Flow measurements are essential in various engineering fields, including chemical, civil, environmental and mechanical engineering. In mechanical engineering, flow measurements are used in fluid-power systems, heating, ventilation and air-conditioning (HVAC) systems and thermal-management applications. In chemical engineering, they are vital for controlling and optimizing processes such as mixing, reaction kinetics and mass transfer in chemical reactors and pipelines. In civil engineering, flow measurements are crucial for designing and managing water-supply systems, drainage systems and irrigation networks. In environmental engineering, they are essential for monitoring and managing water resources, wastewater treatment plants and pollution-control systems.
1.3.11.2. Measurement Techniques of Flow

Differential-pressure flow meters
- Orifice plates: create a pressure drop across a restriction to estimate flow (see the sketch after this list).
- Venturi tubes: similar to orifice plates but more efficient, thanks to a gradual taper.
- Pitot tubes: measure a pressure difference to find flow velocity.
- Flow nozzles: similar to venturi tubes but shorter, for compactness.

Velocity flow meters
- Electromagnetic flow meters: measure voltage changes caused by the fluid flow.
- Ultrasonic flow meters: use sound waves to gauge flow speed.
- Doppler flow meters: detect frequency changes in reflected sound waves.
- Turbine flow meters: employ a spinning rotor to measure flow.
- Vortex-shedding flow meters: count vortices to determine flow rate.

Positive-displacement flow meters
- Piston meters: use a piston to meter fixed fluid volumes.
- Oval-gear meters: measure fluid volumes with rotating oval gears.
- Nutating-disc meters: use a wobbling disc to displace fluid.

Mass flow meters
- Thermal mass flow meters: measure heat transfer to estimate mass flow.
- Coriolis flow meters: use tube oscillations to gauge mass flow.

Open-channel flow measurement
- Weirs and flumes: create known flow restrictions to estimate flow.
- Ultrasonic open-channel flow meters: use ultrasonic sensors to measure fluid level and velocity in open channels.
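To illustrate the differential-pressure principle, the sketch below applies the textbook orifice equation Q = Cd · A₀ · √(2Δp/ρ). This is a simplified form (the velocity-of-approach factor is neglected, and the 0.61 discharge coefficient and pipe values are illustrative assumptions, not data from this chapter):

```python
import math

def orifice_flow(dp_pa: float, bore_d_m: float, rho: float, cd: float = 0.61) -> float:
    """Volumetric flow (m^3/s) through an orifice plate from the measured
    pressure drop, using Q = Cd * A0 * sqrt(2 * dp / rho).
    Simplified: the velocity-of-approach factor is neglected."""
    area = math.pi * bore_d_m**2 / 4           # orifice bore area
    return cd * area * math.sqrt(2 * dp_pa / rho)

# Illustrative: 25 kPa drop across a 50 mm bore carrying water
q = orifice_flow(dp_pa=25e3, bore_d_m=0.050, rho=1000.0)
print(f"Q = {q * 1000:.2f} L/s")               # ~8.47 L/s
```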
1.3.12. Level

Level refers to the height or depth of a substance within a container or vessel. The substance may be a liquid, such as water or oil, or a solid material, such as grain or powder. Level measurement involves determining the position of the substance's surface relative to a reference point, usually the bottom or top of the container. This measurement provides valuable information about the quantity, volume and status of the stored material.

1.3.12.1. Importance of Level

Level measurement is crucial across various industries, including manufacturing, agriculture, chemical processing, wastewater management, and oil and gas production. In manufacturing, level measurement ensures proper inventory management and process control in storage tanks and vessels. In agriculture, it helps monitor grain levels in silos and manage inventory effectively. In wastewater management, it enables the monitoring of sewage levels in treatment plants and pumping stations, ensuring efficient operation and compliance with environmental regulations.

1.3.12.2. Measurement Techniques of Level

Radar level measurement: radar sensors emit microwave signals that reflect off the material's surface and return to the sensor. By analyzing the time the signals take to return, the distance to the material's surface can be determined.

Guided-wave radar (GWR) level measurement: GWR sensors transmit microwave signals along a probe or cable that extends into the material. The signals reflect off the material's surface and return to the sensor, allowing accurate level measurement. GWR suits liquids, solids and interface applications.

Capacitance level measurement: capacitance sensors use changes in capacitance between electrodes to detect changes in the material's level. As the level changes, the capacitance between the electrodes alters, allowing the level to be inferred.

Float level measurement: float sensors use a buoyant float attached to a rod or cable that moves with the material's surface. As the level changes, the float moves accordingly, actuating a switch or sensor to indicate the level.

Pressure level measurement: pressure sensors measure the pressure exerted by the material in a tank or vessel. By converting the pressure into a level reading using calibration curves or equations, the material's level can be determined.

Radiometric level measurement: radiometric sensors use gamma or neutron radiation to penetrate the material and measure its density. By analyzing the attenuation of the radiation, the material's level can be determined.

Ultrasonic level measurement: ultrasonic sensors emit high-frequency sound waves that bounce off the material's surface and return to the sensor. By measuring the time the sound waves take to return, the distance to the material's surface can be calculated. This method suits liquids and solids and works well in challenging environments.

1.3.13. Pressure

Pressure (p) is defined mathematically as the force (F) applied perpendicular to the surface of an object per unit area (A) over which the force is distributed:

Pressure, p = F / A    (1.4)

It quantifies how much force is applied per unit area. The SI unit of pressure is the pascal (Pa), equivalent to one newton per square meter (N/m²). Because the pascal is small, alternative units are commonly used. Kilopascals (kPa) are frequent, especially for atmospheric pressure; in the U.S., pounds per square inch (psi) is prevalent for measuring water pressure. Atmospheric pressure is often expressed in atmospheres (atm) or torr, with 1 atm corresponding to sea-level atmospheric pressure and 1 torr equal to 1/760 of an atmosphere. Meteorologists use millibars (mbar), where 1 mbar equals 100 Pa. Additional units include millimeters of mercury (mmHg), akin to the torr and commonly used in blood-pressure measurement, and the dyne per square centimeter, where 1 dyn/cm² equals 0.1 Pa. Each unit offers a different perspective on pressure measurement, catering to various scientific and practical needs.

1.3.13.1. Importance of Pressure

Pressure plays a critical role across various fields because it quantifies the distribution of force over a given area; its importance spans fluid dynamics, engineering applications, meteorology and beyond. In fluid dynamics, pressure gradients drive fluid flow, governing phenomena such as flow rate, velocity profiles and turbulence. Understanding and accurately measuring pressure is essential for optimizing fluid systems, designing efficient pipelines and ensuring the structural integrity of vessels and equipment in industries ranging from aerospace to manufacturing. Moreover, pressure measurements are indispensable for engineering processes and system monitoring, ensuring safety, efficiency and reliability. In HVAC systems, pressure sensors regulate airflow rates, monitor filter performance and maintain indoor air quality. Similarly, in industrial settings, pressure gauges and transmitters monitor vessel pressures, detect leaks and prevent over-pressure situations, safeguarding equipment and personnel.
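A small sketch converting between the units listed above by routing through the pascal. The mbar, torr and dyn/cm² factors are as given in the text; 1 atm = 101 325 Pa and 1 psi ≈ 6894.76 Pa are standard values:

```python
# Pressure-unit conversions, routed through the pascal
PA_PER_UNIT = {
    "Pa":      1.0,
    "kPa":     1.0e3,
    "mbar":    100.0,            # 1 mbar = 100 Pa
    "atm":     101_325.0,        # sea-level atmosphere
    "torr":    101_325.0 / 760,  # 1 torr = 1/760 atm
    "psi":     6_894.76,
    "dyn/cm2": 0.1,              # 1 dyne per cm^2 = 0.1 Pa
}

def convert(value: float, frm: str, to: str) -> float:
    return value * PA_PER_UNIT[frm] / PA_PER_UNIT[to]

print(f"1 atm  = {convert(1, 'atm', 'kPa'):.3f} kPa")    # 101.325 kPa
print(f"30 psi = {convert(30, 'psi', 'kPa'):.1f} kPa")   # ~206.8 kPa
```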
1.3.13.2. Measurement Techniques of Pressure

Various techniques are employed for pressure measurement, depending on the application requirements and environmental conditions. Common methods include:

Mechanical pressure gauges: these gauges use mechanical elements such as diaphragms, Bourdon tubes or bellows to translate pressure into a mechanical displacement, which is then indicated on a scale.

Electronic pressure transducers: electronic transducers convert pressure into an electrical signal, typically using technologies such as piezoelectric, capacitive or strain-gauge sensing. The electrical signal is then processed and displayed as a pressure reading.

Differential pressure transmitters: these devices measure the pressure difference between two points in a system using two pressure-sensing elements. They are commonly used in HVAC systems, filter monitoring and flow-measurement applications.

Hydraulic pressure sensors: hydraulic pressure sensors measure pressure by transmitting pressure-induced changes in a hydraulic fluid to a sensing element. They suit high-pressure applications and harsh environments.

Piezoresistive pressure sensors: these sensors use the change in resistance of a piezoresistive material under pressure to measure pressure accurately. They are commonly used in automotive, aerospace and industrial applications.

1.3.14. Temperature

Temperature is a fundamental physical quantity that reflects the average kinetic energy per molecule or particle within a substance. Temperature measurements are typically expressed in degrees Celsius (°C) or degrees Fahrenheit (°F), with the Kelvin scale (K) commonly used in scientific applications. The concept is crucial across numerous fields such as thermodynamics, materials science and climate science. The temperature of a system is directly proportional to the average kinetic energy of its constituent particles. In fundamental physics, the Kelvin scale is often preferred because it relates directly to the thermal energy of the system and has its zero at absolute zero, the theoretical lowest possible temperature, at which thermal motion is at its minimum.

The Fahrenheit scale is used in the United States: water freezes at 32 °F and boils at 212 °F. Celsius is used in many other countries: water freezes at 0 °C and boils at 100 °C. To convert a temperature from Celsius to Fahrenheit, the following relationship can be used:

TF = (9/5) TC + 32    (1.5)

where TF and TC are the temperatures in Fahrenheit (°F) and Celsius (°C). To convert from Celsius to Kelvin:

TK = TC + 273.15    (1.6)

For example, converting 20 °C to Fahrenheit and Kelvin:

TF = (9/5 × 20) + 32 = 68 °F
TK = 20 + 273.15 = 293.15 K
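Eqs. (1.5) and (1.6) in code, reproducing the 20 °C example above:

```python
def celsius_to_fahrenheit(t_c: float) -> float:
    """Eq. (1.5): T_F = (9/5) T_C + 32."""
    return 9 / 5 * t_c + 32

def celsius_to_kelvin(t_c: float) -> float:
    """Eq. (1.6): T_K = T_C + 273.15."""
    return t_c + 273.15

t_c = 20.0
print(f"{t_c} C = {celsius_to_fahrenheit(t_c):.2f} F")   # 68.00 F
print(f"{t_c} C = {celsius_to_kelvin(t_c):.2f} K")       # 293.15 K
```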
1.3.14.1. Importance of Temperature

In thermodynamics, temperature plays a central role in understanding how heat energy moves and transforms within systems. It helps to determine the direction and extent of heat transfer between objects, influencing processes such as heating, cooling, and phase transitions. Temperature measurements are indispensable for monitoring and controlling thermal processes in industries such as manufacturing, energy production, and chemical processing. For instance, in a chemical reactor, precise temperature control is vital for optimizing reaction rates, product quality, and energy efficiency.

In materials science, temperature affects material properties such as strength, conductivity, and phase behaviour. By varying the temperature, researchers can study how materials respond to different thermal conditions, leading to the development of new materials with tailored properties. Temperature measurements enable the characterization of material behaviour under varying thermal loads, facilitating the design of materials for specific applications such as aerospace components, electronic devices, and structural materials.

In climate science, monitoring temperature variations over time helps scientists understand climate change, assess its impacts on ecosystems and human societies, and develop strategies for mitigation and adaptation. Temperature data collected from weather stations, satellites, and other monitoring systems are essential for predicting weather patterns, modelling climate scenarios, and informing policy decisions related to environmental management and sustainability.

1.3.14.2. Measurement Techniques of Temperature

Various temperature measurement techniques exist, including thermocouples, resistance temperature detectors (RTDs), thermistors, and infrared sensors, each with its advantages, limitations, and suitability for different applications and temperature ranges.

Thermocouples: Thermocouples consist of two dissimilar metal wires joined at one end. When there is a temperature difference between the junction and the free ends, a voltage is generated that is proportional to the temperature difference.

Resistance temperature detectors (RTDs): RTDs are temperature sensors made of materials whose electrical resistance changes with temperature. As the temperature changes, so does the electrical resistance of the RTD, which can be measured accurately. RTDs offer excellent accuracy and stability over a wide temperature range but are more expensive than thermocouples.

Thermistors: Thermistors are temperature sensors made of semiconductor materials whose resistance changes significantly with temperature. They offer high sensitivity and accuracy over a limited temperature range, making them suitable for precise measurements in specific applications such as medical devices and automotive systems.

Infrared thermometers: Infrared thermometers measure temperature by detecting the infrared radiation emitted by an object. They are non-contact devices, making them ideal for measuring the temperature of moving or inaccessible objects.

Bimetallic strips: Bimetallic strips consist of two different metals bonded together, each with a different coefficient of thermal expansion. When heated, the strip bends due to the unequal expansion of the metals, which can be used to measure temperature changes.

Liquid-in-glass thermometers: These traditional thermometers consist of a glass tube with a liquid (usually mercury or alcohol) sealed inside. As the temperature changes, the liquid expands or contracts, causing it to rise or fall in the tube. The temperature can be read from a scale marked on the tube.

1.3.15. Acceleration

Acceleration (a) is a fundamental concept in physics and engineering, representing the rate of change of velocity over time. Acceleration measurements provide valuable insights into dynamic motion, inertial forces, and vehicle performance in various systems. The equation to calculate acceleration is:

$a = \frac{\Delta v}{\Delta t}$    (1.7)

where Δv is the change in velocity and Δt is the change in time. Suppose a car is cruising at 30 m/s and then gradually speeds up until it reaches 35 m/s after 10 seconds, as shown in Fig. 1.13. To determine how quickly its speed changed, we take the difference between its final and initial speeds. The acceleration can then be estimated as

$a = \frac{35 \text{ m/s} - 30 \text{ m/s}}{10 \text{ s}} = 0.5$ m/s²

Fig. 1.13. Representation of acceleration
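A minimal sketch of Eq. (1.7) applied to the car example above; the speeds and duration are the values from the text.

```python
# Average acceleration from Eq. (1.7): a = (v_final - v_initial) / dt.

v_initial = 30.0   # initial speed, m/s
v_final = 35.0     # final speed, m/s
dt = 10.0          # elapsed time, s

a = (v_final - v_initial) / dt
print(f"Average acceleration: {a} m/s^2")   # 0.5 m/s^2
```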
1.3.15.1. Importance of Acceleration

In physics, acceleration plays a central role in describing the motion of objects according to Newton's laws of motion. It helps to explain how objects change their velocity over time when subjected to external forces. Acceleration measurements allow physicists to study the behaviour of particles, celestial bodies, and other systems under the influence of gravitational, electromagnetic, and other forces.

In automotive engineering, acceleration measurements are essential for evaluating vehicle performance and safety. Acceleration tests provide data on a vehicle's ability to accelerate from rest to a certain speed, which is critical for assessing engine power, transmission efficiency, and overall drivability. Acceleration measurements also help engineers optimize vehicle design for fuel efficiency and emissions reduction.

In aerospace, acceleration measurements are vital for spacecraft and aircraft design, testing, and operation. Acceleration data allow engineers to evaluate the performance of propulsion systems, control surfaces, and structural components under different flight conditions.

1.3.15.2. Measurement Techniques of Acceleration

Accelerometers: Accelerometers are electronic devices that measure acceleration forces in one or more directions. They typically consist of a mass suspended by a spring, with the displacement of the mass proportional to the applied acceleration. This displacement is then converted into an electrical signal using piezoelectric or capacitive sensors.

Strain gauges: Strain gauges are sensors that measure changes in resistance due to mechanical strain. When subjected to acceleration forces, certain materials deform, causing a change in resistance in the strain gauge. By measuring this change in resistance, acceleration can be determined.

Inertial measurement units (IMUs): IMUs combine multiple sensors, such as accelerometers, gyroscopes, and sometimes magnetometers, to measure acceleration, angular velocity, and orientation. By integrating the acceleration measurements over time, IMUs can provide information on velocity and position (a numerical sketch of this integration follows this list). IMUs are commonly used in navigation systems for aircraft, spacecraft, drones, and autonomous vehicles.

Force plates: Force plates are platforms equipped with multiple force sensors that measure the forces exerted by a person or object. Force plates are frequently used in biomechanics research to study human movement, gait analysis, and sports performance.

High-speed cameras: High-speed cameras capture rapid motion with high temporal resolution, allowing acceleration to be measured indirectly. By tracking the movement of objects in consecutive frames, the change in velocity over time can be determined, and the acceleration can be calculated.
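As a sketch of the integration idea mentioned under IMUs, the code below numerically integrates a short series of hypothetical accelerometer samples into velocity and position using the trapezoidal rule. Real IMU processing must also handle sensor bias, noise, and orientation, all of which are omitted here.

```python
# Minimal sketch: integrating sampled acceleration to velocity and position.
# The sample values are hypothetical; real IMU data needs bias and noise handling.

dt = 0.1  # sampling interval, s
accel = [0.0, 0.5, 0.5, 0.5, 0.2, 0.0]  # acceleration samples, m/s^2

velocity = [0.0]   # initial velocity, m/s
position = [0.0]   # initial position, m

for k in range(1, len(accel)):
    # Trapezoidal rule: v_k = v_{k-1} + (a_{k-1} + a_k)/2 * dt
    velocity.append(velocity[-1] + 0.5 * (accel[k - 1] + accel[k]) * dt)
    # The same rule applied to velocity gives position
    position.append(position[-1] + 0.5 * (velocity[-2] + velocity[-1]) * dt)

for k, (v, x) in enumerate(zip(velocity, position)):
    print(f"t = {k * dt:.1f} s: v = {v:.3f} m/s, x = {x:.4f} m")
```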
1.4. MEASUREMENT CONCEPTS AND ANALYTICAL TECHNIQUES

In measurements and analysis, several key concepts and techniques govern the accuracy and reliability of data. Accuracy, precision, range, resolution, uncertainty, error sources, and regression analysis are fundamental components, and understanding them is vital for obtaining meaningful insights and making informed decisions in various fields. Every measurement carries some error, but these errors must be minimized so that end users and industries obtain accurate final products. Errors in design distort the intended product, leaving it unfit for industrial use, or cause further damage to the main system and its subsystems.

Every measurement yields a result. When such results are plotted and a mathematical relationship is fitted to them, the process is called regression.

1.4.1. Accuracy

Accuracy refers to the closeness of a measured value to its true or accepted value; it indicates how well a measurement reflects the actual quantity being measured. For a set of measurements, accuracy describes how close the average (experimental) value is to the true value. Mathematically, accuracy can be calculated by subtracting the true value from the mean value of the measurements; the smaller this difference, the higher the accuracy.

Accuracy = mean value − true value    (1.8)

Accuracy is crucial for ensuring that measurements are reliable and trustworthy. Inaccurate measurements can lead to errors in analysis, decision-making, and product quality. Accuracy is typically expressed as a percentage or a fraction of the true value. For example, consider a standard ruler that is supposed to measure lengths accurately to the nearest millimetre. You measure the length of an object several times with this ruler and find that the average measured length is 15.2 cm. Now suppose the true length of the object is determined by two highly precise reference instruments, A and B, which report 15.3 cm and 15.00 cm, respectively. Then:

Accuracy relative to reference A = 15.2 − 15.3 = −0.1 cm
Accuracy relative to reference B = 15.2 − 15.00 = 0.2 cm

Relative to reference A, the negative sign indicates that the measured value is slightly lower than the true value: the ruler reads 0.1 cm below it. Relative to reference B, the measurement is 0.2 cm above the true value. The measurement therefore appears more accurate against reference A, since the difference between the mean value and the expected true value is smaller.

Accuracy indicates how correct measurements are, but it is not helpful when the true value is unknown; in such cases, precision becomes more relevant. There are three types of accuracy, each serving a different purpose:

Point accuracy: Point accuracy assesses the accuracy of an instrument at a specific point on its scale. However, it does not provide insight into the overall accuracy of the instrument.

Accuracy as a percentage of the scale range: This type of accuracy evaluates measurements with respect to the instrument's full scale range, considering the entire range of measurements rather than focusing on individual points.

Accuracy as a percentage of the true value: In this type of assessment, measured values are compared with their true values. A tolerance, such as ±0.5 %, is typically considered an acceptable deviation from the true value during measurement.

1.4.2. Precision

Precision refers to how closely multiple measurements of the same quantity agree with each other. It is calculated as the difference between an individual measured value and the arithmetic mean of a series of measurements:

Precision = individual value − arithmetic mean    (1.9)

For instance, consider measuring the length of a pencil. If multiple measurements yield values such as 14.2 cm, 14.3 cm, and 14.1 cm, with an average of 14.2 cm, these measurements are considered precise: there is a high level of agreement among the repeated readings. However, if the measurements are 13.5 cm, 14.7 cm, and 15.1 cm, with an average of 14.4 cm, they are accurate but not precise: despite the average being close to the true value, the individual measurements vary widely from one another, indicating lower precision. (A numerical sketch of this example follows.)
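The following minimal sketch quantifies the pencil example: it computes each set's mean and its deviation from the true value (accuracy, Eq. 1.8), and the standard deviation of the readings as a measure of scatter (precision). The true length of 14.4 cm is an assumed value, chosen only to be consistent with the text's statement that the scattered set's average is close to the true value.

```python
# Accuracy vs. precision for the pencil example.
# true_length is an assumed value for illustration.

from statistics import mean, pstdev

true_length = 14.4  # assumed true value, cm

precise_set = [14.2, 14.3, 14.1]     # tightly grouped readings
scattered_set = [13.5, 14.7, 15.1]   # widely scattered readings

for name, readings in [("precise set", precise_set),
                       ("scattered set", scattered_set)]:
    m = mean(readings)
    accuracy = m - true_length        # Eq. (1.8): mean - true value
    spread = pstdev(readings)         # standard deviation as a precision measure
    print(f"{name}: mean = {m:.2f} cm, accuracy = {accuracy:+.2f} cm, "
          f"std dev = {spread:.2f} cm")
```

The output shows the first set scattering by only about 0.08 cm (precise) while sitting 0.2 cm from the assumed true value, and the second set landing near the true value on average (accurate) while scattering by about 0.68 cm.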
Therefore, precision refers to the consistency or repeatability of measurements when the same quantity is measured multiple times under the same conditions. It reflects the degree of scatter or variation among repeated measurements. A precise measurement produces consistent results with little deviation from the mean. Precision is essential for assessing the reliability and reproducibility of measurements, and it is typically expressed as a standard deviation or a coefficient of variation. Figs. 1.14 (a), (b) and (c) give a clear view of the difference between accuracy and precision.

Fig. 1.14. (a), (b) and (c) Comparison of accuracy and precision

The differences between accuracy and precision are summarized in Table 1.1.

Table 1.1. Difference between accuracy and precision

| Sl. No. | Accuracy | Precision |
|---------|----------|-----------|
| 1 | Accuracy is the measure of the closeness of the answer to the true value of the quantity being measured. | Precision is a measure of reproducibility, i.e., getting nearly the same value again and again in a measurement. |
| 2 | A measurement can be accurate but not necessarily precise. | A measurement can be precise but not necessarily accurate. |
| 3 | It can be determined with a single measurement. | It needs several measurements to be determined. |
| 4 | Accurate values have to be precise in most cases. | Precise values may or may not be accurate. |
| 5 | It is called a degree of conformity. | It is called a degree of reproducibility. |

1.4.3. Range

Range denotes the upper and lower limits within which an instrument can measure a value or signal, such as amps, volts, or ohms. It is defined as the span of values that a measurement instrument can accurately measure, representing the minimum and maximum values within which the instrument provides reliable measurements. The range of a measurement instrument is determined by its design, specifications, and calibration. A wide range allows versatility and flexibility in measurement applications, while a narrow range limits the types of measurements that can be performed.

If a thermometer has a range of −15 °C to 100 °C, it can be used comfortably from a minimum of −15 °C to a maximum of 100 °C. If the temperature in a real application falls outside this range, the thermometer may not produce reliable results. To measure the temperature of a milk chilling plant set at −20 °C, this thermometer would not be preferable, because the plant temperature lies outside its range. The same thermometer, however, can reliably measure the temperature of a hot-air oven set at 75 °C, since that temperature is within its range (−15 to 100 °C).

1.4.4. Resolution

Resolution refers to the smallest change in the quantity being measured that can be detected or displayed by the measurement instrument. It represents the level of detail or granularity in measurement results. Resolution is determined by the precision of the instrument's scale or display and the sensitivity of its sensing mechanism. A higher resolution allows more precise and accurate measurement of small changes in the measured quantity. Resolution is typically expressed as the smallest increment of measurement that the instrument can detect.

Consider an electronic weighing balance with a range of 10 to 100 g and a resolution of 1 mg, used to measure a sample mass of approximately 15 g. The resolution implies that the balance can detect a change as small as 1 mg while measuring the 15 g sample; in other words, the device can resolve increments as small as one-thousandth of a gram (0.001 g, or 1 mg). In simpler terms, resolution is the smallest increment that the instrument can display or measure accurately. Higher-resolution instruments can detect smaller changes, offering greater precision in measurements, but they are generally more expensive than lower-resolution instruments. The short sketch that follows mimics how range and resolution together shape a displayed reading.
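A toy model of the two specifications just discussed: the function below clips a true value to the instrument's range and rounds it to the nearest resolution step. The range and resolution figures (10 to 100 g, 1 mg) come from the weighing-balance example above; the sample masses are arbitrary illustration values.

```python
# Toy model of an instrument reading limited by range and resolution.
# Range and resolution follow the weighing-balance example in the text.

def instrument_reading(true_value: float, lo: float, hi: float,
                       resolution: float) -> float:
    """Clip to the instrument range, then quantize to the resolution step."""
    clipped = min(max(true_value, lo), hi)
    steps = round(clipped / resolution)
    return steps * resolution

LO, HI, RES = 10.0, 100.0, 0.001  # grams; 1 mg resolution

for sample in [15.0004, 15.0012, 150.0]:   # arbitrary test masses, g
    shown = instrument_reading(sample, LO, HI, RES)
    print(f"true = {sample} g -> displayed = {shown:.3f} g")
```

The 150 g input illustrates the range limit (the reading saturates at 100 g), while the two 15 g inputs differ only below 1 mg and above it, showing what the resolution can and cannot distinguish.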
1.4.5. Uncertainty

Uncertainty arises from various sources, including the inherent limitations of measurement instruments, environmental factors, and human error. It is typically expressed as a range or interval within which the true value of the measured quantity is believed to lie with a certain level of confidence. Understanding and accounting for uncertainty is crucial for interpreting measurement results and making informed decisions based on them.

For example, suppose you are measuring the length of a wooden plank using a ruler marked in millimetres. After multiple measurements, you find that the recorded lengths vary slightly: 1035 mm, 1036 mm, 1034 mm, 1037 mm, and 1035 mm. The uncertainty in the measurements can be represented by calculating the range, i.e., the difference between the maximum and minimum measured values:

Range = Maximum value − Minimum value    (1.10)
      = 1037 mm − 1034 mm
      = 3 mm

This range indicates the variability or uncertainty in the measurements. However, it does not provide a complete picture of uncertainty, because it does not consider other factors such as the precision of the ruler or potential errors in the measurement technique. To express uncertainty more comprehensively, the standard deviation (σ) can be calculated. The standard deviation quantifies the average deviation of the individual measurements from the mean value; a higher standard deviation indicates greater variability or uncertainty in the measurements. By calculating the standard deviation of the measurements, one can better understand the uncertainty associated with them and make informed decisions accordingly.

$\sigma = \sqrt{\frac{\sum (x_i - \bar{x})^2}{N}}$    (1.11)

where xi is an individual measurement of the parameter, x̄ is the mean value of all xi, and N is the total number of measurements.
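Applying Eq. (1.10) and Eq. (1.11) to the plank readings above gives a range of 3 mm and a standard deviation of about 1.02 mm; the sketch below reproduces that calculation.

```python
# Range (Eq. 1.10) and standard deviation (Eq. 1.11) for the plank readings.

import math

readings = [1035, 1036, 1034, 1037, 1035]  # plank lengths, mm

value_range = max(readings) - min(readings)          # Eq. (1.10)

mean = sum(readings) / len(readings)
variance = sum((x - mean) ** 2 for x in readings) / len(readings)
sigma = math.sqrt(variance)                          # Eq. (1.11)

print(f"range = {value_range} mm")        # 3 mm
print(f"mean  = {mean} mm")               # 1035.4 mm
print(f"sigma = {sigma:.2f} mm")          # ~1.02 mm
```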
1.4.6. Sources of error

Errors in measurements can stem from a variety of sources, broadly classified as systematic errors and random errors. Common sources of systematic error include instrument calibration issues, environmental conditions, and procedural errors. Instruments with a linear response may exhibit two types of systematic error:

Offset error, where the instrument does not register zero when the measured quantity is zero.

Scale factor error, where the instrument consistently overestimates or underestimates changes in the measured quantity.

Examples:

(i) Inconsistent pH readings due to improper calibration of the pH meter;
(ii) Distorted sound measurements from a microphone placed too close to a source of noise, such as a fan or air conditioner;
(iii) Inaccuracies in temperature measurements due to inadequate thermal contact between the thermometer and the substance being measured; and
(iv) Errors in solar radiation measurements caused by obstructions such as trees or buildings casting shadows on the radiometer.

The accuracy of measurements is often reduced by systematic errors, which are difficult to detect even for experienced research workers. Random errors, on the other hand, are unpredictable fluctuations in experimental measurement outcomes due to factors such as instrument noise, human variability, and inherent variability in the measured quantity itself. Examples of causes of random error are:

(i) Random fluctuations in the signal due to electromagnetic interference or electronic noise;
(ii) Irregular changes in the heat loss rate from a solar air heater due to changes in the wind; and
(iii) Fluctuations in voltage readings due to interference in the electrical circuit.

Random errors limit the precision of a measurement; their magnitude can usually be determined by repeating the measurements.

1.4.7. Regression analysis

Regression analysis is a statistical method for estimating the relationship between a dependent variable and one or more independent variables. It aims to identify and quantify the underlying patterns or trends in the data and to make predictions based on these relationships. In the context of measurement, regression analysis can help to assess the accuracy of measurement instruments, calibrate sensors, and model complex phenomena based on empirical data. By fitting mathematical models to experimental data, regression analysis enables researchers to extract meaningful insights and make informed decisions in scientific and engineering applications.

Regression analysis includes several variations, such as linear, multiple linear, and nonlinear (Fig. 1.15). The most common models are simple linear and multiple linear. Nonlinear regression analysis is commonly used for more complicated data sets in which the dependent and independent variables show a nonlinear relationship. A short least-squares sketch of simple linear regression follows.

Fig. 1.15. Common types of regression analysis
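As an illustration of simple linear regression, the sketch below fits y = m·x + c to a small calibration-style data set by ordinary least squares. The (x, y) pairs are invented for demonstration; any calibration data set could be substituted.

```python
# Simple linear regression (ordinary least squares) on an invented data set.
# Fits y = m*x + c using the closed-form least-squares estimates.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]   # independent variable (e.g., applied load)
ys = [0.1, 2.1, 3.9, 6.2, 7.8]   # dependent variable (e.g., sensor output)

n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n

# Slope m = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2)
sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
sxx = sum((x - x_mean) ** 2 for x in xs)
m = sxy / sxx
c = y_mean - m * x_mean          # intercept

print(f"fit: y = {m:.3f} x + {c:.3f}")
print(f"prediction at x = 2.5: y = {m * 2.5 + c:.3f}")
```

Multiple linear and nonlinear regression follow the same idea of minimizing the squared residuals, but with more parameters or a nonlinear model function.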
Nomenclature

A  surface area (m²)
a  acceleration (m/s²)
F  force (N)
M  mass (kg)
N  total number of measurements
P  power (J/s)
p  pressure (N/m²)
r  distance from the axis of rotation (m)
S or Δx  displacement (m)
TC  temperature in degrees Celsius (°C)
TF  temperature in degrees Fahrenheit (°F)
TK  temperature in Kelvin (K)
t  time (s)
xf, xi  final and initial positions in the coordinate system (m)
x1, y1, x2, y2  points in the coordinate system
x̄  mean value
W  work (J)

Greek symbols
τ  torque (N·m)
σ  standard deviation
Δv  change in velocity (m/s)
Δt  change in time (s)

Acronyms
CMM  coordinate measuring machine
GNSS  global navigation satellite system
GPS  Global Positioning System
GWR  guided wave radar
IMU  inertial measurement unit
LVDT  linear variable differential transformer
NMR  nuclear magnetic resonance
PIV  particle image velocimetry
RTD  resistance temperature detector
