Questions and Answers
Which of the following best describes the role of metrology in science and technology?
- It is limited to routine measurements in specific sectors.
- It plays a crucial role in enabling scientific advancements and understanding nature through precise and accurate measurements. (correct)
- It primarily focuses on theoretical aspects of measurement uncertainty.
- It is mainly used for measurements of physical quantities.
What was the main motivation behind the invention of the metric system in revolutionary France?
- To create a measurement system that would vary from person to person.
- To establish a universal system of measurement that was not based on variable bodily dimensions. (correct)
- To simplify trade relations with England, based on pre-existing English measures.
- To align with existing bodily dimensions for easier adoption.
In the International System of Quantities, which of the following is NOT considered one of the seven base quantities?
- Electric current
- Volume (correct)
- Mass
- Time
Why is the definition of the 'measurand' considered critical in measurement?
What is the purpose of a 'reference measurement procedure'?
How does 'metrological traceability' contribute to the reliability of a measurement result?
What is the key difference between 'validation' and 'verification' in the context of analytical procedures?
A laboratory is calibrating a thermometer using a standard reference thermometer. According to the text, what is the outcome of this process?
A scientist notices continuous variation in the readings of a measuring device over time, unrelated to changes in the measured quantity. What type of error is this?
Which statement accurately describes the relationship between 'accuracy' and 'precision' in measurement?
Flashcards
Metrology
The science of measurements and its application across all sectors, including theoretical and practical aspects.
Metric System Origin
A system of measurement based on invariable references, not subject to bodily dimensions, created in revolutionary France.
Quantity
A property of a phenomenon, body, or substance that can be expressed with a number and a reference.
Quantity Value
The number and reference that together express the magnitude of a quantity.
Measurand
The quantity intended to be measured.
Measurement Procedure
A detailed description of a measurement based on measurement principles, a measurement method, and a measurement model, including all calculations required to obtain a result.
Reference Measurement Procedure
A measurement procedure accepted as providing results suitable for assessing the trueness of measured quantity values obtained from other procedures for quantities of the same kind.
Metrological Traceability
The property of a measurement result whereby it can be related to a reference through an unbroken, documented chain of calibrations.
Accuracy
The closeness of agreement between a measurement result and the real (true) value of the quantity being measured.
Random Error
The error contribution arising from random, unpredictable differences between successive measurements.
Study Notes
Metrology
- Metrology is the science of measurements and their applications.
- It encompasses all theoretical and practical aspects of measurement across all sectors, including routine procedures.
- Metrology applies to analytical, biological, and clinical measurements.
- It is relevant regardless of the magnitude of the measurement uncertainty.
Significance of Metrology
- The science is sophisticated, dynamic and important to researchers, engineers, labs, and high-tech industries.
- Scientific advancements stem from the pursuit of more efficient and accurate measurement methods.
- Metrology is crucial for understanding nature and plays a vital role in modern society.
Origin of the Metric System
- The metric system emerged in late 18th-century revolutionary France.
- Its creation was a deliberate attempt to establish a universal measurement system.
- The system aimed to avoid reliance on human body dimensions, which vary.
- The goal was to create a measurement system that would be applicable "for all times, for all peoples".
- The meter was defined as 1/10,000,000 of the distance from the North Pole to the Equator along the meridian passing through Paris.
- The liter was defined as the volume of a cube one-tenth of a meter on each side, i.e. 1/1,000 of a cubic meter.
- The kilogram was defined as the mass of one liter of distilled water, weighed in a vacuum.
- A bar, slightly longer than a meter, had the meter’s length marked by two inscribed lines on its surface.
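As a quick arithmetic sanity check of these historical definitions, here is a minimal sketch in Python; the 10,000 km pole-to-equator figure and the water density are the idealized round values behind the original definitions, not modern measured constants.

```python
# Rough arithmetic behind the original metric definitions (idealized, illustrative values)

quarter_meridian_m = 10_000_000          # pole-to-equator distance targeted by the survey, in metres
metre = quarter_meridian_m / 10_000_000  # 1/10,000,000 of the quarter meridian
print(metre)                             # 1.0 metre by definition

litre_m3 = 0.1 ** 3                      # a cube 1/10 m on a side ~ 1/1,000 of a cubic metre
print(litre_m3)                          # ~ 0.001 m^3 = 1 litre

water_density_kg_per_m3 = 1000.0         # distilled water, idealized
kilogram = water_density_kg_per_m3 * litre_m3
print(kilogram)                          # ~ 1.0 kg = mass of one litre of water
```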
Metrology Language and VIM
- Metrology has its own specialized language.
- The VIM (International Vocabulary of Metrology) was created to provide a common language, mainly for physical measurements.
- VIM is applicable across all scientific disciplines, especially chemistry and biology.
Importance of VIM
- VIM is the standard reference in international standards and guides, and accommodates terminology changes in chemical and biological measurements.
- It is the basis for standard terminology in measurements.
Terminology
- Quantity refers to a property of a phenomenon, body, or substance with a magnitude expressible as a number and a reference.
- Quantity Value refers to the number and reference together; it indicates the magnitude of a quantity.
- The magnitude of a quantity is expressed by a number and a measurement unit, with additional references to a measurement procedure or material if needed.
- Nominal Quantity Value is a rounded or approximate value that characterizes a measuring instrument or system, providing guidance for its use.
- Reference Quantity Value is used as a basis to compare values of quantities of the same kind.
- Certified reference materials and devices have a reference quantity value and an associated measurement uncertainty.
- The quantity value on a certified reference material, along with its uncertainty, acts as a reference quantity value for its property.
- This value calibrates measuring instruments, determining the value of same-kind quantities.
- When calibrating a mercury thermometer against a standard, the standard's values are reference quantity values.
- In analysis, certified reference material values are used as reference quantity values to assess measurement procedure trueness.
- The known concentrations of a set of solutions analyzed to build a calibration diagram are themselves reference quantity values, as sketched below.
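A minimal sketch of that last point, assuming made-up concentrations and signals: the known standard concentrations act as reference quantity values for the calibration line, which is then used to estimate an unknown sample.

```python
import numpy as np

# Reference quantity values: known concentrations of calibration standards (illustrative, e.g. mg/L)
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
# Corresponding instrument signals (illustrative, e.g. absorbance)
signal = np.array([0.01, 0.21, 0.40, 0.62, 0.80])

# Least-squares calibration line: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Use the line in reverse to estimate the concentration of an unknown sample
unknown_signal = 0.50
unknown_conc = (unknown_signal - intercept) / slope
print(f"estimated concentration: {unknown_conc:.2f} mg/L")
```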
International System of Quantities and SI Units
- The SI is founded on seven base quantities, each with a corresponding base unit:
- Length (m).
- Mass (kg).
- Time (s).
- Electric current (A).
- Thermodynamic temperature (K).
- Amount of substance (mol).
- Luminous intensity (cd).
- The kilogram is defined using the fixed numerical value of the Planck constant h = 6.62607015 × 10^-34 J s, where J s is equivalent to kg m^2 s^-1.
- The meter and second are defined in terms of the speed of light (c) and the cesium hyperfine transition frequency (Δν_Cs).
- The mole is defined as the amount of substance containing exactly 6.02214076 × 10^23 elementary entities (atoms, molecules, ions, etc.).
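A small numerical illustration of how the fixed defining constants work, using the exact values above plus the fixed speed of light (299,792,458 m/s, a value not quoted in the notes); this is a sketch of the reasoning, not a laboratory realization.

```python
# Fixed numerical values of SI defining constants (exact by definition since 2019)
c = 299_792_458          # speed of light in vacuum, m/s (defines the metre via the second)
h = 6.62607015e-34       # Planck constant, J s = kg m^2 s^-1 (defines the kilogram)
N_A = 6.02214076e23      # Avogadro constant, 1/mol (defines the mole)

# The metre: the distance light travels in vacuum in 1/299,792,458 of a second
one_metre = c * (1 / 299_792_458)
print(one_metre, "m")    # ~ 1.0 m (up to floating-point rounding)

# The mole: the amount of substance containing exactly N_A elementary entities
entities = 3.01107038e23           # example count of atoms
print(entities / N_A, "mol")       # 0.5 mol
```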
Measurement Defined
- The process of experimentally obtaining one or more quantity values reasonably attributed to a quantity.
- Measurement involves a series of defined actions that can be a single-step or multi-stage process.
- It encompasses the entire process of obtaining a quantity value, rather than just the final numerical value.
Measurand Defined
- Measurand is the quantity intended to be measured.
- Defining the measurand is critical so that the measurement result is suitable for its intended purpose.
- The definition of the measurand should include all important parameters and conditions.
- As an example, when weighing liquid from a pipette, the measurand specification must include the liquid type and temperature.
- In chemical and biological analysis, specification requires at least the quantity, the analyte, and the matrix if relevant, even without a clear chemical definition.
Measurement Procedure
- A detailed description of a measurement based on measurement principles, a method, and a measurement model.
- It includes all calculations required to obtain a measurement result.
- The description involves several levels of detail, with the measurement procedure being the most comprehensive.
- Measurements require understanding the underlying measurement principle.
Reference Measurement Procedures
- These are measurement procedures accepted as providing results suitable for assessing the measurement trueness of measured quantity values obtained from other measurement procedures for quantities of the same kind, for calibration, or for characterizing reference materials.
- They are well-characterized and normally have a small measurement uncertainty.
Metrological Traceability
- It is a property of a measurement result.
- The result is related to specified reference standards, not institutions.
Surrogate Measurement
- It is an indicator that effectively represents another indicator that is intended to be measured.
Primary Reference Measurement Procedure
- It is a procedure used to obtain a measurement result without relation to a measurement standard for a quantity of the same kind.
- This is also known as primary methods of measurement.
- This procedure allows a quantity value to be determined with direct reference to the definition of its measurement unit or fundamental constants.
- They provide metrologically traceable measurement results with the highest levels of accuracy.
Pycnometer
- A standard vessel, often with a thermometer, used to measure and compare the densities or specific gravities of liquids or solids.
Understanding Errors in Measurement
- Every measured quantity has a "real value" that would be obtained from an ideal experiment.
- "Accurate measurements" do not exist in reality; there is always some probability of error, which makes the measured and real values differ.
- Errors introduce uncertainty regarding the difference between measured and real values.
- A measurement error is the difference between the measurement's result and the real (or reference) value.
- Correction is a value added to the measurement result in order to offset any systematic error.
- If there were no random error, adding the correction to the measurement result would give the real value.
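A minimal sketch of these relations with made-up numbers: the error is the difference between the measured and reference values, and the correction offsets the estimated systematic part.

```python
# Illustrative numbers only
reference_value = 100.00   # real (reference) value, e.g. a certified standard
measured_value = 100.35    # one measurement result

measurement_error = measured_value - reference_value   # +0.35
print("measurement error:", measurement_error)

# Suppose repeated measurements reveal a systematic offset of +0.30 (the bias estimate)
bias_estimate = 0.30
correction = -bias_estimate            # the correction offsets the systematic error

corrected_result = measured_value + correction
print("corrected result:", corrected_result)   # 100.05; the remaining 0.05 is random error
```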
Types of Measurement Errors
- Random Error: error contribution due to random and unpredicted differences between successive measurements.
- Systematic Error: error contribution due to systematic difference between the measurement result and the real value.
- Bias: an estimate of a systematic measurement error.
Accuracy and Precision
- Accuracy is the closeness of agreement between the measurement result and the real value of the quantity being measured.
- A measurement is more accurate when it gives a smaller error.
- Precision is the closeness of agreement between indications or measured quantity values obtained by replicate measurements on the same or similar objects under specified conditions.
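A small sketch contrasting the two ideas on made-up replicate data: the offset of the mean from the reference value reflects (in)accuracy, while the spread of the replicates reflects precision.

```python
import statistics

reference_value = 50.0                        # known value of the measured quantity
replicates = [50.8, 50.9, 50.7, 50.8, 50.9]   # tightly grouped but systematically offset

mean_result = statistics.mean(replicates)
spread = statistics.stdev(replicates)

print("mean error vs reference:", round(mean_result - reference_value, 2))  # ~ +0.82 -> poor accuracy
print("standard deviation of replicates:", round(spread, 3))                # ~ 0.084 -> good precision
```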
Repeatability and Reproducibility
- Repeatability: the closeness of agreement between successive/repeated measurements of the same quantity under the same measurement conditions within a short period of time involving:
- The same operator
- The same procedure
- The same measurement system
- The same location/facilities
- The same conditions
- Reproducibility: the closeness of agreement between measurements conducted under different conditions, such as:
- Different locations
- Different operators
- Different measurement systems
- Different replicates of measurements on the same or similar object
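A minimal sketch with invented data, comparing the spread of results obtained under repeatability conditions in one laboratory with the spread of results for the same object pooled across different laboratories.

```python
import statistics

# Repeatability conditions: same operator, procedure, system, location, short time span
same_lab = [10.02, 10.01, 10.03, 10.02, 10.01]

# Reproducibility conditions: different laboratories, operators, and measurement systems
different_labs = [10.02, 9.95, 10.10, 10.05, 9.98]

print("repeatability std dev:  ", round(statistics.stdev(same_lab), 3))        # small spread
print("reproducibility std dev:", round(statistics.stdev(different_labs), 3))  # larger spread
```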
Drift and Hysteresis
- Drift: Continuous variation of the indication provided by a measurement device, which is not due to any change of the measured quantity nor to any other change affecting the measurement.
- Equivalently, drift is a continuous change of the metrological characteristics of a measurement device over time.
- Hysteresis: the indication provided by a measuring device depends upon the history of previous measurements conducted by this device.
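A small sketch of how drift can show up: repeatedly measuring the same stable reference and fitting the trend of the indication against time (all numbers invented).

```python
import numpy as np

# Indications from repeated measurements of the SAME stable reference over time
time_h = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])                   # hours
indication = np.array([20.00, 20.02, 20.03, 20.05, 20.06, 20.08])   # e.g. degrees C

# The measured quantity is constant, so any trend in the indication is drift
drift_per_hour, intercept = np.polyfit(time_h, indication, 1)
print(f"estimated drift: {drift_per_hour:.3f} units per hour")       # ~ +0.015 per hour
```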
Calibration
- Calibration is an operation that under specified conditions:
- Establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties
- Uses this information to establish a relation for obtaining a measurement result from an indication.
- A calibration may be expressed as a statement, diagram, etc.
- Calibration should not be confused with adjustment or self-calibration.
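A minimal sketch of the two-step idea in this definition, with invented thermometer data: first establish the relation between the standard's quantity values and the instrument's indications, then use it to turn a new indication into a measurement result.

```python
import numpy as np

# Step 1: establish the relation between standard values and the instrument's indications
standard_values = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # reference thermometer, degC
indications = np.array([0.3, 25.1, 50.4, 75.2, 100.5])       # instrument under calibration

# Fit: standard_value = a * indication + b
a, b = np.polyfit(indications, standard_values, 1)

# Step 2: use the relation to obtain a measurement result from a new indication
new_indication = 37.0
result = a * new_indication + b
print(f"measurement result: {result:.2f} degC")
```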
Calibration Hierarchy
- The sequence of calibrations from a reference to a final measurement system, where the outcome of each calibration depends on the outcome of the previous calibration.
- Measurement uncertainty increases along the sequence of calibrations.
- The elements of a calibration hierarchy are one or more measurement standards and measurement systems operated according to measurement procedures.
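A small sketch of why uncertainty grows along the chain: if the contributions added at each calibration step are independent, they are commonly combined in quadrature (root sum of squares); the uncertainties below are invented.

```python
import math

# Standard uncertainties contributed at each level of a calibration hierarchy (illustrative)
steps = {
    "primary standard":   0.001,
    "secondary standard": 0.005,
    "working standard":   0.02,
    "routine instrument": 0.05,
}

combined = 0.0
for level, u in steps.items():
    combined = math.sqrt(combined**2 + u**2)   # combine independent contributions in quadrature
    print(f"{level:20s} combined uncertainty = {combined:.4f}")
# The combined uncertainty increases at every level of the hierarchy
```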
Measurement Traceability
- The property of a measurement result that allows it to be related to a reference through an unbroken chain of calibrations.
- Each calibration contributes to the measurement uncertainty.
- Metrological traceability requires an established calibration hierarchy.
- Traceability can also be used in other fields (samples, documents, products, etc.).
Standards Used in Calibrations:
- Primary standards: realize the SI units at the highest accuracy level; they are not calibrated but are compared with others.
- Secondary standards: realize the SI units and are calibrated by primary standards.
- Working standards: used to calibrate routine measuring instruments and are calibrated by secondary standards.
Validation vs. Verification
- Verification confirms that a product/service meets its original requirements.
- Validation determines whether it works for the intended purpose.
Method Validation
- It is the process used to confirm that the analytical procedure employed for a specific test is suitable for its intended use.