
Exercise No. 1: Laboratory Safety, Quality Control and Standardization


Summary

This document provides guidelines on laboratory safety, quality control, and standardization of methods in a chemical laboratory setting. It covers general safety rules, quality assurance practices, and the importance of accurate measurements in analysis. The document is intended for students performing chemical laboratory analyses.



Exercise No. 1: Laboratory Safety, Quality Control and Standardization

Introduction

Performing experiments involving chemicals exposes students to serious risks. These hazards include contact with toxic or corrosive chemicals, or exposure to chemical vapors, radiation, fire, explosions, or serious burns. To prevent accidents, students need a good understanding of the hazards inside the laboratory. Thus, a strong set of overall laboratory safety rules is essential to avoiding accidents in the laboratory.

Any analytical data generated in the laboratory must possess a very high degree of reliability. The laboratory must therefore practice a strict quality assurance program to ensure the accuracy and precision of its results. In this exercise, you will be introduced to essential quality control measures practiced in international laboratories.

The methods used for laboratory analysis also need to be standardized, either against an accepted method or against other laboratories, a practice known as interlaboratory comparison or proficiency testing. The method can also be tested against a standard solution whose concentration has been established with a high degree of reliability (e.g., a certified reference material). This is called standardization.

Learning Outcomes

1. To be able to identify the common hazards in a chemical laboratory and learn the important laboratory safety practices to prevent injuries.
2. To learn the importance of quality control and adopt these quality control procedures during the conduct of laboratory analyses.
3. To understand the importance of standardization of methods.

Procedure

General Instruction: Read and understand very carefully the topics discussed below.

A. General Rules on Laboratory Safety

1. Keep a first aid kit handy at a conspicuous working place in the laboratory. Learn how to use the contents of the first aid kit.
2.
Personal safety aids such as a laboratory coat, protective gloves, safety glasses, a face shield, and proper footwear should be used while working in the laboratory.
3. Observe normal laboratory safety practice when connecting equipment to a power supply, handling chemicals, and preparing reagent solutions.
4. All electrical work must be done by qualified personnel.
5. Rinse a pipette with the next solution before using it for that solution.
6. Do not return liquid reagents to the bottle after they have been taken out for use.
7. Do not put readily soluble substances directly into a volumetric flask; first transfer them into a beaker, dissolve, and then transfer into the flask.
8. Store light-sensitive chemicals such as iodine and silver nitrate only in amber-colored bottles.
9. Keep the working tables/space clean. Clean up spillage immediately.
10. Wash hands after handling toxic or hazardous chemicals.
11. Never pipette chemicals by mouth; use an automatic pipetting device.
12. Use forceps/tongs to remove containers from hot plates, ovens, or furnaces.
13. Do not use laboratory glassware for eating or drinking.
14. Use a fume hood while handling concentrated acids, bases, and hazardous chemicals.
15. Never open a centrifuge cover until the machine has stopped.
16. Add acid to water, not water to acid, when diluting an acid.
17. Always put labels on bottles, vessels, and wash bottles containing reagents, solutions, samples, and water.
18. Handle concentrated acids and bases in a fume hood.
19. Do not heat glassware or flammable chemicals directly on a flame.
20. Read the labels of bottles before opening them.
21. Do not hold a stopper between your fingers while pouring liquid from a bottle, nor put it on the working table; place it on a clean watch glass.
22. Calibrate the equipment periodically.
23. Maintain an instrument manual and log book for each piece of equipment to avoid mishandling, accidents, and damage to equipment.
24. Carry out standardization of reagents daily before use.
25.
Always carry out a blank sample analysis with each batch.

B. Laboratory Quality Assurance and Quality Control

In this class we define the terms quality, quality assurance, and quality control following the International Organization for Standardization (ISO), FAO Soils Bulletin 74, and the analytical methods described in FAO Fertilizer and Plant Nutrition Bulletin No. 19.

Quality

Quality is defined as "the total features and characteristics of a product or service that bear on its ability to satisfy stated and implied needs." A product can be stated to possess good quality if it meets the predetermined parameters. In the case of an analytical laboratory, its quality may be considered adequate and acceptable if it has the capacity to deliver analytical results on a product within the specified limits of error, and as per other agreed conditions of cost and time of analysis, so as to enable an acceptable judgement on the product quality.

Quality Assurance

As per ISO, quality assurance means "the assembly of all planned and systematic actions necessary to provide adequate confidence that a product, a process or service will satisfy given quality requirements." The results of these actions are checked by another independent laboratory or person to confirm the pronouncement on the quality of a product by a given laboratory. This can be referred to as an inter-laboratory check.

Quality Control

Quality control is an important part of quality assurance, defined by ISO as "the operational techniques and activities that are used to satisfy quality requirements." Quality assessment or evaluation is necessary to see whether the activities performed to verify quality are effective. Thus, only an effective check on all the activities and processes in a laboratory can ensure that the results pronounced on a product's quality are within the acceptable parameters of accuracy.
In a quality control system, the following steps are involved, which, when implemented properly, ensure that the results delivered are acceptable and verifiable by another laboratory:

1. Check on the performance of the instruments.
2. Calibration or standardization of instruments and chemicals.
3. Adoption of a sample check system as a batch control within the laboratory.
4. External check: inter-laboratory exchange program.

To obtain accurate and acceptable results of analysis on a sample, the laboratory has to ensure that the equipment is properly calibrated and that the methods and techniques used are scientifically sound and give reproducible results. To ensure high standards of quality, Good Laboratory Practice (GLP) needs to be followed. GLP is defined as "the organizational process and the conditions under which laboratory studies are planned, performed, monitored, recorded and reported." Thus, GLP expects a laboratory to work according to a system of procedures and protocols, where the procedures are specified in Standard Operating Procedures (SOPs).

Standard Operating Procedure (SOP)

A Standard Operating Procedure (SOP) is a document which describes the regularly recurring operations relevant to the quality of the investigation. This is what is often referred to as a Laboratory Manual. The purpose of an SOP is to carry out the operation correctly and always in the same manner. An SOP should be available at the place where the work is done. If, for justifiable reasons, any deviation from an SOP is allowed, the deviated procedure must be fully documented. In a laboratory, SOPs may be prepared for:

1. Safety precautions.
2. Procedures for operating instruments.
3. Analytical methods and preparation of reagents.
4. Registration of samples.

To sum up, all operations have to be properly documented so that little or no chance is left for errors of any kind.

C. Error, Precision, Accuracy and Detection Limit

Error

Error is an important component of the analysis.
In any analysis, when a quantity is measured with the greatest exactness that the instrument, method, and observer are capable of, the results of successive determinations still differ among themselves to a greater or lesser extent. The average value is accepted as the most probable, although it may not always be the true value. Sometimes the difference between successive values may be small, but it can also be large; the reliability of the result depends upon the magnitude of this difference. A number of factors may be responsible for this difference, which is referred to as 'error'.

The error, in absolute terms, is the difference between the observed or measured value and the true or most probable value of the quantity measured. The absolute error is a measure of the accuracy of the measurement. The accuracy of a determination may, therefore, be defined as the agreement between the measured value and the true or most probable value. The relative error is the absolute error divided by the true or most probable value. Error may be caused by any deviation from the prescribed steps of the analysis; the purity of chemicals, their concentration/strength, the accuracy of the instruments, and the skill of the technician are also important factors.

Precision and Accuracy

Other important terms to be understood in analysis are precision and accuracy. Precision is defined as the agreement of values in a series of measurements of the same quantity; the mean deviation or the relative mean deviation is a measure of precision. Accuracy expresses the correctness of a measurement, while precision expresses the reproducibility of a measurement. Precision always accompanies accuracy, but a high degree of precision does not imply accuracy. To ensure high accuracy in analysis, accurate preparation of reagents, including their perfect standardization, is critical. Even the purity of the chemicals is important.
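The definitions of absolute error, relative error, and mean deviation above can be sketched numerically. A minimal sketch, using hypothetical replicate values and an assumed true value (none of the numbers below come from the text):

```python
# Hypothetical replicate determinations of the same quantity (illustrative only).
values = [10.1, 9.8, 10.3, 10.0]
true_value = 10.5  # assumed true or most probable value

# The average value is accepted as the most probable result.
mean = sum(values) / len(values)

# Absolute error: measured (mean) value minus the true value.
absolute_error = mean - true_value

# Relative error: absolute error divided by the true value.
relative_error = absolute_error / true_value

# Mean deviation (a measure of precision): average distance of the
# replicates from their own mean.
mean_deviation = sum(abs(v - mean) for v in values) / len(values)

print(round(mean, 2))            # 10.05
print(round(absolute_error, 2))  # -0.45
print(round(mean_deviation, 2))  # 0.15
```

Note how the replicates here are precise (small mean deviation) yet not accurate (the mean sits well below the assumed true value), illustrating that precision does not imply accuracy.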
Thus, the chemical reagents used in analyses must always be of high purity, referred to as AR grade (Analytical Reagent).

Detection Limit

In the analysis of trace elements in soils, plants and fertilizers, and in environmental monitoring, the need arises to measure very low contents of analytes. Modern equipment is capable of such estimation. However, when selecting equipment and a testing method for such a purpose, it is important to have information about the lowest limits down to which analytes can be detected or determined with sufficient confidence. Such limits are called detection limits or lower limits of detection. The capacity of the equipment and the method may be such that they can detect traces of the analyte in the sample. In quantitative terms, the lowest contents of such analytes may be decided through appropriate research as the values of interpretable significance. Service laboratories are generally provided with such limits.

D. Quality Control of Analytical Procedures and Independent Standards

The main goal of quality control measures is to ensure that the analytical data generated by the laboratory possess a minimum of error and are consistent. To check and verify the accuracy of analysis, independent standards are used in the system. The extent of deviation of the analytical value on a standard sample indicates the accuracy of the analysis. An independent standard can be prepared in the laboratory from pure chemicals. When a new standard is prepared, the remainder of the old one always has to be measured as a mutual check. If the results are not within the acceptable levels of accuracy, the processes of calibration, preparation of the standard curve, and preparation of reagents may be repeated until acceptable results are obtained on the standard sample. Only after assuring this should analysis of unknown samples be started. Apart from independent standards, certified reference samples can also be used as 'standards'.
Such samples are obtained from selected laboratories, where the analysis of a prepared standard is carried out by more than one laboratory, and such samples, along with the accompanying analytical values, are used as a check to ensure the accuracy of the analysis.

Use of Blank

A blank determination is an analysis without the analyte or attribute; in other words, an analysis without a sample, going through all steps of the procedure with the reagents only. Use of a blank accounts for any contamination in the chemicals used in the actual analysis. The 'estimate' of the blank is subtracted from the estimates of the samples. 'Sequence control' samples are used in long batches in automated analysis. Generally, two samples, one with a low content and one with a very high content of known analyte (both falling within the working range of the method), are used as standards to monitor the accuracy of analysis.

Blind Sample

A blind sample is a sample with a known content of the analyte. This sample is inserted by the head of the laboratory in batches and at times unknown to the analyst. Various types of sample material may serve as blind samples, such as control samples or sufficiently large leftovers of test samples (analyzed several times). It is essential that the analyst is aware of the possible presence of a blind sample but is not able to recognize the material as such.

Method Validation

Validation is the process of determining the performance characteristics of a method or procedure. It is a prerequisite for judging the suitability of the produced analytical data for the intended use. This implies that a method may be valid in one situation and invalid in another. If a method is very precise and accurate but expensive to adopt, it may be used only when data of that order of precision are needed. The data may be inadequate if the method is less accurate than required. Two types of validation are followed.
Validation of Own Procedure

In-house validation of a method or procedure by the individual user laboratory is a common practice. Many laboratories use their own version of even a well-established method for reasons of efficiency, cost and convenience. A change in the liquid/solid ratio in extraction procedures for available soil nutrients, in shaking time, etc., results in changed values and hence needs validation. Such changes are often introduced to account for local conditions, cost of analysis, required accuracy and efficiency. Validation of such changes is part of quality control in the laboratory. It is also a kind of research project, so not all laboratories may be in a position to modify a standard method; they should follow the given method as accepted and practiced by most other laboratories.

Apart from validation of methods, a system of internal quality control is required to be followed by laboratories to ensure that they are capable of producing reliable analytical data with a minimum of error. This requires continuous monitoring of the operation and systematic day-to-day checking of the produced data to decide whether these are reliable enough to be released. For internal quality control, use a blank and a control (standard) sample of known composition along with the samples under analysis. Since quality control systems rely heavily on control samples, sample preparation must be done with great care to ensure that:

1. The sample is homogeneous.
2. The sample material is stable.
3. The sample has a uniform and correct particle size, as sieved through a standard sieve.
4. Relevant information, such as the properties of the sample and the concentration of the analyte, is available.

The samples under analysis should also be processed and prepared in such a way that they have similar particle size and homogeneity as the standard (control) sample.
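The batch control with a control (standard) sample described above can be sketched as a simple acceptance check: a batch's results are released only if the result obtained on the control sample falls within a tolerance band around its known value. A minimal sketch — the 5 % band is an assumed figure for illustration, not a limit given in the text:

```python
def batch_acceptable(measured, known_value, tolerance_pct=5.0):
    """Accept a batch only if the result on the control (standard) sample
    deviates from its known value by no more than tolerance_pct.
    The 5 % default is an assumption; laboratories set their own limits."""
    deviation_pct = abs(measured - known_value) / known_value * 100.0
    return deviation_pct <= tolerance_pct

# Control sample with a known value of 10.0 units:
print(batch_acceptable(10.2, 10.0))  # True  (2 % deviation)
print(batch_acceptable(11.0, 10.0))  # False (10 % deviation)
```

If the check fails, the corrective measures described in the next section (checking standards, glassware, calibration, equipment) would be applied before re-analyzing the batch.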
When an error is detected in the analysis through an internal check, corrective measures should be taken. The error may be due to calculation or typing; if not, a thorough check is required on sample identification, standards, chemicals, pipettes, dispensers, glassware, the calibration procedure and the equipment. The standard may be old or wrongly prepared. A pipette may indicate the wrong volume, glassware may not be properly cleaned, and the equipment may be defective, or the sample intake tube may be clogged in the case of a flame photometer or atomic absorption spectrophotometer. The source of error should be found and the samples analyzed again.

Validation of the Standard Procedure

This refers to the validation of new or existing methods and procedures intended to be used in many laboratories, including procedures accepted by a national system or by ISO. It involves an inter-laboratory program of testing the method by a number of selected renowned laboratories according to a protocol issued to all participants. Validation is not only relevant when non-standard procedures are used, but just as much when validated standard procedures are used, and even more so when variants of standard procedures are introduced. The results of validation tests should be recorded in a validation report, from which the suitability of a method for a certain purpose can be deduced.

Inter-laboratory Sample and Data Exchange Program

If an error is suspected in the procedure and the uncertainty cannot readily be resolved, it is not uncommon to have the sample analyzed in another laboratory of the same system or organization. The results of the other laboratory may or may not be biased, so doubt may persist. A sample check by another accredited laboratory may be necessary and useful to resolve the problem. An accredited laboratory should participate in at least one inter-laboratory exchange program. Such programs exist locally, regionally, nationally and internationally.
Laboratory exchange programs exist for both method performance studies and laboratory performance studies. In such exchange programs, some laboratories or organizations have devised a system where, periodically, samples of known composition are sent to the participating laboratory without disclosing the expected values. The participating laboratory analyzes the sample by a given method and reports the results. This provides a way of assessing the accuracy of the method being used by a laboratory, and also of checking the adoption of the method suggested by the lead laboratory.

E. Preparation of Reagent Solutions and their Standardization

Chemical reagents are manufactured and marketed in different grades of purity. In general, the purest reagents are marketed as "Analytical Reagent" or AR grade. Other markings are 'LR', meaning laboratory reagent, and 'CP', meaning chemically pure. The strength of chemicals is expressed as normality or molarity. It is, therefore, useful to have some information about the strengths of the important acids and alkalis most commonly used in chemical laboratories. Acids and alkalis are basic chemicals required in a laboratory.

Molarity

A one molar (M) solution contains one mole, or one molecular weight in grams, of a substance in each liter of the solution. The molar method of expressing concentration is useful because equal volumes of equimolar solutions contain equal numbers of molecules.

Normality

The normality of a solution is the number of gram equivalents of the solute per liter of the solution. It is usually designated by the letter N. Semi-normal, fifth-normal, deci-normal, centi-normal and milli-normal solutions are often required; these are written (in short) as 0.5N, 0.2N, 0.1N, 0.01N and 0.001N, respectively. However, the molar expression is preferred, because 'odd' normalities such as 0.121N are clumsily represented in fractional form. The definition of a normal solution uses the term 'equivalent weight'.
This quantity varies with the type of reaction, and hence it is difficult to give a single definition of equivalent weight that covers all reactions. It often happens that the same compound possesses different equivalent weights in different chemical reactions, so a solution may have one normality when employed for one purpose and a different normality when used in another chemical reaction. Hence, the system of molarity is preferred.

Equivalent Weight (Eq W)

The equivalent weight of a substance is the weight in grams which in its reaction corresponds to one gram atom of hydrogen, one gram atom of hydroxyl, half a gram atom of oxygen, or one gram atom of a univalent ion. When one equivalent weight of a substance is dissolved in one liter, it gives a 1 N solution.

Milliequivalent Weight (mEq W)

Equivalent weight (Eq W), when expressed as milliequivalent weight (mEq W), means the equivalent weight in grams divided by 1000. It is commonly abbreviated "me". It is the most convenient value because it is the weight of a substance contained in, or equivalent to, one mL of a 1 N solution. It is, therefore, a unit common to both volumes and weights, making it possible to convert the volume of a solution to its equivalent weight and the weight of a substance to its equivalent volume of solution:

Number of mEq = Volume (mL) x Normality

Table 1. Concentrations of Acids and Bases
Composition of concentrated reagent-grade acids, ammonium hydroxide, and sodium and potassium hydroxide solutions (with dilution directions to prepare a 1N solution)

Chemical Name          Formula    Approx. Strength of   Molarity of      mL of Concd. Reagent Needed
                                  Concd. Reagent (a)    Concd. Reagent   to Prepare 1 Liter of 1N Soln. (c)
Acetic Acid, Glacial   CH3COOH    99.8                  17.4              57.5
Formic Acid            HCOOH      90.0                  23.6              42.5
Hydrochloric Acid      HCl        37.2                  12.1              82.5
Hydrofluoric Acid      HF         49.0                  28.9              34.5
Nitric Acid            HNO3       70.4                  15.9              63.0
Perchloric Acid        HClO4      70.5                  11.7              85.5
Perchloric Acid        HClO4      61.3                   9.5             105.5
Phosphoric Acid        H3PO4      85.5                  14.8              22.5
Sulfuric Acid          H2SO4      96.0                  18.0              28.0
Ammonium Hydroxide     NH4OH      56.6 (b)              14.5              69.0
Sodium Hydroxide       NaOH       50.5                  19.4              51.5
Potassium Hydroxide    KOH        45                    11.7              85.5

a - Representative value, w/w %.
b - Equivalent to 28.0% w/w NH3.
c - Rounded to the nearest 0.5 mL.

Some Important Terms Commonly Used in a Chemical Laboratory

Buffer Solutions

Solutions containing a weak acid and its salt, or a weak base and its salt (e.g. CH3COOH + CH3COONa, and NH4OH + NH4Cl), possess the characteristic property of resisting changes in pH when some acid or base is added to them. Such solutions are referred to as buffer solutions. The important properties of a buffer solution are:

1. It has a definite pH value.
2. Its pH value does not alter on keeping for a long time.
3. Its pH value is only slightly altered when a strong base or strong acid is added.

Because of these properties, ready-made buffer solutions of known pH are used to check the accuracy of the pH meters used in the laboratory.

Titration

Titration is a process of determining the volume of a substance required to just complete the reaction with a known amount of another substance. The solution of known strength used in the titration is called the titrant. The substance to be determined in the solution is called the titrate. The completion of the reaction is judged with the help of an appropriate indicator.

Indicator

A substance which indicates the end point on completion of the reaction is called an indicator.
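Two of the numerical relations above can be checked in a few lines: the milliequivalent rule (Number of mEq = Volume x Normality) and the "mL needed to prepare 1 liter of 1N solution" column of Table 1, which follows from the molarity of the concentrated reagent and its equivalents per mole. A minimal sketch:

```python
def milliequivalents(volume_ml, normality):
    """Number of mEq = volume (mL) x normality (eq/L)."""
    return volume_ml * normality

def ml_for_one_litre_1N(molarity_concd, eq_per_mole=1):
    """mL of concentrated reagent needed to prepare 1 L of 1N solution,
    rounded to the nearest 0.5 mL as in footnote (c) of Table 1."""
    ml = 1000.0 / (molarity_concd * eq_per_mole)
    return round(ml * 2) / 2

print(milliequivalents(25, 0.1))     # 2.5 mEq in 25 mL of 0.1N solution

# Reproducing two rows of Table 1 from the molarity column:
print(ml_for_one_litre_1N(12.1))     # HCl (1 eq per mole): 82.5
print(ml_for_one_litre_1N(18.0, 2))  # H2SO4 (2 eq per mole): 28.0
```

The same calculation reproduces the other rows of the table once the correct number of equivalents per mole is supplied (e.g. 3 for H3PO4 when all three protons are counted).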
Some of the commonly used indicators in volumetric analysis are:

Indicator          Color on acidic side   Range of color change (pH)   Color on basic side
Methyl violet      Yellow                 0.0-1.6                      Violet
Bromophenol blue   Yellow                 3.0-4.6                      Blue
Methyl orange      Red                    3.1-4.4                      Yellow
Methyl red         Red                    4.4-6.3                      Yellow
Litmus             Red                    5.0-8.0                      Blue
Bromothymol blue   Yellow                 6.0-7.6                      Blue
Phenolphthalein    Colorless              8.3-10.0                     Pink
Alizarin yellow    Yellow                 10.1-12.0                    Red

Standard Solution

A solution of accurately known strength (or concentration) is called a standard solution. It contains a definite number of gram equivalents or gram moles per liter of solution. All titrimetric methods depend upon standard solutions, which contain exactly known amounts of the reagents in a unit volume of the solution. A solution is first prepared having approximately the desired concentration. This solution is then standardized by titrating it against another substance which can be obtained in a highly purified form. Thus, a potassium permanganate solution can be standardized against sodium oxalate, which can be obtained in a high degree of purity, since it is easily dried and is non-hygroscopic. Such a substance, whose weight and purity are stable, is called a 'Primary Standard'.

A primary standard must have the following characteristics:

1. It must be obtainable in a pure form or in a state of known purity.
2. It must react in one way only under the conditions of titration, with no side reactions.
3. It must be non-hygroscopic. Salt hydrates are generally not suitable as primary standards.
4. Normally, it should have a large equivalent weight so as to reduce the error in weighing.
5. An acid or a base should preferably be strong, that is, it should have a high dissociation constant, for use as a standard.

A primary standard solution is one which can be prepared directly by weighing, and with which other solutions of approximate strength can be titrated and standardized. Some primary standards are given below:

Acids
1. Potassium hydrogen phthalate
2.
Benzoic acid

Bases
1. Sodium carbonate
2. Borax

Oxidizing agents
1. Potassium dichromate
2. Potassium bromate

Secondary standard solutions are those which are prepared by dissolving a little more than the gram equivalent weight of the substance per liter of solution; their exact standardization is then done against a primary standard solution. Some secondary standards are given below:

Acids
1. Sulfuric acid
2. Hydrochloric acid

Base
1. Sodium hydroxide

Concentrated HCl is approximately 11N. Therefore, to prepare a standard solution of the acid, say decinormal (0.1N), it is diluted roughly one hundred times: take 10 mL of the acid and make up to approximately 1 liter by dilution with distilled water. Titrate this acid against 0.1N Na2CO3 (primary standard) using methyl orange as indicator. The color changes from pink to yellow when the acid is neutralized.

Example: Suppose 10 mL of acid and 12 mL of Na2CO3 are consumed in the titration.

   Acid        Alkali
V1 x N1  =  V2 x N2
10 x N1  =  12 x 0.1
  10 N1  =  1.2
     N1  =  0.12

The normality of the acid is therefore 0.12. Similarly, the normality of sulfuric acid can be worked out. H2SO4 needs to be diluted about 360 times to get approximately 0.1N, because concentrated H2SO4 has a normality of approximately 36. Then titrate against standard Na2CO3 to find the exact normality of the H2SO4.

Standardization of sodium hydroxide (NaOH): By the above method, the normality of the HCl/H2SO4 has been fixed. Therefore, to find the normality of sodium hydroxide, a titration is carried out using one of these standardized acids. For working out molarity, molar standard solutions are used. For the standardization of NaOH or any other alkali, potassium hydrogen phthalate can also be used as a primary standard instead of going through titration against secondary standards; this can be decided depending on the availability of chemicals in the laboratory.

References

1. Van Reeuwijk, L.P. 2002.
International Soil Reference and Information Centre (ISRIC). Procedures for Soil Analysis.
2. Beran, J.A. 2014. Laboratory Manual for Principles of General Chemistry, 10th Edition.
3. Shehana, R.S., et al. Practical Manual: Soil Chemistry, Soil Fertility & Nutrient Management. Department of Soil Science & Agricultural Chemistry, Kerala.

Activity

Answer the following questions. Explain in detail and submit your work through email at [email protected]

1. The chemical laboratory can be a dangerous place to work in. If you are the in-charge of a chemical laboratory, how can you ensure that the personnel or students in the lab are safe? Explain in detail.
2. What quality control measures or steps does an analyst need to take when performing an analysis to ensure that his/her results are of high quality?
3. A student needs to analyze phosphorus using the Bray extraction method and a spectrophotometer for the instrumental analysis. What must the student do to ensure that his analytical results are both accurate and precise?
4. How is an analytical method validated? Why is it important to validate a method?
5. Titration is one of the most popular methods of determining the concentration of a particular substance or analyte. Explain briefly how titration works.
