Nutritional Assessment: Biochemical Methods PDF

Summary

This document discusses biochemical methods for nutritional assessment, comparing them to other assessment types like anthropometry and clinical methods. It reviews different biochemical tests and their applications in detecting deficiencies. The document also covers protein status assessment and its importance in both developed and developing nations.

Full Transcript

Introduction

Compared with the other methods of nutritional assessment (anthropometric, clinical methods, and dietary), biochemical tests can potentially provide more objective and quantitative data on nutritional status. Biochemical tests, also known as biomarkers, often can detect nutrient deficits long before anthropometric measures are altered and clinical signs and symptoms appear. Some of these tests are useful indicators of recent nutrient intake and can be used in conjunction with dietary methods to assess food and nutrient consumption. This chapter discusses the topic of biochemical methods in nutritional assessment, reviews the more commonly encountered tests for those nutrients of public health importance, and provides examples of various biochemical techniques in nutritional assessment.

Nutritional science is a relatively young discipline, and use of biochemical methods as indicators of nutritional status is still in development. This, along with all that yet remains unknown about the human body, makes the use of these measures in nutritional assessment a rapidly developing field and one with many research opportunities.

Use of Biochemical Measures

Biochemical tests available for assessing nutritional status can be grouped into two general and somewhat arbitrary categories: static tests and functional tests. These are sometimes referred to as direct and indirect tests, respectively. Static tests are also referred to as qualitative and quantitative biochemical indicators. Functional tests are also referred to as biological, functional, and histologic indicators.1

Static tests are based on measurement of a nutrient or its metabolite in the blood, urine, or body tissue (for example, serum measurements of folate, retinol, vitamin B12, or vitamin D). These are among the most readily available tests, but they have certain limitations. Although they may indicate nutrient levels in the particular tissue or fluid sampled, they often fail to reflect the overall nutrient status of an individual or whether the body as a whole is in a state of nutrient excess or depletion.2 For example, the amount of calcium in serum can be easily determined, but that single static measurement is a poor indicator of the body's overall calcium status or of bone mineral content.

Functional tests of nutritional status are based on the ultimate outcome of a nutrient deficiency, which is the failure of the physiologic processes that rely on that nutrient for optimal performance. Included among these functional tests are measurement of dark adaptation (assesses vitamin A status) and urinary excretion of xanthurenic acid in response to consumption of tryptophan (assesses vitamin B6 status). Although many functional tests remain in the experimental stage, this is an area of active research and one that is likely to be fruitful. One drawback of some functional tests, however, is a tendency to be nonspecific; they may indicate general nutritional status but not allow identification of specific nutrient deficiencies.

Biochemical tests can also be used to examine the validity of various methods of measuring dietary intake or to determine if respondents are underreporting or overreporting what they eat. The ability of a food frequency questionnaire to accurately measure protein intake, for example, can be assessed by comparing reported protein intake with 24-hour urine nitrogen excretion.
When properly used, this method is sufficiently accurate to use as a validation method in dietary surveys. As with any test requiring a 24-hour urine sample, however, each collection must be complete (i.e., respondents must collect all urine during an exact 24-hour period). Urinary nitrogen is best estimated using multiple 24-hour urine samples, and any extrarenal nitrogen losses must be accounted for.3 The doubly labeled water technique, as mentioned in Chapters 3 and 7, is another biochemical test useful for determining validity and accuracy of reporting. It can be an accurate way of measuring energy expenditure without interfering with a respondent's everyday life.4 If reported energy and protein consumption fail to match estimates of energy and protein intake derived from these properly performed biochemical tests, then the dietary assessment method may be faulty or the respondent did not accurately report food intake.

Biochemical tests are a valuable adjunct in assessing and managing nutritional status; however, their use is not without problems. Most notable among these is the influence that nonnutritional factors can have on test results. A variety of pathologic conditions, use of certain medications, and technical problems in a sample collection or assay can affect test results in ways that make them unusable. Another problem with some biochemical tests is their nonspecificity. A certain test may indicate that a patient's general nutritional status is impaired yet lack the specificity to indicate which nutrient is deficient. Additionally, no single test, index, or group of tests by itself is sufficient for monitoring nutritional status. Biochemical tests must be used in conjunction with measures of dietary intake, anthropometric measures, and clinical methods.

Protein Status

Assessing protein status can be approached by use of anthropometric (Chapters 6 and 7), biochemical, clinical (Chapter 10), and dietary data (Chapters 3 and 4). Biochemical assessment of protein status has typically been approached from the perspective of the two-compartment model: evaluation of somatic protein and visceral protein status. The body's somatic protein is found within skeletal muscle. Visceral protein can be regarded as consisting of protein within the organs or viscera of the body (liver, kidneys, pancreas, heart, and so on), the erythrocytes (red blood cells), and the granulocytes and lymphocytes (white blood cells), as well as the serum proteins.5,6 The somatic and visceral pools contain the metabolically available protein (known as body cell mass), which can be drawn on, when necessary, to meet various bodily needs. The somatic and visceral protein pools comprise about 75% and 25% of the body cell mass, respectively. Together, they comprise about 30% to 50% of total body protein.5 The remaining body protein is found primarily in the skin and connective tissue (bone matrix, cartilage, tendons, and ligaments) and is not readily exchangeable with the somatic and visceral protein pools. Division of the body's protein into these two compartments is somewhat arbitrary and artificial. Although the somatic compartment is homogeneous, the visceral protein pool is composed of hundreds of different proteins serving many structural and functional roles.

Although protein is not considered a public health issue among the general population of developed nations, protein-energy malnutrition (PEM), also known as protein-calorie malnutrition, can be a result of certain diseases and is clearly a pressing concern in many developing nations. Protein-energy malnutrition can be seen in persons with cancer and acquired immune deficiency syndrome (AIDS), children who fail to thrive, those with anorexia nervosa, and homeless persons. Because of its high prevalence and relationship to infant mortality and impaired physical growth, PEM is considered the most important nutritional disease in developing countries.7 It is also of concern in developed nations. According to some reports, PEM has been observed in nearly half of the patients hospitalized in medical and surgical wards in the United States. In more recent studies, the prevalence of PEM ranged from 30% to 40% among patients with hip fractures, patients undergoing thoracic surgery for lung cancer, patients receiving ambulatory peritoneal dialysis, and children and adolescents with juvenile rheumatoid arthritis.2,7--9

Assessment of protein status is central to the prevention, diagnosis, and treatment of PEM. The causes of PEM can be either primary (inadequate food intake) or secondary (other diseases leading to insufficient food intake, inadequate nutrient absorption or utilization, increased nutritional requirement, and increased nutrient losses).7--9 The protein and energy needs of hospitalized patients can be two or more times those of healthy persons as a result of hypermetabolism accompanying trauma, infection, burns, and surgical recovery. PEM can result in kwashiorkor (principally a protein deficiency), marasmus (predominantly an energy deficiency), or marasmic kwashiorkor (a combination of chronic energy deficit and chronic or acute protein deficiency).7 Clinical findings pertinent to kwashiorkor and marasmus are discussed in Chapter 10.

Body weight is a readily obtained indicator of energy and protein reserves. However, it must be carefully interpreted because it fails to distinguish between fat mass and fat-free mass, and losses of skeletal muscle and adipose tissue can be masked by water retention resulting from edema and ascites. The creatinine-height index is also well suited to the clinical setting but has limited precision and accuracy.
Creatinine Excretion and Creatinine-Height Index

A biochemical test sometimes used for estimating body muscle mass is 24-hour urinary creatinine excretion. Creatinine, a product of skeletal muscle, is excreted in a relatively constant proportion to the mass of muscle in the body. It is readily measured by any clinical laboratory. Lean body mass can be estimated by comparing 24-hour urine creatinine excretion with a standard based on stature (Table 9.1) or from reference values of 23 and 18 mg/kg of recommended body weight for males and females, respectively.

Another approach is using the creatinine-height index (CHI), a ratio of a patient's measured 24-hour urinary creatinine excretion and the expected excretion of a reference adult of the same sex and stature. The CHI is expressed by the following formula:

CHI = [24-hour urine creatinine (mg) × 100] ÷ expected 24-hour urine creatinine (mg)

Expected 24-hour urine creatinine values are shown in Table 9.1. These should be matched to the subject's sex and height. The CHI is expressed as a percent of the expected value. A CHI of 60% to 80% is considered indicative of mild protein depletion; 40% to 60% reflects moderate protein depletion; and a value under 40% represents severe depletion.5--7
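To make the arithmetic concrete, the following sketch (Python) computes the index from a measured 24-hour creatinine excretion and an expected value taken from Table 9.1, then applies the depletion cutoffs quoted above. The function names and example numbers are illustrative only and are not part of the chapter.

```python
def creatinine_height_index(measured_mg: float, expected_mg: float) -> float:
    """Creatinine-height index (CHI), expressed as a percent of the expected
    24-hour urinary creatinine excretion for a reference adult of the same
    sex and stature (expected values as in Table 9.1)."""
    return measured_mg / expected_mg * 100


def classify_chi(chi_percent: float) -> str:
    """Interpretation ranges from the text: 60-80% mild, 40-60% moderate,
    and under 40% severe protein depletion."""
    if chi_percent < 40:
        return "severe protein depletion"
    if chi_percent < 60:
        return "moderate protein depletion"
    if chi_percent <= 80:
        return "mild protein depletion"
    return "within expected range"


# Example: a 172.7 cm adult male (expected 1513 mg/24 h per Table 9.1)
# who excretes 980 mg of creatinine in a complete 24-hour collection.
chi = creatinine_height_index(measured_mg=980, expected_mg=1513)
print(f"CHI = {chi:.0f}% -> {classify_chi(chi)}")  # CHI = 65% -> mild protein depletion
```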
Use of midarm muscle circumference and midarm muscle area are two other approaches to assessing somatic protein status.7 These are discussed in Chapter 7. Rather than relying on any single indicator, a combination of measures can produce a more complete picture of protein status. The choice of approaches depends on methods available to the particular facility. Biochemical data on nutritional status constitute only part of the necessary information to evaluate the severity of nutritional depletion and PEM. Data relating to dietary intake, pertinent anthropometric measures, and clinical findings are necessary as well.

Table 9.1 Expected 24-Hour Urinary Creatinine Values for Height for Adult Males and Females

Adult Males*                          Adult Females†
Height (cm)   Creatinine (mg)         Height (cm)   Creatinine (mg)
157.5         1288                    147.3         830
160.0         1325                    149.9         851
162.6         1359                    152.4         875
165.1         1386                    154.9         900
167.6         1426                    157.5         925
170.2         1467                    160.0         949
172.7         1513                    162.6         977
175.3         1555                    165.1         1006
177.8         1596                    167.6         1044
180.3         1642                    170.2         1076
182.9         1691                    172.7         1109
185.4         1739                    175.3         1141
188.0         1785                    177.8         1174
190.5         1831                    180.3         1206
193.0         1891                    182.9         1240

Source: Blackburn GL, Bistrian BR, Maini BS, Schlamm HT, Smith MR. 1977. Nutritional and metabolic assessment of the hospitalized patient. Journal of Parenteral and Enteral Nutrition 1:11--12.
*Creatinine coefficient for males = 23 mg/kg of "ideal" body weight.
†Creatinine coefficient for females = 18 mg/kg of "ideal" body weight.

As mentioned in the section "Use of Biochemical Measures," a major concern when using any test requiring a 24-hour urine sample is obtaining a complete urine sample collected during an exact 24-hour period. The value of protein status measurements based on urinary creatinine measurements can also be compromised by the effect of diet on urine creatinine levels, variability in creatinine excretion, and the use of height-weight tables for determining expected creatinine excretion based on sex and stature.5 These limitations are discussed in Chapter 6.

Nitrogen Balance

A person is said to be in nitrogen balance when the amount of nitrogen (consumed as protein) equals the amount excreted by the body. It is the difference between nitrogen intake and the amount excreted from the body in urine and feces or lost in miscellaneous ways such as the sloughing of skin cells and blood loss.10 Nitrogen balance is the expected state of the healthy adult. It occurs when the rate of protein synthesis, or anabolism, equals the rate of protein degradation, or catabolism. Positive nitrogen balance occurs when nitrogen intake exceeds nitrogen loss and is seen in periods of anabolism, such as childhood or recovery from trauma, surgery, or illness. Negative nitrogen balance occurs when nitrogen losses exceed nitrogen intake and can result from insufficient protein intake, catabolic states (for example, sepsis, trauma, surgery, and cancer), or periods of excessive protein loss (as a result of burns or certain gastrointestinal and renal diseases characterized by unusual protein loss). Nutritional support can help return a patient to positive nitrogen balance or at least prevent severe losses of energy stores and body protein.5--7

Nitrogen balance studies involve 24-hour measurement of protein intake and an estimate of nitrogen losses from the body. The following formula is used:

N2 Balance = (PRO ÷ 6.25) − UUN − 4

where N2 Balance = nitrogen balance (g/24 h); PRO = protein intake (g/24 h); UUN = urine urea nitrogen (g/24 h); and 4 = a constant (in g) representing nitrogen losses not captured by urine urea nitrogen, as explained below.
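As a quick check of the arithmetic, the sketch below (Python) applies the formula as written, treating the 4 g constant for skin, stool, and other non-urea nitrogen losses as the example value given in the text. The function name and input values are illustrative; patients with unusually high non-urine losses would require a different adjustment.

```python
def nitrogen_balance(protein_g_per_day: float, uun_g_per_day: float,
                     insensible_losses_g: float = 4.0) -> float:
    """Nitrogen balance (g N/24 h) = (protein intake / 6.25) - UUN - constant.

    protein_g_per_day   : protein intake over 24 h, from dietary assessment
    uun_g_per_day       : urine urea nitrogen over 24 h
    insensible_losses_g : constant for skin, stool, wound, and non-urea
                          nitrogen losses (the text uses 4 g as an example)
    """
    nitrogen_intake = protein_g_per_day / 6.25
    return nitrogen_intake - uun_g_per_day - insensible_losses_g


# Example: 90 g protein intake and 10 g urine urea nitrogen over 24 hours.
balance = nitrogen_balance(90, 10)
print(f"Nitrogen balance: {balance:+.1f} g/24 h")  # +0.4 g/24 h, approximately in balance
```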
Protein intake, measured by dietary assessment methods, is divided by 6.25 to arrive at an estimate of nitrogen intake. Nitrogen loss is generally estimated by measuring urine urea nitrogen (which accounts for 85% to 90% of nitrogen in the urine) and adding a constant (for example, 4 g) to account for nitrogen losses from the skin, stool, wound drainage, nonurea nitrogen, and so on, which cannot be easily measured.5--7,10 Problems associated with measuring protein intake and nitrogen excretion limit the usefulness of this approach. For example, it is difficult to account for the unusually high nonurine nitrogen losses seen in some patients with burns, diarrhea, vomiting, or fistula drainage. In such cases, this approach to calculating nitrogen balance may not yield accurate results.

Serum Proteins

Serum protein concentrations can be useful in assessing protein status, in determining whether a patient is at risk of experiencing medical complications, and for evaluating a patient's response to nutritional support. The serum proteins of primary interest in nutritional assessment are shown in Table 9.2.2 In most instances, their measurement is simple and accurate. Use of serum protein measurements is based on the assumption that decreases in serum concentrations are due to decreased liver production (the primary site of synthesis). This is considered a consequence of a limited supply of amino acids from which the serum proteins are synthesized or a decrease in the liver's capacity to synthesize serum proteins. The extent to which nutritional status or liver function affects serum protein concentrations cannot always be determined. A number of factors other than inadequate protein intake affect serum protein concentrations. These are noted in Table 9.2.

Albumin

The most familiar and abundant of the serum proteins, as well as the most readily available clinically, is albumin. Serum albumin level has been shown to be an indicator of depleted protein status and decreased dietary protein intake. Measured over the course of several weeks, it has been shown to correlate with other measures of protein status and to respond to protein repletion. Low concentrations of serum albumin are associated with increased morbidity and mortality in hospitalized patients.2,5--7,11 Despite these correlations, the value of albumin as a protein status indicator is severely limited by several factors. Its relatively long half-life (14 to 20 days) and large body pool (4 to 5 g/kg of body weight) cause serum levels to respond very slowly to nutritional change.2,5--7,11 Because it is neither sensitive to, nor specific for, acute PEM or a patient's response to nutritional support, it is not a useful indicator of protein depletion or repletion.2,11

Serum albumin level is determined by several factors: the rate of synthesis, its distribution in the body, the rate at which it is catabolized, abnormal losses from the body, and altered fluid status.2,11 About 60% of the body's albumin is found outside the bloodstream. When serum concentrations begin falling during early PEM, this extravascular albumin moves into the bloodstream, helping maintain normal serum concentrations despite protein and energy deficit. During the acute catabolic phase of an injury, an infection, or surgery, there is increased synthesis of substances known as acute-phase reactants. Included among these are C-reactive protein, fibrinogen, haptoglobin, and alpha-1-acid glycoprotein.
Acute-phase reactants decrease synthesis of albumin, prealbumin, and transferrin. Consequently, levels of these serum proteins may remain low during this catabolic phase despite the provision of adequate nutritional support.14 The practice of administering albumin to severely ill patients also can interfere with its use as an indicator of protein status.

Transferrin

Serum transferrin is a β-globulin synthesized in the liver that binds and transports iron in the plasma. Because of its smaller body pool and shorter half-life, it has been considered a better index of changes in protein status compared with albumin.2,11 Although serum transferrin has been shown to be associated with clinical outcome in children with kwashiorkor and marasmus, its use to predict morbidity and mortality outcomes in hospitalized patients has produced conflicting results.5 Serum transferrin can be measured directly (by radial immunodiffusion and nephelometry), but it is frequently estimated indirectly from total iron-binding capacity (TIBC) using a prediction formula suited to the particular facility's method for measuring TIBC.5 The use of transferrin as an index of nutritional status and repletion is limited by several factors other than protein status that affect its serum concentration. As outlined in Table 9.2, transferrin levels decrease in chronic infections, protein-losing enteropathy, chronically draining wounds, nephropathy, acute catabolic states (e.g., surgery and trauma), and uremia. Serum levels can be increased during pregnancy, estrogen therapy, and acute hepatitis.14,18

Prealbumin

Prealbumin, also known as transthyretin and thyroxine-binding prealbumin, is synthesized in the liver and serves as a transport protein for thyroxine (T4) and as a carrier protein for retinol-binding protein. Because of its short half-life (two to three days) and small body pool (0.01 g/kg body weight), it is considered a more sensitive indicator of protein nutriture and one that responds more rapidly to changes in protein status than albumin or transferrin. Prealbumin decreases rapidly in response to deficits of either protein or energy and is sensitive to the early stages of malnutrition. Because serum concentration quickly returns to expected levels once adequate nutritional therapy begins, it is not recommended as an endpoint for terminating nutritional support. It may prove to be better suited as an indicator of recent dietary intake than as a means of assessing nutritional status.2 Serum concentration also will return to expected levels in response to adequate energy in the absence of sufficient protein intake. Its use as an indicator of protein status appears to be preferable to the use of albumin or transferrin. However, like the other serum proteins outlined in Table 9.2, several factors other than protein status affect its concentration in serum. Levels are reduced in liver disease, sepsis, protein-losing enteropathies, hyperthyroidism, and acute catabolic states (e.g., following surgery or trauma). Serum prealbumin can be increased in patients with chronic renal failure who are on dialysis due to decreased renal catabolism.2,11

Table 9.2 Serum Proteins Used in Nutritional Assessment

Albumin
  Half-life: 18--20 days
  Function: Maintains plasma oncotic pressure; carrier for small molecules
  Comments: In addition to protein status, other factors affect serum concentrations.
  Values: Normal 3.5--5.0 g/dL; mild depletion 3.0--3.4 g/dL; moderate depletion 2.4--2.9 g/dL

Prealbumin (transthyretin)
  Half-life: 2--3 days
  Function: Binds T3 and, to a lesser extent, T4; carrier for retinol-binding protein
  Comments: Level is increased in patients with chronic renal failure on dialysis due to decreased renal catabolism; reduced in acute catabolic states, after surgery, in hyperthyroidism, and in protein-losing enteropathy; increased in some cases of nephrotic syndrome; serum level is determined by overall energy balance as well as nitrogen balance.
  Values: Normal 16--40 mg/dL; mild depletion 10--15 mg/dL; moderate depletion 5--9 mg/dL; severe depletion < 5 mg/dL

Retinol-binding protein (RBP)
  Half-life: 12 hours
  Function: Transports vitamin A in plasma; binds noncovalently to prealbumin
  Comments: Catabolized in the renal proximal tubular cell; with renal disease, RBP increases and its half-life is prolonged; low in vitamin A deficiency, acute catabolic states, after surgery, and in hyperthyroidism.
  Values: Normal 2.1--6.4 mg/dL

Source: Heimburger DC. 2012. Malnutrition and nutritional assessment. In Longo DL, Fauci AS, Kasper DL, Hauser SL, Jameson JL, Loscalzo J (eds.), Harrison's Principles of Internal Medicine, 18th ed. New York: McGraw-Hill, 605--612; and Charney P, Malone AM. 2009. ADA Pocket Guide to Nutrition Assessment, 2nd ed. Chicago: American Dietetic Association.
Note: All the listed proteins are influenced by hydration and the presence of hepatocellular dysfunction.
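Because the cutoffs in Table 9.2 are simple threshold ranges, they translate directly into code. The sketch below (Python) classifies a serum prealbumin (transthyretin) value using the ranges listed in the table; it is illustrative only, and, as the table notes, such values must be interpreted in light of hydration, liver function, and the other non-nutritional factors listed.

```python
def classify_prealbumin(prealbumin_mg_dl: float) -> str:
    """Classify serum prealbumin (transthyretin) using the Table 9.2 ranges:
    normal 16-40 mg/dL, mild 10-15, moderate 5-9, severe < 5 mg/dL."""
    if prealbumin_mg_dl >= 16:
        return "normal"
    if prealbumin_mg_dl >= 10:
        return "mild depletion"
    if prealbumin_mg_dl >= 5:
        return "moderate depletion"
    return "severe depletion"


print(classify_prealbumin(12))  # mild depletion
```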
Retinol-Binding Protein

Retinol-binding protein, a liver protein, acts as a carrier for retinol (vitamin A alcohol) when complexed with prealbumin. It circulates in the blood as a 1:1:1 trimolecular complex with retinol and prealbumin.13 Retinol-binding protein shares several features with prealbumin. It responds quickly to protein-energy deprivation and adequate nutritional therapy, as well as to ample energy in the absence of sufficient protein. Like prealbumin, it may be a better indicator of recent dietary intake than of overall nutritional status. It has a much shorter half-life (about 12 hours) than prealbumin. Its smaller body pool (0.002 g/kg body weight), however, complicates its precise measurement. There is no convincing evidence that its use in nutritional assessment is preferred over prealbumin. Because it is catabolized in the renal proximal tubule cell, serum levels are increased in renal disease and its half-life is prolonged. Serum levels can be decreased in vitamin A deficiency, acute catabolic states, and hyperthyroidism.2,13

Iron Status

Iron deficiency is the most common single nutrient deficiency in the United States and the most common cause of anemia. Preschool children and women of childbearing age are at highest risk of iron deficiency.14 Globally, 43% of preschool children and 33% of nonpregnant women are anemic, with the highest burden in Africa and South Asia.15 Iron deficiency results when ingestion or absorption of dietary iron is inadequate to meet iron losses or iron requirements imposed by growth or pregnancy.
Considerable iron can be lost from heavy menstruation, frequent blood donations, early feeding of cow's milk to infants, frequent aspirin use, or disorders characterized by gastrointestinal bleeding. Risk of iron deficiency increases during periods of rapid growth, notably in infancy (especially in premature infants), adolescence, and pregnancy. The consequences of iron deficiency include reduced work capacity, impaired body temperature regulation, impairments in behavior and intellectual performance, increased susceptibility to lead poisoning, and decreased resistance to infections. During pregnancy, iron deficiency increases risk of maternal death, prematurity, low birth weight, and neonatal mortality. During early childhood it can have adverse effects on cognitive, motor, and emotional development that may be only partially reversible.15,16 The public health significance of iron deficiency is underscored by the fact that two of the objectives for Healthy People 2020 relate to reducing iron deficiency among young children and females of childbearing age and among pregnant females.14

Clinicians typically request the complete blood count (CBC) as a first-step approach to assessing iron status (see Box 9.1). Anemia is a hemoglobin level below the normal reference range for individuals of the same sex and age. Descriptive terms such as microcytic, macrocytic, and hypochromic are sometimes used to describe anemias. Microcytic refers to abnormally small red blood cells defined by a mean corpuscular volume (MCV) < 80 femtoliters (fL), whereas macrocytic describes unusually large red blood cells defined as an MCV > 100 fL. Hypochromic cells are those with abnormally low levels of hemoglobin as defined by a mean corpuscular hemoglobin concentration < 320 g of hemoglobin/L or by a mean corpuscular hemoglobin < 27 picograms (pg, 10⁻¹² grams). Although the most common cause of anemia is iron deficiency, other common causes include inflammation, infection, tissue injury, and cancer, which are collectively referred to as anemia of inflammation.15 Anemia can also result from deficiencies of folate and vitamin B12. Of particular concern to physicians working with individual patients and to nutritional epidemiologists attempting to estimate the prevalence of iron deficiency in populations is differentiating iron-deficiency anemia from anemia caused by inflammatory disease, infection, chronic diseases, and thalassemia traits.17,18
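The morphologic descriptors defined above are simply cutoffs on the red cell indices, so they can be expressed compactly. The following sketch (Python) applies the MCV, MCHC, and MCH cutoffs quoted in the text; it describes red cell morphology only and does not by itself establish the cause of an anemia.

```python
def describe_red_cells(mcv_fl: float, mch_pg: float, mchc_g_l: float) -> list[str]:
    """Morphologic descriptors from the text:
    microcytic  MCV < 80 fL; macrocytic MCV > 100 fL;
    hypochromic MCHC < 320 g/L or MCH < 27 pg."""
    labels = []
    if mcv_fl < 80:
        labels.append("microcytic")
    elif mcv_fl > 100:
        labels.append("macrocytic")
    else:
        labels.append("normocytic")
    if mchc_g_l < 320 or mch_pg < 27:
        labels.append("hypochromic")
    else:
        labels.append("normochromic")
    return labels


# Example values consistent with long-standing iron-deficiency anemia.
print(describe_red_cells(mcv_fl=72, mch_pg=22, mchc_g_l=300))  # ['microcytic', 'hypochromic']
```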
Stages of Iron Depletion

The risk of iron deficiency increases as the body's iron stores are depleted. Iron depletion can be divided into three stages. These stages and the biochemical tests used in identifying them are shown in Table 9.3. Figure 9.1 illustrates how values for these tests change throughout the stages of iron deficiency.

The first stage of iron depletion, depleted iron stores, is not associated with any adverse physiologic effects, but it does represent a state of vulnerability.16,18 Low stores occur in healthy persons and appear to be the usual physiologic condition for growing children and menstruating women.18,19 As shown in Figure 9.1, during this first stage, low iron stores are reflected by decreased serum ferritin levels, but values for the other biochemical tests remain within normal limits.

The second stage of iron depletion, early functional iron deficiency without anemia, can be considered representative of early or mild iron deficiency because, at this point, adverse physiologic consequences can begin to occur. This stage is characterized by changes indicating insufficient iron for normal production of hemoglobin and other essential iron compounds (for example, myoglobin and iron-containing enzymes).16,18,19 As shown in Figure 9.1, this stage is characterized by decreased transferrin saturation and increased erythrocyte protoporphyrin levels. A precursor of hemoglobin, erythrocyte protoporphyrin increases when too little iron is available for optimal hemoglobin synthesis. Although hemoglobin may be decreased at this stage, it may not fall below the lowest levels seen in normal subjects. Consequently, hemoglobin is not a useful indicator of either stage 1 or stage 2 iron depletion.16

Box 9.1 The Complete Blood Count (CBC)

A complete blood count (CBC) is a blood test used to evaluate overall health and detect a wide range of disorders, including anemia, infection, inflammation, bleeding disorders, and leukemia. Abnormal increases or decreases in cell counts as revealed in a complete blood count may indicate an underlying medical condition that calls for further evaluation. A phlebotomist collects the sample through venipuncture, drawing the blood into a test tube containing an anticoagulant (EDTA, sometimes citrate) to stop it from clotting. The sample is then transported to a laboratory, where automated hematology analyzers perform the cell counts, measurements, and calculations. The three main physical technologies used in hematology analyzers are electrical impedance, flow cytometry, and fluorescent flow cytometry. These are used in combination with chemical reagents that lyse or alter blood cells to perform the measurable parameters. The CBC is a panel of tests that evaluates the three types of cells (red blood cells, white blood cells, and platelets) that circulate in the blood compartment.

1. Red blood cells and iron status: red blood cells (RBC), which carry oxygen; hemoglobin, the oxygen-carrying protein in red blood cells; and hematocrit, the proportion of red blood cells to the fluid component, or plasma, in blood. If measures in these three areas are lower than normal, anemia is typically diagnosed (with follow-up tests to confirm). High RBC counts (erythrocytosis) or high hemoglobin or hematocrit levels could indicate an underlying medical condition, such as polycythemia vera or heart disease. (See Tables 9.4 and 9.5 and www.labtestsonline.org for normal ranges.) RBC indices, often included in the CBC, are calculations that provide information on the physical characteristics of the RBCs. (See Tables 9.4 and 9.5 for normal ranges.) Mean corpuscular volume (MCV) is a measurement of the average size of a single red blood cell. Mean corpuscular hemoglobin (MCH) is a calculation of the average amount of hemoglobin inside a single red blood cell. Mean corpuscular hemoglobin concentration (MCHC) is a calculation of the average concentration of hemoglobin inside a single red blood cell. Red cell distribution width (RDW) is a calculation of the variation in the size of RBCs. The CBC may also include the reticulocyte count, which is a measurement of the absolute count or percentage of young red blood cells in blood.

2. White blood cells: A low white blood cell (WBC) count (leukopenia) may be caused by a medical condition such as an autoimmune disorder that destroys WBCs, bone marrow disease, or cancer. Certain medications can cause WBCs to drop. High WBC counts (leukocytosis) may indicate an infection or inflammation, an immune system disorder, bone marrow disease, or a reaction to medication. The normal WBC range is 3.5--10.5 billion cells/L. The WBC differential, which may or may not be included in the CBC test, identifies and counts the number of five different types of WBCs: neutrophils, lymphocytes, monocytes, eosinophils, and basophils.

3. Platelets: Platelets are cell fragments that are vital for normal blood clotting. A platelet count that is lower than normal (thrombocytopenia) or higher than normal (thrombocytosis) may indicate an underlying disease condition or a side effect from medication. Further testing is needed to diagnose the cause. The normal platelet count is 150--450 billion/L. Mean platelet volume (MPV) and platelet distribution width (PDW) may be reported with a CBC (calculations of the average size of platelets and how uniform platelets are in size, respectively).
Although hemoglobin The WBC differential, which may or may not be included in the CBC test, identifies and counts the number of five different types of WBCs: neutrophils, lymphocytes, monocytes, eosinophils, and basophils. 3. Platelets: Platelets are cell fragments that are vital for normal blood clotting. A platelet count that is lower than normal (thrombocytopenia) or higher than normal (thrombocytosis) may indicate an underlying disease condition or a side effect from medication. Further testing is needed to diagnose the cause. The normal platelet count is 150--450 billion/L. Mean platelet volume (MPV) and platelet distribution width (PDW) may be reported with a CBC (calculations of the average size of platelets and how uniform platelets are in size, respectively). may be decreased at this stage, it may not fall below the lowest levels seen in normal subjects. Consequently, hemoglobin is not a useful indicator of either stage 1 or stage 2 iron depletion.16 chapter 9 Biochemical Assessment of Nutritional Status 289 table 9.3 The Three Stages of Iron Deficiency and the Indicators Used to Identify Them Stage of Iron Deficiency Indicator Diagnostic Range 1. Depleted stores 2. Early functional iron deficiency (without anemia) 3. Iron-deficiency anemia Serum ferritin concentration Total iron-binding capacity Transferrin saturation Free erythrocyte protoporphyrin Serum transferrin receptor Hemoglobin concentration Mean corpuscular volume  400 μg/dL  70 μg/dL erythrocyte \> 8.5 mg/L  11 44--64 39--59 35--50 29--43 30--40 32--44 42--52 37--47 \> 33 Source: Pagana KD, Pagana TJ. 2010. Mosby's manual of diagnostic and laboratory tests, 4th ed. St. Louis: Mosby Elsevier. Cuto value Individuals with abnormal status False positives Number of individuals Individuals with normal status False negatives Values for indicator Figure 9.2 Effect of applying a cutoff value for an indicator of nutritional status to the distributions of values for individuals with adequate status and individuals with inadequate status. volume of whole blood. It can be measured manually by comparing the height of whole blood in a capillary tube with the height of the RBC column after the tube is cen trifuged. In automated counters, it is calculated from the RBC count (number of RBCs per liter of blood) and the mean corpuscular volume. Hematocrit depends largely on the number of red blood cells and to a lesser extent on their average size. Normal ranges for hematocrit are shown in Table 9.4. As is the case with hemoglobin, iso lated measurement of hematocrit is not suitable as a sole indicator of iron status.17 Mean Corpuscular Hemoglobin The mean corpuscular hemoglobin (MCH) is the amount of hemoglobin in red blood cells. It is calculated by divid ing hemoglobin level by the red blood cell count. Reference values are approximately 26 to 34 pg. MCH is influenced by the size of the red blood cell and the amount of hemoglobin in relation to the size of the cell.18,19 A similar measure, mean corpuscular hemoglobin concentration (MCHC) is the average concentration of hemoglobin in the average red blood cell. It is calculated by dividing the hemoglobin value by the value for hema tocrit. Normal values lie in the range of 320 to 360 g/L (32 to 36 g/dL).18,19 Mean Corpuscular Volume Mean corpuscular volume (MCV) is the average volume of red blood cells. It is calculated by dividing the hemato crit value by the RBC count. Values for MCV are nor mally in the range of 80 to 100 fL for both males and females. 
Reference blood cell values for adults are shown in Table 9.5.

Table 9.5 Reference Blood Cell Values for Adults

                                                         Males        Females
Hemoglobin (g/dL of blood)                               14--18       12--16
Hematocrit (%)                                           40--54       37--47
Red cell count (× 10¹²/L of blood)                       4.7--6.1     4.2--5.4
Mean corpuscular hemoglobin (pg)                         27--33       27--33
Mean corpuscular hemoglobin concentration (g/dL)         31--35       31--35
Mean corpuscular volume (fL)                             82--98       82--98

Source: Ravel R. 1994. Clinical Laboratory Medicine: Clinical Application of Laboratory Data, 6th ed. St. Louis: Mosby.

Assessing Iron Status

One of the challenges of addressing iron deficiency has been uncertainty about the best approach for assessing iron status.17 Because no single biochemical test exists for reliably assessing iron status, models using two or more different indicators of iron status have been developed. Three of these models are shown in Table 9.6. The two most commonly used are the ferritin model and the body iron model. The ferritin model (also known as the three-indicator model) uses three indicators: serum ferritin, transferrin saturation, and erythrocyte protoporphyrin.14,16,23,24 The body iron model uses two indicators: serum ferritin and soluble transferrin receptor (sTfR).14,15,17,20--23 Another model, the mean corpuscular volume (MCV) model, uses MCV, transferrin saturation, and erythrocyte protoporphyrin as indicators.25

Table 9.6 Models for Assessing Iron Status

Body iron model*: soluble transferrin receptor (sTfR); serum ferritin
Ferritin model†: serum ferritin; transferrin saturation; erythrocyte protoporphyrin
Mean corpuscular volume (MCV) model†: MCV; transferrin saturation; erythrocyte protoporphyrin

*Measurements used to calculate the sTfR to serum ferritin ratio.
†Two of three values must be abnormal.

Both the ferritin model and the MCV model require that at least two of the three indicators be abnormal.23--25 The ferritin model tends to overestimate the presence of iron deficiency because it includes ferritin, which reflects stores in the first stage of iron depletion. The MCV model, on the other hand, includes three biochemical tests, all of which reflect altered red blood cell formation.26 Both models are capable of identifying persons in the second and third stages of iron depletion, but they may fail to distinguish iron-deficiency anemia from the other common causes of anemia, such as inflammation, acute and chronic disease, and lead poisoning, because they include erythrocyte protoporphyrin as a variable.24--26
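Both the ferritin and MCV models reduce to an "at least two of three indicators abnormal" rule, which the sketch below makes explicit (Python). The abnormality judgments themselves are passed in as booleans, since the cutoff for each indicator depends on age, sex, and the laboratory's reference ranges and is not reproduced here.

```python
def ferritin_model(ferritin_abnormal: bool,
                   transferrin_sat_abnormal: bool,
                   protoporphyrin_abnormal: bool) -> bool:
    """Ferritin (three-indicator) model: iron deficiency is indicated when
    at least two of the three indicators are abnormal."""
    flags = [ferritin_abnormal, transferrin_sat_abnormal, protoporphyrin_abnormal]
    return sum(flags) >= 2


def mcv_model(mcv_abnormal: bool,
              transferrin_sat_abnormal: bool,
              protoporphyrin_abnormal: bool) -> bool:
    """MCV model: the same two-of-three rule, with MCV replacing serum ferritin."""
    flags = [mcv_abnormal, transferrin_sat_abnormal, protoporphyrin_abnormal]
    return sum(flags) >= 2


print(ferritin_model(True, True, False))  # True  -> consistent with iron deficiency
print(mcv_model(False, True, False))      # False -> criteria not met
```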
The MCV model was employed to assess iron status using data from NHANES II (1976--1980).25 The ferritin model was used in NHANES III as well as the first few years of the continuous NHANES survey that began in 1999.20 Beginning in 2003, NHANES limited its interest in assessing iron status to children 1--5 years of age and to women of childbearing age (12--49 years of age) and introduced the measurement of sTfR. Since 2003, NHANES has used the body iron model to assess body iron in the two groups of interest (children 1--5 years and women 12--49 years).20 Of the three models, the body iron model is considered superior because it is less affected by inflammation than are the ferritin and MCV models. In addition, only two measures are used in the body iron model, whereas three measures are used in the other models. The greater simplicity of the body iron model makes it more suitable for use in areas where resources are limited and where anemia due to inflammation, chronic disease, and nutrient deficiencies other than iron are relatively common.14,15,20--22

Using the body iron model, body iron can be estimated from the ratio of sTfR to serum ferritin. Body iron is in positive balance (≥ 0 mg of iron per kg of body weight, or ≥ 0 mg/kg) when there is residual storage iron, or in negative balance (< 0 mg/kg) during functional iron deficiency, when there is a lack of the iron required to maintain a normal hemoglobin concentration.20 Both sTfR and serum ferritin can be measured using a small capillary blood specimen. Body iron is expressed in mg of iron per kg of body weight, controlling for the effect of body weight when estimating body iron and increasing the utility of this model for assessing the iron status of younger persons. Figure 9.3 shows the age-adjusted prevalence estimates of low body iron stores (< 0 mg/kg) in U.S. children and women by race/ethnicity using data from NHANES 2003--2006. The highest prevalence of low body iron assessed using the body iron model was seen in Mexican American children 1--5 years of age and in non-Hispanic black women 12--49 years of age.20

Figure 9.3 Age-adjusted prevalence estimates of low body iron stores (< 0 mg/kg) in U.S. children 1--5 years of age and women 12--49 years of age, by race/ethnicity, NHANES 2003--2006.
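For readers who want to see the body iron model in numbers, the sketch below (Python) uses one published formulation of the sTfR-to-ferritin ratio (the Cook and colleagues equation, the form applied to NHANES data). The chapter describes the model but does not print the equation, so the constants here should be treated as an assumption, and laboratory-specific sTfR assays may require recalibration before this formula applies.

```python
import math


def body_iron_mg_per_kg(stfr_mg_l: float, ferritin_ug_l: float) -> float:
    """Body iron (mg/kg) from the soluble transferrin receptor (sTfR) to
    serum ferritin ratio, using the Cook et al. formulation (assumed here):

        body iron = -(log10(sTfR[ug/L] / ferritin[ug/L]) - 2.8229) / 0.1207
    """
    ratio = (stfr_mg_l * 1000.0) / ferritin_ug_l   # convert sTfR from mg/L to ug/L
    return -(math.log10(ratio) - 2.8229) / 0.1207


def iron_balance(body_iron: float) -> str:
    """Positive balance (>= 0 mg/kg) means residual storage iron; negative
    balance (< 0 mg/kg) indicates functional iron deficiency."""
    if body_iron >= 0:
        return "positive balance (residual storage iron)"
    return "negative balance (functional iron deficiency)"


# Hypothetical values: sTfR 5.0 mg/L and serum ferritin 30 ug/L.
bi = body_iron_mg_per_kg(stfr_mg_l=5.0, ferritin_ug_l=30.0)
print(f"{bi:+.1f} mg/kg -> {iron_balance(bi)}")  # about +5 mg/kg -> positive balance
```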
Iron Overload

A second disorder of iron metabolism is iron overload, which is the accumulation of excess iron in body tissues. Iron overload is most often the result of hemochromatosis, a group of genetic diseases characterized by excessive intestinal iron absorption and deposition of excessive amounts of iron in parenchymal cells, with eventual tissue damage. As shown in Figure 9.4, women are at risk of iron deficiency, while men are at risk of iron excess, and there are marked differences in the prevalence of low and high serum ferritin concentrations between the two sexes.20,27

Figure 9.4 Age-adjusted prevalence estimates of low serum ferritin (< 15 ng/mL) and high serum ferritin (> 200 ng/mL for men and > 150 ng/mL for women) in U.S. men and women 12--49 years of age, NHANES 1999--2006. Women are at risk of iron deficiency, while men are at risk of iron excess, as shown by the marked differences in the prevalence of low and high serum ferritin concentrations between the two sexes. Source: U.S. Centers for Disease Control and Prevention. 2012. Second national report on biochemical indicators of diet and nutrition in the U.S. population. Atlanta: National Center for Environmental Health. www.cdc.gov/nutritionreport.

Calcium Status

Calcium is essential for bone and tooth formation, muscle contraction, blood clotting, and cell membrane integrity.28,29 Of the 1200 g of calcium in the adult body, approximately 99% is contained in the bones. The remaining 1% is found in extracellular fluids, intracellular structures, and cell membranes.28,29 Interest in osteoporosis prevention and treatment, coupled with data showing low calcium intakes in certain groups, especially women, has made calcium a current public health issue. This has sparked interest in assessing the body's calcium status.

At the current time, there are no appropriate biochemical indicators for assessing calcium status. This is due in large part to the biological mechanisms that tightly control serum calcium levels despite wide variations in dietary intake and the fact that the skeleton serves as a calcium reserve so large that calcium deficiency at the cellular or tissue level is essentially never encountered, at least for nutritional reasons.28,31 Potential approaches to assessing calcium status can be categorized in three areas: bone mineral content measurement, biochemical markers, and measures of calcium metabolism.28 Of these three approaches, measurement of bone mineral content by such methods as quantitative computed tomography, single- and dual-photon absorptiometry, and dual-energy X-ray absorptiometry is currently the most feasible approach to assessing calcium status. DXA testing for body composition and bone mineral content is described in Chapter 6. However, fewer biochemical markers and measures of calcium metabolism are available. Attempts to identify a calcium status indicator in blood have been unsuccessful.

Serum Calcium Fractions

Serum calcium exists in three fractions: protein-bound, ionized, and complexed.28 These and other values for calcium in body fluids are shown in Table 9.7. The protein-bound calcium is considered physiologically inactive, whereas the ionized fraction is considered physiologically active and functions as an intracellular regulator.30--32 Complexed calcium is complexed with small negative ions, such as citrate, phosphate, and lactate. Its biological role is uncertain. Because the ionized and complexed calcium are diffusible across semipermeable membranes, these two fractions can be collectively referred to as ultrafilterable calcium. Serum levels of calcium are so tightly controlled by the body that there is little, if any, association between dietary calcium intake and serum levels.32 Altered serum calcium levels are rare and indicate serious metabolic problems rather than low or high dietary intakes.

Table 9.7 Normal Values for Calcium in Body Fluids

Plasma
  Total calcium (mmol/L): mean 2.5; normal range 2.3--2.75
  Ionized (mmol/L): mean 1.18; normal range 1.1--1.28
  Complexed (mmol/L): normal range 0.15--0.30
  Protein-bound (mmol/L): normal range 0.93--1.08

Urine
  24-hour calcium (mmol/L), women: mean 4.55; normal range 1.25--10
  24-hour calcium (mmol/L), men: mean 6.22; normal range 1.25--12.5
  Fasting calcium:creatinine ratio (molar), postmenopausal women: 0.341 ± 0.183*
  Fasting calcium:creatinine ratio (molar), men: 0.169 ± 0.099*

Source: Weaver CM. 1990. Assessing calcium status and metabolism. Journal of Nutrition 120:1470--1473.
*Mean ± SD.
Low serum calcium, or hypocalcemia (serum calcium concentration < 2.3 mmol/L), can result from a variety of conditions, including hypoparathyroidism (deficient or absent levels of parathyroid hormone), renal disease, and acute pancreatitis. High serum calcium concentrations, or hypercalcemia (serum calcium > 2.75 mmol/L), can be due to increased intestinal absorption, bone resorption, or renal tubular reabsorption resulting from such conditions as hyperparathyroidism, hyperthyroidism, and hypervitaminosis D (excessive intake of vitamin D).30,31

Urinary Calcium

Urinary calcium levels are more responsive to changes in dietary calcium intake than are serum levels.31 However, urinary calcium is affected by a number of other factors, including those factors leading to hypercalcemia. When serum levels are high, more calcium is available to be excreted through the urine. There is a diurnal variation in urinary calcium, with concentrations higher during the day and lower in the evening.30 Calcium output tends to be increased when the diet is rich in protein and low in phosphate and tends to be decreased by high-protein diets rich in phosphate.30 Urinary calcium losses are increased when the volume of urine output is higher and when the kidneys' ability to reabsorb calcium is impaired.31,32 Hypocalciuria can result from those factors leading to hypocalcemia as well as from renal failure.36,37

Use of the ratio of calcium to creatinine calculated from two-hour fasting urine samples has been suggested as a possible indicator of calcium status but requires further research. The calcium level in an overnight urine sample shows potential as an indicator of compliance with calcium supplementation.31
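The hypocalcemia and hypercalcemia cutoffs quoted above are straightforward to encode. The sketch below (Python) classifies a total serum calcium value and is purely illustrative, since, as the text stresses, abnormal values point to metabolic disease rather than to dietary calcium intake.

```python
def classify_serum_calcium(total_ca_mmol_l: float) -> str:
    """Total serum calcium: hypocalcemia < 2.3 mmol/L, hypercalcemia > 2.75 mmol/L,
    otherwise within the normal range (cutoffs as given in the text)."""
    if total_ca_mmol_l < 2.3:
        return "hypocalcemia"
    if total_ca_mmol_l > 2.75:
        return "hypercalcemia"
    return "within normal range"


print(classify_serum_calcium(2.5))  # within normal range
print(classify_serum_calcium(2.9))  # hypercalcemia
```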
Zinc Status

Zinc's most important physiologic function is as a component of numerous enzymes.16,33--35 Consequently, zinc is involved in many metabolic processes, including protein synthesis, wound healing, immune function, and tissue growth and maintenance. Severe zinc deficiency characterized by hypogonadism and dwarfism has been observed in the Middle East. Evidence of milder forms of zinc deficiency (detected by biochemical and clinical measurements) has been found in several population groups in the United States. In humans and laboratory animals, a reduction or cessation of growth is an early response to zinc deficiency, and supplementation in growth-retarded infants and children who are mildly zinc deficient can result in a growth response.33,34

Because there is concern about the adequacy of zinc intake among certain groups, especially females, zinc is considered a potential public health issue for which further study is needed. Nutrient intake data and other specific findings suggest that several U.S. population groups may have marginal zinc intakes. According to NHANES data, the average intake of zinc among females ages 20 to 49 years (approximately 9.6 mg/d) is roughly 80% of the RDA. Biochemical and clinical data derived from U.S. government nutritional monitoring activities, however, show no impairment of zinc status.

Plasma Zinc Concentrations

There is currently no specific, sensitive biochemical or functional indicator of zinc status.33 Static measurements of plasma zinc are available, but their use is complicated by the body's homeostatic control of zinc levels and by factors influencing serum zinc levels that are unrelated to nutritional status.33,34

There is little, if any, functional reserve of zinc in the body, as there is of some other nutrients (for example, iron, calcium, and vitamin A). The body's zinc levels are maintained by both conservation and redistribution of tissue zinc. In mild zinc deficiency, conservation is manifested by reduction or cessation of growth in growing organisms and by decreased excretion in nongrowing organisms. In most instances of mild deficiency, this appears to be the extent of clinical and biochemical changes. If the deficiency is severe, however, additional clinical signs soon appear.16,33

Mature animals and humans have a remarkable capacity to conserve zinc when intakes are low. As a result, inducing zinc deficiency in full-grown animals can be difficult.33 Several mechanisms are responsible for this. Fecal zinc excretion, for example, can be cut by as much as 60% when dietary intake is low. Not only is the efficiency of intestinal absorption of zinc increased, but losses via the gastrointestinal tract, urine, and sweat are diminished.33--38 In laboratory animals, deficiency can lead to selective redistribution of zinc from certain tissues to support other, higher-priority tissues. In mild zinc deficiency, plasma zinc levels apparently can be maintained at the expense of zinc from other tissues.33,34 Some evidence suggests that redistribution of total body zinc also occurs in humans.

The result of the body's conservation of and ability to redistribute zinc is that measurements of plasma zinc are not a reliable indicator of dietary zinc intake or changes in whole-body zinc status.38 This is especially the case in mild zinc deficiency. For example, plasma zinc concentrations in growth-retarded children whose growth responded to zinc supplementation were not significantly different from concentrations in normally developed children either before or after supplementation.38 Despite these limitations, measurement of plasma zinc concentration may be a useful, albeit late, indicator of the size of the body's exchangeable zinc pool. Lower than expected values may signal a loss of zinc from bone and liver and increased risk for clinical and metabolic signs of zinc deficiency.37,38

Several factors unrelated to nutritional status can influence plasma zinc levels. Decreased levels can result from stress, infection or inflammation, and use of estrogens, oral contraceptives, and corticosteroids.16,33,34 Plasma zinc can fall by 15% to 20% following a meal.35 Increased plasma zinc concentrations can result from fasting and red blood cell hemolysis.

Metallothionein and Zinc Status

Metallothionein is a protein found in most tissues but primarily in the liver, pancreas, kidney, and intestinal mucosa. Metallothionein holds promise as a potential indicator of zinc status, particularly when used in conjunction with plasma zinc levels.35,38 Measurable amounts are found in serum and in red blood cells. Metallothionein has the capacity of binding zinc and copper, and tissue metallothionein concentrations often are proportional to zinc status. In animals, levels are almost undetectable in zinc deficiency and are responsive to zinc supplementation. Whereas plasma zinc levels fall in response to acute stimuli (for example, stress, infection, and inflammation), hepatic and serum metallothionein levels are increased in response to these stimuli. Thus, when plasma levels of zinc fall and those of metallothionein rise, it is likely that tissue zinc has been redistributed in response to acute stimuli and that a zinc deficiency is not present, because metallothionein is not responsive to acute stimuli in zinc-deficient animals. If plasma zinc and metallothionein are both below expected levels, it is likely that zinc deficiency is present. Erythrocyte metallothionein (which is not affected by stress) also can be used as an indicator of zinc status.35
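The interpretive logic in the preceding paragraph, combining plasma zinc with metallothionein, can be summarized as a small decision rule. The sketch below (Python) encodes that reasoning for illustration only; "low" and "elevated" are assumed to be judged against the laboratory's own reference ranges.

```python
def interpret_zinc_status(plasma_zinc_low: bool,
                          metallothionein_elevated: bool,
                          metallothionein_low: bool) -> str:
    """Decision rule paraphrased from the text:
    - plasma zinc low + metallothionein elevated -> zinc redistributed by an
      acute stimulus (stress, infection, inflammation); deficiency unlikely
    - plasma zinc low + metallothionein low      -> zinc deficiency likely
    - plasma zinc not low                        -> no indication of deficiency here"""
    if not plasma_zinc_low:
        return "no indication of zinc deficiency from these measures"
    if metallothionein_elevated:
        return "probable redistribution due to acute stimulus; deficiency unlikely"
    if metallothionein_low:
        return "zinc deficiency likely"
    return "indeterminate; consider other indicators"


print(interpret_zinc_status(plasma_zinc_low=True,
                            metallothionein_elevated=False,
                            metallothionein_low=True))  # zinc deficiency likely
```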
Hair Zinc

Several researchers have investigated the use of zinc in hair as an indicator of body zinc status.36,38 Decreased concentration of zinc in hair has been reported in zinc-deficient dwarfs, in marginally deficient children and adolescents, and in conditions related to zinc deficiency, such as celiac disease, acrodermatitis enteropathica, and sickle cell disease. Because hair grows slowly (about 1 cm per month), levels of zinc and other trace elements in hair reflect nutritional status over many months and thus are not affected by diurnal variations or short-term fluctuations in nutritional status. Because of this, hair zinc levels may not be correlated with measurements of zinc in serum or erythrocytes, which reflect shorter-term zinc status.36,38 Obtaining a sample is noninvasive, and analyzing hair for zinc and other trace elements is relatively easy.

It is important to note that trace elements in hair can come from endogenous sources (those that are ingested or inhaled by the subject and then enter the hair through the hair follicle) and exogenous sources (contamination from trace elements in dust, water, cosmetics, and so on).39 A major drawback in using hair as an indicator of trace element status is its susceptibility to contamination from these exogenous sources. Some exogenous contaminants can be removed by carefully washing the hair sample before analysis, and several standardized washing procedures have been suggested.39 However, some contaminants may be difficult or impossible to remove. Selenium, an ingredient in some antidandruff shampoos, is known to increase the selenium content of hair and cannot be removed by the recommended washing procedures.39

A variety of other nonnutritional factors may affect the trace element content of hair. Included among these are certain diseases, rate of hair growth, hair color, sex, pregnancy, and age.39 It has been reported, for example, that higher concentrations of zinc, iron, nickel, and copper can be found in red hair, compared with brown hair, and that iron and manganese are found in higher concentrations in brown hair than in blonde hair.36,38,39 These factors limit the usefulness of hair as an index of zinc and other trace element status.

Urinary Zinc

Lower than expected concentrations of zinc have been reported in the urine of zinc-depleted persons.35,36 However, factors other than nutritional status can influence urinary zinc levels, such as liver cirrhosis, viral hepatitis, sickle cell anemia, surgery, and total parenteral nutrition. Problems associated with obtaining 24-hour urine collections can also complicate use of this indicator. Consequently, urine measurements of zinc are not the preferred approach to assessing zinc status.
Iodine Status

Iodine is a trace element essential for the synthesis of the thyroid hormones that regulate metabolic processes related to normal growth and development in humans and animals.24,40 Inadequate intake of iodine leads to insufficient production of thyroid hormones, resulting in a variety of adverse effects collectively referred to as iodine-deficiency disorders. In humans these include mental retardation, hypothyroidism, cretinism, goiter, and varying degrees of other growth and developmental abnormalities (see Box 9.2). Thyroid hormones are particularly important for central nervous system development during uterine development and the first two years of life, making this the most critical period for adequate iodine intake. Deficiency during pregnancy can lead to spontaneous abortion, stillbirth, congenital anomalies in offspring, and increased perinatal and infant mortality. Severe iodine deficiency in utero can result in a condition known as cretinism, which is characterized by gross mental retardation, short stature, deaf-mutism, and spasticity.41,42 According to the World Health Organization, iodine deficiency is the leading cause of mental retardation in the world. Studies of moderately to severely iodine-deficient populations indicate that chronic iodine deficiency can reduce the intelligence quotient (IQ) by 12.5 to 13.5 points.24,41 The provision of an iodine supplement to moderately iodine-deficient children has been shown to improve cognitive functioning, suggesting that in some instances cognitive impairment can be at least partly reversed by iodine repletion.41

Box 9.2 Manifestations of Iodine Deficiency Disorder Throughout the Life Span

Fetal Period: spontaneous abortion; stillbirth; congenital anomalies in offspring; increased perinatal mortality
Neonatal Period: increased infant mortality; cretinism
Childhood and Adolescence: impaired mental function; cretinism; delayed physical development
Adulthood: impaired mental function; decreased ability to learn; apathy; reduced work productivity
All Ages: goiter; hypothyroidism; increased susceptibility of the thyroid gland to nuclear radiation

Source: Zimmermann MB, Jooste PL, Pandav CS. 2008. Iodine-deficiency disorders. Lancet 372:1251--1262; and Zimmermann MB. 2009. Iodine deficiency. Endocrine Reviews 30:376--408.

The classic sign of iodine deficiency is thyroid gland enlargement, or goiter, which can occur at any age, including infancy. Goiter is the earliest and most obvious manifestation of iodine deficiency but certainly not its most devastating consequence.40,41 When iodine intake is insufficient, the anterior pituitary gland increases its secretion of thyroid-stimulating hormone (TSH) in an effort to maximize the thyroid gland's uptake of available iodine, which stimulates thyroid hypertrophy and hyperplasia. Initially goiters are a small, diffuse, and uniform enlargement of the thyroid gland, increasing in size as the iodine deficiency continues. Over time they can develop nodules or lumps and become massive enough to compress neighboring structures, such as the trachea, esophagus, and nerves, as shown in Figure 9.5. In some instances, surgical removal is required.40--42 In younger patients and those with relatively small, diffuse, and soft goiters, iodine repletion can result in a variable degree of shrinkage of the goiter.

Figure 9.5 Goiter in the neck of an adolescent.
However, in older patients and those with nodular or fibrotic goiters, fewer than one-third experience any significant shrinkage of the goiter.43 Paradoxically, excessive iodine intake can cause goiter, as well as hyperthyroidism and hypothyroidism, and increases the risk of thyroid cancer.20

Assessing Iodine Status
Because more than 90% of dietary iodine is eventually excreted in the urine, urinary iodine (UI) is the most widely used indicator of recent iodine intake and nutrition status.24,41,42,44 For individuals, a 24-hour urine collection is necessary to estimate iodine intake using UI concentration. Because 24-hour urine collections are impractical in field studies involving large numbers of subjects, spot urine collections from a representative sample of the target population can be used to calculate median UI in nanograms of iodine per milliliter of urine (ng/mL). Variations in UI concentrations due to differences in hydration among the subjects and the day-to-day variation in iodine intake by individuals will be evened out when a sufficiently large number of spot urine samples are collected from each group or subgroup. The categories for median UI concentrations developed by the World Health Organization (WHO) for assessing iodine status in school-age children and adults (excluding pregnant and lactating women) are shown in Table 9.8. The WHO's categories for median UI concentrations for assessing iodine status in pregnant and lactating women and children less than 2 years of age are shown in Table 9.9. Because of hydration status and day-to-day variation in iodine intake, UI concentrations of spot urine samples are not a reliable indicator of an individual's iodine status. Therefore, it would be a mistake to assume that an individual is iodine deficient based on a UI concentration < 100 ng/mL in a single spot urine sample collected from that individual.24,45

In addition to UI concentration, other methods for assessing iodine status include serum thyroglobulin, the goiter rate, and serum concentration of thyroid-stimulating hormone (TSH). While UI concentration is a sensitive indicator of recent iodine intake (days), serum thyroglobulin reflects iodine status over an intermediate period of time spanning weeks to months, and the goiter rate reflects long-term iodine status spanning months to years. TSH is a sensitive indicator of iodine status during the neonatal period. Thyroid hormone concentrations are considered unreliable indicators of iodine status.41
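As an illustration of how median UI is summarized for a group, the sketch below computes the median of spot-sample UI values and assigns a category. The cutpoints used are the widely published WHO criteria for school-age children and nonpregnant, nonlactating adults; treat them as an assumption here, and consult Table 9.8 for the authoritative values.

```python
import statistics

# Widely published WHO cutpoints (ng/mL, numerically equal to ug/L) for
# school-age children and nonpregnant, nonlactating adults; assumed here.
WHO_CATEGORIES = [
    (20,  "insufficient (severe iodine deficiency)"),
    (50,  "insufficient (moderate iodine deficiency)"),
    (100, "insufficient (mild iodine deficiency)"),
    (200, "adequate"),
    (300, "above requirements"),
]


def classify_median_ui(spot_ui_ng_per_ml):
    """Summarize a group's iodine status from spot urinary iodine values.

    The median applies to the group, not to any one person; a single spot
    UI value should not be used to label an individual as deficient.
    """
    median_ui = statistics.median(spot_ui_ng_per_ml)
    for upper_bound, label in WHO_CATEGORIES:
        if median_ui < upper_bound:
            return median_ui, label
    return median_ui, "excessive"


# Hypothetical spot samples from a school-based survey
ui_values = [62, 95, 140, 88, 120, 75, 180, 55, 110, 99]
median_ui, category = classify_median_ui(ui_values)
print(f"Median UI = {median_ui} ng/mL -> {category}")
```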
Iodine Status in the United States
In the early part of the 20th century, prior to the iodization of table salt, iodine-deficiency diseases, particularly goiter, were common and a significant problem.

Table 9.8 World Health Organization Criteria for Assessing Iodine Nutrition Based on Median Urinary Iodine Concentrations of School-Age Children (≥ 6 Years of Age) and Adults

Fewer than 20% of women of childbearing age consume supplements containing iodine.24,50 Mandatory iodization of all table salt sold in the United States would also help ensure adequate iodine intakes in women of childbearing age. Iodization of table salt in the United States is voluntary, and iodized salt is chosen by only 50% to 60% of the U.S. population. In addition, approximately 75% of all sodium consumed by Americans is added in the processing and manufacturing of food, little of which is iodized. Consequently, efforts to reduce the amount of sodium in processed foods will have little, if any, adverse impact on iodine intakes in the United States. In fact, if the amount of sodium added during the manufacturing and processing of foods were reduced as some groups recommend, Americans would increase their use of iodized table salt during cooking and at the table, thus ensuring that more Americans have adequate iodine intakes. Efforts to prevent iodine deficiency and efforts to reduce salt consumption to prevent chronic disease do not conflict. As long as all salt consumed is iodized, adequate iodine intake is certain, even at levels of salt intake currently recommended by the Dietary Guidelines for Americans for persons 51 years of age and older and those of any age who are African American or have hypertension, diabetes, or chronic kidney disease.48,51,52

Vitamin A Status
Vitamin A status can be grouped into five categories: deficient, marginal, adequate, excessive, and toxic. In the deficient and toxic states, clinical signs are evident, while biochemical or static tests of vitamin A status must be relied on in the marginal, adequate, and excessive states.53 Biochemical assessment of vitamin A status generally involves measurements of plasma concentrations of retinol or the vitamin A carrier protein, retinol-binding protein (RBP), retinol isotope dilution, and dose-response tests. Used less frequently are examination of epithelial cells of the conjunctiva, assessment of dark adaptation, and liver biopsy.1,16,53,54 These approaches are best used when assessing the vitamin A status of populations. The approaches for diagnosing vitamin A deficiency in an individual are limited to retinol isotope dilution, assessment of dark adaptation, and liver biopsy.54 Vitamin A levels in breast milk can also be used as an index of vitamin A status in lactating women and in detecting response to maternal supplementation.55

Plasma Levels
Measurement of plasma concentration of retinol is the most common biochemical measure of vitamin A in a population group.1,16,53,54 Under normal conditions, about 95% of plasma vitamin A is in the form of retinol bound to retinol-binding protein, and about 5% is unbound and in the form of retinyl esters.53,54 Serum measurements are predictive of vitamin A status only when the body's reserves are either critically depleted or overfilled. Because plasma concentration of retinol may be within the expected range despite low vitamin A concentrations within the liver, test results should be interpreted with caution.54 However, data from plasma measurements can be of some value in drawing conclusions about the relationship of plasma measurements to clinical signs of deficiency, dietary intake data, and various socioeconomic factors. Plasma concentrations < 10 μg/dL (0.35 μmol/L) are considered severely deficient, and values < 20 μg/dL (0.70 μmol/L) are considered deficient.20 Serum values > 20 μg/dL (> 0.70 μmol/L) are indicative of adequate status, while serum levels > 300 μg/dL are diagnostic of chronic hypervitaminosis A.20
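A brief sketch of how these serum retinol cutoffs might be applied follows. The unit conversion uses the standard molar mass of retinol (about 286.45 g/mol), which is an assumption not stated in the text, and the function names are illustrative.

```python
RETINOL_UG_PER_DL_PER_UMOL_L = 286.45 / 10.0  # 1 umol/L of retinol ~ 28.6 ug/dL


def retinol_umol_l_to_ug_dl(conc_umol_l: float) -> float:
    """Convert serum retinol from umol/L to ug/dL (molar mass ~286.45 g/mol)."""
    return conc_umol_l * RETINOL_UG_PER_DL_PER_UMOL_L


def classify_serum_retinol(conc_ug_dl: float) -> str:
    """Apply the serum retinol cutoffs given in the text (ug/dL)."""
    if conc_ug_dl < 10:
        return "severely deficient"
    if conc_ug_dl < 20:
        return "deficient"
    if conc_ug_dl > 300:
        return "chronic hypervitaminosis A"
    return "adequate"


# Example: a laboratory report of 0.62 umol/L
ug_dl = retinol_umol_l_to_ug_dl(0.62)
print(f"{ug_dl:.1f} ug/dL -> {classify_serum_retinol(ug_dl)}")
```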
Using these criteria, data from NHANES indicate that the likelihood of being vitamin A deficient is very low throughout the U.S. population and that the likelihood of vitamin A excess is also very low, although it increases with increasing age. For the past two decades, more than 95% of the U.S. population have had adequate serum concentrations of vitamin A.20 The plasma concentration of retinol-binding protein (RBP) can be used as a surrogate measure of plasma retinol, but it is affected by conditions other than vitamin A status. It is likely to be low during conditions of malnutrition and inflammation.53

Relative Dose Response
The relative dose-response test (RDR) and modified relative dose-response test (MRDR) are based on the principle that, when stores of retinol are high, plasma retinol concentration is little affected by oral administration of vitamin A. But when reserves are low, the plasma retinol concentration increases markedly, reaching a peak five hours after an oral dose. As hepatic vitamin A stores become depleted, retinol-binding protein (RBP) accumulates in the liver in an unbound state known as apo-RBP. When vitamin A is given to a subject whose stores are depleted, the vitamin A is absorbed from the intestinal tract; is taken up by the liver, where it binds to the apo-RBP; and then is released from the liver in the form of holo-RBP (the complex of RBP and vitamin A). In the RDR, a fasting blood sample is taken, followed by oral administration of vitamin A as retinyl palmitate. Another blood sample is drawn five hours later. Comparison of the fasting and postdosing holo-RBP measurements represents the extent of apo-RBP accumulation, which is directly related to the shortage of vitamin A.1,16,53,54 The RDR is calculated using the following formula:1

RDR (%) = [(vit A5 − vit A0) / vit A5] × 100

where vit A5 = serum vitamin A level five hours after receiving the dose of vitamin A and vit A0 = fasting serum vitamin A level. An RDR > 50% is considered indicative of acute deficiency, values between 20% and 50% indicate marginal status, and values < 20% suggest adequate intake.1 Limitations of the RDR include the five-hour waiting period and the need to draw two blood samples.1

The MRDR is based on the same principle but uses only one blood sample, drawn five hours after administration of the test dose of dehydroretinol, a naturally occurring form of vitamin A but one rarely present in most diets.1 The measured response is the molar ratio of dehydroretinol to retinol in the serum sample. A ratio > 0.06 indicates marginal or poorer vitamin A status. A ratio < 0.03 indicates adequate vitamin A status.1 The assay is limited by the fact that there currently is no commercial source of dehydroretinol, and the assay requires the use of high-pressure liquid chromatography to distinguish between the two forms of vitamin A. The assay is still under development but does have the advantage of requiring only one blood sample.1
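A minimal sketch of the RDR and MRDR calculations, using the formula and cutoffs given above, follows; the function names and the example concentrations are assumptions for illustration only.

```python
def relative_dose_response(vit_a0: float, vit_a5: float) -> float:
    """RDR (%) = (vit A5 - vit A0) / vit A5 * 100, per the formula above.

    vit_a0: fasting serum vitamin A; vit_a5: serum vitamin A five hours
    after the oral dose (both in the same units, e.g., ug/dL).
    """
    return (vit_a5 - vit_a0) / vit_a5 * 100.0


def interpret_rdr(rdr_percent: float) -> str:
    if rdr_percent > 50:
        return "acute vitamin A deficiency"
    if rdr_percent >= 20:
        return "marginal vitamin A status"
    return "adequate vitamin A intake"


def interpret_mrdr(dehydroretinol_to_retinol_ratio: float) -> str:
    """Molar ratio of dehydroretinol to retinol in the single 5-hour sample."""
    if dehydroretinol_to_retinol_ratio > 0.06:
        return "marginal or poorer vitamin A status"
    if dehydroretinol_to_retinol_ratio < 0.03:
        return "adequate vitamin A status"
    return "indeterminate"


# Example: fasting 18 ug/dL, five-hour 26 ug/dL
rdr = relative_dose_response(vit_a0=18.0, vit_a5=26.0)
print(f"RDR = {rdr:.0f}% -> {interpret_rdr(rdr)}")   # about 31% -> marginal
print(interpret_mrdr(0.021))                          # adequate
```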
Conjunctival Impression Cytology
Vitamin A deficiency can result in morphologic changes in epithelial cells covering the body and lining its cavities. It can result in a decline in the number of mucus-producing goblet cells in the epithelium of the conjunctiva of the eye. The epithelial cells also may take on a more squamous appearance: flatter cells, smaller nuclei, and cytoplasm making up a greater proportion of the total cell. The conjunctival impression cytology test involves the microscopic examination of the conjunctival epithelial cells to determine morphologic changes indicative of vitamin A deficiency.1,16 A minute sample of epithelial cells can be obtained by touching a strip of cellulose ester filter paper to the outer portion of the conjunctiva for three to five seconds and then gently removing it. The filter paper with the adherent epithelial cells is placed in a fixative solution, where it can be stored until being stained and examined by ordinary light microscopy.1
The test is limited by several factors and is not widely used.1 It is difficult to get tissue samples from children under 3 years of age, and the cytologists must follow standardized criteria in evaluating samples. The sensitivity of the test is limited by conjunctival and systemic infections and possibly by severe malnutrition.1,16

Dark Adaptation
The best-defined function of vitamin A is its role in the visual process. The visual pigment rhodopsin is generated when the protein opsin in the rods of the retina combines with a cis-isomer of retinol. When light strikes the eye, rhodopsin is split into opsin and a trans-isomer of retinol, generating the visual-response signal. The trans-isomer is then converted back to the cis-isomer, which then combines with opsin to re-form rhodopsin. During this process, some of the retinol isomer is lost and must be replaced by vitamin A present in the retina. Under normal conditions, sufficient retinol is present, and rhodopsin is readily formed. When vitamin A is in short supply, less rhodopsin is formed, and the eye fails to adapt as readily to low light levels after exposure to bright light levels.1,16 Tests are available to directly measure the level of rhodopsin and its rate of regeneration. Field tests measuring visual acuity in dim light after exposure to bright light also can be used.1 However, the relative dose response, when available, is a more specific and objective test of vitamin A status and is preferred over functional tests of dark adaptation.

Direct Measurement of Liver Stores
Direct measurement of hepatic vitamin A stores in liver tissue is considered the "gold standard" of vitamin A status, but its invasive nature limits its usefulness.53 In many countries, the median vitamin A concentration in liver tissue of well-nourished persons is approximately 100 μg of retinol/g of liver tissue. A concentration > 20 μg of retinol/g of liver tissue is considered adequate for both children and adults of all ages.53 Concentrations < 5 μg of retinol/g of liver tissue are associated with vitamin A deficiency. The assay can be done on a very small amount of liver tissue obtained by inserting a biopsy needle through the abdominal wall. Because of the invasiveness of the biopsy procedure, assaying liver tissue for vitamin A is limited to situations when a liver biopsy is necessary for diagnostic purposes or when liver tissue can be obtained from postmortem examinations.16,53

Retinol Isotope Dilution
The best way to represent vitamin A status is in terms of total body stores of vitamin A. Approximately 90% of the body's vitamin A stores are in the liver, but direct measurement of vitamin A concentration in the liver is not practical in most instances because a liver biopsy is a highly invasive surgical procedure. A noninvasive, indirect approach to measuring total body stores of vitamin A is retinol isotope dilution, which is considered the best approach currently available for assessing vitamin A status with the least risk to subjects.1,16,56 The procedure involves administering to a subject a known amount of vitamin A that is labeled with a nonradioactive isotope. This is ingested with an adequate amount of fat to ensure suitable absorption. After a period of approximately two to three weeks, the isotopically labeled vitamin A mixes with the body's existing total pool of unlabeled vitamin A. A sample of blood is then drawn from the subject, and the plasma concentrations of the labeled and unlabeled vitamin A are measured. Using a mathematical formula, the ratio of the labeled vitamin A to the unlabeled vitamin A is calculated. This ratio is used to estimate the size of the subject's total vitamin A pool.56,57 Limitations of this method include the high cost of the isotopically labeled vitamin A and the sophisticated laboratory techniques needed to measure the plasma concentrations of the labeled and unlabeled vitamin A. However, it is the only noninvasive method currently available for assessing the full range of vitamin A status, from deficient to toxic (hypervitaminosis A).1,56,57
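To make the dilution principle concrete, the sketch below estimates total body vitamin A from the plasma ratio of labeled to unlabeled retinol under highly simplified assumptions: the retained fraction of the dose is assumed to mix completely with a single body pool, and the retention factor is a placeholder value. Published retinol isotope dilution protocols use more elaborate prediction equations with additional correction factors, so this is only an illustration of the principle, not an implementation of any validated equation.

```python
def estimate_total_body_pool_umol(dose_umol: float,
                                  labeled_to_unlabeled_ratio: float,
                                  fraction_dose_retained: float = 0.5) -> float:
    """Idealized isotope-dilution estimate of the body's vitamin A pool.

    Assumes the retained portion of the labeled dose mixes completely with
    the unlabeled body pool, so that the plasma labeled:unlabeled ratio
    approximates (retained dose) / (body pool). Real retinol isotope
    dilution equations include additional correction factors.
    """
    retained_dose = dose_umol * fraction_dose_retained
    return retained_dose / labeled_to_unlabeled_ratio


# Example: 17.5 umol labeled dose, plasma labeled:unlabeled ratio of 0.02
pool = estimate_total_body_pool_umol(dose_umol=17.5,
                                     labeled_to_unlabeled_ratio=0.02)
print(f"Estimated total body vitamin A pool ~ {pool:.0f} umol")
```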
Vitamin D Status
The major physiological function of vitamin D in vertebrates, including humans, is to maintain serum calcium and phosphorus concentrations in a range that supports bone mineralization, neuromuscular function, and various cellular processes. Vitamin D increases serum concentrations of calcium and phosphorus by promoting their absorption by the small intestine, promoting their reabsorption by the kidney, and stimulating bone resorption with the release of calcium and phosphorus from bone.58 The role of vitamin D in maintaining skeletal health is well established.30 Deficiency of vitamin D causes growth retardation and rickets in children. In adults, vitamin D deficiency causes osteomalacia and precipitates and exacerbates bone demineralization, leading to osteopenia, osteoporosis, and increased fracture risk.58 What is less certain and a matter of debate is whether vitamin D exposure at levels sufficient to prevent clinically evident bone disease is optimal in terms of reducing risk of a variety of other conditions not traditionally linked to vitamin D. Research suggests that increasing vitamin D exposure decreases risk of certain types of cancer, type 1 and type 2 diabetes, respiratory tract infections, and influenza and asthma in children.59--61 In general, the available data support the view that adequate vitamin D supplementation and sensible sunlight exposure to achieve optimal vitamin D status are important in the prevention of cardiovascular disease and other chronic diseases.58

Vitamin D (calciferol) is a fat-soluble sterol found in a limited number of foods, including fish-liver oils, fatty fishes, mushrooms, egg yolks, and liver.20 The two major forms of vitamin D are cholecalciferol (D3) and ergocalciferol (D2). Vitamin D3 is synthesized in the skin of vertebrates by the action of ultraviolet B (UVB) radiation on 7-dehydrocholesterol present in the skin. Vitamin D2 is produced by UV irradiation of ergosterol, a sterol occurring in molds, yeast, mushrooms, and higher-order plants. Vitamin D in humans comes from two sources: that present in the foods we eat and that synthesized in the body during sun exposure. If sun exposure is sufficient, dietary intake is not necessary. However, cutaneous synthesis of vitamin D3 is decreased by such factors as living at a higher latitude, older age, darker skin pigmentation, less skin exposed to sunlight, and sunscreen use.
Major dietary sources of vitamin D in the United States are foods fortified with vitamin D, including milk products, some citrus juices, and ready-to-eat breakfast cereals.20 Practically all fortified foods and supplements in the United States use vitamin D3 instead of vitamin D2.62

Assessing Vitamin D Status
Vitamin D status is best assessed by measuring the serum concentration of 25-hydroxyvitamin D [25(OH)D], the major circulating form of the vitamin, which reflects total vitamin D exposure from food, supplements, and synthesis.20,30 Although serum concentration of 25(OH)D is not considered a validated health outcome indicator, it is the measure preferred by clinicians to assess total vitamin D exposure, and it allows comparisons to be made between exposure and health outcomes. What serum concentrations of 25(OH)D represent an optimal vitamin D exposure is a matter of some disagreement. There has been no systematic, evidence-based development process to establish 25(OH)D cutpoints for defining what is indicative of vitamin D deficiency or adequacy.20,30,62 Based on its review of the available data, the Institute of Medicine's Committee to Review Dietary Reference Intakes for Vitamin D and Calcium concluded that persons are at risk of deficiency relative to bone health when the serum concentration of 25(OH)D is < 30 nmol/L (nanomoles per liter). The Committee concluded that some, but not all, persons are potentially at risk of inadequate vitamin D exposure when their serum 25(OH)D is between 30 and 50 nmol/L. Practically all persons have a sufficient exposure with serum 25(OH)D between 50 and 75 nmol/L. The Committee concluded that serum concentrations between 75 and 125 nmol/L were not consistently associated with increased benefits and that there may be reason for concern at concentrations > 125 nmol/L.30 The Committee concluded that a serum 25(OH)D of 40 nmol/L represented a vitamin D exposure that met the average requirement and was consistent with the Estimated Average Requirement (EAR). A serum 25(OH)D concentration of approximately 50 nmol/L represented a vitamin D exposure that was associated with benefit for nearly all the population and was consistent with the Recommended Dietary Allowance (RDA).30
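The IOM cutpoints above lend themselves to a simple classification. The sketch below also converts a 25(OH)D value reported in ng/mL to nmol/L using the conventional factor of roughly 2.5, which is an assumption not stated in this passage; the function names are illustrative.

```python
NMOL_L_PER_NG_ML = 2.496  # conventional conversion factor for 25(OH)D


def ng_ml_to_nmol_l(conc_ng_ml: float) -> float:
    """Convert a 25(OH)D concentration from ng/mL to nmol/L."""
    return conc_ng_ml * NMOL_L_PER_NG_ML


def classify_25ohd(conc_nmol_l: float) -> str:
    """Interpret serum 25(OH)D against the IOM cutpoints described above."""
    if conc_nmol_l < 30:
        return "at risk of deficiency (relative to bone health)"
    if conc_nmol_l < 50:
        return "some persons potentially at risk of inadequacy"
    if conc_nmol_l < 75:
        return "sufficient for practically all persons"
    if conc_nmol_l <= 125:
        return "no consistent additional benefit (75-125 nmol/L)"
    return "possible reason for concern (> 125 nmol/L)"


# Example: a laboratory report of 18 ng/mL
nmol = ng_ml_to_nmol_l(18.0)
print(f"{nmol:.0f} nmol/L -> {classify_25ohd(nmol)}")  # about 45 nmol/L
```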
Beginning with NHANES III in 1988, the vitamin D status of the U.S. population has been monitored. NHANES collects survey information and biological samples during the summer from people living at higher latitudes and during the winter from those living at lower latitudes. Serum 25(OH)D concentration data from NHANES are shown in Figure 9.7. When 25(OH)D was expressed in LC-MS/MS equivalents (the most accurate method of measurement), the vitamin D status of the U.S. population 12 years of age and older showed modest increases in 2009--2010. The increase in 25(OH)D corresponded in time with an increase in the use of supplements containing higher amounts of vitamin D. Marked race-ethnic differences in 25(OH)D concentrations, however, are apparent, with the lowest levels found in non-Hispanic Blacks.62 NHANES data indicate that 46% of non-Hispanic Blacks have serum 25(OH)D concentrations below 40 nmol/L, in comparison to 6.6% of non-Hispanic Whites (Figure 9.8).62

Figure 9.7 LC-MS/MS-equivalent serum 25(OH)D concentrations for persons aged 12 years and older, stratified by NHANES cycle and grouped by demographic variables. The vitamin D status of the U.S. population showed modest increases from 1988--1994 to 2009--2010, corresponding in time with an increase in the use of supplements containing higher amounts of vitamin D. Source: Graph developed using data from Schleicher RL, Sternberg MR, Lacher DA, Sempos CT, Looker AC, Durazo-Arvizu RA, Yetley EA, Chaudhary-Webb M, Maw KL, Pfeiffer CM, Johnson CL. 2016. The vitamin D status of the US population from 1988 to 2010 using standardized serum concentrations of 25-hydroxyvitamin D shows recent modest increases. American Journal of Clinical Nutrition 104:454--461.

Figure 9.8 Prevalence of LC-MS/MS-equivalent serum 25(OH)D concentrations below 40 nmol/L for persons aged 12 years and older, grouped by demographic variables or vitamin D supplement use: NHANES 2009--2010. Forty nmol/L is the concentration consistent with an intake equivalent to the Estimated Average Requirement, which is useful for evaluating the possible adequacy of nutrient intakes of population groups. Vitamin D supplement use was defined as the use of any vitamin D-containing supplements in the month preceding the household interview. Source: Graph developed using data from Schleicher RL, Sternberg MR, Lacher DA, Sempos CT, Looker AC, Durazo-Arvizu RA, Yetley EA, Chaudhary-Webb M, Maw KL, Pfeiffer CM, Johnson CL. 2016. The vitamin D status of the US population from 1988 to 2010 using standardized serum concentrations of 25-hydroxyvitamin D shows recent modest increases. American Journal of Clinical Nutrition 104:454--461.

Vitamin C Status
Vitamin C is a generic term for compounds exhibiting the biological activity of ascorbic acid, the reduced form of vitamin C. The oxidized form of vitamin C is known as dehydroascorbic acid. The sum of ascorbic acid and dehydroascorbic acid constitutes all the naturally occurring biologically active vitamin C.63,64 When used in this chapter, the term vitamin C refers to total vitamin C, the sum of ascorbic acid and dehydroascorbic acid.

Vitamin C is necessary for the formation of collagen; the maintenance of capillaries, bone, and teeth; the promotion of iron absorption; and the protection of vitamins and minerals from oxidation. Evidence from several large cohort studies suggests a protective effect against certain cancers and CHD, but these observations are not supported by most randomized clinical trials.65 Deficiency of vitamin C results in scurvy, a condition characterized by weakness, hemorrhages in the skin and gums, and defects in bone development in children.

Vitamin C status is assessed by measuring total ascorbic acid in serum (or plasma) and in leukocytes (white blood cells). The serum concentration of ascorbic acid is considered an index of the circulating vitamin available to the tissues. The leukocyte concentration is considered an index of tissue stores.20 Vitamin C deficiency is generally defined as a serum (or plasma) concentration < 11.4 micromoles per liter (μmol/L), the level at which signs and symptoms of scurvy begin to appear. A low serum (or plasma) concentration is 11.4 to 23.0 μmol/L.20
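A short helper applying the serum vitamin C cutoffs just given; the function name and the example value are illustrative assumptions.

```python
def classify_serum_vitamin_c(conc_umol_l: float) -> str:
    """Interpret serum (or plasma) total vitamin C using the cutoffs above."""
    if conc_umol_l < 11.4:
        return "deficient (range in which signs of scurvy may appear)"
    if conc_umol_l <= 23.0:
        return "low"
    return "normal"


# Example: a plasma vitamin C result of 15 umol/L
print(classify_serum_vitamin_c(15.0))  # -> low
```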
Data collected by NHANES in 2003--2006 show that the prevalence of low serum concentrations among all individuals in the United States 6 years of age and older is 6%. As shown in Figure 9.9, the prevalence of vitamin C deficiency (serum concentration < 11.4 μmol/L) and of low serum vitamin C (serum concentrations 11.4--23.0 μmol/L) is lowest in children and adolescents. As shown in Figure 9.10, U.S. females are at lower risk of vitamin C deficiency than U.S. males, and non-Hispanic Whites have a greater prevalence of low vitamin C than non-Hispanic Blacks and Mexican Americans.20

Figure 9.9 Prevalence estimates of vitamin C deficiency (serum concentrations < 11.4 μmol/L) and of low vitamin C concentrations (11.4--23.0 μmol/L) in the U.S. population aged 6 years and older, by age group, NHANES 2003--2006. Source: U.S. Department of Health and Human Services. 2012. Second national report on biochemical indicators of diet and nutrition in the U.S. population. National Center for Environmental Health, Centers for Disease Control and Prevention. www.cdc.gov/nutritionreport.

Figure 9.10 Age-adjusted prevalence estimates of vitamin C deficiency (serum concentrations < 11.4 μmol/L) in the U.S. population, by sex and race/ethnicity, NHANES 2003--2006.

Serum and Leukocyte Vitamin C
Measurement of serum (or plasma) vitamin C is the most commonly used biochemical procedure for assessing vitamin C status.20,64,66,67 However, in recent years, there has been increasing interest in using the level of vitamin C in polymorphonuclear leukocytes (the granular leukocytes: neutrophils, eosinophils, and basophils) and the mononuclear leukocytes (the agranular leukocytes: lymphocytes and monocytes) as indicators of vitamin C status.63,64 Serum levels of ascorbic acid have been shown to correlate with dietary vitamin C intake. Smokers may need to consume more than 200 mg of vitamin C daily to achieve serum ascorbic acid levels typically seen in nonsmokers meeting the RDA, currently set at 90 mg/d for adult males and 75 mg/d for adult females.67 After correcting for vitamin C intakes, females consistently show higher vitamin C levels in tissues and fluids than males.64,71,75 Age does not appear to influence vitamin C levels in adults.64,69

Vitamin B6 Status
The vitamin B6 group is composed of three naturally occurring compounds related chemically, metabolically, and functionally: pyridoxine (PN), pyridoxal (PL), and pyridoxamine (PM). Within the liver, erythrocytes, and other tissues of the body, these forms are phosphorylated into pyridoxal 5′-phosphate (PLP) and pyridoxamine phosphate (PMP). PLP and PMP primarily serve as coenzymes in a large variety of reactions.68,69 Especially important among these are the transamination reactions in protein metabolism. PLP also is involved in other metabolic transformations of amino acids and in the metabolism of carbohydrates, lipids, and nucleic acids.68,69 Because of its role in protein metabolism, the requirement for vitamin B6 is directly proportional to protein intake.70 Although frank vitamin B6 deficiency resulting in clinical manifestations is not considered widespread in the general U.S. population, there is evidence of impaired status among certain groups, most notably the elderly and alcoholic individuals.
There is also concern about excessive vitamin B6 intake and the possibility of resulting peripheral nervous system damage.70 Vitamin B6 status can be assessed by several methods. Static measurements can be made of vitamin B6 concentrations in blood or urine, and functional tests can measure the activity of several enzymes dependent on vitamin B6.68,71,72

Plasma and Erythrocyte Pyridoxal 5′-Phosphate
The most frequently used biochemical indicator of vitamin B6 status is plasma PLP.68,71,72 PLP accounts for approximately 70% to 90% of the total vitamin B6 present in plasma.68,72 PL is the next most abundant form in plasma, followed by lower levels of PN and PM.71 Fasting measurements of plasma PLP are considered the single most informative indicator of vitamin B6 status for healthy persons. Use of this single measure is limited by the fact that abnormally low concentrations of plasma PLP may result from asthma, coronary heart disease, and pregnancy and may not reflect a true vitamin B6 deficiency.71 Dietary intake of vitamin B6 and protein can affect plasma PLP concentrations as well. As shown in Table 9.10, increases in dietary vitamin B6 intake raise plasma PLP, and plasma levels fall in response to increased protein consumption.70,72 Thus, although plasma PLP is a valuable measure, the assessment of vitamin B6 status is best accomplished by using several indicators in conjunction with each other, for example, measures of other vitamin B6 forms and/or functional tests.70--72 Table 9.11 lists expected biochemical values for adults with adequate vitamin B6 status for various measures of the vitamin.

Table 9.10 Factors Affecting Plasma PLP* Concentrations
Increased vitamin B6 intake: Increases
Increased protein intake: Decreases
Increased glucose: Decreases (a)†
Increased plasma volume: Decreases
Increased physical activity: Increases (a)
Decreased uptake into nonhepatic tissues: Increases
Increased age: Decreases
Source: Leklem JE. 1990. Vitamin B6: A status report. Journal of Nutrition 120:1503--1507.
*PLP = pyridoxal 5′-phosphate.
†(a) indicates that the effect is an acute effect.

Measurement of PLP in erythrocytes has been suggested as another approach to assessing vitamin B6 status. Certain characteristics of the erythrocyte may make it unrepresentative of other body tissues. The ability of hemoglobin to bind tightly to PLP and PL, along with the relatively long life of red blood cells (about 120 days), may make red blood cells a significant reservoir for vitamin B6 and may complicate the use of erythrocyte PLP levels as a useful indicator of vitamin B6 status.70--72

Plasma Pyridoxal
Measurement of plasma PL has been suggested as an additional indicator of vitamin B6 status to use with plasma PLP. PL is the major dietary form of the vitamin, crosses all membranes on absorption from the gastrointestinal tract, and comprises about 8% to 30% of the total plasma vitamin B6. There are questions about how well plasma PL represents vitamin B6 status, and further research is needed on this indicator. Despite these questions, plasma PL is recommended in the assessment of vitamin B6 status.72

Urinary 4-Pyridoxic Acid
4-Pyridoxic acid (4-PA) is the major urinary metabolite of vitamin B6.
Urinary excretion of 4-PA has been shown to change rapidly in response to alterations in vitamin B6 intake and to be indicative of immediate dietary intake. Thus, it is considered useful as a short-term index of vitamin B6 status. In studies of subjects whose usual dietary intake of vitamin B6 was known, males had a 4-PA excretion of 3.5 μmol/day, and females had a 4-PA excretion of > 3.2 μmol/day. Urinary excretions of 4-PA of ≥ 3.0 μmol/day appear to be indicative of acceptable vitamin B6 status.1 4-PA is likely to be absent from the urine of persons with a marked vitamin B6 deficiency. The value of this method is limited by the need for a complete 24-hour urine collection.20,68,72

Table 9.11 Indices for Evaluating Vitamin B6 Status and Suggested Values for Adequate Status in Adults*
Direct (Blood)
Plasma pyridoxal 5′-phosphate (PLP): > 30 nmol/L
Plasma pyridoxal: NV†
Plasma total vitamin B6: > 40 nmol/L
Erythrocyte PLP: NV
Direct (Urine)
4-Pyridoxic acid: > 3.0 μmol/day
Indirect (Urine)
3-g methionine load, cystathionine: < 350 μmol/day
Oxalate excretion: NV
Diet Intake
Vitamin B6 intake, weekly average: 1.2--1.5 mg/day
Vitamin B6 : protein ratio: > 0.020
Other
Electroencephalogram pattern: NV
Source: Leklem JE. 1990. Vitamin B6: A status report. Journal of Nutrition 120:1503--1507.
*These values are dependent on sex, age, and, for most, protein intake.
†NV = no value established; limited data are available.

Methionine Load Test
The principle of the methionine load test is similar to that of the tryptophan load test. PLP is required in the metabolism of the amino acid methionine. Compared with persons with adequate vitamin B6 status, those with impaired vitamin B6 status have higher urine levels of the metabolites cystathionine and cysteine sulfonic acid following consumption of 3 g of methionine.72 Use of this test is limited by the required 24-hour urine sample and by factors other than vitamin B6 status that can affect test results (for example, protein intake). Because this test has been used in a limited number of studies, no definitive values for urinary cystathionine are available. The value of < 350 μmol/day given in Table 9.11 is based on three studies.72

Folate Status
Folate, or folacin, is a group of water-soluble compounds with properties and chemical structures similar to those of folic acid, or pteroylglutamic acid. Folate functions as a coenzyme transporting single-carbon groups from one compound to another in amino acid metabolism and nucleic acid synthesis. One of the most significant of folate's functions appears to be purine and pyrimidine synthesis. Folate deficiency can lead to inhibition of DNA synthesis, impaired cell division, and alterations in protein synthesis. It is especially important during periods of rapid cell division and growth, such as occur during pregnancy and infancy.20,70,73,74 Clinical trials have shown that a marginal folate intake during pregnancy increases the risk of an infant being born with a neural tube defect (i.e., spina bifida, encephalocele, and anencephaly) and that folate supplementation reduces the number of pregnancies affected by neural tube defects (NTDs).
This led the U.S. Food and Drug Administration to require the addition of folic acid to enriched cereal products beginning in 1998.20 Since then, the rate of NTDs has declined by 36%.75 The Dietary Guidelines for Americans recommend that women who are capable of becoming pregnant or who are pregnant pay special attention to folic acid. The RDAs for folate are based on the prevention of folate deficiency, not on the prevention of neural tube defects. The RDA for adult women is 400 micrograms (mcg) Dietary Folate Equivalents (DFE) and, for women during pregnancy, 600 mcg DFE daily from all sources. To prevent birth defects, all women capable of becoming pregnant are advised to consume 400 mcg of synthetic folic acid daily from fortified foods and/or supplements. This recommendation is for an intake of synthetic folic acid in addition to the amounts of food folate contained in a healthy eating pattern. All enriched grains are fortified with synthetic folic acid. Sources of food folate include beans and peas, oranges and orange juice, and dark-green leafy vegetables, such as spinach and mustard greens.51

The recommended measurements for assessing folate status are serum folate concentration and red blood cell (RBC) folate concentration.20,73 Serum folate is largely the product of absorbed dietary folate and fluctuates daily. It provides information about recent folate intake and does not necessarily represent tissue stores. Nonnutritional factors that can increase serum folate concentrations include acute renal failure, active liver disease, and hemolysis of red blood cells. Alcohol consumption, cigarette smoking, and oral contraceptive use may lower serum folate levels.70,73 RBC folate concentration is considered the best clinical index of folate stores in the tissues of the body. RBC folate concentration reflects folate status at the time the erythrocyte was synthesized, because only young cells in the bone marrow take up folate.70 Unlike serum folate, RBC folate is less subject to transient fluctuations in dietary intake. It decreases after tissue stores are depleted, because erythrocytes have a 120-day average life span and reflect folate status at the time of their synthesis. It has been shown to correlate with liver folate stores and to reflect total body stores.70,73

The monitoring of serum folate concentration in the U.S. population began with NHANES I (1974--1975). The measurement of RBC folate concentration began with NHANES II (1978--1980). NHANES has continued measuring these biomarkers since then.74
Figure 9.11 shows the age-adjusted mean concentrations of serum folate in the U.S. population aged 4 years and older, by sex or race/ethnicity, from 1988 to 2006.20 Based on the assay method consistently used throughout this time period, a low serum folate concentration is considered < 2 ng/mL. As can be seen in Figure 9.11, serum folate concentrations more than doubled after the FDA required the addition of folic acid to enriched cereal products beginning in 1998.

Figure 9.11 Age-adjusted mean concentrations of serum folate in the U.S. population aged 4 years and older, by sex or race/ethnicity, NHANES 1988--2006. Serum folate concentrations more than doubled after the FDA required the addition of folic acid to enriched cereal products beginning in 1998. Source: U.S. Department of Health and Human Services. 2012. Second national report on biochemical indicators of diet and nutrition in the U.S. population. National Center for Environmental Health, Centers for Disease Control and Prevention. www.cdc.gov/nutritionreport.

Figure 9.12 shows the age-adjusted mean concentrations of RBC folate in the U.S. population aged 4 years and older, by sex or race/ethnicity, from 1988 to 2006.20 Based on the assay method consistently used throughout this time period, a low RBC folate concentration is considered < 95 ng/mL. The cutoff values for defining low values will vary depending on the assay used to measure folate concentrations.20,73,74 The plasma concentration of homocysteine, an amino acid naturally found in the blood, can be elevated by folate deficiency, but plasma homocysteine is not specific for folate deficiency because it is also elevated during deficiency of vitamin B6 and vitamin B12.20

Figure 9.12 Age-adjusted mean concentrations of red blood cell folate in the U.S. population aged 4 years and older, by sex or race/ethnicity, NHANES 1988--2006. Red blood cell folate concentrations increased by approximately 50% after the FDA required the addition of folic acid to enriched cereal products beginning in 1998. Source: U.S. Department of Health and Human Services. 2012. Second national report on biochemical indicators of diet and nutrition in the U.S. population. National Center for Environmental Health, Centers for Disease Control and Prevention. www.cdc.gov/nutritionreport.

Vitamin B12 Status
Vitamin B12, or the cobalamins, is a group of cobalt-containing molecules that can be converted to methylcobalamin or 5′-deoxyadenosylcobalamin, the two coenzyme forms of vitamin B12 that are active in human metabolism.70,76 Vitamin B12 is synthesized by bacteria, fungi, and algae, but not by yeast, plants, and animals. Vitamin B12 synthesized by bacteria accumulates in the tissues of animals, which are then consumed by humans. Thus, animal products serve as the primary dietary source of vitamin B12. Although plants are essentially devoid of vitamin B12 (unless they are contaminated by microorganisms or soil containing vitamin B12), foods such as breakfast cereals, soy beverages, and plant-based meat substitutes are often fortified with vitamin B12.

The diets of most Americans supply more than adequate amounts of vitamin B12. The average vitamin B12 intake is well above the RDA for all sex-age groups, including pregnant and lactating females; however, after age 40 the risk of low vitamin B12 status increases because of diminished vitamin B12 absorption in older persons.20,70,77,78 Vegans, or strict vegetarians (persons eating no animal products), could become vitamin B12 deficient, although this is unlikely because many commercially available foods, such as soy beverages, ready-to-eat breakfast cereals, and plant-based meat substitutes, are fortified with vitamin B12. Despite these facts, vitamin B12 deficiency does occur, although rarely because of a dietary deficiency.
Approximately 95% of the cases of vitamin B12 deficiency seen in the United States are due to inadequate absorption of the vitamin, generally because of pernicious anemia caused by inadequate production of intrinsic factor. Approximately 2% of elderly people in the United States have clinical vitamin B12 deficiency, and 10% to 20% have subclinical deficiency.78 Because most vitamin B12 absorption occurs in the distal ileum, B12 malabsorp
