CHr, a better screening test for infant iron deficiency
A study published in JAMA demonstrated that a blood test detects iron deficiency in infants earlier and more accurately than the commonly used hemoglobin screening test.
Iron deficiency is estimated to affect nearly 10 percent of American children one to two years of age. Early detection and treatment are critical because iron deficiency can impair infant mental development, possibly permanently, even before it progresses to anemia, clinically identified as a low hemoglobin level.
The study, done at Children's Hospital Boston, is the first to compare the test, called CHr, with the standard hemoglobin test as a screen for iron deficiency in infants.
Hemoglobin is the iron-containing, oxygen-carrying molecule in red blood cells; the CHr test measures the hemoglobin content of reticulocytes, or immature red blood cells, whereas the standard hemoglobin test is based on the entire population of red blood cells.
Because reticulocytes are present in the bloodstream for only 24 to 48 hours, as compared with several months for mature red blood cells, measuring the reticulocyte hemoglobin content (i.e., CHr) provides a more timely indication of iron status, the investigators say.
In this study, 200 healthy infants 9 to 12 months of age underwent both tests, as well as a transferrin saturation test, which is the "gold standard" test for iron deficiency but is impractical for routine screening. Using the optimal CHr cutoff value (established as 27.5 picograms), CHr correctly identified 83 percent of the iron-deficient infants, compared with only 26 percent identified by the current screening standard (a hemoglobin level below 11 grams per deciliter).
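To make the reported accuracy figures concrete, the sketch below shows how screening sensitivity, the fraction of truly iron-deficient infants a test flags, would be computed against a gold-standard label. The cutoffs (CHr below 27.5 pg, hemoglobin below 11 g/dL) come from the article; the sample data, counts and function names are invented for illustration and are not the study's data or method.

```python
# Illustrative sketch only: sensitivity of a screening test against a
# gold-standard label. Cutoffs come from the article; the infants listed
# below are hypothetical.

CHR_CUTOFF_PG = 27.5      # CHr below this suggests iron deficiency
HGB_CUTOFF_G_DL = 11.0    # hemoglobin below this suggests anemia

def flags_deficiency_chr(chr_pg: float) -> bool:
    """CHr screen: positive if reticulocyte hemoglobin content is low."""
    return chr_pg < CHR_CUTOFF_PG

def flags_deficiency_hgb(hgb_g_dl: float) -> bool:
    """Standard screen: positive if overall hemoglobin is low."""
    return hgb_g_dl < HGB_CUTOFF_G_DL

def sensitivity(screen_results: list[bool], truly_deficient: list[bool]) -> float:
    """Fraction of gold-standard-positive infants that the screen catches."""
    flagged = [s for s, t in zip(screen_results, truly_deficient) if t]
    return sum(flagged) / len(flagged) if flagged else 0.0

# Hypothetical infants: (CHr in pg, hemoglobin in g/dL, iron-deficient by gold standard)
infants = [(26.0, 11.5, True), (28.5, 10.8, False), (27.0, 11.2, True), (29.0, 12.0, False)]

chr_sens = sensitivity([flags_deficiency_chr(c) for c, _, _ in infants],
                       [d for _, _, d in infants])
hgb_sens = sensitivity([flags_deficiency_hgb(h) for _, h, _ in infants],
                       [d for _, _, d in infants])
print(f"CHr sensitivity: {chr_sens:.0%}, hemoglobin sensitivity: {hgb_sens:.0%}")
```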
" Our findings are important because, while iron deficiency can be readily treated, practitioners haven't had a simple, reliable and practical screening test to detect it early enough. Now they might," said Henry Bernstein, at Children's Hospital Boston and the principal investigator of the study. " This study shows that CHr can be used to detect iron deficiency earlier and more accurately than standard hemoglobin screening. Once confirmed in larger, multicenter studies, these findings could change our preferred screening practices for the early detection of iron deficiency." He added that the CHr test is simple, requires no extra tubes of blood to be drawn and involves no additional cost.
" There is mounting evidence that iron deficiency in infants can cause permanent neurocognitive deficits, even before it has progressed to the point of causing anemia," said lead investigator Christina Ullrich, at Children's Hospital Boston and the Dana Farber Cancer Institute. " The ability of the CHr test to identify more infants, at an earlier stage of iron deficiency, makes it a better choice for screening than the current hemoglobin test."
Iron deficiency is the most common nutritional deficiency in the world. Infants and toddlers are especially susceptible because of their rapid growth, increased demands for iron, and variable dietary intake.
The deficiency progresses in three stages: 1) depletion of the body's iron stores; 2) deficiency, in which hemoglobin synthesis is impaired, resulting in a fall in CHr; and 3) anemia, in which red-blood-cell hemoglobin is below normal for a person's age.
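A minimal sketch of how those three stages might be distinguished from lab values is shown below, assuming the CHr (27.5 pg) and hemoglobin (11 g/dL) cutoffs mentioned in the article; the stores_depleted flag stands in for a separate laboratory measure of iron stores, and the function name is hypothetical. It is an illustration of the staging logic, not a clinical algorithm.

```python
# Minimal sketch of the three-stage progression described above. The CHr
# and hemoglobin cutoffs come from the article; the stores_depleted flag
# represents some measure of body iron stores and is an assumption here.

def iron_status(stores_depleted: bool, chr_pg: float, hgb_g_dl: float) -> str:
    if hgb_g_dl < 11.0:      # stage 3: anemia, hemoglobin below normal for age
        return "iron-deficiency anemia"
    if chr_pg < 27.5:        # stage 2: impaired hemoglobin synthesis, CHr falls
        return "iron deficiency (not yet anemic)"
    if stores_depleted:      # stage 1: body iron stores are depleted
        return "iron depletion"
    return "iron sufficient"

print(iron_status(stores_depleted=False, chr_pg=26.8, hgb_g_dl=11.4))
# -> "iron deficiency (not yet anemic)"
```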
Iron deficiency is usually readily treated with dietary iron supplementation.
Under current guidelines, children are first screened with the hemoglobin test at 9 to 12 months of age. However, iron deficiency can exist for some time before causing anemia. There are other tests that can diagnose iron deficiency in the absence of anemia, but they are impractical for routine clinical screening. Thus, current screening practices miss iron deficiency in non-anemic infants in whom adverse consequences may be developing.
Source: Children's Hospital Boston, 2005
XagenaMedicine2005