Iron deficiency—when the body has too little iron—may affect a quarter of the world’s population, particularly women of reproductive age. Symptoms can include fatigue and shortness of breath, and without treatment, iron deficiency can progress to anemia, a condition characterized by a low red blood cell count that can lead to more serious heart and other health problems.
There is a simple blood test for iron deficiency that measures ferritin—a protein in the blood that stores iron—but it’s not part of a standard checkup in most countries, including the United States. Further, for decades the World Health Organization has designated 15 micrograms per liter of ferritin as the threshold for low iron, a cutoff that some experts consider too low to reflect the true prevalence of iron deficiency. This lack of standardization creates blind spots, leaving many symptomatic individuals without access to treatment.
“We know for a fact that we’re missing people who are truly iron deficient,” says George Goshua, MD, MSc, an assistant professor of medicine (medical oncology and hematology) at Yale School of Medicine. Goshua led a team that explored the benefits and costs of screening adult women for iron deficiency and recently published their findings in the American Journal of Hematology.
Their work is especially timely: the American Society of Hematology, an international organization of researchers and clinicians focused on blood disorders, convened a panel last year to establish the first medical practice guidelines for iron deficiency. The panel’s recommendations are expected next year, and Goshua hopes his team’s work will help inform the experts’ decisions.
“We believe this is how we start an important conversation that has been ignored for decades,” says Goshua, who is also a member of Yale Cancer Center.