Iron Deficiency

Iron deficiency occurs when there is an insufficient intake of iron—primarily found in flesh foods and, to a lesser extent, dairy products and plant foods—as well as in fortified foods or supplements. Iron deficiency can also be caused by poor absorption and excessive loss of the mineral, including blood loss. The more severe stages of iron deficiency can result in anemia when there is not enough iron to produce adequate amounts of hemoglobin for red blood cells (WHO and CDC 2004). Iron deficiency is a major contributor to anemia, though the actual extent of overlap between iron deficiency and anemia is context-specific and varies by setting (Kassebaum and GBD 2013 Anemia Collaborators 2016). Specific groups at an increased risk of iron deficiency include children (due to rapid growth), pregnant women (due to expansion of the red blood cell mass and the need for more iron for the fetus), and women of reproductive age, including adolescent girls (due to blood loss during menstruation).

How is iron deficiency measured?

Bone marrow aspirates are the gold standard for assessing iron deficiency, but this method is not practical for population-based measurement. Ferritin is the most commonly used biomarker for iron status; the World Health Organization (WHO) recommends ferritin to assess iron status in population-based surveys. Ferritin reflects the amount of iron stored in the body; low concentrations indicate depleted iron stores. The serum transferrin receptor (sTfR), which reflects the need for iron at the cellular level, is another biomarker used to assess iron status (WHO 2011).

Ferritin and sTfR concentrations can be determined from a venous or capillary blood sample, which requires maintaining a cold chain. Common laboratory methods include enzyme-linked immunosorbent assay (ELISA) and immunoturbidimetry, among others (WHO and CDC 2004).

How is iron deficiency categorized?

A definition for what constitutes a public health problem for iron deficiency has not been established. Table 4 describes the cut-offs for defining iron deficiency using ferritin, with differences based on age and pregnancy status.

Table 4: Iron Deficiency Cut-offs Based on Serum Ferritin Concentration

| Serum ferritin (mcg/l) | Under 5 years, male | Under 5 years, female | 5 years or older, male | 5 years or older, female |
|---|---|---|---|---|
| Depleted iron stores (in areas where infection and inflammation are not prevalent) | 12 | 12 | 15 | 15 |
| Depleted iron stores in the presence of infection | 30 | 30 | -- | -- |
| Severe risk of iron overload (adults) | -- | -- | >200 | >150 |

Note: For sTfR, use the cut-off values recommended by the manufacturer of the assay.
Source: WHO 2011
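
The cut-offs in Table 4 translate directly into a simple classification rule. The sketch below is a minimal, hypothetical illustration of how they could be applied to individual records; the function and parameter names are ours, not from WHO 2011:

```python
def depleted_iron_stores(ferritin_ug_l, age_years, infection=False):
    """Return True when serum ferritin falls below the Table 4 cut-off
    for depleted iron stores: 12 mcg/l under 5 years and 15 mcg/l at
    5 years or older (where infection and inflammation are not
    prevalent), or 30 mcg/l for under-5s in the presence of infection."""
    if infection:
        if age_years < 5:
            return ferritin_ug_l < 30
        return None  # Table 4 defines no infection-adjusted cut-off at 5+
    return ferritin_ug_l < (12 if age_years < 5 else 15)

def severe_iron_overload_risk(ferritin_ug_l, sex):
    """Adults only: >200 mcg/l (male) or >150 mcg/l (female), per Table 4."""
    return ferritin_ug_l > (200 if sex == "male" else 150)
```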

Where can we get these data?

Iron deficiency is measured in population-based surveys and research studies, typically among women of reproductive age and children. Of the commonly administered population-based surveys, the National Micronutrient Survey is usually the only one that collects and analyzes information on the prevalence of iron deficiency.

Methodological issues

  • While the prevalence of anemia is sometimes used as a proxy indicator for iron deficiency, this poses many problems, because iron deficiency is only one of many causes that lead to anemia and, depending on the setting, may not even be a major contributor (Kassebaum et al. 2014). Additionally, mild and moderate levels of iron deficiency may not manifest as anemia, although they probably still result in functional impairment (WHO 2001).
  • Infection and inflammation can raise ferritin concentrations, complicating the interpretation of iron status. In addition to being a biomarker of iron status, ferritin is a positive acute-phase protein, so its concentration rises in response to inflammation. Ferritin may therefore be elevated in the presence of inflammation irrespective of iron status, which can lead to an underestimation of the prevalence of iron deficiency.
  • Approaches have been developed to adjust ferritin concentrations for inflammation using the inflammation biomarkers alpha-1-acid glycoprotein (AGP) and C-reactive protein (CRP). The four approaches currently proposed are to:
    1. Exclude individuals with elevated inflammation from calculations of iron status (WHO and CDC 2004),
    2. Raise the ferritin threshold to 30 mcg/l for those with elevated inflammation (WHO and CDC 2004),
    3. Use a categorical correction factor (Thurnham et al. 2010),
    4. Use a regression correction (Namaste et al. forthcoming).
    Verify whether an adjustment approach was used to determine iron deficiency from ferritin concentrations. If none was used, note this in your limitations and recognize that iron deficiency is probably a bigger problem than your data indicate. If the raw data are available, apply these adjustments and present both adjusted and unadjusted prevalence estimates. (A sketch of approaches 1 and 2 appears after Table 5 below.)
  • As an alternative to these adjustment approaches, in areas with a high prevalence of inflammation you can use ferritin and sTfR in combination. This method may help you determine whether iron deficiency is a problem in your setting, using the interpretations in Table 5; a sketch of this decision rule also follows the table.

Table 5: Interpretation of Serum Ferritin and Transferrin Receptor Concentrations in Population Surveys

| Percentage of serum ferritin values below threshold^a | Percentage of serum transferrin receptor values above cut-off^b | Interpretation |
|---|---|---|
| <20^c | <10 | Iron deficiency not prevalent |
| <20^c | ≥10 | Iron deficiency prevalent |
| ≥20^d | ≥10 | Inflammation prevalent |
| ≥20^d | <10 | Iron deficiency prevalent |

^a Apply cut-off values by age group as described in Table 4.
^b Apply cut-off values recommended by the manufacturer of the assay until an international reference standard is available.
^c Less than 30% for pregnant women.
^d 30% or higher for pregnant women.
Source: WHO 2011
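
As referenced above, the first two inflammation-adjustment approaches (exclusion and the raised threshold) can be sketched in a few lines. This is an illustrative, hypothetical implementation, not code from the cited sources; the record keys are ours, and the CRP and AGP thresholds shown (CRP > 5 mg/l, AGP > 1 g/l) are commonly used operational definitions of elevated inflammation that should be checked against your own survey's documentation:

```python
# Hypothetical operational definitions of elevated inflammation.
CRP_CUTOFF_MG_L = 5.0
AGP_CUTOFF_G_L = 1.0

def _inflamed(record):
    """Flag elevated inflammation from CRP and AGP (hypothetical keys)."""
    return (record["crp_mg_l"] > CRP_CUTOFF_MG_L
            or record["agp_g_l"] > AGP_CUTOFF_G_L)

def prevalence_excluding_inflamed(records, ferritin_cutoff=15.0):
    """Approach 1: exclude individuals with elevated inflammation, then
    compute the share of the remainder below the ferritin cut-off."""
    kept = [r for r in records if not _inflamed(r)]
    if not kept:
        return None
    return sum(r["ferritin_ug_l"] < ferritin_cutoff for r in kept) / len(kept)

def prevalence_raised_threshold(records, ferritin_cutoff=15.0):
    """Approach 2: keep everyone, but judge individuals with elevated
    inflammation against a 30 mcg/l ferritin cut-off."""
    low = sum(
        r["ferritin_ug_l"] < (30.0 if _inflamed(r) else ferritin_cutoff)
        for r in records
    )
    return low / len(records)

# Example with two made-up records:
survey = [
    {"ferritin_ug_l": 10.0, "crp_mg_l": 2.0, "agp_g_l": 0.8},
    {"ferritin_ug_l": 25.0, "crp_mg_l": 9.0, "agp_g_l": 1.4},
]
print(prevalence_excluding_inflamed(survey))  # 1.0: only the first record is kept
print(prevalence_raised_threshold(survey))    # 1.0: the second is judged against 30
```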
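
Similarly, the Table 5 decision rule for combining ferritin and sTfR at the population level can be expressed as a small function. This is a sketch under the assumptions in the table above, with the pregnancy adjustments from footnotes c and d; the names are ours:

```python
def interpret_population(pct_ferritin_low, pct_stfr_high, pregnant=False):
    """Apply the Table 5 rule to population-level percentages.

    pct_ferritin_low: % of serum ferritin values below the Table 4 threshold.
    pct_stfr_high:    % of sTfR values above the assay's cut-off.
    """
    ferritin_cut = 30.0 if pregnant else 20.0  # footnotes c and d
    many_low_ferritin = pct_ferritin_low >= ferritin_cut
    many_high_stfr = pct_stfr_high >= 10.0
    if many_low_ferritin and many_high_stfr:
        return "inflammation prevalent"
    if many_low_ferritin or many_high_stfr:
        return "iron deficiency prevalent"
    return "iron deficiency not prevalent"
```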