Iron deficiency is a global health problem affecting approximately two billion people, and it is more prevalent in developing than in industrialised countries. In childhood specifically, iron deficiency and subsequent iron deficiency anaemia are considered to negatively affect development, causing growth impairment and cognitive deficits.1 This is because iron is essential for oxygen transport, as the central molecule in haemoglobin, and because the metal is an essential component of many vital enzymes involved in mitochondrial respiration, metabolism, hormone synthesis and DNA replication. To prevent the negative consequences of iron deficiency on children's development, several controlled clinical trials were initiated to study the effects of dietary fortification with iron and other micronutrients on children's health, mostly in rural sites of developing countries where such nutrients are scarce and iron deficiency is highly prevalent. However, the results of these studies were striking, because iron supplementation resulted in a significant increase in infection-related mortality, mostly attributable to malaria and invasive bacterial infections, or in increased morbidity due to diarrhoea.2,3 The mechanisms underlying these observations have remained elusive thus far but may be linked to two major factors. First, iron is an essential nutrient for many micro-organisms, and access to iron promotes microbial proliferation and pathogenicity.3 Second, iron exerts subtle effects on immune function: the metal is needed for proliferation and differentiation of immune cells, while iron availability also affects antimicrobial immune effector pathways, exerting a negative effect on the activity of the cytokine interferon-gamma, which is essential for stimulation of innate immune functions of macrophages and dendritic cells.3
In the current issue of Gut, Jaeggi et al4 present an additional explanation for the negative outcomes of dietary iron supplementation studies in tropical countries with a high burden of infectious diseases. They performed two double-blind randomised trials of dietary iron fortification in weaning infants in Kenya. In the first study, children at a median age of 6 months were supplemented with micronutrient powder (MNP) containing 2.5 mg/g of an iron source or no iron, whereas in the second study children received MNP plus 12.5 mg/g of another iron salt versus no iron, both for 2 months. The authors analysed and compared microbiota composition before and after dietary iron supplementation. They found that dietary iron fortification resulted in a significant expansion of facultatively pathogenic bacteria such as Escherichia coli, Shigella and Clostridia, which were already part of the gut microbiome at study initiation, although at lower numbers. This expansion of pathogenic bacteria in the gut microflora translated into a more than threefold higher incidence of diarrhoea episodes requiring treatment in children supplemented with 12.5 mg/g MNP iron compared with those without dietary iron supplementation. Of interest, children receiving MNP with iron were found to have significantly increased faecal concentrations of calprotectin, an established marker of intestinal inflammation.
What are the explanation and the consequences of these findings?
First, they clearly suggest that untargeted iron fortification in areas with a high endemic burden of infectious diseases may be hazardous. This accords with observations from a recent prospective, observational study of 785 children in Tanzania, enrolled at birth, which indicated that iron deficiency protects from malaria, as reflected by a reduced incidence and severity of this infection and by a significant survival benefit.5 Still, we do not know whether these results and those presented by Jaeggi et al4 apply to all children in general, or whether a potential benefit of iron fortification in truly or severely iron deficient and anaemic children will outweigh the potential hazard of severe infections.
Second, we need more specific analyses, and ideally new prospective trials, to characterise individuals who may benefit from iron fortification and those who are at high risk of adverse effects, including life-threatening infections. This also points to the need to identify and introduce into clinical routine biomarkers that are easy to use at rural sites and can help physicians in making this important therapeutic decision—to supplement or not to supplement iron. One promising test may be the determination in blood of the master iron regulatory protein hepcidin, which predicts iron deficiency in children with malaria with high sensitivity and specificity.6
Third, multiple mechanisms appear to contribute to increased inflammation in children receiving dietary iron. The most obvious is the emergence of pathogenic and enteroinvasive bacteria, which induce a local immune response in the intestine. Moreover, the chronic inflammatory state prevalent in countries with a high burden of infections impairs oral iron absorption, because inflammation-driven induction of the iron regulatory hormone hepcidin blocks the transfer of iron from enterocytes to the circulation.7 The resulting reduction in iron absorption leads to higher concentrations of the metal in the small and large intestines, where it can aggravate inflammation via its catalytic activity in the formation of tissue-toxic radicals.8
Fourth, the results of Jaeggi et al are also of importance for other diseases, specifically inflammatory bowel disease (IBD), where oral iron supplementation in animal models of the disease resulted in disease aggravation, which may be linked to iron-mediated emergence of pathogenic bacteria and subsequent infection9 along with additional inflammation-promoting effects of the metal. However, the relevance of this finding still awaits proof of concept in humans.
This leads to the final question: how can a potentially pathogenic microflora be beneficially modified to avoid the negative effects of dietary iron supplementation in the setting of severe iron deficiency anaemia? Here, a microbial competition concept attracts interest, based on the observation that probiotic bacteria such as E coli Nissle, which are already available for clinical use, can outcompete pathogenic bacteria such as Salmonella by limiting their access to iron, thereby restricting their growth.10
Footnotes
Competing interests None.
Provenance and peer review Commissioned; internally peer reviewed.