Iron-induced heart disease remains the most frequent cause of death in thalassemia major and a critical life-limiting complication of other transfusion-dependent disorders, hereditary hemochromatosis, and other forms of iron overload. Cardiac iron toxicity is insidious, characteristically remaining clinically covert until heart failure and arrhythmias abruptly appear. Endomyocardial biopsy is ineffective for the early identification of patients at increased risk because iron is deposited heterogeneously in the heart. As summarized in a recent NIDDK workshop (Brittenham and Badman, Blood. 2003;101:15-19), magnetic resonance (MR) studies offer a potential noninvasive means of identifying patients with cardiac iron deposition. Use of MR to assess cardiac iron has been hindered because clinical MR instruments detect tissue iron indirectly, through the magnetic effects of ferritin and hemosiderin iron on nearby hydrogen nuclei. Because a detailed theoretical understanding of these complex interactions is lacking, empirical efforts have used a variety of instruments, imaging sequences, and parameters.
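In schematic terms, the indirect detection works as follows: ferritin and hemosiderin iron are superparamagnetic, and the local field inhomogeneities they create accelerate the transverse relaxation of nearby water protons. Under the conventional (and simplified) assumptions of monoexponential signal decay and a linear iron-relaxivity calibration, the gradient-echo signal behaves approximately as

\[
S(\mathrm{TE}) \approx S_0 \, e^{-\mathrm{TE}\cdot R_2^*}, \qquad R_2^* = \frac{1}{T_2^*} \approx R_{2,0}^* + r^*\, C_{\mathrm{Fe}},
\]

where \(\mathrm{TE}\) is the echo time, \(R_{2,0}^*\) the relaxation rate of iron-free tissue, \(C_{\mathrm{Fe}}\) the tissue iron concentration, and \(r^*\) an empirical relaxivity that depends on tissue, field strength, and pulse sequence. The monoexponential and linear assumptions in this sketch are precisely what lacks firm theoretical grounding, which is why calibrations have remained empirical and instrument-specific.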
In this issue, Jensen and colleagues (page 4632) report the first repeated MR measurements of cardiac iron in transfused patients treated with deferoxamine. Using signal intensity ratios, they found increased cardiac iron (with one exception) only in patients whose liver iron concentration exceeded a threshold of 350 μmol/g dry weight, as well as significant correlations between heart iron and serum ferritin concentration and, intriguingly, deferoxamine-induced urinary iron excretion. As the authors emphasize, other MR approaches (T2, T2*, magnetization transfer ratios) have yielded seemingly contradictory results (eg, no hepatic iron threshold for cardiac iron, no significant relationship with serum ferritin). Jensen and his colleagues, painstaking pioneers in the use of MR for studies of tissue iron, caution that clinical use of MR estimates of myocardial iron must await an “ironing out” of these apparent anomalies.
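One plausible source of these apparent anomalies lies in what the two families of methods actually measure. In outline, and with hedging (skeletal muscle is the usual reference tissue in signal-intensity-ratio studies and is assumed here, since it does not appreciably load iron), the signal intensity ratio compares myocardial signal with that of the reference tissue, while T2* methods fit the decay of signal across several echo times:

\[
\mathrm{SIR} = \frac{S_{\text{myocardium}}}{S_{\text{skeletal muscle}}}, \qquad
\hat{T}_2^* = \arg\min_{T_2^*,\,S_0} \sum_i \left[ S(\mathrm{TE}_i) - S_0\, e^{-\mathrm{TE}_i/T_2^*} \right]^2 .
\]

Because the ratio depends on the reference tissue and on sequence parameters, whereas T2* is estimated as an absolute tissue property, the two measures need not track each other across instruments and protocols, so discordant thresholds and correlations are not necessarily surprising.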