In a prospective, randomized, multicenter study of immune tolerance induction (ITI) in patients with hemophilia A who had become refractory to replacement therapy after developing alloantibodies that inhibit factor VIII (FVIII) activity, Hay and DiMichele compared two regimens: low-dose FVIII (50 IU/kg 3 times weekly) and high-dose FVIII (200 IU/kg daily).1
The two regimens were reported to induce a similar overall success rate, with roughly two-thirds of patients responding as defined by inhibitor disappearance and normalization of plasma FVIII half-life and recovery. This study, which enrolled patients from 17 countries on 4 continents, contributes data that should help define an evidence-based practice of ITI. It also stands as a milestone because it shows that investigator-driven, randomized clinical trials can be carried out and completed even in rare diseases such as hemophilia without the direct involvement of the pharmaceutical industry.
How to eradicate FVIII inhibitors and thereby treat the most challenging complication of hemophilia has been a long-standing issue. As early as 1977, Brackmann and Gormsen reported that a long-standing, high-titer inhibitor in a patient with severe hemophilia A disappeared completely after high daily doses of FVIII were infused for a prolonged period of time.2 This striking and puzzling observation, which led to the development of the so-called Bonn ITI regimen (broadly corresponding to the high-dose regimen evaluated by Hay and DiMichele1), was received with skepticism by the hemophilia community. The reasons for the skepticism were multiple: the lack of an experimental basis for this novel approach; limited knowledge of FVIII immunology; the substantial cost of the huge FVIII dosages employed at a time when therapeutic coagulation factors were still of limited availability; the demands placed on patients' resilience and venous access by daily or twice-daily factor infusions; and the fear of transmitting viral hepatitis and, later, HIV infection. Nevertheless, the persistent and unswerving efforts of the Bonn group to confirm and extend the original case report3 led to several small, nonrandomized cohort studies confirming that flooding the immune systems of these patients with the antigen (FVIII) did quench production of the neutralizing antibody, with success rates ranging from 63% to 80%.4,5
How did subsequent studies advance the clinical implementation of ITI? Cognizant of the difficulties inherent in interpreting retrospective data from small cohorts of patients with a rare complication of a rare disease, some clinicians chose to gather and analyze data from registries. The International, German, and North American registries obtained broadly consistent results on the variables that influence ITI outcome.6-8 The main predictors of success were low historical peak inhibitor titers, low inhibitor titers before the start of ITI, and low anamnestic peaks after the start of treatment. These factors provide clinicians with crucial information to select the most suitable patients for this expensive and demanding treatment.
Before the current study by Hay and DiMichele, the main unresolved question was the influence of the FVIII dosage regimen on the ITI success rate, particularly after a Dutch study obtained a high success rate (83%) using as little as 50 IU/kg of FVIII 3 times weekly, instead of the 200 to 300 IU/kg daily doses used by the acolytes of the Bonn regimen.9 This smaller dosage regimen, which achieved a high rate of ITI success within a time frame similar to that of the high dosage (on average, approximately 1 year from onset), was highly appealing because of its lower cost and greater acceptability to patients.
Is there an answer to this question from Hay and DiMichele? Their study was designed to test the hypothesis of noninferiority, that is, that the ITI success rate is independent of the FVIII dosage. Even though equivalence between the high and low doses was not formally established, my clinical interpretation of the results is that in good-risk patients (ie, those relatively likely to get rid of their anti-FVIII inhibitor), either regimen can be used successfully. Does this result imply that one should prefer the low FVIII dosage for reasons of cost and patient convenience? The low-dose regimen was associated with twice as many bleeding episodes as the high-dose regimen.1 Moreover, the cost of the FVIII-bypassing products (recombinant activated factor VII and plasma-derived anti-inhibitor coagulant complex)10 needed to treat the intercurrent bleeding episodes might nullify or substantially reduce any saving obtained through lower FVIII usage. Hence, one important piece of information still missing for a meaningful therapeutic choice is a cost-effectiveness analysis. An answer to this question is particularly cogent at a time when the global economic crisis is mounting pressure on health care costs and austerity measures are being imposed on drug spending, even for therapies as effective as those used in hemophilia, which allow patients to have a life expectancy similar to that of their peers without hemophilia (at least in high-income countries).11 On the other hand, more frequent bleeding episodes may well impair the safety and quality of life of patients treated with low-dose FVIII. Hence, an assessment of the risk-benefit ratio and of quality of life is also needed to compare the two regimens.
Conflict-of-interest disclosure: The author declares no competing financial interests. ■