Abstract
Efforts to develop curative treatment strategies for chronic lymphocytic leukemia (CLL) in recent years have focused on allogeneic stem cell transplantation (alloSCT). The crucial anti-leukemic principle of alloSCT in CLL appears to be the immune-mediated anti-host activities conferred with the graft (graft-versus-leukemia effects, GVL). Evidence for GVL in CLL is provided by studies analyzing the kinetics of minimal residual disease in response to immune modulation after transplantation, suggesting that GVL can result in complete and durable suppression of the leukemic clone. AlloSCT from matched related or unrelated donors can overcome the treatment resistance of poor-risk CLL, ie, purine analogue-refractory disease and CLL with del 17p-. Even with reduced-intensity conditioning, alloSCT in CLL is associated with significant mortality and morbidity due to graft-versus-host disease, which has to be weighed against the risk of the disease itself when defining the indication for transplantation. Therefore, it can be regarded as a reasonable treatment option only for eligible patients who fulfill accepted criteria for poor-risk disease. If alloSCT is considered, it should be performed before CLL has advanced to a status of complete refractoriness to ensure an optimum chance of a successful outcome. Prospective trials are underway to prove whether alloSCT can indeed change the natural history of poor-risk CLL.
Chronic lymphocytic leukemia (CLL) is the most common type of leukemia, with an incidence of 3 to 6 per 100,000 per year. Although not curable, CLL often has an indolent behavior with good responsiveness to cytoreductive treatment or no need for treatment at all. However, about 20% of the patients who need treatment show an aggressive course and die within a few years of diagnosis despite early institution of intensive immunochemotherapy.1,2 These so-called “poor-risk” patients are characterized by preexisting or rapidly developing resistance to conventional chemotherapy, including modern purine analogue-antibody combination regimens.3,4 Although newer agents, such as alemtuzumab, lenalidomide, and flavopiridol, show promising activity in some patients with poor-risk CLL,5–9 none of these compounds seems to have the potential of significantly improving the dismal natural course of poor-risk CLL.
In order to improve the prognosis of patients with poor-risk CLL, efforts to develop effective treatment strategies have focused on autologous and allogeneic stem cell transplantation (SCT) during the last two decades. Despite some methodological similarities, autologous and allogeneic SCT (alloSCT) represent fundamentally different treatment approaches. Whereas the efficacy of autografting relies exclusively on the cytotoxicity administered with the (high-dose) regimen, similar to any other pharmacological treatment, allotransplantation adds the immune-mediated anti-host activities conferred with the graft as a second fundamental principle of anti-leukemic efficacy (GVL effects). Thus, alloSCT represents the initiation of permanently active cellular immune therapy in the recipient, thereby providing a treatment modality that is, in a biological sense, completely different from any other cytotoxic or immunological therapy.
Evidence for GVL Activity in CLL
Since studies of autologous SCT demonstrate that even myeloablative therapy cannot cure CLL, alloSCT can have curative potential only if GVL effects are active in CLL. Evidence for GVL efficacy in CLL derives from the observation that, in contrast to autologous SCT or other intensive therapies, the hazard of relapse seems to decrease over time even if the alloSCT was performed with reduced-intensity conditioning (RIC).10–16 Furthermore, GVL activity in CLL is indicated by a reduced relapse risk in the presence of chronic graft-versus-host disease (GVHD)14,17,18 and an increased relapse risk associated with the use of T-cell depletion (TCD).19,20
The most compelling proof of the GVL principle in CLL comes from studies analyzing the kinetics of minimal residual disease (MRD) after RIC alloSCT. In CLL, MRD can be quantified even at very low levels by PCR- or flow cytometry-based assays with a sensitivity of 10⁻⁴ or better (ie, detection of one CLL cell among 10,000 normal leukocytes). Ritgen et al performed an analysis of MRD kinetics in 32 patients who had undergone RIC alloSCT and demonstrated that in the majority of cases achievement of MRD negativity was clearly linked to immune intervention, such as tapering of immunosuppression (n = 12) or donor lymphocyte infusions (DLI; n = 6).21 Seventy-two percent of these 18 patients developed chronic GVHD. Four additional patients showed a significant MRD response upon onset of chronic GVHD, which was, however, only temporary despite ongoing GVHD. This suggests a kind of “GVL escape” mechanism that might be acquired during the post-transplantation course. Explanations for this secondary resistance of CLL cells developing under the pressure of ongoing GVL activity might be (1) clonal evolution, (2) survival of clonogenic cells at “GVL-sanctuary” sites, as suggested by the fact that disease recurrence can be restricted to extranodal manifestations in these patients,21 or (3) the tumor stem cell hypothesis.22 This GVL resistance phenomenon might, in addition to the complete absence of GVL, contribute to late CLL relapses that are observed in a minority of patients after both RIC and myeloablative T-replete alloSCT.11,15–17
Farina et al found delayed clearance or decrease of MRD levels upon onset of chronic GVHD in 8 of 29 patients in clinical CR after RIC alloSCT, strongly suggestive of GVL activity.18 Similarly, delayed MRD clearance or decrease was observed in 10 of 20 patients after myeloablative alloSCT by Moreno et al23 (and C. Moreno, personal communication, May 19, 2009). In all three studies, MRD negativity 1 year post transplant was durable over the entire follow-up period in more than 90% of patients and predictive of the virtual absence of clinical relapse (except for 2 patients with extranodal disease recurrence) (Table 1).
In conclusion, MRD kinetics studies consistently indicate that permanent MRD negativity after alloSCT for CLL can be reached in the context of chronic GVHD and/or immunomodulating intervention. Both the durability of MRD remission and its sensitivity to immunomodulation strongly suggest that GVL is effective in CLL. Unfortunately, GVL in CLL seems to be closely correlated to chronic GVHD, implying that it is essentially dependent on allogeneic effects with broader specificity rather than on a CLL-specific reactivity of donor GVL effector cells.
AlloSCT is Effective in Poor-risk CLL
As summarized in Table 2, long-term progression-free survival (PFS) can be achieved in 30% to 60% of patients transplanted with RIC alloSCT. Where studied, patients with poor-risk CLL as defined by purine analogue refractoriness or presence of del 17p- had an outcome similar to that of patients without poor-risk characteristics: in the CLL3X trial of the German CLL Study Group (GCLLSG), patients with purine analogue resistance showed exactly the same PFS as patients without purine analogue refractoriness, and purine analogue refractoriness did not emerge as an adverse factor for PFS in multivariate analysis.16 Similar observations had been made earlier in a smaller study by Schetelig et al.13 In the study of the Seattle Consortium, a 5-year PFS of 39% was reported with only very few late events despite a high proportion (87%) of patients with a history of fludarabine refractoriness.15
A retrospective analysis of the European Group for Blood and Marrow Transplantation (EBMT) on alloSCT in 44 patients with del 17p- CLL showed a 3-year PFS of 37% (95% confidence interval 22% to 52%) with no event occurring later than 3.5 years after transplant (median follow-up 39 months [range, 18–101]). Of note, the vast majority (89%) of the patients in this study had received RIC.20 Similarly, 5 of 13 patients with del 17p- prospectively followed in the CLL3X trial became long-term MRD-negative survivors (maximum follow-up 59 months). No event was observed from 24 months after alloSCT onwards.16
Mortality of AlloSCT in CLL
Whereas non-relapse mortality (NRM) rates of up to 44% were reported in older registry analyses of myeloablative alloSCT for CLL,10,11 more recent data obtained with RIC uniformly show an NRM between 15% and 25% (Table 2). This advantage of RIC is even more remarkable as RIC cohorts are generally older and are characterized by higher comorbidity scores.14,26 Moreover, it has to be stressed that in the era of RIC, where the direct toxic effect of the conditioning regimen is often moderate and NRM is essentially due to GVHD and its complications, non-relapse deaths mostly do not occur in the transplant phase but are distributed over the first 12 post-transplant months, with some additional events occurring thereafter.13,15,16,20,24 For instance, the “early death” rate, defined as mortality by day +100 after alloSCT, was less than 3% in the CLL3X trial. This has to be taken into account when considering the risk of dying with and without transplant. On the other hand, even the prospective series on RIC alloSCT represent selected patients and not intent-to-treat analyses from the time of relapse or transplant indication, implying that they are biased toward a more favorable patient population than non-transplant salvage trials in poor-risk CLL. However, even though poor-risk patients who respond to salvage treatment do better than completely refractory patients,7 it has to be considered that these patients also had a significantly superior outcome after RIC alloSCT in the majority of published studies.14,16,24,25
In conclusion, RIC alloSCT in CLL is associated with protracted NRM with a cumulative incidence of 15% to 25% after 2 to 4 years. Whether this means impaired short-term survival of allografted patients with poor-risk CLL in comparison to non-transplant salvage strategies can only be answered by prospective comparative studies on an intent-to-treat basis.
Morbidity of AlloSCT in CLL
The major determinant of long-term morbidity affecting quality of life after alloSCT is chronic GVHD. A recent systematic study documented a significantly reduced self-reported long-term health status in patients allografted for various hematological malignancies who had active chronic GVHD.27 Adverse health outcomes in each of 6 categories (general health, mental health, functional impairment, activity limitation, pain, anxiety) were reported by 10% to 20% of patients without chronic GVHD, whereas the proportion of affected patients was at least twice as high in each individual category among patients with active GVHD. Notably, there was no difference between patients who had never developed chronic GVHD and those with resolved chronic GVHD.
Depending on the immunomodulating strategy employed, extensive chronic GVHD develops in up to 60% of patients at risk after RIC alloSCT for CLL (Table 2). In their series of 82 patients, Sorror et al observed a 5-year cumulative incidence of extensive chronic GVHD of 49% for recipients of related and 53% for recipients of unrelated grafts. However, in the majority of affected patients clinical symptoms of chronic GVHD resolved over time, allowing discontinuation of systemic therapeutic immunosuppression after a median of 25 months.15 Unfortunately, systematic investigations of quality of life after alloSCT in CLL are lacking to date.
In conclusion, transplant-related long-term morbidity after alloSCT for CLL can be significant but is mainly restricted to those patients who have ongoing active chronic GVHD. The morbidity caused by alloSCT for patients with poor-risk CLL must be weighed against the morbidity due to uncontrolled disease and palliative treatment associated with non-transplant salvage strategies.
Indications of AlloSCT in CLL
In 2006, the EBMT worked out a consensus on indications for alloSCT in CLL, stating that alloSCT is a reasonable treatment option for eligible patients with previously treated, poor-risk CLL.28 Criteria for “poor-risk CLL” according to the EBMT CLL Transplant Consensus are purine analogue refractoriness, early relapse after purine analogue combination therapy, and CLL with p53 lesion requiring treatment (Table 3). In the absence of prospective randomized trials, the Consensus recommendations were based on evidence of grade II or lower.
Since then, novel treatment modalities and a large body of additional scientific information have become available. In particular, antibody-purine analogue combination regimens appear to have very promising activity in CLL when given as first-line or salvage treatment. The best-studied regimen is fludarabine-cyclophosphamide-rituximab (FCR). However, even this highly effective combination does not seem to be capable of improving the dismal natural course of patients with 17p-deleted or p53-mutated CLL.1,29,30 Similarly, no significant progress has been made in terms of improving the outcome of purine analogue-refractory CLL. Salvage trials have mainly focused on regimens based on the pan-lymphocyte antibody alemtuzumab. The GCLLSG CLL2H study investigated alemtuzumab monotherapy in 103 patients who had fludarabine-refractory CLL according to the iwCLL/NCI guidelines.4 Despite an overall response rate of 34%, median progression-free survival and overall survival were only 8 months and 19 months, respectively.7 More complex antibody-purine analogue combinations, such as the OFAR regimen (oxaliplatin, fludarabine, ara-C, and rituximab), also fail to achieve sustained disease control in fludarabine-refractory CLL.31 The combination of bendamustine and rituximab showed promising activity in relapsed or refractory CLL in a large GCLLSG phase II trial. Whereas the overall response rate was 92%, only 4 of 9 patients (44%) with 17p- CLL showed a (partial) response.32
Flavopiridol and lenalidomide are the best studied among the newer drugs for CLL salvage treatment. Both yield a 30% to 40% response rate when used as salvage therapy in fludarabine-refractory or del 17p- CLL.8,9 Although it appears unlikely that their activity could translate into long-term rescue of a significant proportion of patients with poor-risk CLL, these compounds or new combinations may be helpful in putting patients into remission as a prerequisite for successful alloSCT.31 With regard to the second EBMT criterion, the extrapolation that CLL progression within 2 years after purine analogue combination therapy indicates a dismal prognosis is very plausible but still needs to be substantiated by systematic studies (to date it is supported only at the expert-opinion evidence level).
As mentioned earlier, evidence is increasing that RIC alloSCT can mitigate the adverse prognostic impact of purine analogue refractoriness and del 17p- as long as CLL has not become generally resistant to salvage treatment.
In conclusion, the additional knowledge accumulated during recent years suggests that poor-risk CLL remains poor-risk CLL and that alloSCT is the only treatment with the potential of providing long-term disease control in this condition, thereby confirming the indications for alloSCT in CLL as defined in the EBMT CLL Transplant Consensus (Table 3). In addition to the disease risk, patient-related risk factors, such as age and comorbidity, have to be considered when the decision about alloSCT is made.15,33
When Should alloSCT in CLL Be Performed?
According to the first and second EBMT criteria, eligibility for allotransplantation in CLL is defined by the quality of response to therapy. Therefore, alloSCT is never indicated as part of the first-line treatment of CLL except for those few patients who have del 17p- with a treatment indication (third EBMT criterion). This implies that transplantation should not be considered too early.
On the other hand, the larger prospective RIC trials uniformly show that in CLL the results of alloSCT are considerably impaired if the disease is not in remission at the time of transplantation, due to bulky nodal disease and/or chemotherapy resistance (Table 4). Thus, alloSCT should be performed before CLL has advanced to a status of complete refractoriness or a large resistant tumor burden in order not to miss the “window of opportunity” for a successful outcome.
In conclusion, in eligible patients alloSCT should be considered as soon as the first and/or the third EBMT criterion (purine analogue refractoriness, del 17p- with treatment indication) is met. The situation is less clear in patients with early relapse after combination therapy who still respond to purine analogue-based salvage treatment (second EBMT criterion). The optimum timing of alloSCT in this risk group needs to be defined by systematic studies.
Reduced Intensity or Myeloablative Conditioning?
Several prospective RIC studies, enrolling a combined total of more than 300 patients on the basis of modern CLL-specific risk stratification, give relatively concordant results. In contrast, the experience with myeloablative conditioning (MAC) in CLL largely relies on registry analyses and a few small single-center studies.10–12,17,19 Therefore, RIC rather than MAC represents the standard procedure for alloSCT in CLL. Given the fact that the crucial therapeutic principle of allotransplantation in CLL is GVL activity, and that evidence for superior direct cytotoxic activity of MAC over RIC is lacking,34 it remains questionable whether MAC can improve on RIC. However, taking into account that an increased incidence of relapse after RIC in comparison with MAC was observed in a registry analysis, impaired disease control by RIC cannot be ruled out.14 Thus, the optimum choice of conditioning regimen may vary with the individual situation: although in the presence of comorbidity and sensitive disease reduced-intensity regimens appear to be more appropriate, high-intensity regimens might be preferable in younger patients with good performance status but poorly controlled disease.26,28
Donors
Even if selected by high-resolution HLA typing, matched unrelated donors (MUD) show a small but significant disadvantage compared with HLA-identical sibling donors when tested in good-risk hematological disease.35 Therefore, it could be anticipated that matched siblings should be superior to MUD in CLL as well. However, a significantly unfavorable effect of MUD could not be found in any of the prospective RIC studies listed in Table 4, indicating that this factor has only very limited clinical implications, if any, in CLL. Accordingly, a matched sibling donor should be preferred for alloSCT in CLL whenever possible, but a MUD is a reasonable alternative if an HLA-identical sibling is not available.
T-cell Depletion
Ex-vivo and in-vivo T-cell depletion are effective means of preventing acute and chronic GVHD after alloSCT. In CLL, however, TCD has been associated with increased relapse and rejection rates.19,20,36 Prophylactic or therapeutic DLI can partly overcome these problems but re-expose the patients to the risk of GVHD.37 Therefore, the value of TCD in alloSCT for CLL remains to be settled.
Additional Immune Intervention
Numerous approaches aiming at increasing the benefit/risk ratio of alloSCT in CLL are currently under investigation, such as peritransplant rituximab25 or MRD-based preemptive immune intervention.16 Since none of these strategies has yet proven to be beneficial, they must be regarded as experimental and should be restricted to clinical trials.
The Role of AlloSCT in the Treatment Algorithm for CLL
In summary, there is convincing evidence that alloSCT can provide long-term disease control and possibly cure in selected patients with CLL, including those with a biologically highly unfavorable risk profile. To date, however, there has been no controlled trial of alloSCT in CLL, and none evaluating the effect of alloSCT in a given disease situation by intention-to-treat analysis. All published studies merely describe the outcome of a selected group of patients from the time of transplant. Thus, although there is no doubt that alloSCT can greatly improve the prognosis of individual poor-risk patients,28 it is unclear to what extent alloSCT can indeed impact the natural history of the patient population with aggressive CLL and what its overall clinical value in the treatment armamentarium for CLL might be.
This question can be properly addressed only by prospective trials comparing alloSCT with non-transplant strategies in defined clinical risk situations on an intention-to-treat basis. It is crucial for such comparisons that patients are followed from the time of reaching the transplant indication, as triggered by the need for treatment, rather than from the time of alloSCT. The GCLLSG is currently performing a trial aimed at validating the EBMT criteria in patients with high-risk and very high-risk CLL. Whereas those with very high-risk disease (first and third EBMT criteria) will undergo a biological randomization by donor availability, patients with high-risk disease (second EBMT criterion) who have an HLA-identical donor will be statistically randomized to alloSCT or continuation of conventional salvage therapy. This trial will be finished in 2012 and will hopefully provide some guidance on when and how to use alloSCT in poor-risk CLL.
Disclosures
Conflict-of-interest disclosure: The author receives honoraria from Roche and Hospira. Off-label drug use: None disclosed.
Acknowledgments
I thank Paolo Corradini and Carol Moreno for proofreading the MRD data table.
References
Author notes
Department of Medicine V, University of Heidelberg, Heidelberg, Germany