The treatment of newly diagnosed acute lymphocytic leukemia (ALL) in adults remains unsatisfactory. Notwithstanding the outstanding progress in curing childhood ALL, only approximately one third of adults younger than 60 years can be cured, and the overall published survival curves have not changed significantly during the past 15 years. Recent therapeutic advances in allogeneic transplantation through the conduct of large collaborative studies, better understanding of the relevance of cytogenetics, improved molecular techniques for the detection of minimal residual disease, and clinical research into novel biologic and targeted therapies have all combined to offer the potential for an improved outcome in this disease. The current approach in 2007 to the management of this disease is presented by way of a discussion of illustrative cases. In this uncommon and difficult disease, well-structured intergroup studies will remain vital for future progress.
Introduction
In contrast to childhood acute lymphocytic leukemia (ALL), therapy of adults with ALL remains unsatisfactory. The long-term survival rate for adults has not changed significantly during the past 2 decades, with a 5-year overall survival of only 30% to 40% for patients younger than 60 years, less than 15% for patients older than 60 years, and less than 5% for patients older than 70 years.1–14 At the same time, much has recently been learned about prognostic factors, cytogenetics, the definition of patients most suitable for an allogeneic transplantation, and the increasing indications for alternative donor transplants for high-risk ALL. Good prospective studies of relapsed ALL and of the treatment of older patients, in search of a better outcome, remain an elusive goal. Moreover, recent data suggest that older teenagers may be better treated on traditional pediatric protocols than on current adult protocols.
To best illustrate some of these new approaches, several case scenarios are presented, followed by a discussion of the therapeutic implications.
Patient 1
A 33-year-old woman presents with pre-B ALL. Her complete blood cell (CBC) count at presentation shows a white blood cell (WBC) count of 11.0 × 10⁹/L (11 000/μL), a hemoglobin level of 98.0 g/L (9.8 g/dL), and a platelet count of 51.0 × 10⁹/L (51 000/μL). She has a normal karyotype and has an HLA-compatible sibling. What is the most appropriate induction and postremission therapy?
This is a young adult who has Philadelphia chromosome-negative ALL. Such patients are considered “standard risk” (there is no such thing as low-risk adult ALL). The criteria are fairly arbitrary, but, by convention, patients are designated as such by a low WBC count at presentation (<30.0 × 10⁹/L [<30 000/μL] for B-lineage and <100.0 × 10⁹/L [<100 000/μL] for T-lineage ALL) and age younger than 35 years.15,16 A normal karyotype has neither “good” nor “poor” prognostic significance.
Recent studies have reported complete response rates for induction therapy of more than 90%. The large international MRC UKALL XII/ECOG 2993 study reported a complete remission rate of 97% in 533 adults at standard risk.17
Whether there is a single best induction regimen is not known. Although the regimen used in the MRC/ECOG study showed a high complete remission (CR) rate, an alternative induction therapy using the hyper-CVAD regimen (hyperfractionated cyclophosphamide, vincristine, Adriamycin, and dexamethasone) has also produced a CR rate in excess of 90%.6 The hyper-CVAD regimen is of particular interest because it does not include L-asparaginase as part of the therapeutic armamentarium. L-Asparaginase has been the mainstay of treatment of childhood ALL for the past 3 decades, since the initial study by the Children's Cancer Study Group reported in 1977.18 Although sequential studies since 1977 have shown an improved outcome with asparaginase, no recent prospective studies have directly compared the outcome of patients, pediatric or adult, treated with or without asparaginase. The issue is of particular interest because there are increasing data emphasizing the improved outcome of patients who achieve effective asparagine depletion compared with those who do not,19 making it unlikely that such a study will ever be done. The MRC/ECOG study has proposed several modifications of its previously reported regimen and incorporates the pegylated form of asparaginase in place of native asparaginase. The polyethylene glycol–conjugated asparaginase was developed to extend the half-life and to lessen the frequency of injections.20 Several randomized trials comparing pegylated asparaginase with the native Escherichia coli asparaginase have all suggested improved efficacy.21–23
Anthracyclines have been a cardinal feature of induction in ALL.8 However, there is no consensus on the optimal anthracycline, dose, or schedule. Daunorubicin is most frequently used, but idarubicin, doxorubicin, and mitoxantrone have also been given, usually in weekly schedules, without convincing data that any one agent is better than another. For young adults, a weekly dose of 60 mg of daunorubicin given for 4 weeks has been well tolerated and is the one suggested in Table 1, although alternative weekly and daily schedules are probably equally effective.
Although the overall reported induction mortality is approximately 5%, most of those patients died of opportunistic infections, including Aspergillus infections. The prolonged immunosuppression associated with the 2 phases of induction is thought to contribute to this risk, and the lengthy exposure to corticosteroids likely adds to it. Recent data from the pediatric literature suggest that the use of dexamethasone, as opposed to prednisone, may improve the outcome. There are also data suggesting that dexamethasone achieves greater penetration of the central nervous system (CNS), is associated with fewer thromboembolic events than prednisone, and has greater in vitro antileukemic activity.24 Furthermore, 2 of 3 randomized studies have shown improved survival in children receiving dexamethasone as opposed to prednisone.25–27 For this reason, a truncated course of corticosteroids, in the form of dexamethasone, has been used in the induction regimen, as shown in Table 1.
It must be emphasized that, although a successful induction is critical for long-term survival in ALL,17 it does not of itself predict long-term survival, because most patients, including those with high-risk ALL, achieve a complete remission. Thus, although it is unlikely that any modifications to the induction regimen will improve the initial response rate, it is important to continue to study modifications with the view of improving the long-term outcome.
Phase 3 studies are under way to evaluate the role of monoclonal antibody therapy in B-lineage ALL, focusing mostly on either epratuzumab (anti-CD22) or rituximab (anti-CD20).28,29 Such therapy may decrease the level of minimal residual disease (MRD) at the completion of induction therapy. Recent reports suggest an advantage for hyper-CVAD plus rituximab compared with historical patients with B-ALL treated with hyper-CVAD alone.30 There have also been preliminary reports suggesting a role for rituximab together with hyper-CVAD in patients with CD20+ pre-B ALL.31 Rituximab already forms part of the standard protocol for older and younger adults with B-precursor ALL in the German Multicenter Study Group for Adult ALL (GMALL) protocols.17 In contrast to the antiproliferative effects of rituximab, epratuzumab appears to function more by an immunomodulatory mechanism,32 and CD22 is expressed on the surface of most pre-B ALL cells. There are mature data in lymphoma, and more preliminary data in ALL, demonstrating its safety and tolerability as well as significant efficacy, making it an attractive agent to study in ALL.33–35 There have also been encouraging data using a combination of rituximab and epratuzumab in lymphomas, with a reported 60% to 70% response rate and easy tolerability.36,37 Guanine arabinoside (nelarabine) has shown significant activity in refractory T-cell malignancies; thus, for T-lineage ALL, nelarabine is likely in future studies to be incorporated into the late phase of induction, the early part of intensification or maintenance therapy, or both.38,39 All of the foregoing hold much promise and are to be investigated in prospective randomized trials; they cannot yet be considered standard of care.
Alemtuzumab, a monoclonal antibody directed against CD52, is also an attractive agent to study in ALL because CD52 is broadly expressed on most T- and B-cell ALLs. Limited activity has been reported in advanced ALL,40 and alemtuzumab is likely to be incorporated in future studies for patients with newly diagnosed CD52+ ALL. Other promising novel approaches that are likely to be incorporated into future clinical trials include the targeting of Notch1-activating mutations,41 liposomal vincristine,42 and intrathecal liposomal cytarabine.43
For postremission therapy, the standard of care has evolved during the past few years. Historically, there has been resistance to the use of an allogeneic transplantation for ALL in first complete remission, based in part on extrapolation from the success in childhood ALL using only chemotherapy,44 as well as on retrospective comparisons with registry data.45 This was somewhat surprising because the clinical importance of the graft-versus-leukemia (GvL) effect has been recognized since 1979, and the initial reports were in patients with ALL.46,47 During the past 2 decades, however, the role of allogeneic transplantation has been recognized in patients at high risk, such as those with Philadelphia chromosome-positive ALL and other high-risk features.16,48–50 Most clinicians have, however, shied away from offering allogeneic transplantations to adult patients with ALL who had an HLA-compatible sibling but were not at high risk.
During the past 14 years, the MRC/ECOG study has assigned such therapy to all adult patients who had an HLA-compatible sibling, including those with standard-risk ALL. The final data were recently reported and showed an unequivocal benefit of HLA-compatible sibling transplantation for patients with standard-risk ALL over other forms of conventional therapy, whether consolidation or maintenance therapy or an autologous transplantation.51 Patients with standard-risk ALL who had an HLA-matched donor (n = 218) were compared with 286 patients who did not have a donor; the 5-year overall survival was 63% versus 51%, respectively (P ≤ .05). The patients with the best outcome with standard chemotherapy derived the greatest benefit from an allogeneic transplantation. Such a transplantation should be performed as soon as possible after induction to minimize transplant-related morbidity and mortality. Total body irradiation (TBI)–based conditioning has traditionally been preferred in ALL, in contrast to acute myelogenous leukemia (AML), and also provides added CNS prophylaxis.
The importance of the GvL effect for ALL in first complete remission cannot be overemphasized, because the relapse rate is significantly reduced in all patients with ALL. This may have been underappreciated because of the poor results of donor leukocyte infusions in patients with relapsed ALL.52 Extrapolating from relapsed ALL to patients in first remission has mistakenly led to the assumption that the GvL effect in ALL is not of overriding significance.
An evaluation of minimal residual disease (MRD) in ALL is assuming increasing clinical relevance and is clearly an important prognostic factor also in adults with ALL, although a decrease in MRD occurs more slowly in adults than in children, and fewer patients reach a negative status.53–56 Whether such evaluation will make it possible to better identify patients who may not require an allogeneic transplantation remains to be proven in prospective studies. Conversely, the presence of detectable MRD by molecular methods after completion of both phases of induction predicts a poor prognosis even in patients with standard-risk ALL, and such patients should proceed to an allogeneic transplantation from an alternative donor.
This patient should be offered a standard induction regimen, similar to the one outlined in Table 1, but without the intensification. She should then receive a myeloablative, TBI-based, allogeneic transplant from an HLA-matched sibling.
Patient 2
An 18-year-old male adolescent presents with common acute lymphoblastic leukemia antigen (CALLA)–positive ALL and a hyperdiploid karyotype. His CBC count at presentation shows a WBC count of 23.0 × 10⁹/L (23 000/μL), a hemoglobin level of 72.0 g/L (7.2 g/dL), and a platelet count of 22.0 × 10⁹/L (22 000/μL). He has an HLA-matched sibling. What is the best induction and postremission therapy?
The issue of how to treat an adolescent with ALL has come under much scrutiny during the past decade. There are increasing data suggesting that adolescents treated with adult ALL protocols have a worse outcome than similar adolescents treated on pediatric protocols.8,19,57–62 All these comparisons have been retrospective and recognize the inherent difficulty in comparing adolescents treated in adult units with those treated in pediatric units; for example, “older looking” young adults may be referred to an adult ward. Nevertheless, the preponderance of the data suggests that such patients are more likely to benefit from pediatric regimens, which, as a general rule, are more intensive, include far more asparaginase, and probably reflect greater protocol discipline, especially regarding timeliness of administration and adherence to dosing schedules. Thus, although this particular patient has standard-risk ALL, no data have unequivocally shown the benefit of an allogeneic transplantation for such a patient. Although all standard-risk patients older than 15 years were included in the MRC/ECOG study,51 a typical adult regimen was used; thus, this remains an open issue. Although children with a hyperdiploid karyotype have been reported to have a more favorable prognosis, this has not been observed in adults.63–65 The data recently reported by the Dana-Farber Cancer Institute in Boston, Massachusetts, for 51 adolescent patients aged 15 to 18 years showed a remarkable event-free survival of 78% in this population. This is clearly better than anything else reported to date, and, if such data are confirmed in an unselected population, there will clearly be no role for an allogeneic transplantation in this age group.66
This patient should be treated on an institutional pediatric regimen for standard-risk patients with ALL and should not undergo an allogeneic transplantation.
Patient 3
A 48-year-old woman presents with T-lineage ALL. Her WBC count at presentation is 140.0 × 10⁹/L (140 000/μL), the hemoglobin level is 81.0 g/L (8.1 g/dL), and the platelet count is 8.0 × 10⁹/L (8000/μL). Cytogenetic analysis revealed a complex karyotype in 14 of 20 metaphases. She does not have an HLA-matched donor.
This patient clearly has a poor prognosis for several reasons. Her age and elevated WBC count are established poor prognostic features. Until recently, the significance of a complex karyotype in ALL was uncertain. Although a complex karyotype has known poor prognostic significance in AML, and was almost intuitively considered as such in ALL, no data had unequivocally established this. The very recent report of cytogenetics in ALL from the MRC/ECOG study showed for the first time the independent adverse prognosis conferred by a complex karyotype.65 In that study, 41 patients were negative for the Philadelphia chromosome and had a complex karyotype, defined as the presence of 5 or more chromosomal abnormalities. These patients had statistically worse overall and event-free survival, independent of other known prognostic variables.
Historically, most clinical trials have confirmed immunophenotyping as a critical part of the prognostic evaluation of patients with ALL.67 Most trials have reported a superior outcome for T-lineage ALL compared with B-lineage ALL.8,51 However, as with B-lineage disease, this broad classification is likely to be superseded by subtyping into early versus thymic and mature T-ALL.8 Furthermore, Notch1-activating mutations may, in the future, provide further prognostic information for patients with T-ALL.68
Although the GvL effect for adult ALL in first remission is clearly present in all patients, including those at high risk, this particular patient presents a management dilemma because of the high nonrelapse mortality associated with allogeneic transplantation in this age group. In the MRC/ECOG study, the nonrelapse mortality was 29% and 39% at 1 and 2 years, respectively, with most of the deaths related to graft-versus-host disease (GvHD). In fact, the high transplant-related mortality abrogated the reduction in relapse risk for older high-risk patients: the 5-year overall survival (39%) and event-free survival were only marginally and nonsignificantly improved compared with patients without a donor (36%).51 The difficulties with managing this patient are further compounded in that she does not have an HLA-matched sibling donor.
At the same time, the GvL effect remains the single most potent antileukemic strategy; thus, every effort must be made to offer this patient an allogeneic transplantation, recognizing the need to consider a less toxic transplantation-conditioning regimen.
Unfortunately, there are only scant data on the use of reduced-intensity conditioning for transplantation in ALL. Such an approach must take into consideration the known hazards, including failure of engraftment and an increased likelihood of relapse. Recent preliminary data reported a 1-year overall survival of approximately 70% for high-risk patients with ALL using reduced-intensity conditioning with fludarabine and melphalan.69 The use of alemtuzumab in such patients has also been reported, with encouraging results.70,71
Reported data from Ph-positive ALL show that for high-risk patients a matched unrelated donor (MUD) is an acceptable alternative for those who do not have a family donor.72 Other published data indicate that the risk of a MUD allogeneic transplantation for patients with ALL is not much greater than the risk of a sibling allogeneic transplantation.73–75 Thus, the approach for this patient should be to offer standard induction therapy, which is likely to lead to a 90% complete response rate.17 A search for an unrelated donor must be initiated urgently so that transplantation can be offered without delay after successful induction therapy. Another alternative is a haploidentical transplantation. The experience with this approach is limited by comparison with MUD transplantation,76 but an update from the group in Perugia, Italy, reported an event-free survival of 46% for the 24 patients with ALL who received a transplant in first remission.77
This high-risk patient should be offered standard induction followed by an allogeneic transplant from a matched unrelated donor using the best institutional regimen for reduced-intensity conditioning. Transplantation-related risks need to be emphasized.
Patient 4
A 53-year-old man presents with Ph-positive ALL. He does not have an HLA-compatible sibling. His CBC count at presentation shows a WBC count of 13.0 × 10⁹/L (13 000/μL), a hemoglobin level of 93.0 g/L (9.3 g/dL), and a platelet count of 23.0 × 10⁹/L (23 000/μL). He received standard induction therapy and imatinib mesylate. At the end of induction and intensification, 3 months from diagnosis, he has no cytogenetic or molecular evidence of BCR-ABL.
The Philadelphia chromosome has long been recognized as the single worst prognostic factor in ALL. More than 90% of such patients succumb to their disease if an allogeneic transplantation is not performed in first remission. For young adults with Ph-positive ALL, an allogeneic transplantation, either from a histocompatible sibling or from a matched unrelated donor, is able to effect a cure in 35% to 50% of patients.
For these reasons, traditional therapy has been to offer some form of an allogeneic transplantation, at almost any cost, to patients with Ph-positive ALL.78,79 Until a year or 2 ago, based on sound trial data, this patient would have been offered a MUD transplantation.67
The advent of imatinib mesylate for patients with Ph-positive ALL has altered the standard of care.80 Although there has never been a prospective study of patients with ALL comparing the results of an allogeneic transplantation with or without concurrent imatinib mesylate therapy, the preponderance of the data indicates an important role for imatinib mesylate in inducing a more profound disease response at all ages, making it unethical to conduct any therapeutic protocol in ALL without the use of imatinib mesylate. Important issues remain. At what point should imatinib mesylate be given? Should this be at the start of induction or at some later point? Conversely, is there a role for initiating therapy with imatinib mesylate before chemotherapy? And for those patients undergoing an allogeneic transplantation, should imatinib mesylate be given after transplantation, and for how long?
One of the most important points to emphasize is that, since the advent of imatinib mesylate, no study has confirmed the ongoing imperative for an allogeneic transplantation in Ph-positive ALL.81 Should all adult patients, even those at high risk of transplant-related mortality, be referred for an alternative donor transplantation when a histocompatible sibling is not available? The transplant-related mortality in these patients, using a conventional transplantation regimen, is likely to be on the order of 40% at 2 years. Moreover, even in the era before imatinib mesylate, a small percentage of such patients were cured without a transplantation.
Given that this patient had no evidence of MRD at 3 months and was completely negative for BCR-ABL, there are no compelling data to offer him a matched unrelated donor transplantation, although an allogeneic transplantation would likely still be offered if a sibling donor were available. Much as in chronic myeloid leukemia (CML), therapy with imatinib mesylate should be continued indefinitely.
This patient should continue on the standard maintenance protocol, with imatinib mesylate given indefinitely.
Patient 5
A 67-year-old man presents with pre-B ALL. His CBC at presentation shows a WBC count of 53.0 × 10⁹/L (53 000/μL), a hemoglobin level of 82.0 g/L (8.2 g/dL), and a platelet count of 21.0 × 10⁹/L (21 000/μL).
The management of older patients with ALL represents a unique therapeutic challenge. Unfortunately, there have been few prospective studies in this age group, even though the median age of adults with ALL is older than 60 years.82 Most older patients are excluded from such trials even when they are eligible. Furthermore, patients who are referred to tertiary care medical centers are a select group, and even there only a few are referred for clinical trials. Thus, the true long-term response rates are likely to be significantly lower than the published data suggest. Taking all this into account, perhaps 10% of patients aged between 60 and 70 years are long-term survivors, and for those older than 70 years long-term survival is less than 5%.
The problem for older adults is 2-fold. First, inherent poor prognostic features are more common among older adults, putting this population at high risk. Most important among these is the high incidence of the Philadelphia chromosome, approximately 50%, among older patients with B-lineage ALL.65,83–85 Because approximately 90% of older patients have B-lineage disease, almost half of older adults are at high risk.84 The second problem is the associated comorbidities and the inability to tolerate the rigors of protracted therapy. As with AML, a significant proportion of older patients can achieve a complete remission; the problem is maintaining them in such a state. In the few studies designed specifically for patients older than 65 years, initial complete response rates of approximately 50% were obtained, with the best results in patients who had low white cell counts, were Ph-negative, or had T-lineage disease. The overall survival, however, was less than 10%.86,87 However, among 519 patients older than 60 years who received intensive chemotherapy on a prospective trial that was not limited by age, the median remission duration was reported to be 9 months and the median survival was 7 months.88
It is clear that a regimen for older adults needs to be attenuated compared with standard therapy for younger patients.89,90 Because achieving a complete remission seems to be a sine qua non for any form of sustained disease-free survival in ALL,51 a serious attempt should be made to get patients into complete remission. A standard regimen for younger adults should be used, but the duration of corticosteroids should be truncated, and full supportive measures with growth factors should be used to minimize the period of neutropenia. The benefit of cytokines in ALL has been unequivocally shown in several well-designed controlled trials.91 To further reduce toxicity, investigations are ongoing of agents that would allow maximal drug delivery with lower toxicity, such as liposomal vincristine and, possibly, liposomal daunorubicin. Monoclonal antibody therapy assumes even greater importance in this population, which is unable to withstand intensive therapy. Clearly, imatinib mesylate is the mainstay for patients who are Philadelphia chromosome positive. This tyrosine kinase inhibitor, as well as the newer agents dasatinib and nilotinib, is rapidly changing the prognosis for this particular subset of patients. Emerging data suggest that imatinib mesylate may be offered to older patients in preference to standard chemotherapy because it has considerably less toxicity.10 Other monoclonal antibodies, such as rituximab and epratuzumab, should be incorporated wherever possible for B-lineage patients. Because the percentage of patients who can be cured is low, it is important to administer drugs that preserve the quality of life as far as possible. However, a palliative approach at diagnosis is not appropriate for those who are reasonably fit, because an initial remission can usually be achieved with acceptable toxicity and may provide a significant period free from disease. Reduced-intensity allogeneic transplantation may in the future effect a cure in some such patients, but there are no data at present to support this.
A suggested regimen for this patient should consist of phase 1 of induction as in Table 1, with the addition of granulocyte colony-stimulating factor (G-CSF). Phase 2 should also be given together with G-CSF, but without the day 29 cyclophosphamide and the last 4 days of cytarabine (days 27-30). If complete remission is achieved, maintenance therapy should consist only of methotrexate and 6-mercaptopurine, with vincristine and prednisone added every 2 to 3 months. Clearly, if the patient is Ph-positive, imatinib mesylate should be given throughout. If the patient is CD20+, rituximab should also be added every 4 weeks, starting before the first cycle of induction.
Patient 6
A 32-year-old man presents with T-lineage ALL. His CBC at presentation shows a WBC count of 51.0 × 10⁹/L (51 000/μL), a hemoglobin level of 111.0 g/L (11.1 g/dL), and a platelet count of 6.0 × 10⁹/L (6000/μL). Cytogenetic analysis showed the t(10;14)(q24;q11.2) karyotype in 7 of 20 metaphases. The patient reported a mild headache, and a routine lumbar puncture revealed the presence of lymphoblasts in his cerebrospinal fluid. He does not have an HLA-matched sibling.
This is a patient whose age, immunophenotype, WBC count at presentation, and karyotype do not place him in the high-risk category. This karyotype is almost uniquely found among patients with T-cell ALL,51 but it has no important prognostic significance. It is clear that such patients require additional CNS-directed therapy. However, the issue at hand is whether otherwise standard protocols for ALL are appropriate for such a patient. The overall incidence of CNS involvement at presentation is in the region of 5% to 8%.92–95 Historically, many such patients were considered to be at inherently high risk of relapse and were excluded from many standard protocols. It was assumed by many that such patients required far more intensive induction and postremission therapy, even after rapid resolution of the findings in the cerebrospinal fluid. During the past few years, 2 large studies reported results based on protocols that offered an unchanged standard of care for patients who presented with CNS disease. The LALA-94 study described 48 patients who had CNS disease at presentation, with an outcome that did not differ from that of the more than 700 remaining patients.96 The MRC/ECOG study reported on 77 of 1508 patients who had CNS disease at presentation.97 That study showed an overall survival at 10 years of 34% for patients who did not have CNS disease compared with 29% for patients who did, a difference of marginal significance (P = .03). However, when tested for heterogeneity, taking into account the influence of the white cell count at presentation, the differences were not statistically significant, especially for the 25 patients with T-ALL who presented with CNS disease.
Although it is always possible that the incidence of CNS disease at presentation is greater than reported in these large groups, because many clinicians may have selectively excluded such patients from analysis, taken together these data suggest that, at the current time, no therapy is known to be better for such patients, and they should be encouraged to enter standard clinical protocols for ALL. It is clear, however, that such patients need to be given added CNS therapy at diagnosis, and the regimen used by the MRC/ECOG is indicated in Table 2.
This patient, who is at standard risk, would have been a candidate for an allogeneic transplantation had a histocompatible sibling been available. In the absence of such a donor there are 3 options: standard protracted consolidation and maintenance therapy, an autologous transplantation, or a MUD transplantation. Because he is at standard risk, and in the absence of evidence of minimal residual disease after induction and intensification, there are no data that would support subjecting him to the increased risks of an unrelated donor transplantation.
The issue of offering an autologous transplantation has been studied by several groups. It is theoretically attractive because a single autologous transplantation can, with current practice, be offered with relatively low morbidity and minimal mortality. Several modestly sized studies have reported that an autologous transplantation was at least comparable with standard chemotherapy. The LALA-87 trial compared 95 patients who underwent an autologous transplantation with 96 patients who were randomly assigned to chemotherapy; the 10-year overall survival was 34% compared with 29% (NS).98 The subsequent LALA-94 study compared 70 patients who received an autologous transplant with 59 patients who received chemotherapy, with a 6-year disease-free survival of 21% compared with 6% (NS).96 The MRC/ECOG study prospectively evaluated the role of an autologous transplantation in large numbers and, for the first time, reported that there is no evidence that a single autologous transplantation can replace standard consolidation or maintenance therapy in any risk group: the event-free survival was an inferior 33% in patients undergoing an autologous transplantation compared with 42% for patients randomly assigned to chemotherapy (P ≤ .05). Thus, such an option cannot be recommended for this patient, although a TBI-based conditioning regimen would have a theoretical advantage in a patient with CNS disease at presentation.
This patient with CNS disease at presentation but otherwise with standard-risk features should be treated with standard induction therapy together with treatment of his CNS disease. Once in complete remission, he should receive intensification and postremission therapy. An allogeneic transplantation should be offered if he has a matched sibling. Otherwise he should receive standard maintenance therapy.
Patient 7
A 52-year-old woman with pre-B ALL went into complete remission and was treated with standard intensification and maintenance therapy. She relapsed 2 years from diagnosis while receiving maintenance. Her CBC count shows a WBC count of 1.1 × 10⁹/L (1100/μL), a hemoglobin level of 63.0 g/L (6.3 g/dL), and a platelet count of 12.0 × 10⁹/L (12 000/μL). Her bone marrow is completely replaced with lymphoblasts having the same original immunophenotype. She does not have a compatible sibling.
Patients with relapsed ALL have a uniformly poor prognosis. Among 609 relapsed patients in the MRC/ECOG study, the median survival was 24 weeks; 22% survived for 1 year and only 7% for 5 years.14 The best treatment for patients with relapsed ALL is unknown. Several regimens have been suggested.99–102 A few facts are clear. The only cure for an adult with relapsed ALL is an allogeneic transplantation. It is also clear that a complete remission can be achieved with a variety of regimens, ranging from the administration of vincristine and prednisone alone to a repeat of any of the standard induction regimens or one of the regimens for refractory disease, such as fludarabine, cytosine arabinoside, and G-CSF (FLAG) or hyper-CVAD, if this was not used in induction. The actual rate of achieving a CR will depend, much as in AML, on the duration of first remission, whether the relapse occurs while on maintenance therapy, and, to some extent, on the intensity of the prior therapy received and the degree of marrow involvement at relapse. There is also the dilemma of deciding whether to attempt reinduction therapy or to proceed directly to an allogeneic transplantation if a sibling donor is available or an unrelated donor has been identified. It is clear that the results of an allogeneic transplantation are far superior if it is performed in remission, but there may be occasions, especially if the relapse occurs after a short remission of, say, less than 6 months, when the likelihood of achieving a second remission is so small that the toxicity of reinduction and its effect on a subsequent transplantation may tip the balance in favor of an immediate allogeneic transplantation.
The MRC/ECOG study also evaluated the efficacy of transplantation after relapse.14 Among patients who survived 3 months from transplantation and had not received a prior transplantation in first remission, those who underwent an allogeneic transplantation from a sibling donor had an approximately 25% chance of sustained survival, especially if they had achieved a second complete remission. An analysis of 125 such patients revealed a 5-year survival of 25% for the 42 patients who underwent a sibling allograft, 16% for those undergoing a MUD transplantation, 15% for those undergoing an autograft, and only 4% for those receiving chemotherapy only.
This particular patient should be reinduced. Although her relapse occurred while on maintenance therapy, this is mitigated by her almost 2 years in first remission. A search for an unrelated donor should be instituted as soon as possible, and, if a donor is identified, she should receive a transplant, provided she achieves a complete or partial response to reinduction. If the disease continues to progress on reinduction therapy, the likelihood of a response to any form of transplantation is exceedingly low, less than 5%, and at that point consideration should be given to palliative care.
This patient should be reinduced with a regimen such as hyper-CVAD, or any other institutional regimen for relapse. If she achieves a complete or partial response, she should undergo a transplantation from a matched unrelated donor.
Acknowledgment
We thank Sonia Kamenetsky for assistance in the preparation of the manuscript.
Authorship
Contribution: J.M.R. and A.H.G. wrote the paper.
Conflict-of-interest disclosure: The authors declare no competing financial interests.
Correspondence: Jacob M. Rowe, Department of Hematology and Bone Marrow Transplantation, Rambam Medical Center and Technion, Israel Institute of Technology, Haifa 31096, Israel; e-mail: rowe@jimmy.harvard.edu.