In this issue of Blood, Kataoka et al1 leverage their previously published genetic landscape of adult T-cell leukemia/lymphoma (ATL)2 to define associations between specific genetic alterations and outcome. They identify gene alterations in both high-grade and low-grade disease that correlate with outcome. Future studies are needed to refine and validate this clinicogenetic classification of ATL. Once validated, prospective studies should assess whether patients with poor-risk disease benefit from alternative strategies and whether those with low-risk disease can achieve long-term survival with less intensive treatment.
Human T-cell leukemia virus type-1 infection is endemic within parts of Japan, the Caribbean, Latin America, Africa, Iran, and Romania. Among infected persons, the lifetime risk of developing ATL is approximately 5%. Laboratory and imaging findings classify patients with ATL into 4 subtypes: acute, lymphoma, chronic, and smoldering.3 The chronic subtype can be further divided into favorable and unfavorable groups on the basis of elevated serum lactate dehydrogenase, elevated blood urea nitrogen, and low serum albumin levels.4
The acute, lymphoma, and unfavorable chronic subtypes are considered aggressive disease, whereas smoldering and favorable chronic subtypes are considered indolent.3 This distinction is pivotal for selecting therapy. Patients with aggressive disease typically receive highly intensive chemotherapy with or without allogeneic stem cell transplantation. The anti-CCR4 antibody mogamulizumab is also widely used in countries where it is available. Overall outcomes, even with this intensive approach, remain poor. In stark contrast, watchful waiting or combined interferon-α and zidovudine therapy is standard treatment for indolent disease.4
Kataoka and colleagues recently published a landmark description of ATL genetics.2 Like those of other T-cell malignancies, ATL exomes are dominated by recurrent alterations in factors involved in T-cell receptor and cytokine signaling, immune escape, and transcriptional regulation.2,5-7 In the Kataoka study in this issue, the authors used Cox proportional hazards regression modeling to identify statistically robust associations between genetic alterations and survival in patients with ATL. There were several notable findings. First, almost all single-nucleotide and insertion/deletion mutations recurrently identified in ATL are subclonal. Thus, even though some genetic alterations can be targeted with available agents (eg, mutations in CSNK1A1 with immunomodulatory imide drugs8), those agents may only be active against subclones.
The second notable finding comes as no surprise: mutations in aggressive disease differ from those in indolent disease. In particular, mutations of IRF4 (also known as MUM1), which encodes a central transcription factor in both B- and T-cell maturation, were enriched nearly fivefold in high-risk disease. These mutations are believed to be gain-of-function and were essentially all subclonal; therefore, they may indicate a dependence on IRF4 that has yet to be exploited therapeutically. Strategies for targeting IRF4 with proteolysis-targeting chimeras or other novel approaches are a high priority in this disease.
Finally, Kataoka et al show that gene mutations can add prognostic value to the traditional classification of ATL. They identify age ≥70 years, mutations of PRKCB, and amplifications of chr.9p24 (including CD274, which encodes PD-L1) as independent risk factors for poor outcome among patients with aggressive disease. These factors were particularly impactful in patients with moderate-risk disease (according to the Japan Clinical Oncology Group prognostic index) because none of the 12 patients with 2 or more risk factors achieved long-term survival.
Among patients with more indolent disease, those with IRF4 mutations, amplifications of chr.9p24, or deletions of chr.9p21 had a worse prognosis (median overall survival, <2 years) compared with those with none of these abnormalities (median overall survival, not reached). From a treatment standpoint, this genetic information may be most important for patients with unfavorable chronic subtype; among the two-thirds of these patients who lacked any of the genetic risk factors, median overall survival was approximately 5 years compared with approximately 1 year for those with 1 or more genetic risk factors.
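The stratification described above amounts to counting which risk factors a patient harbors and assigning a risk group accordingly. A minimal sketch of that logic is below; the field names, factor lists, and thresholds are illustrative stand-ins, not definitions taken from the study itself.

```python
# Hypothetical sketch: assign a genetic risk group by counting risk factors.
# Factor names and the >=1 / >=2 cutoffs are illustrative assumptions.

AGGRESSIVE_FACTORS = ("age_ge_70", "PRKCB_mutation", "chr9p24_amplification")
INDOLENT_FACTORS = ("IRF4_mutation", "chr9p24_amplification", "chr9p21_deletion")

def count_factors(patient: dict, factors: tuple) -> int:
    """Count how many of the named risk factors are present (truthy)."""
    return sum(bool(patient.get(f, False)) for f in factors)

def indolent_risk_group(patient: dict) -> str:
    """Indolent disease: any one genetic risk factor marks the poor-prognosis group."""
    return "high" if count_factors(patient, INDOLENT_FACTORS) >= 1 else "low"

def aggressive_risk_group(patient: dict) -> str:
    """Aggressive disease: 2 or more risk factors marks the poor-prognosis group."""
    return "high" if count_factors(patient, AGGRESSIVE_FACTORS) >= 2 else "standard"

print(indolent_risk_group({"IRF4_mutation": True}))                      # high
print(aggressive_risk_group({"age_ge_70": True, "PRKCB_mutation": True}))  # high
```

Any real implementation would, of course, need the exact variant definitions and cutoffs from a validated model rather than these placeholders.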
These findings suggest but do not prove that genetic testing can help guide treatment selection among patients with ATL (see figure). However, several outstanding issues must first be addressed. The analysis by Kataoka et al must be validated in an independent cohort, preferably from a region outside Japan. Additional alterations known to occur in ATL but not analyzed by Kataoka et al should also be considered. These include structural variations within the 3′ untranslated region of CD274 that significantly increase transcript levels, rearrangements involving CD28, and high levels of soluble interleukin-2 receptor.9 From a statistical standpoint, it remains to be determined whether the approach of Kataoka et al or a risk score that sums predictor values weighted by Lasso coefficients for each clinical or genetic factor10 would be preferable.
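The alternative mentioned above, a Lasso-weighted risk score, reduces to a weighted sum of each patient's predictor values. The sketch below illustrates the arithmetic; the coefficient values are invented placeholders, whereas in practice they would come from fitting a penalized (eg, Lasso-regularized Cox) model to real data.

```python
# Illustrative Lasso-style risk score: sum of predictor values weighted by
# model coefficients. All coefficient values here are hypothetical.
coefficients = {
    "age_ge_70": 0.8,
    "PRKCB_mutation": 0.6,
    "chr9p24_amplification": 0.7,
}

def risk_score(patient: dict) -> float:
    """Weighted sum of binary predictors (1 = present, 0 = absent).
    Predictors with a zero Lasso coefficient simply drop out of the sum."""
    return sum(w * patient.get(feature, 0) for feature, w in coefficients.items())

print(risk_score({"age_ge_70": 1, "chr9p24_amplification": 1}))  # 1.5
print(risk_score({}))                                            # 0
```

One appeal of this formulation is that Lasso shrinks uninformative coefficients to exactly zero, so the final score uses only the predictors that carry independent prognostic information.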
Once all of that is done, it remains to be seen whether outcomes for patients with poor-risk disease can be improved with new therapeutic strategies and whether de-intensification of therapy for patients with good-risk disease (especially those with the unfavorable chronic subtype) will compromise their good outcomes. Inhibitors of both PD-1/PD-L1 and CTLA4 are likely to have activity in patients with ATL who harbor specific genetic lesions, but the appropriate timing for these agents may also depend on each patient's risk with current therapies. In other words, we now seem to have multiple agents that can target vulnerabilities in ATL. If we do the right trials and the right analyses of patients on those trials over the next decade, patients with ATL are likely to benefit forever after.
Conflict-of-interest disclosure: The authors declare no competing financial interests.