In this issue of Blood Global Hematology, Pfister et al1 present critical real-world evidence from the Brazilian Registry of Chronic Lymphocytic Leukemia (BRCLL) on the efficacy, safety, and patterns of ibrutinib use in Latin America, highlighting both the therapeutic promise and the practical challenges in a resource-variable setting.2-4 The advent of Bruton tyrosine kinase inhibitors (BTKis) has markedly transformed the therapeutic landscape of chronic lymphocytic leukemia (CLL).5 Although randomized controlled trials have demonstrated impressive progression-free and overall survival (OS) benefits with BTKis,6-8 especially in patients with high-risk genetic features, the applicability of these results to broader, more heterogeneous populations remains an area of ongoing investigation.
This retrospective cohort included 310 patients with CLL treated with ibrutinib across 32 Brazilian centers, offering one of the most comprehensive evaluations of ibrutinib outcomes in a nontrial population from Latin America. The data underscore a favorable efficacy profile of ibrutinib across all lines of therapy, with a 4-year OS of 68% and time-to-next treatment (TTNT) notably longer when ibrutinib was used in earlier lines (TTNT at 4 years: 72% for first/second line vs 48% for later lines; P < .0001). These results align with international real-world series, supporting the robustness of ibrutinib efficacy even beyond the clinical trial setting.2-4
Nevertheless, the study reveals key concerns about drug tolerability and treatment sustainability. More than half of the patients discontinued therapy, with progression (33%), toxicity (32%), and death while on treatment (26%) being the leading causes. Adverse events (AEs) occurred in 51% of patients, more frequently in those receiving ibrutinib in later lines (59% vs 42%; P = .02). Infections, bleeding, hematologic toxicities, and cardiac complications were prominent, consistent with the known safety profile of ibrutinib. These findings not only mirror those from European and North American cohorts2-4 but also emphasize the need for vigilant AE monitoring, particularly in older patients or those with comorbidities. Interestingly, atrial fibrillation and hypertension, common cardiovascular AEs in previous ibrutinib trials,9 appeared less frequently in this cohort (5% and 4.2%, respectively). Whether this discrepancy reflects underreporting, variations in comorbidity profiles, or differences in clinical surveillance remains unclear. However, the relatively high rate of deaths due to infections, including COVID-19, reinforces the importance of infection prophylaxis and vaccination strategies during BTKi therapy.
Another observation concerns the impact of molecular features on outcomes. Despite incomplete molecular profiling, the study confirms that patients with TP53 disruption or del(17p) experienced significantly inferior TTNT (43% at 4 years vs 68% in TP53 wild-type; P = .002), consistent with their historically poor prognosis. However, the fact that nearly one-quarter of patients with these adverse features received ibrutinib in the first line indicates appropriate risk-adapted therapy in many cases. That said, the limited availability of TP53/IGHV (immunoglobulin heavy-chain variable) status across the cohort (only 62% and 33%, respectively) underscores the urgent need to improve access to molecular diagnostics in real-world practice, particularly in emerging health systems.
A further strength of this study lies in its reflection of true clinical heterogeneity. Nearly three-quarters of patients had comorbidities at ibrutinib initiation, including cardiovascular disease and diabetes. These factors likely contributed to higher AE rates and treatment discontinuations. Nonetheless, the study suggests that careful patient selection, AE management, and individualized dose modifications may mitigate toxicity without compromising efficacy, echoing findings from other observational cohorts that support flexible-dosing strategies.
In light of these findings, several considerations emerge for optimizing CLL management in real-world settings. First, the superior outcomes with first- or second-line use reinforce current guidelines that prioritize early BTKi use in high-risk patients, especially those with TP53 disruption. Where feasible, earlier intervention may translate into longer treatment durations and improved survival. Second, the high rate of discontinuation due to toxicity highlights the need for proactive monitoring and management of AEs, including early identification and mitigation strategies. In this sense, educational efforts targeting health care providers may be key. Third, incomplete molecular characterization limits optimal treatment selection. Expanding access to fluorescence in situ hybridization and next-generation sequencing panels for del(17p), TP53, and IGHV mutational status is essential for risk stratification and therapeutic planning. Indeed, the cost-benefit profile of these tests is far more favorable than that of the therapies themselves. Last, real-world challenges, such as drug access, comorbidity burden, and infectious complications (notably COVID-19), must be considered when interpreting outcomes and planning treatment strategies in resource-limited environments.
In conclusion, the BRCLL study provides robust real-world validation of the efficacy of ibrutinib in a large cohort of Latin American patients with CLL. The findings reinforce its position as a standard-of-care therapy while also underscoring the need for improved strategies in managing treatment-related toxicities, enhancing diagnostic capabilities, and addressing disparities in health care delivery. Importantly, this study highlights the contributions of its investigators and underscores the critical role of regionally conducted clinical research in advancing the global management of CLL.
Conflict-of-interest disclosure: The authors declare no competing financial interests.