Acute lymphoblastic leukaemia (ALL) accounts for one-third of paediatric cancer cases in developed societies1. Although treatment of paediatric ALL is highly successful, with cure rates of around 90%2, the longer-term complications of therapy and the impact on quality of life during treatment are substantial. Two-thirds of childhood ALL survivors face severe morbidity for decades after their disease is eradicated, as well as a 20-fold higher mortality rate compared with their healthy, age-matched counterparts3. This largely unrecognized burden of ALL underscores the importance of ongoing research into its aetiology and of pursuing the longer-term goal of primary prevention4,5.
Research over recent decades has unravelled the natural history and clonal evolution of the main genetic subtypes of B cell precursor-ALL (BCP-ALL), which account for the majority of childhood ALL cases and have a peak incidence around the ages of 2–6 years. These observations support a two-stage model for this cancer, originally termed the ‘delayed infection’ hypothesis6,7, which has features in common with the so-called hygiene hypothesis proposed for allergies and type 1 diabetes8. The first stage is manifested by genomic lesions arising in progenitor cells in utero, leading to the development of clinically covert preleukaemic clones with only a modest proliferative advantage9. Backtracking of BCP-ALL cases with neonatal Guthrie cards and umbilical cord blood samples, as well as comparative genomics in monozygotic twins with ALL, has confirmed the prenatal origin of the predominant initiating chromosomal aberrations ETS translocation variant 6 (ETV6)::runt-related transcription factor 1 (RUNX1)10 and high hyperdiploidy11,12. However, as shown in mouse models13 and human umbilical cord blood samples14, these initiating events are not sufficient for leukaemic transformation15,16. ETV6::RUNX1 fusions (in-frame and in lymphocytes) are present in 1–5% of healthy newborn babies, but the overwhelming majority (~99%) will not develop leukaemia, indicating low penetrance of the disease and the need for additional, postnatal mutational events14,17.
The ‘delayed infection’ hypothesis predicted that persistent preleukaemic clones acquire the essential, postnatal, secondary mutations as a result of a dysregulated immune response to common infections or chronic inflammation7. The nature and diversity of these ‘triggering’ infections for ALL remain uncertain, although respiratory viruses are implicated by epidemiological studies18,19. Experimental modelling data have highlighted possible mechanisms via which inflammatory cytokines might both expand preleukaemic clones and trigger the commonly observed secondary genetic changes20. The highly recurrent secondary genetic changes are primarily copy-number alterations (deletions) in genes elicited by off-target immunoglobulin heavy chain V(D)J recombination-activating protein (RAG) activity20. In this context, activation-induced cytidine deaminase (AID), which can be expressed in BCPs following repetitive strong inflammatory signals, has been shown to cooperate with RAG to result in genomic instability that can drive the evolution of pre-leukaemic clones13.
Critically, the delayed infection model also predicted that the infection-driven dysregulated immune response triggering these crucial second hits was contingent upon a deficit of microbial exposure in infancy and a consequent failure of adequate immune network priming or maturation. Epidemiological evidence supports this contention via surrogate measures7. The risk of BCP-ALL is increased by caesarean section (C-section) birth21,22, brief or absent breastfeeding23,24 and a paucity of social contacts during infancy25,26,27. We note that these social risk factors are shared with type 1 diabetes and allergies, raising the possibility of a common underlying immune priming deficit22. More recently, such early-life exposures were shown to have a profound impact on the acquisition and robustness of the neonatal and infant gut microbiome28,29,30,31, which, in turn, is recognized as fundamental to the maturation of the naive immune network of infants32. This led to the suggestion that the key risk factor of microbial underexposure (or ‘delay’) in BCP-ALL resides in the pivotal role of the microbiome, and to the prediction of a deficient gut microbiome in patients who develop BCP-ALL4,7.
Recent longitudinal studies have now revealed that a delayed maturation of the gut microbiome by the age of 12 months is associated with an increased risk of asthma diagnosis by 5 years of age33,34,35. Although the emerging role of the gut microbiome in the pathogenesis of childhood ALL has been discussed in recent review articles5,7,36,37,38,39,40,41, the rarity of the disease has thus far precluded any prospective longitudinal studies. Here, we comprehensively analyse the findings of existing case–control gut microbiome studies of childhood ALL at the time of diagnosis in the context of newly discovered maturation patterns of the gut microbiome. We discuss how early-life exposures associated with an increased risk of childhood ALL can induce gut microbiome instability and perturb its maturation, which in turn can jeopardize the integrity of the immune network. Finally, we propose methods to further delineate the role of the gut microbiome in BCP-ALL pathogenesis in future clinical trials and mouse models in the era of rapidly evolving gut microbiome research.