Although clinical studies with carbon ions have been conducted successfully in Japan and Europe, the limited radiobiological information about charged particles that are heavier than protons remains a significant impediment to exploiting the full potential of particle therapy. There is growing interest in the U.S. to build a cancer treatment facility that utilizes charged particles heavier than protons. Therefore, it is essential that additional radiobiological knowledge be obtained using state-of-the-art technologies and biological models and end points relevant to clinical outcome. Currently, most such ion radiotherapy-related research is being conducted outside the U.S. This article addresses the substantial contributions to that research that are possible at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), which is the only facility in the U.S. at this time where heavy-ion radiobiology research with the ion species and energies of interest for therapy can be done. Here, we briefly discuss the relevant facilities at NSRL and how selected charged particle biology research gaps could be addressed using those facilities.
Diethylenetriaminepentaacetic acid (DTPA) remains the only known chelating drug that can be used for decorporation of internalized plutonium (Pu) and americium (Am). It is generally assumed that chelation occurs only in biological fluids, thus preventing Pu/Am deposition in target tissues. We postulate that actinide chelation may also occur inside cells by a mechanism called “intracellular chelation”. To test this hypothesis, rats were given DTPA either prior to (termed “prophylactic” treatment) or at delayed times after (termed “delayed” treatment) Pu/Am injection. DTPA decorporation efficacy was systematically tested for both plutonium and americium. Both prophylactic and delayed DTPA elicited marked decreases in liver Pu/Am. These results can be explained by chelation within subcellular compartments, where DTPA efficacy increased as a function of a favorable intracellular DTPA-to-actinide molar ratio. The efficacy of intracellular chelation of liver actinides decreased with the delay of treatment, which is probably explained by progressive actinide binding to the high-affinity ligand ferritin followed by migration to lysosomes. Intracellular chelation was also reduced as the gap between prophylactic treatment and contamination increased; this may be explained by the reduction of the intracellular DTPA pool, which declined exponentially with time. Skeletal Pu/Am was also reduced by prophylactic and delayed DTPA treatments. This decorporation of bone actinides may mainly result from extracellular chelation on bone surfaces. This work provides converging evidence for the involvement of an intracellular component of DTPA action in the decorporation process. These results may help to improve the interpretation of biological data from DTPA-treated contamination cases and could be useful for modeling DTPA therapy regimens.
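As a minimal, hypothetical sketch of the kinetics implied by this result (the symbols C_0, k and A_actinide are illustrative placeholders, not values reported in the study), the intracellular DTPA pool and the resulting DTPA-to-actinide molar ratio at a prophylactic-treatment-to-contamination interval t could be written as

    C_{\mathrm{DTPA}}(t) = C_0\, e^{-kt}, \qquad R(t) = \frac{C_{\mathrm{DTPA}}(t)}{A_{\mathrm{actinide}}}

so that a longer interval t lowers R(t) and, with it, the expected intracellular chelation efficacy, consistent with the dependence on a favorable molar ratio described above.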
As hematopoietic stem and progenitor cells (HSPCs) self-renew throughout life, accumulation of genomic alterations can potentially give rise to radiation carcinogenesis. In this study we examined DNA double-strand break (DSB) induction and repair as well as mutagenic effects of ionizing radiation in CD34+ cells and T lymphocytes from the umbilical cord of newborns. The age dependence of DNA damage repair end points was investigated by comparing newborn T lymphocytes with adult peripheral blood T lymphocytes. As umbilical cord blood (UCB) contains T lymphocytes that are practically all phenotypically immature, we examined the radiation response of separated naive (CD45RA+) and memory (CD45RO+) T lymphocytes. The number of DNA DSBs was assessed by microscopic scoring of γ-H2AX/53BP1 foci 0.5 h after low-dose radiation exposure, while DNA repair was studied by scoring the number of residual γ-H2AX/53BP1 foci 24 h after exposure. Mutagenic effects were studied by the cytokinesis-block micronucleus (CBMN) assay. No significant differences in the number of DNA DSBs induced by low-dose (100–200 mGy) radiation were observed among the three different cell types. However, residual γ-H2AX/53BP1 foci levels 24 h postirradiation were significantly lower in CD34+ cells compared to newborn T lymphocytes, while newborn T lymphocytes showed significantly higher foci yields than adult T lymphocytes. No significant differences in the level of radiation-induced micronuclei at 2 Gy were observed between CD34+ cells and newborn T lymphocytes. However, newborn T lymphocytes showed a significantly higher number of micronuclei compared to adult T lymphocytes. These results confirm that CD34+ cell quiescence promotes mutagenesis after exposure. Furthermore, we can conclude that newborn peripheral T lymphocytes are significantly more radiosensitive than adult peripheral T lymphocytes. Using the results from the comparative study of radiation-induced DNA damage repair end points in naive (CD45RA+) and memory (CD45RO+) T lymphocytes, we could demonstrate that the observed differences between newborn and adult T lymphocytes can be explained by the immunophenotypic change of T lymphocytes with age, which is presumably linked with the remodeling of the closed chromatin structure of naive T lymphocytes.
Terez Shea-Donohue, Alessio Fasano, Aiping Zhao, Luigi Notari, Shu Yan, Rex Sun, Jennifer A. Bohl, Neemesh Desai, Greg Tudor, Motoko Morimoto, Catherine Booth, Alexander Bennett, Ann M. Farese, Thomas J. MacVittie
In this study, nonhuman primates (NHPs) exposed to lethal doses of total body irradiation (TBI) within the gastrointestinal (GI) acute radiation syndrome range, sparing ∼5% of bone marrow (TBI-BM5), were used to evaluate the mechanisms involved in development of the chronic GI syndrome. TBI increased mucosal permeability in the jejunum (12–14 Gy) and proximal colon (13–14 Gy). TBI-BM5 also impaired mucosal barrier function at doses ranging from 10 to 12.5 Gy in both the small intestine and colon. Timed necropsies of NHPs at 6–180 days after 10 Gy TBI-BM5 showed that changes in the small intestine preceded those in the colon. Chronic GI syndrome in NHPs is characterized by continued weight loss and intermittent GI symptoms. There was a long-lasting decrease in jejunal glucose absorption coincident with reduced expression of the sodium-linked glucose transporter. The small intestine and colon showed a modest upregulation of several different pro-inflammatory mediators such as NOS-2. The persistent inflammation in the post-TBI-BM5 period was associated with a long-lasting impairment of mucosal restitution and reduced intestinal and serum levels of alkaline phosphatase (ALP). Mucosal healing in the postirradiation period is dependent on sparing of stem cell crypts and maturation of crypt cells into appropriate phenotypes. At 30 days after 10 Gy TBI-BM5, there was a significant downregulation in the gene and protein expression of the stem cell marker Lgr5 but no change in the gene expression of enterocyte or enteroendocrine lineage markers. These data indicate that even a threshold dose of 10 Gy TBI-BM5 induces a persistent impairment of both mucosal barrier function and restitution in the GI tract and that ALP may serve as a biomarker for these events. These findings have important therapeutic implications for the design of medical countermeasures.
Acute radiation-induced symptoms reported in survivors after the atomic bombings of Hiroshima and Nagasaki have been suspected to be associated with rain that fell after the explosions, but this association has not been evaluated in an epidemiological study that considers the effects of the direct dose from the atomic bombs and other factors. The aim of this study was to evaluate this association using information from a fixed cohort comprising 93,741 members of the Life Span Study who were in the city at the time of the bombing. Information on acute symptoms and exposure to rain was collected in surveys conducted by interviewers, primarily in the 1950s. The proportion of survivors developing severe epilation was around 60% at direct radiation doses of 3 Gy or higher and less than 0.2% at doses <0.005 Gy, regardless of reported rain exposure status. The low prevalence of acute symptoms at low direct doses indicates that the reported fallout rain was not homogeneously radioactive at a level sufficient to cause a substantial probability of acute symptoms. We observed that the proportion of reported acute symptoms was slightly higher among those who reported rain exposure in some subgroups; however, suggestions that rain was the cause of these reported symptoms are not supported by analyses specific to the known areas of radioactive fallout. Misclassification of exposure and outcome, including symptoms due to other causes and recall bias, appears to be a more plausible explanation. However, the limited and retrospective nature of the available data restricted our ability to quantify the contribution of those possible causes.
Zhang Zhang, Michelle Wodzak, Olivier Belzile, Heling Zhou, Brock Sishc, Hao Yan, Strahinja Stojadinovic, Ralph P. Mason, Rolf A. Brekken, Rajiv Chopra, Michael D. Story, Robert Timmerman, Debabrata Saha
Stereotactic body radiation therapy (SBRT) has found an important role in the treatment of patients with non-small cell lung cancer, demonstrating improvements in dose distribution and even tumor cure rates, particularly for early-stage disease. Despite its emerging clinical efficacy, SBRT has primarily evolved due to advances in medical imaging and more accurate dose delivery, leaving a void in knowledge of the fundamental biological mechanisms underlying its activity. Thus, there is a critical need for the development of orthotopic animal models to further probe the biology associated with the high-dose-per-fraction treatment typical of SBRT. We report here on an improved surgically based methodology for generating solitary intrapulmonary nodule tumors, which can be treated with simulated SBRT using the X-RAD 225Cx small animal irradiator and the Small Animal RadioTherapy (SmART) Plan treatment system. Over 90% of rats developed solitary tumors in the right lung. Furthermore, the tumor response to radiation was monitored noninvasively via bioluminescence imaging (BLI), and complete ablation of tumor growth was achieved with 36 Gy (3 fractions of 12 Gy each). We report a reproducible, orthotopic, clinically relevant lung tumor model that better mimics patient treatment regimens. This system can be utilized to further explore the underlying biological mechanisms relevant to SBRT and high-dose-per-fraction radiation exposure and to provide a useful model to explore the efficacy of radiation modifiers in the treatment of non-small cell lung cancer.
The in vivo mouse transgenic pKZ1 chromosomal inversion assay is a sensitive assay that responds to very low doses of DNA-damaging agents. pKZ1 inversions are measured as the frequency of cells expressing E. coli β-galactosidase protein, which can only be produced from an inverted pKZ1 transgene. In previous studies we reported that a single whole-body low dose of 0.01 mGy X rays alone caused an increase in pKZ1 chromosomal inversions in spleen when analyzed 3 days postirradiation, and yet this same dose could protect from high-dose-induced inversions when delivered as a conditioning dose 4 h before or after a 1 Gy challenge dose. In an attempt to explain these results, we performed temporal studies over a wide radiation dose range to determine if the inversion response was temporally different at different doses. pKZ1 mice were irradiated with a single whole-body X-ray dose of 0.01 mGy, 1 mGy or 1 Gy, and spleen sections were then analyzed for pKZ1 inversions at 7 h, 1 day or 7 days after exposure. No change in inversion frequency was observed at the 7 h time point at any dose. At day 1, an increase in inversions was observed in response to the 0.01 mGy dose, whereas a decrease in inversions below sham-treated frequency was observed for the 1 mGy dose. Inversion frequency for both doses returned to sham-treated inversion frequency by day 7. To our knowledge, this is the first reported study to examine the temporal nature of a radiation response spanning a wide dose range, including doses relevant to occupational exposure, and the results are dynamic and dose specific. The results suggest that inversions induced after low-dose irradiation are removed by homeostatic mechanisms within a short time frame, and underscore the importance of studying responses over a period of time when interpreting radiation effects.
Down syndrome (DS) is a genetic disorder caused by the presence of an extra partial or whole copy of chromosome 21. In addition to musculoskeletal and neurodevelopmental abnormalities, children with DS exhibit various hematologic disorders and have an increased risk of developing acute lymphoblastic leukemia and acute megakaryocytic leukemia. Using the Ts65Dn mouse model, we investigated bone marrow defects caused by trisomy for 132 orthologs of the genes on human chromosome 21. The results showed that, although the total bone marrow cellularity as well as the frequency of hematopoietic progenitor cells (HPCs) was comparable between Ts65Dn mice and their age-matched euploid wild-type (WT) control littermates, human chromosome 21 trisomy led to a significant reduction in hematopoietic stem cell (HSC) numbers and clonogenic function in Ts65Dn mice. We also found that spontaneous DNA double-strand breaks (DSBs) were significantly increased in HSCs from the Ts65Dn mice, which was correlated with the significant reduction in HSC clonogenic activity compared to those from WT controls. Moreover, analysis of the repair kinetics of radiation-induced DSBs revealed that HSCs from Ts65Dn mice were less proficient in DSB repair than the cells from WT controls. This deficiency was associated with a higher sensitivity of Ts65Dn HSCs to radiation-induced suppression of HSC clonogenic activity than that of euploid HSCs. These findings suggest that an additional copy of genes on human chromosome 21 may selectively impair the ability of HSCs to repair DSBs, which may contribute to DS-associated hematological abnormalities and malignancies.
In an earlier study using mass spectrometry and bioinformatic analysis, we reported that the Rac1 protein might be a key mitochondrial target in the radiosensitization process of nasopharyngeal carcinoma CNE-1 cells. The goal of our current study was to reveal the relationship between the Rac1/NADPH pathway and radiosensitization in CNE-1 cells using the Rac1 activator phorbol 12-myristate 13-acetate (PMA) and the Rac1 inhibitor NSC23766. Rac1-GTP expression was determined using a pulldown assay, Rac1 localization by immunofluorescence with a laser scanning confocal microscope, NADPH oxidase activity with the NBT assay and reactive oxygen species with the DCFH-DA probe. The apoptosis rate was analyzed by flow cytometry, and the expression of p67phox and NFκB-p105/p50 was analyzed by Western blot. After treatment with PMA and 2 Gy irradiation (compared to the control), Rac1-GTP was activated and translocated to the cell membrane. NADPH oxidase activity, intracellular reactive oxygen species concentration and the apoptosis rate increased significantly, and the expression of p67phox and NFκB-p50 protein also increased. However, in cells treated with NSC23766 alone or NSC23766 combined with 2 Gy irradiation, the results were the opposite. Overall, these results indicate that the Rac1 protein may be the key target involved in the radiosensitization of nasopharyngeal carcinoma cells. The activated Rac1/NADPH pathway, combined with radiation, can increase the radiosensitivity of nasopharyngeal carcinoma cells, and the Rac1/NADPH pathway may be the signaling pathway involved in the radiosensitization process.
Microgravity and radiation are stressors unique to the spaceflight environment that can have an impact on the central nervous system (CNS). These stressors could potentially lead to significant health risks to astronauts, both acutely during the course of a mission and chronically, leading to long-term, post-mission decrements in quality of life. The CNS is sensitive to oxidative injury due to high concentrations of oxidizable, unsaturated lipids and low levels of antioxidant defenses. The purpose of this study was to evaluate oxidative damage in the brain cortex and hippocampus in a ground-based model for spaceflight, which includes prolonged unloading and low-dose radiation. Whole-body low-dose/low-dose-rate (LDR) gamma radiation using 57Co plates (0.04 Gy at 0.01 cGy/h) was delivered to 6-month-old, mature, female C57BL/6 mice (n = 4–6/group) to simulate the radiation component. Anti-orthostatic tail suspension was used to model the unloading, fluid shift and physiological stress aspects of the microgravity component. Mice were hindlimb suspended and/or irradiated for 21 days. Brains were isolated 7 days or 9 months after irradiation and hindlimb unloading (HLU) for characterization of oxidative stress markers and microvessel changes. The level of 4-hydroxynonenal (4-HNE) protein, an oxidation-specific marker of lipid peroxidation, was significantly elevated in the cortex and hippocampus after combined LDR and HLU exposure compared to controls (P < 0.05). The combination group also had the highest level of nicotinamide adenine dinucleotide phosphate oxidase 2 (NOX2) expression compared to controls (P < 0.05). There was a significant decrease in superoxide dismutase (SOD) expression in the animals that received HLU only or combined LDR and HLU compared to controls (P < 0.05). In addition, 9 months after LDR and HLU exposure, microvessel densities were lowest in the combination group compared to age-matched controls in the cortex (P < 0.05). Our data provide the first evidence that prolonged exposure to simulated microgravity and LDR radiation is associated with increased oxidative stress biomarkers and reduced antioxidant defenses that may increase the likelihood of brain injury. NOX2-containing nicotinamide adenine dinucleotide phosphate oxidase (NADPH oxidase) may contribute to spaceflight environment-induced oxidative stress.
Telomeres consist of GC-rich DNA repeats and the “shelterin” protein complex that together protect chromosome ends from fusion and degradation. Telomeres shorten with age due to incomplete end replication and upon exposure to environmental and intrinsic stressors. Exposure to ionizing radiation is known to modulate telomere length. However, the response of telomere length in humans chronically exposed to radiation is poorly understood. Here, we studied relative telomere length (RTL) by IQ-FISH on leukocyte nuclei in a group of 100 workers from the plutonium production facility at the Mayak Production Association (PA), who were chronically exposed to alpha-emitting (239Pu) radiation and/or gamma (photon) radiation, and in 51 local residents serving as controls, with a similar mean age of about 80 years. We applied generalized linear statistical models adjusted for age at biosampling and for the second exposure type on a linear scale, and observed an age-dependent telomere length reduction. In the individuals with the lowest exposures, a significant reduction in RTL of about 20% was observed, both for external gamma radiation (≤1 Gy) and internal alpha radiation (≤0.05–0.1 Gy to the red bone marrow). In highly exposed individuals (>0.1 Gy alpha, 1–1.5 Gy gamma), RTL was similar to that of controls. Stratification by gender revealed a significant (∼30%) telomere reduction in low-dose-exposed males, which was absent in females. While the gender differences in RTL may reflect different working conditions, lifestyle and/or telomere biology, the absence of a dose response in the highly exposed individuals may reflect selection against cells with short telomeres or induction of telomere-protective effects. Our observations suggest that chronic systemic exposure to radiation leads to variable dose-dependent effects on telomere length.
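The generalized linear modeling described above can be sketched roughly as follows. This is a hypothetical illustration, not the authors' code: the file name and column names (rtl, age, alpha_dose_gy, gamma_dose_gy) are assumptions about how such per-participant data might be laid out.

    # Hypothetical sketch: Gaussian GLM (identity link) for relative telomere
    # length (RTL) as a linear function of age at biosampling and both dose types.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed layout: one row per participant with RTL, age and the two doses.
    df = pd.read_csv("mayak_rtl.csv")  # columns: rtl, age, alpha_dose_gy, gamma_dose_gy

    model = smf.glm("rtl ~ age + alpha_dose_gy + gamma_dose_gy", data=df).fit()
    print(model.summary())

A stratified analysis (for example, by gender or by low- versus high-exposure groups, as in the abstract) could be approximated by fitting the same formula to the corresponding subsets of the data frame.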
Resistance to radiation is considered to be an important reason for local failure after radiotherapy and tumor recurrence. However, the exact mechanisms of tumor resistance remain poorly understood. Current investigations of microRNAs as potential diagnostic and therapeutic tools for cancer treatment have shown promising results. With respect to radiotherapy resistance and response, there is now emerging evidence that microRNAs modulate key cellular pathways that mediate response to radiation. These data suggest that microRNAs might have significant potential as targets for the development of new therapeutic strategies to overcome radioresistance in cancer. This review summarizes the current literature pertinent to the influence of microRNAs in the response to radiotherapy for cancer treatment, with an emphasis on microRNAs as novel diagnostic and prognostic markers, as well as their potential to alter radiosensitivity.