Allison Gibbs, Pawan Gupta, Buddha Mali, Yannick Poirier, Mathangi Gopalakrishnan, Diana Newman, Andrew Zodda, Julian D. Down, Artur A. Serebrenik, Michael D. Kaytor, Isabel L. Jackson
The objective of the current study was to establish a mouse model of acute radiation syndrome (ARS) after total-body irradiation with 2.5% bone marrow sparing (TBI/BM2.5) that progressed to the delayed effects of acute radiation exposure, specifically pneumonitis and/or pulmonary fibrosis (DEARE-lung), in animals surviving longer than 60 days. Two hundred age- and sex-matched C57L/J mice were assigned to one of six arms to receive a dose of 9.5 to 13.25 Gy of 320 kV X-ray TBI/BM2.5. A sham-irradiated cohort was included as an age- and sex-matched control. Blood was sampled from the facial vein prior to irradiation and on days 5, 10, 15, 20, 25, and 30 postirradiation for hematology. Respiratory function was monitored at regular intervals throughout the in-life phase. Animals with respiratory dysfunction were administered a single 12-day tapered regimen of dexamethasone, allometrically scaled from a similar regimen in the non-human primate. All animals were monitored daily for up to 224 days postirradiation for signs of organ dysfunction and morbidity/mortality. At euthanasia, whether due to criteria or at the study endpoint, wet lung weights were recorded and blood was sampled for hematology and serum chemistry. The left lung, heart, spleen, small and large intestine, and kidneys were processed for histopathology. A dose-response curve with the estimated lethal dose for 10–99% of animals with 95% confidence intervals was established. The median survival time was significantly prolonged in males compared to females across the 10.25 to 12.5 Gy dose range. Animal sex played a significant role in overall survival, with males 50% less likely to expire prior to the study endpoint compared to females. All animals developed pancytopenia within the first one to two weeks after TBI/BM2.5, followed by a progressive recovery through day 30. Fourteen percent of animals expired during the first 30 days postirradiation due to ARS (e.g., myelosuppression, gastrointestinal tissue abnormalities), with most deaths occurring prior to day 15. Microscopic findings showed the presence of radiation pneumonitis as early as day 57. At time points later than day 70, pneumonitis was consistently present in the lungs of mice and its severity was comparable across radiation dose arms. Pulmonary fibrosis was first noted at day 64 but was not consistently present and stable in severity until after day 70. Fibrosis was comparable across radiation dose arms. In conclusion, this study established a multiple organ injury mouse model that progresses through the ARS phase to DEARE-lung, characterized by respiratory dysfunction and microscopic abnormalities consistent with radiation pneumonitis/fibrosis. The model provides a platform for future development of medical countermeasures for approval and licensure by the U.S. Food and Drug Administration under the Animal Rule regulatory pathway.
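As an illustration of how such a lethality dose-response curve can be derived, the sketch below fits a probit model to per-arm mortality counts and recovers LD10 through LD99 values with approximate 95% confidence intervals via the delta method. The doses, group sizes, death counts, and probit-on-log-dose form are assumptions for illustration, not the study's data or its exact statistical method.

```python
# Hedged sketch: probit dose-response fit to per-arm mortality, with delta-method CIs.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

doses = np.array([9.5, 10.25, 11.0, 11.75, 12.5, 13.25])   # Gy, placeholder arm doses
n_per_arm = np.full(doses.size, 33)                         # ~200 mice across 6 arms
deaths = np.array([3, 8, 15, 22, 28, 31])                   # hypothetical lethality counts

X = sm.add_constant(np.log10(doses))
y = np.column_stack([deaths, n_per_arm - deaths])           # (successes, failures)
fit = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Probit())).fit()
b0, b1 = fit.params
cov = fit.cov_params()

for p in (0.10, 0.50, 0.90, 0.99):
    z = norm.ppf(p)
    log_ld = (z - b0) / b1                                   # log10 of the LDp
    grad = np.array([-1.0 / b1, -(z - b0) / b1**2])          # delta-method gradient
    se = np.sqrt(grad @ cov @ grad)
    lo, hi = log_ld - 1.96 * se, log_ld + 1.96 * se
    print(f"LD{int(p * 100)}: {10**log_ld:.2f} Gy (95% CI {10**lo:.2f}-{10**hi:.2f})")
```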
Cardiotoxicity is a well-recognized, serious adverse effect of thoracic radiation therapy. This study aimed to evaluate longitudinal electrocardiogram (ECG) changes in patients receiving thoracic radiation therapy and to identify correlating factors that can predict the risk of cardiotoxicity. This retrospective study included 202 patients treated with thoracic radiation therapy; chemotherapy and targeted therapy were allowed. Mean heart dose (MHD) was evaluated on dose-volume histograms. ECG, high-sensitivity cardiac troponin T (hs-cTnT), and N-terminal B-type natriuretic peptide (NT-proBNP) analyses were conducted before irradiation and during the follow-up period of 6–12 months (average 8 months). Chi-square tests and logistic regression analysis were applied to identify risk factors associated with ECG changes. At a median time of 3 months postirradiation, 46.5% of patients showed ECG changes, and 33.0% of patients returned to baseline ECG levels during the follow-up period at a median of 5 months postirradiation. Logistic regression analysis identified MHD, hs-cTnT, and NT-proBNP as significant factors associated with ECG changes (P < 0.05). Hs-cTnT and NT-proBNP increased significantly after radiation therapy compared with baseline levels (P < 0.05), and these increases were observed at a median time of 2 months postirradiation, earlier than the ECG changes. Higher MHD and elevated hs-cTnT and NT-proBNP levels correlated with an increased risk of ECG changes in patients receiving thoracic radiation therapy. Early identification of patients at high risk of cardiotoxicity and timely intervention might reduce the incidence of radiation-induced cardiac toxicity.
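A minimal sketch of the kind of logistic regression described above, assuming a per-patient table with a binary ECG-change indicator and predictors for MHD, hs-cTnT, and NT-proBNP; the file name and column names are hypothetical, not from the paper.

```python
# Hedged sketch: logistic regression of ECG change on MHD and cardiac biomarkers.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("thoracic_rt_cohort.csv")   # hypothetical per-patient table

model = smf.logit("ecg_change ~ mhd_gy + hs_ctnt + nt_probnp", data=df).fit()
print(model.summary())

# Odds ratios and 95% CIs (exponentiated coefficients)
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.round(2))
```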
Radiation-induced heart injury (RIHI) limits the dose delivery of radiotherapy for thoracic cancer. Shenmai injection (SMI) is reported to have potential cytoprotective properties and is commonly used in cardiovascular diseases. We therefore aimed to investigate the potential protective effects of SMI treatment on RIHI. In this study, we established the RIHI model using Sprague-Dawley rats and the H9c2 cell line. In vivo, biochemical assays were used to measure serum cardiac injury-related biomarkers and echocardiography was used to evaluate heart function. Pathological analysis was also performed to observe myocardial structural changes. In vitro, we further measured cell viability and reactive oxygen species (ROS) levels after irradiation with or without SMI treatment. Our data showed that administration of SMI reduced the levels of serum cardiac injury biomarkers and ameliorated cardiac dysfunction after irradiation in rats. Pathological analysis revealed that SMI mitigated cardiac structural damage, fibrosis, and macrophage infiltration. In addition, treatment with SMI increased cell viability and decreased excess ROS production after irradiation in vitro. Taken together, our study demonstrated a protective role of SMI treatment against RIHI through inhibition of oxidative stress and reduction of structural remodeling.
This study explores the likely prevalence of false indications of dose-response nonlinearity in large epidemiologic cancer radiation cohort studies (A-bomb survivors, INWORKS, Techa River). The motivation is twofold: increasing numbers of tests of nonlinearity are being made in such studies, and hypothesized nonlinear dose-response models have been justified to policy makers by analyses that rely in part on isolated findings that could be statistical fluctuations. After removing dose nonlinearity (linearization) by adjusting person-years of observation at each dose category, indications of nonlinearity, necessarily false, were counted in 5,000 randomized replications of six datasets. The average frequency of any false positive among five indicators of nonlinearity tested against a linear null was roughly 25% per study in Monte Carlo simulations, consistent with binomial calculations, increasing to ∼50% across the 6 studies assessed. Comparable frequencies were found using Akaike's information criterion (AIC) for model selection or multi-model averaging. False above-zero threshold doses were found more than 50% of the time, averaging 0.05 Gy, consistent with findings in the 6 studies. Such bias, uncorrected, could distort meta-analyses of multiple studies, because meta-analyses can incorporate high P value findings. AIC-based correction for the extra threshold parameter lowered these false occurrences to 8–19%. Given the simulation rates, the possibility of false positives might be noted when isolated findings of nonlinearity are discussed in a regulatory context. When reporting a threshold dose with a P value > 0.05, it would be informative to note the expected high false prevalence rate due to this bias.
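The sketch below illustrates the general idea of counting false threshold detections in Monte Carlo replications of truly linear data, using AIC to compare a linear excess-relative-risk model against a threshold variant. The dose grid, person-years, baseline rate, and slope are illustrative, and the model is deliberately simplified (known baseline, grid search over thresholds) relative to the analyses in the study.

```python
# Hedged sketch: how often is a threshold model falsely preferred over a truly linear one?
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(1)
doses = np.array([0.005, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])   # Gy, category mean doses
pyr = np.full(doses.size, 1e5)                                   # person-years per category
baseline = 1e-3                                                  # baseline rate per person-year
err_per_gy = 0.5                                                 # true (linear) excess relative risk per Gy

def neg_loglik(beta, d_eff, cases):
    mu = pyr * baseline * (1 + beta * d_eff)
    return -poisson.logpmf(cases, mu).sum()

def profile_nll(d_eff, cases):
    """Minimize the negative log-likelihood over the ERR slope for a given effective dose."""
    res = minimize_scalar(lambda b: neg_loglik(b, d_eff, cases), bounds=(0.0, 10.0), method="bounded")
    return res.fun

n_rep = 1000                                  # the study used 5,000 replications; fewer here for speed
thresholds = np.linspace(0.0, 0.3, 16)        # candidate threshold doses (Gy)
false_pos = 0
for _ in range(n_rep):
    cases = rng.poisson(pyr * baseline * (1 + err_per_gy * doses))   # truly linear data
    nll_lin = profile_nll(doses, cases)                              # 1 free parameter (slope)
    nll_thr = min(profile_nll(np.clip(doses - t, 0.0, None), cases) for t in thresholds)
    aic_lin = 2 * 1 + 2 * nll_lin
    aic_thr = 2 * 2 + 2 * nll_thr                                    # slope + threshold parameter
    false_pos += aic_thr < aic_lin
print(f"Threshold model falsely preferred by AIC in {100 * false_pos / n_rep:.1f}% of replications")
```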
Tritium is found in the environment in three forms: free in water, as a gas, and bound to organic matter. Once internalized in living organisms, it can be found in two forms: tissue free water tritium (TFWT) and organically bound tritium (OBT). This study aims to better understand OBT internalization in living organisms and to show the complementarity between experimental procedures and microdosimetry simulations, which are often used to obtain more information on the energy imparted to cell nuclei. To do so, tritiated thymidine, an organic form of tritium, was chosen, and zebrafish embryos [3.5 h post fertilization (hpf)] were exposed to a range of activity concentrations (2.21 × 10³ to 5.95 × 10⁵ Bq/mL). First, individual zebrafish embryos were sampled after different exposure times (1 to 96 h) to characterize the internalization kinetics. Then, the barrier role of the chorion was assessed after 2 days of exposure. Lastly, individual zebrafish embryos were sampled after 1 and 4 days of exposure to measure internalization in the whole fish and in its DNA, and to highlight a possible link between the internal dose rate and the external activity concentration. Microdosimetry simulations were also performed to quantify the energy that could be imparted to zebrafish cells after exposure to tritium. Results showed that when bound to thymidine, tritium rapidly incorporates into zebrafish early life stages, with internalization being almost complete after 24 h. Results also showed that while the chorion acted as a barrier to prevent thymidine from entering the embryos, significant levels could still be measured in the whole organisms as well as in DNA. This study also highlighted that when the external activity concentration increased, the internal dose rate increased as well, following a sigmoidal trend. Microdosimetry simulations highlighted that the size and shape of the cell matter, and that the smallest cells appear to be at the greatest risk, with only low-energy electrons inducing energy depositions. A linear fit was also found between the mean energy deposited and the logarithm of the radius of the cell, showing that the quantity of deposited energy scales with the logarithm of the cell radius. While this study highlighted important internalization patterns, it will also serve as the starting point for a study focusing on the toxic effects of tritiated thymidine on zebrafish in its early life stages.
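A minimal sketch of the two fits mentioned above: a sigmoidal relation between external activity concentration and internal dose rate, and a linear fit of mean deposited energy against the logarithm of cell radius. All numeric values are placeholders, and the log-logistic form is an assumed parameterization of the reported sigmoidal trend.

```python
# Hedged sketch: sigmoidal and log-linear fits with placeholder data.
import numpy as np
from scipy.optimize import curve_fit

# Internal dose rate vs. external activity concentration (placeholder values)
activity = np.array([2.21e3, 1.0e4, 5.0e4, 1.5e5, 3.0e5, 5.95e5])   # Bq/mL
dose_rate = np.array([0.02, 0.08, 0.35, 0.90, 1.60, 1.90])          # hypothetical internal dose rates

def log_logistic(x, top, ec50, hill):
    """Sigmoid in log(activity): near 0 at low activity, plateau 'top' at high activity."""
    return top / (1.0 + (ec50 / x) ** hill)

(top, ec50, hill), _ = curve_fit(log_logistic, activity, dose_rate, p0=[2.0, 5e4, 1.0])
print(f"plateau={top:.2f}, EC50={ec50:.3g} Bq/mL, Hill slope={hill:.2f}")

# Linear fit of mean deposited energy against log(cell radius), as described above
radius_um = np.array([2.0, 4.0, 6.0, 8.0, 10.0])        # hypothetical cell radii (µm)
mean_energy = np.array([0.8, 1.5, 2.0, 2.3, 2.5])       # hypothetical mean deposited energy (keV)
slope, intercept = np.polyfit(np.log(radius_um), mean_energy, 1)
print(f"E ≈ {slope:.2f}*ln(r) + {intercept:.2f}")
```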
The cytokinesis-block micronucleus (CBMN) assay in cytogenetic biodosimetry uses the micronucleus (MN) frequency scored in binucleated cells (BNCs) to estimate the absorbed dose of ionizing radiation. Despite faster and simpler MN scoring, the CBMN assay is not commonly recommended for radiation mass-casualty triage because human peripheral blood is typically cultured for 72 h. Furthermore, CBMN assay evaluation in triage often relies on high-throughput scoring with expensive and specialized equipment. In this study, we evaluated the feasibility of a low-cost method of manual MN scoring on Giemsa-stained slides in shortened 48 h cultures for triage. Both whole blood and human peripheral blood mononuclear cell cultures were compared for different culture periods and cytochalasin-B (Cyt-B) treatments [48 h (24 h with Cyt-B); 72 h (24 h with Cyt-B); 72 h (44 h with Cyt-B)]. Three donors (26-year-old female, 25-year-old male, 29-year-old male) were used for dose-response curve construction with radiation-induced MN/BNC. Another 3 donors (23-year-old female, 34-year-old male, 51-year-old male) were used for comparison of triage and conventional dose estimation after 0, 2, and 4 Gy X-ray exposure. Our results showed that despite a lower percentage of BNCs in 48 h than in 72 h cultures, sufficient BNCs were obtained for MN scoring. With manual MN scoring, triage dose estimates from 48 h cultures were obtained in 8 min for non-exposed donors and in 20 min for donors exposed to 2 or 4 Gy. One hundred BNCs could be scored for high doses instead of the 200 BNCs used for triage. Furthermore, the observed triage MN distribution could be used preliminarily to differentiate 2 and 4 Gy samples. The number of BNCs scored (triage or conventional) also did not affect dose estimation. Dose estimates from 48 h cultures were mostly within ±0.5 Gy of actual doses, showing the feasibility of manual MN scoring in the shortened CBMN assay for radiological triage applications.
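For context, dose estimation from scored MN frequencies typically inverts a calibration curve; the sketch below assumes the conventional linear-quadratic form Y = C + αD + βD², with illustrative coefficients rather than the curve fitted in this study.

```python
# Hedged sketch: dose estimation from an observed MN yield via an assumed LQ calibration curve.
import numpy as np

C, alpha, beta = 0.012, 0.035, 0.022      # assumed calibration coefficients (MN/BNC, /Gy, /Gy^2)

def estimate_dose(mn, bnc):
    """Invert Y = C + alpha*D + beta*D^2 for the observed MN frequency (MN per BNC)."""
    y = mn / bnc
    disc = alpha**2 - 4 * beta * (C - y)   # positive root of beta*D^2 + alpha*D + (C - y) = 0
    return (-alpha + np.sqrt(disc)) / (2 * beta)

# e.g., 58 MN scored in 100 BNCs during triage scoring
print(f"Estimated dose: {estimate_dose(58, 100):.2f} Gy")
```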
Biological dosimetry is a key technique for retrospective radiation dosimetry that provides individual estimates of the absorbed dose of ionizing radiation, applicable for use in a large-scale radiological/nuclear event. Current techniques for biodosimetry are labor-intensive, time-consuming, and not high-throughput. In this proof-of-concept study, we developed a new approach for detecting irradiated blood based on Raman spectroscopy of blood combined with multivariate analysis. Peripheral blood samples from 8 healthy, anonymous male and female donors were exposed to 5 Gy of X rays or left unirradiated (0 Gy). At 3 h postirradiation, the blood was immediately frozen at –80°C. Raman spectra were measured from thawed blood using a portable spectrometer system. Data were preprocessed and analyzed using principal component analysis (PCA) to observe trends in the data, and partial least squares-discriminant analysis (PLS-DA) was used to build a model to discriminate between Raman spectra of control (0 Gy) and irradiated (5 Gy) blood. We found strong evidence of inter-donor variability in the form of donor-wise clustering of PCA scores corresponding to the control Raman spectra, in addition to poor separation of PLS-DA scores associated with Raman intensities of 0 Gy vs. 5 Gy spectra. However, after adjustment for donor covariates using a linear mixed-effects model, we obtained better separation between control and irradiated blood using PLS-DA. Evaluation of the coefficients of the PLS-DA loading vectors indicated radiation-induced changes in proteins, lipids, and hemoglobin to be major contributors to this discrimination. This pilot study demonstrates the potential of Raman spectroscopy to support biodosimetry of blood and blood components.
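A minimal sketch of the PCA plus PLS-DA workflow described above, using scikit-learn (PLSRegression on a binary class vector serves as the PLS-DA step); the spectra and labels are random placeholders, and the donor-covariate mixed-effects adjustment is omitted.

```python
# Hedged sketch: PCA followed by PLS-DA on preprocessed Raman spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
spectra = rng.normal(size=(16, 600))          # placeholder Raman intensities (samples x wavenumbers)
labels = np.repeat([0, 1], 8)                 # 0 = control (0 Gy), 1 = irradiated (5 Gy)

X = StandardScaler().fit_transform(spectra)

# Unsupervised look at the data: do control spectra cluster by donor or by dose?
pca_scores = PCA(n_components=2).fit_transform(X)

# Supervised discrimination between 0 Gy and 5 Gy spectra
pls = PLSRegression(n_components=2).fit(X, labels)
pls_scores = pls.transform(X)                 # latent-variable scores
loadings = pls.x_loadings_                    # bands contributing to the 0 vs. 5 Gy separation
print(pca_scores.shape, pls_scores.shape, loadings.shape)
```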
Altered cellular responses to DNA damage can contribute to cancer development, progression, and therapeutic resistance. Mutations in key DNA damage response factors occur across many cancer types, and the DNA damage-responsive gene TP53 is mutated in a high percentage of cancers. We recently reported that an alternative splicing pathway induced by DNA damage regulates alternative splicing of TP53 RNA and further modulates cellular stress responses. Through damage-induced inhibition of the SMG1 kinase, TP53 pre-mRNA is alternatively spliced to generate TP53b mRNA, and the p53b protein is required for optimal induction of cellular senescence after ionizing radiation-induced DNA damage. Herein, we confirmed and extended these observations by demonstrating that the ATM protein kinase is required for repression of SMG1 kinase activity after ionizing radiation. We found that the RNA helicase and splicing factor DDX5 interacts with SMG1, is required for alternative splicing of TP53 pre-mRNA to TP53b and TP53c mRNAs after DNA damage, and contributes to radiation-induced cellular senescence. Interestingly, the role of SMG1 in alternative splicing of p53 appears to be distinguishable from its role in regulating nonsense-mediated RNA decay. Thus, ATM, SMG1, and DDX5 participate in a DNA damage-induced alternative splicing pathway that regulates TP53 splicing and modulates radiation-induced cellular senescence.
In vitro studies allow evaluation of normal or cancer cell responses to radiation, either alone or in combination with agents used to modify these biological responses. Ionizing radiation can be produced by a variety of particles and sources, with varying energy spectra, interaction probabilities, linear energy transfer, dose uniformity, dose rates, and delivery methods. Multiple radiation sources have been used to irradiate cells in the published literature. However, the equivalence of response in cell culture models across radiation sources has not been rigorously established. Moreover, current reporting of radiation source parameters lacks consistency and rigor, which may impact the reproducibility of pre-clinical data between laboratories. The choice of radiation source is also of high importance given the growing interest in comparing the effects of photon versus particle radiation on biological responses. Therefore, this study robustly evaluates the cellular response (cell survival, apoptosis, and DNA damage) of three distinct cell lines using four unique photon-generating radiation sources. We hypothesized that there may be subtle differences across the radiation sources without an appreciable difference in cellular response. The four photon energies investigated (662 keV, 100 kVp, 220 kVp, and 6 MV) did produce subtle differences in DNA damage and cell survival when treating the three distinct tumor cell lines. These variations in cellular response emphasize the need to carefully consider irradiation source, energy, and dose rate depending on the study goal and endpoint.
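As one way to compare cell survival across sources, the sketch below fits the linear-quadratic survival model SF = exp(−αD − βD²) per source; the LQ form and the surviving-fraction values are assumptions for illustration and are not stated in the abstract.

```python
# Hedged sketch: per-source linear-quadratic fits to hypothetical clonogenic survival data.
import numpy as np
from scipy.optimize import curve_fit

def lq(dose, alpha, beta):
    """Linear-quadratic surviving fraction."""
    return np.exp(-alpha * dose - beta * dose**2)

doses = np.array([0, 2, 4, 6, 8], dtype=float)          # Gy
survival_by_source = {                                   # hypothetical surviving fractions
    "662 keV": [1.0, 0.55, 0.22, 0.07, 0.020],
    "100 kVp": [1.0, 0.50, 0.18, 0.05, 0.012],
    "220 kVp": [1.0, 0.52, 0.20, 0.06, 0.015],
    "6 MV":    [1.0, 0.57, 0.24, 0.08, 0.025],
}

for source, sf in survival_by_source.items():
    (alpha, beta), _ = curve_fit(lq, doses, np.array(sf), p0=[0.2, 0.02])
    print(f"{source}: alpha={alpha:.3f} /Gy, beta={beta:.4f} /Gy^2, "
          f"alpha/beta={alpha / beta:.1f} Gy")
```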