Carmen I. Rios, David R. Cassatt, Brynn A. Hollingsworth, Merriline M. Satyamitra, Yeabsera S. Tadesse, Lanyn P. Taliaferro, Thomas A. Winters, Andrea L. DiCarlo
As the multi-systemic components of COVID-19 emerge, parallel etiologies can be drawn between SARS-CoV-2 infection and radiation injuries. While some SARS-CoV-2-infected individuals present as asymptomatic, others exhibit mild symptoms that may include fever, cough, chills, and unusual symptoms like loss of taste and smell and reddening in the extremities (e.g., “COVID toes,” suggestive of microvessel damage). Still others alarm healthcare providers with extreme and rapid onset of high-risk indicators of mortality that include acute respiratory distress syndrome (ARDS), multi-organ hypercoagulation, hypoxia and cardiovascular damage. Researchers are quickly refocusing their science to address this enigmatic virus, which seems to unveil itself in new ways without discrimination. As investigators begin to identify early markers of disease, identification of common threads with other pathologies may provide some clues. Interestingly, years of research in the field of radiation biology document the complex multi-organ nature of another disease state that occurs after exposure to high doses of radiation: the acute radiation syndrome (ARS). Inflammation is a key common player in COVID-19 and ARS, and drives the multi-system damage that dramatically alters biological homeostasis. Both conditions initiate a cytokine storm, with similar pro-inflammatory molecules increased and anti-inflammatory molecules decreased. These changes manifest in a variety of ways, with a demonstrably higher health impact in patients with underlying medical conditions. The potentially dramatic human impact of ARS has guided the science that has identified many biomarkers of radiation exposure, established medical management strategies for ARS, and led to the development of medical countermeasures for use in the event of a radiation public health emergency. These efforts can now be leveraged to help elucidate mechanisms of action of COVID-19 injuries.
Furthermore, this intersection between COVID-19 and ARS may point to approaches that could accelerate the discovery of treatments for both.
P. Ostheim, M. Majewski, Z. Gluzman-Poltorak, V. Vainstein, L. A. Basile, A. Lamkowski, S. Schüle, H. L. Kaatsch, M. Haimerl, C. Stroszczynski, M. Port, M. Abend
Radiosensitivity differs in humans and likely among closely related primates. Reasons for variation in radiosensitivity are not well known. We examined preirradiation gene expression in peripheral blood among male and female rhesus macaques that did or did not survive (up to 60 days) after whole-body irradiation with 700 cGy (LD66/60). RNA samples originated from a blinded, randomized Good Laboratory Practice study in 142 irradiated rhesus macaques. Animals were untreated (placebo), or treated with recombinant human IL-12, G-CSF or a combination of the two. We evaluated gene expression in a two-phase study design in which phase I was a whole-genome screen [next-generation sequencing (NGS)] for mRNAs (RNA-seq) using five RNA samples from untreated male and female animals per group of survivors and non-survivors (total n = 20). Differential gene expression (DGE) was defined as a statistically significant and ≥2-fold up- or downregulation of mRNA species and was calculated between groups of survivors and non-survivors (reference) and by gender. Altogether 659 genes were identified, but the number of differentially expressed genes observed in both genders was small (n = 36). Fifty-eight candidate mRNAs were chosen for independent validation in phase II using the remaining samples (n = 122), evaluated with qRT-PCR. Among the 58 candidates, 16 were of significance or borderline significance (t test) by DGE. Univariate and multivariate logistic regression and receiver operating characteristic (ROC) curve analyses further refined and identified the most outstanding validated genes and gene combinations. We identified EPX (P = 0.005, ROC = 1.0) as most predictive of clinical outcome for untreated male macaques, and IGF2BP1 (P = 0.05, ROC = 0.74) and the combination of EPX with SLC22A4 (P = 0.03, ROC = 0.85) as most predictive for the treated and combined (untreated and treated) male macaque groups, respectively.
For the untreated, treated and combined female macaque groups, the same gene (MBOAT4, P = 0.0004, ROC = 0.81) was most predictive. Based on the probability function of the ROC curves, up to 74% of preirradiation RNA measurements predicted survival, with positive and negative predictive values ranging between 85–100% and associated odds ratios reflecting 2–3-fold elevated odds of survival per unit change (cycle threshold value) in gene expression. In conclusion, we identified gender-dependent genes and gene combinations in preirradiation blood samples that predict survival after irradiation in rhesus macaques.
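The ROC values quoted in this abstract can be read as Mann-Whitney probabilities: the AUC is the chance that a randomly chosen survivor's expression score exceeds that of a randomly chosen non-survivor. A minimal sketch follows; this is not the authors' code, and the Ct-derived scores are invented for illustration only.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive (survivor) scores higher than a randomly
    chosen negative (non-survivor); ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical qRT-PCR readouts: lower Ct means higher expression,
# so Ct values are negated so that a larger score = more expression.
survivors     = [-19.8, -20.1, -21.3, -23.0]
non_survivors = [-22.4, -23.9, -24.5, -25.1]
print(roc_auc(survivors, non_survivors))  # prints 0.9375
```

A perfectly separating gene such as EPX in the untreated males would yield AUC = 1.0 under this formulation.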
In the event of a mass casualty radiological or nuclear scenario, it is important to distinguish between the unexposed (worried well), low-dose exposed individuals and those developing the hematological acute radiation syndrome (HARS) within the first three days postirradiation. In previous baboon studies, we identified gene expression changes after irradiation that were predictive of the severity of the later-developing HARS. Similar changes in the expression of four of these genes were observed using an in vitro human whole blood model. However, these studies provided only limited information on the time frame of the changes after exposure in relationship to the development of HARS. In this study, we analyzed the time-dependent changes in mRNA expression after in vitro irradiation of whole blood. Changes in the expression of informative mRNAs (FDXR, DDB2, POU2AF1 and WNT3) were determined in the blood of eight healthy donors (6 males, 2 females) after irradiation at 0 (control), 0.5, 2 and 4 Gy using qRT-PCR. FDXR expression was significantly upregulated (P < 0.001) 4 h after ≥0.5 Gy irradiation, with an 18–40-fold peak attained 4–12 h postirradiation, and remained elevated (4–9-fold) at 72 h. DDB2 expression was upregulated after 4 h (fold change, 5–8, P < 0.001 at ≥0.5 Gy) and remained upregulated (3–4-fold) until 72 h (P < 0.001). The earliest time points showing a significant downregulation of POU2AF1 and WNT3 were 4 h (fold change = 0.4, P = 0.001, at 4 Gy) and 8 h (fold change = 0.3–0.5, P < 0.001, 2–4 Gy), respectively. These results indicate that the diagnostic window for detecting HARS-predictive changes in gene expression may open as early as 2 h postirradiation for most individuals examined (75%) and by 4 h for all of them. Depending on the RNA species studied, this window may remain open for at least three days postirradiation.
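Fold changes of the kind reported here are conventionally derived from qRT-PCR cycle threshold (Ct) values with the 2^-ΔΔCt method. The sketch below illustrates that arithmetic only; the Ct values and the reference-gene normalization are invented for illustration, not taken from the study.

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression via the 2^-ddCt method: normalize the target
    gene's Ct to a reference gene, then compare treated vs. control."""
    d_ct_treated = ct_target_treated - ct_ref_treated  # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                # compare to unirradiated control
    return 2.0 ** (-dd_ct)

# Example: the target amplifies 4 cycles earlier after irradiation while
# the (hypothetical) reference gene is unchanged -> 16-fold upregulation.
fc = fold_change(ct_target_treated=22.0, ct_ref_treated=18.0,
                 ct_target_control=26.0, ct_ref_control=18.0)
print(round(fc, 1))  # prints 16.0
```

Each one-cycle shift in ΔΔCt corresponds to a factor of two in expression, which is why downregulation appears as fold changes below 1 (e.g., 0.3–0.5 for WNT3 above).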
Due to high metabolic activity, proliferating cells continuously generate free radicals, which induce DNA double-strand breaks (DSB). Fluorescently tagged nuclear foci of the DNA repair protein p53-binding protein 1 (53BP1) are used as a standard metric for measuring DSB formation at baseline and in response to environmental insults such as radiation. Here we demonstrate that the background level of spontaneous 53BP1+ foci formation can be modeled mathematically as a function of cell confluence, a proxy for the proliferation rate. This model was validated using spontaneous 53BP1+ foci data from 72 cultures of primary skin fibroblasts derived from 15 different strains of mice, showing a ∼10-fold decrease from low to full confluence that is independent of mouse strain. On the other hand, the baseline level of spontaneous 53BP1+ foci in a fully confluent cell population was strain-dependent, suggesting genomic associations, and correlated with radiation sensitivity based on previous measurements in the same cell lines. Finally, we have developed an online open-access tool to correct for the effect of cell confluence on 53BP1+ foci-based quantification of DSB. This tool provides guidelines for the number of cells required to reach statistical significance for the detection of DSB induced by low doses of ionizing radiation as a function of confluence and time postirradiation.
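To give a feel for why the number of scored cells matters at low doses, the following sketch applies a generic two-sample, normal-approximation power calculation to Poisson-distributed foci counts. This is a textbook formula under assumed parameters, not the authors' online tool, and the background and effect sizes below are invented for illustration.

```python
import math

def cells_needed(mu_bg, delta, alpha_z=1.96, power_z=0.84):
    """Cells per group needed to detect an increase of `delta` foci/cell
    above a background of `mu_bg` foci/cell, comparing two groups with a
    normal approximation to Poisson counts (variance = mean).
    Defaults: two-sided alpha = 0.05 (z = 1.96), 80% power (z = 0.84)."""
    var_sum = mu_bg + (mu_bg + delta)  # Poisson variances of the two groups add
    n = ((alpha_z + power_z) ** 2) * var_sum / delta ** 2
    return math.ceil(n)

# Hypothetical case: background 0.5 foci/cell, a low dose adds 0.1 foci/cell.
print(cells_needed(mu_bg=0.5, delta=0.1))  # prints 863
# A larger induced signal needs far fewer cells:
print(cells_needed(mu_bg=0.5, delta=0.5))  # prints 48
```

Because the required cell number scales roughly with 1/delta², small low-dose signals on top of a confluence-dependent background quickly demand hundreds of scored nuclei per condition.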
Mai Utada, Alina V. Brenner, Dale L. Preston, John B. Cologne, Ritsu Sakata, Hiromi Sugiyama, Naohiro Kato, Eric J. Grant, Elizabeth K. Cahoon, Kiyohiko Mabuchi, Kotaro Ozasa
There is limited evidence concerning the association between radiation exposure and ovarian cancer. We evaluated radiation risk of ovarian cancer between 1958 and 2009 among 62,534 female atomic bomb survivors in the Life Span Study cohort, adding 11 years of follow-up from the previously reported study. Poisson regression methods were used to estimate excess relative risk per Gy (ERR/Gy) for total ovarian cancer and according to tumor type. We assessed the modifying effect of follow-up period and other factors on the radiation risk. We ascertained 288 first primary ovarian cancers including 77 type 1 epithelial cancers, 75 type 2 epithelial cancers, 66 epithelial cancers of undetermined type and 70 other cancers. Radiation dose was positively, although not significantly, associated with risk of total ovarian cancer [ERR/Gy = 0.30, 95% confidence interval (CI): –0.22 to 1.11]. There was a suggestion of heterogeneity in radiation effects (P = 0.08) for type 1 (ERR/Gy = –0.32, 95% CI: <–0.32 to 0.88) and type 2 cancers (ERR/Gy = 1.24, 95% CI: –0.08 to 4.16). There were no significant trends in the ERR with time since exposure or age at exposure. Further follow-up will help characterize more accurately the patterns of radiation risk for total ovarian cancer and its types.
Kiyohiko Mabuchi, Dale L. Preston, Alina V. Brenner, Hiromi Sugiyama, Mai Utada, Ritsu Sakata, Atsuko Sadakane, Eric J. Grant, Benjamin French, Elizabeth K. Cahoon, Kotaro Ozasa
Epidemiological evidence for a radiation effect on prostate cancer risk has been inconsistent and largely indicative of no or little effect. Here we studied prostate cancer incidence among males of the Life Span Study cohort of atomic bomb survivors in a follow-up from 1958 to 2009, eleven years more than was previously reported. During this period there were 851 incident cases of prostate cancer among 41,544 male subjects, doubling the total number of cases in the cohort. More than 50% of the cases were diagnosed among those who were less than 20 years of age at the time of the bombings and who were at, or near, the ages of heightened prostate cancer risks during the last decade of follow-up. In analyses of the radiation dose response using Poisson regression methods, we used a baseline-rate model that allowed for calendar period effects corresponding to the emergence of prostate-specific antigen screening in the general population as well as effects of attained age and birth cohort. The model also allowed for markedly increased baseline rates among the Adult Health Study participants between 2005 and 2009, a period during which a prostate-specific antigen test was included in Adult Health Study biennial health examinations. We found a significant linear dose response with an estimated excess relative risk (ERR) per Gy of 0.57 (95% CI: 0.21, 1.00, P = 0.001). An estimated 40 of the observed cases were attributed to radiation exposure from the bombings. There was a suggestion of the ERR decreasing with increasing age at exposure (P = 0.09). We found no indication of effects of smoking, alcohol consumption and body mass index on the baseline risk of prostate cancer. The observed dose response strengthens the evidence of a radiation effect on the risk of prostate cancer incidence in the atomic bomb survivors.
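The linear excess-relative-risk model used in these Life Span Study analyses has the form rate(d) = baseline × (1 + β·d), with β the ERR per Gy. The sketch below shows the arithmetic this implies; it uses the reported β = 0.57/Gy for prostate cancer, but the dose and case counts are invented for illustration and this is not the authors' Poisson-regression code.

```python
def expected_cases(baseline_cases, dose_gy, err_per_gy):
    """Expected case count at dose d under the linear ERR model:
    baseline * (1 + beta * d)."""
    return baseline_cases * (1.0 + err_per_gy * dose_gy)

def attributable_fraction(dose_gy, err_per_gy):
    """Fraction of cases in a dose group attributable to radiation:
    beta*d / (1 + beta*d)."""
    excess = err_per_gy * dose_gy
    return excess / (1.0 + excess)

# Hypothetical dose group at 1 Gy with the reported ERR/Gy = 0.57:
print(round(attributable_fraction(dose_gy=1.0, err_per_gy=0.57), 2))  # prints 0.36
# 100 baseline-expected cases would become 157 expected cases at 1 Gy.
print(expected_cases(baseline_cases=100.0, dose_gy=1.0, err_per_gy=0.57))  # prints 157.0
```

Summing such excess terms over the cohort's dose distribution is how an estimate like "40 of the observed cases attributed to radiation" is obtained.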
Administration of diethylenetriaminepentaacetic acid (DTPA) is the treatment approach used to promote the decorporation of internalized plutonium. Here we evaluated the efficacy of PEGylated liposomes coated with DTPA, primarily designed to prevent enhanced plutonium accumulation in bone, compared to marketed nonliposomal DTPA and liposomes encapsulating DTPA. The comparative effects were examined in terms of reduction of activity in the tissues of plutonium-injected rats. Prompt treatment with DTPA-coated liposomes was even more effective than liposome-encapsulated DTPA in limiting skeletal plutonium. This advantage, undoubtedly due to the anchorage of DTPA to the outer layer of the liposomes, is discussed, as is the reason for the loss of this superiority at delayed times after contamination. Plutonium complexed with DTPA-coated liposomes in extracellular compartments was partly diverted into the liver and the spleen. These complexes, and those formed directly inside hepatic and splenic cells, appeared to be degraded and then released from cells at extremely slow rates. This transitory accumulation of activity, which could not be counteracted by combining both liposomal forms, entailed an underestimation of the efficacy of DTPA-coated liposomes against soft-tissue plutonium until its total elimination, probably more than one month after treatment. DTPA-coated liposomes may provide the best delivery vehicle for DTPA to prevent plutonium deposition in tissues, especially in bone, where nuclides become nearly impossible to remove once fixed. Additional development efforts are needed to limit the diversion, or to accelerate cellular release, of plutonium bound to DTPA-coated liposomes, for example by using a labile bond for DTPA attachment.
Cold-inducible RNA-binding protein (CIRP), also named A18 hnRNP or CIRBP, is a cold-shock RNA-binding protein that can be induced by various cellular stresses. Its expression is elevated in various cancer tissues relative to adjacent normal tissues and is believed to play a critical role in cancer development and progression. In this study, we investigated the role of CIRP in cells exposed to ionizing radiation. Our data show that CIRP knockdown reduces colony formation and cell viability after irradiation. In addition, CIRP-knockdown cells showed a higher rate of DNA damage but less cell cycle arrest after irradiation. As a result, the increased DNA damage, combined with reduced DNA repair, led to an increased rate of apoptosis in CIRP-knockdown cells postirradiation. These findings suggest that CIRP is a critical protein in irradiated cells and a potential target for sensitizing cancer cells to radiation therapy.
The recent rollout of 5G telecommunications systems has spawned a renewed call to re-examine the possibility of so-called “non-thermal” harmful effects of radiofrequency (RF) radiation. The possibility of calcium transport being affected by low-level RF has been the subject of research for nearly 50 years, and there have been recent suggestions that voltage-gated calcium channels (VGCCs) are “extraordinarily sensitive” to ambient RF fields. This article examines the feasibility of modulated RF coupling to the gating mechanisms of VGCCs and reviews 50 years of studies from the literature for consistency of outcome. We conclude that the currents induced by fields at the ICNIRP guideline limits are many orders of magnitude below those needed to affect gating, and that a biological mechanism for detection and rectification of the extremely low-frequency (ELF) modulations would be required, which has not been demonstrated. Overall, experimental studies have not validated that RF affects Ca2+ transport into or out of cells.