Many studies in biomedical research and allied fields, in which cells or laboratory animals are exposed to radiation, rely on adequate radiation dose standardization for the reproducibility and comparability of biological data. Due to increasing concerns regarding international terrorism, the use of radioactive isotopes has recently been met with enhanced security measures. Thus, a growing number of researchers have considered transferring their studies from gamma-ray to kilovoltage X-ray irradiators. Current commercially available X-ray biological irradiators produce radiation beams with reasonable field geometry and overall dose homogeneity; however, they operate over a wide range of energies, which vary both between models and across the settings of a single unit. As a result, the relative contributions of Compton scattering and the photoelectric effect also vary widely between irradiators and beam qualities. The photoelectric effect predominates at the relatively low X-ray energies at which these irradiators operate; consequently, a higher dose is delivered to bony tissues and to the adjacent hematopoietic cells of the bone marrow. The increase in the average radiation absorbed dose to the bone marrow compartment of the mouse can be as high as 30%, making animals hematologically more sensitive to kilovoltage X rays. Simply adjusting the radiation dose to provide biological equivalency is complicated by steep dose gradients within the marrow tissue and by qualitatively different outcomes that depend on the spatial location of critical stem and progenitor populations relative to bone. These concerns may be practically addressed by implementing X rays of the highest possible beam energy and penetration, and by increased awareness that radiation damage to hematopoietic cells will not be identical to that observed with standard 137Cs gamma rays.
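The energy dependence behind this bone-dose enhancement follows from textbook photon-interaction scaling; the sketch below is illustrative only, with generic exponents rather than values from this article:

```latex
% Photoelectric mass attenuation rises steeply with atomic number Z
% and falls steeply with photon energy E (n ~ 4-5 for light elements):
\frac{\tau}{\rho} \;\propto\; \frac{Z^{\,n}}{E^{3}}, \qquad n \approx 4\text{--}5
% Under charged-particle equilibrium, the local dose ratio between bone
% and soft tissue tracks the mass energy-absorption coefficients:
\frac{D_{\mathrm{bone}}}{D_{\mathrm{soft}}} \;\approx\;
  \frac{(\mu_{en}/\rho)_{\mathrm{bone}}}{(\mu_{en}/\rho)_{\mathrm{soft}}}
```

With an effective Z of roughly 13.8 for cortical bone against about 7.4 for soft tissue, this ratio grows rapidly as the beam softens, and collapses toward unity at the Compton-dominated energy of 137Cs gamma rays (662 keV).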
D-methionine (D-met), the dextrorotatory isoform of the amino acid L-methionine (L-met), can prevent oral mucositis and salivary hypofunction in mice exposed to radiation. However, the mechanism of its radioprotection is unclear, especially with regard to the stereospecific functions of D-met. Radiation is known to injure normal tissue by triggering DNA damage in cells. Thus, in this study we sought to determine whether the chirality of D-/L-met affects radiation-induced events at the DNA level. We selected plasmid DNA assays to examine this effect in vitro, since these assays are highly sensitive and allow easy detection of DNA damage. Samples of supercoiled pBR322 plasmid DNA mixed with D-met, L-met or dimethylsulfoxide (DMSO) were prepared and irradiated with a carbon-ion beam (∼290 MeV/u) within a 6-cm spread-out Bragg peak. DNA strand breaks were indicated by changes in plasmid conformation and were quantified using agarose gel electrophoresis. We found that D-met provided protection against carbon-ion-induced DNA damage approximately equivalent to that of DMSO. Thus, we propose that the protective function of methionine against plasmid DNA damage can be explained by the same mechanism as that of DMSO, namely hydroxyl radical scavenging. Moreover, there was no significant difference between the radioprotective effects of D-met and L-met on DNA, indicating that the stereospecific radioprotective mechanism observed in vivo operates at a level other than that of DNA.
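The conformational readout maps onto break counts in a standard way; below is a minimal sketch assuming Poisson-distributed single-strand breaks and negligible double-strand breaks, with hypothetical band fractions (not data from this study):

```python
import math

def mean_ssb_per_plasmid(frac_supercoiled: float) -> float:
    """Estimate the mean number of single-strand breaks per plasmid.

    Assumes breaks are Poisson-distributed across plasmids and
    double-strand breaks are negligible, so the surviving supercoiled
    fraction is the Poisson zero class: f_SC = exp(-mu).
    """
    return -math.log(frac_supercoiled)

# Hypothetical band fractions from densitometry of one gel lane:
f_sc_none = 0.25   # no scavenger present
f_sc_dmet = 0.58   # D-met present
f_sc_dmso = 0.60   # DMSO present

print(mean_ssb_per_plasmid(f_sc_none))  # ~1.39 breaks/plasmid
print(mean_ssb_per_plasmid(f_sc_dmet))  # ~0.54
print(mean_ssb_per_plasmid(f_sc_dmso))  # ~0.51
```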
Genetic and epigenetic profile changes associated with individual radiation sensitivity are well documented and have led to an enhanced understanding of the mechanisms of the radiation-induced DNA damage response. However, the search continues for reliable biomarkers of individual radiation sensitivity. Herein, we report on a multi-biomarker approach using traditional cytogenetic biomarkers, DNA damage biomarkers and transcriptional microRNA (miR) biomarkers, coupled with their potential gene targets, to identify radiosensitivity in ataxia-telangiectasia mutated (ATM)-deficient lymphoblastoid cell lines (LCL); ATM-proficient cell lines were used as controls. Cells were irradiated with 0.05 or 0.5 Gy using a linear accelerator, with sham-irradiated cells as controls. At 1 h postirradiation, cells were fixed for γ-H2AX analysis as a measurement of DNA damage, and for cytogenetic analysis using the G2 chromosomal sensitivity assay, G-banding and FISH techniques. RNA was also isolated for genetic profiling by microRNA (miR) and RT-PCR analysis. A panel of 752 miR was analyzed, and the potential target genes phosphatase and tensin homolog (PTEN) and cyclin D1 (CCND1) were measured. The cytogenetic assays revealed that although the control cell line had functional cell cycle checkpoints, the radiosensitivity of the control and AT cell lines was similar. Analysis of DNA damage in all cell lines, including an additional control cell line, showed elevated γ-H2AX levels for only one AT cell line. Of the 752 miR analyzed, eight were upregulated and six were downregulated in the AT cells compared to the control. Upregulated miR-152-3p, miR-24-5p and miR-92-15p, and all downregulated miR, were indicated as modulators of PTEN and CCND1. Further measurement of both genes validated their potential role as radiation-response biomarkers. The multi-biomarker approach not only revealed potential candidates for radiation response, but also provided additional mechanistic insights into the response of ATM-deficient cells.
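Expression comparisons in panels like this are conventionally reduced to fold changes with the Livak 2^-ΔΔCt method; a minimal sketch with hypothetical Ct values, since the abstract gives neither the reference gene nor raw data:

```python
def fold_change(ct_target_irr: float, ct_ref_irr: float,
                ct_target_sham: float, ct_ref_sham: float) -> float:
    """Relative expression by the Livak 2^-ddCt method: normalize the
    target Ct to a reference RNA in each condition, then compare the
    irradiated sample to the sham-irradiated control."""
    dd_ct = (ct_target_irr - ct_ref_irr) - (ct_target_sham - ct_ref_sham)
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values for one miR against a reference small RNA:
print(fold_change(24.1, 18.0, 25.6, 18.1))  # ~2.6-fold upregulated
```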
We report the generation of dose point kernels for clinically relevant radionuclide beta decays and monoenergetic electrons in various tissues, to understand the impact of tissue type on dose point kernels. Currently available voxel-wise dosimetry approaches using dose point kernels ignore tissue composition and density heterogeneities; a study of the impact of tissue type on dose point kernels is therefore warranted. Simulations were performed using the GATE Monte Carlo toolkit, which encapsulates the GEANT4 libraries. Dose point kernels were simulated in phantoms of water, compact bone, lung, adipose tissue, blood and red marrow for the radionuclides 90Y, 188Re, 32P, 89Sr, 186Re, 153Sm and 177Lu and for monoenergetic electrons (0.015–10 MeV). All simulations assumed an isotropic point source of electrons at the center of a homogeneous spherical phantom. Tissue-specific differences between kernels were investigated by normalizing kernels to the effective pathlength. Transport of 20 million particles was found to provide sufficient statistical precision in all simulated kernels. The simulated dose point kernels demonstrate excellent agreement with other Monte Carlo packages: deviation from kernels reported in the literature did not exceed a 10% global difference, which is consistent with the variability among published results. There are no significant differences between the dose point kernel in water and kernels in other tissues that have been scaled to account for density; however, tissue density itself is, as expected, a significant determinant of the dose point kernel distribution.
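The density scaling referred to here is the standard distance-and-amplitude rescaling of a water kernel; a minimal sketch (function and variable names are mine, and identical elemental composition is assumed):

```python
import numpy as np

def scale_dpk_by_density(r: np.ndarray, k_water: np.ndarray,
                         rho_tissue: float, rho_water: float = 1.0) -> np.ndarray:
    """Approximate a tissue dose point kernel from a tabulated water
    kernel by density-only scaling:

        K_t(r) = (rho_t / rho_w)^2 * K_w(r * rho_t / rho_w)

    which conserves the total energy deposited per decay. r is the
    radius grid [cm]; k_water is the water kernel [Gy/decay] on that grid.
    """
    ratio = rho_tissue / rho_water
    # Interpolate the water kernel at the density-scaled radii; radii
    # beyond the tabulated range are clamped to the last value.
    return ratio**2 * np.interp(r * ratio, r, k_water)

# e.g. lung at ~0.26 g/cm^3: the kernel stretches out radially and its
# amplitude drops by a factor of (0.26)^2 relative to water.
```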
Abdulnaser Alkhalil, John L. Clifford, Robert Ball, Anna Day, Rosanna Chan, Bonnie C. Carney, Stacy Ann Miller, Ross Campbell, Raina Kumar, Aarti Gautam, Rasha Hammamieh, Lauren T. Moffatt, Jeffrey W. Shupp
In the event of a mass casualty radiation scenario, rapid assessment of patients' health and triage is required for optimal resource utilization. Identifying the level and extent of exposure, as well as prioritizing care, is extremely challenging under such disaster conditions. Blood-based biomarkers, such as RNA integrity numbers (RIN), could help healthcare personnel quickly and efficiently determine the extent and effect of multiple injuries on patients' health. We evaluated the effect of different radiation doses, alone or in combination with burn injury, on total RNA integrity over multiple time points. Total RNA integrity was assessed in blood samples for potential application as a marker of radiation exposure and survival. Groups of aged mice (3–6 mice/group, 13–18 months old) received 0.5, 1, 5, 10 or 20 Gy ionizing radiation. Two additional mouse groups received low-dose irradiation (0.5 or 1 Gy) with a 15% total body surface area (TBSA) burn injury. Animals were euthanized at 2 or 12 h and at day 1, 2, 3, 7 or 14 postirradiation, or when injury-mediated mortality occurred. Total RNA was isolated from blood, its quality was evaluated and RIN values were obtained. Analysis of RIN indicated that blood showed the clearest radiation effect. There was a time- and radiation-dose-dependent reduction in RIN that was first detectable at 12 h postirradiation for all doses in animals receiving irradiation alone. This effect was reversible in the lower-dose groups (i.e., 0.5, 1 and 5 Gy), which survived to the end of the study (14 days). In contrast, the effect persisted in the 10 and 20 Gy groups, which showed suppression of RIN values to <4.5 together with high mortality. Radiation doses of 20 Gy were lethal and required euthanasia by day 6. A low RIN (<2.5) at any time point was associated with 100% mortality. Combined radiation-burn injury produced significantly increased mortality, such that no dually-injured animals survived beyond day 3 and no radiation dose >1 Gy resulted in survival past day 1. More modest suppression of RIN was observed in the surviving dually challenged mice, and no statistically significant changes were identified in the RIN values of burn-only mice at any time point. This animal-model study presents a proof of concept for a simple and accurate method of assessing radiation dose exposure in blood that potentially predicts lethality. RIN assessment of blood-derived RNA could form the basis for a clinical decision-support tool to guide healthcare providers under the strenuous conditions of a radiation-based mass casualty event.
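As a sketch of how the reported thresholds might feed such a decision-support tool; the rule structure is hypothetical, and the cutoffs are the mouse-study values rather than validated clinical ones:

```python
def triage_from_rin(rin: float, hours_since_exposure: float) -> str:
    """Toy triage rule built on thresholds reported in this mouse study;
    illustrative only, not a validated clinical algorithm."""
    if rin < 2.5:
        # RIN < 2.5 at any time point was associated with 100% mortality.
        return "expectant"
    if rin < 4.5 and hours_since_exposure >= 12:
        # Sustained suppression below 4.5 tracked the 10-20 Gy groups.
        return "critical"
    # RIN suppression was reversible in animals receiving <= 5 Gy.
    return "monitor"
```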
In this work, we used spontaneously hypertensive rats (SHR) and Wistar Kyoto rats (WKY), the strain from which the SHR was established, to evaluate the effects of whole-body acute irradiation on the cardiovascular system at doses from 0 to 4 Gy. In the irradiated SHR, systolic blood pressure (SBP) increased with increasing dose, while body weight gain decreased with increasing dose. Furthermore, pathological observation of the SHR demonstrated that the number of rats with cystic degeneration in the liver increased with increasing dose. The effects observed among SHR, such as increased SBP and retardation of body weight gain, appear very similar to those observed in Japanese atomic bomb survivors. In contrast, SBP among WKY did not change with dose; body weight, however, did change, as in the SHR. Therefore, the association between radiation exposure and SBP, but not that between radiation exposure and retardation of body weight gain, may be affected by genetic background, as evidenced by this strain difference. These results suggest that the SHR and WKY animal models may be useful for studying radiation effects on non-cancer diseases, including circulatory diseases, chronic liver disease and developmental retardation.
Chemically induced premature chromosome condensation (PCC) is an alternative biodosimetry method to the gold-standard dicentric analysis for ionizing radiation. However, the existing literature shows great variation in experimental protocols, which, together with the different scoring criteria applied in individual studies, results in large discrepancies in the coefficients of the calibration curves. The current study is based on an extensive review of the peer-reviewed literature on the chemically induced ring PCC (rPCC) assay for high-dose exposure. For the first time, a simplified yet effective protocol was developed and tested in an attempt to reduce scoring time and increase the accuracy of dose estimation. Briefly, the protein phosphatase inhibitor calyculin A was selected over okadaic acid for its higher efficiency. The Colcemid block was omitted and only G2-PCC cells were scored. Strict scoring criteria for total rings and for hollow rings only were described, to minimize the uncertainty resulting from scoring ring-like artefacts. Ring aberrations were found to follow a Poisson distribution, and the dose-effect relationship favored a linear fit with an α value of 0.0499 ± 0.0028 Gy^-1 for total rings and 0.0361 ± 0.0031 Gy^-1 for hollow rings only. The calibration curves constructed by scoring ring aberrations were directly compared between the simplified calyculin A-induced PCC protocol and the cell fusion-induced PCC protocol for high-dose gamma-ray exposure. The technical practicalities of the two methods were also compared, and our blind validation tests showed that both assays are feasible for high-dose γ-ray exposure assessment even when only hollow rings in 100 PCC spreads are scored.
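Given the linear fit, dose reconstruction from a scored sample is straightforward; a minimal sketch using the total-ring coefficient above, with Poisson counting error only (the uncertainty in α itself is ignored, and the example counts are hypothetical):

```python
import math

def estimate_dose(rings: int, cells: int, alpha: float = 0.0499):
    """Point estimate and approximate 95% CI of absorbed dose from
    ring-PCC counts, assuming the linear calibration Y = alpha * D
    (alpha in rings per cell per Gy) and Poisson-distributed counts."""
    y = rings / cells                 # observed ring yield per cell
    dose = y / alpha
    se_y = math.sqrt(rings) / cells   # Poisson standard error of the yield
    half_width = 1.96 * se_y / alpha
    return dose, (dose - half_width, dose + half_width)

# e.g. 75 total rings scored in 100 G2-PCC spreads:
print(estimate_dose(75, 100))  # ~15 Gy, CI roughly (11.6, 18.4)
```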