Under a future climate for south-eastern Australia, the net effect of elevated CO2 (eCO2), lower growing-season rainfall and higher temperatures is likely to increase haying-off and thus limit production of rain-fed wheat crops. We used a modelling approach to assess the impact of an expected future climate on wheat growth across four cropping regions in Victoria. A wheat model, APSIM-Nwheat, was performance-tested against three datasets: (i) a field experiment at Wagga Wagga, NSW; (ii) the Australian Grains Free Air Carbon dioxide Enrichment (AGFACE) experiment at Horsham, Victoria; and (iii) a broad-acre wheat crop survey in western Victoria. For down-scaled climate predictions for 2050, average rainfall during October, which coincides with crop flowering, decreased by 32, 29, 26, and 18% for the semiarid regions of the northern Mallee, the southern Mallee, the Wimmera, and the higher rainfall zone (HRZ) in the Western District, respectively. Mean annual minimum and maximum temperatures over the four regions increased by 1.9 and 2.2°C, respectively. A pair-wise comparison of the yield/anthesis biomass ratio across climate scenarios, used to assess the haying-off response, revealed a 39, 49 and 47% increase in the frequency of haying-off for the northern Mallee, southern Mallee and Wimmera, respectively, when crops were sown near the historically optimal time (1 June). This translated to a reduction in yield from 1.6 to 1.4 t/ha (northern Mallee), 2.5 to 2.2 t/ha (southern Mallee) and 3.7 to 3.6 t/ha (Wimmera) under a future climate. Sowing earlier (1 May) reduced the impact of a future climate on haying-off, with decreases in the yield/anthesis biomass ratio of 24, 28 and 23% for the respective regions. Heavy-textured soils exacerbated the impact of a future climate on haying-off within the Wimmera. Within the HRZ of the Western District, crops were not water-limited during grain filling, so there was no evidence of haying-off, and average crop yields increased by 5% under a future climate (from 6.4 to 6.7 t/ha). The simulated effect of eCO2 alone (FACE conditions) increased average yields by 18–38% in the semiarid regions but not in the HRZ, with no evidence of haying-off. Under a future climate, sowing earlier limited the impact of hotter, drier conditions by reducing pre-anthesis plant growth, grain set and resource depletion, and by shifting the grain-filling phase earlier, which reduced the impact of drier spring conditions. Overall, earlier sowing in a Mediterranean-type environment appears to be an important management strategy for maintaining wheat production in semiarid cropping regions into the future, although it must be balanced against other agronomic considerations such as frost risk and weed control.
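As a purely illustrative aid (not the authors' code), the sketch below shows one way a pair-wise comparison of the kind described above could be computed: the yield/anthesis biomass ratio is calculated for baseline and future scenarios run over the same weather sequence, and the share of seasons with a lower ratio under the future climate is reported as an indicator of increased haying-off. The pairing rule, the simple "lower ratio" criterion and all numbers are assumptions for illustration only.

```python
# Illustrative sketch of a pair-wise haying-off comparison between climate
# scenarios using the yield/anthesis-biomass ratio; data are hypothetical.

def hayoff_frequency(baseline, future):
    """baseline, future: lists of (yield_t_ha, anthesis_biomass_t_ha) tuples,
    paired season by season from the same weather sequence."""
    ratios_base = [y / b for y, b in baseline]
    ratios_fut = [y / b for y, b in future]
    worse = sum(1 for rb, rf in zip(ratios_base, ratios_fut) if rf < rb)
    return 100.0 * worse / len(ratios_base)  # % of seasons with a lower ratio

# Hypothetical example: three paired seasons
base = [(2.5, 7.0), (2.2, 6.5), (2.8, 7.5)]
fut = [(2.0, 6.8), (2.3, 6.6), (2.2, 7.2)]
print(f"Seasons with a lower yield/anthesis-biomass ratio: {hayoff_frequency(base, fut):.0f}%")
```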
Pyramiding of quantitative trait loci (QTLs) can be an effective approach for developing durable resistance to powdery mildew in wheat (Triticum aestivum L.). The Chinese wheat cultivars Bainong 64 and Lumai 21, both with outstanding agronomic traits, possess four and three QTLs, respectively, for adult-plant resistance (APR) to powdery mildew. To achieve optimal durable resistance, 21 F6 lines combining two to five powdery mildew APR QTLs were developed from the cross Bainong 64/Lumai 21 using modified pedigree selection. These lines were planted in a randomised complete block design with two replicates in Beijing during the 2009–10 and 2010–11 cropping seasons, and were evaluated for powdery mildew response using the highly virulent Blumeria graminis f. sp. tritici isolate E20. Based on the phenotypic data for both maximum disease severity (MDS) and area under the disease progress curve (AUDPC), analysis of variance indicated highly significant effects of QTL combinations on reducing powdery mildew MDS and AUDPC. Six pyramided QTL combinations possessing QPm.caas-1A and QPm.caas-4DL in common, along with one or more of the others, expressed better APR to powdery mildew than the more resistant parent, Bainong 64. Thus, pyramiding these two QTLs with one or more of QPm.caas-2BS, QPm.caas-2BL, and QPm.caas-2DL from Lumai 21 could be a desirable strategy for breeding cultivars with high levels of durable resistance to powdery mildew. Experienced breeders with a good knowledge of minor genes can achieve APR by phenotypic selection, whereas selection by molecular markers will still require uniform field testing of the powdery mildew disease phenotype to validate the resistance. These results provide useful information for pyramiding APR QTLs in wheat breeding programs.
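For readers unfamiliar with AUDPC, the sketch below shows the standard trapezoidal calculation used to summarise repeated disease-severity scores into a single value. The assessment dates and severities are hypothetical placeholders, not data from this study.

```python
# Minimal sketch of the trapezoidal AUDPC calculation (hypothetical data).

def audpc(days, severities):
    """days: assessment times (days); severities: % disease severity at each time."""
    return sum((severities[i] + severities[i + 1]) / 2.0 * (days[i + 1] - days[i])
               for i in range(len(days) - 1))

days = [0, 7, 14, 21]        # days after first assessment (hypothetical)
severity = [1, 5, 20, 45]    # powdery mildew severity, % (hypothetical)
print(f"AUDPC = {audpc(days, severity):.1f} %-days")
```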
The pea weevil, Bruchus pisorum, is one of the most intractable pest problems of cultivated field pea (Pisum sativum) worldwide. Pesticide application, either as a contact insecticide spray to the field pea crop or as fumigation of the harvested seed, is the only available method for its control. The aim of this study was to develop a quick and reliable method to screen for pea weevil resistance and increase efficiency in breeding for this important trait. Backcrossed progenies derived from an interspecific cross between cultivated field pea and its wild relative (Pisum fulvum, the source of resistance to pea weevil) were subjected to natural infestation in field plots. Mature seeds were hand-harvested, stored to allow development of adult beetles, and then separated into infested and non-infested seeds using a density separation method in 30% caesium chloride (CsCl). Susceptibility and resistance of the progenies were calculated from this separation and further confirmed by a glasshouse bioassay. Resistance in backcross populations improved considerably through selection of resistant lines using the density separation method. We found that CsCl density separation is a useful tool in breeding for pea weevil resistance. Following this procedure, we were able to introgress pea weevil resistance from P. fulvum into cultivated field pea through backcrossing to produce several advanced pea weevil-resistant lines.
The expression and inheritance of several qualitative traits were examined in four cultivated × wild hybrid populations involving each of two mungbean (Vigna radiata ssp. radiata) cultivars, cvv. Berken and Kiloga, and each of two Australian accessions of the wild subspecies (V. radiata ssp. sublobata). One of the wild accessions, ACC 1, was representative of a prostrate, fine-stemmed, gracile type; the other, ACC 87, was representative of a more robust perennial form endemic to north-eastern Australia. For each of the four cultivated × wild populations, trait expression was observed in plants from the parent, F1, F2, and the two F1–parental backcross generations, grown under favourable conditions in large pots on benches in the field at the CSIRO Davies Laboratory, Townsville, Australia. Models of inheritance were inferred from the segregation patterns of the cultivated v. wild phenotypes in the different generations. For most traits, the model of inheritance depended more on the wild than the cultivated parent, with more traits being digenic in the crosses involving ACC 1 than in those involving ACC 87. For all the observed morphological and seed traits, the wild phenotype was dominant, consistent with the cultivated phenotype having arisen through mutations that inhibited expression of the wild type. In contrast, the apparent resistance of the wild parents to field strains of powdery mildew disease was recessive to the strong susceptibility of the two cultivars. The segregation patterns for presence or absence of tuberous roots were remarkably similar in the two crosses involving the perennial accession ACC 87, and were consistent with the formation of tuberous roots being conditioned by two complementary, dominant genes. That an apparently complex trait like perenniality might be conditioned by so few genes suggests that perenniality may also be an ancestral wild trait, disruption of which has led to the now more common annual form. Linkage analyses suggested that perenniality was associated with the wild-type seed traits black speckled testa and pigmented hilum, which previous molecular studies have indicated are both located on mungbean linkage group 2.
The effect of grazing vegetative canola (Brassica napus) with sheep on crop growth and yield was investigated in two field experiments (Expts 1 and 2) in 2008 at Wagga Wagga, New South Wales, Australia. The experiments included a range of cultivars, sowing rates, and grazing periods to investigate the influence of these factors on grazing biomass, crop recovery, and grain yield. Three spring canola cultivars (representing triazine-tolerant, conventional, and hybrid types) were used in both experiments; they were sown at three sowing rates and grazed by sheep for 7 days in midwinter in Expt 1, while two different grazing periods were compared in Expt 2. Supplementary irrigation was applied to Expt 1 to approximate average growing-season conditions, while Expt 2 received no irrigation. Increased sowing rate produced greater early shoot biomass for grazing, but the triazine-tolerant cultivar produced less biomass than the conventional or hybrid cultivars in both experiments. Grazing reduced dry matter and leaf area by >50%, delayed flowering by 4 days on average, and reduced biomass at flowering by 22–52%. However, cultivar and sowing rate had no effect on the recovery of biomass and leaf area after grazing. Grazing had no effect on final grain yield under supplementary irrigation in Expt 1, but reduced grain yield under the drier regrowth conditions in Expt 2. The results demonstrate that grazing canola is feasible under average seasonal conditions in a medium-rainfall environment (400–600 mm) without yield penalty, provided the timing and intensity of grazing are matched to available biomass and to the anticipated seasonal water supply supporting grain production. More broadly, we suggest that grain yield reductions from grazing could be avoided if suitable conditions for regrowth (residual dry matter, length of regrowth period, and adequate moisture) can generate biomass in excess of a target value of ∼5000 kg/ha at flowering. This target represents a biomass level at which >90% of photosynthetically active radiation was intercepted in our study, and which other studies indicate is the level above which there is little further increase in potential yield. Such a target provides a basis for more objective grazing management but awaits further confirmation through experimentation and modelling.
Cotton is one of the most important irrigated crops in subtropical Australia. In recent years, cotton production has been severely affected by the worst drought in recorded history, with the 2007–08 growing season recording the lowest average cotton yield in 30 years. Using a crop simulation model to simulate the long-term temporal distribution of cotton yields under different levels of irrigation, and the marginal value of each unit of water applied, is important in determining the economic feasibility of current irrigation practices. The objectives of this study were to: (i) evaluate the CROPGRO-Cotton simulation model for studying crop growth under deficit-irrigation scenarios across ten locations in New South Wales (NSW) and Queensland (Qld); (ii) evaluate agronomic and economic responses to water inputs across the ten locations; and (iii) determine the economically optimal irrigation level. The CROPGRO-Cotton simulation model was evaluated using 2 years of experimental data collected at Kingsthorpe, Qld. The model was further evaluated using data from nine locations between northern NSW and southern Qld. Long-term simulations were based on the prevalent furrow-irrigation practice of refilling the soil profile when plant-available soil water content falls below 50%. The model closely estimated lint yield for all locations evaluated. Our results showed that the amounts of water needed to maximise profit and to maximise yield differ, which has economic and environmental implications. The irrigation needed to maximise profit varied with both agronomic and economic factors, which can vary markedly with season and location. Therefore, better tools and information that consider the agronomic and economic implications of irrigation decisions need to be developed and made available to growers.
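To illustrate the distinction the study draws between yield-maximising and profit-maximising irrigation, the sketch below applies a simple gross-margin calculation to a hypothetical irrigation response curve. The yields, lint price, water cost and fixed costs are placeholder assumptions, not CROPGRO-Cotton output or the study's economic parameters.

```python
# Illustrative sketch: the irrigation level that maximises yield need not
# maximise profit once water and fixed costs are charged (hypothetical data).

irrigation_ml_ha = [0, 2, 4, 6, 8]                 # applied water (ML/ha, assumed)
lint_yield_kg_ha = [900, 1500, 1900, 2100, 2150]   # simulated lint yield (kg/ha, assumed)

LINT_PRICE = 2.0      # $/kg lint (assumed)
WATER_COST = 150.0    # $/ML delivered (assumed)
OTHER_COSTS = 1200.0  # $/ha fixed growing costs (assumed)

profits = [y * LINT_PRICE - w * WATER_COST - OTHER_COSTS
           for w, y in zip(irrigation_ml_ha, lint_yield_kg_ha)]

best_yield = max(zip(lint_yield_kg_ha, irrigation_ml_ha))
best_profit = max(zip(profits, irrigation_ml_ha))
print(f"Yield-maximising irrigation:  {best_yield[1]} ML/ha")
print(f"Profit-maximising irrigation: {best_profit[1]} ML/ha")
```

With these placeholder numbers the last two megalitres add a little yield but cost more than the extra lint is worth, so the profit-maximising level sits below the yield-maximising level, which is the pattern the abstract describes.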
Cullen australasicum is an Australian legume species that holds promise for development as a drought-tolerant perennial pasture species, yet only a few accessions have been evaluated for agronomic traits. Several Cullen species besides C. australasicum may also have potential for use as perennial pastures. We compared the field survival and aboveground biomass production of 100 germplasm accessions from 9 Cullen species, 2 lucerne (Medicago sativa) cultivars and 2 perennial Lotus species over 18 months in a low-rainfall region of the Western Australian wheatbelt. The nutritive value of selected Cullen accessions was also compared with that of lucerne and L. australis. Several accessions of C. australasicum demonstrated good survival, productivity and nutritional value, and some accessions of C. discolor, C. lachnostachys, C. pallidum and C. pustulatum also showed promise in some or all of these traits. Significant phenotypic variation was observed among accessions of C. australasicum, C. pallidum, C. cinereum and C. tenax for some agronomic traits. We discuss the implications of this variation for further experiments and for the development of Cullen species. While the survival and productivity of many Cullen accessions were similar to lucerne, only a few C. australasicum accessions were more productive than lucerne. We conclude that C. australasicum is currently the best prospect among Cullen species for cultivar development as a perennial pasture legume, and our analysis has highlighted accessions of particular interest. Further work on C. discolor, C. lachnostachys, C. pallidum and C. pustulatum may also, in the longer term, provide useful pasture species.
Subterranean clover (Trifolium subterraneum) is a key pasture legume across southern Australia and elsewhere. Decline in subterranean clover pastures was first recognised in Australia during the 1960s and manifests as an increase in weeds and a decrease in desirable legume species. While both root disease and poor nutrition contribute to subterranean clover pasture decline, the relationships between root disease and nutrition have not been determined. The objective of this study was to define these relationships. Field experiments were undertaken to determine the nutritional and pathogen status of soils and subterranean clover at three Western Australian field sites. Subsequently, controlled-environment experiments were undertaken to determine the relative severities of tap and lateral root disease, and plant growth, when soil cores taken from these three field sites were amended with a complete nutrient solution or a range of individual macro- or micronutrient treatments. Application of a complete Hoagland's nutrient solution decreased the severity of tap root disease by an average of 45% and of lateral root disease by 32%. Amendment with K alone reduced the severity of tap root disease by an average of 32%, while application of N alone reduced the severity of tap root disease by 33% and of lateral root disease by 27%. Application of Hoagland's solution, K, N or Zn increased shoot and root dry weight, while Mo increased shoot dry weight only. This is the first report to show that mineral nutrients can substantially ameliorate root disease in subterranean clover. The results demonstrate that while root disease limits plant growth, improving the nutritional status of nutrient-impoverished soils can significantly reduce root disease. There is significant potential to incorporate nutrient amendments into an integrated and more sustainable approach to managing root disease and increasing the productivity of pasture legumes where soils are inherently deficient in one or more nutrients.
Control of dry matter losses (DML) is a major concern of forage conservation systems. Measuring DML during hay and silage storage is difficult and time-consuming, so it is usually limited to experimental conditions. The lack of a practical way to measure DML and so monitor forage conservation efficiency has contributed to the poor adoption of good practices. The availability of a practical, easy, and economical technique capable of estimating on-farm DML would facilitate advisory and extension work. The objective of this study was to assess the accuracy and precision of an indirect technique, based on compositional changes, for estimating storage DML in silages and hays. Data were generated through a Monte Carlo simulation developed to test the effects of type of data distribution (normal or log-normal), variability (5 and 10% coefficient of variation), and sample size (1000, 30, 20, and 10). Results indicated that the potential markers explored (acid detergent fibre and acid detergent lignin) had log-normal distributions and that a coefficient of variation of ∼10% was reasonable. Summary statistics showed that means and medians were coherent across the different sample sizes. It was concluded that changes in marker concentrations could provide a reasonably robust system for predicting DML during hay or silage storage.
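As a rough sketch in the same spirit as the simulation described above (not the authors' code), the fragment below draws log-normally distributed marker concentrations at storage-in and storage-out, estimates DML from the rise in concentration of a conserved marker, and compares the spread of the estimate across sample sizes. The true DML, marker level and replicate counts are assumed values for illustration only.

```python
# Monte Carlo sketch of marker-based estimation of storage dry matter loss.
import math
import random
import statistics

TRUE_DML = 8.0    # true storage dry matter loss, % (assumed for illustration)
MARKER_IN = 6.0   # marker concentration going into storage, % of DM (assumed)
CV = 0.10         # ~10% coefficient of variation, as suggested by the study

def lognormal(mean, cv):
    """Sample from a log-normal distribution with the given mean and CV."""
    sigma2 = math.log(1.0 + cv ** 2)
    mu = math.log(mean) - sigma2 / 2.0
    return random.lognormvariate(mu, math.sqrt(sigma2))

def estimate_dml(sample_size):
    """Estimate DML from paired in/out marker samples for one storage event."""
    # An indigestible marker is conserved, so its concentration rises as DM is lost.
    marker_out_true = MARKER_IN / (1.0 - TRUE_DML / 100.0)
    m_in = statistics.mean(lognormal(MARKER_IN, CV) for _ in range(sample_size))
    m_out = statistics.mean(lognormal(marker_out_true, CV) for _ in range(sample_size))
    return 100.0 * (1.0 - m_in / m_out)  # estimated DML, %

for n in (10, 20, 30, 1000):  # sample sizes examined in the study
    estimates = [estimate_dml(n) for _ in range(2000)]
    print(f"n={n:4d}: mean DML {statistics.mean(estimates):5.2f}%, "
          f"sd {statistics.stdev(estimates):4.2f}%")
```

Increasing the number of samples per storage event tightens the spread of the DML estimate without changing its mean, which is the accuracy-versus-precision question the simulation study addresses.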