Frost damage remains a major problem for broadacre cropping, viticulture, horticulture and other agricultural industries in Australia. Losses from frost events in Australian broadacre agriculture are estimated at between $120 million and $700 million each year. Understanding the changing nature of frost risk, and the drivers responsible, is an important step in helping many producers manage climate variability and change. Our analysis, using Stevenson screen temperature thresholds of 2°C or below as an indicator of frost at ground level, demonstrates that, despite a warming trend of 0.17°C per decade since 1960, ‘frost season’ length has increased, on average, by 26 days across the whole southern portion of Australia compared with the 1960–1990 long-term mean. Some areas of south-eastern Australia now experience their last frost an average of 4 weeks later than during the 1960s. The intersection of frost and wheat production risk was quantified at 60 sites across the Australian wheatbelt, with a more in-depth analysis undertaken for 15 locations across Victoria (i.e. eight sites common to both the national and Victorian assessments and seven sites exclusive to the Victorian analysis). The results of the national assessment highlight how frost-related production risk has increased by as much as 30% across much of the Australian wheatbelt, for a range of wheat maturity types, over the last two decades, in response to an increase in later frost events. Across the 15 Victorian sites, sowing dates to achieve anthesis during a period with only a 10% chance of a 0°C night occurring shifted later by 23 days (6 June) for the short-season variety, 20 days (17 May) for the medium-season variety and 36 days (9 May) for the long-season variety assessed.
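As an illustration of the threshold-based approach described above, the sketch below derives frost-season length and last-frost day-of-year from a daily minimum screen-temperature series, using the 2°C threshold given in the text. The pandas-based workflow, file name and column names are illustrative assumptions, not the authors' code.

```python
import pandas as pd

FROST_THRESHOLD_C = 2.0  # Stevenson screen proxy for ground-level frost (from the text)

def frost_season_stats(daily_tmin: pd.Series) -> dict:
    """Frost-season length and first/last frost day-of-year for one site-year.

    daily_tmin: minimum screen temperature (°C) indexed by date for one calendar year.
    """
    frost_days = daily_tmin[daily_tmin <= FROST_THRESHOLD_C]
    if frost_days.empty:
        return {"season_length_days": 0, "first_frost_doy": None, "last_frost_doy": None}
    first, last = frost_days.index.min(), frost_days.index.max()
    return {
        "season_length_days": (last - first).days + 1,  # span from first to last frost
        "first_frost_doy": first.dayofyear,
        "last_frost_doy": last.dayofyear,
    }

# Hypothetical usage with a CSV of daily minima for one station:
# tmin = pd.read_csv("station_tmin.csv", parse_dates=["date"], index_col="date")["tmin"]
# stats_by_year = {yr: frost_season_stats(grp) for yr, grp in tmin.groupby(tmin.index.year)}
```

Comparing such per-year statistics against a 1960–1990 baseline mean is one simple way to express the shifts in frost-season length and last-frost date reported above.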
We used life cycle assessment methodology to determine the cradle-to-farmgate GHG emissions for rainfed wheat grown in monoculture or in sequence with the break crops canola (Brassica napus) and field peas (Pisum sativum), and for the break crops themselves, in the south-eastern grains region of Australia. Total GHG emissions were 225 kg carbon dioxide equivalents (CO2-e)/t grain for a 3 t/ha wheat crop following wheat, compared with 199 and 172 kg CO2-e/t for wheat following canola and field peas, respectively. On an area basis, calculated emissions were 676, 677 and 586 kg CO2-e/ha for wheat following wheat, canola and field peas, respectively. The highest emissions were associated with the production and transport of fertilisers (23–28% of total GHG emissions) and their use in the field (16–23% of total GHG emissions). Production, transport and use of lime accounted for an additional 19–21% of total GHG emissions. The lower emissions for wheat after break crops were associated with higher yields, improved use of fertiliser nitrogen (N) and, in the case of wheat after field peas, reduced fertiliser N inputs. Emissions of GHG for the production and harvesting of canola were calculated at 841 kg CO2-e/ha, equivalent to 420 kg CO2-e/t grain; those of field peas were 530 kg CO2-e/ha, equivalent to 294 kg CO2-e/t grain. When the gross margin returns for the crops were considered together with their GHG emissions, the field pea–wheat sequence had the highest value per unit emissions, at AU$787/t CO2-e, followed by wheat–wheat ($703/t CO2-e) and canola–wheat ($696/t CO2-e). Uncertainties associated with emission factor values for fertiliser N, legume-fixed N and mineralised soil organic matter N are discussed, together with the potentially high C cost of legume N2 fixation and the potential for relatively small changes in soil C during grain cropping either to offset all or most pre- and on-farm GHG emissions or to add to them.
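The two reporting bases used above (per hectare and per tonne of grain) and the value-per-emissions metric are simple arithmetic conversions; a minimal sketch is given below. The 676 kg CO2-e/ha and 3 t/ha figures are from the abstract, while the gross-margin value is a placeholder, not data from the study.

```python
# Converting area-based GHG emissions to an emissions intensity per tonne of grain,
# and expressing gross margin per tonne of CO2-e emitted.

def emissions_intensity(kg_co2e_per_ha: float, grain_yield_t_per_ha: float) -> float:
    """Emissions intensity in kg CO2-e per tonne of grain."""
    return kg_co2e_per_ha / grain_yield_t_per_ha

def value_per_emissions(gross_margin_aud_per_ha: float, kg_co2e_per_ha: float) -> float:
    """Gross margin returned per tonne of CO2-e emitted (AU$/t CO2-e)."""
    return gross_margin_aud_per_ha / (kg_co2e_per_ha / 1000.0)

print(emissions_intensity(676, 3.0))   # ~225 kg CO2-e/t, matching wheat after wheat
print(value_per_emissions(475, 676))   # placeholder gross margin of AU$475/ha
```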
Cold temperature stress at the reproductive stage, particularly at the booting and flowering stages, can cause significant reductions in rice (Oryza sativa L.) yield, especially at high latitudes and elevations. Although genotypic variation for cold tolerance is known to exist, the tolerance mechanisms and genotypic consistency across these stages are yet to be understood for segregating populations. Three experiments were conducted under controlled-temperature glasshouse conditions to determine the floral characteristics associated with cold tolerance at the flowering stage and to determine genotypic consistency at the booting and flowering stages. Twenty F5 Reiziq × Lijiangheigu lines from two extreme phenotypic bulks selected for cold tolerance at the booting stage in the F2 generation were utilised.
Spikelet sterility under cold stress at booting was significantly correlated with spikelet sterility under cold stress at flowering (r = 0.62**), with five lines identified as cold tolerant across reproductive stages. There was also a positive correlation (r = 0.47*) between spikelet sterility under cold stress at booting in the F5 and in the F2 generation. The quantitative trait locus (QTL) qLTSPKST10.1, previously identified on chromosome 10 as contributing to spikelet sterility in the F2 generation, was also identified in the F5 generation. Additionally, genomic regions displaying significant segregation between progenies contrasting in their cold tolerance phenotype were identified on chromosomes 5 and 7, with Lijiangheigu as the allelic donor and an estimated reduction in spikelet sterility of 25% and 27%, respectively. Although genotypic variation in spikelet sterility at the booting stage was not related to the rate of development to heading or flowering, the cold-tolerant genotypes at the flowering stage were the quickest to complete flowering. Cold-tolerant genotypes at the flowering stage had larger numbers of dehisced anthers and, consequently, more pollen on the stigma, which contributed to reduced spikelet sterility. It is concluded that enhanced anther dehiscence plays a significant role in improved cold tolerance at the flowering stage.
Foxtail millet (Setaria italica (L.) P.Beauv.) is an ancient cereal cultivated worldwide in arid and marginal lands. With its high photosynthetic efficiency, it is an ideal crop for a changing climate. A trait-based selection for drought tolerance is sought for yield stability. The present work partitioned grain yield under drought into total water use (T), transpiration efficiency (TE) and harvest index (HI), and assessed the importance of these components and their association with drought tolerance. The core collection of foxtail millet germplasm (n = 155) was evaluated in mini-lysimeters under both terminal drought stress (DS) and well-watered (WW) environments. The contribution of T to grain yield under drought was minor, whereas the contribution of TE was positive and that of HI negative. Crop duration, T and TE positively influenced, and HI negatively influenced, shoot biomass production. Under drought, the core germplasm accessions varied in shoot biomass, grain yield, HI and T by >3-fold and in TE by 2-fold. Categorisation of the germplasm for TE differentiated groups of accessions as high TE (n = 17) and low TE (n = 22). Among the three races of foxtail millet, indica was strong for T and TE, and maxima and moharia for HI, with useful exceptions.
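The T, TE and HI partitioning referred to above follows the familiar multiplicative yield framework; a minimal sketch is shown below. The units chosen (kg of water transpired per plant, g biomass per kg water, dimensionless HI) and the example numbers are illustrative assumptions, not the published lysimeter protocol.

```python
# Grain yield expressed through the T x TE x HI decomposition:
#   shoot biomass = T (total water transpired) * TE (biomass per unit water)
#   grain yield   = shoot biomass * HI (grain fraction of biomass)

def grain_yield(total_water_use_kg: float, te_g_per_kg: float, hi: float) -> float:
    """Grain yield (g/plant) from total water use, transpiration efficiency and harvest index."""
    shoot_biomass_g = total_water_use_kg * te_g_per_kg
    return shoot_biomass_g * hi

# e.g. 20 kg water transpired, TE of 3 g/kg and HI of 0.35 -> 21 g grain per plant
print(grain_yield(20.0, 3.0, 0.35))
```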
Maize (Zea mays L.) and faba bean (Vicia faba L.) have contrasting responses to low phosphorus (P) supply. The aim of this work was to characterise these responses with respect to the partitioning of biomass between shoot and root and the biochemical modification of the rhizosphere. Maize and faba bean were grown in rhizoboxes in soil with a low P (10 mg kg–1) or high P (150 mg kg–1) supply. Soil solutions were collected in situ from the rhizosphere and bulk soil by suction using micro-rhizons, and their pH and water-soluble P (Pi) were determined. Olsen P, soil pH and acid phosphatase activity were determined on samples of rhizosphere and bulk soil. Organic acids released from root tips were collected non-destructively and analysed by high performance liquid chromatography. Plants grown with low P supply had higher ratios of root : shoot dry weight than plants grown with high P supply, and this response was greater in maize than in faba bean. Rhizosphere acidification, organic acid concentrations and acid phosphatase activity were greater in faba bean than in maize. The Pi concentration in the maize rhizosphere solution was lower than in the bulk soil, whereas the Pi concentration in the rhizosphere solution of faba bean was greater than in the bulk soil. It was concluded that maize responded to low P supply by investing more biomass in its root system, whereas acidification, concentrations of organic acids, acid phosphatase activity and mobilisation of P in the rhizosphere were greater in faba bean than in maize.
Canola (Brassica napus L.) is an important rotational crop in the temperate cropping zone of southern Australia. Herbicide-resistant weeds are rapidly spreading and reducing canola grain yield and quality. Crop competition is a useful tool for reducing weed costs and dependence on herbicides, and for retarding the spread of herbicide resistance. The potential interaction of canola seeding rate and cultivar for weed management has not been quantified in Australia. A field experiment was conducted in three environments to examine the impact of two contrasting canola cultivars (a low-vigour type and a high-vigour hybrid) at four seeding rates (10–100 plants/m2) on volunteer wheat (∼50 plants/m2). Significant but variable effects of crop seeding rate, cultivar and weed presence were detected on canola density and grain yield, and on the suppression of volunteer wheat. The high-vigour hybrid suppressed volunteer wheat more than the less vigorous cultivar in all experiments. There was no benefit for weed suppression of increasing canola seeding rate above the normally recommended rate of 40 plants/m2. The seed production of volunteer wheat on average doubled when canola density dropped from 40 to 10 plants/m2. Treatment effects on canola grain yield losses from weeds were smaller than those on weed suppression. The grain yield of both cultivars was reduced by 30–40% by weeds at a canola density of 40 plants/m2, and yield plateaued above this density in weedy conditions. Maintaining canola plant establishment and using competitive cultivars are critical to avoiding weed seedbank replenishment and reducing canola yield losses from weed competition.
Irrigation was applied at different rates and frequencies during five consecutive periods of vegetative growth of the forage turnip Brassica rapa var. rapa cv. Barkant, grown in the field in north-west Tasmania, Australia, during the spring and summer of 1999–2000 (Season 1) and 2000–01 (Season 2). Irrigation applied before root expansion did not increase the dry matter (DM) of turnips (leaf plus root) in either season. At the following four harvests in each season, DM increased linearly in proportion to the cumulative amount of irrigation applied before the harvests. Irrigation water use efficiency, as measured by the slopes of the linear regressions, ranged from 5.7 to 17.2 kg DM ha–1 mm–1 in Season 1 and from 19.2 to 26.0 kg DM ha–1 mm–1 in Season 2. The effective use of water (EUW; yield increase/evapotranspiration within a period) was calculated for each of the five periods in Season 2 to identify the vegetative growth periods when the response (kg DM ha–1 mm–1) was greatest and limited irrigation water could be applied most effectively. EUW of irrigated turnip increased from 16.8 kg DM ha–1 mm–1 at the onset of root expansion to 53.5 kg DM ha–1 mm–1 when root growth rate was at its maximum, but declined thereafter. Limited irrigation water should therefore be applied between the onset of root expansion and approximately 8 weeks later, when the response to irrigation (kg DM ha–1 mm–1) was greatest.
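The two response measures defined above can be computed directly from the definitions given: water use efficiency as the regression slope of DM yield on cumulative irrigation, and EUW as yield increase per unit evapotranspiration within a period. A minimal sketch follows; the example numbers are placeholders (apart from illustrating the 53.5 kg DM ha–1 mm–1 magnitude reported), not the trial data.

```python
import numpy as np

def irrigation_wue(irrigation_mm, dm_kg_per_ha):
    """Slope of the linear regression of DM yield on cumulative irrigation (kg DM/ha.mm)."""
    slope, _intercept = np.polyfit(irrigation_mm, dm_kg_per_ha, 1)
    return slope

def euw(yield_increase_kg_per_ha: float, evapotranspiration_mm: float) -> float:
    """Effective use of water within a growth period (kg DM/ha.mm)."""
    return yield_increase_kg_per_ha / evapotranspiration_mm

print(irrigation_wue([0, 50, 100, 150], [4000, 5000, 6100, 6900]))  # placeholder values
print(euw(2140, 40))  # e.g. 2140 kg DM/ha gain over 40 mm ET -> 53.5 kg DM/ha.mm
```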
Excessive fertiliser is commonly applied in the soybean (Glycine max (L.) Merr.) cropping system on fertile Mollisols in Northeast China. It is therefore necessary to understand how reducing nitrogen (N) fertiliser application may affect plant N acquisition and remobilisation, which are associated with photosynthetic carbon (C) assimilation and seed yield. The aim of this study was to investigate the origin of plant N (i.e. derived from N2 fixation, fertiliser or soil) under two different levels of N application, and the subsequent influence on C assimilation. A pot experiment was conducted with soybean grown in a Mollisol supplied with 5 mg N kg–1 soil (N5) or 100 mg N kg–1 soil (N100). Nitrogen was applied before sowing as urea enriched to 19.83 atom% 15N excess, and 13CO2 labelling was performed at the R5 (initial seed-filling) stage. Plants were harvested at the R5 and full maturity stages to determine the 15N and 13C abundance in plant tissues. Seed yield and N content were not affected by the different N rates. Symbiotically fixed N accounted for 64% of seed N in treatment N5, whereas fertiliser-derived N dominated in N100, accounting for 58% of seed N. The proportion of soil-derived N in shoot and seed showed no difference between the two N treatments, and a similar trend was observed for whole-plant N. The enhanced N2 fixation in N5 significantly increased assimilation of N and C during the seed-filling period compared with N100. Nodule density (nodule number per unit root length) and the amount of photosynthetically fixed 13C in roots were greater in N5 than in N100. These results indicate that a greater contribution of N2 fixation to N assimilation during the seed-filling period is likely to meet N demand and maintain soybean yield when fertiliser N supply is reduced. Greater allocation of photosynthetic C to roots and enhanced nodulation would contribute greatly to the alteration of the N acquisition pattern under such conditions.
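For readers unfamiliar with 15N labelling, the fertiliser-derived fraction of plant N is conventionally estimated by isotope dilution of the label. The sketch below shows that bookkeeping for the fertiliser source only (partitioning the remainder between fixation and soil requires reference plants and is not shown). The 19.83 atom% 15N excess of the applied urea is from the text; the plant values are placeholders, and this is not the authors' exact calculation code.

```python
FERTILISER_ATOM_PCT_EXCESS = 19.83  # enrichment of the applied urea (from the text)

def pct_n_from_fertiliser(plant_atom_pct_excess: float) -> float:
    """Percentage of plant N derived from the labelled fertiliser (%Ndff) by isotope dilution."""
    return 100.0 * plant_atom_pct_excess / FERTILISER_ATOM_PCT_EXCESS

def n_from_fertiliser_mg(total_plant_n_mg: float, plant_atom_pct_excess: float) -> float:
    """Amount of plant N (mg) derived from the labelled fertiliser."""
    return total_plant_n_mg * pct_n_from_fertiliser(plant_atom_pct_excess) / 100.0

print(pct_n_from_fertiliser(2.5))       # placeholder plant 15N enrichment
print(n_from_fertiliser_mg(800.0, 2.5)) # placeholder total plant N content
```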
Depending on their depth, watertables can have a positive effect on plants by supplying water, a negative effect by creating waterlogged and/or saline conditions, or a neutral effect. Rhodes grass (Chloris gayana), a tropical perennial forage adapted to saline soils, floods and droughts, is a viable choice for the lowlands of the Pampas region of Argentina. The effects of watertable depth and salt concentration on the growth dynamics and biomass accumulation of Rhodes grass were quantified in a greenhouse experiment. The experiment consisted of 10 treatments, resulting from the factorial combination of five watertable depths (25, 75, 125, 175 and 225 cm) and two salt treatments (EC 1.4 and 20.5 dS m–1). The presence of a non-saline watertable at a depth of 25 cm produced 5-fold greater biomass and an increase in water consumption of equal magnitude compared with deeper watertables. The increase in shoot biomass was explained primarily by higher tiller and stolon densities, which increased 3.3- and 7.7-fold, respectively, at the 25 cm watertable depth compared with deeper treatments. Furthermore, groundwater use efficiency was 30% higher with non-saline watertables at 25 cm depth. Similarly, at this depth, the leaf blades were 50% longer than with the deepest watertables evaluated. In contrast, the presence of a saline watertable at 25 cm depth had a detrimental effect on biomass production and its components, whereas the effect at depths of 125 cm and greater was neutral. Therefore, Rhodes grass is a species that can take advantage of the widespread shallow watertable environments of the Pampas region as long as salinity levels are low.
Quantification of forage quality is essential for the identification of elite genotypes in forage grass breeding. Perennial ryegrass is the most important temperate species for global pastoral agriculture. However, the protein content of ryegrass generally exceeds the requirements of a grazing animal, and the ratio of water-soluble carbohydrate (WSC) to protein is too low for efficient protein utilisation. This results in poor nitrogen use efficiency (NUE) by livestock in the farming system, and hence limits optimal animal production. New ryegrass cultivars with optimised WSC and protein content are desirable for farming efficiencies. Several methods are available for quantification of WSC and plant protein, such as near-infrared spectroscopy (NIRS) and high performance liquid chromatography (HPLC). However, such methods are labour-intensive, low-throughput and cost-prohibitive for commercial breeding programs, which typically need to assess thousands of samples annually. An accurate, high-throughput, micro-plate-based protocol has been developed and validated, with the ability to simultaneously process and quantify WSC and plant protein with a high level of automation, an ∼10-fold increase in sample processing compared with commonly used methods, and a 3-fold cost reduction. As WSC and protein are extracted simultaneously and quantified within micro-plates, consumable costs are minimised with optimal reagent use efficiency, resulting in a low per-sample cost that is suitable for commercial pasture breeding companies. This is the first demonstration of a forage quality phenotyping protocol suitable for broad-scale application, and it will allow breeders to select elite genotypes based not only on visual assessment but also on WSC : protein ratios for improved ruminant nutrition.
Optimal evaluation and use of introduced germplasm for species improvement is an ongoing challenge. Research was conducted to survey a select set of introduced white clover (Trifolium repens L.) germplasm of broad geographic origins and to assess its genetic potential, based on F1 crosses to elite New Zealand cultivars. The bulk progeny generated from test crosses to the Grasslands cultivars Demand, Sustain and Kopu II were evaluated at Palmerston North under rotational grazing by sheep. The replicated trial consisted of the 26 germplasm accessions, the three cultivars used as maternal parents, and 78 F1 bulk progeny breeding lines. Three morphological traits and estimated seasonal dry matter yield were measured over four years. Significant (P < 0.05) genotypic variation was observed for all of these traits among the parents and F1 progeny lines. F1 progeny lines with trait values greater than the cultivars were identified. Significant (P < 0.05) genotype-by-season (σ2gs) and genotype-by-year (σ2gy) interactions were estimated for dry matter yield. Principal component analysis of the F1 progeny-by-trait BLUP matrix identified 16 elite progeny lines with mean seasonal dry matter yield equal to or higher than that of the cultivars. Half of these lines had Demand as the cultivar parent, whereas only three had Kopu II as a parent. Fourteen of these progeny lines were derived from crosses to Australasian-adapted germplasm. This study indicated that the choice of adapted cultivar with which to cross is important, and that introduced germplasm from Australasia is a valuable source of adaptive variation in these F1 progeny. More complex approaches may be needed to identify and use adaptive allelic variation from germplasm sources beyond Australasia.
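For illustration of the multivariate selection step described above, the following is a minimal sketch of principal component analysis applied to a progeny-by-trait BLUP matrix. The scikit-learn/pandas approach, file name and trait layout are assumptions for illustration, not the study's analysis pipeline.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# blups: rows = F1 progeny lines, columns = trait BLUPs (e.g. seasonal DM yield and
# morphological traits) -- placeholder structure.
# blups = pd.read_csv("progeny_blups.csv", index_col="line")

def pca_scores(blups: pd.DataFrame, n_components: int = 2) -> pd.DataFrame:
    """Standardise trait BLUPs and return principal component scores per progeny line."""
    scaled = StandardScaler().fit_transform(blups)
    pcs = PCA(n_components=n_components).fit_transform(scaled)
    cols = [f"PC{i + 1}" for i in range(n_components)]
    return pd.DataFrame(pcs, index=blups.index, columns=cols)

# Progeny lines scoring strongly on the component aligned with DM yield can then be
# shortlisted against the scores of the cultivar parents.
```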