Assessing the environmental implications of applying dairy cow effluent during winter using low rate and low depth application methods

Pages 449-469 | Received 24 Feb 2017, Accepted 09 Aug 2017, Published online: 24 Aug 2017

ABSTRACT

Dairy cow effluent collected over winter from a loose-housed barn was applied to a series of large infield plots (400 m2) using low rate and low depth (LRLD) application methods. Applications were confined to the winter period, when soil moisture content was often at or very near field capacity, and occurred over two seasons. Cows were confined to the housing facility during winter only; outside of this period they remained on pasture. Losses of nitrogen (N), phosphorus (P) and the faecal indicator bacterium Escherichia coli (E. coli) in surface runoff and subsurface drainage from the LRLD treatment were compared with losses from effluent applications that occurred during spring to autumn at an application depth not exceeding the soil water deficit, i.e. a standard practice treatment (SP, typically 10–15 mm per application). The annual quantities of nutrients applied by the treatments and the grazing managements imposed were similar. Although winter losses of N were significantly greater for the LRLD treatment (15 vs. 8 kg N ha−1 for the SP treatment), annual fluxes were similar between treatments (approximately 20 kg N ha−1 year−1). Effluent management had no significant effect on the annual fluxes of P and E. coli, although the latter varied considerably. Average contaminant fluxes over a 2-year period indicated that the LRLD management system did not lead to a significantly greater risk to water quality compared with standard practices.

Introduction

The New Zealand dairy industry has expanded and intensified in response to favourable returns from the sale of milk protein and fats (generically termed ‘milk solids’). Over the past 15 years, the national herd size has grown at a rate of approximately 102,000 cows per annum and the total (effective) dairy farmed area has increased by approximately 28,000 ha per annum (Livestock Improvement Corporation Citation2015). In many regions, trends in agricultural intensification have paralleled declines in water quality that have been attributed to increased losses of nutrients, sediment and faecal microorganisms to water from pastoral farming activities (PCE Citation2015). Regional Councils are setting Land and Water Regional Plans that require many farmers to reduce the quantity of nutrients leaving their farms. In response to these pressures, as well as the perceived benefit of improved cow performance, the use of off-paddock facilities (e.g. feed pads, stand-off pads and animal shelters) to house non-lactating cows during winter months has increased (Beukes et al. Citation2013). Implicit in the use of such facilities is a need to collect and manage winter-generated effluents and manures. On poorly drained soils, which are common throughout New Zealand, national guidelines for the safe application of effluent to land recommend that irrigation depths do not exceed the soil water deficit (SWD) present at the time that effluent is applied (Houlbrooke and Monaghan Citation2010). Effluent application depths in excess of the SWD pose a significant risk to surface water quality due to the rapid transmission of nutrients and contaminants via soil macropores. For instance, in Manawatu, 25 mm of effluent applied to a Pallic soil when the soil water content (SWC) was close to field capacity (FC) resulted in approximately 40% of effluent nitrogen (N) being lost directly into the mole-pipe drainage network via macropore flow (Houlbrooke et al. Citation2004a). Total losses from this single event (inclusive of surface flow losses) equated to 12 kg N ha−1 and 2 kg P ha−1, equivalent to approximately one third of the annual drainage N loss and twice the drainage P loss from grazed dairy pastures on Pallic soils without effluent application or winter grazing (Houlbrooke et al. Citation2003; Monaghan et al. Citation2005).

Problematically, in many dairying regions of New Zealand, poorly drained soils remain ‘wet’ (i.e. without SWD) throughout the winter and early spring seasons, preventing effluent from being applied. An off-paddock facility for wintering cows must therefore include sufficient storage to contain all the effluents generated during the period that cows are housed. For instance, in South Otago, New Zealand, where long-term (15 years) median annual rainfall and potential evapotranspiration values are 664 and 797 mm year−1, respectively, housing 150 cows in a covered barn of 700 m2 with a 1242 m2 uncovered feeding apron for 90 days during winter (20 May–18 August) would require storage for approximately 305 m3 of effluent (95th percentile of years between 1998 and 2012). The storage facility accounts for approximately 50% of the total infrastructural expense associated with installing an off-paddock facility (Laurenson et al. Citation2016) and is thus a significant cost. In this study, we sought to evaluate the merits of an alternative approach for handling winter-generated effluents that had potential to reduce storage capacity requirements. This approach was to apply effluent to land using low rate (i.e. 4 mm h−1) and low depth (2 × 1 mm day−1) application methods during winter (referred to as LRLD). The expected benefit (in terms of nutrient attenuation) from using LRLD effluent application methods during winter originates from our understanding of soil-water hydraulics, including storage and flow, which has been well described in previous studies (Beven and Germann Citation1981; Jarvis et al. Citation1991). It is generally accepted that, in wet soils, gravity-driven water flow via macropores will dominate over capillary flow via micropores (Beven and Germann Citation1982). The theoretical divide between gravity-dominated and capillary-dominated flow occurs when a soil is at FC; it is therefore assumed that rapid transmission of effluent constituents through macropores occurs when the application depth exceeds the SWD. In New Zealand, FC is defined as the SWC at −10 kPa (NEMS Citation2013). However, as many studies point out (NEMS Citation2013; Ottoni Filho et al. Citation2014), the FC of a given soil may occur at matric potentials between −2 kPa and −30 kPa and may vary across both spatial and temporal scales. It is likely that combinations of pore sizes are involved in the drainage of surplus water (i.e. that in excess of FC), resulting in different rates of transmission. This in turn will affect the rate and extent of water mixing and the diffusion of nutrients within the soil matrix. A high degree of water redistribution is expected to occur as an irrigation front moves into a wet soil via macropores (i.e. when antecedent SWC = FC). For instance, Klaus et al. (Citation2013) estimated irrigation water to comprise only 20% of the resulting drainage volume, while the remaining 80% of drainage comprised water that was previously stored in the soil and displaced by the ‘new’ water. As noted by Beven and Germann (Citation1982), at low rates of application, a high degree of edge flow around the macropore occurs, encouraged by the tortuous configuration of macropores as well as their complex size arrangement. This in turn enables a high degree of mixing between resident and invading water (Klaus et al. Citation2013), allowing contaminants to interact with the soil surfaces.
We therefore hypothesise that a high degree of nutrient attenuation will be possible when winter-generated effluent is applied to land using LRLD application methods, even when soils are relatively wet. If proven correct, such a strategy could reduce winter effluent storage requirements, and thus avoid much of the cost of building new, or retrofitting existing, effluent systems when installing off-paddock facilities. It would also provide an option for applying effluent to land on occasions when effluent ponds are full, as sometimes happens following prolonged periods of wet weather during spring. Our specific objectives were to compare fluxes of contaminants in drainage and overland flow from a LRLD treatment, whereby small depths of effluent were frequently applied throughout winter, with a standard practice (SP) treatment, whereby effluent collected during winter was applied during the milking season (September to May) at comparatively greater depths (i.e. 10–15 mm per event), in line with industry-accepted ‘good management practice’ (DairyNZ Citation2015). The loose-housed barn from which effluent was sourced for this study was used during winter only. For the remainder of the year cows were grazed on pasture, as is standard practice for New Zealand dairying systems.

Materials and methods

Research site and facilities

The experimental site was located on the Telford Agricultural College Dairy Farm near Balclutha, South Otago, New Zealand (46° 17′S 169° 43′E). Long-term (15 years) weather data (Tait et al. Citation2006) indicate this region has a median annual rainfall of 658 mm year−1 and median annual evapotranspiration of 748 mm. The average monthly rainfall depth is similar across all months of the year, at approximately 55 ± 22 mm month−1. A weather station was installed at the site to monitor wind speed and direction, soil and air temperature, rainfall and solar radiation inputs. The poorly drained soil at the site is classified as a Mottled Fragic Pallic soil under the New Zealand soil classification (Hewitt Citation2010), or an Aeric Fragiaquept under the USDA taxonomy system (Soil Survey Staff Citation2014). Soil fertility was monitored through an annual regime of soil sampling (0–75 mm depth; Table 1).

Table 1. Selected soil and farm properties for the Telford experimental site.

The experiment comprised 14 hydraulically isolated plots (20 × 20 m) established on slightly sloping (approximately 2–3°) terrain in 2011, approximately 12 months prior to the first irrigation treatment being imposed (Figure 1). A 0.9 m deep trench was excavated along the upper three sides of each plot and 450 µm thick black plastic sheeting was placed along the exposed trench face to prevent water flow across the plot boundary. A perforated drainage pipe was installed along the base of the trench, on the external face of the plastic lining, to intercept external drainage and direct it away from the plot area into a drainage network. This stopped any external subsurface flow from entering the plots. The trench was then backfilled with soil and a bund 150–200 mm high was created at the surface around the plot perimeter to prevent surface flow entering the plots. A sloping trench was excavated along the downslope end of each plot to a depth of 600 mm, increasing to 800 mm at the point where subsurface drainage collected by a perforated drainage pipe (150 mm diameter) exited each plot. The trench was then backfilled with washed river gravel approximately 10 mm in diameter. A 200 mm high bund was built at the surface of the trench to direct surface flow into a drainage pipe without allowing it to flow into the subsurface drain. For each plot, subsurface and surface drainage were directed to separate 5 L tipping buckets (i.e. 28 tipping buckets in total) installed in monitoring stations located in a small gully at the edge of the paddock. Flow-proportional samples of drainage were collected by a siphon system located on one side of the tipping bucket spillway that collected and stored 0.8% of each tip. In February 2012 (approximately 6 months prior to the first irrigation treatment being imposed), mole drainage was created in each plot by pulling a mole chisel plough through the soil at a depth of 450 mm. The mole channels were installed at 1.5 m spacing and ran lengthwise down the slope to the subsurface drainage trench.

Figure 1. A, Schematic diagram of the trial site showing the layout of plots and assignment of treatments. B, The design of the plots and C, the drainage collection system.


An irrigation system was built in May 2012 to distribute effluent from a 33 m3 tank to a low-rate sprinkler (K-line, RX Plastics, Ashburton, New Zealand) mounted in the centre of each plot. The tank was filled periodically via a pipe from the main effluent pond located on the farm. A remotely operated computer was installed to enable single or multiple nozzles to be activated for irrigation at any one time. A date- and time-stamped record of irrigation (inflow) and drainage (outflow) was collected every minute; weather data were collected at the same frequency.

Plots received maintenance applications of P as single superphosphate (monocalcium phosphate, Ca(H2PO4)2), applied between December and January each year at rates of 29 and 34 kg P ha−1 in years 1 and 2, respectively. Agricultural lime (CaCO3) was also applied annually (December/January) at a rate of 269 kg ha−1 to maintain a pH(1:5 water) within the range of 5.8–6.2. Soil macroporosity, bulk density and plant available water contents were determined at the start of the experiment using the methods described in Houlbrooke and Laurenson (Citation2013).

Irrigation scheduling

Drainage flows in response to rainfall were monitored across the 14 plots for approximately 8 months (during 2011–2012) prior to the start of the experiment and installation of the irrigation system (i.e. no irrigation was applied in that period). These flow data were used to assign experimental treatments so that the variation in flow volume among individual plots was evenly distributed across the two treatments: the LRLD and SP effluent management strategies. During winter (10 June to 30 August), effluent was applied to the LRLD plots twice a day at a rate of 4 mm h−1 and a depth of 1 mm per application (i.e. 2 mm in total day−1), with a 6-h interval between the two daily applications. Effluent was applied regardless of SWD when the following criteria were met: rainfall in the preceding 24 h < 4 mm, air temperature > 4°C, and average wind speed < 4 m s−1 (a minimal sketch of these scheduling rules is given below). Scheduling focused on achieving a target cumulative N loading of 80 kg N ha−1 over the winter period rather than on ensuring effluent was applied every day; there were, therefore, days when effluent was not applied. During winter, the SP treatment received rainfall only. Effluent was applied to SP plots during summer at a rate of 4 mm h−1 and at depths that did not exceed SWD values, and never exceeding 15 mm in a given day. The same meteorological and N-loading criteria used for the LRLD treatment were applied when scheduling effluent applications to the SP treatment. Farm dairy effluent (FDE) generated from the yards and milking parlour during the lactation season as part of routine milking operations (as opposed to barn effluent) was directed to areas of the farm outside the research site that accommodated the drainage plots.
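The daily decision logic can be expressed compactly. The following Python sketch encodes the scheduling rules described above; the function and field names are illustrative assumptions rather than the study's software (scheduling was implemented via a remotely operated computer controlling the sprinkler nozzles).

```python
from dataclasses import dataclass

@dataclass
class DayConditions:
    rain_last_24h_mm: float   # rainfall in the preceding 24 h (mm)
    air_temp_c: float         # air temperature (deg C)
    wind_speed_ms: float      # average wind speed (m/s)

def may_apply_effluent(day: DayConditions, n_applied_kg_ha: float,
                       n_target_kg_ha: float = 80.0) -> bool:
    """Decide whether LRLD effluent may be applied on a given winter day.

    Criteria from the text: rainfall in the preceding 24 h < 4 mm,
    air temperature > 4 deg C, average wind speed < 4 m/s, and the
    cumulative winter N loading target (80 kg N/ha) not yet reached.
    If True, 2 x 1 mm would be applied at 4 mm/h, 6 h apart.
    """
    if n_applied_kg_ha >= n_target_kg_ha:
        return False              # seasonal N target already met
    return (day.rain_last_24h_mm < 4.0
            and day.air_temp_c > 4.0
            and day.wind_speed_ms < 4.0)

# Example: a calm, dry day with 35 kg N/ha applied so far -> True
print(may_apply_effluent(DayConditions(1.2, 6.5, 2.0), 35.0))
```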

Effluent generation from the barn facilities

The farm on which this experiment was located collected effluent drainage from a loose-housed barn with bark and sawdust bedding. The barn was used to house approximately 95 cows for 24 h a day during winter (approximately 10 June to 20 August) each year. The basic barn structure was a roofed area over the bark bedding (10 m × 72 m) with an adjacent concrete feeding strip (5 m × 72 m) and feeding troughs (1 m × 72 m), both of which were uncovered. Barn effluent, combined with scrapings from the feeding strip, was screened through a weeping wall system: a passive solid separation system in which raw effluent is held behind a wall constructed of wooden slats or plastic panels, allowing liquid to seep through the gaps and leaving the solid effluent (approximately 4 mm or greater in size) behind. The resulting liquid fraction was a relatively concentrated effluent that was then pumped into the effluent pond used to collect FDE during the lactation season (i.e. the ‘main pond’), where it was stored. Across the 2 years of this experiment, two effluent storage scenarios were considered within the experimental design: (1) effluent generated in winter was stored in the main pond (i.e. that described above) and (2) the concentrated effluent (i.e. the liquid fraction post solid separation) generated from the wintering barn was pumped directly to storage tanks that prevented dilution from rainfall. Nutrient concentrations in the second effluent stream were therefore considerably greater than those of the main pond. In year 1, effluent from the main pond was used to replenish the tank located at the experiment site.

The original plan for year 2 was to utilise the concentrated effluent from the weeping wall (as described above). However, this was not possible due to constraints with pumping and piping logistics. To generate an effluent with nutrient loadings similar to those from the weeping wall, urea (CH4N2O), di-ammonium phosphate ((NH4)2HPO4; DAP) and fresh cow dung were added to the effluent holding tank in year 2 prior to effluent applications in winter and summer. The fresh cow dung was added to raise the faecal microbe concentration. In winter, analysis of samples collected from the weeping wall on filling of the holding tank enabled the quantities of urea and DAP required to deliver a standardised (based on total N and P) concentrated effluent stream to be determined, after accounting for the nutrients added with the fresh dung (the calculation is sketched below). In summer, the target effluent N and P concentrations were set so as to emulate the average monthly concentrations used in winter. Sufficient mixing of the tank was ensured by circulating the effluent via the pump for approximately 1 h prior to application to treatments.
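As an illustration of this dosing step, the sketch below sizes the DAP addition on the P shortfall (DAP being the only P source among the supplements) and makes up the remaining N with urea, using pure-compound stoichiometry. The tank volume, measured concentrations and targets in the example are hypothetical; the study's actual targets are not reproduced here.

```python
# Mass fractions from pure-compound stoichiometry
UREA_N = 28.02 / 60.06     # N in urea, CH4N2O (~0.467)
DAP_N = 28.02 / 132.06     # N in DAP, (NH4)2HPO4 (~0.212)
DAP_P = 30.97 / 132.06     # P in DAP (~0.235)

def fertiliser_additions(volume_m3, current_n, current_p, target_n, target_p):
    """Return (kg urea, kg DAP) needed to raise a tank from the current to the
    target total N and P concentrations (all concentrations in g m-3)."""
    # P can only come from DAP, so size the DAP addition on the P shortfall.
    p_deficit_kg = max(target_p - current_p, 0.0) * volume_m3 / 1000.0
    dap_kg = p_deficit_kg / DAP_P
    # DAP also supplies some N; urea makes up the remaining N shortfall.
    n_deficit_kg = max(target_n - current_n, 0.0) * volume_m3 / 1000.0
    urea_kg = max(n_deficit_kg - dap_kg * DAP_N, 0.0) / UREA_N
    return urea_kg, dap_kg

# Hypothetical example: a 33 m3 tank at 150 g N m-3 and 20 g P m-3,
# targeting 400 g N m-3 and 50 g P m-3 (after accounting for dung inputs)
print(fertiliser_additions(33.0, 150.0, 20.0, 400.0, 50.0))
```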

Animal and pasture management

Pasture at the experimental site comprised a perennial ryegrass (Lolium perenne) and white clover (Trifolium repens) sward that was rotationally grazed as part of the wider farm and stock management regime. Average pasture mass across the paddock was measured weekly using a rising plate meter. This information was used to schedule grazing rotations to maintain average pasture mass between 1500 and 3000 kg DM ha−1. Average pasture height across individual plots was measured from no fewer than 50 readings per plot, made immediately pre- and post-grazing of the paddock. Pastures were grazed on approximately eight occasions per annum, beginning in early spring (September) and finishing in late autumn (May). Each grazing event lasted for approximately 42 h (2 × 21 h day−1) and the average stocking density per event was 59 cows ha−1. For each grazing event, cows were provided access to the whole paddock (1.9 ha); no distinction of plot boundaries was visually evident. For two grazing events that occurred in autumn (31 March and 11 May) of the second year, four plots from each of the LRLD and SP treatments were removed from the experimental design. These plots were used to assess the effect of duration-controlled grazing during autumn (with and without winter-applied effluent) on total nutrient losses and are not described in this manuscript. The remaining three plots were grazed for the usual 21 h per day for 2 days.

Drainage and effluent measurement and sampling

Drainage samples were collected weekly during winter and after heavy (approximately >10 mm) rainfall events. During summer, samples were collected whenever drainage was generated, usually within 24 h of drain flows ceasing. A sample of the effluent applied to plots was collected weekly during winter and immediately before every application that occurred during summer. Effluent and drainage samples were subjected to the same analytical procedures. A portion of each sample was filtered through a 0.45 µm membrane and the remainder left unfiltered. E. coli concentrations (most probable number (MPN) per 100 mL) in the unfiltered drainage and effluent samples were determined on the day of sample collection from the field using the Colilert-Quanti-Tray system (IDEXX Laboratories, USA). E. coli was measured in this study because it is the preferred faecal indicator bacterium for freshwater in New Zealand (McBride et al. Citation1991). Ammonium-N (NH4-N), nitrate-N (NO3-N) and nitrite-N (NO2-N) analyses were undertaken colorimetrically using a Skalar SAN++ segmented flow analyser (Mulvaney Citation1996). Total Kjeldahl nitrogen (TKN) was determined following acid digestion and distillation of samples (APHA Citation1998) and total N (TN) was calculated as the sum of TKN and NO3-N. The total N in combined particulate and dissolved organic forms (DON + PN) was calculated as TN minus the sum of NO2-N, NO3-N and NH4-N. Total P (TP), total filterable P (TFP) and filterable reactive P (FRP) concentrations were determined colorimetrically using a SEAL AutoAnalyser 3 and the phospho-molybdenum complex at 880 nm (Murphy and Riley Citation1962), following a persulphate digestion for TP and a UV digestion for TFP. Dissolved organic P (DOP) was calculated as the difference between TFP and FRP. These derived fractions are summarised in the equations below.
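The derived N and P fractions therefore follow directly from the measured analytes:

\[
\begin{aligned}
\mathrm{TN} &= \mathrm{TKN} + \mathrm{NO_3\text{-}N}\\
\mathrm{DON + PN} &= \mathrm{TN} - (\mathrm{NH_4\text{-}N} + \mathrm{NO_3\text{-}N} + \mathrm{NO_2\text{-}N})\\
\mathrm{DOP} &= \mathrm{TFP} - \mathrm{FRP}
\end{aligned}
\]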

Losses of P and E. coli in mole-pipe drainage flows were calculated for each event as the product of measured drainage volume and concentration. For calculations of N losses, total drainage volume was determined from a daily soil water balance model to account for N that may have drained between the mole drains and was, therefore, not captured by the tipping bucket measurement system. It was assumed that the transmission of P and E. coli via deep drainage beyond the mole-pipe drains was insignificant. The conceptual structure of the soil water balance model was essentially that described by Scotter et al. (Citation1979) and Woodward et al. (Citation2001) and referred to by Monaghan et al. (Citation2016). Briefly, it was assumed that the ratio of actual evapotranspiration to potential evapotranspiration had a value of 1.0 when soil moisture contents were between FC and a limiting SWD of 50% of plant available water. Thereafter, this ratio decreased linearly to become zero at the permanent wilting point. Potential evapotranspiration was measured at the experimental site.
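A minimal sketch of a daily water balance of this form is given below, assuming a placeholder plant available water (PAW) value rather than the site's measured soil properties; the study's model additionally used site-measured potential evapotranspiration and effluent inputs.

```python
def daily_water_balance(swd_mm, water_in_mm, pet_mm, paw_mm=100.0):
    """Advance the soil water deficit (SWD) by one day; return (SWD, drainage).

    swd_mm: deficit below field capacity at the start of the day (0 = at FC).
    water_in_mm: rainfall plus any effluent applied that day.
    Following the text: AET/PET = 1.0 while SWD <= 50% of PAW, then declines
    linearly to zero at the permanent wilting point (SWD = PAW).
    """
    limiting_swd = 0.5 * paw_mm
    if swd_mm <= limiting_swd:
        aet = pet_mm                                  # freely transpiring
    else:
        aet = pet_mm * max(0.0, (paw_mm - swd_mm) / (paw_mm - limiting_swd))
    swd = swd_mm - water_in_mm + aet                  # inputs shrink the deficit
    drainage = max(0.0, -swd)                         # water in excess of FC drains
    swd = min(max(swd, 0.0), paw_mm)
    return swd, drainage

# Example: a soil at FC receiving 12 mm of rain with 1 mm of PET drains ~11 mm
print(daily_water_balance(0.0, 12.0, 1.0))            # (0.0, 11.0)
```

Event N loads were then obtained as the product of the modelled drainage volume and the measured drainage N concentration.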

Statistical analysis

For each measuring date, the concentrations of contaminants in drainage or surface runoff flows were compared between treatments using a linear mixed model (LMM). The LMM modelled the correlation among concentration values measured repeatedly over time from the same plots by assuming a compound-symmetric covariance structure. The LMM also included the full factorial combination of two factors, treatment and measurement date, both as fixed effects. This LMM analysis was carried out with SAS (version 9.3) statistical software. For each contaminant flux, comparisons between treatment groups within each season or year were assessed using analysis of variance (ANOVA). Each ANOVA included a single factor, Group, whose levels were defined by the combination of year × treatment. Prior to the ANOVA, total N and total P values were log transformed to stabilise their variation. For E. coli, variables were compared between treatment groups within each seasonal year using a generalised linear model (GLM), which assumed group-specific negative binomial distributions with a log link function. The groups in each of these GLM analyses were defined by the same single factor Group as in the ANOVA for the other contaminants. Pasture growth rates were compared for each grazing event; significant differences in pasture DM production between treatments were determined using a Student’s t-test (n = 7 for all dates prior to 31 March 2015; for remaining sampling dates, n = 3). All reported differences between treatments are significant at the 5% level unless otherwise stated.
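For readers working outside SAS, a rough Python analogue of these analyses can be written with statsmodels: a plot-level random intercept induces the compound-symmetric covariance described above, and a negative binomial GLM with a log link handles the E. coli counts. The data file and column names below are assumptions, and the negative binomial dispersion is fixed here rather than estimated per group as in the study.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("drainage_concentrations.csv")  # hypothetical data file

# Repeated-measures LMM: treatment x date fixed effects, plot as random intercept
lmm = smf.mixedlm("conc ~ C(treatment) * C(date)", df, groups=df["plot"]).fit()
print(lmm.summary())

# Negative binomial GLM (log link) for E. coli, grouped by year x treatment
glm = smf.glm("ecoli ~ C(group)", df,
              family=sm.families.NegativeBinomial()).fit()
print(glm.summary())
```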

Results

Rainfall and effluent application to plots

Total annual rainfall and evaporation were generally similar across years and aligned with long-term (15 years) climatic patterns for the region (Figure 2). For each 12-month monitoring year (10 June to 9 June), rainfall occurred on 217 days, yet on only 25% of these days was the rainfall depth greater than 4 mm. Approximately 150 mm (±7.6 mm between years) of rainfall was received during winter each year. Average daily evapotranspiration rates were 2.7, 3.8, 1.4 and 0.7 mm for spring, summer, autumn and winter, respectively, and total evapotranspiration was approximately 780 mm between June and May (averaged over years 1 and 2).

Figure 2. Total monthly rainfall and evaporation (mm) at the research site between 10 June 2013 and 1 June 2015. Long-term (15-year average) monthly average rainfall (+) and potential evaporation (…) values are also shown.


The average total depths of effluent applied to the LRLD treatment during winter were 120 mm (SD ± 8.5 mm) and 44 mm (SD ± 0.5 mm) in years 1 and 2, respectively. The average depths of effluent applied to the SP treatment during summer were 68 mm (SD ± 4.0 mm) and 48 mm (SD ± 1.5 mm) in years 1 and 2, respectively. SWD values remained low (i.e. generally <2 mm) during both winters and were similar for the LRLD and SP treatments. Because soils in the LRLD treatment were at or near FC, effluent application had minimal effect on soil water storage status (i.e. the water held in the soil at a matric potential of −10 kPa or less). In both years of the experiment, SWD values remained low for much of spring, preventing effluent application to the SP treatment until late November in year 1 and late January in year 2. SWD values were greatest during December to February and declined rapidly in autumn. This indicates that the window for effluent irrigation is short at this location if effluent is to be applied only when there is a SWD.

Average nutrient loadings in the effluent applied across the 2 years were 90 and 76 kg N ha−1 year−1 for the LRLD and SP plots, respectively (Table 2). Ensuring similar quantities of nutrient application to the LRLD and SP treatments proved to be a problem, despite routine analysis of effluent samples. In year 1 of the experiment, the LRLD treatment received approximately 26 kg more N per hectare than the SP plots (and the E. coli loading to the SP plots was 10-fold greater than that to the LRLD plots). Additional inputs of nutrients and faecal microorganisms would have occurred via the urine and faeces deposited by cows when grazing plots during the milking season; these contributions have not been quantified here but are assumed to be similar between treatments.

Table 2. Nutrient and faecal (E. coli) inputs to treatments via effluent applied during the 2-year experiment. Additional nutrients applied in dung and urine during grazing were not quantified and are expected to be similar across treatments.

Volumes of drainage and overland flow

The total water (surface and subsurface) flows drained from plots during year 1 of the study were 220 mm (SD ± 20 mm) and 170 mm (SD ± 33 mm) for the LRLD and SP treatments, respectively. In year 2, average drainage totals were less, at 140 mm (SD ± 12 mm) and 128 mm (SD ± 25 mm), respectively. Subsurface drainage was the predominant pathway of water and contaminant loss from the site, accounting for more than 97% (± 1%) of the total water collected from the LRLD treatment and approximately 90% (± 2.5%) of all water collected from the SP treatment. Volumes of subsurface drainage were significantly greater during winter than during the remainder of the year, comprising 68% (SD ± 6%) and 48% (SD ± 10%) of the annual water volumes leaving the LRLD and SP treatments, respectively. Total drainage during spring and summer of both years was less than 20 mm (<13% of the total water loss). Effluent application to the LRLD plots increased subsurface drainage volumes during winter by 42% and 30% in years 1 and 2, respectively (Figure 3). These increased volumes of winter drainage closely align with the 45% and 24% increases in hydraulic loading due to effluent application in years 1 and 2, respectively. Because the depth of effluent applied to SP plots was always less than the SWD, drainage never occurred as a direct response to application. During autumn (1 March to 30 May), drainage was, however, greater from the SP treatment due to the relatively recent applications of effluent.

Figure 3. Cumulative drainage from the LRLD and SP treatments between June 2013 and December 2015. Effluent was applied to LRLD plots between June and August each year, while similar loadings of effluent (based on kg N ha−1) were applied to the SP treatment between November and May of each year. Values represent the mean of seven replicates, except for measurements made from 14 May 2015 onwards, in which case n = 3. Rainfall (mm) at the site is also shown as inverted bars.


Application of a more concentrated effluent in year 2 of the experiment meant that the hydraulic loading required to meet the prescribed cumulative N input (75 kg N ha−1) was less than that implemented during year 1 (94 and 46 mm for years 1 and 2, respectively, averaged across all plots). While effluent applications to the LRLD treatment ceased on 24 August in both years, commencement dates in years 1 and 2 were 10 June and 3 July, respectively; consequently, effluent was applied on fewer days during winter in year 2 (52 days) than in year 1 (76 days). This was an important factor contributing to the significantly (P < .05) reduced losses of N, P and faecal microorganisms measured in drainage and surface runoff during year 2 of the study, despite broadly similar nutrient loadings via effluent applications each year (Table 2).

Concentrations of contaminants in subsurface drainage

The overwhelming majority of nutrients (e.g. 93% and 85% of total N and P, respectively) were lost in subsurface drainage; the focus of our analysis is therefore directed to this loss pathway. Concentrations of NH4-N in drainage were low compared with other measured forms of N (Figure 4A). Furthermore, NH4-N concentrations were considerably less than those of the effluent applied, indicating that preferential flow via large macropores was minimal. The greatest ammonium concentration was measured in a drainage event that occurred in April 2014, 2 days after a grazing event, and was significantly greater in the SP than in the LRLD treatment. On all other occasions, there were no significant differences in ammonium concentrations between treatments.

Figure 4. Concentrations (g m−3) of A, ammonium (NH4-N), B, nitrate (NO3-N), C, dissolved organic nitrogen (DON) and particulate nitrogen (PN), and D, total nitrogen (TN) in subsurface drainage from the LRLD and SP treatments. Values represent the mean of seven replicates except measurements made from 14 May 2015 onwards, in which case n = 3. Grazing events throughout the experimental period are shown by triangle symbols. For each sampling event, statistically significant differences (P < .05) between treatments are indicated with a cross.


Concentrations of NO3-N (Figure 4B), DON + PN (Figure 4C) and total N (Figure 4D) followed expected patterns of loss (Monaghan et al. Citation2016), being greatest in autumn and coinciding with the period when drainage commenced each year. There were also significant peaks in NH4-N, NO3-N, DON + PN and total N concentrations in both the SP and LRLD treatments that occurred outside of the autumn drainage period. Elevated concentrations of contaminants in drainage appeared to be closely linked with grazing events that occurred shortly before a significant (e.g. >5 mm) rainfall event. During the winter period, nitrate concentrations in drainage were not significantly different between the SP and LRLD treatments, despite the greater drainage volume in the latter. The exception was a significantly greater nitrate-N concentration in drainage from SP plots in August 2013, although this was associated with a very small volume of drainage (<1 mm).

Concentrations of FRP (Figure 5A), DOP (Figure 5B) and total P (Figure 5C) remained below 3.5 mg L−1 throughout the year, with no clear increase during autumn as observed for the N analytes. Significantly higher concentrations of FRP, DOP and TP were measured in drainage from the LRLD treatment in July 2013 (i.e. during the period that effluent was applied). These peaks in P concentrations were recorded for a small (3 mm) yet significant drainage event. More sizeable peaks in P concentrations (including FRP, DOP and TP) were measured in drainage from the SP treatment for two events in February and March 2014, corresponding to grazing events that shortly preceded rainfall and drainage events; drainage volumes in both instances were, however, very small (less than 0.7 and 0.07 mm, respectively). For both treatments, peaks in FRP and total P concentrations were measured on 12 February 2015, again associated with a very small drainage event (<0.06 mm) observed in both treatments. Concentrations of E. coli tended to be high in autumn and then progressively declined during winter, presumably reflecting a degree of source (i.e. faecal input) limitation as bacteria were eluted from the soil in drainage. There were no significant (P > .05) differences in the concentrations of E. coli between the SP and LRLD treatments; as observed for N and P, peak concentrations occurred during small drainage events in the spring and summer months (Figure 6).

Figure 5. Concentrations (g m−3) of A, filtered reactive phosphorus (FRP), B, dissolved organic phosphorus (DOP), and C, total phosphorus (TP) in subsurface drainage from the LRLD and SP treatments. Values represent the mean of seven replicates except for measurements made from 14 May 2015 onward, in which case n = 3. Grazing events throughout the experimental period are shown by triangle symbols. For each sampling event, statistically significant differences (P < .05) between treatments are indicated with a cross.


Figure 6. Concentrations of E. coli (MPN 100 mL−1) in subsurface drainage from LRLD and SP treatments. Values represent the mean of seven replicates except measurements made from 14 May 2015 onwards, in which case n = 3. Grazing events throughout the experimental period are shown by triangle symbols. For each sampling event, statistically significant differences (P < .05) between treatments are indicated with a cross.


E. coli concentrations in subsurface drainage from the SP and LRLD treatments were significantly different for a number of sampling events occurring in year 1. However, there was no trend that suggested concentrations of E. coli in subsurface drainage were consistently greater under SP or LRLD treatments. Generally, E. coli concentrations in subsurface drainage were greatest in early (October) and late (April) season drainage.

Fluxes of contaminants in combined drainage and surface runoff

As the predominant pathway of water and contaminant loss was via the mole-pipe drainage system, values for contaminant fluxes via surface and subsurface drainage flow pathways were combined. These are considered below on a seasonal (winter) and annualised basis. During the winters of years 1 and 2, total fluxes of N in subsurface and surface flows were significantly (P < .05) greater from the LRLD than from the SP treatment (Table 3). In contrast, fluxes of total P and E. coli during the same period were not significantly (P > .05) different between treatments. Fluxes of N and P were significantly lower in the winter of year 2 than in year 1 for both treatments. Annual fluxes of N and P in combined subsurface and surface flows were not significantly different between treatments in year 1 of the experiment, despite the greater winter losses recorded for the LRLD treatment (Table 3). The N flux in year 2 was significantly (P < .05) greater from LRLD plots, with an additional 4 kg N ha−1 being lost compared with SP plots. Despite this, there were no significant differences in the total cumulative fluxes of N and P between treatments over the 2-year experimental period. Fluxes of E. coli in annual drainage during year 1 were significantly greater for the SP treatment than for the LRLD plots; no significant difference was observed during year 2.

Table 3. Total seasonal and annual per-hectare losses of N, P and E. coli in combined surface and subsurface flows from hydraulically isolated plots (mean ± SEM; n = 7 unless otherwise indicated).

Pasture growth

Pasture growth rates following winter applications of effluent to the LRLD plots were significantly (P < .05) greater than those observed in the SP treatment (Figure 7). The average annual pasture growth measured in the LRLD treatment was 1322 and 866 kg DM ha−1 for years 1 and 2, respectively (three sampling events during year 2 of the trial were not recorded for the SP or LRLD treatments). For the SP treatment, average annual pasture growth was 1206 and 941 kg DM ha−1 for years 1 and 2, respectively. Approximately 1000 kg DM ha−1 more pasture was measured in the LRLD plots during early spring of each year. In contrast, total pasture grown in the SP plots during summer and autumn of each year was consistently greater than that observed in the LRLD treatment, although these differences were small and generally not significant (P > .05).

Figure 7. Pasture growth (kg DM ha−1) measured between grazing events from September 2013 to October 2015. For each value, n = 7 except grazing events from 31 March 2015 to October 2015 where n = 3. For each sampling event, significant differences between treatments are marked as: * (P < .05), ** (P < .01), *** (P < .001).


Discussion

In both years of the experiment, N and P losses were predominantly confined to late autumn and winter, when most of the drainage occurred. This is also a period when plant growth is limited by low temperatures and N demand is consequently low (de Klein et al. Citation2006). Under typical cow management practices, a surplus of N accumulates in the soil during this period and is leached from the soil profile when drainage occurs (Christensen et al. Citation2012; Cichota et al. Citation2012; Selbie et al. Citation2015). Accordingly, a high proportion of the annual N flux in drainage from both treatments in the current experiment is expected to be associated with the deposition of urinary N by grazing cows; this is consistent with reviews by Ledgard (Citation2001) and Cameron et al. (Citation2013), which identify animal excretion as the primary driver of N loss from pastoral farms. Application of effluent to the SP treatment during autumn had the combined effect of increasing the quantity of surplus N in the soil prior to the onset of late autumn drainage and increasing drainage volumes due to the elevated soil moisture contents that were maintained. Notably, drainage and the transport of N within the SP treatment did not occur in direct response to the application of effluent; application depths were always less than SWD values, and incidental losses of effluent were thus avoided.

The application of effluent to LRLD plots during winter significantly increased N losses in subsurface drainage during this period. This was closely aligned with the additional hydraulic loading in this treatment and the resulting increase in drainage volumes. As a consequence of the more concentrated effluent applied in year 2, total N losses were significantly reduced. This can be attributed to the reduced hydraulic loading applied to the LRLD treatment that enabled the start of effluent applications to be delayed in this year to avoid periods of heavy rainfall. Despite the lower total losses of N in year 2, a similar temporal pattern of N loss was observed whereby winter losses from the LRLD treatment were significantly greater than those from the SP treatment. On an annual basis, the management imposed in the LRLD treatment led to greater N losses in year 2, yet over the 2 years there was no statistically significant difference between treatments.

Where temperatures allow, nutrient losses to surface waters can encourage the growth of nuisance aquatic macrophytes and algae if the concentrations, forms and ratios of N and P are sufficient for growth (McDowell et al. Citation2009). Across New Zealand, most regional authorities are in the process of defining frameworks for managing land-based activities, including pastoral farming, so that nutrient concentrations in surface water bodies remain below thresholds that enable eutrophication (OECD Citation2017). Generally, threshold values are developed in the context of local river flow processes that influence contaminant concentrations. In Otago, for instance, permitted activity discharge thresholds (Schedule 16, Discharge Thresholds) for water quality will come into effect from 2020 (Otago Regional Council Citation2016). For Balclutha, where this trial was located, the concentration of contaminants in tile drain and overland flows must be maintained below 3.6 mg L−1 for nitrate-nitrite N (NNN) and 0.045 mg L−1 for FRP when stream flow, as measured at the nearest reference site (Catlins at Houipapa; http://water.orc.govt.nz/WaterInfo/Site.aspx?s=CatlinsFlow), is at or below median flow (2.34 m3 s−1). Average monthly flow (recorded between 1993 and 2017) at this reference site is lowest during February and March (1.99 and 2.07 m3 s−1, respectively) and highest from May to August (5.4, 7.3, 5.9 and 5.1 m3 s−1 for May, June, July and August, respectively). At a broad level, nutrient losses to surface waters that occur during winter (i.e. as under the LRLD effluent management strategy evaluated here) are expected to have a lesser impact on surface water quality than losses that occur during summer months, when flows through the river network are much lower and temperatures greater. During the 2 years of this study there were, however, periods, including during winter (June–August), when river flow at the reference site was below median flow; the discharge thresholds for water quality would therefore have been applicable had they been operational. Regardless of effluent treatment, all drainage events that occurred during winter when the reference site was at or below median flow exceeded the proposed NNN threshold, yet remained below the FRP threshold with the exception of one event (7 August 2013) when the FRP concentration in drainage from the LRLD treatment exceeded the threshold. During year 2, no drainage events occurred when the reference site was at or below median flow; the permitted activity discharge thresholds were therefore not applicable. Six drainage events occurred outside of winter when the reference site was at or below median flow, of which six, three and four events exceeded the thresholds for NNN, NH4-N and FRP, respectively, with no distinction across effluent treatments. Although drainage concentrations exceeded the proposed guidelines on occasion, these exceedances appear to be independent of the specific management adopted in the LRLD and SP treatments and are instead likely to be associated with cow grazing management.

Pasture growth, and consequently cow stocking density, are key determinants of urinary N loading to soils and are influenced by N inputs (i.e. in the form of fertilisers and effluent). Total losses of N in drainage water (surface and subsurface) over the 2-year study period averaged 43.3 (±3.2) kg N ha−1 across both SP and LRLD treatments. This equates to approximately 27% of the N applied in effluent. In a similar study on Pallic soils, Monaghan et al. (Citation2005) monitored N losses from a cattle-grazed system that received 100 kg of urea-N ha−1 but no effluent. The average N flux measured in mole and pipe drain discharge was 34 kg N ha−1 year−1 (34% of the N applied) and was attributed to indirect losses of urinary N that were greatest during autumn. As previously discussed, hydraulic loading parameters (including rate and depth) are also important determinants of nutrient loss, in addition to the losses associated with cow grazing. For instance, a literature review on land application of FDE in New Zealand by Houlbrooke et al. (Citation2004b) indicated that between 2% and 20% of both the N and P applied to land in effluent was lost either in surface runoff or via leaching below the root zone. In our case, direct N losses due to the LRLD management, i.e. those associated with the hydraulic parameters, can be broadly quantified as the difference in N loss between the effluent treatments. This difference was approximately 6 kg N ha−1 over the 2 years, equivalent to 3% of the effluent N that was applied, and is relatively low compared with N losses associated with grazing. Effluent management had no significant effect on the temporal and annual fluxes of P throughout the 2-year study period; the proportion of applied effluent P lost via drainage or surface runoff was therefore effectively nil. Although there were differences in E. coli losses between treatments, these appeared to be independent of the effluent managements imposed and were more closely related to grazing events that occurred during or immediately prior to rainfall events. As noted by others (McDowell and Sharpley Citation2002; Laurenson and Houlbrooke Citation2014), freshly deposited faecal material is an important source of P and E. coli that can be entrained in drainage and surface runoff, and would appear to be more important than the effluent applied to the LRLD and SP treatments in this study.
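Using the reported loss difference of approximately 6 kg N ha−1 over 2 years and the average LRLD effluent loading of approximately 90 kg N ha−1 year−1 (Table 2), the 3% figure quoted above can be recovered as:

\[
\frac{6\ \mathrm{kg\ N\ ha^{-1}}}{2\ \mathrm{years} \times 90\ \mathrm{kg\ N\ ha^{-1}\ year^{-1}}} \approx 0.03
\]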

Holding cows in an off-paddock facility can incur significant capital expenditure, of which the costs associated with effluent management (pond storage facilities) are a major component (Laurenson et al. Citation2016). The LRLD approach requires only basic irrigation infrastructure (pipes, sprinkler heads and pump timers) that is common on dairy farms in New Zealand. It is important that farmers adhere to the LRLD principles and monitor climate and soil conditions to avoid applying effluent immediately prior to heavy rainfall and drainage events, so as to limit the preferential flow of contaminants through or across the soil profile. It is recommended that the LRLD strategy be considered as a means to reduce storage volume requirements rather than to fully replace the need for storage. Using 30 years of climate data from the research location, a simple spreadsheet model was constructed that (i) considered effluent inputs and outputs to a storage tank, applying the management rules specified in the research methodology for LRLD effluent application, and (ii) assumed a loose-housed barn with sawdust bedding (as used at the experiment site). This model indicated that the estimated storage requirement for capturing winter-generated effluent from a loose-housed barn system could be reduced by approximately 36%, from 325 to 209 m3 per 100 cows, compared with standard SP effluent management (a sketch of such a storage balance is given below). Reducing the storage capacity of the infrastructure will represent a significant cost saving to those farmers considering off-paddock cow wintering systems. Storage capacity simulations suggested that storage demand seldom reached the estimated maximum, which itself was dictated by the maximum number of consecutive days in which effluent application was not allowed.
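A minimal sketch of such a daily storage balance follows. The per-cow effluent yield, apron area, irrigable area and weather inputs are illustrative assumptions, not the study's calibrated values; the `can_apply` flags would come from scheduling rules such as those sketched in the Materials and methods.

```python
LRLD_DEPTH_M = 0.002          # 2 x 1 mm per day, expressed in metres
IRRIGABLE_M2 = 5_000          # hypothetical land area available for LRLD (m2)

def simulate_peak_storage(effluent_m3_per_day, rain_mm, apron_m2, can_apply):
    """Return the peak storage volume (m3) required over one winter.

    rain_mm: daily rainfall depths; can_apply: daily booleans from the
    LRLD scheduling rules. Storage fills with barn effluent and rain
    falling on the uncovered feeding apron, and empties on days when
    LRLD irrigation is permitted.
    """
    volume = peak = 0.0
    for rain, ok in zip(rain_mm, can_apply):
        volume += effluent_m3_per_day           # barn + feeding-apron scrapings
        volume += rain / 1000.0 * apron_m2      # rain captured on the apron
        if ok:
            volume = max(0.0, volume - LRLD_DEPTH_M * IRRIGABLE_M2)
        peak = max(peak, volume)
    return peak

# Hypothetical 90-day winter: 3 m3/day of effluent, a 1242 m2 apron,
# 2 mm of rain every day, irrigation permitted two days in three
rain = [2.0] * 90
ok = [d % 3 != 0 for d in range(90)]
print(round(simulate_peak_storage(3.0, rain, 1242, ok), 1))
```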

When compared with the common practice of winter grazing cows on fodder crops, there would appear to be a significant environmental benefit from holding cows off paddock. For instance, at the same research farm, estimated annual fluxes of contaminants in surface and subsurface flow from a winter-grazed forage crop (14 cows ha−1 over 75 days during winter) averaged 39 and 4.3 kg ha−1 for N and P, respectively, and up to 4.9 × 1010 MPN ha−1 for E. coli (Monaghan et al. Citation2016). These values are equivalent to annualised fluxes of 2.8 kg N, 0.31 kg P and 3.5 × 109 MPN E. coli per cow wintered on crop. In contrast, the equivalent metric for a wintering system in which cows were kept in a loose-housed barn and effluent was applied at a LRLD during winter is 0.18 kg N per cow wintered, while losses of P and E. coli were effectively nil.
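These per-cow figures follow directly from the reported per-hectare fluxes and the stocking density of the winter-grazed crop:

\[
\frac{39\ \mathrm{kg\ N\ ha^{-1}}}{14\ \mathrm{cows\ ha^{-1}}} \approx 2.8\ \mathrm{kg\ N\ cow^{-1}},\qquad
\frac{4.3}{14} \approx 0.31\ \mathrm{kg\ P\ cow^{-1}},\qquad
\frac{4.9 \times 10^{10}}{14} \approx 3.5 \times 10^{9}\ \mathrm{MPN\ cow^{-1}}
\]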

Pasture growth at the start of spring was significantly greater in plots that received winter applications of effluent. Pasture growth in southern New Zealand is typically slow at this time of year, due mostly to low temperatures but also to low rates of N supply from the soil to the pasture (Cameron et al. Citation2013). It is expected that the winter applications of effluent to the LRLD treatment provided a source of inorganic N that supported the enhanced pasture growth observed in spring. It is important to note that such benefits to pasture growth will be less apparent when rainfall enables surplus N to be mobilised in winter drainage. Application of N via the LRLD methodology is therefore best carried out in late winter, when soil and air temperatures begin to increase with the commencement of spring and, in the case of this trial site, when the risk of large rainfall events has reduced.

Conclusion

LRLD application of effluent to pastoral land led to a greater quantity of N lost to water. However, such losses were small in comparison with those associated with cow grazing practices (i.e. background losses). Annual losses of P and E. coli were not affected by the effluent management practices employed in this study, although the temporal patterns of loss did shift, with greater losses observed in late winter and spring drainage in the LRLD treatment compared with autumn in the SP treatment. Under the management protocol developed here, there was limited evidence of adverse effects of the LRLD strategy, particularly when compared with the considerably greater contaminant fluxes associated with the more common local wintering practice based on in situ grazing of forage crops. The LRLD strategy can therefore be considered a viable alternative to investing in large storage ponds or retrofitting existing ponds; vigilant monitoring of soil and climatic conditions is, however, essential to the successful operation of such a system.

Acknowledgements

The support of Telford Dairy Farm is gratefully acknowledged. Dr Neil Cox and Chikako van Koten are thanked for providing statistical analysis and Louise Gibson for technical support.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This research was funded by the Pastoral21 programme, a collaborative venture between DairyNZ, Fonterra, Dairy Companies Association of New Zealand, Beef + Lamb NZ and the Ministry of Business, Innovation and Employment.

References

  • [APHA] American Public Health Association. 1998. Standard methods for the examination of water and wastewater. 20th ed. Washington (DC): American Public Health Association.
  • Beukes PC, Romera AJ, Clark DA, Dalley DE, Hedley MJ, Horne DJ, Monaghan RM, Laurenson S. 2013. Evaluating the benefits of standing cows off pasture to avoid soil pugging damage in two dairy farming regions of New Zealand. New Zealand Journal of Agricultural Research. 56:224–238. doi: 10.1080/00288233.2013.822002
  • Beven K, Germann P. 1982. Macropores and water flow in soils. Water Resources Research. 18:1311–1325. doi: 10.1029/WR018i005p01311
  • Cameron K, Di H, Moir J. 2013. Nitrogen losses from the soil/plant system: a review. Annals of Applied Biology. 162:145–173. doi: 10.1111/aab.12014
  • Christensen CL, Hedley MJ, Hanly JA, Horne DJ. 2012. Nitrogen loss mitigation using duration-controlled grazing: field observations compared to modelled outputs. Proceedings of the New Zealand Grassland Association. 74:115–120.
  • Cichota R, Snow VO, Vogeler I, Wheeler DM, Shepherd MA. 2012. Describing N leaching from urine patches deposited at different times of the year with a transfer function. Soil Research. 50:694–707. doi: 10.1071/SR12208
  • DairyNZ. 2015. Farm dairy effluent (FDE): design standards and code of practice. Version 3, September 2015; [accessed 2017 August 18]. https://www.dairynz.co.nz/publications/environment/farm-dairy-effluent-design-standards-and-code-of-practice/.
  • de Klein CAM, Smith LC, Monaghan RM. 2006. Restricted autumn grazing to reduce nitrous oxide emissions from dairy pastures in Southland, New Zealand. Agriculture, Ecosystems & Environment. 112:192–199. doi: 10.1016/j.agee.2005.08.019
  • Hewitt AE. 2010. New Zealand soil classification. 3rd ed. Lincoln: Manaaki Whenua Press.
  • Houlbrooke DJ, Horne DJ, Hedley MJ, Hanly JA, Scotter DR, Snow VO. 2004a. Minimising surface water pollution resulting from farm-dairy effluent application to mole-pipe drained soils. I. An evaluation of the deferred irrigation system for sustainable land treatment in the Manawatu. New Zealand Journal of Agricultural Research. 47:405–415. doi: 10.1080/00288233.2004.9513609
  • Houlbrooke DJ, Horne DJ, Hedley MJ, Hanly JA, Snow VO. 2003. The impact of intensive dairy farming on the leaching losses of nitrogen and phosphorus from a mole and pipe drained soil. Proceedings of the New Zealand Grassland Association. 65:179–184.
  • Houlbrooke DJ, Horne DJH, Hedley MJ, Hanly JA, Snow VO. 2004b. A review of literature on the land treatment of farm-dairy effluent in New Zealand and its impact on water quality. New Zealand Journal of Agricultural Research. 47:499–511. doi: 10.1080/00288233.2004.9513617
  • Houlbrooke DJ, Laurenson S. 2013. Effect of sheep and cattle treading damage on soil microporosity and soil water holding capacity. Agricultural Water Management. 121:81–84. doi: 10.1016/j.agwat.2013.01.010
  • Houlbrooke DJ, Monaghan RM. 2010. Land application for farm dairy effluent: development of a decision framework for matching management practice to soil and landscape risk. In: Currie LD, editor. Farming’s future: minimising footprints and maximising margins. Palmerston North: Fertilizer and Lime Research Centre, Massey University; p. 35–45.
  • Jarvis N, Jansson PE, Dik P, Messing I. 1991. Modelling water and solute transport in macroporous soil. I. Model description and sensitivity analysis. Journal of Soil Science. 42:59–70. doi: 10.1111/j.1365-2389.1991.tb00091.x
  • Klaus J, Zehe E, Elsner M, Kulls C, McDonnell JJ. 2013. Macropore flow of old water revisited: experimental insights from a tile-drained hillslope. Hydrological Earth System Science. 17:103–118. doi: 10.5194/hess-17-103-2013
  • Laurenson S, Houlbrooke DJ. 2014. Nutrient and microbial loss in relation to timing of rainfall following surface application of dairy farm manure slurries to pasture. Soil Research. 52:513–520. doi: 10.1071/SR13358
  • Laurenson S, Houlbrooke DJ, Beukes PC. 2016. Assessing the production and economic benefits from preventing cows grazing on wet soils in New Zealand. Journal of the Science of Food and Agriculture. 96:4584–4593.
  • Ledgard SF. 2001. Nitrogen cycling in low input legume-based agriculture, with emphasis on legume/grass pastures. Plant and Soil. 228:43–59. doi: 10.1023/A:1004810620983
  • Livestock Improvement Corporation. 2015. New Zealand Dairy Statistics 2014–2015. Hamilton: Livestock Improvement Corporation, DairyNZ.
  • McBride GB, Cooper AB, Till DG. 1991. Microbial water quality guidelines for recreation and shellfish gathering waters in New Zealand. Wellington: Department of Health.
  • McDowell RW, Larned ST, Houlbrooke DJ. 2009. Nitrogen and phosphorus in New Zealand streams and rivers: control and impact of eutrophication and the influence of land management. New Zealand Journal of Marine and Freshwater Research. 43:985–995. doi: 10.1080/00288330909510055
  • McDowell RW, Sharpley AN. 2002. Phosphorus transport in overland flow in response to position of manure application. Journal of Environment Quality. 31:217–227. doi: 10.2134/jeq2002.2170
  • Monaghan RM, Paton RJ, Smith LC, Drewry JJ, Littlejohn RP. 2005. The impacts of nitrogen fertilisation and increased stocking rate on pasture yield, soil physical condition and nutrient losses in drainage from a cattle-grazed pasture. New Zealand Journal of Agricultural Research. 48:227–240. doi: 10.1080/00288233.2005.9513652
  • Monaghan RM, Smith LC, Muirhead RW. 2016. Pathways of contaminant transfers to water from an artificially-drained soil under intensive grazing by dairy cows. Agriculture, Ecosystems & Environment. 220:76–88. doi: 10.1016/j.agee.2015.12.024
  • Mulvaney RL. 1996. Nitrogen-inorganic forms. In: Sparks DL, editor. Methods of soil analysis: part 3. Madison (WI): Soil Science Society of America; p. 1123–1184.
  • Murphy J, Riley JP. 1962. A modified single solution method for the determination of phosphate in natural waters. Analytica Chimica Acta. 27:31–36. doi: 10.1016/S0003-2670(00)88444-5
  • [NEMS] National Environmental Monitoring Standards. 2013. Soil water measurement: measurement, processing and archiving of soil water content data. Version: 1.0; [accessed 2017 August 18]. http://www.nems.org.nz/documents/.
  • OECD. 2017. OECD environmental performance reviews: New Zealand 2017. Paris: OECD Publishing.
  • Otago Regional Council. 2016. Schedule 16: discharge thresholds. Dunedin: Otago Regional Council; [accessed 2017 July 10] www.orc.govt.nz/Documents/Publications/Regional/Water/Regional%20Plan_%20Water%20Schedules%2016.pdf.
  • Ottoni Filho TB, Ottoni MV, Oliveira MBD, Macedo JRD, Reichardt K. 2014. Revisiting field capacity (FC): variation of definition of FC and its estimation from pedotransfer functions. Revista Brasileira de Ciência do Solo. 38:1750–1764. doi: 10.1590/S0100-06832014000600010
  • PCE. 2015. Water quality in New Zealand: land use and nutrient pollution. Update Report from the New Zealand Parliamentary Commissioner for the Environment.
  • Scotter DR, Clothier BE, Turner MA. 1979. The soil water balance in a Fragiaqualf and its effect on pasture growth in central New Zealand. Australian Journal of Soil Research. 17:455–465. doi: 10.1071/SR9790455
  • Selbie DR, Buckthought LE, Shepherd MA. 2015. The challenge of the urine patch for managing nitrogen in grazed pasture systems. Advances in Agronomy. 129:229–292. doi: 10.1016/bs.agron.2014.09.004
  • Beven K, Germann P. 1981. Water flow in soil macropores II. A combined flow model. Journal of Soil Science. 32:15–29. doi: 10.1111/j.1365-2389.1981.tb01682.x
  • Soil Survey Staff. 2014. Keys to soil taxonomy. 12th ed. Washington (DC): USDA-Natural Resources Conservation Service.
  • Tait A, Henderson R, Turner R, Zheng X. 2006. Thin plate smoothing spline interpolation of daily rainfall for New Zealand using a climatological rainfall surface. International Journal of Climatology. 26:2097–2115. doi: 10.1002/joc.1350
  • Woodward SJR, Barker DJ, Zyskowski RF. 2001. A practical model for predicting soil water deficit in New Zealand pastures. New Zealand Journal of Agricultural Research. 44:91–109. doi: 10.1080/00288233.2001.9513464
