Abstract

There is potential in the southeastern US to harvest winter cover crops from cotton (Gossypium hirsutum L.) fields for biofuel or animal feed use, but this could affect yields and nitrogen (N) fertilizer response. An experiment was established to examine the effects of rye (Secale cereale L.) residue management (RM) and N rate on cotton productivity. Three RM treatments (no winter cover crop (NC), residue removed (REM), and residue retained (RET)) and four N rates for cotton were studied. Cotton population, leaf and plant N concentration, cotton biomass and N uptake at first square, and cotton biomass production between first square and cutout were highest for RET, followed by REM and NC. However, leaf N concentration at early bloom and N concentration in the cotton biomass produced between first square and cutout were highest for NC, followed by REM and RET. Seed cotton yield response to N interacted with year and RM, but yields were greater with RET, followed by REM, in both years. These results indicate that a rye cover crop can benefit cotton, especially during hot and dry years. Long-term studies would be required to fully understand the effect of rye residue harvest on cotton production under conservation tillage.

1. Introduction

Nitrogen is the most difficult nutrient to manage when growing cotton. About 5,445,749 ha of cotton were planted in the USA in 2003 [1]. Applying optimum N rates is necessary to maximize economic yields and to minimize the negative impacts that N overapplication can have on the crop and the environment [2]. Applying more N than required can result in excessive vegetative growth, which increases the proportion of immature bolls, reduces lint quality and cotton yields, and increases disease and insect damage [3–6]. Conversely, N deficiencies can reduce vegetative and reproductive growth and decrease yields [3]. Many factors combine to determine the optimum N rate for cotton, including soil type, location, N application method, tillage system, water availability, use of winter cover crops, and yield potential [7].

Conservation systems for cotton production in the southeastern US have increased in adoption to approximately 50% of the 2.9 million ha planted in this region [8]. The use of winter cover crops has been well documented as an effective method for improving soil chemical, biological, and physical properties [9, 10]. Among winter cover crop species, winter cereals such as rye offer many benefits because they produce large amounts of biomass, are easy to establish and kill, and provide good ground cover during the winter [8, 11]. However, the high biomass that grass cover crops produce, combined with their high C/N ratios, can lead to N immobilization, which can increase the N fertilizer rate needed to maximize cotton yields [10, 12, 13]. Additionally, the probability of N immobilization increases when N fertilizer is broadcast over a soil covered with grass residue [7].

Higher N fertilizer requirements for cotton following small grain cover crops were reported by Howard et al. [7], Varco et al. [14], and Mitchell [15]. Varco et al. [14] reported that 120 kg N ha−1 was required to achieve maximum cotton lint yields after a rye cover crop compared with 96 kg N ha−1 after winter fallow, although lint yields were greater after rye than after winter fallow. Howard et al. [7] reported that 101 and 67 kg N ha−1 were required to maximize lint yields when cotton followed corn stover and native winter weed vegetation, respectively. However, the long-term use of high biomass cover crops in conservation tillage systems is expected to increase soil organic carbon levels, with a simultaneous increase in soil organic N fractions; once a new equilibrium is reached, N rates for crops could be reduced because more N would be supplied through mineralization [16].

Recently, it has been proposed that winter cover crop biomass could be used as an alternative source of energy or as animal feed. Alternative uses for cover crop biomass would help farmers increase revenue while diversifying market opportunities [17]. Cover crop biomass removal could cause significant changes in soil C and N dynamics and could also affect crop yields and their response to N fertilization. Crop biomass removal can reduce soil organic C levels, with a subsequent deterioration of soil physical, chemical, and biological properties [18–23]. As a result of these changes in soil properties, reductions in crop yields are expected to occur [24, 25]. The impact of residue removal on soil properties and crop productivity has been well documented, but no research has examined the potential impact of winter cover crop biomass removal on cotton yields and their response to N fertilization under conservation tillage.

We speculated that when rye residue is removed, the N rate required to maximize cotton production could be reduced because less high-C/N residue remains on the soil surface to immobilize N. Even though differences in soil properties in response to new management practices take time to develop, we considered that even short-term rye residue removal might change the soil environment enough to reduce cotton yields. The objectives of this research were (i) to determine the effect of rye residue management on cotton growth parameters and yield, (ii) to quantify the impact of rye residue management on cotton response to N fertilization, and (iii) to determine whether optimum N rates for cotton can be reduced when rye residue is removed.

2. Materials and Methods

A 2-year field experiment under supplemental irrigation was established in November 2005 at the Alabama Agricultural Experiment Station’s E.V. Smith Research Center, Field Crops Unit (32° 25′ 19′′N, 85° 53′ 7′′W), near Shorter in central Alabama, USA. The soil was a Marvyn loamy sand (fine-loamy, kaolinitic, thermic Typic Kanhapludult). This region is characterized by a humid subtropical climate, with an average annual precipitation of about 1100 mm [8]. The experimental area was previously managed with conventional tillage. Three rye RM schemes and four N rates were evaluated for cotton production. Rye RMs were no cover (NC), residue removed (REM), and residue retained (RET). Each RM was evaluated with cotton N fertilization rates of 0, 50, 100, and 140 kg ha−1 applied at the first pinhead square stage. The RMs were the main plots (18 m long by 8 m wide) and N rates for cotton were the subplots (9 m long by 4 m wide).
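
The treatments described above form a 3 × 4 split-plot factorial. As a purely illustrative sketch of one possible randomization of that structure (treatment labels as defined above; the random seed and layout code are assumptions, not the procedure used in the study):

```python
import random

random.seed(1)  # arbitrary seed so the example layout is reproducible

RM_LEVELS = ["NC", "REM", "RET"]   # residue management treatments (main plots)
N_RATES = [0, 50, 100, 140]        # N rates for cotton, kg ha-1 (subplots)
REPLICATIONS = 4

layout = []
for rep in range(1, REPLICATIONS + 1):
    # Randomize main plots within each block, then N rates within each main plot.
    for rm in random.sample(RM_LEVELS, k=len(RM_LEVELS)):
        for n_rate in random.sample(N_RATES, k=len(N_RATES)):
            layout.append({"rep": rep, "RM": rm, "N_rate": n_rate})

print(len(layout))   # 48 subplots: 4 blocks x 3 main plots x 4 subplots
print(layout[:4])    # first few subplots of block 1
```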

2.1. Soil Management

Before planting rye the first year, the entire area was deep-tilled with a noninversion, bent-leg subsoiler to a depth of 46 cm to remove any soil compaction present, and leveled with a field cultivator. In early May each year the experimental area was tilled in-row (1 m between rows) with a narrow-shanked subsoiler to a depth of 40 cm. The in-row tillage was conducted using a tractor equipped with an automatic steering system with centimeter level precision. The NC treatment was kept free of weeds during winter by applying herbicide when required.

2.2. Crop Management

Rye (cultivar “Elbon”) was drilled at 100 kg ha−1, in early November each year, using a no-till drill. Plots planted with rye received 40 and 30 kg N ha−1 as ammonium nitrate applied manually three weeks after planting and in late February, respectively. In the RET treatment, rye was rolled down at the early milk development stage [26] in late April each year and then sprayed with glyphosate (N-phosphonomethyl glycine) at a rate of 0.9 kg a.i. ha−1. At the same time rye was terminated in the RET treatment, the aboveground rye biomass in the REM treatment was harvested with a small forage harvester to a height of 10 cm over the soil surface and removed from the plots.

The entire experimental area received an application of 21, 10, 42, and 6 kg ha−1 of nitrogen, phosphorus (P) as P2O5, potassium (K) as K2O, and sulfur (S) as SO4, respectively, each year by early May, based on the Alabama Cooperative Extension System soil test recommendations [27]. Cotton, cultivar DP 454 BG/RR (Delta Pine and Land Co., Scott, MS), was planted on May 19 and 18 in 2006 and 2007, respectively, using a four-row vacuum planter at a rate of 17 seeds m−1. Row spacing was one meter. Herbicides, insecticides, defoliant, and boll opener applied to cotton were based on the Alabama Cooperative Extension System recommendations. The entire research area received supplemental irrigation of 70 and 160 mm during the 2006 and 2007 cotton seasons, respectively, using a linear-movement sprinkler irrigation system. Nitrogen treatments for cotton were applied manually as ammonium nitrate at the first pinhead square stage (37 days after planting (DAP)). Cotton was chemically defoliated and a boll opener was applied when 60–70% of the bolls in RET were opened. Before cotton harvest, one meter of each end of the plots was cut off with a rotary mower. After harvesting, cotton stalks were shredded with a rotary mower.

2.3. Data Collection

Cotton population, leaf blade samples, and seed cotton yield were determined from the two middle rows of each subplot, and cotton biomass was determined from the two exterior rows of each subplot. Cotton population was determined by counting the number of plants in a 3 m length of each of the two middle rows of the subplots at 37 DAP. Ten uppermost, fully developed leaf blades were collected from recently matured leaves in the upper canopy of each subplot at 37 and 65 DAP in 2006 and at 37 and 69 DAP in 2007. Total aboveground cotton biomass was determined at 37 and 92 DAP in 2006 and 2007 by randomly cutting eight plants per subplot. Leaf blade and whole plant samples were oven dried at 55°C to constant weight, finely ground to pass a 1 mm sieve, and analyzed for total N by dry combustion using a LECO TruSpec analyzer (LECO Corp., St. Joseph, MI). Total cotton biomass was estimated from the dry weight per plant and the cotton population. Plant N uptake at each sampling time was calculated from the total biomass and the plant N concentration. Cotton biomass and N uptake between first square and cutout were calculated by subtracting the amount at first square from the amount at cutout for each parameter. Seed cotton yield was determined at 139 and 125 DAP in 2006 and 2007, respectively, by harvesting a 14.6 m2 (2 m wide by 7.3 m long) area from each subplot with a spindle picker.
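
As a minimal sketch of how these derived quantities can be computed, assuming biomass expressed in kg ha−1 and tissue N in g kg−1 (function names and all numeric values below are hypothetical, not data from the study):

```python
def total_biomass_kg_ha(dry_weight_per_plant_g, plants_per_ha):
    """Total aboveground biomass (kg ha-1) from per-plant dry weight (g) and stand density."""
    return dry_weight_per_plant_g * plants_per_ha / 1000.0

def n_uptake_kg_ha(biomass_kg_ha, n_conc_g_kg):
    """Plant N uptake (kg N ha-1) from total biomass and tissue N concentration (g N kg-1)."""
    return biomass_kg_ha * n_conc_g_kg / 1000.0

# Accumulation between first square and cutout = value at cutout minus value at first square.
biomass_fs = total_biomass_kg_ha(dry_weight_per_plant_g=8.0, plants_per_ha=150_000)
biomass_co = total_biomass_kg_ha(dry_weight_per_plant_g=45.0, plants_per_ha=150_000)
uptake_fs = n_uptake_kg_ha(biomass_fs, n_conc_g_kg=37.0)
uptake_co = n_uptake_kg_ha(biomass_co, n_conc_g_kg=20.0)
print(biomass_co - biomass_fs, uptake_co - uptake_fs)  # biomass and N uptake accumulated
```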

2.4. Weather

Daily average temperature data for both years were taken from an automated weather station located at the experiment station, beginning when cotton was planted and ending at the cutout stage of cotton development. Daily heat units (HUs) between planting and cutout were calculated as the difference between the average daily temperature and a base temperature of 15.6°C [28]. Rainfall and irrigation during each season were measured directly in the experimental area with a rain gauge connected to a data logger.
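
A minimal sketch of this HU calculation in Python (the flooring of cool days at zero is an assumption, since the text states only the temperature difference, and the example temperatures are hypothetical):

```python
BASE_TEMP_C = 15.6  # base temperature for cotton heat units, degrees C [28]

def daily_heat_units(t_avg_c, base_c=BASE_TEMP_C):
    """HUs for one day: average daily temperature minus the base temperature,
    floored at zero so that cool days do not subtract from the running total."""
    return max(t_avg_c - base_c, 0.0)

def accumulated_heat_units(daily_avg_temps_c):
    """Sum daily HUs over a sequence of average daily temperatures (planting to cutout)."""
    return sum(daily_heat_units(t) for t in daily_avg_temps_c)

# Hypothetical three-day example (degrees C):
print(accumulated_heat_units([27.0, 25.5, 22.8]))  # -> 28.5 HUs
```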

2.5. Experimental Design and Statistical Analyses

The experiment was arranged as a randomized complete block design (RCBD) with a split-plot restriction on the randomization and four replications. Rye RM was the main factor and N rate for cotton the subfactor. Because N treatments were applied at the first pinhead square stage of cotton development, data collected before this N application were analyzed with the MIXED procedure of SAS [29] considering only the RM effect (RCBD). The LSMEANS PDIFF option was used to establish mean differences between RM treatments. Data collected after N treatments were applied to cotton were analyzed by covariance analysis with the MIXED procedure of SAS [29], with N rate as the covariate. Replication and its interactions were considered random effects; treatments and year were considered fixed effects. When a significant interaction including year occurred, data were presented separately for each year. When the Year × RM × N or RM × N interactions were not significant, the LSMEANS PDIFF option was used to establish mean differences between RM treatments. The covariance analysis was used to evaluate linear and quadratic effects of N rate on the cotton parameters measured and to fit the best linear or quadratic regression model. Linear and quadratic effects were considered significant at P ≤ 0.05 [30], and treatment effects and differences of least squares means were also considered significant at P ≤ 0.05.
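
The original analysis was run in SAS PROC MIXED. As a rough, simplified sketch of the same ideas in Python (assumed file and column names; a single random block intercept rather than the full split-plot error structure, so this is an approximation rather than an equivalent of the original analysis):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subplot with rep, year, RM, N_rate, and a response.
df = pd.read_csv("cotton_trial.csv")

# Mixed model with fixed year, RM, and linear + quadratic N terms, and a random block effect.
mixed = smf.mixedlm(
    "seed_cotton_yield ~ C(year) * C(RM) * (N_rate + I(N_rate**2))",
    data=df,
    groups=df["rep"],
).fit()
print(mixed.summary())

# Fit and compare linear versus quadratic N-response curves for one RM x year combination.
subset = df[(df["RM"] == "RET") & (df["year"] == 2006)]
linear = smf.ols("seed_cotton_yield ~ N_rate", data=subset).fit()
quadratic = smf.ols("seed_cotton_yield ~ N_rate + I(N_rate**2)", data=subset).fit()
print(linear.aic, quadratic.aic)  # pick the better-fitting response model
```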

3. Results and Discussion

3.1. Climate Data

Rainfall and irrigation during the two years differed in both amount and distribution. In 2006, rainfall and irrigation between one week before cotton planting and cutout were 247 and 70 mm, respectively. For the same period in 2007, they were 207 and 176 mm, respectively. Rainfall in 2006 and 2007 was 23 and 36% below the 10-year average, respectively. In 2006, rainfall was below the 10-year average until midseason, after which it was similar to or greater than the average. In 2007, however, rainfall was below the 10-year average early and late in the cotton season and was not uniformly distributed, with 75% of the rainfall occurring during the first 10 days of July, which resulted in a larger amount of irrigation being applied in 2007. The main difference in HUs between years occurred at the end of the cotton season. For the last 20-day period before cutout, HUs in 2007 were 20% higher than in 2006, indicating that higher temperatures occurred during this period in 2007.

3.2. Cotton Population

Rye residue management had a significant effect on cotton population at 37 DAP, across years (Table 1). Rye residue retained had a significantly higher population than NC, while the population for REM was not significantly different from either of the other two treatments. The population for RET was 4 and 7% greater than for REM and NC, respectively. Tillage operations were identical among RM treatments, so differences in cotton population can be attributed to differences in soil water content among treatments during crop establishment. Higher soil water content was measured in RET than in REM and NC until 20–25 days after cotton planting in both years (data not shown), which probably contributed to better plant establishment.

Cotton population also differed significantly between years when averaged across RM treatments, with higher populations observed in 2006 than in 2007. In both years, the quality of the seed bed at planting and the soil water content during the two weeks after planting were similar, indicating that other factors could be responsible for this difference between years. Accumulated HUs during the first 13 days after planting were 24% lower in 2007 than in 2006, indicating that this period was colder in 2007. These lower temperatures could explain the population reduction in 2007; they probably slowed plant growth, extending the period during which young plants were susceptible to water deficit and pest damage. Cotton populations in 2006 and 2007 were about 150,000 and 140,000 plants ha−1, respectively, which are in a range considered high for cotton production, although seed cotton yields can be stable over a wide range of plant densities [31]. However, this yield stability may be threatened if dry periods occur later in the growing season.

3.3. Leaf and Plant N Concentration at First Square

There was a significant Year × RM interaction for leaf and plant N concentration at first square (Table 1). In 2006, RET had a significantly higher leaf N concentration than NC and REM, and NC was not significantly different from REM (Figure 1(a)). In 2007, RET and REM had significantly higher leaf N concentrations than NC, but the difference between these two treatments was not significant. Leaf N concentration values ranged between 38 and 43 g kg−1 in 2006 and between 40 and 46 g kg−1 in 2007. These values were lower than the 54 g kg−1 critical level at first pinhead square reported by Bell et al. [32] for cotton in the southern USA. The N applied before planting and the N possibly provided through mineralization were not enough to raise leaf N concentration to this critical value. Nonetheless, Bell et al. [32] also stated that high cotton yields can still be achieved by crops with low leaf N at first pinhead square if N deficiencies are corrected at this stage of development and leaf N concentrations at early bloom are in the sufficiency range. Furthermore, leaf N concentrations in our study were within the sufficiency range reported by Mills and Jones [33].

In 2006, RET had a significantly higher plant N concentration than REM and NC, but REM was not significantly different from NC (Figure 1(b)). In 2007, however, differences among RM treatments were not significant and plant N concentration values were very similar across treatments. Plant N concentration in 2006 followed a pattern similar to leaf N concentration, but the values were lower. This is expected because whole-plant samples include older tissues, which have lower N concentrations than upper leaves. Further, HU accumulation was greater during June 2007 than during June 2006, which could help explain the greater leaf and plant N concentrations at first square in 2007. Additionally, the amount of soil mineral N at first square, averaged across RM treatments, was 19% higher in 2007 than in 2006 (data not shown), indicating that N availability during June was probably greater in 2007. However, soil mineral N amounts for all RM treatments in both years appeared to be sufficient, indicating that N availability was not a limiting factor for cotton growth at first square.

3.4. Cotton Biomass and N Uptake at First Square

Rye RM had a significant influence on cotton biomass and N uptake at first pinhead square, averaged across years (Table 1). The RET treatment had significantly higher cotton biomass than NC and REM (Figure 2). Rye residue removed had 35% higher cotton biomass than NC, but this difference was not significant. Similar results were obtained for N uptake, with RET 96 and 166% higher than REM and NC, respectively. No significant difference occurred between REM and NC, although N uptake was 39% higher for REM (Figure 2). Differences in N uptake among RM treatments can be partially explained by differences in cotton biomass and plant N concentration. Averaged across years, plant N concentration was 34.0, 35.7, and 37.1 g N kg−1 for NC, REM, and RET, respectively. Although both growth parameters influenced N uptake in the same direction, cotton biomass likely had the greater impact on N uptake because it varied proportionally more among RM treatments than plant N concentration did.

These results show that in both years RET provided better conditions for early-season cotton growth and N uptake. This could have been a consequence of greater N availability between planting and first square, as indicated by residual soil N levels at that time, and of the higher soil water content during this period. Although plant populations were greater with RET than with REM and NC, the biomass advantage of RET at first square (130 and 77% greater than NC and REM, resp.) far exceeded the difference in stand density. Further, the lack of a Year × RM interaction and of a year effect on cotton biomass and N uptake suggests that, at least up to this stage of development, growth of the cotton crop was very similar in both seasons.

3.5. Leaf N Concentration at Early Bloom

Leaf N concentration at early bloom was significantly affected by RM, N rate, and year, but interactions were not significant (Table 2). No winter cover crop and REM had significantly higher leaf N concentrations than RET. It is possible that the rye residue immobilized some of the soil mineral N between first square and early bloom, decreasing N availability for cotton and reducing the N concentration in cotton tissues. Another possible explanation is that the higher cotton biomass production in RET diluted N within cotton tissues. Cotton biomass was not measured at this growth stage, but plant heights at this sampling time, averaged across years and N rates, showed that plants in RET were taller than in REM and NC (data not shown), indicating a possible higher cotton biomass and a potential N dilution effect. Similar results were reported by Fridgen and Varco [34] and Balkcom et al. [35], who found a dilution of leaf N when cotton biomass production was high.

Averaged across years and RM treatments, leaf N concentration responded quadratically to fertilizer N rate, with a maximum leaf N concentration of 40 g kg−1 observed at the highest N rate applied (Figure 3(a)). This value was very close to the 43 g kg−1 sufficiency value reported by Bell et al. [32]. However, our results do not agree with Fridgen and Varco [34], who reported higher leaf N concentrations at N rates similar to ours. Further, the fitted response did not reach a plateau within the range of rates tested, indicating that reaching the maximum leaf N concentration under the conditions of this experiment would have required more than 140 kg N ha−1 of fertilizer.

Year also significantly affected leaf N concentration at early bloom; the concentration was 22% lower in 2007 than in 2006 (Figure 3(b)). Rainfall distribution and HUs during 2007 could explain this difference between years. A more detailed analysis of the rainfall effect on cotton growth is provided below, where the cotton biomass N concentration data are presented (Section 3.7).

3.6. Cotton Biomass Production between First Square and Cutout

Rye RM had a significant effect on cotton biomass production between first square and cutout, averaged across years and N rates (Table 2). Rye residue retained had significantly higher cotton biomass production than REM and NC, and REM was significantly higher than NC (Figure 4). Cotton biomass production for RET was 24 and 43% higher than for REM and NC, respectively, while REM was 16% higher than NC. These results demonstrate that RET provided better conditions for cotton growth. Govaerts et al. [19] reported that keeping residue on the soil surface improves infiltration, increasing the water available for plants.

Cotton biomass response to N interacted significantly with year when averaged across RM treatments (Table 2). In both seasons, the cotton biomass response to N was quadratic (Figure 5). The small increase between the 100 and 140 kg ha−1 N rates indicates that the N rate required to maximize cotton biomass would be similar to the highest rate used in this experiment. Cotton biomass in 2006 was similar to that reported by Bassett et al. [36] for an N rate of 134 kg ha−1, but it was extremely low compared with the findings of Boquet and Breitenbeck [2]. Despite the similar trend in both seasons, cotton biomass was lower for all N rates in 2007 than in 2006. The difference for the no N control was small, a decrease of 9% in 2007 compared with 2006; however, about 20% less biomass was produced in 2007 than in 2006 when N was applied, regardless of the N rate used. This cotton biomass reduction could be explained by the lower and nonuniformly distributed rainfall during 2007. Additionally, the last 20 days before cutout in 2007 were characterized by elevated temperatures, as indicated by the higher accumulation of HUs relative to 2006. High temperatures and low rainfall in 2007 could have stressed the crop and caused lower biomass production. This may have occurred even though irrigation was applied, because only small amounts of water were applied with each irrigation event (10 to 12 mm) and the high temperatures would have increased water loss through evapotranspiration. These results agree with Balkcom et al. [35], who reported lower cotton biomass in hot and dry years regardless of irrigation.

3.7. Cotton Biomass N Concentration

The N concentration in cotton biomass accumulated between first square and cutout was significantly affected by RM, N rate, and year, but interactions were not significant (Table 2). No winter cover crop had significantly higher cotton biomass N concentration than REM and RET (12 and 22% higher, resp.; Figure 6). Rye residue removed had a numerically higher cotton biomass N concentration than RET (9%), but this difference was not significant. As previously mentioned, N immobilization probably occurred in both growing seasons, but at low levels. This suggests that the lower cotton biomass N concentration in REM and RET relative to NC could be explained by their higher cotton biomass (Figure 4), which may have diluted N in cotton tissues. Rye residue retained and REM accumulated 43 and 16% more biomass between first square and cutout than NC, respectively, but their increases in N uptake relative to NC were only 18 and 5%, respectively. This result also suggests that N dilution occurred in the accumulated biomass. Gerik et al. [3] and Bell et al. [32] reported that under conditions of high N availability, cotton plants increase vegetative growth very quickly, which leads to an N dilution in the biomass produced and a subsequent drop in tissue N concentration.

Cotton biomass N concentration responded linearly to N rate (averaged across years and RM treatments), indicating that the highest N rate applied did not maximize the N concentration in the biomass (Figure 7(a)). This trend was similar to that observed for leaf N concentration, although the leaf N response to N was quadratic.

Cotton biomass N concentration was also significantly influenced by year (Table 2), decreasing by about 28% in 2007 compared with 2006 (Figure 7(b)). A similar pattern was observed at early bloom for leaf N concentration and for cotton biomass production. Cotton biomass and its N concentration decreased simultaneously, but the reduction in N concentration was greater than the reduction in biomass (28 versus 18%, resp.), providing strong evidence that N dilution in plant tissues occurred. These results indicate that the 2007 crop was affected by N dynamics in the soil-plant system. The rainfall regime during 2007 may have played an important role in these findings. The rainfall that fell during the first 10 days of July (about 150 mm) was roughly twice the 70 mm of plant-available water that the soil in the experimental area can retain to a depth of 50 cm. Rainfall in excess of the soil water holding capacity could have leached part of the N fertilizer out of the root zone. These high rainfall events at the beginning of July 2007 occurred only one week after the N fertilizer was applied to the cotton.

3.8. Nitrogen Uptake between First Square and Cutout

Table 2 shows a significant RM × N interaction for N uptake between first square and cutout. The N uptake response to N rate was linear for RET and REM, whereas for NC the relationship was best described by a quadratic model (Figure 8(a)). The N uptake response per kg of N added, up to the highest N rate, was 0.49, 0.61, and 0.68 kg kg−1 for NC, REM, and RET, respectively. Cotton plants in RET absorbed more N regardless of the N rate applied. Although RET had higher N uptake than REM and NC at all N rates, these differences were magnified with increasing rates of N fertilizer (Figure 8(a)). The highest N uptake for each RM treatment occurred at the highest N rate applied; at that rate, N uptake for RET was 32 and 15% higher than for NC and REM, respectively, and uptake for REM was 15% higher than for NC. The linear relationship between N uptake and N rate for RET and REM indicates that the highest N rate applied was not enough to maximize N uptake under the conditions of this experiment. Conversely, NC had a quadratic relationship with a very small N uptake increment between the 100 and 140 kg ha−1 N rates, indicating that the N rate required to maximize N uptake was very similar to the highest rate we applied. Our results for RET were similar to the findings of Bassett et al. [36], who found a total N uptake of 142 kg ha−1 for irrigated cotton that received 134 kg N ha−1. However, a study by Mullins and Burmester [37] reported greater N uptake at an N rate of 112 kg ha−1. The N taken up by cotton plants at the highest N rate (140 kg N ha−1) represented 72, 83, and 95% of the N added for NC, REM, and RET, respectively. This indicates that RET provided better growing conditions for cotton, which likely improved the efficiency of use of the fertilizer N applied.
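
To make explicit how these two efficiency figures are computed (the per-kg uptake response relative to the no N control, and total uptake as a share of fertilizer N applied; these definitions are inferred from the text), here is a short sketch with hypothetical values:

```python
def uptake_response_per_kg_n(uptake_at_rate, uptake_no_n, n_rate):
    """kg of additional N taken up per kg of fertilizer N added, relative to the no N control."""
    return (uptake_at_rate - uptake_no_n) / n_rate

def uptake_share_of_n_applied(uptake_at_rate, n_rate):
    """Total N uptake expressed as a fraction of the fertilizer N applied."""
    return uptake_at_rate / n_rate

# Hypothetical values (kg N ha-1), not data from the study, purely to show the arithmetic:
uptake_140, uptake_0, rate = 120.0, 35.0, 140.0
print(round(uptake_response_per_kg_n(uptake_140, uptake_0, rate), 2))  # 0.61 kg per kg of N added
print(round(uptake_share_of_n_applied(uptake_140, rate), 2))           # 0.86, i.e., 86% of N applied
```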

The amount of N absorbed by a crop depends on its biomass production and its tissue N concentration. When averaged across years, the RM × N interaction was not significant for cotton biomass N concentration or cotton biomass production between first square and cutout (Table 2). Even though no significant interaction existed, cotton biomass N concentration values were slightly higher for NC, followed by REM and RET, whereas cotton biomass was higher for RET, followed by REM and NC, at each N rate (Table 3). This tendency supports the findings of higher biomass production with RET compared with REM and NC (particularly at the 100 and 140 kg ha−1 N rates) and its greater N uptake. These results agree with Gastal and Lemaire [38], who stated that N uptake by crops is mainly driven by the crop growth rate.

There was a significant Year × N interaction for cotton N uptake between first square and cutout (Table 2). The N uptake response was quadratic in 2006 and linear in 2007 (Figure 8(b)). In both years, uptake responded to N up to the highest N rate applied, with 0.7 and 0.4 kg of N taken up per kg of N added in 2006 and 2007, respectively. Nitrogen uptake was lower in 2007 than in 2006 at all N rates, but the differences were greater when N was applied: uptake in 2007 was 33 and 39% lower for the no N control and for the 140 kg ha−1 N rate, respectively, relative to 2006 (Figure 8(b)). This reduction in N uptake can be attributed to the lower cotton biomass production at all N rates and the lower cotton biomass N concentration measured in 2007 (Figures 5 and 7(b)).

3.9. Seed Cotton Yield

A significant Year × RM × N interaction occurred for seed cotton yield (Table 2). In 2006, observed seed cotton yields ranged from 1,740 (REM, no N control) to 3,970 kg ha−1 (REM, 140 kg N ha−1). The seed cotton yield response to N was linear for RET, whereas the responses for REM and NC were quadratic (Figure 9). The highest observed and predicted yields corresponded to RET and REM with the application of 140 kg N ha−1, with both treatments producing similar yields (Figure 9). Seed cotton yield for RET and REM responded to N up to the highest rate applied without reaching a maximum. The fact that a yield plateau was not reached in RET and REM indicates that the maximum yield potential with a cover crop was greater than with no cover in 2006. In contrast, seed cotton yield in NC reached an estimated maximum at an N rate of 102 kg ha−1, and higher N rates decreased yields. The highest predicted seed cotton yield for RET and REM was about 17% higher than that for NC, showing that growing a cover crop was the best scenario for obtaining high yields during 2006, whether the residue was removed or left on the soil surface. Our results for RET and REM are similar to the findings of Reiter et al. [12], who reported that conservation-tilled cotton on a Decatur silt loam responded to N up to 134 kg N ha−1. Seed cotton yields observed in RET and REM were also similar to the results of Clawson et al. [39], who found a cotton response to N up to 151 kg ha−1. However, Wiatrak et al. [40] reported a linear increase in lint yields up to 200 kg N ha−1, a rate considerably greater than the highest used in this experiment.
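
The estimated optima cited here and below (e.g., the 102 kg ha−1 maximum for NC in 2006) correspond to the vertex of a fitted quadratic response. A minimal sketch of that calculation, with hypothetical yields in place of the study's data:

```python
import numpy as np

# Hypothetical seed cotton yields (kg ha-1) at each N rate, used only to show the method.
n_rates = np.array([0.0, 50.0, 100.0, 140.0])
yields = np.array([2200.0, 3100.0, 3550.0, 3500.0])

# Fit y = c2*N^2 + c1*N + c0; the vertex -c1/(2*c2) is the N rate maximizing predicted yield.
c2, c1, c0 = np.polyfit(n_rates, yields, deg=2)
n_optimum = -c1 / (2.0 * c2) if c2 < 0 else float("nan")  # only meaningful for a concave fit
y_max = np.polyval([c2, c1, c0], n_optimum)
print(f"estimated optimum N rate: {n_optimum:.0f} kg ha-1, predicted yield: {y_max:.0f} kg ha-1")
```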

The seed cotton yield response to N during 2006 was 11.3, 15.9, and 9.7 kg of seed cotton per kg of N added for NC, REM, and RET at N rates of 102, 140, and 140 kg ha−1, respectively. The greatest yield increase relative to the no N control corresponded to REM, followed by RET and NC (128, 53, and 45%, resp.), at the previously mentioned N rates. The lower response to N for RET can be partly explained by its greater seed cotton yield in the no N control relative to REM and NC. The REM treatment had the lowest seed cotton yield when no N was applied, with yields 22 and 33% lower than NC and RET, respectively. Lower yields in REM than in NC for the no N control were unexpected, since all cotton growth parameters for REM were similar to or better than those for NC when N was not applied. This yield decrease could be related to a factor or combination of factors directly affecting some of the yield components. However, the application of 50 and 100 kg N ha−1 was enough to increase yields to levels similar to those of NC and RET, and with 140 kg N ha−1 the yield for REM was among the highest. This trend for REM indicates that a severe N deficiency possibly occurred in this treatment when no N was added.

In 2007, seed cotton yields ranged from 1,295 kg ha−1 (NC, no N control) to 2,677 kg ha−1 (RET, 140 kg N ha−1). Rye residue retained and REM had quadratic seed cotton yield responses to N fertilization, while the yield increase for NC followed a linear trend (Figure 9). Rye residue retained had the highest predicted yield at 125 kg N ha−1, followed by REM and NC at N rates of 106 and 140 kg ha−1, respectively (Figure 9). Boquet et al. [41] and Varco et al. [14] reported an optimum N rate of about 118 kg ha−1 for conservation tillage cotton, similar to the value we found for RET in 2007. Rye residue retained required 21 kg N ha−1 more than REM to maximize yield, but it produced a higher yield; the highest estimated seed cotton yield for RET was 12 and 8% higher than for NC and REM, respectively. Nitrogen rates above the optimum for REM tended to decrease yields slightly. A similar reduction in seed cotton yield occurred for NC in 2006 at N rates above 102 kg ha−1. Cotton yield reductions with high N rates were also reported by McConnell et al. [42] and Boquet et al. [43]. High soil N levels can cause excessive vegetative growth, with subsequent competition between vegetative and reproductive structures, which is generally detrimental to boll and lint development, lint quality, and yield [4]. Despite its linear fitted response to N, the declining yield gain with increasing N rate for NC indicates that the 140 kg N ha−1 applied was near the optimum rate. In 2007, NC not only had the lowest yield but also required the highest N rate to achieve its maximum seed cotton yield. Yields during 2007 were highly dependent on residue management; the best situation for achieving high seed cotton yields was to grow a cover crop and keep the residue on the soil surface.

The seed cotton yield response to N in 2007 was very similar among RM treatments: 7, 8, and 7 kg of seed cotton per kg of N for NC, REM, and RET (at N rates of 140, 106, and 140 kg ha−1, resp.). No winter cover crop had the largest yield increase relative to the no N control, followed by REM and RET (75, 53, and 43%, resp.), at the previously indicated N rates. As in 2006, RET had a comparatively low yield increment relative to the no N control in 2007, even though it had the highest estimated seed cotton yield. This pattern is explained by its greater seed cotton yield when no N was added.

In both years, RET had a higher seed cotton yield than REM and NC in the no N control. This result was not expected, because the presence of rye residue with a high C/N ratio on the soil surface has commonly been associated with N immobilization, which reduces soil mineral N levels and decreases yields [10]; with no N added, this effect would be expected to have a greater negative impact on crop yields. However, cotton N uptake between first square and cutout in the no N control, averaged across years (Figure 8(a)), followed a trend similar to that of seed cotton yield: N uptake in RET was 9 and 15% higher than in NC and REM, respectively. These results indicate that, under the conditions of our experiment, N immobilization was not strong enough to reduce seed cotton yields in RET.

4. Conclusions

Rye residue management treatments significantly influenced cotton growth parameters and seed cotton yield. In general, cotton population, leaf and plant N concentration, cotton biomass and N uptake at first square, and cotton biomass production between first square and cutout were highest for RET. However, leaf N concentration at early bloom and cotton biomass N concentration between first square and cutout were highest for NC. Leaf N concentration at early bloom, cotton biomass, and cotton biomass N concentration between first square and cutout increased with increasing N rate when averaged across RM treatments. The highest N uptake was measured in RET at the highest N rate. In 2006, the highest predicted seed cotton yield corresponded to RET and REM with the application of 140 kg N ha−1 (about 3,950 kg ha−1). In 2007, RET had the highest predicted seed cotton yield with 125 kg N ha−1 (2,657 kg ha−1), followed by REM with 106 kg N ha−1 (2,466 kg ha−1). In both years, the lowest predicted yield was for NC. In 2006, the greater cotton biomass for RET compared with REM did not necessarily translate into greater seed cotton yields; however, a stronger association between cotton biomass production and seed cotton yield was observed in the hot and dry 2007 season. Even though RET had low leaf N concentration values at early bloom, it produced high yields in both years, indicating that in our study leaf N concentration was not a good predictor of seed cotton yield. The results of this study show that short-term effects of rye residue removal appear mainly in vegetative growth parameters, whereas its effect on seed cotton yield and on cotton response to N fertilization depends more on the characteristics of the season. No rye residue removal effect would be expected in years with average temperatures and rainfall; however, during hot and dry years, rye residue removal may decrease cotton yields. We anticipate that cotton N requirements when rye residue is removed would not be lower than when the residue is retained. The year-to-year dependence of the impact of rye residue removal on seed cotton yields and on cotton response to N fertilization suggests that long-term studies are required to strengthen conclusions concerning this management practice.

Abbreviations

RM: Rye residue management
NC: No winter cover crop
REM: Rye residue removed
RET: Rye residue retained
N: Nitrogen
P: Phosphorus
K: Potassium
S: Sulfur
DAP: Days after planting
HU: Heat units.

Acknowledgments

The authors would like to acknowledge the staff of the Field Crops Unit at the E. V. Smith Research Center, especially the late Mr. Bobby Durbin, for their support with this experiment. This paper contributes to the USDA-Agricultural Research Service cross-location Renewable Energy Assessment Project (REAP).