Abstract

Wheat (Triticum aestivum L.) is one of the most important and highly productive crops grown under supplementary irrigation in the central region of Santa Fe. However, its production is limited by the presence of diseases during the main yield-defining stages. The objective of this work was to assess wheat health in response to different supplementary irrigation strategies under greenhouse and field conditions. The field experiment included three treatments: dry (D), controlled deficit irrigation (CDI), and total irrigation (TI) using the central pivot method. Disease incidence was measured from stem elongation onwards, and severity was measured on the flag leaf and the leaf below it. Leaf area index (LAI), harvest index, aboveground biomass, and yield components were determined. In the greenhouse, the treatments were TI and CDI, with evaluations similar to those in the field. The major leaf diseases observed were tan spot, leaf rust, and septoria leaf blotch. Significant differences in disease burden, LAI, and yield components were observed among the treatments. Under greenhouse conditions, only tan spot was observed. The results of this study indicate that the application of supplemental irrigation in wheat improved yield without increasing the incidence or severity of foliar diseases.

1. Introduction

The amount of water available for crops is defined by the balance between precipitation and evapotranspiration [1]. Wheat (Triticum aestivum L.) cultivated in the central region of Santa Fe is subjected to periods of water deficit that can significantly decrease yields [2].
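
As a minimal illustration of this balance (a sketch with invented values, not data from this study), the difference between precipitation and evapotranspiration directly gives the surplus or deficit that supplementary irrigation must compensate:

```python
# Minimal sketch of the climatic water balance described above.
# The monthly values are invented for illustration only.

def water_balance(precipitation_mm: float, evapotranspiration_mm: float) -> float:
    """Water available to the crop: positive = surplus, negative = deficit."""
    return precipitation_mm - evapotranspiration_mm

# A month with 60 mm of rain and 95 mm of crop evapotranspiration
# leaves a 35 mm deficit for supplementary irrigation to cover.
print(water_balance(60.0, 95.0))  # -35.0
```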

Because most farmers are focused on grain yield potential, irrigation technology has become an important tool both to maximize production [3] and to reduce the inter-annual variability of yields [4]. Furthermore, wheat is one of the most important agricultural crops treated with supplementary irrigation in humid/subhumid regions [2]. Wheat is also important in crop rotation schemes because its stubble has beneficial effects on soil structure and helps diversify production [5].

Central pivot irrigation is the dominant technique used in this region, but it is not clear whether this technology affects disease susceptibility. The method wets the foliage, reducing its temperature while increasing the relative humidity and the length of time during which the leaves remain wet; both conditions can promote foliar diseases.

Foliar diseases are the main biotic constraints that reduce wheat yield in Argentina [6]. Photosynthesis, respiration, the translocation of water and nutrients, and reproduction are all affected by pathogens. Any interference in these vital processes prevents the plant from taking advantage of the environmental factors necessary for its growth and development [7], resulting in decreased yield potential. Yield potential can be measured through the total amount of biomass generated and the proportion of it allocated to reproductive organs [6]. In wheat, the period from the beginning of stem elongation to flowering, during which the stalk and spike grow together and compete intensely, is crucial in determining the number of grains per unit area [6], the variable most closely associated with crop yield. Maintaining an adequate area of healthy, functional leaves during this period is essential to achieve higher rates of photosynthesis, allowing greater availability and partitioning of photoassimilates towards the ears and therefore a larger number of grains [6].

However, the negative effects of foliar diseases on wheat yield and quality have increased in Argentina over the last several years due to, among other things, the expansion of no-till, the dissemination of susceptible genotypes, and the use of infected seed [8]. Therefore, there has been an increase both in the prevalence of known foliar diseases and in the threat of the emergence of new diseases, according to Perelló and Moreno [8].

The major foliar fungal diseases caused by necrotrophic pathogens in Argentina have historically been tan spot (DTR) and septoria leaf blotch (SLB); the latter is caused by Septoria tritici Rob. ex Desm., teleomorph Mycosphaerella graminicola (Fuckel) J. Schröt. in Cohn. Together with some other pathogenic fungi (mainly Bipolaris sorokiniana (Sacc.) Schoem., teleomorph Cochliobolus sativus (Ito & Kuribayashi) Drechsler ex Dastur and Alternaria spp.), tan spot and septoria leaf blotch form a leaf spot disease complex in Argentina [9].

According to Fernández and Corro Molas [10], the most common and severe wheat diseases in Argentina are leaf rust (LR) (Puccinia triticina Eriks.), SLB, DTR [Pyrenophora tritici-repentis (Died.) Drechs., anamorph Drechslera tritici-repentis (Died.) Shoemaker], and fusarium head blight (FHB) (Fusarium graminearum). Massaro et al. [11] conducted an experimental survey of wheat foliar diseases in the southern region of Santa Fe over seven consecutive years (2000 to 2006) and concluded that the most prevalent diseases were DTR and LR (71% and 86%, resp.); SLB was observed less frequently, in only one of the seven years studied (14%). Work carried out in central Santa Fe since 2003 has shown that FHB can be the most important disease in a given year, with an erratic appearance that is highly dependent on the environmental conditions at the time of flowering [12].

Serrago et al., cited by Simón et al. [9], indicated that a complex of diseases formed by DTR, SLB, and LR reduced grain yield by 1020 kg ha−1 on average. Other authors have reported SLB yield losses of 2–50% [13–18] and as high as 75% [19]. In Argentina, yield losses of 21 to 37% [20, 21] and 20 to 50% [21, 22] have been found in high-yielding cultivars.

Additionally, in Argentina, losses caused by DTR can reach values as high as 14% in grain yield, as well as an 8 to 11% reduction in thousand-grain weight and a 1.2 to 4.5% reduction in hectoliter weight [23]. Globally, yield losses of up to 40% have been reported [24].

Wheat cultivars that are susceptible to LR regularly suffer yield reductions of 5–15% or greater, depending on the stage of crop development [25]. Reductions of 10–30% have also been reported [26, 27].

Seed quality is also essential; the health status of a seed lot is the main criterion for seed quality, together with purity, germination energy, and germinative power [28].

Few studies have investigated diseases of wheat grown under supplementary central pivot irrigation in Argentina. Work carried out in southern Alberta (Canada) showed that wheat foliar diseases increased under sprinkler irrigation [29]. Crops cultivated under irrigation tend to be denser, and this modification of the microclimate influences not only infection but also the sporulation of pathogens and subsequent spore dispersal [29]. The wetting of infested crops promotes the sporulation of pathogens, especially when the crop foliage is dense and the humid conditions produced by irrigation are prolonged. Pathogenic spores can be dispersed directly by irrigation water droplets or indirectly through the hydration of specialized fruiting bodies such as perithecia [29]. In southern Santa Fe, Andriani et al. [30] reported that wheat under central pivot irrigation developed powdery mildew (Blumeria graminis f. sp. tritici) every year. In general, lower yields are closely related to the presence of diseases that affect the entire crop cycle.

The concepts outlined above highlight the importance of obtaining local information about health problems in cultivated wheat and their possible effects on grain production, that is, weighing the yield maximization achieved through supplementary irrigation against the potential negative effects of irrigation on the evolution of diseases.

The objective of this work was to assess the relation between the health of a wheat crop (grown in the greenhouse or field) and the water management conditions used in the eastern/central region of Santa Fe.

2. Materials and Methods

2.1. General Procedures

The experiment was carried out over two successive growing seasons (2009-2010) in the “Miraflores” area (latitude 32°10′14′′ S, longitude 60°59′57′′ W), located in the eastern/central region of the Santa Fe province, with 800 ha under central pivot irrigation with water from the Coronda River. The irrigation system draws water from an intake on the river and conveys it through channels, partly excavated and partly embanked, with four pumping stations. The central pivot covers an area of 32 ha (six towers, 325 m), with an average irrigation flow and depth of 125 m3 h−1 and 8 mm day−1, respectively. The applied droplets are between 1 and 2 mm in diameter, and the leaf-wetting time per pass varies from a few minutes (outer towers) to a few hours (central towers), depending on the applied depth.
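
As a rough plausibility check of these figures (our own back-of-the-envelope arithmetic, not a calculation from the original study), the stated depth and flow imply that the pivot pumps most of the day:

```python
# Back-of-the-envelope check of the pivot figures given above
# (32 ha, 125 m3/h, 8 mm/day); illustrative arithmetic only.

AREA_HA = 32.0
FLOW_M3_H = 125.0
DEPTH_MM_DAY = 8.0

area_m2 = AREA_HA * 10_000                    # 1 ha = 10,000 m2
volume_m3 = area_m2 * DEPTH_MM_DAY / 1000.0   # 1 mm depth = 0.001 m3 per m2
pumping_hours = volume_m3 / FLOW_M3_H

print(f"{volume_m3:.0f} m3/day -> {pumping_hours:.1f} h/day of pumping")
# 2560 m3/day -> 20.5 h/day: the pivot runs almost continuously at full depth.
```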

The climate analysis considered historical information for the central region (Oliveros and Santa Fe), including rainfall, temperature, vapor pressure, wind, radiation, and evaporation.

The soil is a Typic Argiudoll, which is suitable for agriculture (class I; INTA, 1992). Composite surface soil samples (0–0.2 m) were extracted for chemical analysis (pH, total nitrogen, organic matter, phosphorus, sulfur) in order to calculate the required fertilizer doses.

2.2. Treatments

The treatments were as follows: D (rainfed, no irrigation), with plots located outside the irrigated circle; TI, with irrigation managed according to the maximum expected yield and maximum water demand; and CDI, with irrigation managed strategically according to the water deficit. Three plots (replicates of 100 m2 each) were selected for evaluation in each treatment area.

The Cronox cultivar was used for all treatments. Cronox is a short-to-intermediate cycle cultivar with moderate susceptibility to DTR and LR and moderate-to-low susceptibility to SLB, according to the information provided by the breeder. Seeding was carried out on June 10 at a density of 150 kg ha−1 of seed, resulting in a stand of 453 plants m−2. Fertilizer was applied based on a prior soil analysis: 150 kg ha−1 urea (broadcast), 70 kg ha−1 diammonium phosphate, and 50 kg ha−1 calcium sulfate; the harvest was on November 12. In the 2010 season, Cronox was sown on June 23 but at a higher density (160 kg ha−1), giving 409 plants m−2. Plants were fertilized with 120 kg ha−1 urea (broadcast), 100 kg ha−1 ammonium phosphate, and 80 kg ha−1 calcium sulfate, and the harvest was on November 14. Management practices, carried out as usual by the farmers, included the preventive treatment of seeds with an antifungal agent (25% carbendazim + 25% thiram).
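
The reported seeding rate and plant stand are mutually consistent, as the following hedged check shows; the thousand-grain weight and emergence fraction are assumed illustrative values, not data reported in this study:

```python
# Hedged consistency check of seeding rate vs. plant stand.
# TGW (thousand-grain weight) and emergence are assumptions for illustration.

def plants_per_m2(rate_kg_ha: float, tgw_g: float, emergence: float) -> float:
    seeds_per_m2 = rate_kg_ha * 1000.0 / tgw_g / 10.0  # kg/ha -> seeds/m2
    return seeds_per_m2 * emergence

# With an assumed TGW of ~31 g and ~94% emergence, 150 kg/ha gives
# ~455 plants/m2, close to the 453 plants/m2 reported for 2009.
print(round(plants_per_m2(150.0, 31.0, 0.94)))  # 455
```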

2.3. Blotter Test

Seed samples with and without treatment (4 samples of 100 seeds each) were obtained and incubated to measure germination energy (GE) and germinative power (GP). Incubation was carried out using the top-of-paper method according to Peretti [31]. Seed health was measured in terms of pathogen burden as determined by the “blotter test” or, when it was necessary to isolate specific pathogens, through selective culture [32].

Incubation was carried out at 21 ± 1°C, a relative humidity of 80%, and 12 h of light and 12 h of darkness [33] for four to ten days [31]. The incubation temperature recommended by Peretti [31] was not used because it was inappropriate. The GE count was conducted after four days of incubation, and the final count to determine GP was conducted after eight days. Pathogen load was determined by visually inspecting incubated seeds for fungal colonies with a binocular magnifier, both from above and from below [31]. A stereoscopic microscope was used to diagnose fungal structures in slide preparations.

2.4. Foliar Disease Incidence, Severity, and Biomass Determination

The Zadoks scale was used to monitor crop phenology [34]. Disease monitoring was conducted in the field from stem elongation onwards because, during tillering, new leaves appear quickly and disease intensity is reduced [35]. Monitoring consisted of twice-weekly visits to evaluate LR and weekly visits to evaluate foliar spots caused by Drechslera tritici-repentis, Septoria tritici, or Bipolaris sorokiniana. The severity and incidence of these diseases were quantified as the percentage of affected leaf area on the flag leaf (FL) and the leaf below the flag leaf (FL-1).

Fusarium head blight (FHB) results from the development of a complex of pathogenic fungi. Fusarium comprises five main species (Fusarium graminearum, Fusarium culmorum, Fusarium avenaceum, Fusarium poae, and Fusarium tricinctum), with several strains per species. The most common of these species, and the cause of FHB, is F. graminearum [36]. To quantify FHB, we measured the percent incidence (diseased spikes/assessed spikes × 100). We also monitored take-all (Gaeumannomyces graminis), which has become important in the wheat region in recent years due to the increase of inoculum in the soil [37], and powdery mildew (Blumeria graminis f. sp. tritici), which had previously been observed to develop under irrigation [30]. Batch sampling was conducted by randomly selecting 50 tillers along a zigzag path through the sampling area. The Cobb scale was applied to evaluate the severity of LR and foliar diseases on FL and FL-1, and the Stack and McMullen scale was used to evaluate FHB [38, 39]. All scores were expressed as percentages [40]. The incidence (percentage of infected plants) and the percentage of sick leaves (with respect to the total number of leaves) were calculated by separating green, expanded leaves bearing symptoms from those that were healthy. Leaves exhibiting at least one rust lesion or leaf spot >2 mm [35] were considered infected. Because it is difficult to differentiate lesions caused by DTR and SLB, accurate diagnosis was based on the sign: Drechslera bears long conidiophores and conidia, whereas Septoria forms pycnidia and shorter conidia, as observed at 40× with an optical microscope [41].
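
The incidence measures described above reduce to simple proportions; the following sketch illustrates them with invented counts (not data from this study):

```python
# Minimal sketch of the incidence calculations described above;
# the tiller and leaf counts are invented for illustration.

def incidence_pct(diseased: int, assessed: int) -> float:
    """Incidence = diseased units / assessed units x 100."""
    return 100.0 * diseased / assessed

# FHB incidence on a 50-tiller zigzag sample with 4 diseased spikes:
print(incidence_pct(4, 50))    # 8.0 (%)

# Percentage of sick leaves: green, expanded leaves with at least one
# lesion or spot >2 mm, over the total number of leaves examined.
print(round(incidence_pct(31, 150), 1))  # 20.7 (%)
```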

Distrain software was also used to estimate the severity of several diseases, including LR, powdery mildew, SLB, stripe rust, stem rust, and DTR [42].

To estimate the total aboveground biomass (TAB), plant samples were taken at three phenological stages, Zadoks 3.1, 6.5, and 9.2 [34]. Twenty stems were extracted for each of the first two samples, and the leaf area index (LAI) was measured with a LI-COR LI-3000A leaf area meter. The stem and leaf components were separated and dried to a constant weight at 65°C.

2.5. Yield and Yield Components

Yield was determined from two samples extracted at random from physiologically mature plants along one linear meter per plot. In the laboratory, plants and stems were counted for each sample, and subsamples (20 stems) were separated into components (stalk and spike); the numbers of spikelets per spike and of fertile and infertile spikelets were counted. Each component and the rest of the sample were dried separately at 65°C to a constant weight. Each subsample of spikes was threshed manually, and the resulting grains were weighed and counted.
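
For an illustration of how these components determine yield (a sketch with invented values, not the data of Tables 5 or 9), grains per unit area multiplied by individual grain weight gives the grain yield:

```python
# Hedged sketch of how the measured yield components combine into yield.
# All numbers are invented for illustration.

spikes_m2 = 450.0        # from stem counts along one linear meter, scaled
grains_per_spike = 30.0  # from the threshed 20-stem subsample
tgw_g = 34.0             # thousand-grain weight

grains_m2 = spikes_m2 * grains_per_spike
yield_kg_ha = grains_m2 * (tgw_g / 1000.0) * 10.0  # g/m2 -> kg/ha

print(f"{grains_m2:.0f} grains/m2 -> {yield_kg_ha:.0f} kg/ha")  # ~4590 kg/ha
```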

2.6. Statistical Analysis

The experiment was conducted in a randomized block design with three replicates, and analysis of variance (ANOVA) was used to evaluate severity, incidence, LAI, and yield parameters using the program INFOSTAT/Professional version 2009 [43]. Homogeneity of variance was tested by comparing the error mean squares for all dependent variables, and normality was assessed with the modified Shapiro-Wilk test [43]. Means were compared by Tukey's test. The severity and incidence percentages were arcsine square-root transformed for analysis.
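
A minimal sketch of this analysis pipeline is shown below in Python (the study itself used INFOSTAT); the plot-level incidences are invented, and scipy.stats.tukey_hsd stands in for INFOSTAT's Tukey comparison:

```python
# Sketch of the ANOVA on arcsine square-root transformed percentages.
# Plot-level incidence values (%) below are invented for illustration.
import numpy as np
from scipy import stats

d   = np.array([48.2, 45.0, 51.0])  # hypothetical D-treatment plots
cdi = np.array([7.7, 9.0, 6.5])
ti  = np.array([10.2, 8.8, 11.0])

def asin_sqrt(pct: np.ndarray) -> np.ndarray:
    """Arcsine square-root transform of a percentage."""
    return np.arcsin(np.sqrt(pct / 100.0))

f, p = stats.f_oneway(asin_sqrt(d), asin_sqrt(cdi), asin_sqrt(ti))
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

# Pairwise Tukey HSD comparisons (requires SciPy >= 1.8):
print(stats.tukey_hsd(asin_sqrt(d), asin_sqrt(cdi), asin_sqrt(ti)))
```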

2.7. Greenhouse Experiment

In addition to epidemiological studies in the field, we evaluated plants grown in a greenhouse in order to compare the health and yield of this cultivar under different irrigation conditions.

The same wheat variety (Cronox) was used, with a sowing date of May 31, 2010, in furrows of 0.3 m separated by 0.2 m. Greenhouse plants received either the TI (irrigated at 100% of field capacity) or CDI (75% of field capacity) treatment; it was not feasible to include D (rainfed, no irrigation). Cultivation proceeded normally with 47 plants per treatment, equivalent to 400 plants m−2. The treatments began with an initial moisture content equivalent to field capacity.

The plants were kept in a greenhouse at a temperature of 22°C and a 16:8 h (light:dark) photoperiod [44] with high relative humidity (100% for the first four hours, followed by 80%) [45] and were grown in plastic containers with a capacity of 84 dm³ (approximately 0.6 m long × 0.4 m wide × 0.35 m deep). Each container was divided into two equal parts such that each pot contained both treatments. The soil had a silty clay texture and had been conditioned by grinding and sieving (2 mm mesh). The bulk density was 1200 kg m−3, and P, N, and S fertilization was conducted according to a soil analysis.

Interval irrigation was initiated when 75% of the available water had been depleted. A fixed dose of 20 mm was used, representing the estimated useful water in the container, and a pressurized sprayer (Giber) was used to simulate sprinkler irrigation.
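
A minimal sketch of this scheduling rule follows, assuming illustrative values for daily water use (the 20 mm useful-water estimate is taken from the text):

```python
# Sketch of the greenhouse interval-irrigation rule: irrigate a fixed
# 20 mm dose once 75% of the available water has been depleted.
# Daily water-use values are invented for illustration.

AVAILABLE_WATER_MM = 20.0  # estimated useful water in the container
TRIGGER_FRACTION = 0.75    # irrigate when 75% has been depleted
DOSE_MM = 20.0             # fixed dose per irrigation event

def needs_irrigation(depleted_mm: float) -> bool:
    return depleted_mm >= TRIGGER_FRACTION * AVAILABLE_WATER_MM

depleted = 0.0
for day, use_mm in enumerate([2.5, 3.0, 2.0, 3.5, 2.8, 3.2], start=1):
    depleted += use_mm
    if needs_irrigation(depleted):
        print(f"day {day}: apply {DOSE_MM} mm (depleted {depleted:.1f} mm)")
        depleted = max(0.0, depleted - DOSE_MM)
```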

Given that wheat stubble constitutes a natural reservoir of many fungi that cause necrotrophic “leaf spots,” such as Drechslera tritici-repentis, Septoria tritici, and Bipolaris sorokiniana [46], plants were inoculated using a nonquantitative method by recreating the stubble of an infected crop on the soil surface [47]. The plants remained in the greenhouse until harvest (October 19).

Nondestructive methods (i.e., weekly observation through hand lenses) were used to evaluate disease from the beginning of tillering to grain filling. LAI was estimated by subsampling 10 plants per treatment at three phenological stages (Zadoks 2.3, 4, and 7). We used a nondestructive method to measure the length and maximum width of each leaf blade and multiplied this product by a correction coefficient previously obtained with the LAI measurement equipment (LI-COR LI-3000A).
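
A minimal sketch of this leaf-area estimate follows; the shape coefficient below (~0.75, a commonly used figure for wheat leaf blades) is an assumption, not the coefficient calibrated against the LI-COR meter in this study:

```python
# Sketch of the non-destructive LAI estimate: blade length x maximum
# width x a shape coefficient, summed and divided by ground area.
# The coefficient k = 0.75 and the blade dimensions are assumptions.

def leaf_area_cm2(length_cm: float, max_width_cm: float, k: float = 0.75) -> float:
    return length_cm * max_width_cm * k

def lai(leaf_areas_cm2: list[float], ground_area_cm2: float) -> float:
    """Leaf area index = total one-sided leaf area / ground area."""
    return sum(leaf_areas_cm2) / ground_area_cm2

blades = [leaf_area_cm2(22.0, 1.4), leaf_area_cm2(18.5, 1.2)]  # invented
print(f"LAI = {lai(blades, ground_area_cm2=50.0):.2f}")  # 0.80
```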

Yield was determined using the same methodologies as in the field experiment. The trial was conducted in a randomized block design with four replicates, and severity, incidence, LAI, and yield parameters were evaluated using analysis of variance (ANOVA).

3. Results

3.1. Seed Analysis and Blotter Test

The GE and GP values obtained for the seeds from the 2009 season were 100% and 99%, respectively, for untreated seeds and 99.5% and 97.5%, respectively, for treated seeds; in 2010, these values were 98.75%, 98.25%, 99.5%, and 99%, respectively. According to Peretti [31], all of these values are within the acceptable ranges for regulated wheat seed.

In the untreated seeds from 2009, the incidence of microorganisms was 30.5%, predominantly “black point” grains caused by Alternaria spp. and, to a lesser degree, Aspergillus spp., Drechslera tritici-repentis, and Bipolaris sorokiniana. In contrast, the incidence of microorganisms in treated seeds was 0%.

The high humidity and high temperatures that occurred towards the end of the 2009 growing season, coupled with poor storage conditions, increased the incidence of the pathogens that cause discoloration and deterioration of seeds. This result was verified in the analyses performed on seeds that were stored by the farmer and used for seeding in the 2010 season, which contained Alternaria spp., Bipolaris sorokiniana, Penicillium spp., and Rhizopus spp. The overall incidence of pathogens was 72% for untreated seeds and 12% for treated seeds. A higher incidence of pathogens (Penicillium spp., Rhizopus spp., and Alternaria spp.) was detected during storage (Table 1), and the presence of Penicillium spp. in the seeds treated by the farmer suggests an incorrect fungicide dose. However, GE and GP were not affected by this pathogen.

Exposure to fungi in the field and during storage affects germination, seedling stand, grain size and weight, and industrial quality. In wheat, these fungi are associated with the grain spotting known as “black scutellum” or “blackpoint.” This pathology is characterized by a black or brown coloration in the area of the embryo, which can also extend to the surrounding area and the crease [33].

3.2. Field Trials (2009 Season)

A total of 310 mm of effective rainfall was received in 2009, which was greater than the historical average (Figure 1). Because of this heavy rain, only two irrigation events totaling 64 mm were applied during the growing season, and both the TI and CDI treatments received the same amount of water. The first irrigation was administered on August 15 during phenological stage 2.3, and the second on September 5 during phenological stage 3.1.5.

The daily average air temperature was below 16°C in June, July, and September, as well as in the first ten days of August and the second ten days of October. The lowest temperature was recorded on July 14 (−8°C). Three consecutive days with temperatures above 21°C were recorded in the last ten days of August and of October, and two such days in the second ten days of November.

The ambient relative humidity remained above 60%, and leaves remained wet for more than 15 hours on two consecutive days during the second ten days of July and the first ten days of September.

The three most frequent leaf pathologies, LR, DTR, and SLB, were identified in all treatments.

The incidence of foliar diseases was higher in the D treatment than in the other treatments, although the severity remained below 1% in all treatments. The average disease incidence (percentage of sick leaves with respect to the total number of leaves), both in general and at different phenological stages, was significantly different (Table 2) between the D treatment and the two remaining treatments (CDI and TI).

This pattern was likely observed because the nonirrigated wheat did not achieve total coverage of the furrows, even at advanced stages of development (Zadoks 6, anthesis), suggesting that the reduced furrow coverage allowed foliar diseases to colonize the upper strata of the crop. This supposition is consistent with the LAI results, which were significantly different between D and the irrigation treatments at Zadoks 4 (3.95 versus 4.99 and 5.56 for CDI and TI, resp.). In contrast, the length of leaf wetting caused by sprinkler irrigation was not significantly different from the normal wet periods caused by ambient humidity during the crop cycle.

The individual development of each foliar disease present during the crop cycle was analyzed. In general, epidemics of SLB are driven by a combination of favorable climatic conditions (usually characterized by long periods of light rain and moderate temperatures), certain cultivation practices, the availability of inoculum, and the presence of susceptible varieties [48]. A relatively low intensity of foliar disease was observed during this growing season, and diseases were not identified in a uniform manner across treatments or sampling dates. The highest SLB incidence was just 10.5%, observed in samples from the D treatment analyzed on October 15. Injuries corresponding to SLB were observed on FL-1 in only two leaves (of 50 analyzed) in the CDI treatment, with a severity of less than 1%.

DTR was the most frequently observed disease throughout the analysis period, with an average incidence of 20.07%. Together with LR, DTR accounted for 31.58% of leaf injuries, observed on both FL and FL-1. The fungus survives in the stubble and, under humid conditions and adequate rainfall, releases spores that infect the lower leaves. From there, the disease advances to higher leaves by rain splash or air circulation [49], conditions that occurred in the D treatment because the furrows were not completely covered.

There were significant differences between the D treatment and the irrigation treatments (CDI and TI), with the exception of the sampling on September 16, in which the differences were not significant (Table 3). This finding can be attributed to a dilution of the disease by an increase in leaf area; DTR was initially present on the basal leaves, but these leaves had dried up by more advanced phenological stages.

The average incidence of LR was 11.52% over two sampling dates. LR was first identified at Zadoks 3.9 (September 16) and was most frequent in the D treatment (11.73% versus 1.2% and 0.5% for CDI and TI, resp.). During the next week (September 22), an application of 18.7% trifloxystrobin (a strobilurin) and 8% cyproconazole (a triazole) was made to control the disease. This application remained active for up to 60 days, which allowed a reduction in the number of active pustules of Puccinia triticina and the control of this disease. However, a second LR infection cycle followed. This likely happened because urediniospores are relatively long lived and can survive in the field without being deposited on host plants for periods of several weeks [26]; thus, very early treatment, insufficient wetting of the basal leaves, or favorable environmental conditions may allow reinfection by this polycyclic pathogen.

On September 30 (the sampling conducted before the new LR attack), conditions in the experimental area were highly favorable for pathogen development. According to INTA Gálvez, in the first ten days of October, the maximum, minimum, and average temperatures were 22.1°C, 4.7°C, and 13.5°C, respectively. Rainfall of 101 mm accumulated in just 15 days (for comparison, the historical average for October is 105 mm), and several days were misty and foggy, which resulted in water accumulation on the leaves.

At Zadoks 7.05 (October 15), this disease was identified in all three treatments. D exhibited the highest incidence (48.2%), while CDI and TI exhibited incidences of 7.67% and 10.17%, respectively. LR infection was found on FL-1 (38% in D, 6% in CDI, and 6% in TI), but with severity levels of less than 1% in all treatments. Some FL were also infected, but only in D, with an incidence of 4%. Significant differences were found between D and the two irrigation treatments (Table 4). These differences likely occurred because D, as a result of not adequately covering the furrows, allowed a greater remobilization of spores by wind and rain and consequently higher levels of infection, even reaching FL and FL-1.

In addition to the observed foliar diseases of fungal origin, large, dry, grayish-green lesions corresponding to bacterial blight caused by Pseudomonas syringae were observed on FL at the last sampling date (October 15). This “leaf blight” is favored by relatively cool temperatures (14 to 23°C) and high relative humidity, conditions that were present in the experimental field.

Finally, at physiological maturity (November 12), spikes were analyzed using the moist chamber method. Stained glumes caused by the saprotrophic fungus Alternaria spp. were detected, with incidences of 100% in D, 50% in TI, and 46% in CDI. The presence of this fungus was also observed in the blotter analyses described in the previous section. No frost damage was observed, nor were Fusarium graminearum, Gaeumannomyces graminis, or Erysiphe graminis detected, although insect damage was present.

Although foliar diseases were common throughout the growing season, high yields were obtained in all treatments, as evaluated by the number of spikes. The only significant differences observed between D and the irrigation treatments (CDI and TI) were in the weight of 1000 grains (Table 5).

The critical period for the main component of wheat yield (grains m−2) ranges from 20 to 30 days before to 10 days after flowering. This is therefore the period during which leaf health is most crucial for the plant to take advantage of incident radiation to maximize the growth and viability of the grains. Serious losses can also occur when the flag leaf is infected prior to anthesis. However, even the most prevalent diseases never exceeded 4% incidence or 1% severity on FL, so they were considered unlikely to have caused yield loss, regardless of the time of occurrence. Furthermore, crop health was generally very good, and yield differences between treatments were attributed to other causes (e.g., differences in water availability and the LAI achieved in each treatment).

3.3. Field Trials (2010 Season)

During the wheat growing season, from planting until harvest, a total of 184 mm of effective rainfall was received, well below the normal rainfall for the study area. Due to the lack of rainfall, four irrigations were conducted, totaling a net depth of 180 mm. The first irrigation applied 40 mm on August 7 (Zadoks 2.2), the second 50 mm on September 28 (Zadoks 3.9), the third 40 mm on October 5 (Zadoks 5.5), and the final 50 mm on October 20 (Zadoks 7).

The average daily temperature was below 16°C during the last third of June and during July, August, September, and October. In the first days of November, the daily average temperature exceeded 22°C (Figure 2). The lowest minimum temperature, −7.7°C, was recorded in August, and the highest maximum, 36.2°C, at the end of the growing season. The average humidity remained above 57% over the whole growing season, and leaves remained wet for more than 15 hours during the second ten days of July, the third ten days of August, and the first and third ten days of September.

Similar to the results from 2009, all three principal foliar diseases (LR, DTR, and, to a lesser extent, SLB) were observed. Disease was significantly more prevalent in the D treatment than in either irrigation treatment. Disease severity reached 15% on FL-1 and 10% on FL in the D treatment but only 5% and 1% on FL-1 in the CDI and TI treatments, respectively.

The first sampling was carried out at Zadoks 2.2 (August 16), at which point some development of DTR could be observed on the basal leaves in all three treatments. Later (September 16), the irrigated and nonirrigated plants exhibited different phenological states (Zadoks 3.3 for D and 3.1 for TI and CDI). At this time, significant differences were observed between D and the two remaining treatments, both in disease incidence and in LAI (Table 6). Both DTR and, to a lesser degree, SLB were present. These diseases reached the upper strata in the D treatment but were restricted to the basal leaves in the irrigated treatments.

At the following sampling, at Zadoks 5 (September 30), only DTR was identified. The disease remained confined to the lower strata in the irrigated treatments, in contrast to the D treatment, where it colonized the upper strata of the crop. This corresponded to a higher incidence of DTR: 52.7% in D compared with 23.7% and 20.1% for CDI and TI, respectively.

In the following sample, collected at Zadoks 6.5 for D and 6.2 for TI and CDI (October 13), LR was observed in addition to DTR. Significant differences between the irrigated and rainfed treatments were observed (Table 7). As suggested for the previous year, the differences in disease behavior could be due to the fact that plants in D did not totally cover the furrow, which is consistent with the measured LAI values (Table 6).

SLB infection levels were low due to the low rainfall and limited hours of leaf wetness, which did not allow SLB establishment and dispersal. The recorded incidence values were 1.22% in D, 3.14% in CDI, and 3.44% in TI. Septoria tritici, the causal organism of SLB, requires temperatures of 20 to 25°C [50] and water on the leaves for 35 hours followed by 48 hours of high relative humidity to favor heavy infection [51]. These conditions did not occur until November, and the disease was identified only in the first sampling.

DTR was present from the tillering stage to the end of the growing season. The persistence of wheat straw on the soil surface, associated with moderately conducive weather conditions, favored the emergence and constant development of DTR throughout the entire crop cycle, with an incidence that varied but consistently increased with phenological state. Significant differences were observed between D and the irrigated treatments (Table 8). DTR was observed more frequently on FL and FL-1 in D than in the irrigated treatments. On FL, the incidence of DTR was 24% in D versus 2% in CDI and 12% in TI; on FL-1, the incidence peaked at 86% in D versus 26% in CDI and 30% in TI.

The spread and infection of Drechslera tritici-repentis can occur under a wide range of environmental conditions; in general, temperatures between 10 and 30°C and humid periods of 6 to 48 hours are sufficient [52–55]. Therefore, tan spot (DTR) appears every year, in contrast to other diseases, such as FHB, which are strongly dependent on environmental conditions [41].

The onset of LR was significantly delayed in 2010 relative to 2009 and was first registered only at the beginning of flowering. According to INTA Gálvez, the maximum, minimum, and average temperatures during the second ten days of September were 27.5°C, 8.1°C, and 16.8°C, respectively. A total of 56.4 mm of rainfall was recorded in the last two weeks of September and the first ten days of October, and leaves remained wet for up to 17 consecutive hours on several days in the last third of September.

Statistical analysis showed significant differences in LR incidence between D and irrigation treatments (37.07% versus 8.73% in CDI and 9.5% in TI).

The disease reached FL-1 with an incidence of 40% in D, 2% in CDI, and 10% in TI. The severity reached levels of 15% in D but was less than 1% with only one to two pustules per leaf in the irrigation treatments. LR was observed in FL only in the D treatment, with an incidence of 6% and a maximum severity of 10%.

Finally, on November 11, samples were extracted to analyze the crop yield. Very good results were obtained in all treatments, although a significant difference in the weight of 1000 grains was observed between D and TI. The differences between CDI and either D or TI were not significant (Table 9).

In terms of the health of the spikes and grains, Fusarium and Alternaria spp. were not identified because environmental conditions did not favor their appearance. Grain is susceptible to infection by Alternaria during the filling and ripening stages, particularly the milk and dough stages [56–58]. Alternaria sporulates between 0°C and 35°C, with an optimum at 27°C, although sporulation is strongly inhibited below 15°C and above 33°C [59]. Moschini et al. [57] determined that the severity of this disease in Argentina is favored by warm temperatures, frequent rainfall, and days with relative humidity above 62% during the grain development period spanning about 30 days after heading; these conditions did not occur in the 2010 season.

Additionally, Gaeumannomyces graminis and Erysiphe graminis were not detected. Frost did not decrease grain yield because it occurred during noncritical stages of the crop.

3.4. Greenhouse Trials

The first irrigation was conducted at Zadoks 2.2 (July 4) for both treatments. Over the entire growing season, TI received 340 mm, while CDI received 240 mm. The initial inoculum from the straw, together with water droplets from the first irrigation, led to the development of DTR and SLB.

The first symptoms were observed during full tillering (Zadoks 2.2, July 16), although differences became significant only after September 14, when the TI treatment was at Zadoks 7 and the CDI treatment at Zadoks 6. At this point, the yield components had already been defined (Table 10).

DTR infection reached both FL-1 and FL. The maximum incidence on FL-1, observed at Zadoks 7.0, was 65% in TI and 45% in CDI, and severity values reached 10% in both treatments. The incidence of DTR on FL reached 70% in TI and 25% in CDI, with a maximum severity of 5% in both treatments.

It should be noted that lower levels of incidence and severity were reported in CDI in the greenhouse trials than under field conditions.

LAI values were similar between treatments (Table 11), but significant differences in the weight of 1000 grains and BHG were observed between the two treatments (Table 12).

4. Discussion

The genera of fungi identified in this analysis correspond to those recognized by Can Xing et al. [60] and Ramirez et al. [61]. These results highlight the importance of sowing fungicide-treated seed. Seed treatment with fungicide both eradicates the inoculum, so that it does not constitute a primary or initial source of infection, and protects the seed and seedlings from fungal infection in the soil, which indirectly increases germination and ensures crop establishment [28].

During the two agricultural cycles evaluated, DTR and LR were the dominant foliar diseases. The cultivated plants remained healthy until advanced stages of development, and the severity of both foliar diseases was low in all of the treatments tested. In the 2009 season, 100% of plants in all treatments exhibited some degree of infection, although the severity was very low (less than 1%). Similarly, in the 2010 season, 100% of the experimental plants exhibited some degree of infection, again with relatively low severity (less than 15% in D, below 5% in CDI, and 1% in TI).

Plants that received irrigation treatments exhibited lower levels of foliar diseases in both years. These results conflict with those of a previous study [29] conducted in southern Alberta, which suggested that sprinkler irrigation generates denser crops, thus modifying the local microclimate and creating optimal conditions for the development of diseases. However, those authors also suggested that irrigation influences the development of diseases not only through its impact on infection conditions but also through the sporulation of pathogens and subsequent spore dispersal.

The lower disease burden of irrigated plants, observed during both years, may be attributed to the fact that better nourished plants (i.e., plants with greater access to water) are generally more tolerant of, or less affected by, foliar diseases. The work of Annone et al. [62] and of Formento et al. [49] has shown that nitrogen fertilization at the right time may reduce the development of diseases such as DTR and increase the green tissue remaining in many leafy cultivars. The incidence of DTR in D was lower in 2010 than in the wet year 2009, despite the drier environmental conditions and thus limited water availability. In contrast, Annone and García [63] assert that any measure that directly or indirectly reduces the likelihood of secondary inoculum moving among plants and between the lower and upper levels of the canopy reduces the final level of symptoms. Therefore, management practices aimed at obtaining the densest possible canopy, such as adjusting the seeding density based on grain weight, balanced fertilization to produce a compact cultivation structure, and using the narrowest feasible row spacing and a cultivar appropriate for the desired sowing date, mitigate the development of “leaf spots.” This is consistent with the higher incidence of DTR identified in D, which exhibited incomplete furrow coverage and low LAI, and therefore a less dense cultivation structure, which allowed higher levels of infection, even on FL and FL-1. The tests performed by Perello et al. [18] showed that disease incidence increased with plant age and that severity increased with growth stage when the evaluation was performed 14 days rather than 7 days after inoculation. This coincides with the higher incidence found at more advanced crop stages in the different treatments.

SLB was minimal (trace levels) in both years and was observed more frequently in D plants than in irrigated plants, especially during the more humid 2009 season. These results can be attributed to the plant density generated in each treatment; as discussed above, plants in the D treatment did not fully cover the furrows, unlike the plants under irrigation, thus allowing the disease to develop further. This finding is consistent with the work of Massaro et al. [64], who emphasized that growing crops at an optimal density, without large spaces between plants, can reduce the epidemic development of SLB through secondary infections spreading from the sites of primary infection (basal leaves) to the upper leaves. In contrast, Klatt and Torres [48] have noted that tall varieties of wheat tend to be less affected by SLB than short or semidwarf varieties. In general, this is due to a morphological resistance resulting from the increased distance between the leaves, which tends to impede the upward progress of the pathogen through the splashing of raindrops. In semidwarf wheat cultivars, the leaves are closer to each other and the foliage tends to be denser, facilitating the upward spread of disease.

The results of our greenhouse experiments do not override those obtained in the field; no significant differences in severity or incidence were verified between the two irrigation regimes.

Finally, significant differences in productivity were observed between the irrigation and rainfed treatments. These differences were due to the application of water during the stem elongation stage (Zadoks 3.0), which allowed the survival of more tillers and therefore more spikes than in the D treatment [65].

5. Conclusions

Based on tests carried out over two consecutive years, supplementary sprinkler irrigation of cultivated wheat at opportune moments, even in small quantities, increases grain weight and thus yield without increasing the incidence of foliar diseases. Two fundamental principles should be considered for the correct management of wheat diseases: (1) the initial health of the crop should be optimized by using seeds with a low pathogen load, and (2) appropriate monitoring should be conducted to properly quantify the diseases present in the field.