Journal of Sensors
Volume 2018, Article ID 3949415, 25 pages
Research Article

Affordable Bimodal Optical Sensors to Spread the Use of Automated Insect Monitoring

1Department of Music Technology and Acoustics, Technological Education Institute of Crete, Rethymno, Greece
2Department of Electronics, Technological Education Institute of Crete, Chania, Greece
3Mechanical Engineering Department, Technological Education Institute of Crete, Heraklion, Greece
4Biogents AG, Regensburg, Germany

Correspondence should be addressed to Ilyas Potamitis; potamitis@staff.teicrete.gr

Received 22 December 2017; Accepted 25 February 2018; Published 8 May 2018

Academic Editor: Eduard Llobet

Copyright © 2018 Ilyas Potamitis et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


We present a novel bimodal optoelectronic sensor based on Fresnel lenses, together with an associated stereo-recording device, that records the wingbeat event of an insect in flight as backscattered and extinction light. We investigate the complementary information of these two sources of biometric evidence, and we finally embed part of this technology in an electronic e-trap for fruit flies. The e-trap examines the spectral content of the wingbeat of the insect flying in and wirelessly reports counts and species identity. We design our devices to be optimized in terms of detection accuracy and power consumption, but above all, we ensure that they are affordable. Our aim is to spread the use of electronic insect traps that report, in virtually real time, the level of the pest population from the field straight to a human-controlled agency. Our vision is to establish remote automated monitoring for all insects of economic and hygienic importance at large spatial scales, using their wingbeat as biometric evidence. To this end, we provide open access to the implementation details, recordings, and classification code we developed.

1. Introduction

In the context of integrated pest management (IPM), insect pest population monitoring is crucial [1–3]. The decision to take action against pests using chemical or biological measures is based on insect population measurements. These measurements define the economic injury level, that is, the point in time after which economic damage appears. The simplest method to monitor the population of insects is to use insect traps, which are commercially available for all common pests. Insect traps are usually low-cost plastic boxes coming in different configurations and carrying a pheromone or food attractant. The cost of applying population monitoring through a network of traps is mainly due to expenses for manual practices (wages for placement of traps, scouters to report counts, zone managers to oversee scouters, etc.). As reported in [4], the California Department of Food and Agriculture operates a network of roughly 63,000 attractant-based traps to monitor Diptera: Tephritidae; likewise, in Israel, approximately 2600 traps monitor 20,000 ha of citrus orchards for fruit flies. Besides insects of economic importance such as fruit flies, insects of hygienic importance are also commonly monitored. Mosquitoes and biting midges such as Culicoides (Diptera: Ceratopogonidae) can transmit serious diseases to humans and livestock. Therefore, large networks of traps are deployed to monitor their presence and population dynamics in an effort to keep an eye on the situation or the effectiveness of a treatment. Due to the enormous cost of manual monitoring and the compromises that often appear in practice, there is a pending need to automate the monitoring of insect presence by electronic means, that is, smart traps, also referred to as e-traps.

The concept of e-traps has sporadically appeared in the past [5–7], with a focus on upgrading typical plastic traps with a device that senses the incoming insects and a communication capability to log these counts. Early efforts, though inspirational, were at the time necessarily fragmentary and limited to small-scale paradigms that neither communicated with one another nor could be seamlessly integrated into a universal view of insect fauna monitoring. Recent approaches have advanced to the point of transmitting insect counts using the General Packet Radio Service (GPRS) functionality [8–12]. The emergence of the Internet of Things (IoT) concept [13], which allows the networking of physical devices to exchange and report data, along with the possibility of providing Internet services with global coverage, creates new opportunities for communication and cooperation between e-traps and a central agency.

It is the fortunate coincidence of the simultaneous maturation of diverse technologies such as low-power electronics, the IoT, and artificial intelligence that allows insect monitoring to be viewed through the prism of cyber-physical systems and possibly reach a global scale of application.

From our point of view, there are good reasons why automatic insect monitoring has prospects. To mention but a few: manual surveillance of large networks of insect traps is an indispensable stage of IPM in many countries. In addition, services that do not exist today because of manpower constraints are bound to emerge: on-time infestation/outbreak prediction, time stamping of insect captures and their correlation with the efficiency of attractants, assessing the nocturnal activity of insects, and many more. Moreover, the first commercially available optoelectronic counters of insects have either entered the market [14] or are about to [15]. Likewise, image-based e-traps are already on the market [16–18]. The main advantage of our approach compared to image-based systems is that it does not require human experts to interpret images. Wingbeat classification is performed as the insect enters the trap, which is easier than analyzing the challenging conditions of an image with superimposed insects.

The objective of this work is to investigate two types of optical parameters that can originate from a wing-beating insect hit by the light of an infrared emitter: (a) the so-called extinguished light (or extinction radiation), which is the light variation due to the shadow cast on the receiver by the wings of the insect, and (b) the light side scattered (reflected at 90°) by the wings and main body. These are different interaction mechanisms. In this work, we show their complementarity in extracting relevant information on the identity of the insect. We suggest an innovative type of sensor construction based on Fresnel lenses, capable of taking simultaneous recordings of both types of light from the wingbeat of an insect in flight. The information contained in the samples of these recordings is tested on the difficult task of discerning morphologically similar fruit flies whose wingbeat spectra totally overlap. Consequently, we embed part of this sensor into a Fresnel-based version of the e-trap originally presented in [12, 19], along with technical improvements. The enhanced version of the e-trap is able to analyze the recording in situ using a simple but powerful algorithm and also upload the wingbeat snippet to a server, where it can be examined by heavy-duty machine-learning classifiers.

Our electronics are designed to meet multiple types of constraints, mainly in terms of efficiency and compactness and, as regards the e-trap, power consumption. A top priority and a hard, nonnegotiable constraint we need to meet is cost-effectiveness, so that the devices can be integrated into practical and affordable insect traps, thus allowing new services to emerge. Therefore, in this work, we do not optimize in terms of reflectance at short-wave infrared (SWIR) wavelengths, as this would require expensive photodiodes, two orders of magnitude more expensive than those working in the near infrared (NIR) [20]. Moreover, both at the emitter and the receiver of light, we make use of cost-effective acrylic Fresnel lenses that are able to provide collimated light and a probe volume (PV) that allows the reception of high-quality wingbeat recordings of fast-flying insects. The advantage of the Fresnel lens compared to the light guide used in [21] is significant, as collimation of light provides a better signal-to-noise ratio (SNR) at lower power consumption and allows the construction of a more compact and easier-to-assemble sensor than a light guide. Acrylic lenses are cost-effective compared to other kinds of lenses used in laser applications.

Coming to recognition efficiency, we report recognition scores that exceed 98% on the discrimination of morphologically similar fruit flies using the stereo sensor and a close match of reported versus actual insect counts after two months of continuous operation in an olive orchard.

2. Materials and Methods

In Section 2.1, we describe in detail the bimodal sensor that senses the wingbeat. In Section 2.2, we present the stereo recorder associated with the sensor in Section 2.1. In Section 2.3, we present the electronic McPhail trap based on the findings of Sections 2.1 and 2.2. Finally, all supportive materials, including a well-documented database of recordings, can be found in the Schematics list of the Appendix. Further on, we describe the basic principle of the circuits in detail, and for the sake of clarity, we have moved all schematics to the Appendix. Any comment on the circuits refers to the schematics included therein (e.g., Figures 10 and 19 refer to the figure in Appendices E.1 and E.2, resp.).

2.1. The Wingbeat Sensor Based on Fresnel Optics

The sensor is described in detail in Figure 1. It is a cube with an open top and bottom so that insects can fly through. The extinction light variation is produced by an infrared light emitter with its associated Fresnel lens and a receiver (top-bottom in Figure 1). We use acrylic Fresnel lenses that have molded curved inclined planes (the so-called "grooves") on their surface to guide the light beam. In the context of our work, the emitter is composed of a single LED and an associated Fresnel lens that produces a collimated light beam (i.e., a light beam travelling parallel to the optical axis). Another Fresnel lens at the receiver is used as a collector that focuses the collimated beam of light onto a single photodiode. The emitter (see Figure 1) is based on a single infrared light-emitting diode (LED SFH4726S, OSRAM Opto Semiconductors, Germany) fixed at the focal point of the Fresnel lens. The receiver (Figures 11(a) and 11(b)) is a photodiode (TEMD5110X01, Vishay Electronic GmbH, Germany) fixed at the focal point of the receiving Fresnel lens. Both lenses are identical (item 3, diameter 50 mm, focal length 32 mm, overall size 58 × 58 mm, Fresnel Technologies Inc., USA). The TEMD5110X01 photodiode and the SFH4726S LED are matched at 940 nm. The LED also has a large beam angle of ±75 deg and conforms to the chosen Fresnel lenses, which need at least ±40 deg so that the beam expands from the focal point to the plane of the lens. The emitter illuminates the flying insect, and we record the light variation at the receiver due to the time-varying shadow that the wings of the insect cast on the receiver. The maximum flow of light between the emitter and receiver of extinction light appears when there is no insect inside the PV. Therefore, the light intensity measured at the receiver starts from its maximum value and drops in the presence of a wing-beating insect.

Figure 1: (a) An illustration of the bimodal sensor concept. The light of the infrared LED (bottom) passes through the Fresnel lens and is collimated. The insect's wingbeat casts a shadow on the opposite receiving Fresnel lens (top). The collimated light is also partially side scattered at 90° and directed to the passive Fresnel lens that records the reflected light. (b) A CAD design of the prototype sensor. A cone-shaped piece of dark plastic fixes the LED and photodiodes at their correct focal points.

The reflected light, on the other hand, relates to the refractive index of the wing membrane and the glittering of the insect [22, 23]. The sensor of the reflected light is passive and fixed to look at a black termination plane at the opposite plane (MAXiBLACK, Advanced Coating Products, Acktar Store LTD, Israel). Therefore, the intensity of light in the absence of an insect is theoretically zero and practically equal to the minimum light reflection stemming from the black termination plane. In short, we illuminate our targets and detect light scattered at 90 and 0 degrees (see Figure 1).

This setting allows us to record the same wingbeat event as it is perceived by the two different modalities simultaneously and, therefore, allows us to assess the complementarity of this information with regard to identifying the insect from its wingbeat. Note that extinction and scattered light, in principle, could reveal different aspects of insects’ characteristics. Two identical insects with differences only in the melanisation of the wings would cast the same shadow (provided the wings are not transparent) but would produce different reflected light patterns.

On the other hand, two insects with the same reflective properties would produce different extinction light variations if they had morphological differences or even the smallest differences in their wing-beating muscles.

The receiver of the scattered light receives only a small amount of side-scattered intensity from the wingbeat of the insect; therefore, we choose a high-power 3.4 W LED driven at 0.85 W to achieve a good SNR. To avoid saturation at the photodiode, we apply two layers of an absorptive infrared-blocking film (Crystalline CR40, 3M, Saint Paul, MN, USA) on the receiver of the extinction light, which is opposite the emitter, so that the inverse current of the photodiode falls below 100 μA.

2.2. The Stereo Wingbeat Recorder

In [21], we presented an approach that is based on modulating the insect wingbeat signal to high frequencies, cleaning the lower frequencies contaminated with all sorts of electromagnetic interferences, and demodulating back the insect’s frequencies. This approach is kept in the new recorder as this equipment needs to function in laboratories that have several sources of optical and electromagnetic interference.

The embedded processor (STM32F429ZI ARM Cortex-M4) (Figure 12) follows the Inter-IC Sound (I2S) protocol for connection with digital audio devices. The ADC (PCM4220, Texas Instruments) has two independent multibit sigma-delta ADCs with PCM output word length reduced to 16 bits. The embedded microprocessor runs a constantly looping program that processes the data captured by the two modalities. The board is programmed in C/C++. The line-level output from the optoelectronic sensor is copied to two circular buffers. The first buffer is used to monitor the signal's root mean square (RMS) over a window of 128 samples. If the RMS of the window exceeds a predefined threshold, we assume that an event has occurred, that is, an insect has crossed the sensor's PV. The processor oversees all processing stages of the device. It is powered at 3.3 V through IC2 and has an embedded real-time clock (RTC) that keeps time powered by the B1 battery (Figure 12). It provides the clock signal to the ADC, accepts the digital audio samples through the I2S interface, decides if a trigger has occurred, and, if so, stores the 625 ms snippet on the SD card as a 16-bit, 16 kHz stereo WAV recording. We have placed a low-pass RC filter (R75, C120, and C116 in Figure 13) in the power line of the CDCE913 to eliminate the noise that occurs while writing to the SD card (see Figure 13). The power supply unit of the device accepts an input of 9 to 16 VDC and produces +5 V, +12 V, and −12 V (see Figure 14).
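The RMS triggering rule above can be sketched as follows. The threshold value and function names are illustrative assumptions; the device reads its actual threshold from the SD card at power-up.

```python
# Sketch of the 128-sample RMS trigger described in the text.
# THRESHOLD is an assumed value, not the firmware's actual setting.
import math

WINDOW = 128        # samples per RMS window, as stated in the text
THRESHOLD = 0.02    # assumed trigger level on line-level samples

def window_rms(samples):
    """Root mean square of one window of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def is_event(samples, threshold=THRESHOLD):
    """True if the window RMS exceeds the threshold, i.e., an insect is
    assumed to have crossed the sensor's probe volume (PV)."""
    return window_rms(samples) > threshold
```

In the device, this check runs continuously on the circular buffer, so the snippet written to the SD card includes samples captured just before the trigger fired.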

The driving circuit of the LED (Figure 10) receives a pulse sequence from the clock generator (CDCE913, TI) (Figure 13), which produces 455 kHz sequences using a 24.576 MHz crystal. The TEMD5110X01 photodiode and the SFH4726S LED have small rise and fall times, and we need this property because the modulation circuit (Figure 10) creates high-frequency pulses. The CPU instructs the clock to commence with a single command, and the receiver functions as a simple envelope detector. Accordingly, the lock-in detector AD630 (see Figure 15) is configured as an envelope detector. The driver (Figure 10) controls the LED current of the emitter. It is powered by 5 VDC and accepts the clock signal from the CDCE913 integrated circuit. The circuit that produces clocked pulses for the LED is controlled by the MCU through the I2C interface and generates the 455 kHz frequency for the LED driver (Figure 13). Current stabilization of the LED is achieved through the MOSFET, a transistor, and two resistors (R1, R2). The current is defined by R1 (ILED = 0.65 V/R1), where 0.65 V is the voltage at which the transistor starts to conduct. The clock pulse reaches the gate of the MOSFET through R2. When the voltage across R1 reaches 0.65 V, the transistor starts to conduct, thus reducing the voltage at the gate of the MOSFET (and hence the conduction of the MOSFET). Through the feedback of the transistor, the current remains stable at 0.65/R1 amperes. The advantage of this solution is that it can operate at the frequencies we need for high-frequency modulation.
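The current-stabilization formula can be checked numerically. The 42 Ω value below is the sense resistance used per LED in the e-trap driver of Section 2.3; other values are for illustration only.

```python
# Sanity check of the sense-resistor feedback rule I_LED = 0.65 V / R1,
# where 0.65 V is the base-emitter voltage at which the transistor
# starts to conduct.
V_BE = 0.65  # volts

def led_current(r1_ohms):
    """Stabilized LED current in amperes for a given sense resistor R1."""
    return V_BE / r1_ohms

# e.g., led_current(42) ~ 0.0155 A per LED for the e-trap's 42-ohm resistor
```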

While there is one emitter, there are two receivers, one for the scattered light and one for the extinction light. The photodiode of both receivers is the TEMD5110X01. The scattered-light receiver is amplified by the operational amplifier IC1 with a transimpedance gain of 1 MΩ (Figure 11(a)), while the extinction-light receiver is amplified through IC3 with a gain of 27 kΩ (Figure 11(b)). The signal from the sensor reaches the band-pass filters IC14 and IC15 through the connector CON2 (see Figure 15). The filters have a bandwidth of 30 kHz and reject signals outside 455 kHz. In this work, we use a band-pass ceramic filter at 455 kHz (CFWLA455KCFA-B0, Murata Manufacturing Co. Ltd., Japan) that reduces the complexity of the circuit and has the desired response of fn ± 12.5 kHz. We tried alternative solutions using the low-noise operational amplifier OPA1612 configured as an active band-pass filter, but the noise was higher. The output of the filters drives the demodulators IC12 and IC13 (Figure 15). The output of the demodulators is the audio signal, which still contains residual 455 kHz components and is therefore filtered with low-pass filters (see Figure 16). The demodulators bring the analogue signal back to the audible frequency range, which is subsequently sampled by the ADC at a 16 kHz sample rate. The LPFs are composed of the quad operational amplifiers IC9 and IC11. They accept as input the output of the demodulators, and they drive the input of the ADC (Figure 16).

The ADC circuit is based on the PCM4220 converter. The converter is driven by the operational amplifiers IC2, IC3, IC4, and IC5, which transform the single-ended output to differential. The digital data are transferred to the MCU through the I2S interface (Figure 17). The sampling frequency, window length, and triggering threshold are prestored on the SD card of the system and are read once from the SD card during power-up. The circular buffer requires 120 Kbytes (2 channels × 2 bytes/channel/sample × 10,000 samples (625 ms at 16 kHz) per recording × 3 recordings in the buffer); we practically utilize all available RAM.
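The buffer budget above works out as follows:

```python
# Reproducing the circular-buffer size quoted in the text.
channels = 2                    # stereo: scattered + extinction light
bytes_per_sample = 2            # 16-bit PCM
samples_per_recording = 10_000  # 625 ms at 16 kHz
recordings_in_buffer = 3

buffer_bytes = (channels * bytes_per_sample
                * samples_per_recording * recordings_in_buffer)
# buffer_bytes = 120,000 bytes, the "120 Kbytes" quoted above
```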

After the device has been powered up, the MCU requests measurements of humidity, temperature, barometric pressure, and background light intensity (Figure 18). Each detected event is augmented with a series of metadata coming from different sensors. For each registered snippet, we include temperature and humidity (Si7021, Silicon Laboratories Inc., USA), light intensity (OPT3002, Texas Instruments Inc., USA), and a barometric record (MS5637, TE Connectivity Ltd., USA). The environmental sensors require an instruction from the processor to start measuring, and after a certain time (up to 800 ms), we need to read and process the measurement (Figure 18). To handle all these processes in a timely manner, we implemented a timer with a 200 ms timeout. This way, the processor does not need to wait until the measurement is completed by every sensor. It just starts the process and carries on with other tasks (triggering, recording, and plotting), and after 200 ms, it performs a quick check, on the order of microseconds, to see if measurements are available. If the sensors have measurements available, it reads them and transforms them into the appropriate measurement units, and after a 5 s pause, the algorithm repeats the loop.
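A minimal sketch of this non-blocking polling scheme, with a stand-in sensor class rather than the real Si7021/OPT3002/MS5637 drivers:

```python
# Start every sensor's measurement, keep servicing other tasks, and check
# readiness on a 200 ms timer instead of blocking for the up-to-800 ms
# conversion time. FakeSensor is an illustrative stand-in.
import time

POLL_PERIOD = 0.2    # the 200 ms timer timeout from the text

class FakeSensor:
    def __init__(self, conversion_s):
        self.conversion_s = conversion_s   # time the "hardware" needs
        self._t0 = None
    def start(self):                       # instruction to start measuring
        self._t0 = time.monotonic()
    def ready(self):                       # quick check, ~microseconds
        return (self._t0 is not None
                and time.monotonic() - self._t0 >= self.conversion_s)
    def read(self):
        return 42.0                        # placeholder engineering-unit value

def measurement_round(sensors):
    """Start all sensors, poll every 200 ms until all are ready, read out."""
    for s in sensors:
        s.start()
    while not all(s.ready() for s in sensors):
        # ... triggering, recording, and plotting would run here ...
        time.sleep(POLL_PERIOD)            # wait for the next timer tick
    return [s.read() for s in sensors]
```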

The time stamp of a detected flight event relates to the circadian rhythm of the insect. These metadata serve a double role. First, insect species whose wingbeats totally overlap in the spectral domain may have different activity patterns during the day or may not coexist spatially; therefore, the probability of correct classification when we apply monitoring at large spatial scales, that is, in different countries, increases by transmitting this information to the server [24].

Second, we suggest integrating the spectrum with environmental metadata and GPS coordinates so that the e-traps augment their monitoring capabilities to better track the dynamics of the insects’ population. Once these data are logged for a small number of years, then prediction models can come into force [25]. All metadata information is passed to the filename of the recording for easier parsing without having to access the samples of the recording.
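One possible way to pack the metadata into the filename, as suggested above. The field layout below is our own illustrative assumption, not the devices' actual naming scheme.

```python
# Hypothetical metadata-in-filename scheme: encode the sensor readings in
# the WAV filename so logs can be parsed without opening the audio samples.
def make_filename(timestamp, temp_c, rh_pct, pressure_hpa, lux, lat, lon):
    return (f"{timestamp}_T{temp_c:.1f}_RH{rh_pct:.0f}"
            f"_P{pressure_hpa:.0f}_L{lux:.0f}_{lat:.5f}_{lon:.5f}.wav")

def parse_filename(name):
    stem = name.rsplit(".", 1)[0]          # strip the .wav extension
    ts, t, rh, p, lux, lat, lon = stem.split("_")
    return {"timestamp": ts, "temp_c": float(t[1:]), "rh_pct": float(rh[2:]),
            "pressure_hpa": float(p[1:]), "lux": float(lux[1:]),
            "lat": float(lat), "lon": float(lon)}
```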

2.2.1. Fresnel Lenses versus Optical Light Guides

The benefit of a large receiving aperture compared to a single LED or a 1D array of diodes is that fast-flying insects spend more time in front of the PV and, therefore, offer more information on the flight process. The PV of the sensor reported in [21] is a volume of 70 mm × 59 mm × 11 mm, whereas in the setting of Figure 2, we have a cylinder of π × 25² mm² × 70 mm for the extinction mode and π × 25² mm² × 50 mm for the scatter mode. Both types of receivers (light guide and Fresnel lenses) have the advantage of a smooth, compact surface; there are no gaps in the receiving surface of either sensor that could potentially lead to false frequencies when a fast insect passes through the PV.
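The quoted probe volumes can be compared directly; the 25 mm radius follows from the 50 mm diameter of the lenses in Section 2.1.

```python
# Comparing the probe volumes quoted above (all dimensions in mm).
import math

pv_light_guide = 70 * 59 * 11            # rectangular PV of the sensor in [21]
pv_extinction = math.pi * 25**2 * 70     # cylinder, 50 mm lens -> r = 25 mm
pv_scatter = math.pi * 25**2 * 50

ratio = pv_extinction / pv_light_guide   # ~3x larger extinction-mode PV
```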

Figure 2: Sensor. (a) The prototype stereo sensor recording extinction and scattered light simultaneously stemming from the same wingbeat event. Opposite (a, b) circular Fresnel lenses are coupled as emitter-receiver of collimated light (extinction radiation). The central lens has at its opposite side a black termination cavity. The central lens receives scattered light from the emitter on the left. The flying insect passes through the funnel that the Fresnel lenses create. (b) An internal view of the prototype recorder.

There are several advantages in using Fresnel lenses compared to a light guide. The construction of the sensor becomes easier, perfectly repeatable, and more compact. A Fresnel lens provides collimation of light, and in Figure 1, we use a single LED as an emitter and a single photodiode as a receiver, as opposed to [21], where 11 photodiodes form a linear array for receiving the light and 42 LEDs form an array that emits a light plane with the help of a diffuser. Using a single photodiode results in a small capacitance and less noise in the transimpedance amplifier (TIA). Due to collimation, we do not face the loss of optical power that can be a side effect of using a diffuser as in [21]. Therefore, one needs much less power to achieve the same SNR. By using a Fresnel lens, the emitter-receiver distance can become as large as 3 meters without significant loss of light power (as opposed to the diffused light of a light guide).

The Fresnel receiver focuses the light of the emitter only to its focal point where a photodiode rests. This means that the receiver is not susceptible to noise stemming from an external origin and light reflected from the wingbeats of insects flying over the sensor.

The single disadvantage of the Fresnel lens is the need to place the LED/photodiode at a specific focal point and this can make the construction bulkier compared to the 4 mm slim configuration of the light guide [26].

2.3. The McPhail-Fresnel Trap Case

After completing the stereo-recording device, we proceeded to transfer a low-power version of this technology into a useful product, that is, an automatic electronic McPhail trap (Figures 3 and 4). The McPhail trap is commonly used to monitor fruit flies in different parts of the world [27]. McPhail-type traps, in general, are either fully transparent or semitransparent because light is a strong attractant for fruit flies. The insects enter the trap following the odor of the chemical attractants placed inside the trap and also because of the light, the round shape of the trap, and the yellow color sometimes applied to the lower part of the funnel. In our experiments, we used a gel-type (nonliquid) food attractant to avoid problems with the electronics. To gather the verification data needed to validate the automatic counting module, we developed a quick process for applying entomological glue, active for more than 2 months, on a transparent plastic sheet (Folex BG-72) that is subsequently inserted to cover the walls of the trap and part of the interior funnel. The insects must not have a safe place to land, as prolonged free flight inside the trap may lead to double counts.

Figure 3: A prototype of the electronic McPhail trap based on Fresnel lenses.
Figure 4: The story of a wingbeat in the era of IoT and artificial intelligence. Insect monitoring is ubiquitous. A wingbeat taking place in one part of the world can be made available immediately to another part where it can be classified and logged.

After thorough experimentation, we integrated only the extinction-light receiver and not the scattered-light one. The scattered light stemming from the wingbeat is feeble, while the trap is transparent and operates under the bright Mediterranean sun; therefore, we would need to overamplify the current of the photodiodes. In the presence of sunlight, the photodiodes can produce a large current, up to 5 mA. This entails that the noise at the preamplifier output of the photodiodes can become substantial. There were also power consumption constraints we needed to satisfy. Therefore, we ended up integrating a suitable type of Fresnel lens (an array of Fresnel lenslets, item 540.4 of Fresnel Technologies, Inc.) for the extinction-light mode. The emitter (Figure 19) is composed of 4 LEDs (SFH4247), each located at the focal point of a subarray. The receiver (Figure 20) is composed of 4 photodiodes (TEMD5110X01) located opposite the emitter. The emitter and the receiver are placed inside a box that provides shade; otherwise, the sunlight can burn the LEDs as well as the photodiodes, as it becomes concentrated through the collimating ability of the Fresnel lenses. The use of a single Fresnel lens did not prove practical in the context of our trap: it required large focal distances, and the drop of power at the borders of the lens was significant. The subarray structure does not create false frequencies due to horizontal insect movement, as the insects entering the trap fly in a lateral orientation towards the interior of the trap; therefore, a flight across the array would be atypical and was never observed in our experiments.

Each LED of the infrared emitter is fixed at the focal point of a lenslet of the 540.4 lens and driven by the BSS138 MOSFET (Figure 19). The transistor and MOSFET setup provides a stable current to the LEDs of 0.65 V/42 Ω ≈ 15.5 mA per LED. Regarding the photodiodes' amplifier, the infrared receiver is composed of 4 TEMD5110X01 photodiodes whose current is amplified by IC3B (Figure 20). The feedback through IC3A and the TR1 transistor makes the circuit capable of functioning in the presence of sunlight (high-amplitude DC current, Figure 20). The circuit is powered by 3.3 V and accepts a 455 kHz clock from the MCU (Figure 21, signal: LEDS).

The band-pass filter (BPF-FL1) takes its input from the output of the photodiodes' preamplifier and filters it so that only the desired frequencies around 455 kHz pass (Figure 22). The output of the BPF is amplified by IC2 and driven to the demodulator (Figure 23).

The demodulator accepts input from the output of BPF and its output is an audio signal stored in the SD card (Figure 24). Its demodulation requires a sync signal (SYNC) that is received from the MCU (Figure 21).

The low-pass filter accepts input from the output of the demodulator. It filters the signal so that the audio signal is accepted, and the high frequencies of the demodulator output are rejected. It is composed of two stages, IC6A and IC6B. The IC6D provides the required DC reference voltage at VCC/2 level (Figure 25).

The output of the low-pass filter drives the audio-processing circuit, which controls the amplification of the audio signal from the output of the demodulator with IC3 and converts the single-ended output of IC2 to differential to drive the ADC of the MCU using IC5A and IC5B (Figure 26). Finally, the Internet connection of the trap, as well as the GPS fix for latitude and longitude, is provided by the SIMCOM SIM908 GSM-GPS module (Figure 27).

We have added an acceleration sensor (Figure 28) that monitors the movement of the trap. If a triggering is registered and the acceleration measurement exceeds a threshold, the triggering is rejected. We anticipate that the acceleration sensor combined with the GPS sensor can function as an antivandalism measure that can inform the owner that a displacement of the traps is taking place.
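The trigger-veto logic can be sketched as a single predicate; the threshold value and names below are illustrative assumptions, not the firmware's actual settings.

```python
# Sketch of the accelerometer veto: a wingbeat trigger is discarded when
# trap movement exceeds a threshold, so that a shaken or displaced trap
# does not register false insect counts.
ACCEL_THRESHOLD_G = 0.5   # assumed movement threshold, in g

def accept_trigger(rms_triggered, accel_magnitude_g,
                   threshold=ACCEL_THRESHOLD_G):
    """Accept a wingbeat detection only if the trap itself was still."""
    return rms_triggered and accel_magnitude_g < threshold
```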

The triggering process of the e-trap is the same as in the stereo device described in Section 2.2 above and in [19]. In this work, we present only the differences encountered in the e-trap. Once the recording has been autotriggered, we receive a time domain input of 1024 samples. The sampling rate is low, at 4 kHz, as fruit flies beat their wings at around 200–250 Hz. The embedded algorithm computes a periodogram over 7 overlapping windows, using a Hanning window of 256 samples and an overlap of 128 samples. The fundamental frequency of the wingbeat of B. oleae, our targeted pest, is expected at 200 Hz [19, 21]. Because the insect is ectothermic, the fundamental frequency can vary around 200 Hz, typically from 160 to 230 Hz [21]. The e-trap has an embedded algorithm that decides in situ whether the recorded wingbeat snippet belongs to the targeted insect. Our approach is simple and proved quite efficient in field experiments. We calculate the power of the adjacent frequency bins around 200 Hz and its 3 higher harmonics (i.e., 400, 600, and 800 Hz). The higher harmonics are at integer multiples of the fundamental frequency. We do not search for the fundamental frequency, for several reasons. We know what we are searching for, this being a fundamental frequency around 200 Hz and a harmonic structure at multiples of it. Moreover, depending on the orientation of the insect as it enters the field of view, the fundamental may or may not be larger than the first harmonic. That is, the fundamental may not be the highest spectral component in the signal, and therefore, one should avoid estimating it. The function estimates a signal power (Ps) from the frequency bins around 200 Hz and its three subsequent harmonics. The DC component is excluded from the calculation. The noise power, Pn, is then estimated by subtracting the signal power from the total power Ptotal, that is, Pn = Ptotal − Ps.
If a recording has an SNR below zero, then it is rejected as not originating from B. oleae.

Since SNR = 10log10(Ps/Pn), we just need to compare Ps to Pn: if Ps > Pn, we accept that the recording originates from a fruit fly (i.e., SNR > 0). We have validated this rule on recordings taken in the lab from known cases of B. oleae. The rule is correct 99.9% of the time when confronted with signals originating from B. oleae. The algorithm has also rejected a corpus of 500 recordings of various noise types and insect wing flaps with f0 < 150 Hz. Note that this rule cannot discern between two different species of fruit flies, and it is left to the cultivator to use species-specific attractants. However, if the snippet is transmitted to a server instead of being classified in situ by simple spectrum-based rules, then we can use top-tier classifiers running on the server, and as shown in the results, the classification is expected to be accurate even for different species of fruit flies. Finally, safety measures must be taken to insulate both the stereo-recording device and the trap against electromagnetic interference stemming from RF devices and the power line. Our insulation was based on covering the analogue parts with copper sheets connected to the ground of the system. Moreover, the cables transferring the signal from the sensor to the recorder had double insulation.
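The in situ decision rule described above can be sketched as follows. This is a minimal Python illustration, not the deployed firmware: the ±40 Hz band taken around each harmonic and the 20 Hz DC cutoff are our assumptions.

```python
import numpy as np
from scipy.signal import welch

def is_target_wingbeat(x, fs=4000, f0=200.0, n_harmonics=4, tol_hz=40.0):
    """Accept a 1024-sample snippet if the power near f0 and its harmonics
    exceeds the power everywhere else (i.e., SNR = 10*log10(Ps/Pn) > 0).
    tol_hz and the 20 Hz DC cutoff are illustrative assumptions."""
    # 256-sample Hanning windows with 128-sample overlap: 7 segments on 1024 samples
    f, pxx = welch(x, fs=fs, window="hann", nperseg=256, noverlap=128)
    pxx = pxx.copy()
    pxx[f < 20.0] = 0.0                      # exclude the DC region
    p_total = pxx.sum()
    in_band = np.zeros_like(f, dtype=bool)
    for k in range(1, n_harmonics + 1):      # bands around 200, 400, 600, 800 Hz
        in_band |= np.abs(f - k * f0) <= tol_hz
    p_s = pxx[in_band].sum()                 # signal power Ps
    p_n = p_total - p_s                      # noise power Pn = Ptotal - Ps
    return bool(p_s > p_n)                   # equivalent to SNR > 0 dB
```

A synthetic harmonic series at 200 Hz is accepted, while one with a 130 Hz fundamental (as in the rejected recording of Figure 8) fails the test because its partials fall outside the expected bands.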

3. Results

We assess the quality of information of the recorded wingbeat by two means: visual observation of the spectrum and classification of the bimodal biometric evidence, i.e., scattered light and extinction radiation, using top-tier machine-learning classifiers. We subsequently evaluate the Fresnel-based e-trap which we compare to our previous version [19].

3.1. Experiments Using the Stereo Recorder

In Figure 5, we see a typical case of a stereo recording corresponding to a single wingbeat. The SNR is quite high for both modalities (~35 dB), and they coincide in the spectral areas in which the fundamental frequency and the harmonics reside. The scattered light returns a better-quality signal, as it contains more harmonics, which add finer detail to the received signal; the finer the detail, the better for classification. The relative amplitude of the harmonics is not the same in both modalities, which suggests that they may carry different kinds of information.

Figure 5: Optical wingbeat recording of a fly using two different modalities simultaneously (i.e., backscattered and extinction light). Recordings are treated as audio and the amplitude in y-axis is normalized between [−1, 1]. We used the sensor and recording device in Figure 2. At least 23 harmonics are identifiable in the spectrum of the scattered light modality. The quality of a scattered light recording is superior to that of an extinction light recording in terms of SNR and number of harmonics emerging over the noise floor.

Subsequently, we evaluate the efficacy of the sensor in providing recordings capable of distinguishing the identity of insects based on their wingbeat, with a view to assessing the discriminative capability of extinction and scattered radiation as well as the gain from fusing both. We deal with two tough problems in ascending order of difficulty: (a) the discrimination of B. oleae versus C. capitata, which are both fruit flies and, moreover, morphologically similar, and (b) the discrimination of sex between male and female B. oleae based solely on their wingbeat. Note that in the case of B. oleae, the main morphological difference between male and female is the ovipositor of the female; there is no great difference in size as there is, for example, in mosquitoes.

Both pests are important, as C. capitata is a major pest of fruit crops, though it does not infest olive orchards, and it is also monitored by McPhail traps. All adult insects of all species in this work started as larvae inside fruits (i.e., olives for B. oleae and peaches for C. capitata) and fed upon the pulp until they emerged, usually as third instar larvae. Then, they were collected and grown in an insectary cage. As larvae turned into adult insects, we supplied them with a yeast hydrolysate-sugar diet and water to sustain them. We experimented only on first-generation insects, and each insectary cage contained strictly one species. There were about 250–300 adults of both sexes of C. capitata and 120 adults, similarly of both sexes, of B. oleae. All experiments took place in the cage in which the insects were grown; therefore, there was no chance of an error in the identity of an insect.

The same sensor (Figure 2(a)) was inserted successively into cages containing B. oleae and C. capitata adults of both sexes. We used the same equipment to avoid introducing any sensor-specific noise profile. For the second experiment, an expert entomologist visually identified and manually separated the flying adults of B. oleae into a female and a male insectary box, respectively. Variation in temperature and humidity was kept as low as possible in the case of B. oleae; the C. capitata individuals were recorded in summer and experienced a small variation in temperature. All insectaries were placed at the same location, and we made every effort to avoid environmental differentiation. All adults were 2–10 days old. In Figure 6, we depict the similarity of the spectral content of both species of fruit flies and, in Figure 7, the similarity of the spectra of a male and a female of the same age.

Figure 6: Mean ± SD power spectral densities of scattered light of the wingbeat of B. oleae versus C. capitata. C. capitata’s harmonics differ from B. oleae’s.
Figure 7: Mean ± SD power spectral densities of scattered light of the wingbeat of male B. oleae versus female B. oleae. The fundamental and the harmonics totally overlap.

These figures depict only an initial data exploration stage of the situation that the classifiers will face. Notice, however, that in Figure 6, the mean spectra of the two fruit flies, though overlapping, have notable differences. This is an encouraging observation to be verified by statistically significant classification tests. On the contrary, in Figure 7, we do not observe visible differences between the male and female B. oleae wingbeat “spectral fingerprints.”

For the classification experiment of distinguishing between B. oleae and C. capitata, all B. oleae wingbeat snippets were tagged with label “1” and all the others with label “0.” For the classification experiment of distinguishing between B. oleae females and B. oleae males, all B. oleae female wingbeat snippets were tagged with label “1” and all male with label “0.” The composition of the datasets is depicted in Table 1.

Table 1: Dataset composition.

We first assessed the verification performance of popular classifiers based on 10-fold cross-validation: the whole dataset was randomly shuffled, 80% of it was used for training and the remaining 20% for testing, and the procedure was repeated ten times. The mean accuracy and standard deviation over the 10 folds are reported in Table 2.
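The evaluation protocol above (shuffle, hold out 20% for testing, repeat ten times, report mean ± SD) can be sketched as a minimal index generator. The random seed and the commented-out way a classifier is plugged in are our assumptions, not part of the original pipeline:

```python
import numpy as np

def shuffle_splits(n_samples, test_frac=0.2, n_repeats=10, seed=0):
    """Yield (train_idx, test_idx) pairs: shuffle the dataset indices,
    hold out test_frac of them for testing, and repeat n_repeats times."""
    rng = np.random.default_rng(seed)
    n_test = int(round(test_frac * n_samples))
    for _ in range(n_repeats):
        idx = rng.permutation(n_samples)
        yield idx[n_test:], idx[:n_test]

# Accuracy over repeats would then be reported as mean +/- SD, e.g.:
# scores = [clf.fit(X[tr], y[tr]).score(X[te], y[te])
#           for tr, te in shuffle_splits(len(y))]
```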

Table 2: B. oleae versus C. capitata recognition results.

We clearly observe the superiority of the scattered signal but also the fact that the fusion of scattered and extinction light is beneficial. Fusion, in our case, is achieved simply by appending both spectra. To further elaborate on the recognition accuracy, we used the precision, recall, and F1 score metrics on a random 20% holdout part of the dataset [28]. One should note that in binary classification there are two sources of error: the system may fail to classify a target correctly (i.e., a miss) or may erroneously classify a fruit fly that is not B. oleae as such (i.e., a false alarm). Precision (P) is defined as the number of true positives (TP) over the number of true positives plus the number of false positives (FP): P = TP/(TP + FP).

Recall (R) is defined as the number of true positives (TP) over the number of true positives plus the number of false negatives (FN): R = TP/(TP + FN).

These quantities are combined in the F1 score, which is defined as the harmonic mean of precision and recall: F1 = 2PR/(P + R).

High precision relates to a low false positive rate, and high recall relates to a low false negative rate.
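The three definitions amount to a few lines of code; the counts in the usage note below are hypothetical, chosen only to illustrate the formulas:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 score from true positive (tp),
    false positive (fp), and false negative (fn) counts."""
    precision = tp / (tp + fp)                           # P = TP / (TP + FP)
    recall = tp / (tp + fn)                              # R = TP / (TP + FN)
    f1 = 2 * precision * recall / (precision + recall)   # harmonic mean of P and R
    return precision, recall, f1
```

For example, 90 hits, 10 false alarms, and 30 misses give P = 0.90, R = 0.75, and F1 ≈ 0.82.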

High scores for both show that the classifier is returning accurate results (high precision) as well as returning a majority of all positive results (high recall). We did not try to optimize the feature set, as this is not the focus of this work (in Matlab and Python, the feature extraction takes one line of code: c = 10*log10(pwelch(x, 256, 192, 256, 4000)); % include main body movement). The results on accuracy in Table 3 are higher than those reported in [19]; the Fresnel lens leads to wingbeat signals with better SNR. We also cannot rule out a contribution from the slightly larger temperature variation during the C. capitata recordings, as we recorded in the laboratory at normal background temperature.
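For reference, a roughly equivalent feature extraction in Python uses SciPy's `welch` with matching window length, overlap, FFT size, and sampling rate. Matlab's `pwelch` defaults to a Hamming window, which we mirror here; the small floor added before the logarithm is our own guard against log(0), not part of the original one-liner:

```python
import numpy as np
from scipy.signal import welch

def wingbeat_features(x, fs=4000):
    """Log power-spectral-density features, mirroring the Matlab call
    pwelch(x, 256, 192, 256, 4000): 256-sample windows, 192-sample
    overlap, 256-point FFT, 4 KHz sampling rate."""
    f, pxx = welch(x, fs=fs, window="hamming", nperseg=256,
                   noverlap=192, nfft=256)
    return 10 * np.log10(pxx + 1e-12)  # floor avoids log(0) on empty bins
```

On a 1024-sample snippet this yields a 129-dimensional spectral feature vector (nfft/2 + 1 bins).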

Table 3: Different accuracy metrics using a 20% random hold out set. XGB classifier. Both modalities appended.

Regarding sex differentiation, we report a mean accuracy of (63.19 ± 1.9)% over the 10 folds using both modalities. This grants a statistically significant advantage over the 50% chance level, but the results for sex differentiation are not high enough to be practical at the moment and are excluded from further investigation in this work.

To sum up, (a) we report success in discerning species’ identity even between morphologically similar fruit flies, (b) the two modalities carry complementary information and their fusion grants an advantage in terms of classification accuracy, and (c) the sensing modalities failed to discern the sex of B. oleae.

3.2. Experiments with the McPhail Trap in the Field (Fresnel Optics–Extinction Light Mode)

In Figure 8, we give some examples so the reader can get better insight into the internal procedures of the McPhail trap while operating in the field. The top picture is typical of a B. oleae recording taken from the SD card of the trap. We were positive that this particular insect was of the B. oleae species because we released a number of them below the trap and this one was visually confirmed to fly into it. In the second figure, we see the spectrum of the wingbeat (i.e., the frequencies that constitute the “signature” of the wingbeat in the frequency domain). The harmonic structure of the spectrum is typical of an oscillatory movement. The first peak is the wingbeat frequency, corresponding to the so-called fundamental frequency (f0). One can see that it is located at 200 Hz as expected, a typical spectral pattern originating from a B. oleae. The “peaks” numbered from 2 to 6 are the so-called harmonics f1–f5, approximately at integer multiples of f0. One can see that the detection algorithm in Section 2.3 attributes a high SNR value to this recording, much higher than 0. Zero is the threshold under which a recording is classified as non-B. oleae (i.e., it is rejected as not being B. oleae).

Figure 8: The harmonic detector applied to recordings of the e-trap. (a) A true positive case. Time in sec; the y-axis of the audio signals is normalized between [−1, 1]. Note the characteristic wingbeat frequency at 200 Hz. (b) A nontarget wingbeat signal, rejected (SNR < 0) for not having its f0 and associated harmonics in the spectral region where B. oleae is expected. (c) A rejected interference (SNR < 0).

The third plot is a recording of an insect (not B. oleae) flying into the trap. One can again see the structure of a wingbeat (i.e., multiple peaks in the frequency domain at integer multiples of a fundamental frequency). Note that in the fourth plot, the fundamental frequency is around 130 Hz, which is impossible for B. oleae. The detection algorithm attributes SNR < 0 to this recording and therefore rejects the signal as not originating from B. oleae, although it is a perfectly valid wingbeat signal. Last, the two plots at the bottom show the case of an interference. We know this because there is no wingbeat structure in the signal: the recording cannot originate from any insect, as there is no oscillation; instead, we see a shock pulse. Note that the algorithm attributes an SNR below zero and confidently rejects the signal as not originating from B. oleae.

The evaluation for the in situ experiment is based on counting the number of insects trapped on the sticky surface and comparing them to the counts that the trap uploads to the remote server using the GPRS functionality. We hereby include some observations after deploying e-traps in the field for over two months (see Figure 9):

(a) The traps sustained several abrupt showers of light rain without a problem. No malfunction was observed, and the traps were not triggered (i.e., they did not register false alarms for the presence of insects). The humidity sensor correctly reported the events. Following these events, we modified the software to switch off the trap when the humidity sensor exceeds 99%. We do not believe that this action compromises the utility of the traps, as the flight activity of insects in such conditions is doubtful.

(b) During a thunderstorm, the traps registered false alarms. We attributed them to thunder, which causes large-scale electromagnetic interference, and to very strong winds that possibly caused vibration shocks to the trap.

(c) The traps were initially fastened tightly to the trees to prevent them from hitting the trunk or branches. We later relaxed this constraint so that the traps swung freely in the air, to make their use more convenient for cultivators. An abrupt hit to the device produces a vibration that propagates through the trap to the sensor; the slight displacement of the emitter with respect to the receiver due to this shock can trigger the recording process of the trap. Note that triggering does not necessarily imply that a valid insect count is registered, as we further examine the spectral content of the recording after triggering. Nevertheless, we tried to avoid false triggering as much as possible (see Section 2.2, on the acceleration sensor).

(d) After each triggering, we applied a delay of 30 seconds before the next permissible triggering. This constraint was added to avoid the relatively rare event of a double count, which is observed when an insect lands abruptly on the sensor and flies off again.

(e) A minimum duration constraint was applied for a recorded event to qualify as a valid wingbeat recording. In this way, we discarded events that we attributed to the thermal expansion of the sensor due to external temperature variations. Wherever transparent material was used as a holder for the sensor, it was made of polycarbonate, which is resistant to thermal expansion.

(f) We report zero false alarms due to the sun and slowly changing light variations (passing clouds) or fast-changing light variations (movement of tree branches and leaves).
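The trigger-gating rules described above (the humidity cutoff, the acceleration veto of Section 2.2, the 30-second delay between triggerings, and the minimum event duration) can be summarized in a small sketch. The 99% humidity cutoff and the 30 s delay come from the text; the acceleration threshold and the minimum duration are hypothetical placeholders:

```python
class TriggerGate:
    """Sketch of the e-trap's trigger-rejection logic.
    accel_threshold and min_event_s are illustrative assumptions."""

    def __init__(self, refractory_s=30.0, min_event_s=0.01,
                 humidity_cutoff=99.0, accel_threshold=1.5):
        self.refractory_s = refractory_s
        self.min_event_s = min_event_s
        self.humidity_cutoff = humidity_cutoff
        self.accel_threshold = accel_threshold
        self.last_trigger_s = float("-inf")

    def accept(self, now_s, event_duration_s, humidity_pct, accel_g):
        # Trap is switched off when humidity exceeds 99% (rain events).
        if humidity_pct > self.humidity_cutoff:
            return False
        # Vibration shock registered by the acceleration sensor: reject.
        if accel_g > self.accel_threshold:
            return False
        # Too-short events attributed to thermal expansion of the sensor.
        if event_duration_s < self.min_event_s:
            return False
        # 30 s refractory period against double counts.
        if now_s - self.last_trigger_s < self.refractory_s:
            return False
        self.last_trigger_s = now_s
        return True
```

A triggering that passes this gate still has to pass the spectral SNR test before it is registered as a valid insect count.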

Figure 9: Two traps deployed in the field for 2 months counting flies. (a) A light guide version as in [19] placed in an olive orchard (lat: 35.0735, lon: 24.8376). (b) A Fresnel-type e-trap as suggested in this paper placed in an adjacent olive orchard (lat: 35.06574, lon: 24.827354). Both traps follow the dynamics of counts closely. The Fresnel version reports fewer false alarms and tracks the insect-count dynamics more closely.
3.3. Discussion

In this work, we presented two novel devices: (a) a bimodal wingbeat recorder and its associated sensors based on acrylic Fresnel optics, which looks at the same wingbeat event from different perspectives, and (b) a functional e-trap for fruit flies, validated in the field on the task of detecting B. oleae. The point of this device is to replace human monitoring services. There are many problems with manual practices, the most important being the high cost, followed by feasibility and reliability issues when thousands of traps are involved. Note that monitoring is not a new service that we introduce as a novelty but an established process that is currently part of the IPM regulations.

Because of these manual practices, large contracts are granted every year in each interested country. The experimental results from field observation support the following conclusions:

(a) The trap does not report false alarms due to the sun or other environmental causes, although it is transparent and operates in the open field. Triggering from noninsect sources occurs at very low rates, and the recordings produced by false alarms are successfully rejected by the frequency analysis of their content.

(b) The trap can sustain bad weather conditions, including rain and strong winds, without malfunctioning and without false alarms.

(c) The detector of the trap discerns the wingbeat of insects and can lock onto a specific wingbeat pattern. A temperature-dependent SNR calculation will be included in the future to follow the ectothermic nature of insects, whose wingbeat frequency changes with temperature.

(d) There is a close correlation between insects found trapped inside and insects counted automatically. The small divergence between counts of captured insects and reported numbers does not affect the decisions of the qualified entomologists responsible for initiating treatments.

(e) Based on the SNR algorithm (i.e., in the case of in situ decisions), the trap can attract and count flies quite reliably. It relies on attractants specific to the target pest in order to count the targeted pest among similar pests.

(f) The device can also transmit the wingbeat snippet to be classified on a server, and therefore a general-purpose food bait can be used. Transmitting the waveform proves very convenient in practice and allows the compilation of historical data. On the server, top-tier machine-learning techniques can classify insect wingbeats even among fruit flies. A large-scale evaluation of data gathered on the server is pending.

Currently, actuation and remote control take place to a limited degree, but they can be extended in the future. The trap can be activated and deactivated automatically according to humidity levels and GPS/acceleration readings. Moreover, the trap is deactivated at temperatures below 16°C and above 35°C, where B. oleae is inactive. Sampling of the environmental conditions requires very little power, and the e-trap is reactivated when the proper range of environmental conditions holds. The battery levels are also transmitted to the server and monitored, and the actuation process can be extended in the future by allowing automatic, remote updates of the firmware.

Finally, the Fresnel version is more compact and rigid than the light guide version in [19], and it is more robust with respect to false alarms caused by environmental conditions. The quality of the recording, in terms of SNR and the number of harmonics captured, also favors the Fresnel version.

4. Conclusions

The experiments demonstrate that each modality of the stereo sensor carries complementary information on the task of identifying different species of insects. This type of sensor opens the way for more specific analysis of the wingbeat based on low-budget, multispectral analysis.

We expect e-traps to change the way IPM is applied, as the dynamics of insect populations and the environmental parameters associated with them are delivered almost in real time. The surveillance of insects through e-traps that record their wingbeat is now feasible everywhere thanks to the proliferation of wireless sensors and GPRS as well as IoT functionality. As a demonstration, the recording of the wingbeat of a B. oleae taken in Crete, Greece, was automatically uploaded to a server located in the US as soon as the insect flew into the trap and was subsequently downloaded in the UK.

We envision that low-cost plastic traps upgraded with low-budget optical sensors and wireless communication will form a network of uniquely addressable objects transmitting insect counts, wingbeat snippets, and environmental parameters to a cloud server over vast expanses of land. This information will be visualized in GIS (geographical information system) compatible format so that monitoring will be a continuous process that will keep an eye on infestations, perform posttreatment analysis, and establish a link with predictive analytics [29].


A. Recordings

All datasets used in this paper are available for downloading from

Recordings bear a time stamp and measurements of environmental parameters in their filename. A typical example follows: F171020_090453_0093_Temp24.2_Hum55.6_Bar1012.0_Opt06.99.wav.

Wav recordings are audible and can be processed as normal audio using the appropriate software.

B. Code

The classification code for the datasets used in this paper is available for downloading.

C. Power Consumption

C.1. The Stereo Multimodal Wingbeat Recorder

Power consumption totals 2.79 W, of which 1.41 W is consumed in the analogue part of the device (LEDs 738 mW, demodulator 195 mW, filter 345 mW, and photodiode receivers 135 mW) and 1.38 W in the digital part (MCU 323 mW, ADC 300 mW, ADC driver 720 mW, and LED clock 36 mW).

C.2. The Fresnel Type of an Electronic McPhail Trap

Power consumption totals 79.2 mW, of which 49.5 mW is consumed by the transmit LEDs of the sensor, 49.5 mW by the photodiodes’ receiver, 6.6 mW by the demodulator and low-pass filter, and 6.6 mW by the digital parts (MCU, RTC). The mean power consumption of the GSM modem is 1 mW, provided it transmits once per day, and the GPS consumes a similar amount.

D. Cost

The cost is, of course, subject to change, but in order to help with the assessment of the cost/benefit tradeoff of this new technology and to make it affordable for end users, we present a detailed cost breakdown and an item list in Tables 4 and 5.

Table 4: Cost breakdown of the stereo wingbeat recorder as per 22/2/2018 in Euros (€).
Table 5: Cost breakdown of an electronic McPhail trap based on Fresnel lenses 22/2/2018 in Euros.

E. Schematics

E.1. The Stereo Multimodal Wingbeat Sensor/Recorder
Figure 10: LED driver. The driver controls the LED current of the emitter. It is powered by 5 VDC and accepts clock pulses from the integrated circuit CDCE913.
Figure 11: (a) Scattering photodiode receiver. (b) Direct photodiode receiver. The stereo sensor has one emitter and two receivers. One for receiving the scattered light and one for the extinction light. The photodiode of both receivers is TEMD5110X01. The receiver of scattered light is amplified by the operational amplifier IC1 with transimpedance gain 1 MΩ, whereas the extinction light receiver gets amplification through the IC3 with a gain of 27 ΚΩ.
Figure 12: MCU. The processor checks out all processing stages of the device. It is powered by 3.3 V through IC2. It has embedded RTC and keeps time using B1 battery. It controls the ADC converter through the I2S interface, decides if a triggering has taken place, and if positive stores the 625 msec snippet in the SD card in wav stereo 16 bits, 16 KHz. After triggering, it takes measurements of humidity, temperature, barometric pressure, and background light intensity and stores this information in the filename of the snippet.
Figure 13: Clock unit. The circuit that produces clocked pulses for the LED is controlled by the MCU through the I2C interface and produces a frequency of 455 KHz for the LED driver.
Figure 14: Power supply unit. The power supply unit of the device. It accepts an input of 9 to 16 VDC and produces +5 V, +12 V, and −12 V.
Figure 15: Demodulator. The signal from the sensor reaches the band-pass filters IC14 and IC15 through the connector CON2. The filters have a bandwidth of 30 KHz and reject signals outside 455 KHz. The outputs of the filters drive the demodulators IC12 and IC13. The output of each demodulator is an audio signal that still includes the 455 KHz component and is filtered by a low-pass filter.
Figure 16: Low-pass filters. The LPF are composed of 4-stage TI amplifiers IC9 and IC11. They accept as input the output of the demodulators and they drive the input to the ADC.
Figure 17: ADC. The ADC is based on the converter PCM4220. The converter is driven by the TI IC2, IC3, IC4, and IC5 that transform the simple output of the LPF to differential. The digital data are transferred to the MCU through the I2S interface.
Figure 18: Temperature, humidity, optical sensor, and barometric pressure sensor. For temperature and humidity, we use the Si7021, Silicon Laboratories, for light intensity the OPT3002, Texas Instruments, and for barometric pressure the MS5637, TE Connectivity. All sensors have an I2C bus; therefore, their connection to the MCU is simplified.
E.2. The Fresnel Version of an Electronic McPhail Trap
Figure 19: LED driver. The infrared emitter is composed of 4 LED SFH4247 driven from the MOSFET BSS138. The transistor and MOSFET setup provides stable current to LEDs 0.65 V/42 ohms = 15.4 mA per LED. The circuit is powered by 3.3 V and admits a CLOCK 455 KHz from the MCU.
Figure 20: Photodiode amplifier. The infrared receiver is composed of 4 photodiodes TEMD5110X01 whose current is amplified by IC3B. The feedback with IC3A and the TR1 transistor makes the circuit capable of functioning in the presence of sunlight (high-amplitude DC current).
Figure 21: MCU. The MCU is an ARM microcontroller MSP432P401R, Texas Instruments. It produces all the necessary signals for the operation of the trap and records the optical wingbeat signal in the form of an audio wav directed to the SD card.
Figure 22: Band-pass filter. The BPF (FL1) accepts input from the output of the photodiodes preamplifier that filters it so that only the desirable 455 KHz frequencies pass. The output of BPF is amplified by IC2 and is driven to the demodulator.
Figure 23: Demodulator. The demodulator accepts input from the output of BPF and its output is an audio signal. For its demodulation, a sync signal (Sync) is required that is received from the MCU.
Figure 24: SD card reader. The circuit of the SD card has an independent power supply (IC10) with an “enable” function so that it is controlled by the processor and consumes no power when idle.
Figure 25: Low-pass filter. The low-pass filter accepts input from the output of the demodulator. It filters the signal so that the audio signal is accepted, and the high frequencies of the demodulator output are rejected. It is composed of two stages, IC6A and IC6B. The IC6D provides the required DC reference voltage at VCC/2 level.
Figure 26: Audio-processing circuits. The circuit controls the amplification of the audio signal from the output of the demodulator using IC3 and converts the single output to differential that is needed by the ADC by using IC5A and IC5B.
Figure 27: GSM-GPS: the gate to the Internet is through the GSM module SIM908, SIMCOM. The same module has an embedded GPS.
Figure 28: Acceleration sensor. The circuit that detects vibration shocks is based on the MMA7361L of NXP. The IC2 and IC3 are analogue comparators with a reference voltage of 1.25 V that take as input the analogue outputs X and Y of the acceleration sensor. At their outputs, they give a voltage of 3.3 V so that it is recognizable by a simple I/O pin of the MCU.


171020: year, month, day (2017, October the 20th)
0093: file counter
Temp24.2: temperature in °C
Hum55.6: humidity (%)
Bar1012.0: barometric pressure, normalized
Opt06.99: optical sensor reading, normalized
wav: the snippet is in stereo wav format (16 KHz, 16 bits) for the standalone stereo recorder and a mono file (4 KHz, 14 bits) for the e-trap case
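Decoding these fields programmatically is straightforward. The helper below is a hypothetical sketch, and its regular expression assumes the exact field layout of the example filename F171020_090453_0093_Temp24.2_Hum55.6_Bar1012.0_Opt06.99.wav:

```python
import re

# Field layout assumed from the example filename given in Appendix A.
FNAME_RE = re.compile(
    r"^F(?P<yymmdd>\d{6})_(?P<hhmmss>\d{6})_(?P<counter>\d{4})"
    r"_Temp(?P<temp_c>[\d.]+)_Hum(?P<hum_pct>[\d.]+)"
    r"_Bar(?P<bar>[\d.]+)_Opt(?P<opt>[\d.]+)\.wav$")

def parse_snippet_name(name):
    """Extract the time stamp and environmental measurements embedded
    in a wingbeat-snippet filename; raises ValueError on a mismatch."""
    m = FNAME_RE.match(name)
    if m is None:
        raise ValueError("unrecognized snippet filename: %s" % name)
    fields = m.groupdict()
    for key in ("temp_c", "hum_pct", "bar", "opt"):
        fields[key] = float(fields[key])
    fields["counter"] = int(fields["counter"])
    return fields
```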


Disclosure

The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Conflicts of Interest

The authors declare that they have no conflicts of interest.


Acknowledgments

The research leading to these results has received funding from the European Union’s Seventh Framework Program (FP7/2007–2013), managed by the REA (Research Executive Agency). The authors acknowledge that the plastic housing of the electronics in Figure 3 was manufactured by the Institute for Microelectronic and Mechatronic Systems (IMMS) in Germany. The authors acknowledge Anastasia Kambouraki from the Department of Biology, University of Crete, Heraklion, Greece, for breeding insect batches for the recording experiments and Professor John Vontas, Director of the Pesticide Science Lab, Agricultural University of Athens, for allowing the authors access to all insectaries and associated equipment. The authors thank M. Brydegaard from Lund University for the critical reading of our work. Finally, this work is dedicated to those observing the “less” of the world, enchanted by the elusive charm of an insect’s wingbeat. This work was supported by the European Commission FP7, under Grant Agreement no. 605073, project ENTOMATIC, and its associated matching funds by GSRT, Greece.


  1. E. C. Oerke, H. W. Dehne, F. Schönbeck, and A. Weber, Crop Production and Crop Protection: Estimated Losses in Major Food and Cash Crops, Elsevier Science, Amsterdam, Netherlands, 1994.
  2. M. L. Flint and R. Van den Bosch, Introduction to Integrated Pest Management, Springer Science & Business Media, New York City, NY, USA, 2012.
  3. L. P. Pedigo and M. E. Rice, Entomology and Pest Management, Waveland Press, Long Grove, IL, USA, 2014.
  4. E. Goldshtein, Y. Cohen, A. Hetzroni et al., “Development of an automatic monitoring trap for Mediterranean fruit fly (Ceratitis capitata) to optimize control applications frequency,” Computers and Electronics in Agriculture, vol. 139, no. 15, pp. 115–125, 2017. View at Publisher · View at Google Scholar · View at Scopus
  5. D. E. Hendricks, “Portable electronic detector system used with inverted-cone sex pheromone traps to determine periodicity and moth captures,” Environmental Entomology, vol. 14, no. 3, pp. 199–204, 1985.
  6. D. E. Hendricks, “Electronic system for detecting trapped boll weevils in the field and transferring incident information to a computer,” Southwestern Entomologist, vol. 15, no. 1, pp. 39–48, 1990.
  7. A. Moore and R. H. Miller, “Automated identification of optically sensed aphid (Homoptera: Aphidae) wingbeat waveforms,” Annals of the Entomological Society of America, vol. 95, no. 1, pp. 1–8, 2002.
  8. J.-A. Jiang, C.-L. Tseng, F.-M. Lu et al., “A GSM-based remote wireless automatic monitoring system for field information: a case study for ecological monitoring of the oriental fruit fly, Bactrocera dorsalis (Hendel),” Computers and Electronics in Agriculture, vol. 62, no. 2, pp. 243–259, 2008.
  9. T. Okuyama, E.-C. Yang, C.-P. Chen, T.-S. Lin, C.-L. Chuang, and J.-A. Jiang, “Using automated monitoring systems to uncover pest population dynamics in agricultural fields,” Agricultural Systems, vol. 104, no. 9, pp. 666–670, 2011.
  10. J.-C. Shieh, J.-Y. Wang, T.-S. Lin et al., “A GSM-based field monitoring system for Spodoptera litura (Fabricius),” Engineering in Agriculture, Environment and Food, vol. 4, no. 3, pp. 77–82, 2011.
  11. M.-S. Liao, C.-L. Chuang, T.-S. Lin et al., “Development of an autonomous early warning system for Bactrocera dorsalis (Hendel) outbreaks in remote fruit orchards,” Computers and Electronics in Agriculture, vol. 88, pp. 1–12, 2012.
  12. I. Potamitis, I. Rigakis, and K. Fysarakis, “Insect biometrics: optoacoustic signal processing and its applications to remote monitoring of McPhail type traps,” PLoS One, vol. 10, no. 11, article e0140474, 2015.
  13. C. N. Verdouw, S. Wolfert, and B. Tekinerdogan, “Internet of things in agriculture,” CAB Reviews: Perspectives in Agriculture, Veterinary Science, Nutrition and Natural Resources, vol. 11, no. 35, 2016.
  14. A. Rose, M. Weber, I. Potamitis et al., “The BG-Counter, the first operative automatic mosquito counting device for online mosquito monitoring: field tests and technical outlook,” in 20th European Society for Vector Ecology (E-SOVE), Lisbon, Portugal, October 2016.
  15. D. E. Norris, “The premonition trap: first field trials of a robotic smart trap for mosquitoes with species recognition,” in 47th Annual Conference of Society for Vector Ecology, Anchorage, Alaska, September 2016.
  16. P. Boissard, V. Martin, and S. Moisan, “A cognitive vision approach to early pest detection in greenhouse crops,” Computers and Electronics in Agriculture, vol. 62, no. 2, pp. 81–93, 2008.
  17. W. Ding and G. Taylor, “Automatic moth detection from trap images for pest management,” Computers and Electronics in Agriculture, vol. 123, pp. 17–28, 2016.
  18. A. Guarnieri, S. Maini, G. Molari, and V. Rondelli, “Automatic trap for moth detection in integrated pest management,” Bulletin of Insectology, vol. 64, no. 2, pp. 247–251, 2011.
  19. I. Potamitis, I. Rigakis, and N.-A. Tatlas, “Automated surveillance of fruit flies,” Sensors, vol. 17, no. 12, p. 110, 2017.
  20. R. Wang, C. Hu, X. Fu, T. Long, and T. Zeng, “Micro-Doppler measurement of insect wing-beat frequencies with W-band coherent radar,” Scientific Reports, vol. 7, no. 1, p. 1396, 2017.
  21. I. Potamitis and I. Rigakis, “Large aperture optoelectronic devices to record and time-stamp insects’ wingbeats,” IEEE Sensors Journal, vol. 16, no. 15, pp. 6053–6061, 2016.
  22. M. Brydegaard, “Towards quantitative optical cross sections in entomological laser radar—potential of temporal and spherical parameterizations for identifying atmospheric fauna,” PLoS One, vol. 10, no. 8, article e0135231, 2015.
  23. D. G. Stavenga, “Thin film and multilayer optics cause structural colors of many insects and birds,” Materials Today: Proceedings, vol. 1, pp. 109–121, 2014.
  24. H. Mukundarajan, F. J. H. Hol, E. A. Castillo, C. Newby, and M. Prakash, “Using mobile phones as acoustic sensors for high-throughput mosquito surveillance,” eLife, vol. 6, 2017.
  25. C.-L. Chuang, E.-C. Yang, C.-L. Tseng, C.-P. Chen, G.-S. Lien, and J.-A. Jiang, “Toward anticipating pest responses to fruit farms: revealing factors influencing the population dynamics of the oriental fruit fly via automatic field monitoring,” Computers and Electronics in Agriculture, vol. 109, pp. 148–161, 2014.
  27. Plant Health Australia, The Australian Handbook for the Identification of Fruit Flies. Version 2.1, Plant Health Australia, Canberra, ACT, Australia, 2016.
  28. F. Pedregosa, G. Varoquaux, A. Gramfort et al., “Scikit-learn: machine learning in Python,” Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.
  29. I. Potamitis, P. Eliopoulos, and I. Rigakis, “Automated remote insect surveillance at a global scale and the internet of things,” Robotics, vol. 6, no. 4, p. 19, 2017.