Abstract

The detection of short optical transients of astrophysical origin in real time is an important task for existing robotic telescopes. The faster a new optical transient is detected, the earlier follow-up observations can be started. The sooner the object is identified, the more data can be collected before the source fades away, particularly in the most interesting early period of the transient. In this contribution, the real-time pipeline designed for the identification of optical flashes in the “Pi of the Sky” project will be presented in detail, together with solutions used by other experiments.

1. Introduction

The first robotic telescopes were designed to work autonomously and without much human supervision. The development of these devices was strongly stimulated by optical observations of Gamma-Ray Bursts (GRBs) [13]. These powerful explosions, discovered by the VELA satellites in the late 1960s [1], take place on very short timescales, ranging from a fraction of a second up to hundreds of seconds. In the early days the chances of observing optical counterparts were very limited due to the large error boxes of burst positions. Burst localizations were typically provided a long time after the event, thus early optical observations in the most interesting period of these processes were practically impossible. The idea of automatic identification of optical transients (OT) was born as an answer to the above problems. In the 1990s the Burst And Transient Source Experiment (BATSE) detector on board the Compton Gamma Ray Observatory (CGRO) [4] was providing real-time alerts with a positional accuracy of the order of a few degrees. Alerts were distributed in almost real time via the Gamma-ray burst Coordinates Network (GCN) [5]. An optical system covering a large field of view (FOV), together with a pipeline automatically identifying transients, would allow optical counterparts of BATSE bursts to be found. In 1997 the first optical counterpart of a gamma-ray burst was observed, triggering rapid development of robotic telescopes. In 1999 a great milestone was achieved by the ROTSE robotic telescope [6], which for the first time observed the optical emission simultaneously with the γ-ray emission (GRB 990123). Since that time many automatic robotic telescopes have been built in order to react quickly to GCN alerts [5] and observe the prompt optical signal as early as possible. During the last decade these devices have become more and more sophisticated. Many of them are fully automatic and autonomous telescopes, not requiring much human attention. The standard approach is to wait for a GCN alert and, when it comes, quickly slew to the burst position and begin observations as soon as possible. Nowadays, real-time alerts are mostly provided by the Swift [7] and Fermi [8] satellites, but only Swift gives a positional accuracy of the order of 3 arcminutes. The basic requirements for fast robotic telescopes have already been achieved, but new challenges are coming.

Nowadays, robotic telescopes and their associated data pipelines are expected to perform automated data analysis, find interesting events, and send alerts to the community so that larger telescopes can observe such objects as soon as possible. From the very beginning robotic telescopes have produced a lot of optical data, which was generally analysed off-line. Successful implementation of a real-time pipeline requires that additional conditions are fulfilled, some of them specific to a particular detector architecture: stable hardware and software for data acquisition, good efficiency of algorithms, and sufficient CPU power to handle data while it is collected. Usually it also requires effective data management systems (i.e., databases) to catalogue reduced data and analyse it in real time. As can clearly be seen from the example of gamma-ray bursts, offline or late identification of the phenomena is insufficient and does not guarantee a chance of follow-up observations which could allow the collection of important early data. Real-time discovery, alert distribution, and immediate reaction of large telescopes and other detectors are needed to successfully investigate short timescale astrophysical phenomena.

The regime of short timescales (seconds and less) is a relatively unexplored area. The examples of GRBs, blazars, and Active Galactic Nuclei (AGNs) show that the most violent and interesting astrophysical processes occur on very short timescales. Thus many interesting discoveries can be expected in this regime. This includes optical transients related to GRBs: as already confirmed by observations of prompt optical emission from at least a few GRBs (990123 [6], 080319B [9]), these processes may have very bright optical counterparts. In particular, the case of GRB 080319B showed that prompt optical flashes from a GRB can become bright enough to be accessible even to the naked eye and certainly to wide field detectors. This further confirms the necessity and suitability of building wide field systems for observations of the prompt optical signal from GRBs. It also motivates the development of real-time pipelines, able to self-trigger follow-up observations even without GCN alerts. It has to be noted, however, that except for the two cases mentioned above (GRB 990123 and 080319B) the observed optical counterparts are typically much fainter ([10, 11]). On the other hand, there is a very limited number of optical observations of GRBs at the very beginning of the burst.

Currently we have solid proof that such short optical transients can reach very high brightness. Wide field arrays of photo camera lenses and even fish-eye cameras can be effective tools for the automatic identification of such events in real time. The “Pi of the Sky” system will be able to reach, on combined images, a magnitude range largely unexplored for OTs. Discovered events should be immediately distributed via global communication networks (like VOEventNet [12], the Heterogeneous Telescope Networks (HTN) [13] or others), which could become viable extensions of the GCN network.

The most important and interesting processes that could be detected with this kind of analysis are untriggered GRBs or orphan afterglows. Current GRB models predict that the optical emission may be less collimated than the high energy emission, and thus there may be more optical flashes related to GRBs than γ-ray flashes. The observation of such events would probably be one of the milestones in the understanding of GRBs.

Most optical observations are related to so-called long GRBs, but there are also short GRBs, with very short timescales (durations below about 2 s). Optical counterparts of short GRBs have been observed only in very few cases and a relatively long time after the outburst. Optical observations of short GRBs in the early phase would therefore give important input to the theoretical understanding of these processes.

In addition to GRBs, there are other processes which should be identified as fast as possible in order to allow deeper investigation. These can be supernova explosions (SNe), cataclysmic variables like novae or dwarf nova outbursts, but also the activity of AGNs and blazars. Very similar tools are also useful for the automatic identification of Near Earth Objects (NEOs), Potentially Hazardous Asteroids (PHAs) or space debris, which has recently become a serious problem for all modern space missions.

Every time technology allows a new regime to be entered, new types of phenomena can be discovered. It is very probable that on timescales of seconds and less there are optical processes in the Universe which have not yet been classified or even observed. In order to study such processes systematically, effective pipelines for real-time identification are required. Typically, they are implemented in wide field systems, but the synergy between wide field and large telescopes would allow good coverage of optical light curves. Recently similar pipelines have also been developed for large telescopes, and this direction is also very promising, giving chances for the detection of fainter OTs.

Finally, there are practical reasons for the development of such pipelines, as astronomical experiments produce more and more data. At a certain point the amount of data can become impossible to store permanently. It may almost approach the amounts produced by high energy physics experiments, and thus it may be necessary to use similar methods for reducing the stream of data with on-line selection algorithms. Real-time analysis pipelines can identify interesting event candidates in the images and store only the raw data related to these events, ignoring the rest of the data. It may be necessary to develop specific triggers for specific types of phenomena, exactly as in particle physics experiments. Today this is not yet a crucial problem, but it may become an issue in the near future, when experiments like the Large Synoptic Survey Telescope (LSST) [14] or the Panoramic Survey Telescope & Rapid Response System (Pan-STARRS) [15] start to collect 30 TB of data per night.

2. Methods of Identification of Optical Transients in Short Timescales

In this section, we give an overview of methods used for the identification of optical flashes. We concentrate on short (of the order of 10 s) optical flashes, but methods for detecting events with longer timescales will also be described. Depending on the type of experiment, and especially on exposure times, different methods of flash recognition have to be implemented. The exposure time is the main limitation for the complexity of the image analysis. A new image has to be analyzed while the next one is being collected, therefore time and CPU consuming algorithms cannot be used. The situation is different in the case of longer exposures, where much more sophisticated algorithms can be implemented. Typical methods will be described in the first subsection. In the second subsection the methods used in the pipeline developed for the “Pi of the Sky” experiment will be presented.

2.1. Identification of New Objects in the Sky

The main purpose of an algorithm looking for optical transients is to analyze new images of the sky and find new objects which were not present in the previous images or in star catalogues. The purpose is to find candidates for optical flashes with high efficiency, but also to reduce the background effectively, so that the number of events requiring visual inspection is reasonable. The first step is to compare the new image with a reference image. The type of the reference image depends on the realization of the algorithm. It may be a series of previous images taken under similar conditions just a moment ago, or it may be a reference image resulting from a combination of the best images of the same field of the sky taken earlier under very good observing conditions. Another way is to compare a new image with a reference catalogue of astrophysical objects. It is also possible to combine both solutions to obtain the most satisfying result.

The implementation of the method depends on the specific needs and characteristics of the experiment. Typically, experiments collect subsequent images and the time period for the analysis of a new image is limited by the exposure time: a new image should be analyzed before the next image is read out. In the case of short timescale (~10 s) surveys, like for example “Pi of the Sky” (10-second exposures), this may be a strong limitation, imposing requirements on the type of analysis methods that can be implemented. Therefore the image subtraction method is widely used. Images are taken under almost the same conditions, so the previous image can be subtracted from the new one, revealing changes in the sky. A precise and exact method of image subtraction was described in [16]. However, in many applications this method can be too time consuming, thus a very simple, pixel based subtraction is applied. In the case of sub-second exposures, as in the TORTORA experiment ([17, 18]), it is probably the only method that can be used. There are many variations of this method; in some cases a reference image or a median of images from earlier observations (i.e., taken under very good observing conditions) is used for the subtraction. After the subtraction, when the image revealing changes in the sky is obtained, a list of flash candidates is usually built by selecting pixels where the signal exceeds a certain threshold (5σ or so); a minimal sketch of this step is given after the list of background types below. However, one has to be aware of several types of artifacts which appear in the image after the subtraction. For example, the Point Spread Functions (PSF) of stars fluctuate and the subtraction can produce a false signal at the edges of the stars' PSFs. The suppression of this type of false alerts (by, e.g., rejection of alerts close to edges of stars or division by a variability map) is one of the main tasks of the flash selection algorithm. After this step a list of flash candidates is produced and subsequent steps are introduced to reject background events. The typical background that has to be handled at this stage by the subsequent cuts of the algorithm is mainly due to

(i) sky background fluctuations,
(ii) fluctuations of faint stars,
(iii) fluctuations of the PSF of bright stars,
(iv) saturated stars,
(v) cosmic rays hitting the CCD chip,
(vi) defects of the CCD matrix (hot pixels, bad columns, etc.),
(vii) artifacts (e.g., a strip due to a permanently opened shutter),
(viii) flashes due to artificial satellites,
(ix) flashes due to planes.
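A minimal sketch of the pixel based subtraction and thresholding step is given below. It assumes dark-subtracted frames of identical geometry stored as numpy arrays; the function name, the robust noise estimate used in place of a fitted Gaussian, and the 5σ default are illustrative assumptions, not the implementation of any particular experiment.

import numpy as np

def find_flash_candidates(new_image, reference, threshold_sigma=5.0):
    """Subtract a reference frame and return coordinates of pixels that
    exceed the local noise level by threshold_sigma."""
    diff = new_image.astype(float) - reference.astype(float)
    # Robust estimate of the noise in the difference image (median absolute deviation).
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    # Candidate pixels are those significantly brighter than the background.
    ys, xs = np.nonzero(diff > threshold_sigma * sigma)
    return list(zip(xs, ys))

# The reference could be the previous frame or, e.g., a median of earlier frames:
# candidates = find_flash_candidates(new_frame, np.median(frame_stack, axis=0))

In practice the returned pixel list still contains the artifacts listed above and has to be passed through further cuts.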

In the case of observations on longer timescales (30 s and more) the situation is usually easier. The time is long enough to implement more sophisticated algorithms. In such a situation exact photometry can be performed. The SExtractor package [19] is usually applied as one of the fastest photometric programs; many other photometric packages are also in use. After the photometric analysis a list of stars in the new image is produced and it is compared with the reference list of stars in the observed field. A catalogue of reference stars is stored in the computer's memory (or on a hard drive) and indexed (with a spherical or bucket index), allowing fast binary searches, so that every star in the new image can be verified very quickly. The reference list can be obtained from external star catalogues like USNO, the Guide Star Catalog (GSC) or SIMBAD, depending on the limiting magnitude of the survey. It can also be obtained from a self-produced star catalogue, or it may be a list of stars extracted from a reference image collected under very good observing conditions. After the comparison of the two lists of stars, a list of new objects in the sky is produced. They are candidates for optical transients and have to be verified by a more sophisticated analysis.
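As an illustration of this catalogue comparison, the sketch below matches the star list from a new image against an indexed reference catalogue. It is only schematic: the k-d tree from scipy stands in for the spherical or bucket index mentioned above, and the matching radius is an arbitrary placeholder.

import numpy as np
from scipy.spatial import cKDTree

def radec_to_unit_vectors(ra_deg, dec_deg):
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.column_stack([np.cos(dec) * np.cos(ra),
                            np.cos(dec) * np.sin(ra),
                            np.sin(dec)])

def find_new_objects(new_ra, new_dec, cat_ra, cat_dec, match_radius_arcsec=20.0):
    """Return indices of stars from the new image with no catalogue counterpart."""
    tree = cKDTree(radec_to_unit_vectors(cat_ra, cat_dec))   # built once per field
    dist, _ = tree.query(radec_to_unit_vectors(new_ra, new_dec), k=1)
    # Chord length on the unit sphere corresponding to the matching radius.
    max_chord = 2.0 * np.sin(np.radians(match_radius_arcsec / 3600.0) / 2.0)
    return np.nonzero(dist > max_chord)[0]

Stars without a counterpart within the matching radius become optical transient candidates to be verified further.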

The usage of exact photometry has many advantages over the simple pixel based image comparison. First of all it produces a list of stars, and if the photometry takes the stars' PSF properly into account, the problem of star image fluctuations can easily be avoided. In the case of longer exposures most of the background types listed above are irrelevant. Typically a few (at least 2) images of a given field are collected within an interval of a few hours and new objects are required to be visible on at least 2 images. Such a requirement rejects all background events due to cosmic-ray hits, flashing satellites and planes. However, it also rejects short optical transients.

2.2. Identification of Short Optical Flashes in the “Pi of the Sky” Experiment
2.2.1. The “Pi of the Sky” Experiment

The “Pi of the Sky” system is designed for observations of a large fraction of the sky with a temporal resolution of the order of 10 s. The system will consist of two farms of 16 cameras (Figure 1 in [20] in these proceedings), installed at a distance of several dozen kilometers from each other. Each camera will cover its own wide field of view (FOV), which will result in a total sky coverage of about 2 steradians for a single set of 16 cameras. This corresponds to the field of view of the Swift BAT [7] and Fermi LAT [8] detectors. Every GRB detected by the Swift satellite while its field is being followed by the “Pi of the Sky” system will thus already be in the field of view of the cameras. The presented algorithms will analyze images in real time, in search for short optical flashes. The two sets of 16 cameras will observe the same part of the sky. The large distance between the two sets is needed to use the parallax for the rejection of short optical flashes caused by artificial satellites and other near-Earth sources (Table 1).

The design and construction of a prototype was the first step towards the final “Pi of the Sky” system. The prototype was installed at Las Campanas Observatory (LCO) in Chile in June 2004. It consists of two custom designed cameras (almost the same as those built for the final system) [21] installed on an equatorial mount. Each camera has a 2000 × 2000 pixel CCD and is equipped with CANON EF f = 85 mm, f/d = 1.2 photo lenses, resulting in a pixel scale of 36 arcsec/pixel. The cameras continuously collect 10 s images of the same field in the sky, with 2 s breaks for the CCD readout.

The typical astrometric error of a star position is 10 arcsec. The large angular size of the pixels can be a cause of source confusion, particularly in dense fields. However, this is not a problem for the identification of optical transients, as the astrometric precision is sufficient for source identification with larger instruments (e.g., in the cases of discovered nova stars). It becomes a more important issue in the case of variable star analysis. Therefore the lightcurves of stars possibly affected by blending have to be excluded from that analysis.

The limiting magnitude of the prototype is the same as expected for the final system, both for a single 10 s exposure and for the average of 20 images. It was estimated that about 1/5 of Swift's GRBs will occur in the FOV of the system during good weather (75% of observational nights assumed). Based on the observed optical lightcurves of GRBs, it was estimated that at least 1-2 bursts per year should be bright enough for a positive or marginal detection. For the rest of the bursts, optical limits for the moment of the onset will be determined. Hopefully, first optical limits or detections of short GRBs at the very moment of the burst will also be obtained. More details about the prototype and the final design of the system can be found in [10, 20, 22].

2.2.2. First Level Trigger—Identification of Flashes from a Single Camera

The aim of this algorithm is to find optical flashes occurring on a single image (timescale of about 10 s). The signature of such events is the following: an object appears in a new image of the sky at a position where nothing was present in the previous images of the same field. In the “Pi of the Sky” experiment a new image is compared to a series (typically 7) of images collected just before it.

The first step of the analysis is the subtraction of a dark frame. In the next step, the image is transformed by a custom designed transformation, called laplace (because it resembles a discrete version of the Laplace operator). The value of each pixel is calculated as a simple function of several surrounding pixels, as illustrated in Figure 1. Values of pixels just around the transformed pixel (black squares in Figure 1) are summed and values of more distant pixels (open squares) are subtracted with a proper weight. This transformation is equivalent to the calculation of a simple aperture brightness for every pixel. The distribution of pixel values after such a transformation is centered around zero (Figure 2). The transformation allows for easy identification of pixels significantly brighter than the local background level. For every collected image a Gaussian curve is fitted to this distribution and the threshold required for a signal is calculated as a multiple of the dispersion value (σ), typically 5σ.

Several types of filters which were tested are shown in Figure 1. Images before and after applying the laplace filter with aperture 4 (Figure 1) are shown in Figure 2. After comparing the efficiencies and false event rates for different filters, laplace 12 (aperture 12 in Figure 1) was chosen for the current optical setup.
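The idea of such a filter can be illustrated with the short numpy sketch below: the values of a few nearest pixels are summed and a few distant pixels are subtracted with weights chosen so that a flat background gives a zero response. The kernel geometry and the robust width estimate used in place of the Gaussian fit are assumptions made for illustration, not the exact apertures of Figure 1.

import numpy as np
from scipy.ndimage import convolve

# Central pixel and its four nearest neighbours are summed (+1 each), four
# distant pixels are subtracted (-1.25 each) so the filter has zero mean.
kernel = np.zeros((7, 7))
kernel[3, 3] = 1.0
kernel[2, 3] = kernel[4, 3] = kernel[3, 2] = kernel[3, 4] = 1.0
for y, x in [(0, 0), (0, 6), (6, 0), (6, 6)]:
    kernel[y, x] = -1.25

def laplace_transform(image):
    """Apply the aperture-minus-background filter to a dark-subtracted frame."""
    return convolve(image.astype(float), kernel, mode="nearest")

def signal_threshold(laplace_image, n_sigma=5.0):
    """Robust width of the (zero-centered) distribution of transformed pixel
    values, used here instead of the Gaussian fit described in the text."""
    sigma = 1.4826 * np.median(np.abs(laplace_image - np.median(laplace_image)))
    return n_sigma * sigma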

At this stage the algorithm must handle the highest data rate, of the order of the number of CCD pixels per image, so it must be very fast and simple. It should preserve most of the signal and reject a big fraction of the non-interesting measurements. The algorithm does not require astrometry to be performed on every image; it is enough to run it every ten images or so. The algorithm takes advantage of the fact that the constant mount rotation compensates very efficiently for the Earth's rotation. However, it was not clear in the beginning whether this would be the case, so an image shift determination was also implemented and could eventually be used (e.g., for a fixed-mount setup). At this stage flash-like events in a single camera are identified. The following two criteria are required to select candidates for new objects in the sky.

(i) This cut selects stars in the new image by requiring a signal in the analyzed pixel. The condition for the presence of a signal is L_new(x, y) > T_n, where L_new(x, y) is the value of pixel (x, y) in the laplace-transformed new image (the subscript "new" stands for the new image) and T_n denotes the signal threshold. The threshold is specified by configuration parameters in multiples of the σ value. Usually it is set to 5σ, but for full moon nights it is automatically increased to 6σ. The goal of this cut is the identification of all pixels belonging to stars or other point-like objects in the new image.

(ii) This cut rejects objects present in previous images. It requires that there is no signal on the previous frame. “Previous frame” in this case means not just one single image, but the average of several previous images. The condition imposed on the value of a pixel in the reference image is the following: max L_prev(x', y') < T_v over the surrounding of pixel (x, y), where L_prev(x', y') is the value of pixel (x', y') on the laplace-transformed average of the previous images (the subscript "prev" stands for previous images) and T_v denotes the veto threshold. The surrounding is usually chosen as the 9 pixels (a 3 × 3 square) around (x, y). This allows the influence of small mount tracking inaccuracies and fluctuations of the edges of brighter stars to be avoided. Mount tracking ensures that in most cases an object falls into the same 3 × 3 pixel square on the new image. Pixels remaining after this cut should be new objects which appeared in the new image and were not present in previous images. Typically fewer than 100 such objects are identified per image.
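A minimal sketch of these two pixel-level cuts is given below, assuming the laplace-transformed new frame and the average of the previous frames are available as numpy arrays. The default threshold values and the use of scipy's maximum_filter for the 3 × 3 surrounding are illustrative choices.

import numpy as np
from scipy.ndimage import maximum_filter

def first_level_candidates(lap_new, lap_prev, sigma, t_n=5.0, t_v=2.0):
    """Pixels bright on the new frame and without signal on the previous frames."""
    # Cut (i): signal present in the analyzed pixel of the new image.
    has_signal = lap_new > t_n * sigma
    # Cut (ii): no signal within the 3x3 surrounding on the average of previous
    # images (tolerates small tracking inaccuracies and edges of bright stars).
    local_max_prev = maximum_filter(lap_prev, size=3)
    no_previous_signal = local_max_prev < t_v * sigma
    ys, xs = np.nonzero(has_signal & no_previous_signal)
    return list(zip(xs, ys))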

After these two cuts a list of flash candidates is created. Most of them are still due to background, which can be rejected by the following criteria:

(i) MinLaplace—rejects pixels which have a value on the previous image lower than the minimum allowed value. This cut allows the rejection of the edges of bright stars, where the values of pixels after the laplace filter often become negative, but can also fluctuate to values exceeding the signal threshold T_n.

(ii) IfMoreAfterTv—rejects the whole image if the number of pixels accepted after the T_v cut exceeds a certain limit. This cut allows the rejection of images with a large number of events, which are usually due to system errors, moonlight or clouds. If the image is flagged as bad, all its events are almost certainly false, so they are rejected and no further analysis of this image is performed. In the final version of the project it will be possible to use a cloudmeter in order to reject cloudy images, but it is not yet available for the prototype system.

(iii) SkipOverlaps—checks the number of pixels accepted within a certain radius around the current pixel. The algorithm keeps only one event and removes the overlapping ones. This reduces the number of pixels related to the same object that have to be analyzed to a single one.

(iv) Shape—the object shape indicator is calculated as s = A_cluster / A_circle, where A_cluster is the area of a cluster and A_circle is the area of the smallest circle circumscribed on this cluster. The cluster is defined as a group of pixels around the current pixel with values exceeding a certain cluster threshold. Elongated events, which are probably due to moving objects, result in small values of s. These are rejected by a lower limit on s (usually set to the value 0.2); a simplified computation of this indicator is sketched below.

(v) BlackPixels—this cut rejects pixels which have a signal much smaller than the neighboring pixels. If such a pixel is one of the “minus” pixels of the laplace filter (Figure 1), the resulting value of the filter is too high and would produce false alerts.

(vi) HotPixels—due to CCD chip defects some pixels can give a much higher signal than normal “good” pixels. Such effects should generally be eliminated by the dark image subtraction. However, sometimes new hot pixels can appear temporarily during a night and become quiet again later. Two ways of rejecting such events have been implemented. The first one is the calculation of the average value of a pixel on previous images; pixels containing only sky background should have this value close to the average value of the laplace filter (i.e., close to zero). Hot pixels and stars will have a large average value on previous images; in case it exceeds the allowed limit, the pixel is rejected. The second anti-hot-pixel cut is the rejection of pixels according to a list of known hot pixels. This list is updated regularly when new defects are found.

After the above selection cuts, the list of event candidates from a single camera is created; usually it consists of no more than a few tens of events per image.
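For illustration, the shape indicator used in the Shape cut could be computed roughly as follows; the connected-component clustering and the approximation of the circumscribed circle by the maximum distance from the cluster centroid are simplifications of the actual procedure, and the cluster threshold is left as a free parameter.

import numpy as np
from scipy.ndimage import label

def shape_indicator(lap_image, seed_xy, cluster_threshold):
    """Ratio of the cluster area to the area of an enclosing circle."""
    mask = lap_image > cluster_threshold
    labels, _ = label(mask)                       # connected clusters of bright pixels
    cluster_id = labels[seed_xy[1], seed_xy[0]]
    if cluster_id == 0:
        return 1.0                                # isolated pixel, treat as point-like
    ys, xs = np.nonzero(labels == cluster_id)
    area = len(xs)                                # cluster area in pixels
    cy, cx = ys.mean(), xs.mean()
    radius = np.sqrt(((xs - cx) ** 2 + (ys - cy) ** 2).max()) + 0.5
    return area / (np.pi * radius ** 2)

# An elongated, track-like event gives a small value and would be rejected by a
# condition such as shape_indicator(...) < 0.2.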

2.2.3. Second Level Trigger: Transient Verification

The goal of the second level trigger is to confirm events from a single camera and reject background events, which at this stage are mostly due to cosmic-ray hits, satellites and planes. The action at this level depends on the type of the system setup. Generally three configurations are possible.

(i) Two cameras on a single mount working in coincidence. In this configuration events found by the first camera are verified in the corresponding image from the second camera. Only events present in images from both cameras are accepted. This configuration is realized by the prototype system at LCO.

(ii) Confirmation of an event on the next images. This is also a very effective way of rejecting flashing satellites and cosmic-ray hits. However, short optical flashes are also rejected by such a requirement, thus this is not the best way to fulfill all the requirements for an ideal algorithm. It was formerly used when only one camera of the prototype was operational.

(iii) Two cameras in distant locations working in coincidence. This will be realized in the final version of the “Pi of the Sky” system. Cameras will be paired, and each pair will observe the same field in the sky. Spatial and time coincidence of the flash in both cameras will be required.

In any case the coincidence requirement is one of the most important cuts. The main goal is the rejection of cosmic rays hitting the CCD chip and imitating astrophysical flashes. In many cases cosmic-ray hits have a PSF completely different from the PSF of stars and could be rejected by a shape recognition procedure. However, in some cases they are very similar to the PSFs of stars. Even if this is a very small fraction of all cosmic ray events, it would make all flashes found by the algorithm uncertain. The probability that different cosmic ray particles hit the two chips at the same time and in the same positions (with respect to the stars) is negligible. Coincidence is also a very effective way of rejecting background events due to sky background fluctuations, edges of bright stars and clouds. In the prototype version, the collection of images by the two cameras is synchronized, so the only parameter is the maximum allowed angular distance between the events in both cameras. The value used in the current setup was determined from the distribution of angular distances of events in both cameras (Figure 3).
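A schematic version of this coincidence check is sketched below. The coordinate-based matching and the 120 arcsec distance limit are placeholders for illustration, not the actual configuration of the experiment.

import numpy as np

def angular_distance_arcsec(ra1, dec1, ra2, dec2):
    """Angular distance between two sky positions given in degrees."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_d = (np.sin(dec1) * np.sin(dec2)
             + np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0))) * 3600.0

def coincident_events(events_cam1, events_cam2, max_dist_arcsec=120.0):
    """events_cam*: lists of (ra_deg, dec_deg); returns confirmed events from camera 1."""
    confirmed = []
    for ra1, dec1 in events_cam1:
        if any(angular_distance_arcsec(ra1, dec1, ra2, dec2) < max_dist_arcsec
               for ra2, dec2 in events_cam2):
            confirmed.append((ra1, dec1))
    return confirmed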

After the coincidence requirement the remaining events are real optical transients coming from the sky. However, most of them are still background events, mainly due to flashing satellites. In order to reject most of these events, databases of orbital elements in the Two-Line Element (TLE) format are retrieved from the Internet every evening. They are combined into a single larger database containing about 13000 orbital elements. For every image, the positions of all satellites in the database are calculated (using the predict package [23]) and every flash candidate is verified: it is rejected if it lies closer than a certain rejection radius to any of the satellites. The rejection radius was determined from the distribution of angular distances from flashes to the closest satellite in the database, which is clearly peaked around zero (Figure 4).

The red dots on the plot represent the distribution of distances from randomly generated flashes to the closest satellite in the catalogue and illustrate the size of the combinatorial background.
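The satellite veto itself can be expressed very simply once the satellite positions for the image epoch are available. In the sketch below those positions are assumed to be precomputed from the TLE elements by an external propagator (as the predict package does in the real pipeline); the rejection radius is an illustrative value, and angular_distance_arcsec is the helper defined in the coincidence sketch above.

def passes_satellite_veto(flash_radec, satellite_radec_list, reject_radius_arcsec=300.0):
    """Reject a flash candidate lying close to any predicted satellite position."""
    ra_f, dec_f = flash_radec
    for ra_s, dec_s in satellite_radec_list:
        if angular_distance_arcsec(ra_f, dec_f, ra_s, dec_s) < reject_radius_arcsec:
            return False      # too close to a known satellite: reject
    return True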

The orbital element databases are not complete and many satellites are not included there. In order to reject those, event candidates from many consecutive images are examined against a track condition. If it is possible to fit a track to a set of events from different images and the velocity of the object is constant, all events on the track are rejected. This rejects a big fraction of flashing satellites and planes (Figure 5); however, it is possible that rarely flashing (rotating) satellites can still survive this cut.
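One simple way to express the constant-velocity track condition is a straight-line fit of the event positions versus time, as in the sketch below; the minimum number of detections and the residual tolerance are assumptions made for illustration.

import numpy as np

def is_constant_velocity_track(times, ras, decs, max_residual_arcsec=60.0):
    """times in seconds, positions in degrees; requires at least 3 detections."""
    if len(times) < 3:
        return False
    t = np.asarray(times, dtype=float)
    residuals = []
    for coord in (np.asarray(ras, dtype=float), np.asarray(decs, dtype=float)):
        slope, intercept = np.polyfit(t, coord, 1)      # linear (constant velocity) model
        residuals.append(coord - (slope * t + intercept))
    rms_arcsec = np.sqrt(np.mean(np.concatenate(residuals) ** 2)) * 3600.0
    return rms_arcsec < max_residual_arcsec

# Events belonging to a well-fitted track would be flagged as a satellite or a plane
# and removed from the candidate list.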

Artificial satellites orbiting the Earth may rotate and sometimes reflect the sunlight towards the apparatus, causing flash-like events. The most problematic are those entering the FOV of the telescope and flashing only once or twice. Another problem is that verifying whether an event belongs to a track or not requires the collection of some images after the flash. In order to confirm the event, the program must wait for some time and therefore it is no longer a real-time transient identification.

A better solution, allowing for efficient rejection of satellites in real time, is a coincidence between cameras installed in distant locations. In such a configuration, it is possible to reject near-Earth flashing objects by using the parallax (Figure 1 in [20], these proceedings). In the prototype version of the experiment the two cameras are installed on a single mount. Nevertheless, this method was tested by a coincidence with the RDOT experiment [24] located at La Silla, at a distance of 30 km. The final design of the “Pi of the Sky” experiment will consist of two subsystems at a distance of several dozen kilometers.

Table 1 shows the maximum distance up to which artificial satellites can be rejected, assuming the optics and CCD chip used for the final design of the “Pi of the Sky” experiment (pixel angular size of 36 arcsec). A baseline of 100 km would be optimal and would allow the rejection of all objects even beyond the orbit of the Moon. Building such a wide field system at two sites is a large project. However, the number of robotic systems looking for optical transients is growing rapidly, thus another promising possibility may be looking for coincidences of transients in co-operation with other experiments. Developing and joining networks like VOEventNet [12] allows events from different experiments (not only optical) to be correlated in almost real time.
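A back-of-the-envelope check of this statement: requiring the parallax between the two sites to exceed one 36 arcsec pixel gives the maximum distance at which a near-Earth object can still be recognized. The short script below evaluates it for the 30 km and 100 km baselines mentioned in the text.

import numpy as np

pixel_arcsec = 36.0
pixel_rad = np.radians(pixel_arcsec / 3600.0)

for baseline_km in (30.0, 100.0):
    # small-angle approximation: parallax [rad] = baseline / distance
    max_distance_km = baseline_km / pixel_rad
    print(f"baseline {baseline_km:5.0f} km -> rejection out to ~{max_distance_km:,.0f} km")

# A 100 km baseline gives a one-pixel parallax out to roughly 570,000 km, i.e.,
# beyond the Moon's orbit (about 384,000 km), consistent with the statement above.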

2.2.4. Third Level Trigger: Final Confirmation and Classification

The first two levels of the trigger retain a very small number of events, on average not more than 20 per night. This depends strongly on the weather conditions, and in the case of a cloudy night the number can reach hundreds. However, in the final system the number of events will be 16 times larger, reaching 300–400 per night, which would be much more difficult to inspect. For this reason the third level of the trigger (TLT) has been implemented. It checks the final events accepted by the previous levels, which ensures that only a small number of events have to be examined. Therefore it is possible to implement more sophisticated and time consuming algorithms to check every event. The current implementation of the TLT consists of several cuts. As an example, the criterion developed for the rejection of plane and meteor-like events will be described. A simple Hough transform (the Hough transform is a technique of transforming an image from (x, y) to cylindrical (r, φ) coordinates in order to find particular shapes in the image) can be applied to a small part of the image surrounding the event. It finds pixels with a signal above a certain level and creates the distribution of their polar angle φ. A significant peak in the distribution means that the event looks like a “straight line”, which is most probably due to a plane or a satellite (Figure 6).

The event will be considered as a “straight line” due to a plane if the maximum of this distribution is larger than a certain threshold above the mean value (Figure 6). More details about the criteria implemented in the TLT can be found in [10].
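A simplified version of this test is sketched below: pixels above the signal level in a small window around the event are histogrammed in polar angle with respect to the event position, and a dominant peak marks a track-like event. The number of bins, the minimum number of pixels and the peak-to-mean threshold are illustrative parameters.

import numpy as np

def looks_like_straight_line(window, center_xy, signal_level,
                             n_bins=36, peak_over_mean=5.0):
    """True if bright pixels around the event concentrate along one direction."""
    ys, xs = np.nonzero(window > signal_level)
    if len(xs) < 5:
        return False
    angles = np.arctan2(ys - center_xy[1], xs - center_xy[0])   # polar angle phi
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    return hist.max() > peak_over_mean * hist.mean()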

2.2.5. Testing and Results

The “Pi of the Sky” prototype at Las Campanas Observatory has been collecting data since June 2004. The algorithm works every observational night. The efficiency of the flash recognition was determined by simulating optical flashes and counting how many artificial stars added to real sky images were identified and how many of them were wrongly rejected. This efficiency is typically of the order of 70%–80% for cloudless nights and sufficiently bright objects. The overall efficiency of the algorithm for stars brighter than the limiting magnitude is 35%. Further improvements of the algorithm are planned for the final system.

The greatest success of the experiment, but also of the real-time pipeline for optical transients, was the automatic discovery of the OT from GRB 080319B [9]. Another interesting event was the automatic identification of an outburst of the flare star CN Leo on 2005.04.02 at 1:13:40 UT [25]. There were also a few events which were observed on 2 consecutive images, but only with a single camera (the second one was not working at that time). And there were more than 200 events visible on a single 10 s exposure by both cameras simultaneously. None of them was positively correlated with any astrophysical event or a signal from other experiments. Most of them are probably due to rarely flashing satellites which could not be rejected by the algorithm. However, some of them can be due to interesting astrophysical processes. In the near future, OTs discovered by the “Pi of the Sky” detector will be automatically published via the VOEventNet network. Based on the statistics of the optical transients identified by the prototype, preliminary upper limits on the rate of OTs on the whole sky were determined.

(i) Number of OTs of shorter time duration: … events/day.
(ii) Number of OTs of longer time duration: … events/day.

3. Implementation of Real Time Pipelines in other Experiments

The number of telescopes and wide field systems is growing rapidly. In many cases the main goal is strictly related to GRBs, but it also covers other short timescale processes, especially optical transients (OTs). It is impossible to present all the experiments which have developed pipelines searching for OTs or other processes in real time. The choice of experiments presented in this contribution had to be subjective. The idea was to present several experiments focused on different areas of astrophysical processes and to show the variety of applications in which real-time pipelines are already very important tools. Most of the experiments presented here collect images with exposure times of 30 s or more; in many cases a few images of a given field are collected and, after a few hours, the next set of images of the same field is collected. In most cases the algorithms require an object to be visible on at least 2 images from a single set or from both sets of images. Thus several problems typical for algorithms acting on 10 s images are irrelevant (see Section 2).

Perhaps one of the most advanced systems for wide field observations is the RAPTOR experiment [26]. It already consists of two sites, RAPTOR-A and RAPTOR-B, separated by a distance of several tens of kilometers. A single site consists of 4 wide field cameras and a “fovea” telescope which is used for follow-up observations. The parallax gives the possibility of rejecting near-Earth objects. The integration time is typically 30 s, and the algorithm compares new images with a self-produced star catalogue. The catalogue was started from the GSC star catalog and is extended as new data are collected. The system is planned to be extended to a large array of wide field cameras.

Another very advanced project, on the opposite side of the Earth, is the MASTER system in Russia ([27–29]). To the author's knowledge it already consists of 6 wide field cameras. The experiment covers timescales down to 0.15 s. A real-time pipeline has been implemented and successfully searches for supernova explosions; several events have been discovered automatically. Important components of the system are 40 cm telescopes for follow-up observations of the detected transients.

The ROTSE system ([6, 30]) has so far probably been one of the most important robotic telescopes in history. It was the first experiment to observe optical emission contemporaneous with the γ-ray emission from GRB 990123. Since that time ROTSE has changed its strategy from a wide field system to narrow field telescopes located at 4 sites around the Earth. A pipeline for the automatic identification of OTs has been implemented, and about 30 supernova events and a few OTs of unknown origin are identified per year.

The TORTORA project ([17, 18]) is a wide field system which for the first time observed the optical signal from a GRB (GRB 080319B) with sub-second time resolution [9]. The camera covers a wide FOV and collects 1/7 s images, which are analyzed in real time in search for optical transients. Such a high time resolution results in an enormous amount of data, making the implementation of real-time analysis inevitable.

The first Polish robotic telescope, and one of the first in the world, is the All Sky Automated Survey (ASAS) [31]. The system is located at Las Campanas Observatory in Chile and also on Maui in the Hawaiian Islands. It consists of wide field cameras which typically collect 2-3 minute exposures of each field. The system scans the entire sky every 1-2 days. New objects in the sky are identified in real time by a comparison of the list of stars from new images with its own star catalogues resulting from previous observations. The main goal of the experiment is the creation of a complete catalogue of variable stars. However, the pipeline designed for optical transients has discovered many new cataclysmic variable stars and also several comets.

Perhaps one of the first real-time pipelines ever implemented was the one developed for the Optical Gravitational Lensing Experiment (OGLE) [32]. It is not a robotic telescope, but the pipeline for the automatic discovery of optical gravitational lensing events is a very successful one and is definitely worth presenting here. The analysis of this type of phenomena requires good coverage of the optical lightcurve, so early detection and distribution of the alert to the community is a very important issue. In order to find brightenings due to gravitational lensing, the differential photometry technique is used [16]. A cumulative reference image of the same field, observed in previous seasons, is subtracted from the new image in order to find brightness variations. The OGLE group has also developed a pipeline dedicated to OTs (the NOOS system—New Objects in OGLE Sky) which has discovered some supernova events. OGLE microlensing events are already distributed via the VOEventNet network in almost real time.

Recently real-time pipelines for OTs have been implemented and tested on data from several large telescopes. One example is the Catalina Real-Time Transient Survey (CRTS) [33], which uses images from the Catalina Sky Survey (CSS) in order to find optical transients. The telescope observes a wide FOV and collects four 30 s exposures of a given field every 30 minutes. A real-time pipeline based on an image subtraction technique has been implemented. The transients are also verified against the USNO-B star catalogue, the Sloan Digital Sky Survey (SDSS) and the Palomar Quest (PQ) survey. The survey detected 350 events in 6 months. They were mostly supernovae, cataclysmic variables, UV Ceti flare stars, blazars, and Near Earth Objects (NEOs). An important feature of the experiment is that events are already published in real time through the VOEventNet network.

A similar example is the Palomar Transient Factory [34], which uses a telescope at the Palomar Observatory. It covers a large FOV and reaches a relatively deep limiting magnitude. The real-time pipeline for OTs has been developed in order to search for SNe and also exotic optical transients of unknown origin.

The Liverpool Telescope Transient Rapid Analysis Pipeline (LT-TRAP) is an interesting example of a pipeline for real-time GRB analysis. It was originally designed for the 2 m Liverpool Telescope (LT) [35], which reacts to GCN alerts. The main purpose of the LT-TRAP pipeline is to automatically identify afterglow candidates in images collected just after the telescope has been re-pointed to the GRB position. The real-time analysis results in a decision on the strategy of the OT observations: it must quickly decide whether polarimetry and spectroscopy should be performed, or rather multi-color imaging, and so forth. The LT-TRAP pipeline has also been deployed at other facilities (i.e., the Faulkes Telescopes).

There are currently many efforts to implement and deploy real-time pipelines on large telescopes. In several cases the main goal is to study the discovery potential of such devices in the short timescale regime and to study the background. They provide a science-driven testbed for future projects such as the Large Synoptic Survey Telescope (LSST), which is planned to start collecting data in 2014 [14]. The LSST will be a large telescope with a very wide FOV and a 3.2 gigapixel CCD mosaic. It will cover a broad range of timescales, scanning the whole sky every 3-4 days. One of the important points of the scientific program is discovering OTs in real time. The experiment will produce up to 30 TB of data per night, an amount which will probably have to be analyzed in real time.

Another large scale future project for wide field multiband observations is Pan-STARRS [15]. It is now in the prototype phase PS1 (one mirror) and is intended to be fully operational in the next few years. Its main goal is the identification of moving Solar System objects, especially Potentially Hazardous Asteroids (PHAs) and Near Earth Objects (NEOs). However, the identification of other varying sources will also be covered. A single exposure (typically 30 s) will have a size of 2 GB, making a real-time data analysis pipeline a must. A cumulative master image of every observed field will be produced. It will be subtracted by the pipeline from each new image in order to find moving or varying sources. Only the most interesting data will be stored longer for further analysis.

On the other end of the scale there are plenty of fish-eye cameras observing the night sky all around the world. Since the famous GRB 080319B it is clear that even such small devices can be successful and provide interesting scientific results. In many cases real-time pipelines searching for optical transients have been implemented. For example, the CONtinuous CAMera (CONCAM) system [36] consists of 11 stations all around the world. They collect 180 s images of the whole sky. The algorithm analyzes only good quality images, compares them to reference images collected under good conditions, and automatically rejects false alerts due to planets or variable stars. An interesting optical transient of unknown origin, OT060420, was identified and reported in [37].

4. Conclusions

The number of experiments implementing real-time pipelines is growing rapidly. They span from very small fish-eye all sky cameras, through complex wide field systems, to very large, not only robotic, telescopes. They cover timescales from fractions of a second up to hours and days. The areas of scientific interest also form a rich variety. There are many interesting and successful solutions in the field. Probably the best solution is stereo observation with two systems in distant locations. This allows real-time rejection of near-Earth objects, especially flashing artificial satellites. The problem is that this doubles the cost of the system. The main problem with many optical transients discovered in different experiments is that they lack confirmation from other projects. Therefore attention should also be paid to joining networks (like VOEventNet). They allow the distribution of alerts about optical transients and the correlation with other experiments in almost real time. At a certain moment the number of observatories may reach the point where correlation with other experiments will be as efficient as stereo double-systems. The data streams from optical telescopes are becoming larger and larger; this will also imply the necessity of fast real-time data analysis tools. The creation of good standard solutions, widely used and tested by the community, would save much future effort. The implementation of the pipeline for the identification of short optical transients in the “Pi of the Sky” data was presented. It successfully works on the data from the prototype system at LCO. The algorithm automatically identified a bright optical transient related to GRB 080319B and many short OTs of unknown origin. It is intended to be used in the final design of the system. The final system will correlate OTs from two farms of wide field cameras, separated by a distance of several dozen kilometers. It will be able to credibly identify optical transients of astrophysical origin in real time.

Acknowledgments

The authors are very grateful to G. Pojmanski for giving access to the ASAS dome and sharing his experience with us. They would like to thank the staff of the Las Campanas Observatory for their help during the installation of the apparatus. This work was financed by the Polish Ministry of Science in 2005–2009 as a research project.