Abstract

Particle detectors based on nuclear emulsions contributed to the history of physics with fundamental discoveries. The experiments benefited from the unsurpassed spatial and angular resolution of these devices in the measurement and identification of ionizing particle tracks. Despite the decline of the technique around the 1970’s, caused by the development of modern electronic particle detectors, emulsions are still alive today thanks to the vigorous rebirth that took place around the beginning of the 1990’s, driven in particular by the needs of neutrino experiments. This progress involved both the emulsion detectors themselves and the automatic microscopes needed for their optical scanning. Nuclear emulsions have marked the study of neutrino physics, notably in relation to neutrino oscillation experiments and to the related first detection of tau-neutrinos. Relevant applications in this field are reviewed here with a focus on the main projects. An outlook is also given, addressing the main directions of the R&D effort currently in progress and the challenging applications to various fields.

1. Introduction to Nuclear Emulsions

Particle detectors based on the nuclear emulsion technique contributed to the history of particle physics with fundamental discoveries and measurements that profited from their unsurpassed spatial and angular resolution in the measurement of charged elementary particle tracks. Moreover, thanks to specific detector arrangements, accurate momentum and energy measurements were also carried out. Despite the decline of the technique around 1960–1970 due to the development and use of the modern electronic particle detectors, emulsions are still used today, thanks to the vigorous rebirth of the technique that took place around the beginning of the 1990’s, driven by the needs of neutrino experiments.

Nuclear emulsions have been effectively used in many particle physics experiments and in particular contributed to neutrino oscillation physics and to the related issue of the detection of tau-neutrinos (ντ). Here, the focus is on this specific physics subject, unfortunately excluding the many scientific results that were obtained in other fields. For those, we recommend the reader to consult existing reviews [1–3]. The reader is also invited to note that the emulsion detection technique is based on two independent aspects that have been synergic throughout their technological development: the emulsion detector itself and the devices (microscopes) necessary for extracting the information stored in the emulsions.

An emulsion usually consists of a huge number of crystals of silver bromide (AgBr) dispersed in gelatin. Light or ionizing particles passing through an emulsion can activate some of the silver “grains.” This process is rather complex and details can be found in the above-mentioned review papers. As a result, the absorption of energy in a silver bromide crystal leads to the concentration of a few silver atoms into an aggregate that can act as a development center, hence creating a latent image, at this point still invisible. A physicochemical process then allows transforming those grains into metallic silver. After a suitable development procedure, the silver halide emulsion is placed into a bath (fixer) that only leaves the small black granules of silver. The detection of space-correlated sequences of these granules by suitable optical microscopes allows measuring the trajectory of ionizing particles. The role of the gelatin is to provide a three-dimensional matrix that keeps the small halide crystals in place and prevents them from migrating during the development and fixation procedures. This is very similar to what happens in normal photographic films. However, nuclear emulsions differ from the latter in several respects: a higher content of silver halide (by about one order of magnitude), a larger thickness, and smaller and more uniform silver grains.

Several quantities characterize nuclear emulsions as a particle detector. The first is the sensitivity, related to the number of grains that are produced for a given energy release by an incoming particle. The second feature is the so-called fading. This is the oxidation process that causes the reduction of the number of “stored” grains along the track as a function of time. High relative humidity and temperature can accelerate fading, leading to the loss of detectable tracks. As we will see, fading was exploited in recent applications in order to “clean” emulsions prior to their use, erasing unwanted, accumulated cosmic-ray and environmental radiation tracks. The third factor is the mechanical distortion of the emulsion layers. This occurs either locally or globally and can obviously affect the knowledge of the absolute position of particle tracks and, more seriously, of their direction and curvature. However, several effective techniques exist and are routinely employed to reduce distortions and their effects. Finally, randomly distributed grains (the so-called “fog”) produced by thermal excitation constitute a background for track recognition in the emulsion. This potential background for track identification and reconstruction is usually quantified by the number of (fog) grains present per 1000 μm³. Figure 1 shows an electron microscope image of the silver crystals in the emulsion gel, and the reconstructed image of a minimum ionizing particle (MIP) track. The sensitivity corresponds to about 30 grains/100 μm.
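
To get a feeling for these numbers, the short sketch below compares the grains left by a MIP in a single emulsion layer with the fog grains populating a typical microscope field of view; the fog density, layer thickness, and field of view are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope comparison of track grains versus fog grains.
# All numbers except the MIP sensitivity are illustrative assumptions.

GRAINS_PER_100UM = 30          # MIP sensitivity quoted in the text
FOG_PER_1000UM3 = 5            # assumed fog density (grains per 1000 um^3)
LAYER_THICKNESS_UM = 44        # assumed emulsion layer thickness
FIELD_OF_VIEW_UM2 = 300 * 300  # assumed microscope field of view

# Grains left by a MIP crossing one layer perpendicularly
signal_grains = GRAINS_PER_100UM * LAYER_THICKNESS_UM / 100.0

# Fog grains expected in the same field of view
fog_grains = FOG_PER_1000UM3 * FIELD_OF_VIEW_UM2 * LAYER_THICKNESS_UM / 1000.0

print(f"grains per MIP microtrack:    {signal_grains:.1f}")  # ~13
print(f"fog grains per field of view: {fog_grains:.0f}")     # ~20000
```

Even a modest fog density yields random grains that vastly outnumber the roughly dozen aligned grains of a single track segment, which is why track recognition relies on the spatial correlation of grain sequences rather than on simple grain counting.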

2. A Brief History

Emulsions have a long history. The first use of the detector dates back to the pioneering work of Kinoshita [4] and later of Powell et al. [5], who obtained important results in the late 1940’s and 1950’s by profiting from the continuous progress in the production of gel of improved quality. The discovery of the pion in 1947 constituted an outstanding success of the technique [6]. An important breakthrough was represented by the invention of the so-called Emulsion Cloud Chamber (ECC) (Figure 2). In the ECC, emulsion films or thicker plates are interspaced with passive material layers made of plastic or metal, constituting what we would call today a sampling calorimeter. In this way, one can potentially fully reconstruct all tracks of an electromagnetic or hadronic shower. With the ECC arrangement emulsions became high space-resolution tracking devices with full 3D reconstruction capabilities, in addition to the “standard” use as a pure volume/imaging detector. Similar sandwich arrangements were proposed even before the advent of the ECC technique, as reported in [2].

The study of events taken with ECC detectors profits from the potential reconstruction of all tracks produced in a primary interaction occurring in the passive material. Space angles are measured for the track segments, and the shower energy and axis are reconstructed as well. Particle decays can be identified by detecting kinks in the tracks. Moreover, by exploiting Multiple Coulomb Scattering and emulsion ionization measurements, one can obtain accurate measurements of the particle momenta. In the early days, the mechanical stability of the ECC sandwich was generally ensured by a vacuum packing paper known as “origami,” also needed to protect the films from external light, humidity, and polluting gases. Film-to-film alignment can be performed by suitable X-ray exposure. As we will see later on, with the OPERA experiment the ECC technique reached its highest development, with several insights and innovative technological solutions.

The ECC was first developed and used by Kaplon to study heavy primaries in cosmic-ray interactions [7]. ECC detectors were then applied to the study of the cosmic-ray spectrum and of very high energy interaction processes. Nishimura, who contributed significantly to the development of the technique, proposed the ECC for the measurement of the energy of γ-rays through the study of the induced electromagnetic shower, and proposed the method of tuning the development of electron showers by a suitable use of passive material plates [8]. Still in Japan and in collaboration with the FUJI company, Niu proposed double-sided emulsion plates where the sensitive emulsion is deposited on either side of a plastic base [3]. The optical properties of the base were chosen to be close to those of the emulsion gel (methacrylic resin, lucite). The use of a plastic base between the two emulsion layers allows the precise measurement of the incident particle track angle by connecting grains near the base, which are not affected by distortions (Figure 3).
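
In quantitative terms, this angle measurement reduces to simple geometry: connecting the last grain of the top layer and the first grain of the bottom layer across a base of known thickness gives the track slopes. A minimal sketch follows (the base thickness is an assumed value; the actual figure depends on the film type):

```python
import math

# Minimal sketch: track angle from the two grains closest to the plastic
# base of a double-sided film. The base thickness is an assumed value.

BASE_THICKNESS_UM = 200.0

def track_angles(dx_um, dy_um, base_um=BASE_THICKNESS_UM):
    """Slopes and space angle from the lateral displacement between the
    grain at the bottom of the top layer and the grain at the top of the
    bottom layer, which are unaffected by near-surface distortions."""
    tx = dx_um / base_um                   # slope in the x-z projection
    ty = dy_um / base_um                   # slope in the y-z projection
    theta = math.atan(math.hypot(tx, ty))  # space angle w.r.t. the normal
    return tx, ty, theta

tx, ty, theta = track_angles(dx_um=35.0, dy_um=-12.0)
print(f"tx = {tx:.3f}, ty = {ty:.3f}, theta = {math.degrees(theta):.1f} deg")
```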

With ECC detectors the emulsion technique got an impressive boost due to the possibility of realizing large-size (surface and volume) detectors, especially for cosmic-ray detection, thanks to the use of dense metal plates complementing the outstanding tracking properties of the emulsion layers. One can mention, in particular, the Chacaltaya [9] and the Mt. Fuji [10, 11] experiments and, in more recent times, the RUNJOB [12] and JACEE [12] detectors.

Thanks to an experiment conducted with an ECC detector placed on a cargo flight in 1971, Niu et al. discovered the so-called X-particles [13] by observing a cosmic-ray induced event where two charged particles produced in a primary interaction exhibit a clearly identified kink decay-topology (Figure 4). Today we know that that event had to be attributed to charmed meson production and decay. This happened three years before the discovery of the J/ψ particle by the groups of Richter and Ting with electronic detectors at accelerators. More information on Niu’s results can be found in [3] and references therein.

With the advent of electronic particle detectors, one started using ECCs in conjunction with electronic devices, with the main purpose of giving some degree of time resolution to the emulsions, through the proper association of tracks reconstructed “at the same time” in both detectors, although with different spatial resolutions. One speaks in this case of a hybrid experimental setup. Thanks to the electronic detectors, tracks originated by particles interacting in the ECC are preselected so as to approximately identify the position where the emulsion scanning should start or where to look for interaction vertices, as shown in Figure 2. The subsequent emulsion scanning then allows measuring and studying the events with much higher spatial accuracy (see, e.g., [14]). For this “step-by-step” event reconstruction procedure one could introduce a further intermediate phase, by employing additional thin emulsion films, originally called changeable sheets (CS), to act as an interface between the electronic tracker and the actual emulsions of the ECC. The CS concept was first applied in the E531 experiment at Fermilab [14].

The CS detector arrangement is shown in Figure 5, as used in the OPERA experiment. Scintillator-strip planes identify some of the particle tracks produced in the interaction of a primary particle in the ECC sandwich. These tracks are then extrapolated to the CS interface films, which are scanned to search for track segments matching the scintillator predictions. Only if this correspondence is validated is the ECC detector unpacked, with its emulsion films developed and scanned to search for the primary interaction. For several applications, the CS detectors were periodically replaced during the physics runs in order to limit the integration of background tracks and to facilitate the identification of the tracks found in the electronic detectors.
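
The validation step lends itself to a compact description: a tracker prediction is confirmed if some CS segment matches it within position and angle tolerances. The sketch below is a minimal illustration; the tolerance values are placeholders of the order of the tracker resolutions quoted later for CHORUS (about 150 μm and 2 mrad), not the actual cuts of any experiment.

```python
import math
from dataclasses import dataclass

# Sketch of the tracker-to-CS matching step: a predicted track is validated
# if some CS segment matches it within position and angle tolerances.
# The tolerance values below are illustrative placeholders only.

@dataclass
class Track:
    x: float    # position at the CS plane (micron)
    y: float
    tx: float   # slope dx/dz
    ty: float   # slope dy/dz

def matches(pred, seg, pos_tol_um=150.0, ang_tol_rad=0.002):
    dpos = math.hypot(seg.x - pred.x, seg.y - pred.y)
    dang = math.hypot(seg.tx - pred.tx, seg.ty - pred.ty)
    return dpos < pos_tol_um and dang < ang_tol_rad

def validate(prediction, cs_segments):
    """Return the CS segments compatible with the tracker prediction;
    the ECC is unpacked only if this list is not empty."""
    return [s for s in cs_segments if matches(prediction, s)]
```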

In the 1970’s, emulsion detectors of increasing mass and complexity were developed for applications to particle physics experiments conducted at particle accelerators with hybrid setups. Emulsions were used as active targets with high space-resolution, combined with electronic trackers, calorimeters, and spectrometers, used to preselect or trigger specific events in the emulsions and to complement the emulsion-based kinematical information. Important for our discussion, emulsion experiments were successfully designed and operated to study neutrino interactions. As one of the first examples one can mention the observation in 1976 of the decay of a charmed particle produced in a high-energy neutrino interaction in a Fermilab experiment [15]. Another experiment performed at CERN in 1977 used emulsions coupled to the large bubble chamber BEBC exposed to a neutrino beam [16]. More than 500 charged current neutrino interactions in the emulsions were identified, 8 of them attributed to neutrino-produced charmed particles.

The already mentioned E531 experiment was originally proposed to study charmed particles produced in neutrino interactions. The results on the cross-section measurements are given in [17], and the measurements of the lifetimes of charmed particles are reported in [18–21]. The active neutrino target consisted of nuclear emulsions where short-lived particles could be detected with micrometric accuracy. The decay products were measured by means of an electronic spectrometer, making E531 a pioneering example of a hybrid particle physics experiment. 122 events were tagged by the presence of a secondary vertex in the target, 119 induced by neutrinos and 3 by antineutrinos. Events with a charmed hadron in the final state were studied in detail in order to detect the presence of heavily ionizing particles (baryons) and to fully reconstruct the kinematics at the decay vertex. Among those events, 57 were classified as D0 candidates [22]. One important side result of E531 was a sensitive search for neutrino oscillations [23, 24].

The excellent capability of hybrid emulsion experiments in the detection of short-lived particles was exploited in many projects that yielded outstanding scientific contributions. One can mention in particular the CERN WA75 experiment, with the first direct observation of the production and decay of B mesons [25], and the Fermilab E653 experiment, aimed at the measurement of the B meson lifetime [26], in addition to several other studies conducted with proton and nucleus-nucleus interactions (see [3] for a review).

This ability of the emulsion technique in the measurement of short-lived particles was an asset for the studies started around the 1990’s involving neutrino beams, in particular for the detection of tau-neutrinos, either promptly produced or coming from the oscillation process. However, before turning to this subject, it is worth describing the parallel progress in emulsion scanning methods and technology that made such studies feasible.

3. Modern Emulsion Detectors

The progress of the emulsion plate and film technology has been very much driven by the applications, notably the detection of short-lived particles and, in the context of this review, the need of identifying tau-leptons. The latter, in turn, can be an indication of the charged current interaction of tau-neutrinos, either prompt (e.g., from a beam dump experiment) or coming from the oscillation of muon neutrinos, as in the case of the dominant channel of the so-called atmospheric neutrino-mixing signal [27]. We will review these experiments later on. Here, one can summarize the main technological advances that made these experiments feasible.

An important milestone was met by fulfilling the requirements of the CHORUS neutrino oscillation experiment at CERN [28, 29]. The hybrid detector was based on an unprecedented amount of nuclear emulsions to be used as an active target for the interaction of tau-neutrinos possibly generated by muon-neutrino oscillations. The emulsion target had a mass of 770 kg, segmented into four stacks, each consisting of eight modules individually composed of 36 plates with a surface of 36 × 72 cm². The plates had a 90 μm plastic support coated on either side with emulsion layers of 350 μm thickness [29]. CHORUS represented a milestone in the history of nuclear emulsions for the size of the target and of the related CS interfaces, also massively used in the experiment. The realization of the target required labor-intensive procedures for emulsion gel production, manual pouring onto the plastic bases, and development, conducted in the CERN emulsion laboratory [30], originally set up for the CERN WA75 experiment. This effort progressed in parallel with the development of the required analysis tools, in primis the first automatic scanning microscopes, as described in the next section.

With the conceptual proposal and initial design of the OPERA experiment [31, 32] we witnessed the materialization of original ideas meant for an unprecedented use of the ECC technique in neutrino oscillation experiments [33]. The project brought about a veritable revolution in the technique. Emulsions transformed from active bulk target detectors, realized and assembled by researchers at typical laboratory scales, into industrially produced films conceived as high-resolution microtracking devices assembled in a large number of ECC units. This effort was very successful thanks to the collaboration between the Nagoya group led by Niwa and the FUJI company [34], together with other relevant industrial contributions. One should mention, in particular, the realization of the ~10 million lead plates with high mechanical uniformity, the automatic assembly of the ~150000 ECC detectors constituting the OPERA target by robotized assembly lines working underground at LNGS, the large film developing stations at LNGS, and the changeable sheet production line operational in one of the underground LNGS halls. A description of this complex and large-scale infrastructure can be found in [35].

Very uniform automated machine coating of 44 μm emulsion layers on either side of a 250 μm plastic base was successfully achieved on the huge scale of about 10 million films, each of about 10 × 12.5 cm², for a total emulsion surface of nearly 110000 m². To allow for the automatic industrial coating, the gel needed to be diluted in order to reduce its viscosity. This implied a reduction of the grain density, in turn recovered by improvements in the gel sensitivity obtained, for example, by a controlled double-jet method for the production of mono-dispersed AgBr microcrystals. The number of crystals along the particle trajectories was increased while keeping constant the volume occupancy of AgBr and the average diameter of the crystals. A multi-coating of the films was adopted by FUJI: after the first 20 μm layer is coated on both sides of a rolled plastic base, a second layer is coated over the first one. A thin 2 μm gelatin spacer protects adjacent emulsion layers. A FUJI OPERA film is shown in Figure 6.

Another development that made the OPERA experiment feasible was the realization of the so-called emulsion refreshing. High temperature and high relative humidity increase the rate of the progressive loss of the latent image; as mentioned above, this process is called fading. Fading, however, can even be turned into an asset when the exposure occurs much later than the (large-scale) film production and a low background is required, as in the case of OPERA. Controlled fading of the OPERA films after production and storage was achieved by introducing 5-methylbenzotriazole into the emulsion gel [34]. Absorption of this compound by the radiation-induced silver specks decreases the oxidation-reduction potential, while the sensitized centers of sulfur and gold remain stable against oxidation. In this way, most of the tracks recorded after production (due to ambient radioactivity and cosmic-rays) are erased, while a high enough sensitivity of the refreshed films is maintained. A typical cycle consists of three days at 98% relative humidity and 27°C. Such a procedure allows erasing more than 95% of all accumulated tracks. This is illustrated in Figure 7, which compares a film treated with refreshing to one that was not refreshed.

Finally, a few words on the continuing R&D on the emulsion gel technology. We will see in the following that specific applications demand gel with improved sensitivity and efficiency. Work is in progress, in particular in Nagoya, where gels of different characteristics can be produced in the laboratory. In Figure 8 one can see photographs of emulsion films realized and tested in Bern by employing custom-made emulsion gel from Nagoya and from the Russian company Slavich. It is clear that one can achieve performance appreciably better than that obtained for the large-scale production of the OPERA experiment. These results open the way to several other developments and to the precision tuning of the emulsion properties, tailored to the specific application.

4. Modern Emulsion Scanning Systems

In 1974, great progress in the emulsion scanning technology took place with the introduction of the tomographic readout of emulsion layers by the Nagoya group [36], still today a leader in the exploitation and development of emulsion detectors and scanning devices. If the thickness of the active layer is appreciably larger than the focal depth of the microscope optics (by 10–20 times), one can acquire multiple (tomographic) pictures of the layer by moving the objective step-by-step along the direction orthogonal to the emulsion surface. By suitably combining the images, one can identify and reconstruct the tracks of particles traversing the layer (Figure 3). A first microscope system based on this approach was successfully employed for the readout of the CS of the E653 experiment at Fermilab. In that case, 16 tomographic images of the emulsion layer were taken with a TV tube used as an image grabber. The technique was further developed by replacing the video tube with a CCD camera and introducing an FPGA-based processor. This led to the so-called Track Selector system, continuously operated and updated by the Nagoya group and used for a series of experiments [37].
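
Conceptually, microtrack finding in the tomographic image stack amounts to searching for grain clusters aligned along a straight line across the focal planes. The following toy implementation (brute-force, with illustrative step and tolerance values) conveys the principle only; the real systems implement highly optimized hardware and software pipelines.

```python
import itertools, math

# Toy version of tomographic microtrack finding: grains are clustered per
# focal plane; a microtrack is a straight sequence of grains across planes.
# Brute force and illustrative tolerances only.

def find_microtracks(grains_by_plane, z_step_um=3.0, residual_tol_um=0.5,
                     min_planes=4):
    """grains_by_plane: list (one entry per focal plane) of (x, y) grain
    positions in microns. Returns candidate microtracks as lists of points."""
    tracks = []
    n = len(grains_by_plane)
    dz = (n - 1) * z_step_um
    for g0, gN in itertools.product(grains_by_plane[0], grains_by_plane[-1]):
        tx, ty = (gN[0] - g0[0]) / dz, (gN[1] - g0[1]) / dz
        picked = []
        for k, plane in enumerate(grains_by_plane):
            xe, ye = g0[0] + tx * k * z_step_um, g0[1] + ty * k * z_step_um
            near = [g for g in plane
                    if math.hypot(g[0] - xe, g[1] - ye) < residual_tol_um]
            if near:
                picked.append(min(near,
                                  key=lambda g: math.hypot(g[0] - xe, g[1] - ye)))
        if len(picked) >= min_planes:
            tracks.append(picked)
    return tracks
```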

With the advent of semiautomatic (operator-assisted) and fully automatic scanning, the emulsion detection technology got a further boost, given the largely increased number of events (or, equivalently, of emulsion surface) that could be analyzed in a given time interval. The scanning speed was limited by the time needed for the computer-controlled movement of the objective lenses to reach the different focal positions, and by the waiting time required to damp mechanical vibrations, which become relevant when dealing with micrometric accuracies. The generations of systems that followed the original Track Selector improved substantially with time [38]. The system was complemented by a fully automatic offline data analysis, applied after the digitization of the individual tracks around a given angle. This feature was made possible by the availability of fast electronics and CCDs and of increasingly performing stage mechanics. The so-called net-scan method [39], also developed in Nagoya, eventually allowed for the reconstruction of tracks by associating all detected track segments independently of their angle, enabling complete event reconstruction [40], momentum determination by Multiple Coulomb Scattering [41, 42], and electron identification by shower analysis [43, 44].

The presently running version of the Track Selector, the S-UTS [45], based on the use of highly customized components, features impressive performance and bears testimony to the outstanding work conducted in Nagoya over the course of about twenty years (Figure 9). One of the main design features of the S-UTS is the removal of the stop-and-go process of the stage during image data taking, which, as said above, was the mechanical bottleneck of the originally developed systems. The objective lens moves at the same (constant) speed as the stage while traveling along the vertical axis and grabbing images with a very fast CCD camera. A piezoelectric device drives the optical system. A frontend image processor performs zero-suppression and pixel packing. A dedicated processing board performs track recognition, builds microtracks, and stores them in a temporary device. The routine scanning speed of the S-UTS is more than 20 cm²/hour/layer, while one of the S-UTS systems has reached a speed of 72 cm²/hour/emulsion layer by using a larger field of view, without deteriorating the intrinsic micrometric accuracy of the emulsion films.

Another scanning system was originally developed by the Salerno group [47]. This seminal work eventually led to the currently used European Scanning System (ESS) [48–50], intended for the OPERA experiment. The ESS also employs commercial subsystems in a software-based framework. The microscope is a Cartesian robot, holding the emulsion film on a horizontal stage, with a CMOS camera mounted on the vertical optical axis, along which it can be moved to vary the focal plane in steps equal to the focal depth of about 3 μm. The control workstation hosts a motion control unit that directs the stage to span the area to be scanned and drives the camera along the vertical axis to produce optical tomographic image sequences. With a stop-and-go algorithm, the stage is moved to the desired position and the images are grabbed after it stops, by a Megapixel camera running at 376 frames per second while moving in the vertical direction. The whole system can work at a speed comparable to that of the standard S-UTS. Microscopes based on the S-UTS and on the ESS are shown in Figure 10.

Regarding future developments, one can mention two main lines of research. The first one is the further upgrade of Nagoya’s S-UTS system; R&D studies are in progress with the ambitious goal of reaching a scanning speed of more than a few hundred square centimeters per hour [3]. The idea is based on the use of commercial steppers used for lithography and of Megapixel cameras with high frame rates (>60 fps) and high pixel readout rates (>370 Mpixels/s). The steppers can provide a space resolution better than 350 nm and an exposure field larger than 20 × 20 mm².

Another, very recent approach is being considered both in Japan and in Europe, featuring the use of Graphics Processing Units (GPUs). This approach constitutes a sort of revolution in the reconstruction of emulsion data. State-of-the-art GPUs are envisioned, such as the NVIDIA GeForce TITAN with a computing power of about 4.6 Teraflops and 2688 processing cores in one GPU, while even a modern 6-core CPU (of the type i7-3960X) corresponds to only 0.16 Teraflops. A further advantage of this approach is that the achievable performance scales with the number of GPUs. This idea was proposed by Ariga [51] in Bern, where this development is presently being actively pursued. By now, several other groups are investigating the possibilities offered by the use of GPUs for a further step in the development of automatic emulsion scanning. One can mention in particular the work done by the Toho [52] and Nagoya groups; the latter, by using dedicated optics and 72 GPUs, has shown the possibility of reaching a scanning speed equivalent to 10000 cm² of emulsion surface per hour.

5. CHORUS and DONUT

Neutrino mixing means that the neutrino flavor eigenstates involved in weak interaction processes (νe, νμ, and ντ) can be expressed as a superposition of mass eigenstates (ν1, ν2, and ν3, with mass eigenvalues m1, m2, and m3) through a unitary mixing matrix called the Pontecorvo-Maki-Nakagawa-Sakata (PMNS) matrix [53–55]. The occurrence of mixing generates oscillations of neutrinos propagating in space and time, namely the periodic variation of their flavor composition, provided that neutrinos have nondegenerate mass eigenvalues. The parameters characterizing oscillations are three mixing angles, similarly to what happens for quark mass and weak eigenstates through the CKM matrix, and two squared-mass differences Δm²₂₁ and Δm²₃₂, in addition to a possible CP violating phase in the matrix [27]. The oscillatory flavor conversion probability also depends on the ratio L/E between the distance traveled by the neutrinos before detection and their energy. This ratio can in principle be chosen when designing the experiment in order to place the detector(s) in the condition of actually measuring the appearance of a different flavor absent in the beam or the disappearance of neutrinos of the initial flavor.
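
In the simplest two-flavor approximation, which suffices for the discussion that follows, the νμ → ντ appearance probability takes the standard compact form (in practical units)

P(νμ → ντ) = sin²(2θ) · sin²(1.27 Δm² [eV²] L [km] / E [GeV]),

making the role of L/E explicit: an experiment is sensitive only where the argument 1.27 Δm² L/E is of order unity.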

Following theoretical arguments developed around the beginning of the 1990’s favoring the ντ as a constituent of the Dark Matter of the Universe (provided it had a mass of 1–10 eV), the neutrino community started arguing that searching for neutrino oscillations (from muon- to tau-neutrinos) could be a powerful method to identify such a massive neutrino. It should be noted that at that time no direct evidence existed yet for the ντ. The already mentioned CHORUS experiment [28] was designed with this twofold goal: to search for neutrino oscillations in the parameter region corresponding to a 1–10 eV mass and to provide the first evidence for the detection of this elusive lepton. This could be accomplished by a so-called short baseline experiment, with the detector placed at about 1 km distance from the source of high-energy neutrinos, that is, with the right distance and energy to develop an oscillation signal for a ντ in the 1–10 eV mass range.
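
Plugging short-baseline numbers into the two-flavor formula given above illustrates why such a configuration probes the high-Δm² region; the baseline, mean beam energy, and full mixing assumed below are purely illustrative, not the actual CHORUS parameters.

```python
import math

def p_mu_to_tau(delta_m2_eV2, L_km, E_GeV, sin2_2theta=1.0):
    """Two-flavor nu_mu -> nu_tau appearance probability."""
    return sin2_2theta * math.sin(1.27 * delta_m2_eV2 * L_km / E_GeV) ** 2

# Illustrative short-baseline configuration: L ~ 0.8 km, <E> assumed ~27 GeV.
for dm2 in (1.0, 10.0, 100.0):
    print(f"dm2 = {dm2:6.1f} eV^2 -> P = {p_mu_to_tau(dm2, 0.8, 27.0):.2e}")
```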

Nuclear emulsions played a key role in the design of the experiment. The goal was the detection of the very short track of the τ lepton coming from the charged current interaction of a ντ, in turn produced by the oscillation of one of the νμ constituting the neutrino beam. For this, one could profit from the use of the dense, high space-resolution emulsions to realize a hybrid detector well suited to a high-sensitivity study of the decay topologies of the τ (Figure 11). Emulsions could also allow the full reconstruction of the event kinematics, in turn required for background suppression. This approach was supported and justified by the advances in the emulsion technique, mainly in relation to the handling of large quantities of emulsions, and also by the above-mentioned progress in emulsion scanning and offline analysis, such as to reduce the analysis time of the emulsions by orders of magnitude as compared to the early times.

The hybrid CHORUS apparatus combined the large nuclear emulsion target previously described with various electronic detectors. Since charmed particles and the τ lepton have similar lifetimes, the detector was also well suited for the observation of the production and decay of charm, a source of background for the oscillation search but also an interesting physics subject per se. The very large emulsion stacks were followed by a set of scintillating fiber planes. Three CS detectors with a 90 μm emulsion layer on both sides of an 800 μm thick plastic base were used as interface between the fiber trackers and the “bulk” emulsion. The accuracy of the tracker prediction in the CS was about 150 μm in position and 2 mrad in track angle.

The electronic detectors downstream of the emulsion target and its associated trackers included a hadron spectrometer measuring the bending of charged particles with an air-core magnet, a high-resolution calorimeter where the energy and direction of showers were measured and a muon spectrometer. The CHORUS apparatus is schematically depicted in Figure 12 and a photograph taken during its installation is shown in Figure 13.

The operation of the experiment consisted of several steps. It is worth noting that the large-size emulsion target was replaced only once during the entire duration of the experiment, while the CS were periodically exchanged with new detectors, therefore integrating background tracks for a relatively short period. A much better time resolution was obviously provided by the electronic detectors. With the CS scanning, the association between electronic detectors and emulsions takes place: tracks with position and angle compatible with the electronic trackers’ predictions are searched for in the interface emulsions. If found, these tracks are further extrapolated into the bulk emulsion, with much better spatial resolution, up to the track stopping point, with the so-called scan-back procedure, consisting in connecting emulsion plates progressively more upstream. After that, a “volume scan” (net-scan) around the presumed vertex is performed and repeated for all stopping tracks until the neutrino interaction vertex is found. In the search for charmed particle decays, a dedicated topological selection was applied to the collected net-scan data. The analysis procedure was complemented by the visual inspection of the selected event candidates, aimed at checking both primary and secondary vertices by making use of the “stack” configuration. Decay topologies could be well separated from ordinary nuclear interactions, since the latter usually exhibit fragments from nuclear break-up or so-called “blobs” from nuclear recoil.
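
The purely topological core of such a decay search can be sketched in a few lines: a candidate decay is a track whose direction changes by more than some minimal kink angle between the parent and daughter segments. The threshold below is an illustrative placeholder; in practice it is tuned against the scattering background and complemented by kinematical cuts.

```python
import math

# Sketch of a kink-based decay search: compare the direction of a track
# segment before and after a candidate decay point. The angular threshold
# is an illustrative placeholder.

def kink_angle(parent, daughter):
    """Space angle (rad) between two segments given as (tx, ty) slopes,
    in the small-angle approximation."""
    return math.hypot(daughter[0] - parent[0], daughter[1] - parent[1])

def is_decay_candidate(parent, daughter, min_kink_rad=0.020):
    # Large-angle hadron or muon scattering can mimic a kink, but with much
    # lower probability; kinematical cuts (e.g., on the transverse momentum
    # at the kink) complement this purely topological selection.
    return kink_angle(parent, daughter) > min_kink_rad

print(is_decay_candidate(parent=(0.010, -0.002), daughter=(0.045, 0.015)))
```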

More than 100000 neutrino interactions were identified (located) and measured in the emulsions of CHORUS. Unfortunately, the search for oscillations was negative and only an upper limit to the oscillation probability was eventually set [56]. The negative result is illustrated in Figure 14, which shows the so-called exclusion plot indicating the region of the oscillation parameter space excluded by the experiment at the 90% confidence level. The vertex of a muon neutrino interaction in one CHORUS emulsion stack is shown in Figure 15. Figure 16 illustrates an event with a topology similar to that of a τ lepton, due to the production and decay of a charmed particle. The study of neutrino-induced charm events has been one of the main side results of the experiment. A large sample of ~2000 fully reconstructed neutrino-induced charmed hadron event vertices was gathered, with the outstanding reconstruction features provided by the emulsion detectors. With such events, CHORUS has been able to measure the Λc+ and D0 exclusive production cross-sections [57], the double-charm production cross-section in neutral and charged current interactions [58], and the associated charm production [59].

As said above, CHORUS represented a milestone in the history of nuclear emulsions for the size of the target and of the CS detectors, for the complexity of the related technical facilities, as well as for the great challenge represented by the unprecedented event statistics. All major international emulsion groups from Japan, Italy, Korea, and Turkey participated in the experiment. The convergence of virtually all the available worldwide expertise (apart from several emulsion groups from Russia) substantially contributed to the success of the project and to the already mentioned rebirth of the technique. This is also confirmed by the fact that other groups and individuals formerly not working on emulsion detectors soon got acquainted with the technique, providing valuable technical and scientific contributions.

A high-sensitivity follow-up of the CHORUS experiment was then proposed at CERN with the goal of increasing by more than one order of magnitude the sensitivity in the measurement of the oscillation mixing angle, still following the scenario in which the tau-neutrino could be a Dark Matter candidate. The conceptual idea was first discussed in [60]. In that case, emulsions were proposed as large-surface trackers for the high-resolution measurement of hadron and muon momenta. This idea was then incorporated in the proposal of the TOSCA experiment [61]. In the end the project was not realized, mainly because of the growing evidence for neutrino oscillations with atmospheric neutrinos, first provided by the Kamiokande [62] and then by the Super-Kamiokande experiments [63], in a complementary region of the oscillation parameters characterized by a large mixing angle and a very small value of Δm², the parameter related to the mass eigenvalues of the oscillating neutrinos.

In parallel, it was considered important to directly observe the ντ, whose existence, although indirectly established, was still not firmly proven by an appearance experiment. The first direct detection of the ντ was the scientific goal of the Fermilab DONUT experiment [64, 65]. Once more, nuclear emulsions were considered for the identification of tau-leptons, in turn produced in the charged current interactions of tau-neutrinos. The latter were promptly created in the decay of Ds mesons produced in an 800 GeV proton beam dump. Also in this case one has to notice the key contribution of the Nagoya group, joined by other Japanese emulsion groups.

An iron/emulsion ECC target was adopted in DONUT to provide a large mass for neutrino interactions and to allow the identification of the interaction vertex through the detection of the millimetric track of the tau prior to its decay. The ECC target was complemented by fiber trackers (analogously to CHORUS) to help in the track extrapolation into the emulsions. The DONUT experiment identified 578 neutrino interactions, with 9 signal candidate events for an estimated background of 1.5 events. This constituted the discovery of the tau-neutrino in the year 2000. One of the signal candidate events is shown in Figure 17. The short track of the tau (about 300 μm long) clearly exhibits the characteristic kink, the unambiguous signature of a decay topology over the much less probable large-angle hadron or muon scattering.

6. OPERA: The Largest Nuclear Emulsion Detector Ever

The solution of the long-standing solar and atmospheric neutrino anomalies came from a series of crucial experiments conducted in the last two decades that led to the discovery of neutrino oscillations (see, e.g., [20]). The occurrence of oscillations and the consequent existence of a finite (although very small) mass for the neutrinos are in contrast with what is assumed in the Standard Model of particles and interactions; oscillations are so far the only evidence for new physics beyond the Standard Model.

Around the end of the 1990’s the scientific debate around neutrino oscillations became very intense. The indications from experiments involving atmospheric and solar neutrinos were more and more convincing, until the unambiguous discovery by the already mentioned Super-Kamiokande experiment [63] pointing to the occurrence of νμ → ντ oscillations. At that time, there was a general consensus on the need for a confirmation of the oscillation signal obtained with atmospheric neutrinos by experiments exploiting man-made neutrino beams, but sensitive to the same parameter region. As mentioned above, this could be achieved through a suitable choice of the L/E parameter. The results from Super-Kamiokande and the other experiments with atmospheric neutrinos indicated a value of Δm² around 10⁻³ eV², much smaller than the ~10 eV² region explored by CHORUS.

The conceptual idea of the OPERA experiment [31, 32], contrary to other approaches looking for the disappearance of an initial neutrino flavor flux coming from an accelerator or from a nuclear reactor, was based on the possibility of detecting the direct appearance of oscillations in the νμ → ντ channel, as indicated by the Super-Kamiokande atmospheric neutrino signal. Similarly to CHORUS, looking for ντ appearance translated into the need of identifying the short-lived tau-lepton with a “vertex detector” of several thousand tons mass, because of the large distance required from source to detector to develop an oscillation signal, given the small value of Δm². The only realistic possibility was the adoption of nuclear emulsions (high space resolution) used in the ECC configuration (high target mass), since a fully sensitive emulsion target à la CHORUS would have been unrealistically expensive and even impossible to produce. The detection principle of the experiment is described in Figure 18.

The idea of OPERA was to exploit a long baseline accelerator neutrino beam with L of the order of 1000 km, combined with a relatively high neutrino energy (⟨E⟩ ≈ 17 GeV), so as to obtain the “correct” L/E ratio and to be above the kinematical threshold (~3.5 GeV) for producing the relatively heavy τ lepton. This concept materialized into the design of an experiment to be hosted at LNGS, the largest underground physics laboratory in the world, about 730 km away from the neutrino beam source at CERN. The dedicated CNGS neutrino beam was produced by the interaction of primary protons from the CERN SPS onto a carbon target, (mostly) producing pions and kaons. The decay of these mesons into neutrinos generates the long baseline neutrino beam directed to the detector at LNGS.
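
Both design figures can be checked with textbook formulas: the τ production threshold follows from two-body kinematics on a nucleon at rest, and the expected oscillation probability at the CNGS baseline from the two-flavor expression given earlier (atmospheric-like oscillation parameters are assumed below for illustration).

```python
import math

M_TAU, M_P, M_N = 1.777, 0.938, 0.940    # masses in GeV (rounded)

# Kinematical threshold for nu_tau + n -> tau + p on a nucleon at rest
E_th = ((M_TAU + M_P) ** 2 - M_N ** 2) / (2 * M_N)
print(f"tau production threshold: {E_th:.2f} GeV")       # ~3.5 GeV

# Two-flavor oscillation probability at the CNGS baseline; the oscillation
# parameters below are illustrative, atmospheric-like assumptions.
L_km, E_GeV = 730.0, 17.0
dm2_eV2, sin2_2theta = 2.4e-3, 1.0
P = sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2
print(f"P(numu -> nutau) at CNGS: {P:.4f}")              # at the percent level
```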

As in CHORUS, the τ has to be identified by the detection of its characteristic decay topologies, in one prong (electron, muon or hadron) or in three prongs. This is done with a large number of ECC units made of 1 mm thick lead plates interspaced with thin emulsion films, acting as high-accuracy tracking devices, contrary to the role of active target that emulsions had in CHORUS. The full OPERA detector is made of two identical Super Modules each consisting of a target section of about 900 tons made of ECC modules (bricks), of a scintillator tracker detector, needed to prelocalize neutrino interactions within the target, and of a muon spectrometer.

The OPERA experiment is described in detail in [35] and shown in Figure 19. Here we just point to the main features, mostly in relation to the large ECC emulsion target. At the occurrence of a neutrino interaction, the resulting charged particle tracks are detected by the scintillator counter planes placed behind each brick target wall, similarly to what happens in a sampling calorimeter. There are about 150000 ECC bricks in the detector target, each weighing about 8 kg and consisting of a sandwich of 57 emulsion films and 56 lead plates (see Section 3), arranged into planes of about 10 × 10 m² surface each. Each target plane is followed by planes of scintillator strips of identical cross-section, read out by multi-anode photomultipliers.

The event reconstruction procedure starts with the signals from the scintillators upon the interaction of a neutrino from the CNGS beam. On average, 20–30 neutrino interactions are collected per day of CNGS operation. Such interactions are normally accompanied by a “particle shower” that develops in the calorimetric structure composed of brick and scintillator planes. The reconstruction of the shower axis and/or the identification of a muon track allow (with a certain efficiency) identifying the brick where the neutrino interaction occurred. The candidate brick is then extracted from its target plane by a robot and the CS emulsions attached to it are removed and developed. The brick is not unpacked and is stored underground, waiting for a positive indication of tracks found in the CS (Figure 13). The CS is made of two emulsion films, for a total of 4 emulsion layers yielding up to 4 microtracks. A CS track is declared found if at least 3 out of the 4 microtracks are actually reconstructed. The scanning of the CS is the most time-consuming analysis procedure. It is performed in two large microscope laboratories located at LNGS and Nagoya, with microscopes dedicated solely to CS scanning. Several tens of square centimeters are normally scanned for each CS in order to look for neutrino-related track segments. If no full tracks compatible with the event reconstructed by the scintillators are found, the brick is put back in the detector with a new CS. Otherwise, the brick is exposed to cosmic-rays for a period of 12 hours, to provide the collection of a sufficient number of muon tracks for the precise alignment of the brick’s films and for the correction of emulsion deformations. After this, the brick is finally unpacked and all its films are developed in the large, 5-station facility at LNGS [35]. The brick’s emulsion films are then shipped to the various scanning laboratories of the collaboration for the vertex location analysis.
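
The 3-out-of-4 microtrack requirement trades a small efficiency loss for a strong suppression of random coincidences. Assuming, for illustration, independent and identical per-layer microtrack efficiencies ε, the CS track-finding efficiency is 4ε³(1 − ε) + ε⁴:

```python
# Efficiency of the "at least 3 microtracks out of 4 layers" requirement,
# assuming independent, identical per-layer efficiencies (an idealization).

def cs_track_efficiency(eps):
    return 4 * eps**3 * (1 - eps) + eps**4

for eps in (0.80, 0.90, 0.95):
    print(f"per-layer eff {eps:.2f} -> CS track eff {cs_track_efficiency(eps):.3f}")
```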

Starting from the position and angle information of the CS track(s), the so-called scan-back procedure is initiated, looking for the CS track extrapolations in the brick’s films, starting from the most downstream ones. The disappearance of a scan-back track might indicate the presence of an interaction vertex (Figure 2). In this case, a general scan is performed around the disappeared track position over an equivalent brick volume of about 1 cm³. This procedure, the so-called decay search, namely the identification of primary neutrino vertices and of secondary vertices due to particle decays (e.g., that of a tau into one or more prongs), is conducted automatically and routinely by the nearly 40 microscopes of the OPERA brick scanning laboratories. More details on the event reconstruction in OPERA are given, for example, in [68].

Once one or more vertices are found, one can exploit all the rich information stored in the emulsions for a Multiple Coulomb Scattering (MCS) analysis (for track momentum determination) or for the search for electron- or gamma-induced downstream showers. Details on the track momentum determination by MCS can be found in [69]. The effectiveness of these methods is illustrated in Figure 20, which shows the display of the first tau candidate event found by OPERA. Several tracks are reconstructed originating from the primary vertex. One of them exhibits the peculiar kink topology, indication of the production and decay of a tau-lepton [70, 71]. This interpretation is supported by the detailed kinematical analysis of the event, assumed as the interaction of a tau-neutrino coming from oscillation, interacting in the lead plate of the selected brick and producing a tau, in turn decaying into ρ⁻ντ, finally followed by the decay ρ⁻ → π⁻π⁰.
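
The principle of the MCS momentum measurement can be sketched by inverting the Highland parameterization of the scattering angle, θ₀ ≈ (13.6 MeV/pβ)·√(x/X₀), for pβ. The snippet below is a deliberately simplified single-cell version, ignoring measurement errors and the logarithmic correction, with an OPERA-like 1 mm lead cell assumed:

```python
import math

# Simplified momentum estimate from Multiple Coulomb Scattering, inverting
# the Highland formula for p*beta. Single lead cell, no measurement error,
# logarithmic correction ignored: a sketch of the principle only.

X0_LEAD_MM = 5.6    # radiation length of lead
CELL_MM = 1.0       # OPERA-like lead plate thickness (assumed here)

def pbeta_from_scattering(theta_rms_rad, n_cells=1):
    """p*beta in GeV from the RMS scattering angle over n lead cells."""
    x_over_X0 = n_cells * CELL_MM / X0_LEAD_MM
    return 0.0136 * math.sqrt(x_over_X0) / theta_rms_rad

# Example: a 2 mrad RMS scattering angle per 1 mm lead cell
print(f"p*beta ~ {pbeta_from_scattering(2e-3):.2f} GeV")   # ~2.9 GeV
```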

The OPERA experiment successfully completed in 2012 its data taking in the CNGS beam, started in 2008. It exploited a total neutrino flux produced by about 1.8 × 10²⁰ protons from the SPS hitting the neutrino production target. The analysis of the events still “stored” in the bricks of the detector is in progress. At the moment of writing, several thousand neutrino interaction events have been located in the emulsion bricks. So far, 3 ντ candidate events have been identified, with an expected signal of 2.2 events for an estimated background of 0.23 events. This corresponds to a ~3.2σ significance for a non-null observation of νμ → ντ oscillations [72]. The topology of the second candidate event is compatible with that of a tau-lepton decaying into three hadrons, while the third event, announced very recently, is a clean τ muonic decay characterized by a very low background. Scanning of the remaining events will proceed until 2015 with the objective of a stronger statistical confidence in the observation of νμ → ντ oscillations. A few more signal events are expected by the completion of the analysis.

Furthermore, given the capability of the experiment in identifying prompt electrons through their characteristic electromagnetic shower in the dense calorimeter constituted by the ECC sandwich, OPERA has recently provided limits on νμ → νe oscillations. Overall, 19 νe interactions were observed in the event sample corresponding to the 2008-2009 statistics, out of which 6 satisfy the selection criteria for oscillation-induced νe; 1.3 signal events were expected, with 9.4 events of background, mainly due to the ~1% νe contamination of the beam [73]. This result is very interesting because it provides so far the best upper limit for the exclusion of the low Δm² parameter region of the so-called LSND/MiniBooNE signal, possibly due to the existence of a fourth, “sterile” neutrino flavor. The event display of an OPERA νe-induced event is shown in Figure 21.

7. The Future: The Emulsion Technique Beyond Neutrino Oscillation Experiments

The vigorous R&D conducted for the OPERA experiment and the consequent dissemination of the modern emulsion technique have led some of the groups participating in the experiment to continue this activity in view of future applications. This strategy is also pursued by other groups in Europe and Japan not previously involved in OPERA. Some of the applications concern fundamental physics, while others refer to technological or even industrial applications. In [3] and references therein, a few of the notable activities are presented: experiments conducted with balloons, airplanes, satellites, and space stations for cosmic-ray physics, and applications related to radiation dosimetry, neutron flux measurement and monitoring, imaging for medical applications, and radiation biophysics. As far as more recent applications are concerned, we can mention, for example, the GRAINE balloon experiment for the study of cosmic gamma-rays [74], a study on hadron fragmentation for medical carbon therapy [75], and the development of a neutron camera for use in plasma physics experiments [76].

On the other hand, the use of emulsion detectors and modern (fast) scanning devices has also caused a technological boost in the active field of muon radiography. The latter was originally proposed long ago for the measurement of the thickness of mountains [77] and for the search for unknown burial cavities in pyramids [78]. The technique is based on the fact that the interaction of primary cosmic-rays with the atmosphere provides a stable source of muons that can be employed for various applications of muon radiography and, in particular, for the study of the internal structure of volcanoes. This application, pioneered in Japan [79–82], has recently led to very interesting results and opened the way to unexpected applications. Related applications of muon radiography are in fact found in the investigation of geological structures such as glacier bedrocks and in the search for ice concentrations in Alpine mountains, as well as in the inspection of the interior of building structures and blast furnaces. In these cases, emulsion films represent a very appropriate detector option for two main reasons: the unbeatable position and angular resolutions in the measurement of the muon track (less than 1 μm and a few mrad, respectively) and the passive nature of the device, which requires no electric power, electronics, radio transmission of data, and so forth. However, a rather long exposure time might be required (up to several months) for any realistic detector surface, and quasi real-time analysis of the emulsion data poses challenging requirements due to the time needed to analyze emulsions with optical microscopes. The technique will definitely profit from the advances in the realization of suitable emulsion films and (mainly) from the availability of more and more performing scanning microscopes. More information can be found in [83].

A notable recent application of the emulsion technique is in the framework of proton radiography for medical diagnostics. A new method based on emulsion film detectors was proposed and tested in Bern and at PSI [84]. This is a technique in which images are obtained by measuring the position and the residual range of protons passing through the patient’s body. For this purpose, nuclear emulsion films interleaved with tissue-equivalent absorbers can be used to reconstruct proton tracks with very high accuracy. Proton radiography can be applied in proton therapy to obtain direct information on the average tissue density for treatment planning optimization and to perform imaging with a very low dose to the patient.

The detector used in [84] was coupled to the two phantoms shown in Figure 22. The detection apparatus is divided into two parts. The first one is made of 30 OPERA-like emulsion films, each interleaved with 3 polystyrene absorber plates, allowing the tracking of passing-through protons with a minimal number of films. The second part is composed of 40 films, each interleaved with one polystyrene plate, allowing a precise measurement of the proton range around the Bragg peak. The overall dimensions of the detector are 10.2 × 12.5 cm² in the transverse direction and 90.3 mm along the beam. The results of an exposure of the detector to 138 MeV protons, placed behind the “rod” phantom (Figure 22), are shown in Figure 23. Also for this application, the technique will become more and more appealing with further progress in the high-speed analysis of emulsion films.

Another notable field of application of emulsion detectors was originally proposed by the Nagoya group: the detection of the Dark Matter of the Universe [85]. Weakly Interacting Massive Particles (WIMPs), possible Dark Matter candidates, could in fact be detected by measuring the slow nuclear recoil after their interaction with massive, high-density emulsion targets. In the case of a WIMP with a mass of about 100 GeV hitting an Ag nucleus in the emulsion gel, one would obtain a recoil energy of ~100 keV, with a corresponding range of only 100 nm. In order to detect such a short, heavily ionizing track, the Nagoya group is presently studying the technical feasibility of the so-called Nano Imaging Tracker (NIT) [86–88]. The first requirement is to go from the 200 nm AgBr crystal size of the present OPERA emulsion gel to a size of about 40 nm. This would correspond to a density of more than 10 AgBr crystals per micron. In addition, in order to have a sufficiently long track, the idea is to apply the so-called emulsion swelling prior to development, to expand the 100 nm long recoil tracks up to 1000 nm or more. First tests are encouraging. Internal radioactivity and fog grains producing random track coincidences constitute the main background.
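
The kinematics behind these numbers follows from elastic two-body scattering: for a WIMP of mass M and galactic velocity v ≈ 10⁻³ c hitting a nucleus of mass m, the maximum momentum transfer is q_max = 2μv (μ being the reduced mass) and the maximum recoil energy is q_max²/2m. A sketch with illustrative inputs:

```python
# Elastic WIMP-nucleus recoil kinematics (natural units, c = 1).
# Inputs are illustrative: a 100 GeV WIMP on a silver nucleus at the
# typical galactic velocity of ~1e-3 c.

M_WIMP_GEV = 100.0
M_AG_GEV = 100.0      # silver nucleus mass, ~108 x 0.931 GeV, rounded
V_OVER_C = 1e-3

mu = M_WIMP_GEV * M_AG_GEV / (M_WIMP_GEV + M_AG_GEV)  # reduced mass
q_max = 2 * mu * V_OVER_C                             # max momentum transfer
E_recoil = q_max ** 2 / (2 * M_AG_GEV)                # max recoil energy

print(f"q_max    ~ {q_max * 1e3:.0f} MeV/c")   # ~100 MeV/c
print(f"E_recoil ~ {E_recoil * 1e6:.0f} keV")  # ~50 keV at the mean velocity
```

The high-velocity tail of the WIMP velocity distribution pushes the recoil energy toward the ~100 keV scale quoted above.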

Still on fundamental science applications, emulsion films can be used as high-precision tracker devices to determine the antihydrogen annihilation vertices with micrometric space resolution, or even as targets by directly observing antimatter annihilation vertices. This is a recent application considered by the Bern group [89] for the AEgIS experiment at CERN [90]. In AEgIS the free-fall of antihydrogen atoms launched horizontally will be measured to directly determine for the first time the gravitational acceleration of neutral antimatter by matter. The vertical precision on the measured annihilation point will be around 1 μm R.M.S. This will be achieved by adding an emulsion detector to the originally foreseen μ-strip detector. For the first time, nuclear emulsion films will be used in vacuum at relatively low temperature. An intense R&D program on emulsion films has been started to cope with their operation under such conditions. First results on the behavior of emulsions in vacuum have already been published [89]. Results were also achieved with antiprotons annihilating in emulsions in vacuum and at room temperature [91].

For AEgIS, new emulsion gels with higher sensitivity are being considered to increase the detection efficiency, together with the use of glass instead of plastic as the base material. Glass, in fact, is better suited to achieve the highest position resolutions thanks to its superior environmental stability (against temperature and humidity) as compared to plastic bases. Emulsions of the new type were manufactured in Bern with gel provided by Nagoya University. Antiproton interactions reconstructed in the conventional (OPERA-like) and in the new gel are shown in Figure 24. One can clearly see the better features of the new gel in terms of sensitivity to nuclear fragments and of noise. The new gel composition can be tuned, for example, to provide higher detection efficiency for fragments from antimatter annihilation rather than for MIPs, contrary to what was done, for instance, in OPERA.