Special Issue: Materials, Devices, Fabrication, Characterization, and Applications for OLED Illumination and Display
Research Article | Open Access
Constanze Großmann, Ute Gawronski, Martin Breibarth, Gunther Notni, Andreas Tünnermann, "Simulation and System Design of a 3D Metrology Optical System Based on a Bidirectional OLED Microdisplay", Advances in Materials Science and Engineering, vol. 2012, Article ID 417376, 9 pages, 2012. https://doi.org/10.1155/2012/417376
Simulation and System Design of a 3D Metrology Optical System Based on a Bidirectional OLED Microdisplay
Innovative display technologies enable a wide range of system applications; in metrology, medical, and automotive applications in particular, microdisplays are increasingly used. In recent decades, OLED microdisplays have been a focus of display development. A new class of OLED microdisplays with an integrated photodiode array is the latest development. These so-called bidirectional OLED microdisplays combine light-emitting devices (an AM-OLED microdisplay) and photosensitive detectors (a photodiode matrix) on one single chip based on OLED-on-CMOS technology. Currently this kind of display is still a prototype. Based on such a novel bidirectional OLED microdisplay, we present for the first time a system simulation and design of a 3D optical surface metrology system. The first step is the full characterization of the microdisplay. Based on the characterization results and the application requirements, the system design parameters are defined. The functionality of the system is simulated, and a theoretical proof of concept is presented. An example application to a 3D optical surface metrology system is evaluated.
Expanding requirements in manufacturing technology increase the demands on noncontact metrology systems. Typical optical metrology systems are based on a separate light-emitting unit (e.g., a projection unit) and a detection unit (e.g., a camera unit) [1, 2]. This limits the miniaturization of the sensor system. Furthermore, the use of two optoelectronic devices complicates integration and alignment, which leads to higher production costs.
Typically, a projection unit comprises a light source with collecting optics illuminating a light modulator (such as a DMD (digital micromirror device) or LCoS (liquid crystal on silicon) display) and a projection lens imaging the generated pattern into the measurement plane. A first step towards the miniaturization of such a unit is the application of a self-emitting microdisplay (such as an active-matrix OLED microdisplay) for pattern generation. OLED microdisplays are state-of-the-art microdisplays used in a wide range of applications (e.g., multimedia, medical, and metrology applications [2–7]). Such devices combine light source and light modulator in one element, minimizing the number of components in a system. In contrast to conventional projection systems, OLED microdisplays allow simple and compact system integration.
A further miniaturization can be realized by applying a microdisplay that combines an OLED microdisplay and a sensor unit (i.e., a photodiode matrix) in one single element. Such a bidirectional OLED microdisplay (BiMiD) was realized using OLED-on-CMOS technology by Fraunhofer IPMS [8–10]. This means that the light source/imaging device is placed in the same plane as the detector. The bidirectional display used in this investigation consists of an AM-OLED microdisplay with an integrated photodiode matrix; that is, each display pixel contains an emitting OLED pixel and a photodiode. Both functions work simultaneously and in the same wavelength range. First applications based on a similarly working BiMiD are flow, color, and reflex sensors, which are presented in Reckziegel et al. [8]. Another application based on a BiMiD that works in different wavelength ranges for OLED imaging and photodiode detection is an HMD with a distance sensor, mentioned in Richter et al. [9, 11].
In this paper we present a 3D optical surface metrology system based on phase-shifting fringe projection [1, 12–17] with a BiMiD. Fringes are projected onto the surface of the measuring object. Because they are observed from a different angle (the triangulation angle), the fringes appear deformed according to the surface shape of the measuring object. This deformation of the fringes allows the 3D coordinates of all visible points to be calculated. Up to now, such systems have been divided into a projection unit and an imaging unit, increasing the size of the system.
We will demonstrate that the application of a bi-directional OLED microdisplay enables the realization of a highly integrated compact surface metrology sensor.
2. Fringe Projection Principle
A simple fringe projection 3D measurement system consists of an image acquisition sensor and a digital pattern (e.g., fringe) projector (see Figure 1(a)). The patterns/fringes are generated by a digital projection unit based on LCD, LCoS, DMD, or OLED microdisplay technology. The image acquisition sensor can be a conventional CCD. The 3D metrology system is based on fringe projection onto the surface of the measurement object. The fringes appear deformed when observed from a different angle (the triangulation angle), which is the angle between the optical axes of the projection lens and the imaging lens. From the deformation of the fringes, the 3D coordinates of all visible points can be calculated and thus the object shape can be determined.
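As a rough illustration of this triangulation relation, a surface height change shifts the observed fringe laterally, which the camera sees as a phase change. The following sketch uses a simplified parallel-projection model; the function name and the geometry are our own illustrative assumptions, not the calibrated reconstruction used in the paper.

```python
import math

def height_from_phase(delta_phi, fringe_period_mm, triangulation_angle_deg):
    """Simplified height recovery for fringe projection.

    In a parallel-projection model, a height change dz shifts the fringe
    laterally by dz * tan(theta); expressed as a phase change this gives
        dz = delta_phi * p / (2 * pi * tan(theta)).
    Illustrative assumption only, not a calibrated triangulation.
    """
    theta = math.radians(triangulation_angle_deg)
    return delta_phi * fringe_period_mm / (2 * math.pi * math.tan(theta))

# Example: a phase change of pi/2 with a 0.05 mm fringe period in the
# measurement plane, at an 18 degree triangulation angle:
dz_mm = height_from_phase(math.pi / 2, 0.05, 18.0)
```

The formula makes the stated proportionality explicit: the height resolution scales directly with the phase resolution and inversely with the tangent of the triangulation angle.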
(a) Principle schematic of the basic fringe projection system. The projector unit is placed on the right side and the camera unit on the left side. Both units are positioned at a specified triangulation angle.
(b) Prototype of a 3D surface metrology system, based on an OLED projection unit and an imaging unit. The triangulation angle is 18°. On the right side, the measured target and the 3D shape model are shown [2, 4].
In Figure 1(b) a prototype of a 3D surface metrology system is shown. This system is based on an OLED projection unit and an imaging system; the triangulation angle is 18°. The OLED microdisplay generates the fringe patterns that are projected via the projection lens onto the measuring object. The imaging lens images the object onto the detector during the pattern sequence. Based on the series of fringe images, the 3D shape of the measuring object can be calculated. On the right side of Figure 1(b), the result of the measurement of a calibrated target is shown; the measured shape agrees well with the target shape.
A structured light approach was used that combines the projection of a sequence of phase-shifted sinusoidal fringe patterns with a sequence of Gray code patterns. Because phase-shifted sinusoidal fringe patterns produce periodic phase values, additional phase unwrapping is necessary to resolve these ambiguities; this is possible through the use of a Gray code sequence giving each sine period a unique identifier [12, 17]. To calculate 3D points on the object's surface using triangulation methods, at least two phase values for each 3D point are necessary in this setup, along with a fully calibrated sensor arrangement (e.g., orientation parameters for the measurement camera and the fringe projector). This can be achieved using two projected pattern sequences rotated 90° to each other. As a result, each pixel in the measurement camera has a pair of phase values assigned to it. This pair of phase values describes exactly one position in the projector matrix (e.g., an interpolated projector pixel). Using the orientation parameters of both sensor units, a simple triangulation can be used to calculate the 3D point on the surface. With this approach, the accuracy of the 3D coordinate measurement depends directly (proportionally) on the accuracy of the phase measurement.
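The two steps above, computing a wrapped (periodic) phase from the shifted fringe images and then resolving the period ambiguity with the Gray code identifier, can be sketched as follows. This is the standard N-step phase-shift formula, not the authors' exact implementation; the paper's processing additionally normalizes with dark/bright reference images.

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N equally phase-shifted fringe images (N-step
    algorithm): phi = atan2(sum_k I_k sin(2 pi k / N),
                            sum_k I_k cos(2 pi k / N)).
    `images` has shape (N, H, W); names are illustrative."""
    images = np.asarray(images, dtype=float)
    n = images.shape[0]
    k = np.arange(n).reshape(n, 1, 1)
    num = (images * np.sin(2 * np.pi * k / n)).sum(axis=0)
    den = (images * np.cos(2 * np.pi * k / n)).sum(axis=0)
    return np.arctan2(num, den)  # periodic in (-pi, pi], still ambiguous

def unwrap(phi_wrapped, period_number):
    """Resolve the periodic ambiguity: add 2*pi times the per-pixel period
    number decoded from the Gray code sequence."""
    return phi_wrapped + 2 * np.pi * period_number
```

Feeding synthetic images of the form I_k = A + B·cos(φ − 2πk/N) into `wrapped_phase` recovers φ exactly, which is a convenient sanity check for any implementation of this scheme.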
3. Bidirectional OLED Microdisplay
OLED microdisplays are widely used in commercial applications such as displays that are directly observed by the user (mobile phone screens, head-mounted displays) [7, 9]. This display technology benefits from small geometrical size, low weight, low power consumption, and potentially high resolution. High-brightness OLED microdisplays can also be applied as image-generating devices in picoprojectors (e.g., for mobile phones) [2, 6]. However, those systems are unidirectional.
The development of light-emitting-polymer-(LEP-)on-CMOS technology opens the possibility to combine light emission and detection on one single chip. The so-called bidirectional OLED microdisplay (BiMiD), based on OLED-on-CMOS technology, has been developed by the Fraunhofer Institute for Photonic Microsystems (IPMS, Dresden) [8–10]. Such a display offers new flexibility to optical metrology systems, because projection and imaging units can be combined in one optical path.
The CMOS technology enables a simple electronic integration of the OLED. Figure 2 shows a cross-section of the BiMiD design. The CMOS top metal simultaneously serves as the OLED bottom electrode (OLED cathode); a semitransparent thin metal layer is used as the OLED top electrode. The OLED layers are deposited directly onto the CMOS substrate. The detailed OLED layer structure is described in Reckziegel et al. [8]. The photodiodes (PDs) are embedded in the CMOS substrate, so in each BiMiD pixel one emitting OLED pixel and one photodiode are integrated. Due to the CMOS technology (embedded photodiodes), a high fill factor of 90% comprising OLED pixel and photodiode can be realized.
The emitting unit is an active-matrix (AM) OLED microdisplay. Photodiodes with a diameter of around 8 μm are integrated in each OLED pixel (34 μm²) and positioned about 7 μm below the OLED layer. Figure 3(a) shows the BiMiD displaying an image, and Figure 3(b) shows a detailed view of the pixel structure.
(a) Bi-directional OLED microdisplay showing an image.
(b) Detailed view of the pixel matrix. The green pixels represent the OLED pixels, and the circular marks highlight the active integrated photodiodes.
Due to processing reasons, the first prototype of a BiMiD for our application has a limited photodiode resolution. Even though photodiodes are placed in each OLED pixel, only one photodiode out of four is integrated in the electronic control and therefore active. Accordingly, the photodiode resolution is reduced by a factor of four compared to the OLED resolution. The resulting OLED microdisplay resolution is QVGA (240 × 320), and the photodiode resolution is QQVGA (120 × 160). With this device, either simultaneous or sequential emission (OLED projection) and detection (photodiode) can be realized. In the sequential mode of operation, OLED projection and photodiode detection are wavelength independent. In the simultaneous mode of operation, as used in our BiMiD prototype, OLEDs and photodiodes work in the same wavelength range. In this case, however, direct crosstalk effects between OLEDs and photodiodes can disturb the functionality.
We classify two different types of crosstalk: local and global. Local crosstalk occurs directly between an OLED pixel and its neighbouring photodiodes, caused, for example, by internal reflection at CMOS layers (i.e., an optical waveguide effect). In contrast, global crosstalk denotes the influence of an emitting OLED pixel on photodiodes spread over the whole display device. Global crosstalk can, for example, be caused by (possibly multiple) reflections at the display cover glass and can therefore be detected by photodiodes that are not in the direct neighbourhood of the emitting OLED pixel. Local and global crosstalk have a strong impact on the characteristics of the detected signal.
To limit local crosstalk, we can take advantage of the limited photodiode resolution. As only one photodiode in each 2 × 2 pixel block is able to detect light, the OLED pixel surrounding the active photodiode is not used for light emission. Therefore, all projected images are masked: the OLED pixels containing active photodiodes are kept inactive (i.e., black pixel projection). In this way, local crosstalk between a photodiode and its surrounding OLED can be prevented.
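This masking scheme can be sketched as follows. We assume, purely for illustration, that the active photodiode sits at a fixed position (here the top-left pixel) in each 2 × 2 OLED block; the actual layout of the prototype may differ.

```python
import numpy as np

def mask_photodiode_pixels(pattern):
    """Return a copy of `pattern` with one OLED pixel per 2x2 block set to
    black, so no light is emitted directly over the active photodiode there.
    The top-left position per block is an illustrative assumption."""
    masked = np.array(pattern, dtype=float)
    masked[0::2, 0::2] = 0.0  # assumed photodiode positions
    return masked

# A bright QVGA frame loses exactly one quarter of its pixels to the mask:
frame = np.ones((240, 320))
masked = mask_photodiode_pixels(frame)
```

The cost of the mask is a fixed 25% loss of emitting area, matching the 4:1 ratio between the OLED (QVGA) and photodiode (QQVGA) resolutions.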
The current BiMiD prototype used here emits in the orange visible range with a bandwidth of 48 nm (FWHM). The luminance of the OLED display at different voltage settings lies between 260 cd/m² and 7.8 kcd/m², which is suitable for high-brightness projection applications. The radiation angle is around ±45° for each luminance level, and the uniformity over the display is around 90%. The contrast ratio of the OLED display is around 30000 : 1 (ratio of a full-screen bright to a full-screen dark image). This impressive contrast ratio is a big advantage of OLED microdisplays over conventional microdisplays for projection purposes. The photodiodes support exposure times between 0.1434 ms and 1.174 s; the uniformity lies around 83% at the highest exposure times.
More details about the technology of the bidirectional OLED-on-CMOS microdisplay are presented in Richter et al. [9].
4. Conceptual Design
The central element of our 3D sensor is the bidirectional OLED microdisplay (BiMiD). To prove the sensor principle, the BiMiD was characterized. The measured parameters were used for a system simulation with the optical design program ZEMAX. In addition, software for the generation of a 3D model was used.
First the BiMiD was characterized (see Section 3). The OLED microdisplay, the photodiode matrix (PD), and the crosstalk between OLEDs and photodiodes were evaluated. To measure the crosstalk of the BiMiD, a paraxial lens design was used (see Figure 5). The BiMiD emits and detects light in the same plane and in the same spectral range. Figure 4(a) shows the projected test image (a white square with a diagonal of 3 mm), and Figure 4(b) shows the detected image without additional optical elements (e.g., lens and mirror). A direct crosstalk between OLED pixels and photodiodes is detectable, and the desired detection signal is lower than the crosstalk signal. Due to this direct crosstalk, the BiMiD active area was divided into two fields: an object field and a detection field. Figure 5 shows the paraxial lens design setup and the simplified laboratory setup, which implements the separation of the projection and detection fields. In the paraxial lens setup, the blue path describes the projection path and the green one the imaging path. Object and image fields are placed next to each other. Fold mirrors are integrated in the projection path; their position and the dimensions of the optical system configuration determine the triangulation angle. The imaging path is unfolded. In both optical paths two paraxial lenses are integrated (Figure 5(a)). The realized setup including off-the-shelf optics is shown in Figure 5(b).
(a) Test image of the object field (projection path, diagonal 3 mm).
(b) Detected test image via the integrated photodiode matrix (PD) without optical elements (e.g., lens, mirror).
(c) Image of the detected test image in the laboratory setup (see Figure 5(b)) (detection path).
(a) Paraxial lens design of the function test.
(b) Laboratory setup of the paraxial lens design. The red path describes the projection path and the blue one the imaging path.
Figure 4(c) shows the detected image of the object: on the left-hand side the direct crosstalk image of the projection field, and on the right-hand side the object image. Due to the internal display effect, the active OLED pixels are imaged onto the photodiodes placed next to them. Therefore, the projection (object) field and the detection field have to be separated via ray-path folding. As described before, the detection signal is not measurable in the area of the crosstalk. The distance (gap) between the projection and detection fields has to be larger than the crosstalk radius; the dimension of the gap depends on the OLED luminance and the photodiode sensitivity. For the following paraxial simulation the crosstalk around the projection field is neglected (gap = 0), but for the further development of an optical prototype a gap > 0 between object field and detection field has to be considered.
As discussed before, the detection field has to be separated from the projection field on the display. The detection field (diagonal 8.92 mm) is twice as large as the projection field (diagonal 4.46 mm) in order to realize a higher resolution (592 × 592 pixels) for the imaging path. Figure 6 shows the compact paraxial optical system design for the 3D sensor: on the left side the BiMiD and on the right side the measurement object (MO).
The object (e.g., a fringe pattern, see Figure 7) is imaged by a paraxial lens into the focal plane of the sensor, where the MO is placed. Via a second paraxial lens, the MO is observed during the fringe projection sequence and the image is detected by the PD. The system parameters are shown in Table 1. The OLED (object field) emits with a Lambertian radiation characteristic, and the measurement object emits with a uniform radiation characteristic.
(a) Reference images: dark and bright fullscreen image.
(b) Exemplary fringe pattern images, in horizontal and vertical orientation.
To realize a measurement field with a diagonal of around 0.85 mm, the magnification of the projection lens is about −1/5; the imaging magnification is 11×. The distance between the BiMiD and the measurement plane is approximately 160 mm. Both system apertures are small: the paraxial lens diameter is 5 mm in the projection path and 6 mm in the imaging path. The triangulation angle is 18°; accordingly, the image is projected onto the measurement object at an angle of 9° relative to the symmetry axis.
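The quoted field sizes and magnifications can be cross-checked with simple paraxial arithmetic. This is only an illustrative consistency check using the numbers stated in the text (4.46 mm and 8.92 mm field diagonals, a −1/5 projection magnification), not part of the ZEMAX simulation itself.

```python
# Illustrative paraxial consistency check of the stated design parameters.
projection_field_mm = 4.46        # projection (object) field diagonal on the BiMiD
detection_field_mm = 8.92         # detection field diagonal (twice as large)
projection_magnification = 1 / 5  # |m| of the projection lens

# Field produced in the measurement plane by the projection path:
measurement_field_mm = projection_field_mm * projection_magnification
# about 0.89 mm, consistent with the quoted 0.85 mm measurement field

# Magnification needed to image that field back onto the detection field:
imaging_magnification = detection_field_mm / measurement_field_mm
# about 10x, in line with the quoted 11x imaging magnification
```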
The complete system principle is shown in simplified form in Figure 8. The first simulation part covers the projection path of the OLED image. The OLED acts as an image/light source with a Lambertian radiation characteristic, and the fringe pattern image is projected onto the MO. In this first simulation step, the MO is used as a detector; the images are saved and used for the second simulation step, the imaging/detection path. The detected image, the MO with fringes, acts as an image/light source for the imaging of the MO into the real detector plane, the BiMiD. The radiation characteristic of the MO fringe image is uniform. Figure 8(a) shows one of the fringe pattern images displayed in the measurement plane on the MO. The MO is an ideal sinusoidal object, shown in Figures 8(b) and 9(a). Figure 8(c) depicts the detected image of the MO during one fringe projection sequence. For the second part of the simulation, this image acts as an image/light source and is projected into the BiMiD detection field, as shown in Figure 8(d). 46 fringe pattern images and 2 reference images were imaged onto the MO; accordingly, 48 images of the MO with fringe patterns were imaged into the BiMiD detection field (23 pattern images for each orientation, horizontal and vertical, plus 2 reference images; see Figure 7). Based on these detection images (BiMiD), the simulated measurement object could be recalculated into a 3D model. The result is shown in Figures 8(e) and 9(b).
(a) Measurement object simulated with ZEMAX.
(b) Measurement object recalculated with IOF-3D-Software tool.
For the 3D calculation we decided to use a 16-step phase shift of the fringe pattern. Each pattern consists of a series of adjacent fringes with a width of 16 pixels each. During the sequence, each pattern is shifted by 1 pixel between two adjacent steps. The basic pattern size is 1024 × 1024 pixels, consisting of 64 periods of fringes (independent of the projector resolution; only the centered area is projected onto the object). This implies a Gray code sequence of 7 images, resulting in 23 images projected in one direction. As references, a black image and a bright image are also recorded. Therefore, we use 48 images for the complete measurement. In Figure 7 some test images are shown. During the measurement, each pixel records a series of intensity values. The periodic phase value is calculated from the intensity values of the 16 fringe patterns, normalized with the dark and bright reference images, using interpolation. Phase unwrapping is done using the Gray code images in the sequence: the intensity values of the Gray code sequence for each pixel are translated into a binary sequence representing the period number. A multiple of 2π is then added to each periodic phase value, depending on its period number, to unwrap the phase values. After this calculation, the sinusoidal MO simulated in ZEMAX is recalculated into a 3D model. The recalculated object and, for comparison, the simulated object are shown in Figure 9. The calculated 3D model agrees well with the simulated MO.
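The pattern sequence described above (16 phase steps of 16-pixel-wide fringes over 64 periods, plus a Gray code sequence) can be generated as in the following sketch. Parameter names and the exact intensity profile are our own choices; the paper does not specify the profile beyond "sinusoidal."

```python
import numpy as np

def fringe_patterns(size=1024, period=16, steps=16):
    """16 sinusoidal patterns, each shifted by 1 pixel relative to the
    previous one, so the 16 steps together cover one full 16-pixel period."""
    x = np.arange(size)
    return [np.tile(0.5 + 0.5 * np.cos(2 * np.pi * (x - s) / period), (size, 1))
            for s in range(steps)]

def gray_code_patterns(size=1024, period=16, n_images=7):
    """Binary patterns giving each of the size // period fringe periods a
    unique Gray-code identifier (64 periods need 6 bits; the sequence
    described in the paper uses 7 images)."""
    idx = np.arange(size) // period          # period number of each column
    gray = idx ^ (idx >> 1)                  # binary-reflected Gray code
    return [np.tile(((gray >> b) & 1).astype(float), (size, 1))
            for b in reversed(range(n_images))]

sequence = fringe_patterns() + gray_code_patterns()  # 23 images per direction
```

With the black and bright reference frames added, and the whole set repeated for the 90°-rotated orientation, this reproduces the 48-image count used for one complete measurement.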
The monolithic design of an OLED-on-CMOS backplane with photodiodes combines emitting and detecting units on one single chip. This offers new flexibility for surface and shape characterization and allows for very compact systems, especially in the field of optical metrology.
In this paper we presented a compact, highly integrated 3D metrology system based on the fringe projection principle using a bidirectional OLED microdisplay developed by Fraunhofer IPMS. This microdisplay combines light-emitting pixels (the OLED microdisplay, serving as projection unit) and light-detecting pixels (photodiodes, serving as camera unit) on one single device. This technology provides the opportunity for the miniaturization of optical metrology systems.
In contrast to conventional 3D sensor systems (which are based on separate projection and imaging units), the presented BiMiD-based setup is compact. The presented 3D metrology system is based on fringe projection onto the surface of the measurement object. The fringes appear deformed when observed from a different angle (the triangulation angle). From the deformation of the fringes, the 3D coordinates of all visible points can be calculated and, thus, the object shape can be determined.
Due to the internal crosstalk effect, two separate lenses for projection and imaging are necessary. The system lens design is based on the BiMiD and two paraxial lenses, which are oriented at a triangulation angle of 18°. Both apertures are smaller than 6 mm, and the measurement field has a diagonal of 0.85 mm. For the recalculation of the measurement object, different reference and fringe pattern images are necessary: 23 fringe pattern images and 2 reference images are simulated through the optical system in both directions, projection and imaging, and in both orientations, horizontal and vertical. In the detected images (images of the measuring object during the fringe projection sequence), the fringes are deformed according to the irregular measurement surface. From these, the simulated measurement object can be recalculated into a 3D object model. Good agreement between the simulated and the recalculated measurement object could be shown. This system simulation provides the proof of concept of a 3D surface sensor based on a bidirectional sensor device.
Due to the application of the bidirectional OLED microdisplay, the fringe-generating elements and the detectors are combined in one single device. Therefore, an ultracompact and robust system concept for 3D surface metrology has been realized. Such a compact sensor is very suitable for applications like inline quality control in manufacturing processes. If the crosstalk could be eliminated, it would even be possible to realize a sensor with only a single optic. In a next step, different optical system configurations and the application of microoptics, hybrid optics, and freeform optical elements will be considered to design and construct a fully working 3D surface sensor.
The authors would like to thank the colleagues from Fraunhofer IPMS for their good cooperation. The project was supported by the German Federal Ministry for Education and Research (ISEMO, Project no. 16SV3682).
- P. Kühmstedt, C. Munckelt, M. Heinze, C. Bräuer-Burchardt, and G. Notni, “3D shape measurement with phase correlation based fringe projection,” in Optical Measurement Systems for Industrial Inspection, vol. 6616 of Proceeding of SPIE, 2007.
- S. Riehemann, C. Grossmann, U. Vogel, B. Richter, and G. Notni, “Ultra small OLED pico projector,” Optik & Photonik, vol. 4, no. 2, pp. 34–36, 2009.
- G. Notni, S. Riehemann, P. Kühmstedt, L. Heidler, and N. Wolf, “OLED microdisplays—a new key element for fringe projection setups,” in Interferometry XII: Applications, vol. 5532 of Proceedings of SPIE, pp. 170–177, USA, August 2004.
- S. Riehemann, M. Palme, U. Lippmann, N. Wolf, and G. Notni, “System concept and optical design of miniaturized projection and imaging systems with OLED microdisplays,” in Optical Design and Engineering II, vol. 5962 of Proceedings of SPIE, Jena, Germany, September 2005.
- S. Riehemann, U. Lippmann, M. Palme, V. Vangdal, T. Thomson, and G. Notni, “Ocular OLED-HMD with simultaneous eye-tracking,” in Proceeding of the 44th International Symposium, Seminar, and Exhibition, SID, vol. 37, no. 1, pp. 163–165, June 2006.
- C. Großmann, S. Riehemann, G. Notni, and A. Tünnermann, “OLED-based pico-projection system,” Journal of the Society for Information Display, vol. 18, no. 10, pp. 821–826, 2010.
- M. Scholles, U. Vogel, I. Underwood et al., “HYPOLED High-performance OLED microdisplays for mobile multimedia HMD and projection applications,” in Proceedings of the 48th Annual SID Symposium, Seminar, and Exhibition, vol. 3, pp. 1926–1929, 2010.
- S. Reckziegel, D. Kreye, T. Puegner et al., “Optical sensors based on monolithic integrated organic light-emitting diodes (OLEDs),” in Optical Sensors, vol. 7003 of Proceedings of SPIE, Strasbourg, France, April 2008.
- B. Richter, U. Vogel, R. Herold et al., “Bidirectional OLED microdisplay: combining display and image sensor functionality into a monolithic CMOS chip,” in Proceedings of the IEEE International Solid-State Circuits Conference (ISSCC '11), pp. 314–315, February 2011.
- U. Vogel, D. Kreye, S. Reckziegel, M. Törker, C. Grillberger, and J. Amelung, “OLED-on-CMOS integration for optoelectronic sensor applications,” in Silicon Photonics II, vol. 6477 of Proceedings of SPIE, San Jose, Calif, USA, January 2007.
- C. Grossmann, F. Perske, S. Zwick et al., “Surface metrology system based on bidirectional microdisplays,” in Optical Design and Engineering IV, vol. 8167 of Proceedings of SPIE, 2011.
- F. Wahl, Mustererkennung, 1986.
- J. Gerber, P. Kühmstedt, R. M. Kowarschik, G. Notni, and W. Schreiber, “Three-coordinate measuring system with structured light,” in Interferometry '94: Photomechanics, vol. 2342 of Proceedings of SPIE, pp. 41–49, May 1994.
- R. Kowarschik, P. Kühmstedt, J. Gerber, W. Schreiber, and G. Notni, “Adaptive optical three-dimensional measurement with structured light,” Optical Engineering, vol. 39, no. 1, pp. 150–158, 2000.
- W. Schreiber and G. Notni, “Theory and arrangements of self-calibrating whole-body three-dimensional measurement systems using fringe projection technique,” Optical Engineering, vol. 39, no. 1, pp. 159–169, 2000.
- P. Kühmstedt, “Phasogrammetric optical 3D-sensor for the measurement of large objects,” in Optical Metrology in Production Engineering, vol. 5457 of Proceedings of SPIE, pp. 56–64, April 2004.
- T. Luhmann, S. Robson, S. Kyle, and I. Harley, Close Range Photogrammetry, Whittles Publishing, 2006.
- F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Optical Engineering, vol. 39, no. 1, pp. 10–22, 2000.
- G. Kelly, R. Woodburn, I. Underwood et al., “A full-color QVGA microdisplay using light-emitting-polymer on CMOS,” in Proceedings of the 13th IEEE International Conference on Electronics, Circuits and Systems(ICECS '06), pp. 760–763, December 2006.
- D. Kreye, M. Toerker, U. Vogel, and J. Amelung, “Full colour RGB OLEDs on CMOS for active-matrix OLED microdisplays,” in Organic Light Emitting Materials and Devices X, vol. 6333 of Proceedings of SPIE, San Diego, Calif, USA, August 2006.
Copyright © 2012 Constanze Großmann et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.