Abstract

This paper proposes an augmented reality (AR) strategy in which a Lamb waves based impact detection methodology dynamically interacts with a head portable visualization device, allowing the inspector to see the estimated impact position (with its uncertainty) and impact energy directly on the plate-like structure. The impact detection methodology uses a network of piezosensors bonded on the structure to be monitored and a signal processing algorithm (the Warped Frequency Transform) able to compensate the acquired waveforms for dispersion. The compensated waveforms yield a robust estimation of the Lamb waves' difference in distance of propagation (DDOP), used to feed hyperbolic algorithms for impact location determination, and allow an estimation of the uncertainty of the impact positioning as well as of the impact energy. The outputs of the impact detection methodology are passed to a visualization technology that represents them in Augmented Reality (AR) and is meant to support the inspector during on-field inspection/diagnosis as well as maintenance operations. The inspector, in fact, can interactively see the impact data in real time directly on the surface of the structure. To validate the proposed approach, tests on an aluminum plate are presented. Results confirm the feasibility of the method and its exploitability in maintenance practice.

1. Introduction

Augmented reality (AR) is a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated input such as sound, graphics, images, or video data. AR was first used for military, industrial, and medical applications, but it was soon applied to numerous commercial and entertainment areas [1]. Numerous studies, developments and applications of AR have been proposed, as reported in the surveys by Azuma [2], Azuma et al. [3], van Krevelen and Poelman [4], and Wang et al. [5], and modern trends on AR can be found in some very recent papers [6–8]. However, to the best of the authors' knowledge, AR has been scarcely used in nondestructive testing and structural health monitoring (NDT/SHM) applications, probably due to the required multidisciplinary expertise, including but not limited to solid mechanics, numerical simulation, signal processing, and data visualization. The idea of harnessing AR to develop, support, and improve NDT/SHM is an innovative topic that deserves more attention in the literature.

The use of augmented reality (AR), in fact, could boost the usability of some NDT/SHM applications in both a technical and an economic sense. For instance, AR can be used in conjunction with visual based techniques to gain insight on the structural health status from the visual appearance of cracks [9] or to provide information in dead angle areas such as black walls or partitions [10]. Alternatively, in ultrasonic based approaches, AR can be exploited to provide the inspector with an immediate visual representation of the target of the ultrasonic inspection, generally a flaw/hole/damage, superimposed on the structure under testing. This would support the inspector in the inspection/diagnosis and maintenance process through (i) facilitating the understanding of the results of the inspection; (ii) providing an immediate real-size dimension of the damage compared to that of the structure; (iii) avoiding delays and possible mistakes while transferring the inspection results to the structure.

In this paper, an AR approach is proposed to visualize the outcomes of a nondestructive Lamb waves based impact detection methodology directly on plate-like structures. For the sake of validating the proposed approach, the plate is impacted through the stroke of an instrumented hammer by the experimenter at known positions and the AR visualization is performed in real time. In a realistic industrial scenario, such an AR approach could be exploited during the maintenance phase, after the impact has taken place, guiding the operator to the impact position to check whether the component has been damaged, and supporting the final decision on whether maintenance actions are required.

Impact detection in plate-like structures via guided waves has been the focus of much research in recent years [11–20]. Generally, a network of piezoelectric sensors is used to detect in passive mode the Lamb waves produced in the plate by the impact. The information gathered by the sensors is sent to a central unit where the acquired responses are processed to estimate the position of the impact and possibly its energy.

Thanks to the low weight and low power consumption of the technology, such approaches can be proficiently used in SHM of plate-like structures. In practice, whenever an impact occurs, the SHM system activates an alert. Once the alert is detected, an operator should retrieve the acquired information and take a decision about the maintenance strategy. In particular, the operator has to carefully verify whether or not the impact has damaged the monitored structure. This is particularly important for composite plate-like structures, since the impact may generate damage that is invisible or barely visible to the human eye, thus complicating the maintenance operator's tasks. Therefore, the effectiveness of methods and tools, such as AR, aimed at improving damage detectability should be investigated.

In this direction, this paper proposes a first attempt at using AR for post-impact data visualization on the structure. In particular, from the acquired waveforms, the implemented tool provides the inspector with the estimated impact position, a measure of the uncertainty in the localization, and an estimation of the impact energy. In the authors' opinion, this information could help avoid false alarms of the SHM system and prevent wrong assumptions during the inspection phase. A case study is proposed to show the methodology and its final outcome.

The work is organized as follows. Basic information on AR is provided in Section 2. The proposed algorithm for impact localization including uncertainty and impact energy estimation is presented in Section 3. In Section 4, the AR environment and tools are presented and an experimental validation is proposed through a case study in Section 5. The conclusions end the paper.

2. Augmented Reality

Augmented reality (AR) is a real-time technique [21, 22] allowing the experimenter to see virtual objects or scenarios superimposed interactively on real-world images [2, 3, 23]. Image acquisition, Calibration, Tracking, Registration, and Display are the main process steps required for AR [24]. A brief description of each step is provided in the following.

Image Acquisition. Image acquisition is usually performed through a camera embedded in a device generally carried on the experimenter's head, such as the one represented in Figure 1(a).

Calibration. The Calibration step is required to measure precisely the internal camera parameters and to evaluate and correct image distortion. This operation must be performed only once since it depends on the camera features only.

Tracking. Tracking is required to evaluate the pose (i.e., orientation in space in terms of pitch, roll, and yaw angles) and position of the camera with respect to an external reference system: several techniques can be applied depending on the needs of the final application, but all these methods can be grouped into sensor-based and vision-based tracking techniques. Sensor-based tracking techniques require GPS/accelerometers, magnetometers, or acoustic, optical, or mechanical devices to detect the pose and position of the experimenter [25]. Vision-based tracking techniques are based upon the evaluation of the size and optical deformation of a geometrical marker to estimate the relative position between the marker and the camera reference system [26].

Registration. The Registration is the procedure applied to synchronize the virtual image or scenario with the external view (real-world image) in accordance with the user's head movements [27]. It uses the information gathered by the Calibration and Tracking phases and exploits the spatial coordinate transformation from the 3D scene to a 2D image.

For this purpose, first the simple pinhole camera model (see Figure 2 and [28]), based on a perspective projection, is used to describe the relationship between the coordinates $(x_c, y_c, z_c)$ of a generic point in the 3D space and its projection $(x_s, y_s)$ onto the 2D screen plane. Simple geometry shows that if the distance of the screen plane from the centre of projection, that is, the ideal pinhole of the camera, is denoted by $f$, the screen coordinates are related to the object coordinates as

$$x_s = f\,\frac{x_c}{z_c}, \qquad y_s = f\,\frac{y_c}{z_c}. \tag{1}$$

Next, each point on the screen plane can be related to a pixel of an image if the dimensions of the screen plane are known. Denoting by $u$ and $v$ the row and column of the pixel with respect to $u_0$ and $v_0$, the row and column of the pixel at the origin of the screen plane coordinate system, and being $k_u$ and $k_v$ scale factors in the $x$ and $y$ axes (pixel/meter), respectively, such relation is

$$u = u_0 + k_u\,x_s, \qquad v = v_0 + k_v\,y_s. \tag{2}$$

Combining (1) and (2) and exploiting homogeneous coordinates (4-dimensional), a transformation between pixels and coordinates in the 3D space can be obtained as

$$z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f k_u & s & u_0 & 0\\ 0 & f k_v & v_0 & 0\\ 0 & 0 & 1 & 0\end{bmatrix}\begin{bmatrix}x_c\\ y_c\\ z_c\\ 1\end{bmatrix}=\mathbf{P}\begin{bmatrix}x_c\\ y_c\\ z_c\\ 1\end{bmatrix},\tag{3}$$

in which the parameter $s$, related to the camera features, is introduced to account for the distortion (skew) effects between the $u$ and $v$ axes [29]. The parameters of the matrix $\mathbf{P}$, known as the camera's perspective matrix, are obtained by means of the Calibration process.

Subsequently, the relative position in the 3D space between the marker and the camera coordinate systems is introduced:

$$\begin{bmatrix}x_c\\ y_c\\ z_c\\ 1\end{bmatrix}=\begin{bmatrix}\mathbf{R} & \mathbf{T}\\ \mathbf{0} & 1\end{bmatrix}\begin{bmatrix}x_m\\ y_m\\ z_m\\ 1\end{bmatrix}=\mathbf{M}\begin{bmatrix}x_m\\ y_m\\ z_m\\ 1\end{bmatrix},\tag{4}$$

where the Rotation matrix ($\mathbf{R}$) and the Translation vector ($\mathbf{T}$), computed in real time by the Tracking phase, describe the pose and position of the camera with respect to the marker reference system. The matrix $\mathbf{M}$ is also known as the external parameters matrix. Finally, substituting (4) into (3) yields a dynamic relation between each point described in the 3D marker coordinate system and its representation on a 2D virtual image:

$$z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\mathbf{P}\,\mathbf{M}\begin{bmatrix}x_m\\ y_m\\ z_m\\ 1\end{bmatrix}.\tag{5}$$

Since the marker and the plate reference systems are parallel and shifted by known quantities $\Delta x$ and $\Delta y$ along the $x$ and $y$ axes (see Figure 2), the coordinates $(x_p, y_p)$ of a point with respect to the plate coordinate system can be directly related to $u$ and $v$ by means of (5), taking $x_m = x_p - \Delta x$ and $y_m = y_p - \Delta y$. In this application, the point is the impact point.
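Equations (1)–(5) reduce to two matrix products per frame. The following minimal Python sketch (not part of the original toolchain; all numeric values are illustrative placeholders) shows how a plate point would be mapped to pixel coordinates once $\mathbf{P}$ and $\mathbf{M}$ are available:

```python
import numpy as np

def perspective_matrix(f, ku, kv, u0, v0, s=0.0):
    """Camera perspective matrix P of (3), from the calibrated intrinsics."""
    return np.array([[f * ku, s,      u0,  0.0],
                     [0.0,    f * kv, v0,  0.0],
                     [0.0,    0.0,    1.0, 0.0]])

def external_matrix(R, T):
    """External parameters matrix M of (4), from the tracked pose (R, T)."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M

def plate_point_to_pixel(xp, yp, dx, dy, P, M):
    """Map a plate point (xp, yp, 0) to pixel coordinates (u, v) via (5),
    with the marker frame shifted by (dx, dy) from the plate frame."""
    Xm = np.array([xp - dx, yp - dy, 0.0, 1.0])  # homogeneous marker coordinates
    uvw = P @ M @ Xm                             # homogeneous pixel coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Illustrative values only: camera 1 m in front of the marker, no rotation.
P = perspective_matrix(f=0.004, ku=2.5e5, kv=2.5e5, u0=426, v0=240)
M = external_matrix(np.eye(3), np.array([0.0, 0.0, 1.0]))
print(plate_point_to_pixel(0.30, 0.40, 0.05, 0.05, P, M))
```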

Display. Finally, the Display phase is meant to show the AR scene to the experimenter. The available devices can be divided into two main groups, depending on whether they show a computer-generated image of the external world and synthetic environment (Head Mounted Display (HMD)) or a synchronized combination of real-world eyesight and computer-generated image (Optical See-Through devices). HMDs are composed of a dark head helmet, equipped with a pair of projectors displaying external world images acquired by a camera mounted on the HMD itself: the device is thus able to combine a background image stream coming from the camera (external view) and a foreground synthetic image, synchronized with the localized optical markers. On the other hand, see-through displays are usually equipped with a camera, projectors, and semitransparent lenses: the experimenter can see the real-world eyesight through the lenses, while the virtual scene, represented by the virtual elaboration of the image acquired by the camera, is projected onto the lenses.

According to [30], new devices such as mobile platforms (e.g., Android) can also be used for AR, in which the augmented image is displayed on the screen of a mobile phone or a tablet, using the device camera to acquire the external image.

3. Impact Detection Methodology

In this section, the guided waves based methodology to estimate the impact position $(x_I, y_I)$, the uncertainty in such estimation, and the impact energy $E_I$ is presented.

3.1. Hyperbolic Positioning

Hyperbolic positioning, also called multilateration, is a powerful method to locate the impact position in a plate. Given the positions $(x_i, y_i)$ of at least three sensors on the plate, with $i = 1, 2, 3$, such a method exploits the differences in distance of propagation (DDOP) traveled by the waves from the impact point $(x_I, y_I)$ to the sensors:

$$\Delta d_{ij} = d_i - d_j = \sqrt{(x_I - x_i)^2 + (y_I - y_i)^2} - \sqrt{(x_I - x_j)^2 + (y_I - y_j)^2},\tag{6}$$

in order to determine hyperbolas on which the impact point must lie. The intersection of the three different hyperbolas ($\Delta d_{12}$, $\Delta d_{13}$, $\Delta d_{23}$), obtained by solving the system of equations with the Levenberg-Marquardt algorithm [31], is taken to be the impact position.
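A minimal sketch of this multilateration step, assuming hypothetical sensor coordinates and using SciPy's Levenberg-Marquardt solver in place of the authors' implementation:

```python
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.9]])  # sensor coords (m), hypothetical

def ddop_residuals(p, sensors, ddop_meas):
    """Residuals between measured DDOPs and those implied by candidate point p, cf. (6)."""
    d = np.linalg.norm(sensors - p, axis=1)     # distances from p to the 3 sensors
    pairs = [(0, 1), (0, 2), (1, 2)]
    return np.array([d[i] - d[j] - ddop_meas[k] for k, (i, j) in enumerate(pairs)])

def locate_impact(sensors, ddop_meas, p0=(0.5, 0.5)):
    """Solve the hyperbolic system with the Levenberg-Marquardt algorithm."""
    sol = least_squares(ddop_residuals, p0, args=(sensors, ddop_meas), method='lm')
    return sol.x

# Synthetic check: DDOPs generated from a known impact point are inverted back.
true_impact = np.array([0.3, 0.4])
d = np.linalg.norm(sensors - true_impact, axis=1)
ddop = np.array([d[0] - d[1], d[0] - d[2], d[1] - d[2]])
print(locate_impact(sensors, ddop))   # ~ [0.3, 0.4]
```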

Generally, the difference in distance of propagation (DDOP) is obtained by multiplying the difference in time of arrival (DTOA), measured from the acquired signals as the first arrival over a certain threshold, by a nominal wave propagation speed. Unfortunately, the potential of such an approach is limited in plates, where several dispersive modes, that is, modes with frequency-dependent velocity and attenuation, appear simultaneously in the received signals, so that the selection of a wave speed, and thus the transformation from DTOA to DDOP, is not trivial.

To overcome this detrimental effect, it was shown in [19] that, by processing the acquired signals with a suitable transform, namely, the Warped Frequency Transform (WFT), a robust estimation of the DDOP can be obtained without the need of measuring the DTOA (some details are given in the next subsection).

3.2. DDOP Estimation via Warped Frequency Transform

Let us consider a guided wave signal $s(t, d)$ detected passively in an isotropic plate, where $t$ denotes the time and $d$ is the unknown traveled distance, and assume that the $A_0$ mode is within the signal, as generally happens when a plate undergoes an impact. The group velocity curve of the $A_0$ mode can be used to compute a warping map $w(f)$ that uniquely defines a Frequency Warping operator $\mathbf{W}$ [32]. Such operator, applied to $s(t, d)$, yields a so-called warped signal whose frequency transform is defined as

$$S_w(f) = \sqrt{\left|\dot{w}(f)\right|}\; S_0\big(w(f)\big)\, e^{-j 2\pi f \frac{d}{K c}},\tag{7}$$

being $\dot{w}(f)$ the first derivative of the warping map, $S_0(w(f))$ the warped frequency spectrum of the exciting pulse at the point of impact (zero traveled distance), and $K$ a warping map normalization parameter. The dispersive signal is thus transformed as in (7), where the dispersive effect of the distance is converted into a simple warped time delay proportional, through the equivalent wave speed $c$, to the distance itself.
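A compact way to apply the Frequency Warping operator of (7) to an acquired trace is to resample its spectrum on the warped frequency axis. The sketch below is a simplification (plain linear interpolation instead of the unitary non-uniform resampling of the WFT literature) and assumes the warping map is already sampled on the rfft frequency grid as `wmap`:

```python
import numpy as np

def warp_signal(s, fs, wmap):
    """Warped signal via S_w(f) = sqrt(|w'(f)|) * S(w(f)), cf. (7).
    `wmap` holds the warping map w sampled on the rfft grid of `s`."""
    f = np.fft.rfftfreq(len(s), d=1.0 / fs)   # uniform frequency grid
    S = np.fft.rfft(s)                        # spectrum of the acquired trace
    wdot = np.gradient(wmap, f)               # first derivative of the map
    scale = np.sqrt(np.abs(wdot))
    # resample the complex spectrum at the warped frequencies w(f)
    Sw = scale * (np.interp(wmap, f, S.real) + 1j * np.interp(wmap, f, S.imag))
    return np.fft.irfft(Sw, n=len(s))
```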

If so, the cross-correlation in the frequency domain of two warped signals acquired at sensors $i$ and $j$ is

$$X_{ij}(f) = S_{w,i}(f)\, S_{w,j}^{*}(f) = \left|\dot{w}(f)\right|\,\big|S_0\big(w(f)\big)\big|^{2}\, e^{-j 2\pi f \frac{d_i - d_j}{K c}},\tag{8}$$

where the DDOP $d_i - d_j$ appears in the phase term. Thus, the abscissa value at which the envelope of the cross-correlation of two warped signals peaks in the frequency warped domain can be directly related to the DDOP of the dispersive signal acquired passively at two different sensors.
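In practice the DDOP is read off the envelope peak of the warped-domain cross-correlation. A minimal sketch follows; the lag-to-distance scaling through $K$ and $c$ matches the convention adopted in (7)-(8) above and is an assumption of this reconstruction:

```python
import numpy as np
from scipy.signal import correlate, hilbert

def estimate_ddop(sw_i, sw_j, fs, K, c):
    """DDOP from the peak of the cross-correlation envelope of two warped
    signals, cf. (8); warped lag is mapped to distance through K and c."""
    xc = correlate(sw_i, sw_j, mode='full')          # warped-domain Xcorr
    env = np.abs(hilbert(xc))                        # envelope (Hilbert modulus)
    lag = (np.argmax(env) - (len(sw_j) - 1)) / fs    # warped lag in seconds
    return K * c * lag                               # d_i - d_j in meters
```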

3.3. Theoretical Estimation Error of the Impact Position

The proposed impact localization strategy has an intrinsic uncertainty due to the imperfect knowledge of the material properties of the plate, the limited sampling frequency, and the unavoidable presence of noise in the measurements. Such uncertainty is strictly related to the sensor topology, and its lower bound, that is, the minimum theoretical estimation error of the source position, can be estimated by using the Cramér-Rao lower bound.

Considering the DDOP measurements, the Cramér-Rao value for a point $(x, y)$ on the plate is calculated as

$$\mathrm{CRB}(x, y) = \sqrt{\operatorname{tr}\!\left[\Big( (\mathbf{G}\mathbf{H})^{T}\, \mathbf{Q}^{-1}\, (\mathbf{G}\mathbf{H}) \Big)^{-1}\right]},\tag{9}$$

where $\mathbf{H}$ is the matrix whose elements are

$$H_{i1} = \frac{f_s}{K c}\,\frac{x - x_i}{d_i}, \qquad H_{i2} = \frac{f_s}{K c}\,\frac{y - y_i}{d_i}, \qquad i = 1, 2, 3,\tag{10}$$

where $c$ is the equivalent wave speed obtained through the time-distance mapping described in the warping procedure, $K$ is the warping map normalization parameter, and $f_s$ is the considered sampling frequency.

$\mathbf{G}$ is the full-rank matrix:

$$\mathbf{G} = \begin{bmatrix} 1 & -1 & 0 \\ 1 & 0 & -1 \end{bmatrix}.\tag{11}$$

$\mathbf{Q}$ is a diagonal matrix which represents the error in the measurements. The measurement errors are considered as independent identically distributed Gaussian random variables with zero mean and standard deviation $\sigma$, so that the nonnull elements of $\mathbf{Q}$ are equal to $\sigma^2$.
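Under the reconstruction of (9)-(11) given above, the bound can be mapped over the plate with a few lines; all numeric inputs below are placeholders:

```python
import numpy as np

def crb(point, sensors, fs, K, c, sigma):
    """Theoretical positioning error (9) at a candidate point."""
    delta = point - sensors                       # (3, 2) offsets to the sensors
    d = np.linalg.norm(delta, axis=1)             # distances d_i
    H = (fs / (K * c)) * delta / d[:, None]       # (10): arrival-time Jacobian
    G = np.array([[1., -1., 0.], [1., 0., -1.]])  # (11): full-rank differencing matrix
    Q = sigma ** 2 * np.eye(2)                    # i.i.d. Gaussian measurement errors
    J = G @ H                                     # Jacobian of the DDOP measurements
    fisher = J.T @ np.linalg.inv(Q) @ J
    return np.sqrt(np.trace(np.linalg.inv(fisher)))

# Contour-map style evaluation over the plate (placeholder values):
sensors = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.9]])
xs, ys = np.linspace(0, 1, 50), np.linspace(0, 1, 50)
bound = np.array([[crb(np.array([x, y]), sensors, fs=1e6, K=1.0, c=3000.0, sigma=1.0)
                   for x in xs] for y in ys])
```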

3.4. Impact Energy Estimation

Because of the acoustic energy confinement in the waveguide, Lamb waves experience a low attenuation with distance. In particular, the acquired signal amplitude is inversely proportional to the square root of the distance of propagation. The estimation of the impact energy deserves attention since it can be related to the level of damage in the component. Since the WFT is a unitary transform which preserves the energy of the signal [19], the impact energy can be estimated from the energy of the warped signals as

$$E_I = \beta \sum_{i=1}^{3} d_i \int \big|s_{w,i}(t)\big|^{2}\, dt,\tag{12}$$

where $s_{w,i}$, with $i = 1, 2, 3$, are the warped versions of the signals acquired by the three sensors, $d_i$ are the estimated distances of propagation (which compensate the $1/d$ energy decay), and $\beta$ is a calibration constant.
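A direct transcription of (12) as reconstructed here, with `beta` the calibration constant and the distance weighting compensating the $1/d$ energy decay:

```python
import numpy as np

def impact_energy(warped_signals, distances, beta):
    """Impact energy estimate (12): distance-compensated warped-signal energies."""
    return beta * sum(d * np.sum(s ** 2) for s, d in zip(warped_signals, distances))
```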

4. Impact Data Visualization in AR

The hardware, software, settings, and testing procedures used in this work to visualize the outcomes of the impact detection methodology in AR are described in the following.

An aluminum 1050A square plate, 1000 mm × 1000 mm and 3 mm thick, was considered (see Figure 3). The nominal properties considered for the plate are the handbook values for this alloy: Young's modulus $E \approx 69$ GPa, Poisson's coefficient $\nu \approx 0.33$, and material density $\rho \approx 2710$ kg/m³. Three piezoelectric sensors (PZT discs PIC181, diameter 10 mm, thickness 1 mm) were bonded to the plate surface, at the positions reported in Table 1.

In this study, a vision-based tracking technique is adopted. Such an approach is usually preferred to sensor-based techniques in indoor applications due to its higher precision and easier hardware implementation. To such a purpose, the marker of Figure 1(b) is fixed on the plate at known coordinates. The marker has been simply sketched with graphical software and printed on a paper sheet. Different marker shapes could be used, provided they are asymmetric, black and white, and easily recognizable by the AR software. The tracking is performed by means of ARToolKit, an available library for augmented reality shared by the University of Washington [33]. ARToolKit is used to detect the marker, to measure the image distortion, and to relate in time the marker reference system to the image.

The experimenter wears a pair of Vuzix STAR 1200 glasses (see Figure 1(a)), a see-through device specifically designed for AR applications [34]. These glasses include a camera in the front and two miniaturized projectors displaying images on the transparent lenses; two USB plugs connect the Vuzix glasses to a notebook or a PC. The resolution of the camera of these see-through glasses can be set up to full high definition (1920 × 1080 pixels), and the two projectors support video display resolutions up to WVGA (852 × 480). At first, an image in which the marker is framed with a preset distance and pose (known position) of the Vuzix glasses camera is taken; the ARToolKit macros automatically detect the marker in the image and perform the Calibration computations (the other camera features are generally known).

The experimenter strikes the plate with an instrumented hammer (PCB Piezotronics), used to simulate impacts, and the guided wave signals are gathered by the sensors through an oscilloscope (LeCroy LC534 series) at a fixed sampling frequency $f_s$. Acquisitions are triggered when the signal received from one of the sensors reaches a threshold level of 140 mV; pretrigger recording is enabled to obtain the previous history of each signal. The detected waveforms are directly processed in a PC to find the position, position uncertainty, and energy of the impact. Such data are passed to the AR algorithm for the final visualization.
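The oscilloscope trigger can be mimicked in software when replaying stored records; in the sketch below only the 140 mV threshold comes from the text, while the window lengths are arbitrary choices:

```python
import numpy as np

def triggered_slice(signal, fs, threshold=0.140, pre=1e-3, post=4e-3):
    """Return a window around the first threshold crossing, keeping `pre`
    seconds of pretrigger history (mimicking the oscilloscope setup)."""
    above = np.flatnonzero(np.abs(signal) >= threshold)
    if above.size == 0:
        return None                                  # no trigger event
    k = above[0]
    i0 = max(0, k - int(pre * fs))
    return signal[i0 : k + int(post * fs)]
```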

During tracking, each frame acquired by the glasses' camera is processed by the ARToolKit software following these steps: application of a threshold, detection of connected objects, detection of object contours, detection of marker edges, computation of the image distance and pose with respect to the camera reference system by evaluating the marker distortion, and projection of the impact point coordinates onto the image in terms of $u$ and $v$.

Finally, a designed virtual symbol or image, centered at the right position ($u$ and $v$), can be projected onto the real-world image in real time (see Figure 4) exploiting the glasses' projectors.

The marker detection procedure does not require the experimenter's point of view to be perpendicular to the surface on which the marker lies, so the experimenter can view the impact point from several points of view, rotating the head in a natural way.

5. Results of the Case Study

5.1. Impact Location Estimation

To process the acquired signals with the WFT, first a guided mode must be selected and its warping map designed. Here, the fundamental antisymmetric mode $A_0$ was selected since its contribution in the signal is expected to be much more relevant than that of the fundamental symmetric mode $S_0$ for two main reasons: (i) the wavelength tuning effect imposed by the sensors/plate properties filters out most of the $S_0$ wave; (ii) due to the out-of-plane excitation, the energy in the $A_0$ mode is considerably greater than that retained by the $S_0$ mode. Therefore, the group velocity curve of the $A_0$ mode was predicted [35] and used to compute the warping map $w(f)$ and the normalization parameter $K$, as well as the proper Frequency Warping operator $\mathbf{W}$. At this point, for the given coordinates of the transducers, the selected sampling frequency $f_s$, and the parameter $\sigma$, the Cramér-Rao lower bound was computed for each point of the plate and represented in Figure 5 as a contour plot.
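One possible construction of the warping map from a predicted group velocity curve, consistent with the phase convention of (7) as reconstructed here (not necessarily the exact design of [32]), integrates the reciprocal group velocity into a wavenumber and inverts it numerically:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def design_warping_map(f, cg):
    """Sketch of an A0 warping map: pick w(f) so that the mode phase
    k(w(f))*d collapses to a delay linear in f, cf. (7).
    `f` must start above 0 Hz (the A0 group velocity vanishes at f = 0)."""
    k = cumulative_trapezoid(2 * np.pi / cg, f, initial=0.0)  # wavenumber: dk/df = 2*pi/cg
    Kc = k[-1] / (2 * np.pi * f[-1])            # normalization: w maps Nyquist onto itself
    wmap = np.interp(2 * np.pi * Kc * f, k, f)  # w = k^{-1}(2*pi*Kc*f)
    return wmap, Kc
```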

Then, the plate was impacted, the signals were acquired, and the cross-correlations (Xcorr) of the warped signals were computed. For instance, the Xcorr between the warped signals at sensors 1 and 2 for an impact at a known position is depicted in Figure 6(a), together with its envelope computed as the modulus of the Hilbert Transform. Similarly, Figure 6(b) shows the Xcorr and its envelope for sensors 1 and 3.

As can be seen, the abscissas of the envelope maxima correspond to the actual differences in distance of propagation $d_1 - d_2$ and $d_1 - d_3$, respectively. Similar accuracy was obtained for $d_2 - d_3$, not shown in Figure 6. Such DDOPs define the hyperbolas used to estimate the impact position.

Such an approach was then used to locate 18 impacts on the plate surface. The results of the proposed procedure can be seen in Figure 7, where the crosses denote the positions where the plate was impacted, the centres of the circles denote the impact positions estimated by the proposed procedure, and the circles' radii were set equal to the value of the Cramér-Rao bound at the estimated impact position. It is worth noting how the circles always surround the true impact positions.

5.2. Impact Energy Estimation

The impact energy for the 18 different impacts considered was estimated as detailed in Section 3.4 and is represented in Figure 8 as small circles. Such estimated impact energies were compared with those measured by the instrumented hammer. In particular, the measured energy was computed as the squared value of the velocity time-history, obtained by integrating the acceleration time-history registered by the instrumented hammer. As can be seen, a very good agreement between the two quantities was found, thus verifying that an efficient impact energy estimation can be performed from the warped waveforms. The discrepancy in the energy estimation that appears in cases 10 and 14 may be due to several factors, such as the suboptimal selection of the calibration constant $\beta$, or the jamming effect due to edge reflections. Most likely, the discrepancy is due to the difference in the frequency response between the impact hammer (which acts as a low-pass filter in the 0-4 kHz range) and the piezoelectric sensors (which have a much broader frequency response).
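The reference quantity can be computed in a few lines; the text specifies only the squared velocity time-history, so the effective-mass factor `m_eff` used below to express a kinetic energy is an added assumption of this sketch:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def hammer_energy(acc, t, m_eff):
    """Reference impact energy from the instrumented hammer: integrate the
    acceleration to velocity and take the peak of the squared velocity."""
    v = cumulative_trapezoid(acc, t, initial=0.0)    # velocity time-history
    return 0.5 * m_eff * np.max(v ** 2)              # kinetic energy at impact
```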

5.3. AR Visualization

Following the described AR approach, the experimenter wears a pair of AR glasses and frames both the marker and the zone in which the impact is applied. The aluminum panel is hit with the hammer, the impact detection procedure starts, and the estimated coordinates of the impact point with respect to the marker system, $x_m$ and $y_m$, are passed to the visualization algorithm along with the uncertainty and energy estimates.

As shown in Figure 9, a symbol formed by three squares has been designed to represent the data obtained from the impact methodology: the centre of the red square shows the estimated impact position, the side length of the green square denotes the uncertainty in the estimated position, whereas the yellow square denotes the strength of the impact: a yellow square close to the red square means low energy, while a yellow square close to the green square means high energy. The experimenter can thus visually evaluate the energy of the impact in an intuitive way. Figure 9 shows the view through the glasses lenses after the augmentation.
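The three-square symbol is straightforward to render once the pixel coordinates are known; a sketch using OpenCV (the size constants and the `energy_frac` normalization in [0, 1] are assumptions of this example, not values from the paper):

```python
import cv2

def draw_impact_symbol(frame, u, v, unc_px, energy_frac):
    """Overlay the three-square symbol on a camera frame: red = estimated
    position, green = uncertainty, yellow = energy, drawn between the two."""
    u, v = int(u), int(v)
    r_red, r_green = 8, max(int(unc_px), 12)           # half side lengths (pixels)
    r_yel = int(r_red + energy_frac * (r_green - r_red))
    for r, bgr in ((r_green, (0, 255, 0)),             # BGR colors, largest first
                   (r_yel, (0, 255, 255)),
                   (r_red, (0, 0, 255))):
        cv2.rectangle(frame, (u - r, v - r), (u + r, v + r), bgr, 2)
    return frame
```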

6. Conclusions

In this work, an augmented reality (AR) strategy is proposed to visualize the outcomes of an impact detection methodology on plate-like structures. In particular, the impact position and energy are considered. To such an aim, previous contributions of the authors devoted to signal processing strategies for impact localization are first exploited and extended to the estimation of the impact energy. The experimental results on an isotropic plate show the accuracy of the methodology in both impact location and energy estimation. Next, as a new original contribution to the research, the visualization in AR of impact position, uncertainty, and impact energy is provided. The experimental results show how such data can be viewed by the experimenter in real time, directly superimposed on the structure.

The proposed strategy shows good potential for use in damage detection and maintenance operations. In fact, besides the accurate and computationally cheap impact detection (position and energy), it can ease maintenance practice, since data can be presented directly on the component to be monitored in an effective and intuitive way. AR is preferred to Virtual Reality since the augmented scene is obtained by adding virtual objects to the real-world eyesight, thus mitigating the unphysical feeling that the end user experiences with a fully virtual approach while still supporting the user during operations in the real-world environment.

Further studies are needed to assess the reliability of the proposed technology and the applicability to more complex structures (e.g., stiffened plates) and to anisotropic materials.

Conflict of Interests

The authors did not receive any financial support from the commercial identities mentioned in the work. The authors declare that there is no conflict of interests regarding the publication of this paper.