Abstract

We address the problem of low amplitude oscillatory motion detection through different low-cost sensors: a LIS3LV02DQ MEMS accelerometer, a Microsoft Kinect v2 range camera, and a uBlox 6 GPS receiver. Several tests were performed using a one-direction vibrating table with different oscillation frequencies (in the range 1.5–3 Hz) and small, challenging amplitudes (0.02 m and 0.03 m). A Mikrotron EoSens high-resolution camera provided the reference data. A dedicated software tool was developed to retrieve the Kinect v2 results, and the capabilities of the VADASE algorithm were employed to process the uBlox 6 GPS receiver observations. Over the investigated time interval (on the order of tens of seconds), the results indicate that displacements were detected with a resolution of fractions of a millimeter with the MEMS accelerometer and the Kinect v2, and of a few millimeters with the uBlox 6. The MEMS accelerometer displays the lowest noise but a significant bias, whereas the Kinect v2 and uBlox 6 appear more stable. The results suggest the possibility of sensor integration both for indoor (MEMS accelerometer + Kinect v2) and for outdoor (MEMS accelerometer + uBlox 6) applications and seem promising for structural monitoring applications.

1. Introduction

This work addresses the problem of in situ and close-range remote detection and characterization of oscillatory motions, even down to a few centimeters of amplitude (0.02 m and 0.03 m) and frequencies in the range of 1.5–3 Hz. This was achieved through low-cost sensors based on different technologies: a LIS3LV02DQ MEMS accelerometer, a Microsoft Kinect v2 range camera, and a uBlox 6 GPS receiver. Amplitude and frequency accuracies for the detected positions, velocities, and accelerations were evaluated with respect to the reference data provided by a Mikrotron EoSens high-resolution camera, and integration problems were highlighted, in order to first of all assess the potential of these sensors for close-range monitoring, possibly in real-time. This goal was reached through experimental tests in which the sensors underwent monitored oscillatory motions induced by a one-direction vibrating table able to operate at different amplitudes and frequencies. Specifically, the cameras were installed in a suitable location and remotely monitored, at close range, the object undergoing the oscillatory motion. To enable object tracking with the Kinect v2, a suitable target was attached to the monitored object. On the other hand, the GPS receiver and the MEMS accelerometer were placed on the monitored object, since they have to undergo the oscillatory motion themselves.

All sensors investigated are commonly employed for very different applications. Due to their low cost and their different features and capabilities, they can offer complementary contributions for monitoring purposes, as summarized in Table 1.

The GPS sensor provides the receiver position (and displacement) in a global reference frame [1] together with accurate information in an absolute time reference frame (the so-called GPS Time). Conversely, the MEMS accelerometer and the cameras refer to a local spatial reference frame. Furthermore, the MEMS accelerometer does not provide a time stamp for the acquired data (time information can only be retrieved knowing the acquisition frequency, and it is computed from an arbitrary origin), while the Kinect v2 camera gives a time stamp for each frame but in a relative time reference frame. The working conditions of the different sensors are also complementary. MEMS accelerometers and cameras are suitable for both indoor and outdoor applications; in particular, the Kinect v2 range camera requires good visibility to retrieve good quality texture, but at the same time direct sunlight must be avoided, since depth data of the scene cannot be retrieved in those conditions. The GPS receiver is outdoor-only equipment that requires good sky visibility.

Below we summarize the most common applications of the sensors considered in this research, with a focus on the applications related to monitoring purposes.

There are several fields of application benefiting from miniaturized sensors such as MEMS accelerometers. Developed for the military and aerospace markets in the 1970s, they first reached mass production in the automotive industry (in particular for airbags, stability control, and GPS) [2]. They were then widely employed in mobile phone and gaming applications (video game control and smartphone applications) and in the civil engineering field. With regard to monitoring purposes, they have been used in structural engineering and in seismology. In particular, [3] investigated the effectiveness and robustness of a MEMS-based structural health monitoring system, which represents a new strategy for failure detection in composite laminates. Through laboratory tests, the MEMS accelerometer responses were properly validated against simultaneous measurements.

A MEMS accelerometer senses the acceleration of gravity, so an inclination with respect to the gravity direction can be detected. Not by chance, the use of MEMS sensors as inclinometers has been proposed since the early years of this decade: [4] presents suitable methods for determining tilt angles with an accuracy of a few tenths of a degree of arc.
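As a simple illustration of the inclinometer principle recalled above (a minimal sketch, not the method of [4]), the components of the gravity vector measured by a static 3-axial accelerometer can be converted into tilt angles:

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch and roll (degrees) of a static 3-axial accelerometer.

    ax, ay, az are accelerations in g along the sensor axes; with the
    sensor at rest the measured vector is the reaction to gravity, so
    its direction encodes the inclination with respect to the vertical.
    """
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax**2 + az**2)))
    return pitch, roll

# Example: a sensor tilted slightly about one axis
print(tilt_angles(0.05, 0.00, 0.9987))  # approximately (2.9, 0.0) degrees
```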

Over the past decade, several studies have shown the reliability and sensitivity of low-cost triaxial accelerometers for geophysical and seismological applications [5]. These promising results led to the design of original projects such as the Community Seismic Network [6], a dense open seismic network consisting of low-cost sensors hosted by volunteers. The main product of the network, available within a few seconds, is a map of peak acceleration associated with earthquake events. Another example of a network based on MEMS accelerometers is the Quake-Catcher Network, designed to record moderate to large earthquakes. In [7], data recorded by GeoNet stations (http://www.geonet.org.nz/) and Quake-Catcher Network stations were compared after the September 3, 2010, Darfield earthquake, and the peak ground accelerations observed by the two networks were shown to be comparable.

A third significant example is Seismocloud (http://www.seismocloud.com/index.php), a project developed in Italy to support earthquake early warning through a dense network of low-cost seismometers.

Also, [8] has recently shown promising results regarding the suitability of low-cost 3-axial MEMS accelerometers for civil and geophysical applications aimed at the reconstruction of vibrations and ground motion. In particular, the LIS331DLH low-cost MEMS accelerometer showed very high agreement with the EpiSensor FBA ES-T, an established and very accurate accelerometer for the strong-motion seismology field.

The Microsoft Kinect sensors (v1 and v2) are, for all intents and purposes, range cameras: although originally designed for Natural User Interaction in a computer game environment, thanks to their depth sensor they can be used as full 3D scanners to easily reconstruct a physical scene, retrieving dense point clouds in real-time. These characteristics immediately attracted the attention of researchers from several scientific fields.

Many studies [9–13] have evaluated the metric quality of Kinect v1 data. With regard to the Kinect v2 sensor, released on the market more recently, [14] investigated its capabilities for close-range 3D modelling.

Medical and health researchers have also explored the noninvasive body motion capture capabilities of the Kinect to provide remote rehabilitation facilities to patients [15, 16].

However, although the Kinects are specifically designed for body motion tracking, object tracking has not yet been deeply studied. At present, the Kinect v1 sensor has already been used for real-time tracking of moving objects, reaching an accuracy of a few millimeters in 3D position detection [17, 18], whereas no particular attention was paid to velocity and acceleration measurements. On the other hand, [19] investigated the use of the Kinect v1 for monitoring the deflection of reinforced concrete beams subjected to cyclic loads, measuring vertical displacements.

Moreover, to our knowledge, the Kinect v2 sensor has never been tested for object tracking until now; the sensor used here was obtained as a demo version directly from Microsoft after our research group was selected for the Microsoft Kinect for Windows V2 Developer Preview Program.

It is well known that GPS, including low-cost receivers, has an enormous range of applications; therefore, only the applications closest to the purposes of this research, mainly related to structural dynamic monitoring, are briefly recalled here.

As a matter of fact the use of these low-cost sensors in structural dynamic monitoring is increasing but not yet well established.

In [20], the suitability of low-cost single frequency receivers for structural monitoring was investigated and several static and dynamic tests were performed with four GlobalTop Gms-u1LP receivers; imposed oscillations (with amplitudes ranging from 0.25 m to 2 m) were clearly identified in the frequency domain, while dynamic displacements were retrieved with a precision influenced by the oscillation amplitude and frequency.

If we consider single frequency GPS receivers that are not strictly low-cost, many other examples arise from the literature. In [21], a single frequency GPS receiver was used for dynamic oscillation detection through experimental tests. A real case study on the Hawksham Bridge in Nackawic (New Brunswick, Canada) was also carried out; after an appropriate polynomial adjustment, the results showed that it is possible to detect displacements of a few centimeters considering the phase observations related to a single satellite, and of a few millimeters if phase differences between satellites are considered.

In the GNSS seismology field, for the May 3, 2012, Emilia earthquake [22], data from 7 GPS permanent stations (1 Hz sample rate) were processed exploiting the potential of the VADASE software [23], and the results were then compared with those obtained with other well-established strategies and software. The VADASE software was used both applying the ionospheric-free combination to dual frequency data (hence eliminating the ionospheric error that heavily affects GPS observations) and considering single frequency data only (in this case, in the current VADASE version [24], the ionospheric delay is modeled according to [25]). The Root Mean Square Error (RMSE) of the VADASE dual and single frequency solutions with respect to the reference ones turned out to be of the same order of magnitude (about 1 cm in the horizontal direction, less than 2 cm in the vertical direction).

This paper is organized as follows. In Section 2, the main features of the sensors, as well as of the tracking (Kinect v2 range camera) and processing (uBlox 6) software, are briefly presented, together with details about the MEMS accelerometer, the vibrating table, and the experimental design. In Section 3, the data processing approach is illustrated and the obtained results are discussed. Finally, in Section 4, some conclusions and future prospects are outlined.

2. Experiments: Devices and Tools

The equipment involved in the experimental investigation consists of a LIS3LV02DQ MEMS accelerometer, a Microsoft Kinect v2 range camera, a low-cost uBlox 6 GPS receiver, and one Mikrotron EoSens high-resolution camera as reference (see Figure 1).

The LIS3LV02DQ MEMS accelerometer (package size 7 × 7 × 1.8 mm) is a commercial low-power 3-axial linear accelerometer [26] with a selectable acquisition frequency ranging from 40 Hz to 640 Hz, a full scale of ±2 g (sensitivity 920–1126 Least Significant bits per g (LSb/g)) or ±6 g (sensitivity 306–374 LSb/g), and a resolution of 1 mg. It is provided by STMicroelectronics with a USB connection for power supply and communication with a personal computer. The software interface (Unico 0.7.0.5) is released by the company and allows sensor configuration and the management of data (acceleration, in mg) storage.
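For illustration, a minimal sketch of the conversion from raw accelerometer counts to accelerations in m/s²; the nominal 1024 LSb/g figure used below is an assumption taken from within the 920–1126 LSb/g sensitivity range quoted above for the ±2 g full scale:

```python
import numpy as np

G = 9.80665            # standard gravity, m/s^2
SENS_LSB_PER_G = 1024  # assumed nominal sensitivity for the +/-2 g range
                       # (the range quoted above is 920-1126 LSb/g)

def counts_to_ms2(raw_counts):
    """Convert raw accelerometer output (LSb counts) to acceleration in m/s^2."""
    raw_counts = np.asarray(raw_counts, dtype=float)
    return raw_counts / SENS_LSB_PER_G * G

# Example: a reading of 1024 LSb corresponds to about 1 g
print(counts_to_ms2([1024, -512, 10]))  # [ 9.81, -4.90, 0.096 ] m/s^2
```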

The Kinect v2 sensor is a low-cost Time of Flight (ToF) range camera; the device consists of a depth sensor, a colour camera, an accelerometer, and an array of four microphones. Only depth and colour data were used for this investigation. The depth sensor, composed of an infrared emitter and an infrared camera, is the heart of the Kinect technology; it provides a real-time depth map, that is, an image in which each pixel contains its own distance from the sensor (more precisely, the distance from the reference plane passing through the sensor). The colour camera has the purpose of collecting the texture of the scene, for example, in applications like face tracking. In particular, the Kinect v2 sensor provides a 16-bit depth pixel stream (13 bits represent the pixel depth and the remaining 3 bits are used as a segmentation mask) at 30 frames per second. The colour camera is full HD, with a resolution of 1920 × 1080 pixels returned in the YUY2 raw image format at 30 Hz (the frame rate drops to 15 Hz in low light). A dedicated software tool was developed with the Microsoft Kinect for Windows SDK v2.0 to retrieve data from the sensor. It is based on both the depth map and the colour video stream; it makes it possible to capture in real-time, for each frame, the 3D positions of the edges of a moving chessboard grid target (see Figure 1), while preserving the native acquisition rate (30 Hz).
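The dedicated tracking tool itself is not reproduced here; the following is a minimal sketch of the per-frame logic it describes, assuming OpenCV for the detection of the chessboard target in the colour frame (tracking its inner corners) and a hypothetical map_color_pixel_to_camera_space() callable standing in for the SDK mapping between colour pixels and depth-derived 3D camera coordinates:

```python
import cv2
import numpy as np

PATTERN = (7, 5)  # number of inner chessboard corners (assumed target layout)

def track_target(color_frame, map_color_pixel_to_camera_space):
    """Return the 3D camera-space positions of the chessboard corners.

    color_frame: HxWx3 BGR image from the colour stream.
    map_color_pixel_to_camera_space: callable (u, v) -> (X, Y, Z) in metres,
        assumed to wrap the SDK coordinate mapping between the colour image
        and the depth-derived camera space (hypothetical helper).
    """
    gray = cv2.cvtColor(color_frame, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        return None  # tracking failure for this frame
    # refine the corner locations to sub-pixel accuracy
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01))
    # look up the 3D position of each refined corner
    return np.array([map_color_pixel_to_camera_space(u, v)
                     for u, v in corners.reshape(-1, 2)])
```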

The low-cost GPS receiver is a standard uBlox 6 receiver evaluation kit, able to supply single frequency code and phase observations with an acquisition frequency ranging from 1 to 10 Hz. The GPS data processing was performed exploiting the capabilities of VADASE. The VADASE algorithm only requires the phase observations and the broadcast information provided in real-time by a stand-alone single frequency receiver, whereas the most commonly used scientific or commercial software for real-time GPS data processing requires dual frequency observations (as explained in [22]). The variometric approach, implemented in the VADASE software, is applied to time differences of phase observations continuously collected by the receiver; epoch-by-epoch displacements, basically equivalent to velocities, are then estimated.
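As a rough illustration of the variometric idea (a strongly simplified sketch, not the VADASE implementation), the time-differenced phase observations of all visible satellites can be inverted, epoch by epoch, for a 3D displacement and a receiver clock variation by least squares:

```python
import numpy as np

def epoch_displacement(delta_phase_m, los_unit_vectors):
    """Simplified variometric solution for one pair of consecutive epochs.

    delta_phase_m: (n_sat,) time-differenced carrier-phase observations,
        already scaled to metres and assumed corrected for satellite motion
        and modelled error terms (a strong simplification of the actual
        VADASE observation equation).
    los_unit_vectors: (n_sat, 3) receiver-to-satellite unit vectors.

    Returns the 3D receiver displacement over the epoch interval and the
    receiver clock variation (both in metres).
    """
    n_sat = len(delta_phase_m)
    # design matrix: projection on the line of sight plus a common clock term
    A = np.hstack([-los_unit_vectors, np.ones((n_sat, 1))])
    x, *_ = np.linalg.lstsq(A, delta_phase_m, rcond=None)
    return x[:3], x[3]

# Synthetic check: a 5 mm eastward displacement seen by four satellites
e = np.array([[0.3, 0.5, 0.81], [-0.6, 0.2, 0.77],
              [0.1, -0.7, 0.70], [-0.2, -0.3, 0.93]])
true_dr = np.array([0.005, 0.0, 0.0])
obs = -e @ true_dr + 0.002           # plus a 2 mm clock variation
print(epoch_displacement(obs, e))    # recovers ~[0.005, 0, 0] and ~0.002
```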

Reference data were provided by an acquisition system consisting of a high-speed, high-resolution camera (Mikrotron EoSens) equipped with a Nikon 50 mm focal length lens, capturing high-resolution gray-scale images at up to 500 fps (for the present set of measurements, images were acquired at 250 fps and 100 fps), and a high-speed Camera Link digital video recorder operating in full configuration (IO Industries DVR Express® Core) to manage data acquisition and storage. The captured images were transferred to a personal computer under the control of the Express Core software.

The native kinematic parameters retrieved by the sensors differ: displacements for the Mikrotron EoSens camera, velocities (through the VADASE algorithm) for the uBlox 6, displacements for the Kinect v2 range camera, and accelerations for the MEMS accelerometer. The acquisition rates are also remarkably different: up to 250 Hz for the Mikrotron EoSens camera, 40 Hz for the MEMS accelerometer (the acquisition frequency was set at 40 Hz to cut down the noise), 30 Hz for the Kinect v2, and 5 Hz for the uBlox 6 (on the basis of our previous experience, 5 Hz is the effective maximum frequency for this sensor). Table 2 summarizes the acquisition rate used during the tests, the kinematic parameter supplied by each sensor, and the cost paid (for one full evaluation kit per device); in this respect, given the very fast technological development of sensors, it is worth underlining that these costs have already decreased significantly and will become even lower in the near future. Moreover, as already mentioned, each sensor acquires its observations with respect to its own time scale, and these scales are generally asynchronous.

The MEMS accelerometer, the uBlox 6 receiver, and the chessboard target to be tracked by the cameras were mounted on board the one-direction vibrating table (Figure 1). Both the Kinect v2 range camera and the Mikrotron EoSens camera were placed at a distance of about one meter from the table, with their optical axes orthogonal to the target; the orthogonality was checked with a laser pointer. All sensors were connected to a laptop for storing the acquired observations.

Two oscillation amplitudes (0.02 m and 0.03 m) were tested. For each amplitude, four oscillation frequencies in the range 1.5–3 Hz were set, each kept constant for approximately 15 seconds. The oscillation frequencies were roughly set through the vibrating table controller (a potentiometer); their actual values were determined by analysing the high temporal resolution data acquired with the Mikrotron camera.

3. Analysis of Results: Methodology and Discussion

Displacements, velocities, and accelerations of the vibrating table monitored by the LIS3LV02DQ MEMS accelerometer, the Microsoft Kinect v2 range camera, and the uBlox 6 GPS receiver were compared to those recorded by the Mikrotron EoSens high-resolution camera.

The images acquired by the Mikrotron EoSens camera were processed using a Lagrangian Particle Tracking technique named Hybrid Lagrangian Particle Tracking (HLPT) [27]. Although HLPT was developed for tracking a passive tracer seeding a fluid in fluid mechanics experiments [28], it was successfully employed here to track both the chessboard edges and the texture of the objects undergoing the oscillatory motions. The cornerstone of the image analysis algorithm is the solution of the Optical Flow equation, which expresses the conservation of the pixel brightness intensity $I$ at time $t$:

$$\frac{\partial I(\mathbf{x},t)}{\partial t} + \nabla I(\mathbf{x},t)\cdot\mathbf{u}(\mathbf{x},t) = 0 \qquad (1)$$

where $\mathbf{x}$ is the generic image pixel coordinate and $\mathbf{u}(\mathbf{x},t)$ is the unknown velocity vector at location $\mathbf{x}$. Since (1) alone is insufficient to compute the two unknown velocity components associated with a single pixel, the equation is evaluated over an interrogation window (of given horizontal and vertical dimensions) centered at the pixel location. Equation (1) is solved only for a limited number of image pixels, called features (image portions suitable for tracking because their luminosity remains almost unchanged over small time intervals). HLPT selects image features and tracks them from frame to frame. The matching measure used to associate a feature (and its interrogation window) with its "most similar" region at the successive time is the "Sum of Squared Differences" (SSD) among intensity values: the displacement is defined as the one that minimizes the SSD [29]. In HLPT, the algorithm is applied only to pixels where the solution for the displacement exists: those points are called "good features to track."
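The HLPT code of [27] is not reproduced here; the following sketch shows an analogous feature-based pipeline built on standard OpenCV routines (Shi-Tomasi "good features to track" selection plus pyramidal Lucas-Kanade, which solves the optical flow equation over a small window) and conveys the same idea:

```python
import cv2
import numpy as np

def track_features(prev_gray, next_gray, max_corners=200):
    """Select 'good features to track' and follow them to the next frame.

    prev_gray, next_gray: consecutive 8-bit grayscale frames.
    Returns the per-feature displacement (pixels) between the two frames.
    This mimics the logic described above with standard OpenCV routines;
    it is not the HLPT implementation of [27].
    """
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=7)
    if pts0 is None:
        return np.empty((0, 2))
    pts1, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts0, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return pts1[ok].reshape(-1, 2) - pts0[ok].reshape(-1, 2)
```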

Once the trajectories are reconstructed, displacements, velocities, and accelerations are computed via central differences. Displacement, velocity, and acceleration components belonging to the same frame are arithmetically averaged to compute their time histories. To characterize the reference signal, the mean amplitude and its standard deviation were computed over the entire signal: the mean amplitude for the 0.02 m amplitude test turned out to be 0.0199 m with a standard deviation of 0.0001 m; for the 0.03 m amplitude test the mean was 0.0299 m with a standard deviation of 0.0002 m. The Fast Fourier Transform (FFT) of the displacement data was then employed to identify the four different oscillation frequencies of the vibrating table. The same procedure was also applied to the raw data acquired by each device. Results are presented in Figures 2 and 3.
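A minimal sketch of this processing step (central differences for the kinematic parameters and an FFT to locate the dominant oscillation frequency), using an illustrative synthetic signal with amplitude and frequency of the same order as those used in the tests:

```python
import numpy as np

def kinematics_and_peak(x, fs):
    """Velocity and acceleration by central differences and the peak frequency.

    x: 1D displacement time history (e.g., averaged over the tracked features);
    fs: sampling frequency in Hz.
    """
    dt = 1.0 / fs
    v = np.gradient(x, dt)          # central differences (one-sided at the ends)
    a = np.gradient(v, dt)
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x), dt)
    return v, a, freqs[np.argmax(spec)]

# Illustrative check: a 0.02 m amplitude oscillation at 1.7 Hz sampled at 100 Hz
t = np.arange(0, 15, 0.01)
x = 0.02 * np.sin(2 * np.pi * 1.7 * t)
_, _, f_peak = kinematics_and_peak(x, 100.0)
print(f_peak)  # close to 1.7 Hz (within the ~0.067 Hz FFT resolution)
```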

It is evident that the vibrating table is only roughly a harmonic oscillator, so the frequency peaks are identifiable but they are not perfectly separated from each other.

For the uBlox 6, only up to the third frequency is identified, and the Kinect v2 range camera failed in the test at the fourth frequency with 0.03 m amplitude. To study the four main peaks, the spectra of the low-cost sensors were divided into four intervals (hereinafter, subtests) and a passband filter was applied to each interval in order to better analyse the kinematic parameters of each subtest; the band width was selected by analysing the peaks of the Mikrotron EoSens high-resolution camera power spectra. Subsequently, the filtered results were resampled at 100 Hz through cubic splines to facilitate the comparison and synchronization with the reference data. It is worth noting that the results obtained by processing the Mikrotron EoSens camera data at 100 Hz and 250 Hz were comparable; for this reason only the results at 100 Hz are presented.
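A minimal sketch of the band-pass filtering and 100 Hz cubic-spline resampling applied to each subtest; the Butterworth design, filter order, and SciPy routines are assumptions, since the specific filter implementation is not detailed in the text:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.interpolate import CubicSpline

def isolate_and_resample(t, x, f_low, f_high, fs_in, fs_out=100.0):
    """Band-pass one subtest around an oscillation peak and resample it.

    t, x: original time stamps (s) and signal samples (arrays);
    f_low, f_high: pass band (Hz) chosen around the corresponding peak of
        the reference (Mikrotron) power spectrum;
    fs_in: original sampling rate (Hz); fs_out: common output rate (Hz).
    """
    # zero-phase Butterworth band-pass filter (assumed order 4)
    b, a = butter(4, [f_low, f_high], btype="bandpass", fs=fs_in)
    x_filt = filtfilt(b, a, x)
    # cubic-spline resampling onto a common 100 Hz time axis
    t_out = np.arange(t[0], t[-1], 1.0 / fs_out)
    return t_out, CubicSpline(t, x_filt)(t_out)
```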

Figure 4 shows the results obtained with the three sensors, in the displacement domain, for the lowest frequency (about 1.7 Hz) and 0.02 m oscillation amplitude. It is evident how challenging it was to correctly estimate the oscillation amplitude with the uBlox 6 data, since its Nyquist frequency is only approximately 1.5 times the oscillation frequency (1.7 Hz).

The quantitative measure of the similarity between the kinematic parameters of the low-cost sensors and the reference data is the RMSE, defined as

$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(k_i - k_i^{\mathrm{ref}}\right)^{2}} \qquad (2)$$

where $N$ is the amount of data available within each subtest, $k_i$ is the detected kinematic parameter (displacement, velocity, or acceleration) for the low-cost sensor under investigation, and $k_i^{\mathrm{ref}}$ is the corresponding kinematic parameter detected with the Mikrotron camera.
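Equation (2) translates directly into a few lines of code; the numeric example below is purely illustrative:

```python
import numpy as np

def rmse(k_sensor, k_reference):
    """Root Mean Square Error between a low-cost sensor series and the
    reference (Mikrotron) series, both sampled on the same time axis."""
    d = np.asarray(k_sensor) - np.asarray(k_reference)
    return np.sqrt(np.mean(d ** 2))

print(rmse([0.021, -0.019, 0.0005], [0.020, -0.020, 0.0]))  # ~0.0009 m
```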

Computing the RMSE required the synchronization of the time scales. The time scales were first approximately aligned through cross-correlation; the synchronization was then refined through a linear interpolation, whose slope coefficient was calculated by comparing the zero-crossing times of the Mikrotron EoSens high-resolution camera with the corresponding zero-crossing times of each low-cost sensor. The RMSE was not calculated over all the differences but only over the LE95 population.
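A minimal sketch of the two-step synchronization (coarse alignment by cross-correlation, then a linear time-scale correction estimated from zero-crossing times); the helper names are ours and the zero crossings are assumed to be already matched one-to-one:

```python
import numpy as np

def coarse_lag(ref, sig, dt):
    """Coarse time shift (s) of sig with respect to ref, both sampled at 1/dt.
    Positive output means sig lags ref."""
    c = np.correlate(sig - sig.mean(), ref - ref.mean(), mode="full")
    return (np.argmax(c) - (len(ref) - 1)) * dt

def zero_crossing_times(t, x):
    """Times of upward zero crossings, by linear interpolation between samples."""
    i = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
    return t[i] - x[i] * (t[i + 1] - t[i]) / (x[i + 1] - x[i])

def time_scale_correction(t_sensor, zc_sensor, zc_ref):
    """Map the sensor time axis onto the reference one through a linear fit
    (slope + offset) between matched zero-crossing times."""
    n = min(len(zc_sensor), len(zc_ref))
    slope, offset = np.polyfit(zc_sensor[:n], zc_ref[:n], 1)
    return slope * t_sensor + offset
```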

Results are summarized in Tables 3 and 4, where the means and standard deviations of the residuals are reported as well. Figure 5 shows the RMSE trend of the kinematic parameters retrieved by the MEMS accelerometer and the Kinect v2 range camera as a function of the vibrating table oscillation frequency and amplitude. The MEMS accelerometer RMSE follows an almost constant trend for the first three frequencies, with an increase at the highest one, whereas the Kinect v2 range camera RMSE shows a generally increasing trend. Furthermore, as expected for the Kinect v2 range camera, the maximum value of the RMSE is reached in the test with an oscillation amplitude of 0.03 m and the fourth frequency, which was not properly identified (see Figure 3); in addition, both Table 4 and Figure 3 show that the Kinect v2 failed during the second frequency test with 0.03 m amplitude, probably due to tracking algorithm errors. With regard to the uBlox 6, only the results at the lowest frequency are reported, since only in this case is the oscillation frequency sufficiently lower than the Nyquist frequency (2.5 Hz).

According to the results summarized in Figure 5 and Tables 3 and 4, and considering the displacement results, the RMSE is always lower than or equal to 0.0014 m for the MEMS accelerometer, while for the Kinect v2 it is lower than 0.0012 m, except for two frequencies at 0.03 m amplitude, where the RMSE is between 0.0004 and 0.0005 m. The uBlox 6 RMSE is between 0.0079 and 0.0088 m for both tests.

Tables 3 and 4 also show the results of the correlation analysis aimed at obtaining the correlation parameter, computed with the least squares regression method. To do so, for each amplitude and frequency, the kinematic parameters detected or derived from the low-cost sensor acquisitions were plotted against the reference data. In particular, Figure 6 shows the results for the 0.03 m amplitude test at the lowest frequency; the high correlation values are representative of the effectiveness of the adopted synchronization strategy.

The MEMS accelerometer generally provides the best results for all kinematic parameters (mainly for accelerations and velocities), and the time synchronization is guaranteed even for such a short time interval (15 seconds). Displacement accuracy (RMSE) is within 1.5–2% of the reference solution, except for the highest frequency, where the RMSE increases to about 5% of the reference solution. With regard to velocities and accelerations, the accuracy of the MEMS accelerometer is 1 to 10 times better than that of the other sensors. This is mainly due to its fairly low noise (standard deviation), which is always lower (by up to 1-2 orders of magnitude) than that of the solutions of the other sensors. On the contrary, the bias due to integration becomes quite high for displacements, even over such a short interval, so that it would probably become unacceptable over longer intervals, not investigated here.
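The growth of the integration bias can be made concrete with a minimal numeric sketch (illustrative values only): a constant acceleration bias of 1 mg, equal to the sensor resolution quoted in Section 2, already produces about one meter of spurious displacement after 15 s of double integration if no detrending is applied.

```python
import numpy as np

def integrate_to_displacement(a, fs, detrend=True):
    """Double-integrate an acceleration record (m/s^2) to displacement (m).

    A simple cumulative (rectangle-rule) integration; a small constant bias
    in the acceleration grows quadratically in the displacement, which is
    why the integrated MEMS solution drifts over time.
    """
    dt = 1.0 / fs
    v = np.cumsum(a) * dt
    x = np.cumsum(v) * dt
    if detrend:
        # remove the best-fitting straight line from the displacement
        t = np.arange(len(x)) * dt
        x = x - np.polyval(np.polyfit(t, x, 1), t)
    return x

# A 1 mg (~0.0098 m/s^2) constant bias over 15 s at 40 Hz:
fs = 40.0
bias = 0.001 * 9.80665
x_raw = integrate_to_displacement(np.full(int(15 * fs), bias), fs, detrend=False)
print(round(x_raw[-1], 2))  # ~1.1 m of apparent displacement
```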

The Kinect v2 displays slightly lower performance than the MEMS accelerometer in terms of noise (rather stable across all the tests), but it appears superior in terms of stability (lower bias) on displacements. This suggests that, for indoor applications (e.g., indoor positioning, monitoring, and tracking), the integration of MEMS accelerometers and Kinect range cameras can lead to clear benefits in conditions similar to those of the tests presented here (in terms of kind of movement and acquisition frequency of the sensors); in fact, the two sensors showed quite complementary features in terms of stability and noise. In the test at the highest frequency and 0.02 m amplitude, the Kinect v2 performance is similar to that of the MEMS accelerometer. Displacement accuracy (RMSE) is within 4-5% of the reference solution, except for the two failures already mentioned.

The uBlox 6 investigation is limited to the tests at the lowest frequency, due to the 5 Hz acquisition rate, which is the maximum allowed in our experience; it supplies the worst results overall, mainly due to the aliasing effect even in the lowest frequency tests, which causes an underestimation of the oscillation amplitude by about 30% (Figure 4) and an overall accuracy of around 30% of the reference solution. Nevertheless, the bias on velocities is almost null, comparable to that of the MEMS accelerometer and the Kinect v2 range camera, and the bias on displacements is definitely not significant with respect to the standard deviation. Therefore, it was shown that it is possible to correct the kinematic parameters achieved by the high-resolution camera with the results obtained with the low-cost receiver (even if the real oscillation amplitude is underestimated) and then refer all the solutions to a global time reference frame (Table 1). In general, this means that the data acquired with the adopted sensors can be referred to such an absolute time reference frame thanks to the GPS capability. Hence, for outdoor applications such as structural and infrastructural monitoring and seismology, the integrated use of MEMS accelerometers and GPS receivers represents a promising opportunity. MEMS accelerometers are very precise in retrieving high frequency movements, even with very small amplitude, thanks to their high sensitivity and relatively low noise. GPS receivers (in particular cost-effective ones), instead, usually work within a narrower frequency band, as their acquisition frequency is generally limited to 20 Hz; on the other hand, they are stable and provide valuable details such as position and time information in absolute reference frames.

4. Conclusions and Future Prospects

In this paper, we address the problem of detection and characterization of oscillatory motions through different low-cost sensors: a LIS3LV02DQ MEMS accelerometer, a Microsoft Kinect v2 range camera, and a uBlox 6 GPS receiver. Suitable assessment tests were performed using a one-direction vibrating table in order to first of all demonstrate the potential of such sensors and the related tools. Four oscillation frequencies in the range 1.5–3 Hz and two different amplitudes (0.02 m and 0.03 m), each kept approximately stable for about 15 seconds, were considered in the analysis. The estimated oscillatory motion parameters (accelerations, velocities, and displacements) were compared to those obtained with a Mikrotron EoSens high-resolution camera.

To quantify the sensor performances, the following can be outlined:
(i) All sensors identify the frequencies of the oscillatory motions, provided these are compatible with the Nyquist frequency of their acquisition rate.
(ii) The MEMS accelerometer generally supplies the best solutions with regard to all the kinematic parameters (mainly acceleration and velocity), provided the time synchronization is guaranteed for such a short interval; the RMSE is within 1–2 × 10−3 m, owing to the fairly low noise, but the bias due to the double integration (from accelerations to displacements) becomes quite high for displacements, again even for such a short time interval.
(iii) The Kinect v2 range camera displays slightly lower performance than the MEMS accelerometer in terms of noise (rather stable across all the tests), but it appears superior in terms of stability (lower bias) on displacements; the RMSE is within 1–2 × 10−3 m, apart from two evident failures, probably due to the tracking algorithm.
(iv) The uBlox 6 GPS receiver usage is limited by its low (5 Hz) acquisition rate, causing an aliasing effect and a significant underestimation of the oscillation amplitude (about 30% for the small amplitudes considered, 0.02 m and 0.03 m); the RMSE is therefore around 8–9 × 10−3 m; nevertheless, the almost null velocity bias enables the synchronization of the uBlox 6 with the high-resolution camera and, more generally, with the other sensors, making it possible to represent the obtained solutions in a global time reference frame.

The results obtained are promising with a view to employing these kinds of low-cost sensors in the field of oscillatory motion monitoring. The application fields are manifold (structural monitoring, industrial control system development, ground monitoring, and so on) and the complementarity of these sensors is remarkable (the high temporal stability of the Kinect v2 and the GPS receiver, the low noise of the MEMS accelerometer): this suggests their integration for indoor (MEMS accelerometer + Kinect v2) and outdoor (MEMS accelerometer + uBlox 6 GPS receiver) applications, even in real-time.

With regard to future prospects and possible improvements, some items can be addressed:
(i) Upgrading the Kinect v2 tracking tool and improving the automatic target collimation by optimizing real-time data management, in order to avoid failures (as happened for the second and fourth frequencies of the test at 0.03 m amplitude).
(ii) Considering the possibility of tracking different targets simultaneously with the Kinect v2, together with the possibility of using the Kinect v2 reference frame with axes oriented independently from the object to be monitored (in our tests the optical axis was aligned orthogonally to the object motion direction).
(iii) Investigating the uBlox 6 performances for lower frequency oscillatory motions.
(iv) Repeating the tests with different MEMS accelerometers arranged with different orientations with respect to the predominant motion.
(v) Repeating the tests over longer periods, in order to investigate the effectiveness of the synchronization procedure and possibly refine it.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

The authors are indebted to Dr. Augusto Mazzoni for the fruitful discussions during the experiments and the preparation of the paper and to Mr. Stefano Putgioni for the careful realization of the vibrating table.