Abstract

Statistical techniques based on scaling indices are applied to detect and investigate patterns in empirical time series. The key idea is to use the distribution of scaling indices, obtained from a delay representation of the time series, to distinguish between random and non-random components. Statistical tests for this purpose are designed and applied to specific examples. It is shown that selecting subseries by their scaling indices can significantly enhance the signal-to-noise ratio compared to that of the full time series.
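The two ingredients named above, a delay representation and pointwise scaling indices, can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's actual implementation: the function names, the embedding parameters `dim` and `tau`, and the two radii `r1`, `r2` used for the local count ratio are all illustrative choices.

```python
import numpy as np

def delay_embed(x, dim=2, tau=1):
    """Delay representation: map a scalar series x into dim-dimensional
    vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def scaling_indices(points, r1=0.2, r2=0.4):
    """Pointwise scaling index alpha_i = log(N_i(r2)/N_i(r1)) / log(r2/r1),
    where N_i(r) counts embedded points within distance r of point i.
    Low alpha suggests locally concentrated (structured) regions,
    high alpha suggests space-filling (noise-like) regions."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n1 = (d <= r1).sum(axis=1)  # includes the point itself, so n1 >= 1
    n2 = (d <= r2).sum(axis=1)
    return np.log(n2 / n1) / np.log(r2 / r1)

# Example: embed a deterministic series and inspect its index distribution.
x = np.sin(np.linspace(0, 20, 200))
alpha = scaling_indices(delay_embed(x, dim=2, tau=1))
```

The distribution of `alpha` over all embedded points is the object the statistical tests operate on; subseries selection would then keep only points whose index falls in a chosen range.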