
Markus Nentwig ● June 9, 2012 ● Coded in Matlab

Estimates the delay and scaling factor between two signals by iteration.

Introduction

Many DSP-related tasks involve an unknown time delay between signals that needs to be estimated, and possibly corrected. The code snippet presents a "quick-and-dirty" solution: it is not intended for implementation on a DSP, but for testbenches and the like, where robustness and simplicity matter more than efficiency.

The topic is well researched; the references in [1], for example, are a good starting point.

A Matlab implementation that is stated to be maximum-likelihood (which would be the "gold standard" for delay estimation) can be found in [2].

Figure 1 shows an example signal that has been delayed and scaled. The code snippet estimates delay and scaling factor, "un-delays" and scales back so that the output ideally matches the input.

The algorithm is simple but robust. It relies on FFT-based cross-correlation, as illustrated for example in [3].
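To illustrate the FFT-based cross-correlation step, here is a minimal NumPy sketch (an illustration in Python, not the author's Matlab code) that estimates the coarse, integer-lag cyclic delay:

```python
import numpy as np

def coarse_delay(x, y):
    """Integer-lag estimate of the cyclic delay of y relative to x,
    taken from the peak of the FFT-based cross-correlation."""
    n = len(x)
    # cyclic cross-correlation computed in the frequency domain
    xc = np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x)))
    k = int(np.argmax(np.abs(xc)))
    return k - n if k > n // 2 else k   # map lag into [-n/2, n/2]
```

This gives the delay only to the nearest sample; the snippet's iteration refines it further.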

More sophisticated frequency-domain methods such as [4] are more accurate and efficient, but the difficulty of resolving the +/- pi phase ambiguity may cause them to fail for some input signals, depending on the power spectrum and group-delay profile. The presented code snippet finds the time delay that maximizes the cross-correlation, regardless of whether "the" delay between the signals is well defined over frequency or not.

Signals do not need to be coarse-aligned or even arrive in the right order (see "cyclic signals" section below).



Figure 1: delayed / scaled signals

Algorithm

The method isn't particularly imaginative - and that's an advertised feature, as little can go wrong:

Crosscorrelate, find the highest peak, make an educated guess where the maximum is hiding between the bins, time-shift and repeat.
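The loop described above can be sketched as follows. This is a simplified NumPy illustration of the idea, not the author's Matlab implementation; the function name, convergence threshold, and iteration cap are my own choices:

```python
import numpy as np

def estimate_delay(ref, sig, n_iter=20):
    """Iteratively estimate the (possibly fractional) cyclic delay of
    sig relative to ref: cross-correlate via FFT, refine the peak with
    a quadratic fit, remove the estimated delay, and repeat."""
    n = len(ref)
    f = np.fft.fftfreq(n)               # normalized bin frequencies
    R = np.conj(np.fft.fft(ref))
    S = np.fft.fft(sig)
    total = 0.0
    for _ in range(n_iter):
        xc = np.abs(np.fft.ifft(S * R))           # cyclic cross-correlation
        k = int(np.argmax(xc))                    # integer lag of the peak
        yl, ym, yr = xc[(k - 1) % n], xc[k], xc[(k + 1) % n]
        denom = yl - 2.0 * ym + yr
        frac = 0.0 if denom == 0.0 else 0.5 * (yl - yr) / denom
        d = (k if k <= n // 2 else k - n) + frac  # residual delay estimate
        if abs(d) < 1e-9:                         # converged
            break
        total += d
        S = S * np.exp(2j * np.pi * f * d)        # un-delay in frequency domain
    return total
```

At the fixed point the cross-correlation is locally symmetric around its peak, so the fractional correction vanishes and the accumulated delay stops changing.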

The flowchart gives a general overview.



Figure 2: Algorithm

The actual iteration loop is somewhat more complex, as finite numerical accuracy gives bitwise identical values of the (shallow) correlation peak over some delay range. If this is observed for two points in the search window, the algorithm continues to improve the third point, or exits if all three are identical. For a required accuracy down to 10⁻⁵ T_sample, this is usually not an issue.

The "educated guess" fits a quadratic polynomial through the three points of the search window and locates its maximum.
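For three equally spaced correlation values, the maximum of the fitted quadratic has a simple closed form. A small sketch of this standard parabolic-interpolation step (the helper name is mine, not from the snippet):

```python
def parabolic_peak(y_left, y_mid, y_right):
    """Estimate the fractional offset of a maximum from three equally
    spaced samples by fitting a quadratic polynomial through them.
    Returns the offset of the vertex relative to the middle sample."""
    denom = y_left - 2.0 * y_mid + y_right
    if denom == 0.0:        # flat triple: no usable curvature
        return 0.0
    return 0.5 * (y_left - y_right) / denom
```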

After some iterations, finite numerical accuracy will cause nonsense guesses. If the predicted location of the maximum falls outside the search window, it is discarded and linear interpolation is used instead; the algorithm then still converges, if slowly.

The polynomial equation is derived in the accompanying input file for the open-source Maxima computer algebra system.

Methods are known for interpolating peaks in FFT data [5], [6] (note that the problem appears similar, but is probably not exactly the same). These could be used to speed up convergence, but I haven't investigated them, as the numerical accuracy of the input data appears to be the main limitation.

Cyclic signals

The signals are treated as cyclic. The first and last sample points have no special meaning, i.e. circularly shifting both input signals by the same amount gives the same result.
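Treating the signals as cyclic means that a delay, including a fractional one, can be applied as a linear phase ramp in the frequency domain. A NumPy sketch of this standard technique (an illustration, not the author's code):

```python
import numpy as np

def cyclic_delay(x, d):
    """Apply a (possibly fractional) cyclic delay of d samples to a
    real-valued signal via a linear phase ramp in the frequency domain."""
    n = len(x)
    f = np.fft.fftfreq(n)                       # normalized bin frequencies
    X = np.fft.fft(x) * np.exp(-2j * np.pi * f * d)
    # for real input, discard the (tiny) imaginary residue caused by
    # the Nyquist bin when d is fractional
    return np.fft.ifft(X).real
```

For integer d this reproduces a circular shift exactly; for fractional d it performs ideal bandlimited interpolation, which is where the ringing around sharp transients becomes visible.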

Delaying a signal with sharp transients may reveal ringing that is usually invisible, as long as one stays on the sampling grid:

Nyquist's vestigial sideband theorem [7] promises "no intersample interference", but it keeps quiet about the times between samples...

The delay is returned in the interval [-n/2, n/2], where n is the signal length.
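The mapping into this interval amounts to a modulo operation; a hypothetical one-liner (not from the snippet):

```python
def wrap_delay(d, n):
    """Map a cyclic delay estimate into the interval [-n/2, n/2)."""
    return (d + n / 2.0) % n - n / 2.0
```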

Code snippet

The code snippet includes a demo function that sets up test signals with a known delay. It calls iterDelayEst(...), applies the estimated delay / scaling factor and compares. The result is shown in Figure 3.



Figure 3: Output

Conclusion

The function should return delay and scaling factor to maximize the crosscorrelation, and not be fussy about it.
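Once the delay is removed, the scaling factor that maximizes the cross-correlation is simply the least-squares coefficient between the two signals; sketched here with a hypothetical helper name:

```python
import numpy as np

def scale_factor(ref, sig):
    """Least-squares scaling factor a minimizing ||ref - a * sig||^2.
    For complex signals: a = <sig, ref> / <sig, sig>."""
    return np.vdot(sig, ref) / np.vdot(sig, sig)
```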

It's a cleaned-up rewrite; please let me know (comment!) in case of problems.

References

[1] Y. Zhang; W. H. Abdulla: A Comparative Study of Time-Delay Estimation Techniques Using Microphone Arrays

[2] R. Moddemeijer: An information theoretical delay estimator

[3] J. del Peral-Rosado et al: Preliminary Analysis of the Positioning Capabilities of the Positioning Reference Signal of 3GPP LTE

[4] M. Nentwig: Delay estimation by FFT

[5] R. Lyons: Accurate Measurement of a Sinusoid's Peak Amplitude Based on FFT Data

[6] Tutorial: Interpolating the peak location of an FFT

[7] Lecture notes: Pulse shaping and equalization