Abstract
This article describes a new method for estimating the optimum filter length in linear prediction, used here to enhance the resolution of a spectrum. In particular, the dependence of the prediction error on the filter length is studied. Calculations on simulated spectra show that the prediction error falls rapidly when the filter length reaches its optimum value. The effect is most pronounced when the spectrum has a good signal-to-noise ratio and the prediction filter coefficients are calculated with the modified covariance method. The method is illustrated with applications to real Raman spectra.
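To make the idea concrete, the sketch below is a minimal, illustrative Python implementation (not the authors' exact procedure) of modified covariance (forward-backward) linear prediction. The function name `modified_covariance_lp` and the synthetic two-sinusoid test signal are assumptions for demonstration; scanning candidate filter lengths and watching for a sharp drop in the mean squared prediction error mimics the criterion described in the abstract.

```python
import numpy as np

def modified_covariance_lp(x, p):
    """Estimate order-p linear-prediction coefficients with the
    modified covariance (forward-backward) least-squares method and
    return the coefficients and the mean squared prediction error.
    Illustrative sketch only."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # Forward rows: predict x[n] from the p preceding samples, n = p..N-1.
    A_f = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    b_f = x[p:N]
    # Backward rows: predict x[n] from the p following samples, n = 0..N-1-p.
    A_b = np.column_stack([x[k + 1:N - p + k + 1] for k in range(p)])
    b_b = x[:N - p]
    # Solve the combined forward-backward least-squares problem.
    A = np.vstack([A_f, A_b])
    b = np.concatenate([b_f, b_b])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    err = np.mean((A @ coeffs - b) ** 2)
    return coeffs, err

# Hypothetical usage: two real sinusoids plus weak noise need a filter
# length of 4, so the prediction error should drop sharply at p = 4.
rng = np.random.default_rng(0)
t = np.arange(512)
signal = np.cos(0.2 * t) + 0.5 * np.cos(0.55 * t) + 0.01 * rng.standard_normal(512)
for p in (1, 2, 3, 4, 5, 8, 12):
    _, err = modified_covariance_lp(signal, p)
    print(f"filter length {p:2d}: prediction error {err:.3e}")
```

In this toy example the abrupt fall of the error at the correct filter length is easy to see because the signal-to-noise ratio is high, consistent with the behaviour reported in the abstract.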
