Question

There's something I don't understand: I compute the spectral density of a signal (by computing its FFT) and that seems to work correctly, but there is always some kind of background noise, even though I'm doing it on a perfect sine wave with 2 frequencies (10 and 30 Hz) that I generate myself.

Of course, the noise isn't really a problem, since it is only visible on a logarithmic scale, but still: where does it come from? Is that normal? Is there a bug in my signal, or somewhere else?

[Figure: Energy spectral density of a 10 + 30 Hz sine wave]
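For reference, a minimal NumPy sketch of the setup described above (the sample rate, duration, and amplitudes are assumptions, since the question doesn't state them). On a log scale, the two peaks sit far above a nonzero floor:

```python
import numpy as np

fs = 1000                       # assumed sample rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)  # 1 second of samples (assumed duration)

# Two-tone "perfect" sine wave: 10 Hz + 30 Hz
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 30 * t)

# Energy spectral density as squared FFT magnitude
X = np.fft.rfft(x)
esd = np.abs(X) ** 2
freqs = np.fft.rfftfreq(len(x), 1.0 / fs)

# Log scale reveals the floor that is invisible on a linear plot
esd_db = 10 * np.log10(esd + np.finfo(float).tiny)
```

With these parameters each tone falls exactly on an FFT bin, so there is no spectral leakage; any remaining floor comes from numerical effects, as the answer below explains.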


Solution

It is mainly quantisation noise, but there may also be a small amount of noise from floating-point rounding errors etc. in the FFT itself.

Your "perfect sine wave" cannot be perfectly represented in digital form, since you will always have finite precision. The difference between the theoretical value of a waveform at the time it is sampled and the actual sample value is called "quantisation error". For N-bit integer data the error will typically be approximately evenly distributed over the range +/- 0.5 LSB and will be notionally "white", i.e. have a roughly flat spectrum. Obviously the greater the sample resolution (greater N), the smaller the quantisation errors, but since N cannot be infinite, there will always be a finite amount of quantisation noise. For N = 16 bits, as used in e.g. "CD quality" digital audio, the quantisation noise is typically around 96 dB below full scale.
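This is easy to verify numerically. The sketch below (parameters are assumptions, not from the question) quantises the two-tone signal to 16-bit integers, checks that the error stays within +/- 0.5 LSB, and measures a signal-to-quantisation-noise ratio in the mid-90s of dB, consistent with the rule of thumb above:

```python
import numpy as np

fs = 1000                       # assumed sample rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)

# Scale to 0.5 per tone so the sum stays within [-1, 1] full scale
x = 0.5 * (np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 30 * t))

# Quantise to N-bit signed integers, then back to float
N = 16
scale = 2 ** (N - 1) - 1        # 32767 for 16-bit
xq = np.round(x * scale) / scale

# Quantisation error is bounded by half a least significant bit
err = xq - x
lsb = 1.0 / scale

# Signal-to-quantisation-noise ratio in dB
snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(err ** 2))
```

The measured `snr_db` lands near the theoretical ~96 dB figure for 16-bit audio; repeating with a smaller N visibly raises the noise floor in the spectrum.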

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow