Define correct scipy.signal.spectrogram input parameters
I have the following code:
import numpy as np
import matplotlib.pyplot as plt
from scipy import signal

sampling_rate = 128
N = sampling_rate
# nperseg = nfft = 128 samples per segment, noverlap = 127 (hop of a single sample)
_f, t, Sxx = signal.spectrogram(_signal, sampling_rate, nperseg=N, nfft=N, noverlap=N - 1, mode="complex")
cm = plt.pcolormesh(t, _f, np.log(np.abs(Sxx)), cmap="viridis")
plt.savefig('Spectrogram.png', dpi=300)
which is giving me the following plot:
What is the correct way to choose the parameters nperseg, nfft, and noverlap so that I get a correct, smooth plot?
Thank you!
Solution 1:[1]
I was able to solve this issue by normalising the input signal and doing:
from scipy import signal
import numpy as np
from matplotlib import cm
from PIL import Image

# scipy.signal.spectrogram returns (frequencies, times, Sxx) in that order
f, t, Sxx = signal.spectrogram(filt_signal[sampleIDX], fs=128, nperseg=200, noverlap=180, return_onesided=True)
Sxx = np.log(Sxx)                                             # log-power for better contrast
Sxx = (Sxx - np.min(Sxx)) / (np.max(Sxx) - np.min(Sxx))       # normalise to [0, 1]
image_Sxx = Image.fromarray(np.uint8(cm.viridis(Sxx) * 255))  # apply viridis and convert to an RGBA image
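If you then want to write the result to disk or plot it against the frequency/time axes, a minimal sketch could look like the following (this assumes the f, t, Sxx and image_Sxx variables from the snippet above, plus matplotlib; the filenames are hypothetical):

import matplotlib.pyplot as plt

# Save the colour-mapped PIL image directly.
image_Sxx.save('spectrogram.png')

# Or plot the normalised log-spectrogram with labelled axes.
plt.pcolormesh(t, f, Sxx, cmap='viridis')
plt.xlabel('Time [s]')
plt.ylabel('Frequency [Hz]')
plt.savefig('spectrogram_pcolormesh.png', dpi=300)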
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ptb |