Noise floor depending on the zoom? [it's designed to work that way]

fbxfbx
edited December 2018 in Problems Now Fixed
Hi,

here I noticed that if I enable the spectrum display, the noise floor varies depending on the zoom level. Zooming in further
lowers the displayed noise floor by roughly 3-4 dB for every zoom step.

Is anyone else experiencing the same? It does not seem quite right to me.

Thanks
fbx

Comments

  • I see that the noise floor displayed in the spectrum does change with zoom. However, for a given detection filter, say USB, the S meter is independent (a separate process), as it should be. I take that to mean that the bin size is changing with zoom and the correction factor hasn't been correctly applied at each step.
    One also has to ask what is desired as the zoom and Hz/bin change. Since noise depends upon bandwidth and detector type, would it be most desirable to always normalize to the same value, say 1 Hz? If there are 1024 bins across the spectral display, what do you want each bin to indicate?
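    As a rough sketch of what is being discussed here, assuming the display always uses 1024 FFT bins and the displayed span halves with every zoom step (both assumptions, not confirmed KiwiSDR internals), the per-bin bandwidth and the 1 Hz normalization correction would be:

    ```python
    import math

    FULL_SPAN_HZ = 30e6   # assumed full span at zoom 0
    NUM_BINS = 1024       # assumed number of bins across the spectrum display

    def bin_width_hz(zoom):
        """Bin width if the displayed span halves with every zoom step."""
        return (FULL_SPAN_HZ / 2**zoom) / NUM_BINS

    def normalized_to_1hz(bin_power_db, zoom):
        """Per-bin noise power referred to a 1 Hz bandwidth."""
        return bin_power_db - 10 * math.log10(bin_width_hz(zoom))

    for z in (0, 6, 12):
        print(f"z{z:2d}: bin width = {bin_width_hz(z):8.1f} Hz, "
              f"1 Hz correction = {10 * math.log10(bin_width_hz(z)):5.1f} dB")
    ```

    Because the bin width halves with each zoom step, the per-bin noise power drops by 10*log10(2) ≈ 3 dB per step, which matches the 3-4 dB change the original post observed.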
  • Hi,
    the point about binning is a good one, and IMO the goal should be a consistent, zoom-independent
    visualization. I would choose to normalize to a value that gives a visual output similar to what
    other high-end SDRs give. I suspect that would be a few hertz, as I often see Perseus or Elad screenshots
    with the noise floor at -130 dB or so.
    ... thoughts?
    f
  • OpenHPSDR has a check box (checked by default) that normalizes each bin to 1 Hz. The spectral display/waterfall has its own selection of detection type: average (multiple types), sample, peak... I would be happy with an averaging detector that properly measures noise power, just like the S meter, at a 1 Hz normalized BW.

    Such an arrangement might not make everyone happy. Some would rather have accurate measurement of peaky coherent signals such as SSB. As it is, the peak display on the spectrum varies with tuning and often misses the maximum of a coherent signal. I can live with this, since the receiver/detector/S meter can always be used to measure it and get a correct answer, as long as the user knows that the spectral display isn't a full-featured spectrum analyzer but more of an indication of the spectrum. I'm interested in broadband noise contributors more than individual signal amplitudes, so I would prefer an accurate noise display across the spectrum over an accurate peak reading of coherent signals (and what time constant is 'right' for that, anyway?).

    Glenn n6gn
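    To illustrate the difference between the two detector types described above, here is a minimal sketch (not KiwiSDR code) of an averaging detector and a peak detector applied per bin across successive FFT frames:

    ```python
    import numpy as np

    def average_detector(frames_db):
        """Average the linear power per bin across frames; returns dB.
        Converges to the true noise power, much like an S-meter reading."""
        linear = 10 ** (np.asarray(frames_db) / 10)
        return 10 * np.log10(linear.mean(axis=0))

    def peak_detector(frames_db):
        """Keep the maximum per-bin value across frames.
        Better at catching peaky coherent signals such as SSB."""
        return np.asarray(frames_db).max(axis=0)

    # 50 frames of 1024-bin noise-like spectra around -140 dB (synthetic data)
    frames_db = -140 + 10 * np.log10(np.random.exponential(size=(50, 1024)))
    print("average detector:", round(float(average_detector(frames_db).mean()), 1), "dB")
    print("peak detector:   ", round(float(peak_detector(frames_db).mean()), 1), "dB")
    ```

    On noise alone the peak detector reads several dB higher than the average detector, which is one reason the choice of detector matters for a spectrum display.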
  • I am not sure I understand your second paragraph. One could keep the binning as it is and correct the final values so that the output is consistent across different zoom levels. In the end, the differences between zoom levels should be constant offsets that depend on the bin width, which would not be too difficult to apply. If other SDRs do this, it may make sense here too.
    Or am I missing something?
    f
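    If the correction really is just a constant per zoom level, it could be as simple as the following sketch (again assuming the bin width halves with each zoom step, which is an assumption rather than confirmed behavior):

    ```python
    import math

    def zoom_correction_db(zoom, ref_zoom=0):
        """Offset to add so spectra taken at different zoom levels line up.
        Each zoom step halves the bin width, i.e. 10*log10(2) ≈ 3 dB per step."""
        return 10 * math.log10(2) * (zoom - ref_zoom)

    # Example: a noise-floor reading taken at z12, referred back to the z0 scale
    reading_at_z12_db = -155.0
    print(reading_at_z12_db + zoom_correction_db(12))   # about -118.9
    ```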
  • I was going on to another variable: detector type. Some want to measure and see noise displayed in a spectrograph. Others want to see the peak of a signal with a high peak-to-average ratio, such as SSB. If I can only have one detector type, I would choose to measure noise, just like the S meter does. Not everyone may agree with this.
  • An answer to my questions to JKS contains some hints regarding the 3 dB drop of the noise floor:

    I have a question regarding the KiwiSDR spectrum scope.

    I assume that the bandwidth of the spectrum scope depends only on the zoom level, independent of the receiver settings?

    When I disconnect the antenna and terminate the receiver with 50 ohms, set a frequency around 15 MHz, and select zoom level 12 on the spectrum scope (which is ~7 kHz bandwidth), then a ‘noise floor’ of ~-155 dBm is displayed.

    When I select zoom level 0 (= 30 MHz bandwidth), a ‘noise floor’ of -110 dBm is displayed. This makes sense, as the noise level is proportional to the bandwidth. The S-meter shows a different figure depending on the actual bandwidth selected in the waterfall?

    But now to my central question.

    How can it be that such a low ‘noise floor’ of -155 dBm is displayed when the receiver is terminated?

    The minimum noise power, given by the product of the Boltzmann constant, the temperature, and the bandwidth, is -174 dBm/Hz at ambient temperature when normalized to a 1 Hz bandwidth.

    Considering a bandwidth of ~7 kHz at zoom level 12 and an ideal receiver with no noise of its own, 38.5 dB (10*log10(7000)) needs to be added. So I should rather see a higher noise floor of ~-136 dBm instead of -155 dBm?

    Considering also the receiver noise figure, it should be even higher.
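    The arithmetic in the question can be checked directly. A small sketch of the expected thermal floor for a given bandwidth and noise figure (the -174 dBm/Hz density is the standard kT0 value at ~290 K; the 10 dB noise figure below is just a hypothetical example):

    ```python
    import math

    KT0_DBM_PER_HZ = -174.0  # thermal noise density at ~290 K

    def expected_noise_floor_dbm(bandwidth_hz, noise_figure_db=0.0):
        """Thermal noise floor integrated over the given bandwidth."""
        return KT0_DBM_PER_HZ + 10 * math.log10(bandwidth_hz) + noise_figure_db

    print(expected_noise_floor_dbm(7000))        # ≈ -135.5 dBm, ideal receiver
    print(expected_noise_floor_dbm(7000, 10.0))  # ≈ -125.5 dBm with a 10 dB NF
    ```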

    ------------------------------------------------------------------------------------------------------------------------------------------------------

    There are a number of things going on here:

    1) In the waterfall, as the zoom is changed, the value of the “WF min” slider is adjusted by +/- 3 dB per zoom level. This attempts to keep the noise floor a consistent color. That is, if you have WF min set such that the noise floor is shown in a shade of blue rather than black, then adjusting the zoom should preserve that color. But this 3 dB adjustment has nothing to do with the spectrum display or the S-meter (which is derived from the audio DDC, not the waterfall).

    But this situation breaks down when there is no antenna connected because of dynamic range limitations in the data delivered by the waterfall DDC. It is only 16 bits wide, compared to the 24 bits of the audio DDC. When zooming in with no antenna connected (or a terminated antenna port), a zoom level above 9 or 10 will show a much darker color and then suddenly go to black as the 16 bits are exhausted. This is not a problem when a real antenna is connected, unless it has very low sensitivity or the band is especially quiet.
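    For a rough sense of why the 16-bit waterfall data runs out so much sooner than the 24-bit audio data, the ideal quantization dynamic range works out to roughly 6 dB per bit:

    ```python
    def ideal_dynamic_range_db(bits):
        """Approximate full-scale to quantization-noise ratio, ~6.02 dB per bit."""
        return 6.02 * bits + 1.76

    print(ideal_dynamic_range_db(16))  # ≈ 98 dB  (waterfall/spectrum DDC)
    print(ideal_dynamic_range_db(24))  # ≈ 146 dB (audio DDC)
    ```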

    I’ve actually thought about reducing the maximum waterfall/spectrum zoom level because of this and having a separate high-resolution FFT extension that uses the 24-bit audio data. Some people have been asking for zoom levels greater than 14. Adding more bits to the waterfall DDCs isn’t possible because of FPGA resource limitations.

    2) The dB scale on the spectrum display is really “dBFS”. It isn’t dBm unless you’ve calibrated it, using the settings on the configuration tab of the admin page, for the particular antenna you’re using. But this is true for any receiver/antenna combination, not just the Kiwi. There are settings there for calibrating the spectrum scale as well as the S-meter.

    But the ~3 dB drop in the spectrum signal strength as the zoom is changed is a real effect. This is a result of the processing gain of the DDC, just like in a real spectrum analyzer. But the 16-bit dynamic range limitation comes into play here also. With no antenna, the kT0 noise floor should set a lower limit. But because there aren’t enough bits, you get non-linear behavior. It drops by 3 dB per step up until z10 at -140 dBFS. But then z11 drops by 10 dB to -150, and z12 to -160, which is obviously wrong. z13/z14 goes off-scale low or shows clipping.
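    A toy model of the behavior described here: the displayed floor drops ~3 dB per zoom step until it reaches the limit set by the 16-bit data, beyond which the readings are no longer meaningful. The -110 dBFS starting value and the -140 dBFS limit simply restate the figures in the text; the clamping itself is only an illustration, not the actual Kiwi code:

    ```python
    def displayed_floor_dbfs(zoom, floor_at_z0=-110.0, limit_dbfs=-140.0):
        """Expected floor: 3 dB lower per zoom step, until the 16-bit range runs out."""
        ideal = floor_at_z0 - 3.0 * zoom
        # Below the limit the real display misbehaves; here we just clamp.
        return max(ideal, limit_dbfs)

    for z in range(15):
        print(f"z{z:2d}: {displayed_floor_dbfs(z):7.1f} dBFS")
    ```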

    But these problems should not affect normal operation when there are real signals on the antenna port.

    Best regards,

    John, ZL/KF6VO
    KiwiSDR