RSSI value doesn't match Spec plot
I am trying to record RSSI values of both the center frequency and the sidebands of various stations. The following command is an example of what I am executing to do this:
`python3 kiwirecorder.py -s bonaire.twrmon.net -p 8073 -m am --tlimit=10 --s-meter=0 --sdt-sec=5 --ts -f 860`
However, the RSSI value returned by kiwirecorder.py does not match the Spec plot value at that frequency. Rather, it seems to measure a wider range of frequencies (about 5 kHz on either side), so I get the largest RSSI value within 5 kHz of the tuned frequency when I just want the RSSI value at that frequency.
For example, this station is at 860 kHz:
The peak RSSI value is at about -56 dBm, the sidebands are at about -102 dBm, and the noise level is at -122 dBm.
When I measure the RSSI value at any frequency between 856 and 864 kHz, I get -56 dBm, which I would only expect right at 860 kHz. Similarly, I get -102 dBm at frequencies all the way down to 845 kHz, when I would only expect this down to 850 kHz.
Essentially, the measurement bandwidth seems too wide. I want to narrow it so that I get the RSSI value right at the frequency of interest and my data matches the Spec plot. How can I do this?
I have noticed that if I measure in CW mode instead of AM, the result is much closer to the Spec plot. LSB mode is also slightly more accurate than AM, though it's not clear how these values are obtained.
Comments
If what you want is an individual value in a single bin, as shown by the "Spec" button display, then you need to use kiwiwfrecorder instead. Or, if you want to measure the carrier of an AM signal, just reduce the passband to a low value like 50 Hz to eliminate the contribution of the sidebands.
All of this can be demonstrated rather easily. I used a UK Kiwi and measured the nice strong, steady carrier of TDF on 162 kHz. It has no sidebands. In AM mode with a 10 kHz passband the S-meter says S9+23 and S-meter extension -50 dBm. These two values agree (S9 = -73 dBm, so S9+23 = -50 dBm). I type "/50" in the frequency entry box to drop the passband to 50 Hz. The S-meter values are the same since only the carrier is being measured.
If I press the "spec" button the waterfall value at the carrier frequency is exactly -50 dBm even though this is being computed by a path that is completely independent of the audio/S-meter.
You can do a similar measurement on a digital signal like STANAG 4285 or DRM that has a wide, flat, equal-amplitude spectrum. Measure with the S-meter extension using, say, a 2 kHz passband, then drop the passband to 1 kHz. The S-meter reading will drop by 3 dB: halving the power into the S-meter equals -3 dB. An example of that:
Attachments:
https://forum.kiwisdr.com/uploads/Uploader/fa/35c2fe2f3e58257d4616285532ad1e.png
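The half-power relationship described above is easy to sanity-check with a couple of lines of plain Python (generic decibel arithmetic, nothing Kiwi-specific):

```python
import math

def power_ratio_db(p, p_ref):
    """Express the power ratio p / p_ref in decibels."""
    return 10.0 * math.log10(p / p_ref)

# Halving the power into the S-meter (e.g. 2 kHz passband -> 1 kHz
# passband on a flat digital signal) drops the reading by ~3 dB:
print(round(power_ratio_db(1.0, 2.0), 2))  # -> -3.01
```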
Also, can you explain how the RSSI values are obtained in LSB mode? I want to compare the signal strength of the carrier to the signal strength of the sideband. Right now my method is to simply take various signal strength measurements at the center frequency and the nearby frequencies to determine where the sideband is and how its strength compares to the carrier. Is there a better method?
The RSSI/S-meter value is computed from the audio IQ stream before any demodulation. So the demod mode doesn't matter. Only the passband matters.
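As an illustration of what "computed from the audio IQ stream before any demodulation" implies, here is a minimal Python sketch. The function name, reference level, and plain averaging are assumptions for illustration only, not the actual Kiwi implementation:

```python
import math

def rssi_db(iq_samples, ref_power=1.0):
    """Mean power of complex baseband samples, in dB relative to ref_power.

    No demodulation happens here: AM, LSB, CW, etc. would all see the same
    value for the same passband, which is why only the passband (the filter
    that produced these samples) affects the reading.
    """
    mean_power = sum(abs(s) ** 2 for s in iq_samples) / len(iq_samples)
    return 10.0 * math.log10(mean_power / ref_power)

# A constant-envelope carrier of amplitude 1.0 measures 0 dB
# regardless of its phase rotation:
carrier = [complex(math.cos(0.1 * n), math.sin(0.1 * n)) for n in range(1000)]
print(round(rssi_db(carrier), 6))  # close to 0 dB
```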
It's easy to forget because they both typically tune together. But the audio and waterfall are two entirely independent DDC (digital down conversion) receivers. You can re-tune the waterfall to look at a completely different part of the spectrum without affecting the currently received audio. So a "4-channel" Kiwi is actually an 8-channel DDC SDR. This is why the unbalanced FPGA configurations are possible, i.e. rx8_wf2. The lack of waterfall on the other 6 audio channels is made up for by the audio FFT computed in the browser basically for free.
The equation from jks above says
`s_meter_dBm = 10.0 * log10(s_meter_power / SND_MAX_VAL)`
I think s_meter_power is in mW, so I just need to reverse this equation, which yields:
`power_mW = 10^(s_meter_dBm / 10) * SND_MAX_VAL`
What is SND_MAX_VAL? It's not critical, because it cancels out when I take a ratio, but I'm curious.
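Putting both directions of that equation into Python (treating SND_MAX_VAL as a placeholder normalization constant, since, as noted, it cancels in any power ratio):

```python
import math

SND_MAX_VAL = 1.0  # placeholder; the actual constant cancels in ratios

def dbm_from_power(s_meter_power):
    """Forward direction: s_meter_dBm = 10 * log10(power / SND_MAX_VAL)."""
    return 10.0 * math.log10(s_meter_power / SND_MAX_VAL)

def power_from_dbm(s_meter_dbm):
    """Reverse direction: power = 10^(dBm / 10) * SND_MAX_VAL."""
    return 10.0 ** (s_meter_dbm / 10.0) * SND_MAX_VAL

# Carrier at -56 dBm vs. sideband at -102 dBm (the values from the
# question); SND_MAX_VAL divides out of the ratio:
ratio = power_from_dbm(-56.0) / power_from_dbm(-102.0)
print(round(ratio))  # -> 39811, i.e. the carrier is ~4e4 times stronger
```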