8-10dBm drop in signals after update from v1.601 to v1.675 [fixed in v1.679]
After a year of putting it off (time gets away from me, and it has gotten surprisingly difficult to find anything smaller than a 32GB MicroSDXC locally), I finally tracked down some 16GB MicroSDHC cards so I could make a backup of v1.601 before letting the Kiwi update. A couple of days ago, I let it update to v1.675 and noticed signal levels of local AM stations about 8 to 10dBm lower than they normally are. The difference is also reflected on the spectrum display. While spending a good amount of time remaking antenna connections, testing antennas, and probing the TVS diodes, I noticed the Kiwi now begins to overload at around -28dBm. I suspect a simple S-meter recalibration may be all that's needed, although my SNR numbers also appeared worse than typical (but that could be due to conditions over the past couple of days). Actual performance appears to be the same.
As a test, I re-flashed the Kiwi back to v1.601 from that backup MicroSD card I made, and signal levels of local AM stations are back to normal.
So the question is: is this normal? Did something change between v1.601 and v1.675 that would be expected to cause this "drop"? Is it just an S-meter recalibration that's needed?
I never messed with the S-meter or waterfall calibration. Both are still set to -13dB in v1.601 and v1.675.
This is a KiwiSDR 1 with no built-in attenuator.
Comments
I have not observed such a large change.
Have you tried using the KiWi's internal signal generator to make a comparison between the two versions?
This would rule out most hardware / antenna / off-air signal issues.
Regards,
Martin
Yeah, I very much doubt this. But I suppose now I'll have to spend time setting up a Kiwi here with that old release so I can do some A/B comparisons.
I really don't have time for this. I am fighting so many fires. So many emails...
I'll play around with the signal generator when I get a chance; I honestly haven't spent a lot of time with the signal generator extension, so I'll need to figure out how to use it. The one thing that stood out was that it was overloading (images beginning to appear) at around -28dBm rather than around -22dBm (in my high RF environment), which told me the S-meter was likely off (although the calibration was unchanged at -13dB). Local signals were consistently lower than normal across the band, with the noise floor also lower. I'll take some screenshots of v1.601 and then either allow it to update or reflash it from the v1.676 backup card I made yesterday, and then take screenshots there.
When I put it back to v1.601 yesterday (without touching any of the antenna connections; I had left the end plate off the aluminum case since I already intended to make a backup of the current software), signal levels were back to what I expected.
Okay, I think I found a clue.
Here is the signal generator on v1.601: -50dB attenuation, offset by the waterfall cal of -13dB, matching the readout on the S-meter.
Here is the signal generator on v1.676: -50dB attenuation, but with an offset of 11.6dB (not sure where that came from?), resulting in the S-meter readout being off by 8.7dB, which is consistent with what I'm seeing with signal levels.
How does one change this in the Admin panel? I still see the waterfall and S-meter calibration settings at -13dB.
This is what I saw going from v1.601 to v1.675 (and v1.676), with screenshots taken within 30 minutes of each other; SNR remained the same.
KTIS 900 at -23dBm on v1.601:
KTIS 900 at -32dBm on v1.676:
WCCO 830 at -31dBm on v1.601:
WCCO 830 at -39dBm on v1.676:
The waterfalls may look different if you have used different settings for the FFT filter and averaging.
I think the 11.6dB value is derived from the calibration values, so if it doesn't match your -13dB setting, there is possibly an error.
I seem to remember something like this problem in the back of my mind.
Try changing the calibration values in the earlier version and restart the server, then see if the offset has changed to the correct figure. If it has, change them back to -13dB, do another restart, and see if that has fixed the problem.
Regards,
Martin
I'll give that a try and see.
As of right now, I can set the S-meter calibration to -4dB and get the signals about where I expect them to be. It doesn't sound like anyone else has to do that though, so it would be nice to get to the bottom of it.
There is no perceivable change in performance; it just appears to be the S-meter and possibly the waterfall calibration. It's not an urgent matter to fix, so take your time, John.
As I suspected, there is no issue here. Attached are screenshots from Kiwis running v1.676 (recent), v1.646 (13-Dec-23), and v1.601 (25-Apr-23, about a year ago). Admin config tab cal values: S-meter -16, WF -11.
S9 (-73 dBm) signal from an HP 8657 into the antenna input. The response is the same: S-meter values all at -73 dBm. The noise floor in the spectrum display is the same. The noise lines in the images are junk from the sig gen riding on the outside of the coax shield (i.e. not choked off like it should be).
Sig gen extension differences: the sig gen really needed an independent calibration value, just like the S-meter and waterfall have their independent cals in the admin config tab. Earlier software versions just gave an option for "no cal" or "same cal as the waterfall", and it turned out neither of those was really correct. The idea is that if you inject an S9 signal into the antenna port and set the S-meter and waterfall cals to get S9 displayed, then the sig gen should show an accurate result as well when it is used instead. The 11.6 dB correction seems to be about the right value. The value field now allows an arbitrary value, just like for the S-meter/WF.
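To illustrate the arithmetic, here is a quick sketch; the simple per-path dB-offset model, the function, and the numbers are assumptions for illustration, not the actual Kiwi code:

```python
# Sketch of the calibration arithmetic described above. Not the Kiwi source;
# treat the cal model (a simple dB offset per path) and values as assumptions.

S9_DBM = -73.0          # S9 reference level at HF

def displayed(raw_dbm, cal_db):
    """A reading is the raw path value plus that path's cal offset (dB)."""
    return raw_dbm + cal_db

# Step 1: inject an external S9 signal and pick the S-meter cal so it reads S9.
smeter_cal = -16.0                    # example admin-tab value from this post
raw_external = S9_DBM - smeter_cal    # whatever the uncalibrated path yields
print(displayed(raw_external, smeter_cal))   # -73.0, i.e. S9: cal is right

# Step 2: the internal sig gen enters the signal path at a different point,
# so its raw level differs from the external path by a fixed amount. Reusing
# the waterfall cal (or no cal) therefore misreads by that fixed amount,
# which is why it now gets its own independent cal (~11.6 dB here).
siggen_path_offset = 11.6
raw_siggen = raw_external + siggen_path_offset
print(displayed(raw_siggen, smeter_cal))                       # off by 11.6 dB
print(displayed(raw_siggen, smeter_cal - siggen_path_offset))  # correct again
```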
BTW, don't use the 200mm SMA-SMA cable that comes with the Kiwi-2 for any metrology like this. It's not the highest quality and I've observed it inducing attenuation in some cases. It's fine for its intended purpose with the self-test function.
It must be something specific to mine. I've gone back and forth between v1.601 and the current version several times and get the same results each time: normal signal levels on v1.601, and about 8dBm lower with v1.676 (and v1.677), both with the WF and S-meter cals at -13dB. Again, actual performance and SNR appear unaffected.
I'll put it back to v1.601 tonight and let it update on its own again.
What is the noise floor dBm reading on the digital S-Meter with the RF input disconnected?
Does it vary between software versions?
Regards,
Martin
I let it build from v1.601 to v1.677 earlier instead of reflashing it to v1.676 from the SD card.
The noise floor with the antenna removed appears to be the same between v1.601 and v1.677.
v1.601 (I forgot to flip to the stats tab, but you can tell it's v1.601 from the missing RF tab):
v1.677:
However, the signal levels of local stations (and my elevated noise floor when the antenna is connected) still show an ~8dBm difference between v1.601 and the newly built v1.677. I've done enough back-and-forths to rule out a poor antenna connection as the cause. I didn't physically touch the setup in the moments leading up to this update from v1.601 to v1.677, and the results afterwards were the same as I've reported before.
I'll have to see if I can check out a signal generator from work, or perhaps just finally buy a modern one for the bench. I do have a couple of restored '50s-'60s tube signal generators that I could put on the o-scope and fine-adjust to a known level. It's not terribly important that the S-meter is "correct"; it's always being thrown off by the use of my step attenuator anyway (needed whenever I knock my closest 50kW AM station out of the null of my loop). As it is now, the calibration will probably end up between -4dB and -5dB (rather than -13dB). I've just had it baked into my mind what to expect for signal levels from local stations, and at what point my strongest AM station (with the loop in its usual position) will begin to overload the Kiwi.
OK, I suspect it's related to the CIC filter changes that were made between versions.
There was an issue with the original filter causing problems within the receive passband; this resulted in unwanted signals on adjacent channels interfering with wanted signals, and also some in-band IMD due to mirroring of demodulated signals within the passband.
As a result, the gain values had to be tweaked to compensate for the improved filter shape.
I'm not 100% certain this is the cause, but it seems likely, especially given that the version numbers you are switching between encompass these changes.
Bottom line: I think the sensitivity remains the same, but reception with the later version should be a lot cleaner, especially on the wideband modes such as AM and NBFM.
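For background on why a filter rework forces a gain re-tweak at all, here's a rough sketch; the parameter values are made up for illustration, not the Kiwi's actual ones:

```python
import math

# Why a CIC filter change forces a gain re-tweak: the DC gain of an N-stage
# CIC decimator is (R*M)**N (R = decimation ratio, M = differential delay).
# Parameter values below are illustrative only, not the Kiwi's actual ones.
R, M, N = 512, 1, 5

gain = (R * M) ** N
gain_db = 20 * math.log10(gain)
bit_growth = math.ceil(N * math.log2(R * M))   # register growth inside the filter

print(f"CIC DC gain: {gain_db:.1f} dB ({bit_growth} bits of growth)")
# Change R, M, or N (or add a compensating FIR with its own passband gain)
# and this number moves, so the fixed scaling applied after the filter must
# move too, or every downstream level (S-meter, waterfall) shifts with it.
```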
Regards,
Martin
I think the gain adjustment issue with the CIC FIR filter addition was settled. The wsprdaemon crowd turned up at my door carrying pitchforks, with the evidence being differences in the noise floor graphs on wspr.live of all the wsprdaemon-reporting Kiwis.
I would really like to see the result here of testing with a signal generator.
@jks I could make a .dmg disk image of my v1.601 flasher, upload it to Google Drive, and send it to you if you want to look at it more closely there and experiment, assuming a .dmg image would preserve the proper structure of the flasher.
If you already have a signal generator and are not worried about the calibration accuracy:
Use CW and measure the peak signal level with the dBm-scale digital S-Meter, and compare that against the level of the noise floor with the generator connected but turned off.
Use the same generator setup and measure it again using the other software version.
The difference between the pairs of readings, in dB, should remain the same between software versions, regardless of the calibration accuracy of either the generator or the receiver.
As long as they are the same as each other, the actual receiver sensitivity is also the same with either version, and the problem is just the KiWi calibration value differing between software versions.
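To spell out why the calibration drops out of the comparison, here is a quick sketch; all the numbers are made up, and the 8.7 dB is just an example offset:

```python
# Why the comparison works regardless of calibration: any per-version cal
# error shifts the peak reading and the noise-floor reading by the same
# amount, so it cancels in the difference. All numbers here are made up.

def delta_db(peak_dbm, noise_dbm):
    return peak_dbm - noise_dbm

cal_error = 8.7   # hypothetical offset in one software version

# Same RF conditions, two versions: one reads "true", one is shifted.
v_old = delta_db(-73.0, -110.0)                          # 37.0 dB
v_new = delta_db(-73.0 - cal_error, -110.0 - cal_error)  # still 37.0 dB

assert v_old == v_new   # equal deltas => sensitivity unchanged,
                        # only the absolute scale (calibration) differs
```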
Regards,
Martin
I've just done some tests with a KiwiSDR 1 and an external signal generator (Marconi 2019A). At 1 MHz and -73dBm into the RX, the S-meter displays -70.5dBm in the 4- and 8-RX modes, but -79.2dBm in the 3-RX mode. So there appears to be an 8.7dB difference. (v1.678)
Thanks for those tests, G4DYA. That's exactly what I'm seeing here.
I may have failed to mention it before (although it does show in the stats tab), but I am running 3 channel mode (and have been since I set it up several years ago). It actually crossed my mind to test 4 and 8 channel mode last night, but I haven't done so yet.
Martin, it does appear to have the same sensitivity and SNR between the two versions; it's just that the S-meter calibration is off.
Ah OK, well done!
Richard, excellent remote diagnostics; I hadn't thought of that possibility.
Regards,
Martin
Edit - Oops, sorry Richard, a senior moment on my part.
Sure enough, I just switched my Kiwi to 4 channel "classic" mode and the signal levels are back to normal. So, it appears to only affect 3 channel mode.
😕 Knowing that, I developed and tested a fix for this problem in about 10 minutes. The fix will be in the next release.