Re-designing the OV marker space?

Just a note: as soon as there are really strong signals, the red "OV" marker appears on the right side of the S-meter, covering the current dBm value so the user cannot see it. At the worst times, the red OV can stay displayed for long stretches. Would it therefore make sense to turn the background of the dBm value placeholder red, instead of blocking the value altogether with the "OV" marker?

73 Jari


  • Hi Jari,

    I suspect that when the OV indicator is operating, the ADC will be overloaded to the clipping point and will have 'run out of bits', so the dBm value will no longer be valid.


    Martin - G8JNJ
  • Hi Martin, I don't know ... the S-meter keeps working even while the OV indicator is on, showing a reading (based on the dBm value, I guess) for the signal on the current frequency ... just wildly guessing here :)

    73 Jari OH6BG
  • Hi Jari,

    I think it depends upon how much of an overload is occurring.

    If it is only very short duration and infrequent (just on the edge) then there is a chance that the reading may be OK.

    But if it is a big signal and practically constant, then it's probably not.

    However, I don't think you could rely 100% on the dBm value being accurate while the OV warning is present.

    SDRs don't behave the same way as traditional 'analogue' receivers when subjected to strong signals. They don't fail quite so gradually and tend to be 'all or nothing', as is the case with most digital circuits and processes, so you don't get much warning before it happens :-(


    Martin - G8JNJ
  • Hi John,

    I wonder if it would be worthwhile bringing the OV indication out to one of the P9 GPIO pins?

    It could then be used to drive an external attenuator circuit via a suitable averaging circuit, so that a crude form of AGC could be applied. This could possibly help folks in locations that suffer from excessive day/night variations in signal levels.


    Martin - G8JNJ
  • That's a good idea. Maybe a GPIO pin could drive a PIN diode attenuator on an OV trigger, or, in an overnight strong-signal environment, a GPIO pin could be set by a timer to kick in -10 dB across everything??
  • Yes, I was thinking that it may be possible to integrate the OV pulses and use the resultant voltage to drive a PIN diode ahead of the KiWi.

    It would then 'self stabilise' at a suitable value of gain reduction.

    It should be possible to do this with a very simple circuit, maybe just a transistor and RC network feeding the PIN.
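    Just to put rough numbers on the averaging idea (a minimal sketch, not the actual circuit; the 3.3 V logic level and R/C values are illustrative assumptions), the OV pulse train smoothed by an RC network behaves like a first-order low-pass whose steady-state output tracks the pulse duty cycle:

    ```python
    # Sketch: model an OV pulse train smoothed by an RC network into a PIN bias.
    # Component values (3.3 V logic, R = 10k, C = 10 uF) are illustrative only.

    def rc_filter(pulses, dt, r=10e3, c=10e-6, v_high=3.3):
        """First-order RC low-pass: v += (v_in - v) * dt / (R*C)."""
        tau = r * c
        v = 0.0
        out = []
        for p in pulses:
            v_in = v_high if p else 0.0
            v += (v_in - v) * dt / tau
            out.append(v)
        return out

    # 30% duty-cycle OV pulses at 1 kHz, simulated for 1 second
    dt = 1e-4
    pulses = [1 if (i % 10) < 3 else 0 for i in range(10_000)]
    bias = rc_filter(pulses, dt)
    # Steady-state bias settles near duty_cycle * v_high = 0.99 V
    ```

    More overload means a higher OV duty cycle, hence more PIN bias and more attenuation, which is exactly the self-stabilising behaviour described above.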


    Martin - G8JNJ
  • Yep, a GPIO with PWM drive to RC storage to PIN bias.... something like that ;-)
  • The problem with that feedback approach is that the loop gain falls to zero at the desired operating point (no OV, ever). There is also the issue that broadband attenuation hurts sensitivity/noise floor everywhere.
    The ideal solution is location/antenna specific and may change with time; e.g. MW BCB stations often change ERP day/night. If a big daytime ERP forces lots of attenuation, then 10 m (say) gets its noise floor pushed up, possibly above that of the incoming propagated noise, and the receiver suffers.
    This may be adequate for some intended uses but a more flexible, tailored architecture might be worth considering.
  • "The problem with that feedback approach is that the loop gain falls to zero at the desired operating point (no OV, ever). There is also the issue that broadband attenuation hurts sensitivity/noise floor everywhere."

    Yes that's true, and if it was just one or two very strong stations then notching is definitely the best solution.

    However, in my experience in Europe it tends to be more of an issue at night with SW BC stations, when propagation changes bring up a specific BC band to the point at which the OV indication starts to cut in. Often applying just a few dB of attenuation solves the problem for the period during which it occurs, usually only a few hours at most. At night the HF spectrum tends to be less active anyway, so I don't think such a technique would in practice be particularly noticeable or problematic.

    I guess some experimentation is required.


    Martin - G8JNJ
  • edited April 2019
    FWIW, I attacked this problem with the much lower-performance RTL-SDR dongles (8 bit A/D converters) - and you may read about that here:

    This is being applied on the 90/80, 60, 41/40 and 30 meter receivers (RTL-SDR based) at the Northern Utah WebSDR and it seems to be very successful: Even though the disparate amplitudes experienced on the air in the Western U.S. are nothing like what might be experienced in Europe, there is still a lot of range to be handled. For example, in the early-mid evenings, signals on the order of -30dBm from SWBC stations may be present on the 41 meter band, but microvolt-level signals (around -95 to -100 dBm) are readily audible: The combination of the AGC action, oversampling and what amounts to "noise dithering" makes the seemingly impossible task of simultaneously handling 60+ dB dynamic range on an 8 bit A/D possible. To be sure, an A/B comparison of the "narrowband" 40 meter receiver on weak signals reveals that there is some audible degradation in the RTL-SDR's signal path, but in most cases, it might not be noticed without having done such a comparison.
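    The dynamic-range arithmetic behind that claim can be sketched (the sample rate and channel bandwidth below are assumed, typical RTL-SDR-style figures): an ideal N-bit converter gives roughly 6.02·N + 1.76 dB of SNR, and with adequate dithering, decimating from the full sample rate to the channel bandwidth adds processing gain:

    ```python
    # Sketch: approximate usable dynamic range of an oversampled, dithered 8-bit ADC.
    import math

    def ideal_adc_snr_db(bits):
        """Ideal SNR of an N-bit ADC over its full Nyquist bandwidth."""
        return 6.02 * bits + 1.76

    def oversampling_gain_db(fs, bw):
        """Processing gain from decimating fs down to a 2*bw output bandwidth."""
        return 10 * math.log10(fs / (2 * bw))

    # Assumed RTL-SDR-like numbers: 8-bit ADC, 2.4 Msps, 5 kHz channel
    snr = ideal_adc_snr_db(8) + oversampling_gain_db(2.4e6, 5e3)
    print(f"approx usable dynamic range: {snr:.0f} dB")
    ```

    With these assumptions the result comes out in the low 70s of dB, consistent with handling the 60+ dB spread mentioned above.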

    The log amp chip (the AD8307) is broad-banded enough that it should be able to handle the entire HF receive frequency range of the Kiwi - but even so, you would want to precede the log amp with a bandpass filter commensurate with the Kiwi's response to prevent the AGC from "keying" on out-of-band signals.

    As mentioned in this thread: ( ) it would be a very good idea to put a "limited attenuation high-pass filter" in front of the Kiwi, not to mention an appropriate amount of RF amplification - following the AGC circuit - and then adjust the AGC "knee" for something like 12 dB below the OV indication.

    Adding an AGC would, of course, skew the S-meter calibration during the periods that it was operating, but this is likely an acceptable trade-off. If this really bothered someone, a representation of the AGC's action (e.g. a voltage representation of the attenuator's bias current, using a fixed-level "local" signal as an amplitude reference) could be used as a correction factor.
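    That correction factor amounts to simple dB arithmetic (function names below are mine, purely illustrative): infer the attenuation currently in effect from how far the fixed local reference reads below its known level, then add it back to every S-meter reading:

    ```python
    # Sketch: correct S-meter readings for AGC attenuation using a known local reference.
    # ref_known_dbm is the calibrated level of a fixed local signal; names are illustrative.

    def current_attenuation_db(ref_known_dbm, ref_measured_dbm):
        """Attenuation in effect = how far the reference reads below its true level."""
        return ref_known_dbm - ref_measured_dbm

    def corrected_dbm(measured_dbm, atten_db):
        """Add the attenuation back so the reading is antenna-referred again."""
        return measured_dbm + atten_db

    atten = current_attenuation_db(ref_known_dbm=-60.0, ref_measured_dbm=-68.0)
    print(corrected_dbm(-95.0, atten))  # a -95 dBm reading is really -87 dBm
    ```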

    Years ago, in writing firmware for an SDR-based HF transceiver (the mcHF) I did a similar scheme in software, but I had at my disposal a 1 dB (approx) step PGA (Programmable Gain Amplifier) at the signal input to the A/D, and alterations of the RF signal path's gain could be compensated on the fly with no effect on S-meter accuracy visible to the users. This method was described in this blog entry:

    While no similar utility is known to exist on the KiwiSDR natively, a binary-controlled step attenuator device could be used to similar effect, as implied in previous postings.
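    A binary-controlled step attenuator maps a requested attenuation onto binary-weighted sections; a minimal sketch of that mapping, assuming a common 6-bit part with a 0.5 dB LSB (0.5/1/2/4/8/16 dB weighting - the part and weighting are assumptions, not a Kiwi feature):

    ```python
    # Sketch: map a requested attenuation onto a 6-bit step attenuator, 0.5 dB LSB.
    # The 0.5/1/2/4/8/16 dB bit weighting is a common arrangement, assumed here.

    def attenuator_word(atten_db, lsb_db=0.5, bits=6):
        """Return the control word and the attenuation actually set (nearest step)."""
        max_word = (1 << bits) - 1
        word = min(max_word, max(0, round(atten_db / lsb_db)))
        return word, word * lsb_db

    word, actual = attenuator_word(9.3)
    print(word, actual)  # 19, 9.5 dB (nearest available step)
    ```

    Requests beyond the range simply clamp to the maximum (31.5 dB with these assumed values).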


    Addendum: One could use the output of the front-end op-amp (the input to the A/D) to feed the log amplifier, provided one *carefully* makes the connection so that it does not otherwise degrade the signal path. This way, the log amp for the AGC would "see" precisely what the A/D converter sees and be able to operate appropriately with the signal levels.