Spectrum Analyzer Persistence
One of my users who was doing some A/B testing on antennas passed along this comment:
When tuned to a signal, enable the spectrum function. Then, disconnect the antenna. The time for the display to decay down to the noise floor is quite long.
Comments
I know Rocky by VE3NEA very well; I used it a lot many years ago with my early SoftRock kits, and I remember that its spectrum decay time is much faster than the KiwiSDR spectrum display's.
Like James, I was also wondering whether it is possible to modify its time constant, perhaps by changing some Beagle configuration file over an SSH connection.
Currently the KiwiSDR spectrum takes something like 60 seconds to fall back down after a signal drops, which seems far too long.
Can I modify the time constants / averaging factor simply by changing a configuration file on the Beagle file system, or are these settings hard-coded in the source code, requiring a rebuild?
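For a sense of scale, if the spectrum is smoothed with a simple exponential moving average (an assumption for illustration; I haven't checked how the KiwiSDR actually filters the spectrum), the decay time follows directly from the per-frame averaging factor and the frame rate. A minimal sketch in C, with every number assumed:

    #include <math.h>
    #include <stdio.h>

    /* Assumed smoothing model: each frame does avg += alpha * (in - avg),
     * so with the signal gone the remaining level decays as (1 - alpha)^n. */
    int main(void) {
        double alpha      = 0.02;  /* assumed per-frame averaging factor */
        double frame_rate = 10.0;  /* assumed spectrum updates per second */
        double drop_db    = 60.0;  /* fall from signal level to noise floor */

        /* Linear power ratio corresponding to a drop_db fall */
        double ratio = pow(10.0, -drop_db / 10.0);
        /* (1 - alpha)^n = ratio  =>  n = ln(ratio) / ln(1 - alpha) */
        double frames = log(ratio) / log(1.0 - alpha);
        printf("decay time ~ %.0f s\n", frames / frame_rate);  /* ~68 s */
        return 0;
    }

With these assumed values the 60 dB fall takes about 68 seconds, the same ballpark as the minute-long decay described above, and halving the decay time only requires roughly doubling the averaging factor, so even a single exposed parameter would already help.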
I think the main issue is the asymmetry between the spectrum's attack and decay: attack is almost instant, while decay is an order of magnitude slower.
The result is that increasing the exponential factor to get a more live spectrum leaves the graph's baseline prone to suddenly jumping to very high levels whenever lightning, ionosonde, or pulsing noise occurs.
A possible solution would be to implement separate attack and decay time-constant settings, ideally differentiated for the spectrum and waterfall functions, in the SDR# style (sketched below).
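To make that concrete, here is a minimal sketch (my own illustration, not KiwiSDR or SDR# code) of an asymmetric smoother: one coefficient applied when the input rises above the running average, another when it falls below. All names and values are assumptions:

    #include <stdio.h>

    /* One smoother per FFT bin; attack/decay are per-frame factors in 0..1. */
    typedef struct {
        float attack;  /* used when the input is above the average (fast) */
        float decay;   /* used when the input is below the average (slow) */
        float avg;     /* current displayed value */
    } smoother_t;

    float smooth(smoother_t *s, float in) {
        float a = (in > s->avg) ? s->attack : s->decay;
        s->avg += a * (in - s->avg);
        return s->avg;
    }

    int main(void) {
        smoother_t s = { .attack = 0.5f, .decay = 0.2f, .avg = -120.0f };
        /* Signal present for 5 frames, then gone: the rise is nearly
         * instant, the fall takes noticeably longer but stays bounded. */
        for (int i = 0; i < 30; i++)
            printf("frame %2d: %6.1f dB\n", i, smooth(&s, i < 5 ? -40.0f : -120.0f));
        return 0;
    }

Whether the averaging runs on linear power or on dB values changes the exact numbers but not the asymmetry, and keeping two smoother instances per bin, one for the spectrum and one for the waterfall, would give the differentiated SDR#-style settings suggested above.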
I have a question for John:
Is the waterfall stuttering and slow refresh rate due to the Beagle's computational constraints, or a consequence of the choice to limit network bandwidth?
I already know that waterfall-to-audio synchronization results in worse update behaviour; an option to disable this mechanism in favor of a more constant update rate would be nice to have (a sketch of the two policies follows).
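For what it's worth, the difference between the two behaviours could come down to nothing more than the draw trigger. A hypothetical sketch (my guess at the structure, not KiwiSDR's actual architecture) contrasting a sync-driven update with a fixed-rate one:

    #include <stdbool.h>

    /* Two hypothetical update policies for drawing a waterfall line. */
    enum wf_mode { WF_SYNC_TO_AUDIO, WF_FIXED_RATE };

    /* Called from the client's main loop. 'now_ms' is the current time;
     * 'audio_frame_ready' reports whether a synchronized frame arrived. */
    bool wf_should_draw(enum wf_mode mode, int now_ms, bool audio_frame_ready) {
        static int last_draw_ms = 0;
        const int frame_interval_ms = 100;  /* assumed 10 fps target */

        if (mode == WF_SYNC_TO_AUDIO)
            return audio_frame_ready;  /* stalls whenever audio stalls */

        /* Fixed-rate mode: draw whenever the interval has elapsed,
         * trading audio alignment for a constant refresh rate. */
        if (now_ms - last_draw_ms >= frame_interval_ms) {
            last_draw_ms = now_ms;
            return true;
        }
        return false;
    }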