Is there a workaround to push autoscale of the waterfall deeper into blue?
The waterfall is a nice tool for identifying both signals and noise, but the autoscale function tends to waste the best part of it, in my opinion. Virtually every time I hit 'autoscale' I also end up making 6-8 further manual adjustments to give the interesting region near the noise floor better visual identity. The default autoscale settings don't make good use of this region, IMO.
I'm not sure where this issue is best addressed, possibly in WebRX, but barring a systemic fix I'd like to know whether there is a workaround that achieves this result without user involvement every time a display is autoscaled.
Can someone tell me?
Glenn n6gn
Comments
When you click autoscale, the bins from one waterfall FFT are sorted by their dBm value. The 5th-percentile value defines the "noise" level and the 98th the "signal" level. 30 dB is added to the signal level to set the WF max level, and 10 dB is subtracted from the noise level to set the WF min level. All four of those values need to be adjustable.
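For anyone wanting to experiment, here is a minimal sketch of that recipe in Python (not the actual WebRX source; the parameter names are my own), showing the four values that would have to become user-adjustable to push the scale deeper into the blue:

```python
# Rough sketch of the autoscale heuristic described above (not the WebRX code):
# percentiles of one FFT line set the waterfall min/max levels.
import numpy as np

# Hypothetical tunables -- the four values mentioned above.
NOISE_PERCENTILE   = 5    # percentile taken as the "noise" level
SIGNAL_PERCENTILE  = 98   # percentile taken as the "signal" level
SIGNAL_HEADROOM_DB = 30   # added to the signal level -> WF max
NOISE_MARGIN_DB    = 10   # subtracted from the noise level -> WF min

def autoscale(fft_bins_dbm):
    """Return (wf_min, wf_max) in dBm from one waterfall FFT line."""
    bins = np.asarray(fft_bins_dbm)
    noise  = np.percentile(bins, NOISE_PERCENTILE)
    signal = np.percentile(bins, SIGNAL_PERCENTILE)
    return noise - NOISE_MARGIN_DB, signal + SIGNAL_HEADROOM_DB
```

Lowering NOISE_MARGIN_DB (or the signal headroom) in a sketch like this is what would keep more of the colour range near the noise floor.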
I suspect that the SNR values obtained by the WSPRdaemon project are more accurate in practice and more useful to the end user, given that the values can be gathered and graphed locally. I'd happily lose the global consistency in favour of tunable waterfall autoscaling.
Given the trade-off, I'd prefer John's suggestion every time.
Over time you learn which Kiwis are good/useful, and which ones, well, aren't.
"very difficult to do via an algorithm" is certainly true! In essence it is this algorithm , this model or theory, I have been trying to achieve during the last few years as I've been working with broadband antenna solutions for the Kiwi. If we had systems each capable of comparing to a common external source, measuring the excess noise coming from the galactic center say, then we might be able to compare our systems but of course this is too far below most Kiwi/antenna systems (but not all!) at this point.
Next to that, a solution might be every Kiwi owner stating the system's noise floor referenced to the thermal noise in the radiation resistance of the actual antenna, not to the real part of the feed-point impedance and not to the ingress arriving by other paths. This kind of metric is perhaps possible but almost certainly beyond the scope of most Kiwi owners, since it involves knowledge and complex interpretation of the actual antenna: efficiency, polarization, near-field 'ground', far-field 'ground', take-off angle and so on. To my understanding this is what the ITU tried to do in preparing and measuring the noise reported in Rec. ITU-R P.372-8. Getting diverse Kiwi systems all meaningfully referenced to this seems unlikely, given the complexity of usefully integrating both theory and measurement.
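As a rough illustration of what that metric would look like numerically, here is a sketch assuming a 290 K reference and assuming the measured power has already been referred to the antenna's radiation resistance (which is the hard part):

```python
# Sketch of the metric suggested above: express a measured noise floor as
# dB above the thermal noise (kTB) available from the antenna's radiation
# resistance at a 290 K reference temperature.
import math

K_BOLTZMANN = 1.380649e-23  # J/K
T_REF = 290.0               # K, conventional reference temperature

def excess_over_thermal_db(p_measured_dbm, bandwidth_hz):
    """Measured noise power in a given bandwidth -> dB above kTB."""
    p_thermal_dbm = 10 * math.log10(K_BOLTZMANN * T_REF * bandwidth_hz / 1e-3)
    return p_measured_dbm - p_thermal_dbm

# e.g. -140 dBm in a 1 kHz bin is roughly 4 dB above the ~-144 dBm thermal floor
print(excess_over_thermal_db(-140.0, 1e3))
```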
Even so, the Kiwi is such a compelling tool for learning about, measuring and improving radio and site performance in general that it is hard for me not to keep trying to do better. But I think it very unlikely that a single metric or ranking such as we have now will be useful any time soon, if ever.
They conclude that the noise floor is statistically significantly higher than would be expected from the latest ITU-R noise floor data, and that similar noise floor measurements should be repeated frequently to keep pace with technological change.
https://vienna.iaru-r1.org/wp-content/uploads/2019/03/VIE19-C7-007-VERON-Noise-Floor-Measurements.pdf
I'm concerned that most measurements at amateur locations may not be of propagated noise but rather of near-field or common-mode ingress into the measuring system. I don't have much to back this up except a few tentative measurements from quiet Kiwi sites such as KPH and some local amateur radio astronomy work. These data seem to indicate that, if anything, the ITU numbers are too high.
For example, it appears possible to detect and measure galactic noise at an amateur location near 20 MHz. These levels can serve to qualify the system and thereby to validate noise measurements, at least in the nearby spectrum. On systems that can achieve this measurement, the ITU numbers generally seem pessimistically high.
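As a rough check on that, the median-value approximation from Rec. ITU-R P.372 (Fam = c - d*log10(f/MHz)) can be evaluated at 20 MHz. The coefficients below are the commonly quoted ones, so treat them as an assumption and verify against the current revision of the Recommendation:

```python
# Sketch of the sanity check described above, using the commonly quoted
# median-value coefficients from Rec. ITU-R P.372 (verify before relying on them).
import math

COEFFS = {            # (c, d) for Fam in dB above kT0B
    "galactic":    (52.0, 23.0),
    "quiet_rural": (53.6, 28.6),
}

def external_noise_figure_db(f_mhz, environment="galactic"):
    """Median external noise figure Fam = c - d*log10(f/MHz)."""
    c, d = COEFFS[environment]
    return c - d * math.log10(f_mhz)

# Around 20 MHz the median galactic noise (~22 dB above kT0B) sits above the
# quiet-rural curve (~16 dB), which is why seeing galactic noise there is a
# useful qualification of the whole receive system.
print(external_noise_figure_db(20.0, "galactic"))
print(external_noise_figure_db(20.0, "quiet_rural"))
```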
I think more investigation is needed.
Near-field or common-mode ingress could have occurred, although I am pretty sure the folks making the measurements were aware of this possibility and took the utmost care to avoid it.
However, only 3 sites classified as quiet rural were used in this study. The definition used for that category was no residences or infrastructure within a 1.5 km radius.
The Kiwi and other stations nowadays able to receive galactic noise around 20 MHz would probably fall into that category as well, and perhaps the study could have been more comprehensive by adding more such locations.
Finding quiet rural areas like that could be a problem in a small and densely populated country such as Holland...
Regards,
Ben
I may be able to test this at home and nearby, where I do have locations available that meet the 1.5 km criterion. However, it probably requires a portable 20 MHz vertically polarized dipole which I haven't yet constructed. I tend to want a dipole because its symmetry allows verifying that common-mode ingress is not significant, and the location can be chosen to rule out near-field sources other than the measuring device itself.
[sorry for the topic drift in this thread!]