I worked the whole evening with a fresh installation of everything: new VMware virtual machine, new Ubuntu, and every WD component that needed to be installed... Finally, everything works - I can see the noise reports and a massive number of spots... HOWEVER: nothing is spotted at all! NO uploads...
The webserver shows the original Apache index file. I changed permissions on the whole /var/www/html tree to 777 - no change...
Do I need to change something at the router? I haven't done that before and it worked... Is there some firewall within Ubuntu which blocks everything?
Why can I only see the original Apache index file and not the noise plot PNG? Why is no new index file written to the html folder? What is going on here? Has anyone an idea of what to do?
I missed that on a VM. http://forum.kiwisdr.com/discussion/1529/wsprdaemon-version-2-3-latest-2-5a-a-raspberry-pi-wspr-decoding-service#latest
I'm not sure wsprdaemon writes the index.html, just the noise image. I rename/delete the index.html and just link to the image (obviously that is just a local Apache for this, no other content).
Stu, I wrote my own index file linking to noise_graph.png and renamed the original Apache index file. This works - tested with a browser: the correct noise_graph.png is shown...
Now the only fundamental problem is: NO spots! (I waited more than 10 minutes - nothing appears.)
I managed to get the noise graph shown on http://graphs.wsprdaemon.org/ON5KQ/ by changing permissions on /var/www/html. The UFW firewall (Ubuntu) allows the Apache service (sudo ufw status)...
Did I forget something, so that the WD spots do not leave the Ubuntu environment (VMware)?? Do I need more permissions to get the spots sent?
How do I get my correctly decoded spots sent to, and recognized at, wsprnet.org?
Update: a 40-minute delay on wsprnet.org!! Not seconds - almost an hour! So the problem seems to be solved; it just isn't reflected in real time... P.S.: sorry for the noise here...
In a previous post I asked: "So should my config be:
DEFAULT:-10,2200:63,630:42,160:22,80:13,60:8.1,40:4.7,30:2.9,20:3.3,17:2.2,15:-1.5,12:-1.3,10:-0.4
Or am I misunderstanding the process?" I think I was misunderstanding it.
I've just realised that the graphs are scaled in dBm/Hz, so they show the power delivered to the receiver and not a compensated value from which to derive the actual field strength.
So the figures on the graphs don't directly equate to the ITU noise curves (which is what I'd imagined); they are just an indication of the power delivered to the receiver. The data has to be exported and an external calculation applied to compare the measured noise level with the ITU prediction.
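As a minimal sketch of that external calculation (Python; my illustration, not part of wsprdaemon, and it assumes the reported level has already been corrected to a well-matched antenna at the measurement plane): thermal noise at 290 K is -174 dBm/Hz, so a level in dBm/Hz converts to an ITU-style external noise figure Fa by adding 174 dB.

    def dbm_per_hz_to_fa(p_dbm_hz):
        """Convert a noise density in dBm/Hz to Fa, dB above kT0 (290 K = -174 dBm/Hz)."""
        return p_dbm_hz + 174.0

    print(dbm_per_hz_to_fa(-140.0))  # -140 dBm/Hz -> Fa = 34 dB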
Therefore I think the config should be set to compensate for my 10 dB pre-amp and include the gain of the antenna:
DEFAULT:-10,2200:-63,630:-42,160:-22,80:-13,60:-8.1,40:-4.7,30:-2.9,20:-3.3,17:-2.2,15:1.5,12:1.3,10:0.4
Does wsprdaemon incorporate the ability to compensate for antenna gain and plot the noise level graphs in a format similar to the ITU curves, or am I still misunderstanding what's going on?
WD does not, by itself, move the reference plane all the way to the antenna. Alone, the kiwi with a -16 calibration factor is pretty accurate: within a dB at 14 MHz; it reads high above that and low below. With WD's correction that frequency un-flatness is corrected. Further corrections can also be added to move the measurement plane/point nearer the antenna, but what those values are and how they are interpreted is an exercise left to the user.
The larger problem is that the final assumptions need to be described for the noise plots to be meaningful. Antenna factor gets one a correction from e-field to some measurement context, normally 50 ohms, but that doesn't account for in-situ problems like polarization misalignment, arrival angle and direction differences, and especially 'mismatch loss' from the radiation resistance of the antenna to a physical measurement point. Our antennas aren't in free space, nor are they perfectly matched; ground/earth, both close and far, has an enormous effect on what is delivered. The ITU reference is an empirical one.
That ITU reference is a monopole over 'some kind' of earth. Like any antenna not in free space, it has directional and polarization characteristics. Their values are adjusted for frequency mismatch (as I remember) such that the Fa values provided are a comparison to a "well matched vertical dipole", at least more or less. One can argue a dB or two of pattern variation, but it's close enough that other factors swamp slight directivity differences and such. From very short out to a full half wave, every antenna has pretty nearly the same directivity and, if perfectly matched, 'catches' the same power into its source resistance. Getting there - matching - is of course an entirely different problem, and for very electrically small antennas it is impossible with any tools and techniques we have.
To all be on the same page with our plots, referenced to the ITU numbers, we have to back out the antenna factor to get to that reference dipole, at every frequency. We must still realize that our environs aren't identical; our local ground is different, as is our far field. Since we have few (only one?) common references, we have to make estimates in order to get a reasonable comparison with the ITU, and the ITU numbers are suspect in themselves. They seem to indicate a higher floor, in at least some places, at HF.
The one common reference that we perhaps have is galactic noise. IF we can see it, there is reason to think it represents a sort of noise standard, though it's one that rises and sets with Sagittarius, is affected by ground and who knows what else. We have to delve into radio astronomy in order to understand our terrestrial receiver.
A well matched dipole and, say, a Beverage are both antennas, but corrections necessary to compare signal and noise performance are very different. If we don't know our reference and assumptions the noise plots we get are only locally relative. The correction/model for an antenna with high antenna factor may be something like a dipole with an attenuator. Each may be capable of delivering similar SNR but the absolute levels may vary widely. These can't be measured with a VNA.
I think that a worthwhile goal might be for all of us using the WD noise measurement and graphing, including the Grafana database, to strive to have reasonably good corrections in place such that we reference our systems/antennas to a "well matched vertically polarized dipole" so that absolute values start to mean a little more. Even here though the particulars matter to the interpretation, e.g. turning a well-matched vertical dipole horizontal may typically drop both signal and noise ~6 dB while also having a major impact on the direction/angle of the main lobe and of course on polarization, though this may not matter so much for DX signals and galactic noise.
I suggest leaving the KiwiSDR reference at the SMA connector with the "-16" setting (and also the waterfall, but don't use that path for any absolute measurements) and then letting WD include the flatness correction and the individual band corrections for everything it takes to make the numbers for the actual antenna relate to the ITU vertical-dipole-over-real-ground. This means that the antenna factor compared to a well-matched dipole, mismatch, amplifier gain, splitter loss, feed line loss etc. are all lumped into the WD correction value at every frequency. It won't be perfect, but hopefully it will allow better comparison among kiwis as well as give a user guidance as to when his/her system is nearing the achievable limit.
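As a rough, hypothetical illustration of that lumping (Python; the numbers and names below are invented for the example, not taken from any real station):

    # Assemble one per-band WD correction from a hypothetical RF chain.
    # Convention per this thread: the value is ADDED to the report, so net
    # gain ahead of the Kiwi needs a negative correction; losses add back.
    preamp_gain_db = 10.0     # LNA gain -> subtract
    feedline_loss_db = 1.5    # coax loss -> add back
    splitter_loss_db = 3.5    # 2-way splitter -> add back
    af_delta_db = 2.0         # antenna factor vs. well-matched dipole -> add back

    wd_correction_db = (-preamp_gain_db + feedline_loss_db
                        + splitter_loss_db + af_delta_db)
    print(f"40:{wd_correction_db:+.1f}")  # e.g. 40:-3.0 for this chain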
[Sorry if this reposts with edits, I'm having difficulty editing and seeing the results]
All understood. I'm using my TC2M broadband vertical monopole, which has a modeled set of gain values that closely match those I've been able to measure.
I'm currently trying to calibrate wsprdaemon against values measured with a spectrum analyser.
I've now got most of the wsprdaemon plots close to the spectrum analyser values.
However, the offset values I have used have turned out to be purely arbitrary and don't seem to reflect the actual calculated gain differences between bands.
The only one that still seems to be resisting is the 2200m band, which for whatever reason doesn't seem to take the offset value correctly when the other bands do.
Martin, at the risk of confusing things more, the following values for one kiwi from my WD.conf seem to work (that doesn't mean the values themselves are meaningful, though!):
" ... DEFAULT:-6.0,2200:-6.0,630:-6.0,160:-6.0,80:-7.0,40:-7.5,30:-8.0,20:-8.8,17:-9.3,15:-11.5,12:-13.4,10:-16.3"
I think this means that, lacking a band-specific entry, "DEFAULT:-6.0" will be used as the amount ADDED to the report, to account for a nominal 6 dB of gain ahead of the KiwiSDR. I hope I have this correct - it's easy to invert the sign, and I have done so in the past, so check whether I have this right!
On the bands explicitly entered, e.g. "40:-7.5", a different value is used. In this 40m example there is an upward slope in the preamp gain, and MORE correction, now 7.5 dB, must be made to restore flatness vs. frequency.
Were my 40m antenna a well-matched dipole, then the combination of the "-16" in the kiwi itself, the 7.5 dB for the preamp, and the frequency-flatness correction in WD would get me an ITU-style reference.
In fact, this is NOT what I currently have for an antenna. During lightning season here I use a short active antenna whose antenna factor differs from that of a well-matched 40m dipole. I haven't (yet) corrected for this, so all my numbers are wrong relative to an ITU KTB-in-a-well-matched-monopole (approximately a dipole) reference. As usual, I probably have something wrong here...
Yes, I think my understanding of the offsets is the same as yours, so we are on the same page. I think I'll just continue to fiddle them to match the spectrum analyser and not worry about the antenna gain values too much.
While you have probably checked this: I had a problem when, in the gain offset, I didn't use exactly the same name as in the receiver definition - but once I fixed that, all of the offsets started to work properly, and they still do.
Clint reminds me of another point. If you use merged receivers, DO NOT change the name from "MERGED_RX...", because that name is somehow hardcoded. The suffix may change, so MERGED_RX_SYS1 is OK, but make sure the MERGED_RX part precedes it. I discovered this the hard way.
Hi - to get a better idea of what the correction values must be in my situation with my antennas, I plan the following:
- Build a reference vertical dipole, 2x1m long, on a 3m fibre pole (so the lowest point of the dipole is 1m off the ground). The dipole will be fed by an LZ1AQ dipole-mode pre-amp. Chavdar made an LTspice calculation of the dipole with the amplifier, and the result is an antenna factor of +2 dB. I will just assume this is at least not unrealistic...
- Then simply compare the results of this dipole with my actual receiving antennas...
My expectation is that I will be surprised at how wrong the NEC2 simulations of the actual antennas might be. Would that make sense?
Antenna factor for a small antenna changes at 20 dB/decade so a single value is only correct at one frequency. On top of that, the additional mismatch to the preamp is frequency dependent and should be modeled. Circuit simulators can do this well and can be used to generate values for any frequency.
A 2x1m dipole (the same antenna I'm using here), or any short dipole, has an impedance, Ra + jXa, that can be well approximated by
    Ra = 20 .* (pi .* L ./ lambda).^2;                            # series resistance in ohms
    Xa = -(120 ./ (pi .* L ./ lambda)) .* (log(L ./ (2*a)) - 1);  # series reactance in ohms (capacitive, hence negative)
It is this Ra that sources the voltage that the preamp and any transmission line then transform to the kiwi's input impedance. There is likely additional mismatch loss at the preamp input, so not all of the voltage from Ra may appear at the preamp.
For a 2x1m dipole, Ra goes from ~8 micro-ohms at 30 kHz to ~8 ohms at 30 MHz (it scales as frequency squared). The dipole looks much like a 10 pF capacitor in series with that small, varying R. Not only is this difficult to match; the voltage gets very small at low frequencies, often small compared to CM noise and other ingress. Though the matched power (and the pattern and the aperture) is nearly the same for a dipole of any size, the available voltage goes as the length.
Using the calculator provided by Owen, VK1OD, in 50 ohms the antenna factor for that dipole goes from about -62 dBm/meter at 30 kHz to about -2 dBm/meter at 30 MHz. The reason small antennas nevertheless work at low frequencies is, I think, that lightning QRN has an even larger slope with frequency. The ITU data seems to bear this out.
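For the curious, the two approximations above can be evaluated numerically (Python; the 1 mm wire radius is my assumption for the example):

    import math

    L = 2.0    # full dipole length in m (2 x 1 m)
    a = 0.001  # wire radius in m (assumed)

    def ra_ohms(f_hz):
        lam = 3e8 / f_hz
        return 20.0 * (math.pi * L / lam) ** 2  # series resistance

    def xa_ohms(f_hz):
        lam = 3e8 / f_hz
        return -(120.0 / (math.pi * L / lam)) * (math.log(L / (2 * a)) - 1.0)

    for f in (30e3, 30e6):
        print(f"{f:>10.0f} Hz: Ra = {ra_ohms(f):.1e} ohm, Xa = {xa_ohms(f):.0f} ohm")
    # 30 kHz: Ra ~ 7.9e-06 ohm; 30 MHz: Ra ~ 7.9 ohm, matching the text above.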
Just to clarify: the LTspice calculation of the antenna factor for the 2x1m dipole was with the pre-amp included! According to Chavdar, he designed it so that the antenna factor is nearly constant - at least that is what I understand from the specifications for the AAA1C pre-amp (with antenna) at www.active-antenna.eu. It is because I use the LZ1AQ pre-amp for my current WSPR reception with Kiwi_0 and a longer 2x3m vertical dipole that I want to compare the signal strength at the Kiwi's input SMA...
Recently I built another antenna for the second Kiwi (Kiwi_1 in the noise graph): a 100m longwire, shot over a 15m-high tree, fed at the southeast end at ground level (against a single copper ground rod) and terminated at the northwest end like a Beverage. The impedance is very flat and can easily be broadband matched with an impedance transformer. The simulated pattern is a stunning picture on the high bands - like a Beverage, but still with 6 dBi gain on 10m because of the height over ground, I think...
I use a 20 dB amp (50 ohm input/output impedance) with an equalization network in front of the preamp so as not to overdrive the system; otherwise the kiwi would be overdriven by the huge low-band signals from the longwire.
The noise plots are now adjusted with the DEFAULT parameter so that I take all losses of the feed system into account (measured with a DG8SAQ network analyser), and I have also included the 4NEC2 simulations of expected antenna gain. I included 3 dB of extra loss on top of my calculation, and in the NEC2 model I added an extra 100 ohm ground-loss spot impedance at the feedpoint.
The results are the noise plots we see at http://graphs.wsprdaemon.org/ON5KQ/. What I do not know at all is what field strength they represent - so I cannot make the comparison with the ITU recommendation (part 13).
Interesting is the comparison of the noise plots of the vertical dipole (omnidirectional, vertically polarised) with those of the highly directive longwire to the northwest. In the morning, before lunchtime, the northwest antenna, looking into the night, is almost dead on 20m - even the very strong Chinese BC stations on the 19 and 16m bands are at least 20 dB weaker (maybe far more) than on the dipole, and there is no QRN at all on the longwire towards 310 degrees, as there are no condx in that direction. I kept the antenna connected, though, to compare its plots with the very sensitive vertical dipole at the same time and same QTH. The dipole is full of strong signals on 20m in the morning, but also with relatively strong occasional QRN on 20m. We can see outliers over a stable base noise line on the dipole that are not as visible on the longwire.
One of the 60m frequencies is often jammed by a digital signal of some kind, which is very strong all day long - we can see that in the plots whenever it is on. During the week I have lots of QRM from a home office with a faulty PC power supply, which increases the noise by 10 dB - we can see when business hours start and the noise curve jumps up (usually from about 07:00 UTC; it switches off after 17:00 UTC and is never there on weekends)...
40m and 60m are affected by neighbours' LED lamps - each about a 10 dB increase in base noise level. Unfortunately this badly affects my 40m reception, which is otherwise very good, because I refurbished my old transmit antenna and now use it as an additional 40m DX antenna: 8 resonant quarter-wave verticals in a circle of 26m diameter, with an extensive radial network. The array is best during the day, when there is no lighting, and during weekends, when computers are off and people are enjoying a trip to the nearby North Sea coast rather than making noise on 40m with cheap consumer electronics...
Unfortunately the time is gone when I could use this large array in wintertime at twilight or nighttime, because all these LED lamps are so annoying.
The 40m band runs on a separate receiver (ELAD) on the Windows host PC on which the Kiwis run in a VMware virtual machine with Ubuntu 18.04 LTS. It would be great to connect the ELAD via a virtual audio cable to the Ubuntu virtual machine and then define it as an audio input for the wsprdaemon software. I have not found out how to do that - then I could also show the noise graph of the 40m array; so far it is not included.
(Only a comment about broadband match.) It's not the *measured* antenna impedance that goes into the calculation, but rather the radiation resistance and how it is coupled to the system. A 1 mm monopole connected to a 50 ohm load is matched pretty well from LF to HF, but it has an awful antenna factor below mm-wave frequencies.
A 'long' wire is still short at some sufficiently long wavelength and has a low Ra there. At higher frequencies, the length of the wire rotates the delivered radiation resistance from low (~37 ohms for a lambda/4 vertical wire/monopole) to very, very high at odd half waves (also known as the 'second resonance'). The 'delivered' Ra varies wildly and is not perfectly matchable over the entire range with the components (superconductors excluded) and architectures available to mankind. See "A New Antenna Model".
For asymmetric antennas like monopoles, the ground resistance is in series with Ra. So even though the match may look OK on, say, a 20m long loaded 630m vertical over real ground, the antenna factor is not nearly that of a dipole - the ~100 milliohm Ra is not well matched to the 50 ohm transmitter/receiver. For any electrically small antenna, most of the power goes into heating matching elements and/or earth losses. This is why the ERP of LF transmitters is very often only a small fraction of the transmitter power.
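A quick back-of-envelope of why that is (my own sketch; the 10 ohm loss resistance is an assumed, illustrative value):

    # Radiation efficiency of an electrically small vertical:
    # Ra in series with ground/matching loss resistance.
    ra_ohm = 0.1       # ~100 milliohm radiation resistance, per the example above
    r_loss_ohm = 10.0  # assumed ground + matching loss
    efficiency = ra_ohm / (ra_ohm + r_loss_ohm)
    print(f"radiated fraction = {efficiency:.1%}")  # ~1%: ERP << TX power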
On receive, only a fraction of the power in the approximately constant aperture of a matched antenna arrives at a Kiwi's 50 ohm input. While the SNR on any antenna is the same (modulo directivity), the SNR delivered to the RX detector often is not, because other contributors - front-end noise figure, common-mode current noise, near-field noise sources - dominate compared to the small voltage developed across Ra. My reason for being interested in examining the noise floor referenced to a well-matched/ITU antenna reference is that it gives guidance as to how much effort to make and the possible system improvements that would result. One needs to be able to compare KTB+Fa on the antenna in use to do this.
The noise graphs for me are only a kind of indicator when using different antennas with WD:
- The feed-system losses are more or less known, as you can measure them.
- The antenna is modelled with NEC2 to the best of my abilities. We can use the results on the various bands as input for the DEFAULT parameter, or not adjust anything, as some factors will stay unknown. However, I feel it gives at least a good indication of how sensitive the whole antenna system is. At least for the bands 10 MHz and higher it is useful, I think.
- I am aware of the many factors I don't know - therefore the ITU graphs are not important for me.
I don't think the ITU recommendation has any value any more today, as noise levels in most residential QTHs in my area have increased drastically in the last 20 years. There are many factors which are not included appropriately in the most recent ITU curves. I have noise plots from 15 years ago - same QTH, same hardware and measurement equipment - and today the noise is 20 dB higher than at the time of that recording, when there were no LED lamps and no telecom lines radiating high-speed internet over 60-year-old telephone wires... Such faults and mistakes are now normal and standard everywhere - 20 years ago such emissions were rare exceptions.
So why bother with ITU recommendation curves which no longer represent real life? If governments took these curves as levels to be guaranteed in any residential area (taking the residential curves), they would need to shut down all consumer and most professional equipment, or better still switch off the 230 V utility lines (we have PLC installed for smart meters!). As a result, the ITU curves for residential areas would have to be raised drastically to show the real situation - unfortunately this is not done.
So I will keep it as it is:
- antenna gain from the NEC2 simulation deducted via the DEFAULT parameter
- feed-system losses included as well
The net results are the curves: KIWI_0 = vertical active dipole, KIWI_1 = 100m longwire.
I prefer the antennas with the better signal-to-noise ratio - the absolute noise isn't important to me as long as signals stick out more on the better antennas. It is not only the level of the noise graph but the actual picture it shows (even if the level is different) - there can be drastic differences. Often the noise graph looks like a dead band, when in reality it is just a very quiet antenna with an excellent S/N ratio. You need to listen to your receiver to find out...
Sorry that I missed the questions in this thread, but I have been dealing with the aftermath of the Northern California extended power outage at KPH.
The optional DEFAULT and following band-specific parameters affect only the noise level and noise graphs. If present, they are comma-separated 'BAND:ADJ' pairs, with 'DEFAULT:ADJ' (which should be first) applied to bands for which there is no 'BAND:ADJ' definition.
They are there to allow you to adjust the noise level reports for any gain or loss in the transmission system. For example, if you have a +20 dB LNA ahead of your Kiwi, then 'DEFAULT:-20' should be added to your Kiwi's receiver definition line. DEFAULT:0 adds 0 dB to your measurements, so it will have no effect on your reported noise levels.
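To make those semantics concrete, here is a minimal sketch (Python; my illustration of the rule as described here, not wsprdaemon's actual code):

    def parse_adjustments(spec):
        """Parse a 'DEFAULT:-20,40:-22.5' style string into a dict of dB offsets."""
        return {band: float(adj) for band, adj in
                (pair.split(":") for pair in spec.split(","))}

    def adjusted_report(raw_dbm, band, offsets):
        """ADJ is ADDED to the raw level; DEFAULT applies when the band has no entry."""
        return raw_dbm + offsets.get(band, offsets.get("DEFAULT", 0.0))

    offsets = parse_adjustments("DEFAULT:-20,40:-22.5")
    print(adjusted_report(-120.0, "40", offsets))  # -142.5: band-specific entry wins
    print(adjusted_report(-120.0, "20", offsets))  # -140.0: falls back to DEFAULT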
I support band-specific adjustments, since at many (if not most) sites there will be elements in the RF transmission chain which change the gain with frequency. Get a $50 NanoVNA and check out your RX system from the antenna feed point to the Kiwi's SMA input. You may be surprised.
Ulli, I agree that it is difficult to get all the way to an absolute reference, but I think it is worth the effort. You say:
which I think is worth a response.
From measurements that we at KPH, AI6VN, N6GN, and others such as the Long Wavelength Array have made, it appears that the actual noise is considerably lower than the ITU numbers! It is my emerging opinion that actual radiated noise - that is, far-field, inverse-square, plane-wave noise - probably has not increased significantly over the years. I think (again, only an opinion) that worldwide lightning noise still dominates below mid-HF, and that above that, galactic/propagated values can be reached or approached by carefully built systems even near cities, and certainly in rural areas.
If noise generated by a large number of individual sources in a dense urban region had actually increased such that radiated noise of the sort that falls as inverse square were now a bigger problem, I think we would measure a different result. Nobody near a city, even on the outskirts and beyond, would be able to achieve anything like the noise floors that actually are possible. Falling at only 6 dB for every doubling of (ground-wave) distance, it wouldn't be possible.
So "How come everybody thinks noise is worse these days?" is a good question. It's not that I doubt your measurements or experience, you are by no means alone! it's that I don't think the mechanism is true far-field radiated noise. I don't think what you measured 15 years ago nor what you are seeing now is that. Rather what I think we are all seeing is a combination of coupled and near-field noise mechanisms. Of course, just that these are of a different sort doesn't solve the problem. Noise is still noise and still limits us but if it is true that the mechanism is different then there are things that can be done. I think this is a very important distinction and offers us all hope. Coupling mechanisms can be mitigated and near-field noise falls off very rapidly with distance, inverse fourth and inverse sixth powers being common.
I've found the KiwiSDR environment a really useful one for exploring these noise mechanisms and ways of mitigating them. My goal is to understand them first and the broadband capability of a single kiwi combined with a lot of test cases world wide is really useful. There haven't been geographically separate, 0-30 MHz spectrum analyzers available in this manner before now and I really like it.
So while I'm not hung up on becoming a radio astronomer with a kiwi I do find the attempt to obtain absolute calibration and thereby a metric for performance extremely useful. I'm not expecting everyone to do this. When I started this endeavor several years ago my 'normal' 20m antenna showed about -73 dBm in a USB bandwidth. I could only hear the strongest stations. Listening wasn't much fun. After working on it for a year, building active antennas, probes and trying to learn coupling mechanisms for ingress noise that wasn't the propagated/galactic noise that I really wanted to have as the limit, I was able to make about 30 dB of progress. I could hear 5 honest S units further down than before and the entire radio listening experience for me changed. And I wasn't even done, I hadn't reached galactic/propagated noise limits at upper HF.
Since then I've repeated this type of effort at several sites with good results. Again, I'm probably not done. Except perhaps at KPH - where, with Rob, AI6VN, doing most of the heavy lifting, we can get to a galactic/propagated noise limit (perhaps almost 10 dB *below* what the ITU suggests!) - at the other sites it is still very much a work in progress, but one for which the kiwi plays a very useful role.
Glenn, "So "How come everybody thinks noise is worse these days?" is a good question." Look at the noise between 3.75 and 5.3Mhz here on5kq.ddns.net:8074 This noise comes from the local telecom operator and is the modems return pass of VDSL2 network. This noise is everywhere in our city - sometimes even much stronger sometimes less
Here is the noise 200m away on an empty field (see photos on my QRZ page): on5kq.ddns.net:8073. The feedline has more than about 100 kOhm of choking impedance, with multiple good earth connections - you can verify how clean the VLF band is (VLF is almost ONLY conducted noise)... so in my case there is NO conducted noise at all; it is all radiated noise. Not always far field - but most man-made noise is NOT far field, at least not on very low frequencies...
"Noise is still noise and still limits us but if it is true that the mechanism is different then there are things that can be done." I think this is an illusion! The local official regulator wrotes me a letter, saying that they don't have resources to serve radio amateurs - they have no manpower to come locally and check the situation. So nothing is been done with my claims against various illegal emmissions in the neighborhood. It is clear, that the owner of the VDSL2 network is a monopolist, because it is simply the only physical network available, although they are forced to lease it to competitor telecom operators for commercial fairness. It is still the only network. The government is not interested to fight a monopolist - the local officer told me: "we don't close Belgium for you..." In other words: Commercial interests are more important than everything else....
The radiated noise on shortwave has increased drastically here. LW and VLF are a different story, as you can see on my RX. But the higher frequencies, where good antennas are relatively short, have much-increased radiated noise - this is 160m and up; even 6m is sometimes unusable, with PLC at S9 everywhere... so forget it. Commercial interests dictate the future and nothing else!
Since I don't know the correlation between the kiwi measurement (~-140 dBm/1 Hz at 20 MHz) and the e-field for your "200 m distant, empty field" measurement, nor whether there is transmission line or other CM noise going through the kiwi, I can't really say much about your spectrum, other than that it seems relatively free of the narrowband, semi-coherent signatures common to many kinds of SMPS systems. A WiFi link to the kiwi, allowing it to have no transmission-line common-mode coupling and also a small self-capacitance, could change this.
It may be that at this distance there is still near-field noise from VDSL but that seems a little unlikely. See LZ1AQ's comments about his experience as well.
Needless to say, though you may be correct about the future, I hope you are wrong - that it is not hopeless. I hope to continue to demonstrate, along with a few others, that something can still be done.
I continue to battle CM noise at KPH and on the Beverage at AI6VN/KH6, which is a deep rural location. My long-term plan is to connect the Kiwis at the Beverage feed point and then solar-power and WiFi-connect them. However, RF-quiet solar chargers and the switching power supplies needed to efficiently convert 12 V to the 5 V needed by the Kiwi have impeded that effort. In my experience, long feed lines, even with isolation transformers and multiple 'grounds', don't seem to be sufficient to suppress CM-introduced RFI.
Rob, hopefully you can realize the receivers at KPH as planned. In the past the Marconi T was not very usable, due to nonlinearities in the feed system, I think. I can imagine how much effort it takes to build something new at a remote location with limited access... At the moment all the SDRs seem to be off air (or not connected to decent antennas), as there are no real signals on the receivers - at least, I just checked the Marconi T and the TCI 530 at your local noon.
Can the kiwi support a WiFi connection? Are the USB ports fully functional, and does the Kiwi software support external USB devices like a WiFi stick or even an LTE modem (for high-speed mobile connections)? One could also consider an optical link for communication with the Kiwi hardware, plus local batteries. However, the Kiwi is really not a low-power device - large batteries and solar power would be necessary... with a lot of new problems...
A single LiPo battery can power a kiwi + WiFi + active antenna for hours. The whole unit is self-contained, so no CM noise from a feed line is possible.
Since this is really only for testing - to see what your noise situation looks like with no transmission-line CM noise - this should be plenty of time. If you discover that there is noise ingress via the feed line, then you have a measurement of the goal and can proceed to work on it. If not, you can explore the sensitivity to location and learn more about the source/coupling of the noise.
"A single LiPo battery can power a kiwi + WiFi + active antenna for hours"
What's the capacity of that LiPo? Can you quantify "hours" a bit more? How efficient is the (boost?) converter?
I ask because I have had a Kiwi-related idea rolling around in my head for a long time. But it has to be battery powered and has other severe constraints. Thanks!
Sure. The buck converter efficiency is very high; I use them on multiple kiwis to power both the kiwi and, in some cases, a GPSDO as well. Getting a really accurate value requires more careful measurement than I am set up for, but I think you can assume well over 90% efficiency, probably above 95% by the feel of the (small thermal mass) thing - it's scarcely warm. You already know what a Kiwi takes to run. When the kiwi/WiFi pictured has settled down after booting, the 3-cell, 2200 mAh battery pictured, which weighs 168 g, delivers about 0.36 A at 12.15 V to the converter, so let's call the load 4.4 W average. I've previously done charge/discharge cycles on these LiPo batteries, which I used to power mid-sized quadcopters, and found that on a new one 2000 mAh can be recovered without risk of under-voltage on the (finicky) LiPo cells. If one says the average potential over the discharge cycle is 3.6 V per cell - a number I pull out of the air plus a little experience - then we can call the battery a ~21.6 watt-hour source, which suggests that almost 5 hours is reasonable.
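The same arithmetic as a tiny sketch (Python; the 3.6 V/cell average is the guessed number stated above):

    cells = 3
    usable_mah = 2000.0        # recoverable without under-voltage, per above
    avg_v_per_cell = 3.6       # assumed average over the discharge curve
    load_w = 0.36 * 12.15      # measured draw into the buck converter, ~4.4 W

    energy_wh = cells * avg_v_per_cell * usable_mah / 1000.0
    print(f"{energy_wh:.1f} Wh / {load_w:.1f} W -> {energy_wh / load_w:.1f} h")  # ~4.9 h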
But these batteries are available in a variety of capacities so half to twice this time is just a matter of choice. Storage capacity and weight seem to track pretty closely so one just buys whatever is desired. I think the one shown is currently about US$15.
I visited KPH on Saturday and restored the Kiwis to operation. In addition, I asked the KPH MRHS operators to leave the 500 kHz filter enabled on the feed from the Marconi T. That filter of course suppresses signals above and below 500 kHz, but it also keeps the legacy KPH LNA out of overload, so the N*10 kHz IMD products you previously saw on that Kiwi are not present today. Here is the Marconi T RF path:
100M Marconi T => remotely switchable filter bank (currently set to 500 kHz) => 100M RGx coax => 2-way splitter
    splitter output 1 => KPH legacy distribution amp => KPH legacy receivers
    splitter output 2 => AM BPF => Kiwi73
The IMD products you previously observed are almost certainly due to overload of the KPH legacy distribution amp when the switchable filter bank is disabled. Its IMD feeds back through the 2-way splitter and appears at the input of the Kiwi.
I have suggested that the filter bank be replaced by the AM BPF and that we replace the 2-way with a 4-way passive splitter, getting all active devices out of the RF path. However, KPH is a 'living history museum' and such changes require approval by others.
So this week you will see no IMD, but mostly 200-500 kHz signals and very little above 1 MHz. Perhaps this will encourage listeners who have chosen KPH72/LF for 7-14 MHz listening to move to KPH73/HF.
I had a slightly strange one today; input welcome. My BeagleBone AI was running from a 3 A switcher without much mains filtering and an LZ1AQ mag loop, lots of earthing (KIWI_70). Another BBG was running on a linear supply with a decent-sized mains filter before it, and a Wellbrook antenna not earthed at the far end (KIWI_163).
There was, I assume, a noise spike from the mains(?) from about 2PM that affected only the BBAI+LZ1AQ loop.
As I've said a thousand times, it can get noisy here from a few sources, including the mains, but I've not seen such a pronounced blip. The noise charts are online, so if anyone can come up with possible causes I'd be grateful.
I'm guessing mains-borne noise up the unfiltered switcher, but as it is early days for the AI I wanted to flag it in case others see it and can tie it to some event around the AI. I've had the AI on the linear supply since about 9:30 PM, so I will see if it changes the plots tomorrow.
In the past I have checked the calibration of the noise graphs by placing a CW carrier of known amplitude within the WSPR passband. It's been a while, so I don't recall the details, but I do know that a persistent CW carrier can produce a blip like that.
Thanks Clint, I will have to watch it today; normally if I pick something up on one loop, another six feet away gets at least something. I've just remembered the second Kiwi is on a WiFi bridge, while the first (70) is connected directly to a small access point and fibre converter - two more supplies to check.
--later-- I just checked the SDRs and there is some horrible repeating noise pattern across a large part of :8075 (LZ1AQ), slightly less visible on :8076 (Wellbrook). I had noticed this spur pattern only at the very high end of HF for about a week. I think both those antennas are nulled on the same house, so I need to investigate later whether it is a new source somewhere else. Looks like a low-spec Chinese PSU to me. Ah, the joys of having electrical ears.
Comments
Finally: everything works - can see the noise reports also and massive number of spots......
HOWEVER: Nothing is spotted at all ! NO uploads ...
The webserver shows its original index file of Apache
Changed permission of the whole /VAR/WWW/HTML folders to 777 - no change...
Do I need to change something at the router ? I haven't done it before and it worked...
Is there any special firewall within Ubuntu which blocks everything
Why I can only see the original Apache Index file and not the noiseplot png ?
Why there is no new Index file written in the html folder ?
What is going on here ?
I don't know what to do anymore...
Has someone an idea, what to do ???
Ulli, ON5KQ
I missed that on a VM.
http://forum.kiwisdr.com/discussion/1529/wsprdaemon-version-2-3-latest-2-5a-a-raspberry-pi-wspr-decoding-service#latest
I'm not sure wsprdaemon writes the index.html, just the noise image.
I rename/delete the index.html and just link to the image (obviously that is just a local Apache for this, no other content)
Stu
This works - tested with browser : the correct noise_graph.png is shown...
Now the only fundamentel problem is : NO spots ! (waited more than 10min - nothing appears)
I managed to get Noise-Graph shown on http://graphs.wsprdaemon.org/ON5KQ/ by changing permissions of /VAR/WWW/HTML
The UFW (firewall Ubuntu) allows Apache service (sudo ufw status)....
Did I forget something, so the WD spots do not leave the Ubuntu environment (VMware) ??
Do I need more permissions for to get the spots sent?
How do I get my correctly decoded spots sent/recognized at wsprnet.org ?
Ulli, ON5KQ
40min delay on wsprnet.org !!!
not seconds - almost one hour delay!!
Problem seems to be solved, but not realized in real-time...hi
Ulli, ON5KQ
P.s.: sorry for the noise here...
In a previous post I said
"So should my config be:-
DEFAULT:-10,2200:63,630:42,160:22,80:13,60:8.1,40:4.7,30:2.9,20:3.3,17:2.2,15:-1.5,12:-1.3,10:-0.4
Or am I misunderstanding the process ?"
I think I was misunderstanding it.
I've just realised that the graphs are scaled as dBm/Hz, so it is the power delivered to the receiver and not a compensated value to derive the actual field strength.
So the figures on the graphs don't directly equate to ITU noise curves (which is what I'd imagined), they are just an indication of the power delivered to the receiver, the data has to be exported and an external calculation applied to compare the measured noise level vs ITU prediction.
Therefore I think the config should set to compensate for my 10dB pre-amp and include the gain of the antenna
DEFAULT:-10,2200:-63,630:-42,160:-22,80:-13,60:-8.1,40:-4.7,30:-2.9,20:-3.3,17:-2.2,15:1.5,12:1.3,10:0.4
Does wsperdaemon incorporate the ability to compensate for antenna gain and plot the noise level graphs in a similar format to the ITU curves, or am I still misunderstanding what's going on ?
Regards,
Martin - G8JNJ
The larger problem is that final assumptions need to be described for the noise plots to be meaningful. Antenna factor gets one a correction factor from e-field to some measurement context, normally 50 ohms, but that doesn't account for in situ problems like polarization misalignment, arrival angle and direction differences and especially 'mismatch loss' from the radiation resistance of the antenna to a physical measurement point. Our antennas aren't in free space nor are they perfectly matched, ground/earth both nclose and far has an enormous effect on what is delivered. The ITU reference is an empirical one.
That ITU reference is a monopole over 'some kind' of earth. Like any antenna not in free space it has directional and polarization characteristics. Their values are adjusted for frequency mismatch (as I remember) such that the Fa values provided are a comparison to a "well matched vertical dipole", at least more or less. One can argue a dB or two of pattern variation but it's close enough that other factors swamp slight directivity differences and such. From very short out to a full half wave every antenna has pretty nearly the same directivity and if perfectly matched 'catches' the same power into its source resistance. Getting to, matching, that is of course an entirely different problem and for very electrically small antennas is impossible with any tools and techniques we have.
To all be on the same page with our plots, referenced to the ITU numbers, we have to back out the antenna factor to get to that reference dipole, at every frequency. We must still realize that our environs aren't identical, our local ground is different as is our far-field. Since we have few (only one?) common reference we have to make estimates in order to get a reasonable comparison with the ITU, and the ITU numbers are suspect in themselves. They seem to indicate a higher floor at least some places in HF.
The one common reference that we perhaps have is galactic noise. IF we can see it, there is reason to think it represents a sort of noise standard, though it's one that rises and sets with Sagittarius, is affected by ground and who knows what else. We have to delve into radio astronomy in order to understand our terrestrial receiver.
A well matched dipole and, say, a Beverage are both antennas, but corrections necessary to compare signal and noise performance are very different. If we don't know our reference and assumptions the noise plots we get are only locally relative. The correction/model for an antenna with high antenna factor may be something like a dipole with an attenuator. Each may be capable of delivering similar SNR but the absolute levels may vary widely. These can't be measured with a VNA.
I think that a worthwhile goal might be for all of us using the WD noise measurement and graphing, including the Grafana database, to strive to have reasonably good corrections in place such that we reference our systems/antennas to a "well matched vertically polarized dipole" so that absolute values start to mean a little more. Even here though the particulars matter to the interpretation, e.g. turning a well-matched vertical dipole horizontal may typically drop both signal and noise ~6 dB while also having a major impact on the direction/angle of the main lobe and of course on polarization, though this may not matter so much for DX signals and galactic noise.
I suggest leaving the KiwiSDR reference at the SMA connector with the "-16" setting (and also waterfall but don't use that path for any absolute measurements) and then letting WD include the flatness correction and the individual band corrections for everything it takes to make the numbers for the actual antenna relate to the ITU vertical-dipole-over-real ground. This means that antenna factor compared to a well-matched dipole, mismatch, amplifier gain, splitter loss, feed line loss etc is all lumped into the WD correction value at every frequency. It won't be perfect but hopefully it will allow better comparison among kiwis as well as give a user guidance as to when his/her system is nearing the achievable limit.
[Sorry if this reposts with edits, I'm having difficulty editing and seeing the results]
All understood.
I'm using my TC2M broadband vertical monopole which has a modeled set of gain values that closely match those that I've been able to measure.
I'm currently trying to calibrate the wspraemon against values measured with a spectrum analyser.
I've now got most of the wsprdaemon plots close to the spectrum analyser values.
However the offset values I have used have turned out to be purely arbitary and don't seem to reflect actual calculated gain differences between bands.
The only one that still seems to be resisting is the 2200 band, which for whatever reason doesn't seem to be taking the offset value correctly when the other bands do.
Regards,
Martin - G8JNJ
At the risk of confusing things more, the following values for one kiwi from my WD.conf seem to work (doesn't mean the values themselves are meaningful though!)
" ... DEFAULT:-6.0,2200:-6.0,630:-6.0,160:-6.0,80:-7.0,40:-7.5,30:-8.0,20:-8.8,17:-9.3,15:-11.5,12:-13:4,10:-16.3"
I think this means that lacking a specific entry, "DEFAULT:-6.0" will be used as the amount to change the report, the amount ADDED to the report, to account for a nominal 6 dB gain ahead of the KiwiSDR. I hope I have this correct, it's easy to invert the sign and I have done this in the past so check to see if I have this right!
On the bands explicitly entered, e.g. "40:-7.5" a different value is used. In this 40m example there is upward slope of the preamp gain and MORE correction, now 7.5 dB, must be made to restore flatness vs. frequency.
Were my 40m antenna a well-matched dipole then the combination of the "-16" in the kiwi itself, 7.5 dB for the preamp and the frequency flatness correction in WD would get me an ITU-style reference.
In fact, this is NOT what I currently have for an antenna. During lightning season here I use a short active antenna which has an antenna factor different from that for a well matched 40m dipole. I haven't (yet) corrected for this so all my numbers are wrong relative to an ITU KTB-in-a-well-matched-monopole (approx a dipole) reference.
As usual, I probably have something wrong here...
Clint
KA7OEI
Yes I think my understanding of the offsets is the same as yours, so we are on the same page.
I think I'll just continue to fiddle them to match the spectrum analyser and not worry about the antenna gain values too much.
Regards,
Martin - G8JNJ
- build a reference vertical dipole of 2x1m long on a 3m fiber pole (so lowest point of dipole is 1m off ground)
The dipole will be fed by LZ1AQ dipole mode pre-amp. Chavdar made a LT-spice calculation of the dipole with the amplifier and result is Antennafactor of +2db. I will just assume this is at least not unrealistic...
- then just compare the results of this dipole with my actual receiving antennas...
My expectation is, that I will wonder how wrong NEC2 simulations with the actual antennas might be.
would that make sense ?
Ulli, ON5KQ
A 2x1m dipole (the same antenna I'm using here), or any short dipole has an impedance, Ra+jXa, that can be well approximated by
Ra = 20.*((pi().*L./lambda).^2); #series resistance in ohms
Xa = (-i*120./(pi().*L./lambda)).*(log(L./(2*a)).-1); #series reactance in ohms
It is this Ra that sources the voltage that the preamp and any transmission line then transforms to the kiwi's input Z. There is likely additional mismatch loss at the preamp input so the entire voltage from Ra may not all appear at the preamp.
For a 2x1m dipole, Ra goes from ~80 micro-ohms at 30 kHz to ~8 ohms at 30 MHz. The dipole looks much like a 10 pf capacitor in series with that small, varying R. At the same time this is difficult to match to, the voltage gets very small at low frequencies, and often small compared to CM noise, and other ingress. Though the matched-power (and the pattern and the aperture) is nearly the same for a dipole of any size, the available voltage goes as the length.
Using the calculator provided by Owen, VK1OD, in 50 ohms the antenna factor for that dipole goes from about -62 dBm/meter at 30 kHz to about -2 dBm/meter at 30 MHz.
The reason small antennas work at low frequencies in spite of this is, I think, because lightning QRN has an even larger slope with frequency. The ITU data seems to bear this out.
The LTspice calculation for the 2x1m dipole antenna factor was with the pre-amp included ! According to Chavdar he designed it in a way that the antenna factor is nearly constant - at least that is, what I understand from the specifications on the webpage for the AAA1C pre-amp (With antenna)
www.active-antenna.eu
It was just because I use the LZ1AQ pre-amp in the current WSPR reception with my Kiwi_0 and a longer 2x3m vertical dipole, that I want to compare the signal strength at the input SMA of the Kiwi...
Recently I build another antenna for the second Kiwi (Kiwi_1 in the noise graph) - 100m longwire - shot over a tree (15m high) - so fed the southeast end at ground level (against a single copper ground rod) and terminated the other Northwest end like a Beverage. The impedance is very flat and can easily broadband matched with a imp. transformer. The simulation pattern is a stunning picture on the high band - like a beverage, but still with 6dbi gain on 10m, because of the height over ground, I think...
I use a 20db amp (50Ohm input/output impedance) with an equalization network in front of the preamp to not overdrive the system. Otherwise the kiwi would be overdriven from the huge signals on the lowbands from the longwire.
The noiseplots now are adjusted with the DEFAULT parameter so, that I take all losses of the feedsystem into account (measured with DG8SAQ network analyser) and also included the 4Nec2 simulations about expected antenna gain.
I included 3db extra loss in addition to my calculation and in the NEC2 calculation I added extra 100Ohm ground loss spot impedance at the feedpoint.
The result are the noise plots we see http://graphs.wsprdaemon.org/ON5KQ/
What I do not know at all is, what field strength it represents - so comparison with ITU recommendation (part 13 ), I don't know
Interesting is the comparison of the noiseplots of the vertical dipole (omnidirectional, vertical polarised) with the highly directive Longwire to northwest.
In the morning before lunchtime the Northwest antenna looking into the night is almost dead on 20m - even the very strong Chinese BCstations on 19 and 16m band in the morning are at least 20db weaker (may be way more) than the dipole - there is no QRN at all as well on the large longwire to the 310deg - as there are no condx in this direction. I kept the antenna connected though to compare the plots with the very sensitive vertical dipole at the same time at same qth. The dipole is full of strong signals on 20m in the morning, but also with relatively strong qrn occasianally on 20m.
We can see the outliers over a stable base noise line on the dipole antenna not that visible on the longwire.
One of the 60m frequencies is often jammed by a digital signal of some kind, which is very strong all day long - we can see that also when it is on.
During the week I have lots of QRM from a home-office with a faulty PC-powersupply, which increase noise by 10db - we can see, when buisiness hour starts and noise curve jumps up - (usually from about 7h utc =it switches off after 17hutc and is never there on weekends...
40m and 60m is affected by LED-lamps of neighbors - both about 10db increase of base noise level.
Unfortunately it badly affects my 40m reception, which is basically very well, because I refurbished my old transmit antenna and use it now as additional 40m DX antenna - 8 resonant quarterwave verticals in a circle of 26m diameter, with extensive radial network. The array is best during the day, when there is no lighting and during weekends, when computers are off, and people have a good time on a trip to close Northsea coast rather than making noise on 40m with cheap electronic consumer equipment......
Unfortunately the time is gone, when I could use this large array in wintertime at twighlight or nightime, because all these LED lamps are annoying.
The 40m runs on a seperate receiver (ELAD), on the windows host PC on which the Kiwi's are running in a VMware virtual machine with Ubuntu 18.04LTS
It would be great to connect the ELAD via Vitual Audio cable with the Ubuntu Virtual machine and then define it as audio input for wsprdaemon software. I have not found out how I can do that - because then I can also show the noisegraph of the 40m array - so far it is not included.
73s, good weekend,
Ulli, ON5KQ
It's not the *measured* antenna impedance that goes into the calculation but rather the radiation resistance and how it is coupled to system. A 1 mm monopole connected to a 50 ohm load is matched LF-HF pretty well but has an awful antenna factor below mm wavelengths.
A 'long' wire, still short at some long wavelength, has a low Ra there. At higher frequencies, the length of the wire rotates that radiation resistance delivered from low (~37 ohms for a lambda/4 vertical wire/monopole) to very very high at odd half waves -also known as the 'second resonance'). The 'delivered' Ra varies wildly and is not perfectly match-able over the entire range with components (at least excluding superconductors) and architectures available to mankind. See A New Antenna Model.
For asymmetric antennas like mono-poles, the ground resistance is in series with the Ra . So even though the match may look OK on a,say, 20m long loaded 630m vetical over real ground, the antenna factor is not nearly that of a dipole -the ~100 milli-ohm Ra is not well matched to the 50 ohm transmitter/receiver. For any electrically small antenna most of the power goes to heating matching elements and/or earth losses This is why ERP from LF transmitters are very often only a small fraction of transmitter power. Most of the power goes into matching or ground losses.
On receive only a fraction of the power in the approximately-constant aperture of a matched antenna arrives at a Kiwi's 50 ohm input. While the SNR on any antenna is the same (modulo directivity), the delivered SNR to the rx detector often is not because other contributors, front-end noise figure, common mode current noise, near-field noise sources ... dominate compared to the small voltage developed across Ra. My reason for being interested in examining noise floor, referenced to a well-matched/ITU antenna reference is that it gives guidance as to how much effort to make and the possible system improvements that would result. One needs to be able to compare KTB+Fa on the antenna in use to do this.
- the feedsystem losses are more or less known, as you can measure them
- The antenna to my best possibilities is modelled with NEC2. We can use the results on the various bands as input for the Default parameter or don't adjust anything as some factors will stay unknown. However I feel it gives at least a good indication of how sensitive the whole antenna system is. At least for the bands 10Mhz and higher it is useful, I think.
- I am aware of the many factors I don't know - therefore the ITU-graphs are not important for me.
I don't even think ITU recommendation has any value any more by today, as noiselevels increased in most residential QTHs in my area drastically in the last 20 years. There are many factors, which are not at all included apppropriately in the most recent ITU curves - I have noise plots from 15 years ago - same qth, same hardware/measurement equipment... today the noise is 20db higher than at that time of recording, where there were no LED lamps, radiating Telecom lines with highspeed Internet over old telefon wires from 60 years ago... Such faults and mistakes are normal and standard always everywhere - 20years ago such emmissions where rare exceptions.
So why bother with the ITU recommendation curves which is not representing real life any more - If govenments would take these curves to be guaranteed in any residential area (taking residential curves) they would need to close all consumer and most professional equipment or even better switch off 230V utility lines (We have PLC installed for smart meters !)
As a result the ITU curves for residential must be increased drastically to show the real situation - unfortunately this is not done.
So - I will keep it as it is:
- antenna gain from Nec2 simulation deducted with DEFAULT parameter
- feedsystem losses included as well
Net results are the curves - KIWI_0 = vertical active dipole, KIWI_1 = 100m longwire
I prefer the antennas with better signal to noise ratio - the absolute noise isn't important to me as long as signal stick out more with the better antennas.
It is not only the level of the noise graph rather than the actual picture they show (even if the level is different) - there can be drastical differences
Often the noise graph looks like a dead band - however in reality it is just a very quit antenna, with excellent S/N ratio. You need to listen into your receiver, to find out...
Ulli, ON5KQ
The optional DEFAULT and following band-specific parameters affect only the noise level and noise graphs. If present, they are comma-separated 'BAND:ADJ" pairs with 'DEFAULT:ADJ' (which should be first) applied to bands for which there is no 'BAND:ADJ' definition.
They are there to allow you to adjust the noise level reports for any gain or loss in the transmission system.
For example, if you have a +20 dB LNA ahead of your Kiwi, then 'DEFAULT:-20' should be added to your Kiwi's receiver definition line. DEFAULT:0 adds 0 dB to your measurements, so it will have no effect on your reported noise levels.
I support band-specific adjustments, since at many (if not most sites) there will be elements in the RF transmission chain which change the gain by frequency.
Get at $50 NanoVNA and check out your rx system from antenna feed point to the Kiwi's SMA input. You may be surprised.
I agree that it is difficult to get all the way to an absolute reference but I think it worth the effort. You say:
"I don't even think ITU recommendation has any value any more by today, as noise levels increased in most residential QTHs in my area drastically in the last 20 years"
which I think worth a response.
From measurements that we at KPH, AI6VN, N6GN and others such as the Long Wavelength Array have made, it appears that the actual noise is considerably lower than the ITU numbers! It is my emerging opinion that actual radiated noise - that is, far-field, inverse-square, plane-wave noise - probably has not increased significantly over the years. I think (again, only an opinion) that worldwide lightning noise still dominates below mid-HF, and that above that, galactic/propagated levels can be reached or approached by carefully built systems even near cities, and certainly in rural areas.
If noise generated by a large number of individual sources in a dense urban region had actually increased, such that radiated noise of the sort that falls as the inverse square of distance were now a bigger problem, I think we would measure a different result. Nobody near a city, or even on its outskirts and beyond, would be able to achieve anything like the noise floors that actually are possible. Falling at only 6 dB for every doubling of (ground-wave) distance, it wouldn't be possible.
So "How come everybody thinks noise is worse these days?" is a good question. It's not that I doubt your measurements or experience - you are by no means alone! - it's that I don't think the mechanism is true far-field radiated noise. I don't think what you measured 15 years ago, nor what you are seeing now, is that. Rather, I think what we are all seeing is a combination of coupled and near-field noise mechanisms. Of course, just because these are of a different sort doesn't solve the problem: noise is still noise and still limits us. But if it is true that the mechanism is different, then there are things that can be done. I think this is a very important distinction, and it offers us all hope. Coupling mechanisms can be mitigated, and near-field noise falls off very rapidly with distance, inverse fourth and inverse sixth powers being common.
I've found the KiwiSDR environment a really useful one for exploring these noise mechanisms and ways of mitigating them. My goal is to understand them first, and the broadband capability of a single Kiwi, combined with a lot of test cases worldwide, is really useful for that. There haven't been geographically separated 0-30 MHz spectrum analyzers available in this manner before now, and I really like it.
So while I'm not hung up on becoming a radio astronomer with a Kiwi, I do find the attempt to obtain absolute calibration, and thereby a metric for performance, extremely useful. I'm not expecting everyone to do this. When I started this endeavor several years ago, my 'normal' 20 m antenna showed about -73 dBm in a USB bandwidth; I could only hear the strongest stations, and listening wasn't much fun. After working on it for a year - building active antennas and probes, and trying to learn the coupling mechanisms for ingress noise that wasn't the propagated/galactic noise I really wanted to have as the limit - I was able to make about 30 dB of progress. I could hear 5 honest S units further down than before, and the entire radio listening experience changed for me. And I wasn't even done: I hadn't reached the galactic/propagated noise limits at upper HF.
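For scale, using the common 6 dB per S-unit convention (a quick check, not from the original post):

    awk 'BEGIN {
        start_dbm = -73; improvement_db = 30; db_per_s_unit = 6
        printf "%d dBm -> %d dBm, i.e. %.0f S-units lower\n",
            start_dbm, start_dbm - improvement_db, improvement_db / db_per_s_unit
    }'
    # -73 dBm -> -103 dBm : the "5 honest S units" mentioned above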
Since then I've repeated this type of effort at several sites with good results. Again, I'm probably not done. Except perhaps at KPH - where, with Rob, AI6VN, doing most of the heavy lifting, we can get to a galactic/propagated noise limit (perhaps almost 10 dB *below* what the ITU suggests!) - the other sites are still very much a work in progress, but one in which the Kiwi plays a very useful role.
"So "How come everybody thinks noise is worse these days?" is a good question."
Look at the noise between 3.75 and 5.3 MHz here: on5kq.ddns.net:8074
This noise comes from the local telecom operator: it is the modems' return path on the VDSL2 network. This noise is everywhere in our city - sometimes much stronger, sometimes less.
Here is the noise 200 m away on an empty field (see the photos on my QRZ page):
on5kq.ddns.net:8073
The feedline has about 100 kOhm of common-mode chokes with multiple good earth connections - you can verify how clean the VLF band is (where noise ingress is almost ONLY conducted). In my case there is NO conducted noise left at all; what remains is radiated noise. Not always far field - but most man-made noise is NOT far field, at least not on very low frequencies...
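For a rough feel of what that much choking buys, here is a crude lumped model (the common-mode loop impedance below is an assumed illustrative value, not a measurement):

    # choke's CM impedance in series with the common-mode loop impedance
    awk 'BEGIN {
        z_loop  = 200      # assumed source + load CM impedance, ohms
        z_choke = 100000   # ~100 kOhm of choking as described above
        printf "CM attenuation ~ %.0f dB\n", 20 * log((z_loop + z_choke) / z_loop) / log(10)
    }'
    # ~54 dB of common-mode suppression under these assumptions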
"Noise is still noise and still limits us but if it is true that the mechanism is different then there are things that can be done."
I think this is an illusion! The local official regulator wrote me a letter saying they don't have the resources to serve radio amateurs - they have no manpower to come out and check the situation locally. So nothing has been done about my complaints against various illegal emissions in the neighbourhood.
It is clear that the owner of the VDSL2 network is a monopolist, because it is simply the only physical network available; although they are forced to lease it to competing telecom operators for commercial fairness, it is still the only network.
The government is not interested in fighting a monopolist - the local officer told me: "we are not going to shut down Belgium for you..." In other words: commercial interests are more important than everything else...
The radiated noise on shortwave has increased drastically here.
LW and VLF are a different story, as you can see on my RX.
But on the higher frequencies, where good antennas are relatively short, radiated noise has increased very much - that is 160 m and up. Even 6 m is sometimes unusable - PLC at S9 everywhere... so forget it. Commercial interests dictate the future and nothing else!
Ulli, ON5KQ
It may be that at this distance there is still near-field noise from the VDSL, but that seems a little unlikely. See LZ1AQ's comments about his experience as well.
Needless to say, though you may be correct about the future, I hope you are wrong - that it is not hopeless. I hope to continue to demonstrate, along with a few others, that something can still be done.
I can imagine how much effort it takes to build something new at a remote location with limited access...
At the moment all the SDRs seem to be off air (or not connected to decent antennas), as there are no real signals on the receivers - at least when I checked the Marconi T and the TCI 530 at your local noon.
Can the Kiwi support a WiFi connection? Are the USB ports fully functional, and does the Kiwi software support external USB devices like a WiFi stick or even an LTE modem (for high-speed mobile connections)?
One could also consider an optical link for communication with the Kiwi hardware, together with local batteries.
However, the Kiwi is really not a low-power device - large batteries and solar power would be necessary... with a lot of new problems...
Ulli
Since this is really only for testing, to see what your noise situation looks like with no transmission-line common-mode noise, this should be plenty of time.
If you discover that there is noise ingress via the feed line, then you have a measurement of the goal and can proceed to work on it. If not, you can discover the sensitivity to location and learn more about the source/coupling of the noise.
I ask because I have had a Kiwi-related idea rolling around in my head for a long time. But it has to be battery powered and has other severe constraints. Thanks!
I've previously done charge/discharge cycles on these LiPo batteries, which are what I used to power mid-sized quadcopters, and find that on a new one 2000 mAh can be recovered without risk of under-voltage of the (finicky) LiPo cells. If one says that the average potential over the discharge cycle is 3.6 V per cell - a number I pull out of the air plus a little experience - then we can call the battery a 21.5 watt-hour source, which suggests that almost 5 hours is reasonable.
But these batteries are available in a variety of capacities so half to twice this time is just a matter of choice. Storage capacity and weight seem to track pretty closely so one just buys whatever is desired. I think the one shown is currently about US$15.
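Checking that arithmetic (the 3-cell pack and the load figure are assumptions implied by the quoted numbers, not stated above):

    awk 'BEGIN {
        cells = 3; v_avg = 3.6; amp_hours = 2.0  # assumed 3S pack, 2000 mAh usable
        load_w = 4.3                             # assumed Kiwi + BeagleBone draw
        wh = cells * v_avg * amp_hours           # = 21.6 Wh, close to the 21.5 quoted
        printf "%.1f Wh / %.1f W = %.1f hours\n", wh, load_w, wh / load_w
    }'
    # ~21.6 Wh and ~5 hours of runtime, matching the figures above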
I visited KPH on Saturday and restored the Kiwis to operation. In addition, I asked the KPH MRHS operators to leave the 500 kHz filter enabled on the feed from the Marconi T. That filter of course suppresses signals above and below 500 kHz, but it also keeps the legacy KPH LNA out of overload, so the N*10 kHz IMD products you previously saw on that Kiwi are not present today. Here is the Marconi T RF path:
100M Marconi T ==> remotely switchable filter bank (currently set to 500 kHz)
               ==> 100M RGx ==> 2-way splitter ==> KPH legacy distribution amp ==> KPH legacy receivers
                                              ||==> AM BPF ==> Kiwi73
The IMD products you previously observed are almost certainly due to overload of the KPH legacy distribution amp when the switchable filter bank is disabled. Its IMD feeds back through the 2-way splitter and appears at the input of the Kiwi.
I have suggested that the filter bank be replaced by the AM BPF, and that we replace the 2-way with a 4-way passive splitter and so get all active devices out of the RF path. However, KPH is a 'living history museum' and such changes require approval by others.
So this week you will see no IMD, but mostly 200-500 kHz signals and very little above 1 MHz. Perhaps this will encourage listeners who have chosen KPH72/LF for 7-14 MHz listening to move to KPH73/HF.
Rob
My BeagleBone AI was running from a 3 A switcher without much mains filtering, with an LZ1AQ mag loop and lots of earthing (KIWI_70).
Another BBG was running on a linear supply with a decent-sized mains filter ahead of it, with a Wellbrook antenna not earthed at the far end (KIWI_163).
There was, I assume, a noise spike from the mains(?) from about 2 PM that affected only the BBAI + LZ1AQ loop.
As I've said a thousand times, it can get noisy here from a few sources, including the mains, but I've not seen such a pronounced blip.
The noise charts are online, so if anyone can come up with some possible causes I'd be grateful.
I'm guessing mains-borne noise coming up through the unfiltered switcher, but as it is early days for the AI I wanted to flag it in case others see the same and can tie it to some event around the AI.
I've had the AI on the linear supply since about 9:30 PM, so we will see if it changes the plots tomorrow.
Clint, KA7OEI
I've just remembered that the second Kiwi is on a WiFi bridge, while the first (70) is connected directly to a small access point and fibre converter - two more supplies to check.
--later-- I've just checked the SDRs and there is some horrible repeating noise pattern across a large part of :8075 (LZ1AQ), slightly less visible on :8076 (Wellbrook). I had noticed this spur pattern only at the very high end of HF for about a week. I think both those antennas are nulled on the same house, so I need to investigate later whether it is a new source somewhere else. Looks like a low-spec Chinese PSU to me.
Ah, the joys of having electrical ears.