
Kiwi HTTPS/SSL: it works fine, but its use is limited by difficult certificate deployment problems.

I've spent the last couple of weeks learning about SSL, certificates, Let's Encrypt (LE) and the rest. You probably didn't notice, but the forum now uses HTTPS. After some struggles, I got the little web server inside the Kiwi software (Mongoose) to support HTTPS/SSL connections. I had to add some code to do automatic HTTP-to-HTTPS connection upgrades. An additional port for simultaneous, local-only HTTP (non-SSL) connections is also supported.
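
For the curious, the HTTP-to-HTTPS upgrade is conceptually just a 301 redirect: anything arriving on the plain HTTP listener is told to retry the same host and path on the HTTPS port. A minimal sketch of the idea (not the actual Kiwi/Mongoose code; the helper name and the 8073 port are placeholders):

    // Sketch only: shows the shape of an HTTP-to-HTTPS "upgrade".
    // A request arriving on the non-SSL listener gets a 301 pointing at the
    // same host and path on the HTTPS port; the browser then retries over TLS.
    #include <stdio.h>

    // Hypothetical helper: 'host' is the Host: header with any ":port" stripped,
    // 'uri' is the request path, 'https_port' is the SSL listener's port.
    static int make_https_redirect(char *buf, size_t len,
                                   const char *host, const char *uri,
                                   int https_port) {
        return snprintf(buf, len,
            "HTTP/1.1 301 Moved Permanently\r\n"
            "Location: https://%s:%d%s\r\n"
            "Content-Length: 0\r\n"
            "Connection: close\r\n"
            "\r\n",
            host, https_port, uri);
    }

    int main(void) {
        char resp[512];
        make_https_redirect(resp, sizeof(resp), "kiwi.example.com", "/", 8073);
        fputs(resp, stdout);    // what the plain-HTTP listener would send back
        return 0;
    }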

The performance impact of the SSL library encrypting all the web traffic is unknown. It will take some empirical measurements to decide whether this is a problem.

The real problem is certificate management. There are many different Kiwi network deployment scenarios, and not many of them support easy certificate issuance, let alone automatic renewal (e.g. LE certificates expire after 90 days and must be renewed).

For example, the Kiwi software could automate certificate handling pretty easily given the following conditions:

  • You use your own privately registered domain to address your Kiwi.
  • Your ISP does not block incoming connections on port 80, which the LE HTTP-01 challenge authenticator requires (see the sketch after this list).
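
To make the port 80 requirement concrete: the HTTP-01 challenge is just the CA fetching http://your-domain/.well-known/acme-challenge/<token> over port 80 and expecting the matching key authorization back in the response body. A simplified sketch of the server side (not what the Kiwi actually does; the webroot path below is made up):

    // Sketch only. An ACME client (e.g. certbot in webroot mode) writes the
    // key authorization for each pending challenge into a local directory;
    // the web server just hands that file back when the CA asks for it.
    #include <stdio.h>
    #include <string.h>

    #define ACME_PREFIX "/.well-known/acme-challenge/"

    // Hypothetical handler: returns the HTTP status to send, filling 'body'
    // with the key authorization ("<token>.<account thumbprint>") on 200.
    static int serve_acme_challenge(const char *uri, char *body, size_t len) {
        if (strncmp(uri, ACME_PREFIX, strlen(ACME_PREFIX)) != 0)
            return 403;                     // anything else on port 80: refuse
        if (strstr(uri, "..") != NULL)
            return 403;                     // no path traversal
        char path[512];
        snprintf(path, sizeof(path), "/var/kiwi/acme%s", uri);  // placeholder webroot
        FILE *f = fopen(path, "r");
        if (f == NULL)
            return 404;                     // unknown or expired token
        size_t n = fread(body, 1, len - 1, f);
        body[n] = '\0';
        fclose(f);
        return 200;
    }

    int main(void) {
        char body[256];
        int status = serve_acme_challenge(ACME_PREFIX "some-token", body, sizeof(body));
        printf("status %d\n", status);      // 404 here unless that file exists
        return 0;
    }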

But not all Kiwis are set up this way. Some use a reverse proxy to get around the blocking of incoming connections. Some are addressed by IP address or via dynamic DNS. These limitations make it difficult or impossible to get a cert issued by a public certificate authority (CA). It is possible the Kiwi reverse proxy might be adapted to use SSL. I am still looking into that.

As a further example, the second line of http://rx.kiwisdr.com has been modified to show the number of public Kiwis addressed using: the Kiwi proxy service, a DDNS service, a private domain name or an IP address. Only about 30% of public Kiwis use a private domain name.
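
Classifying them is mechanical, for what it's worth; something along these lines (illustrative only, and the proxy/DDNS suffixes are placeholders, not the actual lists the page uses):

    // Sketch only: decide how a public Kiwi is addressed from its host part.
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    static int ends_with(const char *s, const char *suffix) {
        size_t ls = strlen(s), lx = strlen(suffix);
        return ls >= lx && strcmp(s + ls - lx, suffix) == 0;
    }

    // Returns "ip", "proxy", "ddns" or "domain".
    static const char *classify_host(const char *host) {
        struct in_addr a4; struct in6_addr a6;
        if (inet_pton(AF_INET, host, &a4) == 1 ||
            inet_pton(AF_INET6, host, &a6) == 1)
            return "ip";
        if (ends_with(host, ".proxy.kiwisdr.com"))      // assumed proxy suffix
            return "proxy";
        const char *ddns[] = { ".duckdns.org", ".ddns.net", ".dyndns.org" };  // examples only
        for (size_t i = 0; i < sizeof(ddns) / sizeof(ddns[0]); i++)
            if (ends_with(host, ddns[i]))
                return "ddns";
        return "domain";                                // assume privately registered
    }

    int main(void) {
        const char *tests[] = { "203.0.113.7", "mykiwi.duckdns.org", "sdr.example.org" };
        for (size_t i = 0; i < 3; i++)
            printf("%-22s -> %s\n", tests[i], classify_host(tests[i]));
        return 0;
    }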

Comments

  • Having port 80 open for Let's Encrypt renewals is something that has caught me out before; it increases traffic, of course, as the crawlers find the port open, regularly scan for content updates, and poke other common ports.

    I wonder if a separate thread for crawler IPs to block might save extra loading later.

    I still don't see SSL as critical for this use generally, but I can imagine scenarios where it is preferred. Hopefully, where it is seen as 'required', the user can justify a domain registration or fixed IP.

  • I went as far as restricting port 80 traffic to only the /.well-known/ subdirectory used by Let's Encrypt's ACME HTTP-01 challenge mechanism. Except for those Kiwis that are actually configured to use port 80 instead of e.g. 8073 as their external port (there are some that do that). So the bots will get lots of HTTP 403 "Forbidden" errors if they start poking around. I should probably change that so there is no response at all, but I was debugging and needed the feedback.
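
    Roughly, that filter amounts to the following (placeholder sketch, not the actual code):

      // Sketch only: what the port 80 listener answers before the normal
      // web server code ever sees the request.
      #include <stdio.h>
      #include <string.h>

      #define ACME_PREFIX "/.well-known/acme-challenge/"

      // Returns the HTTP status to reply with; 0 means pass the request on
      // to the regular request handling.
      static int acme_port80_filter(const char *uri, int external_port) {
          if (external_port == 80)
              return 0;                 // this Kiwi really is served on port 80
          if (strncmp(uri, ACME_PREFIX, strlen(ACME_PREFIX)) == 0)
              return 0;                 // genuine challenge request
          return 403;                   // bots poking around get "Forbidden"
      }

      int main(void) {
          printf("%d\n", acme_port80_filter("/ftptest.cgi", 8073));        // 403
          printf("%d\n", acme_port80_filter(ACME_PREFIX "token", 8073));   // 0
          printf("%d\n", acme_port80_filter("/ftptest.cgi", 80));          // 0
          return 0;
      }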

  • Sounds good. I think the traffic I saw was trying well-known directories and, if the site had previously been indexed, its structure. If the spiders don't get much on any visit, the impact should be negligible.

    Not sure if they really respect the noindex meta tag; I should try it on some server and then use the URL inspection tool to prompt a visit. https://support.google.com/webmasters/answer/9012289

  • When I first added the code to open port 80 it was probably less than 60 seconds before my debugging printf tripped and I got this:

    webserver: bad ACME request evt=103 23.95.100.141:80 </ftptest.cgi>

    It took me a moment to realize what had happened, lol.
