Your second point is a good one, but you absolutely can log the IP that requested robots.txt. That's just a standard part of any HTTP server, no JavaScript needed.
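For example, assuming the default combined log format that nginx and Apache both use (the log lines below are made-up samples), pulling out every IP that fetched robots.txt is a one-liner:

```shell
# Sample access log in the common/combined format (illustrative data only)
cat > access.log <<'EOF'
203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /robots.txt HTTP/1.1" 200 68 "-" "ExampleBot/1.0"
203.0.113.9 - - [10/Oct/2024:13:55:40 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF
# Field 1 is the client IP, field 7 is the request path
awk '$7 == "/robots.txt" { print $1 }' access.log
```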
Be sure not to create an open resolver, something commonly used in DDoS attacks. https://serverfault.com/questions/573465/what-is-an-open-dns-resolver-and-how-can-i-protect-my-server-from-being-misused#573471
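If you end up using something like unbound, a minimal sketch of avoiding the open-resolver trap is to refuse recursion for everyone except your own networks (the LAN range here is an example — substitute your own):

```
# unbound.conf snippet: deny recursion by default, allow only trusted ranges,
# so the server can't be abused as a DDoS amplifier
server:
    access-control: 0.0.0.0/0 refuse
    access-control: 127.0.0.0/8 allow
    access-control: 192.168.1.0/24 allow
```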
This makes perfect sense. Thank you!
That makes some amount of sense. I'm not sure exactly how each article is stitched together to create the full file. Do you happen to know if it's just put together sequentially, or if there's XORing or a more complex algorithm going on there? If it's only the former, they would still be hosting copyrighted content, just a bit less of it.
EDIT:
https://sabnzbd.org/wiki/extra/nzb-spec
This implies that they are just individually decoded and stitched together.
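Per that spec, reassembly looks like this sketch: each segment is yEnc-decoded independently and the results are concatenated in segment order, with no XOR or interleaving. (`decode_yenc` is a stand-in here, not a real decoder.)

```python
# Sketch of sequential NZB reassembly: sort segments by their <segment number>,
# decode each article independently, and concatenate the decoded bytes.
def decode_yenc(article_body: bytes) -> bytes:
    # Real yEnc decoding strips the =ybegin/=yend lines, subtracts 42 mod 256,
    # and handles escape sequences. Stubbed as identity for illustration only.
    return article_body

def reassemble(segments: dict[int, bytes]) -> bytes:
    # No XOR, no interleaving -- just sequential stitching in segment order.
    return b"".join(decode_yenc(segments[n]) for n in sorted(segments))

parts = {2: b"world", 1: b"hello "}
assert reassemble(parts) == b"hello world"
```

If that matches reality, then yes: any given server holding even one segment is holding a directly decodable slice of the copyrighted file.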
Pretty good tool. I took the quiz out of curiosity, and the top result was my current distro.
I think the point is that now he doesn't have to take the time to go around the house prying the batteries out and replacing them every year. A small chore to be sure, but one that I'd be happy to do away with.
Just wanted to let you know I somewhat found a solution and edited my post to reflect that.
I'll check it out. Thanks!
Didn't work, unfortunately. Same exact issues.
Rootless podman. The plan is to eventually move WG into a container once I get it working, but it's running on bare metal at the moment.
Nope. I can't ssh in either.
I do see the request. I'm running it inside a container so all the clients show up as the container's hostname.