We all migrate to smaller websites and try not to post anywhere that draws attention, just to hide from the “AI” crawlers. The internet seems dead except for the few pockets we each know exist, away from the clankers.

  • dual_sport_dork 🐧🗡️@lemmy.world
    11 hours ago

    There are several things you could do in that regard, I’m sure. Configure your services to listen only on weird ports, disable ICMP pings, jigger your scripts to return timeouts instead of error messages… Many of these might make your own life difficult, as well.
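    A minimal sketch of those tricks as firewall rules, assuming a Linux host with nftables; the port number (2298) and the idea of moving SSH there are purely hypothetical examples. The default-drop policy is the “timeout instead of error message” part: scanners probing closed ports hang waiting for a reply instead of getting a refusal that confirms a live host.

    ```shell
    # Hypothetical nftables sketch: nonstandard port, silent ICMP drop,
    # and DROP (timeout) instead of REJECT (error) for everything else.
    # Requires root; run each line with nft(8).

    nft add table inet filter

    # Default-drop input chain: unsolicited probes time out silently.
    nft add chain inet filter input '{ type filter hook input priority 0; policy drop; }'

    # Keep established traffic and loopback working for yourself.
    nft add rule inet filter input ct state established,related accept
    nft add rule inet filter input iif lo accept

    # Example: accept SSH only on a weird high port (2298 is made up;
    # you would also set "Port 2298" in sshd_config).
    nft add rule inet filter input tcp dport 2298 accept

    # Explicitly ignore pings, so the host doesn't answer ICMP echo.
    # (Redundant under policy drop, but documents the intent.)
    nft add rule inet filter input icmp type echo-request drop
    ```

    The tradeoff the comment mentions is visible here: with a policy of drop, anything you forget to allow also times out for you, not just for the bots.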

    All of these are also completely counterproductive if you want your hosted service, whatever it is, to be accessible to others. Or maybe not, if you don’t. The point is, the bots don’t have to find every single web service and site with 100% accuracy. The hackers only have to get lucky once and stumble into, e.g., someone’s unsecured web host where they can push more malware, or a pile of files they can encrypt and demand a ransom for, or personal information they can steal, or content they can scrape with their dumb AI, or whatever. And they can keep trying until the sun burns out, basically for free, while you have to stay lucky and under the radar forever.

    In my case, just to name an example, I kind of need my site to be accessible to the public at large if I want to, er, actually make any sales.