We all migrate to smaller websites and try not to post anywhere that draws attention, just to hide from the “AI” crawlers. The internet seems dead except for the few pockets we each know exist, away from the clankers.

  • kazaika@lemmy.world · 14 hours ago

    Servers that are meant to be secure are usually configured not to react to pings and not to give out failure responses to unauthenticated requests. That should be viable for the authenticated-only, walled-garden type of website OP is suggesting, no?
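
    Roughly what I mean, sketched in Python; the port, the X-Auth-Token header name and the token value are made-up placeholders, not anything OP specified. A request without the expected token just gets its connection closed: no 401, no error page, nothing for a scanner to fingerprint.

```python
# Sketch only: a tiny HTTP service that answers authenticated requests and
# says nothing at all to everything else (no 401, no error page).
# AUTH_TOKEN, the header name and the port are made-up placeholders.
import hmac
from http.server import BaseHTTPRequestHandler, HTTPServer

AUTH_TOKEN = b"replace-with-a-real-secret"  # hypothetical shared secret

class QuietHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        token = self.headers.get("X-Auth-Token", "").encode()
        if not hmac.compare_digest(token, AUTH_TOKEN):
            # Unauthenticated: close the connection without writing a response,
            # so a probe can't tell a protected service from a broken one.
            self.close_connection = True
            return
        body = b"ok\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the access log quiet as well

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8443), QuietHandler).serve_forever()
```

    The `hmac.compare_digest` call is just so the token check doesn't leak timing; the no-ping half would live in the firewall (dropping ICMP), which a sketch like this can't show.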

    • Cooper8@feddit.online · 9 hours ago

      I have suggested a couple of times now that ActivityPub should implement an encryption layer for authenticating users’ requests and pings. It already has a system for instances vouching for each other. The problem is that users of “walled garden” instances on ActivityPub have no way of interfacing with public-facing instances that doesn’t leave the network open for scraping. I believe a pivot toward serving content to registered users only by default, built on encrypted handshakes, with the ability for servers to opt in to serving content to unregistered users, would make the whole network much more robust and less dependent on third-party contingencies like Cloudflare.
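
      A hand-wavy sketch of what the verification side could look like, not the real HTTP Signatures machinery that Mastodon-style instances already use for server-to-server fetches; the key registry, the request shape and every name here are hypothetical:

```python
# Rough sketch, not an actual ActivityPub implementation: an instance only
# serves a post if the request is signed by a key it has already vouched for,
# unless it has explicitly opted in to serving unregistered readers.
from dataclasses import dataclass
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

# Hypothetical registry: instance domain -> public key learned while federating.
VOUCHED_INSTANCE_KEYS: dict[str, Ed25519PublicKey] = {}

SERVE_UNREGISTERED = False  # the opt-in switch for fully public serving

@dataclass
class SignedFetch:
    origin: str        # e.g. "social.example"
    payload: bytes     # the canonical bytes that were signed
    signature: bytes   # detached signature over payload

def may_serve(request: SignedFetch) -> bool:
    """Serve only correctly signed, vouched-for peers (or anyone, if opted in)."""
    if SERVE_UNREGISTERED:
        return True
    key = VOUCHED_INSTANCE_KEYS.get(request.origin)
    if key is None:
        return False  # unknown instance: behave as if nothing is here
    try:
        key.verify(request.signature, request.payload)
        return True
    except InvalidSignature:
        return False
```

      As far as I understand it, this is close to what “authorized fetch” mode already does per-instance; my point is making that the protocol default rather than an option bolted on afterwards.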

      Then again, maybe I should just be looking for a different network. I’m sure there are services in the blockchain/crypto sphere that take that approach; I just would rather participate in a network built on the commons rather than one with financialization at its core. Where is the protocol that does both: a hardened network and distributed volunteer instances?

    • dual_sport_dork 🐧🗡️@lemmy.world · 11 hours ago

      There are several things you could do in that regard, I’m sure: configure your services to listen only on weird ports, disable ICMP pings, jigger your scripts to return timeouts instead of error messages… Many of these might make your own life difficult as well.
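
      The timeouts-instead-of-errors bit could look something like this in Python, purely as a sketch; the port number and the magic path are invented for the example. Anything that doesn’t present the expected request gets silence until the client gives up.

```python
# Sketch of "timeouts instead of errors" on a non-standard port: unexpected
# requests get no reply at all, so scanners see a stall rather than an error.
# The port and MAGIC_PATH are made up; single-threaded to keep the sketch short.
import socket
import time

MAGIC_PATH = b"GET /only-we-know-this"

with socket.create_server(("0.0.0.0", 48321)) as srv:  # weird port, not 80/443
    while True:
        conn, _addr = srv.accept()
        with conn:
            first_bytes = conn.recv(1024)
            if first_bytes.startswith(MAGIC_PATH):
                conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 3\r\n\r\nok\n")
            else:
                # Say nothing and stall; most scanners' client timeouts fire first.
                time.sleep(30)
```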

      All of these are also completely counterproductive if you want your hosted service, whatever it is, to be accessible to others. Or maybe not, if you don’t. The point is, the bots don’t have to find every single web service and site with 100% accuracy. The hackers only have to get lucky once and stumble their way into, e.g., someone’s unsecured web host where they can push more malware, or a pile of files they can encrypt and demand a ransom for, or personal information they can steal, or content they can scrape with their dumb AI, or whatever. But they can keep on trying, basically for free, until the sun burns out, and you have to stay lucky and under the radar forever.

      In my case, just to name an example, I kind of need my site to be accessible to the public at large if I want to, er, actually make any sales.