  • I live close to the central area of an ~80,000 population city.

    Looking at Google Maps, I’ve got about 10 general stores within 300 metres, probably thrice that within 500 m, plus plenty of smaller specialised stores.

    300 to 600 m seems like a reasonable distance to walk there and back with four to six bags of groceries.

    For smaller, more specialised shopping trips, one or two kilometres would be fine too.

    Three might be a bit much, though I’ve often walked that to go to the cinema.



  • in the unable-to-reason-effectively sense

    That’s all LLMs by definition.

    They’re probabilistic text generators, not AI. They’re fundamentally incapable of reasoning in any way, shape or form.

    They just take a text and produce the most probable word to follow it according to their model; that’s all (there’s a rough sketch of that loop below).

    What Musk’s plan will produce (using an LLM to regurgitate as much of its model as it can, expunging all references to Musk being a pedophile and whatnot from the resulting garbage, adding some racism and disinformation for good measure, and training a new model exclusively on that slop) is a significantly more limited model, more prone to hallucinations, that occasionally spews racism and disinformation.
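
    A rough sketch of that generation loop, assuming a toy hand-made bigram table standing in for a real trained network; the words and probabilities here are purely illustrative, not from any actual model:

    ```python
    # Toy next-word generator: a hand-made bigram table stands in for the
    # trained model. Real LLMs use a neural network over subword tokens,
    # but the loop is the same idea: given the context, pick a probable
    # next token, append it, repeat.
    import random

    # Hypothetical probabilities, purely illustrative.
    NEXT_WORD = {
        "the":  {"cat": 0.5, "dog": 0.3, "<end>": 0.2},
        "cat":  {"sat": 0.6, "ran": 0.3, "<end>": 0.1},
        "dog":  {"ran": 0.7, "sat": 0.2, "<end>": 0.1},
        "sat":  {"down": 0.8, "<end>": 0.2},
        "ran":  {"<end>": 1.0},
        "down": {"<end>": 1.0},
    }

    def generate(word: str, max_words: int = 10) -> str:
        """Sample one probable word after another until the table 'stops'."""
        out = [word]
        for _ in range(max_words):
            dist = NEXT_WORD.get(word, {"<end>": 1.0})
            word = random.choices(list(dist), weights=list(dist.values()))[0]
            if word == "<end>":
                break
            out.append(word)
        return " ".join(out)

    print(generate("the"))  # e.g. "the cat sat down"
    ```

    Nothing in that loop understands or reasons about anything; a real LLM just swaps the hand-made table for a neural network producing a vastly larger probability distribution over subword tokens.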



  • You’re not taking into account the fact that LLMs are an obvious dead end.

    Once that bubble bursts, it’ll take decades before anyone invests in AI research again, and before anything attached to the term “AI” stops being seen as a scam (LLMs are obviously not AI or anything close, but they’re being sold as such, and that’s what the term will be associated with). Not to mention we’ll need decades to clean up all the LLM slop spillage before proper research of any kind can proceed.

    What you said was valid before the well got poisoned.

    Now it’s extremely unlikely we’ll survive long enough to get back on track.

    LLM peddlers murdered the future in the name of short-term profits.


  • We were on track for it, but LLMs derailed that.

    Now we’ll have to wait for the bubble to burst, which will poison the concept of AI (since LLMs are being sold as AI despite being practically the opposite) in the minds of both users and investors for decades.

    It’d probably take a couple of generations for any funding for AI research to become available after that (not to mention cleaning up all the LLM slop spillage from our knowledge repositories)… but by that time we’ll almost certainly be extinct due to global warming.

    The LLM peddlers murdered the future for short-term profits, and doomed us all in the process.