• Alex@lemmy.ml · 4 hours ago

    The demand for LLM inference will drop off when people finally realise it is not the road to AGI. However, there are still plenty of things GPU compute can be applied to, and maybe spot prices will come down again.

    • BlameThePeacock@lemmy.ca · English · 4 hours ago

      It’s more likely that the demand for LLM inference will drop off only once AGI exists.

      There are billions of active users already having it do everything from coming up with ideas for Christmas presents to helping them write e-mails to clients.

      That isn’t going anywhere until there’s a better option on the table. People would already have got bored and moved on if it weren’t doing anything useful for them.