• Perspectivist@feddit.uk · 4 hours ago

    ChatGPT alone has 800 million weekly users, the vast majority of whom are ordinary people, not companies. The demand is there even though it hasn't increased company profit margins the way people expected. I don't see this computing infrastructure needing to run idle anytime soon.

    • Varyk@sh.itjust.works (OP) · 4 hours ago

      ChatGPT is constantly losing money. Public surface-level interest won't matter much when the capital runs out and they're still accruing significant debt without the revenue to cover it.

      • FaceDeer@fedia.io · 1 hour ago

        A major problem for first-mover companies like OpenAI is that they spend an enormous amount of money on basic research, initial marketing, and hardware purchases just to get set up in the first place. Those expenses become debts that the business has to pay off later. If they went bankrupt and sold ChatGPT to some other company for pennies on the dollar, the new owner would be in a much better position to be profitable.

        There is clearly enormous demand for AI services, despite all the "nobody wants this" griping you may hear in social media bubbles. That demand isn't going to disappear, and neither are the AIs themselves. It's just a matter of finding the right price to balance things out.

    • Melobol@lemmy.ml · 3 hours ago

      The free plan of ChatGPT is more than enough for most people. And if they decide to start charging for it, probably 30% of free users will switch to a different (maybe even locally run) AI.
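      To give a concrete sense of what "locally run AI" can look like in practice, here is a minimal sketch that queries a model hosted on your own machine instead of a cloud API. It assumes the open-source Ollama server is running locally with a "llama3" model pulled, plus the ollama Python client; none of these tools are mentioned in the thread and are only one possible setup.

      ```python
      # Minimal sketch of querying a locally hosted model instead of a cloud API.
      # Assumes the Ollama server is running locally, the "llama3" model has been
      # pulled, and the client library is installed (`pip install ollama`).
      import ollama

      response = ollama.chat(
          model="llama3",  # hypothetical choice of local model
          messages=[{"role": "user", "content": "Why might local inference avoid per-query cloud costs?"}],
      )
      print(response["message"]["content"])
      ```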

    • ch00f@lemmy.world · 3 hours ago

      I think OP is talking about all of the future data centers that are allegedly being built, despite nobody even knowing where. Nvidia has agreed to invest $10B in OpenAI for each gigawatt of data center capacity it brings online, up to 10 gigawatts over the next few years.

      Unlikely that will fully materialize, but that’s the current outlook.
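      For scale, the figures quoted above imply a total commitment of roughly $100B. A quick back-of-the-envelope sketch, using only the numbers from this comment (not independently verified):

      ```python
      # Rough arithmetic on the reported Nvidia/OpenAI pledge, using the figures
      # quoted in the comment above (assumed, not verified).
      investment_per_gw_usd = 10e9  # ~$10B pledged per gigawatt deployed
      planned_capacity_gw = 10      # ~10 GW of planned buildout

      total_commitment_usd = investment_per_gw_usd * planned_capacity_gw
      print(f"Total pledged: ${total_commitment_usd / 1e9:.0f}B")  # -> Total pledged: $100B
      ```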