• AeonFelis@lemmy.world · 1 day ago

    A phone can do a lot. Much, much more than an ENIAC-era supercomputer (I think you’d have to get pretty close to the end of the previous century to find a supercomputer more powerful than a modern smartphone).

    What a phone can’t do is run an LLM. Even powerful gaming PCs struggle with that: they can only run the less powerful models, and queries that would feel instant on service-based LLMs take minutes, or at least tens of seconds, on a single consumer GPU. Phones certainly can’t handle that, but that doesn’t mean they “can’t do anything”.
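
    For a rough sense of the gap, here’s a minimal sketch of what I mean by running one of the less powerful models on a consumer GPU and timing it. It assumes llama-cpp-python is installed and you already have a quantized GGUF file; the model path is just a placeholder, not a recommendation:

    ```python
    # Rough sketch, not a benchmark: run a smaller quantized model locally
    # with llama-cpp-python and time the response. The GGUF path is a
    # placeholder for whatever model you actually downloaded.
    import time
    from llama_cpp import Llama

    llm = Llama(
        model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
        n_ctx=2048,
        n_gpu_layers=-1,  # offload all layers to the GPU if they fit in VRAM
    )

    start = time.time()
    out = llm("Explain why the sky is blue, in one paragraph.", max_tokens=256)
    elapsed = time.time() - start

    tokens = out["usage"]["completion_tokens"]
    print(f"{tokens} tokens in {elapsed:.1f}s ({tokens / elapsed:.1f} tok/s)")
    ```

    The tokens-per-second number that prints at the end is the whole point: on a single consumer GPU it’s nowhere near what a hosted service gives you.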

    • bandwidthcrisis@lemmy.world · 1 day ago

      I’ve run small models (a few GB in size) on my Steam Deck. It gives reasonably fast responses (faster than a person would type).

      I know they’re far from state-of-the-art, but they do work (something along the lines of the sketch below), and the Steam Deck isn’t using much power to run them.
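
      For context, the kind of CPU-only run I’m talking about looks roughly like this, assuming llama-cpp-python and a small quantized GGUF model; the exact file and thread count are placeholders:

      ```python
      # Minimal sketch of a CPU-only run: a small quantized model
      # (a few GB on disk) via llama-cpp-python. The model file and
      # thread count are placeholders, not a specific recommendation.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./phi-3-mini.Q4_K_M.gguf",  # placeholder ~2 GB model
          n_ctx=2048,
          n_gpu_layers=0,  # no GPU offload, pure CPU inference
          n_threads=4,     # the Deck's APU has 4 cores / 8 threads
      )

      out = llm("Write a haiku about handheld PCs.", max_tokens=64)
      print(out["choices"][0]["text"])
      ```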