Chrome version 147 silently downloads Gemini Nano’s weights.bin file to local storage, sparking major privacy, data, and legal concerns.

  • Multiplexer@discuss.tchncs.de · 20 hours ago

    But…
    Isn’t that a good thing?
    I mean, running an LLM locally is much more private than running it somewhere in the cloud at a provider that gets your raw data, isn’t it?
    All your data stays on your device, which also makes it much, much harder for Google to argue why it should be uploaded to their data centers.

    • slevinkelevra@sh.itjust.works · 19 hours ago

      All your data stays on your device

      You don’t seriously believe that, do you? They just use your device’s memory and CPU, and thus your electricity, to shovel through your data and then send anything valuable back to their servers.

      • irate944@piefed.social · 17 hours ago

        For clarity’s sake, that’s not what’s happening here. (Don’t misunderstand this comment as defending Google; I could write a book about how much they suck.)

        The model downloaded is an LLM called Gemini Nano, and it’s used for things like “help me write”, checking whether an incoming message is a scam, summaries, etc.

        Don’t worry about the model itself being spyware. It’s not; but for argument’s sake, if we were to assume that it was: they already know a lot about you through their usual apps and services, and get a lot more info out of you through them. This LLM would hardly move that needle.

        The actual issue is that they download it for everyone, even if their devices don’t meet the minimum requirements. And without consent. And to enable it, you need to go through several menus, as the default behaviour is to use the cloud (this could change eventually; my understanding is that this update just lays the foundation).
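
The complaint above boils down to a missing gate. As a minimal sketch (not Chrome’s actual code; the thresholds and field names here are invented for illustration), the check one would expect before any multi-gigabyte download looks like this:

```python
# Hypothetical sketch: the capability-and-consent gate the commenter says
# is missing. All thresholds and names are assumptions, not Chrome's code.
from dataclasses import dataclass

MIN_FREE_DISK_GB = 22   # assumed requirement, for illustration only
MIN_RAM_GB = 8          # assumed requirement, for illustration only

@dataclass
class Device:
    free_disk_gb: float
    ram_gb: float
    user_opted_in: bool  # explicit consent, which the update reportedly skips

def should_download_model(device: Device) -> bool:
    """Download only if the device qualifies AND the user said yes."""
    meets_requirements = (
        device.free_disk_gb >= MIN_FREE_DISK_GB
        and device.ram_gb >= MIN_RAM_GB
    )
    return meets_requirements and device.user_opted_in

# A low-spec device that never opted in should never trigger the download.
print(should_download_model(Device(free_disk_gb=5, ram_gb=4, user_opted_in=False)))  # False
```

Per the comment, the shipped behaviour is effectively `return True`: the download happens regardless of device specs or consent.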

        But it’s Google that we’re talking about. Last year they were ordered to pay a fine for spying on users who had their tracking settings off. And it wasn’t the first time, iirc. This kind of behaviour is par for the course with them.

        • atomicbocks@sh.itjust.works · 14 hours ago

          It’s already been pointed out in multiple threads that the terms of service specify that even when it uses the on-board model, it still sends your queries to Google.

    • brsrklf@jlai.lu · edited 13 hours ago

      It’s not a good thing if you don’t want a freaking LLM to begin with. A hidden 4GB download for a feature I can’t give a single fuck about is ridiculous.

    • fpslem@lemmy.world · 19 hours ago

      If the reporting is accurate, your data is still sent to Google’s servers for processing. This doesn’t appear to improve privacy; it’s more like an extension of the user-surveillance business model that Google has pursued over the past decade.

    • BrightCandle@lemmy.world · 19 hours ago

      If someone chooses to do that, then yes, it’s a better option, but 4GB of LLM shouldn’t just be shipped in a browser.

    • Skankhunt420@sh.itjust.works · 18 hours ago

      I’m reminded of when they pinky swore that they weren’t dissecting your data in incognito tabs.

      They lied. And nothing ever really happened to them for it. The proof is that they still have the audacity to pull anti-consumer shit like this without even thinking twice.

      Also, if I were someone who wanted to run an LLM locally, there are many options other than whatever crap Google is putting out. You can’t trust them with even a morsel of your data.

    • scytale@piefed.zip · 18 hours ago

      If they’re doing this without user knowledge, I wouldn’t trust that everything the LLM ingests stays local either, until proven otherwise. Also, not everyone wants a local LLM running in their browser eating up 4GB of space.

    • crunchy@lemmy.dbzer0.com · 19 hours ago

      If I choose to install and use an LLM on my device, sure. That doesn’t mean Google should take it upon themselves to ship one baked into the browser, with no way to opt out or remove it without it being re-downloaded.

      Assuming Google will respect privacy is certainly a take.

    • driving_crooner@lemmy.eco.br · 19 hours ago

      The model could interact with everything on the PC, without connection or server overhead, or user consent, and then send back compressed reports. And who knows, maybe they’re even training the model in a distributed way on users’ interactions with the PC.
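
The scenario the comment imagines can be sketched roughly like this. To be clear, this is speculation in the comment, not anything Google has confirmed shipping; every name below is invented for illustration. The point is the shape of the design: raw data never leaves the device, but a compact derived summary does.

```python
# Speculative sketch of the "compressed reports" scenario (not Google's
# code): process interactions on-device, send back only a small summary.
import json
import zlib

def summarize_locally(interactions: list[str]) -> dict:
    # Stand-in for on-device processing: count word occurrences across
    # the user's interactions. A real model would produce richer output.
    summary: dict[str, int] = {}
    for text in interactions:
        for word in text.split():
            summary[word] = summary.get(word, 0) + 1
    return summary

def compressed_report(interactions: list[str]) -> bytes:
    # The raw interactions stay local; only this compact payload
    # would be uploaded in the scenario the comment describes.
    return zlib.compress(json.dumps(summarize_locally(interactions)).encode())

report = compressed_report(["buy shoes", "buy laptop"])
print(len(report))  # a small payload, not the raw interaction log
```

This is also roughly the shape of federated-learning-style telemetry: the privacy question then shifts from “is my data uploaded?” to “what can be inferred from the summaries?”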

    • Darnton@piefed.zip · 19 hours ago

      Sure, but privacy isn’t the only issue. It still consumes a ton of energy, all for basically nothing. So you’re paying that electric bill, as well as the wear and tear on your GPU.