• GregorGizeh@lemmy.zip

    While I dislike corporate AI as much as the next guy, I am quite interested in open-source, local models. If I can run it on my machine, with the absolute certainty that it is my LLM, working for my benefit, that's pretty cool. And not feeding every minuscule detail about me to a corporation.

    • anarchrist@lemmy.dbzer0.com

      I mean, that's the thing. They're kind of black boxes, so it can be hard to tell what they're doing, but yeah, local hardware is the absolute minimum. I guess places like Hugging Face are at least trying to apply some sort of standard measures to the LLM space through testing…

      • grue@lemmy.world

        I mean, as long as you can tell it’s not opening up any network connections (e.g. by not giving the process network permission), it’s fine.

        'Course, being built into a web browser might not make that easy…
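
        For what it's worth, here's a minimal sketch of what "not giving the process network permission" could look like on Linux, assuming util-linux's `unshare` and a llama.cpp-style binary (the binary and model names are just placeholders for whatever you actually run):

        ```python
        # Rough sketch, not a drop-in recipe: run a local inference binary inside an
        # empty network namespace so it physically cannot open outbound connections.
        # Assumes Linux with util-linux's `unshare`; "./llama-cli" and "model.gguf"
        # are placeholders for your own local binary and model file.
        import subprocess

        cmd = [
            "unshare",
            "--map-root-user",   # unprivileged user namespace, so --net works without root
            "--net",             # fresh, empty network namespace: no interfaces, no routes
            "./llama-cli",       # placeholder: your local inference binary
            "-m", "model.gguf",  # placeholder: your local model file
            "-p", "Hello",
        ]

        # If the model tried to phone home, every connection attempt would fail
        # inside this namespace.
        subprocess.run(cmd, check=True)
        ```

        Tools like `firejail --net=none` or systemd's `PrivateNetwork=yes` get you the same effect with less typing; the browser-built-in case is indeed harder, since the browser itself obviously needs network access.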