• ClamDrinker@lemmy.world · 1 month ago

    If you’re here because of the AI headline, this is important to read.

    We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities.

    They are implementing AI how it should be. Don't let all the shitty companies blind you to the fact that what we call AI has positive sides.

    • UnderpantsWeevil@lemmy.world · 1 month ago

      They are implementing AI how it should be.

      The term is so overused and abused that I'm not clear what they're even promising. Are they localizing an LLM? Are they providing some kind of very fancy macroing? Are they linking up with ChatGPT somehow, or integrating with Copilot? There's no way to tell from the verbiage.

      And that’s not even really Mozilla’s fault. It’s just how the term AI can mean anything from “overhyped javascript” to “multi-billion dollar datacenter full of fake Scarlett Johansson voice patterns”.

      • chrash0@lemmy.world · 1 month ago

        there are language models that are quite feasible to run locally for easier tasks like this. "local" rules out both ChatGPT and Copilot, since those models are enormous. AI generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

        not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
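
        To put ballpark numbers on that size gap, here's a quick back-of-envelope sketch. The parameter counts are rough public estimates, not official figures, and the per-parameter byte width is an assumption (fp16):

```python
# Back-of-envelope: RAM needed just to hold a model's weights.
def model_memory_gb(params: float, bytes_per_param: int = 2) -> float:
    """Approximate footprint in GB, assuming fp16 weights by default."""
    return params * bytes_per_param / 1e9

# GPT-3-class cloud LLM (~175B params) vs. a small image-captioning
# model (~250M params, roughly BLIP-base scale) -- both ballpark figures.
cloud_llm = model_memory_gb(175e9)   # ~350 GB: datacenter territory
captioner = model_memory_gb(250e6)   # ~0.5 GB: fits on an old laptop

print(f"cloud LLM: ~{cloud_llm:.0f} GB, local captioner: ~{captioner:.1f} GB")
```

        Quantizing to 8-bit or 4-bit weights shrinks the captioner's footprint further, which is why alt-text generation is plausible on ordinary consumer hardware.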

    • AusatKeyboardPremi@lemmy.world · 1 month ago

      There are a lot of knee-jerk reactions in the comments. I hope at least a few of those commenters have read the article or, at the least, your comment.

  • fpslem@lemmy.world · 1 month ago

    tab grouping

    Sure, okay.

    vertical tabs

    To each their own.

    profile management

    Whatever, it’s fine.

    and local AI features

    HOLLUP

    • elliot_crane@lemmy.world · 1 month ago

      We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities. The alt text is then processed on your device and saved locally instead of cloud services, ensuring that enhancements like these are done with your privacy in mind.

      IMO if everything’s going to have AI ham fisted into it, this is probably the least shitty way to do so. With Firefox being open source, the code can also be audited to ensure they’re actually keeping their word about it being local-only.

      • PseudorandomNoise@lemmy.world · 1 month ago

        Don’t you need specific CPUs for these AI features? If so, how is this going to work on the machines that don’t support it?

        • elliot_crane@lemmy.world · 1 month ago

          With it being local, it's probably a small and limited model. I took a couple of courses on machine learning years ago (before it got rebranded as "AI"), and you'd be surprised how well a basic image recognition model can run on the lowest-spec MacBook from 2012.

          • ferret@sh.itjust.works · 1 month ago

            Tbh the inversion of typical intuition, with LLMs taking orders of magnitude more memory than computer vision models, can throw off hardware estimates for people unfamiliar with the space.
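
            A quick sketch of that inversion, using ballpark parameter counts (ResNet-50 as the classic vision model, a "small" 7B LLM, both at fp32 for a like-for-like comparison):

```python
# Weights-only memory at fp32: params * 4 bytes.
def weights_mb(params: float, bytes_per_param: int = 4) -> float:
    return params * bytes_per_param / 1e6

resnet50  = weights_mb(25.6e6)  # classic vision model: ~100 MB
small_llm = weights_mb(7e9)     # "small" 7B LLM: ~28 GB at fp32

print(f"a 7B LLM needs ~{small_llm / resnet50:.0f}x the memory of ResNet-50")
```

            So even a "small" LLM is a couple of hundred times heavier than the kind of vision model that alt-text generation needs.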

        • lemmyvore@feddit.nl · 1 month ago

          You only need lots of processing power to train the models. Using the models can be done on regular hardware.
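
          A common rule of thumb from the scaling-law literature puts training cost at roughly 6 × params × tokens FLOPs, versus about 2 × params FLOPs per generated token at inference. The model and dataset sizes below are made up for illustration:

```python
def training_flops(params: float, tokens: float) -> float:
    # ~6 FLOPs per parameter per training token (forward + backward pass)
    return 6 * params * tokens

def inference_flops_per_token(params: float) -> float:
    # ~2 FLOPs per parameter to generate a single token
    return 2 * params

N, D = 250e6, 10e9  # hypothetical small model trained on 10B tokens
ratio = training_flops(N, D) / inference_flops_per_token(N)
print(f"training costs ~{ratio:.0e}x as much as producing one token")  # 3e+10
```

          That ten-orders-of-magnitude gap is why training happens in datacenters while inference can run on a laptop.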

    • GregorGizeh@lemmy.zip · 1 month ago

      While I dislike corporate AI as much as the next guy, I am quite interested in open source, local models. If I can run it on my machine, with the absolute certainty that it is my LLM working for my benefit, that's pretty cool. And it's not feeding every minuscule detail about me to a corporation.

      • anarchrist@lemmy.dbzer0.com · 1 month ago

        I mean that’s that thing. They’re kind of black boxes so it can be hard to tell what they’re doing, but yeah local hardware is the absolute minimum. I guess places like huggingface are at least working to try and apply some sort of standard measures to the LLM space at least through testing…

        • grue@lemmy.world · 1 month ago

          I mean, as long as you can tell it’s not opening up any network connections (e.g. by not giving the process network permission), it’s fine.

          'Course, being built into a web browser might not make that easy…

  • phoenixz@lemmy.ca · 1 month ago

    Local AI, aka how AI should be: actually helpful, instead of a spying and data-gathering tool for companies.

  • Larry@lemmy.world · 1 month ago

    Local AI sounds nice. One reason I'm cynical about the current state of AI is how many products send all your data to another company.

    • sugar_in_your_tea@sh.itjust.works · 1 month ago

      Eh, I don’t particularly care too much either way. It seems to be solving problems with the 80/20 approach: 80% of the benefit for 20% of the effort. However, getting that last 20% is probably way more difficult than just building purpose-built solutions from the start.

      So I’m guessing we’ll see a lot more “decent but not quite there” products, and they’ll never “get there.”

      So it might be fun to play with, but it's not something I'm interested in using day-to-day. Then again, maybe I'm completely wrong and it's the best thing since sliced bread, but as someone who has worked on very basic NLP projects in the past (distantly related to modern LLMs), I just find it hard to look past the limitations.

  • sunbeam60@lemmy.one · 1 month ago

    This is what Mozilla should have done a LONG time ago - focussed on browser features, ease of use, compatibility and speed. Make a better browser if you want to win a browser war.

    • VådFisk@feddit.dk · 1 month ago

      Forcing features that are useless to most users is more or less what Windows is doing. Why the double standard?

      Especially when Firefox could have included those features as optional modules (even as preinstalled extensions) that we could simply remove if we don't want them?

      • sunbeam60@lemmy.one · 1 month ago

        I definitely don’t believe Mozilla should continue to add features. But I like them focussing on the ones they’ve got.

        Edit: Changed this comment to better reflect what I actually meant.

        • VådFisk@feddit.dk · 1 month ago

          It might be me, and in that case I apologize.

          …focussed on browser features, ease of use …

          It just sounds like you think it's good that they added all these features.

            • sunbeam60@lemmy.one · 1 month ago

              My apologies. I definitely wasn’t meaning to come across indignant. I guess it’s just one of those things of things sounding perfectly clear in your head and not perfectly clear in the receiver’s ear. Hope you have a good day going forward.

    • kirk781@discuss.tchncs.de · 1 month ago

      I do not know why browser makers like Opera or Brave (and now apparently Firefox) are going gaga over AI. I don't see a proper benefit of integrating local AI for most people as of now.

      As for vertical tabs, Waterfox just got them. The implementation is basically a bare-bones fork of Tree Style Tabs. I am honestly happy with TST on Firefox, and while a native integration might be a bit faster (my browser takes just those few extra seconds to load the TST panel on my slow laptop), it'll likely be feature-incomplete compared to TST.

      • FooBarrington@lemmy.world · 1 month ago

        It depends. I really liked Mozilla's initiative for local translation – much better for data privacy than remote services. But conversational/generative AI? No, thank you.

        • barsoap@lemm.ee · 1 month ago

          AI-generated alt-text for images inserted into PDFs

          Sounds more like classification so far. Things like summarising web pages would be properly generative. LLMs in general could be useful for interrogating your browsing history: doing feature extraction on it and sorting it into a graph of categories organised by concepts rather than links could be useful. And heck, if a conversational interface falls out of that, I'm not exactly opposed. Unlike the stuff you see on the net, it's bound to quote its sources: it's going to tell you right away that "a cat licking you is trying to see whether you're fit for consumption" doesn't come from the gazillion cat-behaviour sites you've visited, but from Reddit. Firefox doesn't have an incentive to keep you in the AI interface and out of some random webpage.

    • 9tr6gyp3@lemmy.world · 1 month ago

      It's honestly the only reason I use Brave and Edge over Firefox. I can fully commit to FF now.

        • stealth_cookies@lemmy.ca · 1 month ago

          The way Tree Style Tabs worked after they broke it was never very good. Floorp is what to use if you want side tabs on Firefox.

          That said, I still went back to Vivaldi after trying Floorp, because of stupid little UX issues like pinned tabs not being protected from closing, and broken session saving.