They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

  • @fruitycoder@sh.itjust.works
    7
    edit-2
    5 months ago

    I will say, the Le Chat provider is pretty decent. You really can use it with more natural language: “Rewrite it with a better rhyme scheme”, “remove the last line”, and it just gets it.

    Why no local option though? Why no anonymising option?

    Edit: There is a right-click option, which does make this actually useful for me now (summarize this!).

    Other models do have RAG options, and Mistral supports making agents with specified documentation to at least fine-tune them (not as good as full grounding though, IMHO).

  • @fibojoly@sh.itjust.works
    25
    5 months ago

    Didn’t want it in Opera, don’t want it in Firefox. I mean they can keep trying and I’ll just keep on ignoring this shit :/

    • @thingsiplay@beehaw.org
      8
      5 months ago

      There are no open source AI models, even if they tell you that they are. HuggingFace is the closest thing to something like open source, where you can download AI models to run locally without an internet connection. There are applications for that. In Firefox, HuggingChat uses models from HuggingFace, but I think it runs them on a server rather than downloading them.

      The reason they are not open source is that we don’t know exactly what data they were trained on, and we cannot rebuild them on our own. As for trustworthiness, I assume you are talking about the integration and the software using the models, right? At least it is implemented by Mozilla, so there is (to me) some sort of trust involved. Yes, even after all the bullshit I trust Mozilla.
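
      (As an aside, a minimal sketch of what “download and run locally” looks like with the transformers library — the model name below is only an example, not something this comment specifies:)

      ```python
      # Minimal sketch: run a HuggingFace model locally. The weights are downloaded
      # once and cached, after which generation works without an internet connection.
      from transformers import pipeline

      generator = pipeline(
          "text-generation",
          model="HuggingFaceTB/SmolLM2-1.7B-Instruct",  # example model; any small instruct model works
      )

      prompt = "Summarize: Firefox now ships an optional AI chatbot sidebar."
      result = generator(prompt, max_new_tokens=100)
      print(result[0]["generated_text"])
      ```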

      • @chicken@lemmy.dbzer0.com
        3
        5 months ago

        It’s “open weights” if they publish the model file but nothing about its creation. There are some hypothetical security concerns with training it to give very specific outputs for certain very specific inputs, but that feels like one of those far-fetched worries, especially if you want to use it for chat or summarization and the comparison is getting AI output from a server API. Local is still way better.

    • @1rre@discuss.tchncs.de
      10
      5 months ago

      I think Mistral is model-available (i.e. I’m not sure if they release training data/code, but they do release model shape and weights); HuggingChat definitely is open source and model-available.

  • @onlooker@lemmy.ml
    9
    5 months ago

    For a second I thought it said “experimental failure”. Would be more accurate, I think.

    • @HouseWolf@lemm.ee
      English
      1
      5 months ago

      I switched a while back, before all the AI and “privacy preserving” telemetry stuff.

      Every update note I see for Firefox now just reinforces my decision.

  • JokeDeity
    36
    5 months ago

    Unpopular opinion: I think they’re doing it about as well as it can be done, at least. It’s completely optional and doesn’t seem to be intrusive.

  • @celeste@lemmy.blahaj.zone
    7
    5 months ago

    If they do it in a privacy-preserving way, this could help them get back market share, which will generally benefit an open internet.

      • @celeste@lemmy.blahaj.zone
        1
        5 months ago

        Because browsers are the most useful tool on most computers. Ordinary people go on Google/ask ChatGPT for mundane questions. If their browser can do that, they need one less app and it will be more convenient, which is what non-tech-savvy people especially care about.

  • @dukatos@lemm.ee
    2
    5 months ago

    And I still can’t convince it to stop caching the images because it does not follow the RFC.
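
    (For reference, RFC 9111 says a response carrying Cache-Control: no-store must not be stored by any cache. A minimal test server that sends such a header — the handler, port, and placeholder body here are purely illustrative, not from the comment:)

    ```python
    # Minimal test server: the response carries Cache-Control: no-store, which a
    # cache that follows RFC 9111 must not store. Handy for checking browser behaviour.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class NoStoreHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "image/png")
            self.send_header("Cache-Control", "no-store")
            self.end_headers()
            self.wfile.write(b"\x89PNG\r\n\x1a\n")  # placeholder bytes, not a real image

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8000), NoStoreHandler).serve_forever()
    ```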