66 points

Didn’t even think of it as a possibility. WTF would a browser need with an LLM?

18 points

I’d love a browser-embedded LLM that had access to the DOM.

“Highlight all passages that talk about yadda yadda. Remove all other content. Convert the dates to the ISO standard. Put them on a number line chart, labeled by blah.”

That’d be great UX.
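The date-conversion part of that wish doesn’t even need a model — it could be a deterministic step the assistant calls. A rough sketch in plain JavaScript (the `toISO` helper and the US-style “Month D, YYYY” format are illustrative assumptions, not any browser’s actual API):

```javascript
// Sketch: convert US-style "Month D, YYYY" dates in page text to ISO 8601.
// Assumes English month names; a real page would need locale handling.
const MONTHS = {
  january: 1, february: 2, march: 3, april: 4, may: 5, june: 6,
  july: 7, august: 8, september: 9, october: 10, november: 11, december: 12,
};

function toISO(text) {
  return text.replace(
    /\b(January|February|March|April|May|June|July|August|September|October|November|December)\s+(\d{1,2}),\s+(\d{4})\b/g,
    (_, month, day, year) => {
      const m = String(MONTHS[month.toLowerCase()]).padStart(2, "0");
      const d = String(day).padStart(2, "0");
      return `${year}-${m}-${d}`;
    }
  );
}
```

An embedded assistant could run something like this over text nodes it pulls from the DOM, leaving the LLM to handle only the fuzzy “which passages are relevant” part.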

10 points

You are falling into a common trap. LLMs do not have understanding. Asking one to do things like convert dates and put them on a number line may yield correct results sometimes, but since the LLM does not understand what it’s doing, it may “hallucinate” dates that look correct but don’t actually align with the source.

1 point

Thank you for calling that out. I’m well aware, but appreciate your cautioning.

I’ve seen hallucinations from LLMs at home and at work (where I’ve literally had them transcribe dates like this). They’re still absolutely worth it for their ability to handle unstructured data and the speed of iteration you get – whether they “understand” the task or not.

I know to check my (its) work when it matters, and I can add guard rails and selectively make parts of the process more robust later if need be.
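One cheap guard rail along those lines: verify that every date the model claims to have transcribed actually occurs in the source, and that its ISO form is a real calendar date. A minimal sketch (the `validateDates` name and the `{original, iso}` pair shape are hypothetical, not from any real tool):

```javascript
// Hypothetical guard rail for LLM-transcribed dates: keep a pair only if
// the original string really occurs in the source text and the ISO form
// is a well-formed, valid calendar date.
function validateDates(sourceText, pairs) {
  const isoRe = /^\d{4}-\d{2}-\d{2}$/;
  return pairs.filter(({ original, iso }) => {
    if (!sourceText.includes(original)) return false; // model invented the source text
    if (!isoRe.test(iso)) return false;               // not actually ISO 8601
    const d = new Date(iso + "T00:00:00Z");
    return !Number.isNaN(d.getTime()) &&
           d.toISOString().slice(0, 10) === iso;      // rejects e.g. 2024-02-30
  });
}
```

It doesn’t catch a date that exists in the source but was converted wrong, but it filters out the most common failure mode: dates the model made up wholesale.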

1 point

That’s actually fascinating to think about. Would be a fun project to mash something like Blazor Server and an LLM together and allow users to just kindly ask to rewrite the DOM in plain English.

1 point

Arc has an LLM feature that lets you replace the in-page search functionality with “search or ask”: if you type a question, it tries to answer it based on the content of the page. Kinda close to what you’re talking about.

Arc is genuinely trying to use LLMs in their browser in interesting ways.

12 points

Local translation of text comes to mind.


Yup, Firefox has it: https://browser.mt/ (it’s now a native part of Firefox)

0 points

Hmm, maybe this is why Firefox is so damn slow on my Raspberry Pi.

1 point

Vivaldi has had local translation for about half a year now. No need for an LLM for this feature.

1 point

I was thinking more along the lines of communicating with a Klingon captain on a D7 Battlecruiser.

7 points

Firefox has that already (without using an LLM). But yeah, it’s still another way this could be implemented or possibly improved.

58 points

Webpage authors use LLMs to generate extremely long articles, to make you scroll by ads for longer. You use LLMs in your browser to summarize those articles. The circle of life, or something.

2 points

I do this already. It’s great. Kagi has a browser plugin that does it.

9 points

I stumbled upon a website through DDG, and after a long intro, the main section — supposedly where the thing I was searching for would be — just said “Sorry, I can’t fulfill your request right now”. Basically a fully generated page built to match my search with some parasitic SEO tactics. The web be changing. Front page of DDG.

4 points

Reminds me of the Amazon products titled “I’m sorry, but I cannot fulfill this request; it goes against OpenAI use policy.”

I think we’re way beyond the point of no return. The internet has been ruined for good.

3 points

The circle of death.

63 points

Edge is branding itself “The AI Browser”. Chrome has plans to embed LLMs for text input. Opera, the browser which was commandeered from the original Vivaldi team and turned into a crypto/VPN gimmick browser, is of course among the hardest leaning into the LLM trend.

-4 points

Cheers DANNY BOOOOOOOYYY!

4 points

Quick tool to summarize a page, proofread, or compare it to another source. Still needs a functioning human brain to separate the wheat from the chaff, so to speak, but I could see an LLM (especially a local one) being useful in some ways.

I’m sure there are disabilities or unique use cases that could increase its usefulness, especially once they improve more.

