13 points

Yeah, that won’t work, sadly. It’s an AI: we’ve given computers the ability to lie and make stuff up, so it’ll just claim to have done it. It won’t actually bother really doing it.

2 points

Not quite. The issue is that LLMs aren’t designed to solve math; they’re designed to “guess the next word,” so to speak. So if you ask a “pure” LLM what 1 + 1 is, it will simply spit out the most common answer.
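Rough toy sketch of what I mean by “guess the next word” (not a real LLM, and the token counts are made up), just to show there’s no arithmetic happening anywhere:

```python
from collections import Counter

# Hypothetical counts of which token followed "1 + 1 =" in training text.
next_token_counts = Counter({"2": 9000, "3": 120, "11": 45, "window": 3})

def guess_next_token(counts: Counter) -> str:
    # No math happens here; it just picks the most frequent continuation.
    token, _ = counts.most_common(1)[0]
    return token

print("1 + 1 =", guess_next_token(next_token_counts))  # prints "1 + 1 = 2"
```

It gets 1 + 1 “right” only because the right answer is also the most common one in the text it was trained on.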

LLMs with integrations/plugins can likely manage pretty complex math, but only things that something like Wolfram Alpha could already solve, because the LLM is essentially just going to poll an external service to get the answer being looked for.
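And a sketch of roughly how that plugin route works. The solver function is a made-up stand-in for wherever the Wolfram Alpha call would actually go, and the regex is a crude stand-in for the model deciding to make a tool call:

```python
import re

def external_solver(expression: str) -> str:
    # Placeholder for the external service (e.g. a Wolfram Alpha query).
    # Simple arithmetic eval stands in for it here.
    if re.fullmatch(r"[\d\s+\-*/().]+", expression):
        return str(eval(expression))
    return "could not solve"

def answer_with_plugin(user_message: str) -> str:
    # A real system would have the model emit a structured tool call;
    # this pattern match is a crude stand-in for that step.
    match = re.search(r"what is (.+?)\?", user_message.lower())
    if match:
        return f"The answer is {external_solver(match.group(1))}."
    return "I'd just be guessing the next word here."

print(answer_with_plugin("What is (17 + 3) * 4?"))  # The answer is 80.
```

The model’s only job in that flow is recognizing “this is a math question,” forwarding it, and repeating whatever comes back.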

At no point is the LLM itself going to start doing complex calculations on the CPU it’s running on.
