We invented multi-bit models so we could get more accuracy, since neural networks are based on human brains, which are 1-bit models themselves. A 2-bit neuron is 4 times as capable as a 1-bit neuron but only double the size and power requirements. This whole thing sounds like BS to me. But then again, maybe complexity is more efficient than per-unit capability, since that’s the tradeoff.
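
For concreteness, the "4 times as capable" claim is just counting representable states: a k-bit weight can take 2**k distinct values, so doubling the bit width squares the number of states while only doubling the storage. A minimal sketch:

```python
# Counting representable states per weight width: a k-bit value
# distinguishes 2**k levels, while storage grows only linearly in k.
def states(bits: int) -> int:
    return 2 ** bits

for bits in (1, 2, 4, 8):
    print(f"{bits}-bit weight: {states(bits)} states")
# 1-bit: 2 states, 2-bit: 4 states, 4-bit: 16 states, 8-bit: 256 states
```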

Human brains aren’t binary. They send signals at a wide range of strengths, so “on” has a lot of possible values. The part of the brain that controls emotions treats a low but non-zero level of activation as happy and a high level of activation as angry.

It’s not simple at all.

Human brains aren’t 1-bit models. Far from it, actually. I’m not an expert, but I know that neurons in the brain encode different signal strengths in their firing frequency.
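
That idea is usually called rate coding, and a toy simulation makes it concrete. This is an illustrative sketch, not a biophysical model: each spike is binary, but the spike frequency over a window carries a graded value.

```python
import random

# Toy rate-coding sketch: a "neuron" emits binary spikes (0 or 1),
# and the fraction of time steps with a spike encodes a graded
# intensity between 0 and 1.
def spike_train(intensity: float, steps: int = 1000, seed: int = 0) -> list:
    rng = random.Random(seed)
    return [1 if rng.random() < intensity else 0 for _ in range(steps)]

def decoded_rate(train: list) -> float:
    # Recover the graded value as the average spike rate.
    return sum(train) / len(train)

train = spike_train(0.3)
print(decoded_rate(train))  # close to 0.3, even though every spike is 0 or 1
```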

Firing is either on or off, though.

Human brains aren’t digital. They’re very analog.

We really don’t know jack shit, but we know more than enough to know firing rate is hugely important.

The network architecture seems to create a virtualized hyperdimensional network on top of the actual network nodes, so the per-node precision really doesn’t matter much, as long as quantization happens during pretraining.

If it’s done post-training, it degrades the precision of the already-encoded network, which is sometimes acceptable but always lossy. But done at the pretraining stage, it actually seems to be a net improvement over higher-precision weights, even if you throw efficiency concerns out the window.

You can see this in the perplexity graphs in the BitNet b1.58 paper.
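
For reference, the weight quantizer that paper describes is an absmean ternary rounding. A rough sketch, simplified from the paper (it omits activation quantization and the straight-through trick used during training):

```python
# Absmean ternary quantization as described for BitNet b1.58:
# scale weights by their mean absolute value, then round and clip
# to {-1, 0, +1}. Applied post-training this is plainly lossy; the
# paper's point is that training *with* the quantizer in the loop
# lets the network adapt to it.
def quantize_ternary(weights: list, eps: float = 1e-8) -> list:
    gamma = sum(abs(w) for w in weights) / len(weights)  # absmean scale
    return [max(-1, min(1, round(w / (gamma + eps)))) for w in weights]

print(quantize_ternary([0.9, -0.05, 0.4, -1.2]))  # → [1, 0, 1, -1]
```

Note how the small weight collapses to 0 — the zero state is what takes BitNet from 1 bit to log2(3) ≈ 1.58 bits per weight.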

None of those words are in the bible

No, but some alarmingly similar ideas are in the heretical stuff actually.

We need to scale fusion

Multi-bit models exist because that’s how computers work, but there’s been a lot of work on using e.g. fixed-point over floating-point for things like FPGAs, or shorter integer types, and the results are often more than good enough.
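
A tiny illustration of the fixed-point idea, assuming a Q8.8 format (8 integer bits, 8 fractional bits): reals are stored as integers scaled by 2**8, and multiplication becomes an integer multiply plus a shift — exactly the kind of cheap arithmetic an FPGA is good at.

```python
# Q8.8 fixed-point sketch: store x as round(x * 256), multiply with
# an integer multiply followed by a right shift. Precision is limited
# to 1/256 steps, which is often "good enough".
FRAC_BITS = 8
SCALE = 1 << FRAC_BITS  # 256

def to_fixed(x: float) -> int:
    return int(round(x * SCALE))

def fixed_mul(a: int, b: int) -> int:
    # Product of two Q8.8 values has 16 fractional bits; shift back to 8.
    return (a * b) >> FRAC_BITS

def to_float(x: int) -> float:
    return x / SCALE

a, b = to_fixed(1.5), to_fixed(2.25)
print(to_float(fixed_mul(a, b)))  # → 3.375
```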

Technology

!technology@lemmy.world
