7 points

The article mentions AI. 16 gigs feels far too little to run an LLM of respectable size, so I wonder what exactly this means. Feels like no one is gonna be happy about a 16-gig LLM (high RAM usage and bad AI features).

7 points

Of fucking course it’s AI, why the hell wouldn’t it be AI? For fuck’s sake, it’s like they want their users to switch to Linux.

5 points

No, 16 gigs is OK for an LLM.

On your GPU.
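The back-of-the-envelope arithmetic behind this is weight count times bytes per weight. A rough sketch (the 16 GB figure comes from the thread; the parameter counts and quantization widths below are illustrative assumptions, not anything the article specifies):

```python
# Rough estimate of LLM weight memory: parameters x bytes per parameter.
# Ignores KV cache and activations, which add more on top.
# Model sizes and bit widths here are illustrative assumptions.

def weight_memory_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate memory needed for model weights alone, in decimal GB."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

for params, bits in [(7, 16), (7, 4), (13, 4), (70, 4)]:
    print(f"{params}B params @ {bits}-bit: ~{weight_memory_gb(params, bits):.1f} GB")
```

So a 7B model at 16-bit needs roughly 14 GB just for weights, which is why it barely fits in 16 GB, while 4-bit quantized models leave headroom; a 70B model doesn't fit at any common quantization.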
