260 points

This is why you should always self-host your AI girlfriend.

151 points

Woah there, I’m not sure I’m ready for that level of commitment.

23 points

I ain’t got that kinda GPU to spare. If it’s the games or my AI girlfriend…

8 points

You don't need that much power. Something like an RX 6600 XT, RTX 3060, or RX 580 is plenty.

7 points

I’d have to actually test my backup strategy

6 points

🤣

52 points

I’d only date someone fully independent, so my AI joyfriend operates their own cloud cluster through a combination of a crypto wallet and findom

17 points

They sound fun

2 points

🤣

1 point

I’m interested. So you rent a cluster to run an instance instead of self hosting?

36 points

I think my wife is cheating on me with my self hosted AI girlfriend’s boyfriend that lives in the same database file. What do I do?

19 points

Delete system32 obviously

17 points

For as long as everyone is using a virus checker, maybe you could try an open source relationship.

14 points

Sounds like a job for Little Bobby Tables

10 points

That’ll make you go blind

7 points

And give you hairy palms

8 points

Now consider the number of normal people in the world who do not have a server rack in their closet, and how much they are about to be defrauded and blackmailed

13 points
  • My Canadian Girlfriend

Broke, Busted, Burned Out

  • My Canadian Server-Farm Girlfriend

Smart, Sexy, Superconductive

6 points

Let me go buy some milk.

2 points

You have to at least move out of your AI parents’ server rack before letting your AI girlfriend move in.

0 points

Nvidia™

-2 points

You need a little GPU server farm for proper models & context sizes though. Single consumer GPUs don't have enough VRAM for that.

17 points

Might as well just pay for a prostitute

9 points

Yeah, I heard they’re also very privacy friendly.

2 points

Locally hosted AI sucking down on our dick through USB-plugged dildos. This is the future.

72 points

Are there any Open Source girlfriends that we can download and compile?

57 points

Hey now, I don’t want anyone looking at my girlfriend’s source code. That’s personal!

30 points

I don’t want anyone looking at my girlfriend’s source code

it’s okay, dude, we all already did…

16 points

The bots (the actual girlfriends or whatever other characters) aren't the problem. You can find them on chub.ai, for example, or write them yourself fairly easily. The issue is the software, and even more so the hardware. You need something like the mentioned Kobold.cpp or oobabooga, and then you also need a trained LLM model, which you can get from huggingface.co; that's already where it gets complicated (the models are loaded inside Kobold or oobabooga). You also need to understand how they work with regard to context sizes, because they need a lot, and I mean A LOT, of VRAM to work properly. Basically, the more VRAM you have, the better their contextual understanding, i.e. their memory, is. Otherwise you end up with a bot that can only contextualize the last couple of messages.

For paid services like novelai.net, your bots basically run on big server farms with lots of GPUs that pool their VRAM and processing power, giving you "decent" context sizes (imo the greatest weak point of LLMs, and it's deeply rooted in how they work) and decent speed. NovelAI also supports front-ends like SillyTavern, which is great for local bot management and settings, regardless of whether you self-host or use a paid service (NOT EVERY PAID SERVICE HAS AN API FOR THIS! OpenAI's ChatGPT technically does too, but they don't allow NSFW content and can ban you for it if caught).

There's a bunch of "free" online services too, like janitorai.com, but most of them are slow and the chat degrades significantly after just a few messages because they have small context sizes. The better / paid models suffer from this degradation too, but it's slower and less noticeable, at least at first. You can still use them to get an idea of how LLMs work.

Edit: It should technically be self-explanatory / common sense, but I would advise not to share ANY personal information through online service chats that could identify you as a person!
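To put rough numbers on the VRAM/context trade-off, here is a back-of-the-envelope sketch in Python. The model dimensions are illustrative assumptions for a 13B-class model, not figures from the comment above; the formula is the usual fp16 KV-cache estimate.

```python
# Rough KV-cache VRAM estimate for a transformer LLM.
# All model dimensions below are illustrative assumptions, not measured values.

def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   context_tokens: int, bytes_per_value: int = 2) -> int:
    """Keys + values cached for every layer, head and token (fp16 by default)."""
    return 2 * n_layers * n_kv_heads * head_dim * context_tokens * bytes_per_value

# Example: a 13B-class model (~40 layers, 40 KV heads, head size 128).
for ctx in (2_048, 4_096, 8_192, 32_768):
    gib = kv_cache_bytes(40, 40, 128, ctx) / 2**30
    print(f"{ctx:>6} tokens of context -> ~{gib:.1f} GiB of KV cache")

# On top of that come the weights themselves: roughly 26 GiB in fp16 for
# 13B parameters, or around 7-8 GiB at 4-bit quantisation.
```

The cache grows linearly with context on top of the weights, which is why long "memory" gets expensive so quickly.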

19 points

Does it make it faster if the GPU has waifu stickers on it?

13 points

I don’t know, I’m not a weeb.

3 points

Define “it”

Because waifu stickers may indeed speed up “it” for some definition of “it”

1 point

It'll do the opposite, I'm afraid. OW! Hot… umm, what's that awful smell of burning plastic?

-1 points

Basically, the more vram you have, the better the contextual understanding, their memory is. Otherwise you’d have a bot that maybe knows to only contextualize the last couple messages.

Hmm, if only there was some hardware analogue for long-term memory.

2 points

What are you trying to say? Do you understand what the problem is?

1 point

Yes, databases (saved on a hard drive). SillyTavern has Smart Context, but it doesn't seem that easy to install, so I have no idea how well it actually works in practice yet.
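For illustration only, the basic idea can be sketched in a few lines of Python: keep every message in an on-disk SQLite database and pull the most relevant old ones back into the prompt. This is a toy with naive word-overlap scoring, not how SillyTavern's Smart Context actually works.

```python
# Toy long-term memory: store every message on disk, then retrieve the
# few old messages most relevant to the current one for the prompt.
# Only illustrates the idea; NOT SillyTavern's Smart Context implementation.
import sqlite3

db = sqlite3.connect("chat_memory.db")
db.execute("CREATE TABLE IF NOT EXISTS messages (id INTEGER PRIMARY KEY, text TEXT)")

def remember(text: str) -> None:
    db.execute("INSERT INTO messages (text) VALUES (?)", (text,))
    db.commit()

def recall(query: str, k: int = 3) -> list[str]:
    """Return the k stored messages sharing the most words with the query."""
    query_words = set(query.lower().split())
    rows = db.execute("SELECT text FROM messages").fetchall()
    scored = sorted(rows,
                    key=lambda r: len(query_words & set(r[0].lower().split())),
                    reverse=True)
    return [r[0] for r in scored[:k]]

remember("My cat is called Miso and she hates thunderstorms.")
remember("I work night shifts on weekends.")
print(recall("what was my cat's name again?"))
```

A real setup would usually score with embeddings instead of word overlap, but the store-on-disk, retrieve-into-context part is the same trick.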

11 points

Pretty easy to roll your own with Kobold.cpp and various open model weights found on HuggingFace.
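Once KoboldCpp is up with a GGUF model from HuggingFace, you can also script against it locally. A minimal sketch, assuming KoboldCpp's default port and its KoboldAI-compatible /api/v1/generate endpoint; the payload fields and response shape are assumptions, so check your own instance.

```python
# Minimal client for a locally running KoboldCpp instance.
# Port, endpoint and response shape are assumptions based on KoboldCpp's
# KoboldAI-compatible API; verify against your local setup.
import json
import urllib.request

payload = {
    "prompt": "You are my self-hosted assistant. Say hi in one sentence.",
    "max_length": 80,     # tokens to generate
    "temperature": 0.7,
}
req = urllib.request.Request(
    "http://localhost:5001/api/v1/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["results"][0]["text"])
```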

8 points

Also for an interface, I’d recommend KoboldLite for writing or assistant and SillyTavern for chat/RP.

4 points

I tried oobabooga and it basically always crashes when I try to generate anything, no matter what model I try. But honestly, as far as I can tell all the good models require absurd amounts of VRAM, much more than consumer cards have, so you'd need at least a small GPU server farm to host them locally and reliably yourself. Unless of course you're fine with practically nonexistent context sizes.

4 points

You’ll want to use a quantised model on your GPU. You could also use the CPU and offload some parts to the GPU with llama.cpp (an option in oobabooga). Llama.cpp models are in the GGUF format.
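If you want to skip the UI entirely, the standalone llama-cpp-python bindings load a quantised GGUF with partial GPU offload in a few lines. A minimal sketch; the model path is a placeholder and n_gpu_layers depends on how much VRAM you actually have.

```python
# Load a quantised GGUF model with llama.cpp's Python bindings,
# offloading part of the layers to the GPU (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="path/to/model-q4_k_m.gguf",  # placeholder: any quantised GGUF file
    n_ctx=4096,        # context window; more context -> more (V)RAM
    n_gpu_layers=20,   # layers offloaded to the GPU; tune to your VRAM
)

out = llm("### Instruction:\nSay hello.\n### Response:\n", max_tokens=64)
print(out["choices"][0]["text"])
```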

2 points

i second this request. please

1 point

See my other reply for some basic info & pointers.

1 point

Living in the future is so goddamn weird.

52 points

Hey Sugar, which of these pics have traffic lights on them?

bots gotta help each other

40 points

Yeah. Saw that one coming.

-6 points

Isn’t this what every app on your phone is already doing? Why would “AI Girlfriend App” be any better than Facebook or Google?

12 points

It’s even more insidious because these apps ask for the data upfront under the guise of “getting to know you better”

How can your AI girlfriend truly love you back unless you give it your real name, age, photos of your face, mother’s maiden name, social security number, and genetic material? /s

8 points

lol it’s not. That’s the point. It’s all the same sort of companies putting these fucking things out. Google and Facebook are both working on their own AI. It’s been shown over and over how they’re scraping all available data—public or private when available—so why the fuck would a newer tech company be any different? It’s the state of capitalism. The markets are focusing more on hyper focused data, so that’s what any profitable tech company is doing. Invading your fucking privacy.

That’s why it’s not a surprise. Because Facebook and google helped set the trend and determine the current tech market. And that’s the world these new “AI girlfriends” are existing in.

1 point

Idk, all my apps are open source, and most don't connect to the internet except the ones that need it, like Eternity for Reddit.
