Unfortunately, roleplay-focused chatbot models are typically fairly sizeable and demanding. I'm curious how this will develop with more purpose-built AI hardware though, like expansion cards with mostly tensor cores plus their own RAM, so you don't have to tie up your GPU for it. If the price of such hardware comes down, locally run models could become much more viable and mainstream.
Dude, sorry to say, but roleplay is not as important as medicine or coding XD
With coding you get the very software you're using daily, and with medicine you get actual medical advances.
I play D&D from time to time, but saying that roleplaying is more important than medicine is just nuts.