Wololo
My first playthrough I sent Lae'zel straight to camp and never spoke to her again. At some random point, before Act 1 even ended, her romance dialogue came up.
My theory is the tadpoles ramp up sex drive to absurd levels, unless you're Shadowheart apparently. Illithid orgies must be incredible.
It varies is the answer for sure but I’d say 5ish or 10 minutes is average. It’s been longer, it’s been (much) faster, but it totally depends on context. Energy levels, general excitement, consumption of booze or other intoxicants, etc.
My unfortunate observation is the longer I am with a partner the more comfortable I slowly become, and the less time I’m able to last.
I’ve had similar experiences lately. Either that or it decides to review and analyze my code unprompted when I’m trying to troubleshoot a particularly tricky line. Had a few instances where it tried to borderline gaslight me into thinking that it was right and I was wrong about certain solutions. It feels like it happened rather suddenly too, it never used to do that save for the odd exception.
Actually, I've had a slightly opposite experience. I found that while asking it general programming questions, it initially tried to explain to me how I could achieve what I was looking to do, and lately it has been jumping straight to writing example code (sometimes even asking for my existing code so that it can modify it).
It's entirely possible! I remember listening to a podcast on AI where they mentioned someone once asked the question "which mammal lays the largest eggs?", to which the AI responded with elephants, and then proceeded to argue with the user that it was right and he was wrong.
It has become a lot easier as I’ve learned how to kind of coach it in the direction I want, pointing out obvious errors and showing it what I’m really looking to do.
AI is a great tool, when it works. As the technology improves I'm sure it will rapidly get better.
I literally broke down into tears doing this one night. I was running something that would take hours to complete and noticed an issue at maybe 11pm. Tried to troubleshoot and could not for the life of me figure it out. Thought to myself, surely ChatGPT can help me figure this out quickly. Fast forward to 3am, on a work night: "No, as stated several times prior, this will not resolve the issue. It causes it to X, Y, Z, when it should be A, B, C. Do you not understand the issue?"
"I apologize for any misunderstanding. You are correct, this will not cause the program to A, B, C. You should…" and then it inserts the same response it's been giving me for several hours.
It was at that moment that I realized these large language models might not currently be as advanced as people make them out to be.
It says someone was charged for returning fire? I'm curious about that. Were they arrested for illegal firearm possession, or was it something else?