I was using my SO’s laptop. I had been talking (not searching, or otherwise typing) about some VPN solutions for my homelab, and got curious enough to use the new big Copilot button and ask what it can do. The beginning of this context was actually me asking if it could turn off my computer for me (it cannot), and then I asked this.
Very unnerving. I hate being paranoid enough to think that it actually picked up on the context of me talking, but again: it’s my SO’s laptop, so none of my technical search history to pull from.
Absolutely amazing.
My guess is that at this point there are so many user prompts in its training set that bring up both Copilot and privacy concerns that it first interpreted the question, then matched on the most common topic associated with itself (privacy), then spat out a hardcoded MSFT override response for ‘inquiry’ + ‘privacy’.
I want to believe that’s the explanation. I really would’ve expected at least a hardcoded “features and capabilities” response, or for it to be more than a neutered ChatGPT that I’m sure neither of us is going to use.
MSFT appears to still be using a fundamentally old chatbot model that they’ve just slapped a bunch of extra ‘features’ onto (namely, Wooow! It has APIs and works on other MSFT stuff!), much like Bethesda’s game engine.
Probably barely different from Tay in terms of broad conceptual design, just patched and upgraded to do what it does faster.
The core design is garbage, and just like Windows itself, it’s almost certainly a giant fucking mess of layers upon layers of different versions of itself hiding under a trench coat, all standing on top of something 10 to 20 years old.
I never needed Bing to open up in a side panel, and I’m not sure why I would want that now.
They’ve sunk billions into OpenAI and all, so why aren’t they using the most cutting-edge models from them? Or is it just severely lobotomized with some preprompt, to the point that I’d still rather open up ChatGPT?