If you’ve watched any Olympics coverage this week, you’ve likely been confronted with an ad for Google’s Gemini AI called “Dear Sydney.” In it, a proud father seeks help writing a letter on behalf of his daughter, who is an aspiring runner and superfan of world-record-holding hurdler Sydney McLaughlin-Levrone.
“I’m pretty good with words, but this has to be just right,” the father intones before asking Gemini to “Help my daughter write a letter telling Sydney how inspiring she is…” Gemini dutifully responds with a draft letter in which the LLM tells the runner, on behalf of the daughter, that she wants to be “just like you.”
I think the most offensive thing about the ad is what it implies about the kinds of human tasks Google sees AI replacing. Rather than using LLMs to automate tedious busywork or answer difficult research questions, “Dear Sydney” presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.
Inserting Gemini into a child’s heartfelt request for parental help makes it seem like the parent in question is offloading their responsibilities to a computer in the coldest, most sterile way possible. More than that, it comes across as an attempt to avoid an opportunity to bond with a child over a shared interest in a creative way.
Idk, I think this is more honest and practical LLM advertising than what we’ve seen before
I like to say AI is good at what I’m bad at. I’m bad at writing emails, putting my emotions out there (unless I’m sleep-deprived to the point that I’m past self-consciousness), and advocating for my work. LLMs do in a few seconds what takes me hours, even running locally on my modest hardware.
AI will not replace workers without significant qualitative advancements… It can sure as hell smooth the edges in my own life
putting my emotions out there
You think AI is better than you at putting your emotions out there???
Talking to a rubber duck or writing to a person who isn’t there is an effective way to process your own thoughts and emotions
Talking to a rubber duck that can rephrase your words and occasionally offer suggestions is basically what therapy is. It absolutely can help me process my emotions and put them into words, or encourage me to put myself out there
That’s the problem with how people look at AI. It’s not a replacement for anything; it’s a tool that can do things that only a human could do before now. It doesn’t need to be right all the time, because it’s not thinking or feeling for me. It’s a tool that improves my ability to think and feel
Talking to a rubber duck that can rephrase your words and occasionally offer suggestions is basically what therapy is
Well, I am pretty sure psychologists and psychiatrists out there would be too polite to laugh at this nonsense.
That’s the problem with how people look at AI.
Precisely, you are giving it a TON more credit than it deserves
It’s a tool that improves my ability to think and feel
At this point, I am kind of concerned for you. You should try real therapy and see the difference
I get what they mean. It can help you articulate what you’re feeling. It can be very hard to find the right words a lot of the time.
If you’re using it as a template and then making it your own, what’s the harm?
It’s the equivalent of buying a card, not bothering to write anything in it, and just signing your name before mailing it out. The entire point of a fan letter (in this case) is the personal touch; if you’re just going to take a template and send it, you’re basically sending spam.
I am 100% for this if it’s yet another busywork communication in the office, but personal stuff should remain personal.
This is the same reason people think giving cash as a Valentine’s gift is unacceptable LOL
It can be very hard to find the right words a lot of the time.
That can be, in many cases, because you don’t read enough to have learned the proper words to express yourself. Maybe you’re even convinced that reading isn’t worth it.
If this is the case, you don’t have anything worth saying. Better stay silent.
It literally cannot, since it has zero insight into your feelings. You are just choosing pretty words that you think sound good.
I’d view it as an opportunity for AI to provide guidance like “how can I express this effectively”, rather than just an AI doing it instead of you in an “AI write this” way.
That’s true too; it can give you examples to get you started, although it can be pretty hit-or-miss for that. Most models tend to be very clinical and conservative when it comes to mental health and relationships
I like to use it to actively listen, help me arrange my thoughts, and encourage me to go through with things. Occasionally it surprises me with solid advice, but mostly it’s helpful to put things into words, have them read back to me, and decide if they sound true