Todd Mitchem has long tried to have honest and productive conversations with his son. “He’s 15 years old,” Mr. Mitchem, 52, said with a smile. “It’s very difficult to connect with teenagers.”
Whenever he tried to bring up a sensitive topic, his son would either give vague answers or walk away, preferring to avoid serious conversations altogether.
In the past, when Mr. Mitchem needed help as a father, he would read a book or pose a question to the men’s support group he meets with weekly.
But recently he turned to ChatGPT. And he is not alone: others are turning to AI chatbots to figure out what to say in situations that feel high stakes. They are using the tool to talk or read to their children, to approach bosses, to deliver difficult feedback, and to write wedding vows or love letters.
Unlike friends or even professionals, the bot, Mr. Mitchem said, gives what seems like objective advice. “The bot is giving me answers based on analytics and data, not human emotions,” he said.
ChatGPT, the new virtual tool powered by OpenAI, draws its information from a wide range of online material, including books, news articles, scientific journals, websites, and even message boards, allowing users to have humanlike conversations with a chatbot.
“It’s giving you what the collective hivemind on the Internet would say,” said Irina Raicu, who directs the Internet ethics program at Santa Clara University. (Other companies, including Google and Microsoft, have their own versions of this technology, and Microsoft’s, called Bing AI, recently became famous for aggressively declaring its love for New York Times reporter Kevin Roose.)
Mr. Mitchem, who lives in Denver and is executive vice president of learning and products for a leadership training company, opened the conversation by writing, in essence: “I need some friendly advice.”
“Okay, no problem,” ChatGPT responded, according to Mitchem. “What is your name?”
In the course of their conversation, ChatGPT told Mr. Mitchem that he was a good father for thinking about how to approach a conversation with his son about whether to join a basketball team. “It was like, ‘It’s okay if you don’t do it right, but it’s amazing that you try,’” he said.
Mr. Mitchem said the bot then continued: “Teenagers, as they grow up, are trying to assert their independence. Remember that when you talk to him, he needs to know that you trust his decisions.”
The next day, Mr. Mitchem approached his son and tried the advice. “I told him, ‘You have to make this decision, you’re 14 years old and I trust you’ll make a good decision,’” Mr. Mitchem said. “My son said, ‘Wow, that’s amazing. I’ll let you know what I decide.’”
“We left on a positive note,” Mr. Mitchem said. “It totally worked.”
Once upon a time …
For Naif Alanazi, 35, a Ph.D. student at Kent State University, bedtime is a sacred ritual for him and his 4-year-old daughter, Yasmeen. “I have to work all day,” he said. “This is our special moment.”
His family, which is from Saudi Arabia, has a deep tradition of telling oral histories. Wanting to continue it, he used to try to come up with new and exciting tales every night. “Do you know how hard it is to think of something new every day?” he asked, laughing.
Now, though, he lets the bot do the work.
Every night he asks ChatGPT to create a story involving people (his daughter’s teacher, for example) and places (her school, the park) from her day, along with a cliffhanger at the end so the story can continue the next night. “Sometimes I ask it to add a value that she needs to learn, like honesty or being kind to others,” he said.
“Being able to give her something that is more than a generic story, something that can increase our bond and show her that I am interested in her daily life,” he said, “makes me feel so much closer to her.”
Love languages
Anifa Musengimana, 25, a graduate student in international marketing in London, is sure that chatbots can help relieve the tedium of online dating. “I’m having a lot of repetitive conversations on these apps,” she said. “The app can give me fun ideas about what to talk about, and maybe I’ll find better people to date.”
“If I get intriguing answers, I’ll be drawn in,” she said.
She said she would tell her partner that she was using the tool. “I would like a guy who finds it fun,” she said. “I wouldn’t like a guy who is so serious that he gets mad at me for doing it.”
Some are using chatbots to improve the relationships they already have.
James Gregson, 40, a creative director living in Avon, Conn., has been using ChatGPT to compose love letters to his wife.
“I’m not a poet, I’m not a songwriter, but I can take themes about things my wife might like and put them into a song or a poem,” he said.
He also believes in full disclosure: “I’ll give her one, but I’ll tell her who wrote it,” he said. “I’m not trying to deceive her.”
Office applications
Jessica Massey, 29, a financial analyst at Cisco Systems who lives in Buffalo, has been drafting emails to her boss using ChatGPT. “I wanted to test its capabilities to see if there was a different way for AI to express what I was thinking in my head,” she wrote in an email. (One interviewee confessed to using ChatGPT to help prepare for his interview for this story. Another admitted to using it for employee reviews.)
Ms. Massey used the bot to write an email to her boss explaining why the company should pay for a certain professional certification. The bot gave her some nice boilerplate language, she said. She hasn’t sent it yet, but she plans to once she changes the verbiage a bit to make it “sound more like me.”
Ms. Massey, however, has a rule about using a chatbot: “Disclose it at the end of your work, or don’t use it at all.”
However, academics who study technology and ethics have mixed feelings about using ChatGPT for highly personal communication.
“We shouldn’t automatically reject tools that might help people deal with a difficult conversation,” said Michael Zimmer, director of the Center for Data, Ethics and Society at Marquette University. He likens it to buying a Hallmark card for a birthday or anniversary. “We’ve all agreed to do that because the words on the card align with something I believe in,” he said.
However, Ms. Raicu, from Santa Clara University, is concerned that people are using ChatGPT for personal communication. She doesn’t like the idea that there is a “right” and “wrong” way to communicate. “I think the right words depend on who the people are communicating with and the context,” she said. “There is no formula for a lot of these things.”
Ms. Raicu said that using ChatGPT for personal communication can undermine trust: “People might ask, ‘Do I really know who I’m talking to?'”