A team of researchers at the University of Washington set out to study whether conversational agents, such as Alexa or Siri, affect the way children communicate with other humans. The study found that this is probably not the case: children are sensitive to context in these conversations.
The experiments involved a conversational agent teaching 22 children between the ages of 5 and 10 to use the word “bungo” to ask the agent to speak more quickly. The children consistently used the word when the agent’s speech slowed. However, the meaning did not transfer when the children conversed with their parents, as most treated it as an inside joke. When the kids spoke to the researchers, they rarely used bungo.
The research was presented at the 2021 Interaction Design and Children conference in June.
Alexis Hiniker, the study’s senior author, is an assistant professor in the UW Information School.
“We were curious to know whether kids were picking up conversational habits from their everyday interactions with Alexa and other agents,” said Hiniker. “A lot of the existing research looks at agents designed to teach a particular skill, like math. That’s somewhat different from the habits a child might incidentally acquire by chatting with one of these things.”
A Five-Part Study
The five-part study involved each child visiting the lab with one parent and one researcher. In the first part, the children spoke to a simple animated robot or cactus on a tablet screen, which also displayed the text of the conversation. In another part, a researcher who was not present in the room asked each child questions; an app translated the questions into a synthetic voice that played for the child, and the researcher listened to the child’s responses and reactions.
During these experiments, 64% of the children remembered to use bungo the first time the robot slowed its speech. By the end of the sessions, every child had learned the routine.
The next part of the study introduced the children to a second agent, which started speaking at a normal speed and then periodically slowed. The agent did not remind the children to use bungo, and the conversation ended once the child said the word five times or let the agent speak slowly for five minutes.
In this part, 77% of the children successfully used bungo with the new agent.
Next, after the researcher left the room, the parent spoke with the child, also slowing their speech and giving no reminders.
Of the children who completed this part of the study, 68% used bungo when speaking with their parents. The researcher then returned to the room to hold a similar conversation at slow speed, and only 18% of the participating children used the word.
“The kids showed really sophisticated social awareness in their transfer behaviors,” Hiniker said. “They saw the conversation with the second agent as a place where it was appropriate to use the word bungo. With parents, they saw it as a chance to bond and play. And then with the researcher, who was a stranger, they instead took the socially safe route of using the more traditional conversational norm of not interrupting someone who’s talking to you.”
Testing it at Home
The researchers asked the parents to slow their speech at home over the next 24 hours. Of the participating families, 11 reported that their children continued to use bungo, though in a playful manner. Many of the children who had expressed skepticism in the lab did the same at home.
“There is a very deep sense for kids that robots are not people, and they did not want that line blurred,” Hiniker said. “So for the children who didn’t mind bringing this interaction to their parents, it became something new for them. It wasn’t like they were starting to treat their parents like robots. They were playing with them and connecting with someone they love.”
The findings suggest that children treat these agents differently from fellow humans, though conversations with the agents could still slightly influence a child’s habits.
“I think there’s a great opportunity here to develop educational experiences for conversational agents that kids can try out with their parents. There are so many conversational strategies that can help kids learn and grow and develop strong interpersonal relationships, such as labeling your feelings, using ‘I’ statements or standing up for others,” Hiniker said. “We saw that kids were excited to playfully practice a conversational interaction with their parents after they learned it from a device. My other takeaway for parents is not to worry. Parents know their kid best and have a good sense of whether these sorts of things shape their own child’s behavior. But I have more confidence after running this study that kids will do a good job of differentiating between devices and people.”