10.03.2026
12:14

AI companion robots remind users of their goals, offer encouragement, and listen like humans

Researchers at Purdue University are developing robots designed to interact with humans in socially and emotionally natural ways, El.kz reports, citing Interesting Engineering.

The systems aim to support people in everyday environments such as classrooms, hospitals, and therapy settings.

The work is led by Sooyeon Jeong, an assistant professor of computer science who studies how artificial intelligence systems communicate with people.

Her lab focuses on designing robots that respond to human behavior, emotions, and communication patterns.

“My goal, and the goal of my research group, is to design robots and AI that can have socially and emotionally natural interactions with people,” Jeong said. “I want anything I make or design to have a measurable positive impact on people’s lives.”

The research builds on earlier projects that used robots to support cancer patients, people with aphasia, hospitalized children, and older adults.

Robot study buddy

One of the team’s new experiments focuses on building a robot that can act as a study companion. The idea is to recreate the accountability and motivation people often get from working alongside peers.

“For a lot of people, working in a study group helps provide peer accountability,” Jeong said.

The researchers tested several versions of the robot during study sessions. In one version, the robot simply sat nearby and appeared to work on its own task while the student studied. In another, the robot periodically reminded the student about their goals.

A third version offered encouragement and wellness reminders during difficult study periods.

“In that third one, the robot would offer emotional support, saying things like, ‘You’re doing a great job! We can do this together!’”

The robot could also suggest short breaks for stretching or movement before returning to work.

The team found that no single strategy worked best for everyone. Some students responded well to reminders, while others preferred encouragement or simple companionship.

“It was interesting; we had mixed and nuanced results from the study,” Jeong said.

The results suggest future robots may need to adjust their behavior depending on a person’s mood, personality, and task.

Teaching robots to listen

Another project in Jeong’s lab focuses on making robots better conversational partners. The goal is to teach machines how to show they are listening during conversations.

“Many of the robot assistants people are used to — Siri, Alexa, Google — are discrete in their interactions with humans,” Jeong said. “You ask them a question, and they give you the answer.”

Human conversations, however, rely heavily on small signals that show engagement. These signals, known as backchannels, include gestures such as nodding or verbal cues like “right.”

“These things are called backchannels, and they’re part of how we as humans keep a conversation going.”

To build this capability, the researchers are analyzing recordings of human conversations. They use large language models and voice data to study changes in pitch, rhythm, and phrasing that signal attentive listening.
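The prosodic cues described above can be illustrated with a toy heuristic. The sketch below is an assumption for illustration only, not the Purdue team's actual model (which uses large language models and real voice data): it scans per-frame (pitch, energy) pairs and flags a moment as a backchannel opportunity when a sharp pitch fall is followed by a pause, the kind of cue that often invites a listener response like "right" or a nod.

```python
def backchannel_opportunities(frames, pitch_drop=20.0, pause_energy=0.1):
    """Toy backchannel-opportunity detector (illustrative heuristic).

    frames: list of (pitch_hz, energy) tuples, one per audio frame.
    Returns indices of frames where pitch fell sharply and the
    speaker then went quiet -- a common cue for listener feedback.
    """
    opportunities = []
    for i in range(1, len(frames)):
        prev_pitch, _ = frames[i - 1]
        pitch, energy = frames[i]
        falling = prev_pitch - pitch >= pitch_drop  # sharp pitch fall
        paused = energy <= pause_energy             # speaker went quiet
        if falling and paused:
            opportunities.append(i)
    return opportunities

# Example: pitch drops from 180 Hz to 150 Hz and energy collapses at frame 3.
frames = [(200, 0.8), (190, 0.7), (180, 0.6), (150, 0.05), (160, 0.5)]
print(backchannel_opportunities(frames))  # -> [3]
```

A real system would derive pitch and energy from a speech signal and would also weigh lexical context, but even this simple prosody-only rule shows how "when to say 'mm-hmm'" can be framed as a detection problem over voice features.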

According to Jeong, adding these behaviors can make robot interactions feel more natural and supportive.

“And we’ve found that the more empathetic listening behavior we can implement, the more personalized support we’re able to provide to people.”

The research is part of Purdue Computes, a university-wide initiative focused on artificial intelligence, robotics, and advanced computing.