The robots are coming, and researchers have a plan to make them seem more human.
Key Takeaways
- New research could teach robots to seem more human.
- MIT researchers have developed an AI model that understands the underlying relationships between objects in a scene and helps robots perform complex tasks.
- A growing number of robots are designed to act like humans.
MIT researchers have developed an artificial intelligence (AI) model that understands the underlying relationships between objects in a scene. This work could be applied in situations where robots must perform complex tasks, like assembling appliances. It also moves the field one step closer to making machines that can learn from and interact with their environments like humans do.
“Humanoid robots designed with AI technology perform a number of human tasks and carry out the roles of receptionists, personal assistants, front desk officers, and more in numerous sectors,” AI expert Sameer Maskey, a computer science professor and CEO at Fusemachines, told Lifewire in an email interview. “At the core of these near-human interactions lie AI algorithms that enable these systems, which are built to learn more with each new human interaction.”
Robots That Understand More
Humans can look at a scene and immediately see the relationships between the objects in it, but many AI models struggle to follow commands precisely because they miss those relationships, such as the fact that a spatula sits to the left of a stove.
Detailing their efforts to solve this problem, MIT researchers recently published a study describing a model that understands the underlying relationships between objects in a scene. Their model represents individual relationships one at a time, then combines these representations to describe the overall scene.
“When I look at a table, I can’t say that there is an object at XYZ location,” Yilun Du, a co-lead author of the paper, said in a news release. “Our minds don’t work like that. In our minds, when we understand a scene, we really understand it based on the relationships between the objects. We think that by building a system that can understand the relationships between objects, we could use that system to more effectively manipulate and change our environments.”
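To make the idea concrete, here is a minimal sketch, assuming a toy one-dimensional world: each relationship gets its own small scoring ("energy") function, the per-relationship scores are summed to describe the scene, and object positions are nudged downhill until every relationship holds. The helper names (`relation_energy`, `scene_energy`, `arrange`) and the hand-written `left_of` rule are hypothetical illustrations; the MIT system learns such functions from images rather than writing them by hand.

```python
# A hypothetical sketch of the compositional idea in a toy 1-D world.
# The real MIT model learns neural energy functions from images; the
# hand-written "left_of" rule below is illustrative only.

def relation_energy(positions, relation):
    """Return a low (zero) energy when one named relation holds."""
    a, b = positions[relation["a"]], positions[relation["b"]]
    if relation["kind"] == "left_of":
        margin = 1.0
        # Penalty grows when object `a` is not at least `margin` left of `b`.
        return max(0.0, a - b + margin) ** 2
    raise ValueError(f"unknown relation: {relation['kind']}")

def scene_energy(positions, relations):
    """Describe the whole scene by summing one term per relation."""
    return sum(relation_energy(positions, r) for r in relations)

def arrange(objects, relations, steps=500, lr=0.05, eps=1e-4):
    """Gradient-descend object positions until the composed energy is low."""
    positions = {name: 0.0 for name in objects}
    for _ in range(steps):
        for name in objects:
            # Numerical gradient of the composed energy w.r.t. one object.
            positions[name] += eps
            e_plus = scene_energy(positions, relations)
            positions[name] -= 2 * eps
            e_minus = scene_energy(positions, relations)
            positions[name] += eps  # restore original position
            positions[name] -= lr * (e_plus - e_minus) / (2 * eps)
    return positions

# "Spatula left of stove, stove left of sink" as two composed relations.
layout = arrange(
    ["spatula", "stove", "sink"],
    [{"kind": "left_of", "a": "spatula", "b": "stove"},
     {"kind": "left_of", "a": "stove", "b": "sink"}],
)
print({name: round(pos, 2) for name, pos in layout.items()})
```

Because each relationship contributes its own term, new combinations of relationships can be scored without retraining a single monolithic scene model, which mirrors the one-relationship-at-a-time flexibility the researchers describe.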
Move Over Roombas
A growing number of robots are designed to act like humans. For example, Kime, developed by Macco Robotics, is a beverage- and food-serving robot with smart sensors that manages tasks using self-learning processes and adaptive human interaction through AI technology.
There’s also T-HR3, introduced by Toyota, a third-generation humanoid robot that mimics the movements of human operators with capabilities to assist humans at home, in hospitals, and even in disaster-stricken areas.
Amelia, a conversational AI solution, is a digital humanoid robot developed to provide a human-like customer service experience. Amelia switches flexibly between different informal contexts without delay while recognizing human intents and emotional states.
New materials and sensors are even giving robots a “face” that lets them seem more realistic, Karen Panetta, a professor of electrical and computer engineering at Tufts University and IEEE fellow, told Lifewire in an email interview. Advancements in nanotechnology allow more sensors to be embedded in a robot’s face to emulate facial expressions far more accurately than ever before.
“The brains behind the robotic faces are leveraging the power of computational models utilizing artificial intelligence to process all the information it is sensing,” Panetta added, “such as imagery, sounds, and environmental conditions, to help train the robot to respond appropriately in both words and physical actions.”
One big market for humanoid robots is as aides for the elderly. Panetta explained that these helper robots could monitor patient health, take vitals, or guide patients through medications and medical routines. They could also monitor patient safety and call for help if they detect that a patient has fallen, hasn’t moved, or is in distress.
“Making the robots appear as humans is intended to make the interaction with humans more compassionate, less intimidating, and hopefully, more cognitively engaging for the patient,” Panetta added. “They can also assist patients with dementia to engage them in conversation and monitor their safety.”
Robotics is evolving, and with further advancements in AI, future robots might be capable of showcasing more human characteristics, Maskey said. However, even humans often find it difficult to understand emotions and gauge reactions.
“So the ability to pick up these subtle nuances and emotional cues is something that the robotic industry will continue to work on for a long time,” he added.