10 Therapy Robots Designed to Help Humans


When most people think of robots, they probably think of cold, unfeeling machines (that may or may not be intelligent and hellbent on taking over the world). And yet, more and more people are using robots for companionship and therapy. Here are just a few of the many therapy robots out there.


Researchers at Japan's AIST developed PARO, a robot in the form of a cute white baby seal, for patients at hospitals and extended care facilities who could benefit from animal-assisted therapy but, for whatever reason—such as community rules that bar actual pets—can't have an animal. The interactive robot has five types of sensors that detect the environment around it, and it remembers how people interact with it: If you repeatedly pet PARO in a certain spot, it will remember that spot and react favorably to your touch; if you hit it because it did something you didn't like, PARO will remember not to do that action again.
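PARO's adaptation works like simple reinforcement: behaviors that earn petting become more likely, and behaviors followed by a hit become less likely. A minimal sketch of that idea (the behavior names and scoring scheme here are illustrative, not PARO's actual software):

```python
import random

class PetPreferences:
    """Track how a user responds to each behavior and adapt accordingly."""

    def __init__(self, behaviors):
        self.scores = {b: 0 for b in behaviors}

    def reward(self, behavior):
        # The user petted the robot after this behavior.
        self.scores[behavior] += 1

    def punish(self, behavior):
        # The user hit the robot after this behavior.
        self.scores[behavior] -= 1

    def choose(self):
        """Prefer the behavior with the best history; break ties randomly."""
        best = max(self.scores.values())
        return random.choice([b for b, s in self.scores.items() if s == best])

paro = PetPreferences(["chirp", "wiggle", "blink"])
paro.punish("wiggle")   # user disliked wiggling
paro.reward("chirp")    # user petted after a chirp
print(paro.choose())    # "chirp" — the only behavior with a positive score
```

Over many interactions, disliked behaviors sink in score and effectively disappear from the robot's repertoire, which is the effect the paragraph above describes.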

PARO recently had a starring role in the Netflix TV show Master of None. The robot was introduced at the beginning of "Old People" as Arnold’s (Eric Wareheim) grandfather’s robotic pet.

Currently, PARO is available to lease for about $200 a month; it can be bought outright for $6,000.


Hasbro has developed a new toy line, called Joy For All Companion Pets, for senior citizens in need of companionship who aren't able to care for a real animal. These robotic cats look, feel, and act pretty much like the real thing—they don’t walk, but thanks to built-in sensors, they purr and nuzzle when touched; they also meow, sleep, and roll over for belly rubs. The companion pets are available in three varieties (orange tabby, silver with white mitts, and creamy white) and retail for $100.


Student researchers at the University of Amsterdam developed Phobot, an interactive robot that serves as a strong visual and learning aid to help children who suffer from anxiety and phobias. It was built using various LEGO Mindstorms NXT kits and a number of RFID sensors. When Phobot is confronted by larger objects, it’s programmed to react with fear: It raises its eyebrows, turns around, and zips away in a panic. When it’s confronted with smaller objects, however, it can be coached not to be afraid. Then, when a larger object confronts Phobot again, it can be coached not to react with fear.

Researchers believe robots like Phobot can teach children how to deal with and ultimately overcome their phobias and anxieties. "This robot is there as a sort of buddy to help a child having any kind of actual fear, doing it step by step," team member Ork de Rooij said. Phobot was built for the student design competition at the 2008 Human-Robot Interaction conference in Amsterdam, where it was voted the conference's favorite robot; unfortunately, it's not available for purchase.



Studies have shown that our relationships with animals can create feelings of safety and security; being around domesticated animals like dogs and cats can have a positive effect on a patient's social, emotional, or cognitive well-being. Ollie the Baby Otter was specifically built for animal-assisted therapy, which, as the name implies, relies on animals to help people suffering from things like cancer, dementia, or post-traumatic stress; scientists hope that allowing a patient to cuddle Ollie during therapy will help him or her through the healing process.

In 2013, a class of MIT students in a course on Product Engineering Processes built Ollie for about $500, using a Raspberry Pi (a cheap and powerful computer) for its brain. Thanks to a sensor board and custom motor, it can also understand how someone is interacting with it through touch and respond favorably with movement and sound: The bot hugs a patient's hand and purrs when its belly is rubbed. Its users are encouraged to gently hold and cradle Ollie like an infant, but the robot is durable and waterproof.

Currently, Ollie is just a prototype, but its developers believe that the robot could be mass produced for as little as $90. 


BeatBots—a robot design studio based in San Francisco and Sendai, Japan—created Keepon Pro in 2003 specifically for children with developmental disorders like autism. People with autism often have trouble keeping eye contact with other people, so a therapist can use Keepon to interact with a child in a social setting without the child shutting down. Keepon's eyes are two small cameras, and its nose is a microphone, which feed information to the therapist in another room. The bot is equipped with four motors, which the therapist can control remotely. Keepon also features facial recognition software that can detect eye contact and movement. The robot is a pretty good dancer, too; the professional version has been featured in several music videos, and a mainstream version for kids, called My Keepon, was also developed.


In 2001, Japanese toy manufacturer Omron developed NeCoRo, one of the first robotic lap cats made for seniors in the country. While it couldn't walk or perform tricks, the cat contentedly purred when stroked and gave positive or negative emotional feedback depending on the user's actions; if a user neglected NeCoRo, for example, the robot would be less affectionate the next time they interacted. Only 500 units of the limited edition item were produced; each one cost 185,000 yen (about $1,530).


In 2014, French robotics company Aldebaran invented Pepper, a social humanoid robot designed to live in a person's home. It interacts with its owner through voice and touch, and was designed to understand human emotions: If its owner laughs, the robot understands that the person is happy; if a user frowns, the robot knows something is wrong. Pepper analyzes a user's body language, facial expressions, and words to gauge his or her mood. The robot is equipped with 3D cameras, an ultrasound system, and tactile sensors to explore the world around it and feel its owner's touch. It can even connect to the Internet to expand its knowledge. Currently, Pepper is used to greet and interact with customers at SoftBank Mobile stores in Japan. (The SoftBank Group is Aldebaran's parent company.)


In 2007, Sega worked with scientists and researchers from Tohoku University's Institute of Development, Aging and Cancer in northern Japan to develop and design the Dream Pet Series. While Sega is mostly known as a video game developer and publisher, it started manufacturing electronic toys in 2002, after the failure of the Sega Dreamcast ended its run as a gaming console giant.

Sega wanted to make its robotic household pets more realistic and highly therapeutic for patients and the elderly, who use the mechanical animals for relaxation and to ease tension. Sega's Dream Pet Series includes chicks, an owl, a kitten, a parrot, and a dog, along with two cats, Smile and Venus, released in 2007 and 2009, respectively. The robots retailed anywhere from $100 to $200.


The Spark Fund for Early Learning at The Sprout Fund in Pittsburgh helped local company Interbots develop Popchilla, a “puppeteerable robot” with a companion iPad app. The goal of the robot was to help children with autism learn to identify emotions, and, in turn, teach them to respond to social cues. “By using Popchilla as an intermediary, we hope to increase the understanding of the child’s internal feelings, thus reducing behavioral frustrations," Cindy Waeltermann, the Founder and Director of the Autism Centers of Pittsburgh, said. "If they are able to identify that they are ‘angry’ and what ‘angry’ means, it can significantly help them understand what they are feeling, reducing behavioral ramifications.”


The Hug is a soft robotic pillow, or CareBot, that uses sensors and wireless phone technology to add physical touch to a phone call, giving its users a stronger social and emotional connection to the person on the other end of the line. Researchers at Carnegie Mellon in Pittsburgh found that elderly people are among those most in need of emotional support, so The Hug [PDF] was designed with the sole purpose of delivering tactile and physical responses through voice recognition software and a small microphone built inside its cushion. Sadly, The Hug is not available for purchase; it was part of an academic research initiative to link robotics technology to intimate communication. But the Hugvie, a similar Japanese product, was developed in 2012; you can get it for $148.

MIT’s New AI Can Sense Your Movements Through Walls Using Radio Signals
Jason Dorfman, MIT CSAIL

New artificial intelligence technology developed at MIT can see through walls, and it knows what you’re doing.

RF-Pose, created by researchers at the Computer Science and Artificial Intelligence Laboratory (CSAIL), uses wireless signals to estimate a person’s pose through a wall. It can only come up with a 2D stick figure of your movements, but it can nonetheless see your actions.

The system, described in a new paper [PDF], uses a neural network to piece together radio signals bouncing off the human body. It takes advantage of the fact that the body reflects radio frequency signals in the Wi-Fi range: these signals pass through walls, but bounce off people.

Using data from low-power radio signals—1000 times lower than the power your home Wi-Fi router puts out—this algorithm can generate a relatively accurate picture of what the person behind the wall is doing by piecing together the signals reflected by the moving body.
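At a high level, the network turns reflected-signal data into per-joint confidence "heatmaps," then reads out each joint's position as the point of peak confidence; connecting those points yields the 2D stick figure. A toy sketch of that final readout step, using plain nested lists in place of real network output (the values and map size are invented for illustration):

```python
def keypoint_from_heatmap(heatmap):
    """Return the (row, col) of the strongest response in a 2D confidence map."""
    best, best_pos = float("-inf"), (0, 0)
    for r, row in enumerate(heatmap):
        for c, val in enumerate(row):
            if val > best:
                best, best_pos = val, (r, c)
    return best_pos

# One confidence map per body joint; here, a tiny fake "elbow" map.
elbow_map = [
    [0.1, 0.2, 0.1],
    [0.3, 0.9, 0.2],
    [0.1, 0.4, 0.1],
]
print(keypoint_from_heatmap(elbow_map))  # (1, 1)
```

A full system would produce one such map per joint per frame; the hard part—training the network to generate good maps from radio reflections—is what the paper contributes.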

The system can recognize movement in poor lighting and identify multiple individuals in a scene. Though the technology is still in development, it's not hard to imagine that the military might use it for surveillance, but the researchers also suggest that it may be useful for video game design and search-and-rescue missions. It might also help doctors monitor and analyze the movements of patients with disorders like Parkinson's disease and multiple sclerosis.

This is just the latest in a series of projects using radio signals to mimic X-ray vision. CSAIL has been working on similar technology using Wi-Fi signals for several years, creating algorithms to recognize human forms and see motion through obstructions. In the future, they hope to expand the system to be able to recognize movement with 3D images rather than the current 2D stick figures.

MIT Wants to Teach Robots to Do Your Chores

Teaching a robot basic human tasks is more of a challenge than it seems. To teach a robot to pour you a glass of orange juice, for instance, the 'bot has to not only recognize the command to take the juice out of the fridge and pour it into a glass, but also understand the many tiny aspects of the task that the human brain infers—like, say, the steps where you have to walk into the kitchen, open the cupboard, and grab an empty glass.

VirtualHome, a 3D virtual environment created by MIT's Computer Science and Artificial Intelligence Laboratory with researchers at the University of Toronto, is designed to teach robots exactly how to accomplish household tasks like pouring juice. The simulator acts as a training ground for artificial intelligence, turning a large set of household tasks into robot-friendly, sequence-by-sequence programs.

First, researchers created a knowledge base that the AI would use to perform tasks [PDF]. They asked participants on Amazon's Mechanical Turk to describe household activities, like making coffee or turning on the television, step by step. Because the descriptions were composed as if speaking to another human, they naturally left out steps a robot would need—the "watch TV" command didn't include obvious steps like "walk over to the TV" or "sit on the sofa and watch." The researchers then had the same participants generate programs for these tasks using a simple system designed to teach young kids how to code. All told, they created more than 2,800 programs for household tasks.
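A task program of this kind is just an ordered list of atomic steps, each pairing an action with the object it acts on. A minimal sketch of the "watch TV" example (the action names and syntax here are illustrative, not VirtualHome's exact program format):

```python
# "Watch TV" spelled out with the steps a human would leave implicit.
watch_tv = [
    ("walk", "living_room"),
    ("walk", "television"),
    ("switch_on", "television"),
    ("walk", "sofa"),
    ("sit", "sofa"),
    ("watch", "television"),
]

def describe(program):
    """Render a program as human-readable instructions, one per step."""
    return [f"{action} -> {target}" for action, target in program]

for line in describe(watch_tv):
    print(line)
```

Because every step is explicit and machine-readable, a simulator can execute the sequence directly—which is exactly what the researchers did next.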

An avatar sets the table in a simulated dining room.

Then, the researchers tested these programs in a Sims-inspired virtual home to see if the crowd-sourced instructions could work to train robots. They turned the programs into videos in which a virtual agent would execute the household task based on the code.

The researchers were focused on creating a virtual environment that could serve as a dataset for future AI training, rather than training any actual robots right now. But their model is designed so that one day, artificial intelligence could be trained by someone who isn't a robotics expert, converting natural language commands into robot-friendly code.

In the future, they hope to be able to turn videos from real life into similar programs, so that a robot could learn to do simple tasks by watching a YouTube video. An artificial intelligence system like Amazon's Alexa wouldn't need to be programmed by its manufacturer to do every single task—it could learn on the fly, without waiting for a developer to create a new skill.

