ThinkStock

New Study Shows Humans Feel Empathy for Robots

The humans on the television show Battlestar Galactica experience conflicting feelings when dealing with the humanoid Cylons. While some find it easy to torture the machines even though they resemble humans, many cringe at the thought of terrorizing the Cylons. It turns out the writers got this right: humans empathize with robots much as they empathize with other people.

Astrid Rosenthal-von der Pütten, of the University of Duisburg-Essen in Germany, began thinking about how humans relate to robots after a discussion about a YouTube video in which people destroy a dinosaur robot. While she watched the video she experienced conflicting emotions: The video amused her, but she also felt bad for the dinosaur. She wondered whether other people felt this way too, and decided to do some research to find out.

She and her colleagues conducted two studies. In the first, 40 participants watched videos in which a person either acts affectionately toward a robot shaped like a baby camarasaurus or attacks it. When the person kicked, strangled, punched, or dropped the robot, it cried, choked, or coughed. The researchers tracked the subjects’ skin conductance as they watched, a physiological measure of how much someone sweats; the more stressed we are, the more we sweat. The participants also answered questions about how they felt when they watched the person "hurt" the robot. The subjects sweated more during the abuse videos and reported feeling bad about the camarasaurus’ plight.

In the second study, the researchers asked people to watch videos of a robot dinosaur and of humans while an fMRI machine imaged the subjects’ brains to see how they processed the scenes. The videos featured a woman or a robot in a positive situation, being stroked or tickled, or a negative one, being beaten and choked. The fMRI scans showed similar brain activity whether people watched robots or humans being abused, which led the researchers to conclude that people feel empathy for robots.

“[W]e did not find large differences in brain activation when comparing the human and robot stimuli. Even though we assumed that the robot stimuli would trigger emotional processing, we expected these processes to be considerably weaker than for human stimuli. It seems that both stimuli undergo the same emotional processing,” writes Rosenthal-von der Pütten.

She will present her findings at the International Communication Association Conference in London this June.

Robin Van Lonkhuijsen/AFP/Getty Images

6 Great (and Not-So-Great) Works of Art Made by Robots

Cold, calculating, unfeeling—none of the stereotypes associated with robots seem to describe makers of great art. But that hasn’t stopped roboticists from trying to engineer the next Picasso in a lab. Some machines and algorithms are capable of crafting works impressive enough to fool even the toughest critics. As for the rest of the robot artists and writers out there, let’s just say they won’t have creative types fearing for their jobs anytime soon. 

1. A BEATLES-ESQUE POP SONG

If you heard the song above at a party or in a crowded store, you might assume it’s just a generic pop tune. But if you listened closer, you’d hear the dissonant vocals and nonsense lyrics that place this number in the sonic equivalent of the uncanny valley. “Daddy’s Car” was composed by an artificial intelligence system from the Sony CSL Research Laboratory. After analyzing sheet music from a variety of artists and genres, the AI generated the words, harmony, and melody for the song. A human composer chose the style (1960s Beatles-style pop) and did the producing and mixing, but other than that the music is all machine. It may not have topped the pop charts, but the song did give us the genius lyric: “Down on the ground, the rainbow led me to the sun.”

2. A NOVEL THAT MADE IT PAST THE FIRST ROUND OF A FICTION CONTEST

Will the next War and Peace be written by a complex computer algorithm? Probably not, but that isn’t to say that AI can’t compose some serviceable fiction with help from human minds. In 2016, a team of Japanese researchers invented a program and fed it the plot, characters, and general structure of an original story. They also wrote sentences for the system to choose from, so the content of the novel relied heavily on humans. But the final product, and the work required to string the components together, was made possible by AI. The researchers submitted the story to Japan's Nikkei Hoshi Shinichi Literary Contest, where it made it past the first round of judging. Though one notable Japanese author praised the novel for its structure, he also said some issues with the character descriptions held it back.

3. A 'NEW' REMBRANDT PAINTING

In 2016, a 3D printer did something extraordinary: It produced a brand-new painting in the spirit of a long-dead artist. The piece, titled “The Next Rembrandt,” would fit right in at an exhibition of work by the 17th-century Dutch painter. But this work is entirely modern. Bas Korsten, creative director at the Amsterdam-based advertising firm J. Walter Thompson, had a computer program analyze 346 Rembrandt paintings over 18 months. Every element of the final image, from the age of the subject and the color of his clothes to the physical brushstrokes, is reminiscent of the artist’s distinct style. But while it’s good enough to fool the amateur art fan, it failed to hold up under scrutiny from Rembrandt experts.

4. DREARY LOVE POETRY

What do you get when you dump thousands of unpublished romance novels into an AI system? Some incredibly bleak poetry, as Google discovered in 2016. The purpose of the neural network was to connect two separate sentences from a book into one whole thought. The result gave us such existential gems as this excerpt:

"there is no one else in the world.
there is no one else in sight.
they were the only ones who mattered.
they were the only ones left.
he had to be with me.
she had to be with him.
i had to do this.
i wanted to kill him.
i started to cry."

To be fair, the algorithm was designed to construct natural-sounding sentences rather than write great verse. But that doesn’t stop the passages from sounding oddly poetic.

5. A CREEPY CHRISTMAS SONG

Christmas songs rely heavily on formulas and clichés, a.k.a. ideal neural network fodder. So you’d think an AI program could whip up a fairly decent holiday tune, but a project from the University of Toronto proved it isn’t as easy as it sounds. The researchers’ algorithm was prompted to compose the song above based on a digital image of a Christmas tree. From there it somehow came up with trippy lyrics like, “I’ve always been there for the rest of our lives.”

6. A CROWDSOURCED ABSTRACT PAINTING

Art made by a robot.
Instapainting

The image above was painted by the mechanical arm of a robot, but naming the true artist of the piece gets complicated. That’s because the robotic painter was controlled by multiple users on the internet. In 2015, the commissioned art service Instapainting invited the online community at Twitch to crowdsource a painting. The robot, following script commands over a 36-hour period, produced what looks like graffiti-inspired abstract art. More impressive than the painting itself was the fact that the machine was able to paint it at all. Instapainting founder Chris Chen told artnet, “It was a $250 machine slapped together with quickly written software, so running it for that long was an endurance test.”

Guy Hoffman

This Cuddly Robot Can Help Teach Social Cues to Kids With Autism

When it sits still, Blossom resembles a handmade children's toy that's more basic than your average Barbie doll. But give it a moment and the soft, knitted body starts to move, bouncing and nodding in a way that doesn't make it seem any less warm and cuddly. Guy Hoffman of Cornell University designed Blossom to be a different type of robot, and he hopes his invention will eventually act as a social companion for kids with autism, Co.Design reports.

Kids who fall on the autism spectrum can have trouble picking up social cues like body language and facial expressions. Blossom could be used to demonstrate these interactions in an approachable way. Partnering with Google, Hoffman engineered his robot to watch YouTube videos and physically respond to the action on screen. By designing Blossom to detect and react to certain emotions, the idea is that it will teach the kids watching alongside it by example.

Hoffman understood that design is a crucial part of building an empathy robot. Instead of rigid metal, the skeleton is made from soft materials like rubber bands and silicone, which make for imperfect, lifelike movements. The elements visible from the outside, like the wooden ears and knitted wool, were chosen for their warmth and familiarity. Depending on how you dress it up, Blossom can resemble a cat, a bunny, or an octopus.

Many of the items that make up the device can be found around the house, and that's intentional. The goal is for families to one day build Blossoms of their own and pass them down from generation to generation.

The project is still in its early stages, and details on when it will be introduced to kids—and how effective it will be—aren't yet clear.

For now you can experience Blossom's unconventional cuteness in the video below.

[h/t Co.Design]
