Issue #74: Cute, Very Cute


The RoboPsych Newsletter

Exploring The Psychology of 
Human-Robot Interaction
Issue 74: August 7, 2017


Cute, Very Cute
Quote of the Week:


Nearly every step wrought havoc upon the prototype walker's frame. Designed to activate landmines in the most direct means possible, the EOD robot was nevertheless persistent enough to pick itself back up after each explosion and hobble forth in search of more damage. It continued on until it could barely crawl, its broken metal belly scraping across the scorched earth as it dragged itself by a single remaining limb. The scene proved to be too much for those in attendance. The colonel in charge of the demonstration quickly put an end to the macabre display, reportedly unable to stand the scene before him. “This test, he charged, was inhumane,” according to the Washington Post.

But how can this be? This was a machine, a mechanical device explicitly built to be blown up in a human's stead. We don't mourn the loss of toasters or coffeemakers beyond the inconvenience of their absence, so why should a gangly robotic hexapod generate any more consternation than a freshly squashed bug? It comes down, in part, to the mind's habit of anthropomorphizing inanimate objects. And it's this mental quirk that could be exactly what humanity needs to climb out of the uncanny valley and begin making emotional connections with the robots around us.

These sorts of emotional connections come more easily in military applications, where soldiers’ lives depend on these devices working as they should. “They would say they were angry when a robot became disabled because it is an important tool, but then they would add ‘poor little guy,’ or they'd say they had a funeral for it,” Dr. Julie Carpenter of the University of Washington wrote in 2013. “These robots are critical tools they maintain, rely on, and use daily. They are also tools that happen to move around and act as a stand-in for a team member, keeping Explosive Ordnance Disposal personnel at a safer distance from harm.”

Andrew Tarantola
Engadget


 
Cute Mind Games 

Isn't that the cutest robot you've ever seen up there in that picture?

Cute. Nature has been working on our design for millions of years, slowly making untold minute changes both in the genes that eventually express themselves as human bodies and their capabilities, and in the environments in which those bodies find themselves.

Can you imagine a time before “cute” existed? When looking at that picture didn't bring a reflexive “awww” to your lips? 

How about this one?



It turns out that the same evolutionary mechanisms that make that puppy irresistible are also at work when we're interacting with robots. 

And, that's no accident. See, we know a lot about “cute.” Biologists have been researching the characteristics that most of us call “cute” for a long time, going back to ethologist Konrad Lorenz's work in the mid-20th century. Here's how he distinguished cute from not cute. Can you tell which is which?



Of course you can. Cute = images on left; not-cute = images on right.

We've designed objects with cute features for centuries. Why? Because nature has pre-loaded biological mechanisms into our DNA that greatly enhance the probability that we will form attachments with, and nurture, organisms that fit the “cute” mold. (See: human infants.)

We just can't help ourselves. 

Enter robots. 

As soon as we began to imagine artificial humanoid life forms, we were forced to confront the question of their cuteness. How cute should they be? That depends on the story you want to tell.

Want to tell a tale about the dangers of humans stepping into the divine job of creating life? Sounds like a job for not cute.



How about one about a boy who finds a robot that helps him find his life's purpose after his brother dies? Definitely calls for cute.

 

Now, all those psycho-physical mechanisms that evolved over the years automatically kick in the very second we perceive an object that resembles a human in even the remotest ways. Psychological experiments show that we will attribute human characteristics to anything...really, anything!...that looks or acts even vaguely human.

Don't believe me? Go watch this short video. Now, tell someone what you saw. What did you say? Did you describe several geometric shapes moving around a confined space in precise engineering terms? Or did you say something else? Perhaps something about “characters” playing out an ancient tale?   

Even when robot designers do their best to minimize their machines' human resemblance, we humans will find some feature that triggers our anthropomorphic instincts...especially if that machine has even a scintilla of cuteness.  

Well, so what? So cute robots will tug on our heartstrings a little. Big deal.

But, is cute always desirable? Are there times when cute might be a detriment? 

This week's Quote contains a clue to that answer. When a robot's purpose is to perform some dull, dirty, or dangerous task (like battlefield explosive ordnance disposal, or EOD), the last thing we want kicking in is a cuteness-triggered nurturing instinct. The quoted colonel's comment that the EOD test was “inhumane” is a dead giveaway that this robot was a bit too cute for the task at hand. 

All of which is to say that our desire to create robots with which we will establish emotional connections is not without peril. My podcast co-host, Dr. Julie Carpenter, has written that military personnel have, in fact, put themselves in danger out of reluctance to send EOD robots into “harm's way.” Then there's the question of inappropriate emotional connections between robots and children. There's little doubt that increasingly capable, cute home robots will become important companions for young children (as well as the elderly). How will we manage these relationships so that people do not make inappropriate decisions based on the power of cuteness?

Robots are going to enter our homes, much as domesticated dogs did centuries ago. Like dogs, robots will require care, and owning them will bring with it both the joys of companionship and some measure of emotional (and physical) risk. It is imperative that robot designers take their responsibilities seriously since they risk triggering some of our deepest evolutionary mechanisms with just the shape of a face, or the “longing” looks of slightly oversized “soulful” eyes.

An object's integrated form/function has rarely been as important as it will be in our soon-to-arrive social robots. 

Tom Guarriello, Ph.D. 

Thanks for subscribing to the RoboPsych Newsletter

If you're not subscribed, please sign up here.  
 
Be sure to subscribe to the RoboPsych Podcast!
Our mailing address is:
tom@RoboPsych.com
