The World Before Smartphones

My memory isn't what it used to be.

Neither is yours.

That's because we both have smartphones. 

And, to borrow the current popular construction: Smartphones are eating our memory.

My first "smartphone" was a Nokia N95. I was given one by Nokia at PopTech! 2007. I was thrilled. 

I'd owned cell phones before, of course, going back to early car phones in the mid 80s. They were great. Expensive. Unreliable. But, great.

But, the N95 was the first device that transformed me into a mobile-connected production unit.

First off, it had a camera. A good camera. That meant for the first time in my life I had a camera with me at all times and could take photos of anything I wanted.

That camera also recorded video.

That changed everything for me.

I'd started video blogging on YouTube in August 2006 and was increasingly fascinated by the social potential of video as a person-to-person communication platform. But, video blogging in those days meant lugging around a sizable camera and other gear.

Now, I had a video recording rig in my pocket all the time.

But, that was just the beginning.

What I also now had was instant, always-on access to the Internet. By 2007, Google had already become an important part of everyone’s life and now I had all that Googly search power in my pocket. 

That changed everything.   

Let's pause for a moment to think about a well-known psychological reality: convenient, reliable sources of information are rapidly appropriated into our cognitive repertoire. Our brains love to offload tasks that require close attention to simpler, less demanding methods and devices. 

I am, like you, a “cognitive miser.” 

As cognitive misers, our brains look for the simplest, most economical thought processes or operations that provide us with the information or answers we need to function every day. The cognitive miser model explains our extensive use (despite their shortcomings) of shortcuts, rules of thumb, and heuristics to solve problems and make decisions. 

The tendency to offload complex cognitive tasks to economically superior alternatives accounts for everything from the invention of writing to alarm clocks. That same tendency is what made the smartphone so popular. Our cognitively miserly brains never had a “friend” like the smartphone before. 

For all practical purposes, we now all carried around all the world’s accumulated knowledge in our pockets, where, at a moment’s notice, we could find answers to teasing questions like, “Who was the voice of Boba Fett?” (originally Jason Wingreen; Temuera Morrison in later releases) or, “When did Eddie Mathews break into the big leagues?” (April 15, 1952). 

Before smartphones, answering these questions was highly cognitively expensive. After, trivial. 

Suddenly, overnight, we were endowed with more cognitive power than any of the billions of people who’d preceded us on Earth.

As the last decade’s unfolded, we’ve become increasingly dependent on our smart devices for this cognitive outsourcing. As misers, we are loath to spend even a minute agonizingly trying to recall Marilyn Monroe’s first husband’s name when the answer is ready-to-hand through our smartphones. 

This dependency has made some fearful that we are losing our cognitive capabilities through diminished use; that our technology is “dumbing us down.”

And, in some sense, that observation is true. Can you recall your family’s telephone numbers as easily as you did a decade ago? Probably not. Your miserly nature has led to that information being delegated to your smartphone, just as your first electronic calculator enabled you to demote the energy-hogging multiplication tables out of short-term memory.  

Today’s world is dramatically different from the world before smartphones. Less than a decade ago we still had to struggle to remember names, numbers, directions, and other brute facts, using up precious cognitive resources that we can now turn to other matters.

And, that’s where robots come in…enabling us to turn energy formerly spent on routine, repeatable tasks towards more uniquely human, creative, fulfilling activities. 

The challenge we now face is to do so, and not turn ourselves into cognitive sloths in the process.

The world before smartphones is gone forever. We now need to use wisely the cognitive freedom they and their successors have given us.

In other words, I need to sharpen up my RoboPsych for the next big adventure. 

Pieces and Parts

The thing that most people don’t yet appreciate is the vast ecosystem that is developing around AI and robots. That system is composed of lots of pieces and parts that will soon begin converging on devices that will utilize them all as needed.

Take Quill. Quill is a software platform that describes itself as, “artificial intelligence at machine scale.” That means that the platform can be applied to analyze all kinds of data, at any scale, which it will then summarize in “data-driven narratives” for specific audiences and purposes.

Think about that.

I have a data set…say, reader traffic to RoboPsych.com…that I want to understand. Theoretically, I can employ Quill to analyze that data and provide me with a set of insights and conclusions in narrative form. That means the system will tell me what is interesting about that data and give me hints about what I can do about it. Of course, Quill is currently pitched to business clients.

But, what will it be like when Quill-Lite comes out and can be run on a robot like Jibo? What will it be like when I can say to my bot assistant: “How much have I spent on gasoline in the last three months? How much could I save by adopting what kinds of habits?” Or, "Here's the list of food I've eaten and activities I've engaged in over the last month that Jibo's kindly recorded for me. What do you think I should know about this dataset, Quill; what's interesting here?" As this site puts it: "a kind of executive summary of the self."
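
To make that concrete, here's a toy sketch of what such a "data-driven narrative" could look like in code. This is not Quill's actual API; the dataset, function name, and numbers are hypothetical, just to illustrate the idea of turning a log into a plain-English summary.

    # A toy sketch (not Quill's real API) of the "data-driven narrative" idea:
    # turn a small log that an assistant might keep into a one-sentence summary.
    # The figures below are made up for illustration.
    from statistics import mean

    gas_spending = {"June": 142.50, "July": 163.25, "August": 128.10}  # dollars per month

    def narrate_spending(spending):
        """Summarize a spending log in plain English."""
        total = sum(spending.values())
        avg = mean(spending.values())
        peak = max(spending, key=spending.get)
        return (f"You spent ${total:.2f} on gasoline over the last {len(spending)} months, "
                f"averaging ${avg:.2f} per month; {peak} was the most expensive, and matching "
                f"your average that month would have saved ${spending[peak] - avg:.2f}.")

    print(narrate_spending(gas_spending))

A real system like Quill goes much further, of course, deciding which findings are worth narrating for a given audience; the point is simply that the output is prose, not a chart.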

Lather. Rinse. Repeat.

And, this is where RoboPsych comes in. When early adopters acquire new, ever-present AI, their ability to use this kind of intelligent assistant to its maximum abilities will be a key factor in their success. The competitive advantages of this technology will be so pronounced that it will become another “haves/have-nots” factor.

So, it’s key to see the current development of system components…pieces and parts…like natural language and speech recognition, empathy engines, telepresence…in a holistic context. It's not that the individual pieces and parts are that interesting in and of themselves (some may interest you and others may not) but that the overall AI/robotic system capabilities are being created in functional apps.   

Soon, bots will combine apps just like PCs and phones did, giving us all an intelligence boost that we're just now beginning to appreciate.

But, we need to be psychologically prepared to use this technology to its fullest: we need to adopt the RoboPsych mindset that will enable us to take advantage of the next phase of our evolutionary journey.  

The World: Before and After

That's a picture of a busy Detroit street in 1900. It's quite an image. There are hundreds of people on the street, all engaging in very familiar activities: shopping, cajoling, living modern turn-of-the-century lives.

What's striking to me is the absence of machines. The only technology is the invisible camera that's taking the picture. From the look of the people facing it, that camera has caught their attention.

Imagine the same scene 10 years later. The Ford Model T was about to transform that street into a very different place. Telephone wires would be strung from building to building, connecting families with one another and far-flung loved ones. Electricity would light up the night, indoors and out.

Nothing would ever be the same.

When I see that photo, I think of how a picture of our streets taken today will look 20 years from now. Will cars, trucks, buses, and wires still crowd the roads? Will people be staring at the "black mirrors" in their hands for updates from loved ones and marketers alike? Will today's image evoke the same quaint nostalgia as the 1900 Detroit shot?

I think one thing that is almost certain to stand out will be the noticeable absence of visible robots.

There would be plenty of "robots" in a 2015 street snapshot. But, they're mostly invisible. Google, Siri, GPS, banking systems and scores of other software robots do our bidding today in easily beckoned efficiency. They make modern life possible. 

But, what about 20 years from now? What will that street scene look like then? Well, first of all, 2035's robots will definitely be more visibly and physically present, and more deeply integrated into the daily lives of tomorrow's citizens. Physical robot assistants will be walking the streets performing all manner of previously human-only labor.

People and robots walking, talking, agreeing and disagreeing...having all manner of animated human-robot interaction...are likely to be evident in this scene. Our descendants will be as comfortable carrying on conversations with robots as we are talking with one another. Those robots will not only be able to follow the content of the conversation but also understand how the meaning of the topics being discussed will be uniquely experienced by each of the humans involved; capable of appreciating (more than any of us will be) the complexity and subtlety of each individual's unique emotional/cognitive reactions to the situation at hand. They'll read those reactions from our (micro) facial expressions, body language, and voice inflection as well as deducing them from algorithmically formed profiles using our individual histories, habits, mindsets, propensities, and behavioral patterns.  

Our interdependence with robots will be broad and deep, just as it is today with other generally capable technology platforms: electricity, automobiles, and computers. 

What will that be like? What will it be like to peer back into the mist and see today's streets in the same way as we see this 1900 scene? What will it be like to have robots become as commonplace as cell phones and pets? What will it be like for us to recognize that we are no longer the planet's superior intellectual species? What will it be like when human beings recognize that we have been eclipsed by an artificial superintelligence that may very well have very different goals than assuring our own safe future?

What will it be like to look on 2015 as an example of the world in the days before everything changed?

How Do You React To Robots?

That's Baxter...on the job!

Take a second to look at the above image. How do you feel when you look at it? What do you think? What does it make you want to do?

That's Baxter, a state-of-the-art robot carrying out one of the many dull, dirty and/or dangerous jobs in today's industrial economy. 

Do you think the image is silly? A fake demonstration of some far-in-the-future fantasy? 

Do you feel vaguely anxious or angry when you see the photo?

Or, are you amused by Baxter's wide-eyed earnestness? Do you find yourself caught between contradictory feelings of pride in humanity's inventiveness and dread at the implications of the robot's relentless effectiveness and efficiency?  

Bottom line: Do you think it's a good thing that Baxter is picking and packing those widgets? Or, do you worry that the downsides of robot automation will ultimately lead to dangerous consequences?  

RoboPsych is about your psychological reactions to robots and what effects those reactions will have on your willingness and ability to use and work with the rapidly approaching robot workforce. 

If you haven't already, how about completing our brief survey concerning your thoughts and feelings about robots? The goal is to help you understand your current reactions to robots and identify your RoboPsych attitudes.

Ultimately, RoboPsych is about putting together a plan to prepare yourself for that day in the not too distant future when you look up and see Baxter being wheeled into the cubicle next to yours. 

Because, while we may not recognize it, dull, dirty and/or dangerous jobs come in all shapes and sizes. 

More on that here soon.

Why 2015?

What will make 2015 a pivotal year in robotics? 

A combination of factors:

  • Sensors - Both capability and price/performance improvements in the sensor world make functions like facial/emotional recognition practical and robotic bodies safer in close proximity to people.
  • Speech Recognition - Natural language processors have developed to the extent that non-experts can use voice commands to interact with software and hardware systems. Siri and Google Now are leading the way with Cortana and other systems to come.
  • Actuators - Robots have gained improved range of motion and degrees of freedom as actuator components have become widely available at falling prices, making even Lego-level actuators practical and functional.
  • Makers - The “Maker mindset” has caught hold and people are creating sophisticated hardware in a wide array of categories.
  • Investors - Google, Intel and others have scooped up robotics companies and the tech and business press have publicized robotics developments accordingly.
  • Popular Press - It’s hard to go a day without new stories of new robots showing up in every kind of news stream. 
  • Drones - Remote controlled aerial hardware was a hot toy this last Christmas. Not to mention military usage. More and more drone/GoPro videos are showing up, giving normal consumers examples of the potential  of these systems. Capabilities will definitely grow.
  • Functional Expansion - As robots begin to show up in more and more workplaces, leading edge users will appreciate more fully the potential for human-robot collaboration. 
  • Zeitgeist - Maybe the biggest factor is the current moment in cultural history. Sometimes, technological developments come along and the culture is not yet ready for them, or, conversely, perfectly suited for them. Think: Japan and healthcare robots. Globally, we are moving towards greater acceptance of robots in our daily lives; our individual and collective RoboPsychs are developing to the point where fear is overshadowed by curiosity, functional insights (“these things can help me”) and the incredible exuberance of young people in the presence of robots.

All these factors add up to 2015 looking like the biggest year yet for robots of all kinds, with a strong emphasis on in-home consumer social robots like Pepper, Jimmy, and Jibo.

The Robot Ecosystem

Here's a neat interactive depiction of the history of robotics since 1950 from a Boston Consulting Group publication. Note the strong focus on industrial robot systems and component infrastructure. As we move closer to 2014 we see a movement towards anthropomorphic systems, unmanned ground and air vehicles, precision surgical robotic assistants and social robotics. 

The future is filled with a rich ecosystem of robots, all of which demand different skills from their human partners. 


What Is a Robot?

RoboPsych is about enabling effective, efficient relationships between people and robots. 

But, what, exactly, is a robot? 

It used to be simple to answer that question: it's a big lumbering hunk of metal with stiff limbs and a mechanical voice.

Kind of like this:

Baxter and Human Friend

But, times change. Nowadays robots come in all shapes and sizes.

And that's why the European Union created a commission called RoboLaw to investigate the ethical and legal implications of the dramatic increase in the world's robot population.

Dramatic increase?

Well, yes, if you include all the different types of robots the RoboLaw commission includes.

The commission's report creates a taxonomy for categorizing robots based on five characteristics:

  1. Task - Sometimes called, “use.” This refers to the specific purpose or application for which the robot is designed.
  2. Environment - This is the space outside of the robot, where the robot will carry out its actions. The main distinction is between physical and non-physical environments. That means that robots that operate in space, in the air, on land, in water, or within the human body (or other biological environments), as well as in cyberspace, such as software 'bots, or "softbots," all fit the definition.
  3. Nature - This is the way in which a robot manifests itself or exists. The main distinction here is between embodied and disembodied robots. Machines, hybrid bionic systems and biological robots belong to the former sub-class, while software or virtual agents belong to the latter. This makes it possible to enlarge the definition to include virtual robots or softbots, artificial biological robots, such as nanorobots (Dong, Subramanian & Nelson, 2007), and even hybrid-bionic systems, which are made of biological and mechatronic components (e.g. limb prostheses).
  4. Human-robot interaction - This category takes into account the relationship between robots and human beings. It is a varied category including modes of interaction, interfaces, roles, and proximity between humans and robots.
  5. Autonomy - This is the robot’s degree of independence from an outside human supervisor in the execution of a task in a natural environment (i.e. out of a laboratory). Within this category different levels of autonomy can be included: full autonomy, semi-autonomy and tele-operation.
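
For readers who like to see structure in code, here is a minimal sketch of that taxonomy as a simple data model. The field names follow the five characteristics above; the enum values and the two example robots are illustrative assumptions on my part, not drawn from the commission's report.

    # A minimal data-model sketch of the RoboLaw taxonomy (illustrative, not official).
    from dataclasses import dataclass
    from enum import Enum

    class Nature(Enum):
        EMBODIED = "embodied"        # machines, hybrid bionic systems, biological robots
        DISEMBODIED = "disembodied"  # software or virtual agents ("softbots")

    class Autonomy(Enum):
        FULL = "full autonomy"
        SEMI = "semi-autonomy"
        TELEOPERATED = "tele-operation"

    @dataclass
    class Robot:
        task: str               # 1. the purpose or application it is designed for
        environment: str        # 2. the space where it acts (physical or non-physical)
        nature: Nature          # 3. embodied or disembodied
        human_interaction: str  # 4. interfaces, roles, proximity to people
        autonomy: Autonomy      # 5. independence from a human supervisor

    # Two very different systems that both fit the definition:
    baxter = Robot("pick and pack widgets", "factory floor", Nature.EMBODIED,
                   "works alongside human co-workers", Autonomy.SEMI)
    siri = Robot("answer spoken questions", "cyberspace", Nature.DISEMBODIED,
                 "conversational voice interface", Autonomy.FULL)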

All five of these characteristics are important but Nature and Human-robot interaction are especially relevant for RoboPsych.

Why?

Our strong tendency to anthropomorphize (to attribute human characteristics to objects or animals) primes us to interact with even disembodied robots as if they were people. We easily ascribe motives, emotions and personalities to objects with the simplest human resemblance.  

We treat objects as if they were human from a very early age

If the robot system has interactive capability (like Siri's ability to answer questions), we will treat the system in a (at least) quasi-human fashion.  

So, embodied and disembodied autonomous and semi-autonomous systems will trigger deep social characteristics in us, reflecting the current state of our RoboPsych tendencies towards those systems. As robots become ubiquitous, and take on a wide array of forms, each of us will be relating to them more frequently, and more collaboratively, than ever before.

That means our RoboPsychs will become increasingly important skill sets in navigating our everyday world.


    Not The 'Bots You're Looking For

    There's a moment in Star Wars Episode IV when Obi-Wan has to use a special Jedi mind trick to convince a couple of Imperial stormtroopers that R2-D2 and C-3PO "aren't the droids you're looking for."

    In a way, that's how it's going to be with robots in our world in the very near future.

    When we hear the word, "robot," most of us get an image of something like Robby from Forbidden Planet, The Jetsons' maid Rosie, or RoboCop. Humanoids who look more or less like us.

    But it's the 21st century and not all robots look like people. Some are designed to look like industrial machines. Others, like animals. Still others, not like anything at all...they're software.

    What will it be like to live in a world surrounded by robots? Riding in driverless cars? Working next to a machine that never takes a break? Having your elderly parents cared for by a 'bot that hands out their medications?

    For many of us, it's going to take some adjustment. After all, frightening stories about automatons go back to antiquity. 

    And those stories have left impressions...created a mindset towards robots...that I think of as our collective and individual RoboPsych.

    Gradually, we're going to develop our capacity to interact with robots, to develop relationships with them. Our natural tendency to anthropomorphize, and the utility that robots will bring to our lives, will help us to overcome the resistance that many of us will feel towards them.

    And, make no mistake: Robots will be challenging and face resistance.

    Some of us will find our jobs being done by robots a lot sooner than we might imagine. Who would have predicted just a few years ago that Watson, an IBM software 'bot, could defeat the biggest Jeopardy winner of all time? Or, that machines could cook our hamburgers? Or even perform surgery?

    Most of us will see our children and even grandchildren far outpacing our ability to use these ingenious systems. 

    What will we do?

    What will you do?

    A question we all will face is: "How will I improve my own RoboPsych, my ability to successfully interact with robots?"

    Well, just as you started as a novice when you learned to drive a car and gradually became proficient, you can also develop your robot interaction skillset, your RoboPsych.

    That's what RoboPsych.com is all about.  

    Human-Robot Interaction

    Learning to interact effectively with robots is a critical 21st century skill.

    Remember your first bike ride? Your first driving lesson? If you do, it's probably because they were pretty scary. 

    That's how many of us are feeling about the prospect of "interacting" with robots. 

    Even that phrase, "interacting with robots," is enough to send lots of us off into thoughts of being trampled or just being totally creeped out.

    The truth is, even the simplest of machines take some getting used to, and the robots we'll be encountering in the coming decade will be anything but simple.

    But, learning to become a collaborative, interactive partner with robots is about to become one of the most important skills in the 21st century. How can you improve those skills?

    First, it's important to evaluate your current thoughts and feelings about robots. That sounds a little strange, doesn't it? When you think about it, however, you'll soon discover that the word "robot" elicits a set of emotional and cognitive reactions that might surprise you. After all, decades of science fiction about robots have left impressions that most of us are not even aware of. So, a good place to start might be to take this brief RoboThoughts survey.

    What we see is that many of us carry around a set of negative attitudes about robots that we aren't even aware of. Those attitudes seem harmless enough. Who cares if we don't think it's a good idea to let robots drive cars?

    But, those attitudes can have a hidden impact on our willingness to take advantage of one of the greatest technological breakthroughs in our lifetimes: social robotics. 

    When robots moved from novels and the movie screen to a special fenced-in part of the automobile factory floor, it wasn't that important for most of us to improve our ability to interact with them. 

    Now that we have those robots in our pockets, homes, and cars, however, we'll all need to improve our ability to interact with them effectively and efficiently.

    That's what I call your level of "RoboPsych Competency." RoboPsych is a combination of emotional and cognitive attitudes and abilities that establishes our level of willingness and proficiency at interacting with robots.

    Think of your current RoboPsych as a starting point for developing a level of expertise at integrating robots into your life to accomplish a wide range of tasks. The higher our RoboPsych Competency, the more creative we will be at using robots not just to make up for our deficiencies (e.g., reminding us not to forget things like birthdays and appointments) but also to help us become more productive, successful, happier versions of ourselves. 

    How can robots do that?

    That's one of the questions we'll be exploring together here over the coming months. 

    Caring Robots?

    Caregiver robots will succeed by making recipients comfortable with new kinds of relationships.

    Very soon, a new generation of caregivers will begin helping elderly people and others who need assistance with Activities of Daily Living or ADLs. 

    These caregivers will be robots.

    It's going to take some work by both ADL providers and recipients to get comfortable with these relationships.

    Providers are hard at work creating robots with several important characteristics:

    1. Technically capable - caregivers must be able to navigate a recipient's environment safely and carry out caregiving responsibilities adequately.
    2. Emotionally acceptable - caregivers must be able to deliver their caregiving services in ways that are, at minimum, non-threatening. This means their physical presence must be designed to send comforting signals. Caregiver robots will need "faces," "bodies" and "gestures" that reassure recipients that they are not dangerous and, in fact, "caring." Care is indicated via emotionally empathic behavior, a tall order for robots to exhibit.

    Recipients will be able to take advantage of these caregiver resources if they are:

    1. Open - the barriers to accepting care from robots will be predominantly emotional, so successful providers will help recipients to adopt an open attitude toward the experience of receiving robot care.
    2. Trained - teaching recipients to interact with robots will become an important part of provider services. Many ADL recipients today receive training in using various types of durable medical equipment. In the future, this training will have to include robot interaction.

    Given the size of the global elderly population, caregiver robots are poised to become a huge economic opportunity.  To succeed, providers will have to take the realities of this new relationship into account when designing hardware, software and user orientation. 

    Don't Get Stuck Using Extinct Thinking

    Daniel Kahneman, Nobel Prize Winner, Very Wise Man

    That's Nobel Prize winner Daniel Kahneman up there along with the cover of his highly successful book, Thinking, Fast and Slow. One of the book's many insights is Kahneman's description of the two "systems," or ways of thinking, referenced by the title.

    Fast thinking ("System One") is largely unconscious and responsible for us safely navigating the world relatively effortlessly. It's also prone to a set of errors, biases arising from its use of rules of thumb, heuristic shortcuts that enable us to make quick decisions.

    Slow thinking ("System Two") is a deliberate approach to solving problems, utilizing logic and other step-by-step procedures.

    System One immediately knows that 2 + 2 equals 4 but has no idea what 24 x 17 is. That's a job for System Two.

    While System One contains a host of useful information and algorithms ("always pay attention to the source of the siren") it's also home to a host of questionable rules that can lead us astray in the modern world.

    Take robots, for example. 

    A lot of our ideas about robots come from fictional portrayals of out-of-control creatures. Ever since the villagers broke out the torches and pitchforks to hunt down Dr. Frankenstein's creation, we've been storing up images of doom machines. (Now, machines themselves have been scaring us since the Industrial Revolution, but robots added their extra layer of human resemblance to freak us out even more.)

    It's impossible to read a day's worth of robotics news without coming across the word "apocalypse" at least twice.

    The ideas we have about robots are our System One at work, helping to keep the world a simple, survivable place by sorting out the scary from the secure. 

    Because of these residual images, robots definitely scare System One. 

    “System 1 has been shaped by evolution to provide a continuous assessment of the main problems that an organism must solve to survive: How are things going? Is there a threat or a major opportunity? Should I approach or avoid?… situations are constantly evaluated as good or bad, requiring escape or permitting approach” (Kahneman, p. 89)

    But, in a world where we're about to be side-by-side with robots doing everything from working with us in a factory to taking care of a loved one in a nursing home, we're going to have to get over those fears.

    We're going to have to change our RoboPsych so that we're not unduly alarmed whenever we come across a robot.

    Think about your first driving lesson or first airplane flight. Chances are your System One alarm bells were pinging pretty steadily. 

    But you learned that you could control the car with those pedals and that wheel and that the folks up front had the plane pretty much figured out.

    That's the way it's going to be with robots, too.

    You'll find yourself becoming increasingly at ease in their presence and maybe even developing a relationship with them, like the one you have with your car.

    The thing is, though, we continue to write lots of robot fear stories. Why? Because the possibilities, however remote, are sooooo scary. I mean, the robots could decide they don't want to obey us anymore, right? And they might become impossible to identify, and start making bombs, and...all of which is probably about as likely as every airliner forgetting how to fly and plummeting to earth on the same day.

    As we begin encountering more and more robots in our everyday lives (the next decade is going to be eye-opening on this count) we'll have to come to terms with our current RoboPsych, our soon-to-be-outdated way of thinking about and emotionally reacting to robots. 

    Like our ancestors who didn't grow up with cars and jet planes, we're about to become the first generation to co-habit the world with amazing new intelligent machines. 

    And, System One's going to get quite a workout as we make the transition from our current world to that one.