Photograph by Max Aguilera-Hellweg, National Geographic
Published July 18, 2013
Would a robot serving you coffee in bed make waking up easier on weekday mornings? Could a household robot help an elderly relative who is living alone? How would you like to climb into a robotic car and eat breakfast with the kids while you're all driven to school and work?
These scenarios may sound like science fiction, but experts say they're a lot closer to becoming reality than you probably think.
Brown University roboticist Chad Jenkins expects a near-term robot revolution that will echo the computing revolution of recent decades. And he says it will be driven by enabling robots to learn more like humans do—by watching others demonstrate behaviors and by asking questions.
"The robots you're seeing now mostly are analogous to the mainframe computers of the 1970s," Jenkins said. "But you're starting to see things develop. The vacuum cleaners, the drones, those are the initial steps," he said, referring to iRobot's Roomba vacuum cleaner, which has autonomously cleaned millions of homes since its 2002 debut.
"And these platforms are going to get cheaper while becoming more capable and more compact," he added.
Jenkins isn't alone in anticipating a future in which robots will play a more active role in our lives. "Robotics is at a tipping point," said Sonia Chernova, director of Worcester Polytechnic Institute's Robot Autonomy and Interactive Learning (RAIL) lab, "transitioning from dirty, dull, and dangerous jobs to a broader set of applications."
And entrepreneurs like Dmitry Grishin, founder of a major Russian investment firm dedicated to personal robotics, have taken notice. He says the industry could be worth $18 billion by 2015. (Related: "Artificial Intelligence Is Working Hard So We Can Hardly Work.")
"Our planet has not seen revolutionary breakthroughs in the offline world for many years," Grishin wrote in announcing his group's initial investment plan in June 2012. "We are at the start of a transformative period for robotics."
But to really transform, roboticists say, robots will have to evolve from machines that can perform only the tasks they're programmed to do into automatons that can truly learn. That's because it's impossible to pre-program a robot for everything it will encounter in the ever-changing real world.
"If you look at where robots are successful right now, it's primarily in applications where the environment is very controlled, like an assembly line where everything is the same, or in the hands of experts with PhDs, where we see them on Mars," said Chernova.
"To really get them to handle the complexities of our real world," she said, "they are going to have to be customized onsite. It's not going to happen tomorrow, but it's very close."
Such a leap could help bring robots into the mainstream.
"At the end of the day, that's what we're really about: How do we get robots out of the lab and into the real world, working with real people so that they can do something with them?" said Jenkins, who is a National Geographic Emerging Explorer.
"So you don't have to be a programmer to program the robot, you just show the robot what you want it to do and it will learn."
Still image from video by Jason Kurtis, National Geographic
Mapping the Robot World
The challenge of developing artificial robot intelligence capable of learning through teaching revolves around the problem of perception. While today's robots are armed with sensors, scanners, cameras, and other high-tech tools, roboticists are still learning to help them make sense of what they encounter. (Related: "Soccer Robots Compete for 6th Annual RoboCup.")
Jenkins and others who want to develop robots that can learn from demonstration—an approach roboticists call "LfD"—are in the habit of mapping the cloud of data points generated by a given demonstration action, such as picking up a glass or navigating a maze.
Then they analyze that cloud to produce learning algorithms that allow robots to perform the tasks on their own.
"What you're trying to learn when doing LfD is essentially a mathematical function," Jenkins said on a recent trip to Washington, D.C., where he was accompanied by a human-sized, 450-pound PR2 robot that looks a bit like Rosie from The Jetsons—minus the maid accoutrements.
"It's a function that takes an input—the robot's perceived state of the world—and outputs the action that the robot should take at a given time," Jenkins said.
"So the robot can autonomously say, 'Oh, here's the situation that I'm in, and here's what the human did in that situation or a situation that looked like it, so I'm going to do what the human did.'"
Getting there involves taking volumes of data and applying machine learning and statistical technologies that can produce a mathematical function map connecting what a robot sees to the action we want it to take.
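The function Jenkins describes can be sketched very simply. The toy policy below imitates whichever demonstrated state looks most like the robot's current one—a one-nearest-neighbor lookup. The state encoding and action names are illustrative assumptions, not the PR2's actual representation.

```python
# Learning from demonstration as function fitting: demonstrations are
# (state, action) pairs; the learned "policy" maps a newly perceived state
# to the action the human took in the most similar recorded state.

def learn_policy(demonstrations):
    """demonstrations: list of (state, action) pairs, each state a tuple of numbers."""
    def policy(state):
        # Squared Euclidean distance to a demonstrated state.
        def distance(s):
            return sum((a - b) ** 2 for a, b in zip(s, state))
        # Imitate the action from the closest demonstrated state.
        nearest_state, action = min(demonstrations, key=lambda d: distance(d[0]))
        return action
    return policy

# Toy demonstrations of picking up a glass:
# state = (distance_to_glass, gripper_open) -> action the human took
demos = [
    ((0.9, 1), "approach"),
    ((0.1, 1), "close_gripper"),
    ((0.1, 0), "lift"),
]
policy = learn_policy(demos)
print(policy((0.8, 1)))  # a state close to the first demo -> "approach"
```

A real LfD system replaces the nearest-neighbor lookup with statistical machine learning over far larger demonstration sets, but the input-to-output shape of the problem is the same.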
"How do we give them knowledge of the 3-D world so that we can get them to do things and interact with them in the most natural way possible?" asked Jenkins, describing one of his key research questions. "Language and gestures are probably the easiest ways to make that happen." (See the Learning from Demonstration Robotics Challenge from the annual conference of the Association for the Advancement of Artificial Intelligence.)
For now, Jenkins's team and others are making steady progress toward that goal, training robots to stack blocks, pick up objects to tidy the lab space, and even boot a soccer ball.
Crowdsourcing Robot Learning
Even as they work intensely with robots in their labs, Jenkins and other roboticists say it's important to expose robots to a variety of teachers from the outside world.
Robots are better able to master learning-from-demonstration tasks when they're exposed to different demonstrations by different people. So Jenkins and others have turned to crowdsourcing to find teachers who can help train robots remotely.
The robots reside in facilities like the PR2 Remote Lab on Brown University's Providence, Rhode Island, campus, while the researchers who train them can be anywhere in the world.
The PR2 is an out-of-the-box, open-platform, R&D robot developed by Willow Garage robotics in Menlo Park, California, that can navigate in human environments and grasp or manipulate objects on command. Jenkins was on sabbatical at Willow Garage for the past year.
Chernova said that crowdsourcing is also an important step toward introducing robots into the daily lives of the masses.
In the past, algorithms that made robots work well when used by trained computer scientists often failed when ordinary people tried to use them.
"People may give the robot commands in the wrong order or too many at the same time, so it fails often and that's unacceptable," Chernova said.
To be more effective, robots need a more robust model of the world that includes the simple variations in terminology or knowledge that we take for granted. Consider the fact that something called a "red dish" or a "burgundy bowl" may in fact be the same object, or that "clean up this room" means different things to different people.
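One small piece of that robustness is mapping varied human labels onto a single canonical object, so that "red dish" and "burgundy bowl" resolve to the same thing. The sketch below uses a hand-written synonym table as a stand-in; a real system would have to learn these equivalences from many teachers rather than assume them.

```python
# Toy label normalization: collapse each word to a canonical form so that
# differently phrased crowd-sourced labels refer to the same object.
# The SYNONYMS table is an illustrative assumption.

SYNONYMS = {
    "red": "red", "burgundy": "red", "crimson": "red",
    "dish": "bowl", "bowl": "bowl",
}

def canonical_label(phrase):
    words = phrase.lower().split()
    return " ".join(SYNONYMS.get(w, w) for w in words)

print(canonical_label("burgundy dish") == canonical_label("red bowl"))  # True
```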
To help train robots to pick up on those nuances, Chernova turned to the model of microtask management, which uses the Internet to enable short-term business tasks—transcribing audio files or categorizing a company's inventory, for example—to be done by workers around the world.
"If it's not being used, we want to have the robot say, 'I'm free right now. I want to post a job on CrowdFlower,'" said Chernova, referring to a company that employs a million microtask workers in 90 nations. "'I need someone to teach me what these objects in my world are.' Hopefully a CrowdFlower worker will take the job and spend five minutes labeling things in the environment for the robot, or teaching it in some other way.
"Recruiting people can be a challenging and inefficient project," she added. "So we like having the robot be in charge of it." (Related: "Teaching Robots to Anticipate Human Actions.")
The Inquisitive Robot
To truly jumpstart their learning, robots need to ask more complex questions, such as, "Am I doing this correctly?"
Maya Cakmak, a postdoctoral researcher who is spending time at Willow Garage, said it's important for robots to ask questions because people aren't all that good at training them via demonstration.
Humans generally don't like repeating tasks, can't perform those tasks the exact same way every time, and are disinclined to demonstrate different methods a robot might use to complete the same task.
Inquisitive robots can help make us better teachers. Cakmak has performed studies that have helped to prove that—especially when robot programmers are non-experts.
In 2012, Cakmak led a team that had volunteers guide robots through assembly tasks to construct a toy house by combining a square block foundation with a triangular top. With passive learning, only one person in four showed the robot enough examples that it could understand how to complete the task on its own.
But when the robot asked questions about how to assemble the house, volunteers answered them—and the robot success rate soared to 100 percent. The robots processed the feedback into new actions and into mathematical functions that they could replicate later. (See a video of Cakmak and Simon the Social Robot.)
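The "inquisitive robot" idea can be sketched as a simple rule: when demonstrations disagree too much, ask a clarifying question instead of averaging blindly. The tolerance threshold and the `ask` callback below are illustrative assumptions, not a description of Cakmak's actual system.

```python
# Active learning sketch: a robot learning where to place a block measures
# how much its teachers' demonstrations disagree, and asks for help when
# the disagreement exceeds a tolerance.

def learn_placement(demonstrated_positions, ask, tolerance=0.05):
    """demonstrated_positions: x-coordinates (meters) where teachers placed the block.
    ask: callback that poses a question and returns the teacher's corrected value."""
    mean = sum(demonstrated_positions) / len(demonstrated_positions)
    spread = max(demonstrated_positions) - min(demonstrated_positions)
    if spread > tolerance:
        # Demonstrations disagree: ask rather than guess.
        return ask(f"I saw placements spanning {spread:.2f} m. Is {mean:.2f} correct?")
    return mean

# Consistent teachers: no question needed, just average the demos.
print(learn_placement([0.50, 0.51, 0.49], ask=lambda q: 0.5))
# Inconsistent teachers: the robot asks, and uses the teacher's answer.
print(learn_placement([0.2, 0.8], ask=lambda q: 0.5))
```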
How Human Is Too Human?
Movies like 2001: A Space Odyssey and The Terminator series have portrayed future scenarios in which robots become so advanced that they get the better of their human creators.
But Jenkins doesn't give much credence to fears of a robot takeover.
"If you think of the science fiction, where robots become sentient and take over because they can do things more efficiently, that's not going to happen," he said. "Because robots will only do what we program them to do."
He continued, "Robots are good at scripted tasks. Humans are good at doing things that are not necessarily structured. We can adapt, we can make do, we can improvise, and we can handle a variety of situations. Those are things that are going to be uniquely human, and I don't think that's going to change."
Photograph by David Paul Morris, Bloomberg/Getty Images
But Jenkins does worry about people using intelligent robots to achieve nefarious ends. (Related: "Are Robot Warriors Headed Into Battle?")
"The thing that you do have to be concerned about is giving more power to a small group of individuals," he said. "You're seeing how this is playing out with the weaponized drone debate. I think there are ethical concerns about how drones are being used."
To help work through such issues, Jenkins stresses the importance of the academic field of robot ethics, which has been around for decades, along with the need for regulation and liability.
"You want to be able to say, 'If you use the technology this way, then you've crossed a line that's not appropriate,'" he said. "I worry that the way wars might be fought is the way that we might play a game of StarCraft."
Worcester Polytechnic Institute's Chernova said that while some roboticists are focused on making robots as human as possible, the majority simply want to make them functional at performing tasks to help humans. They might not resemble humans at all.
"Consider Google's autonomous cars," she said. "That's very much a robotics technology—Google hired a lot of the top robotics researchers to make that vehicle. A robot is something that senses the world, reasons about that information, and acts in response to that information. Beyond that, the shape and function of the robot is limited only by our imagination and skill."
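Chernova's definition—something that senses, reasons, and acts—is essentially a control loop, which can be written down in a few lines. The sensor and actuator stubs here are toy assumptions standing in for real hardware.

```python
# Minimal sense-reason-act loop: observe the world, decide on an action,
# carry it out, repeat.

def run_robot(sense, decide, act, steps=3):
    for _ in range(steps):
        observation = sense()          # sense the world
        action = decide(observation)   # reason about that information
        act(action)                    # act in response

log = []
run_robot(
    sense=lambda: len(log),                       # toy observation: steps so far
    decide=lambda obs: "stop" if obs >= 2 else "go",
    act=log.append,
)
print(log)  # ['go', 'go', 'stop']
```

By this definition a self-driving car, a Roomba, and a PR2 all run the same loop; only the sensors, reasoning, and actuators differ.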
Jenkins, for his part, has designed robots that help paralyzed people pilot drones to explore the world outside their homes. And he has contributed to the development of a robot that, thanks to a Georgia Tech team's efforts, allowed a quadriplegic to shave by himself for the first time in years. (Read more about the Robots for Humanity project.)
But Jenkins says the future forms and functions of robots won't be driven by those who make robots their careers.
"I don't think we as roboticists understand what the interesting applications are," he said. "So if you want to develop an awesome robot app store, it can't come from us. It has to come from designers, artists, people developing consumer products. We just need to give them the tools."