Editor’s note: This is the first entry in a new video series, HardWIRED: Welcome to the Robotic Future, in which we explore the many fascinating machines that are transforming society. And we can’t do that without first defining what a robot even is.
When you hear the word “robot,” the first thing that probably comes to mind is a silvery humanoid, à la The Day the Earth Stood Still or C-3PO (more golden, I guess, but still metallic). But there’s also the Roomba, and autonomous drones, and technically also self-driving cars. A robot can be a lot of things these days―and this is just the beginning of their proliferation.
With so many different kinds of robots, how do you define what one is? It’s a physical thing―engineers agree on that, at least. But ask three different roboticists to define a robot and you’ll get three different answers. This isn’t a trivial semantic conundrum: Thinking about what a robot really is has implications for how humanity deals with the unfolding robo-revolution.
I’d like you to think about two drones. One you have to pilot yourself, and the other is autonomous, taking off, navigating obstacles, and landing all on its own. Are these both robots? Nope.
“I would say that a robot is a physically embodied artificially intelligent agent that can take actions that have effects on the physical world,” says roboticist Anca Dragan of UC Berkeley. According to that definition, a robot has to make decisions that in turn make it useful―that is, avoiding things like running itself into trees. So your dumb, cheapo RC quadcopter is no more a robot than an RC car. An autonomous drone, however, is a thinking agent that senses and interacts with its world. It’s a robot.
Intelligence, then, is a core component of what makes a robot a robot and not a wind-up toy. Kate Darling, a roboticist at the MIT Media Lab, agrees. “My definition of a robot, given that there is no very good universal definition, would probably be a physical machine that’s usually programmable by a computer that can execute tasks autonomously or automatically by itself,” she says. “What a lot of people tend to follow is this sense, think, act paradigm.” An RC drone can act, but only because you order it to. It can’t sense its environment or think about its next action. An autonomous drone, however, can do all three. It’s a physical embodiment of an artificial intelligence.
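The sense, think, act paradigm Darling mentions can be sketched as a simple control loop. This is a toy illustration with a hypothetical autonomous drone, not any real drone's API; every class and method name here is an assumption made up for the sketch.

```python
# A minimal sense-think-act loop for a hypothetical autonomous drone.
# All names and numbers are illustrative, not a real robotics API.

class AutonomousDrone:
    def __init__(self):
        self.altitude = 0.0  # meters

    def sense(self):
        """Read the environment (here: a stubbed obstacle distance)."""
        return {"obstacle_distance": 2.0}  # meters, faked for the sketch

    def think(self, observation):
        """Decide on an action based on what was sensed."""
        if observation["obstacle_distance"] < 5.0:
            return "climb"  # something is close: get above it
        return "hold"

    def act(self, action):
        """Change the physical world (here: adjust altitude)."""
        if action == "climb":
            self.altitude += 1.0


drone = AutonomousDrone()
obs = drone.sense()
action = drone.think(obs)
drone.act(action)
```

An RC drone, by contrast, runs only the `act` step: the sensing and thinking happen in the human holding the controller.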
Just how intelligent does a machine have to be to qualify as a robot, though? Lots of systems take in information from the outside world, process it, and then output an action—take the autopilot software that flies commercial planes. Hanumant Singh, a roboticist at Northeastern University, says a robot is “a system that exhibits ‘complex’ behavior and includes sensing and actuation.” He gives that definition to his students, then asks them to consider whether a Boeing 747 fits the bill. “It is automated, it is complex, it has sensing, it has actuation,” he says. “The students argue that it is not a robot because humans operate it a lot of the time, even though it has an autopilot.”
Also confusing are swallowable, magnetic “origami bots” that automatically unfold when they hit the acid of the stomach—reacting to their environment the way a truly intelligent bot would. But then a human operator has to use magnets to steer them around the digestive system to pick up things that shouldn’t be there, like swallowed batteries. Not so much a bot.
If a machine is truly autonomous, there’s a good chance it’s a robot—but there are different degrees of autonomous intelligence. It’s easy enough to program a machine to respond to a single environmental input with a single output. But as machine learning algorithms improve, robots will respond to their environments in ways that humans didn’t explicitly teach them to. And that’s the kind of intelligence that will get robots driving us around, helping the elderly, and keeping us company. “I’d say, yes, a robot is a physically embodied artificial intelligent agent,” says Dragan, “but an artificially intelligent agent to me is an agent that acts to maximize a person’s utility.” Meaning, newer, thinkier robots are more sensitive to the user’s needs.
To demonstrate, Dragan shows me a robotic arm in her lab that her team has programmed. Grasping a mug, the arm sweeps across a table. But Dragan doesn’t want it sweeping so high, so she grabs the arm and forces it closer to the surface. Because she hasn’t programmed the robot to sweep this low, it returns to its previous altitude. Its intelligence is limited to the simple rules it’s been given.
The second time around, though, the arm reacts differently to Dragan’s correction. She forces it to a lower altitude and it recognizes her new demand, continuing the rest of its sweep at that level. It’s a responsive brand of robot that we’ll be seeing more of in this world. Think robots that are not only sensitive to our needs, but anticipate them. More and more, we won’t need to intervene to correct robots’ behavior, but will interact with robots that learn to adapt to our whims.
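The difference between the two demos comes down to whether the arm treats a physical push as an obstacle to recover from or as evidence about what the person actually wants. A toy sketch of that idea, with made-up numbers and an update rule that are illustrative assumptions rather than the actual code from Dragan’s lab:

```python
# A toy model of correction-learning: an arm sweeping at a target
# height either snaps back after a human push, or adopts the pushed
# height as the user's new preference. Purely illustrative.

class SweepingArm:
    def __init__(self, target_height):
        self.target_height = target_height  # meters above the table

    def step(self, forced_height=None, learn=False):
        """One sweep step. A non-None forced_height models a human push."""
        if forced_height is None:
            return self.target_height  # follow the current plan
        if learn:
            # Treat the push as a preference signal and adopt it.
            self.target_height = forced_height
        return forced_height  # the human wins this step either way

# First demo: the arm snaps back to its programmed height after the push.
rigid = SweepingArm(target_height=0.30)
rigid.step(forced_height=0.10)   # human pushes it down...
height_after = rigid.step()      # ...but it returns to 0.30

# Second demo: the arm takes the push as a new demand and keeps it.
adaptive = SweepingArm(target_height=0.30)
adaptive.step(forced_height=0.10, learn=True)
height_learned = adaptive.step()  # now sweeps at 0.10
```

The rigid arm ignores the correction once the hand lets go; the adaptive one folds it into its plan, which is the responsive behavior described above.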
This nuance is important, because “robot” is a powerful word. It is at once something that makes people uncomfortable (killer robots, job-stealing robots, etc.) and that makes them feel nice (Kuri the extremely endearing companion robot). “The word robot generates a lot of attention and fascination and sometimes fear,” says Darling. “You can use it to get people’s attention. I mean, it’s much sexier to call something a robot than call something a dishwasher.”
For that matter, “robot” certainly sounds sexier than “physically embodied artificially intelligent agent.” But a robot is a machine that senses and acts on its world. And soon enough, our world will be full of them. Just probably not in, you know, a The Day the Earth Stood Still kind of way.