Luis Sentis will lead a session, “A Developer’s Primer for Coding Human Behavior in Bots,” at SXSW on Sunday, March 12, 2017.
Human-centered robotics holds a special place in the robotics field because its machines both mimic human sensing and cognitive behavior and are designed to assist humans, improving their safety and productivity. To explore human-centered robotics is to explore human beings and how we sense the world, analyze complex and often conflicting information, and act upon our findings, modifying perception, understanding, and action as new information becomes available.
Such machines could be of great practical benefit to humans on long space flights to Mars, for instance, or as human proxies in hazardous environments such as a chemical spill, or even in ordinary settings like education or elder care.
Obviously, creating human-centered robots poses many challenges in conception, design, and the hardware and software that support them. My own work in this burgeoning field focuses on designing high-performance series elastic actuators for robotic movements, embedded control systems, motion planning for dynamic locomotion, and real-time optimal control for human-centered robots.
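To make one of those ingredients concrete, here is a minimal sketch, in Python, of the idea behind a series elastic actuator: a spring sits between the motor and the joint, so measuring the spring’s deflection gives an estimate of torque that a controller can regulate directly. The stiffness value, gains, and function names below are illustrative assumptions, not parameters from any real robot.

```python
# Minimal sketch of series-elastic-actuator torque control.
# All names and constants are illustrative, not from any real robot.

SPRING_STIFFNESS = 350.0  # N*m/rad, assumed spring constant
KP, KD = 12.0, 0.4        # illustrative PD gains on the torque error

def estimated_torque(motor_angle, joint_angle):
    """Deflection of the spring between motor and joint gives the torque."""
    return SPRING_STIFFNESS * (motor_angle - joint_angle)

def torque_control_step(desired_torque, motor_angle, joint_angle,
                        prev_error, dt):
    """One PD control step: returns a motor command and the new error."""
    error = desired_torque - estimated_torque(motor_angle, joint_angle)
    d_error = (error - prev_error) / dt
    motor_command = KP * error + KD * d_error
    return motor_command, error
```

The design choice the spring embodies is worth noting: by deliberately adding compliance, the actuator can sense and limit the forces it applies, which matters when a robot works alongside people.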
Once we have a platform for human-centered robotics, and once we can create the hardware, software, and logic to drive it, we can turn to its many real-world applications.
Most readers probably have only a passing acquaintance with human-centered robotics, so allow me to use this brief blog post to introduce a few ideas about the topic and its challenges.
Humanoid and Human-Centered
Since perhaps the 1950s, television and the movies have often portrayed humanoid robots—robots that take roughly human form—entertaining us with how closely they mimic humans or by how far they fall short. Sometimes, in a dramatic plot turn, a humanoid robot becomes malevolent or uncontrollable by humans.
I prefer the term “human-centered robot” because it most closely describes my field of endeavor: how to create a robot focused on assisting a human being, sometimes guided by that human, but also learning on its own what action or behavior would be most helpful.
In my view, we do not yet have sufficient evidence to say that humanoid robots are most effective when interacting with humans. They may well be, but we do not have definitive data on the question.
However, it appears anecdotally true that humanoid robots fire the human imagination, and that has its benefits. In addition to their portrayal in popular media, I have found that humanoid robots draw the most, well, human interest. Soon after we created one named “Dreamer” in 2013 at the Human Centered Robotics Lab (HCRL) in the Cockrell School of Engineering at the University of Texas at Austin, we began receiving more attention from curious students, engineers, investors and, wouldn’t you know it, movie producers. (Dreamer eventually had a bit part in “Transformers 4” in 2014.) If humanoid robots help draw attention and interest to human-centered robotics, so be it.
Applications and “Productivity”
The more important aspect of this field is how to create human-centered robots that sense their surroundings and either respond to human direction or intuit what actions would best serve their human counterparts.
I’ve mentioned several possible applications: a companion to astronauts on a space walk, a human proxy in hazardous environments, a caregiver to an elderly person. In each case, the notion of productivity is different.
If you think of “productivity” for robotics generally in a manufacturing setting, it can be measured in terms of hours of work performed and profits earned. But in a long space journey to Mars, productivity will be measured instead in terms of the astronauts’ enhanced safety and ability to accomplish difficult tasks. In a hazmat spill, productivity might be measured in terms of human lives saved. In elder care, how well did a robot perform in changing bandages or applying ointment to a sore, preserving the person’s health?
Robot Knows Best
Another quest in human-centered robotics is to give a robot the ability not just to predict human behavior but to perform what I call “intervention.” Whatever its level of complexity, can we build a robot with logic that assesses a situation for optimal actions, whether directed by a human or not? This translates to a robot’s ability to say to itself, “Well, the human is operating the system in a certain way. We could do better: I can computationally form a hypothesis about what would be best for the human and intervene with that particular behavior.”
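To make the idea concrete, here is a hedged sketch of such intervention logic. The predict_outcome and utility functions, the candidate actions, and the threshold are hypothetical placeholders, not an actual HCRL interface; the point is only the decision rule: defer to the human unless the robot’s own hypothesis promises a clearly better outcome.

```python
# Illustrative sketch of "intervention" logic: the robot predicts the
# outcome of the human's current action, forms its own hypothesis, and
# intervenes only when it expects a clear improvement. The functions
# and threshold are hypothetical placeholders, not a real API.

INTERVENTION_THRESHOLD = 0.15  # assumed margin required before overriding

def choose_action(state, human_action, candidate_actions,
                  predict_outcome, utility):
    # Value of simply following the human's lead.
    human_value = utility(predict_outcome(state, human_action))
    # The robot's own best hypothesis about what to do.
    best_action = max(candidate_actions,
                      key=lambda a: utility(predict_outcome(state, a)))
    best_value = utility(predict_outcome(state, best_action))
    # Defer to the human unless the hypothesis is clearly better.
    if best_value - human_value > INTERVENTION_THRESHOLD:
        return best_action   # intervene
    return human_action      # follow the human's lead
```

The threshold is the interesting design choice: it keeps the robot from second-guessing its human counterpart over marginal gains, intervening only when the predicted benefit is substantial.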
This ability requires pairing social cognitive theories with mathematics. And I have found that advances are possible in modeling what social cognitive theory calls self-efficacy, which is basically the confidence in one’s ability to achieve a certain behavior.
At this point, self-efficacy can be achieved in very simple scenarios. One potential application is to use a human-centered robot to motivate students to solve problems by sensing and reacting to students’ level of engagement, then producing an interaction that motivates the student and enhances learning. I hope to demonstrate this and give attendees a chance to code such behavior in a human-centered robot at SXSW.
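As a toy illustration of that classroom scenario, the sketch below collapses a few invented engagement signals into a single score and maps it to a motivating interaction. Every signal, weight, and response here is an assumption made up for illustration, not a model we have validated.

```python
# Toy sketch of engagement-driven tutoring behavior, in the spirit of
# the classroom example above. All signals and weights are invented.

def engagement_score(gaze_on_task, response_time_s, errors_in_row):
    """Collapse a few hypothetical signals into one score in [0, 1]."""
    score = 0.6 * gaze_on_task            # fraction of time spent on task
    score += 0.4 * max(0.0, 1.0 - response_time_s / 30.0)
    score -= 0.1 * errors_in_row          # repeated failure drains confidence
    return min(1.0, max(0.0, score))

def tutoring_response(score):
    """Pick an interaction intended to sustain the student's self-efficacy."""
    if score > 0.7:
        return "offer a harder problem"
    if score > 0.4:
        return "give an encouraging hint"
    return "simplify the problem and praise partial progress"
```

For example, a student staring away from the screen after three failed attempts would score low and receive the simplest, most encouraging interaction, which is exactly the self-efficacy-preserving behavior described above.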
Touch and Whole Body Sensing
One major way in which humans interact is through touch. We place a hand on a shoulder or grasp someone’s forearm to gain their attention. Robots—particularly humanoid ones with mobility—are likely to be large and quick, so touch becomes an important element in the safety of their human counterparts. We do not want a robot that runs into an astronaut on a space walk or pins someone to a wall. Thus, we are developing what I call “whole body sensing.” Though some in this field are pursuing something known as “sensory skin,” at the HCRL we have taken a more economical approach to minimize the amount of electronics needed.
We use a distributed array of sensors on the robot’s surface, but they number in the dozens, not the thousands employed in sensory skin. We combine these with sensing modalities internal to the robot, such as accelerometers, which aid stabilization, and vibration sensors that enable the machine to triangulate what is happening in its immediate environment. This enables the robot to respond to human touch within the context of the other information it receives from its environment. We call this “whole-body contact awareness,” a combination of internal and external sensing and awareness.
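A hedged sketch of how such fusion might look in code appears below: readings from a modest surface array are combined with internal accelerometer and vibration signals to decide whether a contact is a gentle human touch or a collision, and roughly where it occurred. The thresholds and data layout are invented for illustration, not taken from our robots.

```python
# Hedged sketch of whole-body contact awareness: dozens of surface
# sensors plus internal accelerometer/vibration readings are fused to
# decide whether a contact is a gentle human touch or a collision.
# Thresholds and the sensor layout are invented for illustration.

TOUCH_FORCE_MAX = 5.0   # N; above this we assume a collision, not a touch
VIBRATION_SPIKE = 2.0   # arbitrary units marking an impact signature

def classify_contact(surface_readings, accel_magnitude, vibration_level):
    """surface_readings: {sensor_id: force_in_newtons} for active sensors."""
    if not surface_readings:
        return None
    # Roughly localize the contact at the most strongly activated sensor.
    location = max(surface_readings, key=surface_readings.get)
    peak_force = surface_readings[location]
    impact = (peak_force > TOUCH_FORCE_MAX
              or vibration_level > VIBRATION_SPIKE)
    return {
        "location": location,                       # which patch of the body
        "kind": "collision" if impact else "human_touch",
        "robot_was_moving": accel_magnitude > 0.5,  # internal context
    }
```

The internal signals are what make the classification whole-body rather than skin-deep: the same surface force reads differently depending on whether the robot itself was moving when the contact occurred.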
Spin-offs
I hope this mere glimpse into the world of human-centered robotics piques your curiosity. It may serve to attract those who wish to work in the field. But the general public should also understand that advances in this field will eventually make their way into our homes, our businesses, manufacturing, agriculture, smart cities, the Internet of Things, you name it. We’ll have systems someday, and in limited forms we already do, that sense human behaviors and intervene to produce optimal conditions based on an understanding of what’s best for the people involved in a particular situation.
Today, we have smart thermostats that learn our preferences for heating and cooling our homes. Tomorrow, we may have human-centered robotic systems that optimize our cities.
Luis Sentis is an Assistant Professor in the Department of Aerospace Engineering at the University of Texas (UT) at Austin. He also leads UT Austin’s Human Centered Robotics Laboratory and is co-founder of Apptronik Systems Inc., a contractor for NASA's Johnson Space Center.