It’s soft and fuzzy all over, with round, button eyes. Kai, as its latest incarnation is called, resembles a beakless baby owl with teddy-bear ears. It trembles with anxious heartbeat-like quivers, but when you stroke it, the beating slows down, coaxed into a purr. Kai is now calm.
Kai is a toy robot, developed by Sproutel, a company that based its design on a prototype from Katherine Isbister, a professor of computational media, and her collaborators and students. They conceived of the cute creature to help children calm down. Studies suggest that kids who don’t adequately learn to temper emotions—like anxiety, stress, or frustration—are at a higher risk of later developing mental disorders, doing worse at school, displaying criminal behaviors, and having low personal well-being. But by stroking Kai, children can, in turn, learn to soothe themselves.
Beyond vibrating and responding to a comforting touch, Kai doesn’t do much. It’s simple, and probably not what most people would think of as a robot, a word that conjures images of anthropomorphized chunks of metal with human-like intellect, such as C-3PO, Terminator, Wall-E, or Rosie from The Jetsons.
Exciting (or frightening) as they are, those kinds of robots are unlikely to exist anytime soon, says Leila Takayama, an acting associate professor of computational media.
“Science fiction does set a lot of our expectations of what robots are and what they can and should do,” she says. “But those are really far afield from what’s possible today or even in the near future.”
Instead, researchers like Takayama, Isbister, and others at UC Santa Cruz are ushering in a robotic future that’s more in line with Kai: technology that’s more realistic, modest, and—most importantly—useful.
“It’s going to be more human centered,” Takayama says. “That’s why we make the technology: not for the sake of making a Rosie, but because we need help with something.”
These robots will have specific tasks, and will be integrated into our daily lives so much that they may escape notice.
“We’ll have smaller, friendlier robots,” says Michael Wehner, an assistant professor of electrical and computer engineering, “and things you may not even consider to be robots.”
To be sure, many researchers are doing amazing things with sci-fi-like bots, such as Japanese roboticist Hiroshi Ishiguro’s unnervingly lifelike humanoids or Boston Dynamics’ four-legged machines that run and jump like dogs. But over the last five to ten years, the field of robotics has started to emphasize smaller, more practical devices that a regular person might actually use, according to Wehner.
“There has been a refocusing from the flashy to the realistic,” he says.
Wehner, for example, focuses on soft robotics—an approach that eschews metal for softer materials like gels, rubber, and plastic. Metal robots are sturdy, but can be dangerous. A collision with a steel-frame robot on the assembly line could send you to the hospital—or worse. Safety regulations dictate that such machines must move slower when humans work in close proximity, Wehner says. But slowing down negates one of a robot’s main advantages: its speed.
So instead of metal, imagine robots made of soft components like air-filled plastic tubes driven by pneumatic pumps: machines that look more like a balloon animal or a bouncy castle than Optimus Prime.
“If one of these robots hits you at full speed, it’s annoying, rather than dangerous or deadly,” Wehner says. “We feel that this is a way to really integrate machinery and robots into people’s daily lives.”
This type of integration would be particularly well-suited for medicine—such as for robots that safely lift and transport patients. Many of the actuators and sensors already exist for such devices, Wehner says, so researchers like him are working to integrate those components systematically.
Developing these design principles would also be useful for smaller medical devices like intubation tubes. Conventional intubation devices, which help a patient breathe in an emergency, are rigid tubes that a doctor inserts into the airway—which, if not properly done, can cause injury or even death. But a soft, semi-autonomous tube that can navigate down the airway could be a safer lifesaver.
Wehner has also worked on orthotic devices that help disabled people regain some mobility, such as for children with cerebral palsy or a condition called drop foot, in which they have difficulty lifting the front of their foot. These devices look like long underwear fitted with pneumatic tubes that push on the leg, giving its user a lift. Because they’re soft, they can be worn under clothes, making them not only more comfortable but also less noticeable, reducing any social stigma.
Suited for therapy
Although not soft, a similar kind of robotic wearable can also help in rehabilitation and physical therapy. Mircea Teodorescu and Sri Kurniawan are combining the power of robotic suits with virtual-reality games.
A person who, for example, has suffered a stroke or has cerebral palsy faces endless hours of physical therapy.
“The exercises are pretty monotonous and tedious,” says Kurniawan, a professor of computational media. “So anything that motivates you to exercise at home will be very helpful.”
Instead of the drudgery of repetitive exercises in your living room, you can virtually immerse yourself in outer space, where your therapy takes the form of catching falling stars with your arms. The games and competition serve as an escape that’s fun and beneficial. There’s even the cool factor of virtual-reality games, Kurniawan says, allowing older stroke patients to connect with grandchildren who are fellow gamers.
Because players have limited mobility, Teodorescu’s lab developed a robotic suit to give them a nudge.
“These suits can gently help you,” says Teodorescu, an associate professor of electrical and computer engineering. “It’s just going to lift your own hand a little bit.” That’s enough to keep someone playing, even if they can’t fully move their arm.
Players also wear sensors that monitor basic biofeedback data such as heart rate, stress levels, and the brain’s electrical activity. The feedback tells the game whether it’s too hard or too easy, prompting it to automatically adjust the difficulty level. This way, Kurniawan explains, your home therapy session adapts to your needs, enabling better progress between visits to a human physical therapist.
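That adaptive loop can be sketched in a few lines. The function below is only an illustrative toy, using a hypothetical heart-rate-based stress proxy and made-up thresholds, not the researchers’ actual system:

```python
# A minimal sketch of biofeedback-driven difficulty adjustment.
# The stress proxy (heart rate vs. resting rate) and the 1.1/1.3
# thresholds are hypothetical illustrations.

def adjust_difficulty(difficulty, heart_rate, resting_rate):
    """Nudge game difficulty (1-10) based on a crude stress estimate."""
    stress = heart_rate / resting_rate
    if stress > 1.3:                      # player seems overtaxed: ease off
        difficulty = max(1, difficulty - 1)
    elif stress < 1.1:                    # player seems comfortable: push harder
        difficulty = min(10, difficulty + 1)
    return difficulty                     # otherwise hold steady
```

A real system would fold in the other signals the article mentions, such as skin-conductance stress measures and EEG activity, but the shape of the loop is the same: sense the player, compare against a comfort band, and nudge the game accordingly.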
This kind of home therapy session would be invaluable for people who can’t afford regular physical therapy, don’t have adequate insurance, or live in remote places. With the cost of virtual-reality systems dropping to a few hundred dollars a unit, Kurniawan says, her goal is to make rehab and physical therapy accessible to as many people as possible.
The researchers now need to improve the suit—which so far is only designed for the upper body. The current iteration is essentially a wetsuit rigged with bicycle cables, requiring the assistance of grad students. The goal of the next one is to be more comfortable, reliable, and user-friendly, Teodorescu says.
The revolution will be user-friendly
And user-friendliness is crucial for robots to reach their full potential—no matter how cutting edge their technology. For researchers like Takayama, who are at the forefront of the burgeoning field of human-robot interaction, robots must be designed with regular human users in mind.
“If we didn’t do it,” she says, “we would end up with a bunch of robotic systems designed for roboticists.” Systems like, she points out, a beer-delivering robot that’s controlled with commands on a computer terminal—fun and impressive, but not necessarily a world-changer or even usable by a non-roboticist.
Making robots user-friendly opens them up to a more diverse set of people, who can then push the technology toward a wider range of applications. Takayama likens this effort to the development of the personal computer, which empowered regular people with technology that was previously the sole domain of computer scientists.
In Takayama’s case, such wide-ranging applications include a preliminary project to design robotic furniture. Picture chairs and tables that rearrange themselves to foster collaboration. She’s also starting to work with teams at the Monterey Bay Aquarium Research Institute to develop more user-friendly underwater robots. Typically, only specially trained engineers can operate the robotic underwater vehicles that explore the deep sea. But redesigned robots could allow scientists, students, and others to drive them, which could mean more exploration and more discoveries.
Here, the focus isn’t necessarily on new robot technology, but on how to best exploit that technology. Which brings us back to Kai, the fuzzy toy that helps children calm down. The creature relies on a simple vibrating motor and a few sensors. The novelty is in the design, which draws upon Isbister’s experience from studying how people interact with avatars in video games.
A key insight, she says, is that the toy invites you to pick it up and bring it close to you. Without an animated face or any moving limbs, your only interaction is tactile and necessarily intimate.
“You don’t really see that with other kinds of robots, so this is a really new kind of strategy,” she says.
By soothing this creature, a child can externalize her stress and recognize that it’s a solvable problem.
“It’s a way of projecting your troubles on another being,” Isbister says.
She and her students worked with collaborators at University College London, who ran a pilot study in London showing that a prototype robot did indeed have a calming effect on children.
“I was really sad when I had to leave it,” one child said. “I had such a good time with it, and I will never be the same without it.”
Sproutel, in partnership with Committee for Children, a nonprofit organization that has supported this research, is now mass-producing the latest version of the Kai robot. The next step, Isbister says, is to show its efficacy in more rigorous, large-scale studies, which will be led by Petr Slovak, an assistant professor of human-computer interaction at King’s College London.
Fuzzy toys, automated furniture, pump-powered balloon-like machines, and mechanized suits: Are these devices really robots? Roboticists don’t actually have an agreed-upon definition.
“It’s a blurry, fuzzy line,” Wehner says, “and one that I don’t think is necessarily one we have to draw.”
Takayama favors a broad definition: a robot is any machine that senses, plans, and acts. By that measure, even a thermostat or a dishwasher is a robot.
“We call it a dishwasher because it’s so useful,” she says. “We tend to call something a robot because we’re not sure what to call it, because we’re not sure what its job is.”
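Takayama’s sense-plan-act definition is concrete enough to sketch in code. The toy thermostat below is framed as that three-step loop; the sensor callable and the setpoint are hypothetical illustrations, not anything described in the article:

```python
# A toy thermostat written as a sense-plan-act loop.
# The sensor is any callable returning a temperature in Celsius;
# the 20.0-degree setpoint is an arbitrary example.

def sense(sensor):
    return sensor()                      # read the current temperature

def plan(temp, setpoint):
    return temp < setpoint               # decide whether heating is needed

def act(heater_on):
    return "heat" if heater_on else "idle"

def thermostat_step(sensor, setpoint=20.0):
    """Run one sense-plan-act cycle and return the chosen action."""
    return act(plan(sense(sensor), setpoint))
```

Trivial as it is, the loop has all three ingredients of the broad definition, which is exactly why the line between "appliance" and "robot" is hard to draw.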
So, in a way, the robot revolution may already be upon us, as automated technology continues to seep into our daily lives. Even if they’re not the human-like droids and bots of our science-fiction dreams, such devices are robots of a kind.
Fear of the robot uprising—maybe not a violent one, but one in which automation takes our jobs—is overblown, Takayama says.
“That assumes there’s going to be some day when things suddenly change. There’s not,” she says. “It’s going to be much, much slower than we think it’s going to be.”
Marcus Woo (SciCom ‘07) is a freelance writer based in San Jose. He’s written for WIRED, the BBC, National Geographic, Scientific American, and NPR, among other outlets.