While interactive robots have been popularized by science fiction over the past century, Assistant Professor of Computer Science Stefanie Tellex and her research group are working to make them a reality. By focusing on robotic language, perception and action, Tellex hopes to create a robot that can assist humans in both simple and complex tasks.
The group mainly works with three Baxter robots from Rethink Robotics, as well as a virtual reality system, a PR2 robot from Willow Garage and a variety of other equipment. Buying robots rather than building them allows research groups across the country to work on the same platform, said David Whitney GS. The price of such robots has dropped more than tenfold over the past seven or so years, he added.
Using these robots, each of Tellex’s three PhD candidates — assisted by undergraduates — focuses on one of three main areas: language, action and perception.
Whitney trains robots to understand natural language and move according to commands, while Nakul Gopalan GS focuses on perception, which includes a robot’s ability to recognize and navigate an environment. Specifically, Gopalan uses machine learning techniques to train robots to identify objects.
John Oberlin GS focuses on robotic action and the manipulation of objects. Using machine learning techniques and multiple cameras, Oberlin is enabling robots to recognize and pick up objects in various conditions.
Tellex recently gave a talk at the Massachusetts Institute of Technology explaining how the team adapted the robots' software to compensate for varying lighting conditions. While each PhD candidate specializes in a distinct area, there is a great deal of collaboration among them, Tellex said, with the researchers drawing on one another's techniques and methods. For example, Whitney and Oberlin, along with others, collaborated to integrate natural language commands with actions.
Tellex’s group is working on a number of smaller projects as well. Josh Roy ’19 and Izzy Brand ’19 are working on robotic quadcopters, an area of future focus for the group, Tellex said. Additionally, Eric Rosen ’18, who has worked with the group on a number of projects since 2014, is starting a new project that allows robots to interact in virtual reality. Rosen said Tellex has been extremely supportive of his work and is always available as a resource, even on more independent projects.
Tellex completed her PhD at MIT in 2010 and started working at Brown in 2013. In addition to the three PhD candidates, she has nine undergraduates in her research group, and four more undergraduates are in the process of joining.
While Tellex’s group has made significant progress on human-robot interaction, much work remains, Tellex said. Over the next five years, the group hopes to fully integrate its three areas of focus, eventually adding mobility to the stationary Baxter models. The group also hopes to flesh out its smaller projects and potentially fold them into the overall framework, she added.
Tellex’s group is also working in collaboration with Claudia Rebola, an associate professor of industrial design at the Rhode Island School of Design, to develop a better, more human-like face for the robot. Horatio Han, a RISD student in Rebola’s group, has designed a face that can display a range of emotions — unlike the current stoic face of the Baxter robot. Such a change will make the robot more realistic, Whitney said.
Tellex has high hopes for the future of robots, though she stresses the importance of human-robot interaction. According to her lab website, “As these machines become more powerful and more autonomous, it is critical to develop methods for enabling people to tell them what to do. Robots that can communicate with people using language can respond appropriately to commands given by humans, ask questions when they are confused and request help when they get stuck.”