Robot see, robot do for socially apt machine


Within the 200,000-square-foot Technology Square Research Building, the Socially Intelligent Machines (SIM) Lab is developing the next revolution in advanced human-machine interaction: robots that learn from socializing.

“First, we revolutionized the office and the home using dumb terminals, but now we have embodied agents that can automate things in the real world. This may have major implications for automating and changing the world around us without direct human intervention,” said Nick dePalma, CS graduate student and research assistant at the SIM Lab.

Most modern robots have been employed as relatively simple, user-programmable vacuums or in industrial automation like car assembly lines, where tasks are repetitive and there is very little chance of interference.

“A critical issue is that we will not be able to preprogram these robots with every skill they will need to play a useful role in society; robots will need the ability to interact and learn new things ‘on the job’ from ordinary people,” according to SIM Lab’s website.

SIM Lab is a research group within the College of Computing's School of Interactive Computing. Its goal is to develop machines that can function in social, dynamic human environments. Since the lab's inception in 2007, its research has led to the creation of Simon, an upper-torso robot with a socially expressive head.

“The main focus of my group right now is socially guided machine learning. We’re interested in designing algorithms and interfaces to allow robots to learn interactively from everyday people,” said Dr. Andrea Thomaz, Assistant Professor with the School of Interactive Computing and director of the SIM Lab.

The lab has been working on two projects on the Simon platform: social attention and interactive task learning.

The lab has worked to ensure that Simon can react appropriately in a busy environment. First, Simon identifies the most important aspects of its surroundings from visual and auditory stimuli and assigns each a value of importance.

“Everything is fighting for the robot’s attention. If a loud sound is perceived the robot might glance in that direction, and then look back to see people trying to get the robot’s attention by waving objects,” Thomaz said.
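The competition Thomaz describes, where each stimulus gets an importance value and the strongest one wins the robot's gaze, can be sketched roughly as follows. The stimulus categories, weights and threshold here are illustrative assumptions, not the lab's actual attention model.

```python
from dataclasses import dataclass

@dataclass
class Stimulus:
    kind: str         # e.g. "sound", "motion" (illustrative categories)
    intensity: float  # normalized 0..1 strength of the raw signal

# Hypothetical importance weights per stimulus type; the real system's
# weighting scheme is not described in the article.
WEIGHTS = {"sound": 0.9, "motion": 0.6, "face": 0.8}

def salience(s: Stimulus) -> float:
    """Score a stimulus by its type weight times its intensity."""
    return WEIGHTS.get(s.kind, 0.1) * s.intensity

def attend(stimuli: list[Stimulus]) -> Stimulus:
    """Everything 'fights' for attention; the highest-salience stimulus wins."""
    return max(stimuli, key=salience)

scene = [
    Stimulus("sound", 0.9),   # a loud noise off to one side
    Stimulus("motion", 0.5),  # someone waving an object
]
target = attend(scene)
print(target.kind)  # the loud sound wins the glance first
```

Under this toy scoring, the robot glances at the loud sound first; once the sound fades and its intensity drops, the waved object would win the next round of the competition, matching the glance-then-look-back behavior Thomaz describes.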

Simon learns by demonstration and interaction, which can be accomplished by assigning it tasks. A human partner can tell Simon to grab an object and then tell it what should be done with that object. Simon can learn a model from just a few examples; the human partner can then introduce new objects, and Simon will sort them into their proper locations.

“Additionally, the teacher can let Simon ask questions by saying ‘Do you have any questions?’ Then Simon will scan the workspace looking for any objects that it is uncertain about,” Thomaz said.

“If such an object is found, this will lead to a query like, ‘What should we do with this one?’” Thomaz said.
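The teaching loop described above, in which a few demonstrations build a model, new objects are sorted, and uncertain ones trigger a question, can be sketched in miniature. The color-based features, bin names and lookup-table "model" are illustrative assumptions, not the lab's actual learning algorithm.

```python
def learn(examples):
    """Build a toy model mapping an observed feature (e.g. color) to a bin."""
    model = {}
    for feature, bin_name in examples:
        model[feature] = bin_name
    return model

def sort_or_ask(model, feature):
    """Return the learned destination, or a clarifying query if uncertain."""
    if feature in model:
        return ("place", model[feature])
    # No demonstration covers this object, so the robot asks the teacher.
    return ("ask", "What should we do with this one?")

# Teaching phase: a few demonstrations are enough to form the model.
model = learn([("red", "left bin"), ("blue", "right bin")])

print(sort_or_ask(model, "red"))    # ('place', 'left bin')
print(sort_or_ask(model, "green"))  # ('ask', 'What should we do with this one?')
```

A known object goes straight to its bin, while an unfamiliar one produces exactly the kind of query Thomaz quotes; the teacher's answer would then be fed back into the model as a new example.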

Simon’s first venture out of the lab, at the premier international human-computer interaction conference, allowed it to interact with over 100 people, including the attendees’ kids. This positive child interaction is a good sign for the future of Simon’s class of robots. Its body, proportioned to a 5’7” woman, is designed to work side-by-side with human counterparts and be unimposing and people-friendly. One of Simon’s eventual uses could be acting as a counterpart to teachers in classrooms.

“I try to analyze how people prefer teaching robots, and develop ways in which robots can improve a teacher’s instructions, for example by asking useful questions,” said Maya Cakmak, a Ph.D. student in Robotics and graduate student assistant with the SIM Lab.

Other robots already in service in schools have been lauded for feats like connecting with autistic children and teaching languages.

Simon is the latest incarnation of the previous robot projects Junior, Jimmy and Jenny. Junior, built from a webcam and Trossen Robotics’ Bioloid kit (a user-friendly, advanced modular robotics system), interacts with people and objects in a simple environment. Jimmy and Jenny are Juniors with wheels, able to navigate the workspace; they allow the lab to study peer learning and other biologically inspired behaviors as the robots work together.

Other Tech faculty collaborate with the SIM Lab on their own research relevant to human-robot interaction. Gil Weinberg, Director of Music Technology and Associate Professor with the Music Department, worked with the lab to develop the social behavior of his marimba-playing robotic musician, Shimon. Rosa Arriaga, Senior Research Scientist with the School of Interactive Computing and Director of Pediatric Research with the Health Systems Institute, is investigating how theories of developmental psychology and some of its seminal findings can be applied to the field of human-robot interaction.

Current or incoming students can work with the SIM Lab by doing an Undergraduate Research Opportunities in Computing (UROC) project, by contacting Dr. Thomaz about an independent study project or by taking Dr. Thomaz’s graduate courses in human-robot interaction and then joining as a graduate research assistant. The SIM Lab is also hiring a postdoctoral researcher in the areas of human-robot interaction and machine learning.