Stanford University’s Computational Vision and Geometry Lab has developed a robot prototype that can autonomously navigate around a campus, understanding human behaviour and reacting accordingly.

Jackrabbot is built on a Segway platform, with a computer and sensors housed in its little white body. The sensors build up a view of its 3D surroundings, with 360-degree cameras and GPS helping it navigate without bumping into anything. Artificial intelligence and machine learning help the robot predict what is going to happen and react in a natural way, crucially, without those behaviours being explicitly programmed.
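The article does not detail Jackrabbot’s software stack, but the basic idea of turning 360-degree sensor readings into a collision-free path can be sketched simply. The Python example below, with invented names and parameter values, fuses range returns into a 2D occupancy grid and rejects waypoints that pass too close to obstacles; it is a toy illustration, not the lab’s implementation.

```python
import numpy as np

# Toy sketch, not Jackrabbot's actual software: fuse 360-degree range
# readings into a 2D occupancy grid, then reject waypoints that would
# bring the robot too close to an occupied cell.

GRID_SIZE = 100        # cells per side
CELL_M = 0.1           # metres per cell (a 10 m x 10 m local map)
SAFETY_RADIUS_M = 0.5  # minimum clearance from any obstacle

def build_occupancy_grid(ranges, angles, robot_xy=(5.0, 5.0)):
    """Mark the grid cell hit by each range return as occupied."""
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=bool)
    for r, a in zip(ranges, angles):
        x = robot_xy[0] + r * np.cos(a)
        y = robot_xy[1] + r * np.sin(a)
        col, row = int(x / CELL_M), int(y / CELL_M)
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row, col] = True
    return grid

def waypoint_is_safe(grid, x, y):
    """A waypoint is safe if no occupied cell lies within the safety radius."""
    occupied = np.argwhere(grid)            # (row, col) indices
    if occupied.size == 0:
        return True
    cells_xy = occupied[:, ::-1] * CELL_M   # convert to (x, y) in metres
    dists = np.linalg.norm(cells_xy - np.array([x, y]), axis=1)
    return bool(dists.min() > SAFETY_RADIUS_M)

# Usage: a single obstacle 2 m directly ahead of the robot at (5, 5).
grid = build_occupancy_grid(ranges=np.array([2.0]), angles=np.array([0.0]))
print(waypoint_is_safe(grid, 7.0, 5.0))  # False: on top of the obstacle
print(waypoint_is_safe(grid, 5.0, 7.0))  # True: clear to the side
```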

These capabilities also help the robot interact with humans in a natural way, both avoiding them when they block its path and allowing them personal space, just as people do when travelling through a crowded area.
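One classical, hand-coded way to produce that personal-space behaviour is the social force model, in which nearby pedestrians exert a repulsive “force” on the robot that decays with distance. The sketch below is a deliberately simplified version with illustrative constants; it is a textbook baseline, not Jackrabbot’s controller.

```python
import numpy as np

# Simplified social force model (after Helbing and Molnar, 1995): people
# exert a repulsive force on the robot that decays exponentially with
# distance, which yields personal-space keeping. Constants are illustrative.

A = 2.0  # repulsion strength, m/s^2
B = 0.5  # repulsion range, m (roughly "personal space")

def social_repulsion(robot_pos, pedestrians):
    """Sum the exponential forces pushing the robot away from each person."""
    force = np.zeros(2)
    for p in pedestrians:
        diff = robot_pos - p
        dist = np.linalg.norm(diff)
        if dist > 1e-6:
            force += A * np.exp(-dist / B) * (diff / dist)
    return force

def step(robot_pos, goal, pedestrians, dt=0.1, speed=1.0):
    """Blend attraction toward the goal with repulsion from pedestrians."""
    to_goal = goal - robot_pos
    attraction = speed * to_goal / (np.linalg.norm(to_goal) + 1e-6)
    return robot_pos + (attraction + social_repulsion(robot_pos, pedestrians)) * dt

# Usage: the robot detours around a pedestrian standing between it and its goal.
pos, goal = np.array([0.0, 0.0]), np.array([5.0, 0.0])
people = [np.array([2.5, 0.1])]
for _ in range(80):
    pos = step(pos, goal, people)
print(pos)  # ends close to the goal after curving around the pedestrian
```

Rules like these have to be hand-tuned, though, and, as Alahi notes below, many social conventions resist being written down at all.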

“These are some of the rules we are aware of; there might even be rules we are not aware of,” said postdoctoral researcher Alexandre Alahi, who works on Jackrabbot. “We wanted to see how we can learn all these [social conventions] with a machine learning approach and see if we can simulate them and predict them.”
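The article does not specify the model the lab uses, but the flavour of “learning rather than programming” can be shown with a toy example: a small recurrent network that learns to predict a pedestrian’s next position purely from observed motion. Everything below, including model size, data, and names, is invented for illustration.

```python
import torch
import torch.nn as nn

# Toy illustration of "learning rather than programming": a tiny recurrent
# network learns to predict a pedestrian's next (x, y) position from the
# last few observed positions. Model size, data, and names are invented.

class TrajectoryPredictor(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, past):          # past: (batch, steps, 2)
        out, _ = self.lstm(past)
        return self.head(out[:, -1])  # predicted next (x, y)

def make_batch(n=256, steps=8):
    """Synthetic walks at random headings; the model must infer the motion."""
    theta = torch.rand(n, 1) * 2 * torch.pi
    vel = torch.cat([torch.cos(theta), torch.sin(theta)], dim=1)
    t = torch.arange(steps + 1, dtype=torch.float32)
    traj = vel.unsqueeze(1) * t.view(1, -1, 1) * 0.1  # (n, steps + 1, 2)
    return traj[:, :-1], traj[:, -1]

model = TrajectoryPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(500):
    past, target = make_batch()
    loss = nn.functional.mse_loss(model(past), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"final training loss: {loss.item():.4f}")
```

Trained on real crowd trajectories rather than synthetic straight lines, the same recipe can absorb conventions such as yielding and keeping personal space without anyone encoding them as explicit rules.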

Jackrabbot took a team of seven Stanford researchers and engineers two years to build, and was inspired by the sense-and-avoid technology built into autonomous vehicles.

“For the self-driving car you can define a set of rules to encode the car to move around,” Alahi said. “We were wondering why we don’t have these robots in pedestrian-only areas, like on campus or airport terminals, to solve the problems of assisting people to move around.”

The team hopes its robots will be used to deliver goods in high-traffic areas such as hospitals, to aid the elderly, to patrol grounds as security guards, or to serve as personal assistants.