MIT has developed a teleoperation system, powered by an Oculus VR headset, that lets an operator step into the mind of a robot. The result is that robots can be controlled more naturally by humans in industries such as manufacturing.

The Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a platform that puts the robot's camera and sensor feeds directly in front of the operator. They can see as if they were inside the head of the robot and control its arms simply by moving their own arms and hands.
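To make that idea concrete, here is a minimal sketch in Python of how a tracked hand position might be mapped into a target position for the robot's gripper. The scale factor, workspace offset and all names here are illustrative assumptions for the sake of the example, not part of CSAIL's actual software.

```python
# Hypothetical sketch: mapping a VR hand-controller position to a robot arm target.
# The scale and offset values are arbitrary choices for illustration.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # metres
    y: float
    z: float

HAND_TO_ROBOT_SCALE = 0.8              # assumed scaling between human and robot reach
ROBOT_WORKSPACE_ORIGIN = Pose(0.6, 0.0, 0.3)  # assumed origin of the robot's workspace

def hand_to_robot_target(hand: Pose) -> Pose:
    """Map the operator's tracked hand position into a gripper target position."""
    return Pose(
        ROBOT_WORKSPACE_ORIGIN.x + HAND_TO_ROBOT_SCALE * hand.x,
        ROBOT_WORKSPACE_ORIGIN.y + HAND_TO_ROBOT_SCALE * hand.y,
        ROBOT_WORKSPACE_ORIGIN.z + HAND_TO_ROBOT_SCALE * hand.z,
    )

if __name__ == "__main__":
    # Example: the operator reaches 20 cm forward and 10 cm up.
    print(hand_to_robot_target(Pose(0.2, 0.0, 0.1)))
```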

The university explained that the approach offers a range of benefits over more traditional computer-based controls, including fully rendered 3D environments. For example, the operator is less likely to feel motion sick, a common problem with 3D environments, and it is not as intensive on processing power, because the system simply lifts 2D images from the robot's cameras and applies them to the headset rather than reconstructing a full 3D scene.
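A rough sketch of that 2D relay idea, again in Python, is below. The camera and panel functions are hypothetical stand-ins rather than a real API; the point is that frames are shown as flat images inside the headset with no depth estimation or 3D reconstruction in the loop.

```python
# Minimal sketch of the "2D images in the headset" idea, assuming a hypothetical
# camera source and VR display panel (not MIT's actual software stack).
import time

def get_camera_frame(camera_id: int) -> bytes:
    """Stand-in for reading one JPEG frame from a robot-mounted camera."""
    return b"\xff\xd8...fake jpeg bytes...\xff\xd9"

def show_on_vr_panel(panel_id: int, frame: bytes) -> None:
    """Stand-in for texturing the frame onto a flat panel in the headset view."""
    print(f"panel {panel_id}: {len(frame)} bytes displayed")

def relay_loop(num_cameras: int = 2, fps: int = 30) -> None:
    # Frames are displayed as-is on fixed 2D panels; no 3D scene is built,
    # which keeps the per-frame processing cost low.
    frame_period = 1.0 / fps
    for _ in range(3):  # a few iterations for the example; normally runs continuously
        for cam in range(num_cameras):
            show_on_vr_panel(cam, get_camera_frame(cam))
        time.sleep(frame_period)

if __name__ == "__main__":
    relay_loop()
```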

CSAIL said the system could also be used in emergency scenarios where a robot needs to enter dangerous environments, such as post-earthquake areas. Being able to see the robot's surroundings and control its limbs remotely would allow rescue missions to go ahead without putting human lives in danger.

This new iteration of Baxter, which in earlier work could be controlled using operators' brain signals, should make controlling manufacturing robots more natural and production lines more productive. It could also potentially be used to perform fiddly operations such as surgery, although the dexterity of the robot itself would need to be improved for those applications.

Although it's only a concept at the moment, MIT says it is continually improving its Baxter robot, so we expect to see more functions added soon.