Many robots require specific, often complicated programming to understand what task needs to be completed and how. Ashutosh Saxena of the Robot Learning Lab at Cornell University is teaching robots to understand natural language so that they can be programmed simply by talking to them.
The robots will use a programming language built around command words. Built-in software will translate spoken language into tasks the robot can perform, and it will keep working even when steps or words are left out of a verbal command. For example, the robot could heat water on a stove without first being told to turn the stove on.
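To give a feel for what "filling in missing steps" might look like, here is a minimal sketch in Python. Everything in it is hypothetical, not Saxena's actual system: the action names and the prerequisite table are invented for illustration. The idea is that each primitive action lists the steps it implicitly depends on, so a bare command like "heat water" expands into the full sequence.

```python
# Hypothetical knowledge base: each action maps to the steps it implies.
# These names are invented for illustration, not from the Cornell system.
PREREQUISITES = {
    "heat_water": ["place_pot_on_stove", "turn_on_stove"],
    "turn_on_stove": [],
    "place_pot_on_stove": ["fill_pot_with_water"],
    "fill_pot_with_water": [],
}

def expand(task, done=None):
    """Recursively prepend any prerequisite steps the spoken command left out."""
    if done is None:
        done = set()
    steps = []
    for prereq in PREREQUISITES.get(task, []):
        steps += expand(prereq, done)
    if task not in done:
        done.add(task)
        steps.append(task)
    return steps

print(expand("heat_water"))
# ['fill_pot_with_water', 'place_pot_on_stove', 'turn_on_stove', 'heat_water']
```

So even though the user only said "heat water", the robot would first fill the pot, place it on the stove, and turn the stove on.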
The robot will also have a 3D camera that works alongside the software to associate the objects it sees with the tasks it has been given. For example, when heating water on the stove, the robot will be able to see which object is the stove.
Saxena's technology goes well beyond robots that understand only single-word commands. The new artificial intelligence has built-in associations between objects and actions, so when the robot sees various objects in front of it, it knows which one it needs to complete its task.
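That object-action association could be sketched as a simple lookup, again purely as an illustration (the action names, object labels, and table below are invented, not taken from the Cornell system). Given an action and the objects the 3D camera has detected, the robot picks the one the action can use.

```python
# Hypothetical table of which objects each action can operate on.
# Labels are invented for illustration only.
ACTION_OBJECTS = {
    "heat": {"stove", "microwave"},
    "pour": {"pot", "kettle", "cup"},
}

def object_for(action, detected):
    """Return the first detected object that the given action can use."""
    usable = ACTION_OBJECTS.get(action, set())
    for obj in detected:
        if obj in usable:
            return obj
    return None  # nothing in view fits the action

print(object_for("heat", ["sink", "stove", "cup"]))  # stove
```

In the heating-water example, the camera might report a sink, a stove, and a cup; the lookup tells the robot that only the stove is relevant to "heat".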
Check out the video below to see the robot in action.