Research: Robots and Language

Fig. 1: Robovie-II carries groceries in a supermarket [2].

Robots will need advanced language capabilities so that they can collaborate with humans using natural, verbal commands (Fig. 1). In one such idealized scenario, a person might ask:
“robot, could you help me with the groceries?”
The robot would be able to talk and provide feedback to the user, such as:
“I think you want me to carry these items to your car, is that correct?”
The value of natural language like this, as opposed to rigid fixed-phrase directives such as “robot lift object” or “robot move forward”, is that it makes the robot significantly easier to use.

The Lingodroids research group at the University of Queensland has developed language-learning robots they call “Lingodroids” [1]. They use an evolutionary approach to language learning in which the robots independently evolve their own language for describing the world they inhabit. The team has experimentally demonstrated spatial language learning, in which the robots create their own names for the places they visit.
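As a rough illustration of the general idea of robots inventing shared place names, here is a generic toy “naming game” sketch. It is not the Lingodroids algorithm (their published method is described in [1]); the vocabulary and class names are my own illustration.

```python
import random

def coin_name():
    """Invent a short pronounceable name from a fixed alphabet."""
    return "".join(random.choice("ptkaiou") for _ in range(4))

class Robot:
    def __init__(self):
        self.lexicon = {}            # place id -> invented name

    def name_for(self, place):
        # Invent a name the first time a place is visited.
        if place not in self.lexicon:
            self.lexicon[place] = coin_name()
        return self.lexicon[place]

    def hear(self, place, name):
        # Adopt the name the other robot uses for a shared place.
        self.lexicon[place] = name

a, b = Robot(), Robot()
shared_place = "corner_near_door"
b.hear(shared_place, a.name_for(shared_place))    # robot a teaches robot b
print(a.name_for(shared_place) == b.name_for(shared_place))   # True
```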

In contrast to this, I propose that practical household robots must be capable of understanding human language from the first moment they are turned on, and there must not be any doubt about what the robot says or does. The reasoning behind this is as follows. Have you ever given an instruction to a co-worker and found out later that they misunderstood you? Sometimes we communicate using language that is not clear or can be interpreted in multiple ways. Languages like English contain syntactic (structural) and lexical ambiguity [3] that would make it difficult for humans and robots to communicate effectively, even if the robot knew English as well as we do. So instead of trying to build robots that communicate in a human-like way, I propose that we should be developing a lexicon that allows humans to communicate with machines in a way that removes ambiguity [4], so that the operator has no doubt about what the robot will do next. It has also been found that humans are happier when they know the intentions of their robot coworker [5].
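To make the idea concrete, here is a toy sketch of what such an unambiguous command lexicon might look like: every command has the form VERB OBJECT [to LOCATION], with each word drawn from a closed vocabulary, so a command has exactly one valid parse or is rejected outright. The vocabulary and grammar are my own illustration, not taken from the paper or from Lojban [4].

```python
# Toy unambiguous command lexicon (illustrative vocabulary only).
VERBS = {"carry", "fetch", "place"}
OBJECTS = {"groceries", "box", "cup"}
LOCATIONS = {"car", "kitchen", "table"}

def parse_command(text):
    """Return a (verb, object, location) tuple, or raise ValueError."""
    words = text.lower().split()
    if len(words) == 2:
        verb, obj, loc = words[0], words[1], None
    elif len(words) == 4 and words[2] == "to":
        verb, obj, _, loc = words
    else:
        raise ValueError("command does not match the grammar")
    if verb not in VERBS or obj not in OBJECTS:
        raise ValueError("unknown word in command")
    if loc is not None and loc not in LOCATIONS:
        raise ValueError("unknown location in command")
    return verb, obj, loc

print(parse_command("carry groceries to car"))   # ('carry', 'groceries', 'car')
```

Because the vocabulary is closed and the grammar fixed, the operator always knows which action the robot will take, or is told immediately that the command was not understood.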

Fig. 2: This diagram shows how two simulated robots can see each other. The robot on the left has a green fiducial marker and the other robot has a yellow marker.

As part of this work I created a simulation with 3D graphics using the “Gazebo” robotics simulator (originally developed as part of the Player/Stage project). The Lingodroids communicate in the real world using audible “chirps” and cannot actually see each other because they lack a camera and sufficient computing power. I modified a model of the Pioneer 3-AT robot and added a camera and a solid, colored cylinder shape (Fig. 2). The two robots could detect each other by applying simple color thresholding to the camera image using the OpenCV library and looking for the specific color of the other robot's cylinder.
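The following is a minimal sketch of this kind of color-threshold detection. The HSV range for the yellow marker, the minimum blob size, the file name, and the helper function are all illustrative assumptions, not the exact values used in the simulation.

```python
import cv2
import numpy as np

# Illustrative HSV range for the yellow fiducial cylinder; real thresholds
# depend on the marker color and the simulated lighting.
YELLOW_LO = np.array([20, 100, 100])
YELLOW_HI = np.array([35, 255, 255])

def marker_visible(bgr_image, min_pixels=200):
    """Return (found, x_centroid) for the other robot's colored cylinder."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, YELLOW_LO, YELLOW_HI)
    # Remove single-pixel noise before looking for a blob.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    if cv2.countNonZero(mask) < min_pixels:
        return False, None
    m = cv2.moments(mask, binaryImage=True)
    return True, m["m10"] / m["m00"]   # x-centroid of the blob, in pixels

# Example usage: "camera_frame.png" stands in for a frame grabbed from the
# simulated camera.
frame = cv2.imread("camera_frame.png")
if frame is not None:
    found, cx = marker_visible(frame)
    print("marker at x =", cx if found else "not visible")
```

The x-centroid of the detected blob can then be used to steer the robot toward (or away from) the other robot.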

To learn more about this topic, please read my paper below. It includes a literature review on symbol grounding and language development in robots. This work was undertaken as part of a research placement with the “Lingodroids” group at the University of Queensland in 2010.

 

David T. Butterworth, “Robots and Language: A Practical Perspective”, unpublished, 2010. PDF

 

[1] R. Schulz, A. Glover, G. Wyeth and J. Wiles, “Robots, Communication, and Language: An Overview of the Lingodroid Project”, Australasian Conference on Robotics and Automation (ACRA), Brisbane, Australia, December 2010. PDF
Lingodroids: Language and robots
www.itee.uq.edu.au/lingodroids/
Complex and Intelligent Systems research group, UQ (University of Queensland)
www.itee.uq.edu.au/cis/
[2] “Supermarket robot to help the elderly”, Phys.org, December 2009.
phys.org/news/2009-12-supermarket-robot-elderly-video.html
[3] Here are some examples of ambiguity in the English language:
“The peasants are revolting” – wikipedia.org/wiki/Ambiguity
“Some cases of ambiguity in English”, for language students – www.iasj.net/iasj?func=fulltext&aId=17887
[4] There has been at least one attempt at developing a logical language:
J. W. Cowan, “The Complete Lojban Language”, The Logical Language Group, Inc., 1997. PDF
[5] The HV-100 robot from Harvest Automation uses a simple navigation technique: it follows a yellow line marked on the ground. Notably, this method was chosen not because it is the only technology that works, but because it lets human coworkers quickly understand where the robot will move next.
www.harvestai.com
robohub.org/harvey-a-working-robot-for-container-crops