Instructing an Assembly Robot in Situated Natural Language and Gestures.
Jianwei Zhang, Tim Baier, Markus Hüser. Published in: HCI (4) (2003)
Keyphrases
- human robot interaction
- natural language
- gaze control
- human robot
- gesture recognition
- pointing gestures
- mobile robot
- visually guided
- autonomous robots
- robotic tasks
- dialogue system
- humanoid robot
- robot programming
- service robots
- vision system
- sensory motor
- natural language processing
- natural language interface
- path planning
- knowledge representation
- machine learning
- robot manipulators
- manipulation tasks
- natural language generation
- language processing
- sign language
- semantic analysis
- motion planning
- real time
- mobile robotics
- hidden markov models
- printed circuit boards
- experimental platform
- question answering
- reference resolution
- embodied cognition
- information extraction
- assembly process
- robotic systems
- hand gestures
- simulated robot
- robot navigation
- robotic arm
- robot soccer
- natural language text
- process planning
- semantic interpretation
- robot control