
ICRA 2007

A Visual Language for Robot Control and Programming: A Human-Interface Study

Conference Paper · Accepted Paper · Artificial Intelligence · Robotics

Abstract

We describe an interaction paradigm for controlling a robot using hand gestures. In particular, we are interested in the control of an underwater robot by an on-site human operator. In this context, vision-based control is very attractive, and we propose a robot control and programming mechanism based on visual symbols. A human operator presents engineered visual targets to the robotic system, which recognizes and interprets them. This paper describes the approach and proposes a specific gesture language called "RoboChat". RoboChat allows an operator to control a robot and even express complex programming concepts, using a sequence of visually presented symbols, encoded into fiducial markers. We evaluate the efficiency and robustness of this symbolic communication scheme by comparing it to traditional gesture-based interaction involving a remote human operator.

Authors

Keywords

  • Robot control
  • Robot programming
  • Communication system control
  • Intelligent robots
  • Robustness
  • Navigation
  • Human robot interaction
  • Underwater communication
  • Cognitive robotics
  • Robot sensing systems
  • Visual Target
  • Human Operator
  • Fiducial Markers
  • Hand Gestures
  • Communication Scheme
  • Unmanned Underwater Vehicles
  • Cognitive Load
  • Systemic Markers
  • Human-robot Interaction
  • Vocabulary Size
  • Undersea
  • Robot Operating
  • Visual Communication
  • Gesture Recognition
  • Expert Users
  • American Sign Language
  • Scuba Diving
  • Circular Markers
  • Distractor Task
  • Sequence Of Tokens

Context

Venue
IEEE International Conference on Robotics and Automation
Archive span
1984-2025
Indexed papers
30179
Paper id
948287667315632832