
ICRA 1997

Visually guided manipulation using active camera-lens systems

Conference Paper · Accepted Paper · Artificial Intelligence · Robotics

Abstract

Visual servoing is a robust technique for aligning both static and moving parts using imprecisely calibrated camera-lens-manipulator systems. An important limitation of these systems is that the camera's position and orientation restrict the workspace within which the alignment task can be successfully performed. An active camera can extend this region; however, this changes the visual representation of the task itself, so the reference input that drives the visually servoed manipulator must change appropriately. In this paper, a framework that allows for camera-lens motion during visually servoed manipulation is described. The main components of the framework are object schemas and port-based agents. Object schemas represent the task internally in terms of geometric models with attached sensor mappings. Object schemas are dynamically updated by sensor feedback, and thus provide an ability to perform three-dimensional spatial reasoning during task execution, a capability traditional image-based visual servoing lacks. Object schemas are also able to dynamically create desired visual representations of the task from which reference inputs for vision-based control strategies are derived. The sensor mappings of object schemas are also used to guide camera motion based on task characteristics. Port-based agents are the executors of the visual reference inputs and the camera motion commands; they interact with the real world through visual servoing control laws. Experimental results that demonstrate system capabilities and performance are presented.
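For context, the visual servoing control laws the abstract refers to are typically of the classical image-based form: the camera velocity is computed from the image-feature error through the pseudo-inverse of the interaction matrix (image Jacobian). The sketch below is not taken from the paper; it is a minimal illustration of that standard control law for point features, using normalized image coordinates and assumed feature depths (function names and the gain value are illustrative choices):

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of a single point feature
    at normalized image coordinates (x, y) with depth Z: maps the 6-DOF
    camera velocity to the feature's image-plane velocity."""
    return np.array([
        [-1.0 / Z, 0.0,       x / Z,  x * y,          -(1.0 + x * x),  y],
        [0.0,      -1.0 / Z,  y / Z,  1.0 + y * y,    -x * y,         -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """Image-based visual servoing law: v = -lam * L^+ (s - s*).
    With fewer than three points the stack of interaction matrices is
    rank-deficient, and pinv returns the least-norm velocity."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -lam * np.linalg.pinv(L) @ error

# Example: two point features displaced from their desired positions.
s = [(0.1, 0.0), (-0.1, 0.05)]
s_star = [(0.0, 0.0), (0.0, 0.0)]
v = ibvs_velocity(s, s_star, depths=[1.0, 1.0])  # 6-vector (vx..wz)
```

The paper's contribution sits one level above a law like this: when the active camera moves, the desired features s* change, and the object schemas regenerate them so the controller's reference input stays consistent with the new viewpoint.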

Authors

Keywords

  • Visual servoing
  • Cameras
  • Feedback
  • Robustness
  • Lenses
  • Solid modeling
  • Sensor systems
  • Spatial resolution
  • Manipulator dynamics
  • Mechanical engineering
  • Control Strategy
  • Optimal Control
  • Visual Representation
  • Workspace
  • Task Execution
  • Geometric Model
  • Task Characteristics
  • Visual Control
  • Reference Input
  • Components Of The Framework
  • Camera Motion
  • Task Representations
  • Camera Orientation
  • Alignment Task
  • Sensor Feedback
  • Set Of Results
  • Image Plane
  • Mental Representations
  • Systematic Framework
  • Vision Sensors
  • Agent Dynamics
  • Sensor Placement
  • Optical Flow
  • Direct Communication
  • Dynamic Reconfiguration
  • Dynamic Sensor
  • Path Planning
  • Jacobian Matrix
  • Data Cube

Context

Venue
IEEE International Conference on Robotics and Automation
Archive span
1984-2025
Indexed papers
30179
Paper ID
1064201623512450041