
ICRA 2013

Functional object descriptors for human activity modeling

Conference Paper · Artificial Intelligence · Robotics

Abstract

The ability to learn from human demonstration is essential for robots in human environments. The activity models that the robot builds from observation must take both the human motion and the objects involved into account. Object models designed for this purpose should reflect the role of the object in the activity: its function, or affordances. The main contribution of this paper is to represent objects directly in terms of their interaction with human hands, rather than in terms of appearance. This enables the direct representation of object affordances/function, while being robust to intra-class differences in appearance. Object hypotheses are first extracted from a video sequence as tracks of associated image segments. The object hypotheses are then encoded as strings, where the vocabulary corresponds to different types of interaction with human hands. The similarity between two such object descriptors can be measured using a string kernel. Experiments show that these functional descriptors capture differences and similarities in object affordances/function that are not represented by appearance.
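To make the string-kernel idea concrete, the sketch below compares two objects encoded as strings over a hypothetical hand-interaction vocabulary (e.g. `G` = grasped, `M` = moved while held, `R` = released, `I` = idle). The paper does not specify which string kernel it uses; this sketch substitutes a simple p-spectrum (shared n-gram) kernel for illustration, and the symbol set and sequences are invented examples, not data from the paper.

```python
from collections import Counter
from math import sqrt

def spectrum_kernel(s: str, t: str, n: int = 2) -> int:
    """p-spectrum kernel: count n-grams shared between two symbol strings."""
    grams_s = Counter(s[i:i + n] for i in range(len(s) - n + 1))
    grams_t = Counter(t[i:i + n] for i in range(len(t) - n + 1))
    return sum(grams_s[g] * grams_t[g] for g in grams_s if g in grams_t)

def similarity(s: str, t: str, n: int = 2) -> float:
    """Normalized kernel value in [0, 1]; 1.0 means identical n-gram profiles."""
    denom = sqrt(spectrum_kernel(s, s, n) * spectrum_kernel(t, t, n))
    return spectrum_kernel(s, t, n) / denom if denom else 0.0

# Invented interaction sequences for two cups and a hammer-like tool:
cup_a = "IGMRI"      # idle, grasped, moved, released, idle
cup_b = "IGMMRI"     # same pattern, held slightly longer
tool = "IGMGMGMR"    # repeated grasp-and-move strokes
```

Objects used the same way (the two cups) yield a higher similarity than objects used differently (cup vs. tool), which is the property the functional descriptors exploit: the measure depends only on how hands interact with the object, never on its appearance.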

Authors

Keywords

  • Videos
  • Image segmentation
  • Human Activities
  • Functional Description
  • Vocabulary
  • Types Of Interactions
  • Video Sequences
  • Human Motion
  • Different Types Of Interactions
  • Human Hand
  • Robots In Environments
  • Training Set
  • Support Vector Machine
  • Similarity Measure
  • Feature Space
  • Functional Class
  • Human Interaction
  • Point Cloud
  • Bounding Box
  • Indoor Environments
  • Depth Images
  • Inverse Reinforcement Learning
  • Representation Of Function
  • Pairwise Similarity
  • Beginning Of Activity
  • Robot Learning
  • Object Appearance
  • Imitation Learning
  • Bag-of-words
  • SIFT Features
  • Object Bounding Boxes

Context

Venue
IEEE International Conference on Robotics and Automation
Archive span
1984-2025
Indexed papers
30179
Paper id
632748145508611178