
AAAI 2012

Relative Attributes for Enhanced Human-Machine Communication

Conference Paper · Artificial Intelligence

Abstract

We propose to model relative attributes that capture the relationships between images and objects in terms of human-nameable visual properties. For example, the models can capture that animal A is ‘furrier’ than animal B, or image X is ‘brighter’ than image Y. Given training data stating how object/scene categories relate according to different attributes, we learn a ranking function per attribute. The learned ranking functions predict the relative strength of each property in novel images. We show how these relative attribute predictions enable a variety of novel applications, including zero-shot learning from relative comparisons, automatic image description, image search with interactive feedback, and active learning of discriminative classifiers. We overview results demonstrating these applications with images of faces and natural scenes. Overall, we find that relative attributes enhance the precision of communication between humans and computer vision algorithms, providing the richer language needed to fluidly “teach” a system about visual concepts.
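The abstract describes learning one ranking function per attribute from ordered comparisons (e.g., "A is furrier than B"). The sketch below is a minimal illustration of that idea, not the authors' implementation: it fits a linear scoring function to pairwise ordering constraints with a simple pairwise hinge loss and plain gradient updates. The function names (`learn_ranker`, `rank_score`) and the training hyperparameters are assumptions made for illustration.

```python
# Minimal sketch of learning a linear ranking function per attribute
# from relative comparisons. NOT the paper's method (which uses a
# RankSVM-style formulation); this illustrates the same idea with a
# pairwise hinge loss and plain gradient updates.

def learn_ranker(features, ordered_pairs, lr=0.1, epochs=200):
    """Learn weights w so that w·x_i > w·x_j for each pair (i, j),
    where (i, j) means "image i shows the attribute more strongly
    than image j".

    features      -- list of feature vectors (lists of floats)
    ordered_pairs -- list of index pairs (i, j) with i ranked above j
    """
    dim = len(features[0])
    w = [0.0] * dim
    for _ in range(epochs):
        for i, j in ordered_pairs:
            # Difference vector: the constraint w·(x_i - x_j) >= 1
            diff = [a - b for a, b in zip(features[i], features[j])]
            margin = sum(wk * dk for wk, dk in zip(w, diff))
            if margin < 1.0:  # pair violates the desired ordering margin
                w = [wk + lr * dk for wk, dk in zip(w, diff)]
    return w

def rank_score(w, x):
    """Predicted relative strength of the attribute in image x."""
    return sum(wk * xk for wk, xk in zip(w, x))
```

Given comparisons over training images, `rank_score` can then order novel images by predicted attribute strength, which is what enables applications like relative image search feedback ("show me images brighter than this one").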

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
AAAI Conference on Artificial Intelligence
Archive span
1980-2026
Indexed papers
28718
Paper id
970851859306181198