Arrow Research search

Author name cluster

Aaron Adler

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

2 papers
1 author row

Possible papers (2)

AAAI 2013 · Conference Paper

A Morphogenetically Assisted Design Variation Tool

  • Aaron Adler
  • Fusun Yaman
  • Jacob Beal
  • Jeffrey Cleveland
  • Hala Mostafa
  • Annan Mozeika

The complexity and tight integration of electromechanical systems often make them “brittle” and hard to modify in response to changing requirements. We aim to remedy this by capturing expert knowledge as functional blueprints, an idea inspired by regulatory processes that occur in natural morphogenesis. We then apply this knowledge in an intelligent design variation tool. When a user modifies a design, our tool uses functional blueprints to modify other components in response, thereby maintaining integration and reducing the need for costly search or constraint solving. In this paper, we refine the functional blueprint concept and discuss practical issues in applying it to electromechanical systems. We then validate our approach with a case study, applying our prototype tool to create variants of a miniDroid robot, and by empirical evaluation of the convergence dynamics of networks of functional blueprints.

IJCAI 2009 · Conference Paper

  • David Tyler Bischel
  • Thomas Stahovich
  • Eric Peterson
  • Randall Davis
  • Aaron Adler

Mechanical design tools would be considerably more useful if we could interact with them in the way that human designers communicate design ideas to one another, i.e., using crude sketches and informal speech. Those crude sketches frequently contain pen strokes of two different sorts: one type portraying device structure, the other denoting gestures, such as arrows used to indicate motion. We report here on techniques we developed that use information from both sketch and speech to distinguish gesture strokes from non-gestures, a critical first step in understanding a sketch of a device. We collected and analyzed unconstrained device descriptions, which revealed six common types of gestures. Guided by this knowledge, we developed a classifier that uses both sketch and speech features to distinguish gesture strokes from non-gestures. Experiments with our techniques indicate that the sketch and speech modalities alone produce equivalent classification accuracy, but combining them produces higher accuracy.