
AAAI 2024

A Theory of Non-acyclic Generative Flow Networks

Conference Paper | AAAI Technical Track on Machine Learning I | Artificial Intelligence

Abstract

GFlowNets are a novel flow-based method for learning a stochastic policy that generates objects via a sequence of actions, with probability proportional to a given positive reward. We contribute to relaxing the hypotheses that limit the application range of GFlowNets, in particular acyclicity. To this end, we extend the theory of GFlowNets to measurable spaces, which include continuous state spaces without cycle restrictions, and provide a generalization of cycles in this setting. We show that the losses used so far push flows to get stuck in cycles, and we define a family of losses that solves this issue. Experiments on graphs and continuous tasks validate these principles.
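As background for the abstract, the sketch below illustrates the standard acyclic GFlowNet setting that the paper generalizes: on a small hypothetical DAG (not from the paper), edge flows F(s→s') that satisfy flow matching, with terminal inflow equal to the reward R(x), induce a forward policy P_F(s'|s) = F(s→s')/F(s) that samples each object x with probability R(x)/Z.

```python
# Toy DAG (hypothetical example): s0 -> {s1, s2}; s1 -> {x1, x2};
# s2 -> {x2, x3}; the x* states are terminal with positive rewards.
children = {"s0": ["s1", "s2"], "s1": ["x1", "x2"], "s2": ["x2", "x3"]}
parents = {"s1": ["s0"], "s2": ["s0"],
           "x1": ["s1"], "x2": ["s1", "s2"], "x3": ["s2"]}
reward = {"x1": 1.0, "x2": 2.0, "x3": 3.0}

# Build a flow-matching solution by backward recursion, assuming a uniform
# backward policy: each state's flow splits equally among its parents, so
# F(s -> s') = F(s') / |parents(s')|, and F(s) = sum of its outgoing flows.
edge_flow, state_flow = {}, dict(reward)  # terminal inflow = reward
for s in ["s2", "s1", "s0"]:  # reverse topological order
    total = 0.0
    for c in children[s]:
        f = state_flow[c] / len(parents[c])
        edge_flow[(s, c)] = f
        total += f
    state_flow[s] = total

def terminal_probs():
    """Exact terminal distribution under P_F(s'|s) = F(s->s') / F(s)."""
    prob, out = {"s0": 1.0}, {}
    for s in ["s0", "s1", "s2"]:  # topological order
        for c in children[s]:
            p = prob.get(s, 0.0) * edge_flow[(s, c)] / state_flow[s]
            if c in reward:
                out[c] = out.get(c, 0.0) + p
            else:
                prob[c] = prob.get(c, 0.0) + p
    return out

probs = terminal_probs()
print({x: round(p, 3) for x, p in probs.items()})
# -> {'x1': 0.167, 'x2': 0.333, 'x3': 0.5}, i.e. R(x) / Z with Z = 6
```

Backward recursion of this kind relies on reverse topological order, which is exactly what breaks once cycles are allowed; the paper's contribution is to extend the theory, and repair the losses, in that non-acyclic (and continuous) regime.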

Authors

Keywords

  • ML: Deep Generative Models & Autoencoders
  • ML: Deep Learning Theory
  • ML: Graph-based Machine Learning
  • ML: Multimodal Learning
  • ML: Online Learning & Bandits
  • ML: Optimization
  • ML: Reinforcement Learning
  • SO: Heuristic Search
  • SO: Mixed Discrete/Continuous Search
  • SO: Non-convex Optimization
  • SO: Other Foundations of Search & Optimization

Context

Venue
AAAI Conference on Artificial Intelligence
Archive span
1980-2026
Indexed papers
28718