
AAAI 2024

Permutation-Based Hypothesis Testing for Neural Networks

Conference Paper · AAAI Technical Track on Machine Learning IV

Abstract

Neural networks are powerful predictive models, but they provide little insight into the nature of relationships between predictors and outcomes. Although numerous methods have been proposed to quantify the relative contributions of input features, statistical inference and hypothesis testing of feature associations remain largely unexplored. We propose a permutation-based approach to testing that uses the partial derivatives of the network output with respect to specific inputs to assess both the significance of input features and whether significant features are linearly associated with the network output. These tests, which can be flexibly applied to a variety of network architectures, enhance the explanatory power of neural networks and, combined with their predictive capability, extend the applicability of these models.
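The abstract's core idea — score a feature by the partial derivatives of the network output with respect to that input, then build a null distribution by permuting the feature — can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the tiny one-hidden-layer tanh network, the mean-absolute-gradient statistic, the refit-after-permutation scheme, and all function names and training settings are assumptions made for the example.

```python
import numpy as np

def train_mlp(X, y, hidden=8, epochs=400, lr=0.1, seed=0):
    """Fit a one-hidden-layer tanh regression network by full-batch
    gradient descent on squared error (illustrative training loop)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)              # hidden activations, (n, hidden)
        err = (H @ w2 + b2) - y               # residuals, (n,)
        dH = np.outer(err, w2) * (1 - H**2)   # backprop through tanh
        w2 -= lr * (H.T @ err) / n
        b2 -= lr * err.mean()
        W1 -= lr * (X.T @ dH) / n
        b1 -= lr * dH.mean(axis=0)
    return W1, b1, w2, b2

def grad_statistic(X, params, j):
    """Test statistic: mean absolute partial derivative of the network
    output with respect to feature j, averaged over the sample."""
    W1, b1, w2, _ = params
    H = np.tanh(X @ W1 + b1)
    # d f / d x_j = sum_k w2[k] * (1 - H[:, k]**2) * W1[j, k]
    dfdx_j = ((1 - H**2) * w2) @ W1[j]
    return np.abs(dfdx_j).mean()

def permutation_pvalue(X, y, j, B=50, seed=0):
    """Permute feature j to break its association with y, refit the
    network, and recompute the statistic B times; the p-value is the
    fraction of null statistics at least as large as the observed one."""
    rng = np.random.default_rng(seed)
    observed = grad_statistic(X, train_mlp(X, y), j)
    null = np.empty(B)
    for b in range(B):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])  # null: x_j unrelated to y
        null[b] = grad_statistic(Xp, train_mlp(Xp, y, seed=b + 1), j)
    return (1 + (null >= observed).sum()) / (B + 1)
```

On synthetic data where the outcome depends on one feature and not another, the p-value for the informative feature should be small while the irrelevant feature's p-value is roughly uniform; the add-one correction in the last line keeps the p-value strictly positive.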

Keywords

  • ML: Other Foundations of Machine Learning
  • ML: Transparent, Interpretable, Explainable ML

Context

Venue: AAAI Conference on Artificial Intelligence
Archive span: 1980-2026
Indexed papers: 28,718
Paper id: 799173294055523503