ECAI 2025
Owen Sampling Accelerates Contribution Estimation in Federated Learning
Abstract
Federated Learning (FL) aggregates information from multiple clients to train a shared global model without exposing raw data. Accurately estimating each client's contribution is essential not just for fair rewards, but for selecting the most useful clients so the global model converges faster. The Shapley value is the principled choice for this, yet exact computation scales exponentially with the number of clients, making it infeasible for real-world FL deployments with many participants. In this paper, we propose FedOwen, an efficient federated contribution evaluation framework that adopts Owen sampling to approximate Shapley values under the same total evaluation budget as existing methods, while keeping the approximation error below a small threshold. In addition, FedOwen applies an adaptive client selection strategy that balances exploiting high-value clients with exploring under-sampled ones, avoiding bias toward a narrow subset and uncovering rare but informative data. Under a fixed valuation cost, FedOwen achieves up to 23% higher final model accuracy within the same number of communication rounds, compared to state-of-the-art baselines on non-IID benchmarks. Code: https://github.com/hoseinkhs/AdaptiveSelectionFL
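The Owen sampling idea the abstract refers to rests on the multilinear extension of the Shapley value: player i's value equals the integral over q in [0, 1] of the expected marginal contribution of i to a random coalition that includes each other player independently with probability q. A minimal sketch of this estimator follows; the function name `owen_shapley` and its parameters are illustrative assumptions, not the paper's actual implementation.

```python
import random


def owen_shapley(v, players, num_q=20, samples_per_q=10, seed=0):
    """Estimate Shapley values via Owen (multilinear-extension) sampling.

    v: coalition value function mapping a frozenset of players to a float.
    For each probability level q on a midpoint grid over [0, 1], we draw
    random coalitions that include each other player independently with
    probability q, and average player i's marginal contributions.
    """
    rng = random.Random(seed)
    phi = {i: 0.0 for i in players}
    for k in range(num_q):
        q = (k + 0.5) / num_q  # midpoint discretization of the q-integral
        for _ in range(samples_per_q):
            for i in players:
                # Sample a coalition S not containing i at level q.
                s = frozenset(j for j in players
                              if j != i and rng.random() < q)
                phi[i] += v(s | {i}) - v(s)
    total = num_q * samples_per_q
    return {i: phi[i] / total for i in players}
```

For an additive game (each client contributes a fixed weight regardless of the coalition), every marginal contribution equals the client's weight, so the estimate is exact even with few samples; for non-additive FL valuation functions, accuracy improves with the total evaluation budget `num_q * samples_per_q`.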
Authors
Context
- Venue
- European Conference on Artificial Intelligence