Author name cluster

Ioannis Psarros

Papers possibly associated with this exact author name in Arrow. This page groups case-insensitive exact-name matches; it is not a full identity-disambiguation profile.

4 papers
2 author rows

Possible papers (4)

ICML 2025 · Conference Paper

Faster Approximation Algorithms for k-Center via Data Reduction

  • Arnold Filtser
  • Shaofeng H.-C. Jiang
  • Yi Li 0002
  • Anurag Murty Naredla
  • Ioannis Psarros
  • Qiaoyuan Yang
  • Qin Zhang 0001

We study efficient algorithms for the Euclidean $k$-Center problem, focusing on the regime of large $k$. We take the approach of data reduction by considering $\alpha$-coresets: an $\alpha$-coreset is a small subset $S$ of the dataset $P$ such that any $\beta$-approximation on $S$ is an $(\alpha + \beta)$-approximation on $P$. We give efficient algorithms to construct coresets whose size is $k \cdot o(n)$, which immediately speeds up existing approximation algorithms. Notably, we obtain a near-linear-time $O(1)$-approximation when $k = n^c$ for any $0 < c < 1$. We validate the performance of our coresets on real-world datasets with large $k$, and we observe that the coreset speeds up the well-known Gonzalez algorithm by up to $4$ times while still achieving a similar clustering cost. Technically, one of our coreset results is based on a new efficient construction of consistent hashing with competitive parameters. This general tool may be of independent interest for algorithm design in high-dimensional Euclidean spaces.
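
The abstract's coreset definition compresses a lot, so a toy illustration may help: the sketch below runs the classic Gonzalez greedy 2-approximation (the baseline the abstract benchmarks against) on a small subset $S$ instead of the full dataset $P$. The uniform subsample standing in for $S$ is a hypothetical placeholder; the paper's actual coresets are built via consistent hashing and carry the additive guarantee, so that a $\beta$-approximation on $S$ becomes an $(\alpha + \beta)$-approximation on $P$. All sizes below are illustrative.

```python
import numpy as np

def gonzalez(points: np.ndarray, k: int) -> np.ndarray:
    """Classic greedy 2-approximation for Euclidean k-Center: repeatedly
    promote the point farthest from the centers chosen so far."""
    centers = [points[0]]
    # dist[i] = distance from points[i] to its nearest chosen center
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        far = int(np.argmax(dist))  # farthest point becomes the next center
        centers.append(points[far])
        dist = np.minimum(dist, np.linalg.norm(points - points[far], axis=1))
    return np.array(centers)

def kcenter_cost(points: np.ndarray, centers: np.ndarray) -> float:
    """k-Center objective: max over points of the distance to the nearest center."""
    dist = np.full(len(points), np.inf)
    for c in centers:
        dist = np.minimum(dist, np.linalg.norm(points - c, axis=1))
    return float(dist.max())

rng = np.random.default_rng(0)
P = rng.standard_normal((50_000, 8))

# Hypothetical stand-in for the paper's coreset construction: a uniform
# subsample. A true alpha-coreset would guarantee that the 2-approximation
# computed on S is an (alpha + 2)-approximation on P.
S = P[rng.choice(len(P), size=2_000, replace=False)]

k = 128
centers = gonzalez(S, k)         # run the approximation on S, not on P
print(kcenter_cost(P, centers))  # evaluate the resulting cost on all of P
```

The speedup comes purely from $|S| \ll |P|$ in the $O(|S| \cdot k)$ greedy loop, which is the same mechanism behind the reported up-to-4x speedup over running Gonzalez on the full dataset.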

TCS 2023 · Journal Article

Near-neighbor preserving dimension reduction via coverings for doubling subsets of ℓ₁

  • Ioannis Z. Emiris
  • Vasilis Margonis
  • Ioannis Psarros

Randomized dimensionality reduction has been recognized as one of the cornerstones in handling high-dimensional data, originating in foundational works such as the celebrated Johnson-Lindenstrauss Lemma. More specifically, nearest neighbor-preserving embeddings exist for the ℓ₂ (Euclidean) and ℓ₁ (Manhattan) metrics, as well as for doubling subsets of ℓ₂, where the doubling dimension is today the most effective way of capturing intrinsic dimensionality and input structure in various applications. These randomized embeddings bound the distortion only for distances between the query point and a point set. Motivated by the foundational character of fast Approximate Nearest Neighbor search in ℓ₁, this paper settles an important missing case, namely that of doubling subsets of ℓ₁. In particular, we introduce a randomized dimensionality reduction by means of a near neighbor-preserving embedding, which is related to the decision-with-witness problem. The input set is represented by a carefully chosen covering point set; in a second step, the algorithm randomly projects the latter. To obtain the covering point sets, we leverage either approximate r-nets or randomly shifted grids, with different tradeoffs between preprocessing time and target dimension. We exploit Cauchy random variables and derive a concentration bound of independent interest. Our algorithms are rather simple and should therefore be useful in practice.
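
The abstract describes a two-step pipeline: build a covering point set, then randomly project it. The sketch below follows that outline under loud assumptions: grid_cover, l1_estimate, and the parameters r and m are hypothetical illustrations, not the paper's construction or its derived parameters, and the median-of-coordinates estimator is the customary way to read ℓ₁ distances out of a Cauchy (1-stable) sketch, not necessarily the estimator analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def grid_cover(points: np.ndarray, r: float, rng) -> np.ndarray:
    """Step 1: replace the input with a covering point set by snapping each
    point to the center of its cell in a randomly shifted grid of side r
    (the abstract's alternative covering construction is approximate r-nets)."""
    shift = rng.uniform(0.0, r, size=points.shape[1])
    cells = np.unique(np.floor((points + shift) / r), axis=0)
    return (cells + 0.5) * r - shift

def l1_estimate(u: np.ndarray, v: np.ndarray) -> float:
    """Customary estimator for Cauchy sketches: each coordinate of
    (u - v) @ A is Cauchy-distributed with scale |u - v|_1, and the median
    of |Cauchy(0, s)| equals s, so the coordinate median estimates |u - v|_1."""
    return float(np.median(np.abs(u - v)))

# Illustrative parameters, not the values the paper derives.
d, n, r, m = 64, 2_000, 0.5, 40
P = rng.standard_normal((n, d))

cover = grid_cover(P, r, rng)    # step 1: the covering point set
A = rng.standard_cauchy((d, m))  # step 2: i.i.d. Cauchy entries, the
low_cover = cover @ A            #         1-stable family matching l1

q = rng.standard_normal(d)       # a query point
q_low = q @ A

# Nearest cover point under the estimated l1 distance, then its true distance.
j = int(np.argmin([l1_estimate(q_low, x) for x in low_cover]))
print(l1_estimate(q_low, low_cover[j]), float(np.abs(q - cover[j]).sum()))
```

The randomly shifted grid is the cheaper of the two covering constructions to compute but typically yields a coarser cover than an approximate r-net, mirroring the preprocessing-time versus target-dimension tradeoff mentioned above.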