In this paper, we describe the first near-linear time algorithm that achieves a strictly sublinear approximation factor for computing the GED of two point sequences in constant dimensional Euclidean space. Specifically, we present a randomized \(O(n\log^2n)\) time \(O(\sqrt n)\)-approximation algorithm. We then generalize our result to give a randomized \(\alpha\)-approximation algorithm for any \(\alpha\in [\sqrt{\log n}, \sqrt{n/\log n}]\), running in time \(O((n^2/\alpha^2)\log n)\). Both algorithms are Monte Carlo and return approximately optimal solutions with high probability.
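For context, the sketch below shows the classical quadratic-time dynamic program that computes GED exactly; it is the \(O(n^2)\) baseline that the approximation algorithms above improve upon. The Euclidean matching cost and the unit gap penalty for unmatched points are assumptions made for illustration, and the function name `ged` is hypothetical rather than taken from the paper.

```python
import math

def ged(P, Q, gap_penalty=1.0):
    """Exact geometric edit distance via the standard O(n*m) dynamic program.

    P and Q are sequences of points (tuples of coordinates).
    Matching p with q costs the Euclidean distance between them;
    leaving a point unmatched costs gap_penalty (an assumed convention).
    """
    n, m = len(P), len(Q)
    # dp[i][j] = GED of the first i points of P and the first j points of Q.
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap_penalty          # delete all of P[:i]
    for j in range(1, m + 1):
        dp[0][j] = j * gap_penalty          # insert all of Q[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = dp[i - 1][j - 1] + math.dist(P[i - 1], Q[j - 1])
            skip_p = dp[i - 1][j] + gap_penalty
            skip_q = dp[i][j - 1] + gap_penalty
            dp[i][j] = min(match, skip_p, skip_q)
    return dp[n][m]

if __name__ == "__main__":
    P = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
    Q = [(0.0, 0.1), (1.0, 0.1), (2.0, 0.1), (3.0, 0.0)]
    print(ged(P, Q))  # exact quadratic-time baseline
```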
Extended abstract in Proceedings of the 30th International Symposium on Algorithms and Computation (ISAAC), 26:1–26:16, 2019.