
Information theory of algorithms

Presenter
Joachim Buhmann, ETH Zürich
February 23, 2015
Abstract
Algorithms, viewed as selection procedures for mathematical structures, are often exposed to uncertainty in their input or to randomness during computation. The achievable precision of an algorithm, i.e., the attainable resolution in its output space, is determined by the predictive information in its input relative to its output space. Algorithms can therefore be compared and optimized with respect to their runtime, their memory consumption, and their informativeness, meaning their robustness to stochastic influences in the input and during computation. I will present an information-theoretic framework for algorithm analysis in which an algorithm is characterized as the computational evolution of a (possibly contracting) posterior distribution over the output space. The tradeoff between precision and stability is controlled by an input-sensitive generalization capacity (GC). GC objectively ranks different algorithms for the same data processing task based on the bit rate of their respective capacities. Information-theoretic algorithm selection is rigorously demonstrated for minimum spanning tree algorithms and for greedy MaxCut algorithms. The method also allows us to rank centroid-based and spectral clustering methods, e.g., k-means, pairwise clustering, normalized cut, adaptive ratio cut, and dominant set clustering.
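To make the posterior-over-outputs idea concrete, the following minimal Python sketch shows one way a generalization capacity of this flavor can be estimated for MaxCut on a toy graph. It assumes a Gibbs posterior over all cuts and scores the agreement between the posteriors induced by two noisy draws of the same edge weights via log2(|C| * sum_c p1(c) p2(c)); the graph size, noise model, inverse-temperature grid, and this particular single-pair agreement score are illustrative assumptions, not necessarily the exact quantities used in the talk.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    n = 6  # tiny graph so the 2^(n-1) bipartitions can be enumerated exactly
    true_w = np.triu(rng.uniform(0.0, 1.0, size=(n, n)), 1)

    def noisy_instance():
        """One noisy observation of the edge weights (assumed additive Gaussian noise)."""
        return true_w + np.triu(rng.normal(0.0, 0.3, size=true_w.shape), 1)

    def cut_value(w, assignment):
        """Total weight of edges crossing the cut given by a 0/1 vertex labeling."""
        a = np.asarray(assignment)
        cross = a[:, None] != a[None, :]
        return (w * cross).sum()  # w is strictly upper triangular: each edge counted once

    # Output space C: all bipartitions, with vertex 0 fixed to break the label symmetry.
    cuts = [(1,) + tail for tail in itertools.product((0, 1), repeat=n - 1)]

    def gibbs_posterior(w, beta):
        """Gibbs posterior over cuts; beta controls the precision/stability tradeoff."""
        logits = beta * np.array([cut_value(w, c) for c in cuts])
        logits -= logits.max()  # numerical stabilization before exponentiating
        p = np.exp(logits)
        return p / p.sum()

    w1, w2 = noisy_instance(), noisy_instance()

    # Posterior-agreement score log2(|C| * sum_c p1(c) p2(c)): 0 bits for uniform
    # posteriors (no resolution), negative when the two posteriors concentrate on
    # different cuts (overfitting the noise). Maximizing over beta gives a crude
    # single-pair estimate of the generalization capacity.
    best = max(
        (np.log2(len(cuts) * gibbs_posterior(w1, b) @ gibbs_posterior(w2, b)), b)
        for b in np.linspace(0.0, 20.0, 41)
    )
    print(f"approx. generalization capacity: {best[0]:.2f} bits at beta={best[1]:.1f}")

Sweeping beta traces the precision/stability tradeoff described above: at beta = 0 the posterior is uniform (maximally stable but with zero bits of resolution), while very large beta concentrates on instance-specific optima that two noisy instances no longer share.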