Graduate Student Seminar: Robust Large-Margin Learning in Hyperbolic Space, Speaker: Melanie Weber

Graduate Student Seminars
Feb 25, 2020
12:30 pm
Fine Hall 214

Title: Robust Large-Margin Learning in Hyperbolic Space

Abstract:  

Recently, there has been a surge of interest in representation learning in hyperbolic spaces, driven by their ability to represent hierarchical data with significantly fewer dimensions than standard Euclidean spaces. However, the viability and benefits of hyperbolic spaces for downstream machine learning tasks have received less attention. 

In this work, we present, to our knowledge, the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space. Specifically, we consider the problem of learning a large-margin classifier for data possessing a hierarchical structure. Our first contribution is a hyperbolic perceptron algorithm, which provably converges to a separating hyperplane. We then provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples. Finally, we prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees when learning the classifier directly in hyperbolic space.
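To make the setting concrete, the sketch below illustrates a perceptron-style update in the hyperboloid (Lorentz) model of hyperbolic space, where points satisfy ⟨x, x⟩_L = −1 and classification is assumed to use the sign of the Minkowski inner product with a space-like normal w. This is a minimal illustrative sketch under those assumptions, not the speaker's exact algorithm; the function names and normalization step are hypothetical choices for exposition.

```python
import numpy as np

def minkowski_ip(u, v):
    # Minkowski (Lorentzian) inner product: -u_0*v_0 + sum_{i>=1} u_i*v_i
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def hyperbolic_perceptron(X, y, max_epochs=100):
    """Toy perceptron-style updates on the hyperboloid (Lorentz) model.

    X: array of shape (n, d+1), each row a point with <x, x>_L = -1, x_0 > 0.
    y: labels in {-1, +1}.
    Assumed decision rule: sign(<w, x>_L) for a space-like normal w.
    """
    n, dim = X.shape
    w = np.zeros(dim)
    w[1] = 1.0  # illustrative space-like initialization (<w, w>_L = 1)
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * minkowski_ip(w, xi) <= 0:
                # Perceptron-style correction in the ambient coordinates
                w = w + yi * xi
                nw = minkowski_ip(w, w)
                if nw > 0:
                    # keep the hyperplane normal space-like: rescale to <w, w>_L = 1
                    w = w / np.sqrt(nw)
                mistakes += 1
        if mistakes == 0:
            break  # all points classified consistently with sign(<w, x>_L)
    return w
```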

Joint work with Manzil Zaheer, Ankit Singh Rawat, Aditya Menon and Sanjiv Kumar (all at Google Research).