I gave a talk last night at the Berlin machine learning meetup on learning graph embeddings in hyperbolic space, featuring the recent NIPS 2017 paper of Nickel & Kiela. Covered are:
- An illustration of why the Euclidean plane is not a good place to embed trees (since circle circumference grows only linearly in the radius; the two circumference formulas are compared in a short sketch below);
- Extending this same argument to higher dimensional Euclidean space;
- An introduction to the hyperbolic plane and the Poincaré disc model (a sketch of the distance function follows below);
- A discussion of Rik Sarkar’s result that trees embed with arbitrarily small error in the hyperbolic plane;
- A demonstration that, in the hyperbolic plane, circle circumference is exponential in the radius (better written here);
- A review of the results of Nickel & Kiela on the (transitive closure of the) WordNet hypernymy graph;
- Some thoughts on the gradient optimisation (perhaps better written here; a sketch of the update step follows below).
And here are the slides!
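To make the circumference argument concrete, here is the standard comparison for the hyperbolic plane of curvature -1:

```latex
% Circumference C(r) of a circle of radius r:
%   Euclidean plane:  C(r) = 2 pi r        (linear in r)
%   Hyperbolic plane: C(r) = 2 pi sinh(r)  (exponential in r)
\[
  C_{\mathbb{E}^2}(r) = 2\pi r,
  \qquad
  C_{\mathbb{H}^2}(r) = 2\pi \sinh(r) = \pi \left( e^{r} - e^{-r} \right) \approx \pi e^{r}
  \quad \text{for large } r.
\]
% A tree with branching factor b has roughly b^d nodes at depth d, so the space
% available at radius r must grow exponentially to host the tree with low
% distortion; the hyperbolic plane provides this, the Euclidean plane does not.
```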
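As a companion to the Poincaré disc item above, here is a minimal Python sketch of the Poincaré-ball distance used in the paper. The formula is the standard one; the function name poincare_distance and the example points are my own.

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Distance between two points strictly inside the unit ball:
        d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_dist = np.sum((u - v) ** 2)
    sq_u = np.sum(u ** 2)
    sq_v = np.sum(v ** 2)
    return float(np.arccosh(1.0 + 2.0 * sq_dist / ((1.0 - sq_u) * (1.0 - sq_v))))

# Points near the boundary are far apart even when they look close in Euclidean terms.
u = np.array([0.70, 0.00])
v = np.array([0.70, 0.14])
print(poincare_distance(u, v))  # roughly 0.55, versus a Euclidean distance of 0.14
```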
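And for the gradient optimisation item, a sketch of the update step as described by Nickel & Kiela: rescale the Euclidean gradient by the inverse of the Poincaré-ball metric, take a step, then project back inside the ball. The function name riemannian_sgd_step, the default learning rate, and the renormalisation-based projection are my choices here, not taken from a reference implementation.

```python
import numpy as np

def riemannian_sgd_step(theta: np.ndarray,
                        euclidean_grad: np.ndarray,
                        lr: float = 0.01,
                        eps: float = 1e-5) -> np.ndarray:
    """One Riemannian SGD step on the Poincare ball (sketch)."""
    # Inverse of the Poincare-ball metric at theta: (1 - ||theta||^2)^2 / 4.
    scale = (1.0 - np.sum(theta ** 2)) ** 2 / 4.0
    theta = theta - lr * scale * euclidean_grad
    # Projection: if the step left the ball, renormalise to just inside the boundary.
    norm = np.linalg.norm(theta)
    if norm >= 1.0:
        theta = theta / norm * (1.0 - eps)
    return theta
```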
Does anyone have hyperparameters to reproduce the results in Poincare Embeddings for Learning Hierarchical Representations? It would be really helpful!