Learning Theory for Non-Euclidean Machine Learning
Abstract

Learning theory, which aims to quantitatively evaluate the performance of machine learning models, is a fundamental building block of machine learning. This talk presents recent advances in learning theory, focusing on the performance of machine learning models that use non-Euclidean space. Recently, hyperbolic space, one variant of non-Euclidean space, has attracted attention because it can represent the hierarchical structure behind data in an extremely low-dimensional space. However, this capability also carries a high risk of overfitting, that is, performance degradation owing to data incompleteness. The presenter, for the first time, established an upper bound on the performance degradation of hyperbolic-space-based machine learning caused by overfitting. This talk first reviews the motivation for learning theory and the advantages of hyperbolic space in the machine learning context. It then introduces the presenter's recent work on the performance analysis of hyperbolic-space-based machine learning using learning theory. If time permits, the talk also briefly reviews the presenter's contributions to information-theoretic learning theory.
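For context, hyperbolic embeddings are commonly realised in the Poincaré ball model, where distances grow rapidly toward the boundary of the unit ball; this is what lets tree-like (hierarchical) data fit in very few dimensions. The sketch below computes the standard Poincaré-ball geodesic distance; it is an illustrative assumption for readers unfamiliar with hyperbolic space, not code from the talk itself.

```python
import math

def poincare_distance(u, v):
    """Geodesic distance between two points strictly inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_norm = lambda x: sum(c * c for c in x)
    diff = sq_norm([a - b for a, b in zip(u, v)])
    denom = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + 2.0 * diff / denom)

# From the origin, d(0, r) = ln((1 + r) / (1 - r)), so the distance to a
# point at radius 0.5 is ln 3, while radius 0.99 is already ~5.3 away:
print(poincare_distance([0.0, 0.0], [0.5, 0.0]))   # ~1.0986 (= ln 3)
print(poincare_distance([0.0, 0.0], [0.99, 0.0]))  # ~5.2933
```

The exponential growth of distance near the boundary is the geometric reason hyperbolic space can accommodate trees, whose node counts grow exponentially with depth, in low dimensions.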

 

Speaker: Dr Atsushi SUZUKI
Date: 30 November 2023 (Thursday)
Time: 3:00pm – 4:00pm

 

Biography

Dr Atsushi SUZUKI is a Lecturer in Machine Learning in the Department of Informatics at King's College London. He was conferred a doctoral degree by the University of Tokyo. During his PhD studies, Atsushi also worked as a research fellow under the Research Fellowship for Young Scientists (DC2) offered by the Japan Society for the Promotion of Science. Atsushi has published papers in top academic venues, including ICML, NeurIPS, AAAI, IJCAI, IEEE Transactions on Information Theory, ISIT, and ICDM, and has been invited to give talks at top international conferences, including IJCAI and ICLR.