OptFormer @ DLCT

This is a talk delivered at the (usually not recorded) weekly journal club "Deep Learning: Classics and Trends" (mlcollective.org/dlct/).
Speaker: Yutian Chen
Title: Towards Learning Universal Hyperparameter Optimizers with Transformers
Abstract: Meta-learning hyperparameter optimization (HPO) algorithms from prior experiments is a promising approach to improve optimization efficiency over objective functions from a similar distribution. However, existing methods are restricted to learning from experiments sharing the same set of hyperparameters. In this paper, we introduce the OptFormer, the first text-based Transformer HPO framework that provides a universal end-to-end interface for jointly learning policy and function prediction when trained on vast tuning data from the wild. Our extensive experiments demonstrate that the OptFormer can imitate at least 7 different HPO algorithms, which can be further improved via its function uncertainty estimates. Compared to a Gaussian Process, the OptFormer also learns a robust prior distribution for hyperparameter response functions, and can thereby provide more accurate and better calibrated predictions. This work paves the path to future extensions for training a Transformer-based model as a general HPO optimizer.
Speaker bio: Dr Yutian Chen is a staff research scientist at DeepMind. He obtained his PhD in machine learning at the University of California, Irvine, and later worked at the University of Cambridge as a research associate (postdoc) before joining DeepMind. Yutian took part in the AlphaGo and AlphaGo Zero projects, developing Go AI programs that defeated world champions; the AlphaGo project was ranked among the top 10 discoveries of the 2010s by New Scientist magazine. Yutian has conducted research in multiple machine learning areas, including Bayesian methods, offline reinforcement learning, generative models, and meta-learning, with applications in gaming AI, computer vision, and text-to-speech. Yutian also serves as a reviewer and area chair for multiple academic conferences and journals.
Paper link: arxiv.org/abs/2205.13320
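
For readers skimming before the talk: the abstract's "text-based Transformer HPO framework" means that tuning studies are serialized as plain text, so a single sequence model can consume studies with different hyperparameter sets. Below is a minimal Python sketch of one way such a serialization could look; the delimiter, field names, and serialize_study helper are illustrative assumptions, not the paper's actual format.

    # A minimal sketch, assuming a simple key=value rendering of a study.
    # The delimiter " </> ", the field names, and serialize_study itself
    # are hypothetical, not OptFormer's real serialization.

    def serialize_study(metadata, trials):
        """Flatten study metadata and past trials into one text string."""
        # Study-level context: objective, search space, tuning algorithm, etc.
        header = ", ".join(f"{k}: {v}" for k, v in metadata.items())
        # Each trial becomes "param=value ... | f=objective".
        lines = []
        for trial in trials:
            params = " ".join(f"{k}={v}" for k, v in trial["params"].items())
            lines.append(f"{params} | f={trial['objective']}")
        return header + " </> " + " </> ".join(lines)

    text = serialize_study(
        metadata={"name": "cnn_tuning", "algorithm": "random_search",
                  "space": "lr in [1e-5, 1e-1] (log); batch_size in {32, 64, 128}"},
        trials=[{"params": {"lr": 0.0003, "batch_size": 64}, "objective": 0.91},
                {"params": {"lr": 0.01, "batch_size": 32}, "objective": 0.84}],
    )
    print(text)

A Transformer trained on many such strings can then be prompted with a partial study and decoded either to propose the next trial's hyperparameters (the learned policy) or to predict the resulting objective value (function prediction), which is the joint interface the abstract describes.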
