stocBiO and ITD-BiO for hyperparameter optimization and meta-learning

Code for the paper "Bilevel Optimization: Nonasymptotic Analysis and Faster Algorithms".

Our hyperparameter optimization implementation is built on HyperTorch, where the proposed stoc-BiO algorithm achieves better performance than other bilevel algorithms. The implementation of stoc-BiO is located in the two experiment scripts l2reg_on_twentynews.py and mnist_exp.py. We will implement stoc-BiO as a standalone class for independent use soon! A minimal sketch of the main idea is given below.
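
For reference, here is a minimal PyTorch sketch of a stoc-BiO-style hypergradient estimate. It only illustrates the core idea (a truncated Neumann-series approximation of the Hessian-inverse-vector product); it is not the exact implementation in l2reg_on_twentynews.py or mnist_exp.py, and the function and variable names below are assumptions.

```python
import torch

def stocbio_hypergradient(lmbd, y, outer_loss_fn, inner_loss_fn, eta=0.05, Q=10):
    """Approximate d(outer loss)/d(lmbd) at an approximate inner minimizer y.

    The Hessian-inverse-vector product [grad_yy G]^{-1} grad_y F is approximated
    by a truncated Neumann series. In the stochastic version, each Hessian-vector
    product would use a fresh minibatch; that sampling is omitted here for brevity.
    """
    # Gradients of the outer objective F w.r.t. y and lmbd
    F = outer_loss_fn(lmbd, y)
    Fy, F_lmbd = torch.autograd.grad(F, [y, lmbd], allow_unused=True)
    if F_lmbd is None:
        F_lmbd = torch.zeros_like(lmbd)

    # Build a differentiable graph for the inner gradient grad_y G
    G = inner_loss_fn(lmbd, y)
    Gy = torch.autograd.grad(G, y, create_graph=True)[0]

    # Neumann series: v ~= eta * sum_{q=0}^{Q} (I - eta*H)^q Fy, with H = grad_yy G
    p = Fy.detach()
    v = p.clone()
    for _ in range(Q):
        Hp = torch.autograd.grad(Gy, y, grad_outputs=p, retain_graph=True)[0]
        p = p - eta * Hp      # p <- (I - eta*H) p via a Hessian-vector product
        v = v + p
    v = eta * v

    # Mixed second-order term grad_{lmbd y} G @ v, then the hypergradient
    Gyx_v = torch.autograd.grad(Gy, lmbd, grad_outputs=v)[0]
    return F_lmbd - Gyx_v
```

Here lmbd (the hyperparameters) and y (the inner variables) are assumed to be tensors with requires_grad=True, and outer_loss_fn / inner_loss_fn are placeholders for the validation and training objectives.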

Our meta-learning part is built on learn2learn, where we show that the bilevel optimizer ITD-BiO converges faster than MAML and ANIL.
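
As a quick illustration of the ITD-BiO idea (iterative differentiation through the inner loop), here is a minimal plain-PyTorch sketch. The actual meta-learning code in this repo builds on learn2learn; the names and the inner/outer split below are assumptions.

```python
import torch

def itd_bio_hypergradient(meta_params, inner_init, inner_loss_fn, outer_loss_fn,
                          inner_lr=0.1, inner_steps=5):
    """Run a few differentiable inner gradient steps, then backpropagate the
    outer (e.g. query/validation) loss through the whole inner trajectory.

    meta_params and inner_init are assumed to be tensors with requires_grad=True.
    """
    w = inner_init
    for _ in range(inner_steps):
        g = torch.autograd.grad(inner_loss_fn(meta_params, w), w, create_graph=True)[0]
        w = w - inner_lr * g  # differentiable update; the graph over meta_params is kept

    outer = outer_loss_fn(meta_params, w)
    # Hypergradient w.r.t. the meta-parameters via backprop through the inner loop
    return torch.autograd.grad(outer, meta_params)[0]
```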

The following experiments demonstrate the improved performance of the proposed stoc-BiO algorithm.

We compare our algorithm against various hyperparameter optimization baselines on the 20 Newsgroups dataset:

We evaluate the performance of our algorithm with respect to different batch sizes:

The comparison results on the MNIST dataset:

This repo is still under construction, and any comments are welcome!
