Stars
prime is a framework for efficient, globally distributed training of AI models over the internet.
This is the code for the TMLR submission: https://openreview.net/forum?id=e7mYYMSyZH
Simulation framework for accelerating research in Private Federated Learning
The Prodigy optimizer and its variants for training neural networks.
Major CS conference publication stats (including accepted and submitted) by year.
Distributed and decentralized training framework for PyTorch over graph topologies
Code used for the paper on Asynchronous SGD published at NeurIPS 2022
FL_PyTorch: Optimization Research Simulator for Federated Learning
This repo contains the code for FLIX.
YAAC: Another Awesome CV is a template using Font Awesome and Adobe Source Font.
Low Precision Arithmetic Simulation in PyTorch
Collection of interesting Computer Science resources
scikit-learn: machine learning in Python
Hardware-accelerated, batchable and differentiable optimizers in JAX.
Optax is a gradient processing and optimization library for JAX.
JAX - A curated list of resources https://github.com/google/jax
Acceptance rates for the major AI conferences
A beautiful, simple, clean, and responsive Jekyll theme for academics
Benchmarking optimization methods on convex problems.
Stochastic PDHG with Arbitrary Sampling and Imaging Applications
Operator Discretization Library https://odlgroup.github.io/odl/