Stars
Distributed Asynchronous Hyperparameter Optimization in Python
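If this entry is Hyperopt (whose tagline this is), a minimal sketch of its `fmin` interface could look like the following; the toy objective and one-dimensional search space are assumptions for illustration.

```python
# Minimal sketch assuming the Hyperopt library; the quadratic objective
# and search space are illustrative placeholders.
from hyperopt import fmin, tpe, hp

def objective(x):
    # Toy loss with a known minimum at x = 3.
    return (x - 3.0) ** 2

best = fmin(
    fn=objective,                    # function to minimize
    space=hp.uniform("x", -10, 10),  # search space for x
    algo=tpe.suggest,                # Tree-structured Parzen Estimator
    max_evals=100,                   # evaluation budget
)
print(best)  # e.g. {'x': ~3.0}
```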
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
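A rough sketch of how a PyTorch model is wrapped for DeepSpeed training; the toy model and config values below are assumptions, and such scripts are typically started with the `deepspeed` launcher rather than plain `python`.

```python
# Sketch of wrapping a PyTorch model with DeepSpeed; the linear model and
# config values are placeholders, not taken from this list.
import torch
import deepspeed

model = torch.nn.Linear(10, 1)  # placeholder model

ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
    "zero_optimization": {"stage": 1},  # ZeRO stage 1 partitioning
}

# Returns an engine that handles data parallelism, mixed precision,
# and ZeRO partitioning according to the config.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```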
Reproduces simulations from papers on multi-robot systems, including formation control, distributed optimization, and cooperative manipulation.
Proportional consensus of second-order systems
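Read as scaled consensus, the title suggests a protocol of roughly the following form; the scales, gain, and edge weights below are generic placeholders rather than the repository's own notation.

```latex
% Sketch of a proportional (scaled) consensus protocol for double-integrator
% agents on an undirected, connected graph with weights a_{ij} \ge 0,
% scales \alpha_i > 0, and damping gain \gamma > 0 (all assumptions).
\begin{align}
  \dot{x}_i &= v_i, \qquad \dot{v}_i = u_i, \\
  u_i &= -\sum_{j \in \mathcal{N}_i} a_{ij}
        \left[ \left( \frac{x_i}{\alpha_i} - \frac{x_j}{\alpha_j} \right)
             + \gamma \left( \frac{v_i}{\alpha_i} - \frac{v_j}{\alpha_j} \right) \right],
\end{align}
% so that the scaled states agree asymptotically:
\begin{equation}
  \frac{x_i}{\alpha_i} - \frac{x_j}{\alpha_j} \to 0
  \quad\text{and}\quad
  \frac{v_i}{\alpha_i} - \frac{v_j}{\alpha_j} \to 0
  \qquad \text{for all } i, j .
\end{equation}
```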
MATLAB library for gradient descent algorithms (version 1.0.1)
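As a generic illustration of what such a library implements (in Python, not the repository's own MATLAB API), a bare-bones gradient descent loop; the objective, step size, and stopping rule are illustrative.

```python
# Generic gradient descent sketch; the quadratic example objective,
# step size, and tolerance are illustrative placeholders.
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a function given its gradient `grad`, starting from `x0`."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is ~0
            break
        x = x - step * g             # move against the gradient
    return x

# Example: minimize f(x) = ||x - b||^2, whose gradient is 2(x - b).
b = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: 2.0 * (x - b), x0=np.zeros(2))
print(x_star)  # approaches b = [1, -2]
```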
Decentralized Multiagent Trajectory Planner Robust to Communication Delay