
Starred repositories
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
Official repository for the Boltz-1 biomolecular interaction model
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Convenience Python APIs for antibody numbering using ANARCI
✌🏻 Antigen-Specific Antibody Design and Optimization with Diffusion-Based Generative Models for Protein Structures (NeurIPS 2022)
Improved antibody structure-based design using inverse folding
YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
[ICLR 2023] "Mole-BERT: Rethinking Pre-training Graph Neural Networks for Molecules"
Implementation of Denoising Diffusion Probabilistic Models in PyTorch
Graph Denoising Diffusion for Inverse Protein Folding (NeurIPS 2023)
Explanation method for Graph Neural Networks (GNNs)
This repository contains the code for the work on protein-ligand interaction with GNNs and XAI
Geometric Vector Perceptrons: a rotation-equivariant GNN for learning from biomolecular structure
Code and datasets for paper "K2: A Foundation Language Model for Geoscience Knowledge Understanding and Utilization" in WSDM-2024
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Transformer: PyTorch Implementation of "Attention Is All You Need"
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Deep learning and Bayesian approaches to predicting enzyme turnover numbers, improving the reconstruction of enzyme-constrained genome-scale metabolic models (ecGEMs)
Code for the ProteinMPNN paper
This package contains deep learning models and related scripts for RoseTTAFold
A generative model for programmable protein design
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Evolutionary Scale Modeling (ESM): pretrained language models for proteins
High-Resolution Image Synthesis with Latent Diffusion Models