Stars
This repo contains additional resources for the maxATAC Python package.
A game theoretic approach to explain the output of any machine learning model.
Gene regulatory network containing signed transcription factor-target gene interactions
Adversarial autoencoder (basic/semi-supervised/supervised)
For beginners, this is a good starting point for VAEs, GANs, and CVAE-GAN. It contains AE, DAE, VAE, GAN, CGAN, DCGAN, WGAN, WGAN-GP, VAE-GAN, and CVAE-GAN, all implemented in PyTorch.
Graph-linked unified embedding for single-cell multi-omics data integration
Implementation of "Autoencoding beyond pixels using a learned similarity metric" in PyTorch
A BLAST-like toolkit for large-scale scRNA-seq data querying and annotation.
Infer copy number variation (CNV) from scRNA-seq data. Plays nicely with Scanpy.
Methods to discover gene programs on single-cell data
⛽️ "Algorithm Handbook": a detailed tutorial on algorithms and data structures for learners starting from zero, with detailed solutions to 850+ LeetCode problems and 200 popular interview questions from major companies.
Add statistical significance annotations on seaborn plots. Further development of statannot, with bugfixes, new features, and a different API.
A Julia package for single cell and spatial data analysis
A novel machine learning pipeline to analyse spatial transcriptomics data
Code relating to http://dx.doi.org/10.2139/ssrn.4132721
Implementation of Alpha Fold 3 from the paper: "Accurate structure prediction of biomolecular interactions with AlphaFold3" in PyTorch
Implementation of AlphaFold 3 from Google DeepMind in PyTorch
This is a reproduction of the paper 'Beyond Fully-Connected Layers with Quaternions: Parameterization of Hypercomplex Multiplications with 1/n Parameters' by Ege Demir and Mehmet Barutçu
Source code for an EfficientDet implementation in PyTorch, which can be used to train your own models.
A PyTorch re-implementation of the official EfficientDet with SOTA real-time performance and pretrained weights.
PyTorch implementation of EfficientNetV2 family
ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using Transformer models.
Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers"
Official PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation" - MICCAI 2021
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation