Stars
Code for Concrete Dropout as presented in https://arxiv.org/abs/1705.07832
Low rank adaptation for Vision Transformer
PyTorch implementation of SwAV https://arxiv.org/abs/2006.09882
Prov-GigaPath: A whole-slide foundation model for digital pathology from real-world data
Diffusion model papers, survey, and taxonomy
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entr…
A pytorch implementation of the vector quantized variational autoencoder (https://arxiv.org/abs/1711.00937)
DiagSet: a dataset for prostate cancer histopathological image classification
PyTorch codes for "Real-World Blind Super-Resolution via Feature Matching with Implicit High-Resolution Priors", ACM MM2022 (Oral)
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Simple and efficient pytorch-native transformer text generation in <1000 LOC of python.
Segment Anything in Medical Images
Awesome papers about machine learning (deep learning) on dynamic (temporal) graphs (networks / knowledge graphs).
Evaluation of Methods for Temporal Knowledge Graph Forecasting
Read and write life sciences file formats
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
Hierarchical Image Pyramid Transformer - CVPR 2022 (Oral)
A PyTorch implementation of MAGE: MAsked Generative Encoder to Unify Representation Learning and Image Synthesis
PyTorch implementation of MoCo v3 https://arxiv.org/abs/2104.02057
Simultaneous Nuclear Instance Segmentation and Classification in H&E Histology Images.
PyTorch code for Vision Transformers training with the Self-Supervised learning method DINO
The pytest framework makes it easy to write small tests, yet scales to support complex functional testing
Course webpage for UCLA Biostat 203B (Intro. to Data Science)
Official Implementation of Paella https://arxiv.org/abs/2211.07292v2
Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
The repository contains a simple pipeline for training on nuclei segmentation datasets of histopathology images.