Stars
Codebase for Aria - an Open Multimodal Native MoE
Representation Engineering: A Top-Down Approach to AI Transparency
Visualizing the attention of vision-language models
Repository for Nicheformer: a foundation model for single-cell and spatial omics
Official repo for CellPLM: Pre-training of Cell Language Model Beyond Single Cells.
MoCLE: the first MLLM with MoE for instruction customization and generalization (https://arxiv.org/abs/2312.12379)
This repository contains scripts for implementing various learning-from-experts architectures, such as mixture of experts and product of experts, and for running experiments with these architectures.