Starred repositories

Python SDK for FISCO BCOS

Python · 67 stars · 62 forks · Updated Oct 30, 2024

A simple implementation of a deep linear Pytorch module

Python · 19 stars · 3 forks · Updated Oct 16, 2020

A Pytorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks

Python · 94 stars · 7 forks · Updated Nov 21, 2020

Implementation of gMLP, an all-MLP replacement for Transformers, in Pytorch

Python · 426 stars · 57 forks · Updated Aug 14, 2021

Implementation of Transformer in Transformer, pixel-level attention paired with patch-level attention for image classification, in Pytorch

Python · 305 stars · 43 forks · Updated Dec 27, 2021

An All-MLP solution for Vision, from Google AI

Python · 1,014 stars · 108 forks · Updated Sep 13, 2024

Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch

Python · 72 stars · 1 fork · Updated Dec 4, 2022

Graph neural network message passing reframed as a Transformer with local attention

Python · 67 stars · 11 forks · Updated Dec 24, 2022

Implementation of ETSformer, a state-of-the-art time-series Transformer, in Pytorch

Python · 152 stars · 20 forks · Updated Aug 26, 2023

Implementation of Nvidia's NeuralPlexer, for end-to-end differentiable design of functional small-molecules and ligand-binding proteins, in Pytorch

Python · 50 stars · 3 forks · Updated Nov 20, 2023

A simple cross attention that updates both the source and target in one step

Python · 162 stars · 12 forks · Updated May 7, 2024
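The entry above describes a cross attention in which source and target update each other simultaneously. A minimal plain-Python sketch of that idea, assuming toy single-head scaled dot-product attention with no learned projections (the function names and shapes here are illustrative, not the repo's actual API):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of floats
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attend(queries, keys, values):
    # plain scaled dot-product attention, one head, no projections
    d = len(queries[0])
    out = []
    for q in queries:
        w = softmax([dot(q, k) / math.sqrt(d) for k in keys])
        out.append([sum(wi * v[t] for wi, v in zip(w, values)) for t in range(d)])
    return out

def joint_cross_attention(src, tgt):
    # Both updates read from the ORIGINAL sequences, so source and
    # target are refreshed in a single symmetric step.
    return attend(src, tgt, tgt), attend(tgt, src, src)
```

Because both calls see the unmodified inputs, the order of the two updates does not matter, which is the "one step" property the description refers to.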

Transformer based on a variant of attention with linear complexity with respect to sequence length

Python · 738 stars · 70 forks · Updated May 5, 2024
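One common way to get linear-complexity attention, as in kernelized attention, is to replace the softmax with a positive feature map φ so that key/value statistics can be accumulated once and reused for every query. A hedged plain-Python sketch of that trick, assuming φ(x) = elu(x) + 1 (one standard choice; none of this is the repo's actual code):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phi(x):
    # positive feature map: elu(x) + 1 (equals e^x for x <= 0)
    return x + 1.0 if x > 0 else math.exp(x)

def linear_attention(q, k, v):
    d = len(q[0])
    n = len(k)
    fq = [[phi(x) for x in row] for row in q]
    fk = [[phi(x) for x in row] for row in k]
    # Accumulate key/value statistics ONCE: O(n * d^2) instead of O(n^2 * d)
    S = [[sum(fk[j][a] * v[j][b] for j in range(n)) for b in range(d)]
         for a in range(d)]
    z = [sum(fk[j][a] for j in range(n)) for a in range(d)]
    out = []
    for fqi in fq:
        denom = dot(fqi, z)  # normalizer, plays the role of the softmax sum
        out.append([sum(fqi[a] * S[a][b] for a in range(d)) / denom
                    for b in range(d)])
    return out
```

The quadratic query-key score matrix never materializes: each query only touches the d-by-d summary `S` and the length-d normalizer `z`.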

Exploration into the proposed "Self Reasoning Tokens" by Felipe Bonetto

Python · 55 stars · 4 forks · Updated May 17, 2024

Fast and memory-efficient exact attention

Python · 16 stars · 5 forks · Updated Jul 22, 2024

A concise but complete full-attention transformer with a set of promising experimental features from various papers

Python · 5,081 stars · 437 forks · Updated Feb 17, 2025

Vector (and Scalar) Quantization, in Pytorch

Python · 2,927 stars · 238 forks · Updated Feb 12, 2025

An implementation of local windowed attention for language modeling

Python · 421 stars · 44 forks · Updated Jan 16, 2025
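Local windowed attention, as in the repo above, masks each query so it only attends to keys within a fixed radius of its own position. A minimal plain-Python sketch of the idea, assuming a toy single-head setup (the `window` parameter and helper names are mine, not the library's interface):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def local_attention(q, k, v, window):
    """For each query position i, attend only to keys within
    `window` positions on either side (a sliding-window mask)."""
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        lo, hi = max(0, i - window), min(len(k), i + window + 1)
        scores = [dot(qi, k[j]) / math.sqrt(d) for j in range(lo, hi)]
        weights = softmax(scores)
        out.append([sum(w * v[j][t] for w, j in zip(weights, range(lo, hi)))
                    for t in range(d)])
    return out
```

Per-position cost is proportional to the window size rather than the sequence length, which is why this pattern is popular for long-context language modeling.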

Database ORM layer for Nim

Nim · 71 stars · 9 forks · Updated May 1, 2024

Implementation of TACL 2017 paper: Cross-Sentence N-ary Relation Extraction with Graph LSTMs. Nanyun Peng, Hoifung Poon, Chris Quirk, Kristina Toutanova and Wen-tau Yih.

Python · 61 stars · 19 forks · Updated Nov 29, 2018

A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation

Jupyter Notebook · 1,394 stars · 338 forks · Updated Feb 6, 2023