Stars
The internet still has a memory! A record of companies that have reneged on verbal offers, letters of intent, or three-party agreements during campus recruiting. My voice may be small, but I still want to do what little I can!
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" (a minimal sketch of the low-rank update appears after this list)
🐧 Linux tutorials, mainly covering Linux commands, Linux system administration, software operations, and a curated set of commonly used shell scripts
2024 recommendations for VPN and censorship-circumvention software in China, with tips for avoiding pitfalls; stable and easy to use. Compares SSR airports, Lantern, V2ray, 老王VPN, self-built VPS proxies, and other circumvention tools, plus the latest VPN download recommendations for China and for accessing ChatGPT.
Resources about Aspect-based Sentiment Analysis (ABSA)
📚 The starting point of your career as a Software Quality Assurance Engineer | Quality Automation Engineer 📚
Transformer: PyTorch implementation of "Attention Is All You Need" (a sketch of the core scaled dot-product attention appears after this list)
Code for the ALiBi method for transformer language models (ICLR 2022); a sketch of the linear attention bias appears after this list
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
A small package to create visualizations of PyTorch execution graphs
Code and data for COLING 2022 paper titled "Structural Bias For Aspect Sentiment Triplet Extraction"
Repo for "MvP: Multi-view Prompting Improves Aspect Sentiment Tuple Prediction" [ACL'2023]
Code for the paper "A semantically enhanced dual encoder for aspect sentiment triplet extraction"
A sample PyTorch implementation of the ACL 2021 research paper "Learning Span-Level Interactions for Aspect Sentiment Triplet Extraction".
Code Implementation of "Learning Span-Level Interactions for Aspect Sentiment Triplet Extraction".
Code and data for paper "Grid Tagging Scheme for Aspect-oriented Fine-grained Opinion Extraction". Aspect opinion pair datasets and aspect triplet datasets.
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
IMPLabUniPr / BERT-for-ABSA
Forked from akkarimi/BERT-For-ABSA. Code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
程序员延寿指南 | A programmer's guide to living longer
Collection of different types of transformers for learning purposes
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for developing a deeper understanding of the papers. ⭐⭐⭐
Fast and memory-efficient exact attention
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
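For the loralib entry above, here is a minimal sketch of the low-rank adaptation idea, not the loralib API: a frozen pretrained linear layer receives a trainable update W + (alpha/r)·BA, and only the small factors A and B are trained. The class name `LoRALinear` and the hyperparameters `r` and `alpha` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (illustrative, not the loralib API)."""
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)  # pretrained weight stays frozen
        self.base.bias.requires_grad_(False)
        # A starts small-random, B starts at zero, so the adapter is a no-op before fine-tuning.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W^T + b + scaling * x A^T B^T
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(768, 768)
print(layer(torch.randn(2, 10, 768)).shape)  # torch.Size([2, 10, 768])
```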
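The Transformer and FlashAttention entries both revolve around the same computation, softmax(QK^T / sqrt(d_k))·V; FlashAttention computes it exactly but tiles it to reduce memory traffic. Below is a plain PyTorch sketch of the reference computation only, not the memory-efficient kernel; the function name and tensor shapes are illustrative assumptions.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Reference softmax(QK^T / sqrt(d_k)) V; shapes are (batch, heads, seq, d_k)."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask, float("-inf"))  # e.g. a causal mask
    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(1, 8, 16, 64)
causal = torch.triu(torch.ones(16, 16, dtype=torch.bool), diagonal=1)  # mask future keys
out = scaled_dot_product_attention(q, k, v, mask=causal)
print(out.shape)  # torch.Size([1, 8, 16, 64])
```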
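For the ALiBi entry, the core idea is to drop positional embeddings and instead add a head-specific linear penalty, proportional to the query-key distance, to the pre-softmax attention scores. The sketch below builds that bias using the paper's geometric slope schedule for a power-of-two head count; the function names are illustrative assumptions, not the released code's API.

```python
import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Geometric sequence of slopes 2^(-8/n), 2^(-16/n), ... (assumes n_heads is a power of two).
    start = 2.0 ** (-8.0 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # bias[h, i, j] = -slope[h] * (i - j) for keys j to the left of query i, 0 otherwise.
    rel = torch.arange(seq_len)[None, :] - torch.arange(seq_len)[:, None]  # rel[i, j] = j - i
    bias = rel.clamp(max=0).float()  # 0 on and above the diagonal, negative distance below
    return alibi_slopes(n_heads)[:, None, None] * bias[None, :, :]  # (heads, queries, keys)

scores = torch.randn(8, 16, 16)       # (heads, queries, keys), pre-softmax
scores = scores + alibi_bias(8, 16)   # added before the causal mask and softmax
```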