Stars
Python-based web automation tool. Powerful and elegant.
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pretrained models, large models, multimodal models, and large language models
Fast inference engine for Transformer models
spider-admin-pro: a visual management tool combining Scrapy+Scrapyd crawler project inspection with scheduled crawler task management; an upgraded version of SpiderAdmin
Present for Koishi Day 2022 - Paralleled Touhou Lyric Generator and Translator (PaToL-GT)
A task-flow-based visual crawler engine, born out of problems encountered while developing, maintaining, and extending the old iQIYI parser project (iqiyi-parser). The engine's unit of execution is the node: a script node (script) acts as the root node and describes the nodes and the overall flow; a task node (task) parses that flow description and generates an execution queue of nodes; the queue is finally handed to a worker execution pool. The whole process is visible and controllable, and all node handlers are imported as plugins for maximum extensibility. (A minimal sketch of this node/queue/worker-pool pattern follows this list.)
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Sparse and structured neural attention mechanisms
[COLING'22] Code for "Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments".
CVPR and NeurIPS poster examples and templates. May we have in-person poster sessions soon!
Code release for the paper "Rule Augmented Unsupervised Constituency Parsing", to appear in Findings of ACL 2021
Implementation of our ACL 2020 paper: Structured Tuning for Semantic Role Labeling
The official implementation of the EMNLP 2020 paper "A Simple and Effective Model for Answering Multi-span Questions".
🚀 AI voice cloning: Clone a voice in 5 seconds to generate arbitrary speech in real-time
AI voice cloning: Clone your voice to generate arbitrary speech in real-time
Manipulate tensors with PackedSequence and CattedSequence (see the plain-PyTorch packing sketch after this list)
High Performance Structured Prediction in PyTorch
Modular implementation of an AM dependency parser in AllenNLP.
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy (a minimal usage sketch follows this list)
PyTorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more (an MC Dropout sketch follows this list)
MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification
[NAACL 2021] A Frustratingly Easy Approach for Entity and Relation Extraction https://arxiv.org/abs/2010.12812
Code Repository for "Please Mind the Root: Decoding Arborescences for Dependency Parsing" and "On Finding the K-best Non-projective Dependency Trees"
[ICLR 2020] Lite Transformer with Long-Short Range Attention
Fast(er) and accurate syntactic parsing via more exact search.
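The sketches below illustrate a few of the entries above. First, the task-flow crawler engine: a minimal Python sketch of the script-node / task-node / worker-pool pattern its description lays out. Every name here (Node, ScriptNode, TaskNode, worker_pool_execute, the plugin registry) is a hypothetical illustration, not the project's actual API.

```python
from collections import deque

class Node:
    """Base execution unit; real handlers would be loaded as plugins."""
    def run(self, payload):
        raise NotImplementedError

class ScriptNode(Node):
    """Root node: carries the flow description (an ordered list of node names)."""
    def __init__(self, flow_description):
        self.flow_description = flow_description

class TaskNode(Node):
    """Parses a flow description into an execution queue of nodes."""
    def __init__(self, registry):
        self.registry = registry  # plugin name -> Node subclass

    def build_queue(self, script):
        return deque(self.registry[name]() for name in script.flow_description)

def worker_pool_execute(queue, payload):
    """Drain the queue in order; a real engine would dispatch to a worker pool."""
    while queue:
        payload = queue.popleft().run(payload)
    return payload

if __name__ == "__main__":
    class Fetch(Node):
        def run(self, payload):
            return payload + ["fetched"]

    class Parse(Node):
        def run(self, payload):
            return payload + ["parsed"]

    script = ScriptNode(["fetch", "parse"])
    queue = TaskNode({"fetch": Fetch, "parse": Parse}).build_queue(script)
    print(worker_pool_execute(queue, []))  # ['fetched', 'parsed']
```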
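For the PackedSequence/CattedSequence entry, a quick demonstration of packing and unpacking variable-length sequences with stock PyTorch utilities (CattedSequence is that library's own type, so only the standard PackedSequence side is shown):

```python
import torch
from torch.nn.utils.rnn import pack_sequence, pad_packed_sequence

# Three variable-length sequences, pre-sorted by decreasing length.
seqs = [torch.tensor([1., 2., 3.]), torch.tensor([4., 5.]), torch.tensor([6.])]
packed = pack_sequence(seqs, enforce_sorted=True)  # a PackedSequence

# Recover a padded (batch, max_len) tensor plus the original lengths.
padded, lengths = pad_packed_sequence(packed, batch_first=True)
print(padded.shape, lengths.tolist())  # torch.Size([3, 3]) [3, 2, 1]
```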
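For the spaCy transformers entry, a minimal usage sketch; it assumes the en_core_web_trf transformer pipeline has been downloaded:

```python
# pip install spacy[transformers]
# python -m spacy download en_core_web_trf
import spacy

nlp = spacy.load("en_core_web_trf")  # transformer-backed spaCy pipeline
doc = nlp("Pretrained encoders slot straight into spaCy pipelines.")

for token in doc:
    print(token.text, token.pos_, token.dep_)
```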
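And for the Bayesian deep learning entry, a minimal MC Dropout sketch: dropout stays active at inference time, and the spread over repeated stochastic forward passes serves as an uncertainty estimate. The toy model and sample count here are arbitrary.

```python
import torch
import torch.nn as nn

# A toy regressor with dropout; any dropout-bearing model works.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Dropout(0.5), nn.Linear(16, 1))

def mc_dropout_predict(model, x, n_samples=50):
    model.train()  # train mode keeps Dropout stochastic (model has no BatchNorm)
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and spread

mean, std = mc_dropout_predict(model, torch.randn(8, 4))
print(mean.shape, std.shape)  # torch.Size([8, 1]) torch.Size([8, 1])
```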