Stars
An open-source cross-platform alternative to AirDrop
Use LLMs to dig out what you care about from massive amounts of information and a variety of sources daily.
Master programming by recreating your favorite technologies from scratch.
fastHan: a Chinese NLP toolkit built on fastNLP and PyTorch, as convenient to call as spaCy.
Neural Network and Deep Learning (《神经网络与深度学习》), by Xipeng Qiu
fastNLP: A Modularized and Extensible NLP Framework. Currently still in incubation.
Toolkit for Chinese natural language processing
🏅 Collection of Kaggle Solutions and Ideas 🏅
MindSpore is a new open source deep learning training/inference framework that could be used for mobile, edge and cloud scenarios.
Electric load forecasting with LSTM/BiLSTM, covering univariate single-step, multivariate single-step, and multivariate multi-step forecasting.
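The three forecasting setups above differ only in how the series is windowed into supervised (input, target) pairs. A minimal NumPy sketch of that framing (illustrative only, not the repo's code; the window lengths are arbitrary assumptions):

```python
import numpy as np

def make_windows(series, n_in, n_out):
    """Slice a (time, features) array into supervised (X, y) pairs:
    each X is n_in past steps, each y the next n_out steps."""
    X, y = [], []
    for t in range(len(series) - n_in - n_out + 1):
        X.append(series[t : t + n_in])
        y.append(series[t + n_in : t + n_in + n_out])
    return np.stack(X), np.stack(y)

# Univariate single-step: 1 feature in, predict 1 step ahead.
uni = np.arange(100.0).reshape(-1, 1)   # (100, 1)
X, y = make_windows(uni, n_in=24, n_out=1)
print(X.shape, y.shape)                 # (76, 24, 1) (76, 1, 1)

# Multivariate multi-step: 3 features in, predict the next 6 steps.
multi = np.random.rand(100, 3)          # (100, 3)
X, y = make_windows(multi, n_in=24, n_out=6)
print(X.shape, y.shape)                 # (71, 24, 3) (71, 6, 3)
```

The resulting (samples, n_in, features) arrays are exactly the batch shape an LSTM or BiLSTM expects as input.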
Attention Based Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting, AAAI 2019, pytorch version
Time series forecasting using PyTorch, including ANN, RNN, LSTM, GRU, and TSR-RNN; experimental code.
Time series forecasting with PyTorch
PyTorch-based probabilistic time series forecasting framework built on the GluonTS backend.
Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes.
🪄 Create rich visualizations with AI
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Proof of concept for a transformer-based time series prediction model
Source code for time series analysis models
Time series analysis tutorial
🚀🚀 Train a small 26M-parameter GPT ("large model") completely from scratch in just 2 hours! 🌏
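The "26M" figure is the transformer's parameter budget. With a hypothetical GPT-style config (the hidden size, layer count, and vocabulary below are illustrative assumptions, not the repo's actual hyperparameters), a back-of-the-envelope count lands in that neighborhood:

```python
def gpt_param_count(vocab, d, n_layers, tied_embeddings=True):
    """Rough GPT parameter count: token embeddings plus, per layer,
    QKV + output projections (4*d^2) and a 4x-wide MLP (8*d^2);
    biases and layer norms are omitted as negligible."""
    embed = vocab * d
    per_layer = 4 * d * d + 8 * d * d      # attention + MLP
    total = embed + n_layers * per_layer
    if not tied_embeddings:
        total += vocab * d                 # separate output head
    return total

# Illustrative config (assumed, not the repo's real one):
n = gpt_param_count(vocab=6400, d=512, n_layers=8)
print(f"{n / 1e6:.1f}M parameters")        # 28.4M with these numbers
```

Tying the output head to the input embeddings, or shrinking the KV projections (grouped-query attention), pulls the total down further, which is how small models squeeze under such budgets.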
Chinese and English sensitive-word lists, language detection, phone number location and carrier lookup (domestic and international), gender inference from names, phone number extraction, ID card number extraction, email extraction, Chinese and Japanese personal-name databases, Chinese abbreviation database, character-decomposition dictionary, word sentiment scores, stopwords, subversive-word list, terrorism-related word list, traditional/simplified Chinese conversion, English approximations of Chinese pronunciation, Wang Feng lyrics generator, occupation-name lexicon, synonym dictionary, antonym dictionary, negation-word list, car brand lexicon, car parts lexicon, segmentation of run-together English, assorted Chinese word embeddings, company name collection, classical Chinese poetry corpus, IT lexicon, finance lexicon, idiom lexicon, place-name lexicon, …
Transformer/self-attention for multidimensional time series forecasting.
Python tool for converting files and office documents to Markdown.
Fast and memory-efficient exact attention
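"Exact" here means the output is numerically the same as plain softmax attention, only computed in a tiled, memory-efficient way. The reference computation the fast kernel must match is just this (a NumPy sketch for illustration, not the library's code):

```python
import numpy as np

def exact_attention(q, k, v):
    """Plain O(n^2) softmax attention: softmax(q k^T / sqrt(d)) v.
    FlashAttention computes exactly this result, but tiled so the
    full n x n score matrix never materializes in slow memory."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (n, n)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((16, 8)) for _ in range(3))
out = exact_attention(q, k, v)
print(out.shape)                                   # (16, 8)
```

This contrasts with approximate-attention methods (low-rank or sparse), which trade accuracy for speed; FlashAttention trades nothing on accuracy.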