- LLMs enhance Recommendation
- Feature Engineering
- data augmentation
- generate open-world knowledge for user/item
- generate interaction data
- data condensation
- feature selection
- feature imputation
- Feature Encoder
- encode text information
- encode id information
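As a runnable sketch of the "Feature Encoder" idea above (fusing a trainable ID embedding with a frozen text embedding from a language model), the snippet below uses a hash-seeded stub in place of a real pretrained LM encoder; the class and function names are illustrative, not from any listed paper.

```python
import zlib
import numpy as np

def text_encoder_stub(text: str, dim: int = 8) -> np.ndarray:
    """Stand-in for a pretrained LM text encoder: a deterministic
    hash-seeded random projection keeps the sketch self-contained."""
    seed = zlib.crc32(text.encode("utf-8"))
    return np.random.default_rng(seed).normal(size=dim)

class IdTextFeatureEncoder:
    """Concatenates a trainable item-ID embedding with a frozen
    LM-derived text embedding (the common ID/text fusion pattern)."""
    def __init__(self, num_items: int, id_dim: int = 4, text_dim: int = 8):
        rng = np.random.default_rng(42)
        self.id_table = rng.normal(scale=0.1, size=(num_items, id_dim))
        self.text_dim = text_dim

    def encode(self, item_id: int, item_text: str) -> np.ndarray:
        id_vec = self.id_table[item_id]
        text_vec = text_encoder_stub(item_text, self.text_dim)
        return np.concatenate([id_vec, text_vec])

enc = IdTextFeatureEncoder(num_items=100)
vec = enc.encode(7, "wireless noise-cancelling headphones")
print(vec.shape)  # (12,)
```

In a real system the stub would be replaced by an actual LM forward pass, with the text part either frozen (encode once, cache) or fine-tuned jointly with the ID table.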
- LLMs as Recommenders
- prompt learning
- instruction tuning
- reinforcement learning
- knowledge distillation
- Pipeline Controller
- pipeline design
- CoT, ToT, SI
- Incremental Learning
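The prompt-learning branch above boils down to phrasing recommendation as a text task. The sketch below shows a generic instruction-style template and answer parser; the template wording and function names are my own illustration, not any listed paper's exact format.

```python
def build_rec_prompt(user_history: list[str], candidate: str) -> str:
    """Turn an interaction history plus a candidate item into an
    instruction-style prompt (a generic template, not a paper's exact one)."""
    history = "; ".join(user_history)
    return (
        f"A user has interacted with the following items: {history}. "
        f"Will the user click on \"{candidate}\"? Answer Yes or No."
    )

def parse_click_answer(llm_output: str) -> float:
    """Map the model's free-text answer to a binary click score."""
    return 1.0 if llm_output.strip().lower().startswith("yes") else 0.0

prompt = build_rec_prompt(["iPhone 13", "AirPods"], "MagSafe charger")
print(prompt)
print(parse_click_answer("Yes, the user is likely to click."))  # 1.0
```

Instruction tuning then fine-tunes the LLM on many such (prompt, Yes/No) pairs, while pure prompt learning keeps the LLM frozen and only engineers the template.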
- Other Related Work
- Self-distillation in LLM
- DPO in LLM
- LLM4CTR
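For the DPO entry above: Direct Preference Optimization trains a policy directly on preference pairs with the per-pair loss -log σ(β[(log π_θ(y_w) − log π_ref(y_w)) − (log π_θ(y_l) − log π_ref(y_l))]). A minimal numeric sketch, with toy log-probabilities standing in for real policy outputs:

```python
import math

def dpo_loss(logp_w: float, logp_l: float,
             ref_logp_w: float, ref_logp_l: float,
             beta: float = 0.1) -> float:
    """DPO loss for one preference pair (w = preferred, l = rejected):
    -log sigmoid(beta * ((logp_w - ref_logp_w) - (logp_l - ref_logp_l)))."""
    margin = beta * ((logp_w - ref_logp_w) - (logp_l - ref_logp_l))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# Preferred response gains log-prob vs. the reference while the rejected
# one loses: the loss drops below -log(0.5) ≈ 0.693 (the zero-margin value).
print(dpo_loss(-1.0, -3.0, -1.5, -2.5, beta=1.0))  # ≈ 0.3133
```

The β hyperparameter controls how far the policy may drift from the reference model before the implicit KL penalty dominates.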
Title | Model | Venue | Description |
---|---|---|---|
CTR-BERT: Cost-effective knowledge distillation for billion-parameter teacher models | CTR-BERT | NIPS WS'21 | CTR-BERT proposes a cost-effective knowledge distillation method for billion-parameter teacher models. |
DCAF-BERT: A Distilled Cachable Adaptable Factorized Model For Improved Ads CTR Prediction | DCAF-BERT | WWW'22 | DCAF-BERT proposes a distilled, cacheable, adaptable factorized model to improve ads CTR prediction. |
Learning Supplementary NLP Features for CTR Prediction in Sponsored Search | - | KDD'22 | Explores learning supplementary NLP features for CTR prediction in sponsored search. |
Practice on Effectively Extracting NLP Features for Click-Through Rate Prediction | - | CIKM'23 | Reports practical experience on effectively extracting NLP features for CTR prediction. |
BERT4CTR: An Efficient Framework to Combine Pre-trained Language Model with Non-textual Features for CTR Prediction | BERT4CTR | KDD'23 | BERT4CTR proposes an efficient framework that combines a pre-trained language model with non-textual features for CTR prediction. |
M6-rec: Generative pretrained language models are open-ended recommender systems | M6-rec | arXiv'22 | M6-rec uses a generative pretrained language model as an open-ended recommender system. |
Ctrl: Connect tabular and language model for ctr prediction | Ctrl | arXiv'23 | Ctrl proposes a method to connect tabular data and language models for CTR prediction. |
FLIP: Towards Fine-grained Alignment between ID-based Models and Pretrained Language Models for CTR Prediction | FLIP | arXiv'23 | FLIP pursues fine-grained alignment between ID-based models and pretrained language models for CTR prediction. |
TBIN: Modeling Long Textual Behavior Data for CTR Prediction | TBIN | arXiv'23 | TBIN proposes an approach to modeling long textual behavior data for CTR prediction. |
An Unified Search and Recommendation Foundation Model for Cold-Start Scenario | - | CIKM'23 | Proposes a unified search and recommendation foundation model for the cold-start scenario. |
A Unified Framework for Multi-Domain CTR Prediction via Large Language Models | - | arXiv'23 | Proposes a unified framework for multi-domain CTR prediction via large language models. |
UFIN: Universal Feature Interaction Network for Multi-Domain Click-Through Rate Prediction | UFIN | arXiv'23 | UFIN proposes a universal feature interaction network for multi-domain CTR prediction. |
ClickPrompt: CTR Models are Strong Prompt Generators for Adapting Language Models to CTR Prediction | ClickPrompt | WWW'24 | ClickPrompt uses CTR models as strong prompt generators to adapt language models to CTR prediction. |
PRINT: Personalized Relevance Incentive Network for CTR Prediction in Sponsored Search | PRINT | WWW'24 | PRINT proposes a personalized relevance incentive network for CTR prediction in sponsored search. |
Breaking the Length Barrier: LLM-Enhanced CTR Prediction in Long Textual User Behaviors | - | arXiv'24 | Proposes an LLM-enhanced approach to CTR prediction over long textual user behaviors. |
KELLMRec: Knowledge-Enhanced Large Language Models for Recommendation | KELLMRec | arXiv'24 | KELLMRec proposes knowledge-enhanced large language models for recommendation. |
Enhancing sequential recommendation via llm-based semantic embedding learning | - | WWW'24 | Enhances sequential recommendation via LLM-based semantic embedding learning. |
Heterogeneous knowledge fusion: A novel approach for personalized recommendation via llm | - | RecSys'23 | Proposes heterogeneous knowledge fusion, a novel approach to personalized recommendation via LLMs. |
Play to Your Strengths: Collaborative Intelligence of Conventional Recommender Models and Large Language Models | - | arXiv'24 | Combines conventional recommender models and large language models so each plays to its strengths. |
Generative Explore-Exploit: Training-free Optimization of Generative Recommender Systems using LLM Optimizers | - | arXiv'24 | Achieves training-free, explore-exploit optimization of generative recommender systems using LLM optimizers. |
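Several rows above (CTR-BERT, DCAF-BERT) distill a large LM-based teacher into a small serving model. The snippet below is a generic response-based distillation sketch, not any paper's actual pipeline: a toy "teacher" supplies soft click probabilities and a tiny logistic-regression student is fit to them.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy setup: the "teacher" (stand-in for a large LM-based CTR model)
# provides soft click probabilities on synthetic features.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 5))
teacher_w = rng.normal(size=5)
soft_labels = sigmoid(X @ teacher_w)       # teacher's soft targets

w = np.zeros(5)                            # student parameters
lr = 0.5
for _ in range(500):                       # gradient descent on BCE
    p = sigmoid(X @ w)
    grad = X.T @ (p - soft_labels) / len(X)
    w -= lr * grad

# The student should approximately recover the teacher's decisions.
agreement = np.mean((sigmoid(X @ w) > 0.5) == (soft_labels > 0.5))
print(round(float(agreement), 2))
```

Training on soft probabilities rather than hard 0/1 clicks is what lets the small student absorb the teacher's calibrated confidence, which is the point of distillation for CTR serving.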
Title | Model | Venue | Description |
---|---|---|---|
ICE-SEARCH: A Language Model-Driven Feature Selection Approach | ICE-SEARCH | arXiv'24 | ICE-SEARCH proposes a language-model-driven feature selection approach. |
Large Language Model Pruning | - | arXiv'24 | Studies pruning methods for large language models. |
Dynamic and Adaptive Feature Generation with LLM | - | arXiv'24 | Uses LLMs for dynamic and adaptive feature generation. |
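To make the LM-driven feature selection idea in this table concrete, here is a minimal loop with a stubbed "LLM ranker": the stub sorts by a supplied relevance score so the sketch runs offline; in practice the ranking call would be a prompt to an actual language model. All names here are illustrative, not ICE-SEARCH's API.

```python
def llm_rank_features_stub(feature_names: list[str],
                           scores: list[float]) -> list[str]:
    """Stand-in for asking an LLM to rank candidate features; we just
    sort by a supplied relevance score to keep the loop runnable."""
    return [f for f, _ in sorted(zip(feature_names, scores),
                                 key=lambda t: -t[1])]

def select_features(feature_names: list[str],
                    scores: list[float], k: int) -> list[str]:
    """Keep the top-k features proposed by the (stubbed) language model."""
    return llm_rank_features_stub(feature_names, scores)[:k]

chosen = select_features(["age", "ctr_7d", "device", "hour"],
                         [0.2, 0.9, 0.1, 0.5], k=2)
print(chosen)  # ['ctr_7d', 'hour']
```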