# ModelEmbedderDistillation [RUS]

ModelEmbedderDistillation is a project for distilling large embedding models (such as SBERT and E5) into smaller, more efficient ones for use in a variety of machine learning applications.
- SBERT Distillation: compressing the SBERT model to improve performance without significant loss of accuracy.
- SBERT Distillation with Layer Decomposition: an advanced distillation method that additionally decomposes model weight matrices for deeper compression.
- C# Version Based on AI Framework: an implementation of the above distillation methods in C#, using the AI Framework for easy integration into .NET projects.
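The core idea of embedding distillation is to train a small student model so that its sentence embeddings match those of a frozen teacher. Below is a minimal numpy sketch of that objective: a single linear "student" layer fit by gradient descent on the MSE between student and teacher embeddings. All names, shapes, and the toy data are illustrative assumptions, not code from this repository (the real student is a full transformer):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy setup: the "teacher" is a frozen random projection, the
# "student" is a single linear layer trained to imitate it.
dim_in, dim_emb, n = 32, 16, 256
X = rng.standard_normal((n, dim_in))           # toy input features
T = rng.standard_normal((dim_emb, dim_in))     # frozen teacher weights
teacher_emb = X @ T.T                          # target embeddings

W = np.zeros((dim_emb, dim_in))                # student weights
lr = 0.05
for _ in range(500):
    student_emb = X @ W.T
    err = student_emb - teacher_emb            # (n, dim_emb)
    loss = np.mean(err ** 2)                   # distillation MSE loss
    grad = 2.0 * err.T @ X / n                 # gradient of the loss w.r.t. W
    W -= lr * grad
```

In practice the same MSE objective is applied between the pooled sentence embeddings of a transformer teacher and student, often combined with cosine-similarity terms.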
To get started with ModelEmbedderDistillation, clone the repository and follow the instructions in the installation section.
| Stage | Tasks | Status |
|---|---|---|
| Sbert Distillation | Researching distillation methods | 🟢 Completed |
| Distillation with Layer Decomposition | Developing decomposition method | 🟢 Completed |
| C# Version Based on AI Framework | Transferring algorithms to C# | 🟢 Completed |
| | Integrating with AI Framework | 🟢 Completed |
| Optimization and Expansion | Performance optimization | 🟡 In Progress |
| | Supporting additional models | 🟡 In Progress |
| Documentation and Examples | Developing documentation | 🟡 In Progress |
| | Creating usage examples | 🟡 In Progress |
- Distilled SBERT: `FractalGPT/SbertDistil`
- Distilled SBERT with SVD: `FractalGPT/SbertSVDDistil`
- Model ported for use in C#: `FractalGPT/SbertDistilAIFr`
This project is distributed under the Apache 2.0 License. See the LICENSE file for more details.