
ModelEmbedderDistillation [RUS]

About the Project

ModelEmbedderDistillation is a project focused on distilling complex embedding models (such as Sbert and E5) into smaller, more efficient models for use in various machine learning applications.

Features

  • Sbert Distillation: Simplifying the Sbert model to improve efficiency without a significant loss of accuracy (a distillation sketch follows this list).
  • Sbert Distillation with Layer Decomposition: An advanced distillation method that decomposes model layers for deeper optimization.
  • C# Version Based on AI Framework: An implementation of the above distillation methods in C#, using the AI Framework for easy integration into .NET projects.
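
A minimal Python sketch of the embedder distillation idea, assuming the widely used sentence-transformers setup; the teacher and student model names are placeholders, and the repository's actual training code may differ. The student is trained to reproduce the teacher's sentence embeddings with an MSE loss:

```python
# Hedged sketch: embedder distillation by regressing student embeddings
# onto teacher embeddings. Model names are placeholders, not the project's own.
import torch
from torch import nn
from sentence_transformers import SentenceTransformer

teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")  # assumed teacher
student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")   # assumed smaller student
optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)
mse = nn.MSELoss()

texts = ["A sentence to embed.", "Another training sentence."]  # toy batch

for step in range(3):
    with torch.no_grad():
        # Teacher embeddings serve as fixed regression targets
        target = teacher.encode(texts, convert_to_tensor=True).to(student.device)
    # Student forward pass with gradients enabled
    features = {k: v.to(student.device) for k, v in student.tokenize(texts).items()}
    pred = student(features)["sentence_embedding"]
    loss = mse(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Layer decomposition is sketched below as a truncated-SVD factorization of a dense layer; this is one common interpretation of the term, shown only to illustrate how a single layer can be split into two smaller ones, and it is not necessarily the project's exact method:

```python
# Hedged sketch: replace one dense linear layer with a rank-r factorization.
import torch
from torch import nn

def decompose_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Approximate a linear layer with two smaller layers via truncated SVD."""
    W = layer.weight.data                        # (out_features, in_features)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * S[:rank]                 # (out_features, rank)
    V_r = Vh[:rank, :]                           # (rank, in_features)

    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    first.weight.data = V_r
    second.weight.data = U_r
    if layer.bias is not None:
        second.bias.data = layer.bias.data
    return nn.Sequential(first, second)

# Example: compress a 768 -> 768 projection to rank 128
compressed = decompose_linear(nn.Linear(768, 768), rank=128)
```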

Getting Started

To get started with ModelEmbedderDistillation, clone the repository and follow the instructions in the installation section.

Roadmap SBert

Stage                                  | Tasks                             | Status
---------------------------------------|-----------------------------------|----------------
Sbert Distillation                     | Researching distillation methods  | 🟢 Completed
Distillation with Layer Decomposition  | Developing decomposition method   | 🟢 Completed
C# Version Based on AI Framework       | Transferring algorithms to C#     | 🟢 Completed
                                       | Integrating with AI Framework     | 🟢 Completed
Optimization and Expansion             | Performance optimization          | 🟡 In Progress
                                       | Supporting additional models      | 🟡 In Progress
Documentation and Examples             | Developing documentation          | 🟡 In Progress
                                       | Creating usage examples           | 🟡 In Progress

Models

License

  • This project is distributed under the Apache 2.0 License. See the LICENSE file for more details.
