GLiClass.js is a TypeScript-based inference engine for running GLiClass (Generalist and Lightweight Model for Sequence Classification) models. GLiClass is an efficient zero-shot classifier inspired by the GLiNER work. It achieves the same performance as a cross-encoder while being more compute-efficient, because classification is done in a single forward pass.
It can be used for topic classification, sentiment analysis, and as a reranker in RAG pipelines.
☋ Knowledgator • ✔️ LinkedIn • 📢 Discord • 🤗 Space • 🤗 GliClass Collection
- Flexible zero-shot classification without predefined categories
- Lightweight and fast inference
- Easy integration with web applications
- TypeScript support for better developer experience
```bash
npm install gliclass
```
```typescript
import { Gliclass } from "gliclass"; // import from the npm package installed above

const gliclass = new Gliclass({
  tokenizerPath: "knowledgator/gliclass-small-v1.0",
  onnxSettings: {
    modelPath: "public/model.onnx",
    executionProvider: "cpu",
    multiThread: true,
  },
  promptFirst: false,
});

// Load the tokenizer and ONNX model before running inference
await gliclass.initialize();

const input_text = "Your input text here";
const texts = [input_text];
const labels = ["business", "science", "tech"];
const threshold = 0.5;

const decoded = await gliclass.inference({ texts, labels, threshold });
console.log(decoded);
```
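The same inference call covers other tasks mentioned above; as a minimal sketch reusing the quick-start instance, sentiment analysis only changes the label set (the threshold value here is an illustrative choice, not a recommendation):

```typescript
// Sentiment analysis with the same API: only the labels change.
const sentimentResults = await gliclass.inference({
  texts: ["I absolutely loved this product!"],
  labels: ["positive", "negative", "neutral"],
  threshold: 0.3, // illustrative value; tune for your use case
});
console.log(sentimentResults);
```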
- modelPath: either a URL/path to a local model, as in the basic example, or the model itself as an array of binary data.
- executionProvider: the same providers that ONNX Runtime Web supports; currently `webgpu` (recommended), `cpu`, `wasm`, and `webgl` are allowed, but more can be added.
- wasmPaths: path to the WASM binaries; either a URL to the binaries (such as a CDN URL) or a local path to a folder containing them.
- multiThread: whether to use multithreading; only relevant for the wasm and cpu execution providers. With those providers, it can also specify the number of cores to use (see the configuration sketch below).
- fetchBinary: prefetches the binary from the default or provided WASM paths.
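As an illustration, here is a configuration sketch for the wasm provider; the CDN URL and core count below are hypothetical examples, not required values:

```typescript
// A configuration sketch for the wasm provider. The wasmPaths URL and
// the core count are hypothetical examples.
const gliclassWasm = new Gliclass({
  tokenizerPath: "knowledgator/gliclass-small-v1.0",
  onnxSettings: {
    modelPath: "public/model.onnx",
    executionProvider: "wasm",
    wasmPaths: "https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/", // CDN hosting the WASM binaries
    multiThread: 4,    // use 4 cores (a boolean also works to simply enable it)
    fetchBinary: true, // prefetch the WASM binaries up front
  },
});
```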
To use GLiClass models in a web environment, you need an ONNX format model. You can:
- Search for pre-converted models on HuggingFace
- Convert a model yourself using the Python `convert_to_onnx.py` script.
Use the `convert_to_onnx.py` script with the following arguments:
- `model_path`: location of the GLiClass model
- `save_path`: where to save the ONNX file
- `quantize`: set to True for IntU8 quantization (optional)
Example:
```bash
python convert_to_onnx.py --model_path /path/to/your/model --save_path /path/to/save/onnx --quantize True
```
GLiClass.js offers versatile text classification capabilities across various domains:
- Document Classification
- Sentiment Analysis
- Reranking of Search Results (sketched below) ...
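For reranking, one approach is to score each document against the query passed as a single label. This is a hedged sketch: it uses only the documented `inference` API, but the exact shape of the returned scores is an assumption and may differ:

```typescript
// A reranking sketch: treat the query as the single label and rank
// documents by their classification score. The result format of
// inference() is assumed here, not documented.
const query = "how to fine-tune a zero-shot classifier";
const documents = [
  "A tutorial on fine-tuning GLiClass models.",
  "A recipe for sourdough bread.",
];
const scores = await gliclass.inference({
  texts: documents,
  labels: [query],
  threshold: 0.0, // keep all scores so every document can be ranked
});
console.log(scores); // sort documents by their score for the query
```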
- Further optimize inference speed
- Add support for more architectures
- Enable model training capabilities
- Provide more usage examples
- For any changes, remember to run `pnpm changeset`; otherwise there will be no version bump and the PR GitHub Action will fail.
For questions and support, please join our Discord community or open an issue on GitHub.