Text Classification

Text classification with LLMs via Ollama.

Running the application

The application relies on Ollama to provide the LLMs. You can either run Ollama locally on your machine or rely on the Testcontainers support in Spring Boot to spin up an Ollama service automatically.

Ollama as a native application

First, make sure you have Ollama installed on your machine. Then, use Ollama to pull the Mistral NeMo large language model.

ollama pull mistral-nemo

Finally, run the Spring Boot application.

./gradlew bootRun

Ollama as a dev service with Testcontainers

The application relies on the native Testcontainers support in Spring Boot to spin up an Ollama service with the Mistral NeMo model at startup time.

./gradlew bootTestRun
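
If you're curious how such a dev service can be wired, here is a minimal sketch, assuming the Testcontainers Ollama module and Spring AI's Testcontainers service-connection support are on the test classpath (the class and bean names are illustrative, not necessarily this project's actual code):

import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.springframework.context.annotation.Bean;
import org.testcontainers.ollama.OllamaContainer;

@TestConfiguration(proxyBeanMethods = false)
class TestcontainersConfiguration {

    // Starts an Ollama container and exposes its connection details to Spring AI
    // via @ServiceConnection, so no manual base-url configuration is needed.
    @Bean
    @ServiceConnection
    OllamaContainer ollama() {
        return new OllamaContainer("ollama/ollama");
    }
}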

Calling the application

You can now call the application, which will use Ollama and Mistral NeMo to classify your text. The following examples use httpie to send HTTP requests.

Each endpoint is backed by a progressively more refined prompt that improves the quality of the text classification performed by the LLM.
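
As an illustration of the simplest variant, here is a hedged sketch of what the class-names endpoint could look like with Spring AI's ChatClient; the controller name, class labels, and prompt wording are assumptions rather than this project's actual code. The class-descriptions and few-shots-prompt variants would differ mainly in the system prompt text (adding a description per class, or a few labeled examples).

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ClassificationController {

    private final ChatClient chatClient;

    ClassificationController(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    // Simplest strategy: the system prompt only lists the class names.
    @PostMapping("/classify/class-names")
    String classifyByClassNames(@RequestBody String text) {
        return this.chatClient.prompt()
                .system("""
                        Classify the provided text into one of these classes:
                        BUSINESS, SPORT, TECHNOLOGY, OTHER.
                        Respond with the class name only.
                        """)
                .user(text)
                .call()
                .content();
    }
}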

Class Names:

http --raw "Basketball fans can now watch the game on the brand-new NBA app for Apple Vision Pro." :8080/classify/class-names

Class Descriptions:

http --raw "Basketball fans can now watch the game on the brand-new NBA app for Apple Vision Pro." :8080/classify/class-descriptions

Few Shots Prompt:

http --raw "Basketball fans can now watch the game on the brand-new NBA app for Apple Vision Pro." :8080/classify/few-shots-prompt

Few Shots History:

http --raw "Basketball fans can now watch the game on the brand-new NBA app for Apple Vision Pro." :8080/classify/few-shots-history

Structured Output:

http --raw "Basketball fans can now watch the game on the brand-new NBA app for Apple Vision Pro." :8080/classify/structured-output