# Domain adaptation of Large Language Models for Sparse Information Retrieval

This repository contains our setup for evaluating domain-adapted BERT models on an information retrieval task using DeepCT.

All steps except the last part of the evaluation can be run directly in the notebooks.

## Evaluation

The evaluation cannot be completed entirely in the notebook setup because it requires Docker. For the last step, it is therefore necessary to set up Docker and evaluate the results on your own machine.