Firstly, congratulations on your amazing work. To test the CLIP model in Turkish, I trained on 2M Turkish sentences with distil-bert-multi-cased, following the instructions. I also used the precomputed embeddings you provided.
Once the loss values were around 0.02, I figured I could test :D
When I examined the test code, I noticed that it loads pickle files of about 2 MB as weights. I don't understand how to use the 2 GB TF model I trained with this code. Could you give more information about this?
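For reference, one way to bridge the gap might be to export only the trained model's weight arrays to pickle, rather than the full SavedModel. This is a hedged sketch, not the repo's actual API: the function names, file path, and the assumption that the loader expects a plain list of numpy arrays (as returned by Keras's `model.get_weights()`) are all mine.

```python
import pickle
import numpy as np

def export_weights(weights, pickle_path):
    """Dump a list of numpy weight arrays (e.g. from model.get_weights())."""
    with open(pickle_path, "wb") as f:
        pickle.dump(weights, f)

def load_weights(pickle_path):
    """Reload the pickled arrays so they can be set on a fresh model
    via model.set_weights(...)."""
    with open(pickle_path, "rb") as f:
        return pickle.load(f)

# Round-trip check with dummy arrays standing in for trained weights.
dummy = [np.zeros((4, 4), dtype=np.float32), np.ones((4,), dtype=np.float32)]
export_weights(dummy, "/tmp/clip_tr_weights.pkl")
restored = load_weights("/tmp/clip_tr_weights.pkl")
assert all(np.array_equal(a, b) for a, b in zip(dummy, restored))
```

The pickle files would then be small because they hold only the head's weight arrays, not the optimizer state or graph that inflates the full TF model to 2 GB.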