Commit

fix the issue with strange strikethrough output
TsuTikgiau committed Apr 19, 2023
1 parent c37ef66 commit f5c2836
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion minigpt4/conversation/conversation.py
@@ -160,7 +160,9 @@ def answer(self, conv, img_list, max_new_tokens=300, num_beams=1, min_length=1,
             temperature=temperature,
         )
         output_token = outputs[0]
-        if output_token[0] == 0:
+        if output_token[0] == 0:  # the model might output an unknown token <unk> at the beginning; remove it
             output_token = output_token[1:]
+        if output_token[0] == 1:  # some users find that there is a start token <s> at the beginning; remove it
+            output_token = output_token[1:]
         output_text = self.model.llama_tokenizer.decode(output_token, add_special_tokens=False)
         output_text = output_text.split('###')[0]  # remove the stop sign '###'
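
Note on the symptom: the "strange strikethrough output" most likely comes from a leading <s> (start) token leaking into the decoded reply; when the demo UI renders the text, a literal "<s>" is read as the HTML strikethrough tag, striking through everything after it. The two guards above drop at most one leading <unk> (id 0) and one leading <s> (id 1) before decoding. A minimal standalone sketch of the same post-processing, assuming a LLaMA-style tokenizer where those ids hold (clean_output and its arguments are illustrative names, not part of this commit):

def clean_output(output_token, tokenizer):
    # Illustrative helper mirroring the post-processing in Chat.answer().
    # output_token: 1-D sequence of generated token ids for one reply.
    # Assumed special-token ids: <unk> = 0, <s> = 1 (LLaMA tokenizer convention).
    if len(output_token) and output_token[0] == 0:  # drop a leading <unk>
        output_token = output_token[1:]
    if len(output_token) and output_token[0] == 1:  # drop a leading <s>
        output_token = output_token[1:]
    text = tokenizer.decode(output_token, add_special_tokens=False)
    return text.split('###')[0]  # '###' marks the end of a turn in the conversation template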
