Use default max_sequence_length (mlc-ai#64)
Fixes mlc-ai#61
davidar authored Apr 23, 2023
1 parent 28332cb commit 1906d90
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion web_llm/utils.py
@@ -10,7 +10,7 @@ def get_config(hf_config, model):
     from .relax_model.llama import LlamaConfig as RelaxConfig
 
     return RelaxConfig(
-        max_sequence_length=hf_config.max_sequence_length,
+        #max_sequence_length=hf_config.max_sequence_length,
         vocab_size=hf_config.vocab_size,
         hidden_size=hf_config.hidden_size,
         intermediate_size=hf_config.intermediate_size,
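The fix works because the explicit keyword argument is removed rather than replaced: once the line is commented out, `RelaxConfig` falls back to whatever default its class definition declares for `max_sequence_length`, and the code no longer touches an attribute that some Hugging Face configs do not define. A minimal sketch of that pattern, using hypothetical stand-in classes and an assumed default of 2048 (not the actual web_llm definitions):

```python
from dataclasses import dataclass


@dataclass
class LlamaConfig:
    """Stand-in for web_llm's RelaxConfig; the default value is assumed."""
    vocab_size: int
    hidden_size: int
    max_sequence_length: int = 2048  # hypothetical default


class HFConfig:
    """Stand-in for a Hugging Face config that lacks max_sequence_length."""
    vocab_size = 32000
    hidden_size = 4096


hf_config = HFConfig()

# Before the commit, passing hf_config.max_sequence_length explicitly
# would raise AttributeError on configs like this one. Omitting the
# argument (as the commented-out line does) uses the class default.
cfg = LlamaConfig(
    # max_sequence_length=hf_config.max_sequence_length,
    vocab_size=hf_config.vocab_size,
    hidden_size=hf_config.hidden_size,
)
print(cfg.max_sequence_length)  # → 2048 (the assumed default)
```

The same effect could be had with `getattr(hf_config, "max_sequence_length", default)`, but letting the config class own its default keeps the fallback in one place.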
