Commit
codacy: remove unused variable lora_config
Co-authored-by: codacy-production[bot] <61871480+codacy-production[bot]@users.noreply.github.com>
phanhongan and codacy-production[bot] authored Oct 26, 2024
1 parent c45d0f1 commit cad014a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion raw_inference.py
@@ -100,7 +100,7 @@ def main():
     model, tokenizer = load_model_and_tokenizer(config, bnb_config)

     # Load LoRA configuration and merge it with the model
-    lora_config = configure_lora(config)
+    configure_lora(config)
     base_model = AutoModelForCausalLM.from_pretrained(
         config["model"]["model_name"],
         low_cpu_mem_usage=True,

Codacy Static Code Analysis notice on line 104 in raw_inference.py: undefined name 'AutoModelForCausalLM' (F821)
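The F821 notice means the module calls `AutoModelForCausalLM` without ever binding that name (for example via an import), so the call would raise `NameError` at runtime even though the file parses. As a rough illustration of the kind of unbound-name scan a checker such as pyflakes performs, here is a minimal, order-insensitive sketch using only the standard library; the helper name `find_undefined_names` and the snippet it scans are hypothetical, not from this repository:

```python
import ast
import builtins

def find_undefined_names(source: str) -> set[str]:
    # Crude sketch of an F821-style check: collect every name the
    # module binds (imports, defs, assignments, function args, plus
    # builtins), then report names that are only ever loaded.
    # Real checkers also track scopes and definition order.
    tree = ast.parse(source)
    bound = set(dir(builtins))
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            for alias in node.names:
                bound.add((alias.asname or alias.name).split(".")[0])
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            bound.add(node.name)
        elif isinstance(node, ast.arg):
            bound.add(node.arg)
        elif isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            bound.add(node.id)
    return {
        node.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Name)
        and isinstance(node.ctx, ast.Load)
        and node.id not in bound
    }

# Hypothetical snippet shaped like the diffed lines above.
snippet = (
    "configure_lora(config)\n"
    "base_model = AutoModelForCausalLM.from_pretrained(name)\n"
)
print(sorted(find_undefined_names(snippet)))
# → ['AutoModelForCausalLM', 'config', 'configure_lora', 'name']
```

In the actual file, the likely follow-up fix would be importing the name, presumably `from transformers import AutoModelForCausalLM`, assuming the model loading uses Hugging Face transformers as the class name suggests.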
