save_pretrained: use tqdm when saving checkpoint shards from offloaded params (huggingface#31856)
kallewoof authored Jul 9, 2024
1 parent 350aed7 commit cffa2b9
Showing 1 changed file with 4 additions and 1 deletion.
src/transformers/modeling_utils.py: 4 additions & 1 deletion
@@ -2657,7 +2657,10 @@ def save_pretrained(
             ):
                 os.remove(full_filename)
         # Save the model
-        for shard_file, tensors in state_dict_split.filename_to_tensors.items():
+        filename_to_tensors = state_dict_split.filename_to_tensors.items()
+        if module_map:
+            filename_to_tensors = logging.tqdm(filename_to_tensors, desc="Saving checkpoint shards")
+        for shard_file, tensors in filename_to_tensors:
             shard = {tensor: state_dict[tensor] for tensor in tensors}
             # remake shard with onloaded parameters if necessary
             if module_map:
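For context, `logging` here refers to `transformers.utils.logging`, whose `tqdm` wrapper honors the library's global progress-bar settings (it stays silent after `logging.disable_progress_bar()`). Below is a minimal sketch of the same pattern outside `save_pretrained`; the shard mapping and the `show_progress` flag are hypothetical stand-ins (in the real code the iterable comes from `state_dict_split.filename_to_tensors` and the condition is `module_map`, which is only populated when parameters are offloaded):

```python
# Sketch of the pattern added by this commit, assuming transformers is installed.
# The shard mapping below is hypothetical; in save_pretrained it is produced by
# huggingface_hub's split_torch_state_dict_into_shards.
from transformers.utils import logging

shards = {
    "model-00001-of-00002.safetensors": ["embed.weight"],
    "model-00002-of-00002.safetensors": ["lm_head.weight"],
}

filename_to_tensors = shards.items()
show_progress = True  # stands in for `module_map` (offloaded params present)
if show_progress:
    # logging.tqdm respects transformers' progress-bar toggles
    # (logging.enable_progress_bar / logging.disable_progress_bar).
    filename_to_tensors = logging.tqdm(filename_to_tensors, desc="Saving checkpoint shards")

for shard_file, tensors in filename_to_tensors:
    # In save_pretrained, each shard is materialized and written here;
    # this sketch just reports what would be saved.
    print(shard_file, tensors)
```

Wrapping the iterable only when `module_map` is set keeps the default save path quiet; the progress bar appears only in the slow case, where offloaded parameters must be moved back on-device shard by shard.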
