XTuner Release V0.1.17
What's Changed
- [Fix] Fix PyPI package by @LZHgrla in #540
- [Improve] Add LoRA fine-tuning configs for LLaVA-v1.5 by @LZHgrla in #536
- [Configs] Add sequence_parallel_size and SequenceParallelSampler to configs by @HIT-cwh in #538
- Check shape of attn_mask during attn forward by @HIT-cwh in #543
- Bump version to v0.1.17 by @LZHgrla in #542
Full Changelog: v0.1.16...v0.1.17
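The attn_mask shape check added in #543 can be illustrated with a minimal sketch. This is a hypothetical helper, not XTuner's actual implementation: it assumes a 2-D padding mask of shape `(batch, seq_len)` should be expanded to the 4-D shape `(batch, 1, 1, seq_len)` commonly expected by attention forward passes, and that any other shape besides the already-expanded form is an error.

```python
def validate_attn_mask_shape(mask_shape, batch_size, seq_len):
    """Check an attention mask's shape before the attention forward pass.

    Hypothetical sketch (not XTuner's code): accepts a 2-D padding-mask
    shape and returns the canonical 4-D broadcastable shape, passes an
    already-4-D mask through, and raises on anything else.
    """
    if mask_shape == (batch_size, seq_len):
        # Expand a 2-D padding mask to (batch, 1, 1, seq_len) so it
        # broadcasts over heads and query positions.
        return (batch_size, 1, 1, seq_len)
    if mask_shape == (batch_size, 1, seq_len, seq_len):
        # Already in the expanded per-position form; pass through.
        return mask_shape
    raise ValueError(
        f"attn_mask shape {mask_shape} does not match "
        f"batch_size={batch_size}, seq_len={seq_len}"
    )
```

Catching a mismatched mask shape here fails fast with a clear message instead of a confusing broadcasting error deep inside the attention kernel.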