---
name: Bug report
about: Create a report to help us reproduce and fix the issue
title: ''
labels: ''
assignees: ''

---

**Before submitting a bug, please make sure the issue hasn't already been addressed by searching through the [existing and past issues](https://github.com/facebookresearch/llama/issues).**

## Describe the bug
Please provide a clear and concise description of the bug. If relevant, include a _minimal_ (the fewest lines of code necessary) _reproducible_ (running it gives us the same result you get) code snippet, and make sure to include the relevant imports.

Remember to wrap the code and outputs in ```` ```triple-backtick blocks``` ````.

### Minimal reproducible example

```python
# sample code to repro the bug
```
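
A filled-in example might look like the sketch below. It is hypothetical: the checkpoint directory, tokenizer path, and prompt are placeholders for whatever triggers the bug on your setup, and it assumes the `Llama` text-completion API used in this repo's example scripts.

```python
# Hypothetical minimal repro (placeholder paths and prompt); launch with:
#   torchrun --nproc_per_node 1 repro.py
from llama import Llama

# Build the generator from a downloaded checkpoint and tokenizer.
generator = Llama.build(
    ckpt_dir="llama-2-7b/",            # path to your model checkpoint
    tokenizer_path="tokenizer.model",  # path to the tokenizer file
    max_seq_len=128,
    max_batch_size=1,
)

# Run the exact prompt that reproduces the problem.
results = generator.text_completion(
    ["The capital of France is"],
    max_gen_len=32,
    temperature=0.0,
)
print(results)
```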

### Output

```
<paste the stack trace and other outputs here>
```

## Runtime Environment
- Model: [e.g. `llama-2-7b-chat`]
- Using via Hugging Face?: [yes/no]
- OS: [e.g. Linux/Ubuntu, Windows]
- GPU VRAM:
- Number of GPUs:
- GPU Make: [e.g. Nvidia, AMD, Intel]

**Additional context**
Add any other context about the problem or environment here.