This repository has been archived by the owner on Nov 1, 2024. It is now read-only.

Refactor prompt generation to not require loading a model #12

Open
ramakanth-pasunuru opened this issue Mar 28, 2022 · 0 comments
Labels: better-eng (Things that can help make things sane), enhancement (New feature or request)

Comments

@ramakanth-pasunuru
Contributor

Currently, we use the code in fairseq/eval/predictors.py for prompt generation, and it relies on the model passed into this class. It would be good to remove this false dependency, especially when one only needs to generate prompt data independently of the model (e.g., see the discussion in this PR).
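For illustration, here is a minimal sketch of one way the dependency could be removed: prompt construction is factored into a standalone helper that never touches a model, and the predictor only calls that helper when it actually runs inference. All names here (`build_prompts`, `FewShotExample`, `Predictor`) are hypothetical and do not reflect the actual fairseq/eval/predictors.py API.

```python
# Sketch of decoupling prompt generation from model loading.
# Names are hypothetical; this is not the real fairseq/eval/predictors.py API.
from dataclasses import dataclass
from typing import List, Sequence


@dataclass
class FewShotExample:
    """A single (input, target) pair used to fill a prompt template."""
    input_text: str
    target_text: str


def build_prompts(
    examples: Sequence[FewShotExample],
    template: str = "{input} Answer: {target}",
) -> List[str]:
    """Render prompt strings from examples; no model or checkpoint is needed."""
    return [
        template.format(input=ex.input_text, target=ex.target_text)
        for ex in examples
    ]


class Predictor:
    """Model-dependent scoring stays here; prompt generation is delegated."""

    def __init__(self, model):
        self.model = model  # only needed for actual inference

    def predict(self, examples: Sequence[FewShotExample]) -> List[str]:
        prompts = build_prompts(examples)
        # The model is used only at this point, not during prompt construction.
        return [self.model.generate(p) for p in prompts]


if __name__ == "__main__":
    # Prompt data can now be produced without ever loading a model.
    data = [FewShotExample("2 + 2 =", "4")]
    print(build_prompts(data))
```

With this split, scripts that only need prompt data can import the helper directly and skip checkpoint loading entirely.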

@bigfootjon bigfootjon transferred this issue from another repository May 2, 2022
@suchenzang suchenzang added better-eng Things that can help make things sane and removed help wanted labels Oct 12, 2022

2 participants