extracting probabilities? #5
If you'd like a bit of additional background, this is the generator I'm working on:
Interesting use case! Depending on how closely you want the output to match the input grammar, that may be easy or hard to do. The output of the compiler is a DAG rather than the original CFG. Furthermore, a number of optimizations are applied, where (specialized) symbols with only a small number of possible strings are just pre-expanded into all those strings. Outputting the DAG in a human-readable format with branch probabilities for all disjunctions wouldn't be very hard, but mapping it back to the original CFG would be a lot harder.
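The branch probabilities mentioned above fall out of counting strings: each alternative of a disjunction is weighted by the number of distinct strings it can expand to. A minimal sketch, assuming a toy acyclic grammar (the `GRAMMAR` table, symbol names, and overall shape are hypothetical and only for illustration; gramtropy's actual representation differs):

```python
import random
from math import prod
from fractions import Fraction

# Toy acyclic grammar: each nonterminal maps to a list of alternatives,
# each alternative is a concatenation of symbols (nonterminals or terminals).
GRAMMAR = {
    "word":   [["letter", "letter"], ["letter", "digit"]],
    "letter": [["a"], ["b"], ["c"]],
    "digit":  [["0"], ["1"]],
}

def count(sym, memo=None):
    """Number of distinct strings a symbol generates (grammar must be acyclic)."""
    if memo is None:
        memo = {}
    if sym not in GRAMMAR:
        return 1  # terminal: exactly one string
    if sym not in memo:
        memo[sym] = sum(prod(count(s, memo) for s in alt)
                        for alt in GRAMMAR[sym])
    return memo[sym]

def choice_probs(sym):
    """Per-alternative probabilities that make string sampling uniform."""
    total = count(sym)
    return [Fraction(prod(count(s) for s in alt), total)
            for alt in GRAMMAR[sym]]

def sample(sym):
    """Uniformly sample one string generated by `sym`."""
    if sym not in GRAMMAR:
        return sym
    weights = [float(p) for p in choice_probs(sym)]
    alt = random.choices(GRAMMAR[sym], weights=weights)[0]
    return "".join(sample(s) for s in alt)
```

Here `word` generates 3·3 + 3·2 = 15 strings, so its two alternatives get weights 9/15 and 6/15 rather than the naive 1/2 each. For a cyclic grammar the counts would have to be taken per output length, which is where it gets harder.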
Interesting, thanks!
Hello! I'm sorry to abuse GitHub issues in this fashion, but I'm not sure how else to ask for a bit of advice.
I have a simple random program generator that looks like a decision tree. It is not context free, but it is not all that far from being context free. When making a uniform choice among alternatives at every decision point, it produces extremely skewed output. I want to be able to (approximately) uniformly sample the leaves of my decision tree.
I am happy to extract a CFG from my generator (by hand) so that I can run it through gramtropy, but what I want is not for it to sample from the resulting grammar, but rather a list of branch probabilities that I can paste back into my program so that it (approximately) uniformly samples from its leaves.
Is this a feasible thing to ask your tool for? Any help appreciated, thanks.
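The weighting asked for above can be sketched directly for a plain decision tree: count the leaves under each child and pick a branch with probability proportional to its leaf count, which makes every leaf equally likely. A minimal sketch, assuming a toy nested-list tree shape (the `TREE` value and leaf names are hypothetical; the real generator is not context free, so this is only the idea):

```python
import random
from fractions import Fraction

# Toy decision tree: internal nodes are lists of children, leaves are strings.
TREE = ["leaf-A", ["leaf-B", "leaf-C", ["leaf-D", "leaf-E"]]]

def leaves(node):
    """Number of leaves in the subtree rooted at `node`."""
    if isinstance(node, str):
        return 1
    return sum(leaves(child) for child in node)

def branch_probs(node):
    """Per-child probabilities that make leaf sampling uniform."""
    total = leaves(node)
    return [Fraction(leaves(child), total) for child in node]

def sample(node):
    """Sample a leaf; every leaf has probability 1/leaves(node)."""
    if isinstance(node, str):
        return node
    weights = [float(p) for p in branch_probs(node)]
    child = random.choices(node, weights)[0]
    return sample(child)
```

In this toy tree a uniform coin at the root would give `leaf-A` probability 1/2; weighting the root's children 1/5 and 4/5 instead gives every leaf probability 1/5. These are exactly the kinds of numbers one would paste back into the generator.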