forked from SakanaAI/AI-Scientist
Add Sketch RNN template. (SakanaAI#143)
* Add sketch RNN template.
* Add general description of experiment.py
* Add relevant citations to template.tex
* Update README.md

Co-authored-by: Cong Lu <[email protected]>
1 parent f539fa9 · commit b651fc4 · 16 changed files with 4,300 additions and 0 deletions.
@@ -0,0 +1,38 @@
[
    {
        "Name": "latent_space_decorrelation",
        "Title": "Enhancing Sketch Diversity through Latent Space Decorrelation",
        "Experiment": "Introduce a covariance penalty term in the loss function that encourages the covariance matrix of the latent vectors to be close to the identity matrix. This can be implemented by adding a term to the loss that penalizes the off-diagonal elements of the covariance matrix of the latent vectors. Modify the `train` method to include this additional regularization term in the loss computation.",
        "Interestingness": 8,
        "Feasibility": 9,
        "Novelty": 7,
        "novel": true
    },
    {
        "Name": "temporal_smoothing_loss",
        "Title": "Enhancing Sketch Generation with Temporal Smoothing Loss",
        "Experiment": "Introduce a Temporal Smoothing Loss (TSL) to the existing model to encourage smooth transitions in pen movements. Define TSL as the L2 norm of the difference between consecutive pen movements. Modify the `train` method to include TSL in the loss computation. Implement an adaptive weighting mechanism that adjusts the influence of TSL based on the variance of pen movements within a sequence. Evaluate the quality and diversity of the generated sketches with and without TSL.",
        "Interestingness": 7,
        "Feasibility": 9,
        "Novelty": 7,
        "novel": true
    },
    {
        "Name": "attention_mechanism",
        "Title": "Enhancing Sketch Generation with Attention Mechanism",
        "Experiment": "In this experiment, we introduce an attention mechanism to the decoder RNN. This involves adding an attention layer that computes a context vector at each decoding step by taking a weighted sum of the latent representations. The context vector is concatenated with the decoder's hidden state and the current input to generate the next output. Modify the `DecoderRNN` class to include the attention mechanism and adjust the `forward` method accordingly. Evaluate the quality and diversity of the generated sketches with and without the attention mechanism.",
        "Interestingness": 9,
        "Feasibility": 8,
        "Novelty": 8,
        "novel": true
    },
    {
        "Name": "adaptive_dropout",
        "Title": "Improving Sketch Generation with Adaptive Dropout",
        "Experiment": "Introduce an adaptive dropout mechanism that adjusts the dropout rate dynamically based on the training step. Implement a `dropout_scheduler` function that decreases the dropout rate from 0.5 to 0.1 linearly over the training steps. Modify the `EncoderRNN` and `DecoderRNN` classes to accept a dynamic dropout rate parameter. Update the dropout rate in each forward pass of the encoder and decoder based on the current training step. Integrate the `dropout_scheduler` into the training loop, updating the dropout rates in each step. Evaluate the quality and diversity of the generated sketches with and without adaptive dropout.",
        "Interestingness": 8,
        "Feasibility": 8,
        "Novelty": 7,
        "novel": true
    }
]
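The `latent_space_decorrelation` idea describes a covariance penalty on the batch of latent vectors. A minimal sketch of such a penalty in PyTorch, penalizing only the off-diagonal covariance entries as the idea specifies (the function name and shapes are assumptions, not code from this commit):

```python
import torch

def covariance_penalty(z: torch.Tensor) -> torch.Tensor:
    """Hypothetical regularizer: sum of squared off-diagonal entries of
    the covariance matrix of latent vectors z, shape [batch, latent_dim].
    Driving these entries to zero decorrelates the latent dimensions."""
    z_centered = z - z.mean(dim=0, keepdim=True)
    cov = (z_centered.T @ z_centered) / (z.shape[0] - 1)
    off_diag = cov - torch.diag(torch.diag(cov))  # zero out the diagonal
    return (off_diag ** 2).sum()
```

In training, this term would be added to the reconstruction/KL loss with a tunable weight, e.g. `loss = recon_loss + kl_loss + lam * covariance_penalty(z)`.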
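The `temporal_smoothing_loss` idea defines TSL as the L2 norm of the difference between consecutive pen movements. A minimal sketch under the assumption that strokes are a tensor of (dx, dy) offsets (the function name and tensor layout are assumptions):

```python
import torch

def temporal_smoothing_loss(strokes: torch.Tensor) -> torch.Tensor:
    """Hypothetical TSL: mean squared L2 norm of the difference between
    consecutive pen offsets. strokes: [seq_len, batch, 2] of (dx, dy)."""
    deltas = strokes[1:] - strokes[:-1]       # change between consecutive movements
    return (deltas ** 2).sum(dim=-1).mean()   # squared L2 norm, averaged
```

The adaptive weighting mentioned in the idea could then scale this term by, say, the inverse per-sequence variance of the offsets before adding it to the training loss.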
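The `attention_mechanism` idea asks for a context vector computed as a weighted sum of latent representations at each decoding step. One standard way to realize this is additive (Bahdanau-style) attention; the sketch below is a generic illustration of that technique, not the `DecoderRNN` modification from this repo, and all class and dimension names are assumptions:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Hypothetical attention layer: scores each encoder/latent state
    against the decoder hidden state and returns a weighted-sum context."""
    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int):
        super().__init__()
        self.enc_proj = nn.Linear(enc_dim, attn_dim)
        self.dec_proj = nn.Linear(dec_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, dec_hidden: torch.Tensor, enc_states: torch.Tensor):
        # dec_hidden: [batch, dec_dim]; enc_states: [batch, T, enc_dim]
        energy = torch.tanh(self.enc_proj(enc_states)
                            + self.dec_proj(dec_hidden).unsqueeze(1))
        weights = torch.softmax(self.score(energy), dim=1)  # [batch, T, 1]
        context = (weights * enc_states).sum(dim=1)         # [batch, enc_dim]
        return context, weights
```

Per the idea, the returned context would be concatenated with the decoder's hidden state and current input before producing the next output.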
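The `adaptive_dropout` idea calls for a `dropout_scheduler` that decreases the dropout rate linearly from 0.5 to 0.1 over training. A minimal sketch of such a scheduler (the signature is an assumption; the endpoints come from the idea text):

```python
def dropout_scheduler(step: int, total_steps: int,
                      start: float = 0.5, end: float = 0.1) -> float:
    """Linearly anneal the dropout rate from `start` to `end` as
    training progresses; clamps at `end` once total_steps is reached."""
    frac = min(step / max(total_steps, 1), 1.0)
    return start + frac * (end - start)
```

The training loop would call this each step and pass the result into the encoder and decoder forward passes as the current dropout probability.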