Is this copy necessary in PytorchModel? #43
Hello, regarding lambeq/lambeq/training/pytorch_model.py Line 136 in d911686: since the diagrams are very big, especially with "not-so-toy" datasets, doing a deepcopy (even if faster, as you claim) is very demanding in terms of resources and only involves the CPU. Since you never alter the diagrams you receive, I was wondering why you want to copy them. Commenting out that line and reinstalling works on my end. However, before opening a pull request I wanted to reach out to you, so as not to pollute your repository with possibly unwanted changes and to make sure I'm not missing anything.
Hi @mspronesti, thanks for bringing this up; this is indeed something we need to fix! It is true that copying the diagrams requires extra resources. We need this operation because we destructively substitute the symbols (placeholders for tensors) with the concrete values: lambeq/lambeq/training/pytorch_model.py Lines 137 to 144 in d911686. Deleting the line shouldn't change the behaviour of the model, because PyTorch updates the parameters in-place during training. However, the instance of the lambeq Dataset class won't be reusable after training. If this is not a problem for you, then yes, commenting out line 136 should be safe to do. That said, we'll need to find a proper general solution for this.
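To illustrate the trade-off being discussed, here is a minimal, self-contained sketch. The `ToyDiagram` class and its `subst_inplace` method are hypothetical stand-ins for lambeq's actual diagram and symbol-substitution machinery, not its real API; the point is only to show why a destructive substitution makes the original diagrams non-reusable, and how a `deepcopy` beforehand preserves them.

```python
from copy import deepcopy

class ToyDiagram:
    """Toy stand-in for a lambeq diagram holding symbolic placeholders.

    (Illustrative only; not the actual lambeq Diagram API.)
    """

    def __init__(self, symbols):
        self.data = list(symbols)  # e.g. ['theta_0', 'theta_1']

    def subst_inplace(self, mapping):
        # Destructive substitution, analogous to the one discussed above:
        # after this call, the symbolic placeholders are gone.
        self.data = [mapping.get(s, s) for s in self.data]
        return self

diagram = ToyDiagram(['theta_0', 'theta_1'])
params = {'theta_0': 0.5, 'theta_1': -1.2}

# Copying first keeps the dataset's diagram intact, so it can be
# re-symbolised for a second training run:
trained = deepcopy(diagram).subst_inplace(params)

print(trained.data)  # concrete values substituted
print(diagram.data)  # original symbols untouched
```

Skipping the `deepcopy` (substituting directly on `diagram`) would leave `diagram.data` holding the concrete values, which is exactly why the Dataset instance stops being reusable after training.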