support pytorch embedding? #1
Comments
The RNN example includes an embedding, so this package works for normal embeddings. If you found an error, perhaps you can expand on what it was and how to replicate it?
Embeddings in PyTorch need a Long-typed input. But summary feeds the model a float tensor.
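The point above can be checked directly in plain PyTorch, without any summary tool: `nn.Embedding` performs an integer index lookup, so it accepts a Long tensor but rejects a float tensor of the same shape. A minimal sketch:

```python
import torch
from torch import nn

# nn.Embedding maps integer indices to learned vectors,
# so its input tensor must have an integer (Long) dtype.
embedding = nn.Embedding(10, 3)

idx = torch.zeros((2, 4)).long()  # integer indices: works
out = embedding(idx)
print(out.shape)  # (batch, seq_len, embedding_dim)

# The same shape as a float tensor is rejected, which is why a
# summary tool that generates float dummy inputs fails on embeddings.
rejected = False
try:
    embedding(torch.zeros((2, 4)))  # float "indices"
except RuntimeError:
    rejected = True
print("float input rejected:", rejected)
```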
Hi. I'm sorry, I have been very busy recently.
@nmhkahn no worries, I thought I'd help you handle it. @kangkang59812 I think it's because you need to provide a tensor, not a shape (unlike torchsummary). So try:

```python
import torch
from torch import nn
from torchsummaryX import summary

embedding = nn.Embedding(10, 3)
summary(embedding.cuda(), torch.zeros((2, 4)).cuda())
```

And it should work.
Got this error when I ran the above piece of code:

```
TypeError: rand(): argument 'size' must be tuple of ints, but found element of type Tensor at pos 2
```
Try this code on the latest version, please:

```python
import torch
from torch import nn
from torchsummaryX import summary

embedding = nn.Sequential(nn.Embedding(10, 3))
summary(embedding, torch.zeros((2, 4)).long())
```

If that doesn't work, please post the full error stack and the versions of torch and torchsummaryX you are using, so we can replicate it.
There is some error in torchsummary when the model contains an embedding layer.