
jit : update how nn.Parameters are detected and tagged with STATIC_MEMORY_LOCATION tag #1611

Open · wants to merge 4 commits into base: main
Conversation

@kshitij12345 (Collaborator) commented on Jan 7, 2025

Related #1575

On the thunderfx path we don't see the nn.Module, but we do see its nn.Parameters. In this PR, we tag the proxy with STATIC_MEMORY_LOCATION if the original tensor is an instance of nn.Parameter.

NOTE: This PR doesn't fix detection of buffers on the thunderfx path.

Co-authored-by: Thomas Viehmann <[email protected]>
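
For illustration, a minimal sketch of the idea described above, not the PR's actual diff: the names `proxify_tensor`, `ProxyTag`, and the `tags` attribute are assumptions standing in for Thunder's internals, and only the `isinstance` check reflects what the PR title describes.

```python
# Sketch only: `proxify_tensor` and `ProxyTag` are assumed names, not
# Thunder's real API. The point is the type check on the original tensor.
import torch

def proxify_tensor(original: torch.Tensor, proxy, ProxyTag):
    # On the thunderfx path we never see the owning nn.Module, but the
    # tensor itself still identifies as a parameter, so check its type
    # directly instead of walking the module.
    if isinstance(original, torch.nn.Parameter):
        proxy.tags.add(ProxyTag.STATIC_MEMORY_LOCATION)
    return proxy
```
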
@kshitij12345 changed the title from "[WIP] jit : update how nn.Parameters are detected and tagged with STATIC_MEMORY_LOCATION tag" to "jit : update how nn.Parameters are detected and tagged with STATIC_MEMORY_LOCATION tag" on Jan 7, 2025
@kshitij12345 marked this pull request as ready for review on January 7, 2025, 13:37
@t-vi (Collaborator) left a comment

Seems terrible, but if it unblocks you.

@kshitij12345 (Collaborator, Author)

> Seems terrible, but if it unblocks you.

Do you have another fix in mind? Or is the objection that we can't see the nn.Module at all via the thunderfx path?

@mruberry (Collaborator) commented on Jan 7, 2025

Tagging parameters makes a lot of sense to me. They're parameters for a reason.

In the future we may also consider what happens if someone calls thunderfx on a module. We could detect that a module was passed to the call before passing it to torch.compile. However, there's no requirement that someone pass the module itself for optimization.
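
A hypothetical sketch of that idea, using only standard torch APIs; the helper name `collect_static_tensors` is made up and is not part of Thunder's API.

```python
# Hypothetical helper illustrating the comment above, not Thunder's code.
import torch

def collect_static_tensors(fn):
    # If the user hands us an nn.Module directly, its parameters and buffers
    # are known before torch.compile traces anything and could be tagged as
    # having static memory locations.
    if isinstance(fn, torch.nn.Module):
        return set(fn.parameters()) | set(fn.buffers())
    # A plain callable gives us nothing to inspect up front.
    return set()
```

As the comment notes, nothing requires the caller to pass the module itself, so this check can only ever be a best-effort addition to the parameter-type check from this PR.
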
