
How to fine-tune? #3

Open
zethan1 opened this issue Mar 10, 2023 · 3 comments

Comments

@zethan1

zethan1 commented Mar 10, 2023

How can I fine-tune for downstream tasks such as segmentation? Could you share your fine-tuning code?

@Zian-Xu
Owner

Zian-Xu commented Mar 12, 2023

As with most fine-tuning, you just need to use the pre-trained model as the initialization for the downstream task. For this purpose, I recommend a segmentation network that uses the same backbone, such as Swin-Unet, which is the network I used for the downstream task in my paper. If you need more help, please feel free to contact me again.
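A minimal sketch of this initialization step (the parameter names and shapes here are illustrative, not the actual Swin-MAE/Swin-Unet checkpoint keys; in a real PyTorch workflow this is the effect of `model.load_state_dict(filtered_weights, strict=False)` after `torch.load`):

```python
# Hypothetical sketch: initialize a downstream model from pre-trained
# backbone weights by matching parameter names and shapes.
# State dicts are modeled as plain dicts {name: weight_list} for clarity.

def transfer_matching_weights(pretrained, model):
    """Copy entries from `pretrained` into `model` whenever the parameter
    name exists in both and the shapes agree; parameters unique to the
    downstream model (e.g. the segmentation head) keep their fresh
    initialization. Returns the dict of transferred entries."""
    transferred = {}
    for name, weight in pretrained.items():
        if name in model and len(model[name]) == len(weight):
            model[name] = weight          # overwrite with pre-trained value
            transferred[name] = weight
    return transferred

# Illustrative checkpoints: only the shared backbone parameter matches.
pretrained = {"encoder.layer0.weight": [1.0, 2.0], "head.weight": [9.0]}
model = {"encoder.layer0.weight": [0.0, 0.0], "decoder.layer0.weight": [0.0]}
moved = transfer_matching_weights(pretrained, model)
```

After this step, training continues on the downstream task with the transferred backbone weights as the starting point.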

@DLoboT

DLoboT commented Oct 25, 2023

How do you use the Swin-Unet?

@Zian-Xu
Owner

Zian-Xu commented Oct 26, 2023

Are you asking how to use Swin-Unet for transfer learning? The method is the same as the original Swin-Unet's use of the pre-trained Swin Transformer model. The Swin-MAE trained model is used directly as the pre-trained weights for the encoder and bottleneck of Swin-Unet, and the Swin Transformer blocks in the decoder are initialized with the weights of the corresponding layers in the encoder. @DLoboT
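The mapping described above can be sketched as follows (the key prefixes `encoder.`, `bottleneck.`, and `decoder.` are assumptions for illustration; the real Swin-MAE and Swin-Unet checkpoints use their own naming schemes, so an actual script would need a key-renaming step):

```python
# Hypothetical sketch of the weight mapping: encoder and bottleneck
# weights come straight from the Swin-MAE checkpoint, and each decoder
# Swin Transformer block reuses the weights of the corresponding encoder
# layer. Plain dicts {parameter_name: value} stand in for state dicts.

def init_swin_unet(mae_weights):
    """Build an initial Swin-Unet state dict from Swin-MAE weights."""
    state = {}
    # 1) Direct transfer of encoder and bottleneck parameters.
    for name, value in mae_weights.items():
        if name.startswith(("encoder.", "bottleneck.")):
            state[name] = value
    # 2) Mirror encoder weights into the corresponding decoder blocks.
    for name, value in mae_weights.items():
        if name.startswith("encoder."):
            state["decoder." + name[len("encoder."):]] = value
    return state

# Illustrative checkpoint: the MAE's own decoder is discarded.
mae = {"encoder.layers.0.w": 1, "bottleneck.w": 2, "mae_decoder.w": 3}
init = init_swin_unet(mae)
```

Note that the Swin-MAE decoder itself is not reused: only the encoder and bottleneck carry over, which is consistent with the usual MAE practice of discarding the lightweight reconstruction decoder after pre-training.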
