Implementation of some transfer learning algorithms

MIT License

Some transfer learning algorithms, modified from microsoft/UDA.

Introduction

This repository implements several transfer learning and partial transfer learning algorithms in PyTorch. Five algorithms are included in total: ERM, CORAL, DAN (MMD), DANN, and IWAN (partial transfer learning). Note that the author's main research area is fault diagnosis, so the default dataset is our self-constructed DDS dataset; see the next section for more information.

Usage

You can either modify the parameters directly in the parse_args function of main.py and then run main.py, or pass them from the terminal, e.g.: python main.py --num-classes=5 --src='20R-0HP'.
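
For orientation, here is a minimal sketch of what such an argument parser could look like. Only --num-classes and --src come from the example command above; every other option name is an assumption for illustration, and the actual parse_args in main.py defines its own set of flags.

```python
# Minimal sketch of a parse_args-style function (illustrative, not the
# repository's actual implementation).
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="Transfer learning experiments")
    # These two flags appear in the example command in the README.
    parser.add_argument("--num-classes", type=int, default=5,
                        help="number of fault classes in the dataset")
    parser.add_argument("--src", type=str, default="20R-0HP",
                        help="source-domain operating condition")
    # Hypothetical extra options, shown only to illustrate the pattern:
    parser.add_argument("--tgt", type=str, default="30R-0HP",
                        help="target-domain operating condition (assumed flag)")
    parser.add_argument("--algorithm", type=str, default="DANN",
                        choices=["ERM", "CORAL", "DAN", "DANN", "IWAN"],
                        help="which algorithm to run (assumed flag)")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(args)
```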

Using your own dataset

Split your data into train and test sets and save them as the following files:
  ├─x_train.pt
  ├─y_train.pt
  ├─x_test.pt
  └─y_test.pt
Then write your own data loader by referring to dds.py under the datasets folder; a rough sketch of loading this layout is given below.
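
The following is a minimal sketch, not the repository's dds.py: it shows one way the .pt files in the layout above could be created and then wrapped in a DataLoader. The file names come from the layout; the tensor shapes (including the channel dimension) and the helper function are assumptions.

```python
# Minimal sketch of saving/loading data in the x_train.pt / y_train.pt layout.
import torch
from torch.utils.data import TensorDataset, DataLoader

# Saving: here x is assumed to be (N, 1, 1024) for 1-D signals
# (or (N, 1, 64, 64) for 2-D data); y holds integer class labels of shape (N,).
x_train = torch.randn(100, 1, 1024)
y_train = torch.randint(0, 5, (100,))
torch.save(x_train, "x_train.pt")
torch.save(y_train, "y_train.pt")

# Loading: wrap the tensors in a TensorDataset, as a custom loader might do.
def load_split(x_path, y_path, batch_size=64, shuffle=True):
    x, y = torch.load(x_path), torch.load(y_path)
    return DataLoader(TensorDataset(x, y), batch_size=batch_size, shuffle=shuffle)

train_loader = load_split("x_train.pt", "y_train.pt")
```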
The file task.py under the tasks folder generates the transfer tasks. Our DDS dataset covers 9 operating conditions:
| condition | 20R-0HP | 20R-4HP | 20R-8HP | 30R-0HP | 30R-4HP | 30R-8HP | 40R-0HP | 40R-4HP | 40R-8HP |
| --------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- | ------- |
| number    | 0       | 1       | 2       | 3       | 4       | 5       | 6       | 7       | 8       |
So there are 72 transfer tasks in total (9 × 8 ordered source/target pairs). You can refer to task.py when generating your own tasks. The 1-dimensional data have shape 1024 and the 2-dimensional data have shape 64x64. If you use the 2-dimensional data with resnet18 as the backbone, change the _cnn_fidm parameter to 512 on line 16 of base_model.py under the models folder.
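
As a rough illustration of where the count of 72 comes from, the sketch below enumerates every ordered (source, target) pair over the 9 conditions. The condition names are from the table above; the actual tasks/task.py may organize this differently.

```python
# Minimal sketch: enumerate the 9 * 8 = 72 ordered transfer tasks.
from itertools import permutations

CONDITIONS = ["20R-0HP", "20R-4HP", "20R-8HP",
              "30R-0HP", "30R-4HP", "30R-8HP",
              "40R-0HP", "40R-4HP", "40R-8HP"]

# Every ordered (source, target) pair with source != target.
TASKS = list(permutations(CONDITIONS, 2))
assert len(TASKS) == 72

for src, tgt in TASKS[:3]:
    print(f"{src} -> {tgt}")
```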
