PyTorch Implementation of BERT4NILM: A Bidirectional Transformer Model for Non-Intrusive Load Monitoring
The CSV datasets can be downloaded here: REDD and UK-DALE
We took the liberty of renaming certain appliances to 'dishwasher', 'fridge', 'microwave', 'washing_machine' and 'kettle' in the 'labels.dat' file; see the data folder. A sketch of the resulting file is shown below.
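After renaming, a 'labels.dat' file might look like the following; the channel numbers here are illustrative only and depend on the house in question:

```
1 mains
2 mains
3 dishwasher
5 fridge
11 microwave
20 washing_machine
```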
This is the PyTorch implementation of BERT4NILM, a Bidirectional Encoder Representations from Transformers model for energy disaggregation. In this repository we provide the BERT4NILM model as well as data functions for the low-frequency REDD and UK-DALE datasets. Run the following command to train an initial model; hyper-parameters (as well as appliances) can be tuned in utils.py, and testing runs automatically after training ends:
```
python train.py
```
The trained model state dict will be saved under 'experiments/dataset-name/best_acc_model.pth'.
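For reference, here is a minimal sketch of restoring a saved checkpoint for evaluation. The 'build_model' callable and the 'redd' directory name are placeholders, not part of this repository; the actual model construction follows 'train.py' and 'utils.py':

```python
import torch

def load_trained_model(build_model, ckpt_path='experiments/redd/best_acc_model.pth'):
    # Rebuild the network with the same hyper-parameters used for training;
    # 'build_model' is a hypothetical stand-in for however the model is
    # constructed in train.py.
    model = build_model()
    # Load the saved state dict onto CPU; move to GPU afterwards if desired.
    state_dict = torch.load(ckpt_path, map_location='cpu')
    model.load_state_dict(state_dict)
    model.eval()  # disable dropout for deterministic evaluation
    return model
```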
Our models are trained for 100 and 20 epochs respectively for appliances from the REDD and UK-DALE datasets; all other parameters can be found in 'train.py' and 'utils.py'.
Please cite the following paper if you use our methods in your research:
@inproceedings{yue2020bert4nilm,
title={BERT4NILM: A Bidirectional Transformer Model for Non-Intrusive Load Monitoring},
author={Yue, Zhenrui and Witzig, Camilo Requena and Jorde, Daniel and Jacobsen, Hans-Arno},
booktitle={Proceedings of the 5th International Workshop on Non-Intrusive Load Monitoring},
pages={89--93},
year={2020}
}
During the implementation, we based our code mostly on BERT-pytorch by Junseong Kim; we were also inspired by the BERT4Rec implementation by Jaewon Chung and Transformers from Hugging Face. Many thanks to these authors for their great work!