Commit

2023-3-13 update
zhgqcn committed Mar 13, 2023
0 parents commit 0b8675f
Showing 41 changed files with 425 additions and 0 deletions.
328 changes: 328 additions & 0 deletions README.md

Large diffs are not rendered by default.

Empty file added federal-based.md
Binary file added img-tf/ELTransformer.png
Binary file added img-tf/infocus.png
Binary file added img-tf/midformer.png
Binary file added img-wo/aae.png
Binary file added img-wo/mul-stff.png
Binary file added img-wo/semisupervised_multilabel.png
Binary file added img-wo/stmtam.png
Binary file added img/BERT4NILM.png
Binary file added img/DeepDFML.png
Binary file added img/ELECTRIcity.png
Binary file added img/Edge-NILM-1.png
Binary file added img/Edge-NILM-2.png
Binary file added img/FL-NILIM-survey.png
Binary file added img/Fryze-Current.png
Binary file added img/GAN-NILM.png
Binary file added img/L2L.png
Binary file added img/Seq2Point.png
Binary file added img/Short-Seq2Point.png
Binary file added img/Subtask-NILM.png
Binary file added img/TP-NILM.png
Binary file added img/TTRNet.png
Binary file added img/TransferNILM.png
Binary file added img/Unet-NILM.png
Binary file added img/VAE-NILM.png
Binary file added img/WRG-nilm.png
Binary file added img/WaveNILM.png
Binary file added img/Weak-NILM.png
Binary file added img/attention-NILM.png
Binary file added img/eeRIS-NILM.png
Binary file added img/fast-seq2point.png
Binary file added img/fedgbm.png
Binary file added img/image-nilm.png
Binary file added img/neural-nilm.png
Binary file added img/nilm-threshold-1.png
Binary file added img/nilm-threshold-2.png
Binary file added img/nilm-threshold.png
Binary file added img/online-multi-nilm.png
33 changes: 33 additions & 0 deletions paper-without-code.md
@@ -0,0 +1,33 @@
# Deep Learning-Based Probabilistic Autoencoder for Residential Energy Disaggregation: An Adversarial Approach

> In this article, a new energy disaggregation approach based on the adversarial autoencoder (AAE) is proposed to create a generative model and enhance generalization capacity. The proposed method has a probabilistic structure to handle uncertainties in unseen data. By regularizing the latent space from a deterministic structure toward a Gaussian prior distribution, the AAE's decoder becomes a generative model. [[PDF](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9712173)] [2022]
<p align='center'>
<img title="" src="./img-wo/aae.png" alt="" width="800" data-align="center">
</p>
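
A minimal PyTorch sketch of the adversarial-autoencoder idea above: an encoder/decoder pair is trained for reconstruction while a discriminator pulls the latent codes toward a Gaussian prior. The paper publishes no code, so every size and name here (`WIN`, `LATENT`, layer widths) is an assumption.

```python
import torch
import torch.nn as nn

WIN, LATENT = 256, 32  # window length and latent size: assumed values

encoder = nn.Sequential(nn.Linear(WIN, 128), nn.ReLU(), nn.Linear(128, LATENT))
decoder = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(), nn.Linear(128, WIN))
discriminator = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(), nn.Linear(64, 1))

bce = nn.BCEWithLogitsLoss()

x = torch.randn(8, WIN)  # dummy batch of aggregate-power windows
z = encoder(x)

# Reconstruction phase: the decoder learns to reproduce the target signal.
recon_loss = nn.functional.mse_loss(decoder(z), x)

# Regularization phase: the discriminator pushes q(z) toward the N(0, I)
# prior; this is what turns the decoder into a generative model.
z_prior = torch.randn_like(z)
d_loss = bce(discriminator(z_prior), torch.ones(8, 1)) + \
         bce(discriminator(z.detach()), torch.zeros(8, 1))
g_loss = bce(discriminator(z), torch.ones(8, 1))
```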


# Multichannel Spatio-Temporal Feature Fusion Method for NILM

> In this article, a multichannel spatio-temporal feature fusion method is proposed, in which the spatial features extracted by a convolutional neural network are fused with the temporal features extracted by a recurrent neural network, and an attention module is introduced to further improve the model's performance. [PDF] [2022]
<p align='center'>
<img title="" src="./img-wo/mul-stff.png" alt="" width="1000" data-align="center">
</p>
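
A rough sketch of the fusion idea, assuming a convolutional branch for spatial features, a bidirectional GRU for temporal features, and one attention layer over their concatenation; the paper's actual layer sizes are not public.

```python
import torch
import torch.nn as nn

class SpatioTemporalFusion(nn.Module):
    """Illustrative fusion module; all dimensions are assumptions."""
    def __init__(self, hidden=64):
        super().__init__()
        self.cnn = nn.Conv1d(1, hidden, kernel_size=5, padding=2)           # spatial
        self.rnn = nn.GRU(1, hidden, batch_first=True, bidirectional=True)  # temporal
        self.attn = nn.MultiheadAttention(hidden * 3, num_heads=1, batch_first=True)
        self.head = nn.Linear(hidden * 3, 1)

    def forward(self, x):                             # x: (batch, time)
        c = self.cnn(x.unsqueeze(1)).transpose(1, 2)  # (batch, time, hidden)
        r, _ = self.rnn(x.unsqueeze(-1))              # (batch, time, 2 * hidden)
        f = torch.cat([c, r], dim=-1)                 # multichannel fusion
        f, _ = self.attn(f, f, f)                     # attention reweights the fusion
        return self.head(f).squeeze(-1)               # per-timestep estimate

y = SpatioTemporalFusion()(torch.randn(4, 128))       # -> (4, 128)
```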

# A Self-training Multi-task Attention Method for NILM

> In this paper, a self-training multi-task learning model is proposed. In the model, a parallel structure is used to handle two different tasks, and the outputs of the two branches are directly combined as the final output. The model needs only one loss function and is trained only once. In addition, we also introduce an attention mechanism into the proposed model. [[PDF](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9858445)] [2022]
<p align='center'>
<img title="" src="./img-wo/stmtam.png" alt="" width="1000" data-align="center">
</p>
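
A hedged sketch of the parallel two-branch structure: a power-regression branch and an on/off branch are combined multiplicatively into one output, so a single loss trains both tasks in one pass. Layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TwoBranchNILM(nn.Module):
    """Parallel two-task head; dimensions are assumptions."""
    def __init__(self, hidden=64):
        super().__init__()
        self.backbone = nn.Conv1d(1, hidden, kernel_size=5, padding=2)
        self.power = nn.Conv1d(hidden, 1, kernel_size=1)  # regression branch
        self.state = nn.Conv1d(hidden, 1, kernel_size=1)  # on/off branch

    def forward(self, x):                                 # x: (batch, time)
        h = torch.relu(self.backbone(x.unsqueeze(1)))
        # Multiplying the branches yields one combined output, so one
        # regression loss suffices and the model is trained only once.
        return (self.power(h) * torch.sigmoid(self.state(h))).squeeze(1)

model = TwoBranchNILM()
loss = nn.functional.mse_loss(model(torch.randn(4, 128)), torch.zeros(4, 128))
```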



# Semisupervised Multilabel Deep Learning Based Nonintrusive Load Monitoring in Smart Grids

> In this article, a new semisupervised multilabel deep learning based framework is proposed to mitigate the reliance on large labeled datasets. Specifically, a temporal convolutional neural network is used to automatically extract high-level load signatures for individual appliances. These signatures can be used efficiently to improve the feature representation capability of the framework. [[PDF](https://ieeexplore.ieee.org/document/8911216)] [2019]
<p align='center'>
<img title="" src="./img-wo/semisupervised_multilabel.png" alt="" width="1000" data-align="center">
</p>
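
A sketch of the supervised part under stated assumptions (a small dilated temporal CNN, an arbitrary `N_APPLIANCES`); the pseudo-labeling step that makes the framework semisupervised is only indicated in a comment.

```python
import torch
import torch.nn as nn

N_APPLIANCES = 5  # assumption; depends on the dataset

# Dilated temporal convolutions extract high-level load signatures.
tcn = nn.Sequential(
    nn.Conv1d(1, 32, 3, padding=1, dilation=1), nn.ReLU(),
    nn.Conv1d(32, 32, 3, padding=2, dilation=2), nn.ReLU(),
    nn.Conv1d(32, 32, 3, padding=4, dilation=4), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, N_APPLIANCES),               # one logit per appliance
)

x = torch.randn(8, 1, 256)                     # aggregate windows
logits = tcn(x)
labels = torch.randint(0, 2, (8, N_APPLIANCES)).float()
supervised = nn.functional.binary_cross_entropy_with_logits(logits, labels)
# For unlabeled windows, confident predictions can serve as pseudo-labels
# (the semisupervised step); the paper's exact scheme is not reproduced here.
```
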
64 changes: 64 additions & 0 deletions transformer-based.md
@@ -0,0 +1,64 @@
# InFocus: Amplifying Critical Feature Influence on Non-Intrusive Load Monitoring through Self-Attention Mechanisms

> However, global features, especially the dependency correlations between different positions in a sequence, cannot be properly acquired by convolutional layers alone. Accordingly, we devise a novel model that incorporates an added attention layer to overcome this limitation. The added self-attention mechanism automatically assigns attention scores/weights to the features output by the convolutional layers, amplifying the positive influence of critical knowledge while providing a global reference. Moreover, the model can explicitly extract an appliance's multi-state information, which makes it more interpretable. We further improve the model by substituting a lightweight self-attention mechanism for the added one, which decreases the number of model parameters while maintaining disaggregation accuracy. [[PDF](https://ieeexplore.ieee.org/abstract/document/10016661)] [2023]
<p align='center'>
<img title="" src="./img-tf/infocus.png" alt="" width="800" data-align="center">
</p>
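
A minimal sketch of attaching self-attention to convolutional features, as the abstract describes; window length and channel counts are assumptions.

```python
import torch
import torch.nn as nn

conv = nn.Conv1d(1, 64, kernel_size=5, padding=2)
attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)

x = torch.randn(2, 1, 480)                # aggregate windows (sizes assumed)
feats = conv(x).transpose(1, 2)           # (batch, time, 64) conv features
out, scores = attn(feats, feats, feats)   # attention weights every position
                                          # against every other, supplying the
                                          # global reference a plain CNN lacks
```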



# Efficient Localness Transformer for Smart Sensor-Based Energy Disaggregation

> In this work, we propose an efficient localness transformer for non-intrusive load monitoring (ELTransformer). Specifically, we leverage normalization functions and switch the order of matrix multiplication to approximate self-attention and reduce computational complexity. Additionally, we introduce localness modeling with sparse local attention heads and relative position encodings to enhance the model's capacity to extract short-term local patterns. To the best of our knowledge, ELTransformer is the first NILM model to address both computational complexity and localness modeling. [[PDF](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9881740)] [2022]
<p align='center'>
<img title="" src="./img-tf/ELTransformer.png" alt="" width="800" data-align="center">
</p>
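
The normalization-then-reorder trick the abstract describes can be sketched as follows (the paper's exact normalization functions may differ): computing `K^T V` first makes the cost linear in sequence length.

```python
import torch

def linear_attention(q, k, v):
    """O(n * d^2) approximation of softmax attention, not O(n^2 * d).

    Sketch of the reordering idea only; ELTransformer's exact
    normalization functions may differ.
    """
    q = q.softmax(dim=-1)                # normalize queries over features
    k = k.softmax(dim=-2)                # normalize keys over the sequence
    context = k.transpose(-2, -1) @ v    # (d, d) computed first: linear in n
    return q @ context                   # (n, d)

n, d = 1024, 64
out = linear_attention(torch.randn(n, d), torch.randn(n, d), torch.randn(n, d))
```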



# Transformer for Nonintrusive Load Monitoring: Complexity Reduction and Transferability

> Existing models are limited by high computational complexity, dependency on data, and poor transferability. In this article, we propose a middle-window transformer model, termed Midformer, for NILM. In Midformer, we first exploit patchwise embedding to shorten the input length, and then reduce the number of queries in the attention layer by applying global attention only at a few selected input locations at the center of the window to capture the global context. The cyclically shifted window technique is used to preserve connections across patches. We also follow the pretraining and fine-tuning paradigm to relieve the dependency on data, reduce the computation in model training, and enhance the transferability of the model to unknown tasks and domains. [[PDF](https://ieeexplore.ieee.org/abstract/document/9745140)] [2022]
<p align='center'>
<img title="" src="./img-tf/midformer.png" alt="" width="800" data-align="center">
</p>
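
A sketch of the two mechanisms named above, patchwise embedding to shorten the input and global attention issued only from a few center queries; patch size, dimensions, and the choice of center indices are all assumptions.

```python
import torch
import torch.nn as nn

PATCH, DIM = 16, 64  # assumed sizes

patch_embed = nn.Conv1d(1, DIM, kernel_size=PATCH, stride=PATCH)  # shortens input
attn = nn.MultiheadAttention(DIM, num_heads=4, batch_first=True)

x = torch.randn(2, 1, 512)                    # aggregate window
tokens = patch_embed(x).transpose(1, 2)       # (2, 32, DIM): 512 steps -> 32 tokens
center = tokens[:, 14:18]                     # a few queries at the window center
global_ctx, _ = attn(center, tokens, tokens)  # global attention with a reduced
                                              # query set, cutting the cost
```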



# ELECTRIcity: An Efficient Transformer for Non-Intrusive Load Monitoring

> ELECTRIcity utilizes transformer layers to accurately estimate the power signal of domestic appliances, relying entirely on attention mechanisms to extract global dependencies between the aggregate and the individual appliance signals. [[PDF](https://www.mdpi.com/1424-8220/22/8/2926)] [[Pytorch](https://github.com/ssykiotis/ELECTRIcity_NILM)] [2022]
<p align='center'>
<img title="" src="./img/ELECTRIcity.png" alt="" width="600" data-align="center">
</p>
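
A generic transformer-encoder disaggregator in the spirit of the paper, assuming linear input/output projections; the linked repository contains the authors' actual architecture and training routine.

```python
import torch
import torch.nn as nn

DIM = 64  # assumed model width
embed = nn.Linear(1, DIM)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(DIM, 1)

agg = torch.randn(2, 480, 1)           # aggregate power sequences
appliance = head(encoder(embed(agg)))  # per-timestep appliance estimate
```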



# Deep Learning-Based Non-Intrusive Commercial Load Monitoring

> The key elements of the method are a new neural network structure called TTRNet and a new loss function called MLFL. TTRNet is a multi-label classification model that can autonomously learn correlation information through its unique network structure. MLFL is a loss function specifically designed for multi-label classification tasks, which solves the imbalance problem and improves the monitoring accuracy for challenging loads. [[PDF](https://www.researchgate.net/publication/361988541_Deep_Learning-Based_Non-Intrusive_Commercial_Load_Monitoring/figures?lo=1)] [[Pytorch](https://github.com/shaoshuai6666/TTRNet)] [2022]
<p align='center'>
<img title="" src="./img/TTRNet.png" alt="" width="800" data-align="center">
</p>
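
MLFL itself is defined in the authors' repository; as a stand-in, a focal-style multilabel loss illustrates the general mechanism for the imbalance problem: easy labels are down-weighted so rare, hard-to-monitor loads dominate the gradient.

```python
import torch

def focal_multilabel_loss(logits, targets, gamma=2.0):
    """Focal-style multilabel loss; illustrates the imbalance idea only,
    not the authors' exact MLFL (see their repository)."""
    p = torch.sigmoid(logits)
    pt = torch.where(targets == 1, p, 1 - p)   # probability of the true label
    return (-(1 - pt) ** gamma * torch.log(pt.clamp_min(1e-8))).mean()

loss = focal_multilabel_loss(torch.randn(4, 10),
                             torch.randint(0, 2, (4, 10)).float())
```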



# BERT4NILM: A Bidirectional Transformer Model for Non-Intrusive Load Monitoring

> We propose BERT4NILM, an architecture based on bidirectional encoder representations from transformers (BERT) and an improved objective function designed specifically for NILM learning. We adapt the bidirectional transformer architecture to the field of energy disaggregation and follow the pattern of sequence-to-sequence learning. [[PDF](https://dl.acm.org/doi/10.1145/3427771.3429390)] [[Pytorch](https://github.com/Yueeeeeeee/BERT4NILM)] [2020]
<p align='center'>
<img title="" src="./img/BERT4NILM.png" alt="" width="800" data-align="center">
</p>
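
A sketch of BERT-style masked training on load windows, assuming random masking and an MSE objective at the masked positions only; the authors' improved objective function differs (see their repository).

```python
import torch
import torch.nn as nn

enc = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True),
    num_layers=1,
)
embed, head = nn.Linear(1, 32), nn.Linear(32, 1)

agg = torch.randn(4, 128, 1)              # aggregate windows
target = torch.randn(4, 128, 1)           # appliance ground truth (dummy)
mask = torch.rand(4, 128) < 0.25          # randomly mask 25% of the timesteps
masked = agg.masked_fill(mask.unsqueeze(-1), 0.0)

pred = head(enc(embed(masked)))
# Bidirectional context reconstructs appliance power at the masked steps.
loss = nn.functional.mse_loss(pred[mask], target[mask])
```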




