Commit

fixed max sequence length
c3363046 committed Sep 28, 2021
2 parents 93b81d3 + 07dad59 commit d5917e3
Showing 2 changed files with 23 additions and 7 deletions.
27 changes: 22 additions & 5 deletions README.md
@@ -1,23 +1,27 @@
# NeuralLog
Repository for the paper: Log-based Anomaly Detection Without Log Parsing.

This repository is being refactored: 80% done.

**Abstract**: Software systems often record important runtime information in system logs for troubleshooting purposes. There have been many studies that use log data to construct machine learning models for detecting system anomalies. Through our empirical study, we find that existing log-based anomaly detection approaches are significantly affected by log parsing errors that are introduced by 1) OOV (out-of-vocabulary) words, and 2) semantic misunderstandings. The log parsing errors could cause the loss of important information for anomaly detection. To address the limitations of existing methods, we propose NeuralLog, a novel log-based anomaly detection approach that does not require log parsing. NeuralLog extracts the semantic meaning of raw log messages and represents them as semantic vectors. These representation vectors are then used to detect anomalies through a Transformer-based classification model, which can capture the contextual information from log sequences. Our experimental results show that the proposed approach can effectively understand the semantic meaning of log messages and achieve accurate anomaly detection results. Overall, NeuralLog achieves F1-scores greater than 0.95 on four public datasets, outperforming the existing approaches.

## Framework
<p align="center"><img src="https://i.ibb.co/3C23jkb/framework.jpg" width="502"><br>An overview of NeuralLog</p>

NeuralLog consists of the following components:
1. **Preprocessing**: Special characters and numbers are removed from log messages.
2. **Neural Representation**: Semantic vectors are extracted from log messages using BERT.
3. **Transformer-based Classification**: A transformer-based classification model containing Positional Encoding and Transformer Encoder is applied to detect anomalies.
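The preprocessing step (removing special characters and numbers) can be illustrated with a small regex-based sketch. This is an illustrative approximation, not necessarily the repository's exact preprocessing code:

```python
import re

def preprocess(log_line):
    """Keep only alphabetic word tokens from a raw log message,
    dropping numbers and special characters (a sketch of step 1)."""
    # Split on any run of non-alphabetic characters, then drop empty tokens
    tokens = re.split(r'[^A-Za-z]+', log_line)
    return ' '.join(t for t in tokens if t)

print(preprocess("081109 203615 148 INFO dfs.DataNode$PacketResponder: "
                 "Received block blk_-1608999687919862906"))
# → INFO dfs DataNode PacketResponder Received block blk
```

The cleaned token sequence is what gets fed to BERT in step 2.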

## Requirements
1. Python 3.6+
2. tensorflow 2.4
3. transformers
4. tf-models-official 2.4.0
5. scikit-learn
6. pandas
7. numpy
8. gensim
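The requirements above can be installed with pip; the version pins below follow the list (any pins beyond those stated are assumptions):

```shell
pip install "tensorflow==2.4.*" transformers tf-models-official==2.4.0 scikit-learn pandas numpy gensim
```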
## Demo
- Extract Semantic Vectors

@@ -37,4 +41,17 @@ See [notebook](demo/Transformer_based_Classification.ipynb)
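The classifier combines Positional Encoding with a Transformer Encoder. A minimal NumPy sketch of the standard sinusoidal positional encoding (the exact dimensions and configuration used in the notebook may differ):

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Standard sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))"""
    pos = np.arange(max_len)[:, None]    # (max_len, 1)
    i = np.arange(d_model)[None, :]      # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])  # even dimensions
    pe[:, 1::2] = np.cos(angle[:, 1::2])  # odd dimensions
    return pe

# e.g. a window of 20 log messages, each a 768-d BERT vector:
pe = positional_encoding(max_len=20, d_model=768)
# pe is added to the sequence of semantic vectors before the encoder.
```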
## Data and Models
Datasets and pre-trained models can be found here: [Data](https://figshare.com/s/6d3c6a83f4828d17be79)
## Results
| Dataset | Metrics | LR | SVM | IM | LogRobust | Log2Vec | NeuralLog |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| | Precision | 0.99 | 0.99 | **1.00** | 0.98 | 0.94 | 0.96 |
| HDFS | Recall | 0.92 | 0.94 | 0.88 | **1.00** | 0.94 | **1.00** |
| | F1-score | 0.96 | 0.96 | 0.94 | **0.99** | 0.94 | 0.98 |
| | Precision | 0.13 | 0.97 | 0.13 | 0.62 | 0.80 | **0.98** |
| BGL | Recall | 0.93 | 0.30 | 0.30 | 0.96 | **0.98** | **0.98** |
| | F1-score | 0.23 | 0.46 | 0.18 | 0.75 | 0.88 | **0.98** |
| | Precision | 0.46 | 0.34 | - | 0.61 | 0.74 | **0.93** |
| Thunderbird | Recall | 0.91 | 0.91 | - | 0.78 | 0.94 | **1.00** |
| | F1-score | 0.61 | 0.50 | - | 0.68 | 0.84 | **0.96** |
| | Precision | 0.89 | 0.88 | - | 0.97 | 0.91 | **0.98** |
| Spirit | Recall | 0.96 | **1.00** | - | 0.94 | 0.96 | 0.96 |
| | F1-score | 0.92 | 0.93 | - | 0.95 | 0.95 | **0.97** |
3 changes: 1 addition & 2 deletions neurallog/data_loader.py
@@ -304,7 +304,6 @@ def load_Supercomputers(log_file, train_ratio=0.5, windows_size=20, step_size=0,
with open(log_file, mode="r", encoding='utf8') as f:
logs = f.readlines()
logs = [x.strip() for x in logs]
- logs = logs[:1000000]
try:
with open(e_name, mode='rb') as f:
E = pickle.load(f)
@@ -420,7 +419,7 @@ def load_Supercomputers(log_file, train_ratio=0.5, windows_size=20, step_size=0,
# with open("../data/embeddings/BGL/iforest-test.pkl", mode="wb") as f:
# pickle.dump((x_te, y_te), f, protocol=pickle.HIGHEST_PROTOCOL)

- (x_tr, y_tr), (x_te, y_te) = load(
+ (x_tr, y_tr), (x_te, y_te) = load_HDFS(
"../data/raw/HDFS/HDFS.log", "../data/raw/HDFS/anomaly_label.csv", train_ratio=0.8, split_type='sequential')
#
# with open("./data/embeddings/BGL/neural-train.pkl", mode="wb") as f:
