
NeuralLog

Repository for the paper: Log-based Anomaly Detection Without Log Parsing.

Abstract: Software systems often record important runtime information in system logs for troubleshooting purposes. There have been many studies that use log data to construct machine learning models for detecting system anomalies. Through our empirical study, we find that existing log-based anomaly detection approaches are significantly affected by log parsing errors that are introduced by 1) OOV (out-of-vocabulary) words, and 2) semantic misunderstandings. The log parsing errors could cause the loss of important information for anomaly detection. To address the limitations of existing methods, we propose NeuralLog, a novel log-based anomaly detection approach that does not require log parsing. NeuralLog extracts the semantic meaning of raw log messages and represents them as semantic vectors. These representation vectors are then used to detect anomalies through a Transformer-based classification model, which can capture the contextual information from log sequences. Our experimental results show that the proposed approach can effectively understand the semantic meaning of log messages and achieve accurate anomaly detection results. Overall, NeuralLog achieves F1-scores greater than 0.95 on four public datasets, outperforming the existing approaches.

Framework


An overview of NeuralLog

NeuralLog consists of the following components:

  1. Preprocessing: Special characters and numbers are removed from log messages.
  2. Neural Representation: Semantic vectors are extracted from log messages using BERT.
  3. Transformer-based Classification: A transformer-based classification model containing Positional Encoding and Transformer Encoder is applied to detect anomalies.
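The preprocessing step (component 1 above) can be sketched as a simple regex pass over each raw log message; this is an illustrative sketch, not the repository's exact implementation:

```python
import re

def preprocess(log_message: str) -> str:
    """Remove special characters and numbers from a raw log message,
    keeping only alphabetic tokens (illustrative sketch)."""
    # Replace anything that is not a letter or whitespace with a space
    cleaned = re.sub(r"[^a-zA-Z\s]", " ", log_message)
    # Collapse repeated whitespace into single spaces
    return " ".join(cleaned.split())

print(preprocess("instruction cache parity error corrected, code=0x1A"))
```

The cleaned message is what gets fed to BERT in the next step, so rare tokens such as hexadecimal codes no longer produce OOV words.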

Requirements

  1. Python 3.5 - 3.8
  2. tensorflow 2.4
  3. transformers
  4. tf-models-official 2.4.0
  5. scikit-learn
  6. pandas
  7. numpy
  8. gensim

Demo

  • Extract Semantic Vectors
from neurallog import data_loader

log_file = "../data/raw/BGL.log"
emb_dir = "../data/embeddings/BGL"

(x_tr, y_tr), (x_te, y_te) = data_loader.load_Supercomputers(
    log_file, train_ratio=0.8, windows_size=20,
    step_size=5, e_type='bert')
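The loader groups consecutive log messages into fixed-size sliding windows (window size 20, step size 5 above). The windowing logic can be sketched as follows — an illustrative sketch, not the repository's exact code, assuming a window is labeled anomalous if any message inside it is anomalous:

```python
def sliding_windows(messages, labels, window_size=20, step_size=5):
    """Split a log stream into overlapping windows.

    Each window of `window_size` consecutive messages becomes one sample;
    the window label is 1 (anomalous) if any message inside it is
    labeled anomalous, else 0.
    """
    x, y = [], []
    for start in range(0, len(messages) - window_size + 1, step_size):
        x.append(messages[start:start + window_size])
        y.append(int(any(labels[start:start + window_size])))
    return x, y
```

With a step size smaller than the window size, consecutive windows overlap, which multiplies the number of training samples.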
  • Train/Test Transformer Model

See notebook
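The classification model combines the BERT semantic vectors with positional encodings before the Transformer encoder. Below is a sketch of the standard sinusoidal positional encoding from the original Transformer formulation; the repository's implementation may differ in details:

```python
import numpy as np

def positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding: sin on even dimensions, cos on
    odd dimensions, with geometrically increasing wavelengths."""
    positions = np.arange(max_len)[:, None]   # shape (max_len, 1)
    dims = np.arange(d_model)[None, :]        # shape (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates          # shape (max_len, d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

# The encoding is added to the sequence of semantic vectors so the
# Transformer encoder can distinguish message positions in a window.
```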

  • Full demo on the BGL dataset
$ pip install -r requirements.txt
$ wget https://zenodo.org/record/3227177/files/BGL.tar.gz && tar -xvzf BGL.tar.gz
$ mkdir logs && mv BGL.log logs/.
$ cd demo
$ python NeuralLog.py

Data and Models

Datasets and pre-trained models can be found here: Data

Results

| Dataset     | Metric    | LR   | SVM  | IM   | LogRobust | Log2Vec | NeuralLog |
|-------------|-----------|------|------|------|-----------|---------|-----------|
| HDFS        | Precision | 0.99 | 0.99 | 1.00 | 0.98      | 0.94    | 0.96      |
|             | Recall    | 0.92 | 0.94 | 0.88 | 1.00      | 0.94    | 1.00      |
|             | F1-score  | 0.96 | 0.96 | 0.94 | 0.99      | 0.94    | 0.98      |
| BGL         | Precision | 0.13 | 0.97 | 0.13 | 0.62      | 0.80    | 0.98      |
|             | Recall    | 0.93 | 0.30 | 0.30 | 0.96      | 0.98    | 0.98      |
|             | F1-score  | 0.23 | 0.46 | 0.18 | 0.75      | 0.88    | 0.98      |
| Thunderbird | Precision | 0.46 | 0.34 | -    | 0.61      | 0.74    | 0.93      |
|             | Recall    | 0.91 | 0.91 | -    | 0.78      | 0.94    | 1.00      |
|             | F1-score  | 0.61 | 0.50 | -    | 0.68      | 0.84    | 0.96      |
| Spirit      | Precision | 0.89 | 0.88 | -    | 0.97      | 0.91    | 0.98      |
|             | Recall    | 0.96 | 1.00 | -    | 0.94      | 0.96    | 0.96      |
|             | F1-score  | 0.92 | 0.93 | -    | 0.95      | 0.95    | 0.97      |

Citation

If you find the code and models useful for your research, please cite the following paper:

@inproceedings{le2021log,
  title={Log-based anomaly detection without log parsing},
  author={Le, Van-Hoang and Zhang, Hongyu},
  booktitle={2021 36th IEEE/ACM International Conference on Automated Software Engineering (ASE)},
  pages={492--504},
  year={2021},
  organization={IEEE}
}

neurallog's People

Contributors

vanhoanglepsa

neurallog's Issues

Cannot reproduce the time efficiency of NeuralLog stated in the paper, taking more than 2 hours for only 4 epochs

Dear author of NeuralLog, we tried to reproduce the time efficiency of NeuralLog on the BGL dataset using the recommended default hyper-parameters from the paper.
The paper sets the window size to 20 and the step size to 1, which makes the training data very large.
It takes more than two hours to encode the logs and train the model, while the paper "Log-based Anomaly Detection Without Log Parsing" claims it takes 20 minutes.
Could you give us some suggestions for reproducing the experiments? Our machine has an i9-12900K CPU and an RTX 3090 GPU. Following the GitHub repository, we use batch_size = 256 and epoch = 4.
Thanks for your kindness.

How to detect anomalies if we don't have labels

This is one of the best research works I have found. I have logs without labels, and I wanted to use these semantic vectors to detect anomalies. I looked at the neurallog/data_loader.py file: in the load_HDFS function, if the label is None, you create a data frame containing the 'BlockId', 'EventSequence', and 'TimeSequence' columns. After this, how do I proceed with log anomaly detection (specifically when labels are not available)? Thank you.

What does this "balance" function do?

def balancing(x, y):
    print(y.count(0), y.count(1))
    if y.count(0) > y.count(1):
        pos_idx = [i for i, l in enumerate(y) if l == 1]
        neg_idx = [i for i, l in enumerate(y) if l == 0]
        pos_idx = shuffle(pos_idx)
        neg_idx = shuffle(neg_idx)
        neg_idx = neg_idx[:len(pos_idx) * 5]
        check_ids = [False] * len(x)
        for idx in pos_idx:
            check_ids[idx] = True
        for idx in neg_idx:
            check_ids[idx] = True
        x = [s for i, s in enumerate(x) if check_ids[i]]
        y = [s for i, s in enumerate(y) if check_ids[i]]

And

for idx in pos_idx:
    check_ids[idx] = True
for idx in neg_idx:
    check_ids[idx] = True

Both "for" loops make check_ids[] entries equal to True, is that right?
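For reference, the quoted function keeps all positive (anomalous) samples and downsamples negatives to at most five times the number of positives when negatives dominate. A minimal self-contained re-sketch of that behavior, using random.sample in place of sklearn's shuffle and adding the return that the quote truncates:

```python
import random

def balancing(x, y, ratio=5):
    """Keep all positive samples and at most `ratio` times as many
    randomly chosen negatives (sketch of the quoted function)."""
    pos_idx = [i for i, label in enumerate(y) if label == 1]
    neg_idx = [i for i, label in enumerate(y) if label == 0]
    if len(neg_idx) > len(pos_idx):
        # Randomly keep at most ratio * |positives| negative samples
        neg_idx = random.sample(neg_idx, min(len(neg_idx), len(pos_idx) * ratio))
    keep = set(pos_idx) | set(neg_idx)
    x = [s for i, s in enumerate(x) if i in keep]
    y = [label for i, label in enumerate(y) if i in keep]
    return x, y
```

So yes: both loops simply mark the selected positive and negative indices as kept; the two loops together build the keep-mask.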

Provided trained models can't be loaded

Hello,

Could you let me know how to load the pre-trained models you provided, using Keras (as you did)? When I try to load a model, I get the following error:

from tensorflow.keras.models import load_model
model = load_model('demo/hdfs_transformer.hdf5') 

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3553, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-45-62ca9a14609d>", line 1, in <module>
    model = keras.models.load_model('demo/hdfs_transformer.hdf5')
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/keras/saving/save.py", line 207, in load_model
    compile)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/tensorflow/python/keras/saving/hdf5_format.py", line 181, in load_model_from_hdf5
    raise ValueError('No model found in config file.')
ValueError: No model found in config file.

what is the label?

It's not clear what the label is. Is it self-supervised?
I am looking for a model that predicts the next line of a log.

How to process ThunderBird data

Hello, I ran into a problem when using your dataloader on the Thunderbird data (data/raw/Thunderbird10M.log): f.readlines() fails with the error UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa8 in position 5029: invalid start byte. I tried changing the mode from 'r' to 'rb', but that didn't work. Could you please provide the proper dataloader for the Thunderbird data? Thanks a lot!
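A common workaround for such decode failures (not from the repository itself) is to open the file with an explicit error handler, so undecodable bytes are replaced rather than raising an exception:

```python
import tempfile

def read_log_lines(path):
    # errors="replace" substitutes U+FFFD (the replacement character) for
    # undecodable bytes instead of raising UnicodeDecodeError;
    # errors="ignore" would silently drop them instead
    with open(path, encoding="utf-8", errors="replace") as f:
        return f.readlines()

# Demo: a log file containing the invalid start byte 0xa8
with tempfile.NamedTemporaryFile(suffix=".log", delete=False) as tmp:
    tmp.write(b"valid line\n\xa8 corrupted line\n")
lines = read_log_lines(tmp.name)
```

Whether replacing or ignoring the bad bytes is acceptable depends on whether those bytes carry information the model needs; for NeuralLog's preprocessing, which strips non-alphabetic characters anyway, replacement is usually harmless.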
