Learning Fast and Slow for Online Time Series Forecasting

This project contains the PyTorch implementation of the following paper from Salesforce Research Asia:

Title: Learning Fast and Slow for Online Time Series Forecasting

Authors: Quang Pham, Chenghao Liu, Doyen Sahoo, and Steven Hoi

Introduction

Learning Fast and Slow for Online Time Series Forecasting introduces FSNet to forecast time series on the fly. FSNet augments a standard deep neural network backbone (a TCN in this repo) with the ability to adapt quickly to abrupt changes while also handling recurring patterns in the time series. Specifically, FSNet improves the slowly learned backbone by dynamically balancing fast adaptation to recent changes against the retrieval of similar old knowledge. It achieves this through the interaction of two complementary components: an adapter that monitors each layer's contribution to the loss, and an associative memory that supports remembering, updating, and recalling repeating events.
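
For intuition only, below is a minimal, hypothetical PyTorch sketch of the two components described above: a per-layer adapter driven by an exponential moving average (EMA) of recent gradients, and a small associative memory queried with that EMA. All names, shapes, and hyperparameters are illustrative assumptions and do not mirror the repository's actual modules.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdapterWithMemory(nn.Module):
    """Hypothetical sketch: fast adaptation (gradient EMA) + slow recall (associative memory)."""
    def __init__(self, grad_dim, n_slots=32, gamma=0.9):
        super().__init__()
        self.gamma = gamma  # EMA decay controlling how quickly the adapter reacts
        self.register_buffer("grad_ema", torch.zeros(grad_dim))
        self.memory = nn.Parameter(0.01 * torch.randn(n_slots, grad_dim))       # associative memory slots
        self.to_coeff = nn.Linear(grad_dim, grad_dim)                           # maps combined signal to adaptation coefficients

    def forward(self, layer_grad):
        # Fast component: track the layer's recent contribution to the loss via an EMA of its gradient.
        self.grad_ema = self.gamma * self.grad_ema + (1 - self.gamma) * layer_grad.detach()
        # Slow component: recall memory slots similar to the current EMA (repeating patterns).
        sim = F.cosine_similarity(self.memory, self.grad_ema.unsqueeze(0), dim=-1)   # (n_slots,)
        recalled = (F.softmax(sim, dim=0).unsqueeze(1) * self.memory).sum(dim=0)     # (grad_dim,)
        # Combine the fresh signal with the recalled knowledge to produce per-layer adaptation coefficients.
        return self.to_coeff(self.grad_ema + recalled)

In FSNet proper, such coefficients are used to adapt each backbone layer's weights and features, and the memory read/write rules are more involved; this sketch only conveys the fast/slow interplay at a high level (see the paper for the exact formulation).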

Figure: FSNet architecture.

Requirements

  • python == 3.7.3
  • pytorch == 1.8.0
  • matplotlib == 3.1.1
  • numpy == 1.19.4
  • pandas == 0.25.1
  • scikit_learn == 0.21.3
  • tqdm == 4.62.3
  • einops == 0.4.0
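
Assuming a Python 3.7 environment is already available (e.g., via conda), the pinned packages can be installed with pip roughly as follows; note that the pip package names for pytorch and scikit_learn are torch and scikit-learn:

pip install torch==1.8.0 matplotlib==3.1.1 numpy==1.19.4 pandas==0.25.1 scikit-learn==0.21.3 tqdm==4.62.3 einops==0.4.0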

Benchmarking

1. Data preparation

We follow the same data formatting as the Informer repo (https://github.com/zhouhaoyi/Informer2020), which also hosts the raw data. Please put all raw data (csv) files in the ./data folder.
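
As a rough guide, the ./data folder is expected to contain the raw csv files directly, e.g. (the ETT, ECL, and WTH filenames below follow the Informer repo; the Traffic filename should match whatever the data loader expects):

data/
  ETTh1.csv
  ETTh2.csv
  ETTm1.csv
  ETTm2.csv
  ECL.csv
  WTH.csv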

2. Run experiments

To replicate our results on the ETT, ECL, Traffic, and WTH datasets, run

chmod +x scripts/*.sh
bash ./scripts/run.sh

3. Arguments

Method: Our implementation supports the following training strategies:

  • ogd: online gradient descent (OGD) training
  • large: OGD training with a large backbone
  • er: experience replay
  • derpp: dark experience replay
  • nomem: FSNet without the associative memory
  • naive: FSNet without either the memory or the adapter; the adaptation coefficients are trained directly
  • fsnet: the proposed FSNet framework

You can specify one of the above methods via the --method argument.

Dataset: Our implementation currently supports the following datasets: Electricity Transformer Temperature, ETT (including ETTh1, ETTh2, ETTm1, and ETTm2), ECL, Traffic, and WTH. You can specify the dataset via the --data argument.

Other arguments: other useful arguments for experiments are listed below (an example invocation follows the list):

  • --test_bsz: batch size used for testing; must be set to 1 for online learning,
  • --seq_len: length of the look-back window (60 by default),
  • --pred_len: length of the forecast window (set to 1 for online learning).
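
For example, a single online-forecasting run might look like the command below. The entry-point name main.py is an assumption for illustration; use the script actually invoked in scripts/run.sh and adjust the arguments as needed:

python main.py --method fsnet --data ETTh2 --test_bsz 1 --seq_len 60 --pred_len 1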
