
MechanisticProbe



In this paper, we explore how language models (LMs) perform multi-step reasoning tasks: by memorizing answers from their massive pretraining corpora, or by reasoning step by step.


To answer this research question, we propose MechanisticProbe, which detects the reasoning trees inside LMs from their attention patterns. We run analysis experiments on three reasoning tasks:

  1. Probe GPT-2 on the synthetic task: finding the k-th smallest number from a number list;
  2. Probe LLaMA on the synthetic task: ProofWriter;
  3. Probe LLaMA on the real-world reasoning task: AI2 Reasoning Challenge (ARC).
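For reference, the first synthetic task can be stated in a few lines of Python (the function below is purely illustrative and is not part of the repository):

```python
# Illustrative definition of the first synthetic task: given a list of
# numbers and an index k, the model must output the k-th smallest number.
def kth_smallest(numbers, k):
    """Return the k-th smallest element (1-indexed) of `numbers`."""
    return sorted(numbers)[k - 1]

print(kth_smallest([7, 2, 9, 4, 5], 3))  # -> 5
```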

MechanisticProbe is quite simple: it is composed of two kNN classifiers:

  1. The first classifier predicts whether a statement is useful for the reasoning task;
  2. The second classifier detects the reasoning step of each useful statement.
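The two-stage design can be sketched as follows. This is a minimal illustration using scikit-learn, assuming each statement is summarized as a fixed-size vector of attention features; the variable names, feature shapes, and random labels are illustrative assumptions, not the repository's actual API or data.

```python
# Sketch of the two-stage kNN probe (all names and shapes are illustrative).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Assumption: each of 100 statements is summarized by 8 attention features.
attn_features = rng.normal(size=(100, 8))
is_useful = rng.integers(0, 2, size=100)       # stage-1 labels (0/1), random here
reasoning_step = rng.integers(0, 3, size=100)  # stage-2 labels, random here

# Stage 1: predict whether a statement is useful for the reasoning task.
clf_useful = KNeighborsClassifier(n_neighbors=5).fit(attn_features, is_useful)

# Stage 2: for useful statements only, predict their reasoning step.
useful_mask = is_useful == 1
clf_step = KNeighborsClassifier(n_neighbors=5).fit(
    attn_features[useful_mask], reasoning_step[useful_mask]
)

# At probing time: filter with stage 1, then order with stage 2.
pred_useful = clf_useful.predict(attn_features)
pred_step = clf_step.predict(attn_features[pred_useful == 1])
```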


Try MechanisticProbe

1. Prepare the environment

Install necessary libraries:

$ pip install -r requirements.txt

Download the processed dataset folder from the HuggingFace repo.

2. Prepare the model

Run the finetuning script to finetune GPT-2 on the synthetic reasoning task (finding the k-th smallest number):

$ bash run_finetune_gpt2.sh

3. Run the MechanisticProbe

Run our probe model to analyze the finetuned GPT-2 on the synthetic task:

$ bash run_probing_gpt2.sh

Other analysis experiments

We also provide code for other analysis experiments. Users must prepare the datasets as well as the LMs themselves; the paths to the datasets and models should be specified in the scripts.

  1. Attention visualization for the GPT-2 model on the synthetic reasoning task (run_attnvisual_gpt2.sh);
  2. Causal analysis (head entropy calculation) and full attention visualization (run_causal_gpt2.sh);
  3. Probing experiments for LLaMA on ProofWriter and ARC (run_probing_proofwriter.sh and run_probing_arc.sh);
  4. Robustness experiment for LLaMA on ProofWriter (run_corrupt_proofwriter.sh).

Cite

If this work/code is helpful for you, please consider citing our paper:

@article{hou2023towards,
  title={Towards a Mechanistic Interpretation of Multi-Step Reasoning Capabilities of Language Models},
  author={Hou, Yifan and Li, Jiaoda and Fei, Yu and Stolfo, Alessandro and Zhou, Wangchunshu and Zeng, Guangtao and Bosselut, Antoine and Sachan, Mrinmaya},
  journal={arXiv preprint arXiv:2310.14491},
  year={2023}
}

Contact

Feel free to open an issue or send me ([email protected]) an email if you have any questions!
