
RevLLM

Reverse Engineering Tools for Large Language Models

Introduction

RevLLM is a Python library designed to facilitate the analysis of Transformer language models, focusing on generative, decoder-only transformers. Our library aims to democratize access to advanced explainability methods for data scientists and machine learning engineers who work with language models. Built on top of Andrej Karpathy's nanoGPT, RevLLM is a robust and user-friendly tool for natural language processing work.

Features

  • Extensive Model and Method Documentation: available directly in the provided Streamlit dashboard.
  • GPT-2 Models: automatic download and usage of GPT-2 models (thanks to Hugging Face integration).
  • Model Analysis: deep insights into transformer language models.
  • Code base: simple, easy to understand, and self-contained.

Feature rundown:

Logit Lens
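
The logit lens projects each layer's intermediate hidden state through the model's final layer norm and unembedding matrix, showing which token the model would predict at every depth. Below is a minimal sketch using Hugging Face transformers rather than RevLLM's own API; the model and prompt are arbitrary examples.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

input_ids = tokenizer("The capital of France is", return_tensors="pt").input_ids
with torch.no_grad():
    outputs = model(input_ids, output_hidden_states=True)

# Project every layer's hidden state at the last position through the
# final layer norm and the unembedding matrix (the "logit lens").
for layer, hidden in enumerate(outputs.hidden_states):
    logits = model.lm_head(model.transformer.ln_f(hidden[:, -1, :]))
    top_token = tokenizer.decode(logits.argmax(dim=-1))
    print(f"layer {layer:2d}: {top_token!r}")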

Prompt Importance Analysis

Self-Attention Analysis
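
GPT-2 exposes one attention map per layer and head, each a matrix of weights over the token positions. The sketch below pulls these out with Hugging Face transformers; RevLLM's dashboard visualizes them interactively, and the prompt here is just an example.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

input_ids = tokenizer("Attention weights tell a story", return_tensors="pt").input_ids
with torch.no_grad():
    outputs = model(input_ids, output_attentions=True)

# outputs.attentions holds one (batch, n_heads, seq_len, seq_len) tensor per layer.
attn = outputs.attentions[0][0, 0]  # layer 0, head 0
tokens = [tokenizer.decode([i]) for i in input_ids[0].tolist()]
print(tokens)
print(attn)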

Tokenizer Analysis

GPT-2 maintains a fixed vocabulary of around 50k tokens. The model uses the Byte Pair Encoding (BPE) algorithm to split any given input sentence into a token sequence. This token sequence is then mapped to a sequence of integers that is consumed by the model.
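
For illustration, here is how the GPT-2 encoding behaves using the tiktoken library (RevLLM's dashboard offers a similar analysis; the sample sentence is arbitrary):

import tiktoken

enc = tiktoken.get_encoding("gpt2")
print(enc.n_vocab)  # 50257 entries in the GPT-2 vocabulary

ids = enc.encode("Reverse engineering language models")
print(ids)                             # integer sequence consumed by the model
print([enc.decode([i]) for i in ids])  # the individual BPE tokens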

Embedding Matrix Statistics and Visualization
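
A minimal sketch of the kind of statistics one can compute on GPT-2's token embedding matrix, using Hugging Face transformers rather than RevLLM's internal loaders:

import torch
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("gpt2")
wte = model.transformer.wte.weight.detach()  # (50257, 768) token embedding matrix

print("shape:", tuple(wte.shape))
print("mean row norm:", wte.norm(dim=1).mean().item())
print("std of entries:", wte.std().item())

# Cosine similarity between two arbitrary token embeddings
cos = torch.nn.functional.cosine_similarity(wte[100], wte[200], dim=0)
print("cosine similarity of tokens 100 and 200:", cos.item())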

Generation with Top-k Sampling and Temperature
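
Top-k sampling keeps only the k highest-scoring tokens and samples from them after dividing the logits by a temperature (values below 1 sharpen the distribution, values above 1 flatten it). Here is a plain-PyTorch sketch of the sampling step, independent of RevLLM's actual implementation:

import torch

def sample_next_token(logits: torch.Tensor, temperature: float = 1.0, top_k: int = 50) -> int:
    """Sample one token id from a (vocab_size,) vector of logits."""
    scaled = logits / temperature                         # temperature scaling
    top_values, top_indices = torch.topk(scaled, top_k)   # keep the k best logits
    probs = torch.softmax(top_values, dim=-1)             # renormalize over them
    choice = torch.multinomial(probs, num_samples=1)      # sample one of the k
    return int(top_indices[choice])

# Example call with random logits over a GPT-2-sized vocabulary
next_id = sample_next_token(torch.randn(50257), temperature=0.8, top_k=40)
print(next_id)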

Automatic Installation

To install RevLLM, simply clone this repository and run the following command:

./make_mamba_env.sh

Proceed to the next section to run the demo app.

Manual Installation

Install the dependencies

The app uses the Anaconda Python distribution. Alternatively, you can use Micromamba, a lightweight drop-in replacement for the conda package manager.

To install Anaconda, follow the instructions at https://docs.anaconda.com/free/anaconda/install/windows/.

To install Micromamba, follow the instructions at https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html.

Create the Anaconda revllm environment

Open the shell and navigate to the directory where you cloned the repository. Then, run the following command to create the revllm environment:

conda create -n revllm --yes --file conda_packages.txt -c conda-forge --strict-channel-priority

Activate the revllm environment by running:

conda activate revllm

Create the Micromamba revllm environment

Alternatively, you can use Micromamba to create the revllm environment.

micromamba self-update --yes
micromamba create -n revllm --yes --file conda_packages.txt -c conda-forge

Update the revllm environment

To update the revllm environment, delete the existing environment and create it again:

conda env remove -n revllm

or, if you are using Micromamba:

micromamba env remove -n revllm

Then, follow the instructions above to create the revllm environment.

Running the Demo App

To run the app, open the shell and navigate to the directory where you cloned the repository. Then, run the following command to activate the revllm environment:

conda activate revllm

Or, if you are using Micromamba, run the following command to activate the revllm environment:

micromamba activate revllm

Execute the following command to start the demo app:

./run_demo.sh

The app should now be running on http://localhost:8608/.

License

RevLLM is released under the MIT License.

Acknowledgements

Special thanks to Andrej Karpathy and the contributors to the nanoGPT project for their foundational work in the field.