Building a fast matching engine in Rust for efficient processing of an ITCH order book.


matching-engine-rs


This is an attempt to implement a matching engine in Rust. So far it includes an implementation of a Limit Order Book (LOB). The LOB processes ITCH data at 11.3 million messages per second (an average of 88 ns per message), as measured on my local machine. Check out the Performance section for more information.


Project Structure

This project consists of two libraries:

  • itch-parser: This library handles the processing of NASDAQ ITCH 5.0 protocol data. It parses the fields that the Limit Order Book needs; the remaining fields are skipped using placeholders. Check out the folder's README for more information.
  • optimized-lob: This library contains a streamlined and efficient implementation of a Limit Order Book (LOB). Note that the LOB stores only the few fields needed to maintain the book, keeping just an aggregate quantity at each price level. Check out the folder's README for more information.
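The aggregate-quantity idea described above can be sketched as follows. This is a minimal illustration, not optimized-lob's actual API: `BookSide`, `add`, `reduce`, and `qty_at` are hypothetical names, and a `BTreeMap` stands in for whatever level storage the crate actually uses.

```rust
use std::collections::BTreeMap;

// One side of the book, keyed by price. Only the total quantity at each
// price level is stored -- individual orders are not retained.
#[derive(Default)]
struct BookSide {
    levels: BTreeMap<u64, u64>, // price -> aggregate quantity
}

impl BookSide {
    // Add quantity to a level, creating the level if needed.
    fn add(&mut self, price: u64, qty: u64) {
        *self.levels.entry(price).or_insert(0) += qty;
    }

    // Remove quantity from a level (execute/cancel/delete), dropping
    // the level entirely once it is empty.
    fn reduce(&mut self, price: u64, qty: u64) {
        if let Some(level) = self.levels.get_mut(&price) {
            *level = level.saturating_sub(qty);
            if *level == 0 {
                self.levels.remove(&price);
            }
        }
    }

    fn qty_at(&self, price: u64) -> u64 {
        self.levels.get(&price).copied().unwrap_or(0)
    }
}

fn main() {
    let mut bids = BookSide::default();
    bids.add(10_050, 300);
    bids.add(10_050, 200);
    assert_eq!(bids.qty_at(10_050), 500);
    bids.reduce(10_050, 500);
    assert_eq!(bids.qty_at(10_050), 0); // level removed when empty
    println!("aggregate book sketch OK");
}
```

Storing only aggregates keeps each level to a single counter, which is what makes this representation compact and cache-friendly compared to a book that tracks every resting order.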

Apart from that, a testing suite for both libraries can be found in the "tests" directory.

Build, Run, and Test

Make sure you have Rust installed. You will also need the NASDAQ ITCH 5.0 data; download instructions are in the ITCH Specifications section. All of the following commands are run from the tests directory.

cd tests

Build

cargo build

or

cargo build --release

Running the LOB

ITCH_DATA=PATH_TO_ITCH_DATA_FILE cargo run

or

ITCH_DATA=PATH_TO_ITCH_DATA_FILE cargo run --release

Running the ITCH parser

ITCH_DATA=PATH_TO_ITCH_DATA_FILE cargo run -- --itch-parser

or

ITCH_DATA=PATH_TO_ITCH_DATA_FILE cargo run --release -- --itch-parser

Testing

cargo test

Device Specifications

At the time of testing:

Device: MacBook Air M2
CPU architecture: Apple M2
CPU logical cores: 8
CPU physical cores: 8
RAM total: 16 GB
RAM free: 11.5 GB

Performance

ITCH Processing

ITCH Parser Processing...

Success...

ITCH Parsing Statistics:
Total Messages: 268744780
Total Time: 6.082 seconds
Speed: 44189583 msg/second
Latency: 22 ns

LOB Performance

LOB Processing...

Success...

Performance Metrics:
Total Messages: 268744780
ITCH Latency: 88 ns
Total Time: 23.660 seconds
Speed: 11358746 msg/second

Orderbook Statistics:
Total Add Orders: 118631456
Total Execute Orders: 5822741
Total Cancel Orders: 2787676
Total Delete Orders: 114360997
Total Replace Orders: 21639067
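The throughput and latency figures above are two views of the same measurement: speed is total messages divided by total time, and per-message latency is the reciprocal of speed. A quick check against the reported LOB numbers (small rounding differences come from the time being reported to milliseconds):

```rust
fn main() {
    // Raw figures from the LOB run above.
    let messages: f64 = 268_744_780.0;
    let seconds: f64 = 23.660;

    let speed = messages / seconds; // messages per second
    let latency_ns = 1e9 / speed;   // nanoseconds per message

    // Agrees with the reported 11,358,746 msg/s and 88 ns.
    assert!((speed - 11_358_746.0).abs() < 1_000.0);
    assert!((latency_ns - 88.0).abs() < 0.5);
    println!("{speed:.0} msg/s, {latency_ns:.1} ns/msg");
}
```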

ITCH Specifications

The project follows the Nasdaq TotalView-ITCH 5.0 standard for the processing of data.

I have specifically used their 12302019.NASDAQ_ITCH50 data, whose compressed file can be downloaded from here.
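In the raw *.NASDAQ_ITCH50 file, each message is preceded by a 2-byte big-endian length field, and the first byte of the payload identifies the message type (e.g. 'A' for Add Order, 'E' for Order Executed), per the TotalView-ITCH 5.0 specification. A hedged sketch of that framing; `next_message` is a hypothetical helper, not the itch-parser crate's API:

```rust
// Split one length-prefixed ITCH message off the front of a buffer.
// Returns (message type byte, message payload, remaining buffer),
// or None if the buffer does not hold a complete message.
fn next_message(buf: &[u8]) -> Option<(u8, &[u8], &[u8])> {
    if buf.len() < 2 {
        return None;
    }
    // 2-byte big-endian length prefix.
    let len = u16::from_be_bytes([buf[0], buf[1]]) as usize;
    if len == 0 || buf.len() < 2 + len {
        return None;
    }
    let (msg, rest) = buf[2..].split_at(len);
    Some((msg[0], msg, rest))
}

fn main() {
    // A fabricated 3-byte message of type 'S' for illustration only.
    let data = [0x00, 0x03, b'S', 0x00, 0x01];
    let (kind, msg, rest) = next_message(&data).unwrap();
    assert_eq!(kind, b'S');
    assert_eq!(msg.len(), 3);
    assert!(rest.is_empty());
    println!("framing sketch OK");
}
```

A parser loops this over the file, dispatching on the type byte and skipping fields it does not need, which is the strategy the itch-parser README describes.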

Contributing

Contributions to matching-engine-rs are welcome! If you encounter any issues, have suggestions, or would like to add new features, please feel free to open an issue or submit a pull request. Note that I'm still learning my way around Rust and trading systems, so any feedback is appreciated!

License

This project is licensed under the MIT License.
