
Project Proposal: What is the fastest Bril Interpreter? #396

Open
rcplane opened this issue Oct 19, 2023 · 2 comments
@rcplane (Contributor) commented Oct 19, 2023

What will you do?

Project Statement

For Bril, the Cornell Capra language designed for learning about compiler design and implementation, a recent listing shows at least two interpreters and three compilers currently supported.

We pose the following problem. Given as input:

  • an input Bril program
  • a grammar of possible interpreter commands and options
  • optional past interpreter data, namely
    • executions of other Bril programs
    • executions of the given input program on interpreters other than the one currently being configured

can we predict the fastest end-to-end interpreter and options configuration for executing that Bril program?
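As a concrete baseline for this framing, here is a Python sketch (names and the option grammar are illustrative placeholders, not a settled design): enumerate candidate (interpreter, options) configurations and pick the fastest, falling back to direct measurement where no learned predictor exists yet. `brili` and `brilirs` are the TypeScript reference interpreter and the Rust interpreter; everything else is assumed for illustration.

```python
# Sketch only: interpreter list, option grammar, and predictor interface are
# placeholders for illustration, not a settled design.
import subprocess
import time
from itertools import product

# Candidate interpreters (brili = TypeScript reference, brilirs = Rust),
# both assumed to read Bril JSON on stdin and take program arguments on
# the command line.
INTERPRETERS = {
    "brili": ["brili"],
    "brilirs": ["brilirs"],
}
OPTION_SETS = [[]]  # illustrative option axis; the real grammar is an input


def time_config(bril_json: str, cmd: list[str], opts: list[str],
                prog_args: list[str]) -> float:
    """Run one (interpreter, options) configuration on a Bril program in its
    JSON form and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(cmd + opts + prog_args, input=bril_json, text=True,
                   capture_output=True, check=True)
    return time.perf_counter() - start


def predict_best(bril_json: str, prog_args: list[str],
                 history: dict | None = None) -> tuple[str, list[str]]:
    """Naive baseline: measure every configuration and return the fastest.
    A real predictor would instead consult `history` (timings of other
    programs, and of this program on other interpreters) plus cheap static
    features of the input program."""
    timings = {
        (name, tuple(opts)): time_config(bril_json, cmd, opts, prog_args)
        for (name, cmd), opts in product(INTERPRETERS.items(), OPTION_SETS)
    }
    (best_name, best_opts), _ = min(timings.items(), key=lambda kv: kv[1])
    return best_name, list(best_opts)
```

The core question of the project is what can replace the brute-force loop in `predict_best` while keeping its answer accurate.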

Novelty

We will implement a novel predictor for interpreter runtime on Bril programs.
Though multiple interpret and compile-and-run tools have been built for Bril, we aim to survey their options and make their deployment more accessible.

How will you do it?

Minimum Viable Project Outputs

  • a standardized Bril runtime environment supporting the multiple implementation languages, delivered as a virtualized x86 Docker container suitable for recurring GitHub Actions runs as free, public continuous integration testing (see the driver sketch after this list)
  • a predictor answering the problem statement above, executable in that environment
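One way the environment could serve recurring CI runs is a small driver that sweeps the in-tree benchmarks and records per-interpreter timings as an artifact. The sketch below assumes the usual Bril repository layout (a `benchmarks/` tree of `.bril` files whose `# ARGS:` comments supply program arguments) and that `bril2json`, `brili`, and `brilirs` are on the PATH inside the container; paths and filenames are assumptions, not final design.

```python
# Sketch of a CI driver; directory layout, tool names, and the output path
# are assumptions about the eventual container, not confirmed details.
import csv
import re
import subprocess
import time
from pathlib import Path

BENCH_DIR = Path("bril/benchmarks")  # assumed checkout location
OUT_CSV = Path("timings.csv")
INTERPRETERS = {
    "brili": ["brili"],
    "brilirs": ["brilirs"],
}


def bench_args(src: Path) -> list[str]:
    """Pull program arguments from the `# ARGS: ...` comment used by the
    in-tree benchmark harness."""
    match = re.search(r"#\s*ARGS:\s*(.*)", src.read_text())
    return match.group(1).split() if match else []


def to_json(src: Path) -> str:
    """Convert textual Bril to the JSON form the interpreters consume."""
    return subprocess.run(["bril2json"], input=src.read_text(), text=True,
                          capture_output=True, check=True).stdout


def run_timed(prog_json: str, cmd: list[str], args: list[str]) -> float:
    """Time one interpreter run (wall clock) on a benchmark."""
    start = time.perf_counter()
    subprocess.run(cmd + args, input=prog_json, text=True,
                   capture_output=True, check=True)
    return time.perf_counter() - start


def main() -> None:
    with OUT_CSV.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["benchmark", "interpreter", "seconds"])
        for src in sorted(BENCH_DIR.rglob("*.bril")):
            prog_json, args = to_json(src), bench_args(src)
            for name, cmd in INTERPRETERS.items():
                writer.writerow([src.stem, name,
                                 f"{run_timed(prog_json, cmd, args):.6f}"])


if __name__ == "__main__":
    main()
```

Uploading `timings.csv` as a workflow artifact would keep a public history of runs across container rebuilds.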

Possible Extensions

Given sufficient time or other interested parties, we could expand into developing new interpreter features, new benchmarks, or more sophisticated predictors including code structure analysis or large language models.

How will you empirically measure success?

  • detailed comparison studies charting interpreter runtime behavior on all current in-tree Bril benchmarks, along with the factors that drive predictor success or failure, such as the dynamic instruction count reported by the reference interpreter (see the sketch below)
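One of those factors can be collected directly: the reference interpreter's `-p` profiling flag reports a `total_dyn_inst` count after execution. A possible feature extractor, as a sketch (the exact output stream and line format should be verified against the current `brili`):

```python
# Sketch: extract the dynamic instruction count that `brili -p` reports.
# The count is expected on a line like "total_dyn_inst: 12345"; whether it
# lands on stdout or stderr is not assumed here, so both are searched.
import re
import subprocess


def dynamic_inst_count(bril_json: str, prog_args: list[str]) -> int:
    proc = subprocess.run(["brili", "-p"] + prog_args, input=bril_json,
                          text=True, capture_output=True, check=True)
    match = re.search(r"total_dyn_inst:\s*(\d+)", proc.stdout + proc.stderr)
    if match is None:
        raise RuntimeError("no profiling output found; is -p supported?")
    return int(match.group(1))
```

Plotting measured runtime against this count, per interpreter, would show how much variance a purely dynamic-size feature explains before reaching for richer code-structure features.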

Benefits to Future Course Infrastructure

A standardized deployment environment preloaded with additional programming languages can help students explore new languages and tooling against solid reference baselines, with less frustration setting up a desired language toolchain.

Standardized Docker deployment on GitHub Actions will facilitate benchmarking and regression testing for course infrastructure, and will reduce “it works on my machine” troubles for students who want a common development environment such as dev containers.

Team members:

just @rcplane for now

@rcplane rcplane added the proposal course project proposals label Oct 19, 2023
@sampsyo (Owner) commented Oct 23, 2023

Sounds very cool! I look forward to seeing how this turns out.

Here's one high-order bit that would be great to resolve soon (i.e., within ~1 week?), before diving too much into the infrastructure: exactly which interpreters/compilers/JITs will you measure? Maybe a prioritized list would be helpful, so you can be sure to get the first few and incorporate some later ones as a "stretch goal."

@rcplane (Contributor, Author) commented Nov 5, 2023

Dockerfiles are in place for the default and Rust interpreters; still prioritizing which other interpreters and options to cover.

@rcplane rcplane closed this as completed Nov 5, 2023
@rcplane rcplane reopened this Nov 5, 2023