Setup | Running linting/tests | Writing tests | Debugging code | Internals


Contributing

Contributions are always welcome, no matter how large or small. Before contributing, please read the code of conduct.

Not sure where to start?

  • If you aren't just making a documentation change, you'll probably want to learn a bit about a few topics.
  • ASTs (Abstract Syntax Trees): The Babel AST spec is a bit different from ESTree. The differences are listed here.
  • Check out /doc for information about Babel's internals
  • Check out the Babel Plugin Handbook - core plugins are written the same way as any other plugin!
  • Check out AST Explorer to learn more about ASTs or make your own plugin in the browser
  • When you feel ready to jump into the Babel source code, a good place to start is to look for issues tagged with help wanted and/or good first issue.
  • Follow along with what we are working on by joining our Slack, following our announcements on Twitter, and reading (or participating!) in our meeting notes.
  • Check out our website and the repo

Chat

Feel free to check out the #discussion/#development channels on our Slack. Some of us are always online to chat!

Developing

Note: Versions < 5.1.10 can't be built.

Babel is built to run on Node 4 and up, but we develop using Node 8 and Yarn. You can check your Node version with node -v.

Make sure that Yarn is installed with version >= 0.28.0. Installation instructions can be found here: https://yarnpkg.com/en/docs/install.
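
For example, you can check both from your terminal:

$ node -v
$ yarn --version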

Setup

$ git clone https://github.com/babel/babel
$ cd babel
$ make bootstrap

Then you can either run:

$ make build

to build Babel once or:

$ make watch

to have Babel build itself and incrementally build files on change.

You can access the built files for individual packages from packages/<package-name>/lib.
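
For example, once a build has finished you can load a built package straight from its lib folder in a quick Node script. This is only an illustrative sketch; it assumes you run it from the repository root after make build and uses babylon's parse function:

// try-babylon.js (illustrative)
const babylon = require("./packages/babylon/lib");

// Parse a small snippet and inspect the top-level node types.
const ast = babylon.parse("2 ** 2;");
console.log(ast.program.body.map(node => node.type)); // [ 'ExpressionStatement' ]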

If you wish to build a copy of Babel for distribution, then run:

$ make build-dist

Running linting/tests

You can run lint via:

# ~6 sec on a MacBook Pro (Mid 2015)
$ make lint

You can run eslint's autofix via:

$ make fix

You can run tests + lint for all packages (slow) via:

# ~46 sec on a MacBook Pro (Mid 2015)
$ make test

If you just want to run all tests:

# ~40 sec on a MacBook Pro (Mid 2015)
$ make test-only

Most likely you'll want to focus on a specific issue.

To run tests for a specific package in packages, you can use the TEST_ONLY environment variable:

$ TEST_ONLY=babel-cli make test

TEST_ONLY will also match substrings of the package name:

# Run tests for the @babel/plugin-transform-classes package.
$ TEST_ONLY=transform-classes make test

Use the TEST_GREP variable to run a subset of tests by name:

$ TEST_GREP=transformation make test

To target a specific test name, replace the hyphens and forward slashes in the fixture path with spaces:

$ TEST_GREP="arrow functions destructuring parameters" make test

To enable the Node.js debugger (added in Node v6.3.0), set the TEST_DEBUG environment variable:

$ TEST_DEBUG=true make test

You can combine TEST_DEBUG with TEST_GREP or TEST_ONLY to debug a subset of tests. If you plan to spend a while in the debugger (which you likely will!), you may want to increase the test timeout by editing test/mocha.opts.
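
For example, to debug only the CLI tests, or only the tests whose names match "transformation":

$ TEST_DEBUG=true TEST_ONLY=babel-cli make test
$ TEST_DEBUG=true TEST_GREP=transformation make test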

To test the code coverage, use:

$ BABEL_ENV=cov make build
$ ./scripts/test-cov.sh

Troubleshooting Tests

If an error appears on CI but you can't reproduce it locally, it may be due to:

  • Node version: Travis CI runs the tests against all major Node versions. If your tests use JavaScript features unsupported by lower versions of Node, use the minNodeVersion option in options.json.
  • Timeout: Check the CI log; if the only failures are timeout errors and you're sure they're unrelated to your changes, ask someone in the Slack channel to restart the CI build, which might resolve it.

If you're getting errors locally that don't appear on CI, it may be due to:

  • Updates in dependencies: Make sure you run make bootstrap, followed by make build or make watch, before running the tests (see the example below).
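
For example, after pulling the latest changes:

$ git pull
$ make bootstrap
$ make build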

Writing tests

Most packages in /packages have a test folder; however, some tests live in other packages or in /packages/babel-core.

@babel/plugin-x

All of the Babel plugins (and other packages) that have a /test/fixtures directory are tested in a similar way.

For example, in @babel/plugin-transform-exponentiation-operator/test:

  • There is an index.js file. It imports our test helper. (You don't have to worry about this).

  • There can be multiple folders under /fixtures

    • There is an options.json file whose function is similar to a .babelrc file, allowing you to pass in the plugins and settings you need for your tests.
    • For this test, we only need the relevant plugin, so it's just { "plugins": ["@babel/plugin-transform-exponentiation-operator"] }.
    • If necessary, you can have an options.json with different options in each subfolder.
  • In each subfolder, you can organize your directory structure by categories of tests. (For example, these folders can be named after the feature you are testing or can reference the issue number they fix.) See the example layout after this list.

  • Generally, there are two kinds of tests for plugins

    • The first is a simple test of the input and output produced by running Babel on some code. We do this by creating an actual.js file and an expected.js file.
    • If you expect an error, you can omit the expected.js file and instead add a throws key to the options.json containing the expected error message.
    • The second and preferred type is a test that actually evaluates the produced code and asserts that certain properties are true or false. We do this by creating an exec.js file.
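
Putting this together, a plugin's test directory usually looks something like the layout below (illustrative only; the category and test folder names are hypothetical):

// Example layout
- packages
  - babel-plugin-transform-exponentiation-operator
    - test
      - index.js
      - fixtures
        - exponentiation-operator
          - options.json
          - basic
            - actual.js
            - expected.js
          - runtime-behavior
            - exec.js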

In an actual/expected test, you simply write out the code you want transformed in actual.js.

// actual.js
2 ** 2;

and the expected output after transforming it with your options.json in expected.js.

// expected.js
Math.pow(2, 2);

In an exec.js test, we run the code and check that it actually does what it's supposed to do, rather than just checking the static output.

// exec.js
assert.equal(8, 2 ** 3);
assert.equal(24, 3 * 2 ** 3);

If you need to check for an error that is thrown, you can add a throws key to the options.json:

// options.json example
{
  "plugins": [["@babel/plugin-proposal-object-rest-spread", { "useBuiltIns": "invalidOption" }]],
  "throws": "@babel/plugin-proposal-object-rest-spread currently only accepts a boolean option for useBuiltIns (defaults to false)"
}

If the test requires a minimum Node version, you can add a minNodeVersion key to the options.json (it must be in semver format).

// options.json example
{
  "minNodeVersion": "5.0.0"
}

babylon

Writing tests for Babylon is very similar to the other packages. Inside the packages/babylon/test/fixtures folder are categories/groupings of test fixtures (es2015, flow, etc.). To add a test, create a folder under one of these groupings (or create a new one) with a descriptive name, and add the following:

  • Create an actual.js file that contains the code you want Babylon to parse.

  • Add an expected.json file with the expected parser output. For added convenience, if there is no expected.json present, the test runner will generate one for you.
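
For example, a minimal fixture can contain just the code you want parsed (a hypothetical fixture):

// actual.js
2 ** 2;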

After writing tests for babylon, just build it by running:

$ make build-babylon

Then, to run the tests, use:

$ TEST_ONLY=babylon make test-only

Bootstrapping expected output

For both @babel/plugin-x and babylon, you can generate an expected.js/expected.json automatically by providing only actual.js and running the tests as you usually would.

// Example
- packages
  - babylon
    - test
      - fixtures
        - comments
          - basic
            - block-trailing-comment
              - actual.js
              - expected.json (will be generated if not created)

Debugging code

A common approach to debugging JavaScript code is to walk through the code using the Chrome DevTools debugger. For illustration purposes, we are going to assume that we need to get a better understanding of Generator.generate(), which is responsible for generating code for a given AST. To get a better understanding of what is actually going on for this particular piece of code, we are going to make use of breakpoints.

generate() {
+ debugger; // breakpoint
  return super.generate(this.ast);
}

To pick up the changes, we have to rebuild Babel:

$ make build

Next, we need to execute Generator.generate(), which can be achieved by running a test case in the @babel/generator package. For example, we can run the test case that tests the generation of class declarations:

$ TEST_DEBUG=true TEST_GREP=ClassDeclaration make test-only

./scripts/test.sh
Debugger listening on port 9229.
To start debugging, open the following URL in Chrome:
    chrome-devtools://devtools/remote/serve_file/@60cd6e859b9f557d2312f5bf532f6aec5f284980/inspector.html?experiments=true&v8only=true&ws=127.0.0.1:9229/3cdaebd2-be88-4e7b-a94b-432950ab72d0

To start debugging in Chrome DevTools, open the given URL. The debugger starts at the first executed line of code, which is Mocha's first line by default. Click the Resume script execution button to jump to the breakpoint you set. Note that the code shown in Chrome DevTools is compiled code and therefore differs from the original source.

Creating a new plugin (spec-new)

Example: babel/babylon#541

  • Create a new issue that describes the proposal (ex: #538). Include any relevant information like proposal repo/author, examples, parsing approaches, meeting notes, presentation slides, and more.
  • The pull request should include:
    • An update to the plugins section of the README: add a new entry to that list for the new plugin flag (and link to the proposal)
    • If any new nodes or modifications need to be added to the AST, update ast/spec.md
    • Make sure you use the this.hasPlugin("plugin-name-here") check in Babylon so that your new plugin code only runs when that flag is turned on, not by default (see the sketch after this list)
    • Add failing/passing tests according to spec behavior
  • Start working on the Babel transform itself!
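
As a rough sketch of that check (the plugin name here is hypothetical):

// Inside a Babylon parser method (illustrative)
if (this.hasPlugin("myNewProposal")) {
  // Only parse the proposal-specific syntax when the flag is enabled.
  // ...
}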

Internals