Approval Tests implementation in Dart πŸš€



πŸ“– About

Approval Tests are an alternative to assertions. You’ll find them useful for testing objects with complex values (such as long strings), lots of properties, or collections of objects.

Approval tests simplify this by taking a snapshot of the results, and confirming that they have not changed.

In normal unit testing, you say expect(person.getAge(), 5). Approval tests let you make the same kind of assertion when the value you want to check is no longer a primitive but a complex object. For example, you can say Approvals.verify(person).
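As a sketch of the difference, a classic assertion and its approval-test counterpart might look like this (the Person class is a hypothetical example, and the object is passed to verify in its string form):

```dart
import 'package:approval_tests/approval_tests.dart';
import 'package:test/test.dart';

// A hypothetical model used only for illustration.
class Person {
  final String name;
  final int age;

  const Person(this.name, this.age);

  @override
  String toString() => 'Person(name: $name, age: $age)';
}

void main() {
  test('verify person', () {
    const person = Person('Alice', 5);

    // Classic assertion: check one primitive property at a time.
    expect(person.age, 5);

    // Approval test: snapshot the whole object instead. The string is
    // compared against the test's .approved.txt file.
    Approvals.verify(person.toString());
  });
}
```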

I am writing an implementation of Approval Tests in Dart. If you would like to help, please reach out. 🙏

πŸ“‹ How it works

  • If the received result matches the approved file exactly, the test passes.
  • If there is a difference, a reporter highlights the mismatch and the test fails.

πŸ“¦ Installation

Add the following to your pubspec.yaml file:

dependencies:
  approval_tests: ^1.0.0

πŸ‘€ Getting Started

The best way to get started is to download and open the starter project.

This is a standard project that can be imported into any editor or IDE and also includes CI with GitHub Actions.

It comes ready with:

  • A suitable .gitignore to exclude approval artifacts
  • A linter configuration with all rules in place
  • A GitHub Action that runs the tests; you can always check their status on the badge in the README.md file.

πŸ“š How to use

In order to use Approval Tests, the user needs to:

  1. Set up a test: This involves importing the Approval Tests library into your own code.

  2. Optionally, set up a reporter: Reporters are tools that highlight differences between approved and received files when a test fails. Although not necessary, they make it significantly easier to see what changes have caused a test to fail. The default reporter is the CommandLineReporter. You can also use the DiffReporter to compare the files in your IDE.

  3. Manage the "approved" file: When the test is run for the first time, an approved file is created automatically. This file represents the expected outcome. Whenever the output changes in a way you want to keep, update the approved file to reflect it. How to do that is described a little further below.

This setup is useful because it shortens feedback loops, saving developers time by only highlighting what has been altered rather than requiring them to parse through their entire output to see what effect their changes had.
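Putting the three steps together, a minimal test might look like this (a sketch based on the API shown in this README; since CommandLineReporter is the default, naming it explicitly here is purely illustrative):

```dart
import 'package:approval_tests/approval_tests.dart';
import 'package:test/test.dart';

void main() {
  test('greeting message', () {
    // The output we want to snapshot.
    final received = 'Hello, Approval Tests!';

    // Step 1: the library is imported above.
    // Step 2: choose a reporter; CommandLineReporter is the default.
    // Step 3: the first run produces a .received file; approve it to
    // create the corresponding .approved.txt file.
    Approvals.verify(
      received,
      options: const Options(
        reporter: CommandLineReporter(),
      ),
    );
  });
}
```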

Approving Results

Approving results just means saving the .approved.txt file with your desired results.

We’ll provide more explanation in due course, but, briefly, here are the most common ways to do this.

β€’ Via Diff Tool

Most diff tools can move text from left to right and save the result. How to use diff tools is covered just below; there is a Comparator class for that.

β€’ Via approveResult property

If you want the result to be automatically saved after running the test, you need to use the approveResult property in Options:

void main() {
  test('test JSON object', () {
    final complexObject = {
      'name': 'JsonTest',
      'features': ['Testing', 'JSON'],
      'version': 0.1,
    };

    Approvals.verifyAsJson(
      complexObject,
      options: const Options(
        approveResult: true,
      ),
    );
  });
}


This will result in the following file: example_test.test_JSON_object.approved.txt

{
  "name": "JsonTest",
  "features": [
    "Testing",
    "JSON"
  ],
  "version": 0.1
}


β€’ Via file rename

You can just rename the .received file to .approved.
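For example, from the command line (the file name below is taken from the JSON example above and is illustrative):

```shell
# Approve the received output by renaming it in place.
mv example_test.test_JSON_object.received.txt \
   example_test.test_JSON_object.approved.txt
```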

Reporters

Reporters are the part of Approval Tests that launch diff tools when things do not match. They are the part of the system that makes it easy to see what has changed.

There are several reporters available in the package:

  • CommandLineReporter - This is the default reporter, which will output the diff in the terminal.
  • DiffReporter - This reporter will open the Diff Tool in your IDE.
    • The DiffReporter uses default paths to locate your IDE. If it does not work, the console shows the expected path to the IDE so that you can specify a customDiffInfo. You can also contact me for help.


To use DiffReporter you just need to add it to options:

options: const Options(
  reporter: DiffReporter(),
),

πŸ“ Examples

I have provided a couple of small examples here to show you how to use the package; there are more in the example folder for you to explore, and I will add more over time. Inside it, the gilded_rose folder contains an example of using ApprovalTests to test the legacy code of the Gilded Rose kata. You can study it to understand how to use the package to test complex code.

And the verify_methods folder has small examples of using different ApprovalTests methods for different cases.

JSON example

void main() {
  const jsonItem = JsonItem(
    id: 1,
    name: "JsonItem",
    anotherItem: AnotherItem(id: 1, name: "AnotherItem"),
    subItem: SubItem(
      id: 1,
      name: "SubItem",
      anotherItems: [
        AnotherItem(id: 1, name: "AnotherItem 1"),
        AnotherItem(id: 2, name: "AnotherItem 2"),
      ],
    ),
  );

  test('verify model', () {
    Approvals.verifyAsJson(
      jsonItem,
      options: const Options(
        deleteReceivedFile:
            true, // Automatically delete the received file after the test.
        approveResult:
            true, // Approve the result automatically. You can remove this property after the approved file is created.
      ),
    );
  });
}


This will result in the following file: verify_as_json_test.verify_model.approved.txt

{
  "jsonItem": {
    "id": 1,
    "name": "JsonItem",
    "subItem": {
      "id": 1,
      "name": "SubItem",
      "anotherItems": [
        {
          "id": 1,
          "name": "AnotherItem 1"
        },
        {
          "id": 2,
          "name": "AnotherItem 2"
        }
      ]
    },
    "anotherItem": {
      "id": 1,
      "name": "AnotherItem"
    }
  }
}


Passed test example

❓ Which File Artifacts to Exclude from Source Control

You must add approved files to your source control system, but received files can change on any run and should be ignored. For Git, add this to your .gitignore:

*.received.*

βœ‰οΈ For More Information

Questions?

Ask me on Telegram: @yelmuratoff.
Email: yelamanyelmuratov@gmail.com

Video Tutorials

You can also watch a series of short videos about using ApprovalTests in .Net on YouTube.

Podcasts

Prefer learning by listening? Then you might enjoy the following podcasts:

Coverage

🀝 Contributing

Show some πŸ’™ and star the repo to support the project! πŸ™Œ
The project is under active development, and we invite you to contribute through pull requests and issue submissions. 👍
We appreciate your support. 🫰



Thanks to all contributors of this package