
Implement automated (regression) testing #46

Open
boulund opened this issue Feb 26, 2018 · 4 comments
boulund (Member) commented Feb 26, 2018

As a stretch-goal for the next main version of the workflow, I'd like us to implement automated regression testing. It's not very difficult, and will improve our ability to catch small bugs, typos, etc. in pull requests and commits as early as possible.

To implement this in a useful way, we need at least one test case for each major path through the workflow, e.g. one Gram-positive, one Gram-negative, one with reference proteins, one that is contaminated, etc., so we can validate that all common paths through the workflow work as they should.

I'm labeling this issue as low priority and assigning it to the BACTpipe v3.0 milestone for now; there's no rush to implement it.
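One hypothetical way to encode those paths, sketched as Nextflow profiles (the profile names and data paths are illustrative, not taken from the repository):

```groovy
// nextflow.config: illustrative sketch; all profile names and paths are assumptions
profiles {
    test_gram_positive { params.reads = "$baseDir/tests/data/gram_pos/*_{1,2}.fastq.gz" }
    test_gram_negative { params.reads = "$baseDir/tests/data/gram_neg/*_{1,2}.fastq.gz" }
    test_ref_proteins  { params.reads = "$baseDir/tests/data/ref_prot/*_{1,2}.fastq.gz" }
    test_contaminated  { params.reads = "$baseDir/tests/data/contam/*_{1,2}.fastq.gz" }
}
```

A CI job could then run each profile in turn and compare key outputs against stored expectations.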

@boulund boulund added this to the Bactpipe 3.0 milestone Feb 26, 2018
boulund (Member, Author) commented Mar 1, 2018

Now that the BACTpipe workflow has been containerized with Docker (#47), this will be much easier to implement. Any continuous integration suite (e.g. Travis CI, commonly used for GitHub projects) can run the tests inside the container, making it very easy to configure and set up. Still, there's no rush; better to hold off on this until all other tasks have been completed.
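For reference, a minimal sketch of the Nextflow side of such a setup, assuming a published BACTpipe image (the image name is hypothetical):

```groovy
// nextflow.config: minimal container setup, sketched under assumptions
docker.enabled = true                     // run every process inside Docker
process.container = 'ctmrbio/bactpipe'    // hypothetical image name

// A CI job then only needs to invoke something like:
//   nextflow run main.nf -profile test
```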

thorellk (Collaborator) commented Dec 2, 2020

This is something that @abhi18av also mentioned. Maybe this is the time to implement it?

boulund (Member, Author) commented Dec 2, 2020

I absolutely think so! I'm not sure where the Nextflow field is at now. How do people normally perform automated testing of their workflows, do you know @abhi18av?

I guess we need at least a minimal test data set that assembles and can be annotated, but that uses a minimum of resources, so the test can run relatively fast on a small VM with Travis CI/CircleCI/GitHub Actions or whatever people use nowadays.
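A minimal sketch of what such a test profile could look like in nextflow.config, assuming a hypothetical bundled read set and limits sized for a small CI VM:

```groovy
// nextflow.config: sketch of a 'test' profile; data path and limits are assumptions
profiles {
    test {
        params.reads = "$baseDir/tests/data/minimal_{1,2}.fastq.gz"  // hypothetical tiny read set
        process {
            cpus = 2        // free-tier CI runners typically offer ~2 vCPUs
            memory = 6.GB
            time = 30.m
        }
    }
}
```

The whole suite would then run as `nextflow run main.nf -profile test` on the CI machine.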

abhi18av (Collaborator) commented Dec 2, 2020

I agree with this as well :)

With the new modular design of the code base, it should be easier to test at the module level as well as at the (sub)workflow level. GitHub Actions seems to be the way forward, but only once the community has figured out an optimal automated testing workflow.

Just this week there has been major progress with nf-core/modules#80 and nf-core/modules#85, and in a week or so it should have stabilized. Then we can see whether we can reuse that pattern or whether we have to create our own :)
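As an illustration, with a modular DSL2 layout a module-level test harness could be as small as the following sketch (the module name and paths are hypothetical):

```groovy
// tests/modules/test_assembly.nf: hypothetical standalone module test
nextflow.enable.dsl = 2

// Pull in a single module from the modular code base (path is illustrative)
include { ASSEMBLY } from '../../modules/assembly'

workflow {
    // Feed the minimal test reads to just this one module
    reads = Channel.fromFilePairs(params.reads)
    ASSEMBLY(reads)
}
```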
