
Desktop Client QA Workflow


This page describes the QA efforts for ownCloud Desktop Client development.

Manual QA

The Desktop Team currently has one dedicated QA engineer. This is how we collaborate.

Bugfixes

Once a reported bug is fixed, the developer who fixed it sets the Ready to Test label and mentions the SHA of the fixing commit. The developer does not close the bug.

The QA engineer picks up the list of bugs with the Ready to Test label and verifies the bug fix. If the bug is really fixed, the QA engineer closes the bug report, ideally with a comment noting the successful verification. If the bug is not fixed, the QA engineer removes the Ready to Test label and comments on the bug. The developer who worked on the bug fix then picks the bug up again and provides another fix.

Releases

For releases, the QA engineer works from a test plan. There is a generic test plan and, if applicable, a release-specific test plan that is linked from the generic one. Both have to be tested for an upcoming release.

To set up the test plans, the QA engineer needs detailed information on what new functionality or bug fixes are available in the new release.

Developers are supposed to transfer QA-related information from specific GitHub issues to the wiki pages. For issues, that means information on how the issue can be reproduced, the circumstances under which it occurs, and how it was fixed. For new features, the wiki page should explain what the new feature is, how it is used in general, and how it can be tested. Please remember to link between the issues and the wiki page where applicable.

QA engineers can either work directly from the wiki page or create their own test plans based on it. QA is also expected to document the testing efforts.

Early Involvement of QA

The earlier QA is involved in both bug fixing and new feature development, the better. QA is invited to give feedback throughout the development process, which happens best on GitHub. Developers are supposed to CC QA engineers as early as possible.

Automated QA

Unit Tests

We have two collections of unit tests: one originating from csync, here, and one based on the Qt Testing Framework, available here. To include the unit tests in the build, call cmake with the parameter -DUNIT_TESTING=ON.

The tests can be run manually by calling make test on Linux, but not (yet) on Windows or Mac.

State: Both unit test groups are part of the continuous integration server. They run on every check-in.

tx.pl Scripts

We maintain a set of Perl-based scripts to perform integration tests between client and server. The scripts wrap around the command line client owncloudcmd. The basic idea: the client directory and a test server are filled with a defined set of files through the server's WebDAV interface, entirely without using the sync client. After that, the script performs one or more sync runs. Once the sync runs have completed, the script asserts that the local and remote file trees are equal, or that they meet the expectation.
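
As an illustration of that flow (not the actual tx.pl scripts), here is a minimal Python sketch; the server URL, credentials, and local path are placeholders, it assumes the requests package and owncloudcmd on the PATH, and credential handling for the sync run is omitted:

```python
# Sketch of the tx.pl idea -- not the real scripts.
import os
import subprocess

import requests

SERVER = "https://test.example.com/remote.php/webdav"  # placeholder
AUTH = ("testuser", "secret")                          # placeholder
LOCAL = "/tmp/sync-test"                               # placeholder

def fill_server(files):
    """Put a defined set of files on the server via WebDAV,
    without involving the sync client at all."""
    for name, content in files.items():
        response = requests.put(f"{SERVER}/{name}", data=content, auth=AUTH)
        response.raise_for_status()

def sync_run():
    """Perform one sync run with the command line client
    (credential handling omitted for brevity)."""
    subprocess.run(["owncloudcmd", LOCAL, SERVER], check=True)

def local_tree():
    """Collect the relative paths of all files below LOCAL."""
    return sorted(
        os.path.relpath(os.path.join(root, name), LOCAL)
        for root, _, names in os.walk(LOCAL)
        for name in names
    )

files = {"a.txt": b"hello", "b.txt": b"world"}
os.makedirs(LOCAL, exist_ok=True)
fill_server(files)
sync_run()
assert local_tree() == sorted(files), "local and remote trees differ"
```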

The scripts can be run manually on the developer's machine by calling them directly from the csync/tests/ownCloud directory of the git checkout, on Linux but not (yet) on Windows or Mac. A prerequisite is to create a t1.cfg file based on the template in that directory.

State: The tx.pl scripts are part of the continuous integration server. They run on every check-in.

Smashbox

There is a test suite called smashbox which performs a whole variety of tests between the server and multiple clients. There is also an ownCloud branch.

State: Not yet running fully automated but manually driven by QA and Jakub Moscicki from CERN. Automation is WIP.

Performance Tests

Performance tests should be run on a regular basis to detect changes in either direction; only performance improvements should be accepted. For that, a defined set of data could be uploaded every night from one client to a server and synced down to another client. The server can be on the same machine as the clients, which minimizes network latency problems.
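
A nightly job along those lines could look roughly like this Python sketch; the paths and the server URL are placeholders, and it assumes the same basic owncloudcmd invocation as above:

```python
# Sketch of a nightly performance run -- placeholders throughout.
import subprocess
import time

SERVER = "http://localhost/owncloud/remote.php/webdav"  # same machine, low latency
CLIENT_A = "/var/perftest/client-a"  # holds the defined data set
CLIENT_B = "/var/perftest/client-b"  # starts empty, receives the data

def timed_sync(local_dir):
    """Run one sync and return its wall clock duration in seconds."""
    start = time.monotonic()
    subprocess.run(["owncloudcmd", local_dir, SERVER], check=True)
    return time.monotonic() - start

upload = timed_sync(CLIENT_A)    # push the defined data set to the server
download = timed_sync(CLIENT_B)  # sync it down to the second client
print(f"upload: {upload:.1f}s  download: {download:.1f}s")
```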

Timing information can be computed from the server's access_log, as it contains proper time stamps and can also log the server's processing time for each request. Some characteristic duration values should be stored so that each run can be compared with previous ones and a trend detected.
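
As a sketch of that analysis in Python, assuming Apache's %D field (request duration in microseconds) has been added to the LogFormat; the log path and output file are placeholders:

```python
# Sketch: extract per-request durations from the access_log and append
# characteristic values to a history file for trend comparison.
import csv
import re
import statistics
import time

# Matches e.g.: ... "PUT /remote.php/webdav/a.txt HTTP/1.1" 201 523 48213
# where the trailing number is Apache's %D (duration in microseconds).
LINE = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" \d{3} \S+ (?P<usec>\d+)$')

durations = []
with open("/var/log/apache2/access_log") as log:  # placeholder path
    for line in log:
        match = LINE.search(line)
        if match and "/remote.php/webdav" in match.group("path"):
            durations.append(int(match.group("usec")))

if durations:
    # store characteristic values so every run can be compared with earlier ones
    with open("perf-history.csv", "a", newline="") as out:
        csv.writer(out).writerow([
            time.strftime("%Y-%m-%d"),
            len(durations),
            statistics.median(durations),
            max(durations),
        ])
```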

To create and handle a reproducible, potentially huge file tree, we created a script set called dav torture. It can be found here and is described in a blog entry.

State: There is currently no automated performance testing.

Algorithm Validation

The sync algorithm is very complex and needs to be validated, especially for edge cases. Any single edge case is rare, but given the huge number of files we potentially deal with, it is very likely that even the edge cases occur.

Documentation

A first step is proper documentation of the algorithm, so that people can read and understand the idea and join the discussion.

The sync algorithm is documented in the ownCloud Client Documentation.

State: The documentation has been started but could be improved.