
[meta] Presentation API Testing #266

Open · 6 of 7 tasks
anssiko opened this issue Feb 12, 2016 · 57 comments

@anssiko
Member

anssiko commented Feb 12, 2016

When doing a larger change spanning across components it is often helpful to have a single issue for tracking the bigger change. This is such a "meta issue" for tracking the development of the test suite for the Presentation API.

Resources:

Tasks and related sub-issues:

All - @louaybassbouss is your test facilitator for the Presentation API spec and will be responsible for coordinating the test suite development. Please work with @louaybassbouss by contributing new test cases and reviewing existing tests. See the Work Mode > Testing wiki for more information on how to set up your own test environment.

Happy testing!

@louaybassbouss
Contributor

Thanks @anssiko. I will take responsibility for keeping the IDL harness up to date with the spec. I will try to extract the IDL for each conformance class (controlling and receiving UA) programmatically.
I'd also like to ask Presentation API implementors to share more information about their implementations with the group, e.g. which version of the controlling UA, which presentation devices, limitations, etc. This information may help testers while writing or reviewing tests.
For testing activities, I'd like to start with basic tests around the following features (a sketch of one such test follows below):

  • monitor screen availability.
  • launch new presentation.
  • reconnect to a running presentation.
  • join a running presentation.
  • presentation connection lifecycle (and transition between connection states as stated in the spec).
  • messaging (send and receive messages in the formats stated in the spec).

If you'd like to open a new issue, please use the testing label.
I will let you know when the initial web-platform-tests PR is merged.
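
For illustration, here is a rough sketch of what a basic "monitor screen availability" test could look like with testharness.js. This is not a test from the actual suite: the receiver URL and the exact assertions are placeholders.

<!DOCTYPE html>
<meta charset="utf-8">
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script>
// Sketch only: "receiver.html" is a placeholder receiver URL.
promise_test(() => {
  const request = new PresentationRequest('receiver.html');
  return request.getAvailability().then(availability => {
    assert_true(availability instanceof PresentationAvailability,
                'getAvailability() resolves with a PresentationAvailability');
    assert_equals(typeof availability.value, 'boolean',
                  'availability.value reports whether a display is available');
    availability.onchange = () => {};  // change events can be listened for
  });
}, 'Monitoring the list of available presentation displays');
</script>

Launching, reconnecting and messaging additionally need a user gesture and a second screen, which is why many of those tests will have to be manual.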

@anssiko
Member Author

anssiko commented Feb 15, 2016

Thanks @louaybassbouss. I updated the initial comment with the suggested tasks for basic tests, and will link to related issues when they appear.

@anssiko
Member Author

anssiko commented Feb 17, 2016

The initial PR has landed to establish the test structure for the Presentation API, so you're now able to run the work-in-progress tests in-browser at:

http://w3c-test.org/presentation-api/

The Presentation API tests are located in the w3c/web-platform-tests repo at:

https://github.com/w3c/web-platform-tests/tree/master/presentation-api

(I added these links to the head of the spec and updated the first comment in this issue accordingly.)

w3c-test.org mirrors the master branch of the web-platform-tests repo. Whenever a new Pull Request is merged to master, the mirror should be updated.

@louaybassbouss
Contributor

Thanks @anssiko. I just submitted a new PR to the web-platform-tests repo containing an update of idlharness.html for controlling and receiving UAs according to the latest ED from today.
Volunteers are welcome to review this PR. Please add a comment to the PR after review (if everything is fine, just put LGTM).
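
For reviewers who have not worked with idlharness before, the structure of such a file is roughly the following. This is a simplified sketch, not the actual idlharness.html, and the IDL fragment is heavily abbreviated; the real file carries the full IDL extracted for the conformance class under test.

<!DOCTYPE html>
<meta charset="utf-8">
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/resources/WebIDLParser.js"></script>
<script src="/resources/idlharness.js"></script>
<script>
// Feed the spec IDL for the conformance class (here: controlling UA) into
// idlharness and check it against the live objects exposed by the browser.
const idl_array = new IdlArray();
idl_array.add_untested_idls('interface Navigator {};');
idl_array.add_idls(`
  partial interface Navigator {
    [SameObject] readonly attribute Presentation presentation;
  };
  interface Presentation {
    // members elided for brevity
  };
`);
idl_array.add_objects({
  Presentation: ['navigator.presentation']
});
idl_array.test();
</script>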

@anssiko
Member Author

anssiko commented Mar 15, 2016

@zqzhang Could you look into creating some of the basic tests enumerated in #266 (comment)?

@zqzhang
Member

zqzhang commented Apr 29, 2016

@yhe39 will jump in on this after the International Labor Day holidays.

@anssiko
Member Author

anssiko commented May 2, 2016

Thanks @zqzhang and @yhe39 -- please coordinate your test contributions with the Test Facilitator @louaybassbouss. He indicated he'd have new contributors to help with testing too.

@louaybassbouss feel free to share your plans and expected schedule. It'd be good to keep this meta issue updated as testing progresses so that more people can jump in and help.

@anssiko
Member Author

anssiko commented May 12, 2016

Initial test results published (thanks @louaybassbouss and @zqzhang):

https://w3c.github.io/test-results/presentation-api/controlling-ua/all.html
https://w3c.github.io/test-results/presentation-api/receiving-ua/all.html

These test results contain idlharness.js tests only at the moment. For increased coverage, @louaybassbouss, @zqzhang, and @yhe39 will be adding basic tests (see the first comment) beyond the existing tests for WebIDL interfaces. Any ambiguities or issues in the spec that may be discovered while writing further test cases should be reported to the group so we can improve the spec accordingly.

@louaybassbouss
Contributor

I am happy to announce that the first test report for the Presentation API has been published. It includes separate test results for controlling and receiving UAs, and for now covers only the WebIDL tests. Many thanks to @zqzhang for the support. Also welcome to @yhe39 from Intel, and @mariuswsl and @taff-franck from Fraunhofer FOKUS, who volunteered to support this activity. @mariuswsl and @taff-franck are working on the first 4 basic tests from the list in the first comment. Other volunteers are welcome to write, review or run tests.

We plan to publish a second test report before the next F2F meeting and a third one by the end of June. The second report will cover a subset of the basic tests, and the third all basic tests and hopefully more browsers.

To Presentation API implementers: can you please share information with the group about current status, limitations, setup of presentation devices, etc.? This will help us a lot in writing and running tests.

@anssiko
Member Author

anssiko commented May 12, 2016

Thanks for the update @louaybassbouss, and welcome @mariuswsl and @taff-franck to the growing Presentation API testing team.

@louaybassbouss
Contributor

Thanks @mariuswsl and @taff-franck for the submission of new tests, and @zqzhang for the review. These tests address the following features of the API:

  • Monitor screen availability
  • Launch new presentation
  • Reconnect to a running presentation

Once the PRs are merged, I will publish a new test report.

@louaybassbouss
Contributor

I just submitted a PR for the second test report. @zqzhang, can you review and merge it?

@anssiko
Member Author

anssiko commented Jun 27, 2016

Thanks @louaybassbouss! Ping @Honry who can probably help review. @zqzhang may be unavailable currently.

@louaybassbouss
Contributor

Done. BTW, I also submitted a new PR with idlharness updates. @Honry, can you also review and merge this?

@Honry

Honry commented Jun 28, 2016

I would like to help review the PR, but I have no access rights to merge it, so we will still need help from someone else.

@anssiko
Member Author

anssiko commented Jun 28, 2016

Thanks for your help. Let us know when you have reviewed the PR and we'll find someone to merge it.

@Honry

Honry commented Jun 28, 2016

OK, I've added some comments on the PR, @louaybassbouss please help address them.

@zqzhang
Member

zqzhang commented Jun 29, 2016

The pull request for the testing report has been merged. For the test update pull request, please address the review comments and drop me a note; I can then help merge it.

@louaybassbouss
Contributor

The new Presentation API test reports are available here:

Thanks @zqzhang, @Honry, @yhe39, @mariuswsl and @taff-franck for writing or reviewing the tests.

PS: I will be on vacation until the end of July. During this time, please contact @zqzhang or @anssiko if you want to submit or review tests or if you need any help regarding testing.

@anssiko
Member Author

anssiko commented Jul 4, 2016

Thanks @louaybassbouss, @zqzhang, @Honry, @yhe39, @mariuswsl and @taff-franck!

(The test suite and implementation report are linked from the spec.)

@mfoltzgoogle
Contributor

Dependent issue filed:
web-platform-tests/wpt#3294

@louaybassbouss
Contributor

Hello all, I am back from vacation. Before I go through all the emails: is there anything urgent to do first regarding testing?

@tidoust
Member

tidoust commented Aug 30, 2016

To get a rough estimation of the test coverage of the spec so far, I prepared the following document, which @louaybassbouss and I will try to keep up to date as the group progresses on the test suite:
https://tidoust.github.io/presentation-api-testcoverage/

This document is meant to be a (quick-and-dirty) working document: it links sections of the spec to the test files that check them and gives an estimation of each section's coverage, along with comments on what still needs to be fixed, improved or done (comments are visible when you hover over the coverage percentages or the links to the test files). Tests may check more than one section. Tests that appear with "(PR)" are defined in a pull request and not merged yet.

The coverage estimation in percentage is rough and does not mean much beyond "section not tested", "some missing tests", "we could perhaps do better" and "should be good enough". In other words, it's more a way to quickly assess which sections still need some love; do not read too much into it. Ideally, the test suite will cover all sections. In practice, some tests are arguably more important than others, and some steps that depend on implementations may prove hard to test.

The document is on GitHub. Feel free to suggest updates and/or send pull requests:
https://github.com/tidoust/presentation-api-testcoverage/

@tomoyukilabs
Contributor

@mfoltzgoogle Thanks a lot for your update! However, I still haven't been able to confirm the Receiver API in 1-UA mode (with arbitrary URLs) with Chrome Dev for Mac 58.0.3004.3; a receiver UA does not seem to have navigator.presentation.receiver. Can you see anything I might be missing?

@mfoltzgoogle
Contributor

The receiver property is only exposed in pages started as presentations. I can double check the status when I have access to a Mac.
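
In code terms, a minimal sketch (receiver.html and the #present button are placeholders): the page passed to PresentationRequest and launched via start() is the one that gets navigator.presentation.receiver; the controlling page itself does not.

// Controlling page (a regular tab): navigator.presentation.receiver is not
// a PresentationReceiver here.
const request = new PresentationRequest('receiver.html');  // placeholder URL

// start() must be triggered by a user gesture, e.g. a click.
document.querySelector('#present').onclick = () => {
  request.start().then(connection => {
    connection.onconnect = () => connection.send('hello from the controller');
  });
};

// receiver.html, once loaded by the UA as a presentation, is where
// navigator.presentation.receiver is exposed as a PresentationReceiver.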

@tomoyukilabs
Contributor

Yes, I have checked for the existence of navigator.presentation.receiver on the receiver side with code like:

<!DOCTYPE html>
<meta charset="utf-8">
<script>
window.onload = () => {
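  // Appends the PresentationReceiver object if exposed, "undefined" (or "null") otherwise.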
  document.body.innerHTML += navigator.presentation.receiver;
};
</script>

The result on both my Mac and the Chromecast is undefined, though.

@mfoltzgoogle
Contributor

Okay, I'm able to reproduce this. We're investigating.

@mfoltzgoogle
Contributor

Update:

  • You'll need to pass the flag --enable-blink-features=PresentationReceiver
  • We landed a bugfix that should help. Try today's Canary release or newer.

@tomoyukilabs
Contributor

Thanks! I have confirmed that PresentationReceiver works in 1-UA mode in Chrome Canary.
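
For reference, the receiver-side pattern being exercised looks roughly like this (a sketch; watchConnection is just an illustrative helper name):

// Inside the page that was launched as a presentation.
navigator.presentation.receiver.connectionList.then(list => {
  // Connections already established when the page loaded...
  list.connections.forEach(watchConnection);
  // ...and controllers that connect later.
  list.onconnectionavailable = event => watchConnection(event.connection);
});

function watchConnection(connection) {
  connection.onmessage = event => {
    // Echo incoming messages back to the controlling page.
    connection.send('received: ' + event.data);
  };
}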

@mfoltzgoogle
Contributor

@tidoust @tomoyukilabs Just pinging on the state of test coverage. Does the test coverage document [1] reflect the current status?

[1] https://tidoust.github.io/presentation-api-testcoverage/

@tomoyukilabs
Contributor

@mfoltzgoogle Yes, if I understand correctly.

@tidoust
Member

tidoust commented May 9, 2017

@mfoltzgoogle Yes, I maintain that document and it should roughly reflect the current status. In particular, thanks to @tomoyukilabs's thorough work on the test suite, it's almost all green now. The remaining work items are a bunch of minor things to fix or improve but the test suite now covers most algorithms.

@tidoust
Member

tidoust commented May 9, 2017

Actually, with today's updates, I believe we're basically done. What we'll need to do now is:

  1. run the tests and update the implementation report accordingly
  2. split tests where needed to make the implementation report more readable (most atomic tests have been grouped to avoid creating too many manual tests, but that means it may be hard to tell that a "fail" is actually a "pass except for one tiny little thing"); see the sketch below for what such a split could look like
  3. update the human-readable descriptions of the tests
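
To make point 2 concrete, a grouped test versus its split counterpart could look roughly like this. It is only a sketch using constructor checks that need no manual interaction; the invalid URL is a placeholder and the assertions use the newer assert_throws_dom helper from testharness.js.

// Grouped: one test() bundles several checks, so a single failing assertion
// reports the whole group as one opaque "fail".
test(() => {
  assert_throws_dom('NotSupportedError', () => new PresentationRequest([]));
  assert_throws_dom('SyntaxError',
                    () => new PresentationRequest('http://example.com:bad-port/'));
}, 'PresentationRequest constructor rejects invalid argument lists');

// Split: the same checks as separate tests give one row each in the report.
test(() => {
  assert_throws_dom('NotSupportedError', () => new PresentationRequest([]));
}, 'PresentationRequest constructor throws for an empty URL sequence');

test(() => {
  assert_throws_dom('SyntaxError',
                    () => new PresentationRequest('http://example.com:bad-port/'));
}, 'PresentationRequest constructor throws for an unparsable URL');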

@mfoltzgoogle
Contributor

Sounds good. We'll await the implementation report (or, if I get a slice of time, I can of course attempt to run the current test suite myself).

@mfoltzgoogle
Contributor

@tidoust Could we help out with the remaining TODOs here?

https://github.com/tidoust/presentation-api-testcoverage/blob/gh-pages/coverage.js

Also it looks like some sections of the spec are missing entries, so they show up as question marks in the coverage document.

@tidoust
Member

tidoust commented May 19, 2017

@mfoltzgoogle The remaining TODOs are around points that I'm not sure we can test in the end. For example, I do not see any good reproducible way to force an error when the connection gets established. And I'm not sure we can include tests in our test suite that require more than one secondary display to be available. I turned the "TODOs" into "If possible" suggestions for now.

The question marks were for sections for which I did not really know how to report coverage because the concepts they define can only be indirectly tested. I added links to these indirect tests and report the coverage for these sections as "N/A".

@mfoltzgoogle
Contributor

Thanks for the clarification. I think the following two should be possible; it just requires there to be multiple presentation displays (possibly supporting different presentation URLs). I can send a PR if I can find a suitable solution.

    comments: [
      'If possible: test with multiple availability URLs in the set of availability objects',
      'If possible: test with multiple presentation displays (test user may not have multiple displays at hand though)'
    ]
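
For the first suggestion, passing several URLs to the constructor should be enough to put more than one availability URL into the set of availability objects; a rough sketch with placeholder URLs:

promise_test(() => {
  // Two candidate receiver URLs; the UA monitors availability for the set.
  const request = new PresentationRequest([
    'receiver.html',                      // placeholder
    'https://example.com/receiver.html'   // placeholder
  ]);
  return request.getAvailability().then(availability => {
    assert_equals(typeof availability.value, 'boolean',
                  'value reports whether some display can present any of the URLs');
  });
}, 'getAvailability() with multiple presentation URLs');

The second suggestion (multiple physical displays) is harder to automate, so it would likely stay a manual test.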

@mfoltzgoogle
Contributor

I have a WIP to add a test for availability with multiple displays, but it needs a bit more work before I can send a PR.

Re: #266 (comment)

@tidoust

  • Should we expect an implementation report soon?
  • Is there an issue filed to split the tests up? Or does that just mean splitting the results in the implementation report (versus splitting the test cases)?

@tidoust
Member

tidoust commented Jun 16, 2017

Should we expect an implementation report soon?

My understanding is that @louaybassbouss is working on it. @louaybassbouss, any timeline?

Is there an issue filed to split the tests up? Or does that just mean splitting the results in the implementation report (versus splitting the test cases)?

I did not create a separate issue for that for now, because I do not know whether that will be needed at all.

It all depends on what the implementation report reveals. The situation we should avoid is a report with lots of failures triggered by the same bug, which would not convey the fact that the implementation actually supports the feature under test. If the updated report looks bad, we should try to make the tests more atomic. If the updated report looks good, we don't need to split the tests up.

@louaybassbouss
Contributor

@tidoust yes, we are working on this. There was an issue in the Test Runner that blocked us; @aleygues already reported it. After it was fixed, we were able to create the test report for Chrome Desktop, but for Chrome for Android the screen selection dialog was not displayed in any of the tests. @mfoltzgoogle, any idea? We use a Chromecast Ultra as the receiver.

@tomoyukilabs
Contributor

@louaybassbouss Is there anything I can help you with?

@mfoltzgoogle
Contributor

@louaybassbouss I'll take a look and see if I can repro. CC @avayvod

@louaybassbouss
Contributor

@mfoltzgoogle we were able to run the tests on Chrome Canary for Android; thanks @avayvod for fixing it. We now have reports for Chrome Canary for Android, Chrome Desktop and Firefox for Android. We then noticed that the results of the IDL tests are missing from the report due to an error in the idlharness lib, which seems to be fixed in the latest version. @aleygues wants to run the tests again to get the IDL results into the report as well, but the Test Runner is not working; this has happened many times in the last weeks. I hope the bug in the Test Runner will be fixed soon so we can run the tests with the idlharness. Otherwise, I can publish the report with the current results and we can update it when we have the new report. @tidoust, what do you think?

@mfoltzgoogle
Contributor

That sounds good to me.

@louaybassbouss
Contributor

I just submitted a PR with the new Presentation API test results, including the WebIDL test results. Three browsers were tested: Chrome for Android, Chrome Desktop and Firefox for Android. Thanks to @aleygues for helping run the tests.

@mfoltzgoogle added the P1 label Nov 1, 2017
@mfoltzgoogle
Contributor

From https://www.w3.org/2017/11/06-webscreens-minutes.html#x11:

ACTION: @mfoltzgoogle to investigate gaps in the test coverage for Chrome

@mfoltzgoogle self-assigned this Nov 8, 2017
@mfoltzgoogle
Contributor

In Chrome, we are tracking progress on WPT here:

https://bugs.chromium.org/p/chromium/issues/detail?id=705170

We are tracking coverage improvements in the following two issues; the former covers tests that run over the Blink + content layer, and the latter tests that run over the whole browser.

https://bugs.chromium.org/p/chromium/issues/detail?id=505664
https://bugs.chromium.org/p/chromium/issues/detail?id=678472

@tomoyukilabs
Contributor

@mfoltzgoogle @schien FYI, I have made detailed reports of the WPT results: [controlling UA][receiving UA]. I hope these reports are helpful for you.

Also, I have submitted a couple of PRs to the WPT repository to fix several bugs: web-platform-tests/wpt#9046, web-platform-tests/wpt#9009 (merged).

@mfoltzgoogle
Contributor

Thanks for the updated results; the detailed reasons for failure help with debugging.
I commented on the pending PR as well.

@tomoyukilabs
Contributor

The GitHub repo for the Presentation API test suite was moved to:
https://github.com/web-platform-tests/wpt/tree/master/presentation-api

@mfoltzgoogle removed the F2F label Mar 26, 2019