At a regular pace, and in a tabloid fashion, people throw tables of numbers in each other's faces, grids of greens and reds. Each of these tables shows percentages and scores, waved as a flag to claim the virtue of conformance to a specification. "We are a modern browser", "We have a score of 100% for HTML5", etc. Let's make it clear.
These are wrong assessments of Conformance.
In the past, I spent part of my time working on conformance topics at W3C. I was the editor of QA Framework: Specification Guidelines (the work of a wonderful group of people), which explains how to define conformance in specifications.
A few facts
- Passing test cases just means passing test cases.
- Conformance is a tool for defining what is mandatory in a specification in order to achieve basic interoperability between different products.
- An Implementation Conformance Statement (ICS) is a tool that lists all the requirements an implementation has to satisfy to assess conformance (see the sketch after this list).
- It is not possible to claim conformance to a technology if the technology does not define what conformance means.
- Conformance is unrelated to certification.
- The quality of an implementation lies in the details of interoperability.
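To make the ICS idea concrete, here is a minimal sketch of my own (not a W3C format): an ICS is essentially a checklist mapping each requirement of a specification to the implementer's claim of support. The requirement identifiers and the `conforms` helper are invented for the illustration.

```python
# A minimal, hypothetical sketch of an Implementation Conformance
# Statement (ICS): a checklist mapping each requirement of a
# specification to the implementer's claim of support.
from dataclasses import dataclass

@dataclass
class Requirement:
    ident: str        # identifier of the requirement in the spec (invented here)
    mandatory: bool   # True if the spec says MUST, False for MAY/SHOULD
    supported: bool   # the implementer's claim for this product

ics = [
    Requirement("req-001", mandatory=True,  supported=True),
    Requirement("req-002", mandatory=True,  supported=False),
    Requirement("req-003", mandatory=False, supported=True),
]

def conforms(statement):
    """Conformance can only be claimed if every mandatory
    requirement is supported; optional ones do not count."""
    return all(r.supported for r in statement if r.mandatory)

print(conforms(ics))  # False: a mandatory requirement is not supported
```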
What can we do?
- Create more tests
- Report bugs
- Publish a report for each individual test, not a meaningless global score (see the sketch below)
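As an illustration of per-test reporting (again a sketch of my own, with invented test names), compare a single global percentage with results listed test by test:

```python
# A sketch of per-test reporting; the test names are invented.
results = {
    "canvas-2d-fill":   "pass",
    "canvas-2d-shadow": "fail",
    "video-webm":       "pass",
    "video-codecs-api": "pass",
}

# The global score hides which features actually fail...
score = sum(1 for r in results.values() if r == "pass") / len(results)
print(f"global score: {score:.0%}")   # "75%" says nothing about what works

# ...while a per-test report tells implementers and authors what to expect.
for test, outcome in sorted(results.items()):
    print(f"{test}: {outcome}")
```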
Tests are very useful because they push the envelope and help increase interoperability. The more we have, the better. The Open Web technologies have many dimensions of variability and are therefore very hard to test. A few hundred tests are close to nothing when it comes to understanding and reaching interoperability. The XML specification had tens of thousands of tests, and the test suite was still not complete.