
Create a report by ARIA/HTML feature #1519

@ccanash

Description

Top-Level Report Page Updates

On the reports page, after the introduction and before the L2 heading with text “Support Levels”, add a tab list with three tabs: “Test Plans”, “ARIA Features”, and “HTML Features”.

The L2 heading that currently has content “Support Levels” will serve as a tab title and have content of either “Test Plan Support Levels”, “ARIA Feature Support Levels”, or “HTML Feature Support Levels”.

Table Summary by AT/browser combination

ARIA Summary Table example:

| ARIA Feature  | JAWS and Chrome | NVDA and Chrome | VoiceOver for macOS and Safari |
| ------------- | --------------- | --------------- | ------------------------------ |
| aria-selected | X% of passing   | X% of passing   | X% of passing                  |
| button        | X% of passing   | X% of passing   | X% of passing                  |

HTML Summary Table example:

| HTML Feature | JAWS and Chrome | NVDA and Chrome | VoiceOver for macOS and Safari |
| ------------ | --------------- | --------------- | ------------------------------ |
| Button       | X% of passing   | X% of passing   | X% of passing                  |
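As a rough sketch, one row of these summary tables could be modeled as follows. Every name here is illustrative, not taken from the ARIA-AT codebase:

```ts
// Hypothetical shape for one row of the ARIA/HTML feature summary table.
// Each cell holds the passing percentage for one AT/browser combination.
interface FeatureSummaryRow {
  feature: string;              // e.g. "aria-selected" or "button"
  specUrl: string;              // spec link derived from support.json
  supportByAtBrowser: {
    atName: string;             // e.g. "JAWS"
    browserName: string;        // e.g. "Chrome"
    percentPassing: number;     // "X% of passing", defined below
    detailReportUrl: string;    // link to the ARIA/HTML Feature Detail Report
  }[];
}
```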

Percentage and value definitions for AT/Browser Combo Support Level columns

Each percentage combines must-have and should-have behaviors into a single total assertion count (X% of passing = P/N*100), as sketched in the example after this list:

  1. N: Total number of “Must-Have” and “Should-Have” assertions that reference this attribute across all most-recent reports on this AT/browser combination, where most-recent is:
    a. Report generated from most recent version of recommended test plan if available.
    b. If a recommended test plan report is not available, report generated from most recent version of candidate test plan if available.
    c. Data from draft test plans should not be included in attribute reports.
  2. P: Number of N assertions with a passing verdict.
  3. F: Number of N assertions with a failing verdict.
  4. U: Number of N assertions that were untestable.
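A minimal sketch of the formula above, assuming the verdicts have already been narrowed to the most-recent recommended or candidate report for the AT/browser combination (names are hypothetical):

```ts
// Verdicts for the Must-Have and Should-Have assertions referencing one attribute.
type Verdict = 'passing' | 'failing' | 'untestable';

function supportLevel(verdicts: Verdict[]): number {
  const N = verdicts.length;                               // total Must-Have + Should-Have assertions
  const P = verdicts.filter(v => v === 'passing').length;  // assertions with a passing verdict
  // F and U are not needed for the percentage itself, but N = P + F + U.
  return N === 0 ? 0 : (P / N) * 100;                      // X% of passing = P/N*100
}
```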

Additional Summary Table Features

  1. The attributes link to specifications using data in support.json.
  2. X% Support Level links to the raw data report described below.
  3. Data can be downloaded in CSV format. How the values will be split across columns in the CSV is TBD.

Source Data

  1. Each assertion, with some exceptions, has one or more associated ARIA attribute references, specified in “assertions.csv”.
  2. The attribute being referenced is defined by the combination of the “type” and “value” columns in “references.csv”. Note that there are currently two types: aria and htmlAam. Reports by attribute should include both.
  3. Each assertion with one or more references has verdicts from one or more commands in one or more tests in a test plan report.
  4. A “support level” could be a percentage of passing verdicts from P1 and P2 (MUST and SHOULD) assertions, excluding P3 assertions. Note that the priority of a given assertion is context dependent, e.g., assertion A1 may have priority P1 for command C1 of test T1 and have priority P3 for command C2 of test T1.
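To illustrate items 2 and 4, here is a sketch of the filtering step. The field names are assumptions based on the column descriptions above, not the actual CSV schemas:

```ts
// One verdict joined from assertions.csv, references.csv, and a test plan report.
interface AssertionVerdict {
  referenceType: 'aria' | 'htmlAam';   // "type" column in references.csv
  referenceValue: string;              // "value" column, e.g. "aria-selected"
  priority: 1 | 2 | 3;                 // context dependent: per command, per test
  verdict: 'passing' | 'failing' | 'untestable';
}

// Keep only MUST (P1) and SHOULD (P2) assertions that reference the given
// attribute, accepting both aria and htmlAam reference types per item 2.
function verdictsForAttribute(rows: AssertionVerdict[], attribute: string): AssertionVerdict[] {
  return rows.filter(r => r.referenceValue === attribute && r.priority !== 3);
}
```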

ARIA/HTML Feature Detail Report

A new kind of report page that can be reached from links on the support level values in the top-level summary of ARIA/HTML feature support.

Page Title

Format:
AT_NAME and BROWSER_NAME Support for FEATURE_NAME | ARIA-AT Reports

Example:
JAWS and Chrome Support for aria-selected | ARIA-AT Reports
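For illustration, the title could come from a simple template (sketch only):

```ts
const pageTitle = (at: string, browser: string, feature: string): string =>
  `${at} and ${browser} Support for ${feature} | ARIA-AT Reports`;

pageTitle('JAWS', 'Chrome', 'aria-selected');
// => "JAWS and Chrome Support for aria-selected | ARIA-AT Reports"
```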

Page Content

  1. Breadcrumbs
  2. An L1 heading with content “AT_NAME and BROWSER_NAME Support for FEATURE_NAME” (FEATURE_NAME should link to spec derived from info in support.json)
  3. An L2 heading "Summary of Results for AT_NAME and BROWSER_NAME" followed by a table of detailed breakdowns of # of passing, failing, and untestable assertions per priority level
  4. A “Download CSV” button
  5. A table of the raw data.

The assertion statistics summary should include the following columns and rows:

|                       | Passing  | Failing | Untestable |
| --------------------- | -------- | ------- | ---------- |
| Must-Have Behaviors   | 6 of 12  | 3 of 12 | 3 of 12    |
| Should-Have Behaviors | 4 of 8   | 2 of 8  | 2 of 8     |
| Must + Should         | 10 of 20 | 5 of 20 | 5 of 20    |
| Percent of Behaviors  | 50%      | 25%     | 25%        |
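A sketch of how those rows could be derived from per-priority counts. The numbers in the comments match the example table, and all names are hypothetical:

```ts
interface PriorityCounts { passing: number; failing: number; untestable: number; }

function summaryRows(must: PriorityCounts, should: PriorityCounts) {
  const total = (c: PriorityCounts) => c.passing + c.failing + c.untestable;
  const combined: PriorityCounts = {
    passing: must.passing + should.passing,
    failing: must.failing + should.failing,
    untestable: must.untestable + should.untestable,
  };
  const n = total(combined);
  return {
    mustHave: { ...must, outOf: total(must) },        // e.g. 6 of 12, 3 of 12, 3 of 12
    shouldHave: { ...should, outOf: total(should) },  // e.g. 4 of 8, 2 of 8, 2 of 8
    mustPlusShould: { ...combined, outOf: n },        // e.g. 10 of 20, 5 of 20, 5 of 20
    percentOfBehaviors: {                             // e.g. 50%, 25%, 25%
      passing: (combined.passing / n) * 100,
      failing: (combined.failing / n) * 100,
      untestable: (combined.untestable / n) * 100,
    },
  };
}
```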

The data table would contain the following columns:

  1. Test Plan: Hyperlink with text of the name and version of the test plan
  2. Test Title: Hyperlink that targets the heading for the test in the test plan report
  3. Command: The command sequence as it appears in the test plan report
  4. Assertion Priority: Must/Should
  5. Assertion Phrase: The assertion phrase as it appears in the test plan report
  6. Result: Passed|Failed|Untestable
  7. Last Tested On: Date of the test as it appears in the test run history
  8. Last Tested AT Version: Name/Version of AT as it appears in the run history in the test plan report
  9. Last Tested Browser Version: Name/Version of browser as it appears in the run history in the test plan report
  10. Number of severe side effects: Count of severe side effects recorded for the command in the test plan report
  11. Number of moderate side effects: Count of moderate side effects recorded for the command in the test plan report
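For illustration, those columns could map to a row type like the following (field names are assumptions, not the report's actual data model):

```ts
interface DetailReportRow {
  testPlan: { name: string; version: string; url: string }; // 1. Test Plan hyperlink
  testTitle: { text: string; url: string };                 // 2. link to the test's heading
  command: string;                                          // 3. command sequence
  assertionPriority: 'Must' | 'Should';                     // 4.
  assertionPhrase: string;                                  // 5.
  result: 'Passed' | 'Failed' | 'Untestable';               // 6.
  lastTestedOn: string;                                     // 7. date from run history
  lastTestedAtVersion: string;                              // 8.
  lastTestedBrowserVersion: string;                         // 9.
  severeSideEffects: number;                                // 10.
  moderateSideEffects: number;                              // 11.
}
```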

Detail report table features

  1. Default sort order is ascending by test plan name, then test title, then command (see the comparator sketch below)
  2. Given the CSV download, V1 of this report does not need sort, filter, or search functionality.
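The default order in item 1 could be a single comparator over the DetailReportRow sketch above (illustrative only):

```ts
// Ascending by test plan name, then test title, then command,
// reusing the hypothetical DetailReportRow type from the previous sketch.
const byDefaultOrder = (a: DetailReportRow, b: DetailReportRow): number =>
  a.testPlan.name.localeCompare(b.testPlan.name) ||
  a.testTitle.text.localeCompare(b.testTitle.text) ||
  a.command.localeCompare(b.command);
```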

CSV File

The CSV file could have a name that reflects the page title:
AT_NAME+BROWSER_NAME-Support-for-FEATURE_NAME-ARIA-AT.csv

Example:
JAWS+Chrome-Support-for-aria-selected-ARIA-AT.csv

The first column of the report should be the ARIA/HTML Feature Name. This will disambiguate the content and enable people to easily combine multiple reports.

If a column is a hyperlink, the CSV file should have two columns: one for the link text and one for the link URL.
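A sketch of the filename pattern and the two-column treatment of hyperlinks; the helper names are hypothetical:

```ts
// e.g. "JAWS+Chrome-Support-for-aria-selected-ARIA-AT.csv"
const csvFileName = (at: string, browser: string, feature: string): string =>
  `${at}+${browser}-Support-for-${feature}-ARIA-AT.csv`;

// A hyperlinked cell becomes two CSV columns: link text and link URL.
const hyperlinkColumns = (text: string, url: string): string[] => [text, url];

// The first CSV column is always the ARIA/HTML feature name, so rows from
// multiple downloaded reports can be combined without ambiguity.
const csvRow = (feature: string, cells: string[]): string =>
  [feature, ...cells].map(c => `"${c.replace(/"/g, '""')}"`).join(',');
```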
