﻿<html>
<head>

<title>ATAG 2.0 Tester (Author) Template (Form) 32-3 - for Success Criterion 3.2</title>

<style type="text/css">


h2 {font-size: 150%;}
h1 {font-size: 200%;}


</style>
</head>

<body>

<h1>
ATAG 2.0 Tester (Author) Template (Form) 32-3 - for Success Criterion 3.2
</h1>
<p>
This form is divided into four parts: Part 1 - General Information, Part 2 - General Questions,
Part 3 - Specific Questions, and Part 4 - Supplemental Questions.  NOTE: Links to the ATAG 2.0
techniques may be included in the Part 3 questions, but there may be other questions to answer in
Part 3 as well.  The questions may therefore need to be reorganized so that some questions must
always be answered, while for others only one out of a set of questions (related to ATAG 2.0
techniques) needs to be answered.  If you need any additional help (for example, steps to take
or resources to access) in order to answer any of the following questions, please email
<a href="mailto:public-atag-tests@w3.org">public-atag-tests@w3.org</a>.  This form is intended to
be quick and easy to fill out.
<hr>
<h2>Part 1 - General Information</h2>
<p>
DISCLAIMER: Data entered on this form is informative only.  The submitter bears all responsibility for the
accuracy of the data.  The data entered on this form 
may be reviewed by the W3C AUWG?
<p>
 Answers to all questions are
required (except for those marked OPTIONAL); however, you may refer to answers on other templates
if you wish.  If you feel a question is not applicable, just write "not applicable" and
specify why.  If you need additional ATAG information (including access to the ATAG 2.0 spec and
techniques) to complete this form, you may go to the ATAG site.  If you need additional WAI testing
resources, go to ?  Please submit this form to "testreport site".  This information
will be made public and accessible.  After submitting this form, you may proceed directly to
the <a href="atag-tester-form33-3.htm">Tester Form for Success Criterion 3.3-3</a>, or go back to the <a href="ATAG20testsuite.htm">
main ATAG test page</a>.  Some results may be machine-reportable, in
which case the developer should have noted this on the developer's form (link?), and you
could attach the machine-reportable results to this form.  There are no dependencies on other tester forms
for filling out this form.  Thank you.

<h3>TEST PURPOSE:</h3> To evaluate an authoring tool's checking functions according to SC 3.2

<h3>TEST AGAINST:</h3> Success Criterion 3.2 - 
<ul>
<li>The authoring tool must always provide a check (automated check, semi-automated check, or
manual check) for each applicable requirement to conform to WCAG Level AAA.
<li>The authoring tool must always inform the author of any failed check results prior to completion
of authoring.
</ul>  
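<p>
To make the first bullet above concrete: an "automated check" is one the tool can run with no
human judgment required.  The following Python sketch is purely illustrative (it is not part of
ATAG 2.0 or any real tool, and the class and function names are invented for this example); it
flags img elements that lack an alt attribute, a classic fully-automatable WCAG check.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Hypothetical automated check: flag <img> elements with no alt attribute."""

    def __init__(self):
        super().__init__()
        self.failures = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            # Record where the problem is so the tool can point the author at it.
            self.failures.append(self.getpos())

def check_markup(markup):
    """Run the check over a fragment of HTML; return the list of failures."""
    checker = MissingAltChecker()
    checker.feed(markup)
    checker.close()
    return checker.failures

sample = '<p><img src="logo.png"><img src="photo.png" alt="A photo"></p>'
print(check_markup(sample))  # one failure reported, for the first <img>
```

A semi-automated or manual check would differ in that the author, not the tool, makes the final
decision about whether a flagged item is a real problem.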

<hr>
<h2>Part 2 - General Questions (may be replaced by Conformance Profile?)</h2> 
<h3>
Name of Tester (Author):
</h3>
<p>
<br>
<br>
<br>
<h3>
Date and Version of ATAG2.0 Specification Tested Against (if different than 22 Nov 04 WD):
</h3>
<p>
<br>
<br>
<br>
<h3>
Contact Info of Tester/Author(s):
</h3>
<p>
<br>
<br>
<br>
<h3>
Address of Tester/Author(s):
</h3>
<p>
<br>
<br>
<br>
<h3>
Email of Tester/Author(s):
</h3>
<p>
<br>
<br>
<br>
<h3>
Phone Number(s) of Tester/Author(s):
</h3>
<p>
<br>
<br>
<br>
<h3>
Fax Number of Tester/Author(s):
</h3>
<p>
<br>
<br>
<br>
<h3>
Authoring Tool Tested (please be specific: category of authoring tool,
format(s) output from the authoring tool, and platform that the authoring tool uses)
(indicate if the authoring tool is a bundled tool):
</h3>
<p> 
<br>
<br>
<br>
<h3>
Have you accessed the developer's form (if available) (link?) for this particular authoring tool and success criterion?  If so, do you have any questions about
any of that information?  It is not necessary (but desirable) for the developer to have submitted a form for the tool you
are testing.  
</h3>
<p>
<br>
<br>
<br>
<hr>





<h2>Part 3 - Specific Test Questions for Success Criterion 3.2</h2>
Note: These questions may in the future refer back, in a one-to-one relationship, to information on each question
provided by the tool developer on a separate template?  You have access to the developer's information
for each question.  For more information on the ATAG 2.0 spec, go to ?
<p>
INVESTIGATION:
<p>
<ul>
<li>How did the authoring tool provide automated checks to ensure meeting WCAG Level AAA requirements? 
 (may require authoring actions) (NOTE: MAY WANT TO STOP AT THIS POINT TO GET INFORMATION NECESSARY TO ANSWER FOLLOWING QUESTIONS,
AND RETURN TO FORM AT THIS POINT!) 
 
<br>
<br>
<br>

<li>How did the authoring tool provide semi-automated checks to ensure meeting WCAG Level AAA requirements? 
 (may require authoring actions) (NOTE: MAY WANT TO STOP AT THIS POINT TO GET INFORMATION NECESSARY TO ANSWER FOLLOWING QUESTIONS,
AND RETURN TO FORM AT THIS POINT!) 
 
<br>
<br>
<br>

<li>How did the authoring tool provide manual checks to ensure meeting WCAG Level AAA requirements? 
 (may require authoring actions) (NOTE: MAY WANT TO STOP AT THIS POINT TO GET INFORMATION NECESSARY TO ANSWER FOLLOWING QUESTIONS,
AND RETURN TO FORM AT THIS POINT!) 
 
<br>
<br>
<br>
<li>How did the authoring tool inform you of failed check results (see previous?)?  (may require authoring actions) (NOTE: MAY WANT TO STOP AT THIS POINT TO GET INFORMATION NECESSARY TO ANSWER FOLLOWING QUESTIONS,
AND RETURN TO FORM AT THIS POINT!) 
 
<br>
<br>
<br>



</ul>
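<p>
As background for the automated / semi-automated / manual distinction used in the questions above,
here is a hypothetical sketch of a semi-automated check (invented for illustration; the names,
heuristic, and workflow are assumptions, not taken from ATAG 2.0 or any real tool): the tool flags
suspicious alt text automatically, but the author confirms or dismisses each candidate.

```python
import re

# A placeholder-looking alt text: a filename, or "image1" / "photo" and the like.
SUSPICIOUS_ALT = re.compile(r"\.(png|jpe?g|gif)$|^(image|img|photo)\d*$", re.IGNORECASE)

def flag_suspicious_alt(alt_texts):
    """Automated half: detect *potential* problems; no judgment applied."""
    return [t for t in alt_texts if SUSPICIOUS_ALT.search(t.strip())]

def semi_automated_check(alt_texts, confirm):
    """Human half: the author confirms or dismisses each flagged candidate.

    `confirm` stands in for the tool's prompt/dialog to the author.
    """
    return [t for t in flag_suspicious_alt(alt_texts) if confirm(t)]

candidates = ["A sunset over the harbour", "logo.png", "image1"]
# Here the stand-in "author" confirms every candidate:
print(semi_automated_check(candidates, confirm=lambda text: True))
# → ['logo.png', 'image1']
```

A fully manual check would skip the flagging step entirely and simply instruct the author what to
look for, which is why the questions above treat the three kinds of check separately.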
<p>
<!--
<h2> [(link to developer's) 7.2.1.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the enabling of user
 input/output choice according to Part 7.2.1.1 of ISO16071:2002(E) (please be specific-if (so/not), how was it (so/not) demonstrated?)?
What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.2.2 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate enabling the user to
 perform task effectively with any single input device according to Part 7.2.2 of ISO16071:2002(E)(please be specific
-if (so/not), how was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.2.4 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate enabling user setting
 of timed responses according to Part 7.2.4 of ISO16071:2002(E)(please be specific-if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.2.10 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate an avoidance of
 seizure-inducing blink rates according to Part 7.2.10 of ISO16071:2002(E)(please be specific-if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.2.12 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate enabling user control
 of time-sensitive presentation of information according to Part 7.2.12 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>

<p>
<h2> [(link to developer's) 7.3.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the use of system-standard input/output
 according to Part 7.3.1 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.3.2 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the provision of object labels
 according to Part 7.3.2 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.3.3 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate how to make event notification
available to assistive technologies
 according to Part 7.3.3 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.3.4 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate how to make object attributes available to
assistive technologies according to Part 7.3.4 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.3.5 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the use of system-standard input/output
 according to Part 7.3.5 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.4.11 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the reserving of accessibility key mappings
 according to Part 7.4.11 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.4.13 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the separation of keyboard navigation and activation
 according to Part 7.4.13 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.5.2 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the enabling of location of button functions
 according to Part 7.5.2 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.5.9 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate the provision of alternatives to chorded key presses
 according to Part 7.5.9 of ISO16071:2002(E)(please be specific-if (so/not), how
 was it (so/not) demonstrated)?  What test procedure did you use?  What test environment did you use?</h2>
<p>
<h2> [(link to developer's) 7.6.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate font customization and legibility according to Part 7.6.1 of ISO16071:2002(E)(please be specific
if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.8.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate alternatives to the use of color as the sole source of information according to Part 7.8.1 of 
ISO16071:2002(E)(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.8.6 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate provision of alternatives to coding by hue according to Part 7.8.6 of ISO16071:2002(E)(please be specific
if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.9.5 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate allowing users to choose visual indication of audio output according to Part 7.9.5 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.10.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate allowing task-relevant warning or error information to persist according to Part 7.10.1 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.12.3 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate enabling cursor and pointer customization according to Part 7.12.3 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.13.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate enabling non-pointer navigation directly to windows according to Part 7.13.1 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.14.1 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate provision of focus cursor according to Part 7.14.1 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.14.2 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate provision of keyboard navigation according to Part 7.14.2 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>
<h2> [(link to developer's) 7.14.3 entry?] For each specific authoring action mentioned previously, did the authoring tool interface demonstrate provision of navigation to task-appropriate groups of controls according to Part 7.14.3 of ISO16071:2002(E)
(please be specific - if (so/not), how was it (so/not) demonstrated)?
What test procedure did you use?  What test environment did you use?</h2>  
<p>

-->
<hr>
<p>
AFTER THE INVESTIGATION:


<hr>
<p>
Please answer the following questions for TIMES 1 &amp; 2 (different) (Please enter them at this point:____),
for each example of updated/added web content x and/or instruction y, as specified previously, that
you found in answers to previous questions:
(NOTE: x goes from 1 to n, where n is greater than or equal to 1)
(NOTE: y goes from 1 to m, where m is greater than or equal to 1)

<ul>

  

<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.1?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.2?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.3?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.4?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.1?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.2?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.3?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.4?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.5?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC3.1?
If yes, why?  If no, why not?
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC3.2?
If yes, why?  If no, why not?   
<br>
<br>
<br>   
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC4.1?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Was a check provided to you to make sure that web content x 
passed all of the (specific version of) WCAG20 Level 1, 2 and 3 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC4.2?
If yes, why?  If no, why not?   
<br>
<br>
<br>




</ul>
<hr>
<p>

<p>
Please answer the following questions for TIMES 1&2 (different) (Please enter them at this point:____),
 for every type of markup and example of content that had acceptable formats, that
you found in answers to previous questions:
<ul>

<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.1) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.2) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.3) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC1.4) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.1) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.2) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.3) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.4) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC2.5) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC3.1) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC3.2) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>   
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC4.1) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>
<li>Were any failed check results (not passing all of the (specific version of) WCAG20 Level 1, 2 and 3 
 Tests in the WCAG2.0 Test Suite for WCAG2.0 SC4.2) reported to you prior to completion of your authoring task?
If yes, why?  If no, why not?   
<br>
<br>
<br>

<li>In sum, did the authoring tool ensure that all pre-authored content provided by the authoring tool conformed to the
relevant WCAG Level AAA checkpoints as described previously?
If yes, why?  If no, why not?
<br>
<br>
<br> 


</ul>
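<p>
The questions above all turn on whether failed check results reached you "prior to completion of
authoring".  That idea can be pictured as a save-time gate, sketched below in Python.  This is an
editorial illustration only: the function names and the single invented check are assumptions for
this example, not part of ATAG 2.0 or any real tool.

```python
def run_checks(content, checks):
    """Run every registered check; collect (check_name, problem) pairs."""
    failures = []
    for name, check in checks.items():
        for problem in check(content):
            failures.append((name, problem))
    return failures

def finish_authoring(content, checks, report):
    """Gate completion of authoring on the check results."""
    failures = run_checks(content, checks)
    if failures:
        report(failures)  # the author is informed *before* authoring completes
    return not failures   # authoring completes only when every check passed

# One invented check: flag <img ...> tags that carry no alt attribute.
checks = {
    "missing-alt": lambda c: [t for t in c.split("<")
                              if t.startswith("img ") and "alt=" not in t],
}
ok = finish_authoring('<p><img src="a.png"></p>', checks, report=print)
print(ok)  # → False: the failed check was reported before completion
```

A tool that only surfaces failures after the content has been published would not satisfy this
pattern, regardless of how good its individual checks are.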


<!--

<ul>
<li>Despite prompting assistance from the tool (see Checkpoint 3.1), accessibility problems may still be introduced. For example, the author may cause accessibility problems by hand coding or by opening content with existing accessibility problems for editing. In these cases, the prompting and assistance mechanisms that operate when markup is added or edited (i.e. insertion dialogs and property windows) must be backed up by a more general checking system that can detect and alert the author to problems anywhere within the content (e.g. attribute, element, programmatic object, etc.). It is preferable that this checking mechanisms be well integrated with correction mechanisms (see Checkpoint 3.3), so that when the checking system detects a problem and informs the author, the tool immediately offer assistance to the author.

<li>The checkpoints in guideline 4 require that implementations of checking be: [@@expand this ? - could be techniques? - needs more emphasis and visibility@@]
    <ul>
    <li>clearly available to the author (checkpoint 4.3) [Priority 2]
    <li> configurable (checkpoint 4.X) [Priority 2]
    <li> integrated into the workflow of Web Content development (checkpoint 4.1) [Priority 2]
    <li> naturally integrated into the appearance and interactive style of the tool (checkpoint 4.4) [Priority 3]
    </ul>
<li> The authoring tool must always provide a check (automated check, semi-automated check or manual check) for each applicable requirement to conform to WCAG.

<li>Automate as much checking as possible. Where necessary provide semi-automated checking. Where neither of these options is reliable, provide manual checking.
<ul>
        <li> Automated: In automated checking, the tool is able to check for accessibility problems automatically, with no human intervention required. This type of check is usually appropriate for checks of a syntactic nature, such as the use of deprecated elements or a missing attribute, in which the meaning of text or images does not play a role.

         <li>   Example 3.2.1(a): This illustration shows a summary interface for a code-based authoring tool that displays the results of an automated check. (Source: mockup by AUWG)
            [longdesc missing]

         <li>   Example 3.2.1(b): This illustration shows an interface that displays the results of an automated check in a WYSIWYG authoring view using blue squiggly highlighting around or under rendered elements, identifying accessibility problems for the author to correct. (Source: mockup by AUWG)
            [longdesc missing]

          <li>  Example 3.2.1(c): This illustration shows an authoring interface of an automated check in a code-level authoring view. In this view, the text of elements with accessibility problems is shown in a blue font, instead of the default black font. (Source: mockup by AUWG)
            [longdesc missing]
</ul>
        <li> Semi-Automated: In semi-automated checking, the tool is able to identify potential problems, but still requires human judgment by the author to make a final decision on whether an actual problem exists. Semi-automated checks are usually most appropriate for problems that are semantic in nature, such as descriptions of non-text objects, as opposed to purely syntactic problems, such as missing attributes, that lend themselves more readily to full automation.

         <li>   Example 3.2.1(d): This illustration shows a dialog box that appears once the tool has detected an image without a description attribute. However, since not all images require a description, the author is prompted to make the final decision. The author can confirm that this is indeed an accessibility problem and move on to the repair stage by choosing "Yes". (Source: mockup by AUWG)
            [longdesc missing]

        <li> Manual: In manual checking, the tool provides the author with instructions for detecting a problem, but does not automate the task of detecting the problem in any meaningful way. As a result, the author must decide on their own whether or not a problem exists. Manual checks are discouraged because they are prone to human error, especially when the type of problem in question may be easily detected by a more automated utility, such as an element missing a particular attribute.

         <li>   Example 3.2.1(e): This illustration shows a dialog box that reminds the author to check if there are any words in other languages in the document. The author can move on to the repair stage by pressing "Yes". (Source: mockup by AUWG)
            [longdesc missing]

<li>Consult the Techniques For Accessibility Evaluation and Repair Tools [WAI-ER @@change to AERT@@] Public Working Draft for evaluation and repair algorithms related to WCAG 1.0. @@rewording@@

<li> The authoring tool must inform the author of any failed check results prior to completion of authoring.


<li>Tool designer lists on form all web content types produced by the tool
<li>Tool designer lists on form all WCAG-capable formats supported for each content type mentioned previously
<li>Tool designer specifies on form whether any format selections of the tool are automatic 
<li>Author verifies on form that all of the following are true for each supported/selected format mentioned before (or N/A if not applicable?):  

<li>For every referenced format, all the WCAG tests are satisfied


</ul>
--><p>
<hr>
<ul>
<li>What test procedure did you use for SC3.2?  What test environment did you use for SC3.2?
<br>
<br>
<br>
<li>
SUMMARY QUESTION: From your perspective, did this authoring tool "pass" 
("yes" or "N/A" answers to all non-OPTIONAL questions AFTER THE INVESTIGATION) ATAG 2.0 Success Criterion 3.2?
If yes, why?  If not, why not? (please be specific)
<br>
<br>
<br>
</ul> 
<p>

<hr>
<h2>Part 4 - Supplemental Questions</h2>

 
<h3>
Please give any other information you feel may be helpful (please be specific): 
</h3>
<p>
<br>
<br>
<br>
<h3>
Please comment on the quality of the questions asked and/or the specification/techniques (please be specific): 
</h3>
<p>
<br>
<br>
<br>
<h3>
What other questions do you feel might be helpful?  Do you have any bugs/issues with this form? (please be specific)  
</h3>
<p>
<br>
<br>
<br>
<h3>Date of completion of this form (template):</h3>
<p>
<br>
<br>
<br> 
<p>
Thank you very much!  Your evaluation will be logged and made publicly available.
</p>



   


</body>
</html>