This is my favorite accessibility validator. I think it does the best job of pointing out potential problems and providing feedback and suggestions within the context of the page. Other validators that I looked at required one or more mouse clicks to see what the potential problems were and what could be done about them (see below).
Bobby not only points out the problems within the context of the page; it also points you to a more detailed description of each potential problem (should you need it) and provides links to the W3C Accessibility Guidelines/Recommendations.
What I don't like about Bobby is the way it points out where your problems might be on the page. Since each image on a page is a potential accessibility issue (even when it has associated ALT text), your page can quickly become littered with question marks.
I don't mind the pointers, although some logic could be added to check whether an image with empty ALT text appears to be a spacer gif: is it a gif, does the word "spacer" appear in its file name, is it only 1px wide or 1px tall, and so on. What I would like to see are some icons that clue me in to what the potential problems might be. This is something that I like about The Wave.
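The spacer-gif check described above could be sketched roughly like this. This is only an illustration of the heuristic, not code from any of these validators; the function name and parameters are hypothetical:

```python
def is_probable_spacer(src, alt, width, height):
    """Heuristic guess: is this image likely a layout spacer?

    An image with empty ALT text is treated as a probable spacer if it
    is a gif AND either has "spacer" in its file name or is only 1px
    wide or 1px tall.
    """
    if alt.strip():
        # Real ALT text present: assume it's content, not a spacer.
        return False
    name = src.lower().rsplit("/", 1)[-1]
    return name.endswith(".gif") and (
        "spacer" in name or width == 1 or height == 1
    )

# A validator could skip flagging images that pass this test:
print(is_probable_spacer("img/spacer.gif", "", 10, 10))          # True
print(is_probable_spacer("img/dot.gif", "", 1, 50))              # True
print(is_probable_spacer("img/logo.gif", "Company logo", 88, 31))  # False
```

A tool using something like this could suppress the question mark for probable spacers and flag everything else, cutting down the clutter considerably.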
This accessibility validator comes in a close second to Bobby. I really like that the potential problem areas are identified by icons. That is much better than the question marks you get with Bobby.
My main complaint is that the page you get back when validating has no in-context descriptions of what the issues might be. If the icons were links to anchors at the bottom of the page that explained the potential problems (a la Bobby), I'd be happier. As it is now, I need to open a new page to identify what the icons mean and how to use them to evaluate my page.
Since I use a Mac, I am a bit disappointed that most accessibility software is made for the Wintel group. But since most disabled users use those OSs, I guess it makes sense. I do have Virtual PC on my computer, and so was able to download and install A-Prompt, and even get it to work :)
I didn't care for the way this software works at all. In order to find out how to use it, you must use the Help application that accompanies it (though this isn't obvious at the outset). You are limited to files that reside locally on your machine (as far as I could tell, anyway). There are times when it is useful to be able to evaluate a page as it resides on a server - for instance, when you are evaluating the software via an emulator and don't want to copy a bunch of files into said emulator :)
Also, this software does not show you the potential problems in the context of a rendered page. You need a separate application to see how the changes the program makes (or suggests that you make) affect the design of the page. So some browser integration would be useful here.
I think this program tries to do too much. Whereas Bobby and The Wave are useful tools that not only ferret out potential issues but also do some teaching about how to fix the problems and why it is necessary, I got the sense that A-Prompt was trying to do all the fixing within the program, implying that a machine could find and fix most of the problems (particularly the six automatic fixes).
Maybe I am too worried about the way I code pages, but I don't trust machines to get things right.
I was also a bit troubled by the A-Prompt logo. At least Bobby makes it clear that it is only checking for Priority 1/Level A compliance. A-Prompt implies that you can use it to get Level AAA compliance. And that, combined with the "automatic" nature of the program, just doesn't sit right with me.
The biggest limitation of these tools is that there is so much room for interpretation in the guidelines. What constitutes an image that requires a LONGDESC or D-Link? Who decides if such descriptions are adequate for a non-sighted user? How can you tell if a table truly makes sense when linearized? Do the chosen colors provide enough contrast for color-blind users? Are related elements grouped? Is the text in the simplest language possible? Is the navigation clear and consistent?
All of these require human intervention to answer, and some of them are critical to the accessibility of the page. So for now we have to use these validators to help us find potential problems (which is why that word appears so often in this write-up). Humans still have to make the final decisions.