Dog Food, CMS Accessibility and a Nice Surprise
You’ve been avoiding the main streets for a long, long while
The truth that I’m seeking is in your missing file
What’s your position, baby, what’s going on?
Why is the light in your eyes nearly gone?
- SOMETHING’S BURNING, BABY
Following on from the wonderfully entertaining “CMS Vendor Meme” (a.k.a. the “CMS Celebrity Deathmatch”), I’d like to drill slightly deeper into Item #9 – Dog Food. For the uninitiated, “Eating your own dogfood” means that the vendor uses their own software to run their own site. All of them do, according to the responses to the Vendor Meme so far, although not always on the very latest version.
So, do the vendors’ sites, written on technology which is sold as fully accessible and built by experts (at least, one hopes the vendor has experts), actually produce markup that validates? I guess the first question one has to ask is: does it matter whether a site is accessible? And the answer: Oh yes. For many, many reasons which I’m not going to go into here. I understand that W3C validation ≠ Accessibility, but that is another discussion for another time too. Validation is still an important part of it.
I know that it isn’t always easy to make complex sites that validate. Where I work, sites should always validate when they’re launched – it is part of the User Acceptance Criteria. However, we are guilty of back-sliding when sites are in support/maintenance mode, and editors break things when abusing Rich Text Editors. Shock, horror – there are still a lot of CMS products that allow editors to enter broken markup.
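A CMS could catch at least the crudest of these editor mistakes before publishing. As a minimal sketch (not how any particular product does it – a real CMS would use a proper sanitiser or validator), here is a standard-library Python check that rejects rich-text fragments with unbalanced tags:

```python
# Toy check: reject rich-text input with unbalanced tags before it reaches
# the published page. Standard library only; illustrative, not exhaustive.
from html.parser import HTMLParser

# Void elements have no closing tag, so they never go on the stack.
VOID_TAGS = {"br", "img", "hr", "input", "meta", "link", "area",
             "base", "col", "embed", "param", "source"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # mismatched closing tags seen so far

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def markup_is_balanced(fragment: str) -> bool:
    """True if every opened tag is closed, in order, with no strays."""
    checker = TagBalanceChecker()
    checker.feed(fragment)
    checker.close()
    return not checker.errors and not checker.stack

print(markup_is_balanced("<p>Hello <em>world</em></p>"))  # → True
print(markup_is_balanced("<p>Hello <em>world</p>"))       # → False
```

Even a crude gate like this, run on save, would stop a surprising amount of the broken markup that editors paste in from word processors.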
WordPress do a pretty good job. This blog validates at the time of writing, no thanks to me. Admittedly, I did have to fix the FeedBurner RSS link which omitted the closing slash from the img tag, but that wasn’t WordPress’s fault.
I digress. I thought I’d test the home pages of a few major commercial Web CMS vendors – those listed as Enterprise or Upper Tier in the latest CMS Watch Web CMS Report. I tested the vendor home page, which may not be CMS related at all, especially for the big boys. The results are tabulated below. The numbers below were generated on 18 March between 21:00 and 23:00 GMT using the W3C HTML Validator. I didn’t check the CSS or Feeds, just the markup. Both encoding and doctype were left on “Detect Automatically”. I didn’t look into the details of the errors. The ones with a large number of errors might actually only be a few errors that are repeated, or have knock-on effects.
| Vendor | URL Checked | Detected DOCTYPE | Number of Errors (2009/03/18) |
|---|---|---|---|
| EMC Documentum | uk.emc.com | XHTML 1.0 Transitional | 121 |
| IBM | www.ibm.com | XHTML 1.0 Strict | 0 |
| Autonomy Interwoven | www.interwoven.com | XHTML 1.0 Transitional | 254 |
| OpenText | www.opentext.com | XHTML 1.0 Transitional | 205 |
| Oracle | www.oracle.com | HTML 4.0 Transitional | 39 |
| Vignette | www.vignette.com | XHTML 1.0 Transitional | 39 |
| CoreMedia | www.coremedia.com | HTML 4.01 Transitional | 49 |
| Day | www.day.com | HTML 4.01 Strict | 2 |
| Fatwire | www.fatwire.com | HTML 4.01 Transitional | 1 |
| Alterian Mediasurface | www.mediasurface.com | XHTML 1.0 Strict | 41 |
| Percussion | www.percussion.com | XHTML 1.0 Transitional | 4 |
| SDL Tridion | www.tridion.com | XHTML 1.0 Strict | 41 |
| Microsoft | www.microsoft.com | XHTML 1.0 Transitional | 177 |
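For anyone wanting to repeat the exercise, the validator can be scripted rather than driven by hand. As a hedged sketch (assuming the JSON output of the current W3C/Nu HTML checker at `validator.w3.org/nu/?doc=<URL>&out=json`, which differs from the validator interface available when these numbers were gathered), the error count per page is just a filter over the returned `messages` list:

```python
# Sketch: count validation errors from a W3C (Nu) validator JSON response.
# The checker returns a "messages" list; entries with type "error" are
# what the table above tallies, while warnings have type "info".
import json

def count_errors(validator_json: str) -> int:
    """Number of messages the validator flagged as outright errors."""
    payload = json.loads(validator_json)
    return sum(1 for m in payload.get("messages", [])
               if m.get("type") == "error")

# A canned response, in the shape the checker emits:
sample = json.dumps({"messages": [
    {"type": "error", "message": "Stray end tag img."},
    {"type": "info", "subType": "warning",
     "message": "Consider adding a lang attribute."},
    {"type": "error", "message": "Duplicate ID nav."},
]})
print(count_errors(sample))  # → 2
```

Fetching each vendor home page and feeding the response through a function like this would make the whole table reproducible in one short loop.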
The nice surprise mentioned in the title is IBM. Big Blue really does care about standards, and maybe Java is going to a safe place should the Sun deal materialise. Hats off to Fatwire, Day and Percussion who get really close and clearly try to ensure the markup is good. The other 9 out of 13, however, don’t look so promising.
So, what am I saying? I am not for a second implying that the products that do badly in the table above are “not accessible”. I just think the question we always see in a CMS selection RFP is incorrect. Asking about an accessible editing interface (which comes out of the box) makes sense. Asking about an accessible front end (which is different for every implementation) makes no sense at all.
So, instead, the question on the RFP should be “Does your CMS allow the developer full control over the markup? If not, please specify where.” Now, it is highly unlikely that any product can answer an unequivocal “yes” to this. For example, every .NET-based product mandates that a FORM tag containing the VIEWSTATE exists. However, that particular constraint does not cause a validation problem.
I believe the problem in most of the examples in the table above could be rooted in one of:
- The technology makes valid markup impossible – I think this could probably be worked around in many cases. But sometimes you simply can’t get around the bad markup you’re given.
- Nobody knew it mattered – Ignorance isn’t an excuse any more.
- Someone decided it wasn’t important – this doesn’t need further comment. Give them some concrete shoes and send them for a swim.
- There isn’t time and/or budget to ensure it validates – in some cases it is more expensive to create a validating, progressively enhanced site. However, in many cases I believe it is cheaper to do it properly.
- The front end team lacked the skill - This I can believe. Hopefully this improves with time. Many server side developers aren’t any good at client side work. I know I fall into this camp. When I was coding, CSS didn’t exist, HTML still had TABLES in it and the BLINK tag was cool. I’m not allowed anywhere near the front end code where I work. We have professionals for that.
- Showing off with fancy client side technologies - There are far too many sites that use Flash/AIR/Silverlight for no good reason, without providing an accessible fallback. Now this won’t affect the W3C validation, but it annoys the hell out of me. Use these technologies where they are needed, not for the sake of it.
I’m sure there are other reasons I’ve missed out, and I’d love to hear about them. I believe the responsibility for convincing management of the importance of doing things properly lies with us, the technologists. And if they seem not to care too much about accessibility, play the Increased Revenue cards (SEO, multi-device target market, maintainable code, integration with as yet unknown services, working on IE8 and other future browsers, etc) instead.
And once again, nice one IBM for winning the Home Page test. I apologise for my behaviour in some meetings in the past about the markup from WebSphere Portal. But let’s not get complacent – it would be nice if you could make the deeper pages in the site validate too.
UPDATE: Does anyone have the energy to publish a similar test for the mid-range and Open Source vendors? I might do it in a week or three if no-one else does it first.