For closer to 20 than 10 years, I’ve argued that the quality of a website’s code – whether it validates, its approach to accessibility, its adoption of good practices and conventions – is an indicator of an organisation’s overall approach to the stuff you can’t see. (A website’s code is relatively easy to assess; the code of a binary application is almost impossible to assess in any meaningful way.)
If an airline doesn’t really care about getting its most visible technology – its website – right, what’s going on with all its other technology? Like, oh I don’t know, the technology that allows aeroplanes to travel kilometres above the earth at velocities approaching that of sound, with hundreds of people on board?
As usual, very few people listened.
But I have young children, so I’m used to that.
However, I was reminded of this apparently quaint idea – one many adherents of a standards-based approach to web development would also have argued “back in the day” – by an article titled What Web Page Structure Reveals on News Quality, posted last week by Frederic Filloux, who writes frequently on the business and technology of the news media.
The news media currently faces many, many challenges. Not least of these is the challenge of discerning the quality of an information source. This is what we might loosely call the “fake news” issue.
Fake news is not just the traditional media’s problem – it’s everyone’s. But it is an especially serious problem for Facebook and Google, the primary conduits of information of all kinds into well over a billion people’s lives.
Human filtering simply doesn’t scale to determining the veracity and trustworthiness of so many sources, so various individuals and groups are working on a project to algorithmically determine the quality and trustworthiness of a news source.
Filloux described his project thus:
“The News Quality Scoring Project (NQS) is aimed at finding and quantifying ‘signals’ that convey content quality. The idea is to build a process that is scalable and largely automated. Incidentally, it will contribute to debunk fake news by ‘triangulating’ questionable sources.”
That, of course, ties into my introductory observations about code quality as a proxy for a deeper sense of quality, using the HTML structure (what we today would call the design patterns) of a page as one indicator – among several others – of the quality of the site itself.
It’s not hard to find examples of poorly coded, poorly marked-up websites presenting rubbish content on barely usable pages. An instinct almost kicks in: you know, without even looking, that those “gallery” style pages (“see what your favourite TV stars look like now”) will turn out to be poorly coded.
This, in turn, cuts straight to our overall perceptions of quality (terrible) and trustworthiness (none).
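To make the idea of structural signals concrete, here is a toy sketch in Python, using only the standard library. The particular signals and weights are my own illustrative assumptions – rewarding semantic HTML5 elements, penalising images without alt text and inline styles – and are not the NQS project’s actual criteria.

```python
# Toy illustration of scoring "page structure" signals.
# The signals and weights are invented for illustration only —
# they are NOT the actual NQS scoring criteria.
from html.parser import HTMLParser

SEMANTIC_TAGS = {"article", "header", "footer", "nav", "main", "section", "figure"}

class StructureSignals(HTMLParser):
    def __init__(self):
        super().__init__()
        self.semantic = 0          # semantic HTML5 elements found
        self.imgs_missing_alt = 0  # accessibility red flag
        self.inline_styles = 0     # presentation mixed into markup

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in SEMANTIC_TAGS:
            self.semantic += 1
        if tag == "img" and not attrs.get("alt"):
            self.imgs_missing_alt += 1
        if "style" in attrs:
            self.inline_styles += 1

def structure_score(html: str) -> int:
    """Crude score: reward semantic structure, penalise red flags."""
    p = StructureSignals()
    p.feed(html)
    return p.semantic - p.imgs_missing_alt - p.inline_styles

good = "<article><header><h1>News</h1></header><p>Report</p></article>"
bad = '<div style="color:red"><img src="x.jpg"><div>Gallery</div></div>'
print(structure_score(good))  # 2
print(structure_score(bad))   # -2
```

A real scorer would of course combine many more signals (validation results, metadata, link patterns) and calibrate the weights against known-good and known-bad sources; the point here is only that such signals are cheap to extract automatically.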
But what about good design patterns? Can they be useful indicators of site quality? Filloux’s project aims to find out.