Are poor site auditing systems killing the market?

14 Dec 2015


Could the providers of site auditing and monitoring systems be in danger of killing the market before it has properly begun?

By Shane Diffily, host of Web Governance Masterclass

http://www.diffily.com/articles/site_auditing_dialogue.htm

Recently I spoke with Lawrence Shaw from Sitemorse about the incredible variety of tools now available to support online management - but also about the relative lack of sophistication of many.

Lawrence Shaw
"Sitemorse has found that organisations using what the market sees as 1st Generation tools (i.e. those that map out a digital estate, assess content and report on it) often drown in a sea of reports. The more reports they review, the less efficient they become.

Content has to become smarter.

The good news is that with 2nd Generation services, which provide improved publishing agility, reports are becoming thinner and thinner as things are got right first time. The basic reason is that editors finally have the information they need to act.

But could the legacy of problems with 1st Generation services dissuade many from making the switch?"

Shane Diffily
I see what you mean. The current cohort of Digital Managers are among the first to have access to real data about Digital Governance, yet they are often disillusioned by the sheer volume of information to be trawled through.

Perhaps the most talked about issues of this type concern automated accessibility and link checking tools.

I mean, they seem like such a good idea.

"What?! I can check thousands of pages for compliance to legal requirements and quality standards ... in an instant? Sign me up."

Of course, only later does the realisation dawn that automation does not necessarily mean discrimination.

While many 1st Generation services are great at checking pages against simple rules, they are not so great at:
•    Recognising false positives
•    Separating content from features/navigation
•    Ranking results based on importance
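To see why false positives arise, consider a minimal sketch of a purely rule-based accessibility check (a hypothetical illustration, not any vendor's actual engine). The rule "every image must have alt text" flags a decorative spacer whose empty alt attribute is actually correct practice - a human still has to triage the output.

```python
import re

def check_missing_alt(html: str) -> list[str]:
    """Flag every <img> tag without a non-empty alt attribute.

    A purely syntactic rule: it cannot tell a meaningful chart from a
    decorative spacer, so correctly-marked decorative images become
    false positives that someone must review by hand.
    """
    issues = []
    for tag in re.findall(r"<img\b[^>]*>", html, re.IGNORECASE):
        alt = re.search(r'alt\s*=\s*"([^"]*)"', tag, re.IGNORECASE)
        if alt is None or alt.group(1).strip() == "":
            issues.append(tag)
    return issues

page = (
    '<img src="chart.png">'          # a genuine problem
    '<img src="spacer.gif" alt="">'  # decorative: alt="" is correct here
)
print(check_missing_alt(page))  # both tags are flagged
```

The checker is fast and tireless, but it treats both findings as equally urgent - which is exactly the limitation described above.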

If we are not careful, 1st Generation services reporting against a complete site can actually generate additional workload - preventing timely action on the matters that really need prioritised attention.

We have to constantly remind ourselves that tools like these are really just syntactic engines.

Yes, they can shuffle lots of symbols - but any semantics depends on you.
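One way to supply that missing semantics yourself is to weight the raw findings by something the business cares about, such as page traffic. A small sketch (the pages, issues and visit counts are invented for illustration):

```python
# Raw, unranked findings, as a syntactic checker might emit them.
findings = [
    {"page": "/home",        "issue": "broken link"},
    {"page": "/archive/old", "issue": "broken link"},
    {"page": "/home",        "issue": "missing alt text"},
]

# Monthly visits per page - the context the checker lacks.
visits = {"/home": 50_000, "/archive/old": 12}

# Rank findings so problems on high-traffic pages surface first.
ranked = sorted(findings, key=lambda f: visits.get(f["page"], 0), reverse=True)
for f in ranked:
    print(f["page"], "-", f["issue"])
```

The symbols are the same; only the ordering has meaning added to it - and that ordering came from outside the tool.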

You've never had it so good
You might remember that back in the early days of the web, the most popular means of extracting meaning from data was our old friend Excel.

But of course, the problem with such a manually intensive system of analysis is that it is simply not scalable.

With so many things happening on a modern online presence, it is extremely hard for Digital Managers to retain visibility over absolutely everything - and 1st Generation tools are often of little help.

Sure, they generate tons of data, but no-one has the luxury of spending long hours sorting the wheat from the chaff.

Indeed, the truth is few Product Managers (especially those on very large scale projects) can really claim to know just how well their systems are performing.

Things are just too complex.
Most haven't the slightest clue whether complications in content are due to honest (but mistaken) decisions about user needs - or poor writing or sloppy implementations.

No doubt they would love to find out, but it is as much as they can do merely to correct errors as they arise - "whack-a-mole" style - never mind investigate them all.

And Excel is not the answer.

Even if you were to try, creating the mountains of spreadsheets needed to generate an answer would absorb more time than was lost by the issue under analysis!

The next generation
All this means that the 2nd Generation of management tools that integrate with DAM/CMS and publishing technology are not only welcome but critical to keeping the engine of online operations ticking over.

More than anything else they are helping managers reconnect with operations and disperse the fog of uncertainty that surrounds many sites.
Not only do such tools remove the need for mundane manual work, they also generate the semantics needed to apply meaning to the data being processed.

Even better, some can now be integrated with other tools to reveal the type of detailed insights needed to improve decision-making.

For example, consider the power of a tool that could cross-reference information from sources such as analytics, QA, CMS and more.

Imagine how useful it could be ...


"It looks like Katherine is doing a great job.
I see she has published almost 30 new posts to our Wildlife Protection blog over the last month. Each page is also getting a lot of visitors and plenty of social media shares, especially on Twitter. Good work.

Oh, but I also see that several entries have bad spelling mistakes. Plus, readability is sometimes below par - too many long sentences and difficult words.

Finally, it looks like comments are not being responded to either.
I'll drop Katherine a line to let her know we can help, and also tell her about our new Writing for the Web training course coming up later this month."
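The imagined report above boils down to joining per-page records from several systems and turning thresholds into suggested actions. A minimal sketch, where all source names, fields and thresholds are hypothetical:

```python
# Hypothetical per-page records from three separate systems.
analytics = {"/blog/owls": {"visits": 1200, "shares": 45}}
qa        = {"/blog/owls": {"spelling_errors": 3, "readability": 38}}
cms       = {"/blog/owls": {"author": "Katherine", "unanswered_comments": 7}}

def page_report(url: str) -> dict:
    """Merge analytics, QA and CMS data into one editor-level view."""
    merged = {**analytics.get(url, {}), **qa.get(url, {}), **cms.get(url, {})}
    # Turn raw numbers into prioritised actions (thresholds are illustrative).
    actions = []
    if merged.get("spelling_errors", 0) > 0:
        actions.append("fix spelling")
    if merged.get("readability", 100) < 60:
        actions.append("simplify long sentences")
    if merged.get("unanswered_comments", 0) > 0:
        actions.append("respond to comments")
    merged["actions"] = actions
    return merged

print(page_report("/blog/owls"))
```

The join itself is trivial; the value lies in having the three systems expose their data in the first place - which is what integration with DAM/CMS and publishing technology makes possible.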

Early days
Of course, it is still early days. There are as yet few examples of this type of integration.

Nevertheless, some can be seen on GOV.uk, where data from several sources is mashed together to create new Performance Dashboards. This represents a first step towards the type of analysis that will one day be commonplace.
The result will be that the entire discipline of Digital Governance will open up to scrutiny as never before, as 2nd Generation tools reveal insights that were once obscure.

And Excel, it ain't.