Digital management is opening up to scrutiny as never before, as more and more tools gather insights about operations.
A quick glance at Gartner's Digital Marketing Transit Map from 2013 shows the sheer diversity of technology now on offer.
This is both heartening - and heart-stopping.
Heartening, in that the information needed to manage a quality online experience can now be collected so easily - from web traffic to SEO to QA and more.
But it's heart-stopping to think about the reams of extra site reports that will need analysis as tools are added.
There is a real danger that the quality of decision making will decline as data volumes increase.
If we can't figure out how to sort the signal from the noise, much of the benefit of these tools will be lost.
Stream of consciousness data
This problem is not unique to web.
Consider the problem of logfiles.
Even though they contain essential insights, the sheer size of such files combined with a lack of sophisticated reporting means they are routinely ignored.
Few teams can justify the time needed for analysis when they could be working on their product itself.
Of course, this doesn't mean that the demand for understanding does not exist. Indeed, vendors who manage to fill the gap are making real headway.
There is no reason why the same cannot happen for web.
High performance 'sorting' engines
In a previous article I wrote that no matter how sophisticated many first generation web management tools may seem, they are in fact quite simple.
All they do is compare a website against a set of pre-set rules to determine a binary result:
- Broken link: Fail
- Meta description complete: Pass
- Text misspelling: Fail
These tools are basically high performance 'sorting' engines. Sure, they can blast through masses of information and list lots of results - but they offer no semantic insight. The task of identifying causal issues or isolating underlying patterns falls to us.
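The sorting-engine behaviour above can be sketched in a few lines: each rule is a binary test applied to a page, and the output is a flat pass/fail list with no interpretation attached. The rule names and page fields here are hypothetical, for illustration only:

```python
# Minimal sketch of a first-generation "sorting engine": each rule is a
# binary pass/fail test applied to a page. Field names are hypothetical.

def no_broken_links(page):
    return not page.get("broken_links")

def has_meta_description(page):
    return bool(page.get("meta_description"))

def no_misspellings(page):
    return not page.get("misspellings")

RULES = {
    "Broken links": no_broken_links,
    "Meta description complete": has_meta_description,
    "No misspellings": no_misspellings,
}

def audit(page):
    # Returns a flat Pass/Fail list - no semantic insight, just sorting.
    return {name: ("Pass" if rule(page) else "Fail")
            for name, rule in RULES.items()}

page = {"broken_links": ["/old-page"],
        "meta_description": "About us",
        "misspellings": []}
report = audit(page)
# → {'Broken links': 'Fail', 'Meta description complete': 'Pass',
#    'No misspellings': 'Pass'}
```

Note what is missing: nothing in the output says *why* the link broke or whether the failures share a cause - that interpretive work is left entirely to the reader.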
But what if semantics could be built in?
What if, say, a report about content quality (broken links, missing metadata) could be linked to insights from CMS (web authors), project management (team task lists) and then weighted for different factors?
What was once bland data would become a rich mine of insight.
But vendors cannot do this alone.
If we are to move away from the era of endless site reports to a new era of actionable insight, we need to identify what's most important in terms of measurement. Vendors can then get clever with their engineers.
A first step is to consider how you track the success of web management overall and then single out the metrics that (when taken together) can reveal deeper insights.
The 3 broad indicators I routinely use are:
1. Online indicators
These represent your ability to meet the minimum needs of users, including measures for content quality, site performance, accessibility, traffic/conversions, subjective opinions (via feedback, surveys), etc.
2. Operational indicators
Operational indicators represent your ability to meet the needs of internal organisational customers and stakeholders, eg by tracking production turnaround times, web development costs, quality of delivery, customer satisfaction and more.
3. Organisational indicators
And finally, organisational indicators stand for your ability to deliver a stable web presence within the policies and objectives of the enterprise overall, including total cost of delivery, business risk and staff morale/turnover.
As may be seen, these indicators include both quantitative & qualitative measures. The reason is that although quantitative metrics are more easily tracked, they do not represent the totality of experience.
Good scores on things like few broken links, a high standard of accessibility or few misspellings could suggest things are going well - until a comparison with feedback, surveys and other qualitative sources reveals a different story.
(Read more in the article "Why weaponizing web tools improves decision making")
Digital management healthcheck
Naturally, change won't happen overnight. But already many second generation web auditing tools are becoming more semantic - leading to better decision making.
Looking ahead, it's possible to imagine individual systems becoming more integrated and being able to score the subjective health of digital management based on a variety of measures.
For example, Sitemorse's benchmark reports show how weightings can be applied to different metrics to create an overall score for site quality. This helps web managers to identify at a glance underlying issues (such as poor quality legacy content) and to prioritise corrective action.
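The weighting idea is simple to sketch: score each metric on a common scale, then take a weighted sum to produce one headline figure. The weights and metric names below are illustrative assumptions, not Sitemorse's actual methodology:

```python
# Sketch of weighted scoring across metrics. Weights and metric names
# are illustrative, not any vendor's actual methodology.

WEIGHTS = {"content_quality": 0.40, "accessibility": 0.35, "performance": 0.25}

def overall_score(metrics):
    # Each metric is scored 0-10; the weighted sum yields one headline figure.
    return sum(WEIGHTS[name] * score for name, score in metrics.items())

site = {"content_quality": 4.0, "accessibility": 8.0, "performance": 9.0}
score = overall_score(site)
# → 0.40*4.0 + 0.35*8.0 + 0.25*9.0 = 6.65
```

Because the contribution of each metric is visible, a low headline score driven mostly by content_quality immediately points at legacy content as the issue to prioritise.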
Ultimately the benefits of such analysis will be seen in:
- Reduction of risk
- Competitive advantage
- Better productivity
But the benefits go beyond measurement.
As tools become simpler and more insightful, they can be shared more widely as less training is needed to use them.
This means they can be opened up to anyone who contributes to online - whether web staff, devolved CMS publishers or senior management - so they can track items of importance or interest to them.
This will increase general awareness of web and further improve understanding.
That is why deliberations about web management technology will be so important over the mid-term.
Identifying the right tools and extracting the right measures will endow you and your digital team with greater credibility and reinforce your position as a guiding authority.
NEW! Download a fully editable Website Quality Assurance (QA) checklist in Excel format.