The Analytics of Web Governance
It is a mistake to think that the only things that can go wrong on a website are those that are easiest to measure.
While it is relatively easy to track the effects of poor design or content on online performance using web analytics - what about poor Web Governance? What if your site management is not up to the job? How will you know?
As explained in this article, every Web Manager needs a new set of analytics to track the effects of governance on the quality of their site.
Every system of Web Governance is composed of 4 resources (People, Processes, Tools & Budget) that support 4 core activities (Leadership, Maintenance, Development & Infrastructure).
Read more about the activities & resources of Web Governance.
Some of these activities are highly sensitive and demand constant attention. Indeed, any distraction can have an almost immediate impact on site quality.
For example, should the Maintenance task of Quality Assurance be ignored or carried out by poorly skilled staff, you will straightaway see an increase in broken links, misspellings, metadata problems, etc.
So, just as bad decisions about design & content can harm online success, so can bad decisions about Web Governance.
Yet governance has often been ignored as a source of problems because of the difficulties in measuring it properly.
But this is changing.
One impact of the arms race in web technology is the diversity of tools now available to assist with online management. These include not only 'dumb' tools like a CMS, but 'smart' systems that can both measure the quality of operations and act as decision aids.
As a result, it is now possible to create a new set of analytics to track the impact of Web Governance on online performance.
What to measure
The purpose of Web Governance is to allow a site to operate in a controlled and orderly way. Its principal concern is to maintain a well-oiled, responsive web presence that can spring into action when called upon to achieve online goals.
The aim of our new set of analytics is to measure if & where decisions connected with governance could cause your site to become less than 'fit-for-purpose'.
This is done by selecting metrics that are sensitive to the type of inattention & under-resourcing that are typical of dysfunctional management systems.
Listed below are the analytics I propose as a starting point. As may be seen, they fall into 2 broad groups.
- Metrics connected with the quality of experience delivered by a site.
- Metrics connected with the quality of delivery by a Web Team.
I believe these metrics provide the immediacy of insight needed to judge whether web operations are on track or whether - as the result of some governance issue - they are running into the sand.
As may be seen, much of the supporting data comes from automated tools, e.g. QA, content, technical performance. (I mention a few products I am familiar with, though more are available.)
It should be noted, however, that some metrics have no formal measurement systems, but rather depend on general administrative oversight, e.g. client management.
In any event, the onus is on the Web Manager to gather the data that she believes is appropriate. The more formal this is, the more likely she is to catch a governance issue before it spirals out of control.
Does the website deliver a minimum standard of user experience as regards design, content & code?
- Is content written to a minimum standard of readability, and is it error free, e.g. no broken links, spelling errors, missing metadata, etc.?
- Does the design adhere to minimum standards of usability & accessibility, e.g. usability heuristics, WCAG 2.0?
- Does the code conform to development standards & operate gracefully cross-platform, e.g. semantic markup based on W3C Web Standards?
- Are all pages optimised for delivery quality & accepted standards, e.g. file size, cookies, metadata, etc.?
Source of analytics:
- Many such measures can be sourced from a website QA service such as those provided by Siteimprove, Sitemorse, HiSoftware, Google Webmaster Tools and more. This includes broken links, accessibility, spelling & grammar, error pages, metadata, branding, semantic markup, etc. (A minimal home-grown check is also sketched after this list.)
- This may be supplemented by additional tools that measure other aspects of online quality. For example Clarity Grader or Sitebeam for content readability, SEOMoz for SEO, as well as standard web analytics.
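For illustration, here is a minimal sketch of such a home-grown QA check in Python, assuming the 'requests' and 'beautifulsoup4' packages and a hypothetical URL. A commercial QA service covers far more ground; this only shows the principle of automating the checks above.

```python
import requests
from bs4 import BeautifulSoup

def qa_check(url):
    """Report broken links and missing metadata for a single page."""
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")

    # Missing metadata: flag an absent <title> or meta description.
    if soup.title is None or not soup.title.string:
        print(f"{url}: missing <title>")
    if soup.find("meta", attrs={"name": "description"}) is None:
        print(f"{url}: missing meta description")

    # Broken links: follow each absolute link and report error responses.
    for anchor in soup.find_all("a", href=True):
        href = anchor["href"]
        if href.startswith("http"):
            try:
                reply = requests.head(href, allow_redirects=True, timeout=10)
                if reply.status_code >= 400:
                    print(f"{url}: broken link {href} ({reply.status_code})")
            except requests.RequestException as err:
                print(f"{url}: unreachable link {href} ({err})")

qa_check("https://www.example.com/")  # hypothetical page to check
```

Run on a schedule across a site's page list, even a crude script like this gives a Web Manager an early-warning trend line for content quality.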
Does the website deliver a minimum standard of user experience as regards technical performance?
- Are metrics for availability, reliability & responsiveness adhered to, as set out in an SLA agreed with your hosting provider?
- Do changes to infrastructure (e.g. a hardware or software upgrade) have an impact on technical performance?
- Are changes to infrastructure co-ordinated for minimal impact on business-as-usual operations?
Source of analytics:
- As above, a website monitoring service such as those provided by Siteimprove or Sitemorse can provide metrics on uptime, crashes & other access issues. (A home-grown probe like the one sketched below can supply similar raw data.)
- These may be supplemented by tools such as Exceptional for identifying code problems or Netsparker for checking basic security controls.
- In addition, communications with internal customers must form part of the analytics. The aim is to ensure that such customers (e.g. a product department) can continue to expedite online plans despite any technical changes.
- Ensuring a robust 'change control' process is in place is the first step to this end.
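To make this concrete, below is a minimal sketch of an availability & responsiveness probe in Python, assuming the 'requests' package, a hypothetical URL and a hypothetical SLA threshold. Run on a schedule (e.g. via cron), its output can be compared against the SLA agreed with your hosting provider.

```python
import time
import requests

URL = "https://www.example.com/"   # hypothetical site to monitor
SLA_RESPONSE_SECS = 2.0            # hypothetical responsiveness target

def probe(url):
    """Return (is_up, response_seconds) for a single availability check."""
    start = time.monotonic()
    try:
        reply = requests.get(url, timeout=10)
        elapsed = time.monotonic() - start
        # Assumption: server errors (5xx) count as downtime.
        return (reply.status_code < 500, elapsed)
    except requests.RequestException:
        return (False, None)

up, secs = probe(URL)
if not up:
    print(f"DOWN: {URL}")
elif secs > SLA_RESPONSE_SECS:
    print(f"SLOW: {URL} took {secs:.2f}s (target {SLA_RESPONSE_SECS}s)")
else:
    print(f"OK: {URL} responded in {secs:.2f}s")
```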
Are interactions managed to the satisfaction of web visitors?
- Are communication response times adhered to, e.g. 24 hours for email, 12 hours for Facebook, 30 minutes for Twitter, etc.? (A simple compliance check is sketched after this list.)
- Are general satisfaction ratings from visitors adequate?
Source of analytics:
- A CRM tool may be used to track interactions, e.g. Intercom.
- User satisfaction ratings can be obtained from a number of sources, e.g. email feedback, page content ratings, shares/likes/comments/mentions from social media analytics or a media monitoring tool, an online survey, etc.
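As a sketch of how such interaction data might be checked, the Python snippet below computes SLA breaches from hypothetical (channel, received, replied) records, of the kind a CRM export might provide. The channel targets mirror the example figures above.

```python
from datetime import datetime, timedelta

# Hypothetical response-time targets per channel.
TARGETS = {
    "email": timedelta(hours=24),
    "facebook": timedelta(hours=12),
    "twitter": timedelta(minutes=30),
}

# Hypothetical CRM export: (channel, received, replied).
interactions = [
    ("email", datetime(2014, 5, 1, 9, 0), datetime(2014, 5, 1, 15, 0)),
    ("twitter", datetime(2014, 5, 1, 9, 0), datetime(2014, 5, 1, 10, 0)),
]

for channel, received, replied in interactions:
    if replied - received > TARGETS[channel]:
        print(f"SLA breach on {channel}: replied after {replied - received}")
```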
Are developments completed to the satisfaction of stakeholders?
- Are development schedules adhered to?
- Are estimates for cost & time adhered to?
- Are satisfaction ratings from stakeholders adequate?
Source of analytics:
- The aim here is to establish the efficiency & effectiveness with which your team carries out its work. As such, standard management metrics must form part of the governance analytics dashboard, e.g. budgeting, time management, customer satisfaction, etc. (A simple variance calculation is sketched below.)
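For example, schedule & cost adherence can be reduced to simple variance figures. The sketch below uses hypothetical project records; in practice the numbers would come from your project-tracking and finance tools.

```python
# Hypothetical records: (name, estimated_days, actual_days, budget, spend).
projects = [
    ("Spring redesign", 20, 26, 15000, 17500),
    ("Checkout fix", 5, 4, 2000, 1800),
]

for name, est_days, act_days, budget, spend in projects:
    # Positive variance = over schedule / over budget.
    schedule_var = (act_days - est_days) / est_days * 100
    cost_var = (spend - budget) / budget * 100
    print(f"{name}: schedule {schedule_var:+.0f}%, cost {cost_var:+.0f}%")
```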
Of course, even if a red flag does go up in the analytics, it does not necessarily mean that governance is to blame. It is merely a starting point for investigation.
For example, bad UX may be due to an honest but mistaken decision about design or content. Or negative feedback on a team may be due to the customer delivering fuzzy requirements.
Therefore, to validate whether governance is genuinely at fault, the following steps should be followed. (They are also sketched as a simple checklist routine after the list.)
1. Did the issue arise as the result of a governance activity being ignored, or not being carried out to the right level of granularity?
For example, if content goes live with many misspellings, is it because the task of 'Quality Assurance' is not occurring, or is only being done on a cursory level?
2. If not 1, is it because the process is not being followed, or the process itself is not up to scratch?
For example, perhaps QA is occurring but the staff member is ignoring the documented process. Or perhaps the process itself is not detailed enough and is missing key steps.
3. If not 2, is it because of some issue of inadequate manpower, or inappropriate allocation of manpower?
For example, perhaps the staff member responsible knows that she needs to do QA - but does not have enough time. Alternatively, perhaps no-one has been formally assigned to QA by the Web Manager.
4. If not 3, is it because of some issue of poor staff expertise, or poor application of skills?
For example, perhaps the responsible staff member does not have the right skills to carry out all the checks required. Or perhaps the Web Manager has nominated someone who is inappropriate to the role, e.g. asking a Developer to spell check & grammar check written content.
5. If not 4, is it because of some issue of missing tools, or misuse of tools?
For example, perhaps the staff member does not have the right tool for the job or is using the tool in the wrong way.
6. If not 5, is it because of some issue of insufficient budget, or misuse of budget?
As with manpower in 3 above, perhaps there is not enough budget for this activity or too much budget is applied elsewhere. For example, perhaps so much funding goes into creating new content that little is left over for on-going maintenance.
7. If none of the above, is it due to some other dysfunction in Web Governance, e.g. communication among teams, lack of direction, lack of enforcement?
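For those who like their checklists executable, here is a minimal Python sketch of the diagnostic above. The questions and their ordering come straight from steps 1-7; the yes/no answers are whatever the Web Manager's investigation turns up.

```python
# The escalation checklist from steps 1-7, in order.
CHECKLIST = [
    "Was the governance activity ignored or done at too coarse a level?",
    "Was the documented process not followed, or is the process inadequate?",
    "Is manpower inadequate or badly allocated?",
    "Is staff expertise poor or badly applied?",
    "Are tools missing or misused?",
    "Is budget insufficient or misapplied?",
    "Is some other dysfunction at work, e.g. communication or enforcement?",
]

def diagnose(answers):
    """Return the first step answered 'yes', or None if all are 'no'.

    'answers' is a list of booleans, one per checklist question, in order.
    """
    for step, (question, is_cause) in enumerate(zip(CHECKLIST, answers), 1):
        if is_cause:
            return step, question
    return None

# Example: QA is happening but the process itself is missing key steps.
result = diagnose([False, True, False, False, False, False, False])
print(result)  # -> (2, 'Was the documented process not followed, ...')
```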
As often stated, one of the hardest parts of governance to control - and one that can have a huge impact on operations - is people.
A team with poor management, lots of infighting or low morale will often deliver poor work even if it is well resourced.
People are so fundamental to operations that no amount of cash will fix a dysfunctional team. More radical surgery is required.
Generally speaking, no-one outside a Web Team cares much about the basics of high-quality governance.
In fact, most do not even notice the effort that goes into supervising a website until something goes wrong, e.g. a poorly resourced Web Team seizes up due to overwork.
Yet, governance is such a critical part of the equation that if problems exist they cannot be put off forever. After all the issues of design, content and technology have been sorted, you still have to face up to how you actually administer your website.
And that means engaging with Web Governance.
Eager to learn more? Read these articles & put manners on your Web Governance!
Looking for a little advice? No problem!
Drop me an email with your Governance question & I'll do my best to answer!