Even well-resourced organisations with wonderful, rich websites can struggle with this question. They spend time and money on sophisticated web stats software and user research and feel that this has given them a handle on what their users think when, in actual fact, they could be off the mark by orders of magnitude. Why might this be?
I feel that generally there is an over-reliance on web stats which, while providing compelling evidence at times, will never quite give us the complete picture. They often supply good evidence of what is happening but very rarely why. Often the problem is that we’re asking the wrong questions and, where this is the case, any amount of data will prove absolutely useless. For me the real problem is the whole ‘black box’ nature of web stats: you enter some parameters and get some numbers back, but you have no real idea of what has happened in between, of how those numbers were generated or how closely they reflect real life. Relying on a single data source as the evidence for decision making is dangerous in the extreme.
Of course there are other things we can do, such as user research, usability studies and focus groups, and while these can often provide valuable insights, there are inherent dangers due to small sample sizes, cost and, of course, the famous Nine Biases. I’ve personally carried out many of these studies and have never been convinced that the people who attend are in any way representative of real users.
We can of course resort to online surveys, which have their own set of problems: small sample sizes again, generally poor response rates, stratification problems (gender, age, ethnicity and so on) and the fact that your responses will only ever reflect the views of people who answer online surveys in the first place.
So if large internet companies are having these problems, how much worse is it for intranets, which are generally underfunded and under-resourced?
I have consistently argued in this blog that there is a fundamental problem with intranets – they just don’t work. I suspect that part of the problem is that intranet teams don’t have much of a clue as to what their users really want and need, and so they respond to the only clear wants and needs they can identify: those of their stakeholders and bosses. If the First Law of Intranets is that ‘the volume of content held in an intranet and the usability of the intranet are inversely proportional – when content grows, usability inevitably decreases, and vice versa’, then the Second Law might be ‘any intranet designed to please your boss will inevitably crash and burn’.
In this blog I have suggested approaches such as the Enterprise-Wide Information System and of course The Lean Intranet, but anything you do that is not based on real user wants and needs is likely to fail. So if big corporations with money and big teams can often get this wrong, how can intranet teams ever hope to succeed?
I haven’t posted for over a year as my career has taken a somewhat different trajectory: I have moved from being a jobbing Information Architect to becoming a User Data Analyst. What is User Data Analysis, you might ask? It’s a specialisation I am largely creating for myself, based on looking holistically at all available evidence of user wants, needs and behaviours in order to identify patterns and trends. This approach is proving valuable in guiding strategy and design and in ensuring that our site is responsive to our users’ changing needs within an ever-changing web ecosystem.
I’ve learnt a lot over the past year and I will attempt to share some of that learning over the next few posts, showing how approaches I’m now using in a large organisation might be adapted for intranet teams.
User Data Analysis
Never trust an unsupported piece of data, no matter where it comes from. User Data Analysis must consider all available user data from all available sources, as well as identifying the need for new ones. We make hypotheses all the time, but how do we know we’re right? For instance, we know we have a certain average number of page views for our site, so when we release a new version and page views increase dramatically, what do we make of it? All too often we view stats through our own belief system, which tells us that more page views is a great success and that our users like the site. Unfortunately it may be that users can’t find anything and are looking at more pages in an attempt to find it. If you don’t have more than one data source you will never know which is true.
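To make the page-view example concrete, here is a minimal sketch with entirely made-up session data. The figures, the ‘found what they came for’ flag and the summarise helper are all hypothetical, standing in for whatever second data source (survey responses, task testing, support contacts) you actually have; the point is only that the same rise in page views can coexist with falling task success.

```python
# Sketch: why a single metric can mislead, using made-up session data.
# Each session is recorded as (pages_viewed, found_what_they_came_for).
from statistics import mean

old_release = [(3, True), (4, True), (2, True), (5, False), (3, True)]
new_release = [(8, False), (9, False), (7, True), (10, False), (8, False)]

def summarise(sessions):
    """Return (average pages per visit, task-success rate) for a set of sessions."""
    pages = mean(p for p, _ in sessions)
    success = mean(1 if ok else 0 for _, ok in sessions)
    return pages, success

for label, sessions in [("old", old_release), ("new", new_release)]:
    pages, success = summarise(sessions)
    print(f"{label}: avg pages/visit = {pages:.1f}, task success = {success:.0%}")

# Page views more than doubled, but the second source shows users are
# clicking more because they are finding less.
```

With only the page-view column you would call the new release a success; the second column tells the opposite story, which is exactly why a single source can never settle the question.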
Only draw conclusions from multiple data sources over time. I think of this as information provenance. If you see a Van Gogh for sale and want to buy it, the first thing you will do is check the work’s provenance to make sure it’s the real thing. We should look at any data set, analysis or report in the same light: where did it come from, how trustworthy is it and what is it really telling us? If the whole point of generating and analysing data is to improve our product, then we had better be sure it’s right. Many organisations use web stats alone in making decisions and then wonder why things go pear-shaped. As we have seen, any data-gathering exercise has its flaws and limitations; it is only when multiple data sources generated by different methodologies are combined that we get closer to a description of reality.
Always be brutally honest. A quote from one of my heroes, Richard Feynman: ‘The first principle is that you must not fool yourself and you are the easiest person to fool.’ This is probably the hardest part of being a user analyst. Telling colleagues who have worked their socks off that they have wasted their time and creativity is really hard, but if it has to be done, do it as soon as possible. Projects that are on the wrong track are like vampires sucking the lifeblood from your organisation. Of course people will always want to shoot the messenger, but it is imperative that user data analysts are always truthful and don’t try to make a greyscale world black and white in order to please bosses or colleagues. The world is messy and therefore data will sometimes be messy too. You will often find data sets disagreeing with each other; sometimes this is telling us to dig a bit deeper, but sometimes it is just the way things are. And at times data simply can’t tell us what we need to know to make a decision, and we should be honest about that too. Sometimes we need to take that jump into the dark, whatever the data says.
Get to know your users. For the past year I’ve been reading all of our users’ contacts through formal and informal channels, as well as looking at web stats. I’ve attended focus groups and communicated directly with our users via email. I feel like I’m getting to know our users a little, though there are always surprises. Even the little I know is proving invaluable. I have always said that the best computer in the world is the one between our ears, and stuffing it with user data can bring you a little closer to users and provide a better understanding of their motivations, needs and wants. Once you have this, you can….
…be your users’ representative. When I was a Quality Manager in the automotive sector, the Japanese customer-centred philosophy adopted by Toyota and Nissan was drilled into me. Indeed, even though I was paid by my employers, I was told that I should think of myself as my customers’ representative within my company. Which I did. This led to some conflict with colleagues at times, but it made for a much healthier company and our customers trusted us. A user data analyst should think this way too. It is not enough to get to know your users and their wants and needs; an analyst must fight for them too. They must become the face of the user within their team. This is beginning to happen to me. I now get asked along to meetings where future strategy is discussed, and quite often when a suggestion is made the question is asked ‘What might our users make of this?’ and all eyes turn to me. I surprise myself a lot of the time as, after stuffing myself with all that user data, the answer appears obvious. Not trusting this at first, I went back over the data to confirm what I’d said in meetings, and thankfully I was right. I now trust myself to give a robust answer to questions about users, and this can really make meetings more productive; without it, lengthy discussions of dead-end topics can eat up valuable meeting time.
Be an evangelist. It is not enough to know your users, issue reports and attend meetings. Everyone in your team needs to be aware of what users want and need, and of how their own activities can affect user satisfaction. User data analysts must do their best to spread the word and be available to any member of the team when required. No-one really wants to jump into the dark. The user data analyst is the one who goes ahead holding the lamp. The team following behind may not see too clearly, but the fact that they can see at all is really the point.
I’ll be putting up some further thoughts on user data analysis in the coming months.