Once A Week Is Never Enough

Over the last few months we have been looking at User Engagement, and recently asked people to take part in a global usage survey so we could create a series of benchmark metrics detailing how organisations use OBIEE.

We had a great response and have started to process the data. One of the first things that struck us was that the average user logs into their OBIEE system about once per week.

DAU/MAU

We derive the once-per-week figure from the DAU/MAU metric we have talked about here and here. DAU/MAU shows the percentage of your user community repeatedly using the tool: a kind of stickiness metric.

DAU and MAU stand for Daily Active Users and Monthly Active Users: the number of unique users who log in and use the system over the respective period. The definition of DAU/MAU we are using is (30-day moving average of DAU) / MAU; using a moving average levels out usage spikes on particular days of the week.
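To make the definition concrete, here is a minimal sketch of the calculation in Python. The login events are hypothetical; in practice they would come from your system's usage data.

```python
from datetime import date, timedelta

# Hypothetical login events: (user, login date) pairs.
logins = [
    ("alice", date(2016, 3, 1)), ("alice", date(2016, 3, 8)),
    ("bob",   date(2016, 3, 1)), ("bob",   date(2016, 3, 2)),
    ("carol", date(2016, 3, 15)),
]

def dau(logins, day):
    """Unique users who logged in on the given day."""
    return len({u for u, d in logins if d == day})

def mau(logins, day, window=30):
    """Unique users who logged in during the window ending on `day`."""
    start = day - timedelta(days=window - 1)
    return len({u for u, d in logins if start <= d <= day})

def dau_mau(logins, day, window=30):
    """(30-day moving average of DAU) / MAU, as defined above."""
    days = [day - timedelta(days=i) for i in range(window)]
    avg_dau = sum(dau(logins, d) for d in days) / window
    m = mau(logins, day, window)
    return avg_dau / m if m else 0.0
```

A user who logs in roughly once a week contributes about 1/7 to the ratio, which is why a DAU/MAU in the low teens corresponds to weekly usage.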

The reason this measure is significant is that it tells us how integrated the system is into people's (working) lives. If you believe that the information in your OBIEE system is invaluable to your employees carrying out their jobs, then you would expect your DAU/MAU to tend towards 100%.

Note: DAU/MAU won’t identify users who don’t log into your system at all; there is an assumption that they will log in at least once over the period of a month. If that is not happening then you have an adoption problem, not an engagement problem.
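The adoption check is a separate, simpler calculation. This hypothetical sketch contrasts it with DAU/MAU: the user lists are invented for illustration.

```python
# All licensed (provisioned) users, versus those who logged in at
# least once during the month. Both sets are hypothetical examples.
provisioned = {"alice", "bob", "carol", "dave", "erin"}
active_this_month = {"alice", "bob", "carol"}

# Adoption rate: what fraction of provisioned users showed up at all.
adoption_rate = len(active_this_month & provisioned) / len(provisioned)
# dave and erin never appear in DAU or MAU, so DAU/MAU says nothing
# about them; only a measure like this surfaces the adoption gap.
```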

Is More Engagement Better?

My view is that the higher the engagement with the system, the more valuable the system is. So the questions become: is one visit per week enough; what does good look like; and what did you expect when you designed the system?

If you follow this reasoning, then we could assume that if we could get people using their OBIEE systems more, then their organisations would be better off.

Raising Engagement

We have previously discussed five levers that we think drive user engagement:

  • User interface and user experience
  • Quality, relevance, and confidence in data
  • Performance
  • Ability to use the system
  • Accessibility – is the system available in the right way, at the right time?

I want to talk about a couple of them here and how they can affect engagement.

User Interface And User Experience

We have been looking at report and dashboard design for a while now, and believe that every dashboard should tell a story. When a user goes to a screen, it must support the job they are doing. This means that dashboards have to be designed first and foremost around users’ business processes. We see too many dashboards that contain loosely coupled sets of data that developers thought should sit in the same place.

If you need to understand what data a user or analyst needs to do their job, then observe them: look at which reports and spreadsheets they use, and follow their eyes to see which cells on the various tables they monitor. Are they looking at trends, comparisons or exceptions?

We cover this in depth through the Visual Redesign phase of our User Engagement Service.

The following quote is from one of our clients who has used the service:

The new designs provide a simple and at the same time powerful view of data.

Quality, relevance, and confidence in data

However good your dashboard and report design is, if the underlying data is incorrect or inconsistent, or if the user merely thinks it is, then the system will not be valuable. Users will go to other sources for the data, and they will stop using the untrusted source; too often, trust is purely subjective. This is one of the reasons Excel takes root in organisations as a reporting tool: it is somewhere users can self-source data, transform and aggregate it as they please and, perhaps misguidedly, trust the results.

This starts to open up a much wider conversation about Data Governance that I will address in future posts. We have spoken to a lot of organisations recently where the issue is not that users don’t have access to data; it is that there are multiple systems they could get the answer from, and multiple routes through their information architecture along which the data flows. You need a clear business intelligence and analytics architecture and roadmap.

Business intelligence and analytics systems are typically judged by return on investment (ROI); however, I wonder whether organisations with mature information architectures supported by complex data flows could be better off examining the cost and complexity of their data architecture. By doing so they could address both data governance and the total cost of ownership (TCO).

The first step should be to analyse the analytics-driven data flows: find out where business logic is applied, where there is duplication, and where there are silos. If you can simplify these flows, you start to lay the foundation for a better ROI business case.

So addressing quality, relevance, and confidence in data is a big undertaking. There are some simple things you can do, such as maintaining a taxonomy of reporting attributes across the organisation; however, in most organisations with mature information architectures the answer is likely to be more deep-rooted.

Conclusion

Our goal at Rittman Mead is to help companies harness the data in and around their organisations through a range of analytics strategies, processes and tools. We see User Engagement as a key part of this, and we believe our User Engagement Service provides the right tools to help you raise your DAU/MAU.

We have a fundamental belief that the more people embed business intelligence and analytics into their working lives, the better job they will do for their company.

The great thing is that user engagement in OBIEE is measurable: Usage Tracking provides unequivocal details of how your system is performing, who is using it and how often, both now and after any changes.
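As a sketch of what measuring this looks like, the snippet below summarises an extract of Usage Tracking rows into average active days per user per week. The column names USER_NAME and START_TS follow the standard S_NQ_ACCT usage tracking table, but verify them against your own instance; the rows here are invented for illustration.

```python
from collections import defaultdict
from datetime import datetime

def active_days_per_week(rows, weeks=4):
    """Average distinct active days per week for each user, averaged
    across users, over a `weeks`-long extract of usage tracking rows."""
    days_by_user = defaultdict(set)
    for row in rows:
        # START_TS format assumed; adjust to match your export.
        ts = datetime.strptime(row["START_TS"], "%Y-%m-%d %H:%M:%S")
        days_by_user[row["USER_NAME"]].add(ts.date())
    if not days_by_user:
        return 0.0
    per_user = [len(days) / weeks for days in days_by_user.values()]
    return sum(per_user) / len(per_user)

# Hypothetical four-week extract: alice is a weekly user, bob is not.
rows = [
    {"USER_NAME": "alice", "START_TS": "2016-03-01 09:00:00"},
    {"USER_NAME": "alice", "START_TS": "2016-03-08 09:30:00"},
    {"USER_NAME": "alice", "START_TS": "2016-03-15 10:00:00"},
    {"USER_NAME": "alice", "START_TS": "2016-03-22 11:00:00"},
    {"USER_NAME": "bob",   "START_TS": "2016-03-02 14:00:00"},
    {"USER_NAME": "bob",   "START_TS": "2016-03-16 14:00:00"},
]
```

Running the same summary before and after a dashboard redesign gives a direct, like-for-like measure of whether engagement has moved.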