
How to Avoid Analytics Failure: 10 Factors for Analytics Success from Research and Practice

Analytics Success Factors

This article expands on the material I discussed in a presentation of the same name I gave at Big Data World Asia 2018.


Introduction

The analytics* profession has been building data-driven decision support systems in organisations for almost 50 years. Given 50 years of global collective experience, do you think the profession has a very high success rate? If you answered “yes” then you would be mistaken. The cause of analytics failure is usually not technical but organisational and cultural. How can you reduce the risk of your analytics initiative failing? In this article, I highlight 10 factors for analytics success drawn from 20+ years of academic research and industry experience.

How Bad is the Problem?

Academic and industry analyst research conducted over the last 20 years shows that the analytics profession not only has a very high failure rate but that the failure rate is arguably increasing. The research shows that executive information systems (EIS) in the 1990s, business intelligence (BI) systems in the 2000s and advanced analytics systems more recently had failure rates ranging from 60% to 85% [1][2][3][4][5].

These results are not good, and to make matters worse, it is possible the real failure rate could be higher than the research reports because organisations tend to over-represent successful projects and under-represent failed projects when they respond to project outcome research.

Why Do Analytics Initiatives Fail?

Why, after 50 years, do the business intelligence, business analytics and data science professions collectively have such a low success rate? We have a lot of experience, and the technology is cheaper, easier and more accessible than ever before. Surely we should be good at this by now. I believe our failure rate is so high for two reasons.

First, we do a bad job of learning from our collective failures and mistakes. When was the last time you went to a conference presentation that talked about a failed analytics initiative and shared the lessons learned? When was the last time you read academic or industry research that analysed failed analytics initiatives and recommended factors to increase success?

Those who fail to learn from the mistakes of their predecessors are destined to repeat them. 

Second, we are too focused on the relatively easy technical aspects of analytics initiatives and tend to ignore the relatively difficult and arguably more important business and people aspects. As the success factors below show, the technical factors of a successful analytics initiative only account for 20-25% of all of the success factors.

How Do You Make Analytics Initiatives Successful?

Below I present ten factors that when implemented correctly will significantly increase the chance of analytics initiative success. These factors are drawn from over 20 years of academic research into the outcomes of analytics initiatives [2][6][7] and tested by my own successes and failures.

1. Committed and Informed Executive Sponsorship

Choose the most senior business (not IT) executive relative to the scope of the analytics initiative. Do not start without this.

This is the most important success factor. If you do not have this I strongly suggest putting the initiative on hold until you do. The ideal analytics sponsor is an executive who is:

  • A business leader NOT an IT leader. Analytics is a business initiative that is enabled by IT and not an IT initiative. It is great if an IT executive is also involved but the ultimate sponsor must be a business executive.
  • As senior as possible relative to the scope of the initiative. If the initiative is divisional in scope (e.g. a marketing analytics initiative) then the CMO or a similar person would be suitable. If the initiative is enterprise-wide then the sponsor would ideally be the CEO.
  • Sufficiently committed to analytics to be prepared to spend some personal reputation capital to champion analytics to the organisation and use their position to help remove roadblocks when they occur.
  • Informed about analytics, at least at a business level, so they can make informed decisions about the initiative as it progresses.

2. Widespread Management Support

Having widespread management support helps to manage change, overcome resistance and secure necessary resources.

Widespread management support is critical for the long-term success of an analytics initiative. Success ultimately comes from the people in the organisation changing the way they work and think about data so that they want to use the analytics systems regularly. Widespread management support can make the change management process easier, help overcome resistance to using the analytics systems and secure sustainable resourcing for the maintenance and growth of the systems.

Widespread management support for analytics is not necessarily needed in the early stages, but it is something you should secure as soon as possible if you want analytics to be viable in the long term.

3. Clear Link to Business Objectives

All analytics initiatives must positively contribute to business objectives. This is required for the survival of analytics.

If the use of an analytics system does not visibly contribute to achieving one or more of the organisation’s objectives then the long-term survival of analytics is in jeopardy. A successful analytics initiative requires a long-term commitment of people and infrastructure. The sponsoring executive and the management team will very quickly cancel an analytics initiative if it is not making a positive contribution to the goals and objectives the management team is rewarded on.

When starting an analytics initiative, pick a pilot project that addresses an identifiable business problem and that, when solved, will have clearly measurable outcomes. Also, as each new initiative is deployed, always make time to write a mini internal case study that explains how using the new or improved analytics system added business value. These case studies are not only helpful internal marketing for analytics but also evidence to justify ongoing funding if the worth of analytics is ever questioned.

4. Have the Right People

Have a diverse team with both business and technical skills who act as business partners, and ensure business users are data literate and competent decision makers.

There are two types of people you need to consider when planning for analytics success: The analytics team and the analytics users.

When choosing your analytics team members, remember that analytics is a multi-disciplinary profession combining both business and technical skills. Depending on their specific role, a team member will have skills biased toward either business or technology.

You should resist the urge to hire the so-called full-stack data analyst or data scientist. Not only are these people almost as rare as unicorns and often very expensive, they are also usually more like a jack of all trades and a master of none. You are better off finding people who have a T-shaped or Pi-shaped skill set, that is, people who are moderately competent in a broad range of analytics skills but specialise in one or two of them. You should also aim to make your analytics team as diverse as possible on as many factors as you can (e.g. gender, ethnicity, educational background, industry experience). A diverse analytics team can provide more innovative solutions to problems and help minimise problems with system bias.

With regards to team structure, you should structure and position your team to be more like a group of consultants that partner with the business to support decision making rather than a team of technologists who simply build systems.

When it comes to analytics users, it is important to ensure they have the right skills and knowledge to make the best use of the analytics systems your team provides. In addition to providing software and system training, it is equally, or even more, important that analytics users are data literate so they can correctly interpret the insights from the system and have the knowledge to translate those insights into value-adding business decisions.

5. Use Evolutionary Development Methods

Use development methods that allow your team to deliver content quickly and respond to rapidly changing requirements.

Some of the earliest research into analytics success identified the need to use evolutionary development methods, that is, methods based on continuous cycles of development that involve significant user participation. The reason to use evolutionary development is that requirements for analytics change continuously and often rapidly. The continuous change in requirements is caused not only by changes in the business environment, but also by users learning more about the business environment through their use of the analytics systems.

While the concept of evolutionary development predates modern agile development by some 20 years [8], evolutionary development methods are most closely aligned with modern, data-oriented agile development methods, with one key difference: the nature of the final system is unknown, and even if it is known it is very likely to change. Whichever method you choose to manage the development of your analytics system, you must ensure that it is flexible and can deliver changes rapidly.

6. Choose Appropriate Technology

Your analytics technology should fit business requirements, allow rapid changes and offer a great user experience.

There is an almost limitless range of analytics technology available for you to choose from. The set of technologies you choose for analytics in your organisation should closely fit the requirements of the business users, and not be chosen, as I have seen on more than one occasion, to fill gaps in the resumes of members of the analytics team. Most importantly, the criteria for choosing any business user-facing technology product should be heavily weighted towards factors related to user experience. If business users find analytics tools too hard to use, their adoption rate will be low and the long-term viability of your analytics initiative will be threatened.

Choose technology products that are flexible and adapt to changing user needs. Also, choose only the minimum set of products you need to meet those needs. Always remember: analytics tools are not Pokémon, you don’t have to collect ’em all.

7. Have Appropriate Resources

Analytics is a process, not a project. Ensure you secure ongoing funding for people, tools and services to meet user demand.

Analytics is a process, not a project. This means your analytics system is never finished. It needs frequent maintenance and enhancement to remain relevant to the business users. The resourcing model for your analytics initiative must therefore reflect that it is a process. The funding for your team, tools and services must be recurrent and set at a level that ensures the level of service, the rate of development and the expansion of capability can be planned for with certainty.

Analytics should be seen as an ongoing business function, not an IT project and therefore it should be funded accordingly.

8. Understand Decision Support Requirements

Analytics supports decision making. Decision makers and the decisions they make should guide everything your team does.

Analytics is not about solving business problems, although that is part of the process, and it is not about generating insights, although again that is also part of the process. Analytics ultimately exists to support and/or automate business decision making. To achieve this purpose, all members of your analytics team must understand human decision making and how people use, and often misuse, data when making decisions.

All aspects of your analytics team’s work should be decision-driven, focusing primarily on the decision makers and the decisions they have to make.

9. Effective Data Management

Choose tools that ensure the currency, consistency and accuracy of data. Beware of the urge to wait for perfect data.

Data is the raw material of any analytics initiative. As is the case with cooking, the best methods and equipment cannot correct for poor quality ingredients. Following from success factor six your analytics initiative should use the most appropriate tools to ensure the currency, consistency and accuracy of data. Also, any data models should be flexible and extensible.

One thing to be aware of is the desire for data perfectionism. It is common for analytics teams to believe they can only proceed with building part of an analytics system once they have the best data. The reality is that even the best data is an imperfect model of the real world, so it is more important to work toward data that is good enough for the decision being supported. For example, if your data is 80% correct and complete and someone suggests spending more resources to make it 90% or 100% complete, you have to ask: how likely is it that a different decision will be made with the better data, and is the incremental value of that better decision likely to exceed the incremental cost of obtaining it?
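The trade-off above can be sketched as a simple expected-value comparison. The following is a minimal, hypothetical illustration (the function name and all numbers are invented for the example, not taken from any real project):

```python
# Hypothetical sketch of the "good enough data" trade-off described above.
# All figures are made up for illustration.

def incremental_quality_worth_it(p_better_decision: float,
                                 value_of_better_decision: float,
                                 cost_of_better_data: float) -> bool:
    """Return True if the expected value of decisions improved by
    higher-quality data exceeds the cost of obtaining that data."""
    expected_gain = p_better_decision * value_of_better_decision
    return expected_gain > cost_of_better_data

# Suppose going from 80% to 95% completeness would change the decision
# in about 5% of cases, each worth $10,000, but cost $20,000 to achieve.
print(incremental_quality_worth_it(0.05, 10_000, 20_000))  # prints False
```

Here the expected gain ($500) is far below the cost ($20,000), so the extra data-quality work is not justified; the same check with a higher probability or decision value could easily flip the answer.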

10. Manage Project Scope

Carefully manage the scope of development iterations to always meet user expectations and plan to meet future demand.

While success factor seven tells us that analytics is a process, developing each iteration can be considered a small project. It is important to set the scope of each iteration so your team can deliver what it promised in the time frame. As with all projects, it is always better to under-promise and over-deliver than the reverse.

On the longer-term planning scale, it is important for your team to not become a victim of its own success. If you implement all of these success factors fully, there is a good chance your analytics initiative will be very successful and you will have business people from all over the organisation wanting your team to build systems to improve decision making. Always be planning ahead with regards to the demand for your team’s services and ensure that your management sponsor and stakeholders are prepared to provide the financial resources you need to meet demand.

What’s Next?

Is your analytics initiative set up for success or heading for failure? How many success factors is your analytics initiative missing? Almost all or just a few?

Do you want to know how to assess your organisation’s readiness for analytics success? Contact us today for a free consultation.


A short form of this article is available at LinkedIn – How to Avoid Analytics Failure


End Notes

* I define “analytics” very broadly. I include any initiative that involves the design, development and implementation of computer systems that use data, statistical and quantitative analysis, and explanatory and predictive models to support or automate decision making in organisations. This includes business intelligence, business analytics, advanced analytics, data science, etc.

References

  1. C. E. Koh and H. J. Watson, “Data management in executive information systems,” Information Management, vol. 33, no. 6, pp. 301–312, Jun. 1998.
  2. P. Poon and C. Wagner, “Critical success factors revisited: success and failure cases of information systems for senior executives,” Decision Support Systems, vol. 30, no. 4, pp. 393–418, Mar. 2001.
  3. W. Kernochan, “Why Most Business Intelligence Projects Fail,” Enterprise Apps Today, 02-May-2011.
  4. Gartner, Inc., “Gartner says business intelligence and analytics leaders must focus on mindsets and culture to kick start advanced analytics,” Gartner Newsroom, 15-Sep-2015.
  5. M. Asay, “85% of big data projects fail, but your developers can help yours succeed,” TechRepublic, 10-Nov-2017.
  6. R. K. Rainer and H. J. Watson, “What does it take for successful executive information systems?,” Decision Support Systems, vol. 14, no. 2, pp. 147–156, Jun. 1995.
  7. D. Arnott, “Success Factors for Data Warehouse and Business Intelligence Systems,” in ACIS 2008 Proceedings, Canterbury, NZ, 2008.
  8. D. Arnott, “Decision support systems evolution: Framework, case study and research agenda,” European Journal of Information Systems, vol. 13, no. 4, pp. 247–259, Sep. 2004.