Minerra Blog

How to Avoid Analytics Failure: 10 Factors for Analytics Success from Research and Practice

This article expands on the material I discussed in a presentation of the same name I gave at Big Data World Asia 2018.


The analytics* profession has been building data-driven decision support systems in organisations for almost 50 years. Given 50 years of global collective experience, do you think the profession has a very high success rate? If you answered “yes”, you would be mistaken. The cause of analytics failure is usually not technical but organisational and cultural. How can you reduce the risk of your analytics initiative failing? In this article, I highlight 10 factors for analytics success drawn from 20+ years of academic research and industry experience.

How Bad is the Problem?

Academic and industry analyst research conducted over the last 20 years shows that the analytics profession not only has a very high failure rate but that failure rate is arguably increasing. The research shows that executive information systems (EIS) in the 1990s, business intelligence (BI) systems in the 2000s and advanced analytics systems more recently had failure rates ranging from 60% to 85% [1][2][3][4][5].

These results are not good, and to make matters worse, the real failure rate could be even higher than the research reports because organisations tend to over-represent successful projects and under-represent failed projects when they respond to project outcome research.

Why Do Analytics Initiatives Fail?

Why, after 50 years, do we in the business intelligence, business analytics and data science professions have such a low collective success rate? We have a lot of experience, and the technology is cheaper, easier and more accessible than ever before. Surely we should be good at this by now. I believe our failure rate is so high for two reasons.

First, we do a bad job of learning from our collective failures and mistakes. When was the last time you went to a conference presentation that talked about a failed analytics initiative and shared the lessons learned? When was the last time you read academic or industry research that analysed failed analytics initiatives and recommended factors to increase success?

Those who fail to learn from the mistakes of their predecessors are destined to repeat them. 

Second, we are too focused on the relatively easy technical aspects of analytics initiatives and tend to ignore the relatively difficult and arguably more important business and people aspects. As the success factors below show, the technical factors of a successful analytics initiative only account for 20-25% of all of the success factors.

How Do You Make Analytics Initiatives Successful?

Below I present ten factors that when implemented correctly will significantly increase the chance of analytics initiative success. These factors are drawn from over 20 years of academic research into the outcomes of analytics initiatives [2][6][7] and tested by my own successes and failures.

1. Committed and Informed Executive Sponsorship

Choose the most senior business (not IT) executive relative to the scope of the analytics initiative. Do not start without this.

This is the most important success factor. If you do not have this I strongly suggest putting the initiative on hold until you do. The ideal analytics sponsor is an executive who is:

  • A business leader NOT an IT leader. Analytics is a business initiative that is enabled by IT and not an IT initiative. It is great if an IT executive is also involved but the ultimate sponsor must be a business executive.
  • As senior as possible relative to the scope of the initiative. If the initiative is divisional in scope (e.g. marketing analytics) then the CMO or a similar person would be suitable. If the initiative is enterprise-wide then the sponsor would ideally be the CEO.
  • Sufficiently committed to analytics to be prepared to spend some personal reputation capital to champion analytics to the organisation and use their position to help remove roadblocks when they occur.
  • Informed about analytics, at least at a business level, so they can make informed decisions about the initiative as it progresses.

2. Widespread Management Support

Having widespread management support helps to manage change, overcome resistance and secure necessary resources.

Widespread management support is critical for the long-term success of an analytics initiative. Success ultimately comes from people in the organisation changing the way they work and think about data so that they want to use the analytics systems regularly. Widespread management support can make the change management process easier, help overcome resistance to using the analytics systems and secure sustainable resourcing for the maintenance and growth of the systems.

Widespread management support for analytics is not necessarily needed in the early stages, but it is something you should secure as soon as possible if you want analytics to be viable in the long term.

3. Clear Link to Business Objectives

All analytics initiatives must positively contribute to business objectives. This is required for the survival of analytics.

If the use of an analytics system does not visibly contribute to achieving one or more of the organisation’s objectives then the long-term survival of analytics is in jeopardy. A successful analytics initiative requires a long-term commitment of people and infrastructure. The sponsoring executive and the management team will very quickly cancel an analytics initiative if it is not making a positive contribution to the goals and objectives the management team are rewarded on.

When starting an analytics initiative, pick a pilot project that addresses an identifiable business problem and will have clearly measurable outcomes when solved. Also, as each new initiative is deployed, always make time to write a mini internal case study that explains how using the new or improved analytics system added business value. These case studies are not only helpful internal marketing for analytics but also evidence to justify ongoing funding if the worth of analytics is ever questioned.

4. Have the Right People

Have a diverse team with both business and technical skills who act as business partners, and ensure business users are data literate and competent decision makers.

There are two types of people you need to consider when planning for analytics success: The analytics team and the analytics users.

When choosing your analytics team members, remember analytics is a multi-disciplinary profession that combines both business and technical skills. Depending on their specific role, a team member will have skills biased either toward the business side or toward the technical side.

You should resist the urge to hire the so-called full-stack data analyst or data scientist. Not only are these people almost as rare as unicorns and often very expensive, they are also usually more like a jack of all trades and a master of none. You are better off finding people who have a T-shaped or Pi-shaped skill set, that is, people who are moderately competent in a broad range of analytics skills but specialise in one or two specific analytics skills. You should also aim to make your analytics team as diverse as possible on as many factors (e.g. gender, ethnicity, educational background, industry experience, etc.) as you can. A diverse analytics team can provide more innovative solutions to problems and help minimise problems with system bias.

With regards to team structure, you should structure and position your team to be more like a group of consultants that partner with the business to support decision making rather than a team of technologists who simply build systems.

When it comes to analytics users it is important to ensure they have the right skills and knowledge to make the best use of the analytics systems your team provides. In addition to providing software and system training, it is equally or even more important the analytics users are data literate so they can correctly interpret the insights from the system and have the knowledge to translate those insights into value-adding business decisions.

5. Use Evolutionary Development Methods

Use development methods that allow your team to deliver content quickly and respond to rapidly changing requirements.

Some of the earliest research into analytics success identified the need to use evolutionary development methods, that is methods based on continuous cycles of development that involve significant user participation. The reason to use evolutionary development is that requirements for analytics change continuously and often rapidly. The continuous change in requirements is not only caused by changes in the business environment, but also by the user learning more about the business environment through the use of the analytics systems.

While the concept of evolutionary development predates modern agile development by some 20 years [8], evolutionary development methods are most closely aligned to modern, data-oriented agile development methods but with one small change: the nature of the final system is unknown and even if it is known it is very likely to change. Whichever method you chose to manage the development of your analytics system you must ensure that it is flexible and can deliver changes rapidly.

6. Choose Appropriate Technology

Your analytics technology should fit business requirements, allow rapid changes and offer a great user experience.

There is an almost limitless range of analytics technology available for you to choose from. The set of technologies you choose for analytics in your organisation should closely fit with the requirements of the business users, and not chosen, as I have seen on more than one occasion, to fill gaps in the resume of members of the analytics team. Most importantly the criteria for choosing any business user-facing technology products should be heavily weighted towards factors related to user experience. If business users find analytics tools too hard to use their adoption rate will be low and the long-term viability of your analytics initiative will be threatened.

Choose technology products that are flexible and adapt to changing user needs. Also, only choose the minimum set of products you need to meet the user needs because always remember analytics tools are not Pokemon, you don’t have to collect ‘em all.

7. Have Appropriate Resources

Analytics is a process, not a project. Ensure you secure ongoing funding for people, tools and services to meet user demand.

Analytics is a process, not a project. This means your analytics system is never finished. It needs frequent maintenance and enhancement to ensure it remains relevant to the business user. To achieve this, the resourcing model for your analytics initiative must reflect that it is a process. The funding for your team, tools and services must be recurrent and set at a level that ensures the level of service, the rate of development and the expansion of capability are accounted for and can be planned for with certainty.

Analytics should be seen as an ongoing business function, not an IT project and therefore it should be funded accordingly.

8. Understand Decision Support Requirements

Analytics supports decision making. Decision makers and the decisions they make should guide everything your team does.

Analytics is not about solving business problems, although that is part of the process, and it is not about generating insights, although that too is part of the process. Analytics ultimately exists to support and/or automate business decision making. To achieve this purpose, all members of your analytics team must understand human decision making and how people use and often misuse data when making decisions.

Everything your analytics team does should be decision-driven and focus primarily on the decision maker and the decisions they have to make.

9. Effective Data Management

Choose tools that ensure the currency, consistency and accuracy of data. Beware of the urge to wait for perfect data.

Data is the raw material of any analytics initiative. As is the case with cooking, the best methods and equipment cannot correct for poor quality ingredients. Following from success factor six your analytics initiative should use the most appropriate tools to ensure the currency, consistency and accuracy of data. Also, any data models should be flexible and extensible.

One thing to be aware of is the desire for data perfectionism. It is common for analytics teams to believe they can only proceed with building some part of an analytics system once they have the best possible data. The reality is that even the best data is always an imperfect model of the real world, so it is more important to work toward data that is good enough for the decision being supported. For example, if your data is 80% correct and complete and someone suggests spending more resources to make the data 90% or 100% complete, you have to ask: how likely is it that a different decision would be made with the better data, and is the incremental value of that better decision likely to exceed the incremental cost of obtaining the better data?
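The cost-benefit question above can be sketched as a back-of-envelope expected-value calculation. The function name, dollar figures and probabilities below are all hypothetical, invented purely to illustrate the reasoning:

```python
# Illustrative only: is better data worth the cost of obtaining it?

def incremental_value_of_better_data(
    decision_value: float,    # value of making the right decision
    p_correct_now: float,     # chance current data leads to the right decision
    p_correct_better: float,  # chance improved data leads to the right decision
    improvement_cost: float,  # cost of cleaning/completing the data
) -> float:
    """Expected net gain from improving the data before deciding."""
    expected_gain = decision_value * (p_correct_better - p_correct_now)
    return expected_gain - improvement_cost

# Suppose 80%-complete data gets the decision right 85% of the time,
# and pushing the data to near-perfect only lifts that to 88%.
net = incremental_value_of_better_data(
    decision_value=100_000,
    p_correct_now=0.85,
    p_correct_better=0.88,
    improvement_cost=20_000,
)
print(round(net))  # -17000: the extra data quality isn't worth it here
```

The point is not the specific numbers but the habit of comparing the incremental value of a better decision against the incremental cost of better data.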

10. Manage Project Scope

Carefully manage the scope of development iterations to always meet user expectations and plan to meet future demand.

While success factor seven tells us that analytics is a process, developing each iteration can be considered a small project. It is important to set the scope of each iteration so your team can deliver what it promised in the time frame. As with all projects, it is always better to under-promise and over-deliver than the reverse.

On the longer-term planning scale, it is important for your team to not become a victim of its own success. If you implement all of these success factors fully, there is a good chance your analytics initiative will be very successful and you will have business people from all over the organisation wanting your team to build systems to improve decision making. Always be planning ahead with regards to the demand for your team’s services and ensure that your management sponsor and stakeholders are prepared to provide the financial resources you need to meet demand.

What’s Next?

Is your analytics initiative set up for success or heading for failure? How many success factors is your analytics initiative missing? Almost all or just a few?

Do you want to know how to assess your organisation’s readiness for analytics success? Contact us today for a free consultation.

A short form of this article is available at LinkedIn – How to Avoid Analytics Failure

End Notes

* I define “analytics” very broadly. I include any initiative that involves the design, development and implementation of computer systems that use data, statistical and quantitative analysis, and explanatory and predictive models to support or automate decision making in organisations. This includes business intelligence, business analytics, advanced analytics, data science, etc.


  1. C. E. Koh and H. J. Watson, “Data management in executive information systems,” Information Management, vol. 33, no. 6, pp. 301–312, Jun. 1998.
  2. P. Poon and C. Wagner, “Critical success factors revisited: success and failure cases of information systems for senior executives,” Decision Support Systems, vol. 30, no. 4, pp. 393–418, Mar. 2001.
  3. W. Kernochan, “Why Most Business Intelligence Projects Fail,” Enterprise Apps Today, 02-May-2011.
  4. Gartner, Inc., “Gartner says business intelligence and analytics leaders must focus on mindsets and culture to kick start advanced analytics,” Gartner Newsroom, 15-Sep-2015.
  5. M. Asay, “85% of big data projects fail, but your developers can help yours succeed,” TechRepublic, 10-Nov-2017.
  6. R. K. Rainer and H. J. Watson, “What does it take for successful executive information systems?,” Decision Support Systems, vol. 14, no. 2, pp. 147–156, Jun. 1995.
  7. D. Arnott, “Success Factors for Data Warehouse and Business Intelligence Systems,” in ACIS 2008 Proceedings, Canterbury, NZ, 2008.
  8. D. Arnott, “Decision support systems evolution: Framework, case study and research agenda,” European Journal of Information Systems, vol. 13, no. 4, pp. 247–259, Sep. 2004.

Data to Dashboard in 90 Minutes with Ajilius Data Warehouse Automation

This is the third video in our Data Warehouse Automation series, Data to Dashboard in 90 Minutes with Ajilius Data Warehouse Automation. In this video, we demonstrate a full end-to-end workflow for the creation of a data warehouse from the Sakila movie rental database using Ajilius.

The following steps are covered during the demonstration:

  • Connecting to data sources (MySQL and CSV)
  • Extracting data from data sources to loading tables
  • Transforming data from load tables to staging tables
  • Creating dimension and fact tables
  • Exporting metadata to Yellowfin for data analysis

We hope you find the video informative. A transcript of the first component of the video is included below.

Our second video, Introduction to Data Warehouse Automation covered some of the fundamentals of data warehouse automation and how it delivers significant value to organisations.

In this video, I use the Ajilius data warehouse automation platform to demonstrate a full end-to-end workflow to create a data warehouse from scratch. I speed the video up along the way to compress about 90 minutes of work into about 25 minutes. Future videos will show each part of the process at normal speed.

The Sakila Database

First, we’ll talk about the database that we’ll be using for this exercise. We’re going to use the ‘Sakila’ database which is a sample data set for the MySQL database platform.

The story behind the sample database is a DVD rental company. The company uses a system to record all the rental activity for the business, and all data entered into the system is saved in a transactional database. This transactional database is the dataset we’ll be working with.

The dataset contains a variety of information such as:

  • Information about customers and staff (personal details, etc.)
  • Information about each store (location, manager, etc.)
  • Information about each film (e.g. title, language, genre)
  • Information about when each film was rented and returned
  • Metrics such as how much revenue the rental generated

The data has been collected in a transactional database, which isn’t optimised for reporting and analytics. In the first video in our data warehouse automation series, we mentioned that a star schema is the best data structure to support analytics.

Building a Data Warehouse with Ajilius

In this exercise, we’re going to transform the transactional data in the DVD rental database to a star schema. This will allow quick and easy analysis of the rental transactions at the DVD rental company.

We will build the following:

  • A fact table containing metrics such as how long each film was rented for and how much revenue was generated
  • A series of dimensions allowing this data to be sliced and diced by staff member, customer, store, film and date

Sakila Movie Rental Database Structure

The process we’ll go through is as follows:

  • Create a destination database for the warehouse in Microsoft SQL Server and connect the destination data warehouse to Ajilius
  • Create a connection to the source database, which is the Sakila DVD rental database in MySQL
  • Extract data from the source database to the loading tables
  • Transform the data from the loading tables into the staging tables
  • Create dimension and fact tables
  • Create relationships between the dimensions and fact table to create a star schema and then export the metadata to Yellowfin to allow immediate analysis of the data
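The load → dimension → fact flow in the steps above can be sketched in miniature. This is not Ajilius output; it is a hand-rolled illustration using an in-memory SQLite database (standing in for the MySQL source and SQL Server target) and a few invented sample rows:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# 1. "Loading" table: raw rows extracted from the source system
cur.execute("CREATE TABLE load_rental (rental_id, customer_name, film_title, amount)")
cur.executemany(
    "INSERT INTO load_rental VALUES (?, ?, ?, ?)",
    [(1, "Alice", "ACADEMY DINOSAUR", 3),
     (2, "Bob", "ACE GOLDFINGER", 5),
     (3, "Alice", "ACE GOLDFINGER", 5)],
)

# 2. Dimension table: one row per distinct customer, with a surrogate key
cur.execute("""
    CREATE TABLE dim_customer AS
    SELECT ROW_NUMBER() OVER (ORDER BY customer_name) AS customer_key,
           customer_name
    FROM (SELECT DISTINCT customer_name FROM load_rental)
""")

# 3. Fact table: measures keyed to the dimension's surrogate key
cur.execute("""
    CREATE TABLE fact_rental AS
    SELECT d.customer_key, l.film_title, l.amount
    FROM load_rental l
    JOIN dim_customer d USING (customer_name)
""")

# Slice revenue by customer via the star schema
rows = cur.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_rental f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name ORDER BY d.customer_name
""").fetchall()
print(rows)  # [('Alice', 8), ('Bob', 5)]
```

A data warehouse automation tool generates this kind of plumbing (plus history tracking, documentation and scheduling) from metadata rather than having a developer write it by hand.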

A diagram of the steps involved in creating the Sakila Data Warehouse using Ajilius

Summary of Timings

The table below summarises the time it took for each of the components of the data warehouse to be completed. All in all, the data warehouse was created in less than 90 minutes!

A table displaying a summary of the time taken to create the data warehouse

Stay tuned for future videos in our data warehouse automation series where we’ll use the Ajilius Data Warehouse Automation platform to further build upon the data warehouse we have just created.

What Is Data Warehouse Automation And How Does It Deliver Value?

This is the second video in our Data Warehouse Automation series, Introduction to Data Warehouse Automation. This video is ideal for people from a business background who are new to data warehouse automation and want to find out more. We’ll be covering the following topics:

If you haven’t already seen the first video in the series, Introduction to Data Warehousing, you may want to watch it first.

We hope you find the video below informative.

Notes from the video are provided below. The slides can also be found on SlideShare.

Our first video, Introduction to Data Warehousing covered some of the basics of data warehouses and how they deliver business value.

In this video, we’re going to talk more about the development of data warehouses, and in particular, give you an introduction to data warehouse automation, and how it delivers significant value to organisations. Let’s first cover what is involved in developing a data warehouse.

Overview of traditional data warehouse development

Diagram showing the ETL / ELT process of a data warehouse

In traditional data warehouse development, organisations often have data sitting in various locations. This includes operational systems (where the data isn’t structured for reporting), Excel or CSV files, or cloud platforms of some form. We need to get the data from these sources into our data warehouse, where it is structured for reporting.

This is done by:

  • Writing code to extract the data from the sources and load it into a temporary location.
  • Transforming the data into a format suitable for reporting. This step also includes processes to improve data quality (called cleansing) and to integrate data from different sources.
  • Making the data available in the data warehouse where it is structured in an optimum way for reporting purposes

This whole process is known in the industry as “ELT” or “ETL”. Traditionally, all of the code for the ELT/ETL process was written from scratch.
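As a toy illustration of the three steps listed above (not how any particular tool implements them), here is a minimal extract-transform-load run, with a CSV string standing in for a source system and an in-memory SQLite database standing in for the warehouse:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from the "source" into a temporary staging list
source_csv = "order_id,amount,country\n1, 20,au\n2,35 ,nz\n3,15,au\n"
raw_rows = list(csv.DictReader(io.StringIO(source_csv)))

# Transform: cleanse (trim whitespace, normalise case) and type-convert
clean_rows = [
    (int(r["order_id"]), float(r["amount"].strip()), r["country"].strip().upper())
    for r in raw_rows
]

# Load: write into a reporting-friendly warehouse table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fct_orders (order_id INTEGER, amount REAL, country TEXT)")
con.executemany("INSERT INTO fct_orders VALUES (?, ?, ?)", clean_rows)

total = con.execute(
    "SELECT SUM(amount) FROM fct_orders WHERE country = 'AU'"
).fetchone()[0]
print(total)  # 35.0
```

Even this trivial example needs cleansing and type-conversion code; a real warehouse multiplies that across dozens of sources and hundreds of tables, which is why hand-coding consumes so much project time.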

The problems with traditional data warehouse development methods

There are four main problems with traditional data warehouse development.

  • It’s time-consuming, as it requires significant manual coding – approximately 70-75% of the total project time. As a result, projects take far too long and users have to wait to get the information they need.
  • It’s error-prone. Due to the amount of manual coding required, it is easy to make mistakes and difficult to find and fix errors.
  • It’s costly due to the development resources and effort required. Coding accounts for 70-75% of project costs. As a result, projects often fail – due to budget overruns, slow delivery, and low perceived value among stakeholders.
  • Users get frustrated as it takes way too long for them to get the information they need.

What’s the solution to these problems?

You might say that data warehouses are bad – let’s stop using data warehouses. However this isn’t the right solution because data warehouses provide a tremendous amount of business value. The problem was that we were using the wrong approach. There was a mismatch between traditional data warehouse development and the evolutionary nature of analytics development.

There were three main problems that are specific to analytics development.

  • Users don’t know whether they want something until they’ve actually seen it. We need an approach that releases functionality rapidly in short, iterative cycles and makes changes based on user feedback. If we built most of the functionality in one go, we would run a high risk that the output would have little value to the business.
  • Users’ requirements change once they start using the analytics system. If you don’t respond to those changes, people will stop using it.
  • Manual coding takes far too long and costs too much money.

The right solution to these problems is to use a combination of:

  • Evolutionary analytics development methods, and
  • Data warehouse automation

Evolutionary Development

Evolutionary Analytics Development: An iterative cycle of prioritisation, prototyping and feedback.

Evolutionary development involves a constant cycle of requirements gathering, rapid prototyping and review. It aims to deliver functionality incrementally in short release cycles. The evolutionary approach allows development of analytics systems that deliver value in a shorter time frame and have a tighter fit with organisational decision-making requirements.

The problem is that the evolutionary approach is extremely difficult with traditional data warehouse development methods – because manual coding takes too long. So we need to use data warehouse automation to solve these problems.

How Data Warehouse Automation delivers value

Rapid data warehouse delivery and agile changes
Data warehouse automation automates much of the design, build and maintenance of your data warehouse. You can pull data from virtually any data source or cloud-based platform and deploy new functionality rapidly – in days and weeks, not months and years.

Dramatically reduced development costs
As all development is metadata-driven, minimal coding is required. As such, fewer ELT developers are needed and a high-quality solution can be delivered in a very short time.

Consistent high quality code and automated documentation
Data warehouse automation builds the entire extract, load and transform process by combining metadata that describes your data warehouse with industry best practice design patterns. This results in high-quality, consistent and error-free code with auto-generated documentation.

Choice of data warehouse platform
One of the problems with traditional data warehouse development is that after you’ve spent time writing the code to extract data from your source systems into your destination data warehouse, you are more or less locked into your data warehouse platform.

For example, if you were using SQL server to house your data warehouse, and you wanted to move to a cloud-based platform such as Amazon Redshift, it would take you months or perhaps years to rewrite all the code from scratch.

With data warehouse automation, all the transform logic is captured as metadata, and the data warehouse automation platform uses this metadata to write the specific code required for your target data warehouse platform. If you want to switch to a different platform, it will rewrite all the ELT code for you, optimised specifically for the new platform.
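To make the metadata-driven idea concrete, here is a deliberately tiny sketch. The metadata format, type mappings and function below are invented for illustration; real data warehouse automation platforms generate the full ELT codebase and documentation, not just a table definition:

```python
# The table definition lives as metadata; platform-specific DDL is
# generated from it. Type mappings here are illustrative only.
TABLE_META = {
    "name": "dim_customer",
    "columns": [("customer_key", "int"), ("customer_name", "string")],
}

TYPE_MAP = {
    "sqlserver": {"int": "INT", "string": "NVARCHAR(255)"},
    "redshift": {"int": "INTEGER", "string": "VARCHAR(255)"},
}

def generate_ddl(meta: dict, platform: str) -> str:
    """Render CREATE TABLE DDL for the chosen target platform."""
    types = TYPE_MAP[platform]
    cols = ", ".join(f"{name} {types[t]}" for name, t in meta["columns"])
    return f"CREATE TABLE {meta['name']} ({cols})"

print(generate_ddl(TABLE_META, "sqlserver"))
# CREATE TABLE dim_customer (customer_key INT, customer_name NVARCHAR(255))
print(generate_ddl(TABLE_META, "redshift"))
# CREATE TABLE dim_customer (customer_key INTEGER, customer_name VARCHAR(255))
```

Because only the metadata describes the warehouse, retargeting to a new platform is a re-generation step rather than a rewrite.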

Metadata export
Data warehouse automation tools allow you to export metadata into your data visualisation tool, such as Yellowfin, Power BI, Qlik or Tableau, allowing you to analyse your data immediately and save time in report development.

And the output?

  • Functionality can be delivered rapidly and incrementally, fully supporting evolutionary development of analytics functionality
  • Development costs are a fraction of those of traditional methods, dramatically reducing the financial risk around data warehouse initiatives
  • Greater stakeholder engagement, as the output is tightly aligned with their requirements
  • Far more business value and a significantly higher return on investment

Stay tuned for the next video in our data warehouse automation series where we’ll use the Ajilius data warehouse automation platform to build a data warehouse from scratch!

Do decision makers in your organisation have to wait weeks and months for changes and additions to be made to the data warehouse? Has your organisation avoided building a data warehouse because you have heard it takes too long or costs too much? Minerra’s experienced consultants can help you assess your organisation’s needs and show you how data warehouse automation and evolutionary development methods can deliver a cost-effective data warehouse in much less time than you thought possible. Contact us for a casual chat to see how we can help.

What Is A Data Warehouse And How Does It Deliver Value?

We’re kicking off our video series on data warehousing and data warehouse automation with our first video: Introduction to Data Warehousing. This video is ideal for people with a business background who would like to learn more about data warehousing, specifically:

We hope you find the video below informative.

Notes from the video are provided below. The slides can also be found on SlideShare.

What is data warehousing?

Diagram showing the flow of data through a data warehouse

In order to understand and make good use of data warehousing, you need to understand the problem you are solving.

It may be a decision that needs to be made, or a question that needs answering. For example:

  • A marketing manager might want to understand where to invest their online advertising dollars
  • A call centre manager might want to know the optimum number of staff to hire for their call centre, or
  • A sales manager might want to identify the customers that deliver the most profit – so they can find more of these customers.

We want to make informed, objective, data-driven decisions, and we need data to answer these questions.

At an organisation, you may have data being collected in a variety of places. This may be:

  • From internal systems, such as a finance system or ERP, customer systems such as your CRM, HR / payroll, or operations systems such as a manufacturing system
  • From web apps, such as Salesforce and Xero, Google Analytics, or social media such as Twitter and Facebook
  • From spreadsheets and flat files
  • From other cloud sources
  • Or from any other data sources within the organisation

However this data is likely to be:

  • Not structured for reporting
  • Hard to access
  • Captured in a silo and not integrated with all the other data to give a complete picture

This is where data warehousing comes in. Data warehousing allows you to:

  • Extract data from your organisational systems
  • Load it into a centralised location
  • Transform and integrate the data into a format optimised for analytics

The data warehouse can be used as a source for your data visualisation tool to provide reports & dashboards, for advanced analytics, and for a variety of other purposes.

So how does this create value for an organisation?

  • A data warehouse creates a single source of data that is consistent in format, structured in a uniform way, contains complete and accurate data that can be relied upon, and is up to date
  • It is structured and designed specifically to allow data to be accessed quickly
  • It provides a single integrated view of an organisation by combining data from multiple sources
  • It provides a complete data set, allowing you to analyse data from the past to predict the future

All of this delivers value to the business by:

  • Giving managers access to the information they need more quickly and easily, with significantly less ongoing effort to prepare the data
  • Providing information about the business environment more quickly which means that managers can respond to changes rapidly
  • Providing information more frequently

And leads to more informed managers, making data driven, objective decisions – for which you’ll see the results in your organisation’s bottom line.

How is a data warehouse structured?

At its most basic, a data warehouse is a collection of tables containing data structured in a way that is optimised for reporting and analytics.

There are different ways that the data can be modelled. The best way to model data to support decision making is a dimensional model, sometimes known as a ‘star schema’ or a Kimball model.

At a high level, the tables in a dimensional model can be categorised into two different types:

  • Fact tables (also called event tables) contain individual business events (e.g. sales transactions), or aggregated, summarised business events (e.g. sales by month). They consist of measures and calculations such as sales amount and discount rate.
  • Dimensions contain descriptive attributes – fields that describe the measures in the fact table. They allow you to slice and dice your data. For example, a given sales transaction relates to a customer, a product, a salesperson, and a date.

So you could look at, for example:

  • The transactions by salesperson for a given product
  • The top customers by revenue
  • The overall trend on revenue over time
  • If there is a relationship between discounts given and sales
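As an illustration, here is a minimal star schema in pandas – one fact table and one dimension – along with the kind of “slice and dice” query described above. All names and figures are hypothetical, and a real schema would have several dimensions (customer, product, date) around each fact table:

```python
import pandas as pd

# Dimension table: descriptive attributes for each salesperson
dim_salesperson = pd.DataFrame({
    "salesperson_id": [1, 2],
    "salesperson_name": ["Alice", "Bob"],
    "region": ["North", "South"],
})

# Fact table: one row per sales transaction, with measures
# and a foreign key pointing into the dimension
fact_sales = pd.DataFrame({
    "salesperson_id": [1, 1, 2],
    "product_id": [10, 11, 10],
    "sales_amount": [500.0, 300.0, 450.0],
})

# "Slice and dice": join facts to the dimension, then aggregate revenue per salesperson
revenue = (
    fact_sales.merge(dim_salesperson, on="salesperson_id")
              .groupby("salesperson_name")["sales_amount"]
              .sum()
)
print(revenue)  # Alice: 800.0, Bob: 450.0
```

Questions like “transactions by salesperson for a given product” are answered the same way: join the fact table to the relevant dimensions, filter on their attributes, and aggregate the measures.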

Data warehouses usually contain multiple star schemas for analysing organisational data.

How are data warehouses used?

A data warehouse contains data that is used to support business decisions. It can be used in a variety of ways:

  • With a data visualisation tool such as Yellowfin, Power BI, Tableau or Qlik, to provide reports and dashboards to managers to allow them to monitor and manage organisational performance, share information around the business, drive processes or send out tasks.
  • With advanced analytics techniques such as network and cluster analysis, forecasting, data mining, sentiment analysis and simulation modelling. There could be a two-way feed between the data warehouse and advanced analytics.
  • As a source of data to feed into other organisational systems, for example, into CRM for future marketing after customer segmentation analysis.

Stay tuned for the next video in our data warehousing automation series: Introduction to Data Warehouse Automation!

Do you have people in your organisation who spend most of their time using Excel to manually “glue” together data from different systems? Do you only get monthly reports because weekly reports take too long? Do you have to wait too long for ad hoc data questions to be answered? If you answered yes to one or more of these questions then your organisation may gain a lot of value from a data warehouse. Minerra’s consultants have extensive experience working with many types of organisations to help them quickly build a cost-effective data warehouse that significantly reduces the manual effort required to prepare data for regular reports and ad hoc analysis. Contact us for a casual chat to see how we can help.

Minerra at Melbourne Business Analytics Conference

On a chilly mid-July Thursday, Minerra participated in the Melbourne Business Analytics Conference 2017. Minerra had a booth at the one-day conference where we met analytics and data experts and business leaders from across the country.

Steve Remington and Edgar Kautzner at Minerra's booth at the 2017 Melbourne Business Analytics Conference.

I also gave a presentation titled “Delivering Decision Support Through User-Centric Design” where I shared a successful approach we took with a large American organisation. The slides of my presentation are available on Slideshare and a recording of the presentation is below.

Keynote Presentation by Tom Davenport

To kick off the conference, keynote speaker Tom Davenport appeared via video conference from the US. He presented his view of analytics ‘eras’, in particular a new era he calls augmented analytics, or “analytics 4.0”. It was an interesting presentation; however, the ideas may align more with large multinational organisations that have a very mature analytics capability than with the priorities and needs of many organisations, particularly small to medium enterprises that are still building their analytics capability from a comparatively low base.

Good Decisions versus Good Outcomes

Professor Zeger Degraeve, Dean of Melbourne Business School, conducted an engaging session about the difference between good decisions and good outcomes. He talked about the role of analytics in informing good decisions, and noted that outcomes are inherently uncertain: even a well-made decision can, by chance, lead to a negative outcome. Good managers should therefore measure decisions and decision support not by their outcomes, but by how well they support good decision making.

Analytics in Sport

It was a pleasure to hear Michael Cheika, head coach of the Wallabies, introduce a down-to-earth view of analytics use. He explained how the Wallabies use analytics in combination with their own expertise in areas such as injury prevention, performance improvement, and competition analysis. Although he mentioned big data and analytics, it seemed more like he was using spreadsheets (integrating data from different sources) and then trying to push those results to players directly.

Creating Business Value with Analytics

I also attended a fantastic panel session later in the afternoon about value creation from analytics. Senior analytics users and managers participated in the session, including Jane Eastgate (Head of Flybuys Analytics, Coles), Dr Amy Shi-Nash (Head of Data Science, Commonwealth Bank of Australia), Sheetal Patole (Chief Data and Analytics Officer, Macquarie Bank), Scott Jendra (CIO, Australian Football League), and Enrico Rizzon (Partner, Procurement Analytics, AT Kearney).

The discussion focused on two important aspects of analytics in organisations: value creation, and gender diversity in analytics teams. Interesting points were made about analytics teams’ closeness to the business areas of their organisations (participation in meetings, roadshows, etc.). The conversation also covered the low rate of female participation in analytics in the market. The panel recognised that improvements had been made, but that much more work on this issue is needed in the next decade.

Overall, it was a productive and insightful day, and we’re looking forward to attending the event again next year.

If you have any questions about my presentation, please don’t hesitate to contact me at [email protected]. Wherever your organisation is in your analytics journey, we are happy to have a no-obligations chat to see what we can do to help.

New whitepaper by Minerra – Marketing Analytics: The only way forward

In every business, in every department, managers are accountable for the money they spend and how effective that spending has been. Marketing is no different. Unfortunately, for years marketers executed plans based on gut feel or unreliable numbers.

Digital Marketing Analytics For A Changing Era

In this digital age, marketers can no longer flounder around in the dark to figure out whether their tactics are effective. With almost all aspects of business and personal consumption conducted online, tracking return on investment is no longer difficult. Aided by the right tools and the ability to make informed, data-driven decisions from digital marketing analytics, marketers can answer key questions such as:

  • What are your most successful marketing channels?
  • Where are you getting the most bang for your buck?
  • How are your customers engaging with your company?

So where do you start? What do you need to measure? How do you measure what you require?

Minerra’s latest whitepaper answers these questions and more. We take a look at the evolving role of the marketer and what’s important now and in the future. We explore the role of analytics in marketing, and take you through some of the ways you can start leveraging the data you are no doubt already accumulating in the background.

Download the whitepaper

Wherever your marketing department is in your analytics journey, Minerra is happy to help. Download our whitepaper to learn more about marketing, analytics, and how you can optimise your marketing approach. If you have any questions, or would like to chat about how we can help, contact us.

Business Analytics Tools are not Pokemon – you don’t have to “catch ’em all”!

Pokemon logo over Pokemon card collection

In early June, I attended the Gartner Data and Analytics Summit in Mumbai, India. I attended the summit as a representative of one of Minerra’s key technology partners, Yellowfin. With their Indian partner Aptus Data Labs, we engaged over 100 attendees with some great conversations about business intelligence, business analytics tools and the challenges of both.

In our conversations, one theme quickly emerged. Many people visiting the Yellowfin booth had a similar conundrum: they had many different analytics tools (usually three or more) in their organisation and wanted to consolidate down to just one tool.

Over the two days, so many people came to us with this issue that we began calling it the “Pokemon Problem” – it appeared that many organisations treat analytics tools like Pokemon and feel that they have to “catch ‘em all”!

Why so many analytics tools?

The reasons why organisations “suddenly” had so many different analytics tools varied, but over the course of the summit some common explanations emerged:

  • Different tools were acquired as a result of mergers and acquisitions but nothing had been done to standardise one analytics tool across the merged organisation.
  • Organisations had a loose or non-existent policy about the purchase of analytics tools within different departments, so each department purchased the tool they wanted despite different tools already existing within the organisation.
    • This is also a consequence of poorly governed self-service business intelligence.
  • Organisations had an existing analytics tool, but it did not have the required functionality. As a consequence, the organisation decided to buy another analytics tool and use both in parallel rather than creating a plan to migrate from the old system to the new.
  • One organisation even said they purchased a new analytics tool because a newly employed data scientist did not like the existing analytics tools.

Why is this a problem?

The problems caused by collecting many different analytics tools were:

  • Confusion and inefficiencies among consumers of analytics content. There was a huge burden on staff to remember how to use many different tools to access the information they need to monitor performance and make decisions. This also led many users to use the analytics tools just to download raw data so they could later analyse the data using Excel – the one tool they know well. This also increased the risks associated with ungoverned analytics content being used in the organisation.
  • Low productivity for analytics developers: either additional developers are needed to ensure the organisation has expert skills in every tool, or the existing developers have to learn them all, leaving them moderately skilled in every tool rather than expert in one.
  • Increased licence and maintenance costs, because the organisation holds small numbers of licences with many software vendors rather than a large licence holding with one vendor, which could attract volume discounts and lower overall licence costs.
  • Increased operational costs because IT departments have to provide and maintain multiple sets of infrastructure for each analytics tool, particularly if the tool requires a server to distribute the content.

In my next post on this topic, I will provide the responses we gave to attendees – how you can solve the issue, and where you can start.

Steve Remington – Principal Consultant and Founder, Minerra

Is your organisation using multiple analytics tools? Are your employees making the most of these tools? Are some of these tools perhaps redundant, or have too many overlapping features? Minerra can help assess your needs, weigh them against what you already have, and provide you with a plan to streamline your analytics tools. Contact us for a casual chat to see how we can help.

Image Credit: Jarek Tuszyński via Wikimedia Commons

Analytics and Data Driven Marketing: What are the analysts saying?

Marketers were among the first departments to start utilising data and analytics to inform strategy and tactics. They had to go where the audience was, and when the audience moved online, so did marketers. The internet, social media and mobile technology have all influenced the way data-driven marketers get their messages out to their target audiences, and the way they measure success. The digital marketing mix now has to include a web presence, social media accounts, and content generation and distribution as basic tactics, and all of these have to be analysed and measured.

Marketers constantly have to answer questions like these: What is your ROI? What are your most successful marketing channels? What is your cost per lead? How are your best leads scored? Is the sales team focusing on these top leads? Are the leads being generated aligned to the type of leads the business needs for continued growth and success? These are all questions that can only be answered with data and analysis.

The Future Of Data Driven Marketing

According to Gartner, more than two-thirds of marketers plan to base most of their decisions on analytics within two years. It is no longer acceptable for a sophisticated marketing strategy to be formed on ‘gut feel’. Plans and tactics should not only align with overall business strategy, they must be informed and optimised by data-driven marketing.

Gartner’s analysts have also identified four key traits marketers should have for success in data-driven marketing: empathy, agility, accountability, and a focus on data hygiene and integrity. High-quality data will help marketers build dynamic programs that meet business goals.

Forrester’s analysts have identified the need to balance analytics with engagement. Data must be evaluated and then engaged with in the most effective way to create an impact on revenue and the bottom line. While many marketers are utilising analytics, they need to be able to tweak, adapt and change tactics to really ensure that they’re using the data to improve the marketing function.

Forrester also emphasises the need to create deeper relationships with customers through technology – “business intelligence (BI) solutions; cloud infrastructure to reduce cost and be more agile; marketing tech that offers a real-time, single view of the customer; customer experience processes; and specifically useful to the next wave of relationships, artificial intelligence that will drive a conversational relationship with your customer”. Traditional advertising fails to create deep, conversational relationships – Forrester predicts the end of these marketing channels.

Analytics and data visualisation platforms can certainly help marketers understand and analyse data in order to make data-driven customer-centric decisions. Real-time data allows for the marketer to be agile and test and tweak plans to ensure success. Engagement data allows the marketer to build deeper and more meaningful relationships with customers – whether engagement occurs via social media, sales teams, or through product usage.

There is a depth of information just waiting to be tapped – you just need to get started.

Wherever your marketing department is in your analytics journey we’d be happy to have a no-obligation consultation with you to see how we can help you meet your business goals. Contact us at our details here.

Analytics: Are you ready to take the leap?

Analytics, big data, business intelligence.

These buzzwords entered organisational conversations a few years ago, but many people still don’t know what they mean.

As information technology has pervaded every department and every business process, companies have generated a tremendous amount of data. From small businesses to international conglomerates, every organisation generates data that, until fairly recently, has simply been accumulated and stored. Until you can access, interpret and understand this data, your organisation is missing out on the opportunity to use it to make informed business decisions.

Analytics is a vital tool to support the decision-making process. Here are a few succinct examples of how businesses of any size can benefit from analytics:

  • It’s much easier to make informed decisions.
  • It’s a structured way of growing revenue.
  • It increases your competitive advantage over other players in the industry, including larger businesses.
  • It improves the productivity of your business operations.
  • It enhances the quality of your customer service.

So why wouldn’t you leverage your data to gain these benefits? There are many perceived barriers companies face when thinking about approaching analytics:

  • I don’t think my company wants to spend money on this yet.
  • We don’t have the budget for this.
  • I don’t know where my data is.
  • I don’t know how to get to my data.
  • I don’t even know what is being stored.
  • I wouldn’t know the first thing about sorting out the data.
  • I cannot interpret this data.
  • I cannot communicate this effectively to my manager/head of department/CEO.

With the right tools and the right approach, companies can easily overcome these barriers so they can leverage their data for competitive advantage.

We’ve written a short guide to help you think about whether your organisation is ready to take the leap. Click on the button below to fill in a quick form and get the guide delivered to your inbox.

Download our guide

If you think your organisation is ready to have a conversation about analytics, give us a call. We’d be happy to have a no-obligation consultation with you to see where your organisation stands, and how we can help.