Why Your Data Analytics Programs Still Aren't Working


Tech Insights for Professionals: The latest thought leadership for IT pros

Tuesday, October 23, 2018

Is your data analytics platform failing to provide the results you were expecting? Here are a few common reasons why these initiatives may disappoint.

Article | 4 minutes

Data is the lifeblood that sustains any business today. Without the insights it provides into customers, competitors and the wider market, firms cannot make well-informed strategic decisions, personalize their offerings, or react quickly to changing circumstances.

Big data analytics should therefore be a top priority for any company that wants to succeed. However, it is a major investment, and it requires specialized skills to turn raw data into usable insight.

For many organizations, this is a challenge they have yet to meet, and it could be costing them significant amounts of money: poor data analytics outcomes can leave them with no benefit or, worse, lead them down the wrong path.

Data analytics is a complex activity, and there are a wide range of reasons why such programs might fail. Here are a few of the most common issues that might sound familiar to many companies.

1. Failing to integrate your data

Many firms may have heard of the concept of 'data lakes', where companies can pool all their data assets into a single resource from which they can conduct analytics. These solutions have many advantages - they break down silos, ensure that everyone has access to all relevant data, and can be scaled to whatever a firm needs - but if they aren't implemented properly, many of the advantages will be lost.

A common mistake is to bring the data together but fail to consider how it actually integrates. Data from multiple sources may sit in the same place, but if it arrives in different formats, analytics tools can't make connections or find patterns, and you end up with an incomplete picture. Data integration tools that normalize sources to a shared schema are therefore a must; the alternative is huge amounts of manual effort.
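To make the point concrete, here is a minimal sketch of that normalization step, using entirely hypothetical customer records: one source exports CSV, another returns JSON, and neither can be joined until both are keyed on a shared identifier.

```python
import csv
import io
import json

# Hypothetical data: the same customers arrive as a CSV export from a CRM
# and as JSON from a web analytics API. In the same "lake" but not integrated.
crm_csv = "customer_id,name\n101,Alice\n102,Bob\n"
web_json = '[{"id": "101", "visits": 14}, {"id": "103", "visits": 2}]'

def integrate(csv_text, json_text):
    """Normalize both sources into dicts keyed on a shared customer ID."""
    merged = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        merged[row["customer_id"]] = {"name": row["name"]}
    for rec in json.loads(json_text):
        merged.setdefault(rec["id"], {})["visits"] = rec["visits"]
    return merged

combined = integrate(crm_csv, web_json)
# Customer 101 now has a complete picture; 102 and 103 are partial records,
# which flags exactly where the sources fail to line up.
```

In practice a commercial integration tool does this mapping at scale, but the principle is the same: without a common key and schema, co-located data is still siloed data.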

2. Inaccurate or out-of-date data

Any big data analytics platform is only as good as the raw materials you put into it. But too many companies are still feeding their solutions with incomplete, inaccurate or out-of-date data that isn't relevant to the task at hand. Often, there is a mistaken belief that these issues can be overcome by increasing volumes, as if more data will simply drown out any errors.

The results can be misleading or just plain wrong, harming businesses in both the short and long term. Fortunately, there are many data cleansing tools available that can help firms avoid these issues.
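A cleansing pass typically screens for exactly the three problems named above: duplicates, incomplete records, and stale entries. The following sketch, using invented records, shows the basic shape of such a filter.

```python
from datetime import date

# Hypothetical raw records exhibiting the three classic quality problems:
# a duplicate, a missing field, and an out-of-date entry.
records = [
    {"id": 1, "email": "a@example.com", "updated": date(2018, 9, 1)},
    {"id": 1, "email": "a@example.com", "updated": date(2018, 9, 1)},  # duplicate
    {"id": 2, "email": None, "updated": date(2018, 8, 15)},            # incomplete
    {"id": 3, "email": "c@example.com", "updated": date(2015, 1, 1)},  # stale
]

def cleanse(rows, cutoff):
    """Drop duplicates, incomplete rows, and records older than the cutoff."""
    seen, clean = set(), []
    for r in rows:
        key = (r["id"], r["email"])
        if key in seen:            # exact duplicate: skip
            continue
        if r["email"] is None:     # incomplete: drop rather than guess
            continue
        if r["updated"] < cutoff:  # too old to be relevant: drop
            continue
        seen.add(key)
        clean.append(r)
    return clean

clean = cleanse(records, cutoff=date(2018, 1, 1))
# Only the first record for id 1 survives all three checks.
```

Note that simply adding more rows to this input would not fix anything; the bad records would still pollute any aggregate built on top of them, which is why volume is no substitute for quality.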

3. Not having the right people on the team

Most businesses recognize the importance of having skilled personnel on the team who can program big data analytics tools and interpret the results, which has made data scientist one of the most in-demand roles in the tech sector. But as technologies such as artificial intelligence and machine learning make their way into the mainstream, more will be required.

Many firms are therefore turning their attention to data engineers. While there is some overlap between the role of the data engineer and the data scientist, the engineer's job focuses more on developing the underlying architectures of the platform and managing and cleaning the raw datasets before they are handed over to data scientists. It's a less high-profile job, but without it, your data scientists won't have the tools they need to function effectively.

4. Presenting data without context

The outcome of any big data analytics process should be a clear report that highlights the findings of the activity and what the implications for the business should be. But one common mistake businesses make is not fully explaining the reasoning and context behind a report's conclusions.

For instance, a good system will include information on how likely a predicted scenario is to occur. Obviously, a company will respond differently if their forecast is 90 percent likely than if it is only 50 to 60 percent likely, but without this crucial context they will be unable to make the best-informed decision.
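As a simple illustration of attaching that context, here is a hypothetical report helper that frames a forecast by its probability rather than presenting a bare number as fact. The function name, thresholds and sample figures are all illustrative assumptions, not part of any particular product.

```python
# Hypothetical helper: surface the forecast's likelihood alongside the figure,
# so decision-makers can weigh a 90 percent call differently from a 55 percent one.
def present_forecast(metric, value, probability):
    if probability >= 0.9:
        framing = "high confidence"
    elif probability >= 0.6:
        framing = "moderate confidence"
    else:
        framing = "low confidence; treat as indicative only"
    return f"{metric}: {value} ({probability:.0%} likely, {framing})"

print(present_forecast("Q4 revenue growth", "8%", 0.9))
# Q4 revenue growth: 8% (90% likely, high confidence)
print(present_forecast("Churn increase", "3%", 0.55))
# Churn increase: 3% (55% likely, low confidence; treat as indicative only)
```

The exact thresholds matter less than the habit: every headline figure in a report carries its uncertainty with it, so the reader never has to guess how much weight to give it.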

Another part of this is the way in which outcomes are presented. Columns of aggregated data that might be meaningful to an experienced data scientist may look incomprehensible to business managers who will actually be expected to act on the findings. As a result, businesses should focus on their user interface, allowing an easy overview of results, as well as the opportunity to drill deeper into the findings in order to understand the underlying assumptions.

Tech Insights for Professionals

Insights for Professionals provide free access to the latest thought leadership from global brands. We deliver subscriber value by creating and gathering specialist content for senior professionals.
