The Viewfinder

Getting beyond the Twilight Zone of Data Uncertainty

Part 1

You unlock this door with the key of imagination. Beyond it is another dimension – you’re moving into a land of both shadow and substance, of things and ideas. You’ve just crossed over into the Twilight Zone of DATA UNCERTAINTY.

Imagine, if you will, stepping into your car and plugging in your destination. After calculating the average speed and distance, your app plots the best route to get you where you want to go safely and on time. But what if you don’t trust the app’s data? What do you do?

Unfortunately, this is the situation many organizations face when they try to tap their data for the insights needed to make sound business decisions.

Despite an abundance of data, they have no way of dredging through their massive data lakes to extract actionable knowledge.

In this first article of a series, we will begin the journey of moving you from a state of data uncertainty toward laying a solid data practice foundation for your organization’s successful transformation into a “digital-savvy” enterprise.

We will do this by focusing on the three D’s of Data Modernization: data quality, data access and data compliance.

Today we will talk about data quality.

Data Quality

Let’s look at the data quality issue from the standpoint of data volume versus analysis and value.

According to our research, in 2016, the “average company manages 162.9TB of data,” while the “average enterprise manages 347.56TB.”

To provide you with a point of reference, the average Small-Medium Business (SMB) deals with a seemingly paltry 47.81TB of data.

Since 2016, IDC reports, the overall amount of data has grown exponentially, reaching 74ZB in 2021.

However, the real question is not the amount of data, but what percentage of that data companies actually analyze – and how much of it has intrinsic business value.

According to IDC, “less than 5% of all data” that is collected is analyzed.

From the standpoint of value, 55% of all data is “classified as dark data,” which is “the information assets organizations collect, process and store during regular business activities, but generally fail to use for other purposes (for example, analytics, business relationships and direct monetizing).” The remaining 45% is classified as usable data.

Based on the above, it is virtually impossible to determine the “true” value of a company’s data when so little of it is analyzed, which means that most organizations are “flying blind.”

A Cultural Imperative

When he took the helm of Microsoft in 2014, CEO Satya Nadella talked about the importance of organizations creating a data culture so that everyone could “make better decisions based on (quality) data.”

Echoing Nadella’s visionary direction, a March 2021 Harvard Business Review article presents what we believe is an essential cornerstone for laying a solid data practice foundation: successfully creating a data-driven culture.

Although executives recognize the benefits of creating the right culture – improved data quality leading to better data access and utilization – 95% of them identify organizational and process challenges as the primary obstacles impeding the adoption of big data and AI initiatives. In short, the problem with accessing and using quality data to its full potential is a people and process issue – and ultimately a leadership issue.

The only way to address people and process issues and create a data culture is for CEOs to recognize the importance of data beyond a conceptual perspective and see it in a practical, bottom-line context. Everything from customer satisfaction and regulatory compliance to employee empowerment makes the case for becoming a data-driven organization.

In the next installment in the Data Twilight Zone series, we will discuss Data Access.
