Both business and technology analysts agree that a capacity for advanced analytics is becoming mission-critical to the basic viability of every company. Some believe we are only three to five years from that point; others estimate there is more time. Either way, the writing on the wall is clear: companies need to prepare for the not-so-distant future in which advanced analytics are integrated and essential at every level of organizational decision-making. Even today, it is imperative to extract whatever competitive advantage can be drawn from available data.
However, when it comes to extracting business value from data, many companies are failing, and failing miserably. As an Oracle-sponsored IDC white paper noted, despite the fact that over the past two years:
- 60% of companies started using new analytic techniques
- 33% began using new data types
- 32% introduced new metrics or key performance indicators (KPIs)
Some 60% of organizations surveyed reported being hampered by a shortage of business intelligence (BI) and analytics application developers. The same survey found that only 10% of employees were satisfied with the big-data technology resources available to support their analysis and decision-making. So why is this happening?
Tech investment solves part of the problem
Part of the dissonance can be explained by a disconnect between IT and the consumers of BI across the enterprise.
Many people outside the IT side of the business fail to understand just how difficult it is to squeeze useful business intelligence and advanced analytics out of a rich data set. The ability to model data and to build aggregations, star schemas, enhancements, and so forth is essential for turning raw data from a pile of cluttered data points into unified, structured tables suited to analytical queries. But this work can be difficult and mind-bogglingly slow.
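To make the modeling work above concrete, here is a minimal, hypothetical sketch in pandas: a single cluttered table is split into a dimension table and a fact table (star-schema style), then a pre-built aggregation answers an analytical query. All table and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical raw data: one cluttered table mixing measures and attributes
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer": ["acme", "acme", "globex", "globex"],
    "region":   ["EU", "EU", "US", "US"],
    "amount":   [120.0, 80.0, 200.0, 50.0],
})

# Dimension table: one row per customer, with its descriptive attributes
dim_customer = (raw[["customer", "region"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_customer["customer_key"] = dim_customer.index

# Fact table: measures keyed to the dimension, as in a star schema
fact_orders = raw.merge(dim_customer, on=["customer", "region"])[
    ["order_id", "customer_key", "amount"]
]

# Pre-built aggregation for analytical queries: revenue per region
revenue_by_region = (fact_orders
                     .merge(dim_customer, on="customer_key")
                     .groupby("region")["amount"]
                     .sum())
print(revenue_by_region.to_dict())  # {'EU': 200.0, 'US': 250.0}
```

At toy scale this looks trivial; the difficulty the paragraph describes comes from doing the same restructuring across hundreds of messy sources, which is exactly the work that consumes IT time.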
Given the challenges of data analytics in today's highly competitive environment, it can be next-to-impossible for IT to find the right balance within the "golden triangle" of performance, cost, and simplicity. When that balance is missed, internal clients grow frustrated with the poor performance of analytics tools, the difficulty of getting data insights, or the high cost of BI.
Fortunately, though, a range of tech solutions exist that can help. The problems above can be mitigated by end-to-end data management-as-a-service that does everything from easing, or even automating, ETL processes to, as Fern Halper, director of TDWI Research, notes, "embedding analytics into systems and applications at the point of decision making."
How can you improve analytics adoption in your organization?
Realize that tech investment solves only one part of the problem
Tech issues constitute only one element of the problem, so technology can be only one part of the solution: the business side is just as important. In many respects, becoming data-driven is a cultural challenge, not a technological one.
Walter Storm, Chief Data Scientist for Lockheed Martin, helpfully distinguishes between two categories of data scientists: back-office and front-office. According to Storm, there is tremendous value in the work of the back-office data scientists who have a passion for coding, deep learning, and technology. But, he notes:
"The greater challenge is the front office data science work. It is this data scientist that is the translator - speaking both the language of the business and the language of data science. The front-office data scientist must know the industry, have a firm grasp of economics and finance, and be able to validate, integrate and use advanced models within a broader decision support framework.”
With this distinction, Storm hints at the broader issue of organizational culture and silos that slows the implementation of data-driven, advanced analytics. Quite simply, data scientists who truly want to be effective need to focus on the broad drivers of enterprise success, and professionals across the organization need to recognize that they must actively engage with, and help shape, IT's work with data.
Treat data as a corporate asset
As Bill Schmarzo, CTO of Dell EMC Services, points out:
“Business leadership needs to accept responsibility to treat data and analytics as corporate assets to be maximized and exploited, instead of treating data as someone else’s (IT’s) problem.”
This may be one of an organization's biggest cultural challenges: most organizations have treated data as a cost to be minimized rather than as a source of customer, product, operational, and market insights that can be used to optimize key business processes, uncover new monetization opportunities, and create a more compelling customer experience.
Start with business use cases, not tech solutions
Becoming data-driven starts with business use cases, not with force-fed technological solutions. The transformation should begin with an organization-wide effort to identify the top-priority business use cases that will benefit from analytics.
Building on this understanding means using business use cases as a springboard for collaborative, cross-business-unit efforts. The best results come from aligning people, processes, and technology in a comprehensive, project-based approach.
In practice, this means that instead of establishing warehousing and analytics as two distinct projects, you should bring them together. By aligning predictive-analytics data requirements with infrastructure initiatives, the two can work together in a way that focuses enterprise resources on key enterprise needs. This type of collaborative approach is a step in the right direction, helping enterprise leadership adopt analytics as a business discipline rather than an activity left to tech and data science teams. Technological solutions that facilitate efficient and effective analytics are essential, and they should accompany this cultural and organizational shift.
Panoply uses machine learning and natural language processing (NLP) to learn, model, and automate the standard data management activities performed by data engineers and data scientists. This not only saves thousands of lines of code and countless hours of debugging and research, but also lets those teams focus on ensuring that data serves key enterprise needs for advanced analytics.
Ready to move your organization closer to analytics, faster? Get your free trial of Panoply today!