CAI SIX PART WEBCAST SERIES | STEP 1
Companies today find themselves awash in data, with much more to come. IDC estimates the Global Datasphere will grow from 50.5 ZB in 2020 to 175 ZB by 2025. Yet business managers often struggle to turn that data into profit.
It’s not for lack of new technology. The data and analytics market is expanding at a 13.2% compound annual growth rate and will reach $274.3 billion by 2022, according to IDC. Instead, the underperformance stems from much deeper problems within each company. Forrester Research recently found that 73% of the data collected goes unused.
CAI decided to explore these issues through a six-part webinar series titled “Activating Data & Analytics for Real Business Value Creation.” The free series features Steven Stone, CEO and Founder of NSU Technologies, and Tom Villani, Senior Vice President, Digital Innovation, at CAI. Stone, a former CIO at Lowe’s and L Brands, uses real-world examples to help companies extract true value from their data.
This summary highlights key points from the first webinar. We invite you to register for the full series at www.cai.io and to contact Mark.Longo@cai.io for a complimentary Data & Analytics Readiness Assessment.
Learning from the Real World
Successful data analytics efforts flow from a three-part CPP Cycle – Content, Performance, and Presentation.
Content, or data, is foundational, building on elements like governance, data management, integration, preparation, and quality.
Performance means that data must be available to decision-makers in time to support the right decision.
Presentation is the last mile of data delivery and analytics, including data discovery, visualization, mobile delivery, and other factors.
The CPP cycle will serve as a foundation for all six parts of this webinar series. Here are two real-world examples of how the correct balance of content, performance, and presentation can drive real value:
Lowe’s once had 28 registers in an average store, but data revealed that stores used no more than 19. Lowe’s reduced the number to 21 and saved $13 million in hardware and maintenance costs in the first year. Over five years, the analysis added $40 million in profit.
L Brands data showed conversion rates at its Bath & Body Works (BBW) stores fell during the holidays as more customers shopped. The data also found the problem: long checkout lines. L Brands adjusted its register and staffing policies to drive a 300–400 basis point increase in conversions. That added $25 million in Black Friday weekend revenue the following year.
Delivering Business Critical Information
Business-critical information falls into three main categories: 1) information that supports a strategic direction or decision; 2) information that monitors operational performance; and 3) sensitive information related to intellectual property, legal, or regulatory matters. All of it must be trusted, relevant, and useful.
However, with so many people now accessing information, it often fragments into silos that limit collaboration, slow cross-functional initiatives, and raise data processing costs.
For example, when L Brands first shipped goods to China, it had frequent problems clearing the inspection and quarantine process due to inaccurate product tags. The company had the correct data in its DRP solution, but business managers used data from their own Excel spreadsheets, resulting in a disconnect between the labels and the shipping information.
To avoid these kinds of data silos, technology managers first must ask why employees felt the need to go around a trusted system. It is important to create systems that serve the real needs of business managers, even if that means sanctioning a set of data marts to provide end users with more flexibility and better performance.
The first step is to ensure everyone shares common goals through enterprise alignment – a Rosetta Stone that keeps everyone speaking the same language. Once executives agree on standards, other business leaders will follow. After all, they do need trusted data to do their jobs. Experimenting with new data and analytical tools will help sustain data curiosity across the enterprise. The key is to keep it simple at the start, then add complexity through iteration.
Self-service Data & Analytics
The trend toward self-service comes with both benefits and challenges. When done right, it can encourage collaboration, speed the company’s response to changing business conditions, and free engineers to work on more critical matters than, say, adding a new column to an existing form. According to a 2020 study by Dresner Advisory Services, 62% of business intelligence users consider self-service to be “critical” or “very important.”
A reliable self-service environment will also facilitate the work of auditors and support reliable decisions. Conversely, an unreliable system will undermine all those goals. Governance can’t be an afterthought. Self-service with bad data simply enables people to make bad decisions faster.
It’s also important to solve the company’s core data needs before adding self-service. Unless you already answer 80% of business questions through existing solutions, you’re not ready to tackle self-service.
As we mentioned earlier, a strong semantic layer is essential to ensure consistent metrics. However, if your goal is to make data available across a variety of analysis and business tools, a good way to do that is to use APIs that mimic a semantic layer, giving those tools quick access to reliable information.
In most companies today, there are power users, or citizen developers, who use tools like Access or Excel to build intricate models that reflect a good deal of creativity but lack reliable data. Rather than replace those models, you can improve on them with a user-focused API library, or build APIs that eliminate the need for Access or Excel databases altogether, which also lets you support the applications popular with citizen developers.
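To make the idea concrete, here is a minimal sketch of what such a metrics API could look like. Everything in it is illustrative – the function names, the sample data, and the single-metric registry are assumptions for demonstration, not anything described in the webinar. The point is that every tool, from a dashboard to a spreadsheet add-in, calls one sanctioned definition of a metric instead of re-deriving it from its own extract.

```python
# Hypothetical sketch: a thin "metrics API" that mimics a semantic layer.
# All names and data here are illustrative, not from the webinar.

# A single governed dataset, stand-in for the trusted system of record.
GOVERNED_SALES = [
    {"store": "Store-101", "visits": 1200, "transactions": 240},
    {"store": "Store-102", "visits": 900, "transactions": 144},
]

def conversion_rate(rows):
    """The one sanctioned definition of 'conversion rate' shared by every tool."""
    visits = sum(r["visits"] for r in rows)
    txns = sum(r["transactions"] for r in rows)
    return round(txns / visits, 4)

# Registry of approved metrics; adding a metric here publishes it everywhere.
METRICS = {"conversion_rate": conversion_rate}

def get_metric(name, rows=GOVERNED_SALES):
    """Single entry point for dashboards, notebooks, or spreadsheet add-ins.

    Callers ask for a metric by name rather than recomputing it locally,
    so every consumer sees the same number.
    """
    return METRICS[name](rows)
```

In practice this entry point would sit behind a real HTTP or database interface, but the design choice is the same: the metric definition lives in one governed place, and the citizen developers’ tools become thin consumers of it.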
The final delivery may come through dashboards, which have advanced quickly beyond grid-based reports that offered little more than a snapshot in time. Today, we’re seeing the first true iterations of real-time dashboards. In a car, you’d never want anything but a real-time dashboard. Now we have real-time dashboards that end users can build into self-service environments.
Creating a Flexible Data Architecture
Looking forward, your data strategy and architecture should reflect the desire of your organization to be data-driven. So far, we’ve looked at the ability to provide operational reports or descriptive tools. Now consider the rising role of data in both our daily lives and the lives of our companies. IDC estimates we had about 85 daily interactions with data in 2010. By 2025, that number will be 4,800. So you’ll need a strategy that embraces that growth while touching on all parts of the organization – people, process, and technology.
We have to look at how our processes will evolve, the cost and availability of data analytics talent, and how real-time decision-making, predictive analytics, and AI will change our processes. What are the downstream impacts of all that, and what technology will be needed to power that? As data volume grows, and data flows in from new structured and unstructured sources, how can we maximize the value from that? All of this will be a big transformation for companies. Collecting 10 PB of information without using it to drive decisions is just a waste of storage.
Architecture governance is where you start on this journey, but flexibility is extremely important. Four in 10 companies already see major unstructured data workloads and need architectures that accommodate them alongside other data types. One-third of organizational data is already stored in the cloud, and security has never been more important. Companies need to assess where they are along an analytics maturity curve.
That starts with basic reporting, typically involving a lot of manual effort.
By the second stage, you’ve developed some quasi-enterprise-level capabilities like data marts and warehouses. That allows managers to see what already happened. However, the third level includes predictive capabilities to show them what will happen. Only about one-quarter of companies use predictive analytics in standard processes today.
The fourth level turns to prescriptive processes that harness AI to recommend the next best actions. Google and Amazon already use prescriptive technology to power their recommendation engines. Finally, you have the fifth level where analytics are no longer separate but embedded directly into the process to optimize decisions and optimize returns.
Future webinars in this series will help companies determine where they are in this journey and chart their course for moving forward to complete their digital transformation. But the first step comes down to adopting a culture where executives and employees buy into a vision of what a truly data-driven organization can be.