Data Before Delivery: Putting the Cart Before the Horse?

Omnira Software

I have now heard dozens of stories from friends and colleagues about failed BI and analytics initiatives. They range in cost from hundreds of thousands to tens of millions of dollars. It’s a problem I see several companies at risk of repeating…and it’s the motivation for today’s blog. A common thread in most of these stories is trying to “fix up our data before we select or implement an analytics tool”. How can anyone possibly understand the data needs, the use cases and the potential data issues without providing a means to use the data, view the data and identify issues? It’s like trying to anticipate what part of a car a mechanic should fix without test driving it first.

Consider starting with the data you have and taking it for a test drive. See how the business wants to use it, and evolve your data quality and architectural initiatives incrementally. You’ll realize value along the way and better focus your efforts.

Why Do They Fail?

  1. Lack of focus: Hyped-up terms like “big data”, “data lakes” and “cloud” distract us from the pragmatic task of delivering information — getting reliable, current information into the hands of business users in a form that allows them to inform decisions. Don’t overcomplicate the task at hand.
  2. Slow time to value: While these initiatives are inspired by sound motives, they fail to get buy-in or build momentum because they aren’t delivering near-term value. Successful projects are built on a series of quick wins and can make course corrections along the way to ensure success.

Context for Success

I like the “cart before the horse” analogy because I think of the cart as the data, the horse as the visual analytics tool(s) and the driver as the business user who is using the data to make value-based decisions. Just as a horse-drawn cart has a destination, data-driven decisions should have one too: fulfilling corporate objectives.

The tools, to some degree, don’t matter as long as they provide reliable information delivery in a form that supports decision-making processes. And yes, there needs to be a process. Companies often try to apply new technology to an old way of doing things. Time to value should be the primary driver, and adapting processes to leverage the strengths of tools yields faster, better results than trying to adapt tools to fit legacy processes. The goal should be to use technology to deliver consistent, reliable information to inform decisions that are aligned with corporate objectives.

Tips to be Successful

  • Make Time-to-Value your primary focus: Build momentum with small successes. Get something useful into the hands of business users sooner…don’t over-architect the solution because no one can anticipate all of the needs until they walk down the path.
  • Select visual analytics tools with these qualities:

    1. Data Governance – there should be administrative capabilities to ensure that data is delivered with the same business logic for all users. Consistent, reliable and repeatable should be the information delivery objectives.
    2. Adaptability & Agility – your data needs are going to evolve over time, as might your data architecture. Having a tool that can be configured quickly to evolve with your needs is important. This should include the ability to change data mappings as your data sources evolve (e.g., from a direct database connection to a data mart) without compromising end-user analyses; a small sketch of this idea follows this list. The consumption of reliable data, wherever it resides, should be seamless to the end user.
    3. Usability – you want maximum adoption. Aim for optimal usability but don’t get caught in the trap of designing something unique for every user…try to standardize on some best practices and processes in your organization and don’t forget to train your people.
    4. Self-Service – you want people to have some latitude to build their own analyses with governed data in a sanctioned environment…consider it to be “governed freedom”.
    5. Domain-Specific – domain-specific tools often have industry expertise and best practices baked into them, along with specialized processing capabilities that are difficult to accommodate by pre-conditioning the data. For example, time-normalizing data could take many forms (e.g., first gas production, first oil production, peak production date, spud date, etc.); a short sketch appears after this list. Trying to accommodate all of these in a data warehouse with pre-conditioned data would be extremely challenging, time-consuming and expensive. Domain-specific tools often deliver a much faster time to value and can be more cost-effective than building and maintaining a solution yourself.
  • Define and Evolve Processes: Explore the opportunities for ideal business processes using your evolving technology, identify best practices and try to standardize the approaches that are most effective. I have a client that regularly showcases successful innovations in analyses and processes in an attempt to fuel a culture of excellence. Let the rock stars share their techniques and innovations with others so everyone can improve. You also need managers who are equipped to lead and establish the standards and processes that are aligned with corporate objectives. It’s their job to establish expectations and communicate them clearly.
  • Training: Don’t forget to build training into your budget. It takes people, process and technology working in concert to optimize efficiencies. Training, like technology, should be perceived as an information and insight investment. People are driving technology to make your company more successful…empower them with adequate and regular training. Training that integrates with your company’s processes is even more impactful.
  • Business & IT Collaboration: IT is there to support the business and should be grease in the gears, not sand. The best way to build a successful project is to have business and IT collaborate with a clear understanding of mutual goals centered on creating value in a timely fashion.
  • Executive Sponsorship: You need the support of the people who sign the cheques. They’re more likely to buy into an approach that has time-to-value at the forefront and is conveyed as an investment in insight, efficiencies and optimization. Quick wins that are clearly defined will cement executive support and make everyone look good. It allows you to wade into the waters of pragmatic spending and justify the cost of each step rather than strive for a massive budget with lofty goals that use hype-inspired terminology.
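
To make the adaptability point (item 2 above) concrete, here is a minimal sketch of a logical-to-physical mapping layer in Python. All table, view and column names are hypothetical and not taken from any particular product; the idea is only that analyses ask for logical fields, so swapping a direct database connection for a data mart changes the mapping, not the downstream analyses.

```python
# A minimal sketch of mapping logical field names to physical sources, so
# analyses survive a change in data architecture. All table, view and column
# names below are hypothetical.

# Version 1: analyses read straight from an operational database table.
source_v1 = {
    "view": "OPS.PROD_MONTHLY",
    "columns": {"well_id": "UWI", "month": "PROD_MONTH", "oil": "OIL_BBL"},
}

# Version 2: the same logical fields, now served by a governed data mart.
source_v2 = {
    "view": "MART.FACT_PRODUCTION",
    "columns": {"well_id": "WELL_ID", "month": "PERIOD", "oil": "OIL_VOLUME_BBL"},
}

def build_query(logical_fields, source):
    """Translate logical field names into a SELECT against the current source."""
    cols = ", ".join(f"{source['columns'][f]} AS {f}" for f in logical_fields)
    return f"SELECT {cols} FROM {source['view']}"

# End-user analyses always ask for the same logical fields; only the mapping
# they are handed changes when the architecture evolves.
print(build_query(["well_id", "month", "oil"], source_v1))
print(build_query(["well_id", "month", "oil"], source_v2))
```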
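
Similarly, the time-normalization example from item 5 can be illustrated with a short pandas sketch. The well names, dates and volumes below are invented for illustration; the point is that the same raw production data can be re-based on different reference events (spud date, first oil, etc.) at analysis time, without pre-conditioning the warehouse.

```python
# A minimal sketch of time-normalizing production data at analysis time with
# pandas. Well names, dates and volumes are invented for illustration only.

import pandas as pd

# Monthly oil volumes per well (hypothetical values).
prod = pd.DataFrame({
    "well":    ["A", "A", "A", "B", "B", "B"],
    "month":   pd.to_datetime(["2023-01-01", "2023-02-01", "2023-03-01",
                               "2023-03-01", "2023-04-01", "2023-05-01"]),
    "oil_bbl": [1200, 950, 800, 1500, 1100, 900],
})

# Candidate reference events per well; the analyst picks one at run time.
events = pd.DataFrame({
    "well":      ["A", "B"],
    "spud_date": pd.to_datetime(["2022-11-15", "2023-01-20"]),
    "first_oil": pd.to_datetime(["2023-01-01", "2023-03-01"]),
}).set_index("well")

def normalize(prod, events, reference):
    """Re-index each well's volumes as months elapsed since a chosen reference event."""
    out = prod.merge(events[[reference]], left_on="well", right_index=True)
    out["months_on"] = ((out["month"].dt.year - out[reference].dt.year) * 12
                        + (out["month"].dt.month - out[reference].dt.month))
    return out[["well", "months_on", "oil_bbl"]].sort_values(["well", "months_on"])

# The same raw data, normalized two different ways, with no re-conditioning
# of the underlying data.
print(normalize(prod, events, "first_oil"))
print(normalize(prod, events, "spud_date"))
```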

Avoid Distractions

I use the term “magpie syndrome” to characterize someone who is always chasing the shiny object at the expense of getting a solid information delivery foundation in place. There is no shortage of visually impressive charts, dashboards and AI/machine learning technologies available to you…but beware of diverting your focus away from the core principles of effective information delivery — consistent and reliable data that supports value-based decisions aligned with corporate goals. Trying to cram more information into a dashboard or fussing endlessly about colours and layout will take your eye off the prize. While these can be important considerations for clear interpretation and insight, they fundamentally rely on clear goals and the right data. Experts in the field say that several individual charts usually convey insight more effectively than a complex dashboard. Don’t make things more complex than they need to be. Ask yourself repeatedly, “is what I am doing helping me make a better decision?”

In summary, don’t start with the data; start with delivery. Make time-to-value your focus and get proper executive support. Be agile, be nimble and be prepared to evolve. Processes should be well-defined and supportive of corporate objectives. Failure to put some structure around data governance and business processes will lead to chaos…empower your team with “governed freedom”. Get started – your first success should only take a few weeks. If it takes longer, the success of your project may be at risk.
