5 Keys for turning Big Data into value for your customers
JULY 27, 2017
by Stephanie Collis
Head of Marketing, Zetaris
Over the last few years, Zetaris has been helping clients get closer to their customers by building deep analytics applications embedded in customer interaction technologies. Through several learning cycles we've distilled what works and what is problematic when trying to influence customer behaviour through permission-based interaction in real time. In this article, I offer these learnings and insights into how In-context Real-time Interaction may be brought to life.
We're all talking about the amazing opportunities this digitally transformed, device-driven world presents to business. Every client I meet aspires to execute digital strategies that create new experiences for their customers. From offering discounts when customers walk by their stores to pre-configuring new insurance plans when a customer books an adventure holiday, we're all hoping to win the customer's heart by being relevant, significant, valuable and timely. However, without building on a proven interaction framework, the risk is that your offer, advice or other interaction will only serve to enrage the customer.
We live in a "Big Data" world. Even though firms increasingly interact with customers digitally, they are realizing that valuable customer data lies beyond their digital applications. Important interactions still happen via voice calls to customer support and through email, and customers are generating masses of potentially useful data through IoT devices, social media and mobile apps.
Capturing and making sense of masses of broad, unstructured information such as voice, email, social media content, and so on, is driving many new technology offerings. Solutions such as Hadoop have emerged, along with Business Intelligence packages that sit over Hadoop. But these are only part of the solution.
The true value of data is not how much you have but what you do with it. The ultimate value for firms comes from marrying this unstructured data with the structured data that already exists within the firm, and doing so in a timely and focused way, driven by application needs. Only then can you personalize customer-facing applications across the customer life cycle, draw connections between customer behavior and business events such as billing cycles or new product launches, and generate targeted analytics that reveal opportunities to drive profit.
Various technological resources are available in today's data-driven world; however, to achieve full value from your data, the technology platform must bring several components together in one. These key technology requirements are:
Access all types of data where the data lives
The technology must be able to deal with all types of data sources, including Hadoop, SQL, NoSQL, Postgres, and proprietary data warehouses. Accessing data where it lives means the application can drive the data choices, minimizing unnecessary extraction, transformation and loading (ETL). If an application needs the latest customer data, for event-driven analytics or another purpose, it can retrieve that data directly from the source.
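The idea can be sketched in a few lines of Python. This is a hypothetical illustration, not Zetaris code: an in-memory SQLite database stands in for a live operational system, and the application queries it directly instead of waiting on a batch ETL pipeline.

```python
import sqlite3

# Stand-in for an operational SQL source; table and data are invented
# for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, segment TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Avery", "retail"), (2, "Blake", "premium")],
)

# The application asks for the latest customer data on demand --
# no extraction step sits between the question and the answer.
rows = conn.execute(
    "SELECT name FROM customers WHERE segment = 'premium'"
).fetchall()
print(rows)  # [('Blake',)]
```

The same shape applies whatever the source is: the application issues the query when it needs the data, and the data never has to be copied somewhere else first.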
Join across diverse data sets when the application requires
Not only must the technology enable access to data where it lives, it must also support flexible joining of data from the different data sources, when and how the application dictates. Historically, many analytics were supported by traditional data warehouses that embodied prior choices about data sets and joins. They could never anticipate all the ways applications needed the data, so a single point of truth was often lost and storage costs became untenable. The consequences of this can be avoided by joining the data accessed from where it lives, when needed.
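A minimal sketch of an application-driven join, with invented names and stand-in sources: structured billing data lives in SQL (SQLite here), while document-style interaction records (a list of dicts standing in for a NoSQL store) live elsewhere, and the application joins them only at the moment it needs the combined view.

```python
import sqlite3

# Structured source: billing data in a SQL store (illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bills (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO bills VALUES (?, ?)", [(1, 120.0), (2, 340.0)])

# Document-style source: support interactions (stand-in for NoSQL).
interactions = [
    {"customer_id": 1, "channel": "email"},
    {"customer_id": 2, "channel": "voice"},
]

# The application dictates the join, at the time it needs it --
# nothing was pre-joined in a warehouse.
bills = dict(conn.execute("SELECT customer_id, amount FROM bills"))
joined = [
    {**doc, "amount": bills[doc["customer_id"]]} for doc in interactions
]
print(joined)
```

Because the join happens on demand, no prior schema decision has to anticipate this particular combination of sources.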
Do it fast and at scale
Many available solutions provide connectors to multiple data sources and imply you can join across them, but doing so with large data sets, within the time-frame some applications demand, is not easy. Doing this in a fast, scalable fashion requires massively parallel processing (MPP), advanced query optimization, in-memory processing, and the management of multi-node data clusters.
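The partition-and-combine idea behind MPP can be shown with a toy sketch. A real MPP engine distributes partitions across processes and machines; threads keep this illustration simple, and all names here are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    # Each worker aggregates only its own slice of the data.
    return sum(partition)

def parallel_total(values, workers=4):
    # Split the data into roughly equal partitions, one per worker.
    size = max(1, len(values) // workers)
    partitions = [values[i:i + size] for i in range(0, len(values), size)]
    # Aggregate the partitions concurrently, then merge the partials.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, partitions))

print(parallel_total(list(range(1_000_000))))  # 499999500000
```

The same split/aggregate/merge pattern is what lets an MPP engine keep query times flat as data volumes grow, by adding workers instead of waiting longer.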
Be accessible to the business user
Many business users lack the skills to access the data for analytical queries. Even those with SQL knowledge often lack the skill to optimize queries over large data sets. The Hive and MapReduce skills needed to access Hadoop data are typically confined to data scientists and developers. Providing a uniform SQL interface to diverse data sources makes users less reliant on data scientists' time, increasing business productivity. Allowing end users to easily generate data products using pre-written queries lets everyone use the analytical platform more efficiently, making it a more valuable resource for the business.
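One common way to deliver such a pre-written data product is to wrap an optimized query behind a simple parameterized function, so the business user supplies only a value, never SQL. The table, query, and function below are invented for illustration.

```python
import sqlite3

# Illustrative source data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("APAC", 500.0), ("APAC", 250.0), ("EMEA", 400.0)],
)

# Pre-written, parameterized query: authored once by someone who
# knows the data, reused by everyone.
REVENUE_BY_REGION = "SELECT SUM(revenue) FROM orders WHERE region = ?"

def revenue_for(region: str) -> float:
    """Data product: total revenue for one region."""
    (total,) = conn.execute(REVENUE_BY_REGION, (region,)).fetchone()
    return total

print(revenue_for("APAC"))  # 750.0
```

The business user calls `revenue_for("APAC")` and never sees the query, the join logic, or the underlying store.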
Deliver the service flexibly
Many firms have little appetite for installing the infrastructure needed to process large amounts of diverse data quickly and flexibly. While some of their applications may have an ongoing need for the data, others may be one-off analyses supporting a specific purpose, such as a marketing campaign or strategic review. Providing the technology in the cloud on a Software-as-a-Service basis means firms can be on-boarded quickly and scale their requirements up and down over time.
Recognizing that this technological dexterity is required for optimal business value, the Zetaris Analytics Workbench has been developed with each of these requirements in mind. By combining Zetaris' own advanced tools with the best open source offerings, such as Spark and Hadoop, it provides accessible, exceptional performance, delivered via your choice of a private cloud or the Zetaris Cloud.
To find out more information on how you can evolve your data into true value for your business and your customers, please contact us.