Part 1: Making Data Analytics Work
Part 2: Work Strategies For Success
Faced with a proliferation of predictive analytics tools and platforms, and the hundreds of models and methods they use, analytics-driven organisations need clear roadmaps to navigate the options. The good news is that there were plenty of them at Predict 2017.
Creme Global’s Cronan McNamara pulled no punches when he described a fundamental principle for any project. “If you’re not dealing with the data scientifically, you’re not doing data science – you are doing something else. It’s about applying scientific methods to data and being rigorous,” he said.
UCD’s Brian MacNamee kicked off his session quoting statistician George Box – “all models are wrong, but some are useful” – before offering a crash course in how to avoid data modelling pitfalls. The objective, he explained, was to use models to extract insights from data that can help make better decisions. Easier said than done. “You never know ahead of time what the right modelling algorithm is going to be,” he said.
“You have to perform experiments to figure out which one is going to be best.” He demonstrated how an algorithm that performs well on one problem can perform badly on another. The challenge is finding the one that sits in the middle, a “goldilocks model” that does the best possible job.
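MacNamee’s “perform experiments” advice can be sketched in a few lines. Here two toy regressors – a least-squares line and a one-nearest-neighbour lookup, neither taken from the talk – compete on a held-out split of synthetic data, and the one with the lower test error wins. The data and model names are illustrative assumptions.

```python
import random

random.seed(42)

# Synthetic data: y depends non-linearly on x, plus noise.
data = [(x, x * x + random.gauss(0, 5)) for x in range(-20, 20)]
random.shuffle(data)
train, test = data[:30], data[30:]

def linear_fit(train):
    # Least-squares line through the training points.
    n = len(train)
    mx = sum(x for x, _ in train) / n
    my = sum(y for _, y in train) / n
    sxy = sum((x - mx) * (y - my) for x, y in train)
    sxx = sum((x - mx) ** 2 for x, _ in train)
    b = sxy / sxx
    a = my - b * mx
    return lambda x: a + b * x

def nearest_neighbour(train):
    # Predict the y of the closest training x.
    return lambda x: min(train, key=lambda p: abs(p[0] - x))[1]

def mse(model, test):
    return sum((model(x) - y) ** 2 for x, y in test) / len(test)

candidates = {"linear": linear_fit(train), "1-NN": nearest_neighbour(train)}
scores = {name: mse(m, test) for name, m in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # on this curved data the flexible model should win
```

On a different data set the ranking could easily flip, which is exactly the point: the experiment, not intuition, picks the model.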
Better data, however, will almost always beat bigger and more complicated models. MacNamee illustrated this with a blurry picture of an astronaut on the moon. Image processing techniques using pixel representation delivered some improvements but it was still hard to see any detail. Using frequency domain representation, however, filtered out the noise and provided a much clearer image.
“Changing your data is often the best thing you can do in order to solve a particular problem,” he said. “Time spent up there with data is probably time well spent rather than choosing complicated algorithms and tuning them to death.”
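MacNamee’s example was a 2-D image, but the same “change the representation, not the model” idea can be shown in one dimension with a naive discrete Fourier transform: move the signal into the frequency domain, zero the high-frequency bins where the noise lives, and transform back. The signal and cut-off below are illustrative assumptions, not details from the talk.

```python
import cmath
import math

N = 64
# A slow sine wave (the "picture") plus a fast sine wave (the "noise").
clean = [math.sin(2 * math.pi * n / N) for n in range(N)]
noisy = [c + 0.4 * math.sin(2 * math.pi * 17 * n / N)
         for n, c in enumerate(clean)]

def dft(x):
    # Naive O(N^2) discrete Fourier transform.
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    # Inverse transform; real part only, since the input was real.
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N)
                for k in range(N)).real / N for n in range(N)]

# Low-pass filter: keep only the lowest few frequency bins.
CUTOFF = 4
spectrum = dft(noisy)
filtered_spectrum = [v if (k < CUTOFF or k > N - CUTOFF) else 0
                     for k, v in enumerate(spectrum)]
filtered = idft(filtered_spectrum)

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / N

print(mse(noisy, clean), mse(filtered, clean))  # error drops sharply
```

No model was tuned here at all; the improvement comes entirely from representing the same data differently.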
Karl Heery from the AON Centre for Innovation and Analytics offered a glimpse inside a multinational that invested heavily in predictive analytics to enable better business decisions. Exploring the best way to carry out analytics at scale, it opted for a cloud-first strategy in 2015, building its platform on Amazon.
Heery explained how the cloud has helped maximise the utilisation of the development environment and set basic principles: “Don’t run what we don’t need; treat servers like cattle, not pets,” he said. “We can schedule to stop and start servers outside of core hours and save 60 per cent on the costs of non-production environments.”
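Heery’s 60 per cent figure is roughly what simple arithmetic gives if non-production servers run only during core hours instead of around the clock. The core-hours window below is an assumption for illustration, not AON’s actual schedule.

```python
# Baseline: a non-production server left running 24/7.
ALWAYS_ON = 24 * 7                  # hours in a week
# Assumed core-hours schedule: 07:00-19:00, weekdays only.
core_hours = 12 * 5
saving = 1 - core_hours / ALWAYS_ON
print(f"{saving:.0%}")  # → 64%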
He spoke of the importance of creating a simple reference architecture where the data is initially ingested into a data lake from various transactional systems, APIs and public sources. From there it goes to an enterprise data warehouse via ETL (Extract, Transform, Load) systems. This is when the data science starts and the analytics team can begin to offer insights to the business through data marts.
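The flow Heery describes – lake, ETL, warehouse, data mart – can be caricatured in a few lines of Python. All of the sources, fields and the per-team slice below are made up for illustration; a real pipeline would of course use proper storage and tooling.

```python
# "Data lake": raw records from several sources, warts and all.
raw_lake = [
    {"source": "crm", "customer": "acme", "amount": "120.50"},
    {"source": "api", "customer": "acme", "amount": "80.00"},
    {"source": "crm", "customer": "zeta", "amount": None},  # bad record
]

# Extract + Transform: drop unusable rows and cast types,
# producing the "enterprise data warehouse" table.
warehouse = [
    {"customer": r["customer"], "amount": float(r["amount"])}
    for r in raw_lake
    if r["amount"] is not None
]

# Load a "data mart": revenue per customer, shaped for one team.
sales_mart = {}
for row in warehouse:
    sales_mart[row["customer"]] = sales_mart.get(row["customer"], 0.0) + row["amount"]

print(sales_mart)  # {'acme': 200.5}
```

The point of the architecture is exactly this separation: messy raw data stays in the lake, and each consuming team sees only a clean, purpose-built slice.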
The delivery method is agile, with software teams organised around products rather than functions, and with product owners setting the priorities. Cross-functional teams work in two-week sprints and then present what they’ve done for review, when a decision is taken on whether it’s ready to ship or needs another iteration.
When Karen Church joined Intercom she set about finding a way to make analytics a more effective part of the business. Like AON, the start-up ran analytics centrally and was focused on creating a common language that the rest of the business could understand. But Church was concerned that analysts were not as tightly integrated with the product people as they needed to be.
One option was to embed them with the people they were working for. They would get a better understanding of the business problem but it risked creating silos and undermining the identity of the analytics team. Church settled on a hybrid model. “We still have a central product analytics team but we offer a number of ways in which we’re embedding – the level and extent really depends on the goal and project at hand,” she said.
The notion of becoming an analytics-driven organisation was thoroughly explored over the two days. Taking data sets and analysing them to create business value is a well-understood objective but the devil is in the detail, not least knowing where the value is going to come from. Bernard Marr, author and academic, identified three strategic lenses through which to answer this question.
The most common goal is the first of these: analysing data to inform decision making and improve performance, a process that only works when organisations are focused on clear and measurable outcomes. “Every analytics project should end in some sort of change in the decision process within an organisation,” said Aoife D’Arcy from The Analytics Store.
She shared the benefits of 20 years of experience at the coalface of analytics to identify what drives success. The secret, she said, is defining the decision space – the data, people, processes, and systems that work around the problem you are trying to solve – and then coming up with an analytics solution that fits within it.
Part of the solution must be metrics to make sure success is measurable, along with a strong focus on the time to decision making. There is no point knowing a customer is going to leave after they have left or discovering a claim is fraudulent after it’s paid out. “It’s really important to get that piece of insight, that piece of information into the hands of the people who are using it in a timely fashion so they can make that good business decision,” said D’Arcy.
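D’Arcy’s timeliness point – flag the customer before they leave, not after – is, at its simplest, a rule evaluated early enough to act on. The inactivity threshold and customer data below are purely illustrative assumptions, not anything from her talk.

```python
from datetime import date, timedelta

today = date(2017, 10, 1)
# Last recorded activity per customer (illustrative data).
last_seen = {
    "ana": today - timedelta(days=3),
    "ben": today - timedelta(days=25),
}
AT_RISK_AFTER_DAYS = 14  # assumed business rule

# Flag customers who have gone quiet *before* they actually churn,
# so the insight reaches the retention team while it is still useful.
at_risk = [c for c, seen in last_seen.items()
           if (today - seen).days > AT_RISK_AFTER_DAYS]
print(at_risk)  # ['ben']
```

A fancier churn model changes how the flag is computed, not the underlying requirement: the output has to arrive while there is still a decision left to make.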
Conor Duke, data scientist at Boxever, believes cutting down decision times is the big challenge for data analytics, and that new technologies could halve them. “No matter how many slides you generate or models you run, people have to make decisions on what you generate and that’s the next frontier,” he said. “If you can make a decision on the 15th of the month that’s usually made at the end of the month, you become way more responsive and can accelerate the amount of decisions you make and the rate at which you make them.”
He also talked about “data-driven discussions” as opposed to “data-driven decisions”, emphasising the importance of context. Rather than offer people a binary choice on which to act – you either do it or you don’t – the data should be injected into scenarios that present a big picture view that demands more discussion.
Nathean CEO Maurice Lynch made the case for moving analytics away from centralised control to a model that democratises data and puts the tools in the hands of employees. The company’s approach is all about baking analytics into different parts of the business, moving away from the idea that crunching data is something that only goes on in headquarters where the data scientists reside.
He gave an example of aggregating data in retail. “Give store managers the power to do their own analysis – they know their data; they know their business,” he said. “Are they being given the right tools if all they’ve got is some extract from Excel?”
He wasn’t the only one arguing that data science should depend on more than an elite few. Uli Bethke, from Sonra, said that data analytics was a team sport and organisations needed a broad range of skills to get the most out of their analytics strategy. “You won’t find one resource that will combine all the skills necessary to make data analytics successful,” he said, listing business analysts, data analysts, computer scientists and systems people as essential members of the team.
People and infrastructure are essential for analytics success but so too are the first steps in any project, argued Bethke, listing the three most important:
Step 1 – Define the problem and the context of the problem. This should determine if it actually makes sense to spend time and money looking at the data. The return has to be greater than the investment.
Step 2 – Have a vision of what type of analysis is going to successfully solve your problems. Are you looking to compare two entities or are you trying to understand the root cause of a problem?
Step 3 – Look for actionable outcomes. Everyone knows the ideal, but it’s not always possible: some analysis may require further investigation. Try to think in advance about what actions will be taken.
Sometimes an organisation will stumble upon a data strategy by accident. Ionology’s Niall McKeown told the story of a supplier to the hotel industry that discovered a client had an inordinately large number of orders for a particular item. It didn’t make any sense and suggested systemic theft. The floodgates opened the moment they presented this data-driven finding to the hotel. The client wanted to know what else the supplier could reveal about the business and the relationship changed forever.
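The supplier’s discovery is a classic outlier-detection problem: one item ordered far more often than the rest. A simple robust check – comparing each count against the median, scaled by the median absolute deviation – does the same job on made-up data; the items, counts and threshold here are illustrative, not from McKeown’s story.

```python
import statistics

# Weekly order counts per item for one hotel (illustrative data).
orders = {"towels": 40, "soap": 35, "kettles": 38, "minibar keys": 400}

counts = list(orders.values())
median = statistics.median(counts)
# Median absolute deviation: a robust measure of typical spread.
mad = statistics.median(abs(c - median) for c in counts)

# Flag anything sitting far outside the typical level.
suspicious = [item for item, c in orders.items()
              if mad and abs(c - median) / mad > 5]
print(suspicious)  # ['minibar keys']
```

Using the median rather than the mean matters here: one huge value drags a mean-based threshold towards itself and can hide the very anomaly being looked for.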
“As a company they are now after a different type of customer doing a different type of thing,” he said. “Using data and predictive modelling, they created competitive advantage. They can kick the doors open when they go and see new customers and give them a reason to switch from their current provider.”