Big Data's Three Routes - Which Direction Drives the Most Value?
By Mike Upchurch, Co-founder, Fuzzy Logix
But it doesn’t have to be this way. There is a route to driving value, but you have to be realistic and you have to be methodical in your approach. You also have to start by recognizing that, in reality, there are only three kinds of big data projects.
1) The first is simply focused on replacing aging traditional infrastructure; in effect, re-platforming an environment to make it fit for purpose in today's economy. Let's call this 'the makeover.'
2) The second type of big data project sees companies recognize that their traditional data warehouse environments may be good at reporting and managing structured data, but fall down when it comes to other forms of data and, most crucially, analytics (and after all, without analytics, how can you realize the optimal value of all that data?). Let's call this the 'upgrade.'
3) The third and final type of big data project is where companies decide they'll save money by blindly adopting big data technology. Let's call this the 'Kamikaze option.'
But here's the unwelcome truth. Only one of these projects will deliver new flexibility in reporting and data collection while saving money. Only one of these projects will drive a very high return on investment. And, lastly, one of these projects is a giant IT black hole that can waste millions and ruin careers.
Let's start with the 'makeover.' There are very valid reasons for replacing aging infrastructure with big data technology. Let's say you bought a bunch of data warehouse appliances four years ago and it's time to review things. In the years that have passed, your company has realized it will want to leverage more analytics and be more responsive while lowering costs. How to achieve this?
Deploy a hybrid approach: use new versions of traditional data warehouse technology for a portion of your infrastructure and 'big data' technology, such as Hadoop, for the rest.
In my experience, this approach is logical and will likely be successful. One tip for driving maximum value is to have three to four environments that allow for innovation, testing and exploration, including one sizable area allocated as a data scientist playground. Companies that upgrade their traditional architecture (but buy less of it) and add Hadoop and related technology will have their current and future needs covered, save cost and improve performance. The ROI for this is medium to potentially high, and the time-to-value is reasonable.
Now let's turn to the 'upgrade' option. In this case, the company in question is more advanced in its adoption of analytics. It likely has its traditional environment running along just fine, thank you, but it realizes it is missing out on leveraging big data and the related technology to realize value from new types of data and analytics. For this company, adding a Hadoop environment and an analytics playground will deliver maximum value. In the last few years, I've seen projects like this deliver 600 percent to 3,200 percent ROI in 12 months. The key to success here, though, is that the company must be ready not just to collect and store big data, but to act on the results operationally.
Let's look at the most extreme example. A giant marketing company with billions of rows of data created a massive analytics environment that included in-database analytics. Its internal business case stated a 12-month return on investment of 3,200 percent. The company listed numerous reasons, but the main driver was that its data scientists could build 10X as many models in a given period of time. Crucially, the marketing team was ready to act on the insight by changing existing programs and running new ones almost immediately after the results of the analytics were available. In their case, the process to turn around some models went from days to minutes. These types of companies are the ones truly benefiting from the big data revolution; they start their projects with a clear purpose and are structured to act on the results.
Finally, let's look at what I call the Kamikaze option. Thankfully, it's not common, but it does exist. The mantra is a familiar one: "We have to get some big data projects in place! We'll adopt all this free open source technology, put it in the cloud and dramatically downsize our staff! We'll save a fortune!" Except it won't happen. For these companies to have any hope of monetizing big data, they need to have a few things in place. One is a clear understanding of the optimal mix of technologies based on their business needs and operational ability to deliver. You can't just throw all the technology over the fence to the cloud and adopt 100 percent open source if you are an enterprise-level company.
Instead, you need to optimize your environment based on a mix of business needs and the ability of the business to act operationally on results. These companies will also need a clear vision of the results they expect, as well as stunningly good IT project managers. The list of 'must haves' for success is onerous, though. Therefore, the result, more often than not, is that a huge amount of money will have been spent, a lot of disruption will have been endured, and the company in question will find itself back where it started.
To conclude, I passionately believe that there is value to be had from the big data phenomenon, but it requires the right strategy and execution. Put simply, for a moderate return on investment, you've got to leverage an optimal mix of traditional and big data technology to replace your aging infrastructure.
Then, for a high return on investment, run analytics on your big data (on traditional and/or Hadoop environments) but be sure you can act operationally on the results. And never throw the baby out with the bath water by opting for the kamikaze option because that path is littered with the wrecks of failed deployments (and careers!).