So what are we really talking about?
Unless you’ve been living in the proverbial cave or been in a coma for the last several years, you’ve probably heard a lot about Big Data: how necessary it is, and the value you’re missing out on if you’re not doing it. The term Big Data sounds, well, pretty big, so what does it really mean?
Really it’s all about getting business value out of large and/or non-traditional datasets. A non-traditional dataset is something like website click-through logs, aircraft engine performance logs, social media streams, or cell phone GPS logs: anything where the format can be variable, the volume can be large, and the speed at which you receive it can be high. Datasets like these have been around for a while, but they have always been hard to analyse because relational databases just aren’t scalable or flexible enough. Enter the new technology.
To work with non-traditional datasets we need new, non-traditional tools: Hadoop, NoSQL databases, Kafka and Spark. These tools work because they can deal with complex data structures, and they can scale to handle truly enormous datasets in real time.
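To make the “variable format” point concrete, here’s a minimal Python sketch (the event records and field names are hypothetical) of the kind of flexible-schema click-through data these tools are built for. Each record is self-describing JSON whose fields vary from event to event, which a fixed relational schema handles poorly; at scale, the same per-record logic would run inside a framework like Spark rather than in a single process.

```python
import json
from collections import Counter

# Hypothetical click-through events: each line is JSON, but the set of
# fields differs from record to record.
raw_events = [
    '{"user": "a1", "page": "/home", "device": "mobile"}',
    '{"user": "b2", "page": "/pricing"}',
    '{"user": "a1", "page": "/home", "referrer": "search", "device": "desktop"}',
]

page_views = Counter()
for line in raw_events:
    event = json.loads(line)
    # Tolerate missing fields instead of rejecting the record.
    page_views[event.get("page", "unknown")] += 1

print(page_views.most_common(1))  # -> [('/home', 2)]
```

The point of the sketch is the schema-on-read style: nothing breaks when a record arrives with extra or missing fields, which is exactly the flexibility relational tables lack.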
So far so good, and it all sounds pretty easy; at least, that’s the perception the market prefers to project. But here’s the rub: it isn’t. The vendor space is crowded, the technology is often immature compared to what we’re used to, and unless you have a degree in statistics and a couple of serious data projects under your belt, finding and delivering value is challenging.
The key to success is to focus in on the value: the business problem at hand. Not the rosy five-year, all-singing, all-dancing, real-time data lake plan, but the six-to-twelve-month ROI plan. Uncover one nugget of value and build on that success.
We have delivered countless large and complex data projects over the years, but as we’ve upskilled on Big Data technologies and delivered projects for some of our larger customers, we’ve learnt a lot about how Big Data differs from more traditional projects. Experience has shown there are more speed bumps to watch out for, and sometimes they’re on parts of the road where you wouldn’t expect to see them. The ‘secret sauce’ we apply to ensure project success is still there, but the nuances of how it’s applied are different. You need a depth of understanding of the technology, and of its application to the business problem at hand, that you only get from delivering projects.
It’s critical to identify the right technologies, as this is an area where things can get complex, and you’ll want an expert to help simplify your choices. It’s a bit like building a car: even if you’re a mechanic, you wouldn’t build a car from scratch if you’re used to working with trucks. Maybe you’d succeed, but it would be a long and painful process, likely yielding mediocre results. Clearly recognise your capabilities and get help where and when you need it.
To be successful in any data project, and this is particularly important for Big Data projects, it’s critical to break the problem down and learn in rapid iterations. Why? Because, most of the time, you’re doing this for the first time; Big Data projects in particular are all about breaking new ground.
Agility is key, and we structure our projects to support agile, iterative learning because we’ve proven it works over hundreds of data projects. We have tailored our service offerings to ensure that each and every project we undertake is successful.
Using our proven approach, we can take you on a successful Big Data journey:
- Big Data opportunity identification and assessment
- Building a Big Data business case
- POC/Experiment Framework and Delivery
- Technology roadmap, selection, installation, configuration and support
- Project Data Engineering & Analytics
- Data Governance & Data Quality
- Real-time or batch-driven data projects
Our Big Data specialists can help you avoid the pitfalls and deliver results. With our help, you’ll be getting value from Big Data before you know it.