I recently started a blog series based on research done by interviewing a number of leaders of big data and analytics initiatives. The first part of that series on leaders’ perspectives on big data was specifically about business goals. This time we’re going to look at the more technical objectives of big data, by which I mean technology goals.
One common theme is the desire to bring analytics to more of the workforce. Sometimes this is described as more detailed dashboards; at other times businesses really want to provide interactive engagement with the data. Business agility, derived from more insights, is often cited as a goal. These insights should not be restricted to business analysts or senior executives, either: everyone should be able to make decisions more quickly and more accurately.
Another shift has to do with the time dimension of data. In the past, much of analytics was, well, looking at the past. Now many organizations want a balanced view across historical results (typically run in batch), real-time insights (often based on streaming), and predictive views of the future (sometimes requiring more advanced analytics techniques). This shift really does require new technologies: traditional approaches to analytics and data warehousing just aren’t set up to address these needs.
While it may seem obvious to say so, the Vs of big data are real. More variety, more velocity, and more volume are all genuine considerations from a technical capabilities point of view. The drive toward Hadoop, Spark, and NoSQL databases continues to be powered by these needs, and the economics of these new technologies are often the key enabler. The cost efficiency of the data platform has changed radically, and it is perhaps best measured as cost per answer.
And just as we found in the ESG IT Spending Intentions Survey, security, privacy, and governance are becoming hard requirements for all organizations. Any new big data and analytics solution must be able to demonstrate that it keeps sensitive information safe, since collecting more data into one repository increases the risk if proper controls are not in place.
One last common technical consideration is the data lifecycle. More organizations are now asking how much data they should keep, for how long, and how accessible it will be in the future. This has a cost aspect, but it tends to be driven more by value. That is, “will I be able to find what I need, when I need it in the future?”
Of course, all of this must be considered in light of existing data platforms, databases, data warehouses, and the like. Few organizations are starting from a clean slate; most really want to leverage existing tools alongside new big data technology. If they can gain more efficiency and new capabilities, they will be happy.
If none of this seems particularly radical to you, recognize that it represents a broad swath of the customers for these solutions. They are focused on digesting best practices and applying them to their own businesses, not on acquiring the most advanced, bleeding-edge software. Yet the market now seems sound on the core principles of big data, and ready for more!