There is a lot of chatter as the New Year starts about what each vendor is going to bring to the table in 2012. We have “Big Data”, “The Cloud”, in-memory, solid state drives, and many other exciting technology announcements. Teradata itself is introducing new capabilities in these arenas as well as database features such as Hybrid Columnar. There will be a lot of new toys to play with this year in the IT department.
However, we need to remember that unless those new toys can drive real business value, they will quickly be put up on the shelf in favor of the next shiny, new innovation. When looking at IT, remember that the “I” (information) is the critical word to stress, not the “T” (technology).
So how can we leverage all this new capability so the user community can drive information-based actions that are timely, relevant, and profitable? I will explore a few possibilities here and share a best practice that should be an integral part of any new technology assessment and initial rollout at your company.
Big data, even in its simplest form of sheer volume, can be used to detect fine-grained patterns. Energy companies rolling out smart meters now have hourly usage statistics, which can give customers more control over when, and how, they use energy. I have often joked that if you tell me it is cheaper to run the laundry at 2:00 AM, then I will make sure to wake my wife up to take advantage of the lower pricing.
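To make the laundry joke concrete, here is a minimal sketch of the kind of question hourly smart-meter pricing makes answerable. The data, field names, and prices are all hypothetical illustrations, not any vendor's actual feed or API:

```python
# Hypothetical hourly rates from a smart-meter pricing feed:
# (hour of day, cents per kWh). Values are made up for illustration.
hourly_prices = [
    (0, 9.1), (2, 7.4), (8, 14.2),
    (14, 21.5), (18, 24.0), (23, 10.3),
]

# Find the cheapest hour to run a heavy appliance (the 2:00 AM laundry run).
cheapest_hour, cheapest_rate = min(hourly_prices, key=lambda hp: hp[1])
print(f"Cheapest hour: {cheapest_hour}:00 at {cheapest_rate} cents/kWh")
```

The point is not the three-line analysis; it is that without hourly granularity in the data, the question cannot be asked at all.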
Turning to the more traditional definition of “big data” and the social or unstructured data feeds, companies can better understand customer sentiment as well as customer reach and influence. By knowing who your most influential customers are, and how they are being treated, you can quickly create a positive experience (which will be shared) or quickly resolve a bad one (so the complaint is not shared, or the fix is).
Some of the capabilities above would involve more than just “big data” technology. They would also leverage in-database analytics and either in-memory processing or solid state drives to deliver higher performance against larger volumes of data. Again, each of these technologies is neat in its own right, but taken together they create very large business potential.
The real question, though, is what do you do with the better performance or the more granular layer of data? Do you have the business community actively involved in the evaluation of new technologies? While that may sound like a silly question, I have sat across the table from IT departments that were choosing a data warehouse based on the speed of the backplane interface! On their list of ten ranked criteria, the first seven were purely technical and only the last three were loosely aligned with the business. If you don’t care about your business outcomes, then you are in trouble.
Okay, here are a couple of tips on evaluating the business value of emerging technologies so you can hit the ground running as they are introduced and, more importantly, as they are tested in real life.
First, take a look at your current environment versus the pains of your user community. Are the users happy with the level or breadth of the data but just need more timely access? That can come either from better performance when the query is run or from more frequent loading so data is ready for analytics sooner. What is in the way of better performance? Are you moving the data around to each process, or are there other dependencies that affect how often you can load data before it is accessible by the user directly? Are you building lots of indexes, cubes, or special extracts to tools to execute processes such as geospatial analytics? In many of these cases new technology is overkill, and it is simply a matter of leveraging your current resources correctly.
Second, if you were able to remove this barrier and give the users the data in a more timely fashion, what would it be worth to them? Would they be able to execute the same action sooner, and thus in a potentially more relevant and profitable way? Would they execute the same actions more frequently, such as moving a marketing campaign into the call center instead of delayed e-mail or snail-mail messaging? How would they change other processes to make them more efficient thanks to the accelerated information and insights? If the user cannot define the value of the capability, then either you do not need the change or, more likely, you are talking to the wrong person!
Third, set up a quick real-world scenario to test the new technology or capability. This needs to have definite outcomes, and the users need to know how to measure the effect of the business change. Without those two elements defined, just stop.
Fourth, you need to start stressing the technology. Does it scale well? What are the weak points? How well does it interact with the rest of your current architecture? By knowing how far you can take the technology, and more importantly, how far the technology will take your business, you can effectively plot your roadmap for the next 12-24 months.
Finally, with all these considerations you can determine whether this is a good technology to incorporate into your future roadmaps. It may turn out that your business test showed value but your stress test showed problems. Knowing this up front will prevent you from adopting a new technology that quickly becomes another one-off solution you constantly have to work around in the future.