I have written about a few in-memory technologies and approaches in recent months (including IBM DB2 BLU, SAS's overall in-memory strategy and SAP Predictive Analytics with its support for HANA in-memory) and into this space has come Teradata with its new Intelligent Memory. I would summarize the idea behind the Teradata Intelligent Memory product with three points:
- While much faster than disk, and much cheaper than it used to be, memory is still roughly 80x more expensive than disk
- Data volumes are exploding even faster than memory capacity is growing, so putting all your data in memory is impractical
- Why ask your DBA to do manually what your system can do automatically?
Teradata has taken its prior work on how “hot” data is – how important a given piece of data is to the I/O of a database – and extended it so that very hot data is placed in memory. This is exactly what you would do manually – put the data that makes the biggest performance difference into memory – but Intelligent Memory does it automatically and dynamically, constantly updating what’s in memory versus what’s just on disk to continually optimize performance. Nice.
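Teradata hasn’t published the details of its temperature algorithm, but the general pattern is easy to sketch. Below is a minimal, hypothetical Python sketch of temperature-based placement, assuming a simple per-block access counter as the “temperature,” a fixed memory budget, and periodic rebalancing; the names (TemperatureTieredStore, record_access, rebalance, decay) are my own illustration, not Teradata’s API.

```python
import heapq
from collections import defaultdict


class TemperatureTieredStore:
    """Toy sketch of temperature-based data placement: the hottest
    blocks (by access count) live in a fixed-size memory tier and
    everything else stays on disk. Hypothetical illustration only,
    not Teradata's actual algorithm."""

    def __init__(self, memory_budget_blocks):
        self.memory_budget = memory_budget_blocks
        self.temperature = defaultdict(int)  # block_id -> access count
        self.in_memory = set()               # blocks currently promoted

    def record_access(self, block_id):
        # Every read or write warms the block a little.
        self.temperature[block_id] += 1

    def rebalance(self):
        # Periodically re-rank blocks and keep only the hottest ones
        # in memory; this is the automatic, dynamic placement
        # described above, rather than a manual DBA decision.
        hottest = heapq.nlargest(
            self.memory_budget, self.temperature, key=self.temperature.get
        )
        self.in_memory = set(hottest)

    def decay(self, factor=0.5):
        # Cool all blocks over time so yesterday's hot data can be
        # evicted in favor of today's.
        for block_id in self.temperature:
            self.temperature[block_id] = int(self.temperature[block_id] * factor)


if __name__ == "__main__":
    store = TemperatureTieredStore(memory_budget_blocks=2)
    accesses = ["orders", "orders", "orders", "customers", "customers", "archive"]
    for block in accesses:
        store.record_access(block)
    store.rebalance()
    print(store.in_memory)  # {'orders', 'customers'}: the two hottest blocks
```

The decay step is what makes the placement dynamic rather than a one-time decision: data that was hot last week cools off and gets evicted in favor of whatever is hot now.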
Anyway, I am going to be writing some thought leadership material on in-memory processing as it relates to Decision Management Systems, so look for that in the coming months.
Copyright © 2013 http://jtonedm.com James Taylor
(image: decision management / shutterstock)