By Len Dubois, Sr. Vice President, Marketing & Sales Support, Harte Hanks Trillium Software

Ten years ago, analysts and vendors began telling us that poor data quality is a business issue, not just an IT issue. Yet judging by the industry jargon of the past decade, it has been quite obvious who the buyers of data quality solutions actually were: Information Technology professionals, and only them. Not any more.

Terms such as “parse, integrate, standardize, and match” – and, more recently, buzzwords like “parallel processing, pre-built, the Cloud, and Big Data” – have littered the canvas of vendor sales and marketing messaging. I think we can all admit that these terms neither endear us to business people nor make it easy for line-of-business managers to understand their role in improving the value of information – and, in particular, its quality.

Increasingly, though, business consumers of information are coming to the fore in the data quality market. They are expressing frustration not only with the data’s failure to meet their business needs, but also with the solutions being used to solve those data quality problems.

In the past, our buyers mostly occupied positions in the IT realm and asked for ways to “parse data” and “integrate systems data” – but that is changing fast. Today, Claims VPs demand to know how we will “mitigate claims losses” and “improve claims cycle time,” while Risk and Compliance executives focus on “data due diligence in the face of regulatory requirements.”

So it is the language, terminology, and business objectives we use that determine who will be best able to see how poor data quality manifests itself in business processes. It is also how we will help the business take responsibility for solving these challenges.