If you’re a Chief Information Officer (CIO), there are three things your organisation expects of you: 1) keep everything running; 2) add new capabilities; and 3) do it all as cheaply as possible.  The metrics that CIOs typically use to track these things include the number of outages, the number of projects delivered and budget variances.  The problem with this approach is that it fails to account for complexity.

When I talk with senior executives, regardless of their role, the conversation inevitably turns to their frustration at the cost and complexity of doing even seemingly simple things such as preparing a marketing campaign, adding a self-service capability or combining two services into one.  No matter which way you look at it, it costs more to add or change even simple things in organisations because of the complexity that a generation of projects has left behind as its legacy.  It should come as no surprise that innovation seems to come from greenfield startups, many of which have been funded by established companies whose own legacy stymies experimentation and agility.

This doesn’t have to be the case.  If a CIO accepts that the complexity left behind by previous projects is the enemy of agility, then they should ask whether they are explicitly measuring the complexity that current and future projects are adding to the enterprise.  CIOs need to challenge themselves to engineer their business so that it is as flexible and agile as a startup, carrying minimal additional complexity.

The reason that I spend so much time writing about information, rather than processes or business functions, is that the modern enterprise is driven by information.  So much so that metrics tracking the management and use of information are very effective predictors of issues with the development of new processes and obstacles to the delivery of new functionality.  There appear to be no accurate measures of the agility of enterprise technology that focus only on processes or functions while ignoring information.  There are, however, measures of information that can safely ignore processes and functions, because well-organized information assets enable new processes and functions to be created with ease.

The CIO who wants to ensure their organisation has the capability to easily implement new functions in the future should measure how much information the organisation can handle without introducing disproportionate complexity.  The key is to structure information assets so that complexity is compartmentalized within tightly controlled units with well-understood boundaries and properly defined interfaces to other information assets.  These interfaces act as dampeners of complexity or turbulence, allowing problems or changes to be contained and their wider impact minimized.
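To make the idea of a dampening interface concrete, here is a minimal sketch in Python.  All names (the internal rows, the `Customer` type, `get_customer`) are invented for illustration: the point is that consumers depend only on the published boundary, so a change to the internal schema is absorbed in one place instead of rippling across the enterprise.

```python
from dataclasses import dataclass

# Internal representation of the information asset: cryptic column
# names, free to change at any time without notice to consumers.
_customer_rows = [
    {"cust_id": 7, "given_nm": "Ada", "family_nm": "Lovelace", "st_cd": "A"},
]

@dataclass(frozen=True)
class Customer:
    """The published interface: a stable, well-understood boundary."""
    customer_id: int
    full_name: str
    is_active: bool

def get_customer(customer_id: int) -> Customer:
    """All translation from the internal structure happens here, so a
    schema change is contained at this boundary rather than spreading
    to every process that uses customer information."""
    row = next(r for r in _customer_rows if r["cust_id"] == customer_id)
    return Customer(
        customer_id=row["cust_id"],
        full_name=f'{row["given_nm"]} {row["family_nm"]}',
        is_active=row["st_cd"] == "A",
    )

print(get_customer(7))
# → Customer(customer_id=7, full_name='Ada Lovelace', is_active=True)
```

If the internal store later splits `st_cd` into several status fields, only `get_customer` changes; every consumer of `Customer` is untouched.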

Creating such silos may seem to go against the conventional wisdom of having an integrated atomic layer and an enterprise approach to data.  Nothing could be further from the truth: it simply ensures that the largest possible quantity of information is made available to all stakeholders with the minimum possible complexity.

The measures themselves are described in my book, Information-Driven Business, as the “small worlds data measure” for complexity and “information entropy” for quantity.  Applying the measures is surprisingly easy; the question each CIO then needs to answer is how to describe them in a way that engages their business stakeholders.  If technology leaders hope to avoid difficult topics with their executive counterparts then this will be impossible, but if they are willing to share their inside knowledge of modern information technology then the “us and them” culture can start to be broken down.
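As a flavour of how easy application can be, the sketch below computes Shannon’s information entropy over the value distribution of a single attribute, together with a generic small-world statistic (average shortest path length across a data-model graph) as a stand-in for the book’s measure.  The sample status column and entity graph are invented for illustration, and the path-length proxy is my assumption here, not the book’s exact formula.

```python
import math
from collections import Counter, deque

def entropy_bits(values):
    """Shannon entropy (bits) of a column's value distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def avg_shortest_path(graph):
    """Mean shortest-path length over all reachable ordered pairs,
    found by breadth-first search from every node."""
    total, pairs = 0, 0
    for start in graph:
        dist = {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for neigh in graph[node]:
                if neigh not in dist:
                    dist[neigh] = dist[node] + 1
                    queue.append(neigh)
        total += sum(d for node, d in dist.items() if node != start)
        pairs += len(dist) - 1
    return total / pairs

# Hypothetical customer-status attribute: 70% active, 20% lapsed, 10% closed.
statuses = ["active"] * 70 + ["lapsed"] * 20 + ["closed"] * 10

# Hypothetical entity-relationship graph for a small data model.
model = {
    "customer":    ["account", "address"],
    "account":     ["customer", "transaction"],
    "address":     ["customer"],
    "transaction": ["account"],
}

print(round(entropy_bits(statuses), 3))      # → 1.157
print(round(avg_shortest_path(model), 3))    # → 1.667
```

The lower the average path length, the more “small world” the model: any piece of information is only a few well-defined hops from any other, which is exactly the property the interfaces described above are meant to preserve.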