Organizations of all kinds make many decisions, decisions that are critical to their success. These decisions can be divided into strategic decisions (one-off, collaborative, high-value decisions), tactical or management decisions (primarily about controlling how business is performed day to day and week to week) and operational or transactional decisions (those about a single customer or transaction). Different kinds of analytics can be applied in each case: visualization and highly interactive drill-down analytics are powerful for strategic decisions, for instance, while embedded scoring is effective for transactional decisions.
When applying analytics to operational decisions, the opportunity to deliver analytic insight to a person diminishes: there may be no one present, as with a website or other automated system, or the person who is present may lack the analytic skills to consume the insight. Driving automation into these decisions creates challenges (see critical issues in applying analytics at production scale) and forces new roles on business, IT and analytic staff. This brief discusses these roles and how they need to interact to deliver successful use of analytics in operational decisions.
The ROI and challenges of analytics in automated decisions
Operational decisions, because they are taken often and in a repeatable way, offer a huge multiplicative effect – any improvement in the quality of decisions is multiplied by the number of decisions so that even small increases from analytics can make a big difference. In addition, because similar decisions are made again and again, ongoing decision analysis can create continuous improvement. Operational decisions also allow for the development of more effective approaches by constantly challenging existing decision-making approaches with new ones. However, as discussed in the brief “Challenges in applying analytics at production scale” there are five critical issues:
- Actions not just predictions
- Time to deploy models
- Compliance
- Data
- Automated analytic decisions involve multiple groups

The last of these is the most critical and is the subject of this brief.
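The multiplicative effect of operational decisions described above can be illustrated with a rough calculation. The volumes, values and uplift below are hypothetical assumptions chosen only to show the arithmetic:

```python
# Rough illustration of the multiplicative effect of operational decisions.
# All figures here are hypothetical assumptions, not benchmarks.
decisions_per_year = 5_000_000      # e.g. retention or credit-line decisions
value_per_decision = 20.0           # average value at stake per decision, in dollars
uplift = 0.005                      # a 0.5% improvement in decision quality

annual_impact = decisions_per_year * value_per_decision * uplift
print(f"Annual impact of a 0.5% improvement: ${annual_impact:,.0f}")
```

Even a half-percent improvement per decision, invisible in any single transaction, compounds to $500,000 a year at this volume.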
The three-legged stool – business, analytics and IT
A three-legged stool requires all three legs to be stable. In the case of automated analytic decisions the three legs are business owners, analytic experts and IT professionals. Without all three you cannot achieve the results you want. The business must be involved because these are business decisions. It is business owners who must manage the business to the agreed metrics and KPIs, business owners who understand the strategic direction of the organization. If the business is not engaged, it will be difficult to ensure that the decisions made are those the business needs. In addition, it is business owners who first see shifts in markets and competitors, new regulations, court cases and more. These shifts change the parameters of a good decision and can require that the decision-making approach in use be changed. If the business is not engaged, these changes will not be applied as rapidly or effectively as they need to be. Remember, however, that the role of these business experts is not to make these operational decisions but to define how these decisions should be made.
IT must be involved because these are automated decisions. They must be delivered using IT infrastructure, systems or Decision Services (see the role of Decision Services in an enterprise architecture). Changes to how these decisions are being made are going to require system changes. Problems in performance or availability of these systems are going to be serious and will rely on IT to fix them. New or changed business processes or systems are going to need to be integrated with decision services. If IT is not engaged then none of this will work seamlessly.
Finally, analytics must be applied to improve these decisions. Even in organizations attached to “gut” decisions by executives, or to deferring to business experts when they disagree with analytic models, the need to apply analytics to operational, automated decisions is clear. No business should hand its decision-making approach to its IT department without providing solid analytic guidance, and the users of these systems are unlikely to be viable decision-makers: operational systems are typically used by junior, front-line staff without analytic skills or the time to apply them effectively.
Roles
All three groups, then, must be engaged in the process. Much of what they will need to do is the same work they usually do. The IT team will need to worry about performance, system integration, availability and all the usual system issues. The business team will need to understand their business and its measures of success. And the analytic team will need to analyze data and come up with meaningful and reliable insight. But each group will also have to take on new roles.
Business user roles
The business users must provide clarity by defining decisions explicitly. This includes defining the actions that can be taken, the measures of success and more. The measures that will be applied to judge successful decisions, and the ways in which those impacted by these decisions are themselves incented, must be well understood: if analytic decisions contradict the measures and incentives of those involved, they will not be adopted. Business users also have responsibility for the rules that wrap around a model and for understanding the business circumstances that might invalidate a model. They must be clear about what actions are taken in what circumstances and based on what model outcomes.

Finally, business users must commit to close engagement with the analytic team on what kinds of models will help. If the business needs 30 days to intervene with customers who are a retention risk, then the analytic team must be told this or their models will not be helpful. Business users must also agree to be educated on the basics of analytic models – what can and can’t be predicted, the difference between causation and correlation and so on. They must become effective consumers of, and requestors of, analytic models.
Analytic staff roles
First and foremost, analytic staff must always remember to focus on business results, not analytic model accuracy. It is a common fallacy of analytic teams to consider themselves done once a statistically valid model has been created. Instead they must focus on business results: a model is only good if it is implemented and used, and if it changes the behavior of the business.
Besides this business understanding, analytic teams need to engage with the IT department with respect to the operational systems. Analytic teams need an understanding of the data available in production systems and of the cost and difficulty of new data sources. Just because a data source will improve the model does not mean it should be included if the IT time and cost of making it available in production is excessive. The need to successfully implement models must be part of the analytic team’s assessment of the quality and readiness of a model.

An essential element of building effective analytic models is managing the data available. Some must be used to train models, some kept for validating the model, and some used only to see how predictive the model is against data not used in its construction. Analytic teams are used to worrying about data from this analytic modeling perspective. They must now also work with their IT colleagues to provide the IT department with test scenarios (to confirm that the model performs as expected once implemented) as well as regularly updated datasets for use in impact analysis as models and rules are changed.

Finally, at a practical level, analytic teams need to rethink how they pick analytic tools. Most teams have a tendency to allow individual analysts to pick their own tools and to focus on the creativity and flexibility tool choices allow. These features are important, but if analytic models are to be deployed into production systems then analytic tools must be selected not just on how well they support modeling but also on their ability to drive analytic models into the systems the company actually has. If the company uses a BRMS (see Business Rules Management Systems as a deployment platform for analytics) then the modeling tool should be able to push its results into the BRMS. If the production environment is a legacy COBOL one, then some thought must be given to how the model will be turned into executable COBOL code.
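The data-partitioning discipline described above – training data, validation data, and a holdout set untouched until final evaluation – can be sketched in a few lines. This is an illustrative sketch only; the 60/20/20 split and the record format are assumptions:

```python
import random

def three_way_split(records, train_frac=0.6, validate_frac=0.2, seed=42):
    """Partition records into train / validate / holdout sets.

    The holdout set is used only once, to estimate how predictive the
    final model is against data not used in its construction.
    """
    rng = random.Random(seed)          # fixed seed so the split is repeatable
    shuffled = records[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_validate = int(n * validate_frac)
    train = shuffled[:n_train]
    validate = shuffled[n_train:n_train + n_validate]
    holdout = shuffled[n_train + n_validate:]
    return train, validate, holdout

# Example with 1,000 synthetic customer records
train, validate, holdout = three_way_split(list(range(1000)))
print(len(train), len(validate), len(holdout))  # 600 200 200
```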
IT roles
IT departments are often largely ignorant of more advanced analytic techniques, of their value, and of their data requirements. This has to change, as analytics are simply too valuable to the typical organization today. IT departments must consider the value of the data they manage to analytic teams so that it can be stored and managed in a way that is suitable for analytic use. This typically means keeping detailed data, not just summaries, keeping data accessible for longer, and tracking changes to data to prevent what are called “leaks from the future”, where it is not possible to recreate the data as it was on a particular day in the past. Analytic teams can provide these requirements if they are asked. Sadly, most IT departments don’t ask, and the analytic team ends up working from log files or other low-level extracts instead of from the data in the data warehouse, the data that has been so carefully cleaned and integrated.
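The change tracking mentioned above amounts to keeping every version of a record with its effective date, so a past state can be reconstructed. A minimal sketch follows; the field names and history format are hypothetical assumptions, not a reference schema:

```python
from datetime import date

# Hypothetical change-tracked storage: each update appends a new version
# with the date from which it was valid, rather than overwriting in place.
history = [
    {"customer": 1, "credit_limit": 5000, "valid_from": date(2023, 1, 1)},
    {"customer": 1, "credit_limit": 8000, "valid_from": date(2023, 6, 1)},
]

def as_of(history, customer, when):
    """Return the most recent version of a customer record on or before `when`."""
    versions = [r for r in history
                if r["customer"] == customer and r["valid_from"] <= when]
    return max(versions, key=lambda r: r["valid_from"]) if versions else None

# Training data built "as of" March 2023 must see the old limit, not the
# later one; using the current value would leak information from the future.
print(as_of(history, 1, date(2023, 3, 15))["credit_limit"])  # 5000
```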
IT departments must also create a requirements gathering and design approach that identifies opportunities for analytics. Partly this means making sure that they push on new attributes to see if they can be derived analytically from existing data. It also means ensuring that questions like “what would you like to know about customers” or “are there things you would like to be able to predict as part of this system” get asked.
Finally IT departments should adopt a design pattern for Decision Services (see the role of decision services in an enterprise architecture) and become adept at implementing decision services. This will almost certainly also require the adoption of a rules-based infrastructure for them.
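The Decision Service pattern recommended above can be sketched as a thin service that wraps a predictive score in business rules. This is a hedged illustration: the function names, the stand-in scoring logic, the thresholds and the action labels are all hypothetical:

```python
def churn_score(customer):
    """Stand-in for a deployed predictive model (hypothetical logic)."""
    return 0.9 if customer["months_inactive"] > 3 else 0.1

def retention_decision(customer):
    """Decision Service sketch: business rules wrapped around a model score."""
    # Rule defined by the business: regulated accounts go to manual review.
    if customer["regulated"]:
        return "manual-review"
    score = churn_score(customer)
    # Rules map the model score to an action the business has defined.
    if score > 0.7:
        return "offer-retention-discount"
    return "no-action"

print(retention_decision({"months_inactive": 5, "regulated": False}))
# offer-retention-discount
```

The point of the pattern is that calling systems see only the decision (the action), while the rules and the model behind it can be changed without changing those systems.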
Recommendations
Besides a focus on these roles, a number of other actions can be taken to strengthen the three-legged stool.
- Analytic CoE should include IT people: If an analytic center of excellence (CoE) is being developed, and it should be, then this center should have strong connections with the IT department. Indeed, an enterprise architect or similar should be assigned at least part time to the CoE.
- Conduct analytic training for IT/business: Ensuring that the business and IT teams have an understanding of the power and limitations of analytics is essential. This does not mean teaching them how the algorithms work or what the various statistical measures of a model mean. It does mean introducing lift curves, probabilities, confidence and similar concepts.
- Feed IT requirements into analytic tool selections and standardize analytic tools: Analytic teams must engage with IT departments to see how the realities of their deployment environment should impact analytic model choices. Analytic teams should also seriously consider standardizing their tools, organizing models and variables into a proper shared repository and generally “industrializing” their analytic processes.
- Measure analysts based on value of deployed models only: Analytic teams should not be allowed to say that they did a good job because they produced an accurate model – they must show that their models have impacted the business in a positive way. Only then will they truly engage with the business and IT departments to ensure successful implementation.
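The lift curves mentioned in the training recommendation above are easy to demonstrate without any statistics: lift is just the response rate among the highest-scored customers divided by the overall response rate. A small sketch on synthetic data (the scores and outcomes are invented for illustration):

```python
def lift_at_decile(scores_with_outcomes, decile=1):
    """Lift in the top decile: response rate among the highest-scored
    customers divided by the overall response rate."""
    ranked = sorted(scores_with_outcomes, key=lambda p: p[0], reverse=True)
    n_top = max(1, len(ranked) * decile // 10)
    top_rate = sum(outcome for _, outcome in ranked[:n_top]) / n_top
    overall = sum(outcome for _, outcome in ranked) / len(ranked)
    return top_rate / overall

# Synthetic example: 10 customers as (model score, actual outcome) pairs.
data = [(0.95, 1), (0.9, 1), (0.8, 0), (0.7, 1), (0.6, 0),
        (0.5, 0), (0.4, 1), (0.3, 0), (0.2, 0), (0.1, 0)]
print(lift_at_decile(data))  # 2.5: the top 10% respond at 2.5x the base rate
```

A lift of 2.5 means targeting the model’s top decile is two and a half times as effective as targeting customers at random, which is exactly the kind of business-facing framing the training should teach.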
EDITOR’S NOTE: This article was initially published by James Taylor for the International Institute for Analytics and was republished here with all proper permissions.