Copyright © 2009 James Taylor. Visit the original article at First Look – IBM/ILOG BRMS 7.0.
As previewed yesterday, ILOG (now an IBM company) is releasing the 7.0 products of their business rule management system (BRMS) family. These mark a big step forward for the ILOG product range. ILOG BRMS 7.0 has the standard BRMS components – an Eclipse-based development environment (Rule Studio), a web-based collaboration environment for non-technical users (Rule Team Server), a shared rule repository, deployment and execution technology (Rule Execution Server). Uniquely (I think) they have a strong integration with the Microsoft Office tools, especially Word and Excel.
Three new products are being released:
- JRules BRMS product line v7.0 – integration with Office, decision validation services, rule analysis and reporting, decision table templates.
- Rules for .NET BRMS product line v7.0 – integration with Rule Team Server, Decision Services deployment and improved rule reporting.
- Rules for COBOL 7.0 – improved code generation for rule flows and decision tables, numeric computations and double-byte character set support.
With these releases Rule Team Server, ILOG’s collaboration environment, and its Microsoft Office editing tools will be shared across the Java and .NET development tools. The developer tools for Java remain in Eclipse and the .NET tools in Visual Studio, but the web- and Office-based tools are now shared. Rule projects are stored in a common repository across the three deployment options – Java, .NET and COBOL – and both the Java and .NET platforms can now deploy as Decision Services.
As part of getting briefed on the products I got a couple of chances to talk with various IBM executives. These executives have been consistently enthusiastic about ILOG and about the potential for business rules. They see IBM continuing to compete in what you might call the stand-alone business rules market where companies are looking for a business rules management system (BRMS) to solve specific decision management problems. They are also working hard to make business rules part of their broader platform offerings – their transaction, application server and process management platforms for instance. This will mean customers will adopt business rules as part of adopting other technology to solve their business problems. IBM also sees lots of potential for the underlying rules execution technology – in Master Data Management for example – where a BRMS is not required but the ILOG execution and management technologies could be very helpful. I think we will see more and more use of the technology in other IBM products as well as more integration with the full BRMS. All of which, of course, is good. It’s also clear that IBM is committed to all three platforms – the standalone market requires all three and IBM’s platforms require both Java and COBOL. Personally I am excited to see how they integrate the BRMS with the Process Server and Business Events products in the short term and with the Information Management tools in the longer term.
Unlike many of the First Looks I write, this one involved a live public demo given at the IBM IMPACT show as well as a private one at the same event. As a result it is pretty long.
The first step is to create a business object model, in this case based on XML. For the purposes of the demo they are creating the XML schema directly. The integration with Eclipse allows easy use of standard XSD creation tools. Next a rule project gets created and associated with this schema. A layer of abstraction, the business object model or BOM, gets created to allow the objects being manipulated by rules to remain stable even if the underlying schema changes. The creation of the BOM includes some default verbalizations – “chatty” labels for the elements of the schema so that rules will be more readable when they refer to the schema elements.
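The verbalization idea can be pictured with a small sketch – Python here purely for illustration, since ILOG's BOM and verbalization files have their own formats, and the element names below are invented:

```python
# Hypothetical sketch: the BOM layer maps raw schema element names to
# business-friendly verbalizations, so rules read naturally even if the
# underlying XSD names are terse or change over time.

SCHEMA_TO_VERBALIZATION = {
    "custAge": "the age of the customer",
    "loanAmt": "the amount of the loan",
}

def verbalize(element_name: str) -> str:
    """Return the readable label for a schema element, falling back to the raw name."""
    return SCHEMA_TO_VERBALIZATION.get(element_name, element_name)
```

The point of the indirection is exactly what the paragraph describes: rules refer to the verbalized layer, so a rename in the schema only touches the mapping, not every rule.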
The service definition is created and the inputs and outputs specified. At this point rules can be added. The basic rule editor exposes the rule syntax directly, but the verbalizations mean that fairly English-like sentences can be created as rules are built. The rules have an if/then structure, and a nice type-ahead editor drops down lists of attributes and so on. Although aimed at a technical user, the resulting rule is accessible and could easily be shown to a business user in a collaborative way. Although you could develop a rule flow and multiple rule sets, you can also just specify the rules for the service as a simple list.
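A minimal sketch of that if/then structure, again in illustrative Python rather than ILOG's rule language – the rule name, fields and thresholds here are all made up:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # the "if" part
    action: Callable[[dict], None]      # the "then" part

def run_rules(rules: list, request: dict) -> list:
    """Evaluate each rule against the request; fire the action when the
    condition holds, and return the names of the rules that fired."""
    fired = []
    for rule in rules:
        if rule.condition(request):
            rule.action(request)
            fired.append(rule.name)
    return fired

# A toy "simple list of rules" for a loan decision service:
rules = [
    Rule("reject large loans for minors",
         lambda r: r["age"] < 18 and r["amount"] > 1000,
         lambda r: r.update(decision="reject")),
]
```

In the product the condition and action are authored as verbalized sentences, not lambdas, but the evaluation shape is the same.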
At this point the rules are stored in files and managed as source code files in the same way as any other code files being edited in Eclipse. A simple wizard allows the developer to package up and deploy the rules as a service. The developer can override or version the deployed instances in existence. Once deployed you can use the Rule Execution Server to see what rules are deployed and running. From here the URL for the WSDL can be pulled and used to make the service accessible to anything, though you can also invoke the service using POJO, EJB and JCA components.
The rules from RTS were checked in and the repository updated. The RTS and Eclipse environments can be synchronized so that rule changes made by the developer are applied to the repository and changes can be made in either area. A governance model for review, approval, deployment and so on exists, allowing different users to play different roles in the RTS environment. Assuming you have the right authority, the rules can be re-deployed and the running server updated. In addition, Microsoft Word RuleDocs were generated from the project to show the MS Office integration.
Users can publish the rules from RTS to RuleDocs directly. Excel files are created for decision tables and Word files for other rules. These documents include the rules but also embed the metadata necessary to explain and support editing of these rules. These files can be stored on file servers and WebDAV servers (content management systems) and managed as project assets. Users can baseline the project as they generate the documents to ensure a common snapshot for comparison when merging the files back in. Support for WebDAV allows you to export RuleDoc files directly into an environment with document-based security.
Synchronization services are consistent between the various development tools, RTS and the office tools allowing rules to be edited in various different ways, each perhaps suitable for different users, and then synchronized into a single repository.
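The round-trip synchronization described above can be sketched as an optimistic check-in protocol – an entirely hypothetical simplification; ILOG's actual merge and baselining model is richer than this:

```python
# Hypothetical sketch: each editing surface (Rule Studio, RTS, a RuleDoc)
# checks its changes in against the version it was based on. A stale edit
# is rejected so the editor must re-sync and merge before trying again.

class RuleRepository:
    def __init__(self):
        self._rules = {}   # rule_id -> (version, body)

    def check_in(self, rule_id: str, based_on_version: int, body: str) -> bool:
        """Accept the edit only if it was based on the current version."""
        current_version = self._rules.get(rule_id, (0, None))[0]
        if based_on_version != current_version:
            return False   # stale: another surface checked in first
        self._rules[rule_id] = (current_version + 1, body)
        return True

    def get(self, rule_id: str):
        return self._rules.get(rule_id)
```

The baseline-on-export step the previous paragraph mentions plays the role of `based_on_version` here: it fixes the snapshot against which the returning documents are compared.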
Specific new features in 7.0 include changes to Rule Team Server and a new offering called Decision Validation Services, which brings improvements in simulation and testing along with an additional component called the Decision Warehouse.
Rule Team Server (RTS) is ILOG’s collaboration tool, now extended across the whole product line. RTS has an explorer metaphor allowing access to the various kinds of artifacts and allowing in place editing of rules etc. Simulations and Test Suites are now artifacts, like rule sets and rule flows, and are managed in RTS. Users of RTS can see, manage, edit and run the test suites and simulations directly. The same meta model is used to manage versions, ownership etc. for all artifacts.
RTS has security control at the artifact-type level, and instances can be controlled individually using an extension. These extensions are available throughout RTS to allow companies using RTS to add their own capabilities – for instance, to add fine-grained control, to support different scenarios for test data creation, to add database access for simulations, and more. These extensions allow the components of RTS to be reused, mixed with custom components and extended.
The separate Rule Scenario Manager has been replaced with new testing and simulation capabilities in RTS 7.0 called Decision Validation Services. Decision Validation Services allow Test Suites and Simulations to be created. Test Suites represent a set of executions of a decision service, with required input data, expected output data and optionally details of the rules and rule tasks executed. Simulations are similar in terms of their structure, though very different in terms of purpose and data used: usually they do not contain assertions on expected output data and data volumes may be much higher. Test Suites are typically executed AFTER a rule change has been made to validate the implementation of the change, whereas Simulations are run prospectively to determine WHAT changes could usefully be made.
Building either follows the same basic process. Users can specify that all the rules or a subset of the rules in a project should be tested and then define a scenario. The tool generates an empty Excel spreadsheet based on the object model and rule set signature. This allows the specification of both test data and expected results. Besides the input and output data, users can also test the execution results – specifying that a specific rule or task in the rule flow should have executed for instance. The definition process generates a usable and readable spreadsheet, assuming a reasonably coherent test scenario. These spreadsheets can be stored in the repository and versioned as usual. Once stored, these can be used for Test Suites or Scenarios. These can be run (either directly or in the background) and the results are displayed in a nice graphical layout. For Test Suites this involves displaying successful tests, failed tests, problems with test data etc. For simulations the results and summaries of those results are displayed. The rule project map within the tool (that lays out the various steps for a user to help them keep track of their projects) has been extended to include the Decision Validation Services step before deploy and integrate. This helps ensure that users remember to develop test cases and test the rules.
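The test-suite mechanics can be sketched roughly as follows – Python purely for illustration, since the real tool works against generated Excel rows and the Rule Execution Server, and the service, rule and field names below are invented:

```python
# Hypothetical sketch: each spreadsheet row carries input data, expected
# outputs, and optionally the rules expected to fire. The runner replays
# each row against a decision service and checks all three.

def run_test_suite(decision_service, rows: list) -> list:
    """Return (row name, passed?) for every row in the suite."""
    results = []
    for row in rows:
        output, fired = decision_service(row["input"])
        ok = all(output.get(k) == v for k, v in row["expected_output"].items())
        if "expected_rules" in row:
            # execution-result check: the named rules must have fired
            ok = ok and set(row["expected_rules"]) <= set(fired)
        results.append((row["name"], ok))
    return results

def toy_service(data: dict):
    """Stand-in decision service: rejects applicants under 18."""
    if data["age"] < 18:
        return {"decision": "reject"}, ["minimum age"]
    return {"decision": "approve"}, []

suite = [
    {"name": "minor rejected", "input": {"age": 16},
     "expected_output": {"decision": "reject"}, "expected_rules": ["minimum age"]},
    {"name": "adult approved", "input": {"age": 30},
     "expected_output": {"decision": "approve"}},
]
```

A Simulation, per the distinction above, would use the same row structure but typically drop the expectation columns and run far more rows, summarizing the outputs instead of asserting on them.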
Because Test Suites and Scenarios may be created, managed and executed by non-technical users, the tool allows the creation of a framework to support this execution. Within the Rule Studio environment technical users can create validation projects (distinct from rule projects) that will be managed in the environment. These projects specify the configurations on which validation can be run and available formats for sample data. This information is used to guide business users and help them set up a validation environment – they can select the validation environment they wish to use for any Test Suite or Simulation they wish to run. Developers can also add a new provider for scenario data, for instance, and can specify a renderer class, configuration plug-in etc. They can also define KPIs that are linked to KPI renderers for use in Simulation. All this allows IT to create an environment in which business users will be able to manage their own test scenarios even when they are more complex, accessing existing backend datasets for example. The complexity is packaged up and installed automatically to the Rule Team Server.
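The division of labor might be sketched like this – entirely hypothetical names and shapes; the real extension points are Java classes registered with the product:

```python
# Hypothetical sketch: IT registers a scenario-data provider (and optionally
# a KPI renderer) under a named validation configuration; business users then
# pick a configuration by name when running a Test Suite or Simulation.

VALIDATION_CONFIGS = {}

def register_config(name: str, data_provider, kpi_renderer=None):
    """IT-side setup: package a data source and renderer under one name."""
    VALIDATION_CONFIGS[name] = {"data_provider": data_provider,
                                "kpi_renderer": kpi_renderer}

def scenarios_for(name: str) -> list:
    """Business-user side: fetch scenario rows through the chosen provider."""
    return VALIDATION_CONFIGS[name]["data_provider"]()

# IT registers a provider backed by an existing dataset (inline here; in
# practice this is where database access for simulations would plug in):
register_config("claims-backend", lambda: [{"age": 16}, {"age": 30}])
```

The business user never sees the provider code, only the configuration name – which is the packaging-up the paragraph describes.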
Runtime execution audit is supported by the capture of detailed execution information on the Rule Execution Server. The system administrator of the Rule Execution Server can decide how much audit information should be logged to the Decision Warehouse. The tool allows control over the rules you want to monitor and how much information you want to keep – just rules executed, objects changed, etc. Data is logged to a decision warehouse that can be queried using the Rule Execution Server web console or accessed via an API for analysis using BI or statistics tools such as Cognos. Rule artifacts within the decision warehouse hyperlink back to Rule Team Server to allow traceability back to the rule or other artifact that was executed. While there is not much packaged capability in terms of reports and the like, this is a nice foundation for real decision analysis, crucial for real ongoing improvement.
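A toy sketch of that configurable audit level – illustrative only; the actual Decision Warehouse schema and API are IBM's, not this:

```python
import time

class DecisionWarehouse:
    """Hypothetical audit sink: the administrator picks how much detail to keep.
    Levels here mirror the choices described above: nothing, just the rules
    that executed, or the full record including objects changed."""

    def __init__(self, level: str = "rules"):   # "none", "rules", or "full"
        self.level = level
        self.records = []

    def log(self, ruleset: str, fired_rules: list, changed_objects: dict):
        if self.level == "none":
            return
        record = {"ruleset": ruleset,
                  "timestamp": time.time(),
                  "rules": list(fired_rules)}
        if self.level == "full":
            record["changed_objects"] = changed_objects
        self.records.append(record)
```

Queries over `records` stand in for the web-console and API access the paragraph mentions; the rule names in each record are what would hyperlink back to Rule Team Server.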
So that’s it. Lots of new capabilities across the board and consistent application of features that used to be available only on one platform or another. If you are an ILOG customer, let me know what you think of the new release.