Tuesday, September 18, 2007

TopBraid now supports the OWLIM Inference Engine

I am pleased to announce that TopBraid Composer 2.3 now comes bundled with a new inference engine: SwiftOWLIM, a rule-based, high-performance reasoner for RDFS and (partial) OWL semantics. OWLIM has been implemented by OntoText and uses the Sesame API as its main interface. It is an attractive foundation both for smaller projects (SwiftOWLIM is a free open-source engine) and for professional use (OntoText offers BigOWLIM, which scales to billions of triples).

In Composer, OWLIM can be configured to run in three modes (OWL-max, OWL Horst and RDFS), depending on the level of expressivity your application needs. Our initial tests indicate that OWLIM may become a serious alternative to better-known engines such as Pellet and Jena. My colleague Dean Allemang is using OWLIM to classify ontologies that contain tons of individuals. His models are essentially impossible to handle with Pellet, which uses a different reasoning algorithm geared toward complete OWL DL semantics.
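For those who would like to experiment with OWLIM outside of Composer, here is a minimal sketch of how the engine might be accessed through the Sesame API. Treat it as an illustration under stated assumptions: the OWLIM SAIL class name (com.ontotext.trree.owlim_ext.SailImpl), the "ruleset" parameter and its values, and the example file name and base URI are assumptions taken for illustration, so please check the OWLIM distribution for the exact names. The plain Sesame calls (SailRepository, RepositoryConnection) follow the openRDF API.

    // Minimal sketch: accessing OWLIM through the Sesame API.
    // Class and parameter names below are assumptions -- see the OWLIM docs.
    import java.io.File;
    import org.openrdf.repository.Repository;
    import org.openrdf.repository.RepositoryConnection;
    import org.openrdf.repository.sail.SailRepository;
    import org.openrdf.rio.RDFFormat;

    public class OwlimSketch {
        public static void main(String[] args) throws Exception {
            // The OWLIM SAIL implementation (assumed class name).
            com.ontotext.trree.owlim_ext.SailImpl sail =
                    new com.ontotext.trree.owlim_ext.SailImpl();
            // Pick the inference profile; "rdfs", "owl-horst" and "owl-max"
            // roughly correspond to the three modes exposed in Composer
            // (assumed parameter name and values).
            sail.setParameter("ruleset", "owl-horst");

            Repository repository = new SailRepository(sail);
            repository.initialize();

            RepositoryConnection con = repository.getConnection();
            try {
                // Load an ontology; entailed triples become visible to queries.
                con.add(new File("ontology.owl"), "http://example.org/", RDFFormat.RDFXML);
            } finally {
                con.close();
            }
            repository.shutDown();
        }
    }

Inside Composer itself no code is needed, of course: the mode is simply selected as part of the inference engine configuration.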

While Composer 2.3 is the first release with OWLIM support, we anticipate that several improvements will be folded into the coming releases. For example, we could expose OWLIM's own rule language, similar to the existing Jena rules interface.