ABSTRACT. Widely adopted in industry for more than 20 years, business rules allow non-IT users to define decision-making policies in a simple and intuitive way. To facilitate their use, systems known as Business Rule Management Systems (BRMS) have been developed, separating the business logic from the application logic. While well suited to processing structured and complete data, BRMS struggle when the data are incomplete or uncertain. This study proposes a new approach for integrating probabilistic reasoning into IBM Operational Decision Manager (ODM), IBM's BRMS, in particular through the introduction of a notion of risk, which makes the compilation phase more complex but increases the expressiveness of business rules.
Computational Hermeneutics: Using Automated Theorem Proving for the Logical Analysis of Natural-Language Arguments (poster teaser)
ABSTRACT. While there have been major advances in automated theorem proving (ATP) in recent years, its main field of application has mostly remained confined to mathematics and hardware/software verification. I argue that the use of ATP in philosophy can also be very fruitful, not only because of the obvious quantitative advantages of automated reasoning tools (e.g. reducing by several orders of magnitude the time needed to test an argument's validity), but also because it enables a novel approach to the logical analysis of arguments. This approach, which I have called computational hermeneutics, draws its inspiration from work in the philosophy of language such as Donald Davidson's theory of radical interpretation and contemporary so-called inferentialist theories of meaning, which do justice to the inherent circularity of linguistic understanding: the whole is understood (compositionally) on the basis of its parts, while each part is understood only in the (inferential) context of the whole. Computational hermeneutics is thus a holistic, iterative, trial-and-error enterprise, in which we evaluate the adequacy of some candidate formalization of a sentence by computing the logical validity of the whole argument. We start with formalizations of some simple statements (taking them as tentative) and use them as stepping stones on the way to the formalization of the argument's other sentences, repeating the procedure until we arrive at a state of reflective equilibrium: a state where our beliefs have the highest degree of coherence and acceptability.
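To make the core loop concrete, here is a minimal, purely illustrative sketch (not the abstract's actual tool chain, which relies on higher-order ATP): formalize the premises and conclusion, check whether the conclusion follows, and revise the formalization if a counter-model is found. The toy argument and all names below are invented, and the check is simple propositional truth-table enumeration.

```python
from itertools import product

# Hypothetical candidate formalization of a toy argument:
#   P1: if something is an agent, it acts freely   (agent -> free)
#   P2: Socrates is an agent                       (agent)
#   C : Socrates acts freely                       (free)
premises = [lambda v: (not v["agent"]) or v["free"],   # agent -> free
            lambda v: v["agent"]]                       # agent
conclusion = lambda v: v["free"]                        # free
atoms = ["agent", "free"]

def argument_is_valid(premises, conclusion, atoms):
    """Valid iff every valuation satisfying all premises also satisfies
    the conclusion; a counter-model signals that the formalization
    (or the argument) needs revision."""
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

print(argument_is_valid(premises, conclusion, atoms))  # True
```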
Towards knowledge-based integration and visualization of geospatial data using Semantic Web technologies (poster teaser)
ABSTRACT. Geospatial data are pervasive and indispensable for various real-world applications such as urban planning, traffic analysis and emergency response. Data integration and knowledge transfer are two prominent issues in augmenting the use of geospatial data and knowledge. To address these issues, Semantic Web technologies have been widely adopted in the geospatial domain, and several initiatives are currently investigating the benefits brought by their adoption. In this context, this paper showcases and discusses knowledge-based geospatial data integration and visualization leveraging ontologies and rules. Specifically, we use the Linked Data paradigm to model geospatial data and then create a knowledge base for the visualization of such data in terms of scaling, data portrayal and geometry source. This approach benefits the transfer, interpretation and reuse of visualization knowledge for geospatial data. We also identify some challenges of modelling geospatial knowledge and of making such knowledge available to other domains as future work.
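A minimal sketch of the modelling style, assuming rdflib and the standard GeoSPARQL vocabulary; the feature, its coordinates and the visualization property (minDisplayZoom) are invented for illustration and are not taken from the paper's actual knowledge base.

```python
from rdflib import Graph, Literal, Namespace, RDF

GEO = Namespace("http://www.opengis.net/ont/geosparql#")
EX = Namespace("http://example.org/city#")   # hypothetical namespace

g = Graph()
g.bind("geo", GEO)
g.bind("ex", EX)

# A geospatial feature with a WKT geometry (GeoSPARQL style)
g.add((EX.BusStop42, RDF.type, GEO.Feature))
g.add((EX.BusStop42, GEO.hasGeometry, EX.BusStop42Geom))
g.add((EX.BusStop42Geom, RDF.type, GEO.Geometry))
g.add((EX.BusStop42Geom, GEO.asWKT,
       Literal("POINT(13.404954 52.520008)", datatype=GEO.wktLiteral)))

# Hypothetical visualization knowledge: show bus stops only above zoom 14
g.add((EX.BusStop42, EX.minDisplayZoom, Literal(14)))

print(g.serialize(format="turtle"))
```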
ABSTRACT. Answer set programming (ASP) is a non-monotonic declarative programming paradigm that is widely used for formulating problems in artificial intelligence. The ASP paradigm also provides a general framework for solving decision and optimization problems. The idea behind ASP is to represent a problem as a logic program and to solve it by computing stable models. In our work, we propose a new method for searching for the stable models of logic programs. This method is based on a relatively new semantics that has not yet been exploited, which captures and extends the semantics of stable models. The method performs a DPLL enumerative process only on a restricted set of literals called the strong back-door (STB). It has the advantage of using a Horn clause representation of the same size as the input logic program and of having constant space complexity. It avoids the overhead induced by loop management, from which most ASP solvers based on the Clark completion suffer.
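To make the notion of stable model concrete, here is a small brute-force sketch (not the proposed STB/DPLL method, which is far more refined): for each candidate set of atoms it computes the Gelfond-Lifschitz reduct of a ground normal program and keeps the candidates that reproduce themselves as the least model of their reduct. The example program is invented.

```python
from itertools import chain, combinations

# Each rule is (head, positive_body, negative_body), read as
#   head :- pos_1, ..., not neg_1, ...
# Example program:   p :- not q.    q :- not p.    r :- p.
program = [("p", [], ["q"]),
           ("q", [], ["p"]),
           ("r", ["p"], [])]
atoms = {"p", "q", "r"}

def minimal_model_of_reduct(program, candidate):
    """Gelfond-Lifschitz reduct w.r.t. the candidate, then the least
    model of the resulting negation-free program by fixpoint iteration."""
    reduct = [(head, pos) for (head, pos, neg) in program
              if not (set(neg) & candidate)]
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in reduct:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(program, atoms):
    subsets = chain.from_iterable(combinations(sorted(atoms), k)
                                  for k in range(len(atoms) + 1))
    return [set(s) for s in subsets
            if minimal_model_of_reduct(program, set(s)) == set(s)]

print(stable_models(program, atoms))   # [{'q'}, {'p', 'r'}]
```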
Concepts as Modalities in Description Logics (poster teaser)
ABSTRACT. Language phrases like "stone lion", "fake gun" and "glass gummy bear" inspire us to invent a new type of concept composition for description logics (DL). A set-theoretic approach by Klarman is considered but not adopted, since his logic does not allow conflicting possibilities, as demonstrated by the "glass gummy bear". Constraints are presented that should hold for the extended logic. A new approach extends the FOL embedding of DL and introduces a new dimension as a Kripke structure. Finally, further examples are discussed (the penguin does not fly and the veggie burger does not contain any meat).
Inducing Schema.org markup from Natural Language Context (poster teaser)
ABSTRACT. Schema.org creates, supports and maintains schemas for structured data on web pages. For a non-technical author, it is difficult to publish content in a structured format. The present work describes an automated way of inducing Schema.org markup from the natural-language content of web pages. This paper proposes an approach for inducing Schema.org markup from page content by applying knowledge base creation techniques. Web Data Commons is used as the dataset, and the scope of the experiments is limited to RDFa. In the initial stage, the approach is tested using the knowledge graph building techniques Knowledge Vault and KnowMore.
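As a small illustration of the target representation (the abstract's experiments use RDFa; JSON-LD is shown here only because it is more compact, and the article data is invented), this is the kind of Schema.org markup such induction would emit once entities and properties have been extracted from the page text:

```python
import json

# Hypothetical extraction result for a web page about an article
extracted = {"title": "Inducing Schema.org markup from text",
             "author": "A. Author",
             "published": "2018-09-24"}

# Corresponding Schema.org markup, serialized as JSON-LD
markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": extracted["title"],
    "author": {"@type": "Person", "name": extracted["author"]},
    "datePublished": extracted["published"],
}
print(json.dumps(markup, indent=2))
```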
An Ontology for Transportation System (poster teaser)
ABSTRACT. In this work, we present a domain ontology for a transportation system. We have developed an ontology for a semantics-aware transportation system from the perspective of a travelling user, capable of answering general competency questions such as the nearest bus stop to a particular place or the nearest available parking slots. We have studied the transportation systems of some of the world's big cities and have tried to come up with a vocabulary that can be applied to any city with few modifications. This vocabulary is further aligned with an upper-level ontology to provide a common starting point.
A Case-Based Inquiry into the Decision Model and Notation (DMN) and the Knowledge Base (KB) Paradigm
ABSTRACT. Modelling decisions in organisations is a challenging task. Deciding which modelling language to use for the problem at hand is a fundamental question. We investigate the Decision Model and Notation (DMN) standard and the IDP knowledge base system (KBS) in their effectiveness to model and solve specific real-life case problems. This paper presents two cases that are solved with DMN and IDP: (1) income taxation for foreign artists temporarily working in Belgium; and (2) registration duties when purchasing real estate in Belgium. The solutions in both DMN and IDP are examined, and the restrictions and opportunities of both methods are elaborated upon for each type of problem presented in the case studies. Additionally, compatibilities and synergies between DMN and IDP are identified.
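To give a flavour of the DMN modelling style, the sketch below evaluates a decision table with hit policy "unique" against case data. The brackets and rates are invented for illustration and do not reproduce the Belgian rules analysed in the paper.

```python
# Hypothetical DMN-style decision table for an income-tax rate,
# hit policy "unique": exactly one row must match any input.
# (The thresholds and rates are invented, not the actual regulation.)
def tax_rate(days_in_country: int, gross_income: float) -> float:
    table = [
        # (condition over the inputs, output rate)
        (lambda d, g: d <= 90 and g <= 10_000, 0.15),
        (lambda d, g: d <= 90 and g > 10_000, 0.18),
        (lambda d, g: d > 90,                 0.25),
    ]
    matches = [rate for cond, rate in table
               if cond(days_in_country, gross_income)]
    assert len(matches) == 1, "hit policy 'unique' violated"
    return matches[0]

print(tax_rate(30, 8_000))    # 0.15
print(tax_rate(120, 8_000))   # 0.25
```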
ABSTRACT. Constraint Handling Rules (CHR) is usually compiled to logic programming languages. While there are implementations for imperative programming languages such as C and Java, its most popular host language remains Prolog. In this paper we present CHR.js, a CHR system implemented in JavaScript that is suitable for both server-side and interactive client-side web applications. CHR.js provides (i) an interpreter, which is based on the asynchronous execution model of JavaScript, and (ii) an ahead-of-time compiler, which yields synchronous constraint solvers with better performance. Because of the great popularity of JavaScript, CHR.js is the first CHR system that runs on almost all devices, including mobile devices, without the need for an additional runtime environment. As an example application we present the CHR.js Playground, an offline-capable web interface that allows the interactive exploration of CHR in every modern browser.
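CHR programs rewrite a multiset of constraints with guarded rules until no rule fires. The sketch below emulates the classic two-rule gcd program on a list-based constraint store; it is only a conceptual illustration and does not use or reproduce CHR.js's actual API.

```python
# Classic CHR gcd program, emulated on a list-based constraint store:
#   gcd(0) <=> true.
#   gcd(N) \ gcd(M) <=> 0 < N, N =< M | gcd(M - N).
def solve_gcd(store):
    store = list(store)          # multiset of gcd/1 constraints
    changed = True
    while changed:
        changed = False
        if 0 in store:           # rule 1: remove gcd(0)
            store.remove(0)
            changed = True
            continue
        for i in range(len(store)):         # rule 2: simplify a pair
            for j in range(len(store)):
                if i != j and 0 < store[i] <= store[j]:
                    store[j] -= store[i]    # keep gcd(N), replace gcd(M)
                    changed = True
                    break
            if changed:
                break
    return store

print(solve_gcd([12, 8]))   # [4]  -- only the gcd survives in the store
```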
A First Order Logic Benchmark for Defeasible Reasoning Tool Profiling
ABSTRACT. In this paper we are interested in the task of a data engineer choosing which tool to use to perform defeasible reasoning over a first-order logic knowledge base. To this end, we propose the first benchmark in the literature that allows one to classify first-order defeasible reasoning tools based on their semantics, expressiveness and performance.
Computational Regulation of Medical Devices in PSOA RuleML
ABSTRACT. The registration and marketability of medical devices in Europe is governed by Regulation (EU) 2017/745 and by guidelines published in terms thereof. This work focuses on rules for the risk-based classification of medical devices as well as for the declaration of conformity of each class. We formalized the core rules of the EU regulation for medical devices in Positional-Slotted Object-Applicative (PSOA) RuleML, which resulted in a refinement and subgrouping of the original legal rules. The formalization led to an object-relational PSOA RuleML rulebase, which was supplemented by object-relational facts (data) about medical devices to form a knowledge base (KB). The KB represents knowledge by facts and rules, integrating RDF/F-logic-like graphs/frames ('objects') with Prolog-like relationships. We tested this open-source KB, Medical Devices Rules, by querying it in the open-source PSOATransRun system, which provided a feedback loop for refinement and extension. The aim of this formalization is to create a computational guideline that assists regulators, Notified Bodies (NBs), manufacturers, importers, distributors, and wholesalers of medical devices in the classification of medical devices. This can support the licensing process for these stakeholders and the registration of medical devices with a CE conformity mark and a "notified body number" affixed. Because the Medical Devices Rules KB is already in a computational rule format, it can become a pluggable component of smart contracts about medical devices.
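Purely as a hypothetical illustration of the fact-plus-rule style (the attributes and conditions below are invented and do not reproduce the actual classification rules of Regulation (EU) 2017/745 or the Medical Devices Rules KB), object-relational facts about a device can be queried by classification rules along these lines:

```python
# Hypothetical slot-filled fact ('object') describing a device
device = {"id": "dev1", "invasive": True,
          "duration": "transient", "active": False}

# Hypothetical classification rules -- NOT the real EU 2017/745 logic
def classify(d):
    if d["invasive"] and d["duration"] == "long-term":
        return "Class IIb"
    if d["invasive"]:
        return "Class IIa"
    return "Class I"

print(classify(device))   # Class IIa
```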
Nuance Reasoning Framework: A Rule-Based System for Semantic Query Rewriting
ABSTRACT. We present the Nuance Reasoning Framework (NRF), a generic rule-based framework for semantic query rewriting and reasoning that is used by Nuance Communications Inc. in speech-enabled conversational virtual-assistant solutions for numerous automotive OEMs. We focus on the semantic rewriting task performed by NRF, which bridges the conceptual mismatch between the natural-language front end of automotive virtual assistants and their back-end databases and personalizes the results to the driver. We also describe several of its features, such as rewriter arbitration and query mediation.
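Conceptually, rule-based semantic rewriting maps a front-end interpretation of an utterance onto the back-end schema while folding in the driver's context. The sketch below is hypothetical: none of the rule names, fields or the first-match arbitration strategy are taken from NRF itself.

```python
# Hypothetical semantic rewrite rules: each maps a front-end query
# pattern to a back-end query, possibly using the driver's profile.
def personalize_cuisine(query, profile):
    if query.get("intent") == "find_restaurant" and "cuisine" not in query:
        return {**query, "cuisine": profile.get("preferred_cuisine")}

def normalize_fuel(query, profile):
    if query.get("intent") == "find_gas_station":
        return {**query, "fuel_type": profile.get("vehicle_fuel", "petrol")}

RULES = [personalize_cuisine, normalize_fuel]

def rewrite(query, profile):
    """Apply the first applicable rule (a stand-in for rule arbitration)."""
    for rule in RULES:
        rewritten = rule(query, profile)
        if rewritten is not None:
            return rewritten
    return query

print(rewrite({"intent": "find_restaurant", "near": "route"},
              {"preferred_cuisine": "italian"}))
# {'intent': 'find_restaurant', 'near': 'route', 'cuisine': 'italian'}
```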
A rule-based eCommerce methodology for the IoT using trustworthy Intelligent Agents and Microservices
ABSTRACT. The impact of the Internet of Things will dramatically transform not only people's everyday lives but also business and the economy. This huge network of intercommunicating heterogeneous Things is expected to revolutionize the commerce industry by driving innovation and new opportunities in the near future. Yet this open, distributed and heterogeneous environment raises important challenges. Old eCommerce practices cannot adequately meet the demands of the Internet of Things, while trustworthiness issues arise. In this context, this study proposes a rule-based eCommerce methodology that allows Things, devices, services and humans, to trade safely on the network. To this end, the methodology represents Things as Intelligent Agents, since agents offer an alternative to traditional interactions among people and objects and are, at the same time, the subject of a rich research effort on trust management. The methodology combines Intelligent Agents with the microservice architecture in order to deal with the heterogeneity of Things, while it adopts a social agent-based trust model for the Internet of Things. Additionally, well-known semantic technologies such as RuleML and defeasible logic are adopted in order to maximize interoperability among Things. Furthermore, in order to deal with issues related to rule exchange where no common syntax is used, the methodology is integrated with an appropriate multi-agent knowledge-based framework. Finally, an eCommerce use case scenario is presented that illustrates the viability of the proposed approach.
Learning condition-action rules for personalised journey recommendations
ABSTRACT. We apply a learning classifier system (specifically XCSI) to the task of learning individual passenger preferences, expressed as condition-action rules, in order to provide personalised suggestions for their onward journey. Learning classifier systems are adaptive systems that function in complex environments, altering their internal structure to achieve a goal through interactions with the environment. The proposed XCSI system is embedded within a simulated environment of passengers travelling around the London Underground network, subject to disruption. We show that XCSI is able to learn passenger preferences and use them to provide reasonably accurate recommendations on multi-modal transport options, without substantial parameter modification. Further parameter adjustments yield higher performance, and further structural adjustments remain possible, suggesting opportunities for improvement in future work.
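In XCS-style systems a rule's condition is a string over {0, 1, #}, where # matches either input bit; the action is the recommendation, and each rule carries a payoff prediction that is updated from feedback. The matching sketch below is minimal and illustrative only: the bit encoding of journey features, the rules and the winner-takes-all action selection are all simplifications, not the XCSI configuration used in the paper.

```python
# Minimal XCS-style condition-action rules. Conditions are strings over
# {'0', '1', '#'}; '#' is a wildcard matching either input bit.
# The bit encoding of journey features below is purely hypothetical.
population = [
    # (condition, action, predicted payoff)
    ("1#0", "take_bus",  800.0),   # e.g. line disrupted, not raining
    ("1#1", "take_taxi", 650.0),
    ("0##", "take_tube", 900.0),
]

def matches(condition, state):
    return all(c in ("#", s) for c, s in zip(condition, state))

def recommend(state):
    """Form the match set and pick the action with the highest prediction."""
    match_set = [(a, p) for cond, a, p in population if matches(cond, state)]
    return max(match_set, key=lambda ap: ap[1])[0] if match_set else None

print(recommend("100"))   # take_bus
print(recommend("011"))   # take_tube
```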