FLOC 2022: FEDERATED LOGIC CONFERENCE 2022
DL PROCEEDINGS: PAPERS WITH ABSTRACTS

Editors: Martin Homola, Jean Christoph Jung and Marie-Laure Mugnier

Authors, Title and Abstract | Paper | Talk

ABSTRACT. While semantic approaches to revising knowledge bases are fine-grained and independent of syntactic form, they cannot be straightforwardly applied under the standard semantics of DLs. In this paper, we present a characterization of KB revision in DL under the fixed-domain semantics, where the domain is fixed and finite to accommodate knowledge bases adopting a closed-world assumption. We also introduce an instantiation of a model-based revision operator which satisfies all standard postulates, using the notion of distance between interpretations. The model set of the revision result is shown to be expressible as a KB in our setting. In addition, by weakening the KB based on certain domain elements, an individual-based revision operator is provided as an alternative approach.
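
For orientation, a minimal sketch (not the operator defined in the paper) of how a distance between interpretations can drive model-based revision over a fixed, finite domain; interpretations are represented as sets of atomic facts and distance as the size of their symmetric difference:

    # Minimal sketch: distance-based selection of revision models.
    # Interpretations over the fixed, finite domain are frozensets of
    # atomic facts; all DL-specific machinery is elided.
    def distance(i1, i2):
        """Number of atomic facts on which two interpretations differ."""
        return len(i1 ^ i2)

    def revise(models_of_kb, models_of_new):
        """Keep those models of the new information that are closest
        to some model of the original knowledge base."""
        def d(n):
            return min(distance(m, n) for m in models_of_kb)
        best = min(d(n) for n in models_of_new)
        return {n for n in models_of_new if d(n) == best}

    old_models = {frozenset({"A(a)", "A(b)"})}
    new_models = [frozenset({"A(a)"}), frozenset()]
    print(revise(old_models, new_models))  # {frozenset({'A(a)'})}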

Aug 08 11:00

ABSTRACT. We introduce a framework that allows for the construction of sequent systems for expressive description logics extending ALC. Our framework not only covers a wide array of common description logics, but also allows for sequent systems to be obtained for extensions of description logics with special formulae that we call role relational axioms. All sequent systems are sound, complete, and possess favorable properties such as height-preserving admissibility of common structural rules and height-preserving invertibility of rules.
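
For orientation only (the systems in the paper may be formulated differently), a typical labelled-sequent rule for value restriction reads: from Γ, x r y ⇒ Δ, y:C one may infer Γ ⇒ Δ, x:∀r.C, provided the label y is fresh.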

Aug 09 10:15

ABSTRACT. This paper is about integrating, in a single formalism, knowledge representation languages such as those provided by description logics and rule-based reasoning paradigms such as those provided by logic programming languages. We aim at creating a hybrid formalism where description logic constructs are used for defining concepts that are given as arguments to the predicates of the logic programs.

Aug 08 15:05

ABSTRACT. We present in this paper a reformulation of the usual set-theoretical semantics of the description logic ALC with general TBoxes by using categorical language. In this setting, ALC concepts are represented as objects, concept subsumptions as arrows, and memberships as logical quantifiers over objects and arrows of categories. Such a category-based semantics provides a more modular representation of the semantics of ALC. This feature allows us to define a sublogic of ALC by dropping the interaction between existential and universal restrictions, which would be responsible for an exponential complexity in space. Such a sublogic is not definable under the usual set-theoretical semantics. We show that this sublogic is in PSpace by proposing a deterministic algorithm for checking concept satisfiability that runs in polynomial space.
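
To make the object/arrow reading concrete (a schematic illustration, not the paper's full construction): a subsumption C ⊑ D becomes an arrow f : C → D, and transitivity of subsumption corresponds to arrow composition, since f : C → D and g : D → E compose to g ∘ f : C → E, mirroring that C ⊑ D and D ⊑ E entail C ⊑ E.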

Aug 10 12:15

ABSTRACT. We present two alternative algorithms for computing (all or some) solutions to the concept abduction problem: one algorithm is based on Reiter's hitting set tree algorithm, whereas the other relies on a SAT encoding. In contrast to previous work, the algorithms do not rely on a refutation-based calculus and, hence, can also be used with efficient reasoners for tractable DLs such as EL and its extensions. An adaptation to other forms of (logic-based) abduction, e.g., to ABox abduction, is also possible.

Aug 09 14:00

ABSTRACT. Data preparation is a key step for process mining. In this paper, we show how to leverage the Virtual Knowledge Graph approach for extracting event logs from data in relational databases. This approach is implemented in the OnProm system, and supports both the IEEE Standard for eXtensible Event Stream (XES) and the recently proposed standard Object-Centric Event Logs (OCEL).

Process mining is a family of techniques that supports the analysis of operational processes based on event logs. Among the existing event log formats, the IEEE standard eXtensible Event Stream (XES) is the most widely adopted. In XES, each event must be related to a single case object, which may lead to convergence and divergence problems. To solve such issues, object-centric approaches become promising, where objects are the central notion and one event may refer to multiple objects. In particular, the Object-Centric Event Logs (OCEL) standard has been proposed recently. However, the crucial problem of extracting OCEL logs from external sources is still largely unexplored. In this paper, we try to fill this gap by leveraging the Virtual Knowledge Graph (VKG) approach to access data in relational databases. We have implemented this approach in the OnProm system, extending it from XES to OCEL support.
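
To see the contrast with XES's single-case requirement, an object-centric event can be pictured schematically as follows (an illustrative sketch only; the attribute and key names do not follow the exact OCEL serialization):

    # Schematic object-centric event: a single event refers to several
    # objects (an order, an item, and a customer), whereas in XES it
    # would have to be attached to exactly one case.
    event = {
        "id": "e1",
        "activity": "confirm order",
        "timestamp": "2022-08-10T11:50:00",
        "objects": ["order:o1", "item:i7", "customer:c3"],  # multiple objects
    }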

Aug 10 11:50

ABSTRACT. As abduction is getting more attention in the world of ontologies, multiple abduction solvers for DL have been developed. So far, however, there has been no attempt at a unified API that would make it possible to integrate any DL abduction solver into an application – much in the spirit of the well-known OWL API that is now implemented by most deductive DL reasoners. Abstracting the common functionality of multiple DL abduction solvers, we introduce the DL Abduction API, which we hope can help to fill this gap.

Aug 09 15:15

ABSTRACT. Minimal Hitting Set (MHS) is a well-known and complete method to compute all explanations of an ABox abduction problem. MHS is NP-complete and widely recognized as inefficient. MergeXplain (MXP), on the other hand, is fast, but does not guarantee to find all explanations. MHS-MXP is a hybrid algorithm which adopts the divide-and-conquer heuristic of MXP and combines it with MHS. MHS-MXP is complete and – at least on a part of the inputs – more efficient than MHS. We describe a favourable class of inputs for the hybrid algorithm, and an experimental implementation which enables us to perform a first preliminary empirical evaluation on this class of inputs.
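
For orientation, the combinatorial core that MHS (and hence MHS-MXP) builds on can be sketched as follows (a naive enumeration of subset-minimal hitting sets, not the MHS-MXP algorithm itself):

    from itertools import chain, combinations

    def minimal_hitting_sets(conflicts):
        """Naively enumerate the subset-minimal sets that intersect
        every set in `conflicts`."""
        universe = sorted(set(chain.from_iterable(conflicts)))
        hitting = []
        for k in range(len(universe) + 1):
            for cand in map(set, combinations(universe, k)):
                if all(cand & c for c in conflicts) and \
                        not any(h <= cand for h in hitting):
                    hitting.append(cand)
        return hitting

    print(minimal_hitting_sets([{"a", "b"}, {"b", "c"}]))
    # [{'b'}, {'a', 'c'}]  (element order within sets may vary)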

Aug 08 12:15

ABSTRACT. It is known that the problems of deciding the existence of Craig interpolants and of explicit definitions of concepts are both 2ExpTime-complete for standard description logics with nominals and/or role inclusions. These complexity results depend on the presence of an ontology. In this article, we first consider the case without ontologies (or, in the case of role inclusions, ontologies only containing role inclusions) and show that both the existence of Craig interpolants and of explicit definitions of concepts become coNExpTime-complete for DLs such as ALCO and ALCH. Secondly, we make a few observations regarding the size and computation of interpolants and explicit definitions, both with ontologies and without.

Aug 09 09:00

ABSTRACT. The authoring of complex concepts or (instance) queries written in a description logic (DL) can be a difficult task. An established approach to generate such concepts from positive examples is to employ the most specific concept (msc), which generalizes an ABox individual into a concept, and the least common subsumer (lcs), which generalizes a collection of concepts into a single concept. These inferences have been investigated for EL, but so far there are no methods for two-dimensional cases such as temporalized DLs. In this abstract we report on our computation algorithms for the lcs and the msc in a temporalized DL: EL extended by the LTL operators next and global. We sketch the computation algorithms for both inferences, with and without the use of rigid symbols.
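
A standard non-temporal example of the second inference, for orientation (not taken from the paper): w.r.t. the empty TBox, lcs(A ⊓ ∃r.B, A ⊓ ∃r.C) = A ⊓ ∃r.⊤ whenever B and C have no common subsumer other than ⊤; the paper lifts such constructions to EL extended with the LTL operators next and global.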

Aug 08 16:50

ABSTRACT. Weighted Threshold Operators are n-ary operators that compute a weighted sum of their arguments and verify whether it reaches a certain threshold. They have been extensively studied in the area of circuit complexity theory, as well as in the neural network community under the name of perceptrons. In Knowledge Representation, they have been introduced in the context of standard DL languages by adding a new constructor, the tooth operator. Tooth operators have been shown to behave like linear classification models. Thus, they can play a role in bridging symbolic and sub-symbolic reasoning approaches. In particular, tooth expressions can provide a powerful yet natural tool to represent local explanations of black-box classifiers in the context of Explainable AI. In this paper, we present the results of a user study in which we evaluated the interpretability of tooth expressions and compared them with Disjunctive Normal Forms (DNF). In the user study, we asked respondents with different backgrounds to perform distinct classification tasks using concepts represented either as tooth expressions or as different types of DNF formulas. We evaluated interpretability through accuracy, response time, confidence, and perceived understandability by human users. We expected tooth expressions to be generally more interpretable than DNFs. In line with our hypothesis, the study revealed that tooth expressions are generally faster to use, and that they are perceived as more understandable by users who are less familiar with logic. Our study also showed that the type of task, the type of DNF, and the background of the respondents affect the interpretability of the formalism used.
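
Operationally, tooth membership is just a thresholded weighted sum (a minimal sketch; the concrete tooth syntax used in the paper may differ):

    def in_tooth_concept(weighted_concepts, threshold, satisfied):
        """An individual belongs to the threshold ('tooth') concept iff the
        weights of the argument concepts it satisfies sum to at least the
        threshold."""
        score = sum(w for concept, w in weighted_concepts if concept in satisfied)
        return score >= threshold

    weighted = [("C1", 0.6), ("C2", 0.9), ("C3", 0.5)]
    print(in_tooth_concept(weighted, 1.0, {"C1", "C3"}))  # True  (0.6 + 0.5 >= 1.0)
    print(in_tooth_concept(weighted, 1.0, {"C1"}))        # False (0.6 < 1.0)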

Aug 09 15:30

ABSTRACT. In reverse engineering of database queries, one aims to construct a query from a set of positively and negatively labelled answers and non-answers. The query can then be used to explore the data further or as an explanation of the answers and non-answers. We consider this reverse engineering problem for queries formulated in various fragments of positive linear temporal logic LTL over data instances given by timestamped atomic concepts. We focus on the design of suitable query languages and the complexity of the separability problem: ‘does there exist a query in the given query language that separates the given answers from the non-answers?’. We deal with both plain LTL queries and those that are mediated by ontologies providing background knowledge and formulated in fragments of clausal LTL.
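
A toy instance of the separability question, for illustration only (not from the paper): given a positive data instance with A at time 0 and B at time 1, and a negative instance with A at time 0 and B only at time 3, the query ◇(A ∧ ○B) separates them, since it is satisfied by the first instance but not by the second.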

Aug 07 10:05

ABSTRACT. Learning, in Angluin’s framework of exact learning, a query in the presence of a description logic ontology often involves as a crucial (iterated) step the generalization of a hypothesis query. This has been done, for example, by constructing a least general common generalization of the hypothesis and a counterexample that was provided by the oracle. In this research note, we observe that it is useful to instead use a more liberal construction that uses the counterexample as a guide to produce a generalization of the hypothesis, but does not necessarily generalize the counterexample. This approach allows us to learn in polynomial time ELI concept queries in the presence of DL-LiteF-Horn ontologies.

Aug 08 16:25

ABSTRACT. The importance of taking individual, potentially conflicting perspectives into account when dealing with knowledge has been widely recognised. Many existing ontology management approaches fully merge knowledge perspectives, which may require weakening in order to maintain consistency; others represent the distinct views in an entirely detached way. As an alternative, we propose Standpoint Logic, a simple yet versatile generic approach to extend existing KR formalisms by the capability to express domain knowledge relative to diverse, possibly conflicting standpoints, which can be hierarchically organised, combined, and put in relation with each other. As a concrete showcase, this extended abstract introduces the standpoint-enhanced version of the very expressive description logic SROIQbs, which is tightly connected to the W3C-standardised ontology language OWL 2 DL. We report that, by virtue of a “small model property” and using some elaborate encoding tricks, it is possible to establish a polytime translation from standpoint-enhanced SROIQbs into plain SROIQbs. Thanks to this result, existing highly optimised OWL reasoners can be used off the shelf to provide practical reasoning support for ontology languages from the OWL family extended by standpoint modelling.
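
As a flavour of standpoint-relative modelling (an illustrative example using the box notation common in standpoint logics; the concrete syntax of standpoint-enhanced SROIQbs may differ): two standpoints s1 and s2 can simultaneously assert □_{s1}(Tomato ⊑ Fruit) and □_{s2}(Tomato ⊑ Vegetable) without being merged into a single classification, since each inclusion is only required to hold from the perspective of its own standpoint.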

Aug 07 16:50

ABSTRACT. A categorical approach to studying model comparison games in terms of comonads was recently initiated by Abramsky et al. In this work, we analyse games that appear naturally in the context of description logics and knowledge representation. We consider expressive sublogics of ALCIOQbSelf, namely, the logics that extend ALC with any combination of inverses, nominals, number restrictions, safe boolean role combinations and the Self operator. Our construction augments and modifies the so-called modal comonad by Abramsky and Shah. Our approach relies heavily on the use of relative monads, which we leverage to encapsulate additional capabilities within the bisimulation games in a compositional manner.

Aug 08 14:50

ABSTRACT. When working with description logic ontologies, understanding entailments derived by a description logic reasoner is not always straightforward. So far, the standard ontology editor Protégé offers two services to help: (black-box) justifications for OWL 2 DL ontologies, and (glass-box) proofs for lightweight OWL EL ontologies, where the latter exploits the proof facilities of the reasoner ELK. Since justifications are often insufficient for explaining inferences, there is only little tool support for explaining inferences in more expressive DLs. In this paper, we introduce EVEE-LIBS, a Java library for computing proofs for DLs up to ALCH, and EVEE-PROTEGE, a collection of Protégé plugins for displaying those proofs in Protégé. We also give a short glimpse of the latest version of EVONNE, a more advanced standalone application for displaying and interacting with proofs computed with EVEE-LIBS.

Aug 09 14:25

ABSTRACT. In this paper we present a new labelled Natural Deduction calculus for the logic iALC, an intuitionistic description logic with nominals originally designed to reason over laws and other normative sentences in general. Even though this logic already has a formalised Sequent Calculus system, in practice Natural Deduction is better suited to making proofs explainable, which is further aided by our use of labels. Finally, we prove soundness and normalisation for the new Natural Deduction system, and show its completeness.

Aug 09 15:45

ABSTRACT. In ontology-mediated query answering, access to incomplete data sources is mediated by a conceptual layer constituted by an ontology. To correctly compute answers to queries, it is necessary to perform complex reasoning over the constraints expressed by the ontology. In the literature there exists a multitude of techniques for incorporating the ontological knowledge into queries. However, few of these approaches were designed with comprehensibility of the query answers in mind. In this article, we try to bring these two aspects together by adapting, for conjunctive query answering, a proof framework originally applied to axiom entailment. We investigate the data and combined complexity of determining the existence of a proof below a given quality threshold, which can be measured in different ways. By distinguishing various parameters such as the shape of a query, we obtain an overview of the complexity of this problem for the lightweight ontology languages DL-Lite_R and EL.

Aug 07 09:40

ABSTRACT. State constraints in AI Planning globally restrict the legal environment states. Standard planning languages make closed-domain and closed-world assumptions. Here we address open-world state constraints formalized by planning over a description logic (DL) ontology. Previously, this combination of DL and planning has been investigated for the lightweight DL DL-Lite. Here we propose a novel compilation scheme into standard PDDL with derived predicates, which applies to more expressive DLs and is based on the rewritability of DL queries into Datalog with stratified negation. We also provide a new rewritability result for the DL Horn-ALCHOIQ, which allows us to apply our compilation scheme to quite expressive ontologies. In contrast, we show that in the slight extension Horn-SROIQ no such compilation is possible unless the weak exponential hierarchy collapses. Finally, we show that our approach can outperform previous work on existing benchmarks for planning with DL ontologies, and is feasible on new benchmarks taking advantage of more expressive ontologies. This is an abstract of a paper accepted at AAAI-22.

Aug 09 12:00

ABSTRACT. Circumscription is one of the major approaches to bringing non-monotonic (common-sense) reasoning features to first-order logic and related formalisms. In a nutshell, when computing logical entailments from circumscribed first-order theories, we focus on a restricted subset of the theory’s classical models, so that the less plausible models (from the common-sense perspective) are eliminated. Circumscription has been studied for knowledge bases expressed in various Description Logics (DLs), with a focus on understanding the computational complexity of reasoning. Those studies revealed that circumscription causes a dramatic increase in computational complexity in a broad range of DLs. Roughly speaking, this is due to the second-order quantification step that checks the non-existence of a “better” model, e.g., one where some specific predicate that we want to minimize has a smaller extension. In this paper, we consider a new notion of circumscription in DLs, aiming to preserve the key ideas and advantages of classical circumscription while mitigating its impact on the computational complexity of reasoning. Our main idea is to replace the second-order quantification step with a series of (pointwise) local checks on all domain elements and their immediate neighborhood. This approach provides a sound approximation of classical circumscription and is closely related to the notion of pointwise circumscription proposed by Lifschitz for first-order logic. We formalize several variants of pointwise circumscription in DLs and perform a preliminary study of computational complexity in this setting. Our main achievement is to show that, under certain syntactic restrictions, standard reasoning problems like subsumption testing or concept satisfiability for ALCIO knowledge bases with pointwise circumscription are (co)NEXPTIME-complete.

Aug 08 11:50

ABSTRACT. We show how JSON documents can be abstracted as concept descriptions in an appropriate description logic. This representation allows the use of additional background knowledge in the form of a TBox and an assignment of referring expression types (RETs) to certain primitive concepts to detect situations in which subdocuments, perhaps multiple subdocuments located in various parts of the original documents, capture information about a particular conceptual entity. Detecting such situations allows for normalizing the JSON document into several separate documents that capture all information about such conceptual entities in separate documents. This transformation preserves all the original information present in the input documents. The RET assignment contributes a set of possible concept descriptions that enables a more refined and normalized capture of documents, and leads to answers to queries that better adhere to user expectations expressed as RETs. We also show how RETs allow checking for a document admissibility condition that ensures that the documents describe a single conceptual entity.

Aug 10 11:25

ABSTRACT. Dealing with context-dependent knowledge has led to different formalizations of the notion of context. Among them is the Contextualized Knowledge Repository (CKR) framework, which is rooted in description logics but, on the reasoning side, is strongly linked to logic programs and Answer Set Programming (ASP) in particular. The CKR framework caters for reasoning with defeasible axioms and exceptions in contexts, which was extended to knowledge inheritance across contexts in a coverage (specificity) hierarchy. However, the approach supports only this single type of contextual relation, and the reasoning procedures work only for restricted hierarchies, due to non-trivial issues with model preference under exceptions. In this paper, we overcome these limitations and present a generalization of CKR hierarchies to multiple contextual relations, along with their interpretation of defeasible axioms and preference. To support reasoning, we use ASP with algebraic measures, which is a recent extension of ASP with weighted formulas over semirings that allows one to associate quantities with interpretations depending on the truth values of propositional atoms. Notably, we show that for a relevant fragment of CKR hierarchies with multiple contextual relations, query answering can be realized efficiently with the popular asprin framework. The algebraic measures approach is more powerful and enables, e.g., reasoning with epistemic queries over CKRs, which opens interesting perspectives for the use of quantitative ASP extensions in other applications. The paper has been presented at the 37th International Conference on Logic Programming (ICLP 2021).

Aug 07 15:00

ABSTRACT. Classical instance queries over an ontology only consider explicitly named individuals. Concept referring expressions (CREs) also allow for returning answers in the form of concepts that describe implicitly given individuals in terms of their relation to an explicitly named one. Existing approaches, e.g., based on tree automata, can neither be integrated into state-of-the-art OWL reasoners nor are they directly amenable to efficient implementation. To address this, we devise a novel algorithm that uses highly optimized OWL reasoners as a black box. In addition to the standard criteria of singularity and certainty for CREs, we devise and consider the criterion of uniqueness of CREs for Horn ALC ontologies. The evaluation of our prototypical implementation shows that computing CREs for the most general concept (⊤) can be done in less than one minute for ontologies with thousands of individuals and concepts.
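
An example of the kind of answer this enables, for illustration only (the exact CRE language used in the paper may differ): if the ontology entails that mary has a parent that is never named in the data, a query for persons can still return that individual via a referring expression such as Person ⊓ ∃hasParent⁻.{mary}, read as "a person that is a parent of mary", rather than omitting it for lack of a name.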

Aug 10 11:00

ABSTRACT. Propositional SAT solvers have been a popular way of computing justifications for ontological entailment -- minimal subsets of axioms of the ontology that entail a given conclusion. Most SAT encodings proposed for Description Logics (DLs) translate the inferences obtained by a consequence-based procedure into propositional Horn clauses, with which entailment from subsets of axioms can be checked effectively, and use modified SAT solvers to systematically search over these subsets. To avoid repeated discovery of subsets with already checked entailment, the modified SAT solvers add special blocking clauses that prevent generating truth assignments corresponding to these subsets; the number of such clauses can be exponential, even if the number of justifications is small. In this paper, we propose alternative SAT encodings that avoid the generation of unnecessary blocking clauses. Unlike in previous methods, the inferences are used not only for checking entailment from subsets of axioms, but also, as part of the encoding, to ensure that the SAT solver generates truth assignments corresponding only to justifications.
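
For orientation, the blocking-clause pattern that the paper sets out to avoid can be sketched as follows (a deliberately naive loop with a placeholder entailment check; it requires the python-sat package and is not the encoding proposed in the paper):

    from pysat.solvers import Minisat22

    def subsets_entailing_goal(n_axioms, entails):
        """Selector variable i encodes 'axiom i is in the candidate subset'.
        Every assignment returned by the solver is checked with the placeholder
        `entails` callback and then blocked, so it is never proposed again."""
        found = []
        with Minisat22() as solver:
            solver.add_clause(list(range(1, n_axioms + 1)))  # at least one axiom
            while solver.solve():
                model = solver.get_model()
                subset = frozenset(v for v in model if v > 0)
                if entails(subset):
                    found.append(subset)
                solver.add_clause([-lit for lit in model])  # blocking clause
        return found

    # Toy run: three axioms; the goal follows from any subset containing axiom 2.
    print(subsets_entailing_goal(3, lambda s: 2 in s))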

Aug 09 14:50

ABSTRACT. Forgetting is an important ontology extraction technique. A variant of forgetting which has received significant attention in the literature is deductive forgetting. While deductive forgetting is attractive because it generates the forgetting view in a language with the same complexity as the language of the original ontology, it is known to be imprecise, as it may not preserve information whose expression requires a more complex language. We study deductive forgetting with the aim of understanding the unpreserved information. We present a system that performs deductive forgetting and produces a set of axioms Δ representing the unpreserved information in the forgetting view. Our system allows a new fine-grained ontology extraction process that gives the user the option to enhance the informativeness of the deductive forgetting view by appending to it axioms from Δ.

Aug 09 09:50

ABSTRACT. We formally introduce ontology-based data federation (OBDF), to denote a framework combining ontology-based data access (OBDA) with a data federation layer, which virtually exposes multiple heterogeneous sources as a single relational database. In this setting, the SQL queries generated by the OBDA component by translating user SPARQL queries are further transformed by the data federation layer so as to be efficiently executed over the data sources. The structure of these SQL queries directly affects their execution time in the data federation layer, and their optimization is crucial for performance. We propose here novel optimizations specific to OBDF, which are based on "hints" about existing data redundancies in the sources, empty join operations, and the need for materialized views. Such hints can be systematically inferred by analyzing the OBDA mappings and ontology and exploited to simplify the query structure. We also carry out an experimental evaluation in which we show the effectiveness of our optimizations.

Aug 08 14:25

ABSTRACT. This extended abstract briefly summarizes our recent work on extending the study of counting queries to Horn description logics outside the DL-Lite family. Through a combination of novel techniques, adaptations of existing constructions, and new connections to closed predicates, we achieve a complete picture of the data and combined complexity of answering counting conjunctive queries (CCQs) and cardinality queries (a restricted class of CCQs) in ELHI⊥ and its various sublogics.

Aug 07 09:15

ABSTRACT. The inexpressive Description Logic (DL) FL0, which has conjunction and value restriction as its only concept constructors, had fallen into disrepute when it turned out that reasoning in FL0 w.r.t. general TBoxes is ExpTime-complete, that is, as hard as in the considerably more expressive logic ALC. In this paper, published in the journal Theory and Practice of Logic Programming, we rehabilitate FL0 by presenting a dedicated subsumption algorithm for FL0, which is much simpler than the tableau-based algorithms employed by highly optimized DL reasoners. Our experiments show that the performance of our novel algorithm, as prototypically implemented in our FL0wer reasoner, compares very well with that of the highly optimized reasoners. FL0wer can also deal with ontologies written in the extension FLbot of FL0 with the top and the bottom concept by employing a polynomial-time reduction, shown in this paper, which eliminates top and bottom. We also investigate the complexity of reasoning in DLs related to the Horn-fragments of FL0 and FLbot.

Aug 08 16:00

ABSTRACT. We investigate practical algorithms for inconsistency-tolerant query answering over prioritized knowledge bases, which consist of a logical theory, a set of facts, and a priority relation between conflicting facts. We consider three well-known semantics (AR, IAR and brave) based upon two notions of optimal repairs (Pareto and completion). Deciding whether a query answer holds under these semantics is (co)NP-complete in data complexity for a large class of logical theories, and SAT-based procedures have been devised for repair-based semantics when there is no priority relation, or the relation has a special structure. The present paper introduces the first SAT encodings for Pareto- and completion-optimal repairs w.r.t. general priority relations and proposes several ways of employing existing and new encodings to compute answers under (optimal) repair-based semantics, by exploiting different reasoning modes of SAT solvers. The comprehensive experimental evaluation of our implementation compares both (i) the impact of adopting semantics based on different kinds of repairs, and (ii) the relative performances of alternative procedures for the same semantics.

Aug 08 14:00

ABSTRACT. Ontology classification is a central DL reasoning task and is supported by several highly optimised reasoners for OWL ontologies. Different notions of modularity, including the atomic decomposition (AD), have already been exploited by different modular reasoners. In our previous work, we designed and implemented a new AD-informed and MORe-inspired algorithm that uses HermiT and ELK as delegate reasoners, but avoids any duplicate subsumption tests between these two reasoners. In this paper, we push the algorithm further with easyfication and parallelization. We empirically evaluate our algorithm with a set of SNOMED CT extension ontologies and a corpus of BioPortal ontologies. We also design, implement and empirically evaluate a new modular reasoner, called Crane, which works with a "coarsened" AD.

Aug 10 10:00

ABSTRACT. Abduction is the task of finding possible extensions of a given knowledge base that would make a given sentence logically entailed. As such, it can be used to explain why the sentence does not follow, to repair incomplete knowledge bases, and to provide possible explanations for unexpected observations. We consider TBox abduction in the lightweight description logic EL, where the observation is a concept inclusion and the background knowledge is a description logic TBox. To avoid useless answers, such problems usually come with further restrictions on the solution space and/or minimality criteria that help separate the wheat from the chaff. We argue that existing minimality notions are insufficient, and introduce connection minimality. This criterion rejects hypotheses that use concept inclusions “disconnected” from the problem at hand. We show how to compute a special class of connection-minimal hypotheses in a sound and complete way. Our technique is based on a translation to first-order logic, and constructs hypotheses based on prime implicates. We evaluate a prototype implementation of our approach on ontologies from the medical domain.

Aug 08 10:00

ABSTRACT. In this extended abstract we report on an approach for reasoning about actions with domain descriptions including an EL⊥ ontology in a temporal action theory. The action theory is based on a Dynamic Linear Time Temporal Logic, and extensions are defined through temporal answer sets. The work provides conditions under which action consistency can be guaranteed with respect to an EL⊥ ontology, by polynomially encoding an EL⊥ knowledge base into the domain description of the temporal action theory.

Aug 08 17:15

ABSTRACT. (This is an extended abstract of our paper accepted at ESWC 2022.)

Aug 09 09:25