Conference Chair: Giuseppe Sanfilippo
Program Chairs: Sébastien Destercke, Vanina Martinez,
Rector: Massimo Midiri (To be confirmed)
Vice-Rector for Education and International Affairs: Fabio Mazzola
Head of the Department of Mathematics and Computer Science: Cinzia Cerroni
Quid Restaurant
10:50 | PRESENTER: Salvador Madrigal Castillo ABSTRACT. Multi-label classification (MLC) is a supervised learning problem where each instance can be associated with none, one, or multiple labels. MLC has received increasing attention due to its wide range of applications, such as text categorization and medical diagnosis. Despite a rich literature on MLC, handling imbalanced data, often encountered in real-world MLC datasets, has not been tackled satisfactorily. Based on a thorough literature review, it appears that existing methods for imbalanced MLC are either hard to couple with sound theoretical guarantees or of limited scalability. This paper discusses the potential (dis)advantages of existing methods for imbalanced MLC when coupled with the Binary Relevance Classifier (BRC), and introduces the Discrete Minimax BRC (DMBRC), a promising attempt to robustify the BRC by leveraging the theoretically sound properties of the Discrete Minimax Classifier. We also provide empirical evidence to illustrate how DMBRC can help balance the label-wise error rates. Finally, we envision future work on further strengthening DMBRC with respect to both label-wise error rates and conventional MLC evaluation metrics.
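For readers unfamiliar with the binary relevance decomposition the abstract builds on, a minimal sketch follows (assuming scikit-learn; DMBRC itself is not reproduced here, and class_weight="balanced" is only a simple placeholder for imbalance handling):

```python
# Minimal sketch of Binary Relevance for multi-label classification (not DMBRC):
# one binary classifier per label; label-wise error rates are reported at the end.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, Y = make_multilabel_classification(n_samples=500, n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

classifiers = []
for j in range(Y_tr.shape[1]):                      # one classifier per label
    clf = LogisticRegression(max_iter=1000, class_weight="balanced")
    clf.fit(X_tr, Y_tr[:, j])
    classifiers.append(clf)

Y_pred = np.column_stack([clf.predict(X_te) for clf in classifiers])

# Label-wise error rates: the quantity DMBRC aims to balance.
print((Y_pred != Y_te).mean(axis=0))
```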
11:10 | PRESENTER: Vincenzo Taormina ABSTRACT. Recent advances in deep learning have often surpassed human performance in image classification; among the most renowned cases is the ImageNet Large Scale Visual Recognition Challenge. However, challenges persist in complex fields such as medical imaging. An example is the Human Protein Atlas, which maps all human proteins in more than 171,000 images, posing a computational challenge due to high class imbalance. To address these challenges from a green perspective, we propose a transfer learning approach using Convolutional Neural Networks (CNNs) pre-trained on the ImageNet dataset. We use CNN layers as feature extractors, feeding the extracted features into a Support Vector Machine with a linear kernel. Our method combines both image-level and cell-level perspectives. At the cell level, we segment nuclei and extract the surrounding nuclear membrane area. The ensemble classification shows promising performance with limited computational effort.
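A rough sketch of this kind of transfer-learning pipeline (pretrained CNN as a frozen feature extractor, linear SVM on top); the choice of ResNet-18, the 224x224 input size, and a recent torchvision API are assumptions, and the cell-level segmentation step is omitted:

```python
# Generic transfer-learning sketch (not the authors' exact pipeline): features
# from an ImageNet-pretrained CNN are fed to a linear-kernel SVM.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import LinearSVC

backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()          # keep the 512-d penultimate features
backbone.eval()

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor(),
                        T.Normalize(mean=[0.485, 0.456, 0.406],
                                    std=[0.229, 0.224, 0.225])])

@torch.no_grad()
def extract_features(pil_images):
    batch = torch.stack([preprocess(img) for img in pil_images])
    return backbone(batch).numpy()

# train_images / train_labels are placeholders for the labelled image set:
# svm = LinearSVC().fit(extract_features(train_images), train_labels)
```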
11:30 | PRESENTER: Wenlong Chen ABSTRACT. In this paper, we focus on the Discrete Bayesian Classifier (DBC), which discretizes the input space into regions where class probabilities are estimated. We investigate fuzzy partitioning as an alternative to the hard partitioning classically used to discretize the space. We show that our approach not only boosts the DBC’s performance and resilience to noise, but also mitigates the loss of information due to discretization. The benefits of soft partitioning are demonstrated experimentally on several synthetic and real datasets.
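An illustrative toy version of the soft-partitioning idea, not the paper's exact DBC: a one-dimensional input covered by overlapping triangular fuzzy sets, with membership-weighted class-probability estimates per region:

```python
# Soft (fuzzy) partitioning sketch: class probabilities per region are estimated
# with membership-weighted counts instead of hard region assignments.
import numpy as np

def triangular_memberships(x, centers):
    """Membership of each x in each triangular fuzzy set centred at `centers`."""
    width = centers[1] - centers[0]                      # uniform spacing assumed
    mu = np.maximum(0.0, 1.0 - np.abs(x[:, None] - centers[None, :]) / width)
    return mu / mu.sum(axis=1, keepdims=True)            # normalise per sample

def fit(X, y, centers, n_classes):
    mu = triangular_memberships(X, centers)              # (n_samples, n_regions)
    counts = np.zeros((len(centers), n_classes))
    for c in range(n_classes):
        counts[:, c] = mu[y == c].sum(axis=0)
    return counts / counts.sum(axis=1, keepdims=True)    # P(class | region)

def predict(X, centers, region_probs):
    mu = triangular_memberships(X, centers)
    return np.argmax(mu @ region_probs, axis=1)          # soft vote over regions

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 300); y = (X > 0.5).astype(int)
centers = np.linspace(0, 1, 5)
probs = fit(X, y, centers, n_classes=2)
print((predict(X, centers, probs) == y).mean())
```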
11:50 | r-ERBFN: an Extension of the Evidential RBFN Accounting for the Dependence Between Positive and Negative Evidence PRESENTER: Serigne Diène ABSTRACT. Recently, it was shown that a radial basis function network (RBFN) with a softmax output layer amounts to pooling by Dempster's rule positive and negative evidence for each class, and approximating the resulting belief function by a probability distribution using the plausibility transform. This so-called latent belief function offers a richer uncertainty quantification than the probabilistic output of the RBFN. In this paper, we show that there actually exists a set of latent belief functions for an RBFN. This set is obtained by considering all possible dependence structures between the positive and negative evidence for each class, described by correlations. Furthermore, we show that performance can be enhanced by optimizing the correlations brought to light.
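As background, a small generic illustration of the two ingredients named in the abstract, Dempster's rule and the plausibility transform, on a three-class frame (this does not reproduce the r-ERBFN itself):

```python
# Dempster's rule for mass functions given as {frozenset: mass} dictionaries,
# followed by the plausibility transform into a probability distribution.
from itertools import product

FRAME = frozenset({"a", "b", "c"})

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for (A, mA), (B, mB) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mA * mB
        else:
            conflict += mA * mB
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

def plausibility_transform(m):
    """Probability proportional to the plausibility of each singleton."""
    pl = {x: sum(v for A, v in m.items() if x in A) for x in FRAME}
    total = sum(pl.values())
    return {x: p / total for x, p in pl.items()}

# Positive evidence for class "a" and negative evidence against class "b".
m_pos = {frozenset({"a"}): 0.6, FRAME: 0.4}
m_neg = {FRAME - frozenset({"b"}): 0.5, FRAME: 0.5}
print(plausibility_transform(dempster(m_pos, m_neg)))
```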
12:10 | Extended Boltzmann Machine - Generative Model PRESENTER: Maria Rifqi ABSTRACT. The increase in computing power in recent years has brought generative models and the use of synthetic data back to the fore to solve a variety of previously unsolved problems, in particular when fields are subject to constraints linked to the sensitivity of the information processed. This article proposes a modified version of restricted Boltzmann machines (RBMs), known as Bernoulli machines, to improve their ability to handle non-binary data without making the methodology more complex to understand and manipulate. To assess the performance of our algorithm, we compare it with various generative models that are well documented in the scientific literature and have repeatedly proven their effectiveness in a variety of contexts. We also chose to use a large number of open-source datasets with different features, in terms of both their structure and the purpose they address, in order to verify the generalization capacity of our results across different scientific areas.
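For context, a baseline Bernoulli RBM as provided by scikit-learn (the paper's extension for non-binary data is not reproduced here); the digits dataset and the hyperparameters are arbitrary illustration choices:

```python
# Baseline Bernoulli RBM: scikit-learn's BernoulliRBM expects inputs in [0, 1],
# which is one reason plain RBMs handle non-binary data poorly.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import BernoulliRBM

X = load_digits().data / 16.0                    # scale pixel values into [0, 1]
rbm = BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)
rbm.fit(X)

# Generate a synthetic sample by running a few steps of Gibbs sampling
# from a random visible configuration.
v = np.random.default_rng(0).random((1, X.shape[1]))
for _ in range(100):
    v = rbm.gibbs(v)
print(v.astype(int))
```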
A'nica Restaurant
14:40 | PRESENTER: Florence Dupin de Saint-Cyr ABSTRACT. Stereotypes are necessary for human cognition. Indeed, our limited computational capabilities and our need for quick decision making require using shortcuts for reasoning. In this work, we discuss how to formalize reasoning with stereotypes using uncertain default rules with an anchorage degree.
15:00 | PRESENTER: Henri Prade ABSTRACT. Propositional possibilistic logic handles pairs made of a proposition and a level expressing a degree of certainty; these levels belong to a totally ordered scale. This basic possibilistic logic only allows for the conjunction of possibilistic formulas, in agreement with the min-decomposability of necessity measures for this connective. Generalized possibilistic logic extends this formalism to negation and disjunctions of weighted pairs of formulas. In this paper, we consider a class of possibilistic logics where propositions are labeled by elements of a Boolean algebra. We first consider the example where propositions are associated with groups of agents that believe in them. This multiagent logic is then extended by attaching degrees of necessity to pairs (proposition, set of agents), and a multiagent counterpart of generalized possibilistic logic is proposed as well. Other examples of Boolean-valued formulas are discussed, where the Boolean labels represent time intervals, or yet other propositional formulas representing reasons to believe propositions.
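As a toy illustration of the kind of labeling discussed (one assumed reading, not the paper's full logic), formulas can carry a set of believing agents and a necessity degree, with conjunction combining agent sets by intersection and degrees by min:

```python
# Toy multiagent labeling: conjunction intersects the agent sets and takes the
# min of the necessity degrees, mirroring min-decomposability of necessity.
from dataclasses import dataclass

@dataclass(frozen=True)
class LabelledFormula:
    formula: str          # propositional formula, kept symbolic here
    agents: frozenset     # element of the Boolean algebra 2^Agents
    degree: float         # necessity degree in [0, 1]

def conjoin(f1, f2):
    return LabelledFormula(formula=f"({f1.formula}) & ({f2.formula})",
                           agents=f1.agents & f2.agents,
                           degree=min(f1.degree, f2.degree))

p = LabelledFormula("p", frozenset({"a1", "a2"}), 0.8)
q = LabelledFormula("q", frozenset({"a2", "a3"}), 0.6)
print(conjoin(p, q))   # believed by {'a2'} only, with degree 0.6
```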
15:20 | PRESENTER: Henri Prade ABSTRACT. Provenance calculus, based on two operations forming a semiring, enables the combination and propagation of annotations associated with data. This note emphasizes that this calculus, when based on the max and min operations, corresponds exactly to query evaluation when data are labeled with levels of certainty in the sense of possibility theory.
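A minimal sketch of the (max, min) annotation propagation the note refers to; the relations, attribute names, and certainty levels are invented for illustration:

```python
# Semiring-style propagation with (max, min): joining tuples combines their
# certainty levels with min, and alternative derivations of the same output
# tuple are combined with max, matching possibilistic certainty levels.
def join(r, s, key_r, key_s):
    out = {}
    for (t1, a1) in r:
        for (t2, a2) in s:
            if t1[key_r] == t2[key_s]:
                merged = tuple(sorted(set(t1.items()) | set(t2.items())))
                level = min(a1, a2)
                out[merged] = max(out.get(merged, 0.0), level)   # union -> max
    return [(dict(t), a) for t, a in out.items()]

# Annotated relations: (tuple, certainty level in the sense of possibility theory)
works_in = [({"emp": "ann", "dept": "d1"}, 0.9), ({"emp": "bob", "dept": "d2"}, 0.4)]
located  = [({"dept": "d1", "city": "Palermo"}, 0.7), ({"dept": "d2", "city": "Rome"}, 1.0)]

print(join(works_in, located, "dept", "dept"))
# e.g. ann works in a Palermo department with level min(0.9, 0.7) = 0.7
```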
15:40 | PRESENTER: Giuliano Rosella ABSTRACT. Counterfactual conditionals are statements like “if a were the case, then b would be the case”, and they are crucial for various fields like logic, linguistics, and AI. In this work, we introduce a novel modal algebraic framework to analyze Lewis counterfactuals that allows us to combine logical, algebraic, and probabilistic approaches.
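For orientation, a toy evaluation of a Lewis counterfactual over finitely many worlds under the standard similarity-based truth conditions (finite models, so the closest-worlds reading applies); this illustrates the object of study, not the paper's algebraic framework:

```python
# "a > b" is true at w iff there is no a-world, or all a-worlds closest to w
# (by the similarity rank) satisfy b.
def counterfactual(worlds, sim_rank, a, b, w):
    a_worlds = [u for u in worlds if a(u)]
    if not a_worlds:
        return True                                  # vacuously true
    closest = min(sim_rank(w, u) for u in a_worlds)
    return all(b(u) for u in a_worlds if sim_rank(w, u) == closest)

# Worlds are valuations of two atoms; similarity = number of atoms flipped.
worlds = [{"rain": r, "wet": s} for r in (0, 1) for s in (0, 1)]
sim = lambda w, u: sum(w[k] != u[k] for k in w)

actual = {"rain": 0, "wet": 0}
print(counterfactual(worlds, sim, lambda u: u["rain"], lambda u: u["wet"], actual))
```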
16:00 | On Decisive Revision Operators and Totally Ordered Information PRESENTER: Jérome Gaigne ABSTRACT. This paper focuses on decisive belief revision operators, i.e. operators leading to totally informed situations. Such situations can be represented in the Katsuno-Mendelzon (KM) revision framework by a complete propositional formula, which entails, for any formula, either that formula or its negation. From a semantic point of view, this kind of operator leads to a single, most plausible interpretation. Despite their prevalence in decision theory, this class of operators has not been previously studied in the context of the KM revision framework. We propose in this paper to characterize decisive operators by a set of postulates. We also provide a representation theorem leading to linear orders on interpretations. Finally, we exhibit a concrete operator family satisfying this new set of postulates by combining KM revision operators with tie-breaking functions.
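One simple way to realize such a decisive operator, sketched here under assumptions (Dalal-style distance as the KM operator, a lexicographic order as tie-breaker); this is an illustrative instance, not the paper's exact construction:

```python
# KM step: keep the models of the new information closest to the current beliefs;
# decisive step: a fixed linear order on interpretations breaks ties so that a
# single most plausible model remains.
from itertools import product

ATOMS = ("p", "q", "r")

def interpretations():
    return [dict(zip(ATOMS, bits)) for bits in product((0, 1), repeat=len(ATOMS))]

def hamming(w, v):
    return sum(w[a] != v[a] for a in ATOMS)

def decisive_revise(belief_models, new_models, tie_break):
    dist = {tuple(w.items()): min(hamming(w, v) for v in belief_models)
            for w in new_models}
    d_min = min(dist.values())
    closest = [w for w in new_models if dist[tuple(w.items())] == d_min]
    return min(closest, key=tie_break)               # linear order picks one model

belief = [w for w in interpretations() if w["p"] and w["q"]]        # models of p & q
new    = [w for w in interpretations() if not w["p"]]               # models of ~p
print(decisive_revise(belief, new, tie_break=lambda w: tuple(w[a] for a in ATOMS)))
```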
Quid Restaurant
Steri, Piazza Marina 61