CP 2023: 29TH INTERNATIONAL CONFERENCE ON PRINCIPLES AND PRACTICE OF CONSTRAINT PROGRAMMING
PROGRAM FOR THURSDAY, AUGUST 31ST


09:00-10:00 Session 23

Invited Talk 4, Great Hall (#1022) (Chair: Mark Wallace)

10:00-10:30 Coffee Break

Great Hall Lower Gallery (#1022K)

10:30-11:00 Session 24

CP 2023 Best Paper Award, Great Hall (#1022) (Chair: Roland Yap)

Mathew J. McIlree and Ciaran McCreesh, Proof Logging for Smart Extensional Constraints

11:00-11:30 Session 25

CP 2023 Best Application Paper Award, Great Hall (#1022) (Chair: Roland Yap)

Matthias Klapperstueck, Frits De Nijs, Ilankaikone Senthooran, Jack Lee-Kopij, Maria Garcia de la Banda and Michael Wybrow, Exploring Hydrogen Supply/Demand Networks: Modeller and Domain Expert views

11:30-12:00 Session 26A

Search 2, Great Hall (#1022) (Chair: Andreína Francisco)

11:30
Learning a Generic Value-Selection Heuristic Inside a Generic Constraint Programming Solver

ABSTRACT. Constraint programming is known as an efficient approach to solving combinatorial problems. Important design choices in a solver are the branching heuristics, which are designed to lead the search to the best solutions in a minimum amount of time. However, developing these heuristics is a time-consuming process that requires problem-specific expertise. This observation has motivated many efforts to use machine learning to automatically learn efficient heuristics without expert intervention; to our knowledge, this remains an open research question. Although several generic variable-selection heuristics are available in the literature, the options for a generic value-selection heuristic are scarcer. In this paper, we tackle this issue by introducing a generic learning procedure that can be used to obtain a value-selection heuristic inside a constraint programming solver. This is achieved by combining a deep Q-learning algorithm, a tailored reward signal, and a heterogeneous graph neural network architecture. Experiments on graph coloring, maximum independent set, and maximum cut problems show that our framework finds solutions close to optimality without requiring a large number of backtracks, while remaining generic.
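The idea of plugging a learned scorer into value selection can be illustrated with a toy backtracking solver (a hypothetical sketch: in the paper the scorer is a deep Q-network over a heterogeneous graph encoding of the solver state; here a plain function stands in for it):

```python
import random

def solve(domains, constraints, score_value):
    """Toy backtracking search. `score_value(var, val, domains)` plays the
    role of the learned Q-value: higher-scoring values are tried first."""
    if all(len(d) == 1 for d in domains.values()):
        sol = {v: next(iter(d)) for v, d in domains.items()}
        return sol if all(c(sol) for c in constraints) else None
    # Variable selection: classic smallest-domain-first.
    var = min((v for v in domains if len(domains[v]) > 1),
              key=lambda v: len(domains[v]))
    # Value selection: order values by the (learned) score, best first.
    for val in sorted(domains[var], key=lambda x: -score_value(var, x, domains)):
        child = dict(domains)
        child[var] = {val}
        result = solve(child, constraints, score_value)
        if result is not None:
            return result
    return None

# Toy use: 2-coloring a 3-node path; a random score stands in for an
# untrained network, a trained one would bias the search toward good values.
doms = {v: {0, 1} for v in "abc"}
cons = [lambda s: s["a"] != s["b"], lambda s: s["b"] != s["c"]]
print(solve(doms, cons, lambda v, x, d: random.random()))
```

The point of the sketch is only the interface: the search machinery is unchanged, and learning replaces the hand-crafted value ordering.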

11:30-12:00 Session 26B

Decision Diagrams 2, East Common Room (#1034) (Chair: Claude-Guy Quimper)

11:30
Partitioning a Map into Homogeneous Contiguous Regions: A Branch-and-Bound approach Using Decision Diagrams

ABSTRACT. Regionalization is a spatial analysis technique for partitioning a map divided into zones into k contiguous areas, optimizing the similarity of zone attributes within each area. It has applications in fields such as urban planning, environmental management, and geographic information systems. The REDCAP algorithm is a well-known approach to the regionalization problem. It consists of two main steps: first, it generates a spatially contiguous tree (SCT) representing the neighborhood structure of the set of spatial objects, using a contiguity-constrained hierarchical clustering method; second, it greedily removes k-1 edges from the SCT to create k regions. While this approach has proven effective, the greedy second step does not always produce optimal solutions. We propose an alternative for the second step: an exact dynamic programming (DP) formulation of the k-1 edge removal problem, solved with a multi-valued decision diagram (MDD) based branch-and-bound solver, leading to better solutions. We compared our method with the state-of-the-art REDCAP technique on real and synthetic data, using different instances of the regionalization problem and different supervised and unsupervised metrics. Our results indicate that our approach provides higher-quality partitions than REDCAP at acceptable computational cost, suggesting that it is a viable alternative for the regionalization problem in various applications.
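The second REDCAP step being replaced can be stated very compactly. A minimal sketch of an exact (but naive, brute-force) solver for it, under toy assumptions: nodes carry a single numeric attribute, and region quality is within-region squared deviation. The paper's MDD-based branch and bound solves the same problem without enumerating all edge subsets.

```python
from itertools import combinations

def regions_after_cuts(n, edges, cuts):
    """Connected components of a tree after removing the `cuts` edges
    (simple union-find with path halving)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        if (u, v) not in cuts:
            parent[find(u)] = find(v)
    comps = {}
    for i in range(n):
        comps.setdefault(find(i), []).append(i)
    return list(comps.values())

def best_partition(values, edges, k):
    """Exact k-1 edge removal by brute force: try every cut set and keep
    the one minimizing total within-region squared deviation."""
    best_cost, best_parts = float("inf"), None
    for cuts in combinations(edges, k - 1):
        parts = regions_after_cuts(len(values), edges, set(cuts))
        cost = sum(sum((values[i] - sum(values[j] for j in p) / len(p)) ** 2
                       for i in p) for p in parts)
        if cost < best_cost:
            best_cost, best_parts = cost, parts
    return best_cost, best_parts

# Path tree 0-1-2-3 with attribute values [1, 1, 9, 9]: the best single
# cut separates the two homogeneous halves.
cost, parts = best_partition([1, 1, 9, 9], [(0, 1), (1, 2), (2, 3)], k=2)
print(cost, sorted(map(sorted, parts)))
```

Brute force is exponential in k; the contribution of the paper is precisely an exact method that scales better than this.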

12:00-13:20 Lunch Break

Great Hall Lower Gallery (#1022K)

13:20-14:20 Session 27A

Search 3, Great Hall (#1022) (Chair: Richard Wallace)

13:20
Guiding Backtrack Search by Tracking Variables during Constraint Propagation

ABSTRACT. It is well known that variable ordering heuristics play a central role in efficiently solving Constraint Satisfaction Problem (CSP) instances. From the early 1980s, and for more than two decades, the dynamic heuristic that selects the variable with the smallest domain clearly prevailed. Then, from the mid 2000s, adaptive heuristics were introduced: their principle is to collect useful information during the search process in order to make better-informed decisions. Among these adaptive heuristics, wdeg/dom (and its variants) remains particularly robust. In this paper, we introduce an original heuristic called pick/dom that tracks the variables directly involved in the constraint propagation process when it ends with a conflict. We show how to implement it in both variable-oriented and arc-oriented propagation schemes, and we demonstrate the robustness of this new heuristic through extensive experiments with the well-known constraint solver ACE.
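A hypothetical sketch of the kind of bookkeeping a conflict-tracking heuristic needs (this is a toy in the spirit of pick/dom, not the ACE implementation): count how often each variable participates in a propagation run that ends in a wipeout, then branch on the variable maximizing count divided by current domain size.

```python
class ConflictTracker:
    """Toy conflict-directed variable ordering: reward variables that were
    touched by a propagation run ending in a conflict."""

    def __init__(self, variables):
        self.picks = {v: 1 for v in variables}  # start at 1 to break ties smoothly
        self.touched = set()                    # vars filtered in the current run

    def on_touch(self, var):
        """Call whenever propagation filters the domain of `var`."""
        self.touched.add(var)

    def on_propagation_end(self, conflict):
        """Call when a propagation run finishes; only conflicts add weight."""
        if conflict:
            for v in self.touched:
                self.picks[v] += 1
        self.touched.clear()

    def select(self, domains):
        """Branch on the unbound variable maximizing picks / |dom|."""
        unbound = [v for v in domains if len(domains[v]) > 1]
        return max(unbound, key=lambda v: self.picks[v] / len(domains[v]))

# Simulated trace: x and y were involved in a conflicting run, z in a
# successful one, so x (small domain, high weight) is selected.
t = ConflictTracker(["x", "y", "z"])
t.on_touch("x"); t.on_touch("y"); t.on_propagation_end(conflict=True)
t.on_touch("z"); t.on_propagation_end(conflict=False)
print(t.select({"x": {1, 2}, "y": {1, 2, 3}, "z": {1, 2}}))
```

The design choice mirrors wdeg/dom: weights accumulated from conflicts are normalized by domain size so that small, conflict-prone variables are branched on first.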

13:50
Large Neighborhood Beam Search for Domain-Independent Dynamic Programming

ABSTRACT. Large neighborhood search (LNS) is an algorithmic framework that removes a part of a solution and performs search in the induced search space to find a better solution. While LNS shows strong performance in constraint programming, little work has combined LNS with state space search. We propose large neighborhood beam search (LNBS), a combination of LNS and state space search. Given a solution path, LNBS removes a partial path between two states and then performs beam search to find a better partial path. We apply LNBS to domain-independent dynamic programming (DIDP), a recently proposed generic framework for combinatorial optimization based on dynamic programming. We empirically show that LNBS finds better quality solutions than a state-of-the-art DIDP solver in five out of nine benchmark problem types with a total of 8587 problem instances. In particular, LNBS shows a significant improvement over the existing state-of-the-art DIDP solver in routing and scheduling problems.
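The core LNBS move can be sketched on a toy problem (hypothetical setup, not the DIDP solver): take an incumbent path in a weighted graph, cut out a segment between two states, and run a width-limited beam search to reconnect the endpoints more cheaply.

```python
def beam_reconnect(graph, src, dst, width, max_depth):
    """Beam search for a cheap src->dst path; returns (cost, path) or None.
    `graph` maps each node to a list of (neighbor, edge_weight) pairs."""
    beam = [(0, [src])]
    best = None
    for _ in range(max_depth):
        frontier = []
        for cost, path in beam:
            for v, w in graph[path[-1]]:
                if v in path:                      # avoid cycles in this toy
                    continue
                c, p = cost + w, path + [v]
                if v == dst:
                    if best is None or c < best[0]:
                        best = (c, p)
                else:
                    frontier.append((c, p))
        beam = sorted(frontier)[:width]            # keep the `width` best states
        if not beam:
            break
    return best

def path_cost(graph, path):
    return sum(dict(graph[u])[v] for u, v in zip(path, path[1:]))

def lnbs(graph, path, rounds=3, width=2):
    """LNBS-style improvement: for each segment (i, j) of the incumbent,
    try to reconnect path[i]..path[j] by beam search and accept improvements.
    (A real LNBS samples segments; we scan them all for determinism.)"""
    for _ in range(rounds):
        for i in range(len(path) - 1):
            for j in range(i + 1, len(path)):
                found = beam_reconnect(graph, path[i], path[j], width, 4)
                if found:
                    cand = path[:i] + found[1] + path[j + 1:]
                    if path_cost(graph, cand) < path_cost(graph, path):
                        path = cand
    return path

# Incumbent A->B->D costs 6; cutting (A, B) and reconnecting via C gives 3.
g = {"A": [("B", 5), ("C", 1)], "B": [("D", 1)], "C": [("B", 1)], "D": []}
print(lnbs(g, ["A", "B", "D"]))
```

In the paper the reconnection search runs over DIDP state spaces rather than an explicit graph, but the accept-if-cheaper segment replacement is the same mechanism.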

13:20-14:50 Session 27B

Machine Learning 2, East Common Room (#1034) (Chair: Tias Guns)

13:20
Incremental Constrained Clustering by Minimal Weighted Modification

ABSTRACT. Clustering is a well-known Machine Learning task that aims to group objects according to their similarity. It is unsupervised and therefore difficult to tune without human input. In constrained clustering, subject matter experts (SMEs) can incorporate knowledge into the clustering algorithm through constraints. Involving an expert results in an incremental process where constraints are added iteratively. Although satisfying the constraints is crucial, successive partitions should look alike so as not to disorient the expert. In this paper, we present an incremental constrained clustering framework that integrates active query strategies and a constraint programming model to fit the expert's expectations while preserving cluster structure, so that the user can understand the process and grasp its impact. Our model supports instance-level and group-level constraints, which can be relaxed. Experiments on reference datasets and a case study on the analysis of satellite image time series show the relevance of our framework.
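The "minimal modification" objective can be made concrete with a tiny brute-force sketch (hypothetical: the paper uses a constraint programming model with weights and relaxable constraints; here we enumerate assignments): given a current partition and newly added must-link / cannot-link constraints, find the assignment satisfying them that changes as few points as possible.

```python
from itertools import product

def repair(partition, k, must_link, cannot_link):
    """Return (num_changes, new_partition): the closest partition into k
    clusters that satisfies all pairwise constraints. Brute force over
    all k**n assignments, so only for toy instances."""
    n = len(partition)
    best = None
    for cand in product(range(k), repeat=n):
        if any(cand[i] != cand[j] for i, j in must_link):
            continue
        if any(cand[i] == cand[j] for i, j in cannot_link):
            continue
        changes = sum(a != b for a, b in zip(cand, partition))
        if best is None or changes < best[0]:
            best = (changes, list(cand))
    return best

# Current partition [0,0,1,1]; the expert adds must-link(1,2) and
# cannot-link(0,1): moving a single point suffices.
print(repair([0, 0, 1, 1], k=2, must_link=[(1, 2)], cannot_link=[(0, 1)]))
```

Minimizing the number of reassignments is what keeps successive partitions looking alike for the expert; the paper additionally weights modifications rather than merely counting them.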

13:50
FastMapSVM for Predicting CSP Satisfiability

ABSTRACT. Recognizing the satisfiability of Constraint Satisfaction Problems (CSPs) is NP-hard. Although several Machine Learning (ML) approaches have attempted this task by casting it as a binary classification problem, they have had only limited success, for several reasons. First, the NP-hardness of the task does not make it amenable to straightforward approaches. Second, CSPs come in various forms and sizes, while many ML algorithms impose the same form and size on their training and test instances. Third, the representation of a CSP instance is not unique, since the variables and their domain values are unordered. In this paper, we propose to use FastMapSVM, a recently developed ML framework that leverages a distance function between pairs of objects. We define a novel distance function between two CSP instances using maxflow computations. This distance function is well defined for CSPs of different sizes and is invariant to the ordering of the variables and their domain values; our framework therefore has broader applicability than other approaches. We discuss various representational and combinatorial advantages of FastMapSVM, and through experiments we show that it outperforms other state-of-the-art ML approaches.
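The FastMap step that FastMapSVM builds on needs only pairwise distances, which is why a distance between whole CSP instances is enough. A minimal sketch of one FastMap coordinate, under toy assumptions (objects are strings and the distance is a set difference; the paper instead defines a maxflow-based distance between CSP instances):

```python
def fastmap_coordinate(objects, dist, a, b):
    """One FastMap dimension: project every object onto the line through
    two pivot objects using only pairwise distances:
        x_i = (d(a, i)^2 + d(a, b)^2 - d(b, i)^2) / (2 * d(a, b))
    """
    dab = dist(objects[a], objects[b])
    return [(dist(objects[a], o) ** 2 + dab ** 2 - dist(objects[b], o) ** 2)
            / (2 * dab) for o in objects]

def d(x, y):
    """Toy distance: size of the symmetric difference of character sets."""
    return len(set(x) ^ set(y))

# Objects similar to pivot "abc" land near 0; those similar to pivot
# "xyz" land near d("abc", "xyz") = 6.
objs = ["abc", "abd", "xyz", "xyw"]
print(fastmap_coordinate(objs, d, a=0, b=2))
```

Repeating this with further pivot pairs yields a fixed-length Euclidean embedding, on which a standard SVM is trained; this is how instances of different sizes become comparable.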

14:20
From Formal Boosted Tree Explanations to Interpretable Rule Sets

ABSTRACT. The rapid rise of Artificial Intelligence (AI) and Machine Learning (ML) has created a need for explainable AI (XAI). One of the most prominent approaches to XAI is to train rule-based ML models, e.g. decision trees, lists, and sets, which are deemed interpretable due to their transparent nature. Recent years have witnessed a large body of work on constraint- and reasoning-based approaches to the inference of interpretable models, in particular decision sets (DSes). Despite being shown to outperform heuristic approaches in terms of accuracy, most of them suffer from scalability issues and often fail to handle large training data, in which case no solution is offered. Motivated by this limitation and by the success of gradient boosted trees, we propose a novel anytime approach to producing DSes that are both accurate and interpretable. The approach makes use of the concept of a generalized formal explanation and builds on recent advances in the formal explainability of gradient boosted trees. Experimental results obtained on a wide range of datasets demonstrate that our approach produces DSes that are more accurate than those of state-of-the-art algorithms and comparable with them in terms of explanation size.

14:50-15:00 Session 28

Closing, Great Hall (#1022)