Chuan Luo (Peking University)
Kaile Su (Griffith University)
ABSTRACT. This paper presents a local search SAT solver named CCAnr, which is based on the configuration checking method and has good performance on non-random SAT instances. As with the CCASat solver, CCAnr adopts the CCA (configuration checking with aspiration) heuristic and switches between two modes: it flips a variable according to the CCA heuristic if one exists; if no variable is allowed to be flipped by the CCA heuristic, it flips a variable in a random unsatisfied clause. The main novelty of CCAnr lies in the greedy heuristic for picking a variable from the selected unsatisfied clause, which contributes significantly to its good performance on structured instances. Previous local search algorithms usually utilize diversification properties such as {\it age} or randomized strategies to pick a variable from the unsatisfied clause. Our experiments on crafted and application benchmarks from SAT Competition 2013 and 2014 show that CCAnr performs better than other state-of-the-art local search solvers on structured instances, and that its performance can be further improved by using the preprocessor CP3. Our results suggest that a greedy heuristic in the focused search mode is helpful for improving local search algorithms on structured SAT instances.
ABSTRACT. PBLib is an easy-to-use and efficient library, written in \CC\xspace, for translating pseudo-Boolean (PB) constraints into CNF. We have implemented fifteen different encodings of PB constraints into CNF. Our aim is to use encodings that are efficient in terms of formula size and in terms of whether unit propagation maintains generalized arc consistency. Moreover, PBLib normalizes PB constraints and can automatically decide which encoder provides the most effective translation. We also support incremental strengthening for optimization problems, where a tighter bound is realized with only a few additional clauses, as well as conditions for PB constraints.
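To make the translation idea concrete, here is a minimal sketch (not PBLib's C++ API; the helper names and integer-literal convention are our own) of the simplest special case of a PB constraint, the at-most-one constraint x1 + ... + xn <= 1, encoded with pairwise clauses:

```python
from itertools import combinations, product

def at_most_one_pairwise(variables):
    """Encode x1 + ... + xn <= 1 as the pairwise clauses (-xi or -xj).
    Variables are positive integers; a negative literal negates its variable."""
    return [[-a, -b] for a, b in combinations(variables, 2)]

def satisfies(clauses, assignment):
    """assignment maps variable -> bool; a clause is a list of literals."""
    return all(any(assignment[abs(l)] == (l > 0) for l in c) for c in clauses)

# Brute-force check for 4 variables: the CNF is satisfied exactly
# when at most one variable is true.
vs = [1, 2, 3, 4]
cnf = at_most_one_pairwise(vs)
for bits in product([False, True], repeat=4):
    assignment = dict(zip(vs, bits))
    assert satisfies(cnf, assignment) == (sum(bits) <= 1)
```

The pairwise encoding is quadratic in size; the point of a library of fifteen encodings is precisely that better trade-offs between size and propagation strength exist for general PB constraints.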
ABSTRACT. In this paper we present several improvements to the extraction of minimal unsatisfiable subformulas (MUSes) of Boolean formulas. As our first contribution we amplify the role of preprocessing for computing group MUSes and describe rotation on preprocessed formulas. We find it very convenient to adopt the framework of labeled CNF formulas, and we present our algorithms in this more general framework. We use the assumption-based approach for computing MUSes due to its simplicity and the ability to use any SAT solver as the back-end. However, this comes at a price: it is well known that the assumption-based approach performs significantly worse than the resolution-based approach. This leads to our second contribution: we show how to bridge the gap between the two approaches using ``chunking''. An extensive experimental evaluation shows that our method significantly outperforms state-of-the-art solutions in the context of group MUS extraction.
ABSTRACT. We give improved deterministic algorithms solving sparse instances of MAX-SAT and MAX-$k$-CSP. For instances with $n$ variables and $cn$ clauses (constraints), we give algorithms running in time $\poly(n)\cdot 2^{n(1-\mu)}$ for \begin{itemize} \item $\mu = \Omega(\frac{1}{c} )$ and polynomial space solving MAX-SAT and MAX-$k$-SAT, \item $\mu = \Omega(\frac{1}{\sqrt{c}} )$ and exponential space solving MAX-SAT and MAX-$k$-SAT, \item $\mu = \Omega(\frac{1}{ck^2} )$ and polynomial space solving MAX-$k$-CSP, \item $\mu = \Omega(\frac{1}{\sqrt{ck^3}} )$ and exponential space solving MAX-$k$-CSP. \end{itemize} The previous MAX-SAT algorithms have savings $\mu=\Omega(\frac{1}{c^2 \log^2 c})$ for running in polynomial space~\cite{SST14} and $\mu=\Omega(\frac{1}{c \log c})$ for exponential space~\cite{DW06}. We also give an algorithm with improved savings for satisfiability of depth-2 threshold circuits with $cn$ wires.
Tobias Schubert (University Freiburg)
Bernd Becker (University Freiburg)
ABSTRACT. The problem of counting the number of satisfying assignments of a propositional formula (#SAT) can be considered to be the big brother of the well known SAT problem. However, the higher computational complexity and a lack of fast solvers currently limit its usability for real world problems.
Similar to SAT, utilizing the parallel computation power of modern CPUs could greatly increase the solving speed in the realm of #SAT. However, in comparison to SAT there is an additional obstacle for the parallelization of #SAT that is caused by the usage of conflict learning together with the #SAT specific techniques of component caching and sub-formula decomposition. The combination can result in an incorrect final result being computed due to incorrect values in the formula cache. This problem is easily resolvable in a sequential solver with a depth-first node order but requires additional care and handling in a parallel one. In this paper we introduce laissez-faire caching which allows for an arbitrary node computation order in both a sequential and parallel solver while ensuring a correct final result. Additionally, we apply this new caching approach to build countAntom, the world's first parallel #SAT-solver.
Our experimental results clearly show that countAntom achieves considerable speedups through the parallel computation while maintaining correct results on a large variety of benchmarks coming from different real-world applications. Moreover, our analysis indicates that laissez-faire caching only adds a small computational overhead.
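The correctness issue above arises because component caching memoizes sub-formula counts. countAntom's contribution is making such caching correct under arbitrary (parallel) node orders; the following sketch shows only the baseline sequential scheme, a model counter with connected-component decomposition and a formula cache (all names and the clause representation are our own, not countAntom's):

```python
def simplify(clauses, lit):
    """Assign `lit` True: drop satisfied clauses, shorten the rest.
    Returns None if a conflict (empty clause) arises."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        d = tuple(l for l in c if l != -lit)
        if not d:
            return None
        out.append(d)
    return out

def components(clauses):
    """Partition clauses into connected components; clauses are
    connected when they share a variable."""
    comps, remaining = [], list(clauses)
    while remaining:
        comp = [remaining.pop()]
        seen = {abs(l) for l in comp[0]}
        grew = True
        while grew:
            grew = False
            for c in remaining[:]:
                if any(abs(l) in seen for l in c):
                    remaining.remove(c)
                    comp.append(c)
                    seen |= {abs(l) for l in c}
                    grew = True
        comps.append(tuple(comp))
    return comps

cache = {}

def count(clauses):
    """Number of satisfying assignments over the variables occurring in
    `clauses` (a tuple of tuples of integer literals)."""
    if not clauses:
        return 1
    key = frozenset(clauses)
    if key in cache:
        return cache[key]
    comps = components(clauses)
    if len(comps) > 1:                       # independent sub-formulas multiply
        result = 1
        for comp in comps:
            result *= count(comp)
    else:
        vs = {abs(l) for c in clauses for l in c}
        v = abs(clauses[0][0])               # branch on some variable
        result = 0
        for lit in (v, -v):
            sub = simplify(clauses, lit)
            if sub is None:
                continue
            sub_vs = {abs(l) for c in sub for l in c}
            freed = len(vs) - 1 - len(sub_vs)    # variables now unconstrained
            result += count(tuple(sub)) * 2 ** freed
    cache[key] = result
    return result

# (x1 or x2) and (x3 or not x4): two components with 3 models each -> 9
print(count(((1, 2), (3, -4))))   # 9
```

In a depth-first sequential solver the cache entries are always completed before reuse; the laissez-faire caching of the paper is what lifts this restriction for parallel node orders.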
William Lindsay (University of Waterloo)
Vijay Ganesh (Waterloo)
Jia Hui Liang (University of Waterloo)
Sebastian Fischmeister (University of Waterloo)
Krzysztof Czarnecki (University of Waterloo)
ABSTRACT. In this paper, we present SATGraf, a tool for visualizing the evolution of the structure of a Boolean SAT formula in real time as it is being processed by a conflict-driven clause-learning (CDCL) solver. The tool is parametric, allowing the user to define the structure to be visualized. In particular, the tool can visualize the community structure of real-world Boolean satisfiability (SAT) instances and its evolution during solving. Such visualizations have been the inspiration for several hypotheses about the connection between community structure and the running time of CDCL SAT solvers, some of which we have already verified. SATGraf has allowed us to make three empirical observations regarding CDCL solvers. First, we observe that the Variable State Independent Decaying Sum (VSIDS) branching heuristic consistently chooses variables with a high number of inter-community edges. Second, we observe that backtracking from a conflict can sometimes cause unnecessary work. Finally, we observe that VSIDS typically chooses decision variables from the same community as the previous decision variable.
ABSTRACT. We propose a notion of hints: clauses that are not necessarily consistent with the input formula. The goal of adding hints is to speed up the SAT-solving process. For this purpose, we provide an efficient general mechanism for hint addition and removal, accomplished by partial resolution. When a hint is determined to be inconsistent, the partial resolution graph of an unsatisfiable core is used to reduce the search space. The suggested mechanism is used to boost performance by adding generated hints to the input formula. We describe two specific hint-suggestion methods, one of which increases performance by 30% on satisfiable SAT '13 competition instances and solves 9 instances not solved by the baseline solver.
ABSTRACT. In incremental SAT solving, information gained from previous similar instances has so far been limited to learned clauses that are still relevant, and heuristic information such as activity weights and scores. In most settings in which incremental satisfiability is applied, many of the instances along the sequence of formulas being solved are unsatisfiable. We show that in such cases, with a linear-time analysis of the proof, many constants can be assumed for the next instance. The added constants simplify the next instance and consequently accelerate the search.
ABSTRACT. SAT filters are a novel and compact data structure that can be used to quickly query a word for membership in a fixed set. They have the potential to store more information in a fixed storage limit than a Bloom filter. Constructing a SAT filter requires sampling diverse solutions to randomly constructed constraint satisfaction instances, but there is flexibility in the choice of constraint satisfaction problem. Presented here is a case study of SAT filter construction with a focus on constraint satisfaction problems based on MAX-CUT clauses (Not-all-equal 3-SAT, 2-in-4-SAT, etc.) and frustrated cycles in the Ising model. Solutions are sampled using a D-Wave quantum annealer, and results are measured against classical approaches. The SAT variants studied are of interest in the context of SAT filters, independent of the solvers used.
Geoffrey Chu (NICTA Victoria Laboratory, Department of Computing and Information Systems, University of Melbourne)
Christian Muise (University of Melbourne)
Peter J. Stuckey (University of Melbourne)
ABSTRACT. Model counting is the task of computing the number of assignments to variables V that satisfy a given propositional theory F. Model counting is an essential tool in probabilistic reasoning. In this paper, we introduce the problem of model counting projected on a subset P of original variables that we call 'priority' variables. The task is to compute the number of assignments to P such that there exists an extension to 'non-priority' variables V\P that satisfies F. Projected model counting arises when some parts of the model are irrelevant to the counts, in particular when we require additional variables to model the problem we are counting in SAT. We discuss three different approaches to projected model counting (two of which are novel), and compare their performance on different benchmark problems.
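The projected counting task admits a compact brute-force specification, which may help fix the semantics (this is an exponential reference implementation under our own literal convention, not one of the paper's three approaches):

```python
from itertools import product

def satisfies(clauses, assignment):
    """assignment maps variable -> bool; clauses are tuples of integer literals."""
    return all(any(assignment[abs(l)] == (l > 0) for l in c) for c in clauses)

def projected_count(clauses, priority, rest):
    """Count assignments to the priority variables P that can be extended,
    on the remaining variables V \ P, to a full model of the clauses.
    Exponential brute force -- a specification, not a practical counter."""
    total = 0
    for p_bits in product([False, True], repeat=len(priority)):
        partial = dict(zip(priority, p_bits))
        if any(satisfies(clauses, {**partial, **dict(zip(rest, r_bits))})
               for r_bits in product([False, True], repeat=len(rest))):
            total += 1
    return total

# F = (p1 or q) and (p2 or not q), priority P = {p1, p2}, auxiliary q:
# F has 4 models, but only 3 distinct assignments to P extend to one.
print(projected_count(((1, 3), (2, -3)), [1, 2], [3]))   # 3
```

The example shows why projection matters: the auxiliary variable q inflates the ordinary model count, while the projected count reflects only the assignments to the variables of interest.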
Joao Marques-Silva (University College Dublin)
ABSTRACT. We consider the algorithmic task of computing a maximal autarky for a clause-set F, i.e., a partial assignment which satisfies every clause of F it touches, and where this property is destroyed by adding any non-empty set of further assignments. We employ SAT solvers as oracles, using various capabilities. Using the standard SAT oracle, log_2(n(F)) oracle calls suffice, where n(F) is the number of variables, but the drawback is that (translated) cardinality constraints are employed, which makes this approach less efficient in practice. Using an extended SAT oracle, motivated by the capabilities of modern SAT solvers, we show how to compute maximal autarkies with 2 n(F)^{1/2} simpler oracle calls. The novel algorithm combines the previous two main approaches, based on the autarky-resolution duality and on SAT translations.
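The defining property of an autarky is cheap to check, even though finding a maximal one requires the oracle calls described above. A minimal sketch of the property itself (the dictionary-based representation is our own convention):

```python
def is_autarky(clauses, partial):
    """`partial` maps a subset of the variables to truth values.  It is an
    autarky iff every clause it touches (i.e., shares a variable with) is
    satisfied by it; untouched clauses may be left as they are."""
    for c in clauses:
        touched = any(abs(l) in partial for l in c)
        satisfied = any(partial.get(abs(l)) == (l > 0) for l in c)
        if touched and not satisfied:
            return False
    return True

f = ((1, 2), (-1, 3), (4, 5))
print(is_autarky(f, {1: True, 3: True}))   # True: both touched clauses satisfied
print(is_autarky(f, {1: True}))            # False: touches (-1, 3) without satisfying it
```

Because an autarky satisfies everything it touches, removing the touched clauses preserves satisfiability, which is what makes maximal autarkies useful for formula reduction.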
Peter Sanders (Karlsruhe Institute of Technology)
Carsten Sinz (Karlsruhe Institute of Technology)
ABSTRACT. A simple yet successful approach to parallel satisfiability (SAT) solving is to run several different (a portfolio of) SAT solvers on the input problem at the same time until one solver finds a solution. The SAT solvers in the portfolio can be instances of a single solver with different configuration settings. Additionally the solvers can exchange information usually in the form of clauses. In this paper we investigate whether this approach is applicable in the case of massively parallel SAT solving. Our solver is intended to run on clusters with hundreds of processors, hence the name HordeSat. HordeSat is a fully distributed portfolio-based SAT solver with a modular design that allows it to use any SAT solver that implements a given interface. HordeSat has a decentralized design and features hierarchical parallelism with interleaved communication and search. We experimentally evaluated it using all the benchmark problems from the application track of the 2014 International SAT Competition. The experiments demonstrate that HordeSat is scalable up to at least 512 processors with superlinear average speedup.
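The portfolio principle itself fits in a few lines. The sketch below is a toy thread-based version only: HordeSat runs distributed solver processes on a cluster and additionally exchanges learned clauses, neither of which is shown here, and the solver callables are hypothetical stand-ins.

```python
import threading
import queue
import time

def portfolio(solvers, formula):
    """Run every solver configuration concurrently and return the first
    answer produced.  Illustrates only the race-to-finish idea."""
    results = queue.Queue()
    for solve in solvers:
        threading.Thread(target=lambda solve=solve: results.put(solve(formula)),
                         daemon=True).start()
    return results.get()          # first solver to finish wins

# Two hypothetical configurations of the same solver:
slow_config = lambda f: (time.sleep(0.5), "answer-from-slow")[1]
fast_config = lambda f: "answer-from-fast"
print(portfolio([slow_config, fast_config], None))   # answer-from-fast
```

The remaining threads are daemons and are simply abandoned; a production portfolio must also cancel losing solvers and propagate shared clauses, which is where the engineering effort lies.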
Karina Gitina (Albert-Ludwigs-University Freiburg, Germany)
Jennifer Nist (Albert-Ludwigs-University Freiburg, Germany)
Christoph Scholl (Albert-Ludwigs-University Freiburg, Germany)
Bernd Becker (Albert-Ludwigs-University Freiburg, Germany)
ABSTRACT. For SAT and QBF formulas many techniques are applied in order to reduce/modify the number of variables and clauses of the formula, before the formula is passed to the actual solution algorithm. It is well known that these preprocessing techniques often reduce the computation time of the solver by orders of magnitude. In this paper we generalize different preprocessing techniques for SAT and QBF problems to dependency quantified Boolean formulas (DQBF) and describe how they need to be adapted to work with a DQBF solver core. We demonstrate their effectiveness both for CNF- and non-CNF-based DQBF algorithms.
ABSTRACT. We consider the incremental computation of minimal unsatisfiable cores (MUCs) of QBFs. To this end, we equipped our incremental QBF solver DepQBF with a novel API which allows for incremental solving based on clause groups. A clause group is a set of clauses which is incrementally added to or removed from a previously solved QBF. Our implementation of this API is related to incremental SAT solvers which rely on selector variables and solving under assumptions. However, in contrast to SAT solvers the API entirely hides selector variables and assumptions from the user. This property facilitates the integration of DepQBF as a library in other tools. We present implementation details and show that the API can be implemented in any DPLL-based SAT and QBF solver. For the first time, we report on experiments related to the incremental computation of MUCs of QBFs using DepQBF's novel clause group API. Our results illustrate its efficiency and applicability.
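The selector-variable mechanism that the API hides can be sketched for the propositional case (this is our own illustration of the general technique, not DepQBF's implementation; the brute-force `solve` stands in for an incremental back-end, and selector ids starting at 1000 assume problem variables are smaller):

```python
from itertools import product

def solve(clauses, assumptions=()):
    """Brute-force satisfiability under assumptions (literals forced true)."""
    vs = sorted({abs(l) for c in clauses for l in c} |
                {abs(l) for l in assumptions})
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        if all(a[abs(l)] == (l > 0) for l in assumptions) and \
           all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

class GroupSolver:
    """Clause groups via selector variables.  Each group gets a fresh
    selector s; every clause C of the group is stored as (C or -s).
    Active groups are enabled by assuming s; a removed group is
    permanently disabled by the unit clause (-s), which satisfies all
    of its clauses."""
    def __init__(self):
        self.clauses, self.groups, self.next_sel = [], {}, 1000

    def add_group(self, name, group_clauses):
        sel, self.next_sel = self.next_sel, self.next_sel + 1
        self.groups[name] = sel
        for c in group_clauses:
            self.clauses.append(tuple(c) + (-sel,))

    def remove_group(self, name):
        self.clauses.append((-self.groups.pop(name),))

    def check(self):
        return solve(self.clauses, assumptions=tuple(self.groups.values()))

s = GroupSolver()
s.add_group("a", [(1,)])     # group a asserts x1
s.add_group("b", [(-1,)])    # group b asserts not-x1
print(s.check())             # False: the two groups conflict
s.remove_group("b")
print(s.check())             # True
```

Hiding the selectors and assumptions behind named groups, as the DepQBF API does, spares client tools exactly this bookkeeping.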
Florent Capelli (Université Paris Diderot)
Stefan Mengel (Laboratoire d'Informatique de l'École Polytechnique (LIX), École Polytechnique)
Friedrich Slivovsky (Institute of Information Systems, Vienna University of Technology)
ABSTRACT. We show that the traces of recently introduced dynamic programming algorithms for #SAT can be used to construct structured deterministic DNNF (decomposable negation normal form) representations of propositional formulas in CNF (conjunctive normal form). This allows us to prove new upper bounds on the complexity of compiling CNF formulas into structured deterministic DNNFs in terms of parameters such as the treewidth and even the clique-width of the incidence graph.
Marius Lindauer (University of Freiburg)
Frank Hutter (University of Freiburg)
ABSTRACT. Most modern SAT solvers expose a range of parameters to allow some customization to improve performance on specific instance sets. Performing this customization manually can be challenging and time-consuming, and as a consequence several automated algorithm configuration methods have been developed for this purpose. Although automatic algorithm configuration has already been applied successfully to many different SAT solvers, a comprehensive analysis of the configuration process is usually not readily available to users. Here, we present SpySMAC, the first tool to address this gap by providing an easy-to-use framework for (i) automatic configuration of SAT solvers in different settings, (ii) a thorough performance analysis comparing the best found configuration to the default one, and (iii) an assessment of each parameter's importance using the fANOVA framework. To showcase our tool, we apply it to MiniSAT and Lingeling, two well-known solvers with very different characteristics.
ABSTRACT. We introduce h-modularity, a structural parameter of CNF formulas, and present algorithms that render the decision problem SAT and the model counting problem #SAT fixed-parameter tractable when parameterized by h-modularity. The new parameter is defined in terms of a partition of the given CNF formula into strongly connected communities which are sparsely interconnected with each other. Each community forms a hitting formula, whereas the interconnections between communities form a graph of small treewidth. Our algorithms first identify the community structure and then use it for an efficient solution of SAT and #SAT, respectively. We further show that h-modularity is incomparable with known parameters under which SAT or #SAT is fixed-parameter tractable.
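A key building block above is the hitting formula: a CNF in which every two clauses clash in at least one variable, so their sets of falsifying assignments are pairwise disjoint and the model count has the closed form 2^n - sum over clauses C of 2^(n - |C|). A small sketch verifying this against brute force (variable numbering 1..n is our own convention):

```python
from itertools import combinations, product

def is_hitting(clauses):
    """Every pair of clauses clashes, i.e., contains opposite literals."""
    return all(any(-l in d for l in c) for c, d in combinations(clauses, 2))

def hitting_count(clauses, n):
    """#SAT for a hitting formula over n variables, in closed form:
    the falsifying assignments of distinct clauses are disjoint."""
    assert is_hitting(clauses)
    return 2 ** n - sum(2 ** (n - len(c)) for c in clauses)

def brute_count(clauses, n):
    return sum(all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
               for bits in product([False, True], repeat=n))

f = ((1, 2), (-1, -2))        # a hitting formula: the clauses clash on x1 and x2
print(hitting_count(f, 2))    # 2
print(brute_count(f, 2))      # 2
```

This polynomial-time counting inside each community, combined with small treewidth between communities, is what makes the parameter algorithmically useful.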
Jesus Giráldez-Cru (IIIA-CSIC)
Jordi Levy (IIIA - CSIC)
Laurent Simon (LABRI)
ABSTRACT. Conflict-Driven Clause Learning (CDCL) techniques are nowadays the cornerstone of modern SAT solvers specialized in industrial instances. In recent years, one focus has been on strategies to select which learnt clauses are removed during search. Originally, the need for removing clauses was motivated by the finiteness of memory. Recently, it has been shown that more aggressive clause deletion policies may improve solver performance, even if memory is sufficient. Also, the utility of learnt clauses has been related to the modular structure of industrial formulas.
In this paper, we show that more learnt clauses do not always mean less run-time. On the contrary, providing a SAT solver with learnt clauses, and not allowing it to remove them, may slow it down. We identify a set of highly useful learnt clauses and show that, even if we disallow the solver from removing them, they contribute to speeding it up, especially when dealing with satisfiable formulas. These clauses are learnt by solving pairs of connected communities of the formula. In an exhaustive experimentation, we show that adding such clauses as a formula preprocessing step helps outperform SAT solvers in most cases. This suggests that the community structure may play an important role in clause deletion policies.
ABSTRACT. We present a new algorithm that can efficiently extract information about nested functional dependencies between variables of a formula in CNF. Our algorithm uses the relation between gate encodings and blocked sets in CNF formulas; our notion of "gate" emphasizes this relation. The presented algorithm is central to our new tool "cnf2aig", which produces equisatisfiable and-inverter graphs (AIGs) from CNF formulas. We compare the novel algorithm to earlier approaches and show that the produced AIGs are generally more succinct and use fewer input variables. As gate detection is related to the structure of input formulas, we furthermore analyze the gate detection before and after applying preprocessing techniques.
Ruben Martins (University of Oxford)
Mikolas Janota (INESC-ID Lisboa)
Ines Lynce (INESC-ID/IST, University of Lisbon)
Vasco Manquinho (INESC-ID, IST/University of Lisbon, Portugal)
ABSTRACT. Most recent MaxSAT algorithms rely on a succession of calls to a SAT solver in order to find an optimal solution. In particular, several algorithms take advantage of the ability of SAT solvers to identify unsatisfiable subformulas. Usually, these MaxSAT algorithms perform better when small unsatisfiable subformulas are found early. However, this is not the case in many problem instances, since the whole formula is given to the SAT solver in each call. In this paper, we propose to partition the MaxSAT formula using a resolution-based graph representation. Partitions are then iteratively joined by using a proximity measure extracted from the graph representation of the formula. The algorithm ends when only one partition remains and the optimal solution is found. Experimental results show that this new approach further enhances a state-of-the-art MaxSAT solver to optimally solve a larger set of industrial problem instances.
Alessandro Previti (University College Dublin)
Joao Marques-Silva (University College Dublin)
ABSTRACT. The problem of propositional formula minimization can be traced back to the middle of the last century, to the seminal work of Quine and McCluskey, with a large body of work ensuing from it. Given a set of implicants (or implicates) of a formula, the goal of minimization is to find a smallest set of prime implicants (or implicates) equivalent to the original formula. This paper considers the more general problem of computing a smallest prime representation of a non-clausal propositional formula, which we refer to as formula simplification. Moreover, the paper proposes a novel, entirely SAT-based approach for the formula simplification problem. The original problem addressed by the Quine-McCluskey procedure can thus be viewed as a special case of the problem addressed in this paper. Experimental results, obtained on well-known representative problem instances, demonstrate that a SAT-based approach for formula simplification is a viable alternative to existing implementations of the Quine-McCluskey procedure.
Xin Zhang (Georgia institute of Technology)
Aditya Nori (Microsoft Research)
Mayur Naik (Georgia institute of Technology)
ABSTRACT. Very large MaxSAT instances, comprising 10^20 clauses and beyond, commonly arise in a variety of domains. We present VOLT, a framework for solving such instances using an iterative, lazy grounding approach. In each iteration, VOLT grounds a subset of clauses in the MaxSAT problem and solves it using an off-the-shelf MaxSAT solver. VOLT provides a common ground to compare and contrast different lazy grounding approaches for solving large MaxSAT instances. We cast four diverse approaches from the literature on information retrieval and program analysis as instances of VOLT. We have implemented VOLT and evaluated its performance under different state-of-the-art MaxSAT solvers.
ABSTRACT. We show that the way CDCL SAT solvers find a satisfying assignment is fundamentally different from the way they prove unsatisfiability. We explain this important difference by identifying direct connections to the workings of some of the most important elements in CDCL solvers: the effects of restarts and VSIDS, and the roles of learned clauses. We give a wide range of concrete evidence that highlights the varying effects and roles of these elements. As a result, this paper not only sheds new light on the internal workings of CDCL but ultimately suggests a promising new direction in SAT research. Based on the above conjecture, we have developed several new ideas for improving the performance of SAT solvers by optimizing them for either satisfiable or unsatisfiable instances. We have implemented these ideas on top of our new solver COMiniSatPS and observed significant performance improvement.
Carlos Mencía (University College Dublin)
Joao Marques-Silva (University College Dublin)
ABSTRACT. The enumeration of minimal unsatisfiable subsets (MUSes) finds a growing number of practical applications, including a wide range of diagnosis problems. As a concrete example, the problem of axiom pinpointing in the EL family of description logics (DLs) can be modeled as the enumeration of the group-MUSes of Horn formulae. In turn, axiom pinpointing for the EL family of DLs finds important applications, such as debugging medical ontologies, of which SNOMED CT is the best known example. The main contribution of this paper is to develop an efficient group-MUS enumerator for Horn formulae, HgMUS, that finds immediate application in axiom pinpointing for the EL family of DLs. In the process of developing HgMUS, the paper also identifies performance bottlenecks of existing solutions. The new algorithm is shown to outperform all alternative approaches when the problem domain targeted by group-MUS enumeration of Horn formulae is axiom pinpointing for the EL family of DLs, with a representative suite of examples taken from different medical ontologies.
Tzu-Chien Hsu (National Taiwan University)
Jie-Hong Roland Jiang (Department of Electrical Engineering/Graduate Institute of Electronics Engineering, National Taiwan University)
ABSTRACT. Quantified Boolean satisfiability (QSAT) is a natural formulation of many decision problems, and yet it awaits further breakthroughs to reach the maturity enabling industrial applications. Recent advancements on quantified Boolean formula (QBF) proof systems sharpen our understanding of their proof complexities and shed light on solver improvement. In particular, QBF solving based on formula expansion has been theoretically and practically demonstrated to be more powerful than non-expansion-based solving. However, recursive expansion suffers from exponential formula explosion and has to be carefully managed. In this paper, we propose a QBF solver using levelized SAT solving in the flavor of formula expansion. New learning techniques are devised to control formula growth. Experimental results on application benchmarks show that our prototype implementation is comparable with state-of-the-art solvers and outperforms other solvers on certain instances.
Gereon Kremer (RWTH Aachen University)
Sebastian Junges (RWTH Aachen University)
Stefan Schupp (RWTH Aachen Theory of Hybrid Systems)
Erika Abraham (RWTH Aachen University)
ABSTRACT. During the last decade, popular SMT solvers were extended step-by-step with a wide range of satisfiability-checking procedures for different theories. Many of those SMT solvers also support the user-defined tuning and combination of the procedures, typically via command-line options. However, configuring solvers this way is a tedious task with restricted options.
In this paper we present our modular and extensible C++ library SMT-RAT, which offers numerous parameterized procedure modules for different logics. These modules can be configured and combined into an SMT solver using a comprehensible but powerful strategy, which can be specified via a graphical user interface. This makes it easier to construct a solver which is tuned for a specific set of problem instances. Compared to a previous version, we extended our library with a number of new modules, and support parallelization in strategies. An additional contribution is our thread-safe and generic C++ library CArL, offering efficient data structures and basic operations for real arithmetic, which can be used for the fast implementation of new theory-solving procedures.
Matteo Marescotti (Università della Svizzera Italiana)
Natasha Sharygina (Università della Svizzera Italiana)
ABSTRACT. This paper studies how parallel computing can be used to reduce the time required to solve instances of the Satisfiability Modulo Theories problem (SMT). We address the problem in two orthogonal ways: (i) by distributing the computation using algorithm portfolios, search space partitioning techniques, and their combinations; and (ii) by studying the effect of partitioning heuristics, in particular the look-ahead heuristic, on the efficiency of the partitioning. We implemented the approaches in the OpenSMT2 solver and experimented with the QF_UF theory on a computing cloud. The results show a consistent speed-up on hard instances with up to an order of magnitude run time reduction and several more instances being solved within the timeout.
Michael Kaufmann (Wilhelm-Schickard-Institute for Informatics, University of Tuebingen, Germany.)
ABSTRACT. Searching for minimal explanations of infeasibility in constraint sets is a problem known for many years. Recent developments closed a gap between approaches that enumerate all minimal unsatisfiable subsets (MUSes) of an unsatisfiable formula in the Boolean domain and approaches that extract only a single MUS. These new algorithms are described as partial MUS enumerators. They offer a viable option when complete enumeration is not possible within a certain time limit. This paper develops a novel method to identify clauses that are identical regarding their presence or absence in MUSes. With this concept we improve the performance of some of the state-of-the-art partial MUS enumerators within their already established framework. In our approach we focus mainly on determining minimal correction sets much faster, which subsequently improves the MUS finding. An extensive practical analysis shows the increased performance of our extensions.
ABSTRACT. The invention of the variable decision heuristic VSIDS (variable state independent decaying sum) in the context of the CDCL (conflict-driven clause learning) SAT solver Chaff is considered a major milestone in improving the efficiency of modern SAT solvers on application benchmarks. In this paper, we propose a new variant ACIDS (average conflict-index decision score) and relate it to the original implementation of VSIDS, its popular modern implementation EVSIDS (exponential VSIDS), the VMTF (variable move-to-front) scheme, and other related decision heuristics. The main goal is to provide an empirical evaluation to serve as a starting point for trying to understand the reason for the efficiency of these decision heuristics. In our experiments, it turns out that VMTF, ACIDS, and EVSIDS behave very similarly if implemented carefully.
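The EVSIDS trick referred to above is commonly implemented as in MiniSat: rather than periodically multiplying every score by a decay factor, the bump increment grows geometrically, which is numerically equivalent. A minimal sketch of this scheme (class and method names are our own):

```python
class EVSIDS:
    """Exponential VSIDS sketch.  The bump increment grows by 1/decay
    after each conflict, which is equivalent to multiplying every
    score by `decay` per conflict; scores are rescaled near overflow."""
    def __init__(self, nvars, decay=0.95):
        self.score = [0.0] * (nvars + 1)   # 1-based variable indexing
        self.inc, self.decay = 1.0, decay

    def on_conflict(self, learned_clause_vars):
        for v in learned_clause_vars:
            self.score[v] += self.inc       # bump variables in the learned clause
            if self.score[v] > 1e100:       # rescale to keep floats finite
                self.score = [s * 1e-100 for s in self.score]
                self.inc *= 1e-100
        self.inc /= self.decay              # future bumps weigh more

    def pick(self, unassigned):
        return max(unassigned, key=lambda v: self.score[v])

h = EVSIDS(3)
h.on_conflict([1, 2])
h.on_conflict([2, 3])
print(h.pick([1, 2, 3]))   # 2: bumped in both conflicts, recent bump weighs more
```

ACIDS and VMTF differ only in the bookkeeping performed per conflict, which is what makes the head-to-head empirical comparison in the paper meaningful.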
Alessandro Previti (University College Dublin)
Joao Marques-Silva (University College Dublin)
ABSTRACT. Knowledge compilation and approximation finds a wide range of practical applications. One relevant task in this area is to compute the Horn least upper bound (Horn LUB) of a propositional theory F. The Horn LUB is the strongest Horn theory entailed by F. This paper studies this problem and proposes two new algorithms that rely on making successive calls to a SAT solver. The algorithms are analyzed theoretically and evaluated empirically. The results show that the proposed methods are complementary and enable computing Horn LUBs for instances with a non-negligible number of variables.