Abstract. Some interesting problems have poor worst-case complexity, but may still be soluble in some, or many, cases. Where appropriate, ‘fixed-parameter tractability’ is one way of describing this solubility. But in other cases, e.g. SAT-solving, there is not necessarily a neat theoretical story, but there are powerful practical ideas. What can we say about problems around real quantifier elimination?
Towards a New Projection for Cylindrical Algebraic Decomposition
ABSTRACT. Cylindrical Algebraic Decomposition (CAD) started as a single algorithm for constructing a single kind of mathematical object. However, under the umbrella of "CAD" we now have many different algorithms producing different kinds of mathematical objects. While different from one another in important ways, all of these approaches
are similar in that they rely on a projection operator, a notion of delineability, and a mechanism for lifting. In this talk we propose a new projection operator, based on analytic delineability and real root isolation for lifting, that is closely related to McCallum's projection, but that offers a possible solution to the problem of "nullification", which causes CAD based on McCallum's projection to be incomplete. The new projection offers particular advantages for "model-based" algorithms, like NLSAT, Cylindrical Algebraic Coverings, and Non-uniform CAD.
ABSTRACT. Symbolic computation is rooted in mathematical logic and covers all types of computations with symbolic objects. A special area in symbolic computation covers logical problems and their algorithmic solution. For real algebra, the method of cylindrical algebraic decomposition (CAD) offers an elegant solution, but it comes with scalability problems due to its doubly exponential worst-case complexity.
A more recent but strongly related research area called satisfiability checking has been established in computer science. Focusing first on easier theories, recent developments have led to new functionalities implemented in satisfiability modulo theories (SMT) solvers to support satisfiability checking also for quantifier-free real-algebraic formulas, partly with the help of CAD-based techniques, thus sharing interests with the community of symbolic computation.
James H. Davenport was the main initiator of a past H2020-FETOPEN-2015-CSA project called SC-Square: Satisfiability Checking and Symbolic Computation, with the overall objective of connecting these areas and initiating their fruitful exchange. In this talk we recall some milestones of these developments and make some observations on their impact.
ABSTRACT. We present a recent algorithm for testing whether a polynomial in ZZ[x] is cyclotomic, and if so, determining its index. We also present an algorithm for determining a small superset of the indices of the cyclotomic factors of a given polynomial in ZZ[x]. As an application, this gives a faster way of determining whether a polynomial is (LRS)-degenerate.
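As background for the task (a naive baseline for illustration, not the authors' recent algorithm), one can exploit the bound φ(n) ≥ √(n/2): if a monic f of degree d is cyclotomic, its index n satisfies φ(n) = d and hence n ≤ 2d², so only finitely many candidates need checking. A minimal pure-Python sketch, with coefficient lists in low-to-high order:

```python
def phi(n):
    """Euler's totient by trial-division factorisation."""
    result, m, p = n, n, 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:
                m //= p
            result -= result // p
        p += 1
    if m > 1:
        result -= result // m
    return result

def poly_div(num, den):
    """Exact division of integer-coefficient polynomials (low-to-high lists)."""
    num = num[:]
    q = [0] * (len(num) - len(den) + 1)
    for i in range(len(q) - 1, -1, -1):
        c = num[i + len(den) - 1] // den[-1]
        q[i] = c
        for j, d in enumerate(den):
            num[i + j] -= c * d
    return q

def cyclotomic(n):
    """Phi_n(x) = (x^n - 1) / prod over proper divisors d of n of Phi_d(x)."""
    p = [-1] + [0] * (n - 1) + [1]  # x^n - 1
    for d in range(1, n):
        if n % d == 0:
            p = poly_div(p, cyclotomic(d))
    return p

def cyclotomic_index(f):
    """Return n if f equals Phi_n, else None (f: coefficients low-to-high)."""
    d = len(f) - 1
    if d < 1 or f[-1] != 1:
        return None
    # phi(n) >= sqrt(n/2), so phi(n) = d forces n <= 2 * d * d
    for n in range(1, 2 * d * d + 1):
        if phi(n) == d and cyclotomic(n) == f:
            return n
    return None
```

This is quadratic-in-degree candidate enumeration and repeated exact division, which is exactly the inefficiency the paper's algorithm is designed to avoid.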
ABSTRACT. Since 2008, Albert Rich has been developing an integration system called Rubi (Rule Based Integration). Many collaborators, including DJJ, have contributed to the system, although overall it remains largely a solo project. James may remember the gloomy prognostications that accompanied the first appearance of Mathematica, and how pattern recognition and term rewriting are no basis for a computer algebra system. Mathematica has survived, and Rubi is another system intent on disproving the neigh-sayers, who clearly lacked basic horse sense. I shall report briefly on progress.
ABSTRACT. This paper focuses on developing an innovative solution to the protein folding problem, a long-standing challenge in Bioinformatics, through the lens of quantum reinforcement learning (QRL). It proposes a hybrid method which incorporates a Quantum Neural Network created using a Parameterized Quantum Circuit. The study is conducted on the Hydropolar Model, a simplified representation of proteins in terms of their polarity, and the experiments are carried out in both 2D and 3D lattice representations. The datasets used for analysis include benchmark proteins from existing research and PDB or FASTA files from the Protein Data Bank. The core of the paper presents the detailed process of modeling and implementing the protein folding problem using QRL, along with detailed comparisons with classical deep reinforcement models inspired by the literature. The findings from this study demonstrate that QRL performs effectively in predicting energies close to optimal levels. However, it poses a greater challenge in terms of convergence when compared to classical methods. Besides the novel method proposed, we also present a supporting custom tool, the protein-folding-gym-utils Python library, designed specifically to assist researchers in applying reinforcement learning techniques to the Protein Folding Problem.
ABSTRACT. The study presents a comparison between Machine Learning approaches, such as Decision Trees, XGBoost, Linear Discriminant Analysis, and Histogram Boost, for detecting days with a higher likelihood of an increased incidence of stroke occurrences in the region of Transylvania, Romania, using meteorological data. The study brings an original contribution to existing work by including air fronts in the analysis and focusing on machine learning methods, being the first to approach this issue in Romania. Furthermore, it covers a large period of 10 years (2013-2022). The proposed methods include dimensionality reduction and clustering techniques, as the initial dataset contains a large class imbalance, with the positive class of critical days comprising roughly 1/20 of the entire dataset. Based on the findings outlined, the combination of Kernel PCA, K-means, XGBoost, and Feature Selection emerges as the most effective pipeline, resulting in an overall metric of 74% when evaluated on the test dataset.
Cyber vulnerabilities surveillance via web-scraped specialized documents
ABSTRACT. Cyber security articles represent an important source of up-to-date information about software vulnerabilities, including data like Indicators of Compromise, highly valuable to any security researcher. The articles can be collected using Crawlers and Scrapers, and can later be analyzed in order to create keyword-type searches. This paper conducts a study of the extraction of articles from forum-like websites, and of relevant information from the articles, while highlighting different types of Crawlers, Scrapers and analysis algorithms, such as the extraction of Indicators of Compromise, keyword text search and Natural Language Processing methods. Thus, we present the types of forum websites with their characteristics, alongside different types of Crawlers, while also presenting some of the benefits of ethical scraping and some of its most important principles, such as postponing requests to the server and changing the User-Agent header. The extracted information can be saved in an Elasticsearch index, in order to create various statistics about the extracted information, providing an up-to-date overview of the current state of the cybersecurity space.
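The two ethical-scraping principles named in the abstract, postponing requests and setting a User-Agent header, can be made concrete in a few lines. A minimal stdlib sketch (the function name and User-Agent string are ours, for illustration only):

```python
import time
import urllib.request

def polite_request(url, user_agent="ResearchCrawler/1.0", delay=1.0):
    """Build a request honouring two ethical-scraping principles:
    wait between requests, and identify the client via User-Agent."""
    time.sleep(delay)  # postpone the request so the server is not overloaded
    return urllib.request.Request(url, headers={"User-Agent": user_agent})
```

The returned `Request` can then be fetched with `urllib.request.urlopen`; in a real crawler the delay would typically be adaptive and respect the site's robots.txt.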
ABSTRACT. Cylindrical Algebraic Decomposition (CAD) was the first practical means for doing real quantifier elimination (QE), and is still a major method. Nevertheless, its complexity is inherently doubly exponential in the number of variables. Where applicable, virtual term substitution (VTS) is more effective, turning a QE problem in n variables to one in n-1 variables in one application, and so on. Hence there is scope for hybrid methods: doing VTS where possible then using CAD.
This paper describes such a poly-algorithmic implementation, based on the second author's Ph.D. thesis. The version of CAD used is based on a new implementation of Lazard's method, with some improvements to handle equational constraints.
A couple of catch-ups for the modified unit-circle zero location test
ABSTRACT. The unit-circle zero location test proposed by the author several decades ago has been improved to handle nonessential singularities, and received a version that remains fraction-free for (real or Gaussian) integer polynomials with any singularity profile. In this paper we complement a modified form of the same test, proposed about a decade later, with a corresponding recursion that is not hampered by nonessential singularities and with a corresponding, equally general, integer-preserving version.
Counting Linear Extensions of Modular Partial Orders
ABSTRACT. The counting of linear extensions is one of the prominent problems
about partial orders. Unfortunately, the problem is computationally
hard. In fact, relatively few counting procedures have been proposed
in the literature. In this paper, we present such a counting procedure
based on the modular decomposition of posets. This allows, first, to
identify the series and parallel substructures, for which an efficient
recursive counting procedure is known. Second, we propose a way to
handle more complex prime substructures using an alternative
decomposition scheme based on two complementary rules: the BIT-rule
that can ``consume'' individual elements in a chain, and the
SPLIT-rule that can ``break'' antichains. We develop a symbolic algorithm
based on a multivariate integral formula for the linear extension
count that can be constructed along the decomposition. To discuss the
complexity of this algorithm, we introduce a novel parameter, the
BIT-width. We show that the algorithm is $\mathcal{O}(n^{w+1})$ where
$w$ is the BIT-width of the input poset.
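The efficient recursion for series and parallel substructures mentioned above is classical: e(P;Q) = e(P)·e(Q) for a series composition, and e(P∥Q) = C(|P|+|Q|, |P|)·e(P)·e(Q) for a parallel one, since the two extensions interleave freely. A minimal sketch, assuming posets are given as series-parallel expression trees (the paper's BIT/SPLIT machinery for prime substructures is not covered here):

```python
from math import comb

def count_sp(expr):
    """Count linear extensions of a series-parallel poset.

    expr is 'x' for a single element, ('S', a, b) for series composition
    (everything in a below everything in b), or ('P', a, b) for parallel
    composition (disjoint union).  Returns (size, extension count).
    """
    if expr == 'x':
        return (1, 1)
    op, a, b = expr
    na, ea = count_sp(a)
    nb, eb = count_sp(b)
    if op == 'S':
        # series: all of a's extension must precede all of b's
        return (na + nb, ea * eb)
    # parallel: choose which positions a's elements occupy, then order each part
    return (na + nb, comb(na + nb, na) * ea * eb)
```

For example, a 3-element antichain built as ('P', 'x', ('P', 'x', 'x')) yields 3! = 6 extensions, as expected.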
ABSTRACT. This note briefly reports on a set of practical tools that are part of a larger system, the full description of which will be presented elsewhere. These tools are designed to assist mathematicians working with theorem provers like Prover9 on open problems. While Prover9 is appreciated for its intuitive syntax, many other widely used provers utilize TPTP or Waldmeister syntax. This divergence poses challenges for working mathematicians who wish to take advantage of the fact that some provers excel at certain types of problems while others are better suited to different kinds. Additionally, some of these provers struggle with handling theories containing multiple goals, a common scenario when mathematicians are exploring meaningful conjectures. To address these issues, we have developed a translator to facilitate the creation of a prover portfolio in the user's preferred language, as well as a dedicated algorithm for processing multiple goals. This tool enables the use of various provers without the need to rewrite theories, and proves conjectures sequentially, each proven conjecture serving as a new axiom for subsequent ones.
Composite clustering crossover operator for improving genetic algorithms
ABSTRACT. Introduced by Holland in 1975, genetic algorithms are used to search for solutions in cases where the search space is large. Standard implementations of genetic algorithms use well-known operators like parent selection, crossover and mutation. For each type of operator, there are different possible implementations to choose from. In most cases, however, the crossover implementations are variations of the same idea, concentrated around cutting parent chromosomes into parts and using these parts to build offspring. There are few exceptions to be found in the literature, related to k-means clustering crossover. This paper presents an additional insight into the clustering crossover by analyzing the results of binary, integer and real coded genetic algorithms (RGAs). The experimental work compares a standard genetic algorithm solution for common RGA benchmark functions, the OneMax problem as a binary coded GA and an integer coded RNA representation, with an original implementation which uses a clustering crossover operator. The results show that the use of the composite clustering crossover operator can increase the performance of the genetic algorithm in some cases, leading to faster convergence to the desired solution. However, not all problem types benefit from the combination. Further study is required.
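For contrast, the "cutting parent chromosomes into parts" idea that most implementations vary is the classic one-point crossover. A minimal sketch for list-encoded chromosomes (the function name is ours, not the paper's, and the paper's clustering operator is not shown):

```python
import random

def one_point_crossover(parent1, parent2, rng=random):
    """Classic one-point crossover: cut both parents at the same random
    position and swap the tails to build two offspring."""
    assert len(parent1) == len(parent2) and len(parent1) >= 2
    cut = rng.randrange(1, len(parent1))
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2
```

The clustering crossover studied in the paper replaces this positional cut with groupings derived from k-means over the population, so offspring are assembled from cluster information rather than contiguous segments.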
Towards a virtual/mixed-reality graphical programming language
ABSTRACT. We previously arrived at the problem of enhancing the usability of string diagrams used to give semantics to monad-comonad adjunctions for GUI program representations, beyond the advantages given by composability.
We proposed a semantics in a 3D space using the idea that data types are "circles" positioned at some sort of "distance" from a conscious observer, gravitating towards a "center" given by an "input-output" tensor, much like magnetic fields around a coil. The combination gives birth to a torus shape where programs are closed transversal loops that merge into "vortices" (or "spirals") on the surface of the torus. Enhancement of the intuition would then be available when giving meaning to size, rotation speeds, traversal speed, color and sound, in a consistent manner, in a VR/AR environment.
We now build upon this idea to show that a suitable representation of the program data types in the program described above would be a Hopf fibration of a 3-sphere, given the cardinality of the zoom levels of the categorical representation of the program.
Strictly Pseudocontractive Mappings Beyond Linear Spaces
ABSTRACT. Physical phenomena often manifest in mathematics as optimization problems with certain constraints. To effectively solve most of these problems with constraints, it is usually necessary to consider notions such as convexity, linearity, or smoothness. Interestingly, metric spaces with flexible distance structures beyond just the Euclidean distance play a significant role in addressing these situations. In our talk, we focus on strictly pseudocontractive mappings in general metric spaces as a natural extension of the existing notion based on the Euclidean norm. We establish important properties of these mappings and show how fixed points of the mappings can be effectively approximated using an iterative scheme for fixed points of nonexpansive mappings in CAT(0) spaces.
Proximal-type methods for Approximation of Solutions to Equilibrium Problems over the Set of Fixed Points of Enriched Nonexpansive Mappings
ABSTRACT. In this paper, we propose a proximal-type algorithm for approximating solutions to an equilibrium problem over the set of fixed points of an enriched nonexpansive mapping, where the underlying bifunction is a monotone operator. We prove the convergence of the sequence generated by our proposed method under standard conditions. Additionally, we provide numerical illustrations to validate our results and compare them with existing literature.
In 1922, a young 21-year-old Curry started reading Principia Mathematica and was intrigued by the complications of its substitution rule. As a result of trying to analyse substitution, Curry conceived of the combinators in 1926 and started a lifelong search for powerful systems that combine computations and deductions (functions and logic) and that are able to formalise mathematics. His formalisms as well as existing complementary and competing formalisms demonstrate the struggle to internalise as much as possible while keeping the system consistent. There is also a struggle for elegant theories that minimise the number of basic concepts while remaining as close as possible to the language’s structure, and also the struggle for generalising concepts, connecting areas that may seem far apart and applying useful techniques from one area to the other.
This talk discusses two aspects of this development:
First, we go back as far as the ancient mathematicians (and especially the Pythagorean school) and discuss how the problems/paradoxes associated with the infinite led them to miss the opportunity of defining the real numbers, and how the developments from the middle ages to the 17th and 19th centuries led to the development of real numbers and mathematical analysis, and also to even more paradoxes that led to the birth of computability and mathematical foundations.
Second, we discuss the problem of defining and internalising substitution in the lambda calculus and how to avoid the hurdle of how/when to introduce the substitution rule and alpha congruence. We cannot define alpha congruence without a form of substitution, but for substitution and reduction to work, we need to assume a form of alpha congruence. The Curry school of thought has laid the foundations for this.
There is also a third aspect which discusses the struggle of going fully formal (especially with the variety and approaches to mechanise and formalise mathematics, e.g., in proof checkers) and keeping the informal as well as the historical tradition and lessons.
On validating Lazard's projection in CAD (with James' help)
ABSTRACT. In 1990, Daniel Lazard announced a new projection operation for cylindrical algebraic decomposition (CAD). His operation is considerably reduced compared with Collins' original one, hence is of significant computational benefit. However, soon after Lazard's announcement, a flaw in the validity proof of the proposed projection was found. Fast forwarding to 2011, Hoon Hong and Scott McCallum had begun to look again at Lazard's proposed method, with the hope of validating at least some special cases of it. At this point, James Davenport asked us a very pertinent question: is there a counter-example to the validity of Lazard's method? Stimulated by James' question, we struggled to find one, but to no avail. However, the insight gained from that effort led to some partial success in finding a correct partial validation of Lazard's method. From there, with the help of Adam Parusinski and Laurentiu Paunescu, a complete validity proof of Lazard was finally obtained. This talk will fill in some of the details of this story, and touch upon some of the recent uses of Lazard's method in CAD.
Can Explainable AI Give Insights for Symbolic Computation?
ABSTRACT. In recent years there has been increased use of machine learning (ML) techniques within mathematics, including symbolic computation where it may be applied safely to optimise or select algorithms. We consider the emerging field of explainable AI (XAI)
in which ML models are either directly interpretable, or post-hoc analysis is conducted upon them to explain their decisions. We hypothesise that such techniques may allow new insights for symbolic computation, inspiring new implementations within computer algebra systems that do not directly call upon AI tools.
We present a case study on the use of ML to select the variable ordering for cylindrical algebraic decomposition. It has already been demonstrated that ML can make this choice well, but here we show how the SHAP tool for explainability can be used to inform new heuristics of a size and complexity similar to those of the human-designed heuristics currently in common use in symbolic computation.
ABSTRACT. We consider the functional inverse of the Gamma
function in the complex plane. The inverse is a multivalued
function, and we define a set of suitable branches by proposing
an extension from the real line to the complex plane.
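On the real line, a hedged numerical illustration of the inversion problem: Γ is strictly increasing for x ≥ 2, so that restriction can be inverted by bisection on log Γ. This sketch does not touch the complex-plane branches the talk defines:

```python
import math

def inverse_gamma(y, lo=2.0, hi=1e6):
    """Invert Gamma on the real branch x >= 2, where Gamma is strictly
    increasing, by bisection on log-Gamma (math.lgamma) for numerical
    stability.  Requires Gamma(lo) <= y <= Gamma(hi)."""
    target = math.log(y)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if math.lgamma(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For instance, since Γ(5) = 4! = 24, `inverse_gamma(24.0)` returns approximately 5.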
ABSTRACT. Many of James Davenport’s research interests and results in symbolic computation are closely related to Maple commands and functionality. In this presentation, I will give an overview of the various contributions James has made over the years that directly or indirectly influenced Maple.
A Feng-Liu type fixed point theorem on a set endowed with two metrics
ABSTRACT. In this paper, we will present some fixed point results for Feng-Liu multi-valued operators on a set endowed with two metrics. Some stability
properties of the fixed point inclusion are also established. An application to an integral inclusion of Fredholm type is presented.
An early survey on enriched contractive type mappings
ABSTRACT. We present a few relevant developments on enriched contractive type mappings that followed the introduction of this class in the paper [Berinde, V. Approximating fixed points of enriched nonexpansive mappings by Krasnoselskij iteration in Hilbert spaces. Carpathian J. Math. 35 (2019), no. 3, 293--304.]
Some stability results for multi-valued nonlinear graph contractions on a set with two metrics
ABSTRACT. This work focuses on proving some stability results for a general class of multi-valued contractions in the setting of a set endowed with two metrics.
Fixed point results for multi-valued nonlinear graph contractions on a set with two metrics
ABSTRACT. The aim of this paper is to prove a multi-valued version of Maia's fixed point theorem for multi-valued nonlinear graph contraction in the case of a nonempty set endowed with two metrics.
Fixed Point Theorems for Enriched Multivalued Nonexpansive Mappings in Hilbert Spaces
ABSTRACT. The main purpose of this paper is to extend some fixed point theorems from the general class of nonexpansive mappings introduced by Vasile Berinde, denoted b-enriched nonexpansive mappings, to multivalued mappings in Hilbert spaces.
Weak and Strong Convergence Theorems for the Krasnoselskii Iterative Algorithm in the Class of Enriched Strictly Pseudocontractive and Enriched Nonexpansive Operators in Hilbert Spaces
ABSTRACT. In this paper, we present some results about the approximation of fixed points of enriched strictly
pseudocontractive and enriched nonexpansive operators. There are numerous works in this regard (for example [9], [10], [11], [14], [16], [35] and references therein). Of course, the bibliographical references are extensive and they are mentioned at the end of this paper. In order to approximate the fixed points of enriched strictly pseudocontractive and enriched nonexpansive mappings, we use
the Krasnoselskii iterative algorithm, for which we prove weak and strong convergence theorems. Also, in this paper, we make a comparative study of some classical convergence theorems from the literature in the class of enriched strictly pseudocontractive and enriched nonexpansive mappings.
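The Krasnoselskii iteration referred to above has the familiar averaged form x_{n+1} = (1 - λ)x_n + λT(x_n). A minimal scalar sketch for intuition only (the paper works with operators in Hilbert spaces, not on the real line):

```python
import math

def krasnoselskii(T, x0, lam=0.5, tol=1e-10, max_iter=10_000):
    """Krasnoselskii iteration x_{n+1} = (1 - lam) * x_n + lam * T(x_n),
    stopping when successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = (1.0 - lam) * x + lam * T(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: the unique real fixed point of cos (the Dottie number)
dottie = krasnoselskii(math.cos, 1.0)
```

The averaging parameter λ is exactly what the "enriched" classes exploit: mappings that are not themselves nonexpansive can have an averaged perturbation that is.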
Metric conditions, graphic contractions and weakly Picard operators
ABSTRACT. In this paper we study the following problem: \emph{Let $(X,d)$ be a complete metric space and $f:X\to X$ be a mapping. Which metric conditions imposed on $f$ imply that:
\begin{itemize}
\item [$(i)$] $f$ is a graphic contraction;
\item [$(ii)$] $f$ is a weakly Picard mapping.
\end{itemize} }