NCL'22: NON-CLASSICAL LOGICS. THEORY AND APPLICATIONS 2022
PROGRAM FOR THURSDAY, MARCH 17TH

09:00-10:00 Session 16: Invited Talk

09:00
Temporal Reference and Temporal Indexicals

ABSTRACT. I sketch an approach to logic and indexicality, which I illustrate using hybrid tense logic and temporal indexicals. I have chosen temporal indexicals because they are probably the easiest natural language indexicals to model in a propositional logic. I have chosen hybrid tense logic because while it is rooted in the Priorean tradition of taking the internal perspective on time seriously, it also allows us to incorporate both Reichenbach's approach to temporal reference, and (as we shall see) a Kaplan-style character approach to indexicality. This will enable us to view "now" as a bridge that leads us to the logic(s) of "yesterday," "today", "tomorrow," and indeed other indexicals.

10:00-11:30 Session 17: Contributed Talks

10:00
Negation-free definitions of paraconsistency

ABSTRACT. Paraconsistency is commonly defined and/or characterized as the failure of a principle of explosion. The various standard forms of explosion involve one or more logical operators or connectives, among which the negation operator is the most frequent. In this article, we ask whether a negation operator is essential for describing paraconsistency. In other words, is it possible to describe a notion of paraconsistency that is independent of connectives? We present two such notions of negation-free paraconsistency, one that is completely independent of connectives and another that uses a conjunction-like binary connective that we call 'fusion'. We also derive a notion of 'quasi-negation' from the former and investigate its properties.
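For concreteness, the most common negation-based form of explosion (ex contradictione quodlibet), whose failure is the usual definition of paraconsistency discussed above, is the schema $\{\varphi, \neg\varphi\} \vdash \psi$ for all formulas $\varphi, \psi$; a logic is paraconsistent when this fails for some $\varphi, \psi$. The negation-free notions proposed in the paper replace this schema, so the formula is given here only as the standard baseline.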

10:30
The Forms of Categorical Proposition

ABSTRACT. An exhaustive survey of categorical propositions is proposed in the present paper, both with respect to their nature and the logical problems raised by them. Through a comparative analysis of Term Logic and First-Order Logic, it is shown that the famous problem of existential import may be solved in two ways: with a model-adaptive strategy, in which the square of opposition is validated by restricting the models; or with a language-adaptive strategy, in which the logical form of categorical propositions is extended in order to validate the square in every model. The latter strategy is advocated in the name of logic, which means truth in every model. The resulting semantics of bitstrings can be extended to any kind of formula, beyond the present case of categorical propositions. An interesting case study is the set of knowledge statements considered in Englebretsen (1969), whose general logical form may be treated in the same pattern without introducing any intensional account of modal semantics. Another case is the set of dyadic relations, whose binary predicates can also be analyzed in a purely Boolean way. Finally, the present paper needs some automatic process in order to determine the nature of the logical relations between any pair of the available 256 categorical propositions. This requires the implementation of a programming machine in the style of Prolog.
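For reference, the four traditional categorical forms in their usual first-order rendering, which makes the existential-import problem visible (the A form no longer entails the I form when the subject term is empty); this is the standard textbook formalization, not necessarily the extended logical forms advocated in the paper: A: $\forall x (Sx \to Px)$; E: $\forall x (Sx \to \neg Px)$; I: $\exists x (Sx \land Px)$; O: $\exists x (Sx \land \neg Px)$.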

11:00
Algebraizability of the Logic of Quasi-N4-Lattices

ABSTRACT. The class of quasi-N4-lattices (QN4-lattices) was introduced as a generalization of quasi-Nelson algebras and N4-lattices, in such a way that N4-lattices are the QN4-lattices satisfying the double negation law $({\sim}{\sim} x = x)$ and quasi-Nelson algebras are the QN4-lattices satisfying the explosive law $(x \land {\sim} x) \to y = ((x \land {\sim} x) \to y) \to ((x \land {\sim} x) \to y)$. In this paper, we introduce a logic $(\mathbf{L}_{\mathrm{QN4}})$ via a Hilbert-style presentation whose algebraic semantics is a class of algebras that we show to be term-equivalent to the class of QN4-lattices. The result is obtained by showing that the calculus we introduce is algebraizable in the sense of Blok and Pigozzi and that its equivalent algebraic semantics is equivalent to the class of QN4-lattices. We also consider the question of how one could define $\mathbf{L}_{\mathrm{QN4}}$ as a relevance logic.

11:30-12:00 Coffee Break
12:00-13:00 Session 18: Invited Talk

12:00
Gautama and Almost Gautama Algebras and Their Associated Logics

ABSTRACT. The varieties of regular double Stone algebras and regular Kleene Stone algebras are fairly well known and well studied. The amazing similarity in their structures led me to introduce a new variety as a common generalization of the two varieties. The algebras in this new variety are called "Gautama algebras" in memory and honor of the founders of Indian Logic: Medhatithi Gautama and Akshapada Gautama. It turns out that the variety G of Gautama algebras is the join of the variety of regular double Stone algebras and the variety of regular Kleene Stone algebras, and also that the variety G is the equivalent algebraic semantics of a new logic (in the sense of Blok and Pigozzi).

More recently, the variety G has been further generalized to "Almost Gautama algebras." It turns out that this new variety also has a corresponding logic.

In this lecture, I wish to present results about these new varieties and their corresponding logics.

13:00-14:30 Lunch Break
14:30-15:30 Session 19: Invited Talk

14:30
A Paradox for the Existence Predicate

ABSTRACT. In this paper, a paradox is shown to arise from prima facie highly plausible assumptions for the existence predicate as applied to definite descriptions. There are several possibilities to evade the paradox; all involve modifications in first-order logic with identity, existence, and definite descriptions; some stay within classical logic, others leave it. The merits of the various "ways out" are compared. It is proposed, and supported by argument, that the most attractive "way out" is within classical logic and has the consequence that there is a new logical truth: "There is at least one nonexistent object." But this "exit" will certainly not be to everyone's taste and liking. Thus, the paradox defies complete resolution (as every good paradox should).
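One standard way to formalize the proposed new logical truth, assuming a primitive existence predicate $E!$ (notation illustrative, not necessarily the paper's): $\exists x\, \neg E!x$ ("there is at least one nonexistent object").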

15:30-16:00 Coffee Break
16:00-18:15 Session 20A: Workshop: Advances in Formal Ontology

16:00
Possible Laws Contra Principle of Plenitude

ABSTRACT. The so-called Principle of Plenitude was ascribed to Leibniz by A. O. Lovejoy in The Great Chain of Being: A Study of the History of an Idea (1936). Its temporal version states that what holds always holds necessarily (or that no genuine possibility can remain unfulfilled). This temporal formulation is the subject of the presented research. Lovejoy's idea was criticised by Hintikka, who justified his criticism by referring to the specifically Leibnizian notions of absolute and hypothetical necessity, interpreted in a possible-worlds semantics. In the paper, Hintikka's interpretative suggestions are developed and enriched with a temporal component that is present in Leibniz's characterization of the real world. Our approach uses the Leibnizian idea that change is prior to time, and the idea that there are possible laws that characterize worlds other than the real one. We formulate a modal propositional logic with three primitive operators for change, temporal constancy, and possible lawlikeness. We give its axiomatics and show that the logic is complete with respect to the given possible-worlds semantics. Finally, we show that the counterparts of the considered versions of the Principle of Plenitude are falsified in this semantics, and the same applies to the counterpart of Leibnizian necessitarianism.
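For orientation, a common tense-logical rendering of the temporal Principle of Plenitude (not necessarily the formalization used by the authors), with $G$/$H$ read as "always in the future"/"always in the past" and $F$/$P$ as their duals: $(H\varphi \land \varphi \land G\varphi) \to \Box\varphi$ (what holds always, holds necessarily), equivalently by contraposition $\Diamond\varphi \to (P\varphi \lor \varphi \lor F\varphi)$ (no genuine possibility remains forever unfulfilled).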

16:30
On Interpretation of Mathematical Text: Stetigkeit und irrationale Zahlen as a Case Study

ABSTRACT. Through analysis of Richard Dedekind's Stetigkeit und irrationale Zahlen, we substantiate three interpretative perspectives on a mathematical text: mathematical practice, mathematical structures, and ontology.

Dedekind's [6] was the third in a row of 1872's celebrated papers introducing real numbers, following [9] and [4]. All three extend the rational numbers to a bigger field through various techniques: [9] and [4] employ sequences and their limits, [6] cuts in a totally ordered set and least upper bounds of subsets. Moreover, [4] and [6] motivate their accounts of continuity by references to Euclid's straight line, especially his theory of commensurable and incommensurable line segments. Coming to specifics, [9] shows that the real numbers are Cauchy-complete, [4] takes it for granted, and [6] shows that the line of real numbers is Dedekind-complete, i.e., every cut (A, B) has a maximum in A or a minimum in B, but not both. Regarding the straight line, [4] claims every infinite bounded subset has an accumulation point; [6] models the real numbers to mirror all phenomena of the straight line, yet finally admits having no means to decide whether the straight line itself is Dedekind-complete.
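For reference, the standard notions in play (stated in modern terms, not quoted from [6]): a cut of a totally ordered set $(L, <)$ is a pair $(A, B)$ of nonempty sets with $A \cup B = L$ and $a < b$ for all $a \in A$, $b \in B$; the order is Dedekind-complete (continuous in Dedekind's sense) when every cut is produced by exactly one element, i.e., $A$ has a maximum or $B$ has a minimum, but not both.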

The 20th-century foundations of geometry reshaped the modern understanding of Euclid's straight line. Viewed from the perspective of models, every ordered field (F, +, ·, 0, 1, <) closed under the square-root operation (Archimedean or non-Archimedean) allows reconstructing Euclid's plane geometry (Books I-IV of the Elements) on the Cartesian plane F × F. Following these findings, Cantor's and Dedekind's comments on the relationships between real numbers and the straight line lost their relevance altogether. Yet the received scholarship in the history and philosophy of mathematics is that [4] and [6] started a process of arithmetisation of calculus, and some scholars even coined the term "Cantor-Dedekind axiom" of continuity, despite the fact that Cauchy- and Dedekind-completeness are not equivalent. Our reading of [6] goes beyond these clichés.

Considered from the perspective of contribution to mathematical practice, [9] introduced ε-δ and sequential continuity of functions, and a way of defining objects through a diagonal of an infinite matrix (known as the Cantor diagonal argument); [4] the concept of a derived set, P′; [6] the idea of a totally ordered set, cuts and least upper bounds as means of defining objects, as well as the topological concept of continuous functions. Due to the categoricity of the axioms, real numbers are currently defined as a complete ordered field. [10] and [11] introduced that perspective; consequently [9], [4], [6] are presented as bringing in the so-called constructions of real numbers, or models for the axioms of real numbers. Yet, throughout the first decades of the 20th century, other perspectives were developed, such as real-closed fields [1] and topological fields [12]. The latter sets up a kind of categoricity theorem combined with an axiomatic account of the concept of the limit of a sequence. [6] includes concepts relating to the topological-field perspective, while an ordered-field perspective is but implicit and includes no mention of the compatibility of order with sums and products.

Taking on an ontological perspective, we assume an interpretation of the construction of real numbers in terms of a formal theory of objects and adopt Roman Ingarden's ontology developed in Controversy over the Existence of the World and The Literary Work of Art. It examines ideal, real, purely intentional (subjective), and secondary intentional (inter-subjective) objects. A literary character exemplifies the latter class, and we employ it in interpreting Dedekind's construction of real numbers. [2] expounds the characteristics of intentional objects in technical terms of formal ontology. In our talk, adopting informal terminology, we will focus on the two seemingly most controversial traits of these objects: derivation, implying there were no real numbers before 1872, and schematism, i.e. that an intentional object is incomplete (in an ontological rather than mathematical sense). Regarding the first, we discuss the claim that decimal and continued-fraction representations of real numbers had been well known in the 18th and 19th centuries. As for the second, we discuss, based on [5], how the theory of infinitesimals (as developed in nonstandard analysis) is related to the Continuum Hypothesis and the 1872 theories of real numbers.

References:

[1] Artin, E., Schreier, O. 1926. Algebraische Konstruktion reeller Körper. Abhandlungen aus dem Mathematischen Seminar der Hamburgischen Universität 5, 1926, 85-99.

[2] Błaszczyk, P. 2005. On the mode of existence of real numbers. Analecta Husserliana 88, 2005, 137-155; https://www.researchgate.net/publication/336414123_On_the_Mode_of_Existence_of_Real_Numbers

[3] Błaszczyk, P. 2021. Galileo's paradox and numerosities. Zagadnienia Filozoficzne w Nauce 70, 73-107.

[4] Cantor, G. 1872. Über die Ausdehnung eines Satzes aus der Theorie der trigonometrischen Reihen. Mathematische Annalen 5, 1872, 123-132.

[5] Cutland, N., Kessler, Ch., Kopp, E., Ross, D. 1988. On Cauchy's Notion of Infinitesimal. The British Journal for the Philosophy of Science 39, 1988, 375-378.

[6] Dedekind, R. 1872. Stetigkeit und irrationale Zahlen. Friedrich Vieweg & Sohn, Braunschweig 1872.

[7] Dedekind, R. 2005. Continuity and irrational numbers. In Ewald, W., From Kant to Hilbert. Vol. II: A Source Book in the Foundations of Mathematics. Oxford University Press, Oxford 2005, 766-779.

[8] Dedekind, R. 2017. Ciągłość i liczby niewymierne. Translated by J. Pogonowski. Annales Universitatis Paedagogicae Cracoviensis. Studia ad Didacticam Mathematicae Pertinentia 9, 2017, 170-183.

[9] Heine, E. 1872. Elemente der Functionenlehre. Journal für die reine und angewandte Mathematik 74, 1872, 172-188.

[10] Hilbert, D. 1900. Über den Zahlbegriff. Jahresbericht der Deutschen Mathematiker-Vereinigung 8, 1900, 180-184.

[11] Huntington, E.V. 1902. Theory of Absolute Continuous Magnitude. Transactions of the American Mathematical Society 3(2), 1902, 246-279.

[12] Pontriagin, L. 1932. Über stetige algebraische Körper. The Annals of Mathematics 33(1), 1932, 163-174.

17:15
Perzanowski's Combination of Onto-logic and Topology

ABSTRACT. Perzanowski's combination ontologic is the ontology of elements and their combinations. Perzanowski concluded that combinations are defined by the elements' internal features, i.e. whether one element is connected with another depends on the elements' insides. We present the view that the structure of a combination depends on a topological a priori. We suggest enriching the structure of the ontological universe by topologising it. For this purpose, we use the Hilbert cube to consider the basic concepts of Perzanowski's ontologic.
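For reference, the standard definition of the structure mentioned above (not the talk's specific construction): the Hilbert cube is the countable product $[0,1]^{\mathbb{N}}$ with the product topology; it is compact and metrizable, e.g. by $d(x, y) = \sum_{n \ge 1} 2^{-n} |x_n - y_n|$.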

17:45
What is a Topological Ontology?

ABSTRACT. Topological ontology is a philosophical and ontological discipline that has been developing for several decades. This relatively young form of philosophical research takes up problems, concepts and theorems of classical ontology/metaphysics, taking into account not only formal tools and methods but in particular topological concepts and theorems. In the conference presentation the following issues and questions will be presented:

1) the basic fields of study of formal ontology: the real world (individuals, states of affairs, relations between individuals, systems of things, such as a family, a university, a sports club, an army); the world of ideas (in the sense of Plato, Whitehead, Ingarden and others); language, as the medium in which we express ontological theories; the sphere of concepts, as the intelligible plane in which we express in an intensional and conceptual way the sense of our ontological theories;

2) basic concepts of topological ontology: mathematical concepts, e.g. topological space, base of a space, subspace, weaker and stronger space, homeomorphism; ontological concepts, e.g. state of affairs, possible world, monad, ontological atom, system;

3) some theorems of topological ontology of philosophical, methodological and ontological importance;

4) selected detailed problems of topological ontology: e.g. the problem of negative states of affairs in the topological interpretation of Russell's and Wittgenstein's ontology of logical atomism, the question of independence and compatibility of states of affairs (situations), and the possibility of constructing modal logics on lattice and topological structures (these structures allow interpreting the main theses of the ontology of logical atomism presented by Wittgenstein in his Tractatus Logico-Philosophicus).

Bibliography:

Kaczmarek J., [2019], Ontology in Tractatus Logico-Philosophicus. A Topological Approach, [in:] G. M. Mras, P. Weingartner, B. Ritter (Eds.), Philosophy of Logic and Mathematics. Proceedings of the 41st International Ludwig Wittgenstein Symposium, De Gruyter, pp. 246-262

Kaczmarek J., [2022], The Four-Category Ontology Modulo Topological Ontology, [in:] E. J. Lowe and Ontology, ed. Mirosław Szatkowski, Routledge, New York, pp. 143-164

16:00-19:00 Session 20B: Tutorial: Graded Logic

16:00
Graded Logic, Part I: Humancentric approach to logic

ABSTRACT. Graded logic (GL) is the logic of human reasoning with graded percepts. Human percepts of truth, importance, satisfaction, suitability, preference, and many others are all graded, i.e., they are a matter of degree. Without loss of generality, all degrees are normalized in the interval [0,1], where 0 denotes the lowest and 1 the highest degree of intensity of a given percept. Truth is the most important graded percept because every graded percept can be described as the degree of truth of the statement asserting that the percept has the highest intensity. For example, the degree of truth of the statement "the object A completely satisfies all our requirements" is equivalent to the degree of satisfaction (or suitability) of A. Obviously, truth is not an anonymous real number, but a percept that has both an intensity and a clear semantic identity: a role, meaning, and importance for a specific stakeholder. The stakeholder is defined as an individual or an organization who specifies an assertion and then needs to know its degree of truth. Usually, that is done in the process of decision-making, where the stakeholder evaluates alternatives and selects the most suitable one. GL is fully humancentric: its main goal is to be consistent with observable properties of human reasoning. That is the motivation for the development of GL.

GL is based on the concept that both simultaneity (conjunction) and substitutability (disjunction) are a matter of degree. This concept was introduced in 1973. The conjunction degree (later renamed andness) is an indicator of the similarity between a logic aggregator and the full (or pure) conjunction (the minimum function). The disjunction degree (later renamed orness) is an indicator of the similarity between a logic aggregator and the full (pure) disjunction (the maximum function). Andness and orness are adjustable and complementary indicators, observable and measurable in human reasoning. The highest orness (1) corresponds to the lowest andness (0), and the highest andness (1) corresponds to the lowest orness (0). Andness 1 denotes the full conjunction, and orness 1 denotes the full disjunction. Between these extreme points we have the graded (or partial) conjunction and the graded (or partial) disjunction, functions with adjustable degrees of andness/orness. Therefore, models of the graded conjunction/disjunction must provide a continuous transition from the full conjunction to the full disjunction.
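To make the andness/orness indicators concrete, here is a minimal Monte Carlo sketch in Python of the global, mean-value based definition commonly used in this literature; the function names are illustrative and the tutorial's exact definitions may differ.

import random

def global_andness(aggregator, n=2, samples=200_000, seed=0):
    """Monte Carlo estimate of the global andness of an n-ary aggregator on [0,1]^n.

    Assumed operational definition (one common formulation):
        andness = (E[max] - E[aggregator]) / (E[max] - E[min]),
        orness  = 1 - andness,
    with expectations over uniformly random inputs in [0,1]^n.
    """
    rng = random.Random(seed)
    sum_a = sum_min = sum_max = 0.0
    for _ in range(samples):
        x = [rng.random() for _ in range(n)]
        sum_a += aggregator(x)
        sum_min += min(x)
        sum_max += max(x)
    return (sum_max - sum_a) / (sum_max - sum_min)

arithmetic = lambda x: sum(x) / len(x)
print(round(global_andness(min), 3))         # ~1.0  full conjunction
print(round(global_andness(arithmetic), 3))  # ~0.5  logic neutrality
print(round(global_andness(max), 3))         # ~0.0  full disjunction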

Both the inputs and the output of the graded conjunction function and the graded disjunction function are the degrees of truth. The graded conjunction is a basic GL function where andness is greater than orness. It is the model of simultaneity, and consequently the output of this function is primarily affected by the low values of arguments. The graded disjunction is a basic GL function where orness is greater than andness. It is the model of substitutability, and consequently the output of this function is primarily affected by the high values of arguments. Human logic reasoning provably combines simultaneity and substitutability.

All means are functions that return values between the minimum and the maximum of their arguments. Thus, means can be interpreted as logic functions. The centroid of all means, where andness equals orness (i.e., both have the value 1/2), is the traditional arithmetic mean. The arithmetic mean is interpreted as the logic neutrality because it simultaneously has 50% of conjunctive properties and 50% of disjunctive properties. Parameterized means, such as power means and exponential means, provide the desirable continuous transition from the full conjunction to the full disjunction. In addition, when used as logic functions, means provide another necessary property: means can be weighted and support the degree of importance of input arguments. In human reasoning, each truth has a meaning and consequently also a specific degree of importance for its stakeholder. It is not an exaggeration to claim that all logic models in which truth does not come with its importance degree must be rejected as realistic models of human reasoning. Many means, such as weighted power means, use weights to express the impact/importance of individual arguments. So such means are the primary candidates for serving as graded logic aggregators.
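A minimal, self-contained sketch (Python; the names and parameter values are illustrative, not the tutorial's) of a weighted power mean whose exponent r moves it continuously between conjunctive and disjunctive behavior:

import math

def weighted_power_mean(x, w, r):
    """Weighted power mean M_r(x; w) = (sum_i w_i * x_i**r)**(1/r), x_i in [0,1],
    weights w_i > 0 summing to 1.  Limit cases: r -> -inf gives min (full conjunction),
    r -> 0 the weighted geometric mean, r = 1 the weighted arithmetic mean,
    r -> +inf gives max (full disjunction)."""
    if abs(r) < 1e-12:                      # limit r -> 0: weighted geometric mean
        if any(xi == 0.0 for xi in x):
            return 0.0
        return math.exp(sum(wi * math.log(xi) for xi, wi in zip(x, w)))
    if r < 0 and any(xi == 0.0 for xi in x):
        return 0.0                          # limit value when some input is 0
    return sum(wi * xi**r for xi, wi in zip(x, w)) ** (1.0 / r)

# Lower r => more conjunctive (higher andness); higher r => more disjunctive.
x, w = [0.9, 0.4], [0.5, 0.5]
for r in (-10.0, -1.0, 0.0, 1.0, 2.0, 10.0):
    print(r, round(weighted_power_mean(x, w, r), 3))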

The use of annihilators is a fundamental observable property of the graded conjunction and the graded disjunction used in human reasoning. If the graded conjunction supports the annihilator 0, then such an operator is called hard; if the annihilator 0 is not supported, the operator is called soft. If the graded disjunction supports the annihilator 1, then such an operator is called hard; if the annihilator 1 is not supported, the operator is called soft. The hard graded conjunction can be verbalized as the "must have" condition, where each input is considered mandatory, so that its nonsatisfaction unconditionally yields the output 0. The hard graded disjunction can be verbalized as the "enough to have" condition, where each input is considered sufficient to fully satisfy the stakeholder's requirements: its value 1 unconditionally yields the output 1. Both the graded conjunction and the graded disjunction can be soft, verbalized as the "nice to have" condition, where the properties of aggregation are conjunctive or disjunctive but the annihilators are not supported. Both hard and soft logic aggregators are provably present and used in human reasoning.
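The hard/soft distinction can be checked numerically. Below is a small self-contained illustration with two simple means used as stand-ins (not the tutorial's specific aggregators): the geometric mean (a power mean with exponent 0) is a hard partial conjunction, while the power mean with exponent 1/2 is conjunctive but soft.

def geometric_mean(x):       # hard partial conjunction: annihilator 0 supported
    p = 1.0
    for xi in x:
        p *= xi
    return p ** (1.0 / len(x))

def power_mean_half(x):      # power mean with exponent 1/2: conjunctive but soft
    return (sum(xi ** 0.5 for xi in x) / len(x)) ** 2

print(geometric_mean([0.0, 0.9, 0.8]))    # 0.0   "must have": one zero annihilates
print(power_mean_half([0.0, 0.9, 0.8]))   # ~0.38 "nice to have": zero only penalizes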

The basic Graded Conjunction/Disjunction (basic GCD) is the fundamental GL aggregation function that includes and integrates 7 types of logic aggregation: full conjunction, hard graded conjunction, soft graded conjunction, neutrality (the weighted arithmetic mean), soft graded disjunction, hard graded disjunction, and full disjunction. The basic GCD and negation are sufficient to create all other compound GL functions, such as graded absorption, implication, abjunction, equivalence, exclusive disjunction, and others. At the vertices of the unit hypercube, GL and classical Boolean logic are identical. Therefore, GL is a seamless generalization of classical Boolean logic inside the whole unit hypercube.

Conjunctive properties can be stronger than the full conjunction, and disjunctive properties can be stronger than the full disjunction. The extreme conjunctive function is called the drastic conjunction (the function where the output is 1 only if all inputs are 1, and in all other cases the output is 0). The extreme disjunctive function is called the drastic disjunction (the function where the output is 0 only if all inputs are 0, and in all other cases the output is 1). To cover the full spectrum of logic aggregators in GL we also use the extended GCD, which is a function that provides the andness-directed continuous transition from the drastic conjunction to the drastic disjunction. The extended GCD includes the basic GCD in the segment between the pure conjunction and the pure disjunction. All forms of GCD satisfy De Morgan duality.
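To make the De Morgan duality mentioned above explicit, in generic notation (C a graded conjunction with andness $\alpha$, D its dual graded disjunction with orness $\omega$; the symbols are illustrative, not the tutorial's): $D(x_1,\dots,x_n) = 1 - C(1 - x_1,\dots,1 - x_n)$, and duality exchanges the two indicators, so that $\omega_D = \alpha_C$ and $\alpha_D = \omega_C$.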

The extended GCD in the range of high andness between the full conjunction and the drastic conjunction is called hyperconjunction. Similarly, hyperdisjunction is the extended GCD in the high-orness range between the full disjunction and the drastic disjunction. Hyperconjunction covers the area of t-norms and hyperdisjunction covers the area of t-conorms. This includes the product t-norm and t-conorm, which model the probability of independent events. In this way GL includes models of probabilistic reasoning, building useful bridges between logic and probability theory.
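As a worked instance of the probabilistic bridge mentioned above, the product t-norm and its De Morgan dual t-conorm (the probabilistic sum) are $T_P(x,y) = xy$ and $S_P(x,y) = x + y - xy = 1 - (1-x)(1-y)$, which coincide with $P(A \cap B) = P(A)\,P(B)$ and $P(A \cup B) = P(A) + P(B) - P(A)\,P(B)$ for independent events $A$ and $B$.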

One of the central results of GL is the graded logic conjecture, which claims that 10 basic types of logic functions, consisting of the 7 function types of the basic GCD, hyperconjunction, hyperdisjunction, and negation, are necessary and sufficient to adequately model the mental processes by which human beings aggregate subjective categories, i.e., degrees of truth corresponding to various graded percepts. After selecting an appropriate type of GCD aggregator, humans regularly perform fine-tuning of the aggregator by additionally adjusting both the desired importance and the desired andness/orness. The extended GCD and negation are necessary and sufficient components for building a graded propositional calculus that can process graded/partial truth in a way consistent with observable properties of human reasoning.

GL is highly applicable and supported by a collection of professional software tools. In industrial settings, GL is the fundamental component of decision engineering, an area of professional decision-making based on complex logic criteria that can use hundreds of inputs. Published applications of GL are in the areas of ecology, evaluation, optimization, and selection of computer hardware and software, medicine (evaluation of disease severity and patient disability), public health (evaluation of priority for vaccination), geography and space management (suitability maps), agriculture, urban planning, evaluation of data management systems, web browsers, search engines, windowed environments, websites, e-commerce sites, homes (in online real estate), groundwater contamination, cybersecurity, and others. A detailed presentation of GL, a presentation of decision engineering applications of GL (evaluation, optimization, comparison, and selection of complex alternatives), as well as a survey of the GL-based LSP decision method and the corresponding literature can be found in the monograph by J. Dujmović [1].

References:

[1] Jozo Dujmović (2018): Soft Computing Evaluation Logic: The LSP Decision Method and Its Applications. John Wiley & Sons, Inc., doi:10.1002/9781119256489.

17:00
Graded Logic, Part II: Graded Conjunction/Disjunction

ABSTRACT. This tutorial part uses the same abstract and reference as Graded Logic, Part I: Humancentric approach to logic (16:00) above; see that entry.

18:00
Graded Logic, Part III: Graded propositional calculus

ABSTRACT. This tutorial part uses the same abstract and reference as Graded Logic, Part I: Humancentric approach to logic (16:00) above; see that entry.