
10:00-12:00 Session 1: Pre-conference Roundtable: Science Policies, Research Funding, and the Future of the University

The current system of rewards for academic work ties funding allocation to metric performance, which in turn depends on the capacity to attract funding. This reinforces multiple Matthew effects and widens gaps across universities and scholars: geographically (deepening the North-South and West-East divides), institutionally (expanding the distance between highly ranked universities and those lower down the ladder), and across disciplines (favouring applied fields of research over theoretical ones, and the hard sciences over “soft” sciences and the humanities). The epistemic dimensions of this state of affairs remain a relatively underexplored area of research. This roundtable presents current debates around science policies, research funding allocation, and the future of the university from diverse epistemological perspectives, not only with a view to improving fairness towards the underprivileged and the efficiency of the system, but also in order to foster a new role for science and research institutions in society.

Speakers: Sabina Leonelli (Exeter), Richard Pettigrew (Bristol), Barbara Osimani (Marche Polytechnic University), Marco Ottaviani (Milan), Andrea Saltelli (Barcelona).

Discussants: Angela Liberatore (ERC), Jean-Pierre Bourguignon (ERC), Ferruccio Resta (CRUI).

14:30-15:45 Session 4: Keynote: Christian List

Abstract: In this talk, I will present a case for scientific realism about free will. I will begin by summarizing some of the main scientifically motivated challenges for free will and will then respond to them by presenting a naturalistic indispensability argument for free will. The argument supports the reality of free will as an emergent higher-level phenomenon. I will also explain why the resulting picture of free will does not conflict with the possibility that the fundamental laws of nature are deterministic.


16:00-18:00 Session 5A: S: Understanding cancer: How can philosophy and biology contribute together?
Understanding cancer: How can philosophy and biology contribute together? (in person)
PRESENTER: Lucie Laplane

ABSTRACT. The biological complexity and heterogeneity of cancer make it very difficult to apprehend, control, and cure. The challenges associated with cancer, however, are not just clinical or biological; they are also conceptual and philosophical. Philosophy of cancer biology is a small but growing field of research, and this symposium will expand its scope by addressing new questions that are both central to today’s understanding of cancer and likely to benefit from close collaboration between biologists and philosophers: Is cancer a breakdown of multicellularity? Which clones drive clonal evolution? How can the immune system favor cancer? What is the role of the microbiota in cancer? In addition to contributing to conceptual clarification, this symposium will discuss philosophy of science in practice through a reflection on how philosophers, biologists, and oncologists can work together to better understand and treat cancer.

16:00-18:00 Session 5B: Philosophy of physical sciences
Bell’s Assumptions and the Structure of Quantum Mechanics (in person)
PRESENTER: Carl Hoefer

ABSTRACT. A common view is that Bell assumed “classicality” or “classical realism”, which goes against the fundamental tenets of QM. On this view, the violation of Bell’s inequalities does not imply non-locality of QM or of nature; it simply reinforces the fact that QM is not classical. We examine two recent variants of this thesis (Werner 2014; Griffiths 2020) and their associated versions of QM: operational QM and the consistent histories approach. We explore how they evade Bell’s theorem. We show that Werner’s notion of classicality is equivalent to probabilistic conditions formulated by Pitowsky and Fine in the 1980s. However, classicality thus construed is in fact a consequence of standard causal-statistical assumptions. In evading the derivation of Bell’s inequalities, each of the theories in question violates one of these standard assumptions: in operational QM the Common Cause Principle doesn’t hold; the histories formulation of QM is conspiratorial.
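For orientation, the best-known of Bell’s inequalities is the CHSH form; the statement below is a standard textbook formulation added by the editor, not the speakers’ own notation. For two measurement settings $a, a'$ on one wing of the experiment and $b, b'$ on the other, local causal-statistical models constrain the correlation functions $E$ as follows:

```latex
% CHSH inequality: satisfied by any local causal-statistical model
\[
  \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \le 2 ,
\]
% whereas quantum mechanics predicts a maximal value of
\[
  2\sqrt{2} \quad \text{(the Tsirelson bound)}
\]
% for suitably chosen measurement settings on an entangled pair.
```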

Relational Quantum Mechanics and the PBR Theorem: A Peaceful Coexistence (in person)

ABSTRACT. According to the principles of Relational Quantum Mechanics (RQM) the wave function is considered neither a concrete physical item evolving in spacetime, nor an object representing the absolute state of a certain quantum system. In this context, it is defined as a computational device encoding observers’ information; hence, RQM offers an epistemic view of the wave function. This perspective seems to be at odds with the PBR theorem, a result excluding that wave functions represent knowledge of an underlying reality described by some ontic state. This talk aims to argue that RQM is not affected by the conclusions of PBR’s argument; consequently, this alleged inconsistency can be dissolved. To achieve this result, I will take into account the foundations of the PBR theorem, i.e. Harrigan and Spekkens’ categorization of ontological models, showing that their implicit assumptions made about the nature of the ontic state are incompatible with the postulates of RQM.

Scope and Limits of Stochastic Quantum Mechanics in the Block Universe (in person)

ABSTRACT. This presentation aims to characterise recent alternative approaches to quantum mechanics in the block universe. I show that, contrary to what has been widely thought, these theories need not be either linear (and unitary) or time-symmetric. Regarding the latter characteristic, I take into consideration a recent theorem by Leifer and Pusey, which proposes that, under certain assumptions, time-symmetric theories require retrocausality. I base my analysis on two stochastic/non-linear proposals: (i) time-symmetric models of collapse theories, and (ii) Kent’s relativistic quantum theory. These models allow me to claim that linearity (and unitarity), time-symmetry, and retrocausality are not indispensable characteristics of a quantum theory compatible with the block universe perspective. To conclude, I discuss the interpretative as well as ontological scope and limits of these alternatives.

Managing Uncertainty in Radiometric Dating (online)

ABSTRACT. This talk will use the methodologically rich case of radiometric measurements to ground four philosophical insights about uncertainty in experimental contexts. First, by tracing the history of revisions to radiometric dates and their uncertainty estimates, I will highlight the way in which uncertainty estimates are fallible knowledge claims that depend on background assumptions that can be revised over time. Second, I call attention to important sources of uncertainty in radiometric dating that are not included in uncertainty estimates, because it is not known how to quantify them, even though they have a substantial influence on both the precision and accuracy of radiometric dates. Third, I discuss the different sorts of methodological strategies that have to be deployed to reduce different sources of uncertainty in radiometric dates. And fourth, I offer another approach to managing uncertainty in radiometric dating that I call the adequacy-for-purpose view of uncertainty estimates.

16:00-18:00 Session 5C: Philosophy of life sciences
Biodiversity vs. Paleodiversity Measurements: the Incommensurability Problem (online)

ABSTRACT. In this paper, I compare measurements of biodiversity to measurements of paleodiversity. My intent is to understand whether commonly used inferences from paleodiversity measurements to biodiversity estimates are epistemically well-motivated. I claim that justifying such comparative evaluations (e.g. using paleodiversity data to show we are currently facing a biodiversity crisis) is harder than it appears. Paleodiversity measurements are incommensurable with contemporary measures of biodiversity, given the different ways that biodiversity is conceptualized, and quantified accordingly. Specifically, unlike current biodiversity measures, paleoestimates rely heavily on an understanding of biodiversity as species counts. But the understanding of current biodiversity is not reducible to species inventories. I call this mismatch the “incommensurability problem”. I conclude by proposing two possible ways of overcoming this incommensurability problem.

Cultural maladaptation and the inverse correlation hypothesis (in person)

ABSTRACT. Three hypotheses about the persistence of maladaptive cultural traits in a population—e.g., celibacy—have been proposed in cultural evolutionary theory: the (1) cultural inertia, (2) cultural attractors, and (3) cultural transmission hypotheses. The goal of this research is two-fold. First, to provide a much-needed philosophical analysis of the concept and explanations of cultural maladaptation. I argue that there is no consensus about the definition and that the hypotheses on offer have limitations. Second, to suggest an additional hypothesis that stems from various assumptions of cultural evolutionary theory: the inverse correlation hypothesis (ICH). This hypothesis proposes that cultural traits that minimize reproduction thereby free up resources and time that an individual uses instead to propagate the cultural trait in question. I articulate the ICH and argue that, although it is a promising hypothesis, it faces important challenges with regard to empirical evidence.

Intrinsic Biological Essentialism: Devitt’s New Argument (in person)

ABSTRACT. Michael Devitt has recently responded to criticism of his doctrine of intrinsic biological essentialism (IBE). I argue that parts of his response expose IBE to a dilemma: putative intrinsic essences may be saved from triviality at the expense of explanatoriness, or they may be explanatory but likely trivial. I also argue that his argument for IBE sanctions the postulation of intrinsic essences for non-taxonomic categories like “predator”. This threatens an unwanted proliferation of essences which cannot – unlike those of hierarchical taxa – be neatly related.

Conceptual and methodological issues concerning psychological essentialism in the context of folk biology (in person)

ABSTRACT. Psychological essentialism is the view according to which people have an innate tendency to see certain kinds as essence-based natural kinds. In the context of folk biology, this essentialism supposedly applies to biological kinds (especially at the folk taxonomic level of generic species). Some authors even claim that we have an innate folk biological model that involves certain biological concepts relevant for folk biological essentialism. I will give an overview of some conceptual and methodological issues concerning these claims, and the studies aimed at supporting them. I will claim that 1) psychological essentialism is not as inconsistent with evolutionary theory as has often been stated; 2) people do not have an innate tendency to recognize generic species as salient ‘essentializable’ kinds – this ability is much more learning-dependent than has often been suggested; 3) even if the folk biological module exists, it does not involve the specific biological concepts ascribed to it.

16:00-18:00 Session 5D: General philosophy of science
Why Experimental Balance is Still a Reason to Randomize (in person)
PRESENTER: Marco Martinez

ABSTRACT. Experimental balance is the control of the values of those conditions, other than the one under study, that are liable to affect the result of a test. We will discuss three different approaches to balance. Millean balance requires identifying and equalizing ex ante the value of these conditions in order to draw solid causal inferences. Fisherian balance measures ex post the influence of uncontrolled conditions through the analysis of variance. In efficiency balance, the value of the antecedent conditions is decided ex ante according to the efficiency they yield in the estimation of the treatment outcome. Against some old arguments by John Worrall, we will show that in both Fisherian and efficiency balance there are good reasons to randomize the allocation of treatments, in particular when there is no agreement among experimenters as to the antecedent conditions to be controlled for.

Expert Judgment in Climate Science (in person)
PRESENTER: Mason Majszak

ABSTRACT. The importance of modeling for climate projection and for understanding past and present climate is clear, and this feature has been widely evaluated by the philosophy of climate science community. However, an equally important feature has received little attention: the use of expert judgment within climate science. In this talk, our first aim is to provide a systematic examination of the various instances and uses of expert judgment in climate science and, from this descriptive approach, to highlight the contexts in which expert judgment is used. Our second aim is to discuss the essential features of expert judgment in climate science. This will serve as a conceptual elaboration which, we claim, must be carried out from both philosophy of science and social epistemology perspectives.

Realism, Antirealism, and Theoretical Conservatism (in person)
PRESENTER: Luca Tambolo

ABSTRACT. This paper investigates the question of whether a systematic connection obtains between one’s commitment to scientific realism or antirealism and one’s attitude towards the possibility of theory change affecting our best theories. Stanford (2015, 2019, 2020) has argued that realists will recommend that scientists be relatively conservative from the theoretical point of view, while antirealists will recommend a relatively more favorable attitude towards theory change. We counter that it is not allegiance to realism or antirealism as such that dictates one’s response to radical theoretical novelty: what matters most is, rather, the proposed alternative’s presumed ability to realize one’s favorite cognitive aim(s). As we argue, unless one embraces the implausible assumption that antirealist cognitive aim(s) are easier to achieve than realist aim(s), there is no reason to maintain that antirealists will look at theory change affecting our best theories more favorably than realists.

Can induction be justified on practical grounds? (online)

ABSTRACT. Recently, G. Schurz, D. Steel and F. Huber have argued for induction by showing it to be optimal or necessary and sufficient for a certain end. Yet, these arguments need not convince a skeptic who prioritizes the avoidance of error. A firmer argument for induction thus has to show that we should not prioritize the avoidance of error. This seems plausible, if only for practical reasons: Skeptics who do not base their decisions on induction will more often fail to get what they want. The aim of the paper is to discuss an argument for induction along these lines. As a formal tool, I use action games as considered by Schurz. But I cannot rely on his results on the optimality of metainduction, or so I argue. So I compare the inductivist and the skeptic from scratch. Although induction does pay off in certain worlds, there are possible worlds in which it does not. Consequently, the prospects of a practical justification of induction are dim.

16:00-18:00 Session 5E: Integrated history and philosophy of science
Historical and Contemporary Climate Model Intercomparisons: Lessons for Pluralism in Modeling (online)

ABSTRACT. I connect historical climate model intercomparisons to contemporary ones and highlight key epistemic and pragmatic issues facing scientists. This reveals a highly integrated and collaborative practice and implies a rich methodological pluralism (MP), but not an ontic competitive one. Model intercomparisons help correct systematic model biases, they inform model development and diagnostic subprojects, and they highlight the distributed epistemic labor characteristic of climate modeling. Thus, the MP view I develop is richer and more accurate than several highly influential views in the philosophy of climate modeling (e.g., Parker 2006, Katzav 2014). I then argue that ontic competitive pluralism—the idea that different models are incompatible representations and that scientists are competing to build the “best” model—is ontic monism in disguise. This becomes clear by seeing that so-called “competing” climate models share a model-type (Lloyd 2015) and from current debates about model weighting.

Paths that did not cross: why did philosophy of science have no impact on science policy in the twentieth century? (in person)

ABSTRACT. Philosophy of science and science policy, understood as the management and organization of science, seem to overlap on several topics, such as research evaluation and the proper organization of the scientific enterprise. So far, however, the influence of philosophy of science on science policy studies and practices has been rather scarce. Why has philosophy of science never reached science policy, despite its potential? This paper aims at answering this question by examining two philosophical programmes with clear science policy relevance, namely Lakatos’ MSRP and Kitcher’s social epistemology, and by contrasting them with scientometrics, a field that was successfully integrated into science policy. Our analysis shows that both institutional and intellectual factors contributed to the isolation of philosophy of science from science policy. Based on these results, we suggest some strategies to build a fruitful collaboration between philosophers of science and science policy makers.

Loops, Topologies and Genidentity: Reichenbach’s Direction of Time meets Feynman’s Diagrams (in person)

ABSTRACT. Reichenbach’s "The Direction of Time" ends with a surprising twist. Prompted by Feynman’s formulation of quantum electrodynamics, he abandoned the delicate balancing between the macroscopic foundation of time and microscopic descriptions of order advocated in the previous chapters. Why did Reichenbach react so strongly? The early understanding of Feynman diagrams was influenced by their visual proximity to bubble chamber pictures and Minkowski diagrams. Both shaped how Reichenbach interpreted an electron-positron pair, taking the positron to be an electron actually running backwards in time. This allegedly allowed a closed causal curve and thwarted microscopic order. While Reichenbach believed that functional genidentity accordingly failed, I argue that the concept can be adapted to modern quantum field theory if one allows for local variations (processes of higher order), as long as they are governed by symmetry rules and as long as energy conservation holds for the particles actually observed.

Wherein is the concept of disease normative? From weak normativity to value-conscious naturalism (online)

ABSTRACT. In this paper we focus on some new normativist positions and compare them with traditional ones. In so doing, we claim that if normative judgements are involved in determining whether a condition is a disease only in the sense identified by the new normativisms, then disease is normative only in a weak sense, which must be distinguished from the strong sense advocated by traditional normativisms. Specifically, we shall argue that weak and strong normativity are different to the point that a single normativist label ceases to be appropriate for the whole range of positions. The central claim of our discussion is that weak normativism is compatible with, and is a possible complement to, naturalism about disease: if values and norms are not components of the concept of disease, but only intervene in other explanatory roles, then the concept of disease is no more value-laden than many other scientific concepts, or even than any other scientific concept.

16:00-18:00 Session 5F: Formal philosophy of science
The Pragmatic Value of Uncertain Evidence (in person)

ABSTRACT. We explore the question of whether cost-free uncertain evidence is worth waiting for in advance of making a decision. A classical result in Bayesian decision theory, known as the value of evidence theorem, says that, under certain conditions, when you update your credences by conditionalizing on some cost-free and certain evidence, the subjective expected utility of obtaining this evidence is never less than the subjective expected utility of not obtaining it. We extend this result to a type of update method, a variant of Judea Pearl's virtual conditionalization, where uncertain evidence is represented as a set of likelihood ratios. Moreover, we argue that focusing on this method rather than on the widely accepted Jeffrey conditionalization enables us to show that, under a fairly plausible assumption, gathering uncertain evidence not only maximizes expected pragmatic utility, but also minimizes expected epistemic disutility (inaccuracy).
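The classical value of evidence theorem mentioned here is standardly attributed to I. J. Good (1967). In one common formulation (the notation below is illustrative, not the authors’ own), for acts $a$, states $s$, and possible evidence outcomes $e$:

```latex
% Expected utility of deciding after receiving cost-free, certain
% evidence is at least that of deciding without it:
\[
  \sum_{e} P(e)\, \max_{a} \sum_{s} P(s \mid e)\, U(a, s)
  \;\ge\;
  \max_{a} \sum_{s} P(s)\, U(a, s),
\]
% with equality just in case the same act is optimal under every
% possible evidence outcome, i.e. the evidence could not change
% the decision.
```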

How to believe long conjunctions of beliefs: probability, quasi-dogmatism and contextualism (in person)
PRESENTER: Stefano Bonzio

ABSTRACT. According to the so-called Lockean thesis, a rational agent believes a proposition just in case its probability is sufficiently high, i.e., greater than some suitably fixed threshold. The Preface paradox is usually taken to show that the Lockean thesis is untenable, if one also assumes that rational agents should believe the conjunction of their own beliefs: high probability and rational belief are in a sense incompatible. In this contribution, we show that this is not the case in general. More precisely, we consider two methods of computing how probable each of a series of propositions must be in order for one to rationally believe their conjunction under the Lockean thesis. The price one has to pay for the proposed solutions to the paradox is what we call "quasi-dogmatism": the view that a rational agent should believe only those propositions which are "nearly certain" in a suitably defined sense.
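As a back-of-the-envelope illustration of the issue (standard probability, not the authors’ specific methods): if the Lockean threshold is $t$ and each of $n$ propositions $A_1, \ldots, A_n$ has probability at least $1 - (1-t)/n$, the union bound guarantees that their conjunction clears the threshold:

```latex
% Union bound: the conjunction fails only if some conjunct fails.
\[
  P\Big( \bigwedge_{i=1}^{n} A_i \Big)
  \;\ge\; 1 - \sum_{i=1}^{n} P(\neg A_i)
  \;\ge\; 1 - n \cdot \frac{1 - t}{n}
  \;=\; t .
\]
% For large n this pushes each conjunct's required probability
% towards 1 -- the "quasi-dogmatism" the abstract names.
```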

Causal Attribution and Partial Liability: A Probabilistic Model (in person)

ABSTRACT. This paper presents two causal-probabilistic models for calculating partial liability in tort law, and explores their scopes and limits. The aim of the work is, ultimately, a unification of principles of causal attribution in legal doctrine and in the sciences, together with a rigorous justification of the concrete models.

Epistemically modest methodological triangulation (in person)

ABSTRACT. Should a scientist rely on methodological triangulation? Heesen et al. (2019) recently provided a convincing affirmative answer. However, their approach requires belief gambles if the evidence is discordant. We instead propose epistemically modest triangulation (EMT), according to which one should withhold judgement in such cases. We show that for a scientist in a methodologically diffident situation the expected utility of EMT is greater than that of Heesen et al.’s (2019) triangulation or that of using a single method. We also show that EMT is more appropriate for increasing epistemic trust in science. In short: triangulate, but do not gamble with evidence.

18:00-19:30 Session 6: Poster session

The programme is available here. The poster session will start with 3-minute oral presentations from all poster presenters. Breakout rooms will then be enabled for discussion with online poster presenters. Meanwhile, in-person poster presenters will be able to have discussions in front of their posters.