The current system of rewards for academic work ties funding allocation to metric performance, which in turn depends on the capacity to attract funding. This reinforces multiple Matthew effects and widens gaps across universities and scholars: geographically (deepening the North-South and West-East divides), institutionally (expanding the distance between highly ranked universities and those lower down the ladder), and disciplinarily (favouring applied fields of research over theoretical ones, and the hard sciences over the “soft” sciences and humanities). The epistemic dimensions of this state of affairs remain a relatively underexplored area of research. This roundtable presents current debates around science policy, research funding allocation and the future of the university from diverse epistemological perspectives, with a view not only to improving fairness towards the underprivileged and the efficiency of the system, but also to fostering a new role for science and research institutions in society.
Speakers: Sabina Leonelli (Exeter), Richard Pettigrew (Bristol), Barbara Osimani (Marche Polytechnic University), Marco Ottaviani (Milan), Andrea Saltelli (Barcelona).
Discussants: Angela Liberatore (ERC), Jean-Pierre Bourguignon (ERC), Ferruccio Resta (CRUI).
https://lse.zoom.us/j/88024601292; Meeting ID: 880 2460 1292
Please sign up here.
Abstract: In this talk, I will present a case for scientific realism about free will. I will begin by summarizing some of the main scientifically motivated challenges for free will and will then respond to them by presenting a naturalistic indispensability argument for free will. The argument supports the reality of free will as an emergent higher-level phenomenon. I will also explain why the resulting picture of free will does not conflict with the possibility that the fundamental laws of nature are deterministic.
16:00 | Understanding cancer: How can philosophy and biology contribute together? (in person) PRESENTER: Lucie Laplane ABSTRACT. The biological complexity and heterogeneity of cancer make it very difficult to apprehend, control, and cure. The challenges associated with cancer, however, are not just clinical or biological; they are also conceptual and philosophical. Philosophy of cancer biology is a small but growing field of research, and this symposium will expand its scope by addressing new questions that are both central to today’s understanding of cancer and able to benefit from close collaboration between biologists and philosophers: Is cancer a breakdown of multicellularity? Which clones drive clonal evolution? How can the immune system favor cancer? What is the role of the microbiota in cancer? In addition to contributing to conceptual clarification, this symposium will discuss philosophy of science in practice through a reflection on how philosophers, biologists, and oncologists can work together to better understand and treat cancer.
16:00 | Bell’s Assumptions and the Structure of Quantum Mechanics (in person) PRESENTER: Carl Hoefer ABSTRACT. A common view is that Bell assumed “classicality” or “classical realism”, which goes against the fundamental tenets of QM. On this view, the violation of Bell’s inequalities does not imply non-locality of QM or of nature; it simply reinforces the fact that QM is not classical. We examine two recent variants of this thesis (Werner 2014; Griffiths 2020) and their associated versions of QM: operational QM and the consistent histories approach. We explore how they evade Bell’s theorem. We show that Werner’s notion of classicality is equivalent to probabilistic conditions formulated by Pitowsky and Fine in the 1980s. However, classicality thus construed is in fact a consequence of standard causal-statistical assumptions. In evading the derivation of Bell’s inequalities, each of the theories in question violates one of these standard assumptions: in operational QM the Common Cause Principle does not hold, and the histories formulation of QM is conspiratorial.
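For orientation (a standard formulation, not drawn from the talk itself): the CHSH form of Bell’s inequality, the version most directly connected to the Pitowsky–Fine probabilistic conditions mentioned above, bounds a combination of correlations E between measurement settings a, a′ and b, b′:

\[ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \]

for any locally causal model, whereas quantum mechanics allows values up to Tsirelson’s bound \( |S| \le 2\sqrt{2} \).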
16:30 | Relational Quantum Mechanics and the PBR Theorem: A Peaceful Coexistence (in person) ABSTRACT. According to the principles of Relational Quantum Mechanics (RQM), the wave function is neither a concrete physical item evolving in spacetime nor an object representing the absolute state of a certain quantum system. In this context, it is defined as a computational device encoding observers’ information; hence, RQM offers an epistemic view of the wave function. This perspective seems to be at odds with the PBR theorem, a result excluding the possibility that wave functions represent knowledge of an underlying reality described by some ontic state. This talk argues that RQM is not affected by the conclusions of the PBR argument; consequently, this alleged inconsistency can be dissolved. To achieve this result, I will examine the foundations of the PBR theorem, i.e. Harrigan and Spekkens’ categorization of ontological models, showing that the implicit assumptions made there about the nature of the ontic state are incompatible with the postulates of RQM.
17:00 | Scope and Limits of Stochastic Quantum Mechanics in the Block Universe (in person) ABSTRACT. This presentation aims to characterise recent alternative approaches to quantum mechanics in the block universe. I show that, contrary to what has been widely thought, these theories need not be either linear (and unitary) or time-symmetric. Regarding the latter characteristic, I take into consideration a recent theorem by Leifer and Pusey which proposes that, under certain assumptions, time-symmetric theories require retrocausality. I base my analysis on two stochastic/non-linear proposals: (i) time-symmetric models of collapse theories, and (ii) Kent’s relativistic quantum theory. These models allow me to claim that linearity (and unitarity), time-symmetry, and retrocausality are not indispensable characteristics of a quantum theory compatible with the block universe perspective. To conclude, I discuss the interpretative as well as ontological scope and limits of these alternatives.
17:30 | Managing Uncertainty in Radiometric Dating (online) ABSTRACT. This talk will use the methodologically rich case of radiometric measurements to ground four philosophical insights about uncertainty in experimental contexts. First, by tracing the history of revisions to radiometric dates and their uncertainty estimates, I will highlight the way in which uncertainty estimates are fallible knowledge claims that depend on background assumptions that can be revised over time. Second, I call attention to important sources of uncertainty in radiometric dating that are not included in uncertainty estimates, because it is not known how to quantify them, even though they have a substantial influence on both the precision and accuracy of radiometric dates. Third, I discuss the different sorts of methodological strategies that have to be deployed to reduce different sources of uncertainty in radiometric dates. And fourth, I offer another approach to managing uncertainty in radiometric dating that I call the adequacy-for-purpose view of uncertainty estimates.
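As background (standard material, not part of the abstract): the basic radiometric age equation relates the age t to the decay constant \( \lambda \) and the measured daughter-to-parent ratio D/P,

\[ t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{P}\right), \]

and reported uncertainty estimates typically propagate the quantified errors to first order,

\[ \sigma_t^2 \approx \left(\frac{\partial t}{\partial \lambda}\right)^{2}\sigma_\lambda^2 + \left(\frac{\partial t}{\partial (D/P)}\right)^{2}\sigma_{D/P}^2. \]

Such estimates capture only the quantified sources; the talk’s second insight concerns sources left out precisely because no one knows how to quantify them.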
16:00 | Why Experimental Balance is Still a Reason to Randomize (in person) PRESENTER: Marco Martinez ABSTRACT. Experimental balance is the control of those conditions, other than the one under study, that are liable to affect the result of a test. We will discuss three different approaches to balance. Millean balance requires identifying and equalizing ex ante the values of these conditions in order to conduct solid causal inferences. Fisherian balance measures ex post the influence of uncontrolled conditions through the analysis of variance. In efficiency balance, the values of the antecedent conditions are decided ex ante according to the efficiency they yield in the estimation of the treatment outcome. Against some old arguments by John Worrall, we will show that in both Fisherian and efficiency balance there are good reasons to randomize the allocation of treatments, in particular when there is no agreement among experimenters as to the antecedent conditions to be controlled for.
16:30 | Expert Judgment in Climate Science (in person) PRESENTER: Mason Majszak ABSTRACT. The importance of modeling for climate projection and for understanding past and present climate is clear, and this feature has been widely examined by the philosophy of climate science community. However, an equally important feature has received little attention: the use of expert judgment within climate science. In this talk, our first aim is to provide a systematic examination of the various instances and uses of expert judgment in climate science and, from this descriptive approach, to highlight the contexts in which expert judgment is used. Our second aim is to discuss the essential features of expert judgment in climate science. This will serve as a conceptual elaboration which, as we claim, must be carried out from the perspectives of both philosophy of science and social epistemology.
17:00 | Realism, Antirealism, and Theoretical Conservatism (in person) PRESENTER: Luca Tambolo ABSTRACT. This paper investigates whether a systematic connection obtains between one’s commitment to scientific realism or antirealism and one’s attitude towards the possibility of theory change affecting our best theories. Stanford (2015, 2019, 2020) has argued that realists will recommend that scientists be relatively conservative from the theoretical point of view, while antirealists will recommend a relatively more favorable attitude towards theory change. We counter that it is not allegiance to realism or antirealism as such that dictates one’s response to radical theoretical novelty: what matters most is, rather, the proposed alternative’s presumed ability to realize one’s favorite cognitive aim(s). As we argue, unless one embraces the implausible assumption that antirealist cognitive aims are easier to achieve than realist ones, there is no reason to maintain that antirealists will look at theory change affecting our best theories more favorably than realists.
17:30 | Can induction be justified on practical grounds? (online) ABSTRACT. Recently, G. Schurz, D. Steel and F. Huber have argued for induction by showing it to be optimal, or necessary and sufficient, for a certain end. Yet these arguments need not convince a skeptic who prioritizes the avoidance of error. A firmer argument for induction thus has to show that we should not prioritize the avoidance of error. This seems plausible, if only for practical reasons: skeptics who do not base their decisions on induction will more often fail to get what they want. The aim of the paper is to discuss an argument for induction along these lines. As a formal tool, I use action games as considered by Schurz. But I cannot rely on his results on the optimality of metainduction, or so I argue. So I compare the inductivist and the skeptic from scratch. Although induction does pay off in certain worlds, there are possible worlds in which it does not. Consequently, the prospects of a practical justification of induction are dim.
16:00 | Historical and Contemporary Climate Model Intercomparisons: Lessons for Pluralism in Modeling (online) ABSTRACT. I connect historical climate model intercomparisons to contemporary ones and highlight key epistemic and pragmatic issues facing scientists. This reveals a highly integrated and collaborative practice and implies a rich methodological pluralism (MP), but not an ontic competitive one. Model intercomparisons help correct systematic model biases, they inform model development and diagnostic subprojects, and they highlight the distributed epistemic labor characteristic of climate modeling. Thus, the MP view I develop is richer and more accurate than several highly influential views in the philosophy of climate modeling (e.g., Parker 2006, Katzav 2014). I then argue that ontic competitive pluralism—the idea that different models are incompatible representations and that scientists are competing to build the “best” model—is ontic monism in disguise. This becomes clear by seeing that so-called “competing” climate models share a model-type (Lloyd 2015) and from current debates about model weighting.
16:30 | Paths that did not cross. Why philosophy of science had no impact on science policy in the Twentieth century? (in person) PRESENTER: Eugenio Petrovich ABSTRACT. Philosophy of science and science policy, understood as the management and organization of science, seem to overlap on several topics, such as research evaluation and the proper organization of the scientific enterprise. So far, however, the influence of philosophy of science on science policy studies and practices has been rather scarce. Why has philosophy of science never reached science policy, despite its potential? This paper aims at answering this question by examining two philosophical programmes with clear science policy relevance, namely Lakatos’ MSRP and Kitcher’s social epistemology, and by contrasting them with scientometrics, a field that was successfully integrated into science policy. Our analysis shows that both institutional and intellectual factors contributed to the isolation of philosophy of science from science policy. Based on these results, we suggest some strategies to build a fruitful collaboration between philosophers of science and science policy makers.
17:00 | Loops, Topologies and Genidentity: Reichenbach’s Direction of Time meets Feynman’s Diagrams (in person) ABSTRACT. Reichenbach’s "The Direction of Time" ends with a surprising twist. Prompted by Feynman’s formulation of quantum electrodynamics, he abandoned the delicate balancing between the macroscopic foundation of time and microscopic descriptions of order advocated in the previous chapters. Why did Reichenbach react so strongly? The early understanding of Feynman diagrams was influenced by their visual proximity to bubble chamber pictures and Minkowski diagrams. Both shaped how Reichenbach interpreted an electron-positron pair, taking the positron to be an electron actually running backwards in time. This allegedly allowed a closed causal curve and thwarted microscopic order. While Reichenbach believed that functional genidentity accordingly failed, I argue that the concept can be adapted to modern quantum field theory if one allows for local variations and higher-order processes, as long as they are governed by symmetry rules and as long as energy conservation holds for the particles actually observed.
17:30 | Wherein is the concept of disease normative? From weak normativity to value-conscious naturalism (online) PRESENTER: Maria Cristina Amoretti ABSTRACT. In this paper we focus on some new normativist positions and compare them with traditional ones. In so doing, we claim that if normative judgements are involved in determining whether a condition is a disease only in the sense identified by the new normativisms, then disease is normative only in a weak sense, which must be distinguished from the strong sense advocated by traditional normativisms. Specifically, we shall argue that weak and strong normativity differ to the point that a single normativist label ceases to be appropriate for the whole range of positions. The upshot of our discussion is that weak normativism is compatible with, and a possible complement to, naturalism about disease: if values and norms are not components of the concept of disease, but only intervene in other explanatory roles, then the concept of disease is no more value-laden than many other scientific concepts, or even any other scientific concept.
16:00 | The Pragmatic Value of Uncertain Evidence (in person) PRESENTER: Patryk Dziurosz-Serafinowicz ABSTRACT. We explore the question of whether cost-free uncertain evidence is worth waiting for in advance of making a decision. A classical result in Bayesian decision theory, known as the value of evidence theorem, says that, under certain conditions, when you update your credences by conditionalizing on some cost-free and certain evidence, the subjective expected utility of obtaining this evidence is never less than the subjective expected utility of not obtaining it. We extend this result to a type of update method, a variant of Judea Pearl's virtual conditionalization, where uncertain evidence is represented as a set of likelihood ratios. Moreover, we argue that focusing on this method rather than on the widely accepted Jeffrey conditionalization enables us to show that, under a fairly plausible assumption, gathering uncertain evidence not only maximizes expected pragmatic utility, but also minimizes expected epistemic disutility (inaccuracy).
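In its classical form (due to I. J. Good; stated here for orientation, not drawn from the talk), the value of evidence theorem says that for an agent with utility U over acts a and states s who updates by conditionalization, deciding after observing cost-free evidence e is never expected to do worse than deciding now:

\[ \sum_{e} P(e)\, \max_{a} \sum_{s} P(s \mid e)\, U(a,s) \;\ge\; \max_{a} \sum_{s} P(s)\, U(a,s). \]

The talk's contribution, as described above, is to extend this inequality from certain to uncertain evidence via a variant of virtual conditionalization.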
16:30 | How to believe long conjunctions of beliefs: probability, quasi-dogmatism and contextualism (in person) PRESENTER: Stefano Bonzio ABSTRACT. According to the so-called Lockean thesis, a rational agent believes a proposition just in case its probability is sufficiently high, i.e., greater than some suitably fixed threshold. The Preface paradox is usually taken to show that the Lockean thesis is untenable, if one also assumes that rational agents should believe the conjunction of their own beliefs: high probability and rational belief are in a sense incompatible. In this contribution, we show that this is not the case in general. More precisely, we consider two methods of computing how probable each of a series of propositions must be in order to rationally believe their conjunction under the Lockean thesis. The price one has to pay for the proposed solutions to the paradox is what we call "quasi-dogmatism": the view that a rational agent should believe only those propositions which are "nearly certain" in a suitably defined sense.
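A simple bound (an illustration only, not one of the two methods the talk considers) shows how the required threshold scales with the number of beliefs: if each of n propositions has probability at least p, the Bonferroni/union bound gives the conjunction probability at least 1 - n(1 - p), so keeping the conjunction above a Lockean threshold t is guaranteed once

\[ p \;\ge\; 1 - \frac{1 - t}{n}. \]

For a book of n = 300 claims and t = 0.9, this demands \( p \ge 0.99967 \) for each individual claim, which conveys the flavour of the "quasi-dogmatism" the abstract mentions.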
17:00 | Causal Attribution and Partial Liability: A Probabilistic Model (in person) ABSTRACT. This paper presents two causal-probabilistic models for calculating partial liability in tort law, and explores their scopes and limits. The aim of the work is, ultimately, a unification of principles of causal attribution in legal doctrine and in the sciences, together with a rigorous justification of the concrete models.
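The abstract leaves the two models unspecified; purely as a point of comparison, a standard probabilistic quantity in this area is the probability of causation (attributable fraction),

\[ PC = \frac{P(E \mid C) - P(E \mid \neg C)}{P(E \mid C)}, \]

where E is the harm and C the defendant's conduct, with partial liability set proportional to PC. Whether the talk's models refine or depart from this measure is not stated in the abstract.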
17:30 | Epistemically modest methodological triangulation (in person) ABSTRACT. Should a scientist rely on methodological triangulation? Heesen et al. (2019) recently provided a convincing affirmative answer. However, their approach requires belief gambles if the evidence is discordant. We instead propose epistemically modest triangulation (EMT), according to which one should withhold judgement in such cases. We show that for a scientist in a methodologically diffident situation the expected utility of EMT is greater than that of Heesen et al.’s (2019) triangulation or that of using a single method. We also show that EMT is more appropriate for increasing epistemic trust in science. In short: triangulate, but do not gamble with evidence.
The programme is available here. The poster session will start with 3-minute oral presentations from all poster presenters. Breakout rooms will then be enabled for discussion with online poster presenters. Meanwhile, in-person poster presenters will be able to have discussions in front of their posters.