
09:00-11:00 Session 11A: Philosophy of science in practice
Patchwork concepts and operationalism (in person)
PRESENTER: Philipp Haueis

ABSTRACT. Patchwork approaches posit that many scientific concepts have multiple related meanings, depending on the technique used to apply the concept, the domain to which it is applied, and the property picked out in that domain. By tying concept meaning in part to techniques, patchwork approaches face the same issue as operationalism: each new technique would create a new concept, resulting in an unmanageable array of conceptual disunity. We show how patchwork approaches can avoid the issue and preserve conceptual integrity while respecting the role of operational definitions in scientific practice. Multiple techniques are associated with the same patchwork concept if they realize the same general reasoning strategy. Such strategies are stepwise instructions to determine which property a concept refers to in a domain. Using examples from biology and neuroscience, we show how reasoning strategies integrate multiple technique-involving uses of a term into a cohesive conceptual structure.

Explanatory Holes? Testing the Limits of the Mechanistic Framework (in person)

ABSTRACT. This paper discusses a type of explanation which does not seem to have received philosophical attention. Found in engineering science, these explanations appear to be mechanistic, yet they feature nonstandard entities (voids, cracks, pits…). Specifically, these explanations show how the activities and organization of nonstandard entities are responsible for a phenomenon. The paper studies this type of explanation, focussing on the case of micro-void explanations of ductile fracture. After characterizing these void explanations, three possible ways of understanding them are suggested: (i) they are traditional mechanistic explanations which employ standard entities (voids should be understood as shorthand expressions), (ii) they are special cases of mechanistic explanations that employ nonstandard entities, (iii) they require an alternative account of explanation. I elaborate and discuss each option and suggest that an account of fictional mechanism explanations might be warranted.

Participation and Objectivity (online)

ABSTRACT. Many philosophers of science have recently argued that extra-academic participation in scientific knowledge production does not threaten scientific objectivity. Quite the contrary: citizen science, participatory projects, transdisciplinary research, and other similar endeavours can even increase the objectivity of the conducted research. Simultaneously, scientists working in fields where such participation is common have expressed worries about various ways in which it can result in biases. In this paper I clarify how participation can both increase and threaten the objectivity of the conducted research.

Epistemic aims, methodological choices, and value trade-offs in modeling (online)

ABSTRACT. Different epistemic aims are satisfied by prioritizing different epistemic values. Axel Gelfert (2013, 2016) argues that trade-offs in modeling such as those between generality, precision, and realism can be understood as demarcation criteria between sciences. I build on Gelfert’s argument by focusing on a pervasive kind of trade-off in computationally intensive modeling – that between accuracy and speed. I use two case studies in physics to show how trading off speed and accuracy satisfies different epistemic aims, but rather than demarcating sciences, it transforms the disciplinary boundaries. I argue in favor of an epistemically significant role for speed of generating results, which is often dismissed or downplayed as an extra-scientific element, or conflated with ease of use. Case studies are informed by qualitative interviews conducted with practicing modelers in theoretical physics.

09:00-11:00 Session 11B: Philosophy of physical sciences
Reliability, Informativeness and Sensitivity in dark matter observation (online)

ABSTRACT. The physics of dark matter can be probed by employing five different methods of observation: (i) via its gravitational effects, (ii) via precision measurements of cosmological observables, (iii) via direct searches, (iv) via indirect searches, and (v) via collider experiments. A natural question that arises is whether any of these methods is epistemically superior to the rest. The central aim of this paper is to answer this question by developing a way of evaluating the epistemic power of each method, based on the criteria of Reliability, Informativeness and Sensitivity. The main conclusion is that although these virtues are useful tools for evaluating the robustness of each method, the overall epistemic power of a possible observation of dark matter does not depend on the method per se. Rather, it is a matter of the confidence of scientists in the underlying physics and the performance of the experimental apparatus.

Modeling and modality in astrophysics (in person)
PRESENTER: Giulia Schettino

ABSTRACT. Models can be found everywhere in astrophysics. Due to the variety of spatial and temporal scales, astronomical phenomena cannot be reproduced in the laboratory, making astrophysics a special discipline. In this framework, the adoption of models and simulations becomes mandatory. Here, we are interested in exploring the nature of the modality implied in such modeling practice: how is the space of theoretical possibilities constrained? What kind of possibility (epistemic or objective) is at work?

We will address these questions by comparing two cases of modeling at very different scales: a) a modeling practice taking place in a “phenomenological” context, such as modeling weak gravity at the solar system level; b) the theoretical modeling activity characterizing cases of model building in cosmology. The two cases seem to refer to substantially different modeling practices. However, as we will argue, the general situation in astrophysics turns out to be much less clear-cut than it seems.

A Lie-Algebraic Stability Explanation of the Effective Application of Mathematics to Physics (online)

ABSTRACT. Various attempts have been made to reasonably explain the undeniable effectiveness of mathematics in its application to physics. In particular, case-by-case explanations have been conducted without disregarding formal analysis gleaned from actual practice. In that same spirit, yet focusing more on the innovative power of mathematics and its general patterns of applicability, I shall argue that its effectiveness in developing accurate and informative predictions is explained in part due to the application of a powerful heuristic strategy that consists in reformulating physical theories in terms of new mathematical structures which correspond (under a group-theoretic representation) to the class of abstract Lie algebras that are stable.

Gauge theories from the effective perspective (online)

ABSTRACT. This paper argues that supporters of effective field theory interpretations of relativistic quantum theories have reason to adopt a philosophically revisionary interpretation of gauge theories. Philosophers have recently been interested in the conventional, perturbative approach to quantum field theory, bringing it to bear on general philosophical issues such as scientific realism, emergence, and fundamentality. Key to these applications has been the notion of an effective theory: one explicitly relativized to a particular length scale. However, this notion faces well-known problems when applied to gauge theories. The standard philosophical view on gauge theories sees them as featuring descriptive fluff that obstructs the interpretation of an effective quantum gauge theory as scale-relative. I argue that this obstacle can be naturally avoided by an independently motivated alternative interpretation of gauge theories, which does not impute to them extra fluff.

09:00-11:00 Session 11C: Philosophy of life sciences
Two accounts of extrapolation (online)

ABSTRACT. According to one account, extrapolations are a type of inference. They are not completely uninformed, but carry evidential weight prior to direct testing. According to a rival account, extrapolation can only be a method for generating hypotheses, which must ultimately be tested on an independent basis. Although the two accounts seem contradictory, they do in fact sustain one another. The experimental practices promoted by the hypothesis-generator account ultimately generate the prior evidence on which future extrapolative inferences can rely. In turn, extrapolative inferences generate informed hypotheses which improve estimates of causal efficacy, as well as the chances of successful confirmation by direct testing.

Epistemic roles of similarity considerations in mouse models of cancer (online)

ABSTRACT. This paper identifies and analyzes diverse epistemic roles of similarity considerations in cancer research which utilizes mouse models, including immunocompetent and immunodeficient transplantable models, genetically engineered models and humanized models. To this end, the paper proposes to disentangle three – often intertwined in practice but conceptually distinct – research modes: model selection, model extrapolation and model creation. It is then argued that each of the modes exhibits reliance on different forms of similarity considerations. By appreciating these epistemic complexities, it is also possible to shed some light on more general philosophical debates regarding the similarity account of scientific representation. However, rather than reinvigorating that account, the claim of this paper is limited: it clarifies the specific conditions under which similarity may be crucial for both establishing and maintaining the representational relation between the model and its target.

Can a Microbiome be 'Obesogenic'? (in person)

ABSTRACT. Research into the human microbiome has proliferated over the past two decades. Growing evidence has implicated the microbiome in a range of key bodily functions, and there is promise of the development of effective interventions into conditions or diseases that have thus far been difficult to address.

I focus on the relationship between the microbiome and obesity, and examine the concept of the ‘obesogenic microbiome’. I argue that there are two key problems: firstly, the erroneous assumption that obesity is a single category which the microbiome can causally influence, and secondly, the conceptualisation of the microbiome as a manipulable entity in such a way that makes its integration with host or environment dynamics more difficult. These problems have broader implications both for the interpretation of microbiome research, and for our understanding of what makes something ‘obesogenic’, including obesogenic humans and environments.

Specificity of Association in Epidemiology (online)

ABSTRACT. The epidemiologist Bradford Hill famously claimed that specificity of association (the fact that a variable X is associated with a certain medical outcome Y but not other outcomes) is strong evidence of causation in epidemiology. Yet prominent epidemiologists have dismissed this idea, claiming that it relies on dubious principles about disease causation. Against these criticisms, I argue that specificity of association does have a useful role to play in epidemiological causal inference. On the picture I propose, the evidential value of specificity is tied to contingent features of the sorts of causal inference problems that epidemiologists typically face. Along the way I argue (pace Woodward) that specificity of association is a matter not only of the number of putative effects associated with a variable, but also of the heterogeneity of these effects, and that this is important to understand why nonspecificity is often evidence against causation in epidemiology.

09:00-11:00 Session 11D: General philosophy of science
Multiple Realization and Evolutionary Dynamics: A Fitness-Based Account (online)

ABSTRACT. Multiple realization occurs when a natural kind is variably realized at more basic levels and the common physical structure of the realizers is not essential for supporting nomological statements. It has been suggested that this phenomenon may be an outcome of natural selection acting over multiple realizers that perform an adaptive function. In this paper we make the following contributions. First, we present a revision of this model characterized by stricter equilibrium conditions and superior explanatory power. Second, we present a typology of multiple realization that provides a plausible account of the differences between across- and within-species multiple realization. Third, we perform a formal analysis of the dynamics of multiple realization that sheds light on the differences between multiple realization at different levels of organization.

Against Prohibition (Or, When Using Ordinal Scales to Compare Groups is OK) (online)

ABSTRACT. A widely held view on measurement inferences, which goes back to Stevens's (1946) theory of measurement scales and “permissible statistics”, defends the following prohibition: you should not make inferences from averages taken with ordinal scales (vs. interval or ratio scales). This prohibition is general—it applies to all ordinal scales—and it is sometimes endorsed without qualification. Adhering to it dramatically limits the research that the social and biomedical sciences can conduct. I provide a Bayesian analysis of this inferential problem, determining when measurements from ordinal scales can be used to confirm hypotheses about relative group averages. The prohibition, I conclude, cannot be upheld, even in a qualified sense. I illustrate with the paradigm ordinal scale, the Mohs scale of hardness, showing that the literature has mischaracterized it. The account here provided offers a potential resolution to the tension between practice and methodology that besieges measurement scales.

A Practitioner's Guide to Pragmatic Humeanism (online)

ABSTRACT. Advocates of the Humean account of laws have recently added elements of pragmatism into their view. But every step away from the orthodox Lewisian view risks losing the view's claim to deliver objective facts about laws of nature, causal relations, and nomological possibility. I will discuss two departures from orthodoxy: first, Humeans have argued that the theoretical virtues which determine which regularities are laws should be tailored to our epistemic needs. Second, Humeans have argued that the properties in which the laws are formulated should be determined not by objective metaphysical structure, but instead by their usefulness in formulating an effective and simple theory. In this paper, I will evaluate whether these changes are concessions to what Lewis called a `ratbag idealist'. First, I'll present and respond to Lewis's ratbag worry. Next, I'll discuss a related objection from Shamik Dasgupta, and respond on behalf of Humeans who deny Lewis's account of natural properties.

09:00-11:00 Session 11E: General philosophy of science & integrated history and philosophy of science
Understanding the epistemic role of measurement issues in 19th century craniology (in person)

ABSTRACT. Craniology – the practice of inferring intelligence differences from measurements of human skulls – remained a popular research program until the end of the 19th century. Starting from the late 1970s, it became a critical target of historians and sociologists of science, who successfully uncovered the socio-cultural biases invalidating the evidence and claims that it produced. Although this literature offers an extensive catalogue of the epistemic flaws of craniology, these are treated as contingent features that were conducive to the defence of certain socio-cultural values and stereotypes. I argue, instead, that three epistemic issues were deeply constitutive of craniological practice as an unsuccessful attempt to measure an abstract quantity such as intelligence: 1) the systematic failure to develop a sound inferential scaffolding; 2) the issue of circularity and reliability of measurement, or problem of coordination; 3) the excessive epistemic burden attributed to measured data.

Political Representation in Science (in person)

ABSTRACT. Philip Kitcher holds that one of the problems he sets out to address is the problem of inadequate representation in science. If science is to work in the public interest, the public has to be represented in science. This is particularly important in science policy where inadequate representation can reduce the legitimacy of decisions based on scientific expert advice. But how is representation to be understood in this context and what kind of representation could legitimise scientists’ influence on politics? In this talk, I use an approach informed by political theory to argue that Kitcher as well as Helen Longino focus too narrowly on the representation of social perspectives. I suggest we instead consider ways that group interests can be organised in science, particularly in expert committees. While this approach runs the risk of merging scientific and political debates in a way many philosophers of science feel uncomfortable with, I maintain that it gives a more realistic picture.

The role of research heuristics for the occurrence and handling of new research opportunities in application-oriented research (online)

ABSTRACT. My paper deals with research heuristics in application-oriented research and their role in the occurrence and handling of new research opportunities. Based on several case studies from the life sciences, I first argue that application-oriented research frequently produces findings that can take research in new directions. I then discuss whether a certain heuristic for the selection of research topics is conducive to the production of new research opportunities, namely setting epistemically ambitious research goals. Second, I discuss different heuristics for the handling of new research opportunities. While epistemic considerations favour exploration of all epistemically interesting new research opportunities, practical considerations suggest limiting exploration to those findings relevant for the sought applications. Therefore, I propose that exploration on epistemic grounds should be enabled at the level of the research system rather than in each application-oriented research project.

Why Fund Basic Research? Unpredictability and Other Enigmas (online)

ABSTRACT. Why should we fund basic science? This question has attracted modest philosophical attention. Within science policy circles, the foremost argument for funding basic science is the ‘Unpredictability Argument,’ according to which basic science should be funded because it frequently facilitates technological gains. In this paper, we assess its most contentious and underscrutinized premises: that technological gains have frequently followed from basic science and will continue to do so into the future. This reveals a new set of challenges for philosophers that can contribute towards well-informed science funding policies.

09:00-11:00 Session 11F: Philosophy of social science and cognitive science
On dynamic-mechanistic explanations in the cognitive sciences (in person)

ABSTRACT. It is argued that dynamic and mechanistic explanations in the cognitive sciences are mutually exclusive. Some dynamicists argue that cognitive systems are non-decomposable, and thus cannot supplement the mechanistic framework, which depends upon assuming the target system's decomposability. Mechanists claim that the dynamic approach lacks explanatory power because it merely describes the dynamics of the target phenomenon without addressing the underlying causal mechanism.

In my paper I argue that these arguments can be refuted, and that the dynamic and mechanistic approaches are complementary rather than exclusive. In short, I show that dynamic models may also describe components of mechanisms, and that the mechanistic approach is not blindly committed to decomposability and may also explain systems which are only minimally decomposable. As an example of a dynamic-mechanistic explanation, I will introduce a model of epileptic seizure.

Learning as a Part of Memory (online)

ABSTRACT. What is the relation between learning and memory? Although these cognitive capacities are taken to be closely related, they are often conceived as distinct capacities. In this talk, I examine the relation between learning and memory, arguing that learning is not conceptually independent of memory. Rather, learning should be conceived as part of the capacity to remember. I argue for this view on the basis of conceptual and empirical considerations. I claim that the textbook definition is fraught with conceptual inconsistencies which make it difficult to accept in its current form. Characterizing learning as memory acquisition circumvents these conceptual inconsistencies. Further, I argue that by optogenetically intervening on distinct stages in the process of memory, researchers have been able to experimentally implement artificial information in animals. Optogenetic studies of artificial memory only make sense in light of learning being memory acquisition.

Empirical Philosophy of Economics - Drawing Borders between Quantitative and Qualitative Methods (in person)

ABSTRACT. In this paper, I attempt to compare and criticize two sets of methods of empirical philosophy of economics inspired by the social sciences — quantitative (i.e. questionnaires, vignettes) and qualitative (i.e. focus groups, participant observation). In the first part, I analyse two case studies — empirical approaches to classical questions of philosophy of economics — Nagatsu’s and Põder’s quantitative study of economists’ notion of preferences and Svetlova’s qualitative study of models’ de-idealization. I argue that both approaches alone are insufficient to provide a significant improvement of the classical tools of philosophical analysis. The quantitative study lacks a clear semantics of results, a genuine measure of significance and an insight into the subject’s thought process, while the qualitative study lacks representativeness. Thus, I argue in the last part, the methodology of the empirical philosophy of economics must aim to combine both sets of methods.

The method of cases in economics: a challenge for naturalism? (online)

ABSTRACT. The use of the 'method of cases' in philosophy has been criticized for two main reasons. First, it allegedly relies on a mysterious psychology of intuitions. Second, philosophical cases are atypical, which makes our judgements about them more unreliable. I argue that there are similarities between the method of cases as it is used in philosophy and the sciences. In particular, I show that intuitions have the same role in philosophy and the sciences if one construes them along the lines of the 'revisionary approach'. Then, I show that scientific cases can also be atypical. This suggests that there is more methodological continuity between philosophy and the sciences than previously thought.

11:30-13:30 Session 12A: Philosophy of technology and interdisciplinary research
What do engineers understand? The case of biological methanation (online)
PRESENTER: Michael Poznic

ABSTRACT. What sort of understanding do engineers acquire through their investigations into how to best make something? This paper develops a case study to argue for three conclusions about engineering and understanding. (i) In line with much recent work in the philosophy of science, engineers’ understanding turns on grasping models. (ii) There are at least two kinds of models: representational models that aim to accurately depict features of target systems and design models that aid in the production of artifacts. The understanding of engineers consists in the appropriate integration of these representational and design models. A design model must be informed by representational models for engineers to genuinely understand. This conclusion supports Dellsén’s recent argument that understanding does not require explanation. However, (iii) Dellsén’s account of understanding based exclusively on models of dependencies is too narrow to make sense of the understanding of engineers.

How supervised machine learning “measures” and what we can learn from it (in person)

ABSTRACT. This paper explores a striking correspondence between supervised machine learning and a model-based epistemology of measurement. While related comparisons have led other philosophers of science to debate the relative evidential status of computational prediction and measurement outcomes, I argue that once we consider disciplinary differences in measurement practices, both measurement and machine learning are too diverse to warrant meaningful comparison of their evidential status. Instead, we can draw other lessons from the analogies between machine learning and measurement. Comparisons with measurement offer a promising approach to better grasping the epistemology of machine learning, as well as a point of reference for contemporary challenges surrounding the use of machine learning in science.

A computational approach to the philosophical discussion of model transfer (in person)

ABSTRACT. Templates have been proposed as a central concept in explaining interdisciplinary model transfer. But while philosophers have given several detailed accounts of template transfer, a comprehensive, global characterization of this phenomenon is still lacking. In the present study, we propose that the combination of philosophical and computational analyses can extend our understanding of the construction, transfer, and application of model templates. The focus of the proposed presentation will be on the computational analysis. This analysis is based on 60,493 preprints dealing with oscillation phenomena that were uploaded to arXiv and bioRxiv over the last 10 years. We use this data to compare the structures that arise from the mathematical content and the semantic content of the same dataset. This allows us to establish a first indication of the extent to which scientific work in this area is connected via cross-disciplinary model templates.

Extended Virtue and Scientific Expertise (online)

ABSTRACT. As contemporary scientific inquiry depends on technologically enhanced research skills and large-scale collaboration, we need a notion of epistemic virtue that accommodates not only individual skills and traits but also social-technological extension of epistemic competence. I first formulate a scientifically relevant notion of epistemic competence in terms of field-specific epistemic skills and sensitivity towards sources of error. In the second part, I develop an account of extended epistemic competence, which involves the integration of technological instruments with the cognitive processes underlying knowledge production, as well as distribution of epistemic labor, credit and responsibility in the context of research collaborations. In the last part, I develop a relational notion of scientific expertise in terms of informant-reliability, which comprises one’s extended competence as well as responsibility to inform truly and to avoid misinforming.

11:30-13:30 Session 12B: Philosophy of physical sciences
Limiting reduction of hydrodynamics, singular limits, and asymptotic expansions (in person)

ABSTRACT. This paper focuses on a case of limiting reduction in physics which has not yet been discussed in the philosophical literature (e.g. Batterman 2020), viz. the reduction of hydrodynamics to systems of particles in classical mechanics. We first argue that the Navier-Stokes equation can be derived from a system of particles in classical mechanics, but with a *singular* limit. For that purpose, we sketch how the Navier-Stokes equation is derived (e.g. Gallagher 2019) and we argue that the *singular limit* prevents a limiting reduction. However, this derivation allows one to recover the Navier-Stokes equation in the limit epsilon → 0 when *asymptotic expansions* are truncated. We thus investigate these asymptotic expansions, which turn out to be *divergent* series. According to Miller (forthcoming), they cannot be interpreted as approximation schemes. In this context, we discuss how to interpret this derivation of the Navier-Stokes equation with an error *epsilon* arbitrarily small.

The Past as Key to the Future: The Paleoclimate as an Analogue Model for Contemporary Climate Change (online)

ABSTRACT. Paleoclimatology has the potential to provide invaluable information on Earth’s climate, including by using the paleoclimate as a model for present and future climate change. In this talk, I will draw out a number of philosophical implications of the practice of using the paleoclimate as an analogue (physical) model. First, I will discuss implications of this practice for the philosophy of the historical sciences, specifically pertaining to the debates over whether the historical sciences can be involved in generalizations or prediction. Second, I will enumerate a variety of ways in which the paleoclimate is unusual as an analogue model: it is not a scaled (up or down) model of its target system, it is not artificially constructed, and it is not manipulable. These features of the paleoclimate raise a challenge for philosophers of modeling to spell out exactly what makes analogue models useful to scientists, both in this particular case and more generally.

Climate extremes as serious possibilities (in person)

ABSTRACT. An alternative possibilistic perspective has been suggested in order to avoid the deficiencies of the probabilistic approach in climate modelling. But this possibilistic perspective raises the epistemological question of the identification and justification of serious possibilities in the climate context. This paper aims to show how recent work in climate science (and in particular in the subfield of extreme event attribution) can help to address this ‘possibilistic challenge’. More specifically, we investigate to what extent the sort of qualitative (and causal) understanding at play within the storyline approach (typically relying on thermodynamical considerations) can help to ground serious possibilities in the climate context. The storyline approach developed in climate science can actually itself be naturally understood within the possibilistic framework discussed in philosophy, though the link between the two needs to be carefully articulated.

Against Symmetry Fundamentalism (in person)

ABSTRACT. Symmetry fundamentalism holds that symmetries are features of physical reality and should hence be employed as guides to what’s fundamental. In this presentation, I call for philosophical caution when symmetry fundamentalism is employed for metaphysical research. First, I distinguish between two views on symmetries – by-stipulation and by-discovery. Then, I argue that the role that symmetries play in modern physics supports, to a great extent, the by-stipulation view, which conflicts with symmetry fundamentalism. It follows from this that symmetry fundamentalism is not adequate to construe the role that symmetries play in current physics, since it overloads their ontological import. I conclude that symmetry deflationism, instead of symmetry fundamentalism, should be recommended.

11:30-13:30 Session 12C: Philosophy of life sciences
Pregnancy as Agency (online)

ABSTRACT. Pregnancy is widely regarded as a passive state that lasts without requiring action from a pregnant woman. As traditionally assumed also by many feminist philosophers, pregnancy is not something a woman does; it rather happens to her, thereby depriving her of her agency. My paper explores the metaphysical prospects of a novel view of pregnancy as agency that goes beyond recent feminist acknowledgements that pregnant women actively make adjustments to the changes of their bodies. To this end, I bring together two hitherto separate debates: the debate on agency as a biological capacity of organisms (‘bio-agency’) in the philosophy of biology and the debate on the metaphysics of mammalian pregnancy. I argue that the best chances to facilitate a convincing scientifically informed view of pregnancy as agency are provided by a combination of the bio-agency approach with a Process View of pregnancy, according to which a pregnant organism is a bifurcating hypercomplex process.

Organisms, biological individuals, and levels of organization: An integrative framework (in person)

ABSTRACT. This paper deals with the notions of ‘biological individual’ and ‘organism’, which are central in contemporary philosophy of biology but are often – and problematically – conflated. I argue that organisms are special kinds of biological individuals that emerge at several levels in the hierarchy of organization and that there is no single ‘level of the organism’. I conceptualize biological individuals and organisms as nested systems composed of interacting parts (subsystems). I develop a conceptual framework to individuate organisms by screening (‘zooming-in and -out’) through the hierarchy of nested systems. Under this view, the organism is understood as a limit case, i.e. a system that emerges at a level below and above which systems of qualitatively different kinds are found. Finally, I formalize these ideas using nested graphs, apply them to two case studies on biological individuality (holobionts and superorganisms), and discuss some of their theoretical and practical consequences.

On the meaning of biogeographical areas as natural entities and on their ontological status (in person)

ABSTRACT. This work aims to address the ontology of biogeographical areas (more specifically called areas of endemism), considered to be the fundamental units of study in biogeographical disciplines. In particular, the individuality thesis proposed by Crother and Murray (2011), and Murray and Crother (2015), who considered areas of endemism as historical individuals (in the sense of Ghiselin (1974) and Hull (1976) for biological species), will be discussed. The meaning of areas of endemism as the natural entities of Earth’s surface, in the context of historical biogeography, will also be discussed. Finally, the ontological viewpoint will be distinguished from the operational procedures that concern specific scientific frameworks (such as evolutionary theory and phylogenetic systematics, i.e. cladistics, in historical biogeography), and the relevance of ontological investigation will be highlighted.

Organisms as persisters and overcomers (in person)

ABSTRACT. This paper addresses the relation between the concepts of organism and biological individual. Recently, biophilosophers have clarified different kinds of individuality (e.g. metabolic, reproductive) but largely failed to show how individuality and organismality differ. An influential view understands organisms as self-organizing systems whose parts maintain them as functional wholes. I argue that this view of organisms as persisters (i) cannot distinguish organisms from other self-maintaining individuals (e.g. holobionts, colonies), and (ii) neglects crucial dynamics characteristic of organisms. This refers to organisms’ ability to de- and reorganize their individuality (e.g., from metabolic to reproductive individuality) during development to become qualitatively different wholes. By drawing on cases of microbial colonization and sexual parasitism, I develop a view of organisms as overcomers that solves these two problems and clarifies the explanatory roles of organisms in biology.

11:30-13:30 Session 12D: General philosophy of science
A Bayesian Perspective on Severity: Risky Predictions and Specific Hypotheses (online)
PRESENTER: Noah van Dongen

ABSTRACT. A tradition that goes back to Popper assesses the value of a statistical test primarily by its severity: was it an honest and stringent attempt to prove the theory wrong? The main idea is that a successful risky prediction is more impressive than a successful vague prediction. For "error statisticians" such as Mayo, and frequentists more generally, severity is a key virtue in hypothesis tests. Conversely, failure to incorporate severity into statistical inference, as allegedly happens in Bayesian inference, counts as a major methodological shortcoming. In this contribution, we first argue that the error-statistical explication of severity has substantive drawbacks (i.e., neglect of research context; lack of connection to specificity of predictions; problematic similarity of degrees of severity to one-sided p-values). Second, we argue that severity matters for Bayesian inference via the value of specific, risky predictions: severity boosts the evidential value of a Bayesian hypothesis test.

Old Evidence, Measurement, and Accuracy (in person)

ABSTRACT. I propose a new solution to the problem of old evidence. I argue that a particular sort of evidence gives rise to the problem, and that this same sort suggests a solution. The idea is that with respect to old evidence we generally rely on measurements, and these come with statistical structure, viz. confidence intervals that indicate how likely it is that the real value lies within a certain range. I argue that if we take the accuracy of the hypothesis into account and update on the part of the confidence interval implied by the hypothesis, its posterior probability does, after all, increase.
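As a toy illustration of the updating strategy sketched in this abstract (all numbers and modelling choices below are hypothetical, not the author's), one can model a measurement as a normal estimate with a standard error, compute how much of its probability mass falls within the range a hypothesis implies, and feed that mass into Bayes' theorem as a likelihood:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical setup: a measurement yields estimate mu = 10.0 with
# standard error sigma = 2.0, so the 95% confidence interval is roughly
# (6.08, 13.92). Suppose hypothesis H implies the true value lies in
# (9.0, 12.0).
mu, sigma = 10.0, 2.0
lo, hi = 9.0, 12.0

# Likelihood of landing in H's range, crudely modelled as the normal
# mass inside (lo, hi) versus the mass outside it.
p_region_given_H = normal_cdf(hi, mu, sigma) - normal_cdf(lo, mu, sigma)
p_region_given_notH = 1.0 - p_region_given_H

prior_H = 0.5
posterior_H = (p_region_given_H * prior_H) / (
    p_region_given_H * prior_H + p_region_given_notH * (1 - prior_H))

print(round(p_region_given_H, 3))  # mass inside H's range, ≈ 0.533
print(posterior_H > prior_H)       # updating on the region raises P(H)
```

Because more than half of the measurement's mass falls in the hypothesis-implied range, the posterior for H exceeds its prior, matching the direction of the abstract's claim.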

Genuine Confirmation and Tacking by Conjunction (online)
PRESENTER: Gerhard Schurz

ABSTRACT. Tacking by conjunction is a deep problem for Bayesian confirmation theory that rests on the following fact: To every hypothesis H that is confirmed by a piece of evidence E, one can ‘tack’ an irrelevant hypothesis X so that the conjunction H&X is also confirmed by E. This seems counter-intuitive. Existing Bayesian solution proposals try to soften the negative impact of this result by showing that although H&X is confirmed by E, it is so only to a lower degree. In this paper we outline some problems of these proposals and develop an alternative solution based on a new concept of confirmation that is called "genuine confirmation". We show that the notion of genuine confirmation provides a satisfying solution to the tacking problem and has two further advantages: (i) it does not suffer from the problem of measure sensitivity and (ii) it is a necessary condition for Bayesian convergence-to-certainty results.
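The tacking result the abstract describes, and the measure sensitivity it mentions, can be reproduced with a minimal numerical sketch (the probabilities are chosen purely for illustration; X is assumed probabilistically independent of both H and E):

```python
# Hypothetical numbers: E confirms H; X is irrelevant to H and E.
p_H = 0.3
p_H_given_E = 0.6       # E confirms H
p_X = 0.5               # X independent of H and of E

# Independence lets the conjunction inherit confirmation from H:
p_HX = p_H * p_X
p_HX_given_E = p_H_given_E * p_X

# Tacking: H&X is confirmed by E as well...
assert p_HX_given_E > p_HX

# ...but to a lower degree on the difference measure d(h,e) = P(h|e) - P(h):
d_H = p_H_given_E - p_H
d_HX = p_HX_given_E - p_HX
print(round(d_H, 2), round(d_HX, 2))  # → 0.3 0.15

# On the ratio measure r(h,e) = P(h|e) / P(h), by contrast, the two come
# out equal, illustrating the measure-sensitivity worry.
r_H = p_H_given_E / p_H
r_HX = p_HX_given_E / p_HX
print(round(r_H, 6), round(r_HX, 6))
```

The difference measure softens the result while the ratio measure does not, which is exactly the kind of measure-dependence the proposed notion of genuine confirmation is meant to avoid.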

Interpreting Probability Claims in Climate Science (in person)

ABSTRACT. Probabilistic claims are common in climate science. To date, these claims have usually been treated as expressing subjective credences. I argue against this view, which has three major problems. First, it fails to account for how the probabilities in question are in fact generated, which often involves the use of classical statistics rather than Bayesian updating. Second and third, the presentation of subjective credences by scientists in scientific reports is both (descriptively) atypical and (normatively) inappropriate. A better view is that such claims represent the authors' best estimate of the objective "weight of the evidence."

11:30-13:30 Session 12E: Philosophy of social science
Non-experts: which ones would trust you? (in person)

ABSTRACT. Following Goldman (2001), recent contributions have discussed the evaluation of experts as defined by the novice/2-experts problem (when novices must find criteria that allow them to choose among experts). Goldman’s criteria have been validated by recent contributions; yet less attention has been paid to novices and to how their heterogeneity might mediate the expert/novice relation. We present an account that distinguishes novices based on their capacity to evaluate messages according to evidential quality. We discuss a scenario in which experts need to decide whether to issue an accurate (but complex) message (as related to a Government recommendation), or a simple (but less accurate) one. Novices are distributed into two groups: highly competent novices (HCN) and low-competence novices (LCN). We show how, under reasonable assumptions, complex messages can be more effective even if the majority of citizens are LCN. We present a case study regarding the use of masks in COVID-19 prevention.

Evaluating Interpretive Qualitative Theories (online)

ABSTRACT. Typically, empirical studies that employ interpretive qualitative methods focus more often on generating than on testing theories. It is both intuitive and typically assumed by researchers that (a) the theories generated in this context have some credibility, and that (b) this credibility is in some way a function of the quality and volume of the collected data. In this paper, I discuss both claims. I argue that by conceptualizing the inference involved in qualitative interpretive theorizing as selective abduction, credibility can be defined as satisfaction of some selection criterion. I examine two possible selection criteria: likelihood and posterior probability. I discuss how these criteria can be construed in a way that sheds light on claims (a) and (b) and, at the same time, fits the practice of qualitative methods. I argue that both criteria entail rewards and limitations, but only the posterior probability criterion can explicate claim (b).

Defending a concrete interventionist theory of singular causation (online)

ABSTRACT. Woodward's interventionism clearly shows how one can find evidence for type-level causation. Such evidence can come from both actual interventions (e.g., RCTs) and natural experiments. However, as a methodological guide for finding evidence of token-level causation, such as claims of the form ‘event x caused event y’, Woodward’s theory is incomplete. For such claims, Woodward’s framework relies on hypothetical interventions, counterfactual claims telling us what would have happened to y, if we had intervened on x. I sharpen interventionism, showing what criteria hypothetical interventions ought to meet to constitute good evidence, focusing on case study research in political science. There, token claims are common and of great importance for both theorizing and social policy. I show that although methodologists of political science have concrete guidelines for counterfactual analysis, these are more ambiguous than, and at times even contradict, my interventionist proposal.

Extrapolating Causal Effects - Where is Our Theory of Confidence? (online)

ABSTRACT. Extrapolation of causal effects is common in various sciences. Despite important progress on elucidating the challenges it involves, the existing literature has largely neglected how we can articulate and manage the uncertainties that invariably remain in extrapolation. Drawing on existing resources, I sketch a causal graph-based approach, called support graphs, that can help analysts clarify several important issues in an integrated way, including: 1) which assumptions are needed for an inference, 2) how relevant these assumptions are for a conclusion, 3) whether they enjoy sufficient support, and 4) how confident we may be in a conclusion on that basis.

11:30-13:30 Session 12F: Formal philosophy of science
A Battle in the Statistics Wars (online)
PRESENTER: William Peden

ABSTRACT. Debates between Bayesian, frequentist, and other statistical methodologies generally focus on conceptual justifications, sociological arguments, or their long-run virtues. Our paper adds a fresh perspective to these “Statistics Wars” via simulations of the methodologies’ short-run decision-making performance. We simulated a decision problem involving bets on a series of binomial events. We coded three players: a standard Bayesian, a frequentist, and one based on Jon Williamson's Bayesianism. We varied the simulation/player parameters in many ways. Surprisingly, all players performed comparably (and well), even in the very short run, where their decisions differed considerably. Our study indicates that all three approaches should be taken seriously. Apart from a preliminary and relatively obscure 1990s conference paper, our study is unprecedented. The decision problem and the players are easily modifiable, so our study has great potential fruitfulness for the philosophy of statistics.
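A stripped-down version of such a betting simulation (purely illustrative: the authors' actual protocol, scoring rule, and Williamson-style player are not reproduced here) might pit a Beta-prior Bayesian against a relative-frequency player and compare their average Brier losses:

```python
import random

random.seed(1)

def simulate(n_rounds=50, true_p=0.7):
    """Toy version of the betting problem sketched in the abstract:
    each round, a player states a probability for a binary event and
    accumulates squared-error (Brier) loss against the outcome."""
    # Bayesian player: Beta(1, 1) prior, reports the posterior mean.
    a, b = 1, 1
    # Frequentist player: reports the observed relative frequency
    # (0.5 before any data have come in).
    successes, trials = 0, 0
    bayes_loss = freq_loss = 0.0
    for _ in range(n_rounds):
        bayes_p = a / (a + b)
        freq_p = successes / trials if trials else 0.5
        outcome = 1 if random.random() < true_p else 0
        bayes_loss += (bayes_p - outcome) ** 2
        freq_loss += (freq_p - outcome) ** 2
        a += outcome
        b += 1 - outcome
        successes += outcome
        trials += 1
    return bayes_loss / n_rounds, freq_loss / n_rounds

bayes, freq = simulate()
print(round(bayes, 3), round(freq, 3))  # losses typically come out close
```

Even in this crude sketch the two players' average losses tend to be similar after a few dozen rounds, in the spirit of the abstract's finding that short-run performances are comparable.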

Causal Heterogeneity and Independent Components (in person)
PRESENTER: Lorenzo Casini

ABSTRACT. Many variables have potentially ambiguous effects on other variables due to their heterogeneous causal role in the population of interest. When heterogeneity is present and data on individual units are unavailable, correct estimates of causal effects by popular inference methods are harder to come by, and so are successful policies. In this paper, we exploit the “independent component representation” and tools that recover this representation to address the problems induced by heterogeneity. Our proposed method is data-driven and “unsupervised”, meaning that it can uncover causal heterogeneity without directly investigating the data generating mechanism. We illustrate our argument by reference to the “rebound effect” in energy economics, namely the ambiguous effect of energy efficiency advances on greenhouse gas emissions via (aggregate) consumption, which in the literature has been attributed to underlying differences in consumption behaviour across individuals.

Two birds with one stone? Not when arguing for Bayesianism and Credal Veritism (online)

ABSTRACT. Accuracy-first epistemology argues that the Bayesian norms are accuracy-conducive. Together with Credal Veritism, i.e. the claim that accuracy is the sole fundamental source of epistemic value, this can be used for an argument in favor of the Bayesian norms. One possible line of reasoning for Credal Veritism (as put forward, for example, by Richard Pettigrew) again relies on the accuracy-conduciveness of the Bayesian norms. It is argued that this reasoning is question-begging. It turns out that the accuracy-conduciveness of the Bayesian norms cannot be used simultaneously in an argument for Bayesianism and in an argument for Credal Veritism: the accuracy-firster cannot kill both birds with the same stone.

Accuracy and Probability Kinematics (online)

ABSTRACT. Accuracy-based arguments purport to establish the laws that govern credences (or degrees of belief) by showing that these laws further the goal of accuracy: having credences as close as possible to the truth. This paper investigates the relationship between accuracy and Probability Kinematics (also called Jeffrey Conditionalization). It is well known that the popular Brier score cannot be used in an expected-accuracy argument for Probability Kinematics. In this paper, we show that a local logarithmic inaccuracy measure does not lead to Probability Kinematics either. And we present a new accuracy-based account of Probability Kinematics based on considerations of accuracy-dominance.
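Probability Kinematics itself is easy to state computationally. The sketch below (with purely illustrative numbers) implements Jeffrey's update rule over a two-cell evidence partition and shows that strict conditionalization falls out as the limiting case:

```python
# Jeffrey conditionalization (Probability Kinematics): when experience
# shifts the probability of an evidence partition {E, not-E} to new
# values rather than to certainty, a hypothesis H is updated by
#   P_new(H) = P(H|E) * q + P(H|not-E) * (1 - q),
# where q is the new probability of E.

def jeffrey_update(p_H_given_E, p_H_given_notE, q):
    return p_H_given_E * q + p_H_given_notE * (1 - q)

p_H_given_E, p_H_given_notE = 0.8, 0.2
prior_q = 0.5                       # old probability of E
p_H_old = jeffrey_update(p_H_given_E, p_H_given_notE, prior_q)

q = 0.9                             # experience raises P(E) to 0.9
p_H_new = jeffrey_update(p_H_given_E, p_H_given_notE, q)

print(p_H_old, p_H_new)  # H's probability rises with confidence in E

# Strict conditionalization is the limiting case q = 1:
assert jeffrey_update(p_H_given_E, p_H_given_notE, 1.0) == p_H_given_E
```

The accuracy question the abstract addresses is whether a scoring rule can be found on which this update rule comes out as the uniquely rational response to such partial shifts; the rule itself is just the weighted average above.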

14:00-14:30 Session 13: EJPS information session

We invite junior scholars to join us and discuss the process of publishing journal articles (from paper submission to peer-review, to the editorial handling of submissions, etc.).

14:30-15:00 Session 14A: BSPS Open Monographs information session

BSPS Open publishes Open Access philosophy of science monographs on the basis of merit alone, and not on an author’s ability to pay a fee. Find out about how to submit and get your manuscripts published. For more information check http://www.thebsps.org/bsps-open/

14:30-15:00 Session 14B: EJPS meet the editors

This session is an opportunity to meet the editorial team of EJPS. Our speakers are Federica Russo, Phyllis Illari, Mathias Frisch and Dunja Šešelja. Federica and Phyllis, who have been the Editors-in-Chief of EJPS for the last four years, will reflect on their editorial experience. Mathias and Dunja, the new Editors-in-Chief, will introduce the new editorial team and their vision for the journal in the coming years.

15:00-17:00 Session 15A: S:COVID-19 in the public sphere: conspiracies, exceptionalism and moral profiling
COVID-19 in the public sphere: conspiracies, exceptionalism and moral profiling (online)
PRESENTER: Lisa Bortolotti

ABSTRACT. During the COVID-19 pandemic there has been extensive public debate about the causes of the pandemic, the best ways in which governments and individual citizens could respond to it, and the most effective treatments available for it. There has also been a surge of misinformation affecting institutions and individuals: wild conspiracies about how the virus originated, dubious appeals to national character to explain why certain countries were justified in adopting different policies, and conflicting attributions of political and moral values to medical and societal responses to the virus. Such attitudes had a powerful and often costly impact on debates, decisions, and behavioural and health outcomes. In this symposium we look at some of these phenomena in detail, focusing on the mechanisms responsible for various forms of misinformation and on the values that shape our understanding of the world around us in a time of stress and uncertainty.

15:00-17:00 Session 15B: S: Particles, fields, or both?
Particles, Fields, or Both? (online)
PRESENTER: Charles Sebens

ABSTRACT. One of the primary tasks of philosophers of physics is to determine what our best physical theories tell us about the nature of reality. Our best theories of particle physics are quantum field theories. Are these theories of particles, fields, or both? Hubert will open the debate by examining the prospects for solving the problems of self-interaction that arise in classical electromagnetism for an ontology of point particles interacting with the electromagnetic field. Lazarovici will defend a pure particle ontology for quantum field theory based on Dirac's idea that particles are never truly created or destroyed, they just enter and exit a background infinite sea of particles. Sebens will defend a pure field ontology, arguing against a particle approach for photons and in favor of a field approach for electrons and positrons. Swanson will address the no-go theorems facing both particle and field ontologies, identifying three strategies for saving a field ontology.

15:00-17:00 Session 15C: S: Modeling the Nerve Impulse: Philosophy of Science Meets Scientific Practice
Modeling the Nerve Impulse: Philosophy of Science Meets Scientific Practice (online)

ABSTRACT. The Hodgkin-Huxley model is generally considered to provide a correct explanation of nervous transmission. Recently, however, alternative modeling strategies have been advanced that question basic assumptions of the Hodgkin-Huxley model, igniting a debate in neuroscience. This raises crucial philosophical questions: What does the coexistence of a multiplicity of models in this debate teach us about scientific modeling, and what are the implications for scientific practice? In this symposium, scientists and philosophers will discuss how philosophical reflection on modeling applies to this case study and what this current situation in neuroscience teaches us about the nature of scientific modeling more generally.

15:00-17:00 Session 15D: S: Philosophizing about the Unknown: Black Holes
Philosophizing about the Unknown: Black Holes (in person)
PRESENTER: Erik Curiel

ABSTRACT. In the past 50 years, black holes have moved to the center of theoretical physics and astronomy. Recently, they have moved to the center of philosophy of science as well, raising new problems and shedding new light on old ones, ranging from the nature of space, time and matter, to issues of observation and explanation. Possible answers are of interest not only to philosophers, but to scientists as well: they grapple with the same concerns, and eagerly engage with philosophers about them. Symposium speakers will cover a wide range of philosophical issues, showing how the idea of black holes ramifies throughout different fields of philosophy, illuminates connections among them, and offers immediate connections to physics. They will begin with a history of black holes as problematic objects of knowledge, and then go into more technical analysis of their various problems in theoretical and observational physics, discussing their philosophical implications.

15:00-17:00 Session 15E: S: Quantum Realism: Moving Forward in Neutral
Quantum Realism: Moving Forward in Neutral (in person)
PRESENTER: James Fraser

ABSTRACT. There are many radically different interpretations of quantum theory: Everettian, Bohmian, spontaneous collapse approaches, and so on. It is often thought that in order to be a scientific realist about quantum physics one must commit to one of these interpretations, despite the prolonged lack of consensus about which is best supported. In this symposium, we bring together philosophers of science who are exploring the possibility of maintaining a realist attitude towards quantum physics while remaining neutral on interpretation. It will be the first meeting dedicated to the prospects of remaining neutral in the face of underdetermination in the foundations of quantum theory and thus has the potential to play an important role in stimulating an emerging discussion in the philosophical literature.

15:00-17:00 Session 15F: S: Missing theory: case studies from medical and social science
Missing theory: case studies from medical and social science (online)

ABSTRACT. The social and medical sciences are mostly atheoretical. Extolling statistical and experimental methods, researchers in these fields favor ground-up empirical approaches. They determine stable patterns in the observations and rely on those in their explanations. The natural sciences, by contrast, have flourishing theoretical subdisciplines that impact how these sciences progress. What explains the atheoreticity of the social and medical sciences, and is it defensible? In our symposium we investigate the comparative lack of theoretical reflection in the medical and social sciences using different case studies. Through these case studies we assess the merits and defects brought on by the exclusion of theory within these fields. Our contributions will be critical but constructive: we believe that the medical and social sciences will benefit from theoretical reflection. Ultimately, the reappraisal of theory will lead these sciences to a more fruitful connection with their target domains.

17:15-18:30 Session 16: Keynote: Eva Jablonka - Progress in Evolutionary Biology? The EES Debate

Do the extensions and revisions suggested by the advocates of the Extended Evolutionary Synthesis (EES) – a current developmentally-oriented version of evolutionary theory that challenges the neo-Darwinian version that has been dominant since the 1950s – amount to progress in evolutionary biology? What, if anything, is the nature of this progress? I consider these questions within a framework that combines the systems biology approach of Conrad Waddington for investigating embryological development with the sociological approach of Ludwik Fleck for analyzing the development of scientific systems. I focus on the contribution of studies of epigenetic inheritance because the results stemming from this research program are seen as unimportant by followers of the neo-Darwinian version of evolutionary theory, while the same results are seen as crucial and progressive by biologists advocating the EES. This case therefore highlights the context-sensitive nature of assessments of scientific progress during periods of theory change and suggests that progress is relative to the delineation of the theoretical boundaries of the scientific system and the time scale that is chosen.