
Aspects formels et historiques du changement scientifique / Formal vs historical accounts of scientific change

For a number of years, logicians and philosophers of science have developed models of belief revision, both in the probabilistic and Bayesian tradition and in the qualitative, so-called "AGM" approach, and a large and expanding body of literature now exists, often associated with work on non-classical logics and computer science. Although many of these models bear on traditional issues in the philosophy and methodology of science - such as the nature of confirmation, scientific explanation, theory change, or truth - there has been too little contact between these formal and logical approaches and more historically oriented work in the philosophy of science. Recent work on belief revision systems, on Bayesian epistemology, and on topics such as incommensurability and truth approximation suggests that a dialogue between the two traditions can be renewed. The present workshop, which brings together researchers from Paris and from London, is part of the project "Connaissance, revision, preuve et changement scientifique" sponsored by the programme "Histoire des Savoirs" of the French CNRS, and intends to explore both the merits and the limits of these approaches. It could be the first step towards more regular cooperation between British, French, and other European researchers in these fields.

Contacts

Pascal Engel

- UFR de philosophie
- Université Paris IV Sorbonne
- 1 rue Victor Cousin
- 75005 Paris
- France
- Email: pascal.engel@paris4.sorbonne.fr

Stephan Hartmann
- Centre for Philosophy of Natural and Social Science
- London School of Economics and Political Science
- Houghton Street
- London WC2A 2AE
- U.K.
- Email: S.Hartmann@lse.ac.uk

See the web site


PROGRAM

Monday, December 13 2004

9.00-9.15 Pascal Engel & Stephan Hartmann: Introduction

9.15-10.30 John Worrall: Tom Kuhn Meets Tom Bayes II. Commentator: Jerome Dokic

Coffee Break

10.45-12.00 Alexander Bird: Should a Naturalized Epistemologist Prefer Historical or Formal Accounts of Theory-Change? Commentator: Pascal Engel

12.00-13.15 Stephan Hartmann: The Formalist and the Historicist Should be Friends. Commentator: Elie Zahar

Lunch

14.30-15.45 Léna Soler: A Symbiotic Model of Scientific Development? Commentator: Alain Boyer

15.45-17.00 Anouk Barberousse: Resistance to Belief Revision? The Specific Heat Case. Commentator: Roman Frigg

Coffee Break

17.15-18.30 Fabien Chareix: An Original View on the Birth of Calculus: Christian Huygens' Critiques on Infinitesimal Analysis. Commentator: Armond Duwell

Conference Dinner

Tuesday, December 14 2004

9.30-10.45 Philippe Mongin: Confirmation and the Law of Demand. Commentator: Amit Hagar

10.45-12.00 Hervé Zwirn: Abductive Logic in a Belief Revision Framework. Commentator: Paul Egré

Lunch

14.00-15.15 Luc Bovens: Beta Functions, Expected Utility Maximization and the Maximin Principle. Commentator: Denis Zwirn

15.15-16.30 Jacques Dubucs: Inductive Inertia: Good to Have It, but Not Too Much. Commentator: Daniel Andler


ABSTRACTS

Alexander Bird

Should a Naturalized Epistemologist Prefer Historical or Formal Accounts of Theory-Change?

Recent history and philosophy of science suggest that formal approaches to theory change should be attractive to scientific realists, while historical accounts tend to lend support to various kinds of anti-realism, including relativism. In this paper I argue that this perspective is misleading: formal approaches are insufficient to support scientific realism, while historical approaches can be supplemented by a naturalized epistemology to yield a satisfyingly realist account of theory change.


Anouk Barberousse

Resistance to Belief Revision? The Specific Heat Case

The aim of the paper is to construct a formalization of a historical case (the specific heat problem) that allows a discussion of the validity of various models of belief and theory revision. Before the quantum revolution, the specific heat problem, which jeopardized Newtonian mechanics, was dealt with by physicists in various ways. In the absence of any epistemic access to the internal structure of molecules other than by analogy and speculation, physicists had to tame complete uncertainty and find a way to bypass it. This case will be confronted with the formal notions of epistemic entrenchment, coherence, and plausibility.


Luc Bovens

Beta Functions, Expected Utility Maximization and the Maximin Principle

In risk analysis, the precautionary principle is held up as a counterweight to expected utility maximization. But clearly, a strict application of the adage that it is better to be safe than sorry is stifling: we do not make policy by focusing strictly on the worst outcomes and choosing the policy that yields the best worst outcome. Although the probability of the worst outcomes is not known with precision, we do estimate the risks and decide to accept certain risky prospects and not others. For Ellsberg, expected-utility maximization for decision-making under risk and the maximin solution for decision-making under uncertainty are two poles of a continuum; between these poles we have varying degrees of confidence in our probability assessment. Ellsberg modeled this continuum by introducing a measure of our degree of confidence in our probability assessment: he maximizes the sum of the expected utility, weighted by r, and the utility of the worst outcome, weighted by 1 - r. I argue that the same results can be obtained by means of expected utility maximization within a strictly Bayesian framework. Our degree of confidence in our probability assessment is represented by means of a Beta density function. By letting our probability assessment be the value of p at which the density function reaches its maximum, and by calculating the expected utility by means of the upper bound of a confidence interval, expected utility maximization enjoins us to be the more cautious, the lower our degree of confidence in our probability assessment. My approach has the following advantages: (i) it respects the intuition that we are more inclined to take account of the worst-case scenario when our degree of confidence in our probability assessment is low; (ii) it avoids the dogmatic stance of the precautionary principle; (iii) it does not bring in any machinery outside the Bayesian framework of expected-utility maximization.
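As a gloss on the two ingredients just described, here is one standard way of writing them down (the notation is supplied for illustration and is not taken from the paper). Ellsberg's weighted criterion, for an act a with expected utility E[u(a)] and worst-case utility over states s:

\[
V(a) \;=\; r\,\mathbb{E}[u(a)] \;+\; (1-r)\,\min_{s} u(a,s),
\qquad 0 \le r \le 1 .
\]

The Beta density over an unknown probability p, whose mode serves as the probability assessment, is

\[
f(p;\alpha,\beta) \;=\; \frac{p^{\alpha-1}(1-p)^{\beta-1}}{B(\alpha,\beta)},
\qquad
\hat{p} \;=\; \frac{\alpha-1}{\alpha+\beta-2} \quad (\alpha,\beta>1),
\]

where a larger \(\alpha+\beta\) gives a more sharply peaked density and thus corresponds to greater confidence in the assessment.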


Fabien Chareix

An Original View on the Birth of Calculus : Christian Huygens’ Critiques on Infinitesimal Analysis

The rise of new infinitesimal methods at the end of the seventeenth century accelerated the transformation of the Galilean style in natural philosophy. Histories of this turning point in physics usually treat it as a continuous and rather fast change in the way physicists dealt with coordinates, motions, accelerations, and their differential aspects. On this view, twenty years after Leibniz gave the rules of the infinitesimal calculus, followed by the Bernoullis, L'Hospital, and Varignon, the change was complete and gave birth to a rational or analytical mechanics, as if the end of the so-called 'geometrization' of nature had been a change without any discussion at all. Yet at the moment when this revolution took place, some physicists strongly rejected the new methods, even while using them informally, because nothing could be said about the legitimacy of infinitesimal quantities. Christiaan Huygens, in his correspondence with Leibniz and many mathematicians, was one of them. His attachment to geometrical methods led him to define precisely what, in his mind, natural philosophy meant. This attitude can be related to the way Huygens rejected Newton's theory of gravity as a purely mathematical, not physical, account of matter and motion. He thus offers an original view of internal resistance to theory change in those times of permanent scientific evolution.


Jacques Dubucs

Non-Bayesian Ingredients in Theory Change

Bayesianism proposes a smooth image of science as a converging process of changing degrees of belief by means of conditionalization on new information. Case studies in scientific theory change, as well as recent reflection in epistemology, suggest that this image is not correct. In particular, it does not fit the revision process by which we reframe our theories when new information does not cohere well with what we are prepared to accept or to take as premisses. The paper will try to provide a systematic clarification of this problem.


Stephan Hartmann

The Formalist and the Historicist Should be Friends

There are two main traditions in the philosophy of science. Formalists present general accounts of science and provide rational reconstructions of scientific theories and their dynamics. Historicists examine specific case studies, refrain from generalizations, and are typically not interested in the rationality of science. While formalist approaches tend to be "too far away" from real science to be relevant, it is impossible to address a number of philosophically interesting questions using a purely historicist approach. For example, many of the factors involved in scientific theory choice cannot be justified within an entirely historicist framework. In this talk, I will argue that formalist and historicist approaches can be reconciled in a way that takes the best of both worlds. More specifically, I will develop a Bayesian picture of scientific theorizing that bridges the gap between a (moderately) historicist approach and a (moderately) formalist approach, and show how the problem of scientific theory choice can be addressed within this picture.
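For reference, the conditionalization rule that both of the preceding abstracts invoke can be stated as follows (a standard textbook formulation, supplied here for illustration): upon learning evidence E, the new degree of belief in a hypothesis A is

\[
P_{\text{new}}(A) \;=\; P(A \mid E) \;=\; \frac{P(E \mid A)\,P(A)}{P(E)},
\qquad P(E) > 0 .
\]

The "smooth image" of theory change that the formal tradition starts from is the iteration of this update as evidence accumulates.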


Philippe Mongin

Confirmation and the Law of Demand

It is a classic question in the philosophy of science whether confirmation follows the "descending" direction of logical inference, or the opposite "ascending" direction, or both. On the one hand, Hempel and Glymour argue that confirmation should follow the "descending" direction, and by no means the "ascending" one. On the other hand, Popper's concept of confirmation, i.e., corroboration, is more obviously compatible with confirmation following an "ascending" than a "descending" direction. The point of this paper is to illuminate this contrast by means of a case study in recent theoretical economics. Hildenbrand's Market Demand (1994) is an original attempt to derive the law of demand at the aggregate level while allowing for individual consumer decisions that may not conform to the individual law of demand. Hildenbrand elaborated his mathematical theory with an eye on future testing, and carried out actual tests which, he says, are not unfavourable to his aggregate law. The paper will show that, despite Hildenbrand's official falsificationism, his concept of confirmation is more in accord with Hempel's than with Popper's, and it will try to assess the good sense of the "descending" and "ascending" analyses of confirmation in terms of the example at hand.


Léna Soler

A Symbiotic Model of Scientific Development?

Many recent history-oriented studies of science have explained scientific bifurcations - and thus, as a particular case, changes of belief - in terms of what may be called a 'symbiotic model of scientific development'. I will examine these models especially in reference to Pickering's and Hacking's analyses. Symbiotic models assume that the emergence of stable developmental stages of science results from the co-stabilization of various types of mutually supporting factors. Such models explicitly or implicitly rely on an enlarged notion of coherence (enlarged so as to include non-propositional factors such as actions, materials, marks, etc.). Just which factors may be, or are in fact, constitutive of a scientific symbiosis is a matter of discussion, but at least some of these factors are theoretical and experimental items. These accounts of scientific development often go together with a more (Pickering) or less (Hacking) strong commitment to contingentism. The aim of the paper is (a) to sketch the general structure of the symbiotic explanatory scheme; (b) to examine the relations between this scheme and contingentism; (c) to reflect on its relevance, explanatory power, and intrinsic methodological difficulties.


John Worrall

Tom Kuhn Meets Tom Bayes II

It has been suggested (e.g. by Wes Salmon) that much of Kuhn's treatment of the historical details of 'scientific revolutions', and of so-called 'elderly holdouts' in particular, far from showing that such revolutions cannot be seen as 'rational' events, can in fact be straightforwardly reconciled with the Bayesian account of scientific reasoning. A detailed investigation of the particular case of David Brewster (a 'holdout' against Fresnel's wave theory of light) reveals two sorts of features: (i) aspects of the case which can be given Bayesian explications but which in fact underline clear weaknesses in the Bayesian account, and (ii) aspects of the case which cannot be given Bayesian explications but which are crucial to the rationality issue. This does not imply, of course, that no 'formal' account can be given of the rationality of particular theory-changes in the history of science, but only that this 'formal' account will not be (unsupplemented) personal Bayesianism.


Hervé Zwirn

Abductive Logic in a Belief Revision Framework

Abduction was first introduced in the epistemological context of scientific discovery. It has more recently been analyzed in Artificial Intelligence, especially with respect to diagnosis or ordinary reasoning. These two fields share a common view of abduction as a general process of hypothesis formation. More precisely, abduction is conceived as a kind of reverse explanation, where a hypothesis H can be abduced from events E if H is a "good explanation" of E. The paper surveys four known schemes for abduction that can be used in both fields. Its first contribution is a taxonomy of these schemes according to a common semantic framework based on belief revision. Its second contribution is to produce, for each scheme, a representation theorem linking its semantic framework to a set of postulates. Its third contribution is to present semantic and axiomatic arguments in favour of one of these schemes, "ordered abduction", which has never before been vindicated in the literature.
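To fix intuitions, one common way of making the 'reverse explanation' idea precise in belief-revision terms is the following (an illustrative rendering only; the paper's four schemes are not reproduced here). Given a belief set K and an AGM revision operator *, a hypothesis H counts as an abductive explanation of evidence E when revising by H yields belief in E, consistently:

\[
H \text{ is abducible from } E \text{ (given } K\text{)}
\quad\text{iff}\quad
E \in K \ast H
\quad\text{and}\quad
K \ast H \nvdash \bot .
\]

Stronger or weaker schemes can then be obtained by varying this condition, for instance by requiring H to be consistent with K or by ranking candidate hypotheses by plausibility.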

