Inventing Temperature


by Hasok Chang



  operational image and the actual operations, but we hope for a gradual convergence between them. Such convergence would be a considerable achievement, especially if it could be attained with a high degree of quantitative precision.

  This convergence provides a basis for a workable notion of accuracy. We can say that we have an accurate method of measurement if we have good convergence. How about truth? Can we ever say whether we have obtained the true values of an abstract concept like absolute temperature? The question of truth only makes sense if there is an objectively determinate value of the concept in each physical situation.62 If we have a convergent operationalization, we can consider the limits of convergence as the "real" values; then we can use these values as the criteria by which we judge whether other proposed operationalizations produce true values. But we must keep firmly in mind that the existence of such "real values" hinges on the success of the iterative procedure, and that the successful operationalization is constitutive of the "reality." If we want to please ourselves by saying that we can approach true values by iterative operationalization, we also have to remember that this truth is a destination that is only created by the approach itself.
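  The idea that the limit of convergence is itself a product of the procedure can be given a minimal numerical face. The following sketch is my own toy illustration, not anything from Chang's text: the three "instruments" and their distortion functions are entirely hypothetical. Each round, every instrument is recalibrated halfway toward the current consensus (the mean of all readings), and the mutual disagreement shrinks geometrically.

```python
import numpy as np

# Toy illustration (all instruments and distortion functions are hypothetical):
# three imperfect "thermometers" report distorted versions of an underlying
# state x. Each round, every instrument is recalibrated halfway toward the
# current consensus (the mean of all readings).

x = np.linspace(0.0, 100.0, 201)            # underlying states being probed

readings = [
    x + 0.8 * np.sin(x / 15.0),             # instrument A: wavy distortion
    1.01 * x - 0.5,                         # instrument B: scale and offset error
    x + 0.002 * (x - 50.0) ** 2 - 5.0,      # instrument C: quadratic distortion
]

for step in range(10):
    consensus = np.mean(readings, axis=0)   # provisional "real" values
    # Move each instrument partway toward the consensus, mimicking a
    # cautious iterative recalibration of its scale.
    readings = [r + 0.5 * (consensus - r) for r in readings]
    spread = max(float(np.max(np.abs(r - consensus))) for r in readings)
    print(f"round {step}: max disagreement = {spread:.6f}")
```

  The point of the sketch is purely structural: nothing in the loop refers back to the underlying x, so the "truth" the instruments approach is constituted by the mutual adjustment itself, exactly as the paragraph above suggests.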

  Theoretical Temperature without Thermodynamics?

  The foregoing analysis points to an interesting way in which the understanding and measurement of temperature could be tidied up and possibly refined further. The suggestion arises most of all from acknowledging a clear defect in my own analysis. That defect was the postulation of a sharp dichotomy between the abstract and the concrete. The discomfort in my discussion created by that oversimplification may have already been evident to the reader. The dichotomy between the abstract and the concrete has been enormously helpful in clarifying my thinking at the earlier stages, but I can now afford to be more sophisticated. What we really have is a continuum, or at least a stepwise sequence, between the most abstract and the most concrete. This means that the operationalization of a very abstract concept can proceed step by step, and so can the building-up of a concept from concrete operations. And it may be positively beneficial to move only a little bit at a time up and down the ladder of abstraction. Thomson was much too ambitious in trying to connect up a resolutely abstract concept of temperature with the most concrete measurement operations all at once. In the end he had to rely on smaller steps, and a brief review of his steps may suggest some ways of progressing in small steps more deliberately.

  The Joule-Thomson work on gases and gas thermometers suggests that it would be possible to postulate a slightly abstract concept of temperature that is defined by the behavior of the ideal gas, with no reference to heat engines or any general thermodynamic theory. As was well known long before Thomson, most gases not too near their temperature of liquefaction have similar patterns of thermal expansion. It would be a modest leap from that empirical observation to postulate a theoretical concept of temperature (let's call it "ideal-gas temperature") that fits into the following story: "the behavior of a gas is governed by temperature according to a simple law; of course, there are some complications because we do not have a completely accurate way of measuring real temperature, and also the different gases are all slightly imperfect in their own various ways; but the ideal gas would expand with real temperature in a perfectly linear fashion." If we want to believe this pleasing theoretical story, we have to fill in some blanks successfully. We have to come up with some independent way of detecting and measuring these various imperfections in actual gases, in the hope that this measure will fit in with their deviations from linear expansion with temperature. This is not so far from what Thomson, Joule, and their followers actually did, picking up on the concept of Amontons temperature. The Joule-Thomson cooling effect is just such a property that we can conceive as being tightly correlated with deviations from regular expansion. To get that far, we do not need a highly theoretical story of the kind that Thomson sought to provide about why the Joule-Thomson cooling should be related to peculiarities in the expansion of gases. Such a highly theoretical story belongs in later steps going up the ladder of abstraction and does not need to constrain the construction of the concept of ideal-gas temperature.

  62. It may seem odd to speak of the reality of temperature, which is a property and not an object. However, there are good arguments to the effect that it makes more sense to address the realism question to properties than to entities; see Humphreys 2004, sec. 15.
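  The deviations just described can be made concrete with a rough numerical sketch. This is my own illustration, not Chang's: the van der Waals equation stands in for a "slightly imperfect" real gas (with textbook constants for CO2), and the sketch shows the small systematic error of a constant-volume gas thermometer read under the ideal-gas assumption. It is exactly this kind of residual error that would have to be characterized independently, for instance through Joule-Thomson measurements.

```python
R = 8.314               # gas constant, J/(mol K)
a, b = 0.364, 4.27e-5   # van der Waals constants for CO2, SI units

def pressure_vdw(T, v):
    """Pressure of a van der Waals gas at temperature T and molar volume v."""
    return R * T / (v - b) - a / v**2

v = 0.024               # molar volume in m^3/mol, roughly ambient conditions

for T_true in (250.0, 300.0, 400.0, 600.0):
    p = pressure_vdw(T_true, v)          # what the real gas actually does
    T_ideal = p * v / R                  # "ideal-gas temperature" the instrument reports
    print(f"true T = {T_true:6.1f} K -> instrument reads {T_ideal:7.2f} K "
          f"(error {T_ideal - T_true:+.2f} K)")
```

  Running this shows errors of roughly a kelvin or two near ambient conditions, shrinking at higher temperatures where the gas behaves more ideally; the numbers are only illustrative, but the pattern mirrors the story told in the paragraph above.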

  In fact, in historical retrospect, it is not at all clear that the theory of heat engines was the best place to anchor the theoretical temperature concept. The use of the Carnot cycle in the definition of temperature was quite unnecessary. It was certainly helpful initially for the purpose of getting Thomson to see how an abstract concept of temperature might be constructed, but he could have kicked that ladder away. The definition of 1854 was still overly restrictive, since it remained tied to heat input and output in isothermal processes occurring within a reversible cycle. Even the most liberalized concept of temperature based on the Carnot cycle would be restrictive, because such a concept cannot easily be adapted to irreversible or noncyclical processes. Bypassing the Carnot cycle, and even all of classical thermodynamics, ideal-gas temperature can be linked up with the kinetic theory of gases very naturally. After that, it is of course plausible to think about incorporating the kinetic conception of temperature into an even more general and abstract theoretical framework, such as classical or quantum statistical mechanics. I am not saying that Thomson around 1850 should have or could have foreseen the development of the modern kinetic theory and statistical mechanics. What I think he could have perceived is that the notion of temperature applicable to the expansion of gases did not have to be housed in a theory of heat engines.
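  For reference, the 1854 definition that this paragraph calls restrictive can be stated in a single relation (my gloss, in modern notation): absolute temperatures are assigned so that the heats exchanged at the two isothermal stages of a reversible Carnot cycle stand in the same ratio as the temperatures themselves,

```latex
\frac{Q_1}{Q_2} = \frac{T_1}{T_2}
```

  where Q1 is the heat absorbed at the hotter isotherm at temperature T1 and Q2 is the heat rejected at the colder isotherm at temperature T2, both counted positively. The restriction the text points to is visible in the relation itself: it assigns temperatures only by way of reversible, cyclical processes.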

  In "The Defense of Fixity" in chapter 1, I have noted the robustness of low-level laws; for those laws to be meaningful and useful, the concepts occurring in them need to be well defined at that semi-concrete level and well connected to adjacent levels of abstraction. Something like the concept of ideal-gas temperature is at the right level of abstraction to support the phenomenological physics of gases. Other theories at that same phenomenological level (e.g., those governing conduction and radiation) may well require different semi-concrete concepts of temperature. The various semi-concrete concepts may become united by more abstract theories. Such concept-building processes moving toward greater abstraction should be harmonized with operationalization processes that start with the most abstract concepts and concretize them step by step. Such a well-ordered two-way process of abstraction and concretization would enable us to build a conceptual-operational system that can evolve with maximum flexibility and minimum disruption in the face of change and new discoveries.


  5. Measurement, Justification, and Scientific Progress

  Abstract: This chapter argues that attempts to justify measurement methods reveal a circularity inherent in empiricist foundationalism. The only productive way of dealing with this circularity is to accept it and admit that justification in empirical science must be coherentist. Within such coherentism, epistemic iteration provides an effective method of scientific progress, resulting in the enrichment and self-correction of the initially affirmed system.

  Keywords: measurement methods, justification, foundationalism, coherentism


  Washing dishes and language can in some respects be compared. We have dirty dishwater and dirty towels and nevertheless finally succeed in getting the plates and glasses clean. Likewise, we have unclear terms and a logic limited in an unknown way in its field of application—but nevertheless we succeed in using it to bring clearness to our understanding of nature.

  Niels Bohr (1933), quoted by Werner Heisenberg, in Physics and Beyond (1971)

  The preceding chapters have examined the development of thermometry, concentrating on the justification of standards and assumptions. These stories of measurement bring out a particular mode of scientific progress, which I will now try to articulate briefly and systematically, building on various insights expressed earlier. I did not set out to advance any overarching epistemological doctrines in this book. However, it would be dishonest for me to hide the image of a particular mode of scientific progress that has emerged in the course of considering the concrete episodes. As I will elaborate further in "The Abstract and the Concrete," the content of this chapter should not be taken as a generalization from the preceding chapters, but as the articulation of an abstract framework that was necessitated by the construction of the concrete narratives. Nor should this chapter be regarded as the summary of all of the epistemologically significant points made in earlier chapters. Here I will pull together only the ideas and arguments that can be strengthened and deepened through the synthesis; the rest can stand as they were developed in earlier chapters.

  The overall argument of this chapter can be summarized as follows. In making attempts to justify measurement methods, we discover the circularity inherent in empiricist foundationalism. The only productive way of dealing with that circularity is to accept it and admit that justification in empirical science has to be coherentist. Within such coherentism, epistemic iteration provides an effective method of scientific progress, resulting in the enrichment and self-correction of the initially affirmed system. This mode of scientific progress embraces both conservatism and pluralism at once.


  Measurement, Circularity, and Coherentism

  In his famous discussion of the difficulties of the empirical testing of scientific theories, Pierre Duhem made the curious statement that "the experimental testing of a theory does not have the same logical simplicity in physics as in physiology" ([1906] 1962, sec. 2.6.1, 180-183). The physiologists can make their observations by means of laboratory instruments that are based on the theories of physics, which they take for granted. However, in testing the theories of physics, "it is impossible to leave outside the laboratory door the theory we wish to test." The physicists are forced to test the theories of physics on the basis of the theories of physics. Among physicists, those who are involved in the testing of complicated and advanced theories by means of elementary observations would be in a relatively straightforward epistemic position, much like Duhem's physiologists. But for those who try to justify the reasoning that justifies the elementary observations themselves, it is very difficult to escape circularity. The basic problem is clear: empirical science requires observations based on theories, but empiricist philosophy demands that those theories should be justified by observations. And it is in the context of quantitative measurement, where the justification needs to be made most precisely, that the problem of circularity emerges with utmost and unequivocal clarity.

  In each of the preceding chapters I examined how this circularity of justification manifested itself in a particular episode in the development of thermometry, and how it was dealt with. Chapter 1 asked how certain phenomena could have been judged to be constant in temperature, when no standards of constancy had been established previously. The answer was found within the self-improving spiral of quantification—starting with sensations, going through ordinal thermoscopes, and finally arriving at numerical thermometers. Chapter 2 asked how thermometers relying on certain empirical regularities could be tested for correctness, when those regularities themselves would have needed to be tested with the help of thermometer readings. The answer was that thermometers could be tested by the criterion of comparability, even if we could not verify their theoretical justification. Chapter 3 asked how extensions of the established thermometric scale could be evaluated, when there were no pre-existing standards to be used in the new domains. The answer was that the temperature concept in a new domain was partly built through the establishment of a convergence among various proposed measurement methods applying there. Chapter 4 asked how methods of measuring abstract concepts of temperature could be tested, when establishing the correspondence between the abstract concept and the physical operations relied on some theory that would itself have required empirical verification using results of temperature measurement. The answer was found in the iterative investigation based on the provisional assumption of an unjustified hypothesis, leading to a correction of that initial hypothesis.
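  The iterative pattern behind chapter 4's answer can be shown in miniature. In the following sketch, which is again my own toy with an invented error function, an instrument's systematic error depends on the very temperature we are trying to determine; since we cannot evaluate the correction at the unknown true value, we evaluate it at our current best estimate, starting from the provisional and strictly unjustified assumption that the raw reading is correct:

```python
def systematic_error(T):
    # Hypothetical instrument error as a function of the true temperature;
    # a stand-in for, e.g., a real gas's departure from ideal behavior.
    return 0.03 * T - 1.5

def measure(T_true):
    # What the instrument actually reports.
    return T_true + systematic_error(T_true)

reading = measure(320.0)    # raw reading from the instrument
estimate = reading          # provisional assumption: take the reading at face value
for step in range(6):
    # Correct using the error evaluated at our current estimate, since the
    # true temperature (where the error "really" lives) is unknown.
    estimate = reading - systematic_error(estimate)
    print(f"step {step}: estimate = {estimate:.4f} K")
```

  The initial hypothesis is false, but each round uses the system's own output to correct it, and the estimates converge rapidly; this is the enrichment and self-correction of the initially affirmed system that the chapter's summary describes.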

  In each of my episodes, justification was only found in the coherence of elements that lack ultimate justification in themselves. Each episode is an embodiment of the basic limitation of empiricist foundationalism. I take as the definition of foundationalism the following statement by Richard Foley (1998, 158-159): "According to foundationalists, epistemic justification has a hierarchical structure. Some beliefs are self-justifying and as such constitute one's evidence base. Others are justified only if they are appropriately supported by these basic beliefs." The main difficulty in the foundationalist project is actually finding such self-justifying beliefs. There have been great debates on that matter, but I think most commentators would agree that any set of propositions that seems self-justifying tends not to be informative enough to teach us much about nature. Formal logic and mathematics are cases in point. In the realm of experience, the theory-ladenness of language and observation forces us to acknowledge that only unarticulated immediate experience can be self-justifying. And as Moritz Schlick conceded, such immediate experience (which he called "affirmations") cannot be used as a basis on which to build systems of scientific knowledge ([1930] 1979, 382): "Upon affirmations no logically tenable structure can be erected, for they are already gone at the moment building begins."1

  Faced with this difficulty of foundationalist justification, we could try to escape by giving up on the business of empirical justification altogether. However, I do not think that is a palatable option. In the context of physical measurements, I can see only two ways to avoid the issue of justification. First, we could adopt a simplistic type of conventionalism, in which we just decide to measure quantity Q by method M. Then M is the correct method, by fiat. One might think that something like the meter stick (or the standard kilogram, etc.), chosen by a committee, embodies such a conventionalist strategy. But that would ignore all the considerations that went into selecting the particular platinum alloy for the meter stick, which led to the conclusion that it was the most robust and least changeable of all the available materials. Simplistic conventionalism comes with an unacceptable degree of arbitrariness. And going over to a more sophisticated form of conventionalism brings back the necessity for justification; in Henri Poincaré's conventionalism, for example, one must justify one's judgment about which definitions lead to the simplest system of laws.

  The second method of eliminating the question of justification is the kind of extreme operationalism that I examined critically in "Beyond Bridgman" in chapter 3, according to which every measurement method is automatically correct because it defines its own concept. The problem with that solution is that it becomes mired in endless specifications that would actually prevent measurement altogether, because any variation whatsoever in the operation would define a new concept. To get a flavor of that problem, consider the following passage from Bridgman: "So much for the length of a stationary object, which is complicated enough. Now suppose we have to measure a moving street car. The simplest, and what we may call the 'naïve' procedure, is to board the car with our meter stick and repeat the operations we would apply to a stationary body. … But here there may be new questions of detail. How shall we jump on to the car with our stick in hand? Shall we run and jump on from behind, or shall we let it pick us up in front? Or perhaps does now the material of which the stick is composed make a difference, although previously it did not?" (Bridgman 1927, 11; emphasis added)

  1. Nonetheless, in the "protocol sentence debate" within the Vienna Circle, Schlick remained a foundationalist in opposition to Neurath, arguing that affirmations still served as the foundation of the empirical testing of knowledge.

  This kind of worrying is very effective as a warning against complacency, but it is not conducive to workable practice. As it is impossible to specify all of the potentially relevant circumstances of a measurement, it is necessary to adopt and justify a generally characterized procedure.

  Therefore, we have no choice but to continue seeking justification, despite the absence of any obvious self-justifying foundations. Such perseverance can only lead us to coherentism. I use the word in the following sense, once again adopting the formulation by Foley (1998, 157): "Coherentists deny that any beliefs are self-justifying and propose instead that beliefs are justified in so far as they belong to a system of beliefs that are mutually supportive." We have seen coherentism in positive action in each of the preceding chapters. The simplest cases were seen in chapter 2, where Regnault employed the criterion of "comparability" to rule out certain thermometers as candidates for indicators of real temperature, and in chapter 3, where the "mutual grounding" of various measurement methods served as a strategy for extending the concept of temperature into far-out domains. In the remainder of this chapter I wish to articulate a particular version of coherentism that can serve as a productive framework for understanding scientific progress. (However, that articulation in itself does not constitute a vindication of coherentism over foundationalism.)

  Before moving on to the articulation of a progressive coherentism, it is interesting to note that the chief foundationalist metaphor of erecting a building on firm ground actually points to coherentism, if the physical situation in the metaphor is understood correctly. The usual foundationalist understanding of the building metaphor is as outdated as flat-earth cosmology. There was allegedly a certain ancient mythological picture of the universe in which a flat earth rested on the back of very large elephants and the elephants stood on the back of a gigantic turtle. But what does the turtle stand on? We can see that the question of what stands at the very foundation is misplaced if we think about the actual shape of the earth on which we rest. We build structures outward on the round earth, not upward on a flat earth. We build on the earth because we happen to live on it, not because the earth is fundamental or secure in some ultimate sense, nor because the earth itself rests on anything else that is ultimately firm. The ground itself is not grounded. The earth serves as a foundation simply because it is a large, solid, and dense body that coheres within itself and attracts other objects to it. In science, too, we build structures around what we are first given, and that does not require the starting points to be absolutely secure. On reflection, the irony is obvious: foundationalists have been sitting on the perfect metaphor for coherentism!

 
