Maxwell wrote of that medium:
[I]t is well able to do all that is required of it, whether we give it nothing to do but to transmit light and heat or whether we make it the machinery of magnetism and electricity also and at last assign gravitation itself to its power.
Given Maxwell’s discovery of the nature of light, it’s no surprise that he wanted to bring electricity and magnetism into the mix, but the idea of gravitation also being involved may seem a little odd. However, it fitted well with ideas that had been circulating ever since Newton’s day.
Newton famously claimed not to have a hypothesis for how gravity worked at a distance, writing in his masterpiece, the Principia, ‘hypotheses non fingo’, usually translated as ‘I frame no hypotheses’.* This wasn’t true. As a great supporter of particle-based theories – thinking, for example, that light was a collection of particles or ‘corpuscles’ – Newton did have a particle-based theory of the mechanism of gravity, variants of which would be developed up to the time that Einstein put gravity on a sound mathematical footing in 1915.
As we have seen, the idea based on invisible particles flowing through space and pushing on massive bodies was simple and attractive,† but it had one major flaw, which various scientists over the years would attempt to overcome with what were ultimately fudges. In its simple form, the theory predicts that the gravitational force will be linked to the size of the attracting body. While size is usually a factor, it’s only because big things tend to be more massive. Newton had shown it was the mass of the bodies involved that determined their gravitational attraction, not their size. Explanations of gravitation based on the particle pressure theory had to be modified to account for this.
What Maxwell seems to have had in mind in his letter to Bond was a similar mechanism for gravity, but one that depended on pressure in the ether. As he put it:
If we could understand how the presence of a dense body could produce a linear pressure radiating out in straight lines from the body and keep up this kind of pressure continually, then gravitation would be explained on mechanical principles and the attraction of two bodies would be the consequence of the repulsive action of the lines of pressure in the medium.
He drew an image showing the Sun emitting lines which then hit a body and curve around it in parabolic shapes, and went on to speculate that the comet’s tail is a result of these pressure lines, pushing away from the Sun. But he couldn’t explain why the lines of force would be visible as a tail (we now know the tail is gas and dust, vaporised from the comet), asking:
Is there anything about a comet to render its lines of force visible? and not those of a planet which are much stronger? I think that visible lines of gravitating force are extremely improbable, but I never saw anything so like them as some tails of comets.
Maxwell’s ideas here may not have been fruitful, but they demonstrate the breadth of his thinking.
The viscosity engine
Rather than immediately refining his model for electromagnetism after its initial triumph, Maxwell returned to his old sparring partner, the nature of a gas, looking specifically at a property of fluids known as viscosity – a measure of the liquid’s (or gas’s) resistance to shearing forces – effectively how thick and gloppy the material is.
At the time, it was thought that the viscosity of a gas varied as the square root of the temperature – if the temperature quadrupled, for example, the viscosity would double. This would not be the temperature as we measure it for domestic use, from the arbitrary starting point of the freezing point of water, but the temperature measured from the coldest possible point, absolute zero, which is around –273.15°C (–459.67°F). The concept of absolute zero had been around since the eighteenth century, but Maxwell had an appropriate scale to use thanks to his friend William Thomson (later ennobled as Lord Kelvin), who in 1848 devised the Kelvin scale, which starts at absolute zero.
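In modern notation (ours, not Maxwell’s), the assumed law and the arithmetic behind it look like this, writing η for viscosity and T for the absolute temperature:

\eta \propto \sqrt{T}, \qquad T\,[\mathrm{K}] \approx t\,[^{\circ}\mathrm{C}] + 273

so that heating a gas from 20°C (about 293 K) to roughly 900°C (about 1,170 K) quadruples its absolute temperature and, on this square-root law, should double its viscosity, since \sqrt{4} = 2.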
However, Maxwell, who always seemed to enjoy bridging the gap between experiment and theory, undertook a series of experiments to establish how viscosity really did behave with temperature. His schedule at King’s College left him plenty of time for experimental work, but unlike a modern physics professor he did not have access to a laboratory at the university, and had to perform his experiments in the attic of his house.
Maxwell’s main experimental device for investigating the viscosity of gases consisted of a series of discs along a vertical wire spindle, alternating fixed discs with discs that turned together as the wire twisted. The discs were contained inside a glass chamber, which meant that Maxwell could alter the pressure with an air pump, or change the contents to a different gas to discover the impact on the viscosity. This was no table-top apparatus – it stood above Maxwell’s height on the attic floor (see Figure 6). The discs were 10.56 inches in diameter and the wire was four feet long. Maxwell started the discs moving using magnets outside the case, sliding them from side to side until the discs were twisting back and forth on the wire. The resultant oscillation was slow – Maxwell notes that ‘the period of a complete oscillation was 72 seconds and the maximum velocity of the edge of the disks was about 1/12 inch per second’.
When the discs were oscillating, the resistance of the air between the turning discs and the nearby fixed discs (having fixed discs also minimised the impact of draughts) caused a dragging effect that enabled Maxwell to estimate the viscosity of the gas. Despite a dangerous-sounding setback – his glass chamber imploded when he reduced the pressure too much and it took him a month to get the apparatus working again – he soon had solid results, which he presented to the Royal Society in November 1865.
Because the whole point of the experiment was to see how viscosity varied with temperature, Maxwell and Katherine had to make extreme efforts to vary the temperature in the London attic. The near-contemporary biography by Campbell and Garnett notes that:
For some days a large fire was kept up in the room, though it was in the midst of very hot weather. Kettles were kept on the fire and large quantities of steam allowed to flow into the room. Mrs Maxwell acted as stoker, which was very exhausting work when maintained for several consecutive hours. After this the room was kept cool, for subsequent experiments, by the employment of a considerable amount of ice.
FIG. 6. Maxwell’s viscosity apparatus with the discs enclosed in the glass container like an inverted bell jar.
The findings from this experiment proved a challenge to Maxwell’s own theory of gases. The experiment did confirm his surprising earlier discovery that viscosity was independent of pressure. But his theoretical work had fitted with the previously assumed square-root relationship between temperature and viscosity, while the experiments clearly showed that the viscosity was actually directly proportional to the temperature: you would only have to double the temperature to double the viscosity. Despite this, Maxwell’s results did plenty for his reputation as an experimental physicist pushing the boundaries of knowledge. For the moment, though, he would be distracted from taking the work any further.
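Put in the same modern notation as before (a simplification of ours, not anything Maxwell wrote down), the clash between theory and experiment was:

\text{predicted: } \eta \propto \sqrt{T} \qquad\qquad \text{measured: } \eta \propto T

so heating a gas from 293 K to 586 K should, on the square-root law, have increased its viscosity by a factor of \sqrt{2} \approx 1.4, whereas the measurements pointed to a factor of 2.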
Stereoscopes and coffins
It’s no great surprise that the distraction came from his old favourite topics of light and colour vision, but in his London home Maxwell turned this into a mix of research and parlour entertainment. In Victorian homes like Maxwell’s it was not uncommon for visitors to be presented with a diversion that consisted of a new piece of technology. In the 1860s, many a parlour would be considered incomplete without a stereoscope, as essential a piece of middle-class home technology as a computer today.
A form of stereoscope – which combined two pictures portrayed as if seen from the positions of the two eyes to produce a 3D image – had been invented in the 1830s by another King’s College professor, Charles Wheatstone.‡ Maxwell was certainly aware of this as early as 1849, while at Edinburgh University, as he wrote to Lewis Campbell about ‘Wheatstone’s Stereoscope’ and how Sir David Brewster had ‘exhibited at the Scottish Society of Arts Calotype§ pictures of the statue of Ariadne and the beast seen from two stations’, which Maxwell comments ‘when viewed properly, appeared very solid’. By the 1860s the combination of readily available stereoscopic photographs and a far less complex optical device to view them meant that stereoscopes were all the rage.
Maxwell himself later devised a significant improvement on the standard stereoscope, though it never took off commercially as it was both larger and more expensive than the traditional form. The parlour stereoscope consisted of a frame holding a pair of photographs (or drawn images) and a pair of lenses through which the viewer looked. This was essentially the same approach as used in the View-Master toy, popular from the 1940s and reaching a zenith in the 1960s, although the View-Master had seven pairs of images on a disc, rotated by pulling a trigger on the side of the viewer. The result of looking through the stereoscope was that the images were combined in the viewer’s brain, producing a virtual 3D image.
The stereoscope was effective, but limited. Some people had trouble viewing the images, the experience was individual – only one person could look through the lenses at a time – and the virtual image was quite distant from the viewer. In 1867, Maxwell developed a ‘real image’ stereoscope, which used the standard pair of photographs and dual lenses, but then had another, single large lens in front. The viewer (or viewers, if they were close enough together) looked at this lens from a distance of a couple of feet and saw the 3D image floating in space just behind the large lens. Maxwell had a device assembled by the instrument makers Elliott Brothers, and gave a paper on it to the British Association in September 1867. He would later use it to demonstrate 3D images of curved surfaces and mathematical knots. Topology and knot theory were recreational activities for Maxwell for a number of years, over and above their bearing on his more mainstream work.
However, in the Maxwells’ home, visitors could expect a more unusual experience. They would be taken up to the attic to have an encounter with ‘the coffin’. This was Maxwell’s latest light box for mixing red, green and blue light to produce a whole spectrum of possible colours, one at a time. The eight-foot-long box had caused some confusion to Maxwell’s neighbours when it was delivered as it did resemble a straight coffin. For the visitor this would be simply a new and exciting experience, but for Maxwell it was an opportunity to collect data on the way a range of individuals – both normal sighted and colour blind – perceived different colours. For several years, around 200 visitors a year were subjected to the coffin in the attic.
A standard for resistance
A less entertaining but more practical diversion came over the matter of electrical units – the units used to measure, for example, electrical current or resistance – which were becoming increasingly important as electrical engineering and particularly telegraphy took off. It’s hard to appreciate now what a fundamental breakthrough telegraphy provided in speed of communication. Two of Maxwell’s friends, William Thomson and Henry Fleeming Jenkin, were involved in the biggest telegraphy project of the day, the transatlantic cable, and there was considerable concern that resistance in the cable would render the project unusable. Thomson, working with Faraday, had shown that if the resistance of the cable were too high, it could take as long as four seconds to send a single character.
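The reasoning behind this worry – usually credited to Thomson and remembered as the ‘law of squares’ – can be sketched roughly as follows (the symbols are ours, a simplification rather than Thomson’s own working):

\tau \;\sim\; RC \;=\; (r\,l)(c\,l) \;=\; r\,c\,l^{2}

where r and c are the cable’s resistance and capacitance per unit length and l is its length, so the retardation τ of a signal grows with the square of the cable’s length – which is why a cable thousands of kilometres long was so much more troublesome than a short one.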
With such a major technology at the mercy of a topic that had not been precisely studied, getting a better measure of resistance had far more practical application than just establishing a common unit. The British Association for the Advancement of Science saw Maxwell, with his expertise in electromagnetism, as the ideal person to be involved. The BA had set up a committee to study the requirements for standard units at the Manchester meeting in 1861 and Maxwell would play a major role in putting together the report to the Newcastle meeting in 1863.
Historically, units had been derived locally and this caused considerable confusion in international communication. Each country had its own definitions of units such as length or weight, which made it difficult to be sure what a measurement really indicated. This was nothing new. In his book The Sand Reckoner, the Ancient Greek mathematician and engineer Archimedes gave a size for the universe using the measure of ‘stades’ – multiples of the length of the running track at a stadium. A stadion (plural stades) was supposed to be 600 feet, but each city had its own definition of a foot. This means we can’t know for sure what Archimedes intended, with a stadion being anything between about 150 and 200 metres. The BA felt that with electrical science becoming inherently international due to the undersea cables, this kind of uncertainty could not be allowed to happen again.
The new capabilities of electricity and magnetism required appropriate units and Maxwell therefore agreed to take on electrical standards with the help of the Edinburgh engineer Henry Fleeming Jenkin (among other things, the inventor of the cable car) and Balfour Stewart, a physicist who had worked with Forbes at Edinburgh and was director of the Kew Observatory in Richmond upon Thames. The small team produced a more rational system for defining units for resistance, current and the like, based on experiments they undertook at King’s College.
This was an unusual piece of work for Maxwell in that it was pretty much the only significant piece of true teamwork he undertook, rather than acting alone (with the exception of assistance from Katherine). It’s not that he worked in hermetic isolation – letters between Maxwell and the likes of William Thomson and Peter Tait are full of scientific ideas and queries, with the physicists using each other as sounding boards. But, unlike modern science, there was very little true collaboration involved.
Many of the traditional units were relatively simple to pin down (if only a universal standard could be agreed on). These had started with a physical measure from nature, then been locally standardised by having a definitive example. The older measures of distance, for example, such as the foot and the mile, were dependent on typical human characteristics – the size of a part of the anatomy and one thousand paces (mille passus) respectively. The metre was slightly more scientifically determined, originally 1/10,000,000th of the distance from the North Pole to the equator on the meridian that ran through Paris. Each had become standardised to be represented by an official measure, though, as we have seen, these varied from country to country.
It was less obvious where a standard for voltage or current or electrical resistance would come from. Indirect measures were proposed that made use of equipment that translated one of the electrical measures into more familiar physical units. So, for instance, as it was known that the force between two electrical charges dropped off with the inverse square of the distance between them, it was possible to define a unit of charge (later called the coulomb) from a combination of the force generated and the distance between charged objects. This could then be used to calculate current (later the amp), which is simply the rate of flow of charge, and so on.
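Put schematically, in the electrostatic convention of the time (and with modern symbols rather than anything the committee wrote down):

F = \frac{q_1 q_2}{r^2}, \qquad I = \frac{\mathrm{d}q}{\mathrm{d}t}

so the unit of charge can be taken as the charge that exerts unit force on an equal charge at unit distance – both force and distance being mechanically measurable – and the unit of current then follows as one unit of charge flowing past per unit of time.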
Alternatively, current itself could be used as the way in. As the values involved in electromagnetic interaction became better known (though what was causing it was yet to be fully understood), force and distance could also be used as a measure of the current flowing between two interacting electrical coils. And a third option made resistance the starting point. This involved measuring the magnetic deflection produced by a coil of wire rotating in the Earth’s magnetic field.
Because of the importance of understanding the properties of the transatlantic cable, resistance was a key focus for Maxwell’s group at King’s College, which would realise an elegant mechanism devised by William Thomson based on this third option. Thomson’s design involved spinning a coil of wire in the magnetic field of the Earth and using the magnetic effect of the induced current to counter the Earth’s pull on a small permanent magnet. Because the strength of the Earth’s magnetic field cancels out between the two effects, the amount that the permanent magnet is deflected away from magnetic north depends only on the size of the coil, the speed of its rotation and its resistance – so, given the first two values, the equipment can be used to calculate an absolute value for resistance.
The velocity of a resistor
The unit of electrical resistance, measured using Thomson’s method, turned out to be a velocity. This was not connected to the actual velocity of a signal through the wire (something that confused many of those working on telegraphy at the time). It was simply the consequence of combining the units of the different quantities, such as distance and speed of rotation, that went into the measurement – the resulting dimensions of the resistance were distance divided by time, in the form of a velocity. The standard unit settled on was 10 million metres per second, which would soon after be called a B.A. unit, also known as an ‘ohmad’,¶ which rapidly got shortened to ohm.|| At least, 10 million metres per second was the intended value, although a measuring error (surely not one of Maxwell’s infamous arithmetic slip-ups) meant that the standard ohm was actually slightly larger than it should have been.
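A rough sketch of why the measurement comes out as a velocity (a modern rendering of the dimensional argument, not the committee’s own working): in the electromagnetic system of units, the deflection θ of the small permanent magnet in Thomson’s apparatus depends on the coil and its resistance R roughly as

\tan\theta \;\propto\; \frac{a\,\omega}{R}

where a is a length characterising the coil, ω is its rate of rotation, and dimensionless factors (such as the number of turns) are absorbed into the proportionality; the Earth’s field has already cancelled out. Solving for R gives a quantity with the dimensions of length divided by time – a velocity – and fixing the standard at 10 million metres per second gives the B.A. unit, the ohm.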
The Thomson design was not a trivial piece of apparatus to use effectively. The coil had to be rotated at a constant speed, with a considerable amount of effort put into the design by Jenkin to provide a governor to keep the rotation steady. The mechanism was constantly breaking down, and to make matters worse, it was sufficiently sensitive that when an iron ship passed on the nearby Thames, the detection magnet would be slightly deflected.** It took many months of admittedly sporadic work to get a satisfactory set of readings. After the initial report at the 1863 BA meeting, a further twelve months would pass before they were sure that the values were reliable.