of the shift that occurred about the same time in the basic word-order of Latin sentences from SOV (subject-object-verb) to SVO. She considers that 'an appeal to multiple causation--not ruling out the possibility of cultural determinants--may well prove to be the most satisfactory approach to the problem.'5
George Steiner has recalled the shock he experienced when, as a young child, he first realized that statements could be made about the far future. 'I remember', he writes, 'a moment by an open window when the thought that I was standing in an ordinary place and "now" and could say sentences about the weather and those trees fifty years on, filled me with a sense of physical awe. Future tenses, future subjunctives in particular, seemed to me possessed of a literal magic force.' He compares that feeling with the mental vertigo which is often produced by contemplating extremely large numbers, and draws attention to the interesting suggestion made by some scholars of Sanskrit, the oldest Indo-European language known, that 'the development of a grammatical system of futurity may have coincided with an interest in recursive series of very large numbers'.6
Be that as it may, it is clear that the origin of the concept of number, like the origin of language, is closely connected with the way in which our minds work in time, that is, by our being able to attend, strictly speaking, to only one thing at a time and our inability to do this for long without our minds wandering. Our idea of time is thus closely linked with the fact that our process of thinking consists of a linear sequence of discrete acts of attention. As a result, time is naturally associated by us with counting, which is the simplest of all rhythms. It is surely no accident that the words 'arithmetic' and 'rhythm' come from two Greek terms which are derived from a common root meaning 'to flow'. The relation between time and counting is further discussed in my The Natural Philosophy of Time.7
Time and natural bases of measurement
Most people, however primitive, have some method of time-recording and time-reckoning based either on the phases of nature indicated by temporal variations of climate and of plant and animal life or on celestial phenomena revealed by elementary astronomical observations. Time-reckoning, that is the continuous counting of time-units, was preceded by time-indications provided by particular occurrences. The oldest method of counting time was by means of some readily recognizable recurrent phenomenon, for example the counting of days in terms of
dawns such as we find in Homer ('This is the twelfth dawn since I came to Ilion', Iliad, xxi. 80-1). In this method of time-reckoning, as M. P. Nilsson has remarked, it is not the units as a whole that are counted, since the unit as such has not been conceived, but a concrete phenomenon occurring only once within this unit. It is what he calls the 'pars pro toto method', so extensively used in chronology.8
A good example of this method is provided by the extended use of the word 'day'. The fusion of day and night into a single unit of twenty-four hours did not occur to primitive man, who regarded them as essentially distinct phenomena. It is a curious fact that even now very few languages have a special word to denote this important unit. Notable exceptions are the Scandinavian terms, for example the Swedish dygn, whereas in English we use the same word 'day' to denote the full twenty-four-hour period and also the daylight part of it. Instead of appealing to 'dawn' and 'day', some peoples count time by the number of nights. This may be because sleeping provides a particularly convenient time-indicator. A familiar relic of this in English is the word 'fortnight', a term which is now as obsolete in the United States as the word 'sennight' is in Britain.
To indicate a particular time in the period of daylight the sun can often be used, either by reference to its position in the sky or in some other way. Thus, the Australian aborigine will fix the time for a proposed action by placing a stone in the fork of a tree so that the sun will strike it at the required time. Many tribes in the tropics indicate the time of day by referring to the direction of the sun or to the length or position of the shadow cast by an upright stick, but before sunrise the natural phenomenon most widely used as a time-indicator is cock-crow.
A wide variety of conventions have been adopted for deciding when the day-unit begins. Dawn was chosen by the ancient Egyptians, whereas sunset was chosen by the Babylonians, Jews, and Muslims. The Romans at first chose sunrise but later midnight, because of the variable length of the daylight period. Dawn was the beginning of the day-unit in Western Europe before the advent of the striking clock in the fourteenth century, but later midnight was chosen as the beginning of the civil day. Astronomers, such as Ptolemy, found it more convenient to choose midday, and this remained the beginning of the astronomical day until 1 January 1925 when, by international agreement, the astronomical day was made to coincide with the civil day.
Besides the day the other most important natural unit of time is the year. Nevertheless, although each year normally presents the same cycle of phenomena, man only gradually learned to unite the different seasons
into a definite temporal unit. This step was particularly difficult for people living in those equatorial regions where there are two similar half-years, each with its own seed-time and harvest, since a 'year' was originally understood to mean a vegetation-period. There is an important difference between the natural year, that is, the period of the earth's annual revolution around the sun, and the agricultural year. The former has no natural beginning or end, whereas the latter has. In Old Norse, German, and Anglo-Saxon, years tended to be reckoned in winters. The reason for this practice, which was of course rare in the tropics, was the same as that for counting days by nights: winter, being a season of rest, was an undivided whole and therefore more convenient than summer with its many activities. Nevertheless, there were exceptions to this rule. For example, in Slavonic, time was reckoned in summers, and in English, expressions such as 'a maiden of eighteen summers' were used, whereas in medieval Bavaria years were reckoned in autumns.
Time-indications from climatic and other natural phases during the course of the year are only approximate and tend to fluctuate from year to year. Greater accuracy is often desirable for agriculture, and it was recognized long ago that this could be provided by the stars, particularly by their rising and setting. Observation of these phenomena did not make great intellectual demands on primitive man, who rises and goes to bed with the sun. Experience teaches him which stars rise in the east just before the sun and which appear in the west at dusk and shortly afterwards set there. These 'heliacal' risings and settings, as they are called, vary throughout the year and can be readily correlated with particular natural phenomena. The stars therefore provide us with a ready and more accurate means of determining the time of year than any based on the phases of terrestrial phenomena. Just as the time of day may be revealed by the position of the sun, so the time of year can be determined by means of heliacal risings and settings, and this can form the basis of a calendar. Timings can also be approximately determined by observing the position of stellar groupings that can be easily recognized, notably the Pleiades.
Although the stars can help man to determine the seasons, they do not enable him to divide the year into parts. Instead, the moon has been used to produce a temporal unit between the year and the day. Moreover, unlike time-indications from natural phases and the stars, the moon's waxing and waning provide a continuous means of time-reckoning. Consequently, the moon can be regarded as the first chronometer, since its continually changing appearance drew attention to the durational
aspect of time. Although the concept of the month is much more readily attained than that of the year, it is difficult to combine the two satisfactorily, because the solar period is not a convenient multiple of the lunar period. So long as the beginning of the month was determined by observing the new moon, the month was based on lunations, but they are inconvenient for measuring time, since it is the movement of the sun that determines the seasons and the rhythm of life associated with them. As a result, our system of months no longer has any connection with the moon but is a purely arbitrary way of dividing the solar year into twelve parts. Our present concept of the year can be traced back to the Romans and through them to the Egyptians, who disregarded lunation as a time-measure.
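The awkwardness of fitting lunations into the solar year can be seen from a short calculation. The figures below are modern mean values, given purely for illustration and not drawn from the text:

```python
# Illustrative only: modern mean astronomical values, not figures from the text.
SYNODIC_MONTH = 29.530589   # mean days from one new moon to the next
TROPICAL_YEAR = 365.242190  # mean days in the seasonal (solar) year

# The solar year contains a non-integral number of lunations,
# so no whole number of lunar months fits the cycle of the seasons.
lunations_per_year = TROPICAL_YEAR / SYNODIC_MONTH
print(round(lunations_per_year, 3))  # -> 12.368

# Twelve lunations fall short of the solar year by roughly eleven days,
# which is why purely lunar months drift steadily against the seasons.
shortfall = TROPICAL_YEAR - 12 * SYNODIC_MONTH
print(round(shortfall, 1))  # -> 10.9
```

This drift of about eleven days per year is what forced calendar-makers either to intercalate extra months or, as the Egyptians did, to abandon lunation as a time-measure altogether.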
As regards intervals of time shorter than the year and the day, primitive people have often made use of convenient physiological intervals such as 'the twinkling of an eye' or occupational intervals such as the time required for cooking a given quantity of rice. Indeed, man's unwillingness to abandon natural bases of measurement was for long a hindrance to the development of a scientific system of timekeeping. This is particularly evident in the case of the hour. The division of the daylight period into twelve parts was introduced by the Egyptians, who first divided the interval from sunrise to sunset into ten hours and then added two more for morning and evening twilight respectively. They also divided the night into twelve equal parts. These 'seasonal hours', as they are called, varied in duration according to the time of year. This practice, although not so inconvenient in countries like Egypt as in more northerly places, introduced an unnecessary complication into the development of the water-clock and was quite impracticable in scientific astronomy.
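The variability of these seasonal hours is easy to quantify. As a sketch, using illustrative round-number daylight spans rather than any historical data, dividing the daylight period into twelve gives:

```python
# Length of one Egyptian 'seasonal hour' expressed in modern minutes.
# The daylight durations below are illustrative, not historical data.
def seasonal_hour_minutes(daylight_hours: float) -> float:
    """Daylight divided into twelve equal 'seasonal hours'."""
    return daylight_hours * 60 / 12

print(seasonal_hour_minutes(14))  # long summer day  -> 70.0 minutes
print(seasonal_hour_minutes(12))  # equinox          -> 60.0 minutes
print(seasonal_hour_minutes(10))  # short winter day -> 50.0 minutes
```

A clock built to strike equal intervals cannot track hours whose length swings by a third between midsummer and midwinter, which is the complication for the water-clock noted above.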
Time in contemporary society
What particularly distinguishes man in contemporary society from his forebears is that he has become increasingly time-conscious. The moment we rouse ourselves from sleep we usually wonder what time it is. During our daily routine we are continually concerned about time and are forever consulting our clocks and watches. In previous ages most people worked hard but worried less about time than we do. Until the rise of modern industrial civilization people's lives were far less consciously dominated by time than they have been since. The development and continual improvement of the mechanical clock and, more recently, of portable watches have had a profound influence on the way we
live. Nowadays we are governed by time-schedules and many of us carry diaries, not to record what we have done but to make sure that we are at the right place at the right time. There is an ever-growing need for us to adhere to given routines, so that the complex operations of our society can function smoothly and effectively. We even tend to eat not when we feel hungry but when the clock indicates that it is meal-time. Consequently, although there are differences between the objective order of physical time and the individual time of personal experience, we are compelled more and more to relate our personal 'now' to the time-scale determined by the clock and the calendar. Similarly, in our study of the natural world, never has more importance been attached to the temporal aspects of phenomena than today. To understand why this is so and how it has come about that the concept of time now dominates our understanding of both the physical universe and human society, no less than it controls the way we organize our lives and social activities, we must examine the role that it has played throughout history.
Part II
Time in Antiquity and the Middle Ages
3. Time at the Dawn of History
Prehistory
Consciousness of self is a fundamental characteristic of human existence. It involves a sense of personal continuity through a succession of different states of awareness. This sense of personal identity depends essentially on memory, but a sense of the past could only have arisen when man consciously reflected on his memories. Similarly, purposeful action involves at least implicit recognition of some future achievement, but a general sense of the future could not have resulted until man applied his mind systematically to the problem of future events. Man must have been conscious of memories and purposes long before he made any explicit distinction between past, present, and future.
The famous palaeolithic paintings found in caves such as that at Lascaux in the Dordogne have been interpreted as evidence that, at least implicitly, people were operating 20,000 or more years ago with teleological intent in terms of past, present, and future. From what we know of primitive races it is highly probable that the incentive for producing these paintings was magical, the object being to fix in paint on the wall or ceiling of a cave an event--usually the slaying of an animal--which it was hoped would be effected in the future elsewhere. It may be that those responsible for the well-known picture of the so-called 'Dancing Sorcerer' (on the wall of one of the innermost recesses of the Trois Frères cave in the department of Ariège in France), which represents a man in the skin of an animal and wearing the antlers of a stag, felt that the actual performance of the dance was insufficient, since they were concerned about the conservation of the magical efficacy of the dance after it had ended. If correct, this hypothesis might explain why these people so many thousands of years ago went to the trouble and danger of penetrating so deeply into the cave for this purpose.
In making these pictorial representations people must have relied on their memories of past events, and so all three modes of time were involved. But this no more implies a conscious awareness of the distinctions between past, present, and future than the use of language
necessitates an explicit knowledge of grammar. Indeed, it must have required an enormous effort for man to overcome his natural tendency to live like the animals in a continual present. Moreover, the development of rational thought actually seems to have impeded man's appreciation of the significance of time.
In his classic work Primitive Man as Philosopher, Paul Radin argues that among primitive men there exist two different types of temperament: the man of action, who is oriented towards external objects, interested primarily in practical results, and comparatively indifferent to the stirrings of his inner self, and the thinker--a much rarer type--who is impelled to analyse and 'explain' his subjective states. The former, in so far as he considers explanations at all, inclines to those that stress the purely mechanical relations between events. His mental rhythm is characterized by a demand for endless repetition of the same event or events, and change for him means essentially some abrupt transformation. The thinker, on the other hand, finds purely mechanical explanations inadequate. But, although he seeks a description in terms of a gradual development from one to many, simple to complex, cause to effect, he is perplexed by the continually shifting forms of external objects. Before he can deal with them systematically he must give them some permanence of form. In other words, the world must be made static.1
Belief that ultimate reality is timeless is deeply rooted in human thinking, and the origin of rational investigation of the world was the search for the permanent factors that lie behind the ever-changing pattern of events. As Radin stressed in his discussion of the thought of primitive man, 'as soon as an object is regarded as a dynamic entity, then analysis and definition become both difficult and unsatisfactory. Thinking is under such circumstances well-nigh impossible for most people.'2 Indeed, language itself inevitably introduced an element of permanence into a vanishing world. For, although speech itself is transitory, the conventionalized sound symbols of language transcended time. At the level of oral language, however, permanence depended solely on memory. To obtain a greater degree of permanence the time symbols of oral speech had to be converted into the space symbols of written speech. The earliest written records were simply pictorial representations of natural objects, such as birds and animals. The next step was the ideograph by means of which thoughts were represented symbolically by pictures of visual objects. The crucial stage in the evolution of writing occurred when ideographs became phonograms,
that is, representations of things that are heard. This conversion of sound symbols in time to visual symbols in space was the greatest single step in the quest for permanence.
The distinctions we make between past, present, and future refer to the transitional nature of time. Although dependent on memory, our sense of personal identity is closely
associated with the durational aspect of time. Man's discovery that he himself, like other living creatures, is born and dies must have led him intuitively to try to circumvent the relentless flux of time by seeking to perpetuate his own existence indefinitely. Evidence of ritual burial goes back at least to Neanderthal man and possibly even earlier.3 A Neanderthal burial of about 60,000 years ago, at a cave in northern Iraq, even appears to have included flowers.4 As for our own species, the oldest evidence, going back possibly to about 35,000 BC, reveals that the dead were equipped not only with weapons, tools, and ornaments but also with food, which must often have been in short supply among the living. In some cases bodies were covered with red ochre, which may have been intended to simulate blood, in the hope of averting physical extinction. The care taken over the disposal of the dead indicates a deeply held conviction that, provided the appropriate steps were taken, death could be regarded as a transitional state.
The idea of death as a transition from one phase of life to another that could only be satisfactorily effected by performing the appropriate rituals became the pattern for dealing with other natural changes. The principal transitions from one phase of a person's life to another were thought of as crises, and as a result the community to which he belonged assisted with the appropriate rituals.
Similarly, the principal transitions in nature were also regarded as occurring suddenly and dramatically. In the palaeolithic period men were already aware that at certain times of the year animals and plants are less prolific than at others, and seasonal ritual observances to maintain an adequate supply of them were therefore deemed necessary. With the change from a nomadic and food-gathering to an agricultural and more highly organized form of society, man's anxiety about himself and the animals that he hunted merged into a wider anxiety about nature. At the critical seasons a ritual response was required to overcome the unpredictable factors that might otherwise interfere with the regular growth of crops. The succession of natural phenomena and phases became evidence for a dramatic interpretation of the universe. Nature was seen as a process of strife between divine cosmic powers and demoniacal chaotic
Time in History: Views of Time From Prehistory to the Present Day