Modern Mind: An Intellectual History of the 20th Century


by Peter Watson


  It is perhaps difficult for us, today, to imagine the full impact of modernism. Those alive now have all grown up in a scientific world; for many, the life of large cities is the only life they know, and rapid change the only change there is. Only a minority of people have an intimate relation with the land or nature.

  None of this was true at the turn of the century. Vast cities were still a relatively new experience for many people; social security systems were not yet in place, so that squalor and poverty were much harsher than now, casting a much greater shadow; and fundamental scientific discoveries, building on these new, uncertain worlds, created a sense of bewilderment, desolation and loss probably sharper and more widespread than had ever been felt before, or has been since. The collapse of organised religion was only one of the factors in this seismic shift in sensibility: the growth of nationalism, anti-Semitism and racial theories, and the enthusiastic embrace of modernist art forms that sought to break down experience into fundamental units, were all part of the same response.

  The biggest paradox, the most worrying transformation, was this: according to evolution, the world’s natural pace of change was glacial. According to modernism, everything was changing at once, and in fundamental ways, virtually overnight. For most people, therefore, modernism was as much a threat as it was a promise. The beauty it offered held a terror within.

  * Strauss was not the only twentieth-century composer to pull back from the leading edge of the avant-garde: Stravinsky, Hindemith and Shostakovich all rejected certain stylistic innovations of their early careers. But Strauss was the first.19

  5

  THE PRAGMATIC MIND OF AMERICA

  In 1906 a group of Egyptians, headed by Prince Ahmad Fuad, issued a manifesto to campaign for the establishment by public subscription of an Egyptian university ‘to create a body of teaching similar to that of the universities of Europe and adapted to the needs of the country.’ The appeal was successful, and the university, or in the first phase an evening school, was opened two years later with a faculty of two Egyptian and three European professors. This plan was necessary because the college-mosque of al-Azhar at Cairo, once the principal school in the Muslim world, had sunk in reputation as it refused to update and adapt its mediaeval approach. One effect of this was that in Egypt and Syria there had been no university, in the modern sense, throughout the nineteenth century.1

  China had just four universities in 1900; Japan had two – a third would be founded in 1909; Iran had only a series of specialist colleges (the Teheran School of Political Science was founded in 1900); there was one college in Beirut; and in Turkey – still a major power until World War I – the University of Istanbul was founded in 1871 as the Dar-al-funoun (House of Learning), only to be closed soon afterwards and not reopened until 1900. In Africa south of the Sahara there were four: one in the Cape, the Grey University College at Bloemfontein, the Rhodes University College at Grahamstown, and the Natal University College. Australia also had four, New Zealand one. In India, the universities of Calcutta, Bombay, and Madras were founded in 1857, and those of Allahabad and Punjab between 1857 and 1887. But no more were created until 1919.2 In Russia there were ten state-funded universities at the beginning of the century, plus one in Finland (which was technically autonomous), and one private university in Moscow.

  If the paucity of universities characterised intellectual life outside the West, the chief feature in the United States was the tussle between those who preferred the British-style universities and those for whom the German-style offered more. To begin with, most American colleges had been founded on British lines. Harvard, the first institution of higher learning within the United States, began as a Puritan college in 1636. More than thirty partners of the Massachusetts Bay Colony were graduates of Emmanuel College, Cambridge, and so the college they established near Boston naturally followed the Emmanuel pattern. Equally influential was the Scottish model, in particular Aberdeen.3 Scottish universities were nonresidential, democratic rather than religious, and governed by local dignitaries – a forerunner of boards of trustees. Until the twentieth century, however, America’s institutions of higher learning were really colleges – devoted to teaching – rather than universities proper, concerned with the advancement of knowledge. Only Johns Hopkins in Baltimore (founded in 1876) and Clark (1888) came into this category, and both were soon forced to add undergraduate schools.4

  The man who first conceived the modern university as we know it was Charles Eliot, a chemistry professor at Massachusetts Institute of Technology who in 1869, at the age of only thirty-five, was appointed president of Harvard, where he had been an undergraduate. When Eliot arrived, Harvard had 1,050 students and fifty-nine members of the faculty. In 1909, when he retired, there were four times as many students and the faculty had grown tenfold. But Eliot was concerned with more than size: ‘He killed and buried the limited arts college curriculum which he had inherited. He built up the professional schools and made them an integral part of the university. Finally, he promoted graduate education and thus established a model which practically all other American universities with graduate ambitions have followed.’5

  Above all, Eliot followed the system of higher education in the German-speaking lands, the system that gave the world Max Planck, Max Weber, Richard Strauss, Sigmund Freud, and Albert Einstein. The preeminence of German universities in the late nineteenth century dated back to the Battle of Jena in 1806, after which Napoleon finally reached Berlin. His arrival there forced the inflexible Prussians to change. Intellectually, Johann Fichte, Christian Wolff, and Immanuel Kant were the significant figures, freeing German scholarship from its stultifying reliance on theology. As a result, German scholars acquired a clear advantage over their European counterparts in philosophy, philology, and the physical sciences. It was in Germany, for example, that physics, chemistry, and geology were first regarded in universities as equal to the humanities. Countless Americans, as well as distinguished Britons such as Matthew Arnold and Thomas Huxley, visited Germany and praised what was happening in its universities.6

  From Eliot’s time onward, the American universities set out to emulate the German system, particularly in the area of research. However, this German example, though impressive in advancing knowledge and in producing new technological processes for industry, nevertheless sabotaged the ‘collegiate way of living’ and the close personal relations between undergraduates and faculty that had been a major feature of American higher education until the adoption of the German approach. The German system was chiefly responsible for what William James called ‘the Ph.D. octopus’: Yale awarded the first Ph.D. west of the Atlantic in 1861; by 1900 well over three hundred were being granted every year.7

  The price for following Germany’s lead was a total break with the British collegiate system. At many universities, housing for students disappeared entirely, as did communal eating. At Harvard in the 1880s the German system was followed so slavishly that attendance at classes was no longer required – all that counted was performance in the examinations. Then a reaction set in. Chicago was first, building seven dormitories by 1900 ‘in spite of the prejudice against them at the time in the [mid-] West on the ground that they were medieval, British and autocratic.’ Yale and Princeton soon adopted a similar approach. Harvard reorganised itself along the lines of the English residential model in the 1920s.8

  Since American universities have been the forcing ground of so much of what will be considered later in this book, their history is relevant in itself. But the battle for the soul of Harvard, Chicago, Yale, and the other great institutions of learning in America is relevant in another way, too. The amalgamation of German and British best practices was a sensible move, a pragmatic response to the situation in which American universities found themselves at the beginning of the century. And pragmatism was a particularly strong strain of thought in America. The United States was not hung up on European dogma or ideology. It had its own ‘frontier mentality’; it had – and exploited – the opportunity to cherry-pick what was best in the old world, and eschew the rest. Partly as a result of that, it is noticeable that the matters considered in this chapter – skyscrapers, the Ashcan school of painting, flight and film – were all, in marked contrast with aestheticism, psychoanalysis, the élan vital or abstraction, fiercely practical developments, immediately and hardheadedly useful responses to the evolving world at the beginning of the century.

  The founder of America’s pragmatic school of thought was Charles Sanders Peirce, who had first advanced its ideas in the 1870s, but those ideas were updated and made popular by William James. William and his younger brother Henry, the novelist, came from a wealthy Boston family; their father, Henry James Sr., was a writer of ‘mystical and amorphous philosophic tracts.’9 William James’s debt to Peirce was made plain in the title he gave to a series of lectures delivered in Boston in 1907: Pragmatism: A New Name for Some Old Ways of Thinking. The idea behind pragmatism was to develop a philosophy shorn of idealistic dogma and subject to the rigorous empirical standards being developed in the physical sciences. What James added to Peirce’s ideas was the notion that philosophy should be accessible to everyone; it was a fact of life, he thought, that everyone liked to have what they called a philosophy, a way of seeing and understanding the world, and his lectures (eight of them) were intended to help.

  James’s approach signalled another great divide in twentieth-century philosophy, in addition to the rift between the continental school of Franz Brentano, Edmund Husserl, and Henri Bergson, and the analytic school of Bertrand Russell, Ludwig Wittgenstein, and what would become the Vienna Circle. Throughout the century, there were those philosophers who drew their concepts from ideal situations: they tried to fashion a worldview and a code of conduct in thought and behaviour that derived from a theoretical, ‘clear’ or ‘pure’ situation where equality, say, or freedom was assumed as a given, and a system constructed hypothetically around that. In the opposite camp were those philosophers who started from the world as it was, with all its untidiness, inequalities, and injustices. James was firmly in the latter camp.

  He began by trying to explain this divide, proposing that there are two very different basic forms of ‘intellectual temperament,’ what he called the ‘tough-’ and ‘tender-minded.’ He did not actually say that he thought these temperaments were genetically endowed – 1907 was a bit early for anyone to use such a term – but his choice of the word temperament clearly hints at such a view. He thought that the people of one temperament invariably had a low opinion of the other and that a clash between the two was inevitable. In his first lecture he characterised them as follows:

  Tender-minded                          Tough-minded

  Rationalistic (going by principle)     Empiricist (going by facts)
  Optimistic                             Pessimistic
  Religious                              Irreligious
  Free-willist                           Fatalistic
  Dogmatic                               Pluralistic
                                         Materialistic
                                         Sceptical

  One of his main reasons for highlighting this division was to draw attention to how the world was changing: ‘Never were as many men of a decidedly empiricist proclivity in existence as there are at the present day. Our children, one may say, are almost born scientific.’10

  Nevertheless, this did not make James a scientific atheist; in fact it led him to pragmatism (he had, after all, published an important book, The Varieties of Religious Experience, in 1902).11 He thought that philosophy should above all be practical, and here he acknowledged his debt to Peirce. Beliefs, Peirce had said, ‘are really rules for action.’ James elaborated on this theme, concluding that ‘the whole function of philosophy ought to be to find out what definite difference it will make to you and me, at definite instants of our life, if this world-formula or that world-formula be the true one…. A pragmatist turns his back resolutely and once for all upon a lot of inveterate habits dear to professional philosophers. He turns away from abstraction and insufficiency, from verbal solutions, from bad a priori reasons, from fixed principles, closed systems, and pretended absolutes and origins. He turns towards concreteness and adequacy, towards facts, towards action, and towards power.’12

  Metaphysics, which James regarded as primitive, was too attached to the big words – ‘God,’ ‘Matter,’ ‘the Absolute.’ But these, he said, were only worth dwelling on insofar as they had what he called ‘practical cash value’: what difference did they make to the conduct of life? Whatever makes a practical difference to the way we lead our lives, James was prepared to call ‘truth.’ Truth, he said, is not absolute. There are many truths, and they are true only so long as they are practically useful. That a truth is beautiful does not make it eternal. This is why truth is good: by definition, it makes a practical difference.

  James used his approach to confront a number of metaphysical problems, of which we need consider only one to show how his arguments worked: is there such a thing as the soul, and what is its relationship to consciousness? Philosophers in the past had proposed a ‘soul-substance’ to account for certain kinds of intuitive experience, James wrote, such as the feeling that one has lived before within a different identity. But if you take away consciousness, is it practical to hang on to ‘soul’? Can a soul be said to exist without consciousness? No, he said. Therefore, why bother to concern oneself with it? James was a convinced Darwinist: evolution, he thought, was essentially a pragmatic approach to the universe; that is what adaptations – species – are.13

  America’s third pragmatic philosopher, after Peirce and James, was John Dewey. A professor in Chicago, Dewey boasted a Vermont drawl, rimless eyeglasses, and a complete lack of fashion sense. In some ways he was the most successful pragmatist of all. Like James he believed that everyone has his own philosophy, his own set of beliefs, and that such philosophy should help people to lead happier and more productive lives. His own life was particularly productive: through newspaper articles, popular books, and a number of debates conducted with other philosophers, such as Bertrand Russell or Arthur Lovejoy, author of The Great Chain of Being, Dewey became known to the general public as few philosophers are.14 Like James, Dewey was a convinced Darwinist, someone who believed that science and the scientific approach needed to be incorporated into other areas of life. In particular, he believed that the discoveries of science should be adapted to the education of children. For Dewey, the start of the twentieth century was an age of ‘democracy, science and industrialism,’ and this, he argued, had profound consequences for education.

  At that time, attitudes to children were changing fast. In 1909 the Swedish feminist Ellen Key published her book The Century of the Child, which reflected the general view that the child had been rediscovered – rediscovered in the sense that there was a new joy in the possibilities of childhood and in the realisation that children were different from adults and from one another.15 This seems no more than common sense to us, but in the nineteenth century, before the victory over high rates of child mortality, when families were much larger and many children died, there was not – there could not be – the same investment in children, in time, in education, in emotion, as there was later. Dewey saw that this had significant consequences for teaching. Hitherto schooling, even in America, which was in general more indulgent to children than Europe, had been dominated by the rigid authority of the teacher, who had a concept of what an educated person should be and whose main aim was to convey to his or her pupils the idea that knowledge was the ‘contemplation of fixed verities.’16

  Dewey was one of the leaders of a movement that changed such thinking, in two directions. The traditional idea of education, he saw, stemmed from a leisured and aristocratic society, the type of society that was disappearing fast in the European democracies and had never existed in America. Education now had to meet the needs of democracy. Second, and no less important, education had to reflect the fact that children were very different from one another in abilities and interests. For children to make the best contribution to society they were capable of, education should be less about ‘drumming in’ hard facts that the teacher thought necessary and more about drawing out what the individual child was capable of. In other words, pragmatism applied to education.

  Dewey’s enthusiasm for science was reflected in the name he gave to the ‘Laboratory School’ that he set up in 1896.17 Motivated partly by the ideas of Johann Pestalozzi, a pious Swiss educator, and the German philosopher Friedrich Fröbel, and by the child psychologist G. Stanley Hall, the institution operated on the principle that for each child there were negative and positive consequences of individuality. In the first place, the child’s natural abilities set limits to what it was capable of. More positively, the interests and qualities within the child had to be discovered in order to see where ‘growth’ was possible. Growth was an important concept for the ‘child-centred’ apostles of the ‘new education’ at the beginning of the century. Dewey believed that since antiquity society had been divided into leisured and aristocratic classes, the custodians of knowledge, and the working classes, engaged in work and practical knowledge. This separation, he believed, was fatal, especially in a democracy. Education along class lines must be rejected, and inherited notions of learning discarded as unsuited to democracy, industrialism, and the age of science.18

  The ideas of Dewey, along with those of Freud, were undoubtedly influential in attaching far more importance to childhood than before. The notion of personal growth and the rolling back of traditional, authoritarian conceptions of what knowledge is and what education should seek to do were liberating ideas for many people. In America, with its many immigrant groups and wide geographical spread, the new education helped to create many individualists. At the same time, the ideas of the ‘growth movement’ always risked being taken too far, with children left to their own devices too much. In some schools where teachers believed that ‘no child should ever know failure,’ examinations and grades were abolished.19 This lack of structure ultimately backfired, producing children who were more conformist precisely because they lacked hard knowledge or the independent judgement that the occasional failure helped to teach them. Liberating children from parental ‘domination’ was, without question, a form of freedom. But later in the century it would bring its own set of problems.

 
