
Heritage and Foundations


by Alain de Benoist


  The behaviours which are purely instinctive and predetermined in the animal are found, in the human species, 'thought' and historicised. From sexuality, man makes eroticism; from work, organised action; from aggression, a strategy; from the 'word', a discourse; from a series of events, a history.

  He is the only one able to say 'I' knowingly; he is also the only one able to capitalise on the ancestral inheritance, to reactualise it at any moment, and to enrich it by renewing it. It is in this that his 'free will' consists.

  ‘Free, man draws everything from himself, including his own end’, writes Professor Pierre P. Grassé — ‘whence his immense responsibility with regard to the species. He finalises everything, including the future of his species. Condemned to be the worker of his destiny, he weaves his future and gives life the meaning that his will and his desire suggest to him (…) The finality presents itself completely differently among animals than it does for us, for the determinants of their conduct differ radically from ours. More precisely, in man, the immanent biological finality which dominates our bodily structure and our organic functions is overlaid by the finality created by our will in the field of freedom’ (Toi, ce petit dieu ! Albin Michel, 1971).

  Whereas the animal is born already trained, man must train himself. He is a ‘being of dressage’ (Gehlen).447 Hence the importance of education and the need for discipline in order to create circuits of habit. To want to escape from all constraint, and, above all, from that which the individual must impose on himself, is to return to the lower limit of humanity. To be less human.

  Organiser of his own destiny, man is, in the final analysis, a profoundly cultural being.

  His potential abilities are innate. But they are developed or thwarted by the environment, and then constantly reoriented by learning and cultural heritage. The latter determines the knowledge system (the group’s own worldview), which consists of classifications allowing the division, analysis, and comprehension of the surrounding environment (including the human environment), and consequently, the statement of norms and rules of action. Man inherits a tradition. But within it, he innovates constantly: a subtle dialectic, where permanence and change are combined. As Alland writes: ‘Man is born with the faculty of assimilating culture, not with culture’.

  For a ‘Return to Culture’

  Culture itself has a biological origin. Better still: 'Every change in our environment gives rise to new selective pressures, which continue to operate, both genetically and culturally' (Alland). But this biological basis is only one factor among others: the possibilities and modalities of cultural expression vary immensely.

  It remains to be seen, however, to what extent self-mastery is innate. This is a crucial point, since it is precisely this quality which enables man to act independently of his impulses (see Arnold Gehlen, Moral und Hypermoral, and Irenäus Eibl-Eibesfeldt, The Preprogrammed Man).

  Thanks to evidence from ethology, it is now possible to initiate a radical critique of the philosophies stemming from Rousseau’s thought. Man is not ‘naturally good’. At birth, he is neither ‘free’ nor ‘equal’ to anyone. But to deepen this criticism, it is necessary, as Arnold Gehlen has done in Der Mensch, to substitute for the doubtful (and often reductionist) slogan of ‘return to nature’ that of ‘return to culture’.

  Nature, say the philosophers of life, tells us what we are, but not what we can become.

  In an essay published in 1973, L’anti-nature, Clément Rosset rightly denounced a ‘deep and ineradicable illusion’ in the very idea of ‘nature’.

  Invoking in turn Empedocles, the Sophists and atomists of antiquity, as well as Machiavelli and Nietzsche, he notes that what is proper to man, that is to say culture, corresponds precisely to the part of him which, not belonging to ‘nature’, can be considered ‘artifice’. He therefore denounces the ‘naturalist ideologies’, and among them the ‘ecological’ challenge, which, whatever its legitimate aspects, is essentially anti-cultural (hence ‘anti-human’) insofar as it suggests that man must not ‘defy’ nature, that he must halt his growth and cease to be a man rather than engage in new confrontations.

  Rosset, on the contrary, proposes that man should fully assume his specificity ‘by renouncing the idea of nature, which can be regarded as the principle of all ideas tending to “divinise” existence and to deprecate it as such’.

  ‘Thus’, he continues, ‘man, freed from the idea of nature, will be able to recover his true … nature: a “denatured” and properly human nature’.

  Here we find the necessity of a tragic conception of life: ‘To affirm existence is to affirm tragedy: to consent to the impossibility of seizing existence in general. The tragedy of existence is to dispense with any ontological framework. The affirmation is tragic or it is not (...) This is the constant paradox of tragic philosophy: to celebrate without reason and to detail all the horror of the world for the sole pleasure of placing the inalterable character of its joy in relief’.

  And Rosset concludes: ‘A distinction must be drawn between the formula of morality and religion (“Be humble first and you will see that happiness follows”) and the formula of joy (“Be happy first and you will necessarily be humble”). The second formula is more certain than the first: for joy guarantees humility (Nietzsche), whereas humility does not guarantee joy (Pascal). The chronological and psychological nuance is important, for it signifies that the deepest wisdom does not recommend being humble first, but being happy first’.

  *

  La dimension humaine, a study by Alexander Alland. Seuil, 190 pages.448

  L’anti-nature, a study by Clément Rosset. PUF, 330 pages.449

  *

  Human ethology has been combined with scientific ecology, genetics, and the study of population dynamics to give rise to a new discipline, sociobiology. It is defined as a ‘systematic study of the biological foundations of all forms of social behaviour, both in man and in animals’ (New Scientist, 13 May 1976). Its principal theorist is Edward O. Wilson, a Professor at Harvard University, whose great work, Sociobiology: The New Synthesis (Belknap: Harvard, 1975), caused a sensation in the United States.

  Sociobiology is part of the (‘elitist’) neo-Darwinian theory of evolution. It takes into account the fact that there is a high degree of interaction between individuals of the same population, resulting from preferential associations not exclusively linked to habitat. It shows that individuals of the same population are not random, interchangeable units. What is more, it proposes a re-examination of the notion of natural selection by shifting its emphasis from the individual to the parental group (kin). In this hypothesis, ‘altruistic behaviours’ are no longer in contradiction with the ‘Darwinian fitness’ of the gene — a notion according to which the main biological function of the organism is not the reproduction of other organisms, but the reproduction and multiplication (through other organisms) of the ‘best’ genes from the point of view of selection and adaptation.

  Professor Wilson’s theories have been particularly well received by ethologists. However, in the United States, they have provoked a campaign of systematic denigration on the part of the left-wing ‘radical scientists’ gathered around Richard C. Lewontin. They formed a ‘Sociobiology Study Group’ whose aim was to denounce the ‘implicit political message’ of sociobiology. A book emanating from this circle has even been published: Marshall Sahlins, The Use and Abuse of Biology: An Anthropological Critique of Sociobiology (University of Michigan Press, Ann Arbor, 1976).

  To these critiques, E. O. Wilson replied (New York Review of Books, 11 December 1975) that his opponents distort his views in order to propagate a purely environmentalist ideology, and that for his part he has never denied the specific importance of cultural factors.

  Inspired by Marxism or Behaviourism, the attacks against Lorenz, Eibl-Eibesfeldt, and Ardrey also continue to grow. In the space of a few years there have appeared the criticisms of John P. Scott (Aggression, Chicago: University of Chicago Press, 1958 and 1970), Ashley Montagu (Man and Aggression, London: Oxford University Press, 1968 and 1973), Walter Hollitscher (Kain oder Prometheus? Frankfurt/M: Marxistische Blätter, 1972), Joseph Rattner (Aggression und menschliche Natur, Frankfurt/M: Fischer, 1972),450 Wolfgang Schmidtbauer (Die sogenannte Aggression, Hamburg: Hoffmann und Campe, 1972),451 Kurt Gerhardt (Aggression und Rassismus — elementare Verhaltensweisen? Kösel, München, 1973),452 Rolf Denker (Aufklärung über Aggression, W. Kohlhammer, Stuttgart, 1975),453 Ulrich Erckenbrecht (Mensch, du Affe, Kübler: Lampertheim, 1975),454 Gerhard Roth (Kritik der Verhaltensforschung, C. H. Beck, München, 1976),455 and Gunter Pilz and Hugo Moesch (Der Mensch und die Graugans. Eine Kritik an Konrad Lorenz, Umschau: Frankfurt/M, 1976),456 among others. These studies, which all seem to reiterate one another, are on the whole not very convincing.

  Psychology

  Intelligence, Inheritance, and IQ

  ‘The results of numerous carefully conducted surveys unquestionably lead to the conclusion that IQ (intelligence quotient) tests, suitably designed and administered, yield results that coincide remarkably well with the child’s successes (...) In a very summary way, let us say that any child with an IQ of 115 can hope to enter high school, that if he has a quotient of 125 he will enter university, and that he has every chance of graduating at the top of his class if his quotient is between 135 and 140’.

  According to Professor Hans J. Eysenck, the prospects of ‘democratising’ education are more limited than we would believe.

  Born in Berlin in 1916 and established in Great Britain since 1936, Professor Eysenck studied at the Universities of Dijon, Brussels, and Exeter. He was appointed in 1942 to the Mill Hill Emergency Hospital and then to the Maudsley Hospital in London. In 1950, he took over the Department of Psychology at the Institute of Psychiatry of the University of London. He is probably the British psychologist best known to the public. We see him regularly on television. Almost all of his books have been bestsellers (Know Your Own IQ, Crime and Personality, Sense and Nonsense in Psychology, Uses and Abuses of Psychology, Psychology and Politics). In total, sales of his works have exceeded 1.5 million copies.

  In Know Your Own IQ, he surveys intelligence tests and allows the reader to ‘evaluate’ themselves.

  It was the Frenchman Alfred Binet who, in 1904, first noticed that mental capacities and functions could be measured by tests which call upon precisely these capacities and functions in order to be passed.

  Psychometry soon developed, especially in the Anglo-Saxon countries, where, unlike in France, psychology is regarded as a science and not as a branch of speculative philosophy.

  Mental Age / Actual Age

  Today there are a whole host of ‘tests’, some more generalised and some more specialised in scope (tests to evaluate a particular aptitude, tests for children, tests used in psychiatry, etc.). Professor Eysenck’s ‘game-book’ consists of eight series of forty questions. Their peculiarity is as follows: it is by discovering the method that allows the problem to be posed that one also discovers the solution.

  An example of a ‘verbal’ test: ‘Black is to white as high is to: (1) low; (2) green; (3) ascending; (4) distant‘. The right answer is obviously ‘low’. ‘Green’ is an aberrant answer. ‘Ascending’ and ‘distant’ indicate poor comprehension.

  An example of a ‘progressive logic’ test: ‘What is the missing number at the end of the series: 3, 7, 16, 35 …’ The answer is 74. (Each number is double the preceding one, plus one, plus two, plus three, and so on.)
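
  As a minimal sketch (in Python, taking the rule exactly as stated above), the series can be reproduced by doubling each term and adding an increment that grows by one at every step:

```python
def progressive_series(first=3, length=5):
    """Each term is double the previous one, plus an increment
    that increases by one at each step (+1, +2, +3, ...)."""
    terms = [first]
    for increment in range(1, length):
        terms.append(terms[-1] * 2 + increment)
    return terms

print(progressive_series())  # [3, 7, 16, 35, 74] -- so the missing number is 74
```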

  Other tests consist of comparing sets of words and forms, noting similarities and differences. Within each series, the exercises grow increasingly difficult.

  Through these kinds of tests, psychologists evaluate the mental age of a subject. Having a mental age of three means passing the tests which (statistically speaking) three-year-olds complete successfully. But this mental age does not necessarily coincide with the real age: a child who passes the tests of the eight-year-old, but is tripped up by those of the nine-year-old, has a mental age of eight years, whether he is nine or ten chronologically.

  By taking the arithmetic ratio between mental age and chronological age (a ratio multiplied by 100 in order to eliminate decimals), we obtain the intelligence quotient (IQ), ‘one of the best-known concepts’, writes Eysenck, ‘to anyone who is concerned with psychology in any capacity’. Three children with a mental age of eight years, but chronological ages of six, eight, and twelve years, will have IQs of 133 (8/6 x 100), 100 (the normal IQ), and 67, respectively.
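
  A minimal sketch of this ratio in Python (rounding to the nearest whole number is an assumption, since the text only specifies the multiplication by 100):

```python
def classical_iq(mental_age, chronological_age):
    """Classical ratio IQ: mental age divided by chronological age, times 100."""
    return round(mental_age / chronological_age * 100)

# Eysenck's three children, all with a mental age of eight:
for age in (6, 8, 12):
    print(f"chronological age {age}: IQ {classical_iq(8, age)}")  # 133, 100, 67
```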

  In the European population, approximately 50% of people have an IQ between 90 and 110, 25% a higher IQ, and 25% a lower one. Among the best, Eysenck says, ‘there are about 14.5% with IQs between 110 and 120, 7% between 120 and 130, 3% between 130 and 140, and 0.5% above 140’. There is an international association called Mensa, founded by Sir Cyril Burt, which groups people whose quotient exceeds 148 (after they have passed various tests). It has 20,000 members in some 15 countries. (Mensa-France was created on the initiative of Robert Lehr.)
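
  As a simple consistency check (a Python sketch; the grouping is taken directly from the figures quoted above), the four upper bands do indeed add up to the 25% of the population said to lie above an IQ of 110:

```python
# Shares of the European population by IQ band, as quoted above (percent).
upper_bands = {
    "110-120": 14.5,
    "120-130": 7.0,
    "130-140": 3.0,
    "above 140": 0.5,
}

print(sum(upper_bands.values()))  # 25.0 -- matches the 25% said to lie above 110
```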

  Individuals whose IQ is less than 70 are considered as ‘mentally deficient’. Among these, imbeciles (IQ 25 to 50) and idiots (less than 25) are usually entrusted to specialised institutions. Morons (from 50 to 70) are capable of carrying out, under surveillance, certain concrete operations (as opposed to formal operations, which require a capacity for abstraction).

  Of course, the distinction between mental age and chronological age is only valid for children, whose intellectual development is not yet stabilised. For adults, for whom each additional year no longer has the same significance, psychologists introduce ‘corrective weights’. ‘What we actually tell an adult when we tell him his IQ’, says Professor Eysenck, ‘is that if the concept of IQ were still applicable at his age, that is the IQ he would have actually obtained’.

  Studies carried out according to professional category show that, from a statistical point of view (there are of course exceptions), there is a very clear correlation between social position and successful results in intelligence tests. We find individuals of all IQs in each social class, but the average of these IQs varies according to class. A British study, for example, has distinguished six social classes, from the ‘higher professional’ to the ‘unskilled’, whose average IQs are 140, 130, 116, 108, 90, and 85 respectively.

  Predictive Value

  The misuse that certain ‘reductive’ industrial psychologists (the ‘deluded psychopaths’457 denounced by de Montmollin in a book published in 1972) have made of intelligence tests has sparked criticisms that are sometimes justified. It is certainly very difficult to quantify intelligence or character. Moreover, an assessment (of an ‘analytic’ kind) of the sum of different personality traits gives us only an incomplete picture of that personality regarded as a whole. Nevertheless, it is also certain that IQ evaluation has a predictive value: it allows us, statistically speaking, to predict what socio-educational level an individual is most likely to reach. ‘The connection between IQ and academic achievement continues throughout the course of studies’, says Jean-Louis Lavallard. ‘(…) The students with higher than average IQs apply more frequently for entry into bachelor’s degrees and are more often accepted. They have also gone into sectors reputed to be more difficult’ (Le Monde de l’éducation, October 1975). Lastly, the misuse of a thing does not imply that the thing itself is bad.

  In his book, Professor Eysenck responds to numerous objections.

  Contrary to a widespread belief, therefore, the teacher’s opinion of his pupils does not decisively influence the grades he gives them. Indeed, ‘if one compares a teacher’s assessments of a pupil with the results obtained by the same pupil in an IQ test, one sees that there is a close correlation between the two’.

  Of course, there are always exceptions. A very bright child can get bad results at school. But that does not mean that he is ‘less intelligent’ than one thought, or that his teachers ‘have it in for’ him. His failures are more likely due to character traits that have nothing to do with intelligence: the ability to work in groups, assiduity, accuracy, and personal interest, among other things.

  ‘To criticise a measure of intelligence on the pretext that it does not teach us anything about non-intellectual qualities is not an acceptable attitude’. But the reverse is not true: ‘Those who have a low IQ cannot succeed in intellectual or academic careers’.

  Can we say, however, that there are many kinds of intelligence?

  During the 1920s, the nature of intelligence was the subject of debates that are somewhat outdated today. We call intelligence the set of mental faculties that prove best suited to existence in an evolved society: speed of comprehension, a spirit of analysis and synthesis, an aptitude for logic and abstract reasoning, and a sense of adaptation.

  The subjective aspect of this definition should not surprise us: intelligence is not an absolute fact; it varies from culture to culture. But its ‘subjective’ nature does not prevent it from being evaluated. The sensation of ‘hot’ and ‘cold’ is also subjective. This does not prevent heat from being measured ‘objectively’ by means of the thermometer. ‘Despite greater complexity’, says Eysenck, ‘intelligence tests are comparable to the thermometer: they give objective indications on something that is not objective’. Moreover, when the thermometer was invented, little was known about the nature of heat.

  Tried out, revised, and refined since the beginning of the century, these tests have proven their worth. Their results can be expressed in the form of curves and equations with a ‘reliability’ of 90 to 95%.

  It should not be forgotten, however, that these measures are only statistical data. In scientific matters, ‘true’ only means ‘endowed with a very high degree of probability’, except in the field of mathematics, whose propositions are always true precisely because they tell us nothing about the realities that they describe. IQ itself is an average: from one category of tests to another, the disparity in ‘performance’ often reaches ten points.

 
