The Psychology Book


by DK


  IN CONTEXT

  APPROACH

  Cognitive development

  BEFORE

  1920s Lev Vygotsky develops his theory that cognitive development is both a social and a cultural process.

  1952 Jean Piaget publishes his developmental theories in his book, The Origins of Intelligence in Children.

  AFTER

  1960s The teaching program “Man: A Course of Study (MACOS),” based on Bruner’s theories, is adopted in schools in the US, the UK, and Australia.

  1977 Albert Bandura publishes Social Learning Theory, which looks at development through a mixture of behavioral and cognitive aspects.

  The field of developmental psychology was dominated throughout much of the 20th century by Jean Piaget, who explained how a child’s thinking develops and matures in stages, as a result of a natural curiosity to explore the environment. Lev Vygotsky’s theory, which appeared in English shortly after Piaget’s, also claimed that a child finds meaning through experience, but widened the meaning of the word “experience” to encompass cultural and social experience. Children, he said, learn mainly through interaction with other people.

  At this point in the 1960s, the “cognitive revolution” was gaining momentum; mental processes were increasingly being explained by the analogy of the brain as an “information processor.” Jerome Bruner was a key figure in this new approach, having previously studied the ways that our needs and motivations influence perception—and concluding that we see what we need to see. He became interested in how cognition develops, and so began to study cognitive processes in children.

  A spiral curriculum would work best in schools, Bruner suggested. This involves a constant revisiting of ideas, building incrementally until the child reaches a high level of understanding.

  The mind as processor

  Bruner began his investigations by applying cognitive models to Piaget’s and Vygotsky’s ideas, shifting the emphasis in the study of cognitive development from the construction of meaning to the processing of information: the means by which we acquire and store knowledge. Like Piaget, he believes that acquiring knowledge is an experiential process; but, like Vygotsky, he sees this as a social occupation, not a solitary one. He maintains that learning cannot be conducted unassisted: some form of instruction is essential to a child’s development, but “to instruct someone… is not a matter of getting him to commit results to mind. Rather, it is to teach him to participate in the process.” When we acquire knowledge, we need to actively participate and reason, rather than passively absorb information, because this is what gives knowledge meaning. In terms of cognitive psychology, reasoning is seen as “processing information,” so the acquisition of knowledge should be seen as a process, not a product or end result. We need encouragement and guidance in that process, and for Bruner, that is the role of a teacher.

  In The Process of Education (1960), Bruner presented the idea that children should be active participants in the process of education. The book became a landmark text, altering educational policy in the US at both government and classroom levels.

  JEROME BRUNER

  The son of Polish immigrants in New York City, Jerome Seymour Bruner was born blind, but regained his sight after cataract operations at the age of two. His father died of cancer when Bruner was 12, and his grief-stricken mother moved the family frequently during his subsequent school years. He studied psychology at Duke University, then at Harvard, where he gained his PhD in 1941; his teachers there included Gordon Allport and Karl Lashley.

  Bruner served with the Office of Strategic Services (a US intelligence agency) during World War II, then returned to Harvard, where he collaborated with Leo Postman and George Armitage Miller. In 1960, he cofounded the Center for Cognitive Studies at Harvard with Miller, remaining until it closed in 1972. He spent the next ten years teaching at Oxford University in England before returning to the US. Bruner continued to teach into his nineties.

  Key works

  1960 The Process of Education

  1966 Studies in Cognitive Growth

  1990 Acts of Meaning

  See also: Jean Piaget • Lev Vygotsky • Albert Bandura

  IN CONTEXT

  BRANCH

  Cognitive psychology

  APPROACH

  Learning theory

  BEFORE

  1933 Gestalt psychologist Kurt Lewin leaves the Berlin School of Experimental Psychology and emigrates to the US.

  AFTER

  1963 Stanley Milgram publishes his experiments on willingness to obey authority figures, even when orders conflict with one’s conscience.

  1971 Philip Zimbardo’s Stanford prison study shows how people adapt to the roles they are assigned.

  1972 US social psychologist Daryl Bem proposes the alternative self-perception theory of attitude change.

  1980s Elliot Aronson defends Festinger’s theory, conducting experiments into initiation rites.

  By the end of World War II, social psychology had become an important field of research, spearheaded in the US by Kurt Lewin, the founder of the Research Center for Group Dynamics at the Massachusetts Institute of Technology in 1945.

  On the staff at the center was one of Lewin’s former students, Leon Festinger. Originally attracted by Lewin’s work in Gestalt psychology, he later took an interest in social psychology. In the course of his research, Festinger observed that people continually seek to bring order to their world, and a key part of that order is consistency. To achieve this, they develop routines and habits, such as establishing regular mealtimes and choosing favorite seats on their daily commute to work. When these routines are disrupted, people feel very uneasy. The same is true, he found, of habitual thought patterns or beliefs. If a very strong opinion is met with contradictory evidence, it creates an uncomfortable internal inconsistency; Festinger called this “cognitive dissonance.” He reasoned that the only way to overcome this discomfort is to somehow make the belief and the evidence consistent.

  Unshakeable conviction

  After reading a report in a local newspaper in 1954, Festinger saw an opportunity to study the reaction to just such a cognitive dissonance. A cult claimed to have received messages from aliens warning of a flood that would end the world on December 21; only true believers would be rescued by flying saucers. Festinger and some of his colleagues at the University of Minnesota gained access to the group, interviewing its members before the designated apocalyptic date and again afterward, when the prophesied events had failed to occur.

  The now-famous Oak Park study of this group, written up by Festinger, Henry Riecken, and Stanley Schachter in When Prophecy Fails, describes the reaction of the cult members. Where common sense might lead us to expect that the failure of their prediction and consequent cognitive dissonance would cause cult members to abandon their beliefs, the opposite occurred. As the day of reckoning drew near, another “message” came through, declaring that, due to the group’s dedication, the world was to be spared. Cult members became even more fervent believers. Festinger had anticipated this; to accept the contradictory evidence would set up an even greater dissonance between past belief and present denial, he argued. This effect was compounded if a great deal (reputation, jobs, and money) had been invested in the original belief.

  Festinger concluded that cognitive dissonance, or at least the avoidance of it, makes a man of strong conviction unlikely to change his opinion in the face of contradiction; he is immune to evidence and rational argument. As Festinger explains: “Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

  LEON FESTINGER

  Leon Festinger was born in Brooklyn, New York, to a Russian immigrant family. He graduated from City College of New York in 1939, then studied at the University of Iowa under Kurt Lewin, finishing his PhD in Child Psychology in 1942. After spending the later years of World War II in military training, he rejoined Lewin in 1945 at the Research Center for Group Dynamics at the Massachusetts Institute of Technology (MIT).

  It was during his appointment as professor at the University of Minnesota that Festinger made his famous Oak Park study of a cult predicting the end of the world. He moved to Stanford University in 1955, continuing his work in social psychology, but in the 1960s he turned to research into perception. He later focused on history and archaeology at the New School for Social Research in New York. He died of liver cancer, aged 69.

  Key works

  1956 When Prophecy Fails

  1957 A Theory of Cognitive Dissonance

  1983 The Human Legacy

  See also: Kurt Lewin • Solomon Asch • Elliot Aronson • Stanley Milgram • Philip Zimbardo • Stanley Schachter

  IN CONTEXT

  APPROACH

  Memory studies

  BEFORE

  1885 Hermann Ebbinghaus publishes his pioneering book Memory: A Contribution to Experimental Psychology.

  1890 William James makes the distinction between primary (short-term) and secondary (long-term) memory in The Principles of Psychology.

  1950 Mathematician Alan Turing proposes a test for judging whether a computer can be considered a thinking machine.

  AFTER

  1972 Endel Tulving makes the distinction between semantic and episodic memory.

  2001 Daniel Schacter proposes a list of the different ways we misremember in The Seven Sins of Memory.

  George Armitage Miller once famously complained: “My problem is that I have been persecuted by an integer. For seven years this number has followed me around.” So begins his now-famous article The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information. He goes on: “There is… some pattern governing its appearances. Either there really is something unusual about the number or I am suffering from delusions of persecution.” Despite the whimsical nature of his title and introduction, Miller had a serious intent, and the article was to become a landmark of cognitive psychology and the study of working memory (the ability to remember and use pieces of information for a limited amount of time).

  "The persistence with which this number plagues me is far more than a random accident."

  George Armitage Miller

  Miller’s paper was published in The Psychological Review in 1956, when behaviorism was being superseded by the new cognitive psychology. This fresh approach—which Miller wholeheartedly embraced—focused on the study of mental processes, such as memory and attention. At the same time, advances in computer science had brought the idea of artificial intelligence closer to reality, and while mathematicians, such as Alan Turing, were comparing computer processing with the human brain, cognitive psychologists were engaged in the converse: they looked to the computer as a possible model for explaining the workings of the human brain. Mental processes were being described in terms of information processing.

  "The process of memorizing may be simply the formation of chunks… until there are few enough chunks so that we can recall all the items."

  George Armitage Miller

  Miller’s main interest was in the field of psycholinguistics, stemming from his work during World War II on speech perception, which formed the basis for his doctoral thesis. This led him to take an interest in the growing field of communications, which in turn introduced him to information theory. He was particularly inspired by Claude Shannon, a leading figure in communications, who was investigating effective ways of turning messages into electronic signals. Shannon’s communication model, which involved translating ideas into codes made up of “bits,” underpins all digital communication. Miller was inspired to look at mental processes in a similar way, and to establish the ground rules for the modern field of psycholinguistics in his 1951 book, Language and Communication.
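
  Shannon’s measure of information is easy to make concrete. The short Python sketch below is our illustration, not anything drawn from Shannon’s or Miller’s own texts: it computes the number of bits needed to single out one of n equally likely alternatives, which is how channel capacities came to be stated in “bits.”

```python
import math

def bits(n_alternatives: int) -> float:
    """Bits of information needed to pick one of n equally likely alternatives."""
    return math.log2(n_alternatives)

print(bits(2))  # 1.0 -- a single yes/no decision carries one bit
print(bits(7))  # ~2.81 -- roughly the capacity behind "seven categories"
```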

  "The kind of linguistic recoding that people do seems to me to be the very lifeblood of the thought processes."

  George Armitage Miller

  Seven categories

  Miller took Shannon’s method of measuring information and his idea of “channel capacity” (the amount of information that can be processed by a system) and applied it to the model of short-term memory as an information processor. This was when he began to be “persecuted” by the recurrence and possible significance of the number seven; “sometimes a little larger and sometimes a little smaller than usual, but never changing so much as to be unrecognizable.”

  The first instance of the “magical” number came from experiments to determine the span of absolute judgment—how accurately we can distinguish a number of different stimuli. In one experiment cited in Miller’s paper, the physicist and acoustic specialist Irwin Pollack played a number of different musical tones to participants, who were then asked to assign a number to each tone. When up to around seven different tones were played, the subjects had no difficulty in accurately assigning numbers to each of them, but above seven (give or take one or two), the results deteriorated dramatically.

  In another experiment, carried out by Kaufman, Lord, and colleagues in 1949, researchers flashed varying numbers of colored dots onto a screen in front of participants. When there were fewer than seven dots, participants could accurately number them; when there were more than seven, they were only able to estimate the number. This suggests that the span of attention is limited to around six, and it caused Miller to wonder whether the same basic process might be involved in both the span of absolute judgment and the span of attention.

  The tones and dots in these experiments are what Miller calls “unidimensional stimuli” (objects that differ from one another in only one respect); but what interests Miller is the amount of information in speech and language we can effectively process, and items such as words are “multidimensional stimuli.” He looks to later studies by Pollack in which the simple tones were replaced by tones that varied in six ways (such as pitch, duration, volume, and location). Surprisingly, despite the apparently larger amount of information, the results still pointed to a limit of seven, plus or minus two. The difference is that as more variables are added, accuracy slightly decreases. Miller claims this allows us to make “relatively crude judgments of several things simultaneously.” It may explain how we are able to recognize and distinguish such complex things as spoken words and people’s faces, without having to process the individual sounds or features.

  Miller sees the human mind as a communication system: as the input information increases, the amount transmitted to the brain also increases initially, before leveling off at an individual’s “channel capacity.” Miller then took this idea of channel capacity a stage further, applying it to the model of short-term memory. William James first proposed the notion of short-term memory, and it had long been an accepted part of the model of the brain as an information processor, coming between the sensory input of information and long-term memory. Hermann Ebbinghaus and Wilhelm Wundt had even suggested that short-term memory had a capacity limited to around seven items (seven, again). Miller believed that what he called working memory had a capacity that corresponded to the limits of absolute judgment and span of attention.
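
  The leveling-off Miller describes can be pictured with a toy model. The snippet below is a hypothetical sketch of ours (the capacity value and the min() rule are assumptions, not formulas from Miller’s paper): transmitted information tracks the input until it reaches a fixed channel capacity, then goes flat.

```python
import math

CAPACITY_BITS = 2.8  # assumed capacity, in the region of "seven categories"

def transmitted(input_bits: float) -> float:
    """Toy model: output tracks the input, then plateaus at channel capacity."""
    return min(input_bits, CAPACITY_BITS)

for n_stimuli in (2, 4, 8, 16, 32):
    print(n_stimuli, round(transmitted(math.log2(n_stimuli)), 2))
# 2 1.0 / 4 2.0 / 8 2.8 / 16 2.8 / 32 2.8 -- the curve levels off
```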

  An experiment into the span of attention presented participants with random patterns of dots flashed on a screen for a fraction of a second. Participants instantly recognized the number if there were fewer than seven.

  Bits and chunks

  In terms of our ability to process information, if working memory is limited to about seven elements, there is a potential bottleneck restricting the amount that can be put into long-term memory. But Miller suggested that there was more to the correspondence than just the number seven, no matter how magical it appeared. The multidimensional stimuli of previous experiments could be seen as composed of several “bits” of related information, but treated as a single item. Miller believed that by the same principle, working memory organizes “bits” of information into “chunks,” to overcome the informational bottleneck caused by our limited spans of absolute judgment and short-term memory. A chunk is not, however, just an arbitrary grouping, but an encoding of bits into a meaningful unit; for example, a string of 21 letters represents 21 bits of information, but if this can be broken down into a sequence of three-letter words, it becomes seven chunks. Chunking is dependent on our ability to find patterns and relationships in the bits of information. To someone who does not speak the same language, the seven words might be meaningless, and would not constitute seven chunks, but 21 bits.
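
  The 21-letters-into-seven-words arithmetic can be sketched in a few lines of Python; the string and helper function here are our own hypothetical example, not Miller’s.

```python
def chunk(letters: str, size: int = 3) -> list[str]:
    """Group a stream of letters into fixed-size chunks."""
    return [letters[i:i + size] for i in range(0, len(letters), size)]

stream = "thecatandthedogranoff"  # 21 letters -- 21 separate items
words = chunk(stream)             # reads as 7 three-letter words
print(len(stream), "letters ->", len(words), "chunks:", words)
# A reader who spots the words holds 7 chunks; one who cannot find
# the pattern must hold all 21 letters.
```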

 
