Tomorrow's People


by Susan Greenfield


  If the maturing of the Information Age is to revolutionize all aspects of education – from how we learn and what we learn to what we learn with – then it is no surprise that another change might be where we learn. Glen Russell, Lecturer in Education at Monash University, sees three types of virtual schools on the horizon. First, the ‘Independent’, where students access and interact with materials whenever they wish; such a school would not rely on real-time communication between student and teacher. Second, ‘Synchronous’ schools, where scheduled online meetings take place with other students and teachers, consisting of live chats and video-conferencing. This system would offer more socialization but reduced flexibility in timing. Third, ‘Broadcast’ schools, where students would access lectures or broadcasts on the web; the biggest disadvantage here, of course, is that interaction would be restricted.

  In the future, needless to say, all three types of school could combine. But there could, in any event, be a big problem when students work from home, using the internet, without ever being part of a conventional classroom. It may well be that those working remotely won't be just those who are physically unable, through disability or geographical location, to attend conventional school; an increasing number of ‘regular’ students will be attending virtual schools simply because they wish to. There is the very likely prospect of a dual education system. But the big issue that virtual education throws into focus is what a school is actually for.

  The development of real face-to-face human relations, many would argue, is as important as learning facts. The consequences of switching to virtual schooling could be that students will fail to develop an understanding of their own emotions, and those of others; the patterns of lifelong friendship will be reduced; and mere facts could take precedence over the wisdom that comes from real experiences and spontaneous dialogues. The prospect of a solitary student working alone on a computer for their entire education is not intuitively appealing, but perhaps virtual schools are inevitable nonetheless. Our current education system could be viewed as a product of the Industrial not the Information Age. Students are currently subdivided into classrooms and year groups, given standardized textbooks, made to memorize information and regurgitate it in tests. This grading, like factory quality control, fitted the needs of the 19th century very well. In the UK, at least, the traditional public school, with its anti-intellectual culture, its emphasis on team games and group leadership, along with physical discomfort and abysmal food, equipped its pupils perfectly to run the remote and inhospitable corners of the British Empire – to work closely with others within a rigid order and hierarchy that was never challenged.

  But now… Apparently the sum total of knowledge in the world – or perhaps I should say information – is doubling every four years. Multimedia are being developed with text, sound, photos and video that could tailor the material on a syllabus to an individual child's learning style, be it association, abstract, visual or whatever. Moreover, the buildings of traditional schools are costly: one estimate for the USA is that it needs to invest $112 billion to repair or upgrade 80,000 school premises. So perhaps there is a persuasive argument for dispensing with one-size-fits-all curricula and, indeed, one-size-fits-all bricks-and-mortar schools.
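  To put that estimate in perspective, simple division gives the scale per site:

$$\frac{\$112\text{ billion}}{80{,}000\text{ schools}} \approx \$1.4\text{ million per school}$$

a reminder of just how much capital is locked up in the physical plant of mass schooling.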

  Different models are possible for a virtual school of the future: perhaps a dense neighbourhood of sub-network schools, or block schools with ten children or fewer, or simply home-schools where a child goes at their own pace. Approximately 150,000 children in Britain are home-schooled, and this figure is predicted to triple by 2010. This hyper-localized learning, perhaps with parents working from home themselves, or with friends or neighbours wanting to teach in the context of a particular religion or culture, would necessitate socialization through different methods, hobbies or sports. The obvious criticism, that the minds of the young will be narrowed within their culture, or even within their families, could be offset by the added advantage of distance learning with children from different cultures all over the world.

  Since specific facts will be accessed on a just-in-time basis, emphasis will ideally be on how to learn – mastering concepts, thinking critically and expressing oneself effectively: no less than a preparation for the lifelong, independent learning that the lifestyle of the second half of the 21st century will entail. Following this theme, perhaps formal education will end much earlier. Learning will be less associated with institutions, and more with need-driven learning in the workplace.

  Future generations will not therefore be generalists. The Information Age could lead to more specialization, not less, as the specific needs of diverse aspects of society are met by appropriate technologies, ever-changing and needing ever-changing skills, as we saw in Chapter 4. We could be on the brink of returning to the mood of the mid-19th century, when most of the workforce were apprenticed at a young age and learnt one ‘vocational’ skill that fitted the needs of society. The only difference compared to the old system would be that, within a certain trade or vocation, constant retraining – lifelong learning – would be an accepted feature of everyday life. And the skills would not be manual but invariably IT-related.

  The problem with an IT-based education, with the focus on the individual going at their own pace for their own individual needs and curiosities, is that surely there will be an inevitable loss of direction in terms of what we are learning as a cohesive society. If, rather than thinking, everyone is having an experience with interactive and personalized pedagogic programmes, how might we make progress in increasing our common, shared knowledge-base and our progressive quest for the truth? The cynical might even say that this new type of cyber-education will hold out the lure of a quick fix for problems of pupil attainment. Everyone will be a winner; an individual curriculum might be the ultimate dumbing-down.

  Distance learning has the undoubted advantages of offering frequent opportunities to experiment with new technologies, minimal instructor travel, access to guest speakers, along with just-in-time training that couldn't be achieved previously, since conventional classes required preparation done a week or so in advance. A further advantage of multiple-conferencing technologies is that they might actually be preferable to face-to-face communication, since one would get to know more fellow students than just those sitting nearby. The fact that students seem better able to fit classes into their schedule, plus the fact that there are shorter classes scheduled over a longer time, may account for why the drop-out rate from such classes so far has proved very low.

  But what about this new technology applied to universities? Already almost two out of three American children go on to higher education, and this population is set to swell to 16.3 million by 2009, according to the US Department of Education's figures. Currently, the over-25s account for some 40 per cent of the student community. In a few years, mature students over 35 will exceed 18- to 19-year-olds. Virginia Tech already offers a 24-hour-a-day programme in undergraduate maths taught electronically. Students work in a communal space with computers whilst faculty and teaching assistants roam the aisles rather than leading the class. A 32-year-old student is less than enthusiastic: ‘This is Orwellian Math. It's just you and the machine, and the professor is this shadowy figure who emails you once a week.’

  But then, test scores and enrolment are up, with fewer drop-outs, and the costs are down. As with secondary schools, there could be various types of virtual university. First, public educational offerings such as those developed already by the Universities of Texas and California; second, collaborative educational efforts between universities, such as the Western Governors virtual university, a collaboration of most western states excluding Texas and California; third, private entities, such as the University of Phoenix with 60,000 students; finally, there will be an increasing number of corporate training establishments, as exemplified by the Marriott corporation, which offers worldwide training programmes as well as educational services to other businesses.

  Once again, even more perhaps than at secondary school, we have to ask: if students can work at their own pace and direction to an extreme degree, what then constitutes a course? The notion of personalized degrees is strange, but might be a feature of the future academic landscape. Given the individual needs, interests and abilities that can now be catered for, and given that computers could structure and mark tests, there is no technical reason why we should not all take this path of ultra-individualization. There would have to be, of course, some way of assigning a value to a particular piece of knowledge or experience. In addition, corporate degrees might become popular, with exactly the right blend of skills for each company, or standardization across companies who produce their own modules for assessment.

  Universities in the future therefore will be not so much places where you gain knowledge for life as engines for technological advancement. Yet Jim Dator of the University of Hawaii sees the change as inevitable, and bricks-and-mortar universities as relics of the past. He argues that although universities are very old, the public education system has been going for only 150 years and was designed to meet the needs of emerging industrial states. Formal education was not needed in the slow-moving, agricultural, feudal economies. But then, farmers and peasants were transformed into workers and managers. The idea, from the point of view of the state, was never to enable scholars to ‘pursue truth’.

  As we enter the 21st century, in education, as in life generally, space and time will become less and less standardized. Students will be learning at different rates in different places. The international dimension of new technologies will enable a sharing of different cultural perspectives. Western thought will no longer dominate, but is predicted to come fourth behind Confucian, Hindu and Islamic agendas. In addition, students will gain experience from simulated real-time experiences in virtual hospitals and factories. But research might be cut, unless there is a direct military or commercial spin-off. Meanwhile campuses may well become theme parks: Bill Gates has already funded such a venture at Harvard, where the scene is set permanently in 1925, complete with lectures on the topics of the period, such as Marxism and relativity theory. Imagine, then, the future education of your grandchildren or great-grandchildren, inevitably in some type of virtual university, on a course that is highly tailored either to them personally or to the needs of their employer. David Waguespack of the University of Oregon sums up the situation:

  The virtual university is truly a mixed bag. Among the benefits are competition, choice, and greater educational access. At best, the virtual university will force regional colleges to improve and rethink teaching, because alternatives will be available. Among the downsides are turning education into a commodity and a degraded college experience. At worst, the virtual university will create a situation where credentials take precedence over learning, and educational convenience masquerades as greater access. The truth is likely somewhere between the two extremes. What ultimately comes of the virtual university depends not on the opinions of academics, however, but on whether the consumers of education want it.

  So what do we want of education? The most gloomy prediction is that we will be living in a society, albeit a global one, geared to its own material needs and desires, where time and space have little relevance. We will inhabit a world of experience, more specifically, screen experience, rather than abstract thought; answers will crowd onto our screens and compete for attention, no longer linked to any clear questions. There may well be nothing about our new world that we need to ponder. Universities will no longer be a central plank in our culture, primarily because no one will believe any longer that ‘the truth’ is out there waiting to be discovered, let alone that it is beautiful. Is this intellectual heresy really what awaits us?

  7

  Science: What questions will we ask?

  I think people get the wrong impression about scientists in that they think in an orderly, rigid way from step 1 to step 2 to step 3. What really happens is that often you make some imaginative leap which at the time may seem nonsensical. When you capture the field at those stages it looks like poetry in which you are imagining without yet proving.

  Paul Steinhardt, physicist.

  Having looked at how we will live, work, love and learn, the time has come to ask what the future holds for the human imagination: how will our successors cope when, and if, it comes to tackling new, big questions of science? So far our entire journey into the forthcoming decades of the 21st century has been driven by the startling advances in a vast range of technologies, but these dazzlingly innovative incursions into our lives actually have their inception in basic concepts introduced in the 20th century – computers and genetics.

  As it happens, computers and genetics, though seemingly unrelated, do share a common origin, a single leap of imagination in basic science almost a century ago. That colossal intellectual milestone was quantum theory, pioneered by Werner Heisenberg and Erwin Schrödinger in the 1920s. Quantum theory challenged the idea, seemingly impregnable at the time, that waves and particles were distinct and suggested instead that they were inseparable. Heisenberg and Schrödinger used the notion that waves and particles were really two sides of the same coin to describe the ‘quantization’ of energy, the process whereby energy can be transferred only in packets and not, as had been thought until then, in a continuous manner. Abstract and baffling as quantum theory may sound, the insights it gave into the basics of matter and energy were to have astounding implications for more down-to-earth branches of science. Advanced devices such as lasers and transistors, and therefore ultimately computers, rely on the principles of quantum theory. Likewise in biology, the currently emerging feats of gene manipulation, triggered by our ability to manipulate atoms, are reliant on an understanding of molecular bonds and the technique of X-ray crystallography, both of which hark back to quantum theory.
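  To make ‘quantization’ concrete, the textbook Planck relation captures it in a single line: radiation of frequency $\nu$ can exchange energy only in whole packets,

$$E = nh\nu, \qquad n = 0, 1, 2, \ldots$$

where $h \approx 6.626 \times 10^{-34}$ joule-seconds is Planck's constant. Light of a given frequency thus delivers its energy in indivisible lumps of size $h\nu$, never in a continuous stream.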

  These knock-on effects of quantum theory have provided enough work for generations of scientists and technologists over the previous century, and will keep their contemporary counterparts fully occupied well into this one, contemplating its further resounding implications. But was quantum theory, and the various scientific revolutions it spawned, a one-off? Some think that we will see no comparable great breakthroughs. The science journalist John Horgan, author of The End of Science, for example, claims that: ‘Scientists will continue making incremental advances, but they will never achieve their most ambitious goals, such as understanding the origin of the universe, of life and of human consciousness.’

  How valid is this view? When we looked at future education in the previous chapter the threat loomed large of a new way of life, one that emphasized the passive, the hedonistic and the experiential over abstract thought and imagination. Such a state of mind, obviously, would doom any further original scientific endeavour. But, for the sake of argument, let's assume that somehow human imagination manages to survive the suffocation of an utterly comfortable lifestyle. If indeed highly innovative ideas and insights that challenge existing thinking always take time to be translated into practical technology that affects everyone's daily existence, then the thoughts of scientists of this century will be setting the pace and agenda for life well into 2200 and beyond.

  Let's think about who those scientists will be. The rugged individualists of the 19th century, such as Michael Faraday and John Dalton, who set up labs in their cellars, scrabbled around rocks or simply observed, like Darwin, the world around them, eventually gave way to the more institutionalized genre of academic scientists, funded by governments and charitable trusts to give, as Haldane predicted long ago, ‘the answer of the few to the demands of the many for wealth, comfort and victory’. But nowadays the horizons of the Ivory-Tower boffin are being widened by two new trends. First, there is a growing need for innovative science in the private sector, as companies in high-tech industries, particularly pharmaceutical companies, depend for survival on having novel products in the pipeline. Allowing for a rosy enough economic situation – a big assumption – there will therefore be a need for more ideas-based companies; this drive for intellectual innovation could result in a smorgasbord of new opportunities for the development of intellectual property within universities. If entrepreneurs regain the confidence and capability to invest in high technology, then science will prove a more lucrative, exciting and, above all, useful career, and it may become a profession as structured, and as popular, as law or medicine within the next few decades.

  The second recent trend is the resurgence of the amateur. As an example, take the Royal Institution in London, where I am Director: it was founded in 1799, in the words of its charter, to ‘diffuse science for the common purposes of life’, and the idea was to ensure that lectures on the scientific discoveries of the time were as rewarding as any other evening outing. So successful were these presentations by the great scientists of the day, complete with bangs, smoke and other exciting demonstrations, that the lecture theatre in central London rapidly became a fashionable salon, a place to be seen as much as to think and to question how science – for example, in the shape of the newly invented electric motor – was impacting on and changing the accepted way of life. We are now seeing a revival of this attitude. For the last few decades popular-science books and science broadcasts have captured the public imagination. Someone has even remarked that science is now like foreign travel was in the 19th century: because it was not accessible at first hand, people were keen to pursue it indirectly, through the eyes of an intrepid but always articulate pioneer. Science debates, and events such as authors of science books lecturing at literary festivals, attract hundreds if not thousands at a sitting; above all, the public are realizing that they need to be empowered with knowledge, to be scientifically literate, if they are to contribute to the great debates that science will inspire this century.

 
