
The One World Schoolhouse: Education Reimagined


by Salman Khan


  India loves its Bollywood movies, and even in the most remote rural villages there is almost always someone with a first-generation DVD player and a television set. Thanks to grant money that Khan Academy has received, we already have video lessons translated into Hindi, Urdu, and Bengali (as well as Spanish, Portuguese, and several other languages) and copied onto DVDs, to be distributed free of charge.

  Admittedly, just having students watch the videos is not ideal; with DVDs alone, they would not be able to do the self-paced exercises or have access to a great deal of feedback. Even so, video lessons on DVD would be a significant improvement over what’s available now. Their availability would ameliorate the teacher shortage situation; kids would at least be able to pause, repeat, and review the lessons. And it would be a win—wouldn’t it?—if we could give kids in the world’s poorest areas even a cheap approximation of what the wealthy have.

  But say we aim higher. Say we aim ridiculously high. Say we aim to give kids in poor rural villages around the world virtually the same experience as kids in Silicon Valley have. This is preposterous, right? Well, I believe it can be done.

  Consider: Inexpensive tablet computers (think of smaller, cheaper iPads) are coming onto the market in India for under $100. If it can be expected to run for around five years, the annual cost of owning this device is $20. As I’ve already explained, the Khan Academy curriculum is designed so that students can get what they need in one to two hours a day of following lessons and working out problems; this means that a single tablet could be used by four to as many as ten students a day. But let’s take the most conservative number; if the computer is shared among four students, the cost is $5 per student per year. Let’s now give our students some downtime and some sick days, and posit that the computer is used three hundred days a year. The cost is thus under two cents per student per day. Can anyone tell me in good conscience that this is more than the world can afford? Even more, the technology will only get better and cheaper from here on out.
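  To make this back-of-the-envelope arithmetic concrete, here is a small sketch in Python. The $100 tablet, five-year lifespan, four students per device, and three hundred school days are the figures assumed in the paragraph above, not measured data; the variable names are purely illustrative.

      # Cost of a shared low-end tablet, using the figures from the text
      tablet_price = 100.0          # dollars, low-end tablet in India
      lifespan_years = 5            # assumed useful life of the device
      students_per_tablet = 4       # the most conservative sharing assumption
      school_days_per_year = 300    # allowing for downtime and sick days

      annual_device_cost = tablet_price / lifespan_years                   # $20 per year
      cost_per_student_year = annual_device_cost / students_per_tablet     # $5 per student
      cost_per_student_day = cost_per_student_year / school_days_per_year  # about $0.017

      print(f"${cost_per_student_day:.4f} per student per day")   # under two cents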

  Realistically, cheap tablet devices alone do not suffice to re-create a Silicon Valley–style virtual educational experience. There remain the questions of Internet connectivity and the gathering and use of data regarding students’ progress. These are logistical challenges that will vary in different locations, but the general point I want to make is that with some imagination and technological savvy, the challenges can be met far more cheaply than is usually acknowledged.

  Without getting too technical, consider Internet access. Broadband connections would be nice, but broadband is relatively expensive and not currently available everywhere. There are much cheaper alternatives. Bandwidth-hogging videos could be preloaded on devices, and user data could be transmitted over cellular networks. If there is no cellular connectivity, information regarding students’ work and progress could be downloaded from individual computers, copied onto flash drives, and carried in a truck to central servers. They could be carried on a donkey! The point I’m getting at is that not everything in high-tech education has to be high-tech. There are hybrid solutions right in front of us—if we are open to them.

  Coming back to cost, cellular Internet connectivity can be had for as little as $2 per month in India. So our per-student expense has now increased to $11 per year ($44 per year per device with Internet that can be shared by four students). Let’s further suggest a worst-case scenario in which not even this meager amount can be procured from public or philanthropic funds. What then?
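  Folding in connectivity, the same sketch extends directly; the $2-per-month cellular figure is the one quoted above, and the rest are the earlier assumptions.

      # Device plus cellular connectivity, shared among four students
      annual_device_cost = 100.0 / 5      # $20 per year for a $100 tablet over five years
      annual_connectivity = 2.0 * 12      # $24 per year at $2 per month
      annual_cost_per_device = annual_device_cost + annual_connectivity   # $44 per device
      cost_per_student_year = annual_cost_per_device / 4                  # $11 per student

      print(f"${cost_per_student_year:.2f} per student per year")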

  Certainly in a place like India, the price of educating the poor could be covered by the middle class and the well-to-do—not by taxation, charity, or under any sort of compulsion, but by giving prosperous families themselves a much better deal on education.

  Let me explain. In much of the developing world, especially in both South and East Asia, school is regarded not primarily as a place to learn—the rigid conditions don’t allow for much of that—but rather as a place to show off what you know. The actual learning happens before or after school, through the use of private tutors. Even middle-class families tend to see tutors as a necessary expense, and private tutoring is in fact how many teachers actually end up making something approaching a middle-class income. Teachers of advanced subjects are in short supply, and so are tutors in those subjects. Accordingly, tutoring in calculus or chemistry gets pretty pricey.

  What if the families who currently use private tutors were offered an alternative that was far less expensive, far more comprehensive, and designed to a proven international standard? In other words, what if they were offered paid but low-cost access to computer centers that offered Internet-based, self-paced mastery learning? This might be bad news for the private tutors, but it would be good news for everybody else. Middle-class families would spend far less for quality education; kids would have the benefit of a complete, tested curriculum rather than the hit-or-miss of tutors whose own understanding might be less than world-class.

  Supported by the fees of those who could afford them, the centers would be free to the poor and the currently unschooled. The beauty is that the middle-class kids, still attending conventional classes, would use the center in the early morning or the evening. The kids (and adults for that matter) without access to other education could use the facilities during the day.

  Now, as a sworn enemy of one-size-fits-all approaches, I’m not suggesting that this scheme would work everywhere or that it couldn’t be improved upon. But I am convinced that the basic model—providing high-quality, low-cost education to the affluent and middle class and using the revenues to make the same services free to the poor—has a place in how we finance our educational future. In a perfect world, such schemes would not be necessary; governments and societies would see to it that all had access to quality education. In the real world, however, with its blatant inequities and tragic shortfalls in both money and ideas, new approaches are needed to prop up and refresh a tired system that works for some but fails for many. The cost of wasting millions of minds is simply unacceptable.

  The Future of Credentials

  When people talk about education, they are usually mixing together several ideas. The first is the idea of teaching and learning. That is what the bulk of this book is about—how we can rethink the best ways to learn. The second is the idea of socialization. That, too, we have touched on when discussing peer-to-peer collaboration and mixed-age classrooms. The third is the idea of credentialing—giving a piece of paper to someone that proves to the world that they know what they know. These three different aspects of education are muddled together because today they are all performed by the same institutions—you go to college to learn, have a life experience, and get a degree.

  Let’s try a simple thought experiment: What if we were to separate (or decouple) the teaching and credentialing roles of universities? What would happen if, regardless of where (or whether) you went to college, you could take rigorous, internationally recognized assessments that measured your understanding and proficiency in various fields—anything from quantum physics to European history to software engineering? Some could be assessments designed in conjunction with employers looking for people with particular skills. Because these assessments could be even more thorough than what happens during exam time at many universities, they might be expensive, maybe $300 a pop. You could also take these exams at any age.

  Think about the implications. Most students who go to college are not going to nationally known private colleges like Princeton or Rice or Duke. They are also not going to well-known state universities like Berkeley, UT Austin, or the University of Michigan. The great majority of students go to not-well-known regional or community colleges. This is especially the case for students from underrepresented communities because these schools have more open admissions and tend to be more affordable (although they can still be quite expensive). Even if students get an amazing education at these schools, they are at a marked disadvantage. Because employers use the “difficulty of getting in” to a school as a proxy for the quality of its graduates, the students from less well-known schools often fail to pass the résumé screen. College is all about opening up opportunity, but the reality is that the ultra-smart, ultra-hardworking kid from a poor family, who worked full-time while getting good grades at a regional school or community college, will almost always be passed over when compared to someone graduating from a more well-known and selective school.

  With our hypothetical assessments—microcredentials if you will—anyone could prove that they know just as much in a specific domain as someone with an exclusive diploma. Even more, they wouldn’t have had to go into debt and attend university to prove it. They could prepare through textbooks, the Khan Academy, or tutorials from a family member. Because even name-brand diplomas give employers limited information, it would be a way for even elite college graduates to differentiate themselves from their peers, to show that they actually have retained deep, useful skills. In short, it would make the credential that most students and parents need cheaper (since it is an assessment that is not predicated on seat time in lecture halls) and more powerful—it would actually tell employers who is best prepared to contribute at their organizations based on metrics that they find important.

  Now, I do not think this will eliminate the need or value of universities for many students. If you are lucky enough to attend a good university, you will be immersed in a community of inspiring peers and professors doing amazing things. You will build social bonds that are at least as valuable—emotionally and economically—as that first job out of college. You will have life experiences that are priceless. The universities themselves will continue to conduct cutting-edge research that pushes society forward (and in which undergrads can often participate). The signal to employers of getting in and being socialized in these types of communities will always carry weight. College will become something similar to an MBA. It will be optional. You can have a very successful career without it, but it is a great life experience that will probably help if you can afford the time and money.

  What this will change is the opportunities and the ecosystem for the great majority of students who aren’t given the luxury to attend a name-brand school, because now they’d have the opportunity to—at minimum—work toward a recognized credential in whatever way they see fit. It would allow a forty-year-old laid-off factory worker to show that they still have the analytical skills and brain plasticity to work alongside twenty-two-year-old college grads in a twenty-first-century job. It would allow anyone, in any field, to better themselves and prepare for valuable credentials without the sacrifice of money and time that today’s higher education demands.

  What College Could Be Like

  I have never let my schooling interfere with my education.

  —MARK TWAIN

  In the last section, we explored what would happen if credible credentials could be earned outside of a college. I’d like to turn now to a vision of how college education might change to better suit our needs. The starting point for this discussion is a very basic disconnect between most students’ expectations for college—a means to employment first and a good intellectual experience second—and what universities believe their value is—an intellectual and social experience first, with only secondary consideration to employment.

  And it is unfair to expect traditional universities to cater to the whims of the economy or job market. They are designed to be places insulated from the “real world” so that intellectual truth and pure research can be pursued with as few practical constraints as possible. This is what allows them to be truly fertile soil for breakthrough ideas and fundamental discoveries. Even more, some professors—especially those at large research universities—don’t view teaching as the best use of their time, and were not selected to be professors based on teaching ability. They were hired to do research and sometimes view teaching as a necessary evil. I have professor friends who feel lucky when they don’t have to teach a course at all.

  So let us face this as an open-ended design problem—is it possible to craft a university experience that bridges the gap between students’ expectations and professors’ inclinations? One that provides the rich social and intellectual atmosphere of a good existing college, while at the same time exposing students to those intellectual but also practical fields that will make them valuable to the world? Where the faculty is invested in the future of their students and not just their own ability to publish research papers? And now let’s be ambitious: Might there be a sustainable way to make this experience free, or even pay the students to participate?

  Computer science is a good place to start. I know the field reasonably well and I also have a sense for the job market—which is tight and growing tighter every day. It is a field where degrees can be valuable, but the ability to design and execute on open-ended, complex projects is paramount; seventeen-year-olds with unusual creativity and intellect have been known to get six-figure salaries. Because of the demand for talent and the recognition that college degrees and high GPAs are not the best predictor of creativity, intellect, or passion, top employers have begun to treat summer internships as something of a farm league. They observe students actually working and make offers to those who perform the best. Employers know that working with a student is an infinitely better assessment than any degree or transcript.

  Students have also begun to recognize something very counterintuitive: that they are more likely to get an intellectual grasp of computer science—which is really the logical and algorithmic side of mathematics—by working at companies like Google, Microsoft, or Facebook than by reading textbooks or sitting in lecture halls. They see the projects that these companies give their interns as being more intellectually challenging and open-ended than the somewhat artificial projects given in classrooms. Even more, they know that the product of their efforts will touch millions of people instead of just being graded by a teaching assistant and thrown away.

  So, to be clear, in software engineering, the internship has become far more valuable to the students as an intellectual learning experience than any university class. And it has become more valuable to the employer as a signal of student ability than any formal credential, class taken, or grade point average.

  I want to emphasize that these internships are very different from the ones that many people remember having even twenty years ago. There is no getting coffee for the boss, sorting papers, or doing other types of busywork. The projects aren’t just cute things to work on that have no impact on real people. In fact, the best way to differentiate between forward-looking, twenty-first-century industries and old-school, backward-looking ones is to see what interns are doing. At top Internet companies, interns might be creating patentable artificial intelligence algorithms or even creating new lines of business. By contrast, at a law firm, government office, or publishing house, they will be doing paperwork, scheduling meetings, and proofreading text. This menial work will be paid accordingly, if at all, whereas pay scales at the new-style internships reflect the seriousness of the work involved; college interns in Silicon Valley can earn over $20,000 for the summer.

  Given the increasing importance of internships in terms of both intellectual enrichment and enhancement of job prospects, why do traditional colleges limit them only to summers, pushing them aside to cater to the calendar needs of lectures and homework? The answer is simple inertia—this is how it has always been done, so people haven’t really questioned it.

  Actually, some universities have. Despite being founded not even sixty years ago, the University of Waterloo is generally considered to be Canada’s top engineering school. Walk down a hallway at Microsoft or Google and you will find as many Waterloo grads as those from MIT, Stanford, or Berkeley—despite the fact that, because of work visa issues, it is a significant hassle for American employers to hire Canadian nationals. And this isn’t some attempt to get low-cost labor from across the border—Waterloo graduates are commanding salaries as high as the very best American grads. What is Waterloo doing right?

  For one thing, Waterloo recognized the value of internships long ago (they call them co-ops) and has made them an integral part of its students’ experience. By graduation, a typical Waterloo grad will have completed six internships lasting a combined twenty-four months at major companies—often American. The typical American grad will have spent about thirty-six months in lecture halls and a mere three to six months in internships.

  This past winter—not summer—all of the interns at the Khan Academy, and probably most of the interns in Silicon Valley, were from Waterloo because that is the only school that views internships as an integral part of students’ development outside of the summer. While the students at most colleges are taking notes in lecture halls and cramming for winter exams, the Waterloo students are pushing themselves intellectually by working on real projects. They are also getting valuable time with employers and pretty much guaranteeing several job offers once they graduate. On top of that, some are earning enough money during their multiple high-paying internships to pay for their tuition (which is about one-sixth to one-third the cost of a comparable American school) and then some. So Waterloo students graduate with valuable skills, broad intellectual development, high-paying jobs, and potential savings after four or five years.

  Compare this to the typical American college grad with tens or hundreds of thousands of dollars of debt, no guarantee of an intellectually challenging job, and not much actual experience with which to get a job.

  Waterloo has already proven that the division between the intellectual and the useful is artificial; I challenge anyone to argue that Waterloo co-op students are in any way less intellectual or broad-thinking than the political science or history majors from other elite universities. If anything, based on my experience with Waterloo students, they tend to have a more expansive worldview and are more mature than typical new college graduates—arguably due to their broad and deep experience base.

 
