Thank You for Being Late

by Thomas L. Friedman


  Originally founded by three grade-A geeks—Tom Preston-Werner, Chris Wanstrath, and P. J. Hyett—GitHub is now the world’s largest code host. Since I could not visit any major company today without finding programmers using the GitHub platform to collaborate, I decided I had to visit the source of so much source code at its San Francisco headquarters. By coincidence, I had just interviewed President Barack Obama in the Oval Office about Iran a week earlier. I say that only because the visitor lobby at GitHub is an exact replica of the Oval Office, right down to the carpet!

  They like to make their guests feel special.

  My host, GitHub’s CEO, Chris Wanstrath, began by telling me how the “Git” got into GitHub. Git, he explained, is a “distributed version control system” that was invented in 2005 by Linus Torvalds, one of the great and somewhat unsung innovators of our time. Torvalds is the open-source evangelist who created Linux, the first open-source operating system that competed head-to-head with Microsoft Windows. Torvalds’s Git program allowed a team of coders to work together, all using the same files, by letting each programmer build on top of, or alongside, the work of others, while also allowing each to see who made what changes—and to save them, undo them, improve them, and experiment with them.

  “Think of Wikipedia—that’s a version control system for writing an open-source encyclopedia,” explained Wanstrath. People contribute to each entry, but you can always see, improve, and undo any changes. The only rule is that any improvements have to be shared with the whole community. Proprietary software—such as Windows or Apple’s iOS—is also produced by a version control system, but it is a closed-source system, and its source code and changes are not shared with any wider community.

  The open-source model hosted by GitHub “is a distributed version control system: anyone can contribute, and the community basically decides every day who has the best version,” said Wanstrath. “The best rises to the top by the social nature of the collaboration—the same way books get rated by buyers on Amazon.com. On GitHub the community evaluates the different versions and hands out stars or likes, or you can track the downloads to see whose version is being embraced most. Your version of software could be the most popular on Thursday and I could come in and work on it and my version might top the charts on Friday, but meanwhile the whole community will enjoy the benefits. We could merge them together or go off on our different paths, but either way there is more choice for the consumer.”

  How did he get into this line of work? I asked Wanstrath, age thirty-one. “I started programming when I was twelve or thirteen years old,” he said. “I wanted to make video games. I loved video games. My first program was a fake AI program. But video games were way too difficult for me then, so I learned how to make websites.” Wanstrath enrolled at the University of Cincinnati as an English major, but he spent most of his time writing code instead of reading Shakespeare, and participating in the rudimentary open-source communities online. “I was desperate for mentorship and looking for programs that needed help, and that led me to a life of building developer tools,” he explained.

  So Wanstrath fired off his open-source résumé and examples of his work to various software shops in Silicon Valley, looking for a junior-level programming job. Eventually a manager at CNET.com, a media platform that hosts websites, decided to take a chance on him, based not on his grades from college but on the “likes” on his programming from different open-source communities. “I didn’t know much about San Francisco,” he said. “I thought it was beaches and Rollerbladers.” He soon found out it was bits and bytes.

  So, in 2007 “I was a software engineer using open-source software to build our products for CNET.” Meanwhile, in 2007, Torvalds went to Google and gave a Tech Talk one day about Git—his tool for collaborative coding. “It was on YouTube and so a bunch of my open-source colleagues said, ‘We’re going to try this Git tool and get away from all these different servers serving different communities.’”

  Up to that point the open-source community was very open but also very balkanized. “Back then there was really no open-source community,” recalled Wanstrath. “It was a collection of open-source communities, and it was based on the project, not the people. That was the culture. And all the tools, all the ideology, were focused on how you run and download this project and not how people work together and talk to each other. It was all project-centric.” Wanstrath’s emerging view was: Why not be able to work on ten projects at the same time in the same place and have them all share an underlying language, so they could speak to one another and programmers could go from one to the next and back?

  So he began talking about a different approach with his CNET colleague P. J. Hyett, who had a computer science degree, and Tom Preston-Werner, with whom Wanstrath had collaborated on open-source projects long before they ever met in person.

  “We were saying to ourselves: ‘It is just so freaking hard to use this Git thing. What if we made a website to make it easier?’” recalled Wanstrath. “And we thought: ‘If we can get everyone using Git, we can stop worrying about what tools we are using and start focusing on what we are writing.’ I wanted to do it all with one click on the Web, so I could leave comments about a program and follow people and follow code the same way I follow people on Twitter—and with the same ease.” That way if you wanted to work on one hundred different software projects, you didn’t have to learn one hundred different ways to contribute. You just learned Git and you could easily work on them all.

  So in October 2007, the three of them created a hub for Git—hence “GitHub.” It officially launched in April 2008. “The core of it was this distributed version control system with a social layer that connected all the people and all the projects,” said Wanstrath. The main competitor at that time—SourceForge—took five days to decide whether to host your open-source software. GitHub, by contrast, was just a share-your-code-with-the-world kind of place.

  “Say you wanted to post a program called ‘How to Write a Column,’” he explained to me. “You just publish it under your name on GitHub. I would view that online and say: ‘Hey, I have a few points I would like to add.’ In the old days, I would probably write up the changes I wanted to make and pitch them in the abstract to the community. Now I actually take your code into my sandbox. That is called a ‘fork.’ I work on it and now my changes are totally in the open—it’s my version. If I want to submit the changes back to you, the original author, I make a pull request. You look at the new way I have laid out ‘How to Write a Column’; you can see all the changes. And if you like it, you press the ‘merge’ button. And then the next viewer sees the aggregate version. If you don’t like all of it, we have a way to discuss, comment, and review each line of code. It is curated crowdsourcing. But ultimately you have an expert—the person who wrote the original program—‘How to Write a Column’—who gets to decide what to accept and what to reject. GitHub will show that I worked on this, but you get to control what is merged with your original version. Today, this is the way you build software.”
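
  For readers who want to see what that fork-and-pull-request flow looks like from a programmer's desk, here is a minimal sketch in Python driving the standard git command line. The repository URL, branch name, and commit message are hypothetical placeholders, not anything from GitHub's documentation, and the merge step itself happens in the GitHub web interface, just as Wanstrath describes.

    # A rough sketch of the fork-and-pull-request flow described above.
    # The URL, branch name, and commit message are hypothetical placeholders.
    import subprocess

    def git(*args, cwd=None):
        """Run one git command and stop if it fails."""
        subprocess.run(["git", *args], cwd=cwd, check=True)

    # 1. "I actually take your code into my sandbox": clone my fork locally.
    git("clone", "https://github.com/me/how-to-write-a-column.git", "column")

    # 2. Work on my own branch; every change is recorded under my name.
    git("checkout", "-b", "my-improvements", cwd="column")
    # ... edit the files ...
    git("add", ".", cwd="column")
    git("commit", "-m", "Tighten the opening section", cwd="column")

    # 3. Publish my version. The pull request, the line-by-line review, and
    #    the "merge" button all live on github.com, where the original
    #    author decides what to accept.
    git("push", "origin", "my-improvements", cwd="column")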

  A decade and a half ago Microsoft created a technology called .NET—a proprietary closed-source platform for developing serious enterprise software for banks and insurance companies. In September 2014, Microsoft decided to open-source it on GitHub to see what the community could add. Within six months Microsoft had more people working on .NET for free than they had had working on it inside the company since its inception, said Wanstrath.

  “Open source is not people doing whatever they wanted,” he quickly added. “Microsoft established a set of strategic goals for this program, told the community where they wanted to go with it, and the community made fixes and improvements that Microsoft then accepted. Their platform originally only ran on Windows. So one day Microsoft announced that in the future they would make it work on Mac and Linux. The next day the community said, ‘Great, thank you very much. We’ll do one of those for you.’” The GitHub community just created the Mac version themselves—overnight. It was a gift back to Microsoft for sharing.

  “When I use Uber,” concluded Wanstrath, “all I am thinking about now is where I want to go. Not how to get there. It is the same with GitHub. Now you just have to think about what problem do you want to solve, not what tools.” You can now go to the GitHub shelf, find just what you need, take it off, improve it, and put it back for the next person. And in the process, he added, “we are getting all the friction out. What you are seeing from GitHub, you are seeing in every industry.”

  When the world is flat you can put all the tools out there for everyone, but the system is still full of friction. But the world is fast when the tools disappear, and all you are thinking about is the project. “In the twentieth century, the constraint was all about the hardware and making the hardware faster—faster processors, more servers,” said Wanstrath. “The twenty-first century is all about the software. We cannot make more humans, but we can make more developers, and we want to empower people to build great software by lifting the existing ones up and opening up the world of development to create more coders … so they can create the next great start-up or innovation project.”

  There is something wonderfully human about the open-source community. At heart, it’s driven by a deep human desire for collaboration and a deep human desire for recognition and affirmation of work well done—not financial reward. It is amazing how much value you can create with the words “Hey, what you added is really cool. Nice job. Way to go!” Millions of hours of free labor are being unlocked by tapping into people’s innate desires to innovate, share, and be recognized for it.

  In fact, what is most exciting to see today, said Wanstrath, “is the people behind the projects discovering each other now on GitHub. It’s companies finding developers, developers finding each other, students finding mentors, and hobbyists finding co-conspirators—it’s everything. It is becoming a library in the holistic sense. It is becoming a community in the deepest sense of the word.” He added: “People meet each other on GitHub and discover they are living in the same town and then go out and share pizza and talk all night about programming.”

  Still, even open source needs money to operate, especially when you have twelve million users, so GitHub devised a business model. It charges companies for using its platform for private business accounts, where companies create private software repositories with their proprietary business code and decide who they want to let collaborate on them. A great many major companies have both private and public repositories on GitHub now, because it enables them to move faster, making use of the most brainpower.

  “We built our cloud architecture on open-source software, called OpenStack, so we can leverage the community, and we have a hundred thousand developers who don’t work for us—but what they can do in a week, we couldn’t do in a year,” said Meg Whitman, president and CEO of Hewlett Packard Enterprise. “I am convinced that the world is driven by validation and that’s what makes these communities so powerful. People are driven by their desire for others in the community to validate their work. You like me? Really? Most people don’t get tons of validation. I learned this at eBay. People went crazy about their feedback. Where else can you wake up and see how much everyone loves you!?”

  It used to be that companies waited for the next chip to come down the line. But now that they can use software to make any hardware dance and sing in new ways, it’s software that people are waiting for and collaborating on most avidly. That is why AT&T’s John Donovan said: “For us Moore’s law was the good ol’ days. Every twelve to twenty-four months we could plan on a new chip and we knew it was coming and we could test around it and plan around it.” Today it is much more about what software is coming down the pike. “The pace of change is being driven by who can write the software,” he added. “You know that something is up when the guys with all the trucks and ladders who climb up telephone poles tell you, ‘Donovan, we’re a software company now.’ Software used to be the bottleneck and now it is overtaking everything. It has become a compound multiplier of Moore’s law.”

  Networking: Bandwidth and Mobility

  While the accelerating advances in processing, sensing, storage, and software have all been vital, they would never have scaled to the degree they have without the accelerating advances in connectivity—that is, the capacity and speed of the world’s network of overland and undersea fiber-optic cables and wireless systems, which are the backbone of the Internet, as well as mobile telephony. Over the last twenty years, progress in this realm also has been moving at a pace close to Moore’s law.

  In 2013, I visited Chattanooga, Tennessee, which had been dubbed “Gig City” after it installed what was at the time the fastest Internet service in America—an ultra-high-speed fiber-optic network that transferred data at one gigabit per second, which was roughly thirty times the average speed in a standard U.S. city. According to a February 3, 2014, report in The New York Times, it took a mere “33 seconds to download a two-hour, high-definition movie in Chattanooga, compared with 25 minutes for those with an average high-speed broadband connection in the rest of the country.” When I visited, the city was still buzzing about an unusual duet heard on October 13, using video-conference technology with super-low latency. The lower the latency, the less noticeable are the delays when two people are talking to each other from across the country. And with Chattanooga’s then-new network, the latency was so low that a human ear could not pick it up. To drive home that point, T Bone Burnett, a Grammy Award winner, performed “The Wild Side of Life” with Chuck Mead, a founder of the band BR549, for an audience of four thousand. But Burnett played his part on a screen from a Los Angeles studio, and Mead on a stage in Chattanooga. The transcontinental duet was possible, reported Chattanoogan.com, because the latency of Chattanooga’s new fiber network was sixty-seven milliseconds, meaning the audio and video traveled 2,100 miles from Chattanooga to Los Angeles in one-fourth of the blink of an eye—so fast no human ear could pick up the slight delay in sound transmission.
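
  The arithmetic behind that claim is easy to check. Here is a minimal sketch, assuming a typical blink of roughly three hundred milliseconds and a signal speed in glass fiber of about two hundred thousand kilometers per second; neither figure appears in the passage itself.

    # Back-of-the-envelope check of the Chattanooga-to-Los Angeles figures.
    # The blink duration and the fiber signal speed are assumptions, not
    # values taken from the passage.
    MILES_TO_KM = 1.609
    distance_km = 2_100 * MILES_TO_KM     # ~3,400 km, per the text
    fiber_speed_km_s = 200_000            # light in glass, roughly 2/3 of c
    blink_ms = 300                        # a typical blink, assumed

    propagation_ms = distance_km / fiber_speed_km_s * 1000
    print(f"one-way travel time in fiber: {propagation_ms:.0f} ms")    # ~17 ms
    print(f"measured 67 ms latency vs. a blink: {67 / blink_ms:.2f}")  # ~0.22

  The sixty-seven milliseconds reported for the link includes routing and equipment delays on top of the raw travel time, and it still comes to only about a fifth to a quarter of a blink, which is why the audience could not hear the gap.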

  That duet was also a by-product of accelerating breakthroughs—just in the last few years—in the science of fiber optics, explained Phil Bucksbaum, a professor of natural science in the physics department at Stanford University. Bucksbaum specializes in the laser science that is the foundation of optical communications and is the former president of the Optical Society. Early in his career, in the 1980s, he worked at Bell Labs. In those days, computer scientists would use a command called “ping” to find out if a computer they wanted to communicate with in another part of the Bell Labs building was “awake.” Ping would send out an electronic message that would bounce off the other computer and indicate if it was awake and ready for a two-way conversation. Ping also had a clock that would tell you how long it took for the electric pulse to go down the wires and back.

  “I hadn’t used ping in more than a decade,” Bucksbaum told me over breakfast in September 2015. But for the fun of it “I sat down at my computer in my house in Menlo Park and pinged a bunch of computers around the world the other day,” just to see how fast the pulse could get there and back. “I started pinging computers in Ann Arbor, Michigan; Imperial College London; the Weizmann Institute in Israel; and the University of Adelaide in Australia. It was amazing—the speed was more than half as fast as the speed of light,” which in glass fiber is about two hundred million meters per second. So that pulse went from a keystroke on Bucksbaum’s computer, into his local fiber-optic cable, then into the terrestrial and undersea fiber cable, and then into a computer half a world away at more than half the speed of light.
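
  Bucksbaum's measurement is simple to reproduce in spirit: take a round-trip ping time and a straight-line distance and turn them into an effective signal speed. In the minimal sketch below, the distance and the round-trip time are illustrative assumptions, not his actual readings.

    # Turn a ping round-trip time and a rough point-to-point distance into an
    # effective one-way signal speed. The numbers used are illustrative guesses.
    C_VACUUM_KM_S = 299_792               # speed of light in vacuum

    def effective_speed_km_s(distance_km: float, rtt_ms: float) -> float:
        """Effective one-way speed implied by a round-trip ping time."""
        one_way_s = (rtt_ms / 1000) / 2
        return distance_km / one_way_s

    # Menlo Park to Adelaide is very roughly 13,000 km; assume a 170 ms ping.
    speed = effective_speed_km_s(13_000, 170)
    print(f"{speed:,.0f} km/s, or {speed / C_VACUUM_KM_S:.0%} of light speed")

  With those assumed numbers the pulse comes out at about half the speed of light in vacuum, the same range of result Bucksbaum describes.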

  “We are already half as fast as the laws of physics will allow, and trying to go faster runs into diminishing returns,” he explained. “In twenty years,” he added, “we went from maybe this is a good idea to there’s no turning back to hitting the physical limits … With ping I found out how close to the physical limits we were, and it was pretty startling. It is a way big revolution.”

  This revolution happened, Bucksbaum explained, thanks to a kind of Moore’s law that has been steadily quickening the transmission speeds of data and voice down fiber-optic cables. “The speed at which we can transmit data over undersea cables just keeps accelerating,” said Bucksbaum. The short version of the story, he explained, goes like this: We started out sending voice and data using a digital radio frequency over coaxial cable made primarily of copper wire. That is what your first cable/phone company sent into your house and into the box on your television set. They also used the same coaxial cable to carry voice and data under the ocean to the four corners of the globe.

  And then scientists at places like Bell Labs and Stanford started playing around with using lasers to send voice and data as pulses of light through optical fibers—basically long, skinny, flexible glass tubes. Starting in the late 1980s and early 1990s that evolved into the new standard. The original fiber-optic cables were made of chains of cables that only went so far. After traveling a certain distance, the signal would weaken and have to stop at an amplifier box, where it would be turned from light into an electronic signal, amplified, and then converted back to light and sent on its way again. But over time the industry discovered novel ways of using chemicals and splicing the fiber cables to both increase capacity for voice and data and transmit a light signal that would never weaken.

  “That was a huge breakthrough,” explained Bucksbaum. “With all this internal amplification they could get rid of the electronic amplifier boxes and lay continuous end-to-end fiber-optic cables” from America to Hawaii or China to Africa or Los Angeles to Chattanooga. “That enabled even more nonlinear growth,” he said—not to mention the ability to stream movies into your home. It made broadband Internet possible.

 
