Evolution 2.0: The Singularity is Here


by Richard Childers


  “Wow, cool idea,” Russell said. “How much data are we talking about?”

  “Well, data feeds from something like six hundred sensors, plus two tracks of high-definition video and a single track of audio,” I replied.

  “That’s an enormous amount of data. Suppose I can build a device that captures it all in real time. How do you plan to store it? Even a few minutes of information will surpass any kind of storage I can build into an easily portable device.”

  At this point Fincher interrupted, saying, “Colin and I have discussed this, and I think we will have to have a really big wireless pipeline uploading the data to a bank of my servers in the cloud.”

  “OK,” Russell answered. “I suppose that’s possible. But do you have any idea of the cost to implement something like this? It would be staggeringly expensive.”

  “For all intents and purposes, this project has an unlimited budget. How much money do you think we are talking about?”

  “Millions, anyway, and that’s not including the servers we’d need.”

  “That’s not a problem. And I will set aside as much server space as you require. We recently built some pretty big server farms on a barge in San Francisco Bay. They are water cooled and truly state of the art. We built them, but I don’t really have much use for them yet; that market is still developing. So I can make them available for this project, at least for the next year or so. The question is, can you build a portable device that can facilitate this operation?”

  “Yeah, that’s doable. I should be able to build a smartphone prototype that can handle that much real-time data. Now, what you do with it once you have it is beyond my area of expertise.”

  Fincher’s face on the monitor turned towards his AI expert. “That would be your turf, Glen. How would you go about making sense out of that much data?”

  “Well, it’s certainly an interesting problem. The voice stuff is easy. We have speech recognition that can digitize the audio in real time, and we should be able to discover matching patterns in the EEG data. The visual data might be a bit tougher. Right off the top of my head, it seems like we might be able to take the two video streams and use them to create 3D models. The data is there, and it’s stereoscopic, so we should be able to do that. But it’s going to take some real horsepower to do it in anything like real time. Then again, initially it wouldn’t have to. We could create the 3D models offline and compare the sensor data after the fact. Still gonna take some major processing power.”

  “I’ve got an idea that might handle that,” Fincher said. “Some years ago I got talked into contracting for a pretty big chunk of supercomputer time at the Maui High Performance Computing Center. They have an IBM iDataPlex system with over 750 compute nodes. Each node has two 8-core processors and 32 gigs of memory. They call it the Riptide System, and it’s rated at 251 peak TFLOPS. It’s really a bit of a boondoggle. Hawaii’s Senator Dan Inouye managed to get the whole thing funded, but they don’t really have much use for it. The Department of Defense uses about 20% of the cycles, and the rest were up for grabs. I was one of several government contractors who were strong-armed into buying time. I’ve never really had much use for it, so I may as well make it available for this.”

  Claire chimed in with, “As long as we’re doing this, why don’t we add something like Google Glass and eye tracking? We can have the user operate the computer and record his eye movements at the same time. I would think that might be data that can be read on your EEG. That might eventually allow us to insert a feedback loop, putting interpreted data back on the virtual computer screen of the Google Glass. What do you think?”

  Glen smiled and commented, “If we can make it work at all, that might help the AI to interpret the data. As long as we are shooting for the moon, we might as well aim for Mars. Russell, how long do you think it will take you to build this headset and the smartphone prototype?”

  “The headset should be done in a week or two. It’s pretty straightforward, and my assistant can do a lot of the work. The smartphone is a bit trickier. I’ve already built a phone with almost as much processing power as Dr. Anderson needs, but the communication requirements are something entirely new. Give me a month and I should have something we can begin to test.”

  “I’ll let the folks in Maui know you are going to be using all those compute cycles I paid for but never used,” chimed in Fincher. “You can get in touch with them directly and they’ll walk you through the process. Their DOD work ensures a pretty high level of security, so we should be good to go.”

  “All right, folks, let’s get started. Keep me apprised of your progress.”

  “If you’d like, Colin,” Claire offered, “I can coordinate progress reports and set up a Gantt chart so you can track everything online.”

  “Great. That makes it easy. Thank you all for your time. And once again, Dr. Fincher, thank you for all your support.”

  Of course a month quickly turned into two. Russell went through several unsuccessful headset prototypes before he finally came up with something that worked pretty well and didn’t feel like a medieval knight’s helmet. It was still a bit heavy with all the sensors, but it only looked moderately nerdy. And the important thing was, it worked. We tested it in the lab by hooking it up directly to the computer I had originally used for efference testing, and we got a very detailed picture of the brain’s activity. It took another couple of weeks to get the sensor signals into an array that could feed directly into the portable computer Russell was building. Finally, two and a half months after we began, we were able to feed the helmet’s data into a portable device that weighed only three pounds and could be carried hooked to my belt. “Don’t worry, Dr. Anderson,” Russell assured me. “The next gen of this device will be substantially smaller. I just wanted to give you something you could begin to test.” Glen and Russell had put together a real-time feed that dumped the data to the cloud storage, from which it was transmitted to Maui for interpretation. It was still a bit clunky, but it was beginning to come together.

  After these frustrating delays, Glen and Russell came to me with a demo that truly blew me away. After setting up two high-resolution displays, they had me walk around the lab wearing the headset. One display carried the live video feed in real time, and the second showed a detailed three-dimensional model of the space, complete with high-resolution photographic texture maps captured by the headset cameras. And the time lag was less than a minute from video capture to a fully interactive photorealistic 3D environment. It was like creating a video game space on the fly. “That’s incredible!” I exclaimed. “Will this work in a more complex environment?”

  “Let’s give it a try,” Glen suggested. “Go outside and walk around the quad.”

  I spent the next hour and a half wandering around the Stanford campus without paying any real attention to where I was going or how my path crisscrossed. When I got back to the lab the guys were cruising around a remarkably accurate digital version of the campus. The implications were staggering. “With the unlimited storage and compute capacity Fincher has given us, we can build accurate models of anyplace anyone goes wearing a headset. Can you figure out how to attach metatags so we can build a database of information like building names, departments, and labs? That sort of thing?”

  “Yeah, I think I could create an algorithm that captures data from Google Maps and Google Earth and adds it to the database. And I suppose we could have the computer search other online databases for additional information to tag to the 3D objects.”

  “Great, let’s do it. Good work, guys. I hadn’t envisioned anything this robust when we talked about this. It’s really amazing!”

  “The next step is for you to start wearing this rig as much of the time as possible. We need to start generating data for the computer to try to interpret,” said Glen Gary. “I have our most powerful AI set up in Maui to look for identifiable patterns that correlate with your speech and vision.”

  “And the data is automatically sent to the computer so I don’t have to worry about downloading or anything?”

  “No, all you have to do is look a bit ridiculous. The computer will take it from there. I have no idea how long it will be until it starts making sense out of the data.”

  “How will I know when it has?”

  “I set it up for the AI to communicate with you through your Google Glass eyephones. I’m not sure how it will accomplish that but I suspect you’ll know when it does. We need to establish a feedback loop so the AI can learn what’s working and what’s not.”

  And so it began. I was a bit self-conscious at first, assuming people would think me weird in my Foreign Legion cap. But, hell, it was Stanford. Weird is normal here. The next few months were filled with the mundane tasks of scientific research. It’s not all aha moments. In fact most of it is a kind of systematic drudgery as teams of scientists gather data and then seek to make sense of what that data tells them. I kept a close watch on Jay Moore’s work as he struggled to grow an artificial neuron that transitioned into a mechanical electrical lead. Jean worked closely with him, and together they were making real progress on two fronts. They had a good start on building the neuron, and Jay was becoming a little more comfortable in his human interactions. He even looked me in the eye a couple of times when we were discussing his work. And that was real progress. I knew Jay was never going to become adept at social interaction, but Jean and I both believed he could learn how to function with other people present.

  For three months I sort of twiddled my thumbs, supervising people who really didn’t need much supervision while I allowed our data capture system to observe my every thought and action. I did become fascinated by the 3D models we were assembling from my visual data. I enjoyed cruising through these virtual environments and testing the metadata that was becoming embedded in the model. It was fun to walk past a lab building and bring up a list of researchers working there along with a synopsis of their work. At first I only did this while cruising the virtual campus we had built, but after a while I started triggering tags and labels as I walked the campus itself. I discovered I could look at some landmark in real time, and if I held my focus for more than a few moments, information tags would appear superimposed upon the image I was focused on. I figured the eye-tracking system in the eyephones was cuing the data to appear. It was kind of cool.

  And then one day the system surprised me. I was walking past the Stanford football stadium and I wondered who they were playing on Saturday. As I did, the words “Cal Berkeley, 1:30 PM” appeared in my vision. I didn’t really think much about it until I stopped abruptly, realizing what had happened. The system had answered a question I hadn’t actually asked. At first I thought maybe I had spoken my query out loud, but when I recalled my audio track, I confirmed I hadn’t spoken.

  I immediately thought to call Claire and tell her what had occurred, but before I could activate my Bluetooth with the command “Sancy, call Claire,” she was on the line. “Hi Claire,” I said. “Good timing, I was about to call you. Wait till…”

  “What do you mean?” she asked. “You did call me.”

  “No I was about to call you and then your call came through.”

  “Nope, my phone rang and it was you calling. I know ’cause I was eating my lunch.”

  I thought about this for a second and when I realized what had occurred, I almost walked into a tree. “Call the team together. We need to meet right away. And get Fincher hooked in over the video conferencing system. I’ll be back in the lab in about 20 minutes.”

  “What’s going on?” Claire asked with a note of puzzlement in her voice.

  “Things are heating up. I’ll explain when I get there.”

  When we all were seated in the conference room, Fincher appeared on the television screen saying, “This better be good. I ducked out of a meeting with the IRS to be here and I’m sure there will be hell to pay as a consequence.”

  “Hi Bob. I won’t keep you long.” Addressing the room, I told them, “It’s really started. This morning while I was walking across campus I wondered about the identity of one particular lab building, and the information just appeared on my heads-up display. When I started to call Claire, she just came on the line. It was really quite remarkable.”

  Glen was the first to respond. “Wow, that’s fantastic! Actually, I’m not surprised at the appearance of metadata. You’ve been regularly cruising the 3D environment made from your video captures, so I’ve instructed the AI to pay close attention to those sessions and compare their input with the original data. It’s a kind of feedback loop, and that can be particularly effective for an AI that is trying to learn. But the phone thing’s a bit of a surprise. The software is not really written that way. Are you sure you didn’t speak the command?”

  Instead of answering his question, I started to say “Sancy, call Claire” but stopped short of speaking the words. Claire’s phone immediately rang. “I guess the system has started to recognize at least some efference copies.”

  “It obviously has,” Glen replied. “I suspect some more will come on line in the near future. Keep advising me every time this happens and I’ll go back into the data and map each occurrence. It seems we’re on the right track.”

  Fincher broke in, asking, “Do you have a second headset and portable computer ready yet?”

  “Almost,” Russell answered. “I should have a second set up within a week. And the system is all set up for a second user. Have you decided who to hook up next?”

  “Pick me, coach! Pick me!” cried Claire.

  “All right, Claire, you’re next,” I replied.

  “Cool!” she responded with an enthusiastic fist pump.

  “I’ll be all wired up with no particular place to go.”

  And I guess that was the real beginning of this strange journey. A while back, I had set myself on a course of chasing an improbable dream: achieving a direct mind-to-computer link through the interface of human thought. That had been my dream for a really long time. And suddenly it was there. I had done it. And everything changed. I had proven the basic premise. It was possible. Now I had to figure out what to do with it, because this breakthrough changed everything. And it was up to me to decide how it was going to change the game.

  Chapter 6

  The next three months were filled with untold hours of attempts to directly interface with the computer, yielding occasional incremental increases in control and the gradual development of my ability to think commands rather than speak them into a microphone or type them on a keyboard. Sometimes it felt like practicing free throws on a basketball court, a task I had spent an inordinate amount of time on during my high school years. I had improved, but nonetheless I had never made it off the bench. A decent free throw is no substitute for an amazing hook shot that never touches the rim.

  Claire got hooked up to her system a couple of months after I received mine. At first she struggled to establish a direct mental interface to the system, but after her first successful link, her progress was remarkable. Within weeks she was as adept as I was. And then we had another unexpected breakthrough. I was at my desk, absorbed in the day-to-day paperwork that plagues all Principal Investigators of research projects, when I received a rather odd message, delivered directly by Sancy: “Tell that son of a bitch he can have those reports as soon as I get them done.” I puzzled over this for a moment, wondering what it was referring to. Until that moment, all of my messages had been plucked out of my own thoughts. After a few seconds of rather intense puzzlement, it occurred to me that this was a response to an email I had sent to Claire asking her for some budget breakdowns. I had sent that message a few minutes earlier by thinking it and then having Sancy deliver it to Claire’s email account. Immediately, I stood up and ran into Claire’s office.

  “Did you just get the email I sent you about those budget breakdowns?” I asked.

  “Christ, I just got them a couple of minutes ago. I’ll get to it this morning,” Claire replied a bit testily.

  “No worries. Get them when you can. What I want to know is did you send me a reply email?”

  “No I didn’t. To tell you the truth, I was a bit pissed off that you were bugging me about…”

  “So you didn’t send me a message stating ‘Tell that son of a bitch he can have those reports as soon as I get them done’?”
