Davi Ottenheimer
“In theory, practicing security means taking low-cost baby steps in every area possible, treating time spent practicing security as a small, incremental investment that grows.”
Website: www.flyingpenguin.com
Davi Ottenheimer is the founder and president of flyingpenguin, with more than 20 years’ experience managing global security operations and assessments, including a decade of leading incident response and digital forensics teams. He is also a member of the faculty at the Institute for Applied Network Security (IANS), serves on the board of a couple of security startups, and guest lectures at St. Pölten University of Applied Sciences. In 2012, while consulting with VMware engineering, Davi co-wrote the cloud security book Securing the Virtual Environment: How to Defend the Enterprise Against Attack. He is currently the head of product security for a popular database company and has been working on his next book, The Realities of Securing Big Data, about the societal risks inherent in unsecured machine learning and AI.
If there is one myth that you could debunk in cybersecurity, what would it be?
I sometimes hear people attempting to prove our folk tales and fables false. It is tempting to use personal expertise to debunk fantastical-sounding cybersecurity stories of inhuman skill, unearthly severity, or the astronomical likelihood of attacks. Yet, behind all of our reality-based arguments to dispel some awful-sounding threats—advanced persistent threats (APTs), nation-states, corporate mercenaries, or hackers in hoodies—lies the ancient issue that mythology actually is a great way to educate and inform.
Instead of debunking myths or making them disappear, what if we acknowledged their purpose and adjusted them to carry messages further in the direction we want to go? We could, instead of debunking, strive to enrich allegories like the myth of the solo genius attacker, or “rock-star” hackers. We also can improve upon the myths of talent scarcity and the myth of rising complexity in cybersecurity. What I’m saying is that our industry should portray myths using more human tragedies with complex narratives—inherently fallible and plagued by self-imposed vulnerabilities—to further the art of cybersecurity mythology. Years ago, I came up with “Ctrl-Alt-Delete When You Leave Your Seat” and won some awareness competitions with it. If I added a giant red penguin mythology to this phrase to make it stickier and someone started saying “giant red penguins are not real,” I would be very sad about the state of our industry.
What is one of the biggest bang-for-the-buck actions that an organization can take to improve its cybersecurity posture?
I’d love to pull out some particular technology-focused action here, such as patching. Quick, everybody patch everything. Done. However, decades of experience tell me the biggest bang truly comes from setting up a reliable, repeatable discipline of practicing security. (A little wax on, wax off leads even a scrawny kid to achieve the crane pose, for those of you who remember The Karate Kid.)
In theory, practicing security means taking low-cost baby steps in every area possible, treating time spent practicing security as a small, incremental investment that grows. It’s like insurance, just without the adversarial lawyers writing exclusion clauses to ensure their employers’ margins. In reality, some organizations take a long time to accept that security will do more good than harm, so they never truly accept the concept of practicing it regularly. These organizations hope for lottery-style returns on spending instead of treating security as small steps worth doing daily. This is backwards, of course, since perfection in pose comes from practice.
Two things generally push organizations out of the allure of a lottery-style approach—either friendly regulators bring fines for lack of practice or unfriendly adversaries bring breaches. This is why I say initiating disciplined, incremental approaches is one of the lowest-cost, best-result actions an organization can take—whether we’re talking about patching, encryption, identity management, education, logging, or anything else. Organizations that do not regularly practice security tend to throw money away for tactical diversions and stopgaps after harm has already been done. They also tend to be blind to the strategic problems that manifest, while regular practitioners frequently report results to management. Course correction becomes harder the further you go without a safety check.
“Course correction becomes harder the further you go without a safety check.”
All that being said, each organization will still have to prioritize actions, and that really depends on specifics that come from practicing threat modeling (a part of every security action). Sometimes more bang comes from patching, while other times the bang would be biggest from logging. No matter what the organization chooses, their posture will improve the most with the least cost if they act early on—creating a discipline of practice and assigning resources to it, no matter how small.
How is it that cybersecurity spending is increasing but breaches are still happening?
If increased spending meant that bad things would stop happening completely, economists would probably lose their minds. America spends more on healthcare than other countries, and yet it has worse health and a shorter life expectancy.
This has also been linked to attempts by American politicians to gamble on private ventures that profit from harm, in much the same way that antivirus companies have delivered so little value to customers while making investors rich. Coincidence? Market analysis tells us people spend for a myriad of reasons. Leaving the market without regulation has been repeatedly proven ineffective in getting people to spend on things that reduce harm. Instead, we’ve learned that below a certain level of control (regulatory guidance), the probability of harm goes up dramatically—like driving a car faster than 12 mph without a seatbelt. The best “nudge,” then, is spending above a reasonable threshold (regulatory guidance) to keep things minimally and predictably safe, even during high performance. This is far more effective than giving companies complete free rein and hoping that will eliminate all breaches.
I tend to speak about cybersecurity in terms of automobile safety because there’s so much data to work with, and it’s not very controversial (anymore). There’s ample proof that increased seatbelt spending significantly reduced injuries. If I remember right, the drop was initially 50 percent. After that, no matter how much manufacturers were willing to spend on seatbelt technology, there was only negligible improvement. So, regulators shifted to requiring airbags and found yet another big drop in harm. The history of seatbelts thus becomes a good case study for cybersecurity spending—better results come from teams collaborating on more science-based regulations and raising our baselines in predictable areas. This has been the effect of California’s Database Security Breach Notification Act (SB-1386) as well as the Payment Card Industry Data Security Standard (PCI DSS).
Interestingly, a 2018 survey by Adobe found that 83 percent of the 500 private and public cybersecurity professionals polled agreed that government regulations have a positive impact on cybersecurity.14 With a social-good focus on the compromises necessary for regulation, also known as good governance, there is a reasonable chance that increased spending will reduce the pace of breaches, as we have already seen with some of these early regulations.
Do you need a college degree or certification to be a cybersecurity professional?
Oh, economics again. Whenever someone asks me if you need a certification to work in security, I ask them whether they would trust a website that doesn’t use HTTPS, because that’s a certificate system they ostensibly believe is essential to verifying whether a site is professional. To be fair, our cybersecurity industry’s attempt to build a highly reputable certificate authority (CA) was sidetracked by “businesses” that realized you can get wealthy by charging large amounts of money to hand out weak and untrustworthy certs. A bit of a loophole, yet the point still stands. I mean, universities apparently had the same idea as shady CA companies. Obviously, you don’t need or want an untrusted degree or certificate to prove you’re a professional.
With that said, when an authority is undermined or compromised, you are likely better off building your own proofs. All that means is, yes, choose wisely which authority is proving your authority, even if that means yourself. I still know system administrators who offer excellent reasons for only trusting “self-signed” certs, and I have been enjoying the latest wave of administrators who believe the cost of SSL certs should be zero. College degrees for free? What? Can we call them “cert socialists?”
The whole model of requiring certification points to an economic reality: standards should be designed to provide a lower-cost means of proof, enabling a freer exchange of information. In that sense, a college degree says you accomplished something finite and measurable, and it presents proof in a known format that could get you hired. Likewise, certification from an authority can dramatically reduce the cost of proving that you are who you claim to be. Should you happen to find less costly or easier methods of proving yourself, then you could end up ahead of the game. Most people still have to play this game because it’s their best option, if not their only option, and because the authority model isn’t completely broken.
How did you get started in the cybersecurity field, and what advice would you give to a beginner pursuing a career in cybersecurity?
This seems like a follow-up to the previous question. First, I have to give a shout-out to my family. Hi mom! For as long as I can remember, computers have been around. My great aunt proudly showed me photos of the “computing group” she managed in the 1950s, and one of my great uncles managed bulk power (nuclear) computing systems, while another talked of his U.S. Army days running copper telco lines behind enemy lines. Thus, I have always thought of computers in terms of international relations and war, sort of like it was going to be my generation’s piston engine. My grandfather was an electrical engineer and apparently worked with some of the first U.S. Navy computing systems before he started a company that worked with silicon manufacturers. Both of my parents, by way of teaching at a university, offered me inexpensive access to computers, and we communicated by email. By the time I was leaving high school in the 1980s, I was already toying with laptops and building my own low-cost “kit” computers, and…hacking.
To put things in perspective, I was likely seen as the least technical among those around me. My sibling was a SunOS administrator, and back then, as a kid, I thought it was fair game to try to get around whatever access obstacles were in the way of playing Netrek. Hackers were treated as playful, mischievous, or malicious instead of as career-minded creative types. My first decision in terms of career was to not pursue my family’s interest in computers and instead strike out on my own to make a mark in another field. To my grandfather’s dismay, I walked away from computer science, to say nothing of electrical engineering, took up philosophy and political science, and ended up in history. In the end, I suppose the most influential thing to kickstart my career in cybersecurity was a very wise graduate school adviser, who noticed I was always hacking around on the empty lab systems. He suggested I leverage computers for a livable wage instead of just academic degrees. I believe he put it as, “Life’s comforts come more easily in computer jobs, and when you get tired of that, you can always come back and suffer through a PhD.”
The moral of the story, and the advice I always try to give beginners, is this: find a way to the financial comforts that will allow you to be creative and curious with computers. Find time to break things and try to put them back together, repeatedly. Cybersecurity is all about shifting contexts, trying the unusual and undocumented things others might take for granted. It’s also about communicating results in a way that helps people see what they were blind to before. The best people I’ve worked with in cybersecurity don’t give up easily because their curiosity is vast, and they try the most things because their creativity runs deep.
“Find time to break things and try to put them back together, repeatedly.”
On that note, when I finished graduate school and headed to California, I stopped to visit my great uncle. Since he was nearing retirement from his computer operations career, I figured it would be a bonding moment to tell him I now was off to start my own career as his was winding down. He looked at me and laughed, his big mustache quivering as he snickered, “You think you’ll get hired to work with computers with your history degree?! That’s a new one.” A week later, I walked into a subsidiary of Space Applications—a Digital (DEC) VAR I found in a phone book—told them about my love of hacking macOS and TCP/IP, and they hired me on the spot because the context of computers was shifting and they needed security-minded distributed systems people like me. A few months later, I was hand-delivering cutting-edge Unix systems for rocket research to high-security campuses and responding to Windows NT breaches. The rest is…history.
What is your specialty in cybersecurity, and how can others gain expertise in your specialty?
This looks like a trick question to me. Instead of specialization, which felt limiting, I’ve always been eager to learn anything anyone was willing to throw at me.
Expertise also seemed relative, instead of absolute, so a specialization opportunity depended on the expertise of those around me. On a project—or an engagement where a team has to divide and collaborate—I’ll take a special assignment, like being a cloud expert. Yet beyond that, I like to keep up on everything. I have to admit that writing a book on cloud security in 2012 (Securing the Virtual Environment) generated a lot of engagements where I was asked, “What’s special about cloud?” Rather than push cloud as my specialization, I’d usually say, “It’s mostly the same concepts, just a different context.”
When it comes to specialization, I’ve always felt it was better to learn every position and rotate through them. The downside to being a cyber pastoralist (or shepherd) is the feeling of having the reset button pushed all the time. You need to have a strong support system to survive a really big reset. It can be incredibly humbling for an expert generalist to stay competitive with specialists all the time.
The upside is you never feel tempted to believe the myth of talent shortage or the myth that cybersecurity is becoming too complex to keep up, because working up through levels of complexity is what makes you tick. So, if I had to pick, I would stay curious and avoid over-specialization. That has been my specialty, and for the last five years, it has meant having fun breaking big data systems, advocating for encryption solutions, and hunting bad and unsafe machine learning (ML) and AI. I look forward to others establishing specialties in these areas and showing me how much of a generalist I still am. That way, I can pivot to that security thing no one is good at yet.
What is your advice for career success when it comes to getting hired, climbing the corporate ladder, or starting a company in cybersecurity?
I shifted from academia to the business world, and the best advice I can give for career success, which I learned through trial and error, is to have empathy for customer pain. Every job I’ve ever held has started with someone saying they had a need and asking if I could help. Same for starting a company in cybersecurity. When you find a lot of people have a problem, and if you can think of a solution that reduces it at a reasonable cost, then you have a product or service worth selling. Also important to starting a company is putting the right team together and knowing how and when to pivot from failure. I don’t think I’m saying anything novel or unique here, so I’ll end with a couple thoughts about corporate ladders.
First, corporate ladders tend to be unfair by design. Those at the top control who they’d like to see coming up behind them, reflecting their values. Thus, historically, it’s been a sordid mess of racism and misogyny. Choosing to leave and start one’s own company has made more sense for excluded groups, instead of taking fruitless steps in a game fixed against them. The Chinese Exclusion Act of 1882 is linked directly to Chinese immigrants working in America leaving corporate ladders and starting restaurants and laundry services instead (because they became their own bosses). Cybersecurity had very short ladders initially, so achieving a role as an industry leader almost always meant starting one’s own company for executive/top experience. Even by 2000, 30 years after cybersecurity had started, there were almost no ladder-climber role models to learn from; very few had reached a true top and stayed there.
Second, corporate ladders really are social engineering exercises, which makes them an interesting aspect of security. Those who are best at it could turn out to be insider threats, and security teams might be in a position of building safety checks to flag and investigate competitive ladder climbers who use methods counterproductive to corporate values. That might sound a bit Machiavellian, so I’ll just circle back to my point about specialization. Some like to stay on a rung and become specialists, never wanting to climb beyond it. That actually has value we should recognize, which might be easier if we call it a form of specialization. Others want to jump rung to rung and move along in their career, expanding as generalists across rungs. My advice is if you want to be the latter, earn those steps by being the best you can at that level/rung of specialization. Unlike a physical ladder, each step in the corporate ladder is different, and moving up a level usually requires some artifact or proof of worth.
What qualities do you believe all highly successful cybersecurity professionals share?
Brevity.
What is the best book or movie that can be used to illustrate cybersecurity challenges?
The word hacker itself comes from horses pulling carts in London centuries ago. So, I’m tempted to list illustrations of transportation challenges of bygone eras, such as license plates on cars to stop them from committing crimes and driving away. Instead, I’ll say Metaphors We Live By by George Lakoff, a cognitive linguist, which was a popular book when I was in college. Lakoff makes the argument that we really shouldn’t be saying “attack” and “breach” when we talk about cyber, unless we actively recognize where these terms come from and how they convey certain powerful meanings.