I’ve long been immersed in the world of technology — as a hobbyist, a venture capitalist, a board member and now as a CEO. But looking back at key segments of my personal investment history, I’m struck by the story it tells about security. Yes, information security — it may just be the most dynamic sector of this dynamic market.
That doesn’t mean the security discipline has followed a meticulous plan. In fact, it’s often charted a parallel course to two distinct trends: the hot new technology, and the most current threat. Investment opportunities in this area have gone this way, too.
It all started with point solutions. Technology professionals and their most tech-savvy customers would identify specific pain points — often after the problem had taken its toll — and developers would come up with solutions to guard against them. This was like the proverbial Little Dutch Boy trying to plug holes in the dike. (Sadly, many infrastructures still rely heavily on these isolated niche solutions.) Perhaps effective in the early days of the security market, this strategy is now far from adequate; even with some holes plugged, the dike remains leaky…and I’d argue it’s becoming leakier.
So, the industry moved on to event detection and incident response. The goal was to sniff out individual trouble spots as soon as a real problem arose and put corrective measures in place, and/or respond more effectively when a vulnerability was exploited. In the process, security became a more strategic part of software development and deployment, but still an “add-on” — not core to the strategic information technology that runs a business.
I’ve been immersed in this field a long time, and I’ve seen how these features were built into software designed for other purposes. I was an early investor in (and a board member of) Spyglass, a name that will likely resonate with tech historians. It was one of the first HTML browsers to hit the market, and that was before the Internet really entered mainstream consciousness. By design, it had security layers built in.
Of course, many non-tech professionals will know what Spyglass became — the mighty Microsoft OEM’d the software, then made it available for free as Internet Explorer. It helped dethrone the previous browser champion, Netscape, and essentially commoditized the browser market overnight.
But then things got more interesting, because as networks increasingly connected people and businesses, ever-greater amounts of content began to flow everywhere and be stored online. So, the investment focus turned toward the next critical area: firewalls.
In hindsight, this was a turning point. While networks are fundamentally designed to enable easy access — what’s the point of having data online if the people who need it can’t reach it? — the firewall is charged with gatekeeping: preventing unwanted access while still allowing proper access from the inside out and the outside in.
Viruses, the earliest threat to computers, reared their ugly heads again, so firewalls were modified to perform deep-packet inspection and capture them. Blacklists and filters were added. Oddly, pornography was another driver of security innovation. It’s a two-pronged problem: the network has to block incoming X-rated spam, but it also has to prevent outbound access to certain URLs (yes, business users have been known to surf porn sites while at work).
I was on to the firewall market early. From my investment perch, I scoured the market to find the best firewall companies, and there were many options from which to choose. I wasn’t just after innovation — I wanted enterprise-class software companies with a solid business plan and real revenue on the books. That’s how Check Point caught my attention: It had an OEM deal with Sun, then the “backbone of the Internet.”
Inevitably, data leaks became a critical issue — sensitive information began to leave the network without authorization, and certainly without control. That led me to Vontu and its Data Loss Prevention (DLP) solutions. Now owned by security giant Symantec, these offerings helped organizations prevent the loss of confidential or proprietary information, regardless of where it was stored or used. In a sense, it was firewalls in reverse. I also focused on Splunk, which features the capability to identify data breaches by analyzing mountains of information with user-friendly analytics.
Over time, the security environment has evolved from isolated problem areas to sweeping threat matrices — a single attack from sophisticated cyber criminals can encompass multiple attack modes, multiple technologies and stealthy tactics. There are completely unpredictable zero-day threats, where bad actors exploit previously unknown software vulnerabilities, and Advanced Persistent Threats (APTs) that go after the weakest link in any security system.
That’s why I turned my attention to FireEye, which protects against data packets that can assemble themselves — a level of sophistication that would have been unthinkable when I first started investing in security.
From my perspective, the threat landscape spans everything from crude (but effective) phishing attacks to sustained, multi-pronged campaigns that last for months and infiltrate global infrastructures. No point product, or set of point products, can guarantee that an attack will be prevented. That’s why security executives need to accept that their networks will be breached. It’s not a question of if, but when.
This is why we talk about network resilience — keeping the infrastructure as safe as possible from outside attacks, but also keeping the business running while under attack. Every CEO must be thinking about this; security budgets are the fastest-growing part of IT budgets, yet attacks keep coming more frequently and with greater impact. This is not an easy problem, and it won’t be solved by one simple install, a network redesign or an army of cyber engineers.
Today, organizations need networks that are digitally resilient. This includes an accurate understanding of data flows, host access, redundancy and full backups and, most importantly, the ability to initiate recovery measures immediately in the event of an attack.
Effective digital resilience requires a complete understanding of the infrastructure. That understanding, in turn, requires automation, measurement and testing, followed by continuous re-measurement of the network and its cyber readiness.
For my part, I’ve put my money in (and currently lead) RedSeal, which offers cybersecurity analytics solutions to Global 2000 organizations to maximize their digital resilience.