Both cyber attacks and cybersecurity spending have been rising exponentially over the past decade. Cyber attacks alone have risen 500%, and damage related to cybercrime is projected to hit $10.5 trillion annually by 2025 (Cybersecurity Ventures).
Something isn’t adding up.
It’s evident that while organizations are spending more and more on legacy cybersecurity solutions, these platforms are not holding up their end of the deal: they cannot proactively defend in a modern, non-signature attack threatscape.
Are organizations meant to spend more and more in this new, unpredictable, modern threatscape? Or get smarter by leaving legacy platforms and knowledge behind and rethinking their approach to zero-day, non-signature attacks?
Over the past couple of weeks our Head of Strategic Alliances, Geoff Coulehan, has been detailing the top 5 considerations that should be guiding SOC strategy in 2021 and beyond:
1. A Modern SOC Should Not Be Entirely Dependent On Human Operators and Their Personal Experience
The modern approach to Security Operations Center (SOC) development represents a fundamental challenge: the intersection between human operation and technology. Do humans enhance technology, or is human input holding SOCs back from using modern technology to its full potential? Can today’s technology even deliver on vendor promises in a typical SOC environment?
A modern SOC should not be entirely dependent on human operators and their personal experience. This overreliance has been a foundational problem with the methodologies SOCs have used for the past 15 to 20 years, and it is worth asking whether the problem is actually compounded by the technology itself.
2. Incremental Stacking of Correlative Analysis Platforms Will Ultimately Prove Ineffective and Costly
Today, the cost to operate a SOC has dramatically increased, thanks in large part to the additive nature of popular security solutions, each of which carries a hefty operational price tag to deploy, run, and keep in tune.
Vendors tend to gloss over a glaring foundational issue with stacking platforms on top of other platforms to achieve a singular goal: every addition must work within and across multiple silos of proprietary data.
Vendors position their Security Information and Event Management (SIEM) platforms in a way that requires customers to aggregate and format data into the vendor’s exclusive, proprietary format. This is the only way to obtain the data the SIEM needs to compare against historical data to detect anomalies.
3. Log Data Is Not Effective as a Foundation for Prevention, Detection, Remediation, or Analytics
The fundamental flaw that keeps SIEM from fulfilling the promises vendors promote is quite simple: when it comes to real-time threat prevention and detection, log data is inevitably incomplete, inappropriate and ineffective.
By its very nature, a log-based solution is only as current as its latest aggregation. Given the sprawling nature of today’s typical corporate network infrastructure, this approach is wholly insufficient. Log data will always be incomplete, pre-summarized, and prohibitively limited in its ability to give SOC teams insight into what is actually happening at a finer level of granularity than the summarized information allows.
To overcome some of the limitations of these solutions, vendors offer a wide range of additive tools. For example, network traffic analysis (NTA) tools add deeper visibility into traffic. However, when tools are stacked upon a premise that was foundationally flawed from the start, perceived improvements are not a true solution.
4. Cybersecurity Spend for Data Retention and Analysis is Out of Control and Largely Unnecessary
Enterprises sift through the same data many times over to find and understand details. This work represents a remarkable financial and human resources investment. It’s not a winning approach when all that work and investment leads to a security “solution” that isn’t a solution at all.
Traditional cybersecurity approaches rely on log data. To operate, these systems require SOC teams to massage, extract, transform, normalize, and consolidate log data into a central repository. It’s the only way to get the data into the proprietary format required by the third-party security solution.
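As a rough illustration of that extract-transform-consolidate cycle, here is a minimal Python sketch. The log formats and field names are hypothetical, chosen only to show the mapping work involved; they are not any vendor’s actual schema.

```python
import csv
import io

def normalize_syslog(line: str) -> dict:
    """Parse a simplified syslog-style line into a common flat schema."""
    ts, host, msg = line.split(" ", 2)
    return {"timestamp": ts, "source": host, "event": msg.strip(), "format": "syslog"}

def normalize_firewall_csv(row: dict) -> dict:
    """Map hypothetical firewall CSV columns onto the same schema."""
    return {
        "timestamp": row["time"],
        "source": row["src_ip"],
        "event": f"{row['action']} {row['dst_ip']}:{row['dst_port']}",
        "format": "firewall",
    }

def consolidate(syslog_lines, firewall_csv_text):
    """Extract, transform, and consolidate two log sources into one timeline."""
    events = [normalize_syslog(line) for line in syslog_lines]
    reader = csv.DictReader(io.StringIO(firewall_csv_text))
    events.extend(normalize_firewall_csv(row) for row in reader)
    # Sort into a single timeline -- the "central repository" the SIEM ingests.
    return sorted(events, key=lambda e: e["timestamp"])
```

Each new log source means another hand-written normalizer like these, which is the maintenance burden described above.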
Companies must consider not only their initial investment in SIEM software, but the ongoing costs of licensing and data retention, costs that grow exponentially over time as the volume of data required for accuracy increases.
5. Third-Wave AI Has Proven More Effective Than Traditional Cybersecurity Platforms and Methodologies
In a market where every vendor claims to have the “best” AI solution, SOC teams face a significant challenge in trying to sort fact from fiction.
Terms like “data normalization” and “machine learning AI” may sound like typical parts of the process, and they are, if you’re dealing with outdated technology. Platforms like SIEM run on a multistep process of data logging and labeling and require an ever-growing data lake to operate. Worse, these systems are only as up-to-date as the last normalization process performed by human analysts.
Third-wave AI is context-aware. Data doesn’t need to be normalized because third-wave platforms like MixMode rely on an evolving baseline of network behavior that adjusts in real-time. These platforms evaluate traffic in context, quickly changing to meet the “new normal” as real world conditions dictate.
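To make the idea of an evolving baseline concrete, here is a generic Python sketch using an exponentially weighted mean and variance of a traffic metric. This illustrates the concept only; it is not MixMode’s actual algorithm, and the parameters are hypothetical.

```python
import math

class EvolvingBaseline:
    """Generic self-adjusting baseline for a traffic metric (e.g. bytes/min).

    The baseline keeps evolving, so a sustained shift becomes the
    "new normal" instead of a permanent alert storm.
    """

    def __init__(self, alpha=0.05, threshold=3.0, warmup=5):
        self.alpha = alpha          # how quickly the baseline adapts
        self.threshold = threshold  # z-score above which we flag an anomaly
        self.warmup = warmup        # observations to absorb before flagging
        self.n = 0
        self.mean = None
        self.var = 0.0

    def observe(self, value) -> bool:
        """Update the baseline with a new value; return True if anomalous."""
        self.n += 1
        if self.mean is None:
            self.mean = float(value)
            return False
        std = math.sqrt(self.var) or 1.0
        anomalous = (self.n > self.warmup
                     and abs(value - self.mean) / std > self.threshold)
        # Exponentially weighted updates: no stored data lake required.
        delta = value - self.mean
        self.mean += self.alpha * delta
        self.var = (1 - self.alpha) * (self.var + self.alpha * delta * delta)
        return anomalous
```

Note that the detector needs only its running mean and variance, not a growing repository of historical logs, which is the contrast being drawn with SIEM-style architectures.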
Webinar: Why Your Legacy Cyber Platforms Can’t Defend Against Modern Day Attacks
In our upcoming webinar on Tuesday, May 18th, Geoff will be joined by the CEO of RAVENii, Jeff Shipley, to discuss the new threat landscape, how legacy platforms have completely failed to defend against modern-day attacks, and how CISOs and security teams are turning the tide with modern cybersecurity platforms that consolidate the tool stack and apply a next-generation method of cyber defense.
Learn more and RSVP here:
Why Your Legacy Cyber Platforms Can’t Defend Against Modern Day Attacks