On-Demand Webinar

Post Splunk: SIEM-less Security Data Lake Adoption

Modern SOC
Detection Strategies
October 26, 2023

By Kevin Gonzalez, Senior Director, Security & Operations, Anvilogic

This article was originally published as a guest essay on The Last WatchDog.

Cisco’s $28 billion acquisition of Splunk comes at an inflection point, as security operations teams begin to adopt modern, cloud-native data lakes. For years, Splunk has been the workhorse SIEM for many enterprise Security Operations Centers (SOCs). However, security teams are increasingly burdened by Splunk’s steeply rising costs. Early adopters of security data lakes such as Snowflake report saving more than two-thirds of what they were paying for their Splunk license. Splunk’s inability to migrate to a modern cloud-native architecture makes it difficult to realize these cost savings or to implement the advanced data science use cases critical for threat detection. The Cisco acquisition will likely exacerbate these challenges and accelerate the adoption of security data lakes.

While it’s great to see data lakes gaining so much momentum, many security teams struggle to take advantage of them. Ripping and replacing Splunk overnight is unrealistic. Enterprise security teams need a path to incrementally migrate to a modern data lake with minimal impact on their SOC workflows.

SOCs must be able to manage detections and analyze security threats in real time, in a unified manner and regardless of where their data is stored. That is best achieved by separating the analytics layer from the data logging layer. Here’s how to leverage the power of decoupling to create a distributed data lake architecture in which security teams can use multiple data platforms, such as Splunk and Snowflake, while maintaining a consistent security analytics layer.

The Role of the Data Lake as the Connector

Detections are written in SQL, KQL, or SIEM-specific languages like Splunk’s SPL, while threat hunting draws on Python notebooks and various data science models. This variety, combined with the volume of data in modern data platforms, poses processing and detection development challenges for detection engineers who are not subject matter experts in multiple query languages. Surges in data ingestion and the flat architecture of data lakes have made it difficult to extract value from these repositories.
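To make the multi-language burden concrete, here is a hypothetical sketch of the same detection idea, five or more failed logins from one source IP within ten minutes, written twice: once in Splunk SPL and once in Snowflake SQL. The index, table, and field names are illustrative, not drawn from any real deployment.

```python
# Hypothetical: one detection idea, two artifacts to write, review,
# and keep in sync. All names (index=auth, auth_logs, src_ip) are
# illustrative assumptions.

SPL_DETECTION = """
index=auth action=failure
| bin _time span=10m
| stats count BY src_ip, _time
| where count >= 5
"""

SQL_DETECTION = """
SELECT src_ip,
       TIME_SLICE(event_time, 10, 'MINUTE') AS window_start,
       COUNT(*) AS failures
FROM auth_logs
WHERE action = 'failure'
GROUP BY src_ip, window_start
HAVING COUNT(*) >= 5
"""

# The same logic, duplicated per platform -- the maintenance cost
# grows with every backend and every new detection.
DETECTIONS = {"splunk": SPL_DETECTION, "snowflake": SQL_DETECTION}
```

Every change to the detection logic (a new threshold, a new time window) must be made, tested, and reviewed once per query language.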

Relying on data collection and organization tools like the traditional SIEM to analyze varied log data for threat detection requires constantly updating the analysis methods and, more importantly, puts the onus of observability on the security engineer. Every new data source becomes a headache for the multiple teams that must collaborate to get it into a usable state.

For detection engineers to efficiently identify and thwart potential threat actors, the data logging and analytics layers need to be decoupled. Decoupling provides the flexibility to evolve the security stack alongside organizational and business changes (e.g., moving from Splunk to Snowflake over time), to reduce costs, and to finally keep pace with, or even stay ahead of, alert volume.

How SOC Analysts Can Contribute Most Impactfully to Data Analysis 

A decoupled, purpose-built threat detection platform can work across distributed data lake architectures. SOC teams no longer need to modify detection logic, hunting notebooks, or data science models, or wait for IT to prepare data sources. Each data lake connects to the threat detection platform, which analyzes and detects threats using a unified set of detection logic and advanced AI, with real-time normalization. This streamlines security operations and improves response agility while reducing vendor lock-in, giving CISOs the flexibility to pursue more cost-effective options. It also alleviates the cost and political friction of data migration and enables unified querying and analysis across multiple data lake architectures.

To achieve decoupling, organizations need to implement a unified detection layer and adopt the right AI tooling. A unified detection layer simplifies building detection content, even across diverse skill sets among security analysts. It also provides a standardized schema, making security operations adaptable to different data storage scenarios. The unified detection layer should act as a hub for all detection content, connecting to each data lake and processing detections within it, regardless of that platform’s query language.
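One way to picture such a hub is a declarative rule compiled into each platform’s native query language. The sketch below is a minimal illustration under assumed names (the `Rule` schema, the backend table, and index names are all hypothetical), not a real product API.

```python
# Minimal sketch of a unified detection layer: one standardized rule,
# rendered into each backend's native query language. All names here
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Rule:
    source: str      # standardized event category, e.g. "authentication"
    field: str       # standardized field to aggregate by
    condition: str   # standardized filter expression
    threshold: int   # alert when the count meets or exceeds this

# Per-backend translation of the standardized schema to native names
# and native query syntax.
BACKENDS = {
    "splunk": {
        "authentication": "index=auth",
        "render": lambda r, src: (
            f"{src} {r.condition} | stats count BY {r.field} "
            f"| where count >= {r.threshold}"
        ),
    },
    "snowflake": {
        "authentication": "auth_logs",
        "render": lambda r, src: (
            f"SELECT {r.field}, COUNT(*) AS c FROM {src} "
            f"WHERE {r.condition} GROUP BY {r.field} "
            f"HAVING COUNT(*) >= {r.threshold}"
        ),
    },
}

def compile_rule(rule: Rule, backend: str) -> str:
    """Render one standardized rule into a backend's native query."""
    b = BACKENDS[backend]
    return b["render"](rule, b[rule.source])

# One rule, authored once, compiled for every connected data lake.
rule = Rule(source="authentication", field="src_ip",
            condition="action='failure'", threshold=5)
spl = compile_rule(rule, "splunk")
sql = compile_rule(rule, "snowflake")
```

The detection engineer edits only the `Rule`; the per-backend renderers absorb the query-language differences, which is the essence of the hub-and-spoke model described above.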

When you decouple threat detection from tools that were never designed for it, you free those resources to do what they should be doing: addressing and remediating threats. Detection engineers can spend more time protecting the business than figuring out how to protect the business.

Decoupling for Threat Detection Success: The Agnostic Security Approach  

Decoupling enables rapid data access and flexibility in a distributed data lake architecture, meeting the demands of modern data management. Minimizing reliance on vendor-specific logging platforms expands data access and gives SOCs control over their data storage strategy, letting them keep data where it already lives. At the same time, SOC teams can keep pace with user expectations of more SaaS-like, agile data management and future-proof their security operations.

By leveraging a unified detection layer and AI, organizations can optimize data storage and analysis, leading to smarter and faster detection of security threats. The approach also promotes interoperability among different data sources and tools, making the security infrastructure more seamless and flexible. Data duplication, unnecessary log collection, and their associated operational costs are all reduced, and the dependency on fully normalized data in the repository gives way to normalization applied to data feeds at detection time. Finally, low/no-code detection builders make analysts more effective: they no longer need to parse and normalize data themselves or be experts in a specific query language or technology.
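Normalization at detection time can be as simple as a field-mapping table applied when events are read, rather than rewriting data at ingest. The source names and field names below are hypothetical, chosen only to show the shape of the idea.

```python
# Sketch of query-time normalization: events keep their native field
# names in storage; a mapping renames them to one standard detection
# schema only when a detection runs. All field names are hypothetical.
FIELD_MAPS = {
    "okta":    {"client.ipAddress": "src_ip", "outcome.result": "action"},
    "windows": {"IpAddress": "src_ip", "Status": "action"},
}

def normalize(event: dict, source: str) -> dict:
    """Rename source-specific fields to the standard detection schema,
    passing through any fields without a mapping."""
    mapping = FIELD_MAPS[source]
    return {mapping.get(k, k): v for k, v in event.items()}

# Two raw events from different sources, stored exactly as collected.
okta_event = {"client.ipAddress": "10.0.0.5", "outcome.result": "FAILURE"}
win_event  = {"IpAddress": "10.0.0.5", "Status": "0xC000006D"}

# After normalization both expose src_ip/action, so a single detection
# written against the standard schema covers both sources.
normalized = [normalize(okta_event, "okta"),
              normalize(win_event, "windows")]
```

Because the mapping lives in the detection layer rather than in the storage layer, adding a new data source means adding one mapping entry, not re-ingesting or duplicating data.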

With this shift, you can take advantage of modern innovations in storage architectures while simultaneously gaining access to specialized detection and response innovations.


Build Detection You Want,
Where You Want
