Steps for migrating from Splunk to XSIAM
Preparation steps: Identify relevant data sources
Identify the data sources currently being ingested and the value they provide, paying close attention to those that meet compliance requirements.
Are there high-volume, low-value data sources?
Are there sources that should be ingested but aren't?
Prioritize data sources by criticality and analytical value
Identify/Map core use cases/correlation rules/models
Identify compliance requirements for historical data
Note that XSIAM can support a number of third-party data sources and methods of log ingestion: External data ingestion.
Identify any local collection requirements and hosts for the Broker VM: What is the Broker VM?
If needed, identify data sources that can be onboarded via the Marketplace: https://cortex.marketplace.pan.dev/marketplace/
If your pipeline management allows it, maintain an operational state in the legacy SIEM until all data is migrated to the new SIEM. This is easily accomplished with solutions such as Cribl, syslog-ng, WEF, NXLog, or Vector; see the sketch below.
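As one illustration, a dual-shipping setup in syslog-ng might look like the following. This is a minimal sketch: the hostnames, ports, and transport are assumptions, so adapt them to your collectors and to the syslog applet configured on your XSIAM Broker VM.

```
# Minimal syslog-ng sketch: fan the same source out to the legacy SIEM
# and to XSIAM during the migration window.
# All hostnames and ports below are hypothetical placeholders.
source s_network {
    network(transport("tcp") port(514));
};
destination d_legacy_siem {
    network("legacy-siem.example.internal" port(514) transport("tcp"));
};
destination d_xsiam_broker {
    # XSIAM Broker VM host running a syslog collection applet (assumed)
    network("broker-vm.example.internal" port(514) transport("tcp"));
};
log {
    source(s_network);
    destination(d_legacy_siem);
    destination(d_xsiam_broker);
};
```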
Preparation steps: Identify and migrate relevant rules
XSIAM uses machine learning on four types of telemetry (endpoint, cloud, identity, and network sources) to create high-fidelity, actionable incidents based on baseline profiles built for the user, entity, peer group, and organization. The correlation rules that many teams have relied on for years to protect their environments no longer catch modern threats, so it is important to consider the following as you identify your existing detection rules.
Make sure to select use cases that justify rule migration, considering business priority and efficiency.
Understanding how different rules work in XSIAM can help you decide whether you need a custom rule or whether slightly modified built-in content is best for your risk profile: Detection rules.
Understand the built-in use cases that XSIAM handles OOTB.
Understand how XSIAM performs analytics on all four types of telemetry: Analytics.
Try to leverage as much OOTB content as possible: since XSIAM uses analytics to produce high-fidelity, actionable incidents, it's likely that some of your existing detections won't be required anymore.
Familiarize yourself with XQL (a sample query follows this list): Cortex XSIAM XQL Language Reference.
Review any rules that haven't triggered any alerts in the past 6-12 months, and determine whether they're still relevant.
Eliminate low-level threats or alerts that you routinely ignore.
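To get oriented with XQL, here is a minimal query sketch against the xdr_data dataset. It assumes endpoint process telemetry is being collected, and the powershell.exe filter is purely illustrative; verify the dataset and field names in your own tenant.

```
// Minimal XQL sketch: list recent PowerShell executions from endpoint telemetry.
// Dataset and field names follow the common xdr_data schema; verify in your tenant.
dataset = xdr_data
| filter event_type = ENUM.PROCESS and action_process_image_name = "powershell.exe"
| fields agent_hostname, actor_effective_username, action_process_image_command_line
| limit 100
```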
SPL to XQL Comparison Table
| | SPLUNK SPL | XSIAM XQL |
|---|---|---|
| Rule Type | | |
| Criteria | | |
| Trigger Condition | | |
| Action | | |
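To make the comparison concrete, here is one hedged example of the same threshold detection in both languages. The index, dataset, and field names are assumptions that depend on how your Windows events are ingested, and the threshold of 10 is arbitrary.

```
index=wineventlog EventCode=4625
| stats count AS failed_logons BY user
| where failed_logons > 10
```

An approximate XQL equivalent, assuming the events land in xdr_data with agent-collected Windows event log fields:

```
dataset = xdr_data
| filter event_type = ENUM.EVENT_LOG and action_evtlog_event_id = 4625
| comp count() as failed_logons by actor_effective_username
| filter failed_logons > 10
```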
To migrate your analytics rules to Cortex XSIAM:
Verify that you have a testing system in place for each rule you want to migrate.
Prepare a validation process for your migrated rules, including full test scenarios and scripts.
Ensure that your team has useful resources to test your migrated rules.
Confirm that you have any required data sources connected, and review your data connection methods.
Verify whether your detections are available as built-in BIOC rules.
If you have detections that aren't covered by the built-in rules, you can use the built-in SPL-to-XQL converter to get started: Translate to XQL.
If neither the built-in rules nor an online rule converter is sufficient, you'll need to create the rule manually. In such cases, use the following steps to start creating your rule:
Identify the data sources you want to use in your rule. Usually, you will build rules from the datamodel (see the sketch after these steps).
Identify any attributes, fields, or entities in your data that you want to use in your rules.
Identify your rule criteria and logic. Use the built-in helpers and sample queries to see how XQL uses these in rule sets.
Identify the trigger condition and rule action, and then construct and review your XQL query. When reviewing your query, consider XQL optimization guidance resources.
Test the rule with each of your relevant use cases. If it doesn't provide expected results, you may want to review the XQL and test it again.
When you're satisfied, you can consider the rule to have been migrated. Create a playbook for your rule action as needed.
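As an illustration of that process, a manually migrated rule built on the datamodel might look like the sketch below. The XDM field names, outcome constant, and threshold are assumptions for illustration; map them to your own sources, and pair the query with the trigger condition and playbook action you identified above.

```
// Sketch of a manually migrated rule over the XSIAM datamodel (XDM).
// Criteria: repeated failed authentications by one user from one source IP.
// Trigger: more than 20 failures in the queried window (threshold is assumed).
datamodel dataset = *
| filter xdm.event.outcome = XDM_CONST.OUTCOME_FAILED
| comp count() as failures by xdm.source.user.username, xdm.source.ipv4
| filter failures > 20
```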
Preparation steps: Understand the XSIAM architecture
XSIAM can ingest logs from many different sources, and not all components of the architecture are needed for a working deployment. For example, if a smaller startup has Mac laptops, Okta, and Google Workspace, then the Mac agent plus API data collection for Okta and Google Workspace is all that is necessary for a successful deployment. If a customer has significant on-premises logs and multiple locations, then the Broker VM component would be needed for those log sources in addition to the agents and API log collections. The goal is to keep the architecture as simple as possible so that troubleshooting and maintenance are kept to a minimum.
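Once collectors are in place, a quick XQL check can confirm that each collection path is actually producing data. The dataset name below is a hypothetical placeholder; substitute the real names shown in your tenant's Dataset Management page.

```
// Sketch: confirm an API collector's dataset is receiving events.
// "okta_sso_raw" is a placeholder dataset name; check Dataset Management
// in your tenant for the actual name created by your collector.
dataset = okta_sso_raw
| comp count() as event_count
```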