Note
Ingesting logs and data from external sources requires a Cortex XDR Pro per GB license.
Apache Kafka is an open-source distributed event streaming platform for high-performance data pipelines, streaming analytics, and data integration. Kafka records are organized into Topics, and the partitions of each Topic are spread across the bootstrap servers in the Kafka cluster. The bootstrap servers transfer data from Producers to Consumer Groups, and for each Consumer Group, Kafka saves the offset of each partition in the Topics that the group consumes.
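For orientation, the following is a minimal client-side sketch of these concepts, written with the open-source kafka-python library; the host, topic, and group names are placeholders, and this is unrelated to the applet's internal implementation.

# Minimal sketch using the kafka-python client (pip install kafka-python).
# The host, topic, and group names below are placeholders.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "example-topic",                     # Topic whose partitions we consume
    bootstrap_servers="hostname1:9092",  # Bootstrap server in the cluster
    group_id="example-group",            # Consumer Group this client joins
    auto_offset_reset="earliest",        # Where to start when no offset is saved
    enable_auto_commit=True,             # Periodically save consumed offsets
)

for record in consumer:
    # Kafka saves the committed offset per partition for this group, so a
    # restarted consumer resumes where the group left off.
    print(record.topic, record.partition, record.offset, record.value)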
The Broker VM provides a Kafka Collector applet that enables you to monitor and collect events from Topics on self-managed, on-premises Kafka clusters directly to your log repository for query and visualization purposes. The applet supports Kafka setups with no authentication, SSL authentication, or SASL SSL authentication.
After you activate the Kafka Collector applet, you can collect events as datasets (<Vendor>_<Product>_raw) by defining the following.
Kafka connection details including the Bootstrap Server List and Authentication Method.
Topics Collection configuration for the Kafka topics that you want to collect.
Following are the prerequisites for setting up the Kafka Collector applet.
Apache Kafka version 2.5.1 or later.
An on-premises Kafka cluster from which the data will be ingested.
Privileges to manage Broker Service configuration, such as Instance Administrator privileges.
Complete the following tasks before you begin setting up the Kafka Collector applet.
Create a user in the Kafka cluster with the necessary permissions and the following authentication details.
Broker Certificate and Private Key for an SSL connection.
Username and Password for a SASL SSL connection.
Configure the Broker VM.
Activate the Kafka Collector.
Go to the Broker VMs page. In either the Brokers tab or the Clusters tab, locate your Broker VM.
You can either right-click the Broker VM, or hover in the APPS column, and select the option to activate the Kafka Collector.
Configure the Kafka Connection.
Specify the Bootstrap Server List—the <hostname/ip>:<port> of the bootstrap server (or servers). You can specify multiple servers, separated by commas. For example, hostname1:9092,1.1.1.1:9092.
Select one of the Authentication Methods. A client-side sketch of all three methods appears after this list.
No Authentication—Default connection method for a new Kafka setup, which doesn’t require authentication. With a standard Kafka setup, any user or application can write messages to any topic, as well as read data from any topic.
SSL Authentication—Authenticate your connection to Kafka using an SSL certificate. Use this authentication method when the connection to the Kafka server is over secure TCP, and upload the following.
Broker Certificate—Signed certificate used for the applet to authenticate to the Kafka server.
Private Key—Private key used by the applet to decrypt the SSL messages coming from the Kafka server.
(Optional) CA Certificate—CA certificate that was used to sign the server and private certificates, and that is also used to authenticate the identity of the Kafka server.
SASL SSL (SCRAM-SHA-256)—Authenticate your connection to the Kafka server with your Username, Password, and optionally, your CA Certificate.
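The following is an illustrative client-side sketch of the three connection modes, again using the kafka-python library; the hosts, file paths, and credentials are placeholder assumptions, and the applet may implement these modes differently.

# Sketch of the three authentication modes described above, using kafka-python.
# Hosts, file paths, and credentials are placeholders.
from kafka import KafkaConsumer

# No Authentication: plain TCP to the bootstrap servers.
plain = KafkaConsumer(
    bootstrap_servers=["hostname1:9092", "1.1.1.1:9092"],
    security_protocol="PLAINTEXT",
)

# SSL Authentication: broker certificate and private key, optional CA cert.
ssl = KafkaConsumer(
    bootstrap_servers=["hostname1:9093"],
    security_protocol="SSL",
    ssl_certfile="broker.pem",   # Broker Certificate
    ssl_keyfile="broker.key",    # Private Key
    ssl_cafile="ca.pem",         # (Optional) CA Certificate
)

# SASL SSL (SCRAM-SHA-256): username and password over TLS.
sasl = KafkaConsumer(
    bootstrap_servers=["hostname1:9094"],
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-256",
    sasl_plain_username="kafka-user",
    sasl_plain_password="secret",
    ssl_cafile="ca.pem",         # (Optional) CA Certificate
)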
Test Connection to verify that you can connect to the Kafka server. An error message is displayed for each server connection test that fails.
Configure the Topics Collection parameters.
Select the Topic Subscription Method for subscribing to Kafka topics. Use List Topics to specify a list of topics. Use Regex Pattern Matching to specify a regular expression to search available topics.
Specify Topic(s) from the Kafka server. For the List Topics subscription method, use a comma-separated list of topics to subscribe to. For the Regex Pattern Matching subscription method, use a regular expression to match the Topic(s) to subscribe to.
(Optional) Specify a Consumer Group, a unique string or label that identifies the consumer group this log source belongs to. Each record published to a Kafka topic is delivered to one consumer instance within each subscribing consumer group, and Kafka uses these labels to load balance the records over all consumer instances in a group. When specified, the Kafka Collector uses the given consumer group. When not specified, Cortex XDR assigns the Kafka Collector applet to a new consumer group that is automatically generated for this log source, with the name PAN-<Broker VM device name>-<topic name>. A client-side sketch of both subscription methods appears below.
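The following is an illustrative kafka-python sketch of the two subscription methods and the consumer group label; the topic names, the pattern, and the group name are placeholders.

# Sketch of the two Topic Subscription Methods, using kafka-python.
# Topic names, the regex pattern, and the group name are placeholders.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    bootstrap_servers=["hostname1:9092"],
    group_id="PAN-mybrokervm-firewall",  # Consumer Group label
)

# List Topics: subscribe to an explicit list of topics.
consumer.subscribe(topics=["firewall-logs", "dns-logs"])

# Regex Pattern Matching: subscribe to every topic matching a pattern.
# (Calling subscribe again replaces the previous subscription.)
consumer.subscribe(pattern="^firewall-.*")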
Select the Log Format from the list as either RAW (default), JSON, CEF, LEEF, CISCO, or CORELIGHT. This setting defines the parser used to parse all the processed event types defined in the Topics field, regardless of the file names and extensions. For example, if the Topics field is set to * and the Log Format is JSON, all files (even those named file.log) in the cluster are processed by the collector as JSON, and any entry that does not comply with the JSON format is dropped (a sketch of this drop behavior appears after the note below).
Specify the Vendor and Product to be associated with each entry in the dataset. The vendor and product are used to define the name of your Cortex Query Language (XQL) dataset (<Vendor>_<Product>_raw).
Note
For CEF and LEEF logs, Cortex XDR takes the vendor and product names from the log itself, regardless of what you configure on this page.
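The following is an illustrative sketch of the drop behavior described above, assuming line-oriented records; the function and sample data are hypothetical and do not represent the applet's actual parser.

# Illustrative sketch of the JSON log-format behavior described above:
# every consumed record is parsed as JSON, and non-compliant entries are
# dropped. This is not the applet's actual parser.
import json

def parse_json_records(raw_records):
    """Yield parsed entries, silently dropping non-JSON records."""
    for raw in raw_records:
        try:
            yield json.loads(raw)
        except ValueError:  # covers JSONDecodeError and bad encodings
            continue        # Entry does not comply with JSON: dropped

# Example: the second record is dropped.
records = [b'{"event": "login", "user": "alice"}', b"plain text line"]
print(list(parse_json_records(records)))  # [{'event': 'login', 'user': 'alice'}]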
(Optional) Add Topic to create another Topics Collection. Each topic can be added to a server only once.
(Optional) Use the other available options for Topics Collection.
As needed, you can manage your Topics Collection settings with the following actions.
Edit the Topics Collection details.
Disable/Enable a Topics Collection by hovering over the top area of the Topics Collection section, on the opposite side of the Topics Collection name, and selecting the applicable button.
Rename a Topics Collection by hovering over the top area of the Topics Collection section, on the opposite side of the Topics Collection name, and selecting the pen icon.
Delete a Topics Collection by hovering over the top area of the Topics Collection section, on the opposite side of the Topics Collection name, and selecting the delete icon.
(Optional) Add Connection to create another Kafka Connection for collecting data.
(Optional) Other available options for Connections.
As needed, you can return to your Kafka Collector settings to manage your connections with the following actions.
Edit the Connection details.
Rename a connection by hovering over the connection name, and selecting the edit icon to edit the text.
Delete a connection by hovering over the top area of the connection section, on the opposite side of the connection name, and selecting the delete icon. You can only delete a connection when you have more than one connection configured. Otherwise, this icon is not displayed.
Activate the Kafka Collector applet. Activate is enabled when all the mandatory fields are filled in.
After a successful activation, the APPS field displays Kafka with a green dot indicating a successful connection.
(Optional) To view metrics about the Kafka Collector, in the Broker VMs page, hover over the Kafka connection displayed in the APPS field for your Broker VM.
Cortex XDR displays Resources, including the amount of CPU, Memory, and Disk space the applet is using.
Manage the Kafka Collector.
After you activate the Kafka Collector, you can make additional changes as needed. To modify a configuration, hover over the Kafka connection in the APPS column to display the Kafka Collector settings, and select the following.
Configure to redefine the Kafka Collector configuration.
Deactivate to disable the Kafka Collector.
Ensure that you Save your changes, which is enabled when all mandatory fields are filled in.
You can also Ingest Apache Kafka Events as Datasets.