Abstract
Learn more about activating the Broker VM with an Apache Kafka Collector applet.
Notice
Ingesting logs and data from external sources requires a Cortex XDR Pro per GB license.
Apache Kafka is an open-source distributed event streaming platform for high-performance data pipelines, streaming analytics, and data integration. Kafka records are organized into Topics, and the partitions of each Topic are spread across the brokers in the Kafka cluster. The bootstrap servers are the entry points for transferring data from Producers to Consumer Groups, which enable the Kafka cluster to save the offset of each partition in the Topic consumed by each group.
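To make these concepts concrete, here is a minimal sketch of a consumer that reads a Topic as part of a Consumer Group, written with the third-party kafka-python client (not part of the Kafka Collector itself); the topic name, group ID, and server address are hypothetical placeholders.

```python
# Hypothetical example: read a Topic as part of a Consumer Group with kafka-python.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "example-topic",                     # Topic to consume (placeholder name)
    bootstrap_servers="hostname1:9092",  # Bootstrap server used to discover the cluster
    group_id="example-consumer-group",   # Consumer Group whose offsets Kafka tracks
    auto_offset_reset="earliest",        # Start from the oldest record if no offset is saved
    enable_auto_commit=True,             # Periodically commit this group's offsets
)

for record in consumer:
    # Each record belongs to one partition; Kafka stores the last committed
    # offset per partition for this Consumer Group.
    print(record.topic, record.partition, record.offset, record.value)
```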
The Broker VM provides a Kafka Collector applet that enables you to monitor and collect events from Topics on self-managed, on-premises Kafka clusters directly into your log repository for query and visualization. The applet supports Kafka setups with no authentication, with SSL authentication, and with SASL_SSL authentication.
After you activate the Kafka Collector applet, you can collect events as datasets (<Vendor>_<Product>_raw) by defining the following.
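As a hedged illustration of the kind of records the applet collects, the following sketch uses the third-party kafka-python client to publish a JSON event to a topic; the topic name and server address are hypothetical, and the resulting dataset name depends on the vendor and product values you define when configuring the collection.

```python
# Hypothetical example: publish a JSON event to a topic with kafka-python.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="hostname1:9092",                        # Placeholder address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),  # Serialize dicts as JSON
)

# Once the Kafka Collector is configured to collect "example-topic", records
# like this one are ingested into the corresponding <Vendor>_<Product>_raw dataset.
producer.send("example-topic", {"event": "login", "user": "alice", "status": "success"})
producer.flush()
```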
Danger
Before activating the Kafka Collector applet, review and perform the following:
Apache Kafka version 2.5.1 or later.
Kafka cluster set up on premises, from which the data will be ingested.
Privileges to manage Broker Service configuration, such as Instance Administrator privileges.
Create a user in the Kafka cluster with the necessary permissions and the following authentication details (see the sketch after this list for one way to grant read permissions):
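The exact permissions and credential fields depend on your cluster and on the authentication method you select later in this procedure. As one hypothetical example (not the required configuration), the following sketch uses the third-party kafka-python admin client to grant a dedicated collector user read access to a topic and its consumer group.

```python
# Hypothetical example: grant a dedicated collector user read access with kafka-python.
from kafka.admin import (
    ACL,
    ACLOperation,
    ACLPermissionType,
    KafkaAdminClient,
    ResourcePattern,
    ResourceType,
)

# Connect as a cluster administrator; add SSL/SASL settings if your cluster requires them.
admin = KafkaAdminClient(bootstrap_servers="hostname1:9092")

principal = "User:xdr-collector"  # Placeholder principal for the collector's user

admin.create_acls([
    # Read and describe the topic that will be collected.
    ACL(principal, "*", ACLOperation.READ, ACLPermissionType.ALLOW,
        ResourcePattern(ResourceType.TOPIC, "example-topic")),
    ACL(principal, "*", ACLOperation.DESCRIBE, ACLPermissionType.ALLOW,
        ResourcePattern(ResourceType.TOPIC, "example-topic")),
    # Commit offsets as part of the collector's consumer group.
    ACL(principal, "*", ACLOperation.READ, ACLPermissionType.ALLOW,
        ResourcePattern(ResourceType.GROUP, "example-consumer-group")),
])
```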
Configure the Broker VM
Select → → → .
In either the Brokers tab or the Clusters tab, locate your Broker VM.
You can either right-click the Broker VM and select → , or in the APPS column, left-click → .
Configure the Kafka Connection.
Specify the Bootstrap Server List, which is the <hostname/ip>:<port> of the bootstrap server (or servers). You can specify multiple servers, separated by commas. For example, hostname1:9092,1.1.1.1:9092.
Select one of the Authentication Methods:
Test Connection to verify that you can connect to the Kafka server. An error message is displayed for each server connection test that fails.
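If you want to verify reachability outside of the applet, the following sketch approximates what Test Connection checks by trying each bootstrap server from the comma-separated list individually, again using the third-party kafka-python client; the addresses and credential values shown are hypothetical.

```python
# Hypothetical example: check each bootstrap server from the list with kafka-python.
from kafka import KafkaConsumer
from kafka.errors import KafkaError

bootstrap_server_list = "hostname1:9092,1.1.1.1:9092"  # Same format as the UI field

for server in bootstrap_server_list.split(","):
    try:
        consumer = KafkaConsumer(
            bootstrap_servers=server,
            # For the SSL or SASL_SSL authentication methods you would also pass,
            # for example: security_protocol="SASL_SSL", sasl_mechanism="PLAIN",
            # sasl_plain_username="xdr-collector", sasl_plain_password="...",
            # ssl_cafile="/path/to/ca.pem" (placeholder values).
        )
        print(server, "reachable; topics:", sorted(consumer.topics()))
        consumer.close()
    except KafkaError as error:
        # Mirrors the per-server error the applet shows when a connection test fails.
        print(server, "connection failed:", error)
```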
Configure the Topics Collection parameters.
(Optional) Add Connection to create another Kafka Connection for collecting data.
(Optional) Review the other available options for Connections.
As needed, you can return to your Kafka Collector settings to manage your connections. The following actions are available:
Edit the Connection details.
Rename a connection by hovering over the default connection name and selecting the edit icon to edit the text.
Delete a connection by hovering over the top area of the connection section, on the opposite side of the connection name, and selecting the delete icon. You can only delete a connection when you have more than one connection configured. Otherwise, this icon is not displayed.
Activate the Kafka Collector applet. Activate is enabled when all the mandatory fields are filled in.
After a successful activation, the APPS field displays Kafka with a green dot indicating a successful connection.
(Optional) To view metrics about the Kafka Collector, on the Broker VMs page, left-click the Kafka connection displayed in the APPS field for your Broker VM.
Cortex XDR displays Resources, including the amount of CPU, Memory, and Disk space the applet is using.
Manage the Kafka Collector.
After you activate the Kafka Collector, you can make additional changes as needed. To modify a configuration, left-click the Kafka connection in the APPS column to display the Kafka Collector settings, and select the following.
Ensure that you Save your changes. Save is enabled when all mandatory fields are filled in.
You can also Ingest Apache Kafka Events as Datasets.