Contrail Insights with Kafka
Set Up Kafka
Contrail Insights does not explicitly create Kafka topics. The Kafka broker cluster should be configured to auto-create topics. Alternatively, you can manually manage the topic creation. If you already have Kafka running, you can skip Set Up Kafka and go directly to Set Up Contrail Insights with Kafka.
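If you choose to manage topics manually instead of enabling auto-creation, topics can be created inside the broker container. The following is a sketch only: the container names match the Docker setup shown below, the topic name `grpc-components` is taken from the consumer example later in this document, and the partition and replication settings are illustrative placeholders.

```shell
# Sketch: manually create a topic in the Kafka broker container
# (container names assume the Docker setup shown below; partition
# and replication values are placeholders for a single-broker setup).
docker exec appformix-kafka kafka-topics --create \
  --zookeeper appformix-zookeeper:2181 \
  --replication-factor 1 \
  --partitions 1 \
  --topic grpc-components
```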
Setting up Kafka as a Docker container:

- Create a Docker network for this Kafka container and its dependencies to connect to:

  docker network create AppformixKafka

- Next, bring up ZooKeeper, which Kafka requires:

  docker run -d \
    --name appformix-zookeeper \
    --net AppformixKafka \
    -e ZOOKEEPER_TICK_TIME=2000 \
    -e ZOOKEEPER_CLIENT_PORT=2181 \
    -p 2181:2181 \
    --restart always \
    confluent/zookeeper
- Bring up the Kafka container by running the following. The variable <ip_address> must be specified appropriately:

  docker run -d \
    --net=AppformixKafka \
    --name=appformix-kafka \
    -p 9092:9092 \
    -e KAFKA_BROKER_ID=2 \
    -e KAFKA_ZOOKEEPER_CONNECT=appformix-zookeeper:2181 \
    -e KAFKA_ADVERTISED_HOST_NAME=appformix-kafka \
    -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://<ip_address>:9092 \
    -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 \
    confluentinc/cp-kafka:latest
- If Kafka with SSL is required, additional parameters are needed:

  docker run -d \
    --net=AppformixKafkaSSL \
    --name=appformix-kafka-ssl \
    -p 9092:9092 \
    -e KAFKA_BROKER_ID=2 \
    -e KAFKA_ZOOKEEPER_CONNECT=appformix-zookeeper-ssl:2181 \
    -e KAFKA_ADVERTISED_HOST_NAME=appformix-kafka-ssl \
    -e KAFKA_ADVERTISED_LISTENERS=SSL://<ip_address>:9092 \
    -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 \
    -e KAFKA_SECURITY_INTER_BROKER_PROTOCOL=SSL \
    -e KAFKA_SSL_KEYSTORE_FILENAME=kafka.broker.keystore.jks \
    -e KAFKA_SSL_KEYSTORE_CREDENTIALS=broker_keystore_creds \
    -e KAFKA_SSL_KEY_CREDENTIALS=broker_sslkey_creds \
    -e KAFKA_SSL_TRUSTSTORE_FILENAME=kafka.broker.truststore.jks \
    -e KAFKA_SSL_TRUSTSTORE_CREDENTIALS=broker_truststore_creds \
    -e KAFKA_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM=" " \
    -e KAFKA_SSL_CLIENT_AUTH=requested \
    -v <secret_files_path>:/etc/kafka/secrets \
    confluentinc/cp-kafka:latest
Replace <secret_files_path> with the path where all of the keystore and truststore files are present. For an example of how to create these keystores and truststores, see: https://github.com/confluentinc/cp-docker-images/blob/5.2.1-post/examples/kafka-cluster-ssl/secrets/create-certs.sh.

Note: If Kafka with SSL is required, all of the hosts monitored by Contrail Insights must have Python version 2.7.9 or later. The Certificate Authority (CA) used for the Kafka broker certificates must also be a trusted CA on all of the hosts monitored by Contrail Insights. For Contrail Insights containers to communicate with the Kafka broker(s), the CA file must be set as the group_vars/all variable appformix_kafka_ssl_ca at installation time.

Now that Kafka is set up, you can configure Contrail Insights with Kafka.
Set Up Contrail Insights with Kafka
To configure Contrail Insights with Kafka, a POST request must be sent to a Contrail Insights Platform API:
http://<controller_ip>:9000/appformix/controller/v2.0/kafka_config
The following fields must be sent in this request:
Name | The name of the Kafka cluster, which can be anything.
BootstrapServers | A list of host/port pairs to use for establishing the initial connection to the Kafka cluster. Each item in the list is a string in the host:port format.
To send the POST request using Ansible, use a request body similar to the following:

{
  "Name": "Kafka Config",
  "BootstrapServers": ["10.X.X.1:9092"]
}
If Kafka has been set up with SSL, then an additional field is needed:
{
  "Name": "Kafka Config",
  "BootstrapServers": ["10.X.X.1:9092"],
  "SecurityProtocol": "SSL"
}
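The request can also be sent with curl. This is a sketch, not the authoritative client: <controller_ip> is a placeholder, and any authentication headers your deployment requires must be added.

```shell
# Sketch: configure Kafka through the controller API with curl.
# <controller_ip> is a placeholder; add any authentication headers
# your deployment requires before sending.
payload='{"Name": "Kafka Config", "BootstrapServers": ["10.X.X.1:9092"]}'
# Sanity-check that the body parses as JSON before sending it.
echo "$payload" | python3 -m json.tool
curl -X POST "http://<controller_ip>:9000/appformix/controller/v2.0/kafka_config" \
  -H "Content-Type: application/json" \
  -d "$payload"
```

For the SSL case, add "SecurityProtocol": "SSL" to the payload as shown above.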
To send a POST body request from the Contrail Insights Dashboard:
- Select Settings in the upper right corner, then select AppFormix Settings > Kafka. Next, click + Add Config.

  Figure 1: AppFormix Settings for Kafka Page

- Enter a name for the Kafka configuration and list the BootstrapServers as a comma-separated list of strings, with each string in the host:port format.

- Click Setup after the fields have been populated.
Note: The following steps are for streaming network telemetry data to Kafka. All Contrail Insights alarms are automatically sent to Kafka once Kafka has been configured as described in the procedures above; no additional steps are needed for alarms. See Contrail Insights Alarms With Kafka.
- Click + Add Subscription to create a subscription.

- Create a Topic, select devices, and then select which Sensors/MIBs you want sent to Kafka. The specified data is then sent to Kafka under the specified topic. Click Create Subscription after the fields are populated.
Messages from Contrail Insights to Kafka
After configuration, messages from Contrail Insights can be read by any Kafka consumer. In the following command, <bootstrap_server> is one of the servers specified in the BootstrapServers variable above, and <topic> is the topic specified in the subscription you created:

/usr/bin/kafka-console-consumer --bootstrap-server <bootstrap_server> --topic <topic> --from-beginning
This command outputs messages to standard output. Output for topic grpc-components with sensor /components/ selected will look something like:

[
  {
    "AgentId": uuid,
    "Timestamp": 1533915694346,
    "RoomKey": "QFX0:Routing Engine0",
    "ResourcePath": "/components/",
    "Data": {
      "cpu-utilization-background": 0,
      "cpu-utilization-user": 1,
      "temperature": 36,
      "temperature-cpu": 36
    },
  },
]
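Each message is a JSON array of samples, so fields can be pulled out with any JSON tool. A minimal sketch, using a message shaped like the sample output above (with the uuid placeholder replaced by a string so the document is valid JSON):

```shell
# Sketch: extract the room key and CPU temperature from a telemetry
# message shaped like the sample above ("AgentId" filled with a
# placeholder string to make the message valid JSON).
msg='[{"AgentId": "agent-uuid", "Timestamp": 1533915694346,
      "RoomKey": "QFX0:Routing Engine0", "ResourcePath": "/components/",
      "Data": {"cpu-utilization-background": 0, "cpu-utilization-user": 1,
               "temperature": 36, "temperature-cpu": 36}}]'
echo "$msg" | python3 -c 'import sys, json
for sample in json.load(sys.stdin):
    print(sample["RoomKey"], sample["Data"]["temperature-cpu"])'
# prints: QFX0:Routing Engine0 36
```

In practice the same extraction would be applied to each message read from the consumer.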
Contrail Insights Alarms With Kafka
Contrail Insights alarms are automatically sent to Kafka if Kafka has been configured in Contrail Insights. See Set Up Contrail Insights with Kafka. Contrail Insights sends each alarm with the topic set to the alarm's name. For example, an alarm named host_cpu is sent to Kafka with topic host_cpu.
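Because the topic matches the alarm name, alarms can be read with the same console consumer shown earlier. A sketch for the host_cpu example, with <bootstrap_server> as a placeholder:

```shell
# Sketch: consume alerts for an alarm named host_cpu; the topic name
# matches the alarm name, and <bootstrap_server> is a placeholder for
# one of your configured BootstrapServers entries.
/usr/bin/kafka-console-consumer --bootstrap-server <bootstrap_server> \
  --topic host_cpu --from-beginning
```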