Event log archiving via Kafka

Platform events are currently not retained long-term in the database.

One solution is to read the events from the database and push them to Kafka, where they remain available for up to 7 days and can be consumed by an external system such as a Kafka connector.
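Seven days matches Kafka's default log retention (log.retention.hours=168). If a different window is needed, retention can be adjusted per topic; a minimal sketch, assuming the broker is reachable at localhost:9092:

```bash
# Set retention on the "events" topic to 7 days (604800000 ms).
# localhost:9092 is an assumed broker address; adjust to your deployment.
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name events \
  --add-config retention.ms=604800000
```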

The following script can be scheduled via cron to run every 5 minutes; it checks for new events and sends them to a Kafka topic called “events”:

https://github.com/metalsoft-io/scripts/blob/main/helper-scripts/cron_mysql2kafka.sh
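For orientation, here is a minimal sketch of what such a forwarder does, assuming an events(id, body) table in a metalsoft database, MySQL credentials in /root/.my.cnf, and a broker at localhost:9092; the actual logic lives in the linked cron_mysql2kafka.sh:

```bash
#!/bin/bash
# Sketch of a MySQL-to-Kafka forwarder. The table, column and database
# names and the broker address are illustrative assumptions; see the
# linked cron_mysql2kafka.sh for the real implementation.
set -euo pipefail

STATE_FILE=/root/.mysql2kafka_last_id   # highest event id already shipped
BROKER=localhost:9092                   # assumed Kafka bootstrap server
TOPIC=events                            # topic named in this document

LAST_ID=$(cat "$STATE_FILE" 2>/dev/null || echo 0)

# Fetch events newer than the last shipped id (assumed schema: events(id, body)).
# MySQL credentials are assumed to come from /root/.my.cnf.
ROWS=$(mysql --batch --skip-column-names \
  -e "SELECT id, body FROM events WHERE id > ${LAST_ID} ORDER BY id" metalsoft)

if [ -n "$ROWS" ]; then
  # Publish one Kafka message per row, then persist the newest id.
  printf '%s\n' "$ROWS" | cut -f2- \
    | kafka-console-producer.sh --bootstrap-server "$BROKER" --topic "$TOPIC"
  printf '%s\n' "$ROWS" | tail -n 1 | cut -f1 > "$STATE_FILE"
fi
```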

Save the script as /root/cron_mysql2kafka.sh and add it to cron to run every 5 minutes.
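For example, the corresponding crontab entry (added with crontab -e as root) would be:

```bash
# Run the forwarder every 5 minutes.
*/5 * * * * /root/cron_mysql2kafka.sh
```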

Some examples of Kafka connectors that can pull data from Kafka and push it to other systems (a sketch of registering one such connector follows the links):

https://hevodata.com/learn/kafka-to-mysql/

https://www.syslog-ng.com/community/b/blog/posts/consuming-logs-from-a-kafka-topic-using-syslog-ng
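For illustration, one such sink is Confluent's JDBC Sink Connector, which can be registered through the Kafka Connect REST API. In the sketch below, the Connect endpoint, the JDBC URL and the credentials are assumptions; note also that the JDBC sink expects structured records, so converter settings may be required depending on how the events are serialized:

```bash
# Register a JDBC sink that writes the "events" topic into MySQL.
# localhost:8083, the JDBC URL and the credentials are assumed values.
curl -X POST http://localhost:8083/connectors \
  -H 'Content-Type: application/json' \
  -d '{
        "name": "events-mysql-sink",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
          "topics": "events",
          "connection.url": "jdbc:mysql://mysql-host:3306/archive",
          "connection.user": "archiver",
          "connection.password": "secret",
          "auto.create": "true"
        }
      }'
```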

The Kafka connector must be installed by the customer on one of the Kubernetes nodes in the cluster so that it can reach the Kafka server from within the Metalsoft Controller deployment.
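To confirm that events are reaching the topic, the console consumer shipped with Kafka can be run from a machine that can reach the broker; localhost:9092 below is an assumed address:

```bash
# Read back all messages currently retained on the "events" topic.
# localhost:9092 is an assumed broker address.
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic events --from-beginning
```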