To make sense of all the logs arriving from different sources on GCP Pub/Sub, I created this little serverless framework that uses Kafka streams for alert correlation on Kubernetes.
Installing Kubeless
Follow these instructions and customize the Kubeless config file at kubeless-config.yaml,
and then run:
$ make kl
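If the install went well, the Kubeless controller and the Kafka/Zookeeper pods it brings along should come up in the kubeless namespace. A quick sanity check with plain kubectl (nothing specific to this repo):

$ kubectl get pods --namespace=kubeless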
Creating the Kubeless Topic
In Kafka, messages are published to topics. The functions run by Kubeless (the consumers) will receive these messages, so create the topic:
$ kubeless topic create reactor
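With the topic in place, a function can be deployed to consume it. As a rough sketch only (the function name, file, and handler below are hypothetical placeholders, and the flags assume a Kubeless release whose CLI bundles Kafka topic triggers):

$ kubeless function deploy reactor --runtime python2.7 --handler reactor.handler --from-file reactor.py --trigger-topic reactor

Those same older Kubeless releases also expose a publish subcommand, which is handy for pushing a test message through before the real pipeline is up:

$ kubeless topic publish --topic reactor --data 'hello'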
Firing Up Containers
To run Logstash, Elasticsearch, Zookeeper, and Kafka (the producers) so that the pipeline outputs to Kafka's topic for Kubeless, run:
$ make pipeline
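To confirm that log events are actually landing on the reactor topic, you can attach a console consumer inside the Kafka pod. This is just a sketch: the pod name is a placeholder, and the exact location of kafka-console-consumer.sh (and whether it takes --bootstrap-server or the older --zookeeper flag) depends on the Kafka image the pipeline uses:

$ kubectl exec -it <kafka-podname> --namespace=kubeless -- kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic reactor --from-beginning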
Debugging
To debug any of the pods (kubeless, kafka, or zoo), grab the pod name with:
$ make pods
and then run:
$ kubectl logs <podname> --namespace=kubeless
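To stream the logs as they happen instead of taking a one-off snapshot, or to see scheduling events for a pod that won't start, the standard kubectl flags work here too:

$ kubectl logs -f <podname> --namespace=kubeless
$ kubectl describe pod <podname> --namespace=kubeless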
References
- Kubeless github repository.
- Kubernetes CustomResourceDefinition.
- Kubeless serverless documentation.
- Kafka Concepts and Common Patterns.
Enjoy and let me know what you think! :)
PS: If you want to learn more about GCP, check my resources and labs here.