Apache Kafka is a distributed event streaming platform, similar to RabbitMQ, that provides a highly scalable and fault-tolerant messaging system.
In my organization we have started producing and consuming messages in Kafka by doing something like the following. We are using the Confluent Kafka .NET implementation, but instead of wrapping the functionality directly in code stages we created our own .dll and imported it into Blue Prism; the code stages in the VBO then reference the .dll. At this stage we have a working proof of concept in which we can produce messages to a Kafka topic and also consume from that same topic using Blue Prism.
There are no examples of using Blue Prism with Kafka that I'm aware of at the moment. We do have Kafka in our backlog as a future integration, though. Once that integration is built, an example process will be included with it.
In the meantime, we do have a RabbitMQ integration available on the DX. While it's not Kafka, it could give you some ideas of how Blue Prism would interact in a Kafka deployment, and it could provide some context for what developing your own Kafka integration would look like. In a nutshell, you would pick a .NET implementation of Kafka (e.g. kafka-net) and wrap its functionality with code stages in the VBO.
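To illustrate the produce/consume flow described above, here is a minimal sketch using the confluent-kafka Python client (the Python sibling of the Confluent .NET library). A .NET wrapper .dll around Confluent.Kafka, called from Blue Prism code stages, would follow the same shape. The broker address, topic name, and group id below are illustrative assumptions, not values from the thread.

```python
def build_config(bootstrap_servers, group_id=None):
    """Build a Kafka client configuration dict; group_id is only needed for consumers."""
    config = {"bootstrap.servers": bootstrap_servers}
    if group_id is not None:
        config["group.id"] = group_id
        # Start from the oldest unread message when the group has no committed offset.
        config["auto.offset.reset"] = "earliest"
    return config


def produce_message(topic, value, bootstrap_servers="localhost:9092"):
    """Publish one UTF-8 string to a topic and wait for broker acknowledgement."""
    from confluent_kafka import Producer  # requires: pip install confluent-kafka
    producer = Producer(build_config(bootstrap_servers))
    producer.produce(topic, value=value.encode("utf-8"))
    producer.flush()  # block until delivery is confirmed


def consume_one(topic, group_id, bootstrap_servers="localhost:9092", timeout=10.0):
    """Read a single message from a topic, or return None on timeout/error."""
    from confluent_kafka import Consumer
    consumer = Consumer(build_config(bootstrap_servers, group_id))
    consumer.subscribe([topic])
    try:
        msg = consumer.poll(timeout)  # returns None if nothing arrives in time
        if msg is None or msg.error():
            return None
        return msg.value().decode("utf-8")
    finally:
        consumer.close()
```

The key design point the original poster makes still applies: keeping this logic in a separate library (a .dll in the .NET case) and calling it from thin code stages keeps the VBO simple and the Kafka client upgradeable independently of the process.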
There are two different implementations of the Kafka integration. One supports Avro for serialization, while the other does not include any specific/additional serialization capability.
Kafka - No Specific Serialization Support
Kafka - Avro Support
Can you please share the asset link?
Can you share the Kafka .dll that you created? I am trying to check the feasibility of the same in one of my processes; it would be of great help.
Are there any (third-party) examples of consuming Kafka messages with Blue Prism?