In recent years, many organizations around the world have implemented a Kafka-based internal integration platform. During that first implementation phase, the focus was naturally on internal integration: connecting the large set of applications in use through a high-speed, reliable, event-based platform.
However, after succeeding with internal integration, many large organizations began to consider how partner communication, which runs through one or more dedicated B2B gateways, fits into this picture.
Using Kafka to enhance B2B data flows between partners
Simply connecting business partners to Kafka topics creates security issues, as Kafka by design uses low-level, high-speed network protocols that are not intended to support (B2B) security features.
The good news, however, is that for decades, B2B gateways have been perfectly able to mediate between all kinds of different protocols and data formats. If a B2B gateway can connect with a Kafka platform as well, it is “just another back-end system” from a gateway perspective.
In recent years at Axway, we’ve seen two types of customer requests when it comes to connecting a Kafka platform to our B2B gateway, B2Bi; here’s an overview.
The pragmatic approach
Some customers simply wanted to be able to consume and publish messages on Kafka topics from B2Bi.
Only a very small percentage of the events on the Kafka platform are relevant for business partners, and conversely, some messages received from partners need to be shared on the Kafka platform for consumption by specific back-end applications.
From a B2Bi perspective, these customers see Kafka as “yet another back-end application.” That means connectivity to Kafka is used alongside direct connectivity to core back-end systems such as SAP.
The architecture-led approach
A growing group of large B2Bi customers takes a more fundamental, architecture-led approach. Their guiding principle is that any internal application connects to any other application using Kafka (no direct connections between internal applications).
Since a B2B gateway is, at the end of the day, “just another back-end application,” it follows that the B2B gateway connects to any other internal application using Kafka.
Direct connections between, for example, B2Bi and SAP are replaced by a connection through the Kafka event bus.
The clear advantage of this approach is that it increases agility and flexibility compared to direct connections between B2Bi and back-end applications, which means changes on either side can be made faster and more easily.
Configuring Kafka in Axway B2Bi
Here is a high-level description of how Axway B2Bi can be connected to a Kafka event platform to support either a more pragmatic or an architecture-led Kafka-to-B2B gateway connectivity strategy.
Out of the box, B2Bi supports many types of connection points between your back-end systems and B2Bi, such as file system, JMS, MQSeries, FTP, and many more. On top of that, the product includes a framework for creating custom integrations that can be re-used in any flow towards any partner.
The remainder of this article explains how to set up an integration between B2Bi and Kafka, both for consuming messages from Kafka in B2Bi and for publishing messages received by B2Bi from a partner on a Kafka topic.
One final remark: the setup shown below is deliberately basic, to keep it easy to follow. Of course, many specific settings can be added to support the unique use cases that large organizations might have, and our EDI experts are happy to help tailor the solution to your needs.
How to set up a B2Bi integration as a Kafka consumer
To set up a basic consumer, there are a few things you need in advance:
- A running instance of Kafka
- A configured partner
- A configured community
You start by creating a new Trading Pickup in the preconfigured partner.
In this Trading Pickup you enter:
- the Kafka Bootstrap Server: the address of the running Kafka instance
- the topic you want to connect the consumer to
- the timeout with which the consumer polls the topic for new messages
- the Consumer Group ID with which you will be addressing the Kafka topic
After setting up this Trading Pickup, it will be possible to use consumed Kafka messages in your B2Bi flows, but for the purpose of this blog post we will be using a limited flow instead.
This way, the messages are not processed through the B2Bi Integration engine and are instead passed along immediately to the designated Application Delivery, which in this case is a file system drop-off.
When the Kafka message is delivered, it is placed in the designated directory on the file system, where it can be picked up by one of your back-end applications for further use.
This file system delivery can just as easily be exchanged for any other application delivery in the B2Bi arsenal, sending the message directly to one of those applications.
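To make the moving parts more concrete, here is a rough sketch of the same pattern using the plain Java Kafka client rather than B2Bi itself. The broker address, topic name, group ID, and drop-off directory below are illustrative placeholders, not values from a real B2Bi configuration; the polling loop simply mirrors the Trading Pickup settings and the file system drop-off described above.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PartnerEventConsumer {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Kafka Bootstrap Server: the address of the running Kafka instance (placeholder)
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.internal:9092");
        // Consumer Group ID used when addressing the topic (placeholder)
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "b2bi-consumer-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Designated drop-off directory; assumed to exist already
        Path dropOffDirectory = Path.of("/data/partner-dropoff");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // The topic the consumer is connected to (placeholder name)
            consumer.subscribe(List.of("partner-events"));
            while (true) {
                // Poll timeout: how long to wait for new messages on the topic
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> message : records) {
                    // Mimic the file system drop-off: one file per consumed message
                    Path target = dropOffDirectory.resolve("message-" + message.offset() + ".txt");
                    Files.writeString(target, message.value());
                }
            }
        }
    }
}
```

In B2Bi itself, all of this is handled by the Trading Pickup and Application Delivery configuration; the sketch is only meant to show which standard Kafka concepts those fields map onto.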
How to set up a B2Bi integration as a Kafka producer
For the producer, most of the requirements are the same as for the consumer:
- A running instance of Kafka
- A configured partner
- A configured community
Setting up the producer is mostly the same process as setting up the consumer, but in reverse.
Starting with the way the message is picked up: any connector can be used. Here, we’ll use a file system pickup, which scans the designated directory for new files and assigns a “to” address that routes each file to the Kafka delivery.
After the message is picked up, much like with the consumer, we send it straight past processing to the designated Kafka partner, although any processing you’d want could be applied to the message first.
The delivery itself is very similar to the Trading Pickup discussed earlier. It contains:
- the Kafka Bootstrap Server: the address of the running Kafka instance
- the topic you want to connect the producer to
- the timeout with which the producer addresses the topic
- the Producer Client ID with which you will be addressing the Kafka topic
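Again, purely as an illustration of what those fields correspond to on the Kafka side, here is a minimal producer sketch using the standard Java client. The broker address, topic, client ID, and payload are placeholder assumptions, not values taken from a real B2Bi delivery configuration.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PartnerMessageProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Kafka Bootstrap Server: the address of the running Kafka instance (placeholder)
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.internal:9092");
        // Producer Client ID used when addressing the Kafka topic (placeholder)
        props.put(ProducerConfig.CLIENT_ID_CONFIG, "b2bi-producer");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish a message received from a partner onto the designated topic (placeholder payload)
            producer.send(new ProducerRecord<>("partner-events", "order-123", "<partner message payload>"));
            // Flush before closing so the message actually leaves the client
            producer.flush();
        }
    }
}
```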
The integration between Kafka and B2B gateways represents a significant step forward in modernizing enterprise integration architecture.
Whether taking the pragmatic approach of treating Kafka as another backend system or embracing an architecture-led strategy with Kafka as the central nervous system, organizations can leverage this powerful combination to enhance their B2B communications.
In upcoming articles, we’ll explore more advanced integration patterns, showing how organizations can transform raw Kafka events into well-designed APIs, or scale integration capabilities through reuse to create new business value from existing event streams.