API Management For Kafka

We’ve had several conversations with folks lately about the possibility of applying API management to the Kafka layers of their operations: being able to apply the same metering, logging, and ultimately measurement of value exchange across Kafka pipes, and to invoice internal or partner groups for access. Much of this could be accomplished by using web APIs as the last mile of delivery for Kafka, but we wanted to explore whether any native options are emerging that would help companies measure the value that is exchanged as part of their Kafka usage.
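
To make the "web API as the last mile" idea concrete, here is a minimal sketch of an HTTP endpoint fronting a Kafka topic while metering usage per API key. It is an assumption-laden illustration, not a production gateway: the endpoint path, the `usage_ledger` dictionary, and the API keys are all hypothetical, and it assumes the kafka-python client against a broker at `localhost:9092`.

```python
# Hypothetical "last mile" web API in front of Kafka: clients fetch
# records over HTTP, and every request is metered against their API key.
from collections import defaultdict

from flask import Flask, jsonify, request, abort
from kafka import KafkaConsumer

app = Flask(__name__)

# Usage ledger: api_key -> {"requests": n, "bytes": n}; a real gateway
# would persist this and feed it into billing and invoicing.
usage_ledger = defaultdict(lambda: {"requests": 0, "bytes": 0})
API_KEYS = {"partner-a-key", "internal-team-key"}  # hypothetical keys

@app.route("/topics/<topic>/records")
def read_records(topic):
    api_key = request.headers.get("X-Api-Key")
    if api_key not in API_KEYS:
        abort(401)

    # Each API key gets its own consumer group, so Kafka itself keeps
    # track of how far each consumer has read. Creating a consumer per
    # request is wasteful; it keeps the sketch self-contained.
    consumer = KafkaConsumer(
        topic,
        bootstrap_servers="localhost:9092",
        group_id=f"api-gateway-{api_key}",
        auto_offset_reset="earliest",
        consumer_timeout_ms=1000,  # stop polling once the topic is drained
    )
    records = [msg.value.decode("utf-8", errors="replace") for msg in consumer]
    consumer.close()

    # Meter the value exchanged: requests made and bytes delivered.
    usage_ledger[api_key]["requests"] += 1
    usage_ledger[api_key]["bytes"] += sum(len(r) for r in records)
    return jsonify({"topic": topic, "records": records})

if __name__ == "__main__":
    app.run(port=8080)
```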

Between 2006 and 2016, API management providers, from Mashery, 3scale, and Apigee to the newer generation of Kong and Tyk, as well as cloud providers like AWS, Azure, and Google, have been helping organizations develop an awareness of how their digital assets are being consumed. API management is often showcased as just managing a handful of free, pro, and enterprise tiers of access, but in reality it helps organizations make sure every internal, partner, and public API is measured, and every bit of usage is accounted for and invoiced against. Even if API consumers aren’t actively paying their invoices, such as between internal groups, it is still a healthy exercise to understand who is using what, and where value gets exchanged at the API transaction layer.
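
As a toy illustration of that accounting exercise, here is a sketch of the kind of tiered metering an API management layer performs: count every call against a plan’s quota and turn usage into an invoice line, whether or not anyone actually pays it. The tier names, quotas, and prices here are all made up for the example.

```python
# Toy model of API-management-style metering: tiers, quotas, and a
# usage ledger that can be turned into an invoice, even for internal
# consumers who never pay it.
from collections import defaultdict

TIERS = {  # hypothetical plans
    "free":       {"monthly_quota": 1_000,   "price_per_call": 0.0},
    "pro":        {"monthly_quota": 100_000, "price_per_call": 0.001},
    "enterprise": {"monthly_quota": None,    "price_per_call": 0.0005},
}

usage = defaultdict(int)  # consumer_id -> calls this billing period

def record_call(consumer_id: str, tier: str) -> bool:
    """Meter one API call; return False if the consumer is over quota."""
    quota = TIERS[tier]["monthly_quota"]
    if quota is not None and usage[consumer_id] >= quota:
        return False  # over quota: throttle or upsell
    usage[consumer_id] += 1
    return True

def invoice(consumer_id: str, tier: str) -> float:
    """Turn the metered usage into a (possibly zero-dollar) invoice."""
    return usage[consumer_id] * TIERS[tier]["price_per_call"]

# Internal groups get invoiced too, even if nobody pays: the point is
# knowing who consumed what.
for _ in range(250):
    record_call("internal-analytics", "pro")
print(invoice("internal-analytics", "pro"))  # 0.25
```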

Our question today focuses on whether this level of awareness is being developed for Kafka pipes. Are companies measuring consumption, and reporting, invoicing, or otherwise accounting for it? Is there schema-, object-, or resource-level tracking of what is being consumed? Are specific topics more popular than others, and do they possess more value? Can Kafka providers break down who all their consumers are, articulate what they’ve consumed, and quantify the value associated with that consumption? Is there a source and sink breakdown of value exchange? In general, what awareness-building solutions are there for Kafka pipelines that go beyond the technical, and help us understand more about the business of moving our digital bits around using Kafka?
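
Kafka does expose some raw material for this kind of accounting. As a sketch of where one could start, the snippet below uses kafka-python’s admin client to enumerate consumer groups and their committed offsets, treating offsets as a rough proxy for messages consumed per group per topic. It assumes a broker at `localhost:9092`, and it counts messages rather than bytes or business value, so it is a starting point rather than an invoicing solution.

```python
# Rough consumption accounting from Kafka's own bookkeeping: committed
# consumer-group offsets tell us approximately how many messages each
# group has read from each topic.
from collections import defaultdict

from kafka import KafkaAdminClient

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")

# consumed[group][topic] -> total committed offset (~= messages read)
consumed = defaultdict(lambda: defaultdict(int))

for group_id, _protocol in admin.list_consumer_groups():
    offsets = admin.list_consumer_group_offsets(group_id)
    for tp, offset_meta in offsets.items():
        if offset_meta.offset >= 0:  # -1 means no committed offset yet
            consumed[group_id][tp.topic] += offset_meta.offset

for group_id, topics in consumed.items():
    for topic, total in topics.items():
        print(f"group={group_id} topic={topic} messages_consumed~={total}")
```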

We’ll be looking to answer this question by profiling as many of the Kafka service providers that have emerged as we can, and seeing what features they offer. We’ll also look at any logging and analysis tools that have emerged in the ecosystem, and reach out to many of the existing API management providers to see if they have solutions that go beyond simple web APIs and help meter, bill, and analyze Kafka traffic. We assume solutions are out there; they just haven’t gotten the mainstream attention the more common API management providers have enjoyed. If you have seen anything out there, we’d love to hear from you. If you have the need to quantify the value exchange at the Kafka sink, source, or client layers of your operations, we’d love to talk. As we evolve our SaaS and on-premise versions of Streamdata.io, and our Kafka connector, we are looking to better understand what the market needs and opportunities are for streamlining the exchange of value at the API layer.


Original source: streamdata.io blog