Application Integration

Using KSQL To Apply Transformations To Kafka Data Streams

We are learning more about KSQL, the streaming SQL engine for Apache Kafka. It is a query language you can use to express, and then apply, transformations to data flowing through Kafka streams. KSQL combines the power of a query language with Kafka data flows to deliver more precise and meaningful streams of data on the popular platform for building real-time data pipelines and streaming apps. Some of the most common use cases for applying KSQL to real-time Kafka data streams are listed below, with a short sketch of what a few of them look like in practice after the list:

– Applying schema to data
– Filtering and masking data
– Changing data structures
– Changing the serialization format
– Enriching streams of data
– Unifying multiple streams of data
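
To make these use cases more concrete, here is a minimal sketch of what a few of them can look like in KSQL. The topic, stream, and field names (payments, payments_filtered, card_number, and so on) are hypothetical, and the exact syntax varies between KSQL and later ksqlDB releases, so treat this as an illustration rather than a copy-and-paste recipe:

```sql
-- Apply a schema to a raw JSON topic by registering a stream over it
-- (topic and field names are hypothetical).
CREATE STREAM payments_raw (
    payment_id  VARCHAR,
    user_id     VARCHAR,
    card_number VARCHAR,
    amount      DOUBLE
  ) WITH (
    KAFKA_TOPIC  = 'payments',
    VALUE_FORMAT = 'JSON'
  );

-- Filter and mask: keep only larger payments, hide the card number,
-- and change the serialization format by writing the result out as Avro.
CREATE STREAM payments_filtered WITH (VALUE_FORMAT = 'AVRO') AS
  SELECT payment_id,
         user_id,
         MASK(card_number) AS card_number,
         amount
  FROM payments_raw
  WHERE amount > 100;

-- Enrich the stream by joining it against a table of reference data
-- (the users topic and its fields are also assumptions).
CREATE TABLE users (user_id VARCHAR, country VARCHAR)
  WITH (KAFKA_TOPIC = 'users', VALUE_FORMAT = 'JSON', KEY = 'user_id');

CREATE STREAM payments_enriched AS
  SELECT p.payment_id, p.amount, u.country
  FROM payments_filtered p
  LEFT JOIN users u ON p.user_id = u.user_id;

-- Unify multiple streams of data by inserting a second, identically
-- shaped stream (assumed to exist) into a combined stream.
CREATE STREAM all_payments AS SELECT * FROM payments_filtered;
INSERT INTO all_payments SELECT * FROM payments_filtered_eu;
```

Each of these statements runs continuously, so the derived streams keep updating as new events land on the source topics.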

Providing a query language layer on top of your data streams reflects the evolution of how we move our data around. Querying isn't just about getting at data in storage; it can just as easily be about data in transit. This allows us to query data as it moves around the enterprise, and then potentially make it available via web APIs, making data consumption much more precise and tailored for each consumer.
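
As a rough illustration of querying data in transit, a continuous query against a stream keeps returning rows as new events arrive, rather than producing a one-time result set over stored data. The stream and field names below are again hypothetical, and newer ksqlDB releases require an EMIT CHANGES clause on this kind of push query:

```sql
-- A continuous (push) query: results keep flowing as new events arrive
-- on the underlying topic, instead of returning a finite result set.
SELECT payment_id, amount
FROM payments_filtered
WHERE amount > 500;
```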

We are working to better understand how organizations are putting Kafka to work and using KSQL to deliver their topic streams. Some of our customers are putting the Streamdata.io Kafka connector to work in the on-premises edition of our service. We want to make sure our solutions are in alignment with how they are using Kafka, and that we serve as the last mile of distribution, augmenting the industrial-grade capacity of Kafka with HTTP streams that are sent using Server-Sent Events (SSE) and provide incremental updates with JSON Patch.


Original source: streamdata.io blog