Application Integration

Discovery And Delivery Of Data Via APIs

We find that some people easily dismiss us as a service they don’t need without understanding exactly what it does. It might be the preciseness of our name, and the assumption people make that streaming is just about real time data for a very specific set of use cases. This is why we work to tell more stories about what is possible, pushing people to move beyond their assumptions about what streaming is, and to understand the nuance of what a data stream can do for any existing API. To help our readers better understand what our service delivers, we wanted to describe the details of our service pipeline, which helps companies, organizations, institutions, and government agencies with the discovery and delivery of data via APIs:

Discovery – Using the API Gallery, we help our customers find the most relevant sources of data and content, then help them turn those APIs into real time data streams using the OpenAPI definitions that drive the growing catalog of API resources.
Caching – Our data streams provide efficient caching for any API we proxy, responding to changes and maintaining the availability of data and content using web standards.
Differential – Our service identifies the differences between successive API responses, understanding what has changed without any client-side logic necessary in your applications.
Events – Responding to the events that occur within an existing API, delivering what has been added and changed, and allowing applications to respond only to the meaningful events that occur.
Delivery – Making sure relevant data and content is delivered via web and mobile interfaces, but also into data lakes, machine learning model training, and other common data pipeline needs.
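The differential step above can be sketched as computing a JSON Patch (RFC 6902) style diff between two successive responses from the same API. This is only a minimal illustration comparing top-level keys; the function name and response shapes are assumptions, not the service’s actual implementation:

```python
# Minimal sketch: compute a JSON Patch (RFC 6902) style diff between
# two successive API responses, comparing top-level keys only.
# All names here are illustrative, not the service's actual code.

def diff_top_level(old, new):
    """Return a list of JSON Patch operations that turn `old` into `new`."""
    ops = []
    for key in old:
        if key not in new:
            ops.append({"op": "remove", "path": f"/{key}"})
        elif old[key] != new[key]:
            ops.append({"op": "replace", "path": f"/{key}", "value": new[key]})
    for key in new:
        if key not in old:
            ops.append({"op": "add", "path": f"/{key}", "value": new[key]})
    return ops

previous = {"status": "open", "count": 41}
latest = {"status": "open", "count": 42, "updated": "2018-06-01"}
patch = diff_top_level(previous, latest)
# patch contains a "replace" for /count and an "add" for /updated;
# unchanged keys produce no operations, so only changes travel downstream.
```

The point of the sketch is the caching and bandwidth win: unchanged keys produce no operations at all, so subscribers receive only what changed between polls.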

All of this represents the value of a data stream to us. You will notice we didn’t even mention the real time flow of information. A stream is so much more than just a flow of real time information when you are proxying existing APIs. Most people think of streaming APIs as delivering data via WebSockets, whereas we focus on the meaningful delivery of data pipelines using Server-Sent Events (SSE), and on providing updates using JSON Patch. We find this to be a much more nuanced approach to streaming data using APIs, built upon the same HTTP protocol developers are already using, but more efficient about caching, and about the differential delivery of meaningful changes and events as they occur.
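The client side of that SSE-plus-JSON-Patch flow can be sketched as follows: an initial event carries the full document, and later events carry only the patch describing what changed. The event names and payload layout here are assumptions for illustration, not a documented wire format:

```python
import json

# Sketch of a client applying JSON Patch updates received as SSE events.
# The layout (a "data" snapshot followed by "patch" events) is an assumed
# convention for this example, not the service's documented protocol.

def apply_patch(doc, ops):
    """Apply a list of top-level JSON Patch operations to a dict."""
    for op in ops:
        key = op["path"].lstrip("/")
        if op["op"] in ("add", "replace"):
            doc[key] = op["value"]
        elif op["op"] == "remove":
            doc.pop(key, None)
    return doc

# Simulated SSE frames: (event name, data payload) pairs, as an
# EventSource client would see them after parsing the stream.
frames = [
    ("data", json.dumps({"status": "open", "count": 41})),
    ("patch", json.dumps([{"op": "replace", "path": "/count", "value": 42}])),
]

state = {}
for event, payload in frames:
    if event == "data":
        state = json.loads(payload)      # full snapshot
    elif event == "patch":
        state = apply_patch(state, json.loads(payload))  # incremental change
# state is now {"status": "open", "count": 42}
```

Because the transport is plain HTTP, the same connection, proxies, and caching behavior developers already rely on keep working; only the incremental patches flow after the first snapshot.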

When you think about what we do as the discovery and delivery of data via APIs, with the streaming aspect being more about efficient event-driven data pipelines than about just real time data, you begin to see what we do differently. We are a new generation of streaming data services, focused on augmenting and extending existing APIs with the event-driven infrastructure they will need to compete in the next wave of growth in the API sector. We hope this helps you think beyond just the real time component of what we do, and see the nuance of how we can help you discover and deliver the data you need to get business done in an Internet age.


**Original source: blog**