Application Integration

Efficient Last Mile Delivery Of Information Internally And Externally With Streaming APIs

Previous waves of web services built with SOAP focused on delivering data internally using an RPC format, something that became externally focused with the more RESTful, resource-centric web API approach we see today. While much of the external conversation is still dominated by web APIs, internal information delivery is increasingly shifting toward message brokering platforms like Kafka, which provide a high-volume, high-reliability approach to moving data around within the corporate domain. To match this shift in the internal landscape, Server-Sent Events (SSE) is emerging to more efficiently handle the external “last mile” of data delivery, through streaming APIs.

 

SSE-driven streams are emerging as the last mile solution

Server-Sent Events (SSE) can be used to take existing web APIs and stream the data they provide to web and mobile applications. SSE provides dedicated connections on top of existing web APIs, acting as a broker for efficient caching and updating of user interfaces as changes occur. There is no repeated polling and no refreshing of the UI, just streams of data coming in and being presented to users as they are updated, bringing data home via the last mile of its journey before it is consumed by users in their browsers and applications. This is most commonly seen in externally focused applications, but it can just as easily be employed for internal use cases.
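To make the browser side of this concrete, here is a minimal sketch of a client consuming an SSE stream. The endpoint URL, event payload shape, and DOM structure are hypothetical placeholders, standing in for whatever existing web API has been exposed as a stream; the point is that the UI is patched as events arrive, with no polling loop in the client.

```typescript
// Minimal sketch of a browser client consuming an SSE stream.
// The URL and payload shape below are hypothetical examples.
const source = new EventSource("https://api.example.com/stocks/stream");

// Each message carries only the data that changed, so the UI is
// updated in place rather than polled and fully refreshed.
source.onmessage = (event: MessageEvent<string>) => {
  const update = JSON.parse(event.data) as { symbol: string; price: number };
  const cell = document.querySelector(`[data-symbol="${update.symbol}"]`);
  if (cell) {
    cell.textContent = update.price.toFixed(2);
  }
};

// The browser automatically retries a dropped connection, which is
// part of what makes SSE a good fit for the last mile of delivery.
source.onerror = () => {
  console.warn("Stream interrupted; EventSource will attempt to reconnect.");
};
```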

Streams can be established using Server-Sent Events (SSE) on top of internally exposed web APIs, but they can also be built on other internally consumed 3rd party APIs like Salesforce, or on message brokers like Kafka, using web APIs and connector solutions. SSE-driven streams are emerging as the last mile solution, whether internal or external, making sure data is efficiently delivered wherever it is needed. This makes SSE a strong connector option for ensuring that internal as well as external data sources reach where they are needed within an organization, even alongside the high-volume, high-reliability systems already in use by IT and development groups.
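On the server side, a connector of this kind can be sketched as a thin SSE layer over an existing API. The sketch below assumes Node.js 18+ (for the global fetch) and uses a hypothetical upstream endpoint and polling interval; a production setup would more likely use a purpose-built connector or a Kafka consumer instead of polling, but the SSE contract with the client is the same.

```typescript
// Minimal sketch of an SSE "connector" in Node.js (TypeScript) that sits
// on top of an existing web API. The upstream URL, path, interval, and
// payload are hypothetical placeholders for whatever internal or 3rd
// party API is being bridged into a stream.
import http from "node:http";

const UPSTREAM_URL = "https://internal.example.com/api/orders"; // hypothetical
const POLL_INTERVAL_MS = 5000;

http
  .createServer((req, res) => {
    if (req.url !== "/orders/stream") {
      res.writeHead(404).end();
      return;
    }

    // Standard SSE response headers: keep the connection open and
    // tell the client to expect an event stream.
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });

    let lastPayload = "";

    // Poll the existing web API on the server side and push only
    // changed responses to the client as SSE messages.
    const timer = setInterval(async () => {
      try {
        const upstream = await fetch(UPSTREAM_URL);
        const payload = await upstream.text();
        if (payload !== lastPayload) {
          lastPayload = payload;
          res.write(`data: ${payload}\n\n`);
        }
      } catch {
        // Keep the stream open; the next poll may succeed.
      }
    }, POLL_INTERVAL_MS);

    req.on("close", () => clearInterval(timer));
  })
  .listen(8080);
```

The polling here is deliberately naive; it only illustrates that the last mile contract stays the same whether the data behind the stream comes from a web API, a 3rd party API, or a broker like Kafka.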

This post is part of a series we are using to answer customer questions about where Streamdata.io, and streaming API technology in general, fits into the existing API technology stack: how it differs from web services, and how it augments and complements existing internal, partner, and 3rd party APIs. These questions encourage us to highlight the last mile benefits of developing streams of data around existing API infrastructure, moving API operations toward a stream, topic, and event-driven mindset, where data flows to where it is needed and users can subscribe or unsubscribe to the data they need to get their job done.

**Original source: streamdata.io blog**