We are learning more about the AWS Serverless Application Repository, trying to understand what types of functions people are publishing there and how it might fit into the bigger event-driven architectural picture. The repository is a place to discover, deploy, and publish serverless applications, and we want to make sure we keep track of it as it continues to evolve.
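For a sense of what "deploy" means here: a published application can be stood up in your own AWS account by asking the repository to render it into a CloudFormation change set and then executing that change set. Below is a minimal sketch of that flow using boto3, assuming the serverlessrepo client's create_cloud_formation_change_set call; the application ARN and stack name are placeholders we made up for illustration, not a real application.

```python
import boto3

# Placeholder values for illustration only -- substitute a real application
# ARN from the Serverless Application Repository and your own stack name.
APPLICATION_ID = 'arn:aws:serverlessrepo:us-east-1:123456789012:applications/example-app'
STACK_NAME = 'example-app-stack'

serverlessrepo = boto3.client('serverlessrepo')
cloudformation = boto3.client('cloudformation')

# Ask the repository to render the application into a CloudFormation change set.
change_set = serverlessrepo.create_cloud_formation_change_set(
    ApplicationId=APPLICATION_ID,
    StackName=STACK_NAME,
    Capabilities=['CAPABILITY_IAM'],  # many apps create IAM roles for their functions
)

# Execute the change set to create the stack and the application's resources.
cloudformation.execute_change_set(ChangeSetName=change_set['ChangeSetId'])
```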
To understand how the AWS Serverless Application Repository fits into the event-driven picture, we searched to see what types of streaming applications are being developed there. Here is what we've come across so far:
– dynamodb-process-stream-python3 – An Amazon DynamoDB trigger that logs the updates made to a table.
– kinesis-firehose-process-record-streams-as-source – An Amazon Kinesis Firehose stream processor that accesses the Kinesis Streams records in the input and returns them with a processing status (see the sketch after this list).
– splunk-dynamodb-stream-processor – Streams events from Amazon DynamoDB Streams to the Splunk HTTP Event Collector (HEC).
– splunk-kinesis-stream-processor – Streams events from an Amazon Kinesis stream to the Splunk HTTP Event Collector (HEC).
– kinesis-firehose-process-record-streams-as-source-python – An Amazon Kinesis Firehose stream processor that accesses the Kinesis Streams records in the input and returns them with a processing status.
– kinesis-process-record – An Amazon Kinesis stream processor that logs the data being published.
– kinesis-process-record-python – An Amazon Kinesis stream processor that logs the data being published.
– kinesis-firehose-apachelog-to-csv-python – An Amazon Kinesis Firehose stream processor that converts input records from Apache Common Log format to CSV.
– kinesis-firehose-syslog-to-csv-python – An Amazon Kinesis Firehose stream processor that converts input records from RFC3164 Syslog format to CSV.
– Log4J-To-S3-Helper – This app provides two helper Lambda functions that are useful for building a data pipeline for streaming Log4J log events from your application host to S3. Later you can use Amazon Athena to query these log events in S3 using SQL.
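Several of the applications above follow the Kinesis Firehose transformation pattern, where a Lambda function receives a batch of records and hands each one back with a processing status. As a rough illustration of that general shape (a minimal pass-through sketch of our own, not the code of any of the listed apps):

```python
import base64

def lambda_handler(event, context):
    """Minimal sketch of a Kinesis Firehose transformation Lambda.

    Firehose invokes the function with a batch of records; each record is
    returned with a processing status of Ok, Dropped, or ProcessingFailed.
    """
    output = []
    for record in event['records']:
        payload = base64.b64decode(record['data'])
        # A real processor would transform the payload here (for example,
        # Apache log lines to CSV); this sketch just passes the data through.
        output.append({
            'recordId': record['recordId'],
            'result': 'Ok',
            'data': base64.b64encode(payload).decode('utf-8')
        })
    return {'records': output}
```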
The list provides an interesting snapshot of what is emerging within the repository, and across the serverless landscape, when it comes to streaming. While the majority of it has an AWS service focus, you do see other products like Log4J and Splunk showing up. The AWS-focused solutions also begin to paint a picture of what types of streaming approaches developers are interested in, and which AWS systems are being used to publish and generate streams.
We’ve set up a script to help us monitor any new streaming serverless functions. We’ll also be profiling other serverless applications in future stories, trying to paint a picture of what serverless means to the event-driven evolution going on across the API sector. The serverless movement is just getting going, but with it already baked into the AWS, Google, and Azure clouds, it will undoubtedly continue to make a mark on how we deliver API infrastructure. The question is, what does it mean for streaming, event-driven, and other real-time aspects of doing business with APIs?