This web analytics demo shows how to collect web logs with API Gateway and store them in S3 through Amazon Kinesis, then analyze those logs with Amazon Athena.
Updated May 2, 2024 - Python
Utility code for AWS services. A sample use case is querying data from Kinesis Data Streams (KDS). Contributions are welcome.
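A minimal sketch of that KDS query use case with boto3; the stream name is a hypothetical placeholder, and the records are assumed to carry UTF-8 JSON payloads:

```python
import json

def decode_records(records):
    """Decode Kinesis record payloads, assuming UTF-8 JSON data."""
    return [json.loads(r["Data"]) for r in records]

def read_latest(stream_name: str, limit: int = 10):
    """Fetch the newest records from the first shard of a Kinesis Data Stream."""
    import boto3  # imported lazily so decode_records works without AWS installed
    kinesis = boto3.client("kinesis")
    shard_id = kinesis.describe_stream(StreamName=stream_name)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="LATEST",  # only records arriving after this call
    )["ShardIterator"]
    resp = kinesis.get_records(ShardIterator=iterator, Limit=limit)
    return decode_records(resp["Records"])

# Usage (requires AWS credentials and an existing stream):
# read_latest("my-demo-stream")  # hypothetical stream name
```

Reading `LATEST` skips everything already in the shard; switching the iterator type to `TRIM_HORIZON` would replay the retained history instead.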
A streaming data pipeline built on AWS managed services and a serverless architecture
An end-to-end AWS data engineering pipeline that orchestrates data ingestion, processing, and storage using Kinesis, S3, Redshift, Lambda, DynamoDB
Scala-based wrapper for the Kinesis Consumer Library that exposes the stream as an Akka Streams Source
Postgresql To Kinesis For Java
A near-real-time data processing pipeline for Pinterest posts.
Embracing Modern Data Solutions: Transitioning from Apache Flink to Redis and AWS Lambda with SQS and Kinesis
Projects and coding completed for Data Engineering immersive program at AiCore.
Simple stock analysis app that sends an email when a certain threshold is reached
Kafka Connect Docker image containing AWS Kinesis and S3 plugins
A unified data pipeline that aggregates, refines, and analyses vast datasets, delivering real-time business intelligence and supporting data-driven decisions.
This repository contains an end-to-end real-time 🕰️ machine learning pipeline to predict the star ⭐️ rating of product reviews. The project uses AWS SageMaker, Kinesis, Lambda, S3, Redshift, Athena, and Step Functions. Deployment of multiple models for A/B testing and bandit testing is also included.
Easy Amazon Kinesis load testing with Locust - a modern load testing framework
A lightweight system to automatically scale Kinesis Data Streams up and down based on throughput.
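The scaling decision such a system makes can be sketched as pure logic. The per-shard write quota (1 MB/s) is a documented Kinesis limit; the utilization thresholds here are illustrative assumptions, not the tool's actual values:

```python
SHARD_WRITE_LIMIT_MBPS = 1.0   # documented per-shard write quota for Kinesis Data Streams
SCALE_UP_UTILIZATION = 0.8     # assumed threshold: scale up above 80% utilization
SCALE_DOWN_UTILIZATION = 0.3   # assumed threshold: scale down below 30%

def target_shard_count(observed_mbps: float, current_shards: int) -> int:
    """Return the shard count a simple throughput-based autoscaler would resize to."""
    utilization = observed_mbps / (current_shards * SHARD_WRITE_LIMIT_MBPS)
    if utilization > SCALE_UP_UTILIZATION:
        return current_shards * 2           # UpdateShardCount allows at most 2x per call
    if utilization < SCALE_DOWN_UTILIZATION and current_shards > 1:
        return max(1, current_shards // 2)  # halve, but never drop below one shard
    return current_shards
```

The doubling/halving steps mirror the Kinesis `UpdateShardCount` constraint that a single call cannot scale beyond twice or below half the current shard count.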
Streaming data analysis using AWS tools: Cloud9 generates events in the cloud, and boto3 sends records to Kinesis Data Firehose, which delivers them to a destination S3 bucket as .parquet files. With Glue, a data catalog is created so that all records can be queried in near-real time with Athena.
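The boto3 sending step described above might look like this sketch; the delivery stream name and event fields are hypothetical, and Firehose handles the batching and delivery to S3:

```python
import json
import time

def make_event(user_id: str) -> bytes:
    """Build a newline-delimited JSON record, the framing Firehose and Athena expect."""
    event = {"user_id": user_id, "ts": int(time.time())}
    return (json.dumps(event) + "\n").encode("utf-8")

def send_event(user_id: str, delivery_stream: str = "web-logs-firehose") -> str:
    """Send one record to Kinesis Data Firehose; returns the service-assigned record ID."""
    import boto3  # imported here so make_event stays usable without AWS installed
    firehose = boto3.client("firehose")
    resp = firehose.put_record(
        DeliveryStreamName=delivery_stream,  # hypothetical delivery stream name
        Record={"Data": make_event(user_id)},
    )
    return resp["RecordId"]
```

Appending a trailing newline to each record keeps the objects Firehose writes to S3 line-delimited, which is what Glue crawlers and Athena expect for JSON input.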
Code companion to AWS Compute Series on building a serverless backend for a streaming application. Questions? Contact @jbesw.