Episode 1: Real-time data processing core concepts and create your first data stream

July 1, 2020
In this first episode, we will cover the core concepts of real-time data processing. We will create an Amazon Kinesis data stream and populate it by running a producer client program with sample data on an EC2 instance (a minimal producer sketch follows the details below).

What you learn in Episode 1:
· Real-time data stream core concepts
· How to create a new data stream
· How to ingest data into Kinesis Data Streams
· Streaming data storage with Kinesis and Kafka

Who should attend: Developers, DBAs, IT Pros, Architects, Data Engineers, Data Analysts

Prerequisites: Your own non-production AWS account with administrator access.

Speakers: Masudur Rahaman Sayem, Specialist SA - Data & Analytics
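
The episode walks through these steps in the AWS console and with a producer client on EC2. As a rough orientation only, the sketch below shows the same two steps (create a stream, then put records) using the AWS SDK for Python (boto3). The stream name, region, and sample payload are illustrative assumptions, not the exact values used in the episode.

import json
import random
import time
from datetime import datetime, timezone

import boto3

STREAM_NAME = "my-first-stream"   # hypothetical name, not from the episode
REGION = "us-east-1"              # assumption: any region with Kinesis Data Streams works

kinesis = boto3.client("kinesis", region_name=REGION)

# Step 1: create the stream with a single shard and wait until it becomes ACTIVE.
kinesis.create_stream(StreamName=STREAM_NAME, ShardCount=1)
kinesis.get_waiter("stream_exists").wait(StreamName=STREAM_NAME)

# Step 2: act as a simple producer and push a few JSON records onto the stream.
for _ in range(10):
    record = {
        "sensor_id": f"sensor-{random.randint(1, 5)}",
        "temperature": round(random.uniform(18.0, 30.0), 2),
        "event_time": datetime.now(timezone.utc).isoformat(),
    }
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["sensor_id"],  # records with the same key land on the same shard
    )
    time.sleep(0.2)

print(f"Sent 10 sample records to {STREAM_NAME}")

Running this requires credentials with Kinesis permissions (for example, on an EC2 instance with an appropriate IAM role); the partition key controls how records are distributed across shards.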