Episode 1: Real-time data processing core concepts and create your first data stream

In this first episode, we will cover the core concepts of real-time data processing. We will create an Amazon Kinesis data stream and populate it with sample data by running a producer client program on an EC2 instance.
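If you want to try the stream-creation step before the session, the sketch below shows one way to do it with the AWS SDK for Python (boto3). The stream name, shard count, and region are illustrative placeholders, not values used in the episode; it assumes boto3 is installed and AWS credentials are configured.

```python
import boto3

# Placeholder region and stream name for illustration only.
kinesis = boto3.client("kinesis", region_name="us-east-1")

# Create a provisioned stream with a single shard, which is enough for sample data.
kinesis.create_stream(StreamName="my-first-stream", ShardCount=1)

# Block until the stream transitions to ACTIVE before sending any records.
kinesis.get_waiter("stream_exists").wait(StreamName="my-first-stream")
print("Stream is active and ready for producers.")
```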

What you learn in Episode 1:

  • Real-time data stream core concepts
  • How to create a new data stream
  • How to ingest data into a Kinesis data stream (see the producer sketch after this list)
  • Streaming data storage with Kinesis and Kafka
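The following is a minimal producer sketch for the ingestion topic listed above, assuming boto3 and the hypothetical "my-first-stream" stream from the previous example; the JSON payload fields are made-up sample data, not the dataset used in the episode.

```python
import json
import random
import time

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

for _ in range(10):
    # Fabricate a small JSON event as stand-in sample data.
    record = {
        "sensor_id": random.randint(1, 5),
        "reading": round(random.uniform(20.0, 30.0), 2),
    }
    kinesis.put_record(
        StreamName="my-first-stream",
        Data=json.dumps(record).encode("utf-8"),
        # The partition key determines which shard receives the record.
        PartitionKey=str(record["sensor_id"]),
    )
    time.sleep(0.5)  # Pace the producer so records are easy to follow in the console.
```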

Who should attend:
Developers, DBAs, IT Pros, Architects, Data Engineers, Data Analysts

Speaker:
Masudur Rahaman Sayem, Specialist SA - Data & Analytics
