Episode 1: Real-time data processing core concepts and create your first data stream

In this first episode, we will cover the core concepts of real-time data processing. We will create an Amazon Kinesis data stream and populate it with sample data by running a producer client program on an EC2 instance.
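As a rough illustration of the stream-creation step, here is a minimal sketch using the AWS SDK for Python (boto3). The stream name my-first-stream, the region, and the single-shard capacity are assumptions for the example, not values from the episode.

```python
import boto3

# Assumed example values -- not taken from the episode itself.
STREAM_NAME = "my-first-stream"
SHARD_COUNT = 1

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Create the data stream with provisioned capacity (one shard).
kinesis.create_stream(StreamName=STREAM_NAME, ShardCount=SHARD_COUNT)

# Block until the stream transitions from CREATING to ACTIVE.
kinesis.get_waiter("stream_exists").wait(StreamName=STREAM_NAME)

print(f"Stream {STREAM_NAME} is active and ready for producers.")
```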

What you learn in Episode 1:

  • Real-time data streaming core concepts
  • How to create a new data stream
  • How to ingest data into Kinesis Data Streams (a simplified producer sketch follows this list)
  • Streaming data storage with Kinesis and Kafka
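
The episode's ingestion step runs a producer client on an EC2 instance; the snippet below is a simplified stand-in for that producer, assuming the same hypothetical my-first-stream and synthetic JSON sensor readings as the sample data.

```python
import json
import random
import time

import boto3

STREAM_NAME = "my-first-stream"  # hypothetical name, matching the sketch above

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Push one small JSON record onto the stream every second.
# The partition key determines which shard each record lands on.
while True:
    reading = {
        "sensor_id": random.randint(1, 10),
        "temperature": round(random.uniform(15.0, 35.0), 2),
        "timestamp": int(time.time()),
    }
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(reading).encode("utf-8"),
        PartitionKey=str(reading["sensor_id"]),
    )
    time.sleep(1)
```

On an EC2 instance, credentials would typically come from an IAM instance profile rather than keys embedded in the producer code.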

Who should attend:
Developers, DBAs, IT Pros, Architects, Data Engineers, Data Analysts

Speakers:
Masudur Rahaman Sayem, Specialist SA - Data & Analytics
