Docker and Kubernetes on AWS Training Course

Duration

21 hours (usually 3 days including breaks)

Requirements

  • An understanding of Docker and Kubernetes basics.
  • Experience with the Linux command line.
  • An Amazon AWS account with at least 10 USD of credit.

Audience

  • Developers
  • System Administrators
  • DevOps Engineers

Overview

There are a number of options for deploying Docker and Kubernetes on AWS, including Amazon Elastic Container Service (ECS), Amazon Elastic Kubernetes Service (EKS), AWS Fargate, and Amazon EC2.

This instructor-led, live training (online or onsite) is aimed at engineers who wish to evaluate each of these services to make an informed decision about which to deploy within their organization.

By the end of this training, participants will be able to:

  • Set up a development environment that includes all needed tools to start developing.
  • Set up, configure and deploy a series of demo containers using a number of different approaches.
  • Understand the architectural and design differences among different Docker/Kubernetes solutions within AWS.
  • Automate the deployment of Docker containers.
  • Set up a continuous integration and deployment pipeline.
  • Integrate Docker and Kubernetes into an existing continuous integration system.

Format of the Course

  • Interactive lecture and discussion.
  • Lots of exercises and practice.
  • Hands-on implementation in a live-lab environment.

Course Customization Options

  • To request a customized training for this course, please contact us to arrange it.

Course Outline

Introduction

Overview of Docker and Kubernetes on AWS

Preparing the Development Environment

Using Amazon Elastic Container Service

  • Creating a Containerized Application
  • Deploying the Application

Using Amazon Elastic Kubernetes Service (EKS)

  • Creating a Containerized Application
  • Deploying the Application

AWS Fargate

  • Creating a Containerized Application
  • Deploying the Application

Amazon EC2

  • Creating a Containerized Application
  • Deploying the Application

Setting up a Continuous Integration Pipeline

Integrating Docker and Kubernetes with an existing Continuous Integration System

Troubleshooting

Summary and Conclusion

Kubernetes on AWS Training Course

Duration

14 hours (usually 2 days including breaks)

Requirements

  • An understanding of containers and Kubernetes basics
  • Experience with the Linux command line
  • An Amazon AWS account with at least 10 USD of credit.

Overview

Amazon Elastic Kubernetes Service (EKS) is a managed Kubernetes-as-a-service offering from AWS. EKS is fully scalable and customizable, and allows a Kubernetes deployment to mimic and/or integrate with an existing on-premises Kubernetes setup.

In this instructor-led, live training, participants will learn how to set up and manage a production-scale container environment using Kubernetes on EKS. 

By the end of this training, participants will be able to:

  • Configure and manage Kubernetes on EKS
  • Migrate an existing Kubernetes environment from on-premises to the AWS cloud
  • Integrate Kubernetes with third-party continuous integration (CI) software
  • Ensure high availability and disaster recovery in Kubernetes
  • Understand and adopt the tools available to efficiently manage EKS
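
As a small illustration of the tooling involved, the sketch below uses boto3 to list EKS clusters. The parsing helper is pure, so it can be tried without an AWS account; the guarded section assumes configured AWS credentials and is illustrative, not part of the course materials:

```python
def cluster_names(list_clusters_response):
    """Extract cluster names, sorted, from an EKS `list_clusters` response.

    The response shape is a dict with a "clusters" key holding a list
    of cluster name strings.
    """
    return sorted(list_clusters_response.get("clusters", []))


if __name__ == "__main__":
    # Requires AWS credentials to be configured (e.g. via `aws configure`).
    import boto3

    eks = boto3.client("eks")
    print(cluster_names(eks.list_clusters()))
```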

Audience

  • Developers
  • System Administrators
  • DevOps Engineers

Format of the Course

  • Part lecture, part discussion, exercises and heavy hands-on practice in a live-lab environment.

Note

  • To request a customized training for this course, please contact us to arrange it.

Course Outline

Introduction

Overview of Docker Containers and Kubernetes in AWS

Overview of AWS Container Management Offerings and Architecture

Getting Started with Kubernetes on EKS

Building a Kubernetes Cluster on EKS

Networking Kubernetes Pods

Migrating from On-Premises to AWS

Integrating Kubernetes with Continuous Integration (CI)

Ensuring High Availability and Disaster Recovery in Kubernetes

Using Fargate to Manage EKS

Troubleshooting

Summary and Conclusion

Kubeflow on AWS Training Course

Duration

28 hours (usually 4 days including breaks)

Requirements

  • An understanding of machine learning concepts.
  • Knowledge of cloud computing concepts.
  • A general understanding of containers (Docker) and orchestration (Kubernetes).
  • Some Python programming experience is helpful.
  • Experience working with a command line.

Audience

  • Data science engineers.
  • DevOps engineers interested in machine learning model deployment.
  • Infrastructure engineers interested in machine learning model deployment.
  • Software engineers wishing to integrate and deploy machine learning features with their application.

Overview

Kubeflow is a framework for running Machine Learning workloads on Kubernetes. TensorFlow is a machine learning library and Kubernetes is an orchestration platform for managing containerized applications.

This instructor-led, live training (online or onsite) is aimed at engineers who wish to deploy Machine Learning workloads to an AWS EC2 server.

By the end of this training, participants will be able to:

  • Install and configure Kubernetes, Kubeflow and other needed software on AWS.
  • Use EKS (Elastic Kubernetes Service) to simplify the work of initializing a Kubernetes cluster on AWS.
  • Create and deploy a Kubernetes pipeline for automating and managing ML models in production.
  • Train and deploy TensorFlow ML models across multiple GPUs and machines running in parallel.
  • Leverage other AWS managed services to extend an ML application.
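
To give a flavor of how multi-worker TensorFlow training is described to Kubeflow, here is a hedged sketch that builds a minimal TFJob manifest as a Python dict. The name, image, and replica counts are placeholders; a real job would also configure data volumes, commands, and arguments:

```python
def tfjob_manifest(name, image, workers=2, gpus_per_worker=1):
    """Build a minimal Kubeflow TFJob manifest as a plain dict.

    TFJob is the custom resource consumed by Kubeflow's training
    operator; all values here are illustrative placeholders.
    """
    return {
        "apiVersion": "kubeflow.org/v1",
        "kind": "TFJob",
        "metadata": {"name": name},
        "spec": {
            "tfReplicaSpecs": {
                "Worker": {
                    "replicas": workers,
                    "template": {
                        "spec": {
                            "containers": [{
                                "name": "tensorflow",
                                "image": image,
                                # Request GPUs through the standard
                                # Kubernetes device-plugin resource name.
                                "resources": {
                                    "limits": {"nvidia.com/gpu": gpus_per_worker}
                                },
                            }]
                        }
                    },
                }
            }
        },
    }
```

A manifest like this would typically be serialized to YAML and applied to the cluster with kubectl or a Kubernetes client library.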

Format of the Course

  • Interactive lecture and discussion.
  • Lots of exercises and practice.
  • Hands-on implementation in a live-lab environment.

Course Customization Options

  • To request a customized training for this course, please contact us to arrange it.

Course Outline

Introduction

  • Kubeflow on AWS vs. on-premises vs. other public cloud providers

Overview of Kubeflow Features and Architecture

Activating an AWS Account

Preparing and Launching GPU-enabled AWS Instances

Setting up User Roles and Permissions

Preparing the Build Environment

Selecting a TensorFlow Model and Dataset

Packaging Code and Frameworks into a Docker Image

Setting up a Kubernetes Cluster Using EKS

Staging the Training and Validation Data

Configuring Kubeflow Pipelines

Launching a Training Job using Kubeflow in EKS

Visualizing the Training Job in Runtime

Cleaning up After the Job Completes

Troubleshooting

Summary and Conclusion

Getting Started with AWS Lambda Functions using Python

Learn how to get started with AWS Lambda Functions using Python

Requirements

  • Ability to program using Python, preferably Python 3.6 or later
  • Computer with Internet connection
  • Valid AWS Account to take the course with Hands on practice

Description

As part of this free course, you will learn how to get started with AWS Lambda functions using the Python runtime. AWS Lambda is one of the most popular fully managed AWS services, supporting several runtimes. As the IT industry has adopted microservices architecture, serverless functions have become a vital component in building large-scale, complex applications. Here is the high-level outline for the course.

  • Overview of AWS Lambda and Getting Started using Python 3
  • Passing Arguments to AWS Lambda and Processing using Python
  • Using Custom Handlers for AWS Lambda Functions in Python 3
  • Using AWS Services such as S3 in AWS Lambda Functions
  • Recap of handling permissions using AWS IAM Roles and User Groups
  • Develop AWS Lambda Function to list objects from AWS S3 Bucket
  • Passing Environment Variables to AWS Lambda Functions
  • Customizing Resources such as memory used for AWS Lambda Function
  • Set Up a Local Development Environment for AWS Lambda Functions
  • Develop logic for AWS Lambda Function using external packages
  • Build Zip file to deploy as AWS Lambda Function
  • Deploy Application with External Dependencies as AWS Lambda Function
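
To make the first outline items concrete, here is a minimal sketch of a Python 3 Lambda handler that reads an argument from the invocation event and a value from an environment variable. The GREETING variable and the event's name field are illustrative, not part of the course materials:

```python
import json
import os


def lambda_handler(event, context):
    """Minimal AWS Lambda handler for the Python runtime.

    `event` carries the arguments passed at invocation time;
    `context` provides runtime metadata (unused here).
    """
    # GREETING is a hypothetical environment variable that would be
    # set in the Lambda function's configuration.
    greeting = os.environ.get("GREETING", "Hello")
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"{greeting}, {name}!"}),
    }


# Lambda handlers are plain functions, so they can be exercised locally:
if __name__ == "__main__":
    print(lambda_handler({"name": "Lambda"}, None))
```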

Here is the detailed outline for the course.

  • First, you will start with prerequisites such as having a valid AWS account, since AWS Lambda functions must be deployed within an AWS account.
  • Once you have a valid AWS account, you will understand what AWS Lambda is and how to deploy your first application using the Python 3 runtime via the AWS Web Console.
  • We should be able to pass arguments to any application, including AWS Lambda functions. After deploying the first application as an AWS Lambda function, you will understand how to pass arguments at run time and process them as part of the application.
  • When we use the AWS Web Console to deploy an application as an AWS Lambda function using a blueprint, it uses the default handler. But as we start developing applications, we might end up with multiple Lambda functions in one deployed code base, which means we need to define custom handlers in modules with custom names. After going through the details related to arguments, you will understand how to configure AWS Lambda functions with custom handlers.
  • Quite often we interact with other AWS services from applications deployed as AWS Lambda functions. We will go through the details of interacting with AWS services, using Amazon S3 as an example.
  • After integrating the AWS Lambda function with Amazon S3, we will go through the details of AWS IAM Roles to understand how permissions are managed between different AWS services.
  • Once we have successfully integrated AWS Lambda with Amazon S3, we will update the application to list the objects in an S3 bucket.
  • Quite often we need to customize the run-time behavior of an AWS Lambda function, or any application, without changing the code. One way to achieve this is with environment variables. We will understand how to use environment variables with AWS Lambda functions.
  • When we invoke an AWS Lambda function, it is executed using AWS-managed resources. We will go through the details of reviewing resources such as memory, CPU, ephemeral storage, and timeout. We will also enhance the code so that it requires customized resources, and then validate whether the Lambda function runs as expected.
  • Even though we can use the editor provided by the AWS Web Console to develop code for Python-based AWS Lambda functions, it has its limitations. After exploring the basics in the AWS Web Console, we will go through the details of setting up a local environment for development.
  • Once we have a local environment for the development of AWS Lambda functions, we will develop a new AWS Lambda function that depends on third-party libraries such as requests.
  • We will then build the application as a zip file and deploy it as an AWS Lambda function. We will also validate whether the AWS Lambda function runs as expected.
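
The S3-listing step might look something like the following sketch. The bucket event field and helper name are hypothetical; the S3 client is passed in as a parameter so the listing logic can be exercised without live AWS credentials:

```python
def list_bucket_keys(s3_client, bucket):
    """Return the object keys in `bucket`, using any client that
    exposes the boto3 `list_objects_v2` interface."""
    response = s3_client.list_objects_v2(Bucket=bucket)
    # "Contents" is absent when the bucket is empty.
    return [obj["Key"] for obj in response.get("Contents", [])]


def lambda_handler(event, context):
    """List the objects in the bucket named in the event.

    boto3 is imported lazily so this module also loads in
    environments without the AWS SDK installed; in the Lambda
    Python runtime boto3 is available by default.
    """
    import boto3

    bucket = event["bucket"]  # hypothetical event field
    return {"bucket": bucket,
            "keys": list_bucket_keys(boto3.client("s3"), bucket)}
```

In a real deployment, the function's IAM role would need `s3:ListBucket` permission on the bucket.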

By the end of the course, you will understand how to get started with AWS Lambda functions using the Python 3 runtime, for free. However, if you would like to learn how to use AWS Lambda functions for larger and more complex applications, feel free to sign up for our other courses on Udemy.

Who this course is for:

  • Python Developers who want to understand how to get started with AWS Lambda Functions
  • Data Engineers who want to understand what AWS Lambda is all about and get started with it using Python
  • Cloud Engineers who would like to get started with AWS Lambda Functions
  • Any other IT professionals or aspirants who want to learn how to get started with AWS Lambda Functions
  • CS or IT students and graduates who want to get an idea of AWS Lambda Functions

Course content

1 section • 16 lectures • 1h 18m total length

Using AWS S3 with Python

An introduction on how to work with S3 and Python

Requirements

  • Some Python knowledge
  • Computer and Internet Connection
  • AWS Account
  • IDE or Text Editor such as Atom

Description

This course is aimed at users who need to use S3 as part of their overall solution and want to quickly learn what S3 is, what it is for, and how to use it. Common users of this course include, but are not limited to, data analysts, business intelligence users, and data scientists.

This can also serve as an introductory course for engineers; however, the focus is not on security and access management or bucket policies.

If you are new to S3 and are curious about using Python to interact with S3, then this is the course for you.

This course aims to:

  • Introduce you to what S3 is
  • Familiarize you with the AWS Console so you can:
    • Set up groups, users and credentials
    • Interface with S3
  • Show you how to do the following in S3 with Python:
    • Create Buckets
    • Upload files
    • Download files
    • Delete files
    • Delete Bucket
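
A condensed sketch of those operations with boto3 might look like this. The bucket name is a placeholder (bucket names must be globally unique), and the guarded section assumes configured AWS credentials; note that outside us-east-1, `create_bucket` also needs a `CreateBucketConfiguration` with a `LocationConstraint`:

```python
def s3_uri(bucket, key):
    """Build the s3:// URI for an object (a pure helper)."""
    return f"s3://{bucket}/{key}"


def run_s3_lifecycle(s3, bucket, local_path="hello.txt"):
    """Walk the operations listed above using a boto3 S3 client."""
    s3.create_bucket(Bucket=bucket)                     # Create Bucket
    s3.upload_file(local_path, bucket, "hello.txt")     # Upload a file
    s3.download_file(bucket, "hello.txt", "copy.txt")   # Download it
    s3.delete_object(Bucket=bucket, Key="hello.txt")    # Delete the file
    s3.delete_bucket(Bucket=bucket)                     # Delete the bucket
    return s3_uri(bucket, "hello.txt")


if __name__ == "__main__":
    # Requires AWS credentials (e.g. via `aws configure`) and a local
    # hello.txt file to upload.
    import boto3

    run_s3_lifecycle(boto3.client("s3"), "my-demo-bucket-placeholder")
```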

After this course you should feel confident in using Python to manage your S3 content.

Note: In order to get the most out of this course, you will want to:

  • Have working knowledge of Python
  • Have a Python environment that you can use to work along with the examples
  • Have an IDE or good text editor (e.g. VS Code, PyCharm, Atom)

Lastly, this is a condensed course and my desire is that you can learn what you need over the weekend and feel confident enough to take your new skills to work the next day!

Who this course is for:

  • Beginner Python Developers
  • Data Analysts
  • Data Scientists
  • Business Intelligence Users

Course content

4 sections • 13 lectures • 1h 10m total length

Intro to Machine Learning in AWS for Beginners – New 2023!

Understand the difference between Artificial Intelligence (AI), Machine Learning (ML), Data Science (DS) and Deep Learning (DL)

Build, Train, and Test a Simple Machine Learning Model Using AWS SageMaker Canvas

Leverage a Yolo V3 Object Detection Algorithm available on the AWS Marketplace

Navigate Through the AWS Management Console

Write Your First Code in SageMaker Studio

Requirements

  • Basic knowledge in AWS
  • Basic knowledge in machine learning

Description

Machine Learning is the future, and one of the top tech fields to be in right now!

ML and AI will change our lives in the same way electricity did 100 years ago. ML is widely adopted in finance, banking, healthcare, transportation, and technology.

The field is exploding with opportunities and career prospects.

This introductory course is for absolute beginners. Students will learn:

  1. Key AWS services such as Simple Storage Service (S3), Elastic Compute Cloud (EC2), Identity and Access Management (IAM), and CloudWatch
  2. The benefits of cloud computing and what’s included in the AWS Free Tier Package
  3. How to set up a brand-new account in AWS and navigate the AWS Management Console
  4. The fundamentals of Machine Learning and understand the difference between Artificial Intelligence (AI), Machine Learning (ML), Data Science (DS) and Deep Learning (DL)
  5. List the key components to build any machine learning models including data, model, and compute
  6. The fundamentals of Amazon SageMaker, SageMaker components, and the training options offered by SageMaker, including built-in algorithms, the AWS Marketplace, and customized ML algorithms
  7. AWS SageMaker Studio and the differences between AWS SageMaker JumpStart, SageMaker Autopilot, and SageMaker Data Wrangler
  8. How to write your first code in the cloud using Jupyter Notebooks
  9. How to use AWS Marketplace object detection algorithms such as Yolo V3, through a hands-on tutorial
  10. How to train your first machine learning model using the brand-new AWS SageMaker Canvas without writing any code!

Who this course is for:

  • Absolute Beginners who want to break into machine learning in AWS
  • Beginner Data Scientists wanting to advance their careers by learning AWS and Machine Learning
  • Tech enthusiasts who are passionate and new to Machine Learning and want to gain basic knowledge in AWS & Machine Learning

Course content