Course Description
Learn how to deploy and manage containerized applications on Google Kubernetes Engine (GKE), and how to use other Google Cloud tools that interact with GKE deployments. This course features a combination of lectures, demos, and hands-on labs to help you explore and deploy solution elements, including infrastructure components such as pods, containers, deployments, and services, along with networks and application services. You'll also learn how to implement practical solutions, including security and access management, resource management, and resource monitoring.
Prerequisites
To get the most out of this course, participants should have:
- Completed Google Cloud Platform Fundamentals: Core Infrastructure or have equivalent experience
- Basic proficiency with command-line tools and Linux operating system environments
Content
Module 1: Introducing Google Cloud Platform
- Use the Google Cloud Platform Console
- Use Cloud Shell
- Define cloud computing
- Identify GCP's compute services
- Understand regions and zones
- Understand the cloud resource hierarchy
- Administer your GCP resources
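As an illustration of the Module 1 objectives, basic resource administration from Cloud Shell looks roughly like the sketch below; the project ID and zone are placeholders, not values from the course labs.

```bash
# Set the active project and a default zone (placeholder values)
gcloud config set project my-gcp-project
gcloud config set compute/zone us-central1-a

# Explore regions and zones
gcloud compute regions list
gcloud compute zones list

# Inspect the current project within the resource hierarchy
gcloud projects describe my-gcp-project
```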
Module 2: Containers and Kubernetes in GCP
- Create a container image using Cloud Build
- Store a container image in Container Registry
- Understand the relationship between Kubernetes and Google Kubernetes Engine (GKE)
- Understand how to choose among GCP compute platforms
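A minimal sketch of the Module 2 build-and-store workflow, assuming a Dockerfile in the current directory; the project and image names are placeholders.

```bash
# Build the container image with Cloud Build and push it to Container Registry
gcloud builds submit --tag gcr.io/my-gcp-project/hello-app:v1 .

# Confirm the image is stored in the project's registry
gcloud container images list --repository=gcr.io/my-gcp-project
```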
Module 3: Kubernetes Architecture
- Understand the architecture of Kubernetes: pods, namespaces
- Understand the control-plane components of Kubernetes
- Create container images using Google Cloud Build
- Store container images in Google Container Registry
- Create a Kubernetes Engine cluster
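For example, creating a Kubernetes Engine cluster as covered at the end of Module 3 might look like the following; the cluster name, zone, and node count are placeholders.

```bash
# Create a small GKE cluster (placeholder name, zone, and node count)
gcloud container clusters create demo-cluster \
  --zone us-central1-a \
  --num-nodes 3

# Fetch credentials so kubectl can reach the new cluster
gcloud container clusters get-credentials demo-cluster --zone us-central1-a
```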
Module 4: Kubernetes Operations
- Work with the kubectl command
- Inspect the cluster and Pods
- View a Pod's console output
- Sign in to a Pod interactively
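The Module 4 operations map onto kubectl commands along these lines; the Pod name is a placeholder.

```bash
# Inspect the cluster and its Pods
kubectl get nodes
kubectl get pods --all-namespaces
kubectl describe pod my-pod

# View a Pod's console output
kubectl logs my-pod

# Open an interactive shell inside a Pod
kubectl exec -it my-pod -- /bin/sh
```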
Module 5: Deployments, Jobs, and Scaling
- Create and use Deployments
- Create and run Jobs and CronJobs
- Scale clusters manually and automatically
- Configure Node and Pod affinity
- Get software into your cluster with Helm charts and Kubernetes Marketplace
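A sketch of the Module 5 workload objects: a Deployment with three replicas, scaled manually and then automatically. The image and object names are placeholders.

```bash
# Declare a Deployment that runs three replicas of a containerized app
kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: hello-app
  template:
    metadata:
      labels:
        app: hello-app
    spec:
      containers:
      - name: hello-app
        image: gcr.io/my-gcp-project/hello-app:v1
EOF

# Scale manually, or attach a Horizontal Pod Autoscaler
kubectl scale deployment hello-app --replicas=5
kubectl autoscale deployment hello-app --min=3 --max=10 --cpu-percent=80
```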
Module 6: GKE Networking
- Create Services to expose applications that are running within Pods
- Use load balancers to expose Services to external clients
- Create Ingress resources for HTTP(S) load balancing
- Leverage container-native load balancing to improve Pod load balancing
- Define Kubernetes network policies to allow and block traffic to Pods
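To illustrate the Module 6 Service objectives, the sketch below exposes the Pods of a Deployment to external clients through a Service of type LoadBalancer; the names and ports are placeholders.

```bash
# Expose matching Pods to external clients through a load-balanced Service
kubectl apply -f - <<EOF
apiVersion: v1
kind: Service
metadata:
  name: hello-app
spec:
  type: LoadBalancer
  selector:
    app: hello-app
  ports:
  - port: 80
    targetPort: 8080
EOF

# Wait for the external IP assigned by the load balancer
kubectl get service hello-app --watch
```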
Module 7: Persistent Data and Storage
- Use Secrets to isolate security credentials
- Use ConfigMaps to isolate configuration artifacts
- Push out and roll back updates to Secrets and ConfigMaps
- Configure Persistent Storage Volumes for Kubernetes Pods
- Use StatefulSets to ensure that claims on persistent storage volumes persist across restarts
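A minimal sketch of the Module 7 configuration and storage objects; the keys, values, and storage size are placeholders.

```bash
# Isolate configuration and credentials from application code
kubectl create configmap app-config --from-literal=LOG_LEVEL=info
kubectl create secret generic app-credentials --from-literal=API_KEY=changeme

# Request persistent storage through a PersistentVolumeClaim
kubectl apply -f - <<EOF
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes:
  - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
EOF
```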
Module 8: Access Control and Security in Kubernetes and Kubernetes Engine
- Understand Kubernetes authentication and authorization
- Define Kubernetes RBAC roles and role bindings for accessing resources in namespaces
- Define Kubernetes RBAC cluster roles and cluster role bindings for accessing cluster-scoped resources
- Define Kubernetes pod security policies
- Understand the structure of GCP IAM
- Define IAM roles and policies for Kubernetes Engine cluster administration
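The Kubernetes RBAC objectives in Module 8 translate into manifests like the following, which grant read-only access to Pods in a single namespace; the namespace, role, and user names are placeholders.

```bash
# A namespaced Role and a RoleBinding that grants it to a user
kubectl apply -f - <<EOF
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-reader
  namespace: dev
rules:
- apiGroups: [""]
  resources: ["pods"]
  verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: read-pods
  namespace: dev
subjects:
- kind: User
  name: developer@example.com
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
EOF
```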
Module 9: Logging and Monitoring
- Use Stackdriver to monitor and manage availability and performance
- Locate and inspect Kubernetes logs
- Create probes for health checks on live applications
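As an example of the probe objective in Module 9, a Pod spec can declare liveness and readiness probes like these; the image, paths, and ports are placeholders.

```bash
# A Pod with HTTP liveness and readiness probes
kubectl apply -f - <<EOF
apiVersion: v1
kind: Pod
metadata:
  name: probed-app
spec:
  containers:
  - name: app
    image: gcr.io/my-gcp-project/hello-app:v1
    livenessProbe:
      httpGet:
        path: /healthz
        port: 8080
      initialDelaySeconds: 5
      periodSeconds: 10
    readinessProbe:
      httpGet:
        path: /ready
        port: 8080
EOF

# Locate and inspect the Pod's logs
kubectl logs probed-app
```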
Module 10: Using GCP Managed Storage Services from Kubernetes Applications
- Understand the pros and cons of using a managed storage service versus self-managed containerized storage
- Enable applications running in GKE to access GCP storage services
- Understand use cases for Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Bigtable, Cloud Firestore, and BigQuery from within a Kubernetes application
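One common pattern for the Module 10 objective of letting GKE workloads reach managed services is to store a GCP service account key as a Kubernetes Secret; this is only one possible approach, not necessarily the one used in the labs, and the account and file names are placeholders.

```bash
# Create a key for a GCP service account that has access to the storage service
gcloud iam service-accounts keys create key.json \
  --iam-account=app-sa@my-gcp-project.iam.gserviceaccount.com

# Store the key as a Kubernetes Secret; a Pod can mount it and point
# GOOGLE_APPLICATION_CREDENTIALS at the mounted file
kubectl create secret generic gcp-sa-key --from-file=key.json=key.json
```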
Module 11: Using CI/CD with Google Kubernetes Engine
- CI/CD overview
- CI/CD for Google Kubernetes Engine
- CI/CD examples
- Manage application code in a source repository where code changes can trigger a continuous delivery pipeline
- Create a continuous delivery pipeline using Cloud Build and start it manually or automatically with a code change
- Implement a canary deployment that hosts two versions of your application in production for release testing
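A minimal sketch of a Cloud Build pipeline for the Module 11 objectives, assuming the repository contains an application Dockerfile and a kubernetes/ directory of manifests; the cluster, zone, and image names are placeholders.

```bash
# Write a Cloud Build config that builds, pushes, and deploys the app
cat > cloudbuild.yaml <<'EOF'
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/hello-app:latest', '.']
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/hello-app:latest']
- name: 'gcr.io/cloud-builders/kubectl'
  args: ['apply', '-f', 'kubernetes/']
  env:
  - 'CLOUDSDK_COMPUTE_ZONE=us-central1-a'
  - 'CLOUDSDK_CONTAINER_CLUSTER=demo-cluster'
EOF

# Start the pipeline manually; a Cloud Build trigger can start it on each push
gcloud builds submit --config cloudbuild.yaml .
```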