
[aws-eks] Patterns Module

See original GitHub issue

Similar to ECS Patterns, I am interested in contributing EKS Patterns to make deploying and configuring EKS clusters simple. This includes deploying K8s native features such as Cluster Autoscaler and the Kubernetes Dashboard. The goal is to offer a similar experience to eksctl.

I propose the addition of a new module called eks-patterns that defines common patterns. Initially, I recommend we start with the modules that can be extracted from the AWS Documentation.
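
To make this concrete, here is a purely hypothetical sketch of what a pattern construct could look like from the consumer side (the eks-patterns module name, the ClusterAutoscalerPattern construct, and its props are illustrative, not an existing API):

        // Hypothetical usage sketch only; nothing here exists in the CDK today.
        import * as eks from '@aws-cdk/aws-eks';
        import { ClusterAutoscalerPattern } from '@aws-cdk/eks-patterns'; // proposed module

        const cluster = new eks.Cluster(this, 'Cluster', {
            version: eks.KubernetesVersion.V1_18,
        });

        // One construct would wire up the service account, IAM policy, and
        // Kubernetes manifests needed for the Cluster Autoscaler.
        new ClusterAutoscalerPattern(this, 'Autoscaler', { cluster });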

Use Case

eksctl is a powerful tool for provisioning and managing EKS clusters; however, it abstracts away much of the complexity, which makes it difficult for administrators to audit or adjust the code. By using the CDK, we can create reproducible and auditable CloudFormation stacks that can be version controlled, for customers looking for more fine-grained visibility. Long term, I am looking to develop “compliant” abstractions for compliance frameworks such as FedRAMP, PCI, etc.

Proposed Solution

I have implemented two references here:

Other

N/A

  • 👋 I may be able to implement this feature request
  • ⚠️ This feature might incur a breaking change

This is a 🚀 Feature Request

Issue Analytics

  • State: open
  • Created 4 years ago
  • Reactions: 6
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

6 reactions
dodtsair commented, Oct 1, 2020

Some of the basic steps I went through to get the ALB working:
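
For context, the snippets below assume roughly the following imports (CDK v1-style module layout, plus js-yaml for parsing the manifests); adjust to whatever your project actually uses:

        // Assumed imports for the snippets below (CDK v1 module layout).
        import { Fn, CfnJson } from '@aws-cdk/core';
        import * as eks from '@aws-cdk/aws-eks';
        import * as iam from '@aws-cdk/aws-iam';
        import { readFileSync } from 'fs';
        import * as yaml from 'js-yaml'; // safeLoadAll is available in js-yaml 3.x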

Start with a cluster

const cluster = new eks.FargateCluster(this, 'cluster', {...})

We’ll need the clusterId, which isn’t directly exposed. Extract it from the OIDC issuer URL:

        //Example URL https://oidc.eks.us-west-2.amazonaws.com/id/B01EF2EC7AC85DCEED81633BDA4ED90A
        const clusterId = Fn.select(4, Fn.split('/', cluster.clusterOpenIdConnectIssuerUrl));

For the controller running in EKS to create the ALB resources in AWS, it needs to assume an IAM role via the cluster’s OIDC integration. Define the federated principal for that role (this follows the pattern documented by eksctl):

        const federatedPrincipal = new iam.FederatedPrincipal(
            cluster.openIdConnectProvider.openIdConnectProviderArn,
            {
                StringEquals: new CfnJson(this, "FederatedPrincipalCondition", {
                    value: {
                        [`oidc.eks.${vpc.env.region}.amazonaws.com/id/${clusterId}:aud`]: "sts.amazonaws.com",
                        // serviceAccountName must match the ServiceAccount defined in rbac-role.yaml
                        [`oidc.eks.${vpc.env.region}.amazonaws.com/id/${clusterId}:sub`]: `system:serviceaccount:kube-system:${serviceAccountName}`
                    }
                })
            }, "sts:AssumeRoleWithWebIdentity");

A few notes

  1. serviceAccountName should match the service account in the Kubernetes cluster; it is defined in rbac-role.yaml as alb-ingress-controller (see the one-liner after this list).
  2. CfnJson is needed because we have tokens in the keys of the JavaScript object.
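
Since serviceAccountName is referenced in the snippet above but never defined there, a minimal definition matching note 1 would simply be:

        // Must match the ServiceAccount name defined in rbac-role.yaml
        const serviceAccountName = 'alb-ingress-controller';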

Now associate the principal to an IAM Role:

        const iamRole = new iam.Role(this, 'iam-role', {
            assumedBy: federatedPrincipal
        });

At this point the role has no permissions, so it cannot do anything. The alb-ingress-controller repo has an example file that documents all the permissions Kubernetes will need in order to create the ALB; rather than reproducing that file, we’ll pull it in via require.

Include the alb-ingress-controller repo as a dev dependency to gain access to its documentation folder, which contains the standard YAML files for Kubernetes:

  "devDependencies": {
     "aws-alb-ingress-controller": "git://github.com/kubernetes-sigs/aws-alb-ingress-controller.git#v1.1.8",
     ...
   }

Now pull in the iam-policy.json and use it to create a new IAM Managed Policy

        const policyDocument = iam.PolicyDocument.fromJson(require('aws-alb-ingress-controller/docs/examples/iam-policy.json'))
        const managedPolicy = new iam.ManagedPolicy(this, 'managed-policy', {
            document: policyDocument
        })

Now associate the managed policy with the IAM role so that Kubernetes has the permissions needed to create the ALB:

iamRole.addManagedPolicy(managedPolicy)

We are going to start loading Kubernetes manifests into the EKS cluster, but some of them require slight modification. Again, we could copy the sample code into our repo, or we could make the one change after we load it; I’ll do the latter. This is the function I’ll use to load a file and make the changes needed:

        // Load a YAML file from node_modules and optionally transform the parsed manifests.
        let load = function (path, filter) {
            const absolutePath = require.resolve(path);
            const yamlText = readFileSync(absolutePath, 'utf8');
            const configs = yaml.safeLoadAll(yamlText);
            return filter ? filter(configs) : configs;
        };

First up is alb-ingress-controller.yaml, which mostly works as-is. In my case I was using a Fargate cluster, which is more limited, so I need to provide the cluster name, VPC ID, and region when creating the ingress controller in Kubernetes.

I know the configuration file contains only one config, and that there is exactly one entry in spec.template.spec.containers. However, I still use forEach and map because I do not like assuming there is only one entry and indexing with [0]. The key point here is that we need to append to container.args to pass in the additional parameters; everything else is the same.

        let ingressController = load('aws-alb-ingress-controller/docs/examples/alb-ingress-controller.yaml', function (configs) {
            configs.forEach((config) => {
                config.spec.template.spec.containers = config.spec.template.spec.containers.map((container) => {
                    return Object.assign({}, container, {
                        args: [
                            '--cluster-name=' + cluster.clusterName,
                            '--aws-vpc-id=' + vpc.vpcId,
                            '--aws-region=' + vpc.env.region,
                            ...container.args]
                    })
                })
            })
            return configs;
        });

Next we use the sample’s rbac-role.yaml. This will create the service account, role, and role binding in Kubernetes. Most of this works fine, except that the service account needs to be created with the ARN of the role we created above. The file actually contains several manifests; we want the one with kind ServiceAccount, and we then update its annotations with the ARN of the role.

        let rbac = load('aws-alb-ingress-controller/docs/examples/rbac-role.yaml', (configs) => {
            const serviceAccount = configs.find(config => config.kind === 'ServiceAccount')
            serviceAccount.metadata.annotations = Object.assign({}, serviceAccount.metadata.annotations, {
                'eks.amazonaws.com/role-arn': iamRole.roleArn
            })
            return configs;
        });

Then you dump the manifests into your cluster:

        new eks.KubernetesManifest(this, 'rbac-manifest', {
            cluster,
            manifest: rbac
        });
        new eks.KubernetesManifest(this, 'ingress-controller-manifest', {
            cluster,
            manifest: ingressController
        })
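
The controller deployment references the ServiceAccount from rbac-role.yaml, so if you want the ordering to be explicit you can capture the two manifest constructs in variables and add a dependency; a small variant of the block above (variable names are mine):

        // Optional: apply the RBAC/ServiceAccount manifest before the controller manifest.
        const rbacManifest = new eks.KubernetesManifest(this, 'rbac-manifest', {
            cluster,
            manifest: rbac
        });
        const controllerManifest = new eks.KubernetesManifest(this, 'ingress-controller-manifest', {
            cluster,
            manifest: ingressController
        });
        controllerManifest.node.addDependency(rbacManifest);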

Hopefully this will help someone set up EKS + Ingress in the CDK. If I have time later I’ll create some repos like @arhea.

1 reaction
iliapolo commented, Jan 26, 2021

@vsetka We are not actively working on this. What specific example are you looking for?

Read more comments on GitHub.

Top Results From Across the Web

  • aws-cdk/aws-eks module - AWS Documentation: Amazon EKS managed node groups automate the provisioning and lifecycle management of nodes (Amazon EC2 instances) for Amazon EKS Kubernetes clusters.
  • Amazon EKS Blueprints Quick Start: This repository contains the source code for the eks-blueprints NPM module. It can be used by AWS customers, partners, and internal AWS teams.
  • @aws-quickstart/eks-blueprints - npm: Customers can leverage the eks-blueprints module to deploy Well-Architected EKS clusters across any number of accounts.
  • Amazon EKS Blueprints for Terraform - GitHub: This repository contains a collection of Terraform modules that aim to make it easier and faster for customers to adopt Amazon EKS.
  • AWS EKS Module - Terraform Registry: Terraform module to create an Elastic Kubernetes (EKS) cluster and associated worker instances on AWS.
