
looking for advice on using terraform and helm to manage karpenter install #7618

Closed as not planned

Description

@tormodmacleod

hello,

currently we're using terraform to install karpenter via the eks blueprints helm chart, as shown below:

module "eks_blueprints_addons" {
  source  = "aws-ia/eks-blueprints-addons/aws"
  version = "1.16.3"

  cluster_name      = module.eks.cluster_name
  cluster_endpoint  = module.eks.cluster_endpoint
  cluster_version   = module.eks.cluster_version
  oidc_provider_arn = module.eks.oidc_provider_arn

  enable_karpenter = true

  karpenter = {
    chart_version = "0.37.0"
    values = [
      "replicas: 1",
      "tolerations: [{key: dedicated, operator: Equal, value: infrastructure-base}]",
      "settings: {featureGates: {spotToSpotConsolidation: ${var.spot_to_spot_consolidation}}}",
    ]
  }
}

we're in the process of upgrading to 0.37.6 and have encountered some issues similar to this one. i did some reading and it seems that this is a consequence of helm's behaviour of not updating CRDs when a chart is upgraded. further reading suggested that at some point the advice had been to run kubectl apply -f against the appropriate CRD yaml, but that the advice is now to install the CRDs using a separate helm chart:

As an independent Helm chart karpenter-crd (source) that can be used by Helm to manage the lifecycle of these CRDs

this language suggests that some mechanism has been found to overcome helm's behaviour of not updating CRDs, and that the karpenter-crd chart takes advantage of it. i'd be grateful if someone could confirm that my understanding is correct, and that once we adopt the approach of using separate charts to manage the resources and the CRDs we'll no longer experience issues related to outdated CRDs
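for comparison, the older kubectl apply advice mentioned above would look roughly like this if driven from terraform. this is only a sketch: it assumes kubectl is on the path and already authenticated against the cluster, and the manifest location is a placeholder rather than a real path

resource "null_resource" "karpenter_crds_manual" {
  # re-run the apply whenever the chart version changes, otherwise the CRDs go stale
  triggers = {
    chart_version = "0.37.6"
  }

  provisioner "local-exec" {
    # placeholder: point this at the CRD manifests that ship with the karpenter chart
    command = "kubectl apply -f <karpenter-crd-manifests>"
  }
}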

also, i've tested the following terraform config and it seems to work. however, if you can see anything that i've misconfigured, please let me know:

module "eks_blueprints_addons" {
  source  = "aws-ia/eks-blueprints-addons/aws"
  version = "1.16.3"

  cluster_name      = module.eks.cluster_name
  cluster_endpoint  = module.eks.cluster_endpoint
  cluster_version   = module.eks.cluster_version
  oidc_provider_arn = module.eks.oidc_provider_arn

  enable_karpenter = true

  karpenter = {
    chart_version    = "0.37.0"
    skip_crds        = true
    create_namespace = false
    values = [
      "replicas: 1",
      "tolerations: [{key: dedicated, operator: Equal, value: infrastructure-base}]",
      "settings: {featureGates: {spotToSpotConsolidation: ${var.spot_to_spot_consolidation}}}",
    ]
  }

  depends_on = [helm_release.karpenter_crd]
}

resource "helm_release" "karpenter_crd" {
  name             = "karpenter-crd"
  repository       = "oci://public.ecr.aws/karpenter"
  chart            = "karpenter-crd"
  version          = "0.37.0"
  namespace        = "karpenter"
  create_namespace = true
  force_update     = true
}
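one follow-up thought: since the CRDs now come from a separate release, i'm considering pinning both charts to a single version so they can't drift apart during an upgrade. rough sketch of what i mean:

locals {
  karpenter_chart_version = "0.37.0"
}

# then use local.karpenter_chart_version both as chart_version in the
# eks_blueprints_addons karpenter block and as version in helm_release.karpenter_crd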
