Hello Folks,
I need help decommissioning the cluster vfde-webapp-preprod-mobile3-eks.
Context:
We have a cluster, vfde-webapp-preprod-mobile3-eks (EKS 1.23), that was deployed with Terraform and was only supposed to be upgraded through Terraform; however, due to a manual intervention, the cluster was upgraded to 1.24.
Now I am unable to delete that cluster using Terraform because of a hidden dependency that I cannot pin down, so I am turning to you for guidance.
I fully understand that this is not a bug report, but I would greatly appreciate your guidance on the matter.
Problem Statement:
The terraform plan operation fails with the errors below:
=========================== | ===========================
Error: Get "http://localhost/api/v1/namespaces/kube-system/configmaps/aws-auth": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_config_map.aws_auth,
│ on aws_auth.tf line 1, in resource "kubernetes_config_map" "aws_auth":
│ 1: resource "kubernetes_config_map" "aws_auth" {
│
╵
╷
│ Error: Get "http://localhost/apis/rbac.authorization.k8s.io/v1/clusterroles/readonly": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_cluster_role_v1.readonly,
│ on aws_auth.tf line 27, in resource "kubernetes_cluster_role_v1" "readonly":
│ 27: resource "kubernetes_cluster_role_v1" "readonly" {
│
╵
╷
│ Error: Get "http://localhost/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/readonly": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_cluster_role_binding_v1.readonly,
│ on aws_auth.tf line 41, in resource "kubernetes_cluster_role_binding_v1" "readonly":
│ 41: resource "kubernetes_cluster_role_binding_v1" "readonly" {
│
╵
╷
│ Error: Get "http://localhost/api/v1/namespaces/kube-system/serviceaccounts/argocd-manager": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_service_account_v1.argocd-manager[0],
│ on cicd-sa.tf line 1, in resource "kubernetes_service_account_v1" "argocd-manager":
│ 1: resource "kubernetes_service_account_v1" "argocd-manager" {
│
╵
╷
│ Error: Get "http://localhost/apis/rbac.authorization.k8s.io/v1/clusterroles/argocd-manager-role": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_cluster_role_v1.argocd-manager[0],
│ on cicd-sa.tf line 83, in resource "kubernetes_cluster_role_v1" "argocd-manager":
│ 83: resource "kubernetes_cluster_role_v1" "argocd-manager" {
│
╵
╷
│ Error: Get "http://localhost/apis/rbac.authorization.k8s.io/v1/clusterrolebindings/argocd-manager-role-binding": dial tcp 127.0.0.1:80: connect: connection refused
│
│ with kubernetes_cluster_role_binding_v1.argocd-manager[0],
│ on cicd-sa.tf line 103, in resource "kubernetes_cluster_role_binding_v1" "argocd-manager":
│ 103: resource "kubernetes_cluster_role_binding_v1" "argocd-manager" {
│
╵
=========================== | ===========================
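All six failures above are in-cluster objects managed through the kubernetes provider, and every request goes to http://localhost rather than the EKS endpoint. As a point of reference (illustrative only, not this repository's actual code), that is the symptom you typically see when the provider's connection settings end up unset or empty at plan time:

```hcl
# Illustrative only, not the actual configuration: a kubernetes provider
# whose host, token and CA certificate are unset (or evaluate to empty
# strings at plan time). With nothing to connect to, the provider falls
# back to its client defaults and dials http://localhost, producing
# exactly the "dial tcp 127.0.0.1:80: connect: connection refused"
# errors shown above.
provider "kubernetes" {}
```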
It looks like your issue is with the providers, specifically the kubernetes provider. What does your kubernetes provider configuration look like, and have you tried running terraform refresh (https://developer.hashicorp.com/terraform/cli/commands/refresh#usage) before running your plan?
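For reference, a common way to wire the kubernetes provider to an EKS cluster is through the cluster data sources, so that the endpoint, CA certificate and token are resolved from the live cluster on every run. The sketch below is illustrative; the data source names are placeholders, not the module's actual configuration:

```hcl
# Illustrative sketch, assuming the provider should authenticate against
# the EKS cluster directly. Names are placeholders.
data "aws_eks_cluster" "this" {
  name = "vfde-webapp-preprod-mobile3-eks"
}

data "aws_eks_cluster_auth" "this" {
  name = data.aws_eks_cluster.this.name
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.this.token
}
```

Note that terraform refresh is the legacy form of that operation; on current Terraform versions the equivalent is terraform apply -refresh-only.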