Unable to connect to AWS EKS cluster
I am configuring an EKS cluster with Terraform in a private subnet and accessing it through a VPN in a public subnet. It worked fine when I first configured it, but now when I run kubectl get pods or kubectl get svc, it throws an error:
error: exec plugin: invalid apiVersion "client.authentication.k8s.io/v1alpha1"
I don't know why this is happening. Please reply if you know the solution.
Thanks
Solution 1:[1]
Support for the client.authentication.k8s.io/v1alpha1 exec API was removed in kubectl 1.24, which is why authentication breaks with that version. Downgrading to 1.23.6 will fix the issue for now:

sudo apt install kubectl=1.23.6-00
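The error usually comes from the kubeconfig's exec stanza still requesting the removed v1alpha1 API. A minimal sketch to detect that, using a throwaway sample kubeconfig so the commands run anywhere; in practice, point the grep at your real ~/.kube/config:

```shell
# Sketch: check whether a kubeconfig still requests the removed
# v1alpha1 exec API (the cause of the error in the question).
# The sample file below is illustrative only; for real use, set
# KUBECONFIG to your actual ~/.kube/config instead.
KUBECONFIG=$(mktemp)
cat > "$KUBECONFIG" <<'EOF'
users:
- name: eks-user
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws
EOF

if grep -q "client.authentication.k8s.io/v1alpha1" "$KUBECONFIG"; then
  echo "deprecated v1alpha1 exec API found in $KUBECONFIG"
fi
```

If you instead upgrade to a recent AWS CLI, re-running aws eks update-kubeconfig --name &lt;cluster&gt; regenerates this stanza with v1beta1, which kubectl 1.24+ accepts.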
Solution 2:[2]
If you're experiencing this issue with GitHub Actions and kodermax/kubectl-aws-eks, configure fixed versions for KUBECTL_VERSION and IAM_VERSION in each deployment step.
- name: deploy to cluster
  uses: kodermax/kubectl-aws-eks@master
  env:
    KUBE_CONFIG_DATA: ${{ secrets.KUBE_CONFIG_DATA_STAGING }}
    ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
    ECR_REPOSITORY: my-app
    IMAGE_TAG: ${{ github.sha }}
    KUBECTL_VERSION: "v1.23.6"
    IAM_VERSION: "0.5.3"
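To catch version drift before it breaks a deployment, a guard can be run as an early CI step. A sketch, where PINNED is an assumption that should be kept in sync with the KUBECTL_VERSION used in the workflow:

```shell
# Sketch: fail fast if the kubectl client does not match the pinned version.
# PINNED is an assumed value; keep it in sync with KUBECTL_VERSION.
PINNED="v1.23.6"
# In a real CI step this would be read from the installed client, e.g.
# CURRENT=$(kubectl version --client -o json | jq -r .clientVersion.gitVersion)
CURRENT="v1.23.6"

if [ "$CURRENT" != "$PINNED" ]; then
  echo "kubectl version mismatch: $CURRENT != $PINNED" >&2
  exit 1
fi
echo "kubectl version ok: $CURRENT"
```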
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Gavy |
| Solution 2 | DileepNimantha |