Hi OVH Community,
I tried to follow the docs and set up OIDC on a Managed Kubernetes cluster.
I used Terraform as suggested, and it created the cluster with the OIDC configuration from the docs, adapted to the Google OpenID token claims.
> resource "ovh_cloud_project_kube_oidc" "cluster-oidc" {
>   service_name = var.ovh_public_cloud_project_id
>   kube_id      = ovh_cloud_project_kube.cluster.id
>
>   # required fields
>   client_id  = var.oidc_client_id
>   issuer_url = var.oidc_issuer_url
>
>   oidc_username_claim  = "email"
>   oidc_username_prefix = "oidc:"
>   depends_on           = [ovh_cloud_project_kube.cluster]
> }
I also created the ClusterRoleBinding with the HashiCorp Kubernetes provider.
> resource "kubernetes_cluster_role_binding" "oidc-cluster-admin" {
>   metadata {
>     name = "oidc-cluster-admin"
>   }
>   role_ref {
>     api_group = "rbac.authorization.k8s.io"
>     kind      = "ClusterRole"
>     name      = "cluster-admin"
>   }
>   subject {
>     kind      = "User"
>     name      = "oidc:some.email@fqdn.com"
>     api_group = "rbac.authorization.k8s.io"
>   }
>   depends_on = [ovh_cloud_project_kube_oidc.cluster-oidc]
> }
On the kubectl side I used kubelogin and configured the user like this.
> - name: oidc
>   user:
>     exec:
>       apiVersion: client.authentication.k8s.io/v1beta1
>       args:
>       - oidc-login
>       - get-token
>       - --oidc-issuer-url=https://accounts.google.com
>       - --oidc-client-id=
>       - --oidc-client-secret=
>       - --oidc-extra-scope=email
>       - --oidc-extra-scope=profile
>       - --oidc-extra-scope=openid
>       - -v10
>       command: kubectl
>       env: null
>       provideClusterInfo: false
I debugged this a bit. The OVH Managed Kubernetes apiserver audit log output is nice, but really painful to use when searching for issues, because there are no filters or other useful tools. (Yes, this could be seen as a feature request.)
On the kubectl side it seems not to work because of the missing groups claim, which Google OpenID Connect tokens do not provide.
> I0425 09:26:59.167863 42551 get_token.go:107] you already have a valid token until 2023-04-25 10:26:58 +0200 CEST
> I0425 09:26:59.167869 42551 get_token.go:114] writing the token to client-go
> E0425 09:26:59.169871 42291 memcache.go:265] couldn't get current server API group list: the server has asked for the client to provide credentials
> error: You must be logged in to the server (the server has asked for the client to provide credentials)
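To see for myself which claims the ID token actually carries, I decode the payload segment of the JWT by hand. Here is a minimal sketch with a hypothetical stand-in token (a real one would come from kubelogin's `get-token` output; the claims below are made up):

```shell
# Build a stand-in token; the claims here are hypothetical, not a real credential.
CLAIMS='{"iss":"https://accounts.google.com","email":"some.email@fqdn.com"}'
PAYLOAD=$(printf '%s' "$CLAIMS" | base64 | tr -d '=\n' | tr '/+' '_-')
TOKEN="header.${PAYLOAD}.signature"

# Decode the middle (claims) segment of the JWT.
P=$(printf '%s' "$TOKEN" | cut -d '.' -f 2)
# base64url drops padding; restore it before decoding.
P="${P}$(printf '%*s' $(( (4 - ${#P} % 4) % 4 )) '' | tr ' ' '=')"
DECODED=$(printf '%s' "$P" | tr '_-' '/+' | base64 -d)
echo "$DECODED"
```

Run against a real token, this shows the `email` claim is present while `groups` is absent, which matches the behavior described above.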
There are other projects which also have issues because of this behavior.
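In the meantime, to rule out a stale cached token (kubelogin reported "you already have a valid token" right before the credentials error), I clear kubelogin's token cache between attempts. This assumes the default cache directory; adjust the path if you configured a different one:

```shell
# Delete kubelogin's cached tokens (default location, an assumption here)
# so the next kubectl call triggers a fresh browser login.
rm -rf ~/.kube/cache/oidc-login
```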
Is there anyone who got this to work with Google OpenID Connect?
If so, can you provide more information please?
My goal is to authenticate with our company's Google Workspace accounts.
Working without groups is okay for me at the moment.
Thank you for your help!
Containers and Orchestration - Managed Kubernetes OIDC with Google OpenID