
Failed to get contexts

See original GitHub issue

Expected behaviour

When I open the GitOps Tools extension I expect it to use the current context from my kubeconfig. It’s always worked that way in the past.

Actual behaviour

Instead, I get a message saying “Failed to get contexts: Config fetched, but contexts not found.”

Steps to reproduce

Open VS Code from a WSL command line and select the GitOps Tools extension.

Versions

  • kubectl client: v1.25.0
  • kubectl server: v1.22.9+vmware.1
  • Flux: v0.32.0
  • Git: 2.25.1
  • Azure: 2.40.0
  • Azure extension “k8s-configuration”: not installed
  • Azure extension “k8s-extension”: not installed
  • VSCode: 1.72.0
  • Extension: 0.22.0
  • OS: Linux x64 5.15.57.1-microsoft-standard-WSL2

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 10

Top GitHub Comments

1 reaction
CapKenR commented, Nov 18, 2022

First, it’s not just a WSL problem: if I open VS Code from PowerShell or from the Windows desktop, I get the same behavior. I made a short video of starting VS Code from WSL and then opening the GitOps extension; you can find it at https://clipchamp.com/watch/gDzBy58oP1l.

Second, my settings.json is the same regardless of where I open VS Code from, and it doesn’t have a vs-kubernetes.knownKubeconfigs entry; it only has vs-kubernetes.kubeconfig:

    "vs-kubernetes": {
        "vs-kubernetes.kubeconfig": {
            "collapsibleState": 2,
            "label": "docker-desktop",
            "children": [],
            "clusterProvider": "Generic",
            "clusterProviderManuallyOverridden": false,
            "isCurrent": true,
            "cluster": {
                "name": "docker-desktop",
                "cluster": {
                    "server": "https://kubernetes.docker.internal:6443",
                    "certificate-authority-data": "DATA+OMITTED"
                }
            },
            "clusterContext": {
                "name": "docker-desktop",
                "context": {
                    "cluster": "docker-desktop",
                    "user": "docker-desktop",
                    "clusterInfo": {
                        "name": "docker-desktop",
                        "cluster": {
                            "server": "https://kubernetes.docker.internal:6443",
                            "certificate-authority-data": "DATA+OMITTED"
                        }
                    }
                }
            },
            "clusterName": "docker-desktop",
            "contextName": "docker-desktop",
            "description": "https://kubernetes.docker.internal:6443",
            "iconPath": {
                "light": {
                    "$mid": 1,
                    "path": "/c:/Users/KenRider/.vscode/extensions/weaveworks.vscode-gitops-tools-0.22.3/resources/icons/light/cloud.svg",
                    "scheme": "file"
                },
                "dark": {
                    "$mid": 1,
                    "path": "/c:/Users/KenRider/.vscode/extensions/weaveworks.vscode-gitops-tools-0.22.3/resources/icons/dark/cloud.svg",
                    "scheme": "file"
                }
            },
            "isGitOpsEnabled": false
        }
    },
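Notably, the vs-kubernetes.kubeconfig value captured above is a serialized tree-view object rather than a file path; the working settings.json later in this thread stores a plain path string there instead. A small hypothetical Python check (my own sketch, not part of the extension) that flags this shape mismatch:

```python
import os

def check_kubeconfig_setting(settings: dict) -> str:
    """Diagnose the vs-kubernetes.kubeconfig setting.

    Assumes, based on the working settings.json later in this thread,
    that the extension expects a plain path string, not an object.
    """
    value = settings.get("vs-kubernetes", {}).get("vs-kubernetes.kubeconfig")
    if value is None:
        return "no kubeconfig set"
    if not isinstance(value, str):
        return f"unexpected type {type(value).__name__}: expected a path string"
    if not os.path.exists(value):
        return f"path string set, but file not found: {value}"
    return f"ok: {value}"

# The shape from the snippet above: an object, not a path string.
broken = {"vs-kubernetes": {"vs-kubernetes.kubeconfig": {"label": "docker-desktop"}}}
print(check_kubeconfig_setting(broken))  # unexpected type dict: expected a path string
```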

My WSL kubeconfig (/home/ken/.kube/config):

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://kubernetes.docker.internal:6443
  name: cluster-b796g6fbkm
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://kubernetes.docker.internal:6443
  name: docker-desktop
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://tkg-mgmt-azure-20220609080551-xxxxxxxx.westus2.cloudapp.azure.com:6443
  name: tkg-mgmt-azure-20220609080551
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://tkg-workload-azure-xxxxxxx.westus2.cloudapp.azure.com:6443
  name: tkg-workload-azure
contexts:
- context:
    cluster: docker-desktop
    user: docker-desktop
  name: docker-desktop
- context:
    cluster: tkg-mgmt-azure-20220609080551
    user: tkg-mgmt-azure-20220609080551-admin
  name: tkg-mgmt-azure-20220609080551-admin@tkg-mgmt-azure-20220609080551
- context:
    cluster: tkg-workload-azure
    namespace: default
    user: tkg-workload-azure-admin
  name: tkg-workload-azure-admin@tkg-workload-azure
current-context: tkg-mgmt-azure-20220609080551-admin@tkg-mgmt-azure-20220609080551
kind: Config
preferences: {}
users:
- name: docker-desktop
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED
- name: tkg-mgmt-admin
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED
- name: tkg-mgmt-azure-20220609080551-admin
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED
- name: tkg-workload-azure-admin
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED
- name: user-b796g6fbkm
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED
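For what it’s worth, the error text (“Config fetched, but contexts not found”) reads as though a config file was read successfully but its contexts list came back empty, even though the kubeconfig above plainly has three contexts. A rough Python sketch of such a check (not the extension’s actual code; real tooling would use a YAML parser rather than line matching):

```python
def get_contexts(kubeconfig_text: str) -> list:
    """Collect context names from kubeconfig YAML by line matching.

    A sketch only. It mirrors the failure mode in this issue: the
    config is read ("fetched"), but the contexts list may still be
    empty, which would produce exactly this error message.
    """
    names, in_contexts = [], False
    for line in kubeconfig_text.splitlines():
        stripped = line.rstrip()
        if not line.startswith((" ", "-")):
            # A top-level key: track whether we are inside "contexts:".
            in_contexts = stripped == "contexts:"
        elif in_contexts and stripped.lstrip().startswith("name:"):
            names.append(stripped.split("name:", 1)[1].strip())
    return names

sample = """\
apiVersion: v1
contexts:
- context:
    cluster: docker-desktop
    user: docker-desktop
  name: docker-desktop
current-context: docker-desktop
kind: Config
"""
print(get_contexts(sample))          # ['docker-desktop']
print(get_contexts("kind: Config"))  # [] -> "contexts not found"
```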

My Windows kubeconfig (C:\Users\KenRider\.kube\config):

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: DATA+OMITTED
    server: https://kubernetes.docker.internal:6443
  name: docker-desktop
contexts:
- context:
    cluster: docker-desktop
    user: docker-desktop
  name: docker-desktop
current-context: docker-desktop
kind: Config
preferences: {}
users:
- name: docker-desktop
  user:
    client-certificate-data: REDACTED
    client-key-data: REDACTED

Regarding the localhost:8080 question: it must not be using my default kubeconfig or the kubeconfig referenced in settings.json, as both have valid contexts.
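For background on why localhost:8080 comes up at all: kubectl resolves its config from the --kubeconfig flag, then the KUBECONFIG environment variable (a path list), then ~/.kube/config, and older clients fell back to http://localhost:8080 when none was found. So errors mentioning localhost:8080 usually mean no kubeconfig was resolved at all. A simplified Python sketch of that lookup order:

```python
import os

def resolve_kubeconfig(flag_path=None, env=None):
    """Simplified sketch of kubectl's kubeconfig lookup order:
    --kubeconfig flag, then $KUBECONFIG (an os.pathsep-separated
    list of files), then ~/.kube/config. An empty result is the
    case where old clients fell back to localhost:8080.
    """
    env = os.environ if env is None else env
    if flag_path:
        return [flag_path]
    if env.get("KUBECONFIG"):
        return [p for p in env["KUBECONFIG"].split(os.pathsep) if p]
    default = os.path.join(os.path.expanduser("~"), ".kube", "config")
    return [default] if os.path.exists(default) else []

print(resolve_kubeconfig(flag_path="/tmp/kubeconfig"))  # ['/tmp/kubeconfig']
```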

0 reactions
CapKenR commented, Nov 20, 2022

That worked. My settings.json now has both vs-kubernetes.kubeconfig and vs-kubernetes.knownKubeconfigs; I had to add both the WSL and the Windows default kubeconfigs. However, I still get the same error message when I switch between the two, and must then use the Set Kubeconfig widget again.

{
    "workbench.colorTheme": "Default Dark+",
    "redhat.telemetry.enabled": true,
    "git.autofetch": true,
    "vs-kubernetes": {
        "vs-kubernetes.kubeconfig": "/home/ken/.kube/config",
        "vs-kubernetes.knownKubeconfigs": [
            "/home/ken/.kube/config",
            "c:\\Users\\KenRider\\.kube\\config"
        ]
    },
    "security.workspace.trust.untrustedFiles": "open",
    "window.commandCenter": true
}
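One possible explanation for the error recurring on switch (my speculation, not confirmed anywhere in this thread): each environment can only open its own path form. A WSL process cannot read c:\Users\KenRider\.kube\config directly; under the default WSL drive mount it would need /mnt/c/Users/KenRider/.kube/config, and conversely a Windows process cannot open /home/ken/.kube/config. A hypothetical helper translating a Windows path into its WSL-visible form:

```python
import re

def windows_to_wsl(path: str) -> str:
    """Translate a Windows path like c:\\Users\\KenRider\\.kube\\config
    into the form a WSL process can open (/mnt/c/Users/KenRider/.kube/config).

    A sketch of the default WSL drive-mount convention, not of anything
    the extension actually does.
    """
    m = re.match(r"^([A-Za-z]):[\\/](.*)$", path)
    if not m:
        return path  # already a POSIX-style path; leave it alone
    drive, rest = m.group(1).lower(), m.group(2).replace("\\", "/")
    return f"/mnt/{drive}/{rest}"

print(windows_to_wsl("c:\\Users\\KenRider\\.kube\\config"))
# /mnt/c/Users/KenRider/.kube/config
```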

From: Kingdon Barrett Sent: Sunday, November 20, 2022 4:43 AM To: weaveworks/vscode-gitops-tools Cc: Ken Rider; Mention Subject: Re: [weaveworks/vscode-gitops-tools] Failed to get contexts (Issue #393)

OK, my best guess right now is that something fails when the extension sets your KUBECONFIG.

Can you please try one more thing? Using this widget, could you point the extension at a Kubeconfig you want it to use: https://user-images.githubusercontent.com/3286998/202899984-7789c03e-d9db-4b1c-a39e-732ce24cfa97.png

This should result in a “known kubeconfigs” section being added to your settings.json. If you have set a kubeconfig there and it still does not connect end to end, then we know for sure the problem is somewhere between the extension and the kubectl binary, and not just a matter of KUBECONFIG somehow not being passed properly into the editor’s context.

— Reply to this email directly, or view it on GitHub: https://github.com/weaveworks/vscode-gitops-tools/issues/393#issuecomment-1321109847


