`cml <sub-cmd>` on cml runner instance fails
I have tried a few iterations of this:
jobs:
  setup:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: iterative/setup-cml@v1
      - name: deploy runner
        env:
          GOOGLE_APPLICATION_CREDENTIALS_DATA: ${{ secrets.GCP_CML_RUNNER_KEY }}
        run: |
          cml runner \
            --single \
            --labels=test \
            --token=${{ secrets.DACBD_PAT }} \
            --cloud=gcp \
            --cloud-region=us-west \
            --cloud-type=e2-highcpu-2
  test:
    needs: [setup]
    runs-on: [self-hosted, test]
    steps:
      - uses: actions/checkout@v2
      - run: |
          which cml
          cml ci
This fails with an odd message: `Error: Cannot find module '/tmp/tmp.njlS8oR0sf/.cml/cml-ehsjbxeja3/_work/cml-pulse/cml-pulse/ci'`

And this variant:
...
  test:
    needs: [setup]
    runs-on: [self-hosted, test]
    steps:
      - uses: actions/checkout@v2
      - uses: iterative/setup-cml@v1
      - run: |
          which cml
          cml ci
This one fails with `EEXIST: file already exists, symlink '../lib/node_modules/@dvcorg/cml/bin/cml.js' -> '/usr/bin/cml'`.

However, passing `--cml-version=github:iterative/cml.git#master` to trigger the `npm install -g` way of setting up CML on the runner instance works fine.
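For reference, here is a sketch of the working variant of the deploy step, assuming everything else in the setup job stays as in the first example above; only the added `--cml-version` flag differs:

      - name: deploy runner
        env:
          GOOGLE_APPLICATION_CREDENTIALS_DATA: ${{ secrets.GCP_CML_RUNNER_KEY }}
        run: |
          # same deployment as before, but forcing the npm-based install of
          # cml on the provisioned instance instead of the bundled binary
          cml runner \
            --single \
            --labels=test \
            --cml-version=github:iterative/cml.git#master \
            --token=${{ secrets.DACBD_PAT }} \
            --cloud=gcp \
            --cloud-region=us-west \
            --cloud-type=e2-highcpu-2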
Top GitHub Comments
👀 🤔
I think what I have in mind shouldn’t care (if it works); as far as I have seen there is no issue with repeated invocations, just with the “re-invocation” within the inherited context of the first `pkg`-built `cml runner` call, which is the “parent” process of the GitHub Actions runner and thus of any subsequent use of the `cml` cmd.

Finished doing some reading and plan to try it out tomorrow evening 😴
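A quick way to check that hypothesis on the self-hosted runner (a debugging sketch, not a step from the original workflows) would be something like:

      - run: |
          which -a cml               # every cml binary visible on PATH
          env | grep -i pkg || true  # any pkg-related variables inherited from the parent cml runner process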
While “libelous”, I feel validated in my suspicion of child_process/spawn weirdness; also, well done on the hunt! (https://github.com/vercel/pkg/issues/897#issuecomment-1076988215) I glanced at that chunk, but I missed its significance in my initial grep of the codebase after I had found the culprit env.

To summarize, I agree: I think the fix is solving #802, for which I have an idea beyond what is currently implemented in that branch, and which I intend to test out soon.
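Until #802 lands, one speculative workaround, purely a sketch based on the inherited-environment suspicion above (PKG_EXECPATH is the variable vercel/pkg uses to mark child invocations; clearing it is an assumption here, not a confirmed fix), would be to drop it before re-invoking cml in the test job:

      - run: |
          # assumption: the packaged cml binary misbehaves because the Actions
          # runner inherited PKG_EXECPATH from the cml runner process that
          # spawned it; unsetting it before calling cml again is untested.
          unset PKG_EXECPATH
          cml ci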