Ephemeral (single use) runner registrations

See original GitHub issue

Describe the bug
When starting a self-hosted runner with ./run.cmd --once, the runner sometimes accepts a second job before shutting down, which causes that second job to fail with the message:

The runner: [runner-name] lost communication with the server. Verify the machine is running and has a healthy network connection.

This looks like the same issue recently fixed here: microsoft/azure-pipelines-agent#2728

To Reproduce
Steps to reproduce the behavior (a command sketch of these steps follows the list):

  1. Create a repo, enable GitHub Actions, and add a new workflow

  2. Configure a new runner on your machine

  3. Run the runner with ./run.cmd --once

  4. Queue two runs of your workflow

  5. The first job will run and the runner will go offline

  6. (Optionally) configure and start a second runner

  7. The second job will time out after several minutes with the message:

    The runner: [runner-name] lost communication with the server. Verify the machine is running and has a healthy network connection.
    

    (where [runner-name] is the name of the first runner)

  8. Also: trying to remove the first runner with the command ./config.cmd remove --token [token] will result in the following error until the second job times out:

    Failed: Removing runner from the server
    Runner "[runner-name]" is running a job for pool "Default"
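
Put together, the steps above boil down to roughly the following commands. This is only a sketch: <owner>/<repo> and the tokens are placeholders, and it assumes the runner package is unpacked in the current directory.

    # Configure a runner for the repository.
    ./config.cmd --url https://github.com/<owner>/<repo> --token <registration-token>

    # Start it in single-use mode; it should take exactly one job and then exit.
    ./run.cmd --once

    # Queue two runs of the workflow (e.g. dispatch it twice). After the first job
    # finishes, the runner goes offline, yet the second job is still assigned to it.

    # Trying to remove the runner while that second job is stuck fails:
    ./config.cmd remove --token <removal-token>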
    

Expected behavior
The second job should run on (and wait for) any new runner that comes online, rather than being assigned as a second job to the original runner, which is now offline.

Runner Version and Platform

2.262.1 on Windows

Runner and Worker’s Diagnostic Logs

_diag.zip

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 57
  • Comments: 33 (5 by maintainers)

Top GitHub Comments

24 reactions
rclmenezes commented, Mar 26, 2021

Help us @bryanmacfarlane, you’re our only hope! 🙏


19 reactions
bryanmacfarlane commented, Jul 10, 2020

@rclmenezes ack on #1 and #2. We’re currently designing and working on it. The plan is exactly what you laid out: register the runner as ephemeral with the backend service, so the service automatically cleans it up after the job completes and the runner/container exits.
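
For context, this design later shipped as an --ephemeral flag on the runner’s config script (see the announcement linked below). A minimal sketch of a single-use registration, assuming a recent runner release; the URL and token are placeholders:

    # Register the runner as ephemeral; the service de-registers it after it runs one job.
    ./config.cmd --url https://github.com/<owner>/<repo> --token <registration-token> --ephemeral

    # Start the runner; the process exits after its single job completes.
    ./run.cmd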

Read more comments on GitHub >

Top Results From Across the Web

  • GitHub Actions: Ephemeral self-hosted runners & new ...
    GitHub Actions now supports ephemeral (i.e. single job) self-hosted runners and a new workflow_job webhook to make autoscaling your runners ... (a sketch of this approach follows the list)

  • Create Ephemeral Self-Hosted Runners for GitHub Actions
    GitHub Actions provides the ability to have ephemeral runners so that each one is only used for a single workflow run. In this...

  • Ephemeral Self-Hosted Github Actions Runners
    Initial post on a series about orchestrating github actions runners. Tagged with github, docker, bash.

  • On Demand Ephemeral Self-Hosted Runners
    In this post I show you how you can use Ephemeral Runners to ... One technique you could use is to autoscale your...

  • GitHub Actions: Ephemeral self-hosted runners and new ...
    Can't you run more than one runner per server? My understanding was you start one up, give it a directory to work in,...
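
Several of these results describe the same autoscaling pattern: listen for the workflow_job webhook and, when a job is queued, mint a short-lived registration token via the REST API and bring up a fresh ephemeral runner for that one job. A rough sketch of that flow, assuming a Linux host, a personal access token in GITHUB_TOKEN, and placeholder values for <owner>/<repo> (config.sh/run.sh are the Linux counterparts of the config.cmd/run.cmd scripts above):

    # React to a workflow_job webhook delivery whose action is "queued".
    # Mint a short-lived registration token for the repository:
    REG_TOKEN=$(curl -s -X POST \
      -H "Authorization: token $GITHUB_TOKEN" \
      -H "Accept: application/vnd.github+json" \
      "https://api.github.com/repos/<owner>/<repo>/actions/runners/registration-token" \
      | jq -r .token)

    # Register and start a single-use runner; the service removes the
    # registration automatically once its one job has finished.
    ./config.sh --url "https://github.com/<owner>/<repo>" --token "$REG_TOKEN" --ephemeral --unattended
    ./run.sh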
