
Start Scrapyd with the `daemon` command

See original GitHub issue
  1. I would like to run scrapyd as a service, but when I start scrapyd and then close the SSH session, scrapyd stops automatically.

  2. When I try to start it as a service like this, I get an error:

    root@vps:~# service scrapyd start
    scrapyd: Failed to start scrapyd.service: Unit scrapyd.service failed to load: No such file or directory.
    
  3. And when I try to start scrapyd with the `daemon` command, the curl request returns:

    {"status": "error", "message": "Use \"scrapy\" to see available commands", "node_name": "vps"}
    

Can someone please help me run scrapyd as a service?
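
The first symptom happens because a scrapyd process started from an interactive shell dies when the SSH session closes. A minimal stop-gap sketch (paths assumed) is to detach it with `nohup` until a real service is set up:

    # keep scrapyd alive after the SSH session closes (workaround, not a real service)
    cd /home/Crawler               # assumed project directory containing scrapy.cfg
    nohup scrapyd > scrapyd.log 2>&1 &

The error in step 2 simply means that no `scrapyd.service` unit (or init script) is installed on the machine, so there is nothing for `service`/systemd to load; the comments below cover the `daemon` and systemd options.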

Issue Analytics

  • State: closed
  • Created 8 years ago
  • Comments: 14 (6 by maintainers)

Top GitHub Comments

1 reaction
ricoxor commented, Mar 10, 2016

Thank’s Digenis I find the solution : daemon --chdir=/home/Crawler scrapyd

I needed to set `--chdir` so the service runs from the Scrapy project folder!
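
For reference, here is the same fix written out with a few commonly used daemon(1) (libslack) options; everything except `--chdir=/home/Crawler scrapyd` is an assumed extra, not part of the accepted command:

    # run scrapyd from the project directory, give the daemon a name, respawn it if
    # it dies, and capture its output (all flags besides --chdir are assumed extras)
    daemon --name=scrapyd --respawn \
           --chdir=/home/Crawler \
           --output=/var/log/scrapyd.log \
           scrapyd

The working directory matters because the 'Use "scrapy" to see available commands' error from the question is what Scrapy prints when it is invoked outside a project, which is consistent with `--chdir` pointing the service at the folder that holds scrapy.cfg.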

0 reactions
Digenis commented, Mar 4, 2017

twistd can also daemonize scrapyd; it's just that the default console script doesn't. It also needs some additional arguments for the working directory, the path to the PID file, and logging.
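
A rough sketch of such an invocation, assuming a scrapyd `.tac` application file is available (older scrapyd releases shipped one) and using placeholder paths:

    # let twistd itself daemonize scrapyd: working directory, PID file and log file
    twistd --rundir=/home/Crawler \
           --pidfile=/var/run/scrapyd.pid \
           --logfile=/var/log/scrapyd.log \
           -y /path/to/scrapyd.tac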

For systemd, see Twisted's latest doc: http://twistedmatrix.com/documents/current/core/howto/systemd.html. There's also a ticket for running scrapyd under systemd: #102
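
Since the default console script stays in the foreground, a plain systemd unit can supervise it directly. This is a minimal sketch only; the install path, user and working directory are assumptions:

    # /etc/systemd/system/scrapyd.service  (sketch; paths and user are assumed)
    [Unit]
    Description=Scrapyd
    After=network.target

    [Service]
    User=scrapy
    WorkingDirectory=/home/Crawler
    ExecStart=/usr/local/bin/scrapyd
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target

After saving the file, `systemctl daemon-reload`, `systemctl enable scrapyd` and `systemctl start scrapyd` bring the service up and keep it enabled across reboots.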


Top Results From Across the Web

Preferred way to run Scrapyd in the background / as a service
To have scrapyd run as daemon, you can simply do: ... This rather complicated command starts Scrapyd image in the background ( -d...

Scrapyd Documentation
Scrapyd is an application (typically run as a daemon) that listens to requests for spiders to run and spawns a process.

Scrapyd — Scrapy 2.7.1 documentation
Scrapyd. Scrapyd has been moved into a separate project. Its documentation is now hosted at: https://scrapyd.readthedocs.io/en/latest/ ...

The Scrapyd Guide - Deploy & Schedule Your Scrapy Spiders
The Complete Guide To Scrapyd: Deploy, Schedule & Run Your Scrapy Spiders ... And then start the server by using the command: ...

vimagick/scrapyd - Docker Image
scrapyd · docker-compose.yml · Run it as background-daemon for scrapyd. $ docker-compose up -d scrapyd $ docker-compose logs -f scrapyd $ docker cp ...
