Create scrapy command to get the equivalent Request from a cURL command
Hi!
Since https://github.com/scrapy/scrapy/pull/3862 was accepted, it was suggested that I create a scrapy command that receives a cURL command and returns the equivalent Request __repr__.
Something like:
scrapy requestfromcurl "curl 'http://example.org/post' -X POST -H 'Cookie: _gauges_unique_year=1; _gauges_unique=1'"
would print:
Request(method='POST', url='http://example.org/post', cookies={'_gauges_unique_year': '1', '_gauges_unique': '1'})
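Under the hood, the translation itself is already available as Request.from_curl(), which the merged PR added; a minimal sketch of using it directly (note that the default Request.__repr__ only prints the method and URL, so the command would have to build the richer output shown above itself):

from scrapy import Request

# Parsing step added by https://github.com/scrapy/scrapy/pull/3862;
# the proposed command would mostly wrap this and print the result.
curl_command = (
    "curl 'http://example.org/post' -X POST "
    "-H 'Cookie: _gauges_unique_year=1; _gauges_unique=1'"
)
request = Request.from_curl(curl_command)
print(request.method)   # POST
print(request.url)      # http://example.org/post
print(request.cookies)  # {'_gauges_unique_year': '1', '_gauges_unique': '1'}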
I just opened a PR (https://github.com/scrapy/scrapy/pull/3990) to show the idea, but I would like to ask you:
1st. Do you like the idea? Is it useful? (I’m not sure about it)
If so:
2nd. How would you name that command? (ideas: requestfromcurl, curltorequest…)
3rd. Where would you mention that command in the docs?
Thanks in advance
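For reference, here is a rough sketch of what such a command class could look like; the command name (requestfromcurl), the error message, and the printed format are only illustrative, and the actual PR may differ:

from scrapy.commands import ScrapyCommand
from scrapy.exceptions import UsageError
from scrapy.http import Request


class Command(ScrapyCommand):
    # Built-in commands live in scrapy/commands/, where the module name
    # (e.g. requestfromcurl.py) becomes the command name.
    requires_project = False

    def syntax(self):
        return "<curl command>"

    def short_desc(self):
        return "Print the scrapy.Request equivalent of a cURL command"

    def run(self, args, opts):
        if len(args) != 1:
            raise UsageError("A single quoted cURL command is required")
        request = Request.from_curl(args[0])
        # Request.__repr__ is just '<METHOD url>', so print the
        # interesting attributes explicitly.
        print(
            f"Request(method={request.method!r}, url={request.url!r}, "
            f"cookies={request.cookies!r})"
        )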
Top GitHub Comments
Hi! I’ve recently built a small visual tool which helps translate cURL requests to Scrapy code. Here’s a link: https://michael-shub.github.io/curl2scrapy/, and here’s the GitHub page: https://github.com/michael-shub/curl2scrapy
Updated curl2scrapy a bit. It now seems to work with simpler cURL requests. Also added an onPaste handler to avoid pressing extra buttons and save you time.