How to save partial results to CSV when parsing from the CLI?
See original GitHub issue

I'm parsing a fb page from the CLI like this:
facebook-scraper --filename nintendo_page_posts.csv --pages 5000 --cookies my_cookies.json nintendo
I parse many pages (trying to scrape all of a page's posts in one run) and often hit different errors (blocks, timeouts, network errors, etc.). When that happens, hours of work are lost. Is there any option to save the .csv file (maybe with a different extension, e.g. .csv.partial) even if an error occurred?
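One defensive pattern (a sketch, not a built-in feature of facebook-scraper) is to write each post to the CSV as it arrives and flush after every row, wrapping the loop in try/except so a crash still leaves a usable partial file. `scrape_posts` below is a hypothetical stand-in for whatever generator yields post dicts (e.g. `facebook_scraper.get_posts`):

```python
import csv

def scrape_posts():
    """Stand-in for a real scraper generator (e.g. facebook_scraper.get_posts).
    Yields a few post dicts, then fails mid-run to simulate a block/timeout."""
    for i in range(5):
        if i == 3:
            raise ConnectionError("simulated network error / temporary ban")
        yield {"post_id": i, "text": f"post {i}"}

def scrape_to_csv(path):
    """Write rows incrementally so already-scraped posts survive a failure."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["post_id", "text"])
        writer.writeheader()
        try:
            for post in scrape_posts():
                writer.writerow(post)
                f.flush()  # persist each row immediately
        except Exception as exc:
            # The partial file stays on disk with everything written so far
            print(f"stopped early: {exc}; partial results kept in {path}")

scrape_to_csv("nintendo_page_posts.csv.partial")
```

Running this leaves a `.csv.partial` file containing the three posts scraped before the simulated failure, which is exactly the behavior the question asks for.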
Issue Analytics
- State:
- Created 2 years ago
- Comments: 6
Top Results From Across the Web
- Using the Cut Command to Parse a CSV File by Delimiters ...
  Make quick work out of extracting useful data from any output that's semi-structured. Hit the subscribe button to receive more videos like ...
- Working with CSVs on the Command Line - Brian Connelly
  To do this, just add the > character and the name of the file where you want the output to be stored to...
- Parse the output of my bash script and save as CSV
  But I need to parse the verbose data from output, and eventually save some meaningful results as a .CSV file. Here is my...
- Using the Cut Command to Parse a CSV File ... - Nick Janetakis
  Cut can make quick work out of extracting useful data from CSV files or output that has a pattern of characters or bytes....
- How to Parse a CSV File in Bash | Baeldung on Linux
  In this tutorial, we'll learn how to parse values from Comma-Separated Values (CSV) files with various Bash built-in utilities.
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
https://github.com/kevinzg/facebook-scraper/commit/c61f3487612de418bf81b69e8b5cc981eecb4e0f should harden CLI post extraction a bit, catching exceptions and then writing out the CSV anyway. I would also recommend increasing the timeout with the timeout parameter if you’re running into timeout issues.
That could be because the TemporarilyBanned exception is caught. It would still be logged in that case. Also note that the way the scraper detects temp bans is by looking for the following page titles:
If your FB language is set to something other than English, then the title wouldn’t match those.