
Scrapy csv file has uniform empty rows?


Note: Originally reported on StackOverflow: https://stackoverflow.com/questions/39477662/scrapy-csv-file-has-uniform-empty-rows

Here is the spider:

import scrapy
from danmurphys.items import DanmurphysItem


class MySpider(scrapy.Spider):
    name = 'danmurphys'
    allowed_domains = ['danmurphys.com.au']
    start_urls = ['https://www.danmurphys.com.au/dm/navigation/navigation_results_gallery.jsp?params=fh_location%3D%2F%2Fcatalog01%2Fen_AU%2Fcategories%3C%7Bcatalog01_2534374302084767_2534374302027742%7D%26fh_view_size%3D120%26fh_sort%3D-sales_value_30_days%26fh_modification%3D&resetnav=false&storeExclusivePage=false']

    def parse(self, response):
        # Follow every product link on the gallery page.
        urls = response.xpath('//h2/a/@href').extract()
        for url in urls:
            yield scrapy.Request(url, callback=self.parse_page)

    def parse_page(self, response):
        # Note: extract_first() returns None when the XPath matches nothing,
        # so .strip() would raise AttributeError on pages missing these fields.
        item = DanmurphysItem()
        item['brand'] = response.xpath('//span[@itemprop="brand"]/text()').extract_first().strip()
        item['name'] = response.xpath('//span[@itemprop="name"]/text()').extract_first().strip()
        item['url'] = response.url
        return item

And here is the items file:

import scrapy


class DanmurphysItem(scrapy.Item):
    brand = scrapy.Field()
    name = scrapy.Field()
    url = scrapy.Field()

When I run the spider with this command: scrapy crawl danmurphys -o output.csv, the output CSV has a blank row between every data row (the original post showed a screenshot, not reproduced here).

How can I avoid these uniform empty rows? (By the way, when I save to JSON there are no empty values.)
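For context, the usual cause of this symptom can be reproduced with the standard library alone (this illustration is not part of the original report): Python's csv module writes its own \r\n row terminator, and if the file object also translates \n to \r\n on Windows, every row ends in \r\r\n, which spreadsheet tools display as a blank line. The documented csv idiom is to open the file with newline='':

import csv

rows = [['brand', 'name'], ['Acme', 'Shiraz']]

# Without newline='', the file object may translate the csv module's
# '\r\n' terminator into '\r\r\n' on Windows -> a blank row per record.
with open('broken.csv', 'w') as f:
    csv.writer(f).writerows(rows)

# With newline='', the csv module alone controls line endings.
with open('fixed.csv', 'w', newline='') as f:
    csv.writer(f).writerows(rows)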

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

4 reactions
darshilparmar commented, May 7, 2018

Scrapy       : 1.5.0
lxml         : 4.2.1.0
libxml2      : 2.9.5
cssselect    : 1.0.3
parsel       : 1.4.0
w3lib        : 1.19.0
Twisted      : 17.5.0
Python       : 3.6.4 |Anaconda custom (64-bit)| (default, Jan 16 2018, 10:22:32) [MSC v.1900 64 bit (AMD64)]
pyOpenSSL    : 17.5.0 (OpenSSL 1.0.2o 27 Mar 2018)
cryptography : 2.1.4
Platform     : Windows-10-10.0.16299-SP0

I have the same issue.

1 reaction
redapple commented, Sep 14, 2016

@Ibrahim2311, what version of Scrapy are you using? (Paste the output of scrapy version -v.)


Top Results From Across the Web

Scrapy csv file has uniform empty rows? - Stack Overflow
To fix this in Scrapy 1.3, you can patch it by adding newline='' as parameter to io.TextIOWrapper in the __init__ method of the ... (see the sketch after this list)
Scrapy 0.24.4 - exports to csv, but the csv file is empty
I am using Scrapy 0.24.4, Python 2.7: things are fine when I scrape within Terminal. However, when I try to export it ...
Csv File Written With Python Has Blank Lines Between Each ...
One way to deal with empty cells is to remove rows that contain empty cells. This is usually ... In our cleaning examples we ...
Extracting certain products from a webpage using Scrapy
... csv -t csv). The results found in such CSV file are with a uniform gap between two lines, that means there is ...
4 Finding Web Elements | Web Scraping Using Selenium Python
This is a tutorial for using Selenium Python to scrape websites. ... When the looping is over, we write this string to the ...
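As referenced above, here is a minimal sketch of the patch the Stack Overflow snippet describes, assuming Scrapy 1.x on Python 3 (the class name and module path are hypothetical, not from the issue): subclass CsvItemExporter, rewrap the output file with newline='', and rebuild the csv writer on the corrected stream:

# danmurphys/exporters.py (hypothetical module path)
import csv
import io

from scrapy.exporters import CsvItemExporter


class NewlineFixedCsvItemExporter(CsvItemExporter):
    def __init__(self, file, **kwargs):
        super().__init__(file, **kwargs)
        # Detach the text wrapper the base class created so that its
        # garbage collection cannot close the underlying file.
        if hasattr(self.stream, 'detach'):
            self.stream.detach()
        # Rewrap with newline='' so the csv module's '\r\n' row terminators
        # are not doubled into '\r\r\n' by newline translation on Windows.
        self.stream = io.TextIOWrapper(
            file,
            line_buffering=False,
            write_through=True,
            encoding='utf-8',
            newline='',
        )
        # Rebuild the writer on the corrected stream (csv formatting
        # kwargs are omitted in this sketch).
        self.csv_writer = csv.writer(self.stream)

To make scrapy crawl danmurphys -o output.csv pick it up, the class can be registered in the project's settings.py via the standard FEED_EXPORTERS setting:

FEED_EXPORTERS = {
    'csv': 'danmurphys.exporters.NewlineFixedCsvItemExporter',
}

Newer Scrapy releases pass newline='' in CsvItemExporter themselves, so upgrading is often enough to avoid the override.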
