Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Tried to stop a LoopingCall that was not running

See original GitHub issue

I’m using MySQL to store my spider data, but when I set the pipeline to store data into my local MySQL server, it raises an error. pipelines.py:

import json
import requests
from mysql.connector import connection

MYSQL_SERVER = '192.168.1.90'   # using this, the spider can run
MYSQL_SERVER = 'localhost'   # using this, the spider raise an error
MYSQL_DB = 'scrapy'
MYSQL_USER = 'crawler'
MYSQL_PASS = 'crawl'
MYSQL_TABLE = 'pm25in'

class Pm25InPipeline(object):
    def __init__(self):
        pass

    def process_item(self, item, spider):
        command = '''insert into {table} (monitortime, monitorcity, monitorpoint,
            AQIindex, airsituation, primarypullutant, PM25content, PM10content,
            CO, NO2, O3_1h, O3_8h, SO2)
            values ( "{monitortime}", "{monitorcity}","{monitorpoint}", "{AQIindex}",
            "{airsituation}", "{primarypollutant}", {PM25content}, {PM10content},
            {CO}, {NO2}, {O3_1h}, {O3_8h}, {SO2} );
            '''.format(table=MYSQL_TABLE, **dict(item))
        self.cursor.execute(command)
        return item

    def open_spider(self, spider):
        self.cnx = connection.MySQLConnection(
            host=MYSQL_SERVER,
            user=MYSQL_USER,
            password=MYSQL_PASS,
            database=MYSQL_DB,
            charset='utf8')
        self.cursor = self.cnx.cursor()

    def close_spider(self, spider):
        self.cnx.commit()
        self.cnx.close()
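
An aside on the snippet above (not the cause of this crash, as it turns out, but a common source of pipeline bugs): building SQL with str.format is fragile, since a stray quote in any scraped field breaks the statement. Here is a sketch of the same process_item using mysql-connector's parameterized queries, with the table and column names taken from the snippet; this is illustrative, not the original author's code:

# Inside Pm25InPipeline.
def process_item(self, item, spider):
    # %(name)s placeholders let the driver handle quoting and escaping.
    command = (
        'insert into pm25in (monitortime, monitorcity, monitorpoint, '
        'AQIindex, airsituation, primarypullutant, PM25content, PM10content, '
        'CO, NO2, O3_1h, O3_8h, SO2) '
        'values (%(monitortime)s, %(monitorcity)s, %(monitorpoint)s, '
        '%(AQIindex)s, %(airsituation)s, %(primarypollutant)s, '
        '%(PM25content)s, %(PM10content)s, %(CO)s, %(NO2)s, '
        '%(O3_1h)s, %(O3_8h)s, %(SO2)s)')
    self.cursor.execute(command, dict(item))
    return item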

The traceback:

wangx@wangx-PC:~/github/pm25in$ scrapy crawl pm25spider
Unhandled error in Deferred:


Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 163, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 167, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1126, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "/usr/local/lib/python2.7/dist-packages/twisted/python/failure.py", line 389, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 87, in crawl
    yield self.engine.close()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 100, in close
    return self._close_all_spiders()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 340, in _close_all_spiders
    dfds = [self.close_spider(s, reason='shutdown') for s in self.open_spiders]
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 298, in close_spider
    dfd = slot.close()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 44, in close
    self._maybe_fire_closing()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 51, in _maybe_fire_closing
    self.heartbeat.stop()
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/task.py", line 202, in stop
    assert self.running, ("Tried to stop a LoopingCall that was "
exceptions.AssertionError: Tried to stop a LoopingCall that was not running.
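
The assertion at the bottom is usually a secondary failure: when open_spider raises (here, because the MySQL connection to localhost is refused, as the comments below confirm), Scrapy tears the engine down before its heartbeat LoopingCall has started, and stopping it trips the assertion, which hides the original exception. A standalone connectivity check will surface the real error. This is a sketch reusing the credentials from the snippet above, and it assumes mysql-connector-python is installed:

from mysql.connector import connection, Error

try:
    cnx = connection.MySQLConnection(
        host='localhost',   # the value that triggers the failure
        user='crawler',
        password='crawl',
        database='scrapy',
        charset='utf8')
    print('Connected: %s' % cnx.is_connected())
    cnx.close()
except Error as exc:
    # This is the error the LoopingCall assertion was hiding.
    print('MySQL connection failed: %s' % exc)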

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Reactions: 2
  • Comments: 12

Top GitHub Comments

3 reactions
J-Hong commented, Jun 1, 2016

Hey, I had the same error log as yours, and my problem was that connection.MySQLConnection() was not connecting successfully.

Once I could connect to my MySQL server, the Scrapy crawl worked!

Hope this helps.
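
In other words, the AssertionError was masking a connection failure in open_spider. One way to make the real error visible is to log it and re-raise. This is a sketch against the Pm25InPipeline from the question; the logging call is an addition, not part of the original code:

import logging
from mysql.connector import connection, Error

class Pm25InPipeline(object):
    # Only open_spider is shown; the MYSQL_* constants are the ones
    # defined in the question's pipelines.py.
    def open_spider(self, spider):
        try:
            self.cnx = connection.MySQLConnection(
                host=MYSQL_SERVER, user=MYSQL_USER,
                password=MYSQL_PASS, database=MYSQL_DB, charset='utf8')
        except Error:
            # Log the real problem before Scrapy's shutdown path replaces
            # it with the LoopingCall assertion.
            logging.exception('Could not connect to MySQL at %s', MYSQL_SERVER)
            raise
        self.cursor = self.cnx.cursor()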

2 reactions
frkhit commented, Jul 5, 2016

I ran into this error when my pipeline.py had a bug.
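
One quick way to flush out such pipeline bugs is to drive the pipeline by hand, outside Scrapy. This is a sketch: the import path and the fake item below are hypothetical and must match your project layout and Item fields:

from pm25in.pipelines import Pm25InPipeline   # hypothetical module path

pipeline = Pm25InPipeline()
pipeline.open_spider(spider=None)   # raises immediately if MySQL is unreachable
fake_item = {
    'monitortime': '2016-06-01 10:00', 'monitorcity': 'Beijing',
    'monitorpoint': 'test', 'AQIindex': '50', 'airsituation': 'good',
    'primarypollutant': '-', 'PM25content': 10, 'PM10content': 20,
    'CO': 0.5, 'NO2': 15, 'O3_1h': 30, 'O3_8h': 40, 'SO2': 5,
}
pipeline.process_item(fake_item, spider=None)   # exercises the SQL formatting
pipeline.close_spider(spider=None)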

Read more comments on GitHub >

Top Results From Across the Web

Developers - How to solve this? Tried to stop a LoopingCall that was ...
Can someone help me? When I ran the example link_spider.py, I got some error like this: Unhandled error in Deferred: Traceback (most recent call...
Read more >
Twisted task.LoopingCall : Unhandled error in Deferred
When LoopingCall tries to call None an exception is raised. Since there are no error handlers attached, the exception is reported as an ...
Read more >
internet/task.py · hemamaps/Twisted - Gemfury
assert self.running, ("Tried to stop a LoopingCall that was " "not running.") self.running = False if self.call is not None: self.call.cancel() self.call ...
Read more >
twisted.internet.task.LoopingCall : API documentation
Start running function every interval seconds. Method, stop, Stop running function. Method, reset, Skip the next iteration and reset the timer. Method ...
Read more >
ironic.drivers.modules.console_utils - OpenStack Docs
ESRCH: msg = (_("Could not stop the console for node '%(node)s'. ... run the command as a subprocess try: LOG.debug('Running subprocess: %s', ...
Read more >
