Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

"TypeError: can't pickle _thread.RLock objects" on usage of BigQueryOperator

See original GitHub issue

Apache Airflow version: 1.10.2

Kubernetes version: 1.14.10-gke.27

Environment: GC Composer

  • Cloud provider or hardware configuration: GCP

What happened: The “clear” option crashes the UI.

Error message: TypeError: can't pickle _thread.RLock objects

What you expected to happen: I expected Airflow to clear the task and its dependencies, then re-run the task.

What do you think went wrong? The error occurs whenever I include a BigQueryOperator from airflow.contrib.operators.bigquery_operator in my DAG.

How to reproduce it: I am not sure how to reproduce this error in any other environment. A video explaining the bug is linked here: Video explanation of the bug. A minimal illustrative DAG is sketched below.
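
For reference, a minimal DAG along these lines should exercise the same code path on Airflow 1.10.x. This is a sketch only; the dag_id, task_id, query, and connection ID are illustrative and not taken from the original report.

    # Minimal illustrative DAG (all names and IDs below are hypothetical).
    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.bigquery_operator import BigQueryOperator

    with DAG(
        dag_id="bq_clear_repro",
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Clearing this task from the UI triggers copy.deepcopy() of the whole DAG
        # (via DAG.sub_dag()), which is where the RLock pickling error surfaces.
        bq_task = BigQueryOperator(
            task_id="run_bq_query",
            sql="SELECT 1",
            use_legacy_sql=False,
            bigquery_conn_id="google_cloud_default",
        )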

How often does this problem occur? Every time I try to add the BigQueryOperator to this particular DAG.

Possibly similar issues:

UI Crash Logs

Ooops. (Airflow webserver error page banner)

Node: 788ee3e5c207

Traceback (most recent call last):
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1982, in wsgi_app
    response = self.full_dispatch_request()
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1614, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1517, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/opt/python3.6/lib/python3.6/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1612, in full_dispatch_request
    rv = self.dispatch_request()
  File "/opt/python3.6/lib/python3.6/site-packages/flask/app.py", line 1598, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_admin/base.py", line 69, in inner
    return self._run_view(f, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_admin/base.py", line 368, in _run_view
    return fn(self, *args, **kwargs)
  File "/opt/python3.6/lib/python3.6/site-packages/flask_login/utils.py", line 258, in decorated_view
    return func(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/utils.py", line 281, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/utils.py", line 328, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/airflow/airflow/www/views.py", line 1166, in clear
    include_upstream=upstream)
  File "/usr/local/lib/airflow/airflow/models.py", line 4166, in sub_dag
    dag = copy.deepcopy(self)
  File "/opt/python3.6/lib/python3.6/copy.py", line 161, in deepcopy
    y = copier(memo)
  File "/usr/local/lib/airflow/airflow/models.py", line 4151, in __deepcopy__
    setattr(result, k, copy.deepcopy(v, memo))
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 161, in deepcopy
    y = copier(memo)
  File "/usr/local/lib/airflow/airflow/models.py", line 2874, in __deepcopy__
    setattr(result, k, copy.deepcopy(v, memo))
  File "/opt/python3.6/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/opt/python3.6/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/opt/python3.6/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/opt/python3.6/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 215, in _deepcopy_list
    append(deepcopy(a, memo))
  File "/opt/python3.6/lib/python3.6/copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "/opt/python3.6/lib/python3.6/copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "/opt/python3.6/lib/python3.6/copy.py", line 169, in deepcopy
    rv = reductor(4)
TypeError: can't pickle _thread.RLock objects
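
The traceback shows that the crash starts in DAG.sub_dag(), which deep-copies the entire DAG (copy.deepcopy(self)) when tasks are cleared from the web UI; the copy then fails because some attribute reachable from a task holds a _thread.RLock, which cannot be pickled or deep-copied. As a rough sanity check, the same failure can be reproduced outside the webserver by deep-copying the DAG directly. This is a sketch only, assuming an Airflow 1.10.x environment with the DAG from the earlier sketch in the local DAGs folder; "bq_clear_repro" is that sketch's hypothetical dag_id.

    # Sketch: reproduce the deepcopy failure without going through the UI.
    import copy

    from airflow.models import DagBag

    dag = DagBag().get_dag("bq_clear_repro")  # hypothetical dag_id from the sketch above

    try:
        # This is essentially what the "clear" view does via DAG.sub_dag().
        copy.deepcopy(dag)
        print("DAG is deep-copyable; clearing from the UI should not hit this error.")
    except TypeError as exc:
        # e.g. "can't pickle _thread.RLock objects" when a task holds a lock-bearing object
        print(f"DAG is not deep-copyable: {exc}")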

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 7
  • Comments: 16 (4 by maintainers)

Top GitHub Comments

1 reaction
ImadYIdrissi commented, May 1, 2020

@mik-laj Isn't the master branch at version 1.10.10? If so, these bugs have probably been solved. But since Composer is still at v1.10.3, these issues will likely keep occurring. I am seriously considering deploying my own VM or cluster and running Airflow on it. Although a managed service sounds good in theory, the overhead of trying to fix already-fixed problems keeps growing.

0 reactions
github-actions[bot] commented, Feb 7, 2022

This issue has been closed because it has not received a response from the issue author.

Read more comments on GitHub >

Top Results From Across the Web

Airflow (Google Composer) TypeError: can't pickle _thread ...
What have I tried? Clearing the task from the Airflow UI did not work; running a backfill from the command line did not work; ...
Read more >
[GitHub] [airflow] dgies commented on issue #8541: "TypeError
RLock objects" on usage of BigQueryOperator ... Any attempt to clear tasks using downstream fails if the DAG uses GCP Dataproc operators.
Read more >
cannot pickle '_thread.lock' object - You.com | The AI Search ...
I have a problem using pool.map and an instance method in my class, and I get this TypeError: cannot pickle '_thread.lock' object ....
Read more >
2052037 – TypeError: can't pickle _thread.RLock objects ...
Description of problem: I'm working on an improvement for `sos clean` (part of sosreport, which support and customers use heavily every day) ...
Read more >
How to set sparkTrials? I am receiving this TypeError
I am trying to distribute hyperparameter tuning using hyperopt on a tensorflow.keras ... TypeError: cannot pickle '_thread.lock' object.
Read more >
