Recursion depth exceeded when querying too many nodes
See original GitHub issue

After a couple of queries like

    db.Person.nodes.get(name=partner['name'])

I get a large traceback, as in the example below:
result = self._get(limit=2, lazy=lazy, **kwargs)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib/python3.6/site-packages/neomodel/match.py", line 539, in _get
self.filter(**kwargs)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib/python3.6/site-packages/neomodel/match.py", line 628, in filter
self.q_filters = Q(self.q_filters & Q(*args, **kwargs))
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib/python3.6/site-packages/neomodel/match_q.py", line 193, in __and__
return self._combine(other, self.AND)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib/python3.6/site-packages/neomodel/match_q.py", line 181, in _combine
return copy.deepcopy(other)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 161, in deepcopy
y = copier(memo)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib/python3.6/site-packages/neomodel/match_q.py", line 80, in __deepcopy__
obj.children = copy.deepcopy(self.children, memodict)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 215, in _deepcopy_list
append(deepcopy(a, memo))
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 220, in _deepcopy_tuple
y = [deepcopy(a, memo) for a in x]
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 220, in <listcomp>
y = [deepcopy(a, memo) for a in x]
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "/home/thore/.local/share/virtualenvs/flscrape-5IkkNlD1/lib64/python3.6/copy.py", line 150, in deepcopy
...
RecursionError: maximum recursion depth exceeded
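The recursion here happens inside copy.deepcopy itself, which descends one Python stack frame per level of the object graph it copies. As a minimal illustration of the mechanism (unrelated to neomodel), deep-copying a sufficiently nested structure triggers the same error:

    import copy

    # Build a list nested 20000 levels deep. The loop is iterative, so
    # constructing the structure is fine; only the copy recurses.
    root = []
    node = root
    for _ in range(20000):
        node.append([])
        node = node[0]

    copy.deepcopy(root)  # RecursionError: maximum recursion depth exceeded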
A way to cope with this is to set the maximum recursion depth to something larger than 10k. Then I only get terminated by signal SIGSEGV (Address boundary error).
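For reference, raising the limit looks like the following. It only defers the problem: once the recursion outgrows the C stack, the interpreter dies with SIGSEGV instead of raising a catchable RecursionError.

    import sys

    print(sys.getrecursionlimit())  # 1000 by default
    sys.setrecursionlimit(20000)    # lets deepcopy go deeper, at the risk of
                                    # overflowing the C stack (the SIGSEGV above)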
@ThoreKr No worries, you are welcome.

This particular behaviour is related to this issue. (And possibly one more issue whose impact I am trying to figure out at the moment, before mentioning it here.)

At the moment, the best strategy for use case scenarios such as yours is indeed to cast to str.

All the best
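In code, the suggested workaround is just a cast at the call site (a sketch using the Person model from the traceback; partner is assumed to be the dict from the original query):

    # Force the filter value to a plain str before neomodel deep-copies
    # the query filters.
    person = db.Person.nodes.get(name=str(partner['name']))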
@aanastasiou First of all - thank you so much for all the input and the extensive replies.
I’ve managed to trace the issue down to the evaluation of one specific node and debug that one further, until I stumbled upon two things:
- A call of the form Person.nodes.get(name=<Array>), i.e. an array passed as the filter value, which caused the same deepcopy recursion error.
- The name value being a NavigableString - which was rather confusing. But a str() around it fixed it.

NavigableString is something returned by BeautifulSoup. I haven’t found out yet why the type differed in that case - but that’s another issue.

So from my side this issue can be closed. One may ask whether such an error should be caught earlier, by checking whether the argument has a valid type. Though I’m not sure that is going to work out, since there might be other types as well (?).
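For anyone hitting the same thing, the type mismatch is easy to reproduce (a minimal sketch, assuming BeautifulSoup/bs4 is installed):

    from bs4 import BeautifulSoup

    soup = BeautifulSoup("<p>Alice</p>", "html.parser")
    name = soup.p.string

    print(type(name))       # <class 'bs4.element.NavigableString'>
    print(name == "Alice")  # True - behaves like a str, hence the confusion
    print(type(str(name)))  # <class 'str'> - the cast that fixes the query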
Again - thanks for all the help and input - I’m going to revise the ETL process.