SPARQLConnector error when building locally
🐛 Bug
When I build this project locally (with pip install .), I get the following error message:
TypeError: unhashable type: 'SPARQLConnector'
I do not get this error message when I install this project remotely (with pip install pyRDF2Vec).
Expected Behavior
No error, regardless of whether I build locally or install from a remote release.
Current Behavior
The error message shown above.
Steps to Reproduce
- Clone this repository.
- Enter the directory of the resulting clone and run pip install .
- Run python main.py with the following content in file main.py:
import pandas as pd
from pyrdf2vec import RDF2VecTransformer
from pyrdf2vec.embedders import Word2Vec
from pyrdf2vec.graphs import KG
from pyrdf2vec.walkers import RandomWalker

# Load the entities to embed from the sample TSV file.
data = pd.read_csv("https://raw.githubusercontent.com/IBCNServices/pyRDF2Vec/master/samples/countries-cities/entities.tsv", sep="\t")
entities = [entity for entity in data["location"]]

# Use the public DBpedia SPARQL endpoint as the knowledge graph.
knowledge_graph = KG("https://dbpedia.org/sparql")
transformer = RDF2VecTransformer(
    Word2Vec(epochs=10),
    walkers=[RandomWalker(4, 10, with_reverse=False, n_jobs=2)])
# The TypeError below is raised during this call.
embeddings, literals = transformer.fit_transform(knowledge_graph, entities)
- Observe the following traceback:
Traceback (most recent call last):
File "dbpedia.py", line 12, in <module>
embeddings, literals = transformer.fit_transform(knowledge_graph, entities)
File "/home/wouter/.local/lib/python3.8/site-packages/pyrdf2vec/rdf2vec.py", line 146, in fit_transform
self.fit(kg, entities, is_update)
File "/home/wouter/.local/lib/python3.8/site-packages/pyrdf2vec/rdf2vec.py", line 107, in fit
walks = self.get_walks(kg, entities)
File "/home/wouter/.local/lib/python3.8/site-packages/pyrdf2vec/rdf2vec.py", line 166, in get_walks
if kg.skip_verify is False and not kg.is_exist(entities):
File "/home/wouter/.local/lib/python3.8/site-packages/pyrdf2vec/graphs/kg.py", line 374, in is_exist
responses = [self.connector.fetch(query) for query in queries]
File "/home/wouter/.local/lib/python3.8/site-packages/pyrdf2vec/graphs/kg.py", line 374, in <listcomp>
responses = [self.connector.fetch(query) for query in queries]
File "/home/wouter/.local/lib/python3.8/site-packages/cachetools/__init__.py", line 686, in wrapper
return c[k]
File "/home/wouter/.local/lib/python3.8/site-packages/cachetools/__init__.py", line 414, in __getitem__
link = self.__getlink(key)
File "/home/wouter/.local/lib/python3.8/site-packages/cachetools/__init__.py", line 501, in __getlink
value = self.__links[key]
File "/home/wouter/.local/lib/python3.8/site-packages/cachetools/keys.py", line 19, in __hash__
self.__hashvalue = hashvalue = hash(self)
TypeError: unhashable type: 'SPARQLConnector'
Environment
- Operating system: Ubuntu 20.04.4 LTS
- pyRDF2Vec version: 0.2.3
- Python version: 3.8.10
Additional remarks
I see the same error message in #64, but the problem description there is very different (use of a specific IDE IIUC).
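For what it is worth, the traceback shows that the TypeError is raised while cachetools hashes a cache key that contains the SPARQLConnector instance itself, i.e. the instance is not hashable. Below is a minimal sketch of that failure mode; it is an assumption on my side, deliberately does not use pyRDF2Vec code, and the Connector class in it is purely hypothetical:

# Hypothetical, self-contained sketch (not pyRDF2Vec code) of the failure mode
# suggested by the traceback: a class that defines __eq__ without __hash__ is
# unhashable, so a cachetools key containing such an instance cannot be hashed.
from cachetools import TTLCache
from cachetools.keys import hashkey

class Connector:  # stand-in for SPARQLConnector, purely illustrative
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def __eq__(self, other):  # defining __eq__ without __hash__ makes instances unhashable
        return isinstance(other, Connector) and self.endpoint == other.endpoint

cache = TTLCache(maxsize=1024, ttl=1200)
key = hashkey(Connector("https://dbpedia.org/sparql"), "ASK { ?s ?p ?o }")
cache[key] = "response"  # hashing the key raises: TypeError: unhashable type: 'Connector'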

@rememberYou Running pip install . on the last commit in master indeed works for me. Thanks for fixing this!

The commit d039a702c9b072983b5f60ba7892b8d7a2cf1993 on the develop branch has fixed the issue with cachetools. If you clone master, it should work too, since a git reset was done there to an older version of cachetools. @wouterbeek Does it work on your side?
Using the develop branch, I'm able to extract the walks from the TriplyDB endpoint without issues. I haven't tested with the master branch.
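Since the two install paths apparently pull in different cachetools releases, a quick way to confirm which one a given environment actually resolved (cachetools exposes a package-level __version__ attribute):

# Print the installed cachetools release; comparing the output between the
# "pip install ." environment and the "pip install pyRDF2Vec" environment
# should show whether the difference is indeed the cachetools version.
import cachetools

print(cachetools.__version__)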