ValueError from FastText
Hello. I recently installed torchtext 0.3.1 and tqdm 4.32.1 and tried to use FastText, but the following error occurred:
torchtext.vocab.FastText()
  0%|          | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ros-slam/.local/lib/python3.5/site-packages/torchtext/vocab.py", line 411, in __init__
    super(FastText, self).__init__(name, url=url, **kwargs)
  File "/home/ros-slam/.local/lib/python3.5/site-packages/torchtext/vocab.py", line 280, in __init__
    self.cache(name, cache, url=url, max_vectors=max_vectors)
  File "/home/ros-slam/.local/lib/python3.5/site-packages/torchtext/vocab.py", line 370, in cache
    vectors[vectors_loaded] = torch.tensor([float(x) for x in entries])
  File "/home/ros-slam/.local/lib/python3.5/site-packages/torchtext/vocab.py", line 370, in <listcomp>
    vectors[vectors_loaded] = torch.tensor([float(x) for x in entries])
ValueError: could not convert string to float: b'version="1.0"'
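The failing token b'version="1.0"' suggests the downloaded file begins with an XML document rather than embedding rows. A quick way to check is to inspect the first bytes of the cached download; the cache path below is an assumption based on torchtext's default `.vector_cache` directory, and `looks_like_xml_error` is a hypothetical helper, not part of torchtext:

```python
# Hypothetical diagnostic (paths are assumptions): torchtext caches downloads
# in ./.vector_cache by default, so inspecting the first bytes of the cached
# file shows what the server actually returned.
from pathlib import Path

def looks_like_xml_error(first_bytes: bytes) -> bool:
    """True if a downloaded 'vectors' file starts with an XML document
    (e.g. an S3 error page) rather than whitespace-separated embeddings."""
    return first_bytes.lstrip().startswith(b"<?xml")

cache_file = Path(".vector_cache") / "wiki.en.vec"
if cache_file.exists():
    head = cache_file.read_bytes()[:200]
    print(head, looks_like_xml_error(head))
```

If the file starts with an XML error document, the download itself failed and the cached file should be deleted before retrying.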
To find the cause of the ValueError, I checked vocab.py:
class FastText(Vectors):

    url_base = 'https://s3-us-west-1.amazonaws.com/fasttext-vectors/wiki.{}.vec'

    def __init__(self, language="en", **kwargs):
        url = self.url_base.format(language)
        name = os.path.basename(url)
        super(FastText, self).__init__(name, url=url, **kwargs)
I suspect the error occurs while downloading the file from 'url_base'. How can this error be solved? I've searched here for reports of a similar problem, but unfortunately this doesn't appear to be a common case. I've also tried version 0.2.3, but the same thing happens.
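If the hard-coded S3 bucket in `url_base` no longer serves the vectors, one workaround is to point `torchtext.vocab.Vectors` at an explicit URL. The `dl.fbaipublicfiles.com` mirror below is an assumption (fastText's official download host at the time of writing), not something confirmed in this thread; `fasttext_url` is a hypothetical helper:

```python
# Sketch of a workaround, assuming the wiki vectors are served from the
# dl.fbaipublicfiles.com mirror (an assumption, not confirmed in this issue).
NEW_URL_BASE = "https://dl.fbaipublicfiles.com/fasttext/vectors-wiki/wiki.{}.vec"

def fasttext_url(language="en"):
    """Build the download URL for a given language code."""
    return NEW_URL_BASE.format(language)

print(fasttext_url("en"))

# Usage (requires torchtext and a network connection):
# import os
# from torchtext.vocab import Vectors
# url = fasttext_url("en")
# vectors = Vectors(name=os.path.basename(url), url=url)
```

`Vectors` accepts a `name` and an optional `url`, so no subclassing of `FastText` is needed for this approach.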
Thank you in advance for your reply.
Issue Analytics
- State:
- Created 4 years ago
- Comments: 13 (7 by maintainers)
Top GitHub Comments
I just did a quick test with torchtext-0.4.0 and didn't have a problem loading torchtext.vocab.FastText(). If possible, please clone the master branch from GitHub and install it with "python setup.py install". It's pretty straightforward.
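The source install suggested above can be sketched as follows (commands assume a Unix shell with git and Python available; the repository URL is torchtext's actual home, pytorch/text):

```shell
# Install torchtext from the master branch, as suggested in the comment above.
git clone https://github.com/pytorch/text.git
cd text
python setup.py install
```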
I just upgraded urllib3 and chardet and the warning disappeared, but the error still occurred.