
corenlp.py CoreNLPServer throws TypeError exception


Hello,

Here’s the code:

>>> s = nltk.parse.corenlp.CoreNLPServer(path_to_jar='/usr/local/share/stanford/stanford-corenlp-3.8.0.jar', path_to_models_jar='/usr/local/share/stanford/stanford-english-corenlp-2017-06-09-models.jar')
Traceback (most recent call last):
  File "<input>", line 1, in <module>
    s = nltk.parse.corenlp.CoreNLPServer(path_to_jar='/usr/local/share/stanford/stanford-corenlp-3.8.0.jar', path_to_models_jar='/usr/local/share/stanford/stanford-english-corenlp-2017-06-09-models.jar')
  File "/Users/adiep/feedback-sentiment/.env/src/nltk/nltk/parse/corenlp.py", line 69, in __init__
    key=lambda model_name: re.match(self._JAR, model_name)
TypeError: '>' not supported between instances of 'NoneType' and 'NoneType'

The max function is throwing this exception.

I think what's happening is that key=lambda model_name: re.match(self._JAR, model_name) returns None because the pattern doesn't match anything, so every key value is None and max cannot compare them. I confirmed that the match fails for my jar path:

>>> type(re.match(r'stanford-corenlp-(\d+)\.(\d+)\.(\d+)\.jar', '/usr/local/share/stanford/stanford-corenlp-3.8.0.jar'))
<class 'NoneType'>

Thanks,

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 21 (6 by maintainers)

Top GitHub Comments

4 reactions
f0lie commented, Jun 27, 2017

Good lord, now it’s working.

So overall, I had to change re.match to re.search (and call .group() on the result), and kill a stray CoreNLP server running in the background.

Maybe there should be some code to detect whether other CoreNLP servers are already running? Don’t know if it’s even worth doing.
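A check like that could be sketched with a plain socket probe. This is only an illustration, not NLTK code; the helper name `port_in_use` and the default port 9000 are assumptions here:

```python
import socket

def port_in_use(port, host='localhost'):
    """Return True if something is already listening on host:port.

    Hypothetical helper -- not part of NLTK; sketched only to show how
    an already-running CoreNLP server could be detected before starting
    a new one.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when something is listening on that port.
        return sock.connect_ex((host, port)) == 0

# CoreNLP listens on port 9000 by default.
if port_in_use(9000):
    print('Something is already listening on port 9000')
```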

2 reactions
dimazest commented, Oct 31, 2017

Hi, I’m sorry that there is no documentation. The API has changed; here is what you need to do. Refer to #1510, which contains a long discussion on how to start a server.

The main change: NLTK no longer starts the CoreNLP server; you need to start it yourself. Refer to https://stanfordnlp.github.io/CoreNLP/corenlp-server.html for a detailed explanation, but the command should be something like:

# Run the server using all jars in the current directory (e.g., the CoreNLP home directory)
java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 15000

Open http://localhost:9000/ to make sure that the server is running.
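The same check can be done programmatically. A small sketch, assuming the server was started with the default -port 9000 as in the command above (the helper name is mine, not part of NLTK):

```python
import urllib.request

def corenlp_server_up(url='http://localhost:9000/', timeout=5):
    """Return True if an HTTP server answers at the given URL.

    Illustrative helper only; assumes the default CoreNLP port 9000.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any non-error HTTP response means the server is reachable.
            return resp.status < 500
    except OSError:
        return False

print('CoreNLP server reachable?', corenlp_server_up())
```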

Now you are ready to use the NLTK CoreNLP client:

from nltk.tag.stanford import CoreNLPNERTagger

tagger = CoreNLPNERTagger(url='http://localhost:9000')
tokens = tagger.tag(text)  # text is a tokenized sentence (a list of strings)

Please let me know if it works.
