Could not initialize class com.johnsnowlabs.util.ConfigHelper$
Receiving an error when trying to load a pretrained model from HDFS.
Description
I loaded the offline pretrained model files into HDFS. Using the model in code, e.g. `bert = BertEmbeddings.load(<path>)`, throws the error "Could not initialize class com.johnsnowlabs.util.ConfigHelper$".
Expected Behavior
It should load the pretrained model from the uncompressed files in HDFS.
Current Behavior
Receiving the error message:

```
Py4JJavaError: An error occurred while calling None.com.johnsnowlabs.nlp.embeddings.BertEmbeddings.
: java.lang.NoClassDefFoundError: Could not initialize class com.johnsnowlabs.util.ConfigHelper$
```
Possible Solution
The reference to the offline model might be wrong, or something may need to be updated in the configuration. A hedged way to check the path is sketched below.
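This is only an illustration, not a confirmed fix: assuming an active `spark` session, it uses Spark's Hadoop FileSystem API to confirm the model directory is visible before calling `load()` with a fully qualified HDFS URI.

```python
# A minimal sketch, assuming an active SparkSession named `spark`: verify the
# model directory is visible in HDFS, then load it with a fully qualified URI
# in case a bare path is being resolved against the wrong filesystem.
from sparknlp.annotator import BertEmbeddings

model_path = "hdfs:///user/xxx/bert_base_cased_en_2.4.0_2.4_1580579557778"

jvm = spark._jvm
fs = jvm.org.apache.hadoop.fs.FileSystem.get(spark._jsc.hadoopConfiguration())
print("Model directory exists:", fs.exists(jvm.org.apache.hadoop.fs.Path(model_path)))

bert = BertEmbeddings.load(model_path)
```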
Steps to Reproduce
- Import all Spark NLP libraries and start a session:

```python
from sparknlp.base import *
from sparknlp.annotator import *
from sparknlp.common import *
import sparknlp

spark = sparknlp.start()
```
- Create the document assembler:

```python
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("document")
```

- Load the pretrained model from the HDFS path:

```python
bert = BertEmbeddings.load("/user/xxx/bert_base_cased_en_2.4.0_2.4_1580579557778") \
    .setInputCols(["document"]) \
    .setOutputCol("bert") \
    .setCaseSensitive(False) \
    .setPoolingLayer(0)
```
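For completeness, here is a sketch of how the loaded stages would be exercised; the sample text and variable names are illustrative, and in this report the failure already occurs at `BertEmbeddings.load()`, so this step is never reached.

```python
# Assemble the stages into a pipeline and run it on a one-row DataFrame.
from pyspark.ml import Pipeline

pipeline = Pipeline(stages=[document_assembler, bert])
data = spark.createDataFrame([("Spark NLP loads models from HDFS.",)], ["text"])
result = pipeline.fit(data).transform(data)

# Each annotation in the "bert" column carries one token's embedding vector.
result.selectExpr("explode(bert) as annotation").select("annotation.embeddings").show(1)
```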
Context
Trying to apply ClassifierDL with word embeddings and sentence embeddings (USE). ClassifierDL is new to me; fixing this issue will enable its use for many different applications.
Your Environment
- Spark NLP version (`sparknlp.version()`): 2.4.5
- Apache Spark version (`spark.version`): 2.3.2.3.1.0.0-78
- Java version (`java -version`): openjdk version "1.8.0_282", OpenJDK Runtime Environment (build 1.8.0_282-b08), OpenJDK 64-Bit Server VM (build 25.282-b08, mixed mode)
- Setup and installation (PyPI, Conda, Maven, etc.): PySpark
- Operating System and version: Hadoop cluster
- Link to your project (if any):
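One hedged observation on these versions: Spark NLP 2.4.x is built against Apache Spark 2.4, while the cluster above reports Spark 2.3.2, and that kind of mismatch is a common source of `NoClassDefFoundError`. If the cluster can run Spark 2.4, one option is pinning the matching jar explicitly when building the session; in the sketch below the app name is made up, and the Maven coordinates are the standard ones for Spark NLP 2.4.5 on Scala 2.11.

```python
# A sketch, assuming a Spark 2.4 cluster: pin the Spark NLP package explicitly
# so the driver and executors load a jar built for this Spark version.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("spark-nlp-hdfs-load")
         .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp_2.11:2.4.5")
         .getOrCreate())
```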
Thank you for the help.
@maziyarpanahi, truly appreciate your help with this. Thank you again. I will follow the steps you mentioned and will get back to you soon.
Thank you very much, @maziyarpanahi. Getting the answers… in a few minutes.