hdfsBuilderConnect error
When training in Standalone mode, this problem occurs:
hdfsBuilderConnect(forceNewInstance=0, nn=master:9000, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
This error message repeats rapidly and fills the whole terminal within a second, so I cannot catch the message printed just before it.
What does this error mean?
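Not stated in the thread, but a common cause of this message: libhdfs, the C library that TensorFlow uses to read from HDFS, raises java.lang.NoClassDefFoundError when the Hadoop jars are missing from the JVM classpath of the process that opens the file. A quick check on a worker node, assuming HADOOP_HOME points at the Hadoop installation:

# Prints the absolute paths of every Hadoop jar; this is what libhdfs
# expects to find in CLASSPATH. If this fails or prints nothing, HDFS
# access from TensorFlow will fail as well.
${HADOOP_HOME}/bin/hadoop classpath --glob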
My spark-submit command:
${SPARK_HOME}/bin/spark-submit \
--py-files ${TFoS_HOME}/examples/mnist/spark/mnist_dist.py \
--conf spark.cores.max=1 \
--conf spark.task.cpus=1 \
--conf spark.executorEnv.JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64" \
--conf spark.executorEnv.LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:$LIB_CUDA:/usr/local/hadoop/lib \
${TFoS_HOME}/examples/mnist/spark/mnist_spark.py \
--cluster_size 1 \
--images mnist/csv/train/images \
--labels mnist/csv/train/labels \
--format csv \
--mode train \
--model mnist_model
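A fix reported for similar setups, shown here as a sketch rather than a confirmed answer for this issue, is to hand the Hadoop classpath to the executors so libhdfs can load the HDFS client classes:

# Sketch: the same submit command with one extra conf (assumption: HADOOP_HOME
# points at the Hadoop installation on every node). The added line passes the
# absolute paths of the Hadoop jars to each executor, which is what libhdfs
# needs to avoid the NoClassDefFoundError.
${SPARK_HOME}/bin/spark-submit \
--py-files ${TFoS_HOME}/examples/mnist/spark/mnist_dist.py \
--conf spark.cores.max=1 \
--conf spark.task.cpus=1 \
--conf spark.executorEnv.JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64" \
--conf spark.executorEnv.LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/amd64/server:$LIB_CUDA:/usr/local/hadoop/lib \
--conf spark.executorEnv.CLASSPATH="$(${HADOOP_HOME}/bin/hadoop classpath --glob)" \
${TFoS_HOME}/examples/mnist/spark/mnist_spark.py \
--cluster_size 1 \
--images mnist/csv/train/images \
--labels mnist/csv/train/labels \
--format csv \
--mode train \
--model mnist_model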
And the Spark configuration (spark-env.sh) is as follows:
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SCALA_HOME=/usr/local/scala
export SPARK_MASTER_IP=10.0.3.183
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_MEMORY=4G
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_HDFS_HOME=/home/hduser/mydata
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_EXECUTOR_INSTANCES=1
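If the jars are on the classpath and the error persists, another thing worth checking (an assumption, not something confirmed in this thread) is the native library path: libhdfs has to load libjvm.so from LD_LIBRARY_PATH, and the Hadoop native libraries normally live in $HADOOP_HOME/lib/native rather than $HADOOP_HOME/lib. A possible addition to spark-env.sh, with paths based on a standard layout:

# Assumed paths; adjust to the actual layout on this cluster.
export LD_LIBRARY_PATH=${JAVA_HOME}/jre/lib/amd64/server:${HADOOP_HOME}/lib/native:${LD_LIBRARY_PATH}
export CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath --glob)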
And I have changed the code of mnist_spark.py at line 23 to num_ps = 0 (because otherwise I get another error, Job "ps" was not defined in cluster, and after that an endless stream of "waiting for 1 reservations" messages).
Issue Analytics
- Created 5 years ago
- Comments: 9 (2 by maintainers)
Top Results From Across the Web

hdfsBuilderConnect error while using tfserving load model ...
I solved this problem by adding the Hadoop absolute path to the classpath.

[#ARROW-13535] connect hdfs error - ASF JIRA
When I use pyarrow to connect to my HDFS, I meet an error ... NoClassDefFoundError) hdfsBuilderConnect(forceNewInstance=1, ...

Hadoop C++ HDFS test running Exception
loadFileSystems error: (unable to get stack trace for java.lang.NoClassDefFoundError exception ... hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, ...

hdfs.h source code [ClickHouse/contrib/libhdfs3/src/client/...]
* @return Returns a handle to the filesystem, or NULL on error. */ hdfsFS hdfsBuilderConnect(struct hdfsBuilder *bld);

hadoop-hdfs-project/hadoop-hdfs/src/main/native/libhdfs/...
"hdfsBuilderConnect(%s): error setting conf '%s' to '%s'", hdfsBuilderToStr(bld, buf, sizeof(buf)), opt->key, opt->val); goto done;
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@OngHoeYuan0214 I think it might be fixed if you add this to your spark command-line:
Hello, how did you solve the hdfsBuilderConnect problem? I have already exported the Hadoop classpath in my Dockerfile. Here is my Dockerfile: and when I run: docker run -p 9001:9000 --name tensorflow-serving-11 -e MODEL_NAME=tfrest -e MODEL_BASE_PATH=hdfs://xxx:xxx/user/cess2_test/workspace/cess/models -t tensorflow_serving:1.14-hadoop-2.8.2 this problem occurs:
So could you please tell me in detail how you solved this problem?
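For the TensorFlow Serving question above, the fix usually reported elsewhere (a sketch built on assumed paths, not taken from this thread) is to compute the classpath when the container starts, because an ENV line in a Dockerfile is not expanded with command substitution and cannot run hadoop classpath at build time:

#!/bin/bash
# entrypoint.sh (illustrative sketch): build the Hadoop classpath at container
# start so libhdfs can find the HDFS client jars, then launch the model server.
# The Hadoop install location and the port are assumptions.
export CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath --glob)
exec tensorflow_model_server \
  --port=9000 \
  --model_name=${MODEL_NAME} \
  --model_base_path=${MODEL_BASE_PATH}

Then point the image's ENTRYPOINT at a script like this instead of starting tensorflow_model_server directly.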