
Encountering java.lang.NullPointerException when displaying Bert transformations

See original GitHub issue

Hello,

My setup is a single laptop running Kubuntu 20.10 (Linux kernel 5.8.0-55-generic) on an Intel Core i5-7200U CPU (4 cores) with 5.7 GB of RAM available.

On this modest machine, I am trying to learn how to set up a standalone Spark cluster and submit a PySpark job that uses Spark NLP.

I based my work on https://github.com/JohnSnowLabs/spark-nlp-workshop/blob/master/tutorials/blogposts/3.NER_with_BERT.ipynb

I installed Spark 3.0.2 on this machine under the home directory and set SPARK_HOME in my environment variables as necessary. Once done, I ran the start-master.sh script from Spark's sbin directory, and the master launched successfully. Then I launched a worker on the same machine, and it registered with the master with 4 cores and 4.7 GB of RAM. On this setup, I was able to successfully run the Pi approximation example from Spark's website.

Next, on the same machine, I created another directory and set up a virtual environment. pip packages installed in this venv: numpy==1.20.3 py4j==0.10.9 pyspark==3.0.2 spark-nlp==3.1.0 sparknlp==1.0.0

I launched Python 3.8.6 from this virtual environment and ran the following script:

from pyspark.sql import SparkSession

spark = SparkSession.builder\
    .master("spark://rajan-X556URK:7077")\
    .appName("nerexample")\
    .config("spark.driver.memory", "4G")\
    .config("spark.executor.memory", "4G")\
    .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp_2.12:3.1.0")\
    .getOrCreate() 
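One thing worth double-checking in the spark.jars.packages coordinate above: the artifact's Scala suffix (_2.12 here) must match the Scala build of the Spark distribution, and the jar version should match the pip-installed spark-nlp wheel. A small, Spark-free sketch of that consistency check (the check_coordinate helper is made up for illustration):

```python
def check_coordinate(coordinate, scala_version, pip_version):
    """Split a Maven coordinate group:artifact:version and verify that the
    artifact's Scala suffix and its version match the local installation."""
    group, artifact, version = coordinate.split(":")
    scala_ok = artifact.endswith("_" + scala_version)
    version_ok = version == pip_version
    return scala_ok and version_ok

# The coordinate used in the session above, checked against Scala 2.12
# (Spark 3.0.2's build) and the pip-installed spark-nlp==3.1.0.
ok = check_coordinate("com.johnsnowlabs.nlp:spark-nlp_2.12:3.1.0", "2.12", "3.1.0")
```

This is only a string check on the coordinate; it does not verify which classes the executors actually loaded.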

import sparknlp
from sparknlp.annotator import *
from sparknlp.base import *

from urllib.request import urlretrieve

urlretrieve('https://github.com/JohnSnowLabs/spark-nlp/raw/master/src/test/resources/conll2003/eng.train',
           'eng.train')

urlretrieve('https://github.com/JohnSnowLabs/spark-nlp/raw/master/src/test/resources/conll2003/eng.testa',
           'eng.testa') 

bert_annotator = BertEmbeddings.pretrained('small_bert_L2_128', 'en') \
    .setInputCols(["sentence", "token"]) \
    .setOutputCol("bert") \
    .setBatchSize(8)

from sparknlp.training import CoNLL

test_data = CoNLL().readDataset(spark, '/home/w/Assignments/ner/eng.testa')

test_data = bert_annotator.transform(test_data)

test_data.show(3)

As soon as I execute the test_data.show(3) line, I get a java.lang.NullPointerException.
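One detail that may matter when reading the trace: transform() is lazy in Spark, so the exception only surfaces at the first action (show()), even when the root cause lies in the earlier pipeline setup. A toy illustration of the same deferred-failure pattern in plain Python, with no Spark involved (rows and lazy_transform are made up):

```python
def lazy_transform(rows):
    """Yield transformed rows lazily; nothing executes until consumed."""
    for row in rows:
        yield row["text"].upper()   # raises if a row is malformed

rows = [{"text": "ok"}, {"wrong_key": "boom"}]
pipeline = lazy_transform(rows)     # no error yet: generators are lazy
first = next(pipeline)              # first row is fine
# A second next(pipeline) would raise KeyError here, at "action" time,
# even though the bad row existed when the pipeline was built.
```

The takeaway: the failing line in a Spark trace is where evaluation was forced, not necessarily where the problem was introduced.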

Below is the log from this worker's stderr file:

Spark Executor Command: "/usr/lib/jvm/java-11-openjdk-amd64/bin/java" "-cp" "/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/conf/:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/jars/*" "-Xmx4096M" "-Dspark.driver.port=34205" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@192.168.2.103:34205" "--executor-id" "0" "--hostname" "192.168.2.103" "--cores" "4" "--app-id" "app-20210611204208-0009" "--worker-url" "spark://Worker@192.168.2.103:44535"
========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/06/11 20:42:09 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 15317@rajan-X556URK
21/06/11 20:42:09 INFO SignalUtils: Registered signal handler for TERM
21/06/11 20:42:09 INFO SignalUtils: Registered signal handler for HUP
21/06/11 20:42:09 INFO SignalUtils: Registered signal handler for INT
21/06/11 20:42:09 WARN Utils: Your hostname, rajan-X556URK resolves to a loopback address: 127.0.1.1; using 192.168.2.103 instead (on interface wlp3s0)
21/06/11 20:42:09 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/jars/spark-unsafe_2.12-3.0.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
21/06/11 20:42:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/06/11 20:42:10 INFO SecurityManager: Changing view acls to: w
21/06/11 20:42:10 INFO SecurityManager: Changing modify acls to: w
21/06/11 20:42:10 INFO SecurityManager: Changing view acls groups to: 
21/06/11 20:42:10 INFO SecurityManager: Changing modify acls groups to: 
21/06/11 20:42:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(w); groups with view permissions: Set(); users  with modify permissions: Set(w); groups with modify permissions: Set()
21/06/11 20:42:10 INFO TransportClientFactory: Successfully created connection to /192.168.2.103:34205 after 95 ms (0 ms spent in bootstraps)
21/06/11 20:42:10 INFO SecurityManager: Changing view acls to: w
21/06/11 20:42:10 INFO SecurityManager: Changing modify acls to: w
21/06/11 20:42:10 INFO SecurityManager: Changing view acls groups to: 
21/06/11 20:42:10 INFO SecurityManager: Changing modify acls groups to: 
21/06/11 20:42:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(w); groups with view permissions: Set(); users  with modify permissions: Set(w); groups with modify permissions: Set()
21/06/11 20:42:10 INFO TransportClientFactory: Successfully created connection to /192.168.2.103:34205 after 3 ms (0 ms spent in bootstraps)
21/06/11 20:42:10 INFO DiskBlockManager: Created local directory at /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/blockmgr-d8c52b91-a0ef-49fb-8712-7116a0410c3b
21/06/11 20:42:11 INFO MemoryStore: MemoryStore started with capacity 2.2 GiB
21/06/11 20:42:11 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@192.168.2.103:34205
21/06/11 20:42:11 INFO WorkerWatcher: Connecting to worker spark://Worker@192.168.2.103:44535
21/06/11 20:42:11 INFO ResourceUtils: ==============================================================
21/06/11 20:42:11 INFO ResourceUtils: Resources for spark.executor:

21/06/11 20:42:11 INFO ResourceUtils: ==============================================================
21/06/11 20:42:11 INFO TransportClientFactory: Successfully created connection to /192.168.2.103:44535 after 31 ms (0 ms spent in bootstraps)
21/06/11 20:42:11 INFO WorkerWatcher: Successfully connected to spark://Worker@192.168.2.103:44535
21/06/11 20:42:11 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
21/06/11 20:42:11 INFO Executor: Starting executor ID 0 on host 192.168.2.103
21/06/11 20:42:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35511.
21/06/11 20:42:11 INFO NettyBlockTransferService: Server created on 192.168.2.103:35511
21/06/11 20:42:11 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/06/11 20:42:11 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(0, 192.168.2.103, 35511, None)
21/06/11 20:42:11 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(0, 192.168.2.103, 35511, None)
21/06/11 20:42:11 INFO BlockManager: Initialized BlockManager: BlockManagerId(0, 192.168.2.103, 35511, None)
21/06/11 20:42:11 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar with timestamp 1623424325166
21/06/11 20:42:11 INFO TransportClientFactory: Successfully created connection to /192.168.2.103:34205 after 3 ms (0 ms spent in bootstraps)
21/06/11 20:42:12 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp16731401106283909142.tmp
21/06/11 20:42:12 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-19307619201623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar
21/06/11 20:42:12 INFO Executor: Fetching spark://192.168.2.103:34205/files/net.jcip_jcip-annotations-1.0.jar with timestamp 1623424325166
21/06/11 20:42:12 INFO Utils: Fetching spark://192.168.2.103:34205/files/net.jcip_jcip-annotations-1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp4135900916397329853.tmp
21/06/11 20:42:12 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/1155917211623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.jcip_jcip-annotations-1.0.jar
21/06/11 20:42:12 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.google.code.findbugs_annotations-3.0.1.jar with timestamp 1623424325166
21/06/11 20:42:12 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.google.code.findbugs_annotations-3.0.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp5390838511657707315.tmp
21/06/11 20:42:12 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/10453638051623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_annotations-3.0.1.jar
21/06/11 20:42:12 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar with timestamp 1623424325166
21/06/11 20:42:12 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp12163369454458897115.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-6753754811623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/org.projectlombok_lombok-1.16.8.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/org.projectlombok_lombok-1.16.8.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp546613331331570155.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/15471060871623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.projectlombok_lombok-1.16.8.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.typesafe_config-1.3.0.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.typesafe_config-1.3.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp501578203029232760.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-6243396901623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.typesafe_config-1.3.0.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/net.sf.trove4j_trove4j-3.0.3.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/net.sf.trove4j_trove4j-3.0.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp4294334457124108819.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-9179969801623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.sf.trove4j_trove4j-3.0.3.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/org.json4s_json4s-ext_2.12-3.5.3.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/org.json4s_json4s-ext_2.12-3.5.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp5724117738489913536.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-1785968311623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.json4s_json4s-ext_2.12-3.5.3.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.google.code.findbugs_jsr305-3.0.1.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.google.code.findbugs_jsr305-3.0.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp6507586328711846510.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-19147812741623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_jsr305-3.0.1.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/org.joda_joda-convert-1.8.1.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/org.joda_joda-convert-1.8.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp11192836114627213928.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-18183925021623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.joda_joda-convert-1.8.1.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/dk.brics.automaton_automaton-1.11-8.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/dk.brics.automaton_automaton-1.11-8.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp17414383452524692686.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/18002895341623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./dk.brics.automaton_automaton-1.11-8.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.navigamez_greex-1.0.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.navigamez_greex-1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp1310093016529474953.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/444129991623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.navigamez_greex-1.0.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.google.code.gson_gson-2.3.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.google.code.gson_gson-2.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp16952031653904177164.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-20852710581623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.gson_gson-2.3.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/it.unimi.dsi_fastutil-7.0.12.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/it.unimi.dsi_fastutil-7.0.12.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp5122682618647664079.tmp
21/06/11 20:42:15 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-5370007131623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./it.unimi.dsi_fastutil-7.0.12.jar
21/06/11 20:42:15 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar with timestamp 1623424325166
21/06/11 20:42:15 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp4638237247886531412.tmp
21/06/11 20:42:16 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-3144268511623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar
21/06/11 20:42:16 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.github.universal-automata_liblevenshtein-3.0.0.jar with timestamp 1623424325166
21/06/11 20:42:16 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.github.universal-automata_liblevenshtein-3.0.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp18408146982236201037.tmp
21/06/11 20:42:16 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/19900329611623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.github.universal-automata_liblevenshtein-3.0.0.jar
21/06/11 20:42:16 INFO Executor: Fetching spark://192.168.2.103:34205/files/org.slf4j_slf4j-api-1.7.21.jar with timestamp 1623424325166
21/06/11 20:42:16 INFO Utils: Fetching spark://192.168.2.103:34205/files/org.slf4j_slf4j-api-1.7.21.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp18433314345265653010.tmp
21/06/11 20:42:16 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/13339163381623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.slf4j_slf4j-api-1.7.21.jar
21/06/11 20:42:16 INFO Executor: Fetching spark://192.168.2.103:34205/files/org.rocksdb_rocksdbjni-6.5.3.jar with timestamp 1623424325166
21/06/11 20:42:16 INFO Utils: Fetching spark://192.168.2.103:34205/files/org.rocksdb_rocksdbjni-6.5.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp15154651623340219296.tmp
21/06/11 20:42:16 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/19889744071623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.rocksdb_rocksdbjni-6.5.3.jar
21/06/11 20:42:16 INFO Executor: Fetching spark://192.168.2.103:34205/files/joda-time_joda-time-2.9.5.jar with timestamp 1623424325166
21/06/11 20:42:16 INFO Utils: Fetching spark://192.168.2.103:34205/files/joda-time_joda-time-2.9.5.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp6878914735123238495.tmp
21/06/11 20:42:16 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-7077374021623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./joda-time_joda-time-2.9.5.jar
21/06/11 20:42:16 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.amazonaws_aws-java-sdk-bundle-1.11.603.jar with timestamp 1623424325166
21/06/11 20:42:16 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.amazonaws_aws-java-sdk-bundle-1.11.603.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp11428706676980878857.tmp
21/06/11 20:42:17 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/11445123081623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.amazonaws_aws-java-sdk-bundle-1.11.603.jar
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/files/com.google.protobuf_protobuf-java-3.0.0-beta-3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/files/com.google.protobuf_protobuf-java-3.0.0-beta-3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp2249757634022456047.tmp
21/06/11 20:42:17 INFO Utils: Copying /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-12346780511623424325166_cache to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-3.0.0-beta-3.jar
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/org.json4s_json4s-ext_2.12-3.5.3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/org.json4s_json4s-ext_2.12-3.5.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp4902258414204843486.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/6839329141623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.json4s_json4s-ext_2.12-3.5.3.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.json4s_json4s-ext_2.12-3.5.3.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/dk.brics.automaton_automaton-1.11-8.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/dk.brics.automaton_automaton-1.11-8.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp7723995488492432875.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/6345908611623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./dk.brics.automaton_automaton-1.11-8.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./dk.brics.automaton_automaton-1.11-8.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/net.jcip_jcip-annotations-1.0.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/net.jcip_jcip-annotations-1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp332592907644172826.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/14461652401623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.jcip_jcip-annotations-1.0.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.jcip_jcip-annotations-1.0.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/org.projectlombok_lombok-1.16.8.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/org.projectlombok_lombok-1.16.8.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp1709548010051135733.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/3280036381623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.projectlombok_lombok-1.16.8.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.projectlombok_lombok-1.16.8.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/net.sf.trove4j_trove4j-3.0.3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/net.sf.trove4j_trove4j-3.0.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp12992547080912692118.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/18958713891623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.sf.trove4j_trove4j-3.0.3.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./net.sf.trove4j_trove4j-3.0.3.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/org.joda_joda-convert-1.8.1.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/org.joda_joda-convert-1.8.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp16024886356109174200.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-21432645511623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.joda_joda-convert-1.8.1.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.joda_joda-convert-1.8.1.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.github.universal-automata_liblevenshtein-3.0.0.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.github.universal-automata_liblevenshtein-3.0.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp11719577668617794252.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-19939684301623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.github.universal-automata_liblevenshtein-3.0.0.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.github.universal-automata_liblevenshtein-3.0.0.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.navigamez_greex-1.0.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.navigamez_greex-1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp807262545140534729.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/18948526941623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.navigamez_greex-1.0.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.navigamez_greex-1.0.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/joda-time_joda-time-2.9.5.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/joda-time_joda-time-2.9.5.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp12279758982734310357.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-5516510511623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./joda-time_joda-time-2.9.5.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./joda-time_joda-time-2.9.5.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.google.protobuf_protobuf-java-3.0.0-beta-3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.google.protobuf_protobuf-java-3.0.0-beta-3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp12934127082379220752.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-4898677941623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-3.0.0-beta-3.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-3.0.0-beta-3.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp7551843316349076899.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/17717935161623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_spark-nlp_2.12-3.1.0.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.google.code.gson_gson-2.3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.google.code.gson_gson-2.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp8987546978536014081.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-7546975391623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.gson_gson-2.3.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.gson_gson-2.3.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/it.unimi.dsi_fastutil-7.0.12.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/it.unimi.dsi_fastutil-7.0.12.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp14662829152554853125.tmp
21/06/11 20:42:17 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-20180996401623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./it.unimi.dsi_fastutil-7.0.12.jar
21/06/11 20:42:17 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./it.unimi.dsi_fastutil-7.0.12.jar to class loader
21/06/11 20:42:17 INFO Executor: Fetching spark://192.168.2.103:34205/jars/org.rocksdb_rocksdbjni-6.5.3.jar with timestamp 1623424325166
21/06/11 20:42:17 INFO Utils: Fetching spark://192.168.2.103:34205/jars/org.rocksdb_rocksdbjni-6.5.3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp9949668037197273689.tmp
21/06/11 20:42:18 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/5078754801623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.rocksdb_rocksdbjni-6.5.3.jar
21/06/11 20:42:18 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.rocksdb_rocksdbjni-6.5.3.jar to class loader
21/06/11 20:42:18 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar with timestamp 1623424325166
21/06/11 20:42:18 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp9212643548963030178.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/694347761623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.johnsnowlabs.nlp_tensorflow-cpu_2.12-0.3.1.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.google.code.findbugs_jsr305-3.0.1.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.google.code.findbugs_jsr305-3.0.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp6313052555132831521.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-11647417711623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_jsr305-3.0.1.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_jsr305-3.0.1.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/org.slf4j_slf4j-api-1.7.21.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/org.slf4j_slf4j-api-1.7.21.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp16533770977053225215.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/18776259231623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.slf4j_slf4j-api-1.7.21.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./org.slf4j_slf4j-api-1.7.21.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.amazonaws_aws-java-sdk-bundle-1.11.603.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.amazonaws_aws-java-sdk-bundle-1.11.603.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp17738135895720612151.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/13928342451623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.amazonaws_aws-java-sdk-bundle-1.11.603.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.amazonaws_aws-java-sdk-bundle-1.11.603.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.typesafe_config-1.3.0.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.typesafe_config-1.3.0.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp15020909164608666214.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-4682533391623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.typesafe_config-1.3.0.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.typesafe_config-1.3.0.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp16503352824305074337.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/-8807534571623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.protobuf_protobuf-java-util-3.0.0-beta-3.jar to class loader
21/06/11 20:42:19 INFO Executor: Fetching spark://192.168.2.103:34205/jars/com.google.code.findbugs_annotations-3.0.1.jar with timestamp 1623424325166
21/06/11 20:42:19 INFO Utils: Fetching spark://192.168.2.103:34205/jars/com.google.code.findbugs_annotations-3.0.1.jar to /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/fetchFileTemp1442695003069020584.tmp
21/06/11 20:42:19 INFO Utils: /tmp/spark-a819910c-fda9-401a-b6e9-a88810001756/executor-8497494f-01a3-4bb3-a836-c15131d0ca98/spark-aef159f2-4977-4e17-9deb-eac747de8d62/12936857421623424325166_cache has been previously copied to /home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_annotations-3.0.1.jar
21/06/11 20:42:19 INFO Executor: Adding file:/home/w/Assignments/ner/spark-3.0.2-bin-hadoop2.7/work/app-20210611204208-0009/0/./com.google.code.findbugs_annotations-3.0.1.jar to class loader
21/06/11 20:42:36 INFO CoarseGrainedExecutorBackend: Got assigned task 0
21/06/11 20:42:36 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/06/11 20:42:36 INFO TorrentBroadcast: Started reading broadcast variable 1 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:36 INFO TransportClientFactory: Successfully created connection to /192.168.2.103:32947 after 4 ms (0 ms spent in bootstraps)
21/06/11 20:42:36 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 2.2 GiB)
21/06/11 20:42:36 INFO TorrentBroadcast: Reading broadcast variable 1 took 128 ms
21/06/11 20:42:36 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.2 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/metadata/part-00000:0+443
21/06/11 20:42:37 INFO TorrentBroadcast: Started reading broadcast variable 0 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.6 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO TorrentBroadcast: Reading broadcast variable 0 took 18 ms
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 198.4 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1414 bytes result sent to driver
21/06/11 20:42:37 INFO CoarseGrainedExecutorBackend: Got assigned task 1
21/06/11 20:42:37 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
21/06/11 20:42:37 INFO CoarseGrainedExecutorBackend: Got assigned task 2
21/06/11 20:42:37 INFO CoarseGrainedExecutorBackend: Got assigned task 3
21/06/11 20:42:37 INFO Executor: Running task 1.0 in stage 1.0 (TID 2)
21/06/11 20:42:37 INFO CoarseGrainedExecutorBackend: Got assigned task 4
21/06/11 20:42:37 INFO Executor: Running task 2.0 in stage 1.0 (TID 3)
21/06/11 20:42:37 INFO Executor: Running task 3.0 in stage 1.0 (TID 4)
21/06/11 20:42:37 INFO TorrentBroadcast: Started reading broadcast variable 3 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO TorrentBroadcast: Reading broadcast variable 3 took 15 ms
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 4.1 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00005:0+111532
21/06/11 20:42:37 INFO TorrentBroadcast: Started reading broadcast variable 2 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:37 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00004:0+111799
21/06/11 20:42:37 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00009:0+111710
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 23.6 KiB, free 2.2 GiB)
21/06/11 20:42:37 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00003:0+111815
21/06/11 20:42:37 INFO TorrentBroadcast: Reading broadcast variable 2 took 20 ms
21/06/11 20:42:37 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 198.4 KiB, free 2.2 GiB)
21/06/11 20:42:38 INFO Executor: Finished task 3.0 in stage 1.0 (TID 4). 66763 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 1.0 in stage 1.0 (TID 2). 66496 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 66779 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 2.0 in stage 1.0 (TID 3). 66674 bytes result sent to driver
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 5
21/06/11 20:42:38 INFO Executor: Running task 4.0 in stage 1.0 (TID 5)
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00006:0+111573
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 6
21/06/11 20:42:38 INFO Executor: Running task 5.0 in stage 1.0 (TID 6)
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 7
21/06/11 20:42:38 INFO Executor: Running task 6.0 in stage 1.0 (TID 7)
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00007:0+111394
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 8
21/06/11 20:42:38 INFO Executor: Running task 7.0 in stage 1.0 (TID 8)
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00001:0+111321
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00008:0+111429
21/06/11 20:42:38 INFO Executor: Finished task 7.0 in stage 1.0 (TID 8). 66350 bytes result sent to driver
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 9
21/06/11 20:42:38 INFO Executor: Running task 8.0 in stage 1.0 (TID 9)
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00011:0+111491
21/06/11 20:42:38 INFO Executor: Finished task 6.0 in stage 1.0 (TID 7). 66242 bytes result sent to driver
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 10
21/06/11 20:42:38 INFO Executor: Finished task 4.0 in stage 1.0 (TID 5). 66494 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 5.0 in stage 1.0 (TID 6). 66315 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Running task 9.0 in stage 1.0 (TID 10)
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 11
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00010:0+111524
21/06/11 20:42:38 INFO CoarseGrainedExecutorBackend: Got assigned task 12
21/06/11 20:42:38 INFO Executor: Running task 10.0 in stage 1.0 (TID 11)
21/06/11 20:42:38 INFO Executor: Running task 11.0 in stage 1.0 (TID 12)
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00000:0+111679
21/06/11 20:42:38 INFO HadoopRDD: Input split: file:/home/w/cache_pretrained/small_bert_L2_128_en_2.6.0_2.4_1598344320681/fields/vocabulary/part-00002:0+111457
21/06/11 20:42:38 INFO Executor: Finished task 8.0 in stage 1.0 (TID 9). 66412 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 11.0 in stage 1.0 (TID 12). 66600 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 9.0 in stage 1.0 (TID 10). 66445 bytes result sent to driver
21/06/11 20:42:38 INFO Executor: Finished task 10.0 in stage 1.0 (TID 11). 66378 bytes result sent to driver
21/06/11 20:42:55 INFO CoarseGrainedExecutorBackend: Got assigned task 13
21/06/11 20:42:55 INFO Executor: Running task 0.0 in stage 2.0 (TID 13)
21/06/11 20:42:55 INFO TorrentBroadcast: Started reading broadcast variable 6 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:55 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 77.1 KiB, free 2.2 GiB)
21/06/11 20:42:55 INFO TorrentBroadcast: Reading broadcast variable 6 took 14 ms
21/06/11 20:42:55 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 376.3 KiB, free 2.2 GiB)
21/06/11 20:42:58 INFO CodeGenerator: Code generated in 392.898976 ms
21/06/11 20:42:58 INFO CodeGenerator: Code generated in 50.294749 ms
21/06/11 20:42:58 INFO CodeGenerator: Code generated in 85.842712 ms
21/06/11 20:42:58 INFO CodeGenerator: Generated method too long to be JIT compiled: org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.serializefromobject_doConsume_0$ is 20081 bytes
21/06/11 20:42:58 INFO CodeGenerator: Code generated in 257.430603 ms
21/06/11 20:42:59 INFO CodeGenerator: Code generated in 166.091418 ms
21/06/11 20:42:59 INFO TorrentBroadcast: Started reading broadcast variable 4 with 1 pieces (estimated total size 4.0 MiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 333.3 KiB, free 2.2 GiB)
21/06/11 20:42:59 INFO TorrentBroadcast: Reading broadcast variable 4 took 8 ms
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 3.4 MiB, free 2.2 GiB)
21/06/11 20:42:59 INFO TorrentBroadcast: Started reading broadcast variable 5 with 5 pieces (estimated total size 20.0 MiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_5_piece3 stored as bytes in memory (estimated size 4.0 MiB, free 2.2 GiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_5_piece2 stored as bytes in memory (estimated size 4.0 MiB, free 2.2 GiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_5_piece4 stored as bytes in memory (estimated size 1039.2 KiB, free 2.2 GiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 4.0 MiB, free 2.2 GiB)
21/06/11 20:42:59 INFO MemoryStore: Block broadcast_5_piece1 stored as bytes in memory (estimated size 4.0 MiB, free 2.2 GiB)
21/06/11 20:42:59 INFO TorrentBroadcast: Reading broadcast variable 5 took 126 ms
21/06/11 20:43:00 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 17.5 MiB, free 2.2 GiB)
21/06/11 20:43:00 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 13)
java.lang.NullPointerException
	at com.johnsnowlabs.ml.tensorflow.TensorflowWrapper.getTFHubSession(TensorflowWrapper.scala:109)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.tag(TensorflowBert.scala:90)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.$anonfun$calculateEmbeddings$1(TensorflowBert.scala:223)
	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
	at scala.collection.Iterator.toStream(Iterator.scala:1415)
	at scala.collection.Iterator.toStream$(Iterator.scala:1414)
	at scala.collection.AbstractIterator.toStream(Iterator.scala:1429)
	at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:303)
	at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:303)
	at scala.collection.AbstractIterator.toSeq(Iterator.scala:1429)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.calculateEmbeddings(TensorflowBert.scala:221)
	at com.johnsnowlabs.nlp.embeddings.BertEmbeddings.$anonfun$batchAnnotate$2(BertEmbeddings.scala:237)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
	at com.johnsnowlabs.nlp.embeddings.BertEmbeddings.batchAnnotate(BertEmbeddings.scala:229)
	at com.johnsnowlabs.nlp.HasBatchedAnnotate.$anonfun$batchProcess$1(HasBatchedAnnotate.scala:41)
	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:729)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
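For context on the failing frame: the NPE originates in `TensorflowWrapper.getTFHubSession`, i.e. the TensorFlow session backing the BERT model is null when the executor tries to run inference. One difference from the notebook this setup is based on is that `sparknlp.start()` applies extra serialization settings that the manual `SparkSession.builder` call above omits. A hedged sketch of those settings follows — the config keys are standard Spark keys and the values mirror what Spark NLP's documented `start()` helper uses, but whether adding them resolves this particular NPE on a standalone cluster is not confirmed here:

```python
# Session options that sparknlp.start() applies by default but the manual
# builder in this report does not set. Spark NLP recommends Kryo
# serialization with a large buffer so model objects (e.g. BERT graphs)
# ship cleanly from the driver to executors.

def spark_nlp_session_conf(driver_memory="4G"):
    """Return a config map to pass via SparkSession.builder.config(k, v)."""
    return {
        "spark.driver.memory": driver_memory,
        # Kryo serializer + enlarged buffer, per Spark NLP's setup docs.
        "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
        "spark.kryoserializer.buffer.max": "2000M",
        # Unlimited result size, so collecting model metadata is not capped.
        "spark.driver.maxResultSize": "0",
        "spark.jars.packages": "com.johnsnowlabs.nlp:spark-nlp_2.12:3.1.0",
    }

conf = spark_nlp_session_conf()
for key, value in conf.items():
    print(f"{key}={value}")
```

These would be added to the existing builder as extra `.config(...)` calls; the rest of the script stays the same.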
21/06/11 20:43:00 INFO CoarseGrainedExecutorBackend: Got assigned task 14
21/06/11 20:43:00 INFO Executor: Running task 0.1 in stage 2.0 (TID 14)
21/06/11 20:43:00 ERROR Executor: Exception in task 0.1 in stage 2.0 (TID 14)
java.lang.NullPointerException
	at com.johnsnowlabs.ml.tensorflow.TensorflowWrapper.getTFHubSession(TensorflowWrapper.scala:109)
	... (remainder of the stack trace is identical to the one for TID 13 above)
21/06/11 20:43:00 INFO CoarseGrainedExecutorBackend: Got assigned task 15
21/06/11 20:43:00 INFO Executor: Running task 0.2 in stage 2.0 (TID 15)
21/06/11 20:43:00 ERROR Executor: Exception in task 0.2 in stage 2.0 (TID 15)
java.lang.NullPointerException
	at com.johnsnowlabs.ml.tensorflow.TensorflowWrapper.getTFHubSession(TensorflowWrapper.scala:109)
	... (remainder of the stack trace is identical to the one for TID 13 above)
21/06/11 20:43:00 INFO CoarseGrainedExecutorBackend: Got assigned task 16
21/06/11 20:43:00 INFO Executor: Running task 0.3 in stage 2.0 (TID 16)
21/06/11 20:43:01 ERROR Executor: Exception in task 0.3 in stage 2.0 (TID 16)
java.lang.NullPointerException
	at com.johnsnowlabs.ml.tensorflow.TensorflowWrapper.getTFHubSession(TensorflowWrapper.scala:109)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.tag(TensorflowBert.scala:90)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.$anonfun$calculateEmbeddings$1(TensorflowBert.scala:223)
	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
	at scala.collection.Iterator.toStream(Iterator.scala:1415)
	at scala.collection.Iterator.toStream$(Iterator.scala:1414)
	at scala.collection.AbstractIterator.toStream(Iterator.scala:1429)
	at scala.collection.TraversableOnce.toSeq(TraversableOnce.scala:303)
	at scala.collection.TraversableOnce.toSeq$(TraversableOnce.scala:303)
	at scala.collection.AbstractIterator.toSeq(Iterator.scala:1429)
	at com.johnsnowlabs.ml.tensorflow.TensorflowBert.calculateEmbeddings(TensorflowBert.scala:221)
	at com.johnsnowlabs.nlp.embeddings.BertEmbeddings.$anonfun$batchAnnotate$2(BertEmbeddings.scala:237)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
	at com.johnsnowlabs.nlp.embeddings.BertEmbeddings.batchAnnotate(BertEmbeddings.scala:229)
	at com.johnsnowlabs.nlp.HasBatchedAnnotate.$anonfun$batchProcess$1(HasBatchedAnnotate.scala:41)
	at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:729)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
21/06/11 20:43:04 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
21/06/11 20:43:04 INFO MemoryStore: MemoryStore cleared
21/06/11 20:43:04 ERROR CoarseGrainedExecutorBackend: RE

I am a novice and this is probably a trivial issue, but I am raising it nonetheless since I couldn't find a solution anywhere.

Thanks!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 22 (11 by maintainers)

Top GitHub Comments

1 reaction
havellay commented, Jun 24, 2021

I too can confirm that both JARs fix the issue that I was facing. Thank you so much for your timely help! Best, Hari

1 reaction
maziyarpanahi commented, Jun 22, 2021

Thanks for confirming that both errors were fixed. I will include them in tomorrow’s release of Spark NLP 3.1.1
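Per the maintainer's comment, the fix ships in Spark NLP 3.1.1, so the upgrade path is to pin that version on both the Python side (`pip install spark-nlp==3.1.1`) and the JVM side via `spark.jars.packages`. A minimal sketch, assuming the cluster master URL and memory settings from the original report; the session creation itself is shown commented out because it requires a running Spark cluster:

```python
# Hedged sketch: keep the pip package and the jar coordinate on the same
# Spark NLP version. 3.1.1 is the release the maintainer said carries the fix.
SPARK_NLP_VERSION = "3.1.1"
SCALA_BINARY = "2.12"  # Spark 3.0.x builds of Spark NLP target Scala 2.12

def spark_nlp_coordinate(version: str = SPARK_NLP_VERSION,
                         scala: str = SCALA_BINARY) -> str:
    """Maven coordinate to pass to spark.jars.packages."""
    return f"com.johnsnowlabs.nlp:spark-nlp_{scala}:{version}"

# The session setup from the issue, with only the version bumped:
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .master("spark://rajan-X556URK:7077")
#          .appName("nerexample")
#          .config("spark.driver.memory", "4G")
#          .config("spark.executor.memory", "4G")
#          .config("spark.jars.packages", spark_nlp_coordinate())
#          .getOrCreate())

print(spark_nlp_coordinate())
```

A mismatch between the `spark-nlp` pip package and the jar version is a common source of model-loading NullPointerExceptions like the one in this trace, which is why both sides are pinned from one constant here.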

