Cannot send lineage data to (SocketTimeoutException: Read timed out)
See original GitHub issue. Setup: spark-2.4-spline-agent-bundle_2.11-0.6.1.jar on Spark 2.4, Spline UI and rest-server 0.6, arangodb:3.7.10; the Spline components are running via the Docker quickstart.
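For context, the agent is attached in codeless-init mode through Spark configuration, roughly as sketched below. This is only a minimal sketch, not the reporter's exact job: the listener class name is the one appearing in the stack trace further down, the producer URL is derived from the error message, and the spark.spline.producer.url key is an assumption for agent 0.6.x that should be checked against the agent docs for your version.

import org.apache.spark.sql.SparkSession

// Minimal sketch of codeless init for the Spline agent (assumed configuration).
// Listener class and producer address are taken from the error below;
// the config key name is an assumption for agent 0.6.x.
val spark = SparkSession.builder()
  .appName("lin1")
  .config("spark.sql.queryExecutionListeners",
    "za.co.absa.spline.harvester.listener.SplineQueryExecutionListener")
  .config("spark.spline.producer.url", "http://redac:8080/producer")
  .getOrCreate()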
Some data does make it to the Spline UI, but at the end of the job the agent fails with this error:
2021-07-02 02:23:46,996 [main] ERROR za.co.absa.spline.harvester.listener.SplineQueryExecutionListener - Unexpected error occurred during lineage processing for application: lin1 #local-1625192542547
java.lang.RuntimeException: Cannot send lineage data to http://redac:8080/producer/execution-plans
at za.co.absa.spline.harvester.dispatcher.HttpLineageDispatcher.sendJson(HttpLineageDispatcher.scala:85)
at za.co.absa.spline.harvester.dispatcher.HttpLineageDispatcher.send(HttpLineageDispatcher.scala:60)
at za.co.absa.spline.harvester.QueryExecutionEventHandler$$anonfun$onSuccess$2.apply(QueryExecutionEventHandler.scala:45)
at za.co.absa.spline.harvester.QueryExecutionEventHandler$$anonfun$onSuccess$2.apply(QueryExecutionEventHandler.scala:43)
at scala.Option.foreach(Option.scala:257)
at za.co.absa.spline.harvester.QueryExecutionEventHandler.onSuccess(QueryExecutionEventHandler.scala:43)
at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(SplineQueryExecutionListener.scala:40)
at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(SplineQueryExecutionListener.scala:40)
at scala.Option.foreach(Option.scala:257)
at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$$anonfun$onSuccess$1.apply$mcV$sp(SplineQueryExecutionListener.scala:40)
at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener.withErrorHandling(SplineQueryExecutionListener.scala:49)
at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener.onSuccess(SplineQueryExecutionListener.scala:39)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(QueryExecutionListener.scala:129)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1$$anonfun$apply$mcV$sp$1.apply(QueryExecutionListener.scala:128)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling$1.apply(QueryExecutionListener.scala:157)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling$1.apply(QueryExecutionListener.scala:155)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
at org.apache.spark.sql.util.ExecutionListenerManager.org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling(QueryExecutionListener.scala:155)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply$mcV$sp(QueryExecutionListener.scala:128)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply(QueryExecutionListener.scala:128)
at org.apache.spark.sql.util.ExecutionListenerManager$$anonfun$onSuccess$1.apply(QueryExecutionListener.scala:128)
at org.apache.spark.sql.util.ExecutionListenerManager.readLock(QueryExecutionListener.scala:168)
at org.apache.spark.sql.util.ExecutionListenerManager.onSuccess(QueryExecutionListener.scala:127)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:679)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:286)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:272)
at com.yotpo.metorikku.output.writers.file.FileOutputWriter.save(FileOutputWriter.scala:106)
at com.yotpo.metorikku.output.writers.file.FileOutputWriter.write(FileOutputWriter.scala:76)
at com.yotpo.metorikku.output.writers.file.CSVOutputWriter.write(CSVOutputWriter.scala:23)
at com.yotpo.metorikku.metric.Metric.com$yotpo$metorikku$metric$Metric$$writeBatch(Metric.scala:102)
at com.yotpo.metorikku.metric.Metric$$anonfun$write$1.apply(Metric.scala:156)
at com.yotpo.metorikku.metric.Metric$$anonfun$write$1.apply(Metric.scala:134)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.yotpo.metorikku.metric.Metric.write(Metric.scala:134)
at com.yotpo.metorikku.metric.MetricSet$$anonfun$run$1.apply(MetricSet.scala:50)
at com.yotpo.metorikku.metric.MetricSet$$anonfun$run$1.apply(MetricSet.scala:45)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.yotpo.metorikku.metric.MetricSet.run(MetricSet.scala:45)
at com.yotpo.metorikku.Metorikku$$anonfun$runMetrics$1.apply(Metorikku.scala:47)
at com.yotpo.metorikku.Metorikku$$anonfun$runMetrics$1.apply(Metorikku.scala:45)
at scala.collection.immutable.List.foreach(List.scala:392)
at com.yotpo.metorikku.Metorikku$.runMetrics(Metorikku.scala:45)
at com.yotpo.metorikku.Metorikku$.delayedEndpoint$com$yotpo$metorikku$Metorikku$1(Metorikku.scala:19)
at com.yotpo.metorikku.Metorikku$delayedInit$body.apply(Metorikku.scala:9)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at com.yotpo.metorikku.Metorikku$.main(Metorikku.scala:9)
at com.yotpo.metorikku.Metorikku.main(Metorikku.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.SocketTimeoutException: Read timed out
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1950)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1945)
at java.security.AccessController.doPrivileged(Native Method)
at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1944)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1514)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at scalaj.http.HttpRequest.scalaj$http$HttpRequest$$doConnection(Http.scala:367)
at scalaj.http.HttpRequest.exec(Http.scala:343)
at scalaj.http.HttpRequest.asString(Http.scala:491)
at za.co.absa.spline.harvester.dispatcher.httpdispatcher.rest.RestEndpoint.post(RestEndpoint.scala:43)
at za.co.absa.spline.harvester.dispatcher.HttpLineageDispatcher.sendJson(HttpLineageDispatcher.scala:78)
... 66 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1593)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498)
at scalaj.http.HttpRequest.scalaj$http$HttpRequest$$doConnection(Http.scala:365)
... 70 more
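The failure happens inside the scalaj-http call that RestEndpoint.post makes to the producer's /execution-plans endpoint: the POST is sent, but reading the response exceeds the dispatcher's read timeout. A quick way to check whether the rest-server is merely slow (rather than unreachable) is to replay a similar POST with a much larger read timeout. The snippet below is a standalone sketch, not the agent's actual code path; the payload and Content-Type are placeholders (the real dispatcher sends a full execution plan with a Spline-specific media type), so the server will most likely reject the request with a 4xx, which is still enough to prove it answers within the timeout.

import scalaj.http.Http

// Standalone probe against the producer endpoint seen in the error above.
// Placeholder body and Content-Type, so expect a 4xx rather than a 201 --
// the point is only to see whether a response arrives before the timeout.
val response = Http("http://redac:8080/producer/execution-plans")
  .postData("""{"placeholder": true}""")
  .header("Content-Type", "application/json")
  .timeout(connTimeoutMs = 5000, readTimeoutMs = 60000) // generous read timeout for this probe
  .asString

println(s"HTTP ${response.code}: ${response.body}")

If even this probe times out, the bottleneck is likely on the rest-server/ArangoDB side of the Docker quickstart rather than in the agent itself.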
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
It’s this issue - #272. We are still investigating. Thanks for the info.
Solved by #272