Scala code giving this exception while pushing data to couchbase bucket - java.lang.RuntimeException: java.util.concurrent.TimeoutException

Hi,

When I push data from a Hive table to a Couchbase bucket using Spark Scala code, I get the error below.

I am starting the Spark shell with the configuration below:

spark-shell \
  --conf spark.couchbase.nodes=10.32.83.53 \
  --conf "queue=ssm_queue" \
  --conf spark.couchbase.username=test_user \
  --conf spark.couchbase.password=123456 \
  --conf "spark.local.dir=/data01/spark/applicationHistory" \
  --conf "spark.couchbase.bucket.test_bucket=" \
  --conf "spark.couchbase.queryTimeout=99999" \
  --conf "spark.couchbase.viewTimeout=99999" \
  --conf "spark.couchbase.searchTimeout=99999" \
  --conf "spark.couchbase.kvTimeout=99999" \
  --conf "spark.couchbase.connectTimeout=99999" \
  --conf "spark.couchbase.managementTimeout=99999" \
  --conf "spark.couchbase.disconnectTimeout=99999" \
  --conf "spark.executor.extraClassPath=/usr/hdp/current/spark2-client/jars/denodo-vdp-jdbcdriver.jar:/usr/hdp/current/spark2-client/jars/spark-connector-assembly-2.3.0-SNAPSHOT.jar:/home/m0058854/Couchbase_jars/core-io-1.7.1.jar:/home/m0058854/Couchbase_jars/opentracing-api-0.31.0.jar:/home/m0058854/Couchbase_jars/rxjava-1.3.8.jar:/usr/hdp/current/spark2-client/jars/java-client-2.7.1.jar" \
  --conf "spark.driver.extraClassPath=/usr/hdp/current/spark2-client/jars/denodo-vdp-jdbcdriver.jar:/usr/hdp/current/spark2-client/jars/spark-connector-assembly-2.3.0-SNAPSHOT.jar:/usr/hdp/current/spark2-client/jars/java-client-2.7.1.jar:/home/m0058854/Couchbase_jars/core-io-1.7.1.jar:/home/m0058854/Couchbase_jars/opentracing-api-0.31.0.jar:/home/m0058854/Couchbase_jars/rxjava-1.3.8.jar" \
  --conf "spark.network.timeout=100000000" \
  --conf "spark.executor.heartbeatInterval=100000000" \
  --conf "spark.sql.broadcastTimeout=99999999" \
  --conf "spark.network.timeout=1000000" \
  --conf "spark.rdd.compress=true" \
  --conf "spark.driver.maxResultSize=0" \
  --conf "spark.sql.autoBroadcastJoinThreshold=-1" \
  --num-executors 1 --executor-memory 5g --driver-memory 5g --executor-cores 1 \
  pyspark-shell
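For context, the write path in the stack trace (`DocumentRDDFunctions.saveToCouchbase`) is typically reached along these lines. This is a minimal sketch only, assuming the Spark connector 2.x and java-client 2.x jars from the classpath above and a reachable cluster; the Hive table name `ssm_source` and columns `doc_id`, `payload` are hypothetical placeholders, not from my actual job:

```scala
// Sketch, not my exact code: assumes connector 2.x / java-client 2.x
// and a running Couchbase cluster. Names below are hypothetical.
import com.couchbase.client.java.document.JsonDocument
import com.couchbase.client.java.document.json.JsonObject
import com.couchbase.spark._

val df = spark.sql("SELECT doc_id, payload FROM ssm_source")

df.rdd
  .map { row =>
    JsonDocument.create(
      row.getString(0),                                    // document key
      JsonObject.create().put("payload", row.getString(1)) // document body
    )
  }
  .saveToCouchbase("test_bucket") // the call that hits DocumentRDDFunctions.saveToCouchbase
```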

Error:
java.lang.RuntimeException: java.util.concurrent.TimeoutException
at rx.exceptions.Exceptions.propagate(Exceptions.java:57)
at rx.observables.BlockingObservable.blockForSingle(BlockingObservable.java:463)
at rx.observables.BlockingObservable.single(BlockingObservable.java:340)
at rx.lang.scala.observables.BlockingObservable$.single$extension(BlockingObservable.scala:188)
at rx.lang.scala.observables.BlockingObservable$.lastOption$extension(BlockingObservable.scala:85)
at rx.lang.scala.observables.BlockingObservable$.lastOrElse$extension(BlockingObservable.scala)
at com.couchbase.spark.DocumentRDDFunctions$$anonfun$saveToCouchbase$1.apply(DocumentRDDFunctions.scala:81)
at com.couchbase.spark.DocumentRDDFunctions$$anonfun$saveToCouchbase$1.apply(DocumentRDDFunctions.scala:46)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:926)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:926)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2069)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2069)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException
at rx.internal.operators.OnSubscribeTimeoutTimedWithFallback$TimeoutMainSubscriber.onTimeout(OnSubscribeTimeoutTimedWithFallback.java:166)
at rx.internal.operators.OnSubscribeTimeoutTimedWithFallback$TimeoutMainSubscriber$TimeoutTask.call(OnSubscribeTimeoutTimedWithFallback.java:191)
at rx.internal.schedulers.EventLoopsScheduler$EventLoopWorker$2.call(EventLoopsScheduler.java:189)
at rx.internal.schedulers.ScheduledAction.run(ScheduledAction.java:55)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
… 3 more
19/02/08 12:16:02 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.RuntimeException: java.util.concurrent.TimeoutException
(stack trace identical to the one above)

Hey @annapurna.narendra,

Does any data make it through to the bucket before the timeout? How long does the timeout take to happen?