Timeout in Spark Connector

I’m learning Spark with Couchbase and running into a timeout issue even when performing simple operations (the machines I am working on are remote, so it is entirely possible the connection exceeds the default 2.5-second timeout).

Can anyone point me to how to correctly set timeout options when creating the SparkContext object in Java? I couldn’t find any example (in either Java or Scala) of setting any options besides the nodes and the bucket…

SparkConf conf = new SparkConf()
        .setAppName("TestCouchbase")
        .set("com.couchbase.nodes", "couchbase01.server.domain")
        .set("com.couchbase.bucket.beer-sample", "");
JavaSparkContext sc = new JavaSparkContext(conf);
CouchbaseSparkContext csc = couchbaseContext(sc);


Probably the easiest thing to do is to set them as properties. The Couchbase Spark Connector is based on the Couchbase Java Client, so the available settings are the ones described in the Java client's environment documentation:


Thanks Will, I saw that document, but I’m not sure which values I am supposed to set, or where. The connector may be based on the Java client, but the Java client's method for configuring environment settings (using an environment builder) doesn't seem to work in the Spark connector, and there is no documentation on setting any of the connector's options except the hostname and the bucket to use.

The following does not work - it has no effect on any of the timeout values.

SparkConf conf = new SparkConf()
        .setAppName("TestCouchbase")
        .set("com.couchbase.bucket.beer-sample", "")
        .set("com.couchbase.kvTimeout", "10000")
        .set("com.couchbase.connectTimeout", "30000")
        .set("com.couchbase.socketConnect", "10000");

Those params need to be set as system properties, not Spark properties, right now - we are looking to improve this in future versions!
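A minimal sketch of that suggestion: the Java client picks up "com.couchbase.*" system properties when its environment is created, so set them before the first Couchbase operation. The property names below are the ones from the question; timeout values are in milliseconds, and the Spark lines are shown as comments since they assume a running cluster.

```java
public class CouchbaseTimeoutConfig {
    public static void main(String[] args) {
        // System properties must be set before the Couchbase environment
        // is created, i.e. before any Spark/Couchbase operation runs.
        System.setProperty("com.couchbase.kvTimeout", "10000");      // key-value ops, ms
        System.setProperty("com.couchbase.connectTimeout", "30000"); // bucket connect, ms

        // Then build the contexts as usual (illustrative, not run here):
        // SparkConf conf = new SparkConf()
        //         .setAppName("TestCouchbase")
        //         .set("com.couchbase.nodes", "couchbase01.server.domain")
        //         .set("com.couchbase.bucket.beer-sample", "");
        // JavaSparkContext sc = new JavaSparkContext(conf);
        // CouchbaseSparkContext csc = couchbaseContext(sc);

        // Confirm the properties are visible to the JVM.
        System.out.println(System.getProperty("com.couchbase.kvTimeout"));
        System.out.println(System.getProperty("com.couchbase.connectTimeout"));
    }
}
```

Equivalently, the same properties can be passed on the command line with `-Dcom.couchbase.kvTimeout=10000` when submitting the job.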

Thanks, that worked :)

I have the same issue - which system properties should be set, and where, for Couchbase Server?