Connect to Couchbase on pySpark using JDBC driver

Hi,
I would like to be able to connect to Couchbase through pySpark on a Jupyter Notebook, as is already possible for PostgreSQL for example, using:

spark.read \
    .format("jdbc") \
    .option("url", "jdbc:postgresql:XXXXXXXXXXXX") \
    .option("dbtable", "XXXXX") \
    .option("user", "XXX") \
    .option("password", "XXX") \
    .load()

I have only seen Scala ways to do it, but I would like to be able to use pySpark with Couchbase.
For example, one goal is to insert a Spark DataFrame that I have in my Jupyter Notebook into my local Couchbase Server.

Thanks !
Benoit

After having created a SparkSession like the following:

sparkSession = (pyspark.sql.SparkSession.builder
    .config("spark.jars", "./cdata.jdbc.couchbase.jar")
    .config("spark.sql.caseSensitive", True)
    .config("spark.couchbase.nodes", "127.0.0.1")
    .config("spark.couchbase.bucket.travel-sample", "")
    .config("spark.couchbase.username", "XXX")
    .config("spark.couchbase.password", "XXX")
    .config("spark.driver.extraClassPath", "./cdata.jdbc.couchbase.jar")
    .appName(app_name)
    .getOrCreate())
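With that session in place, a JDBC read can be sketched along the lines below. Note that the connection-string syntax, the driver class name, and the table name are assumptions based on the usual CData naming pattern, not verified values, so check them against the driver's own documentation:

```python
# Sketch of the JDBC options for the CData Couchbase driver.
# The URL syntax, driver class name, and table name below are
# assumptions, not verified values; adjust them for your setup.
jdbc_options = {
    "url": "jdbc:couchbase:User=XXX;Password=XXX;Server=127.0.0.1;",
    "dbtable": "travel-sample",
    "driver": "cdata.jdbc.couchbase.CouchbaseDriver",  # assumed class name
}

# With the sparkSession built above, reading would then look like:
# df = sparkSession.read.format("jdbc").options(**jdbc_options).load()
#
# and writing a DataFrame back into the bucket:
# df.write.format("jdbc").options(**jdbc_options).mode("append").save()
```

The dictionary form keeps the connection details in one place; `.options(**jdbc_options)` expands them into the same `.option(...)` calls used in the PostgreSQL example.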