Importing pyspark in the Python shell

I have Spark installed properly on my machine and can run Python programs that use the pyspark modules without error when I use ./bin/pyspark as my Python interpreter.

However, when I start the regular Python shell and try to import pyspark modules, for example:

from pyspark import SparkContext

the import fails with:

"No module named pyspark".

How can I fix this? Is there an environment variable I need to set to point Python to the pyspark headers/libraries/etc.? If my Spark installation is /spark/, which pyspark paths do I need to include? Or can pyspark programs only be run from the pyspark interpreter?
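
For reference, a minimal sketch of one common workaround: make the pyspark sources that ship inside the Spark distribution visible to the plain Python interpreter. It assumes the /spark/ install path mentioned above and the standard Spark layout (the pyspark package under $SPARK_HOME/python, and Py4J bundled as a zip under $SPARK_HOME/python/lib); the app name and the py4j-*-src.zip glob are placeholders, not anything specific to this thread.

    import glob
    import os
    import sys

    # Assumed install path, taken from the question; adjust if yours differs.
    SPARK_HOME = "/spark"
    os.environ.setdefault("SPARK_HOME", SPARK_HOME)

    # pyspark's Python package lives under $SPARK_HOME/python, and Py4J ships
    # as a zip under $SPARK_HOME/python/lib; both must be importable.
    sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
    py4j_zips = glob.glob(os.path.join(SPARK_HOME, "python", "lib", "py4j-*-src.zip"))
    if py4j_zips:
        sys.path.insert(0, py4j_zips[0])

    from pyspark import SparkContext  # should now resolve

    sc = SparkContext("local[*]", "plain-shell-test")
    print(sc.parallelize(range(10)).sum())
    sc.stop()

Equivalently, exporting SPARK_HOME and adding those same two entries to PYTHONPATH before launching the shell achieves the same result, which is essentially what bin/pyspark sets up for you on startup.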

Hi @ursyan - is this question specific to Couchbase, or to Spark in general? At the moment we do not have support for connecting Spark to Couchbase in Python, only with Java.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.