
ImportError: pyspark home needs to be added to PYTHONPATH #58

Open
ghost opened this issue May 13, 2016 · 1 comment


ghost commented May 13, 2016

While executing the following simple code with Sparkit-Learn:

from splearn.svm import SparkLinearSVC
spark = SparkLinearSVC()

I get the following error message:

ImportError: pyspark home needs to be added to PYTHONPATH.
export PYTHONPATH=$PYTHONPATH:$SPARK_HOME/python:../

Following these answers:
http://stackoverflow.com/questions/28829757/unable-to-add-spark-to-pythonpath
http://stackoverflow.com/questions/23256536/importing-pyspark-in-python-shell
I have added every possible combination of those PYTHONPATH settings to my .bashrc, but the error still occurs.

Currently the relevant paths in my .bashrc look like this:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
export PATH=/home/123/anaconda2/bin:$PATH
export SPARK_HOME=/home/123/Downloads/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64/server:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64:$PATH
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
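
For reference, here is a quick check of what the interpreter actually sees (a minimal diagnostic, run from the same interpreter PyCharm uses; nothing here is specific to Sparkit-Learn):

import os
import sys

# What did the environment actually deliver to this interpreter?
print("SPARK_HOME =", os.environ.get("SPARK_HOME"))
print("PYTHONPATH =", os.environ.get("PYTHONPATH"))

# Which sys.path entries point at Spark's Python sources?
print([p for p in sys.path if "spark" in p.lower()])

try:
    import pyspark  # noqa: F401
    print("pyspark import OK")
except ImportError as exc:
    print("pyspark import failed:", exc)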

Any possible solution? I am running this on Ubuntu 16.04 with PyCharm and spark-1.6.1-bin-hadoop2.6.

@alonsopg

I will share my .profile. Hope this helps:

# For an IPython notebook and pyspark integration
if which pyspark > /dev/null; then
  # /usr/local/Cellar/apache-spark/1.5.1
  export SPARK_HOME="/usr/local/Cellar/apache-spark/1.6.1/libexec/"
  export PYSPARK_SUBMIT_ARGS="--master local[2]"
fi

export SPARK_HOME="/usr/local/Cellar/apache-spark/1.6.1/libexec"
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
#export PYSPARK_PYTHON=python3

Note that I am working in a Jupyter notebook:

In:
from splearn.svm import SparkLinearSVC
spark = SparkLinearSVC()

I do not get the same error message.
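
One more thing worth checking: on Ubuntu, a PyCharm started from the desktop launcher does not source .bashrc, so your exports may never reach the interpreter at all. A workaround is to extend sys.path in the script itself before importing splearn. This is only a sketch, reusing the SPARK_HOME from your question; the py4j zip name must match whatever your Spark bundles:

import glob
import os
import sys

# Fall back to the hard-coded install location when SPARK_HOME is not set
# (e.g. PyCharm was launched without the .bashrc environment).
spark_home = os.environ.get(
    "SPARK_HOME", "/home/123/Downloads/spark-1.6.1-bin-hadoop2.6")

# Spark's Python sources and the bundled py4j zip must both be importable.
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, glob.glob(
    os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))[0])

from splearn.svm import SparkLinearSVC  # should now import without the error
spark = SparkLinearSVC()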
