
How to change PYSPARK_PYTHON from 2.7 to 3.6? #100

Open
cqpaul opened this issue Jul 8, 2020 · 4 comments


cqpaul commented Jul 8, 2020

I got the exception below; I hope someone can help me solve it. Thanks.

Python in worker has different version 2.7 than that in driver 3.6, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
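
This means the executors (workers) are running Python 2.7 while the driver is running Python 3.6, and PySpark requires both sides to use the same minor version. A minimal sketch of pointing both at the same interpreter before launching the shell, assuming python3 is on the PATH inside the container (the export approach is generic PySpark usage, not something specific to this image):

    # Make the driver and the executors it spawns use the same interpreter
    export PYSPARK_PYTHON=python3
    export PYSPARK_DRIVER_PYTHON=python3
    /spark/bin/pyspark

In a standalone cluster the worker containers also need PYSPARK_PYTHON set in their own environment, as the rest of this thread discusses.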


Flyfoxs commented Sep 30, 2020

docker exec -it -e PYSPARK_PYTHON=python3 spark-master /spark/bin/pyspark
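
Note that docker exec -e sets the variable only for that single session on the spark-master container; it does not change the environment of the already-running worker containers, so executors launched there can still fall back to the default Python 2.7 interpreter.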

@hugokoopmans

OK @Flyfoxs, this works for the master but not for the worker.
How can I make sure the worker also starts up with Python 3?
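
One way to do this, as a sketch: set PYSPARK_PYTHON in the worker container's own environment when the container is created, so the worker daemon and every executor it launches inherit it. The container and image names below are placeholders, not taken from this repository:

    # Recreate the worker with the interpreter pinned in its environment;
    # networking flags and the master URL are omitted for brevity
    docker run -d --name spark-worker \
      -e PYSPARK_PYTHON=python3 \
      your-spark-worker-image

If the cluster is started with docker-compose instead, the equivalent is an environment: entry for PYSPARK_PYTHON on the worker service.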

@sowizdrzal

@hugokoopmans Have you figured out what went wrong?


stator85 commented Jul 6, 2022

Has this also been fixed for the worker in the latest branch?
