
spark task is not working #143

Open
joyatcloudfall opened this issue Mar 1, 2022 · 2 comments

joyatcloudfall commented Mar 1, 2022

I tried to use pyspark to do a count task.
[screenshot]

The job was submitted to Spark; however, its tasks never ran.
[screenshot]

Any idea what's going wrong?
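
For reference, a minimal sketch of the kind of PySpark count job described above (the screenshot is not available, so the app name and master URL here are placeholders, not taken from the issue):

```python
# Minimal sketch of a PySpark count job against a standalone cluster.
# "spark://master:7077" and the app name are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("count-test")          # hypothetical app name
    .master("spark://master:7077")  # hypothetical master URL
    .getOrCreate()
)

# If the master accepts the application but no executor ever picks up
# the tasks, this call blocks while the scheduler waits for resources.
print(spark.range(1000).count())

spark.stop()
```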


rilakgg commented Mar 31, 2022

Hi,
I've submitted an application to the master from a remote server and I'm hitting the same problem. The logs loop on this message:

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Do you have any suggestions?

Images: Spark 3.2.0 for Hadoop 3.2 with OpenJDK 8 and Scala 2.12
Thanks.
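
That warning usually means the master accepted the application but no worker could satisfy the executor request, or the workers cannot connect back to the driver (a common failure mode when the driver runs on a remote machine, as described above). A hedged sketch of driver-side settings that often matter here; the master URL, driver address, and resource sizes are all placeholders:

```python
# Sketch of settings that commonly relate to this warning when
# submitting from a remote machine; all values are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("spark://master:7077")            # hypothetical master URL
    # Workers open connections back to the driver, so it must advertise
    # an address they can actually reach.
    .config("spark.driver.host", "10.0.0.5")  # placeholder driver address
    # Keep the executor request within what each worker reports as free.
    .config("spark.executor.memory", "1g")
    .config("spark.executor.cores", "1")
    .getOrCreate()
)
```

The Workers table on the master's web UI (port 8080 by default) shows whether any workers are registered at all and how much memory and how many cores they have free.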

@kiran-jayaram

Hi, I'm having the same issue. Did it get resolved?
