Spark Job getting stuck after upgrading to Snappydata 1.3.0 #1568

Open
parimalaadinarayan opened this issue Dec 2, 2021 · 1 comment


parimalaadinarayan commented Dec 2, 2021

val windowSpec: WindowSpec = Window.partitionBy("imsi").orderBy($"epochTime".asc)

df = df.withColumn("Data",
  udf(SuspiciousActivityDetection.datUsageDetection(_: String, _: String, broadcastDataUsageEventToSubEventList))
    .apply($"subevent", $"event"))

df.select().write.mode(SaveMode.Append).saveAsTable(tableName)
1. The above code used to work in SnappyData 1.1.0; after upgrading to 1.3.0 the same code does not work.
2. The executors themselves are not getting launched.
3. However, after adding a persist in between, the code seems to work with the newer version (see the sketch after this list).
Please let us know why saveAsTable is not triggering any action, and why the job is getting stuck.
Are there any new overloaded saveAsTable methods that we are supposed to use?
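For reference, a minimal sketch of the workaround from point 3, assuming the only change is caching the DataFrame between the transformation and the write (the UDF, broadcast variable, and table name are the ones from the snippet above):

```scala
// Sketch of the persist workaround (assumption: caching before the write is the only change).
df = df.withColumn("Data",
  udf(SuspiciousActivityDetection.datUsageDetection(_: String, _: String, broadcastDataUsageEventToSubEventList))
    .apply($"subevent", $"event"))

// Materialize the DataFrame before writing; with this in place the job runs on 1.3.0.
df = df.persist()

df.write.mode(SaveMode.Append).saveAsTable(tableName)
```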

sumwale (Contributor) commented Dec 5, 2021

@parimalaadinarayan One reason for jobs getting stuck can be that no executors are available. Check the available executors in the "Executors" tab, then look at the thread dump there for the driver, and check the Jobs tab to see whether the job is being created at all. For saveAsTable you have not given any format or other properties, so it will use parquet files for storage in the default hive configuration ("spark-warehouse" in the lead working directory by default). Another possibility is that you have a hive-site.xml or equivalent that points to some other location which is not reachable. I just checked similar commands and they are working fine.
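One way to rule out the hive configuration as the cause is to pin the format and storage path explicitly in the write. This is a sketch only, assuming a parquet external location is acceptable for your table; the path is a placeholder, not something from your setup:

```scala
import org.apache.spark.sql.SaveMode

// Explicitly specify the storage format and location so the write does not
// depend on what hive-site.xml (or the default "spark-warehouse") resolves to.
df.write
  .format("parquet")
  .option("path", "/data/suspicious_activity")  // placeholder path for illustration
  .mode(SaveMode.Append)
  .saveAsTable(tableName)
```

If this variant completes while the original call hangs, the default warehouse/metastore location is the likely culprit.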
