
Q: How to handle exceptions? #1134

Open
Krumelur opened this issue Jan 27, 2023 · 0 comments
Krumelur commented Jan 27, 2023

I'm facing a scenario where reading files fails because certain folders within my source (a data lake) are empty. The following code snippet then fails:

```csharp
foreach (var repoFolder in repoFolders)
{
    var inputDf = spark
        .Read()
        .Option("multiline", "true")
        .Json($"{cdsInputRootUri}/{snapshotId}/{repoFolder}/*/*.json");
    // ...
```

In Python I could use try/except as shown below, but wrapping my C# code in a try-catch won't work. What's the recommended way of converting such code to .NET for Apache Spark?

```python
for add in list(addrs):  # iterate over a copy; removing from addrs while iterating it directly skips elements
    try:
        spark.read.format("parquet").load(add)
    except Exception:  # catch a concrete exception type rather than using a bare except
        print(add)
        addrs.remove(add)
```
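For what it's worth, in Microsoft.Spark errors raised on the JVM side generally surface in .NET as catchable exceptions, so an ordinary try-catch around the read should translate the Python pattern. A minimal sketch under that assumption, reusing the names from the snippet above (`spark`, `repoFolders`, `cdsInputRootUri`, `snapshotId`); the concrete exception type thrown for a missing path may be more specific than `Exception`:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Spark.Sql;

// Collect failing folders instead of mutating the collection mid-loop.
var unreadable = new List<string>();
foreach (var repoFolder in repoFolders)
{
    try
    {
        var inputDf = spark
            .Read()
            .Option("multiline", "true")
            .Json($"{cdsInputRootUri}/{snapshotId}/{repoFolder}/*/*.json");
        // ... process inputDf ...
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Skipping {repoFolder}: {ex.Message}");
        unreadable.Add(repoFolder);
    }
}
```

One caveat: Spark transformations are lazy, so if the failure is only triggered by a later action (e.g. `Collect()` or `Count()`), the try-catch has to cover that call as well, not just the `Json(...)` read.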
Krumelur changed the title from "How to handle exceptions?" to "Q: How to handle exceptions?" on Jan 27, 2023.