Describe the problem you faced
When writing parquet files with zstd compression from Spark, Hudi does not expose a config to set the zstd compression level.
Expected behavior
A config such as hoodie.parquet.zstd.compression.level that supports setting the zstd compression level for Spark / parquet writes.
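For illustration, a minimal sketch of what the requested usage might look like from Spark. Note that hoodie.parquet.zstd.compression.level is the config *proposed* by this issue, not an existing option; the table name and path are hypothetical placeholders. hoodie.parquet.compression.codec is an existing Hudi write config.

```python
# Minimal sketch of the requested usage. The zstd level key below is the
# config PROPOSED in this issue, not an existing Hudi option; the table
# name and save path are hypothetical placeholders.
hudi_options = {
    "hoodie.table.name": "events",                        # hypothetical
    "hoodie.parquet.compression.codec": "zstd",           # existing Hudi config
    "hoodie.parquet.zstd.compression.level": "6",         # proposed in this issue
}

# With a SparkSession `spark` and a DataFrame `df`, the write would look like:
# df.write.format("hudi").options(**hudi_options).mode("append").save("/tmp/events")
```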
Environment Description
Hudi version :
Spark version : 3.3
Hive version :
Hadoop version :
Storage (HDFS/S3/GCS..) :
Running on Docker? (yes/no) :
In Flink, you can use the parquet. prefix for any property you want to customize on the parquet writer; I'm not sure whether Spark has a similar mechanism.
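As a sketch of the pass-through described in the comment above, the table properties might look like the following (shown as a plain dict; the zstd level key follows parquet-mr's parquet.compression.codec.zstd.level naming and is an assumption to verify against your Flink/Hudi versions, and the path is a hypothetical placeholder):

```python
# Sketch of Flink table properties using the `parquet.` prefix pass-through
# described in the comment. The zstd level key follows parquet-mr naming and
# is an assumption to verify; the path is a hypothetical placeholder.
flink_with_options = {
    "connector": "hudi",
    "path": "s3://my-bucket/events",                      # hypothetical
    "parquet.compression": "ZSTD",                        # forwarded to the parquet writer
    "parquet.compression.codec.zstd.level": "6",          # assumed key, per parquet-mr
}

# In Flink SQL these key/value pairs would go in the WITH (...) clause of
# the CREATE TABLE statement for the Hudi sink.
```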