Merge pull request #9050 from NvTimLiu/release-tmp

Merge branch 'branch-23.08' into main [skip ci]

pxLi committed Aug 15, 2023
2 parents b33605a + 10540ba commit ee2de89
Showing 35 changed files with 144 additions and 77 deletions.
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -1,5 +1,5 @@
# Change log
Generated on 2023-08-14
Generated on 2023-08-15

## Release 23.08

@@ -31,6 +31,9 @@ Generated on 2023-08-14
### Bugs Fixed
|||
|:---|:---|
|[#9034](https://github.com/NVIDIA/spark-rapids/issues/9034)|[BUG] java.lang.ClassCastException: com.nvidia.spark.rapids.RuleNotFoundExprMeta cannot be cast to com.nvidia.spark.rapids.GeneratorExprMeta|
|[#9032](https://github.com/NVIDIA/spark-rapids/issues/9032)|[BUG] Multiple NDS queries fail with Spark-3.4.1 with bloom filter exception|
|[#8962](https://github.com/NVIDIA/spark-rapids/issues/8962)|[BUG] Nightly build failed: ExecutionPlanCaptureCallback$.class is not bitwise-identical across shims|
|[#9021](https://github.com/NVIDIA/spark-rapids/issues/9021)|[BUG] test_map_scalars_supported_key_types failed in dataproc 2.1|
|[#9020](https://github.com/NVIDIA/spark-rapids/issues/9020)|[BUG] auto-disable snapshot shims test in github action for pre-release branch|
|[#9010](https://github.com/NVIDIA/spark-rapids/issues/9010)|[BUG] Customer failure 23.08: Cannot compute hash of a table with a LIST of STRUCT columns.|
@@ -85,6 +88,11 @@ Generated on 2023-08-14
### PRs
|||
|:---|:---|
|[#9044](https://github.com/NVIDIA/spark-rapids/pull/9044)|[DOC] update release version from v2308.0 to 2308.1 [skip ci]|
|[#9036](https://github.com/NVIDIA/spark-rapids/pull/9036)|Fix meta class cast exception when generator not supported|
|[#9042](https://github.com/NVIDIA/spark-rapids/pull/9042)|Bump up project version to 23.08.1-SNAPSHOT|
|[#9035](https://github.com/NVIDIA/spark-rapids/pull/9035)|Handle null values when merging Bloom filters|
|[#9029](https://github.com/NVIDIA/spark-rapids/pull/9029)|Update 23.08 changelog to latest [skip ci]|
|[#9023](https://github.com/NVIDIA/spark-rapids/pull/9023)|Allow WindowLocalExec to run on CPU for a map test.|
|[#9024](https://github.com/NVIDIA/spark-rapids/pull/9024)|Do not trigger snapshot spark version test in pre-release maven-verify checks [skip ci]|
|[#8975](https://github.com/NVIDIA/spark-rapids/pull/8975)|Init 23.08 changelog [skip ci]|
8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -113,15 +113,15 @@ mvn -pl dist -PnoSnapshots package -DskipTests
Verify that shim-specific classes are hidden from a conventional classloader.

```bash
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.0-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.1-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl
Error: class not found: com.nvidia.spark.rapids.shims.SparkShimImpl
```

However, its bytecode can be loaded if the class name is prefixed with a `spark3XY` parallel-world prefix, which is not part of the package name:

```bash
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.0-cuda11.jar spark320.com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
Warning: File dist/target/rapids-4-spark_2.12-23.08.0-cuda11.jar(/spark320/com/nvidia/spark/rapids/shims/SparkShimImpl.class) does not contain class spark320.com.nvidia.spark.rapids.shims.SparkShimImpl
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.1-cuda11.jar spark320.com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
Warning: File dist/target/rapids-4-spark_2.12-23.08.1-cuda11.jar(/spark320/com/nvidia/spark/rapids/shims/SparkShimImpl.class) does not contain class spark320.com.nvidia.spark.rapids.shims.SparkShimImpl
Compiled from "SparkShims.scala"
public final class com.nvidia.spark.rapids.shims.SparkShimImpl {
```
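
A complementary check (a sketch, assuming the dist jar was built as above) is to list the parallel-world prefixes packaged into the jar directly:

```bash
# List the per-Spark-version parallel worlds inside the dist jar;
# each prefix such as spark320/ holds the bytecode for one shim
jar tf dist/target/rapids-4-spark_2.12-23.08.1-cuda11.jar \
  | grep -Eo '^spark[0-9]{3}/' | sort -u
```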
@@ -163,7 +163,7 @@ mvn package -pl dist -am -Dbuildver=340 -DallowConventionalDistJar=true
Verify `com.nvidia.spark.rapids.shims.SparkShimImpl` is conventionally loadable:
```bash
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.0-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
$ javap -cp dist/target/rapids-4-spark_2.12-23.08.1-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
Compiled from "SparkShims.scala"
public final class com.nvidia.spark.rapids.shims.SparkShimImpl {
```
2 changes: 1 addition & 1 deletion README.md
@@ -73,7 +73,7 @@ as a `provided` dependency.
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark_2.12</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<scope>provided</scope>
</dependency>
```
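
For a quick local pull of the released artifact with the same coordinates (a convenience sketch; a build should still declare the `provided` dependency above):

```bash
# Resolve the 23.08.1 release jar from Maven Central into the local repository
mvn dependency:get -Dartifact=com.nvidia:rapids-4-spark_2.12:23.08.1
```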
4 changes: 2 additions & 2 deletions aggregator/pom.xml
@@ -22,12 +22,12 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
</parent>
<artifactId>rapids-4-spark-aggregator_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Aggregator</name>
<description>Creates an aggregated shaded package of the RAPIDS plugin for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<!--
4 changes: 2 additions & 2 deletions api_validation/pom.xml
@@ -22,10 +22,10 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
</parent>
<artifactId>rapids-4-spark-api-validation</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>

<profiles>
<profile>
4 changes: 2 additions & 2 deletions delta-lake/delta-20x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-20x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.0.x Support</name>
<description>Delta Lake 2.0.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-21x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-21x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.1.x Support</name>
<description>Delta Lake 2.1.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-22x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-22x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.2.x Support</name>
<description>Delta Lake 2.2.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-24x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-24x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.4.x Support</name>
<description>Delta Lake 2.4.x support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark321db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark321db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 10.4 Delta Lake Support</name>
<description>Databricks 10.4 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark330db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark330db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 11.3 Delta Lake Support</name>
<description>Databricks 11.3 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark332db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark332db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 12.2 Delta Lake Support</name>
<description>Databricks 12.2 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-stub/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-stub_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake Stub</name>
<description>Delta Lake stub for the RAPIDS Accelerator for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions dist/pom.xml
@@ -22,12 +22,12 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>23.08.0</version>
<version>23.08.1</version>
</parent>
<artifactId>rapids-4-spark_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Distribution</name>
<description>Creates the distribution package of the RAPIDS plugin for Apache Spark</description>
<version>23.08.0</version>
<version>23.08.1</version>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
2 changes: 1 addition & 1 deletion docs/configs.md
@@ -10,7 +10,7 @@ The following is the list of options that `rapids-plugin-4-spark` supports.
On startup use: `--conf [conf key]=[conf value]`. For example:

```
${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.08.0-cuda11.jar \
${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.08.1-cuda11.jar \
--conf spark.plugins=com.nvidia.spark.SQLPlugin \
--conf spark.rapids.sql.concurrentGpuTasks=2
```
12 changes: 6 additions & 6 deletions docs/dev/shims.md
@@ -68,17 +68,17 @@ Using JarURLConnection URLs we create a Parallel World of the current version wi
Spark 3.0.2's URLs:

```text
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/spark302/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/spark302/
```

Spark 3.2.0's URLs:

```text
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.0.jar!/spark320/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/spark3xx-common/
jar:file:/home/spark/rapids-4-spark_2.12-23.08.1.jar!/spark320/
```
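
The same layout can be confirmed from the shell (an illustrative sketch, assuming a jar at the path used in the listings above); the shim entry point appears once per parallel world rather than at the conventional package location:

```bash
# Each parallel world inside the jar carries its own copy of the shim class
unzip -l /home/spark/rapids-4-spark_2.12-23.08.1.jar | grep 'shims/SparkShimImpl.class'
```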

### Late Inheritance in Public Classes
14 changes: 8 additions & 6 deletions docs/download.md
@@ -18,7 +18,7 @@ cuDF jar, that is either preinstalled in the Spark classpath on all nodes or sub
that uses the RAPIDS Accelerator For Apache Spark. See the [getting-started
guide](https://nvidia.github.io/spark-rapids/Getting-Started/) for more details.

## Release v23.08.0
## Release v23.08.1
### Hardware Requirements:

The plugin is tested on the following architectures:
@@ -55,21 +55,23 @@ for your hardware's minimum driver version.
*For Cloudera and EMR support, please refer to the
[Distributions](./FAQ.md#which-distributions-are-supported) section of the FAQ.

### Download v23.08.0
### Download v23.08.1
* Download the [RAPIDS
Accelerator for Apache Spark 23.08.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.0/rapids-4-spark_2.12-23.08.0.jar)
Accelerator for Apache Spark 23.08.1 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar)

This package is built against CUDA 11.8; all CUDA 11.x and 12.x versions are supported through [CUDA forward
compatibility](https://docs.nvidia.com/deploy/cuda-compatibility/index.html). It is tested
on V100, T4, A10, A100, L4 and H100 GPUs with CUDA 11.8-12.0. For those using other types of GPUs
which do not have CUDA forward compatibility (for example, GeForce), CUDA 11.8 or later is required.

Note that v23.08.0 is deprecated.

### Verify signature
* Download the [RAPIDS Accelerator for Apache Spark 23.08.0 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.0/rapids-4-spark_2.12-23.08.0.jar)
and [RAPIDS Accelerator for Apache Spark 23.08.0 jars.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.0/rapids-4-spark_2.12-23.08.0.jar.asc)
* Download the [RAPIDS Accelerator for Apache Spark 23.08.1 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar)
and [RAPIDS Accelerator for Apache Spark 23.08.1 jars.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar.asc)
* Download the [PUB_KEY](https://keys.openpgp.org/search?q=sw-spark@nvidia.com).
* Import the public key: `gpg --import PUB_KEY`
* Verify the signature: `gpg --verify rapids-4-spark_2.12-23.08.0.jar.asc rapids-4-spark_2.12-23.08.0.jar`
* Verify the signature: `gpg --verify rapids-4-spark_2.12-23.08.1.jar.asc rapids-4-spark_2.12-23.08.1.jar`
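
The steps above can be combined into one script (a sketch, assuming `wget` and `gpg` are available and the public key has been saved locally as `PUB_KEY`):

```bash
# Fetch the release jar and its detached signature, then verify
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar
wget https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar.asc
gpg --import PUB_KEY
gpg --verify rapids-4-spark_2.12-23.08.1.jar.asc rapids-4-spark_2.12-23.08.1.jar
```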

The output of a successful signature verification:

4 changes: 2 additions & 2 deletions docs/get-started/getting-started-databricks.md
@@ -66,7 +66,7 @@ Navigate to your home directory in the UI and select **Create** > **File** from
create an `init.sh` scripts with contents:
```bash
#!/bin/bash
sudo wget -O /databricks/jars/rapids-4-spark_2.12-23.08.0.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.0/rapids-4-spark_2.12-23.08.0.jar
sudo wget -O /databricks/jars/rapids-4-spark_2.12-23.08.1.jar https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.08.1/rapids-4-spark_2.12-23.08.1.jar
```
Then create a Databricks cluster by going to "Compute", then clicking `+ Create compute`. Ensure the
cluster meets the prerequisites above by configuring it as follows:
@@ -115,7 +115,7 @@ cluster meets the prerequisites above by configuring it as follows:
```bash
spark.rapids.sql.python.gpu.enabled true
spark.python.daemon.module rapids.daemon_databricks
spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-23.08.0.jar:/databricks/spark/python
spark.executorEnv.PYTHONPATH /databricks/jars/rapids-4-spark_2.12-23.08.1.jar:/databricks/spark/python
```
Note that the Python memory pool requires the cudf library, so you need to install the cudf library on
each worker node (`pip install cudf-cu11 --extra-index-url=https://pypi.nvidia.com`) or disable the Python memory pool
4 changes: 2 additions & 2 deletions docs/get-started/getting-started-on-prem.md
@@ -58,13 +58,13 @@ CUDA and will not run on other versions. The jars use a classifier to keep them
- CUDA 11.x => classifier cuda11

For example, here is a sample version of the jar with CUDA 11.x support:
- rapids-4-spark_2.12-23.08.0-cuda11.jar
- rapids-4-spark_2.12-23.08.1-cuda11.jar

For simplicity, export the location of this jar. This example assumes the sample jar above has
been placed in the `/opt/sparkRapidsPlugin` directory:
```shell
export SPARK_RAPIDS_DIR=/opt/sparkRapidsPlugin
export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-23.08.0-cuda11.jar
export SPARK_RAPIDS_PLUGIN_JAR=${SPARK_RAPIDS_DIR}/rapids-4-spark_2.12-23.08.1-cuda11.jar
```
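
With those variables exported, a minimal launch mirroring the example in `docs/configs.md` might look like the following (a sketch; `spark.rapids.sql.concurrentGpuTasks=2` is optional tuning):

```shell
${SPARK_HOME}/bin/spark-shell \
  --jars ${SPARK_RAPIDS_PLUGIN_JAR} \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin \
  --conf spark.rapids.sql.concurrentGpuTasks=2
```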

## Install the GPU Discovery Script
6 changes: 3 additions & 3 deletions integration_tests/DATA_GEN.md
@@ -28,12 +28,12 @@ corresponding profile flag `-P<jdk11|jdk17>`

After this the jar should be at
`target/rapids-4-spark-integration-tests_2.12-$PLUGIN_VERSION-spark$SPARK_VERSION.jar`
for example a Spark 3.3.0 jar for the 23.08.0 release would be
`target/rapids-4-spark-integration-tests_2.12-23.08.0-spark330.jar`
for example a Spark 3.3.0 jar for the 23.08.1 release would be
`target/rapids-4-spark-integration-tests_2.12-23.08.1-spark330.jar`

To get a Spark shell with this jar you can run:
```shell
spark-shell --jars target/rapids-4-spark-integration-tests_2.12-23.08.0-spark330.jar
spark-shell --jars target/rapids-4-spark-integration-tests_2.12-23.08.1-spark330.jar
```

After that you should be good to go.
6 changes: 3 additions & 3 deletions integration_tests/README.md
@@ -250,7 +250,7 @@ individually, so you don't risk running unit tests along with the integration te
http://www.scalatest.org/user_guide/using_the_scalatest_shell

```shell
spark-shell --jars rapids-4-spark-tests_2.12-23.08.0-tests.jar,rapids-4-spark-integration-tests_2.12-23.08.0-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
spark-shell --jars rapids-4-spark-tests_2.12-23.08.1-tests.jar,rapids-4-spark-integration-tests_2.12-23.08.1-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
```

First you import the `scalatest_shell` and tell the tests where they can find the test files you
@@ -273,7 +273,7 @@ If you just want to verify the SQL replacement is working you will need to add t
assumes CUDA 11.0 is being used.

```
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.08.0-cuda11.jar" ./runtests.py
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.08.1-cuda11.jar" ./runtests.py
```

You don't have to enable the plugin for this to work; the test framework will do that for you.
@@ -372,7 +372,7 @@ To run cudf_udf tests, need following configuration changes:
As an example, here is the `spark-submit` command with the cudf_udf parameter on CUDA 11.0:

```
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.08.0-cuda11.jar,rapids-4-spark-tests_2.12-23.08.0.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.08.0-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.08.0-cuda11.jar" ./runtests.py --cudf_udf
$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.08.1-cuda11.jar,rapids-4-spark-tests_2.12-23.08.1.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.08.1-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.08.1-cuda11.jar" ./runtests.py --cudf_udf
```

### Enabling fuzz tests