
doc: Add sample common issues. #88

Merged
merged 16 commits into from Feb 23, 2021
11 changes: 4 additions & 7 deletions .readme-partials.yaml
@@ -19,13 +19,10 @@ custom_content: |
<!--- TODO(jiangmichael): Release on Maven Central and add Maven Central link -->
The connector will be available from the Maven Central repository. It can be added to a Spark job with the `--packages` option or the `spark.jars.packages` configuration property.

<!--
| Scala version | Connector Artifact |
| --- | --- |
| Scala 2.11 | `com.google.cloud.pubsublite.spark:pubsublite-spark-sql-streaming:0.1.0:with-dependencies` |
-->
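
As a sketch (not yet verified against an actual Maven Central release), a Spark job could pull in the connector via `--packages` using the artifact coordinates noted above (currently commented out pending release); `my_job.py` is a placeholder:

```sh
# Assumes the artifact has been released to Maven Central under the
# coordinates listed above; my_job.py is a placeholder job file.
spark-submit \
  --packages com.google.cloud.pubsublite.spark:pubsublite-spark-sql-streaming:0.1.0 \
  my_job.py

# Equivalently, via the configuration property:
spark-submit \
  --conf spark.jars.packages=com.google.cloud.pubsublite.spark:pubsublite-spark-sql-streaming:0.1.0 \
  my_job.py
```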

<!--- TODO(jiangmichael): Add example code and brief description here -->
## Compatibility
| Connector version | Spark version |
| --- | --- |
| 0.1.0 | 2.4.X |

## Usage

12 changes: 11 additions & 1 deletion samples/README.md
@@ -87,4 +87,14 @@ To run the word count sample in a Dataproc cluster, follow these steps:
3. Delete the Dataproc cluster.
```sh
gcloud dataproc clusters delete $CLUSTER_NAME --region=$REGION
```

## Common issues
1. Permission not granted. <br>
This can happen when creating topics and subscriptions, or when submitting a job to your Dataproc cluster.
Make sure your service account has at least `Editor` permissions for Pub/Sub Lite and Dataproc.
Your Dataproc cluster needs the `cloud-platform` scope to access other services and resources within the same project.
Your `gcloud` and `GOOGLE_APPLICATION_CREDENTIALS` should point to the same project. You can check which project your `gcloud` and `gsutil` commands use with `gcloud config get-value project` (see the first example after this list).

2. Your Dataproc job fails with `ClassNotFound` or similar exceptions. <br>
Make sure your Dataproc cluster uses an image with a [supported Spark version](https://github.com/googleapis/java-pubsublite-spark#compatibility) (see the second example after this list).
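
For the permissions issue, the following sketch shows how to check the active project and grant the roles; `MY_PROJECT` and `MY_SA` below are placeholders, not values from this repo.

```sh
# Check which project your gcloud (and gsutil) commands currently target.
gcloud config get-value project

# Check which credentials file your client libraries will pick up.
echo $GOOGLE_APPLICATION_CREDENTIALS

# Grant Editor-level roles for Pub/Sub Lite and Dataproc to the service
# account. MY_PROJECT and MY_SA are placeholders.
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:MY_SA@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/pubsublite.editor"
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:MY_SA@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/dataproc.editor"
```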
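For the `ClassNotFound` issue, one way to avoid it is to create the cluster with an image that ships a supported Spark version and with the `cloud-platform` scope. `1.4-debian10` below is an assumption (Dataproc 1.4 images ship Spark 2.4.x); check the compatibility table before picking one.

```sh
# $CLUSTER_NAME and $REGION are the same variables used in the steps above.
gcloud dataproc clusters create $CLUSTER_NAME \
  --region=$REGION \
  --image-version=1.4-debian10 \
  --scopes=cloud-platform
```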