docs: update readme and other minor nits (#77)
jiangmichaellll committed Feb 12, 2021
1 parent 5e6476e commit ae18de6
Showing 3 changed files with 18 additions and 8 deletions.
8 changes: 4 additions & 4 deletions samples/README.md
@@ -27,7 +27,7 @@ PUBSUBLITE_SPARK_SQL_STREAMING_JAR_LOCATION= # downloaded pubsublite-spark-sql-s

To run the word count sample in Dataproc cluster, follow the steps:

-1. `cd samples/`
+1. `cd samples/snippets`
2. Set the current sample version.
```sh
SAMPLE_VERSION=$(mvn -q \
@@ -59,7 +59,7 @@ To run the word count sample in Dataproc cluster, follow the steps:
7. Create GCS bucket and upload both `pubsublite-spark-sql-streaming-$CONNECTOR_VERSION-with-dependencies.jar` and the sample jar onto GCS
```sh
gsutil mb $BUCKET
-gsutil cp snapshot/target/pubsublite-spark-snapshot-$SAMPLE_VERSION.jar $BUCKET
+gsutil cp target/pubsublite-spark-snippets-$SAMPLE_VERSION.jar $BUCKET
gsutil cp $PUBSUBLITE_SPARK_SQL_STREAMING_JAR_LOCATION $BUCKET
```
8. Set Dataproc region
@@ -70,14 +70,14 @@ To run the word count sample in Dataproc cluster, follow the steps:
9. Run the sample in Dataproc. You should see the word count results in the console output.
```sh
gcloud dataproc jobs submit spark --cluster=$CLUSTER_NAME \
-  --jars=$BUCKET/pubsublite-spark-snapshot-$SAMPLE_VERSION.jar,$BUCKET/pubsublite-spark-sql-streaming-$CONNECTOR_VERSION-with-dependencies.jar \
+  --jars=$BUCKET/pubsublite-spark-snippets-$SAMPLE_VERSION.jar,$BUCKET/pubsublite-spark-sql-streaming-$CONNECTOR_VERSION-with-dependencies.jar \
--class=pubsublite.spark.WordCount -- $SUBSCRIPTION_PATH
```

## Cleaning up
1. Delete Pub/Sub Lite topic and subscription.
```sh
-gcloud pubsub lite-subscriptions delete $SUBSCRIPTION_ID --zone=$REGION-$ZONE_ID=
+gcloud pubsub lite-subscriptions delete $SUBSCRIPTION_ID --zone=$REGION-$ZONE_ID
gcloud pubsub lite-topics delete $TOPIC_ID --zone=$REGION-$ZONE_ID
```
2. Delete GCS bucket.
16 changes: 12 additions & 4 deletions samples/snippets/src/main/java/pubsublite/spark/AdminUtils.java
@@ -85,8 +85,12 @@ public static void createTopicExample(
     try (AdminClient adminClient = AdminClient.create(adminClientSettings)) {
       Topic response = adminClient.createTopic(topic).get();
       System.out.println(response.getAllFields() + " created successfully.");
-    } catch (AlreadyExistsException e) {
-      System.out.println(topicPath + " already exists");
+    } catch (ExecutionException e) {
+      if (e.getCause() instanceof AlreadyExistsException) {
+        System.out.println(topicPath + " already exists");
+      } else {
+        throw e;
+      }
     }
   }

@@ -130,8 +134,12 @@ public static void createSubscriptionExample(
     try (AdminClient adminClient = AdminClient.create(adminClientSettings)) {
       Subscription response = adminClient.createSubscription(subscription).get();
       System.out.println(response.getAllFields() + " created successfully.");
-    } catch (AlreadyExistsException e) {
-      System.out.println(subscriptionPath + " already exists");
+    } catch (ExecutionException e) {
+      if (e.getCause() instanceof AlreadyExistsException) {
+        System.out.println(subscriptionPath + " already exists");
+      } else {
+        throw e;
+      }
     }
   }

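Both AdminUtils hunks apply the same fix: `Future.get()` surfaces failures wrapped in an `ExecutionException`, so an `AlreadyExistsException` thrown inside the admin call can only be detected by inspecting the cause, never by catching it directly. A minimal standalone sketch of that pattern, using `CompletableFuture` and a hypothetical exception class rather than the actual Pub/Sub Lite `AdminClient`:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class CauseUnwrapSketch {

  // Hypothetical stand-in for the client library's AlreadyExistsException.
  static class AlreadyExistsException extends RuntimeException {
    AlreadyExistsException(String message) {
      super(message);
    }
  }

  static String createResource(String path) throws ExecutionException, InterruptedException {
    // Simulate an admin call that fails because the resource already exists.
    CompletableFuture<String> call =
        CompletableFuture.failedFuture(new AlreadyExistsException(path));
    try {
      return call.get() + " created successfully.";
    } catch (ExecutionException e) {
      // get() wraps the real failure; inspect the cause before deciding.
      if (e.getCause() instanceof AlreadyExistsException) {
        return path + " already exists";
      }
      throw e; // unexpected failure: propagate
    }
  }

  public static void main(String[] args) throws Exception {
    System.out.println(createResource("projects/0/locations/us-central1-a/topics/demo"));
    // prints "projects/0/locations/us-central1-a/topics/demo already exists"
  }
}
```

The removed `catch (AlreadyExistsException e)` clause could never fire here, because `get()` always rethrows the underlying failure wrapped in `ExecutionException`.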
@@ -70,5 +70,7 @@ public static void main(String[] args) throws Exception {
     createSubscriptionExample(cloudRegion, zoneId, projectNumber, topicId, subscriptionId);

     publisherExample(cloudRegion, zoneId, projectNumber, topicId, words);
+
+    System.exit(0);
   }
 }
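The added `System.exit(0)` is worth a note: the JVM only terminates on its own once every non-daemon thread has finished, and client libraries commonly start non-daemon background threads, so a sample whose work is done can appear to hang after `main` returns. A minimal sketch of that behavior, where the sleeping worker is a stand-in for a library's background threads (not anything the Pub/Sub Lite client actually exposes):

```java
public class ExitSketch {
  public static void main(String[] args) {
    Thread worker =
        new Thread(
            () -> {
              try {
                Thread.sleep(60_000); // stand-in for library background work
              } catch (InterruptedException ignored) {
                Thread.currentThread().interrupt();
              }
            });
    // Threads are non-daemon by default, so without exit() the JVM
    // would keep running until this worker finishes (~60 seconds).
    worker.start();
    System.out.println("work done");
    System.exit(0); // terminate immediately instead of waiting on the worker
  }
}
```

Marking the worker with `worker.setDaemon(true)` before `start()` would be the alternative, but that is a decision for the library, not the sample, which is why the explicit exit is the simpler fix here.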
