Azure Event Hubs + Apache Spark Connector

This is the source code of the Azure Event Hubs Connector for Apache Spark.

Azure Event Hubs is a highly scalable publish-subscribe service that can ingest millions of events per second and stream them into multiple applications. Spark Streaming and Structured Streaming are scalable and fault-tolerant stream processing engines that allow users to process huge amounts of data using complex algorithms expressed with high-level functions like map, reduce, join, and window. This data can then be pushed to filesystems, databases, or even back to Event Hubs.

By making Event Hubs and Spark easier to use together, we hope this connector makes building scalable, fault-tolerant applications easier for our users.
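As a quick illustration, a minimal end-to-end pipeline might look roughly like the sketch below. This is a hedged sketch, not an authoritative example: the connection string and event hub name are placeholders, and the column names and API calls follow the connector's Structured Streaming integration guide (see Documentation below).

// Minimal sketch: read from Event Hubs, count events per one-minute window, write to the console
import org.apache.spark.eventhubs.{ ConnectionStringBuilder, EventHubsConf }
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder.appName("EventHubsExample").getOrCreate()
import spark.implicits._

val connectionString = ConnectionStringBuilder("{EVENT HUB CONNECTION STRING}")
  .setEventHubName("{EVENT HUB NAME}")
  .build

val ehConf = EventHubsConf(connectionString)

// Read events as a streaming DataFrame and decode the binary event body to a string
val counts = spark.readStream
  .format("eventhubs")
  .options(ehConf.toMap)
  .load()
  .select($"body".cast("string").as("body"), $"enqueuedTime")
  .groupBy(window($"enqueuedTime", "1 minute"))
  .count()

// Write the running counts to the console sink; this could equally be a filesystem, database, or another event hub
counts.writeStream
  .format("console")
  .outputMode("complete")
  .start()
  .awaitTermination()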

Latest Releases

Spark

Spark Version   Package Name                 Package Version
Spark 3.0       azure-eventhubs-spark_2.12   latest on Maven Central
Spark 2.4       azure-eventhubs-spark_2.11   latest on Maven Central
Spark 2.4       azure-eventhubs-spark_2.12   latest on Maven Central

Databricks

Databricks Runtime Version   Artifact Id                  Package Version
Databricks Runtime 8.X       azure-eventhubs-spark_2.12   latest on Maven Central
Databricks Runtime 7.X       azure-eventhubs-spark_2.12   latest on Maven Central
Databricks Runtime 6.X       azure-eventhubs-spark_2.11   latest on Maven Central

Roadmap

There is an open issue for each planned feature/enhancement.

FAQ

We maintain an FAQ; reach out to us on Gitter if you think anything needs to be added or clarified!

Usage

Linking

For Scala/Java applications using SBT/Maven project definitions, link your application with one of the artifacts below. Note: see Latest Releases to find the correct artifact for your version of Apache Spark (or Databricks)!

groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
version = 2.3.22

or

groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.12
version = 2.3.22
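For example, in an sbt build the Scala 2.12 artifact above would be referenced roughly as follows (adjust the artifact and version to match your Spark or Databricks runtime):

// sbt dependency for the Scala 2.12 artifact
libraryDependencies += "com.microsoft.azure" % "azure-eventhubs-spark_2.12" % "2.3.22"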

Documentation

Documentation for our connector can be found here. The integration guides there contain all the information you need to use this library.

If you're new to Apache Spark and/or Event Hubs, then we highly recommend reading their documentation first. You can read the Event Hubs documentation here, documentation for Spark Streaming here, and, last but not least, Structured Streaming here.
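As a taste of what the integration guides cover, the connector is configured through an EventHubsConf. The sketch below shows a few commonly used settings; the connection string is a placeholder, an existing SparkSession named spark is assumed, and the method names follow the Structured Streaming integration guide:

import org.apache.spark.eventhubs.{ ConnectionStringBuilder, EventHubsConf, EventPosition }

val connectionString = ConnectionStringBuilder("{EVENT HUB CONNECTION STRING}")
  .setEventHubName("{EVENT HUB NAME}")
  .build

// Commonly used settings: consumer group, starting position, and a per-trigger event cap
val ehConf = EventHubsConf(connectionString)
  .setConsumerGroup("$Default")
  .setStartingPosition(EventPosition.fromEndOfStream)
  .setMaxEventsPerTrigger(10000)

val df = spark.readStream.format("eventhubs").options(ehConf.toMap).load()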

Further Assistance

If you need additional assistance, please don't hesitate to ask! General questions and discussion should happen on our Gitter chat. Please open an issue for bug reports and feature requests; all feedback is welcome!

Contributing

If you'd like to help contribute (we'd love to have your help!), then go to our Contributor's Guide for more information.

Build Prerequisites

In order to build the connector from source, you need a Java SDK, Scala, and Maven installed.

More details on building from source and running tests can be found in our Contributor's Guide.

Build Command

// Builds jar and runs all tests
mvn clean package

// Builds jar, runs all tests, and installs jar to your local maven repository
mvn clean install
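
If you only want the jar without running the test suite, the standard Maven flag for skipping tests applies:

// Builds jar without running tests
mvn clean package -DskipTests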