mateiz edited this page Oct 25, 2012 · 78 revisions

Spark is a MapReduce-like cluster computing framework designed to support low-latency iterative jobs and interactive use from an interpreter. It is written in Scala, a high-level language for the JVM, and exposes a clean language-integrated syntax that makes it easy to write parallel jobs. Spark runs on top of the Apache Mesos cluster manager.
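To give a feel for the language-integrated style, here is a minimal sketch in plain Scala. It uses an ordinary local collection rather than a real Spark RDD (no cluster or SparkContext is assumed), but the chained higher-order operations — filter, map, and so on — are the same shape a Spark job would use:

```scala
// A local-collection analogy for Spark's language-integrated API.
// In a real Spark job, `lines` would be an RDD loaded through a
// SparkContext, and these operations would run in parallel on the
// cluster; here they run on a plain Scala Seq for illustration.
object LogMiningSketch {
  def main(args: Array[String]): Unit = {
    val lines = Seq("INFO start", "ERROR disk full", "INFO done", "ERROR timeout")

    // Same chained, collection-like style a Spark job would use:
    val errors = lines.filter(_.contains("ERROR"))

    println(errors.size)                                // 2
    println(errors.map(_.split(" ")(1)).mkString(","))  // disk,timeout
  }
}
```

The point is that a parallel job reads like ordinary Scala collection code; the framework, not the programmer, decides where each operation runs.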

Downloads and Documentation

Get the latest Spark release from the Spark website at www.spark-project.org/downloads.html. The documentation is available at www.spark-project.org/documentation.html.

Community

To keep up with Spark development or get help, sign up for the spark-users mailing list. If you're in the San Francisco Bay Area, there's also a regular Spark meetup every few weeks. Come by to meet the developers and other users.

To report issues or suggest improvements, post to the Spark issue tracker.

If you'd like to contribute code to Spark, read how to contribute.