mateiz edited this page Oct 25, 2012 · 78 revisions

Spark is a MapReduce-like cluster computing framework designed to support low-latency iterative jobs and interactive use from an interpreter. It is written in Scala, a high-level language for the JVM, and exposes a clean language-integrated syntax that makes it easy to write parallel jobs. Spark runs on top of the Apache Mesos cluster manager.
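As a sketch of that language-integrated syntax, here is a word-count job in the style of the Spark API of this era (the `spark.SparkContext` package name, the `"local"` master string, and the input path `"input.txt"` are assumptions for illustration):

```scala
// Hypothetical word-count sketch against the early (pre-Apache) Spark API.
import spark.SparkContext
import SparkContext._  // brings in implicit conversions such as reduceByKey

object WordCount {
  def main(args: Array[String]) {
    // "local" runs Spark in-process; on a cluster this would be a Mesos master URL.
    val sc = new SparkContext("local", "Word Count")
    val counts = sc.textFile("input.txt")       // one RDD element per line
                   .flatMap(_.split(" "))       // split lines into words
                   .map(word => (word, 1))      // pair each word with a count of 1
                   .reduceByKey(_ + _)          // sum counts per word, in parallel
    counts.collect().foreach(println)
  }
}
```

Transformations such as `flatMap` and `reduceByKey` look like ordinary Scala collection operations, but Spark distributes them across the cluster.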

Downloading

Get the latest Spark release from the Spark website: www.spark-project.org/downloads.html.

Documentation

You can find the latest Spark documentation at www.spark-project.org/documentation.html.

Community

To keep up with Spark development or get help, sign up for the spark-users mailing list.

If you're in the San Francisco Bay Area, there's a regular Spark meetup every few weeks. Come by to meet the developers and other users.

If you'd like to contribute code to Spark, read how to contribute.