
Spark-optimizer


Optimize Apache Spark settings for cluster (i.e. YARN) runs.

Original source: http://c2fo.io/c2fo/spark/aws/emr/2016/07/06/apache-spark-config-cheatsheet/
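
The sizing follows the cheatsheet linked above: reserve a core and some memory per node for the OS and Hadoop daemons, pack fixed-size executors onto the remaining resources, split each executor's memory slot into heap and overhead, and derive parallelism from the total executor cores. A minimal sketch of that calculation (an illustration, not the package's actual code; the reserve and overhead fractions are assumptions, so the exact megabyte figures differ slightly from the tool's output):

```python
def calc_settings(vcpus, memory_mb, nodes, cores_per_executor=2, overhead_frac=0.15):
    """Cheatsheet-style Spark executor sizing (sketch only)."""
    usable_cores = vcpus - 1              # reserve 1 core per node for OS/daemons
    usable_memory = int(memory_mb * 0.9)  # reserve ~10% of memory per node

    executors_per_node = usable_cores // cores_per_executor
    total_executors = executors_per_node * nodes - 1  # one slot goes to the driver

    slot_mb = usable_memory // executors_per_node     # memory per executor slot
    overhead_mb = int(slot_mb * overhead_frac)        # off-heap overhead share
    heap_mb = slot_mb - overhead_mb                   # what spark.executor.memory gets

    return {
        'spark.executor.instances': str(total_executors),
        'spark.executor.cores': str(cores_per_executor),
        'spark.executor.memory': f'{heap_mb}m',
        'spark.executor.memoryOverhead': f'{overhead_mb}m',
        'spark.default.parallelism': str(total_executors * cores_per_executor * 2),
    }

# c4.4xlarge: 16 vCPUs, 30 GiB RAM; 4 nodes
print(calc_settings(16, 30 * 1024, 4))
```

With these inputs the executor count (27) and parallelism (108) match the example output below.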

Usage

Install:

$ virtualenv env
$ env/bin/pip install spark-optimizer

Dev install:

$ virtualenv env
$ env/bin/pip install -e .

Generate settings for c4.4xlarge with 4 nodes:

$ env/bin/spark-optimizer c4.4xlarge 4
Optimal numPartitions: 162 
{'spark.default.parallelism': '108',
 'spark.driver.cores': '2',
 'spark.driver.maxResultSize': '3481m',
 'spark.driver.memory': '3481m',
 'spark.driver.memoryOverhead': '614m',
 'spark.executor.cores': '2',
 'spark.executor.instances': '27',
 'spark.executor.memory': '3481m',
 'spark.executor.memoryOverhead': '614m'}
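
One way to consume the generated dict is to render it as `--conf` flags for `spark-submit`. A small helper (hypothetical, not part of the package; `job.py` is a placeholder application):

```python
# Settings as produced by spark-optimizer (abridged from the example above)
settings = {
    'spark.default.parallelism': '108',
    'spark.executor.cores': '2',
    'spark.executor.instances': '27',
    'spark.executor.memory': '3481m',
}

def to_submit_flags(settings):
    # Render each setting as a "--conf key=value" pair, sorted for stable output
    return " ".join(f"--conf {k}={v}" for k, v in sorted(settings.items()))

print(f"spark-submit {to_submit_flags(settings)} job.py")
```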

Update instance info:

$ env/bin/python spark_optimizer/emr_update.py
