Spark environment

myspark.env

This helps you start Spark inside a Jupyter Notebook session.

To activate it:

source ~/myspark.env

The file contents look like this:

export SCALA_HOME=/usr/local/opt/scala/idea
alias spark=/usr/local/spark/bin/pyspark
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_111.jdk/Contents/Home

# PYSPARK_PYTHON must name the interpreter executable, not the bin directory
export PYSPARK_PYTHON=~/pe27/bin/python
export PYSPARK_DRIVER_PYTHON=~/pe27/bin/jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

Java Settings

You may also need to adjust your Java settings.

To activate it:

source ~/java.env

The file contents look like this:

export ANT_HOME=~/JAVA_ENV/bin
export PATH=$ANT_HOME:$PATH
JAVA_HOME=$(/usr/libexec/java_home)
export JAVA_HOME
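The `$( ... )` on the JAVA_HOME line is command substitution: the command runs when the file is sourced, and its output becomes the variable's value. A minimal sketch of the same pattern, using `echo` with a made-up path as a stand-in because `/usr/libexec/java_home` only exists on macOS:

```shell
# Demo env file: JAVA_HOME is set from a command's output at source time.
# The quoted EOF keeps $( ... ) literal in the file; it runs when sourced.
cat > /tmp/java_demo.env <<'EOF'
JAVA_HOME=$(echo /opt/jdk-demo)
export JAVA_HOME
EOF

source /tmp/java_demo.env
echo "$JAVA_HOME"   # prints: /opt/jdk-demo
```

On a real Mac, `/usr/libexec/java_home` prints the path of the current default JDK, so JAVA_HOME tracks whichever Java version is active without hard-coding a versioned path like the one in myspark.env.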