Public Repository

Last pushed: 21 days ago
Short Description
Jupyter Notebook Python, Scala, R, Spark, Mesos Stack from https://github.com/jupyter/docker-stacks
Full Description

Jupyter Notebook Python, Scala, R, Spark, Mesos Stack

An image from the jupyter/docker-stacks project, a set of opinionated stacks of ready-to-run Jupyter applications in Docker.

See the all-spark-notebook README and Dockerfile for information about the contents and options of this particular image. Also, see the docker-stacks README for details about the project, including how these images are built and versioned.
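To run the image as-is rather than building it, the usual docker-stacks pattern is to pull it and start it with the notebook port published. A minimal sketch (the container name is illustrative):

docker pull jupyter/all-spark-notebook
docker run -d --name my-notebook -p 8888:8888 jupyter/all-spark-notebook

The notebook server should then be reachable at http://localhost:8888.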

Docker Pull Command
docker pull jupyter/all-spark-notebook
Owner
jupyter

Comments (2)
parente
a year ago

@mioalter The Dockerfile is in the linked GitHub repository. We had all sorts of trouble turning this into an automated build on Docker Hub because of the dependencies among images. The GitHub project has a Makefile that automates local building if you want to reproduce the images yourself.
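Roughly, a local rebuild looks like this, assuming the Makefile's build/<stack> pattern target:

git clone https://github.com/jupyter/docker-stacks.git
cd docker-stacks
make build/all-spark-notebook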

mioalter
a year ago

This looks fantastic!
It would be great if the Dockerfile were on Docker Hub.
When I try to build this image, I get stuck at the following error:
Step 1 : FROM jupyter/minimal-notebook
---> 6c38122fe99a
Step 2 : MAINTAINER Jupyter Project jupyter@googlegroups.com
---> Using cache
---> 3cc6cda94483
Step 3 : USER root
---> Using cache
---> 5eb7f8842adc
Step 4 : ENV APACHE_SPARK_VERSION 1.5.1
---> Using cache
---> a9919dd773b2
Step 5 : RUN apt-get -y update && apt-get install -y --no-install-recommends openjdk-7-jre-headless && apt-get clean
---> Using cache
---> f69d9eb3603b
Step 6 : RUN wget -qO - http://d3kbcqa49mib13.cloudfront.net/spark-${APACHE_SPARK_VERSION}-bin-hadoop2.6.tgz | tar -xz -C /usr/local/
---> Using cache
---> c015ee88ca49
Step 7 : RUN cd /usr/local && ln -s spark-${APACHE_SPARK_VERSION}-bin-hadoop2.6 spark
---> Using cache
---> 95dedcad5773
Step 8 : RUN apt-key adv --keyserver keyserver.ubuntu.com --recv E56151BF && DISTRO=debian && CODENAME=wheezy && echo "deb http://repos.mesosphere.io/${DISTRO} ${CODENAME} main" > /etc/apt/sources.list.d/mesosphere.list && apt-get -y update && apt-get --no-install-recommends -y --force-yes install mesos=0.22.1-1.0.debian78 && apt-get clean
---> Using cache
---> d9fe9bd4eb4b
Step 9 : RUN cd /tmp && echo deb http://dl.bintray.com/sbt/debian / > /etc/apt/sources.list.d/sbt.list && apt-get update && git clone https://github.com/ibm-et/spark-kernel.git && apt-get install -yq --force-yes --no-install-recommends sbt && cd spark-kernel && git checkout 3905e47815 && make dist SHELL=/bin/bash && mv dist/spark-kernel /opt/spark-kernel && chmod +x /opt/spark-kernel && rm -rf ~/.ivy2 && rm -rf ~/.sbt && rm -rf /tmp/spark-kernel && apt-get remove -y sbt && apt-get clean
---> Using cache
---> dc97ef43f5cb
Step 10 : ENV SPARK_HOME /usr/local/spark
---> Using cache
---> e89ecdb62e44
Step 11 : ENV R_LIBS_USER $SPARK_HOME/R/lib
---> Using cache
---> e57039f37b81
Step 12 : ENV PYTHONPATH $SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip
---> Using cache
---> b856721f185c
Step 13 : ENV MESOS_NATIVE_LIBRARY /usr/local/lib/libmesos.so
---> Using cache
---> 4f07edfa277c
Step 14 : ENV SPARK_OPTS --driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info
---> Using cache
---> 6d6b91448b7c
Step 15 : RUN apt-get update && apt-get install -y --no-install-recommends fonts-dejavu gfortran gcc && apt-get clean
---> Using cache
---> af6c107a9f07
Step 16 : USER jovyan
---> Using cache
---> 7849f8861351
Step 17 : RUN conda install --yes 'ipywidgets=4.0' 'pandas=0.17' 'matplotlib=1.4' 'scipy=0.16' 'seaborn=0.6' 'scikit-learn=0.16' && conda clean -yt
---> Using cache
---> 257df174a09a
Step 18 : RUN conda create -p $CONDA_DIR/envs/python2 python=2.7 'ipython=4.0' 'ipywidgets=4.0' 'pandas=0.17' 'matplotlib=1.4' 'scipy=0.16' 'seaborn=0.6' 'scikit-learn=0.16' pyzmq && conda clean -yt
---> Using cache
---> 1ab25717e1f5
Step 19 : RUN conda config --add channels r
---> Using cache
---> eca87fb3d2bf
Step 20 : RUN conda install --yes 'r-base=3.2' 'r-irkernel=0.5' 'r-ggplot2=1.0' 'r-rcurl=1.95*' && conda clean -yt
---> Using cache
---> eb9897a1e871
Step 21 : RUN mkdir -p /opt/conda/share/jupyter/kernels/scala
---> Using cache
---> fbba6c7db9ae
Step 22 : COPY kernel.json /opt/conda/share/jupyter/kernels/scala/
stat kernel.json: no such file or directory
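The stat error at Step 22 means kernel.json is not in the build context: docker build can only COPY files that sit alongside the Dockerfile it was given, and in the repository kernel.json lives next to the all-spark-notebook Dockerfile. A sketch of a build run from that directory, which avoids the error (the -t tag is illustrative):

git clone https://github.com/jupyter/docker-stacks.git
cd docker-stacks/all-spark-notebook
docker build -t my/all-spark-notebook .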