sahara-image-elements/elements/hadoop-cloudera/package-installs.yaml
Daniele Venzano f376b0f480 Use Cloudera element for Spark HDFS
Update the Spark element so that, for Spark versions > 1.0, it uses the existing
hadoop-cloudera element for HDFS instead of the ad-hoc cloudera-cdh one. For Spark
1.0.2, CDH4 is still installed via the old hadoop-cdh element, since no precompiled
Spark binary for CDH5 is available.

This change also makes it possible to specify an arbitrary Spark version via the
new -s command-line switch, reducing the amount of code needed to support future
versions of Spark. The defaults are Spark 1.3.1 and CDH 5.3, a combination that
works well in our deployments.
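
For illustration, a possible invocation using the new switch when building a Spark
image (this assumes the usual diskimage-create.sh entry point and -p plugin switch;
exact flags may differ):

    # Build a Spark image, explicitly selecting the Spark version.
    # Omitting -s falls back to the defaults (Spark 1.3.1 on CDH 5.3).
    ./diskimage-create.sh -p spark -s 1.3.1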

A small change is needed in the cloudera element: when creating a Spark image,
only the HDFS packages have to be installed.

README files have been updated to clarify that the default versions are tested,
while other combinations are not. A reference to the SparkPlugin wiki page has been
added, pointing to a table of supported versions.

Change-Id: Ifc2a0c8729981e1e1df79b556a4c2e6bd1ba893a
Implements: blueprint support-spark-1-3
Depends-On: I8fa482b6d1d6abaa6633aec309a3ba826a8b7ebb
2015-07-10 14:12:37 +00:00

wget:
  phase: pre-install.d
ntp:
hadoop-hdfs-datanode:
hadoop-hdfs-namenode:
# other packages are installed conditionally in install.d/50-install-cloudera
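
To make the comment above concrete, here is a minimal sketch of what the conditional
logic in install.d/50-install-cloudera could look like when only the HDFS packages
are wanted for a Spark image. The DIB_SPARK_HDFS_ONLY flag and the package list are
illustrative assumptions, not the element's actual variables:

    #!/bin/bash
    # Hypothetical sketch: skip the full CDH install when building a Spark image.
    # DIB_SPARK_HDFS_ONLY is an assumed flag name, not the element's real variable.
    set -eu

    if [ "${DIB_SPARK_HDFS_ONLY:-0}" = "1" ]; then
        # The HDFS daemons are already declared in package-installs.yaml above,
        # so nothing more is needed for a Spark image.
        exit 0
    fi

    # Full CDH path: install the remaining Hadoop services.
    install-packages hadoop-yarn-resourcemanager hadoop-yarn-nodemanager \
        hadoop-mapreduce-historyserver hadoop-client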