Renaming all Savanna references to Sahara
WARNING: Before merging this commit the alias for http://sahara-files.mirantis.com needs to be in place. Also before merging this commit the new OpenStack git project must be available at https://git.openstack.org/openstack/sahara-image-elements/

NOTE: The file 'elements/hadoop-hdp/source-repository-hadoopswift' contains a link to the HortonWorks repository that holds the Hadoop Swift rpm; this link needs to be updated when HortonWorks makes the change.

Implements: blueprint savanna-renaming-image-elements
Change-Id: Icb9a992f8545535af3a111580ce7c9622d754c67
parent b532fc8105
commit 0e1241d923
@@ -1,11 +1,11 @@
-Savanna Style Commandments
+Sahara Style Commandments
 ==========================
 
 - Step 1: Read the OpenStack Style Commandments
   http://docs.openstack.org/developer/hacking/
 - Step 2: Read on
 
-Savanna Specific Commandments
+Sahara Specific Commandments
 -----------------------------
 
 None so far
@@ -1,7 +1,7 @@
-Savanna image elements project
+Sahara image elements project
 ==============================
 
-This repo is a place for Savanna-related for diskimage-builder elements.
+This repo is a place for Sahara-related for diskimage-builder elements.
 
 Script for creating Fedora and Ubuntu cloud images with our elements and default parameters. You should only need to run this command:
 
@@ -9,4 +9,4 @@ Script for creating Fedora and Ubuntu cloud images with our elements and default
 
   sudo bash diskimage-create.sh
 
-Note: More information about script `diskimage-create <https://github.com/openstack/savanna-image-elements/blob/master/diskimage-create/README.rst>`_
+Note: More information about script `diskimage-create <https://github.com/openstack/sahara-image-elements/blob/master/diskimage-create/README.rst>`_
@@ -1,4 +1,4 @@
-We have CentOS 6.4 and 6.5 cloud images. Recommended is CentOS 6.5 (http://savanna-files.mirantis.com/CentOS-6.5-cloud-init.qcow2).
+We have CentOS 6.4 and 6.5 cloud images. Recommended is CentOS 6.5 (http://sahara-files.mirantis.com/CentOS-6.5-cloud-init.qcow2).
 
 For preparing your own CentOS cloud image with pre-installed cloud-init you should follow this guide:
 `CentOS cloud image. <http://docs.openstack.org/image-guide/content/centos-image.html>`_ Use the latest version of cloud-init package from `testing repository <http://pkgs.org/centos-6/epel-testing-i386/cloud-init-0.7.4-2.el6.noarch.rpm.html>`_
@@ -1,7 +1,7 @@
 Diskimage-builder script for creation cloud images
 ==================================================
 
-This script builds Ubuntu, Fedora, CentOS cloud images for use in Savanna. By default the all plugin are targeted, all images will be built. The '-p' option can be used to select plugin (vanilla, spark, idh or hdp). The '-i' option can be used to select image type (ubuntu, fedora or centos). The '-v' option can be used to select hadoop version (1, 2 or plain).
+This script builds Ubuntu, Fedora, CentOS cloud images for use in Sahara. By default the all plugin are targeted, all images will be built. The '-p' option can be used to select plugin (vanilla, spark, idh or hdp). The '-i' option can be used to select image type (ubuntu, fedora or centos). The '-v' option can be used to select hadoop version (1, 2 or plain).
 
 NOTE: You should use Ubuntu or Fedora host OS for building images, CentOS as a host OS has not been tested well.
 
@@ -13,31 +13,31 @@ For users:
 
 .. sourcecode:: bash
 
-  sudo bash savanna-image-elements/diskimage-create/diskimage-create.sh
+  sudo bash sahara-image-elements/diskimage-create/diskimage-create.sh
 
 3. If you want to use your local mirrors, you should specify http urls for Fedora and Ubuntu mirrors using parameters 'FEDORA_MIRROR' and 'UBUNTU_MIRROR' like this:
 
 .. sourcecode:: bash
 
-  sudo USE_MIRRORS=true FEDORA_MIRROR="url_for_fedora_mirror" UBUNTU_MIRROR="url_for_ubuntu_mirror" bash savanna-image-elements/diskimage-create/diskimage-create.sh
+  sudo USE_MIRRORS=true FEDORA_MIRROR="url_for_fedora_mirror" UBUNTU_MIRROR="url_for_ubuntu_mirror" bash sahara-image-elements/diskimage-create/diskimage-create.sh
 
 4. To select which plugin to target use the '-p' commandline option like this:
 
 .. sourcecode:: bash
 
-  sudo bash savanna-image-elements/diskimage-create/diskimage-create.sh -p [vanilla|spark|hdp|idh]
+  sudo bash sahara-image-elements/diskimage-create/diskimage-create.sh -p [vanilla|spark|hdp|idh]
 
 5. To select which hadoop version to target use the '-v' commandline option like this:
 
 .. sourcecode:: bash
 
-  sudo bash savanna-image-elements/diskimage-create/diskimage-create.sh -v [1|2|plain]
+  sudo bash sahara-image-elements/diskimage-create/diskimage-create.sh -v [1|2|plain]
 
 6. To select which image type to target use the '-i' commandline option like this:
 
 .. sourcecode:: bash
 
-  sudo bash savanna-image-elements/diskimage-create/diskimage-create.sh -i [ubuntu|fedora|centos]
+  sudo bash sahara-image-elements/diskimage-create/diskimage-create.sh -i [ubuntu|fedora|centos]
 
 NOTE for 4, 5, 6:
 
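The '-p', '-i' and '-v' flags documented above can be combined in a single invocation. How such flags are typically handled can be sketched with getopts; this is an illustration of the documented interface only, not the script's actual parser, and the function name is hypothetical:

```shell
#!/bin/bash
# Hypothetical re-creation of the documented option handling:
# -p plugin, -i image type, -v hadoop version.
parse_opts() {
    local OPTIND opt PLUGIN="" IMAGE_TYPE="" HADOOP_VERSION=""
    while getopts "p:i:v:" opt; do
        case $opt in
            p) PLUGIN="$OPTARG" ;;
            i) IMAGE_TYPE="$OPTARG" ;;
            v) HADOOP_VERSION="$OPTARG" ;;
        esac
    done
    echo "plugin=$PLUGIN image=$IMAGE_TYPE version=$HADOOP_VERSION"
}

parse_opts -p vanilla -i ubuntu -v 2
# prints: plugin=vanilla image=ubuntu version=2
```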
@@ -50,8 +50,8 @@ For developers:
 
 1. If you want to add your element to this repository, you should edit this script in your commit (you should export variables for your element and add name of element to variables 'element_sequence').
 
-2. If you want to test your Patch Set to savanna-image-elements or diskimage-builder, you can specify 'SIM_REPO_PATH' or 'DIB_REPO_PATH' (this parameters should be a full path to repositories) and run this script like this:
+2. If you want to test your Patch Set to sahara-image-elements or diskimage-builder, you can specify 'SIM_REPO_PATH' or 'DIB_REPO_PATH' (this parameters should be a full path to repositories) and run this script like this:
 
 .. sourcecode:: bash
 
-  sudo SIM_REPO_PATH="$(pwd)/savanna-image-elements" DIB_REPO_PATH="$(pwd)/diskimage-builder" bash savanna-image-elements/diskimage-create/diskimage-create.sh
+  sudo SIM_REPO_PATH="$(pwd)/sahara-image-elements" DIB_REPO_PATH="$(pwd)/diskimage-builder" bash sahara-image-elements/diskimage-create/diskimage-create.sh
@@ -90,21 +90,21 @@ popd
 
 export ELEMENTS_PATH="$DIB_REPO_PATH/elements"
 
-# savanna-image-elements repo
+# sahara-image-elements repo
 
 if [ -z $SIM_REPO_PATH ]; then
   SIM_REPO_PATH="$(dirname $base_dir)"
-  if [ $(basename $SIM_REPO_PATH) != "savanna-image-elements" ]; then
-    echo "Can't find Savanna-image-elements repository. Cloning it."
-    git clone https://git.openstack.org/openstack/savanna-image-elements
-    SIM_REPO_PATH="$(pwd)/savanna-image-elements"
+  if [ $(basename $SIM_REPO_PATH) != "sahara-image-elements" ]; then
+    echo "Can't find Sahara-image-elements repository. Cloning it."
+    git clone https://git.openstack.org/openstack/sahara-image-elements
+    SIM_REPO_PATH="$(pwd)/sahara-image-elements"
   fi
 fi
 
 ELEMENTS_PATH=$ELEMENTS_PATH:$SIM_REPO_PATH/elements
 
 pushd $SIM_REPO_PATH
-export SAVANNA_ELEMENTS_COMMIT_ID=`git rev-parse HEAD`
+export SAHARA_ELEMENTS_COMMIT_ID=`git rev-parse HEAD`
 popd
 
 #############################
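The repo-detection logic in this hunk boils down to: default SIM_REPO_PATH to the parent of the script's directory, and fall back to cloning only when that parent is not named sahara-image-elements. A runnable sketch of the same check, with the clone replaced by a marker string so nothing touches the network (function name and sample paths are illustrative):

```shell
#!/bin/bash
detect_sim_repo() {
    # $1 stands in for $base_dir (the directory containing diskimage-create.sh)
    local repo_path
    repo_path="$(dirname "$1")"
    if [ "$(basename "$repo_path")" != "sahara-image-elements" ]; then
        # the real script runs `git clone` here and uses $(pwd)/sahara-image-elements
        echo "clone-needed"
    else
        echo "$repo_path"
    fi
}

detect_sim_repo /src/sahara-image-elements/diskimage-create
# prints: /src/sahara-image-elements
```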
@@ -113,7 +113,7 @@ popd
 
 if [ -z "$PLUGIN" -o "$PLUGIN" = "vanilla" ]; then
   export JAVA_DOWNLOAD_URL=${JAVA_DOWNLOAD_URL:-"http://download.oracle.com/otn-pub/java/jdk/7u51-b13/jdk-7u51-linux-x64.tar.gz"}
-  export OOZIE_DOWNLOAD_URL=${OOZIE_DOWNLOAD_URL:-"http://savanna-files.mirantis.com/oozie-4.0.0.tar.gz"}
+  export OOZIE_DOWNLOAD_URL=${OOZIE_DOWNLOAD_URL:-"http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz"}
   export HIVE_VERSION=${HIVE_VERSION:-"0.11.0"}
 
   ubuntu_elements_sequence="base vm ubuntu hadoop swift_hadoop oozie mysql hive"
@@ -121,7 +121,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "vanilla" ]; then
   centos_elements_sequence="vm rhel hadoop swift_hadoop oozie mysql hive redhat-lsb"
 
   # Workaround for https://bugs.launchpad.net/diskimage-builder/+bug/1204824
-  # https://bugs.launchpad.net/savanna/+bug/1252684
+  # https://bugs.launchpad.net/sahara/+bug/1252684
   if [ -z "$IMAGE_TYPE" -o "$IMAGE_TYPE" = "centos" -o "$IMAGE_TYPE" = "fedora" ]; then
     if [ "$platform" = 'NAME="Ubuntu"' ]; then
       echo "**************************************************************"
@@ -146,13 +146,13 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "vanilla" ]; then
   if [ -z "$IMAGE_TYPE" -o "$IMAGE_TYPE" = "ubuntu" ]; then
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "1" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_1:-"1.2.1"}
-      export ubuntu_image_name="ubuntu_savanna_vanilla_hadoop_1_latest"
+      export ubuntu_image_name="ubuntu_sahara_vanilla_hadoop_1_latest"
       disk-image-create $ubuntu_elements_sequence -o $ubuntu_image_name
       mv $ubuntu_image_name.qcow2 ../
     fi
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "2" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_2:-"2.3.0"}
-      export ubuntu_image_name="ubuntu_savanna_vanilla_hadoop_2_latest"
+      export ubuntu_image_name="ubuntu_sahara_vanilla_hadoop_2_latest"
       disk-image-create $ubuntu_elements_sequence -o $ubuntu_image_name
       mv $ubuntu_image_name.qcow2 ../
     fi
@@ -162,13 +162,13 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "vanilla" ]; then
   if [ -z "$IMAGE_TYPE" -o "$IMAGE_TYPE" = "fedora" ]; then
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "1" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_1:-"1.2.1"}
-      export fedora_image_name="fedora_savanna_vanilla_hadoop_1_latest$suffix"
+      export fedora_image_name="fedora_sahara_vanilla_hadoop_1_latest$suffix"
       disk-image-create $fedora_elements_sequence -o $fedora_image_name
       mv $fedora_image_name.qcow2 ../
     fi
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "2" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_2:-"2.3.0"}
-      export fedora_image_name="fedora_savanna_vanilla_hadoop_2_latest$suffix"
+      export fedora_image_name="fedora_sahara_vanilla_hadoop_2_latest$suffix"
       disk-image-create $fedora_elements_sequence -o $fedora_image_name
       mv $fedora_image_name.qcow2 ../
     fi
@@ -182,16 +182,16 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "vanilla" ]; then
     export DIB_IMAGE_SIZE="10"
     # Read Create_CentOS_cloud_image.rst to know how to create CentOS image in qcow2 format
     export BASE_IMAGE_FILE="CentOS-6.5-cloud-init.qcow2"
-    export DIB_CLOUD_IMAGES="http://savanna-files.mirantis.com"
+    export DIB_CLOUD_IMAGES="http://sahara-files.mirantis.com"
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "1" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_1:-"1.2.1"}
-      export centos_image_name="centos_savanna_vanilla_hadoop_1_latest$suffix"
+      export centos_image_name="centos_sahara_vanilla_hadoop_1_latest$suffix"
       disk-image-create $centos_elements_sequence -n -o $centos_image_name
       mv $centos_image_name.qcow2 ../
     fi
     if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "2" ]; then
       export DIB_HADOOP_VERSION=${DIB_HADOOP_VERSION_2:-"2.3.0"}
-      export centos_image_name="centos_savanna_vanilla_hadoop_2_latest$suffix"
+      export centos_image_name="centos_sahara_vanilla_hadoop_2_latest$suffix"
       disk-image-create $centos_elements_sequence -n -o $centos_image_name
       mv $centos_image_name.qcow2 ../
     fi
@@ -207,7 +207,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "spark" ]; then
   echo "For spark plugin options -i and -v are ignored"
 
   export DIB_HADOOP_VERSION="2.0.0-mr1-cdh4.5.0"
-  export ubuntu_image_name="ubuntu_savanna_spark_latest"
+  export ubuntu_image_name="ubuntu_sahara_spark_latest"
 
   ubuntu_elements_sequence="base vm ubuntu hadoop-cdh spark"
 
@@ -237,7 +237,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "hdp" ]; then
   # - Disable including 'base' element for CentOS
   # - Export link and filename for CentOS cloud image to download
   export BASE_IMAGE_FILE="CentOS-6.4-cloud-init.qcow2"
-  export DIB_CLOUD_IMAGES="http://savanna-files.mirantis.com"
+  export DIB_CLOUD_IMAGES="http://sahara-files.mirantis.com"
 
   # Each image has a root login, password is "hadoop"
   export DIB_PASSWORD="hadoop"
@@ -246,7 +246,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "hdp" ]; then
   if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "1" ]; then
     export centos_image_name_hdp_1_3="centos-6_4-64-hdp-1-3"
     # Elements to include in an HDP-based image
-    centos_elements_sequence="vm rhel hadoop-hdp redhat-lsb root-passwd savanna-version source-repositories yum"
+    centos_elements_sequence="vm rhel hadoop-hdp redhat-lsb root-passwd sahara-version source-repositories yum"
     # generate image with HDP 1.3
     export DIB_HDP_VERSION="1.3"
     disk-image-create $centos_elements_sequence -n -o $centos_image_name_hdp_1_3
@@ -256,7 +256,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "hdp" ]; then
   if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "2" ]; then
     export centos_image_name_hdp_2_0="centos-6_4-64-hdp-2-0"
     # Elements to include in an HDP-based image
-    centos_elements_sequence="vm rhel hadoop-hdp redhat-lsb root-passwd savanna-version source-repositories yum"
+    centos_elements_sequence="vm rhel hadoop-hdp redhat-lsb root-passwd sahara-version source-repositories yum"
     # generate image with HDP 2.0
     export DIB_HDP_VERSION="2.0"
     disk-image-create $centos_elements_sequence -n -o $centos_image_name_hdp_2_0
@@ -266,7 +266,7 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "hdp" ]; then
   if [ -z "$HADOOP_VERSION" -o "$HADOOP_VERSION" = "plain" ]; then
     export centos_image_name_plain="centos-6_4-64-plain"
     # Elements for a plain CentOS image that does not contain HDP or Apache Hadoop
-    centos_plain_elements_sequence="vm rhel redhat-lsb root-passwd savanna-version yum"
+    centos_plain_elements_sequence="vm rhel redhat-lsb root-passwd sahara-version yum"
     # generate plain (no Hadoop components) image for testing
     disk-image-create $centos_plain_elements_sequence -n -o $centos_image_name_plain
     mv $centos_image_name_plain.qcow2 ../
@@ -283,8 +283,8 @@ if [ -z "$PLUGIN" -o "$PLUGIN" = "idh" ]; then
 
   export DIB_IMAGE_SIZE="10"
   export BASE_IMAGE_FILE="CentOS-6.4-cloud-init.qcow2"
-  export DIB_CLOUD_IMAGES="http://savanna-files.mirantis.com"
-  export centos_image_name_idh="centos_savanna_idh_latest"
+  export DIB_CLOUD_IMAGES="http://sahara-files.mirantis.com"
+  export centos_image_name_idh="centos_sahara_idh_latest"
 
   centos_elements_sequence="vm rhel hadoop-idh"
 
@@ -20,11 +20,11 @@ Steps how to create cloud image with Apache Hadoop installed using diskimage-bui
   chmod 440 /etc/sudoers.d/img-build-sudoers
   chown root:root /etc/sudoers.d/img-build-sudoers
 
-5. Export savanna-elements commit id variable (from savanna-extra directory):
+5. Export sahara-elements commit id variable (from sahara-extra directory):
 
 .. sourcecode:: bash
 
-  export SAVANNA_ELEMENTS_COMMIT_ID=`git show --format=%H | head -1`
+  export SAHARA_ELEMENTS_COMMIT_ID=`git show --format=%H | head -1`
 
 6. Move elements/ directory to disk-image-builder/elements/
 
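Step 5 above and the diskimage-create.sh hunk earlier compute the same value two different ways: `git show --format=%H | head -1` and `git rev-parse HEAD` both print the hash of the current HEAD commit. A small sketch of the equivalence (function name is illustrative):

```shell
#!/bin/bash
current_commit() {
    # Same value the docs obtain with `git show --format=%H | head -1`,
    # spelled with the more direct plumbing command.
    git rev-parse HEAD
}
```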
@@ -62,7 +62,7 @@ In this command 'DIB_HADOOP_VERSION' parameter is version of hadoop needs to be
 You can use 'JAVA_DOWNLOAD_URL' parameter to specify download link for JDK (tarball or bin).
 'DIB_IMAGE_SIZE' is parameter that specifes a volume of hard disk of instance. You need to specify it because Fedora and CentOS don't use all available volume.
 If you have already downloaded the jdk package, move it to "elements/hadoop/install.d/" and use its filename as 'JAVA_FILE' parameter.
-In order of working EDP components with Savanna DIB images you need pre-installed Oozie libs.
+In order of working EDP components with Sahara DIB images you need pre-installed Oozie libs.
 Use OOZIE_DOWNLOAD_URL to specify link to Oozie archive (tar.gz). For example we have built Oozie libs here:
-http://savanna-files.mirantis.com/oozie-4.0.0.tar.gz
+http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz
 If you have already downloaded archive, move it to "elements/oozie/install.d/" and use its filename as 'OOZIE_FILE' parameter.
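Tunables like OOZIE_DOWNLOAD_URL reach the script through ordinary `${VAR:-default}` expansion, as the diskimage-create.sh hunk earlier shows: an exported value wins, otherwise the built-in default applies. A minimal sketch of that pattern (the wrapper function is illustrative; the default URL is the one from the text above):

```shell
#!/bin/bash
oozie_url() {
    # Exported OOZIE_DOWNLOAD_URL takes precedence; otherwise fall back
    # to the default archive named in the documentation.
    echo "${OOZIE_DOWNLOAD_URL:-http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz}"
}
```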
@@ -1,3 +1,3 @@
 java
 ssh
-savanna-version
+sahara-version
@@ -9,7 +9,7 @@ Currently, the following versions of the Hortonworks Data Platform are supported
 
 The following script:
 
-  savanna-image-elements/diskimage-create/diskimage-create.sh
+  sahara-image-elements/diskimage-create/diskimage-create.sh
 
 is the default script to use for creating CentOS images with HDP installed/configured. This script can be used without modification, or can be used as an example to describe how a more customized script may be created with the "hadoop-hdp" diskimage-builder element.
 
@@ -1 +1 @@
-savanna-version
+sahara-version
@@ -22,7 +22,7 @@
 
 #
 # This script just changes the enabled flag to 0 to prevent yum install
-# from going out over the network. It allows Savanna to provision VMs
+# from going out over the network. It allows Sahara to provision VMs
 # in disconnected mode.
 #
 
@@ -20,7 +20,7 @@
 # once the HDP install has completed.
 #
 # This is a recommended network optimization for images used
-# by Savanna and OpenStack.
+# by Sahara and OpenStack.
 ##########################################################
 
 
@@ -2,7 +2,7 @@ Install the Intel Distribution Hadoop
 
 The following script:
 
-  savanna-image-elements/diskimage-create/diskimage-create.sh
+  sahara-image-elements/diskimage-create/diskimage-create.sh
 
 is the default script to use for creating CentOS images with IDH installed/configured.
 
@@ -1,3 +1,3 @@
 java
 ssh
-savanna-version
+sahara-version
@@ -24,7 +24,7 @@ function firstboot_common()
       done
       chown -R $user:$user /home/$user
 
-      #TODO: configure iptables (https://bugs.launchpad.net/savanna/+bug/1195744)
+      #TODO: configure iptables (https://bugs.launchpad.net/sahara/+bug/1195744)
       iptables -F
       ;;
     CentOS )
@@ -1 +1 @@
-savanna-version
+sahara-version
new executable file (10 lines): elements/sahara-version/install.d/01-sahara-version
@@ -0,0 +1,10 @@
+#!/bin/bash
+
+if [ -z "$SAHARA_ELEMENTS_COMMIT_ID" -o -z "$DIB_COMMIT_ID" ]
+then
+    echo "Both SAHARA_ELEMENTS_COMMIT_ID and DIB_COMMIT_ID must be specified, exiting"
+    exit 3
+else
+    echo -e "Sahara-elements-extra commit id: $SAHARA_ELEMENTS_COMMIT_ID,
+Diskimage-builder commit id: $DIB_COMMIT_ID" > /etc/sahara-extra.version
+fi
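The new element refuses to run unless both commit ids are exported, exiting with status 3. Its guard can be exercised in isolation with a sketch like this: the element's body wrapped in a function, `exit` swapped for `return`, and the hard-coded /etc path made a parameter so it runs anywhere (those three changes are the only departures from the element above):

```shell
#!/bin/bash
write_version_stamp() {
    # $1: output file (the element itself writes /etc/sahara-extra.version)
    if [ -z "$SAHARA_ELEMENTS_COMMIT_ID" -o -z "$DIB_COMMIT_ID" ]
    then
        echo "Both SAHARA_ELEMENTS_COMMIT_ID and DIB_COMMIT_ID must be specified, exiting"
        return 3
    else
        echo -e "Sahara-elements-extra commit id: $SAHARA_ELEMENTS_COMMIT_ID,
Diskimage-builder commit id: $DIB_COMMIT_ID" > "$1"
    fi
}
```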
@@ -1,10 +0,0 @@
-#!/bin/bash
-
-if [ -z "$SAVANNA_ELEMENTS_COMMIT_ID" -o -z "$DIB_COMMIT_ID" ]
-then
-    echo "Both SAVANNA_ELEMENTS_COMMIT_ID and DIB_COMMIT_ID must be specified, exiting"
-    exit 3
-else
-    echo -e "Savanna-elements-extra commit id: $SAVANNA_ELEMENTS_COMMIT_ID,
-Diskimage-builder commit id: $DIB_COMMIT_ID" > /etc/savanna-extra.version
-fi
@@ -1,2 +1,2 @@
 This element installs an SSH server then configures it to be suitable
-for use with Savanna
+for use with Sahara
@@ -5,10 +5,10 @@ install-packages wget
 if [[ "$DIB_HADOOP_VERSION" < "2.0.0" ]];
 then
   HDFS_LIB_DIR="/usr/share/hadoop/lib"
-  SWIFT_LIB_URI="http://savanna-files.mirantis.com/hadoop-swift/hadoop-swift-latest.jar"
+  SWIFT_LIB_URI="http://sahara-files.mirantis.com/hadoop-swift/hadoop-swift-latest.jar"
 else
   HDFS_LIB_DIR="/opt/hadoop/share/hadoop/common/lib"
-  #TODO(sreshetniak): make jar and upload to savanna-files
+  #TODO(sreshetniak): make jar and upload to sahara-files
   SWIFT_LIB_URI="https://repository.cloudera.com/artifactory/repo/org/apache/hadoop/hadoop-openstack/2.3.0/hadoop-openstack-2.3.0.jar"
 fi
 
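One caveat worth noting about the hunk above: `<` inside `[[ ]]` compares strings lexicographically, not numerically. That is fine for the 1.x/2.x versions used here, but a hypothetical "10.x" release would sort before "2.0.0". A bash sketch of the behavior (function name is illustrative):

```shell
#!/bin/bash
# Mirrors the element's version test; `<` in [[ ]] is a string comparison.
is_pre_2() {
    if [[ "$1" < "2.0.0" ]]; then echo yes; else echo no; fi
}

is_pre_2 10.0.0
# prints: yes  (lexicographic quirk: "1" sorts before "2")
```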
@@ -1,7 +1,7 @@
 [metadata]
-name = savanna-image-elements
+name = sahara-image-elements
 version = 2014.1
-summary = Image elements for Savanna
+summary = Image elements for Sahara
 description-file = README.rst
 license = Apache Software License
 classifiers =
@@ -11,11 +11,11 @@ classifiers =
     Operating System :: POSIX :: Linux
 author = OpenStack
 author-email = openstack-dev@lists.openstack.org
-home-page = https://savanna.readthedocs.org
+home-page = https://docs.openstack.org/developer/sahara
 
 [files]
 data_files =
-    share/savanna-elements = elements/*
+    share/sahara-elements = elements/*
 
 [wheel]
 universal = 1