sahara-image-elements/elements
Luigi Toscano 8cdff5d71f Update the links to artifacts (removing sahara-files too)
The canonical location for the artifacts going forward is
tarballs.openstack.org/sahara-extra/, so fix the links to use that
location and also use https.

Moreover, since the last tarball required for building images
is available on tarballs.openstack.org, remove the last references
to sahara-files for artifacts and documentation
(the new location had already been in use for a while in a few places).
There are still a few references to sahara-files,
but they are all about CentOS 6, which is no longer supported
by diskimage-builder, and should be removed separately.

Change-Id: Iab5a4d50a0abc6ab278837b6a9efd5e30f31c44a
2018-02-27 12:14:09 +01:00
ambari Disables CA checking for Ambari on Centos/RHEL 2018-02-13 10:43:11 +01:00
apt-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
centos-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
disable-firewall Remove the usage of 'which' (following dib) 2017-07-10 16:37:41 +02:00
extjs Update the links to artifacts (removing sahara-files too) 2018-02-27 12:14:09 +01:00
fedora-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
hadoop Revise s3_hadoop 2018-02-07 05:58:44 +00:00
hadoop-cdh Deprecate Spark 0.x and 1.0.x images 2015-07-17 09:28:26 +00:00
hadoop-cloudera Add support to create CDH 5.11 images 2017-07-25 17:26:40 +00:00
hadoop-mapr mapr: fix the discovery of the version of Scala 2017-07-14 18:12:37 +02:00
hdp-local-mirror Fix pep8 issues (environment file should not be executable) 2016-12-21 11:31:23 +04:00
hive Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
java Adding rhel7 to elements checks 2016-09-20 17:28:02 -03:00
kdc Update the links to artifacts (removing sahara-files too) 2018-02-27 12:14:09 +01:00
mysql Fix starting hive failure in Ubuntu xenial 2018-01-18 21:59:33 +08:00
nc include netcat package for centos images 2016-12-19 14:34:44 +00:00
nfs-shares NFS share utility installation 2015-07-28 13:07:13 -04:00
ntp Add elements for sync time on VM 2015-07-21 12:25:53 +00:00
oozie drop vanilla 2.6.0 support from elements 2016-08-22 17:41:05 +03:00
openjdk Build Xenial images for Vanilla and Storm 2017-07-19 19:49:17 +00:00
oracle-java Fix pep8 issues (environment file should not be executable) 2016-12-21 11:31:23 +04:00
root-passwd Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
s3_hadoop Revise s3_hadoop 2018-02-07 05:58:44 +00:00
sahara-version/root.d Remove the usage of 'which' (following dib) 2017-07-10 16:37:41 +02:00
spark Adding Spark 2.2.0 2017-12-13 09:55:41 -03:00
ssh Fix: set the Fedora-specific ssh_config file for augeas 2017-04-27 15:23:07 +02:00
storm Build Xenial images for Vanilla and Storm 2017-07-19 19:49:17 +00:00
swift_hadoop Update the links to artifacts (removing sahara-files too) 2018-02-27 12:14:09 +01:00
xfs-tools Install xfsprogs for ability to formatting volumes in XFS FS 2015-08-18 08:50:53 +00:00
zookeeper update zookeeper download link 2016-09-09 08:07:31 +03:00
.gitignore Add a .gitignore. 2013-09-02 12:58:57 +04:00
README.rst Update the links to artifacts (removing sahara-files too) 2018-02-27 12:14:09 +01:00

Diskimage-builder tools for creating cloud images

Steps to create a cloud image with Apache Hadoop installed, using the diskimage-builder project:

  1. Clone the repository "https://github.com/openstack/diskimage-builder" locally. Note: make sure you have commit 43b96d91 in your clone; it provides a mapping for default-jre.
git clone https://github.com/openstack/diskimage-builder
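To check that this commit is present in your clone, something like the following should work (run inside the diskimage-builder clone):

cd diskimage-builder
git merge-base --is-ancestor 43b96d91 HEAD && echo "commit 43b96d91 is present"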
  2. Add the ~/diskimage-builder/bin/ directory to your path (for example, PATH=$PATH:/home/$USER/diskimage-builder/bin/).
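For example (assuming the clone is in your home directory, as above):

export PATH=$PATH:/home/$USER/diskimage-builder/bin/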
  3. Export the variable ELEMENTS_PATH=/home/$USER/diskimage-builder/elements/ in your .bashrc, then source it.
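One way to do this (a sketch; adjust the path if your clone lives elsewhere):

echo 'export ELEMENTS_PATH=/home/$USER/diskimage-builder/elements/' >> ~/.bashrc
source ~/.bashrc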
  4. Copy the file "img-build-sudoers" from ~/diskimage-builder/sudoers.d/ to your /etc/sudoers.d/:
sudo cp ~/diskimage-builder/sudoers.d/img-build-sudoers /etc/sudoers.d/
sudo chmod 440 /etc/sudoers.d/img-build-sudoers
sudo chown root:root /etc/sudoers.d/img-build-sudoers
  5. Export the sahara-elements commit id variable (from the sahara-extra directory):
export SAHARA_ELEMENTS_COMMIT_ID=`git show --format=%H | head -1`
  6. Move the elements/ directory to diskimage-builder/elements/:
mv elements/*  /path_to_disk_image_builder/diskimage-builder/elements/
  7. Export the DIB commit id variable (from the DIB directory):
export DIB_COMMIT_ID=`git show --format=%H | head -1`
  8. Call the following command to create a cloud image that is able to run on OpenStack:

8.1. Ubuntu cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz disk-image-create base vm hadoop oozie ubuntu root-passwd -o ubuntu_hadoop_1_2_1

8.2. Fedora cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie -o fedora_hadoop_1_2_1

Note: If you are building this image on an Ubuntu or Fedora 18 host, you should add the element 'selinux-permissive'.

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie selinux-permissive -o fedora_hadoop_1_2_1

In this command, the 'DIB_HADOOP_VERSION' parameter is the version of Hadoop to be installed. You can use the 'JAVA_DOWNLOAD_URL' parameter to specify the download link for the JDK (tarball or bin). 'DIB_IMAGE_SIZE' specifies the size of the instance's hard disk; you need to specify it because Fedora and CentOS do not use all of the available volume. If you have already downloaded the JDK package, move it to "elements/hadoop/install.d/" and use its filename as the 'JAVA_FILE' parameter.

For the EDP components to work with Sahara DIB images, you need pre-installed Oozie libraries. Use 'OOZIE_DOWNLOAD_URL' to specify the link to the Oozie archive (tar.gz). For example, the Oozie libraries for Hadoop 2.7.1 are available from: https://tarballs.openstack.org/sahara-extra/dist/oozie/oozie-4.2.0-hadoop-2.7.1.tar.gz If you have already downloaded the archive, move it to "elements/oozie/install.d/" and use its filename as the 'OOZIE_FILE' parameter.
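As an illustrative sketch, a Fedora build that downloads the JDK and Oozie instead of using local copies could look like the following; the JDK link is a placeholder and the Hadoop/Oozie version pairing is only an example, so substitute versions that your elements actually support:

JAVA_DOWNLOAD_URL=<link to a JDK tarball or .bin> DIB_HADOOP_VERSION=2.7.1 OOZIE_DOWNLOAD_URL=https://tarballs.openstack.org/sahara-extra/dist/oozie/oozie-4.2.0-hadoop-2.7.1.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie -o fedora_hadoop_2_7_1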