Merge "Fix six typos on sahara documentation"

This commit is contained in:
Jenkins 2015-08-10 13:42:31 +00:00 committed by Gerrit Code Review
commit 9e373896e7
4 changed files with 6 additions and 6 deletions


@@ -157,7 +157,7 @@ Setting fixed IP address for VMware Fusion VM
     fixed-address 192.168.55.20;
 }
-6. Now quit all the VmWare Fusion applications and restart vmnet:
+6. Now quit all the VMware Fusion applications and restart vmnet:
 .. sourcecode:: console
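(The guide's console block that follows this hunk is not shown in the diff. As a hedged sketch only, assuming a default VMware Fusion install location for ``vmnet-cli``, the restart step typically looks like:

.. sourcecode:: console

    # Assumed default Fusion install path; adjust if Fusion lives elsewhere.
    $ sudo /Applications/VMware\ Fusion.app/Contents/Library/vmnet-cli --stop
    $ sudo /Applications/VMware\ Fusion.app/Contents/Library/vmnet-cli --start

This sketch is illustrative and not part of the change being merged.)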


@@ -77,7 +77,7 @@ machines simultaneously. Running in distributed mode allows sahara to
 offload intensive tasks to the engine processes while keeping the API
 process free to handle requests.
-For an expanded discussion of configuring sahara to run in distrbuted
+For an expanded discussion of configuring sahara to run in distributed
 mode please see the :ref:`distributed-mode-configuration` documentation.
 Hadoop HDFS High Availability
@@ -86,7 +86,7 @@ Hadoop HDFS High Availability
 Hadoop HDFS High Availability (HDFS HA) provides an architecture to ensure
 that HDFS will continue to work in the result of an active namenode failure.
 It uses 2 namenodes in an active/standby configuration to provide this
-availibility.
+availability.
 High availability is achieved by using a set of journalnodes and Zookeeper
 servers along with ZooKeeper Failover Controllers (ZKFC) and additional
@@ -180,7 +180,7 @@ Volume-to-instance locality
 Having an instance and an attached volume on the same physical host can
 be very helpful in order to achieve high-performance disk I/O operations.
 To achieve this, sahara provides access to the Block Storage
-volume intance locality functionality.
+volume instance locality functionality.
 For more information on using volume instance locality with sahara,
 please see the :ref:`volume_instance_locality_configuration`


@ -62,7 +62,7 @@ For more information about HDP images, refer to
https://github.com/openstack/sahara-image-elements. https://github.com/openstack/sahara-image-elements.
There are three VM images provided for use with the HDP Plugin, that can also There are three VM images provided for use with the HDP Plugin, that can also
be built using the tools available in sahara-image-elemnts: be built using the tools available in sahara-image-elements:
1. `sahara-juno-hdp-1.3.2-centos-6.5.qcow2 <http://sahara-files.mirantis.com/sahara-juno-hdp-1.3.2-centos-6.5.qcow2>`_: 1. `sahara-juno-hdp-1.3.2-centos-6.5.qcow2 <http://sahara-files.mirantis.com/sahara-juno-hdp-1.3.2-centos-6.5.qcow2>`_:
This image contains most of the requisite packages necessary for HDP This image contains most of the requisite packages necessary for HDP


@@ -40,7 +40,7 @@ To create vanilla images follow these steps:
 tox -e venv -- sahara-image-create
-Tox will create a virtualenv and install requried python packages in it,
+Tox will create a virtualenv and install required python packages in it,
 clone the repositories "https://github.com/openstack/diskimage-builder" and "https://github.com/openstack/sahara-image-elements" and export necessary parameters.
 * ``DIB_HADOOP_VERSION`` - version of Hadoop to install
 * ``JAVA_DOWNLOAD_URL`` - download link for JDK (tarball or bin)
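(For context only, not part of this change: the hunk above names ``DIB_HADOOP_VERSION`` and ``JAVA_DOWNLOAD_URL`` as parameters exported by the tox run. A minimal sketch of overriding them before invoking the documented command; the version number and JDK URL below are placeholders, not values from the sahara documentation:

.. sourcecode:: console

    # Placeholder values; substitute a real Hadoop version and JDK download link.
    $ export DIB_HADOOP_VERSION=2.7.1
    $ export JAVA_DOWNLOAD_URL=http://example.com/jdk-7u80-linux-x64.tar.gz
    $ tox -e venv -- sahara-image-create
)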