Move prepare-zanata-client to o-z-j

This role is currently in project-config where it is untested.  Move
it to o-z-j, and rename to prepare-zanata-client for consistency with
other names.  Add integration test.  We will migrate the jobs to the
new name when this merges.

Change-Id: I26ee7057b9ad70507959286acc17d9c7bc1471db
This commit is contained in:
Ian Wienand 2017-12-06 08:12:17 +11:00 committed by Andreas Jaeger
parent 39f95294fd
commit 3cdfbc6f67
17 changed files with 1915 additions and 1 deletions


@@ -0,0 +1,37 @@
Prepare zanata client use

.. note:: This role is only available for Debian based platforms
          currently.

**Role Variables**

.. zuul:rolevar:: zanata_api_credentials

   Complex argument which contains the Zanata credential information.
   It is expected that this argument comes from a `Secret`

   .. zuul:rolevar:: server_id

      This is the ID of the zanata server to use

   .. zuul:rolevar:: url

      The url to the zanata server

   .. zuul:rolevar:: username

      The username to use with the zanata server

   .. zuul:rolevar:: key

      The key to login with

.. zuul:rolevar:: zanata_client_version
   :default: 4.3.3

   The version of zanata client to install

.. zuul:rolevar:: zanata_client_checksum
   :default: 25368516c2c6b94a8ad3397317abf69c723f3ba47a4f0357a31a1e075dd6f810

   The expected SHA256 checksum of the zanata client
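The credentials variable above could be supplied from a Zuul secret along these lines (a hypothetical sketch; the secret name and all values are placeholders, not part of this change):

```yaml
# Hypothetical secret supplying zanata_api_credentials; every value
# here is illustrative only.
- secret:
    name: zanata_api_credentials
    data:
      server_id: translate
      url: https://translate.example.org/
      username: proposal-bot
      key: !encrypted/pkcs1-oaep
        - ...
```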


@@ -0,0 +1,3 @@
---
zanata_client_version: 4.3.3
zanata_client_checksum: 25368516c2c6b94a8ad3397317abf69c723f3ba47a4f0357a31a1e075dd6f810
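A downloaded client archive can be checked against a SHA256 value like `zanata_client_checksum` with `sha256sum -c`. A minimal sketch, using a stand-in file rather than the real client distribution:

```shell
# Sketch of SHA256 verification; the file is a stand-in for the
# actual zanata client archive.
tmpfile=$(mktemp)
printf 'stand-in for the zanata client archive\n' > "$tmpfile"
expected=$(sha256sum "$tmpfile" | cut -d' ' -f1)
# "sha256sum -c" exits non-zero (and prints FAILED) on a mismatch.
result=$(echo "$expected  $tmpfile" | sha256sum -c -)
echo "$result"
rm -f "$tmpfile"
```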


@@ -0,0 +1,203 @@
# Copyright (c) 2015 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from io import BytesIO
import json
import os
import re

from lxml import etree
import requests

try:
    import configparser
except ImportError:
    import ConfigParser as configparser

try:
    from urllib.parse import urljoin
except ImportError:
    from urlparse import urljoin


class IniConfig(object):
    """Object that stores zanata.ini configuration

    Read zanata.ini and make its values available.

    Attributes:
        inifile: The path to the ini file to load values from.
    """

    def __init__(self, inifile):
        self.inifile = inifile
        self._load_config()

    def _load_config(self):
        """Load configuration from the zanata.ini file

        Parses the ini file and stores its data.
        """
        if not os.path.isfile(self.inifile):
            raise ValueError('zanata.ini file not found.')
        config = configparser.ConfigParser()
        try:
            config.read(self.inifile)
        except configparser.Error:
            raise ValueError('zanata.ini could not be parsed, please check '
                             'format.')
        for item in config.items('servers'):
            item_type = item[0].split('.')[1]
            if item_type in ('username', 'key', 'url'):
                setattr(self, item_type, item[1])


class ZanataRestService(object):
    def __init__(self, zconfig, accept='application/xml',
                 content_type='application/xml', verify=True):
        self.url = zconfig.url
        if "charset" not in content_type:
            content_type = "%s;charset=utf8" % content_type
        self.headers = {'Accept': accept,
                        'Content-Type': content_type,
                        'X-Auth-User': zconfig.username,
                        'X-Auth-Token': zconfig.key}
        self.verify = verify

    def _construct_url(self, url_fragment):
        return urljoin(self.url, url_fragment)

    def query(self, url_fragment, raise_errors=True):
        request_url = self._construct_url(url_fragment)
        try:
            r = requests.get(request_url, verify=self.verify,
                             headers=self.headers)
        except requests.exceptions.ConnectionError:
            raise ValueError('Connection error')
        if raise_errors and r.status_code != 200:
            raise ValueError('Got status code %s for %s' %
                             (r.status_code, request_url))
        if raise_errors and not r.content:
            raise ValueError('Did not receive any data from %s' % request_url)
        return r

    def push(self, url_fragment, data):
        request_url = self._construct_url(url_fragment)
        try:
            return requests.put(request_url, verify=self.verify,
                                headers=self.headers, data=json.dumps(data))
        except requests.exceptions.ConnectionError:
            raise ValueError('Connection error')


class ProjectConfig(object):
    """Object that stores zanata.xml per-project configuration.

    Write out a zanata.xml file for the project given the supplied values.

    Attributes:
        zconfig (IniConfig): zanata.ini values
        xmlfile (str): path to zanata.xml to read or write
        rules (list): list of (pattern, rule) tuples
    """

    def __init__(self, zconfig, xmlfile, rules, verify, **kwargs):
        self.rest_service = ZanataRestService(zconfig, verify=verify)
        self.xmlfile = xmlfile
        self.rules = self._parse_rules(rules)
        for key, value in kwargs.items():
            setattr(self, key, value)
        self._create_config()

    def _get_tag_prefix(self, root):
        """XML utility method

        Get the namespace of the XML file so we can
        use it to search for tags.
        """
        return '{%s}' % etree.QName(root).namespace

    def _parse_rules(self, rules):
        """Parse a list of (pattern, rule) tuples.

        Returns a list of dictionaries with 'pattern' and 'rule' keys.
        """
        return [{'pattern': rule[0], 'rule': rule[1]} for rule in rules]

    def _create_config(self):
        """Create zanata.xml

        Use the supplied parameters to create zanata.xml by downloading
        a base version of the file and adding customizations.
        """
        xml = self._fetch_zanata_xml()
        self._add_configuration(xml)
        self._write_xml(xml)

    def _fetch_zanata_xml(self):
        """Get base zanata.xml

        Download a basic version of the configuration for the project
        using Zanata's REST API.
        """
        r = self.rest_service.query(
            '/rest/projects/p/%s/iterations/i/%s/config'
            % (self.project, self.version))
        project_config = r.content
        p = etree.XMLParser(remove_blank_text=True)
        try:
            xml = etree.parse(BytesIO(project_config), p)
        except etree.XMLSyntaxError:
            raise ValueError('Error parsing xml output')
        return xml

    def _add_configuration(self, xml):
        """Insert additional configuration

        Add locale mapping rules to the base zanata.xml retrieved from
        the server.

        Args:
            xml (etree): zanata.xml file contents
        """
        root = xml.getroot()
        s = etree.SubElement(root, 'src-dir')
        s.text = self.srcdir
        t = etree.SubElement(root, 'trans-dir')
        t.text = self.txdir
        rules = etree.SubElement(root, 'rules')
        for rule in self.rules:
            new_rule = etree.SubElement(rules, 'rule')
            new_rule.attrib['pattern'] = rule['pattern']
            new_rule.text = rule['rule']
        if self.excludes:
            excludes = etree.SubElement(root, 'excludes')
            excludes.text = self.excludes
        tag_prefix = self._get_tag_prefix(root)
        # Work around https://bugzilla.redhat.com/show_bug.cgi?id=1219624
        # by removing port number in URL if it's there
        url = root.find('%surl' % tag_prefix)
        url.text = re.sub(':443', '', url.text)

    def _write_xml(self, xml):
        """Write xml

        Write out xml to zanata.xml.
        """
        try:
            xml.write(self.xmlfile, pretty_print=True)
        except IOError:
            raise ValueError('Error writing zanata.xml.')
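The zanata.ini layout that `IniConfig._load_config` expects can be sketched as follows (the server name and values are illustrative). The loop mirrors the one in `_load_config`: the `[servers]` section uses dotted keys, and only the `username`, `key` and `url` entries are kept:

```python
# Sketch of parsing a minimal zanata.ini the way IniConfig does.
# The "default" server name and the values are illustrative only.
import configparser  # Python 3; the module falls back to ConfigParser on 2

SAMPLE_INI = """\
[servers]
default.url = https://translate.example.org/
default.username = proposal-bot
default.key = 0123456789abcdef
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE_INI)

settings = {}
for name, value in config.items('servers'):
    # Keys look like "<server_id>.<item_type>"; keep known item types.
    item_type = name.split('.')[1]
    if item_type in ('username', 'key', 'url'):
        settings[item_type] = value
```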


@@ -0,0 +1,779 @@
#!/bin/bash -xe
# Common code used by propose_translation_update.sh and
# upstream_translation_update.sh
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
SCRIPTSDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
source $SCRIPTSDIR/common.sh

# Set start of timestamp for subunit
TRANS_START_TIME=$(date +%s)
SUBUNIT_OUTPUT=testrepository.subunit
# Our images have os-testr installed in the following location
TESTR_VENV=/usr/os-testr-env

# Topic to use for our changes
TOPIC=zanata/translations

# Used for setup.py babel commands
QUIET="--quiet"

# Have invalid files been found?
INVALID_PO_FILE=0

# ERROR_ABORT signals whether the script aborts with failure, will be
# set to 0 on successful run.
ERROR_ABORT=1

# We need a UTF-8 locale, set it properly in case it's not set.
export LANG=en_US.UTF-8

trap "finish" EXIT

# Set up some branch dependent variables
function init_branch {
    local branch=$1

    # The calling environment puts upper-constraints.txt in our
    # working directory.
    # UPPER_CONSTRAINTS_FILE needs to be exported so that tox can use it
    # if needed.
    export UPPER_CONSTRAINTS_FILE=$(pwd)/upper-constraints.txt
    GIT_BRANCH=$branch
}
# Get a module name of a project
function get_modulename {
    local project=$1
    local target=$2

    python $SCRIPTSDIR/get-modulename.py -p $project -t $target
}

function finish {
    if [[ "$ERROR_ABORT" -eq 1 ]] ; then
        $TESTR_VENV/bin/generate-subunit $TRANS_START_TIME $SECONDS \
            'fail' $JOBNAME >> $SUBUNIT_OUTPUT
    else
        $TESTR_VENV/bin/generate-subunit $TRANS_START_TIME $SECONDS \
            'success' $JOBNAME >> $SUBUNIT_OUTPUT
    fi

    gzip -9 $SUBUNIT_OUTPUT

    # Only run this if VENV is setup.
    if [ "$VENV" != "" ] ; then
        # Delete temporary directories
        rm -rf $VENV
        VENV=""
    fi
    if [ "$HORIZON_ROOT" != "" ] ; then
        rm -rf $HORIZON_ROOT
        HORIZON_ROOT=""
    fi
}

# The ensure-babel and ensure-sphinx roles create a venv in ~/.venv containing
# the needed software.  However, it's possible we may want to switch that to
# using pip install --user and ~/.local instead of a venv, so make this
# compatible with either.
function setup_venv {
    if [ -d ~/.venv ] ; then
        source ~/.venv/bin/activate
    else
        # Ensure ~/.local/bin is in the path
        export PATH=~/.local/bin:$PATH
    fi
}
# Setup nodejs within the python venv.  Match the nodejs version with
# the one used in the nodejs6-npm jobs.
function setup_nodeenv {
    # The ensure-babel and ensure-sphinx roles create a venv in
    # ~/.venv containing the needed software.  However, it's possible
    # we may want to switch that to using pip install --user and
    # ~/.local instead of a venv, so make this compatible with either.
    NODE_VENV=~/.local/node_venv
    if [ -d ~/.venv ] ; then
        pip install nodeenv
        nodeenv --node 6.9.4 $NODE_VENV
    else
        pip install --user nodeenv
        ~/.local/bin/nodeenv --node 6.9.4 $NODE_VENV
    fi
    source $NODE_VENV/bin/activate
}

# Setup a project for Zanata.  This is used by both Python and Django projects.
# syntax: setup_project <project> <zanata_version>
function setup_project {
    local project=$1
    local version=$2

    # Exclude all dot-files, particularly for things such as .tox
    # and .venv
    local exclude='.*/**'

    python $SCRIPTSDIR/create-zanata-xml.py \
        -p $project -v $version --srcdir . --txdir . \
        -r '**/*.pot' '{path}/{locale_with_underscore}/LC_MESSAGES/(unknown).po' \
        -e "$exclude" -f zanata.xml
}

# Set global variable DocFolder for manuals projects
function init_manuals {
    project=$1

    DocFolder="doc"
    ZanataDocFolder="./doc"
    if [ $project = "api-site" -o $project = "security-doc" ] ; then
        DocFolder="./"
        ZanataDocFolder="."
    fi
}
# Setup project manuals projects (api-site, openstack-manuals,
# security-guide) for Zanata
function setup_manuals {
    local project=$1
    local version=${2:-master}

    # Fill in associative array SPECIAL_BOOKS
    declare -A SPECIAL_BOOKS
    source doc-tools-check-languages.conf

    # Grab all of the rules for the documents we care about
    ZANATA_RULES=

    # List of directories to skip.
    # All manuals have a source/common subdirectory that is a symlink
    # to doc/common in openstack-manuals.  We have to exclude this
    # source/common directory everywhere, only doc/common gets
    # translated.
    EXCLUDE='.*/**,**/source/common/**'

    # Generate pot one by one
    for FILE in ${DocFolder}/*; do
        # Skip non-directories
        if [ ! -d $FILE ]; then
            continue
        fi
        DOCNAME=${FILE#${DocFolder}/}
        # Ignore directories that will not get translated
        if [[ "$DOCNAME" =~ ^(www|tools|generated|publish-docs)$ ]]; then
            continue
        fi
        IS_RST=0
        if [ ${SPECIAL_BOOKS["${DOCNAME}"]+_} ] ; then
            case "${SPECIAL_BOOKS["${DOCNAME}"]}" in
                RST)
                    IS_RST=1
                    ;;
                skip)
                    EXCLUDE="$EXCLUDE,${DocFolder}/${DOCNAME}/**"
                    continue
                    ;;
            esac
        fi
        if [ ${IS_RST} -eq 1 ] ; then
            tox -e generatepot-rst -- ${DOCNAME}
            ZANATA_RULES="$ZANATA_RULES -r ${ZanataDocFolder}/${DOCNAME}/source/locale/${DOCNAME}.pot ${DocFolder}/${DOCNAME}/source/locale/{locale_with_underscore}/LC_MESSAGES/${DOCNAME}.po"
        else
            # Update the .pot file
            ./tools/generatepot ${DOCNAME}
            if [ -f ${DocFolder}/${DOCNAME}/locale/${DOCNAME}.pot ]; then
                ZANATA_RULES="$ZANATA_RULES -r ${ZanataDocFolder}/${DOCNAME}/locale/${DOCNAME}.pot ${DocFolder}/${DOCNAME}/locale/{locale_with_underscore}.po"
            fi
        fi
    done

    # Project setup and updating POT files for release notes.
    if [[ $project == "openstack-manuals" ]] && [[ $version == "master" ]]; then
        ZANATA_RULES="$ZANATA_RULES -r ./releasenotes/source/locale/releasenotes.pot releasenotes/source/locale/{locale_with_underscore}/LC_MESSAGES/releasenotes.po"
    fi

    python $SCRIPTSDIR/create-zanata-xml.py \
        -p $project -v $version --srcdir . --txdir . \
        $ZANATA_RULES -e "$EXCLUDE" \
        -f zanata.xml
}
# Setup a training-guides project for Zanata
function setup_training_guides {
    local project=training-guides
    local version=${1:-master}

    # Update the .pot file
    tox -e generatepot-training

    python $SCRIPTSDIR/create-zanata-xml.py \
        -p $project -v $version \
        --srcdir doc/upstream-training/source/locale \
        --txdir doc/upstream-training/source/locale \
        -f zanata.xml
}

# Setup an i18n project for Zanata
function setup_i18n {
    local project=i18n
    local version=${1:-master}

    # Update the .pot file
    tox -e generatepot

    python $SCRIPTSDIR/create-zanata-xml.py \
        -p $project -v $version \
        --srcdir doc/source/locale \
        --txdir doc/source/locale \
        -f zanata.xml
}

# Setup a ReactJS project for Zanata
function setup_reactjs_project {
    local project=$1
    local version=$2

    local exclude='node_modules/**'

    setup_nodeenv

    # Extract messages
    npm install
    npm run build
    # Transform them into .pot files
    npm run json2pot

    python $SCRIPTSDIR/create-zanata-xml.py \
        -p $project -v $version --srcdir . --txdir . \
        -r '**/*.pot' '{path}/{locale}.po' \
        -e "$exclude" -f zanata.xml
}
# Setup project so that git review works, sets global variable
# COMMIT_MSG.
function setup_review {
    # Note we cannot rely on the default branch in .gitreview being
    # correct so we are very explicit here.
    local branch=${1:-master}

    FULL_PROJECT=$(grep project .gitreview | cut -f2 -d= |sed -e 's/\.git$//')
    set +e
    read -d '' INITIAL_COMMIT_MSG <<EOF
Imported Translations from Zanata

For more information about this automatic import see:
https://docs.openstack.org/i18n/latest/reviewing-translation-import.html
EOF
    set -e
    git review -s

    # See if there is an open change in the zanata/translations
    # topic.  If so, get the change id for the existing change for use
    # in the commit msg.
    # Function setup_commit_message will set CHANGE_ID if a change
    # exists and will always set COMMIT_MSG.
    setup_commit_message $FULL_PROJECT proposal-bot $branch $TOPIC "$INITIAL_COMMIT_MSG"

    # Function check_already_approved will quit the proposal process if there
    # is already an approved job with the same CHANGE_ID
    check_already_approved $CHANGE_ID
}

# Propose patch using COMMIT_MSG
function send_patch {
    local branch=${1:-master}
    local output
    local ret
    local success=0

    # We don't have any repos storing zanata.xml, so just remove it.
    rm -f zanata.xml

    # Don't send a review if nothing has changed.
    if [ $(git diff --cached | wc -l) -gt 0 ]; then
        # Commit and review
        git commit -F- <<EOF
$COMMIT_MSG
EOF
        CUR_PATCH_ID=$(git show | git patch-id | awk '{print $1}')
        # Don't submit if we have the same patch id of previously submitted
        # patchset
        if [[ "${PREV_PATCH_ID}" != "${CUR_PATCH_ID}" ]]; then
            # Show current status of tree for debugging purpose
            git status

            # Do error checking manually to ignore one class of failure.
            set +e
            # We cannot rely on the default branch in .gitreview being
            # correct so we are very explicit here.
            output=$(git review -t $TOPIC $branch)
            ret=$?
            [[ "$ret" -eq 0 || "$output" =~ "No changes between prior commit" ]]
            success=$?
            set -e
        fi
    fi
    return $success
}
# Delete empty pot files
function check_empty_pot {
    local pot=$1

    # We don't need to add or send around empty source files.
    trans=$(msgfmt --statistics -o /dev/null ${pot} 2>&1)
    if [ "$trans" = "0 translated messages." ] ; then
        rm $pot
        # We previously had all pot files under version control, so
        # also remove the file from git if needed.
        git rm --ignore-unmatch $pot
    fi
}

# Run extract_messages for python projects.
function extract_messages_python {
    local modulename=$1

    local pot=${modulename}/locale/${modulename}.pot

    # In case this is an initial run, the locale directory might not
    # exist; create it, since extract_messages will fail if it does
    # not exist.
    mkdir -p ${modulename}/locale

    # Update the .pot files
    # The "_C" and "_P" prefix are for more-gettext-support blueprint,
    # "_C" for message with context, "_P" for plural form message.
    pybabel ${QUIET} extract \
        --add-comments Translators: \
        --msgid-bugs-address="https://bugs.launchpad.net/openstack-i18n/" \
        --project=${PROJECT} --version=${VERSION} \
        -k "_C:1c,2" -k "_P:1,2" \
        -o ${pot} ${modulename}
    check_empty_pot ${pot}
}
# Django projects need horizon installed for extraction, install it in
# our venv.  The function setup_venv needs to be called first.
function install_horizon {
    # TODO(mordred) Replace this with something else that uses the horizon
    # repo on disk
    HORIZON_ROOT=$(mktemp -d)

    # Checkout same branch of horizon as the project - including
    # same constraints.
    git clone --depth=1 --branch $GIT_BRANCH \
        https://git.openstack.org/openstack/horizon.git $HORIZON_ROOT/horizon
    (cd ${HORIZON_ROOT}/horizon && pip install -c $UPPER_CONSTRAINTS_FILE .)
    rm -rf $HORIZON_ROOT
    HORIZON_ROOT=""
}

# Extract messages for a django project, we need to update django.pot
# and djangojs.pot.
function extract_messages_django {
    local modulename=$1
    local pot

    KEYWORDS="-k gettext_noop -k gettext_lazy -k ngettext_lazy:1,2"
    KEYWORDS+=" -k ugettext_noop -k ugettext_lazy -k ungettext_lazy:1,2"
    KEYWORDS+=" -k npgettext:1c,2,3 -k pgettext_lazy:1c,2 -k npgettext_lazy:1c,2,3"

    for DOMAIN in djangojs django ; do
        if [ -f babel-${DOMAIN}.cfg ]; then
            mkdir -p ${modulename}/locale
            pot=${modulename}/locale/${DOMAIN}.pot
            touch ${pot}
            pybabel ${QUIET} extract -F babel-${DOMAIN}.cfg \
                --add-comments Translators: \
                --msgid-bugs-address="https://bugs.launchpad.net/openstack-i18n/" \
                --project=${PROJECT} --version=${VERSION} \
                $KEYWORDS \
                -o ${pot} ${modulename}
            check_empty_pot ${pot}
        fi
    done
}
# Extract doc messages
function extract_messages_doc {
    # Temporary build folder for gettext
    mkdir -p doc/build/gettext

    # Extract messages
    sphinx-build -b gettext doc/source \
        doc/build/gettext/
    # Manipulates pot translation sources if needed
    if [[ -f tools/doc-pot-filter.sh ]]; then
        tools/doc-pot-filter.sh
    fi

    # New translation target projects may not have locale folder
    mkdir -p doc/source/locale

    # Sphinx builds a pot file for each directory and for each file
    # in the top-level directory.
    # We keep the directory files and concatenate all top-level files.
    local has_other=0
    for f in doc/build/gettext/*.pot; do
        local fn=$(basename $f .pot)
        # If a pot file corresponds to a directory, we use the pot file
        # as-is.
        if [ -d doc/source/$fn ]; then
            # Remove UUIDs, those are not necessary and change too often
            msgcat --use-first --sort-by-file $f | \
                awk '$0 !~ /^\# [a-z0-9]+$/' > doc/source/locale/doc-$fn.pot
            rm $f
        else
            has_other=1
        fi
    done

    # We concatenate the remaining files into a single pot file so that
    # "git add ${DIRECTORY}/source/locale" will only add a
    # single pot file for all top-level files.
    if [ "$has_other" = "1" ]; then
        # Remove UUIDs, those are not necessary and change too often
        msgcat --use-first --sort-by-file doc/build/gettext/*.pot | \
            awk '$0 !~ /^\# [a-z0-9]+$/' > doc/source/locale/doc.pot
    fi

    rm -rf doc/build/gettext/
}

# Extract releasenotes messages
function extract_messages_releasenotes {
    local keep_workdir=$1

    # Extract messages
    $HOME/.venv/bin/sphinx-build -b gettext -d releasenotes/build/doctrees \
        releasenotes/source releasenotes/work
    rm -rf releasenotes/build
    # Concatenate messages into one POT file
    mkdir -p releasenotes/source/locale/
    msgcat --sort-by-file releasenotes/work/*.pot \
        > releasenotes/source/locale/releasenotes.pot
    if [ -z "$keep_workdir" ]; then
        rm -rf releasenotes/work
    fi
}
# Check releasenote translation progress per language.
# It checks the progress per release.  Add the release note translation
# if at least one release is well translated (>= 75%).
# Keep the release note translation in the git repository
# if at least one release is translated >= 40%.
# Otherwise (< 40%) the translations are removed.
#
# NOTE: this function assumes POT files in releasenotes/work
# extracted by extract_messages_releasenotes().
# The workdir should be cleaned up by the caller.
function check_releasenotes_per_language {
    local lang_po=$1

    # The expected PO location is
    # releasenotes/source/locale/<lang>/LC_MESSAGES/releasenotes.po.
    # Extract language name from 4th component.
    local lang
    lang=$(echo $lang_po | cut -d / -f 4)

    local release_pot
    local release_name
    local workdir=releasenotes/work

    local has_high_thresh=0
    local has_low_thresh=0

    mkdir -p $workdir/$lang
    for release_pot in $(find $workdir -name '*.pot'); do
        release_name=$(basename $release_pot .pot)
        # The index file usually contains a small number of words,
        # so we skip checking it.
        if [ $release_name = "index" ]; then
            continue
        fi
        msgmerge --quiet -o $workdir/$lang/$release_name.po $lang_po $release_pot
        check_po_file $workdir/$lang/$release_name.po
        if [ $RATIO -ge 75 ]; then
            has_high_thresh=1
            has_low_thresh=1
        fi
        if [ $RATIO -ge 40 ]; then
            has_low_thresh=1
        fi
    done

    if ! git ls-files | grep -xq $lang_po; then
        if [ $has_high_thresh -eq 0 ]; then
            rm -f $lang_po
        fi
    else
        if [ $has_low_thresh -eq 0 ]; then
            git rm -f --ignore-unmatch $lang_po
        fi
    fi
}
# Filter out files that we do not want to commit.
# Sets global variable INVALID_PO_FILE to 1 if any invalid files are
# found.
function filter_commits {
    local ret

    # Don't add new empty files.
    for f in $(git diff --cached --name-only --diff-filter=A); do
        case "$f" in
            *.po)
                # Files should have at least one non-empty msgid string.
                if ! grep -q 'msgid "[^"]' "$f" ; then
                    git reset -q "$f"
                    rm "$f"
                fi
                ;;
            *.json)
                # JSON files fail the msgid test.  Ignore the locale key and
                # confirm there are string keys in the messages dictionary
                # itself.
                if ! grep -q '"[^"].*":\s*"' "$f" ; then
                    git reset -q "$f"
                    rm "$f"
                fi
                ;;
            *)
                # Anything else is not a translation file, remove it.
                git reset -q "$f"
                rm "$f"
                ;;
        esac
    done

    # Don't send files where the only things which have changed are
    # the creation date, the version number, the revision date,
    # name of last translator, comment lines, or diff file information.
    REAL_CHANGE=0
    # Always remove obsolete log level translations.
    for f in $(git diff --cached --name-only --diff-filter=D); do
        if [[ $f =~ -log-(critical|error|info|warning).po$ ]] ; then
            REAL_CHANGE=1
        fi
    done
    # Don't iterate over deleted files
    for f in $(git diff --cached --name-only --diff-filter=AM); do
        # It's ok if the grep fails
        set +e
        REGEX="(POT-Creation-Date|Project-Id-Version|PO-Revision-Date|Last-Translator|X-Generator|Generated-By)"
        changed=$(git diff --cached "$f" \
            | egrep -v "$REGEX" \
            | egrep -c "^([-+][^-+#])")
        added=$(git diff --cached "$f" \
            | egrep -v "$REGEX" \
            | egrep -c "^([+][^+#])")
        set -e
        # Check that imported po files are valid
        if [[ $f =~ .po$ ]] ; then
            set +e
            msgfmt --check-format -o /dev/null $f
            ret=$?
            set -e
            if [ $ret -ne 0 ] ; then
                # Set changed to zero so that the next expression reverts
                # the change of this file.
                changed=0
                echo "ERROR: File $f is an invalid po file."
                echo "ERROR: The file has not been imported and needs fixing!"
                INVALID_PO_FILE=1
            fi
        fi
        if [ $changed -eq 0 ]; then
            git reset -q "$f"
            git checkout -- "$f"
        # We will take this import if at least one change adds new content,
        # thus adding a new translation.
        # If only lines are removed, we do not need to generate an import.
        elif [ $added -gt 0 ] ; then
            REAL_CHANGE=1
        fi
    done

    # If no file has any real change, revert all changes.
    if [ $REAL_CHANGE -eq 0 ] ; then
        # New files need to be handled differently
        for f in $(git diff --cached --name-only --diff-filter=A) ; do
            git reset -q -- "$f"
            rm "$f"
        done
        for f in $(git diff --cached --name-only) ; do
            git reset -q -- "$f"
            git checkout -- "$f"
        done
    fi
}
# Check the amount of translation done for a .po file, sets global variable
# RATIO.
function check_po_file {
    local file=$1
    local dropped_ratio=$2

    trans=$(msgfmt --statistics -o /dev/null "$file" 2>&1)
    check="^0 translated messages"
    if [[ $trans =~ $check ]] ; then
        RATIO=0
    else
        if [[ $trans =~ " translated message" ]] ; then
            trans_no=$(echo $trans|sed -e 's/ translated message.*$//')
        else
            trans_no=0
        fi
        if [[ $trans =~ " untranslated message" ]] ; then
            untrans_no=$(echo $trans|sed -e 's/^.* \([0-9]*\) untranslated message.*/\1/')
        else
            untrans_no=0
        fi
        total=$(($trans_no+$untrans_no))
        RATIO=$((100*$trans_no/$total))
    fi
}

# Remove obsolete files.  We might have added them in the past but
# would not add them today, so let's eventually remove them.
function cleanup_po_files {
    local modulename=$1

    for i in $(find $modulename -name '*.po') ; do
        check_po_file "$i"
        if [ $RATIO -lt 40 ]; then
            git rm -f --ignore-unmatch $i
        fi
    done
}

# Remove obsolete log level files.  We have added them in the past but
# do not translate them anymore, so let's eventually remove them.
function cleanup_log_files {
    local modulename=$1
    local levels="info warning error critical"

    for i in $(find $modulename -name '*.po') ; do
        # We do not store the log level files anymore, remove them
        # from git.
        local bi
        bi=$(basename $i)

        for level in $levels ; do
            if [[ "$bi" == "$modulename-log-$level.po" ]] ; then
                git rm -f --ignore-unmatch $i
            fi
        done
    done
}

# Remove all pot files, we publish them to
# http://tarballs.openstack.org/translation-source/{name}/VERSION ,
# let's not store them in git at all.
# Previously, we had those files in git; remove them now if they
# are still there.
function cleanup_pot_files {
    local modulename=$1

    for i in $(find $modulename -name '*.pot') ; do
        # Remove the file, both locally and from git if needed.
        rm $i
        git rm -f --ignore-unmatch $i
    done
}

# Reduce size of po files.  This reduces the amount of content imported
# and makes for fewer imports.
# This does not touch the pot files.  This way we can reconstruct the po files
# using "msgmerge POTFILE POFILE -o COMPLETEPOFILE".
function compress_po_files {
    local directory=$1

    for i in $(find $directory -name '*.po') ; do
        msgattrib --translated --no-location --sort-output "$i" \
            --output="${i}.tmp"
        mv "${i}.tmp" "$i"
    done
}
function pull_from_zanata {
    local project=$1

    # Since Zanata does not currently have an option to not download new
    # files, we download everything, and then remove new files that are not
    # translated enough.
    zanata-cli -B -e pull

    # We skip directories starting with '.' because they never contain
    # translations for the project (in particular, '.tox').  Likewise
    # 'node_modules' only contains dependencies and should be ignored.
    for i in $(find . -name '*.po' ! -path './.*' ! -path './node_modules/*' -prune | cut -b3-); do
        # We check release note translation percentage per release.
        # To check this we need to extract messages per RST file.
        # Let's defer checking it to propose_releasenotes.
        local basefn=
        if [ "$(basename $i)" = "releasenotes.po" ]; then
            continue
        fi
        check_po_file "$i"
        # We want new files to be >75% translated.  The
        # common documents in openstack-manuals have that relaxed to
        # >40%.
        percentage=75
        if [ $project = "openstack-manuals" ]; then
            case "$i" in
                *common*)
                    percentage=40
                    ;;
            esac
        fi
        if [ $RATIO -lt $percentage ]; then
            # This means the file is below the ratio, but we only want
            # to delete it if it is a new file.  Files known to git
            # that drop below 40% will be cleaned up by
            # cleanup_po_files.
            if ! git ls-files | grep -xq "$i"; then
                rm -f "$i"
            fi
        fi
    done
}

# Copy all pot files in modulename directory to temporary path for
# publishing.  This uses the exact same path.
function copy_pot {
    local all_modules=$1
    local target=.translation-source/$ZANATA_VERSION/

    for m in $all_modules ; do
        for f in $(find $m -name "*.pot") ; do
            local fd
            fd=$(dirname $f)
            mkdir -p $target/$fd
            cp $f $target/$f
        done
    done
}
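The percentage arithmetic in check_po_file can be sketched in isolation by feeding it a canned msgfmt statistics line instead of a real .po file (the same sed expressions as above, with an invented statistics string):

```shell
# Canned msgfmt output standing in for "msgfmt --statistics" on a real
# .po file; 3 of 4 messages are translated.
trans="3 translated messages, 1 untranslated message."
trans_no=$(echo $trans | sed -e 's/ translated message.*$//')
untrans_no=$(echo $trans | sed -e 's/^.* \([0-9]*\) untranslated message.*/\1/')
total=$(($trans_no + $untrans_no))
# Integer percentage, exactly as check_po_file computes RATIO.
RATIO=$((100 * $trans_no / $total))
echo $RATIO
```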


@@ -0,0 +1,58 @@
#!/usr/bin/env python
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import argparse
import os
import sys

from ZanataUtils import IniConfig, ProjectConfig


def get_args():
    parser = argparse.ArgumentParser(description='Generate a zanata.xml '
                                     'file for this project so we can '
                                     'process translations')
    parser.add_argument('-p', '--project')
    parser.add_argument('-v', '--version')
    parser.add_argument('-s', '--srcdir')
    parser.add_argument('-d', '--txdir')
    parser.add_argument('-e', '--excludes')
    parser.add_argument('-r', '--rule', nargs=2, metavar=('PATTERN', 'RULE'),
                        action='append',
                        help='Append a rule, used by the Zanata client to '
                        'match .pot files to translations. Can be specified '
                        'multiple times, and if no rules are specified a '
                        'default will be used.')
    parser.add_argument('--no-verify', action='store_false', dest='verify',
                        help='Do not perform HTTPS certificate verification')
    parser.add_argument('-f', '--file', required=True)
    return parser.parse_args()


def main():
    args = get_args()
    default_rule = ('**/*.pot',
                    '{locale_with_underscore}/LC_MESSAGES/(unknown).po')
    rules = args.rule or [default_rule]
    try:
        zc = IniConfig(os.path.expanduser('~/.config/zanata.ini'))
        ProjectConfig(zc, args.file, rules, args.verify, project=args.project,
                      version=args.version,
                      srcdir=args.srcdir, txdir=args.txdir,
                      excludes=args.excludes)
    except ValueError as e:
        sys.exit(e)


if __name__ == '__main__':
    main()
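The `-r` option handling can be sketched on its own: each `-r PATTERN RULE` pair is appended as a two-element list, then turned into the `{'pattern': ..., 'rule': ...}` dictionaries that `ProjectConfig._parse_rules` produces. The patterns below are illustrative only:

```python
import argparse

# Mirror of the "-r" option definition from get_args().
parser = argparse.ArgumentParser()
parser.add_argument('-r', '--rule', nargs=2, action='append')
args = parser.parse_args(['-r', '**/*.pot', '{path}/{locale}.po',
                          '-r', 'doc/**/*.pot', 'doc/{locale}.po'])

# Same transformation as ProjectConfig._parse_rules.
rule_dicts = [{'pattern': p, 'rule': r} for p, r in args.rule]
```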


@@ -0,0 +1,112 @@
#!/usr/bin/env python
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import print_function
import argparse
import ConfigParser as configparser
import sys
DJANGO_PROJECT_SUFFIXES = (
'-dashboard',
'horizon', # to match horizon and *-horizon
'-ui',
'django_openstack_auth',
)
def get_args():
parser = argparse.ArgumentParser(
description='Find module names in a repository.')
parser.add_argument('-p', '--project', required=True)
parser.add_argument('-t', '--target',
choices=['python', 'django'],
default='python',
help='Type of modules to search (default: python).')
parser.add_argument('-f', '--file', default='setup.cfg',
help='Path of setup.cfg file.')
return parser.parse_args()
def split_multiline(value):
value = [element for element in
(line.strip() for line in value.split('\n'))
if element]
return value
def read_config(path):
parser = configparser.SafeConfigParser()
parser.read(path)
config = {}
for section in parser.sections():
config[section] = dict(parser.items(section))
return config
def get_option(config, section, option, multiline=False):
if section not in config:
return
value = config[section].get(option)
if not value:
return
if multiline:
value = split_multiline(value)
return value
def get_translate_options(config, target):
translate_options = {}
for key, value in config['openstack_translations'].items():
values = split_multiline(value)
if values:
translate_options[key] = values
return translate_options.get('%s_modules' % target, [])
def get_valid_modules(config, project, target):
is_django = any(project.endswith(suffix)
for suffix in DJANGO_PROJECT_SUFFIXES)
if is_django != (target == 'django'):
return []
modules = get_option(config, 'files', 'packages', multiline=True)
# If setup.cfg does not contain [files] packages entry,
# let's assume the project name as a module name.
if not modules:
print('[files] packages entry not found in setup.cfg. '
'Use project name "%s" as a module name.' % project,
file=sys.stderr)
modules = [project]
return modules
def main():
args = get_args()
config = read_config(args.file)
if 'openstack_translations' in config:
translate_options = get_translate_options(config, args.target)
print(' '.join(translate_options))
return
modules = get_valid_modules(config, args.project, args.target)
if modules:
print(' '.join(modules))
if __name__ == '__main__':
main()
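
As a minimal sketch of the value handling above, `split_multiline` turns a multiline setup.cfg entry into a clean list of names. The sample `[files] packages` value here is purely illustrative:

```python
# Re-stating split_multiline from get-modulename.py above; the
# sample packages value is hypothetical.
def split_multiline(value):
    # Strip each line and drop empty entries.
    return [element for element in
            (line.strip() for line in value.split('\n'))
            if element]

packages_value = '\nnova\nnova_api\n'
print(split_multiline(packages_value))  # ['nova', 'nova_api']
```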


@ -0,0 +1,357 @@
#!/bin/bash -xe
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
PROJECT=$1
BRANCH=$2
JOBNAME=$3
# Replace /'s in the branch name with -'s because Zanata does not
# allow /'s in version names.
ZANATA_VERSION=${BRANCH//\//-}
SCRIPTSDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
source $SCRIPTSDIR/common_translation_update.sh
init_branch $BRANCH
function cleanup_module {
local modulename=$1
# Remove obsolete files.
cleanup_po_files "$modulename"
cleanup_pot_files "$modulename"
# Compress downloaded po files. This needs to be done after
# cleanup_po_files since that function needs information about the
# number of untranslated strings.
compress_po_files "$modulename"
}
# Add all po files to the git repo in a target directory
function git_add_po_files {
local target_dir=$1
local po_file_count
po_file_count=$(find "$target_dir" -name '*.po' | wc -l)
if [ $po_file_count -ne 0 ]; then
git add $target_dir/*/*
fi
}
# Add all JSON files to the git repo (used in javascript translations)
function git_add_json_files {
local target_dir=$1
local json_file_count
json_file_count=$(find "$target_dir" -name '*.json' | wc -l)
if [ $json_file_count -ne 0 ]; then
git add $target_dir/*
fi
}
# Propose updates for manuals
function propose_manuals {
# Pull updated translations from Zanata.
pull_from_zanata "$PROJECT"
# Compress downloaded po files
case "$PROJECT" in
openstack-manuals)
# Cleanup po and pot files
cleanup_module "doc"
;;
api-site)
# Cleanup po and pot files
cleanup_module "api-quick-start"
cleanup_module "firstapp"
;;
security-doc)
cleanup_module "security-guide"
;;
esac
# Add imported upstream translations to git
for FILE in ${DocFolder}/*; do
DOCNAME=${FILE#${DocFolder}/}
if [ -d ${DocFolder}/${DOCNAME}/locale ] ; then
git_add_po_files ${DocFolder}/${DOCNAME}/locale
fi
if [ -d ${DocFolder}/${DOCNAME}/source/locale ] ; then
git_add_po_files ${DocFolder}/${DOCNAME}/source/locale
fi
done
}
# Propose updates for training-guides
function propose_training_guides {
# Pull updated translations from Zanata.
pull_from_zanata "$PROJECT"
# Cleanup po and pot files
cleanup_module "doc/upstream-training"
# Add all changed files to git
git_add_po_files doc/upstream-training/source/locale
}
# Propose updates for i18n
function propose_i18n {
# Pull updated translations from Zanata.
pull_from_zanata "$PROJECT"
# Cleanup po and pot files
cleanup_module "doc"
# Add all changed files to git
git_add_po_files doc/source/locale
}
# Propose updates for python and django projects
function propose_python_django {
local modulename=$1
local version=$2
# Check for empty directory and exit early
local content
content=$(ls -A $modulename/locale/)
if [[ "$content" == "" ]] ; then
return
fi
# Now add all changed files to git.
# Note that we add them here so the functions do not have to
# differentiate between new files and files already under git control.
git_add_po_files $modulename/locale
# Cleanup po and pot files
cleanup_module "$modulename"
if [ "$version" == "master" ] ; then
# Remove log files that are no longer translated on master, but
# keep them on released stable branches.
cleanup_log_files "$modulename"
fi
# Check first whether directory exists, it might be missing if
# there are no translations.
if [[ -d "$modulename/locale/" ]] ; then
# Some files were changed, add changed files again to git, so
# that we can run git diff properly.
git_add_po_files $modulename/locale
fi
}
# Handle either python or django proposals
function handle_python_django_project {
local project=$1
setup_project "$project" "$ZANATA_VERSION"
pull_from_zanata "$project"
handle_python_django $project python
handle_python_django $project django
handle_project_doc $project
}
# Handle project doc proposals
function handle_project_doc {
local project=$1
# Only handle project doc translation in the test repos.
if ! [[ "$project" =~ ^(horizon|openstack-ansible|openstack-helm)$ ]]; then
return
fi
# setup_project and pull_from_zanata are already done
# we start directly with generating .pot files
extract_messages_doc
# cleanup po and pot files
cleanup_module "doc"
# Add all changed files to git
git_add_po_files doc/source/locale
}
# Handle either python or django proposals
function handle_python_django {
local project=$1
# kind can be "python" or "django"
local kind=$2
local module_names
module_names=$(get_modulename $project $kind)
if [ -n "$module_names" ]; then
if [[ "$kind" == "django" ]] ; then
install_horizon
fi
propose_releasenotes "$ZANATA_VERSION"
for modulename in $module_names; do
# Note that we need to generate the pot files so that we
# can calculate how many strings are translated.
case "$kind" in
django)
# Update the .pot file
extract_messages_django "$modulename"
;;
python)
# Extract messages from project except log messages
extract_messages_python "$modulename"
;;
esac
propose_python_django "$modulename" "$ZANATA_VERSION"
done
fi
}
function propose_releasenotes {
local version=$1
# This function does not check whether releasenote publishing and
# testing are set up in zuul/layout.yaml. If releasenotes exist,
# they get pushed to the translation server.
# Note that releasenotes only get translated on master.
if [[ "$version" == "master" && -f releasenotes/source/conf.py ]]; then
# Note that we need to generate these so that we can calculate
# how many strings are translated.
extract_messages_releasenotes "keep_workdir"
local lang_po
local locale_dir=releasenotes/source/locale
for lang_po in $(find $locale_dir -name 'releasenotes.po'); do
check_releasenotes_per_language $lang_po
done
# Remove the working directory. We no longer need it.
rm -rf releasenotes/work
# Clean up POT files.
# PO files are already cleaned up in check_releasenotes_per_language.
cleanup_pot_files "releasenotes"
# Compress downloaded po files. This needs to be done after
# cleanup_po_files since that function needs information about the
# number of untranslated strings.
compress_po_files "releasenotes"
# Add all changed files to git - if there are
# translated files at all.
if [ -d releasenotes/source/locale/ ] ; then
git_add_po_files releasenotes/source/locale
fi
fi
# Remove any releasenotes translations from stable branches, they
# are not needed there.
if [[ "$version" != "master" && -d releasenotes/source/locale ]]; then
# Note that content might exist, e.g. from downloaded translations,
# but are not under git control.
git rm --ignore-unmatch -rf releasenotes/source/locale
fi
}
function propose_reactjs {
pull_from_zanata "$PROJECT"
# Clean up files (removes incomplete translations and untranslated strings)
cleanup_module "i18n"
# Convert po files to ReactJS i18n JSON format
for lang in `find i18n/*.po -printf "%f\n" | sed 's/\.po$//'`; do
npm run po2json -- ./i18n/$lang.po -o ./i18n/$lang.json
# The files are created as a one-line JSON file - expand them
python -m json.tool ./i18n/$lang.json ./i18n/locales/$lang.json
rm ./i18n/$lang.json
done
# Add JSON files to git
git_add_json_files i18n/locales
}
# Setup git repository for git review.
setup_git
# Check whether a review already exists, setup review commit message.
# Function setup_review calls setup_commit_message which will set CHANGE_ID and
# CHANGE_NUM if a change exists and will always set COMMIT_MSG.
setup_review "$BRANCH"
# If a change already exists, let's pull it in and compute the
# 'git patch-id' of it.
PREV_PATCH_ID=""
if [[ -n ${CHANGE_NUM} ]]; then
# Ignore errors we get in case we can't download the patch with
# git-review. If that happens then we will submit a new patch.
set +e
git review -d ${CHANGE_NUM}
RET=$?
if [[ "$RET" -eq 0 ]]; then
PREV_PATCH_ID=$(git show | git patch-id | awk '{print $1}')
fi
set -e
# The git review changed our branch, go back to our correct branch
git checkout -f ${BRANCH}
fi
# Setup venv - needed for all projects for subunit
setup_venv
case "$PROJECT" in
api-site|openstack-manuals|security-doc)
init_manuals "$PROJECT"
setup_manuals "$PROJECT" "$ZANATA_VERSION"
propose_manuals
propose_releasenotes "$ZANATA_VERSION"
;;
training-guides)
setup_training_guides "$ZANATA_VERSION"
propose_training_guides
;;
i18n)
setup_i18n "$ZANATA_VERSION"
propose_i18n
;;
tripleo-ui)
setup_reactjs_project "$PROJECT" "$ZANATA_VERSION"
propose_reactjs
;;
*)
# Common setup for python and django repositories
handle_python_django_project $PROJECT
;;
esac
# Filter out commits we do not want.
filter_commits
# Propose patch to gerrit if there are changes.
send_patch "$BRANCH"
if [ $INVALID_PO_FILE -eq 1 ] ; then
echo "At least one po file is invalid. Fix all invalid files on the"
echo "translation server."
exit 1
fi
# Tell finish function that everything is fine.
ERROR_ABORT=0
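
The branch-to-version mapping the script relies on can be sketched in isolation. Zanata rejects /'s in version names, so a stable branch maps to a dashed version name (the branch name below is just an example):

```shell
#!/bin/bash
# Sketch of the ZANATA_VERSION mangling used above: replace every
# / in the branch name with - (stable/pike is an example branch).
BRANCH='stable/pike'
ZANATA_VERSION=${BRANCH//\//-}
echo "$ZANATA_VERSION"    # prints stable-pike
```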


@ -0,0 +1,55 @@
#!/usr/bin/env python
# Copyright (c) 2015 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import argparse
import json
import os
import sys
from ZanataUtils import IniConfig, ZanataRestService
def get_args():
parser = argparse.ArgumentParser(description='Check if a version for the '
'specified project exists on the Zanata '
'server')
parser.add_argument('-p', '--project', required=True)
parser.add_argument('-v', '--version', required=True)
parser.add_argument('--no-verify', action='store_false', dest='verify',
help='Do not perform HTTPS certificate verification')
return parser.parse_args()
def main():
args = get_args()
zc = IniConfig(os.path.expanduser('~/.config/zanata.ini'))
rest_service = ZanataRestService(zc, content_type='application/json',
accept='application/json',
verify=args.verify)
try:
r = rest_service.query(
'/rest/project/%s/version/%s'
% (args.project, args.version))
except ValueError:
sys.exit(1)
if r.status_code == 200:
details = json.loads(r.content)
if details['status'] == 'READONLY':
sys.exit(1)
sys.exit(0)
if __name__ == '__main__':
main()
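
The REST path the script queries is built by simple string substitution. The `version_path` helper below is hypothetical, not part of the script; it only illustrates the shape of the URL:

```python
# Hypothetical helper mirroring the query built in
# query-zanata-project-version.py above.
def version_path(project, version):
    return '/rest/project/%s/version/%s' % (project, version)

print(version_path('nova', 'stable-pike'))
# /rest/project/nova/version/stable-pike
```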


@ -0,0 +1,138 @@
#!/bin/bash -xe
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
PROJECT=$1
JOBNAME=$2
# Replace /'s in branch names with -'s because Zanata doesn't
# allow /'s in version names.
# Zuul v3 native job passes the branch in as parameter but
# does not set ZUUL_REFNAME.
if [ -z "$ZUUL_REFNAME" ] ; then
BRANCHNAME=$3
else
BRANCHNAME=$ZUUL_REFNAME
fi
ZANATA_VERSION=${BRANCHNAME//\//-}
SCRIPTSDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
source $SCRIPTSDIR/common_translation_update.sh
init_branch $BRANCHNAME
# List of all modules to copy POT files from
ALL_MODULES=""
# Setup venv - needed for all projects for our tools
setup_venv
if ! python $SCRIPTSDIR/query-zanata-project-version.py \
-p $PROJECT -v $ZANATA_VERSION; then
# Exit successfully so that lack of a version doesn't cause the jenkins
# jobs to fail. This is necessary because not all branches of a project
# will be translated.
# Tell finish function that everything is fine.
ERROR_ABORT=0
exit 0
fi
setup_git
# Project setup and updating POT files.
case "$PROJECT" in
api-site|openstack-manuals|security-doc)
init_manuals "$PROJECT"
# POT file extraction is done in setup_manuals.
setup_manuals "$PROJECT" "$ZANATA_VERSION"
case "$PROJECT" in
api-site)
ALL_MODULES="api-quick-start firstapp"
;;
security-doc)
ALL_MODULES="security-guide"
;;
*)
ALL_MODULES="doc"
;;
esac
if [[ "$ZANATA_VERSION" == "master" && -f releasenotes/source/conf.py ]]; then
extract_messages_releasenotes
ALL_MODULES="releasenotes $ALL_MODULES"
fi
;;
training-guides)
setup_training_guides "$ZANATA_VERSION"
ALL_MODULES="doc"
;;
i18n)
setup_i18n "$ZANATA_VERSION"
ALL_MODULES="doc"
;;
tripleo-ui)
setup_reactjs_project "$PROJECT" "$ZANATA_VERSION"
# The pot file is generated in the ./i18n directory
ALL_MODULES="i18n"
;;
*)
# Common setup for python and django repositories
setup_project "$PROJECT" "$ZANATA_VERSION"
# ---- Python projects ----
module_names=$(get_modulename $PROJECT python)
if [ -n "$module_names" ]; then
if [[ "$ZANATA_VERSION" == "master" && -f releasenotes/source/conf.py ]]; then
extract_messages_releasenotes
ALL_MODULES="releasenotes $ALL_MODULES"
fi
for modulename in $module_names; do
extract_messages_python "$modulename"
ALL_MODULES="$modulename $ALL_MODULES"
done
fi
# ---- Django projects ----
module_names=$(get_modulename $PROJECT django)
if [ -n "$module_names" ]; then
install_horizon
if [[ "$ZANATA_VERSION" == "master" && -f releasenotes/source/conf.py ]]; then
extract_messages_releasenotes
ALL_MODULES="releasenotes $ALL_MODULES"
fi
for modulename in $module_names; do
extract_messages_django "$modulename"
ALL_MODULES="$modulename $ALL_MODULES"
done
fi
# ---- Documentation ----
# Let's test this with some repos :)
DOC_TARGETS=('horizon' 'openstack-ansible' 'openstack-helm')
if [[ -f doc/source/conf.py ]]; then
if [[ ${DOC_TARGETS[*]} =~ "$PROJECT" ]]; then
extract_messages_doc
ALL_MODULES="doc $ALL_MODULES"
fi
fi
;;
esac
# The Zanata client works out what to send based on the zanata.xml file.
# Do not copy translations from other files for this change.
zanata-cli -B -e push --copy-trans False
# Move pot files to translation-source directory for publishing
copy_pot "$ALL_MODULES"
mv .translation-source translation-source
# Tell finish function that everything is fine.
ERROR_ABORT=0
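
Note that the DOC_TARGETS check above is a substring match against the joined array rather than an exact name comparison; a minimal sketch of the behavior:

```shell
#!/bin/bash
# Sketch of the DOC_TARGETS membership test: the array is joined
# into one string and PROJECT is matched as a substring of it.
DOC_TARGETS=('horizon' 'openstack-ansible' 'openstack-helm')
PROJECT='horizon'
if [[ ${DOC_TARGETS[*]} =~ "$PROJECT" ]]; then
    echo "doc translation enabled for $PROJECT"
fi
```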


@ -0,0 +1,110 @@
---
# This is all bad and I feel bad, but it's translated from
# https://github.com/openstack-infra/puppet-zanata/blob/master/manifests/client.pp
- name: Find java package name
include_vars: "{{ ansible_os_family }}.yaml"
- name: install necessary packages
package:
name: "{{ zanata_jre_package }}"
state: present
become: yes
- name: ensure zanata install dir
file:
path: /opt/zanata
owner: "{{ ansible_ssh_user }}"
state: directory
become: true
- name: Look for cached zanata client
stat:
path: "/opt/cache/files/zanata-cli-{{ zanata_client_version }}-dist.tar.gz"
checksum_algorithm: sha256
register: cached_client
- name: Ensure correct checksum of cached client
assert:
that:
- cached_client.stat.checksum == zanata_client_checksum
when: cached_client.stat.exists
- name: Extract cached client tarball
unarchive:
src: "/opt/cache/files/zanata-cli-{{ zanata_client_version }}-dist.tar.gz"
dest: /opt/zanata
creates: "/opt/zanata/zanata-cli-{{ zanata_client_version }}/bin/zanata-cli"
remote_src: yes
when: cached_client.stat.exists
- name: Download Zanata client archive
get_url:
url: "https://search.maven.org/remotecontent?filepath=org/zanata/zanata-cli/{{ zanata_client_version }}/zanata-cli-{{ zanata_client_version }}-dist.tar.gz"
dest: "/tmp/zanata-cli-{{ zanata_client_version }}-dist.tar.gz"
checksum: "sha256:{{ zanata_client_checksum }}"
register: result
until: result | success
retries: 5
delay: 5
when: not cached_client.stat.exists
- name: Extract Zanata client archive
unarchive:
src: "/tmp/zanata-cli-{{ zanata_client_version }}-dist.tar.gz"
dest: /opt/zanata
creates: "/opt/zanata/zanata-cli-{{ zanata_client_version }}/bin/zanata-cli"
remote_src: yes
when: not cached_client.stat.exists
- name: ensure zanata-cli perms
file:
path: "/opt/zanata/zanata-cli-{{ zanata_client_version }}/bin/zanata-cli"
mode: 0755
- name: link zanata-cli
file:
path: /usr/local/bin/zanata-cli
src: "/opt/zanata/zanata-cli-{{ zanata_client_version }}/bin/zanata-cli"
state: link
become: true
# This is a preview module in Ansible 2.3. It may not work.
- name: import cert to java keystore
java_cert:
cert_url: "{{ zanata_api_credentials.fqdn }}"
keystore_path: /etc/ssl/certs/java/cacerts
keystore_pass: changeit
keystore_create: true
become: true
# Use sudo to ensure root ownership
- name: set permissions for cacert
file:
path: /etc/ssl/certs/java/cacerts
mode: 0644
become: true
- name: ensure zanata config dir
file:
path: ~/.config
state: directory
- name: write out zanata config
template:
src: zanata.ini
dest: ~/.config/zanata.ini
- name: Copy translation scripts to the script dir on the node
copy:
dest: '{{ ansible_user_dir }}/scripts/'
src: '{{ item }}'
mode: 0755
with_items:
- common_translation_update.sh
- create-zanata-xml.py
- get-modulename.py
- propose_translation_update.sh
- query-zanata-project-version.py
- upstream_translation_update.sh
- ZanataUtils.py


@ -0,0 +1,4 @@
[servers]
{{ zanata_api_credentials.server_id }}.url={{ zanata_api_credentials.url }}
{{ zanata_api_credentials.server_id }}.username={{ zanata_api_credentials.username }}
{{ zanata_api_credentials.server_id }}.key={{ zanata_api_credentials.key }}
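
With the (fake) credentials used by the integration test in this change, the template above renders to roughly:

```ini
[servers]
translate.openstack.org.url=https://translate.openstack.org/
translate.openstack.org.username=infra
translate.openstack.org.key=a_fake_key
```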


@ -0,0 +1 @@
zanata_jre_package: default-jre-headless


@ -11,4 +11,3 @@
- include: configure-mirrors.yaml
- include: fetch-zuul-cloner.yaml
- include: validate-host.yaml

tests/extra.yaml Normal file

@ -0,0 +1,7 @@
# Testing for non-base roles that are used across various jobs
# If you add new tests, also update the files section in job
# extra-integration in zuul.d/jobs.yaml.
- include: prepare-zanata-client.yaml
when: ansible_os_family == 'Debian'


@ -0,0 +1,17 @@
- name: Test the prepare-zanata-client role
hosts: all
vars:
zanata_api_credentials:
fqdn: translate.openstack.org
server_id: translate.openstack.org
url: https://translate.openstack.org/
username: infra
key: a_fake_key
roles:
- role: prepare-zanata-client
post_tasks:
- name: Check zanata client works
command: "/opt/zanata/zanata-cli-{{ zanata_client_version }}/bin/zanata-cli --version"
changed_when: false


@ -1496,3 +1496,35 @@
nodes:
- name: fedora-28
label: fedora-28-vexxhost
- job:
name: openstack-infra-extra-integration
description: |
Runs non-base roles that are used within various jobs to prevent
regressions. As opposed to base roles, these may run in a
limited set of environments or have other simplifying
assumptions.
abstract: true
protected: true
parent: base
required-projects:
- openstack-infra/project-config
roles:
- zuul: openstack-infra/zuul-jobs
run: tests/extra.yaml
files:
- ^zuul.d/.*
- ^roles/prepare-zanata-client/.*
- ^tests/.*
# NOTE(ianw): This test restricted to the two node types these roles
# run on in the gate.
- job:
name: openstack-infra-extra-integration-xenial
parent: openstack-infra-extra-integration
nodeset: ubuntu-xenial
- job:
name: openstack-infra-extra-integration-bionic
parent: openstack-infra-extra-integration
nodeset: ubuntu-bionic


@ -24,6 +24,8 @@
- openstack-infra-multinode-integration-opensuse423
- openstack-infra-multinode-integration-opensuse-tumbleweed:
voting: false
- openstack-infra-extra-integration-xenial
- openstack-infra-extra-integration-bionic
- openstack-infra-extra-integration-xenial
- openstack-infra-extra-integration-bionic
- openstack-zuul-jobs-linters
gate:
jobs: