tools/centos-mirror-tools/stx_mirror_scripts/dl_utils.sh
Davlet Panech ac49ff342c use curl + avoid partial downloads
Mirror scripts sometimes leave corrupted/partial files behind.

Problems
========

1) wget is called with the -O flag and the server returns an HTTP error
(404 etc.) for the requested URL. In that case wget leaves a zero-length
file behind; this doesn't seem to happen without the -O flag.

2) wget starts the download, which stalls and times out half-way; wget
gives up and re-requests the same file with a byte offset of the form
"Range: bytes=1234-", but the web server doesn't support open-ended
ranges. In this case wget prints a warning, leaves a partial file
behind and returns success.

3) Sites like GitHub generate repo tarballs on the fly, e.g.:
https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.19.3.tar.gz
Since tags can move, downloading such a file twice may result in a
different file. Therefore HTTP "resume download" may corrupt files in
this case.

4) Git "keyword expansion" feature may result in differences in source
files being downloaded. For example, this file:

  https://github.com/kubernetes/kubernetes/blob/v1.19.3/staging/src/k8s.io/component-base/version/base.go

contains lines similar to:

  gitVersion  = "v0.0.0-master+$Format:%h$"

where %h is replaced with a short SHA when the tar file is
exported/downloaded. How short the SHA is depends on the git history,
so downloads made at different times can produce abbreviated SHAs of
different lengths, and therefore different files (see the sketch after
this list).

Therefore HTTP "Range" header may corrupt files in this case as
well.

5) Curl is invoked with the "--retry" option and starts the download;
the connection stalls; curl gives up, connects again, skips the first N
bytes and appends to the partial file. If the file changes on the server
while we are doing this, it will end up corrupted. This is very
unlikely to happen and I haven't been able to reproduce this case.
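
For illustration, here is a minimal, self-contained sketch of the
keyword-expansion mechanism behind problem (4); the repo and file names
below are made up, only the mechanism matters:

  git init demo && cd demo
  echo 'version.txt export-subst' > .gitattributes
  echo 'gitVersion = "v0.0.0-master+$Format:%h$"' > version.txt
  git add . && git -c user.name=demo -c user.email=demo@example.com commit -m demo
  # "git archive" (which also backs GitHub's on-the-fly tarballs) expands
  # %h to an abbreviated hash whose length depends on the repo's history,
  # so two archives of the "same" ref can differ byte-for-byte over time:
  git archive HEAD version.txt | tar -xO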

Problems with HTTP Range header
===============================
Curl/wget "resume/continue download" feature has no way of verifying
whether the partial file on disk, and the one being re-requested, are in
fact the same file.  If the file changes on the server between
downloads, "resume download" will corrupt it.

Some web servers don't support the Range header at all, which triggers
case (2) with wget.

Some web servers support the Range header, but require that the end
byte position be present. This is not compatible with wget and curl,
which send open-ended headers of the form "Range: bytes=1234-", meaning
"give me the file from offset 1234 to EOF". This also triggers case (2).
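
For example, resuming an interrupted transfer with curl sends such an
open-ended Range header (the command below is purely illustrative):

  # "-C -" asks curl to continue from the current size of the partial
  # file, producing a request header such as "Range: bytes=1234-".
  # Servers that require an explicit end byte reject this form, which is
  # what triggers case (2) above.
  curl -v -C - -o v1.19.3.tar.gz \
    https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.19.3.tar.gz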

This patch
==========

* Always download the file to a temporary name, then rename it into place
(see the sketch after this list)

* Use curl instead of wget (better error handling). The only exception is
"recursive downloads", which curl doesn't support.

Bug: https://bugs.launchpad.net/starlingx/+bug/1950017
Change-Id: Iaa89009ce23efe5b73ecb8163556ce6db932028b
Signed-off-by: Davlet Panech <davlet.panech@windriver.com>

#
# SPDX-License-Identifier: Apache-2.0
#
# Utility functions for the download of gits and tarballs.
#
# This script was originated by Scott Little.
#
DL_UTILS_DIR="$(dirname "$(readlink -f "${BASH_SOURCE[0]}" )" )"
if [ -f "$DL_UTILS_DIR/utils.sh" ]; then
source "$DL_UTILS_DIR/utils.sh"
elif [ -f "$DL_UTILS_DIR/../utils.sh" ]; then
source "$DL_UTILS_DIR/../utils.sh"
else
echo "Error: Can't find 'utils.sh'"
exit 1
fi
DOWNLOAD_PATH_ROOT=${DOWNLOAD_PATH_ROOT:-/export/mirror/centos}
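# Note: the mirror root above can be overridden by setting
# DOWNLOAD_PATH_ROOT before sourcing this file, e.g. (hypothetical path):
#   DOWNLOAD_PATH_ROOT=/data/mirror/centos
#   source ./dl_utils.sh
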
#
# dl_git_from_url <git-url> <branch> <dir>
#
# Download a git from supplied url into directory,
# and checkout desired branch.
#
dl_git_from_url () {
    local GIT_URL="$1"
    local BRANCH="$2"
    local DL_DIR="$3"
    local DL_ROOT_DIR=""
    local SAVE_DIR
    local CMD=""

    SAVE_DIR="$(pwd)"

    if [ "$DL_DIR" == "" ]; then
        DL_DIR="$DOWNLOAD_PATH_ROOT/$(repo_url_to_sub_path "$GIT_URL" | sed 's#[.]git$##')"
    fi

    echo "dl_git_from_url GIT_URL='$GIT_URL' BRANCH='$BRANCH' DL_DIR='$DL_DIR'"

    DL_ROOT_DIR=$(dirname "$DL_DIR")

    if [ ! -d "$DL_DIR" ]; then
        if [ ! -d "$DL_ROOT_DIR" ]; then
            CMD="mkdir -p '$DL_ROOT_DIR'"
            echo "$CMD"
            eval $CMD
            if [ $? -ne 0 ]; then
                echo "Error: $CMD"
                cd "$SAVE_DIR"
                return 1
            fi
        fi

        CMD="cd '$DL_ROOT_DIR'"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi

        CMD="git clone '$GIT_URL' '$DL_DIR'"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi
    fi

    CMD="cd '$DL_DIR'"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    CMD="git fetch"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    CMD="git checkout '$BRANCH'"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    CMD="git pull"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    cd "$SAVE_DIR"
    return 0
}
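
# Example usage (illustrative URL and branch; when the third argument is
# omitted, DL_DIR is derived from the URL under DOWNLOAD_PATH_ROOT):
#   dl_git_from_url 'https://opendev.org/starlingx/tools.git' 'master'
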
#
# dl_bare_git_from_url <git-url> <dir>
#
# Download a bare git from supplied url into desired directory.
#
dl_bare_git_from_url () {
    local GIT_URL="$1"
    local DL_DIR="$2"
    local DL_ROOT_DIR=""
    local SAVE_DIR
    local CMD=""

    SAVE_DIR="$(pwd)"

    if [ "$DL_DIR" == "" ]; then
        DL_DIR="$DOWNLOAD_PATH_ROOT/$(repo_url_to_sub_path "$GIT_URL" | sed 's#[.]git$##')"
    fi

    echo "dl_bare_git_from_url GIT_URL='$GIT_URL' DL_DIR='$DL_DIR'"

    DL_ROOT_DIR=$(dirname "$DL_DIR")

    if [ ! -d "$DL_DIR" ]; then
        if [ ! -d "$DL_ROOT_DIR" ]; then
            CMD="mkdir -p '$DL_ROOT_DIR'"
            echo "$CMD"
            eval $CMD
            if [ $? -ne 0 ]; then
                echo "Error: $CMD"
                cd "$SAVE_DIR"
                return 1
            fi
        fi

        CMD="cd '$DL_ROOT_DIR'"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi

        CMD="git clone --bare '$GIT_URL' '$DL_DIR'"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi

        CMD="cd '$DL_DIR'"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi

        CMD="git --bare update-server-info"
        echo "$CMD"
        eval $CMD
        if [ $? -ne 0 ]; then
            echo "Error: $CMD"
            cd "$SAVE_DIR"
            return 1
        fi

        if [ -f hooks/post-update.sample ]; then
            CMD="mv -f hooks/post-update.sample hooks/post-update"
            echo "$CMD"
            eval $CMD
            if [ $? -ne 0 ]; then
                echo "Error: $CMD"
                cd "$SAVE_DIR"
                return 1
            fi
        fi
    fi

    CMD="cd '$DL_DIR'"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    CMD="git fetch"
    echo "$CMD"
    eval $CMD
    if [ $? -ne 0 ]; then
        echo "Error: $CMD"
        cd "$SAVE_DIR"
        return 1
    fi

    cd "$SAVE_DIR"
    return 0
}
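
# Example usage (illustrative URL; the second argument may be omitted to
# derive the destination from the URL, with any trailing .git stripped):
#   dl_bare_git_from_url 'https://opendev.org/starlingx/tools.git'
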
#
# dl_file_from_url <url>
#
# Download a file to the current directory
#
dl_file_from_url () {
    local URL="$1"
    local DOWNLOAD_PATH=""
    local DOWNLOAD_DIR=""
    local PROTOCOL=""
    local CMD=""

    DOWNLOAD_PATH="$DOWNLOAD_PATH_ROOT/$(repo_url_to_sub_path "$URL")"
    DOWNLOAD_DIR="$(dirname "$DOWNLOAD_PATH")"
    PROTOCOL=$(url_protocol $URL)

    echo "$PROTOCOL $URL $DOWNLOAD_PATH"

    if [ -f "$DOWNLOAD_PATH" ]; then
        echo "Already have '$DOWNLOAD_PATH'"
        return 0
    fi

    case "$PROTOCOL" in
        https|http)
            if [ ! -d "$DOWNLOAD_DIR" ]; then
                CMD="mkdir -p '$DOWNLOAD_DIR'"
                echo "$CMD"
                eval "$CMD"
                if [ $? -ne 0 ]; then
                    echo "Error: $CMD"
                    return 1
                fi
            fi

            # Fetch to a temporary name and rename into place only on
            # success, so an interrupted transfer never leaves a partial
            # file at the final path.
            CMD="$(get_download_file_command $URL $DOWNLOAD_PATH.dl_part)"
            echo "$CMD"
            eval $CMD
            if [ $? -ne 0 ]; then
                \rm -f "$DOWNLOAD_PATH.dl_part"
                echo "Error: $CMD"
                return 1
            fi
            \mv -fT "$DOWNLOAD_PATH.dl_part" "$DOWNLOAD_PATH"
            ;;
        *)
            echo "Error: Unknown protocol '$PROTOCOL' for url '$URL'"
            return 1
            ;;
    esac

    return 0
}
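
# Example usage (illustrative URL): the file is fetched to
# "<path>.dl_part" first and only renamed into place on success, so a
# failed transfer never leaves a partial file at the final path:
#   dl_file_from_url 'https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.19.3.tar.gz'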