sources/curl: don't limit total download time

Some RPMs might be very large, and limiting the total download time
might lead to a failed build even in cases where the download is
making progress. Instead, set a minimum download speed of 1000 bytes
per second: if the transfer stays below that minimum for 30 seconds
in a row, the download fails and is retried. This follows the logic
employed by DNF.
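
As an illustration, here is a minimal sketch of a single attempt with
these settings (not the actual source; fetch_once is a hypothetical
helper). It passes --speed-time explicitly for clarity; curl defaults
it to 30 seconds whenever --speed-limit is set, which is why the diff
below omits it:

    import subprocess

    def fetch_once(url, output):
        curl_command = [
            "curl",
            "--speed-limit", "1000",    # abort when slower than 1000 bytes/s...
            "--speed-time", "30",       # ...for 30 seconds in a row
            "--connect-timeout", "30",  # separate cap on connection setup
            "--fail",                   # non-zero exit status on HTTP errors
            "--location",               # follow redirects
            "--output", output,
            url,
        ]
        # the caller retries when this returns False
        return subprocess.run(curl_command, check=False).returncode == 0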

Adjust the number of retries to 10 and the connection timeout to 30
seconds, in order to match what DNF does. One difference is that DNF
spreads its 10 retries across all downloads, whereas we allow 10
retries per download; this could be changed in a follow-up (see the
sketch below).
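
For illustration, a rough sketch of the two retry policies;
download_one and urls are hypothetical stand-ins, not osbuild APIs:

    def retry_per_download(urls, download_one):
        # what this commit implements: each download gets its own
        # budget of 10 attempts
        for url in urls:
            if not any(download_one(url) for _ in range(10)):
                raise RuntimeError(f"download failed: {url}")

    def retry_shared_budget(urls, download_one):
        # what DNF does: one budget of 10 retries shared across all
        # downloads in the transaction
        retries = 10
        for url in urls:
            while not download_one(url):
                if retries == 0:
                    raise RuntimeError(f"download failed: {url}")
                retries -= 1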

Old:
 - a download taking more than 5 minutes is unconditionally aborted

New:
 - slow but working downloads will never be aborted
 - downloads will be stalled for at most five minutes
   in total (10 retries × 30 seconds each) before
   being aborted
 - time spent making progress does not count towards
   the five minutes

Signed-off-by: Tom Gundersen <teg@jklm.no>
commit e175529f7c
parent 87d1299888
Author: Tom Gundersen <teg@jklm.no>
Date:   2022-03-11 16:37:41 +01:00
Committer: Tomáš Hozza

@@ -15,12 +15,10 @@ up the download.
 import concurrent.futures
 import itertools
-import math
 import os
 import subprocess
 import sys
 import tempfile
-import time
 
 from osbuild import sources
@@ -93,17 +91,13 @@ def fetch(url, checksum, directory):
     with tempfile.TemporaryDirectory(prefix="osbuild-unverified-file-", dir=directory) as tmpdir:
         # some mirrors are sometimes broken. retry manually, because we could be
         # redirected to a different, working, one on retry.
-        start_time = time.monotonic()
         return_code = 0
-        for _ in range(20):
-            elapsed_time = time.monotonic() - start_time
-            if elapsed_time >= 300:
-                continue
+        for _ in range(10):
             curl_command = [
                 "curl",
                 "--silent",
-                "--max-time", f"{int(math.ceil(300 - elapsed_time))}",
-                "--connect-timeout", "60",
+                "--speed-limit", "1000",
+                "--connect-timeout", "30",
                 "--fail",
                 "--location",
                 "--output", checksum,