weldr: ensure a fresh dnf cache when making a new compose

dnf-json relies on dnf's ability to cache repository metadata. This is
important, because the weldr API calls it frequently to serve requests
for package lists and depsolves.
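
To illustrate the caching that dnf-json leans on: dnf's Python API keeps
downloaded metadata under base.conf.cachedir and only refreshes it once
the repository's metadata_expire interval has elapsed. A minimal sketch
of where those knobs live (the repository id and URL are made up for
illustration):

    import dnf

    base = dnf.Base()
    print(base.conf.cachedir)       # directory dnf uses for cached metadata

    # Hypothetical repository, only to show the expiry setting.
    repo = dnf.repo.Repo("example", base.conf)
    repo.baseurl = ["https://example.com/repo"]
    repo.metadata_expire = 3600     # seconds before cached metadata counts as stale
    base.repos.add(repo)

    # Downloads metadata on the first call, reuses the cache afterwards.
    base.fill_sack(load_system_repo=False)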

However, osbuild's dnf stage always fetches fresh metadata, because it
doesn't have access to the host's cache. Since cached metadata stays
valid for some time even after a repository has changed, the checksum we
put into the pipeline might be stale.

Force a new metadata download when producing the pipeline. This is still
not perfect, but greatly reduces the probability of putting stale
metadata into the pipeline.
Author: Lars Karlitski
Date:   2019-12-19 20:33:57 +01:00
Parent: 0ef89aa864
Commit: 839a109c78

5 changed files with 20 additions and 15 deletions
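
On the calling side (not part of the hunks below) the weldr API only
needs to set the new flag in the request it sends to dnf-json. A rough
sketch of such an invocation, assuming dnf-json reads the request as
JSON on stdin; the repository description and the executable path are
illustrative, only the "clean" argument comes from the handler code
below:

    import json
    import subprocess

    request = {
        "command": "dump",
        "arguments": {
            # Illustrative repo description; the real format is defined by dnfrepo().
            "repos": [{"id": "fedora", "baseurl": "https://example.com/fedora"}],
            # Force dnf-json to throw away its metadata cache first.
            "clean": True,
        },
    }

    proc = subprocess.run(["./dnf-json"], input=json.dumps(request),
                          capture_output=True, text=True)
    packages = json.loads(proc.stdout)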


@@ -4,6 +4,7 @@ import datetime
 import dnf
 import hashlib
 import json
+import shutil
 import sys
 
 DNF_ERROR_EXIT_CODE = 10
@@ -35,9 +36,12 @@ def dnfrepo(desc, parent_conf=None):
     return repo
 
 
-def create_base(repos):
+def create_base(repos, clean=False):
     base = dnf.Base()
 
+    if clean:
+        shutil.rmtree(base.conf.cachedir, ignore_errors=True)
+
     for repo in repos:
         base.repos.add(dnfrepo(repo, base.conf))
 
@@ -79,7 +83,7 @@ command = call["command"]
 arguments = call.get("arguments", {})
 
 if command == "dump":
-    base = create_base(arguments.get("repos", {}))
+    base = create_base(arguments.get("repos", {}), arguments.get("clean", False))
     packages = []
     for package in base.sack.query().available():
         packages.append({
@@ -100,7 +104,7 @@ if command == "dump":
     }, sys.stdout)
 
 elif command == "depsolve":
-    base = create_base(arguments.get("repos", {}))
+    base = create_base(arguments.get("repos", {}), arguments.get("clean", False))
     errors = []
 
     try:
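
Two details in the create_base() change are worth calling out:
shutil.rmtree() is invoked with ignore_errors=True, so a missing cache
directory (for example on a first run) is harmless and dnf simply
recreates base.conf.cachedir the next time it downloads metadata; and
clean defaults to False, so existing callers keep the old caching
behaviour unless they explicitly opt in.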