I intend to write a Puppet external custom fact, using DNF's Python API, that collects all DNF module data and outputs it as YAML, ready to be consumed by Puppet Facter.
I managed to list all modules, each module's streams, each stream's profiles, and the default profile for each stream (if defined), and Facter parsed the output successfully. The API exposes simple methods for all of this. But I couldn't find, in the official docs, in the internal help (pydoc), or by reading the source code, how to get the following data for each module:
- Enabled stream
- Default stream
- Enabled profile
Can someone help me find a way to query this data using the DNF Python API?
Here's a simplified version of what I have so far:
from dnf.base import Base
from dnf.module.module_base import ModuleBase
from yaml import dump
base = Base() # Empty
base.read_all_repos() # Read main conf file and .repo files
base.fill_sack() # Prepare the Sack and the Goal objects
# All modules as API objects with all available data
mod_base = ModuleBase(base)
module_objs = mod_base.get_modules('*')[0]
modules = {}
for module_obj in module_objs:
    module_name = module_obj.getName()
    # Each stream comes in a different object of the same module
    if module_name not in modules:
        modules[module_name] = {'streams': {}}
    modules[module_name]['streams'][module_obj.getStream()] = {
        'default_profile': module_obj.getDefaultProfile().getName(),
        'profiles': [profile_obj.getName() for profile_obj in module_obj.getProfiles()]
    }
print(dump({'dnf_modules': modules}))
I could get the missing data with some shell scripting, using dnf module commands instead of the API, but that brings three problems:
- The dnf module commands' output is not meant to be machine-parsed, so filtering the results is quite cumbersome
- All the loops and filters end up taking too long to run
- It adds a probably unnecessary layer of complexity
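To illustrate the first point, this is roughly the kind of parsing the shell route would require. The sample output below is hypothetical (real dnf module list output varies with distro, repos, and dnf version), and the column-layout assumptions (fields separated by two or more spaces, [d]/[e]/[i] markers) may not hold everywhere:

```python
import re

# Hypothetical sample of `dnf module list` output, for illustration only.
# [d] marks the default stream/profile, [e] the enabled stream,
# [i] an installed profile.
SAMPLE = """\
Fedora Modular x86_64
Name    Stream  Profiles                  Summary
nodejs  12 [d]  common [d], development   Javascript runtime
nodejs  14 [e]  common [d] [i], minimal   Javascript runtime

Hint: [d]efault, [e]nabled, [x]disabled, [i]nstalled
"""

def parse_module_list(text):
    """Best-effort parse of the human-oriented table into a dict."""
    modules = {}
    for line in text.splitlines():
        # Columns are assumed to be separated by 2+ spaces
        fields = re.split(r'\s{2,}', line.strip())
        # Skip repo headers, the column header, hints, and blank lines
        if len(fields) < 4 or fields[0] == 'Name':
            continue
        name, stream_field, profiles_field = fields[0], fields[1], fields[2]
        stream = stream_field.replace('[d]', '').replace('[e]', '').strip()
        profiles, default_profile = [], None
        for raw in profiles_field.split(','):
            pname = raw.replace('[d]', '').replace('[i]', '').strip()
            if '[d]' in raw:
                default_profile = pname
            profiles.append(pname)
        info = modules.setdefault(
            name, {'streams': {}, 'enabled_stream': None, 'default_stream': None})
        info['streams'][stream] = {'default_profile': default_profile,
                                   'profiles': profiles}
        if '[e]' in stream_field:
            info['enabled_stream'] = stream
        if '[d]' in stream_field:
            info['default_stream'] = stream
    return modules
```

Even this toy version is fragile (it would break on profile names containing commas, or on a different column layout), which is exactly why I'd prefer to get the data straight from the API.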