Often we want to build a test collection that substitutes different
sequences of tasks into a parallel/sequential construction. However, the
YAML combination that happens when generating jobs is not smart enough to
substitute a fragment into a deeply nested piece of YAML.
Instead, make these sequences top-level entries in the config dict, and
reference them. For example:
  tasks:
  - install:
  - ceph:
  - parallel:
    - workload
    - upgrade-sequence
  workload:
    workunit:
      - something
  upgrade-sequence:
    install.restart: [osd.0, osd.1]
Signed-off-by: Sage Weil <sage@inktank.com>
           - tasktest:
           - tasktest:
 
+    You can also reference the job from elsewhere:
+
+        foo:
+          tasktest:
+        tasks:
+        - parallel:
+          - foo
+          - tasktest:
+
+    That is, if the entry is not a dict, we will look it up in the top-level
+    config.
+
     Sequential task and Parallel tasks can be nested.
     """
     log.info('starting parallel...')
     with parallel.parallel() as p:
         for entry in config:
+            if not isinstance(entry, dict):
+                entry = ctx.config.get(entry, {})
             ((taskname, confg),) = entry.iteritems()
             p.spawn(_run_spawned, ctx, confg, taskname)
           - tasktest:
           - tasktest:
 
+    You can also reference the job from elsewhere:
+
+        foo:
+          tasktest:
+        tasks:
+        - sequential:
+          - tasktest:
+          - foo
+          - tasktest:
+
+    That is, if the entry is not a dict, we will look it up in the top-level
+    config.
+
     Sequential task and Parallel tasks can be nested.
     """
     stack = []
     try:
         for entry in config:
+            if not isinstance(entry, dict):
+                entry = ctx.config.get(entry, {})
             ((taskname, confg),) = entry.iteritems()
             log.info('In sequential, running task %s...' % taskname)
             mgr = run_tasks.run_one_task(taskname, ctx=ctx, config=confg)
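
The sequential hunk above enters each task's context manager and pushes it
on a stack; the teardown half (outside the hunk) pops the stack so tasks
shut down in reverse order. A standalone sketch of that pattern, with an
invented `DummyTask` standing in for a real teuthology task:

```python
# DummyTask is invented for illustration; real tasks are context managers
# returned by run_tasks.run_one_task.
class DummyTask:
    def __init__(self, name, log):
        self.name = name
        self.log = log

    def __enter__(self):
        self.log.append('start %s' % self.name)
        return self

    def __exit__(self, *exc):
        self.log.append('stop %s' % self.name)

def run_sequential(names, log):
    stack = []
    try:
        for name in names:
            mgr = DummyTask(name, log)
            mgr.__enter__()
            stack.append(mgr)
    finally:
        # Unwind in reverse so teardown mirrors setup order, even if a
        # task raised partway through.
        while stack:
            stack.pop().__exit__(None, None, None)

log = []
run_sequential(['a', 'b'], log)
print(log)  # ['start a', 'start b', 'stop b', 'stop a']
```

Because the unwinding happens in a `finally`, tasks that were already
started get torn down even when a later task fails.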