Something like this:
```yaml
my_pipeline:
  - collect: my_collector
  - process:
      - proc_1
      - parallel:
          - proc_2
          - proc_3
          - proc_4
  - forward: my_forwarder
```
This way, we could get the full benefit of a DAG (https://en.wikipedia.org/wiki/Directed_acyclic_graph), Apache Airflow style. The parallel steps would, in theory, operate on orthogonal subsets of the event data, so there would be no risk of race conditions. We would still need to figure out how to gather the results after the parallel steps so that we end up with a single event containing the combined, processed data. This will most definitely require changes in the pipeline manager.
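To make the fan-out/gather idea concrete, here is a minimal Python sketch of how the parallel stage could work. All names here (`proc_2`, `proc_3`, `proc_4`, `run_parallel`, the event fields) are hypothetical; the assumption is that each parallel processor reads and writes a disjoint subset of the event's fields, so merging the partial results back into one event is just a key-wise union.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical processors: each touches a disjoint subset of the
# event's fields, which is what makes running them in parallel safe.
def proc_2(event):
    return {"field_a": event["field_a"].upper()}

def proc_3(event):
    return {"field_b": event["field_b"] * 2}

def proc_4(event):
    return {"field_c": len(event["field_c"])}

def run_parallel(event, processors):
    """Fan the event out to each processor, then gather the partial
    results back into a single combined event."""
    with ThreadPoolExecutor(max_workers=len(processors)) as pool:
        # Each processor gets its own shallow copy of the event.
        partials = list(pool.map(lambda p: p(dict(event)), processors))
    merged = dict(event)
    for partial in partials:
        merged.update(partial)  # keys are disjoint, so no overwrites
    return merged

event = {"field_a": "hello", "field_b": 3, "field_c": "abc"}
result = run_parallel(event, [proc_2, proc_3, proc_4])
# result == {"field_a": "HELLO", "field_b": 6, "field_c": 3}
```

The merge step is where the pipeline manager changes would live: after the `parallel` block it would collect each processor's output and combine them into the single event that flows on to `forward`.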