Apparently the real problem is to attach the metadata of a filter’s input to the output generated from that exact input. So by some magical means I have to wait for a filter plugin to have generated all the data it will ever generate for the input it has been fed, and we’re potentially talking about dozens of milliseconds! This is an outrage!
Of course I don’t have to do that for every block of input… I only need to wait for all previous input to produce its output when the input metadata changes.
Unfortunately I currently cannot think of a way to reliably notice that there is still data in the filter’s pipeline, or to reliably notice that there isn’t. Sure, I could impose some arbitrary limit, like waiting 50 milliseconds, but I would never have a guarantee that all data has been flushed and that I can safely feed in new data in a potentially new format.
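To make the weakness concrete, here is a minimal sketch of that timeout idea (all names are hypothetical, not the real plugin API): treat “the filter has been silent for 50 ms” as “the filter is drained”, which is exactly the guarantee this approach cannot actually give.

```python
import queue

def drain_with_timeout(filter_output: queue.Queue, timeout: float = 0.05):
    """Collect output until the filter has produced nothing for `timeout`
    seconds. The return is only *assumed* to be everything the filter will
    ever emit for the input fed so far -- a slow filter can still surprise
    us after we have already declared the pipeline flushed."""
    drained = []
    while True:
        try:
            drained.append(filter_output.get(timeout=timeout))
        except queue.Empty:
            # No output for `timeout` seconds: guess that the filter is done.
            return drained

# Example: a queue standing in for the filter's output side.
out = queue.Queue()
for block in ("a", "b", "c"):
    out.put(block)
print(drain_with_timeout(out))
```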
Or I simply assume that the format of a stream never changes… this way I would have to create sub-pipelines, e.g. for a multi-file source, but that might in fact be a lot easier than dealing with all this metadata shit. I guess I will try that. Tomorrow.
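The sub-pipeline idea could look roughly like this (a toy sketch under the stated assumption; `Pipeline` and `process_sources` are made-up stand-ins, not real code from this project): each source gets a fresh pipeline whose format is fixed at construction time, so a format change simply means a new pipeline and there is never any mid-stream flushing to worry about.

```python
class Pipeline:
    """Toy stand-in for a filter chain whose input format is fixed for
    its whole lifetime (the simplifying assumption from above)."""
    def __init__(self, fmt):
        self.fmt = fmt
        self.out = []

    def feed(self, block):
        # Every output block trivially carries the one-and-only format.
        self.out.append((self.fmt, block))

    def finish(self):
        # Fully drained by construction: no new format can arrive behind us.
        return self.out

def process_sources(sources):
    """One sub-pipeline per (format, blocks) source, e.g. per input file."""
    results = []
    for fmt, blocks in sources:
        pipeline = Pipeline(fmt)
        for block in blocks:
            pipeline.feed(block)
        results.extend(pipeline.finish())
    return results

print(process_sources([("wav", [1, 2]), ("mp3", [3])]))
```

The design trade is clear: metadata handling collapses to a constructor argument, at the cost of tearing down and rebuilding the filter chain at every format boundary.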