That's not far off another idea I had... just read the stream as fast as possible and then calculate the change over time in code. Basically creating your own little rolling 20ms baskets and doing your own conflation, somehow. I'm not calculating LTA myself, but that's how I'm working out other changes over time, and the results seem sensible. I don't use ConflateMs to control the batching; I do it at my end.
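A minimal sketch of what those rolling 20ms baskets could look like in Node.js, assuming each raw stream message can be reduced to a timestamp, price, and matched size. All names here (`Conflator`, `onTick`, `delta`) are illustrative, not anything from the stream API itself:

```javascript
// Client-side conflation into rolling 20 ms buckets.
class Conflator {
  constructor(bucketMs = 20) {
    this.bucketMs = bucketMs;
    this.buckets = new Map(); // bucket start time (ms) -> { last, volume }
  }

  // Record one raw tick: timestamp in ms, price, matched size
  onTick(ts, price, size) {
    const key = Math.floor(ts / this.bucketMs) * this.bucketMs;
    const b = this.buckets.get(key) || { last: price, volume: 0 };
    b.last = price;   // latest price seen inside this 20 ms window
    b.volume += size; // total size matched inside the window
    this.buckets.set(key, b);
  }

  // Price change between two bucket start times, or null if either is empty
  delta(fromKey, toKey) {
    const a = this.buckets.get(fromKey);
    const b = this.buckets.get(toKey);
    return a && b ? b.last - a.last : null;
  }
}

const c = new Conflator(20);
c.onTick(0, 100, 10);  // first bucket (0-19 ms)
c.onTick(5, 102, 5);   // same bucket: price updated, volume accumulated
c.onTick(25, 108, 8);  // next bucket (20-39 ms)
console.log(c.delta(0, 20)); // prints 6
```

The point is that the batching interval lives entirely at your end, so you can change it (or compute several window sizes at once) without touching the subscription.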
That wouldn't work for the kind of tool I'm trying to build, which needs to send a Telegram alert as soon as a big chunk gets matched in specific markets.
It also depends which technology you use to read the stream. My choice for now is Node.js, as it's non-blocking and can send the Telegram alert in the background without halting the stream-reading process.
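A hypothetical sketch of that fire-and-forget pattern: the alert promise is deliberately not awaited, so the event loop goes straight back to reading the stream. `sendTelegram` and `onMatched` are assumed helper names, not part of any real API; a real `sendTelegram` would POST to the Telegram Bot API's `sendMessage` endpoint.

```javascript
// Stand-in for an HTTPS call to the Telegram Bot API (assumed helper)
function sendTelegram(text) {
  return new Promise((resolve) => setTimeout(() => resolve(text), 10));
}

// Called for each matched chunk; returns true when an alert was dispatched
function onMatched(chunk, thresholdSize = 1000) {
  if (chunk.size < thresholdSize) return false;
  // Deliberately not awaited: the promise runs in the background while
  // the caller returns immediately and keeps reading the stream
  sendTelegram(`Big match: ${chunk.size} @ ${chunk.price}`)
    .catch((err) => console.error('alert failed', err));
  return true;
}

console.log(onMatched({ size: 2500, price: 3.4 })); // prints true
console.log(onMatched({ size: 50, price: 3.4 }));   // prints false
```

The `.catch` matters: an unhandled rejection from a failed alert would otherwise crash the process, which is exactly the halt you're trying to avoid.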
If you use a normal "blocking" language, you're most probably right: have one process that reads and saves the stream, while another process makes sense of the data.