It's worth stepping back and asking whether stream processing is even worth it.
I worked on stream processing; it was fun, but I also believe it was over-engineered and brittle. The customers didn't actually want real-time data: they looked at the calculated values once a week and made decisions based on that.
Then I joined another company that somehow had the money to pay 50-100 people, and they were using CSVs, shell scripts, batch processing, and all that. It solved the clients' needs, and they didn't have to maintain a complicated architecture or code that would otherwise have been hard to reason about.
After I left, the first company, the one with the stream processing, was bought by a competitor at a fire-sale price. Some of the tech was relevant to them, but the stream processing stuff was immediately shut down. The acquiring company had just simple batch processing, and they were printing money in comparison.
If you think it's still worth going with stream processing, lay out your reasoning to the team; most reasonable developers will learn it if they really believe it's a significantly better solution for the problem at hand.
Not to over-simplify, but if you can't convince 5 out of 10 people to learn something that would make their jobs better, then either the people aren't up to the task, or you're wrong that stream processing would make a difference.
I agree. Unless the downstream data is going to feed a system that makes automated decisions (e.g. HFT or ad buying), real-time analytics is rarely worth the cost. It's almost always easier and more robust to accept higher tail latencies for data that humans consume, and as computers get faster, that tail latency shrinks anyway.
Systems that needed complex streaming architectures in 2015 could probably be handled today with fast disks and a large Postgres instance (or BigQuery).
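To make that concrete, here's a minimal sketch of what I mean: the kind of per-hour rollup that used to justify a whole streaming topology becomes a single aggregate query over the raw events. I'm using Python's built-in sqlite3 as a stand-in for a Postgres/BigQuery instance just so it runs anywhere, and the events table and its columns are made up for illustration.

    # Sketch: a "streaming" use case (per-hour event counts and revenue)
    # handled as one batch aggregation query. sqlite3 stands in for
    # Postgres/BigQuery; the events table and columns are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (ts TEXT, campaign TEXT, revenue REAL)")
    conn.executemany(
        "INSERT INTO events VALUES (?, ?, ?)",
        [
            ("2024-01-01 09:05:00", "spring_sale", 1.20),
            ("2024-01-01 09:40:00", "spring_sale", 0.80),
            ("2024-01-01 10:15:00", "brand",       2.50),
        ],
    )

    # One query instead of a streaming topology: group by hour and campaign.
    rows = conn.execute(
        """
        SELECT strftime('%Y-%m-%d %H:00', ts) AS hour,
               campaign,
               COUNT(*)     AS events,
               SUM(revenue) AS revenue
        FROM events
        GROUP BY hour, campaign
        ORDER BY hour
        """
    ).fetchall()

    for hour, campaign, events, revenue in rows:
        print(hour, campaign, events, revenue)

Run it nightly (or hourly) from cron and you've covered most of what the dashboard actually needed.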
Yeah, that reminds me of a startup I worked at that did real-time analytics for digital marketing campaigns. We went to all kinds of trouble to update dashboards with five-minute latency, and the real-time updates made for impressive sales demos, but I don't think we had a single customer who actually needed to make business decisions within 24 hours of seeing the data.
We were doing TV ad analytics by detecting ads on TV channels and checking their web impact (among other things). The thing is, most of these ads are deals made weeks or months in advance, so customers checked the analytics about once before a renewal… so I'm not sure it needed to be near real time…
Batch processing is just stream processing with a really big window ;-). More seriously, I find streaming windows are often the disconnect. Surprisingly often, users don't want windowed results. They want aggregation, filtering, uniqueness, ordering, and reporting over some batch. Or they want to flexibly specify the window / partitioning / grouping for each reporting query. Modern OLAP systems are plenty fast enough to do that on the fly for most use cases, so even older patterns like running stream processing for real-time stats in parallel with batch loads into an OLAP system aren't worth the complexity. Just query the DB and cache...
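To illustrate the "query the DB and cache" bit, here's a minimal sketch: the report picks its own grouping per query, the aggregation runs on the fly, and repeated reports are served from a cache. sqlite3 stands in for the OLAP store, and the orders table, its columns, and the report() helper are all hypothetical names for illustration.

    # Sketch: on-the-fly aggregation with flexible grouping, plus a cache,
    # instead of precomputed streaming windows. sqlite3 stands in for the
    # OLAP store; table, columns, and report() are hypothetical.
    import sqlite3
    from functools import lru_cache

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (day TEXT, region TEXT, product TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?, ?)",
        [
            ("2024-01-01", "EU", "widget", 10.0),
            ("2024-01-01", "US", "widget", 12.0),
            ("2024-01-02", "EU", "gadget",  7.5),
        ],
    )

    ALLOWED_GROUPINGS = {"day", "region", "product"}  # whitelist, never raw user SQL

    @lru_cache(maxsize=128)
    def report(group_by: str) -> tuple:
        """Aggregate on the fly for whatever grouping the report asks for."""
        if group_by not in ALLOWED_GROUPINGS:
            raise ValueError(f"unsupported grouping: {group_by}")
        sql = f"SELECT {group_by}, COUNT(*), SUM(amount) FROM orders GROUP BY {group_by}"
        return tuple(conn.execute(sql).fetchall())

    print(report("region"))   # computed on the fly
    print(report("region"))   # same arguments -> served from the cache
    print(report("day"))      # different grouping, no pipeline change needed

The point isn't the caching library, it's that each new reporting question is a new query, not a new streaming job.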