It is about using modern tools like ClickHouse for data engineering without the fluff: you can take whatever dataset or data stream you have and build what you need, without complex infrastructure.
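To make that concrete, here is a minimal sketch (the URL and column names are placeholders I made up): with `clickhouse-local` and the `url()` table function, you can aggregate a remote CSV in a single query, with no cluster and no ingestion pipeline:

```sql
-- Run via: clickhouse-local --query "..."
-- The URL and columns are hypothetical; the schema is inferred
-- from the CSV header (CSVWithNames format).
SELECT
    country,
    count() AS events
FROM url('https://example.com/events.csv', 'CSVWithNames')
GROUP BY country
ORDER BY events DESC
LIMIT 10;
```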
Nevertheless, the statement "big data is dead" is short-sighted, and I don't entirely share this opinion.
For example, here is one of ClickHouse's use cases:
> Main cluster is 110PB nvme storage, 100k+ cpu cores, 800TB ram. The uncompressed data size on the main cluster is 1EB.
And when you have data at this scale to process in real time, hardly any other technology can handle it.