Event Streaming in Apache Kafka
Event streaming platforms are driving a revolution in data infrastructure and application architecture. Infrastructure technology has traditionally centered on building single applications: one data store serving one app. A company, however, runs hundreds of applications, data stores, and data warehouses, and its data is fragmented across all of them. Yet these systems must interconnect for the company to function as a whole.
Even in modern architectures such as microservices and SaaS applications, the data is more fragmented still. Interconnecting all of this data is hugely complicated, and it gets harder as organizations span diverse lines of business, each with its own tech stack.
What’s more, if any of these business lines moves to the public cloud, its tech stack ends up split across multiple environments. So how can connections among so many systems be managed?
Data sitting idle in a warehouse creates little value. Value comes from reacting and operating as events occur, which requires a shift in thinking toward event streaming. An event is a single data point in the system, such as a sale, an HR change, or a product update, and streaming is the continuous delivery of those events. The ability to react to event streams in real time is called stream processing.
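The idea can be sketched in a few lines of plain Python. This is a conceptual illustration only, with invented names: a real deployment would consume from a platform such as Kafka rather than an in-memory generator, but the shape is the same, with each event updating a result the moment it arrives instead of waiting for a batch job over the warehouse.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Event:
    """A single data point in the system, e.g. one sale."""
    topic: str
    value: float


def event_stream() -> Iterator[Event]:
    # Stand-in for a continuous feed (e.g. a Kafka topic).
    for amount in (100.0, 250.0, 75.0):
        yield Event(topic="sales", value=amount)


def running_total(events: Iterable[Event]) -> Iterator[float]:
    # Stream processing: react to each event as it occurs,
    # emitting an updated result per event rather than one
    # batch answer at the end.
    total = 0.0
    for event in events:
        total += event.value
        yield total


totals = list(running_total(event_stream()))
print(totals)  # [100.0, 350.0, 425.0]
```

The contrast with the warehouse model is in the output: a batch query would return only the final figure, while the stream processor produces an answer after every event, which is what lets applications react in real time.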
In this video, Confluent CEO Jay Kreps introduces an emerging category of software, the event streaming platform based on Apache Kafka, and explains how it serves businesses.