Event Streaming in Apache Kafka Event streaming platforms are driving a revolution in data infrastructure and application architecture. Infrastructure technology has traditionally concentrated on building one application at a time, meaning one data store servicing one app. However, a typical company runs hundreds of applications, data stores, and data warehouses. The
Decoupling microservices architecture with a streaming platform J.B.S. Haldane wrote about animals falling down mine shafts: "A rat is killed, a man is broken, a horse splashes." In other words, the bigger the creature, the more fragile it is. Modern enterprises likewise deal with numerous systems and data applications on a regular basis. Eventually
Machine learning with Kafka - The modern stream processing platform for all data preprocessing needs Some bits about Kafka: Kafka is a popular messaging platform based on the publish-subscribe (pub-sub) mechanism. It is a highly available, fault-tolerant, distributed system. Most organisations use Kafka in different use cases, but the best
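To make the pub-sub idea concrete, here is a toy in-memory sketch (illustration only, not Kafka's actual API): a topic behaves like an append-only log, and each consumer tracks its own read offset independently, which is the core of Kafka's design.

```python
# Toy model of Kafka-style pub-sub: a topic is an append-only log,
# and each consumer keeps its own offset into that log.

class Topic:
    def __init__(self, name):
        self.name = name
        self.log = []          # append-only list of messages

    def publish(self, message):
        self.log.append(message)

class Consumer:
    def __init__(self, topic):
        self.topic = topic
        self.offset = 0        # position of the next unread message

    def poll(self):
        """Return all messages published since the last poll."""
        messages = self.topic.log[self.offset:]
        self.offset = len(self.topic.log)
        return messages

clicks = Topic("clicks")
clicks.publish({"user": "alice", "page": "/home"})

fast = Consumer(clicks)
slow = Consumer(clicks)
print(fast.poll())   # fast consumer reads the first message
clicks.publish({"user": "bob", "page": "/cart"})
print(fast.poll())   # only the newly published message
print(slow.poll())   # slow consumer catches up on both, independently
```

Because consumers read from the log at their own pace, a slow subscriber never blocks a fast one, which is why this model scales across many decoupled applications.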
CASE STUDY: Pan-European Stock Exchange Euronext Using Confluent Kafka Streams for Its Trading Platform
Company: Headquartered in Amsterdam, Netherlands, Euronext is a leading stock exchange with global reach. The exchange operates regulated securities and derivatives markets in Amsterdam, Brussels, Lisbon, Paris, Ireland, and the UK. In this case study, we will see a Kafka Streams example of how Euronext used Confluent Kafka
Bank Rakyat Indonesia (BRI), the largest bank in Indonesia, is also the largest microfinance institution globally. It has 75 million customers and a market capitalization exceeding $38 billion. Let's see how it expanded its operations with the help of big data analytics using Kafka. Challenge: The bank
Every large corporation and SME has found real-time data analysis critical. Industries such as legal services, financial services, and IT operations management need massive volumes of real-time data along with historical data. When we need to handle high-volume data, we should implement the best
What is a data pipeline? A data pipeline is a system in which data is transferred between systems in chunks (messages, records) in a serial, systematic manner. These flows are well defined and audited, and they may contain sensitive information that needs to be secured. Such pipelines can be application queues, transfers
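The definition above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (all function names are invented for the example): records move from a source to a sink in fixed-size serial chunks, a count is kept for auditing, and a sensitive field is masked in transit.

```python
# Minimal data-pipeline sketch: serial, chunked transfer between two
# systems, with an audit count and masking of sensitive fields.

def chunked(records, size):
    """Yield records in serial chunks of at most `size` items."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def mask_sensitive(record, field="ssn"):
    """Redact a sensitive field before the record leaves the pipeline."""
    redacted = dict(record)
    if field in redacted:
        redacted[field] = "***"
    return redacted

def run_pipeline(source, sink, chunk_size=2):
    transferred = 0
    for chunk in chunked(source, chunk_size):
        sink.extend(mask_sensitive(r) for r in chunk)
        transferred += len(chunk)
    return transferred   # audit trail: how many records moved

source = [{"id": 1, "ssn": "123-45-6789"}, {"id": 2}, {"id": 3}]
sink = []
count = run_pipeline(source, sink)
```

In a real deployment the source and sink would be systems such as application queues or databases, but the shape of the flow — ordered chunks, auditing, and security applied in transit — is the same.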
What Is Real-time Data Streaming? Real-time data streaming is the processing of data as it is published or generated, filtering it and drawing inferences from it to accomplish tasks in the moment. Businesses use real-time data streams to incorporate modern technologies like IoT, advanced machine learning, and artificial intelligence. These are critical in improving human-machine
The rise of real-time data analytics has enabled business systems to read and react to events quickly and accurately. It has proved a boon for companies addressing needs that require quick attention, such as preventive maintenance, cross-selling, recognizing potential cyber threats, understanding customer behavior, media sharing, etc.
Undoubtedly, Apache Kafka has taken the business world by storm with its impeccable event streaming capabilities. It acts as a backbone for companies, providing a real-time, continuous data source that streamlines their business processes. You must prioritize understanding Kafka's security features before deploying it in your business applications.
Apache Kafka Security - What Is It? Undoubtedly, Apache Kafka acts as an internal communication medium and enables businesses to exchange real-time data. However, it also exposes you to cyber risks, such as unauthorized access to data, that could adversely affect your business. Therefore, Kafka Streams has an integrated security feature
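As a concrete example, a broker can be configured to require TLS encryption, SASL/SCRAM authentication, and ACL-based authorization. A sketch of the relevant server.properties entries follows; the hostname, port, keystore paths, and passwords are placeholders to adapt to your environment.

```properties
# Accept only authenticated, encrypted client connections
listeners=SASL_SSL://kafka1.example.com:9093
security.inter.broker.protocol=SASL_SSL

# TLS keystore/truststore (placeholder paths and passwords)
ssl.keystore.location=/etc/kafka/ssl/kafka.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/ssl/kafka.truststore.jks
ssl.truststore.password=changeit

# SASL/SCRAM authentication
sasl.enabled.mechanisms=SCRAM-SHA-512
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512

# Authorization: deny by default unless an ACL grants access
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
allow.everyone.if.no.acl.found=false
```

With this in place, unauthenticated clients cannot connect at all, and authenticated ones can only read or write the topics their ACLs permit.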
As most organisations are adopting cloud setups, one can install Apache Kafka on AWS (Amazon Linux) and manage it oneself. This kind of setup is highly customisable and can be tailored to meet specific needs, though it has its own risks and benefits. Installation of Apache Kafka on AWS