Company: Mojix is a leading US-based provider of item chain management solutions for retail and industrial enterprises around the globe. The company’s software solutions work in real-time and are hardware agnostic. They aggregate and process data streams from various sensors to provide business insights. Let's see how they used real-time data
Machine learning with Kafka, the modern stream processing platform for all data preprocessing needs. Some bits about Kafka: Kafka is a popular messaging platform based on the publish-subscribe (pub-sub) mechanism. It is a highly available, fault-tolerant, and distributed system. Most organisations use Kafka in different use cases, but the best
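To make the pub-sub mechanism concrete, here is a minimal in-memory sketch in Python. This is not Kafka itself (a real Kafka broker persists messages in partitioned logs and consumers pull from them); the `MiniBroker` class and topic names are illustrative assumptions showing only the core idea: publishers send to a topic, and every subscriber of that topic receives the message.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory broker: each topic fans messages out to its subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a callback to be invoked for every message on `topic`.
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan-out: every subscriber of the topic receives the message.
        for callback in self.subscribers[topic]:
            callback(message)

broker = MiniBroker()
received = []
broker.subscribe("orders", received.append)
broker.publish("orders", {"id": 1, "amount": 42})
print(received)  # [{'id': 1, 'amount': 42}]
```

The key design point this mirrors is decoupling: the publisher never knows who, or how many, subscribers exist, which is what lets Kafka sit between many producers and consumers.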
Delivery Hero-PedidosYa is a market-leading online food ordering platform in Latin America. Its innovative web and mobile apps give users access to 12,000 restaurants across six countries in the region. This case study explains how they used real-time data streaming to detect malicious and fraudulent activities and
Company: AO.com, an online retailer. AO is one of the UK’s leading electrical retailers. The company has been operating for the last twenty years, selling over 9,000 electrical products to millions of customers across the UK. Challenge: For a retail business, the speed of its operations is of utmost importance.
What Is Real-time Data Streaming? Real-time data streaming is the practice of filtering and processing data to accomplish tasks at the moment the data is published or generated. Businesses use real-time data streams to incorporate modern technologies like IoT, advanced machine learning, and artificial intelligence. These are critical in improving human-machine
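The defining trait above, acting on data at the moment it arrives rather than after collecting a batch, can be sketched in a few lines of Python. The `running_average` generator below is a hypothetical example: it emits an updated result after every incoming event, without ever buffering the whole stream.

```python
def running_average(events):
    """Yield the running average after each event, without buffering the stream."""
    total = 0.0
    for count, value in enumerate(events, start=1):
        total += value
        yield total / count  # result is available immediately per event

readings = iter([10, 20, 30])  # stands in for a live sensor feed
for avg in running_average(readings):
    print(avg)
# prints 10.0, then 15.0, then 20.0
```

A batch job would need all three readings before producing any answer; the streaming version has an up-to-date answer after each one, which is what makes the IoT and ML use cases above feasible in real time.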
“The perfect kind of architecture decision is the one which never has to be made” — Robert C. Martin. From being a phenomenal trend to becoming one of the most essential IT design considerations, event streaming has become pivotal for businesses due to an exponential increase in data and
Data transaction streaming is managed through many platforms, one of the most common being Apache Kafka. In the first article of this data streaming series, we delved into the definition of data transaction streaming and why it is critical to manage information in real-time for the most accurate
Event recording has levels that correspond to the use cases being tracked, much like application logging levels (e.g., Log4j). A mission-critical application (such as one handling transactions) will generate a huge number of events, and recording them enables debugging and analysis in case of any failure or audit. For low to
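The idea of per-use-case recording levels can be illustrated with Python's standard `logging` module, which uses the same level hierarchy as Log4j (DEBUG < INFO < WARNING < ERROR). The logger names below (`payments`, `newsletter`) are purely illustrative: a mission-critical path keeps fine-grained events for audit, while a low-stakes path records problems only.

```python
import logging

# Mission-critical use case: record everything, down to DEBUG, for audit trails.
payments = logging.getLogger("payments")
payments.setLevel(logging.DEBUG)

# Low-stakes use case: routine events are dropped, only problems are recorded.
newsletter = logging.getLogger("newsletter")
newsletter.setLevel(logging.WARNING)

print(payments.isEnabledFor(logging.DEBUG))    # True  - fine-grained events kept
print(newsletter.isEnabledFor(logging.INFO))   # False - routine events dropped
print(newsletter.isEnabledFor(logging.ERROR))  # True  - failures still recorded
```

The volume trade-off in the text falls out directly: the lower the threshold, the more events a high-traffic application emits, so the level is chosen per use case rather than globally.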