Modern software applications rely heavily on analyzing large volumes of event sequences, or ‘event streams’, continuously generated by many sources in real time, to capture actionable insights and respond immediately to business challenges.
Because modern business applications need to ingest, collect, store, and process terabytes of data arriving as event streams, you need an Event Streaming Platform that is performant, interoperable, scalable, reliable, secure, and cost-efficient.
Event streaming platforms such as Apache Kafka, Azure Event Hubs, and AWS Kinesis are seeing growing popularity and adoption. In this session, we’ll take a closer look at the key characteristics of cloud-native event streaming platforms. These characteristics include:
- Multi-protocol: Ability to ingest and consume event streams with a wide array of protocols such as AMQP, Kafka, HTTP, WebSockets, and so on
- High-performance data streaming: Low end-to-end latency and high throughput
- Dynamic scaling: Scale event stream ingestion capacity dynamically
- Multi-tenanted PaaS with workload isolation
- High availability and resiliency: Replicas and availability zones
- Geo-disaster recovery with data and state replication
- Security and Compliance
- Stream governance: Schema-driven formats and fine-grained resource governance
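To make the stream-governance point concrete, here is a minimal sketch of schema-driven validation applied before an event is published. The registry, event type name, and field names are illustrative assumptions, not part of any specific platform; production systems typically delegate this to a schema registry with Avro, JSON Schema, or Protobuf.

```python
# Illustrative in-memory "schema registry": each event type declares
# the fields (and Python types) a published event must carry.
SCHEMAS = {
    "order.created": {"order_id": str, "amount_cents": int, "currency": str},
}

def validate_event(event_type: str, payload: dict) -> None:
    """Reject events that do not conform to the registered schema."""
    schema = SCHEMAS.get(event_type)
    if schema is None:
        raise ValueError(f"unregistered event type: {event_type}")
    for name, expected in schema.items():
        if name not in payload:
            raise ValueError(f"missing field: {name}")
        if not isinstance(payload[name], expected):
            raise TypeError(f"field {name} must be {expected.__name__}")

# A conforming event passes silently; a malformed one is rejected
# before it ever reaches the stream.
validate_event("order.created",
               {"order_id": "A-42", "amount_cents": 1999, "currency": "EUR"})
```

Enforcing the schema at publish time keeps malformed events out of the stream entirely, so downstream consumers can rely on a stable contract instead of defending against arbitrary payloads.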