1. What is Apache Kafka and what are its benefits?
- A powerful asynchronous messaging (publish-subscribe) technology.
- A streaming platform that acts as a central hub for real-time applications.
- Data is viewed as a stream of events rather than rows in a database.
Benefits:
- At some point an enterprise system grows large and complex enough that decoupling of services is needed; event-driven architecture is the best fit there.
- Messages are stored on disk in an append-only log, and consumers request messages based on an offset.
- Messages/events published to topics are persisted.
- Old messages can be retained on a time basis (similar to MySQL's expire_logs_days) and/or on a storage-usage basis.
- Each consumer group tracks its position (offset) in the log, so already-consumed messages are not served to it again.
- Kafka also offers great performance (high throughput with low latency).
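The storage model described above can be sketched in a few lines. This is a toy illustration of an append-only log with offset-based reads, not Kafka's actual API (the class and method names here are invented for the example):

```python
class MiniLog:
    """Toy sketch of Kafka's storage model: an append-only log
    where consumers pull records by offset that they track themselves."""

    def __init__(self):
        self._records = []

    def append(self, message):
        # Appending returns the record's offset, as a Kafka partition does.
        self._records.append(message)
        return len(self._records) - 1

    def read_from(self, offset):
        # Consumers ask for everything from a given offset onward;
        # the log itself never deletes records on read.
        return self._records[offset:]

log = MiniLog()
log.append("order-created")   # offset 0
log.append("order-shipped")   # offset 1
print(log.read_from(1))       # -> ['order-shipped']
```

Because reads are non-destructive, multiple consumer groups can replay the same log independently, each at its own offset.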
2. REST vs KAFKA?
REST:
- With a large number of applications, a point-to-point REST request/response layer is a poor fit.
- Data quality issues may arise, since copies of the data are not kept in sync in real time.
- Ex: two systems held supposedly the same data, yet discrepancies were found between them.
KAFKA:
- This is where Kafka comes into the picture, focusing on modeling streams of data.
- Kafka allows us both to transport these streams of data to all the systems and applications that need them and to build rich real-time applications on top of them.
- A modern stream-centric data architecture is built around Kafka.
- Ex: keeping a backup or standby copy of a database. Taking a full dump is not a good option for large data volumes, and it consumes significant system resources. A more efficient way is to capture only what has changed (inserts/updates/deletes) and fetch just those records. This kind of event stream is called change data capture.
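The idea of "take only the difference" can be shown with a toy snapshot-diff function. This is a conceptual sketch (the function name and row shapes are invented for the example); real log-based CDC tools read the database's transaction log instead of diffing snapshots, which is far cheaper:

```python
def capture_changes(old, new):
    """Compare two table snapshots keyed by primary key and emit only
    the rows that were inserted, updated, or deleted."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    for key in old:
        if key not in new:
            changes.append(("delete", key, None))
    return changes

old = {1: {"name": "Ann"}, 2: {"name": "Bob"}}
new = {1: {"name": "Ann"}, 2: {"name": "Bobby"}, 3: {"name": "Cara"}}
print(capture_changes(old, new))
# -> [('update', 2, {'name': 'Bobby'}), ('insert', 3, {'name': 'Cara'})]
```

Only the two changed rows are emitted, not the full table, which is exactly the saving CDC provides for large volumes.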
3. What are the different types of Change Data Capture?
In databases, change data capture (CDC) is a set of software design patterns used to determine and track data that has changed so that action can be taken using the changed data. It is an approach to data integration based on identifying, capturing, and delivering the changes made to enterprise data sources.
Ex:
- Oracle: XStream, GoldenGate
- MySQL: binlog replication
- Postgres: logical log streaming replication
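As a concrete example, log-based CDC from MySQL's binlog into Kafka is commonly set up with Kafka Connect and the Debezium MySQL connector. A minimal connector configuration might look like the following sketch (hostnames, credentials, and table names are placeholders, and exact property names vary by Debezium version):

```json
{
  "name": "inventory-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql.example.com",
    "database.port": "3306",
    "database.user": "cdc_user",
    "database.password": "cdc_password",
    "database.server.id": "184054",
    "topic.prefix": "inventory",
    "table.include.list": "inventory.orders",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-changes.inventory"
  }
}
```

Each committed insert/update/delete on the included tables then appears as an event on a Kafka topic, without any full dumps.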
4. What are the different message brokers?
- RabbitMQ
- Kafka
- ActiveMQ
- Kestrel
5. KAFKA to Salesforce: how are Kafka events pushed to Salesforce?
- Risk: don't overwhelm Salesforce with events coming from Kafka, which may quickly hit governor limits.
- Design solution: a microservice / vendor app monitors Kafka, and only selected events are written to Salesforce (the event-filter logic needs to be implemented in this external system as per business rules).
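The filter layer in that external system can be sketched as a pure function, assuming some made-up event shapes and parameters (allowed_types and batch_limit are illustrative, not part of any real API):

```python
def select_for_salesforce(events, allowed_types, batch_limit=200):
    """Sketch of the filter logic: forward only business-relevant event
    types and cap the batch size so a burst of Kafka events cannot
    exhaust Salesforce governor limits in one go."""
    selected = [e for e in events if e.get("type") in allowed_types]
    return selected[:batch_limit]

events = [
    {"type": "order.created", "id": 1},
    {"type": "heartbeat", "id": 2},
    {"type": "order.cancelled", "id": 3},
]
print(select_for_salesforce(events, {"order.created", "order.cancelled"}))
# -> [{'type': 'order.created', 'id': 1}, {'type': 'order.cancelled', 'id': 3}]
```

In a real deployment the batch limit would be tuned against the relevant Salesforce limits (API calls, DML rows, etc.) rather than hard-coded.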
6. Salesforce to KAFKA: how does Salesforce push topics to Kafka?
- The integration logic connects to Salesforce (e.g., subscribing to platform events or PushTopics via the Streaming API), handles the incoming events, and sends them on to Kafka.
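The "handle and forward" step can be sketched as a small transform that shapes a received Salesforce event into a Kafka producer record. This is illustrative only: the field names (AccountId__c, Status__c) and the default topic are invented, and a real implementation would hand the result to an actual producer client (e.g., kafka-python's KafkaProducer.send):

```python
import json

def to_kafka_record(sf_event, topic="salesforce.events"):
    """Shape a Salesforce platform-event payload into a (topic, key, value)
    record. Keying by account id keeps events for the same account in
    one partition, preserving their order."""
    payload = sf_event["payload"]
    return {
        "topic": topic,
        "key": payload["AccountId__c"],
        "value": json.dumps(payload),
    }

event = {"payload": {"AccountId__c": "001xx0000001", "Status__c": "Closed"}}
record = to_kafka_record(event)
print(record["topic"], record["key"])
# -> salesforce.events 001xx0000001
```

Serializing the payload as JSON keeps the record self-describing; in practice a schema registry (e.g., Avro) is often used instead.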