Share data with Kafka Streams — KT.team

Make it easier to work on integrations with Apache Kafka

The Apache Kafka message broker is a distributed streaming platform that can process millions of events daily. Kafka integrates easily into a project's infrastructure and ensures the reliability and scalability of the system.

Kafka Connect

Kafka Connect is an Apache Kafka framework that provides a scalable, flexible way to move data between Kafka and other data stores. This framework allows the broker to act as an enterprise service bus (ESB).

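To show how this typically looks in practice (our own sketch, not KT.team's configuration), the example below registers a simple file sink connector through the Kafka Connect REST API. The connector name, topic, file path, and Connect URL are assumptions made for the illustration.

```java
// Hedged sketch: registering a connector via the Kafka Connect REST API.
// The connector name, topic, file path, and URL are illustrative assumptions.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        String config = """
            {
              "name": "orders-sink",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
                "topics": "orders",
                "file": "/tmp/orders.txt"
              }
            }""";
        // Kafka Connect exposes its REST API on port 8083 by default.
        HttpResponse<String> resp = HttpClient.newHttpClient().send(
            HttpRequest.newBuilder(URI.create("http://localhost:8083/connectors"))
                       .header("Content-Type", "application/json")
                       .POST(HttpRequest.BodyPublishers.ofString(config))
                       .build(),
            HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.statusCode() + " " + resp.body());
    }
}
```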

Apache Kafka features

1

Streaming data processing


Kafka processes streaming data in real time, at the speed at which messages are generated. Messages are handled continuously and without blocking. Many business processes are likewise continuous and do not wait for a response: streaming processing is essential for tasks such as flagging suspicious transactions or tracking mail delivery.
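To make this concrete, here is a minimal Kafka Streams sketch (our illustration, not KT.team's code) that flags large transactions as they arrive. The topic names, the 10,000 threshold, and the broker address are assumptions.

```java
// Minimal Kafka Streams sketch: forward large transactions to an alerts topic.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FraudAlerts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-alerts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Each record is "accountId -> amount"; pass through only large amounts.
        KStream<String, String> transactions = builder.stream("transactions");
        transactions.filter((accountId, amount) -> Double.parseDouble(amount) > 10_000)
                    .to("alerts");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start(); // runs continuously, without blocking the producers
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```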

2

App activity tracking


Any messages that appear in the app can be published to a dedicated Kafka topic. For example, every ERP document or every user action on the site (clicks, adding to favorites, adding or removing items from the cart, filling out forms, page views and view depth) can be sent and routed to specially defined Kafka topics. Other services (consumers) can then subscribe to the topics they need for various purposes: monitoring, analysis, reporting, personalization, etc.
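For example, a site backend could publish a user action to an activity topic like this (a minimal sketch; the topic name, key, and payload are invented for the illustration):

```java
// Hedged sketch of publishing user-activity events to a Kafka topic.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ActivityTracker {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by user id so all events of one user land in the same partition.
            producer.send(new ProducerRecord<>("site-activity", "user-42",
                    "{\"action\":\"add_to_cart\",\"sku\":\"A-100\",\"page\":\"/catalog\"}"));
        } // close() flushes any pending messages
    }
}
```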

3

Logging and log monitoring


Kafka allows you to collect and monitor logs. You can publish logs to Kafka topics, where they can be stored and processed in the cluster for as long as you need. If you have a dedicated monitoring app, it can read real-time data from the Kafka topics.
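A monitoring app reads such a topic with an ordinary Kafka consumer. Below is a minimal sketch; the topic name, group id, and the naive "ERROR" rule are assumptions for the illustration.

```java
// Hedged sketch of a monitoring app reading log records from a Kafka topic.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LogMonitor {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "log-monitor");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("app-logs"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> r : records) {
                    if (r.value().contains("ERROR")) {   // naive alert rule
                        System.out.println("alert: " + r.value());
                    }
                }
            }
        }
    }
}
```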

4

Message storage


Kafka appends each message to a log (saves it to disk) and stores it there until old messages are cleared according to a retention policy that the user sets in advance. This allows Kafka to be used as a reliable data source (as opposed to RabbitMQ, which deletes messages as soon as they are delivered).
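Retention is an ordinary topic setting. As a hedged sketch (the topic name and the 7-day value are assumptions), it can be changed with the Kafka AdminClient:

```java
// Hedged sketch: setting a topic's retention with the Kafka AdminClient.
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class RetentionConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        try (Admin admin = Admin.create(props)) {
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "orders");
            AlterConfigOp keepSevenDays = new AlterConfigOp(
                    new ConfigEntry("retention.ms", "604800000"), // 7 days in ms
                    AlterConfigOp.OpType.SET);
            admin.incrementalAlterConfigs(Map.of(topic, List.of(keepSevenDays)))
                 .all().get();
        }
    }
}
```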

The benefits of Apache Kafka

1

Scalability

Apache Kafka allows you to process data of any size. You can start working with one broker to try Kafka out and then increase the number of brokers to fully exploit the system. Capacity can also be increased while the current brokers keep running, without affecting the system as a whole.
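Alongside adding brokers, Kafka also scales at the topic level: increasing a topic's partition count lets more consumers read in parallel. A minimal sketch with assumed names and counts:

```java
// Hedged sketch of scaling a topic by increasing its partition count.
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewPartitions;

public class ScaleTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        try (Admin admin = Admin.create(props)) {
            // Grow "orders" from its current partition count to 12.
            admin.createPartitions(Map.of("orders", NewPartitions.increaseTo(12)))
                 .all().get();
        }
    }
}
```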

2

System reliability

One of the advantages of Kafka is its reliability. If one of the Kafka brokers fails for any reason, the cluster switches the data flow to the other brokers and automatically distributes the load between them, and the system continues to operate normally.
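This failover works because topics are replicated. As a hedged sketch (names and counts are assumptions), creating a topic with replication factor 3 keeps a copy of every partition on three brokers, so one of them can fail without data loss:

```java
// Hedged sketch: creating a replicated topic so the cluster survives a
// broker failure; a replica on another broker takes over as leader.
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.NewTopic;

public class ReliableTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        try (Admin admin = Admin.create(props)) {
            // 6 partitions, each replicated to 3 brokers.
            admin.createTopics(List.of(new NewTopic("orders", 6, (short) 3)))
                 .all().get();
        }
    }
}
```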

3

Performance

Due to its high throughput, Apache Kafka is able to process more than a million events per second. This makes Kafka the most popular message broker for working with big data.
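Much of that throughput comes from batching and compression on the producer side. The settings below are commonly tuned for throughput; the specific values are illustrative assumptions, not benchmarks:

```java
// Hedged sketch of producer settings commonly tuned for throughput.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class HighThroughputProducer {
    public static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 65536);       // 64 KB batches
        props.put(ProducerConfig.LINGER_MS_CONFIG, 10);           // wait up to 10 ms to fill a batch
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4"); // compress batches on the wire
        return new KafkaProducer<>(props);
    }
}
```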

ESB system implementation cases

Watch all

Do you need an implementation?

Contact us and we will estimate the time and cost of implementing an ESB system

YouTube

We have collected all the integration mistakes so that you don't have to make them yourself

Watch all

We write articles for our blog and for specialized publications

Read more

Point-to-point, broker, ESB: which integrations will help build a loosely coupled IT architecture

Learn more

The properties of a flexible and scalable IT infrastructure: a primer on the basic concepts and the main evaluation criteria

Learn more

Talend implementation cases in enterprise projects: global experience

Learn more

Watch all


System integration project (ESB) calculator

How many streams will the systems send?
Example: the “Product Management System” will send product data, the “Order Management System” will send order data, and the “Warehouse Management System” will send shipment status. That is 3 streams.

How many streams will the systems receive?
Example: the “Warehouse Management System” will receive data on goods and orders, and the “Order Management System” will receive data on goods and shipment status. That is 4 streams.
The calculator uses an accurate but simplified formula. The scope of work for your project and the final cost may vary. The final calculation will be made by your personal manager.

1

Calculation example


To transfer data between systems, we create a “stream”. Some streams are needed to send data, while others are needed to receive data. Each entity, such as orders or goods, may be transferred in its own separate stream.

For example, in the diagram:
1. The “Product Management System” sends goods, the “Warehouse Management System” sends the fact that an order has been shipped, and the “Order Management System” sends orders. In total, the systems will send 3 streams;

2. The “Warehouse Management System” receives goods and orders, and the “Order Management System” receives goods and the fact that the order has been shipped. In total, the systems will receive 4 streams.

2

Scope of work in the calculator


Included in the calculation

Preparing a map of systems and data flows (SOA scheme)

Development of object logic (connector business process diagram)

Creating connectors (source - storage, storage - receiver) for exchanging data on each stream on 3 environments (test, preprod, prod)

Setting up to three dashboards per connector within the ready-made monitoring loop

Documentation on copying, reusing, and maintaining the integration

Demonstration of the implemented functionality

Additionally

Preparing the infrastructure for the connectors to operate

Setting up a monitoring and logging loop

Creating connectors (storage - receiver) for exchanging data on each high-load stream (>100 messages per minute) on 3 environments (test, preprod, prod)

More than 15 attributes per stream

