- Startup
- Business
- Enterprise
- On-Premise
- Add-on
About Azure Event Hubs
Event Hubs is a fully managed, real-time data ingestion service that’s simple, trusted, and scalable. Read more about Event Hubs.
When to use this connector
- To read messages from and write messages to a given Event Hub.
- To implement log-based CDC with a message queue.
Create Kafka-enabled Event Hubs
At the API level, Azure Event Hubs is compatible with Apache Kafka.
Read how to create Kafka-enabled Azure Event Hubs.
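In practice, "Kafka-enabled" means that a standard Kafka client can talk to an Event Hubs namespace on port 9093 over TLS, authenticating with SASL PLAIN where the username is the literal $ConnectionString and the password is the namespace connection string. The Java sketch below illustrates that configuration with placeholder namespace and connection-string values; the connector derives an equivalent configuration for you from the Namespace and Access Key parameters described later in this article.

```java
import java.util.Properties;

// Illustrative only: placeholder namespace and connection string.
public class EventHubsKafkaConfig {
    public static Properties kafkaProperties() {
        String namespace = "my-namespace"; // hypothetical Event Hubs namespace
        String connectionString =
                "Endpoint=sb://my-namespace.servicebus.windows.net/;"
                + "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=...";

        Properties props = new Properties();
        // Event Hubs exposes a Kafka endpoint on port 9093 of the namespace host.
        props.put("bootstrap.servers", namespace + ".servicebus.windows.net:9093");
        // SASL PLAIN over TLS: username is the literal "$ConnectionString",
        // password is the connection string itself.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" "
                + "password=\"" + connectionString + "\";");
        return props;
    }
}
```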
Create a Connection
Step 1. In the Connections window, click +, and type in azure event hub.
Step 2. Select Azure Event Hub.
Step 3. Enter the Connection parameters.
Connection parameters
- Namespace: the Event Hubs namespace.
- Topic(s): a topic to read messages from or write messages to. For reading, wildcard topic names (for example, inbound.*) or comma-separated topic names (for example, topic1,topic2) are supported. Event Hub is a synonym for Topic.
- Access Key: the Event Hub access key, which can be found in the Azure console under Event Hub / Shared access policies / SAS Policy / Connection string-primary key.
- Properties: additional properties for the Kafka consumer, Kafka producer, and Kafka security. The properties must be in the format key1=value1;key2=value2.
- Auto Commit: if enabled, the Kafka consumer periodically commits the offset while reading messages from the queue. It is recommended to keep it disabled so the system can commit the offset right after the messages have been processed (see the consumer sketch after this list).
- Starting Offset: the starting offset at which to begin the fetch.
- Key Deserializer: the deserializer for the key.
- Value Deserializer: the deserializer for the value. When the value is a document in Avro Format, use either Avro (when processing messages enqueued by Etlworks Integrator) or Avro Record (when processing messages enqueued by a third-party application). The latter requires an Avro Schema.
- Max number of records to read: the maximum total number of records to read in one micro-batch. The default limit is 1000000.
- Poll duration: how long (in milliseconds) the consumer should wait while fetching data from the queue. The default is 1000 milliseconds.
- Max number of records to poll: the maximum number of records that can be fetched from the queue in a single poll call.
- Number of retries before stop polling: the number of retries before the system stops polling if a poll returns no records. The default is 5.
- Integration with CDC providers: select the CDC provider(s) if you are planning to use this Connection for capturing and processing CDC events. Currently, only Debezium is supported.
- Key Serializer: the serializer for the key.
- Value Serializer: the serializer for the value. Use Avro when writing messages in Avro Format.
- Compression: the compression algorithm used when writing messages.
- Record headers: key-value pairs that let you add metadata about a record without adding any extra information to the key/value pair of the record itself (see the producer sketch after this list).
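To make the consumer-side parameters concrete, here is a minimal standalone Java sketch that reuses the hypothetical EventHubsKafkaConfig helper from the earlier example; the group id and topic name are placeholders. It maps Auto Commit, Starting Offset, Max number of records to poll, Poll duration, Max number of records to read, and Number of retries before stop polling to the corresponding Kafka consumer settings and loop logic, committing the offset only after a batch has been processed.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// Hypothetical consumer sketch; not the connector's actual implementation.
public class EventHubConsumerSketch {
    public static void main(String[] args) {
        Properties props = EventHubsKafkaConfig.kafkaProperties(); // connection sketch above
        props.put("group.id", "etl-consumer");                     // hypothetical consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");   // Auto Commit disabled
        props.put("auto.offset.reset", "earliest"); // Starting Offset
        props.put("max.poll.records", "500");       // Max number of records to poll

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("topic1"));
            int emptyPolls = 0;
            long total = 0;
            // Poll duration (1000 ms), the empty-poll counter (5), and the batch limit
            // (1000000) mirror the parameters described above.
            while (emptyPolls < 5 && total < 1_000_000) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
                if (records.isEmpty()) {
                    emptyPolls++;
                    continue;
                }
                emptyPolls = 0;
                for (ConsumerRecord<String, String> record : records) {
                    // process the message here
                    total++;
                }
                // With Auto Commit disabled, commit only after the batch has been processed.
                consumer.commitSync();
            }
        }
    }
}
```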
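The producer-side parameters map just as directly. The hypothetical sketch below shows Key Serializer, Value Serializer, Compression, and Record headers on a plain Kafka producer; the topic, key, value, and header names are placeholders.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Hypothetical producer sketch; not the connector's actual implementation.
public class EventHubProducerSketch {
    public static void main(String[] args) {
        Properties props = EventHubsKafkaConfig.kafkaProperties(); // connection sketch above
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");   // Key Serializer
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer"); // Value Serializer
        props.put("compression.type", "gzip");                                                    // Compression

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("topic1", "order-42", "{\"status\":\"shipped\"}");
            // Record headers: metadata attached to the record without changing its key or value.
            record.headers().add("source", "billing-db".getBytes(StandardCharsets.UTF_8));
            producer.send(record);
            producer.flush();
        }
    }
}
```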