Beam Kafka Produce
The main limitation of the Kafka Producer is that it currently only supports String keys, and String or Avro Record values.
To send Avro Record values you need a schema registry, because the schema of the Avro values is not sent to Kafka itself but to the registry. Make sure you have one available. The options below need to be set to make this work against a Confluent Cloud Kafka instance. Various parts of the software stack need to authenticate, hence the bit of redundancy among the options. We recommend that you put these options in variables in your environment configuration file. A code sketch of the same options follows the table.
Option | Example |
---|---|
value.converter.schema.registry.url | |
auto.register.schemas | true |
security.protocol | SASL_SSL |
sasl.jaas.config | org.apache.kafka.common.security.plain.PlainLoginModule required username="CLUSTER_API_KEY" password="CLUSTER_API_SECRET"; |
username | CLUSTER_API_KEY |
password | CLUSTER_API_SECRET |
sasl.mechanism | PLAIN |
client.dns.lookup | use_all_dns_ips |
acks | ALL |
basic.auth.credentials.source | USER_INFO |
basic.auth.user.info | CLUSTER_API_KEY:CLUSTER_API_SECRET |
schema.registry.basic.auth.user.info | SCHEMA_REGISTRY_API_KEY:SCHEMA_REGISTRY_API_SECRET |
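For reference, here is a minimal sketch, assuming a plain Beam pipeline with KafkaIO and the Confluent KafkaAvroSerializer rather than the transform itself, of how these options fit together. The bootstrap server, topic, and registry URL are placeholders. Note that when the Avro serializer is used directly, the registry URL is set as schema.registry.url; the value.converter. prefix in the table applies to Kafka Connect converters.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.Serializer;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroKafkaWriteSketch {

  /** Writes String-keyed Avro records to a Confluent Cloud topic. */
  static void writeAvro(PCollection<KV<String, GenericRecord>> records) {
    Map<String, Object> producerConfig = new HashMap<>();
    // Registry URL, as schema.registry.url when the serializer is used directly.
    producerConfig.put("schema.registry.url", "https://psrc-xxxxx.confluent.cloud"); // placeholder
    producerConfig.put("auto.register.schemas", true);
    // Broker authentication: the cluster API key/secret appear both here and
    // in the JAAS line, hence the redundancy mentioned above.
    producerConfig.put("security.protocol", "SASL_SSL");
    producerConfig.put("sasl.mechanism", "PLAIN");
    producerConfig.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"CLUSTER_API_KEY\" password=\"CLUSTER_API_SECRET\";");
    producerConfig.put("client.dns.lookup", "use_all_dns_ips");
    producerConfig.put("acks", "all");
    // Schema registry authentication.
    producerConfig.put("basic.auth.credentials.source", "USER_INFO");
    producerConfig.put("schema.registry.basic.auth.user.info",
        "SCHEMA_REGISTRY_API_KEY:SCHEMA_REGISTRY_API_SECRET");

    // KafkaAvroSerializer implements Serializer<Object>, so an unchecked cast
    // is needed to satisfy KafkaIO's generic signature.
    @SuppressWarnings("unchecked")
    Class<? extends Serializer<GenericRecord>> valueSerializer =
        (Class<? extends Serializer<GenericRecord>>) (Class<?>) KafkaAvroSerializer.class;

    records.apply(
        KafkaIO.<String, GenericRecord>write()
            .withBootstrapServers("pkc-xxxxx.confluent.cloud:9092") // placeholder
            .withTopic("avro-topic") // placeholder
            .withKeySerializer(StringSerializer.class)
            .withValueSerializer(valueSerializer)
            .withProducerConfigUpdates(producerConfig));
  }
}
```

The sketch mainly shows which component each credential pair authenticates against: the CLUSTER_* pair goes to the Kafka brokers via SASL, while the SCHEMA_REGISTRY_* pair goes to the schema registry over basic auth.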