Here are the steps to write Avro data to Kafka using Flink:

  1. Add the necessary dependencies to the Flink project. Note that the Kafka producer used below comes from Flink's Kafka connector, so that artifact is needed as well (in older Flink versions its name carries a Scala suffix, e.g. flink-connector-kafka_2.12):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-avro</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>${kafka.version}</version>
</dependency>
  2. Create a Kafka producer by setting properties such as the bootstrap servers and the target topic name:
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>("topic-name", new SimpleStringSchema(), properties);
  3. Convert the Avro record to a string with toString(). Note that this yields the record's JSON representation, not the compact Avro binary encoding; if binary Avro output is required, use an Avro serialization schema (e.g. Flink's AvroSerializationSchema) instead of the string conversion shown here:
GenericRecord record = new GenericData.Record(schema);
record.put("field1", value1);
record.put("field2", value2);
String avroString = record.toString();
  4. Create a Flink DataStream containing the Avro data:
DataStream<String> avroStream = env.fromElements(avroString);
  5. Write the Avro data into Kafka using the Kafka producer:
avroStream.addSink(kafkaProducer);
  6. Submit the Flink program and verify that the data was successfully written to Kafka.
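Put together, the steps above can be sketched as a single job. This is a minimal sketch, not a production setup: the topic name, broker address, schema, and field values are placeholders, and it assumes a Kafka broker is reachable at localhost:9092 with the dependencies from step 1 on the classpath.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class AvroToKafkaJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Parse the Avro schema (inlined here for brevity; often loaded from an .avsc file).
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Example\",\"fields\":["
            + "{\"name\":\"field1\",\"type\":\"string\"},"
            + "{\"name\":\"field2\",\"type\":\"int\"}]}");

        // Build a record and take its string form (JSON, not Avro binary).
        GenericRecord record = new GenericData.Record(schema);
        record.put("field1", "value1");
        record.put("field2", 42);
        String avroString = record.toString();

        // Kafka producer configuration (placeholder broker and topic).
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        FlinkKafkaProducer<String> kafkaProducer =
            new FlinkKafkaProducer<>("topic-name", new SimpleStringSchema(), properties);

        // Stream the record's string form into Kafka.
        DataStream<String> avroStream = env.fromElements(avroString);
        avroStream.addSink(kafkaProducer);

        env.execute("avro-to-kafka");
    }
}
```

Since this job needs a running Kafka broker and the Flink runtime to execute, it is meant as a structural template rather than something to run standalone.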