
How to write Avro data into Kafka using Flink?

asked 2023-05-13 10:08:47 +0000 by huitzilopochtli

1 Answer


answered 2023-05-13 10:25:01 +0000 by lakamha

Here are the steps to write Avro data into Kafka using Flink:

  1. Add the necessary dependencies to the Flink project:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-avro</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>${kafka.version}</version>
</dependency>
  2. Create a Kafka producer by setting up properties such as bootstrap servers and topic name:
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>("topic-name", new SimpleStringSchema(), properties);
  3. Convert the Avro record into a string using its toString() method (for a GenericRecord this produces the record's JSON representation):
GenericRecord record = new GenericData.Record(schema);
record.put("field1", value1);
record.put("field2", value2);
String avroString = record.toString();
  4. Create a Flink DataStream that contains the Avro data:
DataStream<String> avroStream = env.fromElements(avroString);
  5. Write the Avro data into Kafka using the Kafka producer:
avroStream.addSink(kafkaProducer);
  6. Submit the Flink program and verify that the Avro data was successfully written into Kafka. A complete end-to-end sketch follows below.
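
For reference, here is a minimal end-to-end sketch that ties the steps above together. It assumes a hypothetical two-field schema (field1 as a string, field2 as an int), a broker at localhost:9092, and a topic named topic-name; adjust these to your environment. FlinkKafkaProducer comes from the flink-connector-kafka artifact (with a Scala suffix such as _2.12 on older Flink versions, and deprecated in favor of KafkaSink on newer ones), so you will likely need that dependency in addition to the two listed in step 1. Also note that GenericRecord.toString() yields the record's JSON text, so with this approach the messages land in Kafka as JSON strings rather than binary Avro.

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

public class AvroJsonToKafkaJob {

    // Hypothetical schema matching the field1/field2 example above.
    private static final String SCHEMA_JSON =
            "{\"type\":\"record\",\"name\":\"Example\",\"fields\":["
          + "{\"name\":\"field1\",\"type\":\"string\"},"
          + "{\"name\":\"field2\",\"type\":\"int\"}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        // Step 3: build a record and take its JSON text form.
        GenericRecord record = new GenericData.Record(schema);
        record.put("field1", "value1");
        record.put("field2", 42);
        String avroString = record.toString();

        // Step 2: Kafka producer pointed at a local broker and a sample topic.
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "localhost:9092");
        FlinkKafkaProducer<String> kafkaProducer =
                new FlinkKafkaProducer<>("topic-name", new SimpleStringSchema(), properties);

        // Steps 4-6: build the stream, attach the sink, and submit the job.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> avroStream = env.fromElements(avroString);
        avroStream.addSink(kafkaProducer);
        env.execute("avro-json-to-kafka");
    }
}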
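If you want actual binary Avro on the wire instead of a JSON string, the flink-avro dependency from step 1 provides AvroSerializationSchema (org.apache.flink.formats.avro.AvroSerializationSchema) and GenericRecordAvroTypeInfo (org.apache.flink.formats.avro.typeutils.GenericRecordAvroTypeInfo). A sketch reusing schema, properties, env, and record from the example above:

// Serialize GenericRecord values as binary Avro instead of JSON strings.
AvroSerializationSchema<GenericRecord> avroSerialization =
        AvroSerializationSchema.forGeneric(schema);
FlinkKafkaProducer<GenericRecord> binaryProducer =
        new FlinkKafkaProducer<>("topic-name", avroSerialization, properties);

// Give Flink explicit Avro type information for the GenericRecord elements.
DataStream<GenericRecord> recordStream = env
        .fromElements(record)
        .returns(new GenericRecordAvroTypeInfo(schema));
recordStream.addSink(binaryProducer);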
